WO2019006503A1 - Passenger management - Google Patents
- Publication number
- WO2019006503A1 (PCT/AU2018/050690)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- passenger
- vehicle
- data template
- stored
- identification information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
Definitions
- the present invention relates to passenger management.
- the invention relates to a system or method for identifying, counting, locating, and/or surveilling passengers of a sea vessel.
- Vessel based tourism involves accommodating and entertaining passengers whilst ferrying them to one or more stopovers. At the stopovers passengers temporarily disembark the vessel in order to undertake activities or explore the site, before re-boarding the vessel for transit to the next stopover or final destination.
- a passenger obtaining another passenger's paper boarding / activities ticket may present safety concerns, such as where a passenger without prerequisite diving training is able to undertake a diving activity in the name of a trained passenger. Security concerns may also be apparent, such as where a violent criminal is misidentified or takes the place of a passenger who is not a security threat. Further, manual ticketing and identification procedures may result in financial losses due to, for instance, usage of another passenger's ticket to obtain perks or privileges not personally purchased.
- tour vessels lack video surveillance and/or facial recognition and/or passenger tracking capabilities for identifying potentially problematic situations or security threats, such as passenger overcrowding in an area of the vessel or the presence of passengers in restricted areas.
- tour vessel operators may be required to submit management reports, and many, having only a manual set-up, are required to compile the necessary information and prepare the reports inefficiently by hand and without the benefits of automation.
- present methods of identification may lack the accuracy or sensitivity required to correctly identify passengers at an acceptable rate.
- Thus it may be advantageous to provide a new system and/or method of passenger management which reduces, limits, overcomes, or ameliorates some of the problems, drawbacks, or disadvantages associated with the prior art, or provides an effective alternative.
- In one particular aspect, it may be advantageous to provide a system or method which facilitates passenger identification. In another particular aspect, it may be advantageous to provide a system or method which facilitates passenger counting. In another particular aspect, it may be advantageous to provide a system or method adapted for passenger surveillance. In another particular aspect, it may be advantageous to provide a system or method which facilitates passenger tracking or locating. In another particular aspect, it may be advantageous to provide a system or method which facilitates passenger ticketing, bookings, or purchasing. In another particular aspect, it may be advantageous to provide a method or system which facilitates the provision of electronic medical or legal forms. In another particular aspect, it may be advantageous to provide a method or system for improving security on-board the vessel.
- the invention provides a passenger management system.
- the invention provides a system for managing passengers of a vehicle.
- T he vehicle may comprise any means of transporting passengers.
- the vehicle may comprise a plane, bus, or train.
- the vehicle comprises a vessel.
- the vessel may comprise any means of transporting or supporting passengers on water.
- the vessel comprises a ship, boat, pontoon, or sea platform for activities.
- the system may comprise passenger identification means.
- the passenger identification means may comprise a reading device.
- the passenger identification means may comprise:
- an identification device disposed on or in relation to the passenger, the identification device being adapted to convey a non-biometric characteristic relating to the passenger, and
- a reading device adapted for reading the non-biometric characteristic conveyed by the identification device.
- the non-biometric characteristic may comprise a data template.
- the data template may comprise unique passenger identification information.
- the passenger identification means may comprise: an identification device disposed on or in relation to the passenger, the identification device being adapted to convey unique passenger identification information, and
- a reading device adapted for reading the unique passenger identification information conveyed by the identification device.
- the identification device may be or comprise a portable device.
- the reading device may be or comprise a portable device.
- the unique passenger identification information may comprise, be comprised by, or be in the form of, a data template.
- the unique passenger identification information may comprise, or be comprised by, non-biometric characteristics.
- the identification device may comprise a transmitter.
- the unique passenger identification information may be conveyed by a transmission from the transmitter.
- the reading device may comprise a receiver.
- the receiver may detect the transmission from the identification device, thereby obtaining the unique passenger identification information.
- the receiver may be adapted to detect the transmission when in close proximity to the transmitter. For instance, the receiver may be adapted to detect the transmission when within twenty, fifteen, ten, nine, eight, seven, six, five, four, three, two, or one centimetres of the transmitter, or the transmission may dissipate over such distances.
- the identification device may comprise a wearable object.
- the wearable object may comprise a wristband.
- the wristband may be printable.
- the wristband may be adjustable prior to fastening.
- the wristband may be irremovable without damage thereto.
- the identification device may comprise an RFID device.
- the RFID device may be embedded in the wristband.
- the RFID device may comprise the transmitter.
- the RFID device may comprise an RFID tag.
- the RFID tag may comprise an NFC tag.
- the NFC tag may comprise the transmitter, the transmitter comprising an NFC antenna.
- the NFC tag may comprise an NFC chip.
- the reading device may comprise, or be comprised by a tablet, smart phone, console, or handheld portable reader.
- the reading device may comprise an RFID reader.
- the RFID reader may comprise an NFC reader.
- the reading device may comprise a biometric measuring device adapted for measuring biometric characteristics of a passenger.
- the biometric characteristics may be associated with or used as a basis to generate unique passenger identification information.
- the biometric characteristics may comprise physiological characteristics.
- the physiological characteristics may comprise at least one of fingerprint, DNA, hand geometry, palm vein arrangement, retinal blood vessel arrangement, iris configuration, and physical facial characteristics.
- the biometric characteristics may comprise behavioural characteristics.
- the behavioural characteristics may comprise at least one of signature, and voice characteristics.
- the biometric measuring device may comprise a palm vein scanner.
- the palm vein scanner may comprise a light emitter and a sensor for detecting the emitted light.
- the light emitter may emit infrared light and the sensor may detect infra-red light absorbed by the palm veins.
- the biometric measuring device may be connected with, comprise, or be comprised by a tablet, smart phone, or console.
- The system, or passenger identification means, may comprise passenger counting means.
- The passenger counting means may comprise part or all of the passenger identification means.
- The passenger counting means may be adapted to count passengers as they enter or exit the vessel.
- The passenger counting means may comprise a processor adapted to perform the counting operations.
- The processor may be comprised by the reading device. Additionally, or alternatively, the processor may be comprised by an on-board server linked with the reading device.
- The passenger counting means may comprise the reading device positioned at or on the way to a vessel exit or entry point.
- The reading device may be positioned at or near a boarding gate.
- The passenger counting means may comprise passenger identification devices disposed on or in relation to respective passengers.
- The passenger counting means may be adapted to count passengers entering or exiting the vessel as their passenger identification devices are detected by the reading device. Additionally, or alternatively, the passenger counting means may be adapted to count passengers entering or exiting the vessel as their biometric characteristics are detected or measured by the reading device.
- The passenger counting means may be adapted to distinguish between passengers entering the vessel and passengers exiting the vessel.
- the passenger counting means may determine that every odd number of detections for a particular passenger relates to a boarding movement, thereby reducing the missing passenger count by one (assuming the identified passenger is not already accounted for), and every even number of detections for a particular passenger relates to a disembarking movement, thereby increasing the missing passenger count by one unless the passenger is at the end of their journey.
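- By way of illustration only, the odd/even detection rule described in the preceding paragraph might be implemented along the lines of the following Python sketch; the class and method names (PassengerCounter, record_detection) are not drawn from the specification and the structure is an assumption.

```python
# Minimal sketch of the odd/even counting rule described above.
# All names are illustrative, not taken from the specification.

class PassengerCounter:
    def __init__(self, missing_count: int):
        # missing_count starts at the number of passengers yet to board
        self.missing_count = missing_count
        self.detections = {}          # passenger ID -> number of reads so far

    def record_detection(self, passenger_id: str, journey_complete: bool = False) -> None:
        """Apply the odd/even rule each time a passenger's wristband or
        biometric is read at the boarding gate."""
        self.detections[passenger_id] = self.detections.get(passenger_id, 0) + 1
        count = self.detections[passenger_id]

        if count % 2 == 1:
            # Odd detection: treated as a boarding movement.
            if self.missing_count > 0:
                self.missing_count -= 1
        else:
            # Even detection: treated as a disembarking movement,
            # unless the passenger has reached the end of their journey.
            if not journey_complete:
                self.missing_count += 1


# Example: two passengers board, then one goes ashore for an activity.
counter = PassengerCounter(missing_count=2)
counter.record_detection("P001")          # boards -> missing count 1
counter.record_detection("P002")          # boards -> missing count 0
counter.record_detection("P001")          # disembarks -> missing count 1
print(counter.missing_count)              # -> 1
```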
- the reading device may receive input, for instance from a crew member, indicating the direction of movement (on or off the vessel) of the detected passenger.
- entry and exit reading devices in the form of palm vein scanners, or facial recognition cameras, are provided.
- The passenger counting means may be adapted to distinguish between passengers boarding the vessel at the start of their journey or disembarking at the end of their journey, and passengers disembarking the vessel for an activity and re-boarding from the activity.
- The passenger counting means may comprise a passenger manifest.
- The passenger manifest may identify the total number of passengers due to board the vessel. Information relating to the passenger manifest may be wirelessly transmitted from an off-board server to the on-board server. This information may in turn be transmitted to the reading device.
- The passenger counting means may comprise an activity manifest.
- The activity manifest may identify the number of passengers due to re-board the vessel after returning from an activity.
- The passenger counting means may comprise a missing passenger count for identifying passengers missing from the vessel.
- The missing passenger count may comprise a booking count.
- The booking count may indicate the number of passengers yet to board the vessel.
- The missing passenger count may comprise an activity count.
- The activity count may indicate the number of passengers yet to re-board the vessel after disembarking (for an activity, for instance).
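- A minimal sketch of a missing passenger count split into a booking count and an activity count, as described above, is given below; the field and method names are illustrative assumptions rather than part of the specification.

```python
# Illustrative split of the missing passenger count into a booking count
# (yet to board at the start of the journey) and an activity count
# (yet to re-board after disembarking for an activity).

from dataclasses import dataclass

@dataclass
class MissingPassengerCount:
    booking_count: int = 0     # passengers yet to board the vessel
    activity_count: int = 0    # passengers yet to re-board after an activity

    @property
    def total(self) -> int:
        return self.booking_count + self.activity_count

    def passenger_boarded(self) -> None:
        if self.booking_count > 0:
            self.booking_count -= 1

    def passenger_left_for_activity(self) -> None:
        self.activity_count += 1

    def passenger_returned_from_activity(self) -> None:
        if self.activity_count > 0:
            self.activity_count -= 1


count = MissingPassengerCount(booking_count=3)
count.passenger_boarded()
count.passenger_left_for_activity()
print(count.total)   # -> 3 (two yet to board, one ashore on an activity)
```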
- The system, or passenger identification means, may comprise surveillance means.
- The surveillance means may comprise part or all of the passenger identification means.
- The surveillance means may comprise a database of stored images showing passenger faces, or of stored templates based on or derived from such images.
- The surveillance means may comprise an on-board camera.
- The reading device may comprise the on-board camera.
- the on-board camera may be wirelessly linked with the on-board server.
- The on-board camera may be adapted to capture images of passengers.
- the captured images may comprise images showing passenger faces.
- the surveillance means may be adapted to detect passenger faces in a captured image.
- the surveillance means may be adapted to compare the captured images with the stored images, or compare templates of captured images with stored templates derived from images.
- the surveillance means may be adapted to identify a passenger based on similarities of a detected face with a stored image, or based on similarities of a template of a captured image including a passenger face with a stored template.
- the surveillance means may be adapted to trigger an alert when the number of people captured in an image, or the number of people captured within a designated zone of the vessel, is greater than a predetermined value.
- the surveillance means may be adapted to calculate the number of times, or total time, a passenger visits a predesignated area. If the number of times or total time is greater than their respective predesignated values, the surveillance means may trigger a frequency or duration alert.
- the surveillance means may be adapted to trigger a rendezvous alert when two passengers facially recognised from a captured image match two passengers in stored images together associated with a rendezvous alert.
- the surveillance means may be adapted to trigger a transit alert when a passenger facially recognised from a captured image in one zone matches the same passenger facially recognised from an earlier captured image in another zone, or does not match zoning information associated with a stored image of that passenger.
- the surveillance means may be adapted to trigger an unattended alert when a passenger facially recognised from a captured image is not in close proximity to, or is greater than a pre-determined distance away from, or is not in the captured image with, or is determined to be in a different zone to, a predesignated accompanying passenger.
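- The alert conditions described above (crowd, frequency/duration, rendezvous, transit, and unattended alerts) could be evaluated against per-image recognition results roughly as in the following sketch; the thresholds, input structures, and function names are assumptions for illustration only.

```python
# Hedged sketch of the alert checks described above. The inputs (identified
# passengers per captured image, with a zone label) and all thresholds are
# assumptions for illustration only.

CROWD_LIMIT = 20
VISIT_LIMIT = 5                                   # max visits to a predesignated area
RENDEZVOUS_PAIRS = {frozenset({"P010", "P011"})}  # pairs flagged for a rendezvous alert

def check_alerts(image_passengers, zone, visit_counts, allowed_zones, companions):
    """Return alert labels raised by a single captured image.

    image_passengers: passenger IDs recognised in the image
    zone:             zone the image was captured in
    visit_counts:     passenger ID -> visits to the predesignated area so far
    allowed_zones:    passenger ID -> set of permitted zones
    companions:       passenger ID -> required accompanying passenger ID
    """
    alerts = []

    if len(image_passengers) > CROWD_LIMIT:
        alerts.append("crowd")

    for pid in image_passengers:
        if visit_counts.get(pid, 0) > VISIT_LIMIT:
            alerts.append(f"frequency:{pid}")
        if pid in allowed_zones and zone not in allowed_zones[pid]:
            alerts.append(f"transit:{pid}")
        companion = companions.get(pid)
        if companion and companion not in image_passengers:
            alerts.append(f"unattended:{pid}")

    for pair in RENDEZVOUS_PAIRS:
        if pair <= set(image_passengers):
            alerts.append("rendezvous:" + "+".join(sorted(pair)))

    return alerts
```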
- the system, or passenger identification means may comprise positioning means for identifying passenger position on the vessel.
- the positioning means may comprise part or all of the passenger identification means.
- The positioning means may comprise the passenger identification device disposed on or in relation to the passenger.
- The positioning means may comprise an on-board positioning receiver.
- The positioning receiver may be adapted to receive a positioning transmission from the passenger identification device.
- the positioning receiver may be associated with a receiver ID.
- The positioning means may be adapted to determine the position of the passenger on or in relation to the vessel based on at least one of the received transmission of positional information and the receiver ID.
- The positioning means may be adapted to display the position of the detected passenger on a map.
- The positioning means may be adapted to determine or indicate the position of the passenger within twenty metres, fifteen metres, ten metres, eight metres, six metres, five metres, four metres, three metres, two metres or one metre.
- The positioning receiver may be adapted to detect the positioning transmission over an extended distance from the transmitter.
- the positioning receiver may be adapted to detect the transmission when over two metres, five metres, ten metres, twenty metres, thirty metres, fifty metres, seventy-five metres, or one hundred metres away, or the positioning transmission may dissipate over such distances.
- The identification device may comprise a BLE chip.
- The BLE chip may be adapted to transmit a low energy Bluetooth signal.
- The BLE chip may be embedded in the wristband.
- The BLE chip may comprise the positioning transmitter.
- The positioning means, positioning receiver, or other linked device or processor may be adapted to determine whether an earlier positioning transmission from the wristband was received within a predetermined time period, or whether a later positioning transmission is received within a predetermined time period.
- the time period may be configurable.
- the positioning receiver may be adapted to ignore later positioning transmissions from the identification device that are received within a predetermined time period.
- the positioning means or receiver may be adapted to ignore the positioning transmission if a previous transmission has been received within a predetermined time period. Thus, the passenger / positional transmitter position may only be updated after a pre-determined time interval.
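- A minimal sketch of the time-period filtering described above follows; the interval value and function name are assumptions, and a real implementation would likely live on the positioning receiver or the on-board server.

```python
# Illustrative debounce for positioning transmissions: a passenger's position
# is only re-evaluated after a configurable interval, and any transmission
# received inside that interval is ignored.

import time
from typing import Dict, Optional

UPDATE_INTERVAL_S = 30.0              # configurable predetermined time period
_last_update: Dict[str, float] = {}   # wristband / passenger ID -> last accepted time

def accept_positioning_transmission(tag_id: str, now: Optional[float] = None) -> bool:
    """Return True if this transmission should update the stored position,
    False if it arrived within the predetermined time period and is ignored."""
    now = time.time() if now is None else now
    last = _last_update.get(tag_id)
    if last is not None and (now - last) < UPDATE_INTERVAL_S:
        return False
    _last_update[tag_id] = now
    return True
```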
- the positioning means may be adapted to determine whether the detected position of the passenger or positional transmitter is within a restricted zone or zone restricted to them.
- the positioning means may be adapted to trigger or generate a location alert when the passenger is detected in the restricted zone.
- the system, or an electronic device or devices comprised by the system may comprise a processor.
- the system, or electronic device or devices comprised by the system may comprise memory.
- the memory may comprise at least one of RAM, ROM, data, programs, and an operating system.
- the system or electronic device or devices comprised by the system may comprise input means, a display, a power source, and/or a communications interface.
- the communications interface may comprise a network interface.
- The, or some of the, devices of the system may be, or be adapted to be, wirelessly linked.
- the invention provides a passenger management method. In another aspect the invention provides a method of managing passengers of a vehicle.
- the vehicle may comprise a vessel.
- the method may comprise:
- the information may be read or received by a reading device.
- the transmission may comprise an RFID transmission.
- the RFID transmission may comprise an NFC transmission.
- the method may comprise determining the identity of the passenger based on the received passenger identification information.
- the method may comprise bringing the reading device into close proximity with the identification device in order to receive the transmission therefrom.
- Close proximity may be less than five centimetres. It is to be noted that bringing is used herein as a relative term, so bringing the reading device into close proximity with the identification device is no different to bringing the identification device into close proximity with the reading device.
- the method may comprise positioning the reading device at or on the way to an exit or entry point.
- the exit point may be an entry or exit point of a or the vessel.
- the method may comprise reading or measuring biometric characteristics of a passenger.
- the biometric characteristics may comprise at least one of fingerprint, DNA, hand geometry, palm vein arrangement, retinal blood vessel arrangement, iris configuration, facial structure, signature, and voice characteristics.
- the biometric characteristics comprise palm vein arrangement.
- reading or measuring biometric characteristics of a passenger may comprise measuring palm vein arrangement.
- Measuring palm vein arrangement may comprise beaming infrared light towards a passenger's palm and detecting the infra-red light absorbed by the palm veins.
- the method may comprise determining or conveying unique passenger identification information based on the measured biometric characteristics.
- the method may comprise counting passengers by received transmissions. Additionally, or alternatively, the method may comprise counting passengers by measuring or reading biometric characteristics.
- the method may comprise increasing or decreasing a passenger count by one each time a transmission is received. Additionally, or alternatively, the method may comprise increasing or decreasing a passenger count by one each time biometric characteristics are read or measured.
- the passenger count may comprise a missing count of passengers off the vessel.
- the passenger count may comprise a count of passengers on-board the vessel.
- the method may comprise obtaining or determining the direction of passenger movement either on or off the vessel at the time the transmission is received. Additionally, or alternatively, the method may comprise obtaining or determining the direction of passenger movement either on or off the vessel at the time the biometric characteristics are measured.
- the method may comprise reducing the missing count when a passenger is detected moving on to the vessel, or increasing the missing count when a passenger is detected moving off of the vessel.
- the method may comprise reducing the on-board count by one when a passenger is detected disembarking the vessel, and increasing the onboard count by one when a passenger is detected moving on to the vessel.
- the method may comprise updating a status of a passenger when detected moving on or off the vessel.
- the status may be updated to indicate that the passenger is off board when detected moving off the vessel.
- The status may be updated to indicate that the passenger is on-board when detected moving on to the vessel.
- the method may comprise determining a passenger is missing based on their present status or most recent direction of movement.
- the method may comprise determining the identity of a missing passenger based on their received passenger identification information. Additionally, or alternatively, the method may comprise determining the identity of a missing passenger based on their determined passenger identification information.
- the method may comprise immobilising the vessel, or an engine of the vessel until a missing passenger is accounted for.
- the missing passenger may be accounted for when their status is 'on-board'. Where there are multiple missing passengers, all may be accounted for when the missing passenger count equals zero.
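- One possible reading of the immobilisation condition above is sketched below; the function name and the way statuses are represented are assumptions, and the actual engine interlock is outside the scope of the sketch.

```python
# Sketch of the engine-interlock idea: the vessel (or its engine) remains
# immobilised while any passenger is unaccounted for.

def may_depart(missing_count: int, statuses: dict) -> bool:
    """A passenger is accounted for when their status is 'on-board';
    departure is allowed only when nobody is missing."""
    return missing_count == 0 and all(s == "on-board" for s in statuses.values())

statuses = {"P001": "on-board", "P002": "off-board"}
print(may_depart(missing_count=1, statuses=statuses))   # -> False, stay immobilised
```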
- the method may comprise transmitting information relating to the passenger count from the reading device to an on-board server.
- the method may comprise syncing the passenger count on other reading devices with the passenger count recorded by the on-board server.
- the method may comprise presenting personalised booking, marketing, sales, activity, medical, security or photo information to a reading device, or device linked therewith, based on received or read personal identification information. Additionally, or alternatively, the method may comprise presenting personalised booking, marketing, sales, activity, medical, security or photo information to a reading device, or device linked therewith, based on personal identification information associated with measured biometric characteristics.
- the method may comprise transmitting positional information from a position device disposed on or in relation to a passenger.
- the position device may be the identification device.
- the position device may comprise the identification device.
- the passenger identification device may comprise the position device.
- the method may comprise receiving the transmission of positional information by a positional receiver.
- the method may comprise the receiving device receiving a transmission of passenger identification information from the position or identification device.
- the method may comprise determining the identity of the passenger based on the received transmission.
- the method may comprise associating the positional receiver with a receiver ID.
- the method may comprise identifying the location of the positional receiver based on the receiver ID.
- the method may comprise determining the position of the passenger on or in relation to the vessel based on at least one of the received transmission of positional information and the receiver ID.
- the method may comprise determining or indicating the position of the passenger within twenty metres, fifteen metres, ten metres, eight metres, six metres, five metres, four metres, three metres, two metres or one metre.
- the method may comprise generating a map of at least part of the vessel.
- T he method may comprise displaying the detected position of the passenger on the map.
- the identification or position device may be adapted to transmit positional information over at least five, ten, fifteen, twenty, thirty, fifty, seventy or one hundred metres.
- the transmission of positional information may comprise a BLE transmission.
- the transmission of positional information may comprise a low energy Bluetooth transmission.
- the method may comprise dividing the vessel or parts of the vessel into zones.
- the method may comprise locating at least one positional receiver in each zone.
- the positional receiver may be adapted to receive positional information of passengers within the zone of the positional receiver.
- the method may comprise determining or locating the position of the passenger within a particular zone.
- the method may comprise multiple towers receiving a positional transmission from a position or identification device. Multiple towers may be placed within a single zone and/or between zones (i.e. in different zones).
- the method may comprise determining the position of the passenger based on positional information received from multiple positional receivers. Determining the position of the passenger may comprise triangulation of positional information received by multiple positional receivers.
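- Triangulation from multiple positional receivers could be approximated as in the following sketch, which uses a simple RSSI-weighted centroid rather than full triangulation; receiver coordinates and the signal-strength weighting are assumptions.

```python
# Hedged sketch of estimating a passenger's position from one positioning
# transmission heard by several receivers at known locations.

def estimate_position(readings):
    """readings: list of (receiver_x, receiver_y, rssi_dbm) tuples for a
    single positioning transmission heard by several receivers."""
    if not readings:
        return None
    # Stronger (less negative) RSSI -> larger weight.
    weights = [10 ** (rssi / 20.0) for _, _, rssi in readings]
    total = sum(weights)
    x = sum(w * rx for (rx, _, _), w in zip(readings, weights)) / total
    y = sum(w * ry for (_, ry, _), w in zip(readings, weights)) / total
    return (x, y)

# Three receivers in different zones hear the same BLE transmission.
print(estimate_position([(0.0, 0.0, -55), (10.0, 0.0, -70), (0.0, 8.0, -75)]))
```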
- the positional receivers may transmit information relating to the position of a passenger to the on-board server for processing or storage.
- the method may comprise determining whether an earlier positioning transmission from the wristband has been received within a predetermined time period.
- the method may comprise determining whether a later positioning transmission has been received within a predetermined time period.
- The method may comprise ignoring positioning transmissions received within the predetermined time period.
- The method may comprise not updating or re-evaluating the passenger position until after the pre-determined time period.
- The method may comprise determining whether the detected or determined position of the passenger is within a restricted zone. The method may comprise triggering a location alert when the passenger is detected in the restricted zone.
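- The restricted-zone check and location alert described above might take a form such as the following sketch; the zone names and per-passenger restriction table are assumptions.

```python
# Illustrative restricted-zone check: once a passenger's zone has been
# determined, a location alert is raised if that zone is restricted to them.

RESTRICTED_ZONES = {"engine_room", "bridge"}             # restricted to all passengers
PER_PASSENGER_RESTRICTIONS = {"P002": {"crew_quarters"}} # zones restricted to specific passengers

def location_alert(passenger_id: str, zone: str) -> bool:
    restricted = RESTRICTED_ZONES | PER_PASSENGER_RESTRICTIONS.get(passenger_id, set())
    return zone in restricted

print(location_alert("P002", "crew_quarters"))   # -> True, trigger a location alert
```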
- The method may comprise capturing an image of a passenger.
- The image may be captured by an on-board camera.
- The method may comprise detecting the face of the passenger in the image.
- The method may comprise storing images showing passenger faces.
- The stored images may be stored on the on-board server. Additionally, or alternatively, the method may comprise creating templates based on images showing passenger faces.
- The method may comprise storing the templates.
- The template may be stored on the on-board server.
- The method may comprise associating passenger identification information with a stored image of the passenger. Additionally, or alternatively, the method may comprise associating passenger identification information with a stored template of or based on an image of the passenger.
- The method may comprise comparing the face of the passenger in the captured image with the faces of passengers in the stored images. Additionally, or alternatively, the method may comprise comparing a template of a captured image including a passenger face with stored templates.
- The method may comprise determining similarities between the face of a passenger in a captured image and the face of a passenger in the stored image. Additionally, or alternatively, the method may comprise determining similarities between the template of a captured image including a passenger face and a stored template.
- The method may comprise determining a passenger match based on the similarities between faces in the captured image and stored image. Additionally, or alternatively, the method may comprise determining a passenger match based on the similarities between templates generated from captured images and stored templates.
- The method may comprise identifying the passenger in the captured image based on the passenger identification information associated with the stored image for which a passenger match was determined. Additionally, or alternatively, the method may comprise identifying the passenger in the template of the captured image based on the passenger identification information associated with the stored template for which a passenger match was determined.
- The method may comprise identifying a passenger in the captured image based on similarities with a passenger in a stored image. Additionally, or alternatively, the method may comprise identifying a passenger from a template of a captured image based on similarities with a stored template of an image of or including the passenger.
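- Matching a template derived from a captured image against stored templates, as described in the preceding paragraphs, might resemble the following sketch; real systems derive templates from a face-recognition model, whereas here templates are plain numeric vectors and the similarity threshold is an assumption.

```python
# Sketch of matching a captured-image template against stored templates.
# Templates are represented as plain numeric vectors for illustration.

import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify(captured_template, stored_templates, threshold=0.8):
    """stored_templates: passenger ID -> template vector.
    Returns the best-matching passenger ID, or None if no match clears
    the similarity threshold."""
    best_id, best_score = None, threshold
    for passenger_id, template in stored_templates.items():
        score = cosine_similarity(captured_template, template)
        if score > best_score:
            best_id, best_score = passenger_id, score
    return best_id
```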
- The method may comprise determining when the number of people captured in an image, or the number of people captured within a designated zone of the vessel, is greater than a predetermined value.
- The method may comprise generating a crowd alert when the number of people captured in an image, or the number of people captured within a designated zone of the vessel, is greater than a predetermined value.
- The method may comprise capturing multiple images.
- There may be multiple images of a particular passenger, a particular zone, or a particular passenger within a particular zone.
- The method may comprise analysing or comparing multiple captured images.
- The method may comprise calculating the number of times, or total time, a passenger visits a predesignated area based on multiple captured images.
- The method may comprise triggering a frequency or duration alert when the number of times, or total time, is greater than a respective associated predesignated value.
- The method may comprise identifying two passengers in a captured image as matching two passengers in stored images associated with a rendezvous alert.
- The method may comprise generating the rendezvous alert in such instances.
- The method may comprise identifying the same passenger from two captured images taken in or of different zones.
- The method may comprise identifying a passenger from a captured image in a zone which does not match zoning information associated with a stored image of that passenger.
- The method may comprise generating a transit alert in such instances.
- The method may comprise determining that a passenger identified from a captured image is not in close enough proximity to, or is greater than a pre-determined distance away from, or is not in the captured image with, or is determined to be in a different zone to, a predesignated accompanying passenger associated with the identified passenger.
- The method may comprise presenting or making accessible at least one captured image of a passenger upon reading personal identification information relating to that passenger as transmitted from the identification device.
- the invention provides a system for managing passengers of a vehicle comprising a reading device adapted for reading biometric characteristics of a passenger, the biometric characteristics being associated with unique passenger identification information.
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a system for managing passengers of a vehicle, comprising:
- a palm vein scanner adapted to scan palm veins of a passenger, generate a palm vein image based on the scan, create a data template based on the palm vein image, and convey the data template to an external device;
- a server adapted to associate unique passenger identification information with the conveyed data template, and store the data template in a database of templates for future comparison.
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a system for managing passengers of a vehicle, comprising:
- a palm vein scanner adapted to scan palm veins of a passenger, generate a palm vein image based on the scan, create a data template based on the palm vein image, and convey the data template to an external device;
- a server adapted to compare the data template with a database of stored templates associated with unique passenger identification information, match the data template with one of the stored templates; and identify the passenger using the unique passenger identification information associated with the matched stored template.
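- An end-to-end sketch of the enrolment and identification flow described above follows; the scanner-side steps are represented by stub functions, exact-match lookup stands in for template comparison, and all names are illustrative assumptions.

```python
# Hedged sketch of the enrolment and identification flow: scan -> image ->
# data template on the scanner side, then association / matching against a
# database of stored templates on the server side.

def scan_palm_veins():                  # stand-in for the scanner hardware
    return "raw-infrared-scan"

def generate_image(scan):               # stand-in for palm vein image generation
    return f"image({scan})"

def create_template(image):             # stand-in for template extraction
    return f"template({image})"

class PalmVeinServer:
    def __init__(self):
        self.templates = {}             # stored template -> unique passenger ID

    def enrol(self, template, passenger_id):
        self.templates[template] = passenger_id

    def identify(self, template):
        # A real system compares templates for similarity; exact lookup is
        # used here purely to keep the sketch short.
        return self.templates.get(template)

server = PalmVeinServer()
template = create_template(generate_image(scan_palm_veins()))
server.enrol(template, "P001")                  # check-in / enrolment
print(server.identify(template))                # later on-board identification -> "P001"
```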
- the invention provides a method of identifying passengers of a vehicle, comprising:
- the invention provides a system for counting passengers of a vehicle, comprising:
- a palm vein scanner adapted to scan the palm veins of a passenger, generate a palm vein image based on the scan, create a data template based on the palm vein image, and convey the data template to an external device;
- a server adapted to compare the data template with a database of stored templates associated with unique passenger identification information, match the data template with one of the stored templates, and increase or reduce a passenger count according to whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a system for managing passengers of a vehicle, comprising:
- a palm vein scanner adapted to scan palm veins of a passenger, generate a palm vein image based on the scan, create a data template based on the palm vein image, and convey the data template to an external device; a server adapted to receive and compare the data template with a database of stored templates associated with unique passenger identification information, match the data template with one of the stored templates; and identify the passenger using the unique passenger identification information associated with the matched stored template; and
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of identifying passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of identifying passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of identifying passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising: reading a biometric characteristic of, or an identification device disposed on, a passenger;
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of identifying passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of identifying passengers of a vehicle, comprising:
- the method may further comprise increasing or reducing a passenger count according to the passenger's direction of movement or whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
- the method may further comprise enabling the identified passenger to access personalised content on an electronic device or terminal.
- the invention provides a method of identifying passengers of a vehicle, comprising:
- the method may further comprise increasing or reducing a passenger count according to the passenger's direction of movement or whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
- the method may further comprise enabling the identified passenger to access personalised content on an electronic device or terminal.
- a combination of means may be used in accordance with the invention in order to identify passengers of a vehicle.
- two or more of the following means may be used in combination:
- i) an identification device disposed on or in relation to the passenger, the identification device being adapted to convey unique passenger identification information; and, a reading device adapted for reading the unique passenger identification information conveyed by the identification device; ii) a camera for capturing an image which includes the face of a passenger; and, facial recognition means for identifying the face of the passenger captured in the image; iii) a palm vein scanner adapted to scan palm veins of a passenger; and
- a server adapted to match template data conveyed from the palm vein scanner with one of a number of stored templates containing unique passenger identification information for identifying the passenger.
- the invention provides a system for managing passengers of a vehicle, comprising:
- a palm vein scanner adapted to scan palm veins of a passenger
- a server adapted to match template data conveyed from the palm vein scanner with one of a number of stored templates containing unique passenger identification information for identifying the passenger
- a camera for capturing an image which includes the face of a passenger
- facial recognition means for identifying the face of the passenger captured in the image.
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a method of managing passengers of a vehicle such as a vessel, comprising:
- the invention provides a method of managing passengers of a vehicle, comprising:
- reading at least one biometric characteristic of a passenger; reading at least one different biometric characteristic, or a non-biometric characteristic comprising a transmitted data template, relating to the, or another, passenger;
- the invention provides a method of managing passengers of a vehicle, comprising:
- the at least one non-biometric characteristic comprises a transmitted data template relating to the, or another, passenger; accessing a database of stored data templates, each stored data template being associated with unique passenger identification information; comparing each read data template with the stored data templates; matching each read data template with a stored data template;
- the method may comprise:
- Reading at least one biometric characteristic may comprise scanning the palm veins of the passenger.
- Reading at least one non-biometric characteristic may comprise electronically receiving a transmission of a data template from a device disposed on or in relation to a passenger.
- Reading the at least one biometric characteristic may comprise capturing an image which includes the face of a passenger.
- the method may comprise increasing or reducing a passenger count according to whether the, or each, passenger is entering or exiting the vehicle when identified.
- the method may comprise enabling the, or each, identified passenger to access personalised content on an electronic device or terminal.
- the method may comprise presenting content on an electronic device or terminal to the, or each, identified passenger.
- Reading of at least one biometric or non-biometric characteristic may occur at least as the, or each, passenger is boarding the vehicle. Once the, or each, passenger is identified, a database may be updated to indicate that the, or each, passenger is boarding or is aboard the vehicle.
- Reading of at least one biometric or non-biometric characteristic may occur at least as the, or each, passenger is disembarking the vehicle. Once the, or each, passenger is identified, a database may be updated to indicate that the, or each, passenger is disembarking, or has disembarked, the vehicle.
- the method may comprise identifying a zone or region of the vehicle in which the, or each, identified passenger is located.
- the method may comprise verifying that the, or each, identified passenger is on a passenger manifest for the vehicle.
- the invention provides a system for managing passengers of a vehicle, the system comprising:
- At least one biometric reader configured to read a biometric characteristic of a passenger, create a data template based on the reading, and convey the data template directly or indirectly to a server;
- At least one non-biometric reader configured to read a transmitted non-biometric characteristic comprising a data template relating to the, or another, passenger, and convey the data template directly or indirectly to the server;
- the server configured to receive each data template, access a database of stored data templates, each stored template being associated with unique passenger identification information, compare each data template with the stored data templates, match each data template with a stored data template; and identify the, or each, passenger based on the unique passenger identification information associated with each matched stored template.
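- The combined flow of biometric and non-biometric readers conveying data templates to a server for matching, as set out above, is sketched below; the Reader and Server classes are illustrative stand-ins rather than the specification's interfaces.

```python
# Sketch of biometric and non-biometric readers both conveying data
# templates to a server, which matches each template against stored
# templates to identify the passenger.

class Server:
    def __init__(self, stored):
        # stored: data template -> unique passenger identification information
        self.stored = stored

    def identify(self, data_template):
        return self.stored.get(data_template)

class Reader:
    """Stand-in for a biometric (e.g. palm vein) or non-biometric (e.g. NFC
    wristband) reader; both simply forward a data template to the server."""
    def __init__(self, server):
        self.server = server

    def convey(self, data_template):
        return self.server.identify(data_template)

server = Server({"palm-template-001": "P001", "nfc-template-042": "P042"})
biometric_reader = Reader(server)
non_biometric_reader = Reader(server)
print(biometric_reader.convey("palm-template-001"))     # -> "P001"
print(non_biometric_reader.convey("nfc-template-042"))  # -> "P042"
```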
- the invention provides a method of managing passengers of a vehicle, comprising:
- the invention provides a system for managing passengers of a vehicle, comprising:
- At least one palm vein scanner configured to scan palm veins of a passenger at least as they board or disembark the vehicle, create a data template based on the scan, and convey the data template directly or indirectly to a server;
- the server configured to compare the created data template with a database of stored templates associated with unique passenger identification information, match the created data template with one of the stored data templates; identify the passenger using the unique passenger identification information associated with the matched stored template; update a vehicle presence database to indicate that the identified passenger is boarding, or is aboard, the vehicle when scanned boarding, or update the vehicle presence database to indicate that the identified passenger is disembarking, or has disembarked, the vehicle when scanned disembarking.
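- The vehicle presence update described above might be recorded as in the following sketch; the direction argument and the presence table layout are assumptions.

```python
# Illustrative vehicle presence update: each palm vein scan at the gangway
# identifies the passenger and flips their presence record.

presence = {}       # passenger ID -> "aboard" | "disembarked"

def record_scan(passenger_id: str, direction: str) -> None:
    """direction is 'boarding' or 'disembarking', e.g. determined by which
    scanner (entry or exit) produced the read."""
    presence[passenger_id] = "aboard" if direction == "boarding" else "disembarked"

record_scan("P001", "boarding")
record_scan("P001", "disembarking")
print(presence["P001"])     # -> "disembarked"
```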
- Fig. 1 is a diagram of an example passenger management system that may be utilised to embody or implement aspects of the invention;
- Fig. 2a is a diagram illustrating an assisted check-in;
- Fig. 2b is a diagram illustrating a self-check-in;
- Fig. 2c is a diagram illustrating a self-check-in via a kiosk using a palm vein scanner;
- Fig. 3 is a block diagram illustrating components of the example system of Figure 1;
- Fig. 4a is an isometric view of a self-check-in kiosk;
- Fig. 4b is a front view of the kiosk of Fig. 4a;
- Fig. 4c is a side view of a self-check-in kiosk having an alternative wristband dispensing outlet;
- Fig. 4d is a front view of a self-check-in kiosk having a palm vein scanner;
- Fig. 4e is a side view of the kiosk of Fig. 4d;
- Fig. 4f is an isometric view of a self-check-in kiosk having an alternative palm vein scanner;
- Fig. 5 is an isometric view of an internal RFID printer of the kiosk;
- Fig. 6 is an isometric view of an external RFID printer for connection with an assisted check-in terminal;
- Figs. 7a & 7b are isometric views of rolls of wristband material for placement within an RFID printer;
- Fig. 8a is a bottom view of an NFC wristband;
- Fig. 8b is a top view of an NFC wristband;
- Fig. 8c is an isometric view of an NFC wristband;
- Fig. 8d is an isometric view of a fastening clip for an NFC wristband;
- Fig. 8e is an isometric view of an NFC wristband fastened on the wrist of a passenger;
- Fig. 8f is an isometric view of an NFC & BLE wristband with ends magnetically fastened;
- Fig. 8g is a side view of the NFC & BLE wristband of Fig. 8f with ends magnetically fastened;
- Fig. 8h is a top view of the NFC & BLE wristband of Fig. 8f in an unfastened, flattened configuration;
- Fig. 8i is a side view of the NFC & BLE wristband of Fig. 8f in an unfastened, flattened configuration;
- Fig. 8j is an isometric view of a palm vein sensor device;
- Fig. 8k is an isometric view of a hand guide suitable for mounting on the palm vein sensor device of Fig. 8j;
- Fig. 8l is an isometric view of the alternative palm vein scanner of Fig. 4f;
- Fig. 9a is a front view of an on-board server resting on its feet;
- Fig. 9b is a perspective view of the on-board server of Fig. 9a standing on its side;
- Fig. 9c is a diagram of internal hardware componentry of the on-board server of Fig. 9a;
- Fig. 10a is an isometric view of a surveillance camera;
- Fig. 10b is a rear view of the camera of Fig. 10a;
- Fig. 10c is a left side view of the surveillance camera of Fig. 10a;
- Fig. 10d is a view of a portion of the right side of the camera of Fig. 10a;
- Fig. 11a is a front view of a tablet with stand;
- Fig. 11b is a side view of the tablet of Fig. 11a with stand;
- Fig. 12 is an isometric view of an NFC reader;
- Fig. 13a is an isometric view of a tablet mount with NFC reader pocket;
- Fig. 13b is an isometric view of a tablet mount with integral NFC reader;
- Fig. 13c is an isometric view of a tablet mount with mounting arm;
- Fig. 13d is an isometric view of a tablet mount with stand;
- Fig. 13e is an isometric view of a tablet mount adapted to hold tablets of different sizes;
- Fig. 14a is an isometric view of a console with twenty-seven-inch screen;
- Fig. 14b is an isometric view of a console with twenty-four-inch screen;
- Fig. 14c is an isometric view of a console with twenty-one-inch screen;
- Fig. 14d is an isometric view of a tablet connected with a small palm vein scanner;
- Fig. 14e is an isometric view of a tablet and palm vein scanner enclosed within a case;
- Fig. 14f is an isometric view of a tablet on a mount and connected by cable to a separate palm vein scanner;
- Fig. 14g is an isometric view of the palm vein scanner of Fig. 14f, showing the sensor portion of the scanner as partially transparent;
- Fig. 14h is an isometric view of an adjustable hand guide or bracket of the palm vein scanner of Fig. 14f;
- Fig. 14i is an isometric view of a wall mountable palm vein scanner;
- Fig. 15 is an isometric view of a BLE receiver;
- Fig. 16 is a diagram of an example passenger tracking system that may be utilised to embody or implement aspects of a passenger tracking process;
- Fig. 17 is a diagram illustrating an exemplary tour process;
- Fig. 18 is a logic flow diagram illustrating a self-check-in process;
- Fig. 19 is a logic flow diagram illustrating an assisted check-in process;
- Fig. 20 is a logic flow diagram illustrating a passenger counting process;
- Fig. 21 is a logic flow diagram illustrating a passenger activity counting process;
- Fig. 22 is a logic flow diagram illustrating a video surveillance process;
- Fig. 23 is a logic flow diagram illustrating a location tracking process;
- Figs. 24a-d are a series of schematic diagrams respectively illustrating the process of: a) palm placement above a palm vein scanner, b) palm vein scanning, c) palm vein image generation, and d) palm vein template storage;
- Fig. 25 is a logic flow diagram illustrating an enrolment process for palm vein scanning during check-in;
- Fig. 26 is a counting on process for passengers boarding the vessel using palm vein scanning;
- Fig. 27 is a counting off process for passengers disembarking the vessel using palm vein scanning; and
- Fig. 28 is a process for accessing customised on-board applications authorised by palm vein scanning.
- the system 10 comprises multiple hardware devices linked by a network.
- the hardware devices comprise various electronic, processing and/or computing devices including: Near Field Communication ('NFC') wristbands 13a, NFC / Bluetooth Low Energy ('BLE') wristbands 13b, NFC / Global Positioning System ('GPS') wristbands 13c, or colour coded wristbands 13d to be worn by each passenger 11, self-check-in kiosks 14 (see also Figures 2b and 2c), self-check-in tablets 190, and staff 57 assisted check-in terminals 15 (see also Figure 2a).
- the kiosks 14, tablets 190, and terminals 15 are each located at a port, station or other check-in facility 185, which is off-vessel in this instance.
- Each kiosk 14, tablet 190, and terminal 15 comprises or is connected in close proximity with a camera 16, and/or a radio-frequency identification ('RFID') printer 17, and/or a check-in palm vein scanner 191, and/or a check-in DNA reader 193, and is adapted to execute a self or assisted check-in application 18 installed thereon.
- a REST API server runs a REST API application which connects various electronic devices together via the internet.
- the kiosks 14, tablets 190, and terminals 15 are each connected through REST API to a remote booking server 19, via the internet.
- the booking server 19 has installed thereon a booking application 100, and in turn communicates with an off-vessel database server 23 which controls data transfer to and from a provider database 24.
- the kiosks 14 and terminals 15 are connected to an off-vessel facial recognition server 21 via the internet 20.
- the facial recognition server 21 has installed thereon a facial recognition application.
- the facial recognition server 21 is adapted to send a security alert 25 to third party devices (not shown) when predetermined conditions are met.
- On board the vessel 12 is located an on-board server 26 which, depending on vessel location, is intermittently connected with the remote booking server 19, facial recognition server 21, and database server 23.
- the on-board server 26 is connected via a closed on-board network with multiple BLE receivers 81, multiple cameras 16 placed around the vessel, such as closed-circuit television ('CCTV') cameras and/or other forms of surveillance cameras including cameras suitable for facial recognition processes, multiple provider consoles 27 either standing, fixedly mounted or portably carried by crew, the consoles 27 being adapted to run a real-time tracking web application 28 which runs in a web browser that is accessible in the closed network, and multiple portable tablets 29 adapted to run a native (e.g. to the Android platform) management application 31 having various management modules.
- Detection and/or identification devices in the form of NFC readers 32 can be linked with or comprised by the consoles 27 and/or tablets 29. Further, identification devices in the form of on-board/transit palm vein scanners 192, and/or on-board/transit DNA readers 194 can be linked with or comprised by the consoles 27 and/or tablets 29.
- the on-board server 26 is adapted to send a security alert 33 to third party devices, off-board devices, and/or on-board devices when appropriate security conditions are met.
- The on-board server is linked with one or more on-board databases 135 in which are recorded a passenger manifest 127 and an activity manifest 136.
- The passenger manifest is originally received from the database server 23.
- Further installed on the on-board server are a suite of executable applications comprising: a passenger count application 128 for providing real-time numbers and whereabouts of passengers on and off board, an on-board sales and marketing application 129 enabling cashless purchases by passengers using their wristbands for identification, a medical and legal application 130 for enabling passengers to complete Medical Self Risk Assessments and Assumption of Risk and Liability Waivers prior to undertaking any activities, plus Medical Declarations counter-signed by staff prior to undertaking any high-risk activities, an activity scheduling application 131 for making and changing tour activity bookings, a photo management application 132 capable of filtering and sending photos of passengers taken by on-board cameras directly to the passengers captured in the photos, a management reporting application 133 for EMC reporting and analytics, and a security application 134 for real-time facial recognition image processing and security alert generation.
- a ship log application with programs relating to the weather, tides, navigation and systems checks, master and crew, counts and voyage details, incidents, and diving instruction, along with the passenger manifest application and a passenger location tracking application, are all accessible and viewable by crew via a graphic user interface presented on consoles 27.
- passenger count, crew count, sales / marketing, activity scheduling, medical and legal forms, and/or photo sales / marketing are presentable onto guest and/or crew tablets 29 by the on-board server 26.
- the system 10 may additionally, or alternatively, include other electronic or computing devices, such as one or more desktop computers, laptop computers, implanted devices, smart phones 34 and/or wearable devices such as smartwatches and headsets.
- the network of the exemplary system 10 comprises the internet 20 and an on-board LAN, and may further include other networks such as WAN, Ethernet, token ring, satellite communications networks, telecommunications networks, including mobile telephone networks such as GSM or 3G networks, etc., or any combination thereof, by which the hardware devices can communicate. This enables, for instance, input and output data to be communicated via the network between the hardware devices.
- Interconnections between devices facilitating transfer of data and/or information over the network may be wholly or partially wired, for example by optical fibre, or wireless, for example by utilising Wi-Fi, Bluetooth, cellular, or satellite communications networks.
- the closed on-board LAN uses ad-hoc Wi-Fi to interconnect local devices, whereas connections between NFC readers 32 and consoles or tablets can be via bus and power cables.
- networked infrastructural system 10 represents only a single example of infrastructure which may be suitable for implementing aspects of the invention.
- Other suitable networked systems for implementing the invention may involve various alternative devices, configurations, networks, or architectures without departing from the scope of the present invention.
- Referring now to Figure 3 there is shown a block diagram of computing hardware 56 associated with the system 10.
- Some or all of the hardware 56 is associated with potentially any of the electronic devices 13, 14, 15, 16, 17, 23, 26, 27, 29, 32, 190, 191, 192, 193, and/or 194.
- The hardware 56 includes memory 35 which includes ROM 36 and RAM 37 as well as stored data 38, collections of instructions forming programs and applications 39, and an operating system 40.
- a bus 46 links the memory 35 to a processor 41 , display 42, and input means 43.
- The hardware is powered by a power supply 44.
- a communications interface 45 enables the device to communicate with other devices.
- the exemplary kiosk 14a includes internally a high-performance Intel i5-6500 processor, memory in the form of 4 GB RAM (upgradable to 32 GB) and a 2 TB hard disk drive.
- Various communications interfaces are provided including an open component interface zone 49 which includes an audio jack, and a Wi-Fi compliant network interface card.
- the kiosk 14a further includes multiple payment interfaces in the form of a credit / debit card slot interface 48 and an NFC / RFID card touch or proximity interface 47, enabling direct card or phone touch purchases to be made conveniently by the user.
- the kiosk 14a further includes input devices or means in the forms of a keypad 50 and a nineteen-inch touchscreen 51 (e.g. PCAP, SAW), the touchscreen doubling as an anti-glare display screen.
- a barcode scanner 52 is present for scanning tickets or boarding passes.
- An amplified speaker and 360-degree LED shelf lighting are also provided.
- the kiosk 14a further includes an internal RFID printer 17a (see Figure 4c) for printing NFC wristbands 13 and dispensing them for user collection via a slot 54 (Figure 4b) or recess 55 (Figure 4a).
- Referring now to Figures 4d and 4e there is shown a second version of the self-check-in kiosk 14b.
- the exemplary kiosk 14b shares many of the features of the first version of the kiosk 14a, but further includes a first version of a check-in palm vein scanner 191 a.
- Referring now to Figure 4f there is shown a third version of the self-check-in kiosk 14c.
- the exemplary kiosk 14c shares many of the features of the first version of the kiosk 14a, but further includes a second version of a check-in palm vein scanner 191 b, as well as a passport reader 195.
- the RFID printer 17a is adapted to thermally print on the wristbands 13 and embed them with RFID transponders comprising NFC chips.
- the exemplary printer 17a includes FLASH expansion memory of up to 28 MB, an automatic cutting unit for band cutting, network interfaces for Ethernet, Wi-Fi and Bluetooth, 600 DPI print resolution, twenty-four-volt DC power input, and an RFID encoder.
- an external RFID printer 17b suitable for connection with an assisted check-in terminal 15.
- the exemplary printer 17b encloses its internal hardware components within its own housing 58.
- the external printer includes FLASH expansion memory of up to 28 MB, network interfaces for Ethernet, Wi-Fi and Bluetooth, 600 DPI print resolution, twenty-four-volt DC power input, and an RFID encoder.
- While an automatic cutting unit is an optional feature, the standard exemplary printer 17b has a slot 54 with cutting edges adapted for manual band tear-off by a staff member 57.
- a length of material 59 wound into a roll 60. The exemplary roll is installed in the printers 17 for printing and cutting into shorter lengths to form up to five hundred wristbands 13.
- the material may be soft, comfortable and durable, neutral to the skin, resistant to water, tamper evident, tear resistant and stretch proof.
- the material comprises a thermal film having a thermal side 61 and a non-thermal side 62. The film is adapted for cutting and the thermal side is adapted for printing on of data such as barcodes, dates and times by the printers 17.
- the material 59 defines a series of fastening holes 63 running centrally along its length. The material may also have pre-defined cutting lines 6.
- the NFC wristband 13 as cut by the RFID printer 17 from the roll 60.
- the wristband 13 is waterproof and comprises a flat NFC tag 64 in the form of an inlay having an NFC chip and antenna, and being embedded in the band material 59.
- the NFC chip comprises 1024-bit user memory and three independent twenty-four-bit one-way counters. Further, the chip features a 106 kbit/s communication speed, anti-collision support, one-time pin, lock bits, configurable counters, protected data access through a 32-bit password, and a unique seven-byte serial number.
- Figure 8d shows a clip 65 which is adapted to fasten overlapping fastening holes 63 of the wristband, thereby securing the wristband 13 to the wrist of the passenger 11 (see Figure 8e).
- the wristband is irremovable without it being damaged, such as may occur when cutting off the band with scissors.
- the wristband is adjustable prior to fastening by adjusting the holes which are overlapped, whereby overlapping holes which are further from their respective ends produces a smaller fit.
- peel-off adhesive wristbands are provided.
- NFC plus BLE enabled wristband 13b which may be issued during assisted check-in.
- The wristband 13b comprises: a power source 44 in the form of a battery, a main board having a central processing unit 41, memory 35 comprising 64K RAM and FLASH expansion slots for separate storage of encrypted transactional and personal data, input means 43 in the form of programming buttons 187, a system clock, a display 42, pressure sensors, thin LED strips, a communications interface 45 comprising a micro-USB port or connection, and network interfaces comprising an NFC tag 64 and a Bluetooth tag 186 in the form of an inlay having a CSR Energy Chipset with a low energy Bluetooth transmitter with a 50 m Bluetooth V4 radius range.
- the material 59 of the wristband 13b is opaque, waterproof (silicon), and flexible.
- the wristband 13b comprises a pair of magnetic clasps of opposite polarity 188 co-operatively disposed / embedded in respective ends.
- a palm vein sensor 196 (which in this instance is a commercially available Fujitsu PalmSecure sensor) connected at its side with a USB cable 219.
- the sensor 196 is a component of the check-in palm vein scanner 191 a.
- the sensor device 196 is generally rectangular cuboidal in shape, having dimensions of 35 mm width x 35 mm depth x 27 mm height. It uses 128-bit AES data encryption and has around a 0.01 per cent false recognition rate with one retry. The mean time between failures is around 1,000,000 hours.
- the sensor 196 has a capturing distance of between forty and sixty millimetres, and requires a 4.4 to 5.4 volt power supply.
- a hand guide 197 (which in this instance is a commercially available Fujitsu PalmSecure U Guide).
- the guide 197 is a component of the check-in palm vein scanner 191 a.
- the guide 197 connects on to the top of the sensor 196, acting as a spacer to correctly space a passenger's palm at the capturing distance from the sensor when the passenger places their palm on top of the guide.
- the guide 197 has a central aperture 198 to allow passage of light emitted from the sensor 196 to the passenger's hand.
- the scanner 191b (which in this instance is a commercially available Fujitsu PalmSecure ID Match device).
- the scanner 191b has a 10.9 cm diagonally measured capacitive touch display 198, an embedded ARM board, an RJ-45 ethernet communications interface and a USB port, and two external and one internal secure access module slots.
- the scanner 191 b includes a palm vein sensor, runs on the Linux operating system, and along with palm veins, is adapted to scan smart cards, further improving identification accuracy of the device, with a false acceptance rate of around 0.00008%.
- Figures 9a and 9b show the on-board server 26 from the outside, whilst Figure 9c shows the internal hardware componentry found within the server's housing 66.
- The exemplary on-board server 26 includes a processor 41 in the form of an Intel Core i7-6770HQ chip and an Intel Iris Pro graphics card with connection interfaces in the form of HDMI, mini display and display ports, as well as multiple USB ports and a Thunderbolt interface.
- the on-board server further includes Wi-Fi, Ethernet and Bluetooth wireless network interfaces, dual channel DDR4 random access memory modules, and a serial ATA bus.
- a camera 16 in the form of a CCTV video camera.
- the camera comprises a CMOS image sensor, has a 1920 x 1080 resolution and a frame rate of 25 frames per second, and supports various protocols over which data is sent, including HTTPS and RTSP.
- the tablet 29 which comprises an Exynos 7870 processor with a 1.6 GHz processing speed, 2 GB RAM and 16 GB ROM internal memory, up to 200 GB microSD external memory received by micro SD slot 67, communications interfaces including Wi-Fi and Bluetooth network interfaces, and a USB port; front and rear cameras, 66a and 66b, with two megapixel and eight megapixel resolution respectively, the cameras being adapted for 1080p video recording at thirty frames per second.
- the tablet comprises a touchscreen 74, a power source in the form of a lithium ion battery, and runs an Android operating system.
- the tablet may also comprise an internal NFC reader.
- the NFC reader 32 comprises an embedded contactless smartcard and NFC tag reader / writer 68 based on 13.56 MHz contactless RFID technology.
- the reader 32 is compliant with the ISO/IEC 18092 standard for near field communication, and supports MIFARE and ISO 14443 A and B cards and four types of NFC tags.
- Figure 13a shows the mount 70a comprising a mounting panel 71 having a recess for embedding the tablet 29, and a pocket 72 for receiving the NFC reader 32.
- the mount 70a includes an interface 73 for receiving connections from, and interconnecting, the tablet 29 and NFC reader 32.
- Figure 13b shows a mount 70b comprising a panel 71 and a mounting arm 75 on which the panel is supported.
- the tablet 29 is enclosed within the panel 71 , the touchscreen 74 of the tablet being accessible via an open window 76 defined in the top surface of the panel 71.
- Figure 13c shows a mount 70c with a bracket 77 for receiving the tablets 29 and a mounting arm 75 supporting the bracket 77.
- Figure 13d shows a mount 70d having a panel housing 71 which encloses the tablet 29 but for an open window 76 through which the touchscreen of the tablet is accessible. The panel housing 71 is supported on a stand 78.
- Figure 13e shows a mount 70e having a mounting panel 71 supported on a mounting arm 75, the mounting panel being adapted to integrate either 10-inch or 12-inch screen tablets.
- console versions, 27a-c having twenty-seven, twenty-four, and twenty-one-inch touchscreens 78 respectively.
- Each of the consoles 27 further includes an adjustable stand 79 for adjustably propping up and tilting the screen 78 towards a user.
- the consoles 27 include communication interfaces in the form of Wi-Fi and Bluetooth network interfaces.
- consoles 27b and 27c include communication interfaces in the form of USB ports 80.
- Console 27b also includes an HDMI port and may include an internal NFC reader.
- a tablet 29 (which is commercially available as the Fujitsu Stylistic Q736) embodied differently to the tablet of Figures 11a and 11b.
- The tablet 29 of Figure 14d is optionally connected with a small transit palm vein scanner 192a.
- the scanner 192a includes a palm vein sensor 196, and a USB connector (not shown) by which it is removably connected to a USB port of the tablet 29.
- the tablet has a 33.8 cm display 199 capable of receiving touch and pen input as well as 5 MP front and rear cameras 200.
- the tablet 29 has 4G/LTE, GPS, WLAN, Bluetooth and/or NFC communications interfaces, an Intel Core i7-6600U processor, memory in the form of SATA 256 GB solid state drives and 8 GB RAM, along with micro SD memory card slots.
- the tablet 29 includes an internal power source in the form of a pair of batteries, and sensors associated with a 3-axis accelerometer, gyroscope, ambient light, magnetic field, and compass.
- Referring now to Figure 14e there is shown a case 201 in which a tablet 29 and transit palm vein scanner 192b are embedded, the connection between the palm vein scanner 192b and the tablet 29 being hidden by the case 201.
- the case maintains portability of the tablet 29 with scanner 192b and serves to protect the tablet 29 and scanner 192b from damage.
- Referring to Figure 14f there is shown another form of mount 70 for mounting a tablet 29 which is adapted for connection with a separate palm vein scanner 192c.
- the mount 70 has a mounting panel 71 supported on a mounting arm 75 which projects up from a base 210.
- Figure 14c shows the palm vein scanner 192c which comprises a palm vein sensing device 196 mounted on to an adjustable hand guide or bracket 197.
- the palm sensing device uses near-infrared light pattern capture, with a capture distance of 40-60 mm for initial enrolment and 35-70 mm for authentication.
- the hand bracket 197 has a rectangular base portion 211 with a generally flat under surface for resting on a support surface beneath, a first adjustable guide portion 212 pivotally attached at its bottom end to the base portion 211, the first guide portion defining finger depressions 214 at its top end, and a second adjustable guide portion 213 also pivotally attached at its bottom end to the base portion, but at the opposite end thereof.
- the palm is placed flatly down on top of the bracket 197, so that the heel of the palm is placed on the second guide portion 213 and the fingers in the finger depressions 214 of the first guide portion 212. Partial folding of the guide portions, 212 and 213, brings their top ends closer together, as may be required for placement of smaller hands, while maintaining the placed palm within a suitable distance range from the sensor as required for effective vein scanning.
- a wall mountable on-board palm vein scanner 192d (which in this case is a commercially available Intus 1600PC Palm Vein authenticator).
- the scanner 192d includes a wall mountable housing 215, a dome 216 which protects a palm vein sensor therebehind, and alpha-numerically labelled keys 217 for PIN entry. Once wall mounted, this scanner 192d requires the passenger to place their palm vertically, spaced 3-8cm from and facing the sensor.
- the BLE receiver comprises a housing 189 of 54 mm x 41 mm x 18 mm dimensions, with an antenna 190 projecting from the rear of the housing 189.
- the BLE receiver 81 further comprises a 5 V, 500 mA micro-USB port and a network interface enabling wireless software updates.
- the BLE receiver 81 is adapted to operate in temperatures between minus twenty and sixty degrees Celsius, and operates on low power consumption (80 mA typical working current).
- the BLE receiver 81 is adapted to read multiple BLE enabled wristbands 13b simultaneously.
- Figure 16 illustrates use of a passenger tracking system 83.
- the tracking system 83 comprises multiple BLE receivers 81a-d placed in fixed positions around the vessel. The position of each BLE receiver is associated with a gateway receiver ID. A map of the vessel or vessel area is generated by the on-board server 26 accounting for or showing the BLE receivers.
- the BLE receivers are placed in or allocated to respective zones 82a-d in the form of quadrants.
- a low energy Bluetooth signal 84 conveying unique passenger identification information is transmitted from the passenger's BLE enabled wristband 13b and received by the BLE receiver.
- the passenger identification information is then wirelessly transmitted 85 to and received by the on-board server 26, where it is processed by a location tracking application installed on the on-board server 26.
- the on-board server 26, by virtue of reading and executing instructions of the location tracking application, can identify on the map the location or position of the passenger's wristband, and therefore the passenger, to within approximately ten metres.
- Accuracy may be improved by increasing the number of receivers within a given area.
- the map showing passenger position can be viewed by crew on a console display 42.
- the crew may be able to interact with the map in order to obtain identification information relating to a selected passenger marker displayed on screen.
- it may be possible to track hundreds or more passengers at once.
- a security alert 33 may be signalled by the on-board server where, for example, passengers are located outside of their designated zones, the number of passengers within a zone exceeds a predetermined number for overcrowding, babies or small children are separated more than a predetermined distance from their designated family or carers, passenger position is lost or moves external to the sides of the vessel indicating a potential passenger overboard situation, and a passenger signal remains static for an extended pre-determined period suggesting wristband removal or medical emergency.
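- By way of illustration only, the following simplified sketch (in Python) indicates one way the on-board server could evaluate such alert conditions against the tracked wristband positions; the data structure, method names and thresholds are hypothetical and form no part of the disclosure:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Dict, Iterable, Optional, Set

@dataclass
class TrackedPassenger:
    passenger_id: str
    zone: str                       # zone 82a-82d inferred from the receiving BLE gateway ID
    last_moved: datetime            # last time the reported position changed
    allowed_zones: Set[str]
    carer_id: Optional[str] = None  # designated family member or carer for a child, if any

def evaluate_alerts(passengers: Iterable[TrackedPassenger],
                    max_per_zone: int = 50,
                    static_limit: timedelta = timedelta(minutes=30),
                    now: Optional[datetime] = None):
    """Yield (alert_type, passenger_id) pairs for the alert conditions described above."""
    now = now or datetime.utcnow()
    passengers = list(passengers)
    by_id = {p.passenger_id: p for p in passengers}
    zone_counts: Dict[str, int] = {}
    for p in passengers:
        zone_counts[p.zone] = zone_counts.get(p.zone, 0) + 1
    for p in passengers:
        if p.zone not in p.allowed_zones:
            yield ("restricted_zone", p.passenger_id)
        if zone_counts[p.zone] > max_per_zone:
            yield ("overcrowding", p.passenger_id)
        carer = by_id.get(p.carer_id) if p.carer_id else None
        if carer is not None and carer.zone != p.zone:
            yield ("separated_from_carer", p.passenger_id)
        if now - p.last_moved > static_limit:
            yield ("static_signal", p.passenger_id)  # possible wristband removal or medical emergency
```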
- Figure 17 provides an exemplary overview of passenger movements on tour, along with the various system technologies used in managing the passengers.
- the passenger is shown to move from check-in, to boarding the tour vessel at a departure point, to disembarking the tour vessel and boarding a transfer vessel at a transfer point, to disembarking the transfer vessel for an activity, to re-boarding the transfer vessel after the activity, to disembarking the transfer vessel and re-boarding the tour vessel at the transfer point, to disembarking the tour vessel at a stopover, to re-boarding the tour vessel at the stopover, and finally to disembarking the tour vessel back at the point of departure.
- system devices work to identify, count, surveil, and locate the passengers.
- the tablet devices are adapted to scan passenger wristbands or biometric features such as DNA, face, or palm veins, in offline mode, making it possible to continue to keep track of passengers during transfers and activities taking place off-board the main tour vessel.
- After a passenger has booked their ticket through the vessel operator or a booking agency, the passenger checks in on the day of travel, off board the vessel, either via a staff assisted check-in terminal 15 (Fig. 2a), a self-check-in kiosk 14 (Figs. 2b and 2c), or a self-check-in tablet 190 (Fig. 17).
- An exemplary self-check-in procedure 86 suitable for enrolling in or engaging on-board or transit wristband identification is shown in Figure 18.
- the passenger 11 inputs personal and/or booking information, such as their booking reference number or family name, either manually via the kiosk keypad 50 or touchscreen 51, or by using the barcode scanner 52 to scan a barcode on their printed ticket or on a ticket displayed on their phone 34.
- the kiosk 14 wirelessly transmits the inputted information to the booking server 19 in order to ascertain the booking details from the booking server 19 based on the inputted information.
- the booking details are presented and displayed on the kiosk screen 51.
- the passenger 11 uses the touchscreen 51 to select and confirm the passenger on the ticket to check in.
- the kiosk 14 or other device processor 41 determines whether the selected passenger requires a Medical Self Risk Assessment form and/or a Medical Declaration form and/or an Assumption of Risk and Liability Waiver form, based on passenger or booking information provided such as their medical history and the activities chosen or destination being visited whilst a passenger on the vessel 12.
- The Medical Self Risk Assessment may be issued to all passengers, or to all passengers booking into low-risk activities.
- the Assessment includes a questionnaire requiring the passenger to check off "yes" or "no" answers to a series of questions.
- From completion of this form the tour operator may gain a basic understanding of the passenger's general health. Any failed medical question on the Assessment form prompts the system to require the additional completion of a Medical Declaration form.
- Successful completion of the Assessment form clears the passenger to undertake non-risk or low-risk activities on tour, such as snorkelling.
- Risk and Liability Waiver forms are also to be completed by passengers undertaking any activities on tour, be they low risk or high risk activities.
- Medical Declarations are typically to be completed by passengers planning on undertaking high risk activities such as scuba diving or helmet diving.
- the Medical Declaration form provides a detailed questionnaire, requiring the passenger to check off or provide satisfactory answers to presented questions and finger sign the document via the touchscreen 51 in order to proceed to step 92. Any failed answer to a question on the Declaration prompts an automatic response from the system. The response is based on local legal requirements and standard international industry practice, having been developed in collaboration with doctors engaged in the industry of the relevant activity, e.g. diving. Additionally, the Medical Declaration form must later be counter-signed by a qualified staff member having reviewed the form. Based on the information provided in the Declaration, the staff member determines whether total, partial, or no restriction on the activity is required for the passenger.
- a social media share screen is presented to the passenger, enabling them to share details of their trip, cruise, or tour on social media platforms such as Facebook. If the passenger selects "yes", indicating they would like to share trip details, at step 93 the kiosk or other linked device communicates with the server of the social media platform to bring this to fruition. If the passenger selects "no", indicating they do not want to share, at step 94 the kiosk camera 16 takes a front profile photo of the face of the passenger 11 and transmits the image to the facial recognition server 21 for analysis via the facial recognition application 101. At step 95, the kiosk prints an NFC wristband 13a for collection by the passenger.
- the printed NFC wristband 13a is associated with a data template having unique indicators for identifying the passenger 1 1 and their tour booking.
- the kiosk or other linked processor asks on screen 51 whether another passenger on the booking is to be processed. If the passenger inputs "yes", then the process repeats again from step 89 for the next passenger. If "no", then the self-check-in procedure is complete at step 97.
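- Purely as an illustration of the overall flow, a simplified Python outline of such a self-check-in loop is set out below; every object and method name is hypothetical and not part of the disclosure:

```python
def self_check_in(kiosk, booking_server, facial_server):
    """Illustrative outline of the self-check-in procedure 86 (steps 87-97)."""
    booking_ref = kiosk.read_booking_reference()            # keypad, touchscreen or barcode scan
    booking = booking_server.lookup(booking_ref)            # retrieve and display booking details
    while True:
        passenger = kiosk.select_and_confirm_passenger(booking)
        for form in kiosk.required_forms(passenger):        # self risk assessment, declaration, waiver
            kiosk.present_and_collect(form)
        if kiosk.offer_social_share(passenger):
            kiosk.share_trip_details(passenger)
        photo = kiosk.capture_face_photo()
        facial_server.analyse(passenger, photo)             # facial recognition application 101
        wristband = kiosk.print_nfc_wristband(passenger)    # data template with unique indicators
        booking_server.associate_template(passenger, wristband.template_id)
        if not kiosk.another_passenger_on_booking():
            break                                           # procedure complete (step 97)
```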
- An exemplary assisted check-in procedure 98 suitable for enrolling in or engaging wristband identification for the upcoming journey is shown in Figure 19.
- the passenger 11 advises a staff operator 57 manning a terminal 15 of their booking details or family name.
- the operator 57 inputs the passenger details into the terminal and the terminal 15 wirelessly transmits the inputted information to the booking server 19 in order to ascertain the relevant booking details from the booking server 19.
- the booking details are then presented to the operator 57 on an operator facing touchscreen 125 (Fig. 2a) of the terminal 15.
- the operator selects the correct passenger 11 if multiple passengers are found in the booking.
- the booking details are presented on the passenger facing touchscreen 126.
- the passenger updates details if required, and then confirms accuracy of the booking details.
- the terminal 15 or other device processor 41 determines whether the selected passenger requires a medical or legal form based on passenger or booking information provided. If "yes", a medical or legal form is required, the kiosk or other device processor determines which of these is or are relevant and presents this on the touchscreen at step 115. The passenger 11 then checks off answers to the questions and, if necessary, finger signs the form via the touchscreen 126 in order to proceed to step 116.
- a social media share window is presented to the passenger, enabling them to share details of their trip, cruise, or tour on social media platforms. If the passenger selects "yes", indicating they would like to share trip details, at step 117 the terminal 15 or other linked device communicates with the server of the selected social media platform, thereby accessing the platform and sharing the tour details thereon. If the passenger selects "no", indicating they do not want to share, at step 118 the staff member 57 uses a camera 16 linked to the terminal 15 to take a head-shot of the passenger 11.
- the image is transmitted from the camera 16 to the terminal 15 and then on to the facial recognition server 21 for analysis and identification by the facial recognition application 101.
- the transmitted image is also compared with images stored in the shared operator database 24 linked with the database server 23.
- the facial recognition server 21 or a linked processor determines whether there is cause for an alert, such as where there is a mismatch between the passenger image and booking details, or the passenger is flagged for sea travel or on a watch list. If "yes", there is cause for an alert, at step 121 a security alert 25 is transmitted to the relevant authorities.
- At step 123, the kiosk or other linked processor asks on screen 125 and/or 126 whether another passenger on the booking is to be processed. If the passenger or crew inputs "yes" into their respective screen, then the process repeats again from step 111 for the next passenger. If "no", then the assisted check-in procedure is complete at step 124.
- the kiosk 14 or terminal 15 where check-in took place transmits passenger details to the database server 23 for storage and subsequent transmission to the server 26 on board vessel 12, the vessel being, at that time, docked at the port of origin and so within receiving range of internet Wi-Fi transmissions.
- the data template having unique personalised identification details of the wristband 13 is stored with other data templates having details from wristbands of other passengers in the passenger manifest 127 on the database server 23.
- the database server then push syncs the passenger manifest to the on-board server 26.
- the passenger 11 proceeds to the dock for boarding of the vessel 12.
- a crew member carrying a tablet 29 is stationed at the boarding entrance to the vessel 12.
- a boarding counting process 137 for counting passengers by their wristbands 13 is initiated for passenger boarding.
- the tablet 29 receives an input selection from the crew member via touchscreen 74 relating to the entry / exit station at which the crew member is positioned and whether or not the passenger is entering or exiting the vessel.
- the passenger taps the NFC tag 64 of their wristband 13 on the NFC reader enabled tablet 29, or at least waves the NFC tag in close proximity (within a few centimetres) to the tablet 29.
- the data template / unique identifier data transmitted wirelessly from the NFC tag 64 is received by the integrated NFC reader of the tablet 29.
- the tablet then attempts to retrieve passenger details associated with the unique identification data / data template.
- the tablet queries whether passenger data is found relating to the scanned wristband. If "no" data is found, at step 142 an error is displayed on the tablet screen, thereby notifying the crew member. If "yes", passenger data relating to the scanned wristband is found for this cruise, the passenger / tag details are displayed at step 143.
- a camera 16 located at the point of entry captures a picture of the boarding passenger's face and this image is compared with the image taken at check-in of the passenger associated with the retrieved NFC tag details, and if a discrepancy is found an alert notification is triggered.
- the passenger or crew confirm by touchscreen selection that the tag details are correct in respect of the passenger attempting to board.
- the direction of passenger movement (on to the ship) is logged and their status is updated as being onboard.
- Information regarding passenger movement and status is wirelessly synced from the tablet to the on-board server over the on-board Wi-Fi network.
- At step 146, the counting process is completed for that particular passenger, now boarded. The counting process is repeated for all passengers boarding the vessel, with the boarding process being completed when the missing passenger count equals zero.
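- By way of illustration only, a simplified Python sketch of how a single wristband tap might be handled at a boarding station is set out below; the object and method names are hypothetical and form no part of the disclosure:

```python
from datetime import datetime

def handle_wristband_tap(tablet, onboard_db, station_id: str, boarding: bool = True):
    """Illustrative handling of one NFC wristband tap during the boarding counting process 137."""
    template_id = tablet.nfc_reader.read_tag()        # unique identifier from the NFC tag 64
    record = onboard_db.find_passenger(template_id)   # steps 140-141: look up passenger details
    if record is None:
        tablet.show_error("No passenger data found for this wristband")   # step 142
        return None
    tablet.show_details(record)                       # step 143: crew / passenger confirm on screen
    record.status = "on-board" if boarding else "off-board"
    record.movements.append((station_id, record.status, datetime.utcnow()))
    tablet.sync_to_onboard_server(record)             # synced over the on-board Wi-Fi network
    return record
```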
- Figure 21 shows an activity counting-off process 147 for counting passengers by their wristbands, as may be initiated for each passenger 11 disembarking the vessel 12 for a scheduled activity or stopover.
- a crew member holding a tablet 29 is stationed at the vessel exit.
- the tablet 29 receives an input selection from the crew member via touchscreen 74, the input selection relating to which entry / exit station the crew member is positioned, whether the passenger is entering or exiting the vessel, and which activity the passenger is disembarking the vessel for.
- the passenger positions the NFC tag 64 of their wristband 13 in close proximity with the NFC reader enabled tablet 29 such that the tablet scans the tag.
- passenger or tag details associated with the scanned tag are retrieved by the tablet.
- the tablet determines whether the passenger associated with the scanned tag is registered for the activity entered by the crew member during step 148. If it is determined "no", the passenger is not registered for the activity, an error message is presented on the tablet screen, thereby notifying the crew member. If "yes", the passenger is registered for the activity, at step 153 the tablet or other linked processor determines which medical or legal form is required for the activity, if any, and checks whether or not it has been completed by the passenger. If "no", the necessary medical or legal form has not been completed, then an error message is displayed at step 152. If "yes", the requisite medical and legal forms have been completed, at step 155 the passenger details relating to the scanned wristband are displayed on the tablet screen.
- At step 156, the direction of passenger movement (off the ship) is logged and their status is updated as being off-board.
- information regarding passenger movement, status, and activity is wirelessly synced from the tablet to the on-board server over the on-board Wi-Fi network.
- All tablets are in turn synced with the on-board server, thereby keeping the missing passenger activity counts on the tablets up to date.
- At step 157, the activity counting process is completed for the particular passenger having disembarked.
- the activity counting-off process is repeated for all passengers disembarking the vessel for their registered activity.
- an activity counting-on process is initiated.
- the crew member stations themselves at the vessel entry point with tablet in hand, and selects their present location (which may or may not be a different location from where the passengers disembarked for their activity).
- the passengers each 'tap on' with their wristband, and once the missing passenger activity count is reduced to zero, the activity manifest 136 is recalled. Prior to recall of the activity manifest the vessel is disabled so as to prevent it from leaving the location until all passengers are accounted for. If a passenger is missing, their personal details and photograph are accessible by the crew.
- the crew member signs the electronic form presented by the tablet indicating that all passengers are accounted for on board the vessel.
- This digital form is synced to the on-board server over the on-board Wi-Fi network, and in turn synced to the database server when within range.
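- For illustration only, the departure interlock described above (vessel disabled until the missing passenger count reaches zero) might be sketched as follows in Python; all names are hypothetical and not part of the disclosure:

```python
def reconcile_activity_count(onboard_server, activity_id: str):
    """Illustrative departure interlock: the vessel stays disabled until every passenger is back on board."""
    manifest = onboard_server.activity_manifest(activity_id)            # activity manifest 136
    missing = [p for p in manifest.passengers if p.status != "on-board"]
    if missing:
        onboard_server.disable_departure()                               # prevent the vessel leaving
        return [(p.name, p.photo) for p in missing]                      # details shown to crew
    onboard_server.enable_departure()
    onboard_server.request_crew_signoff(activity_id)                     # electronic all-accounted-for form
    return []
```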
- Figure 22 illustrates an exemplary video surveillance alert process 158.
- the next frame from the video surveillance camera 30 feed is transmitted to and received by the on-board server 26.
- the onboard server or other linked processor analyses the frame and determines whether or not the number of people in the frame is greater than a predetermined value "n". If "yes", the number of people in the frame is greater than n, then a crowd alert is generated, which alert may be in the form of, for instance, a notification presented on passenger phones 34 or tablets 29, a vibration of vibrating means on passenger wristbands within the zone, an alert notification presented on crew consoles 27, an alarm sounded on speakers within the zone, etc.
- At step 162, the frame image is scanned to detect faces of passengers captured in the image. If "no" faces are detected at step 163, the next frame image is taken and the process repeats from step 159. If "yes", a face is detected at step 163, at step 164 the face is compared with faces of boarded passengers stored in on-board database 135 in order to identify the passenger. If at step 165 "no", the passenger is not able to be identified, then an unknown passenger alert is transmitted by the on-board server to predesignated linked devices on board the vessel, and subsequently the process 158 is completed at step 175.
- the on-board server 26 or an external linked processor 41 calculates the total number of visits the passenger has made to the surveilled area. If it is determined that "yes", the number of visits is greater than a predetermined number within a predetermined period of time or days, at step 168 a frequent visit alert is transmitted by the on-board server to predesignated linked devices. If "no", the number of visits is less than the predetermined number within the predetermined period, then at step 169 the on-board server checks whether two facially recognised passengers in a particular zone match two passengers in a rendezvous alert gallery stored in the on-board database 135.
- a rendezvous alert is issued. If "no", at step 171 the on-board server checks whether a facially recognised passenger from the presently captured frame matches a facially recognised passenger from a previously captured frame with the passenger located in a different zone 82. If "yes", there is a match, then a transit alert is generated at step 172. If "no", there is not a match, then at step 173 the on-board server determines whether the facially recognised passenger from the captured frame is not in close proximity to, or is in a different zone to, a predesignated accompanying passenger recorded in the on-board database 135.
- an unattended alert is generated at step 174. If "no", the person is not unaccompanied (i.e. the person is accompanied) by their accompanying passenger, and the video surveillance alert process is completed at step 175.
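- By way of illustration only, the per-frame decision chain of the surveillance alert process might be summarised in Python as follows; the detector and database methods are hypothetical and do not form part of the disclosure:

```python
def process_surveillance_frame(frame, onboard_db, crowd_threshold: int = 20, visit_limit: int = 5):
    """Illustrative per-frame decision chain for the video surveillance alert process 158."""
    alerts = []
    if onboard_db.count_people(frame) > crowd_threshold:               # steps 160-161: crowd alert
        alerts.append(("crowd", frame.zone))
    for face in onboard_db.detect_faces(frame):                        # steps 162-163
        passenger = onboard_db.match_boarded_passenger(face)           # step 164: on-board database 135
        if passenger is None:
            alerts.append(("unknown_passenger", frame.zone))           # step 166
            continue
        if onboard_db.visit_count(passenger, frame.zone) > visit_limit:
            alerts.append(("frequent_visit", passenger.passenger_id))  # step 168
        if onboard_db.matches_rendezvous_gallery(passenger, frame.zone):
            alerts.append(("rendezvous", passenger.passenger_id))      # step 170
        if onboard_db.recently_seen_in_other_zone(passenger, frame.zone):
            alerts.append(("transit", passenger.passenger_id))         # step 172
        if onboard_db.separated_from_accompanying(passenger, frame.zone):
            alerts.append(("unattended", passenger.passenger_id))      # step 174
    return alerts
```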
- Fig. 23 illustrates an exemplary location tracking alert process 176.
- a BLE enabled wristband 13b is issued to the passenger during an assisted check-in, and the passenger fastens the wristband on their wrist.
- a BLE transmission 84 sent from the wristband 13b is received by a BLE receiver 81 when in range.
- a processor 41 associated with the BLE receiving device 81 determines whether a transmission 84 from the wristband 13b has already been received by the receiver 81 within a predesignated time period or window. Only the first signal 84 in the predesignated time period is registered, accepted or updated.
- passenger / BLE tag location will only be updated every "x" predesignated or configurable number of seconds. If the transmission from the BLE tag / wristband / passenger has been read in the last x seconds, within the predesignated time period, then at step 180 the current read is ignored and the process ends at step 184. If the transmission from the BLE tag / wristband / passenger has not been received in the last x seconds, within the predesignated time period, then at step 181 the time and location of the passenger are updated. At step 182, the processor 41 determines whether the current location of the passenger is in breach of location restrictions, either for that particular passenger or for all passengers generally. If "yes", the passenger is in a restricted location, then a location alert is generated at step 183. If "no", then the process ends at step 184.
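- Purely as an illustration, the duplicate-read suppression and restricted-location check described above might be expressed in Python as follows; the class, method names and interval are hypothetical and form no part of the disclosure:

```python
import time
from typing import Dict, Optional, Set, Tuple

class BleLocationUpdater:
    """Illustrative duplicate-read suppression: a wristband is only re-registered every x seconds."""

    def __init__(self, update_interval_s: float = 30.0):
        self.update_interval_s = update_interval_s
        self.last_accepted: Dict[str, float] = {}     # wristband id -> time of last accepted read

    def handle_read(self, wristband_id: str, zone: str,
                    restricted_zones: Set[str],
                    now: Optional[float] = None) -> Optional[Tuple[str, str]]:
        now = time.time() if now is None else now
        last = self.last_accepted.get(wristband_id)
        if last is not None and now - last < self.update_interval_s:
            return None                                # ignore the current read (step 180)
        self.last_accepted[wristband_id] = now         # update time and location (step 181)
        if zone in restricted_zones:                   # step 182: location restriction check
            return ("location_alert", wristband_id)    # step 183
        return ("location_updated", wristband_id)
```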
- Figures 24a-d illustrate the process of scanning the palm veins of a passenger's hand 218.
- the passenger 11 places their hand 218 at a suitable distance, such as forty to sixty millimetres, from and facing the palm vein sensor 196.
- the sensor 196 will be incorporated into a palm vein scanner 191 which also includes a hand guide 197 on which the user's palm can be placed for suitable capture distancing (as shown in Figure 2c). With the hand 218 correctly placed, the sensor 196 emits near-infrared light towards the hand 218 as shown in Figure 24b.
- the compressed template is again encrypted using AES, before being stored in a palm vein database 221, shown in Figure 24d, linked with the database server 23.
- the palm vein database 221 is synced with an on-board database 135 linked with the onboard server 26.
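- By way of illustration only, the compress-then-encrypt-then-store step might be sketched as follows in Python; the zlib compression and the Fernet (an AES-based scheme from the cryptography package) are illustrative stand-ins, and the database interface is hypothetical:

```python
import zlib
from cryptography.fernet import Fernet   # AES-based symmetric scheme; an illustrative choice only

def store_palm_vein_template(raw_template: bytes, passenger_id: str, palm_db, key: bytes) -> None:
    """Illustrative compress-encrypt-store step for a captured palm vein template."""
    compressed = zlib.compress(raw_template)
    encrypted = Fernet(key).encrypt(compressed)       # template encrypted again before storage
    palm_db.insert(passenger_id=passenger_id, template=encrypted)   # palm vein database 221
    palm_db.sync_to_onboard_database()                # mirrored to the on-board database 135
```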
- An exemplary check-in procedure 222 suitable for enrolling in or engaging palm vein identification for the upcoming journey is shown in Figure 25.
- the passenger 11 is prompted to place their palm on, or at a suitable distance from, a palm vein scanner 191.
- the passenger's palm is scanned and a palm vein image 220 is generated.
- the generated palm vein image is transmitted by the check-in device, 14 or 190, linked with the scanner 191 to the database server 23 for conversion into template form and comparison with stored templates in the linked palm vein database 221. If the generated template matches a stored template, which is likely to be the case for a returning passenger, then the passenger's details associated with the stored template are displayed on the check-in device, 14 or 190, at step 226.
- the passenger is then asked to confirm or update their displayed details at step 227 before continuing on with the remainder of the check-in procedure at step 228. If, however, the generated template does not match any of the stored templates, an enrolment process is initiated at step 229. Then, at step 230, the passenger's palm is scanned twice, a palm vein image is generated, and a template based on the image is created at step 231. At step 232 the passenger's palm is once again scanned. At step 233, a template based on the palm vein image generated from the further scan at step 232 is compared with the template created at step 231 for authentication purposes. At step 234, if the further template matches the earlier template then the passenger is able to continue with the remainder of the check-in process at step 228.
- If, however, at step 234 the further template generated at step 232 does not match the template created at step 231, the template creation procedure is restarted at step 235, and the passenger will be required to repeat steps 230 to 234 until their created palm vein template is authenticated, at which time they can move on to the remainder of the check-in procedure at step 228.
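- For illustration only, the match-or-enrol logic of this palm vein check-in procedure might be summarised in Python as follows; the scanner and database interfaces are hypothetical and form no part of the disclosure:

```python
def palm_vein_check_in(scanner, database):
    """Illustrative sketch of the palm vein check-in procedure 222 (steps 223-235)."""
    template = scanner.scan_template()                        # steps 223-224: scan and convert to template
    match = database.find_matching_template(template)         # step 225: compare with stored templates
    if match is not None:                                     # returning passenger
        return match.passenger_details                        # displayed and confirmed (steps 226-227)
    while True:                                               # enrolment (steps 229-235)
        reference = scanner.scan_template(samples=2)          # steps 230-231: create reference template
        verification = scanner.scan_template()                # step 232: further scan
        if database.templates_match(reference, verification): # steps 233-234: authenticate
            database.store_template(reference)
            return None                                       # continue with remainder of check-in (step 228)
        # no match: restart template creation (step 235) and repeat
```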
- An exemplary procedure 236 for counting passengers 11 boarding the vessel 12 using palm vein scanning technology is illustrated in Figure 26.
- the passenger 11 who is boarding the vessel places their palm on or at a suitable capturing distance from a palm vein scanner 192 located in the proximity of the boarding gate.
- the scanner 192 may be any of various forms including a small scanner 192a attached to a tablet 29 (Figure 14d) held by a member of staff 57 at the boarding gate, or a small scanner 192b connected with a tablet 29 embedded in a case 201 (Figure 14e) and held by a staff member 57 at the boarding gate, or a palm vein scanner 192c having an adjustable bracket 197 for hand placement and being connected by cable with a tablet 70 mounted on a base (Figure 14f) at the boarding gate, or a scanner 192d mounted on a wall (Figure 14i) near the boarding gate.
- a palm vein image is generated for the passenger and a template created either by the scanner or remotely.
- the generated palm vein image is either compared with existing templates stored in the memory of the scanner itself or the connected tablet, or transmitted remotely to the database server for comparison with existing templates in the palm vein database. If a match is not found during the comparison, at step 241 a notification appears on the scanner or connected tablet display indicating that the passenger is unidentified. The passenger is then directed to re-place their palm on or at an appropriate distance from the scanner in order to repeat steps 237 to 240. If step 241 is re-engaged, it may be appropriate to send a notification to authorities that an unidentified passenger is attempting to board the vessel.
- If, however, at step 246, a match is found in the comparison between the palm vein image template generated at step 240 and existing image templates, then the passenger is identified at step 242, such as by their name being displayed on the scanner or tablet screen. At step 243, the passenger-on-board count is increased by one in the passenger manifest. At step 244, the passenger is then allowed to board the vessel.
- An exemplary procedure 245 for counting passengers 11 off the vessel 12 using palm vein scanning technology is illustrated in Figure 27. The steps of the counting off procedure 245 are the same as the steps of the counting on procedure 236, aside from step 243 which is replaced by step 246, and step 244 which is replaced by step 247.
- the passenger-on-board count is instead reduced by one in the passenger manifest at step 246. The passenger is then allowed to depart the vessel at step 247.
- a process 248 for enabling passenger 11 or staff 57 access to one or more of the suite of executable applications. At step 249, the user, 11 or 57, selects the application that they wish to access on an on-board tablet 29 or console 27 connected with a palm vein scanner 192. At step 237 the user places their palm on or at a capturing distance from the palm vein scanner 192. At step 238, the user's palm is scanned, and at step 239 an image of the user's palm is generated and a template created.
- At step 240, the system checks whether the generated template matches a stored template, and if not, an unidentified user notification is displayed on the screen of the scanner, tablet, or console, and the user is required to repeat the process from step 237 to 240 if they still wish to access the selected application. If, at step 240, a match is found for the palm vein image generated at step 239, then details or formatting relating to the person identified by the matched template are presented within the selected application on the tablet or console. At step 251, the user is then free to use the selected application on the tablet or console.
- a palm vein scanner may be used to identify/manage some or all passengers
- a reading device e.g. NFC reader, BLE reader
- a passenger disposed identification device e.g. NFC wristband, B LE wristband, mobile phone
- Using two or more types of modalities on the same passengers in order to identify / manage those passengers is advantageous in the sense of having a back-up where one modality fails, is lost, or becomes inaccurate under certain conditions.
- Using two or more types of modalities may improve accuracy by cross-checking against passenger readings from both modalities. Further, some modalities may be more suitable in some aspects / conditions / environments of the passenger's journey, while other modalities may be more suitable in other aspects / conditions / environments of the same passenger's journey.
- one passenger may, for instance, refuse to have their biometric characteristics (e.g. palm veins, face, DNA) read, in which case a non-biometric modality (e.g. NFC wristband) may be required for them, while the remaining passengers have their biometric characteristics read.
- one type of modality may not work effectively with some passengers, e.g. palm vein scanning of double-hand amputees, in which case another type of modality could be utilised for identification / management of that particular passenger while the remaining passengers use palm-vein scanning.
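- For illustration only, such a fallback across modalities might be sketched in Python as follows; the reader and database interfaces, including the exception type, are hypothetical and do not form part of the disclosure:

```python
class ModalityUnavailable(Exception):
    """Raised by a reader when its modality fails, is refused, or cannot be captured."""

def identify_passenger(readers, database):
    """Illustrative fallback across identification modalities (e.g. palm vein, face, NFC wristband)."""
    for reader in readers:                                   # ordered by suitability for the current context
        try:
            template = reader.read()
        except ModalityUnavailable:
            continue                                         # fall back to the next modality
        match = database.match(template, modality=reader.modality)
        if match is not None:
            return match                                     # passenger identified
    return None                                              # escalate to manual identification by crew
```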
- The term "bus" and its derivatives should be construed broadly as a system for communicating data.
- a method involving implementation of one or more steps by computing devices should not necessarily be inferred as being performed by a single computing device, such that the one or more steps of the method may be performed by more than one cooperating computing device.
- Objects such as "web server", "server", "computing device", "computer readable medium" and the like should not necessarily be construed as being a single object, and may be implemented as two or more objects in cooperation, such as, for example, a web server being construed as two or more web servers in a server farm cooperating to achieve a desired goal, or a computer readable medium being distributed in a composite manner, such as program code being provided on a compact disk activatable by a license key downloadable from a computer network.
- The term "database" and its derivatives may be used to describe a single database, a set of databases, a system of databases or the like.
- the system of databases may comprise a set of databases wherein the set of databases may be stored on a single implementation or span across multiple implementations.
- The term "database" is also not limited to a certain database format, but rather may refer to any database format.
- database formats may include MySQL, MySQLi, XML or the like.
- the invention may be embodied using devices conforming to other network standards and for other applications, including, for example other WLAN standards and other wireless standards.
- Applications that can be accommodated include IEEE 802.11 wireless LANs and links, and wireless Ethernet.
- The term "wireless", and its derivatives, may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not.
- The term "wired", and its derivatives, may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a solid medium. The term does not imply that the associated devices are coupled by electrically conductive wires.
- The term "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory, to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
- A "computer", or a "computing device", or a "computing machine", or a "computing platform", may include one or more processors.
- One or more processors may operate as a standalone device or may be connected, e.g., networked, to other processor(s); in a networked deployment the one or more processors may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer or distributed network environment.
- the one or more processors may form a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
- steps of methods discussed may be performed by an appropriate processor (or processors) of a processing (i.e., computer) system executing instructions (computer-readable code) stored in storage. It will also be understood that the invention is not limited to any particular implementation or programming technique and that the invention may be implemented using any appropriate techniques for implementing the functionality described herein. The invention is not limited to any particular programming language or operating system.
- Some elements of methods described herein may be implemented by a processor or a processor device, computer system, or by other means of carrying out the function.
- a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method.
- a device A connected to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
- "Connected" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
- The terms "first", "second", "third", etc., used to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
A system (10) for managing passengers (11) of a vehicle, the system (10) comprising: at least one biometric reader (16,191,193,16,192,194) configured to read a biometric characteristic of a passenger (11), create a data template based on the reading, and convey the data template directly or indirectly to a server (19,21,23,21,26); at least one non-biometric reader (81,32) configured to read a transmitted non-biometric characteristic comprising a data template associated with the, or another, passenger (11), and convey the data template directly or indirectly to the server (19,21,23,21,26); and the server (19,21,23,21,26) configured to receive each data template, access a database (23,24,135,221) of stored data templates, each stored template being associated with unique passenger identification information, compare each data template with the stored data templates, match each data template with a stored data template; and identify the, or each, passenger (11) based on the unique passenger identification information associated with each matched stored template.
Description
PASSENGER MANAGEMENT
TECHNICAL FIELD
The present invention relates to passenger management. In a particular aspect, the invention relates to a system or method for identifying, counting, locating, and/or surveilling passengers of a sea vessel.
BACKGROUND ART
Any discussion of documents, devices, acts or knowledge in this specification is included to explain the context of the invention. It should not be taken as an admission that any of the material forms a part of the prior art base or the common general knowledge in the relevant art in Australia or elsewhere on or before the priority date of the disclosure and broad consistory statements herein.
Vessel based tourism involves accommodating and entertaining passengers whilst ferrying them to one or more stopovers. At the stopovers passengers temporarily disembark the vessel in order to undertake activities or explore the site, before re-boarding the vessel for transit to the next stopover or final destination.
Despite technological advances in recent times, many vessel operators continue to implement manual booking, ticketing and head counting systems. This practice suffers from several disadvantages and inefficiencies. For instance, inaccurate bookings or double bookings of tours or tour activities may easily occur as a result of human error and/or lack of a central electronic bookings database directly accessible by passengers.
Additionally, there may be difficulties associated with identifying passengers when making bookings or once aboard the vessel. For instance, a passenger obtaining another passenger's paper boarding / activities ticket may present safety concerns, such as where a passenger without prerequisite diving training is able to undertake a diving activity in the name of a trained passenger. Security concerns may also be apparent, such as where a violent criminal is misidentified or takes the place of a passenger who is not a security threat. Further, manual ticketing and identification
procedures may result in financial losses due to, for instance, usage of another passenger's ticket to obtain perks or privileges not personally purchased.
One example of a safety concern, which has been tragically played out on multiple occasions, is on scuba diving tours where passengers are ferried out to a dive location, disembark the vessel for diving activities, and then resurface only to find that the vessel has long departed as a result of inaccurate manual passenger headcounts performed by staff aboard the vessel. If and when the mistake is realised, even identifying who exactly has been left behind is problematic due to the manual booking and ticketing arrangements.
Additionally, at present many tour vessels lack video surveillance and/or facial recognition and/or passenger tracking capabilities for identifying potentially problematic situations or security threats, such as passenger overcrowding in an area of the vessel or the presence of passengers in restricted areas.
In addition, at present many tour vessels require manual forms and procedures for various aspects in addition to ticketing and bookings, such as on-board purchases, medical assessments or declarations, liability waivers, marketing, or activity scheduling. Such practices are inefficient and prone to error.
Additionally, at present many tour vessels leave passengers without a full catalogue of photos specific to them and their tour, that being a shame for the passenger and a missed financial gain for the tour operator.
Further, tour vessel operators may be required to submit management reports, and many, having only a manual set-up, are required to compile the necessary information and prepare the reports inefficiently by hand and without the benefits of automation.
Additionally, present methods of identification may lack the accuracy or sensitivity required to correctly identify passengers at an acceptable rate.
Thus, it may be advantageous to provide a new system and/or method of passenger management which reduces, limits, overcomes, or ameliorates
some of the problems, drawbacks, or disadvantages associated with the prior art, or provides an effective alternative.
In a particular aspect it may be advantageous to provide a system or method which facilitates passenger identification. In another particular aspect, it may be advantageous to provide a system or method which facilitates passenger counting. In another particular aspect, it may be advantageous to provide a system or method adapted for passenger surveillance. In another particular aspect, it may be advantageous to provide a system or method which facilitates passenger tracking or locating. In another particular aspect it may be advantageous to provide a system or method which facilitates passenger ticketing, bookings, or purchasing. In another particular aspect, it may be advantageous to provide a method or system which facilitates the provision of electronic medical or legal forms. In another particular aspect, it may be advantageous to provide a method or system for improving security on-board the vessel. In another particular aspect, it may be advantageous to provide a method or system which facilitates on-board sales or marketing, or activity scheduling, or photo management, or management reporting. In another particular aspect, it may be advantageous to provide a method or system which identifies passengers with a high level of accuracy.
DISCLOSURE OF THE INVENTION
In one aspect, the invention provides a passenger management system.
In another aspect the invention provides a system for managing passengers of a vehicle. The vehicle may comprise any means of transporting passengers. The vehicle may comprise a plane, bus, or train. In a particular form, the vehicle comprises a vessel. The vessel may comprise any means of transporting or supporting passengers on water. In a particular form, the vessel comprises a ship, boat, pontoon, or sea platform for activities.
The system may comprise passenger identification means.
The passenger identification means may comprise a reading device.
The passenger identification means may comprise:
an identification device disposed on or in relation to the passenger, the identification device being adapted to convey a non-biometric characteristic relating to the passenger, and
a reading device adapted for reading the non-biometric characteristic conveyed by the identification device.
The non-biometric characteristic may comprise a data template. The data template may comprise unique passenger identification information.
In another aspect the passenger identification means may comprise: an identification device disposed on or in relation to the passenger, the identification device being adapted to convey unique passenger identification information, and
a reading device adapted for reading the unique passenger identification information conveyed by the identification device.
The identification device may be or comprise a portable device.
The reading device may be or comprise a portable device.
The unique passenger identification information may comprise, be comprised by, or be in the form of, a data template.
The unique passenger identification information may comprise, or be comprised by, non-biometric characteristics.
The identification device may comprise a transmitter. The unique passenger identification information may be conveyed by a transmission from the transmitter.
The reading device may comprise a receiver. The receiver may detect the transmission from the identification device, thereby obtaining the unique passenger identification information. The receiver may be adapted to detect the transmission when in close proximity to the transmitter. For instance, the receiver may be adapted to detect the transmission when within twenty, fifteen, ten, nine, eight, seven, six, five, four, three, two, or one centimetres of the transmitter, or the transmission may dissipate over such distances.
The identification device may comprise a wearable object. The wearable object may comprise a wristband. The wristband may be printable. The wristband may be adjustable prior to fastening. The wristband may be irremovable without damage thereto.
The identification device may comprise an RFID device. The RFID device may be embedded in the wristband. The RFID device may comprise the transmitter. The RFID device may comprise an RFID tag. The RFID tag may comprise an NFC tag. The NFC tag may comprise the transmitter, the transmitter comprising an NFC antenna. The NFC tag may comprise an NFC chip.
The reading device may comprise, or be comprised by, a tablet, smart phone, console, or handheld portable reader.
The reading device may comprise an RFID reader. The RFID reader may comprise an NFC reader.
Additionally, or alternatively, the reading device may comprise a biometric measuring device adapted for measuring biometric characteristics of a passenger. The biometric characteristics may be associated with or used as a basis to generate unique passenger identification information.
The biometric characteristics may comprise physiological characteristics. The physiological characteristics may comprise at least one of fingerprint, DNA, hand geometry, palm vein arrangement, retinal blood vessel arrangement, iris configuration, and physical facial characteristics.
The biometric characteristics may comprise behavioural characteristics. The behavioural characteristics may comprise at least one of signature and voice characteristics.
The biometric measuring device may comprise a palm vein scanner. The palm vein scanner may comprise a light emitter and a sensor for detecting the emitted light. The light emitter may emit infrared light and the sensor may detect infra-red light absorbed by the palm veins.
The biometric measuring device may be connected with, comprise, or be comprised by a tablet, smart phone, or console.
The system, or passenger identification means, may comprise passenger counting means. The passenger counting means may comprise part or all of the passenger identification means.
The passenger counting means may be adapted to count passengers as they enter or exit the vessel.
The passenger counting means may comprise a processor adapted to perform the counting operations. The processor may be comprised by the reading device. Additionally, or alternatively, the processor may be comprised by an on-board server linked with the reading device.
The passenger counting means may comprise the reading device positioned at or on the way to a vessel exit or entry point. The reading device may be positioned at or near a boarding gate.
The passenger counting means may comprise passenger identification devices disposed on or in relation to respective passengers. The passenger counting means may be adapted to count passengers entering or exiting the vessel as their passenger identification devices are detected by the reading device. Additionally, or alternatively, the passenger counting means may be adapted to count passengers entering or exiting the vessel as their biometric characteristics are detected or measured by the reading device.
The passenger counting means may be adapted to distinguish between passengers entering the vessel and passengers exiting the vessel. For instance, the passenger counting means may determine that every odd-numbered detection for a particular passenger relates to a boarding movement, thereby reducing the missing passenger count by one assuming the identified passenger is not already accounted for, and that every even-numbered detection for a particular passenger relates to a disembarking movement, thereby increasing the missing passenger count by one unless the passenger is at the end of their journey. Additionally, or alternatively, the reading device may receive input, for instance from a crew member, indicating the direction of movement (on or off the vessel) of the detected passenger. Additionally, or alternatively, there may be, for instance, two reading devices, one positioned so as to be facing and capable of detecting boarding but not exiting passengers, and the other positioned so as to be facing and capable of detecting exiting but not boarding passengers. Such an arrangement may be suitable where, for instance, entry and exit reading devices in the form of palm vein scanners, or facial recognition cameras, are provided.
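By way of non-limitative illustration only, the odd/even counting rule described above might be sketched as follows. Python is used for the sketch; the class name, passenger IDs and manifest handling are illustrative assumptions rather than part of the invention.

```python
from collections import defaultdict


class PassengerCounter:
    """Illustrative tally based on the odd/even detection rule described above.

    Assumes each passenger presents a unique ID each time their wristband or
    biometric is read at the boarding gate.
    """

    def __init__(self, manifest_ids):
        self.detections = defaultdict(int)   # detections recorded per passenger
        self.missing = set(manifest_ids)     # passengers not currently aboard

    def record_detection(self, passenger_id):
        self.detections[passenger_id] += 1
        if self.detections[passenger_id] % 2 == 1:
            # Odd detection count: treated as a boarding movement.
            self.missing.discard(passenger_id)
        else:
            # Even detection count: treated as a disembarking movement.
            self.missing.add(passenger_id)
        return len(self.missing)


counter = PassengerCounter(manifest_ids=["P001", "P002", "P003"])
counter.record_detection("P001")   # P001 boards; missing count drops to 2
counter.record_detection("P001")   # P001 disembarks; missing count returns to 3
```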
The passenger counting means may be adapted to distinguish between passengers boarding the vessel at the start of their journey or disembarking at the end of their journey, and passengers disembarking the vessel for an activity and re-boarding after the activity.
The passenger counting means may comprise a passenger manifest.
The passenger manifest may identify the total number of passengers due to board the vessel. Information relating to the passenger manifest may be wirelessly transmitted from an off-board server to the on-board server. This information may in turn be transmitted to the reading device.
The passenger counting means may comprise an activity manifest.
The activity manifest may identify the number of passengers due to re-board the vessel after returning from an activity.
The passenger counting means may comprise a missing passenger count for identifying passengers missing from the vessel. The missing passenger count may comprise a booking count. The booking count may indicate the number of passengers yet to board the vessel. The missing passenger count may comprise an activity count. The activity count may indicate the number of passengers yet to re-board the vessel after disembarking (for an activity, for instance).
The system, or passenger identification means, may comprise surveillance means. The surveillance means may comprise part or all of the passenger identification means.
The surveillance means may comprise a database of stored images showing passenger faces, or of stored templates based on or derived from such images. The surveillance means may comprise an on-board camera. The reading device may comprise the on-board camera. The on-board camera may be wirelessly linked with the on-board server. The on-board camera may be adapted to capture images of passengers. The captured images may comprise images showing passenger faces. The surveillance means may be adapted to detect passenger faces in a captured image. The surveillance means may be adapted to compare the captured images with the stored images, or compare templates of captured images with stored templates derived from images. The surveillance means may be adapted to identify a passenger based on similarities of a detected face with a stored image, or based on similarities of a template of a captured image including a passenger face with a stored template.
The surveillance means may be adapted to trigger an alert when the number of people captured in an image, or the number of people captured within a designated zone of the vessel, is greater than a predetermined value.
The surveillance means may be adapted to calculate the number of times, or total time, a passenger visits a predesignated area. If the number of times or total time is greater than their respective predesignated values, the surveillance means may trigger a frequency or duration alert.
The surveillance means may be adapted to trigger a rendezvous alert when two passengers facially recognised from a captured image match two passengers in stored images together associated with a rendezvous alert.
The surveillance means may be adapted to trigger a transit alert when a passenger facially recognised from a captured image in one zone matches the same passenger facially recognised from an earlier captured image in another zone, or does not match zoning information associated with a stored image of that passenger.
The surveillance means may be adapted to trigger an unattended alert when a passenger facially recognised from a captured image is not in close proximity to, or is greater than a pre-determined distance away from, or is not in the captured image with, or is determined to be in a different zone to, a predesignated accompanying passenger.
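Purely by way of illustration, the alert conditions described above could be expressed along the following lines. The function names, thresholds and data structures are assumptions made for the sketch only.

```python
def check_crowd_alert(face_count, max_allowed):
    """Crowd alert: more faces detected in an image or zone than permitted."""
    return face_count > max_allowed


def check_frequency_alert(visit_times, max_visits, max_total_seconds):
    """Frequency/duration alert for repeated or prolonged visits to an area.

    visit_times is assumed to be a list of (entry, exit) timestamps in seconds.
    """
    total = sum(exit_t - entry_t for entry_t, exit_t in visit_times)
    return len(visit_times) > max_visits or total > max_total_seconds


def check_rendezvous_alert(identified_ids, watched_pairs):
    """Rendezvous alert: two flagged passengers recognised in the same image."""
    present = set(identified_ids)
    return any(a in present and b in present for a, b in watched_pairs)


def check_unattended_alert(passenger_zone, companion_zone):
    """Unattended alert: passenger is not in the same zone as their
    predesignated accompanying passenger."""
    return passenger_zone != companion_zone
```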
The system, or passenger identification means, may comprise positioning means for identifying passenger position on the vessel. The
positioning means may comprise part or all of the passenger identification means.
The positioning means may comprise the passenger identification device disposed on or in relation to the passenger. The positioning means may comprise an on-board positioning receiver. The positioning receiver may be adapted to receive a positioning transmission from the passenger identification device. The positioning receiver may be associated with a receiver ID. The positioning means may be adapted to determine the position of the passenger on or in relation to the vessel based on at least one of the received transmission of positional information and the receiver ID.
The positioning means may be adapted to display the position of the detected passenger on a map. The positioning means may be adapted to determine or indicate the position of the passenger within twenty metres, fifteen metres, ten metres, eight metres, six metres, five metres, four metres, three metres, two metres or one metre.
The positioning receiver may be adapted to detect the positioning transmission over an extended distance from the transmitter. For instance, the positioning receiver may be adapted to detect the transmission when over two metres, five metres, ten metres, twenty metres, thirty metres, fifty metres, seventy-five metres, or one hundred metres away, or the positioning transmission may dissipate over such distances.
The identification device may comprise a BLE chip. The BLE chip may be adapted to transmit a low energy Bluetooth signal. The BLE chip may be embedded in the wristband. The BLE chip may comprise the positioning transmitter.
The positioning means, positioning receiver, or other linked device or processor, may be adapted to determine whether an earlier positioning transmission from the wristband was received within a predetermined time period, or whether a later positioning transmission is received within a predetermined time period. The time period may be configurable. After receipt of a transmission, the positioning receiver may be adapted to ignore later positioning transmissions from the identification device that are received within a predetermined time period. The positioning means or receiver may be adapted to ignore the positioning transmission if a previous transmission has been received within a predetermined time period. Thus, the passenger / positional transmitter position may only be updated after a pre-determined time interval.
The positioning means may be adapted to determine whether the detected position of the passenger or positional transmitter is within a restricted zone or zone restricted to them. The positioning means may be adapted to trigger or generate a location alert when the passenger is detected in the restricted zone.
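A non-limitative sketch of the debounce behaviour and restricted-zone check described above is given below; the zone names, update interval and alert text are illustrative assumptions.

```python
import time


class PositionTracker:
    """Sketch of the debounce and restricted-zone checks described above."""

    def __init__(self, min_update_interval=30.0, restricted_zones=("engine_room",)):
        self.min_update_interval = min_update_interval   # seconds between updates
        self.restricted_zones = set(restricted_zones)
        self.last_update = {}                            # passenger_id -> timestamp
        self.positions = {}                              # passenger_id -> zone

    def on_positioning_transmission(self, passenger_id, zone, now=None):
        now = time.time() if now is None else now
        last = self.last_update.get(passenger_id)
        if last is not None and (now - last) < self.min_update_interval:
            return None   # ignore: received within the debounce window
        self.last_update[passenger_id] = now
        self.positions[passenger_id] = zone
        if zone in self.restricted_zones:
            return f"LOCATION ALERT: {passenger_id} detected in restricted zone {zone}"
        return None
```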
The system, or an electronic device or devices comprised by the system, such as the on-board server, off-board server, identification device, reading device, camera, or positioning receiver, may comprise a processor. The system, or electronic device or devices comprised by the system, may comprise memory. The memory may comprise at least one of RAM, ROM, data, programs, and an operating system. The system or electronic device or devices comprised by the system may comprise input means, a display, a power source, and/or a communications interface. The communications interface may comprise a network interface. The, or some of the, devices of the system may be, or be adapted to be, wirelessly linked.
In another aspect the invention provides a passenger management method. In another aspect the invention provides a method of managing passengers of a vehicle. The vehicle may comprise a vessel.
The method may comprise:
transmitting unique passenger identification information from an identification device disposed on or in relation to a passenger; and
reading or receiving the transmission of passenger identification information.
The information may be read or received by a reading device.
The transmission may comprise an RFID transmission. The RFID transmission may comprise an NFC transmission.
The method may comprise determining the identity of the passenger based on the received passenger identification information.
The method may comprise bringing the reading device into close proximity with the identification device in order to receive the transmission therefrom. Close proximity may be less than five centimetres. It is to be noted that bringing is used herein as a relative term, so bringing the reading device into close proximity with the identification device is no different to bringing the identification device into close proximity with the reading device.
The method may comprise positioning the reading device at or on the way to an exit or entry point. The exit point may be an entry or exit point of a or the vessel.
Additionally, or alternatively, the method may comprise reading or measuring biometric characteristics of a passenger. The biometric characteristics may comprise at least one of fingerprint, DNA, hand geometry, palm vein arrangement, retinal blood vessel arrangement, iris configuration, facial structure, signature, and voice characteristics. In a particular form, the biometric characteristics comprise palm vein arrangement. Thus, reading or measuring biometric characteristics of a passenger may comprise measuring palm vein arrangement. Measuring palm vein arrangement may comprise beaming infrared light towards a passenger's palm and detecting the infrared light absorbed by the palm veins.
The method may comprise determining or conveying unique passenger identification information based on the measured biometric characteristics.
The method may comprise counting passengers by received transmissions. Additionally, or alternatively, the method may comprise counting passengers by measuring or reading biometric characteristics.
The method may comprise increasing or decreasing a passenger count by one each time a transmission is received. Additionally, or alternatively, the method may comprise increasing or decreasing a passenger count by one each time biometric characteristics are read or
measured. The passenger count may comprise a missing count of passengers off the vessel. The passenger count may comprise a count of passengers on-board the vessel.
The method may comprise obtaining or determining the direction of passenger movement either on or off the vessel at the time the transmission is received. Additionally, or alternatively, the method may comprise obtaining or determining the direction of passenger movement either on or off the vessel at the time the biometric characteristics are measured.
The method may comprise reducing the missing count when a passenger is detected moving on to the vessel, or increasing the missing count when a passenger is detected moving off of the vessel.
The method may comprise reducing the on-board count by one when a passenger is detected disembarking the vessel, and increasing the on-board count by one when a passenger is detected moving on to the vessel.
The method may comprise updating a status of a passenger when detected moving on or off the vessel. The status may be updated to indicate that the passenger is off-board when detected moving off the vessel. The status may be updated to indicate that the passenger is on-board when detected moving on to the vessel.
The method may comprise determining a passenger is missing based on their present status or most recent direction of movement.
The method may comprise determining the identity of a missing passenger based on their received passenger identification information. Additionally, or alternatively, the method may comprise determining the identity of a missing passenger based on their determined passenger identification information.
The method may comprise immobilising the vessel, or an engine of the vessel, until a missing passenger is accounted for. The missing passenger may be accounted for when their status is 'on-board'. Where there are multiple missing passengers, all may be accounted for when the missing passenger count equals zero.
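By way of illustration only, the immobilisation interlock might be expressed as a simple check of passenger statuses; the status labels used here are assumptions for the sketch.

```python
def engine_start_permitted(passenger_statuses):
    """Illustrative interlock: the engine may only be started when no
    passenger remains unaccounted for.

    passenger_statuses maps passenger IDs to 'on-board', 'off-board' or
    'journey-complete' (the labels are assumptions for this sketch).
    """
    missing = [pid for pid, status in passenger_statuses.items()
               if status == "off-board"]
    return len(missing) == 0, missing


allowed, missing = engine_start_permitted(
    {"P001": "on-board", "P002": "off-board", "P003": "journey-complete"})
# allowed is False and missing lists P002, so the vessel stays immobilised.
```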
The method may comprise transmitting information relating to the passenger count from the reading device to an on-board server.
The method may comprise syncing the passenger count on other reading devices with the passenger count recorded by the on-board server.
The method may comprise presenting personalised booking, marketing, sales, activity, medical, security or photo information to a reading device, or device linked therewith, based on received or read personal identification information. Additionally, or alternatively, the method may comprise presenting personalised booking, marketing, sales, activity, medical, security or photo information to a reading device, or device linked therewith, based on personal identification information associated with measured biometric characteristics.
The method may comprise transmitting positional information from a position device disposed on or in relation to a passenger. The position device may be the identification device. The position device may comprise the identification device. The passenger identification device may comprise the position device.
The method may comprise receiving the transmission of positional information by a positional receiver. The method may comprise the receiving device receiving a transmission of passenger identification information from the position or identification device. The method may comprise determining the identity of the passenger based on the received transmission.
The method may comprise associating the positional receiver with a receiver ID. The method may comprise identifying the location of the positional receiver based on the receiver ID.
The method may comprise determining the position of the passenger on or in relation to the vessel based on at least one of the received transmission of positional information and the receiver ID.
The method may comprise determining or indicating the position of the passenger within twenty metres, fifteen metres, ten metres, eight metres, six metres, five metres, four metres, three metres, two metres or one metre.
The method may comprise generating a map of at least part of the vessel. T he method may comprise displaying the detected position of the passenger on the map.
The identification or position device may be adapted to transmit positional information over at least five, ten, fifteen, twenty, thirty, fifty, seventy or one hundred metres.
The transmission of positional information may comprise a BLE transmission. The transmission of positional information may comprise a low energy Bluetooth transmission.
The method may comprise dividing the vessel or parts of the vessel into zones.
The method may comprise locating at least one positional receiver in each zone. The positional receiver may be adapted to receive positional information of passengers within the zone of the positional receiver.
The method may comprise determining or locating the position of the passenger within a particular zone.
In an alternative aspect, the method may comprise multiple towers receiving a positional transmission from a position or identification device. Multiple towers may be placed within a single zone and/or between zones (i.e. in different zones). The method may comprise determining the position of the passenger based on positional information received from multiple positional receivers. Determining the position of the passenger may comprise triangulation of positional information received by multiple positional receivers.
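One possible, non-limitative way of estimating a passenger's position from several positional receivers is a signal-strength-weighted centroid, sketched below. A production system might instead perform full trilateration; the receiver coordinates and RSSI values shown are assumptions.

```python
def estimate_position(readings):
    """Rough position estimate from several BLE receivers.

    readings is assumed to be a list of (receiver_x, receiver_y, rssi) tuples,
    where rssi is the received signal strength in dBm. A stronger (less
    negative) signal is given more weight; this is a crude stand-in for a
    full trilateration solution.
    """
    weights = [(x, y, 1.0 / max(1.0, abs(rssi))) for x, y, rssi in readings]
    total = sum(w for _, _, w in weights)
    est_x = sum(x * w for x, _, w in weights) / total
    est_y = sum(y * w for _, y, w in weights) / total
    return est_x, est_y


# Example: three receivers on deck, hypothetical coordinates in metres.
print(estimate_position([(0.0, 0.0, -40), (10.0, 0.0, -70), (0.0, 8.0, -65)]))
```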
The positional receivers may transmit information relating to the position of a passenger to the on-board server for processing or storage.
The method may comprise determining whether an earlier positioning transmission from the wristband has been received within a predetermined time period. The method may comprise determining whether a later positioning transmission has been received within a predetermined time period.
The method may comprise ignoring positioning transmissions received within the predetermined time period.
The method may comprise not updating or re-evaluating the passenger position until after the pre-determined time period.
The method may comprise determining whether the detected or determined position of the passenger is within a restricted zone. The method may comprise triggering a location alert when the passenger is detected in the restricted zone.
The method may comprise capturing an image of a passenger. The image may be captured by an on-board camera.
The method may comprise detecting the face of the passenger in the image.
The method may comprise storing images showing passenger faces. The stored images may be stored on the on-board server. Additionally, or alternatively, the method may comprise creating templates based on images showing passenger faces. The method may comprise storing the templates. The templates may be stored on the on-board server.
The method may comprise associating passenger identification information with a stored image of the passenger. Additionally, or alternatively, the method may comprise associating passenger identification information with a stored template of or based on an image of the passenger.
The method may comprise comparing the face of the passenger in the captured image with the faces of passengers in the stored images. Additionally, or alternatively, the method may comprise comparing a template of a captured image including a passenger face with stored templates.
The method may comprise determining similarities between the face of a passenger in a captured image and the face of a passenger in the stored image. Additionally, or alternatively, the method may comprise determining similarities between the template of a captured image including a passenger face and a stored template.
The method may comprise determining a passenger match based on the similarities between faces in the captured image and stored image. Additionally, or alternatively, the method may comprise determining a passenger match based on the similarities between templates generated from captured images and stored templates.
The method may comprise identifying the passenger in the captured image based on the passenger identification information associated with the stored image for which a passenger match was determined. Additionally, or alternatively, the method may comprise identifying the passenger in the template of the captured image based on the passenger identification information associated with the stored template for which a passenger match was determined.
The method may comprise identifying a passenger in the captured image based on similarities with a passenger in a stored image. Additionally, or alternatively, the method may comprise identifying a passenger from a template of a captured image based on similarities with a stored template of an image of or including the passenger.
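Purely as an illustrative sketch, matching a facial data template of a captured image against stored templates could be performed by comparing feature vectors, for example using cosine similarity. The threshold and the representation of templates as numeric vectors are assumptions, and generation of the templates themselves (for example by a facial recognition model) is outside the sketch.

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length facial data templates
    (feature vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def match_face_template(captured_template, stored_templates, threshold=0.8):
    """Return the passenger ID of the best-matching stored template, or None.

    stored_templates maps passenger IDs to stored facial templates; the
    threshold value is an assumption for illustration.
    """
    best_id, best_score = None, threshold
    for passenger_id, template in stored_templates.items():
        score = cosine_similarity(captured_template, template)
        if score > best_score:
            best_id, best_score = passenger_id, score
    return best_id
```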
The method may comprise determining when the number of people captured in an image, or the number of people captured within a designated zone of the vessel, is greater than a predetermined value. The method may comprise generating a crowd alert when the number of people captured in an image, or the number of people captured within a designated zone of the vessel, is greater than a predetermined value.
The method may comprise capturing multiple images. There may be multiple images of a particular passenger or a particular zone or a particular passenger within a particular zone.
The method may comprise analysing or comparing multiple captured images.
The method may comprise calculating the number of times, or total time, a passenger visits a predesignated area based on multiple captured images.
The method may comprise triggering a frequency or duration alert when the number of times, or total time, is greater than a respective associated predesignated value.
The method may comprise identifying two passengers in a captured image as matching two passengers in stored images associated with a rendezvous alert. The method may comprise generating the rendezvous alert in such instances.
The method may comprise identifying the same passenger from two captured images taken in or of different zones. The method may comprise identifying a passenger from a captured image in a zone which does not match zoning information associated with a stored image of that passenger. The method may comprise generating a transit alert in such instances.
The method may comprise determining that a passenger identified from a captured image is not in close enough proximity to, or is greater than a pre-determined distance away from, or is not in the captured image with, or is determined to be in a different zone to, a predesignated accompanying passenger associated with the identified passenger.
The method may comprise presenting or making accessible at least one captured image of a passenger upon reading personal identification information relating to that passenger as transmitted from the identification device.
In another aspect, the invention provides a system for managing passengers of a vehicle comprising a reading device adapted for reading biometric characteristics of a passenger, the biometric characteristics being associated with unique passenger identification information.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
reading biometric characteristics of a passenger; and
identifying the passenger based on unique passenger identification information associated with the read biometric characteristics.
In another aspect, the invention provides a system for managing passengers of a vehicle, comprising:
a palm vein scanner adapted to scan palm veins of a passenger, generate a palm vein image based on the scan, create a data template
based on the palm vein image, and convey the data template to an external device; and
a server adapted to associate unique passenger identification information with the conveyed data template, and store the data template in a database of templates for future comparison.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger;
generating a palm vein image based on the scan;
creating a data template based on the palm vein image;
associating unique passenger identification information with the data template; and
storing the data template in a database of templates for comparison on future scans.
In another aspect, the invention provides a system for managing passengers of a vehicle, comprising:
a palm vein scanner adapted to scan palm veins of a passenger, generate a palm vein image based on the scan, create a data template based on the palm vein image, and convey the data template to an external device; and
a server adapted to compare the data template with a database of stored templates associated with unique passenger identification information, match the data template with one of the stored templates; and identify the passenger using the unique passenger identification information associated with the matched stored template.
In another aspect, the invention provides a method of identifying passengers of a vehicle, comprising:
scanning the palm veins of a passenger;
generating a palm vein image based on the scan;
creating a data template based on the palm vein image;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template.
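The enrolment and identification flows of the preceding aspects might, purely by way of non-limitative illustration, be sketched as follows. The template format and the similarity routine are placeholders for whatever matching the palm vein scanner vendor provides, and the threshold is an assumption.

```python
def template_similarity(a, b):
    """Placeholder similarity measure: fraction of matching bytes.

    A real system would use the scanner vendor's matching routine instead.
    """
    if not a or not b or len(a) != len(b):
        return 0.0
    return sum(x == y for x, y in zip(a, b)) / len(a)


class PalmVeinRegistry:
    """Sketch of the enrolment and identification flow described above."""

    def __init__(self, match_threshold=0.9):
        self.templates = {}              # passenger_id -> stored data template
        self.match_threshold = match_threshold

    def enrol(self, passenger_id, data_template):
        """Associate unique passenger identification information with a
        template and store it for comparison on future scans."""
        self.templates[passenger_id] = data_template

    def identify(self, data_template):
        """Compare a freshly created template against all stored templates
        and return the matching passenger's ID, or None if no match."""
        for passenger_id, stored in self.templates.items():
            if template_similarity(data_template, stored) >= self.match_threshold:
                return passenger_id
        return None
```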
In another aspect, the invention provides a system for counting passengers of a vehicle, comprising:
a palm vein scanner adapted to scan the palm veins of a passenger, generate a palm vein image based on the scan, create a data template based on the palm vein image, and convey the data template to an external device; and
a server adapted to compare the data template with a database of stored templates associated with unique passenger identification information, match the data template with one of the stored templates, and increase or reduce a passenger count according to whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger;
generating a palm vein image based on the scan;
creating a data template based on the palm vein image;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
increasing or reducing a passenger count according to whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
In another aspect, the invention provides a system for managing passengers of a vehicle, comprising:
a palm vein scanner adapted to scan palm veins of a passenger, generate a palm vein image based on the scan, create a data template based on the palm vein image, and convey the data template to an external device;
a server adapted to receive and compare the data template with a database of stored templates associated with unique passenger identification information, match the data template with one of the stored templates; and identify the passenger using the unique passenger identification information associated with the matched stored template; and
an on-vehicle terminal linked with the palm vein scanner and running an on-vehicle application, wherein passenger access to personalised content using the on-vehicle application is enabled once the passenger has been identified.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger;
generating a palm vein image based on the scan;
creating a data template based on the palm vein image;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template;
enabling the identified passenger to access personalised content.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger;
creating a data template based on the scan;
associating unique passenger identification information with the data template; and
storing the data template in a database of templates for comparison on future scans.
In another aspect, the invention provides a method of identifying passengers of a vehicle, comprising:
scanning the palm veins of a passenger;
creating a data template based on the scan;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger;
creating a data template based on the scan;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
increasing or reducing a passenger count according to whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger;
creating a data template based on the scan;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template;
enabling the identified passenger to access personalised content on an electronic device or terminal.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
measuring a biometric characteristic of a passenger;
creating a data template based on the measurement;
associating unique passenger identification information with the data template; and
storing the data template in a database of templates for comparison on future biometric measurements.
In another aspect, the invention provides a method of identifying passengers of a vehicle, comprising:
measuring a biometric characteristic of a passenger;
creating a data template based on the measurement;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
measuring a biometric characteristic of a passenger;
creating a data template based on the measurement;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
increasing or reducing a passenger count according to whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
measuring a biometric characteristic of a passenger;
creating a data template based on the measurement;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template;
enabling the identified passenger to access personalised content on an electronic device or terminal.
In another aspect, the invention provides a method of identifying passengers of a vehicle, comprising:
reading a biometric characteristic of, or an identification device disposed on, a passenger;
obtaining unique passenger identification information based on the reading;
comparing unique passenger identification information with a database of stored passenger identification information associated with passengers of the vehicle;
matching the obtained passenger identification information with stored passenger identification information associated with a particular passenger;
identifying the passenger based on the matched unique passenger identification information.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
reading a biometric characteristic of, or an identification device disposed on, a passenger;
obtaining unique passenger identification information based on the reading;
comparing unique passenger identification information with a database of stored passenger identification information associated with passengers of the vehicle;
matching the obtained passenger identification information with stored passenger identification information associated with a particular passenger;
increasing or reducing a passenger count according to whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
reading a biometric characteristic of, or an identification device disposed on, a passenger;
obtaining unique passenger identification information based on the reading;
comparing unique passenger identification information with a database of stored passenger identification information associated with passengers of the vehicle;
matching the obtained passenger identification information with stored passenger identification information associated with a particular passenger;
identifying the passenger based on the matched unique passenger identification information;
enabling the identified passenger to access personalised content on an electronic device or terminal.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
reading a biometric characteristic of, or an identification device disposed on, a passenger;
creating a data template based on the reading;
associating unique passenger identification information with the data template; and
storing the data template in a database of templates for comparison on future readings.
In another aspect, the invention provides a method of identifying passengers of a vehicle, comprising:
reading a biometric characteristic of, or an identification device disposed on, a passenger;
creating a data template based on the reading;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
reading a biometric characteristic of, or an identification device disposed on, a passenger;
creating a data template based on the reading;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
increasing or reducing a passenger count according to the passenger's direction of movement or whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
reading a biometric characteristic of, or an identification device disposed on, a passenger;
creating a data template based on the reading;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template;
enabling the identified passenger to access personalised content on an electronic device or terminal.
In another aspect, the invention provides a method of identifying passengers of a vehicle, comprising:
capturing an image which includes the face of a passenger;
creating a facial data template based on the captured image;
comparing the facial data template with a database of stored facial data templates associated with unique passenger identification information;
matching the created facial data template with one of the stored facial data templates;
identifying the passenger based on the unique passenger
identification information associated with the matched stored template.
The method may further comprise increasing or reducing a passenger count according to the passenger's direction of movement or whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
The method may further comprise enabling the identified passenger to access personalised content on an electronic device or terminal.
In another aspect, the invention provides a method of identifying passengers of a vehicle, comprising:
transmitting unique passenger identification information from an identification device disposed on or in relation to a passenger;
reading the transmission of passenger identification information; comparing the transmitted passenger identification information with a database of stored passenger identification information records;
matching the transmitted passenger identification information with one of the stored passenger identification information records;
identifying the passenger based on passenger identification information in the matched stored record.
The method may further comprise increasing or reducing a passenger count according to the passenger's direction of movement or whether or not the passenger is entering, exiting, checking in, or checking out of a zone, port, terminal, facility or vehicle.
The method may further comprise enabling the identified passenger to access personalised content on an electronic device or terminal.
A combination of means may be used in accordance with the invention in order to identify passengers of a vehicle. For instance, two or more of the following means may be used in combination:
i) an identification device disposed on or in relation to the passenger, the identification device being adapted to convey unique passenger identification information; and, a reading device adapted for reading the unique passenger identification information conveyed by the identification device;
ii) a camera for capturing an image which includes the face of a passenger; and, facial recognition means for identifying the face of the passenger captured in the image;
iii) a palm vein scanner adapted to scan palm veins of a passenger; and, a server adapted to match template data conveyed from the palm vein scanner with one of a number of stored templates containing unique passenger identification information for identifying the passenger.
Thus, in another aspect, the invention provides a system for managing passengers of a vehicle, comprising:
a palm vein scanner adapted to scan palm veins of a passenger; a server adapted to match template data conveyed from the palm vein scanner with one of a number of stored templates containing unique passenger identification information for identifying the passenger;
a camera for capturing an image which includes the face of a passenger; and,
facial recognition means for identifying the face of the passenger captured in the image.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger boarding the vehicle;
identifying the passenger on the basis of the scan;
verifying that the identified passenger is on a passenger manifest for the vehicle.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger boarding the vehicle;
identifying the passenger on the basis of the scan;
updating a database to indicate that a, or the, passenger is boarding, or is aboard, the vehicle,
scanning the palm veins of the passenger disembarking the vehicle at a new location;
identifying the passenger on the basis of the scan;
updating a database to indicate that a, or the, passenger is disembarking, or has disembarked the vehicle.
In another aspect, the invention provides a method of managing passengers of a vehicle such as a vessel, comprising:
scanning the palm veins of a passenger boarding the vehicle;
identifying the passenger on the basis of the scan;
updating a database to indicate that a, or the, passenger is boarding, or is aboard, the vehicle,
capturing an image of the passenger aboard the vehicle during transit, the image comprising the face of the passenger;
identifying the passenger on the basis of the captured image;
identifying the zone or region of the vehicle in which the passenger image was captured, thereby locating the passenger aboard the vehicle;
scanning the palm veins of the passenger disembarking the vehicle, the vehicle having travelled to a new location;
identifying the passenger on the basis of the scan; and
updating a database to indicate that a, or the, passenger is disembarking, or has disembarked the vehicle.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
reading at least one biometric characteristic of a passenger, reading at least one different biometric characteristic, or a non-biometric characteristic comprising a transmitted data template, relating to the, or another, passenger;
creating a data template for and based on the, or each, reading of a biometric characteristic;
accessing a database of stored data templates, each stored data template being associated with unique passenger identification information; comparing each data template with the stored data templates;
matching each created or received data template with a stored data template;
identifying the, or each, passenger based on the unique passenger identification information associated with each matched stored template.
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
reading at least one biometric characteristic of a passenger, and creating a data template based on the read biometric characteristic;
reading at least one non-biometric characteristic relating to the, or another, passenger, wherein the at least one non-biometric characteristic comprises a transmitted data template relating to the, or another, passenger; accessing a database of stored data templates, each stored data template being associated with unique passenger identification information; comparing each read data template with the stored data templates; matching each read data template with a stored data template;
identifying the, or each, passenger based on the unique passenger identification information associated with each matched stored template.
The method may comprise:
reading at least one biometric characteristic of the passenger, and creating a data template based on the read biometric characteristic;
creating a data template of at least one non-biometric characteristic relating to the, or another, passenger;
associating unique passenger identification information with each data template; and
storing each data template in a database of templates for comparison on future readings.
Reading at least one biometric characteristic may comprise scanning the palm veins of the passenger.
Reading at least one non-biometric characteristic may comprise electronically receiving a transmission of a data template from a device disposed on or in relation to a passenger.
Reading the at least one biometric characteristic may comprise capturing an image which includes the face of a passenger.
The method may comprise increasing or reducing a passenger count according to whether the, or each, passenger is entering or exiting the vehicle when identified.
The method may comprise enabling the, or each, identified passenger to access personalised content on an electronic device or terminal.
The method may comprise presenting content on an electronic device or terminal to the, or each, identified passenger.
Reading of at least one biometric or non-biometric characteristic may occur at least as the, or each, passenger is boarding the vehicle. Once the, or each, passenger is identified, a database may be updated to indicate that the, or each, passenger is boarding or is aboard the vehicle.
Reading of at least one biometric or non-biometric characteristic may occur at least as the, or each, passenger is disembarking the vehicle. Once the, or each, passenger is identified, a database may be updated to indicate that the, or each, passenger is disembarking, or has disembarked, the vehicle.
The method may comprise identifying a zone or region of the vehicle in which the, or each, identified passenger is located.
The method may comprise verifying that the, or each, identified passenger is on a passenger manifest for the vehicle.
In another aspect, the invention provides a system for managing passengers of a vehicle, the system comprising:
at least one biometric reader configured to read a biometric characteristic of a passenger, create a data template based on the reading, and convey the data template directly or indirectly to a server;
at least one non-biometric reader configured to read a transmitted non-biometric characteristic comprising a data template relating to the, or
another, passenger, and convey the data template directly or indirectly to the server; and
the server configured to receive each data template, access a database of stored data templates, each stored template being associated with unique passenger identification information, compare each data template with the stored data templates, match each data template with a stored data template; and identify the, or each, passenger based on the unique passenger identification information associated with each matched stored template.
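By way of illustration only, the server-side lookup shared by the biometric and non-biometric readers of the preceding aspect might be sketched as below. The exact-match lookup and the database contents are assumptions, and a real biometric comparison would use a similarity measure rather than exact equality.

```python
def identify_from_template(data_template, template_database):
    """Look up a conveyed data template in the shared template database.

    template_database is assumed to map stored data templates (for example
    palm vein templates or templates transmitted from NFC wristbands) to
    unique passenger identification information.
    """
    return template_database.get(data_template)


# Hypothetical database keyed by template bytes.
database = {
    b"palm-template-001": {"passenger_id": "P001"},
    b"nfc-template-042": {"passenger_id": "P042"},
}

print(identify_from_template(b"nfc-template-042", database))  # {'passenger_id': 'P042'}
```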
In another aspect, the invention provides a method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger at least as they board or disembark the vehicle;
creating a data template based on the scan;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template;
updating a database to indicate that the identified passenger is boarding or is aboard, or is disembarking or has disembarked, the vehicle when the passenger is scanned as they board or disembark the vehicle respectively.
In another aspect, the invention provides a system for managing passengers of a vehicle, comprising:
at least one palm vein scanner configured to scan palm veins of a passenger at least as they board or disembark the vehicle, create a data template based on the scan, and convey the data template directly or indirectly to a server; and
the server configured to compare the created data template with a database of stored templates associated with unique passenger identification information, match the created data template with one of the stored data
templates; identify the passenger using the unique passenger identification information associated with the matched stored template; update a vehicle presence database to indicate that the identified passenger is boarding, or is aboard, the vehicle when scanned boarding, or update the vehicle presence database to indicate that the identified passenger is disembarking, or has disembarked, the vehicle when scanned disembarking.
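A non-limitative sketch of the vehicle presence update described above is given below, using an in-memory SQLite table; the table layout and status labels are assumptions for illustration only.

```python
import sqlite3


def update_vehicle_presence(conn, passenger_id, scanned_at):
    """Record that the identified passenger is aboard or has disembarked.

    scanned_at is assumed to be either 'boarding' or 'disembarking',
    according to which scanner read the passenger.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS vehicle_presence ("
        "passenger_id TEXT PRIMARY KEY, status TEXT, updated_at TEXT)")
    status = "aboard" if scanned_at == "boarding" else "disembarked"
    conn.execute(
        "INSERT OR REPLACE INTO vehicle_presence (passenger_id, status, updated_at) "
        "VALUES (?, ?, datetime('now'))",
        (passenger_id, status))
    conn.commit()


conn = sqlite3.connect(":memory:")
update_vehicle_presence(conn, "P001", "boarding")       # P001 is now aboard
update_vehicle_presence(conn, "P001", "disembarking")   # P001 has disembarked
```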
BRIEF DESCRIPTION OF THE DRAWINGS
In order that the invention may be more clearly understood and put into practical effect there shall now be described in detail embodiments in accordance with the invention. The ensuing description is given by way of non-limitative examples only and is with reference to the accompanying drawings, wherein:
Fig. 1 is a diagram of an example passenger management system that may be utilised to embody or implement aspects of the invention;
Fig. 2a is a diagram illustrating an assisted check-in;
Fig. 2b is a diagram illustrating a self-check-in;
Fig. 2c is a diagram illustrating a self-check-in via a kiosk using a palm vein scanner;
Fig. 3 is a block diagram illustrating components of the example system of Figure 1;
Fig. 4a is an isometric view of a self-check-in kiosk;
Fig. 4b is a front view of the kiosk of Fig. 4a;
Fig. 4c is a side view of a self-check-in kiosk having an alternative wristband dispensing outlet;
Fig. 4d is a front view of a self-check-in kiosk having a palm vein scanner;
Fig. 4e is a side view of the kiosk of Fig. 4d;
Fig. 4f is an isometric view of a self-check-in kiosk having an alternative palm vein scanner;
Fig. 5 is an isometric view of an internal RFID printer of the kiosk;
Fig. 6 is an isometric view of an external RFID printer for connection with an assisted check-in terminal;
Figs. 7a & 7b are isometric views of rolls of wristband material for placement within an RFID printer;
Fig. 8a is a bottom view of an NFC wristband;
Fig. 8b is a top view of an NFC wristband;
Fig. 8c is an isometric view of an NFC wristband;
Fig. 8d is an isometric view of a fastening clip for an NFC wristband;
Fig. 8e is an isometric view of an NFC wristband fastened on the wrist of a passenger;
Fig. 8f is an isometric view of an NFC & BLE wristband with ends magnetically fastened;
Fig. 8g is a side view of the NFC & BLE wristband of Fig. 8f with ends magnetically fastened;
Fig. 8h is a top view of the NFC & BLE wristband of Fig. 8f in an unfastened, flattened configuration;
Fig. 8i is a side view of the NFC & BLE wristband of Fig. 8f in an unfastened, flattened configuration;
Fig. 8j is an isometric view of a palm vein sensor device;
Fig. 8k is an isometric view of a hand guide suitable for mounting on the palm vein sensor device of Fig. 8j;
Fig. 8l is an isometric view of the alternative palm vein scanner of Fig. 4f;
Fig. 9a is a front view of an on-board server resting on its feet;
Fig. 9b is a perspective view of the on-board server of Fig. 9a standing on its side;
Fig. 9c is a diagram of internal hardware componentry of the on-board server of Fig. 9a;
Fig. 10a is an isometric view of a surveillance camera;
Fig. 10b is a rear view of the camera of Fig. 10a;
Fig. 10c is a left side view of the surveillance camera of Fig. 10a;
Fig. 10d is a view of a portion of the right side of the camera of Fig. 10a;
Fig. 11a is a front view of a tablet with stand;
Fig. 11b is a side view of the tablet of Fig. 11a with stand;
Fig. 12 is an isometric view of an NFC reader;
Fig. 13a is an isometric view of a tablet mount with NFC reader pocket;
Fig. 13b is an isometric view of a tablet mount with integral NFC reader;
Fig. 13c is an isometric view of a tablet mount with mounting arm;
Fig. 13d is an isometric view of a tablet mount with stand;
Fig. 13e is an isometric view of a tablet mount adapted to hold tablets of different sizes;
Fig. 14a is an isometric view of a console with twenty-seven-inch screen;
Fig. 14b is an isometric view of a console with twenty-four-inch screen;
Fig. 14c is an isometric view of a console with twenty-one-inch screen;
Fig. 14d is an isometric view of a tablet connected with a small palm vein scanner;
Fig. 14e is an isometric view of a tablet and palm vein scanner enclosed within a case;
Fig. 14f is an isometric view of a tablet on a mount and connected by cable to a separate palm vein scanner;
Fig. 14g is an isometric view of the palm vein scanner of Fig. 14f, showing the sensor portion of the scanner as partially transparent;
Fig. 14h is an isometric view of an adjustable hand guide or bracket of the palm vein scanner of Fig. 14f;
Fig. 14i is an isometric view of a wall mountable palm vein scanner;
Fig. 15 is an isometric view of a BLE receiver;
F ig. 16 is a diagram of an example passenger tracking system that may be utilised to embody or implement aspects of a passenger tracking process;
F ig. 17 is a diagram illustrating an exemplary tour process;
F ig. 18 is a logic flow diagram illustrating a self-check-in process;
F ig. 19 is a logic flow diagram illustrating an assisted check-in process;
F ig. 20 is a logic flow diagram illustrating a passenger counting process;
F ig. 21 is a logic flow diagram illustrating a passenger activity counting process;
F ig. 22 is a logic flow diagram illustrating a video surveillance process; and
F ig. 23 is a logic flow diagram illustrating a location tracking process; F igs. 24a-d are a series of schematic diagrams respectively illustrating the process of: a) palm placement above a palm vein scanner, b) palm vein scanning, c) palm vein image generation, and d) palm vein template storage;
F ig. 25 is a logic flow diagram illustrating an enrolment process for palm vein scanning during check-in;
F ig. 26 is a counting on process for passengers boarding the vessel using palm vein scanning;
F ig. 27 is a counting off process for passenger disembarking the vessel using palm vein scanning; and
F ig. 28 is a process for accessing customised on-board applications authorised by palm vein scanning.
MODES FOR CARRYING OUT THE INVENTION
Referring to Figure 1, there is shown an exemplary networked infrastructural system 10 for managing passengers 11 of a sea vessel 12. The system 10 comprises multiple hardware devices linked by a network. The hardware devices comprise various electronic, processing and/or computing devices including: Near Field Communication ('NFC') wristbands 13a, NFC / Bluetooth Low Energy ('BLE') wristbands 13b, NFC / Global Positioning System ('GPS') wristbands 13c, or colour coded wristbands 13d to be worn by each passenger 11; self-check-in kiosks 14 (see also Figures 2b and 2c); self-check-in tablets 190; and staff 57 assisted check-in terminals 15 (see also Figure 2a). The kiosks 14, tablets 190, and terminals 15 are each located at a port, station or other check-in facility 185, which is off-vessel in this instance. Each kiosk 14, tablet 190, and terminal 15 comprises, or is connected in close proximity with, a camera 16, and/or a radio-frequency identification ('RFID') printer 17, and/or a check-in palm vein scanner 191, and/or a check-in DNA reader 193, and is adapted to execute a self or assisted check-in application 18 installed thereon.
In the cloud, a REST API server runs a REST API application which connects various electronic devices together via the internet. For instance, the kiosks 14, tablets 190, and terminals 15 are each connected through the REST API to a remote booking server 19 via the internet 20. The booking server 19 has installed thereon a booking application 100, and in turn communicates with an off-vessel database server 23 which controls data transfer to and from a provider database 24. Additionally, the kiosks 14 and terminals 15 are connected to an off-vessel facial recognition server 21 via the internet 20. The facial recognition server 21 has installed thereon a facial recognition application. The facial recognition server 21 is adapted to send a security alert 25 to third party devices (not shown) when predetermined conditions are met.
On board the vessel 12 is located an on-board server 26 which, depending on vessel location, is intermittently connected with the remote booking server 19, facial recognition server 21, and database server 23. On board, the on-board server 26 is connected via a closed on-board network with: multiple BLE receivers 81; multiple cameras 16 placed around the vessel, such as closed-circuit television ('CCTV') cameras and/or other forms of surveillance cameras, including cameras suitable for facial recognition processes; multiple provider consoles 27, either standing, fixedly mounted or portably carried by crew, the consoles 27 being adapted to run a real-time tracking web application 28 which runs in a web browser accessible in the closed network; and multiple portable tablets 29 adapted to run a native (e.g. Android platform) management application 31 having various management modules. Detection and/or identification devices in the form of NFC readers 32 can be linked with or comprised by the consoles 27 and/or tablets 29. Further, identification devices in the form of on-board/transit palm vein scanners 192, and/or on-board/transit DNA readers 194, can be linked with or comprised by the consoles 27 and/or tablets 29.
The on-board server 26 is adapted to send a security alert 33 to third party devices, off-board devices, and/or on-board devices when appropriate security conditions are met. The on-board server is linked with one or more on-board databases 135 in which are recorded a passenger manifest 127 and an activity manifest 136. The passenger manifest is originally received from the database server 23. Further installed on the on-board server is a suite of executable applications comprising: a passenger count application 128 for providing real-time numbers and whereabouts of passengers on and off board; an on-board sales and marketing application 129 enabling cashless purchases by passengers using their wristbands for identification; a medical and legal application 130 for enabling passengers to complete Medical Self Risk Assessments and Assumption of Risk and Liability Waivers prior to undertaking any activities, plus Medical Declarations counter-signed by staff prior to undertaking any high-risk activities; an activity scheduling application 131 for making and changing tour activity bookings; a photo management application 132 capable of filtering and sending photos of passengers taken by on-board cameras directly to the passengers captured in the photos; a management reporting application 133 for EMC reporting and analytics; and a security application 134 for real-time facial recognition image processing and security alert generation.
Also on the on-board server, a ship log application with programs relating to the weather, tides, navigation and systems checks, master and crew, counts and voyage details, incidents, and diving instruction, along with the passenger manifest application and a passenger location tracking application, are all accessible and viewable by crew via a graphic user interface presented on the consoles 27. Similarly, passenger count, crew count, sales / marketing, activity scheduling, medical and legal forms, and/or photo sales / marketing, are presentable on guest and/or crew tablets 29 by the on-board server 26.
It is to be appreciated that the system 10 may additionally, or alternatively, include other electronic or computing devices, such as one or more desktop computers, laptop computers, implanted devices, smart phones 34 and/or wearable devices such as smartwatches and headsets.
The network of the exemplary system 10 comprises the internet 20 and an on-board LAN, and may further include other networks such as a WAN, Ethernet, token ring, satellite communications networks, or telecommunications networks, including mobile telephone networks such as GSM or 3G networks, or any combination thereof, by which the hardware devices can communicate. This enables, for instance, input and output data to be communicated via the network between the hardware devices.
Interconnections between devices facilitating transfer of data and/or information over the network may be wholly or partially wired, for example by optical fibre, or wireless, for example by utilising Wi-Fi, Bluetooth, cellular, or satellite communications networks. For instance, in the exemplary embodiment, the closed on-board LAN uses ad-hoc Wi-Fi to interconnect local devices, whereas connections between NFC readers 32 and consoles or tablets can be via bus and power cables.
It is to be appreciated that the networked infrastructural system 10 represents only a single example of infrastructure which may be suitable for implementing aspects of the invention. Other suitable networked systems for implementing the invention may involve various alternative devices, configurations, networks, or architectures without departing from the scope of the present invention.
Referring now to Figure 3, there is shown a block diagram of computing hardware 56 associated with the system 10. Some or all of the hardware 56 is associated with potentially any of the electronic devices 13, 14, 15, 16, 17, 23, 26, 27, 29, 32, 190, 191, 192, 193, and/or 194. The hardware 56 includes memory 35 which includes ROM 36 and RAM 37 as well as stored data 38, collections of instructions forming programs and applications 39, and an operating system 40. A bus 46 links the memory 35 to a processor 41, display 42, and input means 43. The hardware is powered by a power supply 44. A communications interface 45 enables the device to communicate with other devices.
Referring now to Figures 4a-c, there is shown a first version of the self-check-in kiosk 14a. The exemplary kiosk 14a includes internally a high-performance Intel i5-6500 processor, memory in the form of 4 GB RAM (upgradable to 32 GB) and a 2 TB hard disk drive. Various communications interfaces are provided, including an open component interface zone 49 which includes an audio jack, and a Wi-Fi compliant network interface card.
The kiosk 14a further includes multiple payment interfaces in the form of a credit / debit card slot interface 48 and an NFC / RFID card touch or proximity interface 47, enabling direct card or phone touch purchases to be made conveniently by the user.
The kiosk 14a further includes input devices or means in the forms of a keypad 50 and a nineteen-inch touchscreen 51 (e.g. PCAP, SAW), the touchscreen doubling as an anti-glare display screen. A barcode scanner 52 is present for scanning tickets or boarding passes. An amplified speaker and 360-degree LED shelf lighting are also provided.
The kiosk 14a further includes an internal RFID printer 17a (see Figure 4c) for printing NFC wristbands 13 and dispensing them for user collection via a slot 54 (Figure 4b) or recess 55 (Figure 4a).
Referring now to Figures 4d and 4e, there is shown a second version of the self-check-in kiosk 14b. The exemplary kiosk 14b shares many of the features of the first version of the kiosk 14a, but further includes a first version of a check-in palm vein scanner 191a. Referring now to Figure 4f, there is shown a third version of the self-check-in kiosk 14c. Like the second version of the kiosk 14b, the exemplary kiosk 14c shares many of the features of the first version of the kiosk 14a, but further includes a second version of a check-in palm vein scanner 191b, as well as a passport reader 195.
Referring now to Figure 5, there is shown the internal RFID printer 17a of the kiosk 14. The RFID printers 17 are adapted to thermally print on the wristbands 13 and embed them with RFID transponders comprising NFC chips. The exemplary printer 17a includes FLASH expansion memory of up to 28 MB, an automatic cutting unit for band cutting, network interfaces for Ethernet, Wi-Fi and Bluetooth, 600 DPI print resolution, twenty-four-volt DC power input, and an RFID encoder.
Referring now to Figure 6, there is shown an external RFID printer 17b suitable for connection with an assisted check-in terminal 15. The exemplary printer 17b encloses its internal hardware components within its own housing 58. Like the internal printer 17a, the external printer includes FLASH expansion memory of up to 28 MB, network interfaces for Ethernet, Wi-Fi and Bluetooth, 600 DPI print resolution, twenty-four-volt DC power input, and an RFID encoder. Whilst an automatic cutting unit is an optional feature, the standard exemplary printer 17b has a slot 54 with cutting edges adapted for manual band tear-off by a staff member 57.
Referring to Figures 7a and 7b, there is shown a length of material 59 wound into a roll 60. The exemplary roll is installed in the printers 17 for printing and cutting into shorter lengths to form up to five hundred wristbands 13. The material may be soft, comfortable and durable, neutral to the skin, resistant to water, tamper evident, tear resistant and stretch proof. In the example shown, the material comprises a thermal film having a thermal side 61 and a non-thermal side 62. The film is adapted for cutting, and the thermal side is adapted for printing of data such as barcodes, dates and times by the printers 17. The material 59 defines a series of fastening holes 63 running centrally along its length. The material may also have pre-defined cutting lines 6.
Referring to Figures 8a-c, there is shown the NFC wristband 13 as cut by the RFID printer 17 from the roll 60. The wristband 13 is waterproof and comprises a flat NFC tag 64 in the form of an inlay having an NFC chip and antenna, and being embedded in the band material 59. The NFC chip comprises 1024-bit user memory and three independent twenty-four-bit one-way counters. Further, the chip features 106 kbit/s communication speed, anti-collision support, one-time pin, lock bits, configurable counters, protected data access through a 32-bit password, and a unique seven-byte serial number.
Figure 8d shows a clip 65 which is adapted to fasten overlapping fastening holes 63 of the wristband, thereby securing the wristband 13 to the wrist of the passenger 11 (see Figure 8e). Once fastened, the wristband is irremovable without it being damaged, such as may occur when cutting off the band with scissors. The wristband is adjustable prior to fastening by adjusting the holes which are overlapped, whereby overlapping holes which are further from their respective ends produces a smaller fit. In an alternative form, rather than providing wristbands with fastening holes, peel-off adhesive wristbands are provided.
Referring to Figures 8f-i, there is shown an exemplary NFC plus BLE enabled wristband 13b which may be issued during assisted check-in. The wristband 13b comprises: a power source 44 in the form of a battery; a main board having a central processing unit 41; memory 35 comprising 64K RAM and FLASH expansion slots for separate storage of encrypted transactional and personal data; input means 43 in the form of programming buttons 187; a system clock; a display 42; pressure sensors; thin LED strips; a communications interface 45 comprising a micro-USB port or connection; and network interfaces comprising an NFC tag 64 and a Bluetooth tag 186 in the form of an inlay having a CSR Energy Chipset with a low energy Bluetooth transmitter with a 50 m Bluetooth V4 radius range. The material 59 of the wristband 13b is opaque, waterproof (silicone), and flexible. Further, the wristband 13b comprises a pair of magnetic clasps of opposite polarity 188 co-operatively disposed / embedded in respective ends.
Referring now to Figure 8j, there is shown a palm vein sensor 196 (which in this instance is a commercially available Fujitsu PalmSecure sensor) connected at its side with a USB cable 219. As is evident in Figures 2c and 4e, the sensor 196 is a component of the check-in palm vein scanner 191a. In this particular non-limiting embodiment, the sensor device 196 is generally rectangular cuboidal in shape, having dimensions of 35 mm width x 35 mm depth x 27 mm height. It uses 128-bit AES data encryption and has around a 0.01 per cent false recognition rate with one retry. The mean time between failures is around 1,000,000 hours. The sensor 196 has a capturing distance of between forty and sixty millimetres, and requires a 4.4 to 5.4 volt power supply.
Referring now to Figure 8k, there is shown a hand guide 197 (which in this instance is a commercially available Fujitsu PalmSecure U Guide). As is evident in Figures 2c and 4e, the guide 197 is a component of the check-in palm vein scanner 191a. The guide 197 connects on to the top of the sensor 196, acting as a spacer to correctly space a passenger's palm within the capturing distance from the sensor when the passenger places their palm on top of the guide. The guide 197 has a central aperture 198 to allow passage of light emitted from the sensor 196 to the passenger's hand.
Referring now to Figure 8l, there is shown the check-in palm vein scanner 191b (which in this instance is a commercially available Fujitsu PalmSecure ID Match device). The scanner 191b has a 10.9 cm diagonally measured capacitive touch display 198, an embedded ARM board, an RJ-45 Ethernet communications interface and a USB port, and two external and one internal secure access module slots. The scanner 191b includes a palm vein sensor, runs on the Linux operating system, and, along with palm veins, is adapted to scan smart cards, further improving identification accuracy of the device, with a false acceptance rate of around 0.00008%.
Figures 9a and 9b show the on-board server 26 from the outside, whilst Figure 9c shows the internal hardware componentry found within the server's housing 66. The exemplary on-board server 26 includes a processor 41 in the form of an Intel Core i7-6770HQ chip and an Intel Iris Pro graphics card with connection interfaces in the form of HDMI, mini display and display ports, as well as multiple USB ports and a Thunderbolt interface. The on-board server further includes Wi-Fi, Ethernet and Bluetooth wireless network interfaces, dual channel DDR4 random access memory modules, and a serial ATA bus.
Referring to Figures 10a-d, there is shown a camera 16 in the form of a CCTV video camera. The camera comprises a CMOS image sensor, has a 1920 x 1080 resolution and a frame rate of 25 frames per second, and supports various protocols over which data is sent, including HTTPS and RTSP.
Referring to Figures 11a-b, there is shown the tablet 29, which comprises an Exynos 7870 processor with a 1.6 GHz processing speed, 2 GB RAM and 16 GB ROM internal memory, up to 200 GB Micro SD external memory received by a micro SD slot 67, communications interfaces including Wi-Fi and Bluetooth network interfaces and a USB port, and front and rear cameras, 66a and 66b, with two megapixel and eight megapixel resolution respectively, the cameras being adapted for 1080p video recording at thirty frames per second. Further, the tablet comprises a touchscreen 74 and a power source in the form of a lithium ion battery, and runs an Android operating system. The tablet may also comprise an internal NFC reader.
Referring now to Figure 12, there is shown an external NFC reader 32 adapted for PC linkage via cord 69. The NFC reader 32 comprises an embedded contactless smartcard and NFC tag reader / writer 68 based on 13.56 MHz contactless RFID technology. The reader 32 is compliant with the ISO/IEC 18092 standard for near field communication, and supports MIFARE and ISO 14443 A and B cards and four types of NFC tags.
Referring now to Figures 13a-e, there is shown various mounts 70 adapted for mounting the tablet 29 with or without the external NFC reader 32. Figure 13a shows the mount 70a comprising a mounting panel 71 having a recess for embedding the tablet 29, and a pocket 72 for receiving the NFC reader 32. The mount 70a includes an interface 73 for receiving connections of, and interconnecting, the tablet 29 and NFC reader 32. Figure 13b shows a mount 70b comprising a panel 71 and a mounting arm 75 on which the panel is supported. The tablet 29 is enclosed within the panel 71, the touchscreen 74 of the tablet being accessible via an open window 76 defined in the top surface of the panel 71. Embedded in the top surface of the panel is a smartcard or tag reader / writer 68 which is connected with the tablet 29 within the housing of the panel. Figure 13c shows a mount 70c with a bracket 77 for receiving the tablet 29 and a mounting arm 75 supporting the bracket 77. Figure 13d shows a mount 70d having a panel housing 71 which encloses the tablet 29 but for an open window 76 through which the touchscreen of the tablet is accessible. The panel housing 71 is supported on a stand 78. Figure 13e shows a mount 70e having a mounting panel 71 supported on a mounting arm 75, the mounting panel being adapted to integrate either 10-inch or 12-inch screen tablets.
Referring to Figures 14a-c, there is shown exemplary console versions 27a-c having twenty-seven, twenty-four, and twenty-one-inch touchscreens 78 respectively. Each of the consoles 27 further includes an adjustable stand 79 for adjustably propping up and tilting the screen 78 towards a user. The consoles 27 include communication interfaces in the form of Wi-Fi and Bluetooth network interfaces. Additionally, consoles 27b and 27c include communication interfaces in the form of USB ports 80. Console 27b also includes an HDMI port and may include an internal NFC reader.
Referring to Figure 14d, there is shown a tablet 29 (which is commercially available as the Fujitsu Stylistic Q736) embodied differently to the tablet of Figures 11a and 11b. The tablet 29 of Figure 14d is optionally connected with a small transit palm vein scanner 192a. The scanner 192a includes a palm vein sensor 196, and a USB connector (not shown) by which it is removably connected to a USB port of the tablet 29. The tablet has a 33.8 cm display 199 capable of receiving touch and pen input, as well as 5 MP front and rear cameras 200. The tablet 29 has 4G/LTE, GPS, WLAN, Bluetooth and/or NFC communications interfaces, an Intel Core i7-6600U processor, memory in the form of SATA 256 GB solid state drives and 8 GB RAM, along with micro SD memory card slots. Further, the tablet 29 includes an internal power source in the form of a pair of batteries, and sensors associated with a 3-axis accelerometer, gyroscope, ambient light, magnetic field, and compass.
Referring now to Figure 14e, there is shown a case 201 in which a tablet 29 and transit palm vein scanner 192b are embedded, the connection between the palm vein scanner 192b and the tablet 29 being hidden by the case 201. The case maintains portability of the tablet 29 with scanner 192b and serves to protect the tablet 29 and scanner 192b from damage.
Referring to Figure 14f, there is shown another form of mount 70 for mounting a tablet 29 which is adapted for connection with a separate palm vein scanner 192c. The mount 70 has a mounting panel 71 supported on a mounting arm 75 which projects up from a base 210.
Figure 14g shows the palm vein scanner 192c, which comprises a palm vein sensing device 196 mounted on to an adjustable hand guide or bracket 197. The palm sensing device uses near-infrared light pattern capture, with a capture distance of 40-60 mm for initial enrolment and 35-70 mm for authentication. As shown in Figure 14h, the hand bracket 197 has a rectangular base portion 211 with a generally flat under surface for resting on a support surface beneath, a first adjustable guide portion 212 pivotally attached at its bottom end to the base portion 211, the first guide portion defining finger depressions 214 at its top end, and a second adjustable guide portion 213 also pivotally attached at its bottom end to the base portion, but at the opposite end thereof. Effectively, in use, the palm is placed flatly down on top of the bracket 197, so that the heel of the palm is placed on the second guide portion 213 and the fingers in the finger depressions 214 of the first guide portion 212. Partial folding of the guide portions 212 and 213 brings their top ends closer together, as may be required for placement of smaller hands, while maintaining the placed palm within a suitable distance range from the sensor as required for effective vein scanning.
Referring to Figure 14i, there is shown a wall mountable on-board palm vein scanner 192d (which in this case is a commercially available Intus 1600PC Palm Vein authenticator). The scanner 192d includes a wall mountable housing 215, a dome 216 which protects a palm vein sensor therebehind, and alpha-numerically labelled keys 217 for PIN entry. Once wall mounted, this scanner 192d requires the passenger to place their palm vertically, spaced 3-8 cm from and facing the sensor.
Referring to Figure 15, there is shown an exemplary low energy Bluetooth receiving / reading device in the form of BLE receiver 81. The BLE receiver comprises a housing 189 of 54 mm x 41 mm x 18 mm dimensions, with an antenna 190 projecting from the rear of the housing 189. The BLE receiver 81 further comprises a 5 V, 500 mA micro-USB port and a network interface enabling wireless software updates. The BLE receiver 81 is adapted to operate in temperatures between minus twenty and sixty degrees Celsius, and operates on low power consumption (80 mA typical working current). The BLE receiver 81 is adapted to read multiple BLE enabled wristbands 13b simultaneously.
Figure 16 illustrates use of a passenger tracking system 83. The tracking system 83 comprises multiple BLE receivers 81a-d placed in fixed positions around the vessel. The position of each BLE receiver is associated with a gateway receiver ID. A map of the vessel or vessel area is generated by the on-board server 26 accounting for or showing the BLE receivers.
In the exemplary diagram, the BLE receivers are placed in or allocated to respective zones 82a-d in the form of quadrants. When a passenger moves into the vicinity of a BLE receiver in the zone, such as when walking by it, a low energy Bluetooth signal 84 conveying unique passenger identification information is transmitted from the passenger's BLE enabled wristband 13b and received by the BLE receiver.
The passenger identification information, along with the BLE receiver's gateway ID, is then wirelessly transmitted 85 to and received by the on-board server 26, which information is worked on by a location tracking application installed on the on-board server 26. With this information, the on-board server 26, by virtue of reading and executing instructions of the location tracking application, can identify on the map the location or position of the passenger's wristband, and therefore the passenger, to within approximately ten metres.
Accuracy may be improved by increasing the number of receivers within a given area. The map showing passenger position can be viewed by crew on a console display 42. The crew may be able to interact with the map in order to obtain identification information relating to a selected passenger marker displayed on screen. Depending on the number of Bluetooth receivers provided, it may be possible to track hundreds or more passengers at once.
A security alert 33 may be signalled by the on-board server where, for example: passengers are located outside of their designated zones; the number of passengers within a zone exceeds a predetermined number, indicating overcrowding; babies or small children are separated by more than a predetermined distance from their designated family or carers; passenger position is lost or moves external to the sides of the vessel, indicating a potential passenger overboard situation; or a passenger signal remains static for an extended pre-determined period, suggesting wristband removal or a medical emergency.
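By way of non-limiting illustration only, the following Python sketch shows one possible shape for the gateway-based position update and two of the alert conditions described above. The gateway-to-zone mapping, the overcrowding limit, and the static-signal period are assumptions introduced purely for the sketch; actual values and rules would be configured by the operator.

```python
import time

# Hypothetical mapping of BLE gateway receiver IDs to named zones on the vessel.
GATEWAY_ZONES = {"gw-81a": "zone-82a", "gw-81b": "zone-82b",
                 "gw-81c": "zone-82c", "gw-81d": "zone-82d"}

OVERCROWDING_LIMIT = 50          # assumed per-zone passenger limit
STATIC_ALERT_SECONDS = 30 * 60   # assumed period after which a static signal raises an alert

positions: dict[str, dict] = {}  # passenger_id -> {"zone", "last_seen", "last_moved"}

def record_read(passenger_id: str, gateway_id: str, now: float | None = None) -> list[str]:
    """Update the passenger's mapped position from a gateway read and return any alerts."""
    now = now or time.time()
    zone = GATEWAY_ZONES.get(gateway_id, "unknown")
    alerts = []

    prev = positions.get(passenger_id)
    moved = prev is None or prev["zone"] != zone
    positions[passenger_id] = {
        "zone": zone,
        "last_seen": now,
        "last_moved": now if moved else prev["last_moved"],
    }

    # Overcrowding: too many passengers currently located in the same zone.
    occupants = sum(1 for p in positions.values() if p["zone"] == zone)
    if occupants > OVERCROWDING_LIMIT:
        alerts.append(f"overcrowding in {zone}")

    # Static signal: wristband has not changed zone for an extended period,
    # suggesting wristband removal or a medical emergency.
    if now - positions[passenger_id]["last_moved"] > STATIC_ALERT_SECONDS:
        alerts.append(f"static signal for {passenger_id} in {zone}")

    return alerts

# Example read received by the on-board server from gateway gw-81b.
print(record_read("P-001", "gw-81b"))
```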
Figure 17 provides an exemplary overview of passenger movements on tour, along with the various system technologies used in managing the passengers. In the example diagram, the passenger is shown to move from check-in, to boarding the tour vessel at a departure point, to disembarking the tour vessel and boarding a transfer vessel at a transfer point, to disembarking the transfer vessel for an activity, to re-boarding the transfer vessel after the activity, to disembarking the transfer vessel and re-boarding the tour vessel at the transfer point, to disembarking the tour vessel at a stopover, to re-boarding the tour vessel at the stopover, and finally to disembarking the tour vessel back at the point of departure. As the tour proceeds, system devices work to identify, count, surveil, and locate the passengers. Notably, the tablet devices are adapted to scan passenger wristbands or biometric features such as DNA, face, or palm veins, in offline mode, making it possible to continue to keep track of passengers during transfers and activities taking place off-board the main tour vessel.
After a passenger has booked their ticket through the vessel operator or a booking agency, the passenger checks in on the day of travel, off board the vessel, either via a staff assisted check-in terminal 15 (Fig. 2a), a self-check-in kiosk 14 (Figs. 2b and 2c), or a self-check-in tablet 190 (Fig. 17).
An exemplary self-check-in procedure 86 suitable for enrolling in or engaging on-board or transit wristband identification is shown in Figure 18. At step 87, the passenger 11 inputs personal and/or booking information, such as their booking reference number or family name, either manually via the kiosk keypad 50 or touchscreen 51, or by using the barcode scanner 52 to scan a barcode on their printed ticket or phone 34 displayed ticket. The kiosk 14 wirelessly transmits the inputted information to the booking server 19 in order to ascertain the booking details from the booking server 19 based on the inputted information. At step 88, the booking details are presented and displayed on the kiosk screen 51. At step 89, the passenger 11 uses the touchscreen 51 to select and confirm the passenger on the ticket to check in. At step 90, the kiosk 14 or other device processor 41 determines whether the selected passenger requires a Medical Self Risk Assessment form and/or a Medical Declaration form and/or an Assumption of Risk and Liability Waiver form, based on passenger or booking information provided, such as their medical history and the activities chosen or destination being visited whilst a passenger on the vessel 12.
If :yes " a medical or legal form is required, the kiosk or other device processor determines which of these is or are relevant and presents this on the touchscreen at step 91. T he Medical S elf R isk Assessment may be issued to all passengers, or all passengers booking in to low risk activities. The Assessment includes a questionnaire requiring the passenger to check off :yes ~ or :no" answers to a series of questions. F rom completion of this form, the tour operator may gain a basic understanding of the passenger s general health. Any failed medical question on the Assessment form prompts the system to require the additional completion of a Medical Declaration form. S uccessful completion of the Assessment form clears the passenger to undertake non-risk or low-risk activities on tour, such as snorkelling. Typically, Assumption of R isk and Liability Waiver forms are also to be completed by passengers undertaking any activities on tour, be they low risk or high risk activities. Medical Declarations are typically to be completed by passengers planning on undertaking high risk activities such as scuba diving or helmet diving. The Medical Declaration form provides a detailed questionnaire, requiring the passenger to check off or provide satisfactory
answers to presented questions and finger sign the document via the touchscreen 51 in order to proceed to step 92. Any failed answer to a question on the Declaration prompts an automatic response from the system. The response is based on local legal requirements and standard international industry practice, having been developed in collaboration with doctors who specialise in the industry of the relevant activity, e.g. diving. Additionally, the Medical Declaration form must later be counter signed by a qualified staff member having reviewed the form. Based on the information provided in the Declaration, the staff member determines whether total, partial, or no restriction on the activity is required for the passenger.
If :no"a medical or legal form is not required, or no further medical or legal forms are required, at step 92 a social media share screen is presented to the passenger, enabling them to share details of their trip, cruise, or tour on social media platforms such as Facebook. If the passenger selects yes " they would like to share trip details, at step 93 the kiosk or other linked device communicates with the server of the social media platform to bring this to fruition. If the passenger selects :no"they do not want to share, at step 94 the kiosk camera 16 takes a front profile photo of the face of the passenger 1 1 and transmits the image to the facial recognition server 21 for analysis via the facial recognition application 101. At step 95, the kiosk prints an NFC wristband 13a for collection by the passenger. The printed NFC wristband 13a is associated with a data template having unique indicators for identifying the passenger 1 1 and their tour booking. At step 96, the kiosk or other linked processor asks on screen 51 whether another passenger on the booking is to be processed. If the passenger inputs yes " then the process repeats again from step 89 for the next passenger. If :no" then the self-check-in procedure is complete at step 97.
An exemplary assisted check-in procedure 98 suitable for enrolling in or engaging wristband identification for the upcoming journey is shown in Figure 19. At step 99, the passenger 11 advises a staff operator 57 manning a terminal 15 of their booking details or family name. At step 110, the operator 57 inputs the passenger details into the terminal, and the terminal 15 wirelessly transmits the inputted information to the booking server 19 in order to ascertain the relevant booking details from the booking server 19. The booking details are then presented to the operator 57 on an operator facing touchscreen 125 (Fig. 2a) of the terminal 15. At step 111, the operator selects the correct passenger 11 if multiple passengers are found in the booking. At step 112, the booking details are presented on the passenger facing touchscreen 126 (Fig. 2a). At step 113, the passenger updates details if required, and then confirms accuracy of the booking details. At step 114, the terminal 15 or other device processor 41 determines whether the selected passenger requires a medical or legal form based on passenger or booking information provided. If 'yes', a medical or legal form is required, the kiosk or other device processor determines which of these is or are relevant and presents this on the touchscreen at step 115. The passenger 11 then checks off answers to the questions and, if necessary, finger signs the form via the touchscreen 126 in order to proceed to step 116. If 'no', a medical or legal form is not required, or no further medical or legal forms are required, at step 116 a social media share window is presented to the passenger, enabling them to share details of their trip, cruise, or tour on social media platforms. If the passenger selects 'yes', they would like to share trip details, at step 117 the terminal 15 or other linked device communicates with the server of the selected social media platform, thereby accessing the platform and sharing the tour details thereon. If the passenger selects 'no', they do not want to share, at step 118 the staff member 57 uses a camera 16 linked to the terminal 15 to take a head-shot of the passenger 11. At step 119, the image is transmitted from the camera 16 to the terminal 15 and then on to the facial recognition server 21 for analysis and identification by the facial recognition application 101. The transmitted image is also compared with images stored in the shared operator database 24 linked with the database server 23. At step 120, the facial recognition server 21 or a linked processor determines whether there is cause for an alert, such as where there is a mismatch between the passenger image and booking details, or the passenger is flagged for sea travel or on a watch list. If 'yes', there is cause for an alert, at step 121 a security alert 25 is transmitted to the relevant authorities. If 'no', there is no cause for alert, either an NFC wristband 13a or an NFC / BLE wristband is printed by a printer 17 linked to the terminal 15, which printer 17 is capable of embedding both NFC and BLE tags in the wristband. The crew member hands the personalised wristband to the passenger for fastening on their wrist. Typically, combined NFC / BLE wristbands will only be issued for passengers on larger vessels 12, such as cruise ships capable of holding many passengers and having large areas suitable for zoning. At step 123, the kiosk or other linked processor asks on screen 125 and/or 126 whether another passenger on the booking is to be processed. If the passenger or crew inputs 'yes' into their respective screen, then the process repeats again from step 111 for the next passenger. If 'no', then the assisted check-in procedure is complete at step 124.
When the wristband is issued by the printer, the kiosk 14 or terminal 15 where check-in took place transmits passenger details to the database server 23 for storage and subsequent transmission to the server 26 on board the vessel 12, the vessel being, at that time, docked at the port of origin and so within receiving range of internet Wi-Fi transmissions. The data template having unique personalised identification details of the wristband 13 is stored with other data templates having details from wristbands of other passengers in the passenger manifest 127 on the database server 23. The database server then push syncs the passenger manifest to the on-board server 26.
With the wristband fastened, the passenger 11 proceeds to the dock for boarding of the vessel 12. A crew member carrying a tablet 29 is stationed at the boarding entrance to the vessel 12. As shown in Figure 20, a boarding counting process 137 for counting passengers by their wristbands 13 is initiated for passenger boarding. In this regard, at step 138, the tablet 29 receives an input selection from the crew member via the touchscreen 74 relating to which entry / exit station the crew member is positioned at and whether the passenger is entering or exiting the vessel. At step 139, the passenger taps the NFC tag 64 of their wristband 13 on the NFC reader enabled tablet 29, or at least waves the NFC tag in close proximity (within a few centimetres) to the tablet 29. In doing this, the data template / unique identifier data transmitted wirelessly from the NFC tag 64 is received by the tablet's integrated NFC reader. At step 140, passenger details associated with the unique identification data / data template (transmitted from the scanned wristband) are attempted to be retrieved by the tablet. At step 141, the tablet queries whether passenger data is found relating to the scanned wristband. If 'no', no data is found, at step 142 an error is displayed on the tablet screen, thereby notifying the crew member. If 'yes', passenger data relating to the scanned wristband is found for this cruise, the passenger / tag details are displayed at step 143. At this time, a camera 16 located at the point of entry captures a picture of the boarding passenger's face, and this image is compared with the image taken at check-in of the passenger associated with the retrieved NFC tag details; if a discrepancy is found, an alert notification is triggered. At step 144, the passenger or crew confirm by touchscreen selection that the tag details are correct in respect of the passenger attempting to board. At step 145, the direction of passenger movement (on to the ship) is logged and their status is updated as being on board. At this time, information regarding passenger movement and status is wirelessly synced from the tablet to the on-board server over the on-board Wi-Fi network. This results in the on-board server updating a central missing passenger count as deduced from the passenger manifest, thereby reducing the missing passenger count by one. All tablets are in turn synced with the on-board server, thereby keeping the missing passenger counts on the tablets up to date. At step 146, the counting process is completed for that particular passenger, now boarded. The counting process is repeated for all passengers boarding the vessel, with the boarding process being completed when the missing passenger count equals zero.
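By way of non-limiting illustration only, the tap-on and missing-count bookkeeping described above could be sketched in Python as follows. The manifest dictionary, the tag identifiers, and the station name are assumptions for the purpose of the sketch; the actual manifest would be synced from the database server as described.

```python
# Hypothetical local copy of the passenger manifest held on the tablet, keyed by wristband tag ID.
manifest = {
    "tag-1001": {"name": "A. Passenger", "status": "off-board"},
    "tag-1002": {"name": "B. Passenger", "status": "off-board"},
}

def missing_count() -> int:
    """Passengers on the manifest not yet logged as on board."""
    return sum(1 for p in manifest.values() if p["status"] != "on-board")

def tap_on(tag_id: str, station: str) -> str:
    """Handle an NFC tap at a boarding station and update the counts."""
    passenger = manifest.get(tag_id)
    if passenger is None:
        return "ERROR: no passenger data found for scanned wristband"
    passenger["status"] = "on-board"
    passenger["boarded_at"] = station
    # In the described system the tablet would now sync this change to the
    # on-board server over Wi-Fi, which in turn syncs the other tablets.
    return f"{passenger['name']} boarded at {station}; missing count = {missing_count()}"

print(tap_on("tag-1001", "forward gangway"))
print(tap_on("tag-9999", "forward gangway"))
```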
Figure 21 shows an activity counting-off process 147 for counting passengers by their wristbands, as may be initiated for each passenger 11 disembarking the vessel 12 for a scheduled activity or stopover. A crew member holding a tablet 29 is stationed at the vessel exit. At step 148, the tablet 29 receives an input selection from the crew member via the touchscreen 74, the input selection relating to which entry / exit station the crew member is positioned at, whether the passenger is entering or exiting the vessel, and which activity the passenger is disembarking the vessel for. At step 149, the passenger positions the NFC tag 64 of their wristband 13 in close proximity with the NFC reader enabled tablet 29 such that the tablet scans the tag. At step 150, passenger or tag details associated with the scanned tag are retrieved by the tablet. At step 151, the tablet determines whether the passenger associated with the scanned tag is registered for the activity entered by the crew member during step 148. If it is determined 'no', the passenger is not registered for the activity, an error message is presented on the tablet screen, thereby notifying the crew member. If 'yes', the passenger is registered for the activity, at step 153 the tablet or other linked processor determines which medical or legal form is required for the activity, if any, and checks whether or not it has been completed by the passenger. If 'no', the necessary medical or legal form has not been completed, then an error message is displayed at step 152. If 'yes', the requisite medical and legal forms have been completed, at step 155 the passenger details relating to the scanned wristband are displayed on the tablet screen. Once the crew member signs an electronic verification form on the tablet screen, at step 156 the direction of passenger movement (off the ship) is logged and their status is updated as being off-board. At this time, information regarding passenger movement, status, and activity is wirelessly synced from the tablet to the on-board server over the on-board Wi-Fi network. This results in the on-board server updating a central missing passenger activity count as deduced from the activity manifest 136, thereby increasing the missing passenger activity count by one. All tablets are in turn synced with the on-board server, thereby keeping the missing passenger activity counts on the tablets up to date. At step 157, the activity counting process is completed for the particular passenger having disembarked. The activity counting-off process is repeated for all passengers disembarking the vessel for their registered activity.
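By way of non-limiting illustration only, the registration and form-completion checks applied before a passenger is counted off for an activity could be sketched as follows. The manifest records, activity names, and form names are assumptions for the purpose of the sketch.

```python
# Hypothetical activity manifest records held on the tablet, keyed by wristband tag ID.
activity_manifest = {
    "tag-1001": {"name": "A. Passenger",
                 "activities": {"scuba diving"},
                 "completed_forms": {"Medical Self Risk Assessment",
                                     "Assumption of Risk and Liability Waiver",
                                     "Medical Declaration"},
                 "status": "on-board"},
}

REQUIRED_FORMS = {
    "scuba diving": {"Medical Self Risk Assessment",
                     "Assumption of Risk and Liability Waiver",
                     "Medical Declaration"},
    "snorkelling": {"Medical Self Risk Assessment",
                    "Assumption of Risk and Liability Waiver"},
}

def tap_off_for_activity(tag_id: str, activity: str) -> str:
    """Check activity registration and completed forms, then log the passenger off-board."""
    passenger = activity_manifest.get(tag_id)
    if passenger is None:
        return "ERROR: no passenger data found"
    if activity not in passenger["activities"]:
        return "ERROR: passenger not registered for this activity"
    missing = REQUIRED_FORMS.get(activity, set()) - passenger["completed_forms"]
    if missing:
        return f"ERROR: outstanding forms: {', '.join(sorted(missing))}"
    passenger["status"] = f"off-board ({activity})"
    return f"{passenger['name']} logged off-board for {activity}"

print(tap_off_for_activity("tag-1001", "scuba diving"))
```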
For passengers re-boarding after completion of their activity, an activity counting-on process is initiated. The crew member stations themselves at the vessel entry point with tablet in hand, and selects their present location (which may or may not be a different location from where the passengers disembarked for their activity). The passengers each 'tap on' with their wristband, and once the missing passenger activity count is reduced to zero, the activity manifest 136 is recalled. Prior to recall of the activity manifest, the vessel is disabled so as to prevent it from leaving the location until all passengers are accounted for. If a passenger is missing, their personal details and photograph are accessible by the crew. Once the activity manifest is recalled, the crew member signs the electronic form presented by the tablet indicating that all passengers are accounted for on board the vessel. This digital form is synced to the on-board server over the on-board Wi-Fi network, and in turn synced to the database server when within range.
Figure 22 illustrates an exemplary video surveillance alert process 158. At step 159, the next frame from the video surveillance camera 30 feed is transmitted to and received by the on-board server 26. At step 160, the on-board server or other linked processor analyses the frame and determines whether or not the number of people in the frame is greater than a predetermined value 'n'. If 'yes', the number of people in the frame is greater than n, then a crowd alert is generated, which alert may be in the form of, for instance, a notification presented on passenger phones 34 or tablets 29, a vibration of vibrating means on passenger wristbands within the zone, an alert notification presented on crew consoles 27, an alarm sounded on speakers within the zone, etc. If 'no', the number of people is not greater than n, or if the crowd alert has been generated, at step 162 the frame image is scanned to detect faces of passengers captured in the image. If 'no' faces are detected at step 163, the next frame image is taken and the process repeats from step 159. If 'yes', a face is detected at step 163, at step 164 the face is compared with faces of boarded passengers stored in the on-board database 135 in order to identify the passenger. If, at step 165, 'no', the passenger is not able to be identified, then an unknown passenger alert is transmitted by the on-board server to predesignated linked devices on board the vessel, and subsequently the process 158 is completed at step 175. Otherwise, if at step 165 'yes', the passenger is able to be identified, at step 167 the on-board server 26 or an external linked processor 41 calculates the total number of visits the passenger has made to the surveilled area. If it is determined that 'yes', the number of visits is greater than a predetermined number within a predetermined period of time or days, at step 168 a frequent visit alert is transmitted by the on-board server to predesignated linked devices. If 'no', the number of visits is less than the predetermined number within the predetermined period, then at step 169 the on-board server checks whether two facially recognised passengers in a particular zone match two passengers in a rendezvous alert gallery stored in the on-board database 135. If 'yes', there is a match, then at step 170 a rendezvous alert is issued. If 'no', at step 171 the on-board server checks whether a facially recognised passenger from the presently captured frame matches a facially recognised passenger from a previously captured frame with the passenger located in a different zone 82. If 'yes', there is a match, then a transit alert is generated at step 172. If 'no', there is not a match, then at step 173 the on-board server determines whether the facially recognised passenger from the captured frame is not in close proximity to, or is in a different zone to, a predesignated accompanying passenger recorded in the on-board database 135. If 'yes', the passenger is not in close proximity to, or is in a different zone to, the accompanying passenger, an unattended alert is generated at step 174. If 'no', the person is not unaccompanied (i.e. the person is being accompanied by their accompanying passenger), the video surveillance alert process is completed at step 175.
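By way of non-limiting illustration only, the per-frame decision chain described above could be sketched in Python as follows, with face detection and recognition stubbed out: each frame is represented simply by the passenger identifiers the facial recognition application reported for it (None standing in for an unrecognised face). The thresholds, zone names, and the rendezvous gallery are assumptions for the purpose of the sketch; the unattended-child check is omitted for brevity.

```python
CROWD_LIMIT_N = 20          # assumed predetermined value 'n' for overcrowding
FREQUENT_VISIT_LIMIT = 5    # assumed visit threshold within the monitored period
rendezvous_gallery = {frozenset({"P-007", "P-013"})}   # hypothetical watched pair

visit_counts: dict[str, int] = {}
last_zone: dict[str, str] = {}

def analyse_frame(recognised_ids: list, people_in_frame: int, zone: str) -> list[str]:
    """Return the alerts arising from a single analysed surveillance frame."""
    alerts = []
    if people_in_frame > CROWD_LIMIT_N:
        alerts.append(f"crowd alert in {zone}")

    known = [pid for pid in recognised_ids if pid is not None]
    if len(known) < len(recognised_ids):
        alerts.append("unknown passenger alert")

    for pid in known:
        visit_counts[pid] = visit_counts.get(pid, 0) + 1
        if visit_counts[pid] > FREQUENT_VISIT_LIMIT:
            alerts.append(f"frequent visit alert for {pid}")
        if last_zone.get(pid) not in (None, zone):
            alerts.append(f"transit alert for {pid}")
        last_zone[pid] = zone

    # Rendezvous: two watched passengers recognised together in the same zone.
    if any(pair <= set(known) for pair in rendezvous_gallery):
        alerts.append("rendezvous alert")
    return alerts

print(analyse_frame(["P-007", "P-013", None], people_in_frame=3, zone="zone-82a"))
```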
Fig. 23 illustrates an exemplary location tracking alert process 176. At step 177, a BLE enabled wristband 13b is issued to the passenger during an assisted check-in, and the passenger fastens the wristband on their wrist. At step 178, on board the vessel, a BLE transmission 84 sent from the wristband 13b is received by a BLE receiver 81 when in range. At step 179, a processor 41 associated with the BLE receiving device 81 determines whether a transmission 84 from the wristband 13b has already been received by the receiver 81 within a predesignated time period or window. Only the first signal 84 in the predesignated time period is registered, accepted or updated. Therefore, the passenger / BLE tag location will only be updated every 'x' predesignated or configurable number of seconds. If the transmission from the BLE tag / wristband / passenger has been read in the last x number of seconds, within the predesignated time period, then at step 180 the current read is ignored and the process ends at step 184. If the transmission from the BLE tag / wristband / passenger has not been received in the last x number of seconds, within the predesignated time period, then at step 181 the time and location of the passenger are updated. At step 182, the processor 41 determines whether the current location of the passenger is in breach of location restrictions, either for that particular passenger or for all passengers generally. If 'yes', the passenger is in a restricted location, then a location alert is generated at step 183. If 'no', then the process ends at step 184.
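By way of non-limiting illustration only, the per-read debounce window and restricted-location check described above could be sketched as follows. The window length, the restricted zone name, and the identifiers are assumptions for the purpose of the sketch.

```python
import time

UPDATE_WINDOW_SECONDS = 10            # the configurable 'x' seconds in the described process
RESTRICTED_ZONES = {"engine room"}    # hypothetical restricted area

last_update: dict[str, float] = {}

def handle_ble_read(passenger_id: str, zone: str, now: float | None = None) -> str:
    """Register a BLE read only once per update window and check location restrictions."""
    now = now or time.time()
    if now - last_update.get(passenger_id, 0.0) < UPDATE_WINDOW_SECONDS:
        return "ignored (already read within window)"
    last_update[passenger_id] = now
    if zone in RESTRICTED_ZONES:
        return f"location alert: {passenger_id} in restricted zone {zone}"
    return f"location updated: {passenger_id} in {zone}"

print(handle_ble_read("P-001", "zone-82a"))
print(handle_ble_read("P-001", "zone-82a"))   # second read inside the window is ignored
```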
Figures 24a-d illustrate the process of scanning the palm veins of a passenger's hand 218. In Figure 24a, the passenger 11 places their hand 218 at a suitable distance, such as forty to sixty millimetres, from and facing the palm vein sensor 196. Typically, the sensor 196 will be incorporated into a palm vein scanner 191 which also includes a hand guide 197 on which the user's palm can be placed for suitable capture distancing (as shown in Figure 2c). With the hand 218 correctly placed, the sensor 196 emits near-infrared light towards the hand 218, as shown in Figure 24b. Much of the infrared light passes through the skin layers of the hand before being absorbed by the oxygen depleted blood coursing through palm veins on its way back to the heart. As shown in Figure 24c, absorption by the palm venous blood is recorded by a wide angular infrared camera integrated with the sensor, generating a raw image of the passenger's palm veins 220, along with a less distinct outline of the palm. The raw image of the palm veins 220 is then encrypted, in this embodiment by the sensor device using the AES method. The encrypted data is then transmitted to the database server 23, on which a palm vein template application converts and compresses the encrypted raw image to a data template with a size of around 1 KB. The compressed template is again encrypted using AES before being stored in a palm vein database 221, shown in Figure 24d, linked with the database server 23. The palm vein database 221 is synced with an on-board database 135 linked with the on-board server 26.
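By way of non-limiting illustration only, the conversion of a raw scan into a small encrypted template could be sketched as follows. This simplified sketch assumes the third-party "cryptography" package and uses zlib compression and Fernet (an AES-based scheme) purely as stand-ins for the proprietary template conversion and the AES encryption described; the raw image, key handling, and the single-pass pipeline are likewise assumptions.

```python
import zlib
from cryptography.fernet import Fernet  # stand-in for the AES encryption described

key = Fernet.generate_key()
cipher = Fernet(key)

def build_template(raw_palm_image: bytes) -> bytes:
    """Convert a raw palm vein image into a small encrypted data template.

    zlib compression stands in for the conversion that reduces the image to a
    template of around 1 KB; Fernet stands in for the AES encryption applied
    before storage in the palm vein database.
    """
    compressed = zlib.compress(raw_palm_image, 9)
    return cipher.encrypt(compressed)

palm_vein_db: dict[str, bytes] = {}   # hypothetical palm vein database keyed by passenger ID

raw_image = bytes(256 * 256)          # placeholder for the sensor's raw infrared image
palm_vein_db["P-001"] = build_template(raw_image)
print(len(palm_vein_db["P-001"]), "byte template stored")
```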
An exemplary check-in procedure 222 suitable for enrolling in or engaging palm vein identification for the upcoming journey is shown in Figure 25. At step 223, the passenger 11 is prompted to place their palm on or over a palm vein scanner 191. At step 224, the passenger's palm is scanned and a palm vein image 220 is generated. At step 225, the generated palm vein image is transmitted by the check-in device, 14 or 190, linked with the scanner 191 to the database server 23 for conversion into template form and comparison with stored templates in the linked palm vein database 221. If the generated template matches a stored template, which is likely to be the case for a returning passenger, then the passenger's details associated with the stored template are displayed on the check-in device, 14 or 190, at step 226. The passenger is then asked to confirm or update their displayed details at step 227 before continuing on with the remainder of the check-in procedure at step 228. If, however, the generated template does not match any of the stored templates, an enrolment process is initiated at step 229. Then, at step 230, the passenger's palm is scanned twice, a palm vein image is generated, and a template based on the image is created at step 231. At step 232, the passenger's palm is once again scanned. At step 233, a template based on the palm vein image generated from the further scan at step 232 is compared with the template created at step 231 for authentication purposes. At step 234, if the further template matches the earlier template, then the passenger is able to continue with the remainder of the check-in process at step 228. If, however, at step 234 the further template generated at step 232 does not match the template created at step 231, the template creation procedure is restarted at step 235, and the passenger will be required to repeat steps 230 to 234 until their created palm vein template is authenticated, at which time they can move on to the remainder of the check-in procedure at step 228.
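By way of non-limiting illustration only, the enrolment loop described above, in which a freshly created template is authenticated against a further scan and the procedure restarts on a mismatch, could be sketched as follows. The scan and template functions are placeholders (a hash stands in for the vendor's template algorithm) and are assumptions for the purpose of the sketch.

```python
import hashlib

def scan_palm() -> bytes:
    """Placeholder for a palm vein scan; returns a raw image from the sensor."""
    return bytes(256 * 256)

def make_template(raw_image: bytes) -> str:
    """Placeholder template creation; a hash stands in for the vendor algorithm."""
    return hashlib.sha256(raw_image).hexdigest()

def enrol() -> str:
    """Create a template and authenticate it against a further scan, restarting on mismatch."""
    while True:
        candidate = make_template(scan_palm())      # template creation (steps 230-231)
        confirmation = make_template(scan_palm())   # further scan (step 232)
        if confirmation == candidate:               # authentication (steps 233-234)
            return candidate
        # Mismatch: restart template creation (step 235) and repeat.

stored_templates = {"P-001": enrol()}
print("enrolled template:", stored_templates["P-001"][:16], "...")
```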
An exemplary procedure 236 for counting passengers 11 boarding the vessel 12 using palm vein scanning technology is illustrated in Figure 26. At step 237, the passenger 11 who is boarding the vessel places their palm on or at a suitable capturing distance from a palm vein scanner 192 located in the proximity of the boarding gate. The scanner 192 may take any of various forms, including a small scanner 192a attached to a tablet 29 (Figure 14d) held by a member of staff 57 at the boarding gate, or a small scanner 192b connected with a tablet 29 embedded in a case 201 (Figure 14e) and held by a staff member 57 at the boarding gate, or a palm vein scanner 192c having an adjustable bracket 197 for hand placement and being connected by cable with a tablet 29 on a mount 70 (Figure 14f) at the boarding gate, or a scanner 192d mounted on a wall (Figure 14i) near the boarding gate. Once the palm is correctly positioned with respect to the scanner, the passenger's palm is scanned at step 238. At step 239, a palm vein image is generated for the passenger and a template is created, either by the scanner or remotely. At step 240, the generated palm vein template is either compared with existing templates stored in the memory of the scanner itself or the connected tablet, or transmitted remotely to the database server for comparison with existing templates in the palm vein database. If a match is not found during the comparison, at step 241 a notification appears on the scanner or connected tablet display indicating that the passenger is unidentified. The passenger is then directed to re-place their palm on or at an appropriate distance from the scanner in order to repeat steps 237 to 240. If step 241 is re-engaged, it may be appropriate to send a notification to authorities that an unidentified passenger is attempting to board the vessel. If, however, at step 246, a match is found in the comparison (step 240) between the generated palm vein template and the existing templates, then the passenger is identified at step 242, such as by their name being displayed on the scanner or tablet screen. At step 243, the passenger-on-board count is increased by one in the passenger manifest. At step 244, the passenger is then allowed to board the vessel. An exemplary procedure 245 for counting passengers 11 off the vessel 12 using palm vein scanning technology is illustrated in Figure 27. The steps of the counting-off procedure 245 are the same as the steps of the counting-on procedure 236, aside from step 243, which is replaced by step 246, and step 244, which is replaced by step 247. In this case, rather than the passenger-on-board count being increased in the passenger manifest once the passenger is identified following their palm vein scan, the passenger-on-board count is instead reduced by one in the passenger manifest at step 246. The passenger is then allowed to depart the vessel at step 247.
Referring to Figure 28, there is shown a process 248 for enabling passenger 11 or staff 57 access to one or more of the suite of executable applications. At step 249, the user, 11 or 57, selects the application that they wish to access on an on-board tablet 29 or console 27 connected with a palm vein scanner 192. At step 237, the user places their palm on or at a capturing distance from the palm vein scanner 192. At step 238, the user's palm is scanned, and at step 239 an image of the user's palm is generated and a template created. At step 240, the system checks whether the generated template matches a stored template, and if not, at step 240 an unidentified user notification is displayed on the screen of the scanner, tablet, or console, and the user is required to repeat the process from step 237 to 240 if they still wish to access the selected application. If, at step 240, a match is found for the palm vein image generated at step 239, then details or formatting relating to the person identified by the matched template are presented within the selected application on the tablet or console. At step 251, the user is then free to use the selected application on the tablet or console.
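By way of non-limiting illustration only, gating access to an on-board application by a palm vein match could be sketched as follows. The stored-template lookup, user identifiers, and application names are assumptions for the purpose of the sketch.

```python
# Hypothetical store of enrolled templates keyed by passenger / crew ID.
enrolled = {"P-001": "template-abc", "C-042": "template-xyz"}

def identify(template: str) -> str | None:
    """Return the passenger or crew ID whose stored template matches, if any."""
    for person_id, stored_template in enrolled.items():
        if stored_template == template:
            return person_id
    return None

def open_application(selected_app: str, scanned_template: str) -> str:
    """Open the selected application personalised for the identified user, or refuse."""
    person_id = identify(scanned_template)
    if person_id is None:
        return "unidentified user - please rescan"
    # Details or formatting for the identified user are loaded into the application.
    return f"{selected_app} opened for {person_id}"

print(open_application("photo management", "template-abc"))
print(open_application("photo management", "template-unknown"))
```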
It is to be understood that a combination of the aforementioned means or methods may be used in accordance with the invention in order to identify and/or manage passengers of a vehicle. Therefore, a palm vein scanner may be used to identify/manage some or all passengers, while a reading device (e.g. NFC reader, BLE reader) adapted for reading unique passenger identification data transmitted by a passenger-disposed identification device (e.g. NFC wristband, BLE wristband, mobile phone) may be used to identify/manage some or all of the passengers as well.
Using two or more types of modalities on the same passengers in order to identify/manage those passengers is advantageous in providing a back-up where one modality fails, is lost, or becomes inaccurate under certain conditions. Using two or more types of modalities may also improve accuracy by cross-checking passenger readings from both modalities. Further, some modalities may be more suitable in some aspects/conditions/environments of the passenger's journey, while other modalities may be more suitable in other aspects/conditions/environments of the same passenger's journey.
Using different types of modalities on different passengers in order to identify/manage those passengers is also advantageous, as one passenger may, for instance, refuse to have their biometric characteristics (e.g. palm veins, face, DNA) read, in which case a non-biometric modality (e.g. NFC wristband) may be required for them, while the remaining passengers have their biometric characteristics read. Additionally, one type of modality may not work effectively with some passengers, e.g. palm vein scanning of double-hand amputees, in which case another type of modality could be utilised for identification/management of that particular passenger while the remaining passengers use palm vein scanning.
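Purely as an illustrative sketch, one possible policy for combining two modalities is shown below: cross-check when both readings are available, fall back to whichever single modality succeeded, and otherwise treat the passenger as unidentified. The policy and the function names are assumptions for the example, not a definitive arrangement of the invention.

```python
def identify(biometric_match, device_match):
    # Cross-check when both modalities return a result; otherwise fall back to
    # whichever single modality succeeded; otherwise report as unidentified.
    if biometric_match and device_match:
        if biometric_match == device_match:
            return biometric_match, "confirmed by both modalities"
        return None, "modalities disagree - refer to staff"
    if biometric_match:
        return biometric_match, "identified by biometric reading only"
    if device_match:
        return device_match, "identified by identification device only"
    return None, "unidentified"

print(identify("A. Passenger", "A. Passenger"))  # both modalities agree
print(identify(None, "B. Passenger"))            # biometric declined or unavailable
```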
INTERPRETATION
In the context of this document, the term "bus" and its derivatives should be construed broadly as a system for communicating data.
As described herein, a method involving implementation of one or more steps by computing devices should not necessarily be inferred as being performed by a single computing device; the one or more steps of the method may be performed by more than one cooperating computing device.
Objects such as "web server", "server", "computing device", "computer readable medium" and the like should not necessarily be construed as being a single object, and may be implemented as two or more objects in cooperation, such as, for example, a web server being construed as two or more web servers in a server farm cooperating to achieve a desired goal, or a computer readable medium being distributed in a composite manner, such as program code being provided on a compact disk activatable by a license key downloadable from a computer network.
In the context of this document, the term "database" and its derivatives may be used to describe a single database, a set of databases, a system of databases or the like. The system of databases may comprise a set of databases wherein the set of databases may be stored on a single implementation or span across multiple implementations. The term "database" is also not limited to a certain database format, but rather may refer to any database format. For example, database formats may include MySQL, MySQLi, XML or the like.
The invention may be embodied using devices conforming to other network standards and for other applications, including, for example, other WLAN standards and other wireless standards. Applications that can be accommodated include IEEE 802.11 wireless LANs and links, and wireless Ethernet.
In the context of this document, the term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. In the context of this document, the term "wired" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a solid medium. The term does not imply that the associated devices are coupled by electrically conductive wires.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "calculating", "determining", "analysing" or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.
In a similar manner, the term "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. A "computer" or a "computing device" or a "computing machine" or a "computing platform" may include one or more processors.
One or more processors may operate as a standalone device or may be connected, e.g., networked, to other processor(s). In a networked deployment, the one or more processors may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer or distributed network environment. The one or more processors may form a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
Note that while a Figure may only show a single processor and a single memory that carries the computer-readable code, those in the art will understand that many of the components described above are included, but not explicitly shown or described, in order not to obscure the inventive aspect. For example, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
It will be understood that steps of methods discussed may be performed by an appropriate processor (or processors) of a processing (i.e., computer) system executing instructions (computer-readable code) stored in storage. It will also be understood that the invention is not limited to any particular implementation or programming technique and that the invention
may be implemented using any appropriate techniques for implementing the functionality described herein. The invention is not limited to any particular programming language or operating system.
Some elements of methods described herein may be implemented by a processor or a processor device, computer system, or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method.
It is to be noticed that the term "connected", when used in the claims, should not be interpreted as being limitative to direct connections only. Thus, the scope of the expression "a device A connected to a device B" should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B, which may be a path including other devices or means. "Connected" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
Particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments/arrangements.
It should be appreciated that in the above description of example embodiments/arrangements of the invention, various features of the invention are sometimes grouped together in a single embodiment/arrangement, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment/arrangement. Thus, the claims following the Detailed
Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment/arrangement of this invention.
Furthermore, while some embodiments/arrangements described herein may include some but not other features included in other embodiments/arrangements, combinations of features of different embodiments/arrangements are meant to be within the scope of the invention, and form different embodiments/arrangements, as would be understood by those in the art. For example, in the following claims, any of the claimed embodiments/arrangements can be used in any combination.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
In describing the preferred embodiment of the invention illustrated in the drawings, specific terminology will be resorted to for the sake of clarity. However, the invention is not intended to be limited to the specific terms so selected, and it is to be understood that each specific term includes all technical equivalents which operate in a similar manner to accomplish a similar technical purpose. Terms such as "forward", "rearward", "radially", "peripherally", "upwardly", "downwardly", and the like are used as words of convenience to provide reference points and are not to be construed as limiting terms.
Unless otherwise specified, the use herein of the ordinal adjectives "first", "second", "third", etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
In the claims which follow and in the preceding description of the invention, except where the context requires otherwise due to express language or necessary implication, the word "comprise" or variations such as "comprises" or "comprising" are used in an inclusive sense, i.e. to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention.
Any one of the terms "including", "which includes" or "that includes" as used herein is also an open term that also means "including at least" the elements/features that follow the term, but not excluding others. Thus, "including" is synonymous with and means "comprising".
While suitable arrangements of the invention have been described, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as fall within the scope of the invention. Functionality may be added to or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added to or deleted from the methods described, within the scope of the present invention.
Although the invention has been described with reference to specific examples, it will be appreciated by those skilled in the art that the invention may be embodied in many other forms.
Claims
1. A system for managing passengers of a vehicle, the system comprising: at least one biometric reader configured to read a biometric characteristic of a passenger, create a data template based on the reading, and convey the data template directly or indirectly to a server;
at least one non-biometric reader configured to read a transmitted non-biometric characteristic comprising a data template associated with the, or another, passenger, and convey the data template directly or indirectly to the server; and
the server configured to receive each data template, access a database of stored data templates, each stored template being associated with unique passenger identification information, compare each data template with the stored data templates, match each data template with a stored data template; and identify the, or each, passenger based on the unique passenger identification information associated with each matched stored template.
2. The system according to claim 1, wherein the server is configured to associate unique passenger identification information with the created data template; and store the data template in a database of templates for comparison on future readings.
3. The system according to claim 1 or claim 2, wherein:
the at least one biometric reader comprises a palm vein scanner configured to scan palm veins of a passenger and create the data template based on the scan.
4. The system according to any one of the preceding claims, comprising: an identification device disposed on or in relation to the, or another, passenger, the identification device being configured to transmit the data template associated with the, or another, passenger.
5. The system according to any one of the preceding claims, wherein:
the at least one biometric reader comprises a camera configured to capture an image which includes the face of a passenger and create a data template based on the image.
6. The system according to any one of the preceding claims, comprising: an on-vehicle terminal linked with a biometric and/or non-biometric reader and running an on-vehicle application, wherein passenger access to personalised content using the on-vehicle application is enabled once the passenger has been identified.
7. The system according to any one of the preceding claims, wherein:
the server is configured to identify a zone or region of the vehicle in which the passenger was identified.
8. The system according to any one of the preceding claims, wherein:
the server is configured to increase or reduce a passenger count according to whether biometric or non-biometric characteristics of the identified passenger are read in the process of entering or exiting the vehicle.
9. The system according to any one of the preceding claims, wherein:
when the passenger is identified as they are boarding the vehicle, the server is configured to update a database to indicate that the passenger is boarding, or is aboard, the vehicle; and,
when the passenger is identified as they are disembarking the vehicle, the server is configured to update a database to indicate that the passenger is disembarking, or has disembarked, the vehicle.
10. The system according to any one of the preceding claims, wherein: the server is configured to verify that the identified passenger is on a passenger manifest for the vehicle.
11. A method of managing passengers of a vehicle, comprising:
reading at least one biometric characteristic of a passenger, and creating a data template based on the read biometric characteristic;
reading at least one non-biometric characteristic relating to the, or another, passenger, wherein the at least one non-biometric characteristic comprises a transmitted data template relating to the, or another, passenger; accessing a database of stored data templates, each stored data template being associated with unique passenger identification information; comparing each read data template with the stored data templates; matching each read data template with a stored data template;
identifying the, or each, passenger based on the unique passenger identification information associated with each matched stored template.
12. The method according to claim 11, comprising:
reading at least one biometric characteristic of the passenger, and creating a data template based on the read biometric characteristic;
creating a data template of at least one non-biometric characteristic relating to the, or another, passenger;
associating unique passenger identification information with each data template; and
storing each data template in a database of templates for comparison on future readings.
13. The method according to claim 11 or claim 12, wherein reading at least one biometric characteristic comprises scanning the palm veins of the passenger.
14. The method according to any one of claims 11 to 13, wherein reading at least one non-biometric characteristic comprises electronically receiving a
transmission of a data template from a device disposed on or in relation to a passenger.
15. The method according to any one of claims 11 to 14, wherein reading the at least one biometric characteristic comprises capturing an image which includes the face of a passenger.
16. The method according to any one of claims 11 to 15, comprising:
increasing or reducing a passenger count according to whether the, or each, passenger is entering or exiting the vehicle when identified.
17. The method according to any one of claims 11 to 16, comprising:
enabling the, or each, identified passenger to access personalised content on an electronic device or terminal.
18. The method according to any one of claims 11 to 17, comprising:
presenting content on an electronic device or terminal to the, or each, identified passenger.
19. The method according to any one of claims 11 to 18, wherein:
reading of at least one biometric or non-biometric characteristic occurs at least as the, or each, passenger is boarding the vehicle, and once the, or each, passenger is identified, updating a database to indicate that the, or each, passenger is boarding or is aboard the vehicle; and, wherein:
reading of at least one biometric or non-biometric characteristic occurs at least as the, or each, passenger is disembarking the vehicle, and once the, or each, passenger is identified, updating a database to indicate that the, or each, passenger is disembarking, or has disembarked, the vehicle.
20. The method according to any one of claims 11 to 19, comprising:
identifying a zone or region of the vehicle in which the, or each, identified passenger is located.
21. The method according to any one of claims 11 to 20, comprising:
verifying that the, or each, identified passenger is on a passenger manifest for the vehicle.
22. The method according to any one of claims 11 to 21, wherein the reading of at least one non-biometric characteristic relates to another passenger.
23. The method according to any one of claims 11 to 22, wherein the reading of at least one non-biometric characteristic relates to the passenger.
24. A system for managing passengers of a vehicle, comprising:
at least one palm vein scanner configured to scan palm veins of a passenger at least as they board or disembark the vehicle, create a data template based on the scan, and convey the data template directly or indirectly to a server; and
the server configured to compare the created data template with a database of stored templates associated with unique passenger identification information, match the created data template with one of the stored data templates; identify the passenger using the unique passenger identification information associated with the matched stored template; update a vehicle presence database to indicate that the identified passenger is boarding, or is aboard, the vehicle when scanned boarding, or update the vehicle presence database to indicate that the identified passenger is disembarking, or has disembarked, the vehicle when scanned disembarking.
25. The system according to claim 24, wherein:
the at least one palm vein scanner is configured to scan the palm veins of the passenger prior to boarding the vehicle; create a data template
based on the scan; and convey the data template directly or indirectly to the server; and
the server is configured to associate unique passenger identification information with the conveyed data template, and store the data template in the database of templates for future comparison.
26. The system according to claim 24 or 25, comprising:
an on-vehicle terminal linked with the palm vein scanner and running an on-vehicle application, wherein passenger access to personalised content using the on-vehicle application is enabled once the passenger has been identified.
27. The system according to any one of claims 24 to 26, comprising:
an identification device disposed on or in relation to the passenger, the identification device being configured to convey unique passenger identification information; and
a reading device configured to read the unique passenger identification information conveyed by the identification device.
28. The system according to any one of claims 24 to 27, comprising:
a camera for capturing an image which includes the face of a passenger; and
facial recognition means, and/or the camera and/or the server, configured to create a facial data template based on the captured image; compare the facial data template with a database of stored facial data templates associated with unique passenger identification information; match the created facial data template with one of the stored facial data templates; and identify the passenger based on the unique passenger identification information associated with the matched stored template.
29. The system according to any one of claims 24 to 28, wherein:
the server is configured to identify a zone or region of the vehicle in which the passenger is identified.
30. The system according to any one of claims 24 to 29, wherein:
the server is configured to verify that the identified passenger is on a passenger manifest for the vehicle.
31. A method of managing passengers of a vehicle, comprising:
scanning the palm veins of a passenger at least as they board or disembark the vehicle;
creating a data template based on the scan;
comparing the data template with a database of stored templates associated with unique passenger identification information;
matching the data template with one of the stored templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template;
updating a database to indicate that the identified passenger is boarding or is aboard, or is disembarking or has disembarked, the vehicle when the passenger is scanned as they board or disembark the vehicle respectively.
32. The method according to claim 31, comprising:
scanning the palm veins of a passenger prior to boarding the vehicle; creating a data template based on the scan;
associating unique passenger identification information with the created data template; and
storing the data template in a database of templates for comparison on future palm vein scans.
33. The method according to claim 31 or claim 32, comprising:
electronically receiving a transmission of unique passenger identification data transmitted from a device disposed on or in relation to a passenger;
accessing a database of stored passenger identification information records;
comparing the transmitted passenger identification data with a database of stored passenger identification information records;
matching the transmitted passenger identification data with one of the stored passenger identification information records; and
identifying the passenger based on passenger identification information in the matched stored record.
34. The method according to any one of claims 31 to 33, comprising:
capturing an image which includes the face of a passenger;
creating a facial data template based on the captured image;
comparing the facial data template with a database of stored facial data templates associated with unique passenger identification information; matching the created facial data template with one of the stored facial data templates;
identifying the passenger based on the unique passenger identification information associated with the matched stored template.
35. The method according to any one of claims 31 to 34, comprising:
identifying a zone or region of the vehicle in which the passenger is identified.
36. The method according to any one of claims 31 to 35, comprising:
increasing or reducing a passenger count in a database according to whether the identified passenger is entering, exiting, checking in to, or checking out of a zone, port, terminal, facility or vehicle.
37. The method according to any one of claims 31 to 36, comprising:
enabling the identified passenger to access, and/or presenting to the identified passenger, personalised content on an electronic device or terminal.
38. The method according to any one of claims 31 to 37, comprising:
verifying that the identified passenger is on a passenger manifest for the vehicle.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2017902680A AU2017902680A0 (en) | 2017-07-07 | | Passenger Management |
| AU2017902680 | 2017-07-07 | ||
| AU2018900536A AU2018900536A0 (en) | 2018-02-20 | | Passenger Management |
| AU2018900536 | 2018-02-20 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019006503A1 (en) | 2019-01-10 |
Family
ID=64949529
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/AU2018/050690 (WO2019006503A1, Ceased) | Passenger management | 2017-07-07 | 2018-07-04 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2019006503A1 (en) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070040672A1 (en) * | 2005-08-22 | 2007-02-22 | Andrew Chinigo | Security system for mass transit and mass transportation |
| US20140309806A1 (en) * | 2012-03-14 | 2014-10-16 | Flextronics Ap, Llc | Intelligent vehicle for assisting vehicle occupants |
| US20140125502A1 (en) * | 2012-11-07 | 2014-05-08 | Jim Wittkop | Systems and methods for tracking vehicle occupants |
Non-Patent Citations (2)
| Title |
|---|
| VRANKULJ, A.: "Fujitsu launches new biometric palm vein access control system", BIOMETRICUPDATE, 10 December 2014 (2014-12-10), XP055564670, Retrieved from the Internet <URL:https://web.archive.org/web/20141210024542/http://www.biometricupdate.com/201404/fujitsu-launches-new-biometric-palm-vein-access-control-system> [retrieved on 20180903] * |
| COXWORTH, B.: "Kidtrack biometric system keeps track of kids on school buses", NEW ATLAS, 24 March 2017 (2017-03-24), XP055564668, Retrieved from the Internet <URL:https://newatlas.com/kidtrack-biometric-school-bus-scanner/26723> [retrieved on 20180903] * |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111523364A (en) * | 2019-02-05 | 2020-08-11 | 丰田自动车株式会社 | Information processing system, readable storage medium, and vehicle |
| CN111523364B (en) * | | 2023-11-07 | 丰田自动车株式会社 | Information processing systems, readable storage media and vehicles |
| US12254494B2 (en) * | 2020-03-12 | 2025-03-18 | Nec Corporation | Information processing apparatus, information processing method, and computer readable recording medium for providing information to a passenger |
| US12346935B2 (en) * | 2020-03-12 | 2025-07-01 | Nec Corporation | Information processing apparatus, information processing method, and computer readable recording medium for providing information to a passenger |
| NO20211352A1 (en) * | 2021-11-09 | 2023-05-10 | Dimeq As | Adaptable Communication System for a Vessel |
| NO347347B1 (en) * | 2021-11-09 | 2023-09-25 | Dimeq As | Adaptable Communication System for a Vessel |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11996175B2 (en) | Trusted third-party computerized platform using biometric validation data structure for AI-based health wallet | |
| US9300925B1 (en) | Managing multi-user access to controlled locations in a facility | |
| JP2010518461A (en) | Proximity location system | |
| US20160185503A1 (en) | Methods, devices, and systems for secure transport of materials | |
| JP6595268B2 (en) | Entrance / exit management system | |
| US12254720B2 (en) | Ticket issuing system, and ticket checking apparatus | |
| US20140125502A1 (en) | Systems and methods for tracking vehicle occupants | |
| US20120035906A1 (en) | Translation Station | |
| CN103544738A (en) | Attendance system for company | |
| CN105427056A (en) | Passenger ticket booking and checking information management system | |
| US20050256724A1 (en) | Personalized boarding pass | |
| US12170934B2 (en) | Information processing apparatus, information processing method, and storage medium | |
| WO2019006503A1 (en) | Passenger management | |
| US20250022088A1 (en) | Information processing apparatus, information processing method, and storage medium | |
| JP2007079656A (en) | Ticketless boarding system and ticketless boarding method | |
| US9507980B2 (en) | Intelligent container | |
| US20140363060A1 (en) | Hand-held device for biometric identification | |
| CN110375716B (en) | Unmanned aerial vehicle searching information generation method and unmanned aerial vehicle | |
| Annapurna et al. | Design of Authenticated Radio Frequency Identification based Electronic Voting Machine | |
| Abd Alhasan et al. | RFID based protection to newborns in the hospitals | |
| US20230360453A1 (en) | Authentication system, operation method of authentication system, and operation program of authentication system | |
| JP7730128B1 (en) | Machines, systems, and methods | |
| US20040238617A1 (en) | Identifying method, identifying chip and identifying system | |
| KR20220114141A (en) | Location tracking system using automatic barcode generation function | |
| Tiwari et al. | Patient Information Retrieval Using Fingerprint |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18829075; Country of ref document: EP; Kind code of ref document: A1 |
| | DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18829075; Country of ref document: EP; Kind code of ref document: A1 |