
WO2012159166A1 - Image publication - Google Patents

Image publication

Info

Publication number
WO2012159166A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
user
display
article
processing system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/AU2012/000584
Other languages
English (en)
Inventor
Yasharth KRISHNA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FIVE FACES Pty Ltd
Original Assignee
FIVE FACES Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2011902019A external-priority patent/AU2011902019A0/en
Application filed by FIVE FACES Pty Ltd filed Critical FIVE FACES Pty Ltd
Publication of WO2012159166A1
Anticipated expiration
Legal status: Ceased

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • G03B15/06 Special arrangements of screening, diffusing, or reflecting devices, e.g. in studio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet

Definitions

  • the present invention relates to a method and apparatus for publishing an image of a user, and in one particular example, for publishing one or more images of a user via a communications network.
  • the present invention seeks to provide a system for publishing an image of a user, the system including:
  • an at least partially reflective surface to allow the user to view a reflection of themselves;
  • an image capture device for capturing an image of the user through the partially reflective surface;
  • at least one display; and
  • a processing system including an electronic processing device that: i) causes the image capture device to capture the image of the user; ii) displays the image to the user using the at least one display; and iii) transfers the image via a communications network to allow at least one individual to view the image.
  • the processing system transfers the image to a server.
  • the system includes an input for receiving input commands from the user, the user providing the input commands to at least one of:
  • the input includes a touch sensitive membrane positioned between the at least partially reflective surface and the at least one display.
  • the image is published via a social media application.
  • the system includes a second display for displaying content.
  • the content includes at least one of:
  • the trigger is a predetermined action performed by the processing system.
  • the image capture device captures an image sequence, the image sequence being transferred via a communications network to allow at least one individual to view the image sequence.
  • the apparatus includes a sensor for sensing an identifier associated with a clothing article, and wherein the electronic processing device: i) determines an identity of the clothing article; ii) determines information relating to the article; and iii) displays the information to the user.
  • the apparatus includes a payment sensor for sensing a payment, and wherein the electronic processing device is responsive to the payment to provide confirmation of payment.
  • the apparatus includes a printer for printing at least one of:
  • the present invention seeks to provide a method for publishing an image of a user, the method including:
  • the method includes:
  • Figure 1 is a flow chart of an example of a process for publishing an image
  • Figure 2 is a schematic diagram of an example of a distributed computer architecture
  • Figure 3 is a schematic diagram of an example of a processing system
  • Figure 4 is a schematic diagram of an example of an end station
  • Figure 5A is a schematic side view of an example of a system for publishing an image
  • Figure 5B is a schematic front view of the system of Figure 5A;
  • Figure 5C is a schematic perspective view of a specific example of the housing of
  • Figures 6A and 6B are a flow chart of an example of a process for publishing an image.
  • Figure 7 is a flow chart of an example of a process for displaying content on a second display.
  • an image of a user is acquired. Whilst this may be performed in any appropriate manner, in one example this is performed using a processing system which operates to capture an image of the user using an image capture device, such as a camera, webcam, or the like. As will be described in more detail below, in one particular example the camera is mounted behind an at least partially reflective surface, such as a two-way mirror or partially reflective glass (such as SCHOTTTM reflective glass), allowing the image to be captured through the glass. However, this is not essential and any appropriate image capture method may be used.
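By way of illustration only (this sketch is not part of the patent specification), the capture step described above could be prototyped as follows, assuming an OpenCV-compatible camera is available; the device index and output file name are placeholder choices.

```python
# Minimal capture sketch (assumes the opencv-python package and a camera at device index 0).
import cv2

def capture_image(device_index=0, path="captured_user.jpg"):
    camera = cv2.VideoCapture(device_index)  # open the image capture device
    try:
        ok, frame = camera.read()            # grab a single frame
        if not ok:
            raise RuntimeError("could not read a frame from the camera")
        cv2.imwrite(path, frame)             # persist the still image
        return path
    finally:
        camera.release()                     # always free the device

if __name__ == "__main__":
    print("Saved image to", capture_image())
```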
  • the captured image can be a still image, such as a photo, or a sequence of images, such as a video, and reference to the term image is not intended to be limiting.
  • publication details are determined.
  • the publication details are indicative of the manner in which the image is to be published.
  • the image is typically published in such a manner as to allow it to be viewed by a number of individuals, for example through the use of social media, such as FacebookTM, or the like.
  • alternative publication techniques such as email, MMS (Multimedia Messaging Service), or the like can be used. It will be appreciated that the term publication therefore includes any method of providing the image to other individuals to allow the other individuals to view the image, and is not intended to be limiting.
  • the publication details can be determined in any suitable manner, including for example retrieving publication details from a store, such as a memory or database, or can be determined based on input commands provided by the user.
  • the image is transferred via a communications network for publication, in accordance with the publication details.
  • the image will be uploaded to a server operated by FacebookTM thereby allowing the image to be published.
  • the image may be transferred directly to communications devices of other individuals, for example via email or the like.
  • the image is optionally viewed by one or more individuals who then can provide feedback to the user at step 140.
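For illustration, a minimal sketch of the Figure 1 flow (acquire an image, determine publication details, transfer it for publication) is given below. The upload endpoint, payload fields and the way publication details are stored are assumptions, not the patent's implementation.

```python
# Sketch of the Figure 1 flow; the endpoint URL and payload fields are illustrative placeholders.
import requests

def determine_publication_details():
    # In practice these could be retrieved from a store (memory/database) or from user input commands.
    return {"method": "social_media", "endpoint": "https://example.com/api/publish"}

def publish_image(image_path):
    details = determine_publication_details()
    with open(image_path, "rb") as f:
        response = requests.post(details["endpoint"],
                                 files={"image": f},
                                 data={"method": details["method"]})
    response.raise_for_status()  # publication failed if the server rejected the upload
    return response

# publish_image("captured_user.jpg")  # individuals can then view the image and give feedback
```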
  • the process is performed at least in part using a processing system, such as a suitably programmed computer system. Whilst this can be performed on a stand alone machine, in one example, this is more typically performed by one or more processing systems operating as part of a distributed architecture. An example of a distributed architecture will now be described with reference to Figure 2.
  • a base station 201 is coupled via a communications network, such as the Internet 202, and/or a number of local area networks (LANs) 204, to a number of end stations 203, and optionally to third party servers 205, which will be described in more detail below.
  • the configuration of the networks 202, 204 is for the purpose of example only, and in practice the base station 201, end stations 203 and servers 205 can communicate via any appropriate mechanism, such as via wired or wireless connections, including, but not limited to, mobile networks, private networks, such as 802.11 networks, the Internet, LANs, WANs, or the like, as well as via direct or point-to-point connections, such as Bluetooth, or the like.
  • the base station 201 includes a processing system 210 connected to an imaging device 211, such as a camera, first and second displays 212, 213 and an input 214, such as a touch sensitive input.
  • the input could alternatively include a motion sensing device, such as a Kinect sensor.
  • the input could be in the form of a wireless connection that receives signals from a remote device, such as a smartphone, or the like.
  • the processing system 210 may also be connected to other optional peripheral devices 215, such as sensors, printers or the like, as will be described in more detail below.
  • the processing system 210 executes applications software allowing images of the user to be captured using the camera 211 and then subsequently electronically published. As part of this process, the processing system 210 may interpret user input commands provided via the input device 214, as well as allowing information to be presented to the user via either or both of the displays 212, 213.
  • the third party servers 205 can be used in order to allow the image to be published, as will be described in more detail below. However, this is not essential, and publication can be performed directly by the processing system 210 or by any appropriate technique.
  • the end stations 203 are typically used to allow published images to be viewed by other individuals, and accordingly each end station 203 therefore typically executes applications software allowing communication with the third party servers 205 or the base station 201, such as a browser application, or the like. However, this is not essential and any suitable arrangement may be used.
  • the processing system 210 includes at least one electronic processing device, such as a microprocessor 300, a memory 301, an input/output device 302, typically including the displays 212, 213 and the touch sensitive input device 214, and an external interface 303, interconnected via a bus 304 as shown.
  • the external interface 303 can be utilised for connecting the processing system 210 to peripheral devices, such as the communications networks 202, 204, or the camera 211.
  • Additional peripheral devices 215 can be provided including an identifier sensor, such as a barcode or QR code scanner, for sensing identifiers associated with products, a payment sensor such as a credit card reader, for detecting a payment, as well as printers for providing printed information or other content.
  • the processor 300 executes instructions in the form of applications software stored in the memory 301 to allow the image publication process to be performed.
  • the processing system 210 may be formed from any suitable processing system, such as a suitably programmed computer system, PC, web server, network server, or the like, and the electronic processing device can include a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement capable of performing the image publication process described herein.
  • the third party servers 205 will typically have a similar configuration to the processing system 210, and these will not therefore be described in any further detail.
  • the end stations 203 include at least one electronic processing device such as a microprocessor 400, a memory 401, an input/output device 402, such as a keyboard and/or display, and an external interface 403, interconnected via a bus 404 as shown.
  • the external interface 403 can be utilised for connecting the end station 203 to peripheral devices, such as the communications networks 202, 204, or the like. The electronic processing device of the end station 203 can likewise include a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement capable of performing the required processes.
  • the processor 400 executes instructions in the form of applications software stored in the memory 401 to allow communication with the base station 201 or servers 205, thereby allowing images of the user to be viewed.
  • this is achieved by having the end station 203 execute a browser application, allowing images to be presented on webpages or the like, although this is not essential and any suitable configuration may be used.
  • the end stations 203 may be formed from any suitably programmed processing system, such as a suitably programmed PC, Internet terminal, lap-top, hand-held PC, tablet PC, slate PC, iPadTM, mobile phone, smart phone, PDA (Personal Data Assistant), or other communications device.
  • the base station 201 includes a housing 500 having a front made of SCHOTTTM reflective glass 501, which is a partially reflective glass, typically manufactured by providing a thin layer of metal on glass.
  • the housing 500 contains the processing device 210, which is in turn coupled to the camera 211 and first and second displays 212, 213, which are provided behind the reflective glass 501.
  • An IR touch sensitive membrane 214 is also typically mounted between the first display 212 and the reflective glass 501, allowing the first display to act as a touch screen.
  • the relative position of the camera 211 and displays 212, 213 is for the purpose of example only, and it will be appreciated that alternative arrangements can be used.
  • the housing may also house additional peripheral devices 215, such as identifier sensors, payment sensors, printers or the like.
  • the reflective glass 501 is partially reflective, allowing the user to view their own reflection, for example, when the displays 212, 213 are deactivated, or displaying a dark image.
  • the camera 211 can be used to capture an image of the user 510 through the reflective glass 501, whilst, when activated, the displays 212, 213 can be viewed through the reflective glass 501.
  • viewing of the displays 212, 213 can be enhanced by additional lighting provided in the housing 500, for example around the perimeter of the displays 212, as shown at 520.
  • the user tries on an article of clothing.
  • the user can typically view their reflection in the reflective glass 501 allowing the user to perform a visual assessment of whether or not they like the clothing.
  • the user activates the image publication system. This will be achieved in any appropriate manner and can involve for example having the user touch the touch screen 214, although a separate activation control may be provided (not shown).
  • the processing system 210 prompts the user to pose for image acquisition. This can be achieved in any appropriate manner but typically involves having the processing system display a message via the first display 212, indicating that image capture is about to occur.
  • the processing system 210 acquires an image of the user using the camera 211, with the captured image being displayed on the display 212 at step 620, as shown for example in Figure 5B.
  • the image can include a sequence of images, for example in the form of a video segment, or a sequence of static images in different poses.
  • the image may be displayed together with input options allowing the user to confirm if the image is acceptable at step 625. This can include for example a "yes" or "no" selection displayed on the display 212, allowing the user to select an appropriate option using the touch membrane 214.
  • if the image is not acceptable, the process returns to step 610, allowing the user to pose for a new image to be captured.
  • the base station 201 can be configured to allow multiple images of the user to be captured, with each of these being displayed to the user, thereby allowing the user to select one or more images they prefer.
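As an illustrative aside, the pose/capture/confirm loop of steps 610 to 630 might be sketched as below, reusing the capture_image helper shown earlier and using a console prompt to stand in for the touch-screen "yes"/"no" selection.

```python
# Sketch of the capture/confirm loop; the prompts stand in for options shown on display 212.
def capture_until_accepted(max_attempts=5):
    accepted = []
    for attempt in range(max_attempts):
        print("Please pose for image acquisition...")            # message shown before capture
        path = capture_image(path=f"user_image_{attempt}.jpg")   # acquire the image
        answer = input(f"Captured {path}. Keep this image? [yes/no] ").strip().lower()
        if answer == "yes":
            accepted.append(path)                                # the user may keep several images
        more = input("Capture another pose? [yes/no] ").strip().lower()
        if more != "yes":
            break
    return accepted
```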
  • the publication options can include any appropriate option such as a list of different social media via which the image can be published.
  • the publication techniques that can be used can include any electronic publication, and in one example can be configured by the operator of the base station 201, and/or selected based on user inputs.
  • the base station 201 can be adapted to provide the user with access to their own social media account, for example by having the processing system 210 implement a browser application, allowing the user to navigate to a social media website, or the like.
  • the user selects a publication option and then provides any required information.
  • For example, if the user selects to publish the image via FacebookTM, the user may be required to provide details of their FacebookTM account, including authentication information required to log on.
  • the image can be published to a default FacebookTM account that is not necessarily associated with the user, in which case such login information may not be required.
  • the user may be required to enter relevant information such as email addresses, phone numbers or the like.
  • the image is published to a Facebook account associated with the shop.
  • the publication of the image to the relevant account can be performed in accordance with account information provided during an initial configuration process, thereby avoiding the need for the user to input account information.
  • the user can then tag the image, so that third parties such as friends, can find and view the image on the shop's account page, which can in turn increase exposure for the shop, thereby providing additional advertising for the shop.
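A hedged sketch of publishing the image to a shop's pre-configured page using the FacebookTM Graph API photos endpoint follows. The page ID, access token and API version are placeholders that would come from the initial configuration process, and the exact endpoint shape should be checked against the current Graph API documentation before relying on it.

```python
# Hedged sketch: post a captured image to a pre-configured page via the Graph API.
import requests

PAGE_ID = "YOUR_PAGE_ID"               # configured by the shop, not entered by the user
PAGE_ACCESS_TOKEN = "YOUR_PAGE_TOKEN"  # stored during the initial configuration process

def post_photo_to_page(image_path, caption=""):
    url = f"https://graph.facebook.com/v19.0/{PAGE_ID}/photos"
    with open(image_path, "rb") as f:
        response = requests.post(
            url,
            params={"access_token": PAGE_ACCESS_TOKEN},
            data={"message": caption},   # the caption can carry tags or branding text
            files={"source": f},
        )
    response.raise_for_status()
    return response.json()
```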
  • the processing system 210 transfers the image, typically via the communications networks 202, 204 to a relevant hardware device, thereby allowing the image to be published.
  • This can include for example transferring the image to a third party server 205, or can alternatively include transferring the image directly to designated user end stations 203, for example by email, MMS, or the like.
  • the user may also provide a status update to indicate that an image is available for viewing. It will be appreciated that this can be achieved in any suitable manner, such as by tagging the image and then providing a status update via their own social media account.
  • the published image may also be provided together with additional information, such as advertising, a set background, border, or the like.
  • the image is published together with advertising relating to the shop where the article is available for sale, for example by including the slogan "This outfit can be purchased at ", thereby providing additional advertising for the retailer.
  • the image of the user can be superimposed on a background associated with the shop, so for example, in an adventure store, the image of the user could be superimposed on a mountain scene, or the like.
  • acquired images can be branded before publication, taking advantage of the viral nature of Social Media to distribute the host's brand further and instantly.
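Purely as an illustration of the branding step, the sketch below adds a border and slogan to the image before publication using Pillow; the border size, colours and slogan text are placeholder choices, not values from the patent.

```python
# Sketch: add a simple branded border and slogan before publication (requires Pillow).
from PIL import Image, ImageDraw, ImageOps

def brand_image(path, out_path="branded.jpg",
                slogan="This outfit can be purchased at Example Store"):
    image = Image.open(path).convert("RGB")
    image = ImageOps.expand(image, border=40, fill="white")    # branded border
    draw = ImageDraw.Draw(image)
    draw.text((50, image.height - 30), slogan, fill="black")   # slogan along the bottom edge
    image.save(out_path)
    return out_path
```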
  • friends of the user are optionally able to view the image and provide feedback to the user, for example via the user's FacebookTM account at step 660, thereby helping the user decide whether to purchase the clothing.
  • the second display is used to present other information, such as content including store information, advertising, or the like. This is of particular benefit to advertisers, as the user is captive during the process, and this can therefore significantly enhance the effectiveness of the advertising process. However, this is not essential, and alternatively a single display may be used.
  • in the event that the second display is used, this can be used to display information in accordance with a schedule, as known in the art.
  • a central server can be used to serve content to the base station 201 in accordance with a predetermined schedule, allowing a variety of content to be displayed as required.
  • the second display can also be used to display content in response to trigger events, as will now be described with reference to Figure 7.
  • the processing system 210 determines scheduled content and then causes this to be presented on the second display 213, at step 705. This can be performed in any appropriate manner, such as by having a central server upload a schedule and/or content, thereby allowing this to be displayed. As such techniques are known in the art, these will not be described in any further detail.
  • the processing system 210 monitors for triggers, which are typically predetermined actions performed by the processing system 210, such as the taking or uploading of a photo.
  • the trigger events may be identified using any suitable technique, but typically this involves having a list of actions stored in memory, for example during a configuration process, or uploaded from a server.
  • if no trigger is detected, the process returns to step 700, allowing scheduled content to continue being displayed.
  • otherwise, the process moves on to step 720, with the processing system determining content associated with the trigger, before displaying this at step 725.
  • the content can be any suitable content, and could include specific advertising, discount offers, vouchers for redemption, or the like.
  • the content can include a machine readable code, such as a 2-D barcode, or the like, allowing the user to access a website, or discount offer, using a smartphone.
  • a discount offer could be displayed indicating that the user can have a 10% discount on any items worn in the image.
  • the offer can then be accepted in any appropriate manner such as by scanning a displayed barcode using a smartphone, or the like.
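As a sketch of how such a machine readable code might be produced, the following uses the qrcode package to render a 2-D barcode for a placeholder offer URL; in the described system the resulting image would be shown on the second display 213.

```python
# Sketch: render a scannable 2-D barcode for a discount offer (requires the qrcode package).
import qrcode

def make_offer_code(offer_url="https://example.com/offers/10-percent-off",
                    out_path="offer_qr.png"):
    img = qrcode.make(offer_url)  # returns a PIL image of the code
    img.save(out_path)
    return out_path
```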
  • the triggered content can then be displayed either for a predetermined amount of time, or until a further action is performed, at which point the process can return to step 700, allowing the presentation of scheduled content to resume. Allowing content to be displayed in response to triggers allows shops to use targeted advertising, offers, discounts, or the like, to thereby help maximize advertising effectiveness and hence sales.
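An illustrative sketch of the Figure 7 behaviour follows: scheduled content is shown until a trigger event (such as a photo being taken or uploaded) occurs, at which point the associated content is shown for a predetermined time. The trigger names, content items and timings are assumptions made for the sketch.

```python
# Sketch of trigger-driven content on the second display; display_content() stands in for
# actually driving display 213, and the trigger/content mapping is illustrative.
import queue
import time

TRIGGER_CONTENT = {
    "photo_taken": "discount_offer.png",
    "photo_uploaded": "thank_you_advert.mp4",
}

def display_content(item):
    print("Now showing on second display:", item)

def run_second_display(trigger_queue, scheduled_item="scheduled_advert.mp4",
                       trigger_duration=10.0):
    while True:
        try:
            trigger = trigger_queue.get(timeout=1.0)   # monitor for trigger events
        except queue.Empty:
            display_content(scheduled_item)            # no trigger: keep showing scheduled content
            continue
        content = TRIGGER_CONTENT.get(trigger)
        if content:
            display_content(content)                   # show the triggered content
            time.sleep(trigger_duration)               # for a predetermined amount of time
```

In this sketch the trigger_queue would be fed by whichever part of the application captures or uploads photos.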
  • the base station 201 can be connected to a server, allowing content or the like to be uploaded thereto. This allows a number of systems to be centrally controlled, so that base stations 201 provided in a number of different stores can be configured appropriately. This can be used, for example, to allow a franchisor to control the systems used by franchisees, thereby ensuring franchising requirements, particularly with respect to advertising or the like to be met.
  • the above described process allows users to obtain feedback from friends or other individuals as to the appearance of articles for purchase, such as clothing or the like. It will be appreciated from this that the base station 201 can therefore be provided in shops or other retail outlets.
  • the customer walks into a store, and tries out a product (clothing, shoes, apparel), examining their reflection in a partially reflective surface to see what they look like.
  • general advertising can be displayed by a second display 213 mounted behind the partially reflective surface.
  • if the user wants a second opinion regarding their article selection, they can activate the system, for example by touching the touch sensitive membrane 214, causing the system to take one or more photos of them, which can be done several times with different outfits/poses. Images can be displayed for review, with the user selecting one or more images, and a publication option.
  • the base station 201 can then post the image to any number of social media sites (configurable). This can be a store FacebookTM page, the customer's FacebookTM page, or similar. In one example, the image/video taken will have a customisable border with branding information on it, making this a powerful marketing tool for the store.
  • as the base station 201 can display images, interpret user input commands provided via the input device 214, and communicate via a communications network, the arrangement can also be used to provide access to network content and services. In one example, this therefore allows a user to access internet content, for example to browse alternative products or the like. For example, this allows the above described system to be used as a means for customers to check in to a venue or event, and share their experience with their friends.
  • the base station 201 includes an identifier sensor for sensing identifiers associated with products, such as barcodes, QR codes, or the like. This may be a separate sensor, or can alternatively be implemented using the imaging device 211, which can image codes. This can be used to allow a user to sense an identifier associated with an article of clothing, allowing additional information regarding the clothing to be displayed to the user. In particular, this allows the processing system 210 to determine an identity of the clothing article, determine information relating to the article and then display the information.
  • the processing system 210 can receive signals from the sensor and interpret these to identify the clothing, for example by comparing the sensed identifier to a list of identifiers stored in a store together with information regarding the relevant article. Information can therefore be retrieved, either from local memory, or from a remote store, via the communications networks 202, 204, and then displayed to the user. It will also be appreciated that in addition to using a sensor, clothing articles could be identified using other techniques, such as image recognition, or by having the user manually input an alphanumeric or other similar code that uniquely identifies a particular article of clothing, or the like.
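To illustrate the lookup step, the sketch below resolves a sensed identifier first against a local table and then against a remote store; the sample barcode value and the remote URL are placeholders.

```python
# Sketch of identifier-to-article lookup with a local table and a remote fallback.
import requests

LOCAL_ARTICLES = {
    "9300000000001": {"name": "Example jacket", "price": "129.00", "sizes": ["S", "M", "L"]},
}

def lookup_article(identifier, remote_url="https://example.com/api/articles"):
    article = LOCAL_ARTICLES.get(identifier)                          # try the local store first
    if article is not None:
        return article
    response = requests.get(f"{remote_url}/{identifier}", timeout=5)  # then the remote store
    response.raise_for_status()
    return response.json()
```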
  • the processing system 210 can connect to a merchant's stock management system, allowing a user to determine the number of articles that are in stock, or alternative stores that might include the same or similar articles (e.g. different sizes, colours, or the like). This can significantly assist the user with the shopping experience.
  • the system can be adapted to allow transactions to be performed, for example to allow a user to make purchases of articles.
  • a user can arrange to sense an identifier associated with an article, allowing the processing system 210 to retrieve information regarding the cost of the product. The user can then be presented with an option allowing the user to purchase the product.
  • the user is required to provide payment, for example by presenting a payment device to a payment sensor.
  • This could include presenting a credit card, debit card, store card or the like to a card reader, or providing physical currency to a banknote and/or coin sensor.
  • the processing system 210 causes details of the transaction to be presented to the user via one of the displays 212, 213, allowing the user to provide confirmation they wish to proceed with the transaction.
  • alternative payment options could be used, such as PayPal or another similar online payment system, allowing a payment to be transferred directly from the user to the merchant.
  • the processing system 210 can cause a confirmation of payment to be provided to the user, either by having a receipt printed by a printer, or by having a receipt or other confirmation of the transaction transmitted to a communications device of the user, such as a mobile phone or the like.
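The confirmation step could be sketched as below. The read_payment function and the send_sms callable are hypothetical stand-ins for whatever card reader, cash acceptor or messaging gateway the operator actually uses.

```python
# Sketch of payment confirmation: on an approved payment, show details and send a receipt.
def read_payment(amount):
    # Hypothetical driver call: a card reader or cash acceptor reporting an approved payment.
    return {"approved": True, "amount": amount, "reference": "TXN-0001"}

def confirm_purchase(article, phone_number, send_sms):
    result = read_payment(article["price"])
    if not result["approved"]:
        return False
    receipt = (f"Payment of {result['amount']} received for {article['name']} "
               f"(ref {result['reference']}).")
    send_sms(phone_number, receipt)   # e.g. an SMS/MMS gateway chosen by the operator
    return True

# confirm_purchase({"name": "Example jacket", "price": "129.00"}, "+61400000000",
#                  send_sms=lambda number, message: print(number, message))
```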
  • when leaving the shopping environment, the user can then simply present their receipt confirming that the goods have been purchased.
  • the system can be used as a point-of-sale terminal in order to allow a user to make purchases.
  • the system can be used to allow a user to make other purchases, join loyalty programs, rate their experience, engage in competitions, perform transactions or the like.
  • the base station 201 can integrate with loyalty programs or sales systems to provide relevant directed information, as well as to allow a user to perform a transaction, interact with loyalty programs, or the like.
  • the system also allows merchants to display videos and control them, either remotely, in response to defined triggers, or based on time and date information.
  • This allows the merchant to display advertising TVCs or other advertising collateral. Any captured images can be published via social media, together with branding information, such as logos, a branded border, or the like.
  • This allows the merchant to leverage the system to provide marketing driven by user interaction with the system. This means the individuals viewing the images are targeted directly by the marketing, making it more relevant to them, and increasing the likelihood of it providing a benefit to the merchant.
  • the base station 201 can provide logging functionality, for example by storing information regarding user interaction with the system. This allows operators to view reports of interactions by customers. Additionally, the device can perform gender recognition, allowing operators to further ascertain the gender of customers.
  • Persons skilled in the art will appreciate that numerous variations and modifications will become apparent. All such variations and modifications which become apparent to persons skilled in the art should be considered to fall within the spirit and scope of the invention as broadly described hereinbefore.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A system for publishing at least one image of a user, the system including: an at least partially reflective surface to allow the user to view a reflection of themselves; an image capture device for capturing at least one image of the user through the partially reflective surface; at least one display; and a processing system including an electronic processing device that causes the image capture device to capture the at least one image of the user, displays the at least one image to the user using the at least one display, and transfers the at least one image via a communications network so that at least one individual can view the at least one image.
PCT/AU2012/000584 2011-05-24 2012-05-24 Image publication Ceased WO2012159166A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2011902019 2011-05-24
AU2011902019A AU2011902019A0 (en) 2011-05-24 Image publication

Publications (1)

Publication Number Publication Date
WO2012159166A1 true WO2012159166A1 (fr) 2012-11-29

Family

ID=47216456

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2012/000584 Ceased WO2012159166A1 (fr) 2011-05-24 2012-05-24 Image publication

Country Status (1)

Country Link
WO (1) WO2012159166A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014113187A1 (fr) * 2013-01-16 2014-07-24 Eminvent, LLC Systems and methods for gathering customer information

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090116766A1 (en) * 2007-11-06 2009-05-07 Palo Alto Research Center Incorporated Method and apparatus for augmenting a mirror with information related to the mirrored contents and motion

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
"Dressing-Rooms Of The Future", 10 June 2010 (2010-06-10), Retrieved from the Internet <URL:http://wayback.archive.org/web/*/www.retail.org.nz/downloads/retail%20design.pdf> [retrieved on 20120727] *
"Learning from Prada", RFID JOURNAL, 24 June 2002 (2002-06-24), Retrieved from the Internet <URL:http://www.rfidjournal.com/article/view/425> [retrieved on 20120727] *
MUI, Y.Q: "Interactive Mirror Mirror on the Wall", REDORBIT, 16 January 2007 (2007-01-16), Retrieved from the Internet <URL:http://www.redorbit.com/news/technology/803464/interactive_mirror_mirror_on_the_wall> [retrieved on 20120727] *
O'DONNELL, J.: "Retailers try on dressed-up fitting rooms", USA TODAY, 19 January 2007 (2007-01-19), Retrieved from the Internet <URL:http://www.usatoday.com/tech/news/techinnovations/2007-01-18-high-tech-dressing-rooms_x.htm> [retrieved on 20120727] *
PRADA'S SMART TAGS TOO CLEVER, 27 October 2002 (2002-10-27), Retrieved from the Internet <URL:http://www.wired.com/science/discoveries/news/2002/10/56042> [retrieved on 20120727] *
SINGER, N: "If the Mirror Could Talk (It Can)", THE NEW YORK TIMES, 18 March 2007 (2007-03-18), Retrieved from the Internet <URL:http://www.nytimes.com/2007/03/18/fashion/18mirror.html> [retrieved on 20120727] *


Similar Documents

Publication Publication Date Title
US12277586B2 (en) Augmented reality systems and methods for purchasing
US20240311891A1 (en) Useful and Novel Shopping Application
  • KR101905290B1 Interactive display based on user interest
US10467672B2 (en) Displaying an electronic product page responsive to scanning a retail item
US10339579B2 (en) Systems and methods for controlling shelf display units and for graphically presenting information on shelf display units
US10043209B2 (en) Method and system for consumer transactions using voice or human based gesture actions
US20180182025A1 (en) Systems and methods for visualizing in-budget products and services
  • KR101881939B1 Seller-centric electronic commerce service provision method, apparatus, service server and user terminal
US20100265311A1 (en) Apparatus, systems, and methods for a smart fixture
  • KR20170121328A Store of the future
  • KR101620938B1 Clothing product sales support apparatus, and clothing product information management server, clothing-related product recommendation server and clothing product information provision method capable of communicating therewith
  • WO2015159601A1 Information processing device
US20150324850A1 (en) System and method for automatically providing virtual customer-personalized content in a retail commerce environment
  • KR20120057668A System and method for supporting communication between customers in an offline shopping mall
  • JP2020095581A Information processing method, information processing device, information processing system, and store
  • JP6912436B2 Information processing device, information processing method, and information processing program
  • WO2012159166A1 Image publication
  • KR101402497B1 Point-of-sale management apparatus and method with customer demographic recognition function
  • KR101927078B1 Method and device for providing image-based information related to a user
  • KR20200071634A Fintech payment system using a QR code and control method thereof
  • JP2021168178A Information processing device, information processing method, and information processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12790234

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03.04.2014)

122 Ep: pct application non-entry in european phase

Ref document number: 12790234

Country of ref document: EP

Kind code of ref document: A1