
HK1174751B - Method and apparatus for automatic interfacing between a master device and object device - Google Patents


Info

Publication number
HK1174751B
HK1174751B (application HK13101474.8A)
Authority
HK
Hong Kong
Prior art keywords
target device
target
image
broadcast data
master
Prior art date
Application number
HK13101474.8A
Other languages
Chinese (zh)
Other versions
HK1174751A (en)
Inventor
M.S. Grob
S. Diaz Spindola
G.V. Wright Jr.
V.W. Keating
Original Assignee
Qualcomm Incorporated
Priority date
Filing date
Publication date
Application filed by Qualcomm Incorporated
Publication of HK1174751A
Publication of HK1174751B

Links

Description

Method and apparatus for automatic interfacing between a master device and a target device
Cross Reference to Related Applications
This application claims the benefit of U.S. Provisional Application No. 61/226,552, filed July 17, 2009, and U.S. Provisional Application No. 61/226,545, filed July 17, 2009, both of which are assigned to the assignee of the present application and expressly incorporated herein by reference.
Background
Wireless devices are becoming more and more popular. Interfacing with wireless devices, however, is currently a relatively difficult or limited process. For example, pairing electronic devices is currently a complex task that requires a user to open a protocol on both devices, request discovery, and then pair. In addition, some electronic devices have little or no built-in interface, making them difficult to interact with. The cost of a display screen and the small size of many devices are obstacles to placing a rich interface on every wireless device.
As the use of wireless devices increases, it becomes increasingly difficult to interface between electronic devices. For example, current electronic devices typically have non-descriptive device names that are not associated with the physical device and are easily confused by the user; e.g., a default phone Bluetooth name may be NOK234B, which does not appear anywhere on the phone itself. The ever-increasing number of potentially pairable wireless devices will only increase the confusion experienced by users.
Therefore, there is an increasing need to be able to interface easily and naturally with devices.
SUMMARY
The master device images the target device and uses the image to identify the target device. The master device then automatically interfaces with the identified target device, for example by pairing with it. The master device may retrieve data related to the target device and display that related data, for example over the displayed image of the target device. The master device may provide an interface to control the target device, or may be used to communicate data to the target device.
The master device receives broadcast data from a target device, the broadcast data including information about the visual appearance of the target device. The master device may display the broadcast data and interface with the target device based on a selection of the displayed broadcast data from the target device. The image of the target device may be used to filter broadcast data from multiple target devices to reduce the number of target devices whose broadcast data is displayed by the master device.
Brief Description of Drawings
Fig. 1 is a block diagram illustrating an example system of wirelessly connected devices including a master device and a target device.
Fig. 2 is a block diagram illustrating an example of a master device that may automatically interface with a target device.
FIG. 3 is a flow chart of a method of interfacing the master device with the target device.
Fig. 4A and 4B illustrate a master device to target device interface, where data related to the target device is displayed on the master device.
Fig. 5A and 5B illustrate a master device interfacing with a target device, with the display of the master device augmented with data related to the target device.
Fig. 6A, 6B, 6C, and 6D illustrate a master device and target device interface where augmented data related to the target device is displayed on the master device over the target device image.
Fig. 7A and 7B illustrate a master device interfacing with a target device having limited interface capabilities.
Fig. 8A and 8B illustrate a master device interfacing with a target device, wherein the master device provides extended functionality for the target device.
Fig. 9A illustrates a master device using information broadcast by a target device to identify one of a plurality of identical target devices.
Fig. 9A, 9B, and 9C illustrate a master device being used to interface with two target devices.
Figs. 10A, 10B, and 10C illustrate a master device being used to initiate an interface between two target devices.
FIG. 11 is a flow chart of a method by which the master device images a passive target, with which the master device cannot pair or interface, and retrieves and displays data related to the target.
FIG. 12 illustrates the master device automatically launching and displaying an application based on an image of an object.
Fig. 13A and 13B illustrate the master device imaging a passive object and retrieving and displaying information about the passive object.
Fig. 14A and 14B illustrate another example of the master device imaging a passive object and retrieving and displaying information related to the passive object.
Figs. 15A, 15B, and 15C illustrate a master device imaging a passive object, retrieving data related to the object, and passing this data to a target device with which the master device is interfacing.
Detailed Description
The systems and methods described herein automate the pairing, or other interfacing, of a master device with a target device using, for example, an image of the target device taken with the master device. The master device may be used to provide relevant information about the target device, for example by overlaying the information on an image of the target device. The master device may have interaction capabilities superior to those of the target device and may enable interaction with the target device that would otherwise be difficult or impossible, for example due to the target device's limited or absent interface capabilities.
Fig. 1 is a block diagram illustrating an example system 100 of wirelessly connected devices including a master device 110 and a target device 180. By way of example, the master device 110 may be a mobile phone or other electronic device, such as an ultra-mobile personal computer ("UMPC"). In one embodiment, the master device 110 may be semi-mobile or stationary. For example, the master device 110 may be located at a stationary or semi-stationary location, for example in a store, where it can be used to easily pair new devices together. By way of example, the target device may be a digital camera, music player, television, digital photo frame, or any other device. Additionally, it should be appreciated that the target device 180 may be portable or, in some cases, relatively or completely non-portable. The master device 110 includes a camera 112 and a display 114. The master device 110 images the target device 180 using the camera 112; the image 180image of the target device 180 may be shown in the display 114 and used to identify the target device 180 in order to interface with it. Once the target device 180 is identified, the master device 110 can automatically interface with the target device 180, as illustrated by arrow 116. The master device 110 may also communicate over a link 104 with a network 102, for example a wide area network such as the Internet, which may be used to assist in identifying the target device 180 or in interfacing with it. Optionally, the master device 110 may interface 116A with the target device 180 indirectly through the network 102.
Fig. 2 is a block diagram illustrating an example of the master device 110. The master device 110 includes a means for imaging, such as the camera 112, and a means for interfacing with a target device, such as a wireless interface 130 for communicating via the wireless link 116 (shown in Fig. 1). The master device 110 may also include a means for sensing motion, such as a motion sensor 118, which may be an accelerometer or gyroscope, to detect motion of the master device 110; detected motion may be used to recognize gestures used as input to the master device 110 by a user. The master device 110 further includes a means for identifying a target device 180, which may be, for example, a master device control unit 120 that includes a processor 122 in communication with a memory 124 containing software 126, and may further include hardware 128 and firmware 129, as well as aspects of a user interface 150 such as a display or keypad 152. The camera 112, the wireless interface 130, and the motion sensor 118 (if used) are connected to the control unit 120. The control unit 120 includes a graphics engine 125, which is illustrated as functionally separate from the processor 122 but may in fact be executed by the processor 122. The graphics engine 125 generates augmentation data that may be displayed over an image of, for example, the target device 180. It will be understood that a processor as used herein can, but need not, include one or more microprocessors, embedded processors, controllers, Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), and the like. The term processor is intended to describe functionality implemented by a system rather than dedicated hardware.
Moreover, the term "memory," as used herein, refers to any type of computer storage medium, including long term, short term, or other memory associated with a mobile platform, and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
The master device 110 also includes a user interface 150 that includes a means for display, such as the display 114, and a user input device 152, such as a keypad, touch screen, or other suitable tactile input device. The user interface 150 may also include a speaker 154 and a microphone 156, such as when the master device 110 is a mobile telephone.
The wireless interface 130 includes any suitable antenna 132, receiver, and transmitter or transceiver to enable the master device 110 to communicate with one or more target devices over the wireless link 116. Optionally, the wireless interface 130 may also have processing capabilities to reduce processing requirements for the processor 122.
Optionally, the master device 110 may include a network interface 140, such as a transceiver with an antenna 142, for communicating over the network 102 (shown in FIG. 1) via the link 104. For example, the master device 110 may provide connectivity to the network 102 (e.g., a wide area network such as the Internet) via a wired or wireless communication link. Accordingly, the master device 110 may enable other target devices 180 (e.g., Wi-Fi stations) to access the network 102. The network interface 140 may be implemented in conjunction with various wireless communication networks, such as a wireless wide area network (WWAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), and so on, as well as with cellular towers and wireless communication access points. The terms "network" and "system" are often used interchangeably. A WWAN may be a Code Division Multiple Access (CDMA) network, a Time Division Multiple Access (TDMA) network, a Frequency Division Multiple Access (FDMA) network, an Orthogonal Frequency Division Multiple Access (OFDMA) network, a Single-Carrier Frequency Division Multiple Access (SC-FDMA) network, a Long Term Evolution (LTE) network, and so on. A CDMA network may implement one or more Radio Access Technologies (RATs), such as cdma2000, Wideband CDMA (W-CDMA), and so on. cdma2000 covers the IS-95, IS-2000, and IS-856 standards. A TDMA network may implement Global System for Mobile Communications (GSM), Digital Advanced Mobile Phone System (D-AMPS), or some other RAT. GSM and W-CDMA are described in documents from a consortium named the "3rd Generation Partnership Project" (3GPP). cdma2000 is described in documents from a consortium named the "3rd Generation Partnership Project 2" (3GPP2). 3GPP and 3GPP2 documents are publicly available. A WLAN may be an IEEE 802.11x network, and a WPAN may be a Bluetooth network, an IEEE 802.15x network, or some other type of network.
The techniques may also be implemented in conjunction with any combination of WWAN, WLAN and/or WPAN.
The master device 110 may optionally include a battery 121 for powering one or more components of the master device 110. The master device 110 may comprise a mobile handset, a personal digital assistant, a laptop computer, a headset, a hands-free device for a vehicle, or any other electronic device. In particular, the teachings herein may be incorporated into (e.g., implemented within or performed by) a wide variety of devices. For example, one or more aspects taught herein may be incorporated into a phone (e.g., a cellular phone), a personal data assistant ("PDA"), a UMPC, an entertainment device (e.g., a music or video device), or any other device that incorporates a camera 112.
The methodologies described herein may be implemented by various means depending on the application. For example, these methodologies may be implemented in hardware 128, software 126, firmware 129, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. The memory may be implemented within the processing unit or external to the processing unit. As used herein, the term "memory" refers to any type of long term, short term, volatile, nonvolatile, or other memory and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
For example, software 126 code may be stored in the memory 124 and executed by the processor 122 to control the operation of the master device 110 as described herein. For example, program code stored in a computer-readable medium, such as the memory 124, may include program code to identify a target device using an image of the target device and program code to interface with the identified target device. The computer-readable medium may include program code to display broadcast data received from a target device and to use the broadcast data to assist in identifying the target device to interface with.
If implemented in firmware and/or software, the functions may be stored on a computer-readable medium as one or more instructions or code. Examples include computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media include physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Fig. 3 is a flow chart of a method of interfacing the master device 110 with a target device 180. The master device 110 may interface with the target device by "pairing," which is the process by which devices register with each other. Once paired, the devices are typically able to communicate with each other whenever they are within range and active, without re-performing the pairing process. For ease of reference, an interface is sometimes referred to herein as a pairing, but it should be understood that forms of interfacing other than pairing are also contemplated herein.
As illustrated in Fig. 3, a target device to be paired is imaged using a master device (202). An image of the target device may be displayed to the user to ensure that the master device is interfacing with the desired target device. The image may be generated using the camera as a still image or as video. The master device identifies the target device from the image (204). Identification may be based on the visual appearance of the target device or on visual markers, such as barcodes or other types of markers, appearing on the target device. The identification may further be based on a broadcast beacon from the target device 180, e.g., based on a tag or frequency, which, for example, allows public access to restricted information such as geographic location, geographic orientation, an ID, or a category, such as, but not limited to, the type of target device (e.g., digital camera, MP3 player, etc.), a sub-category (e.g., point-and-shoot digital camera), a brand (e.g., Sony, Nokia, etc.), a genre of content (e.g., movie, song, etc.), or any combination thereof. The master device 110 may identify the target device 180 by retrieving a match from a lookup table using the visual appearance, a visual marker, or information obtained from the broadcast beacon. In one embodiment, the master device 110 may use available non-visual sensors to narrow the lookup table of potential items, and then perform visual matching of the image against the lookup table using keypoint features or other visual references such as lines. The lookup table may comprise, for example, previously stored images of consumer electronic industrial designs, or 3-dimensional models of such designs. The lookup table may be stored, for example, on the master device 110, on the target device 180, or at a remote location, for example on a server accessed over the network 102 via the link 104 (shown in FIG. 1).
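The two-stage identification described above (narrowing a lookup table with non-visual broadcast data, then visually matching the image) can be sketched as follows. This is purely an illustration, not the claimed method: the table entries, category names, and the integer-set stand-in for keypoint features are all hypothetical.

```python
# Hypothetical lookup table of known device designs. In practice the entries
# would hold stored reference images or 3-D models; integer sets stand in for
# extracted keypoint features here.
LOOKUP_TABLE = [
    {"id": "cam-01", "category": "digital camera", "brand": "sony", "features": {10, 11, 12}},
    {"id": "cam-02", "category": "digital camera", "brand": "nokia", "features": {20, 21, 22}},
    {"id": "frame-01", "category": "digital picture frame", "brand": "sony", "features": {30, 31}},
]

def narrow_by_broadcast(table, category=None, brand=None):
    """Stage 1: restrict candidates using data from the broadcast beacon."""
    return [entry for entry in table
            if (category is None or entry["category"] == category)
            and (brand is None or entry["brand"] == brand)]

def visual_match(candidates, image_features):
    """Stage 2: pick the candidate whose stored keypoint features overlap
    most with the features extracted from the captured image."""
    best, best_score = None, -1
    for entry in candidates:
        score = len(entry["features"] & image_features)
        if score > best_score:
            best, best_score = entry, score
    return best

# The beacon reports a digital camera; the captured image shares keypoints
# with the first stored design.
candidates = narrow_by_broadcast(LOOKUP_TABLE, category="digital camera")
device = visual_match(candidates, image_features={11, 12, 99})
```

The same structure works whether the table resides on the master device, the target device, or a network server; only the lookup transport changes.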
In one embodiment, the identification may be based on visual identification information provided by the target device. For example, a target device may broadcast visual information about itself, from which it can be identified. The target device may broadcast what it looks like or, for example, broadcast an image of, or information about, the image it is currently displaying, e.g., when the target device is a digital picture frame as illustrated in Fig. 9A; this may be useful when multiple identical target devices are present. The master device 110 can then image the target device and compare the image captured by the master device 110 with the broadcast information about the target device, such as the image currently displayed on a digital picture frame, to identify the target device. For example, as illustrated in FIG. 9A, the master device 110 can identify the target device using the industrial design of the digital picture frame 180A. However, if there are multiple similar or identical picture frames, e.g., with the same industrial design as illustrated by picture frames 180A and 180B, but each displaying a different image, the user can select one (180A) in the camera view and the master device 110 can use the currently displayed image to determine which of the target devices 180A, 180B to pair with. The identification may be based on a displayed still image, for example in the case of a digital photo frame, or on displayed video. For video-based identification, a match may be made, for example, based on a sequence of shared video signature features that are a digest of what the video "looks like" over a predetermined time segment, or based on a frame of the video generated at a known time, plus error correction that compares a selected number of frames on each side of the selected frame.
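The displayed-image comparison described above can be sketched as a simple matching loop. This is an assumed illustration only: the hash-like `image_signature` is a toy stand-in for a real perceptual comparison, and all identifiers are hypothetical.

```python
def image_signature(pixels):
    """Toy stand-in for a perceptual signature of an image; a real system
    would use robust image features rather than a checksum."""
    return sum(pixels) % 997

def identify_by_displayed_image(captured_pixels, broadcasting_frames):
    """Return the id of the frame whose broadcast 'currently displayed'
    image matches the image the master device captured, else None."""
    target_sig = image_signature(captured_pixels)
    for frame_id, displayed_pixels in broadcasting_frames.items():
        if image_signature(displayed_pixels) == target_sig:
            return frame_id
    return None

# Two identical picture frames broadcast what they are displaying; only the
# displayed content distinguishes them.
frames = {"180A": [1, 2, 3, 4], "180B": [9, 9, 9, 9]}
match = identify_by_displayed_image([1, 2, 3, 4], frames)
```

For video, the same loop would compare a signature sequence over a time window rather than a single still image.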
These techniques may be used with other target devices such as computers, laptops, cell phones, music players, and the like. In one embodiment, the master device 110 may display the information broadcast by nearby target devices, from which the user may select a desired target device to interface with; this avoids the need for the master device 110 to image the target device. If desired, an image of the target device taken by the master device 110 may be used to filter the number of pairable target devices that the master device 110 displays for selection by the user.
Based on the identification of the target device, the master device may determine whether it may interface with the target device, as illustrated in optional step 206. The master device 110 may indicate on the image of the target device 180 in the display 114 whether it is possible to interface with the target device 180 and give the user the opportunity to indicate whether an interface is desired. For example, an image of the object device 180 may be displayed outlined or illuminated in the display 114 of the master device 110 to indicate pairability. The augmentation of the display of the target device image may include information obtained from the broadcast beacon from the target device 180. The user can choose to pair, unpair, or hide the pairability of the target devices. The pairability may be hidden, for example, once, always, or for all target devices within a particular class, e.g., for all digital picture frames.
The master device 110 may then interface or pair with the target device, either automatically or in response to a user selection (208). Using a user selection may be advantageous when more than one pairable target device appears in the image. As an example, the user selection may be made based on a gesture, e.g., different pairable target devices are highlighted in turn based on the gesture, and the desired target device is selected once highlighted. Additionally, the user may indicate, for example, that pairing is desired, after which the pairing process may be automated. Where pairing is automatic, an intermediate security step may be used to confirm that the user wishes to pair with the target device. Pairing can be optimized for a particular device in a particular location, e.g., by using the lowest-power pairing method suitable for the distance between the master device and the target device. Further, if the relative positions of the master device and the target device change, the pairing method may be changed automatically as necessary. To interface or pair, the master device and the target device must first discover each other, for example by entering a discoverable state in which the devices exchange discovery messages. For example, the master device 110 may request that locally pairable target devices enter a discovery mode. The master device 110 may initiate the request, for example, over the network 102. The network may prompt any pairable target devices that fall within the area of the master device 110 to enter a discovery mode. Alternatively, the master device may first identify the desired target device 180 and then initiate a discovery request, e.g., over the network 102, requesting all similar target devices within range of the master device 110 to enter a discovery mode. Once discovered, the devices can pair with each other. Pairing is, at least in part, a security function that restricts pairing to particular devices.
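The discover-then-pair sequence above, together with the passkey check described next, can be sketched in a minimal form. This is a hedged illustration under assumed names; it models the flow, not any particular protocol stack (Bluetooth and the like implement these steps in their own message formats).

```python
class TargetDevice:
    """Pairable target; the factory-set 4-digit passkey guards pairing."""
    def __init__(self, device_id, passkey):
        self.device_id = device_id
        self.passkey = passkey
        self.discoverable = False
        self.paired_with = None

    def enter_discovery_mode(self):
        self.discoverable = True

    def pair(self, master_id, passkey):
        # Pairing succeeds only when discoverable and the passkey matches.
        if self.discoverable and passkey == self.passkey:
            self.paired_with = master_id
            return True
        return False

class MasterDevice:
    def __init__(self, device_id):
        self.device_id = device_id
        self.paired = set()

    def request_discovery(self, devices):
        """Ask locally pairable targets (e.g., via the network) to become
        discoverable."""
        for device in devices:
            device.enter_discovery_mode()

    def pair_with(self, device, passkey):
        if device.pair(self.device_id, passkey):
            self.paired.add(device.device_id)
            return True
        return False

master = MasterDevice("110")
camera = TargetDevice("180", passkey="0000")
master.request_discovery([camera])
ok = master.pair_with(camera, passkey="0000")
```

Once paired, the registration persists in both objects, mirroring how paired devices need not re-run discovery while in range and active.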
For example, pairing may include a message exchange, which may include a password authentication scheme in which a first device must respond to a second device with a password (such as a 4-digit number, often set for the device at the factory) to prevent unauthorized or undesired pairing. In networking protocols such as Bluetooth, discovery and pairing are separate procedures, but they may be performed together. The actual pairing can be done with any current protocol through the standard practice for that protocol (e.g., Bluetooth, Wi-Fi, IR, etc.).
If desired, the image displayed on the master device 110 may be augmented with augmentation data, e.g., provided by the graphics engine 125, to indicate the status of the target device, e.g., to confirm the pairing as illustrated in optional step 210. In addition, data relating to the target device may be displayed on the master device in optional step 214.
Fig. 4A and 4B illustrate, as an example, pairing the master device 110 with a target device 180, in this case a digital camera, wherein data related to the target device 180 is displayed on the master device 110 after pairing (214). Fig. 4A illustrates the master device 110 imaging (202) the target device 180, with the target device image 180image shown on the display 114 of the master device 110. Based on the target device image 180image, the master device 110 pairs 116 with the target device 180 and displays data related to the target device 180 on the display 114 of the master device 110; in this case the data is a photograph 182 taken by the target device 180, as illustrated in Fig. 4B. The data related to the target device 180 that is displayed by the master device 110 may be selected by the user or may be displayed automatically.
The data related to the target device 180 may be information about the target device. For example, the data may relate to a history or attributes of the target device, help files describing how to perform tasks related to the target device, or other such data. Alternatively, the related data may come from the target device itself, such as photographs from a digital camera, a list of files such as music or movies, or other information stored in the target device. Additionally, the related data may be an application or an interface with the target device. By way of example, Figs. 5A and 5B illustrate the master device 110 interfacing with a target device 180 in the form of a USB flash drive, with the display of the master device 110 augmented using data 180data related to the target device 180. As illustrated in Fig. 5B, after pairing 116, the related data 180data is displayed; the data may be a list of files stored on the flash drive, or even open files stored on the flash drive. Thus, the master device 110 may provide a user interface for the target device 180 when the target device 180 itself has no user interface.
Fig. 6A, 6B, 6C, and 6D illustrate another example of the master device 110 interfacing with a target device 180, in the form of a watch, where augmentation data 180data related to the target device 180 is displayed on the master device 110 over the target device image 180image. As illustrated, this related data 180data may be displayed by the master device 110 closely aligned with the target device image 180image (Fig. 6A), loosely aligned (Fig. 6B), or unaligned (Figs. 6C, 6D). As illustrated in Fig. 6D, the relevant data from the target device 180 may be displayed over the entire display of the master device 110, for example by tracking keypoints in the image 180image and then locking the graphical augmentation to the image.
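The idea of locking augmentation to tracked keypoints can be sketched as follows. This is only an assumed illustration: placing the overlay at the keypoint centroid is a simplification chosen here, not the patented alignment method, and all names are hypothetical.

```python
def overlay_anchor(keypoints):
    """Re-derive the overlay position each frame from the tracked keypoints
    by placing it at their centroid, so the graphic follows the target
    device image as it moves in the camera view."""
    xs = [x for x, _ in keypoints]
    ys = [y for _, y in keypoints]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Keypoints on the target device image in two successive camera frames;
# between frames the device shifts right by 5 and down by 2.
frame1 = [(10, 10), (30, 10), (20, 30)]
frame2 = [(x + 5, y + 2) for x, y in frame1]
anchor1 = overlay_anchor(frame1)
anchor2 = overlay_anchor(frame2)
```

Because the anchor is recomputed from the same tracked points, the overlay translates exactly with the imaged device, which is the locking behavior Fig. 6D describes.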
The master device 110 may be used to provide a user interface for a target device that has limited user interface capabilities, or simply to provide a user interface with functionality extended relative to what the target device's own user interface allows. For example, Figs. 7A and 7B illustrate the pairing of the master device 110 with a target device 180 having limited user interface capabilities. The target device 180 in Figs. 7A and 7B is in the form of an MP3 player, which may have a physical user interface but no screen to provide a graphical user interface (GUI). As shown in Fig. 7A, the master device 110 pairs with the target device 180 by imaging the target device 180. As shown in Fig. 7B, after pairing 116, the data displayed on the master device 110 related to the target device 180 is a GUI that can be used, for example, to enable additional functionality of the MP3 player.
Fig. 8A and 8B illustrate the master device 110 interfacing with a target device 180 where the master device 110 provides extended functionality. The target device 180 in Figs. 8A and 8B is in the form of an MP3 player with a graphical user interface. As shown in Fig. 8A, the master device 110 pairs with the target device 180 by imaging the target device 180. As shown in Fig. 8B, after pairing 116, the data displayed on the master device 110 related to the target device 180 is a GUI, and the interface provides expanded functionality, such as a touch screen to facilitate easier user interaction, a larger screen, allowance for more natural control gestures, or additional capabilities, such as creating a playlist.
Accordingly, various types of data related to the target device 180 may be displayed by the master device 110. For example, the data may be passive, such as the information illustrated in Fig. 6A, or interactive, such as the music interface displayed in Fig. 7B. Further, functionality exposed by the target device may be provided either directly or through a remote source, such as the network 102 shown in FIG. 1. As an example, in Figs. 5A and 5B, the data 180data may come from the network 102 rather than from the target device 180 itself. Illustratively, when the USB flash drive 180 shown in FIG. 5A is inserted into a computer, the data 180data may be synchronized to the network. When the master device 110 is paired with the USB flash drive 180, the master device 110 may actually be paired with the synchronized data in the network 102. In another example, MP3 player functionality is extended where a copy of the data on the MP3 player is kept on the network 102. Since extended functionality (e.g., making playlists) may not be possible on the MP3 player itself, the master device 110 may be used to manipulate the data on the network 102. Once this interaction is complete (e.g., a new playlist is made), the network 102 synchronizes with the MP3 player (target device) 180. If desired, the interaction between the master device 110 and the target device 180 may persist between augmentation calls. For example, the displayed augmented data may be kept in the same state it was in when last viewed. In addition, the master device 110 may provide commands or data to the target device 180; in other words, the target device 180 may be controlled through its interface with the master device 110. The data may be stored on the master device 110, on the target device 180, or at a remote location, such as a server accessed via the network 102 (FIG. 1).
In addition, the master device 110 may also interface with multiple target devices. For example, as illustrated in figs. 9A, 9B, and 9C, the master device 110 is used to interface with two target devices, illustrated as a digital camera 180 and a digital picture frame 180A. The master device 110 may be used to retrieve data from one target device (e.g., the camera 180) and pass the data to another target device, such as the digital picture frame 180A. Fig. 9A illustrates the master device 110 pairing with the camera 180 by imaging the camera 180. The digital picture frame 180A may have been, or may subsequently be, paired 116A in a similar manner, as shown in fig. 9B. Fig. 9B illustrates that after the pairing 116 between the master device 110 and the camera 180, data is received by the master device 110 from the camera 180 and displayed by the master device 110. The selected data may then be transferred from the master device 110 to a second target device, e.g., the digital picture frame 180A, and may be displayed, stored, or otherwise used by the second target device, as shown in fig. 9C. This data transfer may be automatic or may be initiated by user interaction, including a selection on a touch screen or a user gesture detected by motion sensors 118 in the master device 110, such as an accelerometer or gyroscope. To initiate data transfer using gestures, the relative positions of one or more of the target devices 180, 180A with respect to each other and/or the master device 110 may be determined in advance, for example by detecting a change in position or orientation of the master device 110, using the motion sensors 118, between imaging the two target devices 180, 180A. Alternatively, pixel flow tracking may be used to visually determine the relative locations of the target devices 180, 180A.
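One way the gesture-directed transfer above could work is to record a bearing for each target device at the moment it is imaged, then match a later flick gesture to the nearest recorded bearing. This is a hedged sketch under that assumption; the function names and the bearing representation are invented for illustration.

```python
# Hypothetical sketch: choosing a transfer destination from a gesture direction.
# A bearing (degrees) is recorded via the master device's motion sensors when
# each target device is imaged; a later gesture is matched to the nearest one.

def angular_distance(a, b):
    # Smallest angle between two bearings, handling the 0/360 wraparound.
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def pick_destination(gesture_bearing, device_bearings):
    # device_bearings maps a device name to its recorded bearing.
    return min(device_bearings,
               key=lambda name: angular_distance(gesture_bearing,
                                                 device_bearings[name]))

bearings = {"camera_180": 15.0, "frame_180A": 190.0}
print(pick_destination(185.0, bearings))  # frame_180A
```

A flick at roughly 185 degrees resolves to the picture frame, so the selected photos would be sent there rather than back to the camera.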
In one embodiment, the master device 110 may be used to facilitate pairing between two or more target devices by interfacing with the target devices and initiating pairing between them, which may be particularly useful when the user interfaces of the target devices make direct pairing of those devices difficult. For example, as illustrated in figs. 10A, 10B, and 10C, the master device 110 is used to initiate an interface between two target devices, illustrated as a digital camera 180 and a digital picture frame 180A. Fig. 10A illustrates the master device 110 pairing 116 with the camera 180 by imaging the camera 180. Fig. 10B shows the master device 110 pairing 116A with a second target device, illustrated as the digital picture frame 180A, by imaging the digital picture frame 180A. With the two target devices 180, 180A paired with the master device 110, as illustrated in fig. 10C, the master device 110 may be used to initiate a pairing 116B between the two target devices 180, 180A. The pairing of the two target devices 180, 180A may be done automatically; in response to user interaction, such as a selection through the user interface of the master device 110 or a gesture between the target devices 180, 180A where the master device 110 knows their relative locations; or by using a protocol specification from each target device 180, 180A. The pairing 116B between the target devices 180, 180A may persist when the master device 110 is no longer present.
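The broker role described above can be sketched as the master relaying each target's pairing details to the other and then dropping out. This is a simplified illustration under assumed names (`Target`, `broker_pairing`); a real implementation would involve an actual wireless protocol handshake.

```python
# Hedged sketch of master-brokered pairing between two target devices.
# Names and the compatibility check are illustrative assumptions only.

class Target:
    def __init__(self, name, protocol):
        self.name = name
        self.protocol = protocol
        self.peers = set()

    def accept_peer(self, peer_name, peer_protocol):
        # A real device would verify credentials and protocol compatibility.
        if peer_protocol == self.protocol:
            self.peers.add(peer_name)
            return True
        return False

def broker_pairing(target_a, target_b):
    # The master holds each target's protocol specification from its own
    # pairing and relays it to the other target.
    ok_a = target_a.accept_peer(target_b.name, target_b.protocol)
    ok_b = target_b.accept_peer(target_a.name, target_a.protocol)
    return ok_a and ok_b  # the resulting pairing persists without the master

camera = Target("camera_180", "btle")
frame = Target("frame_180A", "btle")
print(broker_pairing(camera, frame))  # True
```

Once `broker_pairing` returns, each target holds the other in its peer set, so the camera-to-frame link survives even after the master device leaves the scene.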
In another embodiment, as illustrated by the flow diagram shown in fig. 11, the master device 110 may image a passive target 190, i.e., a target that cannot be paired or otherwise interfaced with (302), and retrieve and display data related to the target. The master device 110 may identify the passive target (304), then retrieve data related to the identified target and display that data on the master device 110 (306). The passive target 190 may be any target that can be registered by the camera 112 of the master device 110, including devices without connectivity, such as a watch, a stapler, or a car; devices with unutilized connectivity, e.g., cameras with Bluetooth; objects such as a pencil, stone, or table; printed or digital media, such as magazines, screenshots, and the like; body parts, such as a wrist or a foot; or other registrable features, such as a color or texture.
The passive target 190 may be identified from its visual appearance. The identification may be based on a specific image of the passive target, e.g., an image previously taken by the user, a textured 3-dimensional model of the passive target, or a general category of images, such as a generic image of a watch rather than an image of a particular brand of watch. The user may provide additional information to define the image, such as labeling the image as a "watch," or providing a PIN for a particular target or class of targets. Additional information provided by the user may be entered once, for example during training of the master device 110, with subsequent identification using the entered information being automatic. In addition, passive targets may include visual markers that are imaged and used to narrow the identification search, for example by category, subcategory, brand, genre, or any combination thereof. The identification may be performed by the master device 110 or remotely, such as over the network 102 (fig. 1). The image of the passive target displayed on the master device 110 may indicate that the passive target has been identified by using an augmentation of the image, for example by illuminating or outlining the passive target. If desired, the user may select the passive target to be identified and the type of data to be displayed, or the data may be displayed automatically. The selection of the passive target to be identified may be performed once by the user, with subsequent imaging of the passive target resulting in automatic identification. The relationship between the passive target and the retrieved data may be user defined, externally defined, or automatically defined. Further, the data may be stored on the master device 110 or on a remote source and retrieved by the master device 110, for example over the network 102 (FIG. 1). The related data may take the form of launching an application, such as a calendar application.
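The marker-assisted identification above amounts to a two-stage lookup: filter candidates by the marker's category, then pick the best appearance match. The sketch below illustrates that flow with toy feature vectors standing in for real image descriptors; the catalog entries and the nearest-neighbour metric are assumptions for illustration, not anything this document specifies.

```python
# Illustrative sketch: narrow the identification search with a visual marker,
# then match on appearance. Feature tuples stand in for image descriptors.

CATALOG = [
    {"name": "watch_generic", "category": "watch",  "features": (0.9, 0.1)},
    {"name": "stapler",       "category": "office", "features": (0.2, 0.8)},
    {"name": "users_watch",   "category": "watch",  "features": (0.95, 0.05)},
]

def identify(features, marker_category=None):
    # Stage 1: the visual marker, if present, narrows the candidate set.
    candidates = [c for c in CATALOG
                  if marker_category is None or c["category"] == marker_category]
    if not candidates:
        return None
    # Stage 2: nearest-neighbour match on the (stand-in) appearance features.
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c["features"], features))
    return min(candidates, key=dist)["name"]

print(identify((0.94, 0.06), marker_category="watch"))  # users_watch
```

Narrowing first keeps the expensive appearance comparison confined to a small candidate set, which matters when identification runs on-device rather than over the network 102.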
Alternatively, an application may be retrieved or unlocked based on the image of the passive target. The related data may be information related to the passive target, such as instructions or help information. The image of the passive target may also provide access to the related data or initiate a download of the related data. For example, imaging the cover of a book may automatically download an audio book or a digital book. The related data may be any medium or combination of media and may include, but is not limited to, two-dimensional or three-dimensional images, animations, video, sound, haptic feedback, or other sensory data.
As an example, a user may define within the master device 110 an image of the user's watch 190 to automatically launch a calendar application on the master device 110 and display the user's calendar, as illustrated in fig. 12. Alternatively, the user may personalize the relationship between the passive target and the retrieved data, such as using an image of the user's wedding ring to open images from the user's wedding. The master device 110 may be trained to identify an individual's passive target, such as the user's watch, ring, stapler, car, etc., by a process that includes imaging the passive target and inputting a device name and the data to be associated with the passive target, which the master device 110 then associates with that target.
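The one-time training step above reduces to storing an association from a recognized target label to an action, then firing that action on later sightings. A minimal sketch, with all names (`train`, `on_image_identified`) invented for illustration:

```python
# Sketch of the training/association step: a passive target's label is bound
# once to an action, which then runs automatically on later identifications.

associations = {}

def train(target_label, action):
    # One-time user step, e.g., "my watch" -> launch the calendar app.
    associations[target_label] = action

def on_image_identified(target_label):
    # Subsequent sightings trigger the stored action automatically.
    action = associations.get(target_label)
    return action() if action else None

train("users_watch", lambda: "calendar_app_launched")
print(on_image_identified("users_watch"))  # calendar_app_launched
```

An unrecognized label simply returns nothing, matching the described behavior that only user-trained targets trigger an automatic action.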
In addition, the related data retrieved for the passive target 190 may be defined externally or automatically. For example, figs. 13A and 13B illustrate an example where the master device 110 images a passive target 190, such as an unpaired DVR (fig. 13A), and related data in the form of operating instructions 192 is retrieved from the network 102 and automatically displayed by the master device 110 (fig. 13B). The particular DVR is identified from the image, and the operating instructions or other information about that DVR can be obtained from the network 102 without any connection to or pairing with the DVR. As illustrated in fig. 13B, the operating instructions 192 may be displayed in close alignment with the image.
FIGS. 14A and 14B illustrate an example where the master device 110 images a passive target 190, such as an advertisement for a movie, in this example a logo for the movie Dark (FIG. 14A), and related data 192 associated with Dark is retrieved and displayed. For example, as shown in FIG. 14B, the Dark movie may be automatically retrieved and displayed on the master device 110. Alternatively, other information related to the imaged target may be displayed, such as the locations and times at which Dark is showing. If desired, identification of the passive target 190 may be based on visual appearance, e.g., the logo itself, or may be based in part or in whole on a visual marker 194 that assists in the identification of the passive target.
In another embodiment, the master device 110 may interface with a pairable target device 180 and provide to the target device 180 data obtained from an image of another target, where that other target may be pairable or passive. In one embodiment, there may be more than one target from which data is obtained, and/or more than one receiving target device 180. For example, as illustrated in figs. 15A, 15B, and 15C, the master device 110 may image a passive target 190, such as the Dark advertisement (fig. 15A), and data is identified and obtained based on the image of the passive target, as discussed above. A user may select data to be delivered to a paired (116) target device 180, such as a television (fig. 15B), and the data (e.g., a movie) is automatically delivered to the target device 180 (fig. 15C). The data may be retrieved by the master device 110 and passed to the target device 180 or, alternatively, the master device 110 may provide instructions to the target device 180 to retrieve the data itself.
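The two delivery modes in the passage above, the master pushing the data itself versus instructing the target to fetch it, can be sketched as a single function with a switch. All names and the payload format below are placeholders, not anything specified by this document.

```python
# Sketch of the two delivery modes: push the retrieved data, or send the
# paired target a fetch instruction so it retrieves the data itself.

class PairedTarget:
    """Stand-in for a paired target device, e.g., the television 180."""
    def __init__(self):
        self.received = []

    def deliver(self, payload):
        self.received.append(payload)

def deliver_media(target, media_ref, push=True):
    if push:
        # Mode 1: the master retrieves the data, then pushes it over.
        data = f"bytes_of({media_ref})"
        target.deliver(data)
    else:
        # Mode 2: the master only sends an instruction; the target fetches.
        target.deliver(("fetch", media_ref))

tv = PairedTarget()
deliver_media(tv, "movie_dark", push=False)
print(tv.received[0])  # ('fetch', 'movie_dark')
```

The instruction mode keeps large media off the master device entirely, which is the sensible choice when the target, like a television, has its own network connection.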
Thus, the master device 110 may be used to communicate commands to the object device 180 based on images of one or more passive or pairable object devices. For example, the master device 110 may image a particular brand of running shoes, then image an mp3 player and automatically download a particular playlist, such as sports music, to the mp3 player in response thereto. In one embodiment, the master device 110 downloads data, such as music, and communicates the data to the target device, or alternatively instructs the target device to retrieve the data itself. To make multiple selections, the camera may be aimed at these targets for a brief period of time, or the user may interact with the image, for example, by pressing or holding a button, or using a gesture to indicate to start or stop multiple selections, etc. The user may select the target device in the image, the target device may be automatically selected, or the user may select the target device once and the subsequent selection is automatic. The actions that occur in response to the multiple selections may be user-defined or externally defined.
Although the present invention is illustrated in connection with specific embodiments for instructional purposes, the present invention is not limited thereto. Various adaptations and modifications may be made without departing from the scope of the invention. Therefore, the spirit and scope of the appended claims should not be limited to the foregoing description.

Claims (39)

1. An interface method, comprising:
imaging a target device with a master device and displaying an image of the target device on the master device;
identifying the target device using the image of the target device;
interfacing with the target device through the master device based on the identity of the target device; and
displaying a user interface for the target device on the master device for controlling the target device with the master device, wherein the user interface for the target device for controlling the target device is displayed on an image of the target device on the master device.
2. The method of claim 1, further comprising displaying data related to the target device on the master device.
3. The method of claim 1, further comprising:
identifying the target device as pairable using the image of the target device; and
pairing the master device with the target device.
4. The method of claim 1, wherein the method further comprises:
interfacing with a target device through the master device;
wherein interfacing with the target device comprises retrieving relevant data from the target device; and
communicating the data to the target device.
5. The method of claim 1, further comprising:
pairing the master device with the target device;
pairing the master device with a second target device; and
pairing the target device with the second target device using the master device.
6. The method of claim 1, further comprising:
receiving broadcast data from the target device, the broadcast data including information about a visual appearance of the target device;
wherein the target device is identified by comparing the image of the target device with the broadcast data from the target device.
7. The method of claim 6, wherein the information about the visual appearance of the target device comprises at least one of an image and a textured 3-dimensional model of the target device.
8. The method of claim 1, further comprising:
receiving broadcast data from a plurality of target devices, the broadcast data including information about a visual appearance of the target devices;
wherein identifying the target device using the image of the target device comprises:
filtering the broadcast data using the image of the target device to produce a broadcast data subset;
displaying the subset of broadcast data;
identifying the target device by a selection made by a user from a subset of the displayed broadcast data.
9. The method of claim 1, further comprising interfacing the target device with a wide area network through the master device.
10. An interface method, comprising:
receiving broadcast data from a target device, the broadcast data including information about a visual appearance of the target device, the visual appearance information of the target device including an image of the target device resulting from imaging the target device;
displaying the broadcast data on a master device;
interfacing with the target device based on a selection of the displayed broadcast data from the target device; and
displaying a user interface for the target device on the master device for controlling the target device with the master device, wherein the user interface for the target device for controlling the target device is displayed on an image of the target device on the master device.
11. The method of claim 10, wherein the broadcast data includes information about an image displayed by the target device.
12. The method of claim 10, further comprising:
imaging the target device with the master device; and
identifying the target device by comparing an image of the target device with the received broadcast data.
13. The method of claim 10, further comprising interfacing the target device with a wide area network through the master device.
14. An interface device, comprising:
a camera operable to image the target device;
a wireless transceiver capable of transmitting and receiving wireless signals to and from the target device;
a processor connected to the camera and the wireless transceiver;
a memory connected to the processor;
a digital display coupled to the processor;
software held in the memory and running in the processor to cause the processor to identify the target device using an image provided by the camera and interface with the identified target device through the wireless transceiver, and to display a user interface for the target device on the digital display for controlling the target device, wherein the software causes the processor to display a user interface for the target device on the image of the target device displayed on the digital display for controlling the target device.
15. The apparatus of claim 14, wherein data related to the target device is displayed on the digital display.
16. The apparatus of claim 15, wherein the data related to the target device is received by the wireless transceiver from the target device.
17. The apparatus of claim 15, wherein the apparatus further comprises a network interface transceiver coupled to the processor, wherein the data related to the target device is received by the network interface transceiver.
18. The apparatus of claim 14, wherein the software causes the processor to identify the target device as pairable using an image of the target device and to pair with the target device.
19. The apparatus of claim 14, wherein the software causes the processor to interface with a target device, wherein interfacing with the target device comprises retrieving relevant data from the target device and transmitting the data to the target device via the wireless transceiver.
20. The apparatus of claim 14, wherein the software causes the processor to pair with the target device, pair with a second target device, and pair the target device with the second target device.
21. The apparatus of claim 14, wherein the wireless transceiver receives broadcast data from the target device and provides the broadcast data to the processor, the broadcast data including information about a visual appearance of the target device, the software causing the processor to identify the target device by comparing the image provided by the camera with the broadcast data received from the target device.
22. The apparatus of claim 21, wherein the information about the visual appearance of the target device comprises at least one of an image and a textured 3-dimensional model of the target device.
23. The apparatus of claim 14, further comprising a digital display coupled to the processor and a user interface coupled to the processor, wherein the wireless transceiver receives broadcast data from a plurality of target devices and provides the broadcast data to the processor, the broadcast data including information about a visual appearance of the target devices, the software causing the processor to identify the target devices by filtering the broadcast data using the images provided by the camera to produce a subset of broadcast data, displaying the subset of broadcast data on the digital display, and identifying the target devices from the subset of broadcast data based on input from the user interface.
24. The apparatus of claim 14, wherein the apparatus further comprises a network interface transceiver coupled to the processor, wherein the software causes the processor to interface the target device with a wide area network through the network interface transceiver.
25. An interface device, comprising:
a wireless transceiver capable of transmitting and receiving wireless signals to and from a target device, the wireless transceiver receiving broadcast data from a target device, the broadcast data including information about a visual appearance of the target device, the information about the visual appearance of the target device including an image of the target device resulting from imaging the target device;
a digital display;
a user interface;
a processor connected to the digital display and user interface;
a memory connected to the processor; and
software held in the memory and running in the processor to cause the processor to display the broadcast data on the digital display and to interface with the target device based on input from the user interface in response to the displayed broadcast data, and to display a user interface for the target device on the digital display for controlling the target device, wherein the software causes the processor to display the user interface for the target device on an image of the target device displayed on the digital display for controlling the target device.
26. The apparatus of claim 25, wherein the broadcast data includes information about an image displayed by the target device.
27. The apparatus of claim 25, wherein the wireless transceiver receives broadcast data from a plurality of target devices, the apparatus further comprising a camera operable to image the target devices, the software causing the processor to compare an image of the target devices with the received broadcast data, filter the broadcast data using the image provided by the camera to produce a subset of broadcast data, and display the subset of broadcast data on the digital display.
28. The apparatus of claim 25, wherein the apparatus further comprises a network interface transceiver coupled to the processor, wherein the software causes the processor to interface the target device with a wide area network through the network interface transceiver.
29. A system for interfacing between a master device and a target device, comprising:
means for imaging a target device with the master device;
means for displaying an image of the target device on the master device;
means for identifying the target device using the image of the target device;
means for interfacing with the identified target device; and
means for displaying a user interface for the identified target device to control the identified target device with the master device, wherein the means for displaying a user interface for the target device is for displaying a user interface for the identified target device on an image of the target device with the master device.
30. The system of claim 29, further comprising means for obtaining data related to the target device, wherein the means for displaying the image of the target device displays the data related to the target device.
31. The system of claim 29, wherein the means for interfacing pairs the master device with the target device.
32. The system of claim 29, wherein the means for interfacing with the identified target device further interfaces with a target device different from the target device; the system further comprises means for retrieving relevant data from the identified target device; wherein the means for interfacing communicates the retrieved data to the target device.
33. The system of claim 29, wherein the means for interfacing pairs the master device with the object device and pairs the master device with a second object device, the system further comprising means for pairing the object device with the second object device through the master device.
34. The system of claim 29, further comprising:
means for receiving broadcast data from the target device, the broadcast data comprising information about a visual appearance of the target device;
wherein the means for identifying a target device compares the image of the target device with the broadcast data from the target device.
35. The system of claim 34, wherein the information about the visual appearance of the target device comprises at least one of an image and a textured 3-dimensional model of the target device.
36. The system of claim 29, further comprising:
means for receiving broadcast data from a plurality of target devices, the broadcast data comprising information about a visual appearance of the target devices;
wherein the means for identifying a target device comprises:
means for filtering the broadcast data using the image of the target device to generate a subset of broadcast data, wherein the means for displaying displays the subset of broadcast data;
means for selecting the target device based on the displayed subset of broadcast data.
37. The system of claim 29, further comprising means for interfacing the target device to a wide area network.
38. An interface method, comprising:
identifying a target device using an image of the target device imaged to the target device;
interfacing with the identified target device; and
displaying a user interface for the identified target device for controlling the identified target device, wherein the user interface for the identified target device for controlling the identified target device is displayed on an image of the target device on a master device.
39. An interface device, comprising:
means for identifying a target device using an image of the target device imaged to the target device;
means for interfacing with the identified target device; and
means for displaying a user interface for the identified target device for controlling the identified target device, wherein the means for displaying a user interface for the identified target device for controlling the identified target device displays the user interface for the identified target device on an image of the target device on a master device.

Applications Claiming Priority (3)

US 61/226,545, priority date 2009-07-17
US 61/226,552, priority date 2009-07-17
US 12/786,886, filed 2010-05-25

Publications (2)

HK1174751A, published 2013-06-14
HK1174751B, published 2018-02-15
