WO2012057764A1 - Method of generating a photo album - Google Patents
Method of generating a photo album
- Publication number
- WO2012057764A1 (PCT/US2010/054481)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- image
- sub
- collection
- computing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
- G11B27/322—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00132—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
- H04N1/00137—Transmission
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00132—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
- H04N1/00167—Processing or editing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00132—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
- H04N1/00185—Image output
- H04N1/00196—Creation of a photo-montage, e.g. photoalbum
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3212—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
- H04N2201/3214—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a date
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3212—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
- H04N2201/3215—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a time or duration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3225—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
- H04N2201/3253—Position information, e.g. geographical position at time of capture, GPS data
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Television Signal Processing For Recording (AREA)
- Processing Or Creating Images (AREA)
Abstract
A method of generating a photo album comprising, with a processor (125, 210), accessing metadata (300) associated with individual images within a collection of digital images, running an event detection device (305) that separates the collection of images into smaller sub-sets of images (320) by using the metadata, uploading a first sub-set of images (325) to a computing device (105, 205), and with the processor (125, 210), simultaneously conducting a feature extraction process (330) while uploading (335) to the computing device (105, 205) a subsequent sub-set of images within the collection of digital images.
Description
Method of Generating a Photo Album
BACKGROUND
[0001] Digital photography has created new possibilities for both the novice and professional photographer. One of the advantages to the consumer has come in the size and amount of photographs that can be taken. Today's consumer digital cameras can take a digital photograph with very high resolution and image quality. The higher the resolution of the image, the more data is used to represent the image. Additionally, the number of pictures that can be taken has increased with the increase in data storage accessible to the digital camera. Large data storage devices are readily available that allow hundreds or even thousands of pictures, even at high resolution, to be stored on a digital camera.
[0002] However, a consumer can have too much data to upload if and when the consumer wishes to share, print, edit, organize or otherwise utilize the digital photographs. In attempting to upload these photos, consumers find themselves waiting longer periods of time before anything can be done to manipulate their photographs. Indeed, as the amount of time to upload a set of digital photographs increases, the consumer may decide to discontinue the project. This may be particularly true where the user is using a public kiosk system to upload their photographs if the waiting line causes frustration.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] The accompanying drawings illustrate various examples of the principles described herein and are a part of the specification. The illustrated examples are merely examples and do not limit the scope of the claims.
[0004] Fig. 1 is a diagram of one illustrative photo album generation system for uploading digital photographs and generating a photo album, according to one example of principles described herein.
[0005] Fig. 2 is a diagram of another illustrative photo album generation system for uploading digital photographs and generating a photo album, according to one example of principles described herein.
[0006] Fig. 3 is an illustrative flowchart depicting a method of generating a photo album on-the-fly, according to one example of principles described herein.
[0007] Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
DETAILED DESCRIPTION
[0008] The following specification discloses a number of illustrative systems and methods for simultaneously uploading and editing digital images. More specifically, the following specification discloses illustrative systems and methods for generating a photo album on a photo album generation system by taking advantage of an event detection device to group any number of images being uploaded into smaller groups, upload a first smaller set of images, and manipulate that smaller set of images while all other images or sets of images are being uploaded to the photo album generation system. The event detection device may be based on a number of factors including, for example, the individual time stamps of each image, the contents of each image, and any metadata associated with the image.
[0009] As noted above, advancements in digital cameras have resulted in an increase in the resolution of digital photographs that can be taken as well as the number of photographs that can be stored on a data storage device. With the increase in resolution of each image taken comes an increase in the amount of data associated with each of the images. Accordingly, this can increase the time to upload images, for example when a user attempts to upload those images to a photo album creation application.
[0010] Because of the time taken to transfer data, when a user has wanted to generate a photo album, the user may first upload all of the digital images for the photo album and then wait an additional period of time for those images to be processed before the user is allowed to begin working on the collection, e.g., adjust the various features or layout of the individual images. The extra time taken to first upload images before anything else is done can be a deterrent to working on the photo album.
[0011] As used in the present specification and in the appended claims, the term "photo album" is meant to be understood broadly as any collection of individual digital images meant, by the user, to be contained in a single collection.
[0012] Further, as used in the present specification and in the appended claims, the term "metadata" is meant to be understood broadly as data that provides information about or documentation of other data managed within an application or environment. For example, metadata may be associated with and provide information regarding the data composing a digital image.
[0013] Still further, as used in the present specification and in the appended claims, the term "image" is meant to be understood broadly as any visual representation of a subject or digital data that can be used to produce such a visual representation of a subject. Users commonly take images of such things as people, places, events (e.g. weddings), etc.
[0014] Even further, as used in the present specification and in the appended claims, the term "user computing device" or "computing device" is meant to be understood broadly as any device capable of receiving, storing, and manipulating data. For example, a user computing device may receive digital data representing an image, store, at least temporarily, that data, and provide a means for a user to manipulate that data to produce, for example, a photo album.
[0015] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the principles disclosed, including illustrative systems and methods. However, the principles disclosed herein may be practiced without these specific details. The various instances of the phrase "in one example" or similar phrases in various places in the specification are not necessarily all referring to the same example.
[0016] Turning now to Fig. 1, a diagram of one illustrative photo album generation system (100) for uploading digital photographs and generating a photo album is shown. The photo album generation system (100) comprises a user computing device (105) that has access to a web page (110) on a web page server (115). In the present example, for the purposes of simplicity in illustration, the user computing device (105) and the web page server (115) are separate computing devices communicatively coupled to each other through a mutual connection to a network (120), for example, the Internet.
[0017] However, the principles set forth in the present specification extend equally to any alternative configuration in which a user computing device (105) performs the functions of both the user computing device (105) and web page server (115), such as in a photo kiosk. Additionally, alternative examples within the scope of the principles of the present specification include, but are not limited to, examples in which the functionality of the user computing device (105) is implemented by multiple interconnected computers (for example, a server in a data center and a user's client machine) and examples in which the user computing device (105) and the web page server (115) communicate directly through a bus without intermediary network devices.
[0018] Indeed, the principles set forth in the present specification extend equally to examples where the user computing device (105) performs an initial analysis of the images being uploaded, groups various images based on an event detection device being used, uploads a first image or set of images, allows the user to interact with or otherwise edit the uploaded image or set of images, and continues to upload other images or groups of images while the user is engaged in editing the first group of images. Therefore, all of these tasks may also be accomplished at, for example, a kiosk-type computing system which may allow a user to generate a photo album without any connection to a specific web site or web page server.
[0019] Returning to Fig. 1, the user computing device (105) of the present example is a computing device that retrieves the web page (110) hosted by the web page server (115) and allows a user to, through interaction with the web page (110), upload any number of digital images. In the present example, this is accomplished by the user computing device (105) requesting the web page (110) from the web page server (115) over the network (120) using the appropriate network protocol (e.g., Internet Protocol ("IP")). Specifically, the processor (125) associated with the user computing device (105) allows the user to, through, for example, a Flash or JavaScript application, upload any number of images via any number of threads. Illustrative processes for uploading and simultaneously editing digital photographs are set forth in more detail below.
[0020] To achieve its desired functionality, the user computing device (105) includes various hardware components. The user computing device (105) may therefore include at least one processing unit (125), at least one data storage device (130), peripheral device adapters (135), and a network adapter (140). These hardware components may be interconnected through the use of one or more busses and/or network connections.
[0021] The processor (125), as described above, may interpret and execute the instructions to access the web page (110). In one illustrative example, the user computing device (105) may comprise two or more processors (125); one processor (125) being used to upload a number of images while another processor (125) is used to conduct a feature extraction process of each individual digital image. In another illustrative example, the user computing device (105) may comprise a single processor (125) which simultaneously uploads a number of digital images and runs the feature extraction process. In still another illustrative example, any number (n) of processors (125) may perform the feature extraction process while, simultaneously, any number (m) of processors are conducting the uploading process.
[0022] The data storage devices (130, 150) may digitally store data implemented and produced by the processing units (125, 155) of the user computing device (105) and web page server (115). The data storage devices (130, 150) may include various types of memory modules, including volatile and nonvolatile memory. For example, the data storage devices (130, 150) of the present example include Random Access Memory (RAM), Read Only Memory (ROM), and Hard Disk Drive (HDD) memory. Many other types of memory are available in the art, and the present specification contemplates the use of many varying types of memory in the data storage devices (130, 150) as may suit a particular application of the principles described herein. In certain examples, different types of memory in the data storage devices (130, 150) may be used for different data storage needs. For example, in certain examples the processing units (125, 155) may boot from ROM, maintain nonvolatile storage in HDD memory, and execute program code stored in RAM. In one illustrative example, the data storage device (150) of the web page server (115) may store the images uploaded by the user for the user to edit.
[0023] The hardware adapters (135, 140) in the user computing device (105) enable the processing unit (125) to interface with various other hardware elements, external and internal to the user computing device (105). For example, peripheral device adapters (135) may provide an interface to input/output devices (145) to create a user interface and/or access external sources of memory storage. Additionally, peripheral device adapters (135) may provide an interface between the user computing device (105) and an external data storage device. For example, a peripheral device adapter (135) may provide an interface by which the user computing device (105) and the processor (125) may access an internal data storage device communicatively coupled with a digital camera. Still further, peripheral device adapters (135) may provide an interface to a removable data storage device such as, for example, a memory card, a flash memory, floppy disks, compact disks, USB drives, among others.
[0024] Peripheral device adapters (135) may also create an interface between the processing unit (125) and a printer or other media output device. A network adapter (140) may additionally provide an interface to the network (120), thereby enabling the transmission of data to, and receipt of data from, other devices on the network (120), including the web page server (115).
[0025] Turning now to Fig. 2, a diagram of another illustrative photo album generation system (200) for uploading digital photographs and generating a photo album is shown. As mentioned previously, the principles set forth in the present specification extend equally to all examples where the computing device (205) performs an initial analysis of the images being uploaded, groups various images based on an event detection device being used, uploads a first image or set of images, allows the user to interact with or otherwise edit the uploaded image or set of images, and continues to upload other images or groups of images while the user is engaged in editing the first group of images. The computing device (205) as depicted in Fig. 2 may be a kiosk-type computing device which may be located in, for example, a local marketplace or other retail location.
[0026] Much like the user computing device (Fig. 1, 105) of Fig. 1, the computing device (205) of Fig. 2 may include at least one processing unit (210), at least one data storage device (215), peripheral device adapters (220), and an input/output device (225). These hardware components may be interconnected through the use of one or more busses and/or network connections.
[0027] The processor (210) may interpret and execute the instructions to access data such as, for example, digital images accessible through interfaces created by peripheral device adapters (220). Additionally, the processor (210) may interpret and execute instructions to perform an initial analysis of images being uploaded to the computing device (205), group various images based on an event detection device, upload a first image or set of images, receive instructions from a user so that the user may interact with or otherwise edit the uploaded image or set of images, and continue to upload other images or groups of images while the user is engaged in editing the first group of images.
[0028] In one illustrative example, the computing device (205) may comprise two or more processors (210); one processor (210) being used to upload a number of images, while another processor (210) is used to conduct a feature extraction process of each individual digital image. In another illustrative example, the computing device (205) may comprise a single processor (210) which simultaneously uploads a number of digital images and runs the feature extraction process. In still another illustrative example, any number (n) of processors (210) may perform the feature extraction process while, simultaneously, any number (m) of processors are conducting the uploading process.
[0029] The data storage device (215) may digitally store data implemented and produced by the processor (210) of the computing device (205). The data storage devices (215) may include various types of memory modules, including volatile and nonvolatile memory. For example, the data storage devices (215) of the present example include Random Access Memory (RAM), Read Only Memory (ROM), and Hard Disk Drive (HDD) memory. Many other types of memory are available in the art, and the present specification contemplates the use of many varying types of memory in the data storage devices (215) as may suit a particular application of the principles described herein. In certain examples, different types of memory in the data storage devices (215) may be used for different data storage needs. For example, in certain examples the processor (210) may boot from ROM, maintain nonvolatile storage in HDD memory, and execute program code stored in RAM. In one illustrative example, the data storage device (215) may store images uploaded by the user through the interface created by the peripheral device adapters (220) so that the user may edit them.
[0030] The peripheral device adapters (220) in the computing device (205) enable the processor (210) to interface with various other hardware elements, external and internal to the computing device (205). For example, peripheral device adapters (220) may provide an interface to input/output devices (225) to create a user interface and/or access external sources of memory storage. Additionally, peripheral device adapters (220) may provide an interface between the computing device (205) and an external data storage device. For example, a peripheral device adapter (220) may provide an interface by which the computing device (205) and the processor (210) may access an internal data storage device communicatively coupled with a digital camera. Still further, peripheral device adapters (220) may provide an interface to a removable data storage device such as, for example, a memory card, a flash memory, floppy disks, compact disks, USB drives, among others. Peripheral device adapters (220) may also create an interface between the processor (210) and a printer or other media output device.
[0031] Turning now to Fig. 3, an illustrative flowchart depicting a method of generating a photo album according to principles disclosed herein is shown. The process starts with the processor (Fig. 1, 125; Fig. 2, 210) gaining access to a user's set of digital images (Block 300).
[0032] In an illustrative example of a kiosk-type computing device (205) as depicted in Fig. 2, this may be accomplished by the user inserting a portable data storage device such as, for example, a flash memory card, floppy disk, compact disk, or universal serial bus (USB) drive. In another illustrative example, where generating a photo album is completed over a network such as that depicted in Fig. 1, this may be accomplished by the user first uploading digital images to the user computing device (Fig. 1, 105) and then interfacing with the various elements associated with the web page (Fig. 1, 110) to upload those images to the data storage device (Fig. 1, 150) associated with the web page server (115).
[0033] Additionally, in another illustrative example, the user may be allowed to connect a digital camera to the user computing device (Fig. 1, 105; Fig. 2, 205) via a peripheral device adapter (Fig. 1, 135; Fig. 2, 220) and, through the camera's hardware and machine readable instructions (such as software), allow the processor (Fig. 1, 125; Fig. 2, 210) access to the image data stored thereon. In other examples, the camera itself may be the user computing device.
[0034] In yet another illustrative example, the user may place photos into a scanner that is in communication with the user computing device (Fig. 1, 105; Fig. 2, 205) via a peripheral device adapter (Fig. 1, 135; Fig. 2, 220). Thus, the user computing device receives the image data for the scanned image from the scanner.
[0035] In still another illustrative example, the processor (Fig. 1, 125; Fig. 2, 210) may receive digital image data from a remote computing device, in which case the digital image data is received from the remote device by the user computing device (Fig. 1, 105) via a network (120) such as, for example, the Internet. Examples of a remote device may include a client computer or an internet-enabled smart phone, among others.
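By way of illustration only, the following Python sketch shows one way such access might look in practice: enumerating image files from a mounted memory card or upload folder and reading the capture time from EXIF metadata. It assumes the Pillow imaging library is available and that the images carry EXIF "DateTime" fields; the directory path and function name are hypothetical and are not prescribed by this specification.

```python
# Sketch only: enumerate a user's images and read capture-time metadata.
# Assumes Pillow is installed and the images carry EXIF data; all names
# here are hypothetical.
from pathlib import Path
from datetime import datetime
from PIL import Image, ExifTags

def load_image_timestamps(source_dir):
    """Return {path: capture_datetime} for every JPEG under source_dir."""
    tag_ids = {name: tag for tag, name in ExifTags.TAGS.items()}
    timestamps = {}
    for path in Path(source_dir).rglob("*.jpg"):
        with Image.open(path) as img:
            raw = img.getexif().get(tag_ids.get("DateTime"))
        if raw:  # EXIF date strings look like "2010:10:28 14:03:22"
            timestamps[path] = datetime.strptime(raw, "%Y:%m:%d %H:%M:%S")
    return timestamps

# Example: images = load_image_timestamps("/media/memory_card/DCIM")
```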
[0036] Once the processor (Fig. 1, 125; Fig. 2, 210) has gained access to the image data (Block 300), the processor (Fig. 1, 125; Fig. 2, 210) begins to run an event detection device (Block 305). In the present example, for the purposes of simplicity in illustration, the event detection device utilizes the time stamps associated with the individual images to determine which image or set of images is uploaded first. However, the principles set forth in the present specification extend equally to any alternative method where the event detection device utilizes other data or metadata associated with each of the individual images.
[0037] Therefore, in one illustrative example, metadata such as tags associated with the images may be used to group all the images into a number of groups of images. For example, when creating a photo album, many of the images may contain features within the image that have been tagged previously by the user as being a facial image of a specific person. These images may then, through the use of the event detection device, be grouped together based on the individuals' names tagged on each image.
[0038] In another illustrative example, the tag or metadata may simply be a word or phrase which has been associated with an image and, therefore, any images containing a certain word or phrase as a tag may be grouped together. In yet another illustrative example, the event detection device may take advantage of face recognition that analyzes each image to determine if a face is present and, if one is, determine who is in the picture and again group images containing similar faces with other images containing that face, or simply group images containing a face with other images containing a face.
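As a minimal sketch of this tag-based grouping, and assuming each image record already carries a list of user-applied tags (for example, names assigned earlier by hand or by face recognition), the grouping step might look as follows; the data layout shown is hypothetical.

```python
# Minimal sketch: group image identifiers by the tags attached to them.
from collections import defaultdict

def group_by_tag(images_with_tags):
    """images_with_tags: iterable of (image_id, [tag, ...]) pairs."""
    groups = defaultdict(list)
    for image_id, tags in images_with_tags:
        for tag in tags:
            groups[tag].append(image_id)   # an image may join several groups
    return dict(groups)

# Example:
# group_by_tag([("img1.jpg", ["Alice"]), ("img2.jpg", ["Alice", "Bob"])])
# -> {"Alice": ["img1.jpg", "img2.jpg"], "Bob": ["img2.jpg"]}
```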
[0039] In the example where time stamps are used as the criteria by which images are grouped, the processor (Fig. 1, 125; Fig. 2, 210) initiates the event detection device and extracts the time stamp data from each image (Block 305). The processor then computes a time similarity matrix between all pairs of images (Block 310) accessed by the processor (Fig. 1, 125; Fig. 2, 210). This matrix compares each time stamp of the individual images and is then used to form an un-directed weighted graph (Block 315). The weighted graph then shows different "communities" or groups of images lumped or arranged closer together than other images.
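The specification does not prescribe a particular similarity measure or community-detection algorithm, so the sketch below makes two assumptions for illustration: pairwise similarity is computed as exp(-|Δt|/scale) from the time-stamp difference, and the resulting un-directed weighted graph is handed to a generic community-detection routine from the networkx library.

```python
# Sketch of the time-similarity step under the assumptions stated above.
import math
import networkx as nx

def time_similarity_graph(timestamps, scale_seconds=3600.0):
    """timestamps: {image_id: datetime}. Returns an undirected weighted graph."""
    graph = nx.Graph()
    items = list(timestamps.items())
    graph.add_nodes_from(img for img, _ in items)
    for i, (img_a, t_a) in enumerate(items):
        for img_b, t_b in items[i + 1:]:
            dt = abs((t_a - t_b).total_seconds())
            graph.add_edge(img_a, img_b, weight=math.exp(-dt / scale_seconds))
    return graph

# "Communities" of images can then be read off the graph, for example with
# nx.algorithms.community.greedy_modularity_communities(graph, weight="weight").
```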
[0040] For example, if the complete set of images accessed by the processor (Fig. 1, 125; Fig. 2, 210) contained a number of events over a period of days, each of the images has a time stamp which would reflect the fact that these events were separated by a certain time difference. If one image was taken, and a subsequent image was also taken some time later, that difference (Δt) in time may be significant enough for the processor to determine that the second or subsequent image was taken at a different event. Therefore, a predetermined time differential (Δt) threshold may be set, and the processor may group (Block 320) these images together based on whether any specific image meets that threshold time difference or not.
[0041] Threshold time differences could be, for example, as short as a few seconds or as long as the user determines. In one illustrative example, a 24 hour time differential (Δt) threshold may be set by the user, and therefore any subsequent images taken within 24 hours of any previously taken image are grouped in with the previous image or images. Conversely, any images taken outside of a 24 hour period may be grouped with a different set of images.
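A minimal sketch of this threshold-based grouping, assuming the time stamps gathered earlier, might sort the images by capture time and start a new group whenever the gap to the previous image exceeds the chosen Δt threshold (24 hours in the example above):

```python
# Sketch: split a time-stamped collection into events by gap threshold.
from datetime import timedelta

def group_by_time_gap(timestamps, threshold=timedelta(hours=24)):
    """timestamps: {image_id: datetime}. Returns a list of image-id groups."""
    ordered = sorted(timestamps.items(), key=lambda item: item[1])
    groups, previous_time = [], None
    for image_id, taken_at in ordered:
        if previous_time is None or taken_at - previous_time > threshold:
            groups.append([])                 # gap too large: start a new event
        groups[-1].append(image_id)
        previous_time = taken_at
    return groups
```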
[0042] After the processor has determined that any subsequent image has not met the threshold time difference, a first image or set of images may be uploaded (Block 325). This first set of images may represent a particular event or occasion and thereby may naturally fit within one chapter or section of the photo album being created. This set of images may then be immediately uploaded (Block 325) for subsequent processing (Block 330) as is discussed below.
[0043] In the photo album generation system (Fig. 1, 100) as described in connection with Fig. 1, the images may be uploaded (Block 325) to the web page server (Fig. 1, 115) and saved, at least temporarily, on the data storage device (Fig. 1, 150) associated with the web page server (Fig. 1, 115). However, in a kiosk setting as depicted in Fig. 2, where the kiosk performs the functions of both the web page server (115) and the user computing device (105) of Fig. 1, the images may be uploaded (Block 325) to a local data storage device associated with the kiosk or computing device (205).
[0044] After the first image or set of images has been uploaded (Block 325), the processor may begin to initiate a feature extraction process (Block 330) that involves using an algorithm to detect and isolate various desired portions, shapes, or features of the image and transform the image data for each image into a reduced representation set of those features. These features may include, for example, edges of brightness changes within the image, intersections of two edges within the image, points within the image for which there are two dominant and different edge directions in a local neighborhood of the point, points or regions within the image that are either brighter or darker than the surrounding portions of the image, changes in colors within the image, and shapes within the image.
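The specification leaves the particular feature-extraction algorithm open, so the following sketch simply illustrates the idea of reducing each image to a small set of feature values using common OpenCV operators (Canny edges and corner detection); the specific operators and thresholds are assumptions for illustration, not part of the disclosed method.

```python
# Illustrative feature extraction only; operators and thresholds are assumed.
import cv2
import numpy as np

def extract_features(image_path, max_corners=50):
    gray = cv2.imread(str(image_path), cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise ValueError(f"could not read {image_path}")
    edges = cv2.Canny(gray, 100, 200)                     # brightness-change edges
    corners = cv2.goodFeaturesToTrack(gray, max_corners,  # corner-like points with
                                      qualityLevel=0.01,  # two dominant edge directions
                                      minDistance=10)
    return {
        "edge_density": float(np.count_nonzero(edges)) / edges.size,
        "corner_count": 0 if corners is None else len(corners),
        "mean_brightness": float(np.mean(gray)),
    }
```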
[0045] The feature extraction process (Block 330) may take as long as, or longer than, the uploading of the individual images because this process may use more processing power. However, while the first image or set of images is being subjected to this feature extraction process (Block 330), a second processor may be implemented to simultaneously begin uploading a subsequent image or set of images (Block 335) that the user intends to include in the photo album. Therefore, as briefly described above, in one illustrative example, the user computing device (Fig. 1, 105; Fig. 2, 205) may contain any number of processors that are engaged in the task of extracting features from the most recently uploaded image or set of images (Block 330), while any number of other processors are engaged in the task of uploading subsequent images or sets of images (Block 335). After the first set of pictures is uploaded (Block 325) and the feature extraction process is completed (Block 330) for that first set of images, the user may then be able to start editing the pictures to create, for example, a photo album. While the user is engaged in editing the initial set of images, the processor (Fig. 1, 125; Fig. 2, 210) may continue to upload subsequent sets of images (Block 335) while simultaneously conducting the feature extraction process (Block 330) for other sets of images.
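A sketch of this two-worker pipeline is shown below, assuming hypothetical helpers such as the upload_group() routine sketched earlier and an extract_features_for_group() function that wraps the per-image extraction. The point of the structure is that the upload of sub-set n+1 and the feature extraction of sub-set n run at the same time rather than back to back.

```python
from concurrent.futures import ThreadPoolExecutor


def pipeline(groups, upload_group, extract_features_for_group):
    """groups is an ordered list of image sub-sets, one per detected event."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        pending_extraction = None
        for group in groups:
            # Start uploading the next sub-set (Block 325/335) ...
            upload_done = pool.submit(upload_group, group)
            # ... while the previous sub-set's feature extraction (Block 330)
            # is still running on the other worker.
            if pending_extraction is not None:
                pending_extraction.result()
            uploaded = upload_done.result()
            pending_extraction = pool.submit(extract_features_for_group, uploaded)
        if pending_extraction is not None:
            pending_extraction.result()
```

With genuinely separate processors, a ProcessPoolExecutor could be substituted for the thread pool; the overlapping structure is the same.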
[0046] The use of multiple processors can reduce the processing time needed to create a photo album and thereby reduce the wait for a project to be completed. Indeed, instead of waiting until all of the images are uploaded and then waiting for the feature extraction process to be completed, the user may instead begin to preview and edit a relatively smaller set of images, thereby decreasing the initial wait time for the user as well as the overall time it may otherwise take to create the photo album.
[0047] This may be particularly beneficial for the user of a kiosk implementing this procedure, where a user might otherwise face a lengthy wait in line. Such waits may discourage a user and potential purchaser of goods from creating a photo album at all, potentially resulting in a lost opportunity for the user and lost revenue for the owner of the kiosk.
[0048] Although a number of processors may be used during the process described in connection with Fig. 3, the principles set forth in the present specification extend equally to a system that employs a single processor. Therefore, in one illustrative example, a single processor may be used to alternate between uploading images (Block 335) and running a feature extraction process (Block 330). The processor may be able to handle both the uploading (Block 335) and the processing (Block 330) by implementing a direct memory access channel to upload the images while the computations for the feature extraction process (Block 330) continue to be executed directly by the processor.
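A rough single-processor analogue, assuming the same hypothetical helpers as above, hands the I/O-bound upload to a background thread (a stand-in for the direct memory access channel, which ordinary application code cannot drive directly) while the one processor keeps running the feature extraction computations:

```python
import threading


def single_processor_pipeline(groups, upload_group, extract_features_for_group):
    previous = None
    for group in groups:
        # Hand the I/O-bound upload to a background thread; it mostly waits on
        # the network or disk rather than competing for the processor.
        uploader = threading.Thread(target=upload_group, args=(group,))
        uploader.start()
        if previous is not None:
            # Meanwhile the processor itself runs the feature extraction
            # computations on the sub-set that finished uploading last time.
            extract_features_for_group(previous)
        uploader.join()
        previous = group
    if previous is not None:
        extract_features_for_group(previous)
```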
[0049] The uploading (Block 335) and feature extraction (Block 330) processes may continue (Block 340) until all of the user's images have been uploaded to the computing device (Fig. 2, 205) or the web page server (Fig. 1, 115). In this way, the user is already engaged in creating the photo album before all of the images are uploaded and is substantially closer to completing the photo album.
[0050] The specification and figures describe a system and a method of generating a photo album. The system and method take advantage of an event detection device to group any number of images to be uploaded into smaller groups, upload a first smaller set of images, and allow that smaller set of images to be manipulated while the remaining images or sets of images are being uploaded to the photo album generation system. This system and method of generating a photo album can provide a number of advantages, including reducing the total time needed to create a photo album, which can result in higher sales revenues from users. In addition, higher customer satisfaction can result, since the system and method can reduce the periods of time a user may spend waiting in line.
[0051] The preceding description has been presented only to illustrate and describe examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching.
Claims
1. A method of generating a photo album comprising:
with a processor (125, 210), accessing metadata associated with individual images within a collection of digital images (300);
running an event detection device that separates the collection of images into smaller sub-sets of images by using the metadata (305);
uploading a first sub-set of images (325) to a computing device (105, 205); and
with the processor (125, 210), simultaneously conducting a feature extraction process (330) while uploading (335) to the computing device (105, 205) a subsequent sub-set of images within the collection of digital images.
2. The method of claim 1, further comprising allowing a user to edit individual images within the first sub-set of images after the first sub-set has been uploaded and processed.
3. The method of claim 1, wherein the metadata associated with the individual images within the collection of digital images is a time stamp.
4. The method of claim 1, wherein the event detection device determines the time differences between time stamps associated with each image within the collection of images and places each image into a sub-set of images based on whether a threshold time difference between each image has been met.
5. The method of claim 4, wherein the first sub-set of images is uploaded only when the processor (125, 210) has determined that no other image within the collection of images has met the threshold time difference.
6. The method of claim 1, wherein subsequent sub-sets of images are uploaded sequentially and processed using the feature extraction process (330) until all sub-sets of images within the collection of images have been uploaded.
7. The method of claim 1, wherein the feature extraction process (330) detects and isolates edges of brightness changes within the image, intersections of two edges within the image, points within the image for which there are two dominant and different edge directions in a local neighborhood of the point, points or regions within the image that are either brighter or darker than the surrounding portions of the image, changes in colors within the image, shapes within the image, and combinations thereof.
8. A system for generating a photo album comprising:
a computing device (105, 205);
a processor (125, 210) associated with the computing device (105, 205); wherein the processor (125, 210) accesses metadata associated with individual images within a collection of digital images, uploads a first sub-set of images within the collection of digital images to the computing device (105, 205), and simultaneously conducts a feature extraction process of each image uploaded while uploading a subsequent sub-set of images within the collection of digital images to the computing device (105, 205).
9. The system of claim 8, wherein the computing device (105) is
communicatively coupled to a web page server (1 15) over a network (120).
10. The system of claim 9, wherein the individual images within a collection of digital images are uploaded to a data storage device (150) associated with the web page server (1 15).
11. The system of claim 8, wherein the processor (125) runs an event detection device that separates the collection of images into smaller sub-sets of images based on the metadata associated with the individual images.
12. The system of claim 11, wherein the event detection device determines the time differences between time stamps associated with each image within the collection of images and places each image into a sub-set of images based on whether a threshold time difference between each image has been met.
13. The system of claim 8, wherein a user is allowed to edit individual images within the first sub-set of images after the first sub-set has been uploaded and processed.
14. The system of claim 8, wherein subsequent sub-sets of images are uploaded sequentially and processed using the feature extraction process until all sub-sets of images within the collection of images have been uploaded.
15. A method of generating a photo album comprising:
with a processor (125, 210), accessing a time stamp (300) associated with individual images within a collection of digital images;
running an event detection device (305) that separates (320) the collection of images into smaller sub-sets of images by determining (310) the time differences between time stamps associated with each image within the collection of images and placing each image into a sub-set of images (320) based on whether a threshold time difference between each image has been met;
uploading a first sub-set of images to a computing device (105; 205); and with the processor (125, 210), simultaneously conducting a feature extraction process (330) while uploading (335) to the computing device (105, 205) a subsequent sub-set of images within the collection of digital images.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2010/054481 WO2012057764A1 (en) | 2010-10-28 | 2010-10-28 | Method of generating a photo album |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012057764A1 (en) | 2012-05-03 |
Family
ID=45994233
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2010/054481 (WO2012057764A1, Ceased) | Method of generating a photo album | 2010-10-28 | 2010-10-28 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2012057764A1 (en) |
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7392284B2 (en) * | 2000-12-29 | 2008-06-24 | Fotomedia Technologies, Llc | Meta-application architecture for integrating photo-service websites for browser-enabled devices |
| US20030200268A1 (en) * | 2002-04-23 | 2003-10-23 | Morris Robert P. | Method and system for sharing digital images over a network |
| US20060285772A1 (en) * | 2004-10-01 | 2006-12-21 | Hull Jonathan J | System and methods for creation and use of a mixed media environment |
| US7702821B2 (en) * | 2005-09-15 | 2010-04-20 | Eye-Fi, Inc. | Content-aware digital media storage device and methods of using the same |
| KR20080044610A (en) * | 2006-11-17 | 2008-05-21 | 고려대학교 산학협력단 | GS image search method, geographic location service based GS image search method, blog service based GS image search method and regional blog service provision method |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112698775A (en) * | 2020-12-30 | 2021-04-23 | 维沃移动通信(杭州)有限公司 | Image display method and device and electronic equipment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10859078; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 10859078; Country of ref document: EP; Kind code of ref document: A1 |