WO2016006090A1 - Electronic apparatus, method, and program - Google Patents
Electronic apparatus, method, and program
- Publication number
- WO2016006090A1 PCT/JP2014/068502
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- group
- image
- images
- classification
- face
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Images
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- Embodiments described herein relate generally to an electronic device, a method, and a program.
- An electronic apparatus includes a processor, a display control unit, and a reception unit.
- the processor classifies, into a first group, a plurality of first images whose features correspond to a first classification criterion, and classifies, into a second group, a plurality of second images whose features correspond to a second classification criterion different from the first classification criterion.
- the display control unit displays a plurality of first images included in the first group and a plurality of second images included in the second group on a display.
- the accepting unit accepts a first operation for changing one or more second images included in the second group to the first group.
- the processor changes at least the first classification criterion using one or more second images changed from the second group to the first group.
- FIG. 1 is a perspective view showing an external appearance of an electronic apparatus according to one embodiment.
- FIG. 2 is a diagram illustrating an example of a system configuration of a tablet according to an embodiment.
- FIG. 3 is a diagram illustrating an example of a functional configuration of an image management program executed by the tablet of one embodiment.
- FIG. 4 is a diagram illustrating a screen displayed on the LCD of one embodiment.
- FIG. 5 is a flowchart illustrating a part of the procedure of the image clustering process in the tablet of one embodiment.
- FIG. 6 is a diagram illustrating a screen displayed on the LCD after the image is reclassified according to one embodiment.
- FIG. 7 is a diagram illustrating another example of a screen displayed on the LCD after the image is reclassified according to one embodiment.
- Hereinafter, an embodiment will be described with reference to FIGS. 1 to 7. Note that a constituent element according to the embodiment may be described using a plurality of expressions. Other expressions not given for a component or description are not precluded, and components and descriptions for which a plurality of expressions are not given may also be expressed otherwise.
- FIG. 1 is a perspective view showing an external appearance of an electronic apparatus according to one embodiment.
- This electronic apparatus can be realized as any of various electronic devices such as a tablet computer, a notebook personal computer, a smartphone, a PDA, or a digital camera, or as an embedded system incorporated in such devices.
- the tablet 10 is a portable electronic device also called a slate computer, and includes a main body 11 and a touch screen display 17 as shown in FIG.
- the touch screen display 17 is attached to be superposed on the upper surface of the main body 11.
- the main body 11 has a thin box-shaped housing.
- the touch screen display 17 incorporates a flat panel display and a sensor configured to detect a contact position of a stylus pen or a finger on the screen of the flat panel display.
- the flat panel display may be, for example, a liquid crystal display (LCD).
- As the sensor, for example, a capacitive touch panel, an electromagnetic induction digitizer, or the like can be used.
- the main body 11 is provided with a camera module 18 for taking an image (photograph) from the front side of the main body 11.
- the main body 11 may be provided with another camera module for taking an image from the back side of the main body 11.
- FIG. 2 is a diagram illustrating an example of the system configuration of the tablet 10.
- the tablet 10 includes a central processing unit (CPU) 101, a system controller 102, a main memory 103, a graphics controller 104, a BIOS-ROM 105, a nonvolatile memory 106, a wireless communication device 107, an embedded controller (EC) 108, a card controller 110, and the like.
- the CPU 101 is a processor that controls the operation of various modules in the tablet 10.
- the CPU 101 executes various software loaded into the main memory 103 from the nonvolatile memory 106 that is a storage device.
- These software include an operating system (OS) 201 and various application programs.
- the application program includes an image management program 202.
- This image management program 202 manages, for example, images (image files) taken using the camera module 18, images stored in the nonvolatile memory 106, and images taken in (imported) from an external storage medium or an external storage device.
- the CPU 101 also executes a basic input/output system (BIOS) stored in the BIOS-ROM 105.
- BIOS is a program for hardware control.
- the system controller 102 is a device that connects the local bus of the CPU 101 to various components.
- the system controller 102 also includes a memory controller that controls access to the main memory 103.
- the system controller 102 also has a function of executing communication with the graphics controller 104 via a PCI EXPRESS serial bus or the like.
- the graphics controller 104 is a display controller that controls the LCD 17A used as a display of the tablet 10.
- a display signal generated by the graphics controller 104 is sent to the LCD 17A.
- the LCD 17A displays a screen image based on the display signal.
- On the screen of the LCD 17A, a touch panel 17B used as a sensor of the tablet 10 is disposed.
- the display is not limited to the LCD 17A, but may be another device such as an organic EL display.
- the wireless communication device 107 is a device configured to execute wireless communication such as wireless LAN or 3G mobile communication.
- the EC 108 is a one-chip microcomputer including an embedded controller for power management.
- the EC 108 has a function of turning on or off the tablet 10 in accordance with the operation of the power button by the user.
- the camera module 18 captures an image in response to, for example, the user touching (tapping) a button (graphical object) displayed on the screen of the touch screen display 17.
- the card controller 110 executes communication with the recording medium 25A inserted into the card slot 25.
- the card controller 110 executes, for example, communication when an image file stored in an SD card serving as the recording medium 25A is read and stored in the nonvolatile memory 106.
- the image management program 202 executed by the tablet 10 has a function of managing a plurality of images (image files).
- the image management program 202 can classify a plurality of images based on a person or an object.
- FIG. 3 is a diagram illustrating an example of a functional configuration of the image management program 202 executed by the tablet 10.
- FIG. 4 is a diagram showing a screen 30 displayed on the LCD 17A when the tablet 10 executes the image management program 202.
- the image management program 202 will be described in detail with reference to FIGS. 3 and 4.
- the image management program 202 includes, for example, an image acquisition unit 81, a classification part detection unit 82, a feature amount calculation unit 83, a clustering unit 84, a display control unit 85, an operation reception unit 86, and a threshold value calculation unit 87.
- the operation reception unit 86 is an example of a reception unit.
- the image management program 202 can manage a plurality of images (photographs) stored in a storage device or storage medium built in or externally connected to the tablet 10, in a storage device in another computer connected to the tablet 10 via a network, or the like. Each unit of the image management program 202 classifies the plurality of images based on a person and displays the classification result on the screen 30 shown in FIG. 4.
- the image acquisition unit 81 in FIG. 3 acquires a plurality of management target images from a storage device or storage medium built in or externally connected to the tablet 10, from a storage device in another computer connected to the tablet 10 via a network, or the like. Note that the image acquisition unit 81 is not limited to images, and may also acquire information (related information) associated with an image, such as the image creation date. The image acquisition unit 81 outputs the acquired plurality of images to the classification part detection unit 82.
- the classification part detection unit 82 detects, for example, a plurality of face images from a plurality of images.
- the face image is an example of a part included in each image.
- one image may include a plurality of face images, or may include none.
- the classification part detection unit 82 detects, for example, an area (face image area) estimated to include a face image in the image using pixel values of a plurality of pixels included in the image. This face image area is, for example, a rectangular area that circumscribes the face image.
- the classification part detection unit 82 outputs data indicating the detected face image area to the feature amount calculation unit 83.
- the feature amount calculation unit 83 calculates the feature amount of the face image (for example, a feature amount representing the color or shape of the object in the face image region) using the pixel values of the pixels included in the detected face image region.
- the feature amount calculation unit 83 calculates a plurality of feature amounts corresponding to the plurality of face images.
- examples of the method for extracting the feature amount of the image include feature detection based on a region-based edge image recognition technique and feature detection based on a feature amount such as HOG (Histograms of Oriented Gradients).
- the feature amount calculation unit 83 can detect a feature amount even for an image to which special information such as a QR code (registered trademark) is not added.
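The HOG-style feature extraction mentioned above can be illustrated with a small sketch. The patent does not specify an implementation; the function below is a simplified, hypothetical version of the idea (a single gradient-orientation histogram, without HOG's cell and block normalization) over a grayscale patch given as a list of rows:

```python
import math

def hog_like_descriptor(patch, bins=9):
    """Compute a simple gradient-orientation histogram for a grayscale
    patch (a list of rows of pixel intensities). This is a toy sketch of
    the HOG idea, not the full block-normalized HOG descriptor."""
    h, w = len(patch), len(patch[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]   # horizontal gradient
            gy = patch[y + 1][x] - patch[y - 1][x]   # vertical gradient
            mag = math.hypot(gx, gy)
            # unsigned orientation in [0, 180), accumulated weighted by magnitude
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang // (180.0 / bins)) % bins] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]       # L1-normalized histogram
```

For a patch containing a vertical edge, almost all gradient magnitude lands in the bin for horizontal gradients, which is the kind of discriminative signal the feature amount calculation unit 83 relies on.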
- the clustering unit 84 classifies a plurality of images (photos) including the plurality of face images into a plurality of groups (clusters) based on the detected plurality of face images. More specifically, the clustering unit 84 clusters a plurality of images (photos), each including a face image, based on the feature amounts of the face images calculated by the feature amount calculation unit 83. The clustering unit 84 classifies the plurality of face images into sets of similar face images, thereby classifying the images (photos) including those face images into groups (for example, first to fourth groups 31 to 34). That is, the clustering unit 84 classifies images (photographs) including face images estimated to show the face of the same person into the same group.
- the clustering unit 84 classifies, based on the detected plurality of face images, one or more images corresponding to a first face (the face of a first person) into the first group, one or more images corresponding to a second face (the face of a second person) into the second group, and one or more images corresponding to an nth face (the face of an nth person) into an nth group.
- the clustering unit 84 generates a number of groups corresponding to the number of persons whose facial photographs are included in the plurality of images. However, because the clustering unit 84 may make a mistake in classification, or may classify images including face images for which no matching person exists into an "other" group, the number of generated groups may differ from the number of persons whose facial photographs are included in the plurality of images.
- One image including a plurality of face images can be classified into a plurality of groups. For example, when an image corresponding to the first face and an image corresponding to the second face are both detected from a first image (photo) among the plurality of images, the clustering unit 84 classifies the first image (photo) into both the first group and the second group. On the other hand, even when the clustering unit 84 detects at least one of an image corresponding to the first face or an image corresponding to the second face from a second image (photograph) among the plurality of images, it may not classify the second image into the first group or the second group, depending on the shooting state of the image corresponding to the first face or the image corresponding to the second face.
- the shooting state of the image corresponding to the first face or the image corresponding to the second face is at least one of the position of that image within the photograph and whether it is in focus (for example, whether the face is in focus).
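Since one image can contain several face images, an image can belong to several groups at once, as described above. The following sketch shows one possible bookkeeping for this (the data layout is an assumption; the patent does not prescribe one):

```python
from collections import defaultdict

def groups_per_image(face_assignments):
    """face_assignments: a list of (image_id, group_id) pairs, one per
    detected face image. Returns a dict mapping image_id -> set of group
    ids, so an image containing faces of person 1 and person 2 appears
    in both groups."""
    membership = defaultdict(set)
    for image_id, group_id in face_assignments:
        membership[image_id].add(group_id)
    return dict(membership)
```

For example, a photo with two detected faces assigned to groups 1 and 2 would be listed under both groups, matching the first-image example in the text.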
- Examples of the data clustering method include the K-means method and the shortest distance (single linkage) method.
- the clustering unit 84 classifies images by these data clustering methods using threshold values (classification criteria), for example.
- the threshold value is an example of a first classification criterion and a second classification criterion. That is, the clustering unit 84 classifies the images into a plurality of groups based on the feature amounts of the face images and the threshold values.
- the data clustering method may be a hierarchical method or a non-hierarchical method.
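As one concrete reading of the shortest distance method combined with a threshold classification criterion (an illustrative sketch only, not the patent's actual implementation), two face feature vectors can be placed in the same group whenever the closest pair of vectors between their clusters lies within the threshold:

```python
import math

def cluster_by_threshold(features, threshold):
    """Single-linkage clustering: merge two clusters whenever the
    shortest distance between their feature vectors is within
    `threshold`. Returns a list of clusters, each a list of indices
    into `features`."""
    clusters = [[i] for i in range(len(features))]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # shortest distance between the two clusters
                d = min(math.dist(features[a], features[b])
                        for a in clusters[i] for b in clusters[j])
                if d <= threshold:
                    clusters[i].extend(clusters[j])
                    del clusters[j]
                    merged = True
                    break
            if merged:
                break
    return clusters
```

Lowering the threshold splits groups apart; raising it merges them, which is exactly the lever the threshold value calculation unit 87 adjusts later in the description.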
- the clustering unit 84 may recognize the person corresponding to each group using the feature amount of the face image for each person prepared in advance.
- the clustering unit 84 may classify images using not only the feature amount of the face image but also related information acquired by the image acquisition unit 81, for example. For example, when information about a person appearing in the image is set as the related information, the clustering unit 84 may classify the image using the related information.
- the clustering unit 84 classifies the plurality of images into, for example, a first group 31, a second group 32, a third group 33, and a fourth group 34.
- the first group 31 is an example of a first group.
- the third group 33 is an example of a second group.
- images including a plurality of face images 41A, 41B, 41C, 41D, 41E, 41F, 41G, and 41H belong to the first group 31. That is, in the classification using a certain threshold, the clustering unit 84 estimates that the plurality of face images 41A to 41H are face images showing the face of the same person.
- Images including a plurality of face images 42A, 42B, 42C, and 42D belong to the second group 32. That is, the clustering unit 84 estimates that the plurality of face images 42A to 42D are face images indicating the face of the same person in the classification using a certain threshold value.
- the images including the plurality of face images 43A and 43B belong to the third group 33. That is, the clustering unit 84 estimates that the plurality of face images 43A and 43B are face images indicating the face of the same person in the classification using a certain threshold value.
- the images including a plurality of face images 44A, 44B, 44C, 44D, 44E, 44F, 44G, 44H, 44I, and 44J belong to the fourth group 34.
- the fourth group 34 is an “other” group. That is, the clustering unit 84 estimates that a plurality of face images 44A to 44J do not have face images showing the face of the same person in classification using a certain threshold value.
- the display control unit 85 displays the face images 41A to 41H, 42A to 42D, 43A, 43B, and 44A to 44J on the screen 30 for each group as shown in FIG. 4, based on the image classification result by the clustering unit 84. In other words, the display control unit 85 displays a list of face images on the screen 30. Note that the display control unit 85 is not limited to face images, and may display, for example, reduced whole images on the screen 30.
- the display control unit 85 collects a plurality of face images 41A to 41H belonging to the first group 31 and displays them on the screen 30.
- the display control unit 85 displays the thumbnail 31A together with the face images 41A to 41H.
- the thumbnail 31A is randomly selected from images including a plurality of face images 41A to 41H belonging to the first group 31.
- the display control unit 85 collects a plurality of face images 42A to 42D belonging to the second group 32 and displays them on the screen 30.
- the display control unit 85 displays the thumbnail 32A together with the face images 42A to 42D.
- the thumbnail 32A is randomly selected from images including a plurality of face images 42A to 42D belonging to the second group 32.
- the display control unit 85 collects a plurality of face images 43A and 43B belonging to the third group 33 and displays them on the screen 30.
- the display control unit 85 displays the thumbnail 33A together with the face images 43A and 43B.
- the thumbnail 33A is randomly selected from images including a plurality of face images 43A and 43B belonging to the third group 33.
- the display control unit 85 collects a plurality of face images 44A to 44J belonging to the fourth group 34 and displays them on the screen 30.
- the display control unit 85 displays the thumbnail 34A together with the face images 44A to 44J.
- the thumbnail 34A is randomly selected from images including a plurality of face images 44A to 44J belonging to the fourth group 34.
- the display control unit 85 displays a plurality of face images, which are part of the image, on the screen 30 for each group.
- the clustering unit 84 may classify the plurality of face images 44A to 44J, which are estimated not to include face images showing the face of the same person, into the fourth group 34 as described above, or may classify each of them into its own group. In the latter case, a plurality of groups to which only one face image (one of the individual face images 44A to 44J) belongs are generated. That is, the number of images belonging to a group may be one or more.
- the display control unit 85 may hierarchically display the face images. For example, the display control unit 85 may display only the thumbnails 31A to 34A of the first to fourth groups 31 to 34 on the screen 30, and display the face images of each of the groups 31 to 34 individually when the user touches the corresponding thumbnail 31A to 34A. Thus, the display of the image classification result on the screen 30 by the display control unit 85 is not limited to that shown in FIG. 4.
- the display control unit 85 also performs display of the entire screen 30.
- the display control unit 85 further displays a preview unit 51, a selection release button 52, a group join button 53, a group deletion button 54, a group addition button 55, and a reclassification button 56 on the screen 30.
- the preview unit 51 enlarges and displays a part of an image randomly selected from images including a plurality of face images 41A to 41H, 42A to 42D, 43A, 43B, 44A to 44J, for example.
- the selection cancel button 52, the group join button 53, the group delete button 54, the group add button 55, and the reclassify button 56 are buttons (graphical objects) displayed on the screen 30.
- the operation reception unit 86 performs various processes described later.
- the operation accepting unit 86 receives the behavior of a finger or the like detected by the touch panel 17B, when the user touches the touch panel 17B with a finger or a stylus pen (hereinafter, a finger or the like) or moves the finger or the like, as various events such as a touch operation, a long press operation, and a move operation.
- the operation accepting unit 86 performs various processes, such as selecting a face image or a group, or moving a face image between groups, according to the touch, long press, and movement operations that the user performs on a group or a face image.
- the operation receiving unit 86 moves images including the plurality of face images 41A to 41H, 42A to 42D, 43A, 43B, and 44A to 44J between the first to fourth groups 31 to 34 in accordance with user operations on the screen 30. For example, when the user presses and holds the position where the face image 43A is displayed on the screen 30 with a finger or the like and moves the finger or the like to the position where the first group 31 is displayed, the operation reception unit 86 moves (reclassifies) the face image 43A to the first group 31. In other words, the operation receiving unit 86 receives an operation for changing one or more images included in one group to another group. The operation for moving an image is not limited to this.
- For example, when the user performs a long press operation at the position where the first group 31 is displayed, the operation reception unit 86 brings the first group 31 into a selected state. Similarly, when the user performs a long press operation at the position where the face image 41A on the screen 30 is displayed, the operation reception unit 86 sets the face image 41A to the selected state. Other groups and other face images are selected in the same manner.
- the display control unit 85 indicates the selected group and face image by, for example, surrounding them with a selection frame 61.
- When the user operates the selection cancel button 52, the operation receiving unit 86 cancels the selection of face images and groups.
- the selection cancellation button 52 is operated, for example, when the user performs a touch (tap) operation on the position where the selection cancellation button 52 is displayed on the screen 30.
- the selection release button 52 may be operated by a click operation with a mouse, for example.
- the group join button 53, the group delete button 54, the group add button 55, and the reclassify button 56 are operated in the same manner.
- When the user operates the group join button 53, the operation reception unit 86 combines a plurality of groups in the selected state. In other words, the operation reception unit 86 moves the images belonging to at least one other selected group to one selected group. For example, when the first group 31 and the third group 33 are selected as shown in FIG. 4, the operation accepting unit 86 moves the images including the face images 43A and 43B belonging to the third group 33 to the first group 31.
- When the user operates the group deletion button 54, the operation reception unit 86 deletes at least one group in the selected state. In other words, the operation reception unit 86 moves the images belonging to the selected groups to the fourth group 34, which is the "other" group. For example, when the first group 31 and the third group 33 are selected as shown in FIG. 4, the operation accepting unit 86 moves the images including the face images 41A to 41H, 43A, and 43B belonging to the first group 31 and the third group 33 to the fourth group 34.
- When the user operates the group addition button 55, the operation reception unit 86 adds a new (blank) group that does not include any image.
- the user can move the face image to the added group.
- For example, when the clustering unit 84 mistakenly classifies images of two persons into one group, the user can add a new group and move the images of one of the persons to the new group.
- When the user operates the reclassification button 56, the operation accepting unit 86 causes the clustering unit 84 to classify the plurality of images again. Then, the display control unit 85 displays the new classification result on the screen 30. The image reclassification will be described later.
- In this way, the operation accepting unit 86 moves images, combines a plurality of groups, deletes groups, adds groups, and so on.
- the operation receiving unit 86 receives an image moving operation between a plurality of groups by the user.
- the operation reception unit 86 is not limited to this, and receives various operations by the user.
- the threshold calculation unit 87 calculates a threshold for classification used by the clustering unit 84 based on the result of the operation of moving the image between the plurality of groups by the user.
- the clustering unit 84 performs classification using a plurality of threshold values set for each group. For this reason, the threshold value calculation unit 87 calculates the plurality of threshold values individually. Note that the clustering unit 84 may instead classify all face images using a single threshold value, in which case the threshold calculation unit 87 calculates that single threshold.
- For example, the face images 43A and 43B move to the first group 31 when the user performs the operation of combining the first group 31 and the third group 33.
- the threshold value calculation unit 87 calculates a threshold value such that the clustering unit 84 classifies the face images 43A and 43B into the first group 31.
- the threshold calculation unit 87 calculates a first threshold that is an individual threshold for the first group 31.
- the first threshold is an example of a first classification criterion.
- the threshold value calculation unit 87 calculates a threshold value that causes the clustering unit 84 to remove the face images 43A and 43B from the third group 33.
- the threshold calculation unit 87 calculates, for example, a third threshold that is an individual threshold for the third group 33.
- the third threshold is an example of the second classification criterion, and is different from the first threshold.
- the threshold value calculation unit 87 outputs the calculated threshold value to the clustering unit 84.
- the clustering unit 84 changes the existing threshold value to a new threshold value acquired from the threshold value calculation unit 87. In other words, the clustering unit 84 updates the threshold used for clustering by storing the threshold acquired from the threshold calculation unit 87.
- the threshold value calculation unit 87 uses the face images 43A and 43B changed from the third group 33 to the first group 31 to calculate new first threshold values and third threshold values. calculate.
- the clustering unit 84 changes the first threshold value and the third threshold value to the new first threshold value and third threshold value calculated by the threshold value calculation unit 87.
- the clustering unit 84 performs image reclassification when the user operates the reclassification button 56. That is, the clustering unit 84 reclassifies the images into a plurality of groups based on the feature amounts of the face images and the changed threshold values (the first threshold value and the third threshold value) reflecting the user's movement operation.
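One plausible rule for how the threshold value calculation unit 87 might derive a new group threshold from the user's move operation is sketched below. The patent leaves the actual calculation open; here, purely for illustration, the threshold is widened just enough that the moved face images fall within the group, measured from the group's mean feature vector (both the rule and the function name are hypothetical):

```python
def recompute_threshold(group_features, moved_features, margin=1e-6):
    """Return a threshold large enough that every moved face image,
    measured from the merged group's centroid, falls inside the group.
    All features are equal-length tuples of floats."""
    members = list(group_features) + list(moved_features)
    dim = len(members[0])
    centroid = tuple(sum(f[k] for f in members) / len(members)
                     for k in range(dim))

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # threshold = farthest member from the centroid, plus a small margin
    return max(dist(f, centroid) for f in members) + margin
```

Under such a rule, the new first threshold automatically keeps the face images 43A and 43B inside the first group 31 on the next clustering pass, while the third group's threshold could analogously be tightened to exclude them.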
- the display control unit 85 displays the face image for each group on the screen 30 based on the image classification result by the clustering unit 84. That is, the display control unit 85 displays a face image on the screen 30 for each group classified based on the changed threshold value.
- FIG. 5 is a flowchart illustrating a part of the procedure of the image clustering process in the tablet 10.
- the image acquisition unit 81 acquires a plurality of images (a plurality of photos) (step S11).
- the classification part detection unit 82 detects a face image included in each of the plurality of images (step S12).
- the feature amount calculation unit 83 calculates the feature amount of the face image using the pixel values of the pixels included in the detected face image (step S13).
- the feature amount calculation unit 83 calculates a plurality of feature amounts corresponding to the plurality of face images when the classification portion detection unit 82 detects a plurality of face images from the plurality of images.
- the clustering unit 84 clusters a plurality of face images based on the calculated feature amount of the face image and the threshold value (step S14).
- the clustering unit 84 uses, for example, a preset threshold value in the initial clustering.
- the clustering unit 84 classifies the plurality of face images into clusters (groups) of similar face images. That is, the clustering unit 84 classifies face images estimated to show the same person's face into the same cluster.
- the clustering unit 84 stores the classification result in the main memory 103, for example.
- the display control unit 85 displays a face image for each group on the screen 30 based on the classification result (step S15).
- the operation receiving unit 86 shifts to a state in which an image moving operation between a plurality of groups can be received. Then, the operation receiving unit 86 determines whether or not the user has performed an image moving operation between a plurality of groups (step S16).
- When the person indicated by the face images 41A to 41H classified in the first group 31 and the person indicated by the face images 43A and 43B classified in the third group 33 are the same person, the user selects the first group 31 and the third group 33.
- the user operates the group combination button 53 to combine the first group 31 and the third group 33.
- the face images 43A and 43B of the third group 33 are moved (changed) to the first group 31, and the face images 41A to 41H, 43A, and 43B indicating the same person are collected in the first group 31.
- the threshold value calculation unit 87 calculates a new threshold value (the first threshold value or the third threshold value) based on the result of the image movement operation between the plurality of groups by the user.
- the clustering unit 84 changes the threshold used for clustering to the new threshold calculated by the threshold calculation unit 87 (step S17).
- the operation reception unit 86 determines whether or not the user has operated the reclassification button 56 (step S18). When the user does not operate the reclassification button 56 (step S18: No) and does not end the image management program 202 (step S19: No), the operation reception unit 86 continues to receive image movement operations (step S16).
- the threshold value calculation unit 87 calculates a new threshold value (step S17) each time the user performs an operation of moving an image between a plurality of groups (step S16: Yes).
- the clustering unit 84 clusters (reclassifies) the plurality of face images based on the feature amounts of the face images and the changed threshold value (step S14).
- the display control unit 85 displays a face image for each group on the screen 30 based on the reclassification result (step S15).
- FIG. 6 is a diagram showing a screen 30 displayed on the LCD 17A after the image is reclassified.
- the threshold value calculation unit 87 calculates a new threshold value based on the result of the movement operation.
- since it classifies images using the new threshold values, the clustering unit 84 classifies the face images 43A and 43B into the first group 31 and removes them from the third group 33. Since the face images 43A and 43B leave the third group 33, the clustering unit 84 deletes (does not generate) the third group 33.
- the clustering unit 84, which classifies images using the new threshold value, classifies the face images 44A and 44E belonging to the fourth group 34 into a new fifth group 35 and removes them from the fourth group 34.
- the clustering unit 84 may use the new threshold value calculated by the threshold value calculation unit 87 to move, between the plurality of groups, images that the user has not moved. That is, the clustering unit 84 automatically reclassifies the images other than those moved by the user according to the changed threshold value.
- the display control unit 85 displays the thumbnail 35A together with the face images 44A and 44E of the fifth group 35.
- the display control unit 85 displays the frame 62 to indicate the portions (changed portions) where the image classification result before the threshold change differs from the image classification result after the threshold change.
- the clustering unit 84 compares the classification result of the image before the threshold change stored in the main memory 103 with the classification result of the image after the threshold change, and obtains a changed portion.
- the clustering unit 84 notifies the display control unit 85 of the changed part.
- the display control unit 85 displays a frame 62 that surrounds the changed portion (the first group 31, the fourth group 34, and the fifth group 35 in FIG. 6) on the screen 30 to indicate the changed portion.
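The changed portions surrounded by the frame 62 can be obtained, as described above, by comparing the grouping stored before the threshold change with the grouping after it. A minimal sketch, where the group names and set-based membership are assumptions for illustration:

```python
def changed_groups(before, after):
    """Return, sorted, the names of groups whose membership differs
    between the classification before and after the threshold change.
    `before` and `after` map a group name to a set of image ids; a
    group present on only one side (created or deleted) counts as
    changed."""
    names = set(before) | set(after)
    return sorted(name for name in names
                  if before.get(name, set()) != after.get(name, set()))
```

The display control unit 85 would then draw a frame around each group returned by such a comparison.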
- FIG. 7 is a diagram showing another example of the screen 30 displayed on the LCD 17A after the image is reclassified.
- with reference to FIG. 7, another example of reclassification will be described.
- the user moves the face image 43A belonging to the third group 33 to the first group 31.
- the face image 43A is changed to the first group 31.
- the face image 43A is an example of a second image.
- the threshold value calculation unit 87 calculates a new threshold value (the first threshold value or the third threshold value).
- the clustering unit 84 performs image reclassification using the new threshold value.
- the clustering unit 84 classifies the face image 43A into the first group 31 and removes it from the third group 33 based on the changed first threshold value and third threshold value. Further, the clustering unit 84 automatically reclassifies the face image 43B into the first group 31 based on the changed first threshold value.
- the face image 43B is an example of a third image. In this way, by changing the criterion for classification into the first group 31 (the first classification criterion), the clustering unit 84 automatically reclassifies the face image 43B, whose feature corresponds to that criterion, into the first group 31.
- the clustering unit 84 automatically reclassifies the face images 41A and 41B into the third group 33 based on the changed third threshold value.
- the face images 41A and 41B are examples of fourth images.
- in this way, by changing the criterion for classification into the third group 33, the clustering unit 84 automatically reclassifies the face images 41A and 41B, whose features correspond to that criterion, into the third group 33.
- the display control unit 85 displays a frame 62 that surrounds the face images 43A and 43B changed to the first group 31 and the face images 41A and 41B changed to the third group 33.
- the display control unit 85 displays the face images 43A and 43B, which were previously classified into the third group 33 and have been changed to the first group 31, so that they can be identified by the frame 62.
- likewise, the display control unit 85 displays the face images 41A and 41B, which were previously classified into the first group 31 and have been changed to the third group 33, so that they can be identified by the frame 62.
- the display control unit 85 may indicate the changed portions not only with the frame 62 but also by, for example, highlighting, enlarging, brightening, or changing the color of a face image or group.
- the image management program 202 continues receiving image movement operations (step S16), changing the threshold value (step S17), receiving operations of the reclassification button 56 (step S18), reclassifying the images based on the changed threshold value (step S14), and displaying the reclassification results (step S15) until the image management program 202 is terminated (step S19: Yes).
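The control flow of steps S14–S19 can be sketched as a replay over a recorded sequence of user operations. Here `classify` and `calc_threshold` stand in for the clustering unit 84 and the threshold value calculation unit 87; the two callables, the event encoding, and the initial threshold are all assumptions for illustration.

```python
def run_steps(events, classify, calc_threshold, initial_threshold=1.0):
    """Replay steps S14-S19 over pre-recorded events. Each event is
    ("move", moved_ids), ("reclassify",), or ("quit",); the display
    of step S15 is implied by each call to classify()."""
    threshold = initial_threshold
    groups = classify(threshold)                  # step S14 (shown: step S15)
    for event in events:
        if event[0] == "move":                    # step S16: Yes
            threshold = calc_threshold(event[1])  # step S17
        elif event[0] == "reclassify":            # step S18: Yes
            groups = classify(threshold)          # steps S14-S15 again
        elif event[0] == "quit":                  # step S19: Yes
            break
    return threshold, groups
```

The variants described below (recalculating the threshold only when the reclassification button is pressed, or reclassifying immediately on every move) amount to reordering which events trigger `calc_threshold` and `classify`.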
- the image management program 202 ends when a predetermined button is operated, for example, as with the various other application programs executed by the tablet 10.
- in the above description, the threshold value calculation unit 87 calculates a new threshold value (step S17) each time an image is moved. However, the threshold value calculation unit 87 may instead calculate a new threshold value when the reclassification button 56 is operated (step S18: Yes).
- in the above description, the clustering unit 84 reclassifies the images (step S14) when the user operates the reclassification button 56 (step S18: Yes). However, the clustering unit 84 may instead automatically reclassify the images when the threshold value is changed (step S17), without waiting for a user operation.
- the image management program 202 classifies images based on face images.
- the image management program 202 is not limited to this; it may classify images into a plurality of groups based on objects included in the images.
- for example, the image management program 202 classifies images by the fruit or vegetable that appears in each image (photograph).
- the portion of an image in which a fruit or vegetable appears is an example of a portion that each image has.
- in this case, the classification part detection unit 82 detects, from the plurality of images, the portions in which fruits or vegetables appear.
- the clustering unit 84 classifies the images into a plurality of groups set for each fruit or vegetable, based on the portions of the images in which fruits or vegetables appear and the threshold value.
- the threshold value calculation unit 87 calculates a threshold value for classifying images into a plurality of groups based on a result of an image moving operation between a plurality of groups by the user.
- the clustering unit 84 changes the threshold value used for clustering to the newly calculated threshold value. Accordingly, the threshold value is changed automatically based on the user's image movement operations, without the user changing it manually, so a threshold value closer to the user's wishes can be obtained easily.
- the clustering unit 84 classifies the images into a plurality of groups based on the changed threshold value. That is, when the user performs an image movement operation, the images are reclassified based on the threshold value changed according to that operation. The user therefore does not need to reclassify images unnecessarily in order to understand how a threshold change affects the classification result, and a classification result closer to the user's wishes can be obtained sooner.
- the display control unit 85 indicates, for example with the frame 62, the portions of the screen 30 on the LCD 17A where the image classification result before the threshold change differs from the image classification result after the change.
- accordingly, the change in the classification result produced by the clustering unit 84 is presented to the user, so a classification result closer to the user's wishes can be obtained more easily.
- when the face images 43A and 43B are moved to the first group 31, for example, the threshold value calculation unit 87 changes the first threshold value so that the face images 43A and 43B are classified into the first group 31. Thus, when the images are reclassified after the threshold change, the face images 43A and 43B, which the user moved into the first group 31, are prevented from being removed from the first group 31.
- likewise, when the face images 43A and 43B are moved out of the third group 33, for example, the threshold value calculation unit 87 changes the third threshold value so that the face images 43A and 43B are removed from the third group 33. Thus, when the images are reclassified after the threshold change, the face images 43A and 43B, which the user moved out of the third group 33, are prevented from being classified into the third group 33 again.
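One way to realize both guarantees above is to widen the target group's threshold just enough to cover the moved images, and to narrow the source group's threshold just enough to exclude them. The patent does not disclose the actual update formula; the sketch below assumes Euclidean distance to group centroids, and the margin value is arbitrary.

```python
import numpy as np

def adjust_thresholds(moved_vecs, target_centroid, source_centroid,
                      t_target, t_source, margin=1e-3):
    """Update the target group's threshold (e.g. the first threshold)
    so that every moved image falls inside it, and the source group's
    threshold (e.g. the third threshold) so that none of the moved
    images is re-captured on reclassification."""
    moved = [np.asarray(v, dtype=float) for v in moved_vecs]
    tc = np.asarray(target_centroid, dtype=float)
    sc = np.asarray(source_centroid, dtype=float)
    d_target = max(np.linalg.norm(v - tc) for v in moved)  # farthest moved image
    d_source = min(np.linalg.norm(v - sc) for v in moved)  # nearest moved image
    new_t_target = max(t_target, d_target + margin)            # widen to include
    new_t_source = min(t_source, max(d_source - margin, 0.0))  # narrow to exclude
    return new_t_target, new_t_source
```

With such an update, the subsequent automatic reclassification keeps the user-moved images in place while also pulling in (or pushing out) other images whose features now fall on the other side of the changed thresholds.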
- the image management program 202 executed by the tablet 10 of this embodiment is provided by being incorporated in advance in a ROM or the like.
- the image management program 202 executed by the tablet 10 of the present embodiment may instead be recorded on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD (Digital Versatile Disk) as an installable or executable file, and provided in that form.
- the image management program 202 executed by the tablet 10 of the present embodiment may be provided by being stored on a computer connected to a network such as the Internet and downloaded via the network. Further, the image management program 202 executed by the tablet 10 of this embodiment may be provided or distributed via a network such as the Internet.
- the image management program 202 executed by the tablet 10 of the present embodiment has a module configuration including the above-described units (the image acquisition unit 81, classification part detection unit 82, feature amount calculation unit 83, clustering unit 84, display control unit 85, operation reception unit 86, and threshold value calculation unit 87).
- as actual hardware, the CPU (processor) reads out the image management program 202 from the ROM and executes it, whereby the above units are loaded onto the main storage device, and the image acquisition unit 81, classification part detection unit 82, feature amount calculation unit 83, clustering unit 84, display control unit 85, operation reception unit 86, and threshold value calculation unit 87 are generated on the main storage device.
- the processor changes at least the first classification criterion by using one or more second images changed from the second group to the first group.
- accordingly, the classification criterion is changed automatically based on the user's operation of changing an image's group, without the user changing the classification criterion manually, so a classification criterion closer to the user's wishes can be obtained easily.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Processing Or Creating Images (AREA)
Abstract
According to one embodiment, an electronic apparatus includes a processor, a display control unit, and a reception unit. The processor classifies, into a first group, a plurality of first images each having a portion whose feature corresponds to a first classification criterion, and classifies, into a second group, a plurality of second images each having a portion whose feature corresponds to a second classification criterion different from the first classification criterion. The display control unit displays, on a display, the plurality of first images included in the first group and the plurality of second images included in the second group. The reception unit receives a first operation for changing one or more of the second images included in the second group to the first group. The processor changes at least the first classification criterion using the one or more second images that have been changed from the second group to the first group.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2014/068502 WO2016006090A1 (fr) | 2014-07-10 | 2014-07-10 | Electronic apparatus, method, and program |
| JP2016532378A JPWO2016006090A1 (ja) | 2014-07-10 | 2014-07-10 | Electronic apparatus, method, and program |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2014/068502 WO2016006090A1 (fr) | 2014-07-10 | 2014-07-10 | Electronic apparatus, method, and program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016006090A1 true WO2016006090A1 (fr) | 2016-01-14 |
Family
ID=55063763
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2014/068502 Ceased WO2016006090A1 (fr) | 2014-07-10 | 2014-07-10 | Appareil électronique, procédé et programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JPWO2016006090A1 (fr) |
| WO (1) | WO2016006090A1 (fr) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017520809A (ja) * | 2015-04-08 | 2017-07-27 | Xiaomi Inc. | Album display method and apparatus |
| WO2019008961A1 (fr) * | 2017-07-07 | 2019-01-10 | NEC Corporation | Information processing device, information processing method, and program |
| WO2021177073A1 (fr) * | 2020-03-05 | 2021-09-10 | Sony Group Corporation | Information processing device and method |
| WO2021193271A1 (fr) * | 2020-03-23 | 2021-09-30 | Yuyama Mfg. Co., Ltd. | Inspection assistance system and inspection assistance program |
| JP2022526381A (ja) * | 2019-08-22 | 2022-05-24 | Shenzhen SenseTime Technology Co., Ltd. | Image processing method and apparatus, electronic device, and storage medium |
| WO2022249277A1 (fr) * | 2021-05-25 | 2022-12-01 | NEC Corporation | Image processing device, image processing method, and program |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001134763A (ja) * | 1999-11-09 | 2001-05-18 | Hitachi Ltd | Defect classification method based on captured images, and method of displaying the results |
| JP2011145791A (ja) * | 2010-01-13 | 2011-07-28 | Hitachi Ltd | Program, method, and system for generating classifier training images |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4132229B2 (ja) * | 1998-06-03 | 2008-08-13 | Renesas Technology Corp. | Defect classification method |
| JP4220595B2 (ja) * | 1998-08-10 | 2009-02-04 | Hitachi, Ltd. | Defect classification method and teaching data creation method |
| JP2011158373A (ja) * | 2010-02-02 | 2011-08-18 | Dainippon Screen Mfg Co Ltd | Teaching data creation method for automatic defect classification, automatic defect classification method, and automatic defect classification apparatus |
| JP2012174222A (ja) * | 2011-02-24 | 2012-09-10 | Olympus Corp | Image recognition program, method, and apparatus |
| JP2012208710A (ja) * | 2011-03-29 | 2012-10-25 | Panasonic Corp | Attribute estimation device |
| JP6071288B2 (ja) * | 2012-07-09 | 2017-02-01 | Canon Inc. | Image processing apparatus, image processing method, and program |
- 2014
- 2014-07-10 JP JP2016532378A patent/JPWO2016006090A1/ja active Pending
- 2014-07-10 WO PCT/JP2014/068502 patent/WO2016006090A1/fr not_active Ceased
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001134763A (ja) * | 1999-11-09 | 2001-05-18 | Hitachi Ltd | Defect classification method based on captured images, and method of displaying the results |
| JP2011145791A (ja) * | 2010-01-13 | 2011-07-28 | Hitachi Ltd | Program, method, and system for generating classifier training images |
Cited By (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017520809A (ja) * | 2015-04-08 | 2017-07-27 | Xiaomi Inc. | Album display method and apparatus |
| US9953212B2 (en) | 2015-04-08 | 2018-04-24 | Xiaomi Inc. | Method and apparatus for album display, and storage medium |
| WO2019008961A1 (fr) * | 2017-07-07 | 2019-01-10 | NEC Corporation | Information processing device, information processing method, and program |
| JPWO2019008961A1 (ja) | 2017-07-07 | 2020-06-25 | NEC Corporation | Information processing device, information processing method, and program |
| US11663184B2 (en) | 2017-07-07 | 2023-05-30 | Nec Corporation | Information processing method of grouping data, information processing system for grouping data, and non-transitory computer readable storage medium |
| JP2022526381A (ja) * | 2019-08-22 | 2022-05-24 | Shenzhen SenseTime Technology Co., Ltd. | Image processing method and apparatus, electronic device, and storage medium |
| WO2021177073A1 (fr) * | 2020-03-05 | 2021-09-10 | Sony Group Corporation | Information processing device and method |
| WO2021193271A1 (fr) * | 2020-03-23 | 2021-09-30 | Yuyama Mfg. Co., Ltd. | Inspection assistance system and inspection assistance program |
| JPWO2021193271A1 (fr) * | 2020-03-23 | 2021-09-30 | ||
| WO2022249277A1 (fr) * | 2021-05-25 | 2022-12-01 | NEC Corporation | Image processing device, image processing method, and program |
| JPWO2022249277A1 (fr) * | 2021-05-25 | 2022-12-01 | ||
| JP7556463B2 (ja) | 2021-05-25 | 2024-09-26 | NEC Corporation | Image processing device, image processing method, and program |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2016006090A1 (ja) | 2017-05-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11113523B2 (en) | Method for recognizing a specific object inside an image and electronic device thereof | |
| US10796405B2 (en) | Image processing apparatus and method, and non-transitory computer-readable storage medium storing program | |
| US10649633B2 (en) | Image processing method, image processing apparatus, and non-transitory computer-readable storage medium | |
| US9129150B2 (en) | Electronic apparatus and display control method | |
| EP3125135B1 (fr) | Image processing device and method | |
| US20190327367A1 (en) | Image processing apparatus, image processing method, and storage medium | |
| US10403014B2 (en) | Image processing apparatus and image processing method | |
| CN108694400B (zh) | Information processing apparatus, control method therefor, and storage medium | |
| EP3373201A1 (fr) | Appareil de traitement d'informations, procédé de traitement d'informations et support d'informations | |
| US20170039746A1 (en) | Image processing apparatus, image processing method, and storage medium storing program | |
| JP5799817B2 (ja) | Finger position detection device, finger position detection method, and finger position detection computer program | |
| US9734591B2 (en) | Image data processing method and electronic device supporting the same | |
| CN112954210A (zh) | Photographing method and apparatus, electronic device, and medium | |
| US11037265B2 (en) | Information processing method, information processing apparatus, and storage medium | |
| WO2016006090A1 (fr) | Electronic apparatus, method, and program | |
| US11073962B2 (en) | Information processing apparatus, display control method, and program | |
| WO2013073168A1 (fr) | Image processing device, imaging device, and image processing method | |
| CN108781252A (zh) | Image capturing method and apparatus | |
| KR102303206B1 (ko) | Method and apparatus for recognizing a specific object in an image on an electronic device | |
| CN112732961A (zh) | Image classification method and apparatus | |
| EP3128733B1 (fr) | Méthode de traitement d'informations, appareil de traitement d'informations et programme | |
| JP6419560B2 (ja) | Search device, method, and program | |
| EP4024347B1 (fr) | Appareil de traitement d'images, procédé de traitement d'images et programme | |
| JP2014209326A (ja) | Image processing apparatus, image processing method, and image processing program | |
| US20150205434A1 (en) | Input control apparatus, input control method, and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14897372 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2016532378 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 14897372 Country of ref document: EP Kind code of ref document: A1 |