US20160353014A1 - Information processing apparatus, information processing method, and medium using an action state of a user - Google Patents
Information processing apparatus, information processing method, and medium using an action state of a user
- Publication number
- US20160353014A1 (application Ser. No. 15/235,703)
- Authority
- US
- United States
- Prior art keywords
- image
- user
- information processing
- unit
- state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N5/23219—
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00352—Input means
- H04N1/00392—Other manual input means, e.g. digitisers or writing tablets
- G06K9/00664—
- G06K9/2054—
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00204—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a digital computer or a digital computer system, e.g. an internet server
- H04N1/00209—Transmitting or receiving image data, e.g. facsimile data, via a computer, e.g. using e-mail, a computer network, the internet, I-fax
- H04N1/00214—Transmitting or receiving image data, e.g. facsimile data, via a computer, e.g. using e-mail, a computer network, the internet, I-fax details of transmission
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/0077—Types of the still picture apparatus
- H04N2201/0084—Digital still camera
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- JP 4289326B discloses a technology that recognizes an action of a user holding a camcorder based on sensor data obtained by a sensor built into the camcorder at the same timing as photographing by the camcorder, and records the recognized action in association with the photographed image.
- However, the above-mentioned technology may be unable to reduce the communication traffic adaptively.
- For example, an image to be transmitted to an external device may be difficult to select as desired.
- In that case, all of the images need to be transmitted, and when there are a large number of photographed images, the communication traffic increases.
- In addition, although the above-mentioned technology can reduce the amount of information of the photographed image, it may be difficult to avoid reducing the amount of information uniformly for all of the images.
- As a result, a situation undesirable for the user occurs, such as when even the resolution of an image region containing an object watched by the user during photographing is reduced.
- Therefore, the present disclosure proposes a novel and improved information processing apparatus, information processing method, and program capable of adaptively reducing the communication traffic when transmitting a photographed image to an external device.
- an information processing apparatus including an action recognition unit configured to recognize an action state of a user based on a measurement result obtained by a sensor carried by the user, an image processing unit configured to perform a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized by the action recognition unit during photographing by the photographing unit carried by the user, and a transmission control unit configured to cause the image processed by the image processing unit to be transmitted to an image processing device used to perform image recognition.
- an information processing method including recognizing an action state of a user based on a measurement result obtained by a sensor carried by the user, performing, by a processor, a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized during photographing by the photographing unit carried by the user, and causing the processed image to be transmitted to an image processing device used to perform image recognition.
- FIG. 1 is a diagram for describing the basic configuration of an information processing system that is common to each embodiment of the present disclosure
- FIG. 2 is a diagram for describing the hardware configuration of an information processing apparatus 10 according to each embodiment of the present disclosure
- FIG. 3 is a functional block diagram illustrating the configuration of the information processing apparatus 10 according to a first embodiment of the present disclosure
- FIG. 4 is a diagram for describing an example of the action recognition by an action recognition unit 102 according to the first embodiment
- FIG. 5 is a diagram for describing an example of the action recognition by the action recognition unit 102 according to the first embodiment
- FIG. 6 is a diagram for describing an example of the action recognition by the action recognition unit 102 according to the first embodiment
- FIG. 7 is a diagram for describing an example of the action recognition by the action recognition unit 102 according to the first embodiment
- FIG. 9 is a diagram for describing an example of the photographing control by the photographing control unit 104 according to the first embodiment
- FIG. 10 is a diagram for describing an example of the display control by a display control unit 108 according to the first embodiment
- FIG. 11 is a sequence diagram illustrating the operation according to the first embodiment
- FIG. 12 is a functional block diagram illustrating the configuration of the information processing apparatus 10 according to a second embodiment of the present disclosure
- FIG. 13 is a diagram for describing an example of the image processing by an image processing unit 110 according to the second embodiment
- FIG. 14 is a diagram for describing an example of the image processing by the image processing unit 110 according to the second embodiment
- FIG. 15 is a diagram for describing a state in which a user looks forward and then is stationary
- FIG. 16 is a diagram for describing a state in which a user is walking while looking forward
- FIG. 17 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment.
- FIG. 18 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment.
- FIG. 19 is a diagram for describing a state in which a user looks down and then is stationary
- FIG. 20 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment
- FIG. 21 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment.
- FIG. 22 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment
- FIG. 23 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment.
- FIG. 24 is a sequence diagram illustrating operation according to the second embodiment.
- The present disclosure may be implemented in various embodiments, as described in detail in “3-1. First Embodiment” and “3-2. Second Embodiment” and their subsections, as examples. To clearly illustrate the features of the embodiments of the present disclosure, the technical background that led to the conception of an information processing apparatus according to an embodiment of the present disclosure is described first.
- Wearable devices that have a built-in image sensor and are used by being mounted constantly on the user have recently been studied.
- Such a wearable device is likely to be able to constantly recognize an object, a person, or the like contained in an image captured by a camera attached thereto, and to provide the user with information related to, for example, an object or a person viewed by the user based on the recognition results.
- The information processing apparatus 10 is capable of adaptively controlling the timing of photographing depending on the action state of the user.
- In addition, when transmitting a photographed image to a server 20, the information processing apparatus 10 can adaptively reduce the communication traffic.
- the information processing system includes the information processing apparatus 10 , a communication network 12 , and the server 20 .
- the information processing apparatus 10 is an example of the information processing apparatus according to an embodiment of the present disclosure.
- the information processing apparatus 10 is, for example, a device provided with a glasses-type display, as illustrated in FIG. 1 .
- a translucent see-through type display can be employed as the glasses-type display. This see-through type display enables the user to view the outside environment through the display.
- the information processing apparatus 10 is used by being worn on the user's head.
- the information processing apparatus 10 has a camera 166 , which will be described later, at a position in the periphery of the display. The user can photograph a landscape at which the user is looking with the camera 166 while moving by wearing the information processing apparatus 10 .
- the information processing apparatus 10 may have such hardware configuration as illustrated in FIG. 2 .
- the information processing apparatus 10 is configured to include a central processing unit (CPU) 150 , a read only memory (ROM) 152 , a random access memory (RAM) 154 , an internal bus 156 , an interface 158 , an output device 160 , a storage device 162 , a communication device 164 , a camera 166 , a position information measuring device 168 , an acceleration sensor 170 , a gyroscope 172 , and a microphone 174 .
- The CPU 150 is composed of various types of processing circuits and serves as a controller 100 for controlling the entire information processing apparatus 10.
- the CPU 150 implements each function of an action recognition unit 102 , a photographing control unit 104 , a transmission control unit 106 , a display control unit 108 , an image processing unit 110 , a face region detection unit 112 , and a blur determination unit 114 , which are described later.
- The ROM 152 stores a program used by the CPU 150, and also stores control data such as operation parameters to be used by the CPU 150.
- the RAM 154 stores temporarily a program, for example, to be executed by the CPU 150 .
- The interface 158 connects the output device 160, the storage device 162, the communication device 164, the camera 166, the position information measuring device 168, the acceleration sensor 170, the gyroscope 172, and the microphone 174 with the internal bus 156.
- the output device 160 exchanges data with the CPU 150 and other components via the interface 158 and the internal bus 156 .
- the output device 160 includes a display device such as a liquid crystal display (LCD), an organic light emitting diode (OLED), and a lamp.
- This display device displays an image captured by the camera 166 , an image generated by the CPU 150 , or the like.
- the output device 160 includes an audio output device such as a loudspeaker. This audio output device converts audio data or the like into sound and outputs it.
- the storage device 162 is a device for storing data, which is used to store a program or various data to be executed by the CPU 150 .
- the storage device 162 includes a storage medium, a recording device for recording data in a storage medium, a reading device for reading out data from a storage medium, a deletion device for deleting data recorded in a storage medium, or the like.
- the communication device 164 is a communication interface that is composed of a communication device or the like used to connect to a communication network such as a public network or the Internet.
- the communication device 164 may be a wireless LAN compatible communication device, a long-term evolution (LTE) compatible communication device, or a wired communication device that performs communication through a wired line.
- the communication device 164 serves, for example, as a communication unit 120 that will be described later.
- the camera 166 has functions of forming an image obtained from the outside through a lens on an image sensor, such as a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS), and of photographing still or moving images.
- The position information measuring device 168 receives a positioning signal from a positioning satellite of, for example, the Global Positioning System (GPS) or a global navigation satellite system (GLONASS), and thus measures its current position.
- The position information measuring device 168 may have functions of receiving Wi-Fi (registered trademark) radio waves from a plurality of base stations and measuring its current position based on the reception intensity of the received radio waves and the position of each base station.
- the position information measuring device 168 may have a function of measuring its current position based on communications with a Bluetooth access point.
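- As a rough illustration of this kind of signal-strength-based positioning, the sketch below estimates a position as a centroid of known base-station positions weighted by received signal strength. The function name, the dBm-to-weight mapping, and all values are illustrative assumptions, not details from the present disclosure.

```python
# Hypothetical sketch of RSSI-weighted positioning (all names assumed).

def estimate_position(observations):
    """observations: list of ((x, y), rssi_dbm) pairs, one per base station."""
    # Map dBm readings (e.g. -90 .. -30) to positive weights so that
    # stronger signals pull the estimate toward their base station.
    weights = [10 ** (rssi / 20.0) for _, rssi in observations]
    total = sum(weights)
    x = sum(w * pos[0] for ((pos, _), w) in zip(observations, weights)) / total
    y = sum(w * pos[1] for ((pos, _), w) in zip(observations, weights)) / total
    return (x, y)

# Example: three base stations at known positions; the strongest signal
# comes from the station at (0, 0), so the estimate lands near it.
print(estimate_position([((0, 0), -40), ((10, 0), -70), ((0, 10), -75)]))
```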
- the position information measuring device 168 serves as a measurement unit 122 that will be described later.
- the acceleration sensor 170 measures the acceleration of the information processing apparatus 10 .
- the acceleration sensor 170 serves as the measurement unit 122 .
- the gyroscope 172 measures the angle or the angular velocity of the information processing apparatus 10 .
- the gyroscope 172 detects the inertial force or the Coriolis force applied to the information processing apparatus 10 and thus measures the angular velocity of the information processing apparatus 10 .
- the gyroscope 172 serves as the measurement unit 122 .
- the microphone 174 collects sound coming from outside.
- the microphone 174 serves as the measurement unit 122 .
- the hardware configuration of the information processing apparatus 10 is not limited to the above-described configuration.
- the information processing apparatus 10 may be configured without any one or more of the storage device 162 , the position information measuring device 168 , the acceleration sensor 170 , the gyroscope 172 , and the microphone 174 .
- The communication network 12 is a wired or wireless transmission channel for information transmitted from a device connected to the communication network 12.
- the communication network 12 may include public networks such as the Internet, telephone network, and satellite communication network, various local area networks (LANs) including Ethernet (registered trademark), and wide area networks (WANs).
- the communication network 12 may include a leased line network such as Internet protocol virtual private network (IP-VPN).
- the server 20 is an exemplary image processing apparatus according to an embodiment of the present disclosure.
- the server 20 has a function of performing image recognition on a photographed image.
- The server 20 includes a storage unit (not illustrated) for storing a plurality of pieces of reference data that include various types of information on the real world.
- For example, when receiving a photographed image from the information processing apparatus 10, the server 20 performs image recognition on the photographed image, and can thus recognize an object, a person, or the like contained in the photographed image.
- the server 20 can extract additional information that is information related to the recognized object, person, or the like from the plurality of data for reference stored in the storage unit, and can transmit the extracted additional information to the information processing apparatus 10 .
- Such a function of the server 20 makes it possible for the user carrying the information processing apparatus 10 to be notified by the server 20 of detailed information related to buildings, goods, persons, or the like viewed by the user while moving around in the real world.
- the user can know the reception of additional information from the server 20 through a change in video, audio, or vibration outputted from the output device 160 of the information processing apparatus 10 .
- For example, the output device 160 superimposes the additional information on a display screen or changes the mode of vibration, thereby informing the user of the contents of the information or the direction of its location.
- FIG. 3 is a functional block diagram illustrating the configuration of the information processing apparatus 10 according to the first embodiment.
- the information processing apparatus 10 is configured to include a controller 100 , a communication unit 120 , a measurement unit 122 , a photographing unit 124 , and a display unit 126 .
- the controller 100 controls the overall operation of the information processing apparatus 10 using hardware, such as the CPU 150 and RAM 154 , which is built in the information processing apparatus 10 .
- the controller 100 is configured to include an action recognition unit 102 , a photographing control unit 104 , a transmission control unit 106 , and a display control unit 108 .
- the action recognition unit 102 recognizes an action state of the user based on the results of measurement by the measurement unit 122 that will be described later.
- the measurement unit 122 is an example of the sensor according to an embodiment of the present disclosure.
- the action state includes, for example, a movement state of the user, a vision-related state of the user, and a voice-related state of the user. The detailed processing of each state is described below.
- the action recognition unit 102 is able to recognize a movement state of the user based on the results of measurement by the measurement unit 122 .
- The movement state indicates a state in which the user is walking, running, riding a bicycle, or riding in a vehicle such as an automobile, train, or plane, or a state in which the user is stationary, such as sitting in a chair.
- FIG. 4 is a graph showing values obtained by measurement of the acceleration sensor 170 in association with measurement time.
- the changes in acceleration having a substantially similar waveform are measured in time zones of time t 1 to t 2 , time t 2 to t 3 , and time t 3 to t 4 .
- the change in acceleration having such a waveform approximates the change in acceleration when a person is walking.
- the action recognition unit 102 recognizes that the user is in walking state in the time zone from time t 1 to t 4 , based on the measurement results obtained by the acceleration sensor 170 , which are shown in FIG. 4 .
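- As a minimal sketch of this recognition, the code below checks whether the acceleration magnitude repeats at a step-like period using autocorrelation, mirroring the repeating waveform of FIG. 4. The sampling rate, frequency band, and correlation threshold are illustrative assumptions, not values from the present disclosure.

```python
import numpy as np

def is_walking(accel_mag, fs=50, min_hz=1.0, max_hz=3.0, min_corr=0.6):
    """Judge a walking state from the periodicity of acceleration magnitudes.

    accel_mag: 1-D sequence of acceleration magnitudes sampled at fs Hz.
    All thresholds here are illustrative assumptions.
    """
    x = np.asarray(accel_mag, dtype=float)
    lo, hi = int(fs / max_hz), int(fs / min_hz)   # plausible step periods
    if len(x) <= hi or np.allclose(x, x.mean()):
        return False
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac /= ac[0]                                   # normalize so ac[0] == 1
    lag = lo + int(np.argmax(ac[lo:hi]))          # strongest repeat in band
    return bool(ac[lag] >= min_corr)
```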
- the action recognition unit 102 can recognize a vision-related state of the user based on the measurement results obtained by the measurement unit 122 .
- the vision-related state indicates in what state and under what circumstances the user looks at an object.
- the vision-related state indicates a state in which the user is watching a particular object or person, a state in which the user is looking around, or the direction in which the user is looking, such as the front or a vertical or horizontal direction.
- the action recognition unit 102 can recognize the user's vision-related state based on the movement of the user's head that is measured by the gyroscope 172 . For example, when the gyroscope 172 measures that the speed of movement of the user's head is reduced to less than or equal to a predetermined value, the action recognition unit 102 recognizes that the user is watching.
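- In the simplest reading, this amounts to a threshold test on recent gyroscope readings, as in the hedged sketch below; the threshold value stands in for the "predetermined value" and is not specified by the present disclosure.

```python
def recognize_watching(angular_speeds, threshold=0.05):
    """Recognize a watching state when the head's angular speed stays small.

    angular_speeds: recent gyroscope readings in rad/s; the threshold is an
    illustrative stand-in for the predetermined value in the text.
    """
    avg = sum(angular_speeds) / len(angular_speeds)
    return avg <= threshold
```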
- FIG. 5 is an example of the graph showing measurement values obtained by the acceleration sensor 170 in association with measurement time.
- the acceleration fluctuations in the time zone from time t 1 to t 2 are large, and after time t 2 , the fluctuation amount is significantly reduced and thus measured values of the acceleration are at or close to zero.
- the fluctuations in acceleration having the waveform as shown in FIG. 5 approximate the fluctuations in acceleration of the head when a person finds something and watches it at approximately time t 2 .
- FIG. 6 is an example of the graph showing measurement values of the angle of the user's head obtained by the gyroscope 172 in association with measurement time.
- the angular fluctuations in the time zone from time t 1 to t 2 are large, and after time t 2 , the amount of fluctuations is significantly reduced and thus measured values of the angle are at a substantially fixed value.
- The angle fluctuations having the waveform as shown in FIG. 6 approximate the fluctuations in the angle of the head when the user moves his head and then stops its motion, such as when a person finds something and watches it at approximately time t2.
- the action recognition unit 102 recognizes that the user is watching after the time t 2 , based on the measurement results obtained by the acceleration sensor 170 as shown in FIG. 5 and/or the measurement results obtained by the gyroscope 172 as shown in FIG. 6 .
- the action recognition unit 102 also can recognize the action of the user moving his head and then stopping the motion based on the change in magnitude of the sound of blowing wind that is detected by the microphone 174 instead of the gyroscope 172 .
- FIG. 7 is another example of the graph showing the values obtained by measuring the angle of the user's head in association with the measurement time.
- The angle fluctuates gradually between the angles θ1 and θ2 over the whole measurement period.
- The fluctuations in angle having the waveform as shown in FIG. 7 approximate the fluctuations in the angle of a person's head, for example, when the person is looking around, wandering, or looking down.
- In this case, the action recognition unit 102 recognizes that, for example, the user is looking around over the whole measurement period based on the measurement results obtained by the gyroscope 172 as shown in FIG. 7.
- the action recognition unit 102 can recognize a voice-related state of the user based on the measurement results obtained by the measurement unit 122 .
- the voice-related state indicates, for example, a state in which a user is talking with other people, a state in which a user is silent, or the degree of magnitude of the voice produced by a user.
- the action recognition unit 102 recognizes that a user is talking with other people based on the measurement results obtained by the microphone 174 .
- the photographing control unit 104 controls the timing at which a photographing unit 124 described later is allowed to perform photographing based on the action state of the user recognized by the action recognition unit 102 .
- the following description herein is given based on an example in which the photographing control unit 104 allows the photographing unit 124 to capture still images.
- This is because an embodiment of the present disclosure may be intended to reduce the power consumption as much as possible; however, the embodiment is not limited to such an example.
- the photographing control unit 104 may allow the photographing unit 124 to capture moving images.
- The photographing control unit 104 can change the timing at which the photographing unit 124 is allowed to perform photographing based on whether the user is moving or not. For example, the photographing control unit 104 sets the frequency with which the photographing unit 124 is allowed to perform photographing when the user is stationary to be lower than the frequency set when the user is moving. In general, when a user is stationary, it is assumed that the landscape viewed by the user changes little. Thus, with control example 1, an image that is assumed to be undesirable to the user or not considered important can be prevented from being captured.
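- One hedged way to realize control example 1 is an interval table keyed by the recognized movement state, as sketched below; the interval values and the callable names are assumptions for illustration only.

```python
import time

# Illustrative capture intervals in seconds; the text only requires that the
# stationary frequency be lower than the frequency used while moving.
CAPTURE_INTERVAL = {"stationary": 60.0, "walking": 5.0, "running": 3.0}

def capture_loop(recognize_action_state, take_photo):
    """Poll the recognized action state and photograph at a state-dependent rate."""
    last_shot = 0.0
    while True:
        state = recognize_action_state()              # e.g. "walking"
        interval = CAPTURE_INTERVAL.get(state, 10.0)  # default for other states
        now = time.monotonic()
        if now - last_shot >= interval:
            take_photo()
            last_shot = now
        time.sleep(0.1)
```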
- FIG. 9 is an example of the graph showing the measurement results obtained by the acceleration sensor 170 in association with measurement time.
- FIG. 9 there are illustrated the results in which the acceleration fluctuations in the time zones from time t 1 to t 7 that are similar to the acceleration fluctuations in the time zone from time t 1 to t 2 shown in FIG. 4 are continuously measured.
- the duration between time tb and tc is assumed to be the same as the duration between time ta and tb.
- the action recognition unit 102 recognizes that a user is in walking state.
- In this case, the photographing control unit 104 allows the photographing unit 124 to perform photographing at the time interval between times ta and tb, so that photographing is performed, for example, at times ta, tb, and tc in FIG. 9.
- According to control example 1, it is possible to photograph a sequence of images that captures the change in landscape with the movement of the user, even without capturing moving images. Accordingly, the number of times the photographing unit 124 performs photographing can be decreased, thereby reducing the power consumption.
- the photographing control unit 104 can change the timing at which the photographing unit 124 is allowed to perform photographing based on the vision-related state of a user that is recognized by the action recognition unit 102 . For example, when the action recognition unit 102 recognizes that a user is watching as immediately after time t 2 shown in FIG. 5 or 6 , the photographing control unit 104 allows the photographing unit 124 to perform photographing.
- According to control example 2, when a user is watching, for example, goods displayed in a department store, a building or structure at a tourist spot, or a person whom the user saw on the street, it is possible to photograph the object being watched by the user in a reliable and immediate manner.
- As a modification, the photographing control unit 104 may allow the photographing unit 124 to perform photographing continuously in accordance with the movement of the user's head or neck. According to this modification, there is an advantage that a change in what the user is watching during photographing can be recorded.
- The photographing control unit 104 is also able to allow the photographing unit 124 to perform photographing when the action recognition unit 102 recognizes that a user has spoken. According to this control example 3, when a user is talking with another person, it is possible to photograph the conversation partner automatically, for example, without the user having to release the shutter.
- The transmission control unit 106 causes the communication unit 120 to transmit, for example, an image photographed by the photographing unit 124 to the server 20.
- the display control unit 108 allows the display unit 126 , which will be described later, to display, for example, various character strings or images such as additional information received from the server 20 .
- FIG. 10 is a diagram for describing a display example in which additional information received from the server 20 is displayed on the display unit 126 .
- the left side view of FIG. 10 is an example of a photographed image (a photographed image 30 ) captured by the photographing unit 124 .
- the right side view of FIG. 10 is an example (a picture 40 ) in which the photographed image 30 is transmitted to the server 20 and then additional information received from the server 20 is displayed on the display unit 126 .
- FIG. 10 illustrates an example in which the information processing apparatus 10 receives, from the server 20, additional information about the advertisement of “ON SALE (10/1-10/31)” for a department store 300 contained in the photographed image 30 and additional information indicating the station name of “STATION X ON LINE A” for a station 302.
- the display control unit 108 superimposes a display that indicates additional information received from the server 20 on the picture 40 .
- When the display unit 126 is composed of a see-through type display, the picture 40 is the landscape at which the user is actually looking through the display.
- Alternatively, the display control unit 108 may display an image that is the same as the photographed image 30 as the picture 40.
- the communication unit 120 transmits and receives information to and from various types of devices connected to the communication network 12 , for example, by wireless communication.
- the communication unit 120 transmits the image photographed by the photographing unit 124 to the server 20 under the control of the transmission control unit 106 .
- the communication unit 120 receives the above-described additional information from the server 20 .
- the measurement unit 122 is composed, for example, of the position information measuring device 168 , the acceleration sensor 170 , the gyroscope 172 , and the microphone 174 .
- the measurement unit 122 measures acceleration of the information processing apparatus 10 , an angle of the information processing apparatus 10 , or the sound coming from outside.
- The measurement unit 122 is basically assumed to perform measurement continuously. This is because, for example, although various sensors such as the acceleration sensor 170 are continuously activated, the power they continuously consume is substantially smaller than the power consumed by photographing with the camera 166.
- the photographing unit 124 photographs an outside still or moving image under the control of the photographing control unit 104 .
- The photographing unit 124 can also photograph an outside still or moving image, for example, in accordance with an instruction from the user to an input device (not illustrated) such as a button attached to the information processing apparatus 10.
- the following description herein is given based on an example in which the photographing unit 124 photographs an image under the control of the photographing control unit 104 .
- the display unit 126 displays, for example, various character strings or images such as additional information received from the server 20 under the control of the display control unit 108 .
- the configuration of the information processing apparatus 10 according to the first embodiment is not limited to the above-described configuration.
- the measurement unit 122 may not be included in the information processing apparatus 10 but may be included in other devices.
- FIG. 11 is a sequence diagram illustrating the operation according to the first embodiment.
- the measurement unit 122 of the information processing apparatus 10 measures, for example, acceleration of the information processing apparatus 10 , an angle of the information processing apparatus 10 , or the sound coming from outside (S 101 ).
- the action recognition unit 102 recognizes an action state of the user based on the measurement results measured in step S 101 (S 102 ).
- The photographing control unit 104 determines whether the present is the timing at which the photographing unit 124 is allowed to perform photographing based on the action state of the user recognized in step S102 (S103). If it is not determined that the present is the timing at which the photographing unit 124 is allowed to perform photographing (S103: NO), then the information processing apparatus 10 performs the operation of S101 again.
- If it is determined that the present is the timing for photographing (S103: YES), the photographing control unit 104 adjusts various types of parameters, such as photographic sensitivity or shutter speed, to appropriate values, for example, based on information including the brightness of the surrounding environment (S104). Then, the photographing control unit 104 allows the photographing unit 124 to perform photographing (S105).
- The transmission control unit 106 determines whether the image photographed in step S105 is to be transmitted to the server 20, for example, based on a predetermined condition such as whether the photographed image contains an object, a person, or the like (S106). If it is determined that the image is not to be transmitted (S106: NO), then the information processing apparatus 10 performs the operation of step S101 again.
- If it is determined that the image is to be transmitted (S106: YES), the transmission control unit 106 causes the communication unit 120 to transmit the photographed image to the server 20 (S107).
- the server 20 performs image recognition on the image received from the information processing apparatus 10 (S 108 ). Then, the server 20 extracts additional information about the object, person, or the like recognized from the received image, for example, from a large amount of reference data stored in a storage unit of the server 20 (S 109 ). Then, the server 20 transmits the extracted additional information to the information processing apparatus 10 (S 110 ).
- the display control unit 108 of the information processing apparatus 10 allows the display unit 126 to display the additional information received from the server 20 (S 111 ).
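- The client-side flow of FIG. 11 can be condensed into the sketch below, with each step passed in as a callable; the function names are hypothetical and the server round trip is treated as a single call.

```python
def run_once(measure, recognize, is_photo_timing, photograph,
             should_transmit, send_to_server, display):
    """One pass over steps S101-S111 of FIG. 11 (all names illustrative)."""
    measurement = measure()                   # S101: sensor measurement
    state = recognize(measurement)            # S102: action recognition
    if not is_photo_timing(state):            # S103: timing decision
        return
    image = photograph()                      # S104-S105: set params, shoot
    if not should_transmit(image):            # S106: transmission decision
        return
    additional_info = send_to_server(image)   # S107-S110: server round trip
    display(additional_info)                  # S111: display additional info
```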
- the information processing apparatus 10 recognizes an action state of the user based on the measurement results obtained by the measurement unit 122 and controls the timing at which the photographing unit 124 is allowed to perform photographing based on the recognized action state. Thus, it is possible to control adaptively the timing of photographing depending on the action state of the user.
- For example, the information processing apparatus 10 sets the frequency with which the photographing unit 124 is allowed to perform photographing when the user is stationary to be lower than the frequency set when the user is moving.
- In general, when a user is stationary, it is assumed that the landscape at which the user is looking changes little. Thus, an image that is assumed to be undesirable to the user or not considered important can be prevented from being photographed. Accordingly, the power consumed by photographing can be reduced.
- an information processing apparatus 10 can reduce the amount of information of the image transmitted to the server 20 , thereby reducing the power consumption.
- FIG. 12 is a functional block diagram illustrating the configuration of the information processing apparatus 10 according to the second embodiment.
- the information processing apparatus 10 further includes an image processing unit 110 , a face region detection unit 112 , and a blur determination unit 114 , as compared with the configuration of the first embodiment.
- A transmission control unit 107 allows the communication unit 120 to transmit an image processed by an image processing unit 110 described later to the server 20. More specifically, the transmission control unit 107 can cause an image that is generated by the image processing unit 110 from the photographed image captured by the photographing unit 124 and has a reduced amount of information to be transmitted to the server 20.
- the transmission control unit 107 causes the image of a particular region clipped from the photographed image by the image processing unit 110 to be transmitted to the server 20 .
- For example, the transmission control unit 107 causes the image obtained by reducing the resolution of the photographed image by the image processing unit 110 to be transmitted to the server 20.
- the transmission control unit 107 causes an image of one or more face regions of a person that is clipped from the photographed image by the image processing unit 110 to be transmitted to the server 20 .
- the transmission control unit 107 may also cause the photographed image to be not transmitted to the server 20 .
- the image processing unit 110 performs a process regarding the amount of information on the image photographed by the photographing unit 124 based on an action state of the user that is recognized by the action recognition unit 102 during photographing by the photographing unit 124 . More specifically, the image processing unit 110 can generate an image obtained by reducing the amount of information from the photographed image using a way corresponding to the action state of the user that is recognized by the action recognition unit 102 during photographing by the photographing unit 124 .
- For example, when the action recognition unit 102 recognizes that a user looks around, downward, or upward during photographing, the image processing unit 110 generates an image with compressed resolution from the photographed image.
- FIG. 13 is a diagram for describing a generation example in which an image with compressed resolution (processed image 50 ) is generated from the photographed image (photographed image 30 ).
- For example, the image processing unit 110 generates the processed image 50, which is an image with a compressed resolution of 320×240, from the photographed image 30 with a resolution of 640×480.
- According to processing example 1, by compressing the resolution of the photographed image, it is possible to reduce the amount of information of the image without narrowing the photographing range that the user intended to photograph.
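- A minimal sketch of processing example 1, assuming the Pillow library (the present disclosure names no library), follows.

```python
from PIL import Image

def compress_resolution(photo: Image.Image, factor: int = 2) -> Image.Image:
    """Downscale the photographed image, e.g. 640x480 -> 320x240 for factor=2."""
    w, h = photo.size
    return photo.resize((w // factor, h // factor), Image.LANCZOS)

# Usage sketch: compress_resolution(Image.open("photo.jpg")).save("small.jpg")
```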
- the image processing unit 110 can generate an image obtained by clipping a predetermined region from the photographed image.
- FIG. 14 is a diagram for describing a generation example in which an image (processed image 50) is generated by clipping a predetermined region from the photographed image (photographed image 30).
- For example, the image processing unit 110 generates the processed image 50 by clipping a region with a resolution of 320×240 from the central portion of the photographed image 30, which has a resolution of 640×480.
- the size of the predetermined region may be set, for example, in accordance with the maximum bit width of the communication line for transmission to the server 20 .
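- A hedged sketch of this central clipping, again assuming Pillow, is shown below; the 320x240 output size mirrors the example of FIG. 14.

```python
from PIL import Image

def clip_center(photo: Image.Image, out_w: int = 320, out_h: int = 240) -> Image.Image:
    """Clip an out_w x out_h region from the center of the photographed image."""
    w, h = photo.size
    left, top = (w - out_w) // 2, (h - out_h) // 2
    return photo.crop((left, top, left + out_w, top + out_h))
```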
- the image processing unit 110 can also generate an image obtained by clipping a predetermined region from the photographed image.
- FIG. 15 is a diagram for describing a state in which a user is stationary while looking forward.
- FIG. 15 illustrates the visible region 32 at which the user is looking as a plan view for simplicity of description.
- FIG. 16 is a diagram for describing a state in which a user is walking while looking forward.
- the field of view of the user when walking is smaller than that when the user is stationary, and thus the user tends to look at a region 52 smaller than the visible region 32 of the user when stationary, as illustrated in FIG. 16 .
- the image processing unit 110 may generate an image that is obtained by clipping a region within a predetermined distance from the center in the photographed image as illustrated in FIG. 17 .
- FIG. 17 illustrates a generation example of an image (processed image 50 ) obtained by clipping a region within the distance d 1 from the center in the photographed image (photographed image 30 ).
- an image of an object, a person, or the like that is watched by the user during photographing can be clipped appropriately, and the amount of information of an image can be reduced.
- the image processing unit 110 may generate an image that is obtained by clipping a peripheral region at a predetermined distance or more away from the center in the photographed image as illustrated in FIG. 18 .
- FIG. 18 illustrates a generation example of the image (processed image 50 ) that is obtained by clipping a peripheral region at the distance d 1 or more away from the center in the photographed image (photographed image 30 ).
- this generation example it is possible to extract an image of an object, a person, or the like that is not watched by the user during photographing.
- When the clipped image is transmitted to the server 20 and additional information regarding the clipped image is received from the server 20, it is advantageously possible for the user to learn information regarding an object or a person that the user did not notice or paid little attention to during photographing.
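- Because the peripheral region is not rectangular, one illustrative realization is to blank the central disc and keep the rest, as sketched below; the present disclosure does not specify how such a region is encoded.

```python
from PIL import Image, ImageDraw

def clip_periphery(photo: Image.Image, d1: int) -> Image.Image:
    """Keep only pixels at distance d1 or more from the image center.

    Blanking the central disc is an illustrative choice; assumes an RGB image.
    """
    out = photo.copy()
    w, h = out.size
    cx, cy = w // 2, h // 2
    draw = ImageDraw.Draw(out)
    draw.ellipse((cx - d1, cy - d1, cx + d1, cy + d1), fill=(0, 0, 0))
    return out
```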
- the image processing unit 110 can clip a region from the photographed image such that the region to be clipped is within a predetermined distance from the lower end of the photographed image.
- FIG. 19 is a diagram for describing a state in which a user looks downward and is stationary. As illustrated in FIG. 19, in general, when a user tilts his head to look down, the user casts his eyes downward, and thus tends to look at a region 52 lower than the visible region 32 seen when the eyes look forward.
- the image processing unit 110 may generate an image (processed image 50 ) that is obtained by clipping a region from the photographed image (photographed image 30 ) such that the region to be clipped is within the distance d 2 from the lower end of the photographed image as illustrated in FIG. 20 .
- the value of the distance d 2 can be set as an appropriate fixed value, for example, by performing a user test in advance.
- the image processing unit 110 can clip a region from the photographed image such that the region to be clipped is within a predetermined distance from the upper end of the photographed image.
- the image processing unit 110 may generate an image (processed image 50 ) that is obtained by clipping a region from the photographed image (photographed image 30 ) such that the region to be clipped is within the distance d 2 from the upper end of the photographed image 30 as illustrated in FIG. 21 .
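- Both the looking-down and looking-up cases reduce to clipping a horizontal band within distance d2 of one image edge, as in the sketch below (Pillow assumed; d2 given in pixels).

```python
from PIL import Image

def clip_edge_band(photo: Image.Image, d2: int, edge: str = "lower") -> Image.Image:
    """Clip the band within distance d2 of the lower or upper edge of the image."""
    w, h = photo.size
    if edge == "lower":               # user looking down (FIG. 20)
        return photo.crop((0, h - d2, w, h))
    return photo.crop((0, 0, w, d2))  # user looking up (FIG. 21)
```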
- the image processing unit 110 can generate an image obtained by clipping the detected face region from the photographed image.
- For example, the image processing unit 110 may generate an image obtained by clipping the entire detected face region from the photographed image.
- Alternatively, the image processing unit 110 may generate an image obtained by clipping only a partial region of the detected face from the photographed image.
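- As an illustration only, the sketch below clips detected face regions using OpenCV's bundled Haar cascade; the present disclosure describes feature-point extraction (eyes, nose, facial contour) without naming a specific detector, so the detector choice here is an assumption.

```python
import cv2

def clip_face_regions(photo_bgr):
    """Return a list of face crops from a BGR image (OpenCV is an assumption)."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [photo_bgr[y:y + h, x:x + w] for (x, y, w, h) in faces]
```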
- The image processing unit 110 can also correct a blur contained in the photographed image based on the action state of the user recognized by the action recognition unit 102 during photographing. For example, when the action recognition unit 102 recognizes that a user is in a walking state during photographing, the image processing unit 110 may correct the photographed image using a program for blur correction corresponding to a walking state, which is stored in advance in the storage device 162.
- the image processing unit 110 may correct the photographed image using a program, which is previously stored in the storage device 162 , for blur correction corresponding to features of each action of the user.
- the image processing unit 110 may correct a blur contained in the photographed image depending on the amount of change in movement or the amount of change in angle in three-dimensional space of the information processing apparatus 10 , which is measured by the measurement unit 122 during photographing.
- the face region detection unit 112 is able to detect a face region of a person contained in the photographed image.
- the face region detection unit 112 detects a face region of a person by extracting feature points such as eye, nose, or facial contour in the photographed image.
- the blur determination unit 114 determines whether the photographed image is blurred to be greater than or equal to a threshold. For example, the blur determination unit 114 determines whether the photographed image is blurred to be greater than or equal to a threshold depending on the magnitude of the angle fluctuations measured by the gyroscope 172 during photographing.
- the threshold may be a value that is set by, for example, a designer or user of the information processing apparatus 10 .
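- A hedged sketch of such a gyroscope-based blur test follows; the statistic and the default threshold are illustrative assumptions.

```python
import statistics

def is_blurred(angles_during_exposure, threshold=0.02):
    """Judge blur from angle fluctuations measured while photographing.

    angles_during_exposure: gyroscope angle samples (radians) taken during the
    exposure; the threshold is left to the designer or user in the text.
    """
    return statistics.pstdev(angles_during_exposure) >= threshold
```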
- FIG. 24 is a sequence diagram illustrating the operation according to the second embodiment.
- the operations of steps S 201 to S 205 are similar to those according to the first embodiment illustrated in FIG. 11 , and thus the description thereof is omitted.
- the transmission control unit 107 determines whether the image photographed in S 205 is transmitted to the server 20 (S 206 ). For example, if the blur determination unit 114 determines that the photographed image is blurred to be greater than or equal to a threshold, then the transmission control unit 107 determines that the photographed image is not transmitted to the server 20 (S 206 : NO). On the other hand, if the blur determination unit 114 determines that the photographed image is blurred to be less than a threshold, then the transmission control unit 107 determines that the photographed image is transmitted to the server 20 (S 206 : YES).
- If it is determined that the photographed image is not to be transmitted (S206: NO), the information processing apparatus 10 performs the operation of S201 again.
- If it is determined that the photographed image is to be transmitted (S206: YES), the image processing unit 110 generates an image obtained by reducing the amount of information of the photographed image in a way corresponding to the action state of the user recognized by the action recognition unit 102 during photographing by the photographing unit 124 (S207).
- For example, when the face region detection unit 112 detects a face region of a person in the photographed image, the image processing unit 110 generates an image obtained by clipping the detected face region from the photographed image.
- the transmission control unit 107 allows the communication unit 120 to transmit the image generated or processed in step S 207 to the server 20 (S 208 ).
- The operations subsequent to step S208 are substantially similar to those of steps S108 to S111 of the first embodiment illustrated in FIG. 11, and thus a description thereof is omitted.
- As described above, the information processing apparatus 10 recognizes an action state of the user based on the measurement results obtained by the measurement unit 122. Then, the information processing apparatus 10 performs the process regarding the amount of information on the image photographed by the photographing unit 124 based on the action state of the user recognized during photographing by the photographing unit 124. Then, the information processing apparatus 10 causes the processed image to be transmitted to the server 20. Thus, the information processing apparatus 10 can adaptively reduce the communication traffic in transmitting the processed image to the server 20.
- the information processing apparatus 10 when the information processing apparatus 10 recognizes an action in which a user is looking around during photographing, the information processing apparatus 10 compresses the resolution of the photographed image and transmits the image having compressed resolution to the server 20 .
- the information processing apparatus 10 when the information processing apparatus 10 recognizes an action in which a user is watching during photographing, the information processing apparatus 10 clips the region that is estimated as being watched by the user and transmits the clipped region to the server 20 .
- Moreover, the information processing apparatus 10 may choose not to transmit the image to the server 20. In this way, the information processing apparatus 10 does not transmit an image on which it is difficult for the server 20 to perform appropriate image recognition, thereby reducing the communication traffic more efficiently.
- the information processing apparatus 10 performs only detection processing on whether the photographed image contains a face region of a person, and if a face region is detected, then the information processing apparatus 10 transmits an image of the detected face region to the server 20 .
- The information processing apparatus 10 then allows the server 20 to identify a person corresponding to the detected face region and receives the identification result from the server 20.
- the information processing apparatus 10 can reduce the amount of calculation necessary to specify a person contained in the photographed image, resulting in reduced power consumption.
- the information processing apparatus 10 is not limited to a device provided with a glasses-type display as illustrated in FIG. 1 .
- the information processing apparatus 10 may be configured as a wristwatch type device, a device that is worn on the user's neck such as a neck strap, a device that is mounted on the clothing of the user such as a wearable badge, or a device that is attached to the body of the user such as a headphone.
- the information processing apparatus 10 is configured to include, for example, all of various types of sensors such as the position information measuring device 168 and the acceleration sensor 170 , but an embodiment of the present disclosure is not limited thereto.
- Any one or more of the position information measuring device 168 , the acceleration sensor 170 , the gyroscope 172 , and the microphone 174 may be provided in other portable devices that a user can carry.
- For example, when the acceleration sensor 170 is provided in another device that is attached near the user's waist, the acceleration when the user is walking can advantageously be measured more accurately.
- According to an embodiment of the present disclosure, there can also be provided a computer program used to allow hardware such as the CPU 150, the ROM 152, and the RAM 154 to execute functions equivalent to those of each component of the information processing apparatus 10 described above.
- a storage medium for storing the computer program is also provided.
- present technology may also be configured as below.
- An information processing apparatus including:
- an action recognition unit configured to recognize an action state of a user based on a measurement result obtained by a sensor carried by the user
- an image processing unit configured to perform a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized by the action recognition unit during photographing by the photographing unit carried by the user;
- a transmission control unit configured to cause the image processed by the image processing unit to be transmitted to an image processing device used to perform image recognition.
- the image processing unit generates an image obtained by reducing an amount of information from the photographed image using a way corresponding to an action state of the user recognized by the action recognition unit during photographing by the photographing unit, and
- the transmission control unit causes the image having the reduced amount of information to be transmitted to the image processing device.
- the action state of the user includes a vision-related state of the user
- the action recognition unit recognizes that the user is looking around when the sensor measures that a movement of the user's head is within a predetermined range
- the image processing unit generates an image obtained by reducing resolution from the photographed image when a state in which the user is looking around is recognized.
- the action recognition unit recognizes that the user is watching when the sensor measures that a speed of movement of the head of the user is reduced to be less than or equal to a predetermined value
- the image processing unit generates an image obtained by clipping a predetermined region from the photographed image when a state in which the user is watching is recognized.
- the action state of the user includes a movement state of the user
- the image processing unit generates an image obtained by clipping a predetermined region from the photographed image when a state in which the user is moving during photographing by the photographing unit is recognized.
- the predetermined region is a region within a predetermined distance from a center in the photographed image.
- a face region detection unit configured to detect a face region of a person contained in the photographed image
- the image processing unit clips a face region detected by the face region detection unit from the photographed image
- the transmission control unit causes an image of the face region clipped by the image processing unit to be transmitted to the image processing device.
- the action recognition unit recognizes that the user is looking down using the sensor
- the image processing unit clips a region from the photographed image such that the region to be clipped is within a predetermined distance from a lower end of the photographed image when a state in which the user is looking down is recognized
- the transmission control unit causes an image of the region clipped by the image processing unit to be transmitted to the image processing device.
- the action recognition unit recognizes that the user is looking upward using the sensor
- the image processing unit clips a region from the photographed image such that the region to be clipped is within a predetermined distance from an upper end of the photographed image when a state in which the user is looking upward is recognized
- the transmission control unit causes an image of the region clipped by the image processing unit to be transmitted to the image processing device.
- a blur determination unit configured to determine whether the photographed image is blurred to be greater than or equal to a threshold
- the transmission control unit prevents the photographed image from being transmitted to the image processing device when the blur determination unit determines that the photographed image is blurred to be greater than or equal to the threshold.
- a blur determination unit configured to determine whether the photographed image is blurred to be greater than or equal to a threshold
- the image processing unit corrects a blur contained in the photographed image based on an action state of the user recognized by the action recognition unit when the blur determination unit determines that the photographed image is blurred to be greater than or equal to the threshold.
- An information processing method including:
- recognizing an action state of a user based on a measurement result obtained by a sensor carried by the user;
- performing a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized during photographing by the photographing unit carried by the user; and
- causing the processed image to be transmitted to an image processing device used to perform image recognition.
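- The configurations enumerated above can be summarized as a pipeline in which the recognized action state selects how the photographed image is reduced before transmission. The following minimal Python sketch illustrates that pipeline; every name, threshold, and data layout in it is a hypothetical illustration, not the disclosed implementation.

```python
# Hypothetical end-to-end sketch: action recognition -> information reduction -> transmission.
def recognize_action_state(head_speed, accel_variance):
    """Stand-in for the action recognition unit; thresholds are illustrative."""
    if head_speed <= 0.1:
        return "watching"
    return "moving" if accel_variance > 0.5 else "looking around"

def process_image(image, state):
    """Stand-in for the image processing unit (image = list of pixel rows)."""
    if state == "looking around":
        return [row[::2] for row in image[::2]]        # reduce resolution
    h, w = len(image), len(image[0])                   # otherwise clip the center
    return [row[w // 4:3 * w // 4] for row in image[h // 4:3 * h // 4]]

def transmit(image):
    """Stand-in for the transmission control unit."""
    print("sending", len(image[0]), "x", len(image), "pixels for image recognition")

image = [[0] * 640 for _ in range(480)]
state = recognize_action_state(head_speed=0.05, accel_variance=0.2)   # -> "watching"
transmit(process_image(image, state))                                 # sending 320 x 240 ...
```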
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
- Television Signal Processing For Recording (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
There is provided an information processing apparatus including an action recognition unit configured to recognize an action state of a user based on a measurement result obtained by a sensor carried by the user, an image processing unit configured to perform a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized by the action recognition unit during photographing by the photographing unit carried by the user, and a transmission control unit configured to cause the image processed by the image processing unit to be transmitted to an image processing device used to perform image recognition.
Description
- This application is a continuation application of U.S. patent application Ser. No. 14/521,007, filed on Oct. 22, 2014, which claims the benefit of Japanese Priority Patent Application JP 2013-228074 filed Nov. 1, 2013, the entire contents of which are incorporated herein by reference.
- The present disclosure relates to an information processing apparatus, an information processing method, and a program.
- In recent years, various types of devices, such as digital cameras and smartphones, equipped with various sensors including gyroscopes as well as image sensors, have been developed.
- As an example, JP 4289326B discloses a technology that recognizes an action of a user holding a camcorder based on sensor data obtained by a sensor that is built in the camcorder in the same timing as that of photographing by the camcorder, and records the recognized result of action in association with the photographed image.
- However, if the above-mentioned technology were applied to the case of transmitting photographed images to an external device, it might be unable to adaptively reduce the communication traffic. For example, with the above-mentioned technology, it may be difficult to select which images are transmitted to the external device. Thus, all of the images have to be transmitted, and when there are a large number of photographed images, the communication traffic increases. In addition, although the above-mentioned technology can reduce the amount of information of a photographed image, it may be limited to reducing the amount of information uniformly for all images. Thus, situations undesirable for the user occur, such as the resolution being reduced even in an image region containing an object that the user was watching during photographing.
- Therefore, according to an embodiment of the present disclosure, there is provided a novel and improved information processing apparatus, information processing method, and program capable of adaptively reducing the communication traffic when transmitting a photographed image to an external device.
- According to an embodiment of the present disclosure, there is provided an information processing apparatus including an action recognition unit configured to recognize an action state of a user based on a measurement result obtained by a sensor carried by the user, an image processing unit configured to perform a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized by the action recognition unit during photographing by the photographing unit carried by the user, and a transmission control unit configured to cause the image processed by the image processing unit to be transmitted to an image processing device used to perform image recognition.
- According to another embodiment of the present disclosure, there is provided an information processing method including recognizing an action state of a user based on a measurement result obtained by a sensor carried by the user, performing, by a processor, a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized during photographing by the photographing unit carried by the user, and causing the processed image to be transmitted to an image processing device used to perform image recognition.
- According to another embodiment of the present disclosure, there is provided a program for causing a computer to function as an action recognition unit configured to recognize an action state of a user based on a measurement result obtained by a sensor carried by the user, an image processing unit configured to perform a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized by the action recognition unit during photographing by the photographing unit carried by the user, and a transmission control unit configured to cause the image processed by the image processing unit to be transmitted to an image processing device used to perform image recognition.
- According to one or more embodiments of the present disclosure described above, it is possible to adaptively reduce the communication traffic when transmitting a photographed image to an external device. Note that the advantages described here are not necessarily limiting; any of the advantages described herein, or other advantages understood from the present disclosure, may be achieved.
- FIG. 1 is a diagram for describing the basic configuration of an information processing system that is common to each embodiment of the present disclosure;
- FIG. 2 is a diagram for describing the hardware configuration of an information processing apparatus 10 according to each embodiment of the present disclosure;
- FIG. 3 is a functional block diagram illustrating the configuration of the information processing apparatus 10 according to a first embodiment of the present disclosure;
- FIG. 4 is a diagram for describing an example of the action recognition by an action recognition unit 102 according to the first embodiment;
- FIG. 5 is a diagram for describing an example of the action recognition by the action recognition unit 102 according to the first embodiment;
- FIG. 6 is a diagram for describing an example of the action recognition by the action recognition unit 102 according to the first embodiment;
- FIG. 7 is a diagram for describing an example of the action recognition by the action recognition unit 102 according to the first embodiment;
- FIG. 8 is a diagram for describing an example of the photographing control by a photographing control unit 104 according to the first embodiment;
- FIG. 9 is a diagram for describing an example of the photographing control by the photographing control unit 104 according to the first embodiment;
- FIG. 10 is a diagram for describing an example of the display control by a display control unit 108 according to the first embodiment;
- FIG. 11 is a sequence diagram illustrating the operation according to the first embodiment;
- FIG. 12 is a functional block diagram illustrating the configuration of the information processing apparatus 10 according to a second embodiment of the present disclosure;
- FIG. 13 is a diagram for describing an example of the image processing by an image processing unit 110 according to the second embodiment;
- FIG. 14 is a diagram for describing an example of the image processing by the image processing unit 110 according to the second embodiment;
- FIG. 15 is a diagram for describing a state in which a user looks forward and then is stationary;
- FIG. 16 is a diagram for describing a state in which a user is walking while looking forward;
- FIG. 17 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment;
- FIG. 18 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment;
- FIG. 19 is a diagram for describing a state in which a user looks down and then is stationary;
- FIG. 20 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment;
- FIG. 21 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment;
- FIG. 22 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment;
- FIG. 23 is a diagram for describing an example of image processing by the image processing unit 110 according to the second embodiment; and
- FIG. 24 is a sequence diagram illustrating the operation according to the second embodiment.
- Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
- The “embodiments for implementing the present disclosure” will be described in the following order of items.
- 1. Background
- 2. Basic Configuration of Information Processing System
- 3. Detailed Description of each Embodiment
- 3-1. First Embodiment
- 3-2. Second Embodiment
- 4. Modification
- The present disclosure may be implemented in various embodiments, as described in detail in the items “3-1. First Embodiment” and “3-2. Second Embodiment” and their subcategories, as an example. To clearly illustrate features of embodiments of the present disclosure, the technical background that led to the conception of an information processing apparatus according to an embodiment of the present disclosure is first described.
- The development of wearable devices, which have a built-in image sensor and are worn constantly by the user, has recently been studied. Such a wearable device could constantly recognize an object, a person, or the like contained in an image captured by an attached camera and, based on the recognition results, provide the user with information related to, for example, an object or a person viewed by the user.
- However, problems remain in providing such a service, as described below. First, when a wearable device is configured as a battery-powered mobile device, its weight has to be kept low because the user wears the device on his or her body. This places restrictions on the capacity of the battery provided in the wearable device, so the time for which the user can continuously use the device is limited.
- Second, because the device is worn by the user, blurring occurs with the movement of the photographer depending on the timing of photographing. Thus, if a service in which a camera runs continuously and photographs automatically is assumed, environment recognition may be degraded by blurring contained in a photographed image, causing erroneous recognition or delays in processing. In addition, when recognition has to be performed in a very short time, such as when a subject is moving, recognition is more likely to fail.
- Third, in general, when environment recognition is performed based on the output of an image sensor, for example, if an object is specified, or a photographing location or a posture of the photographer is specified, an additional large amount of data for reference is necessary. Such data for reference is difficult to store in a wearable device, and thus a method of sending a photographed image to a server that stores the data for reference and of causing the server to perform image recognition may be considered. However, the transmission of a large amount of data, such as data of moving images captured for a long time, to a server is considered to be impractical in view of communication speed, battery performance, or the like in the current mobile communication environment.
- Thus, in view of the foregoing situation, the
information processing apparatus 10 according to an embodiment of the present disclosure has been conceived. Theinformation processing apparatus 10 is capable of controlling adaptively the timing of photographing depending on the action state of the user. In addition, theinformation processing apparatus 10, when transmitting a photographed image to aserver 20, can reduce adaptively communication traffic. - The basic configuration of an information processing system that is common to each embodiment is described below with reference to
FIG. 1 . As illustrated inFIG. 1 , the information processing system according to each embodiment includes theinformation processing apparatus 10, acommunication network 12, and theserver 20. - The
information processing apparatus 10 is an example of the information processing apparatus according to an embodiment of the present disclosure. Theinformation processing apparatus 10 is, for example, a device provided with a glasses-type display, as illustrated inFIG. 1 . In addition, a translucent see-through type display can be employed as the glasses-type display. This see-through type display enables the user to view the outside environment through the display. - The
information processing apparatus 10 is used by being worn on the user's head. In addition, as illustrated inFIG. 1 , theinformation processing apparatus 10 has acamera 166, which will be described later, at a position in the periphery of the display. The user can photograph a landscape at which the user is looking with thecamera 166 while moving by wearing theinformation processing apparatus 10. - The
information processing apparatus 10 may have such hardware configuration as illustrated inFIG. 2 . As illustrated inFIG. 2 , theinformation processing apparatus 10 is configured to include a central processing unit (CPU) 150, a read only memory (ROM) 152, a random access memory (RAM) 154, aninternal bus 156, aninterface 158, anoutput device 160, astorage device 162, acommunication device 164, acamera 166, a positioninformation measuring device 168, anacceleration sensor 170, agyroscope 172, and amicrophone 174. - The
CPU 150 is composed of various types of processing circuits and serves as a controller 100 for controlling the entire information processing apparatus 10. In addition, in the information processing apparatus 10, the CPU 150 implements the functions of an action recognition unit 102, a photographing control unit 104, a transmission control unit 106, a display control unit 108, an image processing unit 110, a face region detection unit 112, and a blur determination unit 114, which are described later. - The
ROM 152 stores a program used by theCPU 150 and it also stores data for control of operation parameters or the like to be used by theCPU 150. - The
RAM 154 stores temporarily a program, for example, to be executed by theCPU 150. - The
interface 158 connects the output device 160, the storage device 162, the communication device 164, the camera 166, the position information measuring device 168, the acceleration sensor 170, the gyroscope 172, and the microphone 174 with the internal bus 156. For example, the output device 160 exchanges data with the CPU 150 and other components via the interface 158 and the internal bus 156. - The
output device 160 includes a display device such as a liquid crystal display (LCD), an organic light emitting diode (OLED), and a lamp. This display device displays an image captured by thecamera 166, an image generated by theCPU 150, or the like. - Furthermore, the
output device 160 includes an audio output device such as a loudspeaker. This audio output device converts audio data or the like into sound and outputs it. - The
storage device 162 is a device for storing data, which is used to store a program or various data to be executed by theCPU 150. Thestorage device 162 includes a storage medium, a recording device for recording data in a storage medium, a reading device for reading out data from a storage medium, a deletion device for deleting data recorded in a storage medium, or the like. - The
communication device 164 is a communication interface that is composed of a communication device or the like used to connect to a communication network such as a public network or the Internet. In addition, thecommunication device 164 may be a wireless LAN compatible communication device, a long-term evolution (LTE) compatible communication device, or a wired communication device that performs communication through a wired line. Thecommunication device 164 serves, for example, as acommunication unit 120 that will be described later. - The
camera 166 has functions of forming an image obtained from the outside through a lens on an image sensor, such as a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS), and of photographing still or moving images. - The position
information measuring device 168 receives a positioning signal from a positioning satellite, such as the global positioning system or a global navigation satellite system (GLONASS), and thus measures its current position. In addition, the positioninformation measuring device 168 may have functions of receiving Wi-Fi (registered trademark) radio waves from a plurality of base stations and measuring its current position based on the reception intensity of the received Wi-Fi radio waves and the positions of each base station. In addition, the positioninformation measuring device 168 may have a function of measuring its current position based on communications with a Bluetooth access point. The positioninformation measuring device 168 serves as ameasurement unit 122 that will be described later. - The
acceleration sensor 170 measures the acceleration of theinformation processing apparatus 10. Theacceleration sensor 170 serves as themeasurement unit 122. - The
gyroscope 172 measures the angle or the angular velocity of theinformation processing apparatus 10. For example, thegyroscope 172 detects the inertial force or the Coriolis force applied to theinformation processing apparatus 10 and thus measures the angular velocity of theinformation processing apparatus 10. Thegyroscope 172 serves as themeasurement unit 122. - The
microphone 174 collects sound coming from outside. Themicrophone 174 serves as themeasurement unit 122. - The hardware configuration of the
information processing apparatus 10 is not limited to the above-described configuration. For example, theinformation processing apparatus 10 may be configured without any one or more of thestorage device 162, the positioninformation measuring device 168, theacceleration sensor 170, thegyroscope 172, and themicrophone 174. - The
communication network 12 is a wired or wireless transmission channel for information sent from devices connected to the communication network 12. For example, the communication network 12 may include public networks such as the Internet, telephone networks, and satellite communication networks, various local area networks (LANs) including Ethernet (registered trademark), and wide area networks (WANs). In addition, the communication network 12 may include a leased line network such as an Internet protocol virtual private network (IP-VPN). - The
server 20 is an exemplary image processing apparatus according to an embodiment of the present disclosure. Theserver 20 has a function of performing image recognition on a photographed image. In addition, theserver 20 includes a storage unit (not illustrated) for storing a plurality of data for reference that include various types of information in the real world. - The
server 20, for example when receiving a photographed image from theinformation processing apparatus 10, performs image recognition on the photographed image, and so can recognize an object, a person, or the like contained in the photographed image. Theserver 20 can extract additional information that is information related to the recognized object, person, or the like from the plurality of data for reference stored in the storage unit, and can transmit the extracted additional information to theinformation processing apparatus 10. - Such a function of the
server 20 makes it possible for the user carrying theinformation processing apparatus 10 to obtain a notification of detailed information related to buildings, goods, persons, or the like viewed by the user while moving around in the real world from theserver 20. In addition, the user can know the reception of additional information from theserver 20 through a change in video, audio, or vibration outputted from theoutput device 160 of theinformation processing apparatus 10. For example, theoutput device 160 superimposes the additional information on a display screen or changes the mode of vibration, thereby informing the user of the contents of information or the location direction. - The basic configuration of the information processing system according to each embodiment has been described above. Subsequently, each embodiment is described in detail.
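- As a rough illustration of this exchange, the following sketch stubs out the server side; the recognize function, payload fields, and label text are assumptions made for this example only, not the actual interface of the server 20 (the label echoes the FIG. 10 example described later).

```python
# Hypothetical round trip: the apparatus uploads an image, the server matches it
# against its stored reference data, and the apparatus presents the result.
def server_recognize(image_bytes):
    # Placeholder for image recognition against a large reference data store.
    return [{"label": "STATION X ON LINE A", "kind": "station"}]

def notify_user(additional_info):
    for info in additional_info:
        print("overlay on display:", info["label"])   # or a change in audio/vibration

notify_user(server_recognize(b"...encoded image bytes..."))
```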
- The configuration of the
information processing apparatus 10 according to the first embodiment is described in detail.FIG. 3 is a functional block diagram illustrating the configuration of theinformation processing apparatus 10 according to the first embodiment. As illustrated inFIG. 3 , theinformation processing apparatus 10 is configured to include acontroller 100, acommunication unit 120, ameasurement unit 122, a photographingunit 124, and adisplay unit 126. - The
controller 100 controls the overall operation of theinformation processing apparatus 10 using hardware, such as theCPU 150 andRAM 154, which is built in theinformation processing apparatus 10. In addition, as illustrated inFIG. 3 , thecontroller 100 is configured to include anaction recognition unit 102, a photographingcontrol unit 104, atransmission control unit 106, and adisplay control unit 108. - The
action recognition unit 102 recognizes an action state of the user based on the results of measurement by themeasurement unit 122 that will be described later. Themeasurement unit 122 is an example of the sensor according to an embodiment of the present disclosure. The action state includes, for example, a movement state of the user, a vision-related state of the user, and a voice-related state of the user. The detailed processing of each state is described below. - For example, the
action recognition unit 102 is able to recognize a movement state of the user based on the results of measurement by themeasurement unit 122. The movement state indicates states in which a user is walking, running, riding a bicycle, and riding in a vehicle such as automobiles, trains, and planes, or a state in which a user is stationary such as when the user is sitting in a chair. - Referring to
FIG. 4, a description of how this functions is given in more detail. FIG. 4 is a graph showing values obtained by measurement of the acceleration sensor 170 in association with measurement time. As shown in FIG. 4, changes in acceleration having a substantially similar waveform are measured in the time zones of time t1 to t2, time t2 to t3, and time t3 to t4. A change in acceleration having such a waveform approximates the change in acceleration when a person is walking. Thus, the action recognition unit 102 recognizes that the user is in a walking state in the time zone from time t1 to t4, based on the measurement results obtained by the acceleration sensor 170, which are shown in FIG. 4.
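- A minimal sketch of this kind of walking detection is shown below. It assumes accelerometer magnitudes sampled at 50 Hz and looks for a repeating waveform whose period matches a plausible step rate; the sampling rate, step-rate band, and 0.5 correlation threshold are all illustrative assumptions.

```python
import math

def autocorrelation(samples, lag):
    """Normalized autocorrelation of a sample window at a given lag."""
    n = len(samples) - lag
    mean = sum(samples) / len(samples)
    num = sum((samples[i] - mean) * (samples[i + lag] - mean) for i in range(n))
    den = sum((s - mean) ** 2 for s in samples) or 1.0
    return num / den

def looks_like_walking(samples, rate_hz=50, min_step_hz=1.0, max_step_hz=3.0):
    """True when the signal repeats with a period typical of walking steps."""
    lags = range(int(rate_hz / max_step_hz), int(rate_hz / min_step_hz) + 1)
    return max(autocorrelation(samples, lag) for lag in lags) > 0.5

# Synthetic signal with a 2-steps-per-second rhythm, similar to FIG. 4.
walking = [math.sin(2 * math.pi * 2.0 * t / 50) for t in range(200)]
print(looks_like_walking(walking))   # -> True
```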
- Furthermore, as recognition example 2, the action recognition unit 102 can recognize a vision-related state of the user based on the measurement results obtained by the measurement unit 122. The vision-related state indicates in what state and under what circumstances the user looks at an object. For example, the vision-related state indicates a state in which the user is watching a particular object or person, a state in which the user is looking around, or the direction in which the user is looking, such as the front or a vertical or horizontal direction. - More specifically, the
action recognition unit 102 can recognize the user's vision-related state based on the movement of the user's head that is measured by thegyroscope 172. For example, when thegyroscope 172 measures that the speed of movement of the user's head is reduced to less than or equal to a predetermined value, theaction recognition unit 102 recognizes that the user is watching. - Referring to
FIG. 5 or 6 , a description of how it functions is given in more detail.FIG. 5 is an example of the graph showing measurement values obtained by theacceleration sensor 170 in association with measurement time. In the graph shownFIG. 5 , the acceleration fluctuations in the time zone from time t1 to t2 are large, and after time t2, the fluctuation amount is significantly reduced and thus measured values of the acceleration are at or close to zero. The fluctuations in acceleration having the waveform as shown inFIG. 5 approximate the fluctuations in acceleration of the head when a person finds something and watches it at approximately time t2. - In addition,
FIG. 6 is an example of the graph showing measurement values of the angle of the user's head obtained by thegyroscope 172 in association with measurement time. In the graph shown inFIG. 6 , the angular fluctuations in the time zone from time t1 to t2 are large, and after time t2, the amount of fluctuations is significantly reduced and thus measured values of the angle are at a substantially fixed value. The acceleration fluctuations having the waveform as shown inFIG. 6 approximate the acceleration fluctuations of the head when the user moves his head and then stops its motion, such as when a person finds something and watches it at approximately time t2. - Thus, the
action recognition unit 102 recognizes that the user is watching after the time t2, based on the measurement results obtained by theacceleration sensor 170 as shown inFIG. 5 and/or the measurement results obtained by thegyroscope 172 as shown inFIG. 6 . Theaction recognition unit 102 also can recognize the action of the user moving his head and then stopping the motion based on the change in magnitude of the sound of blowing wind that is detected by themicrophone 174 instead of thegyroscope 172. - In addition,
FIG. 7 is another example of the graph showing the values obtained by measuring the angle of the user's head in association with the measurement time. In the graph shown in FIG. 7, the angles fluctuate gradually between the angles θ1 and θ2 over the whole measurement period. Fluctuations in angle having the waveform shown in FIG. 7 approximate the fluctuations in angle of a person's head, for example, when the person is looking around, wandering, or looking down. Thus, the action recognition unit 102 recognizes that, for example, the user is looking around over the whole measurement period, based on the measurement results obtained by the gyroscope 172 as shown in FIG. 7.
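- The vision-related recognition described with reference to FIGS. 5 to 7 can be sketched as a small classifier over head angular speed; the window length and the 5 and 60 degree-per-second thresholds below are illustrative assumptions, not values from the disclosure.

```python
def classify_vision_state(angular_speeds, still=5.0, sweep=60.0):
    """angular_speeds: head angular speed in degrees/s, oldest to newest."""
    recent = angular_speeds[-10:]
    if all(abs(s) <= still for s in recent):
        return "watching"          # head has settled, as after time t2 in FIGS. 5 and 6
    if all(abs(s) <= sweep for s in angular_speeds):
        return "looking around"    # sustained gradual movement, as in FIG. 7
    return "other"

print(classify_vision_state([40, 30, 20] + [1] * 10))      # -> watching
print(classify_vision_state([20, -25, 30, -20, 25] * 4))   # -> looking around
```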
- In addition, as recognition example 3, the action recognition unit 102 can recognize a voice-related state of the user based on the measurement results obtained by the measurement unit 122. The voice-related state indicates, for example, a state in which a user is talking with other people, a state in which a user is silent, or the degree of magnitude of the voice produced by a user. - For example, the
action recognition unit 102 recognizes that a user is talking with other people based on the measurement results obtained by themicrophone 174. - The photographing
control unit 104 controls the timing at which a photographingunit 124 described later is allowed to perform photographing based on the action state of the user recognized by theaction recognition unit 102. The following description herein is given based on an example in which the photographingcontrol unit 104 allows the photographingunit 124 to capture still images. As its technical background, an embodiment of the present disclosure may be intended to reduce the power consumption as much as possible, but it is not limited to such an example. The photographingcontrol unit 104 may allow the photographingunit 124 to capture moving images. - More specifically, the photographing
control unit 104 can change the timing at which the photographingunit 124 is allowed to perform photographing based on whether a user is moving or not. For example, the photographingcontrol unit 104 sets a frequency with which the photographingunit 124 is allowed to perform photographing when the user is stationary to be smaller than a frequency to be set when the user is moving. In general, when a user is stationary, it is assumed that the landscape viewed by user has a little change. Thus, with control example 1, an image in which it is assumed that the image is undesirable to the user or the image is not considered important can be prevented from being captured. - Furthermore, the photographing
control unit 104 allows the photographingunit 124 to perform photographing at predetermined time intervals when a user continues to move. Referring toFIG. 9 , a description of how it functions is given in more detail.FIG. 9 is an example of the graph showing the measurement results obtained by theacceleration sensor 170 in association with measurement time. InFIG. 9 , there are illustrated the results in which the acceleration fluctuations in the time zones from time t1 to t7 that are similar to the acceleration fluctuations in the time zone from time t1 to t2 shown inFIG. 4 are continuously measured. In addition, inFIG. 9 , the duration between time tb and tc is assumed to be the same as the duration between time ta and tb. - In the time zones from time t1 to t7 shown in
FIG. 9 , theaction recognition unit 102 recognizes that a user is in walking state. Thus, the photographingcontrol unit 104 allows the photographingunit 124 to perform photographing at the time interval between times ta and tb so that photographing is performed, for example, at time ta, tb, and Tc inFIG. 9 . - In general, when a user is walking or is riding a train, the surrounding landscape viewed through the eyes of the user changes generally from moment to moment. Thus, according to control example 1, it is possible to photograph a sequence of images by capturing the change in landscapes with the movement of the user even without capturing moving images. Accordingly, the number of photographing times by the photographing
unit 124 can be decreased, thereby reducing the power consumption. - In addition, as control example 2, the photographing
control unit 104 can change the timing at which the photographingunit 124 is allowed to perform photographing based on the vision-related state of a user that is recognized by theaction recognition unit 102. For example, when theaction recognition unit 102 recognizes that a user is watching as immediately after time t2 shown inFIG. 5 or 6 , the photographingcontrol unit 104 allows the photographingunit 124 to perform photographing. - According to control example 2, when a user is watching, for example, goods displayed in a department store, a building or structure in a tour place, or a person who a user saw on the street, it is possible to photograph the object that is being watched by the user in reliable and immediate manner.
- As a modification, when an action in which a user moves his head or neck is recognized, it is estimated that the user is watching, and thus the photographing
control unit 104 may allow the photographingunit 124 to perform photographing in a continuous way in accordance with the movement of the user's head or neck. According to this modification, there is an advantage that a change in how the user watched during photographing can be recorded. - In addition, as control example 3, the photographing
control unit 104 is able to allow the photographingunit 124 to perform photographing when theaction recognition unit 102 recognizes that a user has spoken. According to this control example 3, when a user is talking with other people, it is possible for the user to photograph the conversation partner automatically, for example, without having to release the shutter. - The
transmission control unit 106 allows the communication unit 120 to transmit, for example, an image photographed by the photographing unit 124 to the server 20. - The
display control unit 108 allows thedisplay unit 126, which will be described later, to display, for example, various character strings or images such as additional information received from theserver 20. - Referring to
FIG. 10 , a description of how it functions is given in more detail.FIG. 10 is a diagram for describing a display example in which additional information received from theserver 20 is displayed on thedisplay unit 126. The left side view ofFIG. 10 is an example of a photographed image (a photographed image 30) captured by the photographingunit 124. In addition, the right side view ofFIG. 10 is an example (a picture 40) in which the photographedimage 30 is transmitted to theserver 20 and then additional information received from theserver 20 is displayed on thedisplay unit 126.FIG. 10 illustrates an example in which theinformation processing apparatus 10 receives, from theserver 20, additional information about the advertisement of “ON SALE (10/1-10/31)” for adepartment store 300 contained in the photographedimage 30 or addition information indicating a station name of “STATION X ON LINE A” for astation 302. - As illustrated in the right side view of
FIG. 10 , thedisplay control unit 108 superimposes a display that indicates additional information received from theserver 20 on thepicture 40. For example, when thedisplay unit 126 is composed of a see-through type display, thepicture 40 is a landscape at which the user is actually looking through the display. Alternatively, thedisplay control unit 108 may display an image, which is the same as the photographedimage 30, as thepicture 40. - The
communication unit 120 transmits and receives information to and from various types of devices connected to thecommunication network 12, for example, by wireless communication. For example, thecommunication unit 120 transmits the image photographed by the photographingunit 124 to theserver 20 under the control of thetransmission control unit 106. In addition, thecommunication unit 120 receives the above-described additional information from theserver 20. - The
measurement unit 122 is composed, for example, of the positioninformation measuring device 168, theacceleration sensor 170, thegyroscope 172, and themicrophone 174. Themeasurement unit 122 measures acceleration of theinformation processing apparatus 10, an angle of theinformation processing apparatus 10, or the sound coming from outside. - In the first embodiment, the
measurement unit 122 is basically assumed to perform continuous measurement. The reason for this is, for example, because various sensors such as theacceleration sensor 170 are continuously activated but their power continuously consumed is substantially smaller than the power consumed by photographing of thecamera 166. - The photographing
unit 124 photographs an outside still or moving image under the control of the photographingcontrol unit 104. - In addition, the photographing
unit 124 can also photograph an outside still or moving image, for example, in accordance with an instruction from the user to an input device (not illustrated) such as a button attached to the information processing apparatus 10. The following description herein is given based on an example in which the photographing unit 124 photographs an image under the control of the photographing control unit 104. - The
display unit 126 displays, for example, various character strings or images such as additional information received from theserver 20 under the control of thedisplay control unit 108. - The configuration of the
information processing apparatus 10 according to the first embodiment is not limited to the above-described configuration. For example, themeasurement unit 122 may not be included in theinformation processing apparatus 10 but may be included in other devices. - The configuration according to the first embodiment has been described above. Next, the operation according to the first embodiment is described.
-
FIG. 11 is a sequence diagram illustrating the operation according to the first embodiment. As illustrated inFIG. 11 , themeasurement unit 122 of theinformation processing apparatus 10 measures, for example, acceleration of theinformation processing apparatus 10, an angle of theinformation processing apparatus 10, or the sound coming from outside (S101). - Subsequently, the
action recognition unit 102 recognizes an action state of the user based on the measurement results measured in step S101 (S102). - Subsequently, the photographing
control unit 104 determines whether the present is the timing at which the photographing unit 124 is allowed to perform photographing, based on the action state of the user recognized in step S102 (S103). If it is not determined that the present is the timing at which the photographing unit 124 is allowed to perform photographing (S103: NO), then the information processing apparatus 10 performs the operation of S101 again. - On the other hand, if it is determined that the present is the timing at which the photographing
unit 124 is allowed to perform photographing (S103: YES), then the photographingcontrol unit 104 adjusts various types of parameters such as photographic sensitivity or shutter speed to an appropriate value, for example, based on information including brightness of a surrounding environment (S104). Then, the photographingcontrol unit 104 allows the photographingunit 124 to perform photographing (S105). - Subsequently, the
transmission control unit 106 determines whether the image photographed in step S105 is transmitted to theserver 20, for example, based on a predetermined condition of whether the photographed image contains an object, a person, or the like (S106). If it is not determined that the image is transmitted (S106: NO), then theinformation processing apparatus 10 performs the operation of step S101 again. - If it is determined that the image is transmitted (S106: YES), then the
transmission control unit 106 allows thecommunication unit 120 to transmit the photographed image to the server 20 (S107). - Subsequently, the
server 20 performs image recognition on the image received from the information processing apparatus 10 (S108). Then, theserver 20 extracts additional information about the object, person, or the like recognized from the received image, for example, from a large amount of reference data stored in a storage unit of the server 20 (S109). Then, theserver 20 transmits the extracted additional information to the information processing apparatus 10 (S110). - Then, the
display control unit 108 of theinformation processing apparatus 10 allows thedisplay unit 126 to display the additional information received from the server 20 (S111). - In the above, as described, for example, with reference to
FIGS. 3, 11 , and other illustrations, theinformation processing apparatus 10 according to the first embodiment recognizes an action state of the user based on the measurement results obtained by themeasurement unit 122 and controls the timing at which the photographingunit 124 is allowed to perform photographing based on the recognized action state. Thus, it is possible to control adaptively the timing of photographing depending on the action state of the user. - For example, when it is recognized that a user is not moved while remaining stationary, the
information processing apparatus 10 sets the frequency with which the photographingunit 124 is allowed to perform photographing to be smaller than a frequency to be set when the user is moving. In general, when a user is stationary, it is assumed that the landscape at which user is looking has a little change. Thus, an image where it is assumed that the image is undesirable to the user or the image is not considered important can be significantly prevented from being photographed. Accordingly, the power consumed by photographing can be reduced. - The first embodiment has been described above. Next, a second embodiment is described. As described later, according to the second embodiment, an
information processing apparatus 10 can reduce the amount of information of the image transmitted to theserver 20, thereby reducing the power consumption. - The configuration of the
information processing apparatus 10 according to the second embodiment is first described in detail.FIG. 12 is a functional block diagram illustrating the configuration of theinformation processing apparatus 10 according to the second embodiment. As illustrated inFIG. 12 , theinformation processing apparatus 10 further includes animage processing unit 110, a faceregion detection unit 112, and ablur determination unit 114, as compared with the configuration of the first embodiment. - A transmission control unit 107 according to the second embodiment allows the
communication unit 120 to transmit an image processed by animage processing unit 110 described later to theserver 20. More specifically, the transmission control unit 107 can cause the image, which is generated by theimage processing unit 110 and has the reduced amount of information, to be transmitted to theserver 20 based on the photographed image captured by the photographingunit 124. - As described in detail later, for example, the transmission control unit 107 causes the image of a particular region clipped from the photographed image by the
image processing unit 110 to be transmitted to theserver 20. Alternatively, the transmission control unit 107 causes the image obtained by reducing the resolution from the photographed image by theimage processing unit 110 to be transmitted to theserver 20. Alternatively, the transmission control unit 107 causes an image of one or more face regions of a person that is clipped from the photographed image by theimage processing unit 110 to be transmitted to theserver 20. - In addition, as a modification, if the
blur determination unit 114 described later determines that blurring of the photographed image is greater than or equal to a threshold, then the transmission control unit 107 may also cause the photographed image to be not transmitted to theserver 20. - The
image processing unit 110 performs a process regarding the amount of information on the image photographed by the photographingunit 124 based on an action state of the user that is recognized by theaction recognition unit 102 during photographing by the photographingunit 124. More specifically, theimage processing unit 110 can generate an image obtained by reducing the amount of information from the photographed image using a way corresponding to the action state of the user that is recognized by theaction recognition unit 102 during photographing by the photographingunit 124. - For example, when the
action recognition unit 102 recognizes that, for example, a user looks around, downward, or upward when during photographing, theimage processing unit 110 generates an image obtained by compressing resolution from the photographed image. - Referring to
FIG. 13 , a description of how it functions is given in more detail.FIG. 13 is a diagram for describing a generation example in which an image with compressed resolution (processed image 50) is generated from the photographed image (photographed image 30). As illustrated inFIG. 13 , for example, theimage processing unit 110 generates the processedimage 50 that is an image with the compressed resolution of 320×240 from the photographedimage 30 with resolution of 640×480. - In general, when a user looks around, downward, or upward, it is assumed that the user looks at a region of wider angle than usual. According to processing example 1, by compressing the resolution of the photographed image, it is possible to reduce the amount of information of the image without reducing the photographing range in which the user is intended to perform photographing.
- In addition, as processing example 2, when the
action recognition unit 102 recognizes that a user is watching during photographing, theimage processing unit 110 can generate an image obtained by clipping a predetermined region from the photographed image. - Referring to
FIG. 14 , a description of how it functions is given in more detail.FIG. 14 is a diagram for describing a generation example of generating an image (processed image 50) obtained by clipping a predetermined region in the photographedimage 30 from the photographed image (photographed image 30). As illustrated inFIG. 14 , for example, theimage processing unit 110 generates the processedimage 50 that is an image obtained by clipping a region with the resolution of 320×240 that is a central portion of the photographedimage 30 from the photographedimage 30 with the resolution of 640×480. The size of the predetermined region may be set, for example, in accordance with the maximum bit width of the communication line for transmission to theserver 20. - In addition, when the
action recognition unit 102 recognizes that a user is moving during photographing, theimage processing unit 110 can also generate an image obtained by clipping a predetermined region from the photographed image. - Referring to
FIGS. 15 to 18 , a description of how it functions is given in more detail.FIG. 15 is a diagram for describing a state in which a user is stationary while looking forward.FIG. 15 illustrates avisible area 32 at which the user is looking as a plan view for simplicity of description. - In addition,
FIG. 16 is a diagram for describing a state in which a user is walking while looking forward. In general, the field of view of the user when walking is smaller than that when the user is stationary, and thus the user tends to look at aregion 52 smaller than thevisible region 32 of the user when stationary, as illustrated inFIG. 16 . - Thus, when it is recognized that the user is walking while looking forward during photographing, for example, the
image processing unit 110 may generate an image that is obtained by clipping a region within a predetermined distance from the center of the photographed image, as illustrated in FIG. 17. FIG. 17 illustrates a generation example of an image (processed image 50) obtained by clipping a region within the distance d1 from the center of the photographed image (photographed image 30). According to this generation example, an image of an object, a person, or the like that is watched by the user during photographing can be clipped appropriately, and the amount of information of the image can be reduced.
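- Both this clipping and the peripheral variant described next reduce to simple row and column slicing, as in the following sketch; the helper names are hypothetical, and the same slicing also covers the lower-end and upper-end clipping described later (for example, clip(image, h - d2, 0, d2, w) for a user looking down). Because the peripheral region of FIG. 18 is not rectangular, it is represented here by masking out the central region instead of cropping.

```python
def clip(image, top, left, height, width):
    """Clip a rectangular region from an image stored as a list of pixel rows."""
    return [row[left:left + width] for row in image[top:top + height]]

def clip_center(image, d1):
    """Region within distance d1 of the image center, as in FIG. 17."""
    h, w = len(image), len(image[0])
    return clip(image, h // 2 - d1, w // 2 - d1, 2 * d1, 2 * d1)

def mask_center(image, d1, fill=0):
    """Keep only the periphery at distance d1 or more from the center, as in FIG. 18."""
    h, w = len(image), len(image[0])
    out = [list(row) for row in image]
    for y in range(h // 2 - d1, h // 2 + d1):
        for x in range(w // 2 - d1, w // 2 + d1):
            out[y][x] = fill
    return out

image = [[1] * 640 for _ in range(480)]
watched = clip_center(image, 120)
print(len(watched[0]), len(watched))   # -> 240 240
```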
- Alternatively, in such a case, the image processing unit 110 may generate an image that is obtained by clipping a peripheral region at a predetermined distance or more away from the center of the photographed image, as illustrated in FIG. 18. FIG. 18 illustrates a generation example of an image (processed image 50) obtained by clipping a peripheral region at the distance d1 or more away from the center of the photographed image (photographed image 30).
server 20 and additional information regarding the clipped image is received from theserver 20, it is advantageously possible for the user to know information regarding an object or a person that is not noticed or is little considered by the user during photographing. - In addition, when the
action recognition unit 102 recognizes that a user is looking down during photographing, theimage processing unit 110 can clip a region from the photographed image such that the region to be clipped is within a predetermined distance from the lower end of the photographed image. - Referring to
FIGS. 19 and 20 , a description of how it functions is given in more detail.FIG. 19 is a diagram for describing a state in which a user looks downward and is stationary. As illustrated inFIG. 19 , in general, when a user tilts his head to look down, for example, the user casts his eyes downward, and thus the user tends to look at aregion 52 in the lower side than thevisible region 32 when the eyes look forward. - Thus, when the
action recognition unit 102 recognizes that a user looks down during photographing, theimage processing unit 110 may generate an image (processed image 50) that is obtained by clipping a region from the photographed image (photographed image 30) such that the region to be clipped is within the distance d2 from the lower end of the photographed image as illustrated inFIG. 20 . The value of the distance d2 can be set as an appropriate fixed value, for example, by performing a user test in advance. - In addition, when the
action recognition unit 102 recognizes that a user looks upward during photographing, theimage processing unit 110 can clip a region from the photographed image such that the region to be clipped is within a predetermined distance from the upper end of the photographed image. - Referring to
FIG. 21 , a description of how it functions is given in more detail. In general, when a user tilts his head to look upward, the user casts his eyes upward as opposed to when the user looks down, and thus the user tends to look at a region in the upper side than when the eyes look forward. - Thus, when the
action recognition unit 102 recognizes that a user looks upward during photographing, theimage processing unit 110 may generate an image (processed image 50) that is obtained by clipping a region from the photographed image (photographed image 30) such that the region to be clipped is within the distance d2 from the upper end of the photographedimage 30 as illustrated inFIG. 21 . - In addition, as processing example 3, as illustrated in
FIG. 22 or 23 , when a faceregion detection unit 112 described later detects a face region of a person contained in the photographed image, theimage processing unit 110 can generate an image obtained by clipping the detected face region from the photographed image. - As illustrated in
FIG. 22, when the face region detection unit 112 detects the entire region of the face of a person, the image processing unit 110 may generate an image obtained by clipping the entire detected region of the face from the photographed image. In addition, as illustrated in FIG. 23, when the face region detection unit 112 detects only a partial region of the face of a person, the image processing unit 110 may generate an image obtained by clipping only the detected partial region of the face from the photographed image.
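- Given detected face regions, the clipping itself is again a crop per region, as sketched below; real detection would locate the regions from feature points as described for the face region detection unit 112, so the bounding boxes here are assumed inputs.

```python
def clip_faces(image, face_boxes):
    """face_boxes: (top, left, height, width) per detected face region."""
    return [[row[left:left + width] for row in image[top:top + height]]
            for top, left, height, width in face_boxes]

image = [[0] * 640 for _ in range(480)]
faces = clip_faces(image, [(100, 200, 64, 64), (50, 400, 32, 24)])
print([(len(f), len(f[0])) for f in faces])   # -> [(64, 64), (32, 24)]
```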
- In addition, as a modification, if the blur determination unit 114 described later determines that the photographed image is blurred to a degree greater than or equal to a threshold, the image processing unit 110 can also correct the blur contained in the photographed image based on the action state of the user recognized by the action recognition unit 102 during photographing. For example, when the action recognition unit 102 recognizes that a user is in a walking state during photographing, the image processing unit 110 may correct the photographed image using a program for blur correction corresponding to a walking state, which is stored beforehand in the storage device 162. - In addition, when features of each action of the user are known previously, for example, by the
information processing apparatus 10, theimage processing unit 110 may correct the photographed image using a program, which is previously stored in thestorage device 162, for blur correction corresponding to features of each action of the user. - In addition, the
image processing unit 110 may correct a blur contained in the photographed image depending on the amount of change in movement or the amount of change in angle in three-dimensional space of theinformation processing apparatus 10, which is measured by themeasurement unit 122 during photographing. - The face
region detection unit 112 is able to detect a face region of a person contained in the photographed image. For example, the faceregion detection unit 112 detects a face region of a person by extracting feature points such as eye, nose, or facial contour in the photographed image. - The
blur determination unit 114 determines whether the photographed image is blurred to be greater than or equal to a threshold. For example, theblur determination unit 114 determines whether the photographed image is blurred to be greater than or equal to a threshold depending on the magnitude of the angle fluctuations measured by thegyroscope 172 during photographing. The threshold may be a value that is set by, for example, a designer or user of theinformation processing apparatus 10. - The functions of other components are similar to those of the first embodiment, and thus the description thereof is omitted here.
- The configuration according to the second embodiment has been described above. Next, the operation according to the second embodiment is described.
- FIG. 24 is a sequence diagram illustrating the operation according to the second embodiment. The operations of steps S201 to S205 are similar to those according to the first embodiment illustrated in FIG. 11, and thus their description is omitted.
- After step S205, the transmission control unit 107 determines whether the image photographed in S205 is to be transmitted to the server 20 (S206). For example, if the blur determination unit 114 determines that the photographed image is blurred to a degree greater than or equal to a threshold, the transmission control unit 107 determines that the photographed image is not to be transmitted to the server 20 (S206: NO). On the other hand, if the blur determination unit 114 determines that the photographed image is blurred to a degree less than the threshold, the transmission control unit 107 determines that the photographed image is to be transmitted to the server 20 (S206: YES).
- Then, if it is determined that the photographed image is not to be transmitted to the server 20 (S206: NO), the information processing apparatus 10 performs the operation of S201 again.
- On the other hand, if it is determined that the photographed image is to be transmitted to the server 20 (S206: YES), the image processing unit 110 generates an image obtained by reducing the amount of information of the photographed image in a manner corresponding to the action state of the user recognized by the action recognition unit 102 during photographing by the photographing unit 124 (S207). For example, if the face region detection unit 112 detects a face region of a person in the photographed image, the image processing unit 110 generates an image obtained by clipping the detected face region from the photographed image.
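- The per-state reduction of step S207 can be read as a simple dispatch on the recognized action state. The following Python sketch, using the Pillow library, is one such reading; the state labels, the halving of the resolution, and the concrete crop rectangles are illustrative assumptions rather than values given in the disclosure.

```python
# Hypothetical sketch of step S207: reduce the amount of information of the
# photographed image in a manner corresponding to the recognized action state.
# State labels, the scale factor, and the crop rectangles are assumptions.
from PIL import Image


def reduce_information(photo: Image.Image, action_state: str) -> Image.Image:
    w, h = photo.size
    if action_state == "looking_around":
        # Compress the resolution: a quickly panning view needs less detail.
        return photo.resize((w // 2, h // 2))
    if action_state == "watching":
        # Clip the region estimated to be watched (here: the central quarter).
        return photo.crop((w // 4, h // 4, 3 * w // 4, 3 * h // 4))
    if action_state == "looking_down":
        # Clip a band within a predetermined distance from the lower end.
        return photo.crop((0, h // 2, w, h))
    if action_state == "looking_up":
        # Clip a band within a predetermined distance from the upper end.
        return photo.crop((0, 0, w, h // 2))
    return photo  # unrecognized state: leave the image unmodified
```

Each branch returns an image carrying a smaller amount of information than the input, which is what the transmission control unit 107 then sends in step S208.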
- Subsequently, the transmission control unit 107 causes the communication unit 120 to transmit the image generated or processed in step S207 to the server 20 (S208).
- The operations subsequent to step S208 are substantially similar to those of steps S108 to S111 of the first embodiment illustrated in FIG. 11, and thus their description is omitted.
- In the above, as described, for example, with reference to FIGS. 12, 24, and other illustrations, the information processing apparatus 10 according to the second embodiment recognizes an action state of the user based on the measurement results obtained by the measurement unit 122. The information processing apparatus 10 then performs the process regarding the amount of information of the image photographed by the photographing unit 124 based on the action state of the user recognized during photographing by the photographing unit 124, and causes the processed image to be transmitted to the server 20. Thus, the information processing apparatus 10 can adaptively reduce the communication traffic incurred in transmitting the processed image to the server 20.
- For example, when the information processing apparatus 10 recognizes an action in which the user is looking around during photographing, it compresses the resolution of the photographed image and transmits the lower-resolution image to the server 20. Alternatively, when the information processing apparatus 10 recognizes a state in which the user is watching during photographing, it clips the region estimated as being watched by the user and transmits the clipped region to the server 20. It is thus possible to reduce the amount of information in a manner appropriate to the action state of the user during photographing, thereby adaptively reducing the communication traffic.
- In addition, if the photographed image is blurred to a degree greater than or equal to a threshold, the information processing apparatus 10 may refrain from transmitting the image to the server 20. In this way, the information processing apparatus 10 does not transmit an image on which the server 20 would have difficulty performing appropriate image recognition, thereby reducing the communication traffic more efficiently.
- Furthermore, the information processing apparatus 10 basically performs only the detection processing that determines whether the photographed image contains a face region of a person; if a face region is detected, the information processing apparatus 10 transmits an image of the detected face region to the server 20. The information processing apparatus 10 has the server 20 specify the person corresponding to the detected face region and receives the specified result from the server 20. Thus, the information processing apparatus 10 can reduce the amount of calculation necessary to specify a person contained in the photographed image, resulting in reduced power consumption.
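- The division of labor just described (detect on the device, recognize on the server) might be sketched as follows. The face detector, the endpoint URL, and the JSON field name are assumptions for illustration only; the disclosure specifies merely that the apparatus transmits the clipped face region and receives the specified result.

```python
# Hypothetical sketch of the on-device/server split: detect a face region
# locally, transmit only the clipped region, and let the server identify the
# person. The endpoint and response shape are assumptions, not from the patent.
import cv2
import requests

_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def detect_and_offload(image_path: str, server_url: str):
    frame = cv2.imread(image_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face region: nothing to transmit
    x, y, w, h = faces[0]
    ok, jpeg = cv2.imencode(".jpg", frame[y:y + h, x:x + w])
    resp = requests.post(server_url, data=jpeg.tobytes(),
                         headers={"Content-Type": "image/jpeg"})
    return resp.json().get("person_name")  # server's identification result
```

Only the clipped face region crosses the network, and the computationally heavy identification runs on the server, which is the source of the traffic and power savings claimed above.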
- The preferred embodiments of the present disclosure have been described above with reference to the accompanying drawings, but the present disclosure is, of course, not limited to the above examples. A person skilled in the art may find various alterations and modifications within the scope of the appended claims, and it should be understood that they will naturally come under the technical scope of the present disclosure.
- For example, the information processing apparatus 10 according to each embodiment is not limited to a device provided with a glasses-type display as illustrated in FIG. 1. The information processing apparatus 10 may be configured as a wristwatch-type device, a device worn on the user's neck such as a neck strap, a device mounted on the clothing of the user such as a wearable badge, or a device attached to the body of the user such as a headphone.
- Furthermore, in the above, there has been described the example in which the information processing apparatus 10 is configured to include all of the various types of sensors, such as the position information measuring device 168 and the acceleration sensor 170, but an embodiment of the present disclosure is not limited thereto. Any one or more of the position information measuring device 168, the acceleration sensor 170, the gyroscope 172, and the microphone 174 may be provided in another portable device that the user can carry. For example, when the acceleration sensor 170 is provided in another device worn near the user's waist, the acceleration while the user is walking can advantageously be measured more accurately.
- Moreover, according to one or more embodiments of the present disclosure, it is possible to provide a computer program that allows hardware such as the CPU 150, the ROM 152, and the RAM 154 to execute functions equivalent to those of each component of the information processing apparatus 10 described above. A storage medium storing the computer program is also provided.
- (1) An information processing apparatus including:
- an action recognition unit configured to recognize an action state of a user based on a measurement result obtained by a sensor carried by the user;
- an image processing unit configured to perform a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized by the action recognition unit during photographing by the photographing unit carried by the user; and
- a transmission control unit configured to cause the image processed by the image processing unit to be transmitted to an image processing device used to perform image recognition.
- (2) The information processing apparatus according to (1),
- wherein the image processing unit generates an image obtained by reducing an amount of information of the photographed image in a manner corresponding to an action state of the user recognized by the action recognition unit during photographing by the photographing unit, and
- wherein the transmission control unit causes the image having the reduced amount of information to be transmitted to the image processing device.
- (3) The information processing apparatus according to (1) or (2), wherein the action recognition unit recognizes an action state of the user based on a movement of a head of the user measured by the sensor.
- (4) The information processing apparatus according to (3),
- wherein the action state of the user includes a vision-related state of the user,
- wherein the action recognition unit recognizes that the user is looking around when the sensor measures that a movement of the user's head is within a predetermined range, and
- wherein the image processing unit generates an image obtained by reducing the resolution of the photographed image when a state in which the user is looking around is recognized.
- (5) The information processing apparatus according to (3) or (4),
- wherein the action recognition unit recognizes that the user is watching when the sensor measures that a speed of movement of the head of the user is reduced to be less than or equal to a predetermined value, and
- wherein the image processing unit generates an image obtained by clipping a predetermined region from the photographed image when a state in which the user is watching is recognized.
- (6) The information processing apparatus according to any one of (3) to (5),
- wherein the action state of the user includes a movement state of the user, and
- wherein the image processing unit generates an image obtained by clipping a predetermined region from the photographed image when a state in which the user is moving during photographing by the photographing unit is recognized.
- (7) The information processing apparatus according to (5) or (6), wherein the predetermined region is a peripheral region at a predetermined distance or more away from a center in the photographed image.
- (8) The information processing apparatus according to (5) or (6), wherein the predetermined region is a region within a predetermined distance from a center in the photographed image.
- (9) The information processing apparatus according to any one of (1) to (4), further including:
- a face region detection unit configured to detect a face region of a person contained in the photographed image,
- wherein the image processing unit clips a face region detected by the face region detection unit from the photographed image, and
- wherein the transmission control unit causes an image of the face region clipped by the image processing unit to be transmitted to the image processing device.
- (10) The information processing apparatus according to (3) or (4),
- wherein the action recognition unit recognizes, using the sensor, that the user is looking down,
- wherein the image processing unit clips a region from the photographed image such that the region to be clipped is within a predetermined distance from a lower end of the photographed image when a state in which the user is looking down is recognized, and
- wherein the transmission control unit causes an image of the region clipped by the image processing unit to be transmitted to the image processing device.
- (11) The information processing apparatus according to (3) or (4),
- wherein the action recognition unit recognizes, using the sensor, that the user is looking upward,
- wherein the image processing unit clips a region from the photographed image such that the region to be clipped is within a predetermined distance from an upper end of the photographed image when a state in which the user is looking upward is recognized, and
- wherein the transmission control unit causes an image of the region clipped by the image processing unit to be transmitted to the image processing device.
- (12) The information processing apparatus according to any one of (1) to (11), further including:
- a blur determination unit configured to determine whether the photographed image is blurred to be greater than or equal to a threshold,
- wherein the transmission control unit prevents the photographed image from being transmitted to the image processing device when the blur determination unit determines that the photographed image is blurred to be greater than or equal to the threshold.
- (13) The information processing apparatus according to any one of (1) to (11), further including:
- a blur determination unit configured to determine whether the photographed image is blurred to be greater than or equal to a threshold,
- wherein the image processing unit corrects a blur contained in the photographed image based on an action state of the user recognized by the action recognition unit when the blur determination unit determines that the photographed image is blurred to be greater than or equal to the threshold.
- (14) An information processing method including:
- recognizing an action state of a user based on a measurement result obtained by a sensor carried by the user;
- performing, by a processor, a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized during photographing by the photographing unit carried by the user; and
- causing the processed image to be transmitted to an image processing device used to perform image recognition.
- (15) A program for causing a computer to function as:
- an action recognition unit configured to recognize an action state of a user based on a measurement result obtained by a sensor carried by the user;
- an image processing unit configured to perform a process regarding an amount of information on an image photographed by a photographing unit based on an action state of the user recognized by the action recognition unit during photographing by the photographing unit carried by the user; and
- a transmission control unit configured to cause the image processed by the image processing unit to be transmitted to an image processing device used to perform image recognition.
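- As one possible reading of the predetermined regions recited in items (5) to (8) above, the following Python sketch computes the region within a predetermined distance from a center in the photographed image and, conversely, the peripheral region at that distance or more away from the center. The pixel distance and the zero fill for discarded pixels are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch of the "predetermined region" variants in items (5)-(8):
# the central region within distance d of the image center, or the peripheral
# region at distance d or more from the center. Distances are assumptions.
import numpy as np


def central_region(img: np.ndarray, d: int) -> np.ndarray:
    """Clip the region within distance d (pixels) of the image center."""
    cy, cx = img.shape[0] // 2, img.shape[1] // 2
    return img[max(cy - d, 0):cy + d, max(cx - d, 0):cx + d]


def peripheral_region(img: np.ndarray, d: int) -> np.ndarray:
    """Keep only pixels at distance d or more from the center (center zeroed)."""
    out = img.copy()
    cy, cx = out.shape[0] // 2, out.shape[1] // 2
    out[max(cy - d, 0):cy + d, max(cx - d, 0):cx + d] = 0
    return out
```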
Claims (15)
1. An information processing device, comprising:
at least one CPU configured to:
determine, based on motion information sent from a motion sensor, an action state of a user while a camera carried by the user is activated, wherein the action state includes a movement state and a non-movement state, and a movement amount in the movement state is larger than that in the non-movement state;
in response to determining the movement state, control the camera to capture an image at a first capturing frequency; and
in response to determining the non-movement state, control the camera to capture an image at a second capturing frequency less than the first capturing frequency.
2. The information processing device according to claim 1, wherein the at least one CPU is configured to make power consumption in the non-movement state less than that in the movement state based on the control of the camera to capture the image at the second capturing frequency.
3. The information processing device according to claim 1 , wherein
the at least one CPU is configured to:
control the camera to clip a region from the captured image in an event that the action state is a state in which the user is watching; and
cause the clipped region to be transmitted to an image recognition unit configured to recognize the image,
wherein the region is within a determined distance from a center in the captured image.
4. The information processing device according to claim 3, wherein the clipped region has an information amount less than that of the captured image.
5. The information processing device according to claim 4 , wherein the at least one CPU is configured to determine the action state of the user based on a movement of a head of the user detected by the motion sensor.
6. The information processing device according to claim 5 , wherein
the motion sensor is configured to detect a speed of movement of the head of the user, and
the at least one CPU is configured to determine that the user is watching in an event that the detected speed is reduced to be less than or equal to a threshold value.
7. The information processing device according to claim 1 , wherein
the motion sensor is configured to obtain acceleration information, and
the at least one CPU is configured to control the camera to change the second capturing frequency to the first capturing frequency in accordance with the acceleration information.
8. The information processing device according to claim 7, wherein the at least one CPU is configured to control the camera to capture an image at the first capturing frequency in an event that the action state is a walking state.
9. The information processing device according to claim 1, wherein the at least one CPU is configured to reduce an information amount of the captured image in an event that the user is in the movement state, and transmit the captured image that has the reduced information amount to an image recognition unit configured for image recognition.
10. The information processing device according to claim 9 , wherein the information amount is a resolution of the captured image.
11. The information processing device according to claim 9 , wherein the at least one CPU is configured to:
clip a region that corresponds to a facing direction of the user, and
transmit the clipped region to an image recognition unit configured for image recognition.
12. The information processing device according to claim 11 , wherein the facing direction is associated with a state where the user looks around, downward, or upward.
13. The information processing device according to claim 1 , wherein the at least one CPU is configured to prevent an image, which has a blurring level greater than or equal to a threshold, from being transmitted to an image recognition unit configured for image recognition.
14. An information processing method, comprising:
determining, based on motion information sent from a motion sensor, an action state of a user while a camera carried by the user is activated, wherein the action state includes a movement state and a non-movement state, and a movement amount in the movement state is larger than that in the non-movement state;
in response to determining the movement state, controlling the camera to capture an image at a first capturing frequency; and
in response to determining the non-movement state, controlling the camera to capture an image at a second capturing frequency less than the first capturing frequency.
15. A non-transitory computer-readable storage medium, having stored thereon, a set of instructions for causing a computer to perform operations, comprising:
determining, based on motion information sent from a motion sensor, an action state of a user while a camera carried by the user is activated, wherein the action state includes a movement state and a non-movement state, and a movement amount in the movement state is larger than that in the non-movement state;
in response to determining the movement state, controlling the camera to capture an image at a first capturing frequency; and
in response to determining the non-movement state, controlling the camera to capture an image at a second capturing frequency less than the first capturing frequency.
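Claims 1, 14, and 15 recite the same capture-frequency control in apparatus, method, and medium form. The following Python sketch is a non-authoritative illustration of that control loop; the accelerometer-magnitude test for the movement state and the two concrete frequencies are assumptions, as the claims leave both unspecified.

```python
# Hypothetical sketch of the frequency control recited in claims 1, 14, and 15:
# a movement state selects the higher first capturing frequency, a non-movement
# state the lower second frequency. Thresholds and frequencies are assumptions.
import time

FIRST_FREQ_HZ = 2.0    # movement state: capture more often
SECOND_FREQ_HZ = 0.2   # non-movement state: capture less often, saving power
MOVEMENT_ACCEL_THRESHOLD = 1.5  # m/s^2 above a gravity-compensated baseline


def in_movement_state(accel_magnitude: float) -> bool:
    """Classify the action state from motion-sensor information."""
    return accel_magnitude >= MOVEMENT_ACCEL_THRESHOLD


def capture_loop(read_accel, capture_image):
    """Drive the camera at a frequency chosen from the current action state."""
    while True:
        freq = FIRST_FREQ_HZ if in_movement_state(read_accel()) else SECOND_FREQ_HZ
        capture_image()
        time.sleep(1.0 / freq)
```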
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/235,703 US20160353014A1 (en) | 2013-11-01 | 2016-08-12 | Information processing apparatus, information processing method, and medium using an action state of a user |
| US15/969,261 US10609279B2 (en) | 2013-11-01 | 2018-05-02 | Image processing apparatus and information processing method for reducing a captured image based on an action state, transmitting the image depending on blur, and displaying related information |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2013228074A JP6079566B2 (en) | 2013-11-01 | 2013-11-01 | Information processing apparatus, information processing method, and program |
| JP2013-228074 | 2013-11-01 | ||
| US14/521,007 US9432532B2 (en) | 2013-11-01 | 2014-10-22 | Information processing apparatus, information processing method, and medium using an action state of a user |
| US15/235,703 US20160353014A1 (en) | 2013-11-01 | 2016-08-12 | Information processing apparatus, information processing method, and medium using an action state of a user |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/521,007 Continuation US9432532B2 (en) | 2013-11-01 | 2014-10-22 | Information processing apparatus, information processing method, and medium using an action state of a user |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/969,261 Continuation US10609279B2 (en) | 2013-11-01 | 2018-05-02 | Image processing apparatus and information processing method for reducing a captured image based on an action state, transmitting the image depending on blur, and displaying related information |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20160353014A1 true US20160353014A1 (en) | 2016-12-01 |
Family
ID=53006767
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/521,007 Active US9432532B2 (en) | 2013-11-01 | 2014-10-22 | Information processing apparatus, information processing method, and medium using an action state of a user |
| US15/235,703 Abandoned US20160353014A1 (en) | 2013-11-01 | 2016-08-12 | Information processing apparatus, information processing method, and medium using an action state of a user |
| US15/969,261 Active 2034-11-27 US10609279B2 (en) | 2013-11-01 | 2018-05-02 | Image processing apparatus and information processing method for reducing a captured image based on an action state, transmitting the image depending on blur, and displaying related information |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/521,007 Active US9432532B2 (en) | 2013-11-01 | 2014-10-22 | Information processing apparatus, information processing method, and medium using an action state of a user |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/969,261 Active 2034-11-27 US10609279B2 (en) | 2013-11-01 | 2018-05-02 | Image processing apparatus and information processing method for reducing a captured image based on an action state, transmitting the image depending on blur, and displaying related information |
Country Status (2)
| Country | Link |
|---|---|
| US (3) | US9432532B2 (en) |
| JP (1) | JP6079566B2 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11206303B2 (en) | 2017-10-18 | 2021-12-21 | Mitsubishi Electric Corporation | Image sharing assistance device, image sharing system, and image sharing assistance method |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015089059A (en) | 2013-11-01 | 2015-05-07 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
| US9407823B2 (en) * | 2013-12-09 | 2016-08-02 | Microsoft Technology Licensing, Llc | Handling video frames compromised by camera motion |
| JP6539122B2 (en) * | 2015-06-17 | 2019-07-03 | キヤノン株式会社 | INFORMATION PROCESSING APPARATUS, CONTROL METHOD FOR INFORMATION PROCESSING APPARATUS, AND PROGRAM |
| JP7043255B2 (en) | 2017-12-28 | 2022-03-29 | キヤノン株式会社 | Electronic devices and their control methods |
| JP2019121857A (en) * | 2017-12-28 | 2019-07-22 | キヤノン株式会社 | Electronic apparatus and control method of the same |
| EP3866161A1 (en) | 2018-10-09 | 2021-08-18 | Sony Group Corporation | Information processing device, information processing method, and program |
| WO2021255975A1 (en) * | 2020-06-17 | 2021-12-23 | ソニーグループ株式会社 | Imaging device, imaging control device, imaging device control method, and program |
| CN112699884A (en) * | 2021-01-29 | 2021-04-23 | 深圳市慧鲤科技有限公司 | Positioning method, positioning device, electronic equipment and storage medium |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060256140A1 (en) * | 2005-05-11 | 2006-11-16 | L-3 Communications Corporation | Dynamic display optimization method and system with image motion |
| US20060291840A1 (en) * | 2005-06-09 | 2006-12-28 | Sony Corporation | Information processing device and method, photographing device, and program |
| US20070104462A1 (en) * | 2005-11-10 | 2007-05-10 | Sony Corporation | Image signal processing device, imaging device, and image signal processing method |
| US20100073515A1 (en) * | 2008-09-22 | 2010-03-25 | Todd Conard | Systems and methods for imaging objects |
| US20130127980A1 (en) * | 2010-02-28 | 2013-05-23 | Osterhout Group, Inc. | Video display modification based on sensor input for a see-through near-to-eye display |
| US20130241955A1 (en) * | 2010-11-09 | 2013-09-19 | Fujifilm Corporation | Augmented reality providing apparatus |
| US20130308002A1 (en) * | 2012-05-17 | 2013-11-21 | Samsung Electronics Co. Ltd. | Apparatus and method for adaptive camera control method based on predicted trajectory |
| US20130335301A1 (en) * | 2011-10-07 | 2013-12-19 | Google Inc. | Wearable Computer with Nearby Object Response |
| US20140176722A1 (en) * | 2012-12-25 | 2014-06-26 | Casio Computer Co., Ltd. | Imaging device, imaging control method and storage medium |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0788666B2 (en) | 1991-03-09 | 1995-09-27 | 日本植生株式会社 | Spraying method of spraying base material in greening method |
| JP2002118780A (en) * | 2000-10-05 | 2002-04-19 | Ricoh Co Ltd | Image pickup device having camera-shake correction function |
| JP4720358B2 (en) | 2005-08-12 | 2011-07-13 | ソニー株式会社 | Recording apparatus and recording method |
| JP2007228154A (en) | 2006-02-22 | 2007-09-06 | Matsushita Electric Ind Co Ltd | Image processing apparatus and image processing method |
| JP5159189B2 (en) * | 2007-06-29 | 2013-03-06 | キヤノン株式会社 | Image processing apparatus, imaging apparatus, image processing method, and program |
| JP2010061265A (en) | 2008-09-02 | 2010-03-18 | Fujifilm Corp | Person retrieval and registration system |
| JP2012049871A (en) * | 2010-08-27 | 2012-03-08 | Nec Casio Mobile Communications Ltd | Portable device and camera shake correction method |
| JP2012212989A (en) | 2011-03-30 | 2012-11-01 | Brother Ind Ltd | Head-mounted camera and head-mounted display |
| JP5868618B2 (en) * | 2011-06-14 | 2016-02-24 | オリンパス株式会社 | Information processing apparatus, image processing system, and program |
- 2013-11-01: JP JP2013228074A, granted as JP6079566B2 (active)
- 2014-10-22: US US14/521,007, granted as US9432532B2 (active)
- 2016-08-12: US US15/235,703, published as US20160353014A1 (not active, abandoned)
- 2018-05-02: US US15/969,261, granted as US10609279B2 (active)
Patent Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7593026B2 (en) * | 2005-05-11 | 2009-09-22 | L-3 Communications Corporation | Dynamic display optimization method and system with image motion |
| US20060256140A1 (en) * | 2005-05-11 | 2006-11-16 | L-3 Communications Corporation | Dynamic display optimization method and system with image motion |
| US7917020B2 (en) * | 2005-06-09 | 2011-03-29 | Sony Corporation | Information processing device and method, photographing device, and program |
| US20060291840A1 (en) * | 2005-06-09 | 2006-12-28 | Sony Corporation | Information processing device and method, photographing device, and program |
| US20070104462A1 (en) * | 2005-11-10 | 2007-05-10 | Sony Corporation | Image signal processing device, imaging device, and image signal processing method |
| US8416319B2 (en) * | 2008-09-22 | 2013-04-09 | Freedom Scientific, Inc. | Systems and methods for imaging objects |
| US20100073515A1 (en) * | 2008-09-22 | 2010-03-25 | Todd Conard | Systems and methods for imaging objects |
| US20130127980A1 (en) * | 2010-02-28 | 2013-05-23 | Osterhout Group, Inc. | Video display modification based on sensor input for a see-through near-to-eye display |
| US20130241955A1 (en) * | 2010-11-09 | 2013-09-19 | Fujifilm Corporation | Augmented reality providing apparatus |
| US9001155B2 (en) * | 2010-11-09 | 2015-04-07 | Fujifilm Corporation | Augmented reality providing apparatus |
| US20130335301A1 (en) * | 2011-10-07 | 2013-12-19 | Google Inc. | Wearable Computer with Nearby Object Response |
| US20130308002A1 (en) * | 2012-05-17 | 2013-11-21 | Samsung Electronics Co. Ltd. | Apparatus and method for adaptive camera control method based on predicted trajectory |
| US8773542B2 (en) * | 2012-05-17 | 2014-07-08 | Samsung Electronics Co., Ltd. | Apparatus and method for adaptive camera control method based on predicted trajectory |
| US20140176722A1 (en) * | 2012-12-25 | 2014-06-26 | Casio Computer Co., Ltd. | Imaging device, imaging control method and storage medium |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11206303B2 (en) | 2017-10-18 | 2021-12-21 | Mitsubishi Electric Corporation | Image sharing assistance device, image sharing system, and image sharing assistance method |
Also Published As
| Publication number | Publication date |
|---|---|
| US20180255236A1 (en) | 2018-09-06 |
| US10609279B2 (en) | 2020-03-31 |
| JP2015089060A (en) | 2015-05-07 |
| JP6079566B2 (en) | 2017-02-15 |
| US9432532B2 (en) | 2016-08-30 |
| US20150124111A1 (en) | 2015-05-07 |
Similar Documents
| Publication | Title |
|---|---|
| US10609279B2 (en) | Image processing apparatus and information processing method for reducing a captured image based on an action state, transmitting the image depending on blur, and displaying related information |
| EP3923634B1 (en) | Method for identifying specific position on specific route and electronic device |
| EP3907981B1 (en) | Recording frame rate control method and related apparatus |
| US10638046B2 (en) | Wearable device, control apparatus, photographing control method and automatic imaging apparatus |
| US9661221B2 (en) | Always-on camera sampling strategies |
| JP6096654B2 (en) | Image recording method, electronic device, and computer program |
| CN113556466A (en) | Focusing method and electronic equipment |
| CN111368765A (en) | Vehicle position determining method and device, electronic equipment and vehicle-mounted equipment |
| US9742988B2 (en) | Information processing apparatus, information processing method, and program |
| CN114449151B (en) | Image processing method and related device |
| CN112165575A (en) | Image blurring processing method and device, storage medium and electronic equipment |
| CN113672756A (en) | A visual positioning method and electronic device |
| CN116700654B (en) | Image display method, device, terminal equipment and storage medium |
| JP6256634B2 (en) | Wearable device, wearable device control method, and program |
| US20210366423A1 (en) | Display control device, imaging device, display control method, and display control program |
| JP6414313B2 (en) | Portable device, method for controlling portable device, and program |
| CN113923351A (en) | Exit method, device, storage medium and program product for multi-channel video shooting |
| CN114885086A (en) | Image processing method, head-mounted device and computer-readable storage medium |
| JP2018006797A (en) | Electronic apparatus, electronic apparatus system, wearable apparatus, control program, still picture acquisition method and operation method for wearable apparatus |
| CN119544885A (en) | A control method and electronic device based on vertical synchronization signal |
| US20220198829A1 (en) | Mobile communications device and application server |
| CN117714835A (en) | Image processing method, electronic device and readable storage medium |
| CN118968704A (en) | Fatigue reminder method, device and electronic equipment |
| JP2014179743A (en) | Electronic apparatus and method of controlling electronic apparatus |
| CN107306328A (en) | Active recording system and control method thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |