WO2018008210A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2018008210A1 (PCT/JP2017/013655)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- real object
- user
- display
- information processing
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/38—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Augmented reality (AR) technology superimposes virtual objects on the user's view of the real world.
- Patent Document 2 describes a technique for controlling the display of a display object on a transmissive display so that a user can visually recognize, through the transmissive display, a real object located behind it.
- JP 2012-155654 A; Japanese Patent No. 5830987
- However, Patent Document 2 does not disclose changing the display method of the display object on the transmissive display according to the real object.
- Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of changing the degree of recognition of a real object by the user adaptively to the real object included in the user's field of view.
- According to the present disclosure, there is provided an information processing apparatus including an output control unit that controls display by a display unit, based on a determination result of whether or not a real object included in the user's field of view is a first real object, such that the degree of recognition of the real object by the user changes in a range between the user and the real object.
- According to the present disclosure, there is also provided an information processing method including controlling, by a processor, display by a display unit such that the degree of recognition of the real object is changed.
- Further, there is provided a program for causing a computer to function as an output control unit that controls display by a display unit, based on a determination result of whether or not a real object included in the user's field of view is a first real object, such that the degree of recognition of the real object by the user changes in a range between the user and the real object.
- According to the present disclosure as described above, the degree of recognition of a real object by the user can be changed adaptively according to the real object included in the user's field of view.
- the effects described here are not necessarily limited, and may be any of the effects described in the present disclosure.
- FIG. 3 is a schematic diagram showing how a user visually recognizes a real object and a virtual object through an AR glass 10.
- FIG. 6 is another schematic diagram illustrating a state in which a user visually recognizes a real object and a virtual object through the AR glass 10. Further figures show: the real object and virtual object visually recognized through the display unit 124 in the situation shown above; a functional block diagram of a configuration example of the AR glass 10 according to the present embodiment; and what the user visually recognizes through the AR glass 10 when the real object 30a is set to avoid concealment.
- FIG. 6 is a diagram illustrating an example of a position change allowable range set for a virtual object 32.
- Further figures show: a display example of virtual objects on an evacuation route; another display example of virtual objects on an evacuation route; a flowchart of an operation example according to the present embodiment; and an explanatory diagram of a hardware configuration example of the AR glass 10 according to the present embodiment.
- In this specification and the drawings, a plurality of components having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral.
- For example, a plurality of components having substantially the same functional configuration are distinguished as needed, such as the AR glass 10a and the AR glass 10b.
- However, the AR glass 10a and the AR glass 10b are simply referred to as the AR glass 10 when it is not necessary to distinguish between them.
- the information processing system includes an AR glass 10, a server 20, and a communication network 22.
- the AR glass 10 is an example of an information processing apparatus according to the present disclosure.
- The AR glass 10 is a device that controls the display of virtual objects associated in advance with positions in the real world. For example, the AR glass 10 first acquires, from the server 20 via the communication network 22, virtual objects located around its current position (for example, within a certain range in all directions), based on the position information of the AR glass 10. The AR glass 10 then displays, on the display unit 124 described later, those of the acquired virtual objects that are included in the user's field of view, based on the posture of the AR glass 10 (or the detection result of the user's line-of-sight direction).
- For example, the AR glass 10 generates a right-eye image and a left-eye image based on the acquired virtual object, displays the right-eye image on the right-eye display unit 124a, and displays the left-eye image on the left-eye display unit 124b. Thereby, the user can visually recognize a virtual stereoscopic image.
- the virtual object is basically a 3D object, but is not limited to such an example, and may be a 2D object.
- the display unit 124 of the AR glass 10 is configured by a transmissive display.
- FIG. 2 and FIG. 3 are schematic diagrams showing how the user wears the AR glass 10 and visually recognizes the real object 30 and the virtual object 32.
- Through the display unit 124, the user can simultaneously visually recognize the real objects 30 included in the user's field of view 40, among the plurality of real objects 30 located in the real world, and the virtual object 32.
- “real object” includes not only a single real object but also a predetermined area in the real world (for example, the entire building, an intersection, a corridor, etc.).
- The user's field of view 40 may be defined in various ways. For example, the user's field of view 40 may be estimated to be approximately the central portion of the area captured by the camera provided on the outside of the AR glass 10, that is, on its front side.
- Alternatively, the user's gaze direction may be estimated based on an eyeball image captured by the camera provided on the inside of the AR glass 10, that is, on its rear side, and a predetermined three-dimensional space corresponding to the gaze direction may be estimated to be the user's field of view 40.
- The three-dimensional shape of the user's field of view 40 may be determined as appropriate, but the shape is preferably defined as a substantially conical one.
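The conical field-of-view model described above can be illustrated with a short sketch (not part of the original disclosure; the function name, half-angle, and range are assumed example values):

```python
import numpy as np

def in_view_cone(eye_pos, gaze_dir, obj_pos, half_angle_deg=30.0, max_range=50.0):
    """Return True if obj_pos lies inside a cone of the given half-angle
    and range, opening from eye_pos along gaze_dir (all in meters)."""
    to_obj = np.asarray(obj_pos, dtype=float) - np.asarray(eye_pos, dtype=float)
    dist = np.linalg.norm(to_obj)
    if dist == 0.0 or dist > max_range:
        return False
    gaze = np.asarray(gaze_dir, dtype=float)
    gaze = gaze / np.linalg.norm(gaze)
    # Angle between the gaze direction and the direction to the object.
    cos_angle = float(np.dot(to_obj / dist, gaze))
    return cos_angle >= np.cos(np.radians(half_angle_deg))
```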
- The relationship between the user's field of view 40 and the display area of the display unit 124 can be determined in various ways. For example, as shown in FIGS. 2 and 4, the relationship may be determined so that the region where the user's field of view 40a intersects the display area of the display unit 124 covers the entire display area. Alternatively, as illustrated in FIGS. 3 and 4, the relationship may be determined so that the region where the user's field of view 40b intersects the display area of the display unit 124 is smaller than the entire display area.
- FIG. 4 is a diagram illustrating an example of the real object 30 and the virtual object 32 that are visually recognized through the display unit 124 in the situation illustrated in FIGS. 2 and 3.
- the virtual object 32 is a non-transparent object.
- The virtual object 32 is located between the real object 30a and the real object 30b on the one hand and the user on the other. Therefore, as shown in FIG. 4, a part of each of the real object 30a and the real object 30b is visually recognized by the user as hidden behind the virtual object 32.
- the AR glass 10 can communicate with the server 20 via the communication network 22.
- the server 20 is a device that stores virtual objects in association with real-world position information.
- the real world position information may be information including latitude and longitude, or may be floor plan information in a predetermined building.
- When the server 20 receives a virtual object acquisition request from another device such as the AR glass 10, the server 20 transmits the virtual objects corresponding to the acquisition request to that device.
- the communication network 22 is a wired or wireless transmission path for information transmitted from a device connected to the communication network 22.
- the communication network 22 may include a public line network such as a telephone line network, the Internet, and a satellite communication network, various LANs including the Ethernet (registered trademark), a wide area network (WAN), and the like.
- the communication network 22 may include a dedicated network such as an IP-VPN (Internet Protocol-Virtual Private Network).
- the AR glass 10 according to the present embodiment has been created with the above circumstances taken into consideration.
- The AR glass 10 according to the present embodiment controls display by the display unit 124, based on the determination result of whether or not a real object included in the user's field of view is a specific real object, so that the degree of recognition of the real object changes in the range between the user and the real object. For this reason, for example, when a virtual object exists between a specific real object and the user, the AR glass 10 can control the display of the virtual object so that the user can recognize the real object. As a result, safety when displaying virtual objects can be improved.
- FIG. 5 is a functional block diagram showing a configuration example of the AR glass 10 according to the present embodiment.
- the AR glass 10 includes a control unit 100, a communication unit 120, a sensor unit 122, a display unit 124, and a storage unit 126.
- Control unit 100 centrally controls the operation of the AR glass 10 using hardware such as a CPU (Central Processing Unit) 150 and a RAM (Random Access Memory) 154, described later, built into the AR glass 10.
- the control unit 100 includes a virtual object acquisition unit 102, a real object determination unit 104, an overlap determination unit 106, and an output control unit 108.
- The virtual object acquisition unit 102 acquires virtual objects to be displayed from the server 20 based on the measurement result of the position information of the AR glass 10 by the sensor unit 122 described later. For example, the virtual object acquisition unit 102 first transmits the position information measured by the sensor unit 122 to the server 20 and acquires from the server 20 a plurality of virtual objects located around that position (for example, within a certain range in all directions). The virtual object acquisition unit 102 then extracts, from the received virtual objects, those included in the user's field of view as the virtual objects to be displayed, based on the posture of the AR glass 10 or the user's line-of-sight direction measured by the sensor unit 122.
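As a minimal sketch of this two-stage acquisition, assuming a hypothetical server client API and reusing the in_view_cone helper sketched earlier (none of these names come from the disclosure):

```python
def acquire_display_targets(server, position, eye_pos, gaze_dir, radius=100.0):
    """Stage 1: fetch all virtual objects within `radius` meters of the
    current position. Stage 2: keep only those inside the user's view cone."""
    nearby = server.get_virtual_objects(position, radius)  # hypothetical API
    return [obj for obj in nearby if in_view_cone(eye_pos, gaze_dir, obj.position)]
```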
- Real object determination unit 104 determines whether or not each real object included in the user's field of view is set to avoid concealment.
- A real object that is set to avoid concealment is an example of the first real object in the present disclosure, and a real object that is not set to avoid concealment is an example of the second real object in the present disclosure.
- concealment avoidance setting conditions may be registered in advance for each type of real object.
- In this case, the real object determination unit 104 first performs object recognition on each real object included in the user's field of view, based on, for example, a captured image of the area in front of the user captured by the sensor unit 122. The real object determination unit 104 then determines, for each real object in the user's field of view, whether or not the object is a concealment avoidance target, based on the concealment avoidance setting condition corresponding to the recognized object type.
- the concealment avoidance setting condition list may be stored in the storage unit 126 or may be stored in the server 20.
- Alternatively, the real object determination unit 104 may send an inquiry to the server 20 as to whether or not each recognized real object is a concealment avoidance target and, by obtaining the answer, specify whether or not each real object within the user's field of view is set to avoid concealment.
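The per-type condition list described above could, purely for illustration, be realized as a registry mapping recognized object types to predicate functions; the type names and attributes below are assumptions:

```python
# Hypothetical registry: object type -> predicate over (real_object, user).
CONCEALMENT_AVOIDANCE_CONDITIONS = {
    "traffic_light": lambda obj, user: True,           # always avoided
    "road_sign": lambda obj, user: True,               # always avoided
    "television": lambda obj, user: obj.power_is_on,   # only while powered on
}

def is_concealment_avoidance_target(obj, user):
    """Look up the condition for the recognized object type and evaluate it."""
    condition = CONCEALMENT_AVOIDANCE_CONDITIONS.get(obj.type)
    return condition is not None and condition(obj, user)
```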
- For example, a public sign such as a traffic light, a road sign, or a signboard at a construction site can always be set to avoid concealment. Concealment avoidance can also always be set for a predetermined area such as a pedestrian crossing (including the space above it), an intersection (including the space above it), or an evacuation route in a building such as a leisure facility. Thereby, the user can pass through, or drive a car, more safely.
- The predetermined criteria may include the positional relationship between the real object and the user. For example, when the user is located in front of a traffic light or an advertisement, these real objects are set to avoid concealment; when the user is located to the side of or behind them, they are not set to avoid concealment.
- The predetermined criteria may include the distance between the real object and the user.
- The predetermined criteria may include the moving direction, speed, and/or acceleration of the real object.
- The predetermined criteria may include the user's moving direction, speed, and/or acceleration. For example, when the user is moving toward a certain real object and the user's speed is equal to or higher than a predetermined threshold, the real object can be dynamically set to avoid concealment.
- The predetermined criteria may include a recognition result of another person's behavior. For example, when it is detected that another person is facing the user and speaking, that person can be set to avoid concealment.
- The predetermined criteria may include the state of the real object, for example the temperature of the real object, and the corresponding threshold may be determined for each type of real object.
- The predetermined criteria may include the device state of a real object (for example, an electronic device). For example, a television receiver or a PC (Personal Computer) can be set to avoid concealment only while its power is ON. Further, when it is detected that an electronic device in operation has failed, the electronic device can be set to avoid concealment.
- The predetermined criteria may include the state of sound, light, or smoke emitted from the real object.
- For example, when a predetermined sensor detects that smoke is being generated from a real object, the real object can be set to avoid concealment. Similarly, when a sound event is detected, such as a clock alarm, the opening and closing of an entrance door, a knock on a door, an entrance chime, a telephone, a kettle, a collision of real objects (a fall, etc.), the timer of an electronic device, or a fire alarm, the corresponding real objects can be set to avoid concealment.
- In this case, the output control unit 108 can hide a virtual object located between the sound source and the user. Accordingly, when the user looks in the direction of arrival of the sound, no virtual object is displayed on the path toward the sound source, so that the user can clearly perceive the source of the sound.
- The predetermined criteria may include the presence or absence of a contract, or a billing status. For example, if it is registered in the server 20 that an agreement regarding the display of an advertisement or a product has been concluded with the AR service operator, the advertisement or product can be set to avoid concealment only during the contract period.
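For illustration, several of the dynamic criteria listed above (distance, approach speed, temperature, alarms) could be combined into a single decision; every threshold and attribute name below is an assumed example value, not taken from the disclosure:

```python
def dynamic_avoidance(obj, user,
                      near_dist=5.0, approach_speed=1.5, hot_temp=60.0):
    """Decide concealment avoidance from several dynamic criteria.
    All thresholds and object attributes are illustrative assumptions."""
    if obj.distance_to(user) < near_dist:            # the object is close
        return True
    if obj.speed_toward(user) > approach_speed:      # the object approaches fast
        return True
    if getattr(obj, "temperature", 0.0) > hot_temp:  # e.g., a hot kettle
        return True
    if getattr(obj, "is_emitting_alarm", False):     # sound/smoke/fire alarm
        return True
    return False
```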
- The overlap determination unit 106 determines whether or not there is an overlap between a real object that the real object determination unit 104 has determined to be set to avoid concealment, among the real objects included in the user's field of view, and a display-target virtual object acquired by the virtual object acquisition unit 102. For example, the overlap determination unit 106 first specifies distance information (a depth map) for all virtual objects to be displayed, based on the position and posture of the AR glass 10 (or the detection result of the user's line-of-sight direction).
- Similarly, the overlap determination unit 106 specifies distance information (a depth map) for all real objects set to avoid concealment, based on the position and posture of the AR glass 10 (or the detection result of the user's line-of-sight direction). The overlap determination unit 106 then determines, by comparing the two pieces of distance information, whether or not each real object set to avoid concealment overlaps a virtual object to be displayed. Specifically, the overlap determination unit 106 determines, for each real object set to avoid concealment, whether or not a virtual object exists between the real object and the AR glass 10, and, if so, identifies all corresponding virtual objects.
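The depth-map comparison performed by the overlap determination unit 106 might be sketched as follows, assuming per-object depth maps rendered from the viewpoint of the AR glass 10; the render_depth helper and array conventions are assumptions:

```python
import numpy as np

def occluding_virtual_pixels(virtual_depth, real_depth):
    """Boolean mask of pixels where a virtual object lies in front of the
    real object (both depth maps in meters, np.inf where nothing is drawn)."""
    return np.isfinite(real_depth) & (virtual_depth < real_depth)

def overlapping_objects(virtual_objects, real_object, render_depth):
    """Identify every virtual object lying between the user and the given
    real object. `render_depth` is an assumed helper that renders a depth
    map for a list of objects from the user's viewpoint."""
    real_depth = render_depth([real_object])
    hits = []
    for v in virtual_objects:
        v_depth = render_depth([v])
        if np.any(occluding_virtual_pixels(v_depth, real_depth)):
            hits.append(v)
    return hits
```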
- Output control unit 108 (2-1-5-1. Control example 1) Based on the determination result by the real object determination unit 104 and the determination result by the overlap determination unit 106, the output control unit 108 controls the display by the display unit 124 so that the degree of recognition of the real object changes in a predetermined range located between the real object and the user. For example, the output control unit 108 controls the display by the display unit 124 so that, among the real objects included in the user's field of view, a real object that is set to avoid concealment has a higher degree of recognition than a real object that is not set to avoid concealment.
- the predetermined range may be a range that is located between the real object and the user and does not include the real object.
- For example, the output control unit 108 may hide all or a part of the virtual object (for example, the portion overlapping a real object set to avoid concealment and its vicinity).
- Alternatively, the output control unit 108 may make the virtual object translucent, display only the outline of the virtual object (wireframe display), or blink the virtual object at predetermined time intervals.
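These alternative display modes (hide, translucent, outline-only, blink) can be modeled, purely as an illustration, as a small enumeration applied to the rendering attributes of a virtual object; all attribute names are assumed:

```python
from enum import Enum, auto

class DisplayMode(Enum):
    DEFAULT = auto()
    HIDDEN = auto()
    TRANSLUCENT = auto()
    WIREFRAME = auto()   # outline (contour) only
    BLINKING = auto()    # blink at a fixed interval

def apply_display_mode(virtual_obj, mode, alpha=0.3, blink_period=0.5):
    """Mutate assumed rendering attributes of a virtual object."""
    virtual_obj.visible = mode is not DisplayMode.HIDDEN
    virtual_obj.alpha = alpha if mode is DisplayMode.TRANSLUCENT else 1.0
    virtual_obj.wireframe = mode is DisplayMode.WIREFRAME
    virtual_obj.blink_period = blink_period if mode is DisplayMode.BLINKING else None
```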
- FIG. 6 is an explanatory diagram showing an example in which the real object 30a is set to avoid concealment.
- In this case, a certain range around the real object 30a can be set as the concealment avoidance area 50.
- the concealment avoidance area 50 is an area (space) where concealment by a virtual object is avoided.
- the output control unit 108 hides the virtual object 32 positioned between the concealment avoidance area 50 and the display unit 124.
- Thereby, the entire real object 30a is visually recognized by the user on the display unit 124 without being hidden by the virtual object 32.
- the output control unit 108 makes the virtual object 32 translucent.
- In this case, the entire real object 30a is visually recognized by the user on the display unit 124 through the translucent virtual object 32.
- When changing the display mode of a virtual object, the output control unit 108 may control the display so as to emphasize the change. For example, when switching to displaying only the outline of the virtual object, the output control unit 108 may temporarily and gently flash the vicinity of the outline. Alternatively, when a real object set to avoid concealment is moving, the output control unit 108 may hide the virtual object with an animation, such as the virtual object located on the movement trajectory of the real object exploding or collapsing. Alternatively, the output control unit 108 may change the display mode of the virtual object while outputting a sound or generating a vibration on the AR glass 10 or another device carried by the user. According to these control examples, the user can be notified with more emphasis that the display mode of the virtual object has changed.
- Alternatively, the output control unit 108 may display the virtual object shifted from its default display position so that the real object set to avoid concealment does not overlap the virtual object.
- For example, the output control unit 108 shifts the display of the virtual object so that it does not overlap the real object set to avoid concealment while keeping the amount of position change as small as possible.
- In this case, the output control unit 108 shifts the display position of the virtual object 32 so that the virtual object 32 does not overlap the region where the concealment avoidance area 50 is visually recognized on the display unit 124.
- Thereby, the entire real object 30a is visually recognized by the user on the display unit 124 without being hidden by the virtual object 32.
- Note that an allowable change range (upper limit) for the position, posture, and size of a virtual object may be set in advance.
- For example, the change allowable range for the position can be set to -50 cm to +50 cm in each direction, and the change allowable range for the size can be set to 0.4 times or more and 1.0 times or less.
- the change allowable range may be defined in a world coordinate system, or may be defined in a user coordinate system (for example, a coordinate system based on the AR glass 10).
- the change allowable range may be defined in a vector space.
- FIG. 8 is an explanatory diagram showing an example in which the change allowable range 50 of a virtual object is defined in a vector space. For example, an allowable rotation amount and an allowable expansion/contraction value for each axis can be set at each point within the change allowable range 50.
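Combining the position shift described above with the allowable change range, a candidate-search sketch might look as follows; the grid step, the +/-50 cm bound, and the overlaps predicate are assumptions:

```python
import itertools
import numpy as np

def shifted_position(virtual_obj, overlaps, max_shift=0.5, step=0.1):
    """Search candidate offsets (meters) within +/-max_shift on each axis,
    in order of increasing displacement, and return the first position for
    which the assumed predicate `overlaps(position)` is False."""
    offsets = np.arange(-max_shift, max_shift + step / 2, step)
    candidates = sorted(itertools.product(offsets, repeat=3),
                        key=lambda d: np.linalg.norm(d))
    base = np.asarray(virtual_obj.position, dtype=float)
    for delta in candidates:
        pos = base + np.asarray(delta)
        if not overlaps(pos):
            return pos
    return None  # no admissible position: fall back to hiding/translucency
```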
- As a modification, the output control unit 108 may predict the user's future movement in advance and determine the time series of the display position of the virtual object in advance based on the prediction result. Thereby, the load at the time of updating the display of the virtual object can be reduced.
- (Modification 2) In general, the position of an individual person (that is, a real object) can change from moment to moment, for example in a crowd. Therefore, as another modification, the output control unit 108 can also determine the time series of the display positions of virtual objects in advance (path finding) by predicting the movement of one or more real objects set to avoid concealment. This eliminates the need to sequentially calculate the position to which a virtual object should be shifted each time a surrounding person moves, thereby reducing the load when updating the display of the virtual object.
- The output control unit 108 may also change how the virtual object is shifted based on, for example, the predetermined criteria described above. For example, the output control unit 108 may shift the virtual object located between a real object and the display unit 124 at a higher speed, or by a greater distance, as the speed of the real object moving toward the user increases. Alternatively, the output control unit 108 may shift the virtual object positioned between the real object and the display unit 124 at a higher speed, or by a greater distance, as the temperature of the real object set to avoid concealment is higher. According to these control examples, the magnitude of the danger can be conveyed to the user.
- Alternatively, the output control unit 108 may newly display another virtual object related to the corresponding real object at a position that does not overlap the virtual object. For example, when the corresponding real object is a traffic light, the output control unit 108 may newly display a virtual traffic light at a position shifted from the virtual object displayed overlapping the traffic light. Alternatively, the output control unit 108 may newly display information (text or an image) indicating the current lighting color of the traffic light, or may tint the entire display unit 124 lightly with the current lighting color of the traffic light.
- The output control unit 108 may also dynamically change the display mode of a virtual object located between a real object set to avoid concealment and the display unit 124, for example based on the predetermined criteria described above.
- For example, the output control unit 108 dynamically changes the display mode of the virtual object located between the real object and the display unit 124 based on a change in the distance between the real object and the display unit 124.
- For example, as the distance between the real object and the display unit 124 decreases, the output control unit 108 increases the transparency of the virtual object, gradually increases the mesh size of the virtual object, or gradually turns the virtual object into a wireframe.
- As a concrete example, as the distance between a traffic light located in front of the user and the display unit 124 decreases, the output control unit 108 may increase the transparency of the virtual object positioned between the traffic light and the display unit 124.
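The distance-dependent fading described here amounts to a monotone mapping from distance to transparency; a minimal sketch with assumed cutoff distances:

```python
def transparency_for_distance(distance, near=2.0, far=10.0):
    """Map the distance (meters) between a concealment-avoidance real
    object and the display to a transparency in [0, 1]: fully opaque
    beyond `far`, fully transparent at or inside `near` (assumed values)."""
    if distance >= far:
        return 0.0
    if distance <= near:
        return 1.0
    return (far - distance) / (far - near)
```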
- The output control unit 108 may also dynamically change the display mode of the virtual object located between the real object and the display unit 124 according to a change in the surrounding situation. For example, suppose that an evacuation route in a predetermined facility such as a leisure facility is set to avoid concealment. In this case, at the time of evacuation guidance, the output control unit 108 may perform control so that nothing is displayed on the evacuation route, for example by hiding virtual objects that become obstacles, or may newly display a virtual object for guidance. Alternatively, the output control unit 108 may always display virtual objects located on the evacuation route translucently.
- FIG. 9A is a diagram showing a display example of the virtual object 32 during normal times in a corridor that is an evacuation route. As shown in FIG. 9A, during normal times the output control unit 108 displays the virtual object 32a located on the evacuation route translucently, for example.
- FIG. 9B is a diagram showing a display example of the virtual object 32 during evacuation guidance in the same corridor. As shown in FIG. 9B, during evacuation guidance the output control unit 108 hides the virtual object 32a shown in FIG. 9A and newly displays a virtual object 32b for guidance. Note that the output control unit 108 may sequentially update the display position, tilt, or display content of the virtual object 32b in accordance with the movement of the user so as to guide the evacuation.
- The output control unit 108 may also change the display mode of one or more virtual objects located in the user's field of view according to the surrounding brightness. For example, when the surroundings are dark (at night, in cloudy weather, etc.), the output control unit 108 may enlarge the hidden portion of the virtual object located between the real object set to avoid concealment and the user, compared to when the surroundings are bright. Alternatively, in this case, the output control unit 108 may additionally make other virtual objects located around that virtual object semi-transparent, or may make all other virtual objects located within the user's field of view semi-transparent.
- The output control unit 108 may also dynamically control the display mode of a virtual object positioned between an advertisement and the display unit 124, based on the positional relationship between the user and the advertisement. For example, when the user is located in front of the advertisement, the output control unit 108 increases the transparency of the virtual object; when the user is located to the side of or behind the advertisement, the output control unit 108 reduces the transparency of the virtual object. Thereby, a decrease in the visibility of the virtual object can be suppressed as much as possible while maintaining the degree of recognition of the advertisement.
- The output control unit 108 may also change the display mode of a virtual object based on its display position on the display unit 124. For example, the output control unit 108 may relatively increase the transparency of the virtual object as the distance between the display position of the virtual object on the display unit 124 and the center of the display unit 124 decreases. Alternatively, the output control unit 108 may relatively increase the transparency of the virtual object as the distance between the display position of the virtual object and the user's gaze point on the display unit 124 decreases.
- the output control unit 108 can sequentially execute the display control described above every time the position and posture of the AR glass 10 (or the detection result of the user's line-of-sight direction) change.
- The output control unit 108 can also dynamically change the display mode of a virtual object in accordance with a change in the determination result by the real object determination unit 104. For example, when the concealment avoidance setting for a certain real object is switched from OFF to ON, the output control unit 108 changes the display mode of the virtual object positioned between the display unit 124 and the real object from the default display mode (as described above). Conversely, when the concealment avoidance setting for a certain real object is switched from ON to OFF, the output control unit 108 returns the display mode of the virtual object positioned between the display unit 124 and the real object to the default display mode.
- Alternatively, when the user makes an input for restoring the display mode of a virtual object after it has been changed from the default display mode, the output control unit 108 can return the display mode of the virtual object to the default display mode. Further, the output control unit 108 may identify, based on history information, virtual objects for which the user has permitted a return to the default display mode in the past, and thereafter display the identified virtual objects in the default display mode.
- the communication unit 120 transmits and receives information to and from other devices. For example, the communication unit 120 transmits an acquisition request for a virtual object located around the current position to the server 20 according to the control of the virtual object acquisition unit 102. In addition, the communication unit 120 receives a virtual object from the server 20.
- the sensor unit 122 may include a positioning device that receives a positioning signal from a positioning satellite such as a GPS (Global Positioning System) and measures the current position.
- the sensor unit 122 may include a range sensor.
- the sensor unit 122 includes, for example, a three-axis acceleration sensor, a gyroscope, a magnetic sensor, a camera, a depth sensor, and / or a microphone.
- the sensor unit 122 measures the speed, acceleration, posture, orientation, or the like of the AR glass 10.
- the sensor unit 122 captures an image of the eyes of the user wearing the AR glass 10 or captures an image in front of the AR glass 10.
- the sensor unit 122 can detect an object positioned in front of the user and can detect a distance to the detected object.
- the display unit 124 displays an image according to the control of the output control unit 108.
- the display unit 124 projects an image on at least a partial region (projection plane) of each of the left-eye lens and the right-eye lens.
- the left-eye lens and the right-eye lens can be formed of a transparent material such as resin or glass.
- the display unit 124 may include a liquid crystal panel and the transmittance of the liquid crystal panel may be controllable. Thereby, the display unit 124 can be controlled to be transparent or translucent.
- Storage unit 126 stores various data and various software. For example, the storage unit 126 stores a list of concealment avoidance setting conditions for each type of real object.
- FIG. 10 is a flowchart showing an operation example according to the present embodiment.
- First, based on the result of position information measurement by the sensor unit 122, the virtual object acquisition unit 102 of the AR glass 10 acquires from the server 20 a plurality of virtual objects located around the current position (for example, within a certain range in all directions). Then, the virtual object acquisition unit 102 extracts the virtual objects included in the user's field of view from the acquired virtual objects (S101).
- Subsequently, the real object determination unit 104 determines whether or not a real object set to avoid concealment exists among the real objects included in the user's field of view, based on, for example, the list of concealment avoidance setting conditions for each type of real object stored in the storage unit 126 (S103). If no real object set to avoid concealment exists in the user's field of view (S103: No), the AR glass 10 performs the process of S109 described later.
- Next, the overlap determination unit 106 determines, for each of the corresponding real objects, whether or not at least one of the virtual objects acquired in S101 is located between the real object and the user (S105).
- If no virtual object exists between any of the corresponding real objects and the user (S105: No), the AR glass 10 performs the process of S109 described later.
- On the other hand, if such a virtual object exists (S105: Yes), the output control unit 108 changes the display mode of the corresponding virtual object so that its visibility decreases. For example, the output control unit 108 hides the corresponding virtual object, makes it translucent, or shifts its display position so as not to overlap the corresponding real object (S107).
- the output control unit 108 causes the display unit 124 to display the virtual object acquired in S101 (S109).
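Putting steps S101 to S109 together, the per-frame control flow might be sketched as follows, reusing the hypothetical helpers from the earlier sketches; the sensors and display interfaces are likewise assumptions:

```python
def update_frame(server, sensors, display):
    # S101: acquire virtual objects around the current position and keep
    # those inside the user's field of view.
    targets = acquire_display_targets(server, sensors.position,
                                      sensors.eye_pos, sensors.gaze_dir)
    # S103: find real objects in view that are set to avoid concealment.
    avoided = [r for r in sensors.real_objects_in_view()
               if is_concealment_avoidance_target(r, sensors.user)]
    for real_obj in avoided:
        # S105: virtual objects located between this real object and the user.
        for v in overlapping_objects(targets, real_obj, sensors.render_depth):
            # S107: lower the visibility of the occluding virtual object.
            apply_display_mode(v, DisplayMode.TRANSLUCENT)
    # S109: display the (possibly modified) virtual objects.
    display.draw(targets)
```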
- As described above, the AR glass 10 according to the present embodiment determines whether or not a real object included in the user's field of view is a real object set to avoid concealment, and controls the display by the display unit 124 so that the degree of recognition of the real object changes in the range between the user and the real object. For this reason, the degree of recognition of a real object can be changed adaptively according to the real object included in the user's field of view.
- For example, the AR glass 10 controls display by the display unit 124 so that a real object set to avoid concealment has a higher degree of recognition than a real object not set to avoid concealment.
- For example, the user can visually recognize a real object for which concealment by a virtual object is avoided, or can visually recognize a virtual object that replaces the real object. Accordingly, since the user can recognize the existence of real objects set to avoid concealment, safety when the user acts while wearing the AR glass 10 can be improved.
- the AR glass 10 includes a CPU 150, a ROM (Read Only Memory) 152, a RAM 154, a bus 156, an interface 158, an input device 160, an output device 162, a storage device 164, and a communication device 166.
- the CPU 150 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the AR glass 10 according to various programs. In addition, the CPU 150 realizes the function of the control unit 100 in the AR glass 10.
- the CPU 150 is configured by a processor such as a microprocessor.
- the ROM 152 stores programs used by the CPU 150 and control data such as calculation parameters.
- the RAM 154 temporarily stores a program executed by the CPU 150, for example.
- the bus 156 includes a CPU bus and the like.
- the bus 156 connects the CPU 150, the ROM 152, and the RAM 154 to each other.
- the interface 158 connects the input device 160, the output device 162, the storage device 164, and the communication device 166 to the bus 156.
- The input device 160 includes, for example, input means for the user to input information, such as buttons, switches, levers, and a microphone, and an input control circuit that generates an input signal based on the user's input and outputs the input signal to the CPU 150.
- the output device 162 includes, for example, a display device such as a projector and an audio output device such as a speaker.
- the display device may be a liquid crystal display (LCD) device, an organic light emitting diode (OLED) device, or the like.
- the storage device 164 is a data storage device that functions as the storage unit 126.
- the storage device 164 includes, for example, a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, or a deletion device that deletes data recorded on the storage medium.
- the communication device 166 is a communication interface composed of a communication device for connecting to the communication network 22 or the like, for example.
- The communication device 166 may be a wireless LAN compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wired communication device that performs wired communication.
- the communication device 166 functions as the communication unit 120.
- (Modification 1) In the above-described embodiment, an example in which the AR glass 10 (output control unit 108) changes the display mode of each virtual object individually has been described. However, the present disclosure is not limited to this example, and the AR glass 10 may change the display mode of a plurality of virtual objects collectively. As an example, the output control unit 108 may hide, or make semi-transparent, all virtual objects located between all real objects set to avoid concealment and the user.
- The output control unit 108 may also change how the display mode of a virtual object is changed depending on the type of the virtual object. For example, for a virtual object that prompts the user for a one-time confirmation, the output control unit 108 may change the display mode depending on whether or not it has been recognized that the user has viewed the virtual object. More specifically, while the user has not yet viewed the virtual object, the output control unit 108 displays it in the default display mode; after it is recognized that the user has viewed it, the output control unit 108 displays it in a simplified format, such as hiding it or displaying only its outline.
- The output control unit 108 may also control the display so that the display mode of the virtual object displayed to each of a plurality of users is the same.
- The output control unit 108 may further control the display based on detection of the occurrence of a system error so that the visibility of virtual objects located in the user's field of view decreases. For example, in this case, the output control unit 108 hides all virtual objects or makes them semi-transparent.
- The output control unit 108 may also notify the user by voice or vibration (tactile stimulation) instead of, or in addition to, the display control of the virtual object described above.
- the output control unit 108 may cause the built-in speaker (not shown) to output sound indicating the lighting color of the traffic light (for example, “currently red” or “changed to blue”).
- Alternatively, a vibration pattern may be registered for each lighting color, and the output control unit 108 may vibrate another device carried by the user (a smartphone or a smartwatch) or the AR glass 10 itself with the vibration pattern corresponding to the current lighting color of the traffic light.
- the information processing apparatus may be the server 20.
- In this case, the server 20 can control the display of virtual objects on the AR glass 10 by acquiring the position information and posture information of the AR glass 10 (and the detection result of the user's line-of-sight direction) from the AR glass 10.
- the information processing apparatus is not limited to the server 20, and may be another type of apparatus that can be connected to the communication network 22, such as a smartphone, a tablet terminal, a PC, or a game machine. Alternatively, the information processing apparatus may be a car.
- In the above embodiment, an example has been described in which the display unit in the present disclosure is the display unit 124 of the AR glass 10, but the present disclosure is not limited to this example.
- For example, the display unit may be a see-through device such as a head-up display (for example, a vehicle windshield) or a desktop transparent display, or a video see-through device such as a video-transmission-type HMD (Head Mounted Display) or a tablet terminal. In the latter case, captured images of the scene in front of the user can be sequentially displayed on the corresponding display.
- the display unit may be a 3D projector.
- In this case, while a sensor worn by the user or a sensor arranged in the environment senses the user's field of view, the 3D projector performs projection mapping of the virtual object onto the projection target, whereby the functions described above can be realized.
- the projection target may be a flat surface or a three-dimensional object.
- In the above embodiment, an example in which the object set as a concealment avoidance target is a real object has been described. However, the present disclosure is not limited to such an example, and a virtual object may also be set as a concealment avoidance target.
- a specific type of virtual object may be set in advance as a concealment avoidance target.
- important notification information from the system, a message display window in a chat service, or a message reception notification screen may be set as a concealment avoidance target.
- the AR glass 10 controls display so that the virtual object set to avoid concealment has a higher recognition degree than the virtual object not set to avoid concealment.
- For example, the AR glass 10 may hide, make translucent, or shift the display position of a virtual object that is not set to avoid concealment and is positioned between a virtual object that is set to avoid concealment and the user.
- Alternatively, a priority for each type of virtual object can be registered in a table in advance.
- In this case, the AR glass 10 may hide, make translucent, or shift the display position of another virtual object that is positioned between a certain virtual object and the user and has a lower priority than that virtual object. According to these display examples, a virtual object of high importance can be visually recognized by the user without being hidden by other virtual objects.
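The priority-table idea for concealment among virtual objects might be sketched as follows; the priority values and the apply_display_mode helper (from the earlier sketch) are assumptions:

```python
VIRTUAL_OBJECT_PRIORITY = {          # assumed per-type priorities
    "system_notification": 100,
    "chat_window": 80,
    "decoration": 10,
}

def resolve_virtual_occlusion(front_obj, back_obj):
    """If a lower-priority virtual object sits in front of a
    higher-priority one (relative to the user), demote its display."""
    front_p = VIRTUAL_OBJECT_PRIORITY.get(front_obj.type, 0)
    back_p = VIRTUAL_OBJECT_PRIORITY.get(back_obj.type, 0)
    if front_p < back_p:
        apply_display_mode(front_obj, DisplayMode.TRANSLUCENT)
```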
- For example, the steps in the operation of the above-described embodiment do not necessarily have to be processed in the described order; they may be processed in an appropriately changed order, in parallel, or individually rather than in time series. Further, some of the described steps may be omitted, or other steps may be added.
- (1) An information processing apparatus comprising: an output control unit that controls, based on a determination result of whether or not a real object included in a user's field of view is a first real object, display by a display unit in a range between the user and the real object such that a degree of recognition of the real object by the user changes.
- (2) The information processing apparatus according to (1), wherein the output control unit controls display by the display unit such that the degree of recognition differs between a case where the real object is determined to be the first real object and a case where the real object is determined to be a second real object different from the first real object.
- (3) The information processing apparatus wherein the output control unit controls display by the display unit such that, when the real object is determined to be the first real object, the degree of recognition of the real object is higher than when the real object is determined to be the second real object.
- (4) The information processing apparatus according to (3), wherein the range between the user and the real object is a range having a predetermined shape located between the user and the real object, and the output control unit controls display by the display unit such that the degree of recognition of the real object increases when the real object is determined to be the first real object and at least a part of the real object is included in the range having the predetermined shape.
- (5) The information processing apparatus according to (3) or (4), wherein whether or not the real object is the first real object is determined based on a positional relationship between the real object and the user.
- (11) The information processing apparatus according to any one of (3) to (10), wherein the real object is an electronic device, and whether or not the real object is the first real object is determined based on a device state of the real object.
- (12) The information processing apparatus according to any one of (3) to (11), wherein whether or not the real object is the first real object is determined based on whether or not the real object is a predetermined type of object.
- (13) The information processing apparatus according to any one of (3) to (12), wherein the output control unit changes a display mode of a first virtual object located in the range between the user and the real object.
- (14) The information processing apparatus according to (13), wherein, when the real object is determined to be the second real object, the output control unit does not change the display mode of the first virtual object.
- (15) The information processing apparatus wherein the output control unit controls display so that a part or all of the first virtual object is hidden.
- (16) The information processing apparatus according to any one of (13) to (15), wherein the output control unit increases the transparency of the first virtual object.
- (17) The information processing apparatus according to any one of (13) to (16), wherein the output control unit changes the display position of the first virtual object such that the display position does not overlap the region where the real object is visually recognized on the display unit.
- (18) The information processing apparatus according to any one of (13) to (17), wherein the output control unit newly displays a second virtual object related to the real object at a display position different from the display position of the first virtual object.
- 10 AR glass, 20 server, 22 communication network, 30 real object, 32 virtual object, 100 control unit, 102 virtual object acquisition unit, 104 real object determination unit, 106 overlap determination unit, 108 output control unit, 120 communication unit, 122 sensor unit, 124 display unit, 126 storage unit
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
An object of the present invention is to provide an information processing device, an information processing method, and a program capable of changing the user's degree of recognition of a real object included in the user's field of view in a manner adapted to that real object. To this end, the invention provides an information processing device equipped with an output control unit that, based on the result of determining whether or not a real object in the user's field of view is a first real object, controls display by a display unit in the area between the user and the real object so as to change the user's degree of recognition of that real object.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016132696A JP2018005005A (ja) | 2016-07-04 | 2016-07-04 | 情報処理装置、情報処理方法、およびプログラム |
| JP2016-132696 | 2016-07-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018008210A1 true WO2018008210A1 (fr) | 2018-01-11 |
Family
ID=60912445
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/013655 Ceased WO2018008210A1 (fr) | 2016-07-04 | 2017-03-31 | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2018005005A (fr) |
| WO (1) | WO2018008210A1 (fr) |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6976186B2 (ja) * | 2018-02-01 | 2021-12-08 | Kddi株式会社 | 端末装置及びプログラム |
| KR102585051B1 (ko) | 2018-05-08 | 2023-10-04 | 그리 가부시키가이샤 | 액터의 움직임에 기초하여 생성되는 캐릭터 오브젝트의 애니메이션을 포함하는 동화상을 배신하는 동화상 배신 시스템, 동화상 배신 방법 및 동화상 배신 프로그램 |
| JP7321787B2 (ja) * | 2019-06-19 | 2023-08-07 | 日産自動車株式会社 | 情報処理装置及び情報処理方法 |
| JP7001719B2 (ja) * | 2020-01-29 | 2022-02-04 | グリー株式会社 | コンピュータプログラム、サーバ装置、端末装置、及び方法 |
| EP4207084A4 (fr) * | 2020-08-25 | 2024-05-01 | Maxell, Ltd. | Dispositif d'affichage de réalité virtuelle 3d, visiocasque et procédé d'affichage de réalité virtuelle 3d |
| US20240104883A1 (en) * | 2020-12-10 | 2024-03-28 | Maxell, Ltd. | Display apparatus and display method |
| JP6968326B1 (ja) * | 2021-04-16 | 2021-11-17 | ティフォン株式会社 | 表示装置、表示方法及びその表示プログラム |
| JP2023017615A (ja) * | 2021-07-26 | 2023-02-07 | 富士フイルムビジネスイノベーション株式会社 | 情報処理システム及びプログラム |
| JP2023079490A (ja) * | 2021-11-29 | 2023-06-08 | 富士電機株式会社 | 作業支援装置、作業支援方法およびプログラム |
- 2016-07-04: JP JP2016132696A patent/JP2018005005A/ja (active, Pending)
- 2017-03-31: WO PCT/JP2017/013655 patent/WO2018008210A1/fr (not_active, Ceased)
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09330489A (ja) * | 1996-06-07 | 1997-12-22 | Hitachi Ltd | 設備の監視方法及び装置 |
| JP2005037181A (ja) * | 2003-07-17 | 2005-02-10 | Pioneer Electronic Corp | ナビゲーション装置、サーバ装置、ナビゲーションシステム、及びナビゲーション方法 |
| JP2006171302A (ja) * | 2004-12-15 | 2006-06-29 | Konica Minolta Photo Imaging Inc | 映像表示装置及び情報提供システム |
| JP2008083289A (ja) * | 2006-09-27 | 2008-04-10 | Sony Corp | 撮像表示装置、撮像表示方法 |
| US20160041624A1 (en) * | 2013-04-25 | 2016-02-11 | Bayerische Motoren Werke Aktiengesellschaft | Method for Interacting with an Object Displayed on Data Eyeglasses |
| JP2015204615A (ja) * | 2014-04-11 | 2015-11-16 | 三菱電機株式会社 | 機器と移動デバイスとの間でインタラクトする方法及びシステム |
| JP2016045814A (ja) * | 2014-08-25 | 2016-04-04 | 泰章 岩井 | 仮想現実サービス提供システム、仮想現実サービス提供方法 |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240245463A1 (en) * | 2018-06-11 | 2024-07-25 | Brainlab Ag | Visualization of medical data depending on viewing-characteristics |
| CN111311754A (zh) * | 2018-12-12 | 2020-06-19 | 联想(新加坡)私人有限公司 | 用于扩展现实内容排除的方法、信息处理设备和产品 |
| CN111311754B (zh) * | 2018-12-12 | 2025-01-14 | 联想(新加坡)私人有限公司 | 用于扩展现实内容排除的方法、信息处理设备和产品 |
| CN113411227A (zh) * | 2021-05-07 | 2021-09-17 | 上海纽盾科技股份有限公司 | 基于ar辅助的网络设备测试方法及装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2018005005A (ja) | 2018-01-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018008210A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et programme | |
| US11386626B2 (en) | Information processing apparatus, information processing method, and program | |
| US10650600B2 (en) | Virtual path display | |
| CN107408026B (zh) | 信息处理设备、信息处理方法和计算机程序 | |
| TW202113428A (zh) | 用於針對頭戴式顯示器產生動態障礙物碰撞警告之系統和方法 | |
| US20200020161A1 (en) | Virtual Barrier Objects | |
| US20190139307A1 (en) | Modifying a Simulated Reality Display Based on Object Detection | |
| WO2022164586A9 (fr) | Systèmes de réalité étendue sensibles au contexte | |
| CN107831908A (zh) | 具有附近物体响应的可穿戴计算机 | |
| JP6693223B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
| JPWO2019176577A1 (ja) | 情報処理装置、情報処理方法、および記録媒体 | |
| US11004273B2 (en) | Information processing device and information processing method | |
| WO2019244670A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et programme | |
| JP7743585B2 (ja) | 表示装置および表示方法 | |
| US20250233917A1 (en) | Mapping networked devices | |
| CN112105983A (zh) | 增强的视觉能力 | |
| KR20160009879A (ko) | 웨어러블 디스플레이 디바이스 및 그 제어 방법 | |
| US10636199B2 (en) | Displaying and interacting with scanned environment geometry in virtual reality | |
| WO2019238017A1 (fr) | Procédé et dispositif pour fournir une alerte de source sonore anormale et lunettes de réalité augmentée | |
| JP2023549842A (ja) | ウェアラブルデバイスを用いた制御可能デバイスの位置の特定 | |
| US20200135150A1 (en) | Information processing device, information processing method, and program | |
| JPWO2017169001A1 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
| US10409464B2 (en) | Providing a context related view with a wearable apparatus | |
| JP7625039B2 (ja) | プログラム | |
| JP2018005676A (ja) | 情報処理装置、情報処理方法、およびプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17823818; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17823818; Country of ref document: EP; Kind code of ref document: A1 |