
WO2019082520A1 - Information processing apparatus, information processing method, and program - Google Patents

Information processing apparatus, information processing method, and program

Info

Publication number
WO2019082520A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
coordinate system
information processing
user
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/032997
Other languages
English (en)
Japanese (ja)
Inventor
誠司 鈴木
健太郎 井田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of WO2019082520A1 publication Critical patent/WO2019082520A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/22: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
    • G09G5/36: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory, with means for controlling the display position

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a program.
  • Conventionally, projectors project onto a screen perpendicular to the ground, as typified by home theaters; with the rise of projection mapping technology, images are increasingly projected onto table surfaces and a wide variety of other places.
  • Patent Document 1 discloses control for estimating the position of the user with respect to a large display installed on a wall surface or a table, and displaying a new display object in the vicinity of the user according to the estimated position of the user.
  • the present disclosure proposes an information processing apparatus, an information processing method, and a program that can align the arrangement direction of a display object with the positional relationship of a plurality of devices in a space.
  • An information processing apparatus is proposed that comprises a control unit that performs a process of rotating a display object arrangement direction for arranging a plurality of display objects so as to coincide with a reference direction of a space coordinate system representing the positional relationship of a plurality of devices in a three-dimensional space, and a process of displaying the plurality of display objects based on the rotated display object arrangement direction.
  • In addition, an information processing method is proposed that includes: rotating, by a processor, the display object arrangement direction for arranging the plurality of display objects so as to coincide with the reference direction of the space coordinate system representing the positional relationship of the plurality of devices in the three-dimensional space; and displaying the plurality of display objects based on the rotated display object arrangement direction.
  • Furthermore, a program is proposed that causes a computer to function as a control unit that performs a process of rotating a display object arrangement direction for arranging a plurality of display objects so as to coincide with a reference direction of a space coordinate system representing the positional relationship of a plurality of devices in a three-dimensional space, and a process of displaying the plurality of display objects based on the rotated display object arrangement direction.
  • FIG. 1 is a diagram for describing an overview of an information processing system according to an embodiment of the present disclosure.
  • the information processing system according to the present embodiment includes an information processing device 100 (not shown in FIG. 1), an output device 200 (in FIG. 1, a projector 210 and a TV 220 are shown as an example), and a sensor device 300.
  • the sensor device 300 is a device that senses various information.
  • the sensor device 300 includes a camera, a depth sensor, a microphone, and the like, and senses information on a user and a space in which the user is present.
  • the sensor device 300 senses the position, posture, movement, line of sight, shape of a room, arrangement of furniture, etc. of the user.
  • the output device 200 is a device that outputs various information from the information processing device 100, and assumes, for example, a projector 210 and a TV 220.
  • The projector 210 can project information onto any place (that is, any region) such as a wall, a floor, a table, or other furniture included in the space sensed by the sensor device 300, using it as a projection location (that is, a projection plane or a projection region).
  • The projection place is not limited to a plane; it may be a curved surface, or it may be divided into a plurality of surfaces.
  • the projector 210 is realized by a plurality of projectors or a so-called moving projector so that the projector 210 can project anywhere in space.
  • the output device 200 and the sensor device 300 may be singular or plural.
  • the display image 10 is displayed on the top surface of the table 30 by the projector 210.
  • the display image 10 is a display object indicating an interaction from an application with respect to a user input, and is, for example, various UIs such as a still image, a moving image (video), a menu screen or a control screen.
  • The user can perform various operation inputs on the display image 10 by moving an operating body, for example a hand, on the display image 10.
  • the display position of the display image 10 is not limited to the top surface of the table 30, but may be any place such as a wall, a floor, or furniture in a space, and is controlled automatically or according to an instruction by the user.
  • the present disclosure proposes a mechanism capable of aligning the arrangement of display objects with the positional relationship between a plurality of devices in a space.
  • FIG. 2 is a block diagram showing an example of the configuration of the system 1 according to the present embodiment. As shown in FIG. 2, the system 1 includes an information processing device 100, an output device 200 and a sensor device 300.
  • the output device 200 includes a projector 210, a TV 220, a tablet 230, a smartphone 240, a PC 250, a speaker 260, and a unidirectional speaker 270.
  • the system 1 may include, as the output device 200, a combination of one or more of them, or may include a plurality of devices of the same type.
  • the projector 210 is a projection device that projects an image to any place in space.
  • the projector 210 may be, for example, a fixed wide-angle projector, or may be a so-called moving projector provided with a movable portion such as a Pan / Tilt drive type capable of changing the projection direction.
  • the TV 220 is a device that receives radio waves of television broadcasting and outputs an image and sound.
  • the tablet 230 is a mobile device capable of wireless communication, which typically has a screen larger than the smartphone 240, and can output images, sounds, vibrations, and the like.
  • the smartphone 240 is a mobile device capable of wireless communication, which typically has a screen smaller than the tablet 230, and can output images, sounds, vibrations, and the like.
  • the PC 250 may be a fixed desktop PC or a mobile notebook PC, and can output images, sounds, and the like.
  • the speaker 260 converts audio data into an analog signal via a DAC (Digital Analog Converter) and an amplifier and outputs (reproduces) it.
  • Unidirectional speaker 270 is a speaker capable of forming directivity in a single direction.
  • the output device 200 outputs information based on control by the information processing device 100.
  • the information processing apparatus 100 can control an output method in addition to the content of the information to be output.
  • the information processing apparatus 100 can control the projection direction of the projector 210 or control the directivity of the unidirectional speaker 270.
  • the output device 200 may include components capable of arbitrary output other than the components described above.
  • the output device 200 may include wearable devices such as a head mounted display (HMD), an augmented reality (AR) glass, and a watch-type device.
  • the output device 200 may include a lighting device, an air conditioner, a music reproduction device, and the like.
  • the sensor device 300 includes a camera 310, a depth sensor 320 and a microphone 330.
  • The camera 310 is an imaging device, such as an RGB camera, that has a lens system, a drive system, and an imaging element, and captures images (still images or moving images).
  • The depth sensor 320 is a device that acquires depth information, such as an infrared distance measuring device, an ultrasonic distance measuring device, a LiDAR (Laser Imaging Detection and Ranging) sensor, or a stereo camera.
  • the microphone 330 is a device that picks up surrounding sound and outputs audio data converted into a digital signal through an amplifier and an ADC (Analog Digital Converter).
  • the microphone 330 may be an array microphone.
  • the sensor device 300 senses information based on control by the information processing device 100.
  • the information processing apparatus 100 can control the zoom ratio and the imaging direction of the camera 310.
  • The sensor device 300 may include components capable of any sensing other than the components described above.
  • the sensor device 300 may include a device such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever, which allows information to be input by the user.
  • The sensor device 300 may include various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, an illuminance sensor, a force sensor, an ultrasonic sensor, an atmospheric pressure sensor, a gas (CO2) sensor, and a thermo camera.
  • The information processing apparatus 100 includes an I/F (Interface) unit 110, a user operation detection unit 120, a user detection unit 130, an environment detection unit 140, a device detection unit 150, a space information storage unit 161, a coordinate system storage unit 163, a content storage unit 165, and a control unit 170.
  • the I / F unit 110 is a connection device for connecting the information processing apparatus 100 to another device.
  • the I / F unit 110 is realized by, for example, a USB (Universal Serial Bus) connector or the like, and performs input and output of information with each component of the output device 200 and the sensor device 300.
  • the user operation detection unit 120 has a function of detecting user operation information based on the information sensed by the sensor device 300.
  • the operation information may be detected by, for example, a depth camera, a thermo camera, an RGB camera, or an ultrasonic sensor.
  • the operation information is, for example, information such as touch, tap, double tap, and swipe of the user. More specifically, the user operation detection unit 120 detects an operation such as touch, tap, double tap, or swipe on a projected place (display place) such as a wall, a floor, a table, or other furniture.
  • these user operations will be described below collectively as touch operations.
  • the touch operation is also detected as an operation input by the user on a display image projected on a wall, a floor, furniture or the like.
  • The user operation detection unit 120 analyzes the captured image and depth information input from the sensor device 300, acquires the position and depth information (in other words, three-dimensional information) of the user's hand or finger located on the display surface, and detects contact or proximity of the user's hand to the table 30 in the height direction as well as detachment of the hand from the table 30.
  • User operation detection unit 120 outputs the detected user operation information to control unit 170.
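  • As a concrete illustration of this touch detection, the following is a minimal sketch (not the disclosed implementation; the function name, thresholds, and data layout are assumptions) that classifies a hand as touching, hovering near, or released from the table surface by comparing depth-derived heights.

```python
# Minimal sketch: classify a hand point relative to the table surface using
# depth information. Thresholds and names are illustrative assumptions.
TOUCH_THRESHOLD_M = 0.01      # within 1 cm of the surface counts as contact
PROXIMITY_THRESHOLD_M = 0.05  # within 5 cm counts as proximity (hover)

def classify_hand_state(hand_height_m: float, table_height_m: float) -> str:
    """Compare the hand's height (from the depth sensor) with the table surface."""
    gap = hand_height_m - table_height_m
    if gap <= TOUCH_THRESHOLD_M:
        return "touch"
    if gap <= PROXIMITY_THRESHOLD_M:
        return "proximity"
    return "released"

# Example: a fingertip measured 0.8 cm above the table surface is reported as a touch.
print(classify_hand_state(hand_height_m=0.708, table_height_m=0.700))  # -> "touch"
```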
  • the user detection unit 130 has a function of detecting information (user information) related to the user based on the information sensed by the sensor device 300.
  • the user information may include information indicating the position and the number of users in the space sensed by the sensor device 300.
  • the position and number of users may be detected by a thermo camera, an RGB camera, an infrared sensor, an ultrasonic sensor or the like.
  • the user information may include information indicating the line of sight of the user.
  • the information indicating the line of sight of the user includes information indicating the position of the viewpoint and the direction of the line of sight.
  • the information indicating the line of sight of the user may be information indicating the direction of the face or head of the user, or may be information indicating the direction of the eyeball.
  • the information indicating the line of sight of the user can be detected by analyzing the eye image of the user obtained by an RGB camera, an infrared camera, an eyepiece camera or the like attached to the user.
  • the user information may include information indicating the posture of the user.
  • the information indicating the posture of the user can be detected by analyzing an image obtained by an RGB camera or an infrared camera.
  • the user information may include information indicating the user's uttered voice.
  • the information indicative of the user's speech can be detected by analyzing the speech information obtained by the microphone.
  • the user detection unit 130 outputs the detected user information to the control unit 170.
  • the environment detection unit 140 has a function of detecting environmental information based on the information sensed by the sensor device 300.
  • Environmental information is information on the space in which the user is present.
  • Environmental information may include various information.
  • the environmental information may include information indicating the shape of the space in which the user is present.
  • the information indicating the shape of the space includes, for example, information indicating the shape of the object forming the space, such as a wall surface, a ceiling, a floor, a door, furniture, and household items.
  • the information indicating the shape of the space may be two-dimensional information or three-dimensional information such as a point cloud.
  • the information indicating the shape of the space may be detected based on depth information obtained by, for example, infrared distance measurement, ultrasonic distance measurement, or a stereo camera.
  • the environmental information may include information indicating the state of the projection plane.
  • the state of the projection plane means, for example, unevenness and color of the projection plane.
  • the unevenness of the projection surface can be detected based on depth information obtained by, for example, LiDAR.
  • the color of the projection plane can be detected, for example, by analyzing an image captured by an RGB camera.
  • the environmental information may include information indicating the brightness of the projection surface.
  • the brightness of the projection plane can be detected by an illumination sensor or an RGB camera.
  • Environmental information may include information indicating the position (three-dimensional position) of an object in space.
  • the position of a cup, chair, table, electronics, etc. in a room can be detected by image recognition.
  • the position of the smartphone in the room may be detected by the radio wave intensity related to the communication between the smartphone and the access point of the wireless LAN.
  • Environmental information may include environmental sounds. Environmental sounds may be detected by a microphone.
  • the environment detection unit 140 outputs the detected environment information to the control unit 170.
  • the device detection unit 150 has a function of detecting information (device information) on devices in the space.
  • Device information may include the presence of the device and the three-dimensional position of the device.
  • the information processing apparatus 100 is connected to each device (output device 200) via the I / F unit 110.
  • The I/F unit 110 connects to each device in the space by, for example, a wireless/wired LAN (Local Area Network), DLNA (registered trademark) (Digital Living Network Alliance), Wi-Fi (registered trademark), Bluetooth (registered trademark), a USB connection, or another dedicated line.
  • the device detection unit 150 detects the presence of the device by connecting the devices via the I / F unit 110.
  • the three-dimensional position of the device may be identified based on the information sensed by the sensor device 300.
  • For example, the device detection unit 150 may extract a retroreflective material provided on a device by analyzing an infrared image captured by an IR (infrared) camera of the sensor device 300, and may specify the position of the device in the space.
  • The device detection unit 150 may also extract a specific pattern (a maker's name, a two-dimensional barcode, or the like) provided on a device by analyzing a captured image captured by a camera (RGB camera) of the sensor device 300, and may specify the position of the device in the space.
  • In addition, the device detection unit 150 may acquire a unique ultrasonic wave transmitted from each device with the microphone of the sensor device 300, and may specify the position of the device in the space.
  • the device detection unit 150 senses the user's operation of designating a place (such as pointing, touching, sighting, placing a marker, etc.) and a registration operation (such as UI selection or voice uttering) with the sensor device 300.
  • the device detection unit 150 outputs the detected device information to the control unit 170.
  • The functions for detecting information regarding the person, the environment, and the devices in the space have been described above.
  • The detection of each type of information by the user operation detection unit 120, the user detection unit 130, the environment detection unit 140, and the device detection unit 150 corresponds to space recognition, and the obtained information (results) is also referred to as spatial information.
  • the space recognition process by the information processing apparatus 100 may be performed periodically.
  • Control unit 170 controls the overall operation in information processing apparatus 100 in accordance with various programs.
  • the control unit 170 includes a display control unit 171 and a sound control unit 173.
  • the display control unit 171 controls display by the output device 200.
  • the sound control unit 173 controls the audio output by the output device 200.
  • When controlling the display by the output device 200 via the display control unit 171, the control unit 170 can perform control to rotate the display object arrangement direction for arranging a plurality of display objects so that it coincides with the reference direction of a space coordinate system representing the positional relationship of a plurality of devices in the space, and to display the display objects accordingly.
  • a recommended coordinate system is set in advance for each display object.
  • the control unit 170 determines the arrangement direction of the display object according to the recommended coordinate system of the display object (for example, rotates the display object as necessary).
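  • The rotation itself can be pictured as a plain 2D rotation of the layout. The sketch below is illustrative only (the function name and vector conventions are assumptions): it rotates the positions of a group of display objects so that the layout's current "up" direction coincides with the reference direction of the recommended coordinate system.

```python
# Rotate a 2D layout so that its current "up" direction coincides with the
# reference direction of the recommended coordinate system. Names are assumed.
import math

def rotate_layout(points, current_up, reference_up):
    """points: list of (x, y); current_up / reference_up: direction vectors."""
    angle = math.atan2(reference_up[1], reference_up[0]) - math.atan2(current_up[1], current_up[0])
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

# Example: a layout authored with "up" = +y, shown in a room whose reference
# direction points along +x, is rotated by -90 degrees.
layout = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
print(rotate_layout(layout, current_up=(0.0, 1.0), reference_up=(1.0, 0.0)))
```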
  • Examples of the coordinate systems include a "world coordinate system", a "space coordinate system", a "display location coordinate system", and a "user coordinate system".
  • Although these four coordinate systems are given as examples in the present embodiment, the present disclosure is not limited thereto.
  • FIG. 3 is a view for explaining each coordinate system according to the present embodiment.
  • the “world coordinate system” is a coordinate system in which north is an upward direction of the display object based on the orientation of the earth.
  • the "spatial coordinate system” is a coordinate system in which a specific one direction (linked) associated with a space (e.g., a room) is defined as the upper direction of the display object.
  • the coordinate system is along the wall of the room, but it does not have to be the coordinate system along the wall, and one side (reference direction) horizontal with the ground set arbitrarily is up It should be the direction.
  • the “display location coordinate system” is a coordinate system in which a specific one direction associated with a location where a display object is displayed is the upper direction of the display object.
  • the display location coordinate system is set for each display location.
  • a coordinate system associated with a table which is a projection location is shown as an example of the display location coordinate system.
  • the “user coordinate system” is a coordinate system which is associated with the user who is the operator and whose front direction of the user is the upper direction of the display object.
  • each coordinate system is formed by two axes (an axis (reference axis component) in the reference direction and an axis (orthogonal axis component) in the direction orthogonal to the reference axis).
  • the invention is not limited to this.
  • it may be formed only on one axis (reference axis component).
  • the spatial coordinate system may be formed by one axis parallel to the long side of the room.
  • An example of the recommended coordinate system of each display object to which the control unit 170 refers is shown in Table 1 below.
  • the data shown in Table 1 below is stored in the coordinate system storage unit 163.
  • a “user coordinate system” is set in advance as a recommended coordinate system of the menu UI
  • a “space coordinate system” is set in advance as a recommended coordinate system of the illumination UI.
  • The control unit 170 outputs the user information detected by the user detection unit 130, the environment information detected by the environment detection unit 140, and the device information detected by the device detection unit 150 to the space information storage unit 161.
  • The control unit 170 can also identify a speaker. Since the user detection unit 130 and the environment detection unit 140 periodically recognize the positions and orientations of all the users in the space, when an audio signal of a certain volume or more is obtained by the microphone, the control unit 170 uses the microphone array to identify the direction of the speaker and identifies the speaker by referring to the previously recognized positions of all the users.
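  • A minimal sketch of this idea follows (the data structures and the name identify_speaker are assumptions, not the disclosed implementation): compare the sound direction estimated by the microphone array with the bearings of the previously recognized user positions and pick the closest match.

```python
# Pick the user whose bearing from the microphone is closest to the estimated
# sound direction. Data layout and names are illustrative assumptions.
import math

def identify_speaker(mic_position, sound_direction_deg, users):
    """users: mapping of user id -> (x, y) position in room coordinates."""
    best_id, best_diff = None, float("inf")
    for user_id, (ux, uy) in users.items():
        bearing = math.degrees(math.atan2(uy - mic_position[1], ux - mic_position[0]))
        diff = abs((bearing - sound_direction_deg + 180) % 360 - 180)  # wrap to [-180, 180]
        if diff < best_diff:
            best_id, best_diff = user_id, diff
    return best_id

print(identify_speaker((0.0, 0.0), 45.0, {"A": (2.0, 2.1), "B": (-1.0, 0.5)}))  # -> "A"
```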
  • the control unit 170 can also recognize the content of the user's utterance. For example, the control unit 170 acquires a character string from speech information (uttered speech) collected by the microphone 330 using a speech recognition engine, and further performs syntactic analysis to detect a trigger of a user operation.
  • The trigger of the user operation is a predetermined keyword (for example, the name of the system or a call to the system) or a processing command, for example, "display menu UI", "display illumination UI", "show map here", and so on.
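  • For illustration only, a trigger detector over the recognized character string might look like the sketch below; the keyword list and the action names are assumptions rather than part of the disclosure.

```python
# Look for a known processing command in the text returned by the speech
# recognizer. Command phrases and action labels are illustrative assumptions.
TRIGGER_COMMANDS = {
    "display menu ui": "show_menu_ui",
    "display illumination ui": "show_illumination_ui",
    "show map here": "show_map_ui",
}

def detect_trigger(recognized_text: str):
    text = recognized_text.lower()
    for phrase, action in TRIGGER_COMMANDS.items():
        if phrase in text:
            return action
    return None  # no trigger found

print(detect_trigger("OK, display illumination UI please"))  # -> "show_illumination_ui"
```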
  • a display surface near the user may be used as the display location.
  • Space information storage unit 161 stores the user information detected by the user detection unit 130, the environment information detected by the environment detection unit 140, and the device information detected by the device detection unit 150.
  • Coordinate system storage unit 163 stores information of a recommended coordinate system set for each display object.
  • the coordinate system storage unit 163 also stores information such as a space coordinate system and a display location coordinate system associated with each display location.
  • the content storage unit 165 stores various display objects such as a menu UI and a lighting UI.
  • the configuration of the information processing apparatus 100 has been specifically described above.
  • the configuration of the information processing apparatus 100 is not limited to the example shown in FIG.
  • the information processing device 100 may be in the same space as the output device 200 and the sensor device 300, or may be in another space.
  • the information processing apparatus 100 may be on the network.
  • at least a part of the configuration of the information processing apparatus 100 may be included in the external device.
  • FIG. 4 is a diagram for explaining an example of the flow of display processing of the menu UI performed in the system 1 according to the present embodiment.
  • control unit 170 detects a tap operation by the user on a table surface or the like based on the information obtained by the user detection unit 130 (step S103).
  • control unit 170 acquires a recommended coordinate system of the menu UI from the coordinate system storage unit 163 (step S109).
  • the control unit 170 performs control to display the display object in the acquired recommended coordinate system (step S112). That is, the control unit 170 performs processing for rotating and displaying the display object such that the upper direction of the display object coincides with the upper direction (reference direction) of the recommended coordinate system.
  • Although in this flow the menu UI is displayed with a tap operation as the trigger, the user operation serving as the trigger for displaying the menu UI is not limited to a tap operation; it may be, for example, a double tap or a swipe operation. Alternatively, the menu UI may be displayed in the vicinity of the user, triggered by the user's voice.
  • FIG. 5 shows a display example of the menu UI 11 displayed in consideration of the recommended coordinate system according to the present embodiment.
  • the menu UI 11 is displayed (projected) on the table 30, as shown on the right side of FIG.
  • the arrangement direction of the menu UI 11 is controlled in consideration of the recommended coordinate system. That is, for example, when the recommended coordinate system of the menu UI 11 is a user coordinate system, the arrangement direction of the menu UI 11 is controlled such that the front direction of the user is upward.
  • the front direction of the user may be the direction of the user's face, or may be estimated from the direction of the finger tapping the table 30 (the direction of the index finger).
  • When the table 30 moves after the menu UI 11 is displayed on it, the control unit 170 may control the display position of the menu UI 11 to follow the movement of the table 30 so that the menu UI 11 remains displayed on the table surface. In this case, the control unit 170 places the display location coordinate system associated with the table 30 and the user coordinate system of the menu UI 11 in a parent-child relationship, and performs control so that, even when the table 30 moves, the menu UI 11 is displayed in the arrangement direction given by the user coordinate system.
  • The display position of the menu UI 11 is determined based on the user's tap position; however, as shown in FIG. 6, when the menu UI 11a would extend beyond the table 30, the display position may be adjusted so that the menu UI 11 does not extend beyond the table 30.
  • the control unit 170 may perform animation display for performing gaze guidance from a position where the user taps to a predetermined position and displaying a menu from the predetermined position.
  • FIG. 7 is a diagram for explaining an example of the flow of display processing in consideration of the display area according to the present embodiment.
  • First, the control unit 170 simulates the display area based on the user's tap position (step S123). Specifically, based on the size of the display surface (display area) of the table 30 and the display size of the menu UI, the control unit 170 determines whether the menu UI would extend beyond the table 30 when displayed based on the user's tap position (for example, with the upper left end of the menu UI aligned to the tap position).
  • When the menu UI would extend beyond the table, the control unit 170 calculates a position at which the display of the menu UI does not protrude (for example, the position of the upper left end of the menu UI when it is displayed) (step S129).
  • The position at which the display of the menu UI does not protrude may be calculated by giving priority to positions as close as possible to the user's tap position.
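  • The display-area simulation can be sketched as follows (illustrative only; names and coordinate conventions are assumptions): test whether the menu anchored at the tap position fits inside the table's display area, and otherwise clamp it to the nearest position at which the whole menu is visible.

```python
# Check whether a menu anchored at the tap position fits inside the display
# area; if not, return the closest in-bounds anchor. Names are assumptions.

def fit_menu(tap_x, tap_y, menu_w, menu_h, area_w, area_h):
    """Anchor is the menu's upper-left corner; the area spans (0, 0)-(area_w, area_h)."""
    fits = (0 <= tap_x and tap_x + menu_w <= area_w and
            0 <= tap_y and tap_y + menu_h <= area_h)
    # Clamp toward the closest position where the whole menu is visible,
    # giving priority to staying near the user's tap position.
    x = min(max(tap_x, 0), area_w - menu_w)
    y = min(max(tap_y, 0), area_h - menu_h)
    return fits, (x, y)

# Example: a 30 x 20 cm menu tapped near the table edge is shifted back on-table.
print(fit_menu(tap_x=85, tap_y=10, menu_w=30, menu_h=20, area_w=100, area_h=60))
# -> (False, (70, 10))
```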
  • the control unit 170 performs line-of-sight guidance from the tap position of the user to the calculated position using the point-like display object 11b, and controls to display the menu UI 11 as shown on the right side of FIG. (Step S132).
  • Since the control unit 170 performs line-of-sight guidance from the tap position to the display position, it is possible to avoid giving the user the unnatural feeling that the menu UI 11 has been displayed at a place they did not operate.
  • In addition, since a display object such as the menu UI is displayed so as not to protrude from the display location, the user can perform the tap operation without worrying about the display area or the display size of the display object.
  • FIG. 9 is a view for explaining a display example of the illumination UI according to the present embodiment.
  • the illumination UI 12 is a UI that controls the illumination intensity of the illumination device installed in the room.
  • the control unit 170 may perform display area simulation of the illumination UI 12 so that the illumination UI 12 does not protrude from the table 30, and may change the display position with visual line guidance when adjusting the display position.
  • the illumination UI 12 is formed of, for example, three dials (a plurality of display objects). Each dial corresponds to an area of a room such as a kitchen, dining room or living room, and serves as a UI for controlling the lighting of each area.
  • the numbers on the dial indicate the current illumination intensity.
  • the user can control the illumination intensity by dragging a knob located on the dial.
  • the "Living" dial is displayed larger than the other dials, because the user is currently in “Living” and the possibility of operating the "Living” illumination is high.
  • the control of the size of the dial is an example, and all may be displayed in the same size.
  • The control unit 170 controls the display of the lighting UI 12 by applying the space (room) coordinate system to it. That is, the control unit 170 performs display control so that the arrangement direction of the illumination UI 12, which is a UI for controlling the illumination devices in the space, coincides with the reference direction of the space coordinate system representing the positional relationship of the illumination devices (or of the areas in which the illumination devices are disposed) in the space. More specifically, the illumination UI 12 (formed from a plurality of illumination UIs corresponding to the respective illumination devices) is arranged with the same arrangement and relative positional relationship as the plurality of illumination devices.
  • For example, the illumination UI 12 is arranged in the same order as the plurality of lighting devices in the real space, along a direction parallel to one side of the room.
  • Since the lighting UI 12 is arranged in accordance with the arrangement of the devices (the areas where the devices are installed) in the real space, the user can intuitively understand which area of the room each lighting UI corresponds to.
  • this will be specifically described with reference to FIG.
  • FIG. 10 is an overhead view of a room according to the present embodiment.
  • As shown in FIG. 10, the control unit 170 performs display control so that the three dials forming the lighting UI are arranged in accordance with the layout of the actual room. That is, regardless of the orientation of the user (user coordinate system) and the orientation of the table 30 (display location coordinate system), the arrangement of the lighting UI 12 is determined in accordance with the arrangement of the room (space coordinate system).
  • For example, the control unit 170 can display the three dials of the lighting UI 12 on the table 30 in a relative positional relationship similar to that of the kitchen, dining, and living areas in the space (more specifically, of the lighting equipment installed in each area). This enables the user to intuitively grasp which dial is the UI that controls the illumination of which area.
  • The control unit 170 can perform processing of mapping the illumination UI 12, arranged based on the space coordinate system, onto the display location coordinate system associated with the table 30 and displaying it.
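  • The following sketch (assumed names; not the disclosed implementation) illustrates one way to lay out the dials so that their relative positions on the table mirror the relative positions of the lighting devices (or their areas) in the room's space coordinate system.

```python
# Scale the devices' room-coordinate positions into a display region on the
# table while preserving their relative arrangement. Names are assumptions.

def layout_dials(device_positions, display_origin, display_size):
    """device_positions: {name: (x, y)} in room coordinates.
    Returns {name: (u, v)} positions inside the display region."""
    xs = [p[0] for p in device_positions.values()]
    ys = [p[1] for p in device_positions.values()]
    span_x = (max(xs) - min(xs)) or 1.0
    span_y = (max(ys) - min(ys)) or 1.0
    scale = min(display_size[0] / span_x, display_size[1] / span_y)
    return {
        name: (display_origin[0] + (x - min(xs)) * scale,
               display_origin[1] + (y - min(ys)) * scale)
        for name, (x, y) in device_positions.items()
    }

areas = {"kitchen": (0.5, 4.0), "dining": (2.5, 2.5), "living": (4.5, 1.0)}
print(layout_dials(areas, display_origin=(0.0, 0.0), display_size=(0.6, 0.4)))
```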
  • This embodiment is not limited to this; for example, when there are several lights in the kitchen, the lighting UI may display a dial for controlling the intensity of each of those lights.
  • Also in that case, the arrangement of the dials is displayed in accordance with the space coordinate system, so the user can grasp it intuitively.
  • When the table 30 moves, the control unit 170 may control the display position of the illumination UI 12 to follow the movement of the table 30 so that the illumination UI 12 continues to be displayed on the table surface.
  • In this case, the control unit 170 places the display location coordinate system associated with the table 30 and the space coordinate system of the lighting UI 12 in a parent-child relationship, and performs control so that, even when the table 30 moves, the lighting UI 12 is displayed on the table surface in the arrangement direction given by the space coordinate system.
  • the display of the name of the area may be added to each dial of the illumination UI 12.
  • The control unit 170 may display character information, such as the illumination intensity values of the illumination UI 12 and the names of the areas, oriented toward the user, that is, in the direction given by the "user coordinate system". This makes the text easier for the user to read. In this way, it is also possible to mix two or more coordinate systems in one UI display.
  • the "opposite side” means the opposite side when the user's position with respect to the table 30 shown in FIG. 10 is the normal position (front).
  • FIG. 11 is a view for explaining a display example of the illumination UI 12 a when the user operates from the opposite side of the table according to the present embodiment.
  • FIG. 12 is an enlarged view of the illumination UI 12a shown in FIG.
  • Even in this case, the control unit 170 performs display control in consideration of the "space coordinate system", which is the recommended coordinate system of the illumination UI, as in the case described above. That is, the control unit 170 displays the three dials forming the illumination UI 12a, which correspond to the areas of the room, in accordance with the arrangement of the areas of the actual room. Since the positional relationship between the kitchen, the dining room, and the living room is constant regardless of the position of the user, the arrangement of the three dials is the same as in the example described above. Note that the character information may be aligned with the "user coordinate system" so as to be easy for the user to read.
  • FIG. 13 is a diagram for explaining display of a display object on the vertical plane according to the present embodiment.
  • a menu UI (not shown in FIG. 13) is displayed on the wall.
  • Referring to Table 1, the recommended coordinate system of the menu UI is the "user coordinate system", so the upper direction of the menu UI is matched to the upper direction of the "user coordinate system" and the menu UI can be displayed to the user without appearing upside down.
  • FIG. 14 is a diagram for explaining an example of the flow of display processing of a display object according to a modification of the present embodiment.
  • the control unit 170 detects a tap operation on a table surface, a wall surface, or the like by the user (step S203).
  • The control unit 170 acquires the recommended coordinate system of the display object from the coordinate system storage unit 163 (step S209). For example, in accordance with Table 1, if the display object is the menu UI, the user coordinate system is selected, and if the illumination UI is selected by tapping it from the menu UI, the space coordinate system is selected.
  • the control unit 170 determines whether the recommended coordinate system is a space coordinate system (step S212).
  • the "spatial coordinate system" is a coordinate system defined on the horizontal plane, it is determined whether or not the recommended coordinate system is a spatial coordinate system. That is, the control unit 170 may determine whether the coordinate system is defined on the horizontal plane.
  • The control unit 170 detects the display surface based on the information obtained by the user operation detection unit 120 and the environment detection unit 140, or on the information in the space information storage unit 161 (step S215).
  • the display surface is, for example, a flat area of a display place where the user performs a tap operation.
  • control unit 170 determines whether the display surface is vertical (step S218). For example, although the top surface (table surface) of the table 30 is a surface horizontal to the ground, the wall surface is a surface vertical to the ground.
  • the control unit 170 maps the space coordinate system on the vertical display surface (step S221).
  • Since the space coordinate system is a coordinate system defined on the horizontal plane, when using it on a vertical plane such as a wall surface, it is necessary to convert the coordinate system defined on the horizontal plane to the vertical plane.
  • An example of the mapping to the vertical plane will be described later with reference to FIG.
  • control unit 170 performs control to display the display object using the mapped space coordinate system (step S224).
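  • The decision flow of FIG. 14 can be summarized in a few lines. The sketch below is illustrative only; the helper names and the coordinate-system labels are assumptions.

```python
# Choose the display object's recommended coordinate system and decide whether
# it must first be mapped from the horizontal plane onto a vertical surface.
RECOMMENDED = {"menu_ui": "user", "illumination_ui": "space"}  # cf. Table 1 (assumed labels)

def choose_coordinate_system(display_object: str, surface_is_vertical: bool) -> str:
    recommended = RECOMMENDED.get(display_object, "user")       # step S209
    if recommended == "space" and surface_is_vertical:          # steps S212 and S218
        return "space (mapped onto the vertical plane)"         # step S221, displayed in S224
    return recommended                                           # displayed as-is in step S227

print(choose_coordinate_system("illumination_ui", surface_is_vertical=True))
print(choose_coordinate_system("menu_ui", surface_is_vertical=False))
```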
  • a mapping example of the space coordinate system to the vertical plane and a display example of the display object will be described with reference to FIG.
  • When the illumination UI is displayed on a wall, the axis of the space coordinate system perpendicular to the wall surface (the depth direction) would be lost (that is, the three dials forming the lighting UI 12 would overlap). Therefore, the axis perpendicular to the wall surface in the space coordinate system (the broken-line arrow on the horizontal plane shown in FIG. 15) is extended onto the vertical surface, that is, converted into an axis in the vertical direction (the broken-line arrow on the wall surface shown in FIG. 15), so that the depth direction with respect to the wall surface is represented vertically.
  • an axis parallel to the wall surface (solid arrow in the horizontal plane shown in FIG. 15) is mapped in parallel on the wall as it is (solid arrow on the wall shown in FIG. 15).
  • The control unit 170 displays the illumination UI 12b as shown in FIG. 15, based on the space coordinate system mapped onto the wall surface as described above. This makes it possible to display the lighting UI so that it corresponds to the arrangement of the areas in the space, and the user can intuitively understand which area each UI corresponds to.
  • FIG. 16 shows an example of mapping the space coordinate system onto another wall surface. Also in this case, the axis of the space coordinate system perpendicular to the wall surface (the solid-line arrow on the horizontal plane shown in FIG. 16) is extended onto the vertical plane, that is, converted into an axis in the vertical direction (the solid-line arrow on the wall surface shown in FIG. 16). Further, the axis of the space coordinate system parallel to the wall surface (the broken-line arrow on the horizontal plane shown in FIG. 16) is mapped onto the wall surface in parallel as it is (the broken-line arrow on the wall surface shown in FIG. 16). The control unit 170 then displays the illumination UI 12c as shown in FIG. 16, based on the space coordinate system mapped onto the wall surface in this way.
  • The mapping of the space coordinate system onto a wall surface and display examples of the display object have been described above in detail.
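  • The axis conversion described above amounts to reinterpreting the wall-perpendicular (depth) axis as the wall's vertical axis while keeping the wall-parallel axis unchanged, as in the following sketch (function names and vector conventions are assumptions).

```python
# Map a point expressed in the floor's space coordinate system onto wall
# coordinates: the wall-parallel axis stays horizontal, the wall-perpendicular
# (depth) axis becomes the wall's vertical axis. Names are assumptions.

def map_floor_point_to_wall(point, wall_axis, depth_axis):
    """point: (x, y) on the floor; wall_axis / depth_axis: unit vectors of the
    wall-parallel and wall-perpendicular directions on the floor.
    Returns (horizontal, vertical) coordinates on the wall."""
    horizontal = point[0] * wall_axis[0] + point[1] * wall_axis[1]   # stays parallel to the wall
    vertical = point[0] * depth_axis[0] + point[1] * depth_axis[1]   # depth becomes height
    return horizontal, vertical

# Example: with a wall along the x axis, a device 3 m "into" the room (y = 3)
# is drawn 3 units up the wall, so nearer and farther devices no longer overlap.
print(map_floor_point_to_wall((1.0, 3.0), wall_axis=(1.0, 0.0), depth_axis=(0.0, 1.0)))
# -> (1.0, 3.0)
```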
  • the control unit 170 performs control to display a display object in the acquired recommended coordinate system, as in the display process described with reference to FIG. 4 (Step S227).
  • In the display processing described above, when displaying a display object on a vertical plane, if the recommended coordinate system of the display object is a coordinate system defined on the horizontal plane, an optimal display is obtained after mapping it onto the vertical plane.
  • The present embodiment is not limited to this. For example, when displaying a display object on a horizontal surface, if the recommended coordinate system of the display object is a coordinate system defined on a vertical surface, display control can be performed after mapping that coordinate system onto the horizontal surface.
  • FIG. 17 is a block diagram showing an example of the hardware configuration of the information processing apparatus according to this embodiment.
  • the information processing apparatus 900 shown in FIG. 17 can realize, for example, the information processing apparatus 100 shown in FIG.
  • Information processing by the information processing apparatus 100 according to the present embodiment is realized by cooperation of software and hardware described below.
  • The information processing apparatus 900 includes a central processing unit (CPU) 901, a read only memory (ROM) 902, a random access memory (RAM) 903, and a host bus 904a.
  • The information processing apparatus 900 further includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, and a communication device 913.
  • the information processing apparatus 900 may have a processing circuit such as an electric circuit, a DSP, or an ASIC instead of or in addition to the CPU 901.
  • the CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 according to various programs. Also, the CPU 901 may be a microprocessor.
  • the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
  • the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters and the like that appropriately change in the execution.
  • the CPU 901 can form, for example, the user operation detection unit 120, the user detection unit 130, the environment detection unit 140, the device detection unit 150, and the control unit 170 shown in FIG.
  • the CPU 901, the ROM 902, and the RAM 903 are mutually connected by a host bus 904a including a CPU bus and the like.
  • The host bus 904a is connected to an external bus 904b such as a peripheral component interconnect/interface (PCI) bus via the bridge 904.
  • the host bus 904a, the bridge 904, and the external bus 904b do not necessarily need to be separately configured, and these functions may be implemented on one bus.
  • The input device 906 is realized by, for example, a device to which information is input by the user, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, or a lever. The input device 906 may also be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or PDA that supports operation of the information processing apparatus 900. Furthermore, the input device 906 may include, for example, an input control circuit that generates an input signal based on the information input by the user using the above input means and outputs the generated input signal to the CPU 901. By operating the input device 906, the user of the information processing apparatus 900 can input various data to the information processing apparatus 900 and instruct processing operations.
  • the output device 907 is formed of a device capable of visually or aurally notifying the user of the acquired information.
  • Such devices include CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, lamps and other display devices, audio output devices such as speakers and headphones, printer devices, and the like.
  • the output device 907 outputs, for example, results obtained by various processes performed by the information processing apparatus 900.
  • the display device visually displays the results obtained by the various processes performed by the information processing apparatus 900 in various formats such as text, images, tables, graphs, and the like.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data and the like into an analog signal and aurally outputs it.
  • the storage device 908 is a device for data storage formed as an example of a storage unit of the information processing device 900.
  • the storage device 908 is realized by, for example, a magnetic storage unit device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the storage device 908 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, and a deletion device that deletes data recorded in the storage medium.
  • the storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
  • the storage device 908 can form, for example, a space information storage unit 161, a coordinate system storage unit 163, and a content storage unit 165 shown in FIG.
  • the drive 909 is a reader / writer for a storage medium, and is built in or externally attached to the information processing apparatus 900.
  • the drive 909 reads out information recorded in a removable storage medium such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903.
  • the drive 909 can also write information to the removable storage medium.
  • connection port 911 is an interface connected to an external device, and is a connection port to an external device capable of data transmission by USB (Universal Serial Bus), for example.
  • the connection port 911 may form, for example, the I / F unit 110 shown in FIG.
  • the connection port 911 is connected to the output device 200 and the sensor device 300 shown in FIG.
  • the communication device 913 is, for example, a communication interface formed of a communication device or the like for connecting to the network 920.
  • the communication device 913 is, for example, a communication card for wired or wireless Local Area Network (LAN), Long Term Evolution (LTE), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 913 may be a router for optical communication, a router for asymmetric digital subscriber line (ADSL), a modem for various communications, or the like.
  • the communication device 913 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet or another communication device.
  • the communication device 913 may form, for example, the I / F unit 110 illustrated in FIG.
  • the communication device 913 can then communicate with the output device 200 and the sensor device 300 shown in FIG.
  • the network 920 is a wired or wireless transmission path of information transmitted from a device connected to the network 920.
  • the network 920 may include the Internet, a public network such as a telephone network, a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), a WAN (Wide Area Network), or the like.
  • the network 920 may include a leased line network such as an Internet Protocol-Virtual Private Network (IP-VPN).
  • each component described above may be realized using a general-purpose member, or may be realized by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level of the time of carrying out the present embodiment.
  • a computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be created and implemented on a PC or the like.
  • a computer readable recording medium in which such a computer program is stored can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory or the like.
  • the above computer program may be distributed via, for example, a network without using a recording medium.
  • It is also possible to create a computer program for causing hardware such as the CPU, ROM, and RAM built into the information processing device 100, the output device 200, or the sensor device 300 described above to exhibit the functions of the information processing device 100, the output device 200, or the sensor device 300. A computer-readable storage medium storing the computer program is also provided.
  • the recommended coordinate system shown in Table 1 above is an example, and the present disclosure is not limited thereto.
  • For example, the recommended coordinate system of the illumination UI may instead be set to the "user coordinate system". In that case,
  • UIs highly likely to be operated by the user may be arranged in order from the left when viewed from the user.
  • the control UI of the device installed in the area where the user is currently present can be said to be a UI that is highly likely to be operated.
  • When displaying a display object at a display location at a certain distance from the user, such as a wall surface, the control unit 170 may perform control to display the display object enlarged, for example to 1.5 times the preset display size of the display object.
  • When displaying a map UI, the control unit 170 performs control to rotate the map UI so that the north of the map UI is aligned with the north direction of the real space. At this time, the control unit 170 may display character information and the like of the map UI in the "user coordinate system" so that it is easy for the user to read.
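  • As a hedged illustration of this map example (names and sign conventions are assumptions), the direction in which the map's north should be drawn can be computed from the compass bearing of the display surface's up axis.

```python
# Given the compass bearing of the display surface's "up" axis (0 = north,
# 90 = east), return the display-space direction along which the map's north
# should be drawn. Display coordinates: x to the right, y along "up".
import math

def north_direction_on_display(up_axis_bearing_deg: float):
    a = math.radians(up_axis_bearing_deg)
    return (-math.sin(a), math.cos(a))

print(north_direction_on_display(0.0))   # up axis already faces north -> roughly (0, 1)
print(north_direction_on_display(90.0))  # up axis faces east -> north is to the left, roughly (-1, 0)
```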
  • (1) An information processing apparatus comprising a control unit that performs a process of rotating a display object arrangement direction for arranging a plurality of display objects so as to coincide with a reference direction of a space coordinate system representing the positional relationship of a plurality of devices in a three-dimensional space, and a process of displaying the plurality of display objects based on the rotated display object arrangement direction.
  • (2) The information processing apparatus according to (1), wherein the plurality of display objects are UIs that control devices in corresponding positional relationships.
  • (3) The control unit performs display control to make the arrangement of the plurality of display objects coincide with the arrangement of the plurality of devices in the reference direction of the space coordinate system.
  • (4) The information processing apparatus according to (2), wherein the control unit performs display control to make the arrangement of the plurality of display objects correspond to the positional relationship of the plurality of devices, including the reference direction of the space coordinate system and the orthogonal direction orthogonal to the reference direction.
  • (5) The information processing apparatus according to any one of the above, wherein the control unit performs control to rotate character information included in the plurality of display objects so that it is displayed in the front direction of the user, based on a user coordinate system representing the front direction of the user.
  • (6) The information processing apparatus according to any one of (1) to (5), wherein, when the display object is displayed based on the operation position and at least part of the display object would deviate from the display place, the control unit performs display control by calculating, based on the display size of the display object, the size of the display place, and the user's operation position at the display place, a position at which the entire display object can be displayed at the display place.
  • (7) The information processing apparatus according to (6), wherein the control unit performs display control for guiding the gaze from the operation position to the calculated display position.
  • (8) The information processing apparatus according to any one of (1) to (7), wherein the control unit maps the plurality of display objects onto the display location coordinate system corresponding to the display surface on which the plurality of display objects are displayed, based on the rotated display object arrangement direction, and performs a process of displaying the plurality of display objects on the display surface based on the position of the mapped display location coordinate system.
  • (9) The information processing apparatus according to any one of (1) to (8), wherein, when a real object that is the display place where the plurality of display objects are displayed moves, the control unit performs control of displaying the display objects on the real object following the movement of the real object while maintaining the arrangement corresponding to the reference direction of the space coordinate system.
  • (10) The information processing apparatus according to any one of (1) to (9), wherein, when displaying the plurality of display objects on a vertical plane, the control unit performs a process of mapping the space coordinate system defined on the horizontal plane onto the vertical plane and rotating so as to coincide with the reference direction of the mapped space coordinate system.
  • (11) The information processing apparatus according to any one of (1) to (10), wherein the control unit acquires information on the positional relationship of the plurality of devices in the three-dimensional space from a result of an environmental sensing process of the space, and performs a process of defining the reference direction of the space coordinate system.
  • An information processing method including: rotating, by a processor, a display object arrangement direction for arranging a plurality of display objects so as to coincide with a reference direction of a space coordinate system representing a positional relationship of a plurality of devices in a three-dimensional space; and displaying the plurality of display objects based on the rotated display object arrangement direction.
  • (14) A program for causing a computer to function as a control unit that performs a process of rotating a display object arrangement direction for arranging a plurality of display objects so as to coincide with a reference direction of a space coordinate system representing a positional relationship of a plurality of devices in a three-dimensional space, and a process of displaying the plurality of display objects based on the rotated display object arrangement direction.
  • Reference Signs List 1 system 10 display image 100 information processing apparatus 110 I / F unit 120 user operation detection unit 130 user detection unit 140 environment detection unit 150 device detection unit 161 space information storage unit 163 coordinate system storage unit 165 content storage unit 170 control unit 171 display Control unit 173 Sound control unit 200 Output device 210 Projector 220 TV 230 Tablet 240 Smartphone 250 PC 260 Speaker 270 Unidirectional Speaker 300 Sensor Device 310 Camera 320 Depth Sensor 330 Microphone

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Provided are an information processing apparatus, an information processing method, and a program capable of aligning the arrangement direction of display objects with the positional relationship of a plurality of devices in a space. This information processing apparatus is provided with a control unit that performs a process of rotating a display object arrangement direction for arranging a plurality of display objects so as to coincide with the reference direction of a space coordinate system representing the positional relationship of a plurality of devices in a three-dimensional space, and a process of displaying the plurality of display objects based on the rotated display object arrangement direction.
PCT/JP2018/032997 2017-10-25 2018-09-06 Information processing apparatus, information processing method, and program Ceased WO2019082520A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017205716 2017-10-25
JP2017-205716 2017-10-25

Publications (1)

Publication Number Publication Date
WO2019082520A1 true WO2019082520A1 (fr) 2019-05-02

Family

ID=66247323

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/032997 Ceased WO2019082520A1 (fr) 2017-10-25 2018-09-06 Information processing apparatus, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2019082520A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004320209A (ja) * 2003-04-14 2004-11-11 Sony Corp 通信装置、その表示方法、コンピュータプログラム、および通信方法
JP2009223490A (ja) * 2008-03-14 2009-10-01 Shimizu Corp 仮想スイッチならびにそれを用いた家電制御システムおよび家電制御方法
JP2011137905A (ja) * 2009-12-28 2011-07-14 Fujitsu Ltd 投影システム、投影処理プログラムおよび投影システムの制御方法
JP2013152711A (ja) * 2011-12-28 2013-08-08 Nikon Corp 投影装置及び表示装置
JP2013164696A (ja) * 2012-02-10 2013-08-22 Sony Corp 画像処理装置、画像処理方法及びプログラム
WO2015140106A1 (fr) * 2014-03-17 2015-09-24 IT-Universitetet i København Procédé d'interaction du regard mis en œuvre par ordinateur et son appareil



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18869580

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18869580

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP