WO2023210288A1 - Information processing device, information processing method, and information processing system - Google Patents
Information processing device, information processing method, and information processing system
- Publication number
- WO2023210288A1 (PCT/JP2023/014209)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- information
- information processing
- generated
- processing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/373—Details of the operation on graphic patterns for modifying the size of the graphic pattern
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Definitions
- the present invention relates to an information processing device, an information processing method, and an information processing system.
- a technique is known in which an image captured by a camera mounted on a moving body is presented to a remote operator (hereinafter also referred to as an operator) who operates the moving body from a remote location.
- a technique is also known that zooms in on the image presented to the remote operator in accordance with the movement of the mobile object.
- however, with such techniques the image presented to the remote operator is simply zoomed in according to the movement of the mobile object, and it is not necessarily possible to improve usability in remote operation of the mobile object.
- the present disclosure proposes an information processing device, an information processing method, and an information processing system that can improve usability in remote control of a mobile object.
- a first image generated by an imaging unit mounted on a moving object, together with first time information indicating when the first image was generated, is received at a first frequency.
- motion information regarding the motion of the moving object, generated by a sensor unit mounted on the moving object, together with second time information indicating when the motion information was generated, is received at a second frequency higher than the first frequency.
- an information processing device includes a control unit that receives the first image and generates a second image based on the first image and the motion information.
- FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
- FIG. 2 is a diagram showing the relationship between the predicted position of the vehicle and the viewing frustum according to the same embodiment.
- FIG. 3 is a diagram illustrating an example of object arrangement in a virtual space according to the same embodiment.
- FIG. 4 is a diagram showing an example of a first image generated by an imaging unit of a vehicle traveling straight and a second image viewed by a remote operator according to the same embodiment.
- FIG. 5 is a diagram showing the relationship between the acquisition frequency of the first image and the acquisition frequency of motion information according to the same embodiment.
- FIG. 6 is a diagram illustrating an example of object arrangement in a virtual space according to the same embodiment.
- FIG. 7 is a diagram illustrating an example of a first image generated by an imaging unit of a vehicle turning a curve and a second image viewed by a remote operator according to the same embodiment.
- FIG. 8 is a flowchart showing the information processing procedure by the remote control device according to the same embodiment.
- FIG. 9 is a diagram for explaining vibration handling of a camera according to a modification of the same embodiment.
- FIG. 10 is a hardware configuration diagram showing an example of a computer that implements the functions of the information processing device.
- the information processing device places the camera image and a viewing frustum in a virtual space and moves the viewing frustum to a position calculated from the motion information of the moving object, thereby generating an image in which the apparent delay time is nullified. The information processing device then displays the image with the apparent delay time nullified to the remote operator. As a result, even if the video is delayed, the operator can operate the moving object without perceiving the delay in the video and without feeling discomfort or anxiety.
- the information processing device acquires motion information of the moving object at a higher frequency than the frequency at which it acquires camera images.
- the information processing device can interpolate the video even between frames of the camera video, so that problems caused by a low frame rate can be solved at the same time.
- the communication rate and communication delay always change, so the frame interval of camera images also changes all the time.
- the information processing device can flexibly respond to such changes in the frame interval of camera images, and can display images with less discomfort even if camera images are suddenly delayed or missing.
- the information processing device can improve usability in remote control of a mobile object.
- FIG. 1 is a diagram illustrating a configuration example of an information processing system 1 according to an embodiment of the present disclosure.
- the information processing system 1 includes a mobile object 10 and a remote control device 100.
- the mobile object 10 and the remote control device 100 are connected via a predetermined network N so that they can communicate wirelessly.
- image is a concept that includes both video and still images.
- the mobile object 10 is an autonomous mobile device that can be moved under the control of a remote operator.
- the mobile body 10 is a vehicle, but the mobile body 10 is not limited to a vehicle.
- the mobile object 10 may be a robot or a drone.
- the mobile body 10 includes a communication section 11, a control section 12, an imaging section 13, and a sensor section 14.
- the communication unit 11 is realized by, for example, a NIC (Network Interface Card). The communication unit 11 is wirelessly connected to the network N and transmits and receives information to and from the remote control device 100. For example, the communication unit 11 receives operation information regarding the operation of the mobile object 10 from the remote control device 100. Upon receiving the operation information, the communication unit 11 outputs it to the operation control unit of the control unit 12.
- the control unit 12 is a controller, realized by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array) executing various programs (corresponding to an example of an information processing program) stored in an internal storage device, using a storage area such as a RAM as a work area.
- the control section 12 includes a vibration removal section and an operation control section.
- the operation control unit of the control unit 12 receives operation information from the communication unit 11.
- the operation control section of the control section 12 controls the movement of the moving body 10 according to the received operation information.
- the control unit 12 acquires from the imaging unit 13 an image generated by the imaging unit 13 (hereinafter referred to as a first image) and first time information at which the first image was generated.
- the control unit 12 attaches to the first image generated by the imaging unit 13 the first time information indicating when it was generated. Further, the vibration removal unit of the control unit 12 removes vibration from the first image.
- the control unit 12 outputs the first image and first time information at which the first image was generated to the communication unit 11.
- the communication unit 11 transmits the first image and first time information at which the first image was generated to the remote control device 100 at a first frequency.
- the control unit 12 acquires, from the sensor unit 14, the motion information generated by the sensor unit 14 and second time information indicating when the motion information was generated.
- the control unit 12 attaches the second time information to the motion information generated by the sensor unit 14.
- the control unit 12 outputs the motion information and the second time information to the communication unit 11.
- the communication unit 11 transmits the motion information and the second time information to the remote control device 100 at a second frequency higher than the first frequency.
- the imaging unit 13 realizes a camera function to capture an image of a target object.
- the imaging unit 13 includes, for example, an optical system such as a lens, and an imaging element such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
- the imaging unit 13 images the environment around the moving object 10 .
- the sensor section 14 is also referred to as a motion measurement section.
- the sensor unit 14 includes various sensors mounted on or connected to the moving body 10.
- the connection may be a wired connection or a wireless connection.
- the sensors may also be detection devices provided outside the mobile object 10, such as wirelessly connected devices.
- the sensor unit 14 includes a speed sensor, an inertial sensor, and a steering angle sensor, and generates motion information regarding the motion of the moving body 10.
- the sensor unit 14 generates, as motion information, information regarding temporal changes in the position and posture of the moving body 10.
- the remote control device 100 includes a communication section 110 and a control section 120.
- the communication unit 110 is realized by, for example, a NIC or the like. The communication unit 110 is wirelessly connected to the network N and transmits and receives information to and from the mobile body 10. For example, the communication unit 110 receives, from the remote operation unit of the control unit 120, operation information related to the operation of the mobile body 10 input by the remote operator. Upon receiving the operation information, the communication unit 110 transmits it to the mobile body 10.
- the control unit 120 is a controller, realized by, for example, a CPU, MPU, ASIC, or FPGA executing various programs (corresponding to an example of an information processing program) stored in a storage device inside the remote control device 100, using a storage area such as a RAM as a work area.
- the control unit 120 includes a video delay calculation unit, a virtual imaging position prediction unit, a delay nullification video processing unit, a video output unit, and a remote control unit.
- FIG. 2 is a diagram showing the relationship between the predicted position of the vehicle and the viewing frustum according to the embodiment of the present disclosure.
- the control unit 120 estimates moving body position information regarding the position and posture of the mobile body 10 at the time when the remote operator views the second image (the time when the second image is displayed to the remote operator). Specifically, the virtual imaging position prediction unit of the control unit 120 estimates this position information based on the position of the vehicle at the time when the first image was captured. More specifically, the control unit 120 acquires the most recent motion information (the latest motion information) from among the motion information acquired up to the point at which the moving body position information is estimated.
- the control unit 120 acquires information indicating the most recent speed of the mobile body 10 as the latest motion information. The control unit 120 then calculates the time difference between the time when the first image was captured and the time when the second image is displayed to the remote operator. Subsequently, the control unit 120 estimates the distance traveled by the mobile body 10 as moving body position information by multiplying the acquired speed of the mobile body 10 by the calculated time difference.
- the control unit 120 estimates view frustum position information regarding the position and orientation of the view frustum at the time when the operator views the second image, based on the motion information. Specifically, based on the estimated moving body position information, the virtual imaging position prediction unit of the control unit 120 estimates, as the view frustum position information, the position and orientation of the view frustum corresponding to the position of the mobile body 10 indicated by the moving body position information.
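- the prediction described above can be sketched in a few lines. This is a minimal illustration, not the disclosed implementation: `MotionSample`, its fields, and `predict_travel` are hypothetical names, and speed and turn rate are assumed constant over the delay interval.

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    """Latest motion information received from the vehicle (hypothetical fields)."""
    timestamp: float     # second time information [s]
    speed_mps: float     # forward speed [m/s]
    yaw_rate_rps: float  # turn rate [rad/s]

def predict_travel(sample: MotionSample, capture_time: float, display_time: float):
    """Distance and heading change between image capture and display.

    Implements the estimate described above: the latest speed multiplied
    by the time difference between capture and display; the heading is
    integrated the same way, assuming both stay constant over the delay.
    """
    dt = display_time - capture_time
    return sample.speed_mps * dt, sample.yaw_rate_rps * dt

# Example: 10 m/s straight-driving vehicle, 250 ms end-to-end video delay.
distance_m, heading_rad = predict_travel(
    MotionSample(timestamp=0.0, speed_mps=10.0, yaw_rate_rps=0.0),
    capture_time=0.0, display_time=0.25)
```

The predicted distance and heading change determine where the view frustum is placed in the virtual space.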
- FIG. 3 is a diagram illustrating an example of object arrangement in a virtual space according to an embodiment of the present disclosure.
- FIG. 3 shows a case where the vehicle body moves straight in the direction of travel.
- the control unit 120 generates, based on the first image and the motion information, a second image that is viewed by the operator who remotely controls the mobile object.
- the delay nullification video processing unit of the control unit 120 takes, from the first image (camera image G1) placed in the virtual space as shown in FIG. 3, the region (G12) that intersects with the view frustum (the view frustum whose apex is point P12) placed at the position and orientation indicated by the view frustum position information at the time when the second image is viewed, and generates that region as the second image.
- FIG. 4 is a diagram illustrating an example of a first image generated by an imaging unit of a vehicle traveling straight and a second image viewed by a remote operator according to an embodiment of the present disclosure.
- the image G11 shown on the left side of FIG. 4 corresponds to the region (G11) of the first image (camera image G1) placed in the virtual space shown in FIG. 3 that intersects with the view frustum placed at the position and orientation at the capture time (the view frustum whose apex is point P11).
- the image G12 shown on the right side of FIG. 4 corresponds to the region (G12) of the first image (camera image G1) placed in the virtual space shown in FIG. 3 that intersects with the view frustum placed at the position and orientation at the viewing time (the view frustum whose apex is point P12).
- the video output unit of the control unit 120 enlarges the image of the region (G12) generated as the second image and displays it on the screen viewed by the remote operator. As shown in FIG. 4, the control unit 120 superimposes a CG image G13 corresponding to the dashboard of the mobile object on the second image.
- FIG. 5 is a diagram showing the relationship between the acquisition frequency of the first image and the acquisition frequency of motion information according to the embodiment of the present disclosure.
- the control unit 120 receives, from the mobile body 10 via the communication unit 110, the first image (the video frame shown in FIG. 5) generated by the imaging unit 13 mounted on the mobile body 10, together with the first time information indicating when the first image was generated, at a first frequency.
- the control unit 120 receives, from the mobile body 10 via the communication unit 110, motion information regarding the motion of the mobile body generated by the sensor unit 14 mounted on the mobile body 10, together with second time information indicating when the motion information was generated, at a second frequency higher than the first frequency.
- the control unit 120 receives a set of the motion information and the second time information every second period, which is shorter than the first period at which a set of the first image and the first time information is received.
- for example, in the interval (first period) between receiving one video frame (first image) with its first time information from the imaging unit 13 of the mobile body 10 and receiving the next video frame with its first time information, the control unit 120 receives a set of the motion information and the second time information from the sensor unit 14 of the mobile body 10 every second period, which is approximately one quarter of the first period.
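- the interleaving of the two streams can be illustrated with a small scheduling sketch. The `interleave` helper and the 10 Hz / 40 Hz timings are illustrative assumptions chosen to match the roughly 4:1 ratio described above:

```python
def interleave(frame_times, motion_times):
    """Pair each high-rate motion sample with the latest video frame.

    frame_times:  receive times of first images (assumed ~10 Hz here)
    motion_times: receive times of motion samples (assumed ~40 Hz here)
    Between two frames the display is re-rendered from the same frame at
    every motion update, which is what lets the motion stream interpolate
    the video between camera frames.
    """
    pairs, fi = [], 0
    for mt in motion_times:
        # advance to the newest frame received at or before this sample
        while fi + 1 < len(frame_times) and frame_times[fi + 1] <= mt:
            fi += 1
        if frame_times[fi] <= mt:
            pairs.append((mt, frame_times[fi]))
    return pairs

# Frames every 100 ms, motion every 25 ms (the roughly 4:1 ratio of FIG. 5):
pairs = interleave([0.0, 0.1], [0.0, 0.025, 0.05, 0.075, 0.1])
```

Each pair re-renders the held frame with the frustum moved to the freshly predicted pose, so the displayed video updates at the motion rate rather than the camera rate.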
- the video delay calculation unit of the control unit 120 calculates, as the video delay time, the time from when the control unit 120 (e.g., the delay nullification video processing unit) acquires the first image until the second image is output.
- FIG. 6 is a diagram illustrating an example of object arrangement in the virtual space according to the embodiment of the present disclosure.
- FIG. 6 differs from FIG. 3 in that the vehicle body is turning to the right relative to its traveling direction.
- the delay nullification video processing unit of the control unit 120 generates, as the second image, the region (G22) of the first image (camera image G2) placed in the virtual space as shown in FIG. 6 that intersects with the view frustum (the view frustum whose apex is point P22) placed at the position and orientation indicated by the view frustum position information at the time when the second image is viewed (the time when the second image is displayed to the remote operator).
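- for the turning case, the lateral displacement of the intersected region on the image plane can be approximated with a pinhole-camera model. `crop_center_shift_px`, the focal length, and the use of a pure yaw rotation are illustrative assumptions, not part of the disclosure:

```python
import math

def crop_center_shift_px(focal_px: float, yaw_change_rad: float) -> float:
    """Horizontal shift (pixels) of the crop window on the first image
    when the predicted viewpoint has turned by yaw_change_rad.

    Pinhole-camera model: a point straight ahead after the turn projects
    at focal_px * tan(yaw) from the original image center, so a right
    turn moves the intersected region (G22 in FIG. 6) toward the right
    edge of the frame.
    """
    return focal_px * math.tan(yaw_change_rad)

# 0.05 rad (about 2.9 degrees) of predicted right turn, 1000-px focal length:
shift_px = crop_center_shift_px(1000.0, 0.05)
```

This is why the second image for a curving vehicle is cut from an off-center region of the first image rather than a centered one.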
- FIG. 7 is a diagram illustrating an example of a first image generated by an imaging unit of a vehicle turning a curve and a second image viewed by a remote operator according to an embodiment of the present disclosure.
- the image G21 shown on the left side of FIG. 7 corresponds to the region (G21) of the first image (camera image G2) placed in the virtual space shown in FIG. 6 that intersects with the view frustum placed at the position and orientation at the capture time (the view frustum whose apex is point P21).
- the image G22 shown on the right side of FIG. 7 corresponds to the region (G22) of the first image (camera image G2) placed in the virtual space shown in FIG. 6 that intersects with the view frustum placed at the position and orientation at the viewing time (the view frustum whose apex is point P22).
- the video output unit of the control unit 120 enlarges the image of the area (G22) generated as the second image and displays it on a screen viewed by the remote operator.
- the remote control device 100 may include a storage unit.
- the storage unit is realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk.
- the storage unit stores various programs and data used by each unit of the remote control device 100.
- FIG. 8 is a flowchart showing an information processing procedure by the remote control device according to the embodiment of the present disclosure.
- the remote control device 100 determines whether a new video frame exists (step S1). If a new video frame exists (step S1; Yes), the remote control device 100 performs a new-video-frame acquisition process. Specifically, the remote control device 100 updates the camera image placed in the virtual space (step S2). Subsequently, the remote control device 100 compares the captured video with a predicted video having a similar time stamp and corrects the prediction function (step S3). Next, the remote control device 100 determines, by counting backwards from the time the remote operator views the video, whether it is the timing to start drawing (step S4).
- if no new video frame exists (step S1; No), the remote control device 100 likewise determines, by counting backwards from the time the remote operator views the video, whether it is the timing to start drawing (step S4).
- if it is the timing to start drawing (step S4; Yes), the remote control device 100 acquires the motion information of the mobile body 10 (step S5). Subsequently, the remote control device 100 predicts the position and orientation of the mobile body (and thus of the view frustum) at the time when the remote operator views the video (step S6), moves the view frustum to the predicted position (step S7), and outputs the image of the region of the camera image placed in the virtual space that intersects with the view frustum (step S8). The remote control device 100 then determines whether there is time to perform the new-video-frame acquisition process (step S9). If it is not yet the timing to start drawing (step S4; No), the remote control device 100 proceeds directly to determining whether there is time to perform the new-video-frame acquisition process (step S9).
- if there is no time to perform the new-video-frame acquisition process (step S9; No), the remote control device 100 returns to determining whether it is the timing to start drawing (step S4). If there is time (step S9; Yes), the remote control device 100 returns to determining whether a new video frame exists (step S1).
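- the control flow of steps S1-S9 can be sketched as a loop. This is one possible reading of the flowchart, with every step supplied as a hypothetical callable hook rather than the disclosed implementation:

```python
def control_loop(has_new_frame, acquire_frame, should_draw, draw, has_slack, running):
    """One possible rendering loop following steps S1-S9 of FIG. 8.

    All callables are hypothetical hooks supplied by the caller:
      has_new_frame() -> bool   # S1: is a new video frame available?
      acquire_frame()           # S2-S3: update the camera image in the
                                #   virtual space, correct the prediction
      should_draw() -> bool     # S4: counted back from the viewing time
      draw()                    # S5-S8: acquire motion info, predict the
                                #   frustum pose, move it, output the image
      has_slack() -> bool       # S9: time left before the draw deadline?
      running() -> bool         # loop guard (not part of the flowchart)
    """
    while running():
        if has_new_frame():                      # S1: Yes
            acquire_frame()                      # S2, S3
        while running() and not should_draw():   # S4: No -> check slack
            if has_slack() and has_new_frame():  # S9: Yes -> back to S1
                acquire_frame()
        if running():
            draw()                               # S5-S8

# Minimal deterministic run: one pending frame, always time to draw.
state = {"frames": 1, "draws": 0, "ticks": 0}

def _running():
    state["ticks"] += 1
    return state["draws"] < 2 and state["ticks"] < 100

control_loop(
    has_new_frame=lambda: state["frames"] > 0,
    acquire_frame=lambda: state.__setitem__("frames", state["frames"] - 1),
    should_draw=lambda: True,
    draw=lambda: state.__setitem__("draws", state["draws"] + 1),
    has_slack=lambda: False,
    running=_running,
)
```

The key property preserved from the flowchart is that frame acquisition is opportunistic (only when there is slack before the draw deadline), while drawing is scheduled backwards from the time the operator views the video.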
- the remote control device 100 according to the embodiment described above may be implemented in various different forms other than the embodiment described above. Therefore, other embodiments of the remote control device 100 will be described below. Note that the same parts as those in the embodiment are given the same reference numerals and the description thereof will be omitted.
- the remote control device 100 estimates the view frustum position information regarding the position and orientation of the view frustum at the time when the operator views the second image, and generates the second image using the estimated view frustum position and orientation.
- the remote control device 100 may generate the second image without using a viewing frustum.
- alternatively, the control unit 120 estimates, based on the motion information, the distance that the vehicle travels in the traveling direction in a predetermined time, and may generate, as the second image, a region of the first image that is shrunk according to the distance traveled by the vehicle.
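- under a pinhole-camera assumption, with the first image treated as a plane at a known depth in front of the lens, the shrunken region can be computed by similar triangles. `crop_window`, its parameters, and the 20 m plane depth are illustrative assumptions:

```python
def crop_window(img_w: float, img_h: float, plane_depth_m: float, travel_m: float):
    """Centered crop of the first image approximating the view after the
    camera has moved forward by travel_m.

    The first image is treated as a plane plane_depth_m in front of the
    lens; by similar triangles, moving the viewpoint forward shrinks the
    visible window by (plane_depth_m - travel_m) / plane_depth_m.  The
    cropped window is then enlarged back to full screen for display.
    """
    if not 0.0 <= travel_m < plane_depth_m:
        raise ValueError("travel must be shorter than the image-plane depth")
    scale = (plane_depth_m - travel_m) / plane_depth_m
    width, height = img_w * scale, img_h * scale
    left, top = (img_w - width) / 2.0, (img_h - height) / 2.0
    return left, top, width, height

# Vehicle predicted to advance 2 m toward a plane placed 20 m ahead:
left, top, width, height = crop_window(1920, 1080, plane_depth_m=20.0, travel_m=2.0)
```

The frustum-based method of the embodiment and this frustum-free variant compute the same kind of region; the frustum simply generalizes the geometry to rotations and off-center crops.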
- FIG. 9 is a diagram for explaining vibration handling of a camera according to a modification of the embodiment of the present disclosure.
- when the vehicle bumps (moves in the vertical direction) on the road, the camera mounted on the vehicle also bumps, so the image captured by the camera contains vertical vibrations.
- the remote operator views the vibrating video with a delay corresponding to the video delay.
- the control unit 120 corrects the blurring of the first image caused by the vertical movement of the mobile body. Specifically, the control unit 120 corrects the first image so as to cancel the vibrations contained in it. Further, based on the motion information, the control unit 120 applies to the second image viewed by the operator processing that reproduces a vibration equivalent to the cancelled one, corresponding to the vertical movement of the mobile body at the time of viewing.
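- for pure vertical bounce, this two-step handling (cancel the vibration baked into the first image, then re-apply the vibration measured at display time) reduces to a single pixel shift. The helper below is an illustrative sketch under that simplifying assumption, not the disclosed implementation:

```python
def stabilized_shift_px(capture_offset_px: float, display_offset_px: float) -> float:
    """Vertical pixel shift applied to the displayed second image.

    capture_offset_px: camera bounce (pixels) at the moment the first
        image was captured; subtracting it cancels the vibration baked
        into the frame (the correction applied to the first image).
    display_offset_px: bounce derived from the motion information at the
        moment of display; adding it reproduces the vehicle's current
        vertical motion instead of the delayed one.
    """
    return display_offset_px - capture_offset_px

# Frame captured mid-bounce (+8 px); vehicle level again at display time:
shift_px = stabilized_shift_px(capture_offset_px=8.0, display_offset_px=0.0)
```

The point of re-applying the current bounce, rather than simply stabilizing, is that the operator feels vibration consistent with the vehicle's present motion instead of a delayed copy of it.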
- the control unit 120 coarsens the resolution of the second image generated first, so that the difference between its resolution and the resolution of the second image generated last falls within a predetermined range.
- likewise, the control unit 120 coarsens the resolution of each second image generated after the first, so that the difference between its resolution and the resolution of the second image generated last falls within the predetermined range.
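- one way to keep the resolution difference within a predetermined range is to cap every image's effective resolution near that of the last-generated (smallest-crop) image. `coarsen_schedule` and its numbers are illustrative assumptions, not the disclosed method:

```python
def coarsen_schedule(native_px, crop_scales, max_diff_px):
    """Effective horizontal resolution of each second image after it is
    enlarged to full screen, capped so that no image exceeds the
    last-generated image's resolution by more than max_diff_px.

    crop_scales: per-image crop factor (1.0 = full frame), ordered from
    the first to the last second image cut from one first image.  Later
    crops are smaller and get enlarged more, so their effective
    resolution falls; earlier images are deliberately coarsened toward
    the last one's resolution.
    """
    last_res = native_px * crop_scales[-1]
    return [min(native_px * s, last_res + max_diff_px) for s in crop_scales]

# Four second images cut from one 1920-px-wide frame as the frustum advances:
sched = coarsen_schedule(1920, [1.0, 0.95, 0.9, 0.85], max_diff_px=96.0)
```

Capping rather than matching exactly trades a little sharpness on the early images for a sequence whose perceived roughness stays nearly constant.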
- the control unit 120 displays the second image on the screen in a display manner that allows the operator to visually recognize that a second image is being displayed. For example, when displaying the second image on the screen, the control unit 120 surrounds the second image with a red frame or marks it in a similar manner so that the operator can visually recognize that a second image is being displayed.
- the control unit 120 displays information on the video output unit to notify the operator that an object has been detected.
- for example, the control unit 120 visually emphasizes the area where the detected object appears on the video output unit.
- the sensor section 14 includes a distance measurement sensor.
- the sensor unit 14 may include a RADAR (Radio Detection and Ranging), a LiDAR (Light Detection and Ranging), an ultrasonic sensor, a stereo camera, or the like as a ranging sensor.
- the information processing device includes a control unit (the control unit 120 in the embodiment).
- the control unit receives, at a first frequency, a first image generated by an imaging unit (the imaging unit 13 in the embodiment) mounted on a mobile object (the mobile object 10 in the embodiment) together with first time information indicating when the first image was generated.
- the control unit also receives, at a second frequency higher than the first frequency, motion information regarding the motion of the mobile object generated by a sensor unit (the sensor unit 14 in the embodiment) mounted on the mobile object together with second time information indicating when the motion information was generated.
- the control unit generates a second image based on the first image and the motion information.
- the second image is viewed by an operator who remotely controls the mobile object.
- the information processing device can present an image with the apparent delay time nullified to the remote operator.
- the information processing device allows the operator to operate the mobile object without perceiving the delay in the video and without feeling discomfort or anxiety.
- the information processing device acquires motion information of the moving body at a higher frequency than the frequency at which camera images are acquired.
- the information processing device can interpolate the video even between frames of the camera video, so that problems caused by a low frame rate can be solved at the same time.
- the communication rate and communication delay always change, so the frame interval of camera images also changes all the time.
- the information processing device can flexibly respond to such changes in the frame interval of camera images, and can display images with less discomfort even if camera images are suddenly delayed or missing. Therefore, the information processing device can improve usability in remote control of a mobile object.
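The two-rate reception described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the class and method names (`SecondImageGenerator`, `on_camera_frame`, `on_motion_sample`) are hypothetical, and `render` is a placeholder for the actual re-projection of the first image.

```python
from dataclasses import dataclass

@dataclass
class CameraFrame:
    pixels: object      # first image
    t_generated: float  # first time information

@dataclass
class MotionSample:
    velocity: tuple     # (vx, vy, vz) in m/s
    yaw_rate: float     # rad/s
    t_generated: float  # second time information

class SecondImageGenerator:
    """Regenerates a display image on every motion tick, so the display
    rate follows the (higher) motion-information frequency rather than
    the camera frame rate."""
    def __init__(self):
        self.latest_frame = None

    def on_camera_frame(self, frame: CameraFrame):    # first frequency (e.g. ~10 Hz)
        self.latest_frame = frame

    def on_motion_sample(self, motion: MotionSample): # second, higher frequency
        if self.latest_frame is None:
            return None
        # elapsed time between image capture and this motion sample
        dt = motion.t_generated - self.latest_frame.t_generated
        # second image = first image re-projected for the pose change over dt
        return self.render(self.latest_frame, motion, dt)

    def render(self, frame, motion, dt):
        # placeholder: a real implementation warps the image by the pose
        # change predicted from the motion information
        return (frame.pixels, dt)
```

Because a second image is produced per motion sample, the output rate is decoupled from (and higher than) the camera frame rate, which is what lets the device bridge late or dropped frames.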
- The control unit also estimates, based on the motion information, mobile object position information regarding the position and orientation of the mobile object at the time when the operator views the second image, and generates the second image based on the first image and the mobile object position information.
- Thereby, the information processing device can present the remote operator with an image in which the apparent delay time is canceled.
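One common way to estimate the mobile object's pose at the viewing time from velocity and yaw-rate samples is dead reckoning under a constant-velocity, constant-turn-rate assumption. The sketch below is illustrative; the patent does not prescribe a specific motion model.

```python
import math

def predict_pose(x, y, yaw, velocity, yaw_rate, dt):
    """Dead-reckon the mobile object's 2D position and heading dt seconds
    ahead of the last motion sample (constant velocity / turn rate)."""
    if abs(yaw_rate) < 1e-9:                    # straight-line motion
        x += velocity * math.cos(yaw) * dt
        y += velocity * math.sin(yaw) * dt
    else:                                       # circular-arc motion
        r = velocity / yaw_rate                 # turn radius
        x += r * (math.sin(yaw + yaw_rate * dt) - math.sin(yaw))
        y += r * (math.cos(yaw) - math.cos(yaw + yaw_rate * dt))
        yaw += yaw_rate * dt
    return x, y, yaw
```

Here `dt` would be the interval from the second time information to the predicted viewing time (transmission delay plus rendering and display latency).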
- The control unit estimates, based on the motion information, view frustum position information regarding the position and orientation of the view frustum at the time when the operator views the second image, and generates, as the second image, the region of the first image arranged in the virtual space that intersects the view frustum placed in the virtual space at the position and orientation indicated by the view frustum position information.
- Thereby, the information processing device can present the remote operator with an image in which the apparent delay time is canceled.
- The control unit generates each of the plurality of second images generated from one first image such that the resolution of each falls within a predetermined range.
- Thereby, the information processing device can present images in which the apparent delay time is canceled without making the remote operator notice changes in image coarseness.
- The control unit corrects blur in the first image caused by vertical movement of the mobile object, and adds processing to the second image viewed by the operator so that the vibration caused by the vertical movement appears at the time the mobile object actually moves vertically.
- Thereby, the information processing device can present the remote operator with an image in which the delay in conveying the mobile object's vertical movement is canceled.
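A minimal sketch of this stabilize-then-reapply idea: cancel the vertical offset present at capture time, then apply the offset predicted for the viewing time instead, so the operator sees the bump when it happens rather than one latency later. All names are hypothetical, and a real system would interpolate between motion samples rather than take the nearest one.

```python
from collections import deque

class VibrationCompensator:
    """Tracks vertical camera offsets sampled at the motion-information
    rate and computes the net row shift to apply when rendering."""
    def __init__(self):
        self.offsets = deque(maxlen=256)   # (timestamp, vertical_offset_px)

    def record(self, t, offset_px):        # fed at the motion-info frequency
        self.offsets.append((t, offset_px))

    def offset_at(self, t):
        # nearest recorded sample; interpolation would be smoother
        return min(self.offsets, key=lambda s: abs(s[0] - t))[1]

    def render_shift(self, t_capture, t_view):
        # remove the capture-time shake, re-apply the viewing-time offset
        return self.offset_at(t_view) - self.offset_at(t_capture)
```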
- the information processing device further includes an output unit that displays the second image.
- When displaying the second image on the output unit, the control unit performs control so that the second image is displayed in a display manner that allows the operator to visually recognize that the second image is being displayed.
- the information processing device can remind the remote operator that they need to be careful because the image is different from the real-time on-site image.
- When the sensor unit detects an object located within a predetermined range from the mobile object, the control unit performs control to display, on the output unit, information notifying the operator that the object has been detected.
- Thereby, the information processing device can notify the remote operator of on-site events that occur in real time (such as an animal jumping out), even while the operator is viewing images in which the apparent delay time is canceled.
- control unit controls the appearance area of the object to be visually emphasized and displayed on the output unit.
- Thereby, the information processing device can more effectively notify the remote operator of on-site events that occur in real time (such as an animal jumping out).
- control unit controls the image corresponding to the dashboard of the mobile object to be displayed superimposed on the second image.
- Thereby, the image corresponding to the dashboard of the mobile object helps the remote operator perform remote control in a state close to actually riding in and operating the mobile object.
- FIG. 10 is a hardware configuration diagram showing an example of a computer 1000 that implements the functions of an information processing device such as the remote control device 100.
- the remote control device 100 according to the embodiment will be described below as an example.
- Computer 1000 has CPU 1100, RAM 1200, ROM (Read Only Memory) 1300, HDD (Hard Disk Drive) 1400, communication interface 1500, and input/output interface 1600. Each part of computer 1000 is connected by bus 1050.
- the CPU 1100 operates based on a program stored in the ROM 1300 or the HDD 1400 and controls each part. For example, the CPU 1100 loads programs stored in the ROM 1300 or HDD 1400 into the RAM 1200, and executes processes corresponding to various programs.
- the ROM 1300 stores boot programs such as BIOS (Basic Input Output System) that are executed by the CPU 1100 when the computer 1000 is started, programs that depend on the hardware of the computer 1000, and the like.
- BIOS Basic Input Output System
- the HDD 1400 is a computer-readable recording medium that non-temporarily records programs executed by the CPU 1100 and data used by the programs.
- HDD 1400 is a recording medium that records a program according to the present disclosure, which is an example of program data 1450.
- the communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet).
- CPU 1100 receives data from other devices or transmits data generated by CPU 1100 to other devices via communication interface 1500.
- the input/output interface 1600 is an interface for connecting the input/output device 1650 and the computer 1000.
- the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600. Further, the CPU 1100 transmits data to an output device such as a display, speaker, or printer via an input/output interface 1600.
- the input/output interface 1600 may function as a media interface that reads programs and the like recorded on a predetermined recording medium.
- The media include, for example, optical recording media such as DVDs (Digital Versatile Discs) and PDs (Phase change rewritable Disks), magneto-optical recording media such as MOs (Magneto-Optical disks), tape media, magnetic recording media, and semiconductor memories.
- the CPU 1100 of the computer 1000 reproduces the functions of the control unit 120 and the like by executing a program loaded onto the RAM 1200.
- the HDD 1400 stores programs and various data according to the present disclosure. Note that although the CPU 1100 reads and executes the program data 1450 from the HDD 1400, as another example, these programs may be obtained from another device via the external network 1550.
- The present technology can also have the following configurations.
- (1) An information processing device comprising a control unit that receives, at a first frequency, a first image generated by an imaging unit mounted on a mobile object and first time information indicating when the first image was generated, receives, at a second frequency higher than the first frequency, motion information regarding the motion of the mobile object generated by a sensor unit mounted on the mobile object and second time information indicating when the motion information was generated, and generates a second image based on the first image and the motion information.
- (2) The information processing device according to (1), wherein the second image is viewed by an operator who remotely controls the mobile object.
- (3) The information processing device according to (2), wherein the control unit estimates, based on the motion information, mobile object position information regarding the position and orientation of the mobile object at the time when the operator views the second image, and generates the second image based on the first image and the mobile object position information.
- (4) The information processing device according to (3), wherein the control unit estimates, based on the motion information, view frustum position information regarding the position and orientation of a view frustum at the time when the operator views the second image, and generates, as the second image, a region of the first image arranged in a virtual space that intersects the view frustum placed in the virtual space at the position and orientation indicated by the view frustum position information.
- (5) The information processing device according to any one of (1) to (4), wherein the control unit generates each of a plurality of second images generated from one first image such that the resolution of each falls within a predetermined range.
- (6) The information processing device according to any one of (1) to (5), wherein the control unit corrects blur in the first image caused by vertical movement of the mobile object, and adds processing to the second image corresponding to vibration caused by the vertical movement of the mobile object.
- (7) The information processing device according to any one of (2) to (6), further comprising an output unit that displays the second image, wherein, when displaying the second image on the output unit, the control unit performs control so that the second image is displayed on the output unit in a display manner that allows the operator to visually recognize that the second image is being displayed.
- (8) The information processing device according to any one of (2) to (7), further comprising an output unit that displays the second image, wherein, when the sensor unit detects an object located within a predetermined range from the mobile object, the control unit performs control to display, on the output unit, information notifying the operator that the object has been detected.
- (9) The information processing device according to (8), wherein the control unit performs control to visually emphasize the appearance area of the object on the output unit.
- (10) The information processing device according to any one of (1) to (9), wherein the control unit performs control to display an image corresponding to a dashboard of the mobile object superimposed on the second image.
- (11) An information processing method executed by an information processing device, the method comprising: receiving, at a first frequency, a first image generated by an imaging unit mounted on a mobile object and first time information indicating when the first image was generated; receiving, at a second frequency higher than the first frequency, motion information regarding the motion of the mobile object generated by a sensor unit mounted on the mobile object and second time information indicating when the motion information was generated; and generating a second image based on the first image and the motion information.
- (12) An information processing system comprising a mobile object and an information processing device, wherein the information processing device includes a control unit that receives, at a first frequency, a first image generated by an imaging unit mounted on the mobile object and first time information indicating when the first image was generated, receives, at a second frequency higher than the first frequency, motion information regarding the motion of the mobile object generated by a sensor unit mounted on the mobile object and second time information indicating when the motion information was generated, and generates a second image based on the first image and the motion information.
- 1 Information processing system, 10 Mobile object, 11 Communication unit, 12 Control unit, 13 Imaging unit, 14 Sensor unit, 100 Remote control device, 110 Communication unit, 120 Control unit
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Hardware Design (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Computer Graphics (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Description
[1. Introduction]
Conventionally, in the remote operation of a mobile object, the delay in transmitting video information to a remote location has been a problem. For example, a delay in video transmission delays the discovery of obstacles. A video delay also makes it difficult to perform timing-critical operations, such as changing direction while traveling. Furthermore, precise remote operation requires high-resolution video, but transmitting high-resolution video under constrained transmission capacity forces the transmission frequency (frame rate) to be kept relatively low.
FIG. 1 is a diagram showing a configuration example of an information processing system 1 according to an embodiment of the present disclosure. In the following, a case where the information processing device according to the embodiment of the present disclosure is the remote control device 100 will be described. The information processing system 1 includes a mobile object 10 and a remote control device 100. The mobile object 10 and the remote control device 100 are communicably connected wirelessly via a predetermined network N. In the following, the term "image" is a concept that includes both video and still images.
FIG. 8 is a flowchart showing an information processing procedure performed by the remote control device according to the embodiment of the present disclosure. The remote control device 100 determines whether a new video frame exists (step S1). When the remote control device 100 determines that a new video frame exists (step S1; Yes), it performs acquisition processing for the new video frame. Specifically, the remote control device 100 updates the camera image arranged in the virtual space (step S2). Subsequently, the remote control device 100 compares the captured video with a predicted video having a close timestamp and corrects the prediction function (step S3). Subsequently, the remote control device 100 determines whether it is the timing to start rendering, calculated backward from the time at which the remote operator views the video (step S4). On the other hand, when the remote control device 100 does not determine that a new video frame exists (step S1; No), it proceeds directly to determining whether it is the timing to start rendering, calculated backward from the time at which the remote operator views the video (step S4).
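One iteration of the FIG. 8 flow (steps S1 to S4) can be sketched with each step injected as a callable. This decomposition is purely illustrative; the patent describes the steps, not this structure, and all parameter names are hypothetical.

```python
def process_tick(poll_frame, update_image, correct_prediction,
                 time_to_render, render):
    """Run one loop iteration of the FIG. 8 procedure.

    poll_frame         -> S1: returns a new video frame, or None
    update_image       -> S2: place the camera image in the virtual space
    correct_prediction -> S3: correct the prediction function against the
                              close-timestamp predicted video
    time_to_render     -> S4: render timing, back-calculated from the
                              operator's viewing time
    render             ->     generate/display the second image
    """
    frame = poll_frame()
    if frame is not None:          # S1; Yes branch
        update_image(frame)        # S2
        correct_prediction(frame)  # S3
    if time_to_render():           # S4 (reached on both branches)
        render()
```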
The remote control device 100 according to the embodiment described above may be implemented in various forms other than the above embodiment. Therefore, other embodiments of the remote control device 100 will be described below. The same parts as in the embodiment are denoted by the same reference numerals, and their description is omitted.
In the embodiment described above, the remote control device 100 estimates view frustum position information regarding the position and orientation of the view frustum at the time when the operator views the second image, and generates the second image based on the estimated position and orientation of the view frustum; however, the remote control device 100 may generate the second image without using a view frustum. For example, the control unit 120 estimates, based on the motion information, the distance the vehicle travels in the traveling direction in a predetermined time. The control unit 120 may then generate, as the second image, a region of the first image reduced according to the distance the vehicle travels.
FIG. 9 is a diagram for explaining camera vibration handling according to a modification of the embodiment of the present disclosure. In the comparative technique shown in FIG. 9, when the vehicle bumps (moves vertically) on the road, the camera mounted on the vehicle also bumps, so the image captured by the camera contains vertical vibration. Moreover, since the video viewed by the remote operator arrives late, the remote operator sees the vibration delayed by the video latency.
For example, as shown in FIG. 3, when the vehicle travels straight in the traveling direction, the view frustum approaches the first image arranged in the virtual space, so the area of the image cut out by the view frustum becomes smaller over time. Because the second image displayed to the remote operator is generated by enlarging an image of small area, the pixels therefore become coarser over time. Accordingly, the control unit 120 generates each of the plurality of second images generated from one first image such that the resolution of each falls within a predetermined range. For example, the control unit 120 calculates the resolution of the last second image generated from one first image. The control unit 120 then generates the first second image with its resolution coarsened so that the difference between its resolution and that of the last second image falls within a predetermined range. Similarly, the control unit 120 generates the second and subsequent second images with their resolutions coarsened so that the difference between the resolution of each and that of the last second image falls within the predetermined range.
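The coarsening rule above (keep every second image's resolution within a fixed margin of the last, coarsest one) can be sketched as follows. The `max_gap` parameter and the list-of-scales representation are assumptions for illustration, not values from the patent.

```python
def target_resolutions(native_res, scales, max_gap=0.2):
    """Coarsen the earlier, sharper second images so each stays within
    `max_gap` (relative) of the LAST second image's resolution.

    native_res : resolution of the first image
    scales     : effective resolution of each crop relative to native;
                 the last entry is the coarsest (smallest crop area)
    """
    last = native_res * scales[-1]           # resolution of the final image
    cap = last * (1 + max_gap)               # highest resolution allowed
    out = []
    for s in scales:
        res = native_res * s
        out.append(min(res, cap))            # coarsen anything above the cap
    return out
```

Capping the sharper early images toward the final one keeps the perceived coarseness roughly constant across the sequence, instead of degrading visibly frame by frame.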
When displaying the second image on a screen, the control unit 120 performs control so that the second image is displayed in a display manner that allows the operator to visually recognize that the second image is being displayed. For example, when displaying the second image on the screen, the control unit 120 displays it in a manner the operator can recognize, such as surrounding the second image with a red frame.
When the sensor unit 14 detects an object located within a predetermined range from the mobile object, the control unit 120 performs control to display, on the video output unit, information notifying the operator that the object has been detected. For example, the control unit 120 performs control to visually emphasize the appearance area of the object on the video output unit.
As described above, the information processing device according to the embodiment of the present disclosure (the remote control device 100 in the embodiment) includes a control unit (the control unit 120 in the embodiment). The control unit receives, at a first frequency, a first image generated by an imaging unit (the imaging unit 13 in the embodiment) mounted on a mobile object (the mobile object 10 in the embodiment) and first time information indicating when the first image was generated; receives, at a second frequency higher than the first frequency, motion information regarding the motion of the mobile object generated by a sensor unit (the sensor unit 14 in the embodiment) mounted on the mobile object and second time information indicating when the motion information was generated; and generates a second image based on the first image and the motion information. For example, the second image is viewed by an operator who remotely controls the mobile object.
Information devices such as the remote control device 100 according to the embodiments described above are realized by, for example, a computer 1000 configured as shown in FIG. 10. FIG. 10 is a hardware configuration diagram showing an example of a computer 1000 that implements the functions of an information processing device such as the remote control device 100. The remote control device 100 according to the embodiment will be described below as an example. The computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. Each part of the computer 1000 is connected by a bus 1050.
Claims (12)
- An information processing device comprising a control unit that receives, at a first frequency, a first image generated by an imaging unit mounted on a mobile object and first time information indicating when the first image was generated, receives, at a second frequency higher than the first frequency, motion information regarding the motion of the mobile object generated by a sensor unit mounted on the mobile object and second time information indicating when the motion information was generated, and generates a second image based on the first image and the motion information.
- The information processing device according to claim 1, wherein the second image is viewed by an operator who remotely controls the mobile object.
- The information processing device according to claim 2, wherein the control unit estimates, based on the motion information, mobile object position information regarding the position and orientation of the mobile object at the time when the operator views the second image, and generates the second image based on the first image and the mobile object position information.
- The information processing device according to claim 3, wherein the control unit estimates, based on the motion information, view frustum position information regarding the position and orientation of a view frustum at the time when the operator views the second image, and generates, as the second image, a region of the first image arranged in a virtual space that intersects the view frustum placed in the virtual space at the position and orientation indicated by the view frustum position information.
- The information processing device according to claim 1, wherein the control unit generates each of a plurality of second images generated from one first image such that the resolution of each falls within a predetermined range.
- The information processing device according to claim 1, wherein the control unit corrects blur in the first image caused by vertical movement of the mobile object, and adds processing to the second image corresponding to vibration caused by the vertical movement of the mobile object.
- The information processing device according to claim 2, further comprising an output unit that displays the second image, wherein, when displaying the second image on the output unit, the control unit performs control so that the second image is displayed on the output unit in a display manner that allows the operator to visually recognize that the second image is being displayed.
- The information processing device according to claim 2, further comprising an output unit that displays the second image, wherein, when the sensor unit detects an object located within a predetermined range from the mobile object, the control unit performs control to display, on the output unit, information notifying the operator that the object has been detected.
- The information processing device according to claim 8, wherein the control unit performs control to visually emphasize the appearance area of the object on the output unit.
- The information processing device according to claim 1, wherein the control unit performs control to display an image corresponding to a dashboard of the mobile object superimposed on the second image.
- An information processing method executed by an information processing device, the method comprising: receiving, at a first frequency, a first image generated by an imaging unit mounted on a mobile object and first time information indicating when the first image was generated; receiving, at a second frequency higher than the first frequency, motion information regarding the motion of the mobile object generated by a sensor unit mounted on the mobile object and second time information indicating when the motion information was generated; and generating a second image based on the first image and the motion information.
- An information processing system comprising a mobile object and an information processing device, wherein the information processing device includes a control unit that receives, at a first frequency, a first image generated by an imaging unit mounted on the mobile object and first time information indicating when the first image was generated, receives, at a second frequency higher than the first frequency, motion information regarding the motion of the mobile object generated by a sensor unit mounted on the mobile object and second time information indicating when the motion information was generated, and generates a second image based on the first image and the motion information.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024517938A JPWO2023210288A1 (ja) | 2022-04-25 | 2023-04-06 | |
| US18/857,508 US20250259453A1 (en) | 2022-04-25 | 2023-04-06 | Information processing device, information processing method, and information processing system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022-071465 | 2022-04-25 | ||
| JP2022071465 | 2022-04-25 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023210288A1 true WO2023210288A1 (ja) | 2023-11-02 |
Family
ID=88518692
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/014209 Ceased WO2023210288A1 (ja) | 2022-04-25 | 2023-04-06 | 情報処理装置、情報処理方法および情報処理システム |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20250259453A1 (ja) |
| JP (1) | JPWO2023210288A1 (ja) |
| WO (1) | WO2023210288A1 (ja) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2000330677A (ja) * | 1999-05-17 | 2000-11-30 | Internatl Business Mach Corp <Ibm> | 画像の表示、選択方法およびコンピュータ・システム |
| JP2005107967A (ja) * | 2003-09-30 | 2005-04-21 | Canon Inc | 画像合成装置及び方法 |
| WO2018155159A1 (ja) * | 2017-02-24 | 2018-08-30 | パナソニックIpマネジメント株式会社 | 遠隔映像出力システム、及び遠隔映像出力装置 |
| WO2020111133A1 (ja) * | 2018-11-29 | 2020-06-04 | 住友電気工業株式会社 | 交通支援システム、サーバ及び方法、車載装置及びその動作方法、コンピュータプログラム、記録媒体、コンピュータ、並びに半導体集積回路 |
| JP2021533420A (ja) * | 2019-03-15 | 2021-12-02 | アポステラ ゲーエムベーハー | 振動補償の装置および振動補償の方法 |
-
2023
- 2023-04-06 WO PCT/JP2023/014209 patent/WO2023210288A1/ja not_active Ceased
- 2023-04-06 US US18/857,508 patent/US20250259453A1/en active Pending
- 2023-04-06 JP JP2024517938A patent/JPWO2023210288A1/ja active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2023210288A1 (ja) | 2023-11-02 |
| US20250259453A1 (en) | 2025-08-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6557768B2 (ja) | 追尾制御装置、追尾制御方法、追尾制御プログラム、及び、自動追尾撮影システム | |
| JP6095985B2 (ja) | 再生装置、再生システム、再生方法及びプログラム | |
| US9240047B2 (en) | Recognition apparatus, method, and computer program product | |
| EP3979617B1 (en) | Shooting anti-shake method and apparatus, terminal and storage medium | |
| JP2003051016A (ja) | 接近検出装置、接近検出方法、及び接近検出プログラム | |
| KR101896715B1 (ko) | 주변차량 위치 추적 장치 및 방법 | |
| JP5187292B2 (ja) | 車両周辺監視装置 | |
| JP7357150B2 (ja) | ジョイントローリングシャッター補正及び画像ぼけ除去 | |
| JP5418661B2 (ja) | 車両周辺監視装置 | |
| JP2019121941A (ja) | 画像処理装置および方法、並びに画像処理システム | |
| WO2016013409A1 (ja) | 制御装置、制御方法、プログラム、および制御システム | |
| JP6570904B2 (ja) | 補正情報出力装置、画像処理装置、補正情報出力方法、撮像制御システム及び移動体制御システム | |
| WO2021229761A1 (ja) | 撮像システム、撮像方法、及びコンピュータプログラム | |
| JP4962304B2 (ja) | 歩行者検出装置 | |
| JP2019145958A (ja) | 撮像装置およびその制御方法ならびにプログラム | |
| KR101520049B1 (ko) | 이미지 캡쳐 디바이스의 개선된 제어 | |
| KR101714759B1 (ko) | 광각 카메라 영상 처리 장치 및 방법 | |
| WO2023210288A1 (ja) | 情報処理装置、情報処理方法および情報処理システム | |
| JP5263519B2 (ja) | 表示制御システム、表示制御方法、及び表示制御プログラム | |
| JP2016021712A (ja) | 画像処理装置、及び、運転支援システム | |
| WO2021024627A1 (ja) | 情報処理装置、移動体、情報処理システム、情報処理方法及びプログラム | |
| CN115683084B (zh) | 信息处理设备、系统及控制方法和记录介质 | |
| JP5395373B2 (ja) | 周辺監視装置 | |
| JP2021111262A (ja) | 情報処理装置 | |
| JP7775151B2 (ja) | 情報処理装置、情報処理方法、および情報処理プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23796034 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2024517938 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 18857508 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 23796034 Country of ref document: EP Kind code of ref document: A1 |
|
| WWP | Wipo information: published in national office |
Ref document number: 18857508 Country of ref document: US |