WO2023053444A1 - Mobile body control system, mobile body control method, and image communication device - Google Patents
Mobile body control system, mobile body control method, and image communication device
- Publication number
- WO2023053444A1 (PCT/JP2021/036427)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- depth information
- image
- amount
- moving body
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/20—Means for actuating or controlling masts, platforms, or forks
- B66F9/24—Electrical devices or systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/243—Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
- G05D1/2435—Extracting 3D information
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/617—Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
- G05D1/622—Obstacle avoidance
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C15/00—Arrangements characterised by the use of multiplexing for the transmission of a plurality of signals over a common path
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q9/00—Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2105/00—Specific applications of the controlled vehicles
- G05D2105/20—Specific applications of the controlled vehicles for transportation
- G05D2105/28—Specific applications of the controlled vehicles for transportation of freight
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2107/00—Specific environments of the controlled vehicles
- G05D2107/70—Industrial sites, e.g. warehouses or factories
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2109/00—Types of controlled vehicles
- G05D2109/10—Land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2111/00—Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
- G05D2111/10—Optical signals
Definitions
- the present invention relates to a mobile body control system, a mobile body control method, and an image communication device.
- Patent Document 1 discloses that an information processing device, provided inside or outside a robot, derives information on the position or posture of the robot based on the sensing results of sensors mounted on the robot, and controls the robot.
- One aspect of the present invention has been made in view of the above problem, and one purpose is to provide a technology capable of controlling the amount of depth information so that it is appropriate for the operation of a mobile object.
- A moving object control system includes acquisition means for acquiring the operation details of a moving object, and control means for controlling the amount of depth information acquired from a sensor according to the operation details of the moving object.
- A mobile object control method acquires the operation details of a mobile object, and controls the amount of depth information acquired from a sensor according to the acquired operation details of the mobile object.
- An image communication apparatus includes receiving means for receiving parameters related to changes in the amount of information, determined according to the operation details of a moving object, and control means for controlling the amount of depth information acquired from a sensor according to the parameters.
- It is thereby possible to control the information amount of depth information so that it is appropriate according to the operation content of a mobile object.
- FIG. 1 is a block diagram showing the functional configuration of a mobile body control system according to exemplary Embodiment 1 of the present invention.
- FIG. 2 is a flowchart showing the flow of a mobile body control method.
- FIG. 3 is a block diagram showing the functional configuration of an image communication device according to exemplary Embodiment 2 of the present invention.
- FIG. 4 is a block diagram showing the functional configuration of a server device according to exemplary Embodiment 3 of the present invention.
- FIG. 5 is a block diagram showing the configuration of a mobile body control system according to exemplary Embodiment 4 of the present invention.
- FIG. 6 is a diagram for explaining the quantization range of depth information.
- FIG. 7 is a diagram showing 16-bit depth information and 8-bit grayscale information.
- FIG. 8 is a diagram showing another example of quantization processing.
- FIG. 9 is a flowchart showing the flow of a mobile body control method according to exemplary Embodiment 4 of the present invention.
- FIG. 10 is a block diagram showing the configuration of a mobile body control system according to exemplary Embodiment 5 of the present invention.
- FIG. 11 is a flowchart showing the flow of a mobile body control method according to exemplary Embodiment 5 of the present invention.
- FIG. 12 is a diagram showing an example of the hardware of a computer.
- the mobile body control system 100 changes the amount of depth information acquired from the sensor according to the motion of the mobile body.
- FIG. 1 is a block diagram showing the configuration of the mobile body control system 100.
- the mobile body control system 100 includes an acquisition unit 11 and a control unit 12.
- the acquisition unit 11 is a configuration that implements acquisition means in this exemplary embodiment.
- the control unit 12 is a configuration that implements control means in this exemplary embodiment.
- The amount of depth information may also be changed according to the throughput of the network to which the mobile object is connected.
- For example, high-resolution (HD) RAW image data at 30 fps (frames per second) consumes a network bandwidth of about 660 Mbps for color images and about 440 Mbps for depth images. Therefore, if there is no headroom in the communication throughput of the network, it is necessary to reduce the information amount of the color image, the depth image, or both.
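As a rough check of these figures, the raw bandwidth of an uncompressed stream follows from resolution, bit depth, and frame rate. The sketch below assumes 1280×720 for "HD", 24-bit color, and 16-bit depth; those specific values are assumptions not stated in the text.

```python
def raw_bandwidth_mbps(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    """Raw (uncompressed) stream bandwidth in Mbps (1 Mbps = 10**6 bit/s)."""
    return width * height * bits_per_pixel * fps / 1e6

# Assumed HD resolution: 1280x720.
color_mbps = raw_bandwidth_mbps(1280, 720, 24, 30)  # 24-bit RGB  -> ~663.6 Mbps
depth_mbps = raw_bandwidth_mbps(1280, 720, 16, 30)  # 16-bit depth -> ~442.4 Mbps
```

Both results are consistent with the ~660 Mbps and ~440 Mbps figures quoted above.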
- the sensor acquires depth information.
- the sensors are, for example, RGB-D cameras with depth sensors, 3D LiDAR (Light Detection and Ranging), TOF (Time-Of-Flight) sensors, and the like.
- In addition to the color image, a depth image representing the depth of the color image may be acquired. Note that the depth information indicates the distance from the sensor to surrounding objects.
- One or a plurality of sensors are installed on the mobile body, and the acquisition unit 11 acquires color images and depth information output from the sensors.
- the depth information expresses the depth of each pixel of the color image, for example, in 16 bits (0 to 65535 [mm]).
- the control unit 12 can change the amount of depth information by compressing the depth image or reducing the amount of data of the depth information. For example, the control unit 12 can change the amount of depth information by converting the depth information from 16 bits to 8 bits.
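A minimal sketch of the 16-bit-to-8-bit conversion mentioned above, using a full-range linear mapping; the conversion actually used in the embodiment (a range-limited quantization) may differ.

```python
def depth16_to_depth8(d16: int) -> int:
    """Map a 16-bit depth sample (0-65535 mm) to 8 bits (0-255).

    Full-range linear mapping: the data volume is halved, but each
    8-bit step then spans about 257 mm, so fine depth detail is lost.
    """
    if not 0 <= d16 <= 65535:
        raise ValueError("expected a 16-bit depth value")
    return (d16 * 255) // 65535
```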
- Parameters that affect the compression rate, such as quantization parameters, are called compression parameters.
- the acquisition unit 11 acquires the motion content of the mobile object.
- mobile objects include automated guided forklifts (AGF), automated guided vehicles (AGV), and autonomous mobile robots (AMR).
- SLAM: Simultaneous Localization And Mapping.
- the moving body may have a function of automatically generating a travel route, and may automatically avoid obstacles without being bound by a fixed route.
- the work content is roughly divided into two: “travel” and “cargo handling”.
- When an automatic forklift truck "travels", it moves to a destination while estimating its own position using VSLAM, which mainly uses color images. Therefore, by increasing the amount of information in the color image and decreasing the amount of information in the depth image, self-position estimation can be performed with higher accuracy.
- When the automatic forklift "travels" and there is an obstacle in its direction of travel, the automatic forklift must stop or avoid the obstacle. When avoiding obstacles, the automatic forklift needs to acquire depth information of the obstacle with high precision, so the amount of depth information is increased and the amount of color image information is reduced.
- When an automatic forklift performs "cargo handling", it mainly performs pallet recognition, rack recognition, QR marker recognition, and the like while loading and unloading.
- For pallets and racks, the automatic forklift needs to acquire depth information with high accuracy, so the amount of information in the depth image is increased and the amount of information in the color image is reduced.
- For QR markers, the automatic forklift needs to process the image of the QR marker, so the information amount of the color image is increased and the information amount of the depth image is decreased.
- The control unit 12 may reduce the depth information when the moving body moves along a mark, and may increase the depth information when the moving body deviates from the assumed route due to actions such as obstacle avoidance. Further, when the moving body conveys an article, the control unit 12 may increase the depth information when the article is delivered.
- the control unit 12 controls the amount of depth information acquired from the sensor according to the operation details of the mobile object. For example, when the mobile object is an automatic forklift and the automatic forklift is "running", the control unit 12 reduces the amount of depth information. Also, when there is an obstacle in the traveling direction of the automatic forklift, the control unit 12 increases the amount of depth information.
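The rules described above for the automatic forklift can be summarized as a small decision table. The function name and the concrete allocation ratios below are illustrative assumptions; the text only specifies which of the two streams to favor for each operation.

```python
def allocation_for(operation: str, obstacle_ahead: bool = False) -> dict:
    """Return assumed bit-rate allocation ratios for the color image and
    the depth information, given the mobile object's current operation."""
    if operation == "travel":
        if obstacle_ahead:
            # Obstacle avoidance needs precise depth of the obstacle.
            return {"color": 0.3, "depth": 0.7}
        # VSLAM self-position estimation relies mainly on color images.
        return {"color": 0.7, "depth": 0.3}
    if operation in ("pallet_recognition", "rack_recognition"):
        # Pallet/rack recognition needs high-accuracy depth information.
        return {"color": 0.3, "depth": 0.7}
    if operation == "qr_marker_recognition":
        # QR markers are read from the color image.
        return {"color": 0.7, "depth": 0.3}
    return {"color": 0.5, "depth": 0.5}
```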
- each function of the mobile object control system 100 may be implemented on the cloud.
- The acquisition unit 11 and the control unit 12 may be implemented in one device or in separate devices. When implemented in separate devices, information is transmitted and received between them via a communication network to advance processing.
- Since the control unit 12 controls the amount of depth information according to the operation of the mobile body, the information amount of the depth image can be suitably controlled.
- FIG. 2 is a flow diagram showing the flow of the moving body control method. As shown in FIG. 2, the moving body control method includes steps S1 to S2.
- the acquisition unit 11 acquires the motion content of the moving object (S1). Then, the control unit 12 controls the information amount of the depth information acquired from the sensor according to the acquired operation contents of the moving object (S2).
- step S2 the amount of depth information acquired from the sensor is changed according to the acquired motion of the mobile body.
- Thus, the amount of depth information can be controlled so that it is appropriate for mobile body control.
- the image communication apparatus is mounted on, for example, a mobile body and changes the amount of depth information acquired by a sensor.
- FIG. 3 is a block diagram showing the functional configuration of the image communication device 2.
- The image communication device 2 includes a receiver 21 and a controller 22.
- the receiving unit 21 is a configuration that implements receiving means in this exemplary embodiment.
- the control unit 22 is a configuration that implements control means in this exemplary embodiment.
- the receiving unit 21 receives parameters related to changing the amount of information, which are determined according to the operation details of the mobile object. As will be described later, the parameters for changing the amount of information are transmitted from the server device. Parameters related to the change in information amount include the bit rate allocation amount for color images, the bit rate allocation amount for depth information, and the like.
- the work content is roughly divided into two: “travel” and “cargo handling”.
- “Running” includes “normal running”, “avoidance of obstacles”, and the like.
- Cargo handling includes “pallet recognition”, “rack recognition”, “QR marker recognition” and the like.
- The server device acquires the operation details of the automatic forklift, determines parameters including the bit rate allocation amount for color images and the bit rate allocation amount for depth images according to those operation details, and transmits the parameters to the image communication device 2.
- The control unit 22 changes the amount of depth information acquired from the sensor according to the parameters. For example, the control unit 22 changes the compression rate of the depth information according to the bit rate allocation amount of the depth information included in the parameters received by the receiving unit 21, so that the information amount of the depth information corresponds to that bit rate allocation amount.
- Since the control unit 22 changes the information amount of the depth information acquired from the sensor according to parameters related to changes in the amount of information, which are determined according to the operation of the moving object, it is possible to control the information amount of the depth information so that it is appropriate for mobile body control.
- the server device in this exemplary embodiment acquires the operation details of the mobile object, and determines parameters for changing the amount of depth information according to the operation details of the mobile object.
- FIG. 4 is a block diagram showing the functional configuration of the server device 3.
- The server device 3 includes an acquisition section 31 and a transmission section 32.
- the acquisition unit 31 is a configuration that implements acquisition means in this exemplary embodiment.
- the transmission unit 32 is a configuration that implements transmission means in this exemplary embodiment.
- The acquisition unit 31 acquires the operation content of the mobile object. If the moving body is, for example, an automatic forklift, the work content is roughly divided into two: "travel" and "cargo handling." "Travel" includes "normal travel", "obstacle avoidance", and the like. "Cargo handling" includes "pallet recognition", "rack recognition", "QR marker recognition", and the like. The work content of the mobile object is managed by another server device or the like, and the acquisition unit 31 acquires, from that other server device or the like, information about which of these the work content of the mobile object is.
- the transmission unit 32 determines a parameter for changing the amount of depth information acquired from the sensor according to the operation content of the mobile object, and transmits the parameter to the mobile object.
- Parameters related to the change in information amount include the bit rate allocation amount for color images, the bit rate allocation amount for depth information, and the like.
- The transmission unit 32 determines parameters, including the bit rate allocation amount for color images and the bit rate allocation amount for depth information, from the communication throughput between the mobile unit and the server device 3 and from the work content of the mobile unit, and transmits the parameters to the mobile unit.
- The transmission unit 32 determines parameters for changing the amount of depth information acquired from the sensor according to the motion of the moving object, and transmits the parameters to the moving object. Therefore, by receiving the parameters on the moving body side and changing the information amount of the depth image representing the depth of the color image acquired from the sensor according to the parameters, the information amount of the depth information can be controlled so that it is appropriate for moving body control.
- FIG. 5 is a block diagram showing the configuration of the mobile control system 100A.
- the mobile control system 100A includes an image communication device 2A and a server device 3A.
- The image communication device 2A includes a receiver 21, a controller 22, and a transmitter 23.
- the receiving unit 21 is a configuration that implements the receiving means of the image communication apparatus in this exemplary embodiment.
- the control unit 22 is a configuration that implements control means in this exemplary embodiment.
- the transmission unit 23 is a configuration that implements the transmission means of the image communication apparatus in this exemplary embodiment.
- the server device 3A also includes an acquisition unit 31, a transmission unit 32, an image processing unit 35, and a reception unit 37.
- the acquisition unit 31 is a configuration that implements acquisition means in this exemplary embodiment.
- the transmission unit 32 is a configuration that realizes the control means or transmission means of the server device in this exemplary embodiment.
- the image processing unit 35 is a configuration that implements image processing means in this exemplary embodiment.
- the acquisition unit 31 of the server device 3A acquires the operation content of the mobile object.
- the operation contents of the mobile object may be managed by another server device or the like, for example.
- the acquisition unit 31 of the server device 3A may acquire information about the work content of the mobile object from another server device or the like.
- the server device 3A may include a planning unit that plans the motion of the mobile object and a holding unit that holds the action plan of the mobile object. Further, the acquiring unit 31 may acquire the work content of the moving body from the planning unit or the holding unit.
- the transmission unit 32 of the server device 3A changes the approximation accuracy when approximating the depth information according to the operation content of the mobile body, and notifies the control unit 22 of the approximation accuracy after the change. Specifically, the transmission unit 32 changes the approximation accuracy when approximating the depth information, and transmits parameters including the changed approximation accuracy to the reception unit 21 of the image communication device 2A.
- Approximation accuracy indicates the degree of error when approximating depth information. For example, if the quantization range, which will be described later, is narrower, the error will be smaller, and the approximation accuracy will be higher. Also, the wider the quantization range, the larger the error, and the lower the approximation accuracy.
- the control unit 22 of the image communication device 2A approximates the depth information according to the approximation accuracy received by the receiving unit 21. For example, the control unit 22 of the image communication device 2A quantizes the depth information by narrowing the quantization range, which will be described later, when approximating the depth information so as to increase the approximation accuracy. Further, when approximating the depth information so that the approximation accuracy is low, the control unit 22 of the image communication device 2A quantizes the depth information by widening the quantization range, which will be described later.
- the reception unit 21 of the image communication device 2A receives parameters including approximation accuracy when approximating depth information, which are determined according to the operation content of the mobile object. Note that the approximation accuracy when approximating this depth information is sometimes called a quantization range.
- the control unit 22 reduces the information amount of the depth information by approximating the depth information according to the approximation accuracy.
- the approximation accuracy may be the approximation accuracy after being changed by the server device 3A.
- FIG. 6 is a diagram for explaining the quantization range of depth information.
- the depth information expresses the distance (depth) to the object corresponding to each pixel of the color image in 16 bits (0 to 65535 [mm]), for example.
- The effective depth is defined as the depth range from the lower limit D_min to the upper limit D_max.
- In FIG. 6, the horizontal axis represents the effective depth of the RGB-D camera, and the vertical axis represents the sampled depth value.
- a 16-bit depth image is indicated by a solid line, and an 8-bit depth image is indicated by a dotted line.
- Quantization in this specification includes changing the degree of discreteness of discrete values and downsampling information expressed by discrete values.
- depth information is quantized within a quantization range.
- The scale factor s for the quantization range is given by the following formula (Equation 1): s = 255 / (D_max − D_min).
- FIG. 7 is a diagram showing 16-bit depth information and 8-bit grayscale information. As shown in FIG. 7, when 16-bit depth information is quantized and converted into 8-bit grayscale information, the graph becomes step-like, and 8 bits cause a larger quantization error than 16 bits. This quantization error e_s is given by the following equation (Equation 3): e_s = 1 / (2s). The narrower the quantization range, the smaller the quantization error.
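A sketch of the range-limited quantization described above. Since the original equations are not reproduced in this text, the standard linear forms s = 255 / (D_max − D_min) and e_s = 1 / (2s) are assumed here.

```python
def quantize_depth(d_mm: float, d_min: float, d_max: float) -> int:
    """Quantize a depth value [mm] within [d_min, d_max] to 8 bits.

    Values outside the quantization range are clamped to its limits.
    """
    s = 255.0 / (d_max - d_min)                      # assumed (Equation 1)
    clamped = min(max(d_mm, d_min), d_max)
    return int(round((clamped - d_min) * s))

def max_quantization_error(d_min: float, d_max: float) -> float:
    """Worst-case error [mm]; shrinks as the quantization range narrows."""
    s = 255.0 / (d_max - d_min)
    return 1.0 / (2.0 * s)                           # assumed (Equation 3)
```

With the full 0–65535 mm range the worst-case error is about 128.5 mm; narrowing the range to 500–3000 mm brings it down to roughly 4.9 mm, which illustrates why a narrow quantization range suits tasks such as pallet recognition.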
- FIG. 8 is a diagram showing another example of quantization processing. As shown in FIG. 8, the 16-bit depth image may be transformed using a sigmoid function instead of the 8-bit linear transformation.
- The sigmoid function is given by the following formula (Equation 4).
- In this case, the quantization range is not explicitly defined by the lower limit D_min and the upper limit D_max as in the linear transformation, but is instead controlled by the parameter a of the sigmoid function shown in (Equation 4).
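A sketch of the sigmoid alternative. Since (Equation 4) is not reproduced here, a generic logistic form centered at an assumed depth d0_mm is used; in this form the parameter a plays the role described above, with a larger a concentrating the 8-bit codes in a narrower depth band around the center.

```python
import math

def sigmoid_quantize(d_mm: float, d0_mm: float, a: float) -> int:
    """Map a depth value [mm] to 8 bits through a sigmoid centered at d0_mm.

    Depths near d0_mm get fine quantization steps; depths far from the
    center are compressed into the flat tails of the curve.
    """
    return int(round(255.0 / (1.0 + math.exp(-a * (d_mm - d0_mm)))))
```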
- the image processing unit 35 of the server device 3A detects obstacles based on the depth information. For example, when the image processing unit 35 detects an obstacle by recognizing a color image while the mobile body is running, it notifies the transmitting unit 32 of the recognition result.
- the transmission unit 32 of the server device 3A changes the approximation accuracy of the depth information according to the motion content of the mobile body and the detection result of the obstacle.
- the receiving unit 21 of the image communication device 2A receives parameters including the approximation accuracy of the depth information, which are determined according to the motion of the moving object and the obstacles detected based on the depth information.
- The control unit 22 performs quantization processing by limiting the quantization range of the depth information, thereby reducing the quantization error.
- The image communication device 2A may include obstacle detection means for detecting an obstacle based on the image acquired from the sensor, and the approximation accuracy of the depth information may be changed according to the detection result.
- the control unit 22 can more efficiently change the information amount of the depth information by compressing the depth information after performing the quantization process.
- control unit 22 controls the compression ratio of the image acquired from the sensor according to the operation details of the moving object. For example, when the moving body is an automatic forklift and the automatic forklift is "running", the control unit 22 changes the compression rate of the color image so as to increase the information amount of the color image.
- the transmission unit 32 of the server device 3A changes the ratio of the bit rates allocated to the image and the depth information according to the operation details of the mobile object.
- the reception unit 21 of the image communication device 2A receives parameters including the ratio of bit rates to be assigned to images and depth information, which are determined according to the operation details of the mobile object.
- the transmission unit 23 transmits the depth information quantized and compressed by the control unit 22 and the color image compressed by the control unit 22 to the server device 3A.
- the transmission unit 23 can estimate the communication throughput from the stream delivery information of the color image and the depth information, and acquire the transmittable band.
- When the transmittable band is B (bps) and the bit rate allocation ratio is r, the color image bit rate b_RGB (bps) and the depth image bit rate b_Depth (bps) are given by the following equations: b_RGB = r × B (Equation 5) and b_Depth = (1 − r) × B (Equation 6). Note that the allocation ratio r is controlled within the range of 0 to 1 according to the operation contents of the robot.
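Taking (Equation 5) and (Equation 6) as the natural split b_RGB = r·B and b_Depth = (1 − r)·B, the allocation reduces to a few lines; the function name below is an illustrative assumption.

```python
def split_bandwidth(b_total_bps: float, r: float) -> tuple:
    """Split the transmittable band B between the color and depth streams:
    b_RGB = r * B and b_Depth = (1 - r) * B, with 0 <= r <= 1."""
    if not 0.0 <= r <= 1.0:
        raise ValueError("allocation ratio r must be within [0, 1]")
    return (r * b_total_bps, (1.0 - r) * b_total_bps)

# During "travel", r leans toward the color image used for VSLAM.
b_rgb, b_depth = split_bandwidth(100e6, 0.7)
```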
- the transmission unit 32 of the server device 3A changes the image bit rate and the depth information bit rate according to the communication throughput of the network to which the mobile unit is connected.
- The bit rate of color images is b_RGB (bps), and the bit rate of depth images is b_Depth (bps).
- FIG. 9 is a flow chart showing the flow of the moving body control method. As shown in FIG. 9, the moving body control method includes steps S11 to S17.
- the acquisition unit 31 of the server device 3A acquires the operation content of the mobile object and notifies the transmission unit 32 (S11).
- the operation contents of the mobile object are managed by another server device or the like.
- the acquisition unit 31 of the server device 3A acquires information about the operation details of the moving object from another server device or the like.
- the image processing unit 35 of the server device 3A performs recognition processing for at least one of the color image and the depth information, and notifies the transmission unit 32 of the server device 3A of the recognition result (S12).
- The transmission unit 32 of the server device 3A changes the depth range of the depth image, the bit rate allocation amount of the color image, and the bit rate allocation amount of the depth image according to the operation content of the moving body and the recognition result from the image processing unit 35, includes them in the parameters, and transmits the parameters to the image communication apparatus 2A (S13).
- Upon receiving the parameters from the server device 3A, the reception unit 21 of the image communication device 2A notifies the control unit 22 of the approximation accuracy, the color image bit rate allocation amount, and the depth image bit rate allocation amount.
- the control unit 22 reduces the amount of depth information by quantizing the depth information according to the approximation accuracy (S14).
- The control unit 22 changes the bit rates of the color image and the depth information according to the bit rate allocation amounts received from the server device 3A (S15), and compresses the color image and the depth information according to those bit rates (S16).
- the transmission unit 23 of the image communication device 2A transmits the compressed color image and depth information to the server device 3A (S17).
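As a concrete illustration of steps S11 to S13, the server-side parameter decision can be sketched as follows. The function name, branch conditions, and numeric values are illustrative assumptions, not identifiers or values taken from the disclosure.

```python
def decide_parameters(operation, recognition_result):
    """S13: choose the depth quantization range and the bit-rate allocation
    for the color image and the depth image, based on the operation content
    acquired in S11 and the recognition result produced in S12.
    All branch conditions and values here are illustrative assumptions."""
    if recognition_result == "obstacle_detected":
        # Depth accuracy matters most near an obstacle: narrow the
        # quantization range and give more of the bit rate to depth.
        return {"d_min": 2000, "d_max": 4000, "rgb_ratio": 0.3, "depth_ratio": 0.7}
    if operation == "traveling":
        # While traveling, a wide depth range and a richer color image
        # are assumed to be more useful.
        return {"d_min": 200, "d_max": 10_000, "rgb_ratio": 0.7, "depth_ratio": 0.3}
    # Default: wide range, even split.
    return {"d_min": 200, "d_max": 10_000, "rgb_ratio": 0.5, "depth_ratio": 0.5}

params = decide_parameters("traveling", "obstacle_detected")
```

The returned dictionary stands in for the "parameters" the transmission unit 32 would send to the image communication device 2A.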
- each function of the mobile object control system 100A may be implemented on the cloud.
- the acquisition unit 31 and the transmission unit 32 may be implemented as one device, and the image processing unit 35 and the reception unit 37 may be implemented as one device. These units may be implemented in a single device or in separate devices. When implemented in separate devices, the units exchange information via a communication network to carry out the processing.
- the transmission unit 32 of the server device 3A changes the approximation accuracy of the depth information according to the operation content of the mobile body, and notifies the control unit 22 of the changed approximation accuracy. Therefore, the control unit 22 of the image communication device 2A can reduce the amount of depth information by quantizing the depth information according to the approximation accuracy.
- the transmission unit 32 of the server device 3A also changes the approximation accuracy of the depth information according to the operation content of the moving body and the recognition result from the image processing unit 35, and transmits the changed approximation accuracy to the control unit 22. Therefore, the control unit 22 of the image communication device 2A can reduce the amount of depth information even more suitably by quantizing the depth information according to the approximation accuracy.
- since the control unit 22 of the image communication device 2A changes the amount of depth information by changing the compression rate of the depth information, the amount of depth information can be set according to the bit rate allocation ratio of the depth information. Likewise, since the control unit 22 changes the amount of color-image information by changing the compression rate of the color image according to the operation content of the mobile body, the amount of color-image information can be set according to the bit rate allocation ratio of the color image.
- since the transmission unit 32 of the server device 3A changes the bit rate of the color image and the bit rate of the depth information according to the communication throughput of the transmission unit 23 of the image communication device 2A, the amounts of color-image information and depth information can be set according to the throughput.
- FIG. 10 is a block diagram showing the configuration of the mobile control system 100B.
- the mobile body control system 100B includes an image communication device 2B and a server device 3B.
- the image communication device 2B includes a receiving unit 21, a changing unit 22, a transmitting unit 23, an RGB image acquisition unit 24, a depth image acquisition unit 25, an RGB compression unit 26, a depth compression unit 27, and a quantization information addition unit 28.
- the changing unit 22 also includes a quantization range changing unit 221 and a compression parameter changing unit 222.
- the transmission unit 23 also includes an RGB transmission unit 231 and a depth transmission unit 232.
- the server device 3B includes an acquisition unit 31, a transmission unit 32, an image processing unit 35, a throughput measurement unit 36, and a reception unit 37.
- the reception unit 37 also includes an RGB receiving/decoding unit 33 and a depth receiving/decoding unit 34.
- the receiving unit 21 of the image communication device 2B receives, from the server device 3B, parameters including the approximation accuracy (quantization range) of the depth information, determined according to the operation content of the moving object, and outputs the parameters to the quantization range changing unit 221 and the quantization information addition unit 28.
- the receiving unit 21 of the image communication device 2B also receives parameters including the allocation ratio between the color-image bit rate and the depth-information bit rate from the server device 3B, and outputs the parameters to the compression parameter changing unit 222.
- the quantization range changing unit 221 changes the quantization range by outputting the quantization range input from the receiving unit 21 to the depth image acquisition unit 25.
- the RGB image acquisition unit 24 acquires a color image from the sensor. Also, the depth image acquisition unit 25 acquires depth information from the sensor. The depth image acquisition unit 25 reduces the amount of depth information by quantizing the depth information according to the quantization range, as described in the fourth exemplary embodiment.
- the RGB compression unit 26 compresses the color image output from the RGB image acquisition unit 24 according to the color-image compression parameter output from the compression parameter changing unit 222, and outputs the compressed color image to the RGB transmission unit 231.
- the RGB transmission unit 231 transmits the color image compressed by the RGB compression unit 26 to the server device 3B.
- the depth compression unit 27 compresses the quantized depth image output from the depth image acquisition unit 25 according to the depth-image compression parameter output from the compression parameter changing unit 222, and outputs the compressed depth image to the depth transmission unit 232.
- the quantization information addition unit 28 adds quantization information to the packet for transmitting the depth information compressed by the depth compression unit 27, and causes the depth transmission unit 232 to transmit the packet with the added quantization information to the server device 3B.
- the acquisition unit 31 of the server device 3B acquires the operation content of the mobile object.
- the operation contents of the mobile object are managed by another server device or the like.
- the acquisition unit 31 of the server device 3B acquires information about the operation content of the mobile object from another server device or the like.
- the transmission unit 32 of the server device 3B changes the approximation accuracy of the depth information according to the operation content of the mobile object, and transmits the changed approximation accuracy to the image communication device 2B.
- the transmission unit 32 also changes the approximation accuracy of the depth information according to the operation content of the moving body and the recognition result from the image processing unit 35, and transmits the changed approximation accuracy to the image communication device 2B.
- the image processing unit 35 performs recognition processing for at least one of the color image and the depth information. For example, when the image processing unit 35 detects an obstacle by recognizing a color image while the mobile body is running, it notifies the transmitting unit 32 of the recognition result.
- the throughput measurement unit 36 measures, for example, the time taken to receive a predetermined number of packets from the image communication device 2B, and computes the communication throughput from the total data amount of those packets and the time taken to receive them. The throughput measurement unit 36 then notifies the transmission unit 32 of the communication throughput.
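The measurement rule above (bits received in a fixed packet window divided by the elapsed time) can be sketched as follows; the class name, window size, and callback interface are assumptions, not taken from the disclosure.

```python
import time

class ThroughputMeter:
    """Sketch of the throughput measurement unit 36: compute throughput
    from the total size of a predetermined number of received packets and
    the time taken to receive them."""

    def __init__(self, window=100):
        self.window = window        # packets per measurement (assumed value)
        self.bytes_received = 0
        self.count = 0
        self.start = None

    def on_packet(self, size_bytes, now=None):
        """Record one received packet; return throughput in bits per second
        once a full window has been observed, otherwise None."""
        now = time.monotonic() if now is None else now
        if self.count == 0:
            self.start = now        # first packet of the window
        self.bytes_received += size_bytes
        self.count += 1
        if self.count < self.window:
            return None
        elapsed = now - self.start
        bps = self.bytes_received * 8 / elapsed
        self.bytes_received, self.count = 0, 0   # start the next window
        return bps

meter = ThroughputMeter(window=2)
meter.on_packet(1500, now=0.0)
throughput = meter.on_packet(1500, now=0.001)    # 3000 bytes in 1 ms
```

The returned value is what the throughput measurement unit 36 would report to the transmission unit 32.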
- the RGB receiving/decoding unit 33 receives the compressed color image from the image communication device 2B and decodes the compressed color image. Also, the depth receiving/decoding unit 34 receives the compressed depth information from the image communication device 2B and decodes the compressed depth information.
- FIG. 11 is a flow diagram showing the flow of the moving body control method. As shown in FIG. 11, the moving body control method includes steps S21 to S28. A case of an RGB image will be described as an example of a color image.
- the acquisition unit 31 of the server device 3B acquires the operation content of the mobile unit and notifies the transmission unit 32 of the operation content.
- the transmission unit 32 changes the allocation ratio between the bit rate of the RGB image and the bit rate of the depth information, and the quantization range, according to the operation content of the mobile body, and notifies the image communication device 2B of the bit rate allocation ratio and the quantization range (S21).
- for example, the bit rate of the RGB image is B1, the bit rate of the depth information is B2, the lower limit Dmin of the quantization range is 200, and the upper limit Dmax of the quantization range is 10,000.
- Upon receiving the bit rate allocation ratio and the quantization range, the receiving unit 21 of the image communication device 2B outputs the quantization range to the quantization range changing unit 221 and outputs the bit rate allocation ratio to the compression parameter changing unit 222.
- the quantization range changing unit 221 takes the quantization range as input and sets it in the depth image acquisition unit 25. Further, the compression parameter changing unit 222 sets the RGB image compression parameter (compression rate) in the RGB compression unit 26, and sets the depth image compression parameter (compression rate) in the depth compression unit 27 (S22).
- the quantization information addition unit 28 adds the quantization range to the header of the packet carrying the depth information, and causes the depth transmission unit 232 to transmit the packet (S23).
- for example, the quantization-range lower limit of 200 and upper limit of 10,000 are added to a dedicated header of the packet carrying the depth information, and the depth image (data) compressed at bit rate B2 is stored in the payload.
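A minimal sketch of such a packet follows, assuming a hypothetical header layout: the disclosure says only that the quantization range is placed in a dedicated header ahead of the compressed depth payload, not how the fields are encoded.

```python
import struct

# Hypothetical header layout: two unsigned 32-bit big-endian fields
# holding the lower and upper limits of the quantization range, followed
# by the compressed depth payload. The actual header format is not
# specified in the disclosure.
HEADER = struct.Struct(">II")

def add_quantization_info(d_min, d_max, compressed_depth):
    """Prepend the quantization range to the compressed depth data,
    as the quantization information addition unit 28 does."""
    return HEADER.pack(d_min, d_max) + compressed_depth

def parse_packet(packet):
    """Server side: recover the quantization range and the payload."""
    d_min, d_max = HEADER.unpack_from(packet)
    return d_min, d_max, packet[HEADER.size:]

pkt = add_quantization_info(200, 10_000, b"\x01\x02\x03")
d_min, d_max, payload = parse_packet(pkt)
```

With this layout the server device can read the quantization range of every depth packet without decoding the payload.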
- the receiving unit 37 of the server device 3B receives the RGB image and the depth image, and decodes the received RGB image and depth information. Then, by transmitting the RGB image and the depth information to another server device that controls the moving body, the other server device controls the moving body (S24).
- the acquisition unit 31 of the server device 3B acquires the operation content of the mobile unit and notifies the transmission unit 32 of the operation content.
- the transmission unit 32 changes the allocation ratio between the bit rate of the RGB image and the bit rate of the depth information, and the quantization range, according to the operation content of the mobile body, and notifies the image communication device 2B of the bit rate allocation ratio and the quantization range (S25).
- for example, the bit rate of the RGB image is B1′, the bit rate of the depth information is B2′, the lower limit Dmin of the quantization range is 2000, and the upper limit Dmax of the quantization range is 4000.
- Upon receiving the bit rate allocation ratio and the quantization range, the receiving unit 21 of the image communication device 2B outputs the quantization range to the quantization range changing unit 221 and outputs the bit rate allocation ratio to the compression parameter changing unit 222.
- the quantization range changing unit 221 takes the quantization range as input and sets it in the depth image acquisition unit 25. Further, the compression parameter changing unit 222 sets the RGB image compression parameter in the RGB compression unit 26, and sets the depth information compression parameter in the depth compression unit 27 (S26).
- the quantization information addition unit 28 adds the quantization range to the header of the packet carrying the depth information, and causes the depth transmission unit 232 to transmit the packet (S27).
- for example, the quantization-range lower limit of 2000 and upper limit of 4000 are added to a dedicated header of the packet carrying the depth information, and the depth information (data) compressed at bit rate B2′ is stored in the payload.
- the receiving unit 37 of the server device 3B receives the RGB image and depth information, and decodes the received RGB image and depth information. Then, by transmitting the RGB image and the depth information to another server device that controls the moving body, the other server device controls the moving body (S28).
- the quantization information addition unit 28 of the image communication device 2B adds quantization information to the packet carrying the depth information compressed by the depth compression unit 27, and causes the depth transmission unit 232 to transmit the packet with the added quantization information to the server device 3B. Therefore, on the server device 3B side, the compression rates of the frequently updated RGB image and depth information, and the quantization range of the depth information, can easily be checked.
- since the throughput measurement unit 36 of the server device 3B measures the communication throughput from the data amount of a predetermined number of packets and the time taken to receive them, a throughput that reflects the communication state at that time can be obtained.
- Some or all of the functions of the image communication devices 2, 2A, and 2B and the server devices 3, 3A, and 3B may be implemented by hardware such as integrated circuits (IC chips) or by software.
- the image communication devices 2, 2A, 2B and the server devices 3, 3A, 3B are, for example, implemented by computers that execute program instructions, which are software that implements each function.
- An example of such a computer (hereinafter referred to as computer C) is shown in FIG.
- Computer C comprises at least one processor C1 and at least one memory C2.
- a program P for operating the computer C as the image communication devices 2, 2A, 2B and the server devices 3, 3A, 3B is recorded in the memory C2.
- the processor C1 reads the program P from the memory C2 and executes it, thereby implementing the functions of the image communication devices 2, 2A, 2B and the server devices 3, 3A, 3B.
- As the processor C1, for example, a CPU (Central Processing Unit), GPU (Graphics Processing Unit), DSP (Digital Signal Processor), MPU (Micro Processing Unit), FPU (Floating point number Processing Unit), PPU (Physics Processing Unit), a microcontroller, or a combination thereof can be used.
- As the memory C2, for example, a flash memory, an HDD (Hard Disk Drive), an SSD (Solid State Drive), or a combination thereof can be used.
- the computer C may further include a RAM (Random Access Memory) for expanding the program P during execution and temporarily storing various data.
- Computer C may further include a communication interface for sending and receiving data to and from other devices.
- Computer C may further include an input/output interface for connecting input/output devices such as a keyboard, mouse, display, and printer.
- the program P can be recorded on a non-transitory tangible recording medium M that is readable by the computer C.
- As the recording medium M, for example, a tape, a disk, a card, a semiconductor memory, a programmable logic circuit, or the like can be used.
- the computer C can acquire the program P via such a recording medium M.
- the program P can also be transmitted via a transmission medium.
- As the transmission medium, for example, a communication network or broadcast waves can be used.
- Computer C can also obtain program P via such a transmission medium.
- (Appendix 1) A mobile body control system comprising: acquisition means for acquiring the operation content of a mobile body; and control means for controlling, according to the operation content of the mobile body, the amount of depth information acquired from a sensor.
- the amount of depth information can be reduced by quantizing the depth information according to the approximation accuracy.
- the amount of depth information can be further suitably reduced by quantizing the depth information according to the approximation accuracy.
- the information amount of the image can be set according to the allocation ratio of the bit rate of the image.
- the information amount of the image and depth information can be set according to the bit rate of the image and depth information.
- (Appendix 7) The mobile body control system according to Appendix 5 or 6, wherein the control means changes the bit rate of the image and the bit rate of the depth information according to the communication throughput of the network to which the mobile body is connected.
- the information amount of the image and depth information can be set according to the bit rate allocation ratio of the image and depth information.
- (Appendix 9) The mobile body control method according to Appendix 8, wherein the process of changing the amount of information approximates the depth information according to the operation content of the mobile body.
- (Appendix 10) The mobile body control method according to Appendix 9, wherein an obstacle is detected based on the depth information, and the process of changing the amount of information changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
- (Appendix 11) The mobile body control method according to Appendix 10, wherein an obstacle is detected based on an image acquired from the sensor, and the process of changing the amount of information changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
- (Appendix 12) The mobile body control method according to Appendix 10 or 11, wherein the process of changing the amount of information controls the compression rate of the image acquired from the sensor according to the operation content of the mobile body.
- the information amount of the image can be set according to the allocation ratio of the bit rate of the image.
- (Appendix 13) The mobile body control method according to Appendix 12, wherein the process of changing the amount of information changes the ratio of bit rates allocated to the image and the depth information according to the operation content of the mobile body.
- the information amount of the image and depth information can be set according to the bit rate of the image and depth information.
- (Appendix 14) The mobile body control method according to Appendix 12 or 13, wherein the process of changing the amount of information changes the bit rate of the image and the bit rate of the depth information according to the communication throughput of the network to which the mobile body is connected.
- the information amount of the image and depth information can be set according to the bit rate allocation ratio of the image and depth information.
- (Appendix 16) The image communication device according to Appendix 15, wherein the receiving means receives the parameter including the approximation accuracy for approximating the depth information, determined according to the operation content of the mobile body, and the control means reduces the amount of depth information by approximating the depth information according to the approximation accuracy.
- (Appendix 17) The image communication device according to Appendix 16, wherein the receiving means receives the parameter including the approximation accuracy of the depth information, determined according to the operation content of the mobile body and an obstacle detected based on the depth information.
- the information amount of the depth information can be further suitably reduced.
- (Appendix 18) The image communication device according to Appendix 17, wherein the receiving means receives the parameter including the approximation accuracy of the depth information, determined according to the operation content of the mobile body and an obstacle detected based on the image acquired from the sensor.
- the information amount of the depth information can be further suitably reduced.
- (Appendix 19) The image communication device according to Appendix 17 or 18, wherein the control means controls the compression rate of the image acquired from the sensor according to the parameter determined according to the operation content of the mobile body.
- the information amount of the image can be set according to the allocation ratio of the bit rate of the image.
- the information amount of the image and depth information can be set according to the bit rate of the image and depth information.
- (Appendix 21) The image communication device according to Appendix 19 or 20, wherein the receiving means receives the bit rate of the image and the bit rate of the depth information, determined according to the communication throughput of the network to which the mobile body is connected.
- the information amount of the image and depth information can be set according to the bit rate allocation ratio of the image and depth information.
- (Appendix 22) A mobile body control system comprising at least one processor, wherein the processor executes: a process of acquiring the operation content of a mobile body; and a process of controlling, according to the acquired operation content of the mobile body, the amount of depth information acquired from a sensor.
- This mobile body control system may further include a memory, and the memory may store a program for causing the processor to execute the acquisition process and the control process. The program may also be recorded in a computer-readable non-transitory tangible recording medium.
- (Appendix 23) An image communication device comprising at least one processor, wherein the processor executes: a process of receiving a parameter related to changing the amount of information, determined according to the operation content of a mobile body; and a process of controlling, according to the parameter, the amount of depth information acquired from a sensor.
- This image communication device may further include a memory, and the memory may store a program for causing the processor to execute the receiving process and the control process. The program may also be recorded in a computer-readable non-transitory tangible recording medium.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Structural Engineering (AREA)
- Mechanical Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Civil Engineering (AREA)
- Combustion & Propulsion (AREA)
- Life Sciences & Earth Sciences (AREA)
- Geology (AREA)
- Chemical & Material Sciences (AREA)
- Electromagnetism (AREA)
- Mathematical Physics (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
Description
A first exemplary embodiment of the present invention will be described in detail with reference to the drawings. This exemplary embodiment is the basis of the exemplary embodiments described later.
Broadly speaking, the mobile body control system 100 according to this exemplary embodiment changes the amount of depth information acquired from a sensor according to the operation content of a mobile body.
The configuration of the mobile body control system 100 according to this exemplary embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the configuration of the mobile body control system 100.
As described above, according to the mobile body control system 100 of this exemplary embodiment, the control unit 12 controls the amount of depth information according to the operation content of the mobile body, so the amount of depth-image information used in mobile body control can be suitably controlled.
The flow of the mobile body control method executed by the mobile body control system 100 configured as above will be described with reference to FIG. 2. FIG. 2 is a flow chart showing the flow of the mobile body control method. As shown in FIG. 2, the mobile body control method includes steps S1 and S2.
As described above, according to the mobile body control method of this exemplary embodiment, in step S2 the amount of depth information acquired from the sensor is changed according to the acquired operation content of the mobile body, so the amount of depth information can be controlled to be appropriate for mobile body control.
A second exemplary embodiment of the present invention will be described in detail with reference to the drawings. The image communication device in this exemplary embodiment is mounted on, for example, a mobile body, and changes the amount of depth information acquired by a sensor.
The configuration of the image communication device 2 according to this exemplary embodiment will be described with reference to FIG. 3. FIG. 3 is a block diagram showing the functional configuration of the image communication device 2. As shown in FIG. 3, the image communication device 2 includes a receiving unit 21 and a control unit 22. The receiving unit 21 implements the receiving means in this exemplary embodiment, and the control unit 22 implements the control means.
As described above, according to the image communication device 2 of this exemplary embodiment, the control unit 22 changes the amount of depth information acquired from the sensor according to a parameter related to changing the amount of information, determined according to the operation content of the mobile body. The amount of depth information can therefore be controlled to be appropriate for mobile body control.
A third exemplary embodiment of the present invention will be described in detail with reference to the drawings. The server device in this exemplary embodiment acquires the operation content of a mobile body and, according to that operation content, determines a parameter for changing the amount of depth information.
The configuration of the server device 3 according to this exemplary embodiment will be described with reference to FIG. 4. FIG. 4 is a block diagram showing the functional configuration of the server device 3. As shown in FIG. 4, the server device 3 includes an acquisition unit 31 and a transmission unit 32. The acquisition unit 31 implements the acquisition means in this exemplary embodiment, and the transmission unit 32 implements the transmission means.
As described above, according to the server device 3 of this exemplary embodiment, the transmission unit 32 determines, according to the operation content of the mobile body, a parameter for changing the amount of depth information acquired from the sensor, and transmits the parameter to the mobile body. The mobile body receives this parameter and, according to it, changes the amount of information of the depth image representing the depth of the color image acquired from the sensor; the amount of depth information can thus be controlled to be appropriate for mobile body control.
A fourth exemplary embodiment of the present invention will be described in detail with reference to the drawings. Components having the same functions as those described in the second and third exemplary embodiments are given the same reference signs, and their description is omitted as appropriate.
The configuration of the mobile body control system 100A according to this exemplary embodiment will be described with reference to FIG. 5. FIG. 5 is a block diagram showing the configuration of the mobile body control system 100A.
Let the pixel at coordinates (u, v) be pixel (u, v), and let the 16-bit depth information at pixel (u, v) be I16bit(u, v). The quantized 8-bit grayscale image I8bit(u, v) is then given by the following expression (Expression 2). Expression 2 is defined so that the value of the 8-bit grayscale image I8bit(u, v) falls in the range 0 to 255.
FIG. 7 shows the 16-bit depth information and the 8-bit grayscale information. As shown in FIG. 7, when the 16-bit depth information is quantized and converted into 8-bit grayscale information, the graph becomes step-like, so 8 bits produce a larger quantization error than 16 bits. This quantization error es is given by the following expression (Expression 3). It can be seen that the narrower the quantization range, the smaller the quantization error.
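A sketch of this 16-bit-to-8-bit linear quantization and its step-like error follows. The exact forms of Expressions 2 and 3 are not reproduced in this text, so the code assumes the usual clamped linear mapping of [Dmin, Dmax] onto 0 to 255; treat it as an illustration, not the disclosed formula.

```python
import numpy as np

def quantize_linear(depth_16bit, d_min, d_max):
    """Quantize 16-bit depth values into an 8-bit grayscale image over
    the quantization range [d_min, d_max] (assumed form of Expression 2)."""
    clipped = np.clip(depth_16bit.astype(np.int64), d_min, d_max)
    return np.round(255.0 * (clipped - d_min) / (d_max - d_min)).astype(np.uint8)

def dequantize_linear(gray_8bit, d_min, d_max):
    """Invert the mapping; the difference from the original depth is the
    step-like quantization error of FIG. 7."""
    return d_min + gray_8bit.astype(np.float64) * (d_max - d_min) / 255.0

depth = np.array([2000, 2500, 3000, 4000], dtype=np.uint16)
gray = quantize_linear(depth, 2000, 4000)        # narrow range
restored = dequantize_linear(gray, 2000, 4000)
error = np.abs(restored - depth)                 # bounded by half a step
```

For in-range values the error is at most half the step size (Dmax − Dmin)/255, which is why a narrower quantization range yields a smaller quantization error.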
FIG. 8 shows another example of the quantization process. As shown in FIG. 8, the 16-bit depth image may be converted using a sigmoid function instead of an 8-bit linear transformation. The sigmoid function is given by the following expression (Expression 4).
In this case, the quantization range is not explicitly defined by a lower limit Dmin and an upper limit Dmax as in the linear transformation, but is controlled by the parameter a of the sigmoid function shown in Expression 4.
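The sigmoid-based conversion can be sketched as follows. Expression 4 is not reproduced in this text, so the centering distance d0 and the exact form of the mapping are assumptions; only the role of the parameter a (controlling the effective quantization range) follows the description above.

```python
import numpy as np

def quantize_sigmoid(depth_16bit, a, d0):
    """Map 16-bit depth to 8 bits through a sigmoid (assumed form of
    Expression 4). A larger a concentrates the 8-bit codes near the
    assumed center depth d0, which corresponds to a narrower effective
    quantization range; a smaller a spreads them over a wider span."""
    d = depth_16bit.astype(np.float64)
    s = 1.0 / (1.0 + np.exp(-a * (d - d0)))   # sigmoid value in (0, 1)
    return np.round(255.0 * s).astype(np.uint8)

depth = np.array([1000, 3000, 5000], dtype=np.uint16)
gray = quantize_sigmoid(depth, a=0.005, d0=3000)
```

Unlike the linear mapping, depths far from d0 saturate smoothly toward 0 or 255 rather than being hard-clipped at Dmin and Dmax.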
bDepth = r × B ... (Expression 6)
The transmission unit 32 of the server device 3A changes the bit rate of the image and the bit rate of the depth information according to the communication throughput of the network to which the mobile body is connected. As described above, the bit rate of the color image is bRGB (bps) and the bit rate of the depth image is bDepth (bps).
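Combining the allocation ratio r with an available bit rate B gives the two stream rates. The sketch below is consistent with Expression 6 (bDepth = r × B); assigning the remaining bit rate to the RGB stream is an assumption, not stated as a formula in this text.

```python
def allocate_bitrates(throughput_bps, r):
    """Split the available bit rate B between the depth stream and the
    color stream. Expression 6 gives b_depth = r * B; giving the
    complement to the RGB stream is an assumed convention."""
    if not 0.0 <= r <= 1.0:
        raise ValueError("allocation ratio r must be in [0, 1]")
    b_depth = r * throughput_bps           # Expression 6
    b_rgb = (1.0 - r) * throughput_bps     # assumed complement
    return b_rgb, b_depth

# e.g. 10 Mbps of measured throughput with 30% allocated to depth
b_rgb, b_depth = allocate_bitrates(10_000_000, 0.3)
```

Recomputing r and B as the operation content and the measured throughput change lets the server keep the two streams within the link capacity.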
The flow of the mobile body control method executed by the mobile body control system 100A configured as above will be described with reference to FIG. 9. FIG. 9 is a flow chart showing the flow of the mobile body control method. As shown in FIG. 9, the mobile body control method includes steps S11 to S17.
As described above, according to the mobile body control system 100A of this exemplary embodiment, the transmission unit 32 of the server device 3A changes the approximation accuracy of the depth information according to the operation content of the mobile body and notifies the control unit 22 of the changed approximation accuracy. The control unit 22 of the image communication device 2A can therefore reduce the amount of depth information by quantizing the depth information according to the approximation accuracy.
A fifth exemplary embodiment of the present invention will be described in detail with reference to the drawings. Components having the same functions as those described in the second to fourth exemplary embodiments are given the same reference signs, and their description is omitted as appropriate.
The configuration of the mobile body control system 100B according to this exemplary embodiment will be described with reference to FIG. 10. FIG. 10 is a block diagram showing the configuration of the mobile body control system 100B.
The flow of the mobile body control method executed by the mobile body control system 100B configured as above will be described with reference to FIG. 11. FIG. 11 is a flow chart showing the flow of the mobile body control method. As shown in FIG. 11, the mobile body control method includes steps S21 to S28. The case of an RGB image will be described as an example of a color image.
As described above, according to the mobile body control system 100B of this exemplary embodiment, the quantization information addition unit 28 of the image communication device 2B adds quantization information to the packet carrying the depth information compressed by the depth compression unit 27, and causes the depth transmission unit 232 to transmit the packet with the added quantization information to the server device 3B. The server device 3B can therefore easily check the compression rates of the frequently updated RGB image and depth information, as well as the quantization range of the depth information.
Some or all of the functions of the image communication devices 2, 2A, and 2B and the server devices 3, 3A, and 3B may be implemented by hardware such as integrated circuits (IC chips) or by software.
The present invention is not limited to the embodiments described above, and various modifications are possible within the scope of the claims. For example, embodiments obtained by appropriately combining the technical means disclosed in the embodiments described above are also included in the technical scope of the present invention.
Some or all of the embodiments described above can also be described as follows. However, the present invention is not limited to the aspects described below.
(Appendix 1) A mobile body control system comprising: acquisition means for acquiring the operation content of a mobile body; and control means for controlling, according to the operation content of the mobile body, the amount of depth information acquired from a sensor.
(Appendix 2) The mobile body control system according to Appendix 1, wherein the control means approximates the depth information according to the operation content of the mobile body.
(Appendix 3) The mobile body control system according to Appendix 2, further comprising obstacle detection means for detecting an obstacle based on the depth information, wherein the control means changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
(Appendix 4) The mobile body control system according to Appendix 2, further comprising obstacle detection means for detecting an obstacle based on an image acquired from the sensor, wherein the control means changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
(Appendix 5) The robot control system according to Appendix 3 or 4, wherein the control means controls the compression rate of the image acquired from the sensor according to the operation content of the mobile body.
(Appendix 6) The mobile body control system according to Appendix 5, wherein the control means changes the ratio of bit rates allocated to the image and the depth information according to the operation content of the mobile body.
(Appendix 7) The mobile body control system according to Appendix 5 or 6, wherein the control means changes the bit rate of the image and the bit rate of the depth information according to the communication throughput of the network to which the mobile body is connected.
(Appendix 8) A mobile body control method comprising: acquiring the operation content of a mobile body; and controlling, according to the acquired operation content of the mobile body, the amount of depth information acquired from a sensor.
(Appendix 9) The mobile body control method according to Appendix 8, wherein the process of changing the amount of information approximates the depth information according to the operation content of the mobile body.
(Appendix 10) The mobile body control method according to Appendix 9, wherein an obstacle is detected based on the depth information, and the process of changing the amount of information changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
(Appendix 11) The mobile body control method according to Appendix 10, wherein an obstacle is detected based on an image acquired from the sensor, and the process of changing the amount of information changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
(Appendix 12) The mobile body control method according to Appendix 10 or 11, wherein the process of changing the amount of information controls the compression rate of the image acquired from the sensor according to the operation content of the mobile body.
(Appendix 13) The mobile body control method according to Appendix 12, wherein the process of changing the amount of information changes the ratio of bit rates allocated to the image and the depth information according to the operation content of the mobile body.
(Appendix 14) The mobile body control method according to Appendix 12 or 13, wherein the process of changing the amount of information changes the bit rate of the image and the bit rate of the depth information according to the communication throughput of the network to which the mobile body is connected.
(Appendix 15) An image communication device comprising: receiving means for receiving a parameter related to changing the amount of information, determined according to the operation content of a mobile body; and control means for controlling, according to the parameter, the amount of depth information acquired from a sensor.
(Appendix 16) The image communication device according to Appendix 15, wherein the receiving means receives the parameter including the approximation accuracy for approximating the depth information, determined according to the operation content of the mobile body, and the control means reduces the amount of depth information by approximating the depth information according to the approximation accuracy.
(Appendix 17) The image communication device according to Appendix 16, wherein the receiving means receives the parameter including the approximation accuracy of the depth information, determined according to the operation content of the mobile body and an obstacle detected based on the depth information.
(Appendix 18) The image communication device according to Appendix 17, wherein the receiving means receives the parameter including the approximation accuracy of the depth information, determined according to the operation content of the mobile body and an obstacle detected based on an image acquired from the sensor.
(Appendix 19) The image communication device according to Appendix 17 or 18, wherein the control means controls the compression rate of the image acquired from the sensor according to the parameter determined according to the operation content of the mobile body.
(Appendix 20) The image communication device according to Appendix 19, wherein the receiving means receives the parameter including the ratio of bit rates allocated to the image and the depth information, determined according to the operation content of the mobile body.
(Appendix 21) The image communication device according to Appendix 19 or 20, wherein the receiving means receives the bit rate of the image and the bit rate of the depth information, determined according to the communication throughput of the network to which the mobile body is connected.
(Appendix 22) A mobile body control system comprising at least one processor, wherein the processor executes: a process of acquiring the operation content of a mobile body; and a process of controlling, according to the acquired operation content of the mobile body, the amount of depth information acquired from a sensor.
(Appendix 23) An image communication device comprising at least one processor, wherein the processor executes: a process of receiving a parameter related to changing the amount of information, determined according to the operation content of a mobile body; and a process of controlling, according to the parameter, the amount of depth information acquired from a sensor.
(Appendix 24) The robot control system according to Appendix 3, wherein the changing means changes the amount of information of the depth image by changing the compression rate of the depth image.
(Appendix 25) A server device comprising: acquisition means for acquiring the operation content of a robot; and transmission means for determining, according to the operation content of the robot, a parameter for changing the amount of information of a depth image acquired from a camera, and transmitting the parameter to the robot.
(Appendix 26) The server device according to Appendix 25, wherein the transmission means transmits to the robot the parameter in which the depth range subject to quantization in the depth image has been changed according to the operation content of the robot.
3, 3A, 3B Server device
11, 31 Acquisition unit
12, 22 Control unit
21, 37 Receiving unit
23, 32 Transmission unit
24 RGB image acquisition unit
25 Depth image acquisition unit
26 RGB compression unit
27 Depth compression unit
28 Quantization information addition unit
33 RGB receiving/decoding unit
34 Depth receiving/decoding unit
35 Image processing unit
36 Throughput measurement unit
100, 100A, 100B Robot control system
231 RGB transmission unit
232 Depth transmission unit
Claims (21)
- A mobile body control system comprising: acquisition means for acquiring the operation content of a mobile body; and control means for controlling, according to the operation content of the mobile body, the amount of depth information acquired from a sensor.
- The mobile body control system according to claim 1, wherein the control means approximates the depth information according to the operation content of the mobile body.
- The mobile body control system according to claim 2, further comprising obstacle detection means for detecting an obstacle based on the depth information, wherein the control means changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
- The mobile body control system according to claim 2, further comprising obstacle detection means for detecting an obstacle based on an image acquired from the sensor, wherein the control means changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
- The mobile body control system according to claim 3 or 4, wherein the control means controls the compression rate of the image acquired from the sensor according to the operation content of the mobile body.
- The mobile body control system according to claim 5, wherein the control means changes the ratio of bit rates allocated to the image and the depth information according to the operation content of the mobile body.
- The mobile body control system according to claim 5 or 6, wherein the control means changes the bit rate of the image and the bit rate of the depth information according to the communication throughput of the network to which the mobile body is connected.
- A mobile body control method comprising: acquiring the operation content of a mobile body; and controlling, according to the acquired operation content of the mobile body, the amount of depth information acquired from a sensor.
- The mobile body control method according to claim 8, wherein the process of changing the amount of information approximates the depth information according to the operation content of the mobile body.
- The mobile body control method according to claim 9, wherein an obstacle is detected based on the depth information, and the process of changing the amount of information changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
- The mobile body control method according to claim 9, wherein an obstacle is detected based on an image acquired from the sensor, and the process of changing the amount of information changes the approximation accuracy of the depth information according to the operation content of the mobile body and the obstacle detection result.
- The mobile body control method according to claim 10 or 11, wherein the process of changing the amount of information controls the compression rate of the image acquired from the sensor according to the operation content of the mobile body.
- The mobile body control method according to claim 12, wherein the process of changing the amount of information changes the ratio of bit rates allocated to the image and the depth information according to the operation content of the mobile body.
- The mobile body control method according to claim 12 or 13, wherein the process of changing the amount of information changes the bit rate of the image and the bit rate of the depth information according to the communication throughput of the network to which the mobile body is connected.
- An image communication device comprising: receiving means for receiving a parameter related to changing the amount of information, determined according to the operation content of a mobile body; and control means for controlling, according to the parameter, the amount of depth information acquired from a sensor.
- The image communication device according to claim 15, wherein the receiving means receives the parameter including the approximation accuracy for approximating the depth information, determined according to the operation content of the mobile body, and the control means reduces the amount of depth information by approximating the depth information according to the approximation accuracy.
- The image communication device according to claim 16, wherein the receiving means receives the parameter including the approximation accuracy of the depth information, determined according to the operation content of the mobile body and an obstacle detected based on the depth information.
- The image communication device according to claim 16, wherein the receiving means receives the parameter including the approximation accuracy of the depth information, determined according to the operation content of the mobile body and an obstacle detected based on an image acquired from the sensor.
- The image communication device according to claim 17 or 18, wherein the control means controls the compression rate of the image acquired from the sensor according to the parameter determined according to the operation content of the mobile body.
- The image communication device according to claim 19, wherein the receiving means receives the parameter including the ratio of bit rates allocated to the image and the depth information, determined according to the operation content of the mobile body.
- The image communication device according to claim 19 or 20, wherein the receiving means receives the bit rate of the image and the bit rate of the depth information, determined according to the communication throughput of the network to which the mobile body is connected.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/036427 WO2023053444A1 (ja) | 2021-10-01 | 2021-10-01 | 移動体制御システム、移動体制御方法、および画像通信装置 |
| US18/694,125 US20240393800A1 (en) | 2021-10-01 | 2021-10-01 | Moving body control system, moving body control method, and image communication device |
| JP2023550997A JPWO2023053444A1 (ja) | 2021-10-01 | 2021-10-01 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/036427 WO2023053444A1 (ja) | 2021-10-01 | 2021-10-01 | 移動体制御システム、移動体制御方法、および画像通信装置 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023053444A1 true WO2023053444A1 (ja) | 2023-04-06 |
Family
ID=85782088
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/036427 Ceased WO2023053444A1 (ja) | 2021-10-01 | 2021-10-01 | 移動体制御システム、移動体制御方法、および画像通信装置 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20240393800A1 (ja) |
| JP (1) | JPWO2023053444A1 (ja) |
| WO (1) | WO2023053444A1 (ja) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2012160902A1 (ja) * | 2011-05-24 | 2012-11-29 | 日産自動車株式会社 | 車両用監視装置及び車両の監視方法 |
| WO2018155159A1 (ja) * | 2017-02-24 | 2018-08-30 | パナソニックIpマネジメント株式会社 | 遠隔映像出力システム、及び遠隔映像出力装置 |
| WO2019082958A1 (ja) * | 2017-10-27 | 2019-05-02 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 三次元モデル符号化装置、三次元モデル復号装置、三次元モデル符号化方法、および、三次元モデル復号方法 |
| WO2020111134A1 (ja) * | 2018-11-29 | 2020-06-04 | 住友電気工業株式会社 | システム、サーバコンピュータ、車載装置、制御方法、半導体集積回路及びコンピュータプログラム |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6232075B2 (ja) * | 2013-12-03 | 2017-11-22 | Nippon Telegraph and Telephone Corporation | Video encoding device and method, video decoding device and method, and programs therefor |
| JP7029910B2 (ja) * | 2016-12-22 | 2022-03-04 | Panasonic Intellectual Property Corporation of America | Information processing device, information processing method, and program |
| JP7032062B2 (ja) * | 2017-06-02 | 2022-03-08 | IHI Corporation | Point cloud data processing device, mobile robot, mobile robot system, and point cloud data processing method |
| CN112771867A (zh) * | 2018-09-28 | 2021-05-07 | Sharp Corporation | 3D data generation device, 3D data reproduction device, control program, and recording medium |
| JP7351079B2 (ja) * | 2018-11-30 | 2023-09-27 | Sony Group Corporation | Control device, control method, and program |
| EP3729333A4 (en) * | 2019-02-19 | 2020-12-16 | SZ DJI Technology Co., Ltd. | Autonomous navigation based on local detection, and associated systems and processes |
| JP2020177289A (ja) * | 2019-04-15 | 2020-10-29 | Sony Corporation | Information processing device, information processing method, and information processing program |
| JP7276023B2 (ja) * | 2019-09-06 | 2023-05-18 | Toyota Motor Corporation | Vehicle remote instruction system and autonomous driving vehicle |
| JP7310524B2 (ja) * | 2019-10-11 | 2023-07-19 | Toyota Motor Corporation | Remotely operated autonomous driving vehicle and vehicle remote instruction system |
| JP2022110260A (ja) * | 2021-01-18 | 2022-07-29 | Sony Group Corporation | Mobile device and mobile device control method |
| JP2022182277A (ja) * | 2021-05-28 | 2022-12-08 | Sony Group Corporation | Information processing device, information processing method, and autonomous mobile body |
2021
- 2021-10-01 US US18/694,125 patent/US20240393800A1/en active Pending
- 2021-10-01 WO PCT/JP2021/036427 patent/WO2023053444A1/ja not_active Ceased
- 2021-10-01 JP JP2023550997A patent/JPWO2023053444A1/ja active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| US20240393800A1 (en) | 2024-11-28 |
| JPWO2023053444A1 (ja) | 2023-04-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8406290B2 (en) | User sensitive information adaptive video transcoding framework | |
| JP7323545B2 (ja) | 三次元データ符号化方法、三次元データ復号方法、三次元データ符号化装置、及び三次元データ復号装置 | |
| KR101581097B1 (ko) | 인코딩 방법 및 장치와, 디코딩 방법 및 장치 | |
| US20200371243A1 (en) | Image capturing control method and apparatus, image capturing system and tof camera | |
| KR20210153060A (ko) | 비디오 인코더의 레이트 제어 | |
| US11814071B2 (en) | Vehicle, apparatus for a vehicle, computer program, and method for processing information for communication in a tele-operated driving session | |
| US6778605B1 (en) | Image processing apparatus and method | |
| CN112771859A (zh) | Region-of-interest-based video data encoding method and apparatus, and storage medium | |
| US20110044557A1 (en) | Method and control unit for rectifying a camera image | |
| KR20160024771A (ko) | Image communication device, image transmitting device, and image receiving device | |
| WO2019001283A1 (zh) | Encoding resolution control method and apparatus | |
| US20200276996A1 (en) | Server implementing automatic remote control of moving conveyance and method of automatic remote control of moving conveyance | |
| KR20160142200A (ko) | Image capturing device using MJPEG compression | |
| US20250159256A1 (en) | Three-dimensional data storage method, three-dimensional data acquisition method, three-dimensional data storage device, and three-dimensional data acquisition device | |
| CN107197276A (zh) | Semiconductor device, encoding control method, and camera device | |
| US20130301700A1 (en) | Video encoding device and encoding method thereof | |
| US20220294971A1 (en) | Collaborative object detection | |
| US11533484B1 (en) | Method and system for optimizing image and video compression for machine vision | |
| CN112866630A (zh) | Vehicle and control method thereof | |
| WO2023053444A1 (ja) | Moving body control system, moving body control method, and image communication device | |
| US12490049B2 (en) | Electronic device for confirming position of external electronic device, and operation method therefor | |
| JP2019047401A (ja) | 画像処理装置 | |
| US7581018B2 (en) | Server system for performing communication over wireless network | |
| US12051327B2 (en) | In-vehicle wireless communication device, wireless communication system, and wireless communication method | |
| CN111066323B (zh) | Image compression/decompression in a computer vision system | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21959478; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 18694125; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2023550997; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21959478; Country of ref document: EP; Kind code of ref document: A1 |