WO2012096347A1 - Network system, control method, control unit, and control program - Google Patents
- Publication number
- WO2012096347A1 (PCT/JP2012/050497)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- model device
- controller
- course
- radio controlled
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
- A63H30/04—Electrical arrangements using wireless transmission
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8017—Driving on land or water; Flying
Definitions
- the present invention relates to a technique for controlling the operation of a model apparatus by radio control.
- Patent Document 1 discloses a mobile robot.
- the mobile robot is equipped with a self-position/direction detecting means, a moving means, and an obstacle detecting means.
- An environmental map storage means and a route planning means are incorporated in an external device of the mobile robot.
- the system performs a route search from the self-location to the destination with reference to the environment map, movement of the mobile robot along the planned route, avoidance motions for unknown obstacles not on the environment map, a new route search from the new self-position to the destination after the robot deviates from the planned route, and movement along the re-planned route.
- Patent Document 2 discloses a course guidance control system for an automatic traveling body. Japanese Patent Laid-Open No. 61-70617 (Patent Document 2) introduces a speed control method that treats the traveling speed as a function of the steering angle, so that the traveling speed is controlled according to the steering angle. The method can serve as a self-guidance control method for any traveling body, regardless of the use or type of the traveling body to be controlled and regardless of the conditions of the traveling surface, for example floor, on-road, or off-road.
- a new network system using a model device that moves based on commands from a controller is being sought.
- a network system including a model device and a controller.
- the model device includes a moving unit configured to move the model device based on a command from the controller, a camera configured to photograph the front of the model device, a GPS (Global Positioning System) unit configured to acquire the position of the model device, a compass configured to acquire the orientation of the model device, and a first communication interface configured to receive commands from the controller and to transmit the captured image from the camera, the position, and the orientation to the controller.
- the controller includes a memory configured to store course data for the model device, a display, an operation unit configured to accept commands, a second communication interface configured to communicate with the model device, and a processor configured to create a virtual image for display from the course data based on the position and orientation received from the model device and to display on the display an image obtained by synthesizing the captured image received from the model device with the virtual image.
- the processor is configured to create course data for the model device along the position locus based on the time-series data of the position received from the model device.
- the memory is configured to store course data for a plurality of model devices.
- the processor is configured to receive a course selection command for a plurality of model apparatuses via the operation unit.
- the processor is configured to transmit the course data for the model device to the other controller via the second communication interface.
- the processor is configured to receive course data for the model device from another controller via the second communication interface.
- the processor receives, from the other controller via the second communication interface, time series data in which the position and orientation of the other model apparatus are associated with the elapsed time from the start time; refers to the time series data to obtain the position and orientation of the other model device corresponding to the elapsed time from the start time of the model device; and, based on the position and orientation received from the model device and the position and orientation of the other model device, is configured to add an image showing the other model device to the virtual image for display.
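The time-series lookup described above — retrieving the other model device's pose for a given elapsed time — can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; the function and variable names are hypothetical, and heading interpolation ignores wrap-around at 0°/360° for brevity.

```python
from bisect import bisect_left

def ghost_pose(samples, elapsed):
    """Return (lat, lon, heading) of the other model device at the given
    elapsed time, linearly interpolating between recorded samples.

    samples: list of (t, lat, lon, heading) tuples sorted by t.
    """
    times = [s[0] for s in samples]
    i = bisect_left(times, elapsed)
    if i == 0:                       # before the first sample: clamp
        return samples[0][1:]
    if i >= len(samples):            # after the last sample: clamp
        return samples[-1][1:]
    t0, lat0, lon0, h0 = samples[i - 1]
    t1, lat1, lon1, h1 = samples[i]
    f = (elapsed - t0) / (t1 - t0)   # interpolation fraction in [0, 1]
    return (lat0 + f * (lat1 - lat0),
            lon0 + f * (lon1 - lon0),
            h0 + f * (h1 - h0))
```

The returned pose can then be projected into the virtual image as the ghost car's position, exactly as the claim describes.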
- a control method in a network system including a model device and a controller that stores course data for the model device.
- the control method includes a step in which the controller receives a command for moving the model device, a step in which the controller transmits the command to the model device, and a step in which the model device moves based on the command from the controller.
- the controller includes a display, a memory configured to store course data for the model device, an operation unit configured to receive a command for moving the model device, a communication interface configured to transmit the command to the model device and to receive the captured image and the position and orientation of the model device from the model device, and a processor configured to create a virtual image for display from the course data based on the position and orientation received from the model device and to display on the display an image obtained by synthesizing the captured image received from the model device with the virtual image.
- a control method in a controller including a memory configured to store course data for a model device, a display, a communication interface, and a processor.
- the control method includes a step in which the processor receives a command for moving the model device, a step in which the processor transmits the command to the model device via the communication interface, a step in which the processor receives the captured image and the position and orientation of the model device from the model device via the communication interface, a step in which the processor creates a virtual image for display from the course data based on the position and orientation received from the model device, and a step in which the processor displays on the display an image obtained by synthesizing the captured image received from the model device with the virtual image.
- a control program for causing a controller including a memory configured to store course data for the model device, a display, a communication interface, and a processor to control the model device.
- the control program causes the processor to execute a step of receiving a command for moving the model device, a step of transmitting the command to the model device via the communication interface, a step of receiving the captured image and the position and orientation of the model device from the model device via the communication interface, a step of creating a virtual image for display from the course data based on the position and orientation received from the model device, and a step of displaying on the display an image obtained by synthesizing the captured image received from the model device with the virtual image.
- a computer-readable non-volatile data recording medium storing the above control program is provided.
- FIG. 1 is an image diagram showing the overall configuration of the network system 1 according to the present embodiment. FIG. 2 is a block diagram showing the hardware configuration of the controller 100 according to the present embodiment. FIG. 3 is an image diagram showing a state in which the user according to the present embodiment inputs a command for creating a course for the radio controlled car 200.
- FIG. 4 is an image diagram showing the touch panel 130 of the controller 100 before the race starts or while the radio controlled car 200 is driven to create a course.
- FIG. 6 is a flowchart illustrating a control method in the controller 100 according to the second embodiment.
- the “model device” may be a helicopter or airplane that can fly, or a robot that can walk. In the case of a helicopter or an airplane, it is preferable to include an altimeter that acquires the altitude of the model device.
- the “controller” may be a device dedicated to control of the model apparatus, or it may be a device with other functions that has a display and a communication interface, such as a portable telephone, a personal computer, an electronic notebook, or a PDA (Personal Digital Assistant).
- FIG. 1 is an image diagram showing an overall configuration of a network system 1 according to the present embodiment.
- network system 1 includes controllers 100X and 100Y and radio controlled cars 200X and 200Y.
- controllers 100X and 100Y are collectively referred to as the controller 100 for the sake of explanation.
- the radio controlled cars 200X and 200Y are collectively referred to as a radio controlled car 200.
- Each user places the radio controlled car 200 on the ground of a park or a garden.
- the user controls the operation of the radio controlled car 200 via the controller 100.
- the radio control car 200 is equipped with a camera for photographing the front of the radio control car 200.
- the radio controlled car 200 transmits captured images to the controller 100 sequentially.
- the radio controlled car 200 is equipped with a GPS (Global Positioning System) for measuring the current position of the radio controlled car 200.
- the radio controlled car 200 sequentially transmits the current position to the controller 100.
- the radio controlled car 200 is equipped with an electronic compass for measuring the current orientation (attitude) of the radio controlled car 200.
- the radio controlled car 200 transmits the current direction to the controller 100 sequentially.
- the radio controlled car 200 may transmit the captured image, the current position, and the current orientation to the controller 100 at the same time or separately.
- Controller 100 displays a captured image from radio controlled car 200.
- the user inputs a command (a forward command, a reverse command, a direction change command, an acceleration / deceleration command, hereinafter these commands are also referred to as a movement command) for moving the radio controlled car 200 to the controller 100 while viewing the captured image.
- the controller 100 transmits a movement command from the user to the radio controlled car 200.
- the controller 100 accumulates time series data (course creation time series data) of the current position from the radio controlled car 200.
- the controller 100 acquires the trajectory 301 of the radio controlled car 200 based on the time series data.
- the controller 100 creates data indicating a course (circuit) 302 for the radio controlled car 200 based on the trajectory 301 of the radio controlled car 200.
- the data indicating the course (also referred to as course data) includes 3D objects such as a white line indicating the track of the course and a course pylon.
- the course position, shape, and orientation are associated with actual map data. Alternatively, the course position, shape, and orientation are associated with latitude and longitude.
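Associating a course with latitude and longitude implies projecting GPS coordinates into a local course coordinate system. The following is an illustrative Python sketch only (the function name and constants are not from the disclosure), using an equirectangular approximation that is adequate over the few hundred metres of a model-car course:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def latlon_to_local(lat, lon, origin_lat, origin_lon):
    """Project (lat, lon) to metres east (x) and north (y) of an origin
    point using a simple equirectangular approximation."""
    dlat = math.radians(lat - origin_lat)
    dlon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS_M * dlon * math.cos(math.radians(origin_lat))
    y = EARTH_RADIUS_M * dlat
    return x, y
```

With such a projection, course positions, shapes, and orientations stored in latitude/longitude can be compared directly with the current GPS fix of the radio controlled car.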
- the controller 100 may accept a course creation command via the touch panel 130 or the like.
- the controller 100 receives a slide operation on the map from the user while the map is displayed.
- the controller 100 determines the position, shape, and orientation of the course based on the slide operation.
- the controller 100 may store a plurality of course data in advance. That is, even if the radio controlled car 200 does not run, the user may select a desired course from a plurality of courses prepared in advance.
- the position, shape, and orientation of the course are associated with map data or latitude / longitude in advance.
- the shape of the course is prepared, and when the user selects a course, the position and orientation of the course may be designated by the user.
- after the course is decided, the controller 100 starts the race based on the user's command.
- the controller 100 acquires the position and orientation of the radio controlled car 200 in the course based on the course data and the current position and orientation of the radio controlled car 200.
- based on the position and orientation of the radio controlled car 200 in the course, the controller 100 creates from the course data a virtual image showing the 3D objects as viewed from the radio controlled car 200.
- the controller 100 determines the viewpoint of the 3D model data based on the position and orientation of the radio controlled car 200 on the circuit.
- the controller 100 creates a display image (virtual image) of 3D model data from the viewpoint.
- the controller 100 superimposes and displays a virtual image on the radio-controlled car 200 captured image.
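The superimposition step can be illustrated as a per-pixel alpha composite of the rendered virtual layer over the captured frame. This is a hedged sketch with hypothetical names, using plain Python lists of RGB tuples rather than any particular imaging library:

```python
def composite(captured, virtual, alpha_mask):
    """Overlay a rendered virtual image onto a captured frame.

    captured, virtual: images as lists of rows of (r, g, b) tuples.
    alpha_mask: per-pixel opacity of the virtual layer, 0.0..1.0
    (0.0 keeps the captured pixel, 1.0 shows only the virtual pixel).
    """
    out = []
    for cap_row, vir_row, a_row in zip(captured, virtual, alpha_mask):
        row = []
        for (cr, cg, cb), (vr, vg, vb), a in zip(cap_row, vir_row, a_row):
            # standard "over" blend of virtual on top of captured
            row.append((round(cr * (1 - a) + vr * a),
                        round(cg * (1 - a) + vg * a),
                        round(cb * (1 - a) + vb * a)))
        out.append(row)
    return out
```

In practice the white lines and pylons would be rendered with full opacity and the rest of the virtual layer left transparent, so the real scenery remains visible around the course markings.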
- the controller 100X for controlling the radio controlled car 200X can communicate with the controller 100Y for controlling the radio controlled car 200Y.
- the controller 100X transmits the course data to the controller 100Y or receives the course data from the controller 100Y.
- the controller 100X and the controller 100Y can control the traveling of the radio controlled cars 200X and 200Y based on the common course data.
- the user of the controller 100X can race along the course while viewing the captured image from the radio controlled car 200X, that is, while viewing real video in which the radio controlled car 200Y may appear.
- the user can control the movement of the radio controlled car 200 from the line of sight of the radio controlled car 200 while viewing the image (virtual image) indicating the virtual course.
- as a result, the user can control the movement of the radio controlled car 200 with a strong sense of realism.
- FIG. 2 is a block diagram showing a hardware configuration of controller 100 according to the present embodiment.
- controller 100 includes, as main components, a CPU 110, a memory 120, a touch panel 130, a speaker 140, a button 150, a memory interface 160, a communication interface 170, and a clock 180.
- the memory 120 is realized by various types of RAM (Random Access Memory), ROM (Read-Only Memory), a hard disk, and the like.
- the memory 120 stores a program executed by the CPU 110, map data, various model data for indicating a virtual course, and the like. In other words, the CPU 110 controls each unit of the controller 100 by executing a program stored in the memory 120.
- the touch panel 130 includes a tablet 132 and a display 131 laid on the surface of the tablet 132.
- the display 131 is preferably a 3D display.
- the touch panel 130 may be any type such as a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method.
- the touch panel 130 may include an optical sensor liquid crystal.
- the touch panel 130 (tablet 132) detects a touch operation on the touch panel 130 by an external object every predetermined time, and inputs the touch coordinates (touch position) to the CPU 110. In other words, the CPU 110 sequentially acquires touch coordinates from the touch panel 130.
- Speaker 140 outputs sound based on a command from CPU 110.
- CPU 110 causes speaker 140 to output sound based on the sound data.
- the button 150 is disposed on the surface of the controller 100.
- a plurality of buttons such as a direction key, a determination key, and a numeric keypad may be arranged on the controller 100.
- the button 150 receives a command from the user.
- the button 150 inputs a command from the user to the CPU 110.
- the memory interface 160 reads data from the external storage medium 161.
- the CPU 110 reads data stored in the external storage medium 161 via the memory interface 160 and stores the data in the memory 120.
- the CPU 110 reads data from the memory 120 and stores the data in the external storage medium 161 via the memory interface 160.
- the storage medium 161 includes, but is not limited to, media that store data in a non-volatile manner, such as a CD (Compact Disc), DVD (Digital Versatile Disk), BD (Blu-ray Disc), USB (Universal Serial Bus) memory, memory card, FD (Flexible Disk), hard disk, magnetic tape, cassette tape, MO (Magneto-Optical Disc), MD (Mini Disc), IC (Integrated Circuit) card (excluding memory card), optical card, EPROM, and EEPROM (Electronically Erasable Programmable Read-Only Memory).
- the communication interface 170 is realized by an antenna or a connector.
- the communication interface 170 exchanges data with the radio controlled car 200 and other controllers 100 by wireless communication.
- CPU 110 receives a program, map data, and the like from another computer via communication interface 170.
- the CPU 110 transmits course data to the other controller 100 and receives course data from the other controller 100 via the communication interface 170.
- the CPU 110 transmits a movement command to the radio controlled car 200 via the communication interface 170, and receives a captured image, a current position, and a current orientation from the radio controlled car 200.
- the clock 180 measures time or a period based on a command from the CPU 110.
- the CPU 110 controls each unit of the controller 100 by executing a program stored in the memory 120 or the storage medium 161. For example, the CPU 110 executes a control process of the radio controlled car 200 by executing a program stored in the memory 120 or the storage medium 161.
- the CPU 110 receives a captured image from the radio controlled car 200 via the communication interface 170.
- CPU 110 displays the captured image on touch panel 130.
- CPU 110 accepts a command (movement command) for moving radio controlled car 200 from the user via touch panel 130 or button 150.
- CPU 110 transmits a movement command to radio controlled car 200 via communication interface 170.
- the CPU 110 receives the current position and orientation from the radio controlled car 200 via the communication interface 170.
- CPU 110 stores time series data of the current position of radio controlled car 200 in memory 120.
- the CPU 110 acquires the trajectory 301 of the radio controlled car 200 based on the time series data.
- the CPU 110 creates data indicating the course 302 for the radio controlled car 200 based on the trajectory 301 of the radio controlled car 200.
- the data indicating the course (also referred to as course data) includes 3D objects (model data) such as a white line 301X indicating the center line of the course, a white line 302X indicating the end line of the course, and a course pylon 303X.
- the CPU 110 associates the course position, shape, and orientation with actual map data. Alternatively, the CPU 110 associates the position, shape, and direction of the course with latitude / longitude.
- the CPU 110 may create course data without causing the radio controlled car 200 to travel. More specifically, CPU 110 according to the present embodiment accepts a map acquisition command from the user via touch panel 130 or button 150. The CPU 110 downloads map data from an external server or the like via the communication interface 170. The CPU 110 may read map data from the memory 120 or the storage medium 161.
- FIG. 3 is an image diagram showing a state in which a user according to the present embodiment inputs a command for creating a course for radio controlled car 200.
- CPU 110 accepts a slide operation by the user while displaying a map image on touch panel 130.
- the CPU 110 obtains the finger trajectory 301 on the map image by sequentially obtaining the touch position of the finger on the map image via the touch panel 130.
- the CPU 110 displays a map image and a pointer on the touch panel 130.
- the CPU 110 receives a pointer movement command from the user via the button 150.
- the CPU 110 acquires the trajectory 301 of the pointer on the map image.
- the CPU 110 creates data indicating the course 302 for the radio controlled car 200 based on the trajectory 301.
- the CPU 110 creates a course pylon, a track line, etc. for indicating the end of the course on both sides of the trajectory 301.
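One way to place track lines and pylons on both sides of the trajectory 301 is to offset the trajectory polyline perpendicular to the direction of travel. The sketch below is illustrative only (hypothetical names, local metric coordinates assumed, forward differences used for the travel direction):

```python
import math

def course_edges(trajectory, half_width):
    """Given a trajectory as a list of (x, y) points in metres, return the
    left and right edge lines offset by half_width perpendicular to the
    direction of travel."""
    left, right = [], []
    for i, (x, y) in enumerate(trajectory):
        # direction of travel at this point (forward difference; the last
        # point reuses the previous segment's direction)
        j = min(i, len(trajectory) - 2)
        dx = trajectory[j + 1][0] - trajectory[j][0]
        dy = trajectory[j + 1][1] - trajectory[j][1]
        norm = math.hypot(dx, dy) or 1.0
        # unit normal pointing to the left of travel
        nx, ny = -dy / norm, dx / norm
        left.append((x + nx * half_width, y + ny * half_width))
        right.append((x - nx * half_width, y - ny * half_width))
    return left, right
```

The resulting edge polylines could then be rendered as the white lines 301X/302X, with pylons placed at intervals along them.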
- the controller 100 may store a plurality of course data in advance. That is, the user may select a desired course from a plurality of courses prepared in advance.
- the position, shape, and orientation of the course are associated with map data or latitude / longitude in advance.
- the position and orientation of the course may be designated when the user selects the course.
- the CPU 110 transmits course data to other controllers 100 and receives course data from other controllers 100.
- the CPU 110 stores the course data received from the other controller 100 in the memory 120.
- CPU 110 accepts selection of the course data from the user via touch panel 130 or button 150.
- the users of the controllers 100X and 100Y can race by causing the radio controlled cars 200X and 200Y to run simultaneously on the same course.
- after the course is determined, the CPU 110 starts the race based on the user's command.
- the CPU 110 acquires the position and orientation of the radio controlled car 200 in the course based on the course data and the current position and orientation of the radio controlled car 200.
- the CPU 110 creates a virtual image indicating a 3D object viewed from the radio controlled car 200 from the course data based on the position and orientation of the radio controlled car 200 in the course.
- CPU110 superimposes and displays a virtual image on the picked-up image of radio controlled car 200.
- for example, the radio controlled car 200 of another controller 100 may appear in the captured image.
- FIG. 4A and 4B are image diagrams showing the controller 100 in the normal mode and the race mode according to the present embodiment. More specifically, FIG. 4A is an image diagram showing the touch panel 130 of the controller 100 before the race starts or when the radio controlled car 200 is running to create a course. FIG. 4B is an image diagram showing the touch panel 130 of the controller 100 during the race.
- CPU 110 causes touch panel 130 to display a photographed image from radio controlled car 200 before the race starts or when radio controlled car 200 is running to create a course.
- when the race starts and during the race, the CPU 110 superimposes virtual images such as a white line 301X indicating the center of the course, a white line 302X indicating the end of the course, and a course pylon 303X on the captured image from the radio controlled car 200.
- FIG. 5 is an image diagram showing the controller 100 immediately before the start of the race according to the present embodiment.
- CPU 110 receives a start point and a goal point in the course via touch panel 130 or button 150.
- when the race is started, the CPU 110 reads a virtual image from the memory 120 based on the current position and orientation of the radio controlled car 200 in the course.
- CPU 110 causes touch panel 130 to display a captured image, white lines 301X and 302X, course pylon 303X, characters indicating the start timing, an image of a traffic light, and the like.
- FIG. 6 is an image diagram showing the controller 100 when the radio controlled car 200 according to the present embodiment is located immediately before the goal point of the race.
- the CPU 110 receives a start point and a goal point in the course via the touch panel 130 or the button 150.
- CPU 110 accepts designation of how many laps of the course are to be run via touch panel 130 or button 150.
- CPU 110 reads a virtual image from memory 120 when radio controlled car 200 approaches the goal point based on the current position and orientation of radio controlled car 200 on the course.
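Detecting that the car "approaches the goal point" amounts to comparing the distance between the current GPS fix and the goal position against a threshold. A hedged sketch in Python (hypothetical names; the haversine formula is a standard great-circle distance, not something specified by the disclosure):

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (haversine)."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def near_goal(car_fix, goal_fix, threshold_m=5.0):
    """True when the car is within threshold_m of the goal point."""
    return distance_m(*car_fix, *goal_fix) <= threshold_m
```

The 5 m default threshold is an assumed value for illustration; an actual implementation would tune it to GPS accuracy and car speed.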
- CPU 110 causes touch panel 130 to display a captured image, white lines 301X and 302X, course pylon 303X, an image showing a goal point, and the like.
- FIG. 7 is a block diagram showing a hardware configuration of radio controlled car 200 according to the present embodiment.
- the radio controlled car 200 includes, as main components, a CPU 210, a memory 220, a moving mechanism 230, a GPS 240, an electronic compass 250, a memory interface 260, a communication interface 270, a clock 280, and a camera 290.
- the memory 220 is realized by various RAMs, ROMs, hard disks, and the like.
- the memory 220 stores a program executed by the CPU 210 and the like. In other words, the CPU 210 controls each part of the radio controlled car 200 by executing a program stored in the memory 220.
- the moving mechanism 230 moves the radio controlled car 200 based on a command from the CPU 210.
- the moving mechanism 230 includes a motor, a shaft, a tire, and the like.
- the moving mechanism 230 may be a propeller, a wing, a leg, or the like.
- the moving mechanism 230 moves the radio controlled car 200 in accordance with a moving command from the controller 100.
- GPS240 acquires the current position of the radio controlled car 200.
- the current position is transmitted to the controller 100 via the communication interface 270.
- the electronic compass 250 acquires the direction of the radio controlled car 200.
- the direction is transmitted to the controller 100 via the communication interface 270.
- the memory interface 260 reads data from the external storage medium 261.
- the CPU 210 reads data stored in the external storage medium 261 via the memory interface 260 and stores the data in the memory 220.
- the CPU 210 reads data from the memory 220 and stores the data in the external storage medium 261 via the memory interface 260.
- the communication interface 270 is realized by an antenna or a connector.
- the communication interface 270 exchanges data with the controller 100 by wireless communication.
- the CPU 210 receives a movement command from the controller 100 via the communication interface 270, and transmits a captured image, the current position, and the current orientation.
- the clock 280 measures time or a period based on a command from the CPU 210.
- the camera 290 is disposed at the front part of the radio controlled car 200.
- the camera 290 captures a scene in front of the radio controlled car 200.
- the camera 290 preferably includes a right camera for capturing a right-eye image and a left camera for capturing a left-eye image. That is, it is preferable that the camera 290 can capture a 3D image.
- the CPU 210 controls each unit of the radio controlled car 200 by executing a program stored in the memory 220 or the storage medium 261. For example, the CPU 210 executes a control process of the radio controlled car 200 by executing a program stored in the memory 220 or the storage medium 261.
- CPU 210 sequentially transmits images taken by camera 290 to controller 100 via communication interface 270.
- the CPU 210 transmits the current position and orientation of the radio controlled car 200 to the controller 100 via the communication interface 270 in response to a request from the controller 100 or periodically.
- the CPU 210 receives a movement command from the controller 100 via the communication interface 270.
- the CPU 210 drives the movement mechanism 230 based on the movement command.
- the CPU 210 receives a forward command, a reverse command, a direction change command, an acceleration / deceleration command, etc. as a movement command via the communication interface 270.
- the CPU 210 drives the moving mechanism 230 based on the command to move the radio controlled car 200 forward or backward, change its direction, or change its speed.
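The command handling on the car side can be sketched as a simple dispatch that updates a minimal motion state. This is an illustrative Python sketch only — the command names, state fields, and step sizes are assumptions, not values from the disclosure:

```python
def apply_command(state, command):
    """Update a minimal car state in response to one movement command.

    state: {'speed': float, 'heading': float (degrees, 0..360)}.
    """
    if command == "forward":
        state["speed"] = max(state["speed"], 1.0)   # ensure forward motion
    elif command == "reverse":
        state["speed"] = min(state["speed"], -1.0)  # ensure backward motion
    elif command == "accelerate":
        state["speed"] += 0.5
    elif command == "decelerate":
        state["speed"] -= 0.5
    elif command == "turn_left":
        state["heading"] = (state["heading"] - 15) % 360
    elif command == "turn_right":
        state["heading"] = (state["heading"] + 15) % 360
    return state
```

A real CPU 210 would translate such a state into motor and servo outputs of the moving mechanism 230; the dispatch structure is the point of the sketch.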
- FIG. 8 is a flowchart showing a control method in controller 100 according to the first embodiment.
- when CPU 110 of controller 100 receives a race start command from the user via touch panel 130 or button 150, it executes the processing from step S106. In the following, it is assumed that the CPU 110 has received the current position and orientation from the radio controlled car 200 at least once via the communication interface 170.
- the CPU 110 creates course 3D virtual data by reading the course data from the memory 120 (step S106).
- CPU 110 determines whether a captured image or a current position and orientation is received from radio controlled car 200 via communication interface 170 (step S108).
- CPU 110 repeats the process of step S108 when a captured image or the current position and orientation have not been received from radio controlled car 200 (NO in step S108).
- when CPU 110 receives the captured image and the current position and orientation from radio controlled car 200 (YES in step S108), CPU 110 determines the viewpoint for the 3D virtual model indicating the course based on the latest current position and orientation of radio controlled car 200 (step S112). More specifically, the CPU 110 determines the viewpoint for the 3D virtual model based on the position and orientation in the course.
- CPU 110 creates a 3D virtual image from the determined viewpoint (step S114).
- CPU 110 superimposes the newly created 3D virtual image on the latest captured image (step S116).
- CPU 110 causes touch panel 130 to display a composite image of the latest captured image and the latest 3D virtual image (step S118).
- CPU 110 determines whether an instruction for ending the race is received from the user via touch panel 130 or button 150 (step S120).
- CPU 110 repeats the process from step S108 when the command for ending the race has not been received (NO in step S120).
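The loop of steps S106 through S120 can be summarised in code. The sketch below is a hedged illustration with hypothetical callback names (receive, render_overlay, display, race_over), not the controller's actual API:

```python
def race_loop(receive, render_overlay, display, race_over):
    """Sketch of steps S108-S120: poll the car, build the virtual view,
    composite, display; repeat until the race-end command arrives.

    receive(): returns (captured_image, position, orientation) or None
    when nothing has been received yet.
    """
    frames = 0
    while not race_over():                    # S120: end-of-race check
        packet = receive()                    # S108: poll for telemetry
        if packet is None:
            continue                          # NO in S108: poll again
        captured, position, orientation = packet
        viewpoint = (position, orientation)   # S112: viewpoint in course
        virtual = render_overlay(viewpoint)   # S114: 3D virtual image
        display(captured, virtual)            # S116/S118: composite, show
        frames += 1
    return frames
```

Each iteration corresponds to one displayed composite frame; the real controller would also bound the polling rate to the camera's frame rate.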
- the controller 100X displays a virtual image indicating the radio controlled car 200Y that has already traveled on the course during the race of the radio controlled car 200X.
- network system 1 includes controllers 100X and 100Y and radio controlled cars 200X and 200Y. However, in the present embodiment, it is assumed that the radio controlled car 200X travels after the travel of the radio controlled car 200Y ends.
- the controllers 100X and 100Y are collectively referred to as the controller 100 for the sake of explanation.
- the radio controlled cars 200X and 200Y are collectively referred to as a radio controlled car 200.
- the user places the radio controlled car 200 on the ground of a park or garden.
- the user controls the operation of the radio controlled car 200 via the controller 100.
- the radio control car 200 is equipped with a camera for photographing the front of the radio control car 200.
- the radio controlled car 200 transmits captured images to the controller 100 sequentially.
- the radio controlled car 200 is equipped with a GPS for measuring the current position of the radio controlled car 200.
- the radio controlled car 200 sequentially transmits the current position to the controller 100.
- the radio controlled car 200 is equipped with an electronic compass for measuring the current orientation (attitude) of the radio controlled car 200.
- the radio controlled car 200 transmits the current direction to the controller 100 sequentially.
- Controller 100 displays a captured image from radio controlled car 200.
- the user inputs a command (movement command) for moving the radio controlled car 200 to the controller 100 while viewing the captured image.
- the controller 100 transmits a movement command from the user to the radio controlled car 200.
- the controller 100 accumulates time series data (course creation time series data) of the current position and direction from the radio controlled car 200.
- the controller 100 acquires the trajectory 301 of the radio controlled car 200 based on the time series data.
- the controller 100 creates data indicating the course 302 for the radio controlled car 200 based on the trajectory 301 of the radio controlled car 200.
- the data indicating the course includes a 3D object such as a white line indicating a course track or a course pylon.
- the course position, shape, and orientation are associated with actual map data. Alternatively, the course position, shape, and orientation are associated with latitude and longitude.
- the controller 100 may accept a course creation command via the touch panel 130 or the like.
- the controller 100 receives a slide operation on the map from the user while the map is displayed.
- the controller 100 determines the position, shape, and orientation of the course based on the slide operation.
- the controller 100 may store a plurality of course data in advance. That is, even if the radio controlled car 200 does not run, the user may select a desired course from a plurality of courses prepared in advance.
- the position, shape, and orientation of the course are associated with map data or latitude / longitude in advance.
- the shape of the course is prepared, and when the user selects a course, the position and orientation of the course may be designated by the user.
- the controller 100X for controlling the radio controlled car 200X can communicate with the controller 100Y for controlling the radio controlled car 200Y. After determining the course, the controller 100X transmits the course data to the controller 100Y, or receives the course data from the controller 100Y. That is, the controller 100X and the controller 100Y can control the traveling of the radio controlled cars 200X and 200Y based on common course data.
- when the radio controlled car 200Y travels the course first, the controller 100Y accumulates time-series data (race time-series data) of the position and orientation of the radio controlled car 200Y.
- the controller 100X acquires time-series data of the position and orientation of the radio controlled car 200Y controlled by the controller 100Y before the radio controlled car 200X travels.
- the controller 100 starts the race based on the user's command after acquiring the time series data.
- the controller 100 acquires the position and orientation of the radio controlled car 200 in the course based on the course data and the current position and orientation of the radio controlled car 200.
- the controller 100 refers to time series data of another radio controlled car from the other controller 100, and acquires the position and orientation of the other radio controlled car on the course based on the time from the start of the race.
- the controller 100 creates a virtual image indicating a 3D object viewed from the radio controlled car 200 from the course data and the positions and orientations of other radio controlled cars based on the position and the orientation of the radio controlled car 200 in the course.
- the virtual image includes not only an image showing a course but also an image showing another radio controlled car. Examples of other radio controlled car images include pre-registered sports car animations and face pictures of other radio controlled cars.
- the controller 100 superimposes the virtual image on the captured image from the radio controlled car 200 and displays the result.
- while the radio controlled car 200X is traveling on the course, the controller 100X accumulates time series data (race time series data) of the position and orientation of the radio controlled car 200X. When the traveling of the radio controlled car 200X ends, the controller 100X transmits the time series data of the position and orientation of the radio controlled car 200X to the controller 100Y or a third controller.
- the user can control the movement of the radio controlled car 200 while watching the virtual image showing the virtual course and the other radio controlled car together with the radio controlled car 200 that the user controls.
- as a result, the control of the movement of the radio controlled car 200 becomes full of realism for the user.
- <Configuration of controller 100> Since the hardware configuration of controller 100 is the same as that of the first embodiment, description thereof will not be repeated here. Hereinafter, data stored in the memory 120 and the operation of the CPU 110 of the controller 100 will be described.
- the CPU 110 controls each unit of the controller 100 by executing a program stored in the memory 120 or the storage medium 161. For example, the CPU 110 executes a control process of the radio controlled car 200 by executing a program stored in the memory 120 or the storage medium 161.
- the CPU 110 receives a captured image from the radio controlled car 200 via the communication interface 170.
- CPU 110 displays the captured image on touch panel 130.
- CPU 110 accepts a command (movement command) for moving radio controlled car 200 from the user via touch panel 130 or button 150.
- CPU 110 transmits a movement command to radio controlled car 200 via communication interface 170.
- the CPU 110 receives the current position and orientation from the radio controlled car 200 via the communication interface 170.
- CPU 110 stores time series data (course creation time series data) of the current position and direction of radio controlled car 200 in memory 120.
- CPU 110 acquires the trajectory 301 of the radio controlled car 200 based on the time series data.
- the CPU 110 creates data indicating the course 302 for the radio controlled car 200 based on the trajectory 301 of the radio controlled car 200.
- the data indicating the course (also referred to as course data) includes a 3D object (model data) such as a white line indicating the course track and a course pylon.
- the CPU 110 associates the course position, shape, and orientation with actual map data. Alternatively, the CPU 110 associates the position, shape, and direction of the course with latitude / longitude.
- the CPU 110 may create course data without causing the radio controlled car 200 to travel. More specifically, CPU 110 according to the present embodiment accepts a map acquisition command from the user via touch panel 130 or button 150. The CPU 110 downloads map data from an external server or the like via the communication interface 170. The CPU 110 may read map data from the memory 120 or the storage medium 161.
- CPU 110 accepts a command for creating a course for radio controlled car 200 from the user via touch panel 130 or button 150. Referring to FIG. 3, CPU 110 accepts a slide operation by the user while displaying a map image on touch panel 130. For example, the CPU 110 obtains the finger trajectory 301 on the map image by sequentially obtaining the touch position of the finger on the map image via the touch panel 130.
- the CPU 110 displays a map image and a pointer on the touch panel 130.
- the CPU 110 receives a pointer movement command from the user via the button 150.
- CPU 110 acquires the trajectory 301 of the pointer on the map image.
- CPU 110 creates data indicating the course 302 for the radio controlled car 200 based on the trajectory 301.
- the CPU 110 creates a course pylon, a track line, etc. for indicating the end of the course on both sides of the trajectory 301.
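Placing pylons or track lines on both sides of the trajectory 301 amounts to offsetting each trajectory point perpendicular to the direction of travel. The following Python sketch is purely illustrative and not part of the claimed embodiment; the function name, the flat x/y coordinates, and the `half_width` parameter are assumptions:

```python
import math

def place_pylons(trajectory, half_width):
    """For each segment of the trajectory, place one pylon on each
    side of the course at a perpendicular distance of half_width."""
    left, right = [], []
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy)
        if length == 0:
            continue  # skip degenerate (repeated) trajectory points
        # Unit normal perpendicular to the direction of travel.
        nx, ny = -dy / length, dx / length
        left.append((x0 + nx * half_width, y0 + ny * half_width))
        right.append((x0 - nx * half_width, y0 - ny * half_width))
    return left, right
```

For a straight trajectory along the x axis, `place_pylons([(0, 0), (1, 0), (2, 0)], 0.5)` yields one row of pylons at y = 0.5 and one at y = -0.5, delimiting a course of width 1 around the recorded trajectory.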
- the controller 100 may store a plurality of course data in advance. That is, the user may select a desired course from a plurality of courses prepared in advance.
- the position, shape, and orientation of the course are associated with map data or latitude / longitude in advance.
- the position and orientation of the course may be designated when the user selects the course.
- the CPU 110 transmits course data to other controllers 100 and receives course data from other controllers 100.
- the CPU 110 stores the course data received from the other controller 100 in the memory 120.
- CPU 110 accepts selection of the course data from the user via touch panel 130 or button 150.
- the CPU 110 receives time-series data (race time-series data) of the position and orientation of the other radio controlled car from the other controller 100 via the communication interface 170. CPU 110 stores the time-series data of the positions and orientations of other radio controlled cars in memory 120.
- FIG. 9 is an image diagram showing a database 123 of race time-series data of positions and orientations of other radio controlled cars stored in the memory 120 according to the second embodiment.
- CPU 110 associates the position and orientation of another radio controlled car transmitted from the other controller 100 with the time from the start of the race at which that position and orientation were measured, and stores them in the database 123 as race time-series data.
- the controller 100 stores in the database 123 race time-series data of the position of its own radio controlled car 200 and race time-series data of the positions and orientations of the other two radio controlled cars.
- the time from the start of the race is recorded in the HH:MM:SS format. Since the CPU 110 measures and records the time up to 1/100 second, the time is stored in the database 123 in the HH:MM:SS:NN format.
- latitude is prefixed with N (north latitude) or S (south latitude), and longitude is prefixed with E (east longitude) or W (west longitude).
- the direction is recorded in degrees: 0 is north, 90 is east, 180 is south, and 270 is west.
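The record format of the database 123 can be illustrated with a small encoder. The field layout below is inferred from the description and is an assumption, not the patented format:

```python
def encode_record(centi_seconds, lat_deg, lon_deg, direction_deg):
    """Encode one race time-series sample in the style described for
    database 123: time as HH:MM:SS:NN (NN = 1/100 s), latitude
    prefixed N/S, longitude prefixed E/W, and direction in degrees
    (0 = north, 90 = east, 180 = south, 270 = west)."""
    nn = centi_seconds % 100
    total_s = centi_seconds // 100
    hh, rem = divmod(total_s, 3600)
    mm, ss = divmod(rem, 60)
    lat = f"{'N' if lat_deg >= 0 else 'S'}{abs(lat_deg)}"
    lon = f"{'E' if lon_deg >= 0 else 'W'}{abs(lon_deg)}"
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{nn:02d}", lat, lon, direction_deg % 360
```

For example, a sample taken 123.45 seconds after the race start at 35.6 degrees north, 139.7 degrees east, heading east, would be encoded as `("00:02:03:45", "N35.6", "E139.7", 90)`.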
- based on the user's command, the CPU 110 starts the race.
- the CPU 110 acquires the position and orientation of the radio controlled car 200 in the course based on the course data and the current position and orientation of the radio controlled car 200.
- CPU 110 refers to the time series data of the other radio controlled car in database 123 and acquires the position and direction of the other radio controlled car on the course based on the time from the start of the race.
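One simple way to realize such a lookup is to bisect the recorded time series by the elapsed race time and linearly interpolate between the two neighboring samples. This sketch is illustrative only; the tuple layout and the linear interpolation (which ignores, e.g., direction wrap-around at 359/0 degrees) are assumptions, since the embodiment only specifies a time-indexed database:

```python
import bisect

def ghost_pose(series, elapsed):
    """Given race time-series samples [(t, x, y, direction), ...]
    sorted by time t from the race start, return the other car's
    interpolated position and direction at the elapsed time."""
    times = [s[0] for s in series]
    i = bisect.bisect_right(times, elapsed)
    if i == 0:
        return series[0][1:]       # before the first sample
    if i == len(series):
        return series[-1][1:]      # after the last sample
    t0, x0, y0, d0 = series[i - 1]
    t1, x1, y1, d1 = series[i]
    f = (elapsed - t0) / (t1 - t0)  # fraction of the way into the gap
    return (x0 + f * (x1 - x0), y0 + f * (y1 - y0), d0 + f * (d1 - d0))
```

With samples at t = 0 and t = 2, querying t = 1 returns the midpoint pose, so the displayed "ghost" car moves smoothly even between recorded samples.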
- the CPU 110 creates a virtual image indicating a 3D object viewed from the radio controlled car 200 from the course data and the positions and orientations of other radio controlled cars based on the position and the orientation of the radio controlled car 200 in the course.
- CPU 110 superimposes the virtual image on the captured image of radio controlled car 200 and displays it on touch panel 130.
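The superimposition can be pictured as a pixel-level overlay in which transparent virtual pixels let the camera pixel show through. A minimal sketch, assuming a list-of-rows image representation and `None` as the transparent value (neither of which is specified by the embodiment):

```python
def composite(captured, virtual):
    """Overlay the virtual image on the captured image: wherever the
    virtual image has a pixel (not None), it hides the camera pixel.
    Both images are row-major lists of equal dimensions."""
    return [
        [v if v is not None else c for c, v in zip(crow, vrow)]
        for crow, vrow in zip(captured, virtual)
    ]
```

A virtual image that is transparent everywhere leaves the captured image unchanged, while rendered course lines and pylons replace the camera pixels they cover.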
- when another radio controlled car is ahead, the CPU 110 includes an image of the other radio controlled car in the virtual image.
- Examples of other radio controlled car images include pre-registered sports car animations and face pictures of other radio controlled cars.
- the CPU 110 accumulates the race time series data of the position and direction of the radio controlled car 200 in the database 123 of the memory 120 while the radio controlled car 200 is traveling on the course. When the traveling of the radio controlled car 200 is completed, the CPU 110 transmits the race time series data of the position and direction of the radio controlled car 200 to the other controller 100 via the communication interface 170.
- before the start of the race, or while radio controlled car 200 is traveling to create a course, CPU 110 displays the captured image from radio controlled car 200 on touch panel 130.
- when the race starts or during the race, the CPU 110 superimposes virtual images, such as a white line 301X indicating the center of the course, a white line 302X indicating the end of the course, and a course pylon 303X, on the captured image from the radio controlled car 200.
- the CPU 110 may superimpose a 2D or 3D virtual image suitable for the start of the race on the captured image on the touch panel 130. That is, the memory 120 stores a start virtual image.
- CPU 110 receives a start point and a goal point in the course via touch panel 130 or button 150.
- CPU 110 reads the start virtual image from memory 120 when the race starts, based on the current position and direction of the radio controlled car 200 on the course.
- CPU 110 causes touch panel 130 to display a captured image, white lines 301X and 302X, course pylon 303X, characters indicating the start timing, an image of a traffic light, and the like.
- the CPU 110 may cause the touch panel 130 to display a 2D or 3D virtual image suitable for the race goal on the captured image. That is, the memory 120 stores a goal virtual image.
- the CPU 110 receives a start point and a goal point in the course via the touch panel 130 or the button 150.
- CPU 110 accepts designation of how many laps of the course are to be run via touch panel 130 or button 150.
- CPU 110 reads a virtual image from memory 120 when radio controlled car 200 approaches the goal point based on the current position and orientation of radio controlled car 200 on the course.
- CPU 110 causes touch panel 130 to display a captured image, white lines 301X and 302X, course pylon 303X, an image showing a goal point, and the like.
- <Configuration of radio controlled car 200> Note that the hardware configuration of the radio controlled car 200 is the same as that of the first embodiment, and therefore description thereof will not be repeated here.
- FIG. 10 is a flowchart showing a control method in controller 100 according to the second embodiment.
- when CPU 110 of controller 100 receives a race start command from the user via touch panel 130 or button 150, it executes the processing from step S202.
- it is assumed that the CPU 110 has already received the current position and orientation from the radio controlled car 200 at least once via the communication interface 170. It is further assumed that the CPU 110 has stored in the database 123 the race time series data of the position and direction of the other radio controlled car controlled by the other controller 100, received via the communication interface 170.
- the CPU 110 starts measuring the time from the start of the race using the clock 180 (step S202).
- the CPU 110 refers to the clock 180 and acquires the elapsed time from the start of the race (step S204).
- the CPU 110 creates the 3D virtual data of the course by reading the course data from the memory 120 (step S206).
- the CPU 110 refers to the time series data of other radio controlled cars in the database 123 and acquires the position and orientation of the other radio controlled cars on the course based on the time from the start of the race. CPU 110 creates 3D virtual data including the course and the other radio controlled car from the course data and the position and orientation of the other radio controlled car.
- CPU 110 determines whether or not a captured image or a current position and orientation is received from radio controlled car 200 via communication interface 170 (step S208).
- when a captured image or the current position and direction are not received from the radio controlled car 200 (NO in step S208), CPU 110 repeats the process from step S204.
- when CPU 110 receives the captured image and the current position and direction from radio controlled car 200 (YES in step S208), CPU 110 stores the current position and direction in database 123 in association with the time from the start of the race (step S210). CPU 110 determines the viewpoint with respect to the 3D virtual model showing the course and the other radio controlled car, based on the latest current position and direction of the radio controlled car 200 (step S212). That is, CPU 110 determines the viewpoint with respect to the 3D virtual model based on the position and direction of its own radio controlled car 200 with respect to the course and the other radio controlled car.
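Determining the viewpoint in step S212 amounts to expressing each point of the 3D model in a frame attached to the car's current position and direction. A hedged 2D sketch of that change of frame (the flat x/y coordinates and the sign convention for the lateral axis are assumptions; the direction convention of 0 = north, 90 = east follows the description of database 123):

```python
import math

def to_viewpoint(car_x, car_y, direction_deg, px, py):
    """Express a world point (px, py) in the car's camera frame: the
    car sits at the origin facing +y ("forward"), with direction_deg
    using the 0 = north, 90 = east convention. Returns (side, fwd),
    where side > 0 is to the car's right and fwd is depth ahead."""
    theta = math.radians(direction_deg)
    dx, dy = px - car_x, py - car_y
    # Rotate the world so the car's heading becomes the +y axis.
    fwd = dx * math.sin(theta) + dy * math.cos(theta)
    side = dx * math.cos(theta) - dy * math.sin(theta)
    return side, fwd
```

A pylon 5 units due east of a car that is itself facing east comes out as `(side, fwd) ≈ (0, 5)`: directly ahead, which is where the 3D renderer would draw it in the center of the view.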
- CPU 110 creates a 3D virtual image from the determined viewpoint (step S214).
- CPU 110 superimposes the latest 3D virtual image created on the latest captured image (step S216).
- CPU 110 causes touch panel 130 to display a composite image of the latest captured image and the latest 3D virtual image (step S218).
- CPU 110 determines whether an instruction for ending the race is received from the user via touch panel 130 or button 150 (step S220). When the command for ending the race is not received (NO in step S220), CPU 110 repeats the process from step S204.
- the program code itself read from the external storage media 161 and 261 or the memories 120 and 220 realizes the functions of the above-described embodiment, and the external storage media 161 and 261 and the memories 120 and 220 storing the program code constitute the present invention.
- needless to say, the present invention also includes the case where the program code read from the external storage medium 161, 261 or the memory 120, 220 is written to another storage medium provided in a function expansion board inserted into the computer or a function expansion unit connected to the computer, and thereafter the CPU of the function expansion board or function expansion unit performs part or all of the actual processing based on the instructions of the program code, whereby the functions of the above-described embodiment are realized.
- 1 network system, 100 controller, 110 CPU, 120 memory, 130 touch panel, 131 display, 132 tablet, 140 speaker, 150 button, 160 memory interface, 161 storage medium, 170 communication interface, 180 clock, 200 radio controlled car, 210 CPU, 220 memory, 230 moving mechanism, 240 GPS, 250 electronic compass, 260 memory interface, 261 storage medium, 270 communication interface, 280 clock, 290 camera, 301 trajectory, 302 course, 301X white line indicating the center of the course, 302X white line indicating the end of the course, 303X course pylon.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Toys (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present invention relates to a novel network system using model devices that move according to commands from controllers. A controller (100) receives a command for moving a model device and transmits the command to the model device. The model device (200) then moves according to the command from the controller, photographs the area in front of itself, acquires its position, acquires its direction, and transmits the captured image, the position, and the direction to the controller. Based on the position and direction received from the model device, the controller creates a virtual image to be displayed from data relating to the course of the model device, and displays on a display (130) an image composed of the captured image received from the model device and the virtual image.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011005036A JP2012143447A (ja) | 2011-01-13 | 2011-01-13 | ネットワークシステム、コントロール方法、コントローラ、およびコントロールプログラム |
| JP2011-005036 | 2011-01-13 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012096347A1 true WO2012096347A1 (fr) | 2012-07-19 |
Family
ID=46507239
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2012/050497 Ceased WO2012096347A1 (fr) | 2011-01-13 | 2012-01-12 | Système de réseau, procédé de commande, unité de commande et programme de commande |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2012143447A (fr) |
| WO (1) | WO2012096347A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016131342A (ja) * | 2015-01-15 | 2016-07-21 | 治幸 岩田 | 鉄道模型観賞用装置、方法、プログラム、専用表示モニタ、合成用情景画像データ |
| CN107305133A (zh) * | 2016-04-22 | 2017-10-31 | 中兴通讯股份有限公司 | 一种移动终端的采集图片方法及装置 |
| DE102018101862A1 (de) * | 2018-01-26 | 2019-08-01 | Perpetual Mobile Gmbh | Spielzeugfernsteueranlage |
Families Citing this family (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014063411A (ja) * | 2012-09-24 | 2014-04-10 | Casio Comput Co Ltd | 遠隔制御システム、制御方法、及び、プログラム |
| JPWO2014077046A1 (ja) | 2012-11-13 | 2017-01-05 | ソニー株式会社 | 画像表示装置及び画像表示方法、移動体装置、画像表示システム、並びにコンピューター・プログラム |
| US10269132B2 (en) | 2014-07-31 | 2019-04-23 | Sony Corporation | Displaying images according to head posture and camera posture |
| US10310617B2 (en) * | 2015-06-11 | 2019-06-04 | Intel Corporation | Drone controlling device and method |
| JP6598732B2 (ja) * | 2016-05-27 | 2019-10-30 | 京セラ株式会社 | 電子機器、制御装置、制御プログラム及び動画表示方法 |
| JP6586109B2 (ja) * | 2017-01-05 | 2019-10-02 | Kddi株式会社 | 操縦装置、情報処理方法、プログラム、及び飛行システム |
| JP6636558B2 (ja) * | 2018-03-30 | 2020-01-29 | 株式会社バンダイナムコエンターテインメント | 遠隔操作システム及びプログラム |
| JP2019219700A (ja) * | 2018-06-15 | 2019-12-26 | ブルーイノベーション株式会社 | 履歴管理システム、操作装置および履歴管理プログラム |
| JP6883628B2 (ja) * | 2019-09-06 | 2021-06-09 | Kddi株式会社 | 操縦装置、情報処理方法、及びプログラム |
| JP7484210B2 (ja) * | 2020-02-17 | 2024-05-16 | 株式会社セガ | ゲームシステム |
| JP7397482B2 (ja) * | 2020-04-22 | 2023-12-13 | 株式会社スパイシードローンキッチン | 無人移動体を用いた映像処理システム、映像処理方法及び映像処理装置 |
| JP6903800B1 (ja) * | 2020-07-13 | 2021-07-14 | 株式会社コロプラ | プログラム、情報処理方法、情報処理装置、及びシステム |
| JP7377372B2 (ja) * | 2020-09-30 | 2023-11-09 | 本田技研工業株式会社 | 画像処理装置および画像表示装置 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH07178235A (ja) * | 1993-12-21 | 1995-07-18 | Casio Comput Co Ltd | コントロール装置 |
| JP2006341086A (ja) * | 2005-05-11 | 2006-12-21 | Namco Bandai Games Inc | サーバシステム、プログラム、及び情報記憶媒体 |
| JP2008035962A (ja) * | 2006-08-02 | 2008-02-21 | Nintendo Co Ltd | 汎用遠隔制御機能を備えたゲーム装置 |
| JP2008253362A (ja) * | 2007-04-02 | 2008-10-23 | Taito Corp | 遠隔無線操縦型オンライン対戦ゲームシステムにおけるゲーム画像生成方法 |
- 2011-01-13 JP JP2011005036A patent/JP2012143447A/ja not_active Withdrawn
- 2012-01-12 WO PCT/JP2012/050497 patent/WO2012096347A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH07178235A (ja) * | 1993-12-21 | 1995-07-18 | Casio Comput Co Ltd | コントロール装置 |
| JP2006341086A (ja) * | 2005-05-11 | 2006-12-21 | Namco Bandai Games Inc | サーバシステム、プログラム、及び情報記憶媒体 |
| JP2008035962A (ja) * | 2006-08-02 | 2008-02-21 | Nintendo Co Ltd | 汎用遠隔制御機能を備えたゲーム装置 |
| JP2008253362A (ja) * | 2007-04-02 | 2008-10-23 | Taito Corp | 遠隔無線操縦型オンライン対戦ゲームシステムにおけるゲーム画像生成方法 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016131342A (ja) * | 2015-01-15 | 2016-07-21 | 治幸 岩田 | 鉄道模型観賞用装置、方法、プログラム、専用表示モニタ、合成用情景画像データ |
| CN107305133A (zh) * | 2016-04-22 | 2017-10-31 | 中兴通讯股份有限公司 | 一种移动终端的采集图片方法及装置 |
| DE102018101862A1 (de) * | 2018-01-26 | 2019-08-01 | Perpetual Mobile Gmbh | Spielzeugfernsteueranlage |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2012143447A (ja) | 2012-08-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2012096347A1 (fr) | Système de réseau, procédé de commande, unité de commande et programme de commande | |
| US11644832B2 (en) | User interaction paradigms for a flying digital assistant | |
| US11573562B2 (en) | Magic wand interface and other user interaction paradigms for a flying digital assistant | |
| JP6329343B2 (ja) | 画像処理システム、画像処理装置、画像処理プログラム、および画像処理方法 | |
| US10977865B2 (en) | Augmented reality in vehicle platforms | |
| JP7620128B2 (ja) | 映像表示装置及び映像表示方法 | |
| US8838381B1 (en) | Automatic video generation for navigation and object finding | |
| WO2016168722A1 (fr) | Interface baguette magique et autres paradigmes d'interaction d'utilisateur pour un assistant numérique volant | |
| US12007763B2 (en) | Magic wand interface and other user interaction paradigms for a flying digital assistant | |
| JP6788094B2 (ja) | 画像表示方法、画像表示システム、飛行体、プログラム、及び記録媒体 | |
| KR20200032547A (ko) | 자율주행차량용 ar게임 장치 및 그 방법 | |
| WO2012099194A1 (fr) | Dispositif de prise de vue, et procédé et système de réseau pour commander un dispositif de prise de vue | |
| US20200217665A1 (en) | Mobile platform, image capture path generation method, program, and recording medium | |
| JP2012143383A (ja) | コントローラ、模型装置、およびコントロール方法 | |
| CN110785720A (zh) | 信息处理装置、信息提示指示方法、程序以及记录介质 | |
| WO2021251441A1 (fr) | Procédé, système et programme | |
| KR101866464B1 (ko) | 전자 레이싱 장치 및 그 동작방법 | |
| US20240118703A1 (en) | Display apparatus, communication system, display control method, and recording medium | |
| JP2021057078A (ja) | 無人航空機、無人航空機の飛行制御装置、無人航空機の飛行制御方法、及びプログラム | |
| JP2023083072A (ja) | 方法、システムおよびプログラム | |
| JP2023157912A (ja) | 生成装置、生成方法および生成プログラム | |
| WO2022070851A1 (fr) | Procédé, système et programme |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12733921 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 12733921 Country of ref document: EP Kind code of ref document: A1 |