
WO2020080346A1 - Information processing device and program - Google Patents

Information processing device and program

Info

Publication number
WO2020080346A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
screen
operation area
dimensional space
virtual camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/040423
Other languages
English (en)
Japanese (ja)
Inventor
龍一郎 佐伯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sega Corp
Original Assignee
Sega Games Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Games Co Ltd filed Critical Sega Games Co Ltd
Publication of WO2020080346A1 publication Critical patent/WO2020080346A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to an information processing device and a program.
  • Patent Document 1 discloses an information processing device that moves an object in a virtual three-dimensional space when a user performs a touch operation, via a touch panel, on an operation area on the screen.
  • The present invention has been made in view of such circumstances, and an object thereof is to allow a user to easily operate an object while looking at a screen.
  • The main invention of the present invention for solving the above problems is an information processing apparatus comprising: a game image generation unit that generates a game image in which the object on a moving plane is viewed from a virtual camera arranged so as to look obliquely down on the moving plane on which the object moves in a virtual three-dimensional space; an operation area generation unit that generates an operation area in which a virtual operation object arranged parallel to the moving plane in the virtual three-dimensional space is viewed from the virtual camera; a display control unit that arranges the game image and the operation area on the same screen; a touch panel that detects a user's touch operation on the screen on which the game image and the operation area are arranged; and an object control unit that, based on the detection signal from the touch panel, determines the pointing direction on the screen indicated in the operation area by the user's touch operation, converts the determined pointing direction into a designated direction on the moving plane in the virtual three-dimensional space, and controls the object so that it moves in the converted designated direction on the moving plane.
  • FIG. 6A is a diagram showing a screen configuration example of the client terminal 10 in the comparative example.
  • FIG. 6B is a diagram showing the moving direction of the character in the virtual three-dimensional space.
  • That is, the information processing apparatus comprises the game image generation unit, the operation area generation unit, the display control unit, the touch panel, and the object control unit described above.
  • According to such an information processing apparatus, because the moving plane of the object and the operation plane of the virtual operation object are arranged parallel to each other in the virtual three-dimensional space, the pointing direction on the screen indicated by the user's touch operation and the moving direction on the screen in which the object moves will match, even though the game image seen from the virtual camera looking obliquely down on the virtual three-dimensional space and the operation area are placed on the same screen. This allows the user to easily operate the object while looking at the screen.
  • In an embodiment, the apparatus further comprises a virtual camera control unit for controlling the placement of the virtual camera in the virtual three-dimensional space. When the placement of the virtual camera is changed while the pointing direction on the screen indicated by the user's touch operation is maintained (that is, while the object continues to move in the pointing direction on the moving plane in the virtual three-dimensional space), the operation area generation unit generates an operation area in which the virtual operation object is viewed from the changed virtual camera, the display control unit converts the pointing direction on the moving plane before the viewing direction of the virtual camera was changed into the pointing direction on the screen to be indicated by the user's touch operation in the generated operation area, and the generated operation area may be arranged on the screen so as to match the converted pointing direction on the screen. According to such an information processing device, even if the placement of the virtual camera is changed while the user's touch operation continues, the object can be moved while maintaining its previous moving direction.
  • In an embodiment, the apparatus further comprises a virtual camera control unit for controlling the placement of the virtual camera in the virtual three-dimensional space, and the object control unit may determine, based on the detection signal from the touch panel, the slide distance on the screen indicated in the operation area by the user's slide operation, both before and after the viewing direction of the virtual camera is changed, and may control the object so that it moves at high speed in the virtual three-dimensional space when the determined slide distance is a predetermined distance or more. According to such an information processing device, the same slide distance as before the change suffices to move the object at high speed even after the viewing direction of the virtual camera is changed, so confusion caused by the change of the viewing direction of the virtual camera can be suppressed.
  • In an embodiment, when the viewing direction of the virtual camera is changed while the object control unit keeps the object moving at high speed in the designated direction on the moving plane in the virtual three-dimensional space (the slide distance on the screen indicated in the operation area by the user's slide operation being maintained), the object control unit may control the object so that its high-speed movement continues in the virtual three-dimensional space, on the condition that the slide distance on the screen indicated in the operation area by the user's slide operation continues to be maintained after the change. According to such an information processing device, the object does not suddenly drop to low-speed movement because the viewing direction of the virtual camera changes while the object is moving at high speed, which suppresses confusion caused by the change of the viewing direction of the virtual camera.
  • Another aspect of the present invention is a program for causing a computer to function as: game image generation means for generating a game image in which the object on the moving plane is viewed from a virtual camera arranged so as to look obliquely down on the moving plane on which the object moves in the virtual three-dimensional space; operation area generation means for generating an operation area in which a virtual operation object arranged parallel to the moving plane in the virtual three-dimensional space is viewed from the virtual camera; display control means for arranging the game image and the operation area on the same screen; and object control means for determining, based on a detection signal from the touch panel, the pointing direction on the screen indicated in the operation area by a user's touch operation, converting the determined pointing direction into a designated direction on the moving plane in the virtual three-dimensional space, and controlling the object so that it moves in the converted designated direction on the moving plane. With such a program, the user can easily operate the object while looking at the screen.
  • Hereinafter, the information processing device, the program, and the information processing system according to an embodiment of the present invention will be described in detail.
  • The present invention can be widely applied to an information processing device, a program, an information processing system, and the like that provide a game playable on a touch panel.
  • FIG. 1 is a configuration diagram showing an example of an information processing system 1 according to the present embodiment. As shown in FIG. 1, in the information processing system 1 according to the present embodiment, one or more client terminals 10 and a server device 20 are connected via a network N.
  • The client terminal 10 is a terminal device operated by a user, such as a smartphone, a tablet, or a PC, or a dedicated game device for home or arcade use.
  • the server device 20 manages and controls the game played by the user on the client terminal 10 and performs billing processing within the game.
  • the network N is the Internet or the like and includes mobile radio base stations and the like.
  • FIG. 2 is a hardware configuration diagram showing an example of the computer 50 according to the present embodiment.
  • the client terminal 10 according to the present embodiment is realized by, for example, the computer 50 having the hardware configuration shown in FIG.
  • the computer 50 is an example of an information processing device.
  • The computer 50 includes a CPU 51, a RAM 52, a ROM 53, a communication interface 54, an input device 55, a display device 56, an external interface 57, an HDD 58, and the like, which are connected to one another by a bus line B.
  • the CPU 51 is an arithmetic device that realizes control and functions of the entire computer by reading programs and data from a storage device such as the ROM 53 and the HDD 58 onto the RAM 52 and executing various processes based on the read programs and data.
  • RAM 52 is an example of a volatile semiconductor memory (storage device) for temporarily holding programs and data, and is also used as a work area when CPU 51 executes various processes.
  • the ROM 53 is an example of a non-volatile semiconductor memory (storage device) that can retain programs and data even when the power is turned off.
  • the ROM 53 stores programs and data such as a BIOS executed when the computer 50 is started up, OS settings, and network settings.
  • the communication interface 54 is an interface for connecting the computer 50 to the network N. This allows the computer 50 to perform data communication via the communication interface 54.
  • the input device 55 is a device used by the user to input various signals.
  • the input device 55 is an operation device such as a touch panel, operation keys or buttons, a keyboard or a mouse.
  • the client terminal 10 in this embodiment has at least a touch panel.
  • The touch panel is a capacitance-type panel laminated on the display device 56.
  • Transparent electrodes arranged in a grid detect a change in capacitance and output a corresponding detection signal.
  • The position of the center of gravity of the range of transparent electrodes in which the capacitance changed is identified as the touch position (pointing position) on the screen.
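  • As an illustrative aside (not part of the patent text), the centroid computation described above can be sketched as follows; the grid size, threshold, and function names are assumptions chosen only to make the example concrete:

```python
import numpy as np

def touch_position(delta_cap: np.ndarray, threshold: float = 0.5):
    """Return the (x, y) centroid of the cells whose capacitance changed, or None."""
    ys, xs = np.nonzero(delta_cap > threshold)   # electrodes with a capacitance change
    if xs.size == 0:
        return None                              # no touch detected
    weights = delta_cap[ys, xs]                  # weight by the size of the change
    return (float(np.average(xs, weights=weights)),
            float(np.average(ys, weights=weights)))

grid = np.zeros((8, 8))                          # assumed 8x8 electrode grid
grid[3:5, 2:4] = 1.0                             # a finger covering a 2x2 patch
print(touch_position(grid))                      # (2.5, 3.5): the patch's center of gravity
```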
  • the display device 56 is a device for displaying various information on the screen for a user who plays a game using a touch panel.
  • the display device 56 is, for example, a display such as liquid crystal or organic EL.
  • The external interface 57 is an interface for connecting to an external device so that data communication is possible. As a result, the computer 50 can read from and/or write to a recording medium via the external interface 57.
  • the external device is a recording medium such as a flexible disk, a CD, a DVD, an SD memory card, or a USB memory.
  • the HDD 58 is an example of a non-volatile storage device that stores programs and data.
  • the stored programs and data include an OS that is basic software that controls the entire computer, and applications that provide various functions on the OS.
  • Instead of the HDD 58, a drive device (e.g., a solid state drive: SSD) using flash memory (e.g., NAND) may be used.
  • the client terminal 10 can realize various processes as described below by executing the program in the computer 50 having the above-described hardware configuration.
  • FIG. 3 is a functional block diagram showing an example of the client terminal 10 according to the present embodiment.
  • the client terminal 10 according to the present embodiment is realized by the functional blocks shown in FIG. 3, for example.
  • the client terminal 10 realizes the control unit 100, the storage unit 120, the communication unit 140, the operation receiving unit 150, and the screen display unit 160 by executing the program.
  • the control unit 100 has a function of executing various processes in the client terminal 10.
  • the control unit 100 includes a game progression unit 101, an object control unit 102, a virtual camera control unit 103, a game image generation unit 104, an operation area generation unit 105, and a display control unit 106.
  • the game progression unit 101 controls the progress of various games (eg, action games) playable on the client terminal 10 based on the game operation received by the client terminal 10 from the user.
  • the object control unit 102 controls the behavior of objects arranged in a virtual three-dimensional space (virtual game space).
  • the object control unit 102 according to the present exemplary embodiment converts the designated direction on the screen designated by the user's touch operation into the designated direction on the moving plane in the virtual three-dimensional space, and the transformed designated direction on the moving plane. Control the object to move to.
  • This object control unit 102 includes functions as a touch position determination unit and an operation input determination unit.
  • The touch position determination unit determines the touch position on the screen indicated by the user's touch operation based on the detection signal from the touch panel. Specifically, the designated position (touch position) on the screen is determined from the positions of those transparent electrodes, among the transparent electrodes arranged in a grid pattern, whose capacitance changed. The operation input determination unit then determines the type of game operation input by the user playing the game on the touch panel.
  • For example, whether a touch operation has been input is determined by whether or not the user's touch position is located in the operation area, and whether a slide operation has been input is determined by whether or not the user's touch position has been slid in a given direction from within the operation area.
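  • The determination logic above can be sketched as follows (a minimal sketch, not the patent's implementation; the class, the names, and the slide threshold are assumptions):

```python
import math

SLIDE_THRESHOLD = 10.0  # assumed: pixels the finger must travel to count as a slide

class OperationArea:
    """Circular operation area on the screen (center and radius in screen coordinates)."""
    def __init__(self, center: tuple, radius: float):
        self.center, self.radius = center, radius

    def contains(self, pos: tuple) -> bool:
        return math.dist(pos, self.center) <= self.radius

def classify(area: OperationArea, touch_start: tuple, touch_now: tuple):
    if not area.contains(touch_start):
        return None          # touch began outside the operation area
    if math.dist(touch_start, touch_now) >= SLIDE_THRESHOLD:
        return "slide"       # finger moved away from where it first touched
    return "touch"

pad = OperationArea(center=(100, 400), radius=80)
print(classify(pad, (100, 400), (130, 380)))     # slide
print(classify(pad, (100, 400), (102, 401)))     # touch
```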
  • the virtual camera control unit 103 controls the placement of virtual cameras in a virtual three-dimensional space.
  • The virtual camera control unit 103 changes the placement of the virtual camera in the virtual three-dimensional space (for example, the viewpoint position, the viewing direction, and the angle of view) according to the user's operation, and thereby performs control so that the view of the virtual three-dimensional space seen from the virtual camera changes.
  • the game image generation unit 104 generates a game image when an object arranged in the virtual three-dimensional space is viewed from the virtual camera as a two-dimensional image.
  • The game image generation unit 104 in the present embodiment generates a game image in which the object on the moving plane is viewed from a virtual camera arranged so as to look obliquely down on the moving plane on which the object moves in the virtual three-dimensional space.
  • the game image generation unit 104 performs coordinate conversion of the object arranged in the three-dimensional coordinates represented by the world coordinate system into the view coordinate system with the virtual camera as a reference.
  • the game image generation unit 104 also performs interpolation processing such as light source processing and processing for mapping a texture on an object.
  • the operation area generation unit 105 generates an operation area when a virtual operation object placed in a virtual three-dimensional space is viewed from a virtual camera as a two-dimensional image.
  • the operation area generation unit 105 in the present embodiment generates an operation area when a virtual operation object, which is arranged so as to be parallel to a moving plane on which the object moves in the virtual three-dimensional space, is viewed from the virtual camera.
  • the operation area generation unit 105 performs a specific drawing process in the same manner as the game image generation unit 104.
  • The display control unit 106 composes, for each frame (for example, every 1/60 seconds), a game screen including two-dimensional images of the various objects arranged in the virtual three-dimensional space, and outputs the screen to the screen display unit 160.
  • the display control unit 106 in the present embodiment configures a game screen by arranging the game image generated by the game image generation unit 104 and the operation area generated by the operation area generation unit 105 on the same screen.
  • the storage unit 120 stores installed applications (game applications, etc.) and various information required in the client terminal 10.
  • the communication unit 140 communicates with the server device 20.
  • the screen display unit 160 acquires the game screen according to the control of the display control unit 106 and displays the screen on the client terminal 10.
  • FIG. 4 is a diagram showing the relationship between the world coordinate system, the view coordinate system, and the screen coordinate system when a game image is generated in the comparative example.
  • In FIG. 4, the three-dimensional coordinates (Xw, Yw, Zw) with the origin Ow as the reference of the virtual three-dimensional space are expressed as the world coordinate system, the three-dimensional coordinates (Xc, Yc, Zc) with the virtual camera as the origin Oc are expressed as the view coordinate system, and the two-dimensional coordinates (Xs, Ys) with the center of the screen surface (projection surface) corresponding to the screen of the client terminal 10 as the origin Os are expressed as the screen coordinate system.
  • The virtual camera is arranged at a height in the Yw-axis direction of the world coordinate system so as to look obliquely down on the Xw-Zw plane at a depression angle θ toward the Zw-axis direction.
  • the Zc axis direction of the view coordinate system is a direction perpendicular to the screen surface (origin Os of the screen coordinate system) corresponding to the screen of the client terminal 10, and coincides with the viewing direction of the virtual camera.
  • the Xc-Yc plane of the view coordinate system is parallel to the screen surface, and the distance from the view coordinate system origin Oc to the screen coordinate system origin Os coincides with the focal length of the virtual camera.
  • Character A which is an example of an object, is placed on the Xw-Zw plane of the world coordinate system.
  • the Xw-Zw plane of the world coordinate system coincides with the moving plane on which the character A moves in the virtual three-dimensional space.
  • the character A can move on the moving plane by changing the coordinate position on the Xw-Zw plane of the world coordinate system over time.
  • the game image generation unit 104 generates a game image when the character A on the moving plane is viewed from a virtual camera arranged so as to obliquely look down the moving plane on which the character A moves in the virtual three-dimensional space.
  • Specifically, using a coordinate transformation matrix determined from the placement of the virtual camera, the character A arranged in the world coordinate system (Xw, Yw, Zw) is coordinate-converted into the view coordinate system (Xc, Yc, Zc). Further, using the perspective projection method or the like, the character A arranged in the view coordinate system (Xc, Yc, Zc) is coordinate-converted into the screen coordinate system (Xs, Ys). By performing such geometric calculations and repeating the coordinate conversion, the character A in the virtual three-dimensional space can be generated as a two-dimensional image (game image).
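  • The world-to-view-to-screen chain can be sketched as follows (a hedged sketch, not the patent's code; the camera height, depression angle, and focal length are assumptions chosen for concreteness):

```python
import numpy as np

theta = np.radians(35.26)                                # assumed depression angle
cam_pos = np.array([0.0, 10.0, -10.0 / np.tan(theta)])   # placed so it looks at the origin
f = 1.0                                                  # assumed focal length

# View basis: Zc along the viewing direction, Xc parallel to Xw, Yc completing it.
zc = -cam_pos / np.linalg.norm(cam_pos)
xc = np.array([1.0, 0.0, 0.0])
yc = np.cross(zc, xc)

def world_to_view(p_w: np.ndarray) -> np.ndarray:
    d = p_w - cam_pos
    return np.array([xc @ d, yc @ d, zc @ d])            # coordinate conversion

def view_to_screen(p_c: np.ndarray):
    return f * p_c[0] / p_c[2], f * p_c[1] / p_c[2]      # perspective projection (divide)

character = np.array([1.0, 0.0, 1.0])                    # a point on the Xw-Zw moving plane
print(view_to_screen(world_to_view(character)))          # its (Xs, Ys) screen coordinates
```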
  • FIG. 5 is a diagram showing the relationship between the world coordinate system, the view coordinate system, and the screen coordinate system when the operation area is generated in the comparative example.
  • In FIG. 5, as in FIG. 4, the three-dimensional coordinates (Xw, Yw, Zw) are expressed as the world coordinate system, the three-dimensional coordinates (Xc, Yc, Zc) as the view coordinate system, and the two-dimensional coordinates (Xs, Ys) as the screen coordinate system.
  • the view coordinate system is set in association with the world coordinate system.
  • the Zc axis direction of the view coordinate system is a direction perpendicular to the screen surface (origin Os of the screen coordinate system) corresponding to the screen of the client terminal 10, and coincides with the viewing direction of the virtual camera.
  • the Xc-Yc plane of the view coordinate system is parallel to the screen surface, and the distance from the view coordinate system origin Oc to the screen coordinate system origin Os coincides with the focal length of the virtual camera.
  • the virtual operation pad P as an example of the virtual operation object has a circular plane and is arranged on the Xw-Zw plane of the world coordinate system.
  • the Xw-Zw plane of the world coordinate system is parallel to the screen surface corresponding to the screen of the client terminal 10 and the Xc-Yc plane of the view coordinate system. That is, the moving plane on which the character A moves in the virtual three-dimensional space is also parallel to the screen surface thereof and the Xc-Yc plane of the view coordinate system.
  • In the comparative example, the operation area generation unit 105 generates an operation area in which the virtual operation pad P, arranged parallel to the moving plane of the character A in the virtual three-dimensional space, is viewed from a virtual camera arranged so as to look down from directly above.
  • Specifically, using the coordinate transformation matrix calculated from the positional relationship between the virtual camera and the virtual operation pad P shown in FIG. 5, the virtual operation pad P arranged in the world coordinate system (Xw, Yw, Zw) is coordinate-converted into the view coordinate system (Xc, Yc, Zc). Further, using the perspective projection method or the like, the virtual operation pad P arranged in the view coordinate system (Xc, Yc, Zc) is coordinate-converted into the screen coordinate system (Xs, Ys). By performing such geometric calculations and repeating the coordinate conversion, the virtual operation pad P in the virtual three-dimensional space can be generated as a two-dimensional image (operation area).
  • FIG. 6A is a diagram showing a screen configuration example of the client terminal 10 in the comparative example.
  • FIG. 6B is a diagram showing the moving direction of the character in the virtual three-dimensional space.
  • the client terminal 10 in this embodiment has a touch panel 500 stacked on the screen.
  • The display control unit 106 performs control so that the game image 501 generated by the game image generation unit 104 based on the virtual camera arrangement shown in FIG. 4 and the operation area 502 generated by the operation area generation unit 105 based on the virtual camera arrangement shown in FIG. 5 are arranged on the same screen.
  • The user can move the character A by sliding the finger touching the screen so as to perform a slide operation on the operation area 502.
  • In the comparative example, however, because of the virtual camera arrangement settings, the pointing direction (45°) on the screen indicated in the operation area by the user's touch operation and the moving direction (30°) on the screen in which the character A moves do not match.
  • As a result, the character A does not move in the direction intended by the user, which makes it difficult for the user to operate the character A while looking at the screen.
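  • The mismatch can be reproduced numerically (an illustrative sketch under an orthographic approximation, not from the patent text: for a camera looking down at depression angle θ, a ground-plane direction φ appears on screen at atan(sin θ · tan φ), and θ ≈ 35.26° reproduces the 45°/30° figures of the comparative example):

```python
import math

theta = math.radians(35.26)                        # assumed depression angle

def world_to_screen_angle(phi_w_deg: float) -> float:
    """On-screen angle of a ground-plane direction seen by the oblique camera."""
    phi = math.radians(phi_w_deg)
    return math.degrees(math.atan2(math.sin(theta) * math.sin(phi), math.cos(phi)))

pointing_on_screen = 45.0                # slide direction the user inputs
world_direction = pointing_on_screen     # the flat, undistorted pad passes it through as-is
moving_on_screen = world_to_screen_angle(world_direction)

print(f"user points {pointing_on_screen:.0f} deg on screen, "
      f"character appears to move at {moving_on_screen:.1f} deg")   # ~30 deg: mismatch
```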
  • FIG. 7 is a diagram showing the relationship between the world coordinate system, the view coordinate system, and the screen coordinate system when the operation area is generated in the embodiment according to the present invention.
  • In FIG. 7, likewise, the three-dimensional coordinates (Xw, Yw, Zw) are expressed as the world coordinate system, the three-dimensional coordinates (Xc, Yc, Zc) as the view coordinate system, and the two-dimensional coordinates (Xs, Ys) as the screen coordinate system.
  • The virtual camera arrangement setting according to the present invention shown in FIG. 7 differs from the virtual camera arrangement setting of the comparative example shown in FIG. 5, and is the same as the virtual camera arrangement setting of the comparative example shown in FIG. 4.
  • That is, the virtual camera is arranged at a height in the Yw-axis direction of the world coordinate system so as to look obliquely down on the Xw-Zw plane at the depression angle θ toward the Zw-axis direction.
  • the view coordinate system is set in association with the world coordinate system.
  • the Zc axis direction of the view coordinate system is a direction perpendicular to the screen surface (origin Os of the screen coordinate system) corresponding to the screen of the client terminal 10, and coincides with the viewing direction of the virtual camera.
  • the Xc-Yc plane of the view coordinate system is parallel to the screen surface, and the distance from the view coordinate system origin Oc to the screen coordinate system origin Os coincides with the focal length of the virtual camera.
  • the virtual operation pad P is similar to the virtual operation pad P shown in FIG. 5, has a circular plane, and is arranged on the Xw-Zw plane of the world coordinate system.
  • the Xw-Zw plane of the world coordinate system coincides with the moving plane on which the character A shown in FIG. 4 moves.
  • The operation area generation unit 105 generates an operation area in which the virtual operation pad P, arranged parallel to the moving plane of the character A in the virtual three-dimensional space, is viewed from the virtual camera arranged so as to look obliquely down on the moving plane on which the character A moves.
  • Specifically, using the coordinate transformation matrix calculated from the positional relationship between the virtual camera and the virtual operation pad P shown in FIG. 7, the virtual operation pad P arranged in the world coordinate system (Xw, Yw, Zw) is coordinate-converted into the view coordinate system (Xc, Yc, Zc). Further, using the perspective projection method or the like, the virtual operation pad P arranged in the view coordinate system (Xc, Yc, Zc) is coordinate-converted into the screen coordinate system (Xs, Ys). By performing such geometric calculations and repeating the coordinate conversion, the virtual operation pad P in the virtual three-dimensional space can be generated as a two-dimensional image (operation area).
  • FIG. 8 is a diagram showing a screen configuration example of the client terminal 10 according to the present invention.
  • The display control unit 106 performs control so that the game image 501 generated by the game image generation unit 104 based on the virtual camera arrangement shown in FIG. 4 and the operation area 502 generated by the operation area generation unit 105 based on the virtual camera arrangement shown in FIG. 7 are arranged on the same screen.
  • the user can move the character A in the sliding direction by sliding the finger touching the screen to perform the slide operation on the operation area 502.
  • the operation area 502 shown in FIG. 8 is displayed in an elliptical shape instead of the circular shape shown in FIG. 6A due to the influence of the virtual camera arrangement setting shown in FIG. 7.
  • For example, when the pointing direction indicated in the operation area 502 by the user's touch operation is 30° on the screen, the moving direction of the character A on the moving plane in the virtual three-dimensional space (that is, on the Xw-Zw plane of the world coordinate system) is 45°, and the moving direction on the screen in which the character A moves is also 30°.
  • That is, although the pointing direction (30°) indicated in the operation area 502 on the screen by the user's touch operation differs from the moving direction (45°) on the moving plane in the virtual three-dimensional space on which the character A moves, the shape of the operation area is deformed (from a circle into an ellipse) under the influence of the virtual camera arrangement setting, so that the pointing direction (30°) indicated on the screen by the user's touch operation can be made to match the moving direction (30°) on the screen in which the character A moves.
  • As a result, the character A moves in the direction intended by the user, so the user can intuitively operate the character A while looking at the screen.
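  • Continuing the same illustrative model (assumed depression angle θ ≈ 35.26°, as in the earlier sketch), rendering the pad through the oblique camera amounts to applying the inverse mapping to the user's input, which makes the on-screen directions coincide:

```python
import math

theta = math.radians(35.26)                        # assumed depression angle

def world_to_screen_angle(phi_w_deg: float) -> float:
    phi = math.radians(phi_w_deg)
    return math.degrees(math.atan2(math.sin(theta) * math.sin(phi), math.cos(phi)))

def screen_to_world_angle(phi_s_deg: float) -> float:
    """Inverse mapping through the elliptically deformed operation area."""
    phi = math.radians(phi_s_deg)
    return math.degrees(math.atan2(math.sin(phi) / math.sin(theta), math.cos(phi)))

pointing_on_screen = 30.0
world_direction = screen_to_world_angle(pointing_on_screen)   # ~45 deg on the moving plane
moving_on_screen = world_to_screen_angle(world_direction)     # ~30 deg back on screen

print(world_direction, moving_on_screen)   # 45.0 30.0: input and on-screen motion coincide
```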
  • FIG. 9 is a flowchart illustrating an operation example regarding character movement of the client terminal 10 according to the present exemplary embodiment.
  • FIG. 10 is a diagram illustrating conversion of the designated direction.
  • A control procedure performed by the client terminal 10 to move the character A in the virtual three-dimensional space when the user performs a touch operation on the operation area 502 on the screen illustrated in FIG. 8 using the touch panel will now be described in detail.
  • First, the client terminal 10 determines whether or not an arbitrary position in the operation area 502 has been designated by a touch operation performed by the user, using the touch panel, on the operation area 502 on the screen shown in FIG. 8 (step S11).
  • When the object control unit 102 acquires the detection signal from the touch panel, it can determine the designated position on the screen indicated by the user's touch operation. Therefore, when the object control unit 102 can determine the designated position on the screen indicated by the user's touch operation, it determines whether or not the determined designated position on the screen is located in the operation area 502.
  • When it is determined that no position in the operation area 502 has been designated by the user's touch operation using the touch panel (step S11: NO), the client terminal 10 waits until the user performs a touch operation on the operation area 502. On the other hand, when it is determined that an arbitrary position in the operation area 502 has been designated by the user's touch operation using the touch panel (step S11: YES), it is determined whether or not a slide operation for moving the character A in the virtual three-dimensional space has been performed (step S12).
  • When the user performs a touch operation (slide operation) of continuously moving the finger in contact with the operation area 502 in a certain direction, the object control unit 102 continuously acquires detection signals from the touch panel and can determine, one after another, the designated positions in the operation area 502 that change continuously during the slide operation. Accordingly, when the object control unit 102 can successively determine the designated positions in the operation area 502 that change continuously with the user's slide operation, it determines whether or not a touch operation (slide operation) of continuously moving the finger touching the center of the operation area 502 in an arbitrary direction has been performed.
  • When it is determined that the slide operation for moving the character A in the virtual three-dimensional space has not been performed (step S12: NO), the client terminal 10 ends this process without moving the character A in the virtual three-dimensional space.
  • On the other hand, when it is determined that the slide operation has been performed (step S12: YES), the pointing direction on the screen indicated in the operation area 502 by the slide operation is determined (step S13).
  • Specifically, when the user continuously moves the finger touching the center of the operation area 502 in a certain direction, the object control unit 102 determines, based on the detection signals from the touch panel, the coordinates of the start point position Ps1 and the end point position Ps2 in the screen coordinate system, and from these determines the pointing angle with respect to the Xs axis of the screen coordinate system.
  • As shown in the left diagram of FIG. 10, it is assumed here that when the pointing direction is input in the operation area 502 on the screen shown in FIG. 8, the pointing angle with respect to the Xs axis of the screen coordinate system is 30°, and that the start point position Ps1 coincides with the origin Os of the screen coordinate system.
  • Next, the client terminal 10 converts the determined pointing direction on the screen indicated in the operation area 502 into the designated direction on the moving plane on which the character A moves in the virtual three-dimensional space (step S14).
  • Specifically, using an inverse perspective transformation matrix or the like calculated from the positional relationship between the virtual camera and the virtual operation pad P shown in FIG. 7, the object control unit 102 converts the determined coordinates of the start point position Ps1 and the end point position Ps2 in the screen coordinate system into the coordinates of the start point position Pc1 and the end point position Pc2 in the view coordinate system. Further, using an inverse coordinate transformation matrix calculated from the positional relationship between the virtual camera and the virtual operation pad P shown in FIG. 7, the coordinates of the start point position Pc1 and the end point position Pc2 in the view coordinate system are converted into the coordinates of the start point position Pw1 and the end point position Pw2 in the world coordinate system.
  • In this way, the pointing direction in the operation area 502 on the Xs-Ys plane of the screen coordinate system is converted into the designated direction on the virtual operation pad P on the Xw-Zw plane of the world coordinate system (that is, on the moving plane on which the character A moves in the virtual three-dimensional space).
  • Here, before the coordinate conversion, the pointing angle with respect to the Xs axis of the screen coordinate system was 30°, as shown in the left diagram of FIG. 10; after the coordinate conversion, the designated angle with respect to the Xw axis of the world coordinate system is 45°, as shown in the right diagram of FIG. 10.
  • Next, the client terminal 10 controls the character A so that it moves in the converted designated direction in the virtual three-dimensional space (step S15).
  • Specifically, the object control unit 102 translates the coordinates of the character A arranged on the Xw-Zw plane of the world coordinate system in the converted designated direction using a coordinate transformation matrix, thereby moving the character A in the converted designated direction in the virtual three-dimensional space.
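  • Steps S13 to S15 can be summarized in code (a hedged sketch: the closed-form angle conversion stands in for the patent's inverse perspective and coordinate transformation matrices, and θ is the assumed depression angle):

```python
import math

theta = math.radians(35.26)                            # assumed depression angle

def pointing_angle(ps1, ps2) -> float:                 # step S13: angle w.r.t. the Xs axis
    return math.atan2(ps2[1] - ps1[1], ps2[0] - ps1[0])

def to_world_angle(phi_s: float) -> float:             # step S14: screen -> moving plane
    return math.atan2(math.sin(phi_s) / math.sin(theta), math.cos(phi_s))

def move(char_xz, phi_w: float, speed: float):         # step S15: translate the character
    x, z = char_xz
    return (x + speed * math.cos(phi_w), z + speed * math.sin(phi_w))

end = (math.cos(math.radians(30)), math.sin(math.radians(30)))  # a 30 deg slide from Ps1
phi_s = pointing_angle((0.0, 0.0), end)
phi_w = to_world_angle(phi_s)
print(math.degrees(phi_s), math.degrees(phi_w))        # 30.0 45.0
print(move((0.0, 0.0), phi_w, speed=1.0))              # one step along the 45 deg direction
```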
  • FIG. 11 is a diagram for explaining the conversion of the pointing direction due to the layout change of the virtual camera.
  • When the user performs a touch operation (slide operation) of continuously moving the finger touching the operation area 502 on the screen shown in FIG. 8 and then holds the finger in place, the client terminal 10 can maintain the pointing direction on the screen designated in the operation area 502 so that the character A keeps moving in the designated direction in the virtual three-dimensional space. While the character A continues to move in the virtual three-dimensional space in this way, the user can also change the placement of the virtual camera in the virtual three-dimensional space by an operation.
  • When the placement of the virtual camera is changed, the pointing direction on the screen that the user has been indicating in the operation area 502 also changes in conjunction. As a result, even though the user keeps the touching finger in place without releasing it from the screen, the character A would start to move in the changed designated direction in the virtual three-dimensional space.
  • For example, assume that the touching finger is kept in place without being released from the screen and, as shown in the left diagram of FIG. 11A, the pointing direction in the operation area 502 on the Xs-Ys plane of the screen coordinate system is maintained at 30°.
  • Then, as shown in the right diagram of FIG. 11A, the designated direction on the virtual operation pad P on the Xw-Zw plane of the world coordinate system is maintained at 45°, and the character A continues to move in the 45° designated direction on the moving plane of the virtual three-dimensional space.
  • Suppose now that the viewing direction of the virtual camera is changed according to the user's operation, and the operation area 502 is deformed accordingly into a different elliptical shape. In this case, the pointing direction in the operation area 502 shown in the left diagram of FIG. 11B is maintained at 30°, the same as the direction indicated in the operation area 502 shown in the left diagram of FIG. 11A. However, when this designated direction is converted by the process of step S14 shown in FIG. 9, the designated direction on the virtual operation pad P shown in the right diagram of FIG. 11B is no longer the 45° of the virtual operation pad P shown in the right diagram of FIG. 11A, but a larger angle (here, 60°). That is, when the arrangement of the virtual camera is changed in this way, even though the user keeps the touching finger on the screen as it is, the character A, which has been moving in the 45° designated direction on the moving plane of the virtual three-dimensional space, starts to move in the 60° designated direction.
  • To address this, in the present embodiment, even if the virtual camera arrangement is changed while the user performs the slide operation and keeps the touching finger on the screen, the client terminal 10 allows the character A to continue moving on the moving plane of the virtual three-dimensional space in the same designated direction as before. The details are described below.
  • First, the operation area generation unit 105 generates the operation area in which the virtual operation pad P arranged on the moving plane of the virtual three-dimensional space is viewed from the virtual camera after the arrangement change (the operation area 502 shown in the left diagram of FIG. 11B).
  • Specifically, using a coordinate transformation matrix or the like calculated from the positional relationship between the virtual camera after the placement change and the virtual operation pad P, the virtual operation pad P placed in the world coordinate system (Xw, Yw, Zw) is coordinate-converted into the view coordinate system (Xc, Yc, Zc), and further, using the perspective projection method or the like, into the screen coordinate system (Xs, Ys).
  • Next, the display control unit 106 converts the designated direction on the virtual operation pad P before the placement change of the virtual camera (shown in the right diagram of FIG. 11C, and the same as that of the virtual operation pad P shown in the right diagram of FIG. 11A) into the pointing direction on the screen to be indicated by the user's slide operation. Specifically, the coordinates of the start point position Pw1 and the end point position Pw2 in the world coordinate system shown in the right diagram of FIG. 11C are converted into the coordinates of the start point position Pc1 and the end point position Pc2 in the view coordinate system, and further into the coordinates of the start point position Ps1 and the end point position Ps3 in the screen coordinate system.
  • In this way, the designated direction on the virtual operation pad P on the Xw-Zw plane of the world coordinate system (that is, on the moving plane on which the character A moves in the virtual three-dimensional space) is converted into the pointing direction on the screen. Here, before the coordinate conversion, the designated angle with respect to the Xw axis of the world coordinate system was 45°, as shown in the right diagram of FIG. 11C; after the coordinate conversion, the pointing angle with respect to the Xs axis of the screen coordinate system is assumed to be 15°. That is, if the pointing direction designated in the operation area 502 on the screen by the user's slide operation is 15°, the character A continues to move in the 45° designated direction on the moving plane of the virtual three-dimensional space, as before.
  • Then, the display control unit 106 arranges the operation area so that the designated direction indicated in the operation area 502 by the user's slide operation becomes 15° while the user's finger stays where it is: the end point position of the designated direction is shifted from the coordinate position of Ps3 to that of Ps2, and the start point position from that of Ps1 to that of Ps4. Specifically, the display control unit 106 performs a coordinate transformation of the operation area 502 in the screen coordinate system using a coordinate transformation matrix, and arranges the generated operation area 502 (the operation area 502 shown in the left diagram of FIG. 11C) on the screen so that the end point position (the coordinate position of Ps2) of the designated direction on the screen matches before and after the placement change of the virtual camera, and so that the converted pointing direction (15°) on the screen to be indicated in the operation area 502 is matched.
  • In this way, even if the placement of the virtual camera is changed while the user's slide operation continues, the character A can be moved while maintaining its previous moving direction.
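  • The remapping after a camera change can be sketched in the same illustrative model (the angles and the new depression angle are assumptions; the real device would use the full transformation matrices): the screen angle corresponding to the maintained 45° world heading is recomputed for the new camera, and the operation area is translated so the user's held finger now expresses that angle.

```python
import math

def screen_angle(theta: float, phi_w: float) -> float:
    """On-screen angle of a ground-plane heading under depression angle theta."""
    return math.atan2(math.sin(theta) * math.sin(phi_w), math.cos(phi_w))

phi_w = math.radians(45.0)                             # world heading to preserve
theta_old, theta_new = math.radians(35.26), math.radians(20.0)

old = screen_angle(theta_old, phi_w)                   # ~30 deg before the change
new = screen_angle(theta_new, phi_w)                   # a smaller angle after the change

# Translate the pad so that the finger, still held at its old screen position,
# sits at angle `new` from the pad's re-arranged start point.
finger = (math.cos(old), math.sin(old))
r = math.hypot(*finger)
new_start = (finger[0] - r * math.cos(new), finger[1] - r * math.sin(new))

print(math.degrees(old), math.degrees(new), new_start)
```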
  • FIG. 12 is a diagram illustrating a specific example of the slide operation on the operation area.
  • When the user performs a touch operation (slide operation) in which the finger touching the operation area 502 on the screen shown in FIG. 8 is continuously moved in a certain direction, it is also possible to change the moving speed of the character A in the virtual three-dimensional space according to the distance over which the finger is continuously moved (the slide distance).
  • However, because the operation area 502 on the screen is deformed according to the virtual camera arrangement settings before and after a change, the slide distance required for the slide operation would ordinarily differ.
  • In the present embodiment, therefore, both before and after the virtual camera arrangement is changed, the character A is moved at high speed in the virtual three-dimensional space when the slide distance input by the user through the slide operation is equal to or greater than a predetermined distance.
  • Specifically, both before and after the viewing direction of the virtual camera is changed, the object control unit 102 determines the slide distance on the screen indicated in the operation area 502 by the user's slide operation based on the detection signal from the touch panel, and controls the character A so that it moves at high speed in the virtual three-dimensional space if the determined slide distance is greater than or equal to the predetermined distance. For example, if the slide distance input by the user's slide operation is the distance S or more, the character A is moved at high speed in the virtual three-dimensional space.
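  • A minimal sketch of this speed rule (the distance S and the two speeds are assumed constants, not values from the patent):

```python
import math

S = 40.0  # assumed threshold slide distance, in screen pixels

def movement_speed(ps1, ps2, normal: float = 1.0, fast: float = 3.0) -> float:
    """High speed once the on-screen slide distance reaches S, otherwise normal speed."""
    return fast if math.dist(ps1, ps2) >= S else normal

print(movement_speed((0, 0), (30, 0)))   # 1.0 -> normal movement
print(movement_speed((0, 0), (50, 0)))   # 3.0 -> high-speed movement
```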
  • While the character A continues to move at high speed in the virtual three-dimensional space, it is also possible to keep the character A moving at high speed even if the placement of the virtual camera is changed.
  • Specifically, when the virtual camera control unit 103 changes the viewing direction of the virtual camera while the object control unit 102 keeps the character A moving at high speed in the designated direction on the moving plane in the virtual three-dimensional space (the slide distance on the screen indicated in the operation area 502 by the user's slide operation being maintained), the object control unit 102 controls the character A so that its high-speed movement continues in the virtual three-dimensional space, on the condition that the slide distance indicated in the operation area 502 by the user's slide operation continues to be maintained after the change.
  • the case where the moving speed of the character is changed according to the slide distance input by the user through the slide operation has been described as an example.
  • the present invention is not limited to this.
  • For example, the attack type or the skill type may be changed according to the slide distance input by the user through the slide operation (for instance, the attack type may be switched from weapon A to weapon B, or the skill that can be activated may be switched from skill A to skill B).
  • In the above description, the viewing direction of the virtual camera is changed according to the user's operation; however, it is also possible to control the viewing direction of the virtual camera so that it changes automatically according to the progress of the game rather than according to the user's operation.
  • A screen configuration in which the placement of the operation area 502 is fixed on the screen may be used, or a screen configuration in which an arbitrary position designated by the user with the first touch operation is set as a reference position and the operation area 502 is arranged at that reference position may be used.
  • In the above description, the designated direction on the screen indicated in the operation area 502 is converted into the designated direction on the moving plane on which the character A moves in the virtual three-dimensional space by using coordinate transformation, the perspective projection method, and the like; however, the present invention is not limited to this. For example, it is also possible to convert the pointing direction by obtaining the converted pointing angle from the area ratio between the operation area 502 and the virtual operation pad P.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention allows a user to easily operate an object while viewing a screen. Provided is an information processing device comprising: a game image generation unit that generates a game image in which an object on a moving plane, on which the object moves, is viewed from a virtual camera arranged so as to look diagonally down on the moving plane in a virtual three-dimensional space; an operation area generation unit that generates an operation area in which a virtual operation object arranged parallel to the moving plane in the virtual three-dimensional space is viewed from the virtual camera; a display control unit that causes the game image and the operation area to be arranged on the same screen; and an object control unit that determines, on the basis of a detection signal from a touch panel, a pointing direction on the screen indicated in the operation area through a touch operation by a user, converts the determined pointing direction on the screen indicated in the operation area into a designated direction on the moving plane in the virtual three-dimensional space, and performs control so as to move the object in the designated direction, on the moving plane, obtained by the conversion.
PCT/JP2019/040423 2018-10-16 2019-10-15 Dispositif et programme de traitement d'informations Ceased WO2020080346A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-195454 2018-10-16
JP2018195454A JP6569794B1 (ja) 2018-10-16 2018-10-16 情報処理装置及びプログラム

Publications (1)

Publication Number Publication Date
WO2020080346A1 true WO2020080346A1 (fr) 2020-04-23

Family

ID=67844848

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/040423 Ceased WO2020080346A1 (fr) 2018-10-16 2019-10-15 Dispositif et programme de traitement d'informations

Country Status (2)

Country Link
JP (1) JP6569794B1 (fr)
WO (1) WO2020080346A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115738264A (zh) * 2022-10-12 2023-03-07 网易(杭州)网络有限公司 虚拟对象的操作控制方法、装置和电子设备
WO2023142767A1 (fr) * 2022-01-27 2023-08-03 北京字跳网络技术有限公司 Procédé et appareil de commande d'objet virtuel, et support lisible et dispositif électronique

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7409770B2 (ja) * 2018-12-28 2024-01-09 株式会社バンダイナムコエンターテインメント ゲームシステム、プログラム及び端末装置
CN111282266B (zh) * 2020-02-14 2021-08-03 腾讯科技(深圳)有限公司 三维虚拟环境中的技能瞄准方法、装置、终端及存储介质
JP7580747B2 (ja) 2020-09-30 2024-11-12 株式会社コナミアミューズメント ゲームシステム、それに用いるコンピュータプログラム、及び制御方法
WO2022196349A1 (fr) * 2021-03-15 2022-09-22 株式会社コナミデジタルエンタテインメント Support d'enregistrement, dispositif de traitement d'informations, et procédé de traitement d'informations
JP7320286B2 (ja) * 2021-03-15 2023-08-03 株式会社コナミデジタルエンタテインメント プログラム、情報処理装置、および情報処理方法
JP7320287B2 (ja) * 2021-03-15 2023-08-03 株式会社コナミデジタルエンタテインメント プログラム、情報処理装置、および情報処理方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013085811A (ja) * 2011-10-20 2013-05-13 Konami Digital Entertainment Co Ltd ゲーム装置、ゲーム装置の制御方法、及びプログラム
JP2018057742A (ja) * 2016-10-07 2018-04-12 株式会社コーエーテクモゲームス ゲーム処理プログラム及び記憶媒体
WO2018104921A1 (fr) * 2016-12-08 2018-06-14 Digital Pulse Pty. Limited Système et procédé d'apprentissage collaboratif utilisant la réalité virtuelle
JP2018097649A (ja) * 2016-12-14 2018-06-21 エヌエイチエヌ エンターテインメント コーポレーションNHN Entertainment Corporation プログラム、画像制御装置および画像制御方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5237325B2 (ja) * 2010-04-28 2013-07-17 株式会社スクウェア・エニックス ビデオゲーム処理装置、ビデオゲーム処理方法、およびビデオゲーム処理プログラム
JP2014208258A (ja) * 2014-06-12 2014-11-06 株式会社スクウェア・エニックス ビデオゲーム処理装置、およびビデオゲーム処理プログラム
JP6643776B2 (ja) * 2015-06-11 2020-02-12 株式会社バンダイナムコエンターテインメント 端末装置及びプログラム
JP6005831B1 (ja) * 2015-12-28 2016-10-12 株式会社Cygames プログラム及び情報処理方法
JP6084719B1 (ja) * 2016-02-26 2017-02-22 株式会社コロプラ 画像処理方法、及び画像処理プログラム
JP6661513B2 (ja) * 2016-10-31 2020-03-11 株式会社バンク・オブ・イノベーション ビデオゲーム処理装置、及びビデオゲーム処理プログラム
JP6907148B2 (ja) * 2016-12-22 2021-07-21 株式会社コロプラ 情報処理方法、装置、及び当該情報処理方法をコンピュータに実行させるためのプログラム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013085811A (ja) * 2011-10-20 2013-05-13 Konami Digital Entertainment Co Ltd ゲーム装置、ゲーム装置の制御方法、及びプログラム
JP2018057742A (ja) * 2016-10-07 2018-04-12 株式会社コーエーテクモゲームス ゲーム処理プログラム及び記憶媒体
WO2018104921A1 (fr) * 2016-12-08 2018-06-14 Digital Pulse Pty. Limited Système et procédé d'apprentissage collaboratif utilisant la réalité virtuelle
JP2018097649A (ja) * 2016-12-14 2018-06-21 エヌエイチエヌ エンターテインメント コーポレーションNHN Entertainment Corporation プログラム、画像制御装置および画像制御方法

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"What is the white cat project Puni Kon?", CONQUEST MANUAL OF WHITE CAT PROJECT, 16 March 2015 (2015-03-16), Retrieved from the Internet <URL:http://web.archive.org/web/20150316120517> [retrieved on 20150920] *
"FINAL FANTASY VI" is updated on a large scale! A feeling of play for household revives, corresponding to the gamepad, 16 October 2014 (2014-10-16), Retrieved from the Internet <URL:https://app.famitsu.com/20141016_455607> [retrieved on 20190315] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023142767A1 (fr) * 2022-01-27 2023-08-03 北京字跳网络技术有限公司 Procédé et appareil de commande d'objet virtuel, et support lisible et dispositif électronique
CN115738264A (zh) * 2022-10-12 2023-03-07 网易(杭州)网络有限公司 虚拟对象的操作控制方法、装置和电子设备

Also Published As

Publication number Publication date
JP2020062179A (ja) 2020-04-23
JP6569794B1 (ja) 2019-09-04

Similar Documents

Publication Publication Date Title
JP6569794B1 (ja) 情報処理装置及びプログラム
US11752432B2 (en) Information processing device and method of causing computer to perform game program
US8830184B2 (en) Image displaying device, image displaying method, and program for displaying images
US9433857B2 (en) Input control device, input control method, and input control program
US11099723B2 (en) Interaction method for user interfaces
JP6969516B2 (ja) プログラム及び情報処理装置
CN111383345B (zh) 虚拟内容的显示方法、装置、终端设备及存储介质
CN111913565A (zh) 虚拟内容控制方法、装置、系统、终端设备及存储介质
TWI442305B (zh) 多點控制的操作方法及其控制系統
JP6394190B2 (ja) 遮蔽パターン検出に基づくジェスチャ制御を可能とするシステムと方法
JP2020062376A (ja) 情報処理装置及びプログラム
JP6521146B1 (ja) 情報処理装置及びプログラム
JP5767371B1 (ja) 仮想空間平面上に配置したオブジェクトを表示制御するゲーム・プログラム
JP6501533B2 (ja) アイコン選択のためのインターフェースプログラム
CN112402967B (zh) 游戏控制方法、装置、终端设备及介质
KR101528485B1 (ko) 스마트 디바이스 기반 가상현실 서비스 시스템 및 방법
JP5773818B2 (ja) 表示制御装置、表示制御方法及びコンピュータプログラム
JP2019202128A (ja) 情報処理装置及びプログラム
JP7473832B1 (ja) 電子機器及びプログラム
JP5997388B2 (ja) エミュレーション装置、エミュレーション方法、プログラム及び情報記憶媒体
CN115581917A (zh) 虚拟环境中视角控制的方法和装置
JP2019134881A (ja) プログラム及びゲーム装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19872465

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19872465

Country of ref document: EP

Kind code of ref document: A1