
WO2023128043A1 - Method and system for providing an interface for automated generation of 3D models - Google Patents

Method and system for providing an interface for automated generation of 3D models

Info

Publication number
WO2023128043A1
Authority
WO
WIPO (PCT)
Prior art keywords
model
display
mode
user
receiving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2022/000487
Other languages
English (en)
Korean (ko)
Inventor
윤경원
블랑코로저
현경훈
반성훈
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Recon Labs Inc
Original Assignee
Recon Labs Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Recon Labs Inc filed Critical Recon Labs Inc
Publication of WO2023128043A1 publication Critical patent/WO2023128043A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/20: Drawing from basic elements, e.g. lines or circles
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/20: Finite element generation, e.g. wire-frame surface description, tesselation

Definitions

  • the present disclosure relates to a method for providing an interface for automatically generating a 3D model, and more particularly, to a method and system for providing a user interface for automatically generating a 3D model using a 2D sketch.
  • Freehand sketching is an effective and efficient method of visualizing various design ideas in the initial stage of design and simulating the sense of form, production method, and function. To this end, designers conduct design simulation experiments using various sketch programs.
  • the present disclosure provides a method for providing an interface for automatically generating a 3D model, a computer program stored in a recording medium, and an apparatus (system) to solve the above problems.
  • the present disclosure may be implemented in a variety of ways, including a method, apparatus (system) or computer program stored on a readable storage medium.
  • According to an embodiment of the present disclosure, a method of providing an interface for automatically generating a 3D model, executed by at least one processor, includes receiving, in a first mode of the automatic 3D model generation interface, 2D sketch data of a target object viewed from a first viewpoint from a user; receiving a first user input for switching the automatic 3D model generation interface to a second mode; and displaying on a display, in the second mode, a 3D model of the target object generated based on the 2D sketch data, wherein the 2D sketch data includes a plurality of strokes.
  • According to an embodiment, the method further includes, after displaying the 3D model on the display, receiving from the user, in the second mode, an additional stroke for correcting the 3D model, and displaying on the display an updated 3D model of the target object generated based on the additional stroke, wherein the additional stroke is received while the 3D model is displayed on the display.
  • According to an embodiment, the method further includes receiving a second user input for rotating the 3D model and displaying the rotated 3D model, and the additional stroke is received while the rotated 3D model is displayed on the display.
  • According to an embodiment, the method further includes, after displaying the 3D model on the display, receiving a third user input for switching to the first mode, receiving from the user, in the first mode, an additional stroke for correcting the 3D model, receiving a fourth user input for switching to the second mode, and displaying on the display, in the second mode, an updated 3D model of the target object generated based on the 2D sketch data and the additional stroke, wherein the additional stroke is received while the 2D sketch data is displayed on the display.
  • the method further includes displaying on a display 2D sketch data of the target object corresponding to the second viewpoint.
  • According to an embodiment, the receiving of the 2D sketch data further includes receiving a plurality of strokes from the user in the first mode and displaying, on the display, a recommended stroke generated based on the plurality of strokes.
  • According to an embodiment, the recommended stroke is displayed overlapping the plurality of strokes, in a color or transparency different from that of the plurality of strokes.
  • According to an embodiment, the 2D sketch data includes a plurality of layers, each of the plurality of layers includes at least one stroke, and the 3D model displayed on the display is a 3D model generated based on strokes included in a selected layer among the plurality of layers.
  • According to an embodiment of the present disclosure, a computer program stored in a computer-readable recording medium is provided for executing, on a computer, the method of providing an interface for automatically generating a 3D model.
  • According to an embodiment of the present disclosure, an information processing system includes a communication module, a memory, and at least one processor connected to the memory and configured to execute at least one computer-readable program included in the memory. The at least one program includes instructions for receiving, through the communication module in the first mode of the automatic 3D model generation interface, 2D sketch data of a target object viewed from a specific viewpoint from a user, receiving a user input for switching to the second mode of the automatic 3D model generation interface, and displaying on a display, in the second mode, a 3D model of the target object generated based on the 2D sketch data, wherein the 2D sketch data includes a plurality of strokes.
  • a user can easily create/correct a 3D model by freely switching between the first mode (2D sketch view mode) and the second mode (3D sketch view mode).
  • a user may easily generate 2D sketch data by viewing a target object of the 2D sketch data from different viewpoints using an information processing system.
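  • For illustration only, the following is a minimal Python sketch (not part of the disclosure; the class and function names are hypothetical) of how the two view modes and the regenerate-on-correction behavior described above could be organized:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional, Tuple

Point = Tuple[float, float]

class ViewMode(Enum):
    SKETCH_2D = 1   # first mode: 2D sketch view
    MODEL_3D = 2    # second mode: 3D sketch (model) view

@dataclass
class Stroke:
    points: List[Point]                          # polyline sampled from pen input

def generate_3d_model(strokes: List[Stroke]) -> dict:
    """Stand-in for the 3D model conversion model described in the disclosure."""
    return {"n_strokes": len(strokes)}           # placeholder result for illustration

@dataclass
class SketchInterface:
    mode: ViewMode = ViewMode.SKETCH_2D
    strokes: List[Stroke] = field(default_factory=list)
    model_3d: Optional[dict] = None              # most recently generated 3D model

    def add_stroke(self, stroke: Stroke) -> None:
        # Strokes are accepted in either mode; in the 3D view they act as corrections.
        self.strokes.append(stroke)
        if self.mode is ViewMode.MODEL_3D:
            self.model_3d = generate_3d_model(self.strokes)

    def switch_mode(self, mode: ViewMode) -> None:
        # Entering the second mode triggers (re)generation from the current strokes.
        self.mode = mode
        if mode is ViewMode.MODEL_3D:
            self.model_3d = generate_3d_model(self.strokes)
```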
  • FIG. 1 is a diagram illustrating an example of converting 2D sketch data into a 3D model according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing the internal configuration of a user terminal and an information processing system according to an embodiment of the present disclosure.
  • FIG. 3 is a flowchart illustrating an example of a method for providing an interface for automatically generating a 3D model according to an embodiment of the present disclosure.
  • FIG. 4 is a diagram illustrating an example of a method of displaying a recommended stroke by a stroke recommendation model according to an embodiment of the present disclosure.
  • FIG. 5 is a diagram illustrating an example of a method of correcting a 3D model using an additional stroke according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram illustrating an example of a first mode of an interface for automatically generating a 3D model according to an embodiment of the present disclosure.
  • FIG. 7 is a diagram illustrating an example of a layer panel of an interface for automatically generating a 3D model according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram illustrating an example of a second mode of an interface for automatically generating a 3D model according to an embodiment of the present disclosure.
  • FIG. 9 is a flowchart illustrating an example of a method for providing an interface for automatically generating a 3D model according to an embodiment of the present disclosure.
  • A 'module' or 'unit' used in the specification means a software or hardware component, and the 'module' or 'unit' performs certain roles.
  • 'module' or 'unit' is not meant to be limited to software or hardware.
  • A 'module' or 'unit' may be configured to reside in an addressable storage medium and may be configured to execute on one or more processors.
  • A 'module' or 'unit' may include components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, or variables.
  • a 'module' or 'unit' may be implemented with a processor and a memory.
  • 'Processor' should be interpreted broadly to include general-purpose processors, central processing units (CPUs), microprocessors, digital signal processors (DSPs), controllers, microcontrollers, state machines, and the like.
  • 'processor' may refer to an application specific integrated circuit (ASIC), programmable logic device (PLD), field programmable gate array (FPGA), or the like.
  • A 'processor' may also refer to a combination of processing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Also, 'memory' should be interpreted broadly to include any electronic component capable of storing electronic information.
  • 'Memory' may refer to various types of processor-readable media, such as random access memory (RAM), read-only memory (ROM), non-volatile random access memory (NVRAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable PROM (EEPROM), flash memory, magnetic or optical data storage, registers, and the like.
  • a memory is said to be in electronic communication with the processor if the processor can read information from and/or write information to the memory.
  • Memory integrated with the processor is in electronic communication with the processor.
  • a 'system' may include at least one of a server device and a cloud device, but is not limited thereto.
  • a system may consist of one or more server devices.
  • a system may consist of one or more cloud devices.
  • the system may be operated by configuring a server device and a cloud device together.
  • a 'machine learning model' may include any model used to infer an answer to a given input.
  • The machine learning model may include an artificial neural network model including an input layer, a plurality of hidden layers, and an output layer, where each layer may include a plurality of nodes.
  • a machine learning model may refer to an artificial neural network model
  • an artificial neural network model may refer to a machine learning model.
  • a '3D model conversion model', a 'stroke recommendation model', and the like may be implemented as a machine learning model.
  • A model described herein as one machine learning model may include a plurality of machine learning models, and a plurality of models described as separate machine learning models may be implemented as a single machine learning model.
  • A 'display' may refer to any display device associated with a computing device, for example, any display device capable of displaying any information/data provided or controlled by the computing device.
  • 'Each of a plurality of A' may refer to each of all components included in the plurality of A, or each of some components included in the plurality of A.
  • a user may input 2D sketch data 110 to a user terminal using an input device.
  • the input device may be provided in the user terminal itself or may be configured to transmit and receive 2D sketch data 110 input by the user through wireless or wired communication with the user terminal.
  • the input device may be a digital pen or a tablet device.
  • According to an embodiment, the information processing system may receive, in the first mode (e.g., 2D sketch view mode) of the automatic 3D model generation interface, the 2D sketch data 110 of the target object viewed from a first viewpoint from the user.
  • a user may input 2D sketch data 110 of a target object using an input device and transmit the input 2D sketch data 110 to a user terminal.
  • the target object refers to an object that the user wishes to sketch.
  • For example, when the target object is a bookshelf, the user may input 2D sketch data 110 of the bookshelf through the input device.
  • the 2D sketch data 110 may include a plurality of strokes.
  • the user terminal may transmit the 2D sketch data 110 to the information processing system.
  • The information processing system may convert the received 2D sketch data 110 into a 3D model using a 3D model conversion model.
  • the information processing system may receive a user input for switching to the second mode (eg, 3D sketch view mode).
  • the information processing system may display the 3D model 120 of the target object generated based on the 2D sketch data 110 on the display in the second mode.
  • the information processing system may display the 3D model 120 on the display of the user terminal.
  • The information processing system may correct the 3D model based on receiving a user input for correcting the 3D model while the 3D model 120 is displayed. The correction of the 3D model is described in more detail below with reference to FIG. 3.
  • the process of generating a 3D model using 2D sketch data has been described as being performed by an information processing system, but is not limited thereto and may be implemented differently in other embodiments.
  • at least some or all of a series of processes of generating a 3D model from 2D sketch data may be performed by a user terminal.
  • the following will be described on the premise that the 3D model generation process is performed by the information processing system.
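  • As a purely illustrative sketch of this client-to-system round trip (the endpoint URL, payload layout, and response fields below are assumptions introduced for illustration, not part of the disclosure), the exchange between a user terminal and the information processing system could look like this:

```python
import requests  # third-party HTTP client, assumed available; any client would do

# Hypothetical endpoint of the information processing system.
GENERATION_URL = "https://example.com/api/generate-3d-model"

def request_3d_model(strokes, viewpoint="front"):
    """Send 2D sketch data (a list of strokes) and receive 3D model data."""
    payload = {
        "sketch": {
            # Each stroke is a list of (x, y) points captured from the input device.
            "strokes": [{"points": stroke} for stroke in strokes],
            "viewpoint": viewpoint,        # viewpoint from which the sketch was drawn
        }
    }
    response = requests.post(GENERATION_URL, json=payload, timeout=30)
    response.raise_for_status()
    return response.json()                 # e.g., {"vertices": [...], "faces": [...]}

if __name__ == "__main__":
    bookshelf_strokes = [[(0, 0), (0, 100)], [(0, 100), (60, 100)]]
    print(request_3d_model(bookshelf_strokes))
```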
  • A user can create a 3D model using only 2D sketch data composed of strokes, without having to create a 2D image in a constrained form in order to create the 3D model. Accordingly, image simulation for a greater number of designs can be performed efficiently.
  • FIG. 2 is a block diagram showing the internal configuration of the user terminal 210 and the information processing system 230 according to an embodiment of the present disclosure.
  • the user terminal 210 may refer to any computing device capable of executing a 3D modeling application, a web browser, etc. and capable of wired/wireless communication, and may include, for example, a mobile phone terminal, a tablet terminal, a PC terminal, and the like.
  • the user terminal 210 may include a memory 212 , a processor 214 , a communication module 216 and an input/output interface 218 .
  • The information processing system 230 may include a memory 232, a processor 234, a communication module 236, and an input/output interface 238. As shown in FIG. 2, the user terminal 210 and the information processing system 230 may be configured to communicate information and/or data through the network 220 using their respective communication modules 216 and 236.
  • the input/output device 240 may be configured to input information and/or data to the user terminal 210 through the input/output interface 218 or output information and/or data generated from the user terminal 210.
  • The memories 212 and 232 may include any non-transitory computer-readable medium. According to one embodiment, the memories 212 and 232 may include a permanent mass storage device such as random access memory (RAM), read-only memory (ROM), a disk drive, a solid state drive (SSD), or flash memory. As another example, a permanent mass storage device such as a ROM, an SSD, flash memory, or a disk drive may be included in the user terminal 210 or the information processing system 230 as a separate permanent storage device distinct from the memory. In addition, the memories 212 and 232 may store an operating system and at least one program code (e.g., code for a 3D modeling application installed and driven in the user terminal 210).
  • Such a separate computer-readable recording medium may include a recording medium directly connectable to the user terminal 210 and the information processing system 230, for example, a floppy drive, a disk, a tape, a DVD/CD-ROM drive, or a memory card.
  • Software components may be loaded into the memories 212 and 232 through a communication module rather than a computer-readable recording medium. For example, at least one program may be loaded into the memories 212 and 232 based on a computer program installed by files provided through the network 220 by developers or by a file distribution system that distributes application installation files.
  • the processors 214 and 234 may be configured to process instructions of a computer program by performing basic arithmetic, logic, and input/output operations. Instructions may be provided to processors 214 and 234 by memory 212 and 232 or communication modules 216 and 236 . For example, processors 214 and 234 may be configured to execute instructions received according to program code stored in a recording device such as memory 212 and 232 .
  • The communication modules 216 and 236 may provide configurations or functions for the user terminal 210 and the information processing system 230 to communicate with each other through the network 220, and may provide configurations or functions for the user terminal 210 and/or the information processing system 230 to communicate with other user terminals or other systems (e.g., a separate cloud system).
  • For example, a request or data (e.g., a 3D model generation request, 2D sketch data of a target object, etc.) generated by the processor 214 of the user terminal 210 according to program code stored in a recording device such as the memory 212 may be transmitted to the information processing system 230 through the network 220 under the control of the communication module 216.
  • Conversely, a control signal or command provided under the control of the processor 234 of the information processing system 230 may be received by the user terminal 210 through the communication module 216 of the user terminal 210 after passing through the communication module 236 and the network 220.
  • the user terminal 210 may receive 3D model data of a target object from the information processing system 230 through the communication module 216 .
  • the input/output interface 218 may be a means for interfacing with the input/output device 240 .
  • The input device may include a device such as a camera including an audio sensor and/or an image sensor, a keyboard, a microphone, or a mouse.
  • the output device may include a device such as a display, speaker, haptic feedback device, or the like.
  • As another example, the input/output interface 218 may be a means for interfacing with a device, such as a touch screen, in which configurations or functions for performing input and output are integrated into one. For example, when the processor 214 of the user terminal 210 processes a command of a computer program loaded into the memory 212, a service screen configured using information and/or data provided by the information processing system 230 or other user terminals may be displayed on the display through the input/output interface 218.
  • Although the input/output device 240 is shown as not being included in the user terminal 210 in FIG. 2, the present disclosure is not limited thereto, and the input/output device 240 and the user terminal 210 may be configured as one device.
  • The input/output interface 238 of the information processing system 230 may be a means for interfacing with a device (not shown) for input or output that is connected to the information processing system 230 or that the information processing system 230 may include.
  • Although the input/output interfaces 218 and 238 are shown as separate elements from the processors 214 and 234, they are not limited thereto, and the input/output interfaces 218 and 238 may be included in the processors 214 and 234.
  • the user terminal 210 and the information processing system 230 may include more components than those shown in FIG. 2 . However, there is no need to clearly show most of the prior art components. According to one embodiment, the user terminal 210 may be implemented to include at least some of the aforementioned input/output devices 240 . In addition, the user terminal 210 may further include other components such as a transceiver, a global positioning system (GPS) module, a camera, various sensors, and a database. For example, when the user terminal 210 is a smart phone, it may include components that are generally included in a smart phone, for example, an acceleration sensor, a gyro sensor, a camera module, various physical buttons, and a touch screen.
  • Various components such as buttons using a touch panel, input/output ports, and a vibrator for vibration may also be further included in the user terminal 210.
  • the processor 214 of the user terminal 210 may be configured to operate an application providing a 3D model generation service. At this time, codes associated with the application and/or program may be loaded into the memory 212 of the user terminal 210 .
  • The processor 214 may receive text, images, video, voice, and/or actions input or selected through an input device connected to the input/output interface 218, such as a touch screen, a keyboard, a camera including an audio sensor and/or an image sensor, or a microphone, and may store the received text, image, video, voice, and/or action in the memory 212 or provide it to the information processing system 230 through the communication module 216 and the network 220.
  • For example, the processor 214 may receive 2D sketch data through the input/output interface 218 and the input device, receive a user input requesting generation of a 3D model of a target object, and provide the 2D sketch data to the information processing system 230 through the communication module 216 and the network 220.
  • the processor 214 may provide the corrected 2D sketch data to the information processing system 230 through the communication module 216 and the network 220 .
  • The processor 214 of the user terminal 210 may be configured to manage, process, and/or store information and/or data received from the input/output device 240, other user terminals, the information processing system 230, and/or a plurality of external systems. Information and/or data processed by the processor 214 may be provided to the information processing system 230 through the communication module 216 and the network 220.
  • the processor 214 of the user terminal 210 may transmit and output information and/or data to the input/output device 240 through the input/output interface 218 .
  • the processor 214 may display the received information and/or data on the screen of the user terminal.
  • the processor 234 of the information processing system 230 may be configured to manage, process, and/or store information and/or data received from a plurality of user terminals 210 and/or a plurality of external systems. Information and/or data processed by the processor 234 may be provided to the user terminal 210 via the communication module 236 and the network 220 .
  • The processor 234 of the information processing system 230 may receive, from the user terminal 210, 2D sketch data of a target object viewed from a specific viewpoint, and may receive a user input for switching to the second mode (e.g., 3D sketch view mode). In the second mode, a 3D model of the target object may be generated based on the 2D sketch data.
  • the processor 234 of the information processing system 230 may provide the 3D model thus generated to the user terminal 210 through the communication module 236 and the network 220 .
  • The processor 234 of the information processing system 230 may be configured to output the processed information and/or data through an output device 240 of the user terminal 210, such as a device capable of display output (e.g., a touch screen or a display) or a device capable of audio output (e.g., a speaker).
  • For example, the processor 234 of the information processing system 230 may provide a 3D model of the target object to the user terminal 210 through the communication module 236 and the network 220, and the generated 3D model may be output through a device of the user terminal 210 capable of display output.
  • The user terminal 210 may receive 2D sketch data of a target object (310).
  • a user may input a 2D sketch of a target object through an input device connected to the user terminal 210 by wire or wirelessly.
  • a user may input 2D sketch data using a WebGL-based 3D framework.
  • 2D sketch data may include a plurality of strokes. That is, the user terminal 210 may receive a plurality of strokes included in the 2D sketch data from the user. The user terminal 210 may transmit 2D sketch data to the information processing system 230 .
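  • To make the notion of a stroke concrete, the following minimal Python sketch (illustrative only; the recorder and its method names are hypothetical) shows how pointer events could be accumulated into the strokes that make up the 2D sketch data:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class StrokeRecorder:
    """Accumulates pen/pointer samples into strokes; names are illustrative."""
    strokes: List[List[Point]] = field(default_factory=list)
    _current: List[Point] = field(default_factory=list)

    def pen_down(self, x: float, y: float) -> None:
        self._current = [(x, y)]

    def pen_move(self, x: float, y: float) -> None:
        if self._current:
            self._current.append((x, y))

    def pen_up(self) -> None:
        if len(self._current) > 1:        # ignore single-point taps
            self.strokes.append(self._current)
        self._current = []

recorder = StrokeRecorder()
recorder.pen_down(0, 0)
recorder.pen_move(0, 50)
recorder.pen_up()
print(len(recorder.strokes))  # -> 1
```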
  • the information processing system 230 may determine whether a recommended stroke is necessary (320). For example, the information processing system 230 may determine whether a recommended stroke is necessary based on a user input.
  • the information processing system 230 may generate a recommended stroke by inputting a plurality of strokes included in the user's sketch data to the stroke recommendation model (330).
  • the stroke recommendation model may generate a recommended stroke based on at least one of the user's stroke writing habit and the type of the target object.
  • the recommended stroke may be a single stroke or a plurality of strokes.
  • the stroke recommendation model may use a pre-stored sketch database to recommend strokes based on the user's stroke writing habit and/or the type of object.
  • For example, a data-based design learning model, a natural language processing learning model using 3D model generation and modification techniques, and the like may be used, but the stroke recommendation model is not limited thereto, and various machine learning algorithms and models may be used.
  • a sketch database used in the stroke recommendation model may include a plurality of strokes included in past sketch data of various users.
  • The information processing system 230 may store the 2D sketch data including the input strokes as an image whenever the user inputs a stroke. Accordingly, the stroke recommendation model may recommend strokes similar to the stroke habits of the same or other users, and the time required for the user to create a sketch may be reduced.
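  • The disclosure does not fix the internals of the stroke recommendation model; as one hedged illustration, a retrieval-style heuristic over a pre-stored sketch database could look like the following Python sketch (the function names, feature encoding, and matching scheme are all assumptions):

```python
import numpy as np  # assumed available

def stroke_signature(stroke, n=16):
    """Resample a stroke (list of (x, y) points) into a fixed-length feature vector."""
    pts = np.asarray(stroke, dtype=float)
    t = np.linspace(0.0, 1.0, len(pts))
    ti = np.linspace(0.0, 1.0, n)
    resampled = np.column_stack(
        [np.interp(ti, t, pts[:, 0]), np.interp(ti, t, pts[:, 1])]
    )
    resampled -= resampled.mean(axis=0)          # translation invariance
    scale = np.abs(resampled).max() or 1.0       # avoid division by zero
    return (resampled / scale).ravel()           # rough scale invariance

def recommend_next_stroke(current_strokes, sketch_database):
    """Return the next stroke of the stored sketch whose prefix best matches the input.

    `sketch_database` is a list of past sketches, each a list of strokes. This
    nearest-prefix retrieval merely illustrates one way a sketch database could
    be used; the actual recommendation model is not specified at this level.
    """
    k = len(current_strokes)
    query = np.concatenate([stroke_signature(s) for s in current_strokes])
    best, best_dist = None, float("inf")
    for sketch in sketch_database:
        if len(sketch) <= k:
            continue
        prefix = np.concatenate([stroke_signature(s) for s in sketch[:k]])
        dist = float(np.linalg.norm(prefix - query))
        if dist < best_dist:
            best, best_dist = sketch[k], dist    # recommend the stroke that followed
    return best
```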
  • The information processing system 230 may visualize the 2D sketch data on the display (350). If there is a recommended stroke, the information processing system 230 may overlap and display the plurality of strokes included in the 2D sketch data and the recommended stroke.
  • the plurality of strokes included in the two-dimensional sketch data and the recommended stroke may be visually distinguished.
  • the information processing system 230 may display a plurality of strokes and recommended strokes included in the 2D sketch data in different colors.
  • the information processing system 230 may display a plurality of strokes and recommended strokes included in the 2D sketch data to have different transparency. Accordingly, the user can visually distinguish the stroke input by the user and the recommended stroke.
  • the information processing system 230 may generate a 3D model of the target object by inputting the 2D sketch data to a 3D model generation model (360). If there is a recommended stroke, the information processing system 230 may generate a 3D model by inputting a plurality of strokes and recommended strokes included in the 2D sketch data to a 3D model generation model.
  • a method of generating a 3D model using the 3D model generation model will be described later in detail with reference to FIG. 8 .
  • the information processing system 230 may display the generated 3D model on the display (370). Accordingly, the user can check the 3D model of the 2D sketch through the display.
  • The information processing system 230 may visualize the 3D model in any one of a 2D view, a 3D view, and a recommendation view (380).
  • the 2D view is a function of displaying 2D sketch data input by a user on a display.
  • the 3D view is a function of displaying the generated 3D model on a display.
  • the recommendation view is a function of displaying a recommended stroke on 2D sketch data (or a 3D model).
  • the information processing system 230 may display any one of a 2D view, a 3D view, and a recommended view on a display based on a user input.
  • the information processing system 230 may receive a user input for correction of the 3D model while the 3D model is visualized. For example, the information processing system 230 may receive an additional stroke for correction of the 3D model while the 2D sketch data is displayed on the display. As another example, the information processing system 230 may receive an additional stroke for correction of the 3D model while the 3D model is displayed on the display.
  • The information processing system 230 may generate 2D sketch data including the additional strokes by returning to the step of receiving the 2D sketch data (310). Also, the information processing system 230 may display the generated 2D sketch data on the display. After that, the information processing system 230 may newly generate a 3D model using the generated 2D sketch data.
  • the information processing system 230 may end generation of the 3D model based on receiving a user input indicating that correction of the 3D model is not necessary while the 3D model is visualized. In this case, the information processing system 230 may store the finally generated 3D model in the database.
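  • The overall loop of FIG. 3 can be summarized with the control-flow sketch below (in Python; the `ui` object, the callables passed to it, and the tiny console stand-in are assumed interfaces introduced only for illustration, not the actual implementation):

```python
class ConsoleUI:
    """Tiny stand-in UI so the flow below can actually run; purely illustrative."""
    def __init__(self, sketch, corrections):
        self._sketch, self._corrections = list(sketch), list(corrections)
    def receive_sketch(self):            # step 310
        return list(self._sketch)
    def wants_recommendation(self):      # step 320
        return False
    def show_2d(self, sketch):           # step 350
        print(f"2D view: {len(sketch)} strokes")
    def show_3d(self, model):            # steps 370/380
        print(f"3D view: {model}")
    def receive_correction(self):        # additional strokes, if any
        return self._corrections.pop(0) if self._corrections else []

def run_generation_session(ui, recommend, generate, save):
    """Control flow mirroring the flowchart of FIG. 3."""
    sketch = ui.receive_sketch()                       # step 310
    while True:
        if ui.wants_recommendation():                  # step 320
            sketch.extend(recommend(sketch))           # step 330
        ui.show_2d(sketch)                             # step 350
        model = generate(sketch)                       # step 360
        ui.show_3d(model)                              # steps 370/380
        correction = ui.receive_correction()
        if not correction:
            save(model)                                # persist the final model
            return model
        sketch.extend(correction)                      # loop with the corrected sketch

ui = ConsoleUI(sketch=[[(0, 0), (1, 1)]], corrections=[[[(1, 1), (2, 0)]]])
run_generation_session(
    ui,
    recommend=lambda sketch: [],
    generate=lambda sketch: {"n_strokes": len(sketch)},
    save=lambda model: print(f"saved {model}"),
)
```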
  • the information processing system 230 may receive 2D sketch data 410 .
  • the information processing system 230 may receive a plurality of strokes from the user in the first mode (eg, 2D sketch view mode).
  • the information processing system 230 may display a recommended stroke generated based on a plurality of strokes on a display.
  • the information processing system 230 may input a plurality of strokes included in the 2D sketch data 410 to the stroke recommendation model.
  • The stroke recommendation model may be a model configured to generate a subsequent stroke or strokes for completing a final 2D sketch image based on the strokes input so far.
  • the stroke recommendation model may generate a recommended stroke based on a pre-stored database.
  • When at least one recommended stroke is generated, the information processing system 230 may superimpose the at least one recommended stroke on the plurality of strokes (e.g., the 2D sketch data 410) and display them (420).
  • the recommended stroke may be displayed 420 in a different color or transparency than the 2D sketch data 410 . That is, the recommended stroke may be displayed to be visually distinguished from a plurality of strokes included in the 2D sketch data 410 input by the user.
  • the information processing system 230 may display the recommended stroke and a plurality of strokes included in the 2D sketch data 410 input by the user in different thicknesses. Accordingly, the user can check the directly input stroke and the recommended stroke at a glance, and can easily determine whether or not to use the recommended stroke.
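  • As an illustration of this visual distinction (the plotting library, colors, widths, and transparency values below are assumptions, not mandated by the disclosure), user strokes and recommended strokes could be overlaid as follows:

```python
import matplotlib.pyplot as plt  # assumed available for illustration

def draw_sketch(user_strokes, recommended_strokes):
    """Overlay recommended strokes on user strokes with a distinct color, width, and alpha."""
    fig, ax = plt.subplots()
    for stroke in user_strokes:
        xs, ys = zip(*stroke)
        ax.plot(xs, ys, color="black", linewidth=2.0, alpha=1.0)      # user input
    for stroke in recommended_strokes:
        xs, ys = zip(*stroke)
        ax.plot(xs, ys, color="tab:blue", linewidth=1.0, alpha=0.4)   # recommendation
    ax.set_aspect("equal")
    ax.invert_yaxis()          # screen coordinates: y grows downward
    return fig

fig = draw_sketch(
    user_strokes=[[(0, 0), (0, 100)], [(0, 100), (60, 100)]],
    recommended_strokes=[[(60, 100), (60, 0)]],
)
fig.savefig("sketch_preview.png")   # or plt.show() in an interactive session
```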
  • the information processing system 230 may receive an additional stroke 521 for correction of the 3D model in the second mode (eg, a state in which the 3D model 520 is displayed).
  • the additional stroke 521 may be recommended by a stroke recommendation model or input by a user input through an input device.
  • the information processing system 230 may display an updated 3D model 530 of the target object generated based on the additional stroke 521 on the display.
  • the information processing system 230 may generate an updated 3D model 530 of the generated target object based on the additional stroke 521 .
  • the updated 3D model 530 may include a 3D structure corresponding to the additional stroke 521 .
  • the information processing system 230 may display an updated 3D model of the target object generated based on the additional stroke 521 on the display.
  • The information processing system 230 may receive an additional stroke 511 for correction of the 3D model 520 in the first mode (e.g., a state in which the 2D sketch data 510 is displayed).
  • the information processing system 230 may receive a user input (eg, a third user input) for switching to the first mode.
  • the information processing system 230 may receive an additional stroke 511 for correcting the 3D model from the user in the first mode.
  • the information processing system 230 may receive a user input (eg, a fourth user input) for switching to the second mode.
  • The information processing system 230 may generate an updated 3D model 530 of the target object based on the 2D sketch data 510 and the additional stroke 511 in the second mode.
  • the 3D model 530 thus updated may be displayed on the display.
  • FIG. 6 is a diagram illustrating an example of a first mode of an interface for automatically generating a 3D model according to an embodiment of the present disclosure.
  • an icon for selecting a view mode may be included in the user interface 600 of a program for generating/correcting a 3D model according to the present invention.
  • When the first mode is selected, 2D sketch data 630 of the target object input by the user may be displayed (e.g., 2D sketch view mode).
  • the user terminal and/or information processing system may receive 2D sketch data 630 composed of a plurality of strokes in the first mode.
  • the user terminal may display a 3D model generated based on the 2D sketch input by the user, and a detailed description thereof will be described later with reference to FIG. 8 .
  • a menu 640 for checking the user's past strokes may be displayed together.
  • the menu 640 for checking past strokes may be configured in the form of a slide bar.
  • The slide bar may include at least one indicator corresponding to a stroke input in the past. Accordingly, based on a user input for selecting a specific indicator of the slide bar, the user terminal may display a stroke corresponding to the selected indicator in the central area of the user interface 600, or display in the central area only the strokes input prior to the selected indicator. Accordingly, the user can easily check past input strokes and more conveniently correct the 2D sketch using the past strokes.
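  • A minimal sketch of this slide-bar behavior (the function name and data shapes are illustrative assumptions) is:

```python
def strokes_up_to(strokes, selected_index):
    """Strokes to draw when the indicator `selected_index` on the slide bar is chosen.

    Each indicator corresponds to one past stroke; selecting it shows only the
    strokes that had been entered up to (and including) that point in time.
    """
    return strokes[: selected_index + 1]

history = ["stroke0", "stroke1", "stroke2", "stroke3"]
print(strokes_up_to(history, 1))   # -> ['stroke0', 'stroke1']
```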
  • a user interface 600 of a program for generating/correcting a 3D model according to the present invention may include a work area 650 .
  • the work area 650 represents an area in which a work environment of a program for generating/correcting a 3D model can be configured so that necessary functions can be quickly performed according to the purpose of the work.
  • the work area 650 may include a layer panel and a view switching icon, which will be described later with reference to FIGS. 7 and 8 .
  • the user may input an additional stroke for correcting the 3D model in the 2D sketch view mode.
  • the user terminal may receive a user input for switching to the first mode (eg, selection of the first mode icon 610) after the 3D model is displayed on the display in the second mode.
  • the information processing system 230 may receive an additional stroke for correcting the 3D model from the user in the first mode.
  • the user terminal may display a corrected 3D model based on the 2D sketch data and the additional stroke.
  • the user terminal may receive a user input for switching to the second mode (eg, selection of the second mode icon 620).
  • the user terminal may display on the display an updated 3D model of the target object generated based on the 2D sketch data and the additional stroke in the second mode.
  • the user may select a recommended stroke menu (not shown) and be provided with a recommended stroke generated based on 2D sketch data.
  • the user can more conveniently complete the 2D sketch data using the recommended stroke.
  • the recommended stroke provided by the system may be displayed in the central area together with the stroke input by the user on the 2D sketch data.
  • the plurality of recommended strokes provided by the system may be provided in a separate screen area instead of the central area. In this case, the user may select and use one of a plurality of recommended strokes provided by the system.
  • The layer panel 700 may include a layer show/hide toggle icon 710, a layer freeze/unfreeze toggle icon 720, a sketch show/hide toggle icon 730, and a 3D reconstruction enable/disable toggle icon 740.
  • the 2D sketch data may include a plurality of layers, and each of the plurality of layers may include at least one stroke.
  • a user may use the layer show/hide toggle icon 710 to display or not display a specific layer on the user interface.
  • The user may prohibit or permit modification of a specific layer using the layer freeze/unfreeze toggle icon 720.
  • the user may use the show/hide sketch toggle icon 730 to display or not display strokes included in a specific layer on the user interface.
  • the user may use the 3D reconstruction enable/disable toggle icon 740 to use or not use strokes included in a specific layer to create a 3D model.
  • A user may create a 3D model based on strokes included in a selected layer among the plurality of layers. For example, the user may create a 3D model using at least one layer selected, through the 3D reconstruction enable/disable toggle icon 740, to be used for reconstructing the 3D model.
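  • For illustration, the four per-layer toggles of FIG. 7 and the selection of strokes for reconstruction could be modeled as follows (a Python sketch with assumed names, not the actual implementation):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Layer:
    """Per-layer flags mirroring the four toggles of the layer panel (FIG. 7)."""
    name: str
    strokes: List[list] = field(default_factory=list)
    visible: bool = True            # show/hide layer (710)
    frozen: bool = False            # freeze/unfreeze editing (720)
    show_sketch: bool = True        # show/hide the layer's strokes (730)
    use_for_3d: bool = True         # enable/disable 3D reconstruction (740)

def strokes_for_reconstruction(layers: List[Layer]) -> List[list]:
    """Collect the strokes that would be fed to the 3D model generation model."""
    selected = []
    for layer in layers:
        if layer.use_for_3d:
            selected.extend(layer.strokes)
    return selected

layers = [
    Layer("shelves", strokes=[[(0, 0), (0, 100)]]),
    Layer("annotations", strokes=[[(5, 5), (10, 10)]], use_for_3d=False),
]
print(len(strokes_for_reconstruction(layers)))  # -> 1
```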
  • FIG. 8 is a diagram illustrating an example of a second mode of an interface for automatically generating a 3D model according to an embodiment of the present disclosure.
  • an icon for selecting a view mode may be included in the user interface 800 of the program for generating/correcting the 3D model 830 according to the present invention.
  • 2D sketch data of the target object input by the user may be displayed.
  • the user terminal and/or information processing system may receive 2D sketch data composed of a plurality of strokes in the 2D sketch view mode.
  • After the 2D sketch data is input, when the user selects the second mode icon 820, the user terminal may display on the display the 3D model 830 generated based on the 2D sketch input by the user (e.g., 3D sketch view mode).
  • the user terminal may receive an input of an additional stroke for correcting the 3D model 830 while the 3D model 830 is displayed on the screen.
  • the user can change the display size of the 3D model 830, rotate it, etc., using the icon 840 for selecting a display method of the 3D model.
  • the user may input an additional stroke for correction after rotating the 3D model 830 in the 3D sketch view mode.
  • the 3D model 830 can be rotated using the icon 840 .
  • the user may input additional strokes to correct the rotated 3D model 830 .
  • the user terminal may display the updated 3D model 830 on the display.
  • The user terminal may display 2D sketch data of the 3D model 830 rotated in the 3D sketch view mode (or of the 3D model corrected in the rotated state). For example, after the 3D model 830 is displayed on the display, the user may rotate the 3D model 830 and select the first mode icon 810 to switch to the first mode. In this case, 2D sketch data of the target object corresponding to the rotated 3D model viewed from the second viewpoint may be displayed on the display.
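  • A minimal numerical sketch of this viewpoint change (assuming a simple rotation about the vertical axis followed by an orthographic projection; the projection actually used by the interface is not specified in the disclosure) is:

```python
import numpy as np  # assumed available

def rotate_y(points_3d, angle_rad):
    """Rotate 3D points about the vertical (y) axis, emulating rotation of the model."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rot = np.array([[c, 0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s, 0.0, c]])
    return np.asarray(points_3d, dtype=float) @ rot.T

def project_to_2d(points_3d):
    """Orthographic projection onto the view plane: drop the depth coordinate."""
    return np.asarray(points_3d, dtype=float)[:, :2]

# Rotating the model and then projecting yields the 2D sketch seen from the
# second viewpoint, analogous to switching back to the first mode after rotation.
model_points = [[0, 0, 0], [1, 0, 0], [1, 2, 0]]
second_view_sketch = project_to_2d(rotate_y(model_points, np.pi / 6))
print(second_view_sketch)
```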
  • a user interface 800 of a program for generating/correcting a 3D model according to the present invention may include a work area 850.
  • the work area 850 may include view change icons 852 and 854 .
  • the user may select the view switching icon 852 to display the 3D model 830 viewing the target object from the first viewpoint on the display.
  • the user may select the view switching icon 854 to display a 3D model viewed from a different viewpoint (eg, a second viewpoint) on the display.
  • the method 900 of providing an automatic 3D model generation interface may be performed by a user terminal or at least one processor of an information processing system.
  • the method 900 may start with a processor receiving 2D sketch data of a target object viewed from a first viewpoint from a user in a first mode of an interface for automatically generating a 3D model (S910).
  • the 2D sketch data may include a plurality of strokes.
  • the receiving of the two-dimensional sketch data (S910) includes receiving a plurality of strokes from the user in a first mode and displaying recommended strokes generated based on the plurality of strokes on a display.
  • the recommended stroke may be displayed overlapping with a plurality of strokes, and may be displayed in a color or transparency different from that of the plurality of strokes.
  • the processor may receive a first user input for switching the 3D model automatically generating interface to the second mode (S920).
  • the processor may display on the display a 3D model of the target object generated based on the 2D sketch data in the second mode (S930).
  • the processor may receive an additional stroke for correcting the 3D model from the user in the second mode.
  • the processor may display an updated 3D model of the target object generated based on the additional stroke on the display.
  • the additional stroke may be received while the 3D model is displayed on the display.
  • the processor may receive a second user input for rotating the 3D model from the user in the second mode.
  • the processor may display the rotated 3D model on the display.
  • the processor may receive an additional stroke for correcting the rotated 3D model from the user in the second mode.
  • the processor may display an updated 3D model of the target object on the display based on the additional stroke.
  • the additional stroke may be received while the rotated 3D model is displayed on the display.
  • According to an embodiment, after displaying the 3D model on the display (S930), the processor may receive a third user input for switching to the first mode and may receive, from the user in the first mode, an additional stroke for correcting the 3D model. Thereafter, the processor may receive a fourth user input for switching to the second mode and may display on the display, in the second mode, an updated 3D model of the target object generated based on the 2D sketch data and the additional stroke. Here, the additional stroke may be received while the 2D sketch data is displayed on the display.
  • According to an embodiment, the processor may receive a fifth user input for rotating the 3D model from the user in the second mode. In response to receiving the fifth user input, the processor may display the rotated 3D model on the display. Thereafter, the processor may receive a sixth user input for switching to the first mode, and in response to receiving the sixth user input, the processor may display on the display 2D sketch data of the target object corresponding to the rotated 3D model viewed from a second viewpoint.
  • the 2D sketch data includes a plurality of layers, and each of the plurality of layers may include at least one stroke.
  • the 3D model displayed on the display in step S930 may be a 3D model generated based on strokes included in a selected layer among a plurality of layers.
  • the above method may be provided as a computer program stored in a computer readable recording medium to be executed on a computer.
  • the medium may continuously store programs executable by a computer or temporarily store them for execution or download.
  • The medium may be various recording means or storage means in the form of single hardware or a combination of several pieces of hardware, but is not limited to a medium directly connected to a certain computer system, and may be distributed over a network. Examples of the medium include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical recording media such as CD-ROM and DVD, magneto-optical media such as floptical disks, and media configured to store program instructions, such as ROM, RAM, and flash memory.
  • examples of other media include recording media or storage media managed by an app store that distributes applications, a site that supplies or distributes various other software, and a server.
  • The processing units used to perform the techniques may be implemented within one or more ASICs, DSPs, digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, electronic devices, other electronic units designed to perform the functions described in this disclosure, a computer, or a combination thereof.
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, eg, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other configuration.
  • The techniques may also be implemented as instructions stored on a computer-readable medium, such as random access memory (RAM), read-only memory (ROM), non-volatile random access memory (NVRAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable PROM (EEPROM), flash memory, a compact disc (CD), a magnetic or optical data storage device, or the like. The instructions may be executable by one or more processors and may cause the processor(s) to perform certain aspects of the functionality described in this disclosure.
  • Computer readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a computer.
  • Such computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
  • For example, if the software is transmitted from a website, server, or other remote source using coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included within the definition of the medium.
  • Disk and disc, as used herein, include CD, laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium can be coupled to the processor such that the processor can read information from or write information to the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and storage medium may reside within an ASIC.
  • An ASIC may exist within a user terminal.
  • the processor and storage medium may exist as separate components in a user terminal.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure relates to a method of providing an interface for automated generation of 3D models, performed by one or more processors. The method may include the steps of: receiving, from a user, a 2D sketch of an object of interest viewed from a first viewpoint in a first mode of the interface for automated 3D model generation; receiving a first user input for switching to a second mode of the interface for automated 3D model generation; and, in the second mode, displaying on a display a 3D model of the object of interest generated on the basis of the 2D sketch data, wherein the 2D sketch data may include a plurality of strokes.
PCT/KR2022/000487 2021-12-29 2022-01-11 Method and system for providing an interface for automated generation of 3D models Ceased WO2023128043A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210191656A KR102504720B1 (ko) 2021-12-29 2021-12-29 Method and system for providing an automatic 3D model generation interface
KR10-2021-0191656 2021-12-29

Publications (1)

Publication Number Publication Date
WO2023128043A1 true WO2023128043A1 (fr) 2023-07-06

Family

ID=85326693

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/000487 Ceased WO2023128043A1 (fr) Method and system for providing an interface for automated generation of 3D models

Country Status (2)

Country Link
KR (1) KR102504720B1 (fr)
WO (1) WO2023128043A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024255931A3 (fr) * 2023-06-16 2025-02-20 魔芯(湖州)科技有限公司 User interaction method for a high-resolution sketch generation model, and processing method and apparatus

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12164847B1 (en) 2023-05-24 2024-12-10 Scintium Ltd Method of generating a 3d computer-aided design (CAD) and system therefor


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050023004A * 2003-08-28 2005-03-09 인테크놀러지(주) Apparatus and method for oriental painting simulation
KR101391386B1 * 2012-11-30 2014-05-07 엔디에스솔루션 주식회사 3D image modeling system and method
KR101462419B1 * 2013-09-09 2014-11-17 (주)토탈소프트뱅크 Terminal for drawing vector graphics
KR20160148885A * 2015-06-17 2016-12-27 (주)유니드픽쳐 3D modeling and 3D shape production technique using 2D images
KR20200114429A * 2019-03-28 2020-10-07 한국과학기술원 Image processing method and apparatus


Also Published As

Publication number Publication date
KR102504720B1 (ko) 2023-02-28

Similar Documents

Publication Publication Date Title
WO2023128027A1 Method and system for 3D modeling based on an irregular sketch
WO2025048073A1 Method and system for automatically generating a knowledge graph
CN112116690B (zh) 视频特效生成方法、装置及终端
WO2023128043A1 (fr) Procédé et système de fourniture d'interface pour la génération automatisée de modèles 3d
CN112258622B (zh) 图像处理方法、装置、可读介质及电子设备
WO2023113093A1 Method and system for three-dimensional modeling based on volume inference
WO2020116960A1 Electronic device for generating a video comprising characters, and method therefor
WO2023128045A1 Method and system for generating freehand sketch images for machine learning
WO2023177007A1 Method for operating an electronic device to provide a page, and electronic device supporting same
WO2016159624A1 Method and system for implementing word learning, and recording medium
CN104461231B (zh) 信息显示控制装置以及信息显示控制方法
WO2015093754A1 Method and device for sharing connection information in an electronic device
WO2014126331A1 Display apparatus and control method therefor
US20240273005A1 (en) Detecting and resolving video and audio errors in a metaverse application
WO2024025034A1 Method for simultaneously creating 2D and 3D content, and converged creation device therefor
WO2018034509A1 Method and system for creating a website, implemented in a web browser
WO2019172463A1 Method, system, and non-transitory computer-readable recording medium for recommending a profile photo
WO2018194197A1 Method and system for image correction through correction pattern analysis
WO2024038950A1 Method for managing item information and electronic device for providing same
WO2024071519A1 Method and system for generating dynamic extended reality (XR) content
CN114117092A (zh) 远程协作方法、装置、电子设备和计算机可读介质
WO2018074787A1 Method for providing content and electronic device therefor
WO2014133343A1 Apparatus and method for creating active thumbnails
WO2023132393A1 Method and system for providing digital twin platform services for a smart city
WO2021145670A1 Electronic device, method, and computer-readable recording medium for

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22916204

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.11.2024)

122 Ep: pct application non-entry in european phase

Ref document number: 22916204

Country of ref document: EP

Kind code of ref document: A1