
US20200013235A1 - Method and apparatus for processing patches of point cloud - Google Patents

Method and apparatus for processing patches of point cloud

Info

Publication number
US20200013235A1
US20200013235A1 (application US16/502,036; US201916502036A)
Authority
US
United States
Prior art keywords
patch
point cloud
patches
orientation
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/502,036
Inventor
Yi-Ting Tsai
Chun-Lung Lin
Ching-Chieh Lin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Priority to US16/502,036
Assigned to INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE (assignment of assignors' interest; see document for details). Assignors: LIN, CHING-CHIEH; LIN, CHUN-LUNG; TSAI, YI-TING
Publication of US20200013235A1
Legal status: Abandoned (current)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/597Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • G06K9/6211
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/60Memory management
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/85Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
    • H04N19/88Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression involving rearrangement of data among different coding units, e.g. shuffling, interleaving, scrambling or permutation of pixel data or permutation of transform coefficient data among different blocks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/28Indexing scheme for image data processing or generation, in general involving image processing hardware
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2004Aligning objects, relative positioning of parts
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2016Rotation, translation, scaling

Definitions

  • the disclosure relates to a method and an apparatus for processing images, and more particularly, relates to a method and an apparatus for processing patches of a point cloud.
  • a point cloud is a set of a plurality of points in the three-dimensional space. Each of the points has three-dimensional coordinates, and some points may include image attribute values, such as colors, materials, reflective surface intensity, or other attributes.
  • the point cloud may be used for reconstructing objects or scenes into the composition of these points.
  • data points of the point cloud may be used to present 3D objects of VR and AR.
  • the point cloud may include thousands to billions of points captured by a plurality of cameras and depth sensors according to different configurations, so as to faithfully present a scene to be reconstructed. Therefore, a compression technique is required to reduce the amount of data used for presenting the point cloud, so as to ensure high quality and high speed video transmission.
  • the disclosure provides a method and an apparatus for processing patches of a point cloud capable of improving encoding efficiency of compression of the point cloud, so that high quality and high speed video transmission is ensured.
  • An embodiment of the disclosure provides an apparatus for processing patches of a point cloud including an input/output (I/O) device, a storage device, and a processor.
  • the I/O device is configured to receive data of the point cloud.
  • the storage device is configured to store an index table recording indexes corresponding to a plurality of orientations.
  • the processor is coupled to the I/O device and the storage device and is configured to execute a program to generate a plurality of patches of the point cloud.
  • the point cloud includes a plurality of points in a three-dimensional space, and each of the patches corresponds to a portion of the point cloud.
  • An orientation in which each patch is adapted to generate a patch image is determined, and each patch is transformed to generate the patch image according to the determined orientation.
  • the patch image is packed and the index corresponding to the orientation of each patch is recorded.
  • An embodiment of the disclosure provides an apparatus for processing patches of a point cloud including an input/output (I/O) device, a storage device, and a processor.
  • the I/O device is configured to receive a bit stream of the point cloud.
  • the storage device is configured to store an index table recording indexes corresponding to a plurality of orientations.
  • the processor is coupled to the I/O device and the storage device and is configured to execute a program to demultiplex the bit stream into a patch image and indexes corresponding to a plurality of patches in the patch image.
  • An index table is looked up to obtain an orientation of each patch and the patch image is transformed and projected according to the orientation to recover the plurality of patches of the point cloud.
  • the point cloud is reconstructed by using the recovered patches.
  • An embodiment of the disclosure provides a method for processing patches of a point cloud suitable for a decoder having a processor.
  • a bit stream of the point cloud is demultiplexed into a patch image and indexes corresponding to a plurality of patches in the patch image.
  • An index table is looked up to obtain an orientation of each patch and the patch image is transformed according to the orientation to recover the plurality of patches of the point cloud.
  • the point cloud is reconstructed by using the recovered patches.
  • FIG. 1 is a block diagram of an apparatus for processing patches of a point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 2 is a flow chart illustrating a method for processing patches of a point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 3 is an example illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 4A and FIG. 4B are examples illustrating patch projection methods according to an exemplary embodiment of the disclosure.
  • FIG. 5 is an example illustrating patch image generation through rotating and shifting patches according to an exemplary embodiment of the disclosure.
  • FIG. 6 is a detailed flow chart of the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 7 is a block diagram of an apparatus for processing patches of a point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 8 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 9 is a detailed flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 10 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 11 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • a method for processing patches of a point cloud provided by the disclosure, different projection methods are allowed to be used for each of the patches of the point cloud when encoding is performed. Further, the projection methods used for the patches may be instructed by additional information, so that a decoder may recover the patches and reconstruct the point cloud according to the projection methods instructed by such information. Accordingly, encoding efficiency of point cloud compression is improved, and high quality and high speed video transmission is ensured.
  • FIG. 1 is a block diagram of an apparatus for processing patches of a point cloud according to an exemplary embodiment of the disclosure.
  • a point cloud patch processing apparatus 10 is, for example, a camera, a video camera, a cell phone, a personal computer, a virtual reality apparatus, an augmented reality apparatus, a cloud server, or other apparatuses with a computing function and acts as an encoder, for example, to execute a method for processing patches of the point cloud provided by the embodiments of the disclosure.
  • In the point cloud patch processing apparatus 10, at least an input/output (I/O) device 12, a storage device 14, and a processor 16 are included, and functions of the devices are provided as follows.
  • the I/O device 12 is, for example, a wired or wireless transmission interface supporting Universal Serial Bus (USB), RS232, Bluetooth (BT), wireless fidelity (Wi-Fi) and is configured to receive point cloud data provided by an image source apparatus such as a camera or a video camera and then outputs a processed video stream.
  • the I/O device 12 may also be a network card supporting Ethernet or supporting wireless network standards such as 802.11g, 802.11n, 802.11ac, etc., so that the point cloud patch processing apparatus 10 may be connected to the network and may input and output data through the network.
  • the storage device 14 is, for example, a fixed or movable random access memory (RAM), a read-only memory (ROM), a flash memory, a hard disk, or other similar devices of any form or a combination of the foregoing devices and is configured to store a program which may be executed by the processor 16 .
  • the storage device 14 for example, stores an index table recording indexes corresponding to a plurality of orientations.
  • the processor 16 is coupled to the I/O device 12 and the storage device 14 , and is, for example, a central processing unit (CPU) or a programmable microprocessor for general or special use, a digital signal processor (DSP), a programmable controller, an application-specific integrated circuit (ASIC), a programmable logic controller (PLC), or other similar devices or a combination of these devices, and may load and execute the program stored in the storage device 14 to execute the method for processing patches of the point cloud provided by the embodiments of the disclosure.
  • FIG. 2 is a flow chart illustrating a method for processing patches of a point cloud according to an exemplary embodiment of the disclosure.
  • the method provided by this embodiment is adapted to the point cloud patch processing apparatus 10 , and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 10 are described in detail as follows.
  • the processor 16 generates a plurality of patches of the point cloud.
  • the point cloud includes a plurality of points in a three-dimensional space, and each point may include geometry information that can be used to define a geometry position of the point and attribute information that can be used to define color, reflectance, transparency, and other attributes of the point.
  • Each patch corresponds to a portion of the point cloud and the portion may be a set of several points having deviations in surface normal vectors less than a threshold, for example.
  • the processor 16 determines an orientation in which each patch is adapted to generate a patch image and transforms each patch to generate the patch image according to the orientation. In some embodiments, the processor 16 determines the orientation in which each patch is adapted to generate the patch image by adopting an algorithm such as compact packing, similarity comparison, intra prediction, or inter prediction.
  • the processor 16 compares positions and sizes of a plurality of blocks in each patch to calculate the similarity of the blocks among the patches, so as to accordingly determine a position of each patch in the patch image.
  • the processor 16 projects the patches to different orientations and calculates similarity based on the patches projected to different orientations, so as to determine the orientation in which each patch is adapted to generate the patch image.
  • the processor 16 may use the patches projected to different orientations to perform intra prediction or inter prediction, so as to find a combination which exhibits favorable encoding efficiency to determine the orientation in which each patch is adapted to generate the patch image.
  • the method for determining the projection orientation is not limited by the embodiment.
  • the processor 16 may determine the orientation in which each patch is adapted to generate the patch image according to a predetermined projection method or a projection method selected by a user.
  • the projection method, for example, includes 2 orientations, 8 orientations, or any subset of n orientations (n is an integer greater than 2), which is not limited herein.
  • the processor 16 may determine whether each patch is adapted to rotate to the predetermined orientation. If the patch is adapted to rotate, the processor 16 places the patch into the patch image after rotating the patch to the predetermined orientation, and if the patch is not adapted to rotate, the processor 16 directly places the patch into the patch image. In another embodiment, the processor 16 may determine whether the width of each patch is greater than its height. If the determination result is no, the processor 16 places the patch into the patch image after rotating the patch to the predetermined orientation, and if the determination result is yes, the processor 16 directly places the patch into the patch image.
  • Taking 8 orientations for example, it includes the projection methods of rotating the patch by 0° (i.e., not rotated), 90°, 180°, and 270°, and rotating a mirror image of the patch by 0°, 90°, 180°, and 270°.
  • the processor 16 may determine to which orientation among the 8 predetermined orientations each patch is adapted to rotate, so as to place each patch into the patch image after rotating each patch according to the determined predetermined orientation.
  • the processor 16 packs the patch image and records the index corresponding to the orientation of each patch.
  • the processor 16 packs the patch image including patches which are appropriately transformed as a geometry image recording the geometry position of each patch, a texture image recording color composition of each patch, and an occupancy map recording which pixels in the geometry image and the texture image are valid data.
  • the geometry image, texture image, and occupancy map are composed as a compressed bit stream through a multiplexer and outputted through the I/O device 12 .
  • the processor 16 may transform the previously-determined orientation of each patch into a corresponding index according to the pre-established index table, records the index to auxiliary patch information, and outputs the auxiliary patch information together with the packed patch image.
  • FIG. 3 is an example illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • an object 30 in the point cloud is taken as an example in this embodiment to illustrate the process of patch image generation performed by the point cloud patch processing apparatus.
  • each patch 30 a to 30 f is analyzed.
  • a similarity comparison method is used to sequentially calculate similarity between the projection of the patches 30 a to 30 f in different orientations and the projection of other patches, so that the orientation and position of each of the patches 30 a to 30 f in the patch image is determined.
  • each of the patches 30 a to 30 f is transformed according to the determined orientation, so that a patch image including content of all of the patches 30 a to 30 f is generated.
  • the patch image is packed as a texture image 34 a recording the color composition of each of the patches 30 a to 30 f and a geometry image 34 b recording the geometry position of each of the patches 30 a to 30 f as shown in FIG. 3 .
  • the point cloud patch processing apparatus 10 determines such orientation according to the predetermined projection method or the projection method selected by the user, for example, and records the index corresponding to the orientation of each patch according to the index table corresponding to the selected projection method.
  • FIG. 4A and FIG. 4B are examples illustrating patch projection methods according to an exemplary embodiment of the disclosure.
  • the projection method of 2 orientations for example, as shown in FIG. 4A , it includes rotating a patch by 0° and rotating a mirror image of the patch by 270°, and the corresponding indexes are 0 and 1.
  • the projection method of 8 orientations as an example, as shown in FIG. 4B , it includes rotating the patch by 0°, 90°, 180°, and 270° and rotating a mirror image of the patch by 0°, 90°, 180°, and 270°, and the corresponding indexes are 0 to 7, respectively.
  • the point cloud patch processing apparatus 10 may further determine an offset of each patch to shift the patch. Through shifting the patches after being rotated, the patches in the patch image may be arranged more closely, so that efficiency of encoding is increased.
  • the offset is, for example, an offset between a position of each patch after being shifted and an original point of the patch or an original point of the point cloud, which is not limited therein.
  • FIG. 5 is an example illustrating patch image generation through rotating and shifting patches according to an exemplary embodiment of the disclosure.
  • a patch image 52 is generated through a conventional method. Taking patches 52 a, 52 b, and 52 c for example, in the conventional manner, without rotating or shifting the patches, unoccupied regions in the patch image 52 that may be used to accommodate an existing patch are found row by row from top to bottom directly according to the regions in the patch image 52 that are occupied by other patches at the moment. The patches 52 a, 52 b, and 52 c are then placed into the patch image 52 in sequence.
  • a patch image 54 is generated through the method for processing patches of the point cloud provided by the embodiments of the disclosure.
  • the point cloud patch processing apparatus further shifts the rotated patches in the patch image 54 , so as to determine the orientations and offsets in which the existing patches are adapted to generate the patch image 54 according to similarity between the existing patches and other patches after the existing patches are rotated to different orientations and shifted.
  • the patches are rotated according to the determined orientations and the rotated patches are shifted according to the determined offsets so as to generate the patch image 54 .
  • the patch image 52 and the patch image 54 are compared, it can be seen that the patches in the patch image 54 are arranged more closely since the patches in the patch image 54 are appropriately rotated and shifted, so that efficiency of encoding is enhanced.
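  • A minimal sketch of the shifting step is given below: an already-rotated patch mask is scanned into the first free location of the packed image, row by row from top to bottom, and the resulting top-left offset (u0, v0) is returned so it can be recorded. The boolean-mask representation and the definition of the offset as the patch's top-left corner are assumptions for illustration.

```python
import numpy as np

def place_with_shift(occupancy, patch_mask):
    """Scan the packed image row by row, top to bottom, and return the first
    top-left offset (u0, v0) at which the rotated patch's valid pixels do not
    overlap already-occupied pixels; both arguments are boolean arrays and the
    occupancy map is updated in place. Returns None when the patch does not fit."""
    frame_h, frame_w = occupancy.shape
    patch_h, patch_w = patch_mask.shape
    for v0 in range(frame_h - patch_h + 1):
        for u0 in range(frame_w - patch_w + 1):
            window = occupancy[v0:v0 + patch_h, u0:u0 + patch_w]
            if not np.any(window & patch_mask):          # no collision at this offset
                occupancy[v0:v0 + patch_h, u0:u0 + patch_w] |= patch_mask
                return u0, v0
    return None
```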
  • the processor 16 processes the geometry image, the texture image, and the occupancy map through image padding, smoothing, and compression, adds the indexes recording the orientation of each patch to the auxiliary patch information, and performs compression. Finally, the compressed data is composed as a compressed bit stream through the multiplexer and outputted through the I/O device 12 .
  • FIG. 6 is a detailed flow chart of the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • the method provided by this embodiment is adapted to the point cloud patch processing apparatus 10 , and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 10 are described in detail as follows.
  • step 601 the processor 16 generates a plurality of patches after receiving point cloud data through the I/O device 12 and in step 602 , changes patch orientations through the method for processing patches of the point cloud to generate a patch image and patch information as described above (e.g., the orientation of each patch in the patch image).
  • the processor 16 packs the patch image to generate an occupancy map in step 603 , generates a geometry image in step 604 , and generates a texture image in step 605 .
  • Image padding and video compression are performed on the generated geometry image and texture image respectively in step 606 and step 607 in sequence, so that a compressed geometry image and a compressed texture image are obtained.
  • the processor 16 performs smoothing on the geometry image reconstructed by using the compressed geometry image according to the previously-generated patch information and then feeds the processed geometry image back to step 605 , so that the texture image may be accordingly generated.
  • step 609 the processor 16 compresses the occupancy map generated in step 603 and in step 610 , adds the patch information generated in step 602 to the auxiliary patch information and compresses the auxiliary patch information.
  • step 611 the processor 16 composes the compressed geometry image, the compressed texture image, the compressed occupancy map, and the compressed auxiliary patch information generated in the foregoing steps into a compressed bit stream through a multiplexer and outputs the compressed bit stream through the I/O device 12 .
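  • As an illustration only of the final multiplexing step, the sketch below concatenates the four compressed sub-streams with simple length prefixes; this container layout is hypothetical and does not reflect the actual bit-stream syntax of the compression standard in use.

```python
import struct

def multiplex(geometry, texture, occupancy_map, aux_patch_info):
    """Compose the four compressed sub-streams into one bit stream using a
    simple length-prefixed layout (hypothetical container for illustration)."""
    stream = bytearray()
    for payload in (geometry, texture, occupancy_map, aux_patch_info):
        stream += struct.pack(">I", len(payload))   # 4-byte big-endian length prefix
        stream += payload
    return bytes(stream)
```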
  • the patch image is generated after the plurality of patches generated by the point cloud are appropriately transformed. Further, information such as the patch image and indexes recording patch orientations is compressed and then outputted, so that the decoder may recover the patches and reconstruct the point cloud according to the information.
  • Several examples are provided below to illustrate a structure of a decoder and a method for processing patches of a point cloud (decoding method) corresponding to the decoder.
  • FIG. 7 is a block diagram of a point cloud patch processing apparatus according to an exemplary embodiment of the disclosure.
  • a point cloud patch processing apparatus 70 is, for example, a camera, a video camera, a cell phone, a personal computer, a virtual reality apparatus, an augmented reality apparatus, a cloud server, or other apparatuses with a computing function and at least includes an I/O device 72 , a storage device 74 and a processor 76 .
  • Constitutions of the I/O device 72 , the storage device 74 , and the processor 76 are identical or similar to the constitutions of the I/O device 12 , the storage device 14 , and the processor 16 provided in the foregoing embodiments; repeated description is thus not provided herein.
  • the difference between this embodiment and the foregoing embodiments is that the point cloud patch processing apparatus 70 of this embodiment is used as a decoder to execute the method for processing patches of the point cloud provided by the embodiments of the disclosure.
  • FIG. 8 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • the method provided by this embodiment is adapted to the point cloud patch processing apparatus 70 , and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 70 are described in detail as follows.
  • step 801 the processor 76 receives a bit stream of the point cloud through the I/O device 72 , so as to demultiplex the bit stream of the point cloud into a patch image and indexes corresponding to a plurality of patches in the patch image.
  • step 802 the processor 76 looks up an index table pre-stored in the storage device 74 to obtain an orientation of each patch and transforms the patch image according to the orientations being looked up to recover the plurality of patches of the point cloud.
  • the processor 76 may select the index table to look up the orientations of the patches according to a predetermined projection method, and such projection method includes n orientations (n is an integer greater than 2), for example.
  • the processor 76 may look up the index table by using the indexes to determine whether each patch in the patch image has been rotated. Herein, if rotated, the processor 76 inverts the patches according to the predetermined orientation to recover the patches, and if not rotated, the processor 76 does not transform the patches.
  • the processor 76 further looks up an offset of each patch in the patch image by using the indexes, so as to reversely shift each patch according to the offset being looked up.
  • the offset is, for example, an offset between a position of each patch after being shifted and an original point of the patch or an original point of the point cloud, which is not limited therein.
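  • The decoder-side inversion can be pictured as follows: the orientation index is looked up, the rotation (of the patch or of its mirror image) is undone, and the recorded offset is reversed by cutting the patch back out of the packed image. The index semantics mirror the 8-orientation encoder example, and the array conventions are assumptions.

```python
import numpy as np

def invert_orientation(patch_2d, index):
    """Undo one of the eight orientations assumed in the encoder sketch:
    indexes 0-3 were plain rotations by 0/90/180/270 degrees, indexes 4-7
    were rotations of the mirror image (illustrative ordering)."""
    unrotated = np.rot90(patch_2d, k=-(index % 4))           # reverse the rotation
    return unrotated if index < 4 else np.fliplr(unrotated)  # then undo the mirror

def recover_patch(packed_image, index, u0, v0, height, width):
    """Cut the patch back out of the packed image at its recorded offset
    (reversing the shift) and invert its orientation to recover the patch."""
    if index % 4 in (0, 2):
        h, w = height, width           # rotation by 0 or 180 degrees keeps the shape
    else:
        h, w = width, height           # rotation by 90 or 270 degrees swaps the shape
    placed = packed_image[v0:v0 + h, u0:u0 + w]
    return invert_orientation(placed, index)
```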
  • Table 1 is a lookup table of adaptive patch rotation functions.
  • a value of an identifier corresponds to, for example, an index Idx. That is, when the index Idx is 0, the identifier is set as FPO_NULL, and when the index Idx is 1, the identifier is set as FPO_SWAP.
  • the frame coordinates (x, y) may be calculated as follows through the lookup table:
  • the coordinates (u, v) are the original coordinates of each patch in the patch image.
  • Outputs of a rotation function Rotation(x) and an offset function Offset(x) are the matrices defined in Table 1.
  • An index fIdx is the index value of the frame number marked in the image.
  • patch shift functions Patch2dShiftU and Patch2dShiftV respectively are the X position and Y position of the top left corner of the patch in the patch image.
  • Taking 8 orientations for example, the projection method includes rotating the patch by 0° (i.e., not rotated), 90°, 180°, and 270°, and rotating a mirror image of the patch by 0°, 90°, 180°, and 270°.
  • the processor 76 may determine to which orientation among the 8 predetermined orientations each patch in the patch image has been rotated by looking up the index table, so as to invert each patch according to the predetermined orientation being looked up.
  • Table 2 is a lookup table of adaptive patch rotation functions.
  • values of the rotation function Rotation(x) and the offset function Offset(x) correspond to, for example, the index Idx.
  • the BlockSize is a size of a coding block of the occupancy map
  • Patch2dSizeU and Patch2dSizeV respectively are values obtained by dividing the width and height of the patch by the BlockSize.
  • Outputs of the rotation function Rotation(x) and the offset function Offset(x) may be looked up through Table 2 and introduced to formula (1), so as to calculate the frame coordinates (x, y) of the patch.
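  • Tables 1 and 2 are not reproduced here, so the sketch below only illustrates the shape of the calculation in formula (1): a per-index 2×2 rotation matrix and a per-index offset (in units of the block-aligned patch size) are combined with the patch's 2-D shift to map patch coordinates (u, v) to frame coordinates (x, y). The matrix and offset values shown are placeholders, not the patent's table entries.

```python
import numpy as np

# Placeholder per-index tables; the real values are defined by Table 1/Table 2.
ROTATION = {
    0: np.array([[1, 0], [0, 1]]),    # e.g. FPO_NULL: identity
    1: np.array([[0, 1], [1, 0]]),    # e.g. FPO_SWAP: swap u and v
}
OFFSET = {                            # in multiples of the block-aligned patch size
    0: np.array([0, 0]),
    1: np.array([0, 0]),
}

def frame_coordinates(u, v, idx, patch2d_shift_u, patch2d_shift_v,
                      patch2d_size_u, patch2d_size_v, block_size):
    """Map patch coordinates (u, v) to frame coordinates (x, y) by combining the
    looked-up rotation matrix, the looked-up offset scaled by the patch size,
    and the patch's top-left shift (a sketch of the shape of formula (1) only)."""
    size_in_pixels = np.array([patch2d_size_u, patch2d_size_v]) * block_size
    xy = (ROTATION[idx] @ np.array([u, v])
          + OFFSET[idx] * size_in_pixels
          + np.array([patch2d_shift_u, patch2d_shift_v]))
    return int(xy[0]), int(xy[1])
```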
  • the processor 76 reconstructs the point cloud through the recovered patches.
  • the processor 76 projects each two-dimensional patch to a three-dimensional space through the predetermined projection method so as to reconstruct the point cloud.
  • the processor 76 may demultiplex the bit stream of the point cloud into the geometry image, the texture image, the occupancy map, and the auxiliary patch information corresponding to each frame.
  • the processor 76 may learn which pixels in the geometry image and the texture image are valid data through the occupancy map and may learn patch information, such as which patch each pixel belongs to, through the auxiliary patch information.
  • the processor 76 may then project the two-dimensional patch to the three-dimensional space through the valid data and the patch information, so as to reconstruct the point cloud.
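  • A simplified sketch of the re-projection is given below: the occupancy map selects valid pixels, the geometry image supplies the depth along the patch's projection axis, and per-patch 3-D shifts place the remaining two coordinates. The parameter names (normal/tangent/bitangent axes and shifts) are illustrative assumptions.

```python
import numpy as np

def patch_to_points(geometry, occupancy, normal_axis, tangent_axis, bitangent_axis,
                    shift_normal, shift_tangent, shift_bitangent):
    """Re-project one recovered 2-D patch into 3-D points: every valid pixel
    (marked in the occupancy map) contributes one point whose depth along the
    patch's projection axis comes from the geometry image and whose in-plane
    coordinates come from the pixel position plus the patch's 3-D shifts."""
    points = []
    for v, u in np.argwhere(occupancy > 0):
        point = np.zeros(3)
        point[normal_axis] = shift_normal + float(geometry[v, u])
        point[tangent_axis] = shift_tangent + u
        point[bitangent_axis] = shift_bitangent + v
        points.append(point)
    return np.array(points)
```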
  • FIG. 9 is a detailed flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • the method provided by this embodiment is adapted to the point cloud patch processing apparatus 70 , and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 70 are described in detail as follows.
  • step 901 after receiving the compressed bit stream of the point cloud from the I/O device 72 , the processor 76 demultiplexes the compressed bit stream into the compressed texture image, the compressed geometry image, the compressed occupancy map, and the compressed auxiliary patch information.
  • the processor 76 performs video decompression on the compressed texture image and the compressed geometry image to generate a decompressed texture image and a decompressed geometry image.
  • the processor 76 decompresses the compressed occupancy map to generate a decompressed occupancy map.
  • step 904 the processor 76 decompresses the compressed auxiliary patch information to generate decompressed auxiliary patch information.
  • step 905 the processor 76 learns which pixels in the decompressed geometry image are valid data according to the decompressed occupancy map and then inverts the orientation of each patch in the decompressed geometry image according to the patch information such as the orientation and offset of each patch recorded in the decompressed auxiliary patch information.
  • Such inversion may include orientation inversion and/or shift inversion, which is not limited herein.
  • step 906 the processor 76 projects the inverted patches to the three-dimensional space to reconstruct geometry portions (e.g., including positions and shapes) and in step 907 , performs smoothing on the reconstructed geometry portions.
  • step 908 the processor 76 decompresses the texture image and performs texture reconstruction on the reconstructed geometry portions, so as to obtain the reconstructed point cloud.
  • the point cloud patch processing apparatus 70 may decompose the compressed bit stream into the patch image and the patch information required for reconstructing the point cloud, and reconstruct the point cloud by inverting the patches and projecting the inverted patches to the three-dimensional space.
  • encoding is performed with reference to parameters such as computing resources and network resources of the existing device, and the appropriate projection method is then selected to process the patches, so that high quality and high speed video transmission is ensured.
  • FIG. 10 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • the method of this embodiment is adapted to the point cloud patch processing apparatus 10 , and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 10 are described in detail as follows.
  • step 1001 the processor 16 generates a plurality of patches of the point cloud.
  • step 1002 the processor 16 determines to use 2 rotation orientations or 8 rotation orientations to rotate the patches according to parameters at the moment, such as available computing resources, network bandwidth, encoding capabilities of the encoder, and decoding capabilities of the decoder, and sets a preferable orientation flag according to the determination result, so that the decoder may perform decoding with reference to the preferable orientation flag.
  • If the value of the preferable orientation flag is set to be 1, then 2 rotation orientations are used in step 1003, and if the value of the preferable orientation flag is set to be 0, then 8 rotation orientations are used in step 1004.
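  • A small sketch of this selection is given below; the flag convention (1 for the 2-orientation mode, 0 for the 8-orientation mode) follows the paragraph above, while the resource parameters, the thresholds, and the direction of the decision (fewer orientations when resources are scarce) are assumptions.

```python
def choose_orientation_mode(available_compute, network_bandwidth,
                            compute_threshold, bandwidth_threshold):
    """Return (preferable_orientation_flag, number_of_orientations): flag 1 selects
    the 2-orientation mode, flag 0 selects the 8-orientation mode. The thresholds
    and the scarce-resources-imply-fewer-orientations rule are assumptions."""
    if available_compute < compute_threshold or network_bandwidth < bandwidth_threshold:
        return 1, 2    # flag = 1: use 2 rotation orientations
    return 0, 8        # flag = 0: use 8 rotation orientations
```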
  • the processor 16 may also use more flags to mark the processing method of each patch according to needs of the system or user, so that the decoder may accordingly recover the patches and reconstruct the point cloud.
  • FIG. 11 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • the method provided by this embodiment is suitable for the point cloud patch processing apparatus 10 , and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 10 are described in detail as follows.
  • step 1101 the processor 16 generates a plurality of patches of the point cloud.
  • step 1102 the processor 16 determines whether to enable an orientation rotation function according to system configuration or user operation. Herein, if it is determined that the orientation rotation function is required to be enabled, the processor 16 sets the value of an orientation enabling flag to be 1, and performs step 1103 . Conversely, the processor 16 sets the value of the orientation enabling flag to be 0, and performs step 1111 in which no patch is rotated.
  • step 1103 the processor 16 determines to use 2 rotation orientations or 8 rotation orientations to rotate the patches according to available resource parameters at the moment and sets the preferable orientation flag according to the determination result.
  • If the value of the preferable orientation flag is set to be 1, then 2 rotation orientations are used in step 1104, and if the value of the preferable orientation flag is set to be 0, then 8 rotation orientations are used in step 1105.
  • the processor 16 may determine whether each patch is adapted to rotate to the predetermined orientation and set a patch rotation flag according to the determination result.
  • If the value of the patch rotation flag is set to be 1, the patches are rotated to a predetermined rotation orientation in step 1108, and if the value of the patch rotation flag is set to be 0, no patch is rotated in step 1109.
  • the processor 16 may determine whether each patch is adapted to rotate to the predetermined orientation (i.e., the 7 orientations other than rotation by 0°) and set the patch rotation flag according to the determination result.
  • If the value of the patch rotation flag is set to be 1, the patches are rotated to one of the 7 predetermined rotation orientations and the index of such rotation orientation is recorded (e.g., using 3 bits to record the 7 states) in step 1110, and if the value of the patch rotation flag is set to be 0, no patch is rotated (i.e., rotation by 0°) in step 1109.
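  • The per-patch signaling can be pictured as one flag bit followed, only when the patch is rotated, by a 3-bit index that distinguishes the seven non-zero orientations; the bit layout below is illustrative and is not the patent's or any standard's syntax.

```python
def write_patch_rotation(bits, orientation_index):
    """Append per-patch signaling to a list of bits: a 1-bit patch rotation flag,
    plus a 3-bit code (0-6 for the seven rotated orientations) only when the
    patch is actually rotated. Illustrative layout only."""
    if orientation_index == 0:                # 0 degrees: not rotated
        bits.append(0)                        # patch rotation flag = 0
        return bits
    bits.append(1)                            # patch rotation flag = 1
    code = orientation_index - 1              # map orientations 1..7 to codes 0..6
    bits.extend((code >> shift) & 1 for shift in (2, 1, 0))
    return bits

def read_patch_rotation(bits, pos):
    """Parse the signaling written above; returns (orientation_index, next_pos)."""
    flag, pos = bits[pos], pos + 1
    if flag == 0:
        return 0, pos
    code = (bits[pos] << 2) | (bits[pos + 1] << 1) | bits[pos + 2]
    return code + 1, pos + 3
```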
  • step 1112 the processor 16 performs processing such as image padding, smoothing, and compression on the processed patches and the auxiliary information generated through the foregoing steps, and detailed content thereof is described in the foregoing embodiments and thus is not repeated herein.
  • the patches of the point cloud are rotated and/or shifted during encoding, such that in the generated patch image, the patches are arranged more closely, and encoding efficiency of point cloud compression is therefore improved.
  • patch information such as the rotation orientation and offset of each patch is further added to the auxiliary information and provided to the decoder together, so that the decoder may recover the patches and reconstruct the point cloud. Accordingly, encoding efficiency of point cloud compression is improved, and high quality and high speed video transmission is ensured.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Architecture (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Compression Of Band Width Or Redundancy In Fax (AREA)

Abstract

A method and an apparatus for processing patches of a point cloud are provided. The apparatus includes an input/output (I/O) device, a storage device, and a processor. The I/O device is used to receive a bit stream of the point cloud. The storage device is configured to store an index table recording indexes corresponding to a plurality of orientations. The processor is coupled to the I/O device and the storage device and is configured to execute a program to demultiplex the bit stream of the point cloud into a patch image and indexes corresponding to a plurality of patches in the patch image, look up the index table to obtain an orientation of each patch, transform the patch image according to the orientation to recover the plurality of patches of the point cloud, and reconstruct the point cloud by using the recovered patches.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the priority benefit of U.S. provisional application Ser. No. 62/693,485, filed on Jul. 3, 2018. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
  • BACKGROUND
  • Technical Field
  • The disclosure relates to a method and an apparatus for processing images, and more particularly, relates to a method and an apparatus for processing patches of a point cloud.
  • Description of Related Art
  • In the existing methods for representing visual elements in the real world, outputs from the camera are compressed by using a specific Moving Picture Experts Group (MPEG) video encoding standard for transmission and storage, and finally decoded by a player for display on a flat panel display. Currently, an increasing number of apparatuses are configured to capture and display three-dimensional (3D) images in the real world. A point cloud is a set of a plurality of points in the three-dimensional space. Each of the points has three-dimensional coordinates, and some points may include image attribute values, such as colors, materials, reflective surface intensity, or other attributes. The point cloud may be used for reconstructing objects or scenes into the composition of these points.
  • For instance, in the applications of virtual reality (VR) and augmented reality (AR) which receive wide popularity in the entertainment industry in recent years, data points of the point cloud may be used to present 3D objects of VR and AR. Nevertheless, the point cloud may include thousands to billions of points captured by a plurality of cameras and depth sensors according to different configurations, so as to faithfully present a scene to be reconstructed. Therefore, a compression technique is required to reduce the amount of data used for presenting the point cloud, so as to ensure high quality and high speed video transmission.
  • SUMMARY
  • Accordingly, the disclosure provides a method and an apparatus for processing patches of a point cloud capable of improving encoding efficiency of compression of the point cloud, so that high quality and high speed video transmission is ensured.
  • An embodiment of the disclosure provides an apparatus for processing patches of a point cloud including an input/output (I/O) device, a storage device, and a processor. The I/O device is configured to receive data of the point cloud. The storage device is configured to store an index table recording indexes corresponding to a plurality of orientations. The processor is coupled to the I/O device and the storage device and is configured to execute a program to generate a plurality of patches of the point cloud. The point cloud includes a plurality of points in a three-dimensional space, and each of the patches corresponds to a portion of the point cloud. An orientation in which each patch is adapted to generate a patch image is determined, and each patch is transformed to generate the patch image according to the determined orientation. The patch image is packed and the index corresponding to the orientation of each patch is recorded.
  • An embodiment of the disclosure provides an apparatus for processing patches of a point cloud including an input/output (I/O) device, a storage device, and a processor. The I/O device is configured to receive a bit stream of the point cloud. The storage device is configured to store an index table recording indexes corresponding to a plurality of orientations. The processor is coupled to the I/O device and the storage device and is configured to execute a program to demultiplex the bit stream into a patch image and indexes corresponding to a plurality of patches in the patch image. An index table is looked up to obtain an orientation of each patch and the patch image is transformed and projected according to the orientation to recover the plurality of patches of the point cloud. The point cloud is reconstructed by using the recovered patches.
  • An embodiment of the disclosure provides a method for processing patches of a point cloud suitable for a decoder having a processor. In the method, a bit stream of the point cloud is demultiplexed into a patch image and indexes corresponding to a plurality of patches in the patch image. An index table is looked up to obtain an orientation of each patch and the patch image is transformed according to the orientation to recover the plurality of patches of the point cloud. The point cloud is reconstructed by using the recovered patches.
  • To make the aforementioned more comprehensible, several embodiments accompanied with drawings are described in detail as follows.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of an apparatus for processing patches of a point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 2 is a flow chart illustrating a method for processing patches of a point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 3 is an example illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 4A and FIG. 4B are examples illustrating patch projection methods according to an exemplary embodiment of the disclosure.
  • FIG. 5 is an example illustrating patch image generation through rotating and shifting patches according to an exemplary embodiment of the disclosure.
  • FIG. 6 is a detailed flow chart of the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 7 is a block diagram of an apparatus for processing patches of a point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 8 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 9 is a detailed flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 10 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • FIG. 11 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure.
  • DESCRIPTION OF THE EMBODIMENTS
  • In a method for processing patches of a point cloud provided by the disclosure, different projection methods are allowed to be used for each of the patches of the point cloud when encoding is performed. Further, the projection methods used for the patches may be instructed by additional information, so that a decoder may recover the patches and reconstruct the point cloud according to the projection methods instructed by such information. Accordingly, encoding efficiency of point cloud compression is improved, and high quality and high speed video transmission is ensured.
  • FIG. 1 is a block diagram of an apparatus for processing patches of a point cloud according to an exemplary embodiment of the disclosure. With reference to FIG. 1, a point cloud patch processing apparatus 10 is, for example, a camera, a video camera, a cell phone, a personal computer, a virtual reality apparatus, an augmented reality apparatus, a cloud server, or other apparatuses with a computing function and acts as an encoder, for example, to execute a method for processing patches of the point cloud provided by the embodiments of the disclosure. In the point cloud patch processing apparatus 10, at least an input/output (I/O) device 12, a storage device 14, and a processor 16 are included, and functions of the devices are provided as follows.
  • The I/O device 12 is, for example, a wired or wireless transmission interface supporting Universal Serial Bus (USB), RS232, Bluetooth (BT), or wireless fidelity (Wi-Fi), and is configured to receive point cloud data provided by an image source apparatus such as a camera or a video camera and then output a processed video stream. In an embodiment, the I/O device 12 may also be a network card supporting Ethernet or supporting wireless network standards such as 802.11g, 802.11n, 802.11ac, etc., so that the point cloud patch processing apparatus 10 may be connected to the network and may input and output data through the network.
  • The storage device 14 is, for example, a fixed or movable random access memory (RAM), a read-only memory (ROM), a flash memory, a hard disk, or other similar devices of any form or a combination of the foregoing devices and is configured to store a program which may be executed by the processor 16. In an embodiment, the storage device 14, for example, stores an index table recording indexes corresponding to a plurality of orientations.
  • The processor 16 is coupled to the I/O device 12 and the storage device 14, and is, for example, a central processing unit (CPU) or a programmable microprocessor for general or special use, a digital signal processor (DSP), a programmable controller, an application-specific integrated circuit (ASIC), a programmable logic controller (PLC), or other similar devices or a combination of these devices, and may load and execute the program stored in the storage device 14 to execute the method for processing patches of the point cloud provided by the embodiments of the disclosure.
  • FIG. 2 is a flow chart illustrating a method for processing patches of a point cloud according to an exemplary embodiment of the disclosure. With reference to FIG. 1 and FIG. 2 together, the method provided by this embodiment is adapted to the point cloud patch processing apparatus 10, and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 10 are described in detail as follows.
  • First, in step 201, the processor 16 generates a plurality of patches of the point cloud. Herein, the point cloud includes a plurality of points in a three-dimensional space, and each point may include geometry information that can be used to define a geometry position of the point and attribute information that can be used to define color, reflectance, transparency, and other attributes of the point. Each patch corresponds to a portion of the point cloud and the portion may be a set of several points having deviations in surface normal vectors less than a threshold, for example.
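  • As a minimal sketch of the grouping criterion above, the snippet below assigns points to candidate patches by comparing each point's normal against six axis-aligned projection directions and a deviation threshold; the candidate axes, the threshold value, and all function names are illustrative assumptions rather than the patent's definitions, and the point normals are assumed to be estimated already.

```python
import numpy as np

# Six axis-aligned projection directions (+/-X, +/-Y, +/-Z); illustrative only.
PROJECTION_AXES = np.array([
    [1, 0, 0], [-1, 0, 0],
    [0, 1, 0], [0, -1, 0],
    [0, 0, 1], [0, 0, -1],
], dtype=float)

def segment_into_patches(points, normals, max_deviation_deg=45.0):
    """Group points into candidate patches whose normals deviate from a common
    projection direction by less than a threshold (simplified sketch)."""
    cos_threshold = np.cos(np.radians(max_deviation_deg))
    scores = normals @ PROJECTION_AXES.T          # (N, 6) cosine similarities
    best_axis = scores.argmax(axis=1)             # most aligned direction per point
    patches = []
    for axis_index in range(len(PROJECTION_AXES)):
        mask = (best_axis == axis_index) & (scores[:, axis_index] >= cos_threshold)
        if mask.any():
            patches.append({"axis": axis_index, "points": points[mask]})
    return patches
```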
  • In step 202, the processor 16 determines an orientation in which each patch is adapted to generate a patch image and transforms each patch to generate the patch image according to the orientation. In some embodiments, the processor 16 determines the orientation in which each patch is adapted to generate the patch image by adopting an algorithm such as compact packing, similarity comparison, intra prediction, or inter prediction.
  • Specifically, in an embodiment, the processor 16, for example, compares positions and sizes of a plurality of blocks in each patch to calculate the similarity of the blocks among the patches, so as to accordingly determine a position of each patch in the patch image. Herein, the processor 16, for example, projects the patches to different orientations and calculates similarity based on the patches projected to different orientations, so as to determine the orientation in which each patch is adapted to generate the patch image. In other embodiments, the processor 16 may use the patches projected to different orientations to perform intra prediction or inter prediction, so as to find a combination which exhibits favorable encoding efficiency to determine the orientation in which each patch is adapted to generate the patch image. However, the method for determining the projection orientation is not limited by the embodiment.
  • In an embodiment, the processor 16 may determine the orientation in which each patch is adapted to generate the patch image according to a predetermined projection method or a projection method selected by a user. The projection method, for example, includes 2 orientations, 8 orientations, or any subset of n orientations (n is an integer greater than 2), which is not limited herein.
  • Taking 2 orientations for example, it includes the projection methods of rotating the patch by 0° (i.e., not rotated) and rotating the patch to a predetermined orientation (e.g., rotating a mirror image of the patch by 270°). In an embodiment, the processor 16 may determine whether each patch is adapted to rotate to the predetermined orientation. If the patch is adapted to rotate, the processor 16 places the patch into the patch image after rotating the patch to the predetermined orientation, and if the patch is not adapted to rotate, the processor 16 directly places the patch into the patch image. In another embodiment, the processor 16 may determine whether the width of each patch is greater than its height. If the determination result is no, the processor 16 places the patch into the patch image after rotating the patch to the predetermined orientation, and if the determination result is yes, the processor 16 directly places the patch into the patch image.
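  • A small sketch of the width-versus-height check for the 2-orientation mode is given below; pairing index 1 with a mirrored patch rotated by 270° follows FIG. 4A, while the array convention and the rotation direction of np.rot90 are assumptions.

```python
import numpy as np

def orient_for_two_mode(patch_2d):
    """2-orientation mode: keep the patch when it is wider than tall (index 0),
    otherwise mirror it and rotate the mirror image by 270 degrees (index 1).
    Returns (orientation_index, transformed_patch); a sketch only."""
    height, width = patch_2d.shape[:2]
    if width > height:
        return 0, patch_2d                        # index 0: not rotated
    mirrored = np.fliplr(patch_2d)                # mirror image of the patch
    return 1, np.rot90(mirrored, k=3)             # rotate the mirror by 270 degrees
```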
  • Taking 8 orientations for example, it includes the projection methods of rotating the patch by 0° (i.e., not rotated), 90°, 180°, and 270°, and rotating a mirror image of the patch by 0°, 90°, 180°, and 270°. The processor 16 may determine to which orientation among the 8 predetermined orientations each patch is adapted to rotate, so as to place each patch into the patch image after rotating each patch according to the determined predetermined orientation.
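  • The eight candidate transforms can be written as rotations of the patch and of its mirror image; the sketch below maps an index from 0 to 7 to one transform and picks the candidate minimizing a caller-supplied cost (e.g., packed area or a similarity-based score). The index ordering and the helper names are assumptions, not the patent's index table.

```python
import numpy as np

def apply_orientation(patch_2d, index):
    """Apply one of eight candidate orientations to a 2-D patch:
    indexes 0-3 rotate the patch by 0/90/180/270 degrees, indexes 4-7
    rotate its mirror image by 0/90/180/270 degrees (illustrative ordering)."""
    if not 0 <= index <= 7:
        raise ValueError("orientation index must be in [0, 7]")
    source = patch_2d if index < 4 else np.fliplr(patch_2d)
    return np.rot90(source, k=index % 4)

def best_orientation(patch_2d, cost):
    """Pick the orientation whose transformed patch minimizes a caller-supplied
    cost, e.g. the packed area or a similarity-based score."""
    return min((cost(apply_orientation(patch_2d, i)), i) for i in range(8))[1]
```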
  • Taking any subset of n orientations for example, 4 random orientations, for example, may be selected out of the 8 predetermined orientations (as described in the foregoing embodiment). That is, the processor 16 is predetermined to provide selection of 8 orientations but may enable 4 orientations only according to needs from the system or user at the moment and disable the rest of the 4 orientations. Accordingly, when determining the rotation orientation of each patch, the processor 16 determines the orientation in which each patch is adapted to rotate according to the 4 enabled orientations only and then places each patch into the patch image after rotating each patch according to the determined orientation.
  • Referring back to the flow of FIG. 2, in step 203, the processor 16 packs the patch image and records the index corresponding to the orientation of each patch. Specifically, the processor 16, for example, packs the patch image including the appropriately transformed patches as a geometry image recording the geometry position of each patch, a texture image recording the color composition of each patch, and an occupancy map recording which pixels in the geometry image and the texture image are valid data. After being processed through image padding, smoothing, and compression, the geometry image, the texture image, and the occupancy map are composed into a compressed bit stream through a multiplexer and outputted through the I/O device 12. In addition, the processor 16 may transform the previously-determined orientation of each patch into a corresponding index according to the pre-established index table, record the index in the auxiliary patch information, and output the auxiliary patch information together with the packed patch image.
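  • The packing and index-recording step could look roughly like the sketch below (the index table contents, the auxiliary-information fields, and the helper names are assumptions for illustration, not a normative syntax):
```python
import numpy as np

# Hypothetical index table for the 8-orientation method (orientation name -> index);
# the real table is whatever the encoder and decoder pre-establish.
INDEX_TABLE = {"rot0": 0, "rot90": 1, "rot180": 2, "rot270": 3,
               "mir0": 4, "mir90": 5, "mir180": 6, "mir270": 7}

def pack_patch(occupancy, mask, top_left, orientation_name):
    """Write one placed patch into the occupancy map and return its auxiliary
    patch information record (fields here are illustrative)."""
    r, c = top_left
    occupancy[r:r + mask.shape[0], c:c + mask.shape[1]] |= mask
    return {"u0": c, "v0": r,
            "orientation_index": INDEX_TABLE[orientation_name]}

occupancy = np.zeros((16, 16), dtype=bool)
aux = pack_patch(occupancy, np.ones((3, 4), bool), (0, 0), "mir270")
print(aux)   # {'u0': 0, 'v0': 0, 'orientation_index': 7}
```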
  • For instance, FIG. 3 is an example illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure. With reference to FIG. 3, an object 30 in the point cloud is taken as an example in this embodiment to illustrate the process of patch image generation performed by the point cloud patch processing apparatus.
  • First, projections on six surfaces of front, rear, left, right, up, and down around the object 30 in the point cloud are calculated, so that six patches 30 a to 30 f are generated. Next, a block distribution (different frames in a patch 32 represent different blocks) in each patch (e.g., the patch 32) is analyzed. A similarity comparison method is used to sequentially calculate the similarity between the projections of the patches 30 a to 30 f in different orientations and the projections of the other patches, so that the orientation and position of each of the patches 30 a to 30 f in the patch image are determined. Finally, each of the patches 30 a to 30 f is transformed according to the determined orientation, so that a patch image including the content of all of the patches 30 a to 30 f is generated. The patch image is packed as a texture image 34 a recording the color composition of each of the patches 30 a to 30 f and a geometry image 34 b recording the geometry position of each of the patches 30 a to 30 f, as shown in FIG. 3.
  • It is noted that when determining the orientation in which each patch is adapted to generate the patch image, the point cloud patch processing apparatus 10 determines such orientation according to the predetermined projection method or the projection method selected by the user, for example, and records the index corresponding to the orientation of each patch according to the index table corresponding to the selected projection method.
  • For instance, FIG. 4A and FIG. 4B are examples illustrating patch projection methods according to an exemplary embodiment of the disclosure. Taking the projection method of 2 orientations for example, as shown in FIG. 4A, it includes rotating a patch by 0° and rotating a mirror image of the patch by 270°, and the corresponding indexes are 0 and 1. Moreover, taking the projection method of 8 orientations as an example, as shown in FIG. 4B, it includes rotating the patch by 0°, 90°, 180°, and 270° and rotating a mirror image of the patch by 0°, 90°, 180°, and 270°, and the corresponding indexes are 0 to 7, respectively.
  • In an embodiment, in the process of determining the patch image adapted to be generated by each patch, in addition to determining the orientation of each patch, the point cloud patch processing apparatus 10 may further determine an offset of each patch to shift the patch. Through shifting the patches after being rotated, the patches in the patch image may be arranged more closely, so that efficiency of encoding is increased. The offset is, for example, an offset between a position of each patch after being shifted and an original point of the patch or an original point of the point cloud, which is not limited therein.
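  • A minimal sketch of recording an offset while packing an already-rotated patch (the first-fit scan and the notion of a default position are illustrative assumptions):
```python
import numpy as np

def place_with_offset(occupancy, mask, default_pos=(0, 0)):
    """Find the first free position for an (already rotated) patch and report the
    offset between that position and the patch's default position."""
    H, W = occupancy.shape
    h, w = mask.shape
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            if not np.any(occupancy[r:r + h, c:c + w] & mask):
                occupancy[r:r + h, c:c + w] |= mask
                return (r - default_pos[0], c - default_pos[1])   # recorded offset
    raise ValueError("patch does not fit in the patch image")

occupancy = np.zeros((8, 8), dtype=bool)
occupancy[:4, :4] = True                      # pretend another patch is already packed
print(place_with_offset(occupancy, np.ones((2, 3), bool)))   # (0, 4)
```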
  • For instance, FIG. 5 is an example illustrating patch image generation through rotating and shifting patches according to an exemplary embodiment of the disclosure. With reference to FIG. 5, a patch image 52 is generated through a conventional method. Taking patches 52 a, 52 b, and 52 c as an example, in the conventional manner, without rotating or shifting the patches, unoccupied regions in the patch image 52 that may be used to accommodate an existing patch are found row by row from top to bottom, directly according to the regions in the patch image 52 that are occupied by other patches at the moment. The patches 52 a, 52 b, and 52 c are then placed into the patch image 52 in sequence.
  • On the other hand, a patch image 54 is generated through the method for processing patches of the point cloud provided by the embodiments of the disclosure. Taking patches 54 a, 54 b, and 54 c as an example, after rotating the existing patches to different orientations, the point cloud patch processing apparatus further shifts the rotated patches in the patch image 54, so as to determine the orientations and offsets with which the existing patches are adapted to generate the patch image 54 according to the similarity between the existing patches and the other patches after the existing patches are rotated to different orientations and shifted. Finally, the patches are rotated according to the determined orientations, and the rotated patches are shifted according to the determined offsets so as to generate the patch image 54. When the patch image 52 and the patch image 54 are compared, it can be seen that the patches in the patch image 54 are arranged more closely since they are appropriately rotated and shifted, so that encoding efficiency is enhanced.
  • In an embodiment, the processor 16, for example, processes the geometry image, the texture image, and the occupancy map through image padding, smoothing, and compression, adds the indexes recording the orientation of each patch to the auxiliary patch information, and performs compression. Finally, the compressed data is composed as a compressed bit stream through the multiplexer and outputted through the I/O device 12.
  • Specifically, FIG. 6 is a detailed flow chart of the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure. With reference to FIG. 1 and FIG. 6 together, the method provided by this embodiment is adapted to the point cloud patch processing apparatus 10, and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 10 are described in detail as follows.
  • First, in step 601, the processor 16 generates a plurality of patches after receiving point cloud data through the I/O device 12, and in step 602, changes patch orientations through the method for processing patches of the point cloud to generate a patch image and patch information as described above (e.g., the orientation of each patch in the patch image).
  • The processor 16 packs the patch image to generate an occupancy map in step 603, generates a geometry image in step 604, and generates a texture image in step 605. Image padding and video compression are performed on the generated geometry image and texture image respectively in step 606 and step 607 in sequence, so that a compressed geometry image and a compressed texture image are obtained. Herein, in step 608, the processor 16 performs smoothing on the geometry image reconstructed by using the compressed geometry image according to the previously-generated patch information and then feeds the processed geometry image back to step 605, so that the texture image may be accordingly generated.
  • From another aspect, in step 609, the processor 16 compresses the occupancy map generated in step 603 and in step 610, adds the patch information generated in step 602 to the auxiliary patch information and compresses the auxiliary patch information. Finally, in step 611, the processor 16 composes the compressed geometry image, the compressed texture image, the compressed occupancy map, and the compressed auxiliary patch information generated in the foregoing steps into a compressed bit stream through a multiplexer and outputs the compressed bit stream through the I/O device 12.
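  • Only to illustrate the composition step of step 611, a toy multiplexer might concatenate the four compressed sub-streams with length prefixes, as in the sketch below; the actual bit-stream syntax is defined by the applicable point cloud compression specification and is not reproduced here:
```python
import struct

def multiplex(geometry, texture, occupancy, auxiliary):
    """Toy multiplexer: concatenate the four compressed sub-streams,
    each preceded by a 32-bit big-endian length prefix (illustrative only)."""
    out = bytearray()
    for chunk in (geometry, texture, occupancy, auxiliary):
        out += struct.pack(">I", len(chunk)) + chunk
    return bytes(out)

stream = multiplex(b"geo", b"tex", b"occ", b"aux")
print(len(stream))   # 4 * (4 + 3) = 28 bytes
```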
  • Through the foregoing method, in the point cloud patch processing apparatus 10 provided by the embodiments of the disclosure, the patch image is generated after the plurality of patches generated by the point cloud are appropriately transformed. Further, information such as the patch image and indexes recording patch orientations is compressed and then outputted, so that the decoder may recover the patches and reconstruct the point cloud according to the information. Several examples are provided below to illustrate a structure of a decoder and a method for processing patches of a point cloud (decoding method) corresponding to the decoder.
  • FIG. 7 is a block diagram of a point cloud patch processing apparatus according to an exemplary embodiment of the disclosure. With reference to FIG. 7, in this embodiment, a point cloud patch processing apparatus 70 is, for example, a camera, a video camera, a cell phone, a personal computer, a virtual reality apparatus, an augmented reality apparatus, a cloud server, or another apparatus with a computing function, and at least includes an I/O device 72, a storage device 74, and a processor 76. The constitutions of the I/O device 72, the storage device 74, and the processor 76 are identical or similar to those of the I/O device 12, the storage device 14, and the processor 16 provided in the foregoing embodiments, and repeated description is thus not provided herein. The difference between this embodiment and the foregoing embodiments is that the point cloud patch processing apparatus 70 of this embodiment is used as a decoder to execute the method for processing patches of the point cloud provided by the embodiments of the disclosure.
  • Specifically, FIG. 8 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure. With reference to FIG. 7 and FIG. 8 together, the method provided by this embodiment is adapted to the point cloud patch processing apparatus 70, and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 70 are described in detail as follows.
  • First, in step 801, the processor 76 receives a bit stream of the point cloud through the I/O device 72, so as to demultiplex the bit stream of the point cloud into a patch image and indexes corresponding to a plurality of patches in the patch image.
  • Next, in step 802, the processor 76 looks up an index table pre-stored in the storage device 74 to obtain an orientation of each patch and transforms the patch image according to the orientations being looked up to recover the plurality of patches of the point cloud.
  • In an embodiment, the processor 76 may select the index table to look up the orientations of the patches according to a predetermined projection method, and such projection method includes, for example, 2 orientations, 8 orientations, or any subset of n orientations (n is an integer greater than 2).
  • Taking 2 orientations as an example, the projection method includes rotating a patch by 0° (i.e., not rotated) and rotating the patch to a predetermined orientation (e.g., rotating a mirror image of the patch by 270°). The processor 76 may look up the index table by using the indexes to determine whether each patch in the patch image has been rotated. Herein, if rotated, the processor 76 inverts the patches according to the predetermined orientation to recover the patches, and if not rotated, the processor 76 does not transform the patches.
  • It is noted that in an embodiment, the processor 76 further looks up an offset of each patch in the patch image by using the indexes, so as to reversely shift each patch according to the offset being looked up. Herein, the offset is, for example, an offset between a position of each patch after being shifted and an original point of the patch or an original point of the point cloud, which is not limited therein.
  • For instance, the following Table 1 is a lookup table of adaptive patch rotation functions. Herein, a value of an identifier corresponds to, for example, an index Idx. That is, when the index Idx is 0, the identifier is set as FPO_NULL, and when the index Idx is 1, the identifier is set as FPO_SWAP.
  • TABLE 1
    Idx   Identifier   Rotation(Idx)   Offset(Idx)
    0     FPO_NULL     [1 0; 0 1]      [0; 0]
    1     FPO_SWAP     [0 1; 1 0]      [0; 0]
  • Based on the set identifier, after the patches are recovered, the frame coordinates (x, y) may be calculated as follows through the lookup table:
  • [x; y] = Rotation(Idx) · [u; v] + Offset(Idx) + [Patch2dShiftU(fIdx); Patch2dShiftV(fIdx)]   (1)
  • Herein, the coordinates (u, v) are the original coordinates of each patch in the patch image. The outputs of the rotation function Rotation(x) and the offset function Offset(x) are the matrices defined in Table 1. The index fIdx is the index of the frame marked in the image, and the patch shift functions Patch2dShiftU and Patch2dShiftV are respectively the X position and the Y position of the top left corner of the patch in the patch image.
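  • Formula (1) with the Table 1 entries can be evaluated as in the following sketch (the numpy representation and the function names are illustrative, not part of the disclosure):
```python
import numpy as np

# Table 1 as data: index -> (rotation matrix, offset vector).
TABLE1 = {
    0: (np.array([[1, 0], [0, 1]]), np.array([0, 0])),   # FPO_NULL
    1: (np.array([[0, 1], [1, 0]]), np.array([0, 0])),   # FPO_SWAP
}

def frame_coordinates(idx, u, v, shift_u, shift_v, table=TABLE1):
    """Evaluate formula (1): [x; y] = Rotation(Idx) [u; v] + Offset(Idx) + 2-D shift.
    shift_u / shift_v stand for Patch2dShiftU(fIdx) and Patch2dShiftV(fIdx)."""
    rot, off = table[idx]
    return rot @ np.array([u, v]) + off + np.array([shift_u, shift_v])

print(frame_coordinates(1, u=3, v=7, shift_u=10, shift_v=20))   # [17 23]
```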
  • From another aspect, taking 8 orientations as an example, the projection method includes rotating the patch by 0° (i.e., not rotated), 90°, 180°, and 270°, and rotating a mirror image of the patch by 0°, 90°, 180°, and 270°. The processor 76 may determine to which of the 8 predetermined orientations each patch in the patch image has been rotated by looking up the index table, so as to invert each patch according to the predetermined orientation being looked up.
  • For instance, the following Table 2 is a lookup table of adaptive patch rotation functions. Herein, values of the rotation function Rotation(x) and the offset function Offset(x) correspond to, for example, the index Idx.
  • TABLE 2
    Idx   Rotation(Idx)    Offset(Idx)
    0     [ 1  0;  0  1]   [0; 0]
    1     [ 0 -1;  1  0]   [Patch2dSizeV*BlockSize-1; 0]
    2     [-1  0;  0 -1]   [Patch2dSizeU*BlockSize-1; Patch2dSizeV*BlockSize-1]
    3     [ 0  1; -1  0]   [0; Patch2dSizeU*BlockSize-1]
    4     [-1  0;  0  1]   [Patch2dSizeU*BlockSize-1; 0]
    5     [ 0 -1; -1  0]   [Patch2dSizeV*BlockSize-1; Patch2dSizeU*BlockSize-1]
    6     [ 1  0;  0 -1]   [0; Patch2dSizeV*BlockSize-1]
    7     [ 0  1;  1  0]   [0; 0]
  • Herein, BlockSize is the size of a coding block of the occupancy map, and Patch2dSizeU and Patch2dSizeV are respectively the values obtained by dividing the width and the height of the patch by BlockSize. The outputs of the rotation function Rotation(x) and the offset function Offset(x) may be looked up in Table 2 and substituted into formula (1), so as to calculate the frame coordinates (x, y) of the patch.
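  • The Table 2 entries can be expressed the same way; the sketch below builds Rotation(Idx) and Offset(Idx) from Patch2dSizeU, Patch2dSizeV, and BlockSize and applies one entry to example coordinates (again only an illustration of the lookup, not a normative implementation):
```python
import numpy as np

def table2(idx, size_u, size_v, block_size):
    """Rotation(Idx) and Offset(Idx) from Table 2 for the 8-orientation method.
    size_u / size_v are Patch2dSizeU / Patch2dSizeV and block_size is BlockSize."""
    W = size_u * block_size - 1          # Patch2dSizeU*BlockSize-1
    H = size_v * block_size - 1          # Patch2dSizeV*BlockSize-1
    entries = [
        (np.array([[ 1,  0], [ 0,  1]]), np.array([0, 0])),
        (np.array([[ 0, -1], [ 1,  0]]), np.array([H, 0])),
        (np.array([[-1,  0], [ 0, -1]]), np.array([W, H])),
        (np.array([[ 0,  1], [-1,  0]]), np.array([0, W])),
        (np.array([[-1,  0], [ 0,  1]]), np.array([W, 0])),
        (np.array([[ 0, -1], [-1,  0]]), np.array([H, W])),
        (np.array([[ 1,  0], [ 0, -1]]), np.array([0, H])),
        (np.array([[ 0,  1], [ 1,  0]]), np.array([0, 0])),
    ]
    return entries[idx]

# One Table 2 entry applied in the same way as formula (1), with zero patch shift:
rot, off = table2(1, size_u=4, size_v=2, block_size=16)
print(rot @ np.array([3, 7]) + off)   # [24  3]
```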
  • Referring back to the flow of FIG. 8, in step 803, the processor 76 reconstructs the point cloud through the recovered patches. Herein, the processor 76, for example, projects each two-dimensional patch to a three-dimensional space through the predetermined projection method so as to reconstruct the point cloud. Specifically, the processor 76 may demultiplex the bit stream of the point cloud into the geometry image, the texture image, the occupancy map, and the auxiliary patch information corresponding to each frame. The processor 76 may learn which pixels in the geometry image and the texture image are valid data through the occupancy map and may learn patch information, such as which patch each pixel belongs to, through the auxiliary patch information. The processor 76 may then project the two-dimensional patches to the three-dimensional space through the valid data and the patch information, so as to reconstruct the point cloud.
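  • A minimal sketch of lifting one decoded patch back to three dimensions (the origin field, the axis convention, and the depth interpretation are assumptions made only for illustration):
```python
import numpy as np

def reproject_patch(geometry, occupancy, origin, normal_axis=2):
    """Every valid pixel (per the occupancy map) becomes a point whose depth is
    read from the geometry image and whose tangential coordinates come from the
    pixel position plus the patch's 3-D origin (illustrative convention)."""
    points = []
    ou, ov, od = origin
    for v, u in zip(*np.nonzero(occupancy)):
        depth = int(geometry[v, u])
        point = [ou + u, ov + v, od + depth]
        # normal_axis selects which world axis the projection plane faces
        points.append(np.roll(point, normal_axis - 2))
    return np.array(points)

geometry = np.array([[5, 6], [7, 8]])
occupancy = np.array([[1, 0], [1, 1]], dtype=bool)
print(reproject_patch(geometry, occupancy, origin=(10, 20, 30)))
```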
  • Specifically, FIG. 9 is a detailed flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure. With reference to FIG. 7 and FIG. 9 together, the method provided by this embodiment is adapted to the point cloud patch processing apparatus 70, and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 70 are described in detail as follows.
  • First, in step 901, after receiving the compressed bit stream of the point cloud from the I/O device 72, the processor 76 demultiplexes the compressed bit stream into the compressed texture image, the compressed geometry image, the compressed occupancy map, and the compressed auxiliary patch information. Herein, in step 902, the processor 76 performs video decompression on the compressed texture image and the compressed geometry image to generate a decompressed texture image and a decompressed geometry image. In step 903, the processor 76 decompresses the compressed occupancy map to generate a decompressed occupancy map. In step 904, the processor 76 decompresses the compressed auxiliary patch information to generate decompressed auxiliary patch information.
  • In step 905, the processor 76 learns which pixels in the decompressed geometry image are valid data according to the decompressed occupancy map and then inverts the orientation of each patch in the decompressed geometry image according to the patch information such as the orientation and offset of each patch recorded in the decompressed auxiliary patch information. Such inversion may include orientation inversion and/or shift inversion, which is not limited herein.
  • In step 906, the processor 76 projects the inverted patches to the three-dimensional space to reconstruct geometry portions (e.g., including positions and shapes) and in step 907, performs smoothing on the reconstructed geometry portions. Finally, in step 908, the processor 76 decompresses the texture image and performs texture reconstruction on the reconstructed geometry portions, so as to obtain the reconstructed point cloud.
  • Through the foregoing method, the point cloud patch processing apparatus 70 provided by the embodiments of the disclosure may decompose the compressed bit stream into the patch image and the patch information required for reconstructing the point cloud, and reconstruct the point cloud by inverting the patches and projecting the inverted patches to the three-dimensional space.
  • In an embodiment, in the method for processing patches of the point cloud provided by the disclosure, encoding is performed with reference to parameters such as computing resources and network resources of the existing device, and the appropriate projection method is then selected to process the patches, so that high quality and high speed video transmission is ensured.
  • Specifically, FIG. 10 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure. With reference to FIG. 1 and FIG. 10 together, the method of this embodiment is adapted to the point cloud patch processing apparatus 10, and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 10 are described in detail as follows.
  • First, in step 1001, the processor 16 generates a plurality of patches of the point cloud. In step 1002, the processor 16 determines whether to use 2 rotation orientations or 8 rotation orientations to rotate the patches according to parameters at the moment, such as available computing resources, network bandwidth, encoding capabilities of the encoder, and decoding capabilities of the decoder, and sets a preferable orientation flag according to the determination result, so that the decoder may perform decoding with reference to the preferable orientation flag. Herein, if the value of the preferable orientation flag is set to 1, then 2 rotation orientations are used in step 1003, and if the value of the preferable orientation flag is set to 0, then 8 rotation orientations are used in step 1004.
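  • The resource-based decision of FIG. 10 might be expressed as in this sketch (the inputs and thresholds are illustrative assumptions; the flag semantics follow the paragraph above):
```python
def choose_orientation_mode(cpu_budget, bandwidth, low_cpu=0.3, low_bw=0.3):
    """Fall back to the cheaper 2-orientation method when resources are scarce,
    otherwise use all 8 orientations (thresholds are placeholders)."""
    use_two = cpu_budget < low_cpu or bandwidth < low_bw
    preferable_orientation_flag = 1 if use_two else 0   # 1 -> 2 orientations, 0 -> 8
    n_orientations = 2 if use_two else 8
    return preferable_orientation_flag, n_orientations

print(choose_orientation_mode(cpu_budget=0.2, bandwidth=0.9))   # (1, 2)
```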
  • In addition, the processor 16 may also use more flags to mark the processing method of each patch according to needs of the system or user, so that the decoder may accordingly recover the patches and reconstruct the point cloud.
  • Specifically, FIG. 11 is a flow chart illustrating the method for processing patches of the point cloud according to an exemplary embodiment of the disclosure. With reference to FIG. 1 and FIG. 11 together, the method provided by this embodiment is suitable for the point cloud patch processing apparatus 10, and steps of the method for processing patches of the point cloud provided by this embodiment together with the devices of the point cloud patch processing apparatus 10 are described in detail as follows.
  • First, in step 1101, the processor 16 generates a plurality of patches of the point cloud. In step 1102, the processor 16 determines whether to enable an orientation rotation function according to system configuration or user operation. Herein, if it is determined that the orientation rotation function is required to be enabled, the processor 16 sets the value of an orientation enabling flag to be 1, and performs step 1103. Conversely, the processor 16 sets the value of the orientation enabling flag to be 0, and performs step 1111 in which no patch is rotated.
  • If it is determined that the orientation rotation function is required to be enabled, in step 1103, the processor 16 determines whether to use 2 rotation orientations or 8 rotation orientations to rotate the patches according to the available resource parameters at the moment and sets the preferable orientation flag according to the determination result. Herein, if the value of the preferable orientation flag is set to 1, then 2 rotation orientations are used in step 1104, and if the value of the preferable orientation flag is set to 0, then 8 rotation orientations are used in step 1105.
  • If the 2 rotation orientations are used, in step 1106, the processor 16 may determine whether each patch is adapted to rotate to the predetermined orientation and set a patch rotation flag according to the determination result. Herein, if the value of the patch rotation flag is set to be 1, the patches are rotated to a predetermined rotation orientation in step 1108, and if the value of the patch rotation flag is set to be 0, no patch is rotated in step 1109.
  • If the 8 rotation orientations are used, in step 1107, the processor 16 may determine whether each patch is adapted to rotate to the predetermined orientation (i.e., the 7 orientations other than rotation by 0°) and set the patch rotation flag according to the determination result. Herein, if the value of the patch rotation flag is set to be 1, the patches are rotated to one of the predetermined 7 rotation orientations and the index of such rotation orientation is recorded (e.g., using 3 bits to record 7 types of states) in step 1110, and if the value of the patch rotation flag is set to be 0, no patch is rotated (i.e. rotation by 0°) in step 1109.
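  • Put together, the per-patch signalling of FIG. 11 could be sketched as follows (the bit layout is an assumption made for illustration, not a normative syntax):
```python
def signal_patch(orientation_enabled, use_two, rotated, rotation_idx=0):
    """Emit an orientation-enabling flag, a preferable-orientation flag, a patch
    rotation flag, and (only in the 8-orientation case) a 3-bit index for one of
    the 7 non-zero rotations."""
    bits = [1 if orientation_enabled else 0]
    if orientation_enabled:
        bits.append(1 if use_two else 0)
        bits.append(1 if rotated else 0)
        if rotated and not use_two:
            bits += [int(b) for b in format(rotation_idx, "03b")]   # 7 states in 3 bits
    return bits

print(signal_patch(True, use_two=False, rotated=True, rotation_idx=5))
# [1, 0, 1, 1, 0, 1]
```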
  • Finally, in step 1112, the processor 16 performs processing such as image padding, smoothing, and compression on the patches and the auxiliary information generated through the foregoing steps, and detailed content thereof is described in the foregoing embodiments and thus is not repeated herein.
  • In view of the foregoing, in the method and apparatus for processing patches of the point cloud provided by the disclosure, the patches of the point cloud are rotated and/or shifted during encoding, such that in the generated patch image, the patches are arranged more closely, and encoding efficiency of point cloud compression is therefore improved. In addition, in the method provided by the disclosure, patch information such as the rotation orientation and offset of each patch is further added to the auxiliary information and provided to the decoder together, so that the decoder may recover the patches and reconstruct the point cloud. Accordingly, encoding efficiency of point cloud compression is improved, and high quality and high speed video transmission is ensured.
  • It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the disclosure covers modifications and variations provided that they fall within the scope of the following claims and their equivalents.

Claims (20)

What is claimed is:
1. An apparatus for processing patches of a point cloud, comprising:
an input/output (I/O) device, receiving point cloud data;
a storage device, storing an index table recording indexes corresponding to a plurality of orientations;
a processor, coupled to the input/output device and the storage device, and configured to execute a program to:
generate a plurality of patches of the point cloud, wherein the point cloud comprises a plurality of points in a three-dimensional space, and each of the patches corresponds to a portion of the point cloud;
determine an orientation in which each patch is adapted to generate a patch image and transform each patch to generate the patch image according to the orientation; and
pack the patch image and look up the index table to obtain the index corresponding to the orientation of each patch.
2. The apparatus for processing patches of the point cloud as claimed in claim 1, wherein the processor further determines whether each patch is adapted to rotate to a predetermined orientation, places the patch into the patch image after rotating the patch if the patch is adapted to rotate, and directly places the patch into the patch image if the patch is not adapted to rotate.
3. The apparatus for processing patches of the point cloud as claimed in claim 1, wherein the processor further determines whether a width of each patch is greater than a height of the patch, places the patch into the patch image after rotating the patch to a predetermined orientation if a determination result is no, and directly places the patch into the patch image without rotating if the determination result is yes.
4. The apparatus for processing patches of the point cloud as claimed in claim 1, wherein the processor further determines whether each patch is adapted to rotate by 0° or adapted to rotate a mirror image of the patch by 270° and places each patch into the patch image after rotating the patch according to the determined orientation.
5. The apparatus for processing patches of the point cloud as claimed in claim 1, wherein the processor further determines one of n predetermined orientations that each patch is adapted to rotate to, wherein n is an integer greater than 2, and places each patch into the patch image after rotating the patch according to the determined predetermined orientation.
6. The apparatus for processing patches of the point cloud as claimed in claim 5, wherein the n orientations comprise any combination of rotation by 0°, 90°, 180°, and 270° and rotation of a mirror image by 0°, 90°, 180°, and 270°.
7. The apparatus for processing patches of the point cloud as claimed in claim 1, wherein the processor selects m predetermined orientations from n predetermined orientations, wherein n is an integer greater than 2 and m is an integer less than n, determines one of the m predetermined orientations that each patch is adapted to rotate to, and places each patch into the patch image after rotating the patch according to the determined predetermined orientation.
8. The apparatus for processing patches of the point cloud as claimed in claim 1, wherein the processor further rotates each patch according to the orientation, determines an offset of each patch adapted to generate the patch image after rotating each patch, and shifts each rotated patch according to the offset to generate the patch image, wherein the offset comprises an offset between a position of each patch after being shifted and an original point of the patch or an original point of the point cloud.
9. An apparatus for processing patches of a point cloud, comprising:
an input/output (I/O) device, receiving a bit stream of the point cloud;
a storage device, storing an index table recording indexes corresponding to a plurality of orientations;
a processor, coupled to the input/output device and the storage device, and configured to execute a program to:
demultiplex the bit stream into a patch image and indexes corresponding to a plurality of patches in the patch image;
look up the index table to obtain an orientation of each patch and transform the patch image according to the orientation to recover the plurality of patches of the point cloud; and
reconstruct the point cloud by using the recovered patches.
10. The apparatus for processing patches of the point cloud as claimed in claim 9, wherein the processor further looks up the index table to determine whether each patch has been rotated, wherein the patch is inverted to recover the patch if the patch has been rotated, and the patch is not transformed if the patch has not been rotated.
11. The apparatus for processing patches of the point cloud as claimed in claim 9, wherein the processor further looks up the index table to determine whether the patch has been rotated by 0° or has been rotated by rotating a mirror image of the patch by 270° and inverts each patch according to the determined orientation.
12. The apparatus for processing patches of the point cloud as claimed in claim 9, wherein the processor further looks up the index table to determine whether the patch has been rotated to one of n predetermined orientations, wherein n is an integer greater than 2, and inverts each patch according to the determined predetermined orientation.
13. The apparatus for processing patches of the point cloud as claimed in claim 12, wherein the n orientations comprise any combination of rotation by 0°, 90°, 180°, and 270° and rotation of a mirror image by 0°, 90°, 180°, and 270°.
14. The apparatus for processing patches of the point cloud as claimed in claim 9, wherein each index further comprises an offset of each patch, and the processor further reversely shifts each patch according to the offset, wherein the offset comprises an offset between a position of each patch after being shifted and an original point of the patch or an original point of the point cloud.
15. A method for processing patches of a point cloud, adapted to a decoder having a processor, the method comprising:
demultiplexing a bit stream of the point cloud into a patch image and indexes corresponding to a plurality of patches in the patch image;
looking up an index table to obtain an orientation of each patch and transforming the patch image according to the orientation to recover the plurality of patches of the point cloud; and
reconstructing the point cloud by using the recovered patches.
16. The method as claimed in claim 15, wherein the step of looking up the index table to obtain the orientation of each patch and transforming the patch image according to the orientation comprises:
looking up the index table to determine whether each patch has been rotated;
inverting the patch to recover the patch if the patch has been rotated; and
not transforming the patch if the patch has not been rotated.
17. The method as claimed in claim 15, wherein the step of looking up the index table to obtain the orientation of each patch and transforming the patch image according to the orientation comprises:
looking up the index table to determine whether the patch has been rotated by 0° or has been rotated by rotating a mirror image of the patch by 270°; and
inverting each patch according to the determined orientation.
18. The method as claimed in claim 15, wherein the step of looking up the index table to obtain the orientation of each patch and transforming the patch image according to the orientation comprises:
looking up the index table to determine whether the patch is rotated to one of n predetermined orientations, wherein n is an integer greater than 2; and
inverting each patch according to the determined predetermined orientation.
19. The method as claimed in claim 18, wherein the n orientations comprise any combination of rotation by 0°, 90°, 180°, and 270° and rotation of a mirror image by 0°, 90°, 180°, and 270°.
20. The method as claimed in claim 15, wherein each index further comprises an offset of each patch, and the step of transforming the patch image according to the orientation further comprises:
reversely shifting each patch according to the offset, wherein the offset comprises an offset between a position of each patch after being shifted and an original point of the patch or an original point of the point cloud.
US16/502,036 2018-07-03 2019-07-03 Method and apparatus for processing patches of point cloud Abandoned US20200013235A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/502,036 US20200013235A1 (en) 2018-07-03 2019-07-03 Method and apparatus for processing patches of point cloud

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862693485P 2018-07-03 2018-07-03
US16/502,036 US20200013235A1 (en) 2018-07-03 2019-07-03 Method and apparatus for processing patches of point cloud

Publications (1)

Publication Number Publication Date
US20200013235A1 true US20200013235A1 (en) 2020-01-09

Family

ID=67437496

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/502,036 Abandoned US20200013235A1 (en) 2018-07-03 2019-07-03 Method and apparatus for processing patches of point cloud

Country Status (5)

Country Link
US (1) US20200013235A1 (en)
EP (1) EP3591975A1 (en)
JP (1) JP2020017946A (en)
CN (1) CN110675315A (en)
TW (1) TW202006659A (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11010955B2 (en) * 2018-06-27 2021-05-18 Sony Group Corporation Point cloud mapping
US10650554B2 (en) * 2018-09-27 2020-05-12 Sony Corporation Packing strategy signaling
SG11202103302VA (en) 2018-10-02 2021-04-29 Huawei Tech Co Ltd Motion estimation using 3d auxiliary data
US11095900B2 (en) * 2018-12-19 2021-08-17 Sony Group Corporation Point cloud coding structure
WO2021170906A1 (en) * 2020-02-28 2021-09-02 Nokia Technologies Oy An apparatus, a method and a computer program for volumetric video
WO2022075234A1 (en) * 2020-10-05 2022-04-14 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device
TWI798999B (en) * 2021-12-15 2023-04-11 財團法人工業技術研究院 Device and method for buliding three-dimensional video
CN114299158B (en) * 2021-12-28 2025-03-28 北京市商汤科技开发有限公司 Multi-camera system calibration method, device, system, electronic device and storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7990397B2 (en) * 2006-10-13 2011-08-02 Leica Geosystems Ag Image-mapped point cloud with ability to accurately represent point coordinates
US9300976B2 (en) * 2011-01-14 2016-03-29 Cisco Technology, Inc. Video encoder/decoder, method and computer program product that process tiles of video data
CA2861951C (en) * 2012-01-20 2020-08-11 Thomas Schierl Coding concept allowing parallel processing, transport demultiplexer and video bitstream
US9530225B1 (en) * 2013-03-11 2016-12-27 Exelis, Inc. Point cloud data processing for scalable compression
CN107958489B (en) * 2016-10-17 2021-04-02 杭州海康威视数字技术股份有限公司 Surface reconstruction method and device
US11514613B2 (en) * 2017-03-16 2022-11-29 Samsung Electronics Co., Ltd. Point cloud and mesh compression using image/video codecs
US10909725B2 (en) * 2017-09-18 2021-02-02 Apple Inc. Point cloud compression
WO2019198522A1 (en) * 2018-04-11 2019-10-17 ソニー株式会社 Image processing device and method

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11935272B2 (en) 2017-09-14 2024-03-19 Apple Inc. Point cloud compression
US11818401B2 (en) 2017-09-14 2023-11-14 Apple Inc. Point cloud geometry compression using octrees and binary arithmetic encoding with adaptive look-up tables
US11552651B2 (en) 2017-09-14 2023-01-10 Apple Inc. Hierarchical point cloud compression
US11922665B2 (en) 2017-09-18 2024-03-05 Apple Inc. Point cloud compression
US11676309B2 (en) 2017-09-18 2023-06-13 Apple Inc Point cloud compression using masks
US11527018B2 (en) 2017-09-18 2022-12-13 Apple Inc. Point cloud compression
US11514611B2 (en) 2017-11-22 2022-11-29 Apple Inc. Point cloud compression with closed-loop color conversion
US11361471B2 (en) 2017-11-22 2022-06-14 Apple Inc. Point cloud occupancy map compression
US11508095B2 (en) 2018-04-10 2022-11-22 Apple Inc. Hierarchical point cloud compression with smoothing
US12100183B2 (en) 2018-04-10 2024-09-24 Apple Inc. Point cloud attribute transfer algorithm
US11508094B2 (en) 2018-04-10 2022-11-22 Apple Inc. Point cloud compression
US11727603B2 (en) 2018-04-10 2023-08-15 Apple Inc. Adaptive distance based point cloud compression
US11533494B2 (en) 2018-04-10 2022-12-20 Apple Inc. Point cloud compression
US11663744B2 (en) 2018-07-02 2023-05-30 Apple Inc. Point cloud compression with adaptive filtering
US20220070493A1 (en) * 2018-07-05 2022-03-03 Apple Inc. Point Cloud Compression with Multi-Resolution Video Encoding
US20200014953A1 (en) * 2018-07-05 2020-01-09 Apple Inc. Point cloud compression with multi-resolution video encoding
US11202098B2 (en) * 2018-07-05 2021-12-14 Apple Inc. Point cloud compression with multi-resolution video encoding
US11683525B2 (en) * 2018-07-05 2023-06-20 Apple Inc. Point cloud compression with multi-resolution video encoding
US20210183110A1 (en) * 2018-07-12 2021-06-17 Huawei Technologies Co., Ltd. Point Cloud Encoding Method, Point Cloud Decoding Method, Encoder, and Decoder
US11647226B2 (en) 2018-07-12 2023-05-09 Apple Inc. Bit stream structure for compressed point cloud data
US12401822B2 (en) 2018-07-12 2025-08-26 Apple Inc. Bit stream structure for compressed point cloud data
US12165366B2 (en) * 2018-07-12 2024-12-10 Huawei Technologies Co., Ltd. Point cloud encoding method and decoding method using rotation matrixes, encoder, and decoder
US11398058B2 (en) * 2018-07-17 2022-07-26 Huawei Technologies Co., Ltd. Prediction type signaling and temporal order signaling in point cloud coding (PCC)
US11386524B2 (en) 2018-09-28 2022-07-12 Apple Inc. Point cloud compression image padding
US11748916B2 (en) 2018-10-02 2023-09-05 Apple Inc. Occupancy map block-to-patch information compression
US11367224B2 (en) 2018-10-02 2022-06-21 Apple Inc. Occupancy map block-to-patch information compression
US12094179B2 (en) 2018-10-05 2024-09-17 Apple Inc. Quantized depths for projection point cloud compression
US11430155B2 (en) 2018-10-05 2022-08-30 Apple Inc. Quantized depths for projection point cloud compression
US11606576B2 (en) * 2018-10-08 2023-03-14 Samsung Electronics Co., Ltd. Method and apparatus for generating media file comprising 3-dimensional video content, and method and apparatus for replaying 3-dimensional video content
US20220053216A1 (en) * 2018-10-08 2022-02-17 Samsung Electronics Co., Ltd. Method and apparatus for generating media file comprising 3-dimensional video content, and method and apparatus for replaying 3-dimensional video content
US11516394B2 (en) 2019-03-28 2022-11-29 Apple Inc. Multiple layer flexure for supporting a moving image sensor
US12439083B2 (en) 2019-07-02 2025-10-07 Apple Inc. Point cloud compression with supplemental information messages
US11562507B2 (en) 2019-09-27 2023-01-24 Apple Inc. Point cloud compression using video encoding with time consistent patches
US11627314B2 (en) 2019-09-27 2023-04-11 Apple Inc. Video-based point cloud compression with non-normative smoothing
US11538196B2 (en) 2019-10-02 2022-12-27 Apple Inc. Predictive coding for point cloud compression
US11895307B2 (en) 2019-10-04 2024-02-06 Apple Inc. Block-based predictive coding for point cloud compression
US11798196B2 (en) 2020-01-08 2023-10-24 Apple Inc. Video-based point cloud compression with predicted patches
US11625866B2 (en) 2020-01-09 2023-04-11 Apple Inc. Geometry encoding using octrees and predictive trees
WO2021165566A1 (en) * 2020-02-19 2021-08-26 Nokia Technologies Oy An apparatus, a method and a computer program for volumetric video
US20230179797A1 (en) * 2020-03-25 2023-06-08 Sony Group Corporation Image processing apparatus and method
US11615557B2 (en) 2020-06-24 2023-03-28 Apple Inc. Point cloud compression using octrees with slicing
US11620768B2 (en) 2020-06-24 2023-04-04 Apple Inc. Point cloud geometry compression using octrees with multiple scan orders
US11948338B1 (en) 2021-03-29 2024-04-02 Apple Inc. 3D volumetric content encoding using 2D videos and simplified 3D meshes
US20240249462A1 (en) * 2021-04-07 2024-07-25 Interdigital Ce Patent Holdings, Sas Volumetric video supporting light effects
US20240406440A1 (en) * 2021-07-21 2024-12-05 Nokia Technologies Oy Patch creation and signaling for v3c dynamic mesh compression
WO2024094540A1 (en) * 2022-11-04 2024-05-10 Interdigital Ce Patent Holdings, Sas Coding format for optimized encoding of volumetric video

Also Published As

Publication number Publication date
JP2020017946A (en) 2020-01-30
EP3591975A1 (en) 2020-01-08
TW202006659A (en) 2020-02-01
CN110675315A (en) 2020-01-10

Similar Documents

Publication Publication Date Title
US20200013235A1 (en) Method and apparatus for processing patches of point cloud
US10798389B2 (en) Method and apparatus for content-aware point cloud compression using HEVC tiles
US10339701B2 (en) Method, system and apparatus for generation and playback of virtual reality multimedia
US20190108655A1 (en) Method and apparatus for encoding a point cloud representing three-dimensional objects
CN107454468B (en) Method, apparatus and stream for formatting immersive video
JP2022504344A (en) Methods and Devices for Encoding and Reconstructing Point Cloud Missing Points
JP2022524785A (en) Point cloud geometry padding
US12316883B2 (en) Techniques and apparatus for automatic ROI chunking for content-aware point cloud compression using HEVC tiles
CN114051734A (en) Method and device for decoding three-dimensional scene
CN110915216A (en) Method and apparatus for encoding/decoding colored point clouds representing geometry and color of 3D objects
JP2020536325A (en) Methods and devices for generating points in 3D scenes
JP7371691B2 (en) Point cloud encoding using homography transformation
KR20210114046A (en) Quantization Step Parameters for Point Cloud Compression
CN115398904A (en) 3D scene transmission with alpha layer
RU2767771C1 (en) Method and equipment for encoding/decoding point cloud representing three-dimensional object
WO2018067832A1 (en) Geometry sequence encoder and decoder
CN118511529A (en) Method and apparatus for progressive encoding and decoding of multi-plane images
JP2024514066A (en) Volumetric video with light effects support
JP7744334B2 (en) METHOD AND APPARATUS FOR ENCODING, TRANSMITTING AND DECODING VOLUMETRIC VIDEO - Patent application
TW202406340A (en) Reduction of redundant data in immersive video coding
US20230054523A1 (en) Enhancing 360-degree video using convolutional neural network (cnn)-based filter
WO2024216516A1 (en) Method for encoding and decoding a 3d point cloud, encoder, decoder
US12315081B2 (en) Mesh patch sub-division
US11956478B2 (en) Method and apparatus for point cloud chunking for improved patch packing and coding efficiency
CN114270863B (en) A method and apparatus for encoding and decoding stereoscopic video

Legal Events

Date Code Title Description
AS Assignment

Owner name: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TSAI, YI-TING;LIN, CHUN-LUNG;LIN, CHING-CHIEH;REEL/FRAME:050467/0111

Effective date: 20190918

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION