
WO2018170685A1 - Video stitching method and system for a monitoring cloud platform - Google Patents

Video stitching method and system for a monitoring cloud platform

Info

Publication number
WO2018170685A1
WO2018170685A1 (PCT/CN2017/077332)
Authority
WO
WIPO (PCT)
Prior art keywords
video
video files
time
cloud platform
monitoring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/077332
Other languages
English (en)
Chinese (zh)
Inventor
张北江
胡君健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avcon Wisdom Information Technology (shenzhen) Co Ltd
Original Assignee
Avcon Wisdom Information Technology (shenzhen) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Avcon Wisdom Information Technology (shenzhen) Co Ltd filed Critical Avcon Wisdom Information Technology (shenzhen) Co Ltd
Priority to PCT/CN2017/077332 priority Critical patent/WO2018170685A1/fr
Publication of WO2018170685A1 publication Critical patent/WO2018170685A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Definitions

  • the present invention relates to the field of monitoring and, in particular, to a video stitching method and system for a monitoring cloud platform.
  • the monitoring system consists of five parts: camera, transmission, control, display, and recording.
  • the camera transmits the video image to the control host through a coaxial video cable; the control host distributes the video signal to each monitor and recording device, and at the same time feeds the voice signal to be recorded into the recorder.
  • the operator can issue commands to control the up, down, left, and right movements of the pan/tilt and to adjust the focus and zoom of the lens, and can switch between multiple cameras and pan/tilt units through the control host. With a dedicated recording processing mode, images can be recorded, played back, and processed so that the recording effect is optimal.
  • the video of the existing monitoring system is a separate video, and the related videos are not spliced together, which is inconvenient for the user to watch.
  • the application provides a video splicing method for a monitoring cloud platform, which overcomes the drawback of the prior art that the videos are inconvenient to watch.
  • a video splicing method for a monitoring cloud platform includes the following steps: the monitoring cloud platform acquires multiple video files and extracts multiple feature points in the multiple video files as well as the time of each video file; the monitoring cloud platform extracts at least two video files that have the same feature point and whose time ranges fall within a set range; and the monitoring cloud platform splices the at least two video files together in chronological order to obtain a spliced video (a code sketch of this logic is given below).
  • the method further includes:
  • the monitoring cloud platform uses the earliest time and the latest time of at least two video files as the time period of the stitched video.
  • the method further includes:
  • the monitoring cloud platform marks the splicing points of the spliced video.
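  • The claims above describe what is selected and spliced but not how. The following is a minimal sketch of that selection logic, assuming each video file has already been annotated with feature-point identifiers and start/end times; the VideoFile type, the select_and_splice helper, and the interpretation of the "set range" are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional, Set, Tuple

@dataclass
class VideoFile:
    path: str                 # location of the video file
    feature_points: Set[str]  # identifiers of feature points detected in the file
    start: datetime           # recording start time
    end: datetime             # recording end time

def select_and_splice(videos: List[VideoFile], feature_point: str,
                      time_window: timedelta
                      ) -> Optional[Tuple[List[str], Tuple[datetime, datetime], List[datetime]]]:
    """Pick at least two files that contain `feature_point` and whose start times
    fall within `time_window` of the earliest match, order them chronologically,
    and report the stitched video's time period and splice points."""
    matching = sorted((v for v in videos if feature_point in v.feature_points),
                      key=lambda v: v.start)
    if not matching:
        return None
    selected = [v for v in matching if v.start - matching[0].start <= time_window]
    if len(selected) < 2:
        return None  # the claimed method needs at least two video files

    ordered_paths = [v.path for v in selected]                  # chronological splice order
    period = (selected[0].start, max(v.end for v in selected))  # earliest time to latest time
    splice_points = [v.start for v in selected[1:]]             # where each next file joins
    return ordered_paths, period, splice_points
```

  • In this reading, the stitched video's time period runs from the earliest start to the latest end, and the splice points are recorded as the start times of the appended files, mirroring the optional features summarized above.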
  • a video splicing system for a monitoring cloud platform is provided, comprising:
  • a receiving unit configured to acquire multiple video files
  • a processing unit configured to extract a plurality of feature points in the plurality of video files and the time of each video file, extract at least two video files that have the same feature point and whose time ranges fall within a set range, and splice the at least two video files together in chronological order to obtain a stitched video.
  • the system further includes:
  • the processing unit is configured to use the earliest time and the latest time of the at least two video files as the time period of the stitched video.
  • the system further includes:
  • the processing unit is further configured to mark the splicing points of the spliced video.
  • a third aspect provides a monitoring system, including a processor, a wireless transceiver, a memory, and a bus, wherein the processor, the wireless transceiver, and the memory are connected by the bus, and the wireless transceiver is configured to acquire multiple video files;
  • the processor is configured to extract a plurality of feature points of the plurality of video files and the time of each video file, extract at least two video files that have the same feature point and whose time ranges fall within a set range, and splice the at least two video files together in chronological order to obtain a stitched video.
  • the processor is specifically configured to use the earliest time and the latest time of the at least two video files as the time period of the stitched video.
  • the processor is configured to mark a splicing point of the spliced video.
  • the technical solution provided by the present invention has the advantage of being convenient for the user to view, because a plurality of videos that share the same feature point within the set time period are spliced together.
  • FIG. 1 is a flowchart of a video splicing method for a monitoring cloud platform according to a first preferred embodiment of the present invention;
  • FIG. 2 is a structural diagram of a video splicing system for a monitoring cloud platform according to a second preferred embodiment of the present invention.
  • FIG. 3 is a hardware structural diagram of a monitoring system according to a second preferred embodiment of the present invention.
  • FIG. 1 shows a video splicing method for a monitoring cloud platform according to a first preferred embodiment of the present invention. The method, as shown in FIG. 1, includes the following steps:
  • Step S101: The monitoring cloud platform acquires a plurality of video files, and extracts a plurality of feature points in the plurality of video files and the time of each video file.
  • Step S102: The monitoring cloud platform extracts at least two video files that have the same feature point and whose time ranges fall within a set range.
  • Step S103: The monitoring cloud platform splices the at least two video files together in chronological order to obtain a stitched video.
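  • The embodiment does not say how feature points are extracted or compared across files. Purely as an illustration (an assumption, not something the disclosure specifies), keypoint descriptors taken from a sample frame of each file, e.g. OpenCV ORB features, could be matched to decide whether two videos contain the same feature point:

```python
import cv2  # OpenCV; used here only as one possible feature-extraction backend

def frame_descriptors(video_path, frame_index=0):
    """Return ORB descriptors for one frame of the video, or None if unreadable."""
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create()
    _, descriptors = orb.detectAndCompute(gray, None)
    return descriptors

def share_feature_point(path_a, path_b, min_matches=20):
    """Heuristic: treat two videos as showing the same feature point if enough
    ORB descriptors from a sample frame of each can be matched."""
    des_a, des_b = frame_descriptors(path_a), frame_descriptors(path_b)
    if des_a is None or des_b is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    return len(matches) >= min_matches
```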
  • the technical solution provided by the present invention has the advantage of being convenient for the user to view, because a plurality of videos that share the same feature point within the set time period are spliced together.
  • the monitoring cloud platform uses the earliest time and the latest time of the at least two video files as the time period of the stitched video.
  • the monitoring cloud platform marks the splicing points of the spliced video.
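  • The embodiment states that the selected files are spliced together in time sequence, but not how the concatenation itself is performed. One possible realization, assuming the files share the same codec and container settings (an implementation choice, not something the disclosure prescribes), is to drive ffmpeg's concat demuxer from the platform:

```python
import os
import subprocess
import tempfile

def concat_videos(ordered_paths, output_path):
    """Concatenate already time-ordered video files with ffmpeg's concat demuxer.
    Streams are copied, so all inputs must use identical encoding parameters."""
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        for p in ordered_paths:
            f.write(f"file '{os.path.abspath(p)}'\n")  # one concat list entry per file
        list_file = f.name
    try:
        subprocess.run(
            ["ffmpeg", "-y", "-f", "concat", "-safe", "0",
             "-i", list_file, "-c", "copy", output_path],
            check=True,
        )
    finally:
        os.remove(list_file)
```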
  • FIG. 2 shows a video splicing system for a monitoring cloud platform according to a second preferred embodiment of the present invention. The system, as shown in FIG. 2, includes the following units:
  • the receiving unit 201 is configured to acquire multiple video files.
  • the processing unit 202 is configured to extract a plurality of feature points in the plurality of video files and the time of each video file, extract at least two video files that have the same feature point and whose time ranges fall within a set range, and splice the at least two video files together in chronological order to obtain a stitched video.
  • the technical solution provided by the present invention has the advantage of being convenient for the user to view, because a plurality of videos that share the same feature point within the set time period are spliced together.
  • the system may further be configured such that the processing unit 202 uses the earliest time and the latest time of the at least two video files as the time period of the stitched video.
  • the processing unit 202 is configured to mark the splicing point of the spliced video.
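  • Mapped to code, the receiving unit 201 and the processing unit 202 could be as thin as the sketch below; the class names, the reuse of select_and_splice from the earlier sketch, and the returned dictionary layout are all illustrative assumptions rather than the patented structure itself.

```python
class ReceivingUnit:
    """Counterpart of receiving unit 201: acquires multiple video files.
    A real platform would pull them from cameras or cloud storage; this
    placeholder simply accepts local file paths."""
    def acquire(self, paths):
        return list(paths)

class ProcessingUnit:
    """Counterpart of processing unit 202: selects files that share a feature
    point within the set time range and splices them in chronological order."""
    def splice(self, videos, feature_point, time_window):
        # `select_and_splice` is the helper defined in the earlier sketch.
        result = select_and_splice(videos, feature_point, time_window)
        if result is None:
            return None
        ordered_paths, period, splice_points = result
        return {"files": ordered_paths,
                "period": period,            # earliest time to latest time
                "splice_points": splice_points}
```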
  • FIG. 3 shows a monitoring system 30, including a processor 301, a wireless transceiver 302, a memory 303, and a bus 304.
  • the wireless transceiver 302 is configured to transmit data to and receive data from external devices.
  • the number of processors 301 can be one or more.
  • the processor 301, the memory 303, and the wireless transceiver 302 may be connected through the bus 304 or in other ways.
  • the monitoring system 30 can be used to perform the steps of the method in FIG. 1. For the meaning and examples of the terms involved in this embodiment, reference may be made to the corresponding embodiment of FIG. 1; details are not repeated here.
  • the wireless transceiver 302 is configured to acquire a plurality of video files.
  • the program code is stored in the memory 303.
  • the processor 301 is configured to call the program code stored in the memory 303 to perform the following operations:
  • the processor 301 is configured to extract a plurality of feature points in the plurality of video files and the time of each video file, extract at least two video files that have the same feature point and whose time ranges fall within a set range, and splice the at least two video files together in chronological order to obtain a stitched video.
  • the processor 301 herein may be a single processing element or a collective term for multiple processing elements.
  • the processing element may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application, for example one or more digital signal processors (DSP) or one or more field-programmable gate arrays (FPGA).
  • the memory 303 may be a single storage device or a collective term for a plurality of storage elements, and is used to store the executable program code or the parameters, data, and the like required for the operation of the device running the application. The memory 303 may include a random access memory (RAM) and may also include a non-volatile memory, such as a magnetic disk storage or a flash memory (Flash).
  • the bus 304 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like.
  • the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is shown in FIG. 3, but this does not mean that there is only one bus or one type of bus.
  • the terminal may further include input and output devices connected to the bus 304, so that they can communicate with other parts such as the processor 301 through the bus.
  • the input/output device can provide an input interface for the operator, so that the operator can select control items through the input interface, and can also provide other interfaces through which external devices can be connected.
  • the program may be stored in a computer-readable storage medium, and the storage medium may include a flash drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Alarm Systems (AREA)

Abstract

The present invention relates to a video stitching method for a monitoring cloud platform. The method comprises the following steps: a monitoring cloud platform obtains a plurality of video files, and extracts a plurality of feature points from the plurality of video files as well as the times of the video files; the monitoring cloud platform extracts at least two video files that have the same feature points and whose time ranges fall within a set range; and the monitoring cloud platform splices the two or more video files together in chronological order to obtain a stitched video. The technical solution provided by the present invention has the advantage of being convenient for users to view.
PCT/CN2017/077332 2017-03-20 2017-03-20 Video stitching method and system for a monitoring cloud platform Ceased WO2018170685A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/077332 WO2018170685A1 (fr) 2017-03-20 2017-03-20 Video stitching method and system for a monitoring cloud platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/077332 WO2018170685A1 (fr) 2017-03-20 2017-03-20 Video stitching method and system for a monitoring cloud platform

Publications (1)

Publication Number Publication Date
WO2018170685A1 true WO2018170685A1 (fr) 2018-09-27

Family

ID=63583968

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/077332 Ceased WO2018170685A1 (fr) 2017-03-20 2017-03-20 Video stitching method and system for a monitoring cloud platform

Country Status (1)

Country Link
WO (1) WO2018170685A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101000859B1 (ko) * 2009-05-25 2010-12-13 Industry-Academic Cooperation Foundation, Yonsei University Apparatus and method for generating an HDRI panoramic image using a fisheye lens
CN102354449A (zh) * 2011-10-09 2012-02-15 Kunshan Industrial Technology Research Institute Co., Ltd. Method, device and system for sharing image information based on the Internet of Vehicles
CN102737509A (zh) * 2012-06-29 2012-10-17 Huizhou Tianyuan Electronics Co., Ltd. Method and system for sharing image information based on the Internet of Vehicles
CN106060479A (zh) * 2016-07-13 2016-10-26 China Three Gorges University Intelligent grazing monitoring system based on beyond-line-of-sight video technology
CN106954030A (zh) * 2017-03-20 2017-07-14 Avcon Wisdom Information Technology (Shenzhen) Co., Ltd. Video stitching method and system for a monitoring cloud platform


Similar Documents

Publication Publication Date Title
WO2018218806A1 Method and system for protecting terminal privacy
WO2018223354A1 Positioning-based attendance check-in method and system
US20190306320A1 Transmission apparatus, teleconference system, information processing method, and recording medium
WO2018170685A1 Video stitching method and system for a monitoring cloud platform
WO2018161219A1 Method and system for managing surveillance video big data
WO2018161218A1 Big data sorting recommendation method and system for a monitoring system
WO2018170684A1 Fault locating method and system for a monitoring cloud platform
WO2017107205A1 Video software transmission method and system
WO2018209586A1 Bluetooth positioning method and system
WO2018161342A1 Election method and system for a distributed monitoring cloud platform system
WO2018170683A1 Task allocation method and system for a cloud service in a monitoring system
WO2018223346A1 Positioning method and system for photo sharing
WO2018161220A1 Cloud platform clustering task distribution method and system in a monitoring system
WO2018209550A1 Terminal system update method and system
WO2018223355A1 Method and system for implementing game card positioning
WO2018152669A1 Security control method and device for security monitoring
WO2018223375A1 Terminal traffic monitoring and reminder method and system
WO2018165837A1 Method and system for retrieving information from a network
WO2018209549A1 Terminal video interval division method and system
WO2018176225A1 Decoding method and system for audio and video data
WO2018209548A1 Terminal video decoding method and system
WO2018176223A1 Clone implementation method and system for instant messaging
WO2018209502A1 Grouping method and system for terminal applications
WO2025192887A1 Hair dryer and system control method
WO2018223376A1 Terminal brightness control method and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17901511

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.01.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 17901511

Country of ref document: EP

Kind code of ref document: A1