
WO2019088335A1 - Intelligent collaboration server and system, and collaboration-based analysis method thereof - Google Patents

Intelligent collaboration server and system, and collaboration-based analysis method thereof

Info

Publication number
WO2019088335A1
WO2019088335A1 (application PCT/KR2017/012832)
Authority
WO
WIPO (PCT)
Prior art keywords
information
image analysis
server
camera
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2017/012832
Other languages
English (en)
Korean (ko)
Inventor
김동칠
박성주
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Korea Electronics Technology Institute
Original Assignee
Korea Electronics Technology Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Korea Electronics Technology Institute filed Critical Korea Electronics Technology Institute
Publication of WO2019088335A1 publication Critical patent/WO2019088335A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N99/00Subject matter not provided for in other groups of this subclass
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • the present invention relates to an intelligent collaboration server, a system, and a collaboration-based analysis method thereof.
  • FIG. 1 is a view for explaining an intelligent image analysis system according to the prior art.
  • the intelligent image analysis system includes one or more edge cameras 10, an integrated control server 20 and an image analysis server 30.
  • This intelligent image analysis system performs server based image analysis and edge camera based image analysis according to the analysis subject.
  • the server-based image analysis method means a centralized method in which an image captured by the edge camera 10 is received by the image analysis server 30 and image analysis is performed.
  • a server-based image analysis method is easy to improve the algorithm and has high accuracy of analysis, but there is a problem that excessive server construction cost is incurred, and it is heavily influenced by the surrounding infrastructure such as network and platform. Also, if an error occurs in the server itself, there is a problem that the image analysis of the entire system is impossible.
  • the image analysis method based on an edge camera means a method of performing image analysis at the terminal end by embedding an image analysis algorithm in the edge camera 10.
  • This image analysis method has the advantage that image analysis can be performed even when the server is not operating, but the range and performance of the analysis are more limited than in the server-based image analysis method, and it is difficult to rapidly improve the analysis algorithm.
  • the embodiment of the present invention performs image analysis in a hybrid manner, combining the server-based image analysis method and the edge-camera-based image analysis method through multi-processing in environments that require both accuracy and real-time performance.
  • the present invention provides an intelligent collaboration server and system capable of ensuring the reliability of the analysis information and adaptively controlling cooperation according to situation information, and a collaboration-based analysis method thereof.
  • according to a first aspect of the present invention, an intelligent collaboration server comprises: a communication module for transmitting and receiving data to and from an integrated control server and an image analysis server; a memory storing a program for collaboration-based analysis; and a processor for executing the program stored in the memory.
  • when the program is executed, the processor receives, from the integrated control server through the communication module, the first image analysis information captured and analyzed by the one or more edge cameras together with the camera detail information; when second image analysis information is received from the image analysis server, which receives the image captured by the edge camera from the integrated control server and analyzes it, the processor analyzes at least one of the first and second image analysis information, and generates control information for controlling the at least one edge camera through the camera detail information based on the analysis result.
  • the processor may analyze one or more of the first and second image analysis information based on an event-based analysis technique, a learning-based analysis technique, and a rule-based analysis technique.
  • the processor analyzes at least one of the first and second image analysis information to generate situation analysis information according to the event-based analysis technique and, when the situation analysis information matches an event included in a previously stored event list, may generate control information corresponding to the event.
  • the processor may analyze the first and second image analysis information based on a previously learned algorithm according to the learning-based analysis technique, and generate control information for adjusting the PTZ setting values of the edge camera based on the analyzed result.
  • the previously learned algorithm may be trained with the coordinate information, speed, and bounding box size information of an object included in previously prepared image analysis information as its input, and control information for controlling the corresponding edge camera as its output.
  • the processor may generate control information for zooming in or zooming out the edge camera according to the rule when the bounding box of the object detected in the first and second image analysis information is less than or equal to a predetermined threshold size.
  • when the results of analyzing the first and second image analysis information according to the event-based analysis technique, the learning-based analysis technique, and the rule-based analysis technique differ, the processor may generate control information according to the result of the analysis technique having the highest reliability.
  • the processor may generate the control information in the form of metadata including at least one of a camera ID, camera PTZ setting information, and a frame reanalysis request, and transmit it to at least one of the edge camera and the image analysis server.
  • according to a second aspect of the present invention, an intelligent collaboration system includes: at least one edge camera for capturing an image and generating first image analysis information by analyzing the captured image; an integrated control server for receiving the image, the first image analysis information, and the camera detail information from the edge camera; an image analysis server for receiving the image from the integrated control server and generating second image analysis information by analyzing the image; and an intelligent collaboration server for analyzing at least one of the first and second image analysis information and generating, based on the analysis result, control information for controlling the at least one edge camera through the camera detail information.
  • a third aspect of the present invention is a collaboration-based analysis method of an intelligent collaboration server, the method comprising: receiving at least one of first image analysis information analyzed by one or more edge cameras and second image analysis information analyzed by an image analysis server; analyzing at least one of the first and second image analysis information based on an event-based analysis technique, a learning-based analysis technique, and a rule-based analysis technique; and generating, based on the analysis result, control information for controlling the at least one edge camera.
  • the edge camera transmits the captured image together with the first image analysis information and its own camera detail information to the integrated control server, and the image analysis server receives the captured image from the integrated control server and generates the second image analysis information.
  • according to the embodiments of the present invention, collaboration between an edge camera and an image analysis server can be performed.
  • FIG. 1 is a view for explaining an intelligent image analysis system according to the prior art.
  • FIG. 2 is a schematic diagram for explaining an intelligent collaboration system according to an embodiment of the present invention.
  • FIG. 3 is a block diagram of an intelligent collaboration system in accordance with an embodiment of the present invention.
  • FIG. 4 is a block diagram of an intelligent collaboration server according to an embodiment of the present invention.
  • FIG. 5 is a flowchart of a collaboration-based analysis method according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram for explaining an intelligent collaboration system 1 according to an embodiment of the present invention.
  • FIG. 3 is a block diagram of an intelligent collaboration system 1 in accordance with an embodiment of the present invention.
  • FIG. 4 is a block diagram of an intelligent collaboration server 100 according to an embodiment of the present invention.
  • the intelligent collaboration system 1 includes an intelligent collaboration server 100, an integrated control server 200, an integrated control viewer 210, an image analysis server 300, and one or more edge cameras 400.
  • each server can be linked based on REST / JSON.
  • One or more edge cameras 400 may be provided and take images.
  • the edge camera 400 is embedded with an image analysis algorithm, and analyzes the photographed image to generate first image analysis information.
  • the edge camera 400 transmits the photographed image, the analyzed first image analysis information, and its own camera detailed information to the integrated control server 200.
  • the first image analysis information may include, for example, a camera ID, a frame number, a type of anomaly symptom, X and Y coordinates of the detected object, and area and height information of the detected object area.
  • the camera detail information may include a camera ID, the position of the camera, camera PTZ (Pan, Tilt, Zoom) information, and the type of the camera.
  • the integrated control server 200 outputs an image photographed by the edge camera 400 through the integrated control viewer and stores and manages the first image analysis information and the second image analysis information to be described later.
  • the integrated control server 200 receives the first image analysis information and the camera detail information of each edge camera 400 from the edge cameras 400, and manages them in the form of VMS (Video Management System) metadata.
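As an illustrative sketch only (the patent specifies REST/JSON interworking and VMS metadata, but no concrete schema; every field name below is an assumption), the packaging step might look like:

```python
import json

# Hypothetical VMS metadata packaging as performed by the integrated control
# server: bundle the edge camera's first image analysis information with its
# camera detail information into one JSON document.
def build_vms_metadata(camera_id, frame_no, first_analysis, camera_info):
    return json.dumps({
        "camera_id": camera_id,
        "frame": frame_no,
        "first_analysis": first_analysis,  # anomaly type, object x/y, box size
        "camera_info": camera_info,        # position, PTZ values, camera type
    })

payload = build_vms_metadata(
    "cam-01", 1024,
    {"anomaly": "loitering", "x": 120, "y": 80, "width": 32, "height": 64},
    {"position": "gate-3", "ptz": {"pan": 10.0, "tilt": -5.0, "zoom": 2.0},
     "type": "PTZ"},
)
```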
  • the image analysis server 300 receives the image and the VMS metadata from the integrated control server 200 and generates second image analysis information by analyzing the image based on the camera detailed information included in the VMS metadata.
  • the second image analysis information includes, for example, a camera ID, a frame number, an abnormality symptom type, an abnormality symptom area, a weather type, the number of detected objects, an object ID, the width and height of the object area, and the ratio, color, shape, and moving speed of the detected object.
  • in general, the first image analysis information analyzed by the edge camera 400 has lower performance than the second image analysis information analyzed by the image analysis server 300, but the present invention is not limited thereto.
  • the intelligent collaboration server 100 analyzes at least one of the first image analysis information included in the VMS metadata received from the integrated control server 200 and the second image analysis information received from the image analysis server 300, and generates collaboration interworking information based on the result of the analysis.
  • the collaboration interworking information includes control information for controlling the at least one edge camera 400 through the camera detail information based on the analysis result.
  • the control information may include at least one of a camera ID, camera PTZ setting information, and a frame reanalysis request; it may be generated in metadata form and delivered to at least one of the edge camera 400 and the image analysis server 300.
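A minimal sketch of such control-information metadata, assuming a JSON-like dictionary layout (only the three field names come from the text above; everything else is illustrative):

```python
# Hedged sketch: control information carrying a camera ID, optional PTZ
# setting information, and an optional frame reanalysis request.
def build_control_info(camera_id, ptz=None, reanalyze_frame=None):
    info = {"camera_id": camera_id}
    if ptz is not None:
        info["ptz"] = ptz  # e.g. {"pan": ..., "tilt": ..., "zoom": ...}
    if reanalyze_frame is not None:
        info["reanalyze_frame"] = reanalyze_frame
    return info

ctrl = build_control_info("cam-01",
                          ptz={"pan": 15.0, "tilt": 0.0, "zoom": 3.0},
                          reanalyze_frame=1024)
```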
  • the intelligent collaboration server 100 includes a communication module 110, a memory 120, and a processor 130. Meanwhile, the integrated management server 200 and the image analysis server 300 may be implemented as a communication module, a memory, and a processor in the same manner as the intelligent collaboration server 100 according to an exemplary embodiment of the present invention.
  • the communication module 110 transmits and receives data to and from the integrated control server 200 and the image analysis server 300.
  • the communication module 110 may include both a wired communication module and a wireless communication module.
  • the wired communication module may be implemented as a power line communication device, a telephone line communication device, MoCA (Multimedia over Coax Alliance), Ethernet, IEEE 1394, an integrated wired home network, or an RS-485 control device.
  • the wireless communication module may be implemented using UNB, WLAN, Bluetooth, HDR WPAN, UWB, ZigBee, Impulse Radio, 60 GHz WPAN, Binary-CDMA, wireless USB, or wireless HDMI technology.
  • in the memory 120, a program for collaboration-based analysis is stored.
  • the memory 120 collectively refers to nonvolatile storage devices, which retain stored information even when power is not supplied, and volatile storage devices.
  • for example, the memory 120 may be NAND flash memory such as a compact flash (CF) card, a secure digital (SD) card, a memory stick, a solid-state drive (SSD), or a micro SD card; a magnetic computer storage device such as a hard disk drive (HDD); or an optical disc drive such as a CD-ROM or DVD-ROM.
  • the processor 130 executes the program stored in the memory 120.
  • the processor 130 receives the first image analysis information and the camera detail information, captured and analyzed by the at least one edge camera 400, from the integrated control server 200 through the communication module 110 in the form of VMS metadata.
  • the processor 130 may acquire collaboration control information directly input by an administrator and generate collaboration interworking information together with or separately from the received information.
  • the processor 130 analyzes one or more of the first and second image analysis information based on the event-based analysis technique, the learning-based analysis technique, and the rule-based analysis technique.
  • the processor 130 may analyze at least one of the first and second image analysis information according to the event-based analysis technique to generate the situation analysis information.
  • when the situation analysis information matches an event included in the previously stored event list, control information corresponding to the event can be generated.
  • for example, when the processor 130 generates situation analysis information indicating that an object is determined to be 'roaming' as a result of analyzing at least one of the first and second image analysis information, and a corresponding event exists in the event list, the processor 130 transmits object tracking information for tracking the roaming object to the edge camera 400 through the integrated control server, and the edge camera 400 can track the object in real time.
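The event-based branch above can be sketched as a lookup against a pre-stored event list; the event names and actions here are assumptions, not taken from the patent:

```python
# Pre-stored event list mapping a recognized situation to the control
# action to emit for it (illustrative entries only).
EVENT_LIST = {
    "roaming": {"action": "track_object"},
    "intrusion": {"action": "zoom_in"},
}

def event_based_control(situation):
    """Return control info when the situation analysis matches a stored
    event; return None (no control generated) otherwise."""
    event = EVENT_LIST.get(situation)
    if event is None:
        return None
    return {"event": situation, **event}

matched = event_based_control("roaming")   # tracking control is generated
unmatched = event_based_control("parked")  # no stored event, no control
```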
  • the processor 130 may analyze the first and second image analysis information based on the previously learned algorithm according to the learning-based analysis technique, and generate control information for adjusting the PTZ setting values of the edge camera 400 based on the analyzed results.
  • the previously learned algorithm is trained with the coordinate information, velocity, and bounding box size information of an object included in previously prepared image analysis information as its input, and control information for controlling the corresponding edge camera 400 as its output.
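As a toy stand-in for the learned algorithm (the patent does not disclose the model; the hand-set linear gains, frame size, and zoom rule below are all assumptions), a function mapping object coordinates, speed, and box size to PTZ control might look like:

```python
# Toy model: steer pan/tilt toward the object's position in the frame and
# choose zoom from the bounding box size (smaller box -> more zoom).
def predict_ptz(x, y, speed, box_w, box_h, frame_w=1920, frame_h=1080):
    pan  = (x - frame_w / 2) / frame_w * 60.0   # degrees, assumed gain
    tilt = (y - frame_h / 2) / frame_h * 40.0   # degrees, assumed gain
    zoom = max(1.0, 100.0 / max(box_w, box_h))  # assumed zoom heuristic
    return {"pan": round(pan, 2), "tilt": round(tilt, 2), "zoom": round(zoom, 2)}

ptz = predict_ptz(x=1440, y=270, speed=1.2, box_w=25, box_h=50)
```

In a real system these weights would come from training on the previously prepared image analysis information, as the passage above describes.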
  • when the processor 130 determines, as a result of analyzing one or more of the first and second image analysis information according to the rule-based analysis technique, that one or more previously stored rules are satisfied, it can generate the corresponding control information.
  • for example, when the bounding box of an object detected in the first and second image analysis information is equal to or smaller than a preset threshold size, that is, when the object is determined to be detected at a long distance, control information for zooming in the edge camera 400 can be generated; conversely, control information for zooming out the edge camera 400 may be generated according to the rule.
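A minimal sketch of that rule, assuming area thresholds in pixels (the threshold values are illustrative; the patent only states that a threshold size exists):

```python
# Rule-based zoom decision: box at or below the lower threshold means the
# object is far away (zoom in); above the upper threshold, zoom out.
MIN_BOX = 40 * 40     # px^2, assumed zoom-in threshold
MAX_BOX = 400 * 400   # px^2, assumed zoom-out threshold

def rule_based_zoom(box_w, box_h):
    area = box_w * box_h
    if area <= MIN_BOX:
        return "zoom_in"
    if area >= MAX_BOX:
        return "zoom_out"
    return "hold"
```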
  • when the results of analyzing the first and second image analysis information according to the event-based analysis technique, the learning-based analysis technique, and the rule-based analysis technique conflict or differ, the processor 130 can generate the control information based on the result of the analysis technique determined, on a probability basis, to have the highest reliability.
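Resolving such conflicts can be sketched as picking the proposal with the highest reliability score; how reliability is computed is not specified in the patent, so the scores below are illustrative:

```python
# Each technique reports (name, reliability, proposed control info); the
# most reliable proposal is selected when results differ.
def select_control(proposals):
    technique, _, control = max(proposals, key=lambda p: p[1])
    return technique, control

chosen = select_control([
    ("event",    0.62, {"action": "track"}),
    ("learning", 0.91, {"ptz": {"zoom": 2.0}}),
    ("rule",     0.75, {"action": "zoom_in"}),
])
```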
  • the processor 130 generates collaboration interworking information, which includes the control information, for cooperating with at least one of the edge camera 400 and the image analysis server 300, and transmits the generated collaboration interworking information to the image analysis server 300 and the integrated control server 200.
  • the collaboration interworking information may include camera ID and PTZ setting information as camera control information, and may include re-analysis request information of a specific frame.
  • the processor 130 may receive feedback information from each server through the communication module 110.
  • the above process is performed repeatedly; if the first and second image analysis information are not received, the intelligent collaboration server 100 is kept in an inactive state.
  • the components shown in FIG. 3 may be implemented in software or in hardware such as an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit), and may perform predetermined roles.
  • however, 'components' are not meant to be limited to software or hardware, and each component may be configured to reside on an addressable storage medium or to execute on one or more processors.
  • accordingly, a component may include, by way of example, software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
  • FIG. 5 is a flowchart of a collaboration-based analysis method according to an embodiment of the present invention.
  • an edge camera 400 captures an image (S101), and generates first image analysis information based on the captured image (S103).
  • the edge camera 400 transmits the photographed image, the camera detail information, and the first image analysis information to the integrated control server 200 (S105).
  • the integrated control server 200 generates VMS metadata including the first image analysis information analyzed by the edge camera 400, the camera detail information, and the state information of the edge camera 400 (S107); the image and the VMS metadata are transmitted to the image analysis server 300 (S109), and the VMS metadata is transmitted to the intelligent collaboration server 100 (S111).
  • the image analysis server 300 receives the image, generates the second image analysis information using the camera detail information (S113), and transmits the second image analysis information to the intelligent collaboration server 100 and the integrated control server 200 (S115, S117).
  • the intelligent collaboration server 100 receives the first image analysis information included in the VMS metadata and the second image analysis information from the image analysis server 300, and analyzes at least one of the first and second image analysis information based on the event-based analysis technique, the learning-based analysis technique, and the rule-based analysis technique (S119).
  • control information for controlling the one or more edge cameras 400 is generated through the camera detail information, the generated control information is packaged as collaboration interworking information (S121), and the collaboration interworking information is transmitted to the image analysis server 300 and the integrated control server 200 (S123, S125).
  • the integrated control server transmits the camera control information to the edge camera 400 (S127), and the edge camera 400 performs PTZ control based on the control information (S129).
  • steps S101 to S129 may be further divided into additional steps or combined into fewer steps, according to an embodiment of the present invention. Also, some of the steps may be omitted as necessary, and the order of the steps may be changed. In addition, the contents already described with respect to the intelligent collaboration system 1 and the server 100 in FIGS. 2 to 4 also apply to the collaboration-based analysis method of FIG. 5, even where not repeated here.
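The message flow above can be sketched end-to-end, with each server modeled as a function; every payload name here is an illustrative assumption, and step comments map back to the S-numbers in the flowchart:

```python
# Compact simulation of the S101-S129 flow between the four participants.
def edge_camera_capture():
    image = "frame-1024"                                      # S101: capture
    first_info = {"anomaly": "loitering", "box": (32, 64)}    # S103: analyze
    detail = {"camera_id": "cam-01", "ptz": {"zoom": 1.0}}
    return image, first_info, detail                          # S105: transmit

def integrated_control_server(image, first_info, detail):
    # S107: package first analysis info, camera detail, and state info
    return {"first": first_info, "detail": detail, "state": "active"}

def image_analysis_server(image, vms_meta):
    # S113: generate second image analysis info using the camera detail info
    return {"objects": 1, "speed": 1.2}

def intelligent_collab_server(vms_meta, second_info):
    # S119-S121: analyze both results and emit control info for the camera
    return {"camera_id": vms_meta["detail"]["camera_id"], "ptz": {"zoom": 2.0}}

image, first, detail = edge_camera_capture()
meta = integrated_control_server(image, first, detail)        # S109/S111
second = image_analysis_server(image, meta)                   # S115/S117
control = intelligent_collab_server(meta, second)             # S123-S129
```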
  • an embodiment of the present invention may also be embodied in the form of a computer program stored in a medium executed by a computer or a recording medium including instructions executable by the computer.
  • Computer readable media can be any available media that can be accessed by a computer and includes both volatile and nonvolatile media, removable and non-removable media.
  • the computer-readable medium may include both computer storage media and communication media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Communication media typically includes any information delivery media, including computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or other transport mechanism.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Data Mining & Analysis (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Studio Devices (AREA)
  • Human Computer Interaction (AREA)

Abstract

The invention concerns an intelligent collaboration server comprising: a communication module for transmitting and receiving data to and from an integrated control server and an image analysis server; a memory for storing a program for collaboration-based analysis; and a processor for executing the program stored in the memory. When the program is executed, the processor: receives, from the integrated control server through the communication module, first image analysis information captured and analyzed by at least one edge camera together with camera detail information; upon receiving from the image analysis server second image analysis information, in which an image captured by the edge camera is received from the integrated control server and then analyzed, analyzes at least one of the first and second image analysis information; and generates control information for controlling the edge camera(s) through the camera detail information based on the analysis result.
PCT/KR2017/012832 2017-11-06 2017-11-14 Serveur et système de collaboration intelligent, et procédé d'analyse associé basé sur la collaboration Ceased WO2019088335A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0146537 2017-11-06
KR1020170146537A KR102008503B1 (ko) 2017-11-06 2017-11-06 지능형 협업 서버, 시스템 및 이의 협업 기반 분석 방법

Publications (1)

Publication Number Publication Date
WO2019088335A1 true WO2019088335A1 (fr) 2019-05-09

Family

ID=66332026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/012832 Ceased WO2019088335A1 (fr) 2017-11-06 2017-11-14 Serveur et système de collaboration intelligent, et procédé d'analyse associé basé sur la collaboration

Country Status (2)

Country Link
KR (1) KR102008503B1 (fr)
WO (1) WO2019088335A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110809103A (zh) * 2019-11-22 2020-02-18 英特灵达信息技术(深圳)有限公司 一种智能边缘微型服务器

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102119721B1 (ko) * 2019-10-23 2020-06-05 주식회사 넥스트케이 지능형 에지장치 및 그 장치의 구동방법
KR102151912B1 (ko) * 2020-01-29 2020-09-03 이정민 비디오 감시 시스템 및 비디오 감시 시스템에서 수행되는 방법
KR102152237B1 (ko) * 2020-05-27 2020-09-04 주식회사 와치캠 상황 분석 기반의 cctv 관제 방법 및 시스템
KR102321754B1 (ko) * 2020-08-18 2021-11-04 주식회사에스에이티 엣지 컴퓨팅 기반 영상 유고 감지 시스템
KR102637948B1 (ko) * 2021-11-19 2024-02-19 주식회사 핀텔 정책기반 가변적 영상분석 아키택쳐 생성방법, 장치 및 이에 대한 컴퓨터 프로그램
KR102768082B1 (ko) * 2022-10-14 2025-02-13 한동대학교 산학협력단 서비스 영역에서의 차량의 자율주행 제어 방법 및 시스템
KR102573263B1 (ko) * 2023-02-10 2023-08-31 국방과학연구소 영상 정보 통합 방법 및 이를 위한 전자 장치

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100883417B1 (ko) * 2008-07-01 2009-02-12 (주)서광시스템 Media server system with multiple access and video tracking control, surveillance camera control system including the same, and method for performing multiple access and video tracking control
WO2012118259A1 (fr) * 2011-03-03 2012-09-07 (주)엔써즈 System and method for providing a video-related service based on an image
WO2016133234A1 (fr) * 2015-02-17 2016-08-25 이노뎁 주식회사 Image analysis system for analyzing dynamically allocated camera images, integrated control system including same, and operation method therefor
WO2017111257A1 (fr) * 2015-12-23 2017-06-29 한화테크윈 주식회사 Image processing apparatus and image processing method
WO2017183769A1 (fr) * 2016-04-22 2017-10-26 주식회사 에스원 Apparatus and method for detecting abnormal situations

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150031985A (ko) * 2013-09-17 2015-03-25 한국전자통신연구원 System and method for tracking dangerous situations in cooperation with mobile devices
KR101646952B1 (ko) * 2013-12-31 2016-08-10 주식회사세오 Apparatus and method for automatic PTZ preset setting through control and event analysis
KR101722664B1 (ko) * 2015-10-27 2017-04-18 울산과학기술원 Multi-perspective system, wearable camera, CCTV, control server, and method for active situation awareness
KR101786923B1 (ко) * 2017-05-15 2017-11-15 주식회사 두원전자통신 Multi-purpose integrated operation CCTV system


Also Published As

Publication number Publication date
KR102008503B1 (ko) 2019-10-21
KR20190051175A (ko) 2019-05-15

Similar Documents

Publication Publication Date Title
WO2019088335A1 (fr) Intelligent collaboration server and system, and collaboration-based analysis method therefor
WO2020027607A1 (fr) Object detection device and control method
WO2020246834A1 (fr) Method for recognizing an object in an image
WO2013085148A1 (fr) Apparatus and method for removing noise from a stereo image
WO2018070768A1 (fr) Monitoring system control method and electronic device supporting same
WO2015102126A1 (fr) Method and system for managing an electronic album using face recognition technology
WO2015115802A1 (fr) Device and method for extracting depth information
Tsaftaris et al. Plant phenotyping with low cost digital cameras and image analytics
CN113283509B (zh) Automatic label annotation method, electronic device, and storage medium
WO2021075772A1 (fr) Object detection method and device using multi-area detection
WO2021071286A1 (fr) Generative adversarial network-based medical image training method and device
WO2017195984A1 (fr) 3D scanning device and method
WO2016064107A1 (fr) Pan/tilt/zoom camera-based video playback method and apparatus
WO2021172833A1 (fr) Object recognition device, object recognition method, and computer-readable recording medium for implementing same
CN112738808A (зh) DDoS attack detection method in a wireless network, cloud server, and mobile terminal
WO2023128654A1 (fr) Method for optimizing a training model of a target device, and system therefor
WO2021071258A1 (fr) Artificial intelligence-based mobile security image training device and method
WO2022191380A1 (fr) Blockchain-based system and method for preventing image tampering and forgery, and computer program therefor
WO2020101121A1 (fr) Deep learning-based image analysis method, system, and portable terminal
WO2021040345A1 (fr) Electronic device and method for controlling an electronic device
WO2024117457A1 (fr) Method and system for preventing theft in a self-service store environment
WO2025033602A1 (fr) Apparatus and method for estimating object pose from an RGB image and depth information
WO2023149603A1 (fr) Thermal-image monitoring system using a plurality of cameras
WO2023158068A1 (fr) Training system and method for improving object detection rate
WO2023224377A1 (fr) Object information management method and apparatus applying same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17930643

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17930643

Country of ref document: EP

Kind code of ref document: A1