
WO2017034309A1 - Method and apparatus for classifying media data - Google Patents


Info

Publication number
WO2017034309A1
WO2017034309A1
Authority
WO
WIPO (PCT)
Prior art keywords
media data
time
classifying
unit
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2016/009349
Other languages
English (en)
Korean (ko)
Inventor
이동훈
이영재
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Originpics Co Ltd
Original Assignee
Originpics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Originpics Co Ltd filed Critical Originpics Co Ltd
Publication of WO2017034309A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services

Definitions

  • the present invention relates to a method and apparatus for classifying media data, and more particularly, to a method and apparatus for automatically classifying a plurality of media data provided by a plurality of reporters for each event.
  • when the media and the like receive video reports from a large number of ordinary people, reports on the same event are highly likely to arrive from many different reporters at the same time. Moreover, when a large-scale event occurs and there are a large number of reporters willing to provide video data, the media and the like will be provided with a large amount of video data in a short time.
  • the present invention has been made in an effort to provide a method and apparatus for classifying a plurality of media data received from a plurality of reporters for each event.
  • an example of a method of classifying media data includes: receiving media data including a still image or a video from a plurality of user terminals; classifying the media data into space units based on location information included in the media data; classifying the media data into time units based on time information included in the media data; and classifying the media data into space-time units by combining the space units and the time units.
  • an apparatus for classifying media data includes: a receiver configured to receive media data including a still image or a video from a plurality of user terminals; a space unit classification unit classifying the media data into space units based on the location information included in the media data; a time unit classification unit classifying the media data into time units based on time information included in the media data; and a situation classification unit classifying the situation conveyed by the media data based on keywords identified from sound or text information included in each media data classified by space and time.
  • media data received from a plurality of user terminals may be classified in space-time units.
  • the media data can be automatically classified and provided by event type, for example a traffic accident or a fire, so that a media manager can easily grasp the type and importance of an incident without checking each report one by one.
  • media data is categorized in three dimensions around the space where the event occurred, so that an administrator can easily view the event from a desired point of view.
  • the size of the space unit and the time unit for classifying the media data can be actively changed according to the place where the event occurred (for example, an urban center or a camping ground) or the type of event (for example, a traffic accident or a fire), so that media data for the incident can be accurately collected and classified.
  • FIG. 1 shows a schematic structure of an entire system for media data classification according to the present invention
  • FIG. 2 is a flowchart illustrating an example of a method of classifying media data according to the present invention
  • FIG. 3 illustrates an example of media data according to the present invention
  • FIG. 4 is a flowchart illustrating an example of a method of classifying media data into space units according to the present invention
  • FIG. 5 is a diagram illustrating an example of a method of classifying spatial units of media data according to the present invention
  • FIG. 6 is a diagram illustrating an example of changing a basic space unit according to the present invention.
  • FIG. 7 is a flowchart illustrating an example of a method of classifying media data in units of time according to an embodiment of the present invention.
  • FIG. 10 is a diagram illustrating an example of classifying media data into space-time units according to the present invention.
  • FIG. 11 illustrates an example of a method of determining a relative photographing position based on a photographing direction of media data according to the present invention
  • FIG. 12 illustrates an example of a method of determining a relative photographing distance based on a subject size of media data according to the present invention
  • FIG. 13 illustrates an example of a method of determining a relative photographing height based on a photographing angle of media data according to the present invention
  • FIG. 14 is a view showing an example of displaying the photographing positions of media data in three dimensions according to the present invention.
  • FIG. 15 is a diagram illustrating an example of a method of classifying media data based on indoor and outdoor criteria according to the present invention.
  • FIG. 16 is a view showing the configuration of an embodiment of a user terminal according to the present invention.
  • FIG. 17 is a diagram showing the configuration of an embodiment of a media data classification apparatus according to the present invention.
  • FIG. 1 is a diagram illustrating a schematic structure of an entire system for media data classification according to the present invention.
  • the media data classification apparatus 100 for automatically classifying media data including a still image or a moving image receives media data from a plurality of user terminals 110, 112, 114, and 116 through a wired or wireless communication network.
  • the user terminals 110, 112, 114, and 116 refer to all types of terminals capable of transmitting media data to the media data classification apparatus through a wired or wireless communication network.
  • An example of the user terminals 110, 112, 114, and 116 may be a smartphone or a tablet PC.
  • An example configuration of the user terminals 110, 112, 114, and 116 will be described again with reference to FIG. 16.
  • the media data classification apparatus 100 classifies and provides media data received from the plurality of user terminals 110, 112, 114, and 116 in space-time units. As another example, the media data classification apparatus 100 may classify and provide media data according to an event type, importance, and the like. As another example, the media data classification apparatus 100 provides a three-dimensional relative photographing position of the media data so that the administrator can view the incident from a desired perspective.
  • FIG. 2 is a flowchart illustrating an example of a method of classifying media data according to the present invention.
  • the media data classification apparatus receives media data from a plurality of user terminals (S200).
  • the device classifies the media data based on location information (for example, GPS (Global Positioning System) coordinate information) and time information (which may include date information according to an embodiment) included in the media data.
  • the device not only classifies the media data into space units (e.g., two-dimensional or three-dimensional classification) and time units, but may also, according to an embodiment, further classify it based on whether the location where the media data was taken, or the location of the event scene, is indoors or outdoors. An example of a method of classifying media data in space units is illustrated in FIG. 4, and an example of a method of classifying media data in units of time is illustrated in FIG. 7. In addition, an example of a method of further classifying media data based on indoor and outdoor criteria is illustrated in FIG. 15.
  • the device may further classify the media data on a case-by-case basis based on the additional information (S220).
  • the device may analyze the recorded sound or text information using any of a variety of conventional speech recognition or text recognition algorithms to determine the type of event. For example, if the frequency of words such as 'fire', 'fire truck', or 'flames' is high in the additional information included in the plurality of media data, the device identifies the event indicated by the media data as a 'fire'. To this end, the device may hold a word list for each event type in advance.
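  • to make the keyword step concrete, here is a minimal sketch of such frequency-based matching; the event vocabulary, tie-breaking behavior, and function name are illustrative assumptions, not the patent's implementation.

```python
from collections import Counter

# Hypothetical per-event keyword lists; the patent only says such lists
# may be prepared in advance, so these entries are illustrative.
EVENT_KEYWORDS = {
    "fire": {"fire", "flames", "smoke", "fire truck"},
    "traffic accident": {"crash", "collision", "car", "ambulance"},
}

def classify_event(transcripts: list[str]) -> str | None:
    """Pick the event type whose keywords occur most often across the
    recognized speech/text of media data grouped by space and time."""
    counts = Counter()
    for text in transcripts:
        lowered = text.lower()
        for event, keywords in EVENT_KEYWORDS.items():
            counts[event] += sum(lowered.count(k) for k in keywords)
    if not counts or counts.most_common(1)[0][1] == 0:
        return None  # no event keywords found
    return counts.most_common(1)[0][0]

# Example: transcripts attached to one space-time group of reports
print(classify_event(["I see smoke and a fire truck", "huge flames here"]))  # fire
```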
  • the device may provide the classified media data to another device or display it on the screen (S230). For example, when the device receives a request for providing media data in a specific time zone of a specific region, the device classifies and provides the media data. In this case, the device provides a relative position of each media data according to an embodiment so that an administrator or the like can easily identify the incident scene represented by the media data from a desired perspective. As another example, the device may provide media data on a case-by-case basis, such as a fire or a traffic accident.
  • FIG. 3 is a diagram illustrating an example of media data according to the present invention.
  • the media data 300 transmitted from the user terminal to the device includes a still image (or video) 310, tagging data 320, and additional data 330.
  • the still image or video 310 may be an image photographed directly by a camera of the user terminal, or an image photographed by a device other than the user terminal.
  • hereinafter, however, the still image or video 310 will be described assuming the case where it is photographed by the user terminal.
  • the tagging data 320 is data representing various environmental information at the time of image capturing, and includes position information indicating a photographing position and time information indicating a capturing time.
  • the tagging data 320 may further include various information according to an embodiment, such as illuminance, humidity, temperature, a tilt of the user terminal, and a shooting direction.
  • the additional data 330 is information added by the reporter, and includes recording information or text information.
  • the reporter may record a description of the situation at the event scene or the scene sound at the time of shooting through a recording function of the user terminal, or may enter a text description of the event scene.
  • in addition, the additional data 330 may further include various other types of information added by the reporter.
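  • as a concrete picture of this container, the following is a minimal sketch of the media data structure of FIG. 3; the field names and types are assumptions for illustration, not the patent's actual format.

```python
from dataclasses import dataclass

@dataclass
class TaggingData:
    """Environmental information captured at shooting time (320 in FIG. 3)."""
    latitude: float
    longitude: float
    timestamp: float                  # shooting time; may carry date information
    tilt_deg: float | None = None     # from a gyroscope, if present
    heading_deg: float | None = None  # from a geomagnetic sensor, if present

@dataclass
class MediaData:
    """One report (300 in FIG. 3): image/video plus tagging and additional data."""
    image: bytes                      # still image or video payload (310)
    tagging: TaggingData              # position/time/tilt/direction (320)
    voice_note: bytes | None = None   # recorded description or scene sound (330)
    text_note: str | None = None      # reporter's text description (330)
```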
  • FIG. 4 is a flowchart illustrating an example of a method of classifying media data into space units according to the present invention.
  • the device determines the spatial distribution density of the media data based on the location information of the media data received from the plurality of user terminals (S400).
  • the device detects the distribution density of media data as an example of a method for automatically identifying where an event occurred.
  • the device determines the distribution density of each region in basic space units 510, 512, and 514 having a predetermined area, and then identifies the region 510 whose density is greater than or equal to a predetermined threshold.
  • the device may vary the threshold value of the distribution density for the classification of spatial units according to geographic information of each region (for example, large cities, small cities, urban centers and neighborhoods, residential areas, and malls).
  • the device grasps the distribution center of the media data of the corresponding region (S410). For example, reporters may be skewed to one side of the space depending on the scene of the event, so for more accurate classification the device identifies and classifies the media data located within a certain space around the distribution center of the media data.
  • the distribution range of the reporters may vary depending on the geographic environment where the event occurred or the type of event. For example, in the case of a large forest fire, reporters may be spread over several kilometers, while in the case of a traffic accident they may be within several meters of the scene. Likewise, in a crowded city center the area from which the scene can be photographed is narrow because the view is blocked by buildings or people, whereas in secluded countryside the scene can be photographed from a distance.
  • the device does not apply the size of the basic space unit 510 of FIG. 5 as it is when classifying the media data, but actively changes it according to the geographic information of the place where the event occurred or the event type (the type of the central subject of the media data), as shown in FIG. 6.
  • the large circle 520 of FIG. 5 is an example in which the size of the basic space unit is changed.
  • when the media data includes additional information, the apparatus may determine the type of event based on that information, or it may determine the type of event according to the type of the central subject (i.e., the subject at the image center) present in the still image (or video) of the media data. For example, when the central subject of the media data is a car, the event type may be classified as a traffic accident; when it is a flame or smoke, it may be classified as a fire.
  • the device identifies and classifies the media data located within the space unit set around the distribution center of the media data (S430). For example, in FIG. 5, the number of media data located in the right basic space unit 514 is four, but when the size of the space unit is increased (520), the number becomes nine.
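  • the space-unit steps (S400 to S430) can be pictured with a short sketch: bucket reports into a fixed grid, find a cell whose count exceeds a threshold, take the centroid of the reports there, then gather everything within an event-dependent radius. The grid size, threshold, and radii below are illustrative assumptions, not values from the patent.

```python
import math
from collections import defaultdict

# Illustrative parameters; the patent varies these by geography and event type.
BASIC_UNIT_DEG = 0.001     # roughly 100 m grid cell (basic space unit)
DENSITY_THRESHOLD = 5      # reports per cell that suggest an event
RADIUS_BY_EVENT_M = {"fire": 2000.0, "traffic accident": 50.0}

def meters(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Rough small-area distance via an equirectangular approximation."""
    dlat = (a[0] - b[0]) * 111_320.0
    dlon = (a[1] - b[1]) * 111_320.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def classify_space(points: list[tuple[float, float]], event: str) -> list[tuple[float, float]]:
    # S400: spatial distribution density over a grid of basic space units
    cells = defaultdict(list)
    for p in points:
        cells[(int(p[0] / BASIC_UNIT_DEG), int(p[1] / BASIC_UNIT_DEG))].append(p)
    dense = [grp for grp in cells.values() if len(grp) >= DENSITY_THRESHOLD]
    if not dense:
        return []
    crowd = max(dense, key=len)
    # S410: distribution center of the densest region
    center = (sum(p[0] for p in crowd) / len(crowd),
              sum(p[1] for p in crowd) / len(crowd))
    # S430: widen or shrink the space unit by event type, then classify
    radius = RADIUS_BY_EVENT_M.get(event, 100.0)
    return [p for p in points if meters(p, center) <= radius]
```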
  • FIG. 5 is a diagram illustrating an example of a method of classifying space units of media data according to the present invention
  • FIG. 6 is a diagram illustrating an example of changing a basic space unit according to the present invention.
  • the apparatus determines the distribution density of each media data based on the location information included in the media data.
  • the device may determine the distribution of media data accumulated for a predetermined time. For example, the device may determine the distribution density of the media data received on an hourly basis.
  • the apparatus changes the size of the basic space unit 600 according to the geographic information and the event type, and then identifies and classifies the media data located in the space units 610 and 620 of the changed size. For example, in the case of a first region (or a first event), if the reporters are located within a narrow range, the device changes the basic space unit 600 into a smaller space unit 610. Conversely, in the case of a second region (or a second event), if the reporters are spread over a wide range, the apparatus changes the basic space unit 600 into a larger space unit 620.
  • the device may also apply different thresholds for classifying spatial units according to geographic information or event type.
  • FIG. 7 is a flowchart illustrating an example of a method of classifying media data according to an embodiment of the present invention in units of time.
  • the device determines a time distribution density of media data based on time information of media data received from a plurality of user terminals (S400).
  • reports increase in frequency when an event occurs, so the device uses the time distribution density of the media data as one way to automatically identify and classify the occurrence time of the event.
  • the device may vary the threshold value of the time distribution density used for time unit classification according to the geographic information of the place where the event occurs and the type of event. For example, referring to FIG. 8, the time zone for event classification may be determined by using threshold value A for an incident site in a downtown area and threshold value B in a rural area.
  • the apparatus grasps the time distribution center of the media data (S410), and identifies and classifies the media data within a predetermined time range around that center. Since the time distribution of reports can differ depending on geographic information or the type of event, the device actively changes the size of the time range (i.e., the time unit) used for media data classification. For example, if an event lasts a long time, such as a fire, it may be necessary to increase the size of the time unit used for media data classification.
  • the device identifies and classifies the media data located within the set time unit around the time distribution center of the media data (S430).
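  • a parallel sketch for the time-unit steps: histogram the report timestamps into bins, find where the frequency exceeds an area-dependent threshold, and collect the reports within an event-dependent window around the busiest moment. Bin size, thresholds, and window size are again illustrative assumptions.

```python
from collections import Counter

BIN_SECONDS = 60                                    # histogram granularity
THRESHOLD_BY_AREA = {"downtown": 10, "rural": 3}    # thresholds A and B of FIG. 8

def classify_time(timestamps: list[float], area: str,
                  window_s: float = 1800.0) -> list[float]:
    # Count reports per time bin (time distribution density).
    bins = Counter(int(t / BIN_SECONDS) for t in timestamps)
    threshold = THRESHOLD_BY_AREA.get(area, 5)
    hot = [b for b, n in bins.items() if n >= threshold]
    if not hot:
        return []
    # Time distribution center: the busiest over-threshold bin.
    center = (max(hot, key=lambda b: bins[b]) + 0.5) * BIN_SECONDS
    # Classify the reports within the (event-dependent) time unit around it.
    return [t for t in timestamps if abs(t - center) <= window_s / 2]
```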
  • FIG. 8 is a diagram illustrating an example of a time unit classification method of media data according to the present invention
  • FIG. 9 is a diagram illustrating an example of changing a basic time unit according to the present invention.
  • the reporting time and reporting volume may vary according to the location or type of the event. Accordingly, the device actively varies the thresholds A and B for classifying the time zone in which an event occurs, and the sizes of the time units 910 and 920 for classifying media data, according to geographic information or event type.
  • for example, of the event occurrence frequencies 800 and 810, the media data located within time unit 1, during which frequency 800 exceeds the threshold A, are identified and classified.
  • FIG. 10 illustrates an example of classifying media data in space-time units according to the present invention.
  • the device may determine a space unit for classifying media data through the method of FIG. 4, and may determine a time unit for classifying media data through the method of FIG. 7.
  • the two methods of FIGS. 4 and 7 may be applied simultaneously, or only one of them may be applied.
  • the device classifies the media data located in the area 1000 by combining the space unit and the time unit. Therefore, the device can automatically identify the region and time zone where a particular event occurred and classify and provide media data about the event.
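  • combining the two, the space-time unit of FIG. 10 is simply the intersection of the two classifications; a sketch under the same assumptions as the two snippets above, where each report carries a position and a timestamp:

```python
def classify_space_time(reports: list[tuple[float, float, float]],
                        event: str, area: str) -> list[tuple[float, float, float]]:
    """Reports are (lat, lon, timestamp) tuples; returns those falling inside
    both the space unit and the time unit (region 1000 in FIG. 10)."""
    in_space = set(classify_space([(r[0], r[1]) for r in reports], event))
    in_time = set(classify_time([r[2] for r in reports], area))
    return [r for r in reports if (r[0], r[1]) in in_space and r[2] in in_time]
```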
  • FIGS. 11 to 14 illustrate an example of a method of three-dimensionally displaying media data classified in space-time units.
  • FIG. 11 illustrates an example of a method of determining a relative photographing position based on the photographing direction of media data according to the present invention; FIG. 12 illustrates an example of a method of determining a relative photographing distance based on the subject size of media data according to the present invention; and FIG. 13 illustrates an example of a method of determining a relative photographing height based on the photographing angle of media data according to the present invention.
  • the apparatus adjusts the location information of the media data so that the photographing directions of the media data (the arrow directions in FIG. 11) converge to one point 1150; the relative photographing positions 1112, 1122, 1132, and 1142 are determined by moving the photographing positions 1110, 1120, 1130, and 1140 accordingly.
  • the device determines the relative photographing distances (l1, l2, l3, l4) between the central subject and the photographing positions of the media data based on the size of the central subject included in the media data, and thereby determines the relative photographing positions (1200, 1210, 1220, 1230).
  • this embodiment shows an example in which FIG. 12 is applied again to the result of FIG. 11, but the order of application may be varied according to the embodiment.
  • the device may determine the photographing angle of each media data. The larger the photographing angle, the higher the photographing heights 1300 and 1310 of the media data, and the device determines the relative photographing heights h1 and h2 of the media data based on their photographing angles.
  • the apparatus determines the three-dimensional position 1400 of the media data based on the relative photographing direction, relative photographing distance, and relative photographing height of each media data determined in FIGS. 11 to 13. Since the device determines and provides the 3D location 1400 of each media data based on the center of the event, the administrator can easily identify the scene of the event in a desired direction or height. According to an embodiment, the device may provide a screen on which media data is three-dimensionally arranged.
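  • as a sketch of FIGS. 11 to 13 taken together: from each report's bearing toward the subject, an apparent-size-based distance estimate, and a camera elevation angle, a relative 3D position around the event center can be assembled. The pinhole-style distance formula and all parameter names are assumptions for illustration, not the patent's stated method.

```python
import math

def relative_position_3d(bearing_deg: float, subject_px: float,
                         subject_true_m: float, focal_px: float,
                         elevation_deg: float) -> tuple[float, float, float]:
    """Place one camera relative to the event center at the origin.

    bearing_deg:    direction from camera to subject (FIG. 11)
    subject_px:     apparent size of the central subject in the image (FIG. 12)
    subject_true_m: assumed real-world size of the central subject
    focal_px:       camera focal length in pixels (pinhole-model assumption)
    elevation_deg:  camera tilt toward the subject (FIG. 13)
    """
    # FIG. 12: the larger the apparent subject, the shorter the distance.
    distance = focal_px * subject_true_m / subject_px
    # FIG. 13: the larger the photographing angle, the higher the camera.
    height = distance * math.tan(math.radians(elevation_deg))
    # FIG. 11: the camera sits opposite its shooting direction from the subject.
    theta = math.radians(bearing_deg)
    return (-distance * math.sin(theta), -distance * math.cos(theta), height)

# Example: two reporters viewing the same scene (input for FIG. 14's 3D display)
print(relative_position_3d(45.0, 200.0, 4.0, 1000.0, 10.0))
print(relative_position_3d(180.0, 400.0, 4.0, 1000.0, 30.0))
```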
  • FIG. 15 is a diagram illustrating an example of a method of classifying media data based on indoors and outdoors according to the present invention.
  • the device determines whether the photographing position of the media data is indoors or outdoors (S1500). For example, when the device cannot receive GPS information from the user terminal at the time the media data is received, when the location information included in the media data is inconsistent with the recording time of the still image or video included in the media data, or when location information is missing from the media data, the photographing position of the media data can be regarded as indoors.
  • the device also distinguishes whether the event site itself is indoors or outdoors (S1510). For example, the device may make this distinction based on a spectrum analysis of the subject of the media data.
  • the device may classify the media data into four cases, as described below, based on whether the recording location of the media data and the location of the event site are each indoors or outdoors (S1520).
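  • a sketch of the two checks and the resulting four-way classification follows; the missing-GPS and metadata-inconsistency heuristics mirror the description above, while the scene test is left as a stub because the patent only names spectrum analysis.

```python
from enum import Enum

class Place(Enum):
    INDOOR = "indoor"
    OUTDOOR = "outdoor"

def shooting_place(has_gps: bool, metadata_consistent: bool) -> Place:
    # S1500: no GPS fix, or location/recording-time inconsistency => indoors.
    if not has_gps or not metadata_consistent:
        return Place.INDOOR
    return Place.OUTDOOR

def scene_place(image: bytes) -> Place:
    # S1510: the patent suggests spectrum analysis of the subject; a real
    # classifier would go here (stubbed as an assumption).
    return Place.OUTDOOR

def classify_indoor_outdoor(has_gps: bool, metadata_consistent: bool,
                            image: bytes) -> tuple[Place, Place]:
    # S1520: yields one of the four (shooting place, scene place) cases.
    return (shooting_place(has_gps, metadata_consistent), scene_place(image))
```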
  • FIG. 16 is a diagram illustrating the configuration of an embodiment of a user terminal according to the present invention.
  • the user terminal 110 includes a photographing unit 1600, a sensing unit 1610, a recording unit 1620, an input unit 1630, and a transmission unit 1640.
  • the user terminal may include various types of sensors according to embodiments, such as the position sensor 1612, the time sensor 1614, the tilt sensor 1616, the direction sensor 1618, and the like.
  • the photographing unit 1600 captures a still image or a video of the scene of the incident through a camera.
  • the sensing unit 1610 measures and stores various ambient conditions through its sensors when the photographing unit 1600 takes a picture. For example, the sensing unit 1610 measures and stores location information and time information at the time of shooting through the position sensor 1612 and the time sensor 1614. If the user terminal has a tilt sensor 1616, such as a gyroscope, the sensing unit 1610 measures and stores the tilt of the terminal at the time of shooting; if a direction sensor 1618 that can determine the orientation of the terminal, such as a geomagnetic sensor, is present, the sensing unit 1610 also measures and stores the photographing direction.
  • the recording unit 1620 receives and records the sound of the incident scene or the voice of the reporter.
  • the input unit 1630 receives and stores information such as text.
  • the transmitting unit 1640 generates media data comprising the still image (or video) captured by the photographing unit 1600, tagging data containing the various sensing information measured by the sensing unit 1610, and additional data consisting of the recording information or text information received through the recording unit 1620 or the input unit 1630, and transmits the media data to the media data classification apparatus 100.
  • FIG. 17 is a diagram showing the configuration of an embodiment of a media data classification apparatus according to the present invention.
  • the media data classification apparatus 100 includes a receiver 1700, a spatiotemporal classifier 1710, and a provider 1770.
  • the space-time classification unit 1710 may include a space unit classification unit 1720, a time unit classification unit 1730, a situation classification unit 1740, an indoor/outdoor classification unit 1750, and a three-dimensional classification unit 1760.
  • the receiver 1700 receives a plurality of media data from a plurality of user terminals.
  • the space-time classification unit 1710 classifies the plurality of media data by space and time at which an event occurs.
  • the space unit classification unit 1720 identifies a space in which an event occurs and identifies and classifies media data in the space.
  • the spatial unit classification unit 1720 may use the method illustrated in FIG. 4.
  • the time unit classification unit 1730 identifies the time when an event occurs and identifies and classifies the media data of the corresponding time zone. For example, the time unit classification unit 1730 may use the method illustrated in FIG. 7.
  • the situation classifier 1740 identifies and classifies the types of events provided by the media data using the additional data.
  • the indoor/outdoor classification unit 1750 classifies the recording location of the media data, or the location of the event scene, according to whether it is indoors or outdoors. For example, the indoor/outdoor classification unit 1750 may use the method illustrated in FIG. 15.
  • the three-dimensional classifier 1760 determines and provides a relative three-dimensional position with respect to a photographing position of each media data when the media data includes tilt information or direction information.
  • the provider 1770 classifies and provides the media data.
  • the providing unit 1770 may identify an important event and automatically provide it to an administrator when the space distribution density or the time distribution density examined in FIGS. 4 and 7 exceeds a predetermined value, that is, when the number of reports is large.
  • the provider 1770 may automatically classify and provide media data according to a predetermined importance for each event type.
  • the invention can also be embodied as computer readable code on a computer readable recording medium.
  • the computer-readable recording medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, optical data storage devices, and the like.
  • the computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.

Landscapes

  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Development Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Alarm Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Television Signal Processing For Recording (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Editing Of Facsimile Originals (AREA)

Abstract

The present invention relates to a method and apparatus for classifying media data. Upon receiving media data including still images or videos from a plurality of user terminals, the media data classification apparatus classifies the media data into space units based on location information included in the media data, classifies the media data into time units based on time information included in the media data, and classifies the media data into space-time units by combining the space units and the time units.
PCT/KR2016/009349 2014-12-29 2016-08-24 Method and apparatus for classifying media data Ceased WO2017034309A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR20140192564 2014-12-29
KR10-2015-0118870 2015-08-24
KR1020150118870A KR101646733B1 (ko) 2014-12-29 2015-08-24 Media data classification method and apparatus

Publications (1)

Publication Number Publication Date
WO2017034309A1 true WO2017034309A1 (fr) 2017-03-02

Family

ID=56499540

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/009349 Ceased WO2017034309A1 (fr) 2014-12-29 2016-08-24 Method and apparatus for classifying media data

Country Status (2)

Country Link
KR (3) KR101646733B1 (fr)
WO (1) WO2017034309A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107038589B (zh) * 2016-12-14 2019-02-22 Alibaba Group Holding Ltd. Entity information verification method and apparatus
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
WO2022159821A1 * 2021-01-25 2022-07-28 EmergeX, LLC Methods and system for coordinating uncoordinated content based on multimodal metadata through data filtering and synchronization in order to generate composite media content


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050040450A 2003-10-28 2005-05-03 이원형 Hybrid digital watermarking for copyright protection and forgery/alteration detection of digital photographs
KR101362764B1 2007-07-02 2014-02-14 Samsung Electronics Co., Ltd. Apparatus and method for providing photograph files
KR20100008103A 2008-07-15 2010-01-25 박용석 Method for activating news reporting using an intermediary website
KR101086243B1 2009-06-24 2011-12-01 Republic of Korea Method for detecting digital photograph forgery

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070011093A * 2005-07-20 2007-01-24 Samsung Electronics Co., Ltd. Method and apparatus for encoding/playing back multimedia content
US8094948B2 * 2007-04-27 2012-01-10 The Regents Of The University Of California Photo classification using optical parameters of camera from EXIF metadata
KR20100065918A * 2008-12-09 2010-06-17 Korea Institute of Science and Technology Method and apparatus for tagging the photographing position and direction information of a photograph
KR20110134998A * 2010-06-10 2011-12-16 S Mobile Co., Ltd. Method and apparatus for providing location information using a video call
KR20130128800A * 2012-05-18 2013-11-27 Samsung Electronics Co., Ltd. Method and apparatus for sorting content in a media device, and recording medium storing program source for the method

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12316589B2 (en) 2016-10-24 2025-05-27 Snap Inc. Generating and displaying customized avatars in media overlays
US11995288B2 (en) 2017-04-27 2024-05-28 Snap Inc. Location-based search mechanism in a graphical user interface
US12058583B2 (en) 2017-04-27 2024-08-06 Snap Inc. Selective location-based identity communication
US12086381B2 (en) 2017-04-27 2024-09-10 Snap Inc. Map-based graphical user interface for multi-type social media galleries
US12112013B2 (en) * 2017-04-27 2024-10-08 Snap Inc. Location privacy management on map-based social media platforms
US12131003B2 (en) 2017-04-27 2024-10-29 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US12223156B2 (en) 2017-04-27 2025-02-11 Snap Inc. Low-latency delivery mechanism for map-based GUI
US12340064B2 (en) 2017-04-27 2025-06-24 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US12393318B2 (en) 2017-04-27 2025-08-19 Snap Inc. Map-based graphical user interface for ephemeral social media content

Also Published As

Publication number Publication date
KR20160082917A (ko) 2016-07-11
KR101646733B1 (ko) 2016-08-09
KR20160082915A (ko) 2016-07-11
KR20160082935A (ko) 2016-07-11

Similar Documents

Publication Publication Date Title
WO2017034309A1 Method and apparatus for classifying media data
US11615620B2 (en) Systems and methods of enforcing distancing rules
US8769688B2 (en) Simultaneous determination of a computer location and user identification
KR102546763B1 Image providing apparatus and method
US10043079B2 (en) Method and apparatus for providing multi-video summary
WO2013115470A1 Integrated control system and method using a surveillance camera for a vehicle
KR101178539B1 Intelligent video security system and method using an integrated platform design technique
WO2016072625A1 Vehicle location checking system for a parking lot using an imaging technique, and method for controlling same
WO2022186426A1 Image processing device for automatic segment classification, and method for controlling same
WO2014193065A1 Method and apparatus for searching video
US20250356659A1 (en) Systems and methods of identifying persons-of-interest
CN103299344A Monitoring system and occupancy ratio detection method
WO2016060312A1 Security management method and device based on indoor position recognition
WO2020085558A1 High-speed analysis image processing apparatus and control method therefor
WO2016099084A1 Security service providing system and method using a beacon signal
CN210515326U Scenic area ticket inspection system based on face AI recognition
WO2012137994A1 Image recognition device and image monitoring method therefor
KR102682052B1 Apparatus and method for crowd density notification
WO2016072627A1 System and method for managing a multi-level parking lot using an omnidirectional camera
US20190147734A1 (en) Collaborative media collection analysis
WO2016208870A1 Device for reading a vehicle license plate number and method therefor
KR20190099216A RGBD sensing-based object detection system and method
CN112653871A Distributed video surveillance collection and search method and apparatus
WO2022260264A1 Real-time video relay device and method
WO2020218744A1 Method for detecting unauthorized copying of content, and service server using same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16839598

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 20/07/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16839598

Country of ref document: EP

Kind code of ref document: A1