
US10297051B2 - Information processing device, display method, and program storage medium for monitoring object movement - Google Patents


Info

Publication number
US10297051B2
Authority
US
United States
Prior art keywords
information
movement
display
arrow
movement path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/508,573
Other languages
English (en)
Other versions
US20170263024A1 (en)
Inventor
Akiko OSHIMA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Assigned to NEC Corporation (assignment of assignors interest; assignor: Oshima, Akiko)
Publication of US20170263024A1 publication Critical patent/US20170263024A1/en
Application granted granted Critical
Publication of US10297051B2 publication Critical patent/US10297051B2/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/20: Drawing from basic elements, e.g. lines or circles
    • G06T 11/203: Drawing of straight lines or curves
    • G06K 9/00342
    • G06K 9/00536
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES
    • G06Q 30/00: Commerce
    • G06Q 30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0201: Market modelling; Market analysis; Collecting market data
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20: Movements or behaviour, e.g. gesture recognition
    • G06V 40/23: Recognition of whole body movements, e.g. for sport training
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00: Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/12: Classification; Matching
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20076: Probabilistic image processing
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30196: Human being; Person
    • G06T 2207/30241: Trajectory
    • G06T 2207/30242: Counting objects in image

Definitions

  • the present invention relates to a technique to display a movement path of an object.
  • Research may be performed as to how persons behave within a shop, a warehouse or the like for a purpose of grasping purchasing trends of customers or improving business efficiency of workers.
  • For such research, information obtained from an RFID (Radio-frequency identification) tag carried by a person, a SIM (Subscriber Identity Module) card number of a mobile phone, or a camera image captured by a monitoring camera or the like is used.
  • trace information representing a trace of a behavior of each person is acquired.
  • information representing purchasing behaviors of customers, or information effective in improving business efficiency of workers is obtained.
  • PTL 1 describes a customer trend collecting method in which flow line data of a customer recognized by a flow line recognition system configured in a specific area (a monitoring area) of a shop and transaction data of the customer processed by a payment device are associated with each other.
  • PTL 2 describes a customer behavior analysis device having a configuration for collecting conditions in which persons who abandon their self-service actions of choosing commodities from a self-service area in midstream appear.
  • PTL 3 describes a flow line simulation device of visitors, in which movements of visitors and conditions of staying are predicted.
  • the flow line simulation device displays a probability at which visitors choose a shortest path or a main street by the thickness of an arrow.
  • a flow line of a person based on flow line data is displayed on a monitoring area (a flow line area) displayed on a display screen.
  • a main subject of the present invention is to provide a technique to display a state in which an object to be monitored behaves (moves) in a display manner easily recognizable by, for instance, an analyst.
  • an information processing device of the present invention includes:
  • a detection unit that classifies a plurality of objects to be monitored into a plurality of movement paths predetermined based on information on movement of each of the objects
  • a display control unit that controls a display device equipped with a display screen to display movement of the object on the display screen using an arrow for each movement path
  • an axis of the arrow displayed by the display control unit has a thickness depending on the number of objects in the associated movement path and a shape depending on the traces of the objects in the associated movement path, and a direction of the arrow represents a moving direction of the objects.
  • a display method of the present invention includes:
  • controlling a display device equipped with a display screen to display movement of the object on the display screen using an arrow for each movement path such that an axis of the arrow displayed has a thickness depending on the number of objects in the associated movement path and a shape depending on the traces of the objects in the associated movement path, and a direction of the arrow represents a moving direction of the object.
  • a program storage medium of the present invention stores a computer program, the computer program causes a computer to execute:
  • controlling a display device equipped with a display screen to display movement of the object on the display screen using an arrow for each movement path such that an axis of the arrow displayed has a thickness depending on the number of objects in the associated movement path and a shape depending on the traces of the objects in the associated movement path, and a direction of the arrow represents a moving direction of the object.
  • The main subject of the present invention is also achieved by a display method of the present invention corresponding to the information processing device of the present invention. Further, the main subject of the present invention is also achieved by a computer program corresponding to the information processing device of the present invention and the display method of the present invention, and by a program storage medium storing the computer program.
  • FIG. 1 is a block diagram simply illustrating a configuration of an information processing device of a first example embodiment according to the present invention.
  • FIG. 2 is a diagram illustrating a display example of an analysis result by the information processing device of the first example embodiment.
  • FIG. 3 is a flowchart illustrating an operation example of analysis processing in the information processing device of the first example embodiment.
  • FIG. 4 is a block diagram simply illustrating a configuration of an information processing device of a second example embodiment of the present invention.
  • FIG. 5 is a diagram describing a display example of an analysis result by the information processing device of the second example embodiment.
  • FIG. 6 is a diagram describing a display example of a movement path of an object by a moving image.
  • FIG. 7 is a diagram describing a display example of a plurality of movement paths.
  • FIG. 8 is a diagram describing another display example of a plurality of movement paths.
  • FIG. 9 is a flowchart illustrating an operation example of analysis processing in the information processing device of the second example embodiment.
  • FIG. 10 is a diagram describing an example of a hardware configuration of an information processing device according to the present invention.
  • FIG. 11 is a block diagram simply illustrating a configuration of an information processing device of another example embodiment according to the present invention.
  • FIG. 1 is a block diagram simply illustrating a behavior analysis system provided with an information processing device of the first example embodiment according to the present invention. Note that the directions of the arrows in the drawing represent an example, and do not limit the directions of signals between blocks.
  • the behavior analysis system 1 is configured for use in analyzing a behavior of a person on a floor.
  • the behavior analysis system 1 includes an information processing device 10 , a camera 20 , and a display device 30 .
  • the camera 20 is installed in a state that the camera 20 is capable of capturing a floor to be monitored.
  • the camera 20 has a function of transmitting a captured image to the information processing device 10 .
  • the display device 30 is equipped with a display screen (a display), and has a function of displaying information under the control of the information processing device 10 .
  • the information processing device 10 has a function of analyzing a behavior of a person on a floor with use of an image captured by the camera 20 , and causing the display device 30 to display the analysis result.
  • FIG. 10 illustrates an example of a hardware configuration of the information processing device 10 together with the camera 20 and the display device 30 .
  • the information processing device 10 includes a CPU (Central Processing Unit) 300 , a memory 310 , an input-output I/F (InterFace) 320 , and a communication unit 330 . These components are connected to each other by a bus 340 .
  • the input-output I/F 320 has a configuration which enables to communicate information with a peripheral device such as the display device 30 .
  • the communication unit 330 has a configuration which enables it to communicate with, for instance, the camera 20 using a predetermined communication method (e.g. a wireless LAN (Local Area Network)) or an information communication network (e.g. the Internet).
  • the memory 310 is a storage medium (a storage device) which stores data or a computer program (hereinafter, also abbreviated as a program).
  • the memory 310 stores a program which controls an operation of the information processing device 10 .
  • the program is stored in a portable storage medium, for instance.
  • the program may be written in the memory 310 from the portable storage medium.
  • the program may be supplied to the information processing device 10 through an information communication network, and written in the memory 310 .
  • the CPU 300 is able to implement various functions by reading a program stored in the memory 310 and by executing the program.
  • the information processing device 10 implements the following functions by the CPU 300 .
  • the information processing device 10 has, as functional units, a track unit 100 , a detection unit 101 , a direction calculation unit 102 , a time calculation unit 103 , an accumulation unit 104 , and a display control unit 105 as illustrated in FIG. 1 .
  • the track unit 100 has a function of detecting a person as an object to be monitored from a captured image (an image including the floor to be monitored; hereinafter also described as a floor image) using, for instance, image processing.
  • the track unit 100 receives the floor image from the camera 20 .
  • the behavior analysis system 1 described in the first example embodiment may be used as an object movement analysis system which analyzes movement of an object, in place of a behavior of a person.
  • the track unit 100 detects an object predetermined as an object to be monitored from the floor image by the camera 20 , in place of a person.
  • the track unit 100 has a function of acquiring position information of a person detected in each frame of the captured image (moving image) by image processing. Further, the track unit 100 has a function of generating tracking information of a person by arranging the position information of the person acquired in a time-series manner.
  • the tracking information is information such that information representing a time (e.g. information for identifying a frame, time information, or information representing the order) is associated with information relating to identification of the person detected, and the position information of the person.
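As a concrete illustration of the tracking information just described, the sketch below models one record as a time value, a person identifier, and a position, arranged in a time-series manner. All names and the exact layout are assumptions for illustration; the patent does not fix a data format.

```python
from dataclasses import dataclass

# One tracking record: information representing a time (here a frame
# index), information identifying the detected person, and that person's
# position on the floor.  Field names are illustrative assumptions.
@dataclass
class TrackingRecord:
    frame: int       # information representing a time (frame index)
    person_id: str   # information identifying the detected person
    x: float         # position of the person on the floor
    y: float

# Position information arranged in a time-series manner:
records = [
    TrackingRecord(2, "p1", 2.0, 1.0),
    TrackingRecord(0, "p1", 0.0, 0.0),
    TrackingRecord(1, "p1", 1.0, 0.5),
]
records.sort(key=lambda r: r.frame)
```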
  • the detection unit 101 has a function of classifying the tracking information generated for each person. Specifically, in response to receiving the tracking information from the track unit 100 , the detection unit 101 acquires information relating to identification of the person from the tracking information.
  • the detection unit 101 classifies the tracking information for each person based on the information acquired. Further, the detection unit 101 outputs the tracking information classified for each person to the direction calculation unit 102 and the time calculation unit 103 .
  • the detection unit 101 may classify the tracking information for each movement, for instance, movement at a first time and movement at a second time.
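The classification step performed by the detection unit 101 can be sketched as a simple grouping by person identifier; the record layout (frame, person_id, x, y) is an illustrative assumption:

```python
from collections import defaultdict

# Group tracking records by person identifier, as the detection unit is
# described to do.  Records are (frame, person_id, x, y) tuples here.
def classify_by_person(records):
    groups = defaultdict(list)
    for record in records:
        _frame, person_id, _x, _y = record
        groups[person_id].append(record)
    return dict(groups)

tracks = classify_by_person([
    (0, "p1", 0.0, 0.0),
    (0, "p2", 5.0, 5.0),
    (1, "p1", 1.0, 0.0),
])
# tracks["p1"] holds two records, tracks["p2"] holds one
```

A per-movement classification (movement at a first time vs. a second time) could be obtained the same way by grouping on a time-window key instead of the identifier.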
  • the direction calculation unit 102 has a function of generating direction information based on the tracking information.
  • the direction information is information relating to a moving direction of the person detected from the captured image.
  • the direction calculation unit 102 correlates the tracking information which is continued timewise and spacewise. Then, the direction calculation unit 102 calculates information relating to movement of the person detected from the captured image (in this example, start point information of movement, and moving direction information of movement on the floor) based on the tracking information correlated. The direction calculation unit 102 outputs the direction information including the calculated information to the accumulation unit 104 .
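A minimal sketch of the direction information computed here, assuming one person's track is an ordered list of floor coordinates. The patent does not specify how the moving direction is encoded, so an angle from the start point toward the end point is used purely for illustration:

```python
import math

# Direction information for one track: the start point of movement and
# an overall moving direction on the floor (encoded as an angle; the
# encoding is an assumption, not taken from the patent).
def direction_info(positions):
    (x0, y0) = positions[0]
    (xn, yn) = positions[-1]
    angle = math.degrees(math.atan2(yn - y0, xn - x0))
    return {"start": (x0, y0), "direction_deg": angle}

info = direction_info([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)])
# movement along +x: start (0, 0), direction 0 degrees
```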
  • the time calculation unit 103 has a function of generating time information from the tracking information.
  • the time information is time information relating to movement of the person detected from the captured image.
  • the time calculation unit 103 calculates a move time on the floor to be monitored for each person based on the tracking information. Then, the time calculation unit 103 outputs the time information including the move time calculated to the accumulation unit 104 .
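The move time computed by the time calculation unit 103 can be sketched as the span between the earliest and latest timestamps in one person's tracking information (the exact definition is an assumption):

```python
# Move time of one person on the floor: the span between the first and
# last timestamps of that person's tracking information (a sketch).
def move_time(timestamps):
    return max(timestamps) - min(timestamps)

t = move_time([10.0, 11.5, 14.0])  # seconds, illustrative values
```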
  • the accumulation unit 104 has a function of counting the number of pieces of flow line data as follows.
  • the flow line data is information including the start point information, the moving direction information, and the move time regarding movement of the person detected from the captured image.
  • the accumulation unit 104 receives the direction information from the direction calculation unit 102 , and receives the time information from the time calculation unit 103 .
  • the accumulation unit 104 generates the flow line data based on the direction information received and the time information received. Then, the accumulation unit 104 counts the number of pieces of the flow line data including the same moving direction information on the floor to be monitored. Thereafter, the accumulation unit 104 correlates the counted number of pieces of data with the pieces of flow line data counted, and outputs information obtained by correlating the data to the display control unit 105 as an analysis result.
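The counting performed by the accumulation unit 104, pairing each piece of flow line data with the number of pieces sharing its moving direction information, might look like the sketch below. Representing directions as coarse labels so that "the same moving direction" is well defined is an assumption:

```python
from collections import Counter

# Count flow-line data sharing the same moving-direction information and
# correlate each piece with that count, as the accumulation unit is
# described to do.  The dict layout is an illustrative assumption.
def count_by_direction(flow_lines):
    counts = Counter(fl["direction"] for fl in flow_lines)
    return [(fl, counts[fl["direction"]]) for fl in flow_lines]

flow_lines = [
    {"direction": "east", "move_time": 30.0},
    {"direction": "east", "move_time": 42.0},
    {"direction": "north", "move_time": 12.0},
]
counted = count_by_direction(flow_lines)
# each flow line is paired with the count of lines sharing its direction
```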
  • the display control unit 105 has a function of causing the display device 30 to display the analysis result.
  • the analysis result is information in which the number of pieces of the flow line data including the same moving direction information measured by the accumulation unit 104 , and the pieces of the flow line data counted are associated with each other, and represents a movement tendency of persons.
  • FIG. 2 is a diagram illustrating a display example of an analysis result by the display control unit 105 .
  • the analysis results 301 to 303 illustrated in FIG. 2 are information relating to the movement tendency of customers within a shop.
  • the display control unit 105 controls the display device 30 such that the analysis results 301 to 303 are displayed on a plan view of a floor within a shop, as illustrated in FIG. 2 .
  • the analysis results 301 to 303 are represented by the arrows by the display control unit 105 .
  • the directions of the arrows correspond to moving direction information (moving directions of persons) included in the analysis results 301 to 303 .
  • the lengths of the axes of the arrows depend on the time information included in the analysis results 301 to 303 (move times of persons, e.g. an average value of the move times of a plurality of persons whose moving directions are the same). Note that a move time of a person may be displayed by another method.
  • the positions of start points of the arrows representing the analysis results 301 to 303 are aligned at a start point 501 .
  • the positions of the start points of the arrows correspond to the positions of start points of movements of persons.
  • the analysis results 301 to 303 may be displayed in another display manner, as far as the number of pieces of data by counting of the accumulation unit 104 , and pieces of the flow line data associated with the number of pieces of data counted are simultaneously displayed.
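The arrow geometry of such a display can be sketched as follows: every arrow begins at the common start point, points in the moving direction, and has a length proportional to the move time. The scale factor is an illustrative assumption:

```python
import math

# End point of one displayed arrow: start at the common start point,
# point in the moving direction, length proportional to the move time.
# The scale factor (display units per second) is an assumption.
def arrow_endpoint(start, direction_deg, move_time, scale=0.1):
    length = move_time * scale
    rad = math.radians(direction_deg)
    return (start[0] + length * math.cos(rad),
            start[1] + length * math.sin(rad))

end = arrow_endpoint((2.0, 3.0), 90.0, 40.0)  # a 4-unit arrow pointing "up"
```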
  • each function of the information processing device 10 may be implemented by a hardware component, or by combination of a hardware component and a software component.
  • FIG. 3 is a flowchart illustrating an operation example of personal behavior analysis processing by the information processing device 10 . Note that the flowchart illustrated in FIG. 3 illustrates a processing procedure to be executed by the CPU.
  • In response to receiving the captured image (the floor image) from the camera 20 , the track unit 100 detects the person from the floor image by image processing, for instance (Step S101).
  • the track unit 100 calculates the position information of the person detected for each frame. Then, the track unit 100 generates the tracking information based on the position information calculated (Step S 102 ). The track unit 100 outputs the tracking information generated to the detection unit 101 .
  • the detection unit 101 classifies the tracking information received for each person (Step S 103 ). Then, the detection unit 101 outputs the tracking information classified for each person to the direction calculation unit 102 and the time calculation unit 103 .
  • the direction calculation unit 102 generates the direction information (information including the start point information of movement and the moving direction information of movement on the floor) based on the tracking information input (Step S 104 ). Thereafter, the direction calculation unit 102 outputs the direction information generated to the accumulation unit 104 .
  • the time calculation unit 103 generates the time information (information including the move time of the person on the floor) based on the tracking information received (Step S 105 ). Then, the time calculation unit 103 outputs the time information generated to the accumulation unit 104 .
  • the accumulation unit 104 generates the flow line data based on the direction information received and time information received. Thereafter, the accumulation unit 104 counts (calculates) the number of pieces of the flow line data including the same moving direction information on the floor to be monitored based on the direction information (Step S 106 ). The accumulation unit 104 correlates the number of pieces of data counted with the pieces of the flow line data counted, and outputs the information to the display control unit 105 as an analysis result.
  • the display control unit 105 causes the display device 30 to display the analysis result received (Step S 107 ).
  • the information processing device 10 in the first example embodiment allows for the display control unit 105 to cause the display device 30 to display a plurality of pieces of information included in the analysis result (i.e. the number of pieces of the flow line data including the same moving direction information, and the direction information and the time information included in the pieces of the flow line data).
  • the information processing device 10 is able to display the plurality of pieces of information included in the personal behavior analysis result within the area to be monitored such as within a shop or within a warehouse (e.g. attribute information such as the moving direction or the move time of the person, or the number of persons) in a display manner easily recognizable by a user.
  • the user of the information processing device 10 in the first example embodiment can easily check the ratio of the numbers of persons moving in the respective moving directions on the floor to be monitored. Further, the user can easily grasp a customer trend within a shop, or a behavior tendency of work of workers within a warehouse.
  • In the second example embodiment as well, the information processing device 10 is provided in the behavior analysis system 1 , which analyzes the behavior of a person on a floor.
  • the behavior analysis system 1 in the second example embodiment may be used as the object movement analysis system which analyzes movement of the object, in place of the behavior of the person.
  • the track unit 100 detects the object predetermined as a target to be monitored from the floor image by the camera 20 , in place of a person.
  • FIG. 4 is a block diagram simply illustrating a configuration example of the information processing device of the second example embodiment. Note that the directions of the arrows in the drawing represent an example, and do not limit the directions of signals between blocks.
  • the information processing device 10 of the second example embodiment is provided with a configuration, in which the personal behavior analysis result is displayed on the display device 30 with use of the direction of the arrow, thickness of the axis, and length of the axis. Further, the information processing device 10 of the second example embodiment is also provided with a configuration, in which movement tendency (movement path) of persons is displayed by the moving image with use of the display device 30 . Specifically, in the second example embodiment, the information processing device 10 is provided with the hardware configuration as illustrated in FIG. 10 .
  • the information processing device 10 includes, as functional units to be implemented by the CPU, the track unit 100 , the detection unit 101 , the direction calculation unit 102 , the time calculation unit 103 , the accumulation unit 104 , the display control unit 105 , a data generation unit 201 , and a moving image display unit 202 .
  • a configuration relating to moving image display is mainly described.
  • the detection unit 101 outputs the tracking information classified for each person to the direction calculation unit 102 , the time calculation unit 103 , and the data generation unit 201 .
  • the data generation unit 201 has a function of generating display data based on the tracking information received. Specifically, in response to receiving the tracking information, the data generation unit 201 calculates (detects) information on the start time, the moving direction, and an end time relating to movement of the person based on the tracking information.
  • the data generation unit 201 generates the display data based on the information calculated.
  • the display data is data relating to the movement of the person, and, for instance, is data in which sets of coordinate points representing positions of the person detected by the detection unit 101 , and points of time are arranged in a time-series manner. Thereafter, the data generation unit 201 outputs the display data generated to the moving image display unit 202 .
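The display data described here, sets of coordinate points and points of time arranged in a time-series manner together with the start and end times of movement, might be generated as in this sketch (record layout and field names are assumptions):

```python
# Derive the start time, the end time, and a time-ordered list of
# (time, point) pairs from one person's track, as the data generation
# unit is described to do.  Track records are (time, x, y) tuples here.
def make_display_data(track):
    ordered = sorted(track, key=lambda record: record[0])
    start_time, end_time = ordered[0][0], ordered[-1][0]
    points = [(t, (x, y)) for t, x, y in ordered]
    return {"start_time": start_time, "end_time": end_time, "points": points}

data = make_display_data([(2.0, 2.0, 0.8), (0.0, 0.0, 0.0), (1.0, 1.0, 0.2)])
```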
  • the moving image display unit 202 has a function of causing the display device 30 to display the movement tendency of persons, based on the display data received from the data generation unit 201 , as a moving image.
  • FIG. 5 is a diagram describing a display example of the movement tendency (moving image) of persons based on the display data.
  • the movement tendency of persons based on the display data is illustrated as the flow line information pieces 401 to 404 .
  • the flow line information pieces 401 to 404 in FIG. 5 illustrate the movement tendency of clients on a plan view of the floor to be monitored within a shop.
  • the flow line information pieces 401 to 404 are illustrated by arrows.
  • the shape of the axis of each arrow in the length direction corresponds to the trace of movement of the person.
  • the direction of each arrow corresponds to the moving direction of movement of the person.
  • the length of each arrow depends on the move time of the person.
  • the thicknesses of axes of the arrows representing the flow line information pieces 401 to 404 are the same.
  • the axis of each arrow may be displayed with thickness depending on the number of persons also when the movement tendency of persons is displayed by the moving image.
  • the moving image display unit 202 has a function of displaying the flow line information pieces 401 to 404 as illustrated in FIG. 5 by the moving image as follows.
  • FIG. 6 illustrates an example, in which the flow line information piece 401 in FIG. 5 is displayed by the moving image.
  • FIG. 6 is a diagram describing a display example of the moving image by the moving image display unit 202 .
  • The start point of the movement of the person illustrated by the flow line information piece 401 is A, and the end point thereof is B; the end point B moves as illustrated in the order of (a), (b), and (c) of FIG. 6 .
  • the moving image display unit 202 causes the display device 30 to display the state of the arrow which extends in such a manner as to draw the trace of movement from the start point A to the end point B by the moving image.
  • Part (a) of FIG. 6 illustrates the state of the arrow representing the flow line information piece 401 displayed immediately after moving image display is started.
  • At this point, the flow line information piece 401 is displayed as an arrow slightly extending from the start point A.
  • Part (b) of FIG. 6 illustrates the state of the arrow representing the flow line information piece 401 displayed when about half of the moving image has been reproduced.
  • Part (c) of FIG. 6 illustrates the state of the arrow representing the flow line information piece 401 displayed immediately before moving image display is finished.
  • the moving image display unit 202 causes the display device 30 to display the flow line information piece 401 as a moving image by extending the arrow in such a manner as to draw the trace of movement of the end point B. Then, the moving image display unit 202 terminates reproduction of the moving image when the end point B reaches the final point.
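The growing-arrow animation shown in FIG. 6 can be sketched as taking, at each playback fraction, the prefix of the movement trace whose arc length covers that fraction, so that the arrow head (the end point B) appears to travel along the trace. The linear interpolation within a segment is an assumption:

```python
import math

# At playback fraction f in [0, 1], the displayed arrow is the prefix of
# the polyline trace covering fraction f of its total arc length, so the
# arrow head appears to move along the trace as in FIG. 6 (a)-(c).
def arrow_prefix(trace, fraction):
    seg = [math.dist(trace[i], trace[i + 1]) for i in range(len(trace) - 1)]
    total = sum(seg)
    target = fraction * total
    prefix, run = [trace[0]], 0.0
    for (p, q), length in zip(zip(trace, trace[1:]), seg):
        if run + length <= target:
            prefix.append(q)        # whole segment already drawn
            run += length
        else:
            t = (target - run) / length   # interpolate within the segment
            prefix.append((p[0] + t * (q[0] - p[0]),
                           p[1] + t * (q[1] - p[1])))
            break
    return prefix

half = arrow_prefix([(0.0, 0.0), (4.0, 0.0)], 0.5)  # head halfway along
```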
  • a method for displaying the moving image representing the flow line information by the moving image display unit 202 is not limited to the aforementioned method. As far as a state of movement of a person is known, any display method may be used.
  • the moving image display unit 202 may have a function of causing the display device 30 to display the plurality of flow line information pieces 401 to 404 as illustrated in FIG. 5 by the moving image.
  • FIG. 7 is a diagram describing the order in which the moving image display unit 202 causes the display device 30 to display the flow line information pieces 401 to 404 as a moving image.
  • the movements of persons respectively represented by the flow line information pieces 401 to 404 in FIG. 5 are performed in the order of the flow line information piece 401 → the flow line information piece 402 → the flow line information piece 403 → the flow line information piece 404 .
  • the moving image display unit 202 causes the display device 30 to display the arrows representing the flow line information pieces 401 to 404 in a time-series manner according to the order.
  • the moving image display unit 202 causes the display device 30 to display, by the moving image, the movement of the arrow based on the flow line information piece 401 representing the movement performed first, for instance in the period T1 in FIG. 7. Then, after the moving image display based on the flow line information piece 401 is terminated, the moving image display unit 202 causes the display device 30 to display, by the moving image, the movement of the arrow based on the flow line information piece 402 representing the movement performed next, for instance in the period T2 in FIG. 7.
  • likewise, after the moving image display based on the flow line information piece 402 is terminated, the moving image display unit 202 causes the display device 30 to display, by the moving image, the movement of the arrow based on the flow line information piece 403, for instance in the period T3 in FIG. 7. Furthermore, after the moving image display based on the flow line information piece 403 is terminated, the moving image display unit 202 causes the display device 30 to display, by the moving image, the movement of the arrow based on the flow line information piece 404, for instance in the period T4 in FIG. 7.
  • the periods T1 to T4 may be equal in duration to the actual move times of the persons, or may be shortened by a predetermined ratio.
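The time-series playback of FIG. 7 amounts to a simple scheduling rule: each flow line piece's animation begins when the previous one ends, and each slot lasts the person's move time, optionally compressed. The sketch below is illustrative only; `FlowLine`, `schedule_sequential`, and the `ratio` parameter are hypothetical names, not the patent's API.

```python
from dataclasses import dataclass

@dataclass
class FlowLine:
    piece_id: int
    move_time: float  # seconds the person actually took to move

def schedule_sequential(pieces, ratio=1.0):
    """Return (piece_id, start, end) playback slots.

    `pieces` are given in the order the movements were performed, as in
    FIG. 7; `ratio` shortens each period by a predetermined factor.
    """
    slots, clock = [], 0.0
    for piece in pieces:
        duration = piece.move_time * ratio
        slots.append((piece.piece_id, clock, clock + duration))
        clock += duration    # the next animation starts when this one ends
    return slots
```

With `ratio=1.0` the periods equal the actual move times; with `ratio=0.5`, for instance, every period is halved.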
  • FIG. 8 is a diagram describing another example in which the moving image display unit 202 causes the display device 30 to display the flow line information pieces 401 to 404 by the moving image.
  • in this example, the moving image display unit 202 causes the display device 30 to display, by the moving image, movements of persons that actually started at different times, under the assumption that the movements started at the same time.
  • the moving image display unit 202 displays the flow line information pieces 401 to 404 by the moving image during the periods T1 to T4 illustrated in FIG. 8. Specifically, the moving image display unit 202 starts the moving image display of each of the flow line information pieces 401 to 404 at the same time; in other words, it displays the flow line information pieces 401 to 404 simultaneously by the moving image.
  • the durations of the periods T1 to T4, during which the arrows are extended based on the flow line information pieces 401 to 404, correspond to the move times of the persons represented by the respective flow line information pieces.
  • the periods T1 to T4 may be equal in duration to the times during which the persons actually moved, or may be shortened by a predetermined ratio.
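Under the FIG. 8 scheme the scheduling rule changes: every animation starts at time zero and runs for its own (possibly shortened) move time. The following is again a hypothetical sketch with illustrative names, not the patented implementation.

```python
def schedule_simultaneous(move_times, ratio=1.0):
    """Return {piece_id: (start, end)} playback slots.

    `move_times` maps each flow line piece to the person's move time in
    seconds. All arrows start extending together at t = 0; each one stops
    at its own end time, so slower movements visibly take longer.
    """
    return {piece_id: (0.0, seconds * ratio)
            for piece_id, seconds in move_times.items()}
```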
  • the configuration of the information processing device 10 of the second example embodiment is otherwise the same as that of the first example embodiment.
  • FIG. 9 is a flowchart illustrating an operation example of personal behavior analysis processing by the information processing device 10 .
  • the flowchart of FIG. 9 illustrates a processing procedure to be executed by the CPU in the information processing device 10 .
  • the track unit 100 detects a person in the captured image (the floor image) received from the camera 20 (Step S201).
  • the track unit 100 calculates the position information of the detected person for each frame. Then, the track unit 100 generates the tracking information based on the calculated position information (Step S202), and outputs the generated tracking information to the detection unit 101.
  • the detection unit 101 classifies the received tracking information for each person (Step S203). Thereafter, the detection unit 101 outputs the tracking information classified for each person to the data generation unit 201.
  • the data generation unit 201 generates display data relating to the movement of each person based on the tracking information classified for each person (Step S204). Thereafter, the data generation unit 201 outputs the generated display data to the moving image display unit 202.
  • the moving image display unit 202 causes the display device 30 to display the state of movement of each person (the flow line information) by the moving image, based on the received display data (Step S205).
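Steps S202 to S204 can be pictured as a small data pipeline: per-frame detections become tracking records, the records are grouped per person, and each group is turned into an ordered movement path for display. The sketch below is a simplified, hypothetical rendering of that flow; the record layout and function names are assumptions, and detection itself (Step S201) is represented here by pre-computed inputs rather than image processing.

```python
from collections import defaultdict

def build_tracking(detections):
    """S202: detections are (frame_no, person_id, x, y) tuples."""
    return [{"frame": f, "person": pid, "pos": (x, y)}
            for f, pid, x, y in detections]

def classify_by_person(tracking):
    """S203: group the tracking records by person."""
    per_person = defaultdict(list)
    for record in tracking:
        per_person[record["person"]].append(record)
    return per_person

def make_display_data(per_person):
    """S204: one frame-ordered movement path per person."""
    return {pid: [r["pos"] for r in sorted(records, key=lambda r: r["frame"])]
            for pid, records in per_person.items()}
```

The display data produced this way is what the moving image display unit would animate in Step S205.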
  • the information processing device 10 in the second example embodiment has a function of causing the display device 30 to display, by the moving image via the moving image display unit 202, the state of movement of a person detected from the captured image of the camera 20.
  • the information processing device 10 allows the user of the device 10 to dynamically grasp movement of the person on the floor to be monitored. Further, the information processing device 10 allows the user to dynamically compare movements of the plurality of persons on the floor to be monitored.
  • the information processing device 10 in the second example embodiment causes the moving image display unit 202 to display, on the display device 30, the state of movement of each person by the moving image in the order in which the movements were performed. Further, the moving image display unit 202 is also able to cause the display device 30 to display, by the moving image, the state of movement of each person under the assumption that the movements started at the same time. Displaying under this assumption is advantageous in that it allows the user to easily compare the moving direction and the move time of each person's movement.
  • FIG. 11 is a block diagram simply illustrating a configuration of an information processing device of another example embodiment according to the present invention.
  • an information processing device 5 includes a detection unit 6 and a display control unit 7 .
  • the detection unit 6 has a function of classifying information (tracking information) relating to movement of an object to be monitored based on movement path information included in the information for each movement path.
  • the display control unit 7 has a function of controlling a display device to display an arrow whose thickness represents the number of pieces of information (tracking information) classified for each movement path, and whose shape and direction represent the movement path.
  • with the aforementioned configuration, the information processing device 5 is able to display simultaneously the number of objects moving along a predetermined path and the movement path itself.
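One simple way to realize this mapping is to count how many tracked objects share each movement path and derive the arrow's line width from that count; the path itself then gives the arrow's shape and direction. The sketch below is illustrative only: the linear width formula and the names `arrow_widths`, `base_width`, and `width_per_object` are assumptions, not part of the disclosure.

```python
from collections import Counter

def arrow_widths(paths, base_width=1.0, width_per_object=0.5):
    """Classify movement paths and derive one arrow width per path.

    `paths` holds one hashable path key per tracked object (for example a
    tuple of way points). The more objects took a path, the thicker the
    arrow the renderer should draw along that path.
    """
    counts = Counter(paths)
    return {path: base_width + width_per_object * n
            for path, n in counts.items()}
```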
  • the display control unit 7 may control the display device to display an arrow whose length depends on a move time included in the flow line information.
  • with this, the information processing device 5 is also able to display the move time of the object.
  • the information processing device 5 may include a moving image display unit that causes the display device to display the trace of movement represented by the flow line information of the object by the moving image in which the arrow extends.
  • with this, the information processing device 5 is able to dynamically display the state of movement of the object in the area to be monitored.
  • the moving image display unit may cause the display device to display, by the moving image, states in which the arrows depending on the pieces of flow line information of a plurality of objects extend in the order in which the movements of the objects are started.
  • with this, the information processing device 5 is able to display the states of movements of the objects such that the user can dynamically compare them.
  • the moving image display unit may cause the display device to display, by the moving image, states in which the arrows depending on a plurality of pieces of flow line information of the objects extend simultaneously.
  • the information processing device 5 having the aforementioned configuration is able to display the states of movements of the objects under the assumption that the movements started at the same time.
  • a display system includes:
  • a detection unit that has a function as a classifying unit which classifies flow line information of an object for each movement path represented by the flow line information; and
  • a display control unit that displays an arrow whose thickness reflects the number of pieces of the flow line information classified for a predetermined movement path and whose direction and shape reflect the predetermined movement path.
  • the display control unit displays the arrow whose length reflects a move time represented by the flow line information.
  • the display system includes a moving image display unit that displays, by a moving image, a state in which the arrow extends from the start point of the movement path represented by the flow line information of the object to the end point.
  • the moving image display unit displays, by the moving image, states in which the arrows depending on a plurality of pieces of the flow line information of the object extend in the order in which the movements represented by the pieces of the flow line information are started.
  • the moving image display unit displays, by the moving image, states in which the arrows depending on a plurality of pieces of the flow line information of the object extend simultaneously.
  • a display method includes:
  • a display program causes a computer to execute:
  • the display program causes the computer to execute:

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Game Theory and Decision Science (AREA)
  • Marketing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
US15/508,573 2014-09-11 2015-09-07 Information processing device, display method, and program storage medium for monitoring object movement Active US10297051B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2014-184741 2014-09-11
JP2014184741 2014-09-11
PCT/JP2015/004526 WO2016038872A1 (fr) Information processing device, display method, and program storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/004526 A-371-Of-International WO2016038872A1 (fr) Information processing device, display method, and program storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/365,833 Continuation US10825211B2 (en) 2014-09-11 2019-03-27 Information processing device, display method, and program storage medium for monitoring object movement

Publications (2)

Publication Number Publication Date
US20170263024A1 US20170263024A1 (en) 2017-09-14
US10297051B2 (en) 2019-05-21

Family

ID=55458643

Family Applications (5)

Application Number Title Priority Date Filing Date
US15/508,573 Active US10297051B2 (en) 2014-09-11 2015-09-07 Information processing device, display method, and program storage medium for monitoring object movement
US16/365,833 Active US10825211B2 (en) 2014-09-11 2019-03-27 Information processing device, display method, and program storage medium for monitoring object movement
US17/002,451 Active US11315294B2 (en) 2014-09-11 2020-08-25 Information processing device, display method, and program storage medium for monitoring object movement
US17/696,035 Active US11657548B2 (en) 2014-09-11 2022-03-16 Information processing device, display method, and program storage medium for monitoring object movement
US18/135,901 Active US12175566B2 (en) 2014-09-11 2023-04-18 Information processing device, display method, and program storage medium for monitoring object movement

Family Applications After (4)

Application Number Title Priority Date Filing Date
US16/365,833 Active US10825211B2 (en) 2014-09-11 2019-03-27 Information processing device, display method, and program storage medium for monitoring object movement
US17/002,451 Active US11315294B2 (en) 2014-09-11 2020-08-25 Information processing device, display method, and program storage medium for monitoring object movement
US17/696,035 Active US11657548B2 (en) 2014-09-11 2022-03-16 Information processing device, display method, and program storage medium for monitoring object movement
US18/135,901 Active US12175566B2 (en) 2014-09-11 2023-04-18 Information processing device, display method, and program storage medium for monitoring object movement

Country Status (3)

Country Link
US (5) US10297051B2 (fr)
JP (1) JP6399096B2 (fr)
WO (1) WO2016038872A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018185683A (ja) * 2017-04-26 2018-11-22 Fujitsu Ltd Display control program, display control method, and information processing device
JP7135329B2 (ja) * 2018-01-31 2022-09-13 NEC Corp Information processing method, information processing device, and information processing program
JP6969493B2 (ja) * 2018-05-16 2021-11-24 JVCKenwood Corp Video output device, video output method, and computer program
KR102436618B1 (ko) * 2019-07-19 2022-08-25 Mitsubishi Electric Corp Display processing device, display processing method, and storage medium

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06176004A (ja) 1992-07-27 1994-06-24 Takenaka Komuten Co Ltd Flow line simulation device for visitors
US5969755A (en) * 1996-02-05 1999-10-19 Texas Instruments Incorporated Motion based event detection system and method
JPH11288213A (ja) 1998-04-03 1999-10-19 Mitsubishi Electric Corp Navigation device
JP2001124577A (ja) 1999-10-28 2001-05-11 Toyota Motor Corp Road information display device and route search device
US20100013931A1 (en) * 2008-07-16 2010-01-21 Verint Systems Inc. System and method for capturing, storing, analyzing and displaying data relating to the movements of objects
JP2010123069A (ja) 2008-11-21 2010-06-03 Panasonic Corp Sensing data search device and search image creation method
US20110200226A1 (en) 2010-02-17 2011-08-18 Toshiba Tec Kabushiki Kaisha Customer behavior collection method and customer behavior collection apparatus
US20110304497A1 (en) * 2008-12-05 2011-12-15 Nike, Inc. Athletic Performance Monitoring Systems and Methods in a Team Sports Environment
US20120059581A1 (en) 2010-09-02 2012-03-08 Casio Computer Co., Ltd. Positioning apparatus judging movement method to control positioning timing
JP2012246115A (ja) 2011-05-30 2012-12-13 Mitsubishi Electric Corp Device and program for visualizing boarding/alighting history information
US20130002854A1 (en) * 2010-09-17 2013-01-03 Certusview Technologies, Llc Marking methods, apparatus and systems including optical flow-based dead reckoning features
JP2013122652A (ja) 2011-12-09 2013-06-20 Omron Corp Trend analysis system, data structure, and display device
US20130271602A1 (en) * 2010-08-26 2013-10-17 Blast Motion, Inc. Motion event recognition system and method
JP5356615B1 (ja) 2013-02-01 2013-12-04 Panasonic Corp Customer behavior analysis device, customer behavior analysis system, and customer behavior analysis method
JP2015069639A (ja) 2014-04-28 2015-04-13 Panasonic Ip Management Co Ltd Dwell time measurement device, dwell time measurement system, and dwell time measurement method
US20150104149A1 (en) * 2013-10-15 2015-04-16 Electronics And Telecommunications Research Institute Video summary apparatus and method

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6295367B1 (en) * 1997-06-19 2001-09-25 Emtera Corporation System and method for tracking movement of objects in a scene using correspondence graphs
JP2002041338A (ja) 2000-07-25 2002-02-08 Yokogawa Electric Corp Database creation method
JP2002041339A (ja) 2000-07-25 2002-02-08 Toshiba Corp Information retrieval system
US20020041339A1 (en) * 2000-10-10 2002-04-11 Klaus Diepold Graphical representation of motion in still video images
US7433494B2 (en) * 2002-09-19 2008-10-07 Denso Corporation Moving body detecting apparatus
JP3904000B2 (ja) * 2004-05-26 2007-04-11 Konica Minolta Holdings Inc Moving object detection system and moving object detection method
JP2008511066A (ja) * 2004-08-23 2008-04-10 Retail Tech Co Ltd RFID-based shopping pattern analysis system and method
JP2006279859A (ja) * 2005-03-30 2006-10-12 Hitachi Ltd System for providing information on actual movement of moving objects, position information collection device, car navigation device, and method for providing information on actual movement of moving objects
JP4389866B2 (ja) * 2005-12-12 2009-12-24 Seiko Epson Corp Image processing method, image processing device, display device, and program
US8116564B2 (en) * 2006-11-22 2012-02-14 Regents Of The University Of Minnesota Crowd counting and monitoring
US7796029B2 (en) * 2007-06-27 2010-09-14 Honeywell International Inc. Event detection system using electronic tracking devices and video devices
US8107676B2 (en) * 2007-07-30 2012-01-31 International Business Machines Corporation Line length estimation
JP4621716B2 (ja) * 2007-08-13 2011-01-26 Toshiba Tec Corp Person behavior analysis device, method, and program
US8009863B1 (en) * 2008-06-30 2011-08-30 Videomining Corporation Method and system for analyzing shopping behavior using multiple sensor tracking
JP2010113692A (ja) * 2008-11-10 2010-05-20 Nec Corp Customer behavior recording device, customer behavior recording method, and program
JP4748254B2 (ja) 2009-05-29 2011-08-17 Toyota Motor Corp Fuel-efficient driving recommendation device
US20120001828A1 (en) * 2010-06-30 2012-01-05 Gallagher Andrew C Selecting displays for displaying content
US8478048B2 (en) * 2010-07-08 2013-07-02 International Business Machines Corporation Optimization of human activity determination from video
CN103477355B (zh) * 2011-03-31 2016-04-20 Panasonic Corp People counting device
US8438427B2 (en) * 2011-04-08 2013-05-07 Ca, Inc. Visualizing relationships between a transaction trace graph and a map of logical subsystems
JP5808576B2 (ja) 2011-05-26 2015-11-10 Topcon Corp Ophthalmologic imaging apparatus
US9450873B2 (en) 2011-06-28 2016-09-20 Microsoft Technology Licensing, Llc Performance isolation for clouds
US9036864B2 (en) * 2011-08-12 2015-05-19 Edh Holdings (South Africa) (Pty) Ltd. Ball trajectory and bounce position detection
US8615107B2 (en) * 2012-01-11 2013-12-24 Ecole Polytechnique Federale De Lausanne (Epfl) Method and apparatus for multiple object tracking with K-shortest paths
US20140278688A1 (en) * 2013-03-15 2014-09-18 Disney Enterprises, Inc. Guest movement and behavior prediction within a venue
US20140316848A1 (en) * 2013-04-21 2014-10-23 International Business Machines Corporation Cross-Channel Analytics Combining Consumer Activity on the Web and in Physical Venues
US9664510B2 (en) * 2013-06-22 2017-05-30 Intellivision Technologies Corp. Method of tracking moveable objects by combining data obtained from multiple sensor types
JP5506990B1 (ja) * 2013-07-11 2014-05-28 Panasonic Corp Tracking support device, tracking support system, and tracking support method
JP6529078B2 (ja) * 2013-09-06 2019-06-12 NEC Corp Customer behavior analysis system, customer behavior analysis method, customer behavior analysis program, and shelf system
JP6176100B2 (ja) * 2013-12-11 2017-08-09 Konami Digital Entertainment Co Ltd Game program and game system
JP6319421B2 (ja) * 2014-02-25 2018-05-09 NEC Corp Information processing device, data analysis method, and program
US9311799B2 (en) * 2014-03-18 2016-04-12 Symbol Technologies, Llc Modifying RFID system operation using movement detection
JP5853141B2 (ja) * 2014-03-26 2016-02-09 Panasonic Ip Management Co Ltd People counting device, people counting system, and people counting method
EP2942251B1 (fr) * 2014-05-08 2017-04-05 Volvo Car Corporation Method for providing an object prediction representation
JP5866564B1 (ja) * 2014-08-27 2016-02-17 Panasonic Ip Management Co Ltd Monitoring device, monitoring system, and monitoring method

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06176004A (ja) 1992-07-27 1994-06-24 Takenaka Komuten Co Ltd Flow line simulation device for visitors
US5969755A (en) * 1996-02-05 1999-10-19 Texas Instruments Incorporated Motion based event detection system and method
JPH11288213A (ja) 1998-04-03 1999-10-19 Mitsubishi Electric Corp Navigation device
JP2001124577A (ja) 1999-10-28 2001-05-11 Toyota Motor Corp Road information display device and route search device
US20100013931A1 (en) * 2008-07-16 2010-01-21 Verint Systems Inc. System and method for capturing, storing, analyzing and displaying data relating to the movements of objects
JP2010123069A (ja) 2008-11-21 2010-06-03 Panasonic Corp Sensing data search device and search image creation method
US20110304497A1 (en) * 2008-12-05 2011-12-15 Nike, Inc. Athletic Performance Monitoring Systems and Methods in a Team Sports Environment
JP2011170565A (ja) 2010-02-17 2011-09-01 Toshiba Tec Corp Customer trend collection method, device, and program
US20110200226A1 (en) 2010-02-17 2011-08-18 Toshiba Tec Kabushiki Kaisha Customer behavior collection method and customer behavior collection apparatus
US20130271602A1 (en) * 2010-08-26 2013-10-17 Blast Motion, Inc. Motion event recognition system and method
US20120059581A1 (en) 2010-09-02 2012-03-08 Casio Computer Co., Ltd. Positioning apparatus judging movement method to control positioning timing
JP2012052937A (ja) 2010-09-02 2012-03-15 Casio Comput Co Ltd Positioning device, positioning method, and program
US20130002854A1 (en) * 2010-09-17 2013-01-03 Certusview Technologies, Llc Marking methods, apparatus and systems including optical flow-based dead reckoning features
JP2012246115A (ja) 2011-05-30 2012-12-13 Mitsubishi Electric Corp Device and program for visualizing boarding/alighting history information
JP2013122652A (ja) 2011-12-09 2013-06-20 Omron Corp Trend analysis system, data structure, and display device
JP5356615B1 (ja) 2013-02-01 2013-12-04 Panasonic Corp Customer behavior analysis device, customer behavior analysis system, and customer behavior analysis method
US20140222501A1 (en) 2013-02-01 2014-08-07 Panasonic Corporation Customer behavior analysis device, customer behavior analysis system and customer behavior analysis method
US20150104149A1 (en) * 2013-10-15 2015-04-16 Electronics And Telecommunications Research Institute Video summary apparatus and method
JP2015069639A (ja) 2014-04-28 2015-04-13 Panasonic Ip Management Co Ltd Dwell time measurement device, dwell time measurement system, and dwell time measurement method

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Communication dated Mar. 6, 2018, from Japanese Patent Office in counterpart application No. 2016-547695.
English translation of Written opinion for PCT Application No. PCT/JP2015/004526.
International Search Report for PCT Application No. PCT/JP2015/004526, dated Oct. 6, 2015.
Japanese Office Action for JP Application No. 2016-547695 dated Aug. 7, 2018 with English Translation.
Kyoichiro Katabira, "Main stream detection in complex flocks by using laser range scanners and the application for advanced air conditioning control", Sep. 29, 2006, total 73 pages.
Machine Translation to English of JP 2012-052937. *
Machine Translation to English of JP 2012-246115. *
Machine Translation to English of JPH 11-288213. *

Also Published As

Publication number Publication date
US20210042971A1 (en) 2021-02-11
JPWO2016038872A1 (ja) 2017-06-22
US20170263024A1 (en) 2017-09-14
US12175566B2 (en) 2024-12-24
US11657548B2 (en) 2023-05-23
US10825211B2 (en) 2020-11-03
US20190221015A1 (en) 2019-07-18
US20230252698A1 (en) 2023-08-10
US11315294B2 (en) 2022-04-26
US20220207797A1 (en) 2022-06-30
WO2016038872A1 (fr) 2016-03-17
JP6399096B2 (ja) 2018-10-03

Similar Documents

Publication Publication Date Title
US11657548B2 (en) Information processing device, display method, and program storage medium for monitoring object movement
US20190205904A1 (en) Information-processing device, data analysis method, and recording medium
US10846537B2 (en) Information processing device, determination device, notification system, information transmission method, and program
US20210398416A1 (en) Systems and methods for a hand hygiene compliance checking system with explainable feedback
JP6172380B2 (ja) POS terminal device, POS system, product recognition method, and program
CN111428572B (zh) Information processing method and device, electronic device, and medium
CN113469132A (zh) Violation behavior detection method and device, electronic device, and storage medium
JPWO2014087725A1 (ja) Product information processing device, data processing method thereof, and program
US20150006263A1 (en) System and Method of Customer Interaction Monitoring
CN104462530A (zh) User preference analysis method and device, and electronic device
CN114529850A (zh) Risk identification method, device, and system for self-service checkout
CN110659588A (zh) Passenger flow statistics method and device, and computer-readable storage medium
US20180293598A1 (en) Personal behavior analysis device, personal behavior analysis system, and personal behavior analysis method
JP2021026336A (ja) Information processing device and marketing activity support device
CN109583296A (zh) Method, device, and system for preventing false detection, and computer storage medium
CN110677448A (zh) Associated information pushing method, device, and system
US12106566B2 (en) Image processing apparatus that manages checking work performed on articles and image processing method thereof
US20170372255A1 (en) Interaction analysis
EP4354388A1 (fr) Task analysis device and method
CN110689375A (zh) Portable-device-based information interaction method, system, device, and medium
CN104217223A (zh) Method and device for detecting use of handheld devices by persons, and image alarm system thereof
US20220215525A1 (en) Information processing device, information processing program, and information processing method
CN112989200A (zh) Method for providing product usage information and method for improving associated information based on review information
JP7206806B2 (ja) Information processing device, analysis method, and program
JP6926895B2 (ja) Information processing device, information processing system, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: NEC CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OSHIMA, AKIKO;REEL/FRAME:041456/0879

Effective date: 20170222

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4