US20010043721A1 - Method and apparatus for performing motion analysis on an image sequence - Google Patents

Method and apparatus for performing motion analysis on an image sequence

Info

Publication number
US20010043721A1
US20010043721A1 (application US09/769,599)
Authority
US
United States
Prior art keywords
trajectory
motion
determining
route
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/769,599
Inventor
Dina Kravets
Suz-Hsi Wan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sarnoff Corp
Original Assignee
Sarnoff Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sarnoff Corp filed Critical Sarnoff Corp
Priority to US09/769,599 priority Critical patent/US20010043721A1/en
Assigned to SARNOFF CORPORATION reassignment SARNOFF CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KRAVETS, DINA, WAN, SUZ-HSI
Assigned to DARPA reassignment DARPA CONFIRMATORY LICENSE (SEE DOCUMENT FOR DETAILS). Assignors: SARNOFF CORPORATION
Publication of US20010043721A1 publication Critical patent/US20010043721A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30236 Traffic on road, railway or crossing

Abstract

A method and concomitant apparatus for performing motion analysis on a sequence of images is disclosed. Initially, the sequence of images is received from a video source. The sequence of images captures a plurality of objects each moving along a trajectory in an area imaged by the video source. Motion information is extracted from the sequence of images for each of the plurality of objects. Spatial patterns are then determined from the extracted motion information.

Description

  • This application claims the benefit of U.S. Provisional Application No. 60/190,819, filed Mar. 21, 2000, which is herein incorporated by reference.[0001]
  • This invention was made with U.S. government support under NIDL contract number NMA20297D1033 and DARPA contract number MDA97297C0033. The U.S. government has certain rights in this invention. [0002]
  • The invention relates generally to a method and apparatus for video processing and, more particularly, to a method and apparatus for performing motion analysis on an image sequence. [0003]
  • BACKGROUND OF THE DISCLOSURE
  • Current research efforts have used prototype systems to perform data mining or data analysis of spatial and/or temporal information within an image sequence. The prototype systems may perform data mining in accordance with a theoretical framework from Chorochronos, a European-based research network for spatio-temporal database systems. Chorochronos has addressed concerns related to generating datasets, data models, objects and representations of objects. [0004]
  • One existing system implements data mining to determine patterns from spatial data. See Han et al., “GeoMiner: A System Prototype for Spatial Data Mining,” Proceedings of the ACM SIGMOD International Conference on Management of Data, 1997. Other prototype systems implement data mining to determine patterns from temporal data. See Spiliopoulou, “Discovering Patterns in Sequences,” Dagstuhl Seminar 98471, Dagstuhl, Germany, 1998; and Han et al., “Efficient Mining of Partial Periodic Patterns in Time Series Database,” Proceedings of International Conference on Data Engineering, 1999. [0005]
  • In Spiliopoulou, the data mining is performed on the time dimension as an ordered lattice, i.e., the only operators on the time variable are “before” and “after.” Such data mining over the ordered lattice treats time as a sequential variable rather than as a continuous one. More recent work by Spiliopoulou and Han et al. has performed data mining over time as a discrete-valued variable. However, both Spiliopoulou and Han et al. convert the time variable into a sequence of characters, such that data mining techniques for text are used to mine or analyze time-based sequences of events. [0006]
  • Another existing system performs data mining on data having both spatial and temporal components. See Stolorz et al., “Fast Spatio-Temporal Data Mining from Large Geophysical Datasets,” Proceedings of the First International Conference on Knowledge Discovery and Data Mining, IEEE Press, 1995. Stolorz et al. determine limited weather patterns by using parallel computers to automatically detect cyclones and blocking conditions from a large geophysical dataset with temporal components. [0007]
  • The existing systems are limited to applying data mining to numerical or one-dimensional data. However, with the increased use of video data, there is a need for a method and system that extends mining to video data, i.e., that performs motion mining on a video sequence. [0008]
  • SUMMARY OF THE INVENTION
  • The present invention is a method and apparatus for performing motion analysis on a sequence of images. Initially, the sequence of images is received from a video source. The sequence of images captures a plurality of objects each moving along a trajectory in an area imaged by the video source. Motion information is extracted from the sequence of images for each of the plurality of objects. Spatial patterns are then determined from the extracted motion information and used for performing functions such as intelligence assessment, traffic control and airport security. [0009]
  • DESCRIPTION OF THE DRAWINGS
  • The teachings of the present invention can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which: [0010]
  • FIG. 1 depicts a block diagram of a video processing system for performing motion analysis on a video sequence and displaying the analyzed motion in response to a database query; [0011]
  • FIG. 2A depicts raw data received by a motion extraction system of the video processing system; [0012]
  • FIG. 2B depicts exemplary motion information derived from the raw data by the motion extraction system; [0013]
  • FIG. 3 depicts exemplary motion patterns determined from a motion mining system of the present invention; [0014]
  • FIG. 4 depicts a block diagram of one embodiment of the motion mining system; [0015]
  • FIG. 5 depicts an example graphical user interface (GUI) for displaying motion information at the user computer of the video processing system; [0016]
  • FIG. 6 depicts an exemplary set of possible patterns and constraints implemented in the GUI of FIG. 5; [0017]
  • FIG. 7 depicts an exemplary timeline window accessible as a menu option from the GUI of FIG. 5; [0018]
  • FIG. 8 depicts an exemplary details window accessible as a menu option from the GUI of FIG. 5; [0019]
  • FIG. 9 depicts a flow diagram of a method for implementing the video processing system of FIG. 1; and [0020]
  • FIG. 10 depicts a flow diagram of a method for implementing the motion mining system of the present invention.[0021]
  • To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. [0022]
  • DETAILED DESCRIPTION
  • The present invention is a method and apparatus for performing motion analysis on a sequence of images. Initially, a sequence of images is received from a video source. The sequence of images captures a plurality of objects each moving along a trajectory in an area imaged by the video source. Motion information is extracted from the sequence of images for each of the plurality of objects. Spatial patterns are then determined from the extracted motion information. The invention also determines temporal and spatio-temporal patterns from the extracted motion information. These spatial, temporal and spatio-temporal motion patterns support various applications and functions. For example, these motion patterns are used to perform such functions as intelligence assessment, mission planning, counter terrorism, counter drug traffic, traffic control, airport security, and urban policing. [0023]
  • FIG. 1 depicts a block diagram of the video processing system 100 of the present invention. The video processing system 100 performs motion analysis on a video sequence and displays the analyzed motion information in response to a database query. Specifically, the video processing system 100 comprises a video source 105, a motion extraction system 110, a motion mining system 115, a database 120, a server computer 125 and a user interface unit 130. [0024]
  • The video source 105 images a particular area and captures video or an image sequence of the imaged area. Once the video is captured, the video source 105 transmits the captured video to the motion extraction system 110. In one embodiment, the captured video contains a plurality of objects moving in the imaged area. Objects may include people and moving vehicles, e.g., airplanes and automobiles. Examples of the video source 105 include a stationary or moving video camera positioned on an unmanned air vehicle (UAV), a ground-based video camera, e.g., a camera positioned on a traffic light, and a satellite-based video camera. The video source 105 may also comprise recorded video if the location of objects can be extracted from the recorded video by the motion extraction system 110. [0025]
  • In the present invention, the motion in the video is represented as raw data 201 depicted in FIG. 2A. The raw data 201 comprises coordinates (id, x, y, z, t) in 5-dimensional space, where id represents an object identifier, and x, y and z represent a location of the object at a time t. A maximal sequence of points {(id, x0, y0, z0, t0), (id, x1, y1, z1, t1), …, (id, xk, yk, zk, tk)} defines the motion of a moving object, where t0 < t1 < … < tk and the object moves from location (xi, yi, zi) at time ti to location (xi+1, yi+1, zi+1) at time ti+1, for all 0 ≤ i < k. A contiguous portion of the motion is depicted in FIG. 2A as a “trajectory” 202 of the moving object. The location (xi, yi, zi) of the object, i.e., the raw data 201, is produced by a global positioning system (GPS), while the time ti represents a timestamp at the video source 105. [0026]
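The following Python sketch illustrates this raw-data representation; the names (RawPoint, build_trajectories) are illustrative and do not come from the patent:

```python
from collections import defaultdict
from typing import NamedTuple

class RawPoint(NamedTuple):
    id: int    # object identifier
    x: float   # object location at time t
    y: float
    z: float
    t: float   # timestamp at the video source

def build_trajectories(points):
    """Group raw (id, x, y, z, t) samples by object id and order them by
    time, so each trajectory satisfies t0 < t1 < ... < tk."""
    by_object = defaultdict(list)
    for p in points:
        by_object[p.id].append(p)
    for traj in by_object.values():
        traj.sort(key=lambda p: p.t)
    return dict(by_object)
```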
  • The time interval between two successive points is defined by the motion extraction system 110. For a nominal frame rate of 30 frames per second, the time interval may be on the order of one second. In other words, the motion extraction system 110 may periodically process frames, e.g., every thirtieth frame, to reduce the amount of video for processing while maintaining an appreciable degree of accuracy. The exact time interval between successive points is dependent on the type of object among other factors, e.g., weather conditions. For example, a video tracking the presence of aircraft would generally require a shorter time interval than a video tracking the presence of automobiles. [0027]
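A minimal sketch of this periodic frame selection; the stride of 30 mirrors the every-thirtieth-frame example, and the frame source is assumed:

```python
def sample_frames(frames, stride=30):
    """Yield every `stride`-th frame, e.g. about one frame per second at
    a nominal rate of 30 frames per second."""
    for index, frame in enumerate(frames):
        if index % stride == 0:
            yield frame
```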
  • The motion extraction system 110 receives the video or image sequence from the video source 105 and extracts motion information from the received information. An exemplary motion extraction system 110 is provided by the Sarnoff Corporation in U.S. Pat. No. 5,259,040, issued to Hanna on Nov. 2, 1993, which is herein incorporated by reference. FIG. 2B depicts exemplary motion information 206 extracted from the trajectory 202 of each moving object by the motion extraction system 110. Examples of such motion information 206 include a trajectory time span, a trajectory region, a trajectory start point, a trajectory end point, a direction of the trajectory, a speed range of the trajectory, an acceleration range of the trajectory, a shape of the trajectory and a path of the trajectory. Additionally, if the received video contains ESD (electronic sensor data), the motion information 206 also includes the geolocation, i.e., geographical position, of the moving objects. The motion information 206 for each trajectory 202 is then transmitted to the motion mining system 115 and the database 120. [0028]
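As a rough illustration of the per-trajectory attributes listed above, the sketch below derives a time span, endpoints, a bounding region and a speed range from the RawPoint samples of the earlier sketch. It is a simplification: the patent's extraction system (U.S. Pat. No. 5,259,040) operates on the image data itself, not on pre-tracked points.

```python
import math

def extract_motion_info(traj):
    """Derive simple motion information from a time-ordered trajectory:
    time span, start/end points, bounding region and speed range."""
    speeds = [
        math.dist((a.x, a.y, a.z), (b.x, b.y, b.z)) / (b.t - a.t)
        for a, b in zip(traj, traj[1:]) if b.t > a.t
    ]
    return {
        "time_span": (traj[0].t, traj[-1].t),
        "start_point": (traj[0].x, traj[0].y, traj[0].z),
        "end_point": (traj[-1].x, traj[-1].y, traj[-1].z),
        "region": (min(p.x for p in traj), min(p.y for p in traj),
                   max(p.x for p in traj), max(p.y for p in traj)),
        "speed_range": (min(speeds), max(speeds)) if speeds else None,
    }
```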
  • The motion mining system 115 receives the extracted motion information from the motion extraction system 110 and determines spatial, temporal and spatio-temporal patterns from the motion information 206. These motion patterns are then transmitted to and stored in the database 120. FIG. 3 depicts exemplary motion patterns 300 as determined by the motion mining system 115. The motion patterns 300 are provided in response to a database query 310 at the user computer 130. Examples of these motion patterns 300 include but are not limited to: an object stopping, a fast moving object, an active region, a source region, a “beaten” path, a road or route, a convoy, a violation of a traffic light and illegal parking. [0029]
  • In one embodiment, the motion mining system 115 may determine spatial patterns without regard to a particular time. The motion mining system 115 may perform “routes clustering,” i.e., determine the existence of “routes” by clustering or grouping the trajectories 202 of at least two objects traveling along the same path. Each route is determined in an iterative manner. For example, if any portion of a trajectory 202 is close, e.g., within a threshold distance, to a previously considered trajectory or a previously considered route, then the portion is combined with that trajectory or route to form a new route. [0030]
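A minimal sketch of such iterative routes clustering, assuming trajectories are lists of samples with x and y attributes as in the earlier sketches, and using a nearest-sample distance; the patent does not specify the distance computation:

```python
import math

def distance_to_trajectory(p, traj):
    """Distance from one sample to the nearest sample of a trajectory."""
    return min(math.dist((p.x, p.y), (q.x, q.y)) for q in traj)

def cluster_routes(trajectories, threshold):
    """Iteratively group trajectories into routes: a trajectory joins the
    first route that any portion of it approaches within `threshold`;
    otherwise it seeds a new route."""
    routes = []
    for traj in trajectories:
        for route in routes:
            if any(distance_to_trajectory(p, member) <= threshold
                   for p in traj for member in route):
                route.append(traj)
                break
        else:
            routes.append([traj])
    return routes
```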
  • After clustering or grouping the trajectories into routes, the motion mining system 115 may also cluster each route in the time dimension. The clustering of routes in the time dimension represents a “busy time” along the route. For example, the motion mining system 115 determines whether the number of trajectories along the route exceeds a threshold number at different times. To determine such “busy times” along each route, the motion mining system 115 uses a clustering process, e.g., a “K-means” clustering process. Examples of the K-means process are shown in Bradley et al., “Refining Initial Points for K-Means Clustering,” Proceedings of the Fifteenth International Conference on Machine Learning, 1998, pages 91-99; and Hartigan et al., “A K-Means Clustering Algorithm,” Applied Statistics, Vol. 28, No. 1, 1979, pages 100-108. [0031]
  • The determined busy time represents a particular time or time interval when the number of trajectories in the route is greater than a threshold number. The exact value of the threshold number is dependent upon different factors, e.g., the type of object, the time interval under consideration and the region or location of the imaged area. For example, a busy time along a particular route may represent a morning rush hour. [0032]
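A simple one-dimensional K-means over trajectory start times is sketched below, under several assumptions not fixed by the patent (clustering on start times, k = 3 clusters, and a fixed iteration count); clusters whose size exceeds the threshold number are reported as busy times:

```python
import random

def busy_times(route, k=3, threshold=10, iterations=20):
    """Cluster the start times of the trajectories along a route with 1-D
    K-means and report (cluster mean, size) for clusters larger than the
    threshold number of trajectories."""
    times = [traj[0].t for traj in route]
    centers = random.sample(times, min(k, len(times)))
    clusters = []
    for _ in range(iterations):
        clusters = [[] for _ in centers]
        for t in times:
            nearest = min(range(len(centers)),
                          key=lambda i: abs(t - centers[i]))
            clusters[nearest].append(t)
        clusters = [c for c in clusters if c]   # drop empty clusters
        centers = [sum(c) / len(c) for c in clusters]
    return [(sum(c) / len(c), len(c)) for c in clusters if len(c) > threshold]
```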
  • The motion mining system 115 may also cluster or group trajectories to determine popular origins and popular destinations. In one embodiment, the motion mining system 115 uses the start and end points of various trajectories to determine popular source and destination points. If the number of trajectories 202 starting from a region or location is greater than a threshold number, the region or location is identified as a popular origin or source point. Similarly, if the number of trajectories 202 ending at a region or location is greater than the threshold number, the region or location is identified as a popular destination or sink point. The exact value of the threshold number is dependent upon different factors, e.g., the type of object, the time interval under consideration and the region or location of the imaged area. As with the clustering of routes along the time dimension, the motion mining system 115 may use a clustering process, e.g., a K-means clustering process, to identify regions containing many origins of trajectories and regions containing many destinations of trajectories. [0033]
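The sketch below approximates this with a simple grid-based grouping of start and end points rather than K-means; the cell size and threshold are illustrative values:

```python
from collections import defaultdict

def popular_endpoints(trajectories, cell=50.0, threshold=5):
    """Count trajectory start and end points per grid cell and report the
    cells exceeding the threshold as popular sources and sinks."""
    def cell_of(p):
        return (int(p.x // cell), int(p.y // cell))
    sources = defaultdict(int)
    sinks = defaultdict(int)
    for traj in trajectories:
        sources[cell_of(traj[0])] += 1
        sinks[cell_of(traj[-1])] += 1
    popular_sources = [c for c, n in sources.items() if n > threshold]
    popular_sinks = [c for c, n in sinks.items() if n > threshold]
    return popular_sources, popular_sinks
```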
  • Once the spatial patterns, e.g., routes, source points and sink points, are determined, the motion mining system 115 determines whether temporal or periodic patterns exist along the routes. Namely, the motion mining system 115 determines, for each “time scale” within a predefined set of time scales, whether there is any temporal pattern or periodicity in the time dimension along each route. [0034]
  • A time scale represents a time interval where two events are considered to have occurred simultaneously if the events occurred within the same time interval. For example, if the time scale is one hour, a first event at 10:12 AM is considered to have occurred simultaneously with a second event at 10:43 AM. By using a predefined time scale, the motion mining system 115 identifies periodic behavior over time intervals. For example, if the motion mining system 115 is to detect events occurring every Tuesday morning using a one-hour time scale, then the following event sequence may be identified as a pattern: 12/28/99 at 10:12 AM, 1/4/00 at 10:43 AM, 1/11/00 at 10:21 AM, and 1/18/00 at 10:56 AM. As such, the motion mining system 115 detects events periodically occurring within a time range and is not limited to strictly periodic events, i.e., events occurring at exactly the same time. Exemplary time scale values are one minute, five minutes, one hour, one day and one week. However, a person of ordinary skill may also use other time scale values to determine periodicity. [0035]
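A minimal check for this kind of tolerant periodicity, with times in seconds; treating the time scale as a plus-or-minus tolerance on successive gaps is an assumption of this sketch:

```python
def is_periodic(event_times, period, scale):
    """Return True if successive events are separated by roughly `period`
    seconds, to within one time-scale interval. The weekly example above
    corresponds to period=7*24*3600 and scale=3600 (one hour)."""
    times = sorted(event_times)
    if len(times) < 2:
        return False
    return all(abs((b - a) - period) <= scale
               for a, b in zip(times, times[1:]))
```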
  • The motion mining system 115 may also determine a “time correlation” between different routes for different time scales. Namely, the motion mining system 115 determines, for each pair of routes separated by less than a threshold distance, whether a trajectory 202 in one route is followed at the same time interval by a trajectory 202 in another route. For example, if the time scale is one hour, the following events are considered a four-hour time correlation between a first route and a second route: events occurring on 12/28/99 at 10:12 AM, 1/4/00 at 10:43 AM, 1/11/00 at 10:21 AM and 1/18/00 at 10:56 AM over the first route, and events occurring on 12/28/99 at 2:33 PM, 1/4/00 at 2:18 PM, 1/11/00 at 2:26 PM and 1/18/00 at 2:34 PM over the second route. [0036]
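A sketch of the corresponding correlation test, pairing the i-th event on each route and checking for a roughly constant lag to within one time-scale interval; the pairing rule is an assumption, since the patent does not define it:

```python
def time_correlated(times_a, times_b, lag, scale):
    """Return True if each event on route A is followed on route B after
    about `lag` seconds. The four-hour example above corresponds to
    lag=4*3600 with scale=3600 (one hour)."""
    if len(times_a) != len(times_b) or not times_a:
        return False
    return all(abs((b - a) - lag) <= scale
               for a, b in zip(sorted(times_a), sorted(times_b)))
```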
  • The motion mining system 115 is not limited to the above-identified patterns. For example, the motion mining system 115 may apply a pattern operator 210 to determine a time periodicity of a particular object within a route. The motion mining system 115 may also identify fast-moving objects and slow-moving objects having trajectory speeds within particular ranges. Additionally, the motion mining system 115 may apply a deviation operator 212 to determine any deviations or violations from a previously determined pattern. [0037]
  • The above-mentioned motion patterns are used in a variety of practical applications. Such applications that use these motion patterns include: discovering periodic patterns of flights, determining whether different flights are time-correlated, finding regions of heavy influx of vehicles over the last week, finding regions of heavy outflow of vehicles, predicting the location of an object in the next hour, and predicting any suspicious patterns between two objects. Additionally, applications that use deviations of motion patterns include: finding objects flying in a region having no commercial routes, detecting a fast moving object in a no-fly zone, determining whether a vehicle has reversed its path near a road block, determining whether a vehicle is moving faster than average, detecting speeding objects, and detecting objects with unusual acceleration. These applications are exemplary and are not considered to be limiting in any manner. [0038]
  • The database 120 contains at least the above-identified motion patterns from the motion mining system 115 and motion information from the motion extraction system 110. The motion patterns and motion information represent motion of a plurality of objects in an imaged area and over a particular time interval. The server computer 125 accesses and uses the data in the database to perform calculations in response to a query from the user computer 130. The query results are displayed as a graphical user interface (GUI) at the user computer 130. An exemplary GUI is further described with respect to FIGS. 5-8. [0039]
  • FIG. 4 depicts one embodiment of the motion mining system 115 of the present invention. The motion mining system 115 is embodied as a computer 400 comprising a memory 402, a central processing unit (CPU) 404, a signal interface 406 and support circuits 408. The memory 402 stores software programs, e.g., a motion mining program 410 and a K-means clustering program 412. The motion mining program 410, when executed by the CPU 404, is used to implement the motion mining system 115. The K-means clustering program 412 is used to determine source points, target or destination points, and the busy times along routes. [0040]
  • The signal interface 406 receives motion information from the motion extraction system 110. Once the motion information is received at the signal interface 406, the CPU 404 executes instructions or commands in the motion mining program 410 to determine spatial patterns, temporal patterns and spatio-temporal patterns from the received motion information. The CPU 404 may use well-known support circuits 408 to implement the motion mining program 410. Examples of support circuits 408 may include a clock, a power supply, a cache memory, and the like. The signal interface 406 then transmits the patterns to the database 120. Examples of the signal interface 406 include a cable modem, a network card, and the like. [0041]
  • FIG. 5 depicts an exemplary graphical user interface (GUI) 500 for displaying motion information and motion patterns. A user may use the GUI 500 to specify a database query on the motion patterns and motion information stored in the database 120. The results of the query are then displayed on the GUI 500. Specifically, the GUI 500 comprises a menu window 510, a map window 520 and a timeline window 530. [0042]
  • The menu window 510 includes a search tab 512, a details tab 514 and a timeline tab 516. The menu window 510 may also include additional tabs, e.g., a messages tab, an alert tab and a database tab. If the search tab 512 is selected, the menu window 510 displays a search window 550 containing a list 552 of predefined queries and fields 554, 556, 558 and 560 to define a selected query. For example, the list 552 may include queries to determine busy source points, routes and deviations thereof, and a time correlation between different routes. [0043]
  • The fields 554-560 are used to define the query with selected categories of motion patterns and motion information 206 and with selected values of constraints. The type and number of available fields or option windows 554-560 is dependent on the type of query to perform. The constraints selected in the fields 554C, 556C and 558C are limitations on a particular motion pattern or a particular type of motion information 206. An exemplary set 600 of motion patterns, motion information 206 and constraints is depicted in FIG. 6. Such constraints, shown as shaded in the set 600, may define a speed range, an acceleration range, a direction of an object, a type of object, a time interval relative to an event, and a region relative to a trajectory 202. [0044]
  • The search window 550 depicted in FIG. 5 indicates the selection of a query “pln” as indicated in field 562. This query is used to determine the existence of a particular type of route. The query is specified by a speed category in field 554, a crossing condition in field 556, a time frame in field 558 and an existence of off-norm traffic in field 560. Constraints are provided in field 554C for a speed range, in field 556C for a region being crossed, and in field 558C for objects traveling in a specified time frame. [0045]
  • The selection of the timeline tab 516 results in the display of a timeline window 700 depicted in FIG. 7. The timeline window 700 contains fields for defining the start and end of a time frame or time interval of interest. Once the query is performed and the query result is displayed on the map window 520, the user may select the details tab 514 to display a details window 800 depicted in FIG. 8. The details window 800 indicates information on each trajectory that satisfies the query specified in the search window 550. For example, the details window 800 may indicate the object identifier and the start time for the trajectory of the object. [0046]
  • The map window 520 displays the routes that satisfy the query specified in the search window 550. Namely, the map window provides a spatial representation of an area captured in the video. The creation of the map window 520 includes displaying a background image of the imaged area and then overlaying the background image with the routes that satisfy the query. [0047]
  • The timeline window 530 displays the object identifier and time span of each trajectory in the routes shown in the map window 520. Thus, the timeline window 530 provides a temporal representation of the trajectories for each route. The range of time is specified in the timeline window 700. Specific information for each trajectory is displayed in the details window 800. [0048]
  • FIG. 9 depicts a flow diagram of a method 900 for implementing the video processing system 100. The method 900 starts at step 902 and proceeds to step 904, where the video source 105 captures video of the imaged area of interest. At step 906, the method 900 uses the motion extraction system 110 to extract motion information 206 from the video. Motion information 206 includes information relating to a trajectory 202 of a moving object captured in the video. Examples of motion information 206 include a path of the trajectory 202, a speed range of the trajectory 202, and a start point of the trajectory 202. [0049]
  • At step 908, the method 900 stores the extracted motion information 206 in the database 120. The method 900 uses the extracted motion information at step 910, where the inventive motion mining system 115 determines motion patterns from the extracted motion information 206. Step 910 is further described with respect to FIG. 10. The motion patterns are stored in the database 120 at step 912. [0050]
  • The method 900 proceeds to step 914, where the server computer 125 performs a query on the stored motion information 206 and motion patterns. The query is performed in response to a command from a user at the user computer 130. The query may contain constraints used to specify particular categories or ranges of motion information 206 and motion patterns. At step 916, the motion information 206 and motion patterns specified in the query are retrieved from the database 120. The method 900 proceeds to step 918, where the results of the query are displayed through a graphical user interface (GUI) at the user computer 130. Steps 914, 916 and 918 are used to perform each query specified by the user. The method 900 proceeds to end at step 920. [0051]
  • FIG. 10 depicts a flow diagram of a method 1000 for implementing the motion mining system 115 of the present invention. The motion mining system 115 performs the method 1000 in accordance with commands in the motion mining program 410. Specifically, the method 1000 starts at step 1002 and proceeds to step 1004, where motion information 206 is received from the motion extraction system 110. [0052]
  • The motion information is used to determine routes at step 1006. Routes represent the grouping or clustering of two or more trajectories having a common path. The information on the routes is prepared for storage at step 1008. For example, information from step 1006 may be temporarily stored in the memory 402 or support circuits 408 of the motion mining system 115 or may be directly stored in the database 120. [0053]
  • At step 1010, the method determines busy times along each of the routes. Specifically, for each route, step 1010 performs clustering of the trajectories along the time dimension. The busy times or intervals represent the times when the number of trajectories is greater than a predetermined threshold number. The information from step 1010 is prepared for storage at step 1008. [0054]
  • The method 1000 also uses the information on routes in step 1012, where periodic patterns are determined for each route. The determination of periodic patterns uses the concept of a time scale that represents a time interval where two events are considered to have occurred simultaneously if the events occurred within the same time interval. As such, step 1012 is not limited to strictly periodic patterns but captures additional periodic patterns within a predefined time interval or time scale. The information from step 1012 is also prepared for storage at step 1008. [0055]
  • At step 1014, the method 1000 also uses information from step 1006 to determine time correlations between two different routes for different time scales. For each pair of trajectories 302 separated by less than a threshold distance, step 1014 determines whether a trajectory 302 in a first route is followed at the same time interval by another trajectory 302 in a second route. The information on time correlation is prepared for storage at step 1008. [0056]
  • The method 1000 uses the received motion information 206 to determine popular origins and destinations at step 1016. The popular origins and destinations are determined by grouping or clustering common start points and common end points of trajectories 302. For example, step 1016 identifies a popular origin if the number of trajectories having a common start point is greater than a threshold number. Similarly, step 1016 identifies a popular destination if the number of trajectories having a common end point is greater than a threshold number. The information on popular origins and destinations is prepared for storage at step 1008. The method 1000 proceeds to end at step 1018 once all the motion patterns are determined from steps 1006, 1010, 1012, 1014 and 1016. [0057]
  • Although various embodiments which incorporate the teachings of the present invention have been shown and described in detail herein, those skilled in the art can readily devise many other varied embodiments that still incorporate these teachings. [0058]

Claims (25)

What is claimed is:
1. A method for performing motion analysis on a sequence of images, where said sequence of images captures a plurality of objects each moving along a trajectory in an imaged area, said method comprising:
extracting motion information for each of said plurality of objects contained in said sequence of images; and
determining spatial patterns from said extracted motion information.
2. The method of claim 1, wherein said determining of spatial patterns comprises:
determining a route comprising a trajectory of a first object having the same trajectory of at least one other object.
3. The method of claim 2, wherein said determining of said route comprises:
determining whether said trajectory of a second object is within a threshold distance of said trajectory of said first object; and
including, if said trajectory of said second object is within the threshold distance, said trajectory of said second object in said route.
4. The method of claim 1, wherein said determining of spatial patterns comprises:
determining a source point and a destination point from said trajectory of said plurality of objects.
5. The method of claim 4, wherein said determining said source point comprises:
determining whether a number of trajectories originating from a location is greater than a threshold number; and
identifying, if the number of trajectories originating from the location is greater than the threshold number, the location as said source point.
6. The method of claim 4, wherein said determining said destination point comprises:
determining whether a number of trajectories ending at a location is greater than a threshold number; and
identifying, if the number of trajectories ending at the location is greater than the threshold number, the location as said destination point.
7. The method of claim 4, wherein said source point and said destination point are determined using a clustering process.
8. The method of claim 1, further comprising:
determining spatio-temporal patterns from said determined spatial patterns along a time dimension.
9. The method of claim 8, wherein said determining of spatio-temporal patterns comprises:
determining a busy time for said route, where the busy time represents a time when a number of trajectories for said plurality of objects along said route is greater than a threshold number.
10. The method of claim 8, wherein said determining of spatio-temporal patterns comprises:
determining a periodicity of at least one trajectory in said route.
11. The method of claim 10, wherein said determining the periodicity comprises:
selecting a time scale; and
determining whether a first occurrence of an event along said route and time scale is periodic with subsequent occurrences of said event along the same route and time scale.
12. The method of claim 11, wherein said event comprises said trajectory of said first object.
13. The method of claim 11, wherein said event comprises a number of said trajectories greater than a threshold value.
14. The method of claim 1, further comprising:
determining a first route comprising a trajectory common to a first set of at least two objects;
determining a second route comprising a trajectory common to a second set of at least two objects; and
determining whether said trajectory in said first route is time correlated with said trajectory in said second route.
15. A method for displaying motion information of objects contained in a sequence of images, the method comprising:
performing a query on a plurality of spatial patterns stored in a database, where each of said plurality of spatial patterns comprises a route determined from a trajectory common to at least two objects moving in an imaged area captured in said sequence of images;
determining a trajectory satisfying at least one constraint specified in said query; and
displaying said determined trajectory on a user interface.
16. A system for performing motion analysis on a sequence of images, the system comprising:
a motion extraction system for receiving said sequence of images capturing a plurality of objects each moving along a trajectory, and extracting motion information for each of said plurality of objects over said sequence of images; and
a motion mining system for determining spatial patterns from said extracted motion information, where said spatial patterns comprise a route determined from said trajectory common to at least two objects.
17. The system of claim 16, further comprising a video source for capturing said plurality of objects in an imaged area and transmitting video containing said captured plurality of objects to said motion extraction system.
18. The system of claim 16, further comprising:
a database for storing said spatial patterns determined from said motion mining system; and
a server computer for retrieving said trajectory satisfying at least one constraint specified in a query.
19. The system of claim 16, wherein said spatial patterns comprise a route having a trajectory of a first object that is the same as the trajectory of at least one other object.
20. The system of claim 16, wherein said spatial patterns comprise a source point and a destination point for said trajectories of said plurality of objects.
21. The system of claim 16, wherein said motion mining system determines spatio-temporal patterns from said spatial patterns along a time dimension.
22. An apparatus for performing picture analysis, the apparatus comprising:
a memory for storing a motion mining program;
an interface for receiving motion information containing trajectory information for a plurality of objects captured in an image sequence;
a processor that, upon executing said motion mining program retrieved from said memory, determines spatial patterns from the received motion information.
23. The apparatus of claim 22, wherein said spatial patterns comprise a route having a trajectory of a first object that is the same as the trajectory of at least one other object.
24. The apparatus of claim 22, wherein said spatial patterns comprise a source point and a destination point for said trajectories of said plurality of objects.
25. The apparatus of claim 22, wherein said processor determines spatio-temporal patterns from said spatial patterns along a time dimension.
US09/769,599 2000-03-21 2001-01-25 Method and apparatus for performing motion analysis on an image sequence Abandoned US20010043721A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/769,599 US20010043721A1 (en) 2000-03-21 2001-01-25 Method and apparatus for performing motion analysis on an image sequence

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US19081900P 2000-03-21 2000-03-21
US09/769,599 US20010043721A1 (en) 2000-03-21 2001-01-25 Method and apparatus for performing motion analysis on an image sequence

Publications (1)

Publication Number Publication Date
US20010043721A1 true US20010043721A1 (en) 2001-11-22

Family

ID=26886486

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/769,599 Abandoned US20010043721A1 (en) 2000-03-21 2001-01-25 Method and apparatus for performing motion analysis on an image sequence

Country Status (1)

Country Link
US (1) US20010043721A1 (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4179681A (en) * 1973-03-05 1979-12-18 The United States Of America As Represented By The Secretary Of The Navy Target tracking sonar signal processing and display system
US4025718A (en) * 1974-12-10 1977-05-24 Comitato Nazionale Per L'energia Nucleare Method and apparatus for recording in a memory trajectories and traces of objects
US4361202A (en) * 1979-06-15 1982-11-30 Michael Minovitch Automated road transportation system
US4622458A (en) * 1982-11-30 1986-11-11 Messerschmitt-Boelkow-Blohm Gmbh Trajectory acquisition and monitoring system
US5259040A (en) * 1991-10-04 1993-11-02 David Sarnoff Research Center, Inc. Method for determining sensor motion and scene structure and image processing system therefor
US5809161A (en) * 1992-03-20 1998-09-15 Commonwealth Scientific And Industrial Research Organisation Vehicle monitoring system
US5444442A (en) * 1992-11-05 1995-08-22 Matsushita Electric Industrial Co., Ltd. Method for predicting traffic space mean speed and traffic flow rate, and method and apparatus for controlling isolated traffic light signaling system through predicted traffic flow rate
US5801943A (en) * 1993-07-23 1998-09-01 Condition Monitoring Systems Traffic surveillance and simulation apparatus
US5416711A (en) * 1993-10-18 1995-05-16 Grumman Aerospace Corporation Infra-red sensor system for intelligent vehicle highway systems
US5969755A (en) * 1996-02-05 1999-10-19 Texas Instruments Incorporated Motion based event detection system and method
US6353796B1 (en) * 1996-10-24 2002-03-05 Trimble Navigation Limited Vehicle tracker, mileage-time monitor and calibrator
US6177885B1 (en) * 1998-11-03 2001-01-23 Esco Electronics, Inc. System and method for detecting traffic anomalies
US6567116B1 (en) * 1998-11-20 2003-05-20 James A. Aman Multiple object tracking system
US6445308B1 (en) * 1999-01-12 2002-09-03 Toyota Jidosha Kabushiki Kaisha Positional data utilizing inter-vehicle communication method and traveling control apparatus

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6718317B1 (en) * 2000-06-02 2004-04-06 International Business Machines Corporation Methods for identifying partial periodic patterns and corresponding event subsequences in an event sequence
US20030058341A1 (en) * 2001-09-27 2003-03-27 Koninklijke Philips Electronics N.V. Video based detection of fall-down and other events
US7110569B2 (en) * 2001-09-27 2006-09-19 Koninklijke Philips Electronics N.V. Video based detection of fall-down and other events
US6587752B1 (en) * 2001-12-25 2003-07-01 National Institute Of Advanced Industrial Science And Technology Robot operation teaching method and apparatus
US20060256210A1 (en) * 2005-04-28 2006-11-16 Kathleen Ryall Spatio-temporal graphical user interface for querying videos
US7598977B2 (en) * 2005-04-28 2009-10-06 Mitsubishi Electric Research Laboratories, Inc. Spatio-temporal graphical user interface for querying videos
US20070213874A1 (en) * 2006-03-10 2007-09-13 Fanuc Ltd Device, program, recording medium and method for robot simulation
WO2008083663A1 (en) * 2007-01-08 2008-07-17 Norbert Link Method for the automatic analysis of object movements
US20080250259A1 (en) * 2007-04-04 2008-10-09 Clark Equipment Company Power Machine or Vehicle with Power Management
US8718878B2 (en) * 2007-04-04 2014-05-06 Clark Equipment Company Power machine or vehicle with power management
US20100042269A1 (en) * 2007-12-14 2010-02-18 Kokkeby Kristen L System and methods relating to autonomous tracking and surveillance
US8718838B2 (en) * 2007-12-14 2014-05-06 The Boeing Company System and methods for autonomous tracking and surveillance
US20090157233A1 (en) * 2007-12-14 2009-06-18 Kokkeby Kristen L System and methods for autonomous tracking and surveillance
US9026272B2 (en) 2007-12-14 2015-05-05 The Boeing Company Methods for autonomous tracking and surveillance
US20110225194A1 (en) * 2010-03-09 2011-09-15 Electronics And Telecommunications Research Institute Apparatus and method for analyzing information about floating population
US8554788B2 (en) * 2010-03-09 2013-10-08 Electronics And Telecommunications Research Institute Apparatus and method for analyzing information about floating population
KR101420180B1 (en) * 2010-03-09 2014-07-21 한국전자통신연구원 Apparatus for analyzing of floating population information and method thereof
WO2012159617A2 (en) * 2011-05-26 2012-11-29 Conti Temic Microelectronic Gmbh Method for deriving information on the surrounding area from the identification of tail lights of preceding vehicles
WO2012159617A3 (en) * 2011-05-26 2013-03-14 Conti Temic Microelectronic Gmbh Method for deriving information on the surrounding area from the identification of tail lights of preceding vehicles
US20140052816A1 (en) * 2012-08-20 2014-02-20 National Taiwan University Of Science And Technology Network matchmaking system
US20140358744A1 (en) * 2013-05-31 2014-12-04 Bank Of America Corporation Bitemporal milestoning of model free data
US9544636B2 (en) 2014-07-07 2017-01-10 Google Inc. Method and system for editing event categories
US9672427B2 (en) 2014-07-07 2017-06-06 Google Inc. Systems and methods for categorizing motion events
US9158974B1 (en) 2014-07-07 2015-10-13 Google Inc. Method and system for motion vector-based video monitoring and event categorization
US9213903B1 (en) * 2014-07-07 2015-12-15 Google Inc. Method and system for cluster-based video monitoring and event categorization
US9224044B1 (en) 2014-07-07 2015-12-29 Google Inc. Method and system for video zone monitoring
US9354794B2 (en) 2014-07-07 2016-05-31 Google Inc. Method and system for performing client-side zooming of a remote video feed
US9420331B2 (en) 2014-07-07 2016-08-16 Google Inc. Method and system for categorizing detected motion events
US9449229B1 (en) 2014-07-07 2016-09-20 Google Inc. Systems and methods for categorizing motion event candidates
US9479822B2 (en) 2014-07-07 2016-10-25 Google Inc. Method and system for categorizing detected motion events
US9489580B2 (en) 2014-07-07 2016-11-08 Google Inc. Method and system for cluster-based video monitoring and event categorization
US9501915B1 (en) 2014-07-07 2016-11-22 Google Inc. Systems and methods for analyzing a video stream
US10867496B2 (en) 2014-07-07 2020-12-15 Google Llc Methods and systems for presenting video feeds
US9602860B2 (en) 2014-07-07 2017-03-21 Google Inc. Method and system for displaying recorded and live video feeds
US11250679B2 (en) 2014-07-07 2022-02-15 Google Llc Systems and methods for categorizing motion events
US9609380B2 (en) 2014-07-07 2017-03-28 Google Inc. Method and system for detecting and presenting a new event in a video feed
US10789821B2 (en) 2014-07-07 2020-09-29 Google Llc Methods and systems for camera-side cropping of a video feed
US9674570B2 (en) 2014-07-07 2017-06-06 Google Inc. Method and system for detecting and presenting video feed
US9779307B2 (en) 2014-07-07 2017-10-03 Google Inc. Method and system for non-causal zone search in video monitoring
US11062580B2 (en) 2014-07-07 2021-07-13 Google Llc Methods and systems for updating an event timeline with event indicators
US9886161B2 (en) 2014-07-07 2018-02-06 Google Llc Method and system for motion vector-based video monitoring and event categorization
US9940523B2 (en) 2014-07-07 2018-04-10 Google Llc Video monitoring user interface for displaying motion events feed
US11011035B2 (en) 2014-07-07 2021-05-18 Google Llc Methods and systems for detecting persons in a smart home environment
US10108862B2 (en) 2014-07-07 2018-10-23 Google Llc Methods and systems for displaying live video and recorded video
US10127783B2 (en) 2014-07-07 2018-11-13 Google Llc Method and device for processing motion events
US10140827B2 (en) 2014-07-07 2018-11-27 Google Llc Method and system for processing motion event notifications
US10180775B2 (en) 2014-07-07 2019-01-15 Google Llc Method and system for displaying recorded and live video feeds
US10192120B2 (en) 2014-07-07 2019-01-29 Google Llc Method and system for generating a smart time-lapse video clip
US10452921B2 (en) 2014-07-07 2019-10-22 Google Llc Methods and systems for displaying video streams
US10467872B2 (en) 2014-07-07 2019-11-05 Google Llc Methods and systems for updating an event timeline with event indicators
US10977918B2 (en) 2014-07-07 2021-04-13 Google Llc Method and system for generating a smart time-lapse video clip
US9082018B1 (en) 2014-09-30 2015-07-14 Google Inc. Method and system for retroactively changing a display characteristic of event indicators on an event timeline
US9170707B1 (en) 2014-09-30 2015-10-27 Google Inc. Method and system for generating a smart time-lapse video clip
USD893508S1 (en) 2014-10-07 2020-08-18 Google Llc Display screen or portion thereof with graphical user interface
USD782495S1 (en) 2014-10-07 2017-03-28 Google Inc. Display screen or portion thereof with graphical user interface
US11599259B2 (en) 2015-06-14 2023-03-07 Google Llc Methods and systems for presenting alert event indicators
US11082701B2 (en) 2016-05-27 2021-08-03 Google Llc Methods and devices for dynamic adaptation of encoding bitrate for video streaming
US9977434B2 (en) * 2016-06-23 2018-05-22 Qualcomm Incorporated Automatic tracking mode for controlling an unmanned aerial vehicle
US20170371353A1 (en) * 2016-06-23 2017-12-28 Qualcomm Incorporated Automatic Tracking Mode For Controlling An Unmanned Aerial Vehicle
US10657382B2 (en) 2016-07-11 2020-05-19 Google Llc Methods and systems for person detection in a video feed
US11587320B2 (en) 2016-07-11 2023-02-21 Google Llc Methods and systems for person detection in a video feed
US10502579B2 (en) * 2016-10-25 2019-12-10 Here Global B.V. Method and apparatus for determining modal routes between an origin area and a destination area
US11783010B2 (en) 2017-05-30 2023-10-10 Google Llc Systems and methods of person recognition in video streams
US11710387B2 (en) 2017-09-20 2023-07-25 Google Llc Systems and methods of detecting and responding to a visitor to a smart home environment
CN114297437A (en) * 2021-12-29 2022-04-08 重庆紫光华山智安科技有限公司 File display method, device and equipment based on image gathering

Similar Documents

Publication Publication Date Title
US20010043721A1 (en) Method and apparatus for performing motion analysis on an image sequence
EP3497590B1 (en) Distributed video storage and search with edge computing
Collins et al. Algorithms for cooperative multisensor surveillance
Huang et al. Decentralized autonomous navigation of a UAV network for road traffic monitoring
US11328163B2 (en) Methods and apparatus for automated surveillance systems
Jeung et al. Trajectory pattern mining
US5801943A (en) Traffic surveillance and simulation apparatus
Porter et al. Wide-area motion imagery
EP2596630B1 (en) Tracking apparatus, system and method
US20200090504A1 (en) Digitizing and mapping the public space using collaborative networks of mobile agents and cloud nodes
US10929462B2 (en) Object recognition in autonomous vehicles
US20150264296A1 (en) System and method for selection and viewing of processed video
US20150116487A1 (en) Method for Video-Data Indexing Using a Map
Rosenbaum et al. Real-time image processing for road traffic data extraction from aerial images
CN111767432B (en) Co-occurrence object searching method and device
Sysoev et al. Heterogeneous data aggregation schemes to determine traffic flow parameters in regional intelligent transportation systems
Dokuz Weighted spatio-temporal taxi trajectory big data mining for regional traffic estimation
US20180260401A1 (en) Distributed video search with edge computing
JP2021196738A (en) Data collection device for map generation and data collection method for map generation
CN116610849A (en) Method, device, equipment and storage medium for acquiring moving objects with similar tracks
Emiyah et al. Extracting vehicle track information from unstabilized drone aerial videos using YOLOv4 common object detector and computer vision
US20220366575A1 (en) Method and system for gathering information of an object moving in an area of interest
CN113660462A (en) Surrounding ring type mobile vehicle video tracking method based on fusion multi-source data analysis
US10801841B1 (en) Trajectory prediction via a feature vector approach
Zou et al. Traffic flow video image recognition and analysis based on multi-target tracking algorithm and deep learning

Legal Events

Date Code Title Description
AS Assignment

Owner name: SARNOFF CORPORATION, NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KRAVETS, DINA;WAN, SUZ-HSI;REEL/FRAME:011502/0268;SIGNING DATES FROM 20010117 TO 20010123

AS Assignment

Owner name: DARPA, VIRGINIA

Free format text: CONFIRMATORY LICENSE;ASSIGNOR:SARNOFF CORPORATION;REEL/FRAME:011983/0141

Effective date: 20010711

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION