WO2006137072A2 - Wide area security system and method - Google Patents

Wide area security system and method

Info

Publication number
WO2006137072A2
Authority
WO
WIPO (PCT)
Prior art keywords
point
information
security
interest
video camera
Application number
PCT/IL2006/000738
Other languages
French (fr)
Other versions
WO2006137072A3 (en)
Inventor
Ron Zehavi
Original Assignee
Rontal Engineering Applications Ltd.
Application filed by Rontal Engineering Applications Ltd. filed Critical Rontal Engineering Applications Ltd.
Publication of WO2006137072A2 publication Critical patent/WO2006137072A2/en
Priority to GB0800820A priority Critical patent/GB2441491A/en
Publication of WO2006137072A3 publication Critical patent/WO2006137072A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B 13/19613 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 13/19639 Details of the system layout
    • G08B 13/19645 Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 31/00 Predictive alarm systems characterised by extrapolation or other computation using updated historic data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N 21/4135 Peripherals receiving signals from specially adapted client devices external recorder
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/4223 Cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N 21/4334 Recording operations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/16 Analogue secrecy systems; Analogue subscription systems
    • H04N 7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N 7/17309 Transmission or handling of upstream communications
    • H04N 7/17318 Direct or substantially direct transmission and handling of requests
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Abstract

A method and system for providing security to large-scale sites occupied by large numbers of people, comprising a plurality of surveillance sensors, a geographical database (18) for the secured site, and an experts' know-how database (14) holding a plurality of potential scenarios. The system and method according to the invention can handle a large number of inputs, analyze their meaning, prioritize operations, identify threats and produce instructions to the security personnel in response to events taking place in the secured site.

Description

WIDE AREA SECURITY SYSTEM AND METHOD
BACKGROUND OF THE INVENTION
[001] Security for places with large numbers of people, such as transport hubs, highly occupied working places and the like, is of high interest to organizations and establishments. The large number of people, in some cases the high rate of turnover among them, and the difficulty of applying a unified security methodology to a crowd that is hard or even impossible to train for situations requiring security awareness - all these and many other factors create a need for a system and method for planning, applying, controlling and operating an overall security solution.
[002] Systems known in the art provide solutions in which all the information representing an event of interest in an area of interest is presented at a centralized location, where a person in charge, such as a controller, may process the information, extract an estimated evaluation of the upcoming threat and decide on the actions that should be taken in response. In other currently known security systems some of the incoming information, such as video streams, may be filtered by computerized means to screen and pass onward to the controller only information embedded in a video stream that contains, for example, a movement of predefined characteristics. In yet other systems, computerized means may invoke alerts when a detected movement embedded in a video stream matches a predefined pattern of behavior. None of these systems is capable of analyzing potential threats before a system has been tailored to a location, identifying potential threats after it has been installed, training the security staff, and controlling the security staff as well as the crowd under threat in real time. Present-day systems are also unable to integrate inputs from different sources so as to create a unified display combining real-time input, such as from a video camera, with synthetic input, such as underground infrastructure received from an infrastructure database, and the like. Present-day security systems are likewise unable to fuse information received from different types of sensors and databases so as to present an educated, fused picture to an operator according to a pre-defined scenario and / or policy, such as prioritizing these sources by the urgency of their content, by their relevance to the event being handled, or by any other desired policy.
[003] Present-day security systems also fail to provide orientation cues that could help an operator of the security system understand the video picture he or she is viewing during management of a security event; this may turn out to be a very complicated and confusing task, as the camera picture may be of an unknown zoom factor and may point at a place the operator does not recognize by its view. Finally, present-day security systems typically have a high rate of false alarms, and that rate may climb even higher as the complexity of the system grows.
BRIEF DESCRIPTION OF THE DRAWINGS
[004] The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings, in which:
[005] FIG. 1 is a schematic illustration of a security system describing utilization of logical resources, constructed and functioning according to some embodiments of the present invention;
[006] FIG. 2 is a schematic illustration of security system describing utilization of peripheral resources, according to some embodiments of the present invention;
[007] FIGS. 3A and 3B are a schematic block diagram illustration and a schematic side view illustration of a positioning system, respectively, according to some embodiments of the present invention; and
[008] FIG. 4 is a schematic block diagram illustration of a security system according to some embodiments of the present invention.
[009] It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. Still further, functionalities referred to herein below as 'units' may be implemented as a physical unit comprising substantially hardware components, as a logical unit comprising substantially software, or as any combination thereof.
DETAILED DESCRIPTION OF THE INVENTION
[0010] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present invention.
[0011] It should be understood that the present invention may be used in a variety of applications. Although the present invention is not limited in this respect, the system and method disclosed herein may be used in many security installations, such as indoor environments (a shopping center, transportation stations, hotels and the like), outdoor environments (a university campus, a stadium, an airport or seaport and the like) and perimeter lines (the boundary of a sensitive zone such as a power plant, a border line, a pipeline and the like). It should also be understood that while the present application widely discusses inventive security systems and methods, the principles thereof may also be realized and utilized for similar needs, such as the safety of people and / or systems and the protection of the viability and survivability of systems, such as communication systems.
[0012] Maintaining security, especially for a large crowd of people, may impose the need to solve several different problems, such as: what is the nature of the potential threat; whether a threat may be identified at a relatively early stage based on its nature; and when an allegedly coincidental group of inputs shall be translated into an evolving threat. Once a threat has been detected, there remain the questions of what its next developments shall be, how a random crowd should be managed to minimize casualties and harm, and the like. Additionally, there is a need to train the security staff to react fast and accurately when a threat is identified.
[0013] Reference is made now to Fig. 1, which is a schematic illustration of a security system 10 describing utilization of logical resources, constructed and functioning according to the present invention. Security system 10 may comprise a main unit 12, an expert know-how database 14, a situational awareness database 16, a geographical database 18, a planning optimizer unit 20, a decision support unit 22, a training unit
24 and an output activation unit 26. Expert know-how database 14 may comprise a large amount of information describing the performance of security devices, operational methodology models, security handling policies, and the like. This information may be used as a basis for the evaluation of detected events, in order to estimate the threat that they may impose, as well as to administer an on-going threatening event in order to utilize the available security resources to minimize, in the most efficient way, the harm that such a threat may cause, and to steer the protected crowd in the safest way.
[0014] Situational awareness database 16 may comprise information describing abnormal behavior, position descriptors of monitored entities, pre-collected intelligence information, data received from security and safety devices, environmental conditions, analysis of the expected results of potential threats on the environment (such as the expected damage to a building from the explosion of a given bomb at a given distance from that building) and on persons, and the like. Geographical database 18 may comprise geographical data representing at least an area of interest, such as 2-dimensional or 3-dimensional coordinates of a location inside said area of interest. Geographical database 18 may also comprise a 3-D description of the buildings and infrastructure contained in an area of interest. Planning optimizer unit 20 may comprise information about gaps - known and suspected - in security monitoring coverage, profiles of optimized deployment of security resources and the like. Planning optimizer unit 20 may function to optimize security resources management, determined in advance or while a security event is going on. Decision support unit 22 may comprise information on the identification of potential scenarios and may function to recommend responsive actions that need to be taken in response to a developing security event. Training unit 24 may comprise an updateable bank of scenarios and past events and may function to create and monitor training sessions. Activation unit 26 may comprise an appropriate interface supporting the interfacing to, and activation of, any auxiliary device, such as indication and guiding lights, summoning means, public address (PA) means, and the like. Such auxiliary devices may be used to transmit instructions and / or information to other systems and / or to security staff and / or the crowd.
[0015] Main unit 12 may comprise a computing unit which is loadable with appropriate software and equipped with means to perform all the functionalities for combining data from the various units and for analyzing the ongoing incoming information, in order to detect a developing event of interest, recognize the nature and order of magnitude of the threat it may represent, manage the security resources available to it in order to block that recognized threat, and administer the crowd exposed to that threat, as will be explained in detail below by way of examples. Further, main unit 12 may receive information on the progress of a process of response to a threat, such as the evacuation of a crowd from a specific place, and update the operator by displaying that progress to him or her and by invoking updated cues and instructions to the crowd, so as to utilize evacuation passages and means more efficiently and safely. That information on the progress of a response to an event may be collected from sensors utilized by system 10, as will be explained in more detail below.
[0016] Reference is made now also to Fig. 2, which is a schematic illustration of security system 10 describing the utilization of peripheral resources by main unit 12, according to some embodiments of the present invention. As will be explained later, there may be a certain overlap between the units described in connection with Fig. 1 and those described hereinafter in connection with Fig. 2. Main unit 12 may be in active connection with video / audio digital recorder 54, with video matrix 64, with sensors matrix 66, with input / output (I/O) module 68, with video / audio monitors 58, 60, 62, with crowd steering signal unit 56 and with network 70. Audio / video digital recorder 54 may be used to save audio / video streams received from system 10, whether representing raw data received from the various inputs connected to the system, processed data from the system, logging of events, or any combination thereof. Audio / video data stored on digital recorder 54 may be used later for various purposes, such as debriefing of past events and actions, delayed assessment of live input, training, etc. Video matrix 64 may be used to control all the audio / video channels utilized by system 10 so as to connect or disconnect each available audio / video source to or from any available destination, as may be required. Accordingly, video matrix 64 may be connected to digital recorder 54.
[0017] Sensors matrix 66 may be used to enable the connection of each of the sensors utilized by system 10 (not shown) to any available input channel. Inputs connected to sensors matrix 66 may be of the discrete type, such as an input from an alarm system signaling the crossing of a defined line, or a digital or analog input representing a variable which, when its value crosses a pre-defined value, or when it changes over time according to a pre-defined curve, may represent the occurrence of an event of interest.
[0018] I / O module 68 may be used to interface I / O units, such as a keyboard, a pointing device and the like, to system 10. Video / audio monitors 58, 60, 62 may be used for various purposes, such as presenting audio / video streams received from various sources in system 10, presenting analysis of the evolving situation, presenting suggested actions to be taken by security staff, and the like.
Extraction of 3-D information based on a surveillance camera
[0019] Attention is made now to Figs. 3A and 3B, which are a schematic block diagram illustration and a schematic side view illustration of a positioning system 100, respectively. Positioning system 100 may comprise at least one video camera 102, which may be connected to main unit 12. Video camera 102 is capable of capturing at least part of zone of interest 104 within its frame, so that its line of sight (LOS) 106 points at a point of interest 108 within zone of interest 104. The projection of the captured picture of camera 102 on zone of interest 104 may be defined as the field of view (FOV) 109 of camera 102. Typically, point of interest 108 is included in FOV 109. The shape of FOV 109 may vary according to a variety of parameters, such as the shape of the frame of camera 102, the angle of incidence of LOS 106 with the terrain of FOV 109, the optical performance and features of camera 102, and the shape of the terrain covered within the boundaries of FOV 109 (sometimes also called terrain modeling). Video camera 102 may be controlled by main unit 12 so as to point at any desired point within its substantially hemispheric range of coverage. Further, video camera 102 may transmit the coordinates of its LOS to main unit 12, for example its elevation angle and turn angle.
[0020] The 3-D geographical coordinates of video camera 102, as well as its specific performance data (such as zoom range, magnification figure, aspect ratio and the like), may be known from geographical database 18, from sensors matrix 66 or from any other available source of information comprising descriptive data of the installed surveillance equipment. LOS 106 may intercept at least one point of interest 108, so that the 3-D coordinates of the set of points along LOS 106 coincide with at least one point 108 included in the terrain within FOV 109, the planar position data and the height data of which may be obtained, for example, from the 3-D data of point of interest 108 as stored in geographical database 18. In such a case the 3-D data of point of interest 108 may be calculated and stored in main unit 12 for further use. In case a plurality of points 108, 108A and 108B satisfy the conditions defined above, the coordinates of the point closest to camera 102 will be stored in main unit 12. Alternatively, a line-of-sight analysis may be carried out for all such points that satisfy the conditions above and, in order to correctly elect only one of these points as point of interest 108, data from additional sensors, such as another camera 102 placed in a different position and viewing FOV 109, may be used to uniquely solve the correct coordinates of point of interest 108. Accordingly, the 3-D coordinates of any point within the boundaries of FOV 109 may be calculated. Instead of said additional camera 102, which may provide 2-D location information for assisting in the calculation and / or identification of a point of interest 108, different sensors may be used. In case one of said different sensors is a radar sensor, it may typically provide 3-D location information for an investigated entity (typically a distance R and spherical angles θ, φ). Still alternatively, said different sensor may be a line-type sensor (such as a monitored security fence or the like) which may provide 1-D or 2-D location information if crossed by an intruder. Location information received from such a sensor may be used in the manner described above in order to complete missing information on the location of a monitored entity and to remove ambiguity with respect to such a location. Once calculated, these coordinates may be used to synchronize additional security resources to that FOV 109, such as directing other directional security resources (like a video camera, a directional microphone and the like) to point of interest 108 or, if needed, to other points related to point of interest 108; to direct security personnel to it; or to direct the crowd away from it (in case it represents a spot of high risk), and the like.
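As a rough illustration only (this sketch is not part of the patent; the ray-marching approach, function names, step size and the terrain-lookup interface are assumptions of the example), the election of the terrain point closest to the camera along LOS 106 may be rendered in Python along these lines:

    import math

    def los_terrain_intersection(cam_xyz, pan_deg, tilt_deg, terrain_height,
                                 step=0.5, max_range=2000.0):
        """Return the terrain point closest to the camera that the LOS hits,
        or None if the LOS clears the terrain within max_range.

        cam_xyz        -- (x, y, z) camera position, e.g. from geographical database 18
        pan_deg        -- turn angle of the LOS, measured from the x axis
        tilt_deg       -- elevation angle of the LOS (negative = looking down)
        terrain_height -- callable (x, y) -> ground elevation z
        """
        pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
        # Unit direction vector of the LOS.
        dx = math.cos(tilt) * math.cos(pan)
        dy = math.cos(tilt) * math.sin(pan)
        dz = math.sin(tilt)
        x0, y0, z0 = cam_xyz
        r = 0.0
        while r < max_range:
            r += step
            x, y, z = x0 + r * dx, y0 + r * dy, z0 + r * dz
            # The first sample at or below ground level is, by construction,
            # the point closest to the camera that satisfies the conditions.
            if z <= terrain_height(x, y):
                return (x, y, terrain_height(x, y))
        return None

    # Example: a camera 30 m above flat terrain, looking 10 degrees downward.
    # poi = los_terrain_intersection((0.0, 0.0, 30.0), 45.0, -10.0, lambda x, y: 0.0)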
[0021] Based on the calculated 3-D coordinates of point of interest 108, further functions may be activated. Directional surveillance resources, such as a video camera, may be slaved to a specific point of interest 108 either manually or automatically. An operator of system 10 may identify a specific point of interest 108 within FOV 109 and define it as a point to which LOS 106 of camera 102 is to be slaved. Manual slaving may be initiated by pointing (with a pointing device, such as a mouse) at the desired point of interest 108 over a 3-D presentation, from any virtual eye position, whether in outdoor or indoor scenery. In case point of interest 108 is movable, such as when it represents the momentary location of an intruder, LOS 106 of camera 102 may be slaved so as to keep the image located at point of interest 108 at the time of slaving in LOS 106, as long as it is within the following range of camera 102. When in automatic mode, system 10 may compare an image of an entity located at point of interest 108 to a bank of images of potential threats, or may identify the movement of said image as abnormal behavior (as detailed hereinafter), and slave camera 102 to keep that entity substantially on its LOS 106. When LOS 106 of camera 102 is no longer available, the next best camera 102 may be allocated to the task, and when in automatic mode such a camera may be slaved to the momentary location of the specific point of interest (POI) 108 via its LOS 106. By doing so, the operator of system 10 may maintain a continuous track on POI 108. When possible, system 10 may aim up to four directional sensors, such as cameras 102 and the like, at POI 108, whether static or dynamic, to get better coverage of it.
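A minimal sketch of the slaving geometry (again an illustrative assumption, not the patent's own formulation): given the camera position and the 3-D coordinates of POI 108, the pan and tilt commands that put the POI on LOS 106 follow directly, and re-running the computation as the POI moves keeps the camera slaved to it.

    import math

    def slave_camera_to_poi(cam_xyz, poi_xyz):
        """Return (pan_deg, tilt_deg) that point the camera LOS at the POI."""
        dx = poi_xyz[0] - cam_xyz[0]
        dy = poi_xyz[1] - cam_xyz[1]
        dz = poi_xyz[2] - cam_xyz[2]
        pan = math.degrees(math.atan2(dy, dx))                    # turn angle
        tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation angle
        return pan, tilt

    # Example: camera on a 30 m mast, intruder at ground level 100 m to the north-east.
    # pan, tilt = slave_camera_to_poi((0.0, 0.0, 30.0), (70.7, 70.7, 0.0))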
[0022] When geographical database 18 also comprises a 3-D description of the buildings and infrastructure contained in an area of interest, this data may further be integrated so as to calculate the 3-D data of point of interest 108 more accurately and to display such data more descriptively on a picture of area of interest 104.
Planning and Security Gap Monitoring (auditing)
[0023] For better security performance, planning ahead is a key to success. With system 10 built and working according to the present invention, planning becomes an easier job. Based on the information stored in geographical database 18, and further based on the ability of system 10 to match planar coordinates to a point in the field of view of a camera engaged in system 10, as discussed above, a thorough inspection of the terrain in area of interest 104, including analysis of invisible areas created by concealment by the terrain itself or by infrastructure entities such as buildings, may be carried out by system 10. Such analysis may disclose to an operator of system 10 areas which have too low a coverage by the security means of system 10, thus assisting in planning a better security solution. The same features of system 10 may assist in identifying in advance points of weakness of the security envelope provided by system 10 which, if not curable, may be the weak link through which an intrusion or a threat may be expected.
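The coverage audit described above can be pictured as a visibility test run over a grid of terrain points. The following sketch is illustrative only; the sampling resolution, data shapes and names are assumptions, not the patent's specification.

    def visible(cam_xyz, target_xyz, terrain_height, samples=100):
        """True if the straight line from camera to target clears the terrain."""
        (x0, y0, z0), (x1, y1, z1) = cam_xyz, target_xyz
        for i in range(1, samples):
            t = i / samples
            x = x0 + t * (x1 - x0)
            y = y0 + t * (y1 - y0)
            z = z0 + t * (z1 - z0)
            if z < terrain_height(x, y):
                return False        # LOS blocked: the point is concealed
        return True

    def coverage_gaps(cameras, grid_points, terrain_height):
        """Return every terrain point that no camera can see - a candidate
        weak link in the security envelope."""
        return [p for p in grid_points
                if not any(visible(c, p, terrain_height) for c in cameras)]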
[0024] For improved planning of the security system and method according to the present invention, further advantage may be taken of the system's ability to store and simulate scenarios of possible threats. When such a scenario is processed, and specifically when a scenario is used for the training of security personnel, gaps, weak points and malfunctions of the security system are identified and may then be fixed.
Situational Awareness
[0025] As discussed briefly above, situational awareness database 16 may comprise information describing abnormal behavior, position descriptors of monitored entities, pre-collected intelligence information, data received from security and safety devices, environmental conditions, and the like. While normal behavior may be defined as the behavior that would have been expected from a monitored entity in a given situation, abnormal behavior is the complementary one. For example, a man walking along a pavement or a path may be regarded as acting in "normal behavior". In the same manner, a man crossing a garden or a car driving over the lawn may be regarded as acting in "abnormal behavior". The behavior of an entity may be deduced from the way it changes over time, for example. Thus, when the monitored entity is a person, his movement - the first derivative of his location, expressed by the momentary values of 6 dimensions (3 linear and 3 rotational vectors, for example) - may be an example of the representation of "a behavior" of that person. Situational awareness database 16 may comprise definitions and descriptions of abnormal behavior of persons and other entities that may be monitored during the occurrence of an event of interest. These definitions and descriptions may be compared to the actual behavior of a monitored entity in real time, and when an abnormal behavior is detected main unit 12 may be alerted. The level of deviation of a monitored behavior from the 'normal' required for it to be regarded as 'abnormal' may be defined. Monitoring the behavior of an entity in a monitored area may rely on known tracking solutions, while the decision on whether the track performed by the monitored entity over time is within the 'normal' boundaries may take advantage of the combination of infrastructure-describing data with a camera picture, as described above in detail. Additionally, as part of the situational awareness of a system built and functioning according to embodiments of the present invention, a computer-aided entity recognition ability may be supported. Entity recognition may be carried out by cross-linking descriptive information of a monitored entity received from a plurality of sensors and additional sources. For example, the 2-D image of said entity as received by a video camera 102 may be compared to a bank of pre-defined entities and to location information received from another sensor. The 2-D shape of that entity may correspond to more than a single entity found in said bank of entities, the candidates differing from one another in their sizes but having substantially the same shape. In such a case, the location information received from said additional sensor may define the distance of the monitored entity from video camera 102 and, with this, the right entity from the plurality of entities may be decided, as sketched below.
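By way of illustration (an assumption-laden sketch, not the patent's algorithm): under a simple pinhole-camera model, the apparent pixel size of the entity combined with the range reported by the second sensor yields an estimate of its real size, which singles out one candidate from a bank of same-shaped entities.

    def disambiguate_entity(pixel_height, range_m, focal_px, entity_bank):
        """Pick the bank entry whose real-world height best matches the observation.

        pixel_height -- height of the detected silhouette in the image, in pixels
        range_m      -- distance to the entity, supplied by an additional sensor
        focal_px     -- camera focal length expressed in pixels (hypothetical value)
        entity_bank  -- list of (name, height_m) entries sharing the same 2-D shape
        """
        # Pinhole model: real height = pixel height * range / focal length.
        estimated_height = pixel_height * range_m / focal_px
        return min(entity_bank, key=lambda e: abs(e[1] - estimated_height))

    # Example: a 90-pixel silhouette at 60 m range with a 3000-pixel focal length
    # gives an estimated height of 1.8 m, electing "person" over "vehicle".
    # disambiguate_entity(90, 60.0, 3000.0, [("person", 1.8), ("vehicle", 4.5)])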
[0026] The combination of the identification of abnormal behavior of a monitored entity with the ability to identify that entity in a bank of entities may not only dramatically improve the ability of system 10, 100 to identify a potential threat while lowering the rate of false alarms; it may also extend the alert time period by allowing a first alarm to be set earlier.
[0027] As part of the situational awareness capabilities of a security system according to the present invention, when an abnormal behavior of a monitored entity is detected, in addition to a general alarm that may be invoked in the system, automatic or semi-automatic directions may be transmitted to various directional security sensors, such as video cameras or directional microphones, to focus on the zone of that abnormal behavior. Reference is made here to Fig. 4, which is a schematic block diagram of a security system 80 according to some embodiments of the present invention. Data received from sensors 88, which may comprise a video camera, a surveillance microphone, a trespassing sensor and the like, is forwarded to data processing and event prediction unit 82. This data may be processed in view of information stored in sensors database 84, geographical information system (GIS) database 86 and events scenario database 90. Sensors database 84 may store a technical and location description of each of the sensors in the security system, so that a signal received from such a sensor may be fully appreciated, accurately processed and combined in the system. GIS database 86 may comprise geographical information on the area monitored by the security system according to the present invention, such as terrain information (the elevation of points in that area), a description of the infrastructure in that area (buildings, roads, pipeline networks and the like), etc. Events scenario database 90 may comprise a plurality of pre-developed scenarios forecasting possible future developments in response to a set of present events. Based on the information received from sensors 88 and in view of the data retrieved from sensors database 84, GIS database 86 and events scenario database 90, data processing and event prediction unit 82 may process the information and decide whether an abnormal behavior has been detected, according to the principles detailed above. In case an abnormal behavior has been detected (block 93), a signal is transmitted to sensors 88 to focus on that event (block 94) in order to improve and enhance its reflection to the system. An additional signal is transmitted in order to invoke instructions and to provide recommendations (block 96) to the security staff and to the protected crowd, as may be required. Additionally, in case an abnormal behavior has been detected, a signal may be transmitted back to data processing and event prediction unit 82 to serve as an updated part of the information that is continuously processed by this unit; a simplified rendering of this cycle appears below.
[0028] In addition to the above, situational awareness may be based on positional data of monitored entities, received from those entities directly or calculated by the system of the present invention; on early intelligence collected from various sources and stored in the system; and on data representing the environmental conditions of the environment of the monitored system.
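The following Python fragment is a deliberately simplified, hypothetical rendering of the Fig. 4 control flow (block numbers in the comments refer to the figure; the data shapes, the grid-based notion of 'normal zones' and the callback names are assumptions of the example, not the patent's design):

    def process_cycle(reports, normal_zones, focus, announce):
        """One pass of data processing and event prediction unit 82.

        reports      -- list of dicts such as {"sensor": "cam-3", "pos": (12, 40)}
        normal_zones -- set of (x, y) grid cells where movement is expected
        focus        -- callback slaving directional sensors to a position (block 94)
        announce     -- callback issuing staff / crowd instructions (block 96)
        Returns the reports flagged abnormal, fed back into the next pass (block 82).
        """
        flagged = []
        for r in reports:
            cell = (round(r["pos"][0]), round(r["pos"][1]))
            # Abnormal behavior test (block 93): movement outside the expected
            # zones, e.g. a person crossing a lawn instead of using the path.
            if cell not in normal_zones:
                focus(r["pos"])
                announce("abnormal movement at %s reported by %s"
                         % (r["pos"], r["sensor"]))
                flagged.append(r)
        return flagged

    # Example wiring:
    # process_cycle([{"sensor": "cam-3", "pos": (12, 40)}],
    #               normal_zones={(10, 40), (11, 40)},
    #               focus=print, announce=print)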
[0029] Information representing an event of interest, such as a monitored moving entity or an entity defined as exhibiting abnormal behavior, may be displayed to an operator of the system together with complementary information, whether visual, textual, vocal or the like. For example, when an entity exhibiting abnormal behavior has been detected, the real video of this entity, taken from at least one video camera focusing on it, may be displayed against the real-video background around it in the frame of the camera. Onto this picture, as layers of visual information, there may be added information about underground infrastructure of interest; areas of LOS coverage of additional video cameras in the near vicinity; lines of fire of stationary or moving guards, which may be taken from a database or reflected from sensors transmitting their actual positions; a marker next to the monitored entity reflecting its estimated momentary direction of movement and/or its calculated potential for causing harm; and the like. These additional informative layers may be displayed as 'synthetic' layers (i.e. layers involving information calculated by the system) over the background of the real-time video picture, and the system may allow an operator to switch each of these layers on or off from the display, as may be required. In addition, the system may add textual and vocal information corresponding to the displayed data, containing complementary information such as instructions regarding the evolving situation and the like.
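A toggleable "synthetic layer" display of this kind might be organized as in the sketch below; the layer names mirror the examples in the paragraph, while the class design and the drawing callbacks are assumptions of this illustration.

```python
class LayeredDisplay:
    """Real-time video background plus switchable synthetic overlays,
    each of which the operator may turn on or off independently."""

    def __init__(self):
        self.layers = {}        # name -> [draw_fn, enabled]

    def add_layer(self, name, draw_fn, enabled=True):
        self.layers[name] = [draw_fn, enabled]

    def toggle(self, name):
        self.layers[name][1] = not self.layers[name][1]

    def render(self, frame):
        # Start from the live camera frame, then draw each enabled synthetic
        # layer (infrastructure, LOS coverage, lines of fire, movement/threat
        # markers) on top of it, in insertion order.
        for draw_fn, enabled in self.layers.values():
            if enabled:
                frame = draw_fn(frame)
        return frame

# Usage (with hypothetical draw callbacks):
# display = LayeredDisplay()
# display.add_layer("underground_infrastructure", draw_pipes)
# display.add_layer("camera_los_coverage", draw_los, enabled=False)
# display.toggle("camera_los_coverage")   # operator switches the layer on
```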
Decision Support
[0030] The system and method according to some embodiments of the present invention may comprise a decision support unit 22 (Fig. 1), which may assist in taking complicated decisions, specifically in the conditions of an evolving security event. As discussed briefly above, decision support unit 22 may comprise information on the identification of potential scenarios and may function to recommend responsive actions that need to be taken in response to a developing security event. The scenarios may be received from expert know-how unit 14 (Fig. 1). Such a scenario may reflect analysis made in advance by security experts as to the possible meaning or meanings of different combinations of various inputs from sensor matrix 64. The resulting recommendation of decision support unit 22 may be applied automatically, semi-automatically or manually, as per the decision of the personnel in charge of handling the situation. The support given by decision support unit 22 may take the form of presenting a reduced number of possible developments of the situation, a reduced number of actions that need to be taken, an educated "check-list" of operations to be activated by the operator, and the like.
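One plausible shape for such a decision support table, assuming the expert-authored scenarios arrive as condition/recommendation pairs (the scenario content, dictionary keys, and mode names below are hypothetical):

```python
# Each expert-authored scenario pairs a matching condition with a reduced set
# of likely developments and an educated check-list for the operator.
SCENARIOS = [
    {
        "name": "perimeter breach at night",
        "matches": lambda inp: inp["fence_alarm"] and inp["hour"] >= 22,
        "developments": ["intruder moves toward building",
                         "false alarm (animal on fence)"],
        "checklist": ["slave nearest camera to breach point",
                      "dispatch patrol", "notify control room"],
    },
]

def recommend(inputs, mode="semi-automatic"):
    """Return the reduced option set for the personnel in charge; in an
    'automatic' mode the check-list would be executed without confirmation."""
    for s in SCENARIOS:
        if s["matches"](inputs):
            return s["developments"], s["checklist"], mode
    return [], [], mode

# Example query against the hypothetical scenario above:
print(recommend({"fence_alarm": True, "hour": 23}))
```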
Training
[0031] A very important part of a well-functioning security system, such as that of the present invention, is training. A security system is typically characterized by its high demands both for fast response and for intolerance of mistakes. In order to improve the functioning of each member of the personnel involved in carrying out the policy of security system 10 according to the present invention, training unit 24 is comprised in security system 10. Scenarios identified for real-time operation of system 10, as well as imaginary scenarios built on a realistic basis, may be used for training security personnel in lifelike situations. The ability of system 10 to collect and record information from the protected area, as well as to record operations taken during the handling of previous events, may be used during training for debriefing the actual performance of said security personnel, in order to improve it in later training sessions. The same abilities may further be relied upon in improving the utilization of system 10 and all its sub-modes by the security personnel.
[0032] While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those of ordinary skill in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention. It should also be understood that while the present application widely discusses inventive security systems and methods, the principles thereof may also be realized and utilized for similar needs, such as the safety of people and/or systems and the protection of the viability and survivability of systems, such as communication systems.

Claims

What is claimed is:
1. A method comprising: receiving data describing at least one sensor from a sensors database; receiving data describing at least one possible scenario from a scenario database; receiving data describing location information for said at least one sensor from a geographical database; receiving actual data describing the location and movement of at least one monitored entity from said at least one sensor, in combination with location information received from said geographical database; receiving data defining whether a monitored entity is exhibiting abnormal behavior; evaluating said received data to define a predicted scenario; and providing, based on that scenario, instructions to aim controllable sensors at said monitored entity in accordance with said predicted scenario, and instructions and recommendations to security personnel.
2. The method of claim 1, further comprising a step of displaying information representing said predicted scenario to an operator.
3. The method of claim 1, further comprising, prior to said step of evaluating, the step of receiving information describing predefined models and policies.
4. The method of claim 1, further comprising, prior to said step of evaluating, the step of receiving information describing abnormal behavior parameters defined for a plurality of predefined entities observable within a defined zone.
5. The method of claim 1, further comprising, prior to said step of evaluating, the step of receiving information on the identification of potential scenarios.
6. The method of claim 1, wherein the step of evaluating comprises the steps of: combining and analyzing data in order to detect a developing event; and recognizing the nature and order of magnitude of a threat said event may represent.
7. The method of claim 1, further comprising the steps of: receiving information on the progress of a process of response to a threat; and displaying said progress information to an operator.
8. The method of claim 1, further comprising the steps of collecting and recording information during an event.
9. A system comprising: a geographical database comprising location information of entities within a defined zone; a situational awareness database comprising data describing abnormal behavior parameters defined for a plurality of predefined entities observable within said defined zone; an expert know-how database comprising information describing performance of security devices, operational methodology models and security handling policies; and a main unit capable of receiving information from said databases, comparing said received data to predefined patterns, identifying whether a security situation is in progress, and outputting instructions accordingly.
10. The system of claim 9 further comprising a planning optimizer unit comprising information about gaps in security monitoring coverage in said zone and profiles of optimized deployment of security resources; a decision support unit comprising information on the identification of potential scenarios, and a training unit comprising an updateable bank of scenarios and past events, wherein said additional units are in active communication with said main unit.
11. A method comprising: receiving data describing the location and aiming direction of a video camera; receiving data describing terrain elevation of an area at which said video camera is aimed; and calculating the 3-D location of at least one point intercepted by a line of sight of said camera and included in said area, from said data describing the location and aiming direction of said camera and from said data describing the terrain elevation of said area.
12. The method of claim 11, further comprising, if more than one point in said terrain matches the results of said calculations, selecting from said more than one point the point which is closest to said camera.
13. The method of claim 11, further comprising, prior to said step of calculating, receiving data from at least one additional sensor monitoring said area, and comparing the data received from said at least one additional sensor with said data received from said video camera to calculate said 3-D location of said at least one point with better accuracy.
14. The method of claim 11, further comprising slaving said video camera to a defined point in the field of view of said video camera so that its line-of-sight points at that point in response to a command from a user.
15. The method of claim 11, further comprising slaving said video camera to a defined object in the field of view of said video camera so that its line-of-sight points substantially at that object in response to a command from a user.
16. The method of claim 11, further comprising slaving said video camera to a defined object in the field of view of said video camera so that its line-of-sight points substantially at that object if said object has been detected as a threat.
17. The method of claim 16, wherein said detection is carried out by comparing the image of said object to images of potential threats saved in a storage unit.
18. A method comprising the steps of: calculating the three-dimensional locations of points in areas of interest included in the fields of view of a plurality of video cameras; providing, based on the three-dimensional location of a desired point of interest, an identification of at least one video camera from said plurality of video cameras that includes said point of interest in its field of view; and directing, based on the three-dimensional location of said point of interest, at least one video camera that includes said point of interest in its field of view to said point of interest.
19. The method of claim 18, wherein said point of interest is defined by a user.
20. The method of claim 18, wherein said point of interest is defined automatically, by comparing an entity in said field of view to entities in a bank of potential threats, when said comparison is positive.
21. The method of claim 18, wherein said step of directing is done autonomously.
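Purely to illustrate the geometry recited in claims 11, 12 and 18, the following Python sketch marches along a camera's line of sight over a terrain elevation model: the first ground intersection found is, by construction, the matching point closest to the camera. The terrain.elevation(x, y) interface, the camera attributes, the fixed step size, and the simple bearing test are all assumptions of this sketch, not elements of the claims.

```python
import math

def los_ground_point(cam_pos, azimuth, elevation, terrain,
                     max_range=5000.0, step=1.0):
    """Walk outward along the camera's line of sight and return the first
    point whose height reaches the terrain elevation; marching from the
    camera guarantees the closest match is selected (claims 11-12)."""
    x, y, z = cam_pos
    dx = math.cos(elevation) * math.sin(azimuth)   # azimuth measured from +y
    dy = math.cos(elevation) * math.cos(azimuth)
    dz = math.sin(elevation)                       # negative when looking down
    r = 0.0
    while r < max_range:
        r += step                                  # coarse fixed-step march
        px, py, pz = x + dx * r, y + dy * r, z + dz * r
        if pz <= terrain.elevation(px, py):        # ray meets the ground
            return (px, py, terrain.elevation(px, py))
    return None                                    # no terrain intercept in range

def cameras_covering(point, cameras):
    """Claim 18, simplified: identify the cameras whose field of view
    contains the 3-D point of interest (range plus bearing test only;
    bearing wrap-around and occlusion are ignored for brevity)."""
    hits = []
    for cam in cameras:
        bearing = math.atan2(point[0] - cam.x, point[1] - cam.y)
        in_fov = abs(bearing - cam.azimuth) <= cam.half_fov
        in_range = math.dist(point[:2], (cam.x, cam.y)) <= cam.range
        if in_fov and in_range:
            hits.append(cam)
    return hits
```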
PCT/IL2006/000738 2005-06-22 2006-06-22 Wide area security system and method WO2006137072A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0800820A GB2441491A (en) 2005-06-22 2008-01-17 Wide area security and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/158,347 2005-06-22
US11/158,347 US20070008408A1 (en) 2005-06-22 2005-06-22 Wide area security system and method

Publications (2)

Publication Number Publication Date
WO2006137072A2 (en) 2006-12-28
WO2006137072A3 (en) 2009-05-22

Family

ID=37570841

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2006/000738 WO2006137072A2 (en) 2005-06-22 2006-06-22 Wide area security system and method

Country Status (3)

Country Link
US (1) US20070008408A1 (en)
GB (1) GB2441491A (en)
WO (1) WO2006137072A2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10138719A1 (en) * 2001-08-07 2003-03-06 Siemens Ag Method and device for displaying driving instructions, especially in car navigation systems
IL150915A0 (en) * 2002-07-25 2003-02-12 Vet Tech Ltd Imaging system and method for body condition evaluation
US20080049012A1 (en) * 2004-06-13 2008-02-28 Ittai Bar-Joseph 3D Line-of-Sight (Los) Visualization in User Interactive 3D Virtual Reality Environments

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6571024B1 (en) * 1999-06-18 2003-05-27 Sarnoff Corporation Method and apparatus for multi-view three dimensional estimation
US6744397B1 (en) * 2003-06-11 2004-06-01 Honeywell International, Inc. Systems and methods for target location

Also Published As

Publication number Publication date
GB2441491A (en) 2008-03-05
US20070008408A1 (en) 2007-01-11
WO2006137072A3 (en) 2009-05-22
GB0800820D0 (en) 2008-02-27
GB2441491A8 (en) 2008-03-13

Similar Documents

Publication Publication Date Title
US20070008408A1 (en) Wide area security system and method
US20210406556A1 (en) Total Property Intelligence System
US9883165B2 (en) Method and system for reconstructing 3D trajectory in real time
Haering et al. The evolution of video surveillance: an overview
US8310554B2 (en) Method and apparatus for performing coordinated multi-PTZ camera tracking
US20160019427A1 (en) Video surveillence system for detecting firearms
US20200175767A1 (en) Systems and methods for dynamically identifying hazards, routing resources, and monitoring and training of persons
US20160232777A1 (en) Integrative security system and method
KR101321444B1 (en) A cctv monitoring system
US9607501B2 (en) Systems and methods for providing emergency resources
CN101375599A (en) Method and system for performing video flashlight
US20170280107A1 (en) Site sentinel systems and methods
JP6013923B2 (en) System and method for browsing and searching for video episodes
US8860812B2 (en) Ambient presentation of surveillance data
KR20160099931A (en) Disaster preventing and managing method for the disaster harzard and interest area
CN107360394A (en) More preset point dynamic and intelligent monitoring methods applied to frontier defense video monitoring system
CN108206931A (en) A kind of legacy monitoring analysis system
CN115410354B (en) Safety early warning method and device and safety early warning system for industrial factory
KR101005568B1 (en) Intelligent security system
RU2742582C1 (en) System and method for displaying moving objects on local map
KR20190050113A (en) System for Auto tracking of moving object monitoring system
KR20210058783A (en) Object tracking and service execution system with event rules and service settings for each Vid-Fence
KR101780929B1 (en) Image surveillence system for moving object
RU2693926C1 (en) System for monitoring and acting on objects of interest, and processes performed by them and corresponding method
JP2022526071A (en) Situational awareness monitoring

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase
Ref country code: DE
WWW Wipo information: withdrawn in national office
Country of ref document: DE
WWE Wipo information: entry into national phase
Ref document number: 0800820.3
Country of ref document: GB
122 Ep: pct application non-entry in european phase
Ref document number: 06745177
Country of ref document: EP
Kind code of ref document: A2