US20140327766A1 - Methods and apparatus related to improved surveillance using a smart camera - Google Patents
- Publication number
- US20140327766A1 (U.S. application Ser. No. 14/021,750)
- Authority
- US
- United States
- Prior art keywords
- camera
- map
- surveillance
- video
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19665—Details related to the storage of video surveillance data
- G08B13/19671—Addition of non-video data, i.e. metadata, to video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Abstract
Methods and apparatus related to smart surveillance camera operation and implementation are described. A smart surveillance camera stores a map corresponding to an area subject to monitoring. The camera provides at least one image detected by said camera and at least a portion of the stored map in conjunction with the at least one image, e.g., to a wireless communications device of an emergency responder in its local vicinity. Current camera position and/or viewing controls, such as camera angle setting and/or zoom, are sometimes used to determine the portion of the overlay map to be communicated in conjunction with a video stream. Externally detectable trigger events, e.g., from a 911 call or from a gunshot audio detector, and/or internally detectable trigger events, e.g., a detected mob in the camera viewing area, are sometimes used to initiate transmission of a video stream and corresponding map overlay.
Description
- The present application is a continuation of U.S. patent application Ser. No. 11/544,972, filed on Oct. 6, 2006 and entitled “METHODS AND APPARATUS RELATED TO IMPROVED SURVEILLANCE USING A SMART CAMERA” which is hereby expressly incorporated by reference in its entirety.
- The present invention relates generally to apparatus and methods for surveillance, and more particularly to apparatus and methods for providing a video stream with a corresponding map overlay from a smart camera.
- The use of surveillance cameras continues to proliferate for a variety of applications including private security, homeland security, crime detection, traffic management, crowd control, border control, weather condition monitoring and military applications. A set of surveillance cameras being operated in coordination is typically used to provide situation awareness at a system level. The video information from a plurality of cameras may be forwarded to a central control location where the information is available for viewing individually, as a composite and/or built up on an overall site map.
- It would be advantageous if methods and apparatus were developed which provided situational awareness at a local site level of interest, e.g., to an individual emergency responder close to or at the site of interest. The plethora of surveillance cameras being installed can potentially provide a centralized system monitoring station with a very high level of overall surveillance, e.g., within a metropolitan area. However, as the number of cameras increases, the amount of bandwidth needed for supporting concurrent video output feeds to the central control site increases. Video feed resolution can be reduced to allow for a larger number of concurrent camera feeds; however, this tends to limit detection capability at an individual site. There can also be significant communication delays between an individual camera output feed and the central control node site. In addition, there can be significant processing delays in coordinating and/or processing information from a plurality of cameras onto a centralized site map.
- It would be advantageous if the feed from an individual surveillance camera of interest to a particular local responder could be fed directly to that local responder without having to be fed through and processed by a central control node. It would also be beneficial if a camera's output feed could be selectively switched on/off in response to detection of an event, thus limiting the use of valuable limited bandwidth, e.g., air link bandwidth.
- Many surveillance cameras are placed at a fixed geographic location. It would be beneficial if methods and apparatus were developed to make use of that knowledge of local surveillance coordinate information in providing a local responder with additional information, e.g., an additional stream of information providing a local site map overlay with GPS coordinates.
- Various embodiments of the present invention are directed to apparatus and methods for surveillance, and more particularly to apparatus and methods for providing a video stream with a corresponding map overlay from a smart camera. In some embodiments, the corresponding map overlay provided is changed in accordance with changes in the viewing area corresponding to the video stream; e.g., as the camera viewing area changes due to camera angle, camera height, and/or camera zoom setting, the map overlay information provided by the smart camera also changes. In various embodiments in which the smart camera is located at a fixed site, the smart camera stores geographic position information, e.g., information including GPS coordinate information of the camera and GPS coordinate information of the potential viewing area. In some embodiments, in which the smart camera is a mobile smart camera or may be installed temporarily at a location, the smart camera includes at least one of a GPS receiver and/or an interface to a GPS receiver.
- An exemplary method of operating a surveillance camera in accordance with the present invention includes: storing in the camera a map corresponding to an area subject to monitoring by the camera, providing at least one image detected by the camera, and providing at least a portion of said map in conjunction with the at least one image, e.g., to a wireless communications device of an emergency responder in its local vicinity. The communication can be, and sometimes is, via a peer-to-peer wireless communications channel. Externally detectable trigger events, e.g., from a 911 call or from a gunshot audio detector, and/or internally detectable trigger events, e.g., a detected mob in the camera viewing area, are sometimes used to initiate transmission of a video stream and corresponding map overlay.
- While various embodiments have been discussed in the summary above, it should be appreciated that not necessarily all embodiments include the same features and some of the features described above are not necessary but can be desirable in some embodiments. Numerous additional features, embodiments and benefits of the various embodiments are discussed in the detailed description which follows.
- FIG. 1 is a drawing of an exemplary surveillance, monitoring, and communications system implemented in accordance with various embodiments of the present invention.
- FIG. 2 is a drawing of an exemplary smart camera assembly implemented in accordance with the present invention.
- FIG. 3 is a drawing illustrating the exemplary system of FIG. 1 and exemplary signaling in response to a detected event in accordance with the present invention.
- FIG. 4 is a drawing illustrating the exemplary system of FIG. 1 and exemplary signaling in response to another detected event in accordance with the present invention.
- FIG. 5 is a drawing illustrating the exemplary system of FIG. 1 and exemplary signaling in response to a smart camera detection trigger in accordance with the present invention.
- FIG. 6 is a drawing illustrating an exemplary surveillance area including a plurality of surveillance cameras, a plurality of responders, and camera video stream/map overlay signaling being communicated in accordance with various embodiments of the present invention.
- FIG. 7 is a drawing of a flowchart of an exemplary method of operating a surveillance camera in accordance with various embodiments of the present invention.
- FIG. 8 is a drawing illustrating an exemplary system monitoring area composite map in accordance with various embodiments of the present invention.
- FIG. 9 is a drawing illustrating an exemplary smart camera which can be controllably positioned to view different portions of its potential coverage area corresponding to different portions of a stored map.
- FIG. 10 is a drawing illustrating an exemplary system and exemplary signaling in response to a detected event in accordance with various embodiments of the present invention.
- FIG. 11 is a drawing illustrating an exemplary system and exemplary signaling in response to a smart camera detection trigger in accordance with various embodiments of the present invention.
- FIG. 12 is a drawing illustrating an exemplary system and exemplary signaling in accordance with various embodiments of the present invention.
- FIG. 1 is a drawing of an exemplary surveillance, monitoring, and communications system 100 implemented in accordance with various embodiments of the present invention. Exemplary system 100 includes a camera surveillance network, monitoring device networks, a wireless cellular communications network, and an emergency response network coupled together.
- The camera surveillance network includes a plurality of smart cameras (smart camera 1 102, . . . , smart camera N 104) coupled to an IP network router 106 via links (108, 110), respectively. IP camera network router 106 is coupled to central camera monitoring/detection control system 112 via network link 114. Each camera (102, 104) has a corresponding camera surveillance area (camera 1 surveillance area 116, camera N surveillance area 118), respectively.
- A first monitoring network includes a plurality of monitoring devices (sensor device 4 120, sensor device 5 122) coupled to monitoring device set 1 control system 124 via links (126, 128), respectively. For example, sensor device 4 120 is a light sensor and sensor device 5 122 is a temperature sensor. Monitoring device set 1 control system 124 is, e.g., a monitoring system for monitoring environmental condition sensors, e.g., a weather monitoring system operated privately or by a government agency.
- A second monitoring network includes a plurality of monitoring devices (sensor device 1 130, sensor device 2 132, sensor device 3 134) coupled to monitoring device set 2 control system 136 via links (138, 140, 142), respectively. For example, sensor device 1 130 is an air monitor capable of detecting chemical agents, sensor device 2 132 is an audio detector capable of monitoring for and identifying gunshots and/or explosions, and sensor device 3 134 is a radiation monitor. Monitoring device set 2 control system 136 is, e.g., a monitoring system for monitoring various security sensors, e.g., operated under the direction of the homeland security agency.
- The wireless cellular communications network includes a wireless cellular control system 144, e.g., base stations, routers, central offices, and backhaul network infrastructure, and a plurality of wireless communications devices which may be coupled to the cellular network. FIG. 1 illustrates an exemplary wireless communications device, cell phone 146, being used by private individual 148. Cell phone 146 is coupled to wireless cellular control system 144 via wireless communications link 150.
- The emergency response network includes emergency response network control system 152 coupled to a plurality of wireless terminals (WT A 156 of emergency response vehicle 158, WT B 160 of patrol officer 162) via wireless links (164, 166), respectively. Emergency response network control system 152 is coupled to central camera monitoring/detection control system 112, wireless cellular control system 144, monitoring device set 1 control system 124, and monitoring device set 2 control system 136 via network links (168, 170, 172, 174), respectively.
- In accordance with various features of the present invention, an exemplary smart camera stores map information corresponding to its surveillance area and generates output signals including both a video camera viewing image stream and a geo-registered map overlay. In accordance with another feature of various embodiments of the present invention, an exemplary smart camera communicates its camera video stream/geo-registered map overlay to a wireless terminal, e.g., a wireless terminal of a local responder, without the signal having to be processed by a central camera monitoring/detection control system. In FIG. 1, exemplary camera video stream/geo-registered map overlay 176 is being communicated from smart camera 1 102 to WT A 156 without traversing the central camera monitoring/detection control system 112, and exemplary camera video stream/geo-registered map overlay 178 is being communicated from smart camera N 104 to WT B 160 without traversing the central camera monitoring/detection control system 112.
- FIG. 2 is a drawing of an exemplary smart camera assembly 200 implemented in accordance with the present invention. Exemplary smart camera assembly 200 may be any of the exemplary smart cameras (102, 104, 606, 606′, 606″, 608, 608″, 810, 812, 814, 816, 1002, 1004, 1102, 1104, 1202, 1204, 1206) of FIGS. 1, 3, 4, 5, 6, 8, 9, 10, 11 and 12. Exemplary smart camera assembly 200 includes a camera sensor module 202, a processor 204, a timing module 206, a memory 212 and an interface module 214 coupled together via a bus 216 over which the various elements may interchange data and information. Exemplary smart camera assembly 200 includes, in some embodiments, one or more of global positioning system (GPS) module 210 and camera position control module 208 coupled to bus 216. The various components of the exemplary camera assembly 200 are included in a camera housing 201.
- Camera sensor module 202 includes, e.g., a visible spectrum and/or infrared (IR) spectrum image sensor. In some embodiments camera sensor module 202 includes zoom capability. Memory 212 includes routines 218 and data/information 220. The processor 204, e.g., a CPU, executes the routines 218 and uses the data/information 220 in memory 212 to control the operation of smart camera 200 and implement methods in accordance with the present invention. Timing module 206 is used for maintaining accurate time and for time tagging detected images which are stored and/or communicated. Interface module 214, e.g., an IP interface module, includes one or more of wireless interface module 256 and landline interface module 258. Landline interface module 258, e.g., a cable interface, may couple the camera to a private network and/or the Internet. Wireless interface module 256, e.g., including a wireless transmitter module and a wireless receiver module, is coupled to communication antenna 256 which may couple the camera 200 to a private network and/or the Internet. In some embodiments antenna 256 is included as part of camera 200. In some embodiments, antenna 256 is a standalone unit used in conjunction with camera 200. In some embodiments, wireless interface module 256 provides a direct wireless connection to a local wireless terminal, e.g., corresponding to an emergency responder in the local vicinity, providing video stream output and a corresponding geo-registered overlay map.
- Camera position control module 208 is used for controlling the position of the camera, e.g., tilt angle, azimuth angle, and/or elevation, e.g., to change the field of view of the camera. In some embodiments camera position control module 208 also controls the zoom setting adjustment of the camera. GPS module 210, e.g., a module including a GPS receiver and, in some embodiments, GPS related processing, is coupled to associated GPS antenna 211, receives GPS signals, and is used to provide an accurate position of the camera 200. In some embodiments, the GPS related processing includes adjustments for lever arms between the GPS antenna 211 mount location and the camera location. In some embodiments GPS antenna 211 is included as part of camera 200. In some embodiments GPS antenna 211 is a standalone unit coupled to GPS module 210. In some embodiments, the camera is located at a fixed site of known location, and the GPS coordinates of the camera and/or field of view are loaded into the camera for future use, e.g., in generating an overlay map. In some embodiments, such as an embodiment in which the camera is mounted on a moving vehicle, GPS module 210 output and/or camera position control module 208 settings, e.g., tilt angle, are used by the camera view/GPS coordinate position determination module 238.
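The patent does not spell out how camera position and GPS data are combined into viewing-area coordinates, but the underlying geometry can be sketched with a simple flat-ground model. The function below is illustrative only (the function name, the equirectangular metres-per-degree approximation, and the assumption of level terrain are all mine, not the patent's): it estimates the near and far ground range of the view and the GPS coordinate of the view centre from the camera's position, mounting height, azimuth, depression (tilt-below-horizontal) angle, and vertical field of view.

```python
import math

# Approximate metres per degree of latitude (illustrative constant).
EARTH_M_PER_DEG_LAT = 111_320.0

def ground_footprint(lat, lon, height_m, azimuth_deg, depression_deg, vfov_deg):
    """Sketch of a view/GPS coordinate determination, assuming flat ground
    and a depression angle greater than zero.

    Returns (near_range_m, far_range_m, (centre_lat, centre_lon)).
    """
    half = vfov_deg / 2.0
    # Near edge of the view: steepest ray; far edge: shallowest ray.
    near_m = height_m / math.tan(math.radians(depression_deg + half))
    far_angle = depression_deg - half
    # A far edge at or above the horizon never intersects the ground.
    far_m = height_m / math.tan(math.radians(far_angle)) if far_angle > 0 else float("inf")
    centre_m = height_m / math.tan(math.radians(depression_deg))

    # Offset the view centre from the camera position along the azimuth.
    az = math.radians(azimuth_deg)
    d_lat = (centre_m * math.cos(az)) / EARTH_M_PER_DEG_LAT
    d_lon = (centre_m * math.sin(az)) / (EARTH_M_PER_DEG_LAT * math.cos(math.radians(lat)))
    return near_m, far_m, (lat + d_lat, lon + d_lon)
```

For example, a camera 10 m up and tilted 45° below horizontal views the ground about 10 m away; as the far edge of the field of view approaches the horizon the footprint runs off to infinity, which is one reason a practical module would clamp the footprint to the stored map boundary.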
- Routines 218 include a video monitoring module 222, a video stream generation module 224, a map overlay module 226, a display enhancement module 228, a target detection/tracking module 230, an externally detected trigger response module 234, and an output feed module 236. In some embodiments, routines 218 include a camera view/GPS coordinate position determination module 238. Data/information 220 includes stored monitor output data/information 240, generated video stream output data/information 242, geo-registered overlay map output stream data/information 244, display enhancement information 246, stored monitoring area map 247, stored camera view/GPS coordinate position information 248, map portion corresponding to viewing area 249, cluster detection criteria 250, and received external trigger information 252. Received external trigger information 252 includes GPS coordinate information 254, e.g., the GPS coordinates of a person who placed a 911 call, the GPS coordinates of a sensor which initiated the trigger, or the GPS coordinates of the event detected by the sensor which initiated the trigger.
- Video monitoring module 222 processes received signals from the camera sensor module 202 and associates time tags, determined in accordance with timing module 206, with received images. Video monitoring module 222 also stores monitor output information 240 in memory 212. Video stream generation module 224 generates video stream output information 242 to be communicated via interface module 214, e.g., using stored monitor output information 240 and display enhancement information 246. Map overlay module 226 generates, using stored monitoring area map 247 and information 248, a map portion corresponding to a viewing area 249 and geo-registered overlay map output stream information 244 to be communicated in conjunction with said generated video stream 242 via interface module 214. Display enhancement module 228 generates display enhancement information 246, e.g., symbols identifying targets and/or detected events, or additional text to be added such as GPS coordinates of a target or detected event. Target detection/tracking module 230 detects targets of interest within the field of view of the camera, e.g., detecting a mob of people which has formed, detecting and/or tracking a mob which is moving toward or away from a location, detecting an intersection or road which is blocked, detecting a variation in traffic flow, detecting a high temperature location, e.g., a car fire, etc. Target detection/tracking module 230 includes a cluster detection module 232 which uses cluster detection criteria 250 to detect clusters within the field of view, e.g., a cluster of people such as a mob.
- Externally detected trigger response module 234 responds to received signals communicating an externally detected trigger event and/or commanding the camera to respond. For example, an externally detected trigger event may be a detection by an individual or a sensor at a location known to be within the coverage area of the smart camera. For example, an individual whose approximate location is known may place a 911 call to report a crime in progress, an observed accident, or a fire; a sensor whose location is known may record a gunshot or a chemical agent detected in the air. An exemplary response by response module 234 may be the activation of output feed module 236, the incorporation of received data by display enhancement module 228, e.g., adding a target at appropriate GPS coordinates, and/or adjustments of camera positioning control module 208 in response to received GPS coordinates of a target.
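Returning to map overlay module 226: one plausible reading of how it derives map portion 249 from stored monitoring area map 247 and position information 248 is a geo-registered crop, where the stored map is a grid with known GPS bounds and the module slices out the cells covered by the current viewing area. The representation below (a dict with a bounding box and a row-major grid, northernmost row first) is an assumption for illustration, not the patent's storage format.

```python
import math

def crop_map(stored_map, view_bbox):
    """Illustrative geo-registered crop: return the cells of a stored map
    grid that cover the current viewing area.

    stored_map: {"bbox": (lat_min, lon_min, lat_max, lon_max),
                 "grid": 2-D list, row 0 = northernmost row}  # assumed format
    view_bbox:  (lat_min, lon_min, lat_max, lon_max) of the camera view.
    """
    m_lat0, m_lon0, m_lat1, m_lon1 = stored_map["bbox"]
    v_lat0, v_lon0, v_lat1, v_lon1 = view_bbox
    grid = stored_map["grid"]
    rows, cols = len(grid), len(grid[0])
    lat_step = (m_lat1 - m_lat0) / rows
    lon_step = (m_lon1 - m_lon0) / cols

    # Rows count down from the northern edge (lat_max); clamp to the map.
    r0 = max(0, int((m_lat1 - v_lat1) / lat_step))
    r1 = min(rows, math.ceil((m_lat1 - v_lat0) / lat_step))
    c0 = max(0, int((v_lon0 - m_lon0) / lon_step))
    c1 = min(cols, math.ceil((v_lon1 - m_lon0) / lon_step))
    return [row[c0:c1] for row in grid[r0:r1]]
```

A view larger than the map simply clamps to the full stored grid, matching the idea that the camera only ever serves the map it holds for its own surveillance area.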
- Output feed module 236 controls the outputting of a generated video stream output 242 and geo-registered overlay map output stream 244. In some embodiments, the camera 200 typically does not output video and map streams continuously regardless of conditions; rather, output is controlled by a command, an external trigger event, and/or an internal trigger event. In this way, bandwidth usage can be controlled, allowing a large number of cameras in the system to be available for output while tending to avoid congestion. Individual camera 200 may be continuously monitoring and storing monitored output data/information 240, e.g., in a large buffer, such that after an event is detected and output feed module 236 is activated, recent previous information is still available to be communicated via a video stream output signal, in addition to the live video stream output. Output feed module 236, in some embodiments, outputs the generated video stream output 242 at a different rate than the geo-registered overlay map output stream information 244. For example, consider that the camera has a fixed field of vision. In such a case, it may be sufficient to communicate the map overlay at a much lower rate than the generated video stream output, as the map overlay does not change but needs to be available at least once for a new user receiving the output streams. Alternatively, consider that the camera has a variable field of vision, e.g., due to at least one of zoom adjustment capability, camera position control capability, and the camera being a movable camera, e.g., a camera mounted on a vehicle. In such an embodiment, the output feed module 236 may transmit new overlay map information corresponding to each change of the field of vision.
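The behaviour described above can be sketched as a small class that stays silent until triggered, replays recently buffered frames once activated, and sends the map overlay at a lower rate than the video. Everything here (the callback signature, the buffer length, the map-to-video rate ratio) is an illustrative assumption; the patent leaves the buffering and rate details open.

```python
from collections import deque

class OutputFeed:
    """Sketch of an output feed module: silent until triggered, with a
    pre-event buffer and a lower-rate map overlay channel (assumed design)."""

    def __init__(self, send, buffer_frames=100, map_every=30):
        self.send = send                     # callback: send(kind, payload)
        self.buffer = deque(maxlen=buffer_frames)
        self.map_every = map_every           # send the map every N video frames
        self.active = False
        self._count = 0

    def on_frame(self, frame, map_overlay):
        self.buffer.append(frame)            # always record, even when silent
        if not self.active:
            return                           # no output: conserve bandwidth
        if self._count % self.map_every == 0:
            self.send("map", map_overlay)    # map changes rarely: low rate
        self.send("video", frame)
        self._count += 1

    def trigger(self):
        """Activate the feed, flushing the pre-event buffer first so the
        responder also sees what happened just before the trigger."""
        self.active = True
        self._count = 0
        for frame in list(self.buffer):
            self.send("video", frame)
```

A fixed-view camera could set `map_every` very high (the overlay never changes), while a pan/tilt/zoom or vehicle-mounted camera would instead push a fresh overlay on each field-of-view change.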
Camera view/GPS coordinate position determination module 238 is used for determining GPS coordinate information corresponding to the current field of view, e.g., in embodiments where the camera may be moving and/or where the camera field of view may be changed due to zoom setting, tilt setting, rotary setting, and/or elevation setting.
- FIG. 3 is a drawing 300 illustrating exemplary system 100 of FIG. 1 and exemplary signaling in response to a detected event in accordance with the present invention. In camera 1 surveillance area 116, private individual 148 detects crime scene 301 and initiates a 911 call via cell phone 146. Cell phone 146 sends 911 call signal 302 to wireless cellular control system 144. The wireless cellular control system 144 determines the caller's position, e.g., via GPS and/or other location detecting means such as, e.g., signal strength measurements and cell/sector mapping information. The wireless cellular control system 144 sends signal 304 conveying 911 call information and caller location information to the emergency response network control system 152. The emergency response network control system 152 sends signal 306 conveying event trigger information and GPS information to central camera monitoring/detection control system 112. Central camera monitoring/detection control system 112 includes camera selection module 308 which selects a camera or set of cameras to be activated to output a feed to view a region of interest corresponding to the detected event, e.g., based on the GPS coordinates of the detected event, the GPS coordinates of cameras, and/or the GPS coordinates of camera surveillance areas. Central camera monitoring/detection control system 112 sends selected camera activation signal 310 via IP camera network router 106 to smart camera 1 102. In addition, central camera monitoring/detection control system 112 sends signal 312 conveying pertinent camera feed identification information to emergency response network control system 152. Emergency response network control system 152 identifies that emergency response vehicle 158 is in the vicinity of the crime scene 301 and forwards the pertinent camera feed information 312 to wireless terminal A 156.
For example, the pertinent camera feed information 312 includes information used in WT A to tune its receiver, recover signals from camera 1 102 using a designated channel currently allocated to smart camera 1, and decode the information being communicated from smart camera 1, which may be encrypted. In response to received activation signal 310, smart camera 1 102 transmits camera video stream/geo-registered map overlay 314 which is received and processed by wireless terminal A 156, thus providing a real time or near real time live video feed with an overlay map of the area in the vicinity of the crime. Thus, the responder may be able to view a fleeing perpetrator's direction and/or activity at the crime scene, and proceed accordingly.
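The selection performed by camera selection module 308 above — matching the event's GPS coordinates against camera surveillance areas — can be sketched as a point-in-area test with a nearest-camera fallback. Rectangular bounding boxes and the field names are assumptions for illustration; a real deployment might use arbitrary coverage polygons.

```python
def select_cameras(event_gps, cameras):
    """Illustrative camera selection: return the ids of cameras whose
    surveillance area contains the event; if none contains it, fall back
    to the camera whose area centre is closest.

    cameras: list of {"id": str, "bbox": (lat_min, lon_min, lat_max, lon_max)}
    (assumed record layout, not the patent's).
    """
    lat, lon = event_gps
    hits = [c["id"] for c in cameras
            if c["bbox"][0] <= lat <= c["bbox"][2]
            and c["bbox"][1] <= lon <= c["bbox"][3]]
    if hits:
        return hits

    # Fallback: squared distance in degrees to each area centre.
    def centre_dist(c):
        lat0, lon0, lat1, lon1 = c["bbox"]
        return ((lat0 + lat1) / 2 - lat) ** 2 + ((lon0 + lon1) / 2 - lon) ** 2

    return [min(cameras, key=centre_dist)["id"]]
```

Returning every containing camera (rather than one) mirrors the text's "camera or set of cameras to be activated".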
- FIG. 4 is a drawing 400 illustrating exemplary system 100 of FIG. 1 and exemplary signaling in response to another detected event in accordance with the present invention. In camera N surveillance area 118, audio sensor device 132 detects gunshot 402 and signals gunshot detection notification information 404 to monitoring device set 2 control system 136. Monitoring device set 2 control system 136 associates the sensor 132 with a location and sends signal 406 conveying gunshot detection notification information and estimated position information to the emergency response network control system 152. The emergency response network control system 152 sends signal 408 conveying event trigger information and GPS information to central camera monitoring/detection control system 112. Central camera monitoring/detection control system 112 includes camera selection module 308 which selects a camera or set of cameras to be activated to output a feed to view a region of interest corresponding to the detected event, e.g., based on the GPS coordinates of the detected event and/or relevant detection sensor, the GPS coordinates of cameras, and/or the GPS coordinates of camera surveillance areas. Central camera monitoring/detection control system 112 sends selected camera activation signal 410 via IP camera network router 106 to smart camera N 104. In addition, central camera monitoring/detection control system 112 sends signal 412 conveying pertinent camera feed identification information to emergency response network control system 152. Emergency response network control system 152 identifies that patrol officer 162 with wireless terminal B 160 is in the vicinity of the detected gunshot 402 and forwards the pertinent camera feed information 412 to wireless terminal B 160.
For example, the pertinent camera feed information 412 includes information used in WT B to tune its receiver, recover signals from camera N 104 using a designated channel currently allocated to smart camera N, and decode the information being communicated from smart camera N, which may be encrypted. In response to received activation signal 410, smart camera N 104 transmits camera video stream/geo-registered map overlay 414 which is received and processed by wireless terminal B 160, thus providing a real time or near real time live video feed with an overlay map of the area in the vicinity of the detected gunshot.
- FIG. 5 is a drawing 500 illustrating exemplary system 100 of FIG. 1 and exemplary signaling in response to a smart camera detection trigger in accordance with the present invention. In camera 1 surveillance area 116, a mob 502 forms which is detected by a cluster detection module in smart camera 1 102. In response to the detected mob 502, smart camera 1 102 sends signal 504 conveying notification of the detected mob and GPS coordinates of the detected mob via IP camera network router 106 to central camera monitoring/detection control system 112. Central camera monitoring/detection control system 112 sends signal 506 conveying notification of the detected mob, GPS coordinates of the mob, and pertinent camera feed identification information to emergency response network control system 152. Emergency response network control system 152 identifies that emergency response vehicle 158 with wireless terminal A 156 is in the vicinity of the detected mob 502 and forwards the pertinent information 506 to wireless terminal A 156. For example, the pertinent camera feed information includes information used in WT A to tune its receiver, recover signals from camera 1 102 using a designated channel currently allocated to smart camera 1, and decode the information being communicated from smart camera 1, which may be encrypted. In response to the detected mob, smart camera 1 102 transmits camera video stream/geo-registered map overlay 510 which is received and processed by wireless terminal A 156, thus providing a real time or near real time live video feed with an overlay map of the area in the vicinity of the detected mob.
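A cluster detection module of the kind described (later characterized as, e.g., five or more people in close proximity) can be sketched as naive single-link clustering over detected ground positions. The radius and minimum size below are placeholder values standing in for the stored cluster detection criteria 250; the quadratic-time growth loop is fine at the scale of people in one camera view.

```python
def detect_mobs(points, radius=5.0, min_size=5):
    """Illustrative single-link clustering over detected-person positions
    (x, y in metres): grow a cluster by repeatedly absorbing any point
    within `radius` of a current member, and report clusters of
    `min_size` or more as mobs."""
    unassigned = list(points)
    mobs = []
    while unassigned:
        cluster = [unassigned.pop()]
        grew = True
        while grew:
            grew = False
            for p in list(unassigned):
                if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= radius ** 2
                       for q in cluster):
                    cluster.append(p)
                    unassigned.remove(p)
                    grew = True
        if len(cluster) >= min_size:
            mobs.append(cluster)
    return mobs
```

Single-link growth matters here: a chain of people each within a few metres of the next counts as one mob even if its ends are far apart, which matches the intuitive notion of a contiguous crowd.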
FIG. 6 is a drawing 600 illustrating an exemplary surveillance area including a plurality of surveillance cameras, a plurality of responders, and camera video stream/map overlay signaling being communicated in accordance with the present invention. In a metropolitan area that an individual responder may cover, there may be a very large number of smart cameras, e.g., over 1000, from which the responder can potentially receive a camera video stream/map overlay. There are a large number of surveillance cameras; however, only small selected subsets of cameras, corresponding to sites of interest, send camera video stream/map overlay data/information to a responder in the vicinity of the site. An exemplary site of interest is, e.g., a gunshot site, a 911 call site, a crime site, a smart camera detection trigger site such as a detected mob, a sensor detection trigger site, etc. Thus, the high bandwidth which would be required to continually stream video signaling from each of the cameras in the overall surveillance area is not needed. Drawing 600 also includes legend 602, which identifies that star shape 604 represents a site of interest, shape 606 represents a fixed site camera, shape 608 represents a mobile camera, and shape 610 represents a responder which includes a wireless terminal 612.

Site of interest 604′ is within the surveillance area of fixed site smart camera 606′, which generates and communicates camera video stream/map overlay 614 to wireless terminal 612′ of responder 610′, which is in the local vicinity of site 604′.

Site of interest 604″ is within the surveillance areas of fixed site smart camera 606″ and mobile smart camera 608″. Fixed site smart camera 606″ generates and communicates camera video stream/map overlay 616 to wireless terminal 612″ of responder 610″. Mobile smart camera 608″ generates and communicates camera video stream/map overlay 618 to wireless terminal 612″ of responder 610″. In some embodiments, mobile smart camera 608″ includes or is used in conjunction with a GPS receiver to obtain the position of mobile smart camera 608″. In some such embodiments, the smart camera 608″ uses information indicative of camera positioning control, e.g., tilt angle, rotary angle, height setting, and zoom setting, in conjunction with GPS information to generate a map overlay corresponding to the camera surveillance area.
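The selective-streaming idea described above — only cameras near a site of interest transmit to a nearby responder — amounts to a simple proximity filter. The following is an illustrative sketch, not taken from the patent: the function name, the 500 m radius, and the equirectangular distance approximation (adequate at city scale) are all assumptions.

```python
import math

def nearby_cameras(site, cameras, radius_m=500.0):
    """Return IDs of cameras within radius_m of a site of interest.

    site: (lat, lon) of the site of interest.
    cameras: dict mapping camera id -> (lat, lon) position.
    Distances use an equirectangular approximation of the Earth's surface.
    """
    lat0 = math.radians(site[0])
    m_per_deg = 111_320.0  # approximate meters per degree of latitude
    selected = []
    for cam_id, (lat, lon) in cameras.items():
        dx = (lon - site[1]) * m_per_deg * math.cos(lat0)  # east-west offset
        dy = (lat - site[0]) * m_per_deg                   # north-south offset
        if math.hypot(dx, dy) <= radius_m:
            selected.append(cam_id)
    return selected
```

Only the cameras this filter selects would be asked to stream, keeping the overall bandwidth requirement low.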
FIG. 7 is a drawing of a flowchart 700 of an exemplary method of operating a surveillance camera in accordance with the present invention. The surveillance camera may be smart surveillance camera 200 of FIG. 2. The exemplary method starts in step 702, where the camera is powered on and initialized. Operation proceeds from start step 702 to step 704. In step 704, geographic position information is stored in the camera, e.g., corresponding to the location of the camera and the viewing area of coverage corresponding to the camera. Operation proceeds from step 704 to step 706, in which a map is stored in the camera corresponding to an area subject to monitoring by said camera. Then, in step 708, as the camera performs monitoring operations, the camera stores video of the surveillance. In step 710, the camera detects for trigger events. Step 710 includes sub-step 712 and sub-step 714. In sub-step 712, the camera detects for internal trigger events, e.g., (i) a cluster of detected targets in a portion of the viewing area such as the formation of a mob, e.g., 5 or more people in close proximity, (ii) a change of activity in a portion of the viewing area such as, e.g., a restriction in traffic flow, or a rapid motion of a crowd away from or toward a location, and (iii) the presence of a high temperature target, e.g., a car fire in progress. In sub-step 714, the camera detects for reception of a trigger signal from an external source, e.g., a trigger signal from or in response to an audio detector detecting a gunshot, a trigger signal from a 911 call center in response to a received emergency call, or a trigger signal resulting from the detection of an environmental monitoring sensor, e.g., air quality, radiation level, gas content, etc., exceeding a predetermined level. In some embodiments, an external trigger signal is accompanied by location information, e.g., GPS coordinate information indicating the location or approximate location of the event which initiated the trigger.
In some embodiments, the external trigger signal is accompanied by information identifying the cause of the trigger, e.g., gunshot, 911 call, fire alarm, air sensor, radiation sensor, etc. In various embodiments, the camera incorporates at least some of the received additional information in the output video stream, e.g., incorporating a symbol representative of the type of trigger into the video stream and/or incorporating GPS coordinate information. In some embodiments, received information, e.g., GPS external trigger target coordinates, is used by the camera to set and/or determine the viewing area, e.g., by controlling the position and/or zoom settings of the camera. - Operation proceeds from
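The internal trigger of sub-step 712 — flagging a mob of 5 or more people in close proximity — could be implemented as a naive clustering pass over detected target positions. The following is a hypothetical sketch; the 3 m "close proximity" radius and the ground-plane coordinate representation are assumptions, not details from the description.

```python
import math

MOB_SIZE = 5        # threshold from the description: 5 or more people
MOB_RADIUS = 3.0    # assumed "close proximity" radius, in meters

def detect_mob(targets):
    """Return the centroid of a detected mob, or None if no mob is present.

    targets: list of (x, y) ground-plane positions of detected people.
    A mob is flagged when MOB_SIZE or more targets fall within MOB_RADIUS
    of some target (a naive O(n^2) clustering pass, fine for small counts).
    """
    for cx, cy in targets:
        cluster = [(x, y) for x, y in targets
                   if math.hypot(x - cx, y - cy) <= MOB_RADIUS]
        if len(cluster) >= MOB_SIZE:
            xs, ys = zip(*cluster)
            return (sum(xs) / len(cluster), sum(ys) / len(cluster))
    return None
```

The returned centroid would then be converted to GPS coordinates and sent with the trigger notification, as in signal 504 of FIG. 5.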
step 710 to step 716. Instep 716, operation proceeds depending upon whether or not a trigger was detected instep 710. If a trigger was not detected operation returns to step 710 for additional monitoring. However, if a trigger was detected, operation proceeds fromstep 716 to step 718. Instep 718, the camera transmits stored video preceding receipt of said trigger to a central control center. In various embodiments, the video is time tagged and/or location coordinate tagged to allow law enforcement use, e.g., in a crime investigation and/or as evidence in a prosecution. Instep 720, the camera provides at least one image detected by said camera, e.g., the camera provides video of post trigger images. Operation proceeds fromstep 720 to step 722. Instep 722, the camera determines as a function of camera angle, e.g., rotary and/or tilt angle, a portion of said stored map to provide in conjunction with images being provided by said camera. In various embodiments, the camera determines as a function of zoom setting a portion of said stored map to provide in conjunction with images being provided by said camera. In some embodiments, the camera determined as a function of height setting a portion of said stored map to provide in conjunction with images being provided by said camera. In some embodiments in which the camera is a mobile camera, e.g., attached to a patrol car, the camera determines as a function of current camera GPS determined position settings a portion of said stored map to provide in conjunction with images being provided by said camera. In some embodiments, the camera is a fixed mount stationary camera and step 722 is not performed. Operation proceeds fromstep 722 to step 724. - In
step 724, the camera provides at least a portion of said map in conjunction with said at least one image detected by said camera. In some embodiments, providing at least a portion of a map includes providing a different portion of said map as the area viewed by the camera changes, said different portion of said map corresponding to video images being provided by said camera. For example, the portion of the map which is provided corresponds to the same portion to which said at least one image detected by the camera ofstep 720 corresponds, e.g., the map portion is a geo-registered overlay. - Operation proceeds from
step 724 to step 726. Instep 726, the camera is operated to communicate with a portable wireless communications device to supply thereto both video and corresponding portion of a map via a wireless communications channel. Step 726 includes sub-step 728. Insub-step 728, the camera transmits a real time view and a corresponding portion of said stored map to a portable device in wireless communications range of the camera, e.g., as a peer to peer wireless feed. - In some embodiments, the exemplary method includes transmitting target position information to be overlaid on said portion of said map. For example, the target position information is, in some embodiments, a symbol identifying the target and/or GPS coordinate information corresponding to the target. In some embodiments, providing at least a portion of said map in conjunction with at least one image detected by said camera includes transmitting said portion of said map as a data file, which is in addition to said at least one image. In various embodiments, providing at least a portion of said map in conjunction with at least one image detected by said camera includes transmitting said portion of said map as a video image, e.g., a compressed video image, which is in addition to said at least one image. In some implementations, there may be an advantage in transmitting said portion of said map as a video image rather than a specialized data file, in that the receiver can readily recover the map portion using a conventional video format. In some such embodiments providing at least a portion of said map in conjunction with at least one image detected by said camera includes transmitting said portion of said map as a video image which is in addition to said at least one image, said video image including a target symbol overlaid on said portion of said map to indicate the position of a detected target on said portion of said map. 
In some embodiments providing at least a portion of said map in conjunction with at least one image detected by said camera includes transmitting said portion of said map as a video image which is in addition to said at least one image, said video image including target position information, e.g., GPS target coordinate information, overlaid on said portion of said map.
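Burning a target symbol and GPS label into the map image before transmission, as described above, can be illustrated with a toy frame of characters standing in for pixels. This is a hypothetical sketch; a real implementation would draw into an actual image buffer (e.g., with a graphics library) before video encoding.

```python
def overlay_target(map_frame, tx, ty, gps_text):
    """Burn a '+' target symbol and a GPS text label into a map frame.

    map_frame: list of rows of single characters standing in for pixels.
    (tx, ty): target position in frame coordinates.
    gps_text: coordinate string to stamp below the symbol.
    Returns a new frame; the original is left unmodified.
    """
    frame = [row[:] for row in map_frame]        # copy; don't mutate input
    h, w = len(frame), len(frame[0])
    # Stamp a small '+' cross centered on the target position.
    for dx, dy in [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]:
        x, y = tx + dx, ty + dy
        if 0 <= x < w and 0 <= y < h:
            frame[y][x] = '+'
    # Stamp the GPS label two rows below the target, clipped to frame width.
    label_y = min(ty + 2, h - 1)
    for i, ch in enumerate(gps_text[:w]):
        frame[label_y][i] = ch
    return frame
```

Because the symbol and label are part of the image itself, the receiver needs only a conventional video decoder to display them, which is the advantage noted above over a specialized data file.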
FIG. 8 is a drawing 800 illustrating an exemplary system monitoring area composite map in accordance with the present invention. The composite map includes identified geographical features, landmarks, buildings, roads, etc., e.g., houses (820, 822) and road 818. The composite map includes (smart camera 1 stored monitoring area map 802, smart camera 2 stored monitoring area map 804, smart camera 3 stored monitoring area map 806, smart camera 4 stored monitoring area map 808) corresponding to information stored in (smart camera 1 810, smart camera 2 812, smart camera 3 814, smart camera 4 816), respectively. In general, coverage areas of adjacent cameras may be, and sometimes are, overlapping. Each camera stores map information corresponding to its surveillance coverage area.
FIG. 9 is a drawing 900 illustrating exemplary smart camera 1 810, which can be controllably positioned to view different portions of its potential coverage area 802. Stored smart camera 1 monitoring area map 802 includes four exemplary portions (portion 1 of stored smart camera 1 map 902, portion 2 of stored smart camera 1 map 904, portion 3 of stored smart camera 1 map 906, and portion 4 of stored smart camera 1 map 908). As camera angle 910 is changed, the viewing area and the portion of the stored map to be provided with the video stream are changed to correspond to the current view. Zoom setting and/or camera height position setting, in some embodiments, are also used in determining the portion of a stored smart camera map to provide.
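The angle-driven selection of FIG. 9, refined by the zoom setting as noted above, might look like the following crop computation over the stored map. This is an illustrative sketch only: the camera's map pixel position, the base crop size, and the fixed look-ahead distance are assumed parameters, not values from the patent.

```python
import math

def map_portion(map_w, map_h, cam_px, cam_py, pan_deg, zoom):
    """Return the crop rectangle (x, y, w, h) of the stored map to send.

    The camera sits at map pixel (cam_px, cam_py); the crop is centered a
    fixed look-ahead distance along the pan direction, and a higher zoom
    setting narrows the map portion to match the narrower field of view.
    """
    base = 200.0                      # assumed base crop size in map pixels
    look = 100.0                      # assumed look-ahead distance in pixels
    size = base / zoom                # zooming in narrows the map portion
    cx = cam_px + look * math.cos(math.radians(pan_deg))
    cy = cam_py + look * math.sin(math.radians(pan_deg))
    # Clamp the crop so it stays inside the stored map.
    x = min(max(cx - size / 2, 0), map_w - size)
    y = min(max(cy - size / 2, 0), map_h - size)
    return (x, y, size, size)
```

As the pan angle sweeps through the four portions 902-908, the returned rectangle moves across map 802 so the transmitted overlay always tracks the viewing area.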
FIG. 10 is a drawing 1000 illustrating an exemplary system and exemplary signaling in response to a detected event in accordance with the present invention. The exemplary system includes a plurality of smart cameras (smart camera 1 1002, . . . , smart camera N 1004). Each smart camera (1002, 1004) has a corresponding surveillance area (1006, 1008). In camera 1 surveillance area 1006, private individual 1020 detects crime scene 1018 and initiates a 911 call via cell phone 1022. Cell phone 1022 sends 911 call signal 1024 to wireless cellular control system 1026. The wireless cellular control system 1026 determines the caller's position, e.g., via GPS and/or other location detecting means such as, e.g., signal strength measurements and cell/sector mapping information. The wireless cellular control system 1026 sends signal 1028, conveying 911 call information plus caller location information, to the emergency response network control system 1030. The emergency response network control system 1030 considers alternative responders, e.g., emergency response vehicle 1010 including WT A 1012 or patrol officer 1014 using WT B 1016. Emergency response network control system 1030 sends a response request signal 1032 to WT A 1012 of emergency response vehicle 1010, e.g., because the vehicle is closer to the crime scene and/or can reach the crime scene quicker. WT A 1012 sends a video feed request signal 1034 to smart camera 1 1002. The request can be for spooled video, e.g., corresponding to the time of the crime, and/or for a live real time video feed. Smart camera 1 1002 responds, sending camera video stream/geo-registered map overlay 1036 to WT A 1012.
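The control system's choice between alternative responders — dispatching whichever can reach the crime scene quicker — could be sketched as picking the minimum estimated travel time. A hypothetical sketch: the straight-line travel model, responder positions, and speeds are assumed inputs for illustration, not part of the described system.

```python
import math

def pick_responder(responders, scene):
    """Return the id of the responder with the lowest estimated arrival time.

    responders: dict mapping responder id -> ((x, y) position, speed in m/s).
    scene: (x, y) position of the reported event.
    Straight-line travel at the responder's speed is assumed.
    """
    def eta(item):
        (x, y), speed = item[1]
        return math.hypot(x - scene[0], y - scene[1]) / speed
    return min(responders.items(), key=eta)[0]
```

Here a vehicle 300 m away at 15 m/s (ETA 20 s) would be chosen over an officer on foot 100 m away at 1.5 m/s (ETA about 67 s), matching the "can reach the crime scene quicker" criterion above.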
FIG. 11 is a drawing 1100 illustrating an exemplary system and exemplary signaling in response to a smart camera detection trigger in accordance with the present invention. The exemplary system includes a plurality of surveillance cameras (smart camera 1 1102, . . . , smart camera N 1104), each with a corresponding surveillance area (1106, 1108), respectively. In camera 1 surveillance area 1106, a mob 1110 forms, which is detected by a cluster detection module in smart camera 1 1102. In response to the detected mob 1110, smart camera 1 1102 starts broadcasting camera video stream/geo-registered map overlay 1120, which is received by WT A 1114 of emergency responder vehicle 1112 and WT B 1118 of patrol officer 1116, which are both in the local vicinity of smart camera 1 1102.

In addition, smart camera 1 1102, in some embodiments, receives signals from one or more wireless terminals, e.g., wireless terminal A 1114, requesting information to be communicated, e.g., requesting a camera video stream/geo-registered map overlay corresponding to a previous video recording, e.g., during a time interval including the detected trigger event. In some embodiments, such a transmission is communicated on a second communications channel in addition to the real time view being communicated.
FIG. 12 is a drawing 1200 illustrating an exemplary system and exemplary signaling in accordance with the present invention. The exemplary system includes a plurality of surveillance cameras (smart camera 1 1202, smart camera 2 1204, . . . , smart camera N 1206), each with a corresponding surveillance area (1208, 1210, 1212), respectively. Emergency response vehicle 1214 includes wireless terminal A 1216, which is capable of communicating with a smart camera, e.g., in its current local vicinity, via a wireless communications link. The wireless communications link can be, and sometimes is, a peer to peer wireless communications link. WT A 1216, which is in the vicinity of smart camera 1 1202, sends a video feed request signal 1218 to smart camera 1 1202; smart camera 1 1202 responds with a camera video stream/geo-registered map overlay signal 1220, providing a view and an overlay map of camera 1 surveillance area. The emergency response vehicle continues on its route, passing the surveillance area of smart camera 2 1204 without requesting information from smart camera 2 1204. When the emergency response vehicle gets in the local vicinity of smart camera N 1206, WT A 1216 detects an alert signal from smart camera N 1206, e.g., as smart camera N has recently detected a trigger event, e.g., the formation of mob 1222, in its surveillance area 1212. WT A 1216 sends a video feed request signal 1224 to smart camera N 1206, which, in turn, transmits camera video stream/geo-registered map overlay 1228 to WT A 1216. As emergency vehicle 1214 enters the camera N surveillance area 1212, the emergency vehicle 1214 is observable in the real time video stream/geo-registered map overlay being transmitted by smart camera N 1206.

In various embodiments, elements described herein are implemented using one or more modules to perform the steps corresponding to one or more methods of the present invention. Thus, in some embodiments, various features of the present invention are implemented using modules.
Such modules may be implemented using software, hardware or a combination of software and hardware. Many of the above described methods or method steps can be implemented using machine executable instructions, such as software, included in a machine readable medium such as a memory device, e.g., RAM, floppy disk, etc. to control a machine, e.g., general purpose computer with or without additional hardware, to implement all or portions of the above described methods, e.g., in one or more nodes. Accordingly, among other things, the present invention is directed to a machine-readable medium including machine executable instructions for causing a machine, e.g., processor and associated hardware which may be part of a test device, to perform one or more of the steps of the above-described method(s).
- Numerous additional variations on the methods and apparatus of the present invention described above will be apparent to those skilled in the art in view of the above description of the invention. Such variations are to be considered within the scope of the invention.
Claims (20)
1. A method of operating a camera, comprising:
storing in said camera a map corresponding to an area subject to monitoring by said camera;
providing, as an output of said camera, at least one image detected by said camera; and
providing, as an additional output of said camera, at least a portion of said map in conjunction with said at least one image detected by said camera.
2. The method of claim 1 , wherein the portion of said map which is provided corresponds to the same portion to which said at least one image detected by said camera corresponds.
3. The method of claim 2 , wherein the portion of said map is a first map overlay.
4. The method of claim 2 , wherein providing, as an output of said camera, at least one image detected by said camera includes providing video; and
wherein providing, as an additional output of said camera, at least a portion of said map includes providing different portions of said map as the area viewed by said camera changes, said different portion of said map corresponding to video images being provided by said camera.
5. The method of claim 4 , further comprising:
storing geographical position information in said camera; and
determining, within said camera, as a function of a camera angle the portion of said map to provide as the additional output of said camera, in conjunction with images being provided by said camera.
6. The method of claim 4 , wherein said camera includes a wireless communications transmitter and receiver for supplying video and at least a portion of said map corresponding to supplied video via a wireless communications link, the method further comprising:
operating said camera to communicate with a portable wireless communications device and to supply thereto both video and at least a portion of said map via a wireless communications link.
7. The method of claim 6 , further comprising:
receiving, at said camera, a trigger signal from a source external to said camera; and
transmitting said video and corresponding portion of a map in response to said trigger.
8. The method of claim 7 , further comprising:
storing in said camera video for a period of time; and
transmitting stored video preceding receipt of said trigger signal to a central control center.
9. The method of claim 7 , further comprising:
transmitting, from said camera, a real time view and a corresponding portion of said stored map to a portable device in wireless communications range of said camera.
10. The method of claim 7 , wherein said camera includes an event detection module for detecting clusters of targets, the method further comprising:
detecting, within said camera, when a cluster of targets are being viewed by said camera in a portion of a viewing area; and
outputting, from said camera, video and at least a portion of said map when said event detection module detects a cluster of targets.
11. The method of claim 4 , further comprising:
outputting, from said camera, a different map overlay when there is a change in a viewing area corresponding to a video stream being output by said camera.
12. The method of claim 1 ,
wherein said camera includes a camera housing; and
wherein said storing in said camera the map includes storing said map in a memory device located inside said camera housing; and
wherein providing, as an output of said camera, at least one image detected by said camera; and
wherein providing, as an additional output of said camera, at least a portion of said map in conjunction with said at least one image detected by said camera both include sending signals from inside said camera housing to outside said camera housing via an interface module.
13. The method of claim 1 , further comprising:
transmitting, from said camera, target position information to be overlaid on said portion of said map.
14. The method of claim 1 ,
wherein providing at least a portion of said map in conjunction with said at least one image detected by said camera includes transmitting said portion of said map as a video image which is in addition to said at least one image, said video image including a target symbol overlaid on said portion of said map to indicate the position of a detected target on said portion of said map.
15. A monitoring system, comprising:
a first surveillance camera including:
a stored map that corresponds to a surveillance area subject to monitoring by said camera;
an image sensor for sensing at least one of visible light and infrared energy; and
an interface for outputting sensed images and at least a portion of said stored map which corresponds to at least one image sensed by said first surveillance camera.
16. The monitoring system of claim 15 , wherein said portion of said stored map is a map overlay corresponding to an area which is the same as an area to which said at least one image sensed by said first surveillance camera corresponds.
17. The monitoring system of claim 15 ,
wherein said camera further includes a camera housing;
wherein said stored map is included in a memory located within said camera housing;
wherein said image sensor and said interface are at least partially located inside said camera housing; and
wherein said interface includes a wireless interface module for supporting wireless communications with a wireless terminal.
18. The monitoring system of claim 17 , wherein said first camera further comprises:
a trigger event detection module, located within said camera housing, for responding to an external trigger signal.
19. The monitoring system of claim 17 , wherein said first surveillance camera further comprises:
a cluster detection module, located within said camera housing, for detecting a cluster of targets within a viewing area of said first surveillance camera.
20. The monitoring system of claim 17 , wherein said system further includes:
a plurality of additional surveillance cameras, at least some individual surveillance cameras in said plurality of surveillance cameras including:
a stored map corresponding to a surveillance area subject to monitoring by an individual surveillance camera in which the map is stored;
an image sensor for sensing at least one of visible light and infrared energy; and
a wireless interface configured to support wireless communications with one or more wireless terminals and output, from said individual surveillance camera, sensed images and at least a portion of the map stored in said individual camera which corresponds to a sensed image which is output from said individual surveillance camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/021,750 US20140327766A1 (en) | 2006-10-06 | 2013-09-09 | Methods and apparatus related to improved surveillance using a smart camera |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/544,972 US8531521B2 (en) | 2006-10-06 | 2006-10-06 | Methods and apparatus related to improved surveillance using a smart camera |
US14/021,750 US20140327766A1 (en) | 2006-10-06 | 2013-09-09 | Methods and apparatus related to improved surveillance using a smart camera |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/544,972 Continuation US8531521B2 (en) | 2006-10-06 | 2006-10-06 | Methods and apparatus related to improved surveillance using a smart camera |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140327766A1 true US20140327766A1 (en) | 2014-11-06 |
Family
ID=39301484
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/544,972 Active 2031-06-21 US8531521B2 (en) | 2006-10-06 | 2006-10-06 | Methods and apparatus related to improved surveillance using a smart camera |
US14/021,750 Abandoned US20140327766A1 (en) | 2006-10-06 | 2013-09-09 | Methods and apparatus related to improved surveillance using a smart camera |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/544,972 Active 2031-06-21 US8531521B2 (en) | 2006-10-06 | 2006-10-06 | Methods and apparatus related to improved surveillance using a smart camera |
Country Status (1)
Country | Link |
---|---|
US (2) | US8531521B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9923514B1 (en) * | 2017-01-26 | 2018-03-20 | Face International Corporation | Security and tracking systems including energy harvesting components for providing autonomous electrical power |
US11310637B2 (en) | 2017-01-26 | 2022-04-19 | Face International Corporation | Methods for producing security and tracking systems including energy harvesting components for providing autonomous electrical power |
KR20160012575A (en) * | 2014-07-24 | 2016-02-03 | 삼성전자주식회사 | Disaster alerting server and disaster alerting method thereof |
FR3025389A1 (en) * | 2014-08-29 | 2016-03-04 | Tpsh | INTELLIGENT AND INDEPENDENT INFRARED CAMERA |
US9944282B1 (en) | 2014-11-13 | 2018-04-17 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle automatic parking |
US10594983B2 (en) * | 2014-12-10 | 2020-03-17 | Robert Bosch Gmbh | Integrated camera awareness and wireless sensor system |
IL236752B (en) * | 2015-01-15 | 2019-10-31 | Eran Jedwab | An integrative security system and method |
JP6679607B2 (en) * | 2015-03-03 | 2020-04-15 | ボルボトラックコーポレーション | Vehicle support system |
US9959109B2 (en) | 2015-04-10 | 2018-05-01 | Avigilon Corporation | Upgrading a physical security system having multiple server nodes |
US10565455B2 (en) * | 2015-04-30 | 2020-02-18 | Ants Technology (Hk) Limited | Methods and systems for audiovisual communication |
EP3096290B1 (en) | 2015-05-19 | 2018-07-18 | Axis AB | Method and system for determining camera pose |
US9841259B2 (en) | 2015-05-26 | 2017-12-12 | Digital Ally, Inc. | Wirelessly conducted electronic weapon |
US10013883B2 (en) | 2015-06-22 | 2018-07-03 | Digital Ally, Inc. | Tracking and analysis of drivers within a fleet of vehicles |
US10397484B2 (en) * | 2015-08-14 | 2019-08-27 | Qualcomm Incorporated | Camera zoom based on sensor data |
US11107365B1 (en) | 2015-08-28 | 2021-08-31 | State Farm Mutual Automobile Insurance Company | Vehicular driver evaluation |
US10979673B2 (en) * | 2015-11-16 | 2021-04-13 | Deep North, Inc. | Inventory management and monitoring |
WO2017086771A1 (en) * | 2015-11-18 | 2017-05-26 | Ventionex Technologies Sdn. Bhd. | A visual surveillance system with target tracking or positioning capability |
JP6606436B2 (en) * | 2016-01-19 | 2019-11-13 | ウイングアーク1st株式会社 | Information visualization system |
US10395332B1 (en) | 2016-01-22 | 2019-08-27 | State Farm Mutual Automobile Insurance Company | Coordinated autonomous vehicle automatic area scanning |
US11719545B2 (en) | 2016-01-22 | 2023-08-08 | Hyundai Motor Company | Autonomous vehicle component damage and salvage assessment |
US9940834B1 (en) | 2016-01-22 | 2018-04-10 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle application |
US11441916B1 (en) | 2016-01-22 | 2022-09-13 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle trip routing |
US11242051B1 (en) | 2016-01-22 | 2022-02-08 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle action communications |
US10824145B1 (en) | 2016-01-22 | 2020-11-03 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle component maintenance and repair |
US10324463B1 (en) | 2016-01-22 | 2019-06-18 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle operation adjustment based upon route |
US10134278B1 (en) | 2016-01-22 | 2018-11-20 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle application |
US20190208168A1 (en) * | 2016-01-29 | 2019-07-04 | John K. Collings, III | Limited Access Community Surveillance System |
WO2017136646A1 (en) | 2016-02-05 | 2017-08-10 | Digital Ally, Inc. | Comprehensive video collection and storage |
MX2018010226A (en) | 2016-02-26 | 2018-11-19 | Amazon Tech Inc | Sharing video footage from audio/video recording and communication devices. |
US11393108B1 (en) * | 2016-02-26 | 2022-07-19 | Amazon Technologies, Inc. | Neighborhood alert mode for triggering multi-device recording, multi-camera locating, and multi-camera event stitching for audio/video recording and communication devices |
US10841542B2 (en) | 2016-02-26 | 2020-11-17 | A9.Com, Inc. | Locating a person of interest using shared video footage from audio/video recording and communication devices |
US10489453B2 (en) | 2016-02-26 | 2019-11-26 | Amazon Technologies, Inc. | Searching shared video footage from audio/video recording and communication devices |
US10397528B2 (en) | 2016-02-26 | 2019-08-27 | Amazon Technologies, Inc. | Providing status information for secondary devices with video footage from audio/video recording and communication devices |
US9965934B2 (en) | 2016-02-26 | 2018-05-08 | Ring Inc. | Sharing video footage from audio/video recording and communication devices for parcel theft deterrence |
US10748414B2 (en) | 2016-02-26 | 2020-08-18 | A9.Com, Inc. | Augmenting and sharing data from audio/video recording and communication devices |
KR102586962B1 (en) * | 2016-04-07 | 2023-10-10 | 한화비전 주식회사 | Surveillance system and controlling method thereof |
US10521675B2 (en) | 2016-09-19 | 2019-12-31 | Digital Ally, Inc. | Systems and methods of legibly capturing vehicle markings |
KR102596485B1 (en) * | 2016-12-15 | 2023-11-01 | 한화비전 주식회사 | Apparatus and method for registering camera |
US10911725B2 (en) | 2017-03-09 | 2021-02-02 | Digital Ally, Inc. | System for automatically triggering a recording |
US10785458B2 (en) * | 2017-03-24 | 2020-09-22 | Blackberry Limited | Method and system for distributed camera network |
US10990831B2 (en) * | 2018-01-05 | 2021-04-27 | Pcms Holdings, Inc. | Method to create a VR event by evaluating third party information and re-providing the processed information in real-time |
US10574890B2 (en) | 2018-01-12 | 2020-02-25 | Movidius Ltd. | Methods and apparatus to operate a mobile camera for low-power usage |
US10885689B2 (en) | 2018-07-06 | 2021-01-05 | General Electric Company | System and method for augmented reality overlay |
US11080877B2 (en) | 2018-08-02 | 2021-08-03 | Matthew B. Schoen | Systems and methods of measuring an object in a scene of a captured image |
US11024137B2 (en) | 2018-08-08 | 2021-06-01 | Digital Ally, Inc. | Remote video triggering and tagging |
US10915995B2 (en) | 2018-09-24 | 2021-02-09 | Movidius Ltd. | Methods and apparatus to generate masked images based on selective privacy and/or location tracking |
US11678011B1 (en) * | 2019-04-17 | 2023-06-13 | Kuna Systems Corporation | Mobile distributed security response |
US20210368141A1 (en) * | 2020-05-25 | 2021-11-25 | PatriotOne Technologies | System and method for multi-sensor threat detection platform |
US11368586B2 (en) * | 2020-09-04 | 2022-06-21 | Fusus, Inc. | Real-time crime center solution with dispatch directed digital media payloads |
US11443613B2 (en) * | 2020-09-04 | 2022-09-13 | Fusus, Inc. | Real-time crime center solution with text-based tips and panic alerts |
US11950017B2 (en) | 2022-05-17 | 2024-04-02 | Digital Ally, Inc. | Redundant mobile video recording |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010010541A1 (en) * | 1998-03-19 | 2001-08-02 | Fernandez Dennis Sunga | Integrated network for monitoring remote objects |
US20060001757A1 (en) * | 2004-07-02 | 2006-01-05 | Fuji Photo Film Co., Ltd. | Map display system and digital camera |
US20060222209A1 (en) * | 2005-04-05 | 2006-10-05 | Objectvideo, Inc. | Wide-area site-based video surveillance system |
US20080068151A1 (en) * | 2006-09-13 | 2008-03-20 | Dror Ouzana | Surveillance system and method for optimizing coverage of a region of interest by a sensor |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0728980A (en) * | 1993-07-09 | 1995-01-31 | Kenichi Hyodo | Map information system |
US6084510A (en) * | 1997-04-18 | 2000-07-04 | Lemelson; Jerome H. | Danger warning and emergency response system and method |
US6392692B1 (en) * | 1999-02-25 | 2002-05-21 | David A. Monroe | Network communication techniques for security surveillance and safety system |
US7385626B2 (en) * | 2002-10-21 | 2008-06-10 | Sarnoff Corporation | Method and system for performing surveillance |
US20060007308A1 (en) * | 2004-07-12 | 2006-01-12 | Ide Curtis E | Environmentally aware, intelligent surveillance device |
US7456847B2 (en) * | 2004-08-12 | 2008-11-25 | Russell Steven Krajec | Video with map overlay |
US7688203B2 (en) * | 2006-01-12 | 2010-03-30 | Alfred Gerhold Rockefeller | Surveillance device by use of digital cameras linked to a cellular or wireless telephone |
US8018472B2 (en) * | 2006-06-08 | 2011-09-13 | Qualcomm Incorporated | Blending multiple display layers |
Family application timeline:
- 2006-10-06: US US11/544,972, granted as US8531521B2 (Active)
- 2013-09-09: US US14/021,750, published as US20140327766A1 (Abandoned)
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9923514B1 (en) * | 2017-01-26 | 2018-03-20 | Face International Corporation | Security and tracking systems including energy harvesting components for providing autonomous electrical power |
US10110163B2 (en) | 2017-01-26 | 2018-10-23 | Face International Corporation | Security and tracking systems including energy harvesting components for providing autonomous electrical power |
US11310637B2 (en) | 2017-01-26 | 2022-04-19 | Face International Corporation | Methods for producing security and tracking systems including energy harvesting components for providing autonomous electrical power |
Also Published As
Publication number | Publication date |
---|---|
US20080084473A1 (en) | 2008-04-10 |
US8531521B2 (en) | 2013-09-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8531521B2 (en) | Methods and apparatus related to improved surveillance using a smart camera | |
US10341619B2 (en) | Methods, systems, and products for emergency services | |
US7250853B2 (en) | Surveillance system | |
US10582162B2 (en) | Image information collecting system and method for collecting image information on moving object | |
US7929010B2 (en) | System and method for generating multimedia composites to track mobile events | |
US8319833B2 (en) | Video surveillance system | |
KR101550715B1 (en) | Video data offer system and method using antenna of mobile communication base station | |
US20070155325A1 (en) | Modular communications apparatus and method | |
WO2009113755A1 (en) | Monitoring system using unmanned air vehicle with wimax communication | |
WO2018205844A1 (en) | Video surveillance device, surveillance server, and system | |
US20120063270A1 (en) | Methods and Apparatus for Event Detection and Localization Using a Plurality of Smartphones | |
US11076131B2 (en) | Video collection system, video collection server, video collection method, and program | |
KR20160002119A (en) | Closed circuit television control system and control method the same | |
KR101654181B1 (en) | A security closed circuit television emergency call bell system using smart phone apps and smart tags | |
JP4602877B2 (en) | Communication system using position information of communication device | |
WO2005081530A1 (en) | Monitoring system, monitoring device, and monitoring method | |
US20160219011A1 (en) | A method and platform for sending a message to a communication device associated with a moving object | |
JP2009205227A (en) | Drive recorder | |
RU2244641C1 (en) | Information-security system for vehicles and immovable property objects | |
JP2014143477A (en) | Imaging system | |
CN111769898B (en) | Method and system for defending unmanned aerial vehicle | |
JP2017181417A (en) | Information processing device, method, and system | |
JP2010239186A (en) | Mobile communication device, mobile communication motorbike, mobile communication bicycle and mobile communication system | |
KR101751810B1 (en) | Dual cctv and operating method thereof | |
KR101634967B1 (en) | Application multi-encoding type system for monitoring region on bad visuality based 3D image encoding transformation, and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |