|Publication number||US5689442 A|
|Application number||US 08/408,901|
|Publication date||18 Nov 1997|
|Filing date||22 Mar 1995|
|Priority date||22 Mar 1995|
|Also published as||WO1999012354A1|
|Inventors||Daniel R. Swanson, Jerry M. Moen, Bradley M. Tate|
|Original Assignee||Witness Systems, Inc.|
1. Technical Field of the Invention
The present invention relates to surveillance systems and, in particular, to a surveillance system for capturing and storing information concerning events of interest for subsequent use in investigations and courtroom proceedings.
2. Description of Related Art
Human eyewitnesses to events oftentimes provide the most important or only sources of evidence available to investigators and triers of fact in determining what actually occurred during an event of interest. Unfortunately, due in part to known frailties of human nature, the perceptions and recollections of multiple eyewitnesses to an event of interest tend to conflict with one another and may also conflict with the physical evidence collected from the scene of the event. Eyewitnesses to events of interest have also been known to embellish or fabricate portions of their recollection of the event, with the unfortunate result of leading investigators and fact finders to incorrect conclusions. The factual accuracy of human eyewitness accounts is especially called into question when the event of interest occurred either unexpectedly or over a short period of time. Another concern with relying upon eyewitness accounts is that the witness to the event of interest may be unwilling or unable (perhaps due to injury or death) to assist investigators and to provide information helpful in reconstructing the event.
To address the foregoing concerns regarding the efficacy of relying on human eyewitness accounts in the investigation of events of interest, attention has been focused on the development of mechanical and electronic surveillance systems for witnessing and recording event information. One mechanical system installed within a vehicle senses changes in vehicle brake fluid pressure to automatically take a photograph at or near the time of an accident. Electronic surveillance systems have been used in homes and businesses to record events of interest on video tape, with the recorded images being useful in civil and criminal investigations. For example, stores commonly use surveillance systems to monitor both customers and employees, with the recorded information being useful in investigating robberies, thefts, and claims of negligence (e.g., slip and fall claims).
Electronic video surveillance systems commonly record information with recorders and video cassettes having an endless loop of tape. With such a recorder and media, "older" recorded information is overwritten and thus erased by "newer" recorded information until recordation is either manually or automatically terminated in response to the occurrence of an event of interest. If the occurrence of an event is not timely recognized and the recordation of events terminated, then event information stored on the endless loop of tape is likely to be overwritten and lost. Conversely, if an event is incorrectly recognized as being "of interest", then recordation will be incorrectly and untimely terminated and the system will not record subsequently occurring events of interest.
An alternative to the use of endless loop of tape video cassettes is to instead use a conventional long playing tape and institute a procedure for periodically replacing and storing the tape. The use of such conventional tapes in video surveillance systems requires continuous attention on the part of the user to avoid situations where recordation of an event is missed because the tape runs out of space. Another drawback of such systems is that a significant amount of space must be provided for storing previously recorded tapes. Even with adequate tape storage space, there still exists a chance that a tape having a previously recorded event of interest will be inadvertently reused prior to discovery that the tape contained a recorded event of interest. In such a case, the previously recorded event information will be irretrievably lost.
Tape recorder based surveillance systems suffer from other known drawbacks as well. For example, due to their continued use, the mean time between failure of key components (like the tape head) is relatively short. The recorders further suffer from a drop-out problem where one or more frames of information are periodically lost. The recorders further do not provide for automatically indexing the recorded data which is helpful in retrieving data. Recorders also do not provide for automatically encrypting the data.
Some surveillance systems utilize more than one device to simultaneously obtain information on events. With such systems, it is imperative that some procedure or apparatus be used to correlate the information being obtained from the multiple sources. One common scheme of correlation records video information from multiple sources in a split-screen format. The drawback of split screen recording is a loss of resolution. Another solution to the information correlation problem is to utilize sophisticated camera systems having synchronization capabilities. While synchronized cameras solve the correlation problem and further allow recordation of information at full resolution, such cameras are extraordinarily expensive and thus are infrequently used.
In spite of the foregoing drawbacks, more and more video surveillance systems are being installed to record information useful in both civil and criminal investigations. The use of such information in investigations, especially criminal investigations, raises an additional concern that the recorded information may be tampered with prior to review. Accordingly, it is vitally important that the integrity of the evidence recorded by video surveillance systems be preserved. To address this concern, one prior art system provides a lockable or otherwise tamper-proof enclosure for holding the recording devices and thus preventing unauthorized access to the recording media. By restricting access to the recording device and documenting the chain of custody of the recorded media after it leaves the recording device, some degree of confidence in the integrity of the recorded information can be maintained.
Providing such physical protection for the recording device and procedures for handling of the media do not, however, guarantee the integrity of the information. Other prior art systems have overlaid a sound stripe on the recorded media to deter persons from attempting to alter the recorded information through deletion, replacement or rearrangement of video frames. This protection scheme is easily bypassed, however, by reproducing and re-recording the audio security stripe on the media after tampering.
The surveillance system of the present invention comprises an event sensor for capturing information (such as images and sounds) concerning events. The event sensor is connected to a control processor that controls both the acquisition of the information by the event sensor and the storage of the information in a data storage device. The event information acquired by the event sensor is encrypted prior to storage in order to ensure integrity. An environment sensor connected to the control processor operates to monitor conditions in the environment. The control processor includes a mode control functionality which processes the sensed conditions to identify the occurrence of events of interest and, in response thereto, controls operation of the event sensor to emphasize the capture for storage of information related to the detected event of interest. A data management functionality in the control processor dynamically manages the stored information by selectively accessing and deleting from memory previously stored information that is less important or less relevant to the identified events of interest than other previously recorded information.
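The patent states that event information is protected prior to storage so that later tampering is detectable, but discloses no particular algorithm in this passage. The following is a minimal, illustrative sketch (not part of the patent disclosure) of sealing each frame and its time stamp with a keyed integrity tag; the key name and provisioning scheme are assumptions.

```python
import hmac
import hashlib

# Hypothetical device key; the patent does not specify how keys are provisioned.
SECRET_KEY = b"device-provisioned-key"

def seal_frame(frame_bytes: bytes, timestamp_ms: int) -> bytes:
    """Compute a keyed integrity tag over a frame and its time stamp."""
    mac = hmac.new(SECRET_KEY, digestmod=hashlib.sha256)
    mac.update(timestamp_ms.to_bytes(8, "big"))
    mac.update(frame_bytes)
    return mac.digest()

def verify_frame(frame_bytes: bytes, timestamp_ms: int, tag: bytes) -> bool:
    """Fail if the frame or its time stamp was altered, replaced, or reordered."""
    return hmac.compare_digest(seal_frame(frame_bytes, timestamp_ms), tag)
```

Because the time stamp is covered by the tag, deleting, replacing, or rearranging frames after recording is detectable on playback.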
A more complete understanding of the surveillance system of the present invention may be had by reference to the following Detailed Description when taken in conjunction with the accompanying Drawings wherein:
FIG. 1 is a block diagram of the surveillance system of the present invention;
FIGS. 2A and 2B are graphs illustrating the amount of sensor information stored in relation to detected events of interest by the dynamic data management functionality of the system of the present invention;
FIG. 3 is a block diagram of the surveillance system of FIG. 1 configured for enhancing security in a building;
FIGS. 4A and 4B illustrate two methods for encrypting data;
FIG. 5 is a block diagram of the surveillance system of FIG. 1 configured for mounting in a vehicle; and
FIG. 6 is a block diagram of the surveillance system of FIG. 1 configured for carrying by a human being.
Referring now to FIG. 1, there is shown a block diagram of the surveillance system 100 of the present invention. The surveillance system 100 comprises a control processor 10, an imaging sensor 12, an audio sensor 14, an environment sensor 16 and a data storage device 18. The control processor 10, comprising one or more distributed or parallel processing elements, is connected to the imaging sensor 12 via communications link 20, to the audio sensor 14 via communications link 22, to the environment sensor 16 via communications link 24, and to the data storage device 18 via communications link 26. The communications links 20, 22, 24 and 26 are bi-directional in nature comprising copper, fiber optic, infrared, radio frequency, or the like, type links in either a serial or parallel format.
The imaging sensor 12 and the audio sensor 14 comprise an event sensor 13 operating to capture video and audio information concerning events, with the acquired event information stored in the data storage device 18. Not all events observed or detected by the sensors 12 and 14 are necessarily important events (i.e., events of interest). Accordingly, information previously captured by the event sensor concerning these events need not necessarily be retained in the data storage device 18. With respect to events of interest, however, as much information in as much detail as possible needs to be acquired by the event sensor 13 and stored in the data storage device 18 for future use.
The imaging sensor 12 comprises at least one imaging device 30 like a CCD video camera, infrared camera or high resolution imaging radar for acquiring images 28 and outputting signals representing the same in either an analog or digital information format. It is preferred that more than one imaging device 30 be used for the imaging sensor 12 to facilitate the taking of images 28 from a plurality of different angles, distances and points of view. The imaging device(s) 30 in the imaging sensor 12 output information for processing by a remote processor 32. Operation of the imaging device(s) 30 is controlled by signals output from the remote processor 32 in response to commands received from the control processor 10. For example, image resolution, zoom, compression and frame rate of image capture are each controllable in response to signals received from the remote processor 32. It will, of course, be understood that the operation and performance of the imaging devices 30 in the acquisition of images 28 is controllable in a number of other well known ways.
The audio sensor 14 comprises at least one audio device 34 like a microphone for detecting sounds 36 (output in either an analog or digital information format) associated with the images 28 taken by the imaging sensor 12. It is preferred that more than one audio device 34 be used for the audio sensor 14 to facilitate recording of sounds 36 related to the images 28 from a plurality of different locations. Preferably, each imaging device 30 will have a corresponding audio device 34. Other audio devices 34 are also included, if desired, and positioned perhaps at locations that are not viewable using the imaging devices 30. The audio device(s) 34 in the audio sensor 14 output information for processing by a remote processor 38. Operation of the audio device(s) 34 is controlled by signals output from the remote processor 38 in response to commands received from the control processor 10. For example, gain, compression and filtering are each controllable in response to signals received from the remote processor 38. It will, of course, be understood that the operation and performance of the audio devices 34 in the acquisition of sounds 36 is controllable in a number of other well known ways.
Alternatively, the operative control exercised by remote processors 32 and 38 on the imaging device 30 and audio device 34, respectively, is effectuated directly by the control processor 10. In such a configuration, remote processors 32 and 38 are not included, and the control processor 10 is connected for the transmission of control commands directly to the imaging device(s) 30 and the audio device(s) 34. However, due to current limitations with respect to control processor 10 data processing and throughput capabilities, the distributed processing design scheme of FIG. 1 utilizing remote processors 32 and 38 is preferred.
The environment sensor 16 comprises at least one sensing device 40 for sensing event conditions 42 (output in either an analog or digital information format) related to the images 28 taken by the imaging sensor 12 and the sounds 36 detected by the audio sensor 14. Signals indicative of the sensing of such conditions 42 are output from the environment sensor 16 over line 24. The signals output from the environment sensor 16 concerning detected conditions are processed by the control processor 10 to determine whether the images 28 and sounds 36 acquired by the sensors 12 and 14 comprise an "event of interest" to the system 100 and should therefore be preserved to facilitate a future investigation of the event. The termination of an event of interest may also be detected by the sensors, or alternatively identified by the control processor 10 based on the expiration of a pre-set event time period.
The environment sensor 16, in general, comprises sensors of two different types. The first type of sensor comprises a passive sensor which merely monitors and reports on conditions in the environment. Conditions sensed by passive sensors include temperature, speed, motion, acceleration, voltage level, etc. The second type of sensor comprises an active sensor which emits energy and monitors the effects (for example, reflection) of such an energy emission to detect conditions. Examples of active sensors include radar and sonar systems useful in actively detecting the presence of objects. Other types of active sensing systems useful in surveillance systems are known to those skilled in the art. Depending on sensor type, the passive and active sensors will output analog, digital or intelligent (i.e., interface) signals for processing by the control processor 10.
It should also be recognized that the imaging sensor 12 and audio sensor 14 provide information on conditions 42 as well as output images 28 and sounds 36. The condition information output from the imaging and audio sensors is useful either in combination with the environment sensor 16 signals, or by itself, in identifying the occurrence of an event of interest. Thus, the control processor 10 further functions to monitor and process the images and sounds captured by the imaging sensor 12 and audio sensor 14 to detect conditions 42 indicative of the occurrence of an event of interest. Image recognition and sound recognition processes are implemented by the control processor 10 in detecting shapes, movements and sounds (including voice recognition) for purposes of identifying the occurrence of an event of interest. Alternatively, the sensors 12 and 14 may comprise intelligent devices capable of detecting the occurrence of an event of interest. In that case, it is the event, rather than the conditions 42, which is reported to the control processor 10, and the sensors 12 and 14 may respond to the event, and control their own operation, without receiving instructions from the control processor 10.
The characteristics of the information transmitted from the environmental sensor 16 to the control processor 10 are a function of the nature of the sensor(s) used in the environmental sensor 16. Accordingly, the control processor 10 is programmed to handle and make sense of the information received in the sensor signals output from different types of devices. For example, a sensor device may sense and output a signal indicative of a certain condition of interest to the system 100. In such a case, no further processing need be done on the signal prior to use in detecting the occurrence of an event of interest. An example of this is a temperature sensor whose output of the current temperature need not be further processed, as the temperature level itself is often a condition of interest to the system 100. Other sensor devices may sense and output a signal indicative of a condition not necessarily of direct interest to the system, but which may be further processed by the control processor 10 to detect a condition that is of interest. An example of this is a location or position detector wherein a series of position outputs can be processed by the control processor 10 to detect conditions of interest such as movement, velocity and acceleration.
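The position-detector example above amounts to taking finite differences over time-stamped samples. A minimal sketch (illustrative only; the patent discloses no algorithm) of deriving velocity and acceleration from a series of position outputs:

```python
def finite_differences(samples):
    """Derive velocity and acceleration from time-stamped positions.

    samples: list of (t, x) pairs, t in seconds, x in metres.
    Returns (velocities, accelerations), each a list of (t, value) pairs.
    """
    # Velocity: change in position over change in time between adjacent samples.
    vel = []
    for (t0, x0), (t1, x1) in zip(samples, samples[1:]):
        vel.append((t1, (x1 - x0) / (t1 - t0)))
    # Acceleration: change in velocity over change in time.
    acc = []
    for (t0, v0), (t1, v1) in zip(vel, vel[1:]):
        acc.append((t1, (v1 - v0) / (t1 - t0)))
    return vel, acc
```

A derived condition such as sudden deceleration could then be compared against a threshold to flag an event of interest.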
A programmable mode control functionality 43 is provided in the control processor 10 for processing the signals output from the environment sensor 16 (and possibly the event sensor) to detect the occurrence of events of interest (as described above), and generating the commands directing the operation of the imaging and audio sensors 12 and 14, respectively, to capture information concerning the events. For example, with respect to the operation of imaging sensor 12, the mode control functionality 43 specifies operation in terms of resolution, frequency of capture (frame rate), zoom, pan and tilt, compression, etc. With respect to the operation of the audio sensor 14, the mode control functionality 43 specifies operation in terms of gain, compression, filtering, etc. In situations where sensors 12 and 14 are of the intelligent type, the control processor 10 either confirms or overrides sensor event detection and operation to capture event information.
Dynamic control over system 100 operation is effectuated by continued monitoring by the control processor 10 of the signals output from the environment sensor 16. In response to such continued monitoring and processing of sensor signals, the control processor 10 adjusts to changes in the environment to assure continued acquisition of event relevant images 28 and sounds 36. With such dynamic control over the mode of system operation, the system 100 is capable of directing the passive capture of event information concerning concurrently occurring events of interest. Detection of an event of interest by the mode control functionality 43 may further be used in an active manner by the system 100, for example, to signal an alarm.
The control processor 10 further includes a data management functionality 44 for managing the storage in the data storage device 18 of sensor 12 and 14 acquired images 28 and sounds 36, as well as sensor 16 acquired conditions 42 relating to the images and sounds. The data management functionality 44 affects data storage by selecting frames of sensor 12 images 28 (and associated sounds 36 and conditions 42) for storage or for subsequent deletion from storage in the data storage device 18. This selection decision is based on an evaluation of a variety of factors including the mode of system 100 operation at the time of image and sound acquisition, the age or staleness of the acquired information, and the amount of space remaining in the data storage device 18. In this connection, it should be apparent that the data management functionality 44 will operate to preserve in storage those images, sounds and conditions that were acquired by the system 100 at or near the time of a detected event of interest. Images, sounds and conditions acquired at other times, and thus not as relevant to the detected event of interest, will be preserved in the data storage device 18 only until such time as the storage space occupied by the information is needed for the storage of subsequently acquired information. The operation of the data management functionality 44 is user selectable and programmable, and thus may be tailored to a particular application or need.
In order to keep track of when certain frames of images 28 and associated sounds 36 and conditions 42 are acquired, as well as to facilitate subsequent synchronization of information (especially if acquired by different sensors), the control processor 10 maintains a timer 45 and time stamps each frame of event information (76 in FIGS. 4A and 4B) comprising images, sounds and conditions prior to storage in the data storage device 18. By time stamping the event information output from the sensors 12, 14 and 16, the system 100 advantageously does not require use of sophisticated and expensive sensors having time synchronization capabilities. In instances where multiple systems 100 are positioned to monitor a single location, the timers 45 for each of the systems are synchronized.
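Time stamping at capture is what lets frames from unsynchronized sensors be correlated afterward. A minimal sketch (not from the patent; the record layout and window value are assumptions) of stamping frames and grouping near-simultaneous frames from different sensors:

```python
import time
from dataclasses import dataclass, field

@dataclass
class EventFrame:
    source_id: str   # which sensor produced the data, e.g. "camera-1"
    payload: bytes   # image, sound, or condition data
    timestamp: float = field(default_factory=time.time)  # stamped at capture

def correlate(frames, window=0.05):
    """Group frames whose time stamps fall within `window` seconds of the
    first frame in the group, regardless of which sensor produced them."""
    groups = []
    for f in sorted(frames, key=lambda f: f.timestamp):
        if groups and f.timestamp - groups[-1][0].timestamp <= window:
            groups[-1].append(f)
        else:
            groups.append([f])
    return groups
```

Because every frame carries its own stamp from a common timer, the sensors themselves need no synchronization hardware, which is the cost advantage the passage describes.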
Reference is now made to FIGS. 2A and 2B wherein there are shown graphs illustrating examples of two methods of programming operation of the data management functionality 44 to emphasize the preservation of event information concerning detected events of interest. The y-axis 46 represents the number of frames of sensor 12 images 28 (and associated sounds 36 and conditions 42) stored by the data storage device 18. The x-axis 48 refers to the time at which the event information was acquired, with locations further right on the x-axis being "older" moments in time. The point on the x-axis 48 where the y-axis 46 intersects with the x-axis corresponds to the present time.
In FIG. 2A, line 50 illustrates one data management scheme emphasizing the storage and retention in storage of frames (as generally indicated at 52) of sensor 12 images 28 (and associated sounds 36 and conditions 42) acquired at or near the present time and times te of events of interest. The number of frames stored drops off in a bell curve fashion as time moves in either direction along the x-axis 48 away from the times te of the events of interest. Another data management scheme illustrated by line 54 in FIG. 2B does not emphasize the storage of event information immediately after the times te of events of interest (as generally indicated at 56), but does emphasize the storage of increasing amounts of event information as time leads up to the present time and either leads up to (as generally indicated at 58) the times te of the events of interest or alternatively leads away from the events of interest (as shown by broken line 54').
The operation of the data management functionality 44 to control the amount of information stored in the data storage device 18 is a constant, ongoing process, with the stored data being evaluated in terms of the detection of events of interest, the age or staleness of the acquired information, and the amount of space remaining in the data storage device 18. At the same time, however, the data management functionality 44 will prefer a management scheme where data is stored (i.e., retained) rather than deleted. This is illustrated in line 50 of FIG. 2A at 60 where the data management functionality 44 is preserving a large number of frames of information acquired during the time between the two illustrated closely occurring events of interest te. However, due for example to a concern over dwindling amounts of available space for storing subsequently acquired event information, the data management functionality 44 will make room for soon to be acquired frames of information (as generally indicated by broken line 62) by deleting frames from storage (as illustrated by broken line 50') concerning event information acquired at times between the times of the two detected events of interest.
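The FIG. 2A scheme can be understood as a weighting function over capture times: frames captured at or near the present time or near an event of interest score highest, with the score falling off in a bell-curve fashion. A minimal sketch (illustrative; the patent gives curves, not formulas, and the Gaussian spread is an assumption):

```python
import math

def retention_weight(t, now, event_times, sigma=5.0):
    """Score a capture time t: 1.0 at the present time or at an event of
    interest, decaying in a bell curve with distance from those anchors.

    sigma (seconds) is an illustrative spread, not taken from the patent.
    """
    anchors = [now] + list(event_times)
    return max(math.exp(-((t - a) ** 2) / (2 * sigma ** 2)) for a in anchors)
```

Frames with the lowest weight are the first candidates for deletion when storage space runs short.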
The management of stored data according to the functionality 44 thus comprises a dynamic, random access operation emphasizing the retention in storage of increased amounts of information acquired at or near the times of events of interest. The operation of such a data management functionality 44 accordingly requires that the control processor 10 be able to randomly access locations in the data storage device 18 to allow previously recorded frames of event information to be accessed and deleted when their retention is no longer needed. To facilitate the foregoing operations, the data storage device 18 must comprise some type of random access data store, like a PCMCIA card, that will allow the control processor 10 through its data management functionality 44 to selectively access locations in memory for data storage and data deletion. An alternative random access data store could comprise high memory content RAM chip(s) or extremely fast access disk drive(s).
Conventional magnetic tape cannot be used as the recording media in the data storage device 18 due to its inability to be quickly accessed in a random fashion by the control processor 10. However, a tape based data storage device is useful as an auxiliary data storage archive 92. Event information acquired by the sensors can be backed up in the archive 92. Furthermore, in instances where the data management functionality 44 determines that more data needs to be stored than there is available space in the device 18, such overflow information (comprising "older" event information) may be transferred to the archive 92.
The random access data store for the storage device 18 thus will include a plurality of addresses (not shown) for storing frames of event information. These addresses will be accessed by the data management functionality 44 to store acquired frames of event information. After initial acquisition and storage, the data management functionality 44 operates as described above to dynamically manage the available data storage resources. In this connection, addresses in the data storage device 18 will be accessed by the data management functionality 44, and less valuable frames of event information stored therein will be deleted to make room for subsequently acquired information. The deletion determination is made to emphasize retention of the valuable event information relevant to detected events of interest. Accordingly, it is likely that adjacent addresses in the data storage device 18 will not contain related event information following the operation of the data management functionality 44 and the deletion of unwanted information.
The operation of the data management functionality 44 may be better understood through an example. At the present time, the system acquires frames of event information at a predetermined rate set by the mode control functionality 43. At or near the present time, all of the frames of event information will be stored at available addresses in the data storage device 18. However, if no event of interest is detected by the mode control functionality 43, the event information being collected becomes less and less important in terms of retention, and thus the data management functionality 44 will act to access the addresses of some of the previously stored frames and delete the information from storage. As time continues to pass, the data management functionality 44 will continue to delete more and more frames from storage to make room for newly acquired frames. Eventually, because no event of interest is detected near the time of information acquisition and storage, nearly all (if not all) of the previously acquired event information will be accessed and deleted by the data management functionality 44. Event information collected at or near the time of detected events of interest, on the other hand, will be retained to the greatest extent possible.
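The example above describes a pruning loop over randomly accessible addresses: when the store approaches capacity, the least valuable frames are located and deleted, and their addresses are reused for newly acquired frames. A minimal sketch under assumed data structures (the address scheme and scoring function are illustrative, not from the patent):

```python
def prune(store, capacity, weight_fn):
    """Delete the least valuable frames until the store fits its capacity.

    store: dict mapping storage address -> (capture_time, frame_bytes).
    weight_fn: scores a capture time; frames nearest events of interest
    (or the present time) score highest and survive longest.
    """
    while len(store) > capacity:
        # Random access: locate and delete the lowest-scoring frame directly.
        victim = min(store, key=lambda addr: weight_fn(store[addr][0]))
        del store[victim]   # freed address is reused for newly acquired frames
    return store
```

After repeated pruning, adjacent addresses no longer hold related frames, which is why random access (rather than sequential tape) is required.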
As mentioned above, a part of the event information stored by the system 100 comprises the information output from the environment sensor 16. This data is useful in a number of ways. First, the conditions 42 sensed by the environmental sensor 16 provide information that assists in the interpretation of images and sounds acquired by the sensors 12 and 14. For example, a captured image that reveals what appears to be liquid on a surface, in connection with a sensed condition 42 indicating a temperature below freezing, provides an indication that a slippery condition might have existed at the time the image was captured. Another use of the environmental information is in critically analyzing and troubleshooting system 100 performance, and in particular the performance of the mode control functionality 43 in identifying events of interest. Monitoring of environment conditions 42 that led to an incorrect identification of an event of interest provides system analysts with information needed to adjust mode control functionality operation and performance to better and more accurately detect events of interest.
A more complete understanding of the operation of the system 100 of the present invention may be had by reference to a specific example illustrated in FIG. 3 wherein the system is installed in a business for the purpose of enhancing building security. In such an installation, the imaging devices 30 of the imaging sensor 12 are positioned at a number of different locations about the inside and outside of the building. Particular attention for placement of imaging devices would be directed to entrance and exit doors, secure or restricted access rooms or areas, and any other desired location. Audio devices 34 of the audio sensor 14 are located at imaging device locations, and further positioned in other areas of interest. The environment sensor 16 will include a number of sensor inputs 64 for receiving information regarding conditions 42 both within and without the building. For example, inputs 64 will be received from motion detectors, glass break sensors, door and window sensors, card key readers, and smoke and fire detectors.
In operation, the images 28 and sounds 36 acquired by the sensors 12 and 14 will be recorded in the data storage device 18, with the stored event information dynamically managed in accordance with the data management functionality 44. The environment sensor 16 will monitor conditions inside and outside the building in an effort to detect the occurrence of an event of interest such as a fire, an attempted or actual break-in, or other apparently unauthorized access. When signals indicative of the occurrence of such an event of interest are output to the control processor 10, the location of the event is determined and the mode control functionality 43 commands the sensors 12 and 14 to acquire images and sounds in the determined location with specified characteristics of acquisition. Such commands could, for example, increase the frame rate of the imaging devices and the gain of the audio devices in the area of the determined location. Thus, in this particular scenario, more data from the devices 30 and 34 in the determined location than from the devices in other locations will be transmitted to the control processor 10 and stored in the data storage device 18. An alarm may also be sounded. Concurrently with the handling of the event, the environment sensor 16 continues via inputs 64 to monitor conditions inside and outside the building. New events of interest may be concurrently or subsequently detected, with the mode control functionality 43 operating to dynamically adjust system 100 operation to emphasize the reception of images and sounds from devices 30 and 34 positioned at or near the location of the concurrently or subsequently detected event. The data management functionality 44 will continue to control information storage by deleting, but only if necessary (as described above), images and sounds acquired either at times other than the times of events of interest, or by devices 30 and 34 not positioned at the determined locations of the detected events of interest.
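The location-targeted command step in the building scenario can be sketched as follows. The command vocabulary, device positions, radius, and parameter values are all illustrative assumptions; the patent names the controllable parameters (frame rate, gain) but discloses no message format.

```python
def direct_sensors(devices, event_location, radius=10.0):
    """On detection of an event of interest, command devices near the
    determined location to capture at a higher frame rate and gain,
    while devices elsewhere stay at baseline settings.

    devices: dict mapping device id -> position along one axis (metres).
    """
    commands = {}
    for dev_id, pos in devices.items():
        near = abs(pos - event_location) <= radius
        commands[dev_id] = {
            "frame_rate": 30 if near else 5,   # frames per second
            "gain": 4.0 if near else 1.0,      # audio gain multiplier
        }
    return commands
```

The result is the traffic pattern the passage describes: more data flows from devices at the event location than from devices elsewhere.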
The system 100 of the present invention is useful in moving as well as fixed platform installations. Such moving platforms comprise not only vehicles like automobiles, buses, trains, aircraft, and the like, but also human beings, like police officers or delivery men. In moving platform installations, it is important that the location of the platform as well as the sounds and images be recorded for subsequent review. Accordingly, with reference again to FIG. 1, the system 100 further includes a locating device 66 such as a GPS receiver and processor. The locating device 66 is connected to the control processor 10 via line 68. Signals indicative of detected location are output from the locating device 66, processed by the control processor 10 and the detected location stored, with a time stamp, in the data storage device 18 along with the frames of images and associated sounds and conditions acquired by the sensors 12, 14 and 16.
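The storage of location with the event information can be sketched as a simple record format. The field names and structure below are illustrative assumptions; the patent specifies only that detected location is stored with a time stamp alongside the frames of images, sounds, and conditions.

```python
import time

def make_record(frame_bytes, sound_bytes, lat, lon):
    """Bundle one frame with its sounds, platform location, and capture time."""
    return {
        "timestamp": time.time(),   # time stamp from the system timer
        "location": (lat, lon),     # as reported by the locating device 66
        "image": frame_bytes,
        "sound": sound_bytes,
    }

# Hypothetical captured frame with a GPS fix
rec = make_record(b"<jpeg>", b"<audio>", 32.78, -96.80)
```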
The system 100 of the present invention further includes a transceiver 70 for facilitating remote communications to and from the control processor 10. The transceiver 70 is useful for transmitting the images 28 and sounds 36 being acquired by the sensors 12 and 14. With the transceiver 70, not only may the images and sounds of the event of interest be transmitted to a remote location, but also the platform location data obtained by the locating device 66 and the conditions detected by the environment sensor 16. The transceiver 70 may comprise a radio frequency transceiver, but it will be understood that other communication means such as a cellular phone system or an infrared communication system may be used to suit particular applications and system 100 needs. With a cellular phone connection, the system 100 further can implement well known automatic dialing procedures for contacting remote locations to report the occurrence of detected events of interest.
The transceiver 70 further allows the remote location to transmit commands to the control processor 10 for purposes of directing operation of the system 100. Such commands may, in fact, be used to override the operation of the mode control functionality 43 and direct the sensors 12 and 14 to acquire certain information deemed by the remote location to be of particular importance for real time review using a data transmission via the transceiver 70. At the same time, however, the system 100 will continue to store other images and sounds in the data storage device 18 for subsequent review after the event of interest is over. The transceiver 70 further facilitates the downloading from the remote location to the system 100 of programming upgrades and operation parameter changes.
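The override behavior described above can be sketched as a remote command taking precedence over the locally selected mode while recording continues. The interface is hypothetical; the patent does not define a command format.

```python
class ControlProcessor:
    """Minimal sketch: remote commands override local mode selection."""

    def __init__(self):
        self.local_mode = "auto"   # mode chosen by mode control functionality
        self.override = None       # command received via the transceiver
        self.stored = []           # frames continue to be stored locally

    def remote_command(self, mode):
        # A remote location directs acquisition of particular information.
        self.override = mode

    def active_mode(self):
        # The override, when present, takes precedence over local selection.
        return self.override or self.local_mode

    def record(self, frame):
        # Local storage continues unchanged during a remote override.
        self.stored.append(frame)

cp = ControlProcessor()
cp.remote_command("zoom-entrance")   # hypothetical command name
cp.record(b"frame-1")
```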
By means of the transceiver 70, the remote location can command the downloading of recorded information from the data storage device 18 at predetermined times (for example, after a shift is completed). Alternatively, the system 100 could be commanded to download recorded data while the system is being used thereby freeing up memory in the data storage device 18 for storage of information concerning subsequent events of interest. Along the same lines, the data management functionality 44 may command such a download in situations where available space in the data storage device reaches a critically low level.
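The low-space download trigger described above amounts to a storage watermark check. The threshold and interface below are illustrative assumptions, not taken from the patent.

```python
class DataManager:
    # Assumed critical threshold (10% free); the patent gives no figure.
    CRITICAL_FREE_FRACTION = 0.10

    def __init__(self, capacity_bytes, transceiver):
        self.capacity = capacity_bytes
        self.used = 0
        self.transceiver = transceiver

    def store(self, record_bytes):
        self.used += len(record_bytes)
        free = (self.capacity - self.used) / self.capacity
        if free < self.CRITICAL_FREE_FRACTION:
            # Download recorded data to the remote location, freeing
            # memory for information concerning subsequent events.
            self.transceiver.download_all()
            self.used = 0

class FakeTransceiver:
    """Stand-in for the transceiver 70, counting commanded downloads."""
    def __init__(self):
        self.downloads = 0
    def download_all(self):
        self.downloads += 1

tx = FakeTransceiver()
dm = DataManager(capacity_bytes=100, transceiver=tx)
dm.store(b"x" * 95)   # free space drops to 5%, below the 10% threshold
```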
The system 100 of the present invention is particularly useful as an investigative tool, recording images and sounds of events of interest for future review. The recorded images and sounds thus comprise important, if not the only, pieces of evidence available to investigators or triers of fact in making a determination of what actually occurred. It is therefore vitally important that measures be taken to preserve the integrity of the stored images and sounds.
The control processor 10 of the system 100 of the present invention accordingly further includes an encryption functionality 72 that operates to encrypt in some fashion either some or all of the information processed by the control processor 10, either for storage in the data storage device 18 or for transmission to a remote location by the transceiver 70. One method of encryption, illustrated in FIG. 4A, is to encrypt 74 in their entirety all of the frames of event information 76 (images, sounds and conditions). This method is especially useful when the information is to be transmitted to a remote location because anyone intercepting the transmission will be unable to access the information without the encryption key. Another method of encryption, illustrated in FIG. 4B, utilizes an encryption envelope 78 in front of or at the back of each frame of data 76 (such as the digital signature encryption currently used to protect electronic funds transfers). This method is especially useful when the information is being stored in the data storage device 18, and is not preferred for remotely transmitted data because the data can be reviewed without decryption by anyone intercepting the transmission.
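The two formats of FIGS. 4A and 4B can be sketched as follows. The patent names no algorithm, so a toy XOR keystream stands in for a real cipher in the full-frame case (FIG. 4A), and an HMAC digital signature appended to the clear frame stands in for the envelope (FIG. 4B), which leaves the data reviewable but tamper-evident. All names and the key are illustrative.

```python
import hashlib
import hmac
import itertools

KEY = b"session-key"   # illustrative; key management is not specified

def encrypt_frame(frame):
    # FIG. 4A sketch: the entire frame is enciphered (toy XOR keystream,
    # a stand-in for a real cipher; applying it twice recovers the frame).
    stream = itertools.cycle(KEY)
    return bytes(b ^ k for b, k in zip(frame, stream))

def envelope_frame(frame):
    # FIG. 4B sketch: a digital-signature envelope appended to the clear
    # frame; the data remains readable but any change is detectable.
    sig = hmac.new(KEY, frame, hashlib.sha256).digest()
    return frame + sig

def verify_envelope(data):
    frame, sig = data[:-32], data[-32:]
    expected = hmac.new(KEY, frame, hashlib.sha256).digest()
    return frame, hmac.compare_digest(sig, expected)
```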
With either method illustrated in FIGS. 4A and 4B, the object of the encryption is to inhibit persons from tampering with the information and further allow for any attempted or completed acts of tampering to be detected. To provide a further measure of protection for stored information, the data management functionality 44 maintains an index 80 of the frames of event information 76 stored in the data storage device 18. Changes in the information stored in the data storage device 18 due to action of the data management functionality 44 cause a corresponding change in the contents of the index 80. For example, as old, no longer needed frames 76 are deleted, the record of those frames is erased from the index 80. Similarly, as new frames 76 are stored, the index 80 is updated to reflect the presence of the new information. To prevent a person from deleting crucial frames 76 from the data storage device 18 and simply updating the index 80 accordingly to conceal the act of tampering, the index is also protected from tampering by encryption 82 in either format illustrated in FIGS. 4A and 4B. The updated index is primarily stored in RAM 73, and is periodically saved in the data storage device 18.
As a further measure of protection against tampering, the timer 45 (providing a record of current time) is capable of being reset only by means of a two-way communication with a remote location effectuated by means of the transceiver 70. A record of the time reset (or update) communication is maintained both at the remote location and in the control processor 10 RAM 73. These records are each encrypted using either of the formats illustrated in FIGS. 4A and 4B.
Reference is again made to FIG. 1. The amount of space available in the data storage device 18 is limited. Accordingly, as discussed above, the data management functionality 44 operates to dynamically control management of the available space and thus efficiently use the data storage device 18 to store as much event information as possible. Increased efficiency in data storage is provided by using a data compression functionality 84 to compress the event information prior to storage. Although shown located and preferably operated in the remote processors 32 and 38 of the sensors 12 and 14, it will, of course, be understood that the data compression functionality 84 is equally locatable in the control processor 10. For images 28, either of the compression algorithms developed by the Joint Photographic Experts Group (JPEG) or by the Moving Picture Experts Group (MPEG), or any other suitable compression algorithm, may be implemented to perform compression of the images acquired by the image sensor 12. Sounds 36, on the other hand, are compressed by the data compression functionality 84 using either the MPEG compression algorithm or another suitable compression algorithm.
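Compression prior to storage can be sketched with zlib as a stand-in for the JPEG/MPEG codecs named above; the patent requires only a "suitable compression algorithm", and the function names here are illustrative.

```python
import zlib

def compress_record(raw):
    # Remove redundancy before the record is written to storage.
    return zlib.compress(raw, level=6)

def decompress_record(blob):
    # Recover the original event information for review.
    return zlib.decompress(blob)

frame = b"\x00" * 1024            # a highly compressible dummy frame
stored = compress_record(frame)   # what would be written to storage
```

Compressing before storage lets the fixed-capacity storage device hold more frames of event information.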
The system 100 further includes a display 86, such as a cathode ray tube, connected to the control processor 10 for displaying to a system user the event information currently being captured or previously stored. In fact, with the display 86 and random access data storage device 18, previously recorded event information may be viewed while the system 100 simultaneously records current event information. A data entry device 88, connected to the control processor 10, is provided to enable user selection of event information for display. Some control over system 100 operation may also be effectuated by the entry or selection of commands through the data entry device 88. The device 88 is further useful in entering data for storage in the data storage device 18, the entered data being synchronized with the captured event information to which the input data relates.
To protect the control processor 10 and data storage device 18 from the environment and from tampering or other harm, these components are preferably installed in a temperature controlled enclosure 90. The enclosure 90 maintains a preset internal temperature range and further provides a physical barrier protecting against device damage or tampering. In particular, the enclosure 90 prevents unauthorized access to the data storage device 18 thus protecting the stored event information.
As mentioned above, the system 100 is particularly applicable for use in moving platforms. One implementation in a vehicle (such as an automobile) is illustrated in FIG. 5. The vehicle-installed system 100 preferably includes four imaging devices 30 oriented to image out each side and the front and back of the vehicle, thus providing substantially three-hundred sixty degree external imaging coverage. Additional imaging devices 30 may be positioned inside the vehicle if desired. Audio devices 34 of the audio sensor 14 are located both inside and outside the vehicle, and further positioned at other locations as desired. The environment sensor 16 will include sensors 40 for detecting conditions 42 both inside and outside the vehicle. For example, the sensors 40 include: passive sensors 40(p) for sensing external temperature, engine conditions (RPMs, coolant temperature, oil pressure, etc.), vehicle speed, vehicle operating conditions (turn signals, headlights, horn, etc.), and acceleration; and active sensors such as a radar collision avoidance system.
In the vehicle installation, the mode selected by the mode control functionality will emphasize the capture of event information based primarily on vehicle operation. For example, if the vehicle is moving in a forward direction, the emphasis will be placed on the acquisition of video information from the imaging devices with front and rear orientations. At the same time, the system 100 will monitor the detected conditions 42 in an attempt to identify a new event that would signal a mode change. Such a condition could comprise the slowing or stopping of the vehicle or the execution of a turn. These detected conditions may necessitate a mode change to acquire event information from other sources. A stopping of the vehicle could be caused by an accident or an approach to a stop sign. The mode control functionality processes the detected conditions 42 to identify which of these events is occurring and, in response thereto, acquire information concerning the former event only. The mode control functionality may further adjust the resolution of the imaging devices to acquire certain information of interest (such as a license plate number). From the foregoing, it will be understood that the mode control functionality 43 will separately control the operation of the devices in each of the sensors 12 and 14 in order to ensure that only the most important and pertinent information is being obtained.
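The vehicle mode selection described above reduces to a mapping from detected conditions to an acquisition emphasis, i.e. which imaging orientations are favored. The rules and condition names below are illustrative assumptions, not taken from the patent.

```python
def select_mode(conditions):
    """Choose the imaging orientations to emphasize from detected conditions.

    conditions: dict with illustrative keys such as 'speed' (mph),
    'turn_signal' ('left'/'right'), and 'impact' (bool).
    """
    if conditions.get("impact"):
        # Apparent accident: emphasize all orientations.
        return {"front", "rear", "left", "right"}
    if conditions.get("turn_signal") == "left":
        return {"front", "left"}
    if conditions.get("turn_signal") == "right":
        return {"front", "right"}
    if conditions.get("speed", 0) > 0:
        # Moving forward: emphasize front and rear orientations.
        return {"front", "rear"}
    # Stopped with no other condition: default front emphasis.
    return {"front"}
```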
Use of the system 100 in a moving platform comprising a human being is illustrated in FIG. 6. The person-carried system 100 preferably includes one imaging device 30 oriented to image towards the front of the person. Additional imaging devices 30 may be positioned directed to the sides and behind the person, if desired, to provide three-hundred sixty degree imaging coverage. One audio device 34 is positioned on the body of the person to record the same sounds that the person hears. The environment sensor will include sensors 40 for detecting conditions both internal and external to the body of the person. For example, the sensors 40 include: internal sensors 40(i) for sensing body temperature, respiration, perspiration, heartbeat, and muscle contractions; and external sensors 40(e) for sensing external temperature and location. The system 100 illustrated in FIG. 6 operates in the manner described above for the systems illustrated in FIGS. 1, 3 and 5. Accordingly, further detailed description of FIG. 6 and the operation of the system in the illustrated application is deemed unnecessary.
Although a preferred embodiment of the method and apparatus of the present invention has been illustrated in the accompanying Drawings and described in the foregoing Detailed Description, it will be understood that the invention is not limited to the embodiments disclosed, but is capable of numerous rearrangements, modifications and substitutions without departing from the spirit of the invention as set forth and defined by the following claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US2883255 *||24 Nov 1954||21 Apr 1959||Panellit Inc||Automatic process logging system|
|US3349679 *||7 Apr 1965||31 Oct 1967||Iii Joseph A Lohman||Photo identification apparatus|
|US3689695 *||10 Apr 1970||5 Sep 1972||Harry C Rosenfield||Vehicle viewing system|
|US3752047 *||13 Dec 1971||14 Aug 1973||L Gordon||Surveillance camera|
|US4214266 *||19 Jun 1978||22 Jul 1980||Myers Charles H||Rear viewing system for vehicles|
|US4234926 *||5 Dec 1978||18 Nov 1980||Sealand Service Inc.||System & method for monitoring & diagnosing faults in environmentally controlled containers, such system and method being especially adapted for remote computer controlled monitoring of numerous transportable containers over existing on-site power wiring|
|US4277804 *||31 Jan 1980||7 Jul 1981||Elburn Robison||System for viewing the area rearwardly of a vehicle|
|US4281354 *||17 May 1979||28 Jul 1981||Raffaele Conte||Apparatus for magnetic recording of casual events relating to movable means|
|US4511886 *||6 Oct 1983||16 Apr 1985||Micron International, Ltd.||Electronic security and surveillance system|
|US4745564 *||7 Feb 1986||17 May 1988||Board Of Trustees Operating Michigan State University||Impact detection apparatus|
|US4843463 *||23 May 1988||27 Jun 1989||Michetti Joseph A||Land vehicle mounted audio-visual trip recorder|
|US4910591 *||8 Aug 1988||20 Mar 1990||Edward Petrossian||Side and rear viewing apparatus for motor vehicles|
|US4922339 *||31 Mar 1988||1 May 1990||Stout Video Systems||Means and method for visual surveillance and documentation|
|US4949186 *||5 Dec 1988||14 Aug 1990||Peterson Roger D||Vehicle mounted surveillance system|
|US4992943 *||13 Feb 1989||12 Feb 1991||Mccracken Jack J||Apparatus for detecting and storing motor vehicle impact data|
|US5027104 *||21 Feb 1990||25 Jun 1991||Reid Donald J||Vehicle security device|
|US5111289 *||27 Apr 1990||5 May 1992||Lucas Gary L||Vehicular mounted surveillance and recording system|
|US5121200 *||6 Jul 1990||9 Jun 1992||Choi Seung Lyul||Travelling monitoring system for motor vehicles|
|US5144661 *||11 Feb 1991||1 Sep 1992||Robert Shamosh||Security protection system and method|
|US5155474 *||28 Jun 1991||13 Oct 1992||Park Photo Protection System Ltd.||Photographic security system|
|US5191312 *||28 Mar 1991||2 Mar 1993||Blaupunkt-Werke Gmbh||Automotive accessory control center|
|US5282182 *||12 Nov 1991||25 Jan 1994||Kreuzer Monroe E||Video monitor and housing assembly|
|US5319394 *||11 Feb 1991||7 Jun 1994||Dukek Randy R||System for recording and modifying behavior of passenger in passenger vehicles|
|US5437163 *||22 Aug 1994||1 Aug 1995||Thermo King Corporation||Method of logging data in a transport refrigeration unit|
|US5549115 *||28 Sep 1994||27 Aug 1996||Heartstream, Inc.||Method and apparatus for gathering event data using a removable data storage medium and clock|
|US5557546 *||10 Mar 1994||17 Sep 1996||Hitachi Building Systems Engineering & Service Co. Ltd.||Data acquisition system for the analysis of elevator trouble|
|US5557548 *||9 Dec 1994||17 Sep 1996||International Business Machines Corporation||Method and system for performance monitoring within a data processing system|
|1||Dawn Stover, Radar On a Chip; 101 Uses In Your Life, Popular Science, Mar. 1995, pp. 107-110, 116, 117.|
|2||Steve Ditlea, Real Men Don't Ask Directions, Popular Science, Mar. 1995, pp. 86, 87-89, 120, 121.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US6147598 *||23 Nov 1998||14 Nov 2000||Trimble Navigation Limited||Vehicle theft system including a handheld computing device|
|US6226329 *||6 Nov 1998||1 May 2001||Niles Parts Co., Ltd||Image storing and processing device|
|US6233428 *||17 Sep 1997||15 May 2001||Bruce Fryer||System and method for distribution of child care training materials and remote monitoring of child care centers|
|US6362750 *||6 Apr 2000||26 Mar 2002||Siemens Ag||Process and device for automatically supported guidance of aircraft to a parking position|
|US6422061 *||2 Mar 2000||23 Jul 2002||Cyrano Sciences, Inc.||Apparatus, systems and methods for detecting and transmitting sensory data over a computer network|
|US6559769 *||7 Dec 2001||6 May 2003||Eric Anthony||Early warning real-time security system|
|US6570487||22 Apr 1999||27 May 2003||Axcess Inc.||Distributed tag reader system and method|
|US6580373 *||29 Nov 1999||17 Jun 2003||Tuner Corporation||Car-mounted image record system|
|US6594597 *||4 Oct 2000||15 Jul 2003||The Minster Machine Company||Press residual life monitor|
|US6629139 *||1 Sep 1999||30 Sep 2003||International Business Machines Corporation||Reactive detection of conditions of interest in distributed systems|
|US6629397||16 Oct 1998||7 Oct 2003||Focke & Co. (Gmbh)||Machine monitoring apparatus capable of incorporation into a network|
|US6630884 *||12 Jun 2000||7 Oct 2003||Lucent Technologies Inc.||Surveillance system for vehicles that captures visual or audio data|
|US6686838||10 Nov 2000||3 Feb 2004||Xanboo Inc.||Systems and methods for the automatic registration of devices|
|US6701058 *||28 Dec 1999||2 Mar 2004||Fuji Photo Film Co., Ltd.||Image capturing and recording system utilizing wireless communication and image transmission-reception method thereof|
|US6731332 *||28 Jan 1999||4 May 2004||Matsushita Electric Industrial Co., Ltd.||Image processing apparatus|
|US6795642 *||30 Jul 2001||21 Sep 2004||Matsushita Electric Industrial, Co., Ltd.||Video recording apparatus and monitoring apparatus|
|US6803945 *||21 Sep 1999||12 Oct 2004||Intel Corporation||Motion detecting web camera system|
|US6813312 *||3 Feb 1999||2 Nov 2004||Axis, Ab||Data storage and reduction method for digital images, and a surveillance system using said method|
|US6816085||17 Nov 2000||9 Nov 2004||Michael N. Haynes||Method for managing a parking lot|
|US6850861 *||21 May 1999||1 Feb 2005||Syracuse University||System for monitoring sensing device data such as food sensing device data|
|US6864918 *||15 Oct 2001||8 Mar 2005||Canon Kabushiki Kaisha||Image pickup apparatus, method of controlling the image pickup apparatus, and external device|
|US6873261 *||17 Jan 2003||29 Mar 2005||Eric Anthony||Early warning near-real-time security system|
|US6943681||16 Sep 2003||13 Sep 2005||Xanboo, Inc.||Systems and methods for the automatic registration of devices|
|US6950013||31 May 2002||27 Sep 2005||Robert Jeffery Scaman||Incident recording secure database|
|US6954859 *||8 Oct 1999||11 Oct 2005||Axcess, Inc.||Networked digital security system and methods|
|US6961087 *||29 Apr 1998||1 Nov 2005||Canon Kabushiki Kaisha||Portable electronic apparatus, image processing method, photographic apparatus, and computer readable recording medium|
|US7075251||5 Dec 2003||11 Jul 2006||General Electric Company||Universal platform for phase dimming discharge lighting ballast and lamp|
|US7075568 *||15 Oct 2001||11 Jul 2006||Canon Kabushiki Kaisha||Digital camera, system, and method for capturing and storing an image, and using an event signal to indicate a change in the content stored in a memory|
|US7123166 *||8 Oct 2004||17 Oct 2006||Haynes Michael N||Method for managing a parking lot|
|US7131136||10 Jul 2002||31 Oct 2006||E-Watch, Inc.||Comprehensive multi-media surveillance and response system for aircraft, operations centers, airports and other commercial transports, centers and terminals|
|US7173526||4 Nov 2005||6 Feb 2007||Monroe David A||Apparatus and method of collecting and distributing event data to strategic security personnel and response vehicles|
|US7173672||8 Aug 2002||6 Feb 2007||Sony Corporation||System and method for transitioning between real images and virtual images|
|US7190882||19 Mar 2001||13 Mar 2007||Applied Concepts, Inc.||In-car digital video recording with MPEG-4 compression for police cruisers and other vehicles|
|US7197228||28 Aug 1998||27 Mar 2007||Monroe David A||Multifunction remote control system for audio and video recording, capture, transmission and playback of full motion and still images|
|US7205891||20 Sep 2004||17 Apr 2007||Purdue Research Foundation||Real-time wireless video exposure monitoring system|
|US7228429||21 Sep 2001||5 Jun 2007||E-Watch||Multimedia network appliances for security and surveillance applications|
|US7250854||1 Jun 2005||31 Jul 2007||Xanboo, Inc.||Systems and methods for the automatic registration of devices|
|US7265663||22 Oct 2002||4 Sep 2007||Trivinci Systems, Llc||Multimedia racing experience system|
|US7272179||1 Nov 2002||18 Sep 2007||Security With Advanced Technology, Inc.||Remote surveillance system|
|US7286158||22 Dec 1999||23 Oct 2007||Axcess International Inc.||Method and system for providing integrated remote monitoring services|
|US7339609 *||8 Aug 2002||4 Mar 2008||Sony Corporation||System and method for enhancing real-time data feeds|
|US7348895 *||3 Nov 2005||25 Mar 2008||Lagassey Paul J||Advanced automobile accident detection, data recordation and reporting system|
|US7355626 *||29 Apr 2002||8 Apr 2008||Infrared Integrated Systems Limited||Location of events in a three dimensional space under surveillance|
|US7359622||14 Feb 2005||15 Apr 2008||Monroe David A||Multifunction remote control system for audio and video recording, capture, transmission and playback of full motion and still images|
|US7365871||3 Jan 2003||29 Apr 2008||Monroe David A||Apparatus for capturing, converting and transmitting a visual image signal via a digital transmission system|
|US7399128 *||18 Feb 2004||15 Jul 2008||Matsushita Electric Industrial Co., Ltd.||Camera control system|
|US7400249||14 Feb 2005||15 Jul 2008||Monroe David A||Networked personal security system|
|US7428002||5 Jun 2002||23 Sep 2008||Monroe David A||Emergency telephone with integrated surveillance system connectivity|
|US7428368||29 Nov 2005||23 Sep 2008||Monroe David A||Multifunction remote control system for audio and video recording, capture, transmission and playback of full motion and still images|
|US7456875||12 Mar 2003||25 Nov 2008||Sony Corporation||Image pickup apparatus and method, signal processing apparatus and method, and wearable signal processing apparatus|
|US7477309||10 Jul 2003||13 Jan 2009||Lockheed Martin Corporation||Infrared camera system and method|
|US7495562||28 Dec 2006||24 Feb 2009||David A Monroe||Networked personal security system|
|US7511612||15 Sep 2005||31 Mar 2009||Monroe David A||Ground based security surveillance system for aircraft and other commercial vehicles|
|US7525421||11 May 2005||28 Apr 2009||Raytheon Company||Event detection module|
|US7539357||10 Aug 1999||26 May 2009||Monroe David A||Method and apparatus for sending and receiving facsimile transmissions over a non-telephonic transmission system|
|US7551075||28 Dec 2006||23 Jun 2009||David A Monroe||Ground based security surveillance system for aircraft and other commercial vehicles|
|US7555528||22 Jun 2001||30 Jun 2009||Xanboo Inc.||Systems and methods for virtually representing devices at remote sites|
|US7561037||28 Dec 2006||14 Jul 2009||Monroe David A||Apparatus for and method of collecting and distributing event data to strategic security personnel and response vehicles|
|US7576770||11 Feb 2004||18 Aug 2009||Raymond Metzger||System for a plurality of video cameras disposed on a common network|
|US7583197 *||10 Jan 2006||1 Sep 2009||Eveline Wesby Van Swaay||Programmable communicator|
|US7586516||28 Jan 2005||8 Sep 2009||Canon Kabushiki Kaisha||Image pickup apparatus, method of controlling the image pickup apparatus, and external device|
|US7609941 *||20 Oct 2004||27 Oct 2009||Panasonic Corporation||Multimedia data recording apparatus, monitor system, and multimedia data recording method|
|US7629886||9 Nov 2005||8 Dec 2009||Axcess International, Inc.||Method and system for networking radio tags in a radio frequency identification system|
|US7634334||28 Dec 2006||15 Dec 2009||Monroe David A||Record and playback system for aircraft|
|US7634361 *||11 May 2005||15 Dec 2009||Raytheon Company||Event alert system and method|
|US7634662||21 Nov 2003||15 Dec 2009||Monroe David A||Method for incorporating facial recognition technology in a multimedia surveillance system|
|US7640083||21 Nov 2003||29 Dec 2009||Monroe David A||Record and playback system for aircraft|
|US7643168||28 Dec 2006||5 Jan 2010||Monroe David A||Apparatus for capturing, converting and transmitting a visual image signal via a digital transmission system|
|US7650058||8 Jan 2002||19 Jan 2010||Cernium Corporation||Object selective video recording|
|US7652593||5 Oct 2006||26 Jan 2010||Haynes Michael N||Method for managing a parking lot|
|US7659827||8 May 2006||9 Feb 2010||Drivecam, Inc.||System and method for taking risk out of driving|
|US7688225||22 Oct 2007||30 Mar 2010||Haynes Michael N||Method for managing a parking lot|
|US7692695 *||8 Feb 2007||6 Apr 2010||Sony Corporation||Imaging apparatus and control method therefor|
|US7733371||14 Nov 2005||8 Jun 2010||Monroe David A||Digital security multimedia sensor|
|US7734724||2 Feb 2001||8 Jun 2010||Xanboo Inc.||Automated upload of content based on captured event|
|US7768566||28 Dec 2006||3 Aug 2010||David A Monroe||Dual-mode camera|
|US7782365||23 May 2006||24 Aug 2010||Searete Llc||Enhanced video/still image correlation|
|US7787024 *||13 Apr 2005||31 Aug 2010||Canon Kabushiki Kaisha||Portable electronic apparatus, image processing method, photographing apparatus, and computer readable recording medium|
|US7796023||27 Jun 2007||14 Sep 2010||Babak Rezvani||Systems and methods for the automatic registration of devices|
|US7800503||11 May 2007||21 Sep 2010||Axcess International Inc.||Radio frequency identification (RFID) tag antenna design|
|US7804426||4 Dec 2006||28 Sep 2010||Drivecam, Inc.||System and method for selective review of event data|
|US7830962 *||31 Mar 2006||9 Nov 2010||Fernandez Dennis S||Monitoring remote patients|
|US7839432||28 Mar 2001||23 Nov 2010||Dennis Sunga Fernandez||Detector selection for monitoring objects|
|US7839926||21 Apr 2005||23 Nov 2010||Metzger Raymond R||Bandwidth management and control|
|US7841120||10 Jan 2007||30 Nov 2010||Wilcox Industries Corp.||Hand grip apparatus for firearm|
|US7859396||22 Nov 2006||28 Dec 2010||Monroe David A||Multimedia network appliances for security and surveillance applications|
|US7872675||31 Oct 2005||18 Jan 2011||The Invention Science Fund I, Llc||Saved-image management|
|US7876357||2 Jun 2005||25 Jan 2011||The Invention Science Fund I, Llc||Estimating shared image device operational capabilities or resources|
|US7889846 *||13 Sep 2005||15 Feb 2011||International Business Machines Corporation||Voice coordination/data retrieval facility for first responders|
|US7920169||26 Apr 2005||5 Apr 2011||Invention Science Fund I, Llc||Proximity of shared image devices|
|US7920626||29 Mar 2001||5 Apr 2011||Lot 3 Acquisition Foundation, Llc||Video surveillance visual recognition|
|US7952609||11 Oct 2005||31 May 2011||Axcess International, Inc.||Networked digital security system and methods|
|US7983835||3 Nov 2005||19 Jul 2011||Lagassey Paul J||Modular intelligent transportation system|
|US8022965||22 May 2006||20 Sep 2011||Sony Corporation||System and method for data assisted chroma-keying|
|US8026945||24 Jul 2006||27 Sep 2011||Cernium Corporation||Directed attention digital video recordation|
|US8068143||30 Oct 2003||29 Nov 2011||Hewlett-Packard Development Company, L.P.||Camera apparatus with saliency signal generation|
|US8068986||28 Apr 2008||29 Nov 2011||Majid Shahbazi||Methods and apparatus related to sensor signal sniffing and/or analysis|
|US8072501||20 Sep 2006||6 Dec 2011||The Invention Science Fund I, Llc||Preservation and/or degradation of a video/audio data stream|
|US8094010||10 Aug 2009||10 Jan 2012||Wesby-Van Swaay Eveline||Programmable communicator|
|US8126276 *||21 Feb 2001||28 Feb 2012||International Business Machines Corporation||Business method for selectable semantic codec pairs for very low data-rate video transmission|
|US8180336||5 Jun 2009||15 May 2012||M2M Solutions Llc||System and method for remote asset management|
|US8233042||26 May 2006||31 Jul 2012||The Invention Science Fund I, Llc||Preservation and/or degradation of a video/audio data stream|
|US8253821||22 Aug 2006||28 Aug 2012||The Invention Science Fund I, Llc||Degradation/preservation management of captured data|
|US8269617||26 Jan 2009||18 Sep 2012||Drivecam, Inc.||Method and system for tuning the effect of vehicle characteristics on risk prediction|
|US8284304||11 Jul 2008||9 Oct 2012||Canon Kabushiki Kaisha||Portable electronic apparatus, image processing method, photographing apparatus, and computer readable recording medium|
|US8314708||8 May 2006||20 Nov 2012||Drivecam, Inc.||System and method for reducing driving risk with foresight|
|US8335254||23 Oct 2006||18 Dec 2012||Lot 3 Acquisition Foundation, Llc||Advertisements over a network|
|US8350946||22 Sep 2010||8 Jan 2013||The Invention Science Fund I, Llc||Viewfinder for shared image device|
|US8364147||24 Sep 2011||29 Jan 2013||Verint Americas, Inc.||System and method for determining commonly used communication terminals and for identifying noisy entities in large-scale link analysis|
|US8457350||2 Sep 2011||4 Jun 2013||Sony Corporation||System and method for data assisted chrom-keying|
|US8457622||19 Apr 2012||4 Jun 2013||M2M Solutions Llc||System and method for remote asset management|
|US8493442||29 Mar 2001||23 Jul 2013||Lot 3 Acquisition Foundation, Llc||Object location information|
|US8503972||30 Oct 2009||6 Aug 2013||Digital Ally, Inc.||Multi-functional remote monitoring system|
|US8504007||11 Sep 2012||6 Aug 2013||M2M Solutions Llc||System and method for remote asset management|
|US8508353||11 Jun 2010||13 Aug 2013||Drivecam, Inc.||Driver risk assessment system and method having calibrating automatic event scoring|
|US8509733||28 Apr 2011||13 Aug 2013||Verint Americas, Inc.||System and method for determining commonly used communication terminals and for identifying noisy entities in large-scale link analysis|
|US8520069||10 Aug 2008||27 Aug 2013||Digital Ally, Inc.||Vehicle-mounted video system with distributed processing|
|US8542111||13 Mar 2013||24 Sep 2013||M2M Solutions Llc||Programmable communicator|
|US8564446||15 Sep 2011||22 Oct 2013||Drivecam, Inc.||System and method for reducing driving risk with foresight|
|US8577358||7 Mar 2013||5 Nov 2013||M2M Solutions Llc||System and method for remote asset management|
|US8577359||14 Mar 2013||5 Nov 2013||M2M Solutions Llc||System and method for remote asset management|
|US8587655||23 Sep 2011||19 Nov 2013||Checkvideo Llc||Directed attention digital video recordation|
|US8589994||12 Jul 2006||19 Nov 2013||David A. Monroe||Comprehensive multi-media surveillance and response system for aircraft, operations centers, airports and other commercial transports, centers and terminals|
|US8601494||14 Jan 2008||3 Dec 2013||International Business Machines Corporation||Multi-event type monitoring and searching|
|US8606383||23 Apr 2010||10 Dec 2013||The Invention Science Fund I, Llc||Audio sharing|
|US8606492||31 Aug 2011||10 Dec 2013||Drivecam, Inc.||Driver log generation|
|US8633802||16 Dec 2011||21 Jan 2014||M2M Solutions Llc||Programmable communicator|
|US8638194||25 Jul 2008||28 Jan 2014||Axcess International, Inc.||Multiple radio frequency identification (RFID) tag wireless wide area network (WWAN) protocol|
|US8648717||3 Jul 2013||11 Feb 2014||M2M Solutions Llc||Programmable communicator|
|US8665728||27 Oct 2011||4 Mar 2014||Verint Systems, Ltd.||System and method for IP target traffic analysis|
|US8676428||17 Apr 2012||18 Mar 2014||Lytx, Inc.||Server request for downloaded information from a vehicle-based monitor|
|US8681225||3 Apr 2006||25 Mar 2014||Royce A. Levien||Storage access technique for captured data|
|US8681640||7 Jun 2011||25 Mar 2014||Verint Systems, Ltd.||Systems and methods for extracting media from network traffic having unknown protocols|
|US8723664||9 Aug 2010||13 May 2014||Nest Labs, Inc.||Systems and methods for the automatic registration of devices|
|US8731815||18 Sep 2009||20 May 2014||Charles Arnold Cummings||Holistic cybernetic vehicle control|
|US8743204 *||7 Jan 2011||3 Jun 2014||International Business Machines Corporation||Detecting and monitoring event occurrences using fiber optic sensors|
|US8744642||16 Sep 2011||3 Jun 2014||Lytx, Inc.||Driver identification based on face data|
|US8767551||25 Jan 2012||1 Jul 2014||Verint Systems, Ltd.||System and method for flow table management|
|US8804033||15 Jun 2011||12 Aug 2014||The Invention Science Fund I, Llc||Preservation/degradation of video/audio aspects of a data stream|
|US8849501||21 Jan 2010||30 Sep 2014||Lytx, Inc.||Driver risk assessment system and method employing selectively automatic event scoring|
|US8854199||3 Jun 2010||7 Oct 2014||Lytx, Inc.||Driver risk assessment system and method employing automated driver log|
|US8860804||27 Apr 2010||14 Oct 2014||Xanboo Inc.||Automated upload of content based on captured event|
|US8866589||31 Jan 2014||21 Oct 2014||M2M Solutions Llc||Programmable communicator|
|US8872624||7 Feb 2014||28 Oct 2014||M2M Solutions Llc||Programmable communicator|
|US8890953 *||27 Jun 2011||18 Nov 2014||Rawles Llc||Optical-based scene detection and audio extraction|
|US8892310||21 Feb 2014||18 Nov 2014||Smartdrive Systems, Inc.||System and method to detect execution of driving maneuvers|
|US8902320||14 Jun 2005||2 Dec 2014||The Invention Science Fund I, Llc||Shared image device synchronization or designation|
|US8928752||28 Aug 2007||6 Jan 2015||Stellar Llc||Recording device with pre-start signal storage capability|
|US8959025||28 Apr 2011||17 Feb 2015||Verint Systems Ltd.||System and method for automatic identification of speech coding scheme|
|US8959329||13 Apr 2012||17 Feb 2015||Verint Systems Ltd.||System and method for selective inspection of encrypted traffic|
|US8964054||1 Feb 2007||24 Feb 2015||The Invention Science Fund I, Llc||Capturing selected image objects|
|US8982944||27 Jan 2010||17 Mar 2015||Enforcement Video, Llc||Method and system for categorized event recording of images in multiple resolution levels|
|US8988537||13 Sep 2007||24 Mar 2015||The Invention Science Fund I, Llc||Shared image devices|
|US8989914||19 Dec 2011||24 Mar 2015||Lytx, Inc.||Driver identification based on driving maneuver signature|
|US8990238||26 Apr 2012||24 Mar 2015||Verint Systems Ltd.||System and method for keyword spotting using multiple character encoding schemes|
|US8996234||11 Oct 2011||31 Mar 2015||Lytx, Inc.||Driver performance determination based on geolocation|
|US8996240||16 Mar 2006||31 Mar 2015||Smartdrive Systems, Inc.||Vehicle event recorders with integrated web server|
|US9001215||28 Nov 2007||7 Apr 2015||The Invention Science Fund I, Llc||Estimating shared image device operational capabilities or resources|
|US9019383||31 Oct 2008||28 Apr 2015||The Invention Science Fund I, Llc||Shared image devices|
|US9041826||18 Aug 2006||26 May 2015||The Invention Science Fund I, Llc||Capturing selected image objects|
|US9060029||29 Oct 2012||16 Jun 2015||Verint Systems Ltd.||System and method for target profiling using social network analysis|
|US9075136||1 Mar 1999||7 Jul 2015||Gtj Ventures, Llc||Vehicle operator and/or occupant information apparatus and method|
|US9076208||28 Feb 2006||7 Jul 2015||The Invention Science Fund I, Llc||Imagery processing|
|US9078152||8 Aug 2014||7 Jul 2015||M2M Solutions Llc||Programmable communicator|
|US9082456||26 Jul 2005||14 Jul 2015||The Invention Science Fund I Llc||Shared image device designation|
|US9090295||18 Nov 2013||28 Jul 2015||The Wilfred J. and Louisette G. Lagassey Irrevocable Trust||Modular intelligent transportation system|
|US9092921 *||8 Jan 2013||28 Jul 2015||Lytx, Inc.||Server determined bandwidth saving in transmission of events|
|US9093121||29 Jun 2011||28 Jul 2015||The Invention Science Fund I, Llc||Data management of an audio data stream|
|US9094371||9 Sep 2014||28 Jul 2015||Google Inc.||Node having components for performing functions and software for controlling the components if the node has been registered to a user account at a remote site|
|US9100368||9 Sep 2014||4 Aug 2015||Google Inc.||Methods and systems for installing a device at a location featuring a client application capable of displaying installation instructions via a client device|
|US20010029613 *||29 Mar 2001||11 Oct 2001||Fernandez Dennis Sunga||Integrated network for monitoring remote objects|
|US20020015582 *||30 Jul 2001||7 Feb 2002||Matsushita Electric Industrial Co., Ltd.||Video recording apparatus and monitoring apparatus|
|US20020063781 *||15 Oct 2001||30 May 2002||Takashi Aizawa||Digital information input system|
|US20020065076 *||8 Jul 1999||30 May 2002||David A. Monroe||Apparatus and method for selection of circuit in multi-circuit communications device|
|US20040080608 *||20 Dec 2002||29 Apr 2004||Monroe David A.||Method and apparatus for image capture, compression and transmission of a visual image over telephonic or radio transmission system|
|US20040098515 *||16 Sep 2003||20 May 2004||Babak Rezvani||Systems and methods for the automatic registration of devices|
|US20040141062 *||30 Oct 2003||22 Jul 2004||Maurizio Pilu||Camera apparatus with saliency signal generation|
|US20040183903 *||21 Mar 2003||23 Sep 2004||Pedersen Christen Kent||Method and system for managing data in a system|
|US20040201765 *||23 Jan 2002||14 Oct 2004||Gammenthaler Robert S.||In-car digital video recording with MPEG compression|
|US20040207729 *||11 May 2004||21 Oct 2004||Yoichi Takagi||Image processor, intruder monitoring apparatus and intruder monitoring method|
|US20050007454 *||6 Aug 2004||13 Jan 2005||Needham Bradford H.||Motion detecting web camera system|
|US20050110634 *||20 Nov 2003||26 May 2005||Salcedo David M.||Portable security platform|
|US20050122057 *||5 Dec 2003||9 Jun 2005||Timothy Chen||Universal platform for phase dimming discharge lighting ballast and lamp|
|US20050128321 *||28 Jan 2005||16 Jun 2005||Canon Kabushiki Kaisha||Image pickup apparatus, method of controlling the image pickup apparatus, and external device|
|US20050200727 *||13 Apr 2005||15 Sep 2005||Shigeo Yoshida||Portable electronic apparatus, image processing method, photographing apparatus, and computer readable recording medium|
|US20050232352 *||1 Jun 2005||20 Oct 2005||A4S Security, Inc. (Formerly A4S Technologies, Inc.)||Remote surveillance system|
|US20050259151 *||12 Sep 2003||24 Nov 2005||Hamilton Jeffrey A||Incident recording information transfer device|
|US20050264412 *||11 May 2005||1 Dec 2005||Raytheon Company||Event alert system and method|
|US20060001537 *||3 Feb 2005||5 Jan 2006||Blake Wilbert L||System and method for remote access to security event information|
|US20060001736 *||2 Feb 2005||5 Jan 2006||Monroe David A||Method and apparatus for image capture, compression and transmission of a visual image over telephonic or radio transmission system|
|US20060010078 *||1 Jun 2005||12 Jan 2006||Xanboo, Inc.||Systems and methods for the automatic registration of devices|
|US20070097214 *||16 Jun 2006||3 May 2007||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Preservation/degradation of video/audio aspects of a data stream|
|US20070098348 *||15 May 2006||3 May 2007||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Degradation/preservation management of captured data|
|US20090213220 *||14 Apr 2008||27 Aug 2009||Tsung Chen||Active monitoring system with multi-spot image display and a method thereof|
|US20100182429 *||19 Mar 2009||22 Jul 2010||Wol Sup Kim||Monitor Observation System and its Observation Control Method|
|US20120069224 *||27 Apr 2011||22 Mar 2012||Andrew Cilia||Method and system for single-camera license-plate recognition and magnification|
|US20120176496 *||7 Jan 2011||12 Jul 2012||International Business Machines Corporation||Detecting and monitoring event occurrences using fiber optic sensors|
|US20130147627 *||18 May 2011||13 Jun 2013||Vcfire System Ab||Fire monitoring system|
|US20140195105 *||8 Jan 2013||10 Jul 2014||Lytx, Inc.||Server determined bandwidth saving in transmission of events|
|EP1178378A1 *||16 Oct 1998||6 Feb 2002||Focke & Co. (GmbH & Co.)||Machine for packaging and/or manufacturing cigarettes|
|EP1345420A2 *||13 Mar 2003||17 Sep 2003||Sony Corporation||Image pickup apparatus having a front and rear camera|
|EP1727073A2 *||29 Mar 2006||29 Nov 2006||Robert Bosch Gmbh||Occupant detection device|
|EP2299591A1 *||20 May 2009||23 Mar 2011||Crambo, S.a.||Ambient sound capture and management device|
|WO1998052358A1 *||13 Mar 1998||19 Nov 1998||Eddie Lau||Transportation surveillance system|
|WO1999029191A2 *||16 Oct 1998||17 Jun 1999||Bretthauer Hans Juergen||Machine, especially a network with a machine, and a method for maintaining and/or diagnosing machines|
|WO2000048154A1 *||10 Feb 2000||17 Aug 2000||Nicholas Bernard Body||Improvements in or relating to control and/or monitoring systems|
|WO2001046923A1 *||11 Dec 2000||28 Jun 2001||Axcess Inc||Method and system for providing integrated remote monitoring services|
|WO2002021472A1 *||6 Sep 2001||14 Mar 2002||Xanboo Inc||Automated upload of content based on captured event|
|WO2003028025A1 *||26 Sep 2002||3 Apr 2003||Orbb Ltd||A system for real time data encryption|
|WO2003030552A1 *||25 Jul 2002||10 Apr 2003||Eric Anthony||Early warning real-time security system|
|WO2004005868A2 *||10 Jul 2003||15 Jan 2004||Lockheed Corp||Infrared camera system and method|
|WO2004107760A1 *||14 May 2003||9 Dec 2004||Docters Fernando||Portable multiple view surveillance system|
|U.S. Classification||380/241, 380/217, 348/143, 702/189, 340/500|
|International Classification||G08B15/00, G08B13/196|
|Cooperative Classification||G08B13/19667, G08B13/19669, G08B13/19673, G08B13/19695, G08B13/19647|
|European Classification||G08B13/196S1, G08B13/196S2, G08B13/196S3T, G08B13/196L3, G08B13/196W|
|12 May 1995||AS||Assignment|
Owner name: WITNESS SYSTEMS, INC., TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SWANSON, DANIEL R.;MOEN, JERRY M.;TATE, BRADLEY M.;REEL/FRAME:007527/0695;SIGNING DATES FROM 19950321 TO 19950322
|6 Mar 2001||FPAY||Fee payment|
Year of fee payment: 4
|14 Mar 2005||FPAY||Fee payment|
Year of fee payment: 8
|24 Jul 2007||AS||Assignment|
Owner name: LEHMAN COMMERCIAL PAPER INC., AS ADMINISTRATIVE AGENT
Free format text: SECURITY AGREEMENT;ASSIGNOR:VERINT AMERICAS, INC.;REEL/FRAME:019588/0854
Effective date: 20070525
|15 Apr 2009||FPAY||Fee payment|
Year of fee payment: 12
|9 Jun 2009||AS||Assignment|
Owner name: CREDIT SUISSE AS ADMINISTRATIVE AGENT, NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VERINT AMERICAS INC.;LEHMAN COMMERCIAL PAPER INC.;REEL/FRAME:022793/0976
Effective date: 20090604
|2 May 2011||AS||Assignment|
Owner name: VERINT VIDEO SOLUTIONS INC., NEW YORK
Free format text: RELEASE OF SECURITY INTEREST;ASSIGNOR:CREDIT SUISSE AG;REEL/FRAME:026206/0340
Effective date: 20110429
Owner name: VERINT SYSTEMS INC., NEW YORK
Free format text: RELEASE OF SECURITY INTEREST;ASSIGNOR:CREDIT SUISSE AG;REEL/FRAME:026206/0340
Effective date: 20110429
Owner name: VERINT AMERICAS INC., NEW YORK
Free format text: RELEASE OF SECURITY INTEREST;ASSIGNOR:CREDIT SUISSE AG;REEL/FRAME:026206/0340
Effective date: 20110429