US20140320648A1 - Remote User Interface & Display For Events For A Monitored Location

Remote User Interface & Display For Events For A Monitored Location

Info

Publication number
US20140320648A1
Authority
US
United States
Prior art keywords
user
information
location
sensor
sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/260,256
Inventor
Adam D. Sager
Chris I. Rill
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canary Connect Inc
Original Assignee
Canary Connect Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canary Connect Inc filed Critical Canary Connect Inc
Priority to US14/260,256 priority Critical patent/US20140320648A1/en
Publication of US20140320648A1 publication Critical patent/US20140320648A1/en
Assigned to Canary Connect, Inc. reassignment Canary Connect, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SAGER, ADAM D., RILL, CHRIS I.
Assigned to VENTURE LENDING & LEASING VII, INC. reassignment VENTURE LENDING & LEASING VII, INC. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Canary Connect, Inc.
Assigned to SILICON VALLEY BANK reassignment SILICON VALLEY BANK SECURITY AGREEMENT Assignors: Canary Connect, Inc.
Assigned to VENTURE LENDING & LEASING VIII, INC., VENTURE LENDING & LEASING VII, INC. reassignment VENTURE LENDING & LEASING VIII, INC. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Canary Connect, Inc.
Priority to US15/457,182 priority patent/US10083599B2/en
Assigned to Canary Connect, Inc. reassignment Canary Connect, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SAGER, ADAM D., RILL, CHRISTOPHER I.

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19678 User interface
    • G08B13/19684 Portable terminal, e.g. mobile phone, used for viewing video remotely
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01K MEASURING TEMPERATURE; MEASURING QUANTITY OF HEAT; THERMALLY-SENSITIVE ELEMENTS NOT OTHERWISE PROVIDED FOR
    • G01K1/00 Details of thermometers not specially adapted for particular types of thermometer
    • G01K1/14 Supports; Fastening devices; Arrangements for mounting thermometers in particular locations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/0004 Gaseous mixtures, e.g. polluted air
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19678 User interface
    • G08B13/19682 Graphic User Interface [GUI] presenting system data to the user, e.g. information on a screen helping a user interacting with an alarm system
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19695 Arrangements wherein non-video detectors start video recording or forwarding but do not generate an alarm themselves
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B19/00 Alarms responsive to two or more different undesired or abnormal conditions, e.g. burglary and fire, abnormal temperature and abnormal rate of flow
    • G08B19/005 Alarms responsive to two or more different undesired or abnormal conditions, e.g. burglary and fire, abnormal temperature and abnormal rate of flow combined burglary and fire alarm systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B23/00 Alarms responsive to unspecified undesired or abnormal conditions
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/001 Alarm cancelling procedures or alarm forwarding decisions, e.g. based on absence of alarm confirmation
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/003 Address allocation methods and details
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/005 Alarm destination chosen according to a hierarchy of available destinations, e.g. if hospital does not answer send to police station
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/006 Alarm destination chosen according to type of event, e.g. in case of fire phone the fire service, in case of medical emergency phone the ambulance
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/008 Alarm setting and unsetting, i.e. arming or disarming of the security system
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/10 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using wireless transmission systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B27/00 Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations
    • G08B27/001 Signalling to an emergency team, e.g. firemen
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B27/00 Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations
    • G08B27/006 Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations with transmission via telephone network
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B29/00 Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
    • G08B29/18 Prevention or correction of operating errors
    • G08B29/185 Signal analysis techniques for reducing or preventing false alarms or for enhancing the reliability of the system
    • G08B29/188 Data fusion; cooperative systems, e.g. voting among different detectors
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C19/00 Electric signal transmission systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L12/2823 Reporting information sensed by appliance or service execution status of appliance services in a home automation network
    • H04L12/2825 Reporting to a device located outside the home and the home network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L12/2823 Reporting information sensed by appliance or service execution status of appliance services in a home automation network
    • H04L12/2827 Reporting to a device within the home network; wherein the reception of the information reported automatically triggers the execution of a home appliance functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/08 Configuration management of networks or network elements
    • H04L41/0803 Configuration setting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q SELECTING
    • H04Q9/00 Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06 Authentication
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/0202 Child monitoring systems using a transmitter-receiver system carried by the parent and the child
    • G08B21/0205 Specific application combined with child monitoring using a transmitter-receiver system
    • G08B21/0208 Combination with audio or video communication, e.g. combination with "baby phone" function
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L2012/284 Home automation networks characterised by the type of medium used
    • H04L2012/2841 Wireless
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/08 Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/083 Network architectures or network communication protocols for network security for authentication of entities using passwords
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188 Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q SELECTING
    • H04Q2209/00 Arrangements in telecontrol or telemetry systems
    • H04Q2209/80 Arrangements in the sub-station, i.e. sensing device
    • H04Q2209/82 Arrangements in the sub-station, i.e. sensing device where the sensing device takes the initiative of sending data
    • H04Q2209/823 Arrangements in the sub-station, i.e. sensing device where the sensing device takes the initiative of sending data where the data is sent when the measured values exceed a threshold, e.g. sending an alarm

Definitions

  • The invention relates to home security and monitoring devices and systems and, more particularly, to home security and monitoring devices comprising a plurality of sensors, and to systems having learning capabilities.
  • Home security systems typically include a door alarm, which can trigger an audible sound, such as a siren, when the door is opened when the system is armed.
  • the door alarm may be triggered when a contact between two surfaces is opened when the door is opened. Similar contacts may be provided on windows, to trigger an alarm when a window is opened when the system is armed. Sensors can also be provided to detect when the glass of a window is broken.
  • the home security system may inform a call center that an alarm has been triggered.
  • a user may enter a code into a device that communicates with the call center, to inform the call center that an alarm is a false alarm.
  • the call center may also call a home to inquire about an alarm.
  • the call center may call the police if a user does not respond.
  • Such home systems are prone to false alarms, which can be annoying to residents and neighbors, and waste the time and resources of authorities, such as the police and fire department.
  • Other types of home security devices are also known, such as fire, smoke, and carbon monoxide detectors, for example. Often, such devices are managed and positioned by a user, and provide a siren or other such sound if triggered.
  • Webcams may be provided to monitor children, babysitters, and other activities in a home. Webcams provide a large amount of often irrelevant video that needs to be reviewed, which can be time-consuming even when using a fast-forward function.
  • Home automation systems allow for remote control of lights, thermostats, and security devices.
  • a device may transmit audio, video, and/or sensor data within one or multiple streams, via Wifi or other wireless communications, simultaneously, to a processing system for analysis.
  • the processing system may be a discrete entity or may be a cloud based system, for example.
  • the sensors in the device may include a temperature sensor, humidity sensor, air quality sensor, motion detector, and/or other sensors, as described below. Since contemporaneous data from multiple or all of the sensors may be analyzed together, a more complete understanding of what is happening at a location may be developed than when data from a single sensor is analyzed, as in the known prior art.
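As an illustration of how contemporaneous readings from multiple sensors might travel together, here is a minimal sketch assuming a JSON message sent over a single WiFi or cellular stream; every field name is hypothetical, not taken from the patent.

```python
import json
import time

def build_payload(device_id, temperature_c, humidity_pct, air_quality_voc,
                  motion_detected, video_clip_ref=None, audio_clip_ref=None):
    """Bundle contemporaneous readings from every sensor into one message,
    so the processing system can analyze them together."""
    return json.dumps({
        "device_id": device_id,
        "timestamp": time.time(),          # one timestamp covers all readings
        "temperature_c": temperature_c,
        "humidity_pct": humidity_pct,
        "air_quality_voc": air_quality_voc,
        "motion_detected": motion_detected,
        "video_clip_ref": video_clip_ref,  # reference to a clip uploaded alongside
        "audio_clip_ref": audio_clip_ref,
    })

print(build_payload("device-001", 21.5, 40.2, 12.0, True, "clips/0001.mp4"))
```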
  • the device may include a camera (for video), a microphone (for audio), a passive infrared sensor (to detect motion), and life safety sensors, such as air quality monitoring sensors, carbon dioxide monitoring sensors, temperature monitoring sensors, humidity monitoring sensors and/or other air quality or atmospheric sensors, for example.
  • a processing device is provided in the device, as well, and an additional processing device in the home is not needed.
  • the device is able to combine the data feeds from the sensors simultaneously to create a clear picture of what is occurring in the one location. For instance, by putting a video feed in the same device as a temperature sensor, it can be determined whether movement of people or actions of people impacts the temperature of a given location.
  • the device may gather precise readings because of its physical proximity to the monitored location, and provide more accurate and more immediate data by including the readings in the same data stream sent via Wifi and/or cellular networks.
  • Simultaneous sensor feeds also mean that the user can have a complete picture of the audio, visual, and sensor readings from a physical place at any given time, and more accurately detect patterns of change, or the impact of one sensor or activity on another sensor or activity.
  • the combined functionality assists in determining what is ordinary and what is out-of-the-ordinary in a location. This assists in decreasing false alarms, and identifying events that a user wants and/or needs to be informed of.
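A toy sketch of such cross-sensor corroboration, assuming simple fixed thresholds (the system described here instead learns what is ordinary for each location); requiring agreement between sensors before alerting is one way combined feeds can reduce false alarms.

```python
def event_score(motion, temp_rise_c_per_min, sound_level_db):
    """Score an event by requiring agreement between sensors, so a single
    noisy reading (e.g., motion from a pet) is less likely to raise an alarm."""
    score = 0
    if motion:
        score += 1
    if temp_rise_c_per_min > 2.0:   # rapid heating suggests a possible fire
        score += 2
    if sound_level_db > 85:         # loud noise, e.g. glass break or siren
        score += 1
    return score

# Motion alone stays below an alert threshold of 3;
# motion plus a rapid temperature rise reaches it.
assert event_score(True, 0.1, 40) < 3
assert event_score(True, 3.0, 40) >= 3
```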
  • Embodiments of the invention provide a network-based, such as a cloud-based, communal security layer to the Internet, powered by integrated and intelligent device/s used to connect people to their homes, other people, and places that are important to them, enabling peace-of-mind, a sense of safety, security, and a detection of situations that deviate from the norm.
  • a virtual neighborhood community watch may be defined to enable groups of people to look after each other via alert and notification-based actions for preventing, reacting to, and responding to, life-safety events and other events, for example.
  • the security device facilitates user monitoring of a physical place, through the collection of continuous information from the location when the device is powered. This information is analyzed by the device and/or a processing system, such as a cloud based processing system, to determine if an event has occurred that is of interest to the user.
  • a user may be one or more members of a household, for example.
  • An event of interest can be a negative event, such as a security or safety incident, including a life-threatening incident such as a fire or an attacker, for example.
  • An event of interest may also be a neutral or positive incident, such as watching one's child.
  • an event of interest may also be an event that is out of the ordinary, such as an unknown person entering the home or a known person entering the home at an unexpected time, for example.
  • the overall importance of an incident may be determined both by an individual user as well as compiled information from a multitude of users.
  • the device, therefore, acts as a portal connecting a physical place to the web and to internet/cloud-based services.
  • Events may be defined in multiple ways. For example, events may be defined by the system, by defining thresholds for temperature, rate of temperature change, air quality, or movement, for example. In defining events, the system may take into account crowd-sourced information collected from a large number of devices at a large number of locations, for example.
  • the crowd-sourced information applied to a particular user may be derived from similarly situated users, such as users in one-bedroom apartments in New York City, users in suburban homes outside of Washington D.C., etc.
  • the system 14 may learn and apply criteria, such as thresholds, for single sensors and combinations of sensors, for that user.
  • Events may also be defined by individual user input, allowing the user to essentially ‘program’ the device to learn the user's personal preferences and how they interact from a security and safety perspective. Users may provide input to the system to define normal or acceptable activities, and/or to confirm or modify system defined and/or crowd-sourced defined events.
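One plausible way to layer these definitions is to let individual user input override crowd-sourced defaults, as in the sketch below; the threshold names and values are illustrative only.

```python
# Hypothetical crowd-sourced defaults for similarly situated users.
CROWD_DEFAULTS = {"max_temp_c": 35.0, "min_temp_c": 5.0, "max_voc": 400.0}

def effective_thresholds(user_overrides):
    """User-defined preferences take precedence over crowd-sourced defaults."""
    merged = dict(CROWD_DEFAULTS)
    merged.update(user_overrides)
    return merged

# A user in a warm climate relaxes the high-temperature threshold.
print(effective_thresholds({"max_temp_c": 38.0}))
```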
  • the device and/or system record a series of events, including data from a sensor (or multiple sensors) identified as of interest based on being above/below a threshold or an out of the ordinary event, for example.
  • the data is then sent to the processing system, which determines, among other things, the time of day of the incident, who is in the home or location, or when the event happened.
  • Individual notifications sent over the course of a day may be displayed to a user on a user device in the form of a timeline of events, so that a user can monitor the activities that have taken place in the location. This further assists the user in understanding what happens in a location.
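A minimal sketch of assembling a day's notifications into a displayable timeline, assuming each notification carries a POSIX timestamp; the data shape is hypothetical.

```python
from datetime import datetime

def to_timeline(notifications):
    """Sort notifications chronologically and format each one for display."""
    timeline = []
    for n in sorted(notifications, key=lambda n: n["timestamp"]):
        when = datetime.fromtimestamp(n["timestamp"]).strftime("%H:%M")
        timeline.append(f"{when}  {n['message']}")
    return timeline

events = [
    {"timestamp": 1400000600, "message": "Motion detected in living room"},
    {"timestamp": 1400000000, "message": "Device armed"},
]
print("\n".join(to_timeline(events)))
```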
  • a single security/monitoring device can be used.
  • multiple devices can be used.
  • the device connects to a home network or directly to a user via their smartphone, other mobile device, and/or computer.
  • the device may constantly be on and gathering data, which means that it will be a continual source of information for what is occurring in a single location, and for determining if anything is out of the ordinary.
  • the user may turn off the device, when desired, either directly or remotely, via their smartphone, etc.
  • processing of collected data may take place on the device and/or in a processing center.
  • relevance-confirming analytics can be performed on the device, such as through video, audio, and motion sensing, to determine whether an event might have taken place and whether the data should be sent to the processing center via the network for further analysis.
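A sketch of that on-device pre-filtering, assuming the device uploads buffered data only when a local analytic (PIR motion, audio level, or frame differencing) fires; the thresholds are illustrative.

```python
def should_upload(motion, audio_rms, frame_diff_ratio):
    """Return True when any local analytic suggests an event worth sending
    to the processing center for deeper analysis."""
    return motion or audio_rms > 0.2 or frame_diff_ratio > 0.05

# A loud sound alone is enough to forward the buffered audio/video/sensor data.
print(should_upload(motion=False, audio_rms=0.3, frame_diff_ratio=0.01))  # True
```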
  • the processing system determines if an event or incident within the range of the device is of interest to the owner of the device, either by being a deviation from what is normal or by being another event that gives insight into what is taking place inside the home, for example.
  • Inputs from one or more sensors may be used to determine if there is an incident or an event worth informing the owner of the device about, by receiving and analyzing sensor data from multiple sensors.
  • the processing system communicates through the network directly with a user or group of users, or to the user's backup friends or family, or to the designated government or private authorities, such as the police and fire department, who can assist the user in responding to the event.
  • Notifications can come in the form of a direct communication such as an email, text message, or phone call, through a network-based portal dedicated to public safety officials or first responders, or through a software application, which is accessible by select individuals. Having immediate feedback from physical locations may greatly reduce the amount of time required for response and enables the authorities to help save lives.
  • the device does not require any installation or connection to a wall or wired home system.
  • the device in accordance with certain embodiments does not require a professional installer, or any installation at all. It need not be connected to a wall or wired home system, although that is an option.
  • the device does not need to be permanently or semi-permanently affixed to a surface, such as wall, doorframe, door, floor, ceiling, or a window, for example, although that is an option if desired. In accordance with an embodiment of the invention, it may be placed on a surface, such as the floor, a table, or a bookcase, for example.
  • the device can be connected to the Internet through Wifi or through another network, such as a 3G or 4G service from a cellular operator, for example, and through that connection perform all of the necessary functions of a monitoring, security, and/or safety system, as well as of a home connected device.
  • FIG. 1 is an example of a security system in accordance with an embodiment of the invention
  • FIG. 2 is a block diagram of an example of a monitoring/security device in accordance with an embodiment of the invention
  • FIG. 3 is a block diagram of another example of the device of FIG. 1 ;
  • FIG. 4 is another example of a block diagram of the device of FIG. 1 ;
  • FIG. 5 is a perspective view of a monitoring/security device in accordance with an embodiment of the invention.
  • FIG. 6 is a bottom view of the device, in accordance with the embodiment of FIG. 5 ;
  • FIG. 7 is a rear perspective view of the device, in accordance with the embodiment of FIG. 5 ;
  • FIG. 8 is a front view of the device, in accordance with the embodiment of FIG. 5 ;
  • FIG. 9 is a front perspective view of the main board of the device, in accordance with the embodiment of FIG. 5 ;
  • FIG. 10 is a front view of a disk that may be provided in front of the camera optics in the device, in accordance with the embodiment of FIG. 5 ;
  • FIG. 11 is a perspective view of the main board with a PIR lens removed, in accordance with the embodiment of FIG. 5 ;
  • FIG. 12 is a rear perspective view of the main board, the bottom board, and the antenna board, in accordance with the embodiment of FIG. 5 ;
  • FIG. 13 is a cross-sectional view of the device, in accordance with the embodiment of FIG. 5 ;
  • FIG. 14 is a top view of the antenna board, showing a Bluetooth antenna, a WiFi antenna, and a capacitive switch, in accordance with the embodiment of FIG. 5 ;
  • FIG. 15 is a view of the main board, showing an actuator for an IR filter, in accordance with the embodiment of FIG. 5 ;
  • FIG. 16 is a schematic representation of two IR filters for selective placement in front of an imaging sensor, in accordance with an embodiment of the invention.
  • FIGS. 17A-17C show a flowchart of an example of a setup procedure, in accordance with an embodiment of the invention.
  • FIGS. 18 and 19 show a flowchart of an example of a method of the operation of the device and the system in response to an event detected by the device;
  • FIG. 20A is another flowchart of an example of a method of operation of the device and/or system when an event is detected, in accordance with an embodiment of the invention
  • FIG. 20B is another example of a learning procedure, in accordance with an embodiment of the invention.
  • FIG. 21 is a schematic diagram of an example of components of the device involved in video processing, in accordance with an embodiment of the invention.
  • FIG. 22 is a schematic diagram of an example of the components of the system involved in video processing, in accordance with the embodiment of the invention.
  • FIG. 23 is an example of a notification provided to user device of a primary user as displayed by the App, in accordance with an embodiment of the invention.
  • FIG. 24 is an example of a notification displayed on a user device that states that a person arrived home and when, in accordance with an embodiment of the invention
  • FIG. 25 is an example of a timeline of events that can be displayed by the user device via the App, from data received from the system and/or the device, in accordance with an embodiment of the invention
  • FIG. 26 is a timeline of notifications, in accordance with an embodiment of the invention.
  • FIG. 27 is a flowchart of an example of a method for obtaining video clips from the system, in accordance with an embodiment of the invention.
  • FIG. 1 is an example of a security system 100 in accordance with an embodiment of the invention.
  • the system 100 comprises one or more monitoring/security devices 10 , each positioned in one or more locations within a home 12 .
  • One or more devices 10 may be positioned within a single room 14 or in multiple rooms of the home 12 , depending on the size of respective rooms, the areas of the home of activity and/or concern, the degree of security desired, the locations of entrances to the home, where children play, where a baby, children, older people, or sick people sleep, etc.
  • one device is placed near a corner of the room 14 , and the other is placed on a table near an opposite corner of the same room.
  • One or more devices may be placed in other rooms of the home, as well.
  • Devices 10 may also be used in office buildings, stores, individual offices, hospital and hospice rooms, and any other locations where monitoring and/or security is desired.
  • devices 10 may also be placed in outdoor locations, such as backyards, decks, patios, etc.
  • installation is not required to attach the device 10 to a wall, doorframe, door, floor, ceiling or a window, for example, although that is an option if desired.
  • the device 10 may be placed on a flat surface, such as a floor, table, desk, bookshelf, etc.
  • the one or more devices 10 in the home 12 in this example communicate with a security processing system 14 via a network 16 .
  • the network 16 may be the Internet, for example.
  • the devices 10 may communicate with the network 16 wirelessly, such as by WiFi, or through the Ethernet, for example.
  • the security processing system 14 may comprise a processing device 18 and a database 20 , for example.
  • the processing device 18 may comprise one or more computers or servers, for example. Multiple databases 20 may also be provided.
  • the security processing system 14 may be a discrete entity connected to the network 16 , or it may be a cloud-based system, for example. References to a “cloud” or “cloud-based” system or network refer to well-known distributed networks hosting cloud-based applications.
  • FIG. 1 also shows a primary user 22 with a user device 24 outside of the home 12 that can communicate wirelessly with the devices 10 and/or the security processing system 14 via the network 16 , through WiFi and/or a cellular network, for example.
  • Primary users 22 are users with complete access to the device 10 and the system 14 .
  • Primary users 22 have the ability to set preferences and customize the activities of the device 10 and/or the system 14 .
  • Household users are family members or other persons who reside in the house 12 or are otherwise associated with the location, such as an office, who interact with the device 10 and/or the system 14 on a regular, such as daily, basis.
  • One important function of a primary user 22 is to receive information and notifications concerning the status of the home 12 , as detected by the device 10 , including notifications of potentially serious and life-threatening situations, and to respond to them, including by instructing the device 10 and/or the system 14 how to react to the situation.
  • the primary user 22 can teach the device 10 and/or the system 14 how to react to data detected by the device 10 .
  • the primary user 22 may be near the home 12 , at their place of business, or in any other location that can access the network 16 .
  • the user device 24 may be a mobile processing device, such as a smartphone or tablet, for example.
  • the user device 24 may also be a desktop or laptop computer in the home 12 or outside of the home 12 , such as an office.
  • a user 22 may have multiple devices that can communicate with the devices 10 .
  • FIG. 1 also shows a backup contact 26 with their own user device 24 .
  • a backup contact 26 has limited interaction with the system 14 when the primary user 22 does not respond to notifications provided by the system 14 .
  • Primary users 22 have an option to select or invite backup contacts during setup and at other times. The primary user 22 determines the circumstances when backup contacts receive notifications. In most instances when a primary user 22 , or a household user, is unable to respond to a notification, a backup contact would be sent a notification and have the opportunity to resolve the incident. In one example, a backup contact that accepts the invitation would be redirected to a web portal to create an account.
  • When a backup contact receives a notification by text or email, they are taken to a unique URL and presented with information about the event that triggered the notification, and have the ability to respond to the event notification based on primary user preferences.
  • a primary user 22 may select up to three backup contacts and will also be able to change the roster of backup contacts from time to time when the need arises (for example when a primary user is on vacation with whomever they have designated as their first backup). Users will be able to toggle between which backup contact receives a notification first after an event has escalated based on numerical order, for example backup contact one would be the first backup contact to receive an escalated notification, and thereafter backup contact two would receive the notification if backup contact one has failed to resolve the event. The primary user 22 sets the escalation delay between contacts.
  • the devices 10 , the security processing system 14 , and/or the primary user 22 and the backup contacts 26 may communicate with first responders, such as the police 28 , the fire department 30 , and/or an ambulance service 32 , for example, via the network 16 .
  • the devices 10 , the processing system 14 , and/or the user devices 24 may communicate with a call center 34 , which may in turn communicate with the police 28 , the fire department 30 , the ambulance service 32 , and/or the other parties described herein, via the network 16 or other networks, such as telephone networks.
  • the call center may send notifications to primary users 22 and backup contacts 26 via text, email, and/or phone call to their respective user device 24 .
  • the system 14 may also contact home or office email addresses or phone numbers, for example.
  • one or more backup contacts 26 may be contacted. If the backup contact or contacts 26 do not respond, then the system 14 may instruct the call center 34 to contact the police 28 , fire department 30 , and/or the ambulance 32 service, depending on the situation, or these parties may be contacted directly by the system 14 . The system 14 or the call center 34 may then attempt to contact the primary user 22 to inform the user of the potential event and the actions taken.
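The escalation flow described above might look roughly like this sketch, assuming a notify callback that reports whether a contact resolved the event; the contact ordering and the delay between attempts are primary-user settings, and all names are illustrative.

```python
import time

def escalate(notify, primary, backups, escalation_delay_s, authorities):
    """Notify the primary user first; if unresolved, walk the ordered backup
    roster; if still unresolved, hand off to the call center/authorities."""
    for contact in [primary] + backups:     # backup one, then backup two, ...
        if notify(contact):                 # notify() returns True if resolved
            return contact
        time.sleep(escalation_delay_s)      # user-set delay between contacts
    notify(authorities)                     # e.g., call center -> police/fire
    return authorities

# Example: the first backup contact resolves the event.
print(escalate(lambda c: c == "backup-1", "primary-user",
               ["backup-1", "backup-2"], 0, "call-center"))
```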
  • FIG. 2 is a block diagram 50 of an example of a monitoring/security device 10 in accordance with an embodiment of the invention.
  • the device 10 comprises a main printed circuit board (“PCB”) 52 , a bottom printed circuit board 54 , and an antenna printed circuit board 56 .
  • a processing device 58 such as a central processing unit (“CPU”), is mounted to the main PCB 52 .
  • the processing device may include a digital signal processor (“DSP”) 59 .
  • the CPU 58 may be an Ambarella digital signal processor, A5x, available from Ambarella, Inc., Santa Clara, Calif., for example.
  • An image sensor 60 of a camera, an infrared light emitting diode (“IR LED”) array 62 , an IR cut filter control mechanism 64 (for an IR cut filter 65 ), and a Bluetooth chip 66 are mounted to a sensor portion 68 of the main board 52 , and provide input to and/or receive input from the processing device 58 .
  • the main board 52 also includes a passive IR (“PIR”) portion 70 .
  • Mounted to the passive IR portion 70 are a PIR sensor 72 , a PIR controller 74 , such as a microcontroller, a microphone 76 , and an ambient light sensor 80 .
  • Memory, such as random access memory (“RAM”) 82 and flash memory 84 may also be mounted to the main board 52 .
  • a siren 86 may also be mounted to the main board 52 .
  • a humidity sensor 88 , a temperature sensor 90 (which may comprise a combined humidity/temperature sensor, as discussed below), an accelerometer 92 , and an air quality sensor 94 are mounted to the bottom board 54 .
  • a speaker 96 , a red/green/blue (“RGB”) LED 98 , an RJ45 or other such Ethernet port 100 , a 3.5 mm audio jack 102 , a micro USB port 104 , and a reset button 106 are also mounted to the bottom board 54 .
  • a fan 108 may optionally be provided.
  • a Bluetooth antenna 108 , a WiFi module 110 , a WiFi antenna 112 , and a capacitive button 114 are mounted to the antenna board 56 .
  • the components may be mounted to different boards.
  • the Wifi module 110 may be mounted to the main board 52 , as shown in the Figures discussed below.
  • FIG. 3 is a block diagram 120 of another example of the device 10 , comprising two printed circuit boards, a side PCB 122 and a lower PCB 124 . Components common to FIGS. 2 and 3 are commonly numbered.
  • a CPU 126 and a microcontroller 128 are mounted to the side PCB 122 , along with the Bluetooth low energy (“BLE”) antenna 108 , the Wifi antenna 112 , the IR LED array 62 , a wide angle lens 132 , the ambient light sensor 80 , the PIR sensor 72 , the accelerometer 92 , the capacitive switch 114 , the microphone 76 , and the speaker 96 .
  • the RGB LEDs 98 , the humidity sensor 88 , the temperature sensor 90 , a carbon dioxide (CO 2 ) sensor 134 , a carbon monoxide sensor 136 , the siren 86 , the Ethernet port 100 , the audio jack 102 , the micro USB port 104 , and a reset button 106 are provided on the lower PCB 124 .
  • FIG. 4 is another example 150 of a block diagram of the device 10 . Components common to FIGS. 2 and 3 are commonly numbered and not further discussed. FIG. 4 further shows a cellular radio 152 , and a video encoder 154 connected to the CPU 126 . A power management chip 156 is connected to both the CPU 126 and the microcontroller 128 . AC power 158 and a battery 160 are connected to the power management chip. A pressure sensor 162 is connected to a microcontroller 164 .
  • FIG. 5 is a perspective view of a monitoring/security device 10 , in accordance with an embodiment of the invention.
  • the device 10 comprises an outer housing 202 and a front plate 204 .
  • the plate 204 includes a first window 206 , which is in front of the image sensor 60 .
  • a second window 208 , which is rectangular in this example, is in front of the infrared LED array 62 .
  • An opening 210 is in front of the ambient light detector 80 .
  • an opening 212 is in front of the microphone 76 .
  • the front plate 204 may comprise black acrylic plastic, for example.
  • the black plastic acrylic plate 204 in this example is transparent to near IR greater than 800 nm.
  • the top 220 of the device 10 is also shown.
  • the top 220 includes outlet vents 224 through the top to allow for air flow out of the device 10 .
  • FIG. 6 is a bottom view of the device 10 , which shows the bottom 226 and inlet vents 228 to allow air flow into the device 10 .
  • the top 220 and the bottom 226 of the device 10 may be separate, plastic pieces that are attached to the housing 202 or an internal housing during assembly, for example. Air passing through the bottom, inlet vents 228 travels through the device 10 , where it picks up heat from the internal components of the device, and exits through the top, outlet vents 224 .
  • hot air rises through the device 10 by convection, causing air to be drawn into the device through the bottom vents 228 and to exit through the top vents 224 .
  • a fan 108 ( FIG. 2 , for example), may be provided to draw external air into the device 10 through the bottom, inlet vents 228 and drive the air out of the device through the top, outlet vents 224 , as well.
  • the vents 224 have to be large enough to allow heat to flow out of the unit rather than build up within the device 10 , but the vents cannot be so large that a child can stick a finger into the unit.
  • a larger vent is made and the vent is covered with a Gore-Tex or nylon mesh to prevent water ingress but allow air to exit the unit.
  • FIG. 7 is a rear perspective view of the device 10 , showing the Ethernet connector 100 , the audio jack 102 , and a USB port 104 .
  • FIG. 8 is a front view of the device 10 with the front plate 204 .
  • a front wall 232 that connects the opposing curved edges of the curved housing to each other is shown.
  • the front wall 232 defines an opening having a first, partially circular section 234 , an upper rectangular section 236 above the partially circular opening and a second, lower rectangular section 238 below the partially circular section.
  • a smaller rectangular open section 240 is below the second rectangular section 238 .
  • the image sensor 60 is behind the partially circular section 234 , the IR LED arrays 62 are behind the first and second rectangular sections 236 , 238 and the passive IR sensor 72 is behind the smaller, lower rectangular section 240 .
  • the partially circular section is behind the first window 206 in the plate 204 .
  • the smaller rectangular section 240 is behind the rectangular window 208 in the plate 204 .
  • Additional openings 242 and 244 are provided behind the openings 210 , 212 in the plate 204 , in front of the ambient light sensor 80 and the microphone 76 .
  • because the LEDs 62 behind the rectangular sections 236 , 238 are partially obscured by the front housing, the LEDs are angled so that the infrared light they emit is directed out of the device 10 .
  • FIG. 9 is a front perspective view of the main board 52 .
  • Camera optics 246 are shown in front of the imaging sensor 60 (not shown). The camera optics are supported within a barrel 248 .
  • a light pipe 250 is in front of the ambient light sensor 80 , to couple the sensor to the opening 242 in FIG. 8 .
  • the microphone 76 is below the light pipe 250 .
  • a rectangular lens 252 is in front of the PIR sensor 72 . The lens 252 and the sensor 72 are discussed further below.
  • a portion of the bottom board 54 is shown.
  • a flexible printed circuit (“FPC”) 254 connects the main board 52 to the bottom board 54 .
  • the rear of the Ethernet port 100 is also shown.
  • Other components shown in FIG. 9 are discussed further below.
  • FIG. 10 is a front view of a disk 256 with an opening 257 that may be provided in front of the optics 246 .
  • the disk 256 is painted black to blend in with the black front plate 204 , thereby making the imaging sensor 60 and camera optics 246 less noticeable during use in a home 12 , for example.
  • the disk 256 may be plastic, for example.
  • FIG. 11 is a perspective view of the main board 52 with the PIR lens 252 removed to show the PIR 72 mounted to the main PCB.
  • a light pipe in front of the ambient light sensor is also removed, to show the light sensor mounted to the main board 52 .
  • the barrel 248 is also removed, to better show the camera optics 246 .
  • FIG. 12 is a rear perspective view of the main board 52 , the bottom board 54 , and the antenna board 56 .
  • a heat sink 260 is applied to the back surface of the main board 52 .
  • the siren 86 is mounted to the rear surface of the main board 52 . Air flow through the bottom vents 228 , past the heat sink 260 , removes heat from the heat sink, prior to exiting the device 10 through the top vents 224 .
  • the imaging sensor 60 is also shown.
  • FIG. 12 also shows the Ethernet port 100 , the audio jack 102 , and the USB power port 104 .
  • the audio jack 102 may be a 3.5 mm barrel jack, for example.
  • FIG. 13 is a cross-sectional view of the device 10 .
  • the front wall 232 and the position of the front plate 204 are shown.
  • an RGB LED 98 is mounted to the bottom board 54 .
  • Below the RGB LED is a cone shaped light guide 260 (shown in cross section in this view) diverging from the LED.
  • the circular bottom 262 of the light guide 260 provides colored illumination indicative of the state of the device, such as armed or disarmed, for example, out the bottom of the device 10 .
  • FIG. 13 also shows the temperature/humidity (“T/H”) sensor 90 / 88 and the air quality sensor 94 mounted to a FPC 270 , which is mounted to the bottom board 54 in this example.
  • the T/H sensor 90 / 88 is proximate one of the bottom vents 228 , so that external air from the location where the device 10 is located passes over the sensor.
  • An air guide 272 is provided around the T/H sensor 90 / 88 , the air quality sensor 94 , and one of the bottom vents 228 .
  • the air guide 272 guides external air received through the bottom vent past the sensors, and insulates the sensors from the heat generated within the device 10 , as described further below.
  • the air guide 272 in the example comprises side walls 272 a , and a rear wall 272 b , as shown in FIGS. 9 and 13 , for example.
  • the front 272 c of the air guide 272 is provided by the inner surface of the housing 202 .
  • the rear wall and the side walls taper inwardly toward the front of the device, so that the bottom entrance to the guide 272 is larger than the exit 274 . It has been found that the taper improves the accuracy of the temperature detected by the T/H sensor 90 / 88 .
  • the air guide 272 may comprise acrylic plastic, for example, as discussed further below.
  • the housing 202 may comprise anodized aluminum, which provides a high quality appearance and high durability. Since the anodized aluminum may act as an electrical insulator, the inside surfaces of the housing 202 may be polished or masked during anodizing to allow the electrical ground to be tied to the aluminum metal housing. This provides a large sink for electrostatic discharge (ESD) and a partial shield against the electromagnetic radiation from the device 10 and electromagnetic susceptibility from external sources.
  • ESD electrostatic discharge
  • the polished inside surfaces of the housing 202 reduce the thermal emissivity and enable thermal radiation to escape the unit better and thereby thermally isolate the environmental sensors better.
  • the image sensor 60 of the camera may comprise a CMOS or CCD image array sensor, which is used as the transducer to digitally capture, transmit, and store images or video.
  • the sensor may have a large pixel size to increase the light gathering capability.
  • Back side illuminated sensors may be used to further enhance the dynamic range in low light conditions.
  • the imaging sensor 60 is sensitive to visible light and also near IR (between 700 nm and 1100 nm wavelengths), which enables the sensor to capture “night” vision images.
  • the image sensor 60 can have either an electronic rolling shutter or a global shutter, can achieve at least 30 fps of video at 720p and 1080p resolutions, and can produce still images at resolutions greater than 1080p for greater detail. Pixel binning, long exposures, and high dynamic range imaging (“HDR”) techniques may be used to enhance dynamic range, which helps create accurate and high quality images at low light levels.
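For readers unfamiliar with pixel binning, the sketch below shows generic 2x2 binning of a grayscale frame with NumPy: each output pixel averages a 2x2 block, trading resolution for light gathered per pixel. This illustrates the technique only, not the sensor's actual on-chip implementation.

```python
import numpy as np

def bin2x2(frame):
    """Average each 2x2 block into one pixel: four times the light gathered
    per output pixel, at half the resolution along each axis."""
    h, w = frame.shape
    f = frame[:h - h % 2, :w - w % 2]             # trim to even dimensions
    return f.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

frame = np.arange(16, dtype=float).reshape(4, 4)
print(bin2x2(frame))  # 2x2 output; each value is the mean of a 2x2 block
```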
  • the image sensor may be an Aptina AR0330, available from Aptina Imaging Corporation, San Jose, Calif., for example.
  • the camera optics 246 may comprise a fixed focus, wide angle multi-element lens to capture visual information about the location. In this example, the angle is 140 degrees.
  • the lens may be optimized for spatial resolution and chromatic aberration.
  • the lens mounts to the image sensor 60 through a screw mount to ensure precise focus during mass production.
  • the lens may be made of high quality glass such as BK7 or Sapphire, for example, to ensure high image quality.
  • the imaging optics 246 are protected from the environment using a flat exit window that is made of chemical strengthened or naturally strong glass, such as Gorilla Glass or Sapphire glass, for example.
  • the exit window may be covered with an anti-reflective coating and an oleophobic coating to prevent fingerprints and smudges.
  • the exit window may also have a hydrophobic coating to prevent water droplets or condensation from accumulating that might occlude the pictures or video.
  • the window 206 in the black front acrylic plate 204 accommodates the FOV of the sensor 60 and the optics 246 .
  • a 3-axis or 6-axis accelerometer 92 may be placed on the bottom board 54 of the device, or in another location, to detect motion of the device 10 itself, such as the device being knocked over, for example.
  • Heat generated by components within the device 10 may adversely influence the ambient temperature and humidity measurements by the T/H sensor 90 / 88 . Since heat transfer occurs through conduction, convection, and radiation, embodiments of the invention seek to decrease the effects of these heat transfer modes on the T/H sensor 90 / 88 .
  • the air guide 272 defines a thermal barrier for the T/H sensor 90 / 88 and the air quality sensor 94 , to isolate them from both conduction and radiation.
  • By mounting the T/H and air quality sensors to the FPC 270 , these sensors are isolated from heat transfer through conduction from the bottom board 54 .
  • very thin traces are used on the FPC.
• the FPC may comprise polyimide, which has a high thermal resistance and decreases heat transfer that might have occurred if the T/H sensor and air sensor were mounted directly to the bottom board 54.
  • Other sensors may be positioned within the air guide 272 for thermal isolation, as well
• the air guide 272 comprises acrylic plastic wrapped in a highly thermally reflective material, such as polished copper or Mylar, for example. Since the anodized aluminum of the housing has a high emissivity at thermal wavelengths, polishing the inside walls of the housing 202 reduces thermal radiation.
  • main heat generators such as the processing device 58 and the Wifi module 110 , which are mounted to the main board 52 in this example, are heat “sunk” using a high conductivity material as a heat sink 260 , as shown in FIG. 12 .
• the heat sink 260 is thermally coupled to the polished inner walls of the aluminum housing.
• the air quality sensor 94 may be a volatile organic compound (“VOC”) sensor, as is known in the art.
  • a thermistor may be provided on the main board and/or the antenna board, for example, to measure the heat generated by the heat generating components, which may be used to correct the temperature detected by the T/H sensor 90 / 88 , if necessary.
  • FIG. 14 is a top view of the antenna board 56 , showing the Bluetooth antenna 108 , the WiFi antenna 112 , and the capacitive switch 114 .
• the Bluetooth antenna 108 and the WiFi antenna 112 defined by the antenna board 56 are placed under the top surface of the top 220 of the device 10. This ensures that the RF energy is not attenuated by the metal housing 202, which is electrically grounded.
  • metal antennas may be integrated into the top metal cap or the main metal housing.
  • the top 220 may be made of plastic or metal with plastic pattern inserts or overmolds to define the antenna.
  • a metal top 220 can also act as a heat radiator to keep the inside of the device 10 cool.
  • the capacitive switch 114 may be used to change modes of the device 10 and the system 14 , as discussed below.
• Night and dark vision operation is enhanced by illuminating the location with near IR light from the IR LED arrays 62.
  • the near IR LED arrays may emit radiation in the range of 850-875 nm, for example.
• 850-875 nm light is very weakly visible, or invisible, to most humans.
• because the CMOS image sensor 60 is sensitive to this wavelength band, it can respond to these illumination sources by providing well illuminated images, even when the room where the device is located is dark. It may be determined whether it is nighttime based on the light detected by the ambient light sensor 80, for example.
• the IR LED arrays 62 around the image sensor comprise near IR LEDs (typically 850-880 nm) to illuminate the surroundings in night vision mode, without blinding, distracting, or being visible to the user at night.
• the IR LED arrays 62 provide uniform illumination in the FOV of the camera.
  • the IR LEDs are placed behind a black acrylic window to remain invisible to the user.
  • the black acrylic window allows only near IR to pass through.
• Most 850 nm LEDs are partially visible to the human eye because of a wide emission bandwidth that extends down toward the visible spectrum.
  • a band pass or high pass filter may be provided on the inside of the front housing (black acrylic sheet) to block any visible light from passing through the window.
• a near IR filter 65 is provided to block near IR (“NIR”) radiation above 700-750 nm, for example, to enhance spatial resolution and other image quality parameters of the CMOS image sensor 60 during daytime, as shown in FIG. 13 and FIG. 15. It need not be used at night. Movement of the filter may be provided by an actuator 64, as shown in FIG. 15.
• the actuator 64 may be an electro-optic system, such as is used in LCD shutters, or an electro-mechanical system, such as is used in electromechanical shutters. Operation of the near IR filter actuator 64 may be controlled by the processing device 58, for example, as shown in FIG. 2.
  • FIG. 15 also shows cavities 65 in a supporting structure 67 for respective LEDs in the array 62 .
  • a second, independent IR filter 282 under the control of another actuator 280 or the same actuator 64 that controls operation of the cut filter 65 , may be provided, as shown schematically in FIG. 16 .
  • the second filter 282 acts as a narrow band pass filter to allow only a narrow wavelength of light to pass through.
• the second IR filter could be an 850 nm band pass filter with a 30 nm bandwidth. This filter 282 can therefore be used for 3D time-of-flight or structured-light based 3D cameras.
  • the actuator 280 may also be controlled by the processing device 58 , for example.
  • the illumination source may be the near IR LED arrays (850 or 875 nm) that would be pulsed at very high frequencies (>10 MHz) using a standard pulse width modulation (“PWM”) circuit.
  • the received “pulsed” or continuous wave images are provided to the processing device 58 or another processing device to compute depth of an image, for example.
  • the LEDs in the array 62 may be pulsed.
• the time for infrared light to be detected by the image sensor 60 after emission of an LED pulse may be measured by the processing device 58.
• 3D information may be used to determine a precise location of a fire or an intruder, for example. It could also be used to obtain an at least partial 3D image of a person, which would assist in identifying the person based on the person's volume, and can be used in conjunction with video data from the image sensor. The processing of video data is discussed below. The first filter is not activated while the second filter is activated.
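The depth computation implied by the pulse-timing approach above can be illustrated with a short sketch. This is a minimal illustration, not part of the original disclosure; the function name and the example round-trip time are assumptions.

```python
# Minimal sketch of time-of-flight depth from an LED pulse round trip.
# Names and example values are illustrative, not from the patent.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def depth_from_round_trip(round_trip_seconds: float) -> float:
    """One-way depth is half the round-trip distance traveled by the light."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# Example: a pulse detected ~20 ns after emission implies a depth of ~3 m.
print(depth_from_round_trip(20e-9))  # ~2.998 m
```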
  • the functions of the processing device 58 include running the operating system and embedded software that runs on the operating system, to exercise the various hardware on the device 10 appropriately, and compressing the incoming raw video signal from the image sensor to a high quality compact stream that can be efficiently transmitted over the Internet, as discussed further below.
• RAM memory 82 may be used to store, copy, and process the large volume of streaming video data coming into the processing device 58 from the image sensor. It can also be used as a work space to perform rapid data analytics locally on the device 10. Flash memory 84 may be used for non-volatile permanent storage of important information related to the location and the operating system.
• as shown in FIG. 14, the WiFi antenna 112 and the BTLE (Bluetooth low energy) antenna 108 both operate at the 2.4 GHz frequency.
  • the PIR sensor 72 along with a fresnel lenslet is used for motion detection, based on the principles of black body radiation.
  • the fresnel lenslet may be flush against the front black acrylic plastic plate 204 and be color matched with the rest of the black acrylic material.
• an exit window of dark black HDPE material may be provided in front of the fresnel lens, which is clear or whitish.
• the black HDPE allows wavelengths longer than 8 µm to pass through.
• the device 10 may be configured or “set up” for use and for the transfer of information between the device 10 and a home network, as well as between the device and a user device 24, via the audio jacks of the user device 24 and the device 10. Since the device 10 does not include a user interface in this example, the user interface or input of the user device, such as a keyboard or touch screen, is used to enter data. Setup in accordance with this embodiment is simple, rapid, and avoids the need for a computer in the setup process.
  • FIGS. 17A-17C show a flowchart of an example of a setup procedure 300 in accordance with an embodiment of the invention.
• the device 10 is plugged in, or battery power is turned on, in Step 302.
• the primary user 22 downloads an App to their device 24 and connects an audio cable, with standard 3.5 mm plugs, to the device 10 and to the user device 24, in Steps 304 and 306.
  • the cable is connected to the 3.5 mm audio port (stereo sound and microphone) of the user device 24 , and to the 3.5 mm audio jack 96 of the device 10 .
• the user device 24 may be a smartphone, tablet, laptop, or other computing device.
  • the primary user 22 can at this point set up the device either through a web interface accessed by the user device 24 or through the App downloaded to the user device 24 .
  • the remainder of the method 300 will be described with respect to an App on the user device 24 . Use of a web interface is similar.
• when the user opens the App, in Step 308, the App presents an option to Create an Account, in Step 310. Selection of the option causes the App to present a graphical user interface enabling entry of a user's name, email address, password, and phone number, for example, via the input of the user device 24, such as a touchscreen or keyboard, in Step 312. Other information may be requested, as well, such as how many devices 10 are being set up and where they are located, for example. The creation of the account and the account information may be confirmed by text or email, for example, in Step 314. The user 22 may also be requested to agree to terms of service and to agree to accept push notifications, email, and/or text messages that are used to inform the primary user of events taking place in the location, as described below. The primary user 22 may be informed of events by phone, as well.
• the user 22 is instructed by the App to position the device in a desired location, in Step 316.
  • Example locations are discussed above.
  • the App requests the device serial number from the device 10 , via the audio cable, in Step 318 , and the serial number is received, in Step 320 .
  • Data is encoded as audio signals to pass from the device 10 to the user device 24 , and decoded by the user device 24 via the App.
  • the device 10 may be configured to encode and to decode the audio signal, and respond to it in order to complete the setup process, as well.
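As an illustration of how setup data might be carried over the audio cable, the sketch below encodes bytes as audio samples using simple binary frequency-shift keying. The patent does not specify a modulation scheme, bit rate, or tone frequencies; all of those are assumptions here.

```python
import math

# Illustrative sketch of encoding setup data (e.g., a Wi-Fi password) as
# audio for transfer over the 3.5 mm cable. Binary FSK is assumed; the
# patent does not specify the modulation.
SAMPLE_RATE = 44_100
BIT_DURATION_S = 0.01               # 100 bits per second (assumed)
FREQ_ZERO, FREQ_ONE = 1_200, 2_200  # one tone per bit value (assumed)

def encode_bytes_to_samples(payload: bytes) -> list[float]:
    samples = []
    for byte in payload:
        for bit_index in range(8):
            bit = (byte >> (7 - bit_index)) & 1
            freq = FREQ_ONE if bit else FREQ_ZERO
            n = int(SAMPLE_RATE * BIT_DURATION_S)
            samples.extend(
                math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)
            )
    return samples

pcm = encode_bytes_to_samples(b"hunter2")  # password is illustrative
```

The receiving side would sample the microphone line and detect which of the two tones is present in each bit window to recover the bytes.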
  • the App on the user device 24 will then determine if the user device is connected to a wireless network, in Step 322 , in FIG. 17B . If Yes, the App asks the user to confirm that the device 10 should be connected to the same network, in Step 324 . The response is received, in Step 326 . If the user confirms, then the App instructs the device 10 to connect to the same network.
  • the device 10 requests a password for the network, in Step 328 .
  • the user enters the password via the input device on the user device 24 , such as a touch screen or keyboard.
  • the App encodes the password into audio signals and provides the password to the device 10 , via the audio cable.
• the password is received by the device 10 via the audio cable, in Step 330, and the device decodes the password.
  • the device 10 updates the Wifi credentials, restarts the necessary systems, and connects to a processing system 14 via the network, in Step 128 , in a manner known in the art.
  • the device 10 then connects to the user device, in Step 130 .
• if the user device is not connected to a wireless network, in Step 322, or the user does not confirm that the device 10 should connect to the same network, in Steps 324 and 326, the App instructs the device to search for available networks, in Step 332.
  • the device 10 informs the App of the available networks, in Step 334 .
• the App then asks the user to select a network from the available networks, in Step 336. If the user selects one of the networks, in Step 338, the App requests the network password from the user, in Step 328, and the method 300 proceeds, as described above.
• if the user does not select one of the available networks, in Step 338, the App requests the user to enter a desired network, in Step 340.
  • the user enters the desired network by entering the service set identifier (“SSID”) of the network, for example, via the input device.
  • SSID service set identifier
  • the App requests the network password from the user, in Step 328 and the method 300 proceeds, as described above.
  • the network password is provided to the device 10 , in Step 344 , via the audio cable, and the device connects to the network, in Step 346 .
  • the device 10 then connects to the processing system 14 , in Step 348 .
• after connection of the device 10 to the network 16 and to the system 14, in Step 348, a unique identifier of the device 10 is sent by the device to the user device 24, in Step 350.
• This enables the user 22 to set up the device 10 as their device in the system, i.e., to be an approved and connected device from a security and administrative perspective.
  • a map may be presented to the user 22 via the App to identify the geographic location of the device 10 , in Step 352 .
• the App sends the User Account information established in Step 312, the serial number of the device 10, and the user's geographic location defined in Step 352 to the processing system 14, which establishes a profile and an association between the device 10 and the user's location, in Step 354.
• if the device 10 has already been placed in a desired location, the device is ready for operation without further installation, in Steps 356 and 358.
• a secure cryptographic key may also be exchanged between the device and the phone, to enable them to be paired securely.
  • the device 10 can ‘talk’ to the phone itself, and know that it is the phone it paired with, by an exchange of keys.
• the steps of the method 300 may be performed in a different order. Instead of asking the user 22 to confirm that the network that the user device 24 is connected to is the network the device 10 is to be connected to, in Steps 316-322, the method 300 may proceed from Step 320 to Step 332, where the device 10 searches for available networks.
• if a primary user 22 moves the device 10 to a new location, or changes their wireless network name or password, they can update that change on the device by again connecting the user device 24 to the device 10 via the audio port 102.
  • Bluetooth low energy (“BTLE”) communication may also be provided for information exchange, in which case the user device 24 and the device 10 will find each other before communication begins.
  • the device 10 comprises a wireless router, allowing the user device 24 to connect to the device directly.
  • the device 10 sends an IP address to the user device 24 .
  • the user enters their Wifi SSID and password via the App to connect to the device 10 .
  • Bluetooth pairing may also be used to configure the device 10 .
• Information may also be encoded into a flashing light, which is read by the user device 24.
• the device 10 may be connected to the user's home network, which may comprise a wireless router, Ethernet, etc., in the home, and the devices connected to it, such as desktop computers, laptop computers, mobile smart devices, etc.
  • the device 10 may receive information and instructions from the home devices connected to the home network and from the user device 24 .
• the device 10 may also send outbound interactions, such as information that the device 10 sends to the system 14 and to the user device 24.
• the device 10 communicates with the user device 24 via an App, such as the App discussed above in the setup procedure. Since this App may run in the ‘background’ or have some functionality at all times on the user device 24, it can be used for security or safety functions. For instance, when a user 22 approaches a physical location, automatic arming or disarming may be enabled, based on the proximity of the users of the security system 14 to the device 10 within the home.
  • the user device 24 sends a signal via the App to the home network that the primary user 22 is in the area and that the device should be ‘ready’ for them to enter.
  • Being ‘ready’ in this example means that the device 10 and the system 14 should not notify the primary user 22 immediately upon entry of the user 22 into the home 12 that someone has entered the location, despite recognition of entry by the sensors of the device 10 .
• the device 10 and system 14 may assume that the person entering the location is likely the primary user 22, and thus should wait another predetermined period of time, such as 45 seconds, for example, to confirm that the user is in the location through the App or wireless network, and then disarm itself.
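The “ready” behavior described in the preceding items can be sketched as follows. The callback, the polling loop, and the return values are illustrative assumptions; only the 45-second grace period comes from the text.

```python
import time

# Sketch of proximity-based "ready" handling: when a known user's phone
# is nearby, entry does not trigger an immediate notification; the
# system waits a grace period for the user to be confirmed via the App
# or the wireless network. Names are illustrative, not from the patent.
GRACE_PERIOD_S = 45

def handle_entry(known_user_nearby: bool, confirm_user_present) -> str:
    if not known_user_nearby:
        return "notify_primary_user"
    deadline = time.monotonic() + GRACE_PERIOD_S
    while time.monotonic() < deadline:
        if confirm_user_present():
            return "disarm"          # entry was the expected user
        time.sleep(1)
    return "notify_primary_user"     # never confirmed; treat as an event
```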
  • Other information for the user can be sent through the App and other users can use the App. For example, if a user adds a third party as an approved user in their network, that third party can download the App to their own user device 24 , and use the same proximity-based geolocation services of the device 10 to arm or disarm the user's system.
  • a secondary user may be a family member or other person regularly in the home 12 .
• the information may be saved into a ‘timeline’ of events, but the primary user 22 will not be notified as if this were a security incident. Timelines are discussed further below.
  • security becomes more than simply an individual within their physical network. It becomes a social construct where a large group of people are listed as ‘trusted’ within an environment. Such a group security, or social security construct, may improve the way people work together to protect each other. By acting together as a community, enabled through technology, the wider community becomes more secure.
  • the device 10 may act as a hub for other services within a home, and preferences can act differently for different people that are identified in the system.
  • specific actions and notifications can be aligned with the geo-location of phones and the confirmation of the proximity of individuals within the location.
  • An individual owner of the account can set preferences for the individuals.
• An action may be a specific notification. If location tracking of an individual's child, for instance, is activated, then the individual can receive a timeline or graph of the comings and goings of their children; in other words, specific actions may be based on the person.
  • An event is a deviation from the norm or a deviation from a predefined set of criteria in a location monitored by the device 10 , for one or several of the sensors, the camera (imaging sensor 60 ), and/or the microphone 76 .
  • the system 14 may define certain parameters or criteria concerning what is normal and what constitutes an event or incident. For example, the system 14 may define that a temperature above a certain threshold should cause a notification.
• the parameters may be based on the geographic location of the device 10 and the home 12. For example, the temperature and/or humidity criteria in a geographic location having high average temperatures and/or humidity may be higher than the temperature and/or humidity criteria for geographic locations with lower average temperatures and/or humidity.
• the temperature parameter may also change seasonally, based on the current date.
• Air quality parameter levels may be similarly defined by the system 14.
  • the parameters set by the system 14 may include a combination of sensor data. For example, a notification may be sent only when both the temperature and humidity and/or air quality are above predetermined levels.
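A minimal sketch of such combined, geography-dependent criteria appears below; the climate categories and threshold values are invented for illustration.

```python
# Sketch of combined system-defined criteria: notify only when both
# temperature and humidity exceed thresholds, with thresholds varying
# by climate. All values are illustrative assumptions.
THRESHOLDS_BY_CLIMATE = {
    "humid_subtropical": {"temp_c": 35.0, "humidity_pct": 85.0},
    "temperate":         {"temp_c": 30.0, "humidity_pct": 75.0},
}

def should_notify(climate: str, temp_c: float, humidity_pct: float) -> bool:
    limits = THRESHOLDS_BY_CLIMATE[climate]
    # Combined criterion: both readings must exceed their thresholds.
    return temp_c > limits["temp_c"] and humidity_pct > limits["humidity_pct"]

print(should_notify("temperate", 31.0, 80.0))          # True
print(should_notify("humid_subtropical", 31.0, 80.0))  # False
```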
  • Criteria and parameters may be stored by the system 14 in the database 20 , for example. Criteria and parameters to be applied by the device 10 , if any, may be downloaded by the system 14 to the device through the network 16 , for storage in the RAM 82 and/or the flash memory 84 or other such memory, for example.
  • the primary user 22 may set certain parameters and criteria for what is considered to be an event and when notifications should be sent.
  • the parameters set by the user 22 can apply to one or multiple sensors, as discussed above.
• the parameters and criteria may also include a combination of sensor data and circumstances. For example, a primary user 22 can specify that they only want to be informed of an event if no one is home and if there is noise above a certain decibel level. Or, a primary user 22 can specify that any time a specific person comes into their home 12, the primary user 22 wants to be notified or wants another person to be notified. In another example, a primary user 22 may determine that any time a certain person or user is inside a home and the temperature goes above or below a certain value, the primary user is to be notified. These are merely examples of the parameters and criteria that a user can set. User parameters may be set during the on-boarding process and/or after on-boarding, where questions are posed to the primary user 22 to establish the user defined parameters and criteria, as discussed further below.
  • the device 10 and/or the system 14 is configured to learn from a user's behavior and the activities in a user's home 12 , to further define what is normal activity in the home.
  • the processing device 18 in the system 14 and/or the processing device 58 in the device 10 may be configured to learn by software stored on respective storage devices, such as the database 20 and the RAM 82 or the flash storage 84 , or other appropriate memory or storage devices.
  • the learning process may begin with basic patterns within a home, such as the comings and goings of people within a home based on the day of the week and time of the day. Most people have somewhat standard behavior which can be understood by the device 10 and/or system 14 .
• the system 14 may know how many people are in the house 12 and who they are based on user input, by sensing body mass and activity via the one or more sensors on one or more devices 10 in the home, and/or by geo-location data. Individual patterns of those people may be learned, such as when they come and go from the location on each day of the week. If over 2-4 weeks a respective person has acted regularly over a time period, a pattern may be established for that person.
• if the primary user is notified of the person's activities in accordance with the pattern multiple times and clears the event each time, such as the person entering the home 12 at a particular time, then the next time the person enters the home in accordance with the pattern, the primary user will not be notified of the event.
  • the event may be stored for future reference, such as in a timeline that can be provided to the primary user 22 upon request, if desired.
  • Patterns may be discerned for environmental characteristics of the home or other location based on sensor data, as well. For example, patterns related to temperature, humidity, and/or air quality may be discerned by comparing the data from the temperature, humidity, and/or air quality sensors over a period of time.
• the system 14 may use data from the image sensor 60, which may be in the form of video, for example, along with audio input from the microphone 76 and/or temperature and humidity inputs from the temperature/humidity sensor 90/88, for example, to learn about the activities of a user in a given location, and the level of activity in the location. This enables the system 14 to determine a normal level of activity in a given location, such as in a given room in the home 12. If high activity is normal at particular times of a weekday or a day of the weekend, for example, and the system 14 determines that there is very little activity at one of these times, it may be because the user went on vacation, for example, or it may be that there is a potential problem. The system 14 identifies the deviation from the norm and notifies the primary user 22. The user 22 can then respond appropriately to the notification.
  • the pattern can be set for all the devices in the location.
• One device 10 can also break a pattern established by another device. For instance, if one device 10 does not detect movement and therefore infers that no one is in the home 12 during a certain time of day, but another device 10 within the home detects movement, that input is calculated on a per-location basis and the pattern of the first device is overridden by the input from the second device.
  • the system 14 accumulates information from multiple devices 10 to determine current activity and patterns of behavior, in a location.
• the identity of persons within a location may be determined by the system 14 based on patterns learned about respective persons. Those patterns include the general size and weight of the user, as determined by data received from the imaging sensor 60 and PIR sensor 72, for example; the level of heat generated by the person, as determined by the PIR sensor, for example; the voice of the person, as detected by the microphone 76; facial or other pattern recognition of the user based on data collected by the imaging sensor; activity patterns of the user based on learned patterns; or other person-specific data determined via the various sensors within the device 10. Person recognition is important in order to determine the activity and patterns of individual members of the location, and to know who is home, to determine if there are individual-specific responses to incidents.
• One of the ways the system 14 and the device 10 enable learning is by giving status scores to the activity in a home 12 or portions of the home. For instance, if a user's home 12, or a particular room in the home, is typically very loud and very active at certain times of particular days, based on noise detected by the microphone 76, motion found in the video recorded by the image sensor 60, and/or temperature/humidity fluctuations detected by the T/H sensor 90/88, for example, the home 12 or room in the home may be assigned an activity score of 10/10 for those times of those days.
  • Activity scores may be stored in the database 20 in the system 14 , in association with the time period of the particular days that the score is applicable, the room the score is applicable to, and identifying information about the user and/or the user's home, for example. In one example, if a current activity score drops more than 20% below the expected score in a time period, the primary user 22 may be alerted. Other percentage deviations or ranges may be used instead.
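The deviation check described above might look like the following sketch, using the 20% figure from the text; the data layout and time-slot keys are assumptions.

```python
# Sketch of the activity-score check: alert when the current score
# drops more than 20% below the expected score for that room and time
# slot. Data structure and names are illustrative.
DEVIATION_THRESHOLD = 0.20

expected_scores = {("living_room", "weekday_evening"): 10.0}

def check_activity(room: str, slot: str, current_score: float) -> bool:
    expected = expected_scores[(room, slot)]
    drop = (expected - current_score) / expected
    return drop > DEVIATION_THRESHOLD  # True -> alert the primary user

print(check_activity("living_room", "weekday_evening", 7.0))  # True (30% drop)
```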
• a home 12 including an elderly user who lives alone may have a low activity score of 2, for example. If the activity score in the elderly person's home 12 suddenly rises to 5, that also may be indicative of a potential problem, such as a home intrusion, or the rise in the activity score may be caused by a visit from grandchildren, for example.
  • the user or backup contacts may inform the system that there is or is not a problem in the home.
  • Other characteristics of a location may be assigned scores based on other sensor data, such as scores for temperature, humidity, and/or air quality, for example. By scoring both the sensors themselves and the sensors working together, the system 14 can provide a clearer and complete picture of what is normal or not normal in a home 12 or other location, and respond accordingly.
  • the device 10 and/or the system 14 enable learning by recalibrating how sensor data is processed.
  • the device 10 and/or the system 14 may enable learning by adapting how the processing device 58 and/or the processing device 18 processes data about a location based on one or more pieces of data about the location that are collected by one or more sensors coupled to the processing device, for example.
  • Embodiments of the invention include a learning-based on-boarding process.
  • a rapid learning period which includes periodic questions to determine information on the home 12 or other environment where the device 10 is located, such as:
  • the questions above are merely examples and are not necessarily in the format that would be presented to a user 22 . These questions express the subject matter of possible questions and information that may be desirable to collect.
  • the questions may be presented as multiple choice questions and/or as fill in the blank questions, with dropdown menus and/or windows addressing user input words, names, times, etc.
  • Periodic questions to confirm what is learned about the environment may also be presented to the primary user 22 .
• if the system 14 can deduce a pattern, such as user patterns of coming and going, then it can ask the user to confirm that the pattern of activity or learning is accurate.
• if the device 10 sees over a period of days or weeks that a certain activity, such as people waking up, adults leaving for and returning from work, children leaving for and returning from school, or deliveries, such as newspaper, laundry, and/or mail deliveries, etc., is happening at certain times that have not been addressed by a response of the user to on-boarding questions, then the system 14 can ask the user to confirm that the pattern of activity or learning is accurate.
  • the learning may involve a combination of observed behavior and previously answered questions by the user.
  • Other patterns may relate to temperature, humidity, and/or air quality changes over the course of the day, for example.
• Providing a ‘completeness score’ on how much the device has learned, and how much is left to truly understand the behavior of the individual, may encourage the primary user 22 to answer more questions, for example. If over a series of weeks the device 10 has observed the primary user's behavior, and has had a number of questions answered by the user, then the system 14 can inform the user 22 how close it is to knowing their environment by giving the user a completeness score, such as that learning is 80% complete, for example. The system 14 may always be learning, and can learn new patterns, so the completeness score can continue to evolve. The score can be given as a percent or in another form, including natural language.
• A percentage of completeness or other such score may also be awarded for the various tasks the user has to perform as a part of the on-boarding/learning process, such as answering the questions presented, as well as other activities, such as adding their friends and family as backup contacts 26, confirming the identity of the local police 28, fire department 30, and ambulance service 32, for example.
  • Rewards may be based on the completeness of the profile. For example, providing extra storage in the database 20 in return for completeness of the on-boarding process. Such additional storage may be used to store video clips that a user is interested in, for example.
  • That data may also be given a sensor activity score.
• the score is based, at least in part, on a particular point of data from one sensor or a data set from multiple sensors, and what is learned about the location. After learning the patterns of a room or home, for example, throughout a day and week, such as the frequency of movement or average temperature, the device is able to determine a baseline of activity or events that are deemed normal in a particular location and during particular times of a day. When new data is acquired by the device's sensors, that data's issue activity score is determined against the baseline of what the device has learned to be normal at that time of the day and day of the week, in order to identify potential threats.
• the data is compared to the pattern or patterns by the system 14 and/or the device 10, and the degree of deviation from the normal pattern may be determined, for example.
  • the deviation may be expressed by a percentage deviation, for example.
• the degree of deviation may also be applied to a scale of 1-10 or 1-100. A higher percent deviation or issue activity score reflects a stronger deviation from the home's normal baseline.
• Sensor activity scores may also be indicative of how comfortable the physical environment of a home is, when certain sensor readings, such as temperature, deviate from the baseline pattern or a threshold set by the primary user 22, for example. The user may be informed of the deviation in terms of the degree of deviation, as a percentage or score, for example.
• the device 10 may continuously collect data from all the sensors while on, or during other time periods. Sensor data may also be periodically collected and stored, at regular or non-regular time intervals. Some or all of the collected data may be analyzed by the processing device 58 on the device 10, and/or sent to the system 14 via the network 16 for analysis or further analysis. Collected and stored data may be sent to the system 14 for analysis continuously, or periodically in regular or non-regular time intervals, as well.
• the data or data files may be sent by the device 10 to the system 14 with metadata including an identification of the device they are coming from, over an encrypted channel such as a secure socket layer (SSL) connection.
  • the collected data may be sent in multiple respective data streams or in other formats, for example.
  • FIGS. 18 and 19 show an example of a method 400 of the operation of the device 10 and the system 14 in response to an event detected by the device 10 .
  • One or more sensors are triggered and/or motion is detected, in Step 402 .
• the device 10 determines whether the detected data is to be defined as an “Event” that needs to be further analyzed, in Step 404, based on whether the detected data meets criteria, such as exceeding a threshold or deviating from a pattern, for example. If not, in this example, no further action is taken. If Yes, the device 10 sends the detected data and other information related to the event to the system 14 via the network 16, in Step 406, for further analysis.
• the data sent by the device 10 includes the data from the sensor or sensors triggering the event, and data from other sensors, which may assist the system 14 in the interpretation of the data from the sensor triggering the event. For example, if the temperature sensor 90 is triggered, a video clip may be sent to the system 14, as well. Identification information related to the device 10, the user 22, the home 12, the location of the device in the home, etc., may also be sent.
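A sketch of what such an event upload could contain is shown below. The JSON field names are assumptions; the text specifies only that triggering-sensor data, corroborating data (such as a video clip), and identification information are sent over an encrypted channel.

```python
import json
import time

# Sketch of an event payload bundling the triggering sensor's data with
# corroborating readings and identification. The wire format and field
# names are assumptions for illustration.
def build_event_payload(device_id: str, user_id: str, trigger: str,
                        readings: dict, video_clip_ref: str | None) -> str:
    return json.dumps({
        "device_id": device_id,
        "user_id": user_id,
        "timestamp": time.time(),
        "trigger": trigger,            # e.g. "temperature"
        "readings": readings,          # data from all sensors
        "video_clip": video_clip_ref,  # optional corroborating clip
    })

payload = build_event_payload(
    "dev-001", "user-42", "temperature",
    {"temp_c": 58.2, "humidity_pct": 22.0, "air_quality": 0.7},
    "clip-2014-04-23T14:12.mp4",
)
```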
• if the event is a life-safety event, it is then determined whether anyone is home, in Step 410.
  • the system 14 may know whether anyone is home based on data previously received from the image sensor 60 , the PIR sensor 72 , the microphone 76 , and geo-location of user devices 24 , for example.
  • the system 14 treats the event as an emergency and simultaneously contacts the primary user 22 and backup contacts 26 in Step 412 , via email, text, and by phone, for example.
  • Primary users 22 may determine how they are to be contacted. Different backup contacts may be contacted depending on the type of the event.
• It is determined whether the primary user 22 or a backup contact 26 responds within a predetermined period of time, in Step 414.
  • the predetermined time may be one (1) minute, for example. If Yes, it is determined whether one of the contacted parties clears the event as a false alarm, in Step 416 . If Yes, the method ends in Step 418 with the event being cleared.
• if neither party clears the event as a false alarm, in Step 416, the system 14 will take further action, in Step 420, such as calling the police 28, fire department 30, and/or ambulance service 32, depending on the event.
  • the system 14 may cause the device 10 to activate the siren 86 , as well.
• If the primary user 22 or backup contacts 26 do not respond, in Step 414, then they are re-notified, in Step 422, and the method returns to Step 414. If the primary user 22 or the backup contacts 26 do not respond in Step 414 after re-notification, the system 14 may proceed directly to Step 420 to call the police, fire department, or ambulance.
• If no one is home, in Step 410, it is determined whether it is nighttime, in Step 424.
• if it is nighttime, the system may still treat the event as an emergency and proceed to Step 412.
• it may be determined that the event is not a life-threatening event, in Step 408, or that it is not nighttime, in Step 424.
• in that case, the primary user is notified in Step 428. If the primary user 22 responds within a predetermined period of time, such as one (1) minute, for example, in Step 430, it is determined whether the primary user cleared the event as a false alarm, in Step 432. If Yes, the alert is cleared, in Step 418, as discussed above.
• If the primary user 22 does not clear the event as a false alarm, in Step 432, the method goes to Step 434 to contact the police, etc., which is also discussed above.
• If the primary user 22 does not respond within the predetermined period of time, then the backup contacts 26 are notified, in Step 434. It is then determined whether the backup contacts clear the event as a false alarm, in Step 432. If Yes, the event is cleared, in Step 418. If No, the system contacts the police, fire department, and/or ambulance service, in Step 434.
  • Steps 426 - 436 show an example of how the system 14 can escalate notifications from the primary user 22 , to the backup contacts, 26 , to the police, etc.
  • the escalation procedure may be determined by the system 14 and/or by user preferences.
  • a primary user 22 may set the time period before a backup contact is alerted after the primary user 22 has failed to respond and resolve a notification, for example.
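The escalation flow of FIG. 19 can be summarized in a sketch like the one below, with notification and response-waiting abstracted as callbacks; the structure is an illustrative assumption, with only the notify, wait, escalate order and the one-minute default taken from the text.

```python
# Sketch of the notify -> backup contacts -> authorities escalation.
# Callbacks and return values are assumptions for illustration.
def escalate(notify_primary, notify_backups, contact_authorities,
             wait_for_response, timeout_s: float = 60.0) -> str:
    notify_primary()
    response = wait_for_response(timeout_s)
    if response == "cleared":
        return "event_cleared"          # false alarm
    if response is None:                # no response: go to backups
        notify_backups()
        response = wait_for_response(timeout_s)
        if response == "cleared":
            return "event_cleared"
    contact_authorities()               # not cleared: police/fire/ambulance
    return "authorities_contacted"
```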
  • Other escalation policies that can be set by the primary user include: the types of alerts that are escalated (i.e.
  • FIG. 20A is another example of a method of operation 500 of the device 10 and/or system 14 when an event is detected in accordance with an embodiment of the invention.
  • the device 10 performs all the steps of the method 500 except for the learning steps, which are performed by the system 14 .
• An event is detected, in Step 502. It is determined whether the event is a sensor related event, such as an event detected by the temperature/humidity and/or air quality sensors, in Step 504. If Yes, the device 10 determines whether the event matches pre-determined criteria for sending an alert, in Step 506. As discussed above, the pre-determined criteria are system defined criteria, such as thresholds. If Yes, then the device 10 sends an alert to the user 22, in Step 508, and the method ends, in Step 510.
  • pre-determined criteria are system defined criteria, such as thresholds.
  • the device 10 determines whether the event matches user defined criteria, in Step 512 .
  • user defined criteria are criteria set by the primary user 22 .
  • the primary user 22 informs the device 10 and/or the system 14 that a notification should be sent when particular criteria are met.
  • the user defined criteria may be defined by the primary user 22 through the questions asked by the system 14 and answered by the user during set up and during operation, as discussed above.
  • an alert is sent to the primary user 22 , in Step 508 . Escalation may be followed if the primary user 22 does not respond in a predetermined period of time, as shown in FIG. 19 , for example and discussed above.
• If the device 10 determines that the event does not relate to user defined criteria, an alert is sent, in Step 416, and the event is submitted for learning to the system 14, in Step 518.
  • the device 10 determines whether the event is proximity related, such as a person entering the home 12 , in Step 516 . If Yes, the device 10 determines whether the user wants to be informed of a proximity event, in Step 520 . If Yes, an alert is sent by the device, in Step 420 , and the event is submitted to the system 14 for learning, in Step 418 . If No, then an alert is not sent and the event is submitted to the system 14 for learning, in Step 422 .
  • the system can also learn from the event.
  • the event may relate to data collected by the camera/imaging sensor 60 , and/or microphone 76 , such as sudden or unaccounted for motion or noise, etc.
• the event may be presented to the primary user 22, and the user can input information clearing the event or defining the event. In this way, the system 14 learns which events the primary user wants to be informed of, and which events the user does not want to be informed of.
  • the video may show a small mass moving close to the floor.
• the user may identify this mass as a pet.
  • the system 14 learns not to send an alert the next time a similar moving mass is identified.
  • a person of a particular size and shape may enter the house between 9-10 AM on a Monday.
  • An alert is sent to the user, who clears the alert.
  • the same event takes place on the next two Mondays, which are also cleared by the user.
  • the system 14 can then ask the primary user 22 whether it is necessary to send alerts the next time the same person of the same size and shape enters the house on a Monday between 9-10 AM.
  • noise of a common low frequency may be detected for 7 or more days in a row.
  • the system 14 can learn that this is a normal level and only send alerts if the noise level suddenly spikes above this usual level, even if the spike does not exceed a preset threshold previously set by the system 14 , for example.
  • the system 14 may detect a loud noise at the same time every day, for 7 days. Each day the primary user 22 clears the noise event. The next day the system 14 detects the same sound at the same time, and does not send an alert to the user.
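A sketch of this learned-baseline behavior follows. The text says only that the system learns the usual level over about a week and alerts on sudden spikes; the hourly sampling and the three-standard-deviation spike test are assumptions.

```python
from statistics import mean, stdev

# Sketch of a learned noise baseline: after roughly a week of readings,
# alert only when a level spikes well above the learned baseline. The
# 3-sigma rule and hourly cadence are assumed parameters.
class NoiseBaseline:
    def __init__(self):
        self.samples: list[float] = []

    def observe(self, level_db: float) -> bool:
        """Record a reading; return True if it should raise an alert."""
        if len(self.samples) >= 7 * 24:  # ~a week of hourly samples
            threshold = mean(self.samples) + 3 * stdev(self.samples)
            if level_db > threshold:
                return True              # spike: alert, keep baseline clean
        self.samples.append(level_db)
        return False
```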
• while the device 10 performs all the steps of the method 400 in the description above, the device 10 and the system 14 may both perform steps of the method.
  • the device 10 may perform all the steps except for the learning steps 414 and 422 , which are performed by the system 14 .
  • the system 14 is configured to send the alerts in Steps 408 and 416 .
  • the device 10 detects an event and provides the information related to the event to the system 14 , which performs the remainder of the steps of the method 400 .
  • the device 10 provides all data collected from all the sensors to the system 14 , which determines whether an event has taken place, and performs all the other steps of the method 400 .
• actions taken by the device 10 are performed by the processing device 58 in this example, under the control of software, such as an Application stored in memory, while actions by the system 14 are performed by the processing device 18, under the control of suitable software stored in memory.
  • the system 14 may be a discrete entity or a cloud based processing system.
  • FIG. 20B is another example of a learning procedure 600 , in accordance with an embodiment of the invention.
  • Data is received from the sensors in the device 10 , in Step 602 .
  • the data includes a time stamp stating the time and date of the data, as well as identifying information, such as the device detecting the data and the location of the device in the user's home, for example.
• the time stamped data is stored, in Step 604, and analyzed, in Step 606, to identify potential patterns. Patterns may be identified by methods known in the art.
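One simple way such patterns could be identified, offered purely as an illustration since the text defers to methods known in the art, is to count recurrences of an event type in the same weekday and hour slot:

```python
from collections import Counter

# Sketch of a recurrence-based pattern test over time-stamped events:
# if the same kind of event recurs in the same weekday/hour slot often
# enough, propose it as a pattern for the user to confirm. The
# 3-occurrence rule is an assumed parameter.
def find_candidate_patterns(events, min_occurrences: int = 3):
    """events: iterable of (weekday, hour, event_type) tuples."""
    counts = Counter(events)
    return [key for key, n in counts.items() if n >= min_occurrences]

events = [(0, 9, "person_enters")] * 3 + [(2, 14, "loud_noise")]
print(find_candidate_patterns(events))  # [(0, 9, 'person_enters')]
```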
  • the first Option A comprises:
  • the second Option B comprises:
• if a potential pattern is identified, in Step 606, the primary user is asked to confirm the potential pattern, in Step 608. If the pattern is confirmed by the user, it is stored, in Step 612. If it is not confirmed, it is not stored.
• Current sensor data is compared to stored patterns, in Step 620. If the current sensor data deviates from the pattern by a predetermined amount, the user is informed, in Step 622. If the sensor data does not deviate from the pattern by the predetermined amount, the user is not informed, in Step 624.
• the user's response to being informed of a deviation may also be used in the learning algorithm for additional learning, as discussed above. For example, if a primary user 22 is notified of a particular deviation several times and each time the user clears the notification, the system 14 can ask the user whether the user wants to be informed of such notifications. If the user's response is No, then notifications will not be sent for the same deviation again.
  • video recorded by the image sensor 60 is analyzed by the processing device 58 of the device 10 and/or by the processing device 18 of the system 14 .
• video is initially analyzed by the processing device 58 of the device 10, and when interesting video frames are identified, they are sent to the system 14 via the network 16 for further analysis by the processing device 18 of the system 14.
  • FIG. 21 is a schematic diagram of an example of components of the device 10 involved in video processing, in this embodiment.
  • FIG. 22 is a schematic diagram of an example of the components of the system 14 involved in video processing.
• Data collected by the image sensor 60 is provided to the processing device 58 in two identical streams.
• a digital signal processor 702 of the processing device 58, or a separate digital signal processor, compresses the video in one of the streams and stores the compressed stream in a storage buffer 704.
  • the storage buffer may be in RAM 82 or other such memory, for example. MPEG-4 video compression may be used, for example.
• the second stream, which is not compressed, is provided by the image sensor 60 to the video analysis module 706 of the processing device 58.
  • the video analysis module 706 is a software module that determines whether there is change worthy of further processing by the system 14 , such as movement in the frames of the video.
  • the video analysis module 706 may quantify the amount of change in the video and compute an “interestingness” score, which may be a weighted function including available bandwidth between the device and system 14 .
  • the weighted function may be updated based on information/instructions from the system 14 .
  • the interestingness score may be provided to the buffer worker module 708 .
  • the buffer worker module 708 is a software module that determines which compressed frames are to be sent to the upload buffer for upload to the system 14 for further analysis.
• when the interestingness score is at least equal to the threshold, the buffer worker module moves the video from the buffer storage 704 to the upload buffer, for upload of the corresponding chunk of compressed video to the system 14 via the network 16.
  • the video chunk may be uploaded to the system 14 by Wifi or Ethernet, for example. If the score is less than the threshold, nothing is done.
• if the buffer worker 708 notices that the buffer storage 704 is near capacity, it deletes videos having the lowest interestingness scores, the oldest videos, or videos selected on another basis, to maximize the information stored in the buffer.
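The buffer-worker policy described in the last few items can be sketched as follows. The heap-based eviction and the class interface are assumptions; only the threshold test and the least-interesting/oldest eviction order come from the text.

```python
import heapq

# Sketch of a buffer-worker policy: upload chunks whose interestingness
# score meets the threshold; otherwise buffer them, evicting the least
# interesting (then oldest) chunks when near capacity. Data structures
# are illustrative assumptions.
class BufferWorker:
    def __init__(self, threshold: float, capacity: int):
        self.threshold = threshold
        self.capacity = capacity
        self.buffer: list[tuple[float, float, bytes]] = []  # (score, ts, chunk)

    def offer(self, score: float, timestamp: float, chunk: bytes) -> str:
        if score >= self.threshold:
            return "upload"                  # move to the upload buffer
        heapq.heappush(self.buffer, (score, timestamp, chunk))
        if len(self.buffer) > self.capacity:
            heapq.heappop(self.buffer)       # evict least interesting first
        return "buffered"
```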
• Video chunks uploaded to the system 14 are received by a video ingress module 712, shown in FIG. 22, via the network 16.
  • the video ingress module 712 provides the video chunks to a video processing module 713 of the processing device 18 for video processing.
• Processing may include motion characterization 714, such as “entering”, “leaving”, “crossing”, or “getting up”; segmentation 716; and feature extraction 718. It is noted that the processing may be performed by a discrete entity or a cloud based system.
  • Segmentation 716 defines the moving object with respect to the non-moving background, over a number of frames.
• in feature extraction 718, particular features that may facilitate recognition of the moving object are extracted from the segmented volumes.
  • features are extracted from the non-moving parts of the background for characterization.
  • the output of the video processing module 713 is provided to a learning module 720 , which performs recognition to identify the moving objects.
  • a notification module 722 determines whether the primary user 22 or another party, such as backup contacts 26 , need to be notified of the identified moving object. If so, the notification module 722 sends the notification to the primary user 22 or other such party, via the network 12 in the manners specified by the primary user 22 and the backup contacts 26 . If not, then a notification is not sent.
• the learning module 720 may also provide feedback to the buffer worker 708 in the device 10 via the network 16, to adjust the threshold of the interestingness score or to mask parts of the image to remove them from the computation of the interestingness scores, such as a constantly changing television screen or a ceiling fan. While moving, a TV or fan is typically not of interest and can be considered part of the background. If not enough data is being provided to successfully identify moving objects, for example, the threshold may be lowered so that more video data is provided to the system 14 for further analysis. This may become apparent if the system 14 needs to keep asking the user to identify the same person, for example. If the system 14 determines that too much data is being provided for efficient, timely analysis, or it is determined that moving objects can be consistently and successfully identified with less data, the threshold may be raised.
• the learning module 720 may send the video or an image to the primary user 22 with a request to identify whether an object in the video is a person, a child, or a pet. If a person or child, the system 14 may also request an identification of the person or child. The primary user 22 may also be asked whether the user wants to be informed of the presence/movement of that person, child, or pet. Based on the feedback from the primary user 22, the system 14 learns the identities of people living in or visiting a home or other location. The system 14 also learns when the primary user does not want to be notified of detected movement and when the user does want to be notified of detected movement.
  • the video analysis module 706 may determine whether the received video indicates movement by comparing a current frame or portion of a frame to a prior frame or portion of a frame, for example.
• a grid may be superimposed on an image. For each point in the grid, an 8×8 or other size pixel patch is defined and compared to the same size pixel patch around the same point in the prior or following frame. Differences from frame to frame may indicate movement of one or more objects within the frame. Motion vectors may be generated from frame to frame. If there is no movement, the motion vector in a patch equals zero (0), and the prior patch with a non-zero motion vector is carried over for comparison with a subsequent patch, as in MPEG-4 video compression.
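A minimal sketch of the patch-comparison step follows, operating on frames represented as 2D lists of grayscale values; the stride and difference threshold are assumed parameters.

```python
# Sketch of grid-based patch comparison between consecutive frames:
# a large absolute difference over an 8x8 patch suggests motion at that
# grid point. Frame representation and thresholds are assumptions.
PATCH = 8

def patch_diff(prev, curr, top: int, left: int) -> int:
    return sum(
        abs(curr[top + r][left + c] - prev[top + r][left + c])
        for r in range(PATCH) for c in range(PATCH)
    )

def motion_points(prev, curr, stride: int = 8, threshold: int = 200):
    points = []
    for top in range(0, len(curr) - PATCH + 1, stride):
        for left in range(0, len(curr[0]) - PATCH + 1, stride):
            if patch_diff(prev, curr, top, left) > threshold:
                points.append((top, left))  # candidate motion location
    return points
```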
  • feature detection and recognition techniques such as those used in computer vision, are used by the video analysis module 706 .
  • areas of high contrast are identified in a frame, to identify salient image areas.
  • the most salient image areas may be identified in each frame.
  • the salient image areas are compared from frame to frame to detect movement. Frames showing movement are given a higher interestingness score by the module 706 .
  • the threshold compared to the interestingness score by the buffer worker may be set in the device and/or provided by the system 14 via the network 12 .
  • the threshold may be changed by the system 14 , as discussed above.
  • a motion analysis module in this example determines “where” the motion is taking place in a frame and an object analysis module determines “what” is moving.
• Moving “blobs” may be identified across frames, and motion signatures of moving blobs may be determined by defining a motion vector over multiple frames, such as 150 frames, for example. Clustering may be performed on all sets of vectors to group similar motions together. These clusters may form the basis of a dictionary of “motion signatures” against which future motions may be compared.
• Motion signatures in the dictionary may also be compared to signatures developed during the learning process and named by user input, or compared against all motions recognized by all devices. Motion signatures for pets will be different than those of children and adults. Entering and Leaving signatures will have common traits across many locations. Motion signatures may take into account speed and motion cues, such as gait and/or the size of a moving object, which may also be stored in the dictionary. A primary user 22 may be asked to identify moving objects or blobs during the learning/onboarding phase, and later, when unidentified moving blobs are detected.
  • the dictionary inclusion may be based on term frequency-inverse document frequency and/or K-means training, for example. Significant and commonly occurring features of interest (moving blobs) whose signatures may be stored in the dictionary include a door opening and closing, a person entering and leaving a room, a pet walking or running through a room, for example.
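As an illustration of the K-means training mentioned above, the sketch below clusters fixed-length motion-signature vectors into a small dictionary; the two-feature signature layout (average speed, blob size) is an assumption.

```python
import random

# Bare-bones K-means over motion-signature vectors, sketching how a
# dictionary of cluster centroids could be trained. Feature layout and
# parameters are illustrative assumptions.
def kmeans(signatures, k: int, iters: int = 20):
    centroids = random.sample(signatures, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for sig in signatures:
            nearest = min(range(k), key=lambda i: sum(
                (a - b) ** 2 for a, b in zip(sig, centroids[i])))
            clusters[nearest].append(sig)
        for i, members in enumerate(clusters):
            if members:  # recompute centroid as the member mean
                centroids[i] = [sum(vals) / len(members)
                                for vals in zip(*members)]
    return centroids  # dictionary entries: one signature per cluster

# e.g., 2-D signatures of (average speed, blob size): pets vs. adults
sigs = [[1.0, 0.2], [1.1, 0.25], [0.3, 1.0], [0.35, 1.1]]
print(kmeans(sigs, k=2))
```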
  • a second dictionary may be created to store features, such as colors, edges, corners, etc., which may also be extracted from video. This dictionary stores the “what” features of what is moving to recognize objects, such as people and pets, regardless of their motion or position in the video.
  • Feedback may be provided from the learning module 720 to the video processing module 713 during any one or all of the motion recognition, segmentation, or feature extraction steps, to improve the video processing based on the learning process and the success of identifying moving objects.
• Video files and other data files containing data from other sensors may be sent by the device 10 to the system 14 with metadata including an identification of the device they are coming from, over an encrypted channel such as a secure socket layer (SSL) connection.
• the device 10 transmits audio, video, and sensor data simultaneously. This differs from currently available home automation systems, for example, which may include multiple independent sensors.
• the device 10 may transmit a complete dataset, which facilitates rapid response by the system 14. If there are multiple devices 10 operating in a home 12 or other location at the same time, they may work simultaneously to compile the varying sensor data to send to the system 14.
  • a current state of the device 10 including the current video file and current data from the sensors, such as temperatures, humidity, air quality, etc., may be stored in the database 20 in association with an identification of the device, the location of the device determined by geo-location, and the identification of the primary user.
  • Prior states of the device 10 may also be saved in the database 20 or other such storage device or database to facilitate learning, for example.
  • Backup contacts, family members, group members, and notification rules may also be stored in the database in association with the identification of the device 10 and primary user 22 .
• video is stored in a separate database, and a pointer to the video is stored in association with the other status information.
  • FIG. 23 is an example of a notification 750 provided to user device 24 of a primary user 22 , as displayed by the App.
  • the notification 750 includes a description 752 of the event and location where it took place. In this example, activity was detected in the living room.
  • the notification also includes the time 754 of the event, here 2:12 PM.
  • the notification 750 also describes what triggered the event 756 , here motion detection.
  • An image 758 of the room where the event took place is also shown. Clicking on the image plays a video of the event.
  • An icon 760 is also provided, which enables viewing a current video of the room where the activity took place.
  • Another icon 762 is provided to provide action options for the primary user 22 .
  • the action options may include clearing the event as a false alarm, contacting a backup contact 26 , or contacting the police 28 , fire department 30 , or an ambulance 32 , for example.
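Taken together, the notification of FIG. 23 might be represented by a structure like the following sketch, which the App would render; all field names are illustrative assumptions.

```python
# Sketch of the notification content of FIG. 23 as a structure the App
# might render; field names and URL placeholders are illustrative.
notification = {
    "description": "Activity detected",   # description 752
    "location": "Living room",
    "time": "2:12 PM",                    # time 754
    "trigger": "motion detection",        # trigger 756
    "event_video": "event-clip-url",      # image 758: plays event video
    "live_video": "live-stream-url",      # icon 760: watch the room now
    "actions": [                          # icon 762: action options
        "clear_false_alarm",
        "contact_backup_contact",
        "contact_police",
        "contact_fire_department",
        "contact_ambulance",
    ],
}
```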
  • FIG. 24 is an example of a notification 770 displayed on a user device 24 that states that a person arrived home and when.
  • the notification also includes the current temperature, humidity, air quality, and noise in the room based on the T/H sensor 90 / 88 , the air quality sensor 94 , and the microphone 76 , respectively.
  • the primary user 22 can control the device 10 through an App from their smart phone or a web interface.
  • the device 10 can also be controlled by the user through active and passive actions. Active actions that can control the device include voice commands and physical gestures, for example. A combination of voice/sound and gesture actions can also generate a command to control the device. Sound may be detected by the microphone 76 and gestures may be detected by the image sensor 60 .
  • a primary user or other member of a household can call out for a security alert to contact authorities, such as the police or fire department, or to contact a backup user, for example.
  • a specific action can be tailored based on the desired outcome, such as a user designating a verbal command to issue a low level alert or a specific hand gesture to issue a high level alert.
  • Voice recognition technology may be used to teach the system to recognize commands, as is known in the art.
  • Gesture recognition technology may be used to recognize gestures, as is also known in the art.
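A minimal sketch of how user-tailored voice and gesture commands might map to alert levels, per the description above; the specific commands and rule table are assumptions:

```python
# Illustrative rule table mapping recognized active actions (voice or gesture)
# to alert levels; the commands shown stand in for user-designated ones.
from typing import Optional

ALERT_RULES = {
    ("voice", "security alert"): "high",    # call out to contact authorities
    ("voice", "check in"):       "low",     # a user-designated verbal command
    ("gesture", "raised hand"):  "high",    # a user-designated hand gesture
}

def classify_action(kind: str, recognized: str) -> Optional[str]:
    """Return an alert level for a recognized action, or None if unmapped."""
    return ALERT_RULES.get((kind, recognized))
```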
  • the user can also control the device through passive actions. Movement and video cues determine passive actions.
  • the primary user can walk in front of the device 10 to cause the device to disarm after the device and/or system 14 recognizes the primary user through facial recognition, for example.
  • the device 10 can also sense directional based input from its image sensor 60 to indicate whether a particular movement is normal or out of the ordinary.
  • a timeline is a linear, chronological schedule of events that occurred in a given location over a time period that have been captured by the device 10 or multiple devices 10 in a location.
  • the timeline is displayed on a user device 24 via the App, based on information provided to the user device from the system 14 and/or the device 10 , via the network 16 .
  • the timeline allows a primary user 22 to quickly get an overview of all activity in a user defined location that is captured by the device or devices 10 .
  • the timeline allows a user to toggle between events and previous notifications, under the control of the App.
  • the timeline may be composed of two types of events, engagement entries and event entries, for example.
  • Engagement notifications are generated when a primary user 22 or other user, such as a family member (trusted user), interacts with the device 10 .
  • Event notifications are generated when the device 10 is armed and detects something out of the ordinary.
  • Engagement entries capture the engagement of a user at a particular location with the device 10 .
  • an engagement entry is generated when a primary user arrives or departs from home.
  • Other engagement notifications include device mode changes, whether initiated by the system automatically or by a primary user 22 (e.g. “device is now armed” or “Amber disabled device”); a primary user going live (explained in further detail below); a primary user making changes to notification settings (e.g. “Justin paused all notifications for one hour”); or a primary user resolving an Event, as described below.
  • the timeline also allows a user to manage event notifications.
  • the primary user 22 may see which sensors triggered event creation; learn which device in the home triggered event creation, in the case of multiple devices in a single location; learn at what time an event was triggered; see video clips from when other primary users went “live” (actively used the camera to record remotely); see who went live, at what time, and from where (via GPS location); see the severity of an event; leave comments on entries for other users with access to the timeline; see how the event was responded to, and by which user; see sensor readings to make sure the environment is normal; see a high-level view and a detailed view of each event; see sensor details for each event; learn which users were home during each event; learn how long each event lasted; watch video from an event; know the status of notifications from an event; share an event socially; download an event report (sensor data, event timeline, video, etc.); mark an event as important; email or text an event video and details; and/or leave feedback for other users with access to the timeline.
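A sketch of the two timeline entry types described above, under assumed class and field names:

```python
# Illustrative encoding of engagement entries (user interaction with the device)
# and event entries (armed device detected something out of the ordinary).
from dataclasses import dataclass

@dataclass
class TimelineEntry:
    time: str                 # ISO-8601 timestamp, e.g. "2014-04-23T14:12:00"
    summary: str
    viewed: bool = False      # whether the entry was previously viewed

@dataclass
class EngagementEntry(TimelineEntry):
    user: str = ""            # e.g. "Amber disabled device"

@dataclass
class EventEntry(TimelineEntry):
    sensors: tuple = ()       # which sensors triggered event creation
    severity: str = "low"
    video_ref: str = ""       # clip captured for the event

def timeline(entries):
    """Linear, chronological schedule of entries for display in the App."""
    return sorted(entries, key=lambda e: e.time)
```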
  • the timeline may include events where the device 10 notified the primary user 22 of potential security threats.
  • the timeline may also include events that did not result in notification to the primary user 22 , due to learning that the primary user does not want to be notified of a particular event, or due to lack of severity of the event, for example.
  • FIG. 25 is an example of a timeline 800 of events that can be displayed by the user device 24 via the App, from data received from the system 14 and/or the device 10 .
  • the timeline identifies the location 802 of the device 10 , here “Brooklyn Apartment,” the current time 804 , the date (Today) 806 , and the number 808 of events that took place so far that day (8). Three events are listed with the time 810 of each event and a summary 812 of the event.
  • the timeline 800 may be scrolled by the user to see the additional events that are not currently displayed.
  • the timeline also indicates whether the event was previously viewed 814 .
  • FIG. 26 is a timeline 820 of notifications. Two days of notifications are shown, and the timeline may be scrolled to display additional notifications.
  • each user's timeline may appear slightly different and may be customized to them.
  • Backup contacts 26 and other parties may have access to the timeline, if allowed by the primary user 22 .
  • the device 10 and the system 14 know who is in a certain location (using geolocation services on a mobile device that is linked to the account, facial recognition, and/or user patterns), what the temperature and humidity are inside, based on the T/H sensor 90 / 88 , and outside, via publicly available weather information, what noise and activity are occurring, via the microphone 76 and other sensors, and whether all of these activities are normal or not.
  • the activities, plus their relative ‘normalcy’ or ‘non-normalcy,’ can be outlined on a timeline, where the primary user 22 can quickly delve into each event to understand if and how it impacts their life, and can be notified of the events that are not normal occurrences.
  • the device 10 and/or the system 14 may also provide a ‘day’ or ‘week in the life’ summary of what happened in a given location, to provide a very rapid video clip view and understanding of what was happening in the location for that week.
  • the user may be e-mailed a video clip synopsis of the week, the day, or other time period.
  • Primary users 22 may also access their devices 10 from any of their locations, to watch live streaming video from that device, enabling them to “Go Live.” They can also see current sensor readings from the location and who is home/away at the time. The primary user 22 has the option to save the video clip and corresponding sensor data.
  • a clip library allows users to store flagged video clips that they want the system 14 to store for them. These can include clips of events they have been notified of; a clip of a robbery, for example, may later be used in court or for a police report. The user may also request that clips of happy events be saved, such as lighthearted clips of the family pet doing something noteworthy, for example. Flagging a clip via the App saves that clip in the database 20 or other storage device on the system 14 .
  • a clip library belongs to a location. Primary users may share saved clips from their clip library to social networks, such as Facebook and Twitter, for example.
  • FIG. 27 is an example of a method 900 for generating a day or week in the life at a location.
  • a location is monitored by a monitoring device 10 including a video camera, such as an imaging sensor 60 , in Step 902 .
  • a primary user 22 is notified of an event at the location, in Step 904 .
  • the user may be notified by the system 14 , as discussed above.
  • Video clips related to respective events at the location the user has been informed of are stored, in Step 906 .
  • the video clips may be uploaded to the system 14 by the device 10 , as discussed above.
  • the video clips may be stored in the database 20 or other storage device, by the system 14 , in Step 906 .
  • Video clips are stored by the system 14 upon a request of the user, in Step 908 .
  • the user 22 may request that particular video clips be stored via the App on the user device 24 .
  • the system 14 receives a request from the user 22 to provide stored video clips over a designated time period, via the App, in Step 910 .
  • the system retrieves the stored video clips, in Step 910 , and provides them to the user for display on the user device 24 , in Step 912 .
  • the video clips may be compiled into a single file and sent via email, for example.
  • the App opens and displays the video, when requested by the user 22 .
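A minimal sketch of Steps 910-912, assuming clips are stored with ISO-8601 timestamps; the function and field names are illustrative:

```python
# Hypothetical retrieval of stored clips over a user-designated time period,
# e.g. for a "day in the life" or "week in the life" synopsis.
from datetime import datetime

def clips_for_period(clips: list, start: str, end: str) -> list:
    """Return stored clips whose timestamps fall within [start, end]."""
    s, e = datetime.fromisoformat(start), datetime.fromisoformat(end)
    return [c for c in clips
            if s <= datetime.fromisoformat(c["recorded_at"]) <= e]

# Example: a "week in the life" request from the App
library = [{"recorded_at": "2014-04-20T09:00:00", "ref": "clip-1"},
           {"recorded_at": "2014-04-22T18:30:00", "ref": "clip-2"}]
weekly = clips_for_period(library, "2014-04-17T00:00:00", "2014-04-24T00:00:00")
```

The retrieved clips could then be compiled into a single file and emailed, per the step above.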
  • the App also allows users to view sensor data trends from the sensors in the device 10 , so that the primary user 22 is able to view visualized data about specific sensors within their location. For instance, if an air quality sensor is triggered, a graph may be generated by the system 14 and sent to the primary user 22 that indicates a measure of the air quality, a video and audio clip of when the sensor went off, a historical chart of what the air quality was before the alert, instructions (e.g. go outside, open windows, etc.), and action calls (press here to call poison control, etc.).
  • insights concerning environmental sensor data developed by the system 14 may also be sent to respective primary users 22 .
  • the sensor data may be benchmarked against average readings as determined by sensor data from similarly situated primary users 22 . For example: “Your temperature is an avg. of 76 degrees when no one is home. Most homeowners in your city keep their interior temperature around 70 degrees when no one is home.”
  • Primary users 22 may also be able to see sensor data and compare it against a previous time frame (e.g. this month versus last month). Primary users 22 may also have the ability to toggle between data from specific sensors, showing graphs for those sensors that are relevant to them and hiding sensor data and corresponding graphs that are of no interest to them.
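A sketch of the benchmarking computation behind such a comparison; the numbers and function name are illustrative only:

```python
# Hypothetical benchmark: a user's average reading versus the average of
# similarly situated users, rendered as the message style quoted above.
def benchmark_message(user_readings: list, peer_readings: list,
                      label: str = "temperature") -> str:
    user_avg = sum(user_readings) / len(user_readings)
    peer_avg = sum(peer_readings) / len(peer_readings)
    return (f"Your {label} is an avg. of {user_avg:.0f} degrees when no one is "
            f"home. Most homeowners in your city keep theirs around "
            f"{peer_avg:.0f} degrees.")

print(benchmark_message([76, 77, 75], [70, 69, 71]))
```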
  • Geo-location may be used to identify who is approaching, entering, or leaving a location, for example. Geo-location may also be used to provide an input into the device 10 and/or the system 14 to determine the location of the primary user 22 and other persons, such as family members, for example. Geo-location services are available in most smartphones and other smart mobile devices. Disarming and arming of the device 10 and the system 14 may be based on geo-location. For example, the device 10 /system 14 may be disarmed from issuing a notification that a person is approaching or entering a monitored location when the primary user 22 is determined to be at or near the location, if desired by the user. In this example, while notifications are not sent, data may still be collected. Similarly, when the primary user 22 is determined to be leaving a location, the device 10 /system 14 is armed to provide notifications, if desired by the user.
  • When a primary user 22 begins to approach within a select radius around their home 12 or other location where the device 10 is located, the user device will send a signal to the home network that the user is in the area, so that the device is ‘ready’ for them to enter. Being ‘ready’ informs the device 10 /system 14 not to notify the user(s) immediately upon entry with a movement alarm, since the person entering the location is likely the user; the device thus waits a predetermined period of time (e.g. 45 seconds) to confirm the user is in the location through the App or wireless network and disarm itself, for example.
  • Geo-location of other users designated by the primary user 22 , such as family members, for example, may also arm or disarm the device 10 /system 14 without alerting the user. Such information may be saved into the timeline but need not result in a notification as a security incident.
  • By geo-locating the user device 24 , other preferences in the home 12 may also be controlled. For example, preferences may be different depending on who is in the home. From a security and safety point of view, specific actions and notifications can align with the geo-location of phones and the confirmation of the proximity of individuals within the location. An individual owner of the account can set preferences for the individuals. An action may be a specific notification. For example, if location tracking of a primary user's child is activated, then the primary user 22 can receive a timeline or graph of the activities of the child.
  • Geo-location may be performed in a variety of ways.
  • a user device 24 , such as a mobile device, may monitor a large region (~1 km) centered around the user's home 12 using GPS coordinates, for example.
  • When the user device 24 recognizes that it has entered this large region, it begins to search for a smaller region called an iBeacon, which is a mini geo-fence created using Bluetooth low energy (BTLE) devices.
  • When the user device 24 recognizes the iBeacon, it sends an HTTPS request to the system 14 to disarm notifications.
  • Alternatively, when a mobile user device 24 recognizes that it has entered a large monitored region, it begins to search for the device 10 , which is a Bluetooth low energy (BTLE) device.
  • When connected to the user device 24 , the BTLE device 10 becomes a peripheral device and the user device 24 becomes the central device. The user device 24 continues to search for the peripheral in the background.
  • When the user device 24 detects the peripheral, it verifies that it is that user's device 10 and sends an HTTPS request to the system 14 to disarm notifications.
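A hedged sketch of this region-beacon-disarm flow; the endpoint URL, token handling, and helper names are assumptions, not the system's actual API:

```python
# Hypothetical flow: once inside the ~1 km GPS region, the phone looks for the
# iBeacon/BTLE peripheral; after verifying it is the user's own device 10, it
# asks the system over HTTPS to disarm notifications.
import urllib.request

def maybe_disarm(entered_large_region: bool, beacon_or_device_seen: bool,
                 verified_own_device: bool, token: str) -> None:
    if not entered_large_region:
        return                        # still outside the large GPS region
    if beacon_or_device_seen and verified_own_device:
        req = urllib.request.Request(
            "https://system.example/v1/notifications/disarm", method="POST",
            headers={"Authorization": "Bearer " + token})
        urllib.request.urlopen(req)   # the HTTPS request described above
```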
  • a secondary user, who can either be a primary user of another device 10 or be provisioned as a backup contact 26 in a group of the primary user, can likewise arm and disarm the security and notifications of the device, based on the criteria set by the primary user 22 .
  • the system 14 may enable connection between the system and third parties, such as Facebook or LinkedIn, to learn the location of a user, and to learn who may be potential backup contacts, for example. If a user connects their Facebook account to the system 14 , the system can scan the images of all of their friends on Facebook and ask the primary user whether they should be designated as backup contacts 26 , for example. Based on the scanned images, the device 10 and the system 14 may be able to recognize the friends from Facebook if they enter the home 12 , by matching an image or video captured by the image sensor 60 against the scanned image, for example. The primary user 22 can then be informed of who entered their home 12 .
  • the system 14 may create an API to allow the public and/or other users of their own devices 10 to gather both generalized, anonymized data as well as specific data on individuals' lives, in order to better understand their own environment.
  • This open security platform can be used by other security companies who wish to make devices to connect to the system 14 and/or to use the aspects of the system 14 for other security or non-security related benefits.
  • Other third party information that may be obtained and used by the system includes weather.
  • For example, temperature, humidity, and air quality thresholds may be changed by the system 14 based on the local weather.
  • Local crime warnings may also be used by the system 14 .
  • the system 14 may send notifications to primary users 22 under circumstances where the user preferences state that notifications should not be provided, if it is known that crime has recently increased in the area, for example.
  • Another embodiment of the invention is the interaction between the system 14 and the user's social group/friends, family or neighbors that they choose to include in the system.
  • the primary user 22 may designate, through the App on the user device 24 , individuals whom the primary user knows as their ‘social group,’ and identify those users as backups to the primary user notification. These backups are placed in specific ‘groups’, which work together to look into each other's homes from a timeline perspective and help monitor each other's safety and security.
  • the system 14 may also automatically designate people identified as friends of the primary user 22 on Facebook, for example.
  • These groups can be anywhere from 2 people to 10 people or more, but are intended to be a small social circle that enables a community of people to respond from a security point of view.
  • Members of the groups receive the notifications from the system 14 /device 10 either after the primary user 22 or, in the case of high priority notifications, at the same time as the primary user. This enables a crowd-sourced response, where multiple people may respond to the notification to ensure that there is an accurate and rapid response.
  • a primary user 22 can cede primary user status and provide temporary “keys” to pass ownership and control over the software and hardware that make up the device over a pre-determined period of time.
  • These digital keys would allow a recipient to have temporary access to the device 10 when someone other than the primary user would be in or using the home, such as when renting out a primary user's home to a temporary guest, or to give access to a friend or family member to check on a primary user's home when they are on vacation.
  • If a primary user 22 does not respond to a notification after a predetermined length of time, the notification would be sent to some or all members of the group by the system 14 , for example.
  • Members of the group may be able to either see what is going on in the location, or to see a written description of what caused the notification, for example.
  • the primary user 22 may determine how much information members of the group receive by setting appropriate preferences. Multiple group members can then work together to resolve an incident, as a virtual neighborhood or virtual community watch, and share in the responsibility of responding and reacting to a security or safety-related incident.
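A minimal escalation sketch, assuming a primary-user response window as described above; the function names and delay value are hypothetical:

```python
# Hypothetical escalation: notify the primary user first, then the backup group
# if the primary does not respond within the predetermined length of time.
import time

def notify_with_escalation(send, primary: str, group: list, message: str,
                           delay_s: int = 300,
                           responded=lambda user: False) -> None:
    send(primary, message)
    time.sleep(delay_s)              # predetermined length of time
    if not responded(primary):
        for member in group:         # some or all members, per preferences
            send(member, message)
```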
  • a primary user 22 may be a member of multiple groups.
  • a group may comprise two users, or a home owner and a secondary family member in their location or in another location, for example.
  • the group members may also have access to the timelines of other members in the group, and may add or subtract members from the group (depending on group settings and preferences).
  • the groups are a way for people to bring along people that they know and trust already to help them with security and safety, such as their parents, children, neighbors, and friends.
  • a group could comprise some or all of the inhabitants of a physical building or part of a building, such as an apartment building or office. This group may share limited information, such as confirmed incidents, or temperature, humidity, and air quality readings. Multiple members of the building or office would share the feed from the device(s) in the office, and share in the notifications or response. Knowing what is going on in their physical location would be beneficial to all members of the group.
  • a person can be a member of the group even if they have no device or no membership. For example, they can be someone's backup, and still have the ability to participate in other people's security. In this case, they may not, however, have an input of their own.
  • a group may also be a community and may include an input for the police or other authorities. Members of the group would receive instant notification should an incident be confirmed by the owner, or should a high heat, carbon monoxide, or other life-safety related event be detected. Appropriate government authorities or non-governmental support companies/entities can be contacted directly or indirectly from the device 10 and/or the system 14 to inform/report an incident if the incident is deemed out-of-the-ordinary. Alternatively, the device 10 /system 14 may be configured by the primary user 22 to alert authorities if notifications remain unacknowledged by the primary user or group members.
  • a second layer aside from clusters of neighbors watching over each other is the interaction between the system 14 and local authorities.
  • a dedicated software App or website available to police and other authorities could feed data collected by devices 10 concerning crime and other safety information directly from a user's home to relevant groups.
  • Appropriate government authorities or non-governmental support companies and entities can be contacted directly or indirectly from the device 10 or system 14 , to inform and report of an incident that is deemed out-of-the-ordinary.
  • the device 10 and/or system 14 can be configured by the primary user 22 to alert authorities if notifications remain unacknowledged by the primary user or group members, by setting appropriate settings.
  • the primary user 22 and other users may communicate with the device 10 and system 14 via their mobile device or other processing device, such as a laptop or desktop computer, through an App on the device.
  • a user controls the settings for the device, receives notifications when something occurs that is out of the ordinary, and has the ability to respond in an appropriate fashion.
  • the features of the mobile application work in concert to provide a holistic view of what occurs in one's home and the necessary tools to effectively monitor one's home in different situations.
  • Specific notifications are transmitted to the user on their mobile device, tablet, computer, or other web-connected device.
  • the messages can be sent via email, phone, text, or in-App message based on user preferences.
  • notifications or alerts are pushed to the user to apprise them of relevant incidents generated by the system.
  • notifications include device notifications. These are notifications related to the system that are not generated by human interaction. For example, a device notification would be sent to a user if the system loses power or if the system is disconnected from the Internet. There are also several classes of notifications that would be sent to a user that are not related to device functionality. These are non-device, non-human-interaction generated notifications, such as: third party weather information; a reminder to pay for a service plan; a notification that allotted data storage is reaching capacity; SDK/API related notifications; software updates; confirmation of backup invitations; when notifications escalate to a backup contact; and/or confirmation of additional user notifications.
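A sketch of preference-based routing across the channels named above (email, phone, text, in-App); the handler names are assumptions:

```python
# Hypothetical delivery routing: send each message over the channels the user
# has selected in their preferences.
def route_notification(user_prefs: dict, message: str, handlers: dict) -> None:
    for channel in user_prefs.get("channels", ["in_app"]):
        handlers[channel](message)    # e.g. handlers["email"], handlers["text"]

handlers = {"email":  lambda m: print("email:", m),
            "text":   lambda m: print("text:", m),
            "in_app": lambda m: print("in-App:", m)}
route_notification({"channels": ["email", "in_app"]},
                   "Device lost power", handlers)
```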
  • the device 10 and system 14 may operate differently in different modes of operation.
  • the modes of operation may include a home mode, an away mode, a nighttime mode, a privacy mode, and/or a vacation mode. Modes may be changed directly on the device 10 via the capacitive switch 114 and/or via the App in the user device 24 , for example. Each mode may be represented by a color displayed by the RGB LED 98 , for example.
  • the device 10 and the system 14 can be enabled to still watch over a user's home without notifying a user of every instance of an event.
  • a user can set Night Time Mode, during setup or at a later time, to only receive critical alerts, such as when the device 10 detects movement or a high spike in temperature, for example, and have all other alerts delayed until the next morning.
  • the device 10 and the system 14 may be in high alert mode.
  • the system can automatically sound the siren 86 when motion is detected instead of waiting for a user, or backup contact, to resolve the event.
  • escalation preferences are suspended with the primary user 22 and backup contacts receiving notifications simultaneously.
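A sketch encoding the mode-dependent behavior described above; the mode names follow the description, while the rule table itself is an assumption:

```python
# Illustrative per-mode rules: night mode delays non-critical alerts, while the
# high alert mode sounds the siren on motion and notifies backups simultaneously.
MODES = {
    "night":      {"critical_alerts_only": True,  "siren_on_motion": False},
    "high_alert": {"critical_alerts_only": False, "siren_on_motion": True,
                   "simultaneous_backup_notify": True},
}

def handle_motion(mode: str, sound_siren, notify) -> None:
    """React to detected motion according to the active mode."""
    rules = MODES.get(mode, {})
    if rules.get("siren_on_motion"):
        sound_siren()    # sound the siren 86 without waiting for a user to resolve
    notify(simultaneous=rules.get("simultaneous_backup_notify", False))
```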
  • a primary user may cede primary user status and provide temporary digital “keys” to pass ownership and control over device 10 and the system 14 to the backup contact or other party for a pre-determined period of time.
  • These digital keys would also allow a recipient to have temporary access to the device 10 in instances where someone other than the primary user would be in or using the home, for example, when renting out a primary user's home to a temporary guest or to give access to a friend or family member to check on a primary user's home when they are on vacation.

Abstract

Providing information about a monitored environment comprises monitoring a location by a plurality of sensors, collecting data from the plurality of sensors by a processing device, and providing information to a user of the status of the location based on data collected by at least some sensors, for display on a user device, via a network. A user may be enabled to scroll through updates in the order that the information was provided, on the user device, by an App, for example. Providing video clips from a location to a user device comprises storing first video clips related to respective events at the location that the user has been informed of and/or receiving a request to provide stored video clips of the location over a time period, from the user, retrieving the stored video clips, and providing the retrieved video clips to a user device for display.

Description

    RELATED APPLICATION
  • The present application claims the benefit of U.S. Provisional Patent Application No. 61/815,223, which was filed on Apr. 23, 2013, is assigned to the assignee of the present application, and is incorporated by reference, herein.
  • FIELD OF THE INVENTION
  • Home security and monitoring devices and systems and, more particularly, home security and monitoring devices comprising a plurality of sensors, and systems having learning capabilities.
  • BACKGROUND OF THE INVENTION
  • Home security systems typically include a door alarm, which can trigger an audible sound, such as a siren, when the door is opened while the system is armed. The door alarm may be triggered when a contact between two surfaces is opened when the door is opened. Similar contacts may be provided on windows, to trigger an alarm when a window is opened while the system is armed. Sensors can also be provided to detect when the glass of a window is broken. Besides triggering a siren or other sound to warn occupants of the home, the home security system may inform a call center that an alarm has been triggered. A user may enter a code into a device that communicates with the call center, to inform the call center that an alarm is a false alarm. The call center may also call a home to inquire about an alarm. The call center may call the police if a user does not respond. Such home systems are prone to false alarms, which can be annoying to residents and neighbors, and waste the time and resources of authorities, such as the police and fire department. Other types of home security devices are also known, such as fire, smoke, and carbon monoxide detectors, for example. Often, such devices are managed and positioned by a user, and provide a siren or other such sound if triggered.
  • Webcams may be provided to monitor children, baby sitters, and other activities in a home. Webcams provide a large amount of often irrelevant video that needs to be reviewed, which can be time consuming even if using a fast forward function.
  • Home automation systems allow for remote control of lights, thermostats, and security devices.
  • SUMMARY OF THE INVENTION
  • There is no known integrated device comprising video and one or more sensors that can provide a more complete understanding of what is happening in a physical location, such as a home, office, or other location.
  • In accordance with an embodiment of the invention, a device may transmit audio, video, and/or sensor data within one or multiple streams, via Wifi or other wireless communications, simultaneously, to a processing system for analysis. The processing system may be a discrete entity or may be a cloud based system, for example. The sensors in the device may include a temperature sensor, humidity sensor, air quality sensor, motion detector, and/or other sensors, as described below. Since contemporaneous data from multiple or all of the sensors may be analyzed together, a more complete understanding of what is happening at a location may be developed than when data from a single sensor is analyzed, as in the known prior art. The device may include a camera (for video), a microphone (for audio), a passive infrared sensor (to detect motion), and life safety sensors, such as air quality monitoring sensors, carbon dioxide monitoring sensors, temperature monitoring sensors, humidity monitoring sensors and/or other air quality or atmospheric sensors, for example. A processing device is provided in the device, as well, and an additional processing device in the home is not needed.
  • Via the multiple sensors within one device, the device is able to combine the data feeds from the sensors simultaneously to create a clear picture of what is occurring in the one location. For instance, by putting a video feed in the same device as a temperature sensor, it can be determined whether movement of people or actions of people impacts the temperature of a given location. The device may gather precise readings because of the physical proximity of the device, and provide more accurate readings and more immediate data when included in the same data stream via WiFi and/or cellular networks. Simultaneous sensor feeds also mean that the user can have a complete picture of the audio, visual, and sensor readings from a physical place at any given time, and more accurately detect patterns of change, or the impact of one sensor or activity on another sensor or activity.
  • In accordance with an embodiment of the invention, the combined functionality assists in determining what is ordinary and what is out-of-the-ordinary in a location. This assists in decreasing false alarms, and identifying events that a user wants and/or needs to be informed of.
  • Embodiments of the invention provide a network-based, such as a cloud-based, communal security layer to the Internet, powered by integrated and intelligent device/s used to connect people to their homes, other people, and places that are important to them, enabling peace-of-mind, a sense of safety, security, and a detection of situations that deviate from the norm.
  • In accordance with another embodiment of the invention, a virtual neighborhood community watch may be defined to enable groups of people to look after each other via alert and notification-based actions for preventing, reacting to, and responding to, life-safety events and other events, for example.
  • The combined functionality provides an exceptional perspective on what is happening in a location, specifically identifying both what is ordinary and what is out-of-the-ordinary. Likewise, many of the individual elements of the invention, as outlined here, are new ways of dealing with old issues of security, connectivity, personal welfare, and safety.
  • In this regard, the security device facilitates user monitoring of a physical place, through the collection of continuous information from the location when the device is powered. This information is analyzed by the device and/or a processing system, such as a cloud based processing system, to determine if an event has occurred that is of interest to the user. A user may be one or more members of a household, for example. There may be a primary user or users, who receive initial notifications concerning events happening in a location and determine when notifications are to be sent, and backup users, designated by the primary users, to receive notifications when the primary user does not respond, for example.
  • An event of interest can be a negative event, such as a security or safety incident, including a life-threatening incident such as a fire or an attacker, for example. An event of interest may also be a neutral or positive incident, such as watching one's child. In accordance with embodiments of the invention, an event of interest may also be an event that is out of the ordinary, such as an unknown person entering the home or a known person entering the home at an unexpected time, for example. The overall importance of an incident may be determined both by an individual user as well as compiled information from a multitude of users. The device, therefore, acts as a portal connecting a physical place to the web and to internet/cloud-based services.
  • Events may be defined in multiple ways. For example, events may be defined by the system, by defining thresholds for temperature, rate of temperature change, air quality, or movement, for example. In defining events, the system may take into account crowd-sourced information collected from a large number of devices at a large number of locations, for example. The crowd-sourced information applied to a particular user may be derived from similarly situated users, such as users in one bedroom apartments in New York City, users in suburban homes outside of Washington D.C., etc. For example, based on crowd sourced information from locations near a particular user, the system 14 may learn and apply criteria, such as thresholds, for single sensors and combinations of sensors, for that user.
  • Events may also be defined by individual user input, allowing the user to essentially ‘program’ the device to learn the user's personal preferences and how they interact from a security and safety perspective. Users may provide input to the system to define normal or acceptable activities, and/or to confirm or modify system defined and/or crowd-sourced defined events.
  • In accordance with another embodiment of the invention, the device and/or system record a series of events, including data from a sensor (or multiple sensors) identified as of interest based on being above/below a threshold or an out of the ordinary event, for example. The data is then sent to the processing system, which determines, among other things, the time of day of the incident, who is in the home or location, or when the event happened. Individual notifications sent over the course of a day, for example, may be displayed to a user on a user device in the form of a timeline of events, so that a user can monitor the activities that have taken place in the location. This further assists the user in understanding what happens in a location.
  • In many homes, especially apartments or smaller units, a single security/monitoring device can be used. In larger locations, such as large rooms, hallways, and homes with multiple rooms, multiple devices can be used.
  • In embodiments of the invention, the device connects to a home network or directly to a user via their smartphone, other mobile device, and/or computer. The device may constantly be on and gathering data, which means that it will be a continual source of information for what is occurring in a single location, and for determining if anything is out of the ordinary. Alternatively, the user may turn off the device, when desired, either directly or remotely, via their smartphone, etc.
  • In accordance with an embodiment of the invention, processing of collected data may take place on the device and/or in a processing center. For example, relevance-confirming analytics can be performed on the device, such as through video, audio, and motion sensing, to determine whether an event might have taken place and whether the data should be sent to the processing center via the network for further analysis. The processing system then determines if an event or incident within the range of the device is of interest to the owner of the device, either by being a deviation from what is normal or by being another event that gives insight into what is taking place inside the home, for example. Inputs from one or more sensors may be used to determine if there is an incident or an event worth informing the owner of the device about, by receiving and analyzing sensor data from multiple sensors.
  • In embodiments of the invention, the processing system communicates through the network directly with a user or group of users, or to the user's backup friends or family, or to the designated government or private authorities, such as the police and fire department, who can assist the user in responding to the event. Notifications can come in the form of a direct communication such as an email, text message, or phone call, through a network-based portal dedicated to public safety officials or first responders, or through a software application, which is accessible by select individuals. Having immediate feedback from physical locations may greatly reduce the amount of time required for response and enables the authorities to help save lives.
  • In accordance with embodiments of the invention, the device does not require any installation or connection to a wall or wired home system. In addition, the device in accordance with certain embodiments does not require a professional installer, or any installation at all. It need not be connected to a wall or wired home system, although that is an option. The device does not need to be permanently or semi-permanently affixed to a surface, such as a wall, doorframe, door, floor, ceiling, or a window, for example, although that is an option if desired. In accordance with an embodiment of the invention, it may be placed on a surface, such as the floor, a table, or a bookcase, for example.
  • In accordance with one embodiment, the device can be connected to the Internet through Wifi or through another network, such as a 3G or 4G service from a cellular operator, for example, and through that connection perform all of the necessary functions of a monitoring, security, and/or safety system, as well as of a home connected device.
  • DESCRIPTION OF THE FIGURES
  • FIG. 1 is an example of a security system in accordance with an embodiment of the invention;
  • FIG. 2 is a block diagram of an example of a monitoring/security device in accordance with an embodiment of the invention;
  • FIG. 3 is a block diagram of another example of the device of FIG. 1;
  • FIG. 4 is another example of a block diagram of the device of FIG. 1;
  • FIG. 5 is a perspective view of a monitoring/security device in accordance with an embodiment of the invention;
  • FIG. 6 is a bottom view of the device, in accordance with the embodiment of FIG. 5;
  • FIG. 7 is a rear perspective view of the device, in accordance with the embodiment of FIG. 5;
  • FIG. 8 is a front view of the device, in accordance with the embodiment of FIG. 5;
  • FIG. 9 is a front perspective view of the main board of the device, in accordance with the embodiment of FIG. 5;
  • FIG. 10 is a front view of a disk that may be provided in front of the camera optics in the device, in accordance with the embodiment of FIG. 5;
  • FIG. 11 is a perspective view of the main board with a PIR lens removed, in accordance with the embodiment of FIG. 5;
  • FIG. 12 is a rear perspective view of the main board, the bottom board, and the antenna board, in accordance with the embodiment of FIG. 5;
  • FIG. 13 is a cross-sectional view of the device, in accordance with the embodiment of FIG. 5;
  • FIG. 14 is a top view of the antenna board, showing a Bluetooth antenna, a WiFi antenna, and a capacitive switch, in accordance with the embodiment of FIG. 5;
  • FIG. 15 is a view of the main board, showing an actuator for an IR filter, in accordance with the embodiment of FIG. 5;
  • FIG. 16 is a schematic representation of two IR filters for selective placement in front of an imaging sensor, in accordance with an embodiment of the invention;
  • FIGS. 17A-17C show a flowchart of an example of a setup procedure, in accordance with an embodiment of the invention;
  • FIGS. 18 and 19 show a flowchart of an example of a method of the operation of the device and the system in response to an event detected by the device;
  • FIG. 20A is another flowchart of an example of a method of operation of the device and/or system when an event is detected, in accordance with an embodiment of the invention;
  • FIG. 20B is another example of a learning procedure, in accordance with an embodiment of the invention;
  • FIG. 21 is a schematic diagram of an example of components of the device involved in video processing, in accordance with an embodiment of the invention;
  • FIG. 22 is a schematic diagram of an example of the components of the system involved in video processing, in accordance with the embodiment of the invention;
  • FIG. 23 is an example of a notification provided to user device of a primary user as displayed by the App, in accordance with an embodiment of the invention;
  • FIG. 24 is an example of a notification displayed on a user device that states that a person arrived home and when, in accordance with an embodiment of the invention;
  • FIG. 25 is an example of a timeline of events that can be displayed by the user device via the App, from data received from the system and/or the device, in accordance with an embodiment of the invention;
  • FIG. 26 is a timeline of notifications, in accordance with an embodiment of the invention; and
  • FIG. 27 is a flowchart of an example of a method for obtaining video clips from the system, in accordance with an embodiment of the invention.
  • DETAILED DESCRIPTION
  • FIG. 1 is an example of a security system 100 in accordance with an embodiment of the invention. In this example, the system 100 comprises one or more monitoring/security devices 10, each positioned in one or more locations within a home 12. One or more devices 10 may be positioned within a single room 14 or in multiple rooms of the home 12, depending on the size of respective rooms, the areas of the home of activity and/or concern, the degree of security desired, the locations of entrances to the home, where children play, where a baby, children, older people, or sick people sleep, etc.
  • In the example of FIG. 1, one device is placed near a corner of the room 14, and the other is placed on a table near an opposite corner of the same room. One or more devices may be placed in other rooms of the home, as well. Devices 10 may also be used in office buildings, stores, individual offices, hospital and hospice rooms, and any other location that it would be desirable to monitor for security or other concerns. For example, devices 10 may also be placed in outdoor locations, such as backyards, decks, patios, etc. As noted above, installation is not required to attach the device 10 to a wall, doorframe, door, floor, ceiling or a window, for example, although that is an option if desired. The device 10 may be placed on a flat surface, such as a floor, table, desk, bookshelf, etc.
  • The one or more devices 10 in the home 12 in this example communicate with a security processing system 14 via a network 16. The network 16 may be the Internet, for example. The devices 10 may communicate with the network 16 wirelessly, such as by WiFi, or through Ethernet, for example. The security processing system 14 may comprise a processing device 18 and a database 20, for example. The processing device 18 may comprise one or more computers or servers, for example. Multiple databases 20 may also be provided. The security processing system 14 may be a discrete entity connected to the network 16, or it may be a cloud-based system, for example. References to a “cloud” or “cloud-based” system or network refer to well-known distributed networks operated by cloud-based applications.
  • FIG. 1 also shows a primary user 22 with a user device 24 outside of the home 12 that can communicate wirelessly with the devices 10 and/or the security processing system 14 via the network 16, through WiFi and/or a cellular network, for example. Primary users 22 are users with complete access to the device 10 and the system 14. Primary users 22 have the ability to set preferences and customize the activities of the device 10 and/or the system 14. Household users are family members or other persons who reside in the house 12 or are otherwise associated with the location, such as an office, who interact with the device 10 and/or the system 14 on a regular, such as daily, basis. One important function of a primary user 22 is to receive information and notifications concerning the status of the home 12, as detected by the device 10, including notifications of potentially serious and life-threatening situations, and respond to them, including instructing the device 10 and/or the system 14 how to react to the situation. In addition, the primary user 22 can teach the device 10 and/or the system 14 how to react to data detected by the device 10.
  • The primary user 22 may be near the home 12, at their place of business, or in any other location that can access the network 16. The user device 24 may be a mobile processing device, such as a smartphone or tablet, for example. The user device 24 may also be a desktop or laptop computer in the home 12 or outside of the home 12, such as an office. A user 22 may have multiple devices that can communicate with the devices 10.
  • FIG. 1 also shows a backup contact 26 with their own user device 24. A backup contact 26 has limited interaction with the system 14 when the primary user 22 does not respond to notifications provided by the system 14. Primary users 22 have an option to select or invite backup contacts during setup and at other times. The primary user 22 determines the circumstances when backup contacts receive notifications. In most instances when a primary user 22, or a household user, is unable to respond to a notification, a backup contact would be sent a notification and have the opportunity to resolve the incident. In one example, a backup contact that accepts the invitation would be redirected to a web portal to create an account. When a backup contact receives a notification by text or email, they are taken to a unique URL and presented with information about the event that triggered a notification, and have the ability to respond to the event notification based on primary user preferences.
  • Multiple backup users can work together to resolve an incident. In one example, a primary user 22 may select up to three backup contacts and will also be able to change the roster of backup contacts from time to time when the need arises (for example, when a primary user is on vacation with whomever they have designated as their first backup). Users will be able to toggle between which backup contact receives a notification first after an event has escalated, based on numerical order; for example, backup contact one would be the first backup contact to receive an escalated notification, and thereafter backup contact two would receive the notification if backup contact one has failed to resolve the event. The primary user 22 sets the escalation delay between contacts.
  • In addition, the devices 10, the security processing system 14, and/or the primary user 22 and the backup contacts 26, may communicate with first responders, such as the police 28, the fire department 30, and/or an ambulance service 32, for example, via the network 16. Alternatively, the devices 10, the processing system 14, and/or the user devices 24 may communicate with a call center 34, which may in turn communicate with the police 28, the fire department 30, the ambulance service 32, and/or the other parties described herein, via the network 16 or other networks, such as telephone networks. For example, the call center may send notifications to primary users 22 and backup contacts 26 via text, email, and/or phone call to their respective user devices 24. The system 14 may also contact home or office email addresses or phone numbers, for example.
  • If the primary user 22 does not respond to a notification concerning their home 12, for example, one or more backup contacts 26 may be contacted. If the backup contact or contacts 26 do not respond, then the system 14 may instruct the call center 34 to contact the police 28, fire department 30, and/or the ambulance 32 service, depending on the situation, or these parties may be contacted directly by the system 14. The system 14 or the call center 34 may then attempt to contact the primary user 22 to inform the user of the potential event and the actions taken.
  • FIG. 2 is a block diagram 50 of an example of a monitoring/security device 10 in accordance with an embodiment of the invention. In this example, the device 10 comprises a main printed circuit board (“PCB”) 52, a bottom printed circuit board 54, and an antenna printed circuit board 56. A processing device 58, such as a central processing unit (“CPU”), is mounted to the main PCB 52. The processing device may include a digital signal processor (“DSP”) 59. The CPU 58 may be an Ambarella digital signal processor, A5x, available from Ambarella, Inc., Santa Clara, Calif., for example.
  • An image sensor 60 of a camera, an infrared light emitting diode (“IR LED”) array 62, an IR cut filter control mechanism 64 (for an IR cut filter 65), and a Bluetooth chip 66 are mounted to a sensor portion 68 of the main board 52, and provide input to and/or receive input from the processing device 58. The main board 52 also includes a passive IR (“PIR”) portion 70. Mounted to the passive IR portion 70 are a PIR sensor 72, a PIR controller 74, such as a microcontroller, a microphone 76, and an ambient light sensor 80. Memory, such as random access memory (“RAM”) 82 and flash memory 84, may also be mounted to the main board 52. A siren 86 may also be mounted to the main board 52.
  • A humidity sensor 88, a temperature sensor 90 (which may comprise a combined humidity/temperature sensor, as discussed below), an accelerometer 92, and an air quality sensor 94, are mounted to the bottom board 54. A speaker 96, a red/green/blue (“RGB”) LED 98, an RJ45 or other such Ethernet port 100, a 3.5 mm audio jack 102, a micro USB port 104, and a reset button 106 are also mounted to the bottom board 54. A fan 108 may optionally be provided.
  • A Bluetooth antenna 108, a WiFi module 110, a WiFi antenna 112, and a capacitive button 114 are mounted to the antenna board 56.
  • The components may be mounted to different boards. For example, the Wifi module 110 may be mounted to the main board 52, as shown in the Figures discussed below.
  • FIG. 3 is a block diagram 120 of another example of the device 10, comprising two printed circuit boards, a side PCB 122, and a lower PCB 124. Components common to FIGS. 2 and 3 are commonly numbered. In this example, a CPU 126 and a microcontroller 128 are mounted to the side PCB 122, along with the Bluetooth low energy (“BLE”) antenna 108, the WiFi antenna 112, the IR LED array 62, a wide angle lens 132, the ambient light sensor 80, the PIR sensor 72, the accelerometer 92, the capacitive switch 114, the microphone 76, and the speaker 96. The RGB LEDs 98, the humidity sensor 88, the temperature sensor 90, a carbon dioxide (CO2) sensor 134, a carbon monoxide sensor 136, the siren 86, the Ethernet port 100, the audio jack 102, the micro USB port 104, and a reset button 106 are provided on the lower PCB 124.
  • FIG. 4 is another example 150 of a block diagram of the device 10. Components common to FIGS. 2 and 3 are commonly numbered and not further discussed. FIG. 4 further shows a cellular radio 152, and a video encoder 154 connected to the CPU 126. A power management chip 156 is connected to both the CPU 126 and the microcontroller 128. AC power 158 and a battery 160 are connected to the power management chip. A pressure sensor 162 is connected to a microcontroller 164.
  • FIG. 5 is a perspective view of a monitoring/security device 10, in accordance with an embodiment of the invention. The device 10 comprises an outer housing 202 and a front plate 204. In this example, the plate 204 includes a first window 206, which is in front of the image sensor 60. A second window 208, which is rectangular in this example, is in front of the infrared LED array 62. An opening 210 is in front of the ambient light detector 80, and an opening 212 is in front of the microphone 76. The front plate 204 may comprise black acrylic plastic, for example. The black plastic acrylic plate 204 in this example is transparent to near IR greater than 800 nm.
  • The top 220 of the device 10 is also shown. The top 220 includes outlet vents 224 through the top to allow for air flow out of the device 10. FIG. 6 is a bottom view of the device 10, which shows the bottom 226 and inlet vents 228 to allow air flow into the device 10. The top 220 and the bottom 226 of the device 10 may be separate, plastic pieces that are attached to the housing 202 or an internal housing during assembly, for example. Air passing through the bottom, inlet vents 228 travels through the device 10, where it picks up heat from the internal components of the device, and exits through the top, outlet vents 224. In this example, hot air rises through the device 10 by convection, causing air to be drawn into the device through the bottom vents 228 and exit through the top vents 224. A fan 108 (FIG. 2, for example) may be provided to draw external air into the device 10 through the bottom, inlet vents 228 and drive the air out of the device through the top, outlet vents 224, as well.
  • The vents 224 have to be large enough to allow heat to flow out of the unit and not convect back into the device 10, but the vents cannot be so large that a child can stick a finger into the unit. Alternatively, a larger vent is made and the vent is covered with a Gore-Tex or nylon mesh to prevent water ingress but allow air to exit the unit.
  • FIG. 7 is a rear perspective view of the device 10, showing the Ethernet connector 100, the audio jack 102, and a USB port 104.
  • FIG. 8 is a front view of the device 10 with the front plate 204. A front wall 232 that connects the opposing curved edges of the curved housing to each other is shown. The front wall 232 defines an opening having a first, partially circular section 234, an upper rectangular section 236 above the partially circular opening and a second, lower rectangular section 238 below the partially circular section. A smaller rectangular open section 240 is below the second rectangular section 238.
  • The image sensor 60 is behind the partially circular section 234, the IR LED arrays 62 are behind the first and second rectangular sections 236, 238, and the passive IR sensor 72 is behind the smaller, lower rectangular section 240. The partially circular section is behind the first window 206 in the plate 204. The smaller rectangular section 240 is behind the rectangular window 208 in the plate 204. Additional openings 242 and 244 are provided behind the openings 210, 212 in the plate 204, in front of the ambient light sensor 80 and the microphone 76. It is noted that while some of the IR LEDs 62 behind the rectangular sections 236, 238 are partially obscured by the front housing, the LEDs are angled, so that the infrared light emitted by the LEDs is directed out of the device 10.
  • FIG. 9 is a front perspective view of the main board 52. Camera optics 246 are shown in front of the imaging sensor 60 (not shown). The camera optics are supported within a barrel 248. A light pipe 250 is in front of the ambient light sensor 80, to couple the sensor to the opening 242 in FIG. 8. The microphone 76 is below the light pipe 250. Also shown is a rectangular lens 252 in front of the PIR sensor 72. The lens 252 and the sensor 72 are discussed further below.
  • A portion of the bottom board 54 is shown. A flexible printed circuit (“FPC”) 254 connects the main board 52 to the bottom board 54. The rear of the Ethernet port 100 is also shown. Other components shown in FIG. 9 are discussed further below.
  • FIG. 10 is a front view of a disk 256 with an opening 257 that may be provided in front of the optics 246. The disk 256 is painted black to blend in with the black front plate 204, thereby making the imaging sensor 60 and camera optics 246 less noticeable during use in a home 12, for example. The disk 256 may be plastic, for example.
• FIG. 11 is a perspective view of the main board 52 with the PIR lens 252 removed to show the PIR 72 mounted to the main PCB. The light pipe in front of the ambient light sensor is also removed, to show the light sensor mounted to the main board 52. The barrel 248 is also removed, to better show the camera optics 246.
• FIG. 12 is a rear perspective view of the main board 52, the bottom board 54, and the antenna board 56. A heat sink 260 is applied to the back surface of the main board 52. The siren 86 is mounted to the rear surface of the main board 52. Air flowing through the bottom vents 228, past the heat sink 260, removes heat from the heat sink prior to exiting the device 10 through the top vents 224. The imaging sensor 60 is also shown. FIG. 12 also shows the Ethernet port 100, the audio jack 102, and the USB power port 104. The audio jack 102 may be a 3.5 mm barrel jack, for example.
• FIG. 13 is a cross-sectional view of the device 10. The front wall 232 and the position of the front plate 204 are shown. Also shown in this view is an RGB LED 98 mounted to the bottom board 54. Below the RGB LED is a cone-shaped light guide 260 (shown in cross section in this view) diverging from the LED. The circular bottom 262 of the light guide 260 provides colored illumination, indicative of the state of the device, such as armed or disarmed, for example, out the bottom of the device 10.
• FIG. 13 also shows the temperature/humidity (“T/H”) sensor 90/88 and the air quality sensor 94 mounted to an FPC 270, which is mounted to the bottom board 54 in this example. The T/H sensor 90/88 is proximate one of the bottom vents 228, so that external air from the location where the device 10 is located passes over the sensor. An air guide 272 is provided around the T/H sensor 90/88, the air quality sensor 94, and one of the bottom vents 228. The air guide 272 guides external air received through the bottom vent past the sensors, and insulates the sensors from the heat generated within the device 10, as described further below.
• The air guide 272 in this example comprises side walls 272 a and a rear wall 272 b, as shown in FIGS. 9 and 13, for example. The front 272 c of the air guide 272 is provided by the inner surface of the housing 202. The rear wall and the side walls taper inwardly toward the front of the device, so that the bottom entrance to the guide 272 is larger than the exit 274. It has been found that the taper improves the accuracy of the temperature detected by the T/H sensor 90/88. The air guide 272 may comprise acrylic plastic, for example, as discussed further below.
• The housing 202 may comprise anodized aluminum, which provides a high quality appearance and high durability. Since the anodized aluminum may act as an electrical insulator, the inside surfaces of the housing 202 may be polished or masked during anodizing to allow the electrical ground to be tied to the aluminum metal housing. This provides a large sink for electrostatic discharge (ESD) and a partial shield against electromagnetic radiation from the device 10 and electromagnetic susceptibility from external sources. The polished inside surfaces of the housing 202 also reduce thermal emissivity, limiting radiative heat transfer within the unit and thereby better thermally isolating the environmental sensors.
• The image sensor 60 of the camera may comprise a CMOS or CCD image array sensor, which is used as the transducer to digitally capture, transmit, and store images or video. The sensor may have a large pixel size to increase its light gathering capability. Back side illuminated sensors may be used to further enhance the dynamic range in low light conditions.
• In this example, the imaging sensor 60 is sensitive to visible light and also near IR (between 700 nm and 1100 nm wavelengths), which enables the sensor to capture “night” vision images. The image sensor 60 can have either an electronic rolling or global shutter, can achieve at least 30 fps of video at 720p and 1080p resolutions, and can produce still images at resolutions greater than 1080p for greater detail. Pixel binning, long exposures, and high dynamic range imaging (“HDR”) techniques may be used to enhance dynamic range, which helps create accurate and high quality images at low light levels. The image sensor may be an Aptina AR0330, available from Aptina Imaging Corporation, San Jose, Calif., for example.
• The camera optics 246 may comprise a fixed focus, wide angle, multi-element lens to capture visual information about the location. In this example, the angle is 140 degrees. The lens may be optimized for spatial resolution and chromatic aberration. The lens mounts to the image sensor 60 through a screw mount to ensure precise focus during mass production. The lens may be made of high quality glass, such as BK7 or Sapphire, for example, to ensure high image quality. The imaging optics 246 are protected from the environment by a flat exit window made of chemically strengthened or naturally strong glass, such as Gorilla Glass or Sapphire glass, for example. The exit window may be covered with an anti-reflective coating and an oleophobic coating to prevent fingerprints and smudges. The exit window may also have a hydrophobic coating to prevent water droplets or condensation from accumulating and occluding the pictures or video. The window 206 in the black front acrylic plate 204 accommodates the FOV of the sensor 60 and the optics 246.
• A 3-axis or 6-axis accelerometer 92 (FIGS. 2, 4) may be placed on the bottom board 54 of the device, or in another location, to detect motion of the device 10 itself, such as the device being knocked over, for example.
  • Heat generated by components within the device 10, such as the processing device 58, the image sensor 60, and the Wifi module 110, may adversely influence the ambient temperature and humidity measurements by the T/H sensor 90/88. Since heat transfer occurs through conduction, convection, and radiation, embodiments of the invention seek to decrease the effects of these heat transfer modes on the T/H sensor 90/88.
• The air guide 272, discussed above, defines a thermal barrier for the T/H sensor 90/88 and the air quality sensor 94, to isolate them from both conduction and radiation. By mounting the T/H and air quality sensors to the FPC 270, these sensors are isolated from conductive heat transfer. In addition, very thin traces are used on the FPC. The FPC may comprise polyimide, which has a high thermal resistance and decreases the heat transfer that would have occurred if the T/H sensor and air sensor were mounted directly to the bottom board 54. Other sensors may be positioned within the air guide 272 for thermal isolation, as well.
• In this example, the air guide 272 comprises acrylic plastic wrapped in a high thermal reflectivity material, such as polished copper or Mylar, for example. Since the anodized aluminum of the housing has a high emissivity at thermal wavelengths, polishing the inside walls of the housing 202 reduces thermal radiation.
• In addition, the main heat generators, such as the processing device 58 and the Wifi module 110, which are mounted to the main board 52 in this example, are heat “sunk” using a high conductivity material as a heat sink 260, as shown in FIG. 12. The heat sink is thermally coupled to the polished inner walls of the aluminum housing.
• The air quality sensor 94 may be a volatile organic compound (“VOC”) sensor, as is known in the art.
  • A thermistor (not shown) may be provided on the main board and/or the antenna board, for example, to measure the heat generated by the heat generating components, which may be used to correct the temperature detected by the T/H sensor 90/88, if necessary.
• FIG. 14 is a top view of the antenna board 56, showing the Bluetooth antenna 108, the WiFi antenna 112, and the capacitive switch 114. In this example, the Bluetooth antenna 108 and the WiFi antenna 112 are defined by the antenna board 56, as opposed to being sheet metal based, and are placed under the top surface of the top 220 of the device 10. This ensures that the RF energy is not attenuated by the metal housing 202, which is electrically grounded. Alternatively, metal antennas may be integrated into the top metal cap or the main metal housing. Alternatively, the top 220 may be made of plastic, or of metal with plastic pattern inserts or overmolds to define the antenna. A metal top 220 can also act as a heat radiator to keep the inside of the device 10 cool.
  • The capacitive switch 114 may be used to change modes of the device 10 and the system 14, as discussed below.
• Night and dark vision operation is enhanced by illuminating the location with near IR light from the IR LED arrays 62. The near IR LED arrays may emit radiation in the range of 850-875 nm, for example. Light at 850-875 nm is only weakly visible, or invisible, to most humans. However, since the CMOS image sensor 60 is sensitive to this wavelength band, it can respond to these illumination sources by providing well illuminated images, even when the room where the device is located is dark. It may be determined whether it is nighttime based on the light detected by the ambient light sensor 80, for example.
• The IR LED arrays 62 around the image sensor comprise near IR LEDs (typically 850-880 nm) to illuminate the surroundings in night vision mode, without blinding, distracting, or being visible to the user at night. The IR LED arrays 62 provide uniform illumination in the field of view (“FOV”) of the camera. The IR LEDs are placed behind a black acrylic window to remain invisible to the user. The black acrylic window allows only near IR to pass through. Most 850 nm LEDs are partially visible to the human eye because of a wide emission bandwidth that extends down toward the visible spectrum. To avoid this distraction to the user, a band pass or high pass filter may be provided on the inside of the front housing (black acrylic sheet) to block any visible light from passing through the window.
• A near IR filter 65 is provided to block near IR (“NIR”) radiation above 700-750 nm, for example, to enhance spatial resolution and other image quality parameters of the CMOS image sensor 60 during daytime, as shown in FIG. 13 and FIG. 15. It need not be used at night. Movement of the filter 65 may be provided by an actuator 64, as shown in FIG. 15. The actuator 64 may be an electro-optic system, such as is used in LCD shutters, or an electro-mechanical system, such as is used in electromechanical shutters. Operation of the near IR filter actuator 64 may be controlled by the processing device 58, for example, as shown in FIG. 2. FIG. 15 also shows cavities 65 in a supporting structure 67 for respective LEDs in the array 62.
• In accordance with another embodiment of the invention, a second, independent IR filter 282, under the control of another actuator 280 or the same actuator 64 that controls operation of the cut filter 65, may be provided, as shown schematically in FIG. 16. The second filter 282 acts as a narrow band pass filter, allowing only a narrow band of wavelengths to pass through. For example, the second IR filter could be an 850 nm band pass filter with a 30 nm bandwidth. This filter 282 can therefore be used for 3D time-of-flight or structured-light based 3D cameras. The actuator 280 may also be controlled by the processing device 58, for example.
• In an example of a 3D imaging system that may be used in accordance with an embodiment of the invention, the illumination source may be the near IR LED arrays (850 or 875 nm), which would be pulsed at very high frequencies (>10 MHz) using a standard pulse width modulation (“PWM”) circuit. The received “pulsed” or continuous wave images are provided to the processing device 58 or another processing device to compute the depth of an image, for example. To obtain the flight data that could be used to derive 3D information, the LEDs in the array 62 may be pulsed. The time for infrared light to be detected by the image sensor 60 after emission of an LED pulse may be measured by the processing device 58. Infrared light returns from objects further away after it returns from objects closer to the device 10. The 3D information may be used to determine a precise location of a fire or an intruder, for example. It could also be used to obtain an at least partial 3D image of a person, which would assist in the identification of the person based on the person's volume, and can be used in conjunction with video data from the image sensor. The processing of video data is discussed below. The first filter would not be activated while the second filter is activated.
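• The depth computation in such a time-of-flight arrangement reduces to round-trip pulse timing. The following is a minimal sketch, assuming the processing device can measure the delay between emission of an IR LED pulse and detection of its reflection; the names and the example delay are illustrative, not taken from the patent.

```python
# Minimal time-of-flight depth sketch: the pulse travels to the object
# and back (emit -> reflect -> detect), so distance is half the
# round-trip path length.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def depth_from_round_trip(delay_s: float) -> float:
    """Distance to the reflecting object, given the measured
    emit-to-detect delay of one IR pulse, in seconds."""
    return SPEED_OF_LIGHT_M_PER_S * delay_s / 2.0

# Example: a 20 ns round trip corresponds to an object about 3 m away.
print(f"{depth_from_round_trip(20e-9):.2f} m")  # 3.00 m
```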
  • The functions of the processing device 58 include running the operating system and embedded software that runs on the operating system, to exercise the various hardware on the device 10 appropriately, and compressing the incoming raw video signal from the image sensor to a high quality compact stream that can be efficiently transmitted over the Internet, as discussed further below.
• RAM memory 82 may be used to store, copy, and process the large volume of streaming video data coming into the processing device 58 from the image sensor. It can also be used as a work space to perform rapid data analytics locally on the device 10. Flash memory 84 may be used for non-volatile permanent storage of important information related to the location and the operating system.
• As noted above, FIG. 14 is a top view of the antenna board 56, showing the WiFi antenna 112 and the Bluetooth low energy (“BTLE”) antenna 108. Both antennas operate at a 2.4 GHz frequency.
• The PIR sensor 72, along with a fresnel lenslet, is used for motion detection, based on the principles of black body radiation. The fresnel lenslet may be flush against the front black acrylic plastic plate 204 and be color matched with the rest of the black acrylic material. In order to provide an aesthetically pleasing front face of the device 10, an exit window of dark black HDPE material may be provided in front of the fresnel lens, which is clear or whitish. The black HDPE allows wavelengths greater than 8 μm to pass through.
  • Setup Procedure
• In accordance with an embodiment of the invention, the device 10 may be configured or “set up” for use and for the transfer of information between the device 10 and a home network, as well as between the device and a user device 24, via the audio jacks of the user device 24 and the device 10. Since the device 10 does not include a user interface in this example, the user interface or input of the user device, such as a keyboard or touch screen, is used to enter data. Setup in accordance with this embodiment is simple, rapid, and avoids the need for a computer in the setup process.
• FIGS. 17A-17C show a flowchart of an example of a setup procedure 300 in accordance with an embodiment of the invention. The device 10 is plugged in, or its battery power is turned on, in Step 302. The primary user 22 downloads an App to their device 24 and connects an audio cable having 3.5 mm plugs, commonly known as audio jacks, to the device 10 and to the user device 24, in Steps 304 and 306. The cable is connected to the 3.5 mm audio port (stereo sound and microphone) of the user device 24, and to the 3.5 mm audio jack 96 of the device 10. The user device 24 may be a smartphone, tablet, laptop, or other computing device.
  • The primary user 22 can at this point set up the device either through a web interface accessed by the user device 24 or through the App downloaded to the user device 24. The remainder of the method 300 will be described with respect to an App on the user device 24. Use of a web interface is similar.
• When the user opens the App, in Step 308, the App presents an option to Create an Account, in Step 310. Selection of the option causes the App to present a graphical user interface enabling the entry of the user's name, email address, password, and phone number, for example, via the input of the user device 24, such as a touchscreen or keyboard, in Step 312. Other information may be requested, as well, such as how many devices 10 are being set up and where they are located, for example. The creation of the account and the account information may be confirmed by text or email, for example, in Step 314. The user 22 may also be requested to agree to terms of service and to agree to accept push notifications, emails, and/or text messages that are used to inform the primary user of events taking place in the location, as described below. The primary user 22 may be informed of events by phone, as well.
• The user 22 is instructed by the App to position the device in a desired location, in Step 316. Example locations are discussed above.
  • The App requests the device serial number from the device 10, via the audio cable, in Step 318, and the serial number is received, in Step 320. Data is encoded as audio signals to pass from the device 10 to the user device 24, and decoded by the user device 24 via the App. The device 10 may be configured to encode and to decode the audio signal, and respond to it in order to complete the setup process, as well.
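• The patent does not specify the modulation scheme used to carry data over the audio cable. The sketch below shows one plausible approach, binary frequency-shift keying (FSK), where each bit of the serial number or password becomes a short tone; the frequencies, bit rate, and sample rate are illustrative assumptions.

```python
# Hypothetical FSK encoder for passing setup data over the audio cable.
import math

SAMPLE_RATE = 44_100   # audio samples per second
BIT_DURATION = 0.01    # 10 ms per bit (100 bits/s)
FREQ_ZERO = 1_000.0    # Hz tone representing a 0 bit
FREQ_ONE = 2_000.0     # Hz tone representing a 1 bit

def encode_bytes_to_audio(data: bytes) -> list[float]:
    """Return audio samples (floats in [-1, 1]) encoding `data`,
    most significant bit first within each byte."""
    samples = []
    samples_per_bit = int(SAMPLE_RATE * BIT_DURATION)
    for byte in data:
        for bit_index in range(8):
            bit = (byte >> (7 - bit_index)) & 1
            freq = FREQ_ONE if bit else FREQ_ZERO
            for n in range(samples_per_bit):
                samples.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return samples

audio = encode_bytes_to_audio(b"SN123456")
print(len(audio))  # 8 bytes * 8 bits * 441 samples/bit = 28224
```

The receiving side would decode by measuring the dominant frequency in each 10 ms window, which is why both the device 10 and the App must agree on these parameters.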
• The App on the user device 24 will then determine if the user device is connected to a wireless network, in Step 322, in FIG. 17B. If Yes, the App asks the user to confirm that the device 10 should be connected to the same network, in Step 324. The response is received, in Step 326. If the user confirms, the App instructs the device 10 to connect to the same network. The device 10 requests a password for the network, in Step 328. The user enters the password via the input device on the user device 24, such as a touch screen or keyboard. The App encodes the password into audio signals and provides the password to the device 10 via the audio cable. The password is received by the device 10, in Step 330, and the device decodes the password. The device 10 updates the Wifi credentials, restarts the necessary systems, and connects to a processing system 14 via the network, in a manner known in the art. The device 10 then connects to the user device.
• If the user is not connected to a wireless network, in Step 322, or does not confirm that the device 10 should connect to the same network, in Steps 324 and 326, the App instructs the device to search for available networks, in Step 332. The device 10 informs the App of the available networks, in Step 334. The App then asks the user to select a network from the available networks, in Step 336. If the user selects one of the networks, in Step 338, the App requests the network password from the user, in Step 328, and the method 300 proceeds, as described above.
• If the user does not select one of the networks, in Step 338, then the App requests that the user enter a desired network, in Step 340. The user enters the desired network by entering the service set identifier (“SSID”) of the network, for example, via the input device. When the identification of the network is received, in Step 342, the App requests the network password from the user, in Step 328, and the method 300 proceeds, as described above.
  • The network password is provided to the device 10, in Step 344, via the audio cable, and the device connects to the network, in Step 346. The device 10 then connects to the processing system 14, in Step 348.
• After connection of the device 10 to the network 16 and to the system 14, in Step 348, a unique identifier of the device 10 is sent by the device to the user device 24, in Step 350. This enables the user 22 to set up the device 10 as their device in the system, i.e., to be an approved and connected device from a security and administrative perspective.
• A map may be presented to the user 22 via the App to identify the geographic location of the device 10, in Step 352. The App sends the user account information established in Step 312, the serial number of the device 10, and the user's geographic location defined in Step 352 to the processing system 14, which establishes a profile and an association between the device 10 and the user's location, in Step 354.
• Having set up the wireless connection and provisioned the user device 24 to know that it is working with the device 10, the user is instructed to disconnect the audio cable, and the device setup is completed, in Steps 356 and 358. If the device 10 has already been placed in a desired location, the device is ready for operation, without further installation.
• A secure cryptographic key may also be issued between the device and the phone, enabling them to be paired securely. In this case, the device 10 can ‘talk’ to the phone itself and know that it is the phone it paired with, by an exchange of keys.
• The steps of the method 300 may be performed in a different order. Instead of asking the user 22 to confirm that the network the user device 24 is connected to is the network the device 10 is to be connected to, in Steps 316-322, the method 300 may proceed from Step 320 to Step 332, where the device 10 searches for available networks.
  • If a primary user 22 moves the device 10 to a new location, or changes their wireless name or password, they can update that change on the device by again connecting the user device 24 to the device 10 via the audio port 102.
  • Bluetooth low energy (“BTLE”) communication may also be provided for information exchange, in which case the user device 24 and the device 10 will find each other before communication begins. In another set up procedure, the device 10 comprises a wireless router, allowing the user device 24 to connect to the device directly. In this example, the device 10 sends an IP address to the user device 24. The user enters their Wifi SSID and password via the App to connect to the device 10.
• In other examples, Bluetooth pairing, a direct USB connection, or DTMF may also be used to configure the device 10. Information may also be encoded into a flashing light, which is read by the user device 24.
  • Interaction Between the Device and a Home Network
• After setup, as described above, the device 10 may be connected to the user's home network, which may comprise a wireless router, Ethernet, etc., in the home, and the devices connected to it, such as desktop computers, laptop computers, mobile smart devices, etc. The device 10 may receive information and instructions from the home devices connected to the home network and from the user device 24. The device 10 may also initiate outbound interactions, such as sending information to the system 14 and to the user device 24.
• The device 10 communicates with the user device 24 via an App, such as the App discussed above in the setup procedure. Since this App may run in the ‘background’ or have some functionality at all times on the user device 24, it can be used for security or safety functions. For instance, when a user 22 approaches a physical location, automatic arming or disarming of the security system 14 and the device 10 within the home, based on the proximity of the users, may be enabled.
• In this example, when the primary user 22 approaches a predetermined radius around their home 12 or other location where the device 10 is located, the user device 24 sends a signal via the App to the home network that the primary user 22 is in the area and that the device should be ‘ready’ for them to enter. Being ‘ready’ in this example means that the device 10 and the system 14 should not notify the primary user 22 immediately upon entry of the user 22 into the home 12 that someone has entered the location, despite recognition of the entry by the sensors of the device 10. Instead, the device 10 and system 14 may assume that the person entering the location is likely the primary user 22, and should therefore wait another predetermined period of time, such as 45 seconds, for example, to confirm that the user is in the location through the App or wireless network, and disarm itself.
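• A minimal sketch of this proximity-based ‘ready’ behavior follows, assuming the system can compute the phone's distance from the home and poll for confirmation of the user's presence. The radius, data shapes, and helper names are illustrative; the 45 second grace period follows the example above.

```python
# Sketch of the geofenced "ready" state: hold the entry alert while an
# approved user's phone is nearby, and disarm if presence is confirmed.
import math
import time

GEOFENCE_RADIUS_M = 200.0  # assumed trigger radius around the home
GRACE_PERIOD_S = 45.0      # wait before alerting on an entry

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def on_entry_detected(user_pos, home_pos, confirm_user_present):
    """Suppress the alert if an approved user's phone is inside the
    geofence and presence is confirmed within the grace period."""
    if haversine_m(*user_pos, *home_pos) <= GEOFENCE_RADIUS_M:
        deadline = time.time() + GRACE_PERIOD_S
        while time.time() < deadline:
            if confirm_user_present():  # e.g., phone joins the home Wifi
                return "disarm"         # entry was the expected user
            time.sleep(1.0)
    return "notify_primary_user"        # unexplained entry: alert
```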
  • Other information for the user can be sent through the App and other users can use the App. For example, if a user adds a third party as an approved user in their network, that third party can download the App to their own user device 24, and use the same proximity-based geolocation services of the device 10 to arm or disarm the user's system.
• In one example, if authorized by the user, other users listed as primary, secondary, or tertiary users on a user's network may be able to automatically arm or disarm a device 10 without alerting the first primary user 22. A secondary user may be a family member or other person regularly in the home 12.
• In one example, the information may be saved into a ‘timeline’ of events, but the primary user 22 will not be notified as if this were a security incident. Timelines are discussed further below. By including other parties in the interactions with the device 10 and the system 14, security becomes more than simply an individual within their physical network. It becomes a social construct in which a large group of people are listed as ‘trusted’ within an environment. Such a group security, or social security, construct may improve the way people work together to protect each other. By acting together as a community, enabled through technology, the wider community becomes more secure.
• Additionally, by geolocating the cell phones, other preferences in the home can be controlled. For example, the device 10 may act as a hub for other services within a home, and preferences can act differently for different people identified in the system. From a security and safety point of view, specific actions and notifications can be aligned with the geo-location of phones and the confirmation of the proximity of individuals within the location. The individual owner of the account can set preferences for the individuals. An action may be a specific notification: if location tracking of an individual's child, for instance, is activated, then the individual can receive a timeline or graph of the comings and goings of their children. In other words, specific actions may be taken based on the person.
  • An event is a deviation from the norm or a deviation from a predefined set of criteria in a location monitored by the device 10, for one or several of the sensors, the camera (imaging sensor 60), and/or the microphone 76. There are several methods for learning/teaching the system to know what is normal or ordinary in a home or other such location in which the device 10 is situated, and what is not normal or out of the ordinary or a deviation, in accordance with embodiments of the invention.
  • Preset Parameters
• The system 14 may define certain parameters or criteria concerning what is normal and what constitutes an event or incident. For example, the system 14 may define that a temperature above a certain threshold should cause a notification. The parameters may be based on the geographic location of the device 10 and the home 12. For example, the temperature and/or humidity criteria in a geographic location having high average temperatures and/or humidity may be higher than the temperature and/or humidity criteria for geographic locations with lower average temperatures and/or humidity. The temperature parameter may also change seasonally, based on the current date. Air quality parameter levels may be similarly defined by the system 14. The parameters set by the system 14 may include a combination of sensor data. For example, a notification may be sent only when both the temperature and the humidity and/or air quality are above predetermined levels. Criteria and parameters may be stored by the system 14 in the database 20, for example. Criteria and parameters to be applied by the device 10, if any, may be downloaded from the system 14 to the device through the network 16, for storage in the RAM 82 and/or the flash memory 84 or other such memory, for example.
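• A minimal sketch of such preset, geography-aware criteria follows; the threshold values and the region/season keys are illustrative assumptions, not values from the patent.

```python
# Preset criteria: thresholds vary with geography and season, and a
# notification can require multiple readings to exceed their limits.
PRESET_CRITERIA = {
    # (region, season) -> limits; hotter, more humid regions get higher limits
    ("temperate", "summer"): {"temp_c": 35.0, "humidity_pct": 80.0},
    ("temperate", "winter"): {"temp_c": 30.0, "humidity_pct": 70.0},
    ("tropical", "summer"):  {"temp_c": 40.0, "humidity_pct": 90.0},
}

def should_notify(region: str, season: str, temp_c: float, humidity_pct: float) -> bool:
    """Notify only when BOTH temperature and humidity exceed the
    thresholds preset for this geographic region and season."""
    limits = PRESET_CRITERIA[(region, season)]
    return temp_c > limits["temp_c"] and humidity_pct > limits["humidity_pct"]

print(should_notify("temperate", "summer", 36.0, 85.0))  # True
print(should_notify("tropical", "summer", 36.0, 85.0))   # False
```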
  • User Parameters
• The primary user 22 may set certain parameters and criteria for what is considered to be an event and when notifications should be sent. The parameters set by the user 22 can apply to one or multiple sensors, as discussed above. The parameters and criteria may also include a combination of sensor data and circumstances. For example, a primary user 22 can specify that they only want to be informed of an event if no one is home and there is noise above a certain decibel level. Or, a primary user 22 can specify that any time a specific person comes into their home 12, the primary user 22 wants to be notified or wants another person to be notified. In another example, a primary user 22 may determine that any time a certain person or user is inside the home and the temperature goes above or below a certain value, the primary user is to be notified. These are merely examples of the parameters and criteria that a user can set. User parameters may be set during the on-boarding process and/or after on-boarding, where questions are posed to the primary user 22 to establish the user defined parameters and criteria, as discussed further below.
  • Learning Via User Habits, Behavior, and Feedback
  • In addition to user parameters and preset parameters, the device 10 and/or the system 14 is configured to learn from a user's behavior and the activities in a user's home 12, to further define what is normal activity in the home. The processing device 18 in the system 14 and/or the processing device 58 in the device 10 may be configured to learn by software stored on respective storage devices, such as the database 20 and the RAM 82 or the flash storage 84, or other appropriate memory or storage devices.
• The learning process may begin with basic patterns within a home, such as the comings and goings of people within the home based on the day of the week and time of the day. Most people have somewhat standard behavior, which can be understood by the device 10 and/or system 14. For example, the system 14 may know how many people are in the house 12 and who they are, based on user input, on body mass and activity sensed by the one or more sensors of one or more devices 10 in the home, and/or on geo-location data. Individual patterns of those people, such as when they come and go from the location on each day of the week, may be tracked. If a respective person has acted regularly over a period of 2-4 weeks, a pattern may be established for that person. If the primary user is notified of the person's activities in accordance with the pattern multiple times and the primary user clears each event, such as the person entering the home 12 at a particular time, then the next time the person enters the home in accordance with the pattern, the primary user will not be notified of the event. The event may be stored for future reference, such as in a timeline that can be provided to the primary user 22 upon request, if desired.
  • Patterns may be discerned for environmental characteristics of the home or other location based on sensor data, as well. For example, patterns related to temperature, humidity, and/or air quality may be discerned by comparing the data from the temperature, humidity, and/or air quality sensors over a period of time.
• The system 14 may use data from the image sensor 60, which may be in the form of video, for example, along with audio input from the microphone 76 and/or temperature and humidity inputs from the temperature/humidity sensor 90/88, for example, to learn about the activities of a user in a given location, and the level of activity in the location. This enables the system 14 to determine a normal level of activity in a given location, such as in a given room in the home 12. If high activity is normal at particular times of a weekday or a day of the weekend, for example, and the system 14 determines that there is very little activity at one of these times, it may be because the user went on vacation, for example, or it may be that there is a potential problem. The system 14 identifies the deviation from the norm and notifies the primary user 22. The user 22 can then respond appropriately to the notification.
• Additionally, if one device 10 within a location detects a pattern, the pattern can be set for all the devices in the location. One device 10 can also break a pattern set by another device. For instance, if one device 10 does not detect movement and therefore determines that no one is in the home 12 during a certain time of day, but another device 10 within the home detects movement, that input is evaluated on a per-location basis and the pattern of the first device is trumped by the input from the second device. The system 14 accumulates information from multiple devices 10 to determine current activity and patterns of behavior in a location.
• The identity of persons within a location may be determined by the system 14 based on patterns learned about the respective persons. Those patterns include the general size and weight of the user, as determined by data received from the imaging sensor 60 and the PIR sensor 72, for example; the level of heat generated by the person, as determined by the PIR sensor, for example; the voice of the person, as detected by the microphone 76; facial or other pattern recognition of the user based on data collected by the imaging sensor; activity patterns of the user based on learned patterns; or other person-specific data determined via the various sensors within the device 10. Person recognition is important in order to determine the activity and patterns of individual members of the location, and to know who is home, in order to determine if there are individual-specific responses to incidents.
• The presence of pets in a location may be recognized based on video data collected by the image sensor 60 and on infrared heat showing the approximate size and shape of an animal, as detected by the PIR sensor 72, as well as by sensitizing and de-sensitizing certain video pixels, such as by making a portion of the video less sensitive to pets and movement, should there be an area where a pet constantly passes. Questions presented to the user during on-boarding may be used in conjunction with the sensor data to recognize pets. It is noted that pets are a common source of false alarms in home security systems. By learning to recognize the presence of pets, false alarms may be reduced.
• One of the ways the system 14 and the device 10 enable learning is by giving status scores to a home 12 or to activity in portions of the home. For instance, if a user's home 12, or a particular room in the home, is typically very loud and very active at certain times of particular days, based on noise detected by the microphone 76, motion found in the video recorded by the image sensor 60, and/or temperature/humidity fluctuations detected by the T/H sensor 90/88, for example, the home 12 or room in the home may be assigned an activity score of 10/10 for those times of those days. Activity scores may be stored in the database 20 in the system 14, in association with the time period of the particular days the score is applicable to, the room the score is applicable to, and identifying information about the user and/or the user's home, for example. In one example, if a current activity score drops more than 20% below the expected score in a time period, the primary user 22 may be alerted. Other percentage deviations or ranges may be used instead.
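• A minimal sketch of the activity-score check follows, assuming scores are stored per room, day, and time slot as described; the storage shape and the example values are illustrative, while the 20% threshold follows the example above.

```python
# Compare a current activity score against the learned expected score
# and alert when it drops more than 20% below expectation.
EXPECTED_SCORES = {
    # (room, weekday, hour) -> expected activity score on a 0-10 scale
    ("living_room", "saturday", 18): 10.0,
    ("living_room", "tuesday", 3): 1.0,
}

DEVIATION_THRESHOLD = 0.20  # alert when more than 20% below expected

def should_alert(room: str, weekday: str, hour: int, current_score: float) -> bool:
    expected = EXPECTED_SCORES.get((room, weekday, hour))
    if not expected:
        return False  # no baseline learned yet for this time slot
    drop = (expected - current_score) / expected
    return drop > DEVIATION_THRESHOLD

print(should_alert("living_room", "saturday", 18, 7.0))  # True: 30% below
print(should_alert("living_room", "saturday", 18, 9.0))  # False: 10% below
```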
• In another example, a home 12 including an elderly user who lives alone may have a low activity score of 2, for example. If the activity score in the elderly person's home 12 suddenly rises to 5, that may also be indicative of a potential problem, such as a home intrusion, or the rise in the activity score may be caused by a visit from grandchildren, for example. By notifying the elderly user and/or the person's backup contacts of the noted change in activity, the user or backup contacts may inform the system that there is or is not a problem in the home.
• Other characteristics of a location may be assigned scores based on other sensor data, such as scores for temperature, humidity, and/or air quality, for example. By scoring both the sensors individually and the sensors working together, the system 14 can provide a clearer and more complete picture of what is normal or not normal in a home 12 or other location, and respond accordingly.
  • The device 10 and/or the system 14 enable learning by recalibrating how sensor data is processed. For example, the device 10 and/or the system 14 may enable learning by adapting how the processing device 58 and/or the processing device 18 processes data about a location based on one or more pieces of data about the location that are collected by one or more sensors coupled to the processing device, for example.
• On-Boarding Process
  • Embodiments of the invention include a learning-based on-boarding process. In one embodiment of the invention, during the initial stages of ownership of the product such as the first two weeks, for example, there is a rapid learning period, which includes periodic questions to determine information on the home 12 or other environment where the device 10 is located, such as:
  • What is your address?
  • How many people live in your house?
  • How many are adults?
  • How many are children?
  • Do you have pets?
  • If you have pets, how many and what are the types?
  • What time do you usually leave for work?
  • What time do you usually come home from work?
  • Do you have any predetermined days when you arrive home later than or earlier than usual (such as a regularly scheduled meeting or exercise)?
  • What time do your children leave for school?
  • What time do your children usually return home from school?
  • Do your children have regular activities that make them arrive home later than usual?
  • Do you have regular visitors on any days of the week, such as a housekeeper, dog walker, etc.?
  • What time is your mail delivered?
  • What time is your newspaper delivered?
  • Do you use air conditioning? If Yes, is it on a schedule? If so what is the schedule?
  • Is your thermostat on a schedule? If so, what is it?
  • Who will be your back up contacts?
• The questions above are merely examples and are not necessarily in the format that would be presented to a user 22. These questions express the subject matter of possible questions and information that may be desirable to collect. The questions may be presented as multiple choice questions and/or as fill-in-the-blank questions, with dropdown menus and/or windows accepting user-input words, names, times, etc.
• It is not necessary for the user 22 to answer all of the questions, but the more questions answered, the faster the system 14 will learn the normal activities in the home 12, a portion of the home, or another environment, at different times of the day and days of the week. The answers to those questions inform the learning software. For example, if the system 14 knows that the user has pets, it knows it is necessary to differentiate between the movement of a pet and that of the user and other household members.
• Periodic questions to confirm what is learned about the environment may also be presented to the primary user 22. For example, if the system 14 deduces a pattern, such as a user's pattern of coming and going, it can ask the user to confirm that the pattern of activity or learning is accurate. In particular, if the device 10 sees, over a period of days or weeks, that a certain activity, such as people waking up, adults leaving for and returning from work, children leaving for and returning from school, or deliveries, such as newspaper, laundry, and/or mail deliveries, etc., happens at certain times that have not been addressed by a response of the user to the on-boarding questions, then the system 14 can ask the user to confirm that the pattern of activity or learning is accurate. The learning may involve a combination of observed behavior and questions previously answered by the user. Other patterns may relate to temperature, humidity, and/or air quality changes over the course of the day, for example.
  • Score and Rewards-Based Incentives Learning
• Providing a ‘completeness score’ indicating how much the device has learned, and how much is left to truly understand the behavior of the individual, may encourage the primary user 22 to answer more questions, for example. If, over a series of weeks, the device 10 has observed the primary user's behavior and has had a number of questions answered by the user, then the system 14 can inform the user 22 how close it is to knowing their environment by giving the user a completeness score, such as that learning is 80% complete, for example. The system 14 may always be learning, and can learn new patterns, so the completeness score can continue to evolve. The score can be given as a percent or in another form, including natural language.
• A percentage of completeness or other such score may also be awarded for the various tasks the user has to perform as part of the on-boarding learning process, such as answering the questions presented, as well as other activities, such as adding friends and family as backup contacts 26 and confirming the identity of the local police 28, fire department 30, and ambulance service 32, for example.
• Rewards may be based on the completeness of the profile. For example, extra storage in the database 20 may be provided in return for completion of the on-boarding process. Such additional storage may be used to store video clips that a user is interested in, for example.
  • Data Activity Score
• As the device 10 gathers data through its sensors, that data may also be given a sensor activity score. The score is based, at least in part, on a particular point of data from one sensor or a data set from multiple sensors, and on what has been learned about the location. After learning the patterns of a room or home throughout a day and week, such as the frequency of movement or the average temperature, the device is able to determine a baseline of activity or events that are deemed normal in a particular location during particular times of a day. When new data is acquired by the device's sensors, an activity score for that data is determined against the baseline of what the device has learned to be normal at that time of the day and day of the week, in order to identify potential threats. The data is compared to the pattern or patterns by the system 14 and/or the device 10, and the degree of deviation from the normal pattern may be determined, for example. The deviation may be expressed as a percentage deviation, for example. The degree of deviation may also be applied to a scale of 1-10 or 1-100. A higher percent deviation or activity score reflects a stronger deviation from the home's normal baseline. Sensor activity scores may also be indicative of how comfortable the physical environment of a home is, when certain sensor values, such as temperature, deviate from the baseline pattern or from a threshold set by the primary user 22, for example. The user may be informed of the deviation in terms of the degree of deviation, as a percentage or score, for example.
• During operation, the device 10 may collect data from all the sensors continuously while on, or during other time periods. Sensor data may also be periodically collected and stored, at regular or non-regular time intervals. Some or all of the collected data may be analyzed by the processing device 58 on the device 10, and/or sent to the system 14 via the network 16 for analysis or further analysis. Collected and stored data may be sent to the system 14 for analysis continuously, or periodically at regular or non-regular time intervals, as well. The data or data files may be sent by the device 10 to the system 14 with metadata including an identification of the device the data is coming from, along an encrypted channel such as a secure socket layer (SSL) connection. The collected data may be sent in multiple respective data streams or in other formats, for example.
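• A minimal sketch of the device-to-system upload follows, using Python's standard library over HTTPS (which provides the SSL/TLS encryption mentioned above); the endpoint URL and the payload fields are illustrative assumptions.

```python
# Send a batch of sensor readings to the processing system with
# identifying metadata, over an encrypted (TLS) channel.
import json
import time
import urllib.request

def send_sensor_data(device_id: str, readings: dict) -> None:
    payload = {
        "device_id": device_id,   # identifies the sending device 10
        "timestamp": time.time(),
        "readings": readings,     # e.g., {"temp_c": 21.5, "humidity_pct": 40.0}
    }
    req = urllib.request.Request(
        "https://example.invalid/api/sensor-data",  # hypothetical endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # The https scheme gives an encrypted channel, as with SSL.
    with urllib.request.urlopen(req) as resp:
        resp.read()
```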
• FIGS. 18 and 19 show an example of a method 400 of the operation of the device 10 and the system 14 in response to an event detected by the device 10. One or more sensors are triggered and/or motion is detected, in Step 402. In this example, the device 10 determines whether the detected data is to be defined as an “Event” that needs to be further analyzed, in Step 404, based on whether the detected data meets criteria, such as exceeding a threshold or deviating from a pattern, for example. If not, in this example, no further action is taken. If Yes, the device 10 sends the detected data and other information related to the event to the system 14 via the network 16, in Step 406, for further analysis. The data sent by the device 10 includes the data from the sensor or sensors triggering the event, and data from other sensors, which may assist the system 14 in the interpretation of the data from the sensor triggering the event. For example, if the temperature sensor 90 is triggered, a video clip may be sent to the system 14, as well. Identification information related to the device 10, the user 22, the home 12, the location of the device in the home, etc., may also be sent.
  • In this example, the system 14 determines whether the event is a life or safety incident based on the received data, in Step 408. A life or safety event may be detection of a high temperature by the temperature sensor 90, very poor air quality as sensed by the air quality sensor 94, excessive and sudden noise, as measured by the microphone 76, etc. The detected characteristic may be compared to thresholds or other such criteria to determine whether the detected characteristic qualifies as life threatening, for example. The event may be based on a combination of sensor measurements, as well.
  • If the event is a life-safety event, it is then determined if anyone is home, in Step 410. The system 14 may know whether anyone is home based on data previously received from the image sensor 60, the PIR sensor 72, the microphone 76, and geo-location of user devices 24, for example.
• If Yes, then the system 14 treats the event as an emergency and simultaneously contacts the primary user 22 and the backup contacts 26, in Step 412, via email, text, and/or phone, for example. Primary users 22 may determine how they are to be contacted. Different backup contacts may be contacted depending on the type of event.
  • It is determined whether the primary user 22 or backup contact 26 responds within a predetermined period of time, in Step 414. The predetermined time may be one (1) minute, for example. If Yes, it is determined whether one of the contacted parties clears the event as a false alarm, in Step 416. If Yes, the method ends in Step 418 with the event being cleared.
• If the event is not cleared in Step 416, then the system 14 will take further action, in Step 420, such as calling the police 28, the fire department 30, and/or the ambulance service 32, depending on the event. The system 14 may cause the device 10 to activate the siren 86, as well.
• If the primary user 22 or the backup contacts 26 do not respond, in Step 414, they are re-notified, in Step 422, and the method returns to Step 414. If the primary user 22 or the backup contacts 26 do not respond in Step 414 after re-notification, the system 14 may proceed directly to Step 420 to call the police, fire department, or ambulance service.
• If no one is home, in Step 410, it is determined whether it is nighttime, in Step 424. If it is nighttime, the system may still treat the event as an emergency and proceed to Step 412.
• If the system 14 determines that the event is not a life threatening event, in Step 408, or that it is not nighttime, in Step 424, the primary user is notified, in Step 428. If the primary user 22 responds within a predetermined period of time, such as one (1) minute, for example, in Step 430, it is determined whether the primary user cleared the event as a false alarm, in Step 432. If Yes, the alert is cleared, in Step 418, as discussed above.
• If the primary user 22 does not clear the event as a false alarm, in Step 432, the method goes to Step 436 to contact the police, etc., as also discussed above.
• If the primary user 22 does not respond within the predetermined period of time, then the backup contacts 26 are notified, in Step 434. It is then determined whether the backup contacts clear the event as a false alarm, in Step 432. If Yes, the event is cleared, in Step 418. If No, the system contacts the police, fire department, and/or ambulance service, in Step 436.
• Steps 426-436 show an example of how the system 14 can escalate notifications from the primary user 22, to the backup contacts 26, to the police, etc. The escalation procedure may be determined by the system 14 and/or by user preferences. A primary user 22 may set the time period before a backup contact is alerted after the primary user 22 has failed to respond to and resolve a notification, for example. Other escalation policies that can be set by the primary user include: the types of alerts that are escalated (i.e., only alerts from certain sensors are sent to a backup contact to resolve); the time between escalations to further backup contacts 26, if not all backup contacts were initially contacted; automatically sounding the siren if neither the primary user nor a backup contact has resolved an event; and alerting the authorities if no one responds (via a call center backup, for users who opt in to the service plan).
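• A minimal sketch of one escalation pass follows; the notification callables, the polling helper, and the default timeouts are illustrative assumptions standing in for email/text/phone delivery and the user-set escalation periods.

```python
# Escalate an uncleared event: primary user, then backup contacts,
# then the authorities, with a clear-check window at each stage.
import time

def wait_for_clear(event_cleared, timeout_s: float) -> bool:
    """Poll event_cleared() until it returns True or the window ends."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if event_cleared():
            return True
        time.sleep(1.0)
    return False

def escalate(notify_primary, notify_backups, notify_authorities,
             event_cleared, primary_timeout_s=60.0, backup_timeout_s=60.0):
    notify_primary()
    if wait_for_clear(event_cleared, primary_timeout_s):
        return "cleared_by_primary"
    notify_backups()
    if wait_for_clear(event_cleared, backup_timeout_s):
        return "cleared_by_backup"
    notify_authorities()  # police, fire, or ambulance, per event type
    return "escalated_to_authorities"
```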
  • FIG. 20A is another example of a method of operation 500 of the device 10 and/or system 14 when an event is detected in accordance with an embodiment of the invention. In this example, the device 10 performs all the steps of the method 500 except for the learning steps, which are performed by the system 14.
• An event is detected, in Step 502. It is determined whether the event is a sensor related event, such as an event detected by the temperature/humidity and/or air quality sensors, in Step 504. If Yes, the device 10 determines whether the event matches pre-determined criteria for sending an alert, in Step 506. As discussed above, the pre-determined criteria are system defined criteria, such as thresholds. If Yes, then the device 10 sends an alert to the user 22, in Step 508, and the method ends, in Step 510.
• If the device 10 determines that the event does not match the pre-defined criteria, in Step 506, the device determines whether the event matches user defined criteria, in Step 512. As discussed above, user defined criteria are criteria set by the primary user 22. In other words, the primary user 22 informs the device 10 and/or the system 14 that a notification should be sent when particular criteria are met. The user defined criteria may be defined by the primary user 22 through the questions asked by the system 14 and answered by the user during setup and during operation, as discussed above. If the device 10 determines that the event meets user defined criteria, in Step 512, an alert is sent to the primary user 22, in Step 508. Escalation may be followed if the primary user 22 does not respond within a predetermined period of time, as shown in FIG. 19, for example, and discussed above.
• If the device 10 determines that the event does not relate to user defined criteria, an alert is sent, in Step 416, and the event is submitted to the system 14 for learning, in Step 518.
• If the event is not determined to be sensor related by the device 10, in Step 504, then the device 10 determines whether the event is proximity related, such as a person entering the home 12, in Step 516. If Yes, the device 10 determines whether the user wants to be informed of a proximity event, in Step 520. If Yes, an alert is sent by the device, in Step 420, and the event is submitted to the system 14 for learning, in Step 418. If No, an alert is not sent and the event is submitted to the system 14 for learning, in Step 422.
• If the event is not related to a sensor event, a proximity event, or predefined criteria, then the system can also learn from the event. The event may relate to data collected by the camera/imaging sensor 60 and/or the microphone 76, such as sudden or unaccounted-for motion or noise, etc. The event may be presented to the primary user 22, and the user can input information clearing the event or defining the event. In this way, the system 14 learns which events the primary user wants to be informed of, and which events the user does not want to be informed of.
• For example, the video may show a small mass moving close to the floor. When presented with the event, the user may identify this mass as a pet. After the user identifies a small mass moving close to the floor as a pet two or three times, the system 14 learns not to send an alert the next time a similar moving mass is identified.
  • In another example, a person of a particular size and shape may enter the house between 9-10 AM on a Monday. An alert is sent to the user, who clears the alert. The same event takes place on the next two Mondays, which are also cleared by the user. The system 14 can then ask the primary user 22 whether it is necessary to send alerts the next time the same person of the same size and shape enters the house on a Monday between 9-10 AM.
  • In another example, noise of a common low frequency may be detected for 7 or more days in a row. The system 14 can learn that this is a normal level and only send alerts if the noise level suddenly spikes above this usual level, even if the spike does not exceed a preset threshold previously set by the system 14, for example.
  • In another example, the system 14 may detect a loud noise at the same time every day, for 7 days. Each day the primary user 22 clears the noise event. The next day the system 14 detects the same sound at the same time, and does not send an alert to the user.
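• The common thread in these examples is counting user clears of similar events. A minimal sketch follows; the event-signature fields and the threshold of three clears are illustrative assumptions.

```python
# Suppress alerts for an event signature after the user has cleared
# matching events as false alarms several times.
from collections import defaultdict

CLEARS_BEFORE_SUPPRESSION = 3
clear_counts: dict[tuple, int] = defaultdict(int)

def signature(event: dict) -> tuple:
    """Coarse signature so similar events match: e.g., a small mass
    near the floor, or a loud noise in the same daily time slot."""
    return (event["kind"], event["size_class"], event["hour"])

def record_user_clear(event: dict) -> None:
    clear_counts[signature(event)] += 1

def should_alert(event: dict) -> bool:
    return clear_counts[signature(event)] < CLEARS_BEFORE_SUPPRESSION

pet = {"kind": "motion", "size_class": "small_low", "hour": 9}
for _ in range(3):
    record_user_clear(pet)
print(should_alert(pet))  # False: learned not to alert on the pet
```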
• While the device 10 performs all the steps of the method 500 in the description above, the device 10 and the system 14 may both perform steps of the method. For example, the device 10 may perform all the steps except for the learning steps 414 and 422, which are performed by the system 14. In another example, the system 14 is configured to send the alerts in Steps 408 and 416. In another example, the device 10 detects an event and provides the information related to the event to the system 14, which performs the remainder of the steps of the method 500. In another example, the device 10 provides all data collected from all the sensors to the system 14, which determines whether an event has taken place, and performs all the other steps of the method 500.
• While reference is made to the device 10 and the system 14, it is understood that actions taken by the device 10 are performed by the processing device 58 in this example, under the control of software, such as an application stored in memory, while actions taken by the system 14 are performed by the processing device 18, under the control of suitable software stored in memory. As noted above, the system 14 may be a discrete entity or a cloud based processing system.
  • FIG. 20B is another example of a learning procedure 600, in accordance with an embodiment of the invention. Data is received from the sensors in the device 10, in Step 602. The data includes a time stamp stating the time and date of the data, as well as identifying information, such as the device detecting the data and the location of the device in the user's home, for example.
• The time stamped data is stored, in Step 604, and analyzed, in Step 606, to identify potential patterns. Patterns may be identified by methods known in the art.
• Two examples are described in the flowchart 600. The first, Option A, comprises:
  • 1) Defining Curves of Sensor Values over a 24 hour period, for each Sensor;
  • 2) Comparing Curves to Derive Potential Patterns for each Sensor, over Time Periods; and
• 3) Comparing Curves for different Sensors to Derive Potential Patterns for Groups of Sensors or Time Periods.
  • The second Option B comprises (a code sketch illustrating these steps follows the list):
  • 1) Dividing 24 hour day into Predetermined Time Increments, such as 15 minute increments;
  • 2) Comparing Average Sensor Values for each Sensor in each Predetermined Time Increment, for each day;
  • 3) Determining whether Average Sensor Values are within a Predetermined Range, in each Time Increment; and
  • 4) If Yes, defining a Potential Pattern in that Time Increment;
  • 5) Comparing Average Sensor Values for different Sensors in each Time Increment, for each day;
  • 6) Determining Whether Average Sensor Values for at least two different Sensors are within a Predetermined Range, in that Time Increment; and
  • 7) If Yes, Defining a Potential Multi-Sensor Pattern in that Time Increment.
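  • The following is a minimal Python sketch of Option B under stated assumptions: 15-minute increments, a fixed numeric tolerance standing in for the "Predetermined Range," and single-sensor patterns only. Steps 5-7 (multi-sensor patterns) apply the same range test across the averages of different sensors in the same increment.

```python
import statistics

INCREMENT_MIN = 15                         # predetermined time increment
SLOTS_PER_DAY = 24 * 60 // INCREMENT_MIN   # 96 slots per 24-hour day
TOLERANCE = 2.0                            # illustrative "predetermined range"

def slot_averages(readings):
    """readings: list of (minute_of_day, value) for one sensor, one day.
    Returns the average value per 15-minute slot (None if no data)."""
    buckets = [[] for _ in range(SLOTS_PER_DAY)]
    for minute, value in readings:
        buckets[minute // INCREMENT_MIN].append(value)
    return [statistics.mean(b) if b else None for b in buckets]

def potential_patterns(day_averages):
    """day_averages: list over days of per-slot averages for one sensor.
    A slot is a potential pattern if all days' averages fall within
    TOLERANCE of each other (the 'predetermined range')."""
    patterns = []
    for slot in range(SLOTS_PER_DAY):
        vals = [day[slot] for day in day_averages if day[slot] is not None]
        if len(vals) >= 2 and max(vals) - min(vals) <= TOLERANCE:
            patterns.append((slot, statistics.mean(vals)))
    return patterns
```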
  • After a pattern is identified, in Step 606, the primary user is asked to confirm the potential pattern, in Step 608. If the pattern is confirmed by the user, it is stored, in Step 612. If it is not confirmed, it is not stored.
  • Current sensor data is compared to stored patterns, in Step 620. If the current sensor data deviates from the pattern by a predetermined amount, the user is informed, in Step 622. If the sensor data does not deviate from the pattern by the predetermined amount, the user is not informed, in Step 624.
  • The user's response to being informed of a deviation may also be used in the learning algorithm for additional learning, as discussed above. For example, if a primary user 22 is notified of a particular deviation several times and each time the user clears the notification, the system 14 can ask the user whether the user wants to be informed of such notifications. If the user's response is No, then notifications will not be sent for the same deviation again.
  • Video Processing
  • During monitoring of a location by the image sensor 60 of the device 10, there may be long periods of time when the images do not change because nothing is happening in the environment at the location. For example, no one may be home, no one may be in or passing through the room being monitored, or the people in the room may not be moving for at least part of the time they are in the room, such as if a person is sleeping or watching television, for example. In accordance with embodiments of the invention, video recorded by the image sensor 60 is analyzed by the processing device 58 of the device 10 and/or by the processing device 18 of the system 14. In accordance with one embodiment of the invention, video is initially analyzed by the processing device 58 of the device 10 and when interesting video frames are identified, they are sent to the system 14 via the network 12 for further analysis by processing device 18 of the system 14.
  • FIG. 21 is a schematic diagram of an example of components of the device 10 involved in video processing, in this embodiment. FIG. 22 is a schematic diagram of an example of the components of the system 14 involved in video processing. Data collected by the image sensor 60 is provided to the processing device 58 in two identical streams. A digital signal processor 702 of the processing device 58, or a separate digital signal processor, compresses the video in one of the streams, and stores the compressed stream in a storage buffer 704. The storage buffer may be in RAM 82 or other such memory, for example. MPEG-4 video compression may be used, for example.
  • The second stream, which is not compressed, is provided by the image sensor 60 to the video analysis module 706 of the processing device 58. The video analysis module 706 is a software module that determines whether there is change worthy of further processing by the system 14, such as movement in the frames of the video.
  • The video analysis module 706 may quantify the amount of change in the video and compute an "interestingness" score, which may be a weighted function including the available bandwidth between the device 10 and the system 14. The weighted function may be updated based on information/instructions from the system 14. The interestingness score may be provided to the buffer worker module 708. The buffer worker module 708 is a software module that determines which compressed frames are to be sent to the upload buffer for upload to the system 14 for further analysis.
  • If the interestingness score for a chunk of video is greater than a threshold, for example, the buffer worker module moves the video from the buffer storage 704 to an upload buffer, for upload of the corresponding chunk of compressed video to the system 14 via the network 12. The video chunk may be uploaded to the system 14 by Wi-Fi or Ethernet, for example. If the score is less than the threshold, nothing is done. When the buffer worker 708 notices that the buffer storage 704 is near capacity, it deletes videos having the lowest interestingness scores, the oldest videos, or videos selected on another basis, to maximize the information stored in the buffer.
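  • A minimal sketch of the buffer worker logic just described, assuming a fixed numeric threshold and capacity; the data structures and the eviction key (lowest score, then oldest) are illustrative, not mandated by the specification.

```python
from collections import deque

THRESHOLD = 0.5        # interestingness threshold; adjustable by the system 14
BUFFER_CAPACITY = 100  # maximum chunks held locally (illustrative)

storage_buffer = deque()  # entries: (timestamp, score, compressed_chunk)
upload_buffer = deque()   # chunks queued for upload to the system 14

def buffer_worker():
    """Move sufficiently interesting chunks to the upload buffer and,
    when storage nears capacity, evict the least interesting (then
    oldest) chunks to maximize the information retained."""
    for entry in list(storage_buffer):
        _, score, chunk = entry
        if score > THRESHOLD:
            storage_buffer.remove(entry)
            upload_buffer.append(chunk)
    while len(storage_buffer) >= BUFFER_CAPACITY:
        victim = min(storage_buffer, key=lambda e: (e[1], e[0]))
        storage_buffer.remove(victim)
```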
  • Video chunks uploaded to the system 14 are received by a video ingress module 712 in FIG. 22, via the network 12. The video ingress module 712 provides the video chunks to a video processing module 713 of the processing device 18 for video processing. Processing may include motion characterization 714, such as "entering", "leaving", "crossing", and "getting up", segmentation 716, and feature extraction 718. It is noted that the processing may be performed by a discrete entity or a cloud-based system.
  • Segmentation 716 defines the moving object with respect to the non-moving background, over a number of frames. In feature extraction 718, particular features are extracted from the segmented volumes that may facilitate recognition of the moving object. In addition, features are extracted from the non-moving parts of the background for characterization.
  • The output of the video processing module 713 is provided to a learning module 720, which performs recognition to identify the moving objects. A notification module 722 then determines whether the primary user 22 or another party, such as a backup contact 26, needs to be notified of the identified moving object. If so, the notification module 722 sends the notification to the primary user 22 or other such party, via the network 12, in the manners specified by the primary user 22 and the backup contacts 26. If not, then a notification is not sent.
  • The learning module 720 may also provide feedback to the buffer worker 708 in the device 10 via the network 12, to adjust the threshold of the interestingness score or to mask parts of the image to remove them from the computation of the interestingness scores, such as a constantly changing television screen or a ceiling fan. Though in motion, a TV or fan is typically not of interest and can be considered part of the background. If not enough data is being provided to successfully identify moving objects, for example, the threshold may be lowered so that more video data is provided to the system 14 for further analysis. This may become apparent if the system 14 needs to keep asking the user to identify the same person, for example. If the system 14 determines that too much data is being provided for efficient, timely analysis, or it is determined that moving objects can be consistently and successfully identified with less data, the threshold may be raised.
  • During the onboarding phase or when a moving object in a section of video cannot be identified, the learning module 720 may send the video or an image to the primary user 22 with a request to identify whether an object in the video is a person, a child, or a pet. If a person or child, the system 14 may also request an identification of the person or child. The primary user 22 may also be asked whether the user wants to be informed of the presence/movement of that person, child or pet. Based on the feedback from the primary user 22, the system 14 learns the identities of people living in or visiting a home or other location. The system 14 also learns when the primary user does not want to be notified of detected movement and when the user does want to be notified of detected movement.
  • Returning to the operations performed on the device 10, the video analysis module 706 may determine whether the received video indicates movement by comparing a current frame or portion of a frame to a prior frame or portion of a frame, for example. In one example, a grid may be superimposed on an image. For each point in the grid, an 8×8 or other size pixel patch is defined and compared to the same size pixel patch around the same point in the prior or following frame. Differences from frame to frame may indicate movement of one or more objects within the frame. Motion vectors may be generated from frame to frame. If there is no movement, the motion vectors in a patch equal zero (0) and the prior patch with a non-zero motion vector is carried over for comparison with a subsequent patch, as in MPEG-4 video compression.
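  • The grid-based patch comparison can be sketched as follows, assuming grayscale frames held as NumPy arrays; the grid spacing and the per-patch change threshold are illustrative values, not taken from the specification.

```python
import numpy as np

PATCH = 8   # pixel patch size, as in the 8x8 example above
STEP = 16   # grid spacing between patch centers (illustrative)

def motion_score(prev_frame, curr_frame, threshold=10.0):
    """Compare fixed-size pixel patches on a grid between two grayscale
    frames (2-D uint8 arrays); return the fraction of patches that changed.
    A higher fraction suggests more movement in the frame."""
    h, w = curr_frame.shape
    changed = total = 0
    for y in range(0, h - PATCH, STEP):
        for x in range(0, w - PATCH, STEP):
            a = prev_frame[y:y+PATCH, x:x+PATCH].astype(np.float32)
            b = curr_frame[y:y+PATCH, x:x+PATCH].astype(np.float32)
            if np.abs(a - b).mean() > threshold:
                changed += 1
            total += 1
    return changed / total if total else 0.0
```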
  • In another example, feature detection and recognition techniques, such as those used in computer vision, are used by the video analysis module 706. Instead of a grid, in this example, areas of high contrast are identified in a frame, to identify salient image areas. The most salient image areas may be identified in each frame. The salient image areas are compared from frame to frame to detect movement. Frames showing movement are given a higher interestingness score by the module 706.
  • The threshold compared to the interestingness score by the buffer worker may be set in the device and/or provided by the system 14 via the network 12. The threshold may be changed by the system 14, as discussed above.
  • Returning to the system 14, during recognition, a motion analysis module in this example determines "where" the motion is taking place in a frame and an object analysis module determines "what" is moving. Moving "blobs" may be identified across frames, and motion signatures of moving blobs may be determined by defining a motion vector over multiple frames, such as 150 frames, for example. Clustering may be performed of all sets of vectors to group similar motions together. These clusters may form the basis of a dictionary of "motion signatures" against which future motions may be compared.
  • Motion signatures in the dictionary may also be compared to signatures developed during the learning process and named by user input or by comparison against all motions recognized in all devices. Motion signatures for pets will be different from those of children and adults. Entering and Leaving signatures will have common traits across many locations. Motion signatures may take into account speed, motion cues such as gait, and/or the size of a moving object, which may also be stored in the dictionary. A primary user 22 may be asked to identify moving objects or blobs during the learning/onboarding phase, and later when unidentified moving blobs are detected. Dictionary inclusion may be based on term frequency-inverse document frequency and/or K-means training, for example. Significant and commonly occurring features of interest (moving blobs) whose signatures may be stored in the dictionary include a door opening and closing, a person entering and leaving a room, and a pet walking or running through a room, for example.
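  • A minimal sketch of building such a motion-signature dictionary by K-means clustering, using scikit-learn as one possible implementation; the track representation (per-frame displacement vectors) and the number of signatures are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_motion_dictionary(tracks, n_signatures=16):
    """tracks: array of shape (n_tracks, n_frames, 2) holding per-frame
    (dx, dy) motion vectors for each moving blob, e.g. over 150 frames.
    Requires at least n_signatures tracks. Flattened tracks are clustered;
    the cluster centers act as the dictionary of 'motion signatures'."""
    X = np.asarray(tracks, dtype=np.float64).reshape(len(tracks), -1)
    km = KMeans(n_clusters=n_signatures, n_init=10, random_state=0).fit(X)
    return km  # km.cluster_centers_ is the signature dictionary

def nearest_signature(km, track):
    """Match a newly observed motion track against the dictionary."""
    X = np.asarray(track, dtype=np.float64).reshape(1, -1)
    return int(km.predict(X)[0])
```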
  • A second dictionary may be created to store features, such as colors, edges, corners, etc., which may also be extracted from video. This dictionary stores the “what” features of what is moving to recognize objects, such as people and pets, regardless of their motion or position in the video.
  • Feedback may be provided from the learning module 720 to the video processing module 713 during any one or all of the motion recognition, segmentation, or feature extraction steps, to improve the video processing based on the learning process and the success of identifying moving objects.
  • Video files and other data files containing data from other sensors may be sent by the device 10 to the system 14 with metadata including an identification of the device they are coming from, over an encrypted channel such as a secure sockets layer (SSL) connection. In accordance with embodiments of the invention, the device 10 transmits audio, video, and sensor data simultaneously. That is different from currently available home automation systems, for example, which may include multiple independent sensors. The device 10 may transmit a complete dataset, which facilitates rapid response by the system 14. If there are multiple devices 10 operating in a home 12 or other location at the same time, they may work simultaneously to compile the varying sensor data to send to the system 14.
  • A current state of the device 10, including the current video file and current data from the sensors, such as temperatures, humidity, air quality, etc., may be stored in the database 20 in association with an identification of the device, the location of the device determined by geo-location, and the identification of the primary user. Prior states of the device 10 may also be saved in the database 20 or other such storage device or database to facilitate learning, for example. Backup contacts, family members, group members, and notification rules may also be stored in the database in association with the identification of the device 10 and primary user 22. In one example, video is stored in a separate database and a pointer to the video is stored in association with the other status information.
  • FIG. 23 is an example of a notification 750 provided to user device 24 of a primary user 22, as displayed by the App. The notification 750 includes a description 752 of the event and location where it took place. In this example, activity was detected in the living room. The notification also includes the time 754 of the event, here 2:12 PM. The notification 750 also describes what triggered the event 756, here motion detection. An image 758 of the room where the event took place is also shown. Clicking on the image plays a video of the event. An icon 760 is also provided, which enables viewing a current video of the room where the activity took place. Another icon 762 provides action options for the primary user 22. The action options may include clearing the event as a false alarm, contacting a backup contact 26, or contacting the police 28, fire department 30, or an ambulance 32, for example.
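  • A hypothetical payload for the notification 750 might look as follows; the field names and URLs are illustrative assumptions that simply mirror the reference numerals in FIG. 23, not a schema taken from the specification.

```python
# Hypothetical notification payload; comments map fields to FIG. 23.
notification = {
    "description": "Activity detected in the Living Room",     # 752
    "time": "2014-04-23T14:12:00",                             # 754 (2:12 PM)
    "trigger": "motion",                                       # 756
    "thumbnail_url": "https://example.invalid/event/123.jpg",  # 758, plays clip
    "live_view_url": "https://example.invalid/live/device1",   # 760, go live
    "actions": [                                               # 762
        "clear_false_alarm", "contact_backup",
        "call_police", "call_fire", "call_ambulance",
    ],
}
```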
  • FIG. 24 is an example of a notification 770 displayed on a user device 24 that states that a person arrived home and when. The notification also includes the current temperature, humidity, air quality, and noise in the room based on the T/H sensor 90/88, the air quality sensor 94, and the microphone 76, respectively.
  • Gesture and Sound-Based User Inputted Actions
  • As discussed above, the primary user 22 can control the device 10 through an App from their smart phone or a web interface. The device 10 can also be controlled by the user through active and passive actions. Active actions that can control the device include voice commands and physical gestures, for example. A combination of voice/sound and gesture actions can also generate a command to control the device. Sound may be detected by the microphone 76 and gestures may be detected by the image sensor 60. In one example, a primary user or other member of a household can call out for a security alert to contact authorities, such as the police or fire department, or to contact a backup user, for example. A specific action can be tailored based on the desired outcome, such as a user designating a verbal command to issue a low level alert or a specific hand gesture to issue a high level alert. Voice recognition technology may be used to teach the system to recognize commands, as is known in the art. Gesture recognition technology may be used to recognize gestures, as is also known in the art.
  • The user can also control the device through passive actions. Movement and video cues determine passive actions. In one example, the primary user can walk in front of the device 10 to cause the device to disarm after the device and/or system 14 recognizes the primary user through facial recognition, for example. The device 10 can also sense directional based input from its image sensor 60 to indicate whether a particular movement is normal or out of the ordinary.
  • Timeline
  • When an event is stored by the system 14 and/or the device 10, it may be shown to the user in a 'timeline' of activities in their location. A timeline is a linear, chronological schedule of events that occurred in a given location over a time period that have been captured by the device 10 or multiple devices 10 in a location. The timeline is displayed on a user device 24 via the App, based on information provided to the user device from the system 14 and/or the device 10, via the network 12. The timeline allows a primary user 22 to quickly get an overview of all activity in a user defined location that is captured by the device or devices 10. The timeline allows a user to toggle between events and previous notifications, under the control of the App.
  • The timeline may be composed of two types of events, engagement entries and event entries, for example. Engagement notifications are generated when a primary user 22 or other user, such as a family member (trusted user), interacts with the device 10. Event notifications are generated when the device 10 is armed and detects something out of the ordinary.
  • Engagement entries capture the engagement of a user at a particular location with the device 10. For example, an engagement entry is generated when a primary user arrives at or departs from home. Other engagement notifications include: device mode changes, whether initiated automatically by the system or by a primary user 22 (e.g., "device is now armed" or "Amber disabled device"); a primary user going live (explained in further detail below); a primary user making changes to notification settings (e.g., "Justin paused all notifications for one hour"); or a primary user resolving an Event, as described below.
  • The timeline also allows a user to manage event notifications. For example, the primary user 22 may see which sensors triggered event creation; learn which device in the home triggered event creation, in the case of multiple devices in a single location; learn at what time an event was triggered; see video clips from when other primary users went "live" (actively used the camera to record remotely); see who went live, at what time, and from where (via GPS location); see the severity of an event; leave comments on entries for other users with access to the timeline; see how the event was responded to, and by which user; see sensor readings to make sure their environment is normal; see a high level view, and a detailed view, of each event; see sensor details for each event; learn which users were home during each event; learn how long each event lasted; watch video from an event; know the status of notifications from an event; share an event socially; download an event report (sensor data, event timeline, video, etc.); mark an event as important; email or text an event video and details; and/or leave feedback for other users with access to the timeline.
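  • A minimal sketch of the two timeline entry types described above; all field names are assumptions chosen to mirror the attributes just listed (triggering sensors, device, time, severity, video, viewed state), not a data model taken from the specification.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TimelineEntry:
    """Common fields shared by all timeline entries."""
    time: datetime
    summary: str
    viewed: bool = False          # whether the entry was previously viewed

@dataclass
class EngagementEntry(TimelineEntry):
    """User interaction with the device, e.g. 'Amber disabled device'."""
    user: str = ""

@dataclass
class EventEntry(TimelineEntry):
    """Something out of the ordinary detected while the device is armed."""
    sensors: list = field(default_factory=list)  # sensors that triggered
    device_id: str = ""           # which device, if several in a location
    severity: str = "low"
    video_clip: str = ""          # pointer to the stored clip, if any
```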
  • The timeline may include events where the device 10 notified the primary user 22 of potential security threats. The timeline may also include events that did not result in notification to the primary user 22, due to learning that the primary user does not want to be notified of a particular event, or due to lack of severity of the event, for example.
  • FIG. 25 is an example of a timeline 800 of events that can be displayed by the user device 24 via the App, from data received from the system 14 and/or the device 10. The timeline identifies the location 802 of the device 10, here "Brooklyn Apartment," the current time 804, the date (Today) 806, and the number 808 of events that took place so far that day (8). Three events are listed with the time 810 of each event and a summary 812 of the event. The timeline 800 may be scrolled by the user to see the additional events that are not currently displayed. The timeline also indicates whether the event was previously viewed 814.
  • FIG. 26 is a timeline 820 of notifications. Two days of notifications are shown, and the timeline may be scrolled to display additional notifications.
  • In the case of multiple primary users 22 sharing one or more devices 10, each user's timeline may appear slightly different and may be customized to them. Backup contacts 26 and other parties may have access to the timeline, if allowed by the primary user 22.
  • This concept of a timeline of one's life within a physical location, tailored specifically to what is normal or out of the ordinary for an individual user, summarizes what is happening in one physical place from a sensor perspective, acting in close coordination with the camera/image sensor 60, the microphone 76, and other sensors. For example, the device 10 and the system 14 know who is in a certain location (using geolocation services on a mobile device that is linked to the account, facial recognition, and/or user patterns), what the temperature and humidity are inside based on the T/H sensor 90/88 and outside via publicly available weather information, what noise and activity are occurring via the microphone 76 and other sensors, and whether all of these activities are normal. The activities, plus their relative 'normalcy' or 'non-normalcy,' can be outlined on a timeline, where the primary user 22 can quickly delve into each event to understand if and how it impacts their life, and can be notified of the events that are not normal occurrences.
  • The device 10 and/or the system 14 may also provide a ‘day’ or ‘week in the life’ summary of what happened in a given location, to provide a very rapid video clip view and understanding of what was happening in the location for that week. The user may be e-mailed a video clip synopsis of the week, the day, or other time period.
  • Primary users 22 may also access their devices 10 from any of their locations, to watch live streaming video from that device, enabling them to “Go Live.” They can also see current sensor readings from the location and who is home/away at the time. The primary user 22 has the option to save the video clip and corresponding sensor data.
  • A clip library allows users to store flagged video clips that they want the system 14 to keep for them. These can include clips of events they have been notified of; a clip of a robbery, for example, may later be used in court or for a police report. The user may also request that clips of happy events be saved, such as lighthearted clips of the family pet doing something noteworthy. Flagging a clip via the App saves that clip in the database 20 or other storage device on the system 14. A clip library belongs to a location. Primary users may share saved clips from their clip library to social networks, such as Facebook and Twitter, for example.
  • FIG. 27 is an example of a method 900 for generating a day or week in the life at a location. A location is monitored by a monitoring device 10 including a video camera, such as an imaging sensor 60, in Step 902. A primary user 22 is notified of an event at the location, in Step 904. The user may be notified by the system 14, as discussed above. Video clips related to respective events at the location the user has been informed of are stored, in Step 906. The video clips may be uploaded to the system 14 by the device 10, as discussed above. The video clips may be stored in the database 20 or other storage device, by the system 14, in Step 906.
  • Video clips are stored by the system 14 upon a request of the user, in Step 908. The user 22 may request that particular video clips be stored via the App on the user device 24.
  • The system 14 receives a request from the user 22 to provide stored video clips over a designated time period, via the App, in Step 910. The system retrieves the stored video clips, in Step 912, and provides them to the user for display on the user device 24, in Step 914. The video clips may be compiled into a single file and sent via email, for example. The App opens and displays the video, when requested by the user 22.
  • Sensor Data Trends
  • The App also allows users to view sensor data trends from the sensors in the device 10, so that the primary user 22 is able to view visualized data about specific sensors within their location. For instance, if an air quality sensor is triggered, a graph may be generated by the system 14 and sent to the primary user 22 that indicates a measure of the air quality, a video and audio clip of when the sensor went off, a historical chart of what the air quality was before the alert, instructions (e.g., go outside, open windows, etc.), and action calls (press here to call poison control, etc.).
  • Over time, insights concerning environmental sensor data developed by the system 14 may also be sent to respective primary users 22. The sensor data may be benchmarked against average readings, as determined by sensor data from similarly situated primary users 22. For example: "Your temperature averages 76 degrees when no one is home. Most homeowners in your city keep their interior temperature around 70 degrees when no one is home." Primary users 22 may also be able to see sensor data and compare it against a previous time frame (i.e., this month versus last month). Primary users 22 may also have the ability to toggle between data from specific sensors, showing graphs for those sensors that are relevant to them and hiding sensor data and corresponding graphs that are of no interest to them.
  • Geo-Location
  • Geo-location may be used to identify who is approaching, entering, or leaving a location, for example. Geo-location may also be used to provide an input into the device 10 and/or the system 14 to determine the location of the primary user 22 and other persons, such as family members, for example. Geo-location services are available in most smart phones and other smart mobile devices. Disarming and arming of the device 10 and the system 14 may be based on geo-location. For example, the device 10/system 14 may be disarmed from issuing a notification that a person is approaching or entering a monitored location when the primary user 22 is determined to be at or near the location, if desired by the user. In this example, while notifications are not sent, data may still be collected. Similarly, when the primary user 22 is determined to be leaving a location, the device 10/system 14 is armed to provide notifications, if desired by the user.
  • When a primary user 22 begins to approach a select radius around their home 12 or other location where the device 10 is located, the user device will send a signal to the home network that the user is in the area, so that the device can be 'ready' for them to enter. Being 'ready' informs the device 10/system 14 not to notify the user(s) immediately upon entry with a movement alarm, since the person entering the location is likely the user; the device should instead wait a predetermined period of time (e.g., 45 seconds) to confirm the user is in the location through the App or wireless network, and disarm itself, for example.
  • Geo-location of other users designated by the primary user 22, such as family members, for example, may also arm or disarm the device 10/system 14 without alerting the user. Such information may be saved into a 'timeline' but need not result in a notification as a security incident.
  • By geo-locating the user device 24, other preferences in the home 12 may also be controlled. For example, preferences may be different depending on who is in the home. From a security and safety point of view, specific actions and notifications can align with the geo-location of phones and the confirmation of the proximity of individuals within the location. The owner of the account can set preferences for each individual. An action may be a specific notification. For example, if location tracking for a primary user's child is activated, the primary user 22 can receive a timeline or graph of the child's activities.
  • Geo-location may be performed in a variety of ways. In one example, a user device 24, such as a mobile device, may monitor a large region (~1 km) centered around the user's home 12, using GPS coordinates, for example. When the user device 24 recognizes that it has entered this large region, it begins to search for a smaller region defined by an iBeacon, a mini geo-fence created using Bluetooth low energy (BTLE) devices. When the user device 24 recognizes the iBeacon, it sends an HTTPS request to the system 14 to disarm notifications.
  • In another example, when a mobile user device 24 recognizes that it has entered a large monitored region, it begins to search for the device 10, which is a Bluetooth low energy device. When connected to the user device 24, the BTLE device 10 becomes a peripheral device and the user device 24 becomes the central device. The user device 24 continues to search for the peripheral in the background. When the user device 24 detects the peripheral, it verifies that it is that user's device 10 and sends an HTTPS request to the system 14 to disarm notifications.
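  • Both approaches end the same way: the user device sends an HTTPS request to the system 14 to disarm (or re-arm) notifications. A minimal sketch follows, assuming a hypothetical REST endpoint and bearer-token authentication; neither is specified in the disclosure.

```python
import requests  # common HTTPS client; the endpoint below is hypothetical

SYSTEM_URL = "https://system.example.invalid/api/notifications"

def on_region_event(device_id, auth_token, entered):
    """Called by the mobile OS's region-monitoring callback when the
    user device crosses the large geo-fence or detects the iBeacon/BTLE
    peripheral. Sends an HTTPS request to disarm notifications on entry
    and re-arm them on exit."""
    action = "disarm" if entered else "arm"
    resp = requests.post(
        f"{SYSTEM_URL}/{action}",
        json={"device_id": device_id},
        headers={"Authorization": f"Bearer {auth_token}"},
        timeout=10,
    )
    resp.raise_for_status()
```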
  • A secondary user, who can either be a primary user 22 of another device 10 or provisioned to be a backup contact 26 in a group of the primary user, can likewise be used to arm and disarm the security and notifications of the device, based on the criteria set by the primary user 22.
  • Third Party Information and APIs
  • In accordance with another embodiment of the invention, the system 14 may enable connection between the system and third parties, such as Facebook or LinkedIn, to learn the location of a user, and to learn who may be potential backup contacts, for example. If a user connects their Facebook account to the system 14, the system can scan the images of all of their friends on Facebook and ask the primary user whether they should be designated as backup contacts 26, for example. Based on the scanned images, the device 10 and the system 14 may be able to recognize the Facebook friends if they enter the home 12, by matching an image or video captured by the image sensor 60 against the scanned images, for example. The primary user 22 can then be informed of who entered their home 12.
  • In accordance with another embodiment of the invention, the system 14 may provide an API to allow the public and/or other users of their own devices 10 to gather both generalized, anonymized data as well as specific data on individuals' lives, in order to better understand their own environment. This open security platform can be used by other security companies who wish to make devices that connect to the system 14 and/or to use aspects of the system 14 for other security or non-security related benefits.
  • Other third party information that may be obtained and used by the system includes weather. For example, temperature, humidity, and air quality thresholds may be changed by the system 14 based on the local weather, for example.
  • Local crime warnings may also be used by the system 14. For example, the system 14 may send notifications to primary users 22 under circumstances where the user preferences state that notifications should not be provided, if it is known that crime has recently increased in the area, for example.
  • Community Watch
  • Another embodiment of the invention is the interaction between the system 14 and the user's social group of friends, family, or neighbors that they choose to include in the system. The primary user 22 may designate, through the App on the user device 24, individuals whom the primary user knows as their 'social group,' and identify those users as backups for primary user notifications. These backups are placed in specific 'groups', which work together to look into each other's homes from a timeline perspective and help monitor each other's safety and security. The system 14 may also automatically designate people identified as friends of the primary user 22 on Facebook, for example.
  • These groups can be anywhere from 2 people to 10 people or more, but are intended to be a small social circle that enables a community of people to respond from a security point of view. Members of the groups receive the notifications from the system 14/device 10 either after the primary user 22 or, in the case of high priority notifications, at the same time as the primary user. This enables a crowd-sourced response, where multiple people may respond to the notification to ensure that there is an accurate and rapid response. Additionally, a primary user 22 can cede primary user status and provide temporary "keys" to pass ownership and control over the software and hardware that make up the device for a pre-determined period of time. These digital keys would allow a recipient to have temporary access to the device 10 when someone other than the primary user would be in or using the home, such as when renting out a primary user's home to a temporary guest or to give access to a friend or family member to check on a primary user's home when they are on vacation.
  • In another example, if a primary user 22 does not respond to a notification after a predetermined length of time, the notification would be sent to some or all members of the group by the system 14, for example. Members of the group may be able to either see what is going on in the location, or see a written description of what caused the notification, for example. The primary user 22 may determine how much information members of the group receive by setting appropriate preferences. Multiple group members can then work together to resolve an incident, as a virtual neighborhood or virtual community watch, and share in the responsibility of responding and reacting to a security or safety-related incident.
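  • A minimal sketch of this escalation behavior, assuming an injected notify callback and an acknowledged() predicate supplied by the caller; the delay value is illustrative, not specified in the disclosure.

```python
import threading

ESCALATION_DELAY_S = 120  # predetermined wait before escalating (illustrative)

def notify_with_escalation(notify, primary, backups, message, acknowledged):
    """Send a notification to the primary user; if it remains
    unacknowledged after a predetermined length of time, escalate the
    same notification to some or all members of the group."""
    notify(primary, message)

    def escalate():
        if not acknowledged():       # primary user never responded
            for member in backups:   # crowd-sourced community response
                notify(member, message)

    threading.Timer(ESCALATION_DELAY_S, escalate).start()
```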
  • Individuals within the primary user's social group can take direct or indirect action through the App to resolve an incident or respond to the incident or event. A primary user 22 may be a member of multiple groups. A group may comprise two users, or a home owner and a secondary family member in their location or in another location, for example. The group members may also have access to the timelines of other members in the group, and may add or subtract members from the group (depending on group settings and preferences). The groups are a way for people to bring along people they already know and trust to help with their security and safety, such as their parents, children, neighbors, and friends.
  • In addition, a group could comprise some or all of the inhabitants of a physical building or part of a building, such as an apartment building or office. This group may share limited information, such as confirmed incidents, or temperature, humidity, and air quality readings. Multiple members of the building or office would share the feed of the device(s) in the office, and share in the notifications or response. Knowing what is going on in their physical location would be beneficial to all members of the group. A person can be a member of the group even if they have no device or no membership. For example, they can be someone's backup, and still have the ability to participate in other people's security. In this case, they may not, however, have an input of their own.
  • A group may also be a community and may include an input for the police or other authorities. Members of the group would receive instant notification should someone or something be confirmed by the owner, or should a high heat, carbon monoxide, or other life-safety related event be detected. Appropriate government authorities or non-governmental support companies/entities can be contacted directly or indirectly from the device 10 and/or the system 14 to inform of and report an incident if the incident is deemed out-of-the-ordinary. Alternatively, the device 10/system 14 may be configured by the primary user 22 to alert authorities if notifications remain unacknowledged by the primary user or group members.
  • A second layer, aside from clusters of neighbors watching over each other, is the interaction between the system 14 and local authorities. A dedicated software App or website for police and other authorities could feed data collected by devices 10 concerning crime and other safety information directly from a user's home to relevant groups. Appropriate government authorities or non-governmental support companies and entities can be contacted directly or indirectly from the device 10 or system 14, to inform and report of an incident that is deemed out-of-the-ordinary. Alternatively, the device 10 and/or system 14 can be configured by the primary user 22 to alert authorities if notifications remain unacknowledged by the primary user or group members, by setting appropriate settings.
  • The User App
  • As discussed above, the primary user 22 and other users, if allowed by the primary user, may communicate with the device 10 and system 14 via their mobile device or other processing device, such as a laptop or desktop computer, through an Application on the device. From the App, a user controls the settings for the device, receives notifications when something occurs that is out of the ordinary, and has the ability to respond in an appropriate fashion. The features of the mobile application work in concert to provide a holistic view of what occurs in one's home and the necessary tools to effectively monitor one's home in different situations.
  • Specific notifications are transmitted to the user on their mobile device, tablet, computer, or other web-connected device. The messages can be sent via email, phone, text, or in-App message based on user preferences. Based on the input from the device 10 and the system 14, notifications or alerts are pushed to the user to apprise them of relevant incidents generated by the system.
  • Other types of notifications include device notifications. These are notifications related to the system that are not generated by human interaction. For example, a device notification would be sent to a user if the system loses power or if the system is disconnected from the Internet. There are also several classes of notifications that would be sent to a user that are not related to device functionality. These are non-device, non-human-interaction generated notifications, such as: third party weather information; a reminder to pay for a service plan; a notification that allotted data storage is reaching capacity; SDK/API related notifications; software updates; confirmation of backup invitations; when notifications escalate to a backup contact; and/or confirmation of additional user notifications.
  • Custom Modes
  • The device 10 and system 14 may operate differently in different modes of operation. The modes of operation may include a home mode, an array mode, a nighttime mode, a privacy mode, and/or a vacation mode. Modes may be changed directly on the device 10 via the capacitive switch 114 and/or via the App in the user device 24, for example. Each mode may be represented by a color displayed by the RGB LED 96, for example.
  • During Night Time Mode, the device 10 and the system 14 can be enabled to still watch over a user's home without notifying the user of every event. For example, during setup or at a later time, a user can set Night Time Mode to only deliver critical alerts, such as when the device 10 detects movement or a sharp spike in temperature, and have all other alerts delayed until the next morning.
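  • A minimal sketch of Night Time Mode alert routing, assuming a user-selected set of critical trigger types and injected delivery callbacks; the trigger names and mode string are assumptions for illustration.

```python
CRITICAL_TRIGGERS = {"motion", "temperature_spike"}  # user-selected (illustrative)

def route_alert(mode, alert, send_now, defer_until_morning):
    """In Night Time Mode, deliver only critical alerts immediately and
    hold everything else until the next morning; in other modes, send
    every alert immediately."""
    if mode == "night" and alert["trigger"] not in CRITICAL_TRIGGERS:
        defer_until_morning(alert)
    else:
        send_now(alert)
```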
  • In vacation mode, the device 10 and the system 14 may be in a high alert mode. For example, the system can automatically sound the siren 86 when motion is detected, instead of waiting for a user or backup contact to resolve the event. Additionally, escalation preferences are suspended, with the primary user 22 and backup contacts receiving notifications simultaneously.
  • In addition, in vacation mode, a primary user may cede primary user status and provide temporary digital “keys” to pass ownership and control over device 10 and the system 14 to the backup contact or other party for a pre-determined period of time.
  • These digital keys would also allow a recipient to have temporary access to the device 10 in instances where someone other than the primary user would be in or using the home, such as when renting out a primary user's home to a temporary guest, or to give a friend or family member access to check on a primary user's home while the primary user is on vacation.
  • The examples of embodiments of the invention described herein are merely exemplary and one of ordinary skill in the art will recognize that aspects of the described embodiments may be changed without departing from the spirit and scope of the invention, which is defined in the claims below.

Claims (31)

We claim:
1. A method of providing information about a monitored environment, comprising:
monitoring a location by a plurality of sensors;
collecting data from the plurality of sensors by a processing device; and
providing information to a user of the status of the location, based on data collected by at least some of the plurality of sensors, by the processing device, for display on a user device, via a network.
2. The method of claim 1, wherein respective information is provided if criteria with respect to data collected from at least one of the plurality of sensors is met.
3. The method of claim 2, wherein at least certain criteria comprise exceeding a threshold and/or deviating from a pattern of the data collected by one or more of the plurality of sensors over a time period.
4. The method of claim 2, comprising:
determining that criteria for at least one of the sensors is met; and
providing the information based, at least in part, on at least one sensor collecting data exceeding a respective threshold, and at least one sensor collecting data not meeting a respective criterion.
5. The method of claim 4, wherein one of the sensors comprises a camera and another one of the sensors comprises a temperature sensor, a humidity sensor, a motion sensor, an air quality sensor, and/or an accelerometer, the method comprising:
determining that data collected from at least one of the temperature sensor, humidity sensor, motion sensor, air quality sensor, and/or accelerometer, meets a respective criteria;
providing information related to the met criteria; and
providing video from data collected by the camera at a time the data meeting the criteria is collected.
6. The method of claim 5, wherein the information includes data exceeding a respective criteria.
7. The method of claim 6, further comprising:
receiving a response from the user to the information based, at least in part, on the data meeting the criteria and the data not meeting the criteria; and
providing subsequent information based, at least in part, on the response.
8. The method of claim 2, comprising displaying provided information on a user device.
9. The method of claim 8, comprising enabling a user to scroll through the information in the order that the information was provided.
10. The method of claim 8, comprising:
providing information to the user based on data collected by a plurality of monitoring devices in the location;
enabling the user to scroll through information provided based on data collected from at least some of the plurality of devices, in the order the information was provided; or
enabling the user to toggle between information provided based on data collected from at least some of the plurality of devices.
11. The method of claim 8, comprising:
providing information to the user based on data collected by a plurality of monitoring devices in a plurality of different locations associated with the user; and
enabling the user to scroll through the information from particular devices, in the order the information was provided.
12. The method of claim 8, wherein at least some of the information includes a geo-location of the user.
13. The method of claim 1, further comprising enabling a user to scroll through information in the order the information was provided, over the course of at least one day.
14. The method of claim 1, comprising monitoring the location by a plurality of sensors within a single device.
15. A system for providing information about a monitored environment, comprising:
a processing device; and
memory;
wherein the processing device is configured to:
receive data collected by a plurality of sensors in a location; and
provide information to a user of the status of the location, based, at least in part, on data collected by at least one of the plurality of sensors, via a network.
16. The system of claim 15, wherein respective updates are provided if criteria with respect to data collected from at least one of the plurality of sensors is met.
17. The system of claim 16, wherein meeting at least certain criteria comprises exceeding a threshold and/or deviating from a normal pattern.
18. The system of claim 16, comprising:
determining that criteria for at least one of the sensors is met; and
providing in the update information related to a plurality of sensors, including at least one sensor whose criteria is not met.
19. The system of claim 18, wherein one of the sensors comprises a camera and another one of the sensors comprises a temperature sensor, a humidity sensor, a motion sensor, an air quality sensor, and/or an accelerometer, and the processing device is configured to:
determine whether criteria from at least one of the temperature sensor, humidity sensor, motion sensor, air quality sensor, and/or accelerometer is met; and
provide information related to the exceeded criteria, the update including video from data collected by the camera.
20. The system of claim 19, where the information includes data exceeding a respective criteria.
21. The system of claim 20, wherein the processing device is further configured to:
receive a response from the user to the update based, at least in part, on the data meeting the criteria and the data not meeting the criteria; and
provide subsequent information based, at least in part, on the response.
22. The system of claim 16, further comprising a user device configured to display information provided at multiple times.
23. The system of claim 22, wherein the user device is configured to enable a user to scroll through updates in the order that the information was provided.
24. The system of claim 23, wherein the processing device is configured to:
provide information to the user based on data collected by a plurality of monitoring devices in the location; and
the user device is configured to enable the user to scroll through information from the plurality of devices, in the order the information was provided.
25. The system of claim 22, wherein the processing device is configured to:
provide information to the user based on data collected by a plurality of monitoring devices in a plurality of different locations associated with the user; and
the user device is configured to enable the user to scroll through the information from particular devices, in the order the information was provided.
26. The system of claim 22, wherein at least some of the information includes a geo-location of the user.
27. The system of claim 15, wherein the user device is configured to enable a user to scroll through information in the order the information was provided.
28. The system of claim 15, comprising monitoring the location by a plurality of sensors within a single device.
29. The system of claim 28, wherein the user device is configured to enable a user to scroll through updates in the order the updates were provided, over the course of at least one week.
30. The system of claim 27, wherein the user device is configured to display the information via an App stored on the user device.
31. A method of providing video clips from a location to a user device, comprising:
monitoring a location by a monitoring device including a video camera;
notifying a user of an event at the location;
storing first video clips related to respective events at the location that the user has been informed of;
storing second video clips of the location upon a request of the user;
receiving a request to provide stored video clips of the location over a time period, from the user;
retrieving the stored video clips; and
providing the retrieved video clips to a user device for display.
US14/260,256 2013-04-23 2014-04-23 Remote User Interface & Display For Events For A Monitored Location Abandoned US20140320648A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/260,256 US20140320648A1 (en) 2013-04-23 2014-04-23 Remote User Interface & Display For Events For A Monitored Location
US15/457,182 US10083599B2 (en) 2013-04-23 2017-03-13 Remote user interface and display for events for a monitored location

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361815223P 2013-04-23 2013-04-23
US14/260,256 US20140320648A1 (en) 2013-04-23 2014-04-23 Remote User Interface & Display For Events For A Monitored Location

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/457,182 Continuation US10083599B2 (en) 2013-04-23 2017-03-13 Remote user interface and display for events for a monitored location

Publications (1)

Publication Number Publication Date
US20140320648A1 (en) 2014-10-30

Family

ID=51728587

Family Applications (12)

Application Number Title Priority Date Filing Date
US14/260,246 Active US10304319B2 (en) 2013-04-23 2014-04-23 Monitoring and security devices comprising multiple sensors
US14/260,273 Abandoned US20140317710A1 (en) 2013-04-23 2014-04-23 Method for connecting devices to a network through an audio cable and a user device
US14/260,270 Abandoned US20140327555A1 (en) 2013-04-23 2014-04-23 Monitoring & security systems and methods with learning capabilities
US14/260,266 Active US9449491B2 (en) 2013-04-23 2014-04-23 Notifying a community of security events
US14/260,249 Abandoned US20140321505A1 (en) 2013-04-23 2014-04-23 Thermo isolation chamber for housing components and sensors
US14/260,256 Abandoned US20140320648A1 (en) 2013-04-23 2014-04-23 Remote User Interface & Display For Events For A Monitored Location
US14/260,264 Active US9472090B2 (en) 2013-04-23 2014-04-23 Designation and notifying backup users for location-based monitoring
US14/260,260 Active US9633548B2 (en) 2013-04-23 2014-04-23 Leveraging a user's geo-location to arm and disarm a network enabled device
US14/751,235 Abandoned US20150301513A1 (en) 2013-04-23 2015-06-26 Connecting a security monitoring device to a network and security processing system using an audio connection to a user device
US14/751,284 Active US9659483B2 (en) 2013-04-23 2015-06-26 System for leveraging a user's geo-location to arm and disarm network a enabled device
US14/751,395 Abandoned US20150302725A1 (en) 2013-04-23 2015-06-26 Monitoring & security systems and methods with learning capabilities
US15/457,182 Active US10083599B2 (en) 2013-04-23 2017-03-13 Remote user interface and display for events for a monitored location

Family Applications Before (5)

Application Number Title Priority Date Filing Date
US14/260,246 Active US10304319B2 (en) 2013-04-23 2014-04-23 Monitoring and security devices comprising multiple sensors
US14/260,273 Abandoned US20140317710A1 (en) 2013-04-23 2014-04-23 Method for connecting devices to a network through an audio cable and a user device
US14/260,270 Abandoned US20140327555A1 (en) 2013-04-23 2014-04-23 Monitoring & security systems and methods with learning capabilities
US14/260,266 Active US9449491B2 (en) 2013-04-23 2014-04-23 Notifying a community of security events
US14/260,249 Abandoned US20140321505A1 (en) 2013-04-23 2014-04-23 Thermo isolation chamber for housing components and sensors

Family Applications After (6)

Application Number Title Priority Date Filing Date
US14/260,264 Active US9472090B2 (en) 2013-04-23 2014-04-23 Designation and notifying backup users for location-based monitoring
US14/260,260 Active US9633548B2 (en) 2013-04-23 2014-04-23 Leveraging a user's geo-location to arm and disarm a network enabled device
US14/751,235 Abandoned US20150301513A1 (en) 2013-04-23 2015-06-26 Connecting a security monitoring device to a network and security processing system using an audio connection to a user device
US14/751,284 Active US9659483B2 (en) 2013-04-23 2015-06-26 System for leveraging a user's geo-location to arm and disarm network a enabled device
US14/751,395 Abandoned US20150302725A1 (en) 2013-04-23 2015-06-26 Monitoring & security systems and methods with learning capabilities
US15/457,182 Active US10083599B2 (en) 2013-04-23 2017-03-13 Remote user interface and display for events for a monitored location

Country Status (8)

Country Link
US (12) US10304319B2 (en)
EP (2) EP2989619A4 (en)
JP (1) JP2016524209A (en)
KR (1) KR20160032004A (en)
CN (1) CN105308657A (en)
AU (1) AU2014257036A1 (en)
CA (1) CA2909892C (en)
WO (1) WO2014176379A2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105809884A (en) * 2014-12-29 2016-07-27 北京市建筑设计研究院有限公司 House accident monitoring system
CN106079064A (en) * 2016-07-19 2016-11-09 魏会芳 A kind of Intelligent Machining device
CN106297232A (en) * 2015-12-11 2017-01-04 湖南天洋信息科技有限公司 Air-gauge, data processing method based on air-gauge and system
CN106504459A (en) * 2016-10-31 2017-03-15 国网山东省电力公司济南供电公司 A kind of unmanned communications equipment room fire early-warning system
CN106548587A (en) * 2016-10-31 2017-03-29 国网山东省电力公司济南供电公司 A kind of information machine room safety monitoring system based on Internet of Things
CN107305731A (en) * 2016-04-21 2017-10-31 中兴通讯股份有限公司 A kind of radio data transmission method and device
US20180046778A1 (en) * 2015-12-28 2018-02-15 National Taiwan University Fever epidemic detection system and method thereof
CN107728559A (en) * 2017-11-20 2018-02-23 惠州市酷丰科技有限公司 It is a kind of can remote monitoring and high security burglary-resisting system
US9965936B1 (en) * 2017-01-04 2018-05-08 Shawn W. Epps Network communication and accountability system for individual and group safety
US10152873B2 (en) * 2017-04-13 2018-12-11 Chekt Llc. Alarm verification system and method thereof
US10762755B2 (en) * 2018-06-04 2020-09-01 Apple Inc. Data-secure sensor system
US11062592B2 (en) * 2019-04-12 2021-07-13 AKBI Development, LLC Danger zone protection and assistance system
US20220376978A1 (en) * 2014-10-13 2022-11-24 Pismo Labs Technology Limited Methods and systems for configuring a mobile router

Families Citing this family (378)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7771320B2 (en) * 2006-09-07 2010-08-10 Nike, Inc. Athletic performance sensing and/or tracking systems and methods
US10469556B2 (en) 2007-05-31 2019-11-05 Ooma, Inc. System and method for providing audio cues in operation of a VoIP service
US11415949B2 (en) 2011-03-16 2022-08-16 View, Inc. Security event detection with smart windows
US11822202B2 (en) 2011-03-16 2023-11-21 View, Inc. Controlling transitions in optically switchable devices
US8705162B2 (en) 2012-04-17 2014-04-22 View, Inc. Controlling transitions in optically switchable devices
US11703814B2 (en) 2011-03-16 2023-07-18 View, Inc. Security event detection with smart windows
US9411327B2 (en) 2012-08-27 2016-08-09 Johnson Controls Technology Company Systems and methods for classifying data in building automation systems
US10304319B2 (en) 2013-04-23 2019-05-28 Canary Connect, Inc. Monitoring and security devices comprising multiple sensors
WO2014186808A2 (en) * 2013-05-17 2014-11-20 Barry Thornton Security and first-responder emergency lighting system
US9286786B2 (en) * 2013-07-17 2016-03-15 Honeywell International Inc. Surveillance systems and methods
CN105409261A (en) * 2013-07-23 2016-03-16 天龙马兰士集团有限公司 Remote system configuration using audio ports
US9386148B2 (en) 2013-09-23 2016-07-05 Ooma, Inc. Identifying and filtering incoming telephone calls to enhance privacy
IN2013MU03382A (en) * 2013-10-25 2015-07-17 Tata Consultancy Services Ltd
US10712718B2 (en) 2013-12-11 2020-07-14 Ademco Inc. Building automation remote control device with in-application messaging
US9900177B2 (en) 2013-12-11 2018-02-20 Echostar Technologies International Corporation Maintaining up-to-date home automation models
NL2012327B1 (en) * 2013-12-13 2016-06-21 Utc Fire & Security B V Selective intrusion detection systems.
US9769522B2 (en) 2013-12-16 2017-09-19 Echostar Technologies L.L.C. Methods and systems for location specific operations
US9442017B2 (en) * 2014-01-07 2016-09-13 Dale Read Occupancy sensor
US9533413B2 (en) 2014-03-13 2017-01-03 Brain Corporation Trainable modular robotic apparatus and methods
US9987743B2 (en) 2014-03-13 2018-06-05 Brain Corporation Trainable modular robotic apparatus and methods
US20150293549A1 (en) * 2014-04-14 2015-10-15 Eaton Corporation Load panel system
US9394740B2 (en) * 2014-04-30 2016-07-19 Cubic Corporation Failsafe operation for unmanned gatelines
US10553098B2 (en) 2014-05-20 2020-02-04 Ooma, Inc. Appliance device integration with alarm systems
US9633547B2 (en) 2014-05-20 2017-04-25 Ooma, Inc. Security monitoring and control
US10769931B2 (en) 2014-05-20 2020-09-08 Ooma, Inc. Network jamming detection and remediation
US20170251169A1 (en) * 2014-06-03 2017-08-31 Gopro, Inc. Apparatus and methods for context based video data compression
US10679671B2 (en) * 2014-06-09 2020-06-09 Pelco, Inc. Smart video digest system and method
US20160042621A1 (en) * 2014-06-13 2016-02-11 William Daylesford Hogg Video Motion Detection Method and Alert Management
US10469514B2 (en) * 2014-06-23 2019-11-05 Hewlett Packard Enterprise Development Lp Collaborative and adaptive threat intelligence for computer security
CN104104910B (en) * 2014-06-26 2018-04-17 北京小鱼在家科技有限公司 Terminal and method for two-way live sharing with intelligent monitoring
US11003041B2 (en) 2014-06-30 2021-05-11 View, Inc. Power management for electrochromic window networks
US9513898B2 (en) 2014-06-30 2016-12-06 Google Inc. Systems and methods for updating software in a hazard detection system
EP3161552B1 (en) 2014-06-30 2020-01-15 View, Inc. Control methods and systems for networks of optically switchable windows during reduced power availability
US10110724B2 (en) 2014-07-02 2018-10-23 Titan Health & Security Technologies, Inc. Community safety, security, health communication and emergency notification system with inter-organizational compatibility
GB2528044B (en) 2014-07-04 2018-08-22 Arc Devices Ni Ltd Non-touch optical detection of vital signs
CN106471202A (en) * 2014-07-04 2017-03-01 皇家飞利浦有限公司 Air quality warning system and method
US11330100B2 (en) 2014-07-09 2022-05-10 Ooma, Inc. Server based intelligent personal assistant services
US20160043896A1 (en) * 2014-08-05 2016-02-11 Fibar Group sp. z o.o. Home network manager for home automation
US9886840B2 (en) * 2014-09-23 2018-02-06 Groves Internet Consulting, Inc. Method for guaranteed delivery of alert notifications through chain-of-command escalation procedures
US9989507B2 (en) 2014-09-25 2018-06-05 Echostar Technologies International Corporation Detection and prevention of toxic gas
US20160092532A1 (en) * 2014-09-29 2016-03-31 Facebook, Inc. Load-balancing inbound real-time data updates for a social networking system
US9854973B2 (en) 2014-10-25 2018-01-02 ARC Devices, Ltd Hand-held medical-data capture-device interoperation with electronic medical record systems
US20160125318A1 (en) * 2014-11-03 2016-05-05 Canary Connect, Inc. User-Assisted Learning in Security/Safety Monitoring System
US9576466B2 (en) * 2014-11-04 2017-02-21 Canary Connect, Inc. Backup contact for security/safety monitoring system
US9396632B2 (en) 2014-12-05 2016-07-19 Elwha Llc Detection and classification of abnormal sounds
KR102462086B1 (en) 2014-12-08 2022-11-01 View, Inc. Multiple interacting systems at a site
US10225224B1 (en) * 2014-12-11 2019-03-05 Priority Reply Networks, Llc Web and voice message notification system and process
GB2554792B (en) * 2014-12-27 2020-02-05 Switchee Ltd System and method for controlling energy consuming devices within a building
GB201423300D0 (en) * 2014-12-29 2015-02-11 Sprue Safety Products Ltd Multi alarm remote monitoring system
US10127785B2 (en) 2014-12-30 2018-11-13 Google Llc Entry point opening sensor
US9332616B1 (en) 2014-12-30 2016-05-03 Google Inc. Path light feedback compensation
US20160189513A1 (en) * 2014-12-30 2016-06-30 Google Inc. Situationally Aware Alarm
US9558639B2 (en) 2014-12-30 2017-01-31 Google Inc. Systems and methods of intrusion detection
US9569943B2 (en) * 2014-12-30 2017-02-14 Google Inc. Alarm arming with open entry point
US9508247B2 (en) 2014-12-30 2016-11-29 Google Inc. Systems and methods of automated arming and disarming of a security system
CN104902221B (en) * 2014-12-31 2018-06-12 小米科技有限责任公司 Video monitoring method and device
CN104640082B (en) * 2015-01-15 2019-02-12 小米科技有限责任公司 Prompting message generation method and device
US10127797B2 (en) * 2015-02-17 2018-11-13 Honeywell International Inc. Alternative inexpensive cloud-based mass market alarm system with alarm monitoring and reporting
WO2016138146A1 (en) * 2015-02-24 2016-09-01 KiLife Tech, Inc. Monitoring dependent individuals
US10032353B2 (en) 2015-02-24 2018-07-24 KiLife Tech, Inc. Monitoring dependent individuals
US9928713B2 (en) 2015-02-24 2018-03-27 KiLife Tech, Inc. Locks for wearable electronic bands
US9900174B2 (en) * 2015-03-06 2018-02-20 Honeywell International Inc. Multi-user geofencing for building automation
US9967391B2 (en) 2015-03-25 2018-05-08 Honeywell International Inc. Geo-fencing in a building automation system
US9836069B1 (en) * 2015-03-31 2017-12-05 Google Inc. Devices and methods for protecting unattended children in the home
US9858788B2 (en) 2015-04-07 2018-01-02 Vivint, Inc. Smart bedtime
US9609478B2 (en) 2015-04-27 2017-03-28 Honeywell International Inc. Geo-fencing with diagnostic feature
US10802459B2 (en) 2015-04-27 2020-10-13 Ademco Inc. Geo-fencing with advanced intelligent recovery
US10802469B2 (en) 2015-04-27 2020-10-13 Ademco Inc. Geo-fencing with diagnostic feature
CN104820072B (en) * 2015-04-30 2016-03-09 重庆大学 Monitoring method for an electronic-nose air-quality monitoring system based on cloud computing
US11171875B2 (en) 2015-05-08 2021-11-09 Ooma, Inc. Systems and methods of communications network failure detection and remediation utilizing link probes
US10771396B2 (en) 2015-05-08 2020-09-08 Ooma, Inc. Communications network failure detection and remediation
US9521069B2 (en) 2015-05-08 2016-12-13 Ooma, Inc. Managing alternative networks for high quality of service communications
US10911368B2 (en) 2015-05-08 2021-02-02 Ooma, Inc. Gateway address spoofing for alternate network utilization
US10009286B2 (en) 2015-05-08 2018-06-26 Ooma, Inc. Communications hub
US10482759B2 (en) * 2015-05-13 2019-11-19 Tyco Safety Products Canada Ltd. Identified presence detection in and around premises
US9734702B2 (en) * 2015-05-21 2017-08-15 Google Inc. Method and system for consolidating events across sensors
US20160364967A1 (en) * 2015-06-11 2016-12-15 John Philippe Legg Privacy sensitive surveillance apparatus
FR3037430A1 (en) * 2015-06-12 2016-12-16 Gdf Suez Sensors and alarms for technical management of a building and equipment for use by occupants
WO2016198814A1 (en) * 2015-06-12 2016-12-15 Engie Sensors and alarms for multi-service management of a building and of equipment for the occupants thereof
US9361011B1 (en) * 2015-06-14 2016-06-07 Google Inc. Methods and systems for presenting multiple live video feeds in a user interface
US9704376B2 (en) 2015-06-24 2017-07-11 Vivint, Inc. Smart stay day
US9840003B2 (en) 2015-06-24 2017-12-12 Brain Corporation Apparatus and methods for safe navigation of robotic devices
CN105116742A (en) * 2015-07-21 2015-12-02 合肥智凯电子科技有限公司 System capable of intelligently controlling household equipment and monitoring equipment
FR3040817B1 (en) * 2015-09-04 2021-04-30 Domis Sa Methods of automatic disarming and of use of an anti-intrusion alarm system, and associated alarm system
US10169611B2 (en) * 2015-09-10 2019-01-01 International Business Machines Corporation Dynamic application hiding
US9818291B2 (en) * 2015-09-11 2017-11-14 Honeywell International Inc. System arm notification based on BLE position
US9953511B2 (en) 2015-09-16 2018-04-24 Honeywell International Inc. Portable security device that communicates with home security system monitoring service
US11915178B2 (en) * 2015-09-22 2024-02-27 Nmetric, Llc Cascading notification system
US10425702B2 (en) 2015-09-30 2019-09-24 Sensormatic Electronics, LLC Sensor packs that are configured based on business application
US10354332B2 (en) 2015-09-30 2019-07-16 Sensormatic Electronics, LLC Sensor based system and method for drift analysis to predict equipment failure
US10902524B2 (en) 2015-09-30 2021-01-26 Sensormatic Electronics, LLC Sensor based system and method for augmenting underwriting of insurance policies
US11436911B2 (en) 2015-09-30 2022-09-06 Johnson Controls Tyco IP Holdings LLP Sensor based system and method for premises safety and operational profiling based on drift analysis
US11151654B2 (en) 2015-09-30 2021-10-19 Johnson Controls Tyco IP Holdings LLP System and method for determining risk profile, adjusting insurance premiums and automatically collecting premiums based on sensor data
US10116796B2 (en) 2015-10-09 2018-10-30 Ooma, Inc. Real-time communications-based internet advertising
GB201518050D0 (en) 2015-10-12 2015-11-25 Binatone Electronics Internat Ltd Home monitoring and control systems
EP3156857B1 (en) * 2015-10-16 2021-01-13 RP-Technik GmbH Device and method for monitoring a building
US20170109586A1 (en) * 2015-10-16 2017-04-20 Canary Connect, Inc. Sensitivity adjustment for computer-vision triggered notifications
US10534326B2 (en) 2015-10-21 2020-01-14 Johnson Controls Technology Company Building automation system with integrated building information model
US10057110B2 (en) 2015-11-06 2018-08-21 Honeywell International Inc. Site management system with dynamic site threat level based on geo-location data
US10516965B2 (en) 2015-11-11 2019-12-24 Ademco Inc. HVAC control using geofencing
US9628951B1 (en) 2015-11-11 2017-04-18 Honeywell International Inc. Methods and systems for performing geofencing with reduced power consumption
US10878251B2 (en) 2015-11-12 2020-12-29 Signify Holding B.V. Image processing system
US10270881B2 (en) * 2015-11-19 2019-04-23 Adobe Inc. Real-world user profiles via the internet of things
US10222119B2 (en) * 2015-11-20 2019-03-05 Mohsen Rezayat Deployable temperature controlled shed with remote management
EP3381021B1 (en) * 2015-11-23 2021-01-06 Essence Security International (E.S.I.) Ltd. Thermal motion detector and thermal camera
US9996066B2 (en) 2015-11-25 2018-06-12 Echostar Technologies International Corporation System and method for HVAC health monitoring using a television receiver
US9444703B1 (en) 2015-11-30 2016-09-13 International Business Machines Corporation Interconnecting electronic devices for reporting device status
CN105404163B (en) * 2015-12-07 2018-04-17 浙江元墅养老服务有限公司 Smart-home-based elderly-care service system
CN108370448A (en) * 2015-12-08 2018-08-03 法拉第未来公司 Crowdsourced broadcast system and method
US9560482B1 (en) 2015-12-09 2017-01-31 Honeywell International Inc. User or automated selection of enhanced geo-fencing
US9860697B2 (en) 2015-12-09 2018-01-02 Honeywell International Inc. Methods and systems for automatic adjustment of a geofence size
US10101717B2 (en) 2015-12-15 2018-10-16 Echostar Technologies International Corporation Home automation data storage system and methods
US10091017B2 (en) 2015-12-30 2018-10-02 Echostar Technologies International Corporation Personalized home automation control based on individualized profiling
US10060644B2 (en) 2015-12-31 2018-08-28 Echostar Technologies International Corporation Methods and systems for control of home automation activity based on user preferences
US10490003B2 (en) * 2015-12-31 2019-11-26 Vivint, Inc. Guest mode access
US10073428B2 (en) 2015-12-31 2018-09-11 Echostar Technologies International Corporation Methods and systems for control of home automation activity based on user characteristics
US10713873B1 (en) 2015-12-31 2020-07-14 Vivint, Inc. Traveling automation preferences
US11268732B2 (en) 2016-01-22 2022-03-08 Johnson Controls Technology Company Building energy management system with energy analytics
US20190208168A1 (en) * 2016-01-29 2019-07-04 John K. Collings, III Limited Access Community Surveillance System
CN105551189A (en) * 2016-02-04 2016-05-04 武克易 Intelligent monitoring method for an Internet of Things device
CN105551188A (en) * 2016-02-04 2016-05-04 武克易 Implementation method for an Internet of Things smart device with a monitoring function
WO2017132930A1 (en) * 2016-02-04 2017-08-10 武克易 Internet of things smart caregiving method
EP3203420B1 (en) 2016-02-08 2020-08-19 Honeywell International Inc. Removable memory card with security system support
US10605472B2 (en) 2016-02-19 2020-03-31 Ademco Inc. Multiple adaptive geo-fences for a building
TWI599992B (en) * 2016-03-01 2017-09-21 晶睿通訊股份有限公司 Surveillance camera device
US11810032B2 (en) 2016-03-16 2023-11-07 Triax Technologies, Inc. Systems and methods for low-energy wireless applications using networked wearable sensors
US10528902B2 (en) 2016-03-16 2020-01-07 Triax Technologies, Inc. System and interfaces for managing workplace events
US11170616B2 (en) * 2016-03-16 2021-11-09 Triax Technologies, Inc. System and interfaces for managing workplace events
US10769562B2 (en) 2016-03-16 2020-09-08 Triax Technologies, Inc. Sensor based system and method for authorizing operation of worksite equipment using a locally stored access control list
WO2017173167A1 (en) 2016-03-31 2017-10-05 Johnson Controls Technology Company Hvac device registration in a distributed building management system
US9734697B1 (en) * 2016-04-01 2017-08-15 Google Inc. Automatic notify mode for security system
US11553320B1 (en) * 2016-04-05 2023-01-10 Alarm.Com Incorporated Detection and handling of home owner moving by a home monitoring system
US9949086B2 (en) 2016-04-18 2018-04-17 Canary Connect, Inc. Automatic system control based on mobile device location relative to a physical space
US11774920B2 (en) 2016-05-04 2023-10-03 Johnson Controls Technology Company Building system with user presentation composition based on building context
US10417451B2 (en) 2017-09-27 2019-09-17 Johnson Controls Technology Company Building system with smart entity personal identifying information (PII) masking
US10505756B2 (en) 2017-02-10 2019-12-10 Johnson Controls Technology Company Building management system with space graphs
US10901373B2 (en) 2017-06-15 2021-01-26 Johnson Controls Technology Company Building management system with artificial intelligence for unified agent based control of building subsystems
US10552914B2 (en) 2016-05-05 2020-02-04 Sensormatic Electronics, LLC Method and apparatus for evaluating risk based on sensor monitoring
CN105785881A (en) * 2016-05-07 2016-07-20 张舒维 Intelligent control system for community security monitoring
US10452963B2 (en) * 2016-05-12 2019-10-22 Google Llc Arming and/or altering a home alarm system by specified positioning of everyday objects within view of a security camera
CN107404049A (en) * 2016-05-18 2017-11-28 乐清市三口电子科技有限公司 Portable intelligent socket
EP3272107B1 (en) * 2016-05-27 2021-04-21 Titan Health & Security Technologies, Inc. Community emergency notification system with inter-organizational compatibility
US10192427B2 (en) * 2016-05-27 2019-01-29 Titan Health & Security Technologies, Inc. Community emergency notification system with inter-organizational compatibility
US9905109B2 (en) * 2016-06-02 2018-02-27 Google Llc Retroactive messaging for handling missed synchronization events
US10810676B2 (en) 2016-06-06 2020-10-20 Sensormatic Electronics, LLC Method and apparatus for increasing the density of data surrounding an event
US9997054B2 (en) 2016-06-07 2018-06-12 Ecolink Intelligent Technology, Inc. Method and apparatus for disarming a security system
DK179925B1 (en) 2016-06-12 2019-10-09 Apple Inc. User interface for managing controllable external devices
SE542124C2 (en) * 2016-06-17 2020-02-25 Irisity Ab Publ A monitoring system for security technology
US10531167B2 (en) * 2016-07-06 2020-01-07 RPH Engineering, LLC Electronic monitoring, security, and communication device assembly
JP7002526B2 (en) 2016-07-07 2022-01-20 サバント システムズ インコーポレイテッド Air gap devices, systems and methods for intelligent lighting control systems
US10957498B2 (en) * 2016-07-07 2021-03-23 Racepoint Energy, LLC Intelligent lighting control system deployment with scalable wallplate
US10522251B2 (en) 2016-07-08 2019-12-31 International Business Machines Corporation Infrared detectors and thermal tags for real-time activity monitoring
CN106251575A (en) * 2016-07-19 2016-12-21 洪明 Motion-detection alarm device
CN106037284A (en) * 2016-07-19 2016-10-26 洪明 Intelligent device for removing odor of shoe cabinet
US10488062B2 (en) 2016-07-22 2019-11-26 Ademco Inc. Geofence plus schedule for a building controller
US10764077B2 (en) * 2016-07-26 2020-09-01 RAM Laboratories, Inc. Crowd-sourced event identification that maintains source privacy
US10306403B2 (en) 2016-08-03 2019-05-28 Honeywell International Inc. Location based dynamic geo-fencing system for security
US10294600B2 (en) 2016-08-05 2019-05-21 Echostar Technologies International Corporation Remote detection of washer/dryer operation/fault condition
US10140844B2 (en) 2016-08-10 2018-11-27 Honeywell International Inc. Smart device distributed security system
CN106272717A (en) * 2016-08-17 2017-01-04 夏林达 Timber-processing device
US10049515B2 (en) * 2016-08-24 2018-08-14 Echostar Technologies International Corporation Trusted user identification and management for home automation systems
US20180061220A1 (en) * 2016-08-24 2018-03-01 Echostar Technologies L.L.C. Systems and methods for suppressing unwanted home automation notifications
AU2017100209A4 (en) * 2016-09-09 2017-04-06 Vision Intelligence Pty Ltd Monitoring device
CN106442880B (en) * 2016-09-12 2018-05-25 河北地质大学 Cloud-computing-based health early-warning device
CN106292609A (en) * 2016-09-27 2017-01-04 合肥海诺恒信息科技有限公司 Zigbee-based remote home security control system
CN106200540A (en) * 2016-09-27 2016-12-07 合肥海诺恒信息科技有限公司 Internet-of-Things-based remote home security control system
CN106406172A (en) * 2016-09-27 2017-02-15 合肥海诺恒信息科技有限公司 Household remote security protection monitoring system
CN106354058A (en) * 2016-09-27 2017-01-25 合肥海诺恒信息科技有限公司 Smart-home-based visual security system
CN106371414A (en) * 2016-09-27 2017-02-01 合肥海诺恒信息科技有限公司 Remote control-based intelligent security protection management system
JP6485428B2 (en) * 2016-10-06 2019-03-20 住友電気工業株式会社 Management system, management apparatus, management method, and management program
US9972195B2 (en) * 2016-10-07 2018-05-15 Vivint, Inc. False alarm reduction
US10376186B2 (en) 2016-10-18 2019-08-13 International Business Machines Corporation Thermal tags for real-time activity monitoring and methods for fabricating the same
US10528725B2 (en) * 2016-11-04 2020-01-07 Microsoft Technology Licensing, Llc IoT security service
CN106652330A (en) * 2016-11-16 2017-05-10 深圳市元征科技股份有限公司 Monitoring method based on human-health and environmental monitoring equipment, and related equipment
WO2018098301A1 (en) 2016-11-23 2018-05-31 Abraham Joseph Kinney Detection of authorized user presence and handling of unauthenticated monitoring system commands
GB2579161B (en) * 2016-12-01 2020-12-02 Sensyne Health Group Ltd Method and apparatus for improving energy efficiency of sensing technology
US10902712B2 (en) * 2016-12-15 2021-01-26 Fabio D'angelo Surveillance device
US20180181094A1 (en) * 2016-12-23 2018-06-28 Centurylink Intellectual Property Llc Smart Home, Building, or Customer Premises Apparatus, System, and Method
US11069381B1 (en) * 2016-12-27 2021-07-20 Amazon Technologies, Inc. Automated sensor data retention
US10684033B2 (en) 2017-01-06 2020-06-16 Johnson Controls Technology Company HVAC system with automated device pairing
US10791962B2 (en) * 2017-01-26 2020-10-06 Hamilton Sundstrand Corporation Insulating a protective cover for a seal to sensor associated with a spacesuit
US10687125B2 (en) * 2017-01-27 2020-06-16 Danielle Marie MICHEL Heat and temperature monitoring device
EP3580999B1 (en) 2017-02-07 2021-03-17 Lutron Technology Company LLC Audio-based load control system
US11900287B2 (en) 2017-05-25 2024-02-13 Johnson Controls Tyco IP Holdings LLP Model predictive maintenance system with budgetary constraints
US20190361412A1 (en) 2017-02-10 2019-11-28 Johnson Controls Technology Company Building smart entity system with agent based data ingestion and entity creation using time series data
US11360447B2 (en) 2017-02-10 2022-06-14 Johnson Controls Technology Company Building smart entity system with agent based communication and control
US10854194B2 (en) 2017-02-10 2020-12-01 Johnson Controls Technology Company Building system with digital twin based data ingestion and processing
US11307538B2 (en) 2017-02-10 2022-04-19 Johnson Controls Technology Company Web services platform with cloud-based feedback control
US11764991B2 (en) 2017-02-10 2023-09-19 Johnson Controls Technology Company Building management system with identity management
US10515098B2 (en) 2017-02-10 2019-12-24 Johnson Controls Technology Company Building management smart entity creation and maintenance using time series data
US10452043B2 (en) * 2017-02-10 2019-10-22 Johnson Controls Technology Company Building management system with nested stream generation
US10095756B2 (en) 2017-02-10 2018-10-09 Johnson Controls Technology Company Building management system with declarative views of timeseries data
WO2018152249A1 (en) 2017-02-16 2018-08-23 View, Inc. Solar power dynamic glass for heating and cooling buildings
US10506926B2 (en) 2017-02-18 2019-12-17 Arc Devices Limited Multi-vital sign detector in an electronic medical records system
US10492684B2 (en) 2017-02-21 2019-12-03 Arc Devices Limited Multi-vital-sign smartphone system in an electronic medical records system
US10929483B2 (en) * 2017-03-01 2021-02-23 xAd, Inc. System and method for characterizing mobile entities based on mobile device signals
US10332515B2 (en) * 2017-03-14 2019-06-25 Google Llc Query endpointing based on lip detection
CN110612747B (en) 2017-03-15 2022-11-29 开利公司 Wireless event notification system
US11042144B2 (en) 2017-03-24 2021-06-22 Johnson Controls Technology Company Building management system with dynamic channel communication
US10418813B1 (en) 2017-04-01 2019-09-17 Smart Power Partners LLC Modular power adapters and methods of implementing modular power adapters
US10727731B1 (en) 2017-04-01 2020-07-28 Smart Power Partners, LLC Power adapters adapted to receive a module and methods of implementing power adapters with modules
US10996645B1 (en) 2017-04-01 2021-05-04 Smart Power Partners LLC Modular power adapters and methods of implementing modular power adapters
US10317102B2 (en) 2017-04-18 2019-06-11 Ademco Inc. Geofencing for thermostatic control
US10984640B2 (en) * 2017-04-20 2021-04-20 Amazon Technologies, Inc. Automatic adjusting of day-night sensitivity for motion detection in audio/video recording and communication devices
US10868857B2 (en) 2017-04-21 2020-12-15 Johnson Controls Technology Company Building management system with distributed data collection and gateway services
CN107196756B (en) * 2017-04-25 2021-01-15 努比亚技术有限公司 WIFI password generation method and mobile terminal
US10788229B2 (en) 2017-05-10 2020-09-29 Johnson Controls Technology Company Building management system with a distributed blockchain database
US10380852B2 (en) * 2017-05-12 2019-08-13 Google Llc Systems, methods, and devices for activity monitoring via a home assistant
US10962942B2 (en) * 2017-06-04 2021-03-30 Apple Inc. Presence triggered notifications and actions
US20180357870A1 (en) * 2017-06-07 2018-12-13 Amazon Technologies, Inc. Behavior-aware security systems and associated methods
US11022947B2 (en) 2017-06-07 2021-06-01 Johnson Controls Technology Company Building energy optimization system with economic load demand response (ELDR) optimization and ELDR user interfaces
US10739028B2 (en) * 2017-06-09 2020-08-11 Johnson Controls Technology Company Thermostat with efficient wireless data transmission
US10333810B2 (en) 2017-06-09 2019-06-25 Johnson Controls Technology Company Control system with asynchronous wireless data transmission
EP3655826A1 (en) 2017-07-17 2020-05-27 Johnson Controls Technology Company Systems and methods for agent based building simulation for optimal control
US11422516B2 (en) 2017-07-21 2022-08-23 Johnson Controls Tyco IP Holdings LLP Building management system with dynamic rules with sub-rule reuse and equation driven smart diagnostics
US20190034735A1 (en) * 2017-07-25 2019-01-31 Motionloft, Inc. Object detection sensors and systems
JP7142698B2 (en) * 2017-07-25 2022-09-27 シックスス エナジー テクノロジーズ プライベート リミテッド Integrated device based on the Internet of Things (IoT) for monitoring and controlling events in the environment
WO2019020769A1 (en) * 2017-07-26 2019-01-31 Tyco Fire & Security Gmbh Security system using tiered analysis
US11182047B2 (en) 2017-07-27 2021-11-23 Johnson Controls Technology Company Building management system with fault detection and diagnostics visualization
CN107454593A (en) * 2017-08-01 2017-12-08 中国联合网络通信集团有限公司 Home monitoring method, home monitoring system, and network communication system
US10701531B2 (en) * 2017-08-09 2020-06-30 Qualcomm Incorporated Environmental sensing with wireless communication devices
US10602987B2 (en) 2017-08-10 2020-03-31 Arc Devices Limited Multi-vital-sign smartphone system in an electronic medical records system
US10942196B2 (en) * 2017-08-14 2021-03-09 Google Llc Systems and methods of motion detection using dynamic thresholds and data filtering
US11004567B2 (en) 2017-08-15 2021-05-11 Koko Home, Inc. System and method for processing wireless backscattered signal using artificial intelligence processing for activities of daily life
US11204435B1 (en) 2017-09-12 2021-12-21 Safehub Inc. Methods and systems for measuring and analyzing building dynamics
CN107610433A (en) * 2017-09-13 2018-01-19 三石量子(苏州)信息科技有限公司 Safe-city emergency call system
US10559180B2 (en) 2017-09-27 2020-02-11 Johnson Controls Technology Company Building risk analysis system with dynamic modification of asset-threat weights
US11768826B2 (en) 2017-09-27 2023-09-26 Johnson Controls Tyco IP Holdings LLP Web services for creation and maintenance of smart entities for connected devices
US10962945B2 (en) 2017-09-27 2021-03-30 Johnson Controls Technology Company Building management system with integration of data into smart entities
US11314788B2 (en) 2017-09-27 2022-04-26 Johnson Controls Tyco IP Holdings LLP Smart entity management for building management systems
DE112018004325T5 (en) 2017-09-27 2020-05-14 Johnson Controls Technology Company SYSTEMS AND METHODS FOR RISK ANALYSIS
CN107566809B (en) * 2017-09-30 2020-04-03 浙江大华技术股份有限公司 Video monitoring alarm method, device and system
US20190108404A1 (en) * 2017-10-10 2019-04-11 Weixin Xu Consumer Camera System Design for Globally Optimized Recognition
US20190108405A1 (en) * 2017-10-10 2019-04-11 Weixin Xu Globally optimized recognition system and service design, from sensing to recognition
WO2019075431A1 (en) * 2017-10-13 2019-04-18 Honeywell International, Inc. Energy theft detection device
CN107734303B (en) * 2017-10-30 2021-10-26 北京小米移动软件有限公司 Video identification method and device
CN108052022A (en) * 2017-11-03 2018-05-18 傅峰峰 Dual-mode-based intelligent door-opening method, electronic device, storage medium, and system
US11281169B2 (en) 2017-11-15 2022-03-22 Johnson Controls Tyco IP Holdings LLP Building management system with point virtualization for online meters
US10809682B2 (en) 2017-11-15 2020-10-20 Johnson Controls Technology Company Building management system with optimized processing of building system data
US10760804B2 (en) 2017-11-21 2020-09-01 Emerson Climate Technologies, Inc. Humidifier control systems and methods
US11127235B2 (en) 2017-11-22 2021-09-21 Johnson Controls Tyco IP Holdings LLP Building campus with integrated smart environment
CN107995267A (en) * 2017-11-23 2018-05-04 苏州大成电子科技有限公司 Cloud-computing-based environment detection method
CN107919121B (en) * 2017-11-24 2021-06-01 江西科技师范大学 Control method and device of intelligent household equipment, storage medium and computer equipment
US11538257B2 (en) 2017-12-08 2022-12-27 Gatekeeper Inc. Detection, counting and identification of occupants in vehicles
EP3734951A4 (en) * 2017-12-28 2021-01-13 Panasonic Intellectual Property Corporation of America Display control method, information processing server and display terminal
CN108073095B (en) * 2017-12-29 2023-09-29 湘能楚天电力科技有限公司 Multifunctional intelligent robot for power distribution room
US11073838B2 (en) 2018-01-06 2021-07-27 Drivent Llc Self-driving vehicle systems and methods
KR20190085627A (en) * 2018-01-11 2019-07-19 삼성전자주식회사 Method for providing notification and electronic device for supporting the same
US10311689B1 (en) * 2018-01-18 2019-06-04 Ademco Inc. Systems and methods for protecting a bypassed zone in a security system or a connected home system
US20190238358A1 (en) 2018-02-01 2019-08-01 Bby Solutions, Inc. Automatic device orchestration and configuration
WO2019157100A1 (en) 2018-02-07 2019-08-15 Johnson Controls Technology Company Building access control system with complex event processing
WO2019157104A1 (en) 2018-02-07 2019-08-15 Johnson Controls Technology Company Building access control system with spatial modeling
US11048247B2 (en) 2018-02-08 2021-06-29 Johnson Controls Technology Company Building management system to detect anomalousness with temporal profile
US10943463B1 (en) * 2018-02-19 2021-03-09 Agape Grace Clark Technologies for assistance and security services
US20190306468A1 (en) * 2018-04-02 2019-10-03 Mars Semiconductor Corp. Wireless monitoring system and power saving method of wireless monitor
US11486593B2 (en) 2018-04-20 2022-11-01 Emerson Climate Technologies, Inc. Systems and methods with variable mitigation thresholds
US11371726B2 (en) 2018-04-20 2022-06-28 Emerson Climate Technologies, Inc. Particulate-matter-size-based fan control system
US11421901B2 (en) 2018-04-20 2022-08-23 Emerson Climate Technologies, Inc. Coordinated control of standalone and building indoor air quality devices and systems
WO2019204779A1 (en) 2018-04-20 2019-10-24 Emerson Climate Technologies, Inc. Indoor air quality and occupant monitoring systems and methods
WO2019204790A1 (en) 2018-04-20 2019-10-24 Emerson Climate Technologies, Inc. Systems and methods with variable mitigation thresholds
US10636282B2 (en) 2018-04-26 2020-04-28 International Business Machines Corporation Security system with cooperative behavior
WO2019209358A1 (en) * 2018-04-26 2019-10-31 Google Llc Systems and methods of power-management on smart devices
US20190333352A1 (en) * 2018-04-29 2019-10-31 Aaron Escobar Security camera mounted in ornamental fish
DE102018110423A1 (en) 2018-05-01 2019-11-07 Tobias Rückert Method for registering a target device with a network
CN117376505A (en) * 2018-05-07 2024-01-09 苹果公司 User interface for viewing live video feeds and recording video
US10485431B1 (en) 2018-05-21 2019-11-26 ARC Devices Ltd. Glucose multi-vital-sign system in an electronic medical records system
US20190362609A1 (en) * 2018-05-24 2019-11-28 Carrier Corporation Notification system
US11053729B2 (en) * 2018-06-29 2021-07-06 Overhead Door Corporation Door system and method with early warning sensors
WO2020028206A1 (en) * 2018-07-30 2020-02-06 Drivent Llc Self-driving vehicle systems and methods
US11080877B2 (en) 2018-08-02 2021-08-03 Matthew B. Schoen Systems and methods of measuring an object in a scene of a captured image
US10667081B2 (en) 2018-08-03 2020-05-26 International Business Machines Corporation Providing location-based services using geo-fencing tracking techniques
US10847018B2 (en) 2018-08-31 2020-11-24 Nortek Security & Control Llc Community-based security system
CN109448312A (en) * 2018-09-06 2019-03-08 上海电器科学研究所(集团)有限公司 Intelligent medical-alert linkage alarm system
US11644833B2 (en) 2018-10-01 2023-05-09 Drivent Llc Self-driving vehicle systems and methods
US11221622B2 (en) 2019-03-21 2022-01-11 Drivent Llc Self-driving vehicle systems and methods
CN112997223A (en) 2018-10-29 2021-06-18 赫克斯冈技术中心 Facility monitoring system and method
CN111127823A (en) * 2018-10-31 2020-05-08 杭州海康威视数字技术股份有限公司 Alarm method and device and video monitoring equipment
CN109326078A (en) * 2018-11-09 2019-02-12 合肥景彰科技有限公司 Intelligent appliance control system with fire warning function
WO2020094298A1 (en) * 2018-11-09 2020-05-14 Arcelik Anonim Sirketi Household appliance
US20200162280A1 (en) 2018-11-19 2020-05-21 Johnson Controls Technology Company Building system with performance identification through equipment exercising and entity relationships
JP7012875B2 (en) * 2018-12-03 2022-01-28 三菱電機株式会社 Equipment control device and equipment control method
CN109493555A (en) * 2018-12-05 2019-03-19 郑州升达经贸管理学院 Campus dormitory security monitoring system based on intelligent monitoring technology
US10718996B2 (en) 2018-12-19 2020-07-21 Arlo Technologies, Inc. Modular camera system
US10965484B2 (en) * 2018-12-21 2021-03-30 Opendoor Labs Inc. Fleet of home electronic systems
CN109600587A (en) * 2019-01-02 2019-04-09 上海理工大学 Remote multi-point synchronous image capture device and acquisition method
US11436567B2 (en) 2019-01-18 2022-09-06 Johnson Controls Tyco IP Holdings LLP Conference room management system
EP3686856A1 (en) * 2019-01-23 2020-07-29 Howe, Anthony Richard Voice and/or image recording and transmission system
US10788798B2 (en) 2019-01-28 2020-09-29 Johnson Controls Technology Company Building management system with hybrid edge-cloud processing
GB2585998B (en) * 2019-02-13 2021-07-28 Total Waste Solutions Ltd A detector
US11238724B2 (en) 2019-02-15 2022-02-01 Ademco Inc. Systems and methods for automatically activating self-test devices of sensors of a security system
US11626010B2 (en) * 2019-02-28 2023-04-11 Nortek Security & Control Llc Dynamic partition of a security system
GR20190100129A (en) * 2019-03-22 2020-10-14 Ελευθερια Νικολαου Κατσιρη Device for measuring atmospheric air elements
JP2020155002A (en) * 2019-03-22 2020-09-24 株式会社ゼンリンデータコム Terminal, server device, method and program
US11100767B1 (en) * 2019-03-26 2021-08-24 Halo Wearables, Llc Group management for electronic devices
CN110083070B (en) * 2019-04-24 2020-11-27 珠海格力电器股份有限公司 Equipment control method and device, air purifier and storage medium
KR102224880B1 (en) * 2019-05-02 2021-03-09 주식회사 허밍비 Interaction system with a person using a smart toy
US11363071B2 (en) 2019-05-31 2022-06-14 Apple Inc. User interfaces for managing a local network
US10904029B2 (en) 2019-05-31 2021-01-26 Apple Inc. User interfaces for managing controllable external devices
CN110296923B (en) * 2019-06-20 2021-11-02 山东省水利勘测设计院 Hydraulic engineering seepage monitoring system and method
US11138858B1 (en) * 2019-06-27 2021-10-05 Amazon Technologies, Inc. Event-detection confirmation by voice user interface
US11232921B1 (en) 2019-06-30 2022-01-25 Smart Power Partners LLC Power adapter having separate manual and electrical user interfaces
US11460874B1 (en) 2019-06-30 2022-10-04 Smart Power Partners LLC In-wall power adapter configured to control the application of power to a load
US10938168B2 (en) 2019-06-30 2021-03-02 Smart Power Partners LLC In-wall power adapter and method of controlling the application of power to a load
US11189948B1 (en) 2019-06-30 2021-11-30 Smart Power Partners LLC Power adapter and method of implementing a power adapter to provide power to a load
US11264769B1 (en) 2019-06-30 2022-03-01 Smart Power Partners LLC Power adapter having contact elements in a recess and method of controlling a power adapter
US11579640B1 (en) 2019-06-30 2023-02-14 Smart Power Partners LLC Control attachment for an in-wall power adapter
US10917956B1 (en) 2019-06-30 2021-02-09 Smart Power Partners LLC Control attachment configured to provide power to a load and method of configuring a control attachment
US10958026B1 (en) 2019-06-30 2021-03-23 Smart Power Partners LLC Contactless thermometer for an in-wall power adapter
US11231730B1 (en) 2019-06-30 2022-01-25 Smart Power Partners LLC Control attachment for a power adapter configured to control power applied to a load
US11201444B1 (en) 2019-06-30 2021-12-14 Smart Power Partners LLC Power adapter having contact elements in a recess and method of controlling a power adapter
US11043768B1 (en) 2019-06-30 2021-06-22 Smart Power Partners LLC Power adapter configured to provide power to a load and method of implementing a power adapter
US10965068B1 (en) 2019-06-30 2021-03-30 Smart Power Partners LLC In-wall power adapter having an outlet and method of controlling an in-wall power adapter
US10958020B1 (en) 2019-06-30 2021-03-23 Smart Power Partners LLC Control attachment for an in-wall power adapter and method of controlling an in-wall power adapter
US10867193B1 (en) 2019-07-10 2020-12-15 Gatekeeper Security, Inc. Imaging systems for facial detection, license plate reading, vehicle overview and vehicle make, model, and color detection
KR102235981B1 (en) * 2019-07-15 2021-04-07 주식회사 에버시스템 Fire detection system and method
US11407381B2 (en) 2019-08-07 2022-08-09 Keep Technologies, Inc. Multi-device vehicle intrusion detection
CN110489076B (en) * 2019-08-22 2024-03-01 百度在线网络技术(北京)有限公司 Environment sound monitoring method and device and electronic equipment
TW202109471A (en) 2019-08-29 2021-03-01 研能科技股份有限公司 Monitor and gas detection information notification system
CN112444600A (en) * 2019-08-29 2021-03-05 研能科技股份有限公司 Monitoring and gas detection information reporting system
US11323883B2 (en) * 2019-09-30 2022-05-03 Inlecom Systems Limited Pattern driven selective sensor authentication for internet of things
US11719804B2 (en) 2019-09-30 2023-08-08 Koko Home, Inc. System and method for determining user activities using artificial intelligence processing
US11196965B2 (en) 2019-10-25 2021-12-07 Gatekeeper Security, Inc. Image artifact mitigation in scanners for entry control systems
KR102101351B1 (en) * 2019-11-05 2020-04-20 최경진 Improved driving device supporting air conditioning
CN110687822A (en) * 2019-11-18 2020-01-14 上海三菱电机·上菱空调机电器有限公司 Intelligent household system
CN110866016B (en) * 2019-11-26 2022-11-01 青岛华节鼎孚节能科技有限公司 Hydraulic engineering monitoring method and device based on multi-sensor technology and electronic equipment
US11533457B2 (en) 2019-11-27 2022-12-20 Aob Products Company Smart home and security system
TWI705845B (en) * 2019-12-26 2020-10-01 台灣新光保全股份有限公司 Security host
US20210200164A1 (en) 2019-12-31 2021-07-01 Johnson Controls Technology Company Building data platform with edge based event enrichment
US11894944B2 (en) 2019-12-31 2024-02-06 Johnson Controls Tyco IP Holdings LLP Building data platform with an enrichment loop
US11355004B2 (en) 2020-01-14 2022-06-07 Google Llc Systems and methods of security system access and sharing temporal event-based notifications and access to devices of designated persons
EP3885926A1 (en) 2020-03-25 2021-09-29 Carrier Corporation Fire protection system
US11240635B1 (en) * 2020-04-03 2022-02-01 Koko Home, Inc. System and method for processing using multi-core processors, signals, and AI processors from multiple sources to create a spatial map of selected region
US11537386B2 (en) 2020-04-06 2022-12-27 Johnson Controls Tyco IP Holdings LLP Building system with dynamic configuration of network resources for 5G networks
US11184738B1 (en) 2020-04-10 2021-11-23 Koko Home, Inc. System and method for processing using multi core processors, signals, and AI processors from multiple sources to create a spatial heat map of selected region
KR102218835B1 (en) * 2020-04-14 2021-02-23 주식회사 세리공영 Life safety-net construction and management system
US10939273B1 (en) * 2020-04-14 2021-03-02 Soter Technologies, Llc Systems and methods for notifying particular devices based on estimated distance
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
KR102292053B1 (en) 2020-05-15 2021-08-23 엘지전자 주식회사 Display apparatus
WO2021247300A1 (en) 2020-06-01 2021-12-09 Arc Devices Limited Apparatus and methods for measuring blood pressure and other vital signs via a finger
US11657614B2 (en) 2020-06-03 2023-05-23 Apple Inc. Camera and visitor user interfaces
US11589010B2 (en) 2020-06-03 2023-02-21 Apple Inc. Camera and visitor user interfaces
US11874809B2 (en) 2020-06-08 2024-01-16 Johnson Controls Tyco IP Holdings LLP Building system with naming schema encoding entity type and entity relationships
EP3926248A1 (en) * 2020-06-19 2021-12-22 Leapcraft ApS System and method for maintaining air quality within a structure
US11704683B1 (en) 2020-07-09 2023-07-18 Amdocs Development Limited Machine learning system, method, and computer program for household marketing segmentation
US11563858B1 (en) * 2020-07-09 2023-01-24 Amdocs Development Limited System, method, and computer program for generating insights from home network router data
US11605027B2 (en) 2020-07-09 2023-03-14 Amdocs Development Limited Machine learning system, method, and computer program for inferring user presence in a residential space
CN111856962A (en) * 2020-08-13 2020-10-30 郑州智利信信息技术有限公司 Intelligent home control system based on cloud computing
US11881093B2 (en) 2020-08-20 2024-01-23 Denso International America, Inc. Systems and methods for identifying smoking in vehicles
US11813926B2 (en) 2020-08-20 2023-11-14 Denso International America, Inc. Binding agent and olfaction sensor
US11636870B2 (en) 2020-08-20 2023-04-25 Denso International America, Inc. Smoking cessation systems and methods
US11828210B2 (en) 2020-08-20 2023-11-28 Denso International America, Inc. Diagnostic systems and methods of vehicles using olfaction
US11760170B2 (en) 2020-08-20 2023-09-19 Denso International America, Inc. Olfaction sensor preservation systems and methods
US11760169B2 (en) 2020-08-20 2023-09-19 Denso International America, Inc. Particulate control systems and methods for olfaction sensors
CN116195261A (en) 2020-09-05 2023-05-30 苹果公司 User interface for managing audio of media items
CN112070027B (en) * 2020-09-09 2022-08-26 腾讯科技(深圳)有限公司 Network training and action recognition method, device, equipment and storage medium
US11397773B2 (en) 2020-09-30 2022-07-26 Johnson Controls Tyco IP Holdings LLP Building management system with semantic model integration
US20220138362A1 (en) 2020-10-30 2022-05-05 Johnson Controls Technology Company Building management system with configuration by building model augmentation
CN112381442B (en) * 2020-11-25 2021-07-20 兰和科技(深圳)有限公司 Campus security information management system based on artificial intelligence technology
CN112562304B (en) * 2020-11-26 2022-02-18 英博超算(南京)科技有限公司 Interaction system between an application layer and sensor data
CN113012386A (en) * 2020-12-25 2021-06-22 贵州北斗空间信息技术有限公司 Multi-level linked rapid push method for security alarms
US11647314B2 (en) 2021-01-26 2023-05-09 Timothy E. Felks Methods, devices, and systems for impact detection and reporting for structure envelopes
US11583770B2 (en) 2021-03-01 2023-02-21 Lghorizon, Llc Systems and methods for machine learning-based emergency egress and advisement
US11921481B2 (en) 2021-03-17 2024-03-05 Johnson Controls Tyco IP Holdings LLP Systems and methods for determining equipment energy waste
US11533709B2 (en) 2021-03-18 2022-12-20 Motorola Solutions, Inc. Device, system and method for transmitting notifications based on indications of effectiveness for previous notifications
US20220335812A1 (en) * 2021-04-15 2022-10-20 Honeywell International Inc. Live status notification of response personnel actions to building owner/operators
CA3157001A1 (en) * 2021-05-04 2022-11-04 Robbie Restoration Technologies Inc. Portable controller for drying equipment and related system and method
US20230007746A1 (en) * 2021-05-26 2023-01-05 Corsair Memory, Inc. Single rail control system with receiver modules
US11785012B2 (en) 2021-06-07 2023-10-10 Bank Of America Corporation Data processing for internet of things (IoT) devices based on recorded user behavior
US11769066B2 (en) 2021-11-17 2023-09-26 Johnson Controls Tyco IP Holdings LLP Building data platform with digital twin triggers and actions
US11899723B2 (en) 2021-06-22 2024-02-13 Johnson Controls Tyco IP Holdings LLP Building data platform with context based twin function processing
US11626002B2 (en) 2021-07-15 2023-04-11 Lghorizon, Llc Building security and emergency detection and advisement system
EP4131196A1 (en) * 2021-08-04 2023-02-08 Koninklijke Philips N.V. Monitoring system and method
US11796974B2 (en) 2021-11-16 2023-10-24 Johnson Controls Tyco IP Holdings LLP Building data platform with schema extensibility for properties and tags of a digital twin
US11704311B2 (en) 2021-11-24 2023-07-18 Johnson Controls Tyco IP Holdings LLP Building data platform with a distributed digital twin
US11714930B2 (en) 2021-11-29 2023-08-01 Johnson Controls Tyco IP Holdings LLP Building data platform with digital twin based inferences and predictions for a graphical building model
KR102473608B1 (en) 2021-12-17 2022-12-01 강수연 Method of providing sports training video using camera and radar sensor
US11823540B2 (en) 2021-12-30 2023-11-21 The Adt Security Corporation Determining areas of interest in video based at least on a user's interactions with the video
US11776382B1 (en) 2022-12-23 2023-10-03 The Adt Security Corporation Premises security system using ultra-wideband (UWB) functionality to initiate at least one action

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5956716A (en) * 1995-06-07 1999-09-21 Intervu, Inc. System and method for delivery of video data over a computer network
US20050162268A1 (en) * 2003-11-18 2005-07-28 Intergraph Software Technologies Company Digital video surveillance
US20070217765A1 (en) * 2006-03-09 2007-09-20 Masaya Itoh Method and its application for video recorder and player
US20080303903A1 (en) * 2003-12-02 2008-12-11 Connexed Technologies Inc. Networked video surveillance system
US20100139290A1 (en) * 2008-05-23 2010-06-10 Leblond Raymond G Enclosure for surveillance hardware
US20130110565A1 (en) * 2011-04-25 2013-05-02 Transparency Sciences, Llc System, Method and Computer Program Product for Distributed User Activity Management
US20130235209A1 (en) * 2012-03-09 2013-09-12 Industrial Technology Research Institute System and method for dispatching video recording

Family Cites Families (123)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE8811566U1 (en) * 1988-09-13 1988-10-27 Honeywell Regelsysteme Gmbh
US5604384A (en) * 1993-02-08 1997-02-18 Winner International Royalty Corporation Anti-theft device for motor vehicle
US5463595A (en) * 1993-10-13 1995-10-31 Rodhall; Arne Portable security system for outdoor sites
US5381950A (en) * 1993-10-20 1995-01-17 American Standard Inc. Zone sensor or thermostat with forced air
US6141412A (en) 1994-06-01 2000-10-31 Davox Corporation Unscheduled event task processing system
US5850180A (en) * 1994-09-09 1998-12-15 Tattletale Portable Alarm Systems, Inc. Portable alarm system
JPH09123732A (en) * 1995-11-01 1997-05-13 Zexel Corp Air conditioner for automobile
US20030025599A1 (en) 2001-05-11 2003-02-06 Monroe David A. Method and apparatus for collecting, sending, archiving and retrieving motion video and still images and notification of detected events
US6970183B1 (en) * 2000-06-14 2005-11-29 E-Watch, Inc. Multimedia surveillance and monitoring system including network configuration
US6973166B1 (en) 1999-07-15 2005-12-06 Tsumpes William J Automated parallel and redundant subscriber contact and event notification system
US7015806B2 (en) * 1999-07-20 2006-03-21 @Security Broadband Corporation Distributed monitoring for a video security system
US7216145B2 (en) 2000-06-23 2007-05-08 Mission Communications, Llc Event notification system
WO2002005073A1 (en) * 2000-07-07 2002-01-17 Fujitsu Limited Password changing method, computer system, and computer-readable recording medium on which a program is stored
US20080303897A1 (en) * 2000-12-22 2008-12-11 Terahop Networks, Inc. Visually capturing and monitoring contents and events of cargo container
US20020147006A1 (en) * 2001-04-09 2002-10-10 Coon Bradley S. Proximity-based control of building functions
US6661340B1 (en) 2001-04-24 2003-12-09 Microstrategy Incorporated System and method for connecting security systems to a wireless device
US6782351B2 (en) * 2001-09-11 2004-08-24 Purechoice, Inc. Air quality monitoring and space management system coupled to a private communications network
US6693530B1 (en) 2001-10-16 2004-02-17 At&T Corp. Home security administration platform
JP4050925B2 (en) 2002-04-04 2008-02-20 シチズン電子株式会社 Imaging module and imaging apparatus
US6819559B1 (en) * 2002-05-06 2004-11-16 Apple Computer, Inc. Method and apparatus for controlling the temperature of electronic device enclosures
US8581688B2 (en) * 2002-06-11 2013-11-12 Intelligent Technologies International, Inc. Coastal monitoring techniques
US7026926B1 (en) * 2002-08-15 2006-04-11 Walker Iii Ethan A System and method for wireless transmission of security alarms to selected groups
US7479877B2 (en) * 2002-09-17 2009-01-20 Commerceguard Ab Method and system for utilizing multiple sensors for monitoring container security, contents and condition
US20040223054A1 (en) * 2003-05-06 2004-11-11 Rotholtz Ben Aaron Multi-purpose video surveillance
EP1623526A4 (en) 2003-05-15 2010-07-21 Commerceguard Ab Method and system for utilizing multiple sensors for monitoring container security, contents and condition
US7098788B2 (en) 2003-07-10 2006-08-29 University Of Florida Research Foundation, Inc. Remote surveillance and assisted care using a mobile communication device
US7619512B2 (en) 2006-10-02 2009-11-17 Alarm.Com System and method for alarm signaling during alarm system destruction
US6978631B2 (en) * 2003-10-24 2005-12-27 Fuller Andrew C Dehumidification system
US20070153993A1 (en) 2004-02-02 2007-07-05 Mobile Reach Media Inc. Monitoring method and system
US20050216302A1 (en) 2004-03-16 2005-09-29 Icontrol Networks, Inc. Business method for premises management
WO2006053185A2 (en) * 2004-11-10 2006-05-18 Bae Systems Information And Electronic Systems Integration Inc. Wearable portable device for establishing communications interoperability at an incident site
US8750509B2 (en) * 2004-09-23 2014-06-10 Smartvue Corporation Wireless surveillance system releasably mountable to track lighting
US20060158336A1 (en) * 2005-01-03 2006-07-20 Nourbakhsh Illah R Home and home occupant remote monitoring and communication system
US20080088428A1 (en) * 2005-03-10 2008-04-17 Brian Pitre Dynamic Emergency Notification and Intelligence System
US20060291657A1 (en) * 2005-05-03 2006-12-28 Greg Benson Trusted monitoring system and method
US20060250236A1 (en) * 2005-05-04 2006-11-09 Ackley Donald E Pod-based wireless sensor system
US20070004971A1 (en) * 2005-05-27 2007-01-04 Hill-Rom Services, Inc. Caregiver communication system for a home environment
US8880047B2 (en) 2005-08-03 2014-11-04 Jeffrey C. Konicek Realtime, location-based cell phone enhancements, uses, and applications
US20070037561A1 (en) * 2005-08-10 2007-02-15 Bowen Blake A Method for intelligently dialing contact numbers for a person using user-defined smart rules
US8400607B2 (en) 2005-10-11 2013-03-19 Barco N.V. Display assemblies and methods of display
US8254893B2 (en) 2005-11-17 2012-08-28 Nitesh Ratnakar System and method for automatically downloading and storing contact information to a personal communication device based on a geographical position of the personal communication device
US7443304B2 (en) 2005-12-09 2008-10-28 Honeywell International Inc. Method and system for monitoring a patient in a premises
US20070139192A1 (en) * 2005-12-21 2007-06-21 Wimberly Michael R Sensor unit having a network video camera
US7746794B2 (en) * 2006-02-22 2010-06-29 Federal Signal Corporation Integrated municipal management console
US7616095B2 (en) * 2006-02-23 2009-11-10 Rockwell Automation Technologies, Inc. Electronic token to provide sequential event control and monitoring
DE102006010946B3 (en) * 2006-03-04 2007-06-21 Stiftung Alfred-Wegener-Institut für Polar- und Meeresforschung Stiftung des öffentlichen Rechts Outside air temperature measuring apparatus with device for reducing false measurement due to strong winds
US9931571B2 (en) * 2006-03-17 2018-04-03 Nintendo Co., Ltd. Systems, methods and techniques for safely and effectively coordinating video game play and other activities among multiple remote networked friends and rivals
US20070268121A1 (en) * 2006-05-18 2007-11-22 Daryush Vasefi On-line portal system and method for management of devices and services
US20070290830A1 (en) * 2006-06-15 2007-12-20 Phase Iv Partners, Inc. Remotely monitored security system
WO2008004251A2 (en) * 2006-07-03 2008-01-10 Tanla Solutions Limited Home security system using an ad-hoc wireless mesh and method thereof
EP2057610A2 (en) * 2006-08-11 2009-05-13 Trident Security Concepts, LLC Self-contained security system
US8289430B2 (en) * 2007-02-09 2012-10-16 Gentex Corporation High dynamic range imaging device
US20090027196A1 (en) * 2007-03-07 2009-01-29 Roland Schoettle System and method for premises monitoring and control using self-learning detection devices
JP5475218B2 (en) * 2007-03-15 2014-04-16 カルソニックカンセイ株式会社 Compound sensor
US7911334B2 (en) 2007-04-19 2011-03-22 Andrew Busey Electronic personal alert system
US9497582B2 (en) 2007-06-11 2016-11-15 Broadcom Corporation Smart phone to home gateway/STB data exchange for content delivery
WO2009012289A1 (en) * 2007-07-16 2009-01-22 Cernium Corporation Apparatus and methods for video alarm verification
US20090027495A1 (en) 2007-07-25 2009-01-29 Stas Oskin Internet visual surveillance and management technology for telecommunications, Internet, cellular and other communications companies
US8018337B2 (en) 2007-08-03 2011-09-13 Fireear Inc. Emergency notification device and system
US8036679B1 (en) 2007-10-03 2011-10-11 University of South Florida Optimizing performance of location-aware applications using state machines
US8456293B1 (en) * 2007-10-22 2013-06-04 Alarm.Com Incorporated Providing electronic content based on sensor data
US8204273B2 (en) * 2007-11-29 2012-06-19 Cernium Corporation Systems and methods for analysis of video content, event notification, and video content provision
WO2009100411A2 (en) * 2008-02-08 2009-08-13 Trident Security Concepts, Llc Wireless security system
US7782199B2 (en) * 2008-05-21 2010-08-24 Michael Issokson Portable self-contained alarm system
US8063764B1 (en) * 2008-05-27 2011-11-22 Toronto Rehabilitation Institute Automated emergency detection and response
WO2010048294A1 (en) * 2008-10-21 2010-04-29 Lifescan, Inc. Multiple temperature measurements coupled with modeling
US7938336B2 (en) * 2008-11-11 2011-05-10 Emerson Electric Co. Apparatus and method for isolating a temperature sensing device in a thermostat
US8049613B2 (en) * 2008-11-26 2011-11-01 Comcast Cable Holdings, Llc Building security system
US20100203920A1 (en) * 2009-02-06 2010-08-12 Ted Walter Gregory Personal communication device with integrated alarm
US8350694B1 (en) * 2009-05-18 2013-01-08 Alarm.Com Incorporated Monitoring system to monitor a property with a mobile device with a monitoring application
US8531294B2 (en) * 2009-05-18 2013-09-10 Alarm.Com Incorporated Moving asset location tracking
US10188295B2 (en) * 2009-06-01 2019-01-29 The Curators Of The University Of Missouri Integrated sensor network methods and systems
CA2773798A1 (en) 2009-09-09 2011-03-17 Absolute Software Corporation Alert for real-time risk of theft or loss
US8520072B1 (en) 2009-10-02 2013-08-27 Alarm.Com Incorporated Video monitoring and alarm verification technology
US8675071B1 (en) * 2009-10-02 2014-03-18 Alarm.Com Incorporated Video monitoring and alarm verification technology
NL1037342C2 (en) * 2009-10-02 2011-04-05 Inventor Invest Holding B V Security system and method for protecting an area
WO2011056665A1 (en) * 2009-10-27 2011-05-12 Mobile Imagination, Llc System and method for the generation and distribution of an electronic interactive call sheet
US8493202B1 (en) * 2010-03-22 2013-07-23 Alarm.Com Alarm signaling technology
CN201716111U (en) 2010-03-23 2011-01-19 中国科学院西安光学精密机械研究所 Multiband infrared radiation automatic measuring system
US9326708B2 (en) * 2010-03-26 2016-05-03 Medtronic Minimed, Inc. Ambient temperature sensor systems and methods
US8471700B1 (en) * 2010-04-16 2013-06-25 Kontek Industries, Inc. Global positioning systems and methods for asset and infrastructure protection
US8862092B2 (en) 2010-06-25 2014-10-14 Emergensee, Inc. Emergency notification system for mobile devices
US8600008B2 (en) 2010-06-30 2013-12-03 Mark Kraus System and method of providing an emergency contact party line
US8396447B2 (en) 2010-08-27 2013-03-12 Don Reich Emergency call notification system and method
US9141150B1 (en) 2010-09-15 2015-09-22 Alarm.Com Incorporated Authentication and control interface of a security system
US8786189B2 (en) * 2010-11-18 2014-07-22 Jerrold W. Mayfield Integrated exit signs and monitoring system
WO2013058820A1 (en) * 2011-10-21 2013-04-25 Nest Labs, Inc. User-friendly, network connected learning thermostat and related systems and methods
US8395501B2 (en) * 2010-11-23 2013-03-12 Universal Security Instruments, Inc. Dynamic alarm sensitivity adjustment and auto-calibrating smoke detection for reduced resource microprocessors
US9036001B2 (en) * 2010-12-16 2015-05-19 Massachusetts Institute Of Technology Imaging system for immersive surveillance
US8624341B2 (en) 2011-01-26 2014-01-07 Maxim Integrated Products, Inc. Light sensor having IR cut and color pass interference filter integrated on-chip
US9057210B2 (en) * 2011-03-17 2015-06-16 Unikey Technologies, Inc. Wireless access control system and related methods
US9154001B2 (en) 2011-05-19 2015-10-06 Honeywell International Inc. Intuitive scheduling for energy management devices
CN102795074B (en) 2011-05-23 2014-07-30 延锋伟世通汽车电子有限公司 Cabin temperature detection device for automatic automotive air conditioning
US20120309373A1 (en) 2011-06-01 2012-12-06 Atlanta Trading & Eng Consulting Llc Proximity-Based Application Activation
US8618927B2 (en) * 2011-08-24 2013-12-31 At&T Intellectual Property I, L.P. Methods, systems, and products for notifications in security systems
US9124783B2 (en) * 2011-09-30 2015-09-01 Camiolog, Inc. Method and system for automated labeling at scale of motion-detected events in video surveillance
US9286516B2 (en) * 2011-10-20 2016-03-15 Xerox Corporation Method and systems of classifying a vehicle using motion vectors
US8849301B2 (en) 2011-11-01 2014-09-30 Digi International Inc. Location-based home services and energy management
US20130189946A1 (en) 2012-01-19 2013-07-25 Numerex Corp. Security System Alarming and Processing Based on User Location Information
US9098096B2 (en) * 2012-04-05 2015-08-04 Google Inc. Continuous intelligent-control-system update using information requests directed to user devices
US9182296B2 (en) * 2012-05-16 2015-11-10 General Electric Company Oven air sampling system
US9425978B2 (en) * 2012-06-27 2016-08-23 Ubiquiti Networks, Inc. Method and apparatus for configuring and controlling interfacing devices
US8994800B2 (en) * 2012-07-25 2015-03-31 Gopro, Inc. Credential transfer management camera system
US9247378B2 (en) * 2012-08-07 2016-01-26 Honeywell International Inc. Method for controlling an HVAC system using a proximity aware mobile device
US9208676B2 (en) 2013-03-14 2015-12-08 Google Inc. Devices, methods, and associated information processing for security in a smart-sensored home
US8630741B1 (en) * 2012-09-30 2014-01-14 Nest Labs, Inc. Automated presence detection and presence-related control within an intelligent controller
US20150245189A1 (en) 2012-10-19 2015-08-27 Srikanth Nalluri Personal safety and emergency services
US8934754B2 (en) * 2012-11-13 2015-01-13 International Business Machines Corporation Providing emergency access to surveillance video
US9049168B2 (en) 2013-01-11 2015-06-02 State Farm Mutual Automobile Insurance Company Home sensor data gathering for neighbor notification purposes
US9413827B2 (en) 2013-02-25 2016-08-09 Qualcomm Incorporated Context aware actions among heterogeneous internet of things (IOT) devices
US9035763B2 (en) * 2013-03-14 2015-05-19 Comcast Cable Communications, Llc Processing alarm signals
US9239997B2 (en) * 2013-03-15 2016-01-19 The United States Of America As Represented By The Secretary Of The Navy Remote environmental and condition monitoring system
US10304319B2 (en) 2013-04-23 2019-05-28 Canary Connect, Inc. Monitoring and security devices comprising multiple sensors
US10096003B2 (en) * 2013-05-31 2018-10-09 Javid Vahid Apparatus, methods and systems for knowledge based maintenance
US20150048927A1 (en) 2013-08-13 2015-02-19 Directed, Llc Smartphone based passive keyless entry system
US9917911B2 (en) * 2013-09-18 2018-03-13 Mivalife Mobile Technology, Inc. Security system communications management
US9858735B2 (en) 2013-12-10 2018-01-02 Ford Global Technologies, Llc User proximity detection for activating vehicle convenience functions
US20150168002A1 (en) 2013-12-18 2015-06-18 Google Inc. Systems and methods for determining or modifying a temperature program based on occupant activity
US20150173674A1 (en) 2013-12-20 2015-06-25 Diabetes Sentry Products Inc. Detecting and communicating health conditions
US20150261769A1 (en) 2014-03-14 2015-09-17 Joanne Uta Ono Local Safety Network
KR102280610B1 (en) 2014-04-24 2021-07-23 삼성전자주식회사 Method and apparatus for location estimation of electronic device
US10553098B2 (en) 2014-05-20 2020-02-04 Ooma, Inc. Appliance device integration with alarm systems
US10440499B2 (en) 2014-06-16 2019-10-08 Comcast Cable Communications, Llc User location and identity awareness

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5956716A (en) * 1995-06-07 1999-09-21 Intervu, Inc. System and method for delivery of video data over a computer network
US20050162268A1 (en) * 2003-11-18 2005-07-28 Intergraph Software Technologies Company Digital video surveillance
US20080303903A1 (en) * 2003-12-02 2008-12-11 Connexed Technologies Inc. Networked video surveillance system
US20070217765A1 (en) * 2006-03-09 2007-09-20 Masaya Itoh Method and its application for video recorder and player
US20100139290A1 (en) * 2008-05-23 2010-06-10 Leblond Raymond G Enclosure for surveillance hardware
US20130110565A1 (en) * 2011-04-25 2013-05-02 Transparency Sciences, Llc System, Method and Computer Program Product for Distributed User Activity Management
US20130235209A1 (en) * 2012-03-09 2013-09-12 Industrial Technology Research Institute System and method for dispatching video recording

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220376978A1 (en) * 2014-10-13 2022-11-24 Pismo Labs Technology Limited Methods and systems for configuring a mobile router
CN105809884A (en) * 2014-12-29 2016-07-27 北京市建筑设计研究院有限公司 House accident monitoring system
CN106297232A (en) * 2015-12-11 2017-01-04 湖南天洋信息科技有限公司 Air gauge, and air-gauge-based data processing method and system
US10726956B2 (en) * 2015-12-28 2020-07-28 National Taiwan University Fever epidemic detection system and method thereof
US20180046778A1 (en) * 2015-12-28 2018-02-15 National Taiwan University Fever epidemic detection system and method thereof
CN107305731A (en) * 2016-04-21 2017-10-31 中兴通讯股份有限公司 A wireless data transmission method and device
CN106079064A (en) * 2016-07-19 2016-11-09 魏会芳 An intelligent machining device
CN106504459A (en) * 2016-10-31 2017-03-15 国网山东省电力公司济南供电公司 A fire early-warning system for unmanned communications equipment rooms
CN106548587A (en) * 2016-10-31 2017-03-29 国网山东省电力公司济南供电公司 An Internet of Things-based safety monitoring system for information equipment rooms
US9965936B1 (en) * 2017-01-04 2018-05-08 Shawn W. Epps Network communication and accountability system for individual and group safety
US10152873B2 (en) * 2017-04-13 2018-12-11 Chekt Llc. Alarm verification system and method thereof
CN107728559A (en) * 2017-11-20 2018-02-23 惠州市酷丰科技有限公司 An anti-burglary system with remote monitoring capability and high security
US10762755B2 (en) * 2018-06-04 2020-09-01 Apple Inc. Data-secure sensor system
US11682278B2 (en) * 2018-06-04 2023-06-20 Apple Inc. Data-secure sensor system
US11062592B2 (en) * 2019-04-12 2021-07-13 AKBI Development, LLC Danger zone protection and assistance system

Also Published As

Publication number Publication date
EP2989619A4 (en) 2016-12-28
US20150301513A1 (en) 2015-10-22
AU2014257036A1 (en) 2015-11-12
US20140313032A1 (en) 2014-10-23
CA2909892A1 (en) 2014-10-30
WO2014176379A2 (en) 2014-10-30
US20150302725A1 (en) 2015-10-22
US20170186309A1 (en) 2017-06-29
US20140327555A1 (en) 2014-11-06
US20150310726A1 (en) 2015-10-29
AU2014257036A8 (en) 2015-11-26
KR20160032004A (en) 2016-03-23
US9659483B2 (en) 2017-05-23
US20140320280A1 (en) 2014-10-30
CA2909892C (en) 2023-01-10
US9449491B2 (en) 2016-09-20
CN105308657A (en) 2016-02-03
WO2014176379A3 (en) 2015-01-22
US20140320312A1 (en) 2014-10-30
US9472090B2 (en) 2016-10-18
US20140321505A1 (en) 2014-10-30
US20140317710A1 (en) 2014-10-23
US20140320281A1 (en) 2014-10-30
US10083599B2 (en) 2018-09-25
US9633548B2 (en) 2017-04-25
JP2016524209A (en) 2016-08-12
EP2989619A2 (en) 2016-03-02
EP4006860A1 (en) 2022-06-01
US10304319B2 (en) 2019-05-28

Similar Documents

Publication Title
US10083599B2 (en) Remote user interface and display for events for a monitored location
US11916691B2 (en) Power outlet cameras
US9253455B1 (en) Doorbell communication systems and methods
US9247219B2 (en) Doorbell communication systems and methods
US9237318B2 (en) Doorbell communication systems and methods
US9058738B1 (en) Doorbell communication systems and methods
US9113051B1 (en) Power outlet cameras
EP3025314B1 (en) Doorbell communication systems and methods
US9094584B2 (en) Doorbell communication systems and methods
US11381686B2 (en) Power outlet cameras
US20150116490A1 (en) Doorbell communication systems and methods
US11362853B2 (en) Doorbell communication systems and methods
US20220368556A1 (en) Doorbell communication systems and methods
WO2018226898A1 (en) Power outlet cameras

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANARY CONNECT, INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAGER, ADAM D.;RILL, CHRIS I.;SIGNING DATES FROM 20141125 TO 20141126;REEL/FRAME:034377/0017

AS Assignment

Owner name: VENTURE LENDING & LEASING VII, INC., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:CANARY CONNECT, INC.;REEL/FRAME:034734/0953

Effective date: 20150114

AS Assignment

Owner name: SILICON VALLEY BANK, MASSACHUSETTS

Free format text: SECURITY AGREEMENT;ASSIGNOR:CANARY CONNECT, INC.;REEL/FRAME:038432/0528

Effective date: 20160413

AS Assignment

Owner name: VENTURE LENDING & LEASING VIII, INC., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:CANARY CONNECT, INC.;REEL/FRAME:041293/0422

Effective date: 20161228

Owner name: VENTURE LENDING & LEASING VII, INC., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:CANARY CONNECT, INC.;REEL/FRAME:041293/0422

Effective date: 20161228

AS Assignment

Owner name: CANARY CONNECT, INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAGER, ADAM D.;RILL, CHRISTOPHER I.;SIGNING DATES FROM 20141125 TO 20141126;REEL/FRAME:041559/0133

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION