US20150162006A1 - Voice-recognition home automation system for speaker-dependent commands - Google Patents
Voice-recognition home automation system for speaker-dependent commands
- Publication number
- US20150162006A1 (application US 14/566,977)
- Authority
- US
- United States
- Prior art keywords
- television receiver
- home automation
- speaker
- voice command
- automation system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2823—Reporting information sensed by appliance or service execution status of appliance services in a home automation network
- H04L12/2825—Reporting to a device located outside the home and the home network
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B11/00—Automatic controllers
- G05B11/01—Automatic controllers electric
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/00174—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
- G07C9/00571—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated by interacting with a central unit
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L17/00—Speaker identification or verification
- G10L17/22—Interactive procedures; Man-machine interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2816—Controlling appliance services of a home automation network by calling their functionalities
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2838—Distribution of signals within a home automation network, e.g. involving splitting/multiplexing signals to/from different paths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/21—Monitoring or handling of messages
- H04L51/222—Monitoring or handling of messages using geographical location information, e.g. messages transmitted or received in proximity of a certain spot or area
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4126—The peripheral being portable, e.g. PDAs or mobile phones
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4131—Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/426—Internal components of the client ; Characteristics thereof
- H04N21/42607—Internal components of the client ; Characteristics thereof for processing the incoming bitstream
- H04N21/4263—Internal components of the client ; Characteristics thereof for processing the incoming bitstream involving specific tuning arrangements, e.g. two tuners
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/43615—Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
- H04N21/43637—Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/4508—Management of client data or end-user data
- H04N21/4532—Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/454—Content or additional data filtering, e.g. blocking advertisements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47217—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6143—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a satellite
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05B—LOCKS; ACCESSORIES THEREFOR; HANDCUFFS
- E05B47/00—Operating or controlling locks or other fastening devices by electric or magnetic means
- E05B2047/0048—Circuits, feeding, monitoring
- E05B2047/0067—Monitoring
- E05B2047/0068—Door closed
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05B—LOCKS; ACCESSORIES THEREFOR; HANDCUFFS
- E05B47/00—Operating or controlling locks or other fastening devices by electric or magnetic means
- E05B2047/0048—Circuits, feeding, monitoring
- E05B2047/0067—Monitoring
- E05B2047/0069—Monitoring bolt position
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05B—LOCKS; ACCESSORIES THEREFOR; HANDCUFFS
- E05B47/00—Operating or controlling locks or other fastening devices by electric or magnetic means
- E05B2047/0094—Mechanical aspects of remotely controlled locks
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C2209/00—Indexing scheme relating to groups G07C9/00 - G07C9/38
- G07C2209/60—Indexing scheme relating to groups G07C9/00174 - G07C9/00944
- G07C2209/62—Comprising means for indicating the status of the lock
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/12—Messaging; Mailboxes; Announcements
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10T—TECHNICAL SUBJECTS COVERED BY FORMER US CLASSIFICATION
- Y10T292/00—Closure fasteners
- Y10T292/08—Bolts
- Y10T292/096—Sliding
Definitions
- This application is directed to home automation systems and, more specifically, to controlling home automation systems with speaker-dependent commands.
- A method for controlling a device in a home automation system based on a speaker-dependent command may include receiving, by a television receiver, a voice command for controlling the device connected to the home automation system.
- The method may include performing, by the television receiver, a voice recognition analysis to determine a speaker identity of the received voice command, and/or performing, by the television receiver, a speech recognition analysis to identify the device in the home automation system that is intended to be controlled by the received voice command.
- The method may include determining, by the television receiver, a permission status to control the identified device, wherein the determined permission status is based on the determined speaker identity and the identified device.
- The method may include controlling, by the television receiver, the identified device in the home automation system based on the determined status.
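The bullets above describe a pipeline: receive a voice command, identify the speaker, identify the target device, look up a permission status, and control the device accordingly. A minimal sketch of that flow, in which all function names, device names, and the permission table are illustrative placeholders rather than anything from the disclosure:

```python
# Illustrative sketch of the described control flow: voice recognition
# identifies the speaker, speech recognition identifies the target
# device, and a permission table gates the final control action.
# All names and data structures are hypothetical.

PERMISSIONS = {
    ("alice", "front_door_lock"): "granted",
    ("child", "front_door_lock"): "denied",
    ("child", "living_room_lights"): "granted",
}

def identify_speaker(voice_command: str) -> str:
    """Stand-in for the voice (speaker) recognition analysis."""
    # In the described system this compares audio against stored
    # voice samples; here the speaker is encoded in the test input.
    return voice_command.split(":", 1)[0]

def identify_device(voice_command: str) -> str:
    """Stand-in for the speech recognition analysis."""
    text = voice_command.split(":", 1)[1].lower()
    if "door" in text:
        return "front_door_lock"
    if "light" in text:
        return "living_room_lights"
    return "unknown"

def handle_command(voice_command: str) -> str:
    speaker = identify_speaker(voice_command)
    device = identify_device(voice_command)
    status = PERMISSIONS.get((speaker, device), "denied")
    if status == "granted":
        return f"control {device}"   # transmit operational command
    return f"deny {device}"          # maintain current device state

print(handle_command("alice:unlock the front door"))  # control front_door_lock
print(handle_command("child:unlock the front door"))  # deny front_door_lock
```

Unknown speaker/device pairs default to denial, matching the disclosure's emphasis on gating commands by speaker identity.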
- The method may include detecting, by the television receiver, the voice command at a microphone on the television receiver.
- The method may include receiving, by the television receiver, the voice command from a remote control in operative communication with the television receiver, wherein the voice command is detected by a microphone on the remote control and is wirelessly relayed from the remote control to the television receiver.
- The method may include receiving, by the television receiver, the voice command from a home automation device in operative communication with the television receiver, wherein the voice command is detected by a microphone provided on the home automation device and is relayed from the home automation device to the television receiver.
- The method may include, in the step of performing the voice recognition analysis to determine the speaker identity, comparing, by the television receiver, at least a portion of the received voice command to a voice database comprising one or more voice samples, wherein each of the one or more voice samples is associated with one or more speaker identities.
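One way to implement the comparison against a voice database is nearest-neighbor matching over acoustic feature vectors. The sketch below uses tiny hand-made vectors and a distance threshold; real speaker recognition uses far richer features, and every name here is hypothetical:

```python
# Hypothetical sketch of matching a received voice command against a
# voice database: each stored sample is a small feature vector tied
# to a speaker identity, and the closest sample wins unless it is
# too far away. Real systems use richer acoustic features.
import math

VOICE_DB = [
    # (feature vector, speaker identity)
    ([0.9, 0.1, 0.3], "alice"),
    ([0.2, 0.8, 0.5], "bob"),
]

def match_speaker(features, threshold=0.5):
    best_id, best_dist = None, float("inf")
    for sample, speaker in VOICE_DB:
        dist = math.dist(sample, features)
        if dist < best_dist:
            best_id, best_dist = speaker, dist
    # Reject the match if no stored sample is close enough.
    return best_id if best_dist <= threshold else None

print(match_speaker([0.85, 0.15, 0.25]))  # alice
print(match_speaker([0.0, 0.0, 5.0]))     # None (no close match)
```

The threshold implements the speaker-verification side: a voice too far from every enrolled sample yields no identity rather than a bad guess.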
- The method may include receiving, by the television receiver, one or more voice samples during an initial setup, receiving, by the television receiver, a speaker identity during the initial setup, associating, by the television receiver, at least a portion of the one or more voice samples with the speaker identity, and/or storing, by the television receiver, the associated one or more voice samples and the speaker identity in a voice database.
- The method may further include receiving, by the television receiver, one or more access settings associated with the speaker identity, and/or storing, by the television receiver, the one or more access settings associated with the speaker identity in a controls database.
- The method may include, in performing the speech recognition analysis to identify the device to be controlled, one or more steps of detecting, by the television receiver, one or more control phrases in the received voice command, and/or comparing, by the television receiver, the one or more control phrases to a controls database comprising a plurality of control phrases.
- Each of the one or more control phrases may be associated with one or more home automation devices.
- The method may include receiving, by the television receiver, one or more control phrases in a one-time setup, wherein each of the one or more control phrases comprises at least a word or a string of words, associating, by the television receiver, each of the one or more control phrases with a home automation device, and/or storing, by the television receiver, the associated control phrases and home automation devices in a controls database.
- Control phrases may include user-configured control phrases.
- The method may include determining, by the television receiver, a plurality of home automation devices to control based on the speech recognition analysis.
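The controls database described above can be sketched as a phrase-to-device mapping built during setup and consulted per command; because several registered phrases may match one utterance, several devices may be selected. Phrase matching here is plain substring search, and all names are illustrative:

```python
# Sketch of a controls database mapping user-configured control
# phrases to home automation devices, as in the one-time setup
# described above. Matching is a simple lowercase substring test.

controls_db = {}

def register_phrase(phrase: str, device: str) -> None:
    """One-time setup: associate a control phrase with a device."""
    controls_db[phrase.lower()] = device

def devices_for_command(command: str):
    """Return every device whose phrase appears in the command."""
    text = command.lower()
    # A single command may match several phrases, so several
    # devices may be controlled at once.
    return [dev for phrase, dev in controls_db.items() if phrase in text]

register_phrase("living room lights", "light_controller_1")
register_phrase("front door", "door_lock_1")

print(devices_for_command("Turn on the living room lights"))
```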
- The method may include determining, by the television receiver, that the permission status is at least one of an access granted status and an access denied status.
- The method may include transmitting, by the television receiver, an operational command to the identified device based on the access granted permission status.
- The method may include generating, by the television receiver, the operational command based on a communication protocol specific to the identified device, and/or transmitting, by the television receiver, the operational command to the identified device through a home automation network.
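Generating an operational command in a protocol specific to the identified device might look like the following sketch; the protocol names and wire formats are invented for illustration and are not from the disclosure:

```python
# Hypothetical sketch: the same logical command is serialized
# differently depending on the identified device's communication
# protocol before being sent over the home automation network.
# Protocol names and wire formats are invented.
import json

DEVICE_PROTOCOLS = {
    "thermostat_1": "json",
    "door_lock_1": "keyvalue",
}

def build_operational_command(device: str, action: str, value=None) -> str:
    protocol = DEVICE_PROTOCOLS[device]
    if protocol == "json":
        return json.dumps({"device": device, "action": action, "value": value})
    if protocol == "keyvalue":
        return f"dev={device};act={action};val={value}"
    raise ValueError(f"no known protocol for {device}")

print(build_operational_command("thermostat_1", "lower_heat", 5))
print(build_operational_command("door_lock_1", "unlock"))
```

In a real system the transport (e.g., a wireless home automation network) would sit behind this serialization step.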
- The method may include outputting, by the television receiver, a confirmation notification that indicates a new state of the identified device, wherein the new state is based on the voice command.
- The method may include outputting, by the television receiver, a notification based on the access denied status, and/or maintaining, by the television receiver, a current state of the identified device.
- The method may include receiving, by the television receiver, a code word, wherein the received code word is detected by a microphone, associating, by the television receiver, the code word with a speaker identity, and/or detecting, by the television receiver, the code word immediately preceding the voice command.
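Code-word activation can be sketched as a gate that only passes commands prefixed by a registered code word, with the code word also tied to a speaker identity as described above. The parsing is a deliberate simplification and the names are hypothetical:

```python
# Sketch of code-word activation: the receiver ignores utterances
# unless a user-defined code word immediately precedes the command.
# "Sesame" matches the example in the disclosure; the one-word
# parsing is an illustrative simplification.

CODE_WORDS = {"sesame": "alice"}  # code word -> associated speaker identity

def parse_activation(utterance: str):
    """Return (speaker, command) if a known code word leads, else None."""
    first, _, rest = utterance.strip().partition(" ")
    speaker = CODE_WORDS.get(first.lower())
    if speaker is None or not rest:
        return None
    return speaker, rest

print(parse_activation("Sesame turn on the living room lights"))
print(parse_activation("turn on the living room lights"))
```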
- The method may include, in response to receiving the voice command, outputting, by the television receiver, an additional query for additional information related to the voice command. Other examples are possible.
- A system for controlling a device in a home automation system based on a speaker-dependent command may include one or more processors and/or a memory communicatively coupled with and readable by the one or more processors and having stored therein processor-readable instructions.
- The processor-readable instructions may cause the one or more processors to receive, by a television receiver, a voice command for controlling the device connected to the home automation system, perform, by the television receiver, a voice recognition analysis to determine a speaker identity of the received voice command, and/or perform, by the television receiver, a speech recognition analysis to identify the device in the home automation system that is intended to be controlled by the received voice command.
- The processor-readable instructions may cause the one or more processors to determine, by the television receiver, a permission status to control the identified device, wherein the determined permission status is based on the determined speaker identity and the identified device.
- The processor-readable instructions may cause the one or more processors to control, by the television receiver, the identified device in the home automation system based on the determined status.
- Other examples are possible.
- A computer-readable medium may have stored thereon a series of instructions. When executed by a processor, the series of instructions may cause the processor to control a device in a home automation system based on a speaker-dependent command.
- The series of instructions may include receiving, by a television receiver, a voice command for controlling the device connected to the home automation system, performing, by the television receiver, a voice recognition analysis to determine a speaker identity of the received voice command, and/or performing, by the television receiver, a speech recognition analysis to identify the device in the home automation system that is intended to be controlled by the received voice command.
- The series of instructions may further include determining, by the television receiver, a permission status to control the identified device, wherein the determined permission status is based on the determined speaker identity and the identified device, and/or controlling, by the television receiver, the identified device in the home automation system based on the determined status.
- Other examples are possible.
- FIG. 1 shows an embodiment of a television service provider system
- FIG. 2 shows an embodiment of a home automation system hosted by a television receiver
- FIG. 3 shows an embodiment of a television receiver configured to host a home automation system
- FIG. 4 shows an example method for controlling a device in a home automation system based on a speaker-dependent command
- FIG. 5 shows a block diagram of example modules in a voice command engine for controlling a device in a home automation system based on a speaker-dependent command
- FIG. 6 shows another example method for controlling a device in a home automation system based on a speaker-dependent command
- FIG. 7 shows an embodiment of a computer system upon which various aspects of the present disclosure may be implemented.
- The systems and methods disclosed herein are directed to controlling a device, such as a home automation or “smart” device, of a home automation system based on a speaker-dependent command.
- A microphone, such as one on a television receiver, on a television remote control, and/or on one or more devices in the home automation system, may detect a spoken voice command and transmit the voice command to a television receiver having a voice command recognition system.
- The television receiver may perform a voice recognition analysis for speaker verification and/or speaker identification.
- The television receiver may perform a speech recognition analysis to determine which device(s) connected to the home automation system should be controlled, i.e., which device(s) the voice command is intended to command. Based on the determined speaker identity and the determined device(s) to control, the television receiver may complete or prohibit certain commands in the home automation system.
- An adult user identity may be permitted to perform certain functions that a child user identity cannot, e.g., unlocking doors.
- A speaker may state a command without naming a particular device, e.g., “Lower the heat by five degrees.”
- The television receiver may determine, based at least in part on speech recognition, which home automation device is intended, and therefore which device to transmit an operational signal to, e.g., relaying the command to lower the heat by five degrees to a thermostat.
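Inferring the intended device from a command that never names one, such as “Lower the heat by five degrees,” could be sketched with keyword cues and simple number extraction. The cue table, device names, and parsing are assumptions for illustration, not the patent's method:

```python
# Illustrative sketch of inferring the intended device when the
# command does not name one: keyword cues map the utterance to a
# device, and a number in the command becomes the operational
# parameter. Keywords and device names are assumptions.
import re

DEVICE_CUES = {
    "heat": "thermostat",
    "temperature": "thermostat",
    "lights": "light_controller",
    "door": "door_lock",
}

WORD_NUMBERS = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}

def infer_target(command: str):
    """Return (inferred device, numeric amount) for a spoken command."""
    text = command.lower()
    device = next((dev for cue, dev in DEVICE_CUES.items() if cue in text), None)
    match = re.search(r"\d+", text)
    amount = int(match.group()) if match else None
    if amount is None:
        # Fall back to spelled-out numbers ("five degrees").
        amount = next((n for w, n in WORD_NUMBERS.items()
                       if f" {w} " in f" {text} "), None)
    return device, amount

print(infer_target("Lower the heat by five degrees"))  # ('thermostat', 5)
```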
- The television receiver may respond to a user-defined code word that precedes and/or follows a spoken command.
- The television receiver may detect the code word “Sesame” followed by a spoken command such as “Turn on the living room lights.” Such user-defined code words may activate the television receiver to capture and analyze the subsequently spoken home automation command.
- Other examples are possible. It is contemplated that the present systems and methods provide for a user-friendly, secure and simple controls infrastructure that may be used to operate a variety of different electronic devices in the home automation system, as described in further detail below.
- FIG. 1 illustrates an embodiment of a satellite television distribution system 100. While a home automation system may be incorporated with various types of television receivers, various embodiments may be part of a satellite-based television distribution system. Cable, IP-based, wireless, and broadcast-focused systems are also possible. Satellite television distribution system 100 may include: television service provider system 110, satellite transmitter equipment 120, satellites 130, satellite dish 140, television receiver 150, system credit management engine 112, and display device 160. Alternate embodiments of satellite television distribution system 100 may include fewer or greater numbers of components.
- While only one satellite dish 140, television receiver 150, and display device 160 (collectively referred to as “user equipment”) are illustrated, it should be understood that multiple (e.g., tens, thousands, millions of) instances and types of user equipment may receive data and television signals from television service provider system 110 via satellites 130.
- Television service provider system 110 and satellite transmitter equipment 120 may be operated by a television service provider.
- A television service provider may distribute television channels, on-demand programming, programming information, and/or other content/services to users.
- Television service provider system 110 may receive feeds of one or more television channels and content from various sources. Such television channels may include multiple television channels that contain at least some of the same content (e.g., network affiliates).
- Feeds of the television channels may be relayed to user equipment via multiple television distribution satellites. Each satellite may relay multiple transponder streams.
- Satellite transmitter equipment 120 may be used to transmit a feed of one or more television channels from television service provider system 110 to one or more satellites 130 .
- While a single television service provider system 110 and satellite transmitter equipment 120 are illustrated as part of satellite television distribution system 100 , it should be understood that multiple instances of transmitter equipment may be used, possibly scattered geographically, to communicate with satellites 130 . Such multiple instances of satellite transmitting equipment may communicate with the same or with different satellites. Different television channels may be transmitted to satellites 130 from different instances of transmitting equipment. For instance, a different satellite dish of satellite transmitter equipment 120 may be used for communication with satellites in different orbital slots.
- satellites 130 may be configured to receive signals, such as streams of television channels, from one or more satellite uplinks such as satellite transmitter equipment 120. Satellites 130 may relay received signals from satellite transmitter equipment 120 (and/or other satellite transmitter equipment) to multiple instances of user equipment via transponder streams. Different frequencies may be used for uplink signals 170 than for downlink signals 180. Satellites 130 may be in geosynchronous orbit. Each of the transponder streams transmitted by satellites 130 may contain multiple television channels transmitted as packetized data. For example, a single transponder stream may be a serial digital packet stream containing multiple television channels. Therefore, packets for multiple television channels may be interspersed. Further, information used by television receiver 150 for home automation functions may also be relayed to television receiver 150 via one or more transponder streams. For instance, home automation functions may be requested by and/or pushed to the television receiver 150 from the television service provider system 110.
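The interspersing of packets for multiple television channels within a single transponder stream, as described above, can be sketched as a simple demultiplexing step. This is a hypothetical illustration, not the patent's implementation; the packet dictionary format and the `channel_id` field name are assumptions.

```python
from collections import defaultdict

def demux_transponder_stream(packets):
    """Group interspersed packets by their channel identifier so each
    television channel's payload can be decoded independently."""
    channels = defaultdict(list)
    for pkt in packets:
        channels[pkt["channel_id"]].append(pkt["payload"])
    return dict(channels)

# A serial packet stream in which packets for two channels are interspersed:
stream = [
    {"channel_id": 7, "payload": b"a1"},
    {"channel_id": 9, "payload": b"b1"},
    {"channel_id": 7, "payload": b"a2"},
]
print(demux_transponder_stream(stream))
# {7: [b'a1', b'a2'], 9: [b'b1']}
```

In a real transport stream this grouping would be done by packet identifier fields defined by the transport protocol rather than a dictionary key, but the principle is the same.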
- multiple satellites 130 may be used to relay television channels from television service provider system 110 to satellite dish 140 .
- Different television channels may be carried using different satellites.
- Different television channels may also be carried using different transponders of the same satellite; thus, such television channels may be transmitted at different frequencies and/or different frequency ranges.
- a first and second television channel may be relayed via a first transponder of satellite 130 - 1 .
- a third, fourth, and fifth television channel may be relayed via a different satellite or a different transponder of the same satellite relaying the transponder stream at a different frequency.
- a transponder stream transmitted by a particular transponder of a particular satellite may include a finite number of television channels, such as seven. Accordingly, if many television channels are to be made available for viewing and recording, multiple transponder streams may be necessary to transmit all of the television channels to the instances of user equipment. Further, it is contemplated that multiple home automation functions may be transmitted in similar fashion.
- satellite dish 140 may be a piece of user equipment that is used to receive transponder streams from one or more satellites, such as satellites 130 .
- Satellite dish 140 may be provided to a subscriber for use on a subscription basis to receive television channels and/or home automation functions provided by the television service provider system 110 and/or specifically, the home automation service server 112 of the provider system 110 , satellite transmitter equipment 120 , and/or satellites 130 .
- Satellite dish 140, which may include one or more low noise blocks (LNBs), may be configured to receive transponder streams from multiple satellites and/or multiple transponders of the same satellite. Satellite dish 140 may be configured to receive television channels via transponder streams on multiple frequencies.
- Satellite dish 140 may, however, be able to receive transponder streams from only a limited number of transponders concurrently.
- a tuner of television receiver 150 may only be able to tune to a single transponder stream from a transponder of a single satellite at a given time. The tuner can then be re-tuned to another transponder of the same or a different satellite.
- a television receiver 150 having multiple tuners may allow for multiple transponder streams to be received at the same time.
- multiple television channels and/or multiple home automation functions may be received concurrently.
- FIG. 1 further illustrates one or more television receivers in communication with satellite dish 140 .
- Television receivers may be configured to decode signals received from satellites 130 via satellite dish 140 for output and presentation via a display device, such as display device 160 .
- such television receivers may decode signals received for any home automation devices.
- a home automation engine 311 as described further below, may decode such signals.
- a television receiver may be incorporated as part of a television or may be part of a separate device, commonly referred to as a set-top box (STB).
- Television receiver 150 may decode signals received via satellite dish 140 and provide an output to display device 160 .
- On-demand content, such as pay-per-view (PPV) content, may be stored to a computer-readable storage medium.
- a television receiver is defined to include set-top boxes (STBs) and also circuitry having similar functionality that may be incorporated with another device. For instance, circuitry similar to that of a television receiver may be incorporated as part of a television.
- While FIG. 1 illustrates an embodiment of television receiver 150 as separate from display device 160, it should be understood that, in other embodiments, similar functions may be performed by a television receiver integrated with display device 160.
- Television receiver 150 may include the home automation engine 311 , as detailed in relation to FIG. 3 .
- display device 160 may be used to present video and/or audio decoded and output by television receiver 150 .
- Television receiver 150 may also output a display of one or more interfaces to display device 160 , such as an electronic programming guide (EPG).
- In some embodiments, display device 160 is a television.
- Display device 160 may also be a monitor, computer, or some other device configured to display video and, possibly, play audio.
- uplink signal 170 - 1 represents a signal between satellite transmitter equipment 120 and satellite 130 - 1 .
- Uplink signal 170 - 2 represents a signal between satellite transmitter equipment 120 and satellite 130 - 2 .
- Each of uplink signals 170 may contain streams of one or more different television channels and/or home automation functions.
- uplink signal 170 - 1 may contain a first group of television channels and/or home automation functions
- uplink signal 170 - 2 contains a second group of television channels and/or home automation functions.
- Each of these television channels and/or home automation functions may be scrambled such that unauthorized persons are prevented from accessing the television channels.
- downlink signal 180 - 1 represents a signal between satellite 130 - 1 and satellite dish 140 .
- Downlink signal 180 - 2 represents a signal between satellite 130 - 2 and satellite dish 140 .
- Each of the downlink signals 180 may contain one or more different television channels and/or home automation functions, which may be at least partially scrambled.
- a downlink signal may be in the form of a transponder stream.
- a single transponder stream may be tuned to at a given time by a tuner of a television receiver.
- downlink signal 180 - 1 may be a first transponder stream containing a first group of television channels and/or home automation functions
- downlink signal 180 - 2 may be a second transponder stream containing a different group of television channels and/or home automation functions.
- a transponder stream can be used to transmit on-demand content to television receivers, including PPV content (which may be stored locally by the television receiver until output for presentation).
- FIG. 1 further illustrates downlink signal 180-1 and downlink signal 180-2 being received by satellite dish 140 and distributed to television receiver 150.
- For a first group of television channels and/or home automation functions, satellite dish 140 may receive downlink signal 180-1; for a second group, downlink signal 180-2 may be received.
- Television receiver 150 may decode the received transponder streams. As such, depending on which television channels and/or home automation functions are desired to be presented or stored, various transponder streams from various satellites may be received, descrambled, and decoded by television receiver 150 .
- network 190 which may include the Internet, may allow for bidirectional communication between television receiver 150 and television service provider system 110 , such as for home automation related services provided by home automation service server 112 .
- a telephone (e.g., landline) or cellular connection may be used to enable communication between television receiver 150 and television service provider system 110.
- Television receiver 150 may represent the television receiver of FIG. 1 . While television receiver 150 may be configured to receive television programming from a satellite-based television service provider, it should be understood that in other embodiments, other forms of television service provider networks may be used, such as an IP-based network (e.g., fiber network), a cable based network, a wireless broadcast-based network, etc.
- television receiver 150 may be configured to communicate with multiple in-home home automation devices.
- the devices with which television receiver 150 communicates may use different communication standards or protocols. For instance, one or more devices may use a ZigBee® communication protocol while one or more other devices communicate with the television receiver using a Z-Wave® communication protocol.
- Other forms of wireless communication may be used by devices and the television receiver 150 .
- television receiver 150 and one or more devices may be configured to communicate using a wireless local area network, which may use a communication protocol such as 802.11.
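The mixed-protocol communication described above (some devices using ZigBee®, others Z-Wave®, others a wireless local area network) is commonly handled by routing each command through a per-protocol adapter behind one common interface. The sketch below is purely illustrative; the class names and the string-returning `send` methods are assumptions, not part of the disclosed system.

```python
class DeviceAdapter:
    """Common interface each protocol-specific adapter implements."""
    protocol = "generic"

    def send(self, device_id, command):
        raise NotImplementedError

class ZigbeeAdapter(DeviceAdapter):
    protocol = "zigbee"

    def send(self, device_id, command):
        # A real adapter would hand the command to a ZigBee radio.
        return f"zigbee->{device_id}:{command}"

class ZWaveAdapter(DeviceAdapter):
    protocol = "zwave"

    def send(self, device_id, command):
        return f"zwave->{device_id}:{command}"

class Hub:
    """Routes a command to the adapter matching the device's protocol,
    hiding protocol differences from the rest of the receiver."""
    def __init__(self, adapters):
        self._by_protocol = {a.protocol: a for a in adapters}

    def send(self, device_id, protocol, command):
        return self._by_protocol[protocol].send(device_id, command)

hub = Hub([ZigbeeAdapter(), ZWaveAdapter()])
print(hub.send("lamp1", "zigbee", "on"))   # zigbee->lamp1:on
```

A dongle such as communication device 252 would, in this picture, simply supply the radios that back one or more of the adapters.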
- a separate device may be connected with television receiver 150 to enable communication with home automation devices.
- communication device 252 may be attached to television receiver 150 .
- Communication device 252 may be in the form of a dongle.
- Communication device 252 may be configured to allow for Zigbee®, Z-Wave®, and/or other forms of wireless communication.
- the communication device may connect with television receiver 150 via a USB port or via some other type of (wired) communication port.
- Communication device 252 may be powered by the television receiver 150 or may be separately coupled with a power source.
- television receiver 150 may be enabled to communicate with a local wireless network and may use communication device 252 in order to communicate with devices that use a ZigBee® communication protocol, Z-Wave® communication protocol, and/or some other home wireless communication protocols.
- communication device 252 may also serve to allow additional components to be connected with television receiver 150.
- communication device 252 may include additional audio/video inputs (e.g., HDMI, component, and/or composite inputs) to allow for additional devices (e.g., Blu-ray players) to be connected with television receiver 150.
- Such connection may allow video from such additional devices to be overlaid with home automation information.
- whether home automation information is overlaid onto video may be triggered based on a user's press of a remote control button.
- television receiver 150 may be configured to output home automation information for presentation to a user via display device 160 . Such information may be presented simultaneously with television programming received by television receiver 150 , such as via system 100 of FIG. 1 described above. Television receiver 150 may also, at a given time, output only television programming or only home automation information based on a user's preference. The user may be able to provide input to television receiver 150 to control the home automation system hosted by television receiver 150 or by overlay device 251 , as detailed below.
- television receiver 150 may not be used as a host for a home automation system. Rather, a separate device coupled with television receiver 150 may allow for home automation information to be presented to a user via display device 160.
- the separate device is referred to as overlay device 251 .
- Overlay device 251 may be configured to overlay information, such as home automation information, onto a signal to be visually presented via display device 160 , such as a television.
- overlay device 251 may be coupled between television receiver 150 , which may be in the form of a set top box, and display device 160 , which may be a television.
- television receiver 150 may receive, decode, descramble, decrypt, store, and/or output television programming and/or home automation functions.
- Television receiver 150 may output a signal, such as in the form of an HDMI signal.
- the output of television receiver 150 may be input to overlay device 251 .
- Overlay device 251 may receive the video and/or audio output from television receiver 150 .
- Overlay device 251 may add additional information to the video, audio and/or home automation function signal received from television receiver 150 .
- the modified video and/or audio signal may be output to display device 160 for presentation.
- overlay device 251 has an HDMI input and an HDMI output, with the HDMI output being connected to display device 160 .
- While FIG. 2 illustrates lines indicating communication between television receiver 150 and various devices, it should be understood that such communication may exist, in addition or in the alternative, via communication device 252 and/or overlay device 251.
- television receiver 150 may be used to provide home automation functionality while overlay device 251 may be used to present information via display device 160 .
- overlay device 251 may provide home automation functionality and be used to present information via display device 160 .
- Using overlay device 251 to present automation information via display device 160 may have additional benefits. For instance, multiple devices may provide input video to overlay device 251 .
- television receiver 150 may provide television programming to overlay device 251
- a DVD/Blu-Ray player may provide video to overlay device 251
- a separate internet-TV device may stream other programming to overlay device 251 .
- overlay device 251 may modify the video and/or audio to include home automation information, such as a pop-up overlay with a prompt message, and output the modified signal to display device 160.
- overlay device 251 may modify the audio/video to include home automation information and, possibly, solicit for user input.
- overlay device 251 may have four video inputs (e.g., four HDMI inputs) and a single video output (e.g., an HDMI output).
- such overlay functionality may be part of television receiver 150 .
- a separate device, such as a Blu-ray player may be connected with a video input of television receiver 150 , thus allowing television receiver 150 to overlay home automation information when content from the Blu-Ray player is being output to display device 160 .
- home automation information may be presented by display device 160 while television programming is also being presented by display device 160 .
- home automation information may be overlaid or may replace a portion of television programming (e.g., broadcast content, stored content, on-demand content, etc.) presented via display device 160 .
- For example, television programming (e.g., a television show on scuba diving) may be presented while the display is augmented with information related to home automation.
- This television show may represent broadcast programming, recorded content, on-demand content, or some other form of content.
- the presented home automation information is related to motion being detected by a camera at a front door of a location.
- Such augmentation of the television programming may be performed directly by television receiver 150 (which may or may not be in communication with communication device 252 ) or overlay device 251 connected with television receiver 150 and display device 160 .
- Such augmentation may result in solid or partially transparent graphics being overlaid onto television programming (or other forms of video) output by television receiver 150 .
- Overlay device 251 or television receiver 150 may be configured to add or modify sound to television programming. For instance, in response to a doorbell ring, a sound may be played through the display device (or a connected audio system). In addition or in the alternative, a graphic may be displayed.
- Camera data (e.g., nanny camera data) and associated sound or motion sensors may be integrated into the system and overlaid or otherwise made available to a user. For example, detection of a crying baby from a nanny camera may trigger an on-screen alert to a user watching television.
- such presented home automation information may request user input.
- a user, via controls of television receiver 150 (e.g., a remote control) or controls of overlay device 251, can specify whether video from a camera at the front door should be presented, not presented, or whether future notifications related to such motion should be ignored. If ignored, this may be for a predefined period of time, such as an hour, or until television receiver 150 or overlay device 251 is powered down and powered back on. Ignoring video may be particularly useful if motion or some other event is triggering the presentation of video that is not interesting to a viewer of display device 160 (or a wireless device), such as children playing on the lawn or snow falling.
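The "ignore" behavior described above, suppression of a notification type either for a fixed period or until the device is power-cycled, can be sketched as a small filter. This is an illustrative assumption about how such suppression might be implemented; the class and method names are not from the disclosure.

```python
import time

class NotificationFilter:
    """Suppress notification types for a period or until power cycle."""

    def __init__(self):
        # event_type -> expiry timestamp, or None for "until power cycle"
        self._ignore_until = {}

    def ignore(self, event_type, duration_s=None):
        # duration_s=None models "ignore until powered down and back on":
        # the in-memory table is simply lost on restart.
        self._ignore_until[event_type] = (
            time.time() + duration_s if duration_s is not None else None
        )

    def should_notify(self, event_type, now=None):
        now = now if now is not None else time.time()
        expiry = self._ignore_until.get(event_type, 0)
        if expiry is None:          # ignored until power cycle
            return False
        return now >= expiry        # notify once the ignore window lapses

f = NotificationFilter()
f.ignore("front_door_motion", duration_s=3600)   # ignore for an hour
print(f.should_notify("front_door_motion"))      # False
print(f.should_notify("smoke_alarm"))            # True
```

A safety-relevant event type (e.g., smoke) would in practice likely be excluded from suppression entirely.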
- television receiver 150 or overlay device 251 may be configured to communicate with one or more wireless devices, such as wireless device 216 .
- Wireless device 216 may represent a tablet computer, cellular phone, laptop computer, remote computer, or some other device through which a user may desire to control home automation settings and view home automation information. Such a device also need not be wireless, such as a desktop computer.
- Television receiver 150 , communication device 252 , or overlay device 251 may communicate directly with wireless device 216 , or may use a local wireless network, such as network 270 .
- Wireless device 216 may be remotely located and not connected with a same local wireless network.
- television receiver 150 or overlay device 251 may be configured to transmit a notification to wireless device 216 regarding home automation information.
- a third-party notification server system such as the notification server system operated by Apple®, may be used to send such notifications to wireless device 216 .
- a location of wireless device 216 may be monitored. For instance, if wireless device 216 is a cellular phone, when its position indicates it has neared a door, the door may be unlocked. A user may be able to define which home automation functions are controlled based on a position of wireless device 216 . Other functions could include opening and/or closing a garage door, adjusting temperature settings, turning on and/or off lights, opening and/or closing shades, etc. Such location-based control may also take into account the detection of motion via one or more motion sensors that are integrated into other home automation devices and/or stand-alone motion sensors in communication with television receiver 150 .
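The location-based control described above (unlocking a door when the phone's position indicates it has neared the door) amounts to a geofence check. The following sketch is a hypothetical illustration; the radius, the action name, and the use of a haversine distance are assumptions, not the patent's method.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def actions_for_position(phone_pos, door_pos, radius_m=15.0):
    """Return home automation actions triggered by the phone's position."""
    if distance_m(*phone_pos, *door_pos) <= radius_m:
        return ["unlock_front_door"]
    return []

door = (39.7392, -104.9903)
print(actions_for_position((39.7392, -104.9903), door))  # ['unlock_front_door']
print(actions_for_position((39.7500, -104.9903), door))  # []
```

As the text notes, such position-based triggers might also be gated on motion-sensor confirmation before an action as sensitive as unlocking a door is taken.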
- network 270 may be necessary to permit television receiver 150 to stream data out to the Internet.
- television receiver 150 and network 270 may be configured, via a service such as Sling® or other video streaming service, to allow for video to be streamed from television receiver 150 to devices accessible via the Internet.
- Such streaming capabilities may be “piggybacked” to allow for home automation data to be streamed to devices accessible via the Internet.
- U.S. patent application Ser. No. 12/645,870 filed on Dec. 23, 2009, entitled “Systems and Methods for Remotely Controlling a Media Server via a Network”, which is hereby incorporated by reference, describes one such system for allowing remote access and control of a local device.
- wireless device 216 may be in communication with television receiver 150 serving as the host of a home automation system.
- similar information may be sent to wireless device 216 , such as via a third-party notification server or directly from television receiver 150 or overlay device 251 via a local wireless network.
- a user of wireless device 216 can specify whether video from a camera at the front door should be presented by wireless device 216, not presented, or whether future notifications related to such motion should be ignored.
- a user interface of the wireless device 216 may correspond to an overlay of the home automation information and/or prompt appearing on the display device 160 .
- wireless device 216 may serve as an input device for television receiver 150 .
- wireless device 216 may be a tablet computer that allows text to be typed by a user and provided to television receiver 150. Such an arrangement may be useful for text messaging, group chat sessions, or any other form of text-based communication. Other types of input may be received for the television receiver from a tablet computer or other device as shown in the attached screenshots, such as lighting commands, security alarm settings, and door lock commands. While wireless device 216 may be used as the input device for typing text, television receiver 150 may output the text for display to display device 160.
- wireless device 216 may be configured to store a software model of the home automation system intended to mirror the software model stored by television receiver 150, which is hosting the home automation system. For instance, such a software model may allow wireless device 216 to view, communicate with, and/or interact with various home automation devices. Such a software model may indicate the state of various home automation devices. When wireless device 216 is not in communication with television receiver 150, changes to the home automation model made at television receiver 150 may not be known to wireless device 216. A history list maintained by television receiver 150 and/or a synchronization point numerical value, whereby each change to the home automation model by television receiver 150 is assigned a value and synchronized at a later point with wireless device 216, may be implemented. In another aspect, wireless device 216 may be utilized by a user for entering and/or confirming rules and other settings of the home automation system, and such settings may be synchronized or otherwise communicated to television receiver 150.
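The synchronization-point scheme described above can be sketched as a monotonically increasing counter plus a history list: the host assigns each model change a value, and a reconnecting device requests every change newer than the last value it saw. This sketch is an illustrative assumption about one way such a scheme could work; the class and method names are not from the disclosure.

```python
class HomeAutomationModel:
    """Host-side model with a per-change synchronization point value."""

    def __init__(self):
        self.state = {}        # device name -> current state
        self.history = []      # list of (sync_point, device, state)
        self._sync_point = 0

    def apply_change(self, device, state):
        # Each change to the model is assigned the next numerical value.
        self._sync_point += 1
        self.state[device] = state
        self.history.append((self._sync_point, device, state))

    def changes_since(self, sync_point):
        """Changes a reconnecting wireless device needs to catch up."""
        return [c for c in self.history if c[0] > sync_point]

host = HomeAutomationModel()
host.apply_change("porch_light", "on")
host.apply_change("front_door", "locked")
# A device that last synchronized at point 1 fetches only the newer change:
print(host.changes_since(1))   # [(2, 'front_door', 'locked')]
```

A production implementation would also need to prune the history list and handle changes made on the wireless device while offline, which this sketch omits.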
- a cellular modem 253 may be connected with either overlay device 251 or television receiver 150 .
- Cellular modem 253 may be useful if a local wireless network is not available.
- cellular modem 253 may permit access to the Internet and/or communication with a television service provider.
- Communication with a television service provider, such as television service provider system 110 of FIG. 1 may also occur via a local wireless or wired network connected with the Internet.
- information for home automation purposes may be transmitted by television service provider system 110 to television receiver 150 or overlay device 251 via the television service provider's distribution network, which may include the use of satellites 130 .
- various home automation devices may be in communication with television receiver 150 or overlay device 251 .
- Such home automation devices may use disparate communication protocols.
- Such home automation devices may communicate with television receiver 150 directly or via communication device 252 .
- Such home automation devices may be controlled by a user and/or have a status viewed by a user via display device 160 and/or wireless device 216 .
- Such home automation devices may include one or more of the following, as discussed below.
- Camera 212 may be either indoors or outdoors and may provide a video and, possibly, audio stream which can be presented via wireless device 216 and/or display device 160 .
- Video and/or audio from camera 212 may be recorded by overlay device 251 or television receiver 150 upon an event occurring, such as motion being detected by camera 212 .
- Video and/or audio from camera 212 may be continuously recorded such as in the form of a rolling window, thus allowing a period of time of video/audio to be reviewed by a user from before a triggering event and after the triggering event.
- Video may be recorded on storage local to overlay device 251 or television receiver 150, or may be recorded and/or stored on external storage devices, such as a network attached storage device. In some embodiments, video may be transmitted across the local and/or wide area network to other storage devices upon occurrence of a trigger event for later playback.
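The rolling-window recording described above, continuously buffering frames so that footage from before and after a triggering event can be reviewed, is naturally modeled as a bounded pre-event buffer plus a post-event countdown. The sketch below is an illustrative assumption (frames are stand-in values; window sizes are arbitrary), not the disclosed implementation.

```python
from collections import deque

class RollingRecorder:
    """Keep a rolling pre-event window; on a trigger, save that window
    plus a fixed number of post-event frames as one clip."""

    def __init__(self, pre_frames=5, post_frames=5):
        self._pre = deque(maxlen=pre_frames)  # oldest frames drop off
        self._post_frames = post_frames
        self._post_remaining = 0
        self._clip = None
        self.saved_clips = []

    def on_frame(self, frame):
        if self._clip is not None:            # currently capturing a clip
            self._clip.append(frame)
            self._post_remaining -= 1
            if self._post_remaining == 0:
                self.saved_clips.append(self._clip)
                self._clip = None
        self._pre.append(frame)               # rolling window always updates

    def on_trigger(self):
        """E.g., motion detected: seed the clip with pre-event frames."""
        self._clip = list(self._pre)
        self._post_remaining = self._post_frames

rec = RollingRecorder(pre_frames=2, post_frames=2)
for frame in range(5):
    rec.on_frame(frame)       # rolling window now holds frames 3 and 4
rec.on_trigger()
rec.on_frame(5)
rec.on_frame(6)
print(rec.saved_clips)        # [[3, 4, 5, 6]]
```

In practice the buffer would hold encoded video segments rather than individual frames, and the saved clip would be written to local or network-attached storage as the text describes.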
- a still from camera 212 may be captured by and stored by television receiver 150 for subsequent presentation as part of a user interface via display device 160 such that the user can determine which camera (if multiple cameras are present) is being set up and/or later accessed.
- a user interface may display a still image from a front door camera (which is easily recognized by the user because it shows a scene in front of the house's front door) to allow a user to select the front door camera for viewing.
- video and, possibly, audio from camera 212 may be available live for viewing by a user via overlay device 251 or television receiver 150 .
- Such video may be presented simultaneously with television programming being presented.
- video may only be presented if motion is detected by camera 212 , otherwise video from camera 212 may not be presented by the display device presenting television programming.
- video (and, possibly, audio) from camera 212 may be recorded by television receiver 150 or overlay device 251 .
- Such video may be recorded based upon a timer configured by a user.
- camera 212 may be incorporated into an electronic programming guide (EPG) output for display by television receiver 150 .
- camera 212 may be presented as a “channel” as part of the EPG along with other television programming channels.
- a user may be permitted to select the channel associated with camera 212 for presentation via display device 160 (or wireless device 216 ).
- the user may also be permitted to set a timer to record the channel of camera 212 for a user-defined period of time on a user-defined date.
- Such recording may not be constrained by the rolling window associated with a triggering event being detected. For instance, recording camera 212 based on a timer may be useful if a babysitter is going to be watching a child and the parents want to later review the babysitter's behavior in their absence.
- video from camera 212 may be backed up to a remote storage device, such as cloud-based storage hosted by home automation service server 112 .
- Other data may also be cached to the cloud, such as configuration settings.
- a new device may be installed and the configuration data loaded onto the device from the cloud.
- window sensor 210 and door sensor 208 may transmit data to television receiver 150 (possibly via communication device 252) or overlay device 251 that indicates the status of a window or door, respectively. Such status may indicate open or closed. When a status change occurs, the user may be notified as such via wireless device 216 or display device 160. Further, a user may be able to view a status screen to view the status of one or more window sensors and/or one or more door sensors throughout the location. Window sensor 210 and/or door sensor 208 may have integrated glass break sensors to determine if glass has been broken.
- one or more smoke and/or CO2 detectors 209 may be integrated as part of a home automation system. As such, alerts as to whether a fire or CO2 has been detected can be sent to television receiver 150, wireless device 216, and/or emergency first responders. Further, television receiver 150 and/or wireless device 216 may be used to disable false alarms. One or more sensors may be integrated or separate to detect gas leaks, radon, or various other dangerous situations.
- pet door and/or feeder 211 may allow for pet related functionality to be integrated with television receiver 150 .
- a predefined amount of food may be dispensed at predefined times to a pet.
- a pet door may be locked and/or unlocked.
- the pet's weight or presence may trigger the locking or unlocking of the pet door.
- a camera located at the pet door may be used to perform image recognition of the pet or a weight sensor near the door may identify the presence of the pet and unlock the door.
- a user may also lock/unlock a pet door via television receiver 150 and/or wireless device 216.
- weather sensor 206 may allow television receiver 150 or overlay device 251 to receive, identify, and/or output various forms of environmental data, including temperature, humidity, wind speed, barometric pressure, etc.
- Television receiver 150 or overlay device 251 may allow for control of one or more shades, such as window, door, and/or skylight shades, within a house.
- Shade controller 204 may respond to commands from television receiver 150 or overlay device 251 and may provide status updates (e.g., shade up, shade 50% up, shade down, etc.).
- television receiver 150 may receive and notify a user of the status of electrical appliances such as refrigerators and dishwashers within the house.
- the television receiver 150 may be linked to the appliances and may present a notification message to the user through whatever device the user is using at the time, such as a tablet computer, mobile phone, or thin client.
- utility monitor 202 may serve to provide television receiver 150 or overlay device 251 with utility information, such as electricity usage, gas usage, water usage, wastewater usage, irrigation usage, etc.
- a user may view a status page or may receive notifications upon predefined events occurring, such as electricity usage exceeding a defined threshold within a month, or current kilowatt usage exceeding a threshold.
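The threshold-based utility notifications described above, alerting when monthly electricity usage or current kilowatt draw exceeds a defined limit, reduce to a simple comparison against user-configured limits. The function name, message text, and default thresholds below are illustrative assumptions.

```python
def utility_alerts(monthly_kwh, current_kw,
                   monthly_limit_kwh=900.0, draw_limit_kw=10.0):
    """Return alert messages for any utility threshold that is exceeded."""
    alerts = []
    if monthly_kwh > monthly_limit_kwh:
        alerts.append(
            f"Monthly usage {monthly_kwh} kWh exceeds {monthly_limit_kwh} kWh"
        )
    if current_kw > draw_limit_kw:
        alerts.append(
            f"Current draw {current_kw} kW exceeds {draw_limit_kw} kW"
        )
    return alerts

print(utility_alerts(950, 5))
# ['Monthly usage 950 kWh exceeds 900.0 kWh']
print(utility_alerts(100, 2))
# []
```

Analogous checks could cover gas, water, wastewater, and irrigation usage, with each alert routed to display device 160 or wireless device 216 as described.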
- FIG. 2 further shows a health sensor 214 that may permit a user's vital characteristics to be monitored, such as a heart rate.
- health sensor 214 may contain a button or other type of actuator that a user can press to request assistance.
- health sensor 214 may be mounted to a fixed location, such as bedside, or may be carried by a user, such as on a lanyard.
- Such a request may trigger a notification to be presented to other users via display device 160 and/or wireless device 216 .
- a notification may be transmitted to emergency first responders to request help.
- a home automation service provider may first try contacting the user, such as via phone, to determine if an emergency is indeed occurring.
- health sensor 214 may serve additional purposes, such as notification of another form of emergency, e.g., a break-in, fire, flood, theft, disaster, etc.
- the health sensor 214 may receive signals from various cameras, temperature sensors, and other monitoring equipment in connection with the home automation system, analyze such signals, and store or report such signals as necessary.
- health sensor 214 may be used as a medical alert pendant that can be worn or otherwise carried by a user. It may contain a microphone and/or speaker to allow communication with other users and/or emergency first responders.
- Television receiver 150 or overlay device 251 may be preprogrammed to contact a particular phone number (e.g., emergency service provider, relative, caregiver, etc.) based on an actuator of health sensor 214 being activated by a user. The user may be placed in contact with a person via the phone number and the microphone and/or speaker of health sensor 214 . Camera data may be combined with such alerts in order to give a contacted relative more information regarding the medical situation.
- health sensor 214 when activated in the family room, may generate a command which is linked with security camera footage from the same room.
- health sensor 214 may be able to monitor vitals of a user, such as a blood pressure, temperature, heart rate, blood sugar, etc.
- an event such as a fall or exiting a structure can be detected.
- parallel notifications may be sent by the health sensor 214 to multiple user devices at approximately the same time. As such, multiple people can be made aware of the event at approximately the same time (as opposed to serial notification). Which users are notified for which type of event may be customized by a user of television receiver 150 .
- data from other devices may trigger such parallel notifications according to various rules within the home automation system. For instance, a mailbox opening, a garage door opening, an entry/exit door opening at the wrong time, unauthorized control of specific lights during a vacation period, a water sensor detecting a leak or flow, a room or equipment temperature outside of a defined range, and/or motion detected at the front door are examples of possible events which may trigger parallel notifications.
- a configuring user may be able to select the users to notify and the method of notification from a list provided by the home automation system to enable such parallel notifications.
- the configuring user may prioritize which systems and people are notified, and specify that the notification may continue through the list unless acknowledged either electronically or by human interaction. For example, the user could specify that they want to be notified of any light switch operation in their home during their vacation. Notification priority could be 1) SMS Message, 2) push notification, 3) electronic voice recorder places call to primary number, and 4) electronic voice recorder places call to spouse's number.
- the second notification may never happen if the user replies to the SMS message with an acknowledgment. Or, the second notification would automatically happen if the SMS gateway cannot be contacted.
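The escalation behavior described above can be sketched as a prioritized channel list that is walked until the user acknowledges; the channel names and callbacks below are hypothetical stand-ins for SMS gateways, push services, and voice dialers, not part of the specification.

```python
# Sketch of prioritized notification escalation: channels are tried highest
# priority first, and escalation stops once the user acknowledges. If a
# channel cannot be contacted (send returns False), the walk continues.

def escalate(channels, send, acknowledged):
    """Walk the priority list until the user acknowledges a notification.

    channels     -- channel names, highest priority first
    send         -- send(channel) -> True if the channel accepted the message
    acknowledged -- acknowledged() -> True once the user has responded
    Returns the channels actually attempted.
    """
    attempted = []
    for channel in channels:
        if acknowledged():
            break                        # user already replied; stop
        attempted.append(channel)
        if send(channel) and acknowledged():
            break                        # delivered and acknowledged
    return attempted
```

For the vacation example, the list would be something like `["sms", "push", "voice_primary", "voice_spouse"]`: an SMS acknowledgment stops the walk after the first entry, while an unreachable SMS gateway lets escalation proceed to the push notification, matching the behavior described above.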
- intercom 218 may permit a user in one location to communicate with a user in another location who may be using wireless device 216, display device 160, or some other device, such as another television receiver within the structure.
- Intercom 218 may be integrated with camera 212 or may use a dedicated microphone/speaker, such as a Bluetooth® microphone.
- Microphones/speakers of wireless device 216 , display device 160 , communication device 252 , overlay device 251 may also or alternatively be used.
- a multimedia over coax (MOCA) network or other appropriate type of network may be used to provide audio and/or video based intercom via television receiver 150 with other television receivers and/or wireless devices in communication with television receiver 150 .
- video and/or audio conferencing can be provided, such that communication with persons via the Internet is possible. Therefore, one possible use would be video and/or audio conferencing within a structure using each television receiver (and associated connected display device) in the structure that is in communication, or allowing each television receiver to perform video/audio conferencing with other devices external to the structure or local area network.
- a microphone may be placed in a location where a user would typically be using intercom 218 .
- a microphone may be placed near display device 160 .
- a microphone may be integrated into a remote control of television receiver 150 . As such, if a user is using television receiver 150 via remote control, the user would have access to a microphone.
- a user can leverage the wireless device 216 , such as a mobile phone or tablet computer, as the microphone for the home automation system.
- doorbell sensor 223 may permit an indication of when a doorbell has been rung to be sent to multiple devices, such as television receiver 150 and/or wireless device 216 .
- doorbell sensor 223 detecting a doorbell ring may trigger video to be recorded by camera 212 of the area near the doorbell and the video to be stored until deleted by a user (or stored for predefined period of time).
- such a microphone, or a microphone on one or more other home automation devices, may allow for voice recognition to be performed by television receiver 150.
- Voice recognition may allow for a particular user to be determined and for commands to be completed based on a user speaking such commands. For instance, an adult user may be permitted to perform certain functions that a child user cannot, such as unlocking doors.
- Each user may provide a voice sample which is used by television receiver 150 to distinguish users from each other. Further, users may be able to speak commands, such as “lower heat 5 degrees,” to control home automation devices.
- television receiver 150 may determine to which home automation device the command is intended and may transmit an appropriate command (such as, in this example, a command to lower the heat setting by five degrees to thermostat 222 ).
- a user may use a user-defined code word that precedes or follows a command, such as saying “sesame” and then speaking a command such as “turn on the living room lights.”
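The speaker-dependent command flow above can be sketched as a code-word gate followed by a per-speaker permission check. Real systems would identify the speaker acoustically from the stored voice samples; here speaker identity is passed in directly, and the restricted-command list and role names are illustrative assumptions.

```python
# Hypothetical sketch: a code word gates the command, and the recognized
# speaker's role decides whether the command is permitted (e.g., a child
# may not unlock doors). The RESTRICTED set is an assumed example.

RESTRICTED = {"unlock the front door"}        # adult-only commands (assumed)

def parse_command(utterance, code_word="sesame"):
    """Strip a leading or trailing code word; return None if it is absent."""
    words = utterance.lower().split()
    if words and words[0] == code_word:
        return " ".join(words[1:])
    if words and words[-1] == code_word:
        return " ".join(words[:-1])
    return None

def authorize(command, speaker_role):
    """Allow restricted commands only for adult speakers."""
    if command is None:
        return False
    return speaker_role == "adult" or command not in RESTRICTED
```

For instance, `parse_command("sesame turn on the living room lights")` yields the bare command, while `authorize("unlock the front door", "child")` refuses it, mirroring the adult/child distinction described above.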
- fingerprint identification may be used to determine an identity of a user. Specific functions of television receiver 150 may require that a user log in, such as via a fingerprint scanner, before being able to view and/or modify such functions.
- light controller 220 may permit a light to be turned on, off, and/or dimmed by television receiver 150 or overlay device 251 (such as based on a user command received via wireless device 216 or directly via television receiver 150 or overlay device 251 ).
- Light controller 220 may control a single light. As such, multiple different light controllers 220 may be present within a house.
- a physical light switch (which opens and closes a circuit of the light) may be left in the on position such that light controller 220 can be used to control whether the light is on or off.
- Light controller 220 may be integrated into a light bulb or into a circuit (such as between the light fixture and the power source) to control whether the light is on or off.
- the user via television receiver 150 or overlay device 251 may be permitted to view a status of all light controllers 220 within a location. Since television receiver 150 or overlay device 251 may communicate using different home automation protocols, different light controllers 220 (and, more generally, different home automation devices) within a location may use disparate communication protocols, but may all still be controlled by television receiver 150 or overlay device 251 . In some embodiments, wireless light switches may be used that communicate with television receiver 150 or overlay device 251 . Such switches may use a different communication protocol than light controllers 220 . Such a difference may not affect functionality because television receiver 150 or overlay device 251 can serve as a hub for multiple disparate communication protocols and perform any necessary translation and/or bridging functions.
- a tablet computer may transmit a command over a WiFi connection and television receiver 150 or overlay device 251 may translate the command into an appropriate Zigbee or Zwave command for a wireless light bulb.
- the translation may occur for a group of disparate devices. For example, a user decides to turn off all lights in a room and selects a lighting command on the tablet computer.
- the overlay device 251 identifies the lights in the room and outputs appropriate commands to all devices over different protocols, such as a Zigbee wireless lightbulb and a Zwave table lamp.
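The hub-side translation just described can be sketched as a device registry plus one command encoder per protocol: a single room-level command fans out to whatever protocol each device speaks. The encoders and registry entries below are hypothetical stand-ins for real Zigbee/Z-Wave stacks.

```python
# Minimal sketch of the protocol-translation hub: one WiFi-borne command
# ("lights off in the living room") becomes per-device commands encoded for
# each device's own protocol. Encoder payloads are illustrative only.

ENCODERS = {
    "zigbee": lambda dev, action: f"zigbee on/off cluster -> {dev}: {action}",
    "zwave":  lambda dev, action: f"Z-Wave basic set -> {dev}: {action}",
}

DEVICES = [  # (device id, room, protocol) -- an assumed registry
    ("bulb-1", "living room", "zigbee"),
    ("lamp-2", "living room", "zwave"),
    ("bulb-3", "bedroom", "zigbee"),
]

def lights_command(room, action):
    """Translate one room-level command into per-protocol device commands."""
    return [ENCODERS[proto](dev, action)
            for dev, dev_room, proto in DEVICES if dev_room == room]
```

A command for the living room reaches both the Zigbee bulb and the Z-Wave lamp while leaving the bedroom device untouched, which is the bridging role the text assigns to television receiver 150 or overlay device 251.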
- Television receiver 150 may permit timers and/or dimmer settings to be set for lights via light controller 220 . For instance, lights can be configured to turn on/off at various times during a day according to a schedule (and/or events being detected by the home automation system).
- thermostat 222 may communicate with television receiver 150 or overlay device 251 .
- Thermostat 222 may provide heating/cooling updates on the location to television receiver 150 or overlay device 251 for display via display device 160 and/or wireless device 216 . Further, control of thermostat 222 may be effectuated via television receiver 150 or overlay device 251 . Zone control within a structure using multiple thermostats may also be possible.
- Leak detection sensor 224 of FIG. 2 may be in communication with television receiver 150 or overlay device 251 and may be used to determine when a water leak has occurred, such as in pipes supplying water-based fixtures with water.
- Leak detection sensor 224 may be configured to attach to the exterior of a pipe and listen for a sound of water moving within a pipe.
- sonar, temperature sensors, or ion-infused water with appropriate sensors may be used to detect moving water. As such, cutting or otherwise modifying plumbing may not be necessary to use leak detection sensor 224. If water movement is detected for greater than a threshold period of time, it may be determined that a leak is occurring.
- Leak detection sensor 224 may have a component that couples over an existing valve such that the flow of water within one or more pipes can be stopped.
- if leak detection sensor 224 determines a leak may be occurring, a notification may be provided to a user via wireless device 216 and/or display device 160 by television receiver 150 or overlay device 251. If a user does not clear the notification, the flow of water may be shut off by leak detection sensor 224 after a predefined period of time. A user may also be able to provide input to allow the flow of water to continue or to immediately interrupt the flow of water.
- the home automation system may utilize various rules to determine whether a leak is occurring. For example, a measurement threshold may be utilized in the event that water is flowing to an ice machine. The amount of water typically drawn by such a device may be known; if the flow rate and/or flow time significantly exceeds normal operating parameters, it may be determined that a leak is occurring.
- the home automation system may communicate with appliances to determine whether water is flowing to the device. For example, a home automation system may communicate with a washing machine in operation to determine that water is flowing to the appliance, and thus, determine that a water leak is not occurring. If no appliance is using water (and, possibly, it is known that no user is home) it may be determined that a leak is occurring.
- data from various motion sensors may be utilized. For example, if the system identifies that users have left the home, but a large flow of water is occurring, then the system may determine that a leak is occurring and notify a user or take remedial steps accordingly.
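The three leak rules above (sustained flow, appliance awareness, occupancy) can be combined into one decision; the threshold value and input names below are illustrative assumptions, not values from the specification.

```python
# Sketch of the combined leak rules: flag a leak only when flow has been
# sustained beyond a threshold, no appliance accounts for it, and nobody
# appears to be home. All thresholds are assumed for illustration.

def leak_suspected(flow_seconds, appliance_running, occupants_home,
                   max_unexplained_seconds=120):
    """Return True when sustained water flow has no plausible explanation."""
    if flow_seconds <= max_unexplained_seconds:
        return False                 # short draws (faucet, ice maker) are OK
    if appliance_running:
        return False                 # e.g., washing machine explains the flow
    if occupants_home:
        return False                 # a present user may be drawing water
    return True
```

So five minutes of flow with the house empty and no appliance running is flagged, while the same flow during a wash cycle is not, matching the washing-machine and motion-sensor examples above.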
- VoIP controller 225 may permit television receiver 150 to serve as a hub for a home phone system.
- One or more conventional telephones may be connected with television receiver 150 . Calls may be converted to IP by television receiver 150 and allow for calls to be received and placed via network 270 , which is connected with the Internet. The need for a dedicated home phone line may thus be eliminated.
- a cellular back channel (e.g., via a cellular modem) may be used to place and receive calls if network 270 is unavailable.
- Appliance controller 226 of FIG. 2 may permit a status of an appliance to be retrieved and commands to control operation to be sent to an appliance by television receiver 150 or overlay device 251 .
- appliance controller 226 may control a washing machine, a dryer, a dishwasher, an oven, a microwave, a refrigerator, a toaster, a coffee maker, a hot tub, or any other form of appliance.
- Appliance controller 226 may be connected with the appliance or may be integrated as part of the appliance.
- Appliances and other electronic devices may also be monitored for electricity usage.
- US Pat. Pub. No. 2013/0318559 filed Nov. 19, 2012, to Crabtree, entitled “Apparatus for Displaying Electrical Device Usage Information on a Television Receiver,” which is hereby incorporated by reference, may allow for information regarding the electricity usage of one or more devices (e.g., other home automation devices or circuits within a home that are monitored) to be determined.
- Control of one or more home automation devices may be dependent on electrical usage and stored electrical rates. For instance, a washing machine may be activated in the evening when rates are lower. Additionally or alternatively, operation of devices may be staggered to help prevent consuming too much power at a given time. For instance, an electric heater may not be activated until a dryer powered via the same circuit is powered down.
- Garage door controller 228 of FIG. 2 may permit a status of a garage door to be checked and the door to be opened or closed by a user via television receiver 150 or overlay device 251 .
- the garage door may be controlled. For instance, if wireless device 216 is a cellular phone and it is detected to have moved a threshold distance away from a house having a garage door controller 228 installed, a notification may be sent to wireless device 216. If no response is received within a threshold period of time, the garage door may be automatically shut. If wireless device 216 moves within a threshold distance of garage door controller 228, the garage door may be opened.
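The distance-based behavior just described can be sketched as a per-update decision: leaving past a threshold prompts the user and closes the door only if the prompt goes unanswered, while returning reopens it. The distances and flag names are illustrative assumptions.

```python
# Sketch of the geofenced garage logic. One status update (phone distance,
# door state, and whether an earlier prompt timed out unanswered) maps to a
# single action. Thresholds are assumed values, not from the specification.

def garage_action(distance_m, door_open, prompt_unanswered,
                  leave_threshold_m=500, arrive_threshold_m=100):
    """Decide what the garage controller should do for one status update."""
    if door_open and distance_m > leave_threshold_m:
        # Notify first; close only once the prompt times out unanswered.
        return "close" if prompt_unanswered else "notify"
    if not door_open and distance_m < arrive_threshold_m:
        return "open"
    return "none"
```

The two-step notify-then-close sequence preserves the text's requirement that a notification is sent and a response window elapses before the door is shut automatically.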
- Lock controller 230 of FIG. 2 may permit a door to be locked and unlocked and/or monitored by a user via television receiver 150 or overlay device 251 .
- lock controller 230 may have an integrated door sensor 208 to determine if the door is open, shut, or partially ajar. Being able to only determine if a door is locked or unlocked may not be overly useful—for instance, a lock may be in a locked position, but if the door is ajar, the lock may not prevent access to the house. Therefore, for security, a user may benefit from knowing both that a door is closed (or open) and locked (or unlocked).
- lock controller 230 may have an integrated door sensor 208 that allows for the single lock controller 230 to lock/unlock a door and provide a status as to whether the door is open or shut. Therefore, a single device may control a lock and determine whether the associated door is shut or open. No mechanical or electrical component may need to be integrated separately into a door or doorframe to provide such functionality. Such a single device may have a single power source that allows for sensing of the lock position, sensing of the door position, and for engagement/disengagement of the lock.
- Lock controller 230 may have an integrated door sensor that includes a reed switch or proximity sensor that detects when the door is in a closed position, with a plate of the lock in proximity to a plate on the door frame of the door.
- a plate of the lock may have an integrated magnet, or the doorframe plate may be magnetized.
- when in proximity to the magnet, a reed switch located in lock controller 230 may be used to determine that the door is closed; when not in proximity to the magnet, the reed switch may be used to determine that the door is at least partially ajar.
- other forms of sensing may also be used, such as a proximity sensor to detect a doorframe.
- the sensor to determine the door is shut may be integrated directly into the deadbolt or other latching mechanism of lock controller 230 . When the deadbolt is extended, a sensor may be able to determine if the distal end of the deadbolt is properly latched within a door frame based on a proximity sensor or other sensing means.
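The point of pairing the lock sensor with the door sensor, as described above, is that "locked" alone is not "secure". A small sketch of deriving a single status from the two readings in lock controller 230 (the status names are assumed):

```python
# Combine the deadbolt sensor and the reed-switch door sensor of lock
# controller 230 into one security status. A locked bolt with the door
# ajar is explicitly distinguished, since the lock then prevents nothing.

def door_status(bolt_extended, reed_closed):
    """Map (lock state, door state) to a single reportable status."""
    if reed_closed and bolt_extended:
        return "secure"                  # shut and locked
    if reed_closed:
        return "closed-unlocked"
    if bolt_extended:
        return "locked-but-ajar"         # bolt out, door not in the frame
    return "open"
```

Reporting `locked-but-ajar` rather than just `locked` captures exactly the case the text warns about: a lock in the locked position on a door that is not shut.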
- a home security system 207 of FIG. 2 may be integrated with a home automation system.
- the home security system 207 may detect motion, when a user has armed/disarmed the home security system 207 , when windows/doors are opened or broken, etc.
- Television receiver 150 may adjust settings of home automation devices based on home security system 207 being armed or disarmed.
- a virtual control and alarm panel may be presented to a user via a display device 160 and television receiver 150 .
- the functions of a wall mounted panel alarm can be integrated in the graphical user interface of the TV viewing experience such as a menu system with an underlying tree structure.
- the virtual control and alarm panel can appear in a full screen or Picture-in-Picture (PiP) with TV content.
- Alarms and event notification can be in the form of scrolling text overlays, popups, flashing icons, etc.
- Camera video (e.g., from camera 212) may be presented via display device 160 and television receiver 150.
- the camera's video stream can be displayed full screen, PiP with TV content, or as a tiled mosaic to display multiple cameras' streams at the same time.
- the display can switch between camera streams at fixed intervals.
- Television receiver 150 may perform video scaling, frame rate adjustment, and transcoding on video received from camera 212.
- television receiver 150 may adaptively transcode the camera content to match an Internet connection.
- Irrigation controller 232 of FIG. 2 may allow for a status and control of an irrigation system (e.g., sprinkler system) to be controlled by a user via television receiver 150 and/or overlay device 251 .
- Irrigation controller 232 may be used in conjunction with weather sensor 206 to determine whether and/or for how long irrigation controller 232 should be activated for watering. Further, a user, via television receiver 150 and/or overlay device, may turn on, turn off, or adjust settings of irrigation controller 232 .
- One or more motion sensors can be incorporated into one or more of the previously detailed home automation devices or as a stand-alone device. Such motion sensors may be used to determine if a structure is occupied. Such information may be used in conjunction with a determined location of one or more wireless devices. If some or all users are not present in the structure, home automation settings may be adjusted, such as by lowering a temperature of thermostat 222 , shutting off lights via light controller 220 , and determining if one or more doors are closed by door sensor 208 . In some embodiments, a user-defined script may be run when it is determined that no users or other persons are present within the structure.
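The away-mode behavior above reduces to: when motion sensors and detected wireless-device locations both indicate the structure is empty, run a user-defined script of device adjustments. A sketch, with an assumed script and illustrative action tuples:

```python
# Sketch of occupancy-driven automation: the away script (lower the
# thermostat, shut off lights, verify doors) runs only when no motion is
# detected and no known phones remain in the structure. Actions are
# illustrative placeholders, not a defined device API.

AWAY_SCRIPT = [
    ("thermostat", "set", 16),           # lowered heat setting (assumed, C)
    ("lights", "all_off", None),
    ("doors", "verify_closed", None),
]

def actions_for(motion_detected, phones_present):
    """Return the away script only when the structure appears unoccupied."""
    occupied = motion_detected or phones_present > 0
    return [] if occupied else list(AWAY_SCRIPT)
```

Requiring both signals to agree before acting keeps a forgotten phone or a sleeping occupant from being treated as an empty house.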
- a mailbox sensor may be attached to a mailbox to determine when mail is present and/or has been picked up.
- the ability to control one or more showers, baths, and/or faucets from television receiver 150 and/or wireless device 216 may also be possible.
- Pool and/or hot tub monitors may be incorporated into a home automation system. Such sensors may detect whether or not a pump is running, water temperature, pH level, a splash/whether something has fallen in, etc. Further, various characteristics of the pool and/or hot tub may be controlled via the home automation system.
- a vehicle dashcam may upload or otherwise make video/audio available to television receiver 150 when within range. For instance, when a vehicle has been parked within range of a local wireless network with which television receiver 150 is connected, video and/or audio may be transmitted from the dashcam to the television receiver for storage and/or uploading to a remote server.
- the home automation functions detailed herein that are attributed to television receiver 150 may alternatively or additionally be incorporated into overlay device 251 . As such, a separate overlay device 251 may be connected with display device 160 to provide home automation functionality.
- In FIG. 3, an embodiment of a television receiver 300, which may represent television receiver 150 of FIG. 1 and/or FIG. 2, is illustrated.
- Television receiver 300 may be configured to function as a host for a home automation system either alone or in conjunction with a communication device, such as communication device 252 of FIG. 2 .
- Television receiver 300 may be in the form of a separate device configured to be connected with a display device, such as a television.
- Embodiments of television receiver 300 can include set top boxes (STBs).
- a television receiver may be incorporated as part of another device, such as a television, other form of display device, video game console, computer, mobile phone or tablet or the like.
- a television may have an integrated television receiver (which does not involve an external STB being coupled with the television).
- television receiver 300 may be incorporated as part of a television, such as display device 160 of FIG. 1 .
- Television receiver 300 may include: processors 310 (which may include control processor 310 - 1 , tuning management processor 310 - 2 , and possibly additional processors), tuners 315 , network interface 320 , non-transitory computer-readable storage medium 325 , electronic programming guide (EPG) database 330 , television interface 335 , digital video recorder (DVR) database 345 (which may include provider-managed television programming storage and/or user-defined television programming), on-demand programming database 327 , home automation settings database 347 , home automation script database 348 , remote control interface 350 , security device 360 , and/or descrambling engine 365 .
- In other embodiments of television receiver 300, fewer or greater numbers of components may be present. It should be understood that the various components of television receiver 300 may be implemented using hardware, firmware, software, and/or some combination thereof. Functionality of components may be combined; for example, functions of descrambling engine 365 may be performed by tuning management processor 310-2. Further, functionality of components may be spread among additional components.
- processors 310 may include one or more specialized and/or general-purpose processors configured to perform processes such as tuning to a particular channel, accessing and displaying EPG information from EPG database 330 , and/or receiving and processing input from a user. It should be understood that the functions performed by various modules of FIG. 3 may be performed using one or more processors. As such, for example, functions of descrambling engine 365 may be performed by control processor 310 - 1 .
- Control processor 310 - 1 of FIG. 3 may communicate with tuning management processor 310 - 2 .
- Control processor 310 - 1 may control the recording of television channels based on timers stored in DVR database 345 .
- Control processor 310 - 1 may also provide commands to tuning management processor 310 - 2 when recording of a television channel is to cease.
- control processor 310 - 1 may provide commands to tuning management processor 310 - 2 that indicate television channels to be output to decoder module 333 for output to a display device.
- Control processor 310 - 1 may also communicate with network interface 320 and remote control interface 350 .
- Control processor 310 - 1 may handle incoming data from network interface 320 and remote control interface 350 . Additionally, control processor 310 - 1 may be configured to output data via network interface 320 .
- Control processor 310 - 1 of FIG. 3 may include the home automation engine 311 .
- Home automation engine 311 may permit television receiver 300 and control processor 310 - 1 to provide home automation functionality.
- Home automation engine 311 may have a JSON (JavaScript Object Notation) command interpreter or some other form of command interpreter that is configured to communicate with wireless devices via network interface 320 and a message server (possibly via a message server client). Such a command interpreter of home automation engine 311 may also communicate via a local area network with devices (without using the Internet).
- Home automation engine 311 may contain multiple controllers specific to different protocols; for instance, a ZigBee® controller, a Z-Wave® controller, and/or an IP camera controller (wireless LAN, 802.11) may be present.
- Home automation engine 311 may contain a media server configured to serve streaming audio and/or video to remote devices (on a local area network or the Internet).
- The television receiver may be able to serve such devices with recorded content, live content, and/or content recorded using one or more home automation devices, such as camera 212.
- Tuners 315 of FIG. 3 may include one or more tuners used to tune to transponders that include broadcasts of one or more television channels. Such tuners may also be used to receive, for storage, on-demand content and/or credit-earning television commercials and/or home automation functions. In some embodiments, two, three, or more than three tuners may be present, such as four, six, or eight tuners. Each tuner contained in tuners 315 may be capable of receiving and processing a single transponder stream from a satellite transponder (or from a cable network) at a given time. As such, a single tuner may tune to a single transponder stream at a given time.
- if tuners 315 include multiple tuners, one tuner may be used to tune to a television channel on a first transponder stream for display using a television, while another tuner may be used to tune to a television channel on a second transponder for recording and viewing at some other time. If multiple television channels transmitted on the same transponder stream are desired, a single tuner of tuners 315 may be used to receive the signal containing the multiple television channels for presentation and/or recording. Tuners 315 may receive commands from tuning management processor 310-2. Such commands may indicate to which frequencies tuners 315 are to be tuned.
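The sharing rule above (one tuner per transponder stream; channels on the same transponder share a tuner) determines how many tuners a set of simultaneous requests actually consumes. A sketch, with an assumed channel-to-transponder map:

```python
# Sketch of the tuner-sharing rule: a tuner is claimed per distinct
# transponder, so two channels on the same transponder need only one tuner.
# The channel-to-transponder mapping here is illustrative.

def tuners_needed(requested_channels, channel_map):
    """Count tuners required, reusing one tuner per distinct transponder."""
    transponders = {channel_map[ch] for ch in requested_channels}
    return len(transponders)
```

With eight tuners and this rule, a receiver can present or record more than eight channels simultaneously whenever some of them ride the same transponder stream.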
- Network interface 320 of FIG. 3 may be used to communicate via an alternate communication channel with a television service provider, if such communication channel is available.
- a communication channel may be via satellite (which may be unidirectional to television receiver 300 ) and the alternate communication channel (which may be bidirectional) may be via a network, such as the Internet.
- Data may be transmitted from television receiver 300 to a television service provider system and from the television service provider system to television receiver 300 .
- Information may be transmitted and/or received via network interface 320 . For instance, instructions from a television service provider may also be received via network interface 320 , if connected with the Internet.
- The primary communication channel may be via satellite, cable network, an IP-based network, or broadcast network.
- Network interface 320 may permit wireless communication with one or more types of networks, including using home automation network protocols and wireless network protocols. Also, wired networks may be connected to and communicated with via network interface 320 .
- Device interface 321 may represent a USB port or some other form of communication port that permits communication with a communication device.
- Storage medium 325 of FIG. 3 may represent one or more non-transitory computer-readable storage mediums.
- Storage medium 325 may include memory and/or a hard drive.
- Storage medium 325 may be used to store information received from one or more satellites and/or information received via network interface 320 .
- Storage medium 325 may store information related to on-demand programming database 327 , EPG database 330 , DVR database 345 , home automation settings database 347 , and/or home automation script database 348 .
- Recorded television programs may be stored using storage medium 325 as part of DVR database 345 .
- Storage medium 325 may be partitioned or otherwise divided (such as into folders) such that predefined amounts of storage medium 325 are devoted to storage of television programs recorded due to user-defined timers and stored television programs recorded due to provider-defined timers.
- Home automation settings database 347 of FIG. 3 may allow configuration settings of home automation devices and user preferences to be stored.
- Home automation settings database 347 may store data related to various devices that have been set up to communicate with television receiver 300 .
- home automation settings database 347 may be configured to store information on which types of events should be indicated to users, to which users, in what order, and what communication methods should be used. For instance, an event such as an open garage may only be notified to certain wireless devices (e.g., a cellular phone associated with a parent, not a child), notification may be by a third-party notification server, email, text message, and/or phone call.
- a second notification method may only be used if a first fails. For instance, if a notification cannot be sent to the user via a third-party notification server, an email may be sent.
- Home automation settings database 347 of FIG. 3 may store information that allows for the configuration and control of individual home automation devices which may operate using Z-Wave- and Zigbee-specific protocols. To do so, home automation engine 311 may create a proxy for each device that allows settings for the device to be solicited and collected via a user interface (e.g., presented on a television) by the television receiver or overlay device. The received settings may then be handled by the proxy specific to the protocol, allowing the settings to be passed on to the appropriate device. Such an arrangement may allow for settings to be collected and received via a UI of the television receiver or overlay device and passed to the appropriate home automation device and/or used for managing the appropriate home automation device.
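The per-device proxy pattern described above can be sketched as one proxy class per protocol: the UI collects generic settings once, and each proxy reshapes them into its protocol's form. The class names and payload shapes are hypothetical, not actual Z-Wave/Zigbee message formats.

```python
# Sketch of the settings-proxy arrangement: UI-collected settings are handed
# to a protocol-specific proxy, which produces the protocol's own payload.
# Payload field names below are illustrative assumptions.

class ZigbeeProxy:
    def apply(self, device_id, settings):
        return {"attr_write": device_id, "attrs": dict(settings)}

class ZwaveProxy:
    def apply(self, device_id, settings):
        return {"config_set": device_id, "params": dict(settings)}

PROXIES = {"zigbee": ZigbeeProxy(), "zwave": ZwaveProxy()}

def push_settings(device_id, protocol, ui_settings):
    """Hand UI-collected settings to the protocol-specific proxy."""
    return PROXIES[protocol].apply(device_id, ui_settings)
```

The UI layer never needs to know which protocol a device speaks; adding support for a new protocol means adding one proxy entry, which is the decoupling the text attributes to home automation engine 311.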
- Home automation script database 348 of FIG. 3 may store scripts that detail how home automation devices are to function based on various events occurring. For instance, if stored content starts being played back by television receiver 300 , lights in the vicinity of display device 160 may be dimmed and shades may be lowered by shade controller 204 . As another example, when a user shuts programming off late in the evening, there may be an assumption the user is going to bed. Therefore, the user may configure television receiver 300 to lock all doors via lock controller 230 , shut the garage door via garage controller 228 , lower a heat setting of thermostat 222 , shut off all lights via light controller 220 , and determine if any windows or doors are open via window sensor 210 and door sensor 208 (and, if so, alert the user). Such scripts or programs may be predefined by the home automation/television service provider and/or may be defined by a user.
- home automation script database 348 of FIG. 3 may allow for various music profiles to be implemented. For instance, based on home automation settings within a structure, appropriate music may be played. For instance, if the lights are dimmed, romantic music may be played. Conversely, based on the music being played, settings of home automation devices may be determined. If television programming, such as a movie, is output for playback by television receiver 150, a particular home automation script may be used to adjust home automation settings (e.g., lower lights, raise temperature, and lock doors).
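The script database described above amounts to a mapping from events to ordered device actions, so a playback event dims the lights and lowers the shades, and a late-evening shutoff runs the bedtime sequence. The event names and action tuples below are illustrative assumptions.

```python
# Sketch of the script database: each event maps to an ordered list of
# device actions, matching the playback and bedtime examples in the text.
# Names and values are illustrative, not a defined schema.

SCRIPTS = {
    "playback_started": [
        ("lights", "dim", 20),
        ("shades", "lower", None),
    ],
    "programming_off_late": [
        ("locks", "lock_all", None),
        ("garage", "close", None),
        ("thermostat", "set", 16),
        ("lights", "all_off", None),
    ],
}

def on_event(event):
    """Return the scripted actions for an event (empty if none defined)."""
    return SCRIPTS.get(event, [])
```

Keeping the scripts as data rather than code is what lets them be predefined by the service provider or edited by a user, as the text describes.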
- EPG database 330 of FIG. 3 may store information related to television channels and the timing of programs appearing on such television channels.
- EPG database 330 may be stored using storage medium 325 , which may be a hard drive or solid-state drive. Information from EPG database 330 may be used to inform users of what television channels or programs are popular and/or provide recommendations to the user. Information from EPG database 330 may provide the user with a visual interface displayed by a television that allows a user to browse and select television channels and/or television programs for viewing and/or recording. Information used to populate EPG database 330 may be received via network interface 320 , via satellite, or some other communication link with a television service provider (e.g., a cable network). Updates to EPG database 330 may be received periodically. EPG database 330 may serve as an interface for a user to control DVR functions of television receiver 300 , and/or to enable viewing and/or recording of multiple television channels simultaneously. EPG database 330 may also contain information about on-demand content or any other form of accessible content.
- Decoder module 333 of FIG. 3 may serve to convert encoded video and audio into a format suitable for output to a display device.
- decoder module 333 may receive MPEG video and audio from storage medium 325 or descrambling engine 365 to be output to a television.
- MPEG video and audio from storage medium 325 may have been recorded to DVR database 345 as part of a previously-recorded television program.
- Decoder module 333 may convert the MPEG video into a format appropriate to be displayed by a television or other form of display device, and the MPEG audio into a format appropriate to be output from speakers.
- Decoder module 333 may have the ability to convert a finite number of television channel streams received from storage medium 325 or descrambling engine 365 simultaneously. For instance, decoders within decoder module 333 may only be able to decode a single television channel at a time. Decoder module 333 may have various numbers of decoders.
- Television interface 335 of FIG. 3 may serve to output a signal to a television (or another form of display device) in a proper format for display of video and playback of audio.
- television interface 335 may output one or more television channels, stored television programming from storage medium 325 (e.g., television programs from DVR database 345 , television programs from on-demand programming database 327 and/or information from EPG database 330 ) to a television for presentation.
- Television interface 335 may also serve to output a CVM.
- DVR functionality may permit a television channel to be recorded for a period of time.
- DVR functionality of television receiver 300 may be managed by control processor 310 - 1 .
- Control processor 310 - 1 may coordinate the television channel, start time, and stop time of when recording of a television channel is to occur.
- DVR database 345 may store information related to the recording of television channels.
- DVR database 345 may store timers that are used by control processor 310 - 1 to determine when a television channel should be tuned to and its programs recorded to DVR database 345 of storage medium 325 . In some embodiments, a limited amount of storage medium 325 may be devoted to DVR database 345 .
- Timers may be set by the television service provider and/or one or more users of television receiver 300 .
- DVR database 345 of FIG. 3 may also be used to store recordings of service provider-defined television channels. For each day, an array of files may be created. For example, based on provider-defined timers, a file may be created for each recorded television channel for a day. For example, if four television channels are recorded from 6-10 PM on a given day, four files may be created (one for each television channel). Within each file, one or more television programs may be present.
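The one-file-per-channel-per-day layout can be sketched as follows; the file-naming scheme is an assumption for illustration and is not specified by the disclosure:

```python
# Sketch of the file-per-channel-per-day layout for provider-defined
# recordings: one container file per recorded channel for a given day.

from datetime import date

def recording_files(channels, day):
    """Return one (hypothetical) file name per recorded channel per day."""
    return [f"{day.isoformat()}_{channel}.ts" for channel in channels]

# Four channels recorded on one day yields four files, per the example.
files = recording_files(["NBC", "ABC", "CBS", "FOX"], date(2014, 12, 11))
```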
- the service provider may define the television channels, the dates, and the time periods for which the television channels are recorded for the provider-defined timers.
- the provider-defined timers may be transmitted to television receiver 300 via the television provider's network. For example, in a satellite-based television service provider system, data necessary to create the provider-defined timers at television receiver 150 may be received via satellite.
- a television service provider may configure television receiver 300 to record television programming on multiple, predefined television channels for a predefined period of time, on predefined dates. For instance, a television service provider may configure television receiver 300 such that television programming may be recorded from 7 to 10 PM on NBC, ABC, CBS, and FOX on each weeknight and from 6 to 10 PM on each weekend night on the same channels. These channels may be transmitted as part of a single transponder stream such that only a single tuner needs to be used to receive the television channels. Packets for such television channels may be interspersed and may be received and recorded to a file.
- a television program is selected for recording by a user and is also specified for recording by the television service provider, the user selection may serve as an indication to save the television program for an extended time (beyond the time which the predefined recording would otherwise be saved).
- Television programming recorded based on provider-defined timers may be stored to a portion of storage medium 325 for provider-managed television programming storage.
- On-demand programming database 327 of FIG. 3 may store additional television programming.
- On-demand programming database 327 may include television programming that was not recorded to storage medium 325 via a timer (either user- or provider-defined). Rather, on-demand programming may be programming provided to the television receiver directly for storage by the television receiver and for later presentation to one or more users. On-demand programming may not be user-selected. As such, the television programming stored to on-demand programming database 327 may be the same for each television receiver of a television service provider.
- On-demand programming database 327 may include pay-per-view (PPV) programming that a user must pay and/or use an amount of credits to view. For instance, on-demand programming database 327 may include movies that are not available for purchase or rental yet. Typically, on-demand programming is presented commercial-free.
- television channels received via satellite may contain at least some scrambled data. Packets of audio and video may be scrambled to prevent unauthorized users (e.g., nonsubscribers) from receiving television programming without paying the television service provider.
- the transponder stream may be a series of data packets corresponding to multiple television channels. Each data packet may contain a packet identifier (PID), which can be determined to be associated with a particular television channel. Particular data packets, referred to as entitlement control messages (ECMs), may be periodically transmitted.
- ECMs may be associated with another PID and may be encrypted; television receiver 300 may use decryption engine 361 of security device 360 to decrypt ECMs. Decryption of an ECM may only be possible if the user has authorization to access the particular television channel associated with the ECM. When an ECM is determined to correspond to a television channel being stored and/or displayed, the ECM may be provided to security device 360 for decryption.
- security device 360 may decrypt the ECM to obtain some number of control words. In some embodiments, from each ECM received by security device 360 , two control words are obtained. In some embodiments, when security device 360 receives an ECM, it compares the ECM to the previously received ECM. If the two ECMs match, the second ECM is not decrypted because the same control words would be obtained. In other embodiments, each ECM received by security device 360 is decrypted; however, if a second ECM matches a first ECM, the outputted control words will match; thus, effectively, the second ECM does not affect the control words output by security device 360 . Security device 360 may be permanently part of television receiver 300 or may be configured to be inserted and removed from television receiver 300 , such as a smart card, cable card or the like.
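The ECM-caching behavior described above (skip decryption when a received ECM matches the previous one) can be sketched as follows; the decryption step is stubbed out, since a real security device performs it in hardware or on a smart card:

```python
# Sketch of ECM handling with caching: duplicate ECMs do not trigger a
# second decryption, because they would yield the same control words.

class SecurityDevice:
    def __init__(self):
        self._last_ecm = None
        self._control_words = None
        self.decrypt_count = 0  # tracks how many real decryptions occurred

    def _decrypt(self, ecm):
        self.decrypt_count += 1
        # Stub: derive two "control words" from the ECM bytes.
        return (ecm[:4], ecm[4:8])

    def process_ecm(self, ecm):
        """Decrypt only when the ECM differs from the previous one."""
        if ecm != self._last_ecm:
            self._control_words = self._decrypt(ecm)
            self._last_ecm = ecm
        return self._control_words

device = SecurityDevice()
words_first = device.process_ecm(b"AAAABBBB")
words_repeat = device.process_ecm(b"AAAABBBB")  # duplicate ECM: cached
```

Either way the caller observes identical control words; caching merely avoids redundant decryption work, matching both embodiments described above.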
- Tuning management processor 310 - 2 of FIG. 3 may be in communication with tuners 315 and control processor 310 - 1 .
- Tuning management processor 310 - 2 may be configured to receive commands from control processor 310 - 1 . Such commands may indicate when to start/stop receiving and/or recording of a television channel and/or when to start/stop causing a television channel to be output to a television.
- Tuning management processor 310 - 2 may control tuners 315 .
- Tuning management processor 310 - 2 may provide commands to tuners 315 that instruct the tuners which satellite, transponder, and/or frequency to tune to. From tuners 315 , tuning management processor 310 - 2 may receive transponder streams of packetized data.
- Descrambling engine 365 of FIG. 3 may use the control words output by security device 360 in order to descramble video and/or audio corresponding to television channels and/or home automation functions for storage and/or presentation.
- Video and/or audio data contained in the transponder data stream received by tuners 315 may be scrambled.
- Video and/or audio data may be descrambled by descrambling engine 365 using a particular control word. Which control word output by security device 360 to be used for successful descrambling may be indicated by a scramble control identifier present within the data packet containing the scrambled video or audio.
- Descrambled video and/or audio may be output by descrambling engine 365 to storage medium 325 for storage (in DVR database 345 ) and/or to decoder module 333 for output to a television or other presentation equipment via television interface 335 .
- the television receiver 300 of FIG. 3 may be configured to periodically reboot in order to install software updates downloaded over the network 190 or satellites 130 . Such reboots may occur, for example, during the night when the users are likely asleep and not watching television. If the system utilizes a single processing module to provide both television receiving and home automation functionality, then the security functions may be temporarily deactivated during such a reboot. In order to increase the security of the system, the television receiver 300 may be configured to reboot at random times during the night in order to allow for installation of updates. Thus, an intruder is less likely to guess the time when the system is rebooting.
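Randomizing the nightly reboot time can be sketched as picking a uniformly random minute within a night window; the 2:00-5:00 AM bounds below are illustrative assumptions:

```python
# Sketch of choosing a random reboot time within a night window so an
# intruder cannot predict when security functions are briefly down.

import random

def pick_reboot_minute(start_minute=120, end_minute=300, rng=random):
    """Pick a random minute-of-day between 2:00 AM (120) and 5:00 AM (300)."""
    return rng.randint(start_minute, end_minute)

# A seeded generator is used here only to make the sketch reproducible.
minute = pick_reboot_minute(rng=random.Random(0))
```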
- the television receiver 300 may include multiple processing modules for providing different functionality, such as television receiving functionality and home automation, such that an update to one module does not necessitate reboot of the whole system. In other embodiments, multiple processing modules may be made available as a primary and a backup during any installation or update procedures.
- television receiver 300 of FIG. 3 has been reduced to a block diagram; commonly known parts, such as a power supply, have been omitted. Further, some routing between the various modules of television receiver 300 has been illustrated. Such illustrations are for exemplary purposes only. The state of two modules not being directly or indirectly connected does not indicate the modules cannot communicate. Rather, connections between modules of the television receiver 300 are intended only to indicate possible common data routing. It should be understood that the modules of television receiver 300 may be combined into a fewer number of modules or divided into a greater number of modules. Further, the components of television receiver 300 may be part of another device, such as built into a television. Television receiver 300 may include one or more instances of various computerized components, such as disclosed in relation to computer system 700 of FIG. 7 .
- the television receiver 300 has been illustrated as a satellite-based television receiver, it is to be appreciated that techniques below may be implemented in other types of television receiving devices, such as cable receivers, terrestrial receivers, IPTV receivers or the like.
- the television receiver 300 may be configured as a hybrid receiving device, capable of receiving content from disparate communication networks, such as satellite and terrestrial television broadcasts.
- the tuners may be in the form of network interfaces capable of receiving content from designated network locations.
- the home automation functions of television receiver 300 may be performed by an overlay device. When such an overlay device is used, television programming functions may still be provided by a television receiver that is not used to provide home automation functions.
- one or more home automation functions may be performed by a voice command engine 370 , which may be incorporated in the home automation engine 311 as shown and/or in the storage medium 325 .
- the voice command engine 370 may provide for speaker-dependent commands to be implemented in the home automation system, as described in further detail in the following paragraphs.
- Turning to FIG. 4 , an example method 400 for controlling a device in a home automation system, such as the home automation system 200 , based on a speaker-dependent command is provided.
- the method 400 may be implemented by the voice command engine 370 , which may be incorporated in the home automation engine 311 that is found in the television receiver 150 and/or the overlay device 251 .
- the method 400 shown, and any other methods disclosed herein, may include additional and/or alternative steps in relation to the steps being shown. Further, any steps may be optional, rearranged, and/or combined. Numerous variations are possible.
- the method 400 may include receiving a voice command 402 .
- the voice command 402 may be a spoken voice command provided by a user, e.g. a speaker on the premises having the home automation system 200 , and directed to adjusting a state of one or more devices in the home automation system 200 .
- the speaker may be an adult located in a living room of a house equipped with the home automation system 200 and the spoken voice command from the adult may include, “Lower the heat by five degrees.”
- Such spoken voice commands may be detected by one or more microphones in operative communication with the voice command engine 370 .
- the microphone(s) utilized by the voice command engine 370 may be found on any television receiver, a television remote control, and/or any of the home automation devices in the home automation system 200 , such as the intercom 218 , display device 160 , home security 207 , camera 212 , health sensor 214 , or any other device shown in FIG. 2 .
- the voice command may be communicated to the voice command engine 370 via a local wireless area network, a wired network, a home automation network, and/or any other type of communications network. Other examples are possible.
- the voice command received by the voice command engine 370 may be provided by the speaker via a mobile device, such as the wireless device 216 , which may represent a tablet computer, cellular phone, laptop computer, remote computer, or some other device through which a user may desire to control home automation settings and view home automation information.
- Voice commands may be detected by a microphone of the wireless device 216 , which may further encode and/or transmit the detected voice command to the voice command engine 370 via a communications network.
- the voice command engine 370 may receive the voice command from a remotely-located microphone, e.g. a mobile device and speaker not located on the premises having the home automation system 200 , and further decode the voice command signal.
- the wireless device 216 may support a mobile application or app that is configured to detect, encode, and/or transmit the voice command to the voice command engine 370 .
- the voice command engine 370 connects to a VoIP platform and receives voice commands via VoIP, or traditional phone calls via a cellular or landline network. Still, other examples are possible. Further, other sensors for detecting the voice command may be possible.
- the received voice command may be preceded by a spoken code-word that is received by the voice command engine 370 in a similar manner as described above. It is contemplated that such code-words, or phrases, such as code-word “Sesame” preceding the command, “Lower the heat by five degrees,” may be user-defined and/or pre-programmed.
- a code-word may be directed to activating or otherwise indicating to the voice command engine 370 an incoming voice command.
- the code-word may be user-specific and indicate a speaker identity and/or speaker verification to the voice command engine 370 .
- the code-word may serve as a password or passcode for controlling any devices in the home automation system 200 , and may be device-specific, alternatively and/or additionally to being user-specific.
- the code-word may be utilized to alter operational settings of other modules or devices in operation with the voice command engine 370 to implement one or more settings for receiving the voice command.
- various or all components of the voice command engine 370 may be in a sleep mode until the code-word is detected at the engine 370 and/or a television receiver, whereupon the voice command engine 370 may be activated by an activation module therein and/or by a signal from the television receiver. This may prepare the voice command engine 370 for receiving the voice command and for performing subsequent steps thereafter.
- a volume level of a television or other speaker may be lowered in preparation for receiving the voice command at the voice command engine 370 .
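The code-word gating described in the preceding paragraphs can be sketched as follows; the code-words, speaker names, and single-string utterance format are illustrative assumptions:

```python
# Sketch of code-word gating: the engine stays "asleep" until an
# utterance begins with a known code-word, which both activates the
# engine and (for user-specific code-words) hints at the speaker.

CODE_WORDS = {"sesame": "Sam", "genie": "Alex"}  # code-word -> speaker

def handle_utterance(utterance):
    """Return (speaker, command) when the utterance starts with a known
    code-word; otherwise return None and remain in sleep mode."""
    first, _, rest = utterance.strip().partition(" ")
    speaker = CODE_WORDS.get(first.lower())
    if speaker is None:
        return None  # no activation
    return speaker, rest

result = handle_utterance("Sesame lower the heat by five degrees")
```

On activation, a real implementation would also lower the television volume and switch to the best-placed microphone, as described above, before capturing the command.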
- the voice command engine 370 may acknowledge the code-word and/or the speaker identity prior to receiving the voice command. For example, the voice command engine 370 , upon activation in response to the spoken code-word, may output in speech and/or text format on a display screen, “Yes, Sam?” to indicate the voice command engine 370 is active and/or to indicate a determined speaker identity of the code-word. Following that responsive prompt, the voice command engine 370 may activate one or more microphones to receive the voice command and/or provide further prompts and receive further responses to clarify the speaker identity and/or clarify the intentions of the voice command.
- activating one or more microphones may include determining whether another microphone is in closer proximity to the speaker, turning that microphone on, and switching the original microphone that picked up the code-word to an off state. This may ensure quality and accuracy of the voice detection and subsequent voice recognition process.
- the voice command engine 370 may determine that more than one speaker is giving voice commands at any given time. In that case, if the voice commands include conflicting orders, the voice command engine 370 may determine the plurality of speaker identities involved and determine which speaker identity has the higher level of permission, e.g. adult or child, and implement the voice command of the higher-ranked speaker in remaining steps of the method 400 . In some cases, the voice command engine 370 may confirm denial of certain voice commands and/or output a reason for denying such commands. Similarly, the voice command engine 370 may confirm reception of the voice command, and/or repeat the received voice command as a confirmation to the speaker immediately after receiving the voice command. Still, other examples are possible.
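Resolving conflicting simultaneous commands by permission level, as described above, can be sketched as follows; the role names and numeric ranks are assumptions for illustration:

```python
# Sketch of conflict resolution between simultaneous commands: the
# command from the highest-ranked speaker role wins (e.g., adult over
# child), per the description above.

PERMISSION_RANK = {"adult": 2, "child": 1}  # higher rank wins

def resolve_conflict(commands):
    """commands: list of (speaker_role, command) pairs.
    Return the command from the highest-ranked role."""
    winning_role, winning_cmd = max(
        commands, key=lambda pair: PERMISSION_RANK.get(pair[0], 0))
    return winning_cmd

winner = resolve_conflict([("child", "raise heat to 80"),
                           ("adult", "lower heat by five degrees")])
```

A real engine would additionally confirm denial of the losing command and/or state a reason, as the text notes.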
- the method 400 may include determining a speaker identity 404 .
- the voice command engine 370 may perform a voice recognition analysis to determine a speaker identity of the received voice command. For instance, the voice command engine 370 may analyze a speech pattern, voice pitch, speaking style, accent, and/or other aspects of speech to determine an identity of the speaker.
- the received voice command, and/or portions thereof is compared against one or more voice samples in a database of voices, e.g. a voice database of the voice command engine 370 as shown in FIG. 5 .
- the voice database may include one or more voice samples associated with one or more speaker identities. When a match is determined by the voice command engine 370 in comparing the voice command to the voice sample(s), the speaker identity may be determined, and/or tentatively determined and output for final confirmation from the speaker.
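The voice-database matching described above can be sketched with a toy similarity metric; a real implementation compares acoustic features (pitch, speech pattern, accent), which are stood in here by plain number tuples purely for illustration:

```python
# Toy sketch of speaker identification against stored voice samples:
# pick the closest stored sample, subject to a minimum-match threshold.

def similarity(a, b):
    """Negated sum of absolute feature differences (toy metric):
    larger (closer to zero) means a better match."""
    return -sum(abs(x - y) for x, y in zip(a, b))

def identify_speaker(features, voice_db, threshold=-5.0):
    """Return the best-matching speaker identity, or None if no stored
    sample is similar enough (prompting for confirmation instead)."""
    best_id, best_score = None, float("-inf")
    for speaker_id, sample in voice_db.items():
        score = similarity(features, sample)
        if score > best_score:
            best_id, best_score = speaker_id, score
    return best_id if best_score >= threshold else None

VOICE_DB = {"Sam": (1.0, 5.0, 3.0), "Alex": (8.0, 2.0, 7.0)}
who = identify_speaker((1.2, 5.1, 2.8), VOICE_DB)
```

The threshold models the "tentatively determined" case: a weak match would return None, and the engine could then ask the speaker for final confirmation.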
- the voice command engine 370 may perform the voice recognition analysis to verify that a speaker is who they say they are. For instance, the voice command engine 370 may detect that the speaker has identified herself in the voice command and/or code-word, or otherwise. In that case, the voice command engine 370 may treat the speaker-provided identification as a tentative identity, and perform further voice recognition analysis to verify the identity. Such analysis may be performed utilizing the voice database as described above. It is contemplated that such features may enhance secure access to the controls for the home automation system 200 . For instance, further prompts and/or denial of access to the home automation system 200 may be implemented by the voice command engine 370 in response to determining a false speaker verification.
- the method 400 may include determining a device for control 406 .
- the voice command engine 370 may perform a speech recognition analysis to identify one or more devices in the home automation system 200 that are intended to be controlled by the received command. For instance, the voice command engine 370 may convert the spoken voice command to a digitally stored set of words. In some cases, all or a portion of the set of words are compared to one or more device names, such as device IDs, which may be stored in a database of device names, e.g. a controls database of the voice command engine 370 as shown in FIG. 5 . In that case, the voice command engine 370 may identify one or more words in the set of words that match one or more device names in the controls database. The matched device name(s) may indicate the intended device(s) for control.
- the voice command engine 370 may utilize speech recognition to determine one or more words or phrases in the spoken voice command that may be related to a function of a device, and determine the device to be controlled based on the function revealed in the voice command.
- the voice command engine 370 may determine that in the received voice command for “Lower the heat by five degrees,” no particular device was verbally identified by the speaker. Additionally and/or alternatively, the voice command engine 370 may determine that certain portions of the voice command include command phrases, such as the phrase “lower the heat” and/or the word “degrees”. The combination of the identified command phrases, and/or a command phrase taken alone, may be sufficient for the voice command engine 370 to determine the device(s) to control.
- the voice command engine 370 may compare one or more of the control phrases to a database of control phrases, e.g. controls database of the voice command engine 370 as shown in FIG. 5 , whereby one or more control phrases may each be associated with one or more device names, e.g. device IDs. In this way, the voice command engine 370 may look up or otherwise determine the intended device based on a match of the identified command phrase(s) with one or a grouping of control phrases that are associated with a device ID. In the example voice command for “Lower the heat by five degrees,” the voice command engine 370 may determine that the device intended for control is a thermostat, e.g. thermostat 222 .
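The controls-database lookup just described can be sketched as a phrase-to-device-ID table; the phrases and device IDs below are illustrative assumptions:

```python
# Sketch of mapping recognized command phrases to device IDs via a
# controls database, per the "Lower the heat by five degrees" example.

CONTROL_PHRASES = {
    "lower the heat": "thermostat-222",
    "raise the heat": "thermostat-222",
    "turn the light": "light-220",
}

def device_for_command(command):
    """Return the device ID for the first control phrase found in the
    spoken command, or None if no phrase matches."""
    text = command.lower()
    for phrase, device_id in CONTROL_PHRASES.items():
        if phrase in text:
            return device_id
    return None  # caller may prompt the speaker for more information

device = device_for_command("Lower the heat by five degrees")
```

The None case corresponds to the engine prompting for additional voice commands when the initial command is insufficient to determine a device.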
- control phrases may be determined based on a plurality of voice commands.
- the voice command engine 370 may prompt the speaker for additional voice commands and/or other input if an initially received command was determined insufficient for determining a device.
- the voice command engine 370 may determine a list of possible device IDs indicating devices to control based on the control phrase(s), and prompt the user for more information or voice commands in order to narrow down the list of device IDs to the determined device ID.
- other cues may be captured by the voice command engine 370 via various devices in the home automation system 200 . For instance, a speaker location may be detected by one or more devices and transmitted to the voice command engine 370 to be utilized for determination of the device to control.
- the camera 212 in a living room may notify the voice command engine 370 , via the home automation network or another communications network, that the speaker is located in the living room, as detected by the camera 212 .
- the voice command engine 370 may lower the thermostat that controls the living room by five degrees, while other thermostats in other parts of the house are unaffected, e.g. the voice command engine 370 does not send the command signal to other thermostats.
- the voice command engine 370 may query other devices, such as the camera 212 , for such additional information in order to determine the intended device.
- the voice command engine 370 may implement speech recognition to determine a new state or status being requested. For example, the voice command engine 370 may determine “five degrees” to be a magnitude or unit of change for the determined device, based on the example received voice command of “Lower the heat by five degrees.” In another example, the voice command engine 370 may determine that “on” is a new state in an example received voice command of “Turn the light on.” In another aspect, control phrases may be user-defined and/or pushed from a satellite to the controls database during periodic updates from, for example, the home automation service server 112 via satellite connections and/or the network 190 , as shown in FIG. 1 . In this way, it is contemplated that the controls database stays current and relevant to the devices connected to the home automation system 200 . Still, other examples are possible.
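Extracting the requested new state or magnitude from a recognized command, per the "five degrees" and "on" examples above, can be sketched as follows; the small number-word table and regex patterns are illustrative assumptions:

```python
# Sketch of parsing a magnitude ("by five degrees") or a new state
# ("on"/"off") out of a recognized voice command.

import re

NUMBER_WORDS = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}

def parse_change(command):
    """Return ('delta', n) for 'by N degrees' commands, ('state', word)
    for on/off commands, or None when no change is recognized."""
    text = command.lower()
    match = re.search(r"by (\w+) degrees", text)
    if match:
        word = match.group(1)
        value = NUMBER_WORDS.get(word)
        if value is None and word.isdigit():
            value = int(word)
        return ("delta", value)
    match = re.search(r"\b(on|off)\b", text)
    if match:
        return ("state", match.group(1))
    return None

change = parse_change("Lower the heat by five degrees")
```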
- the method 400 may include determining a permission of the speaker to control the identified device(s) 408 .
- the voice command engine 370 may determine a permission status to control the identified device(s). The permission status may be based on the determined speaker identity and/or the identified device(s) to control. It is contemplated that the permission status enhances safety and security of the home automation system 200 by having the voice command engine 370 prohibit otherwise undesirable controls from being implemented. In another aspect, the permission status allows the voice command engine 370 to provide greater flexibility for user(s), who may set when certain commands are forbidden and when certain commands are allowable.
- the permission status determined by the voice command engine 370 may include an access granted status and/or an access denied status.
- the permission status may be determined based on one or more variables, such as the speaker identity, access settings associated with the speaker identity such as parental controls settings, the identified control phrases in the voice command, the device for control, the code words, voice samples, a current status of the device to control or other devices in the home automation system, and so on.
- other variables such as time of day, may be considered by the voice command engine 370 in determining the permission status.
- the voice command engine 370 may determine a plurality of permission statuses.
- the voice command engine 370 may determine that some of the plurality of identified devices have an access granted status, while others of the same plurality of identified devices have an access denied status. In other examples, the voice command engine 370 may determine that none of the plurality of devices is permitted for control. In still other examples, the voice command engine 370 may determine the access granted status for the permission status of all of the plurality of determined devices.
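The per-device permission determination described above can be sketched as a lookup keyed by speaker role and device; the access-settings table below is an illustrative assumption (e.g., parental controls denying a child thermostat access):

```python
# Sketch of per-device permission statuses based on speaker identity
# and access settings; unknown combinations default to "denied".

ACCESS_SETTINGS = {
    ("child", "thermostat-222"): "denied",   # parental control example
    ("child", "light-220"): "granted",
    ("adult", "thermostat-222"): "granted",
    ("adult", "light-220"): "granted",
}

def permission_statuses(speaker_role, device_ids):
    """Return one permission status per identified device."""
    return {device: ACCESS_SETTINGS.get((speaker_role, device), "denied")
            for device in device_ids}

statuses = permission_statuses("child", ["thermostat-222", "light-220"])
```

This yields the mixed outcome the text describes: some of a plurality of identified devices granted, others denied, for the same speaker.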
- the method 400 may include transmitting a control signal, e.g. operational command, to the determined device 410 .
- the voice command engine 370 may control the identified device(s) in the home automation system 200 based on the determined permission status.
- the voice command engine 370 may transmit an operational command signal to the identified device based on the access granted permission status.
- the voice command engine 370 may generate the operational command based on a communication protocol, e.g. Zigbee®, Z-Wave®, specific to the identified device and/or transmit the generated operational command to the device using the communication protocols in the home automation system 200 . Further examples may be possible.
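Generating a protocol-specific operational command before transmission can be sketched as follows; the frame fields are invented for illustration, and real Zigbee/Z-Wave encoding is considerably more involved:

```python
# Sketch of building an operational command in the protocol of the
# target device before transmission over the home automation network.

DEVICE_PROTOCOLS = {"thermostat-222": "zwave", "light-220": "zigbee"}

def build_operational_command(device_id, action, value):
    """Produce a (hypothetical) protocol-specific command frame."""
    protocol = DEVICE_PROTOCOLS[device_id]
    if protocol == "zwave":
        return {"proto": "zwave", "node": device_id,
                "cmd": action, "val": value}
    return {"proto": "zigbee", "endpoint": device_id,
            "cluster": action, "arg": value}

cmd = build_operational_command("thermostat-222", "lower_setpoint", 5)
```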
- the method 400 may further include outputting a confirmation notification that indicates a new state of the identified device, whereby the new state is based on the received voice command.
- step 410 is optional and may be dependent on the permission status being determined as an access granted status. Additionally, it is noted that any of the steps in the method 400 may be optional. Further, it is noted that a plurality of control signals may be delivered to a plurality of identified devices when necessary.
- the voice command engine 370 may output, after determining an access denied permission status, a notification to inform the speaker that the desired control was not implemented.
- notifications may include speech notifications, sounds, text on display screens, and so on.
- the voice command engine 370 may maintain a current state of the identified device and/or not transmit any control signal to the identified device.
- Other examples are possible.
- the voice command engine 370 may be incorporated in the home automation engine 311 that is found in the television receiver 150 and/or the overlay device 251 .
- the voice command engine 370 is provided for in the control processor 310 - 1 and/or the storage medium 325 of the television receiver 300 as shown in FIG. 3 .
- various modules of the voice command engine 370 can be provided for by different parts of the television receiver 300 , and/or any computer system such as the disclosed computer system 700 in FIG. 7 . It is noted that the modules may be arranged in any manner and in operative communication with one another.
- any of the modules may be rearranged, optional, and/or additional modules may be included in the voice command engine 370 .
- the voice command engine 370 may identify a user, e.g. a speaker, and push an operational command to a home automation device if access for the user is authorized.
- the voice command engine 370 may include a database 502 that may be divided into, or otherwise include, a voice database 504 , a settings database 506 , and/or a controls database 508 .
- the voice database 504 may include stored voice samples 510 and/or code words 512 .
- the settings database 506 may include speaker identities 514 and/or access settings 516 .
- the controls database 508 may include control phrases 518 and/or device identifications 520 . It is noted that any of the data type modules may be commonly shared among the databases 504 , 506 , 508 , and that any of the modules may be optional and/or mapped to associate with each other.
- the speaker identities 514 may be stored in the voice database 504 and the code words 512 may be stored in the settings database 506 .
- Other examples are possible.
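The database 502 and its three sub-databases described above can be sketched as a simple in-memory mapping. This is an illustrative assumption only; the disclosure does not specify a schema, and all field names here are hypothetical.

```python
def make_database():
    """Build an empty in-memory stand-in for the database 502,
    split into the voice, settings, and controls sub-databases."""
    return {
        "voice": {"voice_samples": {}, "code_words": {}},                # voice database 504
        "settings": {"speaker_identities": {}, "access_settings": {}},   # settings database 506
        "controls": {"control_phrases": {}, "device_identities": {}},    # controls database 508
    }
```

As the disclosure notes, the data types may be shared or remapped among the sub-databases, so any such layout would be a matter of implementation choice.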
- Voice samples 510 may be gathered and stored for each user of the voice command engine 370 during an initial setup.
- the voice command engine 370 may receive one or more voice samples during the initial setup, and/or associate such voice samples with speaker identities 514 for each user.
- voice samples may be captured by a microphone in communication with the voice command engine 370 and/or consist of certain voice commands, predetermined phrases, and/or personalized phrases.
- a training session may be initiated during the initial setup to train the voice command engine 370 to a speaker's voice by collecting audio samples of the speaker and associating the audio samples with the speaker's identity.
- the associated one or more voice samples with the speaker's identity may be stored in the voice database 504 , and/or the database 502 in general.
- Code words 512 may be received by the voice command engine 370 during an initial setup and upon detection by a microphone connected thereto.
- the code-word may be user configured and assigned to a specific speaker, e.g. specific speaker identity.
- a code-word may be assigned to multiple speaker identities.
- the code-word may be a general code-word that precedes any voice command to be received by the voice command engine 370 , and is not associated with a particular speaker.
- Such code-words may be user-configured and/or preprogrammed, and may consist of a word or a phrase.
- specific code-words may be associated with particular locations or rooms containing certain devices.
- a speaker in a living room may use “Sesame” as a code-word preceding a voice command, “Lower the heat by five degrees,” to indicate to the voice command engine 370 that the device to control is in the living room.
- the speaker may use a different code-word, e.g. “Genie, lower the heat by five degrees,” to indicate to the voice command engine 370 that the speaker is in a different room, such as a kitchen, and therefore the kitchen thermostat is intended for control.
- the voice command engine 370 may utilize the code-word to distinguish particular device(s) to control.
- the code words 512 may be utilized to start up one or more components of the voice command engine 370 upon detection thereof. For example, specific words may be spoken by a user to get attention from the voice command engine 370 .
- the voice command engine 370 and/or a device in operative communication with, or containing, the engine 370 may continuously search for an activation-type code word.
- the voice command engine 370 may continuously search for and identify a set of words and/or unique inflections to determine that the voice command engine 370 is about to receive a voice command.
- the code words may serve as authentication phrases or passwords to access the controls of the voice command engine 370 , and/or as identifying phrases to determine a speaker identity, device identity, locations, and so on.
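The room-specific code-word example above ("Sesame" for the living room, "Genie" for the kitchen) can be sketched as a prefix lookup on the utterance. The mapping and function names are illustrative assumptions, not part of the disclosure.

```python
# Illustrative mapping of code words to rooms; an assumption for this sketch.
CODE_WORD_ROOMS = {"sesame": "living room", "genie": "kitchen"}

def resolve_room(utterance):
    """Return the room implied by a leading code word, or None if absent."""
    words = utterance.strip().lower().split()
    if not words:
        return None
    return CODE_WORD_ROOMS.get(words[0].rstrip(","))
```

A real engine would perform this lookup on recognized speech rather than text, but the same leading-token logic applies.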
- Speaker identities 514 may be received through text and/or captured by a microphone during an initial set-up phase of the voice command engine 370 .
- the speaker identity may be user-configured and/or include a speaker's first name, last name, and/or nickname.
- a speaker may identify herself by stating her speaker identity, whereupon the voice command engine 370 may use voice recognition analysis to further verify that the speaker is not providing a false identity.
- the voice command engine 370 may output a notification regarding denied access, a prompt to restate the speaker identity, and/or notify other recipient devices, e.g. mobile devices of other users, of the attempted access.
- the voice command engine 370 may look up one or more voice samples associated with the speaker identity to verify the speaker.
- the speaker may provide a voice command, and the voice command engine 370 may analyze the voice command for a match with one or more of the voice samples 510 , which are further mapped to one or more speaker identities. In that case, the voice command engine 370 may determine the speaker identity based on voice recognition.
- the database 502 may provide a list of unavailable speaker identities. For example, a speaker may not be able to select, during the initial setup, a name that is also a device name, e.g. device identity. Other unavailable speaker identities may be user-configured and/or preprogrammed. Further, in some examples, the voice command engine 370 may prohibit device identities from overlapping with speaker identities.
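One way the identity check described above could work is sketched below, with a placeholder `features_match` function standing in for the actual voice-recognition scoring, which the disclosure does not specify.

```python
def verify_speaker(claimed_identity, sample_features, voice_samples, features_match):
    """Verify a claimed identity against stored voice samples.

    voice_samples maps speaker identities to lists of stored feature sets;
    features_match is a caller-supplied stand-in for real voice scoring.
    """
    stored = voice_samples.get(claimed_identity)
    if not stored:
        return False  # unknown identity: treat as a failed verification
    return any(features_match(sample_features, s) for s in stored)
```

On a failed verification the engine could then emit the denied-access notification or restate prompt described above.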
- Access settings 516 may be user-configurable and associated with one or more speaker identities.
- the voice command engine 370 may store the one or more access settings associated with the speaker identity as a user profile.
- a user profile is a child's profile with parental control access settings implemented.
- Such access settings may be user-configured, e.g. configured by a parent, and include, merely by way of example, denying voice commands related to altering operation of a thermostat, unlocking a front door, using the television based on a user-specified time of day, and so on.
- the access settings, upon lookup by the voice command engine 370 , may allow and/or prevent operational signals from being transmitted to certain determined devices.
- the access settings may be associated with particular device identities and/or code words.
- access settings may include storing a user's mobile device identification and communications information, so that the user may communicate with the voice command engine 370 from a remote location. Other examples are possible.
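The parental-control example above (a child profile denied thermostat control and limited to certain television hours) might be represented as a small rule set; the rule format here is an assumption for the sketch, not from the disclosure.

```python
# Hypothetical access settings keyed by user profile.
ACCESS_SETTINGS = {
    "child": {
        "denied_commands": {"thermostat", "front_door_lock"},
        "television_hours": (8, 20),  # television allowed 08:00-20:00
    }
}

def is_permitted(profile, device, hour):
    """Return True if the profile may control the device at this hour."""
    rules = ACCESS_SETTINGS.get(profile, {})
    if device in rules.get("denied_commands", set()):
        return False
    if device == "television":
        start, end = rules.get("television_hours", (0, 24))
        return start <= hour < end
    return True
```

A profile with no stored rules (e.g. a parent) falls through to a permissive default here; a real system might instead deny by default.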
- Control phrases 518 may be received during the initial setup, downloaded, and/or otherwise pushed to the voice command engine 370 upon installation of a device to the home automation system 200 and/or from the home automation service server 112 as shown in FIG. 1 .
- Each control phrase may include a word and/or a string of words that may indicate operational settings and/or changes thereof, e.g. “turn on,” “lower heat,” “degrees” and so on.
- the control phrases are associated with a device identity, and/or any other type of module shown in FIG. 5 .
- the control phrases 518 include functions based on the device(s) connected to the home automation system 200 .
- the control phrases 518 may include user-configured commands.
- control phrases 518 may include a master library of permissible commands that may be altered and built without an internet connection, e.g. pushed via a television distribution system and/or satellite.
- Further examples of control phrases may include, “show me,” e.g. for a voice command “Show me the front door;” “record”, e.g. for a voice command “Record channel 2 from 8 pm to 10 pm;” “call,” e.g. for a voice command “Call 911.”
- further processes, e.g. restriction-related authorizations, may be implemented by the voice command engine 370 prior to transmitting an operational signal to a device, e.g. telephone.
- control phrases may be detected by a microphone and processed using speech recognition by the voice command engine 370 , and subsequently added to the database 502 .
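The control-phrase lookup could be sketched as a longest-phrase-first scan of the recognized utterance, so that a multi-word phrase wins over a shorter one it contains. The phrase-to-function mapping below is an illustrative assumption.

```python
# Hypothetical controls-database entries: phrase -> device function.
CONTROL_PHRASES = {
    "turn on": "power_on",
    "lower heat": "heat_down",
    "show me": "display_camera",
    "record": "start_recording",
}

def match_control_phrase(utterance):
    """Return the function for the first (longest) known phrase found."""
    text = utterance.lower()
    # Check longer phrases first so "lower heat" beats a bare "lower".
    for phrase in sorted(CONTROL_PHRASES, key=len, reverse=True):
        if phrase in text:
            return CONTROL_PHRASES[phrase]
    return None
```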
- a control phrase may simply include a device identity or nickname. For instance, if a device operationally toggles between two settings, e.g. on/off, the control phrase for operation of the device may be the device name itself. Upon detection of the device name as the control phrase, the voice command engine 370 may transmit signals to the device to toggle between two or more functions. Merely by way of example, instead of speaking “Turn on the living room lights,” a speaker may simply state the device itself, “Living room lights.” The voice command engine 370 may transmit a signal to the living room lights to turn the lights on, or off if the lights are already determined to be in an on state by the voice command engine 370 .
- the voice command engine 370 may first detect additional conditions via other devices in the home automation system 200 that may further facilitate which control setting to transmit to the device. For instance, the voice command engine 370 may detect that a user has entered a location, e.g. the living room, and/or determine a state of the device, e.g. that the living room lights are off. In that case, the voice command engine 370 detects the location of the user and a condition of the identified device, and generates and/or transmits an operational signal to render a second possible condition at the device.
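The name-only toggle described above, where speaking just "Living room lights" flips the device between its two states using the engine's record of the current state, can be sketched as follows; the state store is an assumption for illustration.

```python
# Hypothetical record of device states kept by the engine.
device_states = {"living room lights": "off"}

def toggle_by_name(device_name):
    """Flip a two-state device named by the spoken phrase and
    return the new state the operational signal would carry."""
    current = device_states[device_name]
    new_state = "on" if current == "off" else "off"
    device_states[device_name] = new_state
    return new_state
```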
- Device identities 520 may include one or more device names or nicknames for devices in the home automation system 200 that are controllable via the voice command engine 370 .
- the device identity may be user-configured, via detection by a microphone and subsequent speech recognition analysis by the voice command engine 370 , and/or by a textual input from the user.
- the device identity may be linked to other data, such as voice samples, code words, and/or access settings, such that the voice command engine 370 may use the database 502 to look up which device should be implemented based on any other received data.
- the voice command engine 370 may prompt the speaker to set up the device for voice recognition controls.
- the voice command engine 370 may notify the speaker that no device is available for operation in the home automation system 200 .
- the voice command engine 370 may include a speech recognition analyzer 522 .
- the speech recognition analyzer 522 may determine one or more words or phrases in the received voice command that may be related to a function of a device, and determine the device to be controlled based on the function revealed in the voice command.
- the speech recognition analyzer 522 may further be utilized in identifying one or more devices in the home automation system 200 that are intended to be controlled by the received command. If detected speech is not recognized, the voice command engine 370 may prompt the speaker to repeat spoken voice command(s) and/or other responses, and/or enter the command via another medium, e.g. by textual input.
- functions of the speech recognition analyzer 522 may be updated via a television distribution system and/or satellite system.
- the speech recognition analyzer 522 may be a multilingual platform, whereby a user may select one or more languages to implement for speech recognition.
- the voice command engine 370 may include a voice recognition analyzer 524 to perform voice recognition analysis as described above.
- the voice recognition analyzer 524 may also include multilingual functions, and include features that are updated via a television distribution system and/or satellite system. It is contemplated that the voice recognition analyzer 524 , and/or the speech recognition analyzer 522 , may be trained during initial setup and/or user profile setup via the voice command engine 370 .
- the voice command engine 370 may include a permissions status analyzer 528 to determine if a voice command is permissible and if an operational signal based on the voice command should be generated and/or transmitted to the intended device.
- the permissions status analyzer 528 may utilize the determined speaker identity, device identity, access settings, code words, and other determined data for the voice command to determine if an operational signal should be generated.
- the permissions status analyzer 528 may determine if additional prompts and information should be provided prior to transmission of such operational signals. Further, the permissions status analyzer 528 may determine the access granted or access denied status, as described previously.
- the voice command engine 370 may include a home automation systems interface 526 .
- the home automation systems interface 526 may ensure that communications between the voice command engine 370 and the various devices having different protocols in the home automation system 200 are seamlessly integrated.
- the home automation systems interface 526 may implement device-specific communications protocols to ensure that signals transmitted to the devices from the engine 370 , and/or received by the engine 370 , comply with one another.
- the voice command engine 370 may detect a channel change at a remote control and prompt the user with a request to “Identify yourself,” prior to sending an operational signal for changing to a certain channel.
- the voice command engine 370 may provide an intermediary control between a device and its dedicated controller.
- the home automation systems interface 526 may permit the voice command engine 370 to signal to a television to lower or mute a volume level while audio output from the voice command engine 370 is underway, for example, through an intercom system. After a voice command sequence is completed, the home automation systems interface 526 may restore volume levels to previous settings. Other examples are possible.
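The mute-then-restore behavior just described can be sketched as a small helper that saves and restores the television volume around an audio announcement; the class and state representation are assumptions for the sketch.

```python
class VolumeDucker:
    """Mute a TV while the engine speaks, then restore the old volume."""

    def __init__(self, tv):
        self.tv = tv          # e.g. {"volume": 30}
        self.saved = None

    def begin_speech(self):
        self.saved = self.tv["volume"]
        self.tv["volume"] = 0  # mute (or lower) during engine audio output

    def end_speech(self):
        if self.saved is not None:
            self.tv["volume"] = self.saved  # resume the previous setting
            self.saved = None
```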
- the voice command engine 370 may include a microphone interface 530 to receive voice input detected by one or more microphones that may be scattered about the home automation system 200 .
- the microphone interface 530 may be configured to encode and/or decode any signals operatively communicated with the microphones.
- a microphone may be located at a remote control having one or more features of the voice command engine 370 .
- the remote control may process and analyze the voice command to decrease data processing requirements at a television receiver, which may provide additional features of the voice command engine 370 .
- the microphone interface 530 may be in operative communication with a microphone located in each room of a house.
- the voice command engine 370 may further provide a conversations module 532 .
- the conversations module 532 may be responsible for disambiguation processes, including carrying conversations and/or additional queries directed to receiving additional information related to a received voice command.
- the voice command engine 370 may receive a voice command for “Close the windows in the living room” and the conversations module 532 may instruct the engine 370 to further inquire, “All windows?” for clarification.
- the conversations module 532 may be utilized to interact with the speaker for further information at any point when such information is needed.
- the conversations module 532 , and any other modules shown herein, may be in sync and operatively connected with any other modules of the voice command engine 370 . It is noted that the conversations module 532 , and/or any other modules shown herein, may be multilingual to facilitate multilingual operations of the voice command engine 370 . Other examples are possible.
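The disambiguation behavior of the conversations module 532, e.g. answering "Close the windows in the living room" with "All windows?", might be sketched as follows; the matching rule and return format are assumptions for illustration.

```python
def disambiguate(target, devices):
    """Return ('act', device) for a unique match, or ('ask', question)
    when the command is ambiguous or matches nothing."""
    matches = [d for d in devices if target in d]
    if len(matches) > 1:
        return ("ask", f"All {target}s?")   # clarifying query to the speaker
    if matches:
        return ("act", matches[0])
    return ("ask", f"Which {target}?")
```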
- the voice command engine 370 may include a notifications module 534 , which may be responsible for various audio, textual, or other notifications output by the voice command engine 370 . Such notifications may relay when a trigger is detected, such as when an access is denied, when a new status change based on the voice command has been successfully implemented at the device, and so on.
- mass notifications may be transmitted from the voice command engine 370 to a plurality of recipients and their various devices.
- the voice command engine may notify, via one or more communications networks, a plurality of mobile devices based on detection of an unauthorized voice command or other trigger.
- the notifications module 534 may initiate communications with law enforcement and/or emergency responders, directly and/or via other devices in the home automation system 200 . Other examples are possible.
- Referring to FIG. 6 , another method 600 for controlling home automation systems with speaker-dependent commands is shown.
- the method 600 may include the steps shown in any order and any additional steps. Further, any steps may be optional. It is contemplated that the method 600 is provided for by the voice command engine 370 of FIG. 5 .
- the method 600 may include receiving a code word (step 602 ), which may include a user-configured or predefined code word.
- the voice command engine 370 may receive a voice command directed to a device in the home automation system (step 604 ).
- the voice command engine 370 may prompt the speaker for additional input, such as additional instructions to clarify the voice command (step 606 ).
- the voice command engine 370 may determine one or more device(s) to control, and/or a speaker identity (step 608 ). It is contemplated that with some voice commands, the speaker identity may not be required in order for the voice command engine 370 to implement an intended control at an intended device. Such special settings may be defined by the user during setup.
- the method 600 may include determining if control according to the received voice command is permitted (step 610 ).
- if control is not permitted, e.g. the access is denied based on the permissions status analyzer 528 , the voice command engine 370 may maintain a current state of the intended device (step 618 ) by not generating or otherwise transmitting an operational signal to the intended device.
- the voice command engine 370 may further output notification of the denied command.
- notifications may include audio, visual, and/or textual notification to the speaker via various devices in the home automation system 200 .
- the voice command engine 370 may determine that a voice command is permitted for altering a state of the intended device. In that case, the voice command engine 370 may generate a protocol-specific operational command (step 612 ) according to requirements of the intended device, and transmit the operational command to the intended device via the communications protocol. In further examples, the voice command engine 370 may output notification relaying the new state of the device (step 616 ). Merely by way of example, after transmitting the operational command to the device, the voice command engine 370 may provide a follow-up query to the device for an update in order to determine if the changed operation has been implemented.
- the voice command engine may provide one or more notifications of the change to the user, and/or to other devices in the home automation system 200 .
- notifications may include audio, visual, and/or textual notifications to the speaker via various devices in the home automation system 200 .
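The overall control flow of method 600 (steps 602-618) can be sketched as below. Only the branching follows FIG. 6; every helper is a caller-supplied stub, and all names are assumptions rather than disclosed interfaces.

```python
def handle_voice_command(command, identify, check_permission,
                         generate_signal, transmit, notify):
    """Flow of method 600: identify, check permission, then act or hold."""
    speaker, device = identify(command)                  # step 608
    if not check_permission(speaker, device, command):   # step 610
        notify("access denied")                          # denied-command notification
        return "state_maintained"                        # step 618
    signal = generate_signal(device, command)            # step 612: protocol-specific
    transmit(device, signal)                             # send to the intended device
    notify("new state set")                              # step 616
    return "command_sent"
```

Steps 602-606 (code word, voice command, and clarifying prompts) would precede this call and produce the `command` input.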
- A computer system as illustrated in FIG. 7 may be incorporated as part of the previously described computerized devices, such as the wireless devices, television receivers, overlay devices, communication devices, any of the home automation devices, the television service provider system, the voice command engine 370 , etc.
- FIG. 7 provides a schematic illustration of one embodiment of a computer system 700 that can perform various steps of the methods provided by various embodiments. It should be noted that FIG. 7 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 7 , therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
- the computer system 700 is shown comprising hardware elements that can be electrically coupled via a bus 705 (or may otherwise be in communication, as appropriate).
- the hardware elements may include one or more processors 710 , including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like); one or more input devices 715 , which can include without limitation a mouse, a keyboard, remote control, and/or the like; and one or more output devices 720 , which can include without limitation a display device, a printer, and/or the like.
- the computer system 700 may further include (and/or be in communication with) one or more non-transitory storage devices 725 , which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like.
- Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
- the computer system 700 might also include a communications subsystem 730 , which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication device, etc.), and/or the like.
- the communications subsystem 730 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein.
- the computer system 700 will further comprise a working memory 735 , which can include a RAM or ROM device, as described above.
- the computer system 700 also can comprise software elements, shown as being currently located within the working memory 735 , including an operating system 740 , device drivers, executable libraries, and/or other code, such as one or more application programs 745 , which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
- code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
- a set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the non-transitory storage device(s) 725 described above.
- the storage medium might be incorporated within a computer system, such as computer system 700 .
- the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
- These instructions might take the form of executable code, which is executable by the computer system 700 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 700 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
- some embodiments may employ a computer system (such as the computer system 700 ) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 700 in response to processor 710 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 740 and/or other code, such as an application program 745 ) contained in the working memory 735 . Such instructions may be read into the working memory 735 from another computer-readable medium, such as one or more of the non-transitory storage device(s) 725 . Merely by way of example, execution of the sequences of instructions contained in the working memory 735 might cause the processor(s) 710 to perform one or more procedures of the methods described herein.
- machine-readable medium refers to any medium that participates in providing data that causes a machine to operate in a specific fashion. These mediums may be non-transitory.
- various computer-readable media might be involved in providing instructions/code to processor(s) 710 for execution and/or might be used to store and/or carry such instructions/code.
- a computer-readable medium is a physical and/or tangible storage medium.
- Such a medium may take the form of non-volatile media or volatile media.
- Non-volatile media include, for example, optical and/or magnetic disks, such as the non-transitory storage device(s) 725 .
- Volatile media include, without limitation, dynamic memory, such as the working memory 735 .
- Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, any other physical medium with patterns of marks, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 710 for execution.
- the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
- a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 700 .
- the communications subsystem 730 (and/or components thereof) generally will receive signals, and the bus 705 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 735 , from which the processor(s) 710 retrieves and executes the instructions.
- the instructions received by the working memory 735 may optionally be stored on a non-transitory storage device 725 either before or after execution by the processor(s) 710 .
- computer system 700 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 700 may be similarly distributed. As such, computer system 700 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 700 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context.
- configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
- examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Abstract
Description
- This application claims priority to U.S. Provisional Patent Application No. 61/914,856, filed on Dec. 11, 2013, and entitled, “METHODS AND SYSTEMS FOR HOME AUTOMATION,” the entire contents of which are incorporated herein by reference.
- Home automation systems are becoming increasingly prevalent. Such systems may incorporate a variety of electronic devices, such as “smart” electronics that allow end-users to control and/or view status information of those devices, and other traditional electronics. With the growing popularity of home automation systems, there is a need for a controls infrastructure that can operate with the variety of different electronic devices, and at the same time, be user-friendly, secure, and simple to implement. This application is intended to address such needs and to provide related advantages.
- In general, this application is directed to home automation systems, and more specifically, to controlling home automation systems with speaker-dependent commands.
- In one aspect of the present disclosure, a method for controlling a device in a home automation system based on a speaker-dependent command is provided. The method may include receiving, by a television receiver, a voice command for controlling the device connected to the home automation system. The method may include performing, by the television receiver, a voice recognition analysis to determine a speaker identity of the received voice command, and/or performing, by the television receiver, a speech recognition analysis to identify the device in the home automation system that is intended to be controlled by the received voice command. Further, the method may include determining, by the television receiver, a permission status to control the identified device, wherein the determined permission status is based on the determined speaker identity and the identified device. The method may include controlling, by the television receiver, the identified device in the home automation system based on the determined status.
- Various embodiments of the method may include one or more of the following features. The method may include detecting, by the television receiver, the voice command at a microphone on the television receiver. The method may include receiving, by the television receiver, the voice command from a remote control in operative communication with the television receiver, wherein the voice command is detected by a microphone on the remote control and is wirelessly relayed from the remote control to the television receiver. In another aspect, the method may include receiving, by the television receiver, the voice command from a home automation device in operative communication with the television receiver, wherein the voice command is detected by a microphone provided on the home automation device and is relayed from the home automation device to the television receiver.
- The method may include, in the step of performing the voice recognition analysis to determine the speaker identity, comparing, by the television receiver, at least a portion of the received voice command to a voice database comprising one or more voice samples, wherein each of the one or more voice samples are associated with one or more speaker identities. The method may include receiving, by the television receiver, one or more voice samples during an initial setup, receiving, by the television receiver, a speaker identity during the initial setup, associating, by the television receiver, at least a portion of the one or more voice samples with the speaker identity, and/or storing, by the television receiver, the associated one or more voice samples and the speaker identity in a voice database.
- The method may further include receiving, by the television receiver, one or more access settings associated with the speaker identity, and/or storing, by the television receiver, the one or more access settings associated with the speaker identity in a controls database. In other aspects, the method may include, in performing the speech recognition analysis to identify the device to be controlled, one or more steps of detecting, by the television receiver, one or more control phrases in the received voice command, and/or comparing, by the television receiver, the one or more control phrases to a controls database comprising a plurality of control phrases. Each of the one or more control phrases may be associated with one or more home automation devices. In further examples, the method may include receiving, by the television receiver, one or more control phrases in a one-time setup, wherein each of the one or more control phrases comprises at least a word or a string of words, associating, by the television receiver, each of the one or more control phrases with a home automation device, and/or storing, by the television receiver, the associated control phrases and home automation devices in a controls database.
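One way to picture the controls database is as a mapping from control phrases to devices; the phrases and device names below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical controls database: control phrase -> home automation device.
controls_db = {
    "lower the heat": "thermostat",
    "turn on the living room lights": "living_room_lights",
    "unlock the front door": "front_door_lock",
}

def devices_for(recognized_text):
    # Detect known control phrases in the recognized speech; a single
    # utterance may match more than one home automation device.
    text = recognized_text.lower()
    return [device for phrase, device in controls_db.items() if phrase in text]
```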
- In some examples, the control phrases may include user-configured control phrases. In another example, the method may include determining, by the television receiver, a plurality of home automation devices to control based on the speech recognition analysis. In still other examples, the method may include determining, by the television receiver, that the permission status is at least one of an access granted status and an access denied status. The method may include transmitting, by the television receiver, an operational command to the identified device based on the access granted permission status. In another example, the method may include generating, by the television receiver, the operational command based on a communication protocol specific to the identified device, and/or transmitting, by the television receiver, the operational command to the identified device through a home automation network. In other examples, the method may include outputting, by the television receiver, a confirmation notification that indicates a new state of the identified device, wherein the new state is based on the voice command.
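The access-granted path might look like the following sketch, where the state table and confirmation message format are assumed for illustration:

```python
# Tracked device states; starting values are illustrative.
device_states = {"living_room_lights": "off"}

def execute_command(device, new_state):
    # With an access-granted permission status: transmit the operational
    # command, update the tracked state, and output a confirmation
    # notification that indicates the device's new state.
    device_states[device] = new_state
    return f"confirmation: {device} is now {new_state}"
```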
- In still other examples, the method may include outputting, by the television receiver, a notification based on the access denied status, and/or maintaining, by the television receiver, a current state of the identified device. The method may include receiving, by the television receiver, a code word, wherein the received code word is detected by a microphone, associating, by the television receiver, the code word with a speaker identity, and/or detecting, by the television receiver, the code word immediately preceding the voice command. In some cases, the method may include, in response to receiving the voice command, outputting, by the television receiver, an additional query for additional information related to the voice command. Other examples are possible.
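The additional-query behavior can be sketched as follows; the phrase table and the (kind, message) return convention are hypothetical:

```python
# Hypothetical mapping of a phrase to the devices it could refer to.
phrase_to_devices = {
    "turn on the lights": ["living_room_lights", "bedroom_lights"],
    "lower the heat": ["thermostat"],
}

def resolve(utterance):
    matches = phrase_to_devices.get(utterance.lower(), [])
    if len(matches) == 1:
        return ("command", matches[0])
    if matches:
        # Ambiguous: output an additional query for additional
        # information rather than guessing which device was meant.
        return ("query", "Which one: " + ", ".join(matches) + "?")
    return ("query", "Which device should be controlled?")
```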
- In another aspect of the present disclosure, a system for controlling a device in a home automation system based on a speaker-dependent command is provided. The system may include one or more processors and/or a memory communicatively coupled with and readable by the one or more processors and having stored therein processor-readable instructions. When executed, the processor-readable instructions may cause the one or more processors to receive, by a television receiver, a voice command for controlling the device connected to the home automation system, perform, by the television receiver, a voice recognition analysis to determine a speaker identity of the received voice command, and/or perform, by the television receiver, a speech recognition analysis to identify the device in the home automation system that is intended to be controlled by the received voice command. The processor-readable instructions may cause the one or more processors to determine, by the television receiver, a permission status to control the identified device, wherein the determined permission status is based on the determined speaker identity and the identified device. In another example, the processor-readable instructions may cause the one or more processors to control, by the television receiver, the identified device in the home automation system based on the determined status. Other examples are possible.
- In yet another aspect of the present disclosure, a computer-readable medium having stored thereon a series of instructions is provided. When executed by a processor, the series of instructions may cause the processor to control a device in a home automation system based on a speaker-dependent command. For example, the series of instructions may include receiving, by a television receiver, a voice command for controlling the device connected to the home automation system, performing, by the television receiver, a voice recognition analysis to determine a speaker identity of the received voice command, and/or performing, by the television receiver, a speech recognition analysis to identify the device in the home automation system that is intended to be controlled by the received voice command. The series of instructions may further include determining, by the television receiver, a permission status to control the identified device, wherein the determined permission status is based on the determined speaker identity and the identified device, and/or controlling, by the television receiver, the identified device in the home automation system based on the determined status. Other examples are possible.
- FIG. 1 shows an embodiment of a television service provider system;
- FIG. 2 shows an embodiment of a home automation system hosted by a television receiver;
- FIG. 3 shows an embodiment of a television receiver configured to host a home automation system;
- FIG. 4 shows an example method for controlling a device in a home automation system based on a speaker-dependent command;
- FIG. 5 shows a block diagram of example modules in a voice command engine for controlling a device in a home automation system based on a speaker-dependent command;
- FIG. 6 shows another example method for controlling a device in a home automation system based on a speaker-dependent command; and
- FIG. 7 shows an embodiment of a computer system upon which various aspects of the present disclosure may be implemented.
- It is noted that any of the elements and/or steps provided in the block diagrams, flow diagrams, method diagrams, and other illustrations of the figures may be optional, may be replaced, and/or may include additional components, such as elements and/or steps combined with or replaced by those from other figures and text provided herein. Various embodiments of the present invention are discussed below, and various combinations or modifications thereof may be contemplated.
- In general, the systems and methods disclosed herein are directed to controlling a device, such as a home automation or “smart” device, of a home automation system based on a speaker-dependent command. For example, a microphone, such as a microphone on a television receiver, a television remote control, and/or on one or more devices in the home automation system, may detect a spoken voice command and transmit the voice command to a television receiver having a voice command recognition system. The television receiver may perform a voice recognition analysis for speaker verification and/or speaker identification. Additionally, the television receiver may perform a speech recognition analysis to determine which device(s) connected to the home automation system should be controlled, e.g. which device(s) the voice command is intended to command. Based on the determined speaker's identity and the determined device(s) to control, the television receiver may allow for certain commands to be completed and/or prohibited in the home automation system.
- Merely by way of example, an adult user identity may be permitted to perform certain functions that a child user identity cannot, e.g., unlocking doors. In another example, a speaker may state a command without naming a particular device, e.g. “Lower the heat by five degrees.” The television receiver may determine, based at least in part on speech recognition, which home automation device is intended, and therefore which device to transmit an operational signal to, e.g. relaying the command to lower the heat by five degrees to a thermostat. In yet another example, the television receiver may respond to a user-defined code word that precedes and/or follows a spoken command. For instance, the television receiver may detect the code word “Sesame” followed by a spoken command such as “Turn on the living room lights.” Such user-defined code words may activate the television receiver to capture and analyze the subsequently spoken home automation command. Other examples are possible. It is contemplated that the present systems and methods provide for a user-friendly, secure and simple controls infrastructure that may be used to operate a variety of different electronic devices in the home automation system, as described in further detail below.
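The code-word activation described above can be sketched in a few lines; the word "sesame" and the parsing are illustrative assumptions:

```python
# User-defined code words that activate command capture.
code_words = {"sesame"}

def activate(utterance):
    # The receiver captures and analyzes the spoken command only when a
    # user-defined code word immediately precedes it.
    word, _, command = utterance.strip().partition(" ")
    if word.lower() in code_words:
        return command
    return None  # no code word detected: the receiver stays idle
```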
-
FIG. 1 illustrates an embodiment of a satellite television distribution system 100. While a home automation system may be incorporated with various types of television receivers, various embodiments may be part of a satellite-based television distribution system. Cable, IP-based, wireless and broadcast focused systems are also possible. Satellite television distribution system 100 may include: television service provider system 110, satellite transmitter equipment 120, satellites 130, satellite dish 140, television receiver 150, home automation service server 112, and display device 160. Alternate embodiments of satellite television distribution system 100 may include fewer or greater numbers of components. While only one satellite dish 140, television receiver 150, and display device 160 (collectively referred to as “user equipment”) are illustrated, it should be understood that multiple (e.g., tens, thousands, millions of) instances and types of user equipment may receive data and television signals from television service provider system 110 via satellites 130. - As shown in
FIG. 1, television service provider system 110 and satellite transmitter equipment 120 may be operated by a television service provider. A television service provider may distribute television channels, on-demand programming, programming information, and/or other content/services to users. Television service provider system 110 may receive feeds of one or more television channels and content from various sources. Such television channels may include multiple television channels that contain at least some of the same content (e.g., network affiliates). To distribute television channels for presentation to users, feeds of the television channels may be relayed to user equipment via multiple television distribution satellites. Each satellite may relay multiple transponder streams. Satellite transmitter equipment 120 may be used to transmit a feed of one or more television channels from television service provider system 110 to one or more satellites 130. While a single television service provider system 110 and satellite transmitter equipment 120 are illustrated as part of satellite television distribution system 100, it should be understood that multiple instances of transmitter equipment may be used, possibly scattered geographically, to communicate with satellites 130. Such multiple instances of satellite transmitting equipment may communicate with the same or with different satellites. Different television channels may be transmitted to satellites 130 from different instances of transmitting equipment. For instance, a different satellite dish of satellite transmitter equipment 120 may be used for communication with satellites in different orbital slots. - Still referring to
FIG. 1, satellites 130 may be configured to receive signals, such as streams of television channels, from one or more satellite uplinks such as satellite transmitter equipment 120. Satellites 130 may relay received signals from satellite transmitter equipment 120 (and/or other satellite transmitter equipment) to multiple instances of user equipment via transponder streams. Different frequencies may be used for uplink signals 170 than for downlink signals 180. Satellites 130 may be in geosynchronous orbit. Each of the transponder streams transmitted by satellites 130 may contain multiple television channels transmitted as packetized data. For example, a single transponder stream may be a serial digital packet stream containing multiple television channels. Therefore, packets for multiple television channels may be interspersed. Further, information used by television receiver 150 for home automation functions may also be relayed to the television receiver via one or more transponder streams. For instance, home automation functions may be requested by and/or pushed to the television receiver 150 from the television service provider system 110. - As shown in
FIG. 1, multiple satellites 130 may be used to relay television channels from television service provider system 110 to satellite dish 140. Different television channels may be carried using different satellites. Different television channels may also be carried using different transponders of the same satellite; thus, such television channels may be transmitted at different frequencies and/or different frequency ranges. As an example, a first and second television channel may be relayed via a first transponder of satellite 130-1. A third, fourth, and fifth television channel may be relayed via a different satellite or a different transponder of the same satellite relaying the transponder stream at a different frequency. A transponder stream transmitted by a particular transponder of a particular satellite may include a finite number of television channels, such as seven. Accordingly, if many television channels are to be made available for viewing and recording, multiple transponder streams may be necessary to transmit all of the television channels to the instances of user equipment. Further, it is contemplated that multiple home automation functions may be transmitted in similar fashion. - Still in reference to
FIG. 1, satellite dish 140 may be a piece of user equipment that is used to receive transponder streams from one or more satellites, such as satellites 130. Satellite dish 140 may be provided to a subscriber for use on a subscription basis to receive television channels and/or home automation functions provided by the television service provider system 110 and/or, specifically, the home automation service server 112 of the provider system 110, satellite transmitter equipment 120, and/or satellites 130. Satellite dish 140, which may include one or more low noise blocks (LNBs), may be configured to receive transponder streams from multiple satellites and/or multiple transponders of the same satellite. Satellite dish 140 may be configured to receive television channels via transponder streams on multiple frequencies. Based on the characteristics of television receiver 150 and/or satellite dish 140, it may only be possible to capture transponder streams from a limited number of transponders concurrently. For example, a tuner of television receiver 150 may only be able to tune to a single transponder stream from a transponder of a single satellite at a given time. The tuner can then be re-tuned to another transponder of the same or a different satellite. A television receiver 150 having multiple tuners may allow for multiple transponder streams to be received at the same time. Merely by way of example, multiple television channels and/or multiple home automation functions may be received concurrently.
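The tuner constraint can be modeled in a few lines; the class and attribute names are assumptions for illustration:

```python
class Tuner:
    def __init__(self):
        self.stream = None

    def tune(self, satellite, transponder):
        # A tuner locks onto one transponder stream at a time;
        # re-tuning replaces whatever stream was previously captured.
        self.stream = (satellite, transponder)

class Receiver:
    def __init__(self, tuner_count):
        self.tuners = [Tuner() for _ in range(tuner_count)]

    def concurrent_streams(self):
        # More tuners allow more transponder streams to be
        # received at the same time.
        return sum(1 for t in self.tuners if t.stream is not None)
```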
FIG. 1 further illustrates one or more television receivers in communication with satellite dish 140. Television receivers may be configured to decode signals received from satellites 130 via satellite dish 140 for output and presentation via a display device, such as display device 160. Similarly, such television receivers may decode signals received for any home automation devices. For instance, a home automation engine 311, as described further below, may decode such signals. A television receiver may be incorporated as part of a television or may be part of a separate device, commonly referred to as a set-top box (STB). Television receiver 150 may decode signals received via satellite dish 140 and provide an output to display device 160. On-demand content, such as PPV content, may be stored to a computer-readable storage medium. FIG. 2 described below provides additional detail of various embodiments of a television receiver. A television receiver is defined to include set-top boxes (STBs) and also circuitry having similar functionality that may be incorporated with another device. For instance, circuitry similar to that of a television receiver may be incorporated as part of a television. As such, while FIG. 1 illustrates an embodiment of television receiver 150 as separate from display device 160, it should be understood that, in other embodiments, similar functions may be performed by a television receiver integrated with display device 160. Television receiver 150 may include the home automation engine 311, as detailed in relation to FIG. 3. - Referring again to
FIG. 1, display device 160 may be used to present video and/or audio decoded and output by television receiver 150. Television receiver 150 may also output a display of one or more interfaces to display device 160, such as an electronic programming guide (EPG). In many embodiments, display device 160 is a television. Display device 160 may also be a monitor, computer, or some other device configured to display video and, possibly, play audio. - As further illustrated in
FIG. 1, uplink signal 170-1 represents a signal between satellite transmitter equipment 120 and satellite 130-1. Uplink signal 170-2 represents a signal between satellite transmitter equipment 120 and satellite 130-2. Each of uplink signals 170 may contain streams of one or more different television channels and/or home automation functions. For example, uplink signal 170-1 may contain a first group of television channels and/or home automation functions, while uplink signal 170-2 contains a second group of television channels and/or home automation functions. Each of these television channels and/or home automation functions may be scrambled such that unauthorized persons are prevented from accessing the television channels. - As shown in
FIG. 1, downlink signal 180-1 represents a signal between satellite 130-1 and satellite dish 140. Downlink signal 180-2 represents a signal between satellite 130-2 and satellite dish 140. Each of the downlink signals 180 may contain one or more different television channels and/or home automation functions, which may be at least partially scrambled. A downlink signal may be in the form of a transponder stream. A single transponder stream may be tuned to at a given time by a tuner of a television receiver. For example, downlink signal 180-1 may be a first transponder stream containing a first group of television channels and/or home automation functions, while downlink signal 180-2 may be a second transponder stream containing a different group of television channels and/or home automation functions. In addition to or instead of containing television channels, a transponder stream can be used to transmit on-demand content to television receivers, including PPV content (which may be stored locally by the television receiver until output for presentation).
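Since a transponder stream interleaves packetized data for several channels, a receiver tuned to one stream effectively filters packets by channel. A rough sketch, with packet fields assumed for illustration:

```python
def demux(transponder_stream, tuned_channel):
    # Packets for multiple television channels are interspersed in one
    # serial stream; keep only those for the tuned channel.
    return [pkt["payload"] for pkt in transponder_stream
            if pkt["channel"] == tuned_channel]

# Illustrative interleaved stream with two channels.
stream = [
    {"channel": 101, "payload": "a1"},
    {"channel": 102, "payload": "b1"},
    {"channel": 101, "payload": "a2"},
]
```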
FIG. 1 further illustrates downlink signal 180-1 and downlink signal 180-2 being received by satellite dish 140 and distributed to television receiver 150. For a first group of television channels and/or home automation functions, satellite dish 140 may receive downlink signal 180-1, and for a second group of channels and/or home automation functions, downlink signal 180-2 may be received. Television receiver 150 may decode the received transponder streams. As such, depending on which television channels and/or home automation functions are desired to be presented or stored, various transponder streams from various satellites may be received, descrambled, and decoded by television receiver 150. - Further shown in
FIG. 1, network 190, which may include the Internet, may allow for bidirectional communication between television receiver 150 and television service provider system 110, such as for home automation related services provided by home automation service server 112. In addition or in alternate to network 190, a telephone (e.g., landline) or cellular connection may be used to enable communication between television receiver 150 and television service provider system 110. - Turning now to
FIG. 2, an embodiment of a home automation system 200 hosted by a television receiver is illustrated. Television receiver 150 may represent the television receiver of FIG. 1. While television receiver 150 may be configured to receive television programming from a satellite-based television service provider, it should be understood that in other embodiments, other forms of television service provider networks may be used, such as an IP-based network (e.g., fiber network), a cable based network, a wireless broadcast-based network, etc. - As shown in
FIG. 2, television receiver 150 may be configured to communicate with multiple in-home home automation devices. The devices with which television receiver 150 communicates may use different communication standards or protocols. For instance, one or more devices may use a ZigBee® communication protocol while one or more other devices communicate with the television receiver using a Z-Wave® communication protocol. Other forms of wireless communication may be used by devices and the television receiver 150. For instance, television receiver 150 and one or more devices may be configured to communicate using a wireless local area network, which may use a communication protocol such as 802.11. - Referring to
FIG. 2, in some embodiments, a separate device may be connected with television receiver 150 to enable communication with home automation devices. For instance, communication device 252 may be attached to television receiver 150. Communication device 252 may be in the form of a dongle. Communication device 252 may be configured to allow for ZigBee®, Z-Wave®, and/or other forms of wireless communication. The communication device may connect with television receiver 150 via a USB port or via some other type of (wired) communication port. Communication device 252 may be powered by the television receiver 150 or may be separately coupled with a power source. In some embodiments, television receiver 150 may be enabled to communicate with a local wireless network and may use communication device 252 in order to communicate with devices that use a ZigBee® communication protocol, Z-Wave® communication protocol, and/or some other home wireless communication protocols. - Still referring to
FIG. 2, communication device 252 may also serve to allow additional components to be connected with television receiver 150. For instance, communication device 252 may include additional audio/video inputs (e.g., HDMI), component, and/or composite inputs to allow for additional devices (e.g., Blu-ray players) to be connected with television receiver 150. Such connection may allow video from such additional devices to be overlaid with home automation information. Merely by way of example, whether home automation information is overlaid onto video may be triggered based on a user's press of a remote control button. - As shown in
FIG. 2, regardless of whether television receiver 150 uses communication device 252 to communicate with home automation devices, television receiver 150 may be configured to output home automation information for presentation to a user via display device 160. Such information may be presented simultaneously with television programming received by television receiver 150, such as via system 100 of FIG. 1 described above. Television receiver 150 may also, at a given time, output only television programming or only home automation information based on a user's preference. The user may be able to provide input to television receiver 150 to control the home automation system hosted by television receiver 150 or by overlay device 251, as detailed below. - Still referring to
FIG. 2, in some embodiments, television receiver 150 may not be used as a host for a home automation system. Rather, a separate device may be coupled with television receiver 150 that allows for home automation information to be presented to a user via display device 160. This separate device may be coupled with television receiver 150. In some embodiments, the separate device is referred to as overlay device 251. Overlay device 251 may be configured to overlay information, such as home automation information, onto a signal to be visually presented via display device 160, such as a television. In some embodiments, overlay device 251 may be coupled between television receiver 150, which may be in the form of a set top box, and display device 160, which may be a television. In such embodiments, television receiver 150 may receive, decode, descramble, decrypt, store, and/or output television programming and/or home automation functions. Television receiver 150 may output a signal, such as in the form of an HDMI signal. Rather than be directly input to display device 160, the output of television receiver 150 may be input to overlay device 251. Overlay device 251 may receive the video and/or audio output from television receiver 150. Overlay device 251 may add additional information to the video, audio and/or home automation function signal received from television receiver 150. The modified video and/or audio signal may be output to display device 160 for presentation. In some embodiments, overlay device 251 has an HDMI input and an HDMI output, with the HDMI output being connected to display device 160. To be clear, while FIG. 2 illustrates lines illustrating communication between television receiver 150 and various devices, it should be understood that such communication may exist, in addition or in alternate, via communication device 252 and/or with overlay device 251. - Referring again to
FIG. 2, in some embodiments, television receiver 150 may be used to provide home automation functionality while overlay device 251 may be used to present information via display device 160. It should be understood that the home automation functionality detailed herein in relation to a television receiver may alternatively be provided via overlay device 251. In some embodiments, overlay device 251 may provide home automation functionality and be used to present information via display device 160. Using overlay device 251 to present automation information via display device 160 may have additional benefits. For instance, multiple devices may provide input video to overlay device 251. For instance, television receiver 150 may provide television programming to overlay device 251, a DVD/Blu-Ray player may provide video to overlay device 251, and a separate internet-TV device may stream other programming to overlay device 251. Regardless of the source of the video/audio, overlay device 251 may output video and/or audio that has been modified to include home automation information, such as a pop-up overlay with a prompt message, and output to display device 160. As such, in such embodiments, regardless of the source of video/audio, overlay device 251 may modify the audio/video to include home automation information and, possibly, solicit for user input. For instance, in some embodiments overlay device 251 may have four video inputs (e.g., four HDMI inputs) and a single video output (e.g., an HDMI output). In other embodiments, such overlay functionality may be part of television receiver 150. As such, a separate device, such as a Blu-ray player, may be connected with a video input of television receiver 150, thus allowing television receiver 150 to overlay home automation information when content from the Blu-Ray player is being output to display device 160. - Still referring to
FIG. 2, regardless of whether television receiver 150 is itself configured to provide home automation functionality and output home automation input for display via display device 160 or such home automation functionality is provided via overlay device 251, home automation information may be presented by display device 160 while television programming is also being presented by display device 160. For instance, home automation information may be overlaid or may replace a portion of television programming (e.g., broadcast content, stored content, on-demand content, etc.) presented via display device 160. Merely by way of example, while television programming is being presented (e.g., a television show on scuba diving), the display is augmented with information related to home automation. This television show may represent broadcast programming, recorded content, on-demand content, or some other form of content. In one example, the presented home automation information is related to motion being detected by a camera at a front door of a location. Such augmentation of the television programming may be performed directly by television receiver 150 (which may or may not be in communication with communication device 252) or overlay device 251 connected with television receiver 150 and display device 160. Such augmentation may result in solid or partially transparent graphics being overlaid onto television programming (or other forms of video) output by television receiver 150. Overlay device 251 or television receiver 150 may be configured to add or modify sound to television programming. For instance, in response to a doorbell ring, a sound may be played through the display device (or connected audio system). In addition or in alternate, a graphic may be displayed. In other embodiments, camera data (e.g., nanny camera data) and/or associated sound or motion sensors may be integrated in the system and overlaid or otherwise made available to a user.
For example, detection of a crying baby from a nanny camera may trigger an on-screen alert to a user watching television. - Still in reference to
FIG. 2, such presented home automation information may request user input. For instance, a user, via controls of television receiver 150 (e.g., a remote control) or controls of overlay device 251, can specify whether video from a camera at the front door should be presented, not presented, or if future notifications related to such motion should be ignored. If ignored, this may be for a predefined period of time, such as an hour, or until the television receiver 150 or overlay device 251 is powered down and powered back on. Ignoring of video may be particularly useful if motion or some other event is triggering the presentation of video that is not interesting to a viewer of display device 160 (or a wireless device), such as children playing on the lawn or snow falling. - As shown in
FIG. 2, television receiver 150 or overlay device 251 may be configured to communicate with one or more wireless devices, such as wireless device 216. Wireless device 216 may represent a tablet computer, cellular phone, laptop computer, remote computer, or some other device through which a user may desire to control home automation settings and view home automation information. Such a device also need not be wireless, such as a desktop computer. Television receiver 150, communication device 252, or overlay device 251 may communicate directly with wireless device 216, or may use a local wireless network, such as network 270. Wireless device 216 may be remotely located and not connected with a same local wireless network. Via the internet, television receiver 150 or overlay device 251 may be configured to transmit a notification to wireless device 216 regarding home automation information. For instance, in some embodiments, a third-party notification server system, such as the notification server system operated by Apple®, may be used to send such notifications to wireless device 216. - Further shown in
FIG. 2, in some embodiments, a location of wireless device 216 may be monitored. For instance, if wireless device 216 is a cellular phone, when its position indicates it has neared a door, the door may be unlocked. A user may be able to define which home automation functions are controlled based on a position of wireless device 216. Other functions could include opening and/or closing a garage door, adjusting temperature settings, turning lights on and/or off, opening and/or closing shades, etc. Such location-based control may also take into account the detection of motion via one or more motion sensors that are integrated into other home automation devices and/or stand-alone motion sensors in communication with television receiver 150. - Still referring to
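The location-based door control described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the unlock radius, function names, and the haversine-based distance check are assumptions introduced for the example.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in meters (Earth radius ~6371 km).
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

UNLOCK_RADIUS_M = 20.0  # hypothetical threshold for "neared a door"

def should_unlock_door(door_lat, door_lon, phone_lat, phone_lon, user_enabled=True):
    """Unlock only if the user has enabled location-based control for this
    function and the phone's reported position is within the radius."""
    return user_enabled and distance_m(door_lat, door_lon, phone_lat, phone_lon) <= UNLOCK_RADIUS_M
```

In practice the hub would also consult motion-sensor data, as the paragraph notes, before acting on position alone.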
FIG. 2, in some embodiments, little to no setup of network 270 may be necessary to permit television receiver 150 to stream data out to the Internet. For instance, television receiver 150 and network 270 may be configured, via a service such as Sling® or another video streaming service, to allow video to be streamed from television receiver 150 to devices accessible via the Internet. Such streaming capabilities may be “piggybacked” to allow home automation data to be streamed to devices accessible via the Internet. For example, U.S. patent application Ser. No. 12/645,870, filed on Dec. 23, 2009, entitled “Systems and Methods for Remotely Controlling a Media Server via a Network”, which is hereby incorporated by reference, describes one such system for allowing remote access and control of a local device. U.S. Pat. No. 8,171,148, filed Apr. 17, 2009, entitled “Systems and Methods for Establishing Connections Between Devices Communicating Over a Network”, which is hereby incorporated by reference, describes a system for establishing connections between devices over a network. U.S. patent application Ser. No. 12/619,192, filed May 19, 2011, entitled “Systems and Methods for Delivering Messages Over a Network”, which is hereby incorporated by reference, describes a message server that provides messages to clients located behind a firewall. - Still referring to
FIG. 2, as an example of how wireless device 216 may be used in conjunction with television receiver 150 or overlay device 251 for controlling a home automation system, wireless device 216 may be in communication with television receiver 150 serving as the host of a home automation system. At approximately the same time that the home automation information is presented via display device 160 (assuming it is turned on), similar information may be sent to wireless device 216, such as via a third-party notification server or directly from television receiver 150 or overlay device 251 via a local wireless network. A user of wireless device 216 can specify whether video from a camera at the front door should be presented by wireless device 216, not presented, or whether future notifications related to such motion should be ignored. If ignored, this may be for a predefined period of time, such as an hour or some other predefined or user-selected period of time. In this way, a user interface of the wireless device 216 may correspond to an overlay of the home automation information and/or prompt appearing on the display device 160. - Referring again to
FIG. 2, wireless device 216 may serve as an input device for television receiver 150. For instance, wireless device 216 may be a tablet computer that allows text to be typed by a user and provided to television receiver 150. Such an arrangement may be useful for text messaging, group chat sessions, or any other form of text-based communication. Other types of input may be received for the television receiver from a tablet computer or other device as shown in the attached screenshots, such as lighting commands, security alarm settings, and door lock commands. While wireless device 216 may be used as the input device for typing text, television receiver 150 may output the text for display to display device 160. - Still referring to
FIG. 2, wireless device 216 may be configured to store a software model of the home automation system intended to mirror the software model stored by television receiver 150, which is hosting the home automation system. For instance, such a software model may allow wireless device 216 to view, communicate with, and/or interact with various home automation devices. Such a software model may indicate the state of various home automation devices. When wireless device 216 is not in communication with television receiver 150, changes to the home automation model made at television receiver 150 may not be known to wireless device 216. A history list maintained by television receiver 150 and/or a synchronization point numerical value, whereby each change to the home automation model by television receiver 150 is assigned a value and synchronized at a later point with the wireless device 216, may be implemented. In another aspect, the wireless device 216 may be utilized by a user for entering and/or confirming rules and other settings of the home automation system, and such settings may be synchronized or otherwise communicated with the television receiver 150. - Further shown in
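The history-list and synchronization-point idea can be sketched as follows: the receiver assigns an increasing sequence number to every model change, and a reconnecting device requests only the changes it has not yet seen. The class and method names are invented for illustration; the patent does not specify a concrete data structure.

```python
class ReceiverModel:
    """Host-side model: authoritative state plus a numbered history list."""
    def __init__(self):
        self.state = {}       # device name -> current state
        self.history = []     # list of (seq, device, state)
        self.seq = 0

    def apply_change(self, device, state):
        self.seq += 1
        self.state[device] = state
        self.history.append((self.seq, device, state))

    def changes_since(self, sync_point):
        # Everything the mirror has not yet replayed.
        return [c for c in self.history if c[0] > sync_point]

class MirrorModel:
    """Wireless-device-side mirror that catches up on reconnect."""
    def __init__(self):
        self.state = {}
        self.sync_point = 0

    def synchronize(self, receiver):
        for seq, device, state in receiver.changes_since(self.sync_point):
            self.state[device] = state
            self.sync_point = seq

receiver = ReceiverModel()
receiver.apply_change("front_door_lock", "locked")
receiver.apply_change("thermostat", 68)

mirror = MirrorModel()
mirror.synchronize(receiver)             # replays both changes
receiver.apply_change("thermostat", 65)  # change made while "offline"
mirror.synchronize(receiver)             # replays only the newest change
```

Only changes newer than the stored sync point cross the link, which is the point of keeping the numerical value on the mirror.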
FIG. 2, in some embodiments, a cellular modem 253 may be connected with either overlay device 251 or television receiver 150. Cellular modem 253 may be useful if a local wireless network is not available. For instance, cellular modem 253 may permit access to the Internet and/or communication with a television service provider. Communication with a television service provider, such as television service provider system 110 of FIG. 1, may also occur via a local wireless or wired network connected with the Internet. In some embodiments, information for home automation purposes may be transmitted by television service provider system 110 to television receiver 150 or overlay device 251 via the television service provider's distribution network, which may include the use of satellites 130. - As shown in
FIG. 2, various home automation devices may be in communication with television receiver 150 or overlay device 251. Such home automation devices may use disparate communication protocols. Such home automation devices may communicate with television receiver 150 directly or via communication device 252. Such home automation devices may be controlled by a user and/or have a status viewed by a user via display device 160 and/or wireless device 216. Such home automation devices may include one or more of the following, as discussed below. - As shown in
FIG. 2, one or more cameras, such as camera 212, may be present. Camera 212 may be either indoors or outdoors and may provide a video and, possibly, audio stream which can be presented via wireless device 216 and/or display device 160. Video and/or audio from camera 212 may be recorded by overlay device 251 or television receiver 150 upon an event occurring, such as motion being detected by camera 212. Video and/or audio from camera 212 may be continuously recorded, such as in the form of a rolling window, thus allowing a period of video/audio to be reviewed by a user from before a triggering event and after the triggering event. Video may be recorded on storage local to overlay device 251 or television receiver 150, or may be recorded and/or stored on external storage devices, such as a network attached storage device. In some embodiments, video may be transmitted across the local and/or wide area network to other storage devices upon occurrence of a trigger event for later playback. For initial setup, a still from camera 212 may be captured and stored by television receiver 150 for subsequent presentation as part of a user interface via display device 160 such that the user can determine which camera (if multiple cameras are present) is being set up and/or later accessed. For example, a user interface may display a still image from a front door camera (which is easily recognized by the user because it shows a scene in front of the house's front door) to allow a user to select the front door camera for viewing. - For instance, as shown in
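The rolling-window recording described above is essentially a ring buffer that is frozen into a clip when a trigger fires. A minimal sketch, with invented class and parameter names:

```python
from collections import deque

class RollingRecorder:
    """Keep the most recent pre-trigger frames in a bounded buffer; on a
    trigger, preserve that pre-roll and keep appending post-trigger frames."""
    def __init__(self, fps, pre_seconds):
        self.buffer = deque(maxlen=fps * pre_seconds)  # rolling window
        self.clip = None                               # frozen recording

    def add_frame(self, frame):
        if self.clip is not None:
            self.clip.append(frame)    # post-trigger frames extend the clip
        else:
            self.buffer.append(frame)  # pre-trigger frames roll off the end

    def trigger(self):
        # Motion detected: the pre-roll becomes the start of the clip.
        self.clip = list(self.buffer)

rec = RollingRecorder(fps=2, pre_seconds=3)  # 6-frame pre-roll window
for i in range(10):
    rec.add_frame(i)                          # frames 0-3 roll away
rec.trigger()                                 # clip begins with frames 4-9
rec.add_frame(10)                             # post-trigger frame appended
```

The `deque(maxlen=…)` does the "rolling" automatically: old frames are discarded as new ones arrive, so the reviewer always gets footage from just before the event.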
FIG. 2, video and, possibly, audio from camera 212 may be available live for viewing by a user via overlay device 251 or television receiver 150. Such video may be presented simultaneously with television programming being presented. In some embodiments, video may only be presented if motion is detected by camera 212; otherwise, video from camera 212 may not be presented by the display device presenting television programming. Also, such video (and, possibly, audio) from camera 212 may be recorded by television receiver 150 or overlay device 251. Such video may be recorded based upon a timer configured by a user. For instance, camera 212 may be incorporated into an electronic programming guide (EPG) output for display by television receiver 150. For instance, camera 212 may be presented as a “channel” as part of the EPG along with other television programming channels. A user may be permitted to select the channel associated with camera 212 for presentation via display device 160 (or wireless device 216). The user may also be permitted to set a timer to record the channel of camera 212 for a user-defined period of time on a user-defined date. Such recording may not be constrained by the rolling window associated with a triggering event being detected. For instance, recording camera 212 based on a timer may be useful if a babysitter is going to be watching a child and the parents want to later review the babysitter's behavior in their absence. In some embodiments, video from camera 212 may be backed up to a remote storage device, such as cloud-based storage hosted by home automation service server 112. Other data may also be cached to the cloud, such as configuration settings. Thus, if the television receiver 150 or overlay device 251 malfunctions, a new device may be installed and the configuration data loaded onto the device from the cloud. - Further shown in
FIG. 2, window sensor 210 and door sensor 208 may transmit data to television receiver 150 (possibly via communication device 252) or overlay device 251 that indicates the status of a window or door, respectively. Such status may indicate open or closed. When a status change occurs, the user may be notified as such via wireless device 216 or display device 160. Further, a user may be able to view a status screen to view the status of one or more window sensors and/or one or more door sensors throughout the location. Window sensor 210 and/or door sensor 208 may have integrated glass break sensors to determine if glass has been broken. - Still shown in
FIG. 2, one or more smoke and/or CO2 detectors 209 may be integrated as part of a home automation system. As such, alerts as to whether a fire or CO2 has been detected can be sent to television receiver 150, wireless device 216, and/or emergency first responders. Further, television receiver 150 and/or wireless device 216 may be used to disable false alarms. One or more sensors may be integrated or separate to detect gas leaks, radon, or various other dangerous situations. - Still referring to
FIG. 2, pet door and/or feeder 211 may allow pet-related functionality to be integrated with television receiver 150. For instance, a predefined amount of food may be dispensed at predefined times to a pet. A pet door may be locked and/or unlocked. The pet's weight or presence may trigger the locking or unlocking of the pet door. For instance, a camera located at the pet door may be used to perform image recognition of the pet, or a weight sensor near the door may identify the presence of the pet and unlock the door. A user may also lock/unlock a pet door via television receiver 150 and/or wireless device 216. - Still shown in
FIG. 2, weather sensor 206 may allow television receiver 150 or overlay device 251 to receive, identify, and/or output various forms of environmental data, including temperature, humidity, wind speed, barometric pressure, etc. Television receiver 150 or overlay device 251 may allow for control of one or more shades, such as window, door, and/or skylight shades, within a house. Shade controller 204 may respond to commands from television receiver 150 or overlay device 251 and may provide status updates (e.g., shade up, shade 50% up, shade down, etc.). - As shown in
FIG. 2, in some embodiments, television receiver 150 may receive and notify a user of the status of electrical appliances, such as refrigerators and dishwashers, within the house. The television receiver 150 may be linked to the appliances and present a notification message to the user through whatever device the user is using at the time, such as a tablet computer, mobile phone, or thin client. U.S. patent application Ser. No. 12/700,310, filed Feb. 4, 2010, entitled “Electronic Appliance Status Notification via a Home Entertainment System”, which is hereby incorporated by reference, describes such techniques in further detail. - Also shown in
FIG. 2, utility monitor 202 may serve to provide television receiver 150 or overlay device 251 with utility information, such as electricity usage, gas usage, water usage, wastewater usage, irrigation usage, etc. A user may view a status page or may receive notifications upon predefined events occurring, such as electricity usage exceeding a defined threshold within a month, or current kilowatt usage exceeding a threshold. -
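The threshold-based utility notifications just described reduce to simple rule checks. A hedged sketch; the function name, limits, and alert strings are invented for the example:

```python
def utility_alerts(monthly_kwh, current_kw, monthly_limit=900.0, demand_limit=10.0):
    """Return the notifications a utility monitor might raise for the
    two predefined events described above."""
    alerts = []
    if monthly_kwh > monthly_limit:
        alerts.append("monthly electricity usage exceeded threshold")
    if current_kw > demand_limit:
        alerts.append("current kilowatt usage exceeded threshold")
    return alerts
```

An empty list means no notification is sent and the data simply appears on the status page.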
FIG. 2 further shows a health sensor 214 that may permit a user's vital characteristics, such as heart rate, to be monitored. In some embodiments, additionally or alternatively, health sensor 214 may contain a button or other type of actuator that a user can press to request assistance. As such, health sensor 214 may be mounted to a fixed location, such as bedside, or may be carried by a user, such as on a lanyard. Such a request may trigger a notification to be presented to other users via display device 160 and/or wireless device 216. Additionally, or if the notification is not cleared by another user within a predefined period of time, a notification may be transmitted to emergency first responders to request help. In some embodiments, a home automation service provider may first try contacting the user, such as via phone, to determine whether an emergency is indeed occurring. Such a health sensor 214 may have additional purposes, such as notification of another form of emergency, such as a break-in, fire, flood, theft, disaster, etc. In some examples, the health sensor 214 may receive signals from various cameras, temperature sensors, and other monitoring equipment in connection with the home automation system, analyze such signals, and store or report such signals as necessary. - Still referring to
FIG. 2, in some embodiments, health sensor 214 may be used as a medical alert pendant that can be worn or otherwise carried by a user. It may contain a microphone and/or speaker to allow communication with other users and/or emergency first responders. Television receiver 150 or overlay device 251 may be preprogrammed to contact a particular phone number (e.g., emergency service provider, relative, caregiver, etc.) based on an actuator of health sensor 214 being activated by a user. The user may be placed in contact with a person via the phone number and the microphone and/or speaker of health sensor 214. Camera data may be combined with such alerts in order to give a contacted relative more information regarding the medical situation. For example, health sensor 214, when activated in the family room, may generate a command which is linked with security camera footage from the same room. In some embodiments, health sensor 214 may be able to monitor vitals of a user, such as blood pressure, temperature, heart rate, blood sugar, etc. In some embodiments, an event, such as a fall or exiting a structure, can be detected. Further, parallel notifications may be sent by the health sensor 214 to multiple user devices at approximately the same time. As such, multiple people can be made aware of the event at approximately the same time (as opposed to serial notification). Which users are notified for which type of event may be customized by a user of television receiver 150. - Further in reference to
FIG. 2, in addition to such parallel notifications being based on data from health sensor 214, data from other devices may trigger such parallel notifications according to various rules within the home automation system. For instance, a mailbox opening, a garage door opening, an entry/exit door opening at the wrong time, unauthorized control of specific lights during a vacation period, a water sensor detecting a leak or flow, a room or equipment temperature outside of a defined range, and/or motion detected at the front door are examples of possible events which may trigger parallel notifications. A configuring user may be able to select users to notify from a list provided by the home automation system, along with a method of notification, to enable such parallel notifications. The configuring user may prioritize which systems and people are notified, and specify that the notification may continue through the list unless acknowledged either electronically or by human interaction. For example, the user could specify that they want to be notified of any light switch operation in their home during their vacation. Notification priority could be 1) SMS message, 2) push notification, 3) electronic voice recorder places call to primary number, and 4) electronic voice recorder places call to spouse's number. The second notification may never happen if the user replies to the SMS message with an acknowledgment. Or, the second notification would automatically happen if the SMS gateway cannot be contacted. - Referring again to
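The escalation-through-a-priority-list behavior can be sketched as follows. This is an illustrative model only: the channel names match the SMS/push/voice example above, but the callback structure is an assumption, not the patent's design.

```python
def notify_with_escalation(channels, send, acknowledged):
    """Walk a prioritized channel list; stop at the first channel that both
    delivers (gateway reachable) and is acknowledged by the user.
    send(channel) -> True if the gateway could be contacted;
    acknowledged(channel) -> True if the user replied."""
    for channel in channels:
        if send(channel) and acknowledged(channel):
            return channel   # acknowledged; stop escalating
    return None              # exhausted the list without acknowledgment

priority = ["sms", "push", "voice_call_primary", "voice_call_spouse"]

# Simulated run: the SMS gateway cannot be contacted, so escalation moves
# on automatically; the push notification is acknowledged, so the voice
# calls never happen.
sent = []
def send(ch):
    sent.append(ch)
    return ch != "sms"          # simulate an unreachable SMS gateway
def acknowledged(ch):
    return ch == "push"

result = notify_with_escalation(priority, send, acknowledged)
```

The short-circuit in the loop captures both rules in the paragraph: an acknowledgment halts the list, and a failed gateway advances it.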
FIG. 2, intercom 218 may permit a user in one location to communicate with a user in another location, who may be using wireless device 216, display device 160, or some other device, such as another television receiver within the structure. Intercom 218 may be integrated with camera 212 or may use a dedicated microphone/speaker, such as a Bluetooth® microphone. Microphones/speakers of wireless device 216, display device 160, communication device 252, and overlay device 251 may also or alternatively be used. A multimedia over coax (MoCA) network or other appropriate type of network may be used to provide audio and/or video based intercom via television receiver 150 with other television receivers and/or wireless devices in communication with television receiver 150. Similarly, video and/or audio conferencing can be provided, such that communication with persons via the Internet is possible. Therefore, one possible use would be video and/or audio conferencing within a structure using each television receiver (and associated connected display devices) in the structure that are in communication, or allowing each television receiver to perform video/audio conferencing with other devices external to the structure or local area network. - Referring to
FIG. 2, to enable intercom 218, a microphone may be placed in a location where a user would typically be using intercom 218. For instance, a microphone may be placed near display device 160. In some embodiments, a microphone may be integrated into a remote control of television receiver 150. As such, if a user is using television receiver 150 via remote control, the user would have access to a microphone. In at least one embodiment, a user can leverage the wireless device 216, such as a mobile phone or tablet computer, as the microphone for the home automation system. - Referring again to
FIG. 2, doorbell sensor 223 may permit an indication of when a doorbell has been rung to be sent to multiple devices, such as television receiver 150 and/or wireless device 216. In some embodiments, doorbell sensor 223 detecting a doorbell ring may trigger video to be recorded by camera 212 of the area near the doorbell and the video to be stored until deleted by a user (or stored for a predefined period of time). - Further, as shown in
FIG. 2, such a microphone, or a microphone on one or more other home automation devices, may allow voice recognition to be performed by television receiver 150. Voice recognition may allow a particular user to be determined and commands to be completed based on a user speaking such commands. For instance, an adult user may be permitted to perform certain functions that a child user cannot, such as unlocking doors. Each user may provide a voice sample which is used by television receiver 150 to distinguish users from each other. Further, users may be able to speak commands, such as “lower heat 5 degrees,” to control home automation devices. Based on the command received, television receiver 150 may determine to which home automation device the command is directed and may transmit an appropriate command (such as, in this example, a command to lower the heat setting by five degrees to thermostat 222). In at least one embodiment, a user may use a user-defined code word that precedes or follows a command, such as speaking “sesame,” then speaking a command such as “turn on the living room lights.” In some embodiments, in addition or as an alternative to voice identification, fingerprint identification may be used to determine an identity of a user. Specific functions of television receiver 150 may require that a user log in, such as via a fingerprint scanner, before being able to view and/or modify such functions. - Referring to
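The code-word, command-parsing, and speaker-permission steps just described can be sketched together. The grammar, permission table, and return strings here are invented for illustration; the patent does not specify a command syntax.

```python
import re

CODE_WORD = "sesame"                 # user-defined code word (example)
ADULT_ONLY = {"unlock_door"}         # hypothetical permission table

def parse_command(utterance):
    """Map a recognized utterance to (device, action, argument)."""
    m = re.match(r"(lower|raise) heat (\d+) degrees", utterance)
    if m:
        sign = -1 if m.group(1) == "lower" else 1
        return ("thermostat", "adjust", sign * int(m.group(2)))
    if utterance == "unlock the front door":
        return ("front_door_lock", "unlock_door", None)
    return None

def handle(utterance, speaker_role):
    # Require the code word before the command is even considered.
    if not utterance.startswith(CODE_WORD + " "):
        return "ignored"
    cmd = parse_command(utterance[len(CODE_WORD) + 1:])
    if cmd is None:
        return "unrecognized"
    device, action, arg = cmd
    # Speaker-dependent check: children cannot perform adult-only actions.
    if action in ADULT_ONLY and speaker_role != "adult":
        return "denied"
    return f"{device}:{action}:{arg}"    # command routed to the device
```

Speaker identification itself (matching the voice sample) is assumed to have already produced `speaker_role` before this routing step.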
FIG. 2, light controller 220 may permit a light to be turned on, off, and/or dimmed by television receiver 150 or overlay device 251 (such as based on a user command received via wireless device 216 or directly via television receiver 150 or overlay device 251). Light controller 220 may control a single light. As such, multiple different light controllers 220 may be present within a house. In some embodiments, a physical light switch (which opens and closes a circuit of the light) may be left in the on position such that light controller 220 can be used to control whether the light is on or off. Light controller 220 may be integrated into a light bulb or into a circuit (such as between the light fixture and the power source) to control whether the light is on or off. The user, via television receiver 150 or overlay device 251, may be permitted to view a status of all light controllers 220 within a location. Since television receiver 150 or overlay device 251 may communicate using different home automation protocols, different light controllers 220 (and, more generally, different home automation devices) within a location may use disparate communication protocols, but may all still be controlled by television receiver 150 or overlay device 251. In some embodiments, wireless light switches may be used that communicate with television receiver 150 or overlay device 251. Such switches may use a different communication protocol than light controllers 220. Such a difference may not affect functionality because television receiver 150 or overlay device 251 can serve as a hub for multiple disparate communication protocols and perform any necessary translation and/or bridging functions. For example, a tablet computer may transmit a command over a WiFi connection and television receiver 150 or overlay device 251 may translate the command into an appropriate Zigbee or Z-Wave command for a wireless light bulb. In some embodiments, the translation may occur for a group of disparate devices.
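The hub-side translation and fan-out can be sketched as follows. The frame formats, translator functions, and room table are invented placeholders, not actual Zigbee or Z-Wave encodings:

```python
# Per-protocol "translators" producing a device-native off command.
def zigbee_off(addr):
    return f"zigbee:{addr}:OFF"

def zwave_off(node):
    return f"zwave:{node}:0x00"

TRANSLATORS = {"zigbee": zigbee_off, "zwave": zwave_off}

# Hypothetical registry: which devices (and protocols) are in each room.
ROOM_DEVICES = {
    "living_room": [("zigbee", "bulb-01"), ("zwave", "lamp-07")],
}

def lights_off(room):
    """Translate one WiFi-delivered high-level command into per-protocol
    frames for every light in the room."""
    return [TRANSLATORS[proto](addr) for proto, addr in ROOM_DEVICES[room]]

frames = lights_off("living_room")
```

A single tablet command thus fans out into one frame per device, each in that device's own protocol, which is the bridging role the hub plays.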
For example, a user decides to turn off all lights in a room and selects a lighting command on the tablet computer. The overlay device 251 identifies the lights in the room and outputs appropriate commands to all devices over different protocols, such as a Zigbee wireless lightbulb and a Z-Wave table lamp. Television receiver 150 may permit timers and/or dimmer settings to be set for lights via light controller 220. For instance, lights can be configured to turn on/off at various times during a day according to a schedule (and/or events being detected by the home automation system). - Referring again to
FIG. 2, thermostat 222 may communicate with television receiver 150 or overlay device 251. Thermostat 222 may provide heating/cooling updates on the location to television receiver 150 or overlay device 251 for display via display device 160 and/or wireless device 216. Further, control of thermostat 222 may be effectuated via television receiver 150 or overlay device 251. Zone control within a structure using multiple thermostats may also be possible. -
Leak detection sensor 224 of FIG. 2 may be in communication with television receiver 150 or overlay device 251 and may be used to determine when a water leak has occurred, such as in pipes supplying water-based fixtures with water. Leak detection sensor 224 may be configured to attach to the exterior of a pipe and listen for the sound of water moving within the pipe. In other embodiments, sonar, temperature sensors, or ion-infused water with appropriate sensors may be used to detect moving water. As such, cutting or otherwise modifying plumbing may not be necessary to use leak detection sensor 224. If water movement is detected for greater than a threshold period of time, it may be determined that a leak is occurring. Leak detection sensor 224 may have a component that couples over an existing valve such that the flow of water within one or more pipes can be stopped. For instance, if leak detection sensor 224 determines a leak may be occurring, a notification may be provided to a user via wireless device 216 and/or display device 160 by television receiver 150 or overlay device 251. If a user does not clear the notification, the flow of water may be shut off by leak detection sensor 224 after a predefined period of time. A user may also be able to provide input to allow the flow of water to continue or to immediately interrupt the flow of water. - In
FIG. 2, the home automation system may utilize various rules to determine whether a leak is occurring. For example, a measurement threshold may be utilized in the event that water is flowing to an ice machine. The amount of water typically drawn by such a device may be known; if the flow rate and/or flow time significantly exceeds normal operating parameters, it may be determined that a leak is occurring. In some embodiments, the home automation system may communicate with appliances to determine whether water is flowing to the device. For example, a home automation system may communicate with a washing machine in operation to determine that water is flowing to the appliance, and thus determine that a water leak is not occurring. If no appliance is using water (and, possibly, it is known that no user is home), it may be determined that a leak is occurring. In other embodiments, data from various motion sensors may be utilized. For example, if the system identifies that users have left the home, but a large flow of water is occurring, then the system may determine that a leak is occurring and notify a user or take remedial steps accordingly. - Further shown in
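The leak rules above combine a flow threshold, appliance activity, and occupancy. A hedged sketch; the thresholds and parameter names are illustrative, not from the patent:

```python
def leak_suspected(flow_lpm, flow_minutes, appliance_running, occupants_home,
                   max_lpm=8.0, max_minutes=30):
    """Apply the three rules in order:
    1) flow attributable to a running appliance is not a leak;
    2) flow exceeding normal operating parameters is a leak;
    3) any flow in an empty house is a leak."""
    if appliance_running:
        return False                                  # e.g., washing machine cycle
    if flow_lpm > max_lpm or flow_minutes > max_minutes:
        return True                                   # exceeds normal parameters
    if not occupants_home and flow_lpm > 0:
        return True                                   # water flowing, nobody home
    return False
```

A real system would feed `occupants_home` from the motion sensors and device locations mentioned above, and a positive result would trigger the notify-then-shut-off sequence.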
FIG. 2, VoIP (voice over IP) controller 225 may permit television receiver 150 to serve as a hub for a home phone system. One or more conventional telephones may be connected with television receiver 150. Calls may be converted to IP by television receiver 150, allowing calls to be received and placed via network 270, which is connected with the Internet. The need for a dedicated home phone line may thus be eliminated. In some embodiments, a cellular back channel (e.g., via a cellular modem) may be utilized as a backup to other types of Internet connections, such as DSL, cable modems, or satellite Internet. -
Appliance controller 226 of FIG. 2 may permit a status of an appliance to be retrieved and commands to control operation to be sent to an appliance by television receiver 150 or overlay device 251. For instance, appliance controller 226 may control a washing machine, a dryer, a dishwasher, an oven, a microwave, a refrigerator, a toaster, a coffee maker, a hot tub, or any other form of appliance. Appliance controller 226 may be connected with the appliance or may be integrated as part of the appliance. - Appliances and other electronic devices may also be monitored for electricity usage. For instance, US Pat. Pub. No. 2013/0318559, filed Nov. 19, 2012, to Crabtree, entitled “Apparatus for Displaying Electrical Device Usage Information on a Television Receiver,” which is hereby incorporated by reference, may allow for information regarding the electricity usage of one or more devices (e.g., other home automation devices or circuits within a home that are monitored) to be determined. Control of one or more home automation devices may be dependent on electrical usage and stored electrical rates. For instance, a washing machine may be activated in the evening when rates are lower. Additionally or alternatively, operation of devices may be staggered to help prevent consuming too much power at a given time. For instance, an electric heater may not be activated until a dryer powered via the same circuit is powered down.
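The rate-aware scheduling and circuit-level staggering described above can be sketched as follows. The rate table, circuit map, and function names are invented for the example:

```python
RATES = {"peak": 0.30, "off_peak": 0.12}   # stored electrical rates, $/kWh

def cheapest_start(periods):
    """Pick the candidate period with the lowest stored rate, e.g. start
    the washing machine in the evening when rates are lower."""
    return min(periods, key=lambda p: RATES[p])

def may_start(device, circuit_of, active):
    """Stagger devices: refuse to start a device while another device on
    the same circuit is already drawing power (heater waits for dryer)."""
    circuit = circuit_of[device]
    return all(circuit_of[d] != circuit for d in active)

# Hypothetical wiring: heater and dryer share circuit A.
circuit_of = {"dryer": "A", "heater": "A", "washer": "B"}
```

With the dryer running, `may_start("heater", circuit_of, {"dryer"})` refuses, while the washer on circuit B is allowed to start.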
-
Garage door controller 228 of FIG. 2 may permit a status of a garage door to be checked and the door to be opened or closed by a user via television receiver 150 or overlay device 251. In some embodiments, based on a location of wireless device 216, the garage door may be controlled. For instance, if wireless device 216 is a cellular phone and it is detected to have moved a threshold distance away from a house having a garage door controller 228 installed, a notification may be sent to wireless device 216. If no response is received within a threshold period of time, the garage may be automatically shut. If wireless device 216 moves within a threshold distance of garage door controller 228, the garage may be opened. -
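The departure and arrival logic just described is a small decision function. A hedged sketch; the radii, timeout, and action names are illustrative assumptions:

```python
def garage_decision(distance_m, door_open, notified_at, now,
                    leave_radius=200.0, arrive_radius=50.0, timeout_s=120):
    """Return the action the controller would take given the phone's
    distance from home, the door state, and when (if ever) the user was
    notified about the door being left open."""
    if door_open and distance_m > leave_radius:
        if notified_at is None:
            return "notify_user"          # first: ask the user about the open door
        if now - notified_at >= timeout_s:
            return "close_garage"         # no response in time: auto-close
        return "wait"                     # notification pending
    if not door_open and distance_m < arrive_radius:
        return "open_garage"              # phone approaching home
    return "no_action"
```

The notify-then-timeout step matters: the door is only shut automatically after the user has had a threshold period of time to respond.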
Lock controller 230 of FIG. 2 may permit a door to be locked and unlocked and/or monitored by a user via television receiver 150 or overlay device 251. In some embodiments, lock controller 230 may have an integrated door sensor 208 to determine if the door is open, shut, or partially ajar. Being able to determine only whether a door is locked or unlocked may not be overly useful—for instance, a lock may be in a locked position, but if the door is ajar, the lock may not prevent access to the house. Therefore, for security, a user may benefit from knowing both that a door is closed (or open) and locked (or unlocked). To accomplish such notification and control, lock controller 230 may have an integrated door sensor 208 that allows the single lock controller 230 to lock/unlock a door and provide a status as to whether the door is open or shut. Therefore, a single device may control a lock and determine whether the associated door is shut or open. No mechanical or electrical component may need to be integrated separately into a door or doorframe to provide such functionality. Such a single device may have a single power source that allows for sensing of the lock position, sensing of the door position, and engagement/disengagement of the lock. Lock controller 230 may have an integrated door sensor that includes a reed switch or proximity sensor that detects when the door is in a closed position, with a plate of the lock in proximity to a plate on the door frame. For instance, a plate of the lock may have an integrated magnet, or a magnetized doorframe plate may be used. When in proximity to the magnet, a reed switch located in lock controller 230 may be used to determine that the door is closed; when not in proximity to the magnet, the reed switch located in lock controller 230 may be used to determine that the door is at least partially ajar. Rather than using a reed switch, other forms of sensing may also be used, such as a proximity sensor to detect a doorframe.
In some embodiments, the sensor to determine that the door is shut may be integrated directly into the deadbolt or other latching mechanism of lock controller 230. When the deadbolt is extended, a sensor may be able to determine if the distal end of the deadbolt is properly latched within a door frame based on a proximity sensor or other sensing means. - A
home security system 207 of FIG. 2 may be integrated with a home automation system. The home security system 207 may detect motion, when a user has armed/disarmed the home security system 207, when windows/doors are opened or broken, etc. Television receiver 150 may adjust settings of home automation devices based on home security system 207 being armed or disarmed. A virtual control and alarm panel may be presented to a user via a display device 160 and television receiver 150. The functions of a wall-mounted alarm panel can be integrated into the graphical user interface of the TV viewing experience, such as a menu system with an underlying tree structure. The virtual control and alarm panel can appear full screen or Picture-in-Picture (PiP) with TV content. Alarms and event notifications can be in the form of scrolling text overlays, popups, flashing icons, etc. Camera video (e.g., from camera 212) can be integrated with the standard DVR content of television receiver 150 with additional search, zoom, and time-line capabilities. The camera's video stream can be displayed full screen, PiP with TV content, or as a tiled mosaic to display multiple cameras' streams at the same time. In some embodiments, the display can switch between camera streams at fixed intervals. Television receiver 150 may perform video scaling, frame rate adjustment, and transcoding on video received from camera 212. In addition, television receiver 150 may adaptively transcode the camera content to match an Internet connection. -
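The fixed-interval switching between camera streams mentioned above can be pictured as a simple round-robin schedule. A minimal sketch, with invented function and parameter names (the patent does not specify an algorithm):

```python
def camera_at(cameras, seconds_elapsed, interval=10):
    """Round-robin camera selection: advance to the next stream every
    `interval` seconds, wrapping around the list of cameras."""
    if not cameras:
        return None
    index = (seconds_elapsed // interval) % len(cameras)
    return cameras[index]
```

With three cameras and a 10-second interval, the display cycles front door, driveway, backyard, then wraps back to the front door at the 30-second mark.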
Irrigation controller 232 of FIG. 2 may allow the status of an irrigation system (e.g., a sprinkler system) to be monitored and its operation controlled by a user via television receiver 150 and/or overlay device 251. Irrigation controller 232 may be used in conjunction with weather sensor 206 to determine whether and/or for how long irrigation controller 232 should be activated for watering. Further, a user, via television receiver 150 and/or overlay device, may turn on, turn off, or adjust settings of irrigation controller 232. - One or more motion sensors can be incorporated into one or more of the previously detailed home automation devices or as a stand-alone device. Such motion sensors may be used to determine if a structure is occupied. Such information may be used in conjunction with a determined location of one or more wireless devices. If some or all users are not present in the structure, home automation settings may be adjusted, such as by lowering a temperature of
thermostat 222, shutting off lights via light controller 220, and determining if one or more doors are closed by door sensor 208. In some embodiments, a user-defined script may be run when it is determined that no users or other persons are present within the structure. - Additional forms of sensors not illustrated in
FIG. 2 may also be incorporated as part of a home automation system. For instance, a mailbox sensor may be attached to a mailbox to determine when mail is present and/or has been picked up. The ability to control one or more showers, baths, and/or faucets from television receiver 150 and/or wireless device 216 may also be possible. Pool and/or hot tub monitors may be incorporated into a home automation system. Such sensors may detect whether or not a pump is running, water temperature, pH level, whether something has fallen in (e.g., a splash), etc. Further, various characteristics of the pool and/or hot tub may be controlled via the home automation system. In some embodiments, a vehicle dashcam may upload or otherwise make video/audio available to television receiver 150 when within range. For instance, when a vehicle has been parked within range of a local wireless network with which television receiver 150 is connected, video and/or audio may be transmitted from the dashcam to the television receiver for storage and/or uploading to a remote server. - The home automation functions detailed herein that are attributed to
television receiver 150 may alternatively or additionally be incorporated into overlay device 251. As such, a separate overlay device 251 may be connected with display device 160 to provide home automation functionality. - Turning now to
FIG. 3, an embodiment of a television receiver 300, which may represent television receiver 150 of FIG. 1 and/or FIG. 2, is illustrated. Television receiver 300 may be configured to function as a host for a home automation system either alone or in conjunction with a communication device, such as communication device 252 of FIG. 2. Television receiver 300 may be in the form of a separate device configured to be connected with a display device, such as a television. Embodiments of television receiver 300 can include set top boxes (STBs). In addition to being in the form of an STB, a television receiver may be incorporated as part of another device, such as a television, another form of display device, a video game console, a computer, a mobile phone, a tablet, or the like. For example, a television may have an integrated television receiver (which does not involve an external STB being coupled with the television). - As shown in
FIG. 3, television receiver 300 may be incorporated as part of a television, such as display device 160 of FIG. 1. Television receiver 300 may include: processors 310 (which may include control processor 310-1, tuning management processor 310-2, and possibly additional processors), tuners 315, network interface 320, non-transitory computer-readable storage medium 325, electronic programming guide (EPG) database 330, television interface 335, digital video recorder (DVR) database 345 (which may include provider-managed television programming storage and/or user-defined television programming), on-demand programming database 327, home automation settings database 347, home automation script database 348, remote control interface 350, security device 360, and/or descrambling engine 365. In other embodiments of television receiver 300, fewer or greater numbers of components may be present. It should be understood that the various components of television receiver 300 may be implemented using hardware, firmware, software, and/or some combination thereof. Functionality of components may be combined; for example, functions of descrambling engine 365 may be performed by tuning management processor 310-2. Further, functionality of components may be spread among additional components. - In
FIG. 3, processors 310 may include one or more specialized and/or general-purpose processors configured to perform processes such as tuning to a particular channel, accessing and displaying EPG information from EPG database 330, and/or receiving and processing input from a user. It should be understood that the functions performed by various modules of FIG. 3 may be performed using one or more processors. As such, for example, functions of descrambling engine 365 may be performed by control processor 310-1. - Control processor 310-1 of
FIG. 3 may communicate with tuning management processor 310-2. Control processor 310-1 may control the recording of television channels based on timers stored in DVR database 345. Control processor 310-1 may also provide commands to tuning management processor 310-2 when recording of a television channel is to cease. In addition to providing commands relating to the recording of television channels, control processor 310-1 may provide commands to tuning management processor 310-2 that indicate television channels to be output to decoder module 333 for output to a display device. Control processor 310-1 may also communicate with network interface 320 and remote control interface 350. Control processor 310-1 may handle incoming data from network interface 320 and remote control interface 350. Additionally, control processor 310-1 may be configured to output data via network interface 320. - Control processor 310-1 of
FIG. 3 may include the home automation engine 311. Home automation engine 311 may permit television receiver 300 and control processor 310-1 to provide home automation functionality. Home automation engine 311 may have a JSON (JavaScript Object Notation) command interpreter or some other form of command interpreter that is configured to communicate with wireless devices via network interface 320 and a message server (possibly via a message server client). Such a command interpreter of home automation engine 311 may also communicate via a local area network with devices (without using the Internet). Home automation engine 311 may contain multiple controllers specific to different protocols; for instance, a ZigBee® controller, a Z-Wave® controller, and/or an IP camera controller (wireless LAN, 802.11) may be present. Home automation engine 311 may contain a media server configured to serve streaming audio and/or video to remote devices (on a local area network or the Internet). The television receiver may be able to serve such devices with recorded content, live content, and/or content recorded using one or more home automation devices, such as camera 212. -
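A minimal sketch of how a JSON command interpreter might dispatch messages to protocol-specific controllers like those described above. The message fields (`protocol`, `device`, `action`) are assumptions for illustration; the patent does not specify a message format:

```python
import json

def make_interpreter():
    """Build a toy command interpreter with per-protocol controllers,
    mirroring the ZigBee / Z-Wave / IP-camera split described above."""
    log = []  # records (protocol, device, action) tuples as they are handled
    controllers = {
        "zigbee": lambda dev, act: log.append(("zigbee", dev, act)),
        "zwave": lambda dev, act: log.append(("zwave", dev, act)),
        "ipcam": lambda dev, act: log.append(("ipcam", dev, act)),
    }

    def interpret(raw_message):
        # Parse the JSON command and route it to the matching controller.
        msg = json.loads(raw_message)
        controller = controllers.get(msg["protocol"])
        if controller is None:
            return "unsupported protocol"
        controller(msg["device"], msg["action"])
        return "ok"

    return interpret, log
```

The dispatch-table design keeps protocol details out of the interpreter itself, so adding another controller is a one-line change.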
Tuners 315 of FIG. 3 may include one or more tuners used to tune to transponders that include broadcasts of one or more television channels. Such tuners may also be used to receive, for storage, on-demand content and/or credit-earning television commercials and/or home automation functions. In some embodiments, two, three, or more than three tuners may be present, such as four, six, or eight tuners. Each tuner contained in tuners 315 may be capable of receiving and processing a single transponder stream from a satellite transponder (or from a cable network) at a given time. As such, a single tuner may tune to a single transponder stream at a given time. If tuners 315 include multiple tuners, one tuner may be used to tune to a television channel on a first transponder stream for display using a television, while another tuner may be used to tune to a television channel on a second transponder for recording and viewing at some other time. If multiple television channels transmitted on the same transponder stream are desired, a single tuner of tuners 315 may be used to receive the signal containing the multiple television channels for presentation and/or recording. Tuners 315 may receive commands from tuning management processor 310-2. Such commands may indicate to tuners 315 which frequencies are to be tuned to. -
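Because a tuner receives one transponder stream at a time, and one transponder stream can carry several channels, the number of tuners needed is simply the number of distinct transponders among the requested channels. A sketch of that counting, with an invented channel-to-transponder mapping:

```python
def tuners_needed(requested_channels, channel_to_transponder):
    """Return how many tuners are required to receive all requested
    channels, given a mapping from channel name to transponder id.
    Channels sharing a transponder share a single tuner."""
    return len({channel_to_transponder[ch] for ch in requested_channels})
```

For example, four broadcast channels carried on one transponder need only one tuner, while adding a channel from a second transponder requires a second tuner.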
Network interface 320 of FIG. 3 may be used to communicate via an alternate communication channel with a television service provider, if such a communication channel is available. A communication channel may be via satellite (which may be unidirectional to television receiver 300), and the alternate communication channel (which may be bidirectional) may be via a network, such as the Internet. Data may be transmitted from television receiver 300 to a television service provider system and from the television service provider system to television receiver 300. Information may be transmitted and/or received via network interface 320. For instance, instructions from a television service provider may also be received via network interface 320, if connected with the Internet. Besides satellite, the primary communication channel may be a cable network, an IP-based network, or a broadcast network. Network interface 320 may permit wireless communication with one or more types of networks, including using home automation network protocols and wireless network protocols. Also, wired networks may be connected to and communicated with via network interface 320. Device interface 321 may represent a USB port or some other form of communication port that permits communication with a communication device. -
Storage medium 325 of FIG. 3 may represent one or more non-transitory computer-readable storage mediums. Storage medium 325 may include memory and/or a hard drive. Storage medium 325 may be used to store information received from one or more satellites and/or information received via network interface 320. Storage medium 325 may store information related to on-demand programming database 327, EPG database 330, DVR database 345, home automation settings database 347, and/or home automation script database 348. Recorded television programs may be stored using storage medium 325 as part of DVR database 345. Storage medium 325 may be partitioned or otherwise divided (such as into folders) such that predefined amounts of storage medium 325 are devoted to storage of television programs recorded due to user-defined timers and stored television programs recorded due to provider-defined timers. - Home
automation settings database 347 of FIG. 3 may allow configuration settings of home automation devices and user preferences to be stored. Home automation settings database 347 may store data related to various devices that have been set up to communicate with television receiver 300. For instance, home automation settings database 347 may be configured to store information on which types of events should be indicated to users, to which users, in what order, and what communication methods should be used. For instance, an event such as an open garage may be notified only to certain wireless devices (e.g., a cellular phone associated with a parent, not a child), and notification may be by a third-party notification server, email, text message, and/or phone call. In some embodiments, a second notification method may be used only if a first fails. For instance, if a notification cannot be sent to the user via a third-party notification server, an email may be sent. - Home
automation settings database 347 of FIG. 3 may store information that allows for the configuration and control of individual home automation devices, which may operate using Z-Wave- and ZigBee-specific protocols. To do so, home automation engine 311 may create a proxy for each device that allows settings for the device to be passed through a UI (e.g., presented on a television), allowing settings to be solicited for and collected via a user interface presented by the television receiver or overlay device. The received settings may then be handled by the proxy specific to the protocol, allowing the settings to be passed on to the appropriate device. Such an arrangement may allow settings to be collected and received via a UI of the television receiver or overlay device and passed to the appropriate home automation device and/or used for managing the appropriate home automation device. - Home
automation script database 348 of FIG. 3 may store scripts that detail how home automation devices are to function based on various events occurring. For instance, if stored content starts being played back by television receiver 300, lights in the vicinity of display device 160 may be dimmed and shades may be lowered by shade controller 204. As another example, when a user shuts programming off late in the evening, there may be an assumption that the user is going to bed. Therefore, the user may configure television receiver 300 to lock all doors via lock controller 230, shut the garage door via garage controller 228, lower a heat setting of thermostat 222, shut off all lights via light controller 220, and determine if any windows or doors are open via window sensor 210 and door sensor 208 (and, if so, alert the user). Such scripts or programs may be predefined by the home automation/television service provider and/or may be defined by a user. - In some embodiments, home automation script database 348 of
FIG. 3 may allow for various music profiles to be implemented. For instance, based on home automation settings within a structure, appropriate music may be played. For instance, if the lights are dimmed, romantic music may be played. Conversely, based on the music being played, settings of home automation devices may be determined. If television programming, such as a movie, is output for playback by television receiver 150, a particular home automation script may be used to adjust home automation settings (e.g., lower lights, raise temperature, and lock doors). -
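The event-driven scripts described in the two paragraphs above can be pictured as a table mapping an event to an ordered list of device actions. The event and action names below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical event -> actions table, in the spirit of home automation
# script database 348. Real scripts could be provider- or user-defined.
SCRIPTS = {
    "playback_started": [("lights", "dim"), ("shades", "lower")],
    "programming_off_late": [
        ("locks", "lock_all"),
        ("garage", "close"),
        ("thermostat", "lower_heat"),
        ("lights", "off"),
        ("windows_doors", "check_open"),
    ],
}

def actions_for(event):
    """Look up the ordered device actions to run when an event occurs."""
    return SCRIPTS.get(event, [])
```

An engine consuming this table would walk the action list in order and hand each (device, action) pair to the matching controller; unknown events simply produce no actions.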
EPG database 330 of FIG. 3 may store information related to television channels and the timing of programs appearing on such television channels. EPG database 330 may be stored using storage medium 325, which may be a hard drive or solid-state drive. Information from EPG database 330 may be used to inform users of what television channels or programs are popular and/or provide recommendations to the user. Information from EPG database 330 may provide the user with a visual interface displayed by a television that allows a user to browse and select television channels and/or television programs for viewing and/or recording. Information used to populate EPG database 330 may be received via network interface 320, via satellite, or via some other communication link with a television service provider (e.g., a cable network). Updates to EPG database 330 may be received periodically. EPG database 330 may serve as an interface for a user to control DVR functions of television receiver 300, and/or to enable viewing and/or recording of multiple television channels simultaneously. EPG database 330 may also contain information about on-demand content or any other form of accessible content. -
Decoder module 333 of FIG. 3 may serve to convert encoded video and audio into a format suitable for output to a display device. For instance, decoder module 333 may receive MPEG video and audio from storage medium 325 or descrambling engine 365 to be output to a television. MPEG video and audio from storage medium 325 may have been recorded to DVR database 345 as part of a previously-recorded television program. Decoder module 333 may convert the MPEG video into a format appropriate to be displayed by a television or other form of display device, and the MPEG audio into a format appropriate to be output from speakers. Decoder module 333 may have the ability to simultaneously convert a finite number of television channel streams received from storage medium 325 or descrambling engine 365. For instance, decoders within decoder module 333 may be able to decode only a single television channel at a time. Decoder module 333 may have various numbers of decoders. -
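The finite decoding capacity described above can be sketched as a small bookkeeping class; the names and interface are hypothetical:

```python
class DecoderModule:
    """Tracks a fixed pool of decoders; starting another channel fails
    once every decoder is busy, mirroring the finite-capacity limit
    described for decoder module 333."""

    def __init__(self, num_decoders=1):
        self.num_decoders = num_decoders
        self.active = []  # channels currently being decoded

    def start(self, channel):
        if len(self.active) >= self.num_decoders:
            return False  # all decoders are in use
        self.active.append(channel)
        return True

    def stop(self, channel):
        if channel in self.active:
            self.active.remove(channel)
```

With a single decoder, a second channel cannot be decoded until the first is stopped.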
Television interface 335 of FIG. 3 may serve to output a signal to a television (or another form of display device) in a proper format for display of video and playback of audio. As such, television interface 335 may output one or more television channels, stored television programming from storage medium 325 (e.g., television programs from DVR database 345, television programs from on-demand programming database 327, and/or information from EPG database 330) to a television for presentation. Television interface 335 may also serve to output a CVM. - Still referring to
FIG. 3, digital video recorder (DVR) functionality may permit a television channel to be recorded for a period of time. DVR functionality of television receiver 300 may be managed by control processor 310-1. Control processor 310-1 may coordinate the television channel, start time, and stop time of when recording of a television channel is to occur. DVR database 345 may store information related to the recording of television channels. DVR database 345 may store timers that are used by control processor 310-1 to determine when a television channel should be tuned to and its programs recorded to DVR database 345 of storage medium 325. In some embodiments, a limited amount of storage medium 325 may be devoted to DVR database 345. Timers may be set by the television service provider and/or one or more users of television receiver 300. -
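A timer such as those stored in DVR database 345 can be pictured as a (channel, start, stop) record; the control processor then only needs to check the current time against each timer to decide what to tune and record. A sketch with invented field names:

```python
def channels_to_record(timers, now_minutes):
    """Given timers as (channel, start, stop) tuples in minutes-of-day,
    return the channels that should currently be tuned and recorded."""
    return [ch for ch, start, stop in timers if start <= now_minutes < stop]
```

Overlapping timers simply yield multiple channels, which is where the multiple-tuner arrangement described earlier comes into play.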
DVR database 345 of FIG. 3 may also be used to store recordings of service provider-defined television channels. For each day, an array of files may be created. For example, based on provider-defined timers, a file may be created for each recorded television channel for a day. For example, if four television channels are recorded from 6-10 PM on a given day, four files may be created (one for each television channel). Within each file, one or more television programs may be present. The service provider may define the television channels, the dates, and the time periods for which the television channels are recorded for the provider-defined timers. The provider-defined timers may be transmitted to television receiver 300 via the television provider's network. For example, in a satellite-based television service provider system, data necessary to create the provider-defined timers at television receiver 150 may be received via satellite. - Still referring to
FIG. 3, as an example of DVR functionality of television receiver 300 being used to record based on provider-defined timers, a television service provider may configure television receiver 300 to record television programming on multiple, predefined television channels for a predefined period of time, on predefined dates. For instance, a television service provider may configure television receiver 300 such that television programming may be recorded from 7 to 10 PM on NBC, ABC, CBS, and FOX on each weeknight and from 6 to 10 PM on each weekend night on the same channels. These channels may be transmitted as part of a single transponder stream such that only a single tuner needs to be used to receive the television channels. Packets for such television channels may be interspersed and may be received and recorded to a file. If a television program is selected for recording by a user and is also specified for recording by the television service provider, the user selection may serve as an indication to save the television program for an extended time (beyond the time for which the predefined recording would otherwise be saved). Television programming recorded based on provider-defined timers may be stored to a portion of storage medium 325 for provider-managed television programming storage. - On-
demand programming database 327 of FIG. 3 may store additional television programming. On-demand programming database 327 may include television programming that was not recorded to storage medium 325 via a timer (either user- or provider-defined). Rather, on-demand programming may be programming provided to the television receiver directly for storage by the television receiver and for later presentation to one or more users. On-demand programming may not be user-selected. As such, the television programming stored to on-demand programming database 327 may be the same for each television receiver of a television service provider. On-demand programming database 327 may include pay-per-view (PPV) programming that a user must pay for and/or use an amount of credits to view. For instance, on-demand programming database 327 may include movies that are not yet available for purchase or rental. Typically, on-demand programming is presented commercial-free. - Referring back to
tuners 315 of FIG. 3, television channels received via satellite (or cable) may contain at least some scrambled data. Packets of audio and video may be scrambled to prevent unauthorized users (e.g., nonsubscribers) from receiving television programming without paying the television service provider. When a tuner of tuners 315 is receiving data from a particular transponder of a satellite, the transponder stream may be a series of data packets corresponding to multiple television channels. Each data packet may contain a packet identifier (PID), which can be determined to be associated with a particular television channel. Particular data packets, referred to as entitlement control messages (ECMs), may be periodically transmitted. ECMs may be associated with another PID and may be encrypted; television receiver 300 may use decryption engine 361 of security device 360 to decrypt ECMs. Decryption of an ECM may only be possible if the user has authorization to access the particular television channel associated with the ECM. When an ECM is determined to correspond to a television channel being stored and/or displayed, the ECM may be provided to security device 360 for decryption. - When
security device 360 of FIG. 3 receives an encrypted ECM, security device 360 may decrypt the ECM to obtain some number of control words. In some embodiments, from each ECM received by security device 360, two control words are obtained. In some embodiments, when security device 360 receives an ECM, it compares the ECM to the previously received ECM. If the two ECMs match, the second ECM is not decrypted because the same control words would be obtained. In other embodiments, each ECM received by security device 360 is decrypted; however, if a second ECM matches a first ECM, the outputted control words will match; thus, effectively, the second ECM does not affect the control words output by security device 360. Security device 360 may be permanently part of television receiver 300 or may be configured to be inserted into and removed from television receiver 300, such as a smart card, cable card, or the like. - Tuning management processor 310-2 of
FIG. 3 may be in communication with tuners 315 and control processor 310-1. Tuning management processor 310-2 may be configured to receive commands from control processor 310-1. Such commands may indicate when to start/stop receiving and/or recording of a television channel and/or when to start/stop causing a television channel to be output to a television. Tuning management processor 310-2 may control tuners 315. Tuning management processor 310-2 may provide commands to tuners 315 that instruct the tuners which satellite, transponder, and/or frequency to tune to. From tuners 315, tuning management processor 310-2 may receive transponder streams of packetized data. - Descrambling
engine 365 of FIG. 3 may use the control words output by security device 360 in order to descramble video and/or audio corresponding to television channels and/or home automation functions for storage and/or presentation. Video and/or audio data contained in the transponder data stream received by tuners 315 may be scrambled. Video and/or audio data may be descrambled by descrambling engine 365 using a particular control word. Which control word output by security device 360 is to be used for successful descrambling may be indicated by a scramble control identifier present within the data packet containing the scrambled video or audio. Descrambled video and/or audio may be output by descrambling engine 365 to storage medium 325 for storage (in DVR database 345) and/or to decoder module 333 for output to a television or other presentation equipment via television interface 335. - In some embodiments, the
television receiver 300 of FIG. 3 may be configured to periodically reboot in order to install software updates downloaded over the network 190 or satellites 130. Such reboots may occur, for example, during the night, when the users are likely asleep and not watching television. If the system utilizes a single processing module to provide television receiving and home automation functionality, then the security functions may be temporarily deactivated during a reboot. In order to increase the security of the system, the television receiver 300 may be configured to reboot at random times during the night in order to allow for installation of updates. Thus, an intruder is less likely to guess the time when the system is rebooting. In some embodiments, the television receiver 300 may include multiple processing modules for providing different functionality, such as television receiving functionality and home automation, such that an update to one module does not necessitate a reboot of the whole system. In other embodiments, multiple processing modules may be made available as a primary and a backup during any installation or update procedures. - For simplicity,
television receiver 300 of FIG. 3 has been reduced to a block diagram; commonly known parts, such as a power supply, have been omitted. Further, some routing between the various modules of television receiver 300 has been illustrated. Such illustrations are for exemplary purposes only. The state of two modules not being directly or indirectly connected does not indicate that the modules cannot communicate. Rather, connections between modules of the television receiver 300 are intended only to indicate possible common data routing. It should be understood that the modules of television receiver 300 may be combined into a fewer number of modules or divided into a greater number of modules. Further, the components of television receiver 300 may be part of another device, such as built into a television. Television receiver 300 may include one or more instances of various computerized components, such as disclosed in relation to computer system 700 of FIG. 7. - While the
television receiver 300 has been illustrated as a satellite-based television receiver, it is to be appreciated that techniques below may be implemented in other types of television receiving devices, such as cable receivers, terrestrial receivers, IPTV receivers, or the like. In some embodiments, the television receiver 300 may be configured as a hybrid receiving device, capable of receiving content from disparate communication networks, such as satellite and terrestrial television broadcasts. In some embodiments, the tuners may be in the form of network interfaces capable of receiving content from designated network locations. The home automation functions of television receiver 300 may be performed by an overlay device. If such an overlay device is used, television programming functions may still be provided by a television receiver that is not used to provide home automation functions. In another aspect, one or more home automation functions may be performed by a voice command engine 370, which may be incorporated in the home automation engine 311 as shown and/or in the storage medium 325. For example, the voice command engine 370 may provide for speaker-dependent commands to be implemented in the home automation system, as described in further detail in the following paragraphs. - Turning now to
FIG. 4, an example method 400 for controlling a device in a home automation system, such as the home automation system 200, based on a speaker-dependent command is provided. The method 400 may be implemented by the voice command engine 370, which may be incorporated in the home automation engine 311 that is found in the television receiver 150 and/or the overlay device 251. The method 400 shown, and any other methods disclosed herein, may include additional and/or alternative steps in relation to the steps being shown. Further, any steps may be optional, rearranged, and/or combined. Numerous variations are possible. - As shown in
FIG. 4, the method 400 may include receiving a voice command 402. The voice command 402 may be a spoken voice command provided by a user, e.g., a speaker on the premises having the home automation system 200, and directed to adjusting a state of one or more devices in the home automation system 200. Merely by way of example, the speaker may be an adult located in a living room of a house equipped with the home automation system 200, and the spoken voice command from the adult may include, "Lower the heat by five degrees." Such spoken voice commands may be detected by one or more microphones in operative communication with the voice command engine 370. For example, the microphone(s) utilized by the voice command engine 370 may be found on any television receiver, a television remote control, and/or any of the home automation devices in the home automation system 200, such as the intercom 218, display device 160, home security system 207, camera 212, health sensor 214, or any other device shown in FIG. 2. The voice command may be communicated to the voice command engine 370 via a local wireless area network, a wired network, a home automation network, and/or any other type of communications network. Other examples are possible. - In another example, the voice command received by the
voice command engine 370 may be provided by the speaker via a mobile device, such as the wireless device 216, which may represent a tablet computer, cellular phone, laptop computer, remote computer, or some other device through which a user may desire to control home automation settings and view home automation information. Voice commands may be detected by a microphone of the wireless device 216, which may further encode and/or transmit the detected voice command to the voice command engine 370 via a communications network. In this case, the voice command engine 370 may receive the voice command from a remotely-located microphone, e.g., a mobile device and speaker not located on the premises having the home automation system 200, and further decode the voice command signal. In some cases, the wireless device 216 may support a mobile application or app that is configured to detect, encode, and/or transmit the voice command to the voice command engine 370. In another example, the voice command engine 370 connects to a VoIP platform and receives voice commands via VoIP, or via traditional phone calls over a cellular or landline network. Still, other examples are possible. Further, other sensors for detecting the voice command may be possible. - In some cases, the received voice command may be preceded by a spoken code-word that is received by the
voice command engine 370 in a similar manner as described above. It is contemplated that such code-words or phrases, such as the code-word “Sesame” preceding the command, “Lower the heat by five degrees,” may be user-defined and/or pre-programmed. In some cases, a code-word may be directed to activating or otherwise indicating to the voice command engine 370 an incoming voice command. In other cases, the code-word may be user-specific and indicate a speaker identity and/or speaker verification to the voice command engine 370. The code-word may serve as a password or passcode for controlling any devices in the home automation system 200, and may be device-specific, alternatively and/or additionally to being user-specific. - In still other examples, the code-word may be utilized to alter operational settings of other modules or devices in operation with the
voice command engine 370 to implement one or more settings for receiving the voice command. Merely by way of example, various or all components of the voice command engine 370 may be in a sleep mode until the code-word is detected at the engine 370 and/or a television receiver, whereupon the voice command engine 370 may be activated by an activation module therein and/or by a signal from the television receiver. This may prepare the voice command engine 370 for receiving the voice command and for performing subsequent steps thereafter. In another example, a volume level of a television or other speaker may be lowered in preparation for receiving the voice command at the voice command engine 370. - In still further examples, the
voice command engine 370 may acknowledge the code-word and/or the speaker identity prior to receiving the voice command. For example, the voice command engine 370, upon activation in response to the spoken code-word, may output in speech and/or text format on a display screen, “Yes, Sam?” to indicate the voice command engine 370 is active and/or to indicate a determined speaker identity of the code-word. Following that responsive prompt, the voice command engine 370 may activate one or more microphones to receive the voice command and/or provide further prompts and receive further responses to clarify the speaker identity and/or clarify the intentions of the voice command. It is noted that activating one or more microphones may include determining if a microphone is in closer proximity to the speaker, turning that microphone on, and switching an original microphone that picked up the code-word to an off state. This may ensure quality and accuracy of the voice detection and subsequent voice recognition process. - In yet another example, the
voice command engine 370 may determine that more than one speaker is giving voice commands at any given time. In that case, if the voice commands include conflicting orders, the voice command engine 370 may determine the plurality of speaker identities involved and determine which speaker identity has the higher level of permission, e.g. adult or child, and implement the voice command of the higher-ranked speaker in remaining steps of the method 400. In some cases, the voice command engine 370 may confirm denial of certain voice commands and/or output a reason for denying such commands. Similarly, the voice command engine 370 may confirm reception of the voice command, and/or repeat the received voice command as a confirmation to the speaker immediately after receiving the voice command. Still, other examples are possible. - As further shown in
FIG. 4, the method 400 may include determining a speaker identity 404. In some cases, the voice command engine 370 may perform a voice recognition analysis to determine a speaker identity of the received voice command. For instance, the voice command engine 370 may analyze a speech pattern, voice pitch, speaking style, accent, and/or other aspects of speech to determine an identity of the speaker. In some cases, the received voice command, and/or portions thereof, is compared against one or more voice samples in a database of voices, e.g. a voice database of the voice command engine 370 as shown in FIG. 5. The voice database may include one or more voice samples associated with one or more speaker identities. When a match is determined by the voice command engine 370 in comparing the voice command to the voice sample(s), the speaker identity may be determined, and/or tentatively determined and output for final confirmation from the speaker. - Additionally and/or alternatively, the
voice command engine 370 may perform the voice recognition analysis to verify that a speaker is who they say they are. For instance, the voice command engine 370 may detect that the speaker has identified herself in the voice command and/or code-word, or otherwise. In that case, the voice command engine 370 may treat the speaker-provided identification as a tentative identity, and perform further voice recognition analysis to verify the identity. Such analysis may be performed utilizing the voice database as described above. It is contemplated that such features may enhance secure access to the controls for the home automation system 200. For instance, further prompts and/or denial of access to the home automation system 200 may be implemented by the voice command engine 370 in response to determining a false speaker verification. - Still referring to
FIG. 4, the method 400 may include determining a device for control 406. In some cases, the voice command engine 370 may perform a speech recognition analysis to identify one or more devices in the home automation system 200 that are intended to be controlled by the received command. For instance, the voice command engine 370 may convert the spoken voice command to a digitally stored set of words. In some cases, all or a portion of the set of words are compared to one or more device names, such as device IDs, which may be stored in a database of device names, e.g. a controls database of the voice command engine 370 as shown in FIG. 5. In that case, the voice command engine 370 may identify one or more words in the set of words that match one or more device names in the controls database. The matched device name(s) may indicate the intended device(s) for control. - In another example, the
voice command engine 370 may utilize speech recognition to determine one or more words or phrases in the spoken voice command that may be related to a function of a device, and determine the device to be controlled based on the function revealed in the voice command. Merely by way of example, the voice command engine 370 may determine that in the received voice command for “Lower the heat by five degrees,” no particular device was verbally identified by the speaker. Additionally and/or alternatively, the voice command engine 370 may determine that certain portions of the voice command include command phrases, such as the phrase “lower the heat” and/or the word “degrees”. The combination of the identified command phrases, and/or a command phrase taken alone, may be sufficient for the voice command engine 370 to determine the device(s) to control. For instance, the voice command engine 370 may compare one or more of the control phrases to a database of control phrases, e.g. the controls database of the voice command engine 370 as shown in FIG. 5, whereby one or more control phrases may each be associated with one or more device names, e.g. device IDs. In this way, the voice command engine 370 may look up or otherwise determine the intended device based on a match of the identified command phrase(s) with one or a grouping of control phrases that are associated with a device ID. In the example voice command for “Lower the heat by five degrees,” the voice command engine 370 may determine that the device intended for control is a thermostat, e.g. the thermostat 222. - It is contemplated that such control phrases may be determined based on a plurality of voice commands. For instance, the
voice command engine 370 may prompt the speaker for additional voice commands and/or other input if an initially received command was determined insufficient for determining a device. In another aspect, the voice command engine 370 may determine a list of possible device IDs indicating devices to control based on the control phrase(s), and prompt the user for more information or voice commands in order to narrow down the list of device IDs to the determined device ID. It is further contemplated that other cues may be captured by the voice command engine 370 via various devices in the home automation system 200. For instance, a speaker location may be detected by one or more devices and transmitted to the voice command engine 370 to be utilized for determination of the device to control. Merely by way of example, the camera 212 in a living room may notify the voice command engine 370, via the home automation network or other communications network, that the speaker is located in the living room, as detected by the camera 212. Based on the additional information, the voice command engine 370 may lower the thermostat that controls the living room by five degrees, while other thermostats in other parts of the house are unaffected, e.g. the voice command engine 370 does not send the command signal to other thermostats. In some examples, the voice command engine 370 may query other devices, such as the camera 212, for such additional information in order to determine the intended device. - In still another aspect, the
voice command engine 370 may implement speech recognition to determine a new state or status being requested. For example, the voice command engine 370 may determine “five degrees” to be a magnitude or unit of change for the determined device, based on the example received voice command of “Lower the heat by five degrees.” In another example, the voice command engine 370 may determine that “on” is a new state in an example received voice command of “Turn the light on.” In another aspect, control phrases may be user-defined and/or pushed from a satellite to the controls database during periodic updates from, for example, the home automation service server 112 via satellite connections and/or the network 190, as shown in FIG. 1. In this way, it is contemplated that the controls database stays current and relevant to the devices connected to the home automation system 200. Still, other examples are possible. - Referring again to
FIG. 4, the method 400 may include determining a permission of the speaker to control the identified device(s) 408. In some cases, the voice command engine 370 may determine a permission status to control the identified device(s). The permission status may be based on the determined speaker identity and/or the identified device(s) to control. It is contemplated that the permission status enhances safety and security of the home automation system 200 by having the voice command engine 370 prohibit otherwise undesirable controls from being implemented. In another aspect, the permission status allows the voice command engine 370 to provide greater flexibility for user(s), who may set when certain commands are forbidden and when certain commands are allowable. - Merely by way of example, the permission status determined by the
voice command engine 370 may include an access granted status and/or an access denied status. The permission status may be determined based on one or more variables, such as the speaker identity, access settings associated with the speaker identity such as parental controls settings, the identified control phrases in the voice command, the device for control, the code words, voice samples, a current status of the device to control or other devices in the home automation system, and so on. In another example, other variables, such as time of day, may be considered by the voice command engine 370 in determining the permission status. In some cases where the voice command engine 370 determines that a plurality of devices are intended for control, the voice command engine 370 may determine a plurality of permission statuses. For example, the voice command engine 370 may determine that some of the plurality of identified devices have an access granted status, while others of the same plurality of identified devices have an access denied status. In other examples, the voice command engine 370 may determine a permission status indicating that none of the plurality of devices is permitted for control. In still other examples, the voice command engine 370 may determine the access granted status for the permission status of all of the plurality of determined devices. - Referring yet again to
FIG. 4, in some examples, the method 400 may include transmitting a control signal, e.g. an operational command, to the determined device 410. For instance, the voice command engine 370 may control the identified device(s) in the home automation system 200 based on the determined permission status. In one example, the voice command engine 370 may transmit an operational command signal to the identified device based on the access granted permission status. In another example, the voice command engine 370 may generate the operational command based on a communication protocol, e.g. Zigbee®, Z-Wave®, specific to the identified device and/or transmit the generated operational command to the device using the communication protocols in the home automation system 200. Further examples may be possible. For instance, the method 400 may further include outputting a confirmation notification that indicates a new state of the identified device, whereby the new state is based on the received voice command. It is noted that step 410 is optional and may be dependent on the permission status being determined as an access granted status. Additionally, it is noted that any of the steps in the method 400 may be optional. Further, it is noted that a plurality of control signals may be delivered to a plurality of identified devices when necessary. - In another example, the
voice command engine 370 may output, after determining an access denied permission status, a notification to inform the speaker that the desired control was not implemented. Such notifications may include speech notifications, sounds, text on display screens, and so on. In that case, the voice command engine 370 may maintain a current state of the identified device and/or not transmit any control signal to the identified device. Other examples are possible. - Turning now to
FIG. 5, example modules of the voice command engine 370 that may provide the method 400, and/or any other features disclosed herein, are shown. The voice command engine 370 may be incorporated in the home automation engine 311 that is found in the television receiver 150 and/or the overlay device 251. In one example, the voice command engine 370 is provided for in the control processor 310-1 and/or the storage medium 325 of the television receiver 300 as shown in FIG. 3. It is contemplated that various modules of the voice command engine 370 can be provided for by different parts of the television receiver 300, and/or any computer system such as the disclosed computer system 700 in FIG. 7. It is noted that the modules may be arranged in any manner and in operative communication with one another. Further, it is noted that any of the modules may be rearranged, optional, and/or additional modules may be included in the voice command engine 370. In general, it is contemplated that the voice command engine 370 may identify a user, e.g. a speaker, and push an operational command to a home automation device if access for the user is authorized. - As shown in the schematic block diagram of the
voice command engine 370 of FIG. 5, the voice command engine 370 may include a database 502 that may be divided into, or otherwise include, a voice database 504, a settings database 506, and/or a controls database 508. The voice database 504 may include stored voice samples 510 and/or code words 512. The settings database 506 may include speaker identities 514 and/or access settings 516. The controls database 508 may include control phrases 518 and/or device identifications 520. It is noted that any of the data type modules may be commonly shared among the databases. For example, the speaker identities 514 may be stored in the voice database 504 and the code words 512 may be stored in the settings database 506. Other examples are possible. -
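The division of the database 502 into voice, settings, and controls databases described above can be sketched as a simple in-memory layout. This is only an illustrative assumption of one possible schema; the class and field names below are invented for the sketch and the patent does not prescribe any concrete data structure.

```python
# Illustrative sketch of one way the database 502 of FIG. 5 could be laid
# out in memory. All class and field names are assumptions for illustration.
from dataclasses import dataclass, field


@dataclass
class VoiceDatabase:                                      # voice database 504
    voice_samples: dict = field(default_factory=dict)     # speaker identity -> list of samples (510)
    code_words: dict = field(default_factory=dict)        # code word -> speaker identity (512)


@dataclass
class SettingsDatabase:                                   # settings database 506
    speaker_identities: list = field(default_factory=list)  # 514
    access_settings: dict = field(default_factory=dict)     # speaker identity -> settings (516)


@dataclass
class ControlsDatabase:                                   # controls database 508
    control_phrases: dict = field(default_factory=dict)   # phrase -> device identification (518 -> 520)
    device_ids: list = field(default_factory=list)        # 520


@dataclass
class Database:                                           # database 502
    voice: VoiceDatabase = field(default_factory=VoiceDatabase)
    settings: SettingsDatabase = field(default_factory=SettingsDatabase)
    controls: ControlsDatabase = field(default_factory=ControlsDatabase)


db = Database()
db.voice.code_words["Sesame"] = "Sam"                     # a user-specific code word
db.controls.control_phrases["lower the heat"] = "thermostat_222"
```

As the passage notes, data types may be shared across the sub-databases, so a real layout could equally keep everything in one keyed store.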
Voice samples 510 may be gathered and stored for each user of the voice command engine 370 during an initial setup. For example, the voice command engine 370 may receive one or more voice samples during the initial setup, and/or associate such voice samples with speaker identities 514 for each user. Such voice samples may be captured by a microphone in communication with the voice command engine 370 and/or consist of certain voice commands, predetermined phrases, and/or personalized phrases. Merely by way of example, a training session may be initiated during the initial setup to train the voice command engine 370 to a speaker's voice by collecting audio samples of the speaker and associating the audio samples with the speaker's identity. The one or more voice samples associated with the speaker's identity may be stored in the voice database 504, and/or the database 502 in general. -
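The enrollment step above, collecting audio samples and associating them with a speaker identity, might look like the following sketch. The feature extraction here is a toy stand-in (a real system would use acoustic features such as MFCCs), and all names are illustrative assumptions.

```python
# Hypothetical enrollment routine for the initial-setup training session:
# capture samples for a speaker and store one feature vector per sample
# under the speaker identity. extract_features() is a toy stand-in for
# real acoustic feature extraction.
from collections import defaultdict

voice_db = defaultdict(list)   # speaker identity -> list of stored voiceprints


def extract_features(audio: bytes) -> list:
    """Toy stand-in: average byte values over four interleaved sub-streams."""
    if not audio:
        return [0.0] * 4
    return [sum(audio[i::4]) / (len(audio[i::4]) or 1) for i in range(4)]


def enroll(speaker_identity: str, samples: list) -> None:
    """Associate each captured audio sample with the speaker identity."""
    for sample in samples:
        voice_db[speaker_identity].append(extract_features(sample))


# Example training session for the speaker "Sam"
enroll("Sam", [b"lower the heat by five degrees", b"sesame"])
```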
Code words 512 may be received by the voice command engine 370 during an initial setup and upon detection by a microphone connected thereto. In some cases, the code-word may be user-configured and assigned to a specific speaker, e.g. a specific speaker identity. In other examples, a code-word may be assigned to multiple speaker identities. In still other examples, the code-word may be a general code-word that precedes any voice command to be received by the voice command engine 370, and is not associated with a particular speaker. Such code-words may be user-configured and/or preprogrammed, and may consist of a word or a phrase. In still other examples, specific code-words may be associated with particular locations or rooms containing certain devices. For instance, a speaker in a living room may use “Sesame” as a code-word preceding a voice command, “Lower the heat by five degrees,” to indicate to the voice command engine 370 that the device to control is in the living room. On the other hand, the speaker may use a different code-word, e.g. “Genie, lower the heat by five degrees,” to indicate to the voice command engine 370 that the speaker is in a different room, such as a kitchen, and therefore the kitchen thermostat is intended for control. In this way, it is contemplated that the voice command engine 370 may utilize the code-word to distinguish particular device(s) to control. - In another example, the
code words 512 may be utilized to start up one or more components of the voice command engine 370 upon detection thereof. For example, specific words may be spoken by a user to get attention from the voice command engine 370. In that case, the voice command engine 370 and/or a device in operative communication with, or containing, the engine 370 may continuously search for an activation-type code word. For instance, the voice command engine 370 may continuously search for and identify a set of words and/or unique inflections to determine that the voice command engine 370 is about to receive a voice command. In another aspect, the code words may serve as authentication phrases or passwords to access the controls of the voice command engine 370, and/or as identifying phrases to determine a speaker identity, device identity, locations, and so on. -
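The continuous search for an activation-type code word described above can be sketched as a small state machine: the engine sleeps, scans each transcribed utterance, and wakes only when a code word is heard. The code words and class structure here are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of continuous code-word scanning, assuming the audio has
# already been transcribed into words upstream. Names are illustrative.
ACTIVATION_CODE_WORDS = {"sesame", "genie"}


class SleepingEngine:
    """Stays asleep until an activation code word is detected."""

    def __init__(self):
        self.active = False

    def on_transcribed_words(self, words):
        # Wake on any activation code word; ignore everything while asleep.
        if not self.active and ACTIVATION_CODE_WORDS & {w.lower() for w in words}:
            self.active = True   # e.g. power up microphones, lower TV volume
        return self.active


engine = SleepingEngine()
engine.on_transcribed_words(["lower", "the", "heat"])   # ignored: still asleep
engine.on_transcribed_words(["Sesame"])                 # wakes the engine
```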
Speaker identities 514 may be received through text and/or captured by a microphone during an initial set-up phase of the voice command engine 370. The speaker identity may be user-configured and/or include a speaker's first name, last name, and/or nickname. In practice, a speaker may identify herself by stating her speaker identity, whereupon the voice command engine 370 may use voice recognition analysis to further verify that the speaker is not providing a false identity. Upon detection of a false identity, the voice command engine 370 may output a notification regarding denied access, a prompt to restate the speaker identity, and/or notify other recipient devices, e.g. mobile devices of other users, of the attempted access. - It is contemplated that the
voice command engine 370 may look up one or more voice samples associated with the speaker identity to verify the speaker. In another example, the speaker may provide a voice command, and the voice command engine 370 may analyze the voice command for a match with one or more of the voice samples 510, which are further mapped to one or more speaker identities. In that case, the voice command engine 370 may determine the speaker identity based on voice recognition. In still other examples, the database 502 may provide a list of unavailable speaker identities. For example, a speaker may not be able to select, during the initial setup, a name that is also a device name, e.g. a device identity. Other unavailable speaker identities may be user-configured and/or preprogrammed. Further, in some examples, the voice command engine 370 may prohibit device identities from overlapping with speaker identities. -
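The match of a voice command against stored voice samples mapped to speaker identities can be sketched as a nearest-neighbor lookup with a rejection threshold. The Euclidean-distance comparison and the sample vectors are stand-in assumptions; a real system would use a trained speaker-recognition model.

```python
# Hedged sketch of speaker determination by voice recognition: compare the
# command's feature vector against enrolled samples (each mapped to a
# speaker identity) and return the closest identity within a threshold,
# or None (no match / access denied) otherwise. Vectors are illustrative.
import math

enrolled_samples = {            # speaker identity -> enrolled feature vector
    "Sam": [0.9, 0.1, 0.4],
    "Alex": [0.2, 0.8, 0.5],
}


def identify_speaker(features, threshold=0.5):
    best_identity, best_dist = None, float("inf")
    for identity, sample in enrolled_samples.items():
        dist = math.dist(features, sample)  # Euclidean distance (stand-in metric)
        if dist < best_dist:
            best_identity, best_dist = identity, dist
    return best_identity if best_dist <= threshold else None
```

A command whose features sit near Sam's enrolled vector resolves to "Sam"; an unfamiliar voice falls outside the threshold and yields no identity, which could trigger the denied-access prompts described above.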
Access settings 516 may be user-configurable and associated with one or more speaker identities. For example, the voice command engine 370 may store the one or more access settings associated with the speaker identity as a user profile. In one example, a user profile is a child's profile with parental control access settings implemented. Such access settings may be user-configured, e.g. configured by a parent, and include, merely by way of example, denying voice commands related to altering operation of a thermostat, unlocking a front door, or using the television based on a user-specified time of day, and so on. The access settings, upon lookup by the voice command engine 370, may allow and/or prevent operational signals from being transmitted to certain determined devices. In another example, the access settings may be associated with particular device identities and/or code words. In still other examples, access settings may include storing a user's mobile device identification and communications information, so that the user may communicate with the voice command engine 370 from a remote location. Other examples are possible. -
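The profile lookup described above, where access settings can deny a device outright or restrict it to a user-specified time of day, can be sketched as a simple permission check. The profile structure, device names, and time windows are all illustrative assumptions.

```python
# Illustrative permission lookup against per-speaker access settings:
# a device may be denied outright or limited to a time window (e.g.
# parental controls on television use). Field names are assumptions.
from datetime import time

access_settings = {
    "Sam": {"denied_devices": set(), "time_windows": {}},      # adult profile
    "Timmy": {                                                 # child profile with parental controls
        "denied_devices": {"thermostat_222", "front_door_lock"},
        "time_windows": {"television": (time(8, 0), time(20, 0))},
    },
}


def permission_status(speaker, device, now):
    profile = access_settings.get(speaker)
    if profile is None or device in profile["denied_devices"]:
        return "access denied"
    window = profile["time_windows"].get(device)
    if window and not (window[0] <= now <= window[1]):
        return "access denied"               # outside the allowed time of day
    return "access granted"
```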
Control phrases 518 may be received during the initial setup, downloaded, and/or otherwise pushed to the voice command engine 370 upon installation of a device to the home automation system 200 and/or from the home automation service server 112 as shown in FIG. 1. Each control phrase may include a word and/or a string of words that may indicate operational settings and/or changes thereof, e.g. “turn on,” “lower heat,” “degrees,” and so on. In some cases, the control phrases are associated with a device identity, and/or any other type of module shown in FIG. 5. In that case, the control phrases 518 include functions based on the device(s) connected to the home automation system 200. In other cases, the control phrases 518 may include user-configured commands. It is contemplated that the control phrases 518 include a master library of permissible commands that may be altered and built without an internet connection, e.g. pushed via a television distribution system and/or satellite. Further examples of control phrases may include, “show me,” e.g. for a voice command “Show me the front door;” “record,” e.g. for a voice command “Record channel 2 from 8 pm to 10 pm;” and “call,” e.g. for a voice command “Call 911.” As in the latter example, further processes, e.g. restriction-related authorizations, may be implemented by the voice command engine 370 prior to transmitting an operational signal to a device, e.g. a telephone. In another aspect, it is contemplated that control phrases may be detected by a microphone and processed using speech recognition by the voice command engine 370, and subsequently added to the database 502. - In a different aspect, a control phrase may simply include a device identity or nickname. For instance, if a device operationally toggles between two settings, e.g. on/off, the control phrase for operation of the device may be the device name itself. Upon detection of the device name as the control phrase, the
voice command engine 370 may transmit signals to the device to toggle between two or more functions. Merely by way of example, instead of speaking “Turn on the living room lights,” a speaker may simply state the device itself, “Living room lights.” The voice command engine 370 may transmit a signal to the living room lights to turn the lights on, or off if the lights are already determined to be in an on state by the voice command engine 370. The voice command engine 370 may first detect additional conditions via other devices in the home automation system 200 that may further facilitate which control setting to transmit to the device. For instance, the voice command engine 370 may detect that a user has entered a location, e.g. the living room, and/or determine a state of the device, e.g. that the living room lights are off. In that case, the voice command engine 370 detects the location of the user and a condition of the identified device, and generates and/or transmits an operational signal to render a second possible condition at the device. -
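The device-name-as-control-phrase behavior above, where speaking just "Living room lights" flips a two-state device, reduces to reading the current state and signaling the opposite one. The following sketch assumes a simple on/off state table; device names and the in-place state update (standing in for the transmitted operational signal) are illustrative.

```python
# Sketch of toggling a two-state device when its name alone is spoken.
# The device table and state strings are illustrative assumptions; updating
# the dict stands in for transmitting the operational signal to the device.
device_states = {"living room lights": "off"}


def handle_device_name_phrase(phrase):
    """If the phrase is a known device name, toggle that device's state."""
    name = phrase.strip().lower()
    if name not in device_states:
        return None                       # not a recognized device name
    new_state = "off" if device_states[name] == "on" else "on"
    device_states[name] = new_state       # stand-in for sending the signal
    return new_state
```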
Device identities 520 may include one or more device names or nicknames for devices in the home automation system 200 that are controllable via the voice command engine 370. The device identity may be user-configured, via detection by a microphone and subsequent speech recognition analysis by the voice command engine 370, and/or by a textual input from the user. The device identity may be linked to other data, such as voice samples, code words, and/or access settings, such that the voice command engine 370 may use the database 502 to look up which device should be implemented based on any other received data. In some cases, if a device is not recognized or otherwise not found in the database 502 after detection of a voice command, the voice command engine 370 may prompt the speaker to set up the device for voice recognition controls. In other cases, the voice command engine 370 may notify the speaker that no device is available for operation in the home automation system 200. - Further shown in
FIG. 5, the voice command engine 370 may include a speech recognition analyzer 522. The speech recognition analyzer 522 may determine one or more words or phrases in the received voice command that may be related to a function of a device, and determine the device to be controlled based on the function revealed in the voice command. The speech recognition analyzer 522 may further be utilized in identifying one or more devices in the home automation system 200 that are intended to be controlled by the received command. If detected speech is not recognized, the voice command engine 370 may prompt the speaker to repeat spoken voice command(s) and/or other responses, and/or enter the command via another medium, e.g. by textual input. It is contemplated that functions of the speech recognition analyzer 522 may be updated via a television distribution system and/or satellite system. Merely by way of example, the speech recognition analyzer 522 may be a multilingual platform, whereby a user may select one or more languages to implement for speech recognition. - Still referring to
FIG. 5, the voice command engine 370 may include a voice recognition analyzer 524 to perform voice recognition analysis as described above. The voice recognition analyzer 524 may also include multilingual functions, and include features that are updated via a television distribution system and/or satellite system. It is contemplated that the voice recognition analyzer 524, and/or the speech recognition analyzer 522, may be trained during initial setup and/or user profile setup via the voice command engine 370. - Referring again to
FIG. 5, the voice command engine 370 may include a permissions status analyzer 528 to determine if a voice command is permissible and if an operational signal based on the voice command should be generated and/or transmitted to the intended device. For example, the permissions status analyzer 528 may utilize the determined speaker identity, device identity, access settings, code words, and other determined data for the voice command to determine if an operational signal should be generated. The permissions status analyzer 528 may determine if additional prompts and information should be provided prior to transmission of such operational signals. Further, the permissions status analyzer 528 may determine the access granted or access denied status, as described previously. - Still referring to
FIG. 5, the voice command engine 370 may include a home automation systems interface 526. The home automation systems interface 526 may ensure communications between the voice command engine 370 and various different devices having different protocols in the home automation system 200 are seamlessly integrated. For example, the home automation systems interface 526 may implement device-specific communications protocols to ensure that signals transmitted to the devices from the engine 370, and/or received by the engine 370, comply with one another. In one example, the voice command engine 370 may detect a channel change at a remote control and prompt the user with a request to “Identify yourself,” prior to sending an operational signal for changing to a certain channel. In that case, the voice command engine 370 may provide an intermediary control between a device and its dedicated controller, e.g. the television and the remote control. In another aspect, the home automation systems interface 526 may permit the voice command engine 370 to signal to a television to lower or mute a volume level while audio output from the voice command engine 370 is underway, for example, through an intercom system. After a voice command sequence is completed, the home automation systems interface 526 may restore volume levels to previous settings. Other examples are possible. - As further shown in
FIG. 5, the voice command engine 370 may include a microphone interface 530 to receive voice input detected by one or more microphones that may be scattered about the home automation system 200. The microphone interface 530 may be configured to encode and/or decode any signals operatively communicated with the microphones. In one example, a microphone may be located at a remote control having one or more features of the voice command engine 370. The remote control may process and analyze the voice command to decrease data processing requirements at a television receiver, which may provide additional features of the voice command engine 370. In another example, the microphone interface 530 may be in operative communication with a microphone located in each room of a house. - As shown in
FIG. 5, the voice command engine 370 may further provide a conversations module 532. The conversations module 532 may be responsible for disambiguation processes, including carrying conversations and/or additional queries directed to receiving additional information related to a received voice command. For example, the voice command engine 370 may receive a voice command for “Close the windows in the living room” and the conversations module 532 may instruct the engine 370 to further inquire, “All windows?” for clarification. In general, the conversations module 532 may be utilized to interact with the speaker for further information at any point when such information is needed. The conversations module 532, and any other modules shown herein, may be in sync and operatively connected with any other modules of the voice command engine 370. It is noted that the conversations module 532, and/or any other modules shown herein, may be multilingual to facilitate multilingual operations of the voice command engine 370. Other examples are possible. - As further shown in
FIG. 5, the voice command engine 370 may include a notifications module 534, which may be responsible for various audio, textual, or other notifications output by the voice command engine 370. Such notifications may relay when a trigger is detected, such as when access is denied, when a new status change based on the voice command has been successfully implemented at the device, and so on. In another example, mass notifications may be transmitted from the voice command engine 370 to a plurality of recipients and their various devices. For example, the voice command engine may notify, via one or more communications networks, a plurality of mobile devices based on detection of an unauthorized voice command or other trigger. In still another aspect, the notifications module 534 may initiate communications with law enforcement and/or emergency responders, directly and/or via other devices in the home automation system 200. Other examples are possible. - Turning now to
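a concrete illustration of the notifications module 534: the trigger table, recipient list, and escalation rule below are hypothetical, intended only to show trigger-driven fan-out to a plurality of devices:

```python
def notify(trigger: str, devices):
    """Fan one trigger-specific message out to every registered device."""
    messages = {
        "access_denied": "Access denied for last voice command",
        "state_changed": "Device state updated successfully",
        "unauthorized": "Unauthorized voice command detected",
    }
    message = messages.get(trigger, "Home automation notification")
    deliveries = [(device, message) for device in devices]
    if trigger == "unauthorized":
        # Escalate: also alert emergency responders.
        deliveries.append(("emergency-services", message))
    return deliveries

deliveries = notify("unauthorized", ["phone-1", "phone-2", "wall-panel"])
```

In a real system each tuple would be handed to a transport (audio output, text message, intercom) rather than collected in a list. - Turning now to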
FIG. 6, another method 600 for controlling home automation systems with speaker-dependent commands is shown. The method 600 may include the steps shown in any order and any additional steps. Further, any steps may be optional. It is contemplated that the method 600 is provided for by the voice command engine 370 of FIG. 5. - As shown in
FIG. 6, the method 600 may include receiving a code word (step 602), which may include a user-configured or predefined code word. After receiving the code word, the voice command engine 370 may receive a voice command directed to a device in the home automation system (step 604). In some examples, the voice command engine 370 may prompt the speaker for additional input, such as additional instructions to clarify the voice command (step 606). Based on the received voice command and subsequent inputs, the voice command engine 370 may determine one or more device(s) to control, and/or a speaker identity (step 608). It is contemplated that with some voice commands, the speaker identity may not be required in order for the voice command engine 370 to implement an intended control at an intended device. Such special settings may be defined by the user during setup. - Still referring to
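FIG. 6, steps 602 through 608 can be sketched as the front half of the command pipeline below; the code word, the keyword matcher, and the returned dictionary shape are hypothetical assumptions, not details from the specification:

```python
CODE_WORD = "sesame"  # hypothetical user-configured code word (step 602)

def parse_request(utterances):
    """utterances: iterator of recognized phrases, code word first."""
    if next(utterances).lower() != CODE_WORD:
        return None                            # wrong code word: ignore input
    command = next(utterances)                 # the voice command (step 604)
    # Trivial stand-in for device resolution (step 608).
    device = "thermostat" if "temperature" in command else "unknown"
    return {"command": command, "device": device}

request = parse_request(iter(["sesame", "set the temperature to 70"]))
```

A fuller implementation would also resolve the speaker identity at step 608, e.g. from a voiceprint, before any permission check. - Still referring to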
FIG. 6, the method 600 may include determining if control according to the received voice command is permitted (step 610). In some examples where control is not permitted, e.g., the access is denied based on the permissions status analyzer 528, the voice command engine 370 may maintain a current state of the intended device (step 618) by not generating or otherwise transmitting an operational signal to the intended device. The voice command engine 370 may further output notification of the denied command. Such notifications may include audio, visual, and/or textual notification to the speaker via various devices in the home automation system 200. - Referring again to
FIG. 6, in another example, the voice command engine 370 may determine that a voice command is permitted for altering a state of the intended device. In that case, the voice command engine 370 may generate a protocol-specific operational command (step 612) according to requirements of the intended device, and transmit the operational command to the intended device via the communications protocol. In further examples, the voice command engine 370 may output notification relaying the new state of the device (step 616). Merely by way of example, after transmitting the operational command to the device, the voice command engine 370 may provide a follow-up query to the device for an update in order to determine if the changed operation has been implemented. After implementation is detected, the voice command engine may provide one or more notifications of the change to the user, and/or to other devices in the home automation system 200. Such notifications may include audio, visual, and/or textual notifications to the speaker via various devices in the home automation system 200. - Referring now to
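the permitted and denied branches of steps 610 through 618, the sketch below is a minimal illustration; the permission table, speaker labels, and byte-frame format are hypothetical assumptions:

```python
PERMISSIONS = {"parent": {"door-lock", "thermostat"},
               "child": {"thermostat"}}

def execute(speaker, device, operation):
    """Apply the permission check (step 610) before generating a command."""
    if device not in PERMISSIONS.get(speaker, set()):
        # Step 618: leave the device untouched and report the denial.
        return {"sent": None, "notice": f"Access denied for {speaker}"}
    # Step 612: protocol-specific operational command for the device.
    frame = f"{device}:{operation}".encode()
    # Step 616: notification relaying the new state.
    return {"sent": frame, "notice": f"{device} set to {operation}"}

denied = execute("child", "door-lock", "UNLOCK")
allowed = execute("parent", "door-lock", "UNLOCK")
```

Note that the denied branch produces a notification but no operational signal, matching the "maintain current state" behavior of step 618. - Referring now to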
FIG. 7, a computer system as illustrated in FIG. 7 may be incorporated as part of the previously described computerized devices, such as the wireless devices, television receivers, overlay devices, communication devices, any of the home automation devices, the television service provider system, the voice command engine 370, etc. FIG. 7 provides a schematic illustration of one embodiment of a computer system 700 that can perform various steps of the methods provided by various embodiments. It should be noted that FIG. 7 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 7, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner. - The
computer system 700 is shown comprising hardware elements that can be electrically coupled via a bus 705 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 710, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like); one or more input devices 715, which can include without limitation a mouse, a keyboard, a remote control, and/or the like; and one or more output devices 720, which can include without limitation a display device, a printer, and/or the like. - The
computer system 700 may further include (and/or be in communication with) one or more non-transitory storage devices 725, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like. - The
computer system 700 might also include a communications subsystem 730, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a cellular communication device, etc.), and/or the like. The communications subsystem 730 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein. In many embodiments, the computer system 700 will further comprise a working memory 735, which can include a RAM or ROM device, as described above. - The
computer system 700 also can comprise software elements, shown as being currently located within the working memory 735, including an operating system 740, device drivers, executable libraries, and/or other code, such as one or more application programs 745, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods. - A set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the non-transitory storage device(s) 725 described above. In some cases, the storage medium might be incorporated within a computer system, such as
computer system 700. In other embodiments, the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer system 700, and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 700 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code. - It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
- As mentioned above, in one aspect, some embodiments may employ a computer system (such as the computer system 700) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the
computer system 700 in response to processor 710 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 740 and/or other code, such as an application program 745) contained in the working memory 735. Such instructions may be read into the working memory 735 from another computer-readable medium, such as one or more of the non-transitory storage device(s) 725. Merely by way of example, execution of the sequences of instructions contained in the working memory 735 might cause the processor(s) 710 to perform one or more procedures of the methods described herein. - The terms “machine-readable medium,” “computer-readable storage medium” and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. These media may be non-transitory. In an embodiment implemented using the
computer system 700, various computer-readable media might be involved in providing instructions/code to processor(s) 710 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media include, for example, optical and/or magnetic disks, such as the non-transitory storage device(s) 725. Volatile media include, without limitation, dynamic memory, such as the working memory 735. - Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, any other physical medium with patterns of marks, a RAM, a PROM, EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 710 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the
computer system 700. - The communications subsystem 730 (and/or components thereof) generally will receive signals, and the
bus 705 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 735, from which the processor(s) 710 retrieves and executes the instructions. The instructions received by the working memory 735 may optionally be stored on a non-transitory storage device 725 either before or after execution by the processor(s) 710. - It should further be understood that the components of
computer system 700 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 700 may be similarly distributed. As such, computer system 700 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 700 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context. - The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
- Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
- Also, configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
- Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of steps may be undertaken before, during, or after the above elements are considered.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/566,977 US20150162006A1 (en) | 2013-12-11 | 2014-12-11 | Voice-recognition home automation system for speaker-dependent commands |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361914856P | 2013-12-11 | 2013-12-11 | |
US14/566,977 US20150162006A1 (en) | 2013-12-11 | 2014-12-11 | Voice-recognition home automation system for speaker-dependent commands |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150162006A1 true US20150162006A1 (en) | 2015-06-11 |
Family
ID=53270607
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/475,252 Active 2036-04-27 US9900177B2 (en) | 2013-12-11 | 2014-09-02 | Maintaining up-to-date home automation models |
US14/485,188 Abandoned US20150163535A1 (en) | 2013-12-11 | 2014-09-12 | Home automation system integration |
US14/485,038 Active 2036-04-09 US9912492B2 (en) | 2013-12-11 | 2014-09-12 | Detection and mitigation of water leaks with home automation |
US14/553,763 Active 2036-06-01 US10027503B2 (en) | 2013-12-11 | 2014-11-25 | Integrated door locking and state detection systems and methods |
US14/565,853 Active 2036-01-07 US9838736B2 (en) | 2013-12-11 | 2014-12-10 | Home automation bubble architecture |
US14/567,502 Abandoned US20150160635A1 (en) | 2013-12-11 | 2014-12-11 | Multi-Tiered Feedback-Controlled Home Automation Notifications |
US14/566,977 Abandoned US20150162006A1 (en) | 2013-12-11 | 2014-12-11 | Voice-recognition home automation system for speaker-dependent commands |
Family Applications Before (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/475,252 Active 2036-04-27 US9900177B2 (en) | 2013-12-11 | 2014-09-02 | Maintaining up-to-date home automation models |
US14/485,188 Abandoned US20150163535A1 (en) | 2013-12-11 | 2014-09-12 | Home automation system integration |
US14/485,038 Active 2036-04-09 US9912492B2 (en) | 2013-12-11 | 2014-09-12 | Detection and mitigation of water leaks with home automation |
US14/553,763 Active 2036-06-01 US10027503B2 (en) | 2013-12-11 | 2014-11-25 | Integrated door locking and state detection systems and methods |
US14/565,853 Active 2036-01-07 US9838736B2 (en) | 2013-12-11 | 2014-12-10 | Home automation bubble architecture |
US14/567,502 Abandoned US20150160635A1 (en) | 2013-12-11 | 2014-12-11 | Multi-Tiered Feedback-Controlled Home Automation Notifications |
Country Status (6)
Country | Link |
---|---|
US (7) | US9900177B2 (en) |
EP (2) | EP3080677B1 (en) |
CN (1) | CN105814555B (en) |
CA (2) | CA2930990C (en) |
MX (2) | MX362800B (en) |
WO (3) | WO2015088603A1 (en) |
US11238294B2 (en) | 2018-10-08 | 2022-02-01 | Google Llc | Enrollment with an automated assistant |
US11238870B2 (en) * | 2017-07-05 | 2022-02-01 | Alibaba Group Holding Limited | Interaction method, electronic device, and server |
US11244687B2 (en) | 2016-07-06 | 2022-02-08 | Pcms Holdings, Inc. | System and method for customizing smart home speech interfaces using personalized speech profiles |
US11257504B2 (en) | 2014-05-30 | 2022-02-22 | Apple Inc. | Intelligent assistant for home automation |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11308959B2 (en) | 2020-02-11 | 2022-04-19 | Spotify Ab | Dynamic adjustment of wake word acceptance tolerance thresholds in voice-controlled devices |
US11316974B2 (en) * | 2014-07-09 | 2022-04-26 | Ooma, Inc. | Cloud-based assistive services for use in telecommunications and on premise devices |
US11314214B2 (en) | 2017-09-15 | 2022-04-26 | Kohler Co. | Geographic analysis of water conditions |
US11315405B2 (en) | 2014-07-09 | 2022-04-26 | Ooma, Inc. | Systems and methods for provisioning appliance devices |
US11321116B2 (en) | 2012-05-15 | 2022-05-03 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US11328722B2 (en) | 2020-02-11 | 2022-05-10 | Spotify Ab | Systems and methods for generating a singular voice audio stream |
US11348582B2 (en) | 2008-10-02 | 2022-05-31 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US11355111B2 (en) * | 2017-01-24 | 2022-06-07 | Honeywell International Inc. | Voice control of an integrated room automation system |
US11355104B2 (en) * | 2016-02-02 | 2022-06-07 | Amazon Technologies, Inc. | Post-speech recognition request surplus detection and prevention |
US11381903B2 (en) | 2014-02-14 | 2022-07-05 | Sonic Blocks Inc. | Modular quick-connect A/V system and methods thereof |
US11380310B2 (en) | 2017-05-12 | 2022-07-05 | Apple Inc. | Low-latency intelligent automated assistant |
US11388291B2 (en) | 2013-03-14 | 2022-07-12 | Apple Inc. | System and method for processing voicemail |
US11405466B2 (en) | 2017-05-12 | 2022-08-02 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US20220247743A1 (en) * | 2014-10-03 | 2022-08-04 | Gopro, Inc. | Authenticating a limited input device via an authenticated application |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US11423886B2 (en) | 2010-01-18 | 2022-08-23 | Apple Inc. | Task flow identification based on user intent |
US11431642B2 (en) | 2018-06-01 | 2022-08-30 | Apple Inc. | Variable latency device coordination |
US20220277727A1 (en) * | 2016-12-30 | 2022-09-01 | Google Llc | Conversation-aware proactive notifications for a voice interface device |
US11467802B2 (en) | 2017-05-11 | 2022-10-11 | Apple Inc. | Maintaining privacy of personal information |
US11468282B2 (en) | 2015-05-15 | 2022-10-11 | Apple Inc. | Virtual assistant in a communication session |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
US11500672B2 (en) | 2015-09-08 | 2022-11-15 | Apple Inc. | Distributed personal assistant |
US11516537B2 (en) | 2014-06-30 | 2022-11-29 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US11522619B2 (en) | 2019-03-08 | 2022-12-06 | Rovi Guides, Inc. | Frequency pairing for device synchronization |
US11521609B2 (en) * | 2017-09-28 | 2022-12-06 | Kyocera Corporation | Voice command system and voice command method |
US11526368B2 (en) | 2015-11-06 | 2022-12-13 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11532306B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Detecting a trigger of a digital assistant |
US20230031831A1 (en) * | 2017-05-12 | 2023-02-02 | Google Llc | Systems, methods, and devices for activity monitoring via a home assistant |
US11580990B2 (en) | 2017-05-12 | 2023-02-14 | Apple Inc. | User-specific acoustic models |
US11592723B2 (en) | 2009-12-22 | 2023-02-28 | View, Inc. | Automated commissioning of controllers in a window network |
US11599331B2 (en) | 2017-05-11 | 2023-03-07 | Apple Inc. | Maintaining privacy of personal information |
US20230070082A1 (en) * | 2021-07-26 | 2023-03-09 | LifePod Solutions, Inc. | Systems and methods for managing voice environments and voice routines |
US11627012B2 (en) | 2018-10-09 | 2023-04-11 | NewTekSol, LLC | Home automation management system |
US11641505B1 (en) * | 2022-06-13 | 2023-05-02 | Roku, Inc. | Speaker-identification model for controlling operation of a media player |
US11646974B2 (en) | 2015-05-08 | 2023-05-09 | Ooma, Inc. | Systems and methods for end point data communications anonymization for a communications hub |
US11657813B2 (en) | 2019-05-31 | 2023-05-23 | Apple Inc. | Voice identification in digital assistant systems |
US11656884B2 (en) | 2017-01-09 | 2023-05-23 | Apple Inc. | Application integration with a digital assistant |
US11671920B2 (en) | 2007-04-03 | 2023-06-06 | Apple Inc. | Method and system for operating a multifunction portable electronic device using voice-activation |
US11675829B2 (en) | 2017-05-16 | 2023-06-13 | Apple Inc. | Intelligent automated assistant for media exploration |
US11675491B2 (en) | 2019-05-06 | 2023-06-13 | Apple Inc. | User configurable task triggers |
US11696060B2 (en) | 2020-07-21 | 2023-07-04 | Apple Inc. | User identification using headphones |
US11710482B2 (en) | 2018-03-26 | 2023-07-25 | Apple Inc. | Natural assistant interaction |
US11727219B2 (en) | 2013-06-09 | 2023-08-15 | Apple Inc. | System and method for inferring user intent from speech inputs |
US11733660B2 (en) | 2014-03-05 | 2023-08-22 | View, Inc. | Monitoring sites containing switchable optical devices and controllers |
US11765209B2 (en) | 2020-05-11 | 2023-09-19 | Apple Inc. | Digital assistant hardware abstraction |
US11763663B2 (en) | 2014-05-20 | 2023-09-19 | Ooma, Inc. | Community security monitoring and control |
US11783815B2 (en) | 2019-03-18 | 2023-10-10 | Apple Inc. | Multimodality in digital assistant systems |
US11790914B2 (en) | 2019-06-01 | 2023-10-17 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11798547B2 (en) | 2013-03-15 | 2023-10-24 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US11804215B1 (en) * | 2022-04-29 | 2023-10-31 | Apple Inc. | Sonic responses |
US11809483B2 (en) | 2015-09-08 | 2023-11-07 | Apple Inc. | Intelligent automated assistant for media search and playback |
US11809783B2 (en) | 2016-06-11 | 2023-11-07 | Apple Inc. | Intelligent device arbitration and control |
US11822601B2 (en) | 2019-03-15 | 2023-11-21 | Spotify Ab | Ensemble-based data comparison |
US11838734B2 (en) | 2020-07-20 | 2023-12-05 | Apple Inc. | Multi-device audio adjustment coordination |
US11854539B2 (en) | 2018-05-07 | 2023-12-26 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US11853647B2 (en) | 2015-12-23 | 2023-12-26 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US11853536B2 (en) | 2015-09-08 | 2023-12-26 | Apple Inc. | Intelligent automated assistant in a media environment |
US11886805B2 (en) | 2015-11-09 | 2024-01-30 | Apple Inc. | Unconventional virtual assistant interactions |
US11893992B2 (en) | 2018-09-28 | 2024-02-06 | Apple Inc. | Multi-modal inputs for voice commands |
US11899566B1 (en) | 2020-05-15 | 2024-02-13 | Google Llc | Training and/or using machine learning model(s) for automatic generation of test case(s) for source code |
US11914848B2 (en) | 2020-05-11 | 2024-02-27 | Apple Inc. | Providing relevant data items based on context |
US11947873B2 (en) | 2015-06-29 | 2024-04-02 | Apple Inc. | Virtual assistant for media playback |
US11954405B2 (en) | 2022-11-07 | 2024-04-09 | Apple Inc. | Zero latency digital assistant |
Families Citing this family (105)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9339691B2 (en) | 2012-01-05 | 2016-05-17 | Icon Health & Fitness, Inc. | System and method for controlling an exercise device |
US10397013B1 (en) | 2012-04-11 | 2019-08-27 | Google Llc | User interfaces, systems and methods for configuring smart devices for interoperability with a smart hub device |
US9198204B2 (en) | 2012-04-11 | 2015-11-24 | Google Inc. | Apparatus and method for seamless commissioning of wireless devices |
US9697664B2 (en) | 2012-04-11 | 2017-07-04 | Digilock Asia Limited | Electronic locking systems, methods, and apparatus |
US10075334B1 (en) | 2012-04-11 | 2018-09-11 | Google Llc | Systems and methods for commissioning a smart hub device |
US9626859B2 (en) | 2012-04-11 | 2017-04-18 | Digilock Asia Limited | Electronic locking systems, methods, and apparatus |
US20150292240A1 (en) * | 2012-04-11 | 2015-10-15 | Bielet, Inc. | Alignment aid for electronic locking device |
US10598741B2 (en) * | 2013-02-20 | 2020-03-24 | D & D Group Pty Ltd | Sensor configuration for a latch assembly |
AU2013202672A1 (en) * | 2013-02-20 | 2014-09-04 | D & D Group Pty Ltd | Latch assembly |
WO2014153158A1 (en) | 2013-03-14 | 2014-09-25 | Icon Health & Fitness, Inc. | Strength training apparatus with flywheel and related methods |
US10680905B1 (en) | 2013-12-06 | 2020-06-09 | Mobile Iron, Inc. | Application help desk |
US10277698B1 (en) * | 2013-12-12 | 2019-04-30 | Mobile Iron, Inc. | Remote display using a proxy |
US10088818B1 (en) | 2013-12-23 | 2018-10-02 | Google Llc | Systems and methods for programming and controlling devices with sensor data and learning |
EP3623020A1 (en) | 2013-12-26 | 2020-03-18 | Icon Health & Fitness, Inc. | Magnetic resistance mechanism in a cable machine |
US9672717B1 (en) * | 2013-12-27 | 2017-06-06 | Alarm.Com Incorporated | Contextual communication of events |
US10433612B2 (en) | 2014-03-10 | 2019-10-08 | Icon Health & Fitness, Inc. | Pressure sensor to quantify work |
US9538235B2 (en) * | 2014-03-19 | 2017-01-03 | Verizon Patent And Licensing Inc. | Streaming an interactive program guide used for media content and home automation |
US20150278556A1 (en) * | 2014-03-28 | 2015-10-01 | Noam Avni | Centralized security for a computing device |
US10425479B2 (en) * | 2014-04-24 | 2019-09-24 | Vivint, Inc. | Saving video clips on a storage of limited size based on priority |
US10102585B1 (en) | 2014-04-25 | 2018-10-16 | State Farm Mutual Automobile Insurance Company | Systems and methods for automatically mitigating risk of property damage |
CN106470739B (en) | 2014-06-09 | 2019-06-21 | 爱康保健健身有限公司 | It is incorporated to the funicular system of treadmill |
WO2015195965A1 (en) | 2014-06-20 | 2015-12-23 | Icon Health & Fitness, Inc. | Post workout massage device |
US10764081B2 (en) * | 2014-07-28 | 2020-09-01 | Vivint, Inc. | Asynchronous communications using home automation system |
US20170097621A1 (en) * | 2014-09-10 | 2017-04-06 | Crestron Electronics, Inc. | Configuring a control sysem |
US10748539B2 (en) | 2014-09-10 | 2020-08-18 | Crestron Electronics, Inc. | Acoustic sensory network |
US9009805B1 (en) | 2014-09-30 | 2015-04-14 | Google Inc. | Method and system for provisioning an electronic device |
US10515372B1 (en) * | 2014-10-07 | 2019-12-24 | State Farm Mutual Automobile Insurance Company | Systems and methods for managing building code compliance for a property |
US20160127483A1 (en) * | 2014-10-31 | 2016-05-05 | Xiaomi Inc. | Method and device for displaying item content |
US10601604B2 (en) * | 2014-11-12 | 2020-03-24 | Google Llc | Data processing systems and methods for smart hub devices |
CN104483865B (en) * | 2014-12-26 | 2017-11-10 | 小米科技有限责任公司 | The installation implementation method and device of intelligent hardware devices |
US9652974B1 (en) * | 2014-12-19 | 2017-05-16 | SureView Systems, LLC | Heuristic electronic monitoring security device association |
US10017186B2 (en) * | 2014-12-19 | 2018-07-10 | Bosch Automotive Service Solutions Inc. | System and method for optimizing vehicle settings |
CN107110537B (en) * | 2014-12-22 | 2021-02-02 | 特灵国际有限公司 | Occupancy sensing and building control using mobile devices |
US10391361B2 (en) | 2015-02-27 | 2019-08-27 | Icon Health & Fitness, Inc. | Simulating real-world terrain on an exercise device |
US10742938B2 (en) * | 2015-03-07 | 2020-08-11 | Skybell Technologies Ip, Llc | Garage door communication systems and methods |
CN104808499B (en) * | 2015-03-09 | 2019-01-15 | 联想(北京)有限公司 | A kind of method and control device based on linkage rule control smart home device |
FR3035560B1 (en) * | 2015-04-21 | 2021-03-05 | Overkiz | CONFIGURATION, SUPERVISION AND CONTROL PROCEDURES FOR AT LEAST ONE HOME AUTOMATION INSTALLATION OF A BUILDING |
KR102410903B1 (en) * | 2015-06-12 | 2022-06-21 | 삼성전자 주식회사 | Room management system and service setting method |
DE102015110139A1 (en) * | 2015-06-24 | 2016-12-29 | Emka Beschlagteile Gmbh & Co. Kg | Access control system for a variety of closures |
US10200208B2 (en) | 2015-06-30 | 2019-02-05 | K4Connect Inc. | Home automation system including cloud and home message queue synchronization and related methods |
US11227674B2 (en) * | 2015-06-30 | 2022-01-18 | K4Connect Inc. | Home automation system generating user health score and related methods |
ES2938349T3 (en) | 2015-07-13 | 2023-04-10 | Carrier Corp | security automation system |
US20180283707A1 (en) * | 2015-09-30 | 2018-10-04 | Koninklijke Philips N.V. | Gas filtration system and method |
KR102459590B1 (en) * | 2015-12-24 | 2022-10-26 | 엘지전자 주식회사 | Image display apparatus |
EP3188573A1 (en) * | 2015-12-29 | 2017-07-05 | "Stabil" Piotr Narczyk | A system and method of controlling electric power supply in a room of a temporary use |
EP3403146A4 (en) * | 2016-01-15 | 2019-08-21 | iRobot Corporation | Autonomous monitoring robot systems |
US10623914B2 (en) | 2016-02-17 | 2020-04-14 | Tracfone Wireless, Inc. | Device, system, and process for providing real-time short message data services for mission critical communications |
USD803254S1 (en) * | 2016-03-01 | 2017-11-21 | Mengming Luo | Display screen with keyboard graphical user interface |
US9924021B2 (en) * | 2016-03-11 | 2018-03-20 | Distech Controls Inc. | Environment controllers capable of controlling a plurality of smart light fixtures |
US10625137B2 (en) | 2016-03-18 | 2020-04-21 | Icon Health & Fitness, Inc. | Coordinated displays in an exercise device |
US10493349B2 (en) | 2016-03-18 | 2019-12-03 | Icon Health & Fitness, Inc. | Display on exercise device |
US10272317B2 (en) | 2016-03-18 | 2019-04-30 | Icon Health & Fitness, Inc. | Lighted pace feature in a treadmill |
WO2017165166A1 (en) | 2016-03-21 | 2017-09-28 | Carrier Corporation | Intrusion security panel with remote assistance through simulated user interface |
US20170303007A1 (en) * | 2016-04-15 | 2017-10-19 | Arris Enterprises Llc | System and method for trasmitting warning signal based on emergency alert system signal |
FI20165338A (en) * | 2016-04-18 | 2017-10-19 | Rollock Oy | LOCK SYSTEM, LOCK SYSTEM SERVER AND DOOR LOCK |
US10319210B2 (en) | 2016-05-31 | 2019-06-11 | Honeywell International Inc. | Translating building automation events into mobile notifications |
US10270815B1 (en) * | 2016-06-07 | 2019-04-23 | Amazon Technologies, Inc. | Enabling communications between a controlling device and a network-controlled device via a network-connected device service over a mobile communications network |
FR3053497B1 (en) * | 2016-06-29 | 2019-09-13 | 4T Sa | METHOD FOR ENHANCING THE SECURITY OF A PAY-TV SYSTEM BASED ON PERIODIC RETRO-COMMUNICATION |
US10285344B2 (en) | 2016-09-16 | 2019-05-14 | Hunter Industries, Inc. | Irrigation controller with reversible operator controls |
US10671705B2 (en) | 2016-09-28 | 2020-06-02 | Icon Health & Fitness, Inc. | Customizing recipe recommendations |
US10662686B2 (en) | 2016-09-30 | 2020-05-26 | Barrette Outdoor Living, Inc. | Magnetic safety gate latch |
US10089798B2 (en) * | 2016-10-28 | 2018-10-02 | Fca Us Llc | Vehicle with variable position ajar sensor |
US10496048B2 (en) * | 2016-11-02 | 2019-12-03 | Edison Labs, Inc. | Switch terminal methods with wiring components secured to circuitry wiring without external live points of contact |
US10642231B1 (en) * | 2016-11-02 | 2020-05-05 | Edison Labs, Inc. | Switch terminal system with an activity assistant |
KR20180062036A (en) * | 2016-11-30 | 2018-06-08 | 삼성전자주식회사 | Apparatus and method for controlling light |
US20180158310A1 (en) * | 2016-12-01 | 2018-06-07 | Leroy Sieck | Personal Alert and Egress Control Assembly |
AU2016277697A1 (en) * | 2016-12-22 | 2018-07-12 | Canon Kabushiki Kaisha | Method, system and apparatus for providing access to videos |
KR20180086662A (en) * | 2017-01-23 | 2018-08-01 | 한화에어로스페이스 주식회사 | The Apparatus And The System For Monitoring |
US10304447B2 (en) * | 2017-01-25 | 2019-05-28 | International Business Machines Corporation | Conflict resolution enhancement system |
CA3054798A1 (en) | 2017-02-28 | 2018-09-07 | Lutron Technology Company Llc | Communicating with and controlling load control systems |
WO2018187168A1 (en) * | 2017-04-05 | 2018-10-11 | Ring Inc. | Providing status information for secondary devices with video footage from audio/video recording and communication devices |
US11187223B2 (en) | 2017-04-10 | 2021-11-30 | Logical Concepts, Inc. | Home flood prevention appliance system |
US11022124B2 (en) * | 2017-04-10 | 2021-06-01 | Logical Concepts, Inc. | Whole home water appliance system |
US10965899B1 (en) * | 2017-04-17 | 2021-03-30 | Alarm.Com Incorporated | System and method for integration of a television into a connected-home monitoring system |
US10769914B2 (en) * | 2017-06-07 | 2020-09-08 | Amazon Technologies, Inc. | Informative image data generation using audio/video recording and communication devices |
KR20180137913A (en) * | 2017-06-20 | 2018-12-28 | 삼성전자주식회사 | Electronic device for playing contents and operating method thereof |
US10452046B2 (en) * | 2017-06-29 | 2019-10-22 | Midea Group Co., Ltd. | Cooking appliance control of residential heating, ventilation and/or air conditioning (HVAC) system |
WO2019028219A1 (en) * | 2017-08-02 | 2019-02-07 | Objectvideo Labs, Llc | Supervising property access with portable camera |
EP3679539B1 (en) * | 2017-09-06 | 2021-12-22 | Landis+Gyr Innovations, Inc. | Voice-activated energy management system |
WO2019059954A1 (en) * | 2017-09-19 | 2019-03-28 | Rovi Guides, Inc. | System and methods for navigating internet appliances using a media guidance application |
BR102018068874A8 (en) * | 2017-09-19 | 2019-08-13 | Hunter Douglas | methods and apparatus for controlling architectural roofing |
US10217347B1 (en) | 2017-09-20 | 2019-02-26 | Robert William Lawson | System for monitoring and providing alerts |
US11360736B1 (en) * | 2017-11-03 | 2022-06-14 | Amazon Technologies, Inc. | System command processing |
US11259076B2 (en) * | 2017-12-13 | 2022-02-22 | Google Llc | Tactile launching of an asymmetric visual communication session |
US11313151B2 (en) * | 2017-12-20 | 2022-04-26 | Schlage Lock Company Llc | Sensor for rim latch roller strike |
US20190284853A1 (en) * | 2018-03-16 | 2019-09-19 | Fire Door Solutions Llc | Supplemental door lock |
US11094180B1 (en) | 2018-04-09 | 2021-08-17 | State Farm Mutual Automobile Insurance Company | Sensing peripheral heuristic evidence, reinforcement, and engagement system |
US20190392691A1 (en) * | 2018-06-26 | 2019-12-26 | The Chamberlain Group, Inc. | Entry security system and method |
US10837216B2 (en) | 2018-06-26 | 2020-11-17 | The Chamberlain Group, Inc. | Garage entry system and method |
CN112567309A (en) * | 2018-07-11 | 2021-03-26 | 卡罗马工业有限公司 | Water flow management system |
US11165599B2 (en) | 2018-09-24 | 2021-11-02 | International Business Machines Corporation | Cognitive component selection and implementation |
CN112399875A (en) | 2018-11-30 | 2021-02-23 | 开利公司 | Fire extinguishing system remote monitoring |
US10762766B2 (en) * | 2019-02-01 | 2020-09-01 | SimpliSafe, Inc. | Alarm system with door lock |
JP2022523564A (en) | 2019-03-04 | 2022-04-25 | アイオーカレンツ, インコーポレイテッド | Data compression and communication using machine learning |
US11639617B1 (en) | 2019-04-03 | 2023-05-02 | The Chamberlain Group Llc | Access control system and method |
EP4002858A4 (en) * | 2019-07-16 | 2023-03-29 | Lg Electronics Inc. | Display device for controlling one or more home appliances in consideration of viewing situation |
US11069163B2 (en) * | 2019-10-30 | 2021-07-20 | Cirque Corporation | Closure member sensor |
EP3883235A1 (en) | 2020-03-17 | 2021-09-22 | Aptiv Technologies Limited | Camera control modules and methods |
CN111663852A (en) * | 2020-06-08 | 2020-09-15 | 广东科徕尼智能科技有限公司 | Back locking monitoring method and device of intelligent lock and storage medium |
US20220026305A1 (en) * | 2020-07-24 | 2022-01-27 | Alarm.Com Incorporated | Dynamic water leak detection |
CN112031543B (en) * | 2020-09-08 | 2022-04-26 | 北京紫光安芯科技有限公司 | Intelligent door lock and state detection method and device thereof |
US11586485B2 (en) * | 2020-11-11 | 2023-02-21 | Shopify Inc. | Methods and systems for generating notifications |
US20220262180A1 (en) * | 2021-02-17 | 2022-08-18 | Carrier Corporation | Translator for Access to Smart Locks |
US20230008687A1 (en) * | 2021-07-07 | 2023-01-12 | Palatiumcare, Inc. | Systems and methods for securing doors with magnetic locks |
US11523190B1 (en) * | 2021-12-17 | 2022-12-06 | Google Llc | Generating notifications that provide context for predicted content interruptions |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6107935A (en) * | 1998-02-11 | 2000-08-22 | International Business Machines Corporation | Systems and methods for access filtering employing relaxed recognition constraints |
US6119088A (en) * | 1998-03-03 | 2000-09-12 | Ciluffo; Gary | Appliance control programmer using voice recognition |
US20010012998A1 (en) * | 1999-12-17 | 2001-08-09 | Pierrick Jouet | Voice recognition process and device, associated remote control device |
US6337899B1 (en) * | 1998-03-31 | 2002-01-08 | International Business Machines Corporation | Speaker verification for authorizing updates to user subscription service received by internet service provider (ISP) using an intelligent peripheral (IP) in an advanced intelligent network (AIN) |
US6415257B1 (en) * | 1999-08-26 | 2002-07-02 | Matsushita Electric Industrial Co., Ltd. | System for identifying and adapting a TV-user profile by means of speech technology |
US20020193989A1 (en) * | 1999-05-21 | 2002-12-19 | Michael Geilhufe | Method and apparatus for identifying voice controlled devices |
US20030005431A1 (en) * | 2001-07-02 | 2003-01-02 | Sony Corporation | PVR-based system and method for TV content control using voice recognition |
US20040143838A1 (en) * | 2003-01-17 | 2004-07-22 | Mark Rose | Video access management system |
US20050049862A1 (en) * | 2003-09-03 | 2005-03-03 | Samsung Electronics Co., Ltd. | Audio/video apparatus and method for providing personalized services through voice and speaker recognition |
US6931104B1 (en) * | 1996-09-03 | 2005-08-16 | Koninklijke Philips Electronics N.V. | Intelligent call processing platform for home telephone system |
US7103545B2 (en) * | 2000-08-07 | 2006-09-05 | Shin Caterpillar Mitsubishi Ltd. | Voice-actuated machine body control apparatus for construction machine |
US7260538B2 (en) * | 2002-01-08 | 2007-08-21 | Promptu Systems Corporation | Method and apparatus for voice control of a television control device |
US7529677B1 (en) * | 2005-01-21 | 2009-05-05 | Itt Manufacturing Enterprises, Inc. | Methods and apparatus for remotely processing locally generated commands to control a local device |
US20090271203A1 (en) * | 2008-04-25 | 2009-10-29 | Keith Resch | Voice-activated remote control service |
US20100083371A1 (en) * | 2008-10-01 | 2010-04-01 | Christopher Lee Bennetts | User Access Control System And Method |
US20100131280A1 (en) * | 2008-11-25 | 2010-05-27 | General Electric Company | Voice recognition system for medical devices |
US20120316876A1 (en) * | 2011-06-10 | 2012-12-13 | Seokbok Jang | Display Device, Method for Thereof and Voice Recognition System |
US20130238326A1 (en) * | 2012-03-08 | 2013-09-12 | Lg Electronics Inc. | Apparatus and method for multiple device voice control |
Family Cites Families (452)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR834856A (en) * | 1938-03-05 | 1938-12-05 | Device for controlling an electrical circuit | |
FR947943A (en) * | 1947-06-16 | 1949-07-18 | Muse Svoekhotoff | Safety lock |
US3803575A (en) * | 1971-02-15 | 1974-04-09 | M Gotanda | Device for setting-up a power source of electrical alarm |
JPS5247669Y2 (en) * | 1972-08-31 | 1977-10-28 | ||
US4127966A (en) | 1977-08-22 | 1978-12-05 | New Pneumatics, Inc. | Locking and emergency release system for barred windows |
US4386436A (en) | 1981-02-27 | 1983-05-31 | Rca Corporation | Television remote control system for selectively controlling external apparatus through the AC power line |
US4581606A (en) | 1982-08-30 | 1986-04-08 | Isotec Industries Limited | Central monitor for home security system |
EP0120345B1 (en) | 1983-03-23 | 1988-03-02 | TELEFUNKEN Fernseh und Rundfunk GmbH | Remote control apparatus controlling various functions of one or more devices |
JPS61294080A (en) | 1985-06-24 | 1986-12-24 | ワイケイケイ株式会社 | Detector for data of movement of door of automatic door |
DE3707284A1 (en) * | 1987-03-06 | 1988-09-15 | Winkhaus Fa August | ELECTRONIC DOOR LOCK |
US5400246A (en) | 1989-05-09 | 1995-03-21 | Ansan Industries, Ltd. | Peripheral data acquisition, monitor, and adaptive control system via personal computer |
US4959713A (en) | 1989-10-10 | 1990-09-25 | Matsushita Electric Industrial Co., Ltd. | Home automation system |
US6728832B2 (en) | 1990-02-26 | 2004-04-27 | Hitachi, Ltd. | Distribution of I/O requests across multiple disk units |
DE4012253C1 (en) * | 1990-04-17 | 1991-04-11 | Aug. Winkhaus Gmbh & Co Kg, 4404 Telgte, De | Control magnet carrier - has U=shaped bin with bridge port having attachment for fixing to lock plate and two shanks |
WO1993020544A1 (en) | 1992-03-31 | 1993-10-14 | Barbeau Paul E | Fire crisis management expert system |
US7082359B2 (en) * | 1995-06-07 | 2006-07-25 | Automotive Technologies International, Inc. | Vehicular information and monitoring system and methods |
JP3489214B2 (en) | 1994-10-05 | 2004-01-19 | ソニー株式会社 | Communication circuit |
DE4445730A1 (en) * | 1994-12-21 | 1996-07-18 | Grundig Emv | Device for arming an alarm system and for monitoring an entrance door |
DE19518527A1 (en) * | 1995-05-19 | 1996-11-21 | Winkhaus Fa August | Monitorable locking arrangement for a window or a door or the like |
KR0141751B1 (en) | 1995-06-13 | 1998-06-15 | 구자홍 | Method for sleep preservation confirmation of television |
KR100233516B1 (en) | 1995-08-28 | 1999-12-01 | 윤종용 | Home automation device for using digital tv receiver |
US6142913A (en) | 1995-10-11 | 2000-11-07 | Ewert; Bruce | Dynamic real time exercise video apparatus and method |
US5805442A (en) | 1996-05-30 | 1998-09-08 | Control Technology Corporation | Distributed interface architecture for programmable industrial control systems |
US5926090A (en) | 1996-08-26 | 1999-07-20 | Sharper Image Corporation | Lost article detector unit with adaptive actuation signal recognition and visual and/or audible locating signal |
AU735562B2 (en) | 1996-10-04 | 2001-07-12 | Bruce Ewert | Dynamic real time exercise video apparatus and method |
US6111517A (en) | 1996-12-30 | 2000-08-29 | Visionics Corporation | Continuous video monitoring using face recognition for access control |
US5886638A (en) | 1997-02-19 | 1999-03-23 | Ranco Inc. Of Delaware | Method and apparatus for testing a carbon monoxide sensor |
ID24894A (en) | 1997-06-25 | 2000-08-31 | Samsung Electronics Co Ltd Cs | METHOD AND APPARATUS FOR THREE-OTO DEVELOPMENTS A HOME NETWORK |
US6377858B1 (en) | 1997-10-02 | 2002-04-23 | Lucent Technologies Inc. | System and method for recording and controlling on/off events of devices of a dwelling |
US6107918A (en) | 1997-11-25 | 2000-08-22 | Micron Electronics, Inc. | Method for personal computer-based home surveillance |
US5970030A (en) | 1997-12-02 | 1999-10-19 | International Business Machines Corporation | Automated data storage library component exchange using media accessor |
US6104334A (en) | 1997-12-31 | 2000-08-15 | Eremote, Inc. | Portable internet-enabled controller and information browser for consumer devices |
US6445287B1 (en) * | 2000-02-28 | 2002-09-03 | Donnelly Corporation | Tire inflation assistance monitoring system |
US6081758A (en) | 1998-04-03 | 2000-06-27 | Sony Corporation | System for automatically unlocking an automotive child safety door lock |
US6891838B1 (en) * | 1998-06-22 | 2005-05-10 | Statsignal Ipc, Llc | System and method for monitoring and controlling residential devices |
US6914893B2 (en) * | 1998-06-22 | 2005-07-05 | Statsignal Ipc, Llc | System and method for monitoring and controlling remote devices |
US6543051B1 (en) | 1998-08-07 | 2003-04-01 | Scientific-Atlanta, Inc. | Emergency alert system |
US7103511B2 (en) * | 1998-10-14 | 2006-09-05 | Statsignal Ipc, Llc | Wireless communication networks for providing remote monitoring of devices |
US6405284B1 (en) | 1998-10-23 | 2002-06-11 | Oracle Corporation | Distributing data across multiple data storage devices in a data storage system |
US6876889B1 (en) | 1998-11-17 | 2005-04-05 | Intel Corporation | Rule processing system with external application integration |
US6553375B1 (en) | 1998-11-25 | 2003-04-22 | International Business Machines Corporation | Method and apparatus for server based handheld application and database management |
US6225938B1 (en) | 1999-01-14 | 2001-05-01 | Universal Electronics Inc. | Universal remote control system with bar code setup |
US6330621B1 (en) | 1999-01-15 | 2001-12-11 | Storage Technology Corporation | Intelligent data storage manager |
US6744771B1 (en) | 1999-06-09 | 2004-06-01 | Amx Corporation | Method and system for master to master communication in control systems |
US6441778B1 (en) | 1999-06-18 | 2002-08-27 | Jennifer Durst | Pet locator |
US6286764B1 (en) | 1999-07-14 | 2001-09-11 | Edward C. Garvey | Fluid and gas supply system |
JP4875796B2 (en) | 1999-07-30 | 2012-02-15 | キヤノン株式会社 | Electronic device and storage medium |
US6529230B1 (en) | 1999-08-30 | 2003-03-04 | Safe-T-Net Systems Pte Ltd | Security and fire control system |
US7574494B1 (en) | 1999-10-15 | 2009-08-11 | Thomson Licensing | User interface for a bi-directional communication system |
US6751657B1 (en) | 1999-12-21 | 2004-06-15 | Worldcom, Inc. | System and method for notification subscription filtering based on user role |
US6502166B1 (en) | 1999-12-29 | 2002-12-31 | International Business Machines Corporation | Method and apparatus for distributing data across multiple disk drives |
US7010332B1 (en) | 2000-02-21 | 2006-03-07 | Telefonaktiebolaget Lm Ericsson(Publ) | Wireless headset with automatic power control |
US7395546B1 (en) | 2000-03-09 | 2008-07-01 | Sedna Patent Services, Llc | Set top terminal having a program pause feature |
US20070281828A1 (en) | 2000-03-21 | 2007-12-06 | Rice Michael J P | Games controllers |
US6646676B1 (en) | 2000-05-17 | 2003-11-11 | Mitsubishi Electric Research Laboratories, Inc. | Networked surveillance and control system |
US6663375B1 (en) | 2000-06-19 | 2003-12-16 | Extrusion Dies, Inc. | Dual flexible lip extrusion apparatus with pivoting actuation member |
AU2001296925A1 (en) | 2000-09-28 | 2002-04-08 | Vigilos, Inc. | Method and process for configuring a premises for monitoring |
US6756998B1 (en) | 2000-10-19 | 2004-06-29 | Destiny Networks, Inc. | User interface and method for home automation system |
US6792319B1 (en) | 2000-10-19 | 2004-09-14 | Destiny Networks, Inc. | Home automation system and method |
KR100359827B1 (en) | 2000-11-27 | 2002-11-07 | 엘지전자 주식회사 | Network method and apparatus for home appliance |
US6920615B1 (en) | 2000-11-29 | 2005-07-19 | Verizon Corporate Services Group Inc. | Method and system for service-enablement gateway and its service portal |
US20020080238A1 (en) | 2000-12-27 | 2002-06-27 | Nikon Corporation | Watching system |
AU2002255568B8 (en) | 2001-02-20 | 2014-01-09 | Adidas Ag | Modular personal network systems and methods |
US6662282B2 (en) | 2001-04-17 | 2003-12-09 | Hewlett-Packard Development Company, L.P. | Unified data sets distributed over multiple I/O-device arrays |
US7346917B2 (en) | 2001-05-21 | 2008-03-18 | Cyberview Technology, Inc. | Trusted transactional set-top box |
US6825769B2 (en) | 2001-09-14 | 2004-11-30 | Koninklijke Philips Electronics N.V. | Automatic shut-off light system when user sleeps |
US7216002B1 (en) | 2001-10-22 | 2007-05-08 | Gateway Inc. | System and method for group content selection |
DE60208432T2 (en) | 2001-10-26 | 2006-08-24 | Koninklijke Philips Electronics N.V. | BIDIRECTIONAL REMOTE CONTROL SYSTEM AND METHOD |
US6976187B2 (en) | 2001-11-08 | 2005-12-13 | Broadcom Corporation | Rebuilding redundant disk arrays using distributed hot spare space |
KR100407051B1 (en) | 2001-11-16 | 2003-11-28 | 삼성전자주식회사 | Home network system |
US6690778B2 (en) | 2002-01-16 | 2004-02-10 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for automatically adjusting an electronic device output in response to an incoming telephone call |
KR100475447B1 (en) | 2002-01-21 | 2005-03-10 | 엘지전자 주식회사 | Method and apparatus of processing inputted signal for display having set-top box |
US9479550B2 (en) | 2002-02-12 | 2016-10-25 | Google Technology Holdings LLC | System for providing continuity of broadcast between clients and method therefor |
DE10208451A1 (en) * | 2002-02-27 | 2003-09-04 | Bremicker Soehne Kg A | Window / door lock |
US6774802B2 (en) | 2002-03-20 | 2004-08-10 | Hon Technology Inc. | Detection and air evacuation system |
US6691724B2 (en) * | 2002-04-11 | 2004-02-17 | Michael Brent Ford | Method and system for controlling a household water supply |
US7143298B2 (en) | 2002-04-18 | 2006-11-28 | Ge Fanuc Automation North America, Inc. | Methods and apparatus for backing up a memory device |
US9137035B2 (en) | 2002-05-09 | 2015-09-15 | Netstreams Llc | Legacy converter and controller for an audio video distribution system |
US7653212B2 (en) | 2006-05-19 | 2010-01-26 | Universal Electronics Inc. | System and method for using image data in connection with configuring a universal controlling device |
JP2004040285A (en) | 2002-07-01 | 2004-02-05 | Matsushita Electric Ind Co Ltd | Control unit for household electric appliance, control method, control program, and household electric appliance |
US7464035B2 (en) | 2002-07-24 | 2008-12-09 | Robert Bosch Corporation | Voice control of home automation systems via telephone |
US6778071B2 (en) | 2002-08-21 | 2004-08-17 | Lockheed Martin Corporation | Adaptive escape routing system |
US7739718B1 (en) | 2002-08-23 | 2010-06-15 | Arris Group, Inc. | System and method for automatically sensing the state of a video display device |
US10009577B2 (en) | 2002-08-29 | 2018-06-26 | Comcast Cable Communications, Llc | Communication systems |
GB2417318B (en) | 2002-09-19 | 2006-07-19 | Reilor Holdings Ltd | Key for automatic pet door |
JP2004166193A (en) | 2002-09-27 | 2004-06-10 | Matsushita Electric Ind Co Ltd | Remote control device |
US9009084B2 (en) * | 2002-10-21 | 2015-04-14 | Rockwell Automation Technologies, Inc. | System and methodology providing automation security analysis and network intrusion protection in an industrial environment |
US7330740B2 (en) | 2002-10-22 | 2008-02-12 | Broadcom Corporation | Cell phone wireless speaker-microphone sleep modes |
US7274295B2 (en) | 2002-10-30 | 2007-09-25 | At&T Bls Intellectual Property, Inc. | Instantaneous mobile access to all pertinent life events |
US20030126593A1 (en) | 2002-11-04 | 2003-07-03 | Mault James R. | Interactive physiological monitoring system |
US20040117843A1 (en) | 2002-12-11 | 2004-06-17 | Jeyhan Karaoguz | Media exchange network supporting local and remote personalized media overlay |
US20040128034A1 (en) | 2002-12-11 | 2004-07-01 | Lenker Jay A. | Method and apparatus for water flow sensing and control |
US7088238B2 (en) | 2002-12-11 | 2006-08-08 | Broadcom, Inc. | Access, monitoring, and control of appliances via a media processing system |
CN100556225C (en) | 2002-12-16 | 2009-10-28 | 皇家飞利浦电子股份有限公司 | System and method for recovery from master failure in a lighting control network |
US7109879B2 (en) | 2003-01-17 | 2006-09-19 | Smart Safety Systems, Inc. | Remotely activated, multiple stage alarm system |
US20040148419A1 (en) | 2003-01-23 | 2004-07-29 | Chen Yancy T. | Apparatus and method for multi-user entertainment |
KR100514191B1 (en) | 2003-01-23 | 2005-09-13 | 삼성전자주식회사 | remote controller and set-top-box for it |
AU2003208352A1 (en) | 2003-01-29 | 2004-08-23 | Pale Holding B.V. | Method and system for providing emergency health information |
US20040260407A1 (en) | 2003-04-08 | 2004-12-23 | William Wimsatt | Home automation control architecture |
US7005979B2 (en) | 2003-06-25 | 2006-02-28 | Universal Electronics Inc. | System and method for monitoring remote control transmissions |
US20060155389A1 (en) | 2003-07-03 | 2006-07-13 | Francesco Pessolano | Method of controlling an electronic device |
CN100423558C (en) | 2003-07-14 | 2008-10-01 | 松下电器产业株式会社 | Signal switching device, signal distribution device, display device, and signal transmission system |
KR100541942B1 (en) | 2003-08-11 | 2006-01-10 | 삼성전자주식회사 | Apparatus for managing home-devices remotely in home-network and method thereof |
CN1607348A (en) * | 2003-10-15 | 2005-04-20 | 高砂工程株式会社 | Check valve for detecting water leakage and water leakage alarm system using the same |
AU2004285450B2 (en) | 2003-10-20 | 2010-01-14 | Gregory K. Frykman | Zeolite molecular sieves for the removal of toxins |
US7155305B2 (en) | 2003-11-04 | 2006-12-26 | Universal Electronics Inc. | System and methods for home appliance identification and control in a networked environment |
US7234074B2 (en) | 2003-12-17 | 2007-06-19 | International Business Machines Corporation | Multiple disk data storage system for reducing power consumption |
US20060253894A1 (en) | 2004-04-30 | 2006-11-09 | Peter Bookman | Mobility device platform |
US7505081B2 (en) | 2004-05-17 | 2009-03-17 | Toshiba America Consumer Products, L.L.C. | System and method for preserving external storage device control while in picture-outside-picture (POP) or picture-in-picture (PIP) modes |
US7395369B2 (en) | 2004-05-18 | 2008-07-01 | Oracle International Corporation | Distributing data across multiple storage devices |
US7218237B2 (en) | 2004-05-27 | 2007-05-15 | Lawrence Kates | Method and apparatus for detecting water leaks |
US8644525B2 (en) | 2004-06-02 | 2014-02-04 | Clearone Communications, Inc. | Virtual microphones in electronic conferencing systems |
KR100631589B1 (en) | 2004-06-25 | 2006-10-09 | 삼성전자주식회사 | Method for providing an initial screen of a digital television |
US7424867B2 (en) | 2004-07-15 | 2008-09-16 | Lawrence Kates | Camera system for canines, felines, or other animals |
US7228726B2 (en) | 2004-09-23 | 2007-06-12 | Lawrence Kates | System and method for utility metering and leak detection |
US7342488B2 (en) | 2004-10-13 | 2008-03-11 | Innvision Networks, Llc | System and method for providing home awareness |
JP4303191B2 (en) | 2004-11-26 | 2009-07-29 | 富士フイルム株式会社 | Image recognition system, image recognition method, and image recognition program |
US7363454B2 (en) | 2004-12-10 | 2008-04-22 | International Business Machines Corporation | Storage pool space allocation across multiple locations |
KR100635544B1 (en) | 2004-12-20 | 2006-10-18 | 한국전자통신연구원 | Device and method for distributing same or different digital broadcasting stream in heterogeneous home network |
EP1836552A1 (en) | 2004-12-24 | 2007-09-26 | Nokia Corporation | Hardware-initiated automated back-up of data from an internal memory of a hand-portable electronic device |
WO2006085852A2 (en) | 2005-01-28 | 2006-08-17 | Hermetic Switch, Inc. | A deadbolt sensor for security systems |
WO2006088842A1 (en) | 2005-02-17 | 2006-08-24 | Ranco Incorporated Of Delaware | Adverse condition detector with diagnostics |
US7408472B2 (en) | 2005-02-22 | 2008-08-05 | Lee Von Gunten | Device for simulating human activity in an unoccupied dwelling |
US8550368B2 (en) | 2005-02-23 | 2013-10-08 | Emerson Electric Co. | Interactive control system for an HVAC system |
US8010498B2 (en) | 2005-04-08 | 2011-08-30 | Microsoft Corporation | Virtually infinite reliable storage across multiple storage devices and storage services |
US8264318B2 (en) | 2005-06-09 | 2012-09-11 | Whirlpool Corporation | Consumable holder with converter |
US7793317B2 (en) | 2005-08-19 | 2010-09-07 | At&T Intellectual Property I, L.P. | System and method of managing video streams to a set top box |
US7391319B1 (en) | 2005-08-22 | 2008-06-24 | Walker Ethan A | Wireless fire alarm door unlocking interface |
CH702136B1 (en) * | 2005-09-14 | 2011-05-13 | Easy Etudes Et Applic Systeme Sa | Locking device for security door in machine tool, has electric switch with single contact element integrated with locking device to control closing of security door, and moving unit moving locking rod in slot using air pressure unit |
US20070078910A1 (en) | 2005-09-30 | 2007-04-05 | Rajendra Bopardikar | Back-up storage for home network |
US7945297B2 (en) | 2005-09-30 | 2011-05-17 | Atmel Corporation | Headsets and headset power management |
US7386666B1 (en) | 2005-09-30 | 2008-06-10 | Emc Corporation | Global sparing of storage capacity across multiple storage arrays |
GB2431813B (en) | 2005-10-28 | 2008-06-04 | Eleanor Johnson | Audio system |
US7870232B2 (en) | 2005-11-04 | 2011-01-11 | Intermatic Incorporated | Messaging in a home automation data transfer system |
US20070256085A1 (en) | 2005-11-04 | 2007-11-01 | Reckamp Steven R | Device types and units for a home automation data transfer system |
US7640351B2 (en) | 2005-11-04 | 2009-12-29 | Intermatic Incorporated | Application updating in a home automation data transfer system |
US7694005B2 (en) | 2005-11-04 | 2010-04-06 | Intermatic Incorporated | Remote device management in a home automation data transfer system |
US8042048B2 (en) | 2005-11-17 | 2011-10-18 | Att Knowledge Ventures, L.P. | System and method for home automation |
US7354383B2 (en) | 2005-12-06 | 2008-04-08 | Ilir Bardha | Jump rope with physiological monitor |
US20070135225A1 (en) | 2005-12-12 | 2007-06-14 | Nieminen Heikki V | Sport movement analyzer and training device |
US9153125B2 (en) | 2005-12-20 | 2015-10-06 | Savant Systems, Llc | Programmable multimedia controller with programmable services |
US7865512B2 (en) | 2005-12-27 | 2011-01-04 | Panasonic Electric Works Co., Ltd. | Systems and methods for providing victim location information during an emergency situation |
JP2007181030A (en) | 2005-12-28 | 2007-07-12 | Funai Electric Co Ltd | Image display device |
KR100725945B1 (en) | 2006-01-03 | 2007-06-11 | 삼성전자주식회사 | Broadcasting signal retransmitting system and method using illuminated light communication |
US20130219482A1 (en) * | 2006-01-31 | 2013-08-22 | Sigma Designs, Inc. | Method for uniquely addressing a group of network units in a sub-network |
US8516087B2 (en) | 2006-02-14 | 2013-08-20 | At&T Intellectual Property I, L.P. | Home automation system and method |
US20070194922A1 (en) | 2006-02-17 | 2007-08-23 | Lear Corporation | Safe warn building system and method |
WO2007126781A2 (en) | 2006-03-27 | 2007-11-08 | Exceptional Innovation Llc | Set top box for convergence and automation system |
US7659814B2 (en) | 2006-04-21 | 2010-02-09 | International Business Machines Corporation | Method for distributed sound collection and event triggering |
US8700772B2 (en) | 2006-05-03 | 2014-04-15 | Cloud Systems, Inc. | System and method for automating the management, routing, and control of multiple devices and inter-device connections |
US20070271518A1 (en) | 2006-05-16 | 2007-11-22 | Bellsouth Intellectual Property Corporation | Methods, Apparatus and Computer Program Products for Audience-Adaptive Control of Content Presentation Based on Sensed Audience Attentiveness |
US8045761B2 (en) | 2006-05-30 | 2011-10-25 | Intelliview Technologies Inc. | Detection of environmental conditions in a sequence of images |
FI20065390L (en) | 2006-06-08 | 2007-12-09 | Innohome Oy | Automatic multi-level access control system for electronic and electrical equipment |
US8392947B2 (en) | 2006-06-30 | 2013-03-05 | At&T Intellectual Property I, Lp | System and method for home audio and video communication |
KR100772412B1 (en) | 2006-07-18 | 2007-11-01 | 삼성전자주식회사 | Apparatus and method of controlling home control network |
FR2904127B1 (en) | 2006-07-19 | 2008-10-17 | Somfy Sas | METHOD FOR OPERATING AN AUTONOMOUS DOMOTIC SENSOR DEVICE FOR DETECTING THE EXISTENCE AND / OR MEASURING THE INTENSITY OF A PHYSICAL PHENOMENON |
US20080144884A1 (en) | 2006-07-20 | 2008-06-19 | Babak Habibi | System and method of aerial surveillance |
US20080021971A1 (en) | 2006-07-21 | 2008-01-24 | Halgas Joseph F | System and Method for Electronic Messaging Notification Using End-User Display Devices |
US20080046930A1 (en) | 2006-08-17 | 2008-02-21 | Bellsouth Intellectual Property Corporation | Apparatus, Methods and Computer Program Products for Audience-Adaptive Control of Content Presentation |
CN1916797A (en) * | 2006-08-23 | 2007-02-21 | 南京师范大学 | Wireless intelligent monitoring and alarm system for indoor safety and environmental protection |
US8374586B2 (en) | 2006-09-07 | 2013-02-12 | Pima Electronic Systems Ltd. | Method and system for transmission of images from a monitored area |
WO2008030069A1 (en) | 2006-09-08 | 2008-03-13 | Lg Electronics Inc. | Broadcasting receiver and method of processing emergency alert message |
US8687037B2 (en) | 2006-09-12 | 2014-04-01 | Savant Systems, Llc | Telephony services for programmable multimedia controller |
US8335312B2 (en) | 2006-10-02 | 2012-12-18 | Plantronics, Inc. | Donned and doffed headset state detection |
US8719861B2 (en) | 2006-10-02 | 2014-05-06 | At&T Intellectual Property I, Lp | System and method for distributing dynamic event data in an internet protocol television system |
WO2008094224A2 (en) * | 2006-10-13 | 2008-08-07 | Savannah River Nuclear Solutions, Llc | Door latching recognition apparatus and process |
US8230466B2 (en) | 2006-11-16 | 2012-07-24 | At&T Intellectual Property I, L.P. | Home automation system and method including remote media access |
US20080120639A1 (en) | 2006-11-21 | 2008-05-22 | Sbc Knowledge Ventures, Lp | System and method of providing emergency information |
US8014501B2 (en) | 2006-11-27 | 2011-09-06 | Avaya Inc. | Determining whether to provide authentication credentials based on call-establishment delay |
US8619136B2 (en) | 2006-12-01 | 2013-12-31 | Centurylink Intellectual Property Llc | System and method for home monitoring using a set top box |
US20090281909A1 (en) | 2006-12-06 | 2009-11-12 | Pumpone, Llc | System and method for management and distribution of multimedia presentations |
FR2909755B1 (en) * | 2006-12-07 | 2009-03-06 | Edixia Soc Par Actions Simplif | Method for determining the clearance and/or flushness of an opening panel, in particular of a vehicle, without a reference on the panel |
JP2008148016A (en) | 2006-12-11 | 2008-06-26 | Toyota Motor Corp | Household appliance control system |
US8601515B2 (en) | 2006-12-28 | 2013-12-03 | Motorola Mobility Llc | On screen alert to indicate status of remote recording |
US20160277261A9 (en) * | 2006-12-29 | 2016-09-22 | Prodea Systems, Inc. | Multi-services application gateway and system employing the same |
US8397264B2 (en) | 2006-12-29 | 2013-03-12 | Prodea Systems, Inc. | Display inserts, overlays, and graphical user interfaces for multimedia systems |
US7865252B2 (en) | 2007-01-26 | 2011-01-04 | Autani Corporation | Upgradeable automation devices, systems, architectures, and methods |
US20080179053A1 (en) | 2007-01-29 | 2008-07-31 | Lawrence Kates | System and method for zone thermostat budgeting |
KR200456982Y1 (en) | 2007-03-30 | 2011-11-30 | 주식회사 아이레보 | Tubular Type Electronic Door Lock Having All-in-One Driving Unit Dead Bolt |
US20100321151A1 (en) | 2007-04-04 | 2010-12-23 | Control4 Corporation | Home automation security system and method |
US8797465B2 (en) | 2007-05-08 | 2014-08-05 | Sony Corporation | Applications for remote control devices with added functionalities |
US8136040B2 (en) | 2007-05-16 | 2012-03-13 | Apple Inc. | Audio variance for multiple windows |
TW200847782A (en) | 2007-05-17 | 2008-12-01 | Inventec Multimedia & Telecom | Programmable scheduling video converting apparatus |
JP2008301298A (en) | 2007-05-31 | 2008-12-11 | Toshiba Corp | Information input/output apparatus and information input/output method |
US7969318B2 (en) | 2007-06-15 | 2011-06-28 | Matt White | Flow detector with alarm features |
US20090023554A1 (en) | 2007-07-16 | 2009-01-22 | Youngtack Shim | Exercise systems in virtual environment |
US7688212B2 (en) | 2007-07-26 | 2010-03-30 | Simplexgrinnell Lp | Method and apparatus for providing occupancy information in a fire alarm system |
US8018337B2 (en) | 2007-08-03 | 2011-09-13 | Fireear Inc. | Emergency notification device and system |
US8221290B2 (en) | 2007-08-17 | 2012-07-17 | Adidas International Marketing B.V. | Sports electronic training system with electronic gaming features, and applications thereof |
EP2195688B1 (en) | 2007-08-30 | 2018-10-03 | Valeo Schalter und Sensoren GmbH | Method and system for weather condition detection with image-based road characterization |
US8107977B2 (en) | 2007-09-07 | 2012-01-31 | United Video Properties, Inc. | Cross-platform messaging |
US8310335B2 (en) | 2007-09-07 | 2012-11-13 | Verizon Patent And Licensing Inc. | Network-based access and control of home automation systems |
US20110090086A1 (en) * | 2007-10-22 | 2011-04-21 | Kent Dicks | Systems for personal emergency intervention |
US20090112541A1 (en) | 2007-10-26 | 2009-04-30 | Joel Anderson | Virtual reality tools for development of infection control solutions |
TWI353736B (en) | 2007-11-23 | 2011-12-01 | Compal Communications Inc | Device of wireless remote control and operating method thereof |
US20090138507A1 (en) | 2007-11-27 | 2009-05-28 | International Business Machines Corporation | Automated playback control for audio devices using environmental cues as indicators for automatically pausing audio playback |
US8949870B2 (en) | 2007-12-19 | 2015-02-03 | Dish Network L.L.C. | Transfer of information from an information node to a broadcast programming receiver |
DE102007061754A1 (en) | 2007-12-20 | 2009-06-25 | Elektro Grundler Ges.M.B.H. & Co. Kg | Evacuation device and escape route indicator for this |
US8154381B2 (en) | 2007-12-31 | 2012-04-10 | Universal Electronics Inc. | System and method for interactive appliance control |
US8331544B2 (en) | 2007-12-31 | 2012-12-11 | Schlage Lock Company, Llc | Method and system for remotely controlling access to an access point |
JP4968091B2 (en) | 2008-01-30 | 2012-07-04 | ソニー株式会社 | Electronic device, message response method and program |
US8477830B2 (en) | 2008-03-18 | 2013-07-02 | On-Ramp Wireless, Inc. | Light monitoring system using a random phase multiple access system |
US20090235992A1 (en) | 2008-03-18 | 2009-09-24 | Armstrong Larry D | Method and apparatus for detecting water system leaks and preventing excessive water usage |
US8413204B2 (en) | 2008-03-31 | 2013-04-02 | At&T Intellectual Property I, Lp | System and method of interacting with home automation systems via a set-top box device |
JP4601684B2 (en) | 2008-04-25 | 2010-12-22 | シャープ株式会社 | Evacuation route acquisition system, portable terminal device, evacuation route acquisition method, evacuation route acquisition program, computer-readable recording medium |
US8320578B2 (en) | 2008-04-30 | 2012-11-27 | Dp Technologies, Inc. | Headset |
CN201202136Y (en) * | 2008-05-29 | 2009-03-04 | 丁文杰 | Household waterpipe explosion and overflow automatic shutdown system |
US9516116B2 (en) | 2008-06-06 | 2016-12-06 | Apple Inc. | Managing notification service connections |
US7579945B1 (en) | 2008-06-20 | 2009-08-25 | International Business Machines Corporation | System and method for dynamically and efficiently directing evacuation of a building during an emergency condition |
US8290545B2 (en) | 2008-07-25 | 2012-10-16 | Apple Inc. | Systems and methods for accelerometer usage in a wireless headset |
US8321885B2 (en) | 2008-07-29 | 2012-11-27 | Pino Jr Angelo J | In-home system monitoring method and system |
US8013730B2 (en) | 2008-07-29 | 2011-09-06 | Honeywell International Inc. | Customization of personal emergency features for security systems |
US9015755B2 (en) | 2008-07-29 | 2015-04-21 | Centurylink Intellectual Property Llc | System and method for an automatic television channel change |
EP2311299B1 (en) | 2008-08-13 | 2013-03-13 | Koninklijke Philips Electronics N.V. | Updating scenes in remote controllers of a home control system |
US8130107B2 (en) | 2008-08-19 | 2012-03-06 | Timothy Meyer | Leak detection and control system and method |
US8358908B2 (en) | 2008-08-22 | 2013-01-22 | Panasonic Corporation | Recording and playback apparatus |
KR101542379B1 (en) | 2008-08-28 | 2015-08-06 | 엘지전자 주식회사 | Video display apparatus and method of setting user viewing conditions |
US8427278B2 (en) | 2008-10-17 | 2013-04-23 | Robert Bosch Gmbh | Automation and security system |
US8461959B2 (en) | 2008-10-23 | 2013-06-11 | Whirlpool Corporation | Consumable holder with process control apparatus |
US8051381B2 (en) | 2008-12-22 | 2011-11-01 | Whirlpool Corporation | Appliance with a graphical user interface for configuring an accessory |
US8516533B2 (en) | 2008-11-07 | 2013-08-20 | Digimarc Corporation | Second screen methods and arrangements |
US20100138007A1 (en) | 2008-11-21 | 2010-06-03 | Qwebl, Inc. | Apparatus and method for integration and setup of home automation |
US8781508B1 (en) | 2008-11-25 | 2014-07-15 | Brytelight Enterprises | System and method for transferring information between a remote computing device and a central business unit |
US8813121B2 (en) | 2008-12-02 | 2014-08-19 | At&T Intellectual Property I, L.P. | Delaying emergency alert system messages |
US8977974B2 (en) | 2008-12-08 | 2015-03-10 | Apple Inc. | Ambient noise based augmentation of media playback |
US8749392B2 (en) | 2008-12-30 | 2014-06-10 | Oneevent Technologies, Inc. | Evacuation system |
BRPI0920495B1 (en) | 2009-01-31 | 2020-09-29 | Telefonaktiebolaget Lm Ericsson (Publ) | COMPUTER SYSTEM AND SYSTEM FOR COMPUTING ENERGY CONSUMPTION AND METHOD FOR CALCULATING ENERGY CONSUMED BY A SYSTEM |
US20100211546A1 (en) | 2009-02-13 | 2010-08-19 | Lennox Manufacturing Inc. | System and method to backup data about devices in a network |
US9799205B2 (en) | 2013-07-15 | 2017-10-24 | Oneevent Technologies, Inc. | Owner controlled evacuation system with notification and route guidance provided by a user device |
US9948872B2 (en) | 2009-03-02 | 2018-04-17 | Flir Systems, Inc. | Monitor and control systems and methods for occupant safety and energy efficiency of structures |
US8171148B2 (en) | 2009-04-17 | 2012-05-01 | Sling Media, Inc. | Systems and methods for establishing connections between devices communicating over a network |
EP2425303B1 (en) | 2009-04-26 | 2019-01-16 | NIKE Innovate C.V. | Gps features and functionality in an athletic watch system |
US8201261B2 (en) | 2009-04-27 | 2012-06-12 | Chase Barfield | Secure data storage system and method |
US8638211B2 (en) | 2009-04-30 | 2014-01-28 | Icontrol Networks, Inc. | Configurable controller and interface for home SMA, phone and multimedia |
US8350694B1 (en) | 2009-05-18 | 2013-01-08 | Alarm.Com Incorporated | Monitoring system to monitor a property with a mobile device with a monitoring application |
US20110032423A1 (en) * | 2009-08-06 | 2011-02-10 | Sony Corporation | Adaptive user profiling for tv-centric home automation system |
US8645327B2 (en) | 2009-09-30 | 2014-02-04 | Apple Inc. | Management of access to data distributed across multiple computing devices |
JP5514507B2 (en) | 2009-10-21 | 2014-06-04 | 株式会社日立製作所 | In-area environment control system and in-area environment control method |
US9015225B2 (en) | 2009-11-16 | 2015-04-21 | Echostar Technologies L.L.C. | Systems and methods for delivering messages over a network |
US9191804B1 (en) | 2009-11-20 | 2015-11-17 | Sprint Communications Company L.P. | Managing subscription messages on behalf of a mobile device |
FR2953615B1 (en) | 2009-12-04 | 2014-11-21 | Thales Sa | SECURE DISTRIBUTED STORAGE SYSTEMS OF PERSONAL DATA, ESPECIALLY BIOMETRIC FINGERPRINTS, AND SYSTEM, DEVICE AND METHOD FOR IDENTITY CONTROL |
TWM384532U (en) | 2009-12-10 | 2010-07-21 | Ind Tech Res Inst | Intelligent pet-feeding device |
JP5515709B2 (en) | 2009-12-11 | 2014-06-11 | ソニー株式会社 | Control apparatus and method, and program |
US9178923B2 (en) | 2009-12-23 | 2015-11-03 | Echostar Technologies L.L.C. | Systems and methods for remotely controlling a media server via a network |
TW201123877A (en) | 2009-12-25 | 2011-07-01 | Hon Hai Prec Ind Co Ltd | Television and method for saving energy thereof |
US8339246B2 (en) | 2009-12-30 | 2012-12-25 | Echostar Technologies Llc | Systems, methods and apparatus for locating a lost remote control |
WO2011095567A1 (en) | 2010-02-04 | 2011-08-11 | Eldon Technology Limited Trading As Echostar Europe | A method of notifying a user of the status of an electrical appliance |
US8316413B2 (en) | 2010-02-04 | 2012-11-20 | Eldon Technology Limited | Apparatus for displaying electrical device usage information on a television receiver |
US9599981B2 (en) | 2010-02-04 | 2017-03-21 | Echostar Uk Holdings Limited | Electronic appliance status notification via a home entertainment system |
US8948793B1 (en) | 2010-02-12 | 2015-02-03 | Bruce R. Birkhold | System and method for automated remote messaging to wireless mobile devices |
US10455275B2 (en) | 2010-02-16 | 2019-10-22 | Comcast Cable Communications, Llc | Disposition of video alerts and integration of a mobile device into a local service domain |
US8156368B2 (en) | 2010-02-22 | 2012-04-10 | International Business Machines Corporation | Rebuilding lost data in a distributed redundancy data storage system |
US8606298B2 (en) | 2010-03-11 | 2013-12-10 | Electronics And Telecommunications Research Institute | System and method for tracking location of mobile terminal using TV |
US8086757B2 (en) | 2010-03-23 | 2011-12-27 | Michael Alan Chang | Intelligent gateway for heterogeneous peer-to-peer home automation networks |
CA2809448A1 (en) | 2010-04-15 | 2011-10-20 | Brian A. Corbett | Emergency lighting system with projected directional indication |
US8564421B2 (en) | 2010-04-30 | 2013-10-22 | Blackberry Limited | Method and apparatus for generating an audio notification file |
US8799413B2 (en) | 2010-05-03 | 2014-08-05 | Panzura, Inc. | Distributing data for a distributed filesystem across multiple cloud storage systems |
US9204193B2 (en) | 2010-05-14 | 2015-12-01 | Rovi Guides, Inc. | Systems and methods for media detection and filtering using a parental control logging application |
US8334765B2 (en) * | 2010-05-24 | 2012-12-18 | Keylockit Ltd. | Wireless network apparatus and method for lock indication |
WO2011149473A1 (en) | 2010-05-28 | 2011-12-01 | Honeywell International Inc. | Synchronization of light sources |
US9215420B2 (en) | 2010-06-01 | 2015-12-15 | Comcast Cable Communications, Llc | Ranking search results |
AU2011287276A1 (en) | 2010-08-05 | 2013-02-07 | Nice S.P.A. | Component addition/substitution method in a home automation wireless system |
DE102010034072A1 (en) | 2010-08-12 | 2012-02-16 | Crosscan Gmbh | Personnel control system for the evacuation of a building or a building section |
US11122334B2 (en) | 2010-08-17 | 2021-09-14 | DISH Technologies L.L.C. | Methods and apparatus for accessing external devices from a television receiver utilizing integrated content selection menus |
CN102058939A (en) | 2010-08-18 | 2011-05-18 | 清华大学 | Method and system for evaluating building fire situation and instructing evacuation |
GB2483370B (en) | 2010-09-05 | 2015-03-25 | Mobile Res Labs Ltd | A system and method for engaging a person in the presence of ambient audio |
US9104211B2 (en) | 2010-11-19 | 2015-08-11 | Google Inc. | Temperature controller with model-based time to target calculation and display |
US20120271472A1 (en) | 2011-04-22 | 2012-10-25 | Joulex, Inc. | System and methods for sustainable energy management, monitoring, and control of electronic devices |
EP2431956B1 (en) | 2010-09-17 | 2014-10-22 | Eldon Technology Limited trading as Echostar Europe | A method and device for operating a television located in a premises to simulate occupation of the premises |
US8786698B2 (en) | 2010-09-23 | 2014-07-22 | Sony Computer Entertainment Inc. | Blow tracking user interface system and method |
US8488067B2 (en) | 2010-10-27 | 2013-07-16 | Sony Corporation | TV use simulation |
US8640021B2 (en) | 2010-11-12 | 2014-01-28 | Microsoft Corporation | Audience-based presentation and customization of content |
US8683086B2 (en) | 2010-11-17 | 2014-03-25 | Flextronics Ap, Llc. | Universal remote control with automated setup |
CN103221984B (en) | 2010-11-19 | 2016-10-05 | 株式会社尼康 | Guidance device, detection device, and posture state determination device |
EP2645699B1 (en) | 2010-11-25 | 2020-08-05 | Panasonic Intellectual Property Corporation of America | Communication device |
JP5620287B2 (en) | 2010-12-16 | 2014-11-05 | 株式会社オプティム | Portable terminal, method and program for changing user interface |
US9147337B2 (en) * | 2010-12-17 | 2015-09-29 | Icontrol Networks, Inc. | Method and system for logging security event data |
US8868034B2 (en) | 2010-12-25 | 2014-10-21 | Intel Corporation | Secure wireless device area network of a cellular system |
WO2012092492A2 (en) | 2010-12-29 | 2012-07-05 | Secureall Corporation | Alignment-related operation and position sensing of electronic and other locks and other objects |
US8694600B2 (en) * | 2011-03-01 | 2014-04-08 | Covidien Lp | Remote monitoring systems for monitoring medical devices via wireless communication networks |
US20120206269A1 (en) * | 2011-02-11 | 2012-08-16 | B.E.A. Inc. | Electronic System to Signal Proximity of an Object |
TWI442200B (en) | 2011-03-02 | 2014-06-21 | Ind Tech Res Inst | Method and apparatus of binding sensors and actuators automatically |
US20130090213A1 (en) | 2011-03-25 | 2013-04-11 | Regents Of The University Of California | Exercise-Based Entertainment And Game Controller To Improve Health And Manage Obesity |
JP5681713B2 (en) | 2011-03-29 | 2015-03-11 | パナソニックIpマネジメント株式会社 | Remote control system and remote control |
US8429003B2 (en) | 2011-04-21 | 2013-04-23 | Efficiency3 Corp. | Methods, technology, and systems for quickly enhancing the operating and financial performance of energy systems at large facilities; interpreting usual and unusual patterns in energy consumption; identifying, quantifying, and monetizing hidden operating and financial waste; and accurately measuring the results of implemented energy management solutions-in the shortest amount of time with minimal cost and effort |
US20150142991A1 (en) | 2011-04-21 | 2015-05-21 | Efficiency3 Corp. | Electronic hub appliances used for collecting, storing, and processing potentially massive periodic data streams indicative of real-time or other measuring parameters |
US20120291068A1 (en) | 2011-05-09 | 2012-11-15 | Verizon Patent And Licensing Inc. | Home device control on television |
WO2012153290A1 (en) | 2011-05-10 | 2012-11-15 | Nds Limited | Adaptive presentation of content |
US20120314713A1 (en) | 2011-06-08 | 2012-12-13 | Harkirat Singh | Method and system for proxy entity representation in audio/video networks |
US8588990B2 (en) | 2011-06-30 | 2013-11-19 | Ayla Networks, Inc. | Communicating through a server between appliances and applications |
US8588968B2 (en) | 2011-07-06 | 2013-11-19 | Dominic Anthony Carelli | Internet-accessible pet treat dispensing system and method |
US20140222634A1 (en) | 2011-08-03 | 2014-08-07 | Innovaci Inc. | Method for Environmental Control and Monitoring |
US8618927B2 (en) | 2011-08-24 | 2013-12-31 | At&T Intellectual Property I, L.P. | Methods, systems, and products for notifications in security systems |
US9541625B2 (en) | 2011-08-25 | 2017-01-10 | En-Gauge, Inc. | Emergency resource location and status |
US9252967B2 (en) | 2011-09-01 | 2016-02-02 | Sony Corporation | Facilitated use of heterogeneous home-automation edge components |
US8677343B2 (en) | 2011-09-16 | 2014-03-18 | Cisco Technology, Inc. | Centrally coordinated firmware upgrade model across network for minimizing uptime loss and firmware compatibility |
JP2014534405A (en) | 2011-10-21 | 2014-12-18 | ネスト・ラブズ・インコーポレイテッド | User-friendly, networked learning thermostat and related systems and methods |
WO2013063769A1 (en) | 2011-11-02 | 2013-05-10 | Intel Corporation | Extending capabilities of existing devices without making modifications to existing devices |
US20130124192A1 (en) | 2011-11-14 | 2013-05-16 | Cyber360, Inc. | Alert notifications in an online monitoring system |
US20130147604A1 (en) | 2011-12-07 | 2013-06-13 | Donald R. Jones, Jr. | Method and system for enabling smart building evacuation |
US20130158717A1 (en) | 2011-12-14 | 2013-06-20 | Honeywell International Inc. | Hvac controller with delta-t based diagnostics |
US9103558B2 (en) | 2011-12-21 | 2015-08-11 | Lennox Industries Inc. | Method for detecting physical presence of a specific individual to control HVAC settings |
US20130185750A1 (en) | 2012-01-17 | 2013-07-18 | General Instrument Corporation | Context based correlative targeted advertising |
US9258593B1 (en) | 2012-01-25 | 2016-02-09 | Time Warner Cable Enterprises Llc | System and method for home security monitoring using a television set-top box |
US20130204408A1 (en) | 2012-02-06 | 2013-08-08 | Honeywell International Inc. | System for controlling home automation system using body movements |
JP5936379B2 (en) | 2012-02-07 | 2016-06-22 | シャープ株式会社 | Image display device |
US20140309853A1 (en) | 2013-04-15 | 2014-10-16 | Flextronics Ap, Llc | Vehicle diagnostics and roadside assistance |
US9378601B2 (en) | 2012-03-14 | 2016-06-28 | Autoconnect Holdings Llc | Providing home automation information via communication with a vehicle |
US8749375B2 (en) * | 2012-03-26 | 2014-06-10 | Sony Corporation | Hands-free home automation application |
US20130267383A1 (en) | 2012-04-06 | 2013-10-10 | Icon Health & Fitness, Inc. | Integrated Exercise Device Environment Controller |
US9633186B2 (en) | 2012-04-23 | 2017-04-25 | Apple Inc. | Systems and methods for controlling output of content based on human recognition data detection |
US8750576B2 (en) | 2012-04-24 | 2014-06-10 | Taiwan Colour And Imaging Technology Corporation | Method of managing visiting guests by face recognition |
US9210361B2 (en) | 2012-04-24 | 2015-12-08 | Skreens Entertainment Technologies, Inc. | Video display system |
AU2013251524B2 (en) | 2012-04-25 | 2016-05-12 | Bidgely Inc. | Energy disaggregation techniques for low resolution whole-house energy consumption data |
US20130324247A1 (en) | 2012-06-04 | 2013-12-05 | Microsoft Corporation | Interactive sports applications |
US20130325150A1 (en) | 2012-06-05 | 2013-12-05 | Henryk Bury | Method for the Operation of a Control Unit and a Control Unit |
US8923823B1 (en) | 2012-06-28 | 2014-12-30 | Emc Corporation | System for delivering and confirming receipt of notification messages across different notification media |
US8667529B2 (en) | 2012-07-09 | 2014-03-04 | EchoStar Technologies, L.L.C. | Presentation of audiovisual exercise segments between segments of primary audiovisual content |
US8886785B2 (en) | 2012-07-17 | 2014-11-11 | The Procter & Gamble Company | Home network of connected consumer devices |
US9798325B2 (en) | 2012-07-17 | 2017-10-24 | Elwha Llc | Unmanned device interaction methods and systems |
DE102012106719B4 (en) * | 2012-07-24 | 2016-09-22 | K.A. Schmersal Holding Gmbh & Co. Kg | Access protection device and a method for monitoring a state of the access protection device |
EP2698686B1 (en) | 2012-07-27 | 2018-10-10 | LG Electronics Inc. | Wrist-wearable terminal and control method thereof |
US8498572B1 (en) | 2012-08-24 | 2013-07-30 | Google Inc. | Home automation device pairing by NFC-enabled portable device |
US8620841B1 (en) | 2012-08-31 | 2013-12-31 | Nest Labs, Inc. | Dynamic distributed-sensor thermostat network for forecasting external events |
US8965170B1 (en) | 2012-09-04 | 2015-02-24 | Google Inc. | Automatic transition of content based on facial recognition |
US9432210B2 (en) | 2012-09-12 | 2016-08-30 | Zuli, Inc. | System for monitor and control of equipment |
US9353550B1 (en) * | 2012-09-13 | 2016-05-31 | Shelby G. Smith, III | Lock engagement status indicator system |
US20150156030A1 (en) | 2012-09-21 | 2015-06-04 | Google Inc. | Handling specific visitor behavior at an entryway to a smart-home |
US9711036B2 (en) | 2012-09-21 | 2017-07-18 | Google Inc. | Leveraging neighborhood to handle potential visitor at a smart-home |
US9208676B2 (en) | 2013-03-14 | 2015-12-08 | Google Inc. | Devices, methods, and associated information processing for security in a smart-sensored home |
US10735216B2 (en) | 2012-09-21 | 2020-08-04 | Google Llc | Handling security services visitor at a smart-home |
US9652912B2 (en) | 2012-09-21 | 2017-05-16 | Google Inc. | Secure handling of unsupervised package drop off at a smart-home |
US10332059B2 (en) | 2013-03-14 | 2019-06-25 | Google Llc | Security scoring in a smart-sensored home |
US8539567B1 (en) | 2012-09-22 | 2013-09-17 | Nest Labs, Inc. | Multi-tiered authentication methods for facilitating communications amongst smart home devices and cloud-based servers |
JP6231327B2 (en) | 2012-09-28 | 2017-11-15 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Terminal control method, terminal control system, and server device |
US9353500B1 (en) | 2012-10-26 | 2016-05-31 | Cold Stone Shorelines And Retaining Walls, Inc. | Excavator thumb having hardened removable teeth defining a platform beyond a wear and tear surface of thumb |
WO2014068559A1 (en) | 2012-10-29 | 2014-05-08 | Laufer Assaf | Appliances control devices and methods |
WO2014071193A1 (en) | 2012-11-02 | 2014-05-08 | Polaris Surgical Llc | Systems and methods for measuring orthopedic parameters in arthroplastic procedures |
CN102984039B (en) | 2012-11-06 | 2016-03-23 | 鸿富锦精密工业(深圳)有限公司 | The intelligent control method of intelligent gateway, intelligent domestic system and home appliance |
US8533144B1 (en) | 2012-11-12 | 2013-09-10 | State Farm Mutual Automobile Insurance Company | Automation and security application store suggestions based on usage data |
KR20140061619A (en) | 2012-11-13 | 2014-05-22 | 한국전자통신연구원 | Home energy management device and method thereof |
DE102013019488A1 (en) | 2012-11-19 | 2014-10-09 | Mace Wolf | Photography with privacy protection |
KR20140065897A (en) | 2012-11-22 | 2014-05-30 | 삼성전자주식회사 | Non-intrusive load monitoring apparatus and method |
EP2736027B8 (en) | 2012-11-26 | 2018-06-27 | AGT International GmbH | Method and system for evacuation support |
TW201424362A (en) | 2012-12-11 | 2014-06-16 | Hon Hai Prec Ind Co Ltd | System and method for switching television channels |
US8930700B2 (en) | 2012-12-12 | 2015-01-06 | Richard J. Wielopolski | Remote device secure data file storage system and method |
US10192411B2 (en) | 2012-12-13 | 2019-01-29 | Oneevent Technologies, Inc. | Sensor-based monitoring system |
KR102058918B1 (en) | 2012-12-14 | 2019-12-26 | 삼성전자주식회사 | Home monitoring method and apparatus |
US20140192197A1 (en) | 2013-01-04 | 2014-07-10 | Thomson Licensing | Method and apparatus for controlling access to a home using visual cues |
CN103916723B (en) | 2013-01-08 | 2018-08-10 | 联想(北京)有限公司 | A kind of sound collection method and a kind of electronic equipment |
US9049168B2 (en) | 2013-01-11 | 2015-06-02 | State Farm Mutual Automobile Insurance Company | Home sensor data gathering for neighbor notification purposes |
US9113213B2 (en) | 2013-01-25 | 2015-08-18 | Nuance Communications, Inc. | Systems and methods for supplementing content with audience-requested information |
US9189611B2 (en) | 2013-02-07 | 2015-11-17 | Sony Corporation | Adapting content and monitoring user behavior based on facial recognition |
US9414114B2 (en) | 2013-03-13 | 2016-08-09 | Comcast Cable Holdings, Llc | Selective interactivity |
US9262906B2 (en) | 2013-03-14 | 2016-02-16 | Comcast Cable Communications, Llc | Processing sensor data |
US9462041B1 (en) | 2013-03-15 | 2016-10-04 | SmartThings, Inc. | Distributed control scheme for remote control and monitoring of devices through a data network |
US9762865B2 (en) | 2013-03-15 | 2017-09-12 | James Carey | Video identification and analytical recognition system |
US20140297001A1 (en) | 2013-03-28 | 2014-10-02 | Kaspar Llc | System and method for adaptive automated resource management and conservation |
US9728052B2 (en) | 2013-04-22 | 2017-08-08 | Electronics And Telecommunications Research Institute | Digital signage system and emergency alerting method using same |
WO2014176379A2 (en) | 2013-04-23 | 2014-10-30 | Canary Connect, Inc. | Security and/or monitoring devices and systems |
WO2014178920A2 (en) * | 2013-04-30 | 2014-11-06 | Flood Monkey Inc. | Intelligent electronic water flow regulation system |
US9996154B2 (en) | 2013-05-09 | 2018-06-12 | Samsung Electronics Co., Ltd. | Apparatus and method of controlling display apparatus |
US20140351832A1 (en) | 2013-05-21 | 2014-11-27 | Samsung Electronics Co., Ltd. | Electronic device using framework interface for communication |
US9544682B2 (en) | 2013-06-05 | 2017-01-10 | Echostar Technologies L.L.C. | Apparatus, method and article for providing audio of different programs |
US9286482B1 (en) | 2013-06-10 | 2016-03-15 | Amazon Technologies, Inc. | Privacy control based on user recognition |
US20140373074A1 (en) | 2013-06-12 | 2014-12-18 | Vivint, Inc. | Set top box automation |
TWI513371B (en) | 2013-07-08 | 2015-12-11 | Lextar Electronics Corp | Integrated wireless and wired light control system |
US20150015401A1 (en) | 2013-07-15 | 2015-01-15 | Oneevent Technologies, Inc. | Owner controlled evacuation system |
US8780201B1 (en) | 2013-07-26 | 2014-07-15 | SkyBell Technologies, Inc. | Doorbell communication systems and methods |
CA2958401A1 (en) | 2013-08-21 | 2015-02-26 | David William OFFEN | Systems and methods for managing incoming calls |
US10029648B2 (en) | 2013-09-04 | 2018-07-24 | Vivint, Inc. | Premises security |
US10025463B2 (en) | 2013-09-18 | 2018-07-17 | Vivint, Inc. | Systems and methods for home automation scene control |
US9058734B2 (en) | 2013-09-24 | 2015-06-16 | Verizon Patent And Licensing Inc. | Alert sensing and monitoring via a user device |
US20150085184A1 (en) | 2013-09-25 | 2015-03-26 | Joel Vidal | Smartphone and tablet having a side-panel camera |
US9646480B2 (en) | 2013-10-07 | 2017-05-09 | Google Inc. | Smart home device with integrated conditional lighting |
WO2015054254A1 (en) | 2013-10-07 | 2015-04-16 | Google Inc. | Hazard detection unit facilitating user-friendly setup experience |
JP2015076802A (en) | 2013-10-10 | 2015-04-20 | 船井電機株式会社 | Display device |
US20150192914A1 (en) | 2013-10-15 | 2015-07-09 | ETC Sp. z.o.o. | Automation and control system with inference and anticipation |
US9594361B2 (en) * | 2013-10-15 | 2017-03-14 | SILVAIR Sp. z o.o. | Automation and control system with context awareness |
US20150113571A1 (en) | 2013-10-22 | 2015-04-23 | Time Warner Cable Enterprises Llc | Methods and apparatus for content switching |
US9347242B2 (en) * | 2013-10-28 | 2016-05-24 | Smartlabs, Inc. | Systems and methods to automatically detect a door state |
US9479633B2 (en) | 2013-10-29 | 2016-10-25 | Logitech Europe S.A. | Method and apparatus for reliably providing an alarm notification |
US9338741B2 (en) | 2013-11-11 | 2016-05-10 | Mivalife Mobile Technology, Inc. | Security system device power management |
WO2015073912A1 (en) | 2013-11-14 | 2015-05-21 | Wager Jeffrey | Treatment or prevention of pulmonary conditions with carbon monoxide |
KR20150056397A (en) | 2013-11-15 | 2015-05-26 | 삼성전자주식회사 | broadcast receiving apparatus and method for displaying notice message using the same |
US10939155B2 (en) | 2013-11-19 | 2021-03-02 | Comcast Cable Communications, Llc | Premises automation control |
US9942723B2 (en) | 2013-12-02 | 2018-04-10 | Ravi Vemulapalli | Location and direction system for buildings |
US20150160935A1 (en) | 2013-12-06 | 2015-06-11 | Vivint, Inc. | Managing device configuration information |
US9495860B2 (en) | 2013-12-11 | 2016-11-15 | Echostar Technologies L.L.C. | False alarm identification |
US9900177B2 (en) | 2013-12-11 | 2018-02-20 | Echostar Technologies International Corporation | Maintaining up-to-date home automation models |
US20150163412A1 (en) | 2013-12-11 | 2015-06-11 | Echostar Technologies, Llc | Home Monitoring and Control |
US9769522B2 (en) | 2013-12-16 | 2017-09-19 | Echostar Technologies L.L.C. | Methods and systems for location specific operations |
WO2015104650A2 (en) | 2014-01-08 | 2015-07-16 | Koninklijke Philips N.V. | System for sharing and/or synchronizing attributes of emitted light among lighting systems |
US11651258B2 (en) | 2014-01-08 | 2023-05-16 | Yechezkal Evan Spero | Integrated docking system for intelligent devices |
US20150198941A1 (en) | 2014-01-15 | 2015-07-16 | John C. Pederson | Cyber Life Electronic Networking and Commerce Operating Exchange |
US9246921B1 (en) | 2014-01-20 | 2016-01-26 | SmartThings, Inc. | Secure external access to device automation system |
US20150241860A1 (en) | 2014-02-24 | 2015-08-27 | Raid And Raid, Inc., D/B/A Ruminate | Intelligent home and office automation system |
US20160203700A1 (en) | 2014-03-28 | 2016-07-14 | Echostar Technologies L.L.C. | Methods and systems to make changes in home automation based on user states |
US9723393B2 (en) | 2014-03-28 | 2017-08-01 | Echostar Technologies L.L.C. | Methods to conserve remote batteries |
US9888266B2 (en) | 2014-04-22 | 2018-02-06 | Vivint, Inc. | Pushing video to panels and sending metadata tag to cloud |
US10274909B2 (en) | 2014-04-25 | 2019-04-30 | Vivint, Inc. | Managing barrier and occupancy based home automation system |
US9765562B2 (en) | 2014-05-07 | 2017-09-19 | Vivint, Inc. | Weather based notification systems and methods for home automation |
GB201408751D0 (en) | 2014-05-16 | 2014-07-02 | Microsoft Corp | Notifications |
US9633547B2 (en) | 2014-05-20 | 2017-04-25 | Ooma, Inc. | Security monitoring and control |
US10237711B2 (en) | 2014-05-30 | 2019-03-19 | Apple Inc. | Dynamic types for activity continuation between electronic devices |
US10440499B2 (en) | 2014-06-16 | 2019-10-08 | Comcast Cable Communications, Llc | User location and identity awareness |
US9443142B2 (en) | 2014-07-24 | 2016-09-13 | Exelis, Inc. | Vision-based system for dynamic weather detection |
US9348689B2 (en) | 2014-10-07 | 2016-05-24 | Belkin International Inc. | Backup-instructing broadcast to network devices responsive to detection of failure risk |
US9621959B2 (en) | 2014-08-27 | 2017-04-11 | Echostar Uk Holdings Limited | In-residence track and alert |
US9824578B2 (en) | 2014-09-03 | 2017-11-21 | Echostar Technologies International Corporation | Home automation control using context sensitive menus |
US9989507B2 (en) | 2014-09-25 | 2018-06-05 | Echostar Technologies International Corporation | Detection and prevention of toxic gas |
US10448749B2 (en) | 2014-10-10 | 2019-10-22 | Sleep Number Corporation | Bed having logic controller |
US9835434B1 (en) | 2014-10-13 | 2017-12-05 | Google Inc. | Home automation input interfaces based on a capacitive touchscreen for detecting patterns of conductive ink |
US10057079B2 (en) | 2014-10-21 | 2018-08-21 | T-Mobile Usa, Inc. | Wireless building automation |
US9511259B2 (en) | 2014-10-30 | 2016-12-06 | Echostar Uk Holdings Limited | Fitness overlay and incorporation for home automation system |
US9983011B2 (en) | 2014-10-30 | 2018-05-29 | Echostar Technologies International Corporation | Mapping and facilitating evacuation routes in emergency situations |
US9396632B2 (en) | 2014-12-05 | 2016-07-19 | Elwha Llc | Detection and classification of abnormal sounds |
US20160182249A1 (en) | 2014-12-19 | 2016-06-23 | EchoStar Technologies, L.L.C. | Event-based audio/video feed selection |
US9967614B2 (en) | 2014-12-29 | 2018-05-08 | Echostar Technologies International Corporation | Alert suspension for home automation system |
US20160189527A1 (en) | 2014-12-30 | 2016-06-30 | Google Inc. | Intelligent Object-Based Alarm System |
US20160191912A1 (en) | 2014-12-31 | 2016-06-30 | Echostar Technologies L.L.C. | Home occupancy simulation mode selection and implementation |
US9870696B2 (en) | 2015-01-05 | 2018-01-16 | Ford Global Technologies, Llc | Smart device vehicle integration |
US10764079B2 (en) | 2015-02-09 | 2020-09-01 | Vivint, Inc. | System and methods for correlating sleep data to security and/or automation system operations |
US9942056B2 (en) | 2015-02-19 | 2018-04-10 | Vivint, Inc. | Methods and systems for automatically monitoring user activity |
US10453098B2 (en) | 2015-03-04 | 2019-10-22 | Google Llc | Privacy-aware personalized content for the smart home |
US10206108B2 (en) | 2015-03-24 | 2019-02-12 | Lenovo (Beijing) Co., Ltd. | Device and method for smart home |
US9729989B2 (en) | 2015-03-27 | 2017-08-08 | Echostar Technologies L.L.C. | Home automation sound detection and positioning |
US10321101B2 (en) | 2015-04-29 | 2019-06-11 | Ademco Inc. | System and method of sharing or connecting security and home control system |
US9948477B2 (en) | 2015-05-12 | 2018-04-17 | Echostar Technologies International Corporation | Home automation weather detection |
US9946857B2 (en) | 2015-05-12 | 2018-04-17 | Echostar Technologies International Corporation | Restricted access for home automation system |
US9632746B2 (en) | 2015-05-18 | 2017-04-25 | Echostar Technologies L.L.C. | Automatic muting |
US9900957B2 (en) | 2015-06-11 | 2018-02-20 | Cree, Inc. | Lighting device including solid state emitters with adjustable control |
WO2017004184A1 (en) | 2015-06-30 | 2017-01-05 | K4Connect Inc. | Home automation system including device signature pairing and related methods |
CN104898634B (en) | 2015-06-30 | 2018-08-07 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CA2994708C (en) | 2015-08-05 | 2020-10-13 | Lutron Electronics Co., Inc. | Commissioning and controlling load control devices |
US10027920B2 (en) | 2015-08-11 | 2018-07-17 | Samsung Electronics Co., Ltd. | Television (TV) as an internet of things (IoT) Participant |
US9915435B2 (en) | 2015-08-21 | 2018-03-13 | Google Llc | Intelligent HVAC control including automatic furnace shutdown event processing |
US9960980B2 (en) | 2015-08-21 | 2018-05-01 | Echostar Technologies International Corporation | Location monitor and device cloning |
US10375150B2 (en) | 2015-09-18 | 2019-08-06 | Google Llc | Crowd-based device trust establishment in a connected environment |
US9996066B2 (en) | 2015-11-25 | 2018-06-12 | Echostar Technologies International Corporation | System and method for HVAC health monitoring using a television receiver |
US9589448B1 (en) | 2015-12-08 | 2017-03-07 | Micro Apps Group Inventions, LLC | Autonomous safety and security device on an unmanned platform under command and control of a cellular phone |
US10101717B2 (en) | 2015-12-15 | 2018-10-16 | Echostar Technologies International Corporation | Home automation data storage system and methods |
US9798309B2 (en) | 2015-12-18 | 2017-10-24 | Echostar Technologies International Corporation | Home automation control based on individual profiling using audio sensor data |
US20170187993A1 (en) | 2015-12-29 | 2017-06-29 | Echostar Technologies L.L.C. | Unmanned aerial vehicle integration with home automation systems |
US10091017B2 (en) | 2015-12-30 | 2018-10-02 | Echostar Technologies International Corporation | Personalized home automation control based on individualized profiling |
US10073428B2 (en) | 2015-12-31 | 2018-09-11 | Echostar Technologies International Corporation | Methods and systems for control of home automation activity based on user characteristics |
US10060644B2 (en) | 2015-12-31 | 2018-08-28 | Echostar Technologies International Corporation | Methods and systems for control of home automation activity based on user preferences |
US9628286B1 (en) | 2016-02-23 | 2017-04-18 | Echostar Technologies L.L.C. | Television receiver and home automation system and methods to associate data with nearby people |
US9882736B2 (en) | 2016-06-09 | 2018-01-30 | Echostar Technologies International Corporation | Remote sound generation for a home automation system |
US10294600B2 (en) | 2016-08-05 | 2019-05-21 | Echostar Technologies International Corporation | Remote detection of washer/dryer operation/fault condition |
US20180061220A1 (en) | 2016-08-24 | 2018-03-01 | Echostar Technologies L.L.C. | Systems and methods for suppressing unwanted home automation notifications |
US10049515B2 (en) | 2016-08-24 | 2018-08-14 | Echostar Technologies International Corporation | Trusted user identification and management for home automation systems |
2014
- 2014-09-02 US US14/475,252 patent/US9900177B2/en active Active
- 2014-09-03 MX MX2016006239A patent/MX362800B/en active IP Right Grant
- 2014-09-03 WO PCT/US2014/053876 patent/WO2015088603A1/en active Application Filing
- 2014-09-03 EP EP14870507.2A patent/EP3080677B1/en active Active
- 2014-09-03 CA CA2930990A patent/CA2930990C/en active Active
- 2014-09-12 WO PCT/US2014/055441 patent/WO2015088608A1/en active Application Filing
- 2014-09-12 US US14/485,188 patent/US20150163535A1/en not_active Abandoned
- 2014-09-12 EP EP14868928.4A patent/EP3080710A4/en not_active Withdrawn
- 2014-09-12 CN CN201480067003.5A patent/CN105814555B/en active Active
- 2014-09-12 CA CA2931007A patent/CA2931007C/en active Active
- 2014-09-12 MX MX2016006589A patent/MX358781B/en active IP Right Grant
- 2014-09-12 US US14/485,038 patent/US9912492B2/en active Active
- 2014-09-12 WO PCT/US2014/055476 patent/WO2015088609A1/en active Application Filing
- 2014-11-25 US US14/553,763 patent/US10027503B2/en active Active
- 2014-12-10 US US14/565,853 patent/US9838736B2/en active Active
- 2014-12-11 US US14/567,502 patent/US20150160635A1/en not_active Abandoned
- 2014-12-11 US US14/566,977 patent/US20150162006A1/en not_active Abandoned
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6931104B1 (en) * | 1996-09-03 | 2005-08-16 | Koninklijke Philips Electronics N.V. | Intelligent call processing platform for home telephone system |
US6107935A (en) * | 1998-02-11 | 2000-08-22 | International Business Machines Corporation | Systems and methods for access filtering employing relaxed recognition constraints |
US6119088A (en) * | 1998-03-03 | 2000-09-12 | Ciluffo; Gary | Appliance control programmer using voice recognition |
US6337899B1 (en) * | 1998-03-31 | 2002-01-08 | International Business Machines Corporation | Speaker verification for authorizing updates to user subscription service received by internet service provider (ISP) using an intelligent peripheral (IP) in an advanced intelligent network (AIN) |
US20020193989A1 (en) * | 1999-05-21 | 2002-12-19 | Michael Geilhufe | Method and apparatus for identifying voice controlled devices |
US6415257B1 (en) * | 1999-08-26 | 2002-07-02 | Matsushita Electric Industrial Co., Ltd. | System for identifying and adapting a TV-user profile by means of speech technology |
US20010012998A1 (en) * | 1999-12-17 | 2001-08-09 | Pierrick Jouet | Voice recognition process and device, associated remote control device |
US7103545B2 (en) * | 2000-08-07 | 2006-09-05 | Shin Caterpillar Mitsubishi Ltd. | Voice-actuated machine body control apparatus for construction machine |
US20030005431A1 (en) * | 2001-07-02 | 2003-01-02 | Sony Corporation | PVR-based system and method for TV content control using voice recognition |
US7260538B2 (en) * | 2002-01-08 | 2007-08-21 | Promptu Systems Corporation | Method and apparatus for voice control of a television control device |
US20040143838A1 (en) * | 2003-01-17 | 2004-07-22 | Mark Rose | Video access management system |
US20050049862A1 (en) * | 2003-09-03 | 2005-03-03 | Samsung Electronics Co., Ltd. | Audio/video apparatus and method for providing personalized services through voice and speaker recognition |
US7529677B1 (en) * | 2005-01-21 | 2009-05-05 | Itt Manufacturing Enterprises, Inc. | Methods and apparatus for remotely processing locally generated commands to control a local device |
US20090271203A1 (en) * | 2008-04-25 | 2009-10-29 | Keith Resch | Voice-activated remote control service |
US20100083371A1 (en) * | 2008-10-01 | 2010-04-01 | Christopher Lee Bennetts | User Access Control System And Method |
US20100131280A1 (en) * | 2008-11-25 | 2010-05-27 | General Electric Company | Voice recognition system for medical devices |
US20120316876A1 (en) * | 2011-06-10 | 2012-12-13 | Seokbok Jang | Display Device, Method for Thereof and Voice Recognition System |
US20130238326A1 (en) * | 2012-03-08 | 2013-09-12 | Lg Electronics Inc. | Apparatus and method for multiple device voice control |
Cited By (309)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD951298S1 (en) | 1991-11-29 | 2022-05-10 | Google Llc | Panel of a voice interface device |
US9612728B2 (en) * | 1999-12-20 | 2017-04-04 | Apple Inc. | Graduated visual and manipulative translucency for windows |
US11671920B2 (en) | 2007-04-03 | 2023-06-06 | Apple Inc. | Method and system for operating a multifunction portable electronic device using voice-activation |
US11348582B2 (en) | 2008-10-02 | 2022-05-31 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US11900936B2 (en) | 2008-10-02 | 2024-02-13 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US11592723B2 (en) | 2009-12-22 | 2023-02-28 | View, Inc. | Automated commissioning of controllers in a window network |
US11423886B2 (en) | 2010-01-18 | 2022-08-23 | Apple Inc. | Task flow identification based on user intent |
US9599981B2 (en) | 2010-02-04 | 2017-03-21 | Echostar Uk Holdings Limited | Electronic appliance status notification via a home entertainment system |
US11073800B2 (en) | 2011-03-16 | 2021-07-27 | View, Inc. | Monitoring sites containing switchable optical devices and controllers |
US11120372B2 (en) | 2011-06-03 | 2021-09-14 | Apple Inc. | Performing actions associated with task items that represent tasks to perform |
US11735183B2 (en) | 2012-04-13 | 2023-08-22 | View, Inc. | Controlling optically-switchable devices |
US10964320B2 (en) | 2012-04-13 | 2021-03-30 | View, Inc. | Controlling optically-switchable devices |
US11687045B2 (en) | 2012-04-13 | 2023-06-27 | View, Inc. | Monitoring sites containing switchable optical devices and controllers |
US11321116B2 (en) | 2012-05-15 | 2022-05-03 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US11557310B2 (en) | 2013-02-07 | 2023-01-17 | Apple Inc. | Voice trigger for a digital assistant |
US11862186B2 (en) | 2013-02-07 | 2024-01-02 | Apple Inc. | Voice trigger for a digital assistant |
US10978090B2 (en) | 2013-02-07 | 2021-04-13 | Apple Inc. | Voice trigger for a digital assistant |
US11636869B2 (en) | 2013-02-07 | 2023-04-25 | Apple Inc. | Voice trigger for a digital assistant |
US11388291B2 (en) | 2013-03-14 | 2022-07-12 | Apple Inc. | System and method for processing voicemail |
US11798547B2 (en) | 2013-03-15 | 2023-10-24 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US11727219B2 (en) | 2013-06-09 | 2023-08-15 | Apple Inc. | System and method for inferring user intent from speech inputs |
US9838736B2 (en) | 2013-12-11 | 2017-12-05 | Echostar Technologies International Corporation | Home automation bubble architecture |
US9900177B2 (en) | 2013-12-11 | 2018-02-20 | Echostar Technologies International Corporation | Maintaining up-to-date home automation models |
US9495860B2 (en) | 2013-12-11 | 2016-11-15 | Echostar Technologies L.L.C. | False alarm identification |
US9772612B2 (en) | 2013-12-11 | 2017-09-26 | Echostar Technologies International Corporation | Home monitoring and control |
US9912492B2 (en) | 2013-12-11 | 2018-03-06 | Echostar Technologies International Corporation | Detection and mitigation of water leaks with home automation |
US10027503B2 (en) | 2013-12-11 | 2018-07-17 | Echostar Technologies International Corporation | Integrated door locking and state detection systems and methods |
US11109098B2 (en) | 2013-12-16 | 2021-08-31 | DISH Technologies L.L.C. | Methods and systems for location specific operations |
US9769522B2 (en) | 2013-12-16 | 2017-09-19 | Echostar Technologies L.L.C. | Methods and systems for location specific operations |
US10200752B2 (en) | 2013-12-16 | 2019-02-05 | DISH Technologies L.L.C. | Methods and systems for location specific operations |
US11381903B2 (en) | 2014-02-14 | 2022-07-05 | Sonic Blocks Inc. | Modular quick-connect A/V system and methods thereof |
US11733660B2 (en) | 2014-03-05 | 2023-08-22 | View, Inc. | Monitoring sites containing switchable optical devices and controllers |
US9723393B2 (en) | 2014-03-28 | 2017-08-01 | Echostar Technologies L.L.C. | Methods to conserve remote batteries |
US10481561B2 (en) * | 2014-04-24 | 2019-11-19 | Vivint, Inc. | Managing home automation system based on behavior |
US10554432B2 (en) * | 2014-05-07 | 2020-02-04 | Vivint, Inc. | Home automation via voice control |
US9860076B2 (en) * | 2014-05-07 | 2018-01-02 | Vivint, Inc. | Home automation via voice control |
US20150324706A1 (en) * | 2014-05-07 | 2015-11-12 | Vivint, Inc. | Home automation via voice control |
US20180176031A1 (en) * | 2014-05-07 | 2018-06-21 | Vivint, Inc. | Home automation via voice control |
US11763663B2 (en) | 2014-05-20 | 2023-09-19 | Ooma, Inc. | Community security monitoring and control |
US11670289B2 (en) | 2014-05-30 | 2023-06-06 | Apple Inc. | Multi-command single utterance input method |
US11133008B2 (en) | 2014-05-30 | 2021-09-28 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US10878809B2 (en) | 2014-05-30 | 2020-12-29 | Apple Inc. | Multi-command single utterance input method |
US11257504B2 (en) | 2014-05-30 | 2022-02-22 | Apple Inc. | Intelligent assistant for home automation |
US11810562B2 (en) | 2014-05-30 | 2023-11-07 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US11699448B2 (en) | 2014-05-30 | 2023-07-11 | Apple Inc. | Intelligent assistant for home automation |
US11838579B2 (en) | 2014-06-30 | 2023-12-05 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US11516537B2 (en) | 2014-06-30 | 2022-11-29 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US11316974B2 (en) * | 2014-07-09 | 2022-04-26 | Ooma, Inc. | Cloud-based assistive services for use in telecommunications and on premise devices |
US11315405B2 (en) | 2014-07-09 | 2022-04-26 | Ooma, Inc. | Systems and methods for provisioning appliance devices |
US11330100B2 (en) * | 2014-07-09 | 2022-05-10 | Ooma, Inc. | Server based intelligent personal assistant services |
US9621959B2 (en) | 2014-08-27 | 2017-04-11 | Echostar Uk Holdings Limited | In-residence track and alert |
US9824578B2 (en) | 2014-09-03 | 2017-11-21 | Echostar Technologies International Corporation | Home automation control using context sensitive menus |
US9989507B2 (en) | 2014-09-25 | 2018-06-05 | Echostar Technologies International Corporation | Detection and prevention of toxic gas |
US20220247743A1 (en) * | 2014-10-03 | 2022-08-04 | Gopro, Inc. | Authenticating a limited input device via an authenticated application |
US20180253960A1 (en) * | 2014-10-08 | 2018-09-06 | Gentex Corporation | Trainable transceiver and method of operation utilizing existing vehicle user interfaces |
US11030888B2 (en) * | 2014-10-08 | 2021-06-08 | Gentex Corporation | Trainable transceiver and method of operation utilizing existing vehicle user interfaces |
US9685157B2 (en) * | 2014-10-16 | 2017-06-20 | Hyundai Motor Company | Vehicle and control method thereof |
US20160111089A1 (en) * | 2014-10-16 | 2016-04-21 | Hyundai Motor Company | Vehicle and control method thereof |
US9983011B2 (en) | 2014-10-30 | 2018-05-29 | Echostar Technologies International Corporation | Mapping and facilitating evacuation routes in emergency situations |
US9511259B2 (en) | 2014-10-30 | 2016-12-06 | Echostar Uk Holdings Limited | Fitness overlay and incorporation for home automation system |
US9977587B2 (en) | 2014-10-30 | 2018-05-22 | Echostar Technologies International Corporation | Fitness overlay and incorporation for home automation system |
US9542083B2 (en) * | 2014-12-04 | 2017-01-10 | Comcast Cable Communications, Llc | Configuration responsive to a device |
US20160164731A1 (en) * | 2014-12-04 | 2016-06-09 | Comcast Cable Communications, Llc | Configuration Responsive to a Device |
US10275214B2 (en) | 2014-12-22 | 2019-04-30 | Intel Corporation | Connected device voice command support |
US9811312B2 (en) * | 2014-12-22 | 2017-11-07 | Intel Corporation | Connected device voice command support |
US9967614B2 (en) | 2014-12-29 | 2018-05-08 | Echostar Technologies International Corporation | Alert suspension for home automation system |
US10990069B1 (en) | 2015-02-19 | 2021-04-27 | State Farm Mutual Automobile Insurance Company | Systems and methods for monitoring building health |
US11644805B1 (en) | 2015-02-19 | 2023-05-09 | State Farm Mutual Automobile Insurance Company | Systems and methods for monitoring building health |
US10970990B1 (en) * | 2015-02-19 | 2021-04-06 | State Farm Mutual Automobile Insurance Company | Systems and methods for monitoring building health |
US11087759B2 (en) | 2015-03-08 | 2021-08-10 | Apple Inc. | Virtual assistant activation |
US11842734B2 (en) | 2015-03-08 | 2023-12-12 | Apple Inc. | Virtual assistant activation |
US10976996B1 (en) * | 2015-03-17 | 2021-04-13 | Amazon Technologies, Inc. | Grouping devices for voice control |
US11422772B1 (en) * | 2015-03-17 | 2022-08-23 | Amazon Technologies, Inc. | Creating scenes from voice-controllable devices |
US11429345B2 (en) * | 2015-03-17 | 2022-08-30 | Amazon Technologies, Inc. | Remote execution of secondary-device drivers |
US10031722B1 (en) * | 2015-03-17 | 2018-07-24 | Amazon Technologies, Inc. | Grouping devices for voice control |
US9984686B1 (en) * | 2015-03-17 | 2018-05-29 | Amazon Technologies, Inc. | Mapping device capabilities to a predefined set |
US20210326103A1 (en) * | 2015-03-17 | 2021-10-21 | Amazon Technologies, Inc. | Grouping Devices for Voice Control |
US10453461B1 (en) * | 2015-03-17 | 2019-10-22 | Amazon Technologies, Inc. | Remote execution of secondary-device drivers |
US9729989B2 (en) | 2015-03-27 | 2017-08-08 | Echostar Technologies L.L.C. | Home automation sound detection and positioning |
US11646974B2 (en) | 2015-05-08 | 2023-05-09 | Ooma, Inc. | Systems and methods for end point data communications anonymization for a communications hub |
US9948477B2 (en) | 2015-05-12 | 2018-04-17 | Echostar Technologies International Corporation | Home automation weather detection |
US9946857B2 (en) | 2015-05-12 | 2018-04-17 | Echostar Technologies International Corporation | Restricted access for home automation system |
US11468282B2 (en) | 2015-05-15 | 2022-10-11 | Apple Inc. | Virtual assistant in a communication session |
US9632746B2 (en) | 2015-05-18 | 2017-04-25 | Echostar Technologies L.L.C. | Automatic muting |
US20180137860A1 (en) * | 2015-05-19 | 2018-05-17 | Sony Corporation | Information processing device, information processing method, and program |
US10861449B2 (en) * | 2015-05-19 | 2020-12-08 | Sony Corporation | Information processing device and information processing method |
US11070949B2 (en) | 2015-05-27 | 2021-07-20 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on an electronic device with a touch-sensitive display |
US11703320B2 (en) | 2015-06-25 | 2023-07-18 | Amazon Technologies, Inc. | Determining relative positions of user devices |
US10655951B1 (en) | 2015-06-25 | 2020-05-19 | Amazon Technologies, Inc. | Determining relative positions of user devices |
US11947873B2 (en) | 2015-06-29 | 2024-04-02 | Apple Inc. | Virtual assistant for media playback |
US11340566B1 (en) | 2015-06-30 | 2022-05-24 | Amazon Technologies, Inc. | Interoperability of secondary-device hubs |
US10365620B1 (en) | 2015-06-30 | 2019-07-30 | Amazon Technologies, Inc. | Interoperability of secondary-device hubs |
US11809150B1 (en) | 2015-06-30 | 2023-11-07 | Amazon Technologies, Inc. | Interoperability of secondary-device hubs |
CN106354023A (en) * | 2015-07-15 | 2017-01-25 | 腾讯科技(深圳)有限公司 | Method for controlling terminal device by mobile terminal, mobile terminal and system |
US10511457B2 (en) * | 2015-07-15 | 2019-12-17 | Tencent Technology (Shenzhen) Company Limited | Method, intelligent device, and system for controlling terminal device |
US20170053210A1 (en) * | 2015-08-17 | 2017-02-23 | Ton Duc Thang University | Smart home system |
US20170052514A1 (en) * | 2015-08-17 | 2017-02-23 | Ton Duc Thang University | Method and computer software program for a smart home system |
US9960980B2 (en) | 2015-08-21 | 2018-05-01 | Echostar Technologies International Corporation | Location monitor and device cloning |
US10896671B1 (en) | 2015-08-21 | 2021-01-19 | Soundhound, Inc. | User-defined extensions of the command input recognized by a virtual assistant |
US11853536B2 (en) | 2015-09-08 | 2023-12-26 | Apple Inc. | Intelligent automated assistant in a media environment |
US11550542B2 (en) | 2015-09-08 | 2023-01-10 | Apple Inc. | Zero latency digital assistant |
US11809483B2 (en) | 2015-09-08 | 2023-11-07 | Apple Inc. | Intelligent automated assistant for media search and playback |
US11126400B2 (en) | 2015-09-08 | 2021-09-21 | Apple Inc. | Zero latency digital assistant |
US11500672B2 (en) | 2015-09-08 | 2022-11-15 | Apple Inc. | Distributed personal assistant |
US10018977B2 (en) * | 2015-10-05 | 2018-07-10 | Savant Systems, Llc | History-based key phrase suggestions for voice control of a home automation system |
US10650825B2 (en) * | 2015-10-23 | 2020-05-12 | Sharp Kabushiki Kaisha | Communication device |
US20180286407A1 (en) * | 2015-10-23 | 2018-10-04 | Sharp Kabushiki Kaisha | Communication device |
EP3913898A1 (en) * | 2015-11-06 | 2021-11-24 | Google LLC | Voice commands across devices |
US11749266B2 (en) | 2015-11-06 | 2023-09-05 | Google Llc | Voice commands across devices |
US11526368B2 (en) | 2015-11-06 | 2022-12-13 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11809886B2 (en) | 2015-11-06 | 2023-11-07 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11886805B2 (en) | 2015-11-09 | 2024-01-30 | Apple Inc. | Unconventional virtual assistant interactions |
US9996066B2 (en) | 2015-11-25 | 2018-06-12 | Echostar Technologies International Corporation | System and method for HVAC health monitoring using a television receiver |
US10101717B2 (en) | 2015-12-15 | 2018-10-16 | Echostar Technologies International Corporation | Home automation data storage system and methods |
US9798309B2 (en) | 2015-12-18 | 2017-10-24 | Echostar Technologies International Corporation | Home automation control based on individual profiling using audio sensor data |
US11853647B2 (en) | 2015-12-23 | 2023-12-26 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10825454B1 (en) * | 2015-12-28 | 2020-11-03 | Amazon Technologies, Inc. | Naming devices via voice commands |
US11942085B1 (en) | 2015-12-28 | 2024-03-26 | Amazon Technologies, Inc. | Naming devices via voice commands |
US10091017B2 (en) | 2015-12-30 | 2018-10-02 | Echostar Technologies International Corporation | Personalized home automation control based on individualized profiling |
US10073428B2 (en) | 2015-12-31 | 2018-09-11 | Echostar Technologies International Corporation | Methods and systems for control of home automation activity based on user characteristics |
US10060644B2 (en) | 2015-12-31 | 2018-08-28 | Echostar Technologies International Corporation | Methods and systems for control of home automation activity based on user preferences |
US11355104B2 (en) * | 2016-02-02 | 2022-06-07 | Amazon Technologies, Inc. | Post-speech recognition request surplus detection and prevention |
US20200013397A1 (en) * | 2016-02-12 | 2020-01-09 | Amazon Technologies, Inc. | Processing spoken commands to control distributed audio outputs |
US10262657B1 (en) * | 2016-02-12 | 2019-04-16 | Amazon Technologies, Inc. | Processing spoken commands to control distributed audio outputs |
US9898250B1 (en) * | 2016-02-12 | 2018-02-20 | Amazon Technologies, Inc. | Controlling distributed audio outputs to enable voice output |
US9858927B2 (en) * | 2016-02-12 | 2018-01-02 | Amazon Technologies, Inc. | Processing spoken commands to control distributed audio outputs |
US10878815B2 (en) * | 2016-02-12 | 2020-12-29 | Amazon Technologies, Inc. | Processing spoken commands to control distributed audio outputs |
US9628286B1 (en) | 2016-02-23 | 2017-04-18 | Echostar Technologies L.L.C. | Television receiver and home automation system and methods to associate data with nearby people |
US10714081B1 (en) | 2016-03-07 | 2020-07-14 | Amazon Technologies, Inc. | Dynamic voice assistant interaction |
WO2017160232A1 (en) * | 2016-03-16 | 2017-09-21 | Forth Tv Pte Ltd | Apparatus for assistive communication |
US20170301353A1 (en) * | 2016-04-15 | 2017-10-19 | Sensory, Incorporated | Unobtrusive training for speaker verification |
CN107452384A (en) * | 2016-04-15 | 2017-12-08 | Sensory, Incorporated | Apparatus, media, and methods for non-intrusive training for speaker verification |
CN107452384B (en) * | 2016-04-15 | 2021-02-05 | Sensory, Incorporated | Apparatus, media, and methods for non-intrusive training for speaker verification |
US10152974B2 (en) * | 2016-04-15 | 2018-12-11 | Sensory, Incorporated | Unobtrusive training for speaker verification |
AU2017257789B2 (en) * | 2016-04-26 | 2022-06-30 | View, Inc. | Controlling optically-switchable devices |
EP3449341A4 (en) * | 2016-04-26 | 2019-12-04 | View, Inc. | Controlling optically-switchable devices |
US20210297722A1 (en) * | 2016-04-28 | 2021-09-23 | Ecolink Intelligent Technology, Inc. | Systems, methods and apparatus for interacting with a security system using a television remote control |
US11032599B2 (en) * | 2016-04-28 | 2021-06-08 | Ecolink Intelligent Technology, Inc. | Systems, methods and apparatus for interacting with a security system using a television remote control |
US10362350B2 (en) * | 2016-04-28 | 2019-07-23 | Ecolink Intelligent Technology, Inc. | Systems, methods and apparatus for interacting with a security system using a television remote control |
WO2017189134A1 (en) * | 2016-04-28 | 2017-11-02 | Ecolink Intelligent Technology, Inc. | Systems, methods and apparatus for interacting with a security system using a television remote control |
US11831940B2 (en) * | 2016-04-28 | 2023-11-28 | Ecolink Intelligent Technology, Inc. | Systems, methods and apparatus for interacting with a security system using a television remote control |
US20190306558A1 (en) * | 2016-04-28 | 2019-10-03 | Ecolink Intelligent Technology, Inc. | Systems, methods and apparatus for interacting with a security system using a television remote control |
US10319371B2 (en) * | 2016-05-04 | 2019-06-11 | GM Global Technology Operations LLC | Disambiguation of vehicle speech commands |
US20170323635A1 (en) * | 2016-05-04 | 2017-11-09 | GM Global Technology Operations LLC | Disambiguation of vehicle speech commands |
US10332516B2 (en) | 2016-05-10 | 2019-06-25 | Google Llc | Media transfer among media output devices |
US10861461B2 (en) | 2016-05-10 | 2020-12-08 | Google Llc | LED design language for visual affordance of voice user interfaces |
US10235997B2 (en) * | 2016-05-10 | 2019-03-19 | Google Llc | Voice-controlled closed caption display |
US10304450B2 (en) | 2016-05-10 | 2019-05-28 | Google Llc | LED design language for visual affordance of voice user interfaces |
US11341964B2 (en) | 2016-05-10 | 2022-05-24 | Google Llc | Voice-controlled media play in smart media environment |
US11935535B2 (en) | 2016-05-10 | 2024-03-19 | Google Llc | Implementations for voice assistant on devices |
US11922941B2 (en) | 2016-05-10 | 2024-03-05 | Google Llc | Implementations for voice assistant on devices |
US11355116B2 (en) | 2016-05-10 | 2022-06-07 | Google Llc | Implementations for voice assistant on devices |
US10535343B2 (en) | 2016-05-10 | 2020-01-14 | Google Llc | Implementations for voice assistant on devices |
US11860933B2 (en) | 2016-05-13 | 2024-01-02 | Google Llc | Personalized and contextualized audio briefing |
US10402450B2 (en) | 2016-05-13 | 2019-09-03 | Google Llc | Personalized and contextualized audio briefing |
USD885436S1 (en) | 2016-05-13 | 2020-05-26 | Google Llc | Panel of a voice interface device |
USD927550S1 (en) | 2016-05-13 | 2021-08-10 | Google Llc | Voice interface device |
USD979602S1 (en) | 2016-05-13 | 2023-02-28 | Google Llc | Panel of a voice interface device |
US10163437B1 (en) * | 2016-06-02 | 2018-12-25 | Amazon Technologies, Inc. | Training models using voice tags |
US9882736B2 (en) | 2016-06-09 | 2018-01-30 | Echostar Technologies International Corporation | Remote sound generation for a home automation system |
US11037565B2 (en) | 2016-06-10 | 2021-06-15 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US11657820B2 (en) | 2016-06-10 | 2023-05-23 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US11152002B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Application integration with a digital assistant |
US11749275B2 (en) | 2016-06-11 | 2023-09-05 | Apple Inc. | Application integration with a digital assistant |
US11809783B2 (en) | 2016-06-11 | 2023-11-07 | Apple Inc. | Intelligent device arbitration and control |
US10931999B1 (en) * | 2016-06-27 | 2021-02-23 | Amazon Technologies, Inc. | Systems and methods for routing content to an associated output device |
US10271093B1 (en) | 2016-06-27 | 2019-04-23 | Amazon Technologies, Inc. | Systems and methods for routing content to an associated output device |
US11244687B2 (en) | 2016-07-06 | 2022-02-08 | Pcms Holdings, Inc. | System and method for customizing smart home speech interfaces using personalized speech profiles |
US10880308B1 (en) * | 2016-07-20 | 2020-12-29 | Vivint, Inc. | Integrated system component and electronic device |
US10294600B2 (en) | 2016-08-05 | 2019-05-21 | Echostar Technologies International Corporation | Remote detection of washer/dryer operation/fault condition |
EP3502807A4 (en) * | 2016-08-18 | 2019-08-14 | Beijing VRV Software Corporation Limited | Method and apparatus for assisting human-computer interaction |
CN107765838A (en) * | 2016-08-18 | 2018-03-06 | Beijing VRV Software Corporation Limited | Human-computer interaction assistance method and apparatus |
US10880284B1 (en) * | 2016-08-19 | 2020-12-29 | Amazon Technologies, Inc. | Repurposing limited functionality devices as authentication factors |
US10049515B2 (en) | 2016-08-24 | 2018-08-14 | Echostar Technologies International Corporation | Trusted user identification and management for home automation systems |
US20190348036A1 (en) * | 2016-09-29 | 2019-11-14 | Intel IP Corporation | Context-aware query recognition for electronic devices |
US20180174581A1 (en) * | 2016-12-19 | 2018-06-21 | Pilot, Inc. | Voice-activated vehicle lighting control hub |
US11908445B2 (en) * | 2016-12-30 | 2024-02-20 | Google Llc | Conversation-aware proactive notifications for a voice interface device |
US20220277727A1 (en) * | 2016-12-30 | 2022-09-01 | Google Llc | Conversation-aware proactive notifications for a voice interface device |
US11656884B2 (en) | 2017-01-09 | 2023-05-23 | Apple Inc. | Application integration with a digital assistant |
US11355111B2 (en) * | 2017-01-24 | 2022-06-07 | Honeywell International Inc. | Voice control of an integrated room automation system |
CN106707788A (en) * | 2017-03-09 | 2017-05-24 | 上海电器科学研究院 | Intelligent and household voice control and recognition system |
WO2018175201A1 (en) * | 2017-03-21 | 2018-09-27 | Amplivy, Inc. | Content-activated intelligent, autonomous audio/video source controller |
US10129594B2 (en) | 2017-03-21 | 2018-11-13 | Amplivy, Inc. | Content-activated intelligent, autonomous audio/video source controller |
US10621980B2 (en) * | 2017-03-21 | 2020-04-14 | Harman International Industries, Inc. | Execution of voice commands in a multi-device system |
US20180277107A1 (en) * | 2017-03-21 | 2018-09-27 | Harman International Industries, Inc. | Execution of voice commands in a multi-device system |
US10972556B1 (en) * | 2017-03-22 | 2021-04-06 | Amazon Technologies, Inc. | Location-based functionality for voice-capturing devices |
US20180308483A1 (en) * | 2017-04-21 | 2018-10-25 | Lg Electronics Inc. | Voice recognition apparatus and voice recognition method |
US11158317B2 (en) * | 2017-05-08 | 2021-10-26 | Signify Holding B.V. | Methods, systems and apparatus for voice control of a utility |
US11467802B2 (en) | 2017-05-11 | 2022-10-11 | Apple Inc. | Maintaining privacy of personal information |
US11599331B2 (en) | 2017-05-11 | 2023-03-07 | Apple Inc. | Maintaining privacy of personal information |
US11837237B2 (en) | 2017-05-12 | 2023-12-05 | Apple Inc. | User-specific acoustic models |
US11580990B2 (en) | 2017-05-12 | 2023-02-14 | Apple Inc. | User-specific acoustic models |
US11830333B2 (en) * | 2017-05-12 | 2023-11-28 | Google Llc | Systems, methods, and devices for activity monitoring via a home assistant |
US11862151B2 (en) | 2017-05-12 | 2024-01-02 | Apple Inc. | Low-latency intelligent automated assistant |
US11538469B2 (en) | 2017-05-12 | 2022-12-27 | Apple Inc. | Low-latency intelligent automated assistant |
US11380310B2 (en) | 2017-05-12 | 2022-07-05 | Apple Inc. | Low-latency intelligent automated assistant |
US20230031831A1 (en) * | 2017-05-12 | 2023-02-02 | Google Llc | Systems, methods, and devices for activity monitoring via a home assistant |
US11405466B2 (en) | 2017-05-12 | 2022-08-02 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US11217255B2 (en) | 2017-05-16 | 2022-01-04 | Apple Inc. | Far-field extension for digital assistant services |
US11675829B2 (en) | 2017-05-16 | 2023-06-13 | Apple Inc. | Intelligent automated assistant for media exploration |
US20190172467A1 (en) * | 2017-05-16 | 2019-06-06 | Apple Inc. | Far-field extension for digital assistant services |
US10748546B2 (en) * | 2017-05-16 | 2020-08-18 | Apple Inc. | Digital assistant services based on device capabilities |
US11532306B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Detecting a trigger of a digital assistant |
US20180358013A1 (en) * | 2017-06-13 | 2018-12-13 | Hyundai Motor Company | Apparatus for selecting at least one task based on voice command, vehicle including the same, and method thereof |
US10431221B2 (en) * | 2017-06-13 | 2019-10-01 | Hyundai Motor Company | Apparatus for selecting at least one task based on voice command, vehicle including the same, and method thereof |
US10607606B2 (en) * | 2017-06-19 | 2020-03-31 | Lenovo (Singapore) Pte. Ltd. | Systems and methods for execution of digital assistant |
US20180366116A1 (en) * | 2017-06-19 | 2018-12-20 | Lenovo (Singapore) Pte. Ltd. | Systems and methods for execution of digital assistant |
US11238870B2 (en) * | 2017-07-05 | 2022-02-01 | Alibaba Group Holding Limited | Interaction method, electronic device, and server |
US10887125B2 (en) * | 2017-09-15 | 2021-01-05 | Kohler Co. | Bathroom speaker |
US11099540B2 (en) | 2017-09-15 | 2021-08-24 | Kohler Co. | User identity in household appliances |
US11949533B2 (en) | 2017-09-15 | 2024-04-02 | Kohler Co. | Sink device |
US11314215B2 (en) | 2017-09-15 | 2022-04-26 | Kohler Co. | Apparatus controlling bathroom appliance lighting based on user identity |
US11314214B2 (en) | 2017-09-15 | 2022-04-26 | Kohler Co. | Geographic analysis of water conditions |
US11921794B2 (en) | 2017-09-15 | 2024-03-05 | Kohler Co. | Feedback for water consuming appliance |
US20190089550A1 (en) * | 2017-09-15 | 2019-03-21 | Kohler Co. | Bathroom speaker |
US11093554B2 (en) | 2017-09-15 | 2021-08-17 | Kohler Co. | Feedback for water consuming appliance |
US11892811B2 (en) | 2017-09-15 | 2024-02-06 | Kohler Co. | Geographic analysis of water conditions |
US10531157B1 (en) * | 2017-09-21 | 2020-01-07 | Amazon Technologies, Inc. | Presentation and management of audio and visual content across devices |
US11758232B2 (en) | 2017-09-21 | 2023-09-12 | Amazon Technologies, Inc. | Presentation and management of audio and visual content across devices |
US11521609B2 (en) * | 2017-09-28 | 2022-12-06 | Kyocera Corporation | Voice command system and voice command method |
US10715604B1 (en) * | 2017-10-26 | 2020-07-14 | Amazon Technologies, Inc. | Remote system processing based on a previously identified user |
US11627189B2 (en) * | 2017-10-26 | 2023-04-11 | Amazon Technologies, Inc. | Performing an action based on secondary user authorization |
US10567515B1 (en) | 2017-10-26 | 2020-02-18 | Amazon Technologies, Inc. | Speech processing performed with respect to first and second user profiles in a dialog session |
US10778674B2 (en) | 2018-01-30 | 2020-09-15 | D&M Holdings, Inc. | Voice authentication and setup for wireless media rendering system |
US11056119B2 (en) | 2018-03-08 | 2021-07-06 | Frontive, Inc. | Methods and systems for speech signal processing |
US10460734B2 (en) * | 2018-03-08 | 2019-10-29 | Frontive, Inc. | Methods and systems for speech signal processing |
US10909990B2 (en) | 2018-03-08 | 2021-02-02 | Frontive, Inc. | Methods and systems for speech signal processing |
GB2572175A (en) * | 2018-03-21 | 2019-09-25 | Emotech Ltd | Processing a command |
GB2572175B (en) * | 2018-03-21 | 2022-10-12 | Emotech Ltd | Processing a command |
US11710482B2 (en) | 2018-03-26 | 2023-07-25 | Apple Inc. | Natural assistant interaction |
WO2019204196A1 (en) * | 2018-04-16 | 2019-10-24 | The Chamberlain Group, Inc. | Systems and methods for voice-activated control of an access control platform |
US11600124B2 (en) | 2018-04-16 | 2023-03-07 | The Chamberlain Group Llc | Systems and methods for voice-activated control of an access control platform |
US11010999B2 (en) | 2018-04-16 | 2021-05-18 | The Chamberlain Group, Inc. | Systems and methods for voice-activated control of an access control platform |
US11169616B2 (en) | 2018-05-07 | 2021-11-09 | Apple Inc. | Raise to speak |
US11487364B2 (en) | 2018-05-07 | 2022-11-01 | Apple Inc. | Raise to speak |
US11907436B2 (en) | 2018-05-07 | 2024-02-20 | Apple Inc. | Raise to speak |
US11854539B2 (en) | 2018-05-07 | 2023-12-26 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US11900923B2 (en) | 2018-05-07 | 2024-02-13 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US10938830B2 (en) | 2018-05-08 | 2021-03-02 | International Business Machines Corporation | Authorizing and nullifying commands issued to virtual assistants in an internet of things (IoT) computing environment based on hierarchal user access levels |
US10325596B1 (en) * | 2018-05-25 | 2019-06-18 | Bao Tran | Voice control of appliances |
US10902852B2 (en) * | 2018-05-25 | 2021-01-26 | Bao Tran | Voice controlled appliance |
US11657815B2 (en) * | 2018-05-25 | 2023-05-23 | Bao Tran | Voice control system |
US20210110827A1 (en) * | 2018-05-25 | 2021-04-15 | Bao Tran | Voice Appliance |
TWI691893B (en) * | 2018-05-30 | 2020-04-21 | 大陸商出門問問信息科技有限公司 | A method and an apparatus for continuously broadcasting audio data |
US11360577B2 (en) | 2018-06-01 | 2022-06-14 | Apple Inc. | Attention aware virtual assistant dismissal |
US11630525B2 (en) | 2018-06-01 | 2023-04-18 | Apple Inc. | Attention aware virtual assistant dismissal |
US10984798B2 (en) | 2018-06-01 | 2021-04-20 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US11009970B2 (en) | 2018-06-01 | 2021-05-18 | Apple Inc. | Attention aware virtual assistant dismissal |
US11431642B2 (en) | 2018-06-01 | 2022-08-30 | Apple Inc. | Variable latency device coordination |
US11545158B2 (en) * | 2018-06-27 | 2023-01-03 | Samsung Electronics Co., Ltd. | Electronic apparatus, method for controlling mobile apparatus by electronic apparatus and computer readable recording medium |
EP3776540A4 (en) * | 2018-06-27 | 2021-06-02 | Samsung Electronics Co., Ltd. | Electronic apparatus, method for controlling mobile apparatus by electronic apparatus and computer readable recording medium |
EP3776540A1 (en) * | 2018-06-27 | 2021-02-17 | Samsung Electronics Co., Ltd. | Electronic apparatus, method for controlling mobile apparatus by electronic apparatus and computer readable recording medium |
US11417338B2 (en) | 2018-08-02 | 2022-08-16 | Samsung Electronics Co., Ltd. | Electronic apparatus and control method for controlling a device in an Internet of Things |
WO2020027559A1 (en) * | 2018-08-02 | 2020-02-06 | Samsung Electronics Co., Ltd. | Electronic apparatus and control method thereof |
US11893992B2 (en) | 2018-09-28 | 2024-02-06 | Apple Inc. | Multi-modal inputs for voice commands |
US11289100B2 (en) * | 2018-10-08 | 2022-03-29 | Google Llc | Selective enrollment with an automated assistant |
US11238294B2 (en) | 2018-10-08 | 2022-02-01 | Google Llc | Enrollment with an automated assistant |
US11238142B2 (en) | 2018-10-08 | 2022-02-01 | Google Llc | Enrollment with an automated assistant |
US11704940B2 (en) | 2018-10-08 | 2023-07-18 | Google Llc | Enrollment with an automated assistant |
US11627012B2 (en) | 2018-10-09 | 2023-04-11 | NewTekSol, LLC | Home automation management system |
US10770071B2 (en) * | 2018-11-15 | 2020-09-08 | Motorola Mobility Llc | Electronic device with voice process control and corresponding methods |
US20200160857A1 (en) * | 2018-11-15 | 2020-05-21 | Motorola Mobility Llc | Electronic Device with Voice Process Control and Corresponding Methods |
US11074914B2 (en) * | 2019-03-08 | 2021-07-27 | Rovi Guides, Inc. | Automated query detection in interactive content |
US11677479B2 (en) | 2019-03-08 | 2023-06-13 | Rovi Guides, Inc. | Frequency pairing for device synchronization |
US11522619B2 (en) | 2019-03-08 | 2022-12-06 | Rovi Guides, Inc. | Frequency pairing for device synchronization |
US11822601B2 (en) | 2019-03-15 | 2023-11-21 | Spotify Ab | Ensemble-based data comparison |
US11783815B2 (en) | 2019-03-18 | 2023-10-10 | Apple Inc. | Multimodality in digital assistant systems |
US11170783B2 (en) | 2019-04-16 | 2021-11-09 | At&T Intellectual Property I, L.P. | Multi-agent input coordination |
US11664032B2 (en) | 2019-04-16 | 2023-05-30 | At&T Intellectual Property I, L.P. | Multi-agent input coordination |
US11217251B2 (en) | 2019-05-06 | 2022-01-04 | Apple Inc. | Spoken notifications |
US11675491B2 (en) | 2019-05-06 | 2023-06-13 | Apple Inc. | User configurable task triggers |
US11705130B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | Spoken notifications |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US10956123B2 (en) | 2019-05-08 | 2021-03-23 | Rovi Guides, Inc. | Device and query management system |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11888791B2 (en) | 2019-05-21 | 2024-01-30 | Apple Inc. | Providing message response suggestions |
US11360739B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User activity shortcut suggestions |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11237797B2 (en) | 2019-05-31 | 2022-02-01 | Apple Inc. | User activity shortcut suggestions |
US11657813B2 (en) | 2019-05-31 | 2023-05-23 | Apple Inc. | Voice identification in digital assistant systems |
US11790914B2 (en) | 2019-06-01 | 2023-10-17 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11551678B2 (en) | 2019-08-30 | 2023-01-10 | Spotify Ab | Systems and methods for generating a cleaned version of ambient sound |
US11094319B2 (en) | 2019-08-30 | 2021-08-17 | Spotify Ab | Systems and methods for generating a cleaned version of ambient sound |
US10827028B1 (en) | 2019-09-05 | 2020-11-03 | Spotify Ab | Systems and methods for playing media content on a target device |
US20210073330A1 (en) * | 2019-09-11 | 2021-03-11 | International Business Machines Corporation | Creating an executable process from a text description written in a natural language |
US11681873B2 (en) * | 2019-09-11 | 2023-06-20 | International Business Machines Corporation | Creating an executable process from a text description written in a natural language |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
US20210224910A1 (en) * | 2020-01-21 | 2021-07-22 | S&P Global | Virtual reality system for analyzing financial risk |
US11861713B2 (en) * | 2020-01-21 | 2024-01-02 | S&P Global Inc. | Virtual reality system for analyzing financial risk |
US11810564B2 (en) | 2020-02-11 | 2023-11-07 | Spotify Ab | Dynamic adjustment of wake word acceptance tolerance thresholds in voice-controlled devices |
US11328722B2 (en) | 2020-02-11 | 2022-05-10 | Spotify Ab | Systems and methods for generating a singular voice audio stream |
US11308959B2 (en) | 2020-02-11 | 2022-04-19 | Spotify Ab | Dynamic adjustment of wake word acceptance tolerance thresholds in voice-controlled devices |
US11914848B2 (en) | 2020-05-11 | 2024-02-27 | Apple Inc. | Providing relevant data items based on context |
US11765209B2 (en) | 2020-05-11 | 2023-09-19 | Apple Inc. | Digital assistant hardware abstraction |
US11810578B2 (en) * | 2020-05-11 | 2023-11-07 | Apple Inc. | Device arbitration for digital assistant-based intercom systems |
US20210350810A1 (en) * | 2020-05-11 | 2021-11-11 | Apple Inc. | Device arbitration for digital assistant-based intercom systems |
US11924254B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Digital assistant hardware abstraction |
US11899566B1 (en) | 2020-05-15 | 2024-02-13 | Google Llc | Training and/or using machine learning model(s) for automatic generation of test case(s) for source code |
EP3929885A1 (en) * | 2020-06-26 | 2021-12-29 | GIRA GIERSIEPEN GmbH & Co. KG | Method for building automation |
US11838734B2 (en) | 2020-07-20 | 2023-12-05 | Apple Inc. | Multi-device audio adjustment coordination |
US11696060B2 (en) | 2020-07-21 | 2023-07-04 | Apple Inc. | User identification using headphones |
US11750962B2 (en) | 2020-07-21 | 2023-09-05 | Apple Inc. | User identification using headphones |
US11960789B2 (en) | 2021-02-17 | 2024-04-16 | Rovi Guides, Inc. | Device and query management system |
US20230070082A1 (en) * | 2021-07-26 | 2023-03-09 | LifePod Solutions, Inc. | Systems and methods for managing voice environments and voice routines |
US11804215B1 (en) * | 2022-04-29 | 2023-10-31 | Apple Inc. | Sonic responses |
US20230352007A1 (en) * | 2022-04-29 | 2023-11-02 | Apple Inc. | Sonic responses |
US11641505B1 (en) * | 2022-06-13 | 2023-05-02 | Roku, Inc. | Speaker-identification model for controlling operation of a media player |
US11954405B2 (en) | 2022-11-07 | 2024-04-09 | Apple Inc. | Zero latency digital assistant |
Also Published As
Publication number | Publication date |
---|---|
CN105814555B (en) | 2019-07-19 |
US20150160663A1 (en) | 2015-06-11 |
EP3080677A4 (en) | 2017-07-26 |
CA2931007A1 (en) | 2015-06-18 |
WO2015088608A1 (en) | 2015-06-18 |
CN105814555A (en) | 2016-07-27 |
WO2015088609A1 (en) | 2015-06-18 |
MX362800B (en) | 2019-02-13 |
US20150160635A1 (en) | 2015-06-11 |
EP3080710A4 (en) | 2017-08-09 |
CA2930990A1 (en) | 2015-06-18 |
US20150160634A1 (en) | 2015-06-11 |
MX2016006239A (en) | 2016-09-07 |
US20150159401A1 (en) | 2015-06-11 |
US10027503B2 (en) | 2018-07-17 |
US9838736B2 (en) | 2017-12-05 |
EP3080677B1 (en) | 2019-01-09 |
CA2930990C (en) | 2023-12-05 |
EP3080710A1 (en) | 2016-10-19 |
CA2931007C (en) | 2022-03-15 |
MX2016006589A (en) | 2016-09-06 |
US9912492B2 (en) | 2018-03-06 |
US20150160623A1 (en) | 2015-06-11 |
EP3080677A1 (en) | 2016-10-19 |
WO2015088603A1 (en) | 2015-06-18 |
US9900177B2 (en) | 2018-02-20 |
MX358781B (en) | 2018-09-04 |
US20150163535A1 (en) | 2015-06-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150162006A1 (en) | Voice-recognition home automation system for speaker-dependent commands | |
US9632746B2 (en) | Automatic muting | |
US9798309B2 (en) | Home automation control based on individual profiling using audio sensor data | |
US9960980B2 (en) | Location monitor and device cloning | |
US10091017B2 (en) | Personalized home automation control based on individualized profiling | |
US9495860B2 (en) | False alarm identification | |
US10073428B2 (en) | Methods and systems for control of home automation activity based on user characteristics | |
US10060644B2 (en) | Methods and systems for control of home automation activity based on user preferences | |
US9628286B1 (en) | Television receiver and home automation system and methods to associate data with nearby people | |
US20170064412A1 (en) | Device-based event detection and notification surfacing | |
US9948477B2 (en) | Home automation weather detection | |
US9983011B2 (en) | Mapping and facilitating evacuation routes in emergency situations | |
US20160182249A1 (en) | Event-based audio/video feed selection | |
US20160191912A1 (en) | Home occupancy simulation mode selection and implementation | |
US11395030B2 (en) | Premises automation control | |
US20160203700A1 (en) | Methods and systems to make changes in home automation based on user states | |
US9704537B2 (en) | Methods and systems for coordinating home automation activity | |
US9967614B2 (en) | Alert suspension for home automation system | |
US11659225B2 (en) | Systems and methods for targeted television commercials based on viewer presence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ECHOSTAR TECHNOLOGIES L.L.C., COLORADO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KUMMER, DAVID A;REEL/FRAME:036163/0612 Effective date: 20150708 |
AS | Assignment |
Owner name: ECHOSTAR TECHNOLOGIES INTERNATIONAL CORPORATION, COLORADO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ECHOSTAR TECHNOLOGIES L.L.C.;REEL/FRAME:041735/0861 Effective date: 20170214 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |