US20130135198A1 - Electronic Devices With Gaze Detection Capabilities - Google Patents

Electronic Devices With Gaze Detection Capabilities

Info

Publication number
US20130135198A1
Authority
US
United States
Prior art keywords
electronic device
user
display
gaze detection
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/750,877
Inventor
Andrew Hodge
Michael Rosenblatt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc
Priority to US13/750,877
Publication of US20130135198A1
Priority to US14/157,909 (US10025380B2)
Current legal status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 - Eye tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26 - Power supply means, e.g. regulation thereof
    • G06F 1/32 - Means for saving power
    • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3206 - Monitoring of events, devices or parameters that trigger a change in power modality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26 - Power supply means, e.g. regulation thereof
    • G06F 1/32 - Means for saving power
    • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3234 - Power saving characterised by the action undertaken
    • G06F 1/325 - Power saving in peripheral device
    • G06F 1/3265 - Power saving in display device
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/443 - OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N 21/4436 - Power management, e.g. shutting down unused components of the receiver
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2330/00 - Aspects of power supply; Aspects of display protection and defect management
    • G09G 2330/02 - Details of power systems and of start or stop of display operation
    • G09G 2330/021 - Power management, e.g. power saving
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 52/00 - Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W 52/02 - Power saving arrangements
    • H04W 52/0209 - Power saving arrangements in terminal devices
    • H04W 52/0261 - Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level
    • H04W 52/0267 - Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level, by controlling user interface components
    • H04W 52/027 - Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level, by controlling a display operation or backlight unit
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 - Reducing energy consumption in communication networks
    • Y02D 30/70 - Reducing energy consumption in wireless communication networks

Definitions

  • This invention relates generally to electronic devices, and more particularly, to electronic devices such as portable electronic devices that have gaze detection capabilities.
  • Electronic devices such as portable electronic devices are becoming increasingly popular. Examples of portable devices include handheld computers, cellular telephones, media players, and hybrid devices that include the functionality of multiple devices of this type. Popular portable electronic devices that are somewhat larger than traditional handheld electronic devices include laptop computers and tablet computers.
  • An electronic device with a small battery has limited battery capacity. Unless care is taken to consume power wisely, an electronic device with a small battery may exhibit unacceptably short battery life. Techniques for reducing power consumption may be particularly important in wireless devices that support cellular telephone communications, because users of cellular telephone devices often demand long “talk” times.
  • An electronic device may have gaze detection capabilities.
  • One or more gaze detection sensors such as a camera may be used by the electronic device to determine whether a user's gaze is directed towards the electronic device (e.g., whether the user of the electronic device is looking at the electronic device).
  • The electronic device may use gaze detection sensors to determine whether or not the user is looking at a display portion of the electronic device.
  • The electronic device may have power management capabilities that are used to help conserve power.
  • The electronic device may operate in two or more operating modes. One operating mode may be used to optimize performance. Another operating mode may help to extend battery life.
  • The electronic device may use results from gaze detection operations to determine an appropriate mode in which to operate the electronic device.
  • The electronic device may operate in an active mode when the electronic device determines, using gaze detection sensors, that the user's gaze is directed towards the electronic device and may operate in one or more standby modes when the device determines that the user's gaze is not directed towards the electronic device.
  • In the standby modes, circuitry and components such as a display screen, touch screen components, gaze detection components, and a central processing unit (CPU) in the electronic device may be powered down or operated in a low-power mode to minimize power consumption in the electronic device.
  • When the electronic device is in the active mode and detects that the user has looked away from the device, the electronic device may dim or turn off a display screen. If desired, the electronic device can dim the display screen to a standby brightness level after the device has determined that the user has looked away from the device. After a given period of time has elapsed in which no user input has been received by the electronic device, the electronic device can turn off the display screen to conserve power.
  • When the electronic device detects that the user's gaze has returned to the device, the electronic device may enter the active mode and return the display screen to an active brightness level (e.g., turn on the display screen or brighten the display screen to the active brightness level).
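  • The mode transitions described above amount to a small state machine driven by gaze detection and an inactivity timer. The Python sketch below is illustrative only; the class and method names (e.g., `GazePowerManager`, `display.set_brightness`) and the timeout value are assumptions, not details taken from the patent.

```python
import time
from enum import Enum, auto

class PowerMode(Enum):
    ACTIVE = auto()           # display at the active brightness level
    PARTIAL_STANDBY = auto()  # display dimmed to the standby brightness level
    FULL_STANDBY = auto()     # display turned off

class GazePowerManager:
    """Illustrative gaze-driven display power management (hypothetical API)."""

    def __init__(self, display, inactivity_timeout_s=30.0):
        self.display = display
        self.inactivity_timeout_s = inactivity_timeout_s
        self.mode = PowerMode.ACTIVE
        self._gaze_lost_at = None

    def update(self, gaze_detected: bool, user_input_received: bool) -> None:
        now = time.monotonic()
        if gaze_detected or user_input_received:
            # The user is looking at (or using) the device: return to active mode.
            self.mode = PowerMode.ACTIVE
            self._gaze_lost_at = None
            self.display.set_brightness("active")
            return
        if self._gaze_lost_at is None:
            self._gaze_lost_at = now
        if now - self._gaze_lost_at < self.inactivity_timeout_s:
            # Gaze recently lost: dim the display to the standby brightness level.
            self.mode = PowerMode.PARTIAL_STANDBY
            self.display.set_brightness("standby")
        else:
            # No gaze and no input for the timeout period: turn the display off.
            self.mode = PowerMode.FULL_STANDBY
            self.display.turn_off()
```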
  • The electronic device may be performing an operation while in the active mode that continues uninterrupted when the electronic device switches to operating in one of the standby modes.
  • For example, the electronic device may be performing a music playback operation while in the active mode and, when the electronic device detects that the user's gaze is not directed towards the electronic device, the electronic device may enter one of the standby modes without interrupting the music playback operation.
  • The electronic device may instead interrupt an operation when the electronic device begins operating in one of the standby modes.
  • For example, the electronic device may be performing a video playback operation while in the active mode.
  • When the user looks away, the electronic device may enter one of the standby modes, dim the display screen that was being used for the video playback operation, and pause the video playback operation.
  • The electronic device may resume the video playback operation when it detects that the user has redirected their gaze towards the electronic device (e.g., towards the video screen).
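  • Whether an operation is interrupted on entering a standby mode can depend on the kind of content being played back. The sketch below illustrates one way to express that rule; `PlaybackOperation` and its fields are hypothetical names chosen for illustration.

```python
from dataclasses import dataclass

@dataclass
class PlaybackOperation:
    """Hypothetical record describing an ongoing playback operation."""
    kind: str            # "music" or "video"
    paused: bool = False

def on_enter_standby(operation: PlaybackOperation) -> None:
    # Music can be enjoyed without watching the display, so it keeps playing;
    # video is display-centric, so it is paused when the user looks away.
    if operation.kind == "video":
        operation.paused = True

def on_gaze_returned(operation: PlaybackOperation) -> None:
    # Playback resumes automatically once the user's gaze returns to the display.
    if operation.kind == "video" and operation.paused:
        operation.paused = False
```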
  • The electronic device may use readings from sensors such as proximity sensors, ambient light sensors, and motion sensors such as accelerometers to determine whether or not to perform gaze detection operations. For example, the electronic device may suspend gaze detection operations whenever a proximity sensor, ambient light sensor, or accelerometer indicates that gaze detection operations are inappropriate (e.g., because of an object in close proximity with the electronic device, insufficient ambient light for gaze detection sensors to detect the user's gaze, excessive vibration which may degrade the performance of gaze detection sensors, etc.).
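  • One way to combine these sensor readings is a simple gating check that runs before each gaze detection attempt. The thresholds and parameter names below are placeholders; the patent only states that proximity, ambient light, and accelerometer readings may be used to decide whether gaze detection is appropriate.

```python
def gaze_detection_allowed(proximity_near: bool,
                           ambient_light_lux: float,
                           vibration_g: float,
                           min_lux: float = 5.0,
                           max_vibration_g: float = 0.5) -> bool:
    """Return False when other sensors suggest gaze detection would be unreliable."""
    if proximity_near:                  # an object is covering the device (e.g., the user's head)
        return False
    if ambient_light_lux < min_lux:     # too dark for the camera to see the user's eyes
        return False
    if vibration_g > max_vibration_g:   # excessive vibration degrades gaze detection
        return False
    return True
```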
  • An advantage of powering down the display is that a powered-down display can help to prevent information on the display from being viewed by an unauthorized viewer. It may therefore be helpful to turn off a display when the lack of a user's gaze indicates that the user is not present to guard the device.
  • FIG. 1 is a perspective view of an illustrative portable electronic device that may have gaze detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of an illustrative portable electronic device that may have gaze detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 3 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 4 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities during a music playback operation in accordance with an embodiment of the present invention.
  • FIG. 5 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities and activity detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 6 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities during a video playback operation in accordance with an embodiment of the present invention.
  • FIG. 7 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection and touch screen input capabilities in accordance with an embodiment of the present invention.
  • FIG. 8 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 9 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities and sensors such as environment sensors in accordance with an embodiment of the present invention.
  • FIG. 10 is a flow chart of illustrative steps involved in reducing power to displays in an electronic device in accordance with an embodiment of the present invention.
  • The present invention relates generally to electronic devices, and more particularly, to electronic devices such as portable electronic devices that have gaze detection capabilities.
  • An electronic device with gaze detection capabilities may have the ability to determine whether a user's gaze is within a given boundary without resolving the specific location of the user's gaze within that boundary.
  • For example, the electronic device may be able to detect whether a user's gaze is directed towards a display associated with the device.
  • If desired, an electronic device may have gaze tracking capabilities. Gaze tracking capabilities allow the electronic device to determine not only whether or not a user's gaze is directed towards a display associated with the device but also which portion of the display the user's gaze is directed towards.
  • An electronic device may be used to detect a user's gaze and adjust its behavior according to whether or not the user's gaze is detected. For example, the electronic device may be able to detect whether or not the user is looking at the device and adjust power management settings accordingly. With one suitable arrangement, the electronic device may delay turning device components off (e.g., components which would otherwise be turned off as part of a power management scheme) while the user's gaze is directed towards the device and the electronic device may accelerate the shutdown of device components when the user's gaze is not detected. For example, when the user's gaze is detected, a device with a display may keep the display at normal brightness rather than dimming the display and, when the device detects the user is no longer looking at the device, the device may dim or turn off the display.
  • This type of arrangement may be especially beneficial in situations in which the user is not actively controlling the electronic device (e.g., the user is not pressing buttons or supplying touch screen inputs) but is still interacting with the electronic device (e.g., the user is reading text on the display, watching video on the display, etc.).
  • An advantage of turning off the display when the user is not looking at the display is that doing so may help prevent unauthorized users from viewing information on the display, thereby enhancing device security.
  • Electronic devices that have gaze detection capabilities may be portable electronic devices such as laptop computers or small portable computers of the type that are sometimes referred to as ultraportables.
  • Portable electronic devices may also be somewhat smaller devices. Examples of smaller portable electronic devices include wrist-watch devices, pendant devices, headphone and earpiece devices, and other wearable and miniature devices. With one suitable arrangement, the portable electronic devices may be wireless electronic devices.
  • The wireless electronic devices may be, for example, handheld wireless devices such as cellular telephones, media players with wireless communications capabilities, handheld computers (also sometimes called personal digital assistants), global positioning system (GPS) devices, and handheld gaming devices.
  • The wireless electronic devices may also be hybrid devices that combine the functionality of multiple conventional devices. Examples of hybrid portable electronic devices include a cellular telephone that includes media player functionality, a gaming device that includes a wireless communications capability, a cellular telephone that includes game and email functions, and a portable device that receives email, supports mobile telephone calls, has music player functionality, and supports web browsing. These are merely illustrative examples.
  • User device 10 may be any suitable electronic device such as a portable or handheld electronic device.
  • Device 10 of FIG. 1 may be, for example, a handheld electronic device that supports 2G and/or 3G cellular telephone and data functions, global positioning system capabilities or other satellite navigation capabilities, and local wireless communications capabilities (e.g., IEEE 802.11 and Bluetooth®) and that supports handheld computing device functions such as internet browsing, email and calendar functions, games, music player functionality, etc.
  • Device 10 may have a housing 12 .
  • Display 16 may be attached to housing 12 using bezel 14 .
  • Display 16 may be a touch screen liquid crystal display (as an example).
  • Display 16 may have pixels that can be controlled individually in connection with power consumption adjustments. For example, in an organic light emitting diode (OLED) display, power can be reduced by making full and/or partial brightness reductions to some or all of the pixels.
  • Display 16 may be formed from a panel subsystem and a backlight subsystem.
  • For example, display 16 may have a liquid crystal display (LCD) panel subsystem and a light emitting diode or fluorescent tube backlight subsystem. In backlight subsystems that contain individually controllable elements such as light emitting diodes, the brightness of the backlight elements may be selectively controlled.
  • For example, the brightness of some of the backlight elements may be reduced while the other backlight elements remain fully powered.
  • In a backlight subsystem that contains a single backlight element, the power of that single element may be partially or fully reduced to reduce power consumption. It may also be advantageous to make power adjustments to the circuitry that drives the LCD panel subsystem.
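  • A selectively dimmable backlight can be modeled as a list of per-element brightness values. The sketch below shows one way to reduce power for elements outside a region that should stay fully lit; the function name, data layout, and dim factor are illustrative assumptions.

```python
def dim_backlight_elements(element_brightness, keep_full, dim_factor=0.3):
    """Return new per-element brightness values with non-essential elements dimmed.

    `element_brightness` is a hypothetical list of brightness levels (0.0-1.0),
    one per individually controllable backlight element; `keep_full` is the set
    of element indices that should remain fully powered.
    """
    return [
        level if index in keep_full else level * dim_factor
        for index, level in enumerate(element_brightness)
    ]

# Example: keep the first two backlight elements fully powered, dim the rest.
print(dim_backlight_elements([1.0, 1.0, 1.0, 1.0], keep_full={0, 1}))
```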
  • Display screen 16 is merely one example of an input-output device that may be used with electronic device 10 .
  • If desired, electronic device 10 may have other input-output devices.
  • For example, electronic device 10 may have user input control devices such as button 19 , and input-output components such as port 20 and one or more input-output jacks (e.g., for audio and/or video).
  • Button 19 may be, for example, a menu button.
  • Port 20 may contain a 30-pin data connector (as an example). Openings 22 and 24 may, if desired, form speaker and microphone ports.
  • Speaker port 22 may be used when operating device 10 in speakerphone mode. Opening 23 may also form a speaker port.
  • For example, speaker port 23 may serve as a telephone receiver that is placed adjacent to a user's ear during operation.
  • In the example of FIG. 1 , display screen 16 is shown as being mounted on the front face of handheld electronic device 10 , but display screen 16 may, if desired, be mounted on the rear face of handheld electronic device 10 , on a side of device 10 , on a flip-up portion of device 10 that is attached to a main body portion of device 10 by a hinge (for example), or using any other suitable mounting arrangement.
  • A user of electronic device 10 may supply input commands using user input interface devices such as button 19 and touch screen 16 .
  • Suitable user input interface devices for electronic device 10 include buttons (e.g., alphanumeric keys, power on-off, power-on, power-off, and other specialized buttons, etc.), a touch pad, pointing stick, or other cursor control device, a microphone for supplying voice commands, or any other suitable interface for controlling device 10 .
  • Buttons such as button 19 and other user input interface devices may generally be formed on any suitable portion of electronic device 10 .
  • For example, a button such as button 19 or other user interface control may be formed on the side of electronic device 10 .
  • Buttons and other user interface controls can also be located on the top face, rear face, or other portion of device 10 .
  • If desired, device 10 can be controlled remotely (e.g., using an infrared remote control, a radio-frequency remote control such as a Bluetooth® remote control, etc.).
  • Device 10 may contain sensors such as a proximity sensor and an ambient light sensor.
  • A proximity sensor may be used to detect when device 10 is close to a user's head or other object.
  • An ambient light sensor may be used to make measurements of current light levels.
  • Device 10 may have a camera or other optical sensor such as camera 30 that can be used for gaze detection operations.
  • Cameras used for gaze detection may, for example, be used by device 10 to capture images of a user's face that are processed by device 10 to detect where the user's gaze is directed.
  • Camera 30 may be integrated into housing 12 . While shown as being formed on the top face of electronic device 10 in the example of FIG. 1 , cameras such as camera 30 may generally be formed on any suitable portion of electronic device 10 .
  • For example, camera 30 may be mounted on a flip-up portion of device 10 that is attached to a main body portion of device 10 by a hinge or may be mounted between the flip-up portion of device 10 and the main body portion of device 10 (e.g., in the hinge region between the flip-up portion and the main body portion such that the camera can be used regardless of whether the device is flipped open or is closed).
  • Device 10 may also have additional cameras (e.g., device 10 may have camera 30 on the top face of device 10 for gaze detection operations and another camera on the bottom face of device 10 for capturing images and video).
  • If desired, camera 30 may be implemented using an optical sensor that has been optimized for gaze detection operations.
  • For example, camera 30 may include one or more light-emitting diodes (LEDs) and an optical sensor capable of detecting reflections of light emitted from the LEDs off of the user's eyes when the user is gazing at device 10 .
  • The light-emitting diodes may emit modulated infrared light and the optical sensor may be synchronized to detect reflections of the modulated infrared light, as an example.
  • In general, any suitable gaze detection image sensor and circuitry may be used for supporting gaze detection operations in device 10 .
  • The use of camera 30 is sometimes described herein as an example.
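  • Synchronizing the optical sensor to modulated LED emission is essentially a lock-in style measurement: comparing readings taken with the LEDs lit against readings taken with them dark rejects steady ambient light and keeps only the modulated reflection. The sketch below is a deliberately simplified illustration; the sampling scheme and threshold are assumptions rather than details from the patent.

```python
def modulated_reflection_detected(samples_led_on, samples_led_off, threshold=10.0):
    """Detect a reflection of the modulated IR emission (simplified sketch).

    `samples_led_on` are sensor readings captured while the IR LEDs are pulsed
    on, and `samples_led_off` are readings captured while they are off.
    Subtracting the paired readings cancels constant ambient light, leaving the
    component that follows the LED modulation (e.g., a reflection off the
    user's eyes). The threshold is a placeholder value.
    """
    if not samples_led_on or len(samples_led_on) != len(samples_led_off):
        return False
    differences = [on - off for on, off in zip(samples_led_on, samples_led_off)]
    return sum(differences) / len(differences) > threshold
```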
  • Portable device 10 may be a mobile telephone, a mobile telephone with media player capabilities, a handheld computer, a remote control, a game player, a global positioning system (GPS) device, a laptop computer, a tablet computer, an ultraportable computer, a hybrid device that includes the functionality of some or all of these devices, or any other suitable portable electronic device.
  • Storage 34 may include one or more different types of storage such as hard disk drive storage, nonvolatile memory (e.g., flash memory or other electrically-programmable-read-only memory), volatile memory (e.g., battery-based static or dynamic random-access-memory), etc.
  • Processing circuitry 36 may be used to control the operation of device 10 .
  • Processing circuitry 36 may be based on a processor such as a microprocessor and other suitable integrated circuits. With one suitable arrangement, processing circuitry 36 and storage 34 are used to run software on device 10 , such as gaze detection applications, internet browsing applications, voice-over-internet-protocol (VOIP) telephone call applications, email applications, media playback applications, navigation functions, map functions, operating system functions, power management functions, etc.
  • Processing circuitry 36 and storage 34 may be used in implementing suitable communications protocols.
  • Communications protocols that may be implemented using processing circuitry 36 and storage 34 include internet protocols, wireless local area network protocols (e.g., IEEE 802.11 protocols, sometimes referred to as Wi-Fi®), protocols for other short-range wireless communications links such as the Bluetooth® protocol, protocols for handling 3G communications services (e.g., using wide band code division multiple access techniques), 2G cellular telephone communications protocols, etc.
  • If desired, processing circuitry 36 may operate in a reduced power mode (e.g., circuitry 36 may be suspended or operated at a lower frequency) when device 10 enters a suitable standby mode.
  • Input-output devices 38 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices.
  • Display screen 16 , camera 30 , button 19 , microphone port 24 , speaker port 22 , and dock connector port 20 are examples of input-output devices 38 .
  • In general, input-output devices 38 may include any suitable components for receiving input and/or providing output from device 10 .
  • For example, input-output devices 38 can include user input-output devices 40 such as buttons, touch screens, joysticks, click wheels, scrolling wheels, touch pads, key pads, keyboards, microphones, etc.
  • A user can control the operation of device 10 by supplying commands through user input devices 40 .
  • Input-output devices 38 may include sensors such as proximity sensors, ambient light sensors, orientation sensors, and any other suitable sensors.
  • Input-output devices 38 may include a camera such as integrated camera 41 (e.g., a camera that is integrated into the housing of device 10 ) and camera 30 of FIG. 1 .
  • Cameras such as camera 41 and camera 30 may be used as part of a gaze detection system.
  • For example, camera 41 may be used by device 10 to capture images that are processed by a gaze detection application running on processing circuitry 36 to determine whether or not a user's gaze is directed towards the device.
  • Cameras such as camera 41 and camera 30 may, if desired, be provided with image stabilization capabilities (e.g., using feedback derived from an accelerometer, orientation sensor, or other sensor).
  • Display and audio devices 42 may include liquid-crystal display (LCD) screens or other screens, light-emitting diodes (LEDs), and other components that present visual information and status data. Display and audio devices 42 may also include audio equipment such as speakers and other devices for creating sound. Display and audio devices 42 may contain audio-video interface equipment such as jacks and other connectors for external headphones and monitors.
  • Wireless communications devices 44 may include communications circuitry such as radio-frequency (RF) transceiver circuitry formed from one or more integrated circuits, power amplifier circuitry, passive RF components, antennas, and other circuitry for handling RF wireless signals. Wireless signals can also be sent using light (e.g., using infrared communications).
  • Device 10 can communicate with external devices such as accessories 46 , computing equipment 48 , and wireless network 49 , as shown by paths 50 and 51 .
  • Paths 50 may include wired and wireless paths.
  • Path 51 may be a wireless path.
  • Accessories 46 may include headphones (e.g., a wireless cellular headset or audio headphones) and audio-video equipment (e.g., wireless speakers, a game controller, or other equipment that receives and plays audio and video content), a peripheral such as a wireless printer or camera, etc.
  • Computing equipment 48 may be any suitable computer. With one suitable arrangement, computing equipment 48 is a computer that has an associated wireless access point (router) or an internal or external wireless card that establishes a wireless connection with device 10 .
  • The computer may be a server (e.g., an internet server), a local area network computer with or without internet access, a user's own personal computer, a peer device (e.g., another portable electronic device 10 ), or any other suitable computing equipment.
  • Wireless network 49 may include any suitable network equipment, such as cellular telephone base stations, cellular towers, wireless data networks, computers associated with wireless networks, etc.
  • A device such as device 10 that has gaze detection capabilities may use gaze detector data in implementing a power management scheme.
  • For example, device 10 may operate in multiple modes to conserve power and may utilize gaze detection operations to assist in determining an appropriate mode in which to operate.
  • The operational modes of device 10 may include modes such as an active mode, a partial standby mode, and a full standby mode.
  • In these modes, device 10 may adjust the brightness of display 16 and may turn display 16 on or off whenever appropriate in order to conserve power.
  • For example, display 16 may be at an active brightness when device 10 is in the active mode, at a standby brightness when device 10 is in the partial standby mode, and may be turned off when device 10 is in the full standby mode.
  • The standby brightness may be somewhat dimmer than the active brightness.
  • The power consumption of display 16 , and therefore of device 10 , will be reduced when the brightness of display 16 is reduced and when display 16 is turned off.
  • In mode 52 , device 10 is in an active mode.
  • While device 10 is in the active mode, a display such as display 16 may be turned on and may display an appropriate screen such as an application display screen at the active brightness level.
  • The active brightness level may be a configurable brightness level. For example, device 10 may receive input from a user to adjust the active brightness level. In general, the active brightness level may be adjusted anywhere between the maximum and minimum brightness levels of which display 16 is capable.
  • If desired, device 10 may be performing a music playback operation when device 10 is in the active mode.
  • The music playback operation may be occurring in the background of the operation of device 10 (e.g., device 10 may be performing the music playback operation while display 16 and user input device 40 are used by the user to perform additional tasks such as writing an e-mail, browsing the web, etc.).
  • While in the active mode, device 10 may perform gaze detection operations at any suitable interval such as thirty times per second, ten times per second, twice per second, once per second, every two seconds, every five seconds, upon occurrence of non-time-based criteria, combinations of these intervals, or at any other suitable time.
  • When device 10 detects that the user has looked away from display 16 , device 10 may dim display screen 16 and may enter partial standby mode 56 .
  • Device 10 may detect that the user has diverted their gaze away from device 10 and display 16 using a gaze detection sensor such as camera 30 and gaze detection software running on the hardware of device 10 . If desired, gaze detection processing may be offloaded to specialized gaze detection circuitry (e.g., circuitry in a gaze detection chip or a camera controller).
  • In mode 56 , device 10 is in a partial standby mode.
  • In partial standby mode 56 , the brightness level of display 16 may be reduced from an active brightness level to a standby brightness level to reduce the power consumption of device 10 .
  • If desired, some operations running on device 10 may be suspended or stopped while other operations continue running.
  • For example, a music playback operation may continue when device 10 enters one of its standby modes while a web browsing application may be suspended.
  • With this type of arrangement, device 10 can dim display 16 to conserve power whenever the user looks away from display 16 while continuing to play back, without interruption, the music that the user is listening to.
  • When device 10 detects that the user's gaze has returned to display 16 , device 10 may brighten display screen 16 and may enter active mode 52 .
  • Device 10 may enter active mode 52 in response to user activity such as button press activity received through a button such as button 19 and in response to other activity such as network activity (e.g., activity received through a wired or wireless communications link).
  • If desired, device 10 may implement a power management scheme that turns off display 16 based on gaze detection data.
  • With this type of arrangement, device 10 may turn off display 16 when the device detects that the user is not looking at display 16 (e.g., rather than merely dimming display 16 as in the example of FIG. 3 ).
  • In mode 60 , device 10 is in an active mode. While device 10 is in the active mode, device 10 may perform gaze detection operations. Because device 10 is in the active mode, display 16 may be at an active brightness level. With one suitable arrangement, when device 10 is in active mode 60 , device 10 may be displaying a screen with display 16 that is of interest to the user but which does not demand the user's constant attention. For example, when device 10 is in mode 60 , device 10 may be displaying a screen such as a now playing screen associated with a music playback operation or a telephone information screen associated with a telephone operation (e.g., a new incoming call, a new outgoing call, an active call, etc.).
  • The now playing screen may, for example, include information about the music playback operation such as a track name, album name, artist name, elapsed playback time, remaining playback time, album art, etc. and may include on-screen selectable options (e.g., when display 16 is a touch-screen display) such as play, pause, fast forward, rewind, skip ahead (e.g., to another audio track), skip back, stop, etc.
  • A telephone information screen might include information about a telephone operation such as a current call time, the telephone number associated with a telephone call, a contact name associated with the telephone call, and an image associated with the telephone call and may include on-screen selectable options such as a keypad to enter a telephone number, a call button, an end call button, a hold button, a speakerphone button, a mute button, an add call button, a contacts button, etc.
  • When device 10 detects that the user has looked away from display 16 , device 10 may turn off display 16 and may enter standby mode 64 .
  • In standby mode 64 , device 10 may continue to perform background operations such as a music playback operation that was occurring before device 10 entered standby mode 64 (e.g., before device 10 detected that the user's gaze was diverted away from display 16 ).
  • Because the application screen displayed in mode 60 is of secondary importance to the user, device 10 may turn off display 16 completely when the user looks away without disrupting the user. For example, when a user is listening to an audio track and is also viewing information associated with the audio track on a now playing screen, device 10 can turn off display 16 when the user looks away, while continuing the audio playback operation.
  • In this way, the user's primary use of device 10 (listening to music) is not interrupted, even though the secondary use of device 10 (viewing the now playing screen) has been halted.
  • In mode 64 , device 10 is in a standby mode.
  • In standby mode 64 , display 16 may be turned off by device 10 to conserve power.
  • Other suitable components of device 10 may be powered down as well (if desired).
  • For example, the power consumption of processing circuitry 36 may be reduced (e.g., by operating fewer processor cores, by reducing the computing frequency of circuitry 36 , etc.).
  • If desired, an operation such as a music playback operation or a telephone call may continue when device 10 is in mode 64 .
  • When device 10 detects activity such as user activity, device 10 may enter active mode 60 and turn on display 16 .
  • Device 10 may enter active mode 60 in response to any suitable activity such as button press activity, network activity, and gaze detection activity (e.g., when device 10 detects that the user has directed their gaze towards device 10 ).
  • If desired, device 10 may implement a power management scheme that is responsive to gaze detection data and other input data (e.g., user input, network input, etc.).
  • With this type of scheme, device 10 can switch between an active mode, a partial standby mode, and a standby mode.
  • Device 10 may power down hardware components and suspend or slow down software operations depending on the mode in which device 10 is operating. For example, when device 10 is in either of the standby modes, device 10 may reduce the number of processing cores utilized by circuitry 36 and/or may reduce the processing frequency (clock rate) of circuitry such as circuitry 36 .
  • For example, device 10 may turn display 16 on at an active brightness level in the active mode, dim display 16 to a standby brightness level in the partial standby mode, and turn display 16 off in the standby mode.
  • In mode 68 , device 10 is in an active mode. While device 10 is in the active mode, device 10 may perform gaze detection operations. Display 16 may be at the active brightness level while device 10 is in active mode 68 . In mode 68 , device 10 may be displaying an application display screen such as a home page, a music playback application screen, a web browsing application screen, an email application screen, etc. If desired, device 10 may also be performing a music playback operation while in mode 68 (e.g., device 10 may be performing the music playback operation as a background process as device 10 displays the application display screen).
  • When device 10 detects that the user has looked away from display 16 (e.g., using a gaze detection sensor such as camera 30 ), device 10 may dim display 16 and enter partial standby mode 72 , as illustrated by line 70 .
  • In mode 72 , device 10 is in a partial standby mode.
  • In partial standby mode 72 , device 10 may dim display 16 to a partial standby brightness level to conserve power and, if desired, may place other components such as processing circuitry, wireless transceiver circuitry, etc. in a standby mode to conserve power.
  • Certain operations may continue when device 10 enters mode 72 . For example, a music playback operation or a telephone call may continue uninterrupted when device 10 enters mode 72 .
  • Device 10 may perform gaze detection operations while in mode 72 .
  • For example, device 10 may continually capture images using camera 30 at regular intervals and may analyze the captured images using gaze detection software to determine whether the user's gaze has returned to device 10 and display 16 .
  • If desired, the rate at which device 10 captures and processes images for gaze detection operations while in mode 72 may be reduced relative to the rate at which gaze detection images are captured and processed while device 10 is in an active mode such as mode 68 (e.g., device 10 may capture images at a rate of once every 100 milliseconds, 250 milliseconds, 500 milliseconds, 1 second, etc. in mode 72 and once every 50 milliseconds, 125 milliseconds, 250 milliseconds, 500 milliseconds, etc. in mode 68 ).
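  • Choosing the capture interval based on the current operating mode can be as simple as a lookup table. The values below fall within the ranges mentioned above but are otherwise arbitrary, and the mode names are illustrative.

```python
# Illustrative gaze-detection capture intervals, in seconds, per operating mode.
GAZE_CAPTURE_INTERVAL_S = {
    "active_68": 0.125,          # e.g., capture every 125 ms in active mode 68
    "partial_standby_72": 0.5,   # e.g., capture every 500 ms in partial standby mode 72
}

def next_capture_delay(mode: str) -> float:
    """Return how long to wait before capturing the next gaze-detection image."""
    return GAZE_CAPTURE_INTERVAL_S.get(mode, 1.0)  # fall back to 1 s for other modes
```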
  • Device 10 may switch from partial standby mode 72 to active mode 68 whenever appropriate. For example, when device 10 detects that a user's gaze is directed towards display 16 , device 10 may enter an active mode such as mode 68 (e.g., as illustrated by line 75 ) and may brighten display 16 to the active brightness level. Device 10 may also enter active mode 68 when device 10 detects activity such as user activity received through a button such as button 19 and network activity received through a wired or wireless communications link (e.g., as illustrated by line 74 ). In general, device 10 will enter active mode 68 whenever a user resumes interacting with device 10 or device 10 needs to respond to network activity.
  • Because device 10 enters active mode 68 when device 10 detects that the user's gaze is directed towards display 16 (e.g., as illustrated by line 75 ), the user of device 10 need not press a button or provide other input to awaken device 10 from the partial standby state. Instead, device 10 can automatically awaken (e.g., switch to active mode 68 ) when device 10 detects that the user has directed their gaze towards display 16 .
  • If desired, device 10 may operate in a standby mode such as standby mode 76 in which display 16 is turned off.
  • Device 10 may enter standby mode 76 and turn off display 16 .
  • Device 10 may enter standby mode 76 , as illustrated by line 79 , after device 10 detects that the user has looked away (e.g., as illustrated by line 70 ) and after a given period of user inactivity has elapsed following the device's detection that the user looked away.
  • In standby mode 76 , device 10 may operate with display 16 turned off.
  • Device 10 may place suitable components into standby.
  • For example, device 10 may turn wireless transceiver circuitry off, reduce the power consumption of processing circuitry such as circuitry 36 (e.g., by turning off selected processing cores or lowering clock rates), turn off sensors such as proximity sensors, ambient light sensors, and accelerometers, and may suspend or power down any other suitable components.
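  • Entering the full standby mode can be thought of as applying a per-component power policy. The component names and methods in the sketch below are hypothetical; the patent only lists the kinds of circuitry that may be powered down or slowed.

```python
def enter_full_standby(device):
    """Apply an illustrative power-down policy for standby mode 76."""
    device.display.turn_off()                    # display is off in full standby
    device.wireless.power_off()                  # wireless transceiver circuitry off
    device.cpu.set_active_cores(1)               # turn off selected processing cores
    device.cpu.set_clock_rate_hz(device.cpu.min_clock_rate_hz)  # lower the clock rate
    for sensor in (device.proximity_sensor,
                   device.ambient_light_sensor,
                   device.accelerometer):
        sensor.power_off()                       # sensors not needed in standby are turned off
```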
  • If desired, certain operations may continue when device 10 enters and operates in standby mode 76 .
  • For example, a music playback operation or a telephone call may continue uninterrupted when device 10 enters mode 76 .
  • As long as device 10 detects the user's gaze, device 10 may remain in active mode 68 .
  • Device 10 may remain in active mode 68 even when no other user activity is received (e.g., when the user is not pressing a button such as button 19 or providing user input through a touch screen such as touch screen display 16 ).
  • This type of arrangement may be beneficial when a user is utilizing device 10 without providing user input and would be inconvenienced by device 10 implementing power management techniques.
  • Device 10 can override power management schemes such as dimming a display screen based on results of gaze detection operations.
  • For example, when device 10 detects a user's gaze and is presenting the user with text or video through display 16 , device 10 may override power management instructions that could otherwise reduce the power of display 16 to ensure that display 16 is not dimmed or turned off even though the user has not provided direct user input.
  • If desired, device 10 may continue to perform gaze detection operations when operating in standby mode 76 . As illustrated by dashed line 77 , device 10 may switch from standby mode 76 to active mode 68 whenever device 10 detects that a user's gaze is once again directed towards display 16 .
  • When device 10 switches from mode 76 to the active mode, device 10 may turn on display 16 to the active brightness level.
  • Device 10 may also enter active mode 68 in response to user activity such as button press activity received through a button such as button 19 and in response to other activity such as network activity (e.g., activity received through a wired or wireless communications link).
  • If desired, device 10 may implement a power management scheme that utilizes gaze detection capabilities while executing a video playback operation (e.g., while playing video for a user).
  • With this type of scheme, device 10 can operate in an active mode (mode 80 ), a pause standby mode (e.g., a partial standby mode such as mode 84 ), and a standby mode (mode 90 ).
  • Device 10 may be performing a video playback operation for a user when the device is in the active mode; device 10 may pause the video playback operation and dim an associated display screen when the user looks away from the device and the device enters the pause standby mode; and device 10 may turn off the display screen (e.g., the screen used for the video playback operation) if the user does not look back towards the device within a given period of time and no other user activity is detected.
  • In mode 80 , device 10 is active. While device 10 is in the active mode, device 10 may perform gaze detection operations (e.g., using camera 30 and processing circuitry 36 to detect whether or not a user is gazing at display 16 ). While in mode 80 , device 10 may perform a video playback operation. For example, device 10 may display video on display 16 and may play audio associated with the video through a speaker such as speaker 22 or through a headphone accessory such as accessory 46 . Display 16 may display the video at an active brightness level (e.g., display 16 may be at a relatively bright display level).
  • When device 10 detects that the user has looked away from display 16 , device 10 may dim display 16 and enter pause standby mode 84 , as illustrated by line 82 .
  • When entering pause standby mode 84 , device 10 may pause the video playback operation of mode 80 .
  • If desired, device 10 will also pause an accompanying audio playback associated with the video playback operation. The user may, if desired, configure whether device 10 pauses the audio.
  • In mode 84 , device 10 is in a pause standby mode.
  • In pause standby mode 84 , device 10 may dim display 16 to a pause standby brightness level (e.g., a partial standby brightness level) to conserve power.
  • The video playback operation of mode 80 may be paused while device 10 is in mode 84 .
  • If desired, device 10 may place components such as processing circuitry and wireless transceiver circuitry in a standby mode while device 10 is in mode 84 (e.g., by turning off unused CPU cores or reducing clock rates).
  • Device 10 may be performing gaze detection operations while in pause standby mode 84 .
  • For example, device 10 may capture images using camera 30 at regular intervals and may analyze the images using gaze detection software to continually monitor whether the user's gaze has returned to device 10 and display 16 .
  • Device 10 may switch from pause standby mode 84 to mode 80 whenever appropriate. For example, whenever device 10 detects that a user's gaze is once again directed towards display 16 , device 10 may enter an active mode such as mode 80 (e.g., as illustrated by line 86 ), brighten display 16 to the active brightness level, and resume the video playback operation. Device 10 may also enter mode 80 when device 10 detects activity such as user activity received through a button such as button 19 or network activity received through a wired or wireless communications link (e.g., as illustrated by dashed line 87 ). In general, device 10 will enter mode 80 whenever a user resumes interacting with device 10 .
  • Because device 10 enters mode 80 when it detects that the user's gaze is directed towards display 16 (e.g., as illustrated by line 86 ), the user of device 10 need not press a button or provide other input to awaken device 10 from the pause standby state and resume the video playback operation of mode 80 . Instead, device 10 can automatically awaken itself (e.g., switch to mode 80 ) and resume the video playback operation when the user directs their gaze towards display 16 .
  • If desired, device 10 may operate in a standby mode such as standby mode 90 in which display 16 is turned off.
  • Device 10 may enter standby mode 90 and turn off display 16 .
  • Because standby mode 90 involves a lower power state for device 10 than pause standby mode 84 , mode 90 may sometimes be referred to as full standby mode.
  • As illustrated by line 88 in FIG. 6 , device 10 may enter full standby mode 90 after device 10 detects that the user has looked away (e.g., as illustrated by line 82 ) and after a given period of user inactivity has elapsed following the device's detection that the user looked away.
  • As long as device 10 detects the user's gaze, device 10 may remain in active mode 80 and the video playback operation can continue (e.g., until the video is completed or the operation is stopped).
  • Device 10 may remain in mode 80 even when no other user activity is being received (e.g., when the user is not pressing a button such as button 19 or providing user input through a touch screen such as touch screen display 16 ).
  • This type of arrangement may be beneficial when a user is viewing a video on display 16 of device 10 without providing user input and would be inconvenienced if device 10 were to attempt to conserve power by dimming the video screen.
  • Device 10 can pause the video playback operation when the user temporarily looks away and can then resume operation when the user returns their gaze to device 10 . This allows the user of device 10 to automatically pause a video without having to provide direct user input (e.g., without selecting a pause button). The video can be paused simply by looking away from a video display such as display 16 .
  • If desired, device 10 may continue to perform gaze detection operations when operating in standby mode 90 . As illustrated by dashed line 93 , device 10 may switch from standby mode 90 to active mode 80 and resume the video playback operation of mode 80 when device 10 detects that a user's gaze is again directed towards display 16 .
  • When device 10 switches from mode 90 to the active mode, device 10 may turn on display 16 to the active brightness level.
  • Device 10 may also enter active mode 80 in response to user activity such as button press activity received through a button such as button 19 and in response to other activity such as network activity (e.g., activity received through a wired or wireless communications link).
  • If desired, device 10 may automatically resume a video playback operation when the device switches to active mode 80 from a standby mode such as pause standby mode 84 or full standby mode 90 .
  • Alternatively, device 10 may present the user with an option such as an on-screen selectable option to resume the video playback operation when the device switches to active mode 80 .
  • Device 10 may have touch screen capabilities and may implement a power management scheme using gaze detection capabilities to control the device's touch screen capabilities. With this type of scheme, which is illustrated by FIG. 7 , device 10 can switch between an active mode, a partial standby mode, and a standby mode.
  • Touch screen functions can be adjusted to conserve power.
  • For example, display 16 may be a touch screen display that can operate at varying speeds (e.g., a fast speed and a slow speed) or with varying levels of functionality (e.g., general touch sensitivity, localized touch sensitivity, and gesture-capable touch sensitivity). These features can be adjusted based on gaze detection data.
  • For example, touch screen display 16 may operate at a first frequency (e.g., at a relatively high speed) when device 10 is in active mode 94 and at a second frequency (e.g., a relatively low speed) when device 10 is in standby mode 104 .
  • The frequency of touch screen display 16 may be the frequency at which the touch screen scans for user input (e.g., once every 10 milliseconds, 50 milliseconds, 100 milliseconds, 200 milliseconds, etc.).
  • If desired, touch screen display 16 may operate at a first level of functionality when device 10 is in mode 94 and at a second level of functionality when device 10 is in mode 104 .
  • In mode 94 , touch screen display 16 may be configured to sense the location of user input within the area of display 16 .
  • Device 10 may also be configured to sense user inputs such as multi-touch user inputs and gestures such as swipe gestures and swipe and hold gestures while in mode 94 .
  • In mode 104 , touch screen display 16 may be configured such that display 16 can sense general user input such as the presence or absence of contact without being able to resolve the location of the input. The power consumption of display 16 may be reduced when display 16 is configured in this way.
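  • These per-mode touch settings can be collected into a small configuration table. The scan periods, capability flags, and the `apply` method below are illustrative assumptions; the patent describes only a fast, full-featured scan in the active mode and a slower, presence-only scan in standby.

```python
# Hypothetical touch-controller configuration for active mode 94 and standby mode 104.
TOUCH_CONFIG = {
    "active_94": {
        "scan_period_ms": 10,       # scan for touch input every 10 ms
        "resolve_location": True,   # report where on the screen the touch occurred
        "multi_touch_and_gestures": True,
    },
    "standby_104": {
        "scan_period_ms": 200,      # scan far less often to save power
        "resolve_location": False,  # detect only the presence or absence of contact
        "multi_touch_and_gestures": False,
    },
}

def configure_touch_controller(controller, mode: str) -> None:
    """Apply the settings for the given mode (controller.apply is an assumed method)."""
    controller.apply(TOUCH_CONFIG[mode])
```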
  • In mode 94, device 10 is in an active mode. While device 10 is in the active mode, device 10 may perform gaze detection operations.
  • Touch screen display 16 may be operating at a relatively high frequency (e.g., in the high power mode) while device 10 is in active mode 94 .
  • Touch screen display 16 may be operating at or near its maximum capability (e.g., touch screen display 16 may be configured to sense the location of user inputs and to sense user inputs such as multi-touch inputs and gestures).
  • Display 16 may also be displaying an application display screen (e.g., a home page, a telephone application information page, a media player screen, etc.) at an active brightness level.
  • When device 10 detects that the user has looked away from display 16 (e.g., using a gaze detection sensor such as camera 30), device 10 may dim display 16 and enter partial standby mode 98, as illustrated by line 96.
  • In mode 98, device 10 is in a partial standby mode. In partial standby mode 98, device 10 may dim display 16 to a partial standby brightness level to conserve power and may retain the touch screen capabilities of display 16. (Alternatively, touch screen capabilities can be reduced in mode 98.)
  • Device 10 may switch from partial standby mode 98 to active mode 94 whenever appropriate. For example, when device 10 detects that a user's gaze is directed towards display 16, device 10 may enter an active mode such as mode 94 (e.g., as illustrated by line 100) and may brighten display 16 to the active brightness level. Device 10 may also enter active mode 94 when device 10 detects user activity (e.g., as illustrated by dashed line 99). In arrangements in which the touch screen capabilities of display 16 remain at the active mode level when device 10 is in mode 98, display 16 may be able to receive location specific user inputs (e.g., inputs specific to a particular portion of display 16) while device 10 is in mode 98.
  • If desired, device 10 may operate in a full standby mode such as standby mode 104 in which display 16 is turned off and the touch screen capabilities of display 16 are reduced.
  • Device 10 may enter standby mode 104 as illustrated by line 102 after device 10 detects that the user has looked away (e.g., as illustrated by line 96 ) and after a given period of user inactivity has elapsed following the device's detection that the user has looked away.
  • If desired, device 10 may enter standby mode 104 directly from active mode 94 when no user activity is detected for a configurable period of time (e.g., as illustrated by dashed line 108).
  • Device 10 may enter standby mode 104 even when device 10 detects that a user's gaze is directed towards display 16 .
  • The time period of user inactivity required before device 10 enters mode 104 directly from mode 94 (e.g., when a user's gaze is still directed towards device 10) may be longer than the time period of user inactivity required before device 10 enters mode 104 from mode 98 (e.g., when the user's gaze is not directed towards device 10).
  • For example, the inactivity period associated with the mode transition of line 108 may be one minute or more while the inactivity period associated with the mode transition of line 102 may be thirty seconds or less.
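  • A short sketch of the two inactivity timeouts just described follows. The 60-second and 30-second constants track the "one minute or more" and "thirty seconds or less" figures in the text, but the function and parameter names are assumptions made for the example.

      TIMEOUT_GAZE_PRESENT_S = 60.0  # transition of line 108: user is still looking at the display
      TIMEOUT_GAZE_ABSENT_S = 30.0   # transition of line 102: user has looked away

      def should_enter_full_standby(seconds_without_input: float, gaze_on_display: bool) -> bool:
          """Use a longer inactivity timeout while the user's gaze is still detected."""
          limit = TIMEOUT_GAZE_PRESENT_S if gaze_on_display else TIMEOUT_GAZE_ABSENT_S
          return seconds_without_input >= limit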
  • In standby mode 104, device 10 may operate with a display portion of display 16 turned off.
  • If desired, the display portion of display 16 and a touch screen portion of display 16 may be powered and configured independently.
  • In mode 104, device 10 may reduce the touch screen capabilities of the touch screen portion of display 16 (e.g., by reducing the frequency at which touch screen display 16 scans for user input, by configuring display 16 such that user inputs can only be sensed generally, by disabling the touch screen capabilities of display 16, etc.).
  • If desired, device 10 may continue to perform gaze detection operations when operating in standby mode 104. As illustrated by dashed line 105, device 10 may switch from standby mode 104 to active mode 94 when device 10 detects that a user's gaze is directed towards display 16.
  • Device 10 may also switch from mode 104 to active mode 94 when activity is detected (e.g., device 10 may turn on display 16 to the active brightness level and restore the touch screen capabilities of display 16 to the active capability level).
  • If desired, power can be further conserved by reducing the power consumption of components such as a processor, wireless communications circuitry, etc. while in full standby mode 104 and/or partial standby mode 98.
  • For example, the clock frequency of the clock that is used to operate processing circuitry 36 (e.g., a microprocessor) may be reduced.
  • The number of processor cores that are active in processing circuitry 36 may also be reduced.
  • Some or all of wireless communications circuitry 44 may be placed in a low-power state or turned off.
  • The amount of additional circuitry that is powered down when device 10 enters modes 98 and 104 may be the same or, if desired, relatively more circuitry may be powered down in full standby mode 104 than in partial standby mode 98.
  • Device 10 may have additional power down modes in which different numbers of these components have been placed in low-power states. Any suitable criteria may be used to determine when to switch device 10 between these modes. For example, gaze detection data, user input data, and/or sensor data may be used to determine an appropriate mode in which to operate device 10 .
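  • As a sketch of how gaze detection data, user input data, and sensor data might be combined to choose an operating mode, the following Python function applies one illustrative policy. The thresholds, mode names, and parameter names are assumptions for this example and do not come from the patent.

      def choose_power_mode(gaze_on_display: bool, seconds_since_input: float, device_in_motion: bool) -> str:
          """Pick an operating mode from gaze, input, and motion data (illustrative policy)."""
          if gaze_on_display or seconds_since_input < 5.0:
              return "active"            # keep the display at the active brightness level
          if device_in_motion or seconds_since_input < 30.0:
              return "partial_standby"   # dim the display, slow the processor clock
          return "full_standby"          # turn the display off, power down radios and sensors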
  • Components that may be powered down in this way include proximity sensors, light sensors such as an ambient light sensor, cameras, motion sensors such as accelerometers, audio circuits, radio-frequency transceiver circuitry, radio-frequency amplifiers, audio amplifiers, serial and parallel port communications circuits, thermal sensors, touch-screen input devices, etc.
  • If desired, device 10 can implement a power management scheme in which gaze detection circuitry is turned on or off or is otherwise adjusted in real time.
  • With this type of scheme, which is illustrated by FIG. 8, device 10 can switch between an active mode, a partial standby mode, and a standby mode.
  • The gaze detection capabilities of device 10 can be adjusted to conserve power depending on the mode in which device 10 is operating. For example, device 10 may perform gaze detection operations by taking images using camera 30 or other imaging circuitry at a first rate while in an active mode and at a second rate that is less than the first rate while in a standby mode. If desired, device 10 may suspend gaze detection operations while in standby. When the gaze detection operations of device 10 are slowed down (e.g., performed at the second rate) or suspended, device 10 may consume a reduced amount of power.
  • In mode 110, device 10 is in an active mode. While device 10 is in the active mode, device 10 may perform gaze detection operations. For example, device 10 may perform gaze detection operations by taking images at a given rate to search for a user's gaze (e.g., once every 100 milliseconds, 200 milliseconds, 250 milliseconds, 500 milliseconds, 1 second, 2 seconds, etc.). These images may then be analyzed to determine whether the user of device 10 is looking at device 10.
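  • The sketch below shows a gaze detection loop whose image capture interval depends on the current mode, with capture suspended entirely in the full standby mode. The interval values and the callback names (get_mode, capture_image, gaze_in_image, on_gaze) are assumptions made for this example.

      import time

      CAPTURE_INTERVAL_S = {
          "active_110": 0.25,          # sample quickly while the device is in use
          "partial_standby_114": 1.0,  # sample at a reduced rate to save power
          "full_standby_122": None,    # gaze detection suspended
      }

      def gaze_detection_loop(get_mode, capture_image, gaze_in_image, on_gaze):
          """Capture and analyze images at a mode-dependent rate (a sketch)."""
          while True:
              interval = CAPTURE_INTERVAL_S[get_mode()]
              if interval is None:
                  time.sleep(0.5)      # idle briefly; the camera stays off
                  continue
              if gaze_in_image(capture_image()):
                  on_gaze()            # e.g., switch back to the active mode
              time.sleep(interval)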
  • Display 16 may simultaneously display an application display screen (e.g., a home page, a telephone application information page, a media player screen, etc.) at an active brightness level.
  • When device 10 detects that the user has looked away from display 16 (e.g., using a gaze detection sensor such as camera 30), device 10 may dim display 16 and enter partial standby mode 114, as illustrated by line 112.
  • In mode 114, device 10 is in a partial standby mode.
  • In partial standby mode 114, device 10 may dim display 16 to a partial standby brightness level to conserve power.
  • Device 10 may also reduce the rate at which images are captured for gaze detection operations (e.g., to a fraction of the rate at which gaze detection images were captured in mode 110, such as one-half or one-quarter of that rate).
  • Device 10 may switch from partial standby mode 114 to active mode 110 whenever appropriate. For example, when device 10 detects that a user's gaze is directed towards display 16 , device 10 may enter an active mode such as mode 110 (e.g., as illustrated by line 116 ) and may brighten display 16 to the active brightness level. Device 10 may also enter active mode 110 when device 10 detects user activity (e.g., as illustrated by line 118 ).
  • If desired, device 10 may operate in a full standby mode such as standby mode 122 in which display 16 is turned off and the gaze detection capabilities of device 10 are also turned off (e.g., camera 30 is turned off).
  • Device 10 may enter standby mode 122 as illustrated by line 120 after device 10 detects that the user has looked away (e.g., as illustrated by line 112) and after a given period of user inactivity has elapsed following the device's detection that the user has looked away.
  • In standby mode 122, device 10 may operate with display 16 turned off and with gaze detection disabled (e.g., turned off).
  • Other circuitry may also be placed in a low-power standby mode (e.g., processing circuitry).
  • When device 10 detects activity, device 10 may switch from mode 122 to active mode 110 (e.g., device 10 may turn on display 16 to the active brightness level and turn on its gaze detection capabilities to determine if a user's gaze is directed towards display 16).
  • If desired, the period of user inactivity detected by device 10 and associated with the mode transition of line 120 may be reset. For example, when device 10 switches from mode 122 to active mode 110 and determines that the user's gaze is not directed towards display 16, device 10 may switch to mode 114 and the given period of user inactivity associated with the mode transition of line 120 may begin anew.
  • The motion of device 10 can be indicative of whether device 10 is being used by a user. If desired, device 10 may use data from an accelerometer or other motion sensor in selecting its mode of operation. For example, when device 10 detects motion above a threshold level with an accelerometer, device 10 may activate gaze detection operations to determine if a user is looking at the device. Device 10 may turn on gaze detection circuitry or may temporarily activate gaze detection operations for a given period of time (e.g., one second, five seconds, etc.) whenever a motion sensor such as an accelerometer detects that a user is shaking device 10 or device 10 is otherwise in motion. With this type of arrangement, device 10 may be in standby mode. When device 10 is picked up by a user, device 10 may detect that the device is in motion using the accelerometer. Device 10 may then activate gaze detection operations and, if the user's gaze is properly detected, may switch to an active mode such as mode 68 in which display 16 is turned on.
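  • The following sketch captures the accelerometer-triggered behavior just described: motion above a threshold arms gaze detection for a short window. The threshold, window length, state dictionary, and function names are illustrative assumptions.

      MOTION_THRESHOLD_G = 0.15   # illustrative acceleration threshold
      GAZE_WINDOW_S = 5.0         # how long gaze detection stays armed after motion

      def gaze_detection_armed(acceleration_g: float, now_s: float, state: dict) -> bool:
          """Arm gaze detection for a short window whenever the device is shaken or picked up."""
          if acceleration_g > MOTION_THRESHOLD_G:
              state["armed_until"] = now_s + GAZE_WINDOW_S
          return now_s < state.get("armed_until", 0.0)

      # Example: picking up the device produces motion, gaze detection runs for a few
      # seconds, and if the user's gaze is found the device can switch to an active mode.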
  • Device 10 may also suspend gaze detection operations when appropriate. For example, when device 10 is receiving user input through an input-output device 38 (e.g., when a user is providing user input through one or more user input devices) or when device 10 has recently received user input, gaze detection operations may be suspended (e.g., camera 30 may be turned off and the execution of gaze detection software may be stopped). In this situation, the presence of user interface activity makes it unnecessary to expend extra power operating the gaze detection circuitry.
  • If desired, device 10 may also use information from environmental sensors such as proximity sensors and ambient light sensors to determine whether or not to perform gaze detection operations.
  • Environmental sensors such as these may, if desired, be used in conjunction with an environmental sensor such as an accelerometer that detects device motion.
  • Device 10 may suspend gaze detection operations whenever a sensor in device 10 indicates that gaze detection operations are inappropriate or not needed (e.g., as illustrated by line 128).
  • For example, device 10 may be able to detect when gaze detection sensors such as camera 30 would be incapable of detecting a user's gaze due to excessive vibration detected by an accelerometer.
  • In this situation, device 10 may suspend gaze detection operations (e.g., device 10 may switch to operating in mode 130) in response to signals from the accelerometer in device 10 that indicate the device is shaking or otherwise moving rapidly.
  • For example, device 10 may switch to mode 130 when the accelerometer detects that the acceleration of device 10 exceeds a given threshold level.
  • As another example, device 10 may be able to detect, using a proximity sensor, that gaze detection operations are inappropriate because an object is in close proximity to device 10 and is blocking the device's gaze detection sensors (e.g., when a user places device 10 against their ear and thereby blocks camera 30).
  • Similarly, device 10 may suspend gaze detection operations when an ambient light sensor detects that there is insufficient light in the environment around device 10 for a camera such as camera 30 to capture images in which a user's gaze could be detected.
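  • The sensor checks described in the preceding paragraphs can be summarized in a small predicate such as the Python sketch below. The vibration and light thresholds are illustrative assumptions and would depend on the particular sensors used.

      MAX_VIBRATION_G = 0.5   # above this, captured images are too blurred to analyze
      MIN_AMBIENT_LUX = 5.0   # below this, the camera cannot resolve the user's eyes

      def gaze_detection_appropriate(vibration_g: float, ambient_lux: float, camera_blocked: bool) -> bool:
          """Return False when sensor data indicates gaze detection would waste power."""
          if vibration_g > MAX_VIBRATION_G:
              return False    # excessive shaking (accelerometer)
          if ambient_lux < MIN_AMBIENT_LUX:
              return False    # insufficient ambient light (ambient light sensor)
          if camera_blocked:
              return False    # object covering the sensor, e.g., device against the ear (proximity sensor)
          return True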
  • Device 10 may also deactivate a camera associated with gaze detection operations and suspend a gaze detection application running on circuitry 36 when data from one or more sensors in device 10 indicate that gaze detection operations are inappropriate or wasteful of power.
  • When device 10 detects that gaze detection operations may be appropriate (e.g., after the sensors no longer indicate that gaze detection operations are inappropriate), device 10 may resume gaze detection operations in mode 126, as illustrated by line 132.
  • This type of arrangement may help device 10 to avoid performing gaze detection operations at inappropriate times, while ensuring that the power conserving functionality of the gaze detection circuitry is retained during normal device operation.
  • The gaze detection capabilities of device 10 may, if desired, include visual user identification capabilities (e.g., face recognition).
  • With this type of arrangement, device 10 may distinguish between authorized users and unauthorized users based on image sensor data. For example, device 10 may recognize an authorized user and may unlock itself whenever the authorized user is detected by the device's gaze detection circuitry (e.g., camera 30). If desired, when device 10 detects that the authorized user's gaze has been diverted from device 10, device 10 may lock itself to prevent unauthorized users from using device 10.
  • This type of user-specific gaze detection functionality may be used for all gaze detection operations if desired. By making gaze detection specific to particular users, device 10 will not inadvertently transition from standby mode to active mode if a person in the user's vicinity happens to glance at the user's device.
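  • As a sketch of the user-specific behavior described above, the following Python function keeps the device unlocked only while an authorized user's gaze is detected. The is_authorized comparison routine and the face data structures are hypothetical placeholders for whatever recognition method is used.

      def update_lock_state(locked: bool, detected_face, authorized_faces, is_authorized) -> bool:
          """Unlock when an authorized user's gaze is detected; relock when it is diverted."""
          authorized_present = detected_face is not None and any(
              is_authorized(detected_face, reference) for reference in authorized_faces)
          if locked and authorized_present:
              return False   # authorized user is looking at the device: unlock
          if not locked and not authorized_present:
              return True    # authorized gaze diverted: lock to keep others out
          return locked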
  • FIG. 10 shows steps involved in processing a command to reduce the power consumption of display 16 .
  • Power reduction commands may be processed by device 10 based on gaze detection data or any other suitable data.
  • Processing may begin with reception of a power reduction command by the processing circuitry of device 10.
  • Display 16 may be an OLED display or other display that has pixels that may be controlled individually. As shown by box 136 , in this type of situation, device 10 may make partial or full power reduction to some or all of the pixels of display 16 in response to the received power reduction command.
  • Display 16 may also be formed from a panel subsystem and a backlight subsystem.
  • For example, display 16 may have a liquid crystal display (LCD) panel subsystem and a light emitting diode or fluorescent tube backlight subsystem.
  • In backlight subsystems that contain individually controllable elements, the brightness of the backlight elements may be selectively controlled. For example, as shown in step 138, the brightness of some of the backlight elements may be reduced while the other backlight elements remain fully powered.
  • In backlight subsystems that contain a single backlight element, the power of the single element may be partially or fully reduced to reduce power consumption (step 140).
  • In steps 138 and 140, further power reductions may be made by adjusting the circuitry that controls the LCD panel subsystem.
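  • The steps of FIG. 10 can be summarized in a small dispatch routine such as the Python sketch below. The dictionary-based display model and the 50% reductions are assumptions made only for illustration.

      def apply_power_reduction(display: dict) -> dict:
          """Reduce display power according to the display architecture (after FIG. 10)."""
          if display["type"] == "oled":
              # Box 136: partial or full power reduction to some or all individually controlled pixels.
              display["pixel_brightness"] = [b * 0.5 for b in display["pixel_brightness"]]
          else:
              levels = display["backlight_levels"]
              if len(levels) > 1:
                  # Step 138: dim some backlight elements while the others remain fully powered.
                  for i in range(len(levels) // 2):
                      levels[i] *= 0.5
              else:
                  # Step 140: partially (or fully) reduce the single backlight element.
                  levels[0] *= 0.5
              # Steps 138 and 140: further savings are possible by adjusting the LCD panel drive circuitry.
          return display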

Abstract

An electronic device may have gaze detection capabilities that allow the device to detect when a user is looking at the device. The electronic device may implement a power management scheme using the results of gaze detection operations. When the device detects that the user has looked away from the device, the device may dim a display screen and may perform other suitable actions. The device may pause a video playback operation when the device detects that the user has looked away from the device. The device may resume the video playback operation when the device detects that the user is looking towards the device. Gaze detector circuitry may be powered down when sensor data indicates that gaze detection readings will not be reliable or are not needed.

Description

  • This application is a divisional of patent application Ser. No. 12/242,251, filed Sep. 30, 2008, which is hereby incorporated by reference herein in its entirety. This application claims the benefit of and claims priority to patent application Ser. No. 12/242,251, filed Sep. 30, 2008.
  • BACKGROUND
  • This invention relates generally to electronic devices, and more particularly, to electronic devices such as portable electronic devices that have gaze detection capabilities.
  • Electronic devices such as portable electronic devices are becoming increasingly popular. Examples of portable devices include handheld computers, cellular telephones, media players, and hybrid devices that include the functionality of multiple devices of this type. Popular portable electronic devices that are somewhat larger than traditional handheld electronic devices include laptop computers and tablet computers.
  • To satisfy consumer demand for small form factor portable electronic devices, manufacturers are continually striving to reduce the size of components that are used in these devices. For example, manufacturers have made attempts to miniaturize the batteries used in portable electronic devices.
  • An electronic device with a small battery has limited battery capacity. Unless care is taken to consume power wisely, an electronic device with a small battery may exhibit unacceptably short battery life. Techniques for reducing power consumption may be particularly important in wireless devices that support cellular telephone communications, because users of cellular telephone devices often demand long “talk” times.
  • Conventional portable electronic devices use various techniques for reducing their power consumption. Because display screens in electronic devices can consume relatively large amounts of power, power conservation techniques in portable electronic devices with display screens typically involve turning off the display screens at particular times. Unfortunately, conventional power conservation techniques may turn off display screens at inappropriate times, thereby interfering with a user's ability to interact with a device. Conventional techniques may also leave display screens on at inappropriate times, wasting valuable battery power.
  • It would therefore be desirable to be able to provide improved ways in which to conserve power in electronic devices.
  • SUMMARY
  • An electronic device is provided that may have gaze detection capabilities. One or more gaze detection sensors such as a camera may be used by the electronic device to determine whether a user's gaze is directed towards the electronic device (e.g., whether the user of the electronic device is looking at the electronic device). In particular, the electronic device may use gaze detection sensors to determine whether or not the user is looking at a display portion of the electronic device.
  • In an illustrative embodiment, the electronic device may have power management capabilities that are used to help conserve power. The electronic device may operate in two or more operating modes. One operating mode may be used to optimize performance. Another operating mode may help to extend battery life. The electronic device may use results from gaze detection operations to determine an appropriate mode in which to operate the electronic device.
  • For example, the electronic device may operate in an active mode when the electronic device determines, using gaze detection sensors, that the user's gaze is directed towards the electronic device and may operate in one or more standby modes when the device determines that the user's gaze is not directed towards the electronic device. When the electronic device is operating in one of the standby modes, circuitry and components such as a display screen, touch screen components, gaze detection components, and a central processing unit or CPU in the electronic device may be powered down or operated in a low-power mode to minimize power consumption in the electronic device.
  • With one suitable arrangement, when the electronic device is in the active mode and detects that the user has looked away from the device, the electronic device may dim or turn off a display screen. If desired, the electronic device can dim the display screen to a standby brightness level after the device has determined that the user has looked away from the device. After a given period of time has elapsed in which no user input has been received by the electronic device, the electronic device can turn off the display screen to conserve power. When the electronic device detects that the user's gaze is directed towards the electronic device, the electronic device may enter the active mode and return the display screen to an active brightness level (e.g., turn on the display screen or brighten the display screen to the active brightness level).
  • If desired, the electronic device may be performing an operation, while in the active mode, that is uninterrupted when the electronic device switches to operating in one of the standby modes. For example, the electronic device may be performing a music playback operation while in the active mode and, when the electronic device detects the user's gaze is not directed towards the electronic device, the electronic device may enter one of the standby modes without interrupting the music playback operation.
  • With one suitable arrangement, the electronic device may interrupt an operation when the electronic device begins operating in one of the standby modes. For example, the electronic device may be performing a video playback operation while in the active mode. In this example, when the electronic device detects that the user's gaze is no longer directed towards the electronic device, the electronic device may enter one of the standby modes, dim the display screen that was being used for the video playback operation, and pause the video playback operation. If desired, the electronic device may resume the video playback operation when it detects that the user has redirected their gaze towards the electronic device (e.g., towards the video screen).
  • In an illustrative embodiment, the electronic device may use readings from sensors such as proximity sensors, ambient light sensors, and motion sensors such as accelerometers to determine whether or not to perform gaze detection operations. For example, the electronic device may suspend gaze detection operations whenever a proximity sensor, ambient light sensor, or accelerometer indicates that gaze detection operations are inappropriate (e.g., because of an object in close proximity with the electronic device, insufficient ambient light for gaze detection sensors to detect the user's gaze, excessive vibration which may degrade the performance of gaze detection sensors, etc.).
  • An advantage of powering down the display is that a powered down display can help to prevent information on the display from being viewed by an unauthorized viewer. It may therefore be helpful to turn off a display when the lack of a user's gaze indicates that the user is not present to guard the device.
  • Further features of the invention, its nature and various advantages will be more apparent from the accompanying drawings and the following detailed description of the preferred embodiments.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of an illustrative portable electronic device that may have gaze detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of an illustrative portable electronic device that may have gaze detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 3 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 4 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities during a music playback operation in accordance with an embodiment of the present invention.
  • FIG. 5 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities and activity detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 6 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities during a video playback operation in accordance with an embodiment of the present invention.
  • FIG. 7 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection and touch screen input capabilities in accordance with an embodiment of the present invention.
  • FIG. 8 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities in accordance with an embodiment of the present invention.
  • FIG. 9 is a state diagram of illustrative operating modes of an illustrative electronic device with gaze detection capabilities and sensors such as environment sensors in accordance with an embodiment of the present invention.
  • FIG. 10 is a flow chart of illustrative steps involved in reducing power to displays in an electronic device in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION
  • The present invention relates generally to electronic devices, and more particularly, to electronic devices such as portable electronic devices that have gaze detection capabilities.
  • With one suitable arrangement, an electronic device with gaze detection capabilities may have the ability to determine whether a user's gaze is within a given boundary without resolving the specific location of the user's gaze within that boundary. The electronic device, as an example, may be able to detect whether a user's gaze is directed towards a display associated with the device. With another suitable arrangement, an electronic device may have gaze tracking capabilities. Gaze tracking capabilities allow the electronic device to determine not only whether or not a user's gaze is directed towards a display associated with the device but also which portion of the display the user's gaze is directed towards.
  • An electronic device may be used to detect a user's gaze and adjust its behavior according to whether or not the user's gaze is detected. For example, the electronic device may be able to detect whether or not the user is looking at the device and adjust power management settings accordingly. With one suitable arrangement, the electronic device may delay turning device components off (e.g., components which would otherwise be turned off as part of a power management scheme) while the user's gaze is directed towards the device and the electronic device may accelerate the shutdown of device components when the user's gaze is not detected. For example, when the user's gaze is detected, a device with a display may keep the display at normal brightness rather than dimming the display and, when the device detects the user is no longer looking at the device, the device may dim or turn off the display. This type of arrangement may be especially beneficial in situations in which the user is not actively controlling the electronic device (e.g., the user is not pressing buttons or supplying touch screen inputs) but is still interacting with the electronic device (e.g., the user is reading text on the display, watching video on the display, etc.). An advantage of turning off the display when the user is not looking at the display is that this may help prevent unauthorized users from viewing information on the display, thereby enhancing device security.
  • Electronic devices that have gaze detection capabilities may be portable electronic devices such as laptop computers or small portable computers of the type that are sometimes referred to as ultraportables. Portable electronic devices may also be somewhat smaller devices. Examples of smaller portable electronic devices include wrist-watch devices, pendant devices, headphone and earpiece devices, and other wearable and miniature devices. With one suitable arrangement, the portable electronic devices may be wireless electronic devices.
  • The wireless electronic devices may be, for example, handheld wireless devices such as cellular telephones, media players with wireless communications capabilities, handheld computers (also sometimes called personal digital assistants), global positioning system (GPS) devices, and handheld gaming devices. The wireless electronic devices may also be hybrid devices that combine the functionality of multiple conventional devices. Examples of hybrid portable electronic devices include a cellular telephone that includes media player functionality, a gaming device that includes a wireless communications capability, a cellular telephone that includes game and email functions, and a portable device that receives email, supports mobile telephone calls, has music player functionality, and supports web browsing. These are merely illustrative examples.
  • An illustrative portable electronic device in accordance with an embodiment of the present invention is shown in FIG. 1. User device 10 may be any suitable electronic device such as a portable or handheld electronic device. Device 10 of FIG. 1 may be, for example, a handheld electronic device that supports 2G and/or 3G cellular telephone and data functions, global positioning system capabilities or other satellite navigation capabilities, and local wireless communications capabilities (e.g., IEEE 802.11 and Bluetooth®) and that supports handheld computing device functions such as internet browsing, email and calendar functions, games, music player functionality, etc.
  • Device 10 may have a housing 12. Display 16 may be attached to housing 12 using bezel 14. Display 16 may be a touch screen liquid crystal display (as an example). Display 16 may have pixels that can be controlled individually in connection with power consumption adjustments. For example, in an organic light emitting diode (OLED) display, power can be reduced by making full and/or partial brightness reductions to some or all of the pixels. Display 16 may be formed from a panel subsystem and a backlight subsystem. For example, display 16 may have a liquid crystal display (LCD) panel subsystem and a light emitting diode or fluorescent tube backlight subsystem. In backlight subsystems that contain individually controllable elements such as light emitting diodes, the brightness of the backlight elements may be selectively controlled. For example, the brightness of some of the backlight elements may be reduced while the other backlight elements remain fully powered. In backlight subsystems that contain a single backlight element, the power of the single element may be partially or fully reduced to reduce power consumption. It may also be advantageous to make power adjustments to the circuitry that drives the LCD panel subsystem.
  • Display screen 16 (e.g., a touch screen) is merely one example of an input-output device that may be used with electronic device 10. If desired, electronic device 10 may have other input-output devices. For example, electronic device 10 may have user input control devices such as button 19, and input-output components such as port 20 and one or more input-output jacks (e.g., for audio and/or video). Button 19 may be, for example, a menu button. Port 20 may contain a 30-pin data connector (as an example). Openings 22 and 24 may, if desired, form speaker and microphone ports. Speaker port 22 may be used when operating device 10 in speakerphone mode. Opening 23 may also form a speaker port. For example, speaker port 23 may serve as a telephone receiver that is placed adjacent to a user's ear during operation. In the example of FIG. 1, display screen 16 is shown as being mounted on the front face of handheld electronic device 10, but display screen 16 may, if desired, be mounted on the rear face of handheld electronic device 10, on a side of device 10, on a flip-up portion of device 10 that is attached to a main body portion of device 10 by a hinge (for example), or using any other suitable mounting arrangement.
  • A user of electronic device 10 may supply input commands using user input interface devices such as button 19 and touch screen 16. Suitable user input interface devices for electronic device 10 include buttons (e.g., alphanumeric keys, power on-off, power-on, power-off, and other specialized buttons, etc.), a touch pad, pointing stick, or other cursor control device, a microphone for supplying voice commands, or any other suitable interface for controlling device 10. Buttons such as button 19 and other user input interface devices may generally be formed on any suitable portion of electronic device 10. For example, a button such as button 19 or other user interface control may be formed on the side of electronic device 10. Buttons and other user interface controls can also be located on the top face, rear face, or other portion of device 10. If desired, device 10 can be controlled remotely (e.g., using an infrared remote control, a radio-frequency remote control such as a Bluetooth® remote control, etc.).
  • If desired, device 10 may contain sensors such as a proximity sensor and an ambient light sensor. A proximity sensor may be used to detect when device 10 is close to a user's head or other object. An ambient light sensor may be used to make measurements of current light levels.
  • Device 10 may have a camera or other optical sensor such as camera 30 that can be used for gaze detection operations. Cameras used for gaze detection may, for example, be used by device 10 to capture images of a user's face that are processed by device 10 to detect where the user's gaze is directed. Camera 30 may be integrated into housing 12. While shown as being formed on the top face of electronic device 10 in the example of FIG. 1, cameras such as camera 30 may generally be formed on any suitable portion of electronic device 10. For example, camera 30 may be mounted on a flip-up portion of device 10 that is attached to a main body portion of device 10 by a hinge or may be mounted between the flip-up portion of device 10 and the main body portion of device 10 (e.g., in the hinge region between the flip-up portion and the main body portion such that the camera can be used regardless of whether the device is flipped open or is closed). Device 10 may also have additional cameras (e.g., device 10 may have camera 30 on the top face of device 10 for gaze detection operations and another camera on the bottom face of device 10 for capturing images and video).
  • If desired, the gaze detection functions of camera 30 may be implemented using an optical sensor that has been optimized for gaze detection operations. For example, camera 30 may include one or more light emitting diodes (LEDs) and an optical sensor capable of detecting reflections of light emitted from the LEDs off of the users' eyes when the users are gazing at device 10. The light emitting diodes may emit a modulated infrared light and the optical sensor may be synchronized to detect reflections of the modulated infrared light, as an example. In general, any suitable gaze detection image sensor and circuitry may be used for supporting gaze detection operations in device 10. The use of camera 30 is sometimes described herein as an example.
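  • One way to synchronize the optical sensor to modulated LED light, as described above, is a simple correlation check at the modulation frequency, as in the Python sketch below. The modulation frequency, sample rate, and detection threshold are illustrative assumptions rather than values from the patent.

      import math

      MOD_FREQ_HZ = 1000.0     # illustrative LED modulation frequency
      SAMPLE_RATE_HZ = 8000.0  # illustrative optical sensor sample rate

      def reflection_strength(samples):
          """Correlate sensor samples against the LED modulation frequency."""
          i_sum = q_sum = 0.0
          for n, s in enumerate(samples):
              phase = 2.0 * math.pi * MOD_FREQ_HZ * n / SAMPLE_RATE_HZ
              i_sum += s * math.cos(phase)
              q_sum += s * math.sin(phase)
          return math.hypot(i_sum, q_sum) / max(len(samples), 1)

      def synchronized_reflection_detected(samples, threshold=0.1):
          """Treat a strong reflection at the modulation frequency as a candidate eye reflection."""
          return reflection_strength(samples) > threshold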
  • A schematic diagram of an embodiment of an illustrative portable electronic device such as a handheld electronic device is shown in FIG. 2. Portable device 10 may be a mobile telephone, a mobile telephone with media player capabilities, a handheld computer, a remote control, a game player, a global positioning system (GPS) device, a laptop computer, a tablet computer, an ultraportable computer, a hybrid device that includes the functionality of some or all of these devices, or any other suitable portable electronic device.
  • As shown in FIG. 2, device 10 may include storage 34. Storage 34 may include one or more different types of storage such as hard disk drive storage, nonvolatile memory (e.g., flash memory or other electrically-programmable-read-only memory), volatile memory (e.g., battery-based static or dynamic random-access-memory), etc.
  • Processing circuitry 36 may be used to control the operation of device 10. Processing circuitry 36 may be based on a processor such as a microprocessor and other suitable integrated circuits. With one suitable arrangement, processing circuitry 36 and storage 34 are used to run software on device 10, such as gaze detection applications, internet browsing applications, voice-over-internet-protocol (VOIP) telephone call applications, email applications, media playback applications, navigation functions, map functions, operating system functions, power management functions, etc. Processing circuitry 36 and storage 34 may be used in implementing suitable communications protocols. Communications protocols that may be implemented using processing circuitry 36 and storage 34 include internet protocols, wireless local area network protocols (e.g., IEEE 802.11 protocols—sometimes referred to as Wi-Fi®), protocols for other short-range wireless communications links such as the Bluetooth® protocol, protocols for handling 3G communications services (e.g., using wide band code division multiple access techniques), 2G cellular telephone communications protocols, etc. If desired, processing circuitry 36 may operate in a reduced power mode (e.g., circuitry 36 may be suspended or operated at a lower frequency) when device 10 enters a suitable standby mode.
  • Input-output devices 38 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Display screen 16, camera 30, button 19, microphone port 24, speaker port 22, and dock connector port 20 are examples of input-output devices 38. In general, input-output devices 38 may include any suitable components for receiving input and/or providing output from device 10. For example, input-output devices 38 can include user input-output devices 40 such as buttons, touch screens, joysticks, click wheels, scrolling wheels, touch pads, key pads, keyboards, microphones, etc. A user can control the operation of device 10 by supplying commands through user input devices 40. Input-output devices 38 may include sensors such as proximity sensors, ambient light sensors, orientation sensors, and any other suitable sensors.
  • Input-output devices 38 may include a camera such as integrated camera 41 (e.g., a camera that is integrated into the housing of device 10) and camera 30 of FIG. 1. Cameras such as camera 41 and camera 30 may be used as part of a gaze detection system. For example, camera 41 may be used by device 10 to capture images that are processed by a gaze detection application running on processing circuitry 36 to determine whether or not a user's gaze is directed towards the device. Cameras such as camera 41 and camera 30 may, if desired, be provided with image stabilization capabilities (e.g., using feedback derived from an accelerometer, orientation sensor, or other sensor).
  • Display and audio devices 42 may include liquid-crystal display (LCD) screens or other screens, light-emitting diodes (LEDs), and other components that present visual information and status data. Display and audio devices 42 may also include audio equipment such as speakers and other devices for creating sound. Display and audio devices 42 may contain audio-video interface equipment such as jacks and other connectors for external headphones and monitors.
  • Wireless communications devices 44 may include communications circuitry such as radio-frequency (RF) transceiver circuitry formed from one or more integrated circuits, power amplifier circuitry, passive RF components, antennas, and other circuitry for handling RF wireless signals. Wireless signals can also be sent using light (e.g., using infrared communications).
  • Device 10 can communicate with external devices such as accessories 46, computing equipment 48, and wireless network 49, as shown by paths 50 and 51. Paths 50 may include wired and wireless paths. Path 51 may be a wireless path. Accessories 46 may include headphones (e.g., a wireless cellular headset or audio headphones) and audio-video equipment (e.g., wireless speakers, a game controller, or other equipment that receives and plays audio and video content), a peripheral such as a wireless printer or camera, etc.
  • Computing equipment 48 may be any suitable computer. With one suitable arrangement, computing equipment 48 is a computer that has an associated wireless access point (router) or an internal or external wireless card that establishes a wireless connection with device 10. The computer may be a server (e.g., an internet server), a local area network computer with or without internet access, a user's own personal computer, a peer device (e.g., another portable electronic device 10), or any other suitable computing equipment.
  • Wireless network 49 may include any suitable network equipment, such as cellular telephone base stations, cellular towers, wireless data networks, computers associated with wireless networks, etc.
  • A device such as device 10 that has gaze detection capabilities may use gaze detector data in implementing a power management scheme. As an example, device 10 may operate in multiple modes to conserve power and may utilize gaze detection operations to assist in determining an appropriate mode in which to operate.
  • With one suitable arrangement, the operational modes of device 10 may include modes such as an active mode, a partial standby mode, and a full standby mode. In these and other operational modes, device 10 may adjust the brightness of display 16 and may turn display 16 on or off whenever appropriate in order to conserve power. For example, display 16 may be at an active brightness when device 10 is in the active mode, a standby brightness when device 10 is in the partial standby mode, and may be turned off when device 10 is in the full standby mode. The standby brightness may be somewhat dimmer than the active brightness. Generally, the power consumption of display 16 and therefore device 10 will be reduced when the brightness of display 16 is reduced and when display 16 is turned off.
  • Consider, as an example, the scenario of FIG. 3. In mode 52 of FIG. 3, device 10 is in an active mode. In general, it is desirable for device 10 to be in the active mode whenever a user is actively interacting with device 10. In particular, it is desirable for display 16 to be at the active brightness level whenever the user's gaze is directed towards display 16.
  • When device 10 is in the active mode, a display such as display 16 may be turned on and may display an appropriate screen such as an application display screen at the active brightness level. The active brightness level may be a configurable brightness level. For example, device 10 may receive input from a user to adjust the active brightness level. In general, the active brightness level may be adjusted anywhere between the minimum and maximum brightness levels that display 16 is capable of producing.
  • If desired, device 10 may be performing a music playback operation when device 10 is in the active mode. In the example of FIG. 3, the music playback operation may be occurring in the background of the operation of device 10 (e.g., device 10 may be performing the music playback operation while display 16 and user input device 40 are used by the user to perform additional tasks such as writing an e-mail, browsing the web, etc.).
  • While device 10 is in the active mode, device 10 may be performing gaze detection operations. For example, when device 10 is in the active mode, device 10 may be capturing images using camera 30 or other image sensing components at regular intervals and may be analyzing the images using gaze detection software. Based on this analysis, the device can determine whether the user's gaze is directed towards device 10 and display 16. When device 10 is performing gaze detection operations, device 10 may be capturing images used for the gaze detection operations at any suitable interval such as thirty times per second, ten times per second, twice per second, once per second, every two seconds, every five seconds, upon occurrence of non-time-based criteria, combinations of these intervals, or at any other suitable time.
  • As illustrated by line 54, when device 10 detects that the user has looked away, device 10 may dim display screen 16 and may enter partial standby mode 56. Device 10 may detect that the user has diverted their gaze away from device 10 and display 16 using a gaze detection sensor such as camera 30 and gaze detection software running on the hardware of device 10. If desired, gaze detection processing may be offloaded to specialized gaze detection circuitry (e.g., circuitry in a gaze detection chip or a camera controller).
  • In mode 56, device 10 is in a partial standby mode. In the partial standby mode, the brightness level of display 16 may be reduced from an active brightness level to a standby brightness level to reduce the power consumption of device 10. When device 10 enters a standby mode such as the partial standby mode, some operations running on device 10 may be suspended or stopped and some operations may continue running. For example, a music playback operation may continue when device 10 enters one of its standby modes while a web browsing application may be suspended. With this type of arrangement, when a user of device 10 is listening to music through the device while browsing the web on display 16, device 10 can dim display 16 to conserve power whenever the user looks away from display 16 while continuing to play back the music that the user is listening to without interruption.
  • As illustrated by line 58, when device 10 detects activity, device 10 may brighten display screen 16 and may enter active mode 52. Device 10 may enter active mode 52 in response to user activity such as button press activity received through a button such as button 19 and in response to other activity such as network activity (e.g., activity received through a wired or wireless communications link). In this type of arrangement, device 10 will enter the active mode whenever a user resumes interacting with device 10.
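  • A minimal Python sketch of the FIG. 3 behavior follows: the display is dimmed when the user looks away and restored on gaze or other activity, while background operations continue. The brightness constants are placeholder values, since the patent leaves the active and standby levels configurable.

      ACTIVE_BRIGHTNESS = 1.0    # user-configurable active brightness level
      STANDBY_BRIGHTNESS = 0.3   # dimmer partial standby level (placeholder value)

      def display_brightness(gaze_on_display: bool, activity_detected: bool) -> float:
          """Dim the display when the user looks away; restore it on gaze or other activity."""
          if gaze_on_display or activity_detected:
              return ACTIVE_BRIGHTNESS
          return STANDBY_BRIGHTNESS

      # Background operations such as music playback continue regardless of which
      # brightness level is selected here.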
  • As illustrated by FIG. 4, device 10 may implement a power management scheme that turns off display 16 based on gaze detection data. In particular, device 10 may turn off display 16 when the device detects that the user is not looking at display 16 (e.g., rather than merely dimming display 16 as in the example of FIG. 3).
  • In mode 60, device 10 is in an active mode. While device 10 is in the active mode, device 10 may perform gaze detection operations. Because device 10 is in the active mode, display 16 may be at an active brightness level. With one suitable arrangement, when device 10 is in active mode 60, device 10 may be displaying a screen with display 16 that is of interest to the user but which does not demand the user's constant attention. For example, when device 10 is in mode 60, device 10 may be displaying a screen such as a now playing screen associated with a music playback operation or a telephone information screen associated with a telephone operation (e.g., a new incoming call, a new outgoing call, an active call, etc.). The now playing screen may, for example, include information about the music playback operation such as a track name, album name, artist name, elapsed playback time, remaining playback time, album art, etc. and may include on-screen selectable options (e.g., when display 16 is a touch-screen display) such as play, pause, fast forward, rewind, skip ahead (e.g., to another audio track), skip back, stop, etc. A telephone information screen might include information about a telephone operation such as a current call time, the telephone number associated with a telephone call, a contact name associated with the telephone call, and an image associated with the telephone call and may include on-screen selectable options such as a keypad to enter a telephone number, a call button, an end call button, a hold button, a speakerphone button, a mute button, an add call button, a contacts button, etc.
  • As illustrated by line 62, when device 10 detects that the user has looked away from display 16, device 10 may turn off display 16 and may enter standby mode 64. When device 10 is in standby mode 64, device 10 may continue to perform background operations such as a music playback operation that was occurring before device 10 entered standby mode 64 (e.g., before device 10 detected that the user's gaze was diverted away from display 16). Because the application screen displayed in mode 60 is of secondary importance to the user, device 10 may turn off display 16 completely when the user looks away without disrupting the user. For example, when a user is listening to an audio track and is also viewing information associated with the audio track on a now playing screen, device 10 can turn off display 16 when the user looks away, while continuing an audio playback operation. The user's primary use of device 10 (listening to music) is not interrupted, even though the secondary use of device 10 (viewing the now playing screen) has been halted.
  • In mode 64, device 10 is in a standby mode. In standby mode 64, display 16 may be turned off by device 10 to conserve power. When device 10 enters standby mode 64, suitable components of device 10 may be powered down (if desired). For example, in mode 64, the power consumption of processing circuitry 36 may be reduced (e.g., by operating fewer processor cores, by reducing the computing frequency of circuitry 36, etc.). With one suitable arrangement, an operation such as a music playback operation or a telephone call may continue when device 10 is in mode 64.
  • As illustrated by line 66, when device 10 detects activity such as user activity, device 10 may enter active mode 60 and turn on display 16. Device 10 may enter active mode 60 in response to any suitable activity such as button press activity, network activity, and gaze detection activity (e.g., when device 10 detects that the user has directed their gaze towards device 10).
  • As shown in the example of FIG. 5, device 10 may implement a power management scheme that is responsive to gaze detection data and other input data (e.g., user input, network input, etc.). In the power management scheme illustrated in FIG. 5, device 10 can switch between an active mode, a partial standby mode, and a standby mode. Device 10 may power down hardware components and suspend or slow down software operations depending on the mode in which device 10 is operating. For example, when device 10 is in either of the standby modes, device 10 may reduce the number of processing cores utilized by circuitry 36 and/or may reduce the processing frequency (clock rate) of circuitry such as circuitry 36. With one suitable arrangement, device 10 may turn display 16 on at an active brightness level in the active mode, dim display 16 to a standby brightness level in the partial standby mode, and turn display 16 off in the standby mode.
  • In mode 68, device 10 is in an active mode. While device 10 is in the active mode, device 10 may perform gaze detection operations. Display 16 may be at the active brightness level while device 10 is in active mode 68. In mode 68, device 10 may be displaying an application display screen such as a home page, a music playback application screen, a web browsing application screen, an email application screen, etc. If desired, device 10 may also be performing a music playback operation while in mode 68 (e.g., device 10 may be performing the music playback operation as a background process as device 10 displays the application display screen).
  • When device 10 detects that the user has looked away from display 16 (e.g., using a gaze detection sensor such as camera 30), device 10 may dim display 16 and enter partial standby mode 72, as illustrated by line 70.
  • In mode 72, device 10 is in a partial standby mode. In partial standby mode 72, device 10 may dim display 16 to a partial standby brightness level to conserve power and, if desired, may place other components such as processing circuitry, wireless transceiver circuitry, etc. in a standby mode to conserve power. Certain operations may continue when device 10 enters mode 72. For example, a music playback operation or a telephone call may continue uninterrupted when device 10 enters mode 72.
  • Device 10 may perform gaze detection operations while in mode 72. For example, device 10 may continually capture images using camera 30 at regular intervals and may analyze the captured images using gaze detection software to determine whether the user's gaze has returned to device 10 and display 16. If desired, the rate at which device 10 captures and processes images for gaze detection operations while in mode 72 may be reduced relative to the rate at which gaze detection images are captured and processed while device is in an active mode such as mode 68 (e.g., device 10 may capture images at a rate of once every 100 milliseconds, 250 milliseconds, 500 milliseconds, 1 second, etc. in mode 72 and once every 50 milliseconds, 125 milliseconds, 250 milliseconds, 500 milliseconds, etc. in mode 68).
  • Device 10 may switch from partial standby mode 72 to active mode 68 whenever appropriate. For example, when device 10 detects that a user's gaze is directed towards display 16, device 10 may enter an active mode such as mode 68 (e.g., as illustrated by line 75) and may brighten display 16 to the active brightness level. Device 10 may also enter active mode 68 when device 10 detects activity such as user activity received through a button such as button 19 and network activity received through a wired or wireless communications link (e.g., as illustrated by line 74). In general, device 10 will enter active mode 68 whenever a user resumes interacting with device 10 or device 10 needs to respond to network activity. Because device 10 enters active mode 68 when device 10 detects that the user's gaze is directed towards display 16 (e.g., as illustrated by line 75), the user of device 10 need not press a button or provide other input to awaken device 10 from the partial standby state. Instead, device 10 can automatically awaken (e.g., switch to active mode 68) when device 10 detects that the user has directed their gaze towards display 16.
  • If desired, device 10 may operate in a standby mode such as standby mode 76 in which display 16 is turned off. For example, when device 10 is operating in partial standby mode 72 and no user activity is detected for a given period of time (e.g., within a period of time such as one second, two seconds, . . . , ten seconds, twenty seconds, thirty seconds, etc.), device 10 may enter standby mode 76 and turn off display 16. Device 10 may enter standby mode 76, as illustrated by line 79, after device 10 detects that the user has looked away (e.g., as illustrated by line 70) and after a given period of user inactivity has elapsed following the device's detection that the user looked away.
  • In standby mode 76, device 10 may operate with display 16 turned off. Device 10 may place suitable components into standby. For example, device 10 may turn wireless transceiver circuitry off, reduce the power consumption of processing circuitry such as circuitry 36 (e.g., by turning off selected processing cores or lowering clock rates), turn off sensors such as proximity sensors, ambient light sensors, and accelerometers, and may suspend or power down any other suitable components. If desired, certain operations may continue when device 10 enters and operates in standby mode 76. For example, a music playback operation or a telephone call may continue uninterrupted when device 10 enters mode 76.
  • With the arrangement of FIG. 5, as long as device 10 detects that the user's gaze is directed at the device (e.g., the user is looking at display 16), device 10 may remain in active mode 68. Device 10 may remain in active mode 68 even when no other user activity is received (e.g., when the user is not pressing a button such as button 19 or providing user input through a touch screen such as touch screen display 16). This type of arrangement may be beneficial when a user is utilizing device 10 without providing user input and would be inconvenienced by device 10 implementing power management techniques. Device 10 can override power management schemes such as dimming a display screen based on results of gaze detection operations. For example, when device 10 detects a user's gaze and is presenting the user with text or video through display 16, device 10 may override power management instructions that could otherwise reduce the power of display 16 to ensure that display 16 is not dimmed or turned off even though the user has not provided direct user input.
  • If desired, device 10 may continue to perform gaze detection operations when operating in standby mode 76. As illustrated by dashed line 77, device 10 may switch from standby mode 76 to active mode 68 whenever device 10 detects that a user's gaze is once again directed towards display 16.
  • As illustrated by line 78, when device 10 detects activity, device 10 may switch from mode 76 to active mode 68 (e.g., device 10 may turn on display 16 to the active brightness level). As an example, device 10 may enter active mode 68 in response to user activity such as button press activity received through a button such as button 19 and in response to other activity such as network activity (e.g., activity received through a wired or wireless communications link).
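  • The transitions of FIG. 5 can be summarized as a small state machine: a look-away dims the display, a further period of inactivity turns it off, and a returning gaze or other activity restores the active mode. The sketch below is one possible reading of that logic; the event names and the ten-second timeout are assumptions, not values recited in the description.

```python
# Hypothetical event-driven transition logic for a FIG. 5 style scheme.
STANDBY_TIMEOUT_S = 10.0  # assumed inactivity period before full standby

def next_mode(mode: str, event: str, idle_seconds: float) -> str:
    """Return the next power mode given the current mode and an event.

    Events: 'gaze_lost', 'gaze_detected', 'user_or_network_activity', 'tick'.
    """
    if event in ("gaze_detected", "user_or_network_activity"):
        return "active"                       # transitions like lines 74/75/77/78
    if mode == "active" and event == "gaze_lost":
        return "partial_standby"              # line 70: dim the display
    if mode == "partial_standby" and event == "tick" and idle_seconds >= STANDBY_TIMEOUT_S:
        return "standby"                      # line 79: turn the display off
    return mode

if __name__ == "__main__":
    mode = "active"
    for event, idle in [("gaze_lost", 0), ("tick", 12), ("gaze_detected", 0)]:
        mode = next_mode(mode, event, idle)
        print(event, "->", mode)
```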
  • As illustrated in FIG. 6, device 10 may implement a power management scheme that utilizes gaze detection capabilities while executing a video playback operation (e.g., while playing video for a user). In the scheme illustrated by FIG. 6, device 10 can operate in an active mode (mode 80), a pause standby mode (e.g., a partial standby mode such as mode 84), and a standby mode (mode 90). With one suitable arrangement, device 10 may be performing a video playback operation for a user when the device is in the active mode, device 10 may pause the video playback operation and dim an associated display screen when the user looks away from the device and the device enters the pause standby mode, and device 10 may turn off the display screen (e.g., the screen used for the video playback operation) if the user does not look back towards the device within a given period of time and no other user activity is detected.
  • In active mode 80, device 10 is active. While device 10 is in the active mode, device 10 may perform gaze detection operations (e.g., using camera 30 and processing circuitry 36 to detect whether or not a user is gazing at display 16). While in mode 80, device 10 may perform a video playback operation. For example, device 10 may display video on display 16 and may play audio associated with the video through a speaker such as speaker 22 or through a headphone accessory such as accessory 46. Display 16 may display the video at an active brightness level (e.g., display 16 may be at a relatively bright display level).
  • When device 10 detects that the user has looked away from display 16 (e.g., using a gaze detection sensor such as camera 30), device 10 may dim display 16 and enter pause standby mode 84 as illustrated by line 82. As part of entering pause standby mode 84, device 10 may pause the video playback operation of mode 80. Generally, when device 10 pauses the video playback operation, device 10 will also pause an accompanying audio playback associated with the video playback operation. The user may, if desired, configure whether device 10 pauses the audio.
  • In mode 84, device 10 is in a pause standby mode. In pause standby mode 84, device 10 may dim display 16 to a pause standby brightness level (e.g., a partial standby brightness level) to conserve power. The video playback operation of mode 80 may be paused while device 10 is in mode 84. If desired, device 10 may place components such as processing circuitry and wireless transceiver circuitry in a standby mode while device 10 is in mode 84 (e.g., by turning off unused CPU cores or reducing clock rates).
  • With one suitable arrangement, device 10 may be performing gaze detection operations while in pause standby mode 84. For example, device 10 may capture images using camera 30 at regular intervals and may analyze the images using gaze detection software to continually monitor whether the user's gaze has returned to device 10 and display 16.
  • Device 10 may switch from pause standby mode 84 to mode 80 whenever appropriate. For example, whenever device 10 detects that a user's gaze is once again directed towards display 16, device 10 may enter an active mode such as mode 80 (e.g., as illustrated by line 86), brighten display 16 to the active brightness level, and resume the video playback operation. Device 10 may also enter mode 80 when device 10 detects activity such as user activity received through a button such as button 19 or network activity received through a wired or wireless communications link (e.g., as illustrated by dashed line 87). In general, device 10 will enter mode 80 whenever a user resumes interacting with device 10.
  • Because device 10 enters mode 80 when it detects that the user's gaze is directed towards display 16 (e.g., as illustrated by line 86), the user of device 10 need not press a button or provide other input to awaken device 10 from the pause standby state and resume the video playback operation of mode 80. Instead, device 10 can automatically awaken itself (e.g., switch to mode 80) and resume the video playback operation when the user directs their gaze towards display 16.
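  • A brief sketch of this pause-on-look-away behavior: playback pauses and the display dims when the gaze is lost, and playback resumes with the display restored when the gaze returns. The player interface and brightness values shown here are hypothetical stand-ins, not the device's actual media engine.

```python
class VideoPlayer:
    """Minimal stand-in for a media playback engine."""
    def __init__(self):
        self.playing = False
    def play(self):
        self.playing = True
        print("video playing")
    def pause(self):
        self.playing = False
        print("video paused")

def on_gaze_change(player: VideoPlayer, gaze_on_display: bool, display: dict) -> None:
    """Pause and dim on look-away; brighten and resume when the gaze returns."""
    if gaze_on_display and not player.playing:
        display["brightness"] = 1.0   # restore the active brightness level
        player.play()                 # resume without any button press
    elif not gaze_on_display and player.playing:
        display["brightness"] = 0.3   # assumed pause-standby brightness level
        player.pause()                # enter the pause standby state

if __name__ == "__main__":
    display = {"brightness": 1.0}
    player = VideoPlayer()
    player.play()
    on_gaze_change(player, gaze_on_display=False, display=display)
    on_gaze_change(player, gaze_on_display=True, display=display)
```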
  • If desired, device 10 may operate in a standby mode such as standby mode 90 in which display 16 is turned off. For example, when device 10 is operating in pause standby mode 84 and no user activity is detected for a given period of time (e.g., within a period of time such as one second, two seconds, . . . , ten seconds, twenty seconds, thirty seconds, etc.), device 10 may enter standby mode 90 and turn off display 16. Because standby mode 90 involves a lower power state for device 10 than pause standby mode 84, mode 90 may sometimes be referred to as a full standby mode. As illustrated by line 88 in FIG. 6, device 10 may enter full standby mode 90 after device 10 detects that the user has looked away (e.g., as illustrated by line 82) and after a given period of user inactivity has elapsed following the device's detection that the user looked away.
  • In standby mode 90, device 10 may operate with display 16 turned off. Device 10 may also place other suitable components into standby (e.g., wireless circuitry, etc.).
  • With the arrangement of FIG. 6, as long as device 10 detects that the user's gaze is directed at the device (e.g., the user is looking at display 16), device 10 may remain in active mode 80 and the video playback operation can continue (e.g., until the video is completed or the operation is stopped). Device 10 may remain in mode 80 even when no other user activity is being received (e.g., when the user is not pressing a button such as button 19 or providing user input through a touch screen such as touch screen display 16). This type of arrangement may be beneficial when a user is viewing a video on display 16 of device 10 without providing user input and would be inconvenienced if device 10 were to attempt to conserve power by dimming the video screen. Device 10 can pause the video playback operation when the user temporarily looks away and can then resume the operation when the user returns their gaze to device 10. This allows the user of device 10 to automatically pause a video without having to provide direct user input (e.g., without selecting a pause button). The video can be paused simply by looking away from a video display such as display 16.
  • If desired, device 10 may continue to perform gaze detection operations when operating in standby mode 90. As illustrated by dashed line 93, device 10 may switch from standby mode 90 to active mode 80 and resume the video playback operation of mode 80 when device 10 detects that a user's gaze is again directed towards display 16.
  • As illustrated by line 92, when device 10 detects activity, device 10 may switch from mode 90 to active mode 80 (e.g., device 10 may turn on display 16 to the active brightness level). As an example, device 10 may enter active mode 80 in response to user activity such as button press activity received through a button such as button 19 and in response to other activity such as network activity (e.g., activity received through a wired or wireless communications link).
  • If desired, device 10 may automatically resume a video playback operation when the device switches to active mode 80 from a standby mode such as pause standby mode 84 or full standby mode 90. With another suitable arrangement, device 10 may present the user with an option such as an on-screen selectable option to resume the video playback operation when the device switches to active mode 80.
  • Device 10 may have touch screen capabilities and may implement a power management scheme using gaze detection capabilities to control the device's touch screen capabilities. With this type of scheme, which is illustrated by FIG. 7, device 10 can switch between an active mode, a partial standby mode, and a standby mode.
  • Touch screen functions can be adjusted to conserve power. For example, display 16 may be a touch screen display that can operate at varying speeds (e.g., a fast speed and a slow speed) or with varying levels of functionality (e.g., general touch sensitivity, localized touch sensitivity, and gesture-capable touch sensitivity). These features can be adjusted based on gaze detection data.
  • With one suitable arrangement, touch screen display 16 may operate at a first frequency (e.g., at a relatively high speed) when device 10 is in active mode 94 and a second frequency (e.g., a relatively low speed) when device 10 is in standby mode 104. The frequency of touch screen display 16 may be the frequency at which the touch screen scans for user input (e.g., once every 10 milliseconds, 50 milliseconds, 100 milliseconds, 200 milliseconds, etc.).
  • If desired, touch screen display 16 may operate at a first level of functionality when device 10 is in mode 94 and at a second level of functionality when device 10 is in mode 104. For example, when device 10 is in active mode 94, touch screen display 16 may be configured to sense the location of user input within the area of display 16. Device 10 may also be configured to sense user inputs such as multi-touch user inputs and gestures such as swipe gestures and swipe and hold gestures while in mode 94. In contrast, when device 10 is in standby mode 104, touch screen display 16 may be configured such that display 16 can sense general user input such as the presence or absence of contact without being able to resolve the location of the input. The power consumption of display 16 may be reduced when display 16 is configured in this way.
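  • To make these touch-screen adjustments concrete, the sketch below pairs each mode with a scan interval and a sensing capability level. The specific numbers and capability labels are illustrative assumptions, not parameters of any particular touch controller.

```python
from dataclasses import dataclass

@dataclass
class TouchConfig:
    scan_interval_ms: int   # how often the panel scans for user input
    capability: str         # what the panel can resolve at that setting

# Hypothetical settings: full location/multi-touch/gesture sensing when active,
# presence-only sensing (no location resolution) in full standby.
TOUCH_CONFIGS = {
    "active":  TouchConfig(scan_interval_ms=10,  capability="location+multitouch+gestures"),
    "standby": TouchConfig(scan_interval_ms=200, capability="presence-only"),
}

def configure_touch(mode: str) -> TouchConfig:
    """Return (and report) the touch configuration a device might use in `mode`."""
    cfg = TOUCH_CONFIGS[mode]
    print(f"{mode}: scan every {cfg.scan_interval_ms} ms, sensing = {cfg.capability}")
    return cfg

if __name__ == "__main__":
    configure_touch("active")
    configure_touch("standby")
```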
  • In mode 94, device 10 is in an active mode. While device 10 is in the active mode, device 10 may perform gaze detection operations. Touch screen display 16 may be operating at a relatively high frequency (e.g., in the high power mode) while device 10 is in active mode 94. With another suitable arrangement, touch screen display 16 may be operating at or near its maximum capability (e.g., touch screen display 16 may be configured to sense the location of user inputs and to sense user inputs such as multi-touch inputs and gestures). Display 16 may also be displaying an application display screen (e.g., a home page, a telephone application information page, a media player screen, etc.) at an active brightness level.
  • When device 10 detects that the user has looked away from display 16 (e.g., using a gaze detection sensor such as camera 30), device 10 may dim display 16 and enter partial standby mode 98, as illustrated by line 96.
  • In mode 98, device 10 is in a partial standby mode. In partial standby mode 98, device 10 may dim display 16 to a partial standby brightness level to conserve power and may retain the touch screen capabilities of display 16. (Alternatively, touch screen capabilities can be reduced in mode 98.)
  • Device 10 may switch from partial standby mode 98 to active mode 94 whenever appropriate. For example, when device 10 detects that a user's gaze is directed towards display 16, device 10 may enter an active mode such as mode 94 (e.g., as illustrated by line 100) and may brighten display 16 to the active brightness level. Device 10 may also enter active mode 94 when device 10 detects user activity (e.g., as illustrated by dashed line 99). In arrangements in which the touch screen capabilities of display 16 remain at the active mode level when device 10 is in mode 98, display 16 may be able to receive location specific user inputs (e.g., inputs specific to a particular portion of display 16) while device 10 is in mode 98.
  • If desired, device 10 may operate in a full standby mode such as standby mode 104 in which display 16 is turned off and the touch screen capabilities of display 16 are reduced. As an example, when device 10 is operating in partial standby mode 98 and no user activity is detected for a given period of time, device 10 may enter standby mode 104. Device 10 may enter standby mode 104 as illustrated by line 102 after device 10 detects that the user has looked away (e.g., as illustrated by line 96) and after a given period of user inactivity has elapsed following the device's detection that the user has looked away.
  • With another suitable arrangement, device 10 may enter standby mode 104 directly from active mode 94 when no user activity is detected for a configurable period of time (e.g., as illustrated by dashed line 108). Device 10 may enter standby mode 104 even when device 10 detects that a user's gaze is directed towards display 16. If desired, the time period of user inactivity required before device 10 enters mode 104 directly from mode 94 (e.g., when a user's gaze is still directed towards device 10) may be longer than the time period of user inactivity required before device 10 enters mode 104 from mode 98 (e.g., when the user's gaze is not directed towards device 10). For example, the inactivity period associated with the mode transition of line 108 may be one minute or more while the inactivity period associated with the mode transition of line 102 may be thirty seconds or less.
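  • The choice between the two inactivity periods can be expressed as a small helper that keys off whether a gaze is still detected, as sketched below; the 60-second and 30-second figures follow the examples given above, and the function names are hypothetical.

```python
def inactivity_timeout_s(gaze_detected: bool) -> float:
    """Longer timeout while the user is still looking at the display."""
    # Example values from the description: on the order of one minute when the
    # gaze is still detected (active mode directly to standby), and thirty
    # seconds or less when it is not (partial standby to standby).
    return 60.0 if gaze_detected else 30.0

def should_enter_standby(idle_seconds: float, gaze_detected: bool) -> bool:
    """Decide whether the full standby mode should be entered."""
    return idle_seconds >= inactivity_timeout_s(gaze_detected)

if __name__ == "__main__":
    print(should_enter_standby(45.0, gaze_detected=True))   # False
    print(should_enter_standby(45.0, gaze_detected=False))  # True
```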
  • In standby mode 104, device 10 may operate with a display portion of display 16 turned off. The display portion of display 16 and a touch screen portion of display 16 may be powered and configured independently. In mode 104, device 10 may reduce the touch screen capabilities of the touch screen portion of display 16 (e.g., by reducing the frequency at which touch screen display 16 scans for user input, by configuring display 16 such that user inputs can only be sensed generally, by disabling the touch screen capabilities of display 16, etc.).
  • If desired, device 10 may continue to perform gaze detection operations when operating in standby mode 104. As illustrated by dashed line 105, device 10 may switch from standby mode 104 to active mode 94 when device 10 detects that a user's gaze is directed towards display 16.
  • As illustrated by line 106, device 10 may also switch from mode 104 to active mode 94 when activity is detected (e.g., device 10 may turn on display 16 to the active brightness level and restore the touch screen capabilities of display 16 to the active capability level).
  • If desired, power can be further conserved by reducing the power consumption of components such as a processor, wireless communications circuitry, etc. while in full standby mode 104 and/or partial standby mode 98. For example, when device 10 is placed in full standby mode 104 or partial standby mode 98, the clock frequency for the clock that is used to operate processing circuitry 36 (e.g., a microprocessor) may be reduced. The number of processor cores that are active in processing circuitry 36 may also be reduced. Some or all of wireless communications circuitry 44 may be placed in a low-power state or turned off. The amount of additional circuitry that is powered down when device 10 enters modes 98 and 104 may be the same or, if desired, relatively more circuitry may be powered down in full standby mode 104 than in partial standby mode 98.
  • In configurations in which device 10 has additional components, some or all of these components can be selectively powered down. Device 10 may have additional power down modes in which different numbers of these components have been placed in low-power states. Any suitable criteria may be used to determine when to switch device 10 between these modes. For example, gaze detection data, user input data, and/or sensor data may be used to determine an appropriate mode in which to operate device 10. Components that may be powered down in this way include proximity sensors, light sensors such as an ambient light sensor, cameras, motion sensors such as accelerometers, audio circuits, radio-frequency transceiver circuitry, radio-frequency amplifiers, audio amplifiers, serial and parallel port communications circuits, thermal sensors, touch-screen input devices, etc.
  • As illustrated by FIG. 8, device 10 can implement a power management scheme in which gaze detection circuitry is turned on or off or is otherwise adjusted in real time. In the scheme illustrated by FIG. 8, device 10 can switch between an active mode, a partial standby mode, and a standby mode.
  • The gaze detection capabilities of device 10 can be adjusted to conserve power depending on the mode in which device 10 is operating. For example, device 10 may perform gaze detection operations by taking images using camera 30 or other imaging circuitry at a first rate while in an active mode and at a second rate that is less than the first rate while in a standby mode. If desired, device 10 may suspend gaze detection operations while in standby. When the gaze detection operations of device 10 are slowed down (e.g., performed at the second rate) or suspended, device 10 may consume a reduced amount of power.
  • In mode 110, device 10 is in an active mode. While device 10 is in the active mode, device 10 may perform gaze detection operations. For example, device 10 may perform gaze detection operations by taking images at a given rate to search for a user's gaze (e.g., once every 100 milliseconds, 200 milliseconds, 250 milliseconds, 500 milliseconds, 1 second, 2 seconds, etc.). These images may then be analyzed to determine whether the user of device 10 is looking at device 10. Display 16 may simultaneously display an application display screen (e.g., a home page, a telephone application information page, a media player screen, etc.) at an active brightness level.
  • When device 10 detects that the user has looked away from display 16 (e.g., using a gaze detection sensor such as camera 30), device 10 may dim display 16 and enter partial standby mode 114 as illustrated by line 112.
  • In mode 114, device 10 is in a partial standby mode. In partial standby mode 114, device 10 may dim display 16 to a partial standby brightness level to conserve power. If desired, device 10 may also reduce the speed at which images are captured for gaze detection operations in device 10 (e.g., to a fraction of the rate at which gaze detection images were captured in mode 110, such as one-half, one-quarter, etc. of that rate).
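  • One way to read this rate reduction is as a simple scaling of the active-mode capture interval, with gaze detection suspended entirely in full standby, as sketched below. The baseline interval, the slowdown factors, and the function name are assumptions for illustration.

```python
from typing import Optional

# Assumed active-mode capture interval (the description lists example values
# ranging from 100 milliseconds up to a few seconds).
ACTIVE_CAPTURE_INTERVAL_S = 0.25

def gaze_capture_interval_s(mode: str, slowdown: int = 2) -> Optional[float]:
    """Interval between gaze-detection image captures for a given mode.

    Returns None when gaze detection is turned off entirely (full standby).
    """
    if mode == "active":
        return ACTIVE_CAPTURE_INTERVAL_S
    if mode == "partial_standby":
        # Capture at one-half, one-quarter, etc. of the active rate.
        return ACTIVE_CAPTURE_INTERVAL_S * slowdown
    if mode == "standby":
        return None  # camera-based gaze detection powered down
    raise ValueError(f"unknown mode: {mode}")

if __name__ == "__main__":
    for m in ("active", "partial_standby", "standby"):
        print(m, gaze_capture_interval_s(m, slowdown=4))
```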
  • Device 10 may switch from partial standby mode 114 to active mode 110 whenever appropriate. For example, when device 10 detects that a user's gaze is directed towards display 16, device 10 may enter an active mode such as mode 110 (e.g., as illustrated by line 116) and may brighten display 16 to the active brightness level. Device 10 may also enter active mode 110 when device 10 detects user activity (e.g., as illustrated by line 118).
  • If desired, device 10 may operate in a full standby mode such as standby mode 122 in which display 16 is turned off and the gaze detection capabilities of device 10 are also turned off (e.g., camera 30 is turned off). As an example, when device 10 is operating in partial standby mode 114 and no user activity is detected for a given period of time, device 10 may enter standby mode 122. Device 10 may enter standby mode 122 as illustrated by line 120 after device 10 detects that the user has looked away (e.g., as illustrated by line 112) and after a given period of user inactivity has elapsed following the device's detection that the user has looked away.
  • In standby mode 122, device 10 may operate with display 16 turned off and with gaze detection disabled (e.g., turned off). Other circuitry may also be placed in a low-power standby mode (e.g., processing circuitry).
  • As illustrated by dashed line 124, when device 10 detects activity, device 10 may switch from mode 122 to active mode 110 (e.g., device 10 may turn on display 16 to the active brightness level and turn on gaze detection capabilities to determine if a user's gaze is directed towards display 16).
  • With one suitable arrangement, when device 10 detects activity such as user activity, the period of user inactivity detected by device 10 and associated with the mode transition of line 120 may be reset. For example, when device 10 switches from mode 122 to active mode 110 and determines that the user's gaze is not directed towards display 16, device 10 may switch to mode 114 and the given period of user inactivity associated with the mode transition of line 120 may begin anew.
  • The motion of device 10 can be indicative of whether device 10 is being used by a user. If desired, device 10 may use data from an accelerometer or other motion sensor in selecting its mode of operation. For example, when device 10 detects motion above a threshold level with an accelerometer, device 10 may activate gaze detection operations to determine if a user is looking at the device. Device 10 may turn on gaze detection circuitry or may temporarily activate gaze detection operations for a given period of time (e.g., one second, five seconds, etc.) whenever a motion sensor such as an accelerometer detects that a user is shaking device 10 or device 10 is otherwise in motion. With this type of arrangement, device 10 may be in standby mode. When device 10 is picked up by a user, device 10 may detect that the device is in motion using the accelerometer. Device 10 may then activate gaze detection operations and, if the user's gaze is properly detected, may switch to an active mode such as mode 68 in which display 16 is turned on.
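  • The accelerometer-triggered wake-up described here might be sketched as follows: motion above a threshold temporarily enables gaze detection, and a detected gaze then brings the device out of standby. The threshold value, the window length, and the helper names are assumptions rather than measured or recited values.

```python
import time

MOTION_THRESHOLD_G = 0.5   # assumed acceleration threshold
GAZE_WINDOW_S = 5.0        # assumed period to look for the user's gaze

def read_acceleration_g() -> float:
    """Stand-in for reading the accelerometer (magnitude in g)."""
    return 0.8

def gaze_detected() -> bool:
    """Stand-in for one round of gaze detection with the camera."""
    return True

def handle_motion_wake(current_mode: str) -> str:
    """If the device is moved while in standby, briefly enable gaze detection."""
    if current_mode != "standby":
        return current_mode
    if read_acceleration_g() < MOTION_THRESHOLD_G:
        return current_mode
    deadline = time.monotonic() + GAZE_WINDOW_S
    while time.monotonic() < deadline:
        if gaze_detected():
            return "active"        # gaze found: turn the display on
        time.sleep(0.25)
    return "standby"               # no gaze found within the window; stay asleep

if __name__ == "__main__":
    print(handle_motion_wake("standby"))
```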
  • Device 10 may also suspend gaze detection operations when appropriate. For example, when device 10 is receiving user input through an input-output device 38 (e.g., when a user is providing user input through one or more user input devices) or when device 10 has recently received user input, gaze detection operations may be suspended (e.g., camera 30 may be turned off and the execution of gaze detection software may be stopped). In this situation, the presence of user interface activity makes it unnecessary to expend extra power operating the gaze detection circuitry.
  • As illustrated by FIG. 9, device 10 may also use information from environmental sensors such as proximity sensors and ambient light sensors to determine whether or not to perform gaze detection operations. Environmental sensors such as these may, if desired, be used in conjunction with an environmental sensor such as an accelerometer that detects device motion.
  • When device 10 is performing gaze detection operations (e.g., when device 10 is operating in a mode such as mode 126), device 10 may suspend gaze detection operations whenever a sensor in device 10 indicates that gaze detection operations are inappropriate or not needed (e.g., as illustrated by line 128). With one suitable arrangement, device 10 may be able to detect when gaze detection sensors such as camera 30 would be incapable of detecting a user's gaze due to excessive vibration detected by an accelerometer. For example, device 10 may suspend gaze detection operations (e.g., device 10 may switch to operating in mode 130) in response to signals from the accelerometer in device 10 that indicate the device is shaking or otherwise moving rapidly. In this example, device 10 may switch to mode 130 when the accelerometer detects that the acceleration of device 10 exceeds a given threshold level. In another example, device 10 may be able to detect, using a proximity sensor, that gaze detection operations are inappropriate because an object is in close proximity to device 10 and is blocking the device's gaze detection sensors (e.g., such as when a user places device 10 against their ear and thereby blocks camera 30). If desired, device 10 may suspend gaze detection operations when an ambient light sensor detects that there is insufficient light in the environment around device 10 for a camera such as camera 30 to capture images in which a user's gaze could be detected. Device 10 may also deactivate a camera associated with gaze detection operations and suspend a gaze detection application running on circuitry 36 when data from one or more sensors in device 10 indicate that gaze detection operations are inappropriate or wasteful of power.
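  • This sensor-based gating can be expressed as a single predicate: gaze detection is suspended when vibration is too high, when an object is covering the sensor, or when the scene is too dark. The threshold values below are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    acceleration_g: float    # from the accelerometer
    proximity_cm: float      # from the proximity sensor
    ambient_lux: float       # from the ambient light sensor

# Hypothetical thresholds beyond which gaze detection is unreliable or wasteful.
MAX_ACCELERATION_G = 1.0     # excessive shaking or vibration
MIN_PROXIMITY_CM = 3.0       # an object (e.g., the user's ear) blocks the camera
MIN_AMBIENT_LUX = 5.0        # too dark for the camera to capture a usable image

def gaze_detection_appropriate(s: SensorReadings) -> bool:
    """Return False when gaze detection operations should be suspended."""
    if s.acceleration_g > MAX_ACCELERATION_G:
        return False
    if s.proximity_cm < MIN_PROXIMITY_CM:
        return False
    if s.ambient_lux < MIN_AMBIENT_LUX:
        return False
    return True

if __name__ == "__main__":
    print(gaze_detection_appropriate(SensorReadings(0.2, 50.0, 300.0)))  # True
    print(gaze_detection_appropriate(SensorReadings(0.2, 1.0, 300.0)))   # False
```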
  • When device 10 detects that gaze detection operations may be appropriate (e.g., after the sensors no longer indicate that gaze detection operations are inappropriate), device 10 may resume gaze detection operations in mode 126, as illustrated by line 132. This type of arrangement may help device 10 to avoid performing gaze detection operations at inappropriate times, while ensuring that the power conserving functionality of the gaze detection circuitry is retained during normal device operation.
  • The gaze detection capabilities of device 10 may, if desired, include visual user identification capabilities (e.g., face recognition). In this type of arrangement, device 10 may distinguish between authorized users and unauthorized users based on image sensor data. For example, device 10 may recognize an authorized user and may unlock itself whenever the authorized user is detected by the device's gaze detection circuitry (e.g., camera 30). If desired, when device 10 detects that the authorized user's gaze has been diverted from device 10, device 10 may lock itself to prevent unauthorized users from using device 10. This type of user-specific gaze detection functionality may be used for all gaze detection operations if desired. By making gaze detection specific to particular users, device 10 will not inadvertently transition from standby mode to active mode if a person in the user's vicinity happens to glance at the user's device.
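  • A minimal sketch of this user-specific behavior, assuming a placeholder recognition step rather than a real face-recognition API: the device unlocks only when the detected gaze belongs to an authorized user and stays locked otherwise.

```python
from typing import Optional

AUTHORIZED_USERS = {"owner"}  # hypothetical set of enrolled user identifiers

def identify_gazing_user(frame) -> Optional[str]:
    """Placeholder for face recognition: return who is looking at the display, if anyone."""
    return "owner" if frame == "owner_frame" else None

def lock_state_for_frame(frame) -> bool:
    """Return True (locked) unless an authorized user's gaze is detected in the frame."""
    return identify_gazing_user(frame) not in AUTHORIZED_USERS

if __name__ == "__main__":
    print(lock_state_for_frame("owner_frame"))     # False: unlock for the authorized user
    print(lock_state_for_frame("stranger_frame"))  # True: stay locked / re-lock
```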
  • FIG. 10 shows steps involved in processing a command to reduce the power consumption of display 16. Power reduction commands may be processed by device 10 based on gaze detection data or any other suitable data.
  • As shown by step 134, processing may begin with reception of a power reduction command by the processing circuitry of device 10.
  • Display 16 may be an OLED display or other display that has pixels that may be controlled individually. As shown by box 136, in this type of situation, device 10 may make partial or full power reductions to some or all of the pixels of display 16 in response to the received power reduction command.
  • Display 16 may also be formed from a panel subsystem and a backlight subsystem. For example, display 16 may have a liquid crystal display (LCD) panel subsystem and a light emitting diode or fluorescent tube backlight subsystem. In backlight subsystems that contain individually controllable elements such as light emitting diodes, the brightness of the backlight elements may be selectively controlled. For example, as shown in step 138, the brightness of some of the backlight elements may be reduced while the other backlight elements remain fully powered.
  • In backlight subsystems that contain a single backlight element, the power of the single element may be partially or fully reduced to reduce power consumption (step 140).
  • During the operations of steps 138 and 140, further power reductions may be made by adjusting circuitry that controls the LCD panel subsystem.
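  • The branch between per-pixel power reduction (box 136) and backlight dimming (steps 138 and 140) might be sketched as below on a simplified display model; the data layout and the 50% scaling factor are assumptions for illustration.

```python
def reduce_display_power(display: dict, factor: float = 0.5) -> dict:
    """Apply a power reduction command to a simplified display model.

    `display` describes either an emissive panel with per-pixel control
    (e.g., OLED) or an LCD panel with one or more backlight elements.
    """
    if display["type"] == "per_pixel":
        # Box 136: reduce drive to some or all individually controlled pixels.
        display["pixel_power"] = [p * factor for p in display["pixel_power"]]
    elif len(display["backlight_elements"]) > 1:
        # Step 138: dim a subset of backlight elements, leave the rest fully powered.
        half = len(display["backlight_elements"]) // 2
        display["backlight_elements"][:half] = [
            b * factor for b in display["backlight_elements"][:half]]
    else:
        # Step 140: a single backlight element is partially or fully reduced.
        display["backlight_elements"][0] *= factor
    return display

if __name__ == "__main__":
    print(reduce_display_power({"type": "per_pixel", "pixel_power": [1.0, 1.0, 1.0]}))
    print(reduce_display_power({"type": "lcd", "backlight_elements": [1.0, 1.0, 1.0, 1.0]}))
    print(reduce_display_power({"type": "lcd", "backlight_elements": [1.0]}))
```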
  • The foregoing is merely illustrative of the principles of this invention and various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention.

Claims (20)

What is claimed is:
1. A method for using a portable electronic device having an accelerometer and gaze detection circuitry, the method comprising:
with the accelerometer, determining whether a measured acceleration level for the portable electronic device has exceeded a given threshold value; and
when it has been determined that the acceleration level of the portable electronic device has exceeded the given threshold value, disabling the gaze detection circuitry.
2. The method defined in claim 1 further comprising:
after it has been determined that the acceleration level of the portable electronic device has exceeded the given threshold value, determining whether the acceleration level of the portable electronic device has dropped below a second threshold value; and
when it has been determined that the acceleration level of the portable electronic device has dropped below the second threshold value, enabling the gaze detection circuitry.
3. The method defined in claim 1 wherein the gaze detection circuitry comprises a camera that is used to determine whether the user's gaze is directed towards the portable electronic device and wherein disabling the gaze detection circuitry comprises turning off the camera.
4. The method defined in claim 1 wherein determining whether the measured acceleration level for the portable electronic device has exceeded the given threshold value comprises determining whether the gaze detection circuitry would be incapable of detecting a user's gaze due to excessive vibration detected by the accelerometer.
5. The method defined in claim 1 wherein determining whether the measured acceleration level for the portable electronic device has exceeded the given threshold value comprises determining whether the gaze detection circuitry would be incapable of detecting a user's gaze due to excessive motion of the portable electronic device detected by the accelerometer.
6. A method for using an electronic device having a sensor and gaze detection circuitry, the method comprising:
with the sensor, determining whether a given condition exists; and
in response to determining that the given condition exists, disabling the gaze detection circuitry.
7. The method defined in claim 6 wherein determining whether the given condition exists comprises determining whether data from the sensor indicates that readings from the gaze detection circuitry will not be reliable due to the presence of the given condition.
8. The method defined in claim 6 wherein determining whether the given condition exists comprises determining whether data from the sensor indicates that readings from the gaze detection circuitry will not be needed.
9. The method defined in claim 6 wherein the sensor comprises an accelerometer and wherein determining whether the given condition exists comprises determining whether a measured acceleration level for the electronic device has exceeded a given threshold value.
10. The method defined in claim 6 wherein the sensor comprises a motion sensor and wherein determining whether the given condition exists comprises determining whether a measured level of motion of the electronic device has exceeded a given threshold value.
11. The method defined in claim 6 wherein the sensor comprises an accelerometer and wherein determining whether the given condition exists comprises determining whether a measured vibration level for the electronic device has exceeded a given threshold value.
12. The method defined in claim 6 wherein the sensor comprises a proximity sensor and wherein determining whether the given condition exists comprises determining whether any objects are within a given distance of the electronic device.
13. The method defined in claim 6 wherein the sensor comprises an environmental sensor.
14. The method defined in claim 6 wherein the sensor comprises a light sensor separate from the gaze detection circuitry.
15. The method defined in claim 6 wherein the sensor comprises an ambient light sensor and wherein determining whether the given condition exists comprises determining whether a measured brightness level of ambient light is less than a given brightness level.
16. An electronic device comprising:
a sensor;
an image sensor that captures images of a user; and
circuitry that processes the captured images to determine whether the user is looking at the electronic device, wherein the circuitry is configured to suspend captured image processing operations when a given condition is detected by the sensor.
17. The electronic device defined in claim 16 further comprising a button, wherein the circuitry is configured to suspend captured image processing operations when a user presses the button.
18. The electronic device defined in claim 16 wherein the sensor comprises an accelerometer that measures at least one of motion and vibration of the electronic device and wherein the given condition comprises a measurement from the accelerometer that exceeds at least one of a given motion level and a given vibration level.
19. The electronic device defined in claim 16 wherein the sensor comprises a light sensor that measures a brightness level of ambient light and wherein the given condition comprises a measurement from the light sensor that is less than a given brightness level.
20. The electronic device defined in claim 16 wherein the sensor comprises a proximity sensor that determines whether any objects are within a given distance of the electronic device and wherein the given condition comprises a measurement from the proximity sensor that at least one object is within the given distance of the electronic device.
US13/750,877 2008-09-30 2013-01-25 Electronic Devices With Gaze Detection Capabilities Abandoned US20130135198A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/750,877 US20130135198A1 (en) 2008-09-30 2013-01-25 Electronic Devices With Gaze Detection Capabilities
US14/157,909 US10025380B2 (en) 2008-09-30 2014-01-17 Electronic devices with gaze detection capabilities

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/242,251 US20100079508A1 (en) 2008-09-30 2008-09-30 Electronic devices with gaze detection capabilities
US13/750,877 US20130135198A1 (en) 2008-09-30 2013-01-25 Electronic Devices With Gaze Detection Capabilities

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/242,251 Division US20100079508A1 (en) 2008-09-30 2008-09-30 Electronic devices with gaze detection capabilities

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/157,909 Division US10025380B2 (en) 2008-09-30 2014-01-17 Electronic devices with gaze detection capabilities

Publications (1)

Publication Number Publication Date
US20130135198A1 true US20130135198A1 (en) 2013-05-30

Family

ID=42056955

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/242,251 Abandoned US20100079508A1 (en) 2008-09-30 2008-09-30 Electronic devices with gaze detection capabilities
US13/750,877 Abandoned US20130135198A1 (en) 2008-09-30 2013-01-25 Electronic Devices With Gaze Detection Capabilities
US14/157,909 Active 2028-10-06 US10025380B2 (en) 2008-09-30 2014-01-17 Electronic devices with gaze detection capabilities

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/242,251 Abandoned US20100079508A1 (en) 2008-09-30 2008-09-30 Electronic devices with gaze detection capabilities

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/157,909 Active 2028-10-06 US10025380B2 (en) 2008-09-30 2014-01-17 Electronic devices with gaze detection capabilities

Country Status (1)

Country Link
US (3) US20100079508A1 (en)

Cited By (115)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120200490A1 (en) * 2011-02-03 2012-08-09 Denso Corporation Gaze detection apparatus and method
US20120327447A1 (en) * 2011-06-24 2012-12-27 Konica Minolta Business Technologies, Inc. Image forming apparatus
US20130083344A1 (en) * 2011-10-04 2013-04-04 Konica Minolta Business Technologies, Inc. , Image forming apparatus
US20140108842A1 (en) * 2012-10-14 2014-04-17 Ari M. Frank Utilizing eye tracking to reduce power consumption involved in measuring affective response
US20140191948A1 (en) * 2013-01-04 2014-07-10 Samsung Electronics Co., Ltd. Apparatus and method for providing control service using head tracking technology in electronic device
US20140204014A1 (en) * 2012-03-30 2014-07-24 Sony Mobile Communications Ab Optimizing selection of a media object type in which to present content to a user of a device
WO2014200508A1 (en) 2013-06-14 2014-12-18 Intel Corporation Methods and apparatus to provide power to devices
US20150001302A1 (en) * 2013-06-28 2015-01-01 Hand Held Products, Inc. Mobile device having an improved user interface for reading code symbols
US8937591B2 (en) 2012-04-06 2015-01-20 Apple Inc. Systems and methods for counteracting a perceptual fading of a movable indicator
CN104427087A (en) * 2013-08-21 2015-03-18 腾讯科技(深圳)有限公司 Method for realizing dynamic wallpaper of mobile terminal, and mobile terminal
US20150121228A1 (en) * 2013-10-31 2015-04-30 Samsung Electronics Co., Ltd. Photographing image changes
US9094539B1 (en) * 2011-09-22 2015-07-28 Amazon Technologies, Inc. Dynamic device adjustments based on determined user sleep state
WO2015167471A1 (en) * 2014-04-29 2015-11-05 Hewlett-Packard Development Company, L.P. Gaze detector using reference frames in media
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
US9286728B2 (en) 2014-02-11 2016-03-15 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9298001B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US9316833B2 (en) 2014-01-21 2016-04-19 Osterhout Group, Inc. Optical configurations for head worn computing
US9329387B2 (en) 2014-01-21 2016-05-03 Osterhout Group, Inc. See-through computer display systems
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9423612B2 (en) 2014-03-28 2016-08-23 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US9436306B2 (en) 2011-09-21 2016-09-06 Nec Corporation Portable terminal device and program
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9454699B2 (en) 2014-04-29 2016-09-27 Microsoft Technology Licensing, Llc Handling glare in eye tracking
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US20160364030A1 (en) * 2015-06-11 2016-12-15 Dell Products L.P. Touch user interface at a display edge
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9532715B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US20170097677A1 (en) * 2015-10-05 2017-04-06 International Business Machines Corporation Gaze-aware control of multi-screen experience
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US20170187982A1 (en) * 2015-12-29 2017-06-29 Le Holdings (Beijing) Co., Ltd. Method and terminal for playing control based on face recognition
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9720234B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9811152B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9836639B2 (en) 2014-01-10 2017-12-05 Facebook, Inc. Systems and methods of light modulation in eye tracking devices
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9852545B2 (en) 2014-02-11 2017-12-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US20180033362A1 (en) * 2016-07-29 2018-02-01 Semiconductor Energy Laboratory Co., Ltd. Display method, display device, electronic device, non-temporary memory medium, and program
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
CN108270920A (en) * 2018-01-16 2018-07-10 深圳市金立通信设备有限公司 A kind of wallpaper setting method, terminal and computer readable storage medium
CN108370409A (en) * 2015-12-22 2018-08-03 索尼公司 Information processing unit, imaging device, information processing system, information processing method and program
US10045050B2 (en) 2014-04-25 2018-08-07 Vid Scale, Inc. Perceptual preprocessing filter for viewing-conditions-aware video coding
US10062182B2 (en) 2015-02-17 2018-08-28 Osterhout Group, Inc. See-through computer display systems
US10136214B2 (en) 2015-08-11 2018-11-20 Google Llc Pairing of media streaming devices
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
USD840395S1 (en) 2016-10-17 2019-02-12 Osterhout Group, Inc. Head-worn computer
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US10284902B2 (en) 2015-09-18 2019-05-07 Samsung Electronics Co., Ltd. Apparatus and method for playing back multimedia content
US10310599B1 (en) 2013-03-21 2019-06-04 Chian Chiu Li System and method for providing information
US10422995B2 (en) 2017-07-24 2019-09-24 Mentor Acquisition One, Llc See-through computer display systems with stray light management
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
US10466491B2 (en) 2016-06-01 2019-11-05 Mentor Acquisition One, Llc Modular systems for head-worn computers
US10534180B2 (en) 2016-09-08 2020-01-14 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10558050B2 (en) 2014-01-24 2020-02-11 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US10578869B2 (en) 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10606374B2 (en) 2015-08-19 2020-03-31 Lg Electronics Inc. Watch-type mobile terminal
US10613638B2 (en) * 2016-07-27 2020-04-07 Kyocera Corporation Electronic device
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US10684478B2 (en) 2016-05-09 2020-06-16 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10690936B2 (en) 2016-08-29 2020-06-23 Mentor Acquisition One, Llc Adjustable nose bridge assembly for headworn computer
US10824253B2 (en) 2016-05-09 2020-11-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US20200399861A1 (en) * 2018-03-23 2020-12-24 Kobelco Construction Machinery Co., Ltd. Construction machine
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
US20200409720A1 (en) * 2019-06-28 2020-12-31 AO Kaspersky Lab Systems and methods for automatic service activation on a computing device
US10969584B2 (en) 2017-08-04 2021-04-06 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
US10979769B2 (en) * 2019-02-25 2021-04-13 PreTechnology, Inc. Method and apparatus for monitoring and tracking consumption of digital content
US10990187B2 (en) * 2014-09-23 2021-04-27 Fitbit, Inc. Methods, systems, and apparatuses to update screen content responsive to user gestures
US11016564B2 (en) 2018-03-10 2021-05-25 Chian Chiu Li System and method for providing information
US11104272B2 (en) 2014-03-28 2021-08-31 Mentor Acquisition One, Llc System for assisted operator safety using an HMD
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US11269182B2 (en) 2014-07-15 2022-03-08 Mentor Acquisition One, Llc Content presentation in head worn computing
US11372320B2 (en) 2020-02-27 2022-06-28 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array
US11372985B2 (en) * 2018-11-21 2022-06-28 Rovi Guides, Inc. Intelligent display of content
US11409105B2 (en) 2017-07-24 2022-08-09 Mentor Acquisition One, Llc See-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11573620B2 (en) 2021-04-20 2023-02-07 Chian Chiu Li Systems and methods for providing information and performing task
US11630907B2 (en) * 2020-03-30 2023-04-18 Salesforce, Inc. Live data viewing security
US11632587B2 (en) 2020-06-24 2023-04-18 The Nielsen Company (Us), Llc Mobile device attention detection
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11709236B2 (en) 2016-12-27 2023-07-25 Samsung Semiconductor, Inc. Systems and methods for machine perception
US11714170B2 (en) 2015-12-18 2023-08-01 Samsung Semiconuctor, Inc. Real time position sensing of objects
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11849153B2 (en) 2012-01-19 2023-12-19 Vid Scale, Inc. Methods and systems for video delivery supporting adaptation to viewing conditions
US11872018B2 (en) * 2015-03-10 2024-01-16 Brain Tunnelgenix Technologies Corp. Devices, apparatuses, systems, and methods for measuring temperature of an ABTT terminus
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems

Families Citing this family (252)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US7633076B2 (en) 2005-09-30 2009-12-15 Apple Inc. Automated response to and sensing of user activity in portable devices
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
EP2235713A4 (en) * 2007-11-29 2012-04-25 Oculis Labs Inc Method and apparatus for display of secure visual content
US9158116B1 (en) 2014-04-25 2015-10-13 Osterhout Group, Inc. Temple and ear horn assembly for headworn computer
US8344998B2 (en) * 2008-02-01 2013-01-01 Wimm Labs, Inc. Gesture-based power management of a wearable portable electronic device with display
US20090218957A1 (en) * 2008-02-29 2009-09-03 Nokia Corporation Methods, apparatuses, and computer program products for conserving power in mobile devices
WO2010024415A1 (en) * 2008-08-28 2010-03-04 京セラ株式会社 Communication device
US8676904B2 (en) 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
TW201021525A (en) * 2008-11-28 2010-06-01 Inventec Corp Communication device and electricity saving method thereof
CN201562447U (en) * 2009-04-28 2010-08-25 鸿富锦精密工业(深圳)有限公司 Electronic photo frame with intelligent control of display brightness
JP5299866B2 (en) * 2009-05-19 2013-09-25 日立コンシューマエレクトロニクス株式会社 Video display device
US20120311585A1 (en) 2011-06-03 2012-12-06 Apple Inc. Organizing task items that represent tasks to perform
US8508520B2 (en) * 2009-07-09 2013-08-13 Nvidia Corporation Luminous power control of a light source of a multimedia processing system
US20120019447A1 (en) * 2009-10-02 2012-01-26 Hanes David H Digital display device
KR101642933B1 (en) 2009-12-04 2016-07-27 삼성전자주식회사 Method and apparatus for reducing power consumption in digital living network alliance network
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
WO2011104949A1 (en) * 2010-02-24 2011-09-01 シャープ株式会社 Illuminating device, display device, data generating method, data generating program, and recording medium
US8922480B1 (en) 2010-03-05 2014-12-30 Amazon Technologies, Inc. Viewer-based device control
US8913004B1 (en) * 2010-03-05 2014-12-16 Amazon Technologies, Inc. Action based device control
JP5355466B2 (en) * 2010-03-25 2013-11-27 京セラ株式会社 Mobile terminal device
US9218119B2 (en) * 2010-03-25 2015-12-22 Blackberry Limited System and method for gesture detection and feedback
US20110304541A1 (en) * 2010-06-11 2011-12-15 Navneet Dalal Method and system for detecting gestures
NL2004878C2 (en) 2010-06-11 2011-12-13 Univ Amsterdam System and method for detecting a person's direction of interest, such as a person's gaze direction.
US8824747B2 (en) * 2010-06-29 2014-09-02 Apple Inc. Skin-tone filtering
US8326001B2 (en) 2010-06-29 2012-12-04 Apple Inc. Low threshold face recognition
US20120032894A1 (en) 2010-08-06 2012-02-09 Nima Parivar Intelligent management for an electronic device
WO2012020864A1 (en) * 2010-08-13 2012-02-16 엘지전자 주식회사 Mobile terminal, display device, and method for controlling same
CN103081449B (en) 2010-09-13 2015-09-09 Lg电子株式会社 Mobile terminal and method of controlling operation thereof thereof
US8957847B1 (en) * 2010-12-28 2015-02-17 Amazon Technologies, Inc. Low distraction interfaces
US8531536B2 (en) * 2011-02-17 2013-09-10 Blackberry Limited Apparatus, and associated method, for selecting information delivery manner using facial recognition
US8836777B2 (en) 2011-02-25 2014-09-16 DigitalOptics Corporation Europe Limited Automatic detection of vertical gaze using an embedded imaging device
US20130057573A1 (en) * 2011-09-02 2013-03-07 DigitalOptics Corporation Europe Limited Smart Display with Dynamic Face-Based User Preference Settings
CN102761645B (en) * 2011-04-26 2016-05-18 富泰华工业(深圳)有限公司 Electronic equipment and control method thereof
US8843346B2 (en) 2011-05-13 2014-09-23 Amazon Technologies, Inc. Using spatial information with device interaction
US8806235B2 (en) * 2011-06-14 2014-08-12 International Business Machines Corporation Display management for multi-screen computing environments
TW201305998A (en) * 2011-07-20 2013-02-01 Hon Hai Prec Ind Co Ltd Electronic device with function of adjusting backlight and method thereof
DE102011079703A1 (en) 2011-07-25 2013-01-31 Robert Bosch Gmbh Method for assisting a driver of a motor vehicle
KR101824413B1 (en) 2011-08-30 2018-02-02 삼성전자주식회사 Method and apparatus for controlling operating mode of portable terminal
US10489570B2 (en) 2011-09-09 2019-11-26 Google Llc Preventing computing device from timing out
US8223024B1 (en) * 2011-09-21 2012-07-17 Google Inc. Locking mechanism based on unnatural movement of head-mounted display
DE102011084186A1 (en) * 2011-10-09 2013-04-11 XS Embedded GmbH Display device for visual representation of information using power-saving mode, has display and intensity control circuit for controlling brightness of display
KR101160681B1 (en) 2011-10-19 2012-06-28 배경덕 Method, mobile communication terminal and computer-readable recording medium for operating specific function when activaing of mobile communication terminal
ES2620762T3 (en) 2011-10-27 2017-06-29 Tobii Ab Power management in an eye tracking system
US8976110B2 (en) 2011-10-27 2015-03-10 Tobii Technology Ab Power management in an eye-tracking system
US10037086B2 (en) * 2011-11-04 2018-07-31 Tobii Ab Portable device
EP2590047A1 (en) * 2011-11-04 2013-05-08 Tobii Technology AB Portable device
KR101887058B1 (en) * 2011-11-11 2018-08-09 LG Electronics Inc. A process for processing a three-dimensional image and a method for controlling electric power of the same
US9098069B2 (en) 2011-11-16 2015-08-04 Google Technology Holdings LLC Display device, corresponding systems, and methods for orienting output on a display
CN102520852A (en) * 2011-11-29 2012-06-27 Huawei Device Co., Ltd. Control method of mobile equipment screen status and associated mobile equipment
KR101891786B1 (en) * 2011-11-29 2018-08-27 Samsung Electronics Co., Ltd. Operation Method For User Function based on an Eye-Tracking and Portable Device supporting the same
US9654768B2 (en) * 2011-12-23 2017-05-16 Thomson Licensing Computer device with power-consumption management and method for managing power consumption of computer device
JP5945417B2 (en) * 2012-01-06 2016-07-05 Kyocera Corporation Electronics
KR101850034B1 (en) * 2012-01-06 2018-04-20 LG Electronics Inc. Mobile terminal and control method thereof
US9256071B1 (en) 2012-01-09 2016-02-09 Google Inc. User interface
US9058168B2 (en) * 2012-01-23 2015-06-16 Blackberry Limited Electronic device and method of controlling a display
EP2629177A1 (en) * 2012-02-16 2013-08-21 Research In Motion Limited Portable electronic device and method
US20130215250A1 (en) * 2012-02-16 2013-08-22 Research In Motion Limited Portable electronic device and method
US9778829B2 (en) 2012-02-17 2017-10-03 Lenovo (Singapore) Pte. Ltd. Magnification based on eye input
US8988349B2 (en) 2012-02-28 2015-03-24 Google Technology Holdings LLC Methods and apparatuses for operating a display in an electronic device
US8947382B2 (en) * 2012-02-28 2015-02-03 Motorola Mobility Llc Wearable display device, corresponding systems, and method for presenting output on the same
JP5696071B2 (en) * 2012-03-02 2015-04-08 株式会社東芝 Electronic device, control method of electronic device, control program, and recording medium
US8947323B1 (en) 2012-03-20 2015-02-03 Hayes Solos Raffle Content display methods
CN103324276A (en) * 2012-03-22 2013-09-25 Huawei Device Co., Ltd. Method and device for controlling standby operation
US20150035776A1 (en) * 2012-03-23 2015-02-05 Ntt Docomo, Inc. Information terminal, method for controlling input acceptance, and program for controlling input acceptance
EP2834779A4 (en) * 2012-04-05 2015-10-21 Invue Security Products Inc Merchandise user tracking system and method
US20130271355A1 (en) 2012-04-13 2013-10-17 Nokia Corporation Multi-segment wearable accessory
JP2013232804A (en) * 2012-04-27 2013-11-14 Fujitsu Ltd Terminal device, backlight control method, and backlight control program
WO2013169237A1 (en) * 2012-05-09 2013-11-14 Intel Corporation Eye tracking based selective accentuation of portions of a display
US10417037B2 (en) 2012-05-15 2019-09-17 Apple Inc. Systems and methods for integrating third party services with a digital assistant
BR112014028657A2 (en) 2012-05-18 2017-06-27 Unify GmbH & Co. KG Method, device, and system to reduce bandwidth usage during a communication session
US9823742B2 (en) * 2012-05-18 2017-11-21 Microsoft Technology Licensing, Llc Interaction and management of devices using gaze detection
US9304621B1 (en) 2012-05-25 2016-04-05 Amazon Technologies, Inc. Communication via pressure input
JP5966624B2 (en) * 2012-05-29 2016-08-10 Ricoh Company, Ltd. Information processing apparatus and information display system
US9152203B2 (en) * 2012-05-31 2015-10-06 At&T Intellectual Property I, Lp Managing power consumption state of electronic devices responsive to predicting future demand
CN102799361A (en) * 2012-06-21 2012-11-28 Huawei Device Co., Ltd. Method for calling application object out and mobile terminal
US20130342672A1 (en) * 2012-06-25 2013-12-26 Amazon Technologies, Inc. Using gaze determination with device input
US9854159B2 (en) * 2012-07-20 2017-12-26 Pixart Imaging Inc. Image system with eye protection
SE537580C2 (en) * 2012-08-03 2015-06-30 Crunchfish Ab Improved input
KR101920020B1 (en) * 2012-08-07 2019-02-11 Samsung Electronics Co., Ltd. Status Change Control Method and Electronic Device supporting the same
ES2898981T3 (en) * 2012-08-09 2022-03-09 Tobii Ab Quick activation in a gaze tracking system
US9270822B2 (en) * 2012-08-14 2016-02-23 Avaya Inc. Protecting privacy of a customer and an agent using face recognition in a video contact center environment
TWI515636B (en) * 2012-08-24 2016-01-01 緯創資通股份有限公司 Portable electronic device and automatic unlocking method thereof
WO2014039337A1 (en) * 2012-09-07 2014-03-13 Tractus Corporation Methods, apparatus, and systems for viewing multiple-slice medical images
US9578224B2 (en) 2012-09-10 2017-02-21 Nvidia Corporation System and method for enhanced monoimaging
US9323310B2 (en) 2012-09-19 2016-04-26 Sony Corporation Mobile client device, operation method, and recording medium
KR102091597B1 (en) * 2012-09-24 2020-03-20 LG Electronics Inc. Portable device and controlling method thereof
US9406103B1 (en) 2012-09-26 2016-08-02 Amazon Technologies, Inc. Inline message alert
KR20140042280A (en) * 2012-09-28 2014-04-07 LG Electronics Inc. Portable device and controlling method thereof
RU2523040C2 (en) * 2012-10-02 2014-07-20 LG Electronics Inc. Screen brightness control for mobile device
US9612656B2 (en) 2012-11-27 2017-04-04 Facebook, Inc. Systems and methods of eye tracking control on mobile device
US9760150B2 (en) 2012-11-27 2017-09-12 Nvidia Corporation Low-power states for a computer system with integrated baseband
TR201820498T4 (en) 2012-11-29 2019-01-21 Vorwerk Co Interholding Kitchen tool.
CN103857019B (en) 2012-11-30 2018-01-02 Nvidia Corporation Method for power saving in a terminal
US20140160019A1 (en) * 2012-12-07 2014-06-12 Nvidia Corporation Methods for enhancing user interaction with mobile devices
WO2014092437A1 (en) 2012-12-10 2014-06-19 Samsung Electronics Co., Ltd. Mobile device of bangle type, control method thereof, and ui display method
KR101480594B1 (en) * 2012-12-18 2015-01-08 Hyundai Motor Company Method for controlling termination call based on gaze, and mobile communication terminal therefor
US20140191939A1 (en) * 2013-01-09 2014-07-10 Microsoft Corporation Using nonverbal communication in determining actions
SE536990C2 (en) * 2013-01-22 2014-11-25 Crunchfish Ab Improved tracking of an object for controlling a non-touch user interface
KR20230137475A (en) 2013-02-07 2023-10-04 애플 인크. Voice trigger for a digital assistant
CN109756673B (en) * 2013-02-13 2022-02-08 Huawei Technologies Co., Ltd. Method and terminal for controlling display state
KR102093198B1 (en) * 2013-02-21 2020-03-25 Samsung Electronics Co., Ltd. Method and apparatus for user interface using gaze interaction
US20140235253A1 (en) * 2013-02-21 2014-08-21 Hong C. Li Call routing among personal devices based on visual clues
US20140244190A1 (en) * 2013-02-28 2014-08-28 Cellco Partnership D/B/A Verizon Wireless Power usage analysis
US9395816B2 (en) * 2013-02-28 2016-07-19 Lg Electronics Inc. Display device for selectively outputting tactile feedback and visual feedback and method for controlling the same
US20140247208A1 (en) * 2013-03-01 2014-09-04 Tobii Technology Ab Invoking and waking a computing device from stand-by mode based on gaze detection
US9864498B2 (en) 2013-03-13 2018-01-09 Tobii Ab Automatic scrolling based on gaze detection
ES2731560T3 (en) 2013-03-01 2019-11-15 Tobii Ab Look interaction with delayed deformation
US10652394B2 (en) 2013-03-14 2020-05-12 Apple Inc. System and method for processing voicemail
US10748529B1 (en) 2013-03-15 2020-08-18 Apple Inc. Voice activated device for use with a voice-based digital assistant
US9842374B2 (en) * 2013-03-16 2017-12-12 Marc Jim Bitoun Physiological indicator monitoring for identifying stress triggers and certain health problems
WO2014146168A1 (en) * 2013-03-19 2014-09-25 National Ict Australia Limited Automatic detection of task transition
US9075435B1 (en) 2013-04-22 2015-07-07 Amazon Technologies, Inc. Context-aware notifications
GB2513579A (en) 2013-04-29 2014-11-05 Tobii Technology Ab Power efficient image sensing apparatus, method of operating the same and eye/gaze tracking system
US20140340317A1 (en) * 2013-05-14 2014-11-20 Sony Corporation Button with capacitive touch in a metal body of a user device and power-saving touch key control of information to display
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US20140368423A1 (en) * 2013-06-17 2014-12-18 Nvidia Corporation Method and system for low power gesture recognition for waking up mobile devices
EP3012828A4 (en) * 2013-06-19 2017-01-04 Yulong Computer Telecommunication Scientific (Shenzhen) Co. Ltd. Smart watch and display method for smart watch
KR102160767B1 (en) * 2013-06-20 2020-09-29 Samsung Electronics Co., Ltd. Mobile terminal and method for detecting a gesture to control functions
US10025378B2 (en) * 2013-06-25 2018-07-17 Microsoft Technology Licensing, Llc Selecting user interface elements via position signal
WO2014210430A1 (en) 2013-06-27 2014-12-31 Tractus Corporation Systems and methods for tissue mapping
US20160147300A1 (en) * 2013-06-28 2016-05-26 Nokia Technologies Oy Supporting Activation of Function of Device
US20150015688A1 (en) * 2013-07-09 2015-01-15 HTC Corporation Facial unlock mechanism using light level determining module
KR102179812B1 (en) * 2013-07-18 2020-11-17 LG Electronics Inc. Watch type mobile terminal
JP5907353B2 (en) * 2013-08-06 2016-04-26 Konica Minolta, Inc. Display device, display control program, and image processing device
KR102305578B1 (en) * 2013-08-23 2021-09-27 Samsung Electronics Co., Ltd. Method and apparatus for switching mode of terminal
WO2015026203A1 (en) * 2013-08-23 2015-02-26 Samsung Electronics Co., Ltd. Mode switching method and apparatus of terminal
US9703355B2 (en) * 2013-08-28 2017-07-11 Qualcomm Incorporated Method, devices and systems for dynamic multimedia data flow control for thermal power budgeting
US9367117B2 (en) * 2013-08-29 2016-06-14 Sony Interactive Entertainment America Llc Attention-based rendering and fidelity
US9374872B2 (en) 2013-08-30 2016-06-21 Universal Display Corporation Intelligent dimming lighting
US9823728B2 (en) 2013-09-04 2017-11-21 Nvidia Corporation Method and system for reduced rate touch scanning on an electronic device
US9947080B2 (en) 2013-09-17 2018-04-17 Nokia Technologies Oy Display of a visual event notification
US9881592B2 (en) 2013-10-08 2018-01-30 Nvidia Corporation Hardware overlay assignment
US9398144B2 (en) * 2013-10-24 2016-07-19 Cellco Partnership Mobile device mode of operation for visually impaired users
EP3065623B1 (en) * 2013-11-09 2020-04-22 Shenzhen Goodix Technology Co., Ltd. Optical eye tracking
US10558262B2 (en) 2013-11-18 2020-02-11 Tobii Ab Component determination and gaze provoked interaction
US10317995B2 (en) 2013-11-18 2019-06-11 Tobii Ab Component determination and gaze provoked interaction
US9111076B2 (en) * 2013-11-20 2015-08-18 Lg Electronics Inc. Mobile terminal and control method thereof
US9390649B2 (en) 2013-11-27 2016-07-12 Universal Display Corporation Ruggedized wearable display
US9552064B2 (en) 2013-11-27 2017-01-24 Shenzhen Huiding Technology Co., Ltd. Eye tracking and user reaction detection
CN104683557A (en) * 2013-12-02 2015-06-03 ZTE Corporation Method and device for automatically switching call mode of mobile terminal
US9213659B2 (en) 2013-12-03 2015-12-15 Lenovo (Singapore) Pte. Ltd. Devices and methods to receive input at a first device and present output in response on a second device different from the first device
US9110635B2 (en) * 2013-12-03 2015-08-18 Lenovo (Singapore) Pte. Ltd. Initiating personal assistant application based on eye tracking and gestures
US10163455B2 (en) 2013-12-03 2018-12-25 Lenovo (Singapore) Pte. Ltd. Detecting pause in audible input to device
EP3080800A4 (en) * 2013-12-09 2017-08-02 AGCO Corporation Method and apparatus for improving user interface visibility in agricultural machines
US9709708B2 (en) 2013-12-09 2017-07-18 Lenovo (Singapore) Pte. Ltd. Adjustable display optics
US20150169047A1 (en) * 2013-12-16 2015-06-18 Nokia Corporation Method and apparatus for causation of capture of visual information indicative of a part of an environment
US20150169048A1 (en) * 2013-12-18 2015-06-18 Lenovo (Singapore) Pte. Ltd. Systems and methods to present information on device based on eye tracking
US10180716B2 (en) 2013-12-20 2019-01-15 Lenovo (Singapore) Pte Ltd Providing last known browsing location cue using movement-oriented biometric data
US9633252B2 (en) 2013-12-20 2017-04-25 Lenovo (Singapore) Pte. Ltd. Real-time detection of user intention based on kinematics analysis of movement-oriented biometric data
US10032075B2 (en) 2013-12-23 2018-07-24 Eyelock Llc Methods and apparatus for power-efficient iris recognition
US20150187357A1 (en) * 2013-12-30 2015-07-02 Samsung Electronics Co., Ltd. Natural input based virtual ui system for mobile devices
US10073671B2 (en) 2014-01-20 2018-09-11 Lenovo (Singapore) Pte. Ltd. Detecting noise or object interruption in audio video viewing and altering presentation based thereon
US9542844B2 (en) * 2014-02-11 2017-01-10 Google Inc. Providing navigation directions in view of device orientation relative to user
US9805254B2 (en) * 2014-02-18 2017-10-31 Lenovo (Singapore) Pte. Ltd. Preventing display clearing
US20150261315A1 (en) * 2014-03-11 2015-09-17 Google Technology Holdings LLC Display viewing detection
US10119864B2 (en) 2014-03-11 2018-11-06 Google Technology Holdings LLC Display viewing detection
US20150309534A1 (en) 2014-04-25 2015-10-29 Osterhout Group, Inc. Ear horn assembly for headworn computer
US9572232B2 (en) * 2014-05-15 2017-02-14 Universal Display Corporation Biosensing electronic devices
EP3480811A1 (en) 2014-05-30 2019-05-08 Apple Inc. Multi-command single utterance input method
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US20150346814A1 (en) * 2014-05-30 2015-12-03 Vaibhav Thukral Gaze tracking for one or more users
EP2975576A1 (en) * 2014-07-15 2016-01-20 Thomson Licensing Method of determination of stable zones within an image stream, and portable device for implementing the method
US9952883B2 (en) 2014-08-05 2018-04-24 Tobii Ab Dynamic determination of hardware
US9811095B2 (en) 2014-08-06 2017-11-07 Lenovo (Singapore) Pte. Ltd. Glasses with fluid-fillable membrane for adjusting focal length of one or more lenses of the glasses
US20160109943A1 (en) * 2014-10-21 2016-04-21 Honeywell International Inc. System and method for controlling visibility of a proximity display
US20160131904A1 (en) * 2014-11-07 2016-05-12 Osterhout Group, Inc. Power management for head worn computing
US9406211B2 (en) * 2014-11-19 2016-08-02 Medical Wearable Solutions Ltd. Wearable posture regulation system and method to regulate posture
US9535497B2 (en) 2014-11-20 2017-01-03 Lenovo (Singapore) Pte. Ltd. Presentation of data on an at least partially transparent display based on user focus
FR3029307B1 (en) * 2014-11-27 2018-01-05 Oberthur Technologies Electronic device, system comprising such a device, method for controlling such a device, and method for display management by a system comprising such a device
US10115185B2 (en) 2014-12-05 2018-10-30 At&T Intellectual Property I, L.P. Dynamic image recognition model updates
ES2642263T3 (en) * 2014-12-23 2017-11-16 Nokia Technologies Oy Virtual reality content control
US20170364324A1 (en) * 2014-12-23 2017-12-21 Lg Electronics Inc. Portable device and control method therefor
US10156899B2 (en) * 2015-01-07 2018-12-18 Facebook, Inc. Dynamic camera or light operation
EP3059658A1 (en) * 2015-02-18 2016-08-24 Nokia Technologies OY Apparatus, methods and computer programs for providing images
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US10860094B2 (en) 2015-03-10 2020-12-08 Lenovo (Singapore) Pte. Ltd. Execution of function based on location of display at which a user is looking and manipulation of an input device
US10013540B2 (en) 2015-03-10 2018-07-03 Lenovo (Singapore) Pte. Ltd. Authentication based on body movement
US10499164B2 (en) 2015-03-18 2019-12-03 Lenovo (Singapore) Pte. Ltd. Presentation of audio based on source
US10621431B2 (en) 2015-03-27 2020-04-14 Lenovo (Singapore) Pte. Ltd. Camera that uses light from plural light sources disposed on a device
JP2016194799A (en) * 2015-03-31 2016-11-17 Fujitsu Limited Image analyzer and image analysis method
US10331398B2 (en) 2015-05-14 2019-06-25 International Business Machines Corporation Reading device usability
US10460227B2 (en) 2015-05-15 2019-10-29 Apple Inc. Virtual assistant in a communication session
US10127211B2 (en) 2015-05-20 2018-11-13 International Business Machines Corporation Overlay of input control to identify and restrain draft content from streaming
WO2016191404A1 (en) * 2015-05-26 2016-12-01 Invue Security Products Inc. Merchandise display including sensor for detecting the presence of a customer
US10180741B2 (en) * 2015-05-27 2019-01-15 Samsung Electronics Co., Ltd. Electronic apparatus including emissive display and transparent display and method of controlling same
US20160378747A1 (en) 2015-06-29 2016-12-29 Apple Inc. Virtual assistant for media playback
KR20170011870A (en) * 2015-07-24 2017-02-02 Samsung Electronics Co., Ltd. Electronic device and method thereof for providing content
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
JP6675082B2 (en) * 2015-09-30 2020-04-01 Panasonic Intellectual Property Management Co., Ltd. Watching device, watching method, and computer program
US20170102758A1 (en) * 2015-10-08 2017-04-13 Stmicroelectronics Asia Pacific Pte Ltd Wake up gesture for low power using capacitive touch controller
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US20170177088A1 (en) * 2015-12-21 2017-06-22 Sap Se Two-step gesture recognition for fine-grain control of wearable applications
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US9880441B1 (en) 2016-09-08 2018-01-30 Osterhout Group, Inc. Electrochromic systems for head-worn computer systems
CN107305434A (en) * 2016-04-25 2017-10-31 ZTE Corporation Button operation recognition method and device
US11227589B2 (en) 2016-06-06 2022-01-18 Apple Inc. Intelligent list reading
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK201670616A1 (en) * 2016-06-12 2018-01-22 Apple Inc Devices and Methods for Accessing Prevalent Device Functions
US10354057B2 (en) * 2016-12-28 2019-07-16 Motorola Solutions, Inc. Detection of unauthorized user assistance of an electronic device based on the detection or tracking of eyes
US11204787B2 (en) 2017-01-09 2021-12-21 Apple Inc. Application integration with a digital assistant
US10659923B2 (en) 2017-04-10 2020-05-19 Here Global B.V. Condition based accurate indoor positioning
DK201770383A1 (en) 2017-05-09 2018-12-14 Apple Inc. User interface for correcting recognition errors
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. Synchronization and task delegation of a digital assistant
DK201770428A1 (en) 2017-05-12 2019-02-18 Apple Inc. Low-latency intelligent automated assistant
DK179560B1 (en) 2017-05-16 2019-02-18 Apple Inc. Far-field extension for digital assistant services
US20180336275A1 (en) 2017-05-16 2018-11-22 Apple Inc. Intelligent automated assistant for media exploration
US10862329B2 (en) * 2017-06-01 2020-12-08 Apple Inc. Electronic device with activity-based power management
CA3122315A1 (en) 2018-02-22 2019-08-29 Innodem Neurosciences Eye tracking method and system
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc Disabling of attention-attentive virtual assistant
DK179822B1 (en) 2018-06-01 2019-07-12 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US11010561B2 (en) 2018-09-27 2021-05-18 Apple Inc. Sentiment prediction from textual data
US11475898B2 (en) 2018-10-26 2022-10-18 Apple Inc. Low-latency multi-speaker speech recognition
US11178468B2 (en) * 2018-11-29 2021-11-16 International Business Machines Corporation Adjustments to video playing on a computer
US11638059B2 (en) 2019-01-04 2023-04-25 Apple Inc. Content playback on multiple devices
US20200236539A1 (en) * 2019-01-22 2020-07-23 Jpmorgan Chase Bank, N.A. Method for protecting privacy on mobile communication device
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
WO2020191643A1 (en) * 2019-03-27 2020-10-01 Intel Corporation Smart display panel apparatus and related methods
US11475884B2 (en) 2019-05-06 2022-10-18 Apple Inc. Reducing digital assistant latency when a language is incorrectly determined
US11423908B2 (en) 2019-05-06 2022-08-23 Apple Inc. Interpreting spoken requests
US11217251B2 (en) 2019-05-06 2022-01-04 Apple Inc. Spoken notifications
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
WO2020229912A1 (en) * 2019-05-10 2020-11-19 Semiconductor Energy Laboratory Co., Ltd. Complex device and method for driving electronic device
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
US11379016B2 (en) 2019-05-23 2022-07-05 Intel Corporation Methods and apparatus to operate closed-lid portable computers
US11289073B2 (en) 2019-05-31 2022-03-29 Apple Inc. Device text to speech
DK180129B1 (en) 2019-05-31 2020-06-02 Apple Inc. User activity shortcut suggestions
US11496600B2 (en) 2019-05-31 2022-11-08 Apple Inc. Remote execution of machine-learned models
DK201970510A1 (en) 2019-05-31 2021-02-11 Apple Inc Voice identification in digital assistant systems
US11360641B2 (en) 2019-06-01 2022-06-14 Apple Inc. Increasing the relevance of new available information
WO2021056255A1 (en) 2019-09-25 2021-04-01 Apple Inc. Text detection using global geometry estimators
US11543873B2 (en) 2019-09-27 2023-01-03 Intel Corporation Wake-on-touch display screen devices and related methods
US11733761B2 (en) 2019-11-11 2023-08-22 Intel Corporation Methods and apparatus to manage power and performance of computing devices based on user presence
US10923045B1 (en) * 2019-11-26 2021-02-16 Himax Technologies Limited Backlight control device and method
US11809535B2 (en) 2019-12-23 2023-11-07 Intel Corporation Systems and methods for multi-modal user device authentication
KR102291593B1 (en) * 2019-12-26 2021-08-18 LG Electronics Inc. Image displaying apparatus and method thereof
US11175714B2 (en) * 2019-12-28 2021-11-16 Intel Corporation Detection of user-facing camera obstruction
US10955988B1 (en) 2020-02-14 2021-03-23 Lenovo (Singapore) Pte. Ltd. Execution of function based on user looking at one area of display while touching another area of display
US20230336826A1 (en) * 2020-05-22 2023-10-19 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for controlling video playing, electronic device and storage medium
US11789565B2 (en) 2020-08-18 2023-10-17 Intel Corporation Lid controller hub architecture for improved touch experiences
US20210109585A1 (en) * 2020-12-21 2021-04-15 Intel Corporation Methods and apparatus to improve user experience on computing devices
US11463270B2 (en) * 2021-01-28 2022-10-04 Dell Products, Lp System and method for operating an intelligent face framing management system for videoconferencing applications
CN116360583A (en) * 2021-12-28 2023-06-30 Huawei Technologies Co., Ltd. Equipment control method and related device

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5583795A (en) * 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US5996080A (en) 1995-10-04 1999-11-30 Norand Corporation Safe, virtual trigger for a portable data capture terminal
US5835083A (en) * 1996-05-30 1998-11-10 Sun Microsystems, Inc. Eyetrack-driven illumination and information display
GB9722766D0 (en) * 1997-10-28 1997-12-24 British Telecomm Portable computers
AUPP400998A0 (en) * 1998-06-10 1998-07-02 Canon Kabushiki Kaisha Face detection in digital images
US6526159B1 (en) * 1998-12-31 2003-02-25 Intel Corporation Eye tracking for resource and power management
JP2001272943A (en) * 2000-03-27 2001-10-05 Canon Inc Mobile type electronic equipment and its control method
US20070078552A1 (en) * 2006-01-13 2007-04-05 Outland Research, Llc Gaze-based power conservation for portable media players
US20020151297A1 (en) 2000-10-14 2002-10-17 Donald Remboski Context aware wireless communication device and method
US20020173344A1 (en) * 2001-03-16 2002-11-21 Cupps Bryan T. Novel personal electronics device
US6886137B2 (en) * 2001-05-29 2005-04-26 International Business Machines Corporation Eye gaze control of dynamic information presentation
US20030038754A1 (en) * 2001-08-22 2003-02-27 Mikael Goldstein Method and apparatus for gaze responsive text presentation in RSVP display
KR100455294B1 (en) 2002-12-06 2004-11-06 Samsung Electronics Co., Ltd. Method for detecting user and detecting motion, and apparatus for detecting user within security system
US7280678B2 (en) 2003-02-28 2007-10-09 Avago Technologies General Ip Pte Ltd Apparatus and method for detecting pupils
US7379560B2 (en) * 2003-03-05 2008-05-27 Intel Corporation Method and apparatus for monitoring human attention in dynamic power management
US8292433B2 (en) * 2003-03-21 2012-10-23 Queen's University At Kingston Method and apparatus for communication between humans and devices
US7117380B2 (en) * 2003-09-30 2006-10-03 International Business Machines Corporation Apparatus, system, and method for autonomic power adjustment in an electronic device
US7809160B2 (en) * 2003-11-14 2010-10-05 Queen's University At Kingston Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections
US20050289363A1 (en) * 2004-06-28 2005-12-29 Tsirkel Aaron M Method and apparatus for automatic realtime power management
US7438414B2 (en) * 2005-07-28 2008-10-21 Outland Research, Llc Gaze discriminating electronic control apparatus, system, method and computer program product
US20090066722A1 (en) * 2005-08-29 2009-03-12 Kriger Joshua F System, Device, and Method for Conveying Information Using Enhanced Rapid Serial Presentation
KR100648517B1 (en) * 2005-12-09 2006-11-27 ATLab Inc. Optical navigation device and operating method thereof

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5214466A (en) * 1990-04-11 1993-05-25 Canon Kabushiki Kaisha Camera having visual axis detecting apparatus
US5239337A (en) * 1990-08-20 1993-08-24 Nikon Corporation Apparatus for ordering to phototake with eye-detection
US6111580A (en) * 1995-09-13 2000-08-29 Kabushiki Kaisha Toshiba Apparatus and method for controlling an electronic device with user action
US5990872A (en) * 1996-10-31 1999-11-23 Gateway 2000, Inc. Keyboard control of a pointing device of a computer
US7152172B2 (en) * 1999-12-27 2006-12-19 Intel Corporation Method and apparatus for real time monitoring of user presence to prolong a portable computer battery operation time
US6665805B1 (en) * 1999-12-27 2003-12-16 Intel Corporation Method and apparatus for real time monitoring of user presence to prolong a portable computer battery operation time
US20040073827A1 (en) * 1999-12-27 2004-04-15 Intel Corporation Method and apparatus for real time monitoring of user presence to prolong a portable computer battery operation time
US20020163582A1 (en) * 2001-05-04 2002-11-07 Gruber Michael A. Self-calibrating, digital, large format camera with single or multiple detector arrays and single or multiple optical systems
US20040081337A1 (en) * 2002-10-23 2004-04-29 Tsirkel Aaron M. Method and apparatus for adaptive realtime system power state control
US20040181703A1 (en) * 2003-02-12 2004-09-16 Nokia Corporation Selecting operation modes in electronic device
US20050199783A1 (en) * 2004-03-15 2005-09-15 Wenstrand John S. Using eye detection for providing control and power management of electronic devices
US20050236488A1 (en) * 2004-04-26 2005-10-27 Kricorissian Gregg R Motion induced blur minimization in a portable image reader
US7302089B1 (en) * 2004-04-29 2007-11-27 National Semiconductor Corporation Autonomous optical wake-up intelligent sensor circuit
US20060140452A1 (en) * 2004-12-15 2006-06-29 Stmicroelectronics Ltd. Computer user detection apparatus and associated method
US20090033618A1 (en) * 2005-07-04 2009-02-05 Rune Norager Unit, an Assembly and a Method for Controlling in a Dynamic Egocentric Interactive Space
US20070072553A1 (en) * 2005-09-26 2007-03-29 Barbera Melvin A Safety features for portable electronic device
US20070075965A1 (en) * 2005-09-30 2007-04-05 Brian Huppi Automated response to and sensing of user activity in portable devices
US20080095402A1 (en) * 2006-09-29 2008-04-24 Topcon Corporation Device and method for position measurement
US20080167834A1 (en) * 2007-01-07 2008-07-10 Herz Scott M Using ambient light sensor to augment proximity sensor output
US20090097705A1 (en) * 2007-10-12 2009-04-16 Sony Ericsson Mobile Communications Ab Obtaining information by tracking a user
US20090219224A1 (en) * 2008-02-28 2009-09-03 Johannes Elg Head tracking for enhanced 3d experience using face detection

Cited By (272)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US20120200490A1 (en) * 2011-02-03 2012-08-09 Denso Corporation Gaze detection apparatus and method
US8866736B2 (en) * 2011-02-03 2014-10-21 Denso Corporation Gaze detection apparatus and method
US20120327447A1 (en) * 2011-06-24 2012-12-27 Konica Minolta Business Technologies, Inc. Image forming apparatus
US9436306B2 (en) 2011-09-21 2016-09-06 Nec Corporation Portable terminal device and program
US9094539B1 (en) * 2011-09-22 2015-07-28 Amazon Technologies, Inc. Dynamic device adjustments based on determined user sleep state
US20130083344A1 (en) * 2011-10-04 2013-04-04 Konica Minolta Business Technologies, Inc. Image forming apparatus
US9179015B2 (en) * 2011-10-04 2015-11-03 Konica Minolta Business Technologies, Inc. Image forming apparatus
US11849153B2 (en) 2012-01-19 2023-12-19 Vid Scale, Inc. Methods and systems for video delivery supporting adaptation to viewing conditions
US20140204014A1 (en) * 2012-03-30 2014-07-24 Sony Mobile Communications Ab Optimizing selection of a media object type in which to present content to a user of a device
US8937591B2 (en) 2012-04-06 2015-01-20 Apple Inc. Systems and methods for counteracting a perceptual fading of a movable indicator
US20170010647A1 (en) * 2012-10-14 2017-01-12 Ari M. Frank Reducing power consumed by a head-mounted system that measures affective response to content
US20140108842A1 (en) * 2012-10-14 2014-04-17 Ari M. Frank Utilizing eye tracking to reduce power consumption involved in measuring affective response
US9104467B2 (en) * 2012-10-14 2015-08-11 Ari M Frank Utilizing eye tracking to reduce power consumption involved in measuring affective response
US9791920B2 (en) * 2013-01-04 2017-10-17 Samsung Electronics Co., Ltd. Apparatus and method for providing control service using head tracking technology in electronic device
US20140191948A1 (en) * 2013-01-04 2014-07-10 Samsung Electronics Co., Ltd. Apparatus and method for providing control service using head tracking technology in electronic device
US10310599B1 (en) 2013-03-21 2019-06-04 Chian Chiu Li System and method for providing information
US11121575B2 (en) 2013-06-14 2021-09-14 Intel Corporation Methods and apparatus to provide power to devices
KR101766435B1 (en) * 2013-06-14 2017-08-08 Intel Corporation Methods and apparatus to provide power to devices
US9859743B2 (en) 2013-06-14 2018-01-02 Intel Corporation Mobile wireless charging service
WO2014200508A1 (en) 2013-06-14 2014-12-18 Intel Corporation Methods and apparatus to provide power to devices
US20150001302A1 (en) * 2013-06-28 2015-01-01 Hand Held Products, Inc. Mobile device having an improved user interface for reading code symbols
US8985461B2 (en) * 2013-06-28 2015-03-24 Hand Held Products, Inc. Mobile device having an improved user interface for reading code symbols
US9235737B2 (en) * 2013-06-28 2016-01-12 Hand Held Products, Inc. System having an improved user interface for reading code symbols
US9477856B2 (en) * 2013-06-28 2016-10-25 Hand Held Products, Inc. System having an improved user interface for reading code symbols
US20150178523A1 (en) * 2013-06-28 2015-06-25 Hand Held Products, Inc. System having an improved user interface for reading code symbols
US20150138243A1 (en) * 2013-08-21 2015-05-21 Tencent Technology (Shenzhen) Company Limited Systems and Methods for Dynamic Wall Paper for Mobile Terminals
CN104427087A (en) * 2013-08-21 2015-03-18 Tencent Technology (Shenzhen) Company Limited Method for realizing dynamic wallpaper of mobile terminal, and mobile terminal
US10027737B2 (en) * 2013-10-31 2018-07-17 Samsung Electronics Co., Ltd. Method, apparatus and computer readable medium for activating functionality of an electronic device based on the presence of a user staring at the electronic device
KR102233728B1 (en) * 2013-10-31 2021-03-30 Samsung Electronics Co., Ltd. Method, apparatus and computer readable recording medium for controlling on an electronic device
KR20150049942A (en) * 2013-10-31 2015-05-08 Samsung Electronics Co., Ltd. Method, apparatus and computer readable recording medium for controlling on an electronic device
US20150121228A1 (en) * 2013-10-31 2015-04-30 Samsung Electronics Co., Ltd. Photographing image changes
US9836639B2 (en) 2014-01-10 2017-12-05 Facebook, Inc. Systems and methods of light modulation in eye tracking devices
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US11231817B2 (en) 2014-01-17 2022-01-25 Mentor Acquisition One, Llc External user interface for head worn computing
US11782529B2 (en) 2014-01-17 2023-10-10 Mentor Acquisition One, Llc External user interface for head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US11507208B2 (en) 2014-01-17 2022-11-22 Mentor Acquisition One, Llc External user interface for head worn computing
US11169623B2 (en) 2014-01-17 2021-11-09 Mentor Acquisition One, Llc External user interface for head worn computing
US10222618B2 (en) 2014-01-21 2019-03-05 Osterhout Group, Inc. Compact optics with reduced chromatic aberrations
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9436006B2 (en) 2014-01-21 2016-09-06 Osterhout Group, Inc. See-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9529199B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US10481393B2 (en) 2014-01-21 2019-11-19 Mentor Acquisition One, Llc See-through computer display systems
US9529192B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9532715B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US9532714B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US9538915B2 (en) 2014-01-21 2017-01-10 Osterhout Group, Inc. Eye imaging in head worn computing
US10698223B2 (en) 2014-01-21 2020-06-30 Mentor Acquisition One, Llc See-through computer display systems
US10866420B2 (en) 2014-01-21 2020-12-15 Mentor Acquisition One, Llc See-through computer display systems
US10379365B2 (en) 2014-01-21 2019-08-13 Mentor Acquisition One, Llc See-through computer display systems
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US11796799B2 (en) 2014-01-21 2023-10-24 Mentor Acquisition One, Llc See-through computer display systems
US11796805B2 (en) 2014-01-21 2023-10-24 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9615742B2 (en) 2014-01-21 2017-04-11 Osterhout Group, Inc. Eye imaging in head worn computing
US9651789B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-Through computer display systems
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9298002B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
US9651788B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9651783B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9658458B2 (en) 2014-01-21 2017-05-23 Osterhout Group, Inc. See-through computer display systems
US9658457B2 (en) 2014-01-21 2017-05-23 Osterhout Group, Inc. See-through computer display systems
US10890760B2 (en) 2014-01-21 2021-01-12 Mentor Acquisition One, Llc See-through computer display systems
US9298001B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
US9684171B2 (en) 2014-01-21 2017-06-20 Osterhout Group, Inc. See-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9684165B2 (en) 2014-01-21 2017-06-20 Osterhout Group, Inc. Eye imaging in head worn computing
US11002961B2 (en) 2014-01-21 2021-05-11 Mentor Acquisition One, Llc See-through computer display systems
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9720235B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9720227B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9720234B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US11650416B2 (en) 2014-01-21 2023-05-16 Mentor Acquisition One, Llc See-through computer display systems
US9377625B2 (en) 2014-01-21 2016-06-28 Osterhout Group, Inc. Optical configurations for head worn computing
US11619820B2 (en) 2014-01-21 2023-04-04 Mentor Acquisition One, Llc See-through computer display systems
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US9740012B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. See-through computer display systems
US9746676B2 (en) 2014-01-21 2017-08-29 Osterhout Group, Inc. See-through computer display systems
US11054902B2 (en) 2014-01-21 2021-07-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US10139632B2 (en) 2014-01-21 2018-11-27 Osterhout Group, Inc. See-through computer display systems
US11099380B2 (en) 2014-01-21 2021-08-24 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9772492B2 (en) 2014-01-21 2017-09-26 Osterhout Group, Inc. Eye imaging in head worn computing
US11103132B2 (en) 2014-01-21 2021-08-31 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11622426B2 (en) 2014-01-21 2023-04-04 Mentor Acquisition One, Llc See-through computer display systems
US11126003B2 (en) 2014-01-21 2021-09-21 Mentor Acquisition One, Llc See-through computer display systems
US9811152B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US10579140B2 (en) 2014-01-21 2020-03-03 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US10012838B2 (en) 2014-01-21 2018-07-03 Osterhout Group, Inc. Compact optical system with improved contrast uniformity
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9829703B2 (en) 2014-01-21 2017-11-28 Osterhout Group, Inc. Eye imaging in head worn computing
US10007118B2 (en) 2014-01-21 2018-06-26 Osterhout Group, Inc. Compact optical system with improved illumination
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US10001644B2 (en) 2014-01-21 2018-06-19 Osterhout Group, Inc. See-through computer display systems
US11353957B2 (en) 2014-01-21 2022-06-07 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US9958674B2 (en) 2014-01-21 2018-05-01 Osterhout Group, Inc. Eye imaging in head worn computing
US9329387B2 (en) 2014-01-21 2016-05-03 Osterhout Group, Inc. See-through computer display systems
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9885868B2 (en) 2014-01-21 2018-02-06 Osterhout Group, Inc. Eye imaging in head worn computing
US9316833B2 (en) 2014-01-21 2016-04-19 Osterhout Group, Inc. Optical configurations for head worn computing
US9933622B2 (en) 2014-01-21 2018-04-03 Osterhout Group, Inc. See-through computer display systems
US9927612B2 (en) 2014-01-21 2018-03-27 Osterhout Group, Inc. See-through computer display systems
US11782274B2 (en) 2014-01-24 2023-10-10 Mentor Acquisition One, Llc Stray light suppression for head worn computing
US9939646B2 (en) 2014-01-24 2018-04-10 Osterhout Group, Inc. Stray light suppression for head worn computing
US10578874B2 (en) 2014-01-24 2020-03-03 Mentor Acquisition One, Llc Stray light suppression for head worn computing
US10558050B2 (en) 2014-01-24 2020-02-11 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US11822090B2 (en) 2014-01-24 2023-11-21 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US9843093B2 (en) 2014-02-11 2017-12-12 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9841602B2 (en) 2014-02-11 2017-12-12 Osterhout Group, Inc. Location indicating avatar in head worn computing
US10558420B2 (en) 2014-02-11 2020-02-11 Mentor Acquisition One, Llc Spatial location presentation in head worn computing
US9229234B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro doppler presentations in head worn computing
US9852545B2 (en) 2014-02-11 2017-12-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US11599326B2 (en) 2014-02-11 2023-03-07 Mentor Acquisition One, Llc Spatial location presentation in head worn computing
US9286728B2 (en) 2014-02-11 2016-03-15 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9784973B2 (en) 2014-02-11 2017-10-10 Osterhout Group, Inc. Micro doppler presentations in head worn computing
US9928019B2 (en) 2014-02-14 2018-03-27 Osterhout Group, Inc. Object shadowing in head worn computing
US9547465B2 (en) 2014-02-14 2017-01-17 Osterhout Group, Inc. Object shadowing in head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US11104272B2 (en) 2014-03-28 2021-08-31 Mentor Acquisition One, Llc System for assisted operator safety using an HMD
US9423612B2 (en) 2014-03-28 2016-08-23 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10101588B2 (en) 2014-04-25 2018-10-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US11880041B2 (en) 2014-04-25 2024-01-23 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US10634922B2 (en) 2014-04-25 2020-04-28 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US10045050B2 (en) 2014-04-25 2018-08-07 Vid Scale, Inc. Perceptual preprocessing filter for viewing-conditions-aware video coding
US11474360B2 (en) 2014-04-25 2022-10-18 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US11727223B2 (en) 2014-04-25 2023-08-15 Mentor Acquisition One, Llc Language translation with head-worn computing
WO2015167471A1 (en) * 2014-04-29 2015-11-05 Hewlett-Packard Development Company, L.P. Gaze detector using reference frames in media
US9916502B2 (en) 2014-04-29 2018-03-13 Microsoft Technology Licensing, Llc Handling glare in eye tracking
US9983668B2 (en) 2014-04-29 2018-05-29 Hewlett-Packard Development Company, L.P. Gaze detector using reference frames in media
US9454699B2 (en) 2014-04-29 2016-09-27 Microsoft Technology Licensing, Llc Handling glare in eye tracking
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US11402639B2 (en) 2014-06-05 2022-08-02 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US10877270B2 (en) 2014-06-05 2020-12-29 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US11327323B2 (en) 2014-06-09 2022-05-10 Mentor Acquisition One, Llc Content presentation in head worn computing
US11022810B2 (en) 2014-06-09 2021-06-01 Mentor Acquisition One, Llc Content presentation in head worn computing
US11887265B2 (en) 2014-06-09 2024-01-30 Mentor Acquisition One, Llc Content presentation in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US10139635B2 (en) 2014-06-09 2018-11-27 Osterhout Group, Inc. Content presentation in head worn computing
US11790617B2 (en) 2014-06-09 2023-10-17 Mentor Acquisition One, Llc Content presentation in head worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US10976559B2 (en) 2014-06-09 2021-04-13 Mentor Acquisition One, Llc Content presentation in head worn computing
US11360318B2 (en) 2014-06-09 2022-06-14 Mentor Acquisition One, Llc Content presentation in head worn computing
US11663794B2 (en) 2014-06-09 2023-05-30 Mentor Acquisition One, Llc Content presentation in head worn computing
US9720241B2 (en) 2014-06-09 2017-08-01 Osterhout Group, Inc. Content presentation in head worn computing
US11789267B2 (en) 2014-06-17 2023-10-17 Mentor Acquisition One, Llc External user interface for head worn computing
US11294180B2 (en) 2014-06-17 2022-04-05 Mentor Acquisition One, Llc External user interface for head worn computing
US10698212B2 (en) 2014-06-17 2020-06-30 Mentor Acquisition One, Llc External user interface for head worn computing
US11054645B2 (en) 2014-06-17 2021-07-06 Mentor Acquisition One, Llc External user interface for head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US10775630B2 (en) 2014-07-08 2020-09-15 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US11409110B2 (en) 2014-07-08 2022-08-09 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US9798148B2 (en) 2014-07-08 2017-10-24 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US10564426B2 (en) 2014-07-08 2020-02-18 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US11269182B2 (en) 2014-07-15 2022-03-08 Mentor Acquisition One, Llc Content presentation in head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US11786105B2 (en) 2014-07-15 2023-10-17 Mentor Acquisition One, Llc Content presentation in head worn computing
US11360314B2 (en) 2014-08-12 2022-06-14 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US11630315B2 (en) 2014-08-12 2023-04-18 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US10908422B2 (en) 2014-08-12 2021-02-02 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US10990187B2 (en) * 2014-09-23 2021-04-27 Fitbit, Inc. Methods, systems, and apparatuses to update screen content responsive to user gestures
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US11262846B2 (en) 2014-12-03 2022-03-01 Mentor Acquisition One, Llc See-through computer display systems
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US11809628B2 (en) 2014-12-03 2023-11-07 Mentor Acquisition One, Llc See-through computer display systems
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US10197801B2 (en) 2014-12-03 2019-02-05 Osterhout Group, Inc. Head worn computer display systems
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD792400S1 (en) 2014-12-31 2017-07-18 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
USD794637S1 (en) 2015-01-05 2017-08-15 Osterhout Group, Inc. Air mouse
US11721303B2 (en) 2015-02-17 2023-08-08 Mentor Acquisition One, Llc See-through computer display systems
US10062182B2 (en) 2015-02-17 2018-08-28 Osterhout Group, Inc. See-through computer display systems
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
US11872018B2 (en) * 2015-03-10 2024-01-16 Brain Tunnelgenix Technologies Corp. Devices, apparatuses, systems, and methods for measuring temperature of an ABTT terminus
US20160364030A1 (en) * 2015-06-11 2016-12-15 Dell Products L.P. Touch user interface at a display edge
US9971436B2 (en) * 2015-06-11 2018-05-15 Dell Products L.P. Touch user interface at a display edge
US10887687B2 (en) 2015-08-11 2021-01-05 Google Llc Pairing of media streaming devices
US10136214B2 (en) 2015-08-11 2018-11-20 Google Llc Pairing of media streaming devices
US10606374B2 (en) 2015-08-19 2020-03-31 Lg Electronics Inc. Watch-type mobile terminal
US10284902B2 (en) 2015-09-18 2019-05-07 Samsung Electronics Co., Ltd. Apparatus and method for playing back multimedia content
US20170097677A1 (en) * 2015-10-05 2017-04-06 International Business Machines Corporation Gaze-aware control of multi-screen experience
US20170097678A1 (en) * 2015-10-05 2017-04-06 International Business Machines Corporation Gaze-aware control of multi-screen experience
US10031577B2 (en) * 2015-10-05 2018-07-24 International Business Machines Corporation Gaze-aware control of multi-screen experience
US10042420B2 (en) * 2015-10-05 2018-08-07 International Business Machines Corporation Gaze-aware control of multi-screen experience
US11714170B2 (en) 2015-12-18 2023-08-01 Samsung Semiconductor, Inc. Real time position sensing of objects
CN108370409A (en) * 2015-12-22 2018-08-03 Sony Corporation Information processing unit, imaging device, information processing system, information processing method and program
US11588965B2 (en) 2015-12-22 2023-02-21 Sony Group Corporation Information processing apparatus, imaging apparatus, information processing system, information processing method, and program
US11924544B2 (en) 2015-12-22 2024-03-05 Sony Group Corporation Information processing apparatus, imaging apparatus, information processing system, information processing method, and medium
US10721385B2 (en) * 2015-12-22 2020-07-21 Sony Corporation Information processing apparatus, imaging apparatus, information processing system, and method to implement power saving mode by stopping a first communication path
US20180359406A1 (en) * 2015-12-22 2018-12-13 Sony Corporation Information processing apparatus, imaging apparatus, information processing system, information processing method, and program
US11140307B2 (en) 2015-12-22 2021-10-05 Sony Corporation Information processing method and program that switches between a first communication path and a second communication path during capturing an image by an image pickup apparatus
US20170187982A1 (en) * 2015-12-29 2017-06-29 Le Holdings (Beijing) Co., Ltd. Method and terminal for playing control based on face recognition
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US10849817B2 (en) 2016-02-29 2020-12-01 Mentor Acquisition One, Llc Providing enhanced images for navigation
US11654074B2 (en) 2016-02-29 2023-05-23 Mentor Acquisition One, Llc Providing enhanced images for navigation
US11298288B2 (en) 2016-02-29 2022-04-12 Mentor Acquisition One, Llc Providing enhanced images for navigation
US11156834B2 (en) 2016-03-02 2021-10-26 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
US11592669B2 (en) 2016-03-02 2023-02-28 Mentor Acquisition One, Llc Optical systems for head-worn computers
US11500212B2 (en) 2016-05-09 2022-11-15 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10684478B2 (en) 2016-05-09 2020-06-16 Mentor Acquisition One, Llc User interface systems for head-worn computers
US11320656B2 (en) 2016-05-09 2022-05-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10824253B2 (en) 2016-05-09 2020-11-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US11226691B2 (en) 2016-05-09 2022-01-18 Mentor Acquisition One, Llc User interface systems for head-worn computers
US11586048B2 (en) 2016-06-01 2023-02-21 Mentor Acquisition One, Llc Modular systems for head-worn computers
US11754845B2 (en) 2016-06-01 2023-09-12 Mentor Acquisition One, Llc Modular systems for head-worn computers
US11460708B2 (en) 2016-06-01 2022-10-04 Mentor Acquisition One, Llc Modular systems for head-worn computers
US10466491B2 (en) 2016-06-01 2019-11-05 Mentor Acquisition One, Llc Modular systems for head-worn computers
US11022808B2 (en) 2016-06-01 2021-06-01 Mentor Acquisition One, Llc Modular systems for head-worn computers
US10613638B2 (en) * 2016-07-27 2020-04-07 Kyocera Corporation Electronic device
US20180033362A1 (en) * 2016-07-29 2018-02-01 Semiconductor Energy Laboratory Co., Ltd. Display method, display device, electronic device, non-temporary memory medium, and program
US11409128B2 (en) 2016-08-29 2022-08-09 Mentor Acquisition One, Llc Adjustable nose bridge assembly for headworn computer
US10690936B2 (en) 2016-08-29 2020-06-23 Mentor Acquisition One, Llc Adjustable nose bridge assembly for headworn computer
US11366320B2 (en) 2016-09-08 2022-06-21 Mentor Acquisition One, Llc Optical systems for head-worn computers
US11604358B2 (en) 2016-09-08 2023-03-14 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10534180B2 (en) 2016-09-08 2020-01-14 Mentor Acquisition One, Llc Optical systems for head-worn computers
USD840395S1 (en) 2016-10-17 2019-02-12 Osterhout Group, Inc. Head-worn computer
US11709236B2 (en) 2016-12-27 2023-07-25 Samsung Semiconductor, Inc. Systems and methods for machine perception
USD947186S1 (en) 2017-01-04 2022-03-29 Mentor Acquisition One, Llc Computer glasses
USD918905S1 (en) 2017-01-04 2021-05-11 Mentor Acquisition One, Llc Computer glasses
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
US11042035B2 (en) 2017-07-24 2021-06-22 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US11668939B2 (en) 2017-07-24 2023-06-06 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US10578869B2 (en) 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US11226489B2 (en) 2017-07-24 2022-01-18 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US11789269B2 (en) 2017-07-24 2023-10-17 Mentor Acquisition One, Llc See-through computer display systems
US10422995B2 (en) 2017-07-24 2019-09-24 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US11409105B2 (en) 2017-07-24 2022-08-09 Mentor Acquisition One, Llc See-through computer display systems
US11567328B2 (en) 2017-07-24 2023-01-31 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US11550157B2 (en) 2017-07-24 2023-01-10 Mentor Acquisition One, Llc See-through computer display systems
US10969584B2 (en) 2017-08-04 2021-04-06 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
US11500207B2 (en) 2017-08-04 2022-11-15 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
CN108270920A (en) * 2018-01-16 2018-07-10 Shenzhen Gionee Communication Equipment Co., Ltd. Wallpaper setting method, terminal and computer-readable storage medium
US11016564B2 (en) 2018-03-10 2021-05-25 Chian Chiu Li System and method for providing information
US20200399861A1 (en) * 2018-03-23 2020-12-24 Kobelco Construction Machinery Co., Ltd. Construction machine
US11657166B2 (en) * 2018-11-21 2023-05-23 Rovi Guides, Inc. Intelligent display of content
US11372985B2 (en) * 2018-11-21 2022-06-28 Rovi Guides, Inc. Intelligent display of content
US20230252170A1 (en) * 2018-11-21 2023-08-10 Rovi Guides, Inc. Intelligent display of content
US20220284111A1 (en) * 2018-11-21 2022-09-08 Rovi Guides, Inc. Intelligent display of content
US10979769B2 (en) * 2019-02-25 2021-04-13 PreTechnology, Inc. Method and apparatus for monitoring and tracking consumption of digital content
US11803393B2 (en) * 2019-06-28 2023-10-31 AO Kaspersky Lab Systems and methods for automatic service activation on a computing device
US20200409720A1 (en) * 2019-06-28 2020-12-31 AO Kaspersky Lab Systems and methods for automatic service activation on a computing device
US11829059B2 (en) 2020-02-27 2023-11-28 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array
US11372320B2 (en) 2020-02-27 2022-06-28 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array
US11630907B2 (en) * 2020-03-30 2023-04-18 Salesforce, Inc. Live data viewing security
US11632587B2 (en) 2020-06-24 2023-04-18 The Nielsen Company (Us), Llc Mobile device attention detection
US20230291967A1 (en) * 2020-06-24 2023-09-14 The Nielsen Company (Us), Llc Mobile device attention detection
US11573620B2 (en) 2021-04-20 2023-02-07 Chian Chiu Li Systems and methods for providing information and performing task

Also Published As

Publication number Publication date
US10025380B2 (en) 2018-07-17
US20100079508A1 (en) 2010-04-01
US20140132508A1 (en) 2014-05-15

Similar Documents

Publication Title
US10025380B2 (en) Electronic devices with gaze detection capabilities
US10416748B2 (en) Method and apparatus for controlling an operation mode of a mobile terminal
US8068925B2 (en) Dynamic routing of audio among multiple audio devices
US8914559B2 (en) Methods and systems for automatic configuration of peripherals
US9619079B2 (en) Automated response to and sensing of user activity in portable devices
US11366510B2 (en) Processing method for reducing power consumption and mobile terminal
US8335549B2 (en) Method for power management of mobile communication terminal and mobile communication terminal using this method
WO2020029616A1 (en) Mobile terminal and power-saving mode control method therefor
CN1961488A (en) Sensor screen saver
US20110188675A1 (en) Electronic apparatus
CN111443803A (en) Mode switching method, device, storage medium and mobile terminal
CN112822001B (en) Control method of electronic equipment and electronic equipment
WO2022082951A1 (en) Frame rate setting method, apparatus, storage medium, and mobile terminal
WO2023029940A1 (en) Touch screen control method and related device
KR101913632B1 (en) Method and apparatus for controlling operating mode of portable terminal
KR101647039B1 (en) Method and apparatus for performing pedometer function for decreasing current consumption in mobile terminal
CN114967897A (en) Power consumption optimization method and device and mobile terminal

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION