US20080246734A1 - Body movement based usage of mobile device - Google Patents

Body movement based usage of mobile device

Info

Publication number
US20080246734A1
Authority
US
United States
Prior art keywords
input
user
mobile device
sensor
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/052,463
Inventor
Chi Ying Tsui
Ross David Murch
Roger Shu Kwan Cheng
Wai Ho Mow
Vincent Kin Nang Lau
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TUEN SOLUTIONS LLC
Original Assignee
Hong Kong University of Science and Technology HKUST
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hong Kong University of Science and Technology HKUST
Priority to US12/052,463
Assigned to THE HONG KONG UNIVERSITY OF SCIENCE AND TECHNOLOGY reassignment THE HONG KONG UNIVERSITY OF SCIENCE AND TECHNOLOGY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHENG, ROGER SHU KWAN, LAU, VINCENT KIN NANG, MOW, WAI HO, MURCH, ROSS DAVID, TSUI, CHI YING
Publication of US20080246734A1
Assigned to HONG KONG TECHNOLOGIES GROUP LIMITED reassignment HONG KONG TECHNOLOGIES GROUP LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: THE HONG KONG UNIVERSITY OF SCIENCE AND TECHNOLOGY
Assigned to TUEN SOLUTIONS LIMITED LIABILITY COMPANY reassignment TUEN SOLUTIONS LIMITED LIABILITY COMPANY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HONG KONG TECHNOLOGIES GROUP LIMITED
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F15/00Digital computers in general; Data processing equipment in general
    • G06F15/16Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J7/00Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
    • H02J7/0029Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries with safety or protection devices or circuits
    • H02J7/00302Overcharge protection
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J50/00Circuit arrangements or systems for wireless supply or distribution of electric power
    • H02J50/001Energy harvesting or scavenging
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J7/00Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
    • H02J7/32Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries for charging batteries from a charging set comprising a non-electric prime mover rotating at constant speed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/51Discovery or management thereof, e.g. service location protocol [SLP] or web services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W88/00Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02Terminal devices
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J2207/00Indexing scheme relating to details of circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
    • H02J2207/40Indexing scheme relating to details of circuit arrangements for charging or depolarising batteries or for supplying loads from batteries adapted for charging from various sources, e.g. AC, DC or multivoltage
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02NELECTRIC MACHINES NOT OTHERWISE PROVIDED FOR
    • H02N2/00Electric machines in general using piezoelectric effect, electrostriction or magnetostriction
    • H02N2/18Electric machines in general using piezoelectric effect, electrostriction or magnetostriction producing electrical output from mechanical input, e.g. generators
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B40/00Technologies aiming at improving the efficiency of home appliances, e.g. induction cooking or efficient technologies for refrigerators, freezers or dish washers

Definitions

  • the subject disclosure relates to user interaction with computing devices, and more particularly, to body-based user interaction with mobile devices.
  • Mobile devices such as cell phones, portable media players, personal digital assistants (PDAs), messaging devices, and portable game players are ubiquitous. Many users carry multiple mobile devices with them. Input devices for these mobile devices, however, are usually inadequate for a number of reasons.
  • With traditional keypads, there is a tradeoff between the size and the number of buttons.
  • For example, in today's cell phones, at one end of the spectrum the desire for a palm-sized device drives a full QWERTY keypad to the limit of button smallness, whereas at the other end of the spectrum character groupings for buttons (e.g., ABC->1) save space and allow for bigger buttons but may require tedious multiple presses to achieve input precision for a single character.
  • In addition, buttons on the embedded keypad often cannot be reassigned by the user to perform user-defined functions.
  • Audio input and command also suffers from disadvantages relating to precision of input, and may not be intuitive for different languages or dialects. Furthermore, audio input suffers from a lack of privacy and the potential to control other nearby mobile devices within earshot. Thus, audio input is inappropriate in a number of different environments (e.g., a public bus, office cubicles).
  • today's mobile devices usually do not share their user input devices with other computing devices, such as the user's other mobile devices.
  • a user's portable media player is often unaware that a user is making or receiving a call on the user's cell phone.
  • improved ways to interact with mobile devices are desirable.
  • Body movements or other body information are captured as input to one or more mobile devices, such as a cell phone. This type of interaction can be used instead of, or as a supplement to, traditional keypads and audio interaction with a mobile device.
  • the sensors can include orientation sensors that track movement of various parts of the body, which can be used to make input to the one or more mobile devices.
  • sensors are placed on or proximate to one or more places of the body (e.g., fingers, legs, torso, head, etc.) capturing some or all of a user's body movements and conditions.
  • the body movements sensed by the sensors can be used to make input to one or more mobile devices for a variety of tasks, such as making finger graffiti in the air for phone input to make phone calls, playing games such as boxing that sense a punch is thrown, receiving sports instruction (e.g., how to improve a golf swing or a bowling ball roll), and a variety of other tasks and services as well.
  • the sensors are advantageously wirelessly coupled to one or more mobile devices, although some or all of the sensors can be connected by wire to the mobile device.
  • the sensors can also advantageously be included as part of jewelry (e.g., rings, watches, bracelets, necklaces, pins, cuff links, etc.) or other fashion accessories (e.g., shoes, belts, foot bands, socks, pocket squares, head bands, hats, etc.).
  • certain gestures by the user can also initiate a series of actions and/or communications that cause an external response. For instance, one might drive up to their garage door, and through a body movement gesture captured by body sensors communicatively coupled to the mobile device, initiate a sequence of operations that opens the garage door.
  • one or more ring sensors for sensing finger movement are particularly advantageous because many people wear rings today, and such sensors can be used to determine what input is being conveyed for a virtual input device.
  • the ring sensors can determine what keys of a virtual “air” keyboard a user is pressing or where to move the cursor for a virtual mouse or trackball.
  • the sensor array is able to be turned on/off, such as automatically depending on the environment for privacy reasons and/or to prevent undesired actions from being performed.
  • the array can be automatically turned off when the user enters a bathroom if a sensor is available to detect a user's location in a bathroom.
  • body-based movement interaction can be turned off when it is detected that the user is performing an activity that involves body-movement unrelated to input to the mobile device (e.g., driving, dancing, playing an instrument, etc.).
  • FIG. 1 illustrates an exemplary non-limiting block diagram of a mobile device that can receive body movement based interaction.
  • FIG. 2 is an exemplary non-limiting block diagram showing illustrative aspects of embodiments in the context of body movement based usage of mobile device.
  • FIG. 3 is an exemplary non-limiting block diagram illustrating body movement based usage of multiple mobile devices.
  • FIGS. 4A-4C illustrate block diagrams of sensors that are embedded in jewelry or fashion accessories according to an embodiment.
  • FIG. 5 is an exemplary non-limiting block diagram showing illustrative aspects of embodiments in the context of body movement based usage of other computing devices and/or appliances via the mobile device according to one aspect.
  • FIGS. 6A-6C illustrate various exemplary body movements that can be used to initiate a process on a mobile device.
  • FIG. 7 is a flow diagram of virtual device input according to one aspect.
  • FIG. 8 is a diagram of an exemplary virtual input device that can be used to interact with a mobile device according to one embodiment.
  • FIG. 9 is a flow diagram showing illustrative aspects of embodiments in the context of body movement based usage of mobile device.
  • FIG. 10 is a flow diagram illustrating aspects of embodiments that use a virtual input device for input.
  • FIG. 11 is a flow diagram illustrating aspects of embodiments that automatically turn on/off body-based movement interaction depending on the user's environment.
  • FIG. 12 is a block diagram of an input processing component for a mobile device according to one aspect.
  • the invention captures body movements or other body information as input to a mobile device, such as a cell phone.
  • sensors are placed on one or more places of the body, e.g., fingers, legs, torso, head, etc., capturing all of a user's body movements and conditions.
  • the body movement sensed by the sensors can be used to make input to a mobile device for a variety of tasks, such as making finger graffiti in the air for phone input to make phone calls, playing games such as boxing that sense a punch is thrown, receiving sports instruction, e.g., how to improve a golf swing, or communicating a need to drink more fluids, etc., and a variety of other tasks and services as well.
  • Turning to FIG. 1 , an exemplary non-limiting mobile computing system environment in which the present invention may be implemented is illustrated. Even though a general-purpose mobile computing device is illustrated, one will appreciate that any mobile computing device is contemplated, including mobile computing devices implemented using multiple processors, a system on a chip, or wearable mobile devices. Although not required, the invention can partly be implemented via software (e.g., firmware). Software may be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers.
  • FIG. 1 thus illustrates an example of a mobile computing device.
  • the invention may be practiced with any suitable computing system environment 100 , but the computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100 .
  • an example of a computing device for implementing the invention includes a general-purpose mobile computing device in the form of a computer 110 .
  • Components of computer 110 may include, but are not limited to, a processing unit 120 , a system memory 130 , and a system bus 121 that couples various system components including the system memory to the processing unit 120 .
  • the system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • Computer 110 typically includes a variety of computer readable media.
  • Computer readable media can be any available media that can be accessed by computer 110 .
  • Computer readable media may comprise computer storage media and communication media.
  • Computer storage media includes volatile and nonvolatile as well as removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110 .
  • Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • the system memory 130 may include computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) and/or random access memory (RAM).
  • a basic input/output system (BIOS) containing the basic routines that help to transfer information between elements within computer 110 , such as during start-up, may be stored in memory 130 .
  • Memory 130 typically also contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120 .
  • memory 130 may also include an operating system, application programs, other program modules, and program data.
  • the computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media.
  • computer 110 could include a flash memory that reads from or writes to non-removable, nonvolatile media, a magnetic disk drive that reads from or writes to a removable, nonvolatile magnetic disk, and/or an optical disk drive that reads from or writes to a removable, nonvolatile optical disk, such as a CD-ROM or other optical media.
  • Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, digital versatile disks, digital video tape, solid state RAM, solid state ROM and the like.
  • a user may enter commands and information into the computer 110 through input devices.
  • Input devices are often connected to the processing unit 120 through user input 140 and associated interface(s) that are coupled to the system bus 121 , but may be connected by other interface and bus structures, in a wired or wireless manner, such as a parallel port, game port, a universal serial bus (USB), wireless USB, or Bluetooth.
  • a graphics subsystem may also be connected to the system bus 121 .
  • One or more remote sensors, including orientation sensors 145 , are also connected to system bus 121 via input interface 140 . At least one of the sensors is attached to or proximate to the user's body, and each sensor is communicatively coupled to the computer via wired or wireless means.
  • a monitor or other type of remote output device 155 may also be connected to the system bus 121 via an interface, such as output interface 150 , which may in turn communicate with video memory.
  • computer 110 may also include other peripheral output devices, which may be connected through output interface 150 .
  • the computer 110 may operate in a networked or distributed environment using logical connections to one or more other remote computers, such as remote computer 170 , which may in turn have capabilities different from device 110 .
  • the logical connections depicted in FIG. 1 include a network 171 .
  • the network 171 can include both the wireless network described herein as well as other networks, such as a personal area network (PAN), a local area network (LAN), or a wide area network (WAN).
  • When used in a PAN networking environment, the computer 110 is connected to the PAN through a network interface or adapter, such as a Bluetooth or Wireless USB adapter. When used in a LAN networking environment, the computer 110 is connected to the LAN through a network interface or adapter. When used in a WAN networking environment, the computer 110 typically includes a communications component, such as a modem, or other means for establishing communications over the WAN, such as the Internet.
  • a communications component such as a network interface card, which may be internal or external, wired or wireless, may be connected to the system bus 121 via the user input interface of input 140 , or other appropriate mechanism.
  • program modules, or portions thereof may be stored in a remote memory storage device.
  • FIG. 2 illustrates a variety of body sensors S 1 , S 2 , S 3 , S 4 , S 5 , S 6 , S 7 , etc., including but not limited to orientation sensors on body parts, making input to device 210 , which processes the input at 220 , e.g., tracks movement of various body parts via the sensors. If a response condition is detected at 230 with the input, automatic action is taken by device 210 on behalf of the user 200 at 240 . Response conditions can include, but are not limited to, predefined finger movements, digit taps, arm movements, head movements, etc. In at least some embodiments, some or all of the movements are user-defined. Automatic actions to be performed can be mobile device-dependent.
  • the automatic actions can include, but are not limited to, initiating a phone call, hanging up a phone call, calling a particular individual, retrieving information from the cell phone's phonebook, etc.
  • the automatic actions can include moving a game piece according to the body movement, quitting the game, saving the game, etc.
  • automatic actions for a portable media player can include resuming playback, stopping playback, pausing playback, start recording, fast forward, rewind, etc.
  • the sensors include orientation sensors that track movement of various parts of the body, which can be used to make input to the mobile device.
  • the device 210 can be a wearable computer or a more traditional handheld mobile device that is carried in a purse, pocket, or belt holder/holster.
  • one or more body-movement sensors are shared between multiple mobile devices.
  • this allows each mobile device to respond appropriately based on the user's movement. For example, assume that a user has both a cell phone and a portable video player. When body movements indicate the user is answering the cell phone, the portable video player can be paused. Once the user hangs up the cell phone, the portable video player can resume playback.
  • this functionality can be used in other scenarios and with different device pairs, such as a portable media player and a portable game player, a cell phone and a portable game player, etc.
  • FIG. 3 illustrates this scenario. Similar to FIG. 2 , user 300 has sensors S 1 -S 7 . However, these sensors are shared via communication framework 310 to mobile devices 320 and 330 .
  • the communication framework can be wired or wireless, such as Bluetooth, wireless USB, or a wired USB bus.
  • Mobile devices 320 and 330 share at least some of the sensors and can, as discussed supra, initiate different actions on each of the two devices.
  • any number of mobile devices can share one or more body sensors.
  • Sensors that sense body position or other user information can advantageously be placed in or attached to jewelry, such as rings, watches, or necklaces; fashion accessories, such as belts, headbands, or hats; or garments, such as shoes. Accordingly, there can be no need to look for or hold a special device when body-based movement is used for input.
  • FIGS. 4A-4C illustrate block diagrams of various sensors embedded in various jewelry and fashion accessories.
  • FIG. 4A illustrates a sensor S 3 placed into a ring 400 .
  • FIG. 4B illustrates a sensor s 6 embedded in a watch. The orientation of a finger can be sensed via the sensors in these two pieces of jewelry so that the direction a user is pointing can be determined by the mobile device.
  • FIG. 4C illustrates sensor S 6 embedded in a belt buckle of a belt 440 .
  • These sensors can be embedded upon manufacture of the jewelry or fashion accessories or can be attached subsequently, such as by the user or a retailer (e.g., jewelry retailer, clothing retailer, or mobile device retailer).
  • one or more sensors can be embedded or attached to traditional input/output devices for mobile devices that are worn by the user.
  • one or more sensors can be embedded into a Bluetooth or other wired/wireless headset or wireless/wired headphones.
  • certain gestures by the user can also initiate a series of actions and/or communications that cause an external response. For instance, one might drive up to their garage door, and through a body movement gesture captured by body sensors communicatively coupled to the mobile device, initiate a sequence of operations that opens the garage door.
  • one might operate remote-controlled devices (e.g., television, DVR, DVD, VCR, stereo, set-top box, etc.) through a body movement gesture captured by body sensors communicatively coupled to the mobile device and thereby initiate a sequence of operations on a nearby universal remote that operates a remote-controlled device.
  • FIG. 5 illustrates sensors S 1 -S 7 on or proximate to user 500 . These sensors are communicatively attached to mobile device 510 .
  • Mobile device 510 is then communicatively coupled to at least one of an appliance 520 (e.g., a garage door, smart refrigerator, smart microwave, oven, air conditioner, etc.) or a non-portable computing device 530 , such as a desktop or flat-panel display.
  • the mobile device 510 can initiate communication with and control appliance 520 or non-portable computing device 530 . This communication can include the command to perform and may also include identifying information for the mobile device 510 .
  • the mobile device can interpret pointing at an appliance or non-portable computing device as a desire to control the appliance or device.
  • a map (e.g., a 3D map) can be used to determine which appliance or computing device the user is pointing at.
  • the communicative coupling of the sensors to the mobile device can be different than the communicative coupling of the mobile device 510 to either the appliance or the non-portable computing device.
  • the sensors can be connected to the mobile device via Bluetooth and the mobile device to the appliance via wireless Ethernet.
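  • As a non-limiting illustration of the pointing interpretation described above, the following Python sketch resolves an orientation-derived pointing direction against a stored map of appliance positions; the appliance names, coordinates, angle threshold, and command fields are illustrative assumptions rather than details taken from the disclosure.

```python
import math

# Hypothetical 3D map of appliance positions (meters, room coordinates).
APPLIANCE_MAP = {
    "garage_door": (5.0, 0.0, 1.0),
    "television":  (2.0, 3.0, 1.2),
    "stereo":      (2.5, 3.0, 0.5),
}

def resolve_pointed_appliance(sensor_pos, pointing_dir, max_angle_deg=10.0):
    """Return the appliance closest to the pointing ray, or None.

    sensor_pos: (x, y, z) of the ring/watch sensor.
    pointing_dir: unit vector derived from the orientation sensor.
    """
    best, best_angle = None, max_angle_deg
    for name, pos in APPLIANCE_MAP.items():
        to_appliance = tuple(p - s for p, s in zip(pos, sensor_pos))
        norm = math.sqrt(sum(c * c for c in to_appliance))
        if norm == 0:
            continue
        cos_angle = sum(d * c for d, c in zip(pointing_dir, to_appliance)) / norm
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

def build_command(appliance, action, device_id="mobile-510"):
    # The command may also carry identifying information for the mobile device.
    return {"target": appliance, "action": action, "sender": device_id}

if __name__ == "__main__":
    target = resolve_pointed_appliance((0.0, 0.0, 1.0), (1.0, 0.0, 0.0))
    if target:
        print(build_command(target, "toggle_power"))   # points at the garage door
```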
  • Finger or arm movement sensors communicatively coupled to a mobile device are conducive to recognizing such unique hand gestures.
  • multiple sensors for different body locations can together form a single input to a mobile device.
  • doing “jumping jacks” requires movement of both the arms and legs in a certain harmonic manner, which could be detected as a unique gesture input to the mobile device by positioning sensors on both the arms and legs.
  • unique gestures may be defined by a user of the mobile device for performing common actions with the mobile device, such as “Call my friend Lee.”
  • FIGS. 6A-6C illustrate exemplary finger/hand gestures that can be used to initiate a process.
  • the hand gestures form a gesture-based input language. These hand gestures may need to occur within a predetermined period of time, such as 3 seconds. Depending on the number of sensors available, the same gesture can have a different meaning for different hands, different fingers, or different finger combinations.
  • body gestures can be used as well. For example, actions can be initiated based on a number of taps of fingers, feet, or hands. Some body gestures may involve the use of two different body areas, such as the “jumping jacks” movement above.
  • the gesture-based input language can be customized to the user's needs. For example, a person lacking one or more fingers may choose to use leg gestures instead of hand or finger gestures for gesture-based input. As a second example, a handicapped person may use a standardized sign language as a gesture-based input language.
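  • One way such a customizable gesture-based input language might be represented is sketched below in Python, assuming hypothetical primitive-movement names, placeholder action strings, and the 3-second completion window mentioned above; it is an illustration of the idea, not the recognizer the disclosure requires.

```python
import time

# User-editable gesture language: sequence of primitive movements -> action.
# Gesture names and actions are illustrative placeholders.
GESTURE_LANGUAGE = {
    ("index_tap", "index_tap"): "answer_call",
    ("fist_close", "wrist_flick"): "hang_up",
    ("thumb_circle",): "call_contact:Lee",
    ("arms_up", "legs_apart"): "start_workout_log",   # two body areas together
}

GESTURE_WINDOW_SECONDS = 3.0

class GestureRecognizer:
    def __init__(self, language=GESTURE_LANGUAGE, window=GESTURE_WINDOW_SECONDS):
        self.language = dict(language)
        self.window = window
        self.buffer = []          # (timestamp, primitive) pairs

    def customize(self, primitives, action):
        """Let the user bind their own movement sequence to an action."""
        self.language[tuple(primitives)] = action

    def feed(self, primitive, now=None):
        """Add a primitive movement; return an action if a gesture completes."""
        now = time.monotonic() if now is None else now
        self.buffer.append((now, primitive))
        # Drop primitives older than the completion window.
        self.buffer = [(t, p) for t, p in self.buffer if now - t <= self.window]
        recent = tuple(p for _, p in self.buffer)
        for seq, action in self.language.items():
            if recent[-len(seq):] == seq:
                self.buffer.clear()
                return action
        return None

if __name__ == "__main__":
    rec = GestureRecognizer()
    # A user lacking finger mobility could rebind the same action to leg taps.
    rec.customize(("left_leg_tap", "left_leg_tap"), "answer_call")
    print(rec.feed("index_tap", now=0.0))   # None (gesture not complete)
    print(rec.feed("index_tap", now=1.0))   # answer_call
```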
  • One or more ring sensors for sensing finger movement are particularly advantageous because many people wear rings today, and such sensors can be used to determine, for example, which keys of a virtual “air” keyboard a user is pressing. Key input of a mobile device is thus advantageously complemented or replaced by the supplemental mobile device input sensors of the invention.
  • FIG. 7 illustrates input via such virtual input devices according to one embodiment.
  • User 700 has sensors S 1 -S 7 and two output display devices O 1 and O 2 .
  • Output device O 1 , such as a wearable display, or output device O 2 , a mini-wearable projector, can be used to display a virtual input device, such as a keypad or mouse/trackball, for mobile device 710 .
  • the virtual input device is displayed at 720 .
  • Movement is tracked by sensors at 730 , and input corresponding to the tracked body movement determined at 740 . The movement can correspond to the pressing of a virtual keypad or the movement of the virtual mouse/trackball.
  • FIG. 8 illustrates an exemplary virtual “air” keyboard 800 that is displayed to the user of a cell phone via an output display device, such as O 1 or O 2 of FIG. 7 .
  • the illustrated keypad displayed via the output display device includes a standard telephone keypad 802 and four additional keys 804 , 806 , 808 , 810 . Additional keys 804 , 806 , 808 are user-defined. Key 810 allows body movement to be temporarily turned off, such as when the user is dancing, driving, or in a bathroom.
  • Although FIG. 8 illustrates additional keys being added to a traditional keypad layout for a telephone, various other layouts not including the traditional keys or key locations are possible in other embodiments.
  • a virtual keyboard with the letters in alphabetical order rather than a QWERTY keyboard can be created.
  • virtual keypads can be constructed that are entirely composed of virtual keys associated with user-defined functionality.
  • Although a virtual input device is described as being displayed to the user on an output display device, in other embodiments the virtual input device can be constructed from a paper printout of an input device if the layout is previously known to the mobile device and can be automatically determined (e.g., via picture recognition or via a barcode or other machine-readable tag). Accordingly, virtual keypads of any size, shape, or button configuration can be created.
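  • As a rough sketch of how a tracked fingertip position could be mapped onto a displayed (or printed) keypad layout known to the mobile device, the following assumes the layout is modeled as labeled rectangles in the keypad plane and that a press is signaled by a dip past an assumed depth threshold; the coordinates and key labels are illustrative only.

```python
# Each virtual key is a named rectangle in keypad-plane coordinates (centimeters).
# The layout below is an illustrative 3x4 telephone keypad plus user-defined keys.
KEYPAD_LAYOUT = {}
for row, keys in enumerate([("1", "2", "3"), ("4", "5", "6"),
                            ("7", "8", "9"), ("*", "0", "#")]):
    for col, label in enumerate(keys):
        KEYPAD_LAYOUT[label] = (col * 2.0, row * 2.0, 2.0, 2.0)  # x, y, w, h
KEYPAD_LAYOUT["user_defined_key"] = (6.5, 0.0, 2.0, 2.0)
KEYPAD_LAYOUT["body_input_off"] = (6.5, 6.0, 2.0, 2.0)

PRESS_DEPTH_CM = 1.0   # how far the fingertip must dip "into" the virtual plane

def resolve_key(finger_x, finger_y, finger_depth):
    """Return the label of the virtual key being pressed, or None."""
    if finger_depth < PRESS_DEPTH_CM:
        return None                       # hovering, not pressing
    for label, (x, y, w, h) in KEYPAD_LAYOUT.items():
        if x <= finger_x <= x + w and y <= finger_y <= y + h:
            return label
    return None

if __name__ == "__main__":
    print(resolve_key(3.1, 2.4, 1.2))   # -> "5"
    print(resolve_key(3.1, 2.4, 0.2))   # -> None (no press detected)
```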
  • FIGS. 9-11 methodologies that may be implemented in accordance with the present invention are illustrated. While, for purposes of simplicity of explanation, the methodologies are shown and described as a series of blocks, it is to be understood and appreciated that the present invention is not limited by the order of the blocks, as some blocks may, in accordance with the present invention, occur in different orders and/or concurrently with other blocks from that shown and described herein. Moreover, not all illustrated blocks may be required to implement the methodologies in accordance with the present invention.
  • FIG. 9 is an exemplary non-limiting flow diagram showing a method 900 for use in connection with a mobile device in accordance with the invention.
  • orientation input data is received relating to an orientation of body part(s) of a user from input sensor(s), including orientation sensor(s), attachable to or near the body part(s) of the user.
  • the movement of the input sensor(s) is tracked via the orientation sensor(s).
  • pattern(s) in the orientation input data generated by the at least one input sensor are recognized.
  • one or more processes are automatically initiated by the mobile device in response to the pattern(s) recognized in the input data.
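  • The following sketch strings the steps of method 900 together, assuming a simple template matcher over orientation deltas; the stored templates, the error threshold, and the process names are illustrative assumptions, not the recognition technique mandated by the disclosure.

```python
# Stored movement templates: pattern name -> sequence of (pitch, roll, yaw) deltas.
TEMPLATES = {
    "finger_graffiti_2": [(0.0, 0.2, 0.0), (0.1, 0.2, 0.0), (0.2, 0.0, 0.1)],
    "punch_thrown":      [(0.8, 0.0, 0.0), (0.9, 0.1, 0.0), (0.7, 0.0, 0.0)],
}
PROCESSES = {
    "finger_graffiti_2": lambda: print("dial digit 2"),
    "punch_thrown":      lambda: print("register punch in boxing game"),
}
MATCH_THRESHOLD = 0.5   # total absolute difference allowed (assumed)

def track_movement(samples):
    """Turn raw orientation samples from the sensors into per-step deltas."""
    return [tuple(b - a for a, b in zip(s0, s1))
            for s0, s1 in zip(samples, samples[1:])]

def recognize(deltas):
    """Compare the tracked movement against each stored pattern template."""
    for name, template in TEMPLATES.items():
        if len(deltas) < len(template):
            continue
        window = deltas[-len(template):]
        error = sum(abs(d - t) for dv, tv in zip(window, template)
                    for d, t in zip(dv, tv))
        if error < MATCH_THRESHOLD:
            return name
    return None

def on_orientation_samples(samples):
    """Initiate the process associated with a recognized pattern, if any."""
    name = recognize(track_movement(samples))
    if name:
        PROCESSES[name]()

if __name__ == "__main__":
    on_orientation_samples([(0.0, 0.0, 0.0), (0.0, 0.2, 0.0),
                            (0.1, 0.4, 0.0), (0.3, 0.4, 0.1)])  # dial digit 2
```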
  • FIG. 10 is an exemplary non-limiting flow diagram of a method 1000 for using a virtual input device.
  • the virtual input device is displayed to the user. As discussed supra, this display can be via an output display device communicatively coupled to the mobile device, or the virtual input device may be displayed on a piece of paper.
  • movement of input sensors is tracked via one or more orientation sensors with regard to the displayed virtual input device.
  • input data from the input sensor and information associated with the displayed virtual input device are used to determine the user input.
  • this method can be performed repeatedly to receive multiple inputs (e.g., multiple key inputs) from the virtual input device.
  • FIG. 11 is an exemplary non-limiting flow diagram of a method 1100 of automatically turning on/off body-based movement as input depending on the user's environment.
  • Some user environments based on the user location (e.g., bathroom) or current activity (e.g., driving, dancing) are not conducive to body-based movement being used as input.
  • body-based movement can also be turned on/off manually by the user.
  • the user's current environment is determined based on one or more sensors, such as global positioning sensors or sensors that read location-specific identifiers (e.g., RFID tags or wireless network gateway MAC addresses, etc.). This can be performed periodically, such as once every minute, or be determined based on the amount or nature of body movements.
  • body-based movement is turned on/off as appropriate.
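  • A minimal sketch of the on/off decision of method 1100 is shown below, assuming hypothetical location identifiers (an RFID tag for a bathroom, a wireless gateway MAC address) and a GPS-derived speed check for driving; which identifiers mark an environment as unsuitable would in practice be configurable, and a manual override corresponds to turning body-based input off by the user.

```python
# Location-specific identifiers that mark environments where body-based
# input should be suspended (illustrative values only).
BLOCKED_TAGS = {"rfid:bathroom-12", "wifi-gw:00:11:22:33:44:55"}
DRIVING_SPEED_MPS = 6.0   # above this GPS speed, assume the user is driving

class BodyInputGate:
    def __init__(self):
        self.enabled = True
        self.manual_override = None   # user can force body input on or off

    def update(self, visible_tags, gps_speed_mps):
        """Periodically (e.g., once a minute) decide whether body input stays on."""
        if self.manual_override is not None:
            self.enabled = self.manual_override
            return self.enabled
        in_blocked_place = bool(BLOCKED_TAGS & set(visible_tags))
        driving = gps_speed_mps is not None and gps_speed_mps > DRIVING_SPEED_MPS
        self.enabled = not (in_blocked_place or driving)
        return self.enabled

if __name__ == "__main__":
    gate = BodyInputGate()
    print(gate.update(visible_tags=["rfid:kitchen-3"], gps_speed_mps=0.5))   # True
    print(gate.update(visible_tags=["rfid:bathroom-12"], gps_speed_mps=0.0)) # False
    print(gate.update(visible_tags=[], gps_speed_mps=15.0))                  # False
```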
  • FIG. 12 is a block diagram of an input processing component 1210 according to one embodiment.
  • the sensor tracking component 1214 is communicatively coupled to at least one input sensor that tracks the movement of the input sensor via an orientation sensor.
  • the pattern recognition component 1212 then recognizes at least one pattern in input data generated by at least one input sensor.
  • one or more processes are automatically initiated by the mobile device via the input indication component 1216 .
  • a user-defined function component allows a user to store and manage body gestures that will be recognized by the pattern recognition component and to associate those gestures with processes to be automatically performed when the pattern is recognized.
  • Optional virtual input device component 1220 facilitates display of virtual devices and stores and manages corresponding processes to perform when certain body gestures are received.
  • the virtual input device component can store the layout of one or more virtual keypads.
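  • As a structural sketch of how the components of FIG. 12 might be wired together, the class and method names below are illustrative rather than taken from the disclosure: sensor tracking feeds pattern recognition, the input indication component initiates the associated process, and the user-defined function component supplies the gesture-to-process bindings.

```python
class SensorTrackingComponent:
    """Collects movement data from the communicatively coupled input sensors."""
    def __init__(self, sensors):
        self.sensors = sensors
    def read(self):
        return [s.sample() for s in self.sensors]

class UserDefinedFunctionComponent:
    """Stores and manages user-defined gesture -> process bindings."""
    def __init__(self):
        self.bindings = {}
    def bind(self, pattern_name, process):
        self.bindings[pattern_name] = process

class PatternRecognitionComponent:
    """Recognizes patterns in the tracked sensor data."""
    def __init__(self, known_patterns):
        self.known_patterns = known_patterns
    def recognize(self, samples):
        for name, predicate in self.known_patterns.items():
            if predicate(samples):
                return name
        return None

class InputIndicationComponent:
    """Initiates the process associated with a recognized pattern."""
    def __init__(self, bindings):
        self.bindings = bindings
    def indicate(self, pattern_name):
        process = self.bindings.get(pattern_name)
        if process:
            process()

class InputProcessingComponent:
    """Ties the pieces together for one mobile device."""
    def __init__(self, sensors, known_patterns):
        self.user_defined = UserDefinedFunctionComponent()
        self.tracking = SensorTrackingComponent(sensors)
        self.recognition = PatternRecognitionComponent(known_patterns)
        self.indication = InputIndicationComponent(self.user_defined.bindings)
    def step(self):
        name = self.recognition.recognize(self.tracking.read())
        if name:
            self.indication.indicate(name)

if __name__ == "__main__":
    class FakeRingSensor:
        def sample(self):
            return ("index_tap",)
    ipc = InputProcessingComponent(
        sensors=[FakeRingSensor()],
        known_patterns={"double_tap": lambda samples: ("index_tap",) in samples})
    ipc.user_defined.bind("double_tap", lambda: print("answer call"))
    ipc.step()   # prints "answer call"
```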
  • a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • both an application running on a computer and the computer itself can be a component.
  • One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
  • the methods and apparatus of the present invention may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
  • In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
  • program modules include routines, programs, objects, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • functionality of the program modules may be combined or distributed as desired in various embodiments.
  • various portions of the disclosed systems above and methods below may include or consist of artificial intelligence or knowledge or rule based components, sub-components, processes, means, methodologies, or mechanisms (e.g., support vector machines, neural networks, expert systems, Bayesian belief networks, fuzzy logic, data fusion engines, classifiers . . . ).
  • Such components can automate certain mechanisms or processes performed thereby to make portions of the systems and methods more adaptive as well as efficient and intelligent.
  • the disclosed subject matter may be implemented at least partially as a system, method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer or processor based device to implement aspects detailed herein.
  • “Article of manufacture,” “computer program product,” or similar terms, where used herein, are intended to encompass a computer program accessible from any computer-readable device, carrier, or media.
  • computer readable media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips . . . ), optical disks (e.g., compact disk (CD), digital versatile disk (DVD) . . . ), smart cards, and flash memory devices (e.g., card, stick).
  • a carrier wave can be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN).

Abstract

Body-based sensors are used to interact with one or more mobile devices. This interaction can be an alternative to or a supplement to traditional input methods on mobile devices. In order to facilitate everyday use, sensors can be hidden in jewelry or other fashion accessories normally worn by a user and wirelessly coupled to the mobile devices. When certain body movement is detected, a mobile device can automatically initiate one or more processes that perform various actions, such as actions on that mobile device or actions on communicatively coupled devices. The body movements can be user-defined so that a user can customize his interaction with the mobile device to meet the user's particular needs. The sensor array can also be adapted to turn on or off depending on the current environment.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This non-provisional application claims benefit under 35 U.S.C. §119(e) of U.S. provisional Application No. 60/910,109, filed Apr. 4, 2007.
  • TECHNICAL FIELD
  • The subject disclosure relates to user interaction with computing devices, and more particularly, to body-based user interaction with mobile devices.
  • BACKGROUND
  • Mobile devices, such as cell phones, portable media players, personal digital assistants (PDAs), messaging devices, and portable game players, are ubiquitous. Many users carry multiple mobile devices with them. Input devices for these mobile devices, however, are usually inadequate for a number of reasons.
  • With traditional keypads, there is a tradeoff between the size and the number of buttons. For example, in today's cell phones, at one end of the spectrum the desire for a palm-sized device drives a full QWERTY keypad to the limit of button smallness, whereas at the other end of the spectrum character groupings for buttons (e.g., ABC->1) save space and allow for bigger buttons but may require tedious multiple presses to achieve input precision for a single character.
  • Moreover, since keypads are often integrated into the devices, user input often cannot be personalized for a particular user of the device. Therefore, people with limited eyesight, for instance, often have problems finding devices with large buttons to meet their needs. In addition, buttons on the embedded keypad often cannot be reassigned by the user to perform user-defined functions.
  • Audio input and command also suffers from disadvantages relating to precision of input, and may not be intuitive for different languages or dialects. Furthermore, audio input suffers from a lack of privacy and the potential to control other nearby mobile devices within earshot. Thus, audio input is inappropriate in a number of different environments (e.g., a public bus, office cubicles).
  • In addition, today's mobile devices usually do not share their user input devices with other computing devices, such as the user's other mobile devices. For example, a user's portable media player is often unaware that a user is making or receiving a call on the user's cell phone. Thus, improved ways to interact with mobile devices are desirable.
  • The above-described deficiencies of interacting with mobile devices are merely intended to provide an overview of some of the problems of interacting with today's mobile devices, and are not intended to be exhaustive. Other problems with the state of the art may become further apparent upon review of the description of various non-limiting embodiments that follows.
  • SUMMARY
  • The following presents a simplified summary in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is intended to neither identify key or critical elements of the invention nor delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description that is presented later.
  • Body movements or other body information are captured as input to one or more mobile devices, such as a cell phone. This type of interaction can be used instead of, or as a supplement to, traditional keypads and audio interaction with a mobile device. The sensors can include orientation sensors that track movement of various parts of the body, which can be used to make input to the one or more mobile devices.
  • In various embodiments, sensors are placed on or proximate to one or more places of the body (e.g., fingers, legs, torso, head, etc.), capturing some or all of a user's body movements and conditions. The body movements sensed by the sensors can be used to make input to one or more mobile devices for a variety of tasks, such as making finger graffiti in the air for phone input to make phone calls, playing games such as boxing that sense a punch is thrown, receiving sports instruction (e.g., how to improve a golf swing or a bowling ball roll), and a variety of other tasks and services as well.
  • The sensors are advantageously wirelessly coupled to one or more mobile devices, although some or all of the sensors can be connected by wire to the mobile device. The sensors can also advantageously be included as part of jewelry (e.g., rings, watches, bracelets, necklaces, pins, cuff links, etc.) or other fashion accessories (e.g., shoes, belts, foot bands, socks, pocket squares, head bands, hats, etc.).
  • In one embodiment, as mobile devices become connected to other autonomous computing devices and networks, certain gestures by the user can also initiate a series of actions and/or communications that cause an external response. For instance, one might drive up to their garage door, and through a body movement gesture captured by body sensors communicatively coupled to the mobile device, initiate a sequence of operations that opens the garage door.
  • In one embodiment, one or more ring sensors for sensing finger movement are particularly advantageous because many people wear rings today, and such sensors can be used to determine what input is being conveyed for a virtual input device. For example, the ring sensors can determine what keys of a virtual “air” keyboard a user is pressing or where to move the cursor for a virtual mouse or trackball.
  • In at least some embodiments, the sensor array is able to be turned on/off, such as automatically depending on the environment for privacy reasons and/or to prevent undesired actions from being performed. For example, the array can be automatically turned off when the user enters a bathroom if a sensor is available to detect a user's location in a bathroom. Similarly, body-based movement interaction can be turned off when it is detected that the user is performing an activity that involves body-movement unrelated to input to the mobile device (e.g., driving, dancing, playing an instrument, etc.).
  • To the accomplishment of the foregoing and related ends, certain illustrative aspects of the invention are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles of the invention may be employed and the present invention is intended to include all such aspects and their equivalents. Other advantages and novel features of the invention may become apparent from the following detailed description of the invention when considered in conjunction with the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an exemplary non-limiting block diagram of a mobile device that can receive body movement based interaction.
  • FIG. 2 is an exemplary non-limiting block diagram showing illustrative aspects of embodiments in the context of body movement based usage of mobile device.
  • FIG. 3 is an exemplary non-limiting block diagram illustrating body movement based usage of multiple mobile devices.
  • FIGS. 4A-4C illustrate block diagrams of sensors that are embedded in jewelry or fashion accessories according to an embodiment.
  • FIG. 5 is an exemplary non-limiting block diagram showing illustrative aspects of embodiments in the context of body movement based usage of other computing devices and/or appliances via the mobile device according to one aspect.
  • FIGS. 6A-6C illustrate various exemplary body movements that can be used to initiate a process on a mobile device.
  • FIG. 7 is a flow diagram of virtual device input according to one aspect.
  • FIG. 8 is a diagram of an exemplary virtual input device that can be used to interact with a mobile device according to one embodiment.
  • FIG. 9 is a flow diagram showing illustrative aspects of embodiments in the context of body movement based usage of mobile device.
  • FIG. 10 is a flow diagram illustrating aspects of embodiments that use a virtual input device for input.
  • FIG. 11 is a flow diagram illustrating aspects of embodiments that automatically turn on/off body-based movement interaction depending on the user's environment.
  • FIG. 12 is a block diagram of an input processing component for a mobile device according to one aspect.
  • DETAILED DESCRIPTION
  • The present invention is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It may be evident, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the present invention.
  • As discussed in the background, today, input devices for cell phones are inadequate due to the technological and engineering tradeoffs involved. In consideration of these limitations on input techniques for mobile devices, the invention captures body movements or other body information as input to a mobile device, such as a cell phone. In various embodiments, sensors are placed on one or more places of the body, e.g., fingers, legs, torso, head, etc., capturing all of a user's body movements and conditions. The body movement sensed by the sensors can be used to make input to a mobile device for a variety of tasks, such as making finger graffiti in the air for phone input to make phone calls, playing games such as boxing that sense a punch is thrown, receiving sports instruction, e.g., how to improve a golf swing, or communicating a need to drink more fluids, etc., and a variety of other tasks and services as well.
  • Turning to FIG. 1, an exemplary non-limiting mobile computing system environment in which the present invention may be implemented is illustrated. Even though a general-purpose mobile computing device is illustrated, one will appreciate that any mobile computing device is contemplated, including mobile computing devices implemented using multiple processors, a system on a chip, or wearable mobile devices. Although not required, the invention can partly be implemented via software (e.g., firmware). Software may be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers.
  • FIG. 1 thus illustrates an example of a mobile computing device. Those skilled in the art will appreciate that the invention may be practiced with any suitable computing system environment 100, but the computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
  • With reference to FIG. 1, an example of a computing device for implementing the invention includes a general-purpose mobile computing device in the form of a computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile as well as removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • The system memory 130 may include computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) and/or random access memory (RAM). A basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, may be stored in memory 130. Memory 130 typically also contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, memory 130 may also include an operating system, application programs, other program modules, and program data.
  • The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. For example, computer 110 could include a flash memory that reads from or writes to non-removable, nonvolatile media, a magnetic disk drive that reads from or writes to a removable, nonvolatile magnetic disk, and/or an optical disk drive that reads from or writes to a removable, nonvolatile optical disk, such as a CD-ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, digital versatile disks, digital video tape, solid state RAM, solid state ROM and the like.
  • A user may enter commands and information into the computer 110 through input devices. Input devices are often connected to the processing unit 120 through user input 140 and associated interface(s) that are coupled to the system bus 121, but may be connected by other interface and bus structures, in a wired or wireless manner, such as a parallel port, game port, a universal serial bus (USB), wireless USB, or Bluetooth. A graphics subsystem may also be connected to the system bus 121. One or more remote sensors, including orientation sensors 145, are also connected to system bus 121 via input interface 140. At least one of the sensors is attached to or proximate to the user's body, and each sensor is communicatively coupled to the computer via wired or wireless means. A monitor or other type of remote output device 155 may also be connected to the system bus 121 via an interface, such as output interface 150, which may in turn communicate with video memory. In addition to a monitor, computer 110 may also include other peripheral output devices, which may be connected through output interface 150.
  • The computer 110 may operate in a networked or distributed environment using logical connections to one or more other remote computers, such as remote computer 170, which may in turn have capabilities different from device 110. The logical connections depicted in FIG. 1 include a network 171. The network 171 can include both the wireless network described herein as well as other networks, such as a personal area network (PAN), a local area network (LAN), or a wide area network (WAN).
  • When used in a PAN networking environment, the computer 110 is connected to the PAN through a network interface or adapter, such as a Bluetooth or Wireless USB adapter. When used in a LAN networking environment, the computer 110 is connected to the LAN through a network interface or adapter. When used in a WAN networking environment, the computer 110 typically includes a communications component, such as a modem, or other means for establishing communications over the WAN, such as the Internet. A communications component, such as a network interface card, which may be internal or external, wired or wireless, may be connected to the system bus 121 via the user input interface of input 140, or other appropriate mechanism. In a networked environment, program modules, or portions thereof, may be stored in a remote memory storage device.
  • FIG. 2 illustrates a variety of body sensors S1, S2, S3, S4, S5, S6, S7, etc., including but not limited to orientation sensors on body parts, making input to device 210, which processes the input at 220, e.g., tracks movement of various body parts via the sensors. If a response condition is detected at 230 with the input, automatic action is taken by device 210 on behalf of the user 200 at 240. Response conditions can include, but are not limited to, predefined finger movements, digit taps, arm movements, head movements, etc. In at least some embodiments, some or all of the movements are user-defined. Automatic actions to be performed can be mobile device-dependent. For a cell phone, the automatic actions can include, but are not limited to, initiating a phone call, hanging up a phone call, calling a particular individual, retrieving information from the cell phone's phonebook, etc. For a portable game player, the automatic actions can include moving a game piece according to the body movement, quitting the game, saving the game, etc. As an additional example, automatic actions for a portable media player can include resuming playback, stopping playback, pausing playback, start recording, fast forward, rewind, etc.
  • Otherwise, device 210 continues to monitor the input from the sensors S1 to S7. In one embodiment, the sensors include orientation sensors that track movement of various parts of the body, and this tracked movement can be used as input to the mobile device. The device 210 can be a wearable computer or a more traditional handheld mobile device that is carried in a purse, pocket, or belt holder/holster.
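  • As a purely illustrative sketch of the monitor/detect/act cycle of FIG. 2 (the sensor sample format, response-condition names, and action names below are hypothetical and not taken from the description), the loop might look like the following in Python:

    # Illustrative sketch of the FIG. 2 cycle: monitor sensors (220), detect a
    # response condition (230), then take automatic action (240).
    # Sample format, condition names, and actions are hypothetical.
    RESPONSE_ACTIONS = {
        "index_finger_double_tap": "answer_call",     # cell phone example
        "wrist_flick": "hang_up_call",
        "head_nod": "resume_playback",                # media player example
        "head_shake": "pause_playback",
    }

    def detect_response_condition(samples):
        """Return the name of a recognized response condition, or None.
        A real device would classify raw orientation data; this sketch assumes
        each sample may already carry a gesture label."""
        for sample in samples:
            label = sample.get("gesture")
            if label in RESPONSE_ACTIONS:
                return label
        return None

    def process_sensor_input(sensor_samples, perform_action):
        """If a response condition is detected, initiate the mapped action;
        otherwise the device simply keeps monitoring the sensors."""
        condition = detect_response_condition(sensor_samples)
        if condition is not None:
            perform_action(RESPONSE_ACTIONS[condition])

    # One batch of hypothetical, already-labeled samples:
    process_sensor_input([{"gesture": "index_finger_double_tap"}], print)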
  • In at least one embodiment, one or more body-movement sensors are shared between multiple mobile devices. Advantageously, this allows each mobile device to respond appropriately based on the user's movement. For example, assume that a user has both a cell phone and a portable video player. When body movements indicate the user is answering the cell phone, the portable video player can be paused. Once the user hangs up the cell phone, the portable video player can resume playback. One will appreciate that this functionality can be used in other scenarios and with different device pairs, such as a portable media player and a portable game player, a cell phone and a portable game player, etc.
  • FIG. 3 illustrates this scenario. Similar to FIG. 2, user 300 has sensors S1-S7. However, these sensors are shared via communication framework 310 with mobile devices 320 and 330. The communication framework can be wired or wireless, such as Bluetooth, wireless USB, or a wired USB bus. Mobile devices 320 and 330 share at least some of the sensors and can, as discussed supra, initiate different actions on each of the two devices. One will appreciate that although only two devices are shown and described for the sake of simplicity, any number of mobile devices can share one or more body sensors.
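  • One way to picture the sharing of FIG. 3, offered only as a hedged sketch with invented class and gesture names, is a small hub that forwards each recognized body-movement event to every registered mobile device, each of which reacts in its own way (the cell phone answers while the video player pauses):

    # Hypothetical sketch of body sensors shared across devices (FIG. 3).
    class SensorHub:
        """Stands in for communication framework 310 in this sketch."""
        def __init__(self):
            self.devices = []

        def register(self, device):
            self.devices.append(device)

        def publish(self, gesture):
            # Every registered mobile device sees the same body-movement event.
            for device in self.devices:
                device.on_gesture(gesture)

    class CellPhone:
        def on_gesture(self, gesture):
            if gesture == "answer_call":
                print("cell phone: answering call")

    class VideoPlayer:
        def __init__(self):
            self.playing = True

        def on_gesture(self, gesture):
            # Pause while the user answers the phone; resume after hang-up.
            if gesture == "answer_call":
                self.playing = False
            elif gesture == "hang_up":
                self.playing = True

    hub = SensorHub()
    hub.register(CellPhone())
    hub.register(VideoPlayer())
    hub.publish("answer_call")   # the phone answers and the player pauses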
  • Sensors that sense body position or other user information can be advantageously placed in or attached to jewelry, such as rings, watches, or necklaces, fashion accessories, such as belts, headbands, or hats, or garments, such as shoes. Accordingly, there may be no need to look for or hold a special input device when body-based movement is used for input.
  • FIGS. 4A-4C illustrate block diagrams of various sensors embedded in various pieces of jewelry and fashion accessories. In particular, FIG. 4A illustrates a sensor S3 placed into a ring 400. FIG. 4B illustrates a sensor S6 embedded in a watch. The orientation of a finger can be sensed via the sensors in these two pieces of jewelry so that the direction a user is pointing can be determined by the mobile device. FIG. 4C illustrates sensor S6 embedded in a belt buckle of a belt 440. These sensors can be embedded upon manufacture of the jewelry or fashion accessories or can be attached subsequently, such as by the user or a retailer (e.g., a jewelry retailer, clothing retailer, or mobile device retailer).
  • One will also appreciate that one or more sensors can be embedded in or attached to traditional input/output devices for mobile devices that are worn by the user. For example, one or more sensors can be embedded into a Bluetooth or other wired/wireless headset or wired/wireless headphones.
  • As mobile devices become connected to other autonomous computing devices and networks, certain gestures by the user can also initiate a series of actions and/or communications that cause an external response. For instance, one might drive up to their garage door, and through a body movement gesture captured by body sensors communicatively coupled to the mobile device, initiate a sequence of operations that opens the garage door. As a second example, one might operate remote-controlled devices (e.g., television, DVR, DVD, VCR, stereo, set-top box, etc.) through a body movement gesture captured by body sensors communicatively coupled to the mobile device and initiate a sequence of operations on a nearby universal remote that operates a remote-controlled device.
  • This aspect is illustrated in FIG. 5. FIG. 5 illustrates sensors S1-S7 on or proximate to user 500. These sensors are communicatively attached to mobile device 510. Mobile device 510 is then communicatively coupled to at least one of an appliance 520 (e.g., a garage door, smart refrigerator, smart microwave, oven, air conditioner, etc.) or a non-portable computing device 530, such as a desktop computer or flat-panel display. Upon receiving a specific body gesture, the mobile device 510 can initiate communication with and control appliance 520 or non-portable computing device 530. This communication can include the command to perform and may also include identifying information for the mobile device 510.
  • For example, the mobile device can interpret pointing at an appliance or non-portable computing device as a desire to control the appliance or device. By combining the orientation of a finger/hand with a map (e.g., a 3D map) of nearby objects, one can activate/operate the appliance or computing device with a natural pointing gesture. For example, one can point to an air conditioner thermostat at a distance to turn the air conditioning on/off in an indoor environment, even without a specific remote controller at hand.
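  • As a minimal sketch of the pointing-plus-map idea (the object names, coordinates, and angular selection rule below are assumptions for illustration), the sensed finger orientation can be treated as a ray, and the mapped object lying closest to that ray taken as the target:

    import math

    # Hypothetical 3D map of nearby controllable objects: name -> (x, y, z).
    NEARBY_OBJECTS = {
        "garage_door": (5.0, 0.0, 2.0),
        "thermostat": (0.0, 4.0, 1.5),
        "television": (-3.0, 2.0, 1.0),
    }

    def pointed_object(origin, direction, objects=NEARBY_OBJECTS):
        """Return the object most nearly along the pointing ray, where `origin`
        is the sensed hand position and `direction` the sensed finger
        orientation (both assumed to come from the body sensors)."""
        norm = math.sqrt(sum(c * c for c in direction))
        unit = tuple(c / norm for c in direction)
        best, best_angle = None, math.inf
        for name, position in objects.items():
            to_obj = tuple(p - o for p, o in zip(position, origin))
            dist = math.sqrt(sum(c * c for c in to_obj))
            cos_angle = sum(u * t for u, t in zip(unit, to_obj)) / dist
            angle = math.acos(max(-1.0, min(1.0, cos_angle)))
            if angle < best_angle:
                best, best_angle = name, angle
        return best

    # Pointing roughly toward the thermostat:
    print(pointed_object((0.0, 0.0, 1.5), (0.0, 1.0, 0.0)))  # -> "thermostat"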
  • The communicative coupling of the sensors to the mobile device can be different than the communicative coupling of the mobile device 510 to either the appliance or the non-portable computing device. For example, the sensors can be connected to the mobile device via Bluetooth and the mobile device to the appliance via wireless Ethernet.
  • Finger or arm movement sensors communicatively coupled to a mobile device are conducive to recognizing unique hand gestures. In this manner, multiple sensors at different body locations can together form a single input to a mobile device. As a simple example, doing “jumping jacks” requires movement of both the arms and legs in a certain harmonic manner, which could be detected as a unique gesture input to the mobile device by positioning sensors on both the arms and legs. In this fashion, unique gestures may be defined by a user of the mobile device for performing common actions with the mobile device, such as “Call my friend Lee.”
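  • The “jumping jacks” example can be made concrete with a toy sketch (the sample format, midpoint angle, and cycle count are invented): a composite gesture is reported only when both the arm sensors and the leg sensors show the expected repeated swings in the same capture window.

    # Toy composite-gesture check combining arm and leg sensors (hypothetical data).
    def is_jumping_jacks(arm_angles, leg_angles, min_cycles=3):
        """Count one 'cycle' per swing above and back below a midpoint angle;
        both limbs must complete enough cycles for the gesture to register."""
        def count_cycles(angles, midpoint=45.0):
            cycles, above = 0, False
            for angle in angles:
                if angle > midpoint and not above:
                    above = True
                elif angle < midpoint and above:
                    above = False
                    cycles += 1
            return cycles
        return (count_cycles(arm_angles) >= min_cycles and
                count_cycles(leg_angles) >= min_cycles)

    arms = [10, 80, 10, 85, 5, 90, 10]   # degrees reported by the arm sensors
    legs = [20, 60, 15, 70, 10, 65, 20]  # degrees reported by the leg sensors
    if is_jumping_jacks(arms, legs):
        print("composite gesture recognized: Call my friend Lee")  # user-defined action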
  • FIGS. 6A-6C illustrate exemplary finger/hand gestures that can be used to initiate a process. The hand gestures form a gesture-based input language. These hand gestures may need to occur within a predetermined period of time, such as 3 seconds. Depending on the number of sensors available, the same gesture can have a different meaning for different hands, different fingers, or different finger combinations.
  • One will appreciate however, that other body gestures can be used as well. For example, actions can be initiated based on a number of taps of fingers, feet, or hands. Some body gestures may involve the use of two different body areas, such as the “jumping jacks” movement above.
  • In addition, the gesture-based input language can be customized to the user's needs. For example, a person lacking one or more fingers may choose to use leg gestures instead of hand or finger gestures for gesture-based input. As a second example, a handicapped person may use a standardized sign language as a gesture-based input language.
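  • A hedged sketch of such a customizable gesture-based input language follows; the binding interface, gesture names, and time-window handling are illustrative assumptions. The user registers whichever gestures suit them (hand, leg, or sign-language derived) against actions, and a gesture is accepted only if it completes within the predetermined period mentioned above.

    # Hypothetical user-customizable gesture language with a completion-time window.
    GESTURE_WINDOW_SECONDS = 3.0

    class GestureLanguage:
        def __init__(self):
            self.bindings = {}  # gesture name -> action callable

        def bind(self, gesture, action):
            """Let the user define (or redefine) which gesture triggers which action."""
            self.bindings[gesture] = action

        def handle(self, gesture, started_at, finished_at):
            # Reject gestures that took longer than the allowed window.
            if finished_at - started_at > GESTURE_WINDOW_SECONDS:
                return False
            action = self.bindings.get(gesture)
            if action is None:
                return False
            action()
            return True

    language = GestureLanguage()
    # A user who prefers leg gestures binds them instead of finger gestures.
    language.bind("left_leg_double_tap", lambda: print("open phonebook"))
    language.handle("left_leg_double_tap", started_at=0.0, finished_at=1.2)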
  • One or more ring sensors for sensing finger movement are particularly advantageous because many people already wear rings, and such sensors can be used to determine, for example, which keys of a virtual “air” keyboard a user is pressing. Key input of a mobile device is thus advantageously complemented or replaced by the supplemental mobile device input sensors of the invention.
  • FIG. 7 illustrates input via such virtual input devices according to one embodiment. User 700 has sensors S1-S7 and two output display devices O1 and O2. Output device O1, such as a wearable display, or output device O2, such as a mini wearable projector, can be used to display a virtual input device, such as a keypad or mouse/trackball, for mobile device 710. The virtual input device is displayed at 720. Movement is tracked by the sensors at 730, and input corresponding to the tracked body movement is determined at 740. The movement can correspond to the pressing of a virtual keypad or the movement of the virtual mouse/trackball.
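  • The step of determining input from the tracked movement (740) can be pictured, under assumed key geometry and fingertip coordinates, as a simple hit test against the displayed virtual keypad:

    # Hypothetical hit test: which virtual key lies under the tracked fingertip?
    # Each key is (label, x, y, width, height) in the displayed keypad's coordinates.
    VIRTUAL_KEYPAD = [
        ("1", 0, 0, 30, 30), ("2", 30, 0, 30, 30), ("3", 60, 0, 30, 30),
        ("4", 0, 30, 30, 30), ("5", 30, 30, 30, 30), ("6", 60, 30, 30, 30),
    ]

    def key_under_fingertip(x, y, keypad=VIRTUAL_KEYPAD):
        """Return the key label at (x, y), where (x, y) is the fingertip
        position derived from the ring/orientation sensors, or None."""
        for label, kx, ky, width, height in keypad:
            if kx <= x < kx + width and ky <= y < ky + height:
                return label
        return None

    print(key_under_fingertip(35, 10))  # -> "2"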
  • FIG. 8 illustrates an exemplary virtual “air” keyboard 800 that is displayed to the user of a cell phone via an output display device, such as O1 or O2 of FIG. 7. The illustrated keypad displayed via the output display device includes a standard telephone keypad 802 and four additional keys 804, 806, 808, 810. Additional keys 804, 806, 808 are user-defined. Key 810 allows body-movement input to be temporarily turned off, such as when the user is dancing, driving, or in a bathroom.
  • Although FIG. 8 illustrates additional keys being added to a traditional keypad layout for a telephone, one will appreciate that various other layouts not including the traditional keys or key locations are possible in other embodiments. For example, a virtual keyboard with the letters in alphabetical order rather than a QWERTY layout can be created. As a second example, virtual keypads can be constructed that are composed entirely of virtual keys associated with user-defined functionality.
  • One will also appreciate that although a virtual input device is described as being displayed to the user on an output display device, in other embodiments the virtual input device can be a paper printout of an input device, provided the layout is previously known to the mobile device or can be automatically determined (e.g., via picture recognition or via a barcode or other machine-readable tag). Accordingly, virtual keypads of any size, shape, or button configuration can be created.
  • Turning briefly to FIGS. 9-11, methodologies that may be implemented in accordance with the present invention are illustrated. While, for purposes of simplicity of explanation, the methodologies are shown and described as a series of blocks, it is to be understood and appreciated that the present invention is not limited by the order of the blocks, as some blocks may, in accordance with the present invention, occur in different orders and/or concurrently with other blocks from that shown and described herein. Moreover, not all illustrated blocks may be required to implement the methodologies in accordance with the present invention.
  • FIG. 9 is an exemplary non-limiting flow diagram showing a method 900 for use in connection with a mobile device in accordance with the invention. At 910, orientation input data is received relating to an orientation of body part(s) of a user from input sensor(s), including orientation sensor(s), attachable to or near the body part(s) of the user. At 920, the movement of the input sensor(s) is tracked via the orientation sensor(s). At 930, pattern(s) in the orientation input data generated by the at least one input sensor are recognized. At 940, one or more processes are automatically initiated by the mobile device in response to the pattern(s) recognized in the input data.
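  • Step 930, recognizing patterns in the orientation input data, could in one illustrative form amount to comparing a captured orientation trace against stored templates within a tolerance; the template values, trace, and tolerance below are made up for the sketch.

    # Illustrative pattern matching for step 930: compare a captured orientation
    # trace (degrees over time) against stored gesture templates.
    TEMPLATES = {
        "wrist_flick": [0, 30, 60, 30, 0],
        "slow_wave": [0, 20, 40, 20, 0, 20, 40, 20, 0],
    }

    def recognize(trace, templates=TEMPLATES, tolerance=15.0):
        """Return the first template whose samples all lie within `tolerance`
        degrees of the trace (lengths must match in this simplified sketch)."""
        for name, template in templates.items():
            if len(template) == len(trace) and all(
                    abs(a - b) <= tolerance for a, b in zip(trace, template)):
                return name
        return None

    print(recognize([2, 28, 55, 33, 5]))  # -> "wrist_flick"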
  • FIG. 10 is an exemplary non-limiting flow diagram of a method 1000 for using a virtual input device. At 1010, the virtual input device is displayed to the user. As discussed supra, this display can be via an output display device communicatively coupled to the mobile device, or the virtual input device may be provided on a piece of paper. At 1020, movement of input sensors is tracked via one or more orientation sensors with regard to the displayed virtual input device. At 1030, input data from the input sensors and information associated with the displayed virtual input device are used to determine the user input. Although not shown, this method can be performed repeatedly to receive multiple inputs (e.g., multiple key inputs) from the virtual input device.
  • FIG. 11 is an exemplary non-limiting flow diagram of a method 1100 of automatically turning on/off body-based movement as input depending on the user's environment. Some user environments, based on the user's location (e.g., a bathroom) or current activity (e.g., driving, dancing), are not conducive to body-based movement being used as input. One will appreciate that body-based movement can also be turned on/off manually by the user.
  • At 1110, the user's current environment is determined based on one or more sensors, such as global positioning sensors or sensors that read location-specific identifiers (e.g., RFID tags or wireless network gateway MAC addresses, etc.). This can be performed periodically, such as once every minute, or be determined based on the amount or nature of body movements. At 1120, it is determined if the current user environment is appropriate for body-based movement. At 1130, depending on the current state of body-based movement input and the determination made at 1120, body-based movement is turned on/off as appropriate.
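  • A hedged sketch of steps 1110-1130 follows; the location and activity categories, and the way they are obtained, are examples only and not drawn from the original description.

    # Hypothetical environment check for automatically enabling/disabling
    # body-based movement input (FIG. 11).
    INAPPROPRIATE_LOCATIONS = {"bathroom"}
    INAPPROPRIATE_ACTIVITIES = {"driving", "dancing"}

    def environment_allows_movement_input(location, activity):
        """Step 1120: decide whether the sensed environment is appropriate."""
        return (location not in INAPPROPRIATE_LOCATIONS and
                activity not in INAPPROPRIATE_ACTIVITIES)

    def update_movement_input(device_state, location, activity):
        """Steps 1110-1130: the environment is determined (e.g., from GPS, RFID
        tags, or network identifiers) and the feature is toggled as appropriate."""
        allowed = environment_allows_movement_input(location, activity)
        if allowed != device_state.get("movement_input_on", True):
            device_state["movement_input_on"] = allowed
        return device_state

    state = {"movement_input_on": True}
    print(update_movement_input(state, location="car", activity="driving"))
    # -> {'movement_input_on': False}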
  • FIG. 12 is a block diagram of an input processing component 1210 according to one embodiment. The sensor tracking component 1214 is communicatively coupled to at least one input sensor and tracks the movement of the input sensor via an orientation sensor. The pattern recognition component 1212 then recognizes at least one pattern in input data generated by the at least one input sensor. In response to a pattern being recognized in the input data, one or more processes are automatically initiated by the mobile device via the input indication component 1216. A user-defined function component allows a user to store and manage body gestures that will be recognized by the pattern recognition component and to associate those gestures with processes to be automatically performed when the pattern is recognized. Optional virtual input device component 1220 facilitates display of virtual input devices and stores and manages corresponding processes to perform when certain body gestures are received. For example, the virtual input device component 1220 can store the layout of one or more virtual keypads.
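  • One possible decomposition of input processing component 1210 into the named sub-components is sketched below purely for illustration; the class wiring, the dummy sensor, and the example pattern are assumptions, not the patented structure.

    # Illustrative decomposition of input processing component 1210.
    class SensorTrackingComponent:            # corresponds to 1214
        def collect(self, sensors):
            # Gather the latest orientation samples from each coupled sensor.
            return [sensor.read() for sensor in sensors]

    class PatternRecognitionComponent:        # corresponds to 1212
        def __init__(self, user_defined_patterns):
            # Patterns would be stored/managed by the user-defined function component.
            self.patterns = user_defined_patterns

        def recognize(self, samples):
            for name, predicate in self.patterns.items():
                if predicate(samples):
                    return name
            return None

    class InputIndicationComponent:           # corresponds to 1216
        def __init__(self, actions):
            self.actions = actions            # pattern name -> process to initiate

        def indicate(self, pattern):
            if pattern in self.actions:
                self.actions[pattern]()       # automatically initiate the process

    # Minimal wiring with a dummy sensor and one example pattern:
    class DummySensor:
        def read(self):
            return {"pitch": 80}

    tracker = SensorTrackingComponent()
    recognizer = PatternRecognitionComponent(
        {"hand_raised": lambda samples: any(s["pitch"] > 60 for s in samples)})
    indicator = InputIndicationComponent({"hand_raised": lambda: print("answer call")})
    indicator.indicate(recognizer.recognize(tracker.collect([DummySensor()])))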
  • The present invention has been described herein by way of examples. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, for the avoidance of doubt, such terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
  • Various implementations of the invention described herein may have aspects that are wholly in hardware, partly in hardware and partly in software, or wholly in software. As used herein, the terms “component,” “system” and the like are likewise intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a computer and the computer itself can be a component. One or more components may reside within a process and/or thread of execution, and a component may be localized on one computer and/or distributed between two or more computers.
  • Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
  • Furthermore, the invention may be described in the general context of computer-executable instructions, such as program modules, executed by one or more components. Generally, program modules include routines, programs, objects, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments. Furthermore, as will be appreciated various portions of the disclosed systems above and methods below may include or consist of artificial intelligence or knowledge or rule based components, sub-components, processes, means, methodologies, or mechanisms (e.g., support vector machines, neural networks, expert systems, Bayesian belief networks, fuzzy logic, data fusion engines, classifiers . . . ). Such components, inter alia, can automate certain mechanisms or processes performed thereby to make portions of the systems and methods more adaptive as well as efficient and intelligent.
  • Additionally, the disclosed subject matter may be implemented at least partially as a system, method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer or processor based device to implement aspects detailed herein. The terms “article of manufacture,” “computer program product” or similar terms, where used herein, are intended to encompass a computer program accessible from any computer-readable device, carrier, or media. For example, computer readable media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips . . . ), optical disks (e.g., compact disk (CD), digital versatile disk (DVD) . . . ), smart cards, and flash memory devices (e.g., card, stick). Additionally, it is known that a carrier wave can be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN).
  • The aforementioned systems have been described with respect to interaction between several components. It can be appreciated that such systems and components can include those components or specified sub-components, some of the specified components or sub-components, and/or additional components, according to various permutations and combinations of the foregoing. Sub-components can also be implemented as components communicatively coupled to other components rather than included within parent components, e.g., according to a hierarchical arrangement. Additionally, it should be noted that one or more components may be combined into a single component providing aggregate functionality or divided into several separate sub-components, and any one or more middle layers, such as a management layer, may be provided to communicatively couple to such sub-components in order to provide integrated functionality. Any components described herein may also interact with one or more other components not specifically described herein but generally known by those of skill in the art.

Claims (20)

1. A mobile device system comprising:
at least one input sensor attachable to or near at least one body part of the user including at least one orientation sensor; and
an input processing component of a first mobile device communicatively coupled to the at least one input sensor that tracks the movement of the at least one input sensor via the at least one orientation sensor, wherein the input processing component recognizes at least one pattern in input data generated by the at least one input sensor and communicated to the input processing component, whereby in response to the at least one pattern recognized in the input data, one or more processes are automatically initiated by the first mobile device.
2. The mobile device system of claim 1, wherein at least one input sensor is in or proximate to a piece of jewelry, and the piece of jewelry is proximate to at least one body part of the user.
3. The mobile device system of claim 1, further comprising an input processing component of a second mobile device communicatively coupled to the at least one input sensor that tracks the movement of the at least one input sensor via the at least one orientation sensor, wherein the input processing component recognizes at least one pattern in input data generated by the at least one input sensor and communicated to the input processing component, whereby in response to the at least one pattern recognized in the input data, one or more processes are automatically initiated by the second mobile device.
4. The mobile device system of claim 1, wherein at least one input sensor attachable to or near at least one body part of the user is communicatively coupled to the first mobile device wirelessly.
5. The mobile device system of claim 1, wherein at least one input sensor is in or proximate to a fashion accessory, and the fashion accessory is proximate to at least one body part of the user.
6. The mobile device system of claim 1, wherein the input processing component recognizes at least one user-defined pattern in input data generated by the at least one input sensor.
7. The mobile device system of claim 1, wherein the one or more processes automatically initiated by the first mobile device comprises communicating with a remote computing device to automatically initiate one or more processes on the remote computing device.
8. A method for use in connection with a mobile device, comprising:
displaying a virtual input device to a user;
receiving orientation input data relating to an orientation of at least one body part of a user from at least one input sensor, including at least one orientation sensor attachable proximate to at least one body part of the user;
tracking the movement of the at least one input sensor via the at least one orientation sensor in relation to the displayed virtual input device; and
recognizing input to the displayed virtual input device based on the tracked movement.
9. The method of claim 8, further comprising automatically initiating one or more processes by the mobile device in response to the recognized input.
10. The method of claim 8, wherein the displaying of a virtual input device comprises displaying at least one of a virtual mouse or a virtual trackball.
11. The method of claim 8, wherein the displaying of a virtual input device comprises displaying the virtual input device via a wearable output display.
12. The method of claim 8, wherein the displaying of a virtual input device comprises displaying a virtual keypad with at least one user-defined key.
13. The method of claim 8, wherein the receiving orientation input data relating to an orientation of at least one body part of a user from at least one input sensor comprises receiving orientation input data from at least one input sensor in a piece of jewelry.
14. The method of claim 8, wherein the tracking of the movement of the at least one input sensor via the at least one orientation sensor in relation to the displayed virtual input device comprises tracking the movement of one or more fingers of the user.
15. The method of claim 8, wherein the displaying of a virtual input device comprises displaying the virtual input device on an external output device communicatively coupled to the mobile device.
16. A method of automatically switching on or off body-based movement interaction with a mobile device, the method comprising:
determining a current environment for a user based on one or more sensors;
determining whether the current environment is appropriate for body-based movement;
when it is determined that the current environment is not appropriate for body-based movement, turning body-based movement interaction off; and
when it is determined that the current environment is appropriate for body-based movement, turning body-based movement interaction on.
17. The method of claim 16, wherein the determining of a current environment for a user based on one or more sensors comprises determining a current environment for the user based on at least one sensor attached to a body of the user.
18. The method of claim 16, wherein the determining of a current environment for a user based on one or more sensors comprises determining a current location for the user based on at least one sensor attached to a body of the user.
19. The method of claim 16, wherein the determining of a current environment for a user based on one or more sensors comprises determining a current activity for the user based on at least one sensor attached to a body of the user.
20. The method of claim 16, wherein the determining of a current environment for a user based on one or more sensors comprises determining a current activity for the user based on at least one wireless sensor proximate to at least one of a piece of jewelry or fashion accessory.
US12/052,463 2007-04-04 2008-03-20 Body movement based usage of mobile device Abandoned US20080246734A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/052,463 US20080246734A1 (en) 2007-04-04 2008-03-20 Body movement based usage of mobile device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US91010907P 2007-04-04 2007-04-04
US12/052,463 US20080246734A1 (en) 2007-04-04 2008-03-20 Body movement based usage of mobile device

Publications (1)

Publication Number Publication Date
US20080246734A1 true US20080246734A1 (en) 2008-10-09

Family

ID=39826375

Family Applications (10)

Application Number Title Priority Date Filing Date
US12/051,532 Active 2030-06-20 US8032472B2 (en) 2007-04-04 2008-03-19 Intelligent agent for distributed services for mobile devices
US12/052,463 Abandoned US20080246734A1 (en) 2007-04-04 2008-03-20 Body movement based usage of mobile device
US12/053,861 Active 2031-03-24 US8786246B2 (en) 2007-04-04 2008-03-24 Power resource management
US12/054,910 Abandoned US20080248779A1 (en) 2007-04-04 2008-03-25 Media content and mobile devices
US12/054,544 Abandoned US20080246629A1 (en) 2007-04-04 2008-03-25 Mobile devices as centers for health information, monitoring and services
US12/055,040 Abandoned US20080261572A1 (en) 2007-04-04 2008-03-25 Mobile device business models
US12/054,878 Abandoned US20080248750A1 (en) 2007-04-04 2008-03-25 Componentization of mobile devices
US12/054,841 Active 2030-10-18 US8340658B2 (en) 2007-04-04 2008-03-25 Peer to peer sharing of functionality of mobile devices
US13/222,175 Active US8209275B2 (en) 2007-04-04 2011-08-31 Intelligent agent for distributed services for mobile devices
US13/683,648 Expired - Fee Related US9055106B2 (en) 2007-04-04 2012-11-21 Peer to peer sharing of functionality of mobile devices

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/051,532 Active 2030-06-20 US8032472B2 (en) 2007-04-04 2008-03-19 Intelligent agent for distributed services for mobile devices

Family Applications After (8)

Application Number Title Priority Date Filing Date
US12/053,861 Active 2031-03-24 US8786246B2 (en) 2007-04-04 2008-03-24 Power resource management
US12/054,910 Abandoned US20080248779A1 (en) 2007-04-04 2008-03-25 Media content and mobile devices
US12/054,544 Abandoned US20080246629A1 (en) 2007-04-04 2008-03-25 Mobile devices as centers for health information, monitoring and services
US12/055,040 Abandoned US20080261572A1 (en) 2007-04-04 2008-03-25 Mobile device business models
US12/054,878 Abandoned US20080248750A1 (en) 2007-04-04 2008-03-25 Componentization of mobile devices
US12/054,841 Active 2030-10-18 US8340658B2 (en) 2007-04-04 2008-03-25 Peer to peer sharing of functionality of mobile devices
US13/222,175 Active US8209275B2 (en) 2007-04-04 2011-08-31 Intelligent agent for distributed services for mobile devices
US13/683,648 Expired - Fee Related US9055106B2 (en) 2007-04-04 2012-11-21 Peer to peer sharing of functionality of mobile devices

Country Status (4)

Country Link
US (10) US8032472B2 (en)
KR (2) KR20090125264A (en)
CN (2) CN101766015A (en)
WO (2) WO2008124394A1 (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080273755A1 (en) * 2007-05-04 2008-11-06 Gesturetek, Inc. Camera-based user input for compact devices
US20090313010A1 (en) * 2008-06-11 2009-12-17 International Business Machines Corporation Automatic playback of a speech segment for media devices capable of pausing a media stream in response to environmental cues
US20120095575A1 (en) * 2010-10-14 2012-04-19 Cedes Safety & Automation Ag Time of flight (tof) human machine interface (hmi)
WO2012093393A1 (en) * 2011-01-07 2012-07-12 Seal Mobile Id Ltd Method and system for unobtrusive mobile device user recognition
US20120280905A1 (en) * 2011-05-05 2012-11-08 Net Power And Light, Inc. Identifying gestures using multiple sensors
US20130204408A1 (en) * 2012-02-06 2013-08-08 Honeywell International Inc. System for controlling home automation system using body movements
US8542320B2 (en) 2010-06-17 2013-09-24 Sony Corporation Method and system to control a non-gesture controlled device using gesture interactions with a gesture controlled device
US20140351334A1 (en) * 2011-09-12 2014-11-27 Tata Consultancy Services Limited System for Dynamic Service Collaboration through Identification and Context of Plurality of Heterogeneous Devices
US20150153854A1 (en) * 2013-12-03 2015-06-04 Lenovo (Singapore) Pte. Ltd. Extension of wearable information handling device user interface
US20150185827A1 (en) * 2013-12-31 2015-07-02 Linkedln Corporation Techniques for performing social interactions with content
US20150248168A1 (en) * 2012-10-17 2015-09-03 Sony Corporation Communication system, communication method and program
US20160011665A1 (en) * 2014-07-09 2016-01-14 Pearson Education, Inc. Operational feedback with 3d commands
US9632588B1 (en) * 2011-04-02 2017-04-25 Open Invention Network, Llc System and method for redirecting content based on gestures
US9691293B2 (en) 2014-07-09 2017-06-27 Pearson Education, Inc. Customizing application usability with 3D input
US10025974B1 (en) * 2015-04-03 2018-07-17 William Felder Boxing motion system and method
US20200107750A1 (en) * 2018-10-03 2020-04-09 Surge Motion Inc. Method and system for assessing human movements
US11195525B2 (en) * 2018-06-13 2021-12-07 Panasonic Intellectual Property Corporation Of America Operation terminal, voice inputting method, and computer-readable recording medium
US11361122B2 (en) 2019-01-17 2022-06-14 Middle Chart, LLC Methods of communicating geolocated data based upon a self-verifying array of nodes
US11436388B2 (en) 2019-01-17 2022-09-06 Middle Chart, LLC Methods and apparatus for procedure tracking
US11436389B2 (en) 2017-02-22 2022-09-06 Middle Chart, LLC Artificial intelligence based exchange of geospatial related digital content
US11468209B2 (en) 2017-02-22 2022-10-11 Middle Chart, LLC Method and apparatus for display of digital content associated with a location in a wireless communications area
US11475177B2 (en) 2017-02-22 2022-10-18 Middle Chart, LLC Method and apparatus for improved position and orientation based information display
US11481527B2 (en) 2017-02-22 2022-10-25 Middle Chart, LLC Apparatus for displaying information about an item of equipment in a direction of interest
US11507714B2 (en) 2020-01-28 2022-11-22 Middle Chart, LLC Methods and apparatus for secure persistent location based digital content
US11514207B2 (en) 2017-02-22 2022-11-29 Middle Chart, LLC Tracking safety conditions of an area
US11610032B2 (en) 2017-02-22 2023-03-21 Middle Chart, LLC Headset apparatus for display of location and direction based content
US11625510B2 (en) 2017-02-22 2023-04-11 Middle Chart, LLC Method and apparatus for presentation of digital content
US11640486B2 (en) 2021-03-01 2023-05-02 Middle Chart, LLC Architectural drawing based exchange of geospatial related digital content
US11875656B2 (en) 2015-03-12 2024-01-16 Alarm.Com Incorporated Virtual enhancement of security monitoring
US11900022B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Apparatus for determining a position relative to a reference transceiver
US11900021B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Provision of digital content via a wearable eye covering

Families Citing this family (419)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9053500B2 (en) * 1999-06-30 2015-06-09 Blackboard Inc. Internet-based education support system and method with multi-language capability
US8050272B2 (en) 2004-06-29 2011-11-01 Damaka, Inc. System and method for concurrent sessions in a peer-to-peer hybrid communications network
US7570636B2 (en) 2004-06-29 2009-08-04 Damaka, Inc. System and method for traversing a NAT device for peer-to-peer hybrid communications
US7933260B2 (en) 2004-06-29 2011-04-26 Damaka, Inc. System and method for routing and communicating in a heterogeneous network environment
US8009586B2 (en) 2004-06-29 2011-08-30 Damaka, Inc. System and method for data transfer in a peer-to peer hybrid communication network
US7849154B2 (en) * 2005-06-27 2010-12-07 M:Metrics, Inc. Acquiring, storing, and correlating profile data of cellular mobile communications system's users to events
US20080070550A1 (en) * 2006-09-20 2008-03-20 Hose David A Providing Subscriber Specific Information Across Wireless Networks
JP4426563B2 (en) * 2006-12-25 2010-03-03 大日本印刷株式会社 Information provision system
US8556833B2 (en) * 2007-01-10 2013-10-15 Integrity Tracking, Llc Wireless sensor network system and method
US8412269B1 (en) * 2007-03-26 2013-04-02 Celio Technology Corporation Systems and methods for providing additional functionality to a device for increased usability
US8032472B2 (en) * 2007-04-04 2011-10-04 Tuen Solutions Limited Liability Company Intelligent agent for distributed services for mobile devices
TW200847058A (en) 2007-04-27 2008-12-01 Rohm Co Ltd Information exchanging apparatus
WO2008135094A1 (en) * 2007-05-08 2008-11-13 Telefonaktiebolaget Lm Ericsson (Publ) Signalling of extended mobile station capabilities to a mobile communication network
US8170609B2 (en) * 2007-06-20 2012-05-01 Qualcomm Incorporated Personal virtual assistant providing advice to a user regarding physiological information received about the user
WO2009002336A1 (en) * 2007-06-26 2008-12-31 Jeffrey Therese M Enhanced telecommunication system
WO2009032854A2 (en) 2007-09-03 2009-03-12 Damaka, Inc. Device and method for maintaining a communication session during a network transition
US20090079547A1 (en) * 2007-09-25 2009-03-26 Nokia Corporation Method, Apparatus and Computer Program Product for Providing a Determination of Implicit Recommendations
WO2009043016A2 (en) 2007-09-28 2009-04-02 Damaka, Inc. System and method for transitioning a communication session between networks that are not commonly controlled
CN101414296B (en) * 2007-10-15 2012-07-25 日电(中国)有限公司 Self-adapting service recommendation equipment and method, self-adapting service recommendation system and method
US8862689B2 (en) * 2007-10-24 2014-10-14 International Business Machines Corporation Local flash memory and remote server hybrid continuous data protection
US8380859B2 (en) 2007-11-28 2013-02-19 Damaka, Inc. System and method for endpoint handoff in a hybrid peer-to-peer networking environment
US8082189B2 (en) 2007-12-13 2011-12-20 Dai Nippon Printing Co., Ltd. Information providing system for providing store information to a mobile terminal device
US10002189B2 (en) 2007-12-20 2018-06-19 Apple Inc. Method and apparatus for searching using an active ontology
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
KR101238363B1 (en) * 2008-01-31 2013-02-28 삼성전자주식회사 Method for providing Blog service by mobile terminal and system performing the same, and mobile Blog Caster
US11087261B1 (en) 2008-03-14 2021-08-10 DataInfoCom USA Inc. Apparatus, system and method for processing, analyzing or displaying data related to performance metrics
US8364519B1 (en) 2008-03-14 2013-01-29 DataInfoCom USA Inc. Apparatus, system and method for processing, analyzing or displaying data related to performance metrics
US20090243433A1 (en) * 2008-04-01 2009-10-01 Joe Dirr Apparatus, system and method for converting vibrational energy to electric potential
US20090259493A1 (en) * 2008-04-11 2009-10-15 Venon Medhi O Mobile health book
US7515899B1 (en) * 2008-04-23 2009-04-07 International Business Machines Corporation Distributed grid computing method utilizing processing cycles of mobile phones
US8056269B2 (en) 2008-05-02 2011-11-15 Nike, Inc. Article of footwear with lighting system
US11723436B2 (en) 2008-05-02 2023-08-15 Nike, Inc. Article of footwear and charging system
US9907359B2 (en) 2008-05-02 2018-03-06 Nike, Inc. Lacing system with guide elements
US8058837B2 (en) * 2008-05-02 2011-11-15 Nike, Inc. Charging system for an article of footwear
US11206891B2 (en) 2008-05-02 2021-12-28 Nike, Inc. Article of footwear and a method of assembly of the article of footwear
US8046937B2 (en) * 2008-05-02 2011-11-01 Nike, Inc. Automatic lacing system
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
US9002680B2 (en) 2008-06-13 2015-04-07 Nike, Inc. Foot gestures for computer input and interface control
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
EP3087858B1 (en) * 2008-06-13 2021-04-28 NIKE Innovate C.V. Footwear having sensor system
US8631351B2 (en) * 2008-06-29 2014-01-14 Microsoft Corporation Providing multiple degrees of context for content consumed on computers and media players
JP2010016486A (en) * 2008-07-01 2010-01-21 Canon Inc Digital broadcast receiving apparatus and control method and program for the same
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
EP2321755B1 (en) * 2008-08-11 2017-06-21 Roche Diabetes Care GmbH Ambulatory medical device comprising an alert controller
US20100042564A1 (en) * 2008-08-15 2010-02-18 Beverly Harrison Techniques for automatically distingusihing between users of a handheld device
US8326630B2 (en) 2008-08-18 2012-12-04 Microsoft Corporation Context based online advertising
US20100042421A1 (en) * 2008-08-18 2010-02-18 Microsoft Corporation Context based advertisement bidding mechanism
US8324857B1 (en) * 2008-09-23 2012-12-04 SolarLego Inc. Portable stackable solar batteries
US20110071889A1 (en) * 2009-09-24 2011-03-24 Avaya Inc. Location-Aware Retail Application
US8371855B1 (en) * 2008-09-30 2013-02-12 Amazon Technologies, Inc. Sharing electronic books
US8676904B2 (en) 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
JP4670937B2 (en) * 2008-11-04 2011-04-13 トヨタ自動車株式会社 Navigation device
US8103250B2 (en) 2008-12-04 2012-01-24 At&T Mobility Ii Llc System and method for sharing location data in a wireless communication network
US20100146123A1 (en) * 2008-12-08 2010-06-10 Electronics And Telecommunications Research Institute Resource allocation method of each terminal apparatus using resource management system and resource management server apparatus
US8171517B2 (en) 2008-12-12 2012-05-01 At&T Intellectual Property I, L.P. Apparatus and method for distributing media content to vehicles
US8487772B1 (en) 2008-12-14 2013-07-16 Brian William Higgins System and method for communicating information
US9323854B2 (en) * 2008-12-19 2016-04-26 Intel Corporation Method, apparatus and system for location assisted translation
US20100161720A1 (en) * 2008-12-23 2010-06-24 Palm, Inc. System and method for providing content to a mobile device
US8653965B1 (en) * 2009-01-12 2014-02-18 Integrity Tracking, Llc Human health monitoring systems and methods
US8930655B2 (en) * 2009-01-19 2015-01-06 Microsoft Corporation Transient storage device configuration silo
US9572532B2 (en) * 2009-01-23 2017-02-21 Qualcomm Incorporated Button sensor
US8190938B2 (en) * 2009-01-29 2012-05-29 Nokia Corporation Method and apparatus for controlling energy consumption during resource sharing
US8893232B2 (en) * 2009-02-06 2014-11-18 Empire Technology Development Llc Media monitoring system
US8310374B2 (en) * 2009-03-04 2012-11-13 General Electric Company Telemetry system and method
US8483669B2 (en) * 2009-04-03 2013-07-09 Microsoft Corporation Mobile sensor network
US8983535B2 (en) * 2009-04-03 2015-03-17 Ubiquity Broadcasting Corporation Medical scan clip on
US9329619B1 (en) * 2009-04-06 2016-05-03 Dynamics Inc. Cards with power management
US20100281138A1 (en) * 2009-04-29 2010-11-04 Paulo Lerner Froimtchuk Method and system for remote coprocessor
US8667109B2 (en) 2009-04-30 2014-03-04 Empire Technology Development Llc User profile-based wireless device system level management
TW201042567A (en) * 2009-05-27 2010-12-01 Ipeer Multimedia Internat Ltd Digital content trading system and method applied to mobile apparatus
US20100317371A1 (en) * 2009-06-12 2010-12-16 Westerinen William J Context-based interaction model for mobile devices
EP2431782A1 (en) * 2009-06-16 2012-03-21 Intel Corporation Camera applications in a handheld device
WO2011021886A2 (en) 2009-08-21 2011-02-24 Samsung Electronics Co., Ltd. Device capable of notifying operation state change thereof through network and communication method of the device
EP2520134B1 (en) 2009-10-08 2015-03-25 Delos Living, LLC Led lighting system
KR101127451B1 (en) * 2009-10-08 2012-03-22 장길훈 Method of providing and accessing resource remotely
US8063541B2 (en) 2009-10-27 2011-11-22 Research In Motion Limited Holster-integrated piezoelectric energy source for handheld electronic device
US20110153380A1 (en) * 2009-12-22 2011-06-23 Verizon Patent And Licensing Inc. Method and system of automated appointment management
WO2011080715A2 (en) * 2010-01-02 2011-07-07 Francesco Dori System and method for displaying digital content
US20110191692A1 (en) * 2010-02-03 2011-08-04 Oto Technologies, Llc System and method for e-book contextual communication
US8892646B2 (en) 2010-08-25 2014-11-18 Damaka, Inc. System and method for shared session appearance in a hybrid peer-to-peer environment
US8725895B2 (en) 2010-02-15 2014-05-13 Damaka, Inc. NAT traversal by concurrently probing multiple candidates
US8874785B2 (en) 2010-02-15 2014-10-28 Damaka, Inc. System and method for signaling and data tunneling in a peer-to-peer environment
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
US8689307B2 (en) 2010-03-19 2014-04-01 Damaka, Inc. System and method for providing a virtual peer-to-peer environment
US9124804B2 (en) * 2010-03-22 2015-09-01 Microsoft Technology Licensing, Llc Using accelerometer information for determining orientation of pictures and video images
US9043488B2 (en) 2010-03-29 2015-05-26 Damaka, Inc. System and method for session sweeping between devices
US8812657B2 (en) * 2010-04-15 2014-08-19 Qualcomm Incorporated Network-assisted peer discovery
US9191416B2 (en) 2010-04-16 2015-11-17 Damaka, Inc. System and method for providing enterprise voice call continuity
US8990411B2 (en) 2010-04-22 2015-03-24 Microsoft Technology Licensing, Llc Dynamic connection management on mobile peer devices
US8352563B2 (en) 2010-04-29 2013-01-08 Damaka, Inc. System and method for peer-to-peer media routing using a third party instant messaging system for signaling
US8825731B2 (en) 2010-05-18 2014-09-02 International Business Machines Corporation Mobile device workload management for cloud computing using SIP and presence to control workload and method thereof
US9552478B2 (en) * 2010-05-18 2017-01-24 AO Kaspersky Lab Team security for portable information devices
US8560365B2 (en) 2010-06-08 2013-10-15 International Business Machines Corporation Probabilistic optimization of resource discovery, reservation and assignment
US8446900B2 (en) * 2010-06-18 2013-05-21 Damaka, Inc. System and method for transferring a call between endpoints in a hybrid peer-to-peer network
US8611540B2 (en) 2010-06-23 2013-12-17 Damaka, Inc. System and method for secure messaging in a hybrid peer-to-peer network
US9319625B2 (en) * 2010-06-25 2016-04-19 Sony Corporation Content transfer system and communication terminal
US8296765B2 (en) 2010-07-27 2012-10-23 Kurdi Heba A Method of forming a personal mobile grid system and resource scheduling thereon
CH703558A1 (en) * 2010-08-05 2012-02-15 Christoph Buechel Portable device with improved energy autonomy.
US9646271B2 (en) 2010-08-06 2017-05-09 International Business Machines Corporation Generating candidate inclusion/exclusion cohorts for a multiply constrained group
US8370350B2 (en) 2010-09-03 2013-02-05 International Business Machines Corporation User accessibility to resources enabled through adaptive technology
US8968197B2 (en) * 2010-09-03 2015-03-03 International Business Machines Corporation Directing a user to a medical resource
US9292577B2 (en) 2010-09-17 2016-03-22 International Business Machines Corporation User accessibility to data analytics
US8468010B2 (en) 2010-09-24 2013-06-18 Damaka, Inc. System and method for language translation in a hybrid peer-to-peer environment
US20120084248A1 (en) * 2010-09-30 2012-04-05 Microsoft Corporation Providing suggestions based on user intent
US8897148B2 (en) 2010-10-06 2014-11-25 Qualcomm Incorporated Methods and apparatus for resource allocation for peer-to-peer data in non peer-to-peer resources
US8743781B2 (en) 2010-10-11 2014-06-03 Damaka, Inc. System and method for a reverse invitation in a hybrid peer-to-peer environment
KR101425093B1 (en) * 2010-10-12 2014-08-04 한국전자통신연구원 Method for personalized searching of mobile terminal and mobile terminal performing the same
US8429182B2 (en) 2010-10-13 2013-04-23 International Business Machines Corporation Populating a task directed community in a complex heterogeneous environment based on non-linear attributes of a paradigmatic cohort member
US9443211B2 (en) 2010-10-13 2016-09-13 International Business Machines Corporation Describing a paradigmatic member of a task directed community in a complex heterogeneous environment based on non-linear attributes
KR101418393B1 (en) * 2010-10-25 2014-07-14 한국전자통신연구원 Apparatus and method for mobile intelligent advertizing based on mobile user contextual matching
CA2816589A1 (en) 2010-11-05 2012-05-10 Nike International Ltd. Method and system for automated personal training
US9457256B2 (en) * 2010-11-05 2016-10-04 Nike, Inc. Method and system for automated personal training that includes training programs
US9977874B2 (en) 2011-11-07 2018-05-22 Nike, Inc. User interface for remote joint workout session
CN103443795B (en) 2010-11-10 2016-10-26 耐克创新有限合伙公司 Measure for time-based motor activity and the system and method for display
WO2012075099A2 (en) * 2010-11-30 2012-06-07 Google Inc. Use of location tagging in data communications
US10420982B2 (en) 2010-12-13 2019-09-24 Nike, Inc. Fitness training system with energy expenditure calculation that uses a form factor
ES2931178T3 (en) 2010-12-15 2022-12-27 Auto Telematics Ltd Method and system for recording vehicle behavior
US20120158503A1 (en) * 2010-12-17 2012-06-21 Ebay Inc. Identifying purchase patterns and marketing based on user mood
US20120167035A1 (en) * 2010-12-22 2012-06-28 Electronics And Telecommunications Research Institute Apparatus and method for developing customer-oriented emotional home application service
US20120165617A1 (en) * 2010-12-28 2012-06-28 General Electric Company Patient enabled methods, apparatus, and systems for early health and preventive care using wearable sensors
US9575776B2 (en) * 2010-12-30 2017-02-21 Samsung Electrônica da Amazônia Ltda. System for organizing and guiding a user in the experience of browsing different applications based on contexts
US20120173622A1 (en) * 2011-01-04 2012-07-05 Samsung Electronics Co., Ltd. Social screen casting
US20120185569A1 (en) * 2011-01-14 2012-07-19 Qualcomm Incorporated Techniques for dynamic task processing in a wireless communication system
EP2667769B1 (en) 2011-01-27 2020-07-22 The Board of Trustees of the Leland Stanford Junior University Systems for monitoring the circulatory system
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
KR101810751B1 (en) 2011-02-17 2017-12-19 나이키 이노베이트 씨.브이. Selecting and correlating physical activity data with image data
BR112013021141A2 (en) 2011-02-17 2019-12-10 Nike Int Ltd footwear with sensor system
EP3662829A1 (en) 2011-02-17 2020-06-10 NIKE Innovate C.V. Footwear having sensor system
US20120215642A1 (en) * 2011-02-23 2012-08-23 Yoon Kean Wong Advertisement Based on Contextual Usage of Application
US8826313B2 (en) * 2011-03-04 2014-09-02 CSC Holdings, LLC Predictive content placement on a managed services systems
US20130052616A1 (en) * 2011-03-17 2013-02-28 Sears Brands, L.L.C. Methods and systems for device management with sharing and programming capabilities
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US8793357B2 (en) * 2011-04-02 2014-07-29 Open Invention Network, Llc System and method for persisting mobile agents within a mobile region of interest
US8407314B2 (en) 2011-04-04 2013-03-26 Damaka, Inc. System and method for sharing unsupported document types between communication devices
US9164679B2 (en) 2011-04-06 2015-10-20 Patents1, Llc System, method and computer program product for multi-thread operation involving first memory of a first memory class and second memory of a second memory class
US9170744B1 (en) 2011-04-06 2015-10-27 P4tents1, LLC Computer program product for controlling a flash/DRAM/embedded DRAM-equipped system
US8930647B1 (en) 2011-04-06 2015-01-06 P4tents1, LLC Multiple class memory systems
US9176671B1 (en) 2011-04-06 2015-11-03 P4tents1, LLC Fetching data between thread execution in a flash/DRAM/embedded DRAM-equipped system
US9158546B1 (en) 2011-04-06 2015-10-13 P4tents1, LLC Computer program product for fetching from a first physical memory between an execution of a plurality of threads associated with a second physical memory
WO2012139226A1 (en) * 2011-04-13 2012-10-18 Research In Motion Limited System and method for context aware dynamic ribbon
US9880604B2 (en) 2011-04-20 2018-01-30 Microsoft Technology Licensing, Llc Energy efficient location detection
US8732569B2 (en) 2011-05-04 2014-05-20 Google Inc. Predicting user navigation events
US20120290594A1 (en) * 2011-05-12 2012-11-15 Ciright Systems, Inc. Event/performance data aggregation, monitoring, and feedback platform
US8671314B2 (en) * 2011-05-13 2014-03-11 Microsoft Corporation Real-time diagnostics pipeline for large scale services
US8694587B2 (en) 2011-05-17 2014-04-08 Damaka, Inc. System and method for transferring a call bridge between communication devices
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US9769285B2 (en) 2011-06-14 2017-09-19 Google Inc. Access to network content
US8788711B2 (en) 2011-06-14 2014-07-22 Google Inc. Redacting content and inserting hypertext transfer protocol (HTTP) error codes in place thereof
US8823520B2 (en) * 2011-06-16 2014-09-02 The Boeing Company Reconfigurable network enabled plug and play multifunctional processing and sensing node
US9317834B2 (en) * 2011-06-30 2016-04-19 Microsoft Technology Licensing, Llc User computing device with personal agent program for recommending meeting a friend at a service location based on current location, travel direction, and calendar activity
US8650139B2 (en) 2011-07-01 2014-02-11 Google Inc. Predicting user navigation events
US8745212B2 (en) * 2011-07-01 2014-06-03 Google Inc. Access to network content
US8630963B2 (en) 2011-07-01 2014-01-14 Intel Corporation Automatic user identification from button presses recorded in a feature vector
US9083583B1 (en) 2011-07-01 2015-07-14 Google Inc. Latency reduction via adaptive speculative preconnection
US8566696B1 (en) 2011-07-14 2013-10-22 Google Inc. Predicting user navigation events
US8478890B2 (en) 2011-07-15 2013-07-02 Damaka, Inc. System and method for reliable virtual bi-directional data stream communications with single socket point-to-multipoint capability
US8744988B1 (en) 2011-07-15 2014-06-03 Google Inc. Predicting user navigation events in an internet browser
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
US9087348B2 (en) * 2011-08-11 2015-07-21 GM Global Technology Operations LLC Digital content networking
US20130054315A1 (en) * 2011-08-31 2013-02-28 Jon Shutter Method and system for providing targeted advertisements
US9195968B2 (en) * 2011-09-09 2015-11-24 Microsoft Technology Licensing, Llc Cloud-based broker service for digital assistants
CN102368764B (en) * 2011-09-10 2016-08-31 上海量明科技发展有限公司 A kind of method, system and client communicated by multi-point login
KR101659420B1 (en) * 2011-09-12 2016-09-30 인텔 코포레이션 Personalized video content consumption using shared video device and personal device
US10373121B2 (en) * 2011-09-13 2019-08-06 International Business Machines Corporation Integrating a calendaring system with a mashup page containing widgets to provide information regarding the calendared event
US8600921B2 (en) 2011-09-15 2013-12-03 Google Inc. Predicting user navigation events in a browser using directed graphs
US8655819B1 (en) 2011-09-15 2014-02-18 Google Inc. Predicting user navigation events based on chronological history data
US20130081029A1 (en) * 2011-09-23 2013-03-28 Elwha LLC, a limited liability company of the State of Delaware Methods and devices for receiving and executing subtasks
US20130081030A1 (en) * 2011-09-23 2013-03-28 Elwha LLC, a limited liability company of the State Delaware Methods and devices for receiving and executing subtasks
US8280414B1 (en) 2011-09-26 2012-10-02 Google Inc. Map tile data pre-fetching based on mobile device generated event analysis
US8341245B1 (en) 2011-09-26 2012-12-25 Google Inc. Content-facilitated speculative preparation and rendering
US9104664B1 (en) 2011-10-07 2015-08-11 Google Inc. Access to search results
US8903946B1 (en) 2011-10-25 2014-12-02 Google Inc. Reduction in redirect navigation latency via speculative preconnection
US8494838B2 (en) 2011-11-10 2013-07-23 Globili Llc Systems, methods and apparatus for dynamic content management and delivery
US8886715B1 (en) 2011-11-16 2014-11-11 Google Inc. Dynamically determining a tile budget when pre-fetching data in a client device
US20130124563A1 (en) * 2011-11-16 2013-05-16 Google Inc. Controlling pre-fetching of map data tiles based on selectable parameters
US9584579B2 (en) 2011-12-01 2017-02-28 Google Inc. Method and system for providing page visibility information
US9305107B2 (en) 2011-12-08 2016-04-05 Google Inc. Method and apparatus for pre-fetching place page data for subsequent display on a mobile computing device
US9197713B2 (en) 2011-12-09 2015-11-24 Google Inc. Method and apparatus for pre-fetching remote resources for subsequent display on a mobile computing device
US9189252B2 (en) * 2011-12-30 2015-11-17 Microsoft Technology Licensing, Llc Context-based device action prediction
WO2013097232A1 (en) 2011-12-31 2013-07-04 Intel Corporation Content-based control system
US8793235B2 (en) 2012-01-19 2014-07-29 Google Inc. System and method for improving access to search results
US20130189944A1 (en) * 2012-01-20 2013-07-25 Dyax Corp. Tracking and reporting information characterizing attacks caused by a disease
US20130213146A1 (en) 2012-02-22 2013-08-22 Nike, Inc. Footwear Having Sensor System
US11071344B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Motorized shoe with gesture control
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control
US20130213147A1 (en) 2012-02-22 2013-08-22 Nike, Inc. Footwear Having Sensor System
US8739639B2 (en) 2012-02-22 2014-06-03 Nike, Inc. Footwear having sensor system
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US9396337B2 (en) * 2012-04-11 2016-07-19 Intermec Ip Corp. Wireless sensor field enumeration
DE102012206727A1 (en) * 2012-04-24 2013-10-24 Robert Bosch Gmbh Akkuinduktivladevorrichtung
US9088169B2 (en) 2012-05-09 2015-07-21 World Panel, Inc. Power-conditioned solar charger for directly coupling to portable electronic devices
CN205212521U (en) * 2012-05-09 2016-05-04 世界太阳能面板公司 Photovoltaic system
US9946792B2 (en) 2012-05-15 2018-04-17 Google Llc Access to network content
US20130325494A1 (en) * 2012-05-30 2013-12-05 Getmyrx Llc Mobile Fulfillment Platform For Prescription Medications
CN104508669B (en) 2012-06-04 2019-10-01 耐克创新有限合伙公司 System and method for comprehensive fitness and sports scoring
US20130339123A1 (en) * 2012-06-19 2013-12-19 Empire Technology Development, Llc. Group nature verification for providing group incentives
KR101297152B1 (en) * 2012-06-20 2013-08-21 (주)휴즈플로우 Mapping server and mapping method
US20130346593A1 (en) * 2012-06-22 2013-12-26 Nokia Corporation Method and apparatus for providing transition to an alternate service based on performance degradation of an initial service
US9813776B2 (en) 2012-06-25 2017-11-07 Pin Pon Llc Secondary soundtrack delivery
US9483308B2 (en) 2012-06-29 2016-11-01 Intel Corporation Performance of predicted actions
US8990143B2 (en) 2012-06-29 2015-03-24 Intel Corporation Application-provided context for potential action prediction
US20140006599A1 (en) * 2012-06-29 2014-01-02 Dirk Hohndel Probabilities of potential actions based on system observations
US8887239B1 (en) 2012-08-08 2014-11-11 Google Inc. Access to network content
US9728077B1 (en) 2013-03-14 2017-08-08 Kuna Systems Corporation eReceptionist and eNeighborhood watch system for crime prevention and/or verification
US9542832B1 (en) * 2013-03-14 2017-01-10 Kuna Systems Corporation eReceptionist and eNeighborhood watch system for crime prevention and/or verification
CN106950908A (en) 2012-08-28 2017-07-14 戴尔斯生活有限责任公司 Systems, methods and articles for enhancing wellness associated with habitable environments
US9258744B2 (en) * 2012-08-29 2016-02-09 At&T Mobility Ii, Llc Sharing of network resources within a managed network
US20140067801A1 (en) * 2012-08-31 2014-03-06 Fujitsu Limited Geotagging based on specified criteria
CN102833420B (en) * 2012-08-31 2014-09-24 珠海市魅族科技有限公司 Data transmitting method and mobile terminal
US8945328B2 (en) 2012-09-11 2015-02-03 L.I.F.E. Corporation S.A. Methods of making garments having stretchable and conductive ink
US10159440B2 (en) 2014-03-10 2018-12-25 L.I.F.E. Corporation S.A. Physiological monitoring garments
US10201310B2 (en) 2012-09-11 2019-02-12 L.I.F.E. Corporation S.A. Calibration packaging apparatuses for physiological monitoring garments
US8948839B1 (en) 2013-08-06 2015-02-03 L.I.F.E. Corporation S.A. Compression garments having stretchable and conductive ink
ES2705526T3 (en) 2012-09-11 2019-03-25 Life Corp Sa Wearable communication platform
US10462898B2 (en) 2012-09-11 2019-10-29 L.I.F.E. Corporation S.A. Physiological monitoring garments
US11246213B2 (en) 2012-09-11 2022-02-08 L.I.F.E. Corporation S.A. Physiological monitoring garments
US9817440B2 (en) 2012-09-11 2017-11-14 L.I.F.E. Corporation S.A. Garments having stretchable and conductive ink
US8892067B2 (en) * 2012-09-13 2014-11-18 Mitac International Corp. Method of displaying fitness data and related fitness system
US8825022B2 (en) * 2012-09-14 2014-09-02 International Business Machines Corporation Information sharing for third party applications in cellular telecommunication infrastructures
US20140088995A1 (en) 2012-09-21 2014-03-27 Md Revolution, Inc. Systems and methods for dynamic adjustments for personalized health and wellness programs
GB2499281B (en) 2012-09-28 2014-06-25 Imagination Tech Ltd Method, system and device for selecting a device to satisfy a user request
US9141722B2 (en) 2012-10-02 2015-09-22 Google Inc. Access to network content
WO2014058842A1 (en) 2012-10-08 2014-04-17 Patrick Soon-Shiong Distributed storage systems and methods
US9678487B1 (en) 2012-10-09 2017-06-13 DataInfoCom USA, Inc. System and method for allocating a fixed quantity distributed over a set of quantities
US9219668B2 (en) 2012-10-19 2015-12-22 Facebook, Inc. Predicting the future state of a mobile device user
US10046123B2 (en) 2012-10-31 2018-08-14 Inhaletech Llc Systems and methods for administering pulmonary medications
US9230211B1 (en) 2012-11-09 2016-01-05 DataInfoCom USA, Inc. Analytics scripting systems and methods
US9031889B1 (en) 2012-11-09 2015-05-12 DataInfoCom USA Inc. Analytics scripting systems and methods
JP6814236B2 (en) * 2012-11-30 2021-01-13 Panasonic Intellectual Property Corporation of America Information processing method
US9043004B2 (en) 2012-12-13 2015-05-26 Nike, Inc. Apparel having sensor system
KR102037416B1 (en) * 2012-12-17 2019-10-28 삼성전자주식회사 Method for managing of external devices, method for operating of an external device, host device, management server, and the external device
US10860931B1 (en) 2012-12-31 2020-12-08 DataInfoCom USA, Inc. Method and system for performing analysis using unstructured data
US20140207914A1 (en) * 2013-01-22 2014-07-24 Benjamin Paul Robinson Certification validation and associated content access
US9913321B2 (en) * 2013-01-25 2018-03-06 Energyield, Llc Energy harvesting container
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
USD732558S1 (en) 2013-03-11 2015-06-23 Arris Technology, Inc. Display screen with graphical user interface
US9848276B2 (en) * 2013-03-11 2017-12-19 Rovi Guides, Inc. Systems and methods for auto-configuring a user equipment device with content consumption material
EP2967355B1 (en) 2013-03-14 2018-11-21 M. Zubair Mirza Internet based disease monitoring system (idms)
KR101857648B1 (en) * 2013-03-15 2018-05-15 애플 인크. User training by intelligent digital assistant
US9279734B2 (en) 2013-03-15 2016-03-08 Nike, Inc. System and method for analyzing athletic activity
US9558508B2 (en) 2013-03-15 2017-01-31 Microsoft Technology Licensing, Llc Energy-efficient mobile advertising
US9673925B2 (en) 2013-03-15 2017-06-06 Universal Electronics Inc. System and method for monitoring user interactions with a universal controlling device
US9198002B2 (en) 2013-03-15 2015-11-24 Microsoft Technology Licensing, Llc Peer-to-peer device movement communications
CN104079617A (en) * 2013-03-29 2014-10-01 联想(北京)有限公司 Terminal device and method for same
US9610417B2 (en) * 2013-05-07 2017-04-04 Gabrielle M Kassatly Portable discontinuous positive airway pressure (DPAP) device and method of using the same
US10243786B2 (en) * 2013-05-20 2019-03-26 Citrix Systems, Inc. Proximity and context aware mobile workspaces in enterprise systems
US9326236B2 (en) 2013-05-24 2016-04-26 International Business Machines Corporation Method, apparatus and computer program product providing performance and energy optimization for mobile computing
US10641921B1 (en) 2013-05-29 2020-05-05 DataInfoCom USA, Inc. System and method for well log analysis
WO2014197335A1 (en) 2013-06-08 2014-12-11 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US10223156B2 (en) 2013-06-09 2019-03-05 Apple Inc. Initiating background updates based on user activity
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US9392393B2 (en) 2013-06-09 2016-07-12 Apple Inc. Push notification initiated background updates
EP3937002A1 (en) 2013-06-09 2022-01-12 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US9564042B2 (en) 2013-06-13 2017-02-07 Motorola Solutions, Inc. Communication system with improved safety feature
US9820231B2 (en) 2013-06-14 2017-11-14 Microsoft Technology Licensing, Llc Coalescing geo-fence events
US9998866B2 (en) 2013-06-14 2018-06-12 Microsoft Technology Licensing, Llc Detecting geo-fence events using varying confidence levels
US9444031B2 (en) * 2013-06-28 2016-09-13 Samsung Electronics Co., Ltd. Energy harvester using mass and mobile device including the energy harvester
US9027032B2 (en) 2013-07-16 2015-05-05 Damaka, Inc. System and method for providing additional functionality to existing software in an integrated manner
US9226094B2 (en) 2013-07-25 2015-12-29 Elwha Llc Systems and methods for receiving gesture indicative data at a limb wearable computing device
US9078089B2 (en) 2013-07-25 2015-07-07 Elwha Llc Systems and methods for providing one or more functionalities to a wearable computing device
US9226097B2 (en) 2013-07-25 2015-12-29 Elwha Llc Systems and methods for selecting for usage one or more functional devices detected within a communication range of a wearable computing device
US9204245B2 (en) 2013-07-25 2015-12-01 Elwha Llc Systems and methods for providing gesture indicative data via a head wearable computing device
US9167407B2 (en) 2013-07-25 2015-10-20 Elwha Llc Systems and methods for communicating beyond communication range of a wearable computing device
US9237411B2 (en) 2013-07-25 2016-01-12 Elwha Llc Systems and methods for providing one or more functionalities to a wearable computing device with directional antenna
US9785731B1 (en) 2013-08-26 2017-10-10 DataInfoCom USA, Inc. Prescriptive reservoir asset management
DE102013014896B3 (en) * 2013-09-06 2014-12-18 Aissa Zouhri Device and method for signal transmission to persons
CN104516659A (en) * 2013-09-27 2015-04-15 联想(北京)有限公司 Information processing method and device
US9646150B2 (en) * 2013-10-01 2017-05-09 Kalman Csaba Toth Electronic identity and credentialing system
US10756906B2 (en) 2013-10-01 2020-08-25 Kalman Csaba Toth Architecture and methods for self-sovereign digital identity
US9357016B2 (en) * 2013-10-18 2016-05-31 Damaka, Inc. System and method for virtual parallel resource management
US10095982B1 (en) 2013-11-13 2018-10-09 DataInfoCom USA, Inc. System and method for well trace analysis
US10296160B2 (en) 2013-12-06 2019-05-21 Apple Inc. Method for extracting salient dialog usage from live data
US20150178751A1 (en) * 2013-12-23 2015-06-25 Kalibrate Technologies Plc Fuel price data generation
EP3091864B8 (en) 2014-01-06 2018-12-19 L.I.F.E. Corporation S.A. Systems and methods to automatically determine garment fit
WO2015112375A1 (en) * 2014-01-21 2015-07-30 Hazeltine Nelson B Evidenced-based personalized, diabetes self-care system and method
CN103731275A (en) * 2014-01-28 2014-04-16 苏州大学 Battery energy sharing system and method for electric vehicles
US9436270B2 (en) 2014-02-12 2016-09-06 Qualcomm Incorporated Wireless low-energy secure data transfer
US10712722B2 (en) 2014-02-28 2020-07-14 Delos Living Llc Systems and articles for enhancing wellness associated with habitable environments
WO2015134880A1 (en) * 2014-03-06 2015-09-11 Respiratory Motion, Inc. Methods and devices for displaying trend and variability in a physiological dataset
EP3120274A1 (en) * 2014-03-20 2017-01-25 Quidel Corporation Wireless system for near real time surveillance of disease
US9648088B1 (en) * 2014-03-25 2017-05-09 Amazon Technologies, Inc. Digital content prefetch for travel
US10304114B2 (en) 2014-03-25 2019-05-28 Ebay Inc. Data mesh based environmental augmentation
US9417092B2 (en) * 2014-04-25 2016-08-16 Samsung Electronics Co., Ltd. Automatic fixture monitoring using mobile location and sensor data with smart meter data
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US9432796B2 (en) 2014-05-30 2016-08-30 Apple Inc. Dynamic adjustment of mobile device based on peer event data
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US9813990B2 (en) * 2014-05-30 2017-11-07 Apple Inc. Dynamic adjustment of mobile device based on voter feedback
AU2015266863B2 (en) 2014-05-30 2018-03-15 Apple Inc. Multi-command single utterance input method
US9943241B2 (en) 2014-06-12 2018-04-17 PhysioWave, Inc. Impedance measurement devices, systems, and methods
US9568354B2 (en) 2014-06-12 2017-02-14 PhysioWave, Inc. Multifunction scale with large-area display
US9949662B2 (en) 2014-06-12 2018-04-24 PhysioWave, Inc. Device and method having automatic user recognition and obtaining impedance-measurement signals
US10130273B2 (en) 2014-06-12 2018-11-20 PhysioWave, Inc. Device and method having automatic user-responsive and user-specific physiological-meter platform
US9546898B2 (en) 2014-06-12 2017-01-17 PhysioWave, Inc. Fitness testing scale
US10754842B2 (en) 2014-06-13 2020-08-25 International Business Machines Corporation Preplaying transactions that mix hot and cold data
US9672400B2 (en) * 2014-07-08 2017-06-06 Aila Technologies Inc. Imaging and peripheral enhancements for mobile devices
US9220123B1 (en) 2014-07-10 2015-12-22 International Business Machines Corporation Peer-to-peer sharing of network resources
US9230150B1 (en) * 2014-07-28 2016-01-05 Google Technology Holdings LLC Finger print sensor and auxiliary processor integration in an electronic device
KR102365161B1 (en) * 2014-07-31 2022-02-21 삼성전자주식회사 Method and device for performing function of mobile device
EP3410253A1 (en) * 2014-07-31 2018-12-05 Samsung Electronics Co., Ltd. Mobile communication device using a plurality of wearable devices in parallel
US9712639B2 (en) 2014-08-01 2017-07-18 American Express Travel Related Services Company, Inc. System and method for dynamic provisioning of mobile application content
WO2016022574A1 (en) 2014-08-05 2016-02-11 Damaka, Inc. System and method for providing unified communications and collaboration (ucc) connectivity between incompatible systems
US9693696B2 (en) 2014-08-07 2017-07-04 PhysioWave, Inc. System with user-physiological data updates
US9498137B2 (en) * 2014-08-07 2016-11-22 PhysioWave, Inc. Multi-function fitness scale with display
US9824374B1 (en) * 2014-08-19 2017-11-21 Sprint Communications Company L.P. Radio access network adaptive mobile advertisement delivery
US9386401B2 (en) * 2014-08-25 2016-07-05 Steven K. Gold Proximity-based sensing, communicating, and processing of user physiologic information
US20180227735A1 (en) * 2014-08-25 2018-08-09 Phyziio, Inc. Proximity-Based Attribution of Rewards
US20160072857A1 (en) * 2014-09-09 2016-03-10 Microsoft Technology Licensing, Llc Accessibility features in content sharing
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10020951B2 (en) * 2014-09-17 2018-07-10 Ca, Inc. Crowdsourcing-based detection, identification, and tracking of electronic devices
US20160073947A1 (en) * 2014-09-17 2016-03-17 Glen J. Anderson Managing cognitive assistance
KR20160034737A (en) * 2014-09-22 2016-03-30 에스케이텔레콤 주식회사 Apparatus and method for multi-terminal communication service
US10635296B2 (en) 2014-09-24 2020-04-28 Microsoft Technology Licensing, Llc Partitioned application presentation across devices
US10025684B2 (en) * 2014-09-24 2018-07-17 Microsoft Technology Licensing, Llc Lending target device resources to host device computing environment
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US10031644B2 (en) * 2014-12-05 2018-07-24 HomeAway.com, Inc. Platform to influence channelization of customized information to a user
US9648463B2 (en) 2014-12-19 2017-05-09 Here Global B.V. Method and apparatus for providing context-related point-of-interest recommendations
CN104598416A (en) * 2014-12-30 2015-05-06 西安乾易企业管理咨询有限公司 System and method for using mobile terminals as input devices
US10923226B2 (en) 2015-01-13 2021-02-16 Delos Living Llc Systems, methods and articles for monitoring and enhancing human wellness
US10152299B2 (en) 2015-03-06 2018-12-11 Apple Inc. Reducing response latency of intelligent automated assistants
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
CN104715188B (en) * 2015-03-27 2019-10-01 百度在线网络技术(北京)有限公司 Application implementation method and device based on terminal binding
US9848674B2 (en) 2015-04-14 2017-12-26 Nike, Inc. Article of footwear with weight-activated cinching apparatus
KR101610883B1 (en) * 2015-04-23 2016-04-08 네이버 주식회사 Apparatus and method for providing information
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10594835B2 (en) 2015-06-05 2020-03-17 Apple Inc. Efficient context monitoring
US10945671B2 (en) 2015-06-23 2021-03-16 PhysioWave, Inc. Determining physiological parameters using movement detection
CN105022630B (en) * 2015-06-30 2019-01-22 中标软件有限公司 Assembly management system and method
CN108024721B (en) 2015-07-20 2021-10-26 立芙公司 Flexible fabric strap connector for garment with sensors and electronics
US9826048B2 (en) * 2015-07-27 2017-11-21 JBK Media LLC Systems and methods for location-based content sharing
US20180256028A1 (en) * 2015-10-29 2018-09-13 Nokia Technologies Oy Method and apparatus for facilitating transmission of a proximity health alert via a local wireless network
US10553306B2 (en) 2015-11-20 2020-02-04 PhysioWave, Inc. Scaled-based methods and apparatuses for automatically updating patient profiles
US10436630B2 (en) 2015-11-20 2019-10-08 PhysioWave, Inc. Scale-based user-physiological data hierarchy service apparatuses and methods
US10980483B2 (en) 2015-11-20 2021-04-20 PhysioWave, Inc. Remote physiologic parameter determination methods and platform apparatuses
US10923217B2 (en) 2015-11-20 2021-02-16 PhysioWave, Inc. Condition or treatment assessment methods and platform apparatuses
US10395055B2 (en) 2015-11-20 2019-08-27 PhysioWave, Inc. Scale-based data access control methods and apparatuses
US11561126B2 (en) 2015-11-20 2023-01-24 PhysioWave, Inc. Scale-based user-physiological heuristic systems
CN112754109B (en) 2015-11-30 2023-04-07 耐克创新有限合伙公司 System and method for controlling an article of footwear
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10091025B2 (en) 2016-03-31 2018-10-02 Damaka, Inc. System and method for enabling use of a single user identifier across incompatible networks for UCC functionality
US10390772B1 (en) 2016-05-04 2019-08-27 PhysioWave, Inc. Scale-based on-demand care system
US9945672B2 (en) 2016-06-07 2018-04-17 International Business Machines Corporation Wearable device for tracking real-time ambient health conditions and method for destination selection based on tracked real-time ambient health conditions
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
CN109640820A (en) 2016-07-01 2019-04-16 立芙公司 Biometric identification by garments having a plurality of sensors
EP3504942A4 (en) 2016-08-24 2020-07-15 Delos Living LLC Systems, methods and articles for enhancing wellness associated with habitable environments
US10215619B1 (en) 2016-09-06 2019-02-26 PhysioWave, Inc. Scale-based time synchrony
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10681198B2 (en) 2016-09-12 2020-06-09 Nymbus, Llc Audience interaction system and method
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10719900B2 (en) 2016-10-11 2020-07-21 Motorola Solutions, Inc. Methods and apparatus to perform actions in public safety incidents based on actions performed in prior incidents
CN106324329B (en) * 2016-10-25 2023-09-22 云南电网有限责任公司电力科学研究院 Overvoltage self-adaptive identification method and system based on D-dot principle
US10389612B1 (en) * 2017-01-25 2019-08-20 Amazon Technologies, Inc. Product agnostic pattern detection and management
US9819782B1 (en) * 2017-02-07 2017-11-14 Shavar Daniels Neurological communication device
US20180225421A1 (en) * 2017-02-08 2018-08-09 International Business Machines Corporation Personalized health tracker and method for destination selection based on tracked personalized health information
CN110574297B (en) * 2017-02-16 2022-01-11 沃特洛电气制造公司 Compact modular wireless sensor
US10555258B2 (en) 2017-03-13 2020-02-04 At&T Intellectual Property I, L.P. User-centric ecosystem for heterogeneous connected devices
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
US11301477B2 (en) 2017-05-12 2022-04-12 Apple Inc. Feedback analysis of a digital assistant
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
US10412139B2 (en) * 2017-05-26 2019-09-10 Streamsure Solutions Limited Communication event
US11425140B1 (en) * 2017-05-30 2022-08-23 Amazon Technologies, Inc. Secure and efficient cross-service sharing of subscriber data
CN107392661A (en) * 2017-07-19 2017-11-24 深圳市孝心快递养老服务有限公司 Data processing method, measuring apparatus and system server
US11331019B2 (en) 2017-08-07 2022-05-17 The Research Foundation For The State University Of New York Nanoparticle sensor having a nanofibrous membrane scaffold
WO2019046478A1 (en) * 2017-08-29 2019-03-07 Walmart Apollo, Llc System and method for collaborative sharing of database information
WO2019046580A1 (en) 2017-08-30 2019-03-07 Delos Living Llc Systems, methods and articles for assessing and/or improving health and well-being
EP3457242B1 (en) * 2017-09-14 2023-03-01 Rohde & Schwarz GmbH & Co. KG Method for automatically notifying an intended person as well as a test and measurement device
US10755051B2 (en) 2017-09-29 2020-08-25 Apple Inc. Rule-based natural language processing
US10904929B2 (en) 2017-11-09 2021-01-26 Uniraja Ou Secure communication system
US10636424B2 (en) 2017-11-30 2020-04-28 Apple Inc. Multi-turn canned dialog
US10733982B2 (en) 2018-01-08 2020-08-04 Apple Inc. Multi-directional dialog
US10733375B2 (en) 2018-01-31 2020-08-04 Apple Inc. Knowledge-based framework for improving natural language understanding
US10902534B2 (en) 2018-03-01 2021-01-26 International Business Machines Corporation Cognitive travel assistance
US10789959B2 (en) 2018-03-02 2020-09-29 Apple Inc. Training speaker recognition models for digital assistants
US10592604B2 (en) 2018-03-12 2020-03-17 Apple Inc. Inverse text normalization for automatic speech recognition
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US10909331B2 (en) 2018-03-30 2021-02-02 Apple Inc. Implicit identification of translation payload with neural machine translation
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
US10984780B2 (en) 2018-05-21 2021-04-20 Apple Inc. Global semantic word embeddings using bi-directional recurrent neural networks
DK179822B1 (en) 2018-06-01 2019-07-12 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc Dismissal of an attention-aware virtual assistant
DK201870355A1 (en) 2018-06-01 2019-12-16 Apple Inc. Virtual assistant operation in multi-device environments
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
US11386266B2 (en) 2018-06-01 2022-07-12 Apple Inc. Text correction
US10504518B1 (en) 2018-06-03 2019-12-10 Apple Inc. Accelerated task performance
US20200066406A1 (en) * 2018-08-22 2020-02-27 Centre For Addiction And Mental Health Tool for identifying occurrence of acute incident symptomatic of mental condition or disorder
US20200168317A1 (en) 2018-08-22 2020-05-28 Centre For Addiction And Mental Health Tool for assisting individuals experiencing auditory hallucinations to differentiate between hallucinations and ambient sounds
WO2020055872A1 (en) 2018-09-14 2020-03-19 Delos Living Llc Systems and methods for air remediation
US11178537B2 (en) * 2019-01-24 2021-11-16 International Business Machines Corporation Data item transfer between mobile devices
US11844163B2 (en) 2019-02-26 2023-12-12 Delos Living Llc Method and apparatus for lighting in an office environment
US11368571B1 (en) 2019-03-06 2022-06-21 Nymbus, Llc Dense audience interaction device and communication method
US11898898B2 (en) 2019-03-25 2024-02-13 Delos Living Llc Systems and methods for acoustic monitoring
US11567912B1 (en) 2019-05-13 2023-01-31 Datometry, Inc. Database segmentation
US11075832B2 (en) * 2019-08-07 2021-07-27 Rohde & Schwarz Gmbh & Co. Kg Method and apparatus for data transmission rate control in a network
CN112752048B (en) * 2019-10-31 2022-04-12 华为技术有限公司 Cooperative work method, device, storage medium and cooperative system
CA3168925A1 (en) * 2020-01-30 2021-08-05 Centre For Addiction And Mental Health Tool for assisting individuals experiencing auditory hallucinations to differentiate between hallucinations and ambient sounds
US20210407683A1 (en) * 2020-06-30 2021-12-30 Verizon Patent And Licensing Inc. Method and system for remote health monitoring, analyzing, and response
CN111865734A (en) * 2020-07-07 2020-10-30 深圳康佳电子科技有限公司 Control method for terminal function sharing, gateway, terminal and storage medium
CN113923528B (en) * 2020-07-08 2023-03-28 华为技术有限公司 Screen sharing method, terminal and storage medium
WO2022145951A1 (en) * 2020-12-29 2022-07-07 Samsung Electronics Co., Ltd. Method and apparatus for providing a remote assistance
US20220216485A1 (en) * 2021-01-05 2022-07-07 Alexander Charles Kurple Power generation and energy storage in thermal batteries
WO2022159628A1 (en) * 2021-01-22 2022-07-28 Zinn Labs, Inc. Headset integrated into healthcare platform
CN113079031B (en) * 2021-02-22 2022-07-08 四川惟邦新创科技有限公司 Method for establishing ordered link based on intelligent agent to improve network service quality
US11720237B2 (en) * 2021-08-05 2023-08-08 Motorola Mobility Llc Input session between devices based on an input trigger
US11583760B1 (en) 2021-08-09 2023-02-21 Motorola Mobility Llc Controller mode for a mobile device
US11902936B2 (en) 2021-08-31 2024-02-13 Motorola Mobility Llc Notification handling based on identity and physical presence
US11641440B2 (en) 2021-09-13 2023-05-02 Motorola Mobility Llc Video content based on multiple capture devices
US20230141079A1 (en) * 2021-11-09 2023-05-11 Soonbum Shin Methods, Systems, and Devices for Facilitating a Health Protection Protocol

Family Cites Families (96)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5045839A (en) * 1990-03-08 1991-09-03 Rand G. Ellis Personnel monitoring man-down alarm and location system
JP2776105B2 (en) * 1992-01-07 1998-07-16 三菱電機株式会社 Electronic device and method for supplying power to electronic device
US5446679A (en) * 1993-08-05 1995-08-29 Motorola, Inc. Method for an operator station to perform requested functions when a functions processor is unable
US5883954A (en) * 1995-06-07 1999-03-16 Digital River, Inc. Self-launching encrypted try before you buy software distribution system
US5884184A (en) * 1996-05-01 1999-03-16 Sheffer; Eliezer Arie Supervised cellular reporting network
US6475170B1 (en) * 1997-12-30 2002-11-05 Remon Medical Technologies Ltd Acoustic biosensor for monitoring physiological conditions in a body implantation site
US5963012A (en) * 1998-07-13 1999-10-05 Motorola, Inc. Wireless battery charging system having adaptive parameter sensing
US6470212B1 (en) * 1998-08-11 2002-10-22 Medtronic, Inc. Body heat powered implantable medical device
US6225799B1 (en) * 1998-11-23 2001-05-01 Oleg A. Gergel Method and apparatus for testing magnetic heads and hard disks
US6563532B1 (en) * 1999-01-05 2003-05-13 Interval Research Corporation Low attention recording unit for use by vigorously active recorder
US6862347B1 (en) * 1999-01-28 2005-03-01 Siemens Communications, Inc. Method and apparatus for extending a telephone's capabilities
DE19930241A1 (en) * 1999-06-25 2000-12-28 Biotronik Mess & Therapieg Procedure for data transmission in implant monitoring
DE19929328A1 (en) * 1999-06-26 2001-01-04 Daimlerchrysler Aerospace Ag Device for long-term medical monitoring of people
US6281594B1 (en) * 1999-07-26 2001-08-28 Ivan Marijan Sarich Human powered electrical generation system
US7324953B1 (en) * 1999-08-13 2008-01-29 Danny Murphy Demographic information database processor
US7203732B2 (en) * 1999-11-11 2007-04-10 Miralink Corporation Flexible remote data mirroring
US6443890B1 (en) * 2000-03-01 2002-09-03 I-Medik, Inc. Wireless internet bio-telemetry monitoring system
US7734287B2 (en) * 2000-04-10 2010-06-08 I/O Controls Corporation System for providing remote access to diagnostic information over a wide area network
US6907264B1 (en) * 2000-08-09 2005-06-14 Lucent Technologies Inc. Methods and apparatus for modularization of real time and task oriented features in wireless communications
US6734071B1 (en) * 2000-08-30 2004-05-11 Micron Technology, Inc. Methods of forming insulative material against conductive structures
AU2001290762A1 (en) * 2000-09-11 2002-03-26 Wishoo, Inc. Portable system for digital photo management
JP4523143B2 (en) * 2000-11-10 2010-08-11 シチズンホールディングス株式会社 Concentration measuring device and sugar content measuring device
US20080032738A1 (en) * 2001-03-07 2008-02-07 Palm, Inc. Portable wireless network
US7471734B2 (en) * 2001-04-26 2008-12-30 Motorola, Inc. Space-time transmit diversity scheme for time-dispersive propagation media
WO2002093408A1 (en) * 2001-05-11 2002-11-21 Wildseed, Ltd. Method and system for collecting and displaying aggregate presence information for mobile media players
US7245725B1 (en) * 2001-05-17 2007-07-17 Cypress Semiconductor Corp. Dual processor framer
US20040054589A1 (en) * 2001-06-14 2004-03-18 Nicholas Frank C. Method and system for providing network based target advertising and encapsulation
US7113771B2 (en) * 2001-08-02 2006-09-26 Motorola, Inc. Method and apparatus for enabling and rewarding wireless resource sharing
US7478157B2 (en) * 2001-11-07 2009-01-13 International Business Machines Corporation System, method, and business methods for enforcing privacy preferences on personal-data exchanges across a network
US7146433B2 (en) * 2002-02-01 2006-12-05 Lenovo Singapore Pte. Ltd Extending an allowable transmission distance between a wireless device and an access point by communication with intermediate wireless devices
US6822343B2 (en) * 2002-02-28 2004-11-23 Texas Instruments Incorporated Generating electric power in response to activity of a biological system
US7019492B1 (en) * 2002-04-25 2006-03-28 Innovative Solutions & Technologies, Llc Hand-held, manually-operated battery charger with emergency light
US6970444B2 (en) * 2002-05-13 2005-11-29 Meshnetworks, Inc. System and method for self propagating information in ad-hoc peer-to-peer networks
US20040203797A1 (en) * 2002-09-05 2004-10-14 Jeremy Burr Method and apparatus for communications using distributed services in a mobile ad hoc network (MANET)
US20040203617A1 (en) * 2002-09-11 2004-10-14 Knauerhase Robert C. Communicating between devices within a mobile ad hoc network
US7194298B2 (en) * 2002-10-02 2007-03-20 Medicale Intelligence Inc. Method and apparatus for trend detection in an electrocardiogram monitoring signal
US6870089B1 (en) * 2002-11-12 2005-03-22 Randolph Dean Gray System and apparatus for charging an electronic device using solar energy
US7003353B1 (en) * 2002-12-10 2006-02-21 Quallion Llc Photovoltaic powered charging apparatus for implanted rechargeable batteries
US8131649B2 (en) * 2003-02-07 2012-03-06 Igware, Inc. Static-or-dynamic and limited-or-unlimited content rights
US7304416B2 (en) * 2003-02-21 2007-12-04 Jeffrey D Mullen Maximizing power generation in and distributing force amongst piezoelectric generators
US20050275729A1 (en) * 2003-03-13 2005-12-15 Logitech Europe S.A. User interface for image processing device
US7193649B2 (en) * 2003-04-01 2007-03-20 Logitech Europe S.A. Image processing device supporting variable data technologies
US7092713B2 (en) * 2003-04-29 2006-08-15 Microsoft Corporation Establishing call paths between source wireless computing systems and remote wireless computing systems using intermediary computing systems
US7417557B2 (en) * 2003-05-07 2008-08-26 Itron, Inc. Applications for a low cost receiver in an automatic meter reading system
US7142911B2 (en) * 2003-06-26 2006-11-28 Pacesetter, Inc. Method and apparatus for monitoring drug effects on cardiac electrical signals using an implantable cardiac stimulation device
US20050055309A1 (en) * 2003-09-04 2005-03-10 Dwango North America Method and apparatus for a one click upgrade for mobile applications
JP2005159905A (en) * 2003-11-27 2005-06-16 Ntt Docomo Inc Data storing device and communication terminal
US20050172141A1 (en) * 2004-01-30 2005-08-04 Gayde Ruth S. Method and apparatus for wireless management of mobile entities
US7483694B2 (en) * 2004-02-24 2009-01-27 Research In Motion Limited Method and system for remotely testing a wireless device
US7613478B2 (en) * 2004-03-15 2009-11-03 General Electric Company Method and system for portability of clinical images using a high-quality display and portable device
US20060019704A1 (en) * 2004-05-10 2006-01-26 Mike Kwon Integrating wireless telephone with external call processor
JP2008515309A (en) * 2004-09-29 2008-05-08 レイフ コミュニケーションズ エルエルシー Control of portable digital devices
US20070054662A1 (en) * 2004-09-30 2007-03-08 Siemens Aktiengesellschaft Wittelsbacherplatz 2 Reconfigurable radio system with error recognition and treatment
US7233333B2 (en) * 2004-11-23 2007-06-19 Buxco Electric, Inc. Collapsible (folding) graph
US7541776B2 (en) * 2004-12-10 2009-06-02 Apple Inc. Method and system for operating a portable electronic device in a power-limited manner
WO2006068295A1 (en) * 2004-12-21 2006-06-29 Matsushita Electric Industrial Co., Ltd. Hybrid mobile communication system comprising multi-hop-ad-hoc and circuit-switched modes
US8068819B2 (en) * 2005-01-24 2011-11-29 Kyocera Corporation System and method for increased wireless communication device performance
US20060170956A1 (en) * 2005-01-31 2006-08-03 Jung Edward K Shared image devices
WO2006098874A2 (en) * 2005-03-14 2006-09-21 Mark Strickland File sharing methods and systems
CN1838701B (en) * 2005-03-21 2012-01-04 松下电器产业株式会社 Method and application for making originating and goal telephone set call using relay telephone set
US7970870B2 (en) * 2005-06-24 2011-06-28 Microsoft Corporation Extending digital artifacts through an interactive surface
US20070004387A1 (en) * 2005-06-30 2007-01-04 Gadamsetty Uma M Sharing of services between a mobile computer and a wireless device
US20070021140A1 (en) * 2005-07-22 2007-01-25 Keyes Marion A Iv Wireless power transmission systems and methods
US7162926B1 (en) * 2005-08-04 2007-01-16 Kavlico Corporation Lead embedded pressure sensor
US8787967B2 (en) * 2005-08-23 2014-07-22 Sony Corporation Communication terminals with pull-based digital information distribution capability and pull-based methods for digital information distribution
EP1768323A1 (en) * 2005-09-27 2007-03-28 Hewlett-Packard Development Company, L.P. Method and apparatus for feature sharing between mobile devices
KR100713511B1 (en) * 2005-10-07 2007-04-30 삼성전자주식회사 Method for performing video communication service in mobile communication terminal
CN100487649C (en) * 2005-11-04 2009-05-13 北京金山软件有限公司 Call method between chip
US8681778B2 (en) * 2006-01-10 2014-03-25 Ingenio Llc Systems and methods to manage privilege to speak
US20070160004A1 (en) * 2006-01-10 2007-07-12 Ketul Sakhpara Local Radio Group
US7532898B2 (en) * 2006-01-19 2009-05-12 International Business Machines Corporation Generating and dynamically updating databases of WIFI hotspots locations and performance metrics via location mappers
US7764247B2 (en) * 2006-02-17 2010-07-27 Microsoft Corporation Adaptive heads-up user interface for automobiles
US8040835B2 (en) * 2006-02-17 2011-10-18 Cisco Technology, Inc. Troubleshooting link and protocol in a wireless network
US8224366B2 (en) * 2006-02-17 2012-07-17 Qualcomm Incorporated System and method for multiple simultaneous communication groups in a wireless system
US7629769B2 (en) * 2006-03-10 2009-12-08 Atmel Corporation Power surge filtering in over-current and short circuit protection
US8188868B2 (en) * 2006-04-20 2012-05-29 Nike, Inc. Systems for activating and/or authenticating electronic devices for operation with apparel
US7698546B2 (en) * 2006-04-27 2010-04-13 Microsoft Corporation BIOS configuration update technique
US8046411B2 (en) * 2006-04-28 2011-10-25 Yahoo! Inc. Multimedia sharing in social networks for mobile devices
US7831270B2 (en) * 2006-05-18 2010-11-09 Cisco Technology, Inc. Providing virtual talk group communication sessions in accordance with endpoint resources
US8571580B2 (en) * 2006-06-01 2013-10-29 Loopt Llc. Displaying the location of individuals on an interactive map display on a mobile communication device
US20070293271A1 (en) * 2006-06-16 2007-12-20 Leslie-Anne Streeter System that augments the functionality of a wireless device through an external graphical user interface on a detached external display
US7711392B2 (en) * 2006-07-14 2010-05-04 Research In Motion Limited System and method to provision a mobile device
US7617423B2 (en) * 2006-08-14 2009-11-10 Kyocera Corporation System and method for detecting, reporting, and repairing of software defects for a wireless device
US20080045201A1 (en) * 2006-08-17 2008-02-21 Kies Jonathan K Remote feature control of a mobile device
US20080182563A1 (en) * 2006-09-15 2008-07-31 Wugofski Theodore D Method and system for social networking over mobile devices using profiles
US20080086226A1 (en) * 2006-10-10 2008-04-10 Gene Fein Internet enabled voice communication
US7705726B2 (en) * 2006-10-11 2010-04-27 Nortel Networks Limited Wireless-enabled device with capability of responding to changes in operational state
US20080089299A1 (en) * 2006-10-13 2008-04-17 Motorola, Inc. Method and system for distributing content in Ad-hoc networks using super peers
US7769009B1 (en) * 2006-12-11 2010-08-03 Sprint Communications Company L.P. Automatic peer to peer mobile device data replication
TWM318873U (en) * 2006-12-28 2007-09-11 Micro Star Int Co Ltd Wireless earphone with decoration effect
US7889124B2 (en) * 2007-01-26 2011-02-15 Mohammad Mojahedul Islam Handheld wireless utility asset mapping device
JP2010524094A (en) * 2007-04-04 2010-07-15 マグネットー・イナーシャル・センシング・テクノロジー・インコーポレイテッド Dynamically configurable wireless sensor network
US8032472B2 (en) * 2007-04-04 2011-10-04 Tuen Solutions Limited Liability Company Intelligent agent for distributed services for mobile devices
EP2165538A2 (en) * 2007-06-08 2010-03-24 Sorensen Associates Inc. Shopper view tracking and analysis system and method
US20090083148A1 (en) * 2007-09-26 2009-03-26 Sony Corporation System and method for facilitating content transfers between client devices in an electronic network
US20090089166A1 (en) * 2007-10-01 2009-04-02 Happonen Aki P Providing dynamic content to users

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5555490A (en) * 1993-12-13 1996-09-10 Key Idea Development, L.L.C. Wearable personal computer system
US5774338A (en) * 1996-09-20 1998-06-30 Mcdonnell Douglas Corporation Body integral electronics packaging
US6573883B1 (en) * 1998-06-24 2003-06-03 Hewlett Packard Development Company, L.P. Method and apparatus for controlling a computing device with gestures
US20060013440A1 (en) * 1998-08-10 2006-01-19 Cohen Charles J Gesture-controlled interfaces for self-service machines and other applications
US6466232B1 (en) * 1998-12-18 2002-10-15 Tangis Corporation Method and system for controlling presentation of information to a user based on the user's condition
US20020028704A1 (en) * 2000-09-05 2002-03-07 Bloomfield Mark E. Information gathering and personalization techniques
US20020123337A1 (en) * 2000-12-28 2002-09-05 Dharia Bhupal Kanaiyalal System for fast macrodiversity switching in mobile wireless networks
US6801140B2 (en) * 2001-01-02 2004-10-05 Nokia Corporation System and method for smart clothing and wearable electronic devices
US7167920B2 (en) * 2001-01-22 2007-01-23 Sun Microsystems, Inc. Peer-to-peer communication pipes
US6965374B2 (en) * 2001-07-16 2005-11-15 Samsung Electronics Co., Ltd. Information input method using wearable information input device
US20040088347A1 (en) * 2002-10-31 2004-05-06 Yeager William J. Mobile agents in peer-to-peer networks
US7774075B2 (en) * 2002-11-06 2010-08-10 Lin Julius J Y Audio-visual three-dimensional input/output
US20050212760A1 (en) * 2004-03-23 2005-09-29 Marvit David L Gesture based user interface supporting preexisting symbols
US20060187196A1 (en) * 2005-02-08 2006-08-24 Underkoffler John S System and method for gesture based control system
US20060187203A1 (en) * 2005-02-22 2006-08-24 Eaton Corporation Handheld electronic device, system and method for inverting display orientation for left-handed or right-handed operation responsive to a wireless message
US20080136775A1 (en) * 2006-12-08 2008-06-12 Conant Carson V Virtual input device for computing
US20100217837A1 (en) * 2006-12-29 2010-08-26 Prodea Systems , Inc. Multi-services application gateway and system employing the same
US7813697B2 (en) * 2007-01-05 2010-10-12 Apple Inc. Power efficient high speed communication systems and methods
US7971156B2 (en) * 2007-01-12 2011-06-28 International Business Machines Corporation Controlling resource access based on user gesturing in a 3D captured image stream of the user

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8472665B2 (en) * 2007-05-04 2013-06-25 Qualcomm Incorporated Camera-based user input for compact devices
US20080273755A1 (en) * 2007-05-04 2008-11-06 Gesturetek, Inc. Camera-based user input for compact devices
US20090313010A1 (en) * 2008-06-11 2009-12-17 International Business Machines Corporation Automatic playback of a speech segment for media devices capable of pausing a media stream in response to environmental cues
US8542320B2 (en) 2010-06-17 2013-09-24 Sony Corporation Method and system to control a non-gesture controlled device using gesture interactions with a gesture controlled device
US20120095575A1 (en) * 2010-10-14 2012-04-19 Cedes Safety & Automation Ag Time of flight (tof) human machine interface (hmi)
CN104991519A (en) * 2010-10-14 2015-10-21 洛克威尔自动控制技术股份有限公司 Time of flight human machine interface
WO2012093393A1 (en) * 2011-01-07 2012-07-12 Seal Mobile Id Ltd Method and system for unobtrusive mobile device user recognition
US10884508B1 (en) 2011-04-02 2021-01-05 Open Invention Network Llc System and method for redirecting content based on gestures
US9632588B1 (en) * 2011-04-02 2017-04-25 Open Invention Network, Llc System and method for redirecting content based on gestures
US10338689B1 (en) * 2011-04-02 2019-07-02 Open Invention Network Llc System and method for redirecting content based on gestures
US11281304B1 (en) 2011-04-02 2022-03-22 Open Invention Network Llc System and method for redirecting content based on gestures
US11720179B1 (en) * 2011-04-02 2023-08-08 International Business Machines Corporation System and method for redirecting content based on gestures
US20120280905A1 (en) * 2011-05-05 2012-11-08 Net Power And Light, Inc. Identifying gestures using multiple sensors
US9063704B2 (en) * 2011-05-05 2015-06-23 Net Power And Light, Inc. Identifying gestures using multiple sensors
US20140351334A1 (en) * 2011-09-12 2014-11-27 Tata Consultancy Services Limited System for Dynamic Service Collaboration through Identification and Context of Plurality of Heterogeneous Devices
US9723062B2 (en) * 2011-09-12 2017-08-01 Tata Consultancy Services Limited System for dynamic service collaboration through identification and context of plurality of heterogeneous devices
US20130204408A1 (en) * 2012-02-06 2013-08-08 Honeywell International Inc. System for controlling home automation system using body movements
US9952674B2 (en) * 2012-10-17 2018-04-24 Sony Corporation Communication system, communication method and program
US20150248168A1 (en) * 2012-10-17 2015-09-03 Sony Corporation Communication system, communication method and program
US20150153854A1 (en) * 2013-12-03 2015-06-04 Lenovo (Singapore) Pte. Ltd. Extension of wearable information handling device user interface
US20150185827A1 (en) * 2013-12-31 2015-07-02 LinkedIn Corporation Techniques for performing social interactions with content
US9691293B2 (en) 2014-07-09 2017-06-27 Pearson Education, Inc. Customizing application usability with 3D input
US9600074B2 (en) * 2014-07-09 2017-03-21 Pearson Education, Inc. Operational feedback with 3D commands
US20160011665A1 (en) * 2014-07-09 2016-01-14 Pearson Education, Inc. Operational feedback with 3d commands
US11875656B2 (en) 2015-03-12 2024-01-16 Alarm.Com Incorporated Virtual enhancement of security monitoring
US10025974B1 (en) * 2015-04-03 2018-07-17 William Felder Boxing motion system and method
US11610033B2 (en) 2017-02-22 2023-03-21 Middle Chart, LLC Method and apparatus for augmented reality display of digital content associated with a location
US11514207B2 (en) 2017-02-22 2022-11-29 Middle Chart, LLC Tracking safety conditions of an area
US11436389B2 (en) 2017-02-22 2022-09-06 Middle Chart, LLC Artificial intelligence based exchange of geospatial related digital content
US11468209B2 (en) 2017-02-22 2022-10-11 Middle Chart, LLC Method and apparatus for display of digital content associated with a location in a wireless communications area
US11475177B2 (en) 2017-02-22 2022-10-18 Middle Chart, LLC Method and apparatus for improved position and orientation based information display
US11481527B2 (en) 2017-02-22 2022-10-25 Middle Chart, LLC Apparatus for displaying information about an item of equipment in a direction of interest
US11900021B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Provision of digital content via a wearable eye covering
US11893317B2 (en) 2017-02-22 2024-02-06 Middle Chart, LLC Method and apparatus for associating digital content with wireless transmission nodes in a wireless communication area
US11900023B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Agent supportable device for pointing towards an item of interest
US11610032B2 (en) 2017-02-22 2023-03-21 Middle Chart, LLC Headset apparatus for display of location and direction based content
US11900022B2 (en) 2017-02-22 2024-02-13 Middle Chart, LLC Apparatus for determining a position relative to a reference transceiver
US11625510B2 (en) 2017-02-22 2023-04-11 Middle Chart, LLC Method and apparatus for presentation of digital content
US11195525B2 (en) * 2018-06-13 2021-12-07 Panasonic Intellectual Property Corporation Of America Operation terminal, voice inputting method, and computer-readable recording medium
US20200107750A1 (en) * 2018-10-03 2020-04-09 Surge Motion Inc. Method and system for assessing human movements
US11861269B2 (en) 2019-01-17 2024-01-02 Middle Chart, LLC Methods of determining location with self-verifying array of nodes
US11436388B2 (en) 2019-01-17 2022-09-06 Middle Chart, LLC Methods and apparatus for procedure tracking
US11636236B2 (en) 2019-01-17 2023-04-25 Middle Chart, LLC Methods and apparatus for procedure tracking
US11361122B2 (en) 2019-01-17 2022-06-14 Middle Chart, LLC Methods of communicating geolocated data based upon a self-verifying array of nodes
US11593536B2 (en) 2019-01-17 2023-02-28 Middle Chart, LLC Methods and apparatus for communicating geolocated data
US11507714B2 (en) 2020-01-28 2022-11-22 Middle Chart, LLC Methods and apparatus for secure persistent location based digital content
US11809787B2 (en) 2021-03-01 2023-11-07 Middle Chart, LLC Architectural drawing aspect based exchange of geospatial related digital content
US11640486B2 (en) 2021-03-01 2023-05-02 Middle Chart, LLC Architectural drawing based exchange of geospatial related digital content

Also Published As

Publication number Publication date
US20080250408A1 (en) 2008-10-09
US9055106B2 (en) 2015-06-09
US20080261572A1 (en) 2008-10-23
US20080246629A1 (en) 2008-10-09
CN101711387B (en) 2014-05-07
US8340658B2 (en) 2012-12-25
WO2008124399A1 (en) 2008-10-16
US20080249969A1 (en) 2008-10-09
KR20090125264A (en) 2009-12-04
US20080246439A1 (en) 2008-10-09
US8032472B2 (en) 2011-10-04
KR101332462B1 (en) 2013-11-25
US8209275B2 (en) 2012-06-26
CN101766015A (en) 2010-06-30
US8786246B2 (en) 2014-07-22
KR20100015362A (en) 2010-02-12
CN101711387A (en) 2010-05-19
WO2008124394A1 (en) 2008-10-16
US20080248750A1 (en) 2008-10-09
US20080248779A1 (en) 2008-10-09
US20110320518A1 (en) 2011-12-29
US20130080616A1 (en) 2013-03-28

Similar Documents

Publication Publication Date Title
US20080246734A1 (en) Body movement based usage of mobile device
US20140049487A1 (en) Interactive user interface for clothing displays
US10552004B2 (en) Method for providing application, and electronic device therefor
KR102192929B1 (en) Wearable electronic device having heterogeneous display screens
CN104408402B (en) Face identification method and device
CN106605201A (en) User interfaces for battery management
KR20170096774A (en) Activity-centric contextual modes of operation for electronic devices
CN108139865A (en) Mancarried device and the method for display
US20140129560A1 (en) Context labels for data clusters
CN109144181A (en) Gestures detection is lifted in equipment
CN107103840A (en) Folding device and its control method
CN107087137B (en) Method and device for presenting video and terminal equipment
WO2015094222A1 (en) User interface based on wearable device interaction
CN107453964A (en) Sleep environment management method and device
CN108132983A (en) The recommendation method and device of clothing matching, readable storage medium storing program for executing, electronic equipment
CN108494947A (en) A kind of images share method and mobile terminal
CN109831576A (en) A kind of garment coordination method for pushing and terminal, computer readable storage medium
CN104317647B (en) Application function implementation method, device and terminal
WO2019184679A1 (en) Method and device for implementing game, storage medium, and electronic apparatus
KR102425464B1 (en) Electronic deivce including rotatable annular member
CN109804618A (en) Electronic equipment for displaying images and computer readable recording medium
CN103927391A (en) Information processing method and device
US20200265233A1 (en) Method for recognizing object and electronic device supporting the same
KR102277097B1 (en) Method for controlling display in electronic device and the electronic device
US8160565B2 (en) Device with multidirectional control for selecting actions to perform on a telecommunication device

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE HONG KONG UNIVERSITY OF SCIENCE AND TECHNOLOGY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TSUI, CHI YING;MURCH, ROSS DAVID;CHENG, ROGER SHU KWAN;AND OTHERS;REEL/FRAME:020682/0317

Effective date: 20080319

AS Assignment

Owner name: HONG KONG TECHNOLOGIES GROUP LIMITED

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:THE HONG KONG UNIVERSITY OF SCIENCE AND TECHNOLOGY;REEL/FRAME:024067/0623

Effective date: 20100305

Owner name: HONG KONG TECHNOLOGIES GROUP LIMITED, SAMOA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:THE HONG KONG UNIVERSITY OF SCIENCE AND TECHNOLOGY;REEL/FRAME:024067/0623

Effective date: 20100305

AS Assignment

Owner name: TUEN SOLUTIONS LIMITED LIABILITY COMPANY, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HONG KONG TECHNOLOGIES GROUP LIMITED;REEL/FRAME:024921/0001

Effective date: 20100728

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION