WO2001029642A1 - Personal mobile communication device - Google Patents

Personal mobile communication device

Info

Publication number
WO2001029642A1
WO2001029642A1 · PCT/GB2000/003970
Authority
WO
WIPO (PCT)
Prior art keywords
user
location
physical
attributes
visual display
Application number
PCT/GB2000/003970
Other languages
French (fr)
Inventor
Jeremy Michael Bowskill
Alexander Loffler
Matthew John Polaine
Jeffrey Joseph Patmore
Original Assignee
British Telecommunications Public Limited Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by British Telecommunications Public Limited Company filed Critical British Telecommunications Public Limited Company
Priority to CA002386407A priority Critical patent/CA2386407C/en
Priority to EP00968118A priority patent/EP1222518B1/en
Priority to DE60010915T priority patent/DE60010915T2/en
Priority to JP2001532372A priority patent/JP2003512798A/en
Priority to AU78073/00A priority patent/AU7807300A/en
Publication of WO2001029642A1 publication Critical patent/WO2001029642A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M3/00Automatic or semi-automatic exchanges
    • H04M3/42Systems providing special services or facilities to subscribers
    • H04M3/56Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2203/00Aspects of automatic or semi-automatic exchanges
    • H04M2203/10Aspects of automatic or semi-automatic exchanges related to the purpose or context of the telephonic communication
    • H04M2203/1016Telecontrol
    • H04M2203/1025Telecontrol of avatars
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2203/00Aspects of automatic or semi-automatic exchanges
    • H04M2203/20Aspects of automatic or semi-automatic exchanges related to features of supplementary services
    • H04M2203/2038Call context notifications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2242/00Special services or facilities
    • H04M2242/30Determination of the location of a subscriber
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M3/00Automatic or semi-automatic exchanges
    • H04M3/42Systems providing special services or facilities to subscribers
    • H04M3/42348Location-based services which utilize the location information of a target
    • H04M3/42357Location-based services which utilize the location information of a target where the information is provided to a monitoring entity such as a potential calling party or a call processing server

Definitions

  • Quality of Service (QoS) differs between the networks used: the fixed telephony network uses 64 kbit/s per voice channel while the mobile network uses 9.6 kbit/s per voice channel.
  • the average number of bits per second transmitted from the client device 11 to the server 10 is monitored by the server 10.
  • the avatar of the user 1 is modified to be more or less opaque as a function of the average number of bits per second received by the server 10 from the client device 11.
  • the more opaque the avatar the better the perceived QoS.
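The text does not specify how the measured bit rate is mapped to opacity; a minimal sketch (function name and interpolation scheme are assumptions) might interpolate linearly between the mobile and fixed-network channel rates quoted above:

```python
def avatar_opacity(avg_bits_per_sec,
                   low=9600.0,     # mobile voice channel (9.6 kbit/s)
                   high=64000.0):  # fixed-network voice channel (64 kbit/s)
    """Map the average bit rate received by the server from a client
    to an opacity in [0.0, 1.0]: the better the perceived QoS, the
    more opaque the avatar is rendered."""
    frac = (avg_bits_per_sec - low) / (high - low)
    return max(0.0, min(1.0, frac))
```

A fixed-network client would thus be fully opaque, while a mobile client at 9.6 kbit/s would be fully transparent.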
  • the attention paid to the virtual meeting space varies in dependence upon the 'real world' task currently being carried out. For example, whilst travelling on a train a user may be required to show a ticket to the ticket inspector, or somebody may speak to the user to ask the time. If the user is walking, running, or unable to remain still for some reason, then the attention paid to the virtual meeting space will be more limited than otherwise. If the user is in a noisy environment, again, the attention paid to the virtual meeting space will be less than it would be in a very quiet environment. Detection of a user's physical and location attributes is discussed in more detail with reference to Figures 3a and 3b.
  • the audio environment is analysed using the audio signal received via the audio input on the client apparatus 11. It is also possible for the user to use a predetermined key or sequence of keys to indicate via the client apparatus 11 to the server 10 that he is distracted or on the move.
  • Figure 6 shows a representation of a user who is on-line but distracted
  • Figure 7 shows a representation of a user who is on line but on the move.
  • the user interface unit 21 includes a physical and location sensor 50 as shown in Figure 3a, as well as a visual display 60 and an audio input/output device 61.
  • the physical and location sensor 50 is connected to the client apparatus 11 by a serial interface 51.
  • a low acceleration detector 52 measures acceleration of a low force in two directions using an ADXL202.
  • a high acceleration detector 53 measures acceleration of a high force in three directions using an ACH04-08-05 available from Measurement Specialties Incorporated (which can be referenced via Uniform Resource Locator (URL) http://www.msiusa.com on the Internet).
  • a direction detector 54 is provided using a compass which gives an absolute measurement of orientation of the client apparatus.
  • an HMC2003 available from Honeywell (URL http://www.ssec.honeywell.com) is used.
  • the compass is a three-axis magnetometer sensitive to fields along the length, width and height of the device.
  • a direction and velocity detector 55 is provided using an ENC Piezoelectric Vibrating Gyroscope (part number S42E-2 which is sold under the registered trademark GYROSTAR) available from Murata Manufacturing Company Ltd. (URL http://www.murata.com).
  • the gyroscope measures angular velocity, giving speed and direction of rotation about each of three axes (i.e. six measurements are provided).
  • the acceleration detectors 52, 53, the direction detector 54 and the velocity and direction detector 55 are connected via a multiplexer (MUX) 56 to a microcontroller 57 where the outputs are analysed as will be described later.
  • a global position detector 58 is provided which measures the absolute location of the device using a Global Positioning System (GPS) receiver which receives signals from GPS satellites.
  • GPS provides specially coded satellite signals that can be processed in a GPS receiver, enabling the receiver to compute position, velocity and time.
  • the nominal GPS Operational Constellation consists of 24 satellites that orbit the earth twice a day, 11,000 miles above the earth. (There are often more than 24 operational satellites as new ones are launched to replace older satellites.)
  • the satellite orbits repeat almost the same ground track (as the earth turns beneath them) once each day.
  • This constellation provides the user with five to eight satellites visible from any point on the earth.
  • the GPS satellites orbit the earth transmitting their precise position and elevation.
  • a GPS receiver acquires the signal, then measures the interval between transmission and receipt of the signal to determine the distance between the receiver and the satellite. Once the receiver has calculated this data for at least 3 satellites, its location on the earth's surface can be determined.
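The distance calculation described amounts to multiplying the signal's travel time by the speed of light; the sketch below illustrates this (a simplification: a real receiver must also solve for its own clock bias, which is why a fourth satellite is normally used in practice):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def satellite_range(transmit_time_s, receive_time_s):
    """Distance from receiver to satellite, computed from the
    interval between transmission and receipt of the coded signal."""
    travel_time = receive_time_s - transmit_time_s
    return travel_time * SPEED_OF_LIGHT

# A signal that took roughly 67 ms to arrive came from a satellite
# about 20,000 km away, consistent with the orbit altitude above.
```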
  • the receiver used in this embodiment of the invention is a Garmin GPS35 unit (available, for example, from Lowe Electronics Ltd in the UK). GPS signals do not propagate inside buildings, so a local position detector 59 is also provided which uses local area beacons (LAB's) (not shown) which use low power 418 MHz AM radio transmitters (such as the CR91Y, CR72P, CR73Q or CR74R from RF Solutions) at known locations within a building. Radio or infrared transmitters could be used, although radio provides a more robust solution since line of sight connections are not required.
  • Bluetooth is a standard for wireless connectivity, designed to replace cables between portable consumer devices such as cellular phones, laptop computers, personal digital assistants, digital cameras, and many other products.
  • the Bluetooth version 1.0 specification was agreed in July 1999, and the first products are expected on the market in mid 2000.
  • Software on the microcontroller 57 gathers sensor data from the detectors 52, 53, 54, 55, via the MUX 56 which is configured to read each device in turn via an analogue port.
  • the output from the global position detector 58 is read via a serial port connection and the output from the local position detector 59 is connected to a digital input on the microcontroller 57.
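The gathering loop run by the microcontroller software might be sketched as follows (the `mux_read` callback and channel assignments are hypothetical stand-ins for the microcontroller's analogue port I/O, not details given in the text):

```python
def poll_sensors(mux_read, channels):
    """Read each analogue sensor in turn through the MUX.

    `mux_read(channel)` stands in for selecting a MUX channel and
    sampling the microcontroller's analogue port; `channels` maps
    sensor names to MUX channel numbers.
    """
    return {name: mux_read(ch) for name, ch in channels.items()}

# Illustrative wiring for the detectors described above:
CHANNELS = {"low_accel": 0, "high_accel": 1, "compass": 2, "gyro": 3}
```

The global and local position detectors bypass this loop, arriving via the serial port and a digital input respectively, as described above.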
  • a location database 64 is also provided, which is accessed by the microcontroller 57 to determine location names.
  • Figure 3b is a functional block diagram showing the logical operation of the physical and location detector 50.
  • a location agent 62 implemented in software on the microcontroller 57, uses location data gathered by the global position detector 58 and the local position detector 59, analyses this data and makes the analysis available to the client apparatus 11.
  • the location agent 62 also receives information about velocity and direction, measured by the direction detector 54 and the velocity and direction detector 55, from a physical agent 63.
  • the physical agent is also implemented in software in the microcontroller 57.
  • the location agent determines whether GPS is available, and whether the global location measured by the global position detector 58 is based on a signal from three or more satellites.
  • the local position detector 59 detects signals from LAB's, each of which has a unique identifier.
  • the location agent 62 accesses the location database 64 to determine a location name associated with a received LAB identifier.
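This lookup amounts to a mapping from beacon identifier to location name; a minimal sketch (the identifiers and names below are invented for illustration):

```python
# Hypothetical contents of the location database 64: each LAB's
# unique identifier maps to the name of the place it is installed.
LOCATION_DB = {
    0x01: "Meeting Room 1",
    0x02: "Main Entrance",
    0x03: "Cafeteria",
}

def location_name(lab_identifier):
    """Resolve a received LAB identifier to a location name,
    or None when the beacon is not in the database."""
    return LOCATION_DB.get(lab_identifier)
```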
  • the location agent 62 must be able to determine the following:
  • Direction of movement: this may be determined by the global position detector and/or by direction data received from the physical agent.
  • the physical agent 63 analyses physical sensor data and makes this available to the location agent 62.
  • the physical agent is used to determine the following user attributes.
  • the physical agent 63 of this embodiment of the invention uses Hidden Markov Models (HMM) to make the determination above, based on the inputs from the detectors 52, 53, 54, 55.
  • a good description of an implementation of HMM's may be found in "Hidden Markov Models for Automatic Speech Recognition: Theory and Application", S.J. Cox, British Telecom Technology Journal, Vol. 6, No. 2, April 1988.
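As an illustration of the idea (the states, observation symbols, and probabilities below are invented, not taken from the patent), the forward algorithm of a small discrete HMM can score a sequence of quantised sensor readings against one model per activity, choosing the activity whose model best explains the sequence:

```python
def forward_likelihood(obs, start_p, trans_p, emit_p):
    """Forward-algorithm likelihood of an observation sequence under
    a discrete HMM given as plain nested lists of probabilities."""
    n = len(start_p)
    alpha = [start_p[s] * emit_p[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans_p[p][s] for p in range(n)) * emit_p[s][o]
                 for s in range(n)]
    return sum(alpha)

# One toy single-state model per activity; observations are quantised
# acceleration levels: 0 = still, 1 = moderate, 2 = vigorous.
MODELS = {
    "sitting": ([1.0], [[1.0]], [[0.80, 0.15, 0.05]]),
    "walking": ([1.0], [[1.0]], [[0.10, 0.70, 0.20]]),
    "running": ([1.0], [[1.0]], [[0.05, 0.25, 0.70]]),
}

def classify_activity(obs):
    """Return the activity whose HMM assigns the observation
    sequence the highest likelihood."""
    return max(MODELS, key=lambda m: forward_likelihood(obs, *MODELS[m]))
```

A real implementation would use multi-state models trained on recorded sensor data, as in the speech-recognition reference cited above.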
  • it is also possible for the physical agent to analyse visual and audio information received from the visual and audio input/output device provided as part of the interface unit 21.
  • the client apparatus 11 has the physical information made available to it via the physical agent 63, and the location information made available to it via the location agent 62. Audio and/or visual information is used on the mobile device to provide the user with information alerts, and for teleconferencing activity. Spatial audio is also used for information alerts and for spatialised teleconferencing, which appears more natural to the user.
  • the interface used by the device for information alerts, and the interface used for teleconferencing, are dependent on the user's current location and physical context (i.e. is the user standing/walking/sitting etc.). If the user is unlikely to be able to attend to a visual display, an audio interface is used. If the user is likely to be unavailable (e.g. running) then the device could divert alerts to a messaging service, which could then alert the user when it is determined he is available again. In embodiments of the invention incorporating audio input and analysis it is also possible to configure the audio output on the user's wearable or handheld device to match the acoustics, ambient noise level etc. of the real world space in which the user is located.
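The selection policy just described can be summarised as a small decision rule; the sketch below uses context labels from the text, but the function itself is illustrative rather than part of the specification:

```python
def choose_interface(physical_state, has_display=True):
    """Pick an output channel from the user's physical context:
    divert to messaging when the user is unavailable, use audio
    when a visual display cannot be attended to, else use visual."""
    if physical_state == "running":
        return "divert-to-messaging"  # alert the user when available again
    if physical_state == "walking" or not has_display:
        return "audio"                # user unlikely to watch a screen
    return "visual"                   # e.g. sitting or standing still
```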
  • the nature of the interface used can be modified according to the detected user location.
  • a mobile phone handset could use a ring-tone such as a voice saying "shop at the Harrods' sale" if it is determined by the location agent 62 that the user is walking along Knightsbridge (where the famous shop 'Harrods' is located).
  • a phone could use an appropriate piece of music if it is determined by the location agent 62 that the user is in church.
  • the visual display can be altered according to the determined location.
  • the screen style of the visual interface can be made to reflect the theme of the location. For example if the user is viewing web pages, and is walking around a museum, the web pages viewed as the user moves to different locations change to reflect the area of the museum.

Abstract

A human computer interface device is provided in which the operation of the user interface depends upon detected physical and location attributes of the user. If a user is moving, the user interface switches to auditory output only. Detected location attributes are also used to modify the operation of the user interface. Also provided is a mobile conferencing device incorporating such a human computer interface device. In this case the ring-tone or the visual display can be tailored according to the detected location.

Description

PERSONAL MOBILE COMMUNICATION DEVICE
This invention relates to a device in which the user interface of a mobile personal device is modified according to physical and location context. In particular, this invention relates to a mobile teleconferencing device. In a telecommunications conferencing (teleconferencing) facility, images are generated relating to a "virtual meeting space". Individuals at a plurality of locations remote from each other, and accessing the facility using different types of access device, may interact with each other in a manner which emulates a conventional meeting. When the user is using a teleconferencing facility, the physical and location attributes may be used to modify a representation of the user. The detected physical and location attributes may also be used to modify the interface of the teleconferencing device.
Individual users are represented in the virtual meeting space display by computer-generated representations of the users, known as "avatars" (or "icons"). These may be derived from video images of the users, either live or retrieved from a store, but usually they are digitally generated representations. In general, each user is able to select the appearance of his or her avatar in the virtual space from a menu of characteristics. Alternatively, each individual user may be able to select, for his own viewpoint, how each of the other users' avatars will appear. Other characteristics of the meeting space, such as the colour and shape of the elements of the meeting space, may also be selectable by the user.
According to the present invention there is provided a human computer interface device comprising a user interface device comprising a visual display device and an audio output device; and a physical detector for detecting physical attributes of a user; in which the visual display device is arranged to inhibit output via the visual display device when the user is not stationary.
In a preferred embodiment the device further comprises a location detector for detecting location attributes of the user, the operation of the user interface device being dependent upon the detected location attributes of the user. Preferably the output of the audio output device is dependent upon the location attributes of the user, and preferably the output of the visual display device is dependent upon the location attributes of the user.
According to another aspect of the invention there is provided a human computer interface device comprising a user interface device comprising a visual display device and an audio output device; a physical detector for detecting physical attributes of a user; and a location detector for detecting location attributes of the user, in which the operation of the user interface device is dependent upon the detected location attributes of the user.
Preferably the output of the audio output device is dependent upon the location attributes of the user, and preferably the output of the visual display device is dependent upon the location attributes of the user.
According to the invention there is also provided a mobile conferencing device including such a human computer interfacing device.
An embodiment of the invention will now be described by way of example only with reference to the accompanying drawings, in which:
Figure 1 shows a network with human/machine interface units serving teleconference users via respective client apparatuses;
Figure 2 is a representation of a teleconference as displayed on an interface unit of Figure 1;
Figure 3a is a block diagram showing a client apparatus of Figure 1 which incorporates a physical and location sensor;
Figure 3b is a functional block diagram showing the logical operation of the apparatus shown in Figure 3a; and
Figures 4 to 7 are examples of representations of a user as shown on an interface unit of Figure 1 , in which the representation of the user is dependent upon location and physical data collected using the apparatus shown in Figure 3a.
Figure 1 shows a network serving four users 1, 2, 3, 4 (not shown) allowing them to interact in a virtual teleconference. Each user has a respective human/machine interface unit 21, 22, 23, 24, which includes video and/or audio equipment for the user to see and/or hear what is happening in the virtual meeting space. The interface unit includes user input devices (e.g. audio input, keyboard or keypad, computer "mouse" etc.) to enable the user to provide input to the virtual meeting space. Each interface unit 21, 22, 23, 24 is connected to a respective client apparatus 11, 12, 13, 14 which provides an interface between the user and a main server 10 which controls the operation of the meeting space. The server 10 has, as a further input, a virtual reality (VR) definition store 30 which maintains permanent data defining the virtual meeting space (also referred to as the meeting space definition unit in the specification). The control of the meeting space is carried out by interaction between the client apparatuses 11, 12, 13, 14 and the server 10. The display control functions may take place in the server 10, or may be distributed in the client apparatus 11, 12, 13, 14, depending on the functionality available in the client apparatus. Links between the client apparatus 11, 12, 13, 14 and the server 10 may be permanent hard-wired connections, virtual connections (permanent as perceived by the user, but provided over shared lines by the telecommunications provider), or dial-up connections (available on demand, and provided on a pay-per-use basis), and may include radio links, for example to a mobile device. The server 10 may have, in addition to the server functionality, similar functionality to the client apparatus 11, 12, 13, 14, but as shown the server 10 is dedicated to the server function only.
An example of an image representing a meeting space as it appears on a display device is shown in Figure 2. In this example, users 2, 3 and 4 are represented by avatars 42, 43 and 44 respectively.
Referring again to Figure 1, in response to inputs from one of the users (e.g. user 1) through his respective user interface 21, the client apparatus 11 transmits these inputs to the main server 10 which, in accordance with the meeting space definition unit 30, controls the images to be represented on the other users' screens in the human machine interface units 22, 23, 24 to represent the activities of the user 1, input through interface device 21. As a very simple example, the actions of the user 1 when first establishing contact with the meeting space are translated by the client apparatus 11, and converted by the server 10, into a representation of the user 1 entering the meeting space, which is in turn passed to the individual clients 12, 13, 14 to be represented as the avatar of the user 1 moving into the field of view of the display devices 22, 23, 24. The manner of representation of the individual user 1 in the virtual space, for example the appearance of the avatar in terms of age, sex, hair colour etc., may be selected either by the user 1 through his respective client device 11, or by each receiving user 2, 3, 4 in the meeting space, who may each select an avatar according to his own requirements to represent the user 1. Similarly, some parts of the virtual meeting space may be defined centrally in the meeting space definition unit 30, whereas other aspects may be defined by each individual client apparatus 11, 12, 13, 14 independently of the others. Such definitions may include colour schemes, the relative locations in the virtual meeting space of the individual users 1, 2, 3, 4, etc.
The client apparatus 11 is a mobile device, and in the embodiment of the invention described here the mobile device 11 is a wireless palmtop computer. In this specification the term mobile device is intended to refer to all computing devices which may be carried around or worn by a user, and may be used whilst the user is moving around and active in other tasks. Mobile devices are distinguished from portable devices which are carried to a location and then used whilst the user is stationary.
However, a mobile device may or may not have visual display capabilities. Even if the device does have such capabilities, the user 1 may be walking or running or otherwise distracted, and may not be able to attend to a visual display. The representation of the user 1 is displayed to the other users 2, 3, 4 as shown in Figure 4, so that the other users are aware that user 1 is on line, but that the user 1 may not have a visual link to the teleconference.
For users using a mobile device there are other aspects of the service to consider besides the fact that the client device 11 may not have as sophisticated input and output capabilities as the other client devices 12, 13, 14. Privacy may be an issue: it is possible that other people might move in and out of the user's proximity during a conversation. In order to make the other users in a conference aware of potential privacy issues, the user's avatar is changed as shown in Figure 5 to indicate that the user is on line, but that the user may not be in private. The user 1 can indicate that there is a privacy issue manually, by transmitting a signal via the client 11 to the server 10 using a predetermined key or sequence of keys. The device 11 has an audio input, and as an alternative to using a manually entered key or sequence of keys to indicate that the user is not in private, the received audio signal is analysed, using known speaker recognition algorithms, to determine whether speech other than that from the user is detected. The device 11 may also be equipped with a video input, in which case the video signal received via the video input can be analysed using known image classification algorithms, for example to detect whether skin is present in the captured image, or to detect the number of faces in the captured image. The results of such image classification may then be used to indicate to the server 10 that the user is not in private, and the user's avatar is modified accordingly.
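The inference described above can be summarised as a simple decision rule. The sketch below (Python, with invented function and parameter names; the patent does not specify an implementation) combines the output of a speaker-recognition stage and a face-counting image classifier into the single "in private" flag that the client would report to the server:

```python
# Hypothetical sketch of the privacy inference: the outputs of a
# speaker-recognition stage and a face-counting image classifier are
# combined into one flag. Function and parameter names are invented.

def user_in_private(non_user_speech_detected: bool, face_count: int) -> bool:
    """Return True if the user appears to be alone.

    non_user_speech_detected -- speech not matching the user's voice model
        was found in the audio input.
    face_count -- faces found in the current video frame (the user's own
        face counts as one).
    """
    if non_user_speech_detected:
        return False  # someone else is talking nearby
    if face_count > 1:
        return False  # more than the user's own face is in view
    return True

# The client would send this flag to the server, which then modifies
# the avatar as in Figure 5.
print(user_in_private(False, 1))  # True: user alone
print(user_in_private(True, 1))   # False: another speaker detected
```

In practice the two inputs would come from the speaker recognition and image classification algorithms the text mentions; here they are supplied directly.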
Another issue which is relevant to mobile users using radio links to access the virtual meeting space is Quality of Service (QoS). The fixed telephony network uses 64 kbits/s per voice channel, while the mobile network uses 9.6 kbits/s per voice channel. The average number of bits per second transmitted from the client device 11 to the server 10 is monitored by the server 10, and the avatar of the user 1 is modified to be more or less opaque as a function of the average number of bits per second received by the server 10 from the client device 11. Hence the opacity of the avatar representing the user 1 is related to the QoS as perceived by the other users 2, 3, 4. In this embodiment of the invention, the more opaque the avatar, the better the perceived QoS.
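As a rough illustration, the mapping from measured bit rate to avatar opacity could be a clamped linear function between the two channel rates quoted above. The linear form is an assumption; the text only requires opacity to increase with the received bit rate:

```python
# Sketch of a possible bit-rate-to-opacity mapping. The 9.6 kbits/s and
# 64 kbits/s figures come from the text; the linear interpolation between
# them is an illustrative assumption.

MOBILE_RATE = 9_600   # bits/s, mobile network voice channel
FIXED_RATE = 64_000   # bits/s, fixed telephony network voice channel

def avatar_opacity(avg_bits_per_second: float) -> float:
    """Map the measured average bit rate to an opacity in [0.0, 1.0].

    The more opaque the avatar, the better the perceived QoS.
    """
    span = FIXED_RATE - MOBILE_RATE
    opacity = (avg_bits_per_second - MOBILE_RATE) / span
    return max(0.0, min(1.0, opacity))  # clamp to the displayable range

print(avatar_opacity(64_000))  # 1.0 -> fully opaque, fixed-network quality
print(avatar_opacity(9_600))   # 0.0 -> fully transparent, worst case
```

The server would evaluate this function on the monitored average bit rate and send the resulting opacity to the other clients along with the avatar state.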
For a mobile user, the attention paid to the virtual meeting space varies in dependence upon the 'real world' task currently being carried out. For example, whilst travelling on a train a user may be required to show a ticket to the ticket inspector, or somebody may speak to the user to ask the time. If the user is walking, running, or unable to remain still for some reason, then the attention paid to the virtual meeting space will be more limited than otherwise. If the user is in a noisy environment, again, the attention paid to the virtual meeting space will be less than it would be in a very quiet environment. Detection of a user's physical and location attributes is discussed in more detail with reference to Figures 3a and 3b.
The audio environment is analysed using the audio signal received via the audio input on the client apparatus 11. It is also possible for the user to use a predetermined key or sequence of keys to indicate via the client apparatus 11 to the server 10 that he is distracted or on the move. Figure 6 shows a representation of a user who is on-line but distracted, and Figure 7 shows a representation of a user who is on line but on the move.
The user interface unit 21 includes a physical and location sensor 50 as shown in Figure 3, as well as a visual display 60 and an audio input/output device 61. The physical and location sensor 50 is connected to the client apparatus 11 by a serial interface 51. A low acceleration detector 52 measures low-force acceleration in two directions using an ADXL202. A high acceleration detector 53 measures high-force acceleration in three directions using an ACH04-08-05, available from Measurement Specialties Incorporated (which can be referenced via the Uniform Resource Locator (URL) http://www.msiusa.com on the Internet). A direction detector 54 is provided using a compass which gives an absolute measurement of the orientation of the client apparatus: an HMC2003, available from Honeywell (URL http://www.ssec.honeywell.com), is used. The compass is a three-axis magnetometer sensitive to fields along the length, width and height of the device. A direction and velocity detector 55 is provided using an ENC Piezoelectric Vibrating Gyroscope (part number S42E-2, sold under the registered trademark GYROSTAR) available from Murata Manufacturing Company Ltd. (URL http://www.murata.com). The gyroscope measures angular velocity, giving speed and direction in two directions in each axis of rotation (i.e. six measurements are provided). The acceleration detectors 52, 53, the direction detector 54 and the velocity and direction detector 55 are connected via a multiplexer (MUX) 56 to a microcontroller 57, where the outputs are analysed as will be described later.
A global position detector 58 is provided which measures the absolute location of the device using a Global Positioning System (GPS) receiver which receives signal from GPS satellites.
GPS provides specially coded satellite signals that can be processed in a GPS receiver, enabling the receiver to compute position, velocity and time. The nominal GPS Operational Constellation consists of 24 satellites that orbit the earth twice a day, 11,000 miles above the earth. (There are often more than 24 operational satellites as new ones are launched to replace older satellites.) The satellite orbits repeat almost the same ground track (as the earth turns beneath them) once each day. There are six orbital planes (with nominally four satellites in each), equally spaced (60 degrees apart) and inclined at about fifty-five degrees with respect to the equatorial plane. This constellation provides the user with five to eight satellites visible from any point on the earth. The GPS satellites orbit the earth transmitting their precise position and elevation. A GPS receiver acquires the signal, then measures the interval between transmission and receipt of the signal to determine the distance between the receiver and the satellite. Once the receiver has calculated this data for at least three satellites, its location on the earth's surface can be determined.
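The ranging principle just described can be illustrated with a toy calculation. The sketch below works in two dimensions and ignores the receiver clock bias that a real GPS solution must also estimate; it simply converts travel time to distance and intersects three range circles:

```python
import math

# Toy 2D illustration of GPS ranging: distance = travel time * speed of
# light, then a position fix from three ranges. Real GPS solves in 3D and
# also estimates a receiver clock bias; this sketch only shows the geometry.

C = 299_792_458.0  # speed of light, m/s

def pseudorange(t_transmit: float, t_receive: float) -> float:
    """Distance implied by the signal's travel time."""
    return (t_receive - t_transmit) * C

def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) given three beacon positions and ranges.

    Subtracting the first circle equation from the other two yields a
    2x2 linear system, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Receiver at (3, 4); exact ranges to beacons at (0,0), (10,0) and (0,10).
r1 = math.hypot(3, 4)
r2 = math.hypot(3 - 10, 4)
r3 = math.hypot(3, 4 - 10)
print(trilaterate_2d((0, 0), r1, (10, 0), r2, (0, 10), r3))  # ~ (3.0, 4.0)
```

A real receiver needs a fourth satellite to solve for its clock offset as well, which is why four satellites are normally required for a full 3D fix.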
The receiver used in this embodiment of the invention is a Garmin GPS35 unit (available, for example, from Lowe Electronics Ltd in the UK). GPS signals do not propagate inside buildings, so a local position detector 59 is also provided which uses local area beacons (LABs) (not shown): low-power 418 MHz AM radio transmitters (such as the CR91Y, CR72P, CR73Q or CR74R from RF Solutions) at known locations within a building. Radio or infrared transmitters could be used, although radio provides a more robust solution since line-of-sight connections are not required.
Once the "Bluetooth" radio-based system becomes available, this will also provide a suitable solution. Bluetooth is a standard for wireless connectivity, designed to replace cables between portable consumer devices such as cellular phones, laptop computers, personal digital assistants, digital cameras, and many other products. The Bluetooth version 1.0 specification was agreed in July 1999, and the first products are expected on the market in mid-2000.
Software on the microcontroller 57 gathers sensor data from the detectors 52, 53, 54, 55, via the MUX 56 which is configured to read each device in turn via an analogue port. The output from the global position detector 58 is read via a serial port connection and the output from the local position detector 59 is connected to a digital input on the microcontroller 57. Also provided is a location database 64 which is accessed by the microcontroller 57 to determine location names.
Figure 3b is a functional block diagram showing the logical operation of the physical and location detector 50. A location agent 62, implemented in software on the microcontroller 57, uses location data gathered by the global position detector 58 and the local position detector 59, analyses this data and makes the analysis available to the client apparatus 11. The location agent 62 also receives information about velocity and direction, measured by the direction detector 54 and the velocity and direction detector 55, from a physical agent 63. The physical agent is also implemented in software on the microcontroller 57. The location agent determines whether GPS is available, and whether the global location measured by the global position detector 58 is based on a signal from three or more satellites. The local position detector 59 detects signals from LABs, each of which has a unique identifier, and the location agent 62 accesses the location database 64 to determine the location name associated with a received LAB identifier. The location agent 62 must be able to determine the following:
• Is the device inside or outside? If fewer than three GPS signals are received, the device is determined to be inside.
• Is the device moving? The velocity measured by the global position detector 58 (if the device is outside) and the velocity measured via the physical agent 63 are used to determine whether the device is moving.
• Location of the device. Latitude and longitude, if the device is outside, are measured via the global position detector 58, and/or a location name is determined using the local position detector 59 and the location database 64.
• Direction of movement. This may be determined by the global position detector and/or by direction data received from the physical agent.
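A minimal sketch of the location agent's decision rules listed above, assuming invented LAB identifiers, location names and a velocity threshold (none of which are specified in the text):

```python
# Illustrative sketch of the location agent's logic. The LAB identifiers,
# location names and the velocity threshold are invented; the inside/outside
# rule (fewer than three satellites => inside) follows the text.

LOCATION_DATABASE = {      # LAB identifier -> location name (illustrative)
    0x01: "Meeting room 3",
    0x02: "Main entrance",
}

MOVING_THRESHOLD = 0.2     # m/s, assumed noise floor for "moving"

def analyse_location(satellites_in_view: int,
                     gps_velocity: float,
                     physical_velocity: float,
                     lab_identifier=None):
    """Combine GPS and local-beacon data into the attributes listed above."""
    inside = satellites_in_view < 3
    # Outside: GPS velocity is available; inside: rely on the physical agent.
    velocity = physical_velocity if inside else max(gps_velocity,
                                                    physical_velocity)
    return {
        "inside": inside,
        "moving": velocity > MOVING_THRESHOLD,
        "location_name": LOCATION_DATABASE.get(lab_identifier),
    }

print(analyse_location(5, 1.4, 1.3))         # outside and moving, no LAB name
print(analyse_location(1, 0.0, 0.05, 0x01))  # inside "Meeting room 3", still
```

The returned dictionary stands in for the analysis the location agent 62 makes available to the client apparatus 11.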
The physical agent 63 analyses physical sensor data and makes this available to the location agent 62. The physical agent is used to determine the following user attributes.
• Standing.
• Walking.
• Sitting.
• Cadence (velocity).
• Acceleration.
• Shock.
The complex nature of the physical data makes the use of simple rules unreliable. The physical agent 63 of this embodiment of the invention therefore uses Hidden Markov Models (HMMs) to make the determinations listed above based on the inputs from the detectors 52, 53, 54, 55. A good description of an implementation of HMMs (as applied to speech recognition, but the principles are the same) may be found in "Hidden Markov Models for Automatic Speech Recognition: Theory and Application", S. J. Cox, British Telecom Technology Journal, Vol. 6, No. 2, April 1988. In other embodiments of the invention it is possible for the physical agent to analyse visual and audio information received from the visual and audio input/output device provided as part of the interface unit 21.
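As a toy illustration of the HMM approach, the sketch below runs Viterbi decoding over quantised accelerometer readings to recover the most likely sequence of physical states. The states come from the list above; all probabilities are invented for illustration and would in practice be trained from labelled sensor data:

```python
# Toy HMM for physical-state classification. States follow the text;
# transition/emission probabilities are invented for illustration only.

STATES = ["sitting", "standing", "walking"]

start_p = {"sitting": 0.5, "standing": 0.3, "walking": 0.2}
trans_p = {                                 # people rarely jump states
    "sitting":  {"sitting": 0.8,  "standing": 0.15, "walking": 0.05},
    "standing": {"sitting": 0.2,  "standing": 0.6,  "walking": 0.2},
    "walking":  {"sitting": 0.05, "standing": 0.25, "walking": 0.7},
}
emit_p = {                                  # walking shakes the sensors
    "sitting":  {"low_accel": 0.95, "high_accel": 0.05},
    "standing": {"low_accel": 0.9,  "high_accel": 0.1},
    "walking":  {"low_accel": 0.2,  "high_accel": 0.8},
}

def viterbi(obs):
    """Return the most probable state sequence for the observations."""
    # Each layer maps state -> (best probability, best path ending here).
    v = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in STATES}]
    for o in obs[1:]:
        layer = {}
        for s in STATES:
            prob, path = max(
                (v[-1][prev][0] * trans_p[prev][s] * emit_p[s][o],
                 v[-1][prev][1]) for prev in STATES)
            layer[s] = (prob, path + [s])
        v.append(layer)
    return max(v[-1].values())[1]

print(viterbi(["low_accel", "high_accel", "high_accel", "high_accel"]))
```

A sustained run of high-acceleration readings drives the decoded sequence towards "walking", which is the smoothing behaviour that makes HMMs more reliable than per-sample rules on noisy sensor data.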
The client apparatus 11 has the physical information made available to it via the physical agent 63, and the location information made available to it via the location agent 62. Audio and/or visual information is used on the mobile device to provide the user with information alerts, and for teleconferencing activity. Spatial audio is also used for information alerts and for spatialised teleconferencing, which appears more natural to the user.
The interface used by the device for information alerts, and the interface used for teleconferencing, are dependent on the user's current location and physical context (i.e. whether the user is standing, walking, sitting, etc.). If the user is unlikely to be able to attend to a visual display, an audio interface is used. If the user is likely to be unavailable (e.g. running), then the device could divert alerts to a messaging service, which could then alert the user when it is determined that he is available again. In embodiments of the invention incorporating audio input and analysis it is also possible to configure the audio output on the user's wearable or handheld device to match the acoustics, ambient noise level etc. of the real-world space in which the user is located. The nature of the interface used (for example the sound of a mobile device's alert or 'ring-tone') can be modified according to the detected user location. For example, a mobile phone handset could use a ring-tone such as a voice saying "shop at the Harrods' sale" if it is determined by the location agent 62 that the user is walking along Knightsbridge (where the famous shop 'Harrods' is located), or an appropriate piece of music if it is determined by the location agent 62 that the user is in church. Similarly to changing the user's audio interface in dependence on the detected location, the visual display can be altered according to the determined location: the screen style of the visual interface can be made to reflect the theme of the location. For example, if the user is viewing web pages while walking around a museum, the pages viewed change as the user moves to different locations, to reflect the area of the museum.
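The policy described in this paragraph can be sketched as a small decision function (function and state names are illustrative, not taken from the patent):

```python
# Hedged sketch of the context-dependent interface policy: running diverts
# alerts to messaging, and a user who cannot attend a visual display falls
# back to audio. All names and labels are invented for illustration.

def select_interface(physical_state: str, can_attend_display: bool) -> str:
    """Choose how alerts and conference media are delivered to the user."""
    if physical_state == "running":
        # User is unavailable: hold alerts until availability is regained.
        return "divert_to_messaging"
    if physical_state == "walking" or not can_attend_display:
        # User cannot attend a visual display: fall back to audio.
        return "audio_only"
    return "audio_and_visual"

print(select_interface("running", False))  # divert_to_messaging
print(select_interface("walking", True))   # audio_only
print(select_interface("sitting", True))   # audio_and_visual
```

The physical state here is exactly the classification produced by the physical agent, so the two components compose naturally.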
In embodiments of the invention including the analysis of visual and audio information received from a visual and audio input/output device provided as part of the interface unit 21, it is possible to use standard speech and video analysis algorithms to provide a more sophisticated interface to the user. There are standard algorithms for identifying speech within an audio stream, so it would be possible to make a mobile phone handset that automatically diverted calls, or changed its ring-tone, if the user is detected to be currently in conversation with someone. Visual information can also be analysed using standard algorithms such as skin detection or face detection, and this information can be used along with audio analysis to infer, for example, whether the user is likely to be in private.

Claims

1. A human computer interface device comprising a user interface device comprising a visual display device and an audio output device; and a physical detector for detecting physical attributes of a user; in which the device is arranged to inhibit output via the visual display device when the user is not stationary.
2. A device according to claim 1, further comprising a location detector for detecting location attributes of the user, and in which the operation of the user interface device is dependent upon the detected location attributes of the user.
3. A device according to claim 2 in which the output of the audio output device is dependent upon the location attributes of the user.
4. A device according to claim 2 or claim 3 in which the output of the visual display device is dependent upon the location attributes of the user.
5. A human computer interface device comprising a user interface device comprising a visual display device and an audio output device; a physical detector for detecting physical attributes of a user; and a location detector for detecting location attributes of the user; in which the operation of the user interface device is dependent upon the detected location attributes of the user.
6. A device according to claim 5 in which the output of the audio output device is dependent upon the location attributes of the user.
7. A device according to claim 5 or claim 6 in which the output of the visual display device is dependent upon the location attributes of the user.
8. A mobile conferencing device including a human computer interface device according to any one of the preceding claims.
PCT/GB2000/003970 1999-10-18 2000-10-16 Personal mobile communication device WO2001029642A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CA002386407A CA2386407C (en) 1999-10-18 2000-10-16 Personal mobile communication device
EP00968118A EP1222518B1 (en) 1999-10-18 2000-10-16 Personal mobile communication device
DE60010915T DE60010915T2 (en) 1999-10-18 2000-10-16 PERSONAL MOBILE COMMUNICATION DEVICE
JP2001532372A JP2003512798A (en) 1999-10-18 2000-10-16 Personal mobile communication device
AU78073/00A AU7807300A (en) 1999-10-18 2000-10-16 Personal mobile communication device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP99308195 1999-10-18
EP99308195.9 1999-10-18

Publications (1)

Publication Number Publication Date
WO2001029642A1 true WO2001029642A1 (en) 2001-04-26

Family

ID=8241677

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2000/003970 WO2001029642A1 (en) 1999-10-18 2000-10-16 Personal mobile communication device

Country Status (7)

Country Link
EP (1) EP1222518B1 (en)
JP (1) JP2003512798A (en)
CN (1) CN1379868A (en)
AU (1) AU7807300A (en)
CA (1) CA2386407C (en)
DE (1) DE60010915T2 (en)
WO (1) WO2001029642A1 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100704622B1 (en) * 2004-09-24 2007-04-10 삼성전자주식회사 Method and apparatus for providing user interface for multistreaming audio control
JP4766696B2 (en) * 2007-03-06 2011-09-07 日本電信電話株式会社 Interface device and interface system
US8295879B2 (en) 2008-05-30 2012-10-23 Motorola Mobility Llc Devices and methods for initiating functions based on movement characteristics relative to a reference
US9002416B2 (en) 2008-12-22 2015-04-07 Google Technology Holdings LLC Wireless communication device responsive to orientation and movement
KR101653432B1 (en) * 2009-01-29 2016-09-01 임머숀 코퍼레이션 Systems and methods for interpreting physical interactions with a graphical user interface

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0453128A2 (en) * 1990-04-12 1991-10-23 AT&T Corp. Multiple call control method in a multimedia conferencing system
US5570301A (en) * 1994-07-15 1996-10-29 Mitsubishi Electric Information Technology Center America, Inc. System for unencumbered measurement and reporting of body posture
WO1998025423A1 (en) * 1996-12-02 1998-06-11 Nokia Telecommunications Oy Maintenance of group call in mobile communication system
US5907604A (en) * 1997-03-25 1999-05-25 Sony Corporation Image icon associated with caller ID

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
JP3292248B2 (en) * 1991-05-29 2002-06-17 富士通株式会社 Teleconferencing system
JP2539153B2 (en) * 1993-03-19 1996-10-02 インターナショナル・ビジネス・マシーンズ・コーポレイション Virtual conference system terminal device and virtual conference system
JPH0730877A (en) * 1993-07-12 1995-01-31 Oki Electric Ind Co Ltd Inter-multi location multimedia communications conference system
US5491743A (en) * 1994-05-24 1996-02-13 International Business Machines Corporation Virtual conference system and terminal apparatus therefor
JPH0891756A (en) * 1994-09-22 1996-04-09 Toshiba Elevator Technos Kk Safety protecting fence
JPH09247638A (en) * 1996-03-04 1997-09-19 Atsushi Matsushita Video conference system
JPH10304432A (en) * 1997-04-30 1998-11-13 Matsushita Electric Works Ltd Location management system
WO1999030494A1 (en) * 1997-12-09 1999-06-17 British Telecommunications Public Limited Company Conference facility

Non-Patent Citations (1)

Title
COLOMBO C ET AL: "PROTOTYPE OF A VISION-BASED GAZE-DRIVEN MAN-MACHINE INTERFACE", PROCEEDINGS OF THE IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS),US,LOS ALAMITOS, IEEE COMP. SOC. PRESS, 1995, pages 188 - 192, XP000740890, ISBN: 0-7803-3006-4 *

Cited By (281)

Publication number Priority date Publication date Assignee Title
WO2003100585A3 (en) * 2002-05-24 2004-07-01 Koninkl Philips Electronics Nv Context-aware portable device
WO2003100585A2 (en) * 2002-05-24 2003-12-04 Koninklijke Philips Electronics N.V. Context-aware portable device
EP1526706A2 (en) * 2003-10-22 2005-04-27 Xerox Corporation System and method for providing communication channels that each comprise at least one property dynamically changeable during social interactions
US20110292160A1 (en) * 2005-02-23 2011-12-01 AOL, Inc. Configuring output on a communication device
US8694655B2 (en) * 2005-02-23 2014-04-08 Facebook, Inc. Configuring output on a communication device
US9177075B2 (en) 2005-02-23 2015-11-03 Facebook, Inc. Monitoring and configuring communication sessions
US11025860B2 (en) 2005-02-23 2021-06-01 Facebook, Inc. Configuring output on a communication device
US11588770B2 (en) 2007-01-05 2023-02-21 Snap Inc. Real-time display of multiple images
US10862951B1 (en) 2007-01-05 2020-12-08 Snap Inc. Real-time display of multiple images
US10440420B2 (en) 2011-07-12 2019-10-08 Snap Inc. Providing visual content editing functions
US10334307B2 (en) 2011-07-12 2019-06-25 Snap Inc. Methods and systems of providing visual content editing functions
US11750875B2 (en) 2011-07-12 2023-09-05 Snap Inc. Providing visual content editing functions
US11451856B2 (en) 2011-07-12 2022-09-20 Snap Inc. Providing visual content editing functions
US10999623B2 (en) 2011-07-12 2021-05-04 Snap Inc. Providing visual content editing functions
US9459778B2 (en) 2011-07-12 2016-10-04 Mobli Technologies 2010 Ltd. Methods and systems of providing visual content editing functions
WO2013008238A1 (en) * 2011-07-12 2013-01-17 Mobli Technologies 2010 Ltd. Methods and systems of providing visual content editing functions
US11182383B1 (en) 2012-02-24 2021-11-23 Placed, Llc System and method for data collection to validate location data
US11734712B2 (en) 2012-02-24 2023-08-22 Foursquare Labs, Inc. Attributing in-store visits to media consumption based on data collected from user devices
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars
US9544729B2 (en) 2012-11-02 2017-01-10 Ge Intelligent Platforms, Inc. Apparatus and method for geolocation intelligence
US9936030B2 (en) 2014-01-03 2018-04-03 Investel Capital Corporation User content sharing system and method with location-based external content integration
US10080102B1 (en) 2014-01-12 2018-09-18 Investment Asset Holdings Llc Location-based messaging
US10349209B1 (en) 2014-01-12 2019-07-09 Investment Asset Holdings Llc Location-based messaging
US10990697B2 (en) 2014-05-28 2021-04-27 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US10572681B1 (en) 2014-05-28 2020-02-25 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US11625443B2 (en) 2014-06-05 2023-04-11 Snap Inc. Web document enhancement
US11921805B2 (en) 2014-06-05 2024-03-05 Snap Inc. Web document enhancement
US11166121B2 (en) 2014-06-13 2021-11-02 Snap Inc. Prioritization of messages within a message collection
US10623891B2 (en) 2014-06-13 2020-04-14 Snap Inc. Prioritization of messages within a message collection
US10524087B1 (en) 2014-06-13 2019-12-31 Snap Inc. Message destination list mechanism
US10200813B1 (en) 2014-06-13 2019-02-05 Snap Inc. Geo-location based event gallery
US9825898B2 (en) 2014-06-13 2017-11-21 Snap Inc. Prioritization of messages within a message collection
US10182311B2 (en) 2014-06-13 2019-01-15 Snap Inc. Prioritization of messages within a message collection
US11317240B2 (en) 2014-06-13 2022-04-26 Snap Inc. Geo-location based event gallery
US10448201B1 (en) 2014-06-13 2019-10-15 Snap Inc. Prioritization of messages within a message collection
US10779113B2 (en) 2014-06-13 2020-09-15 Snap Inc. Prioritization of messages within a message collection
US10659914B1 (en) 2014-06-13 2020-05-19 Snap Inc. Geo-location based event gallery
US10154192B1 (en) 2014-07-07 2018-12-11 Snap Inc. Apparatus and method for supplying content aware photo filters
US11595569B2 (en) 2014-07-07 2023-02-28 Snap Inc. Supplying content aware photo filters
US10432850B1 (en) 2014-07-07 2019-10-01 Snap Inc. Apparatus and method for supplying content aware photo filters
US11849214B2 (en) 2014-07-07 2023-12-19 Snap Inc. Apparatus and method for supplying content aware photo filters
US11122200B2 (en) 2014-07-07 2021-09-14 Snap Inc. Supplying content aware photo filters
US10602057B1 (en) 2014-07-07 2020-03-24 Snap Inc. Supplying content aware photo filters
US10423983B2 (en) 2014-09-16 2019-09-24 Snap Inc. Determining targeting information based on a predictive targeting model
US11625755B1 (en) 2014-09-16 2023-04-11 Foursquare Labs, Inc. Determining targeting information based on a predictive targeting model
US11281701B2 (en) 2014-09-18 2022-03-22 Snap Inc. Geolocation-based pictographs
US10824654B2 (en) 2014-09-18 2020-11-03 Snap Inc. Geolocation-based pictographs
US11741136B2 (en) 2014-09-18 2023-08-29 Snap Inc. Geolocation-based pictographs
US11216869B2 (en) 2014-09-23 2022-01-04 Snap Inc. User interface to augment an image using geolocation
US11411908B1 (en) 2014-10-02 2022-08-09 Snap Inc. Ephemeral message gallery user interface with online viewing history indicia
US20170374003A1 (en) 2014-10-02 2017-12-28 Snapchat, Inc. Ephemeral gallery of ephemeral messages
US11038829B1 (en) 2014-10-02 2021-06-15 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US11522822B1 (en) 2014-10-02 2022-12-06 Snap Inc. Ephemeral gallery elimination based on gallery and message timers
US10476830B2 (en) 2014-10-02 2019-11-12 Snap Inc. Ephemeral gallery of ephemeral messages
US10616476B1 (en) 2014-11-12 2020-04-07 Snap Inc. User interface for accessing media at a geographic location
US11190679B2 (en) 2014-11-12 2021-11-30 Snap Inc. Accessing media at a geographic location
US11956533B2 (en) 2014-11-12 2024-04-09 Snap Inc. Accessing media at a geographic location
US9843720B1 (en) 2014-11-12 2017-12-12 Snap Inc. User interface for accessing media at a geographic location
US11803345B2 (en) 2014-12-19 2023-10-31 Snap Inc. Gallery of messages from individuals with a shared interest
US11783862B2 (en) 2014-12-19 2023-10-10 Snap Inc. Routing messages by message parameter
US11372608B2 (en) 2014-12-19 2022-06-28 Snap Inc. Gallery of messages from individuals with a shared interest
US10580458B2 (en) 2014-12-19 2020-03-03 Snap Inc. Gallery of videos set to an audio time line
US10811053B2 (en) 2014-12-19 2020-10-20 Snap Inc. Routing messages by message parameter
US11250887B2 (en) 2014-12-19 2022-02-15 Snap Inc. Routing messages by message parameter
US11301960B2 (en) 2015-01-09 2022-04-12 Snap Inc. Object recognition based image filters
US10157449B1 (en) 2015-01-09 2018-12-18 Snap Inc. Geo-location-based image filters
US11734342B2 (en) 2015-01-09 2023-08-22 Snap Inc. Object recognition based image overlays
US10380720B1 (en) 2015-01-09 2019-08-13 Snap Inc. Location-based image filters
US11388226B1 (en) 2015-01-13 2022-07-12 Snap Inc. Guided personal identity based actions
US11249617B1 (en) 2015-01-19 2022-02-15 Snap Inc. Multichannel system
US10123167B2 (en) 2015-01-26 2018-11-06 Snap Inc. Content request by location
US10123166B2 (en) 2015-01-26 2018-11-06 Snap Inc. Content request by location
US9801018B2 (en) 2015-01-26 2017-10-24 Snap Inc. Content request by location
US11910267B2 (en) 2015-01-26 2024-02-20 Snap Inc. Content request by location
US10932085B1 (en) 2015-01-26 2021-02-23 Snap Inc. Content request by location
US11528579B2 (en) 2015-01-26 2022-12-13 Snap Inc. Content request by location
US10536800B1 (en) 2015-01-26 2020-01-14 Snap Inc. Content request by location
US10223397B1 (en) 2015-03-13 2019-03-05 Snap Inc. Social graph based co-location of network users
US10616239B2 (en) 2015-03-18 2020-04-07 Snap Inc. Geo-fence authorization provisioning
US11902287B2 (en) 2015-03-18 2024-02-13 Snap Inc. Geo-fence authorization provisioning
US10893055B2 (en) 2015-03-18 2021-01-12 Snap Inc. Geo-fence authorization provisioning
US11662576B2 (en) 2015-03-23 2023-05-30 Snap Inc. Reducing boot time and power consumption in displaying data content
US11320651B2 (en) 2015-03-23 2022-05-03 Snap Inc. Reducing boot time and power consumption in displaying data content
US10948717B1 (en) 2015-03-23 2021-03-16 Snap Inc. Reducing boot time and power consumption in wearable display systems
US10911575B1 (en) 2015-05-05 2021-02-02 Snap Inc. Systems and methods for story and sub-story navigation
US9881094B2 (en) 2015-05-05 2018-01-30 Snap Inc. Systems and methods for automated local story generation and curation
US11496544B2 (en) 2015-05-05 2022-11-08 Snap Inc. Story and sub-story navigation
US11392633B2 (en) 2015-05-05 2022-07-19 Snap Inc. Systems and methods for automated local story generation and curation
US11449539B2 (en) 2015-05-05 2022-09-20 Snap Inc. Automated local story generation and curation
US10592574B2 (en) 2015-05-05 2020-03-17 Snap Inc. Systems and methods for automated local story generation and curation
US10993069B2 (en) 2015-07-16 2021-04-27 Snap Inc. Dynamically adaptive media content delivery
US10817898B2 (en) 2015-08-13 2020-10-27 Placed, Llc Determining exposures to content presented by physical objects
US10366543B1 (en) 2015-10-30 2019-07-30 Snap Inc. Image based tracking in augmented reality systems
US10102680B2 (en) 2015-10-30 2018-10-16 Snap Inc. Image based tracking in augmented reality systems
US10733802B2 (en) 2015-10-30 2020-08-04 Snap Inc. Image based tracking in augmented reality systems
US11315331B2 (en) 2015-10-30 2022-04-26 Snap Inc. Image based tracking in augmented reality systems
US11769307B2 (en) 2015-10-30 2023-09-26 Snap Inc. Image based tracking in augmented reality systems
US10997783B2 (en) 2015-11-30 2021-05-04 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10474321B2 (en) 2015-11-30 2019-11-12 Snap Inc. Network resource location linking and visual content sharing
US11380051B2 (en) 2015-11-30 2022-07-05 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US11599241B2 (en) 2015-11-30 2023-03-07 Snap Inc. Network resource location linking and visual content sharing
US10657708B1 (en) 2015-11-30 2020-05-19 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US11468615B2 (en) 2015-12-18 2022-10-11 Snap Inc. Media overlay publication system
US11830117B2 (en) 2015-12-18 2023-11-28 Snap Inc. Media overlay publication system
US10354425B2 (en) 2015-12-18 2019-07-16 Snap Inc. Method and system for providing context relevant media augmentation
US11611846B2 (en) 2016-02-26 2023-03-21 Snap Inc. Generation, curation, and presentation of media collections
US11023514B2 (en) 2016-02-26 2021-06-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US11197123B2 (en) 2016-02-26 2021-12-07 Snap Inc. Generation, curation, and presentation of media collections
US11889381B2 (en) 2016-02-26 2024-01-30 Snap Inc. Generation, curation, and presentation of media collections
US10834525B2 (en) 2016-02-26 2020-11-10 Snap Inc. Generation, curation, and presentation of media collections
US10679389B2 (en) 2016-02-26 2020-06-09 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US11631276B2 (en) 2016-03-31 2023-04-18 Snap Inc. Automated avatar generation
US11785161B1 (en) 2016-06-20 2023-10-10 Pipbin, Inc. System for user accessibility of tagged curated augmented reality content
US10992836B2 (en) 2016-06-20 2021-04-27 Pipbin, Inc. Augmented property system of curated augmented reality media elements
US10638256B1 (en) 2016-06-20 2020-04-28 Pipbin, Inc. System for distribution and display of mobile targeted augmented reality content
US10805696B1 (en) 2016-06-20 2020-10-13 Pipbin, Inc. System for recording and targeting tagged content of user interest
US11201981B1 (en) 2016-06-20 2021-12-14 Pipbin, Inc. System for notification of user accessibility of curated location-dependent content in an augmented estate
US11876941B1 (en) 2016-06-20 2024-01-16 Pipbin, Inc. Clickable augmented reality content manager, system, and network
US10839219B1 (en) 2016-06-20 2020-11-17 Pipbin, Inc. System for curation, distribution and display of location-dependent augmented reality content
US11044393B1 (en) 2016-06-20 2021-06-22 Pipbin, Inc. System for curation and display of location-dependent augmented reality content in an augmented estate system
US11640625B2 (en) 2016-06-28 2023-05-02 Snap Inc. Generation, curation, and presentation of media collections with automated advertising
US10165402B1 (en) 2016-06-28 2018-12-25 Snap Inc. System to track engagement of media items
US11445326B2 (en) 2016-06-28 2022-09-13 Snap Inc. Track engagement of media items
US10785597B2 (en) 2016-06-28 2020-09-22 Snap Inc. System to track engagement of media items
US10430838B1 (en) 2016-06-28 2019-10-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections with automated advertising
US10506371B2 (en) 2016-06-28 2019-12-10 Snap Inc. System to track engagement of media items
US10219110B2 (en) 2016-06-28 2019-02-26 Snap Inc. System to track engagement of media items
US10327100B1 (en) 2016-06-28 2019-06-18 Snap Inc. System to track engagement of media items
US10735892B2 (en) 2016-06-28 2020-08-04 Snap Inc. System to track engagement of media items
US10885559B1 (en) 2016-06-28 2021-01-05 Snap Inc. Generation, curation, and presentation of media collections with automated advertising
US11895068B2 (en) 2016-06-30 2024-02-06 Snap Inc. Automated content curation and communication
US10387514B1 (en) 2016-06-30 2019-08-20 Snap Inc. Automated content curation and communication
US11080351B1 (en) 2016-06-30 2021-08-03 Snap Inc. Automated content curation and communication
US11509615B2 (en) 2016-07-19 2022-11-22 Snap Inc. Generating customized electronic messaging graphics
US10348662B2 (en) 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
US11816853B2 (en) 2016-08-30 2023-11-14 Snap Inc. Systems and methods for simultaneous localization and mapping
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
US11876762B1 (en) 2016-10-24 2024-01-16 Snap Inc. Generating and displaying customized avatars in media overlays
US11750767B2 (en) 2016-11-07 2023-09-05 Snap Inc. Selective identification and order of image modifiers
US10623666B2 (en) 2016-11-07 2020-04-14 Snap Inc. Selective identification and order of image modifiers
US11233952B2 (en) 2016-11-07 2022-01-25 Snap Inc. Selective identification and order of image modifiers
US10754525B1 (en) 2016-12-09 2020-08-25 Snap Inc. Customized media overlays
US11397517B2 (en) 2016-12-09 2022-07-26 Snap Inc. Customized media overlays
US10203855B2 (en) 2016-12-09 2019-02-12 Snap Inc. Customized user-controlled media overlays
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US11870743B1 (en) 2017-01-23 2024-01-09 Snap Inc. Customized digital avatar accessories
US10915911B2 (en) 2017-02-03 2021-02-09 Snap Inc. System to determine a price-schedule to distribute media content
US11720640B2 (en) 2017-02-17 2023-08-08 Snap Inc. Searching social media content
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US11861795B1 (en) 2017-02-17 2024-01-02 Snap Inc. Augmented reality anamorphosis system
US11250075B1 (en) 2017-02-17 2022-02-15 Snap Inc. Searching social media content
US10614828B1 (en) 2017-02-20 2020-04-07 Snap Inc. Augmented reality speech balloon system
US11748579B2 (en) 2017-02-20 2023-09-05 Snap Inc. Augmented reality speech balloon system
US11189299B1 (en) 2017-02-20 2021-11-30 Snap Inc. Augmented reality speech balloon system
US11670057B2 (en) 2017-03-06 2023-06-06 Snap Inc. Virtual vision system
US11037372B2 (en) 2017-03-06 2021-06-15 Snap Inc. Virtual vision system
US10523625B1 (en) 2017-03-09 2019-12-31 Snap Inc. Restricted group content collection
US11258749B2 (en) 2017-03-09 2022-02-22 Snap Inc. Restricted group content collection
US10887269B1 (en) 2017-03-09 2021-01-05 Snap Inc. Restricted group content collection
US11349796B2 (en) 2017-03-27 2022-05-31 Snap Inc. Generating a stitched data stream
US11558678B2 (en) 2017-03-27 2023-01-17 Snap Inc. Generating a stitched data stream
US11297399B1 (en) 2017-03-27 2022-04-05 Snap Inc. Generating a stitched data stream
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US11195018B1 (en) 2017-04-20 2021-12-07 Snap Inc. Augmented reality typography personalization system
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US11782574B2 (en) 2017-04-27 2023-10-10 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US11385763B2 (en) 2017-04-27 2022-07-12 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11451956B1 (en) 2017-04-27 2022-09-20 Snap Inc. Location privacy management on map-based social media platforms
US11556221B2 (en) 2017-04-27 2023-01-17 Snap Inc. Friend location sharing mechanism for social media platforms
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US11418906B2 (en) 2017-04-27 2022-08-16 Snap Inc. Selective location-based identity communication
US11409407B2 (en) 2017-04-27 2022-08-09 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11474663B2 (en) 2017-04-27 2022-10-18 Snap Inc. Location-based search mechanism in a graphical user interface
US11392264B1 (en) 2017-04-27 2022-07-19 Snap Inc. Map-based graphical user interface for multi-type social media galleries
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US11232040B1 (en) 2017-04-28 2022-01-25 Snap Inc. Precaching unlockable data elements
US11675831B2 (en) 2017-05-31 2023-06-13 Snap Inc. Geolocation based playlists
US11475254B1 (en) 2017-09-08 2022-10-18 Snap Inc. Multimodal entity identification
US11335067B2 (en) 2017-09-15 2022-05-17 Snap Inc. Augmented reality system
US11721080B2 (en) 2017-09-15 2023-08-08 Snap Inc. Augmented reality system
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US10499191B1 (en) 2017-10-09 2019-12-03 Snap Inc. Context sensitive presentation of content
US11006242B1 (en) 2017-10-09 2021-05-11 Snap Inc. Context sensitive presentation of content
US11617056B2 (en) 2017-10-09 2023-03-28 Snap Inc. Context sensitive presentation of content
US11030787B2 (en) 2017-10-30 2021-06-08 Snap Inc. Mobile-based cartographic control of display content
US11670025B2 (en) 2017-10-30 2023-06-06 Snap Inc. Mobile-based cartographic control of display content
US11265273B1 (en) 2017-12-01 2022-03-01 Snap, Inc. Dynamic media overlay with smart widget
US11943185B2 (en) 2017-12-01 2024-03-26 Snap Inc. Dynamic media overlay with smart widget
US11558327B2 (en) 2017-12-01 2023-01-17 Snap Inc. Dynamic media overlay with smart widget
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US11687720B2 (en) 2017-12-22 2023-06-27 Snap Inc. Named entity recognition visual context and caption data
US10678818B2 (en) 2018-01-03 2020-06-09 Snap Inc. Tag distribution visualization system
US11487794B2 (en) 2018-01-03 2022-11-01 Snap Inc. Tag distribution visualization system
US11507614B1 (en) 2018-02-13 2022-11-22 Snap Inc. Icon based tagging
US11841896B2 (en) 2018-02-13 2023-12-12 Snap Inc. Icon based tagging
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US11523159B2 (en) 2018-02-28 2022-12-06 Snap Inc. Generating media content items based on location information
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US11722837B2 (en) 2018-03-06 2023-08-08 Snap Inc. Geo-fence selection system
US11044574B2 (en) 2018-03-06 2021-06-22 Snap Inc. Geo-fence selection system
US11570572B2 (en) 2018-03-06 2023-01-31 Snap Inc. Geo-fence selection system
US10327096B1 (en) 2018-03-06 2019-06-18 Snap Inc. Geo-fence selection system
US10524088B2 (en) 2018-03-06 2019-12-31 Snap Inc. Geo-fence selection system
US10933311B2 (en) 2018-03-14 2021-03-02 Snap Inc. Generating collectible items based on location information
US11491393B2 (en) 2018-03-14 2022-11-08 Snap Inc. Generating collectible items based on location information
US11163941B1 (en) 2018-03-30 2021-11-02 Snap Inc. Annotating a collection of media content items
US11683657B2 (en) 2018-04-18 2023-06-20 Snap Inc. Visitation tracking system
US10448199B1 (en) 2018-04-18 2019-10-15 Snap Inc. Visitation tracking system
US10219111B1 (en) 2018-04-18 2019-02-26 Snap Inc. Visitation tracking system
US10924886B2 (en) 2018-04-18 2021-02-16 Snap Inc. Visitation tracking system
US10681491B1 (en) 2018-04-18 2020-06-09 Snap Inc. Visitation tracking system
US11297463B2 (en) 2018-04-18 2022-04-05 Snap Inc. Visitation tracking system
US10779114B2 (en) 2018-04-18 2020-09-15 Snap Inc. Visitation tracking system
US11860888B2 (en) 2018-05-22 2024-01-02 Snap Inc. Event detection system
US11670026B2 (en) 2018-07-24 2023-06-06 Snap Inc. Conditional modification of augmented reality object
US10789749B2 (en) 2018-07-24 2020-09-29 Snap Inc. Conditional modification of augmented reality object
US11367234B2 (en) 2018-07-24 2022-06-21 Snap Inc. Conditional modification of augmented reality object
US10679393B2 (en) 2018-07-24 2020-06-09 Snap Inc. Conditional modification of augmented reality object
US10943381B2 (en) 2018-07-24 2021-03-09 Snap Inc. Conditional modification of augmented reality object
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11676319B2 (en) 2018-08-31 2023-06-13 Snap Inc. Augmented reality anthropomorphization system
US11450050B2 (en) 2018-08-31 2022-09-20 Snap Inc. Augmented reality anthropomorphization system
US11704005B2 (en) 2018-09-28 2023-07-18 Snap Inc. Collaborative achievement interface
US11455082B2 (en) 2018-09-28 2022-09-27 Snap Inc. Collaborative achievement interface
US11799811B2 (en) 2018-10-31 2023-10-24 Snap Inc. Messaging and gaming applications communication platform
US11812335B2 (en) 2018-11-30 2023-11-07 Snap Inc. Position service to determine relative position to map features
US11698722B2 (en) 2018-11-30 2023-07-11 Snap Inc. Generating customized avatars based on location information
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11558709B2 (en) 2018-11-30 2023-01-17 Snap Inc. Position service to determine relative position to map features
US11877211B2 (en) 2019-01-14 2024-01-16 Snap Inc. Destination sharing in location sharing system
US11751015B2 (en) 2019-01-16 2023-09-05 Snap Inc. Location-based context information sharing in a messaging system
US11693887B2 (en) 2019-01-30 2023-07-04 Snap Inc. Adaptive spatial density based clustering
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11809624B2 (en) 2019-02-13 2023-11-07 Snap Inc. Sleep detection in a location sharing system
US11954314B2 (en) 2019-02-25 2024-04-09 Snap Inc. Custom media overlay system
US11500525B2 (en) 2019-02-25 2022-11-15 Snap Inc. Custom media overlay system
US11574431B2 (en) 2019-02-26 2023-02-07 Snap Inc. Avatar based on weather
US11301117B2 (en) 2019-03-08 2022-04-12 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11249614B2 (en) 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US11740760B2 (en) 2019-03-28 2023-08-29 Snap Inc. Generating personalized map interface with enhanced icons
US11361493B2 (en) 2019-04-01 2022-06-14 Snap Inc. Semantic texture mapping system
US11785549B2 (en) 2019-05-30 2023-10-10 Snap Inc. Wearable device location systems
US11606755B2 (en) 2019-05-30 2023-03-14 Snap Inc. Wearable device location systems architecture
US11206615B2 (en) 2019-05-30 2021-12-21 Snap Inc. Wearable device location systems
US11917495B2 (en) 2019-06-07 2024-02-27 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11601783B2 (en) 2019-06-07 2023-03-07 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11714535B2 (en) 2019-07-11 2023-08-01 Snap Inc. Edge gesture interface with smart interactions
US11821742B2 (en) 2019-09-26 2023-11-21 Snap Inc. Travel based notifications
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US11893208B2 (en) 2019-12-31 2024-02-06 Snap Inc. Combined map icon with action indicator
US11943303B2 (en) 2019-12-31 2024-03-26 Snap Inc. Augmented reality objects registry
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
US11888803B2 (en) 2020-02-12 2024-01-30 Snap Inc. Multiple gateway message exchange
US11228551B1 (en) 2020-02-12 2022-01-18 Snap Inc. Multiple gateway message exchange
US11516167B2 (en) 2020-03-05 2022-11-29 Snap Inc. Storing data based on device location
US11765117B2 (en) 2020-03-05 2023-09-19 Snap Inc. Storing data based on device location
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11430091B2 (en) 2020-03-27 2022-08-30 Snap Inc. Location mapping for large scale augmented-reality
US11915400B2 (en) 2020-03-27 2024-02-27 Snap Inc. Location mapping for large scale augmented-reality
US11776256B2 (en) 2020-03-27 2023-10-03 Snap Inc. Shared augmented reality system
US11290851B2 (en) 2020-06-15 2022-03-29 Snap Inc. Location sharing using offline and online objects
US11314776B2 (en) 2020-06-15 2022-04-26 Snap Inc. Location sharing using friend list versions
US11483267B2 (en) 2020-06-15 2022-10-25 Snap Inc. Location sharing using different rate-limited links
US11503432B2 (en) 2020-06-15 2022-11-15 Snap Inc. Scalable real-time location sharing framework
US11676378B2 (en) 2020-06-29 2023-06-13 Snap Inc. Providing travel-based augmented reality content with a captured image
US11943192B2 (en) 2020-08-31 2024-03-26 Snap Inc. Co-location connection service
US11961116B2 (en) 2020-10-26 2024-04-16 Foursquare Labs, Inc. Determining exposures to content presented by physical objects
US11606756B2 (en) 2021-03-29 2023-03-14 Snap Inc. Scheduling requests for location data
US11902902B2 (en) 2021-03-29 2024-02-13 Snap Inc. Scheduling requests for location data
US11601888B2 (en) 2021-03-29 2023-03-07 Snap Inc. Determining location using multi-source geolocation data
US11645324B2 (en) 2021-03-31 2023-05-09 Snap Inc. Location-based timeline media content system
US11829834B2 (en) 2021-10-29 2023-11-28 Snap Inc. Extended QR code
US11962645B2 (en) 2022-06-02 2024-04-16 Snap Inc. Guided personal identity based actions
US11963105B2 (en) 2023-02-10 2024-04-16 Snap Inc. Wearable device location systems architecture
US11961196B2 (en) 2023-03-17 2024-04-16 Snap Inc. Virtual vision system

Also Published As

Publication number Publication date
DE60010915T2 (en) 2005-05-25
JP2003512798A (en) 2003-04-02
EP1222518A1 (en) 2002-07-17
EP1222518B1 (en) 2004-05-19
CN1379868A (en) 2002-11-13
CA2386407C (en) 2009-05-05
DE60010915D1 (en) 2004-06-24
AU7807300A (en) 2001-04-30
CA2386407A1 (en) 2001-04-26

Similar Documents

Publication Publication Date Title
CA2386407C (en) Personal mobile communication device
US7443283B2 (en) Methods and apparatus for connecting an intimate group by exchanging awareness cues and text, voice instant messages, and two-way voice communications
US7634073B2 (en) Voice communication system
Hub et al. Design and development of an indoor navigation and object identification system for the blind
KR100641978B1 (en) Group notification system and method for indicating the proximity between individuals or groups
Mann Wearable computing: A first step toward personal imaging
US20060256008A1 (en) Pointing interface for person-to-person information exchange
US7853273B2 (en) Method of controlling user and remote cell phone transmissions and displays
US20060229058A1 (en) Real-time person-to-person communication using geospatial addressing
JP2014502349A (en) Tactile-based personal navigation
JP2004531791A (en) Pointing system for addressing objects
US7924152B1 (en) Interactive video gaming footwear including means for transmitting location information to a remote party
CA2366957A1 (en) Personal data capture device and web posting system
Repenning et al. Mobility agents: guiding and tracking public transportation users
Bowskill et al. Wearable location mediated telecommunications; a first step towards contextual communication
JP4844093B2 (en) Information processing apparatus and method, recording medium, and program
CN110536236A (en) Communication method, terminal device and network device
CN107430560B (en) Information processing apparatus, information processing method, and computer program
EP1094657A1 (en) Mobile conferencing system and method
JP7304639B2 (en) Methods and systems for enabling enhanced user-to-user communication in digital reality
JP2000172238A (en) Personalized information representing method
Gil et al. inContexto: A fusion architecture to obtain mobile context
US9294715B2 (en) Controlling display of video data
JP2009259135A (en) Network type real-time communication system
KR20050054369A (en) Apparatus for servicing the information of destination in wireless telecommunication terminal

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 10088346

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2000968118

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2386407

Country of ref document: CA

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2001 532372

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 008143684

Country of ref document: CN

WWP Wipo information: published in national office

Ref document number: 2000968118

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWG Wipo information: grant in national office

Ref document number: 2000968118

Country of ref document: EP