WO2013118144A2 - A system and method for identifying and analyzing personal context of a user - Google Patents

A system and method for identifying and analyzing personal context of a user

Info

Publication number
WO2013118144A2
WO2013118144A2 (PCT/IN2013/000045)
Authority
WO
WIPO (PCT)
Prior art keywords
user
sensors
personal context
module
communication device
Prior art date
Application number
PCT/IN2013/000045
Other languages
French (fr)
Other versions
WO2013118144A3 (en)
Inventor
Arpan Pal
Balamuralidhar Purushothaman
Prateep MISRA
Sunil Kumar Kopparapu
Aniruddha Sinha
Chirabrata BHAUMIK
Priyanka SINHA
Avik GHOSE
Original Assignee
Tata Consultancy Services Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tata Consultancy Services Limited filed Critical Tata Consultancy Services Limited
Priority to JP2014555389A priority Critical patent/JP5951802B2/en
Priority to US14/376,536 priority patent/US9560094B2/en
Priority to AU2013217184A priority patent/AU2013217184A1/en
Priority to EP13746227.1A priority patent/EP2810426A4/en
Priority to CN201380014086.7A priority patent/CN104335564B/en
Publication of WO2013118144A2 publication Critical patent/WO2013118144A2/en
Publication of WO2013118144A3 publication Critical patent/WO2013118144A3/en
Priority to AU2016200905A priority patent/AU2016200905B2/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40 Support for services or applications
    • H04L65/403 Arrangements for multi-party communication, e.g. for conferences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0257 Hybrid positioning
    • G01S5/0258 Hybrid positioning by combining or switching between measurements derived from different systems
    • G01S5/02585 Hybrid positioning by combining or switching between measurements derived from different systems at least one of the measurements being a non-radio measurement
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/06 Foreign languages
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/2866 Architectures; Arrangements
    • H04L67/30 Profiles
    • H04L67/306 User profiles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/029 Location-based management or tracking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/06 Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]; Services to user groups; One-way selective calling services
    • H04W4/08 User group management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72457 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location

Definitions

  • This invention generally relates to the field of personal context identification of a person. More particularly, the invention relates to a system and method for personal context identification for deriving social interaction information of the person.
  • a number of technologies are available in the market for analyzing the social behavior of a person using reality mining techniques and other related aspects such as work cultures, which have dedicated software and hardware requirements.
  • Such systems analyze Organizational Behavior based on user context sensing via specially designed devices that incorporate all required sensors. These devices interact with each other and a server to gather information associated with an individual.
  • Such systems and devices pose a threat as data related to a number of individuals is required to be transmitted to a back-end server for further processing, thereby raising privacy concerns.
  • further refinement and data processing at a distantly located server leads to heavy transmission costs. Further, such data transmission leads to extra usage of battery power in order to transmit each and every sensed detail to the back-end server without processing it.
  • an additional device is required to be deployed at an extra cost in order to track an individual's behavior.
  • Such a device does not generally have always-on connectivity to dump the collected user data to the back-end server for further analysis - it needs to be docked to a special data collection station to transfer the data.
  • such available devices do not have a provision to connect to additional external sensors wirelessly, so the extensibility of the system to newer applications is limited.
  • the principal object of the present invention is to implement a real-time personal context identification system using reality mining techniques by means of a widely available mobile communication device.
  • Another significant object of the invention is to use the existing sensing means present in a mobile communication device, such as an on-board microphone-speaker combination, accelerometer, camera, etc., soft/virtual sensors like the networking website profile of the user, email headers of the user, Rich Site Summary (RSS) feed and social blog profile of the user, along with building management system (BMS) access control, which work on real-time data brought into the phone using various data communication methods to capture an individual's behavior.
  • RSS: Rich Site Summary
  • BMS: building management system
  • Another significant object of the invention is to represent the social network of the user in the form of graphs depicting the social interaction information of the user: interaction while working in a cubicle, interaction while leading a meeting, interaction of a presenter in a session, being a passive listener in a meeting, interaction during a group discussion and the like.
  • Another object of the invention is to assign the confidence value to the existing sensing means capturing the user's information.
  • Another object of the invention is to group the users having similar location information. Another object of the invention is to fuse the multimodal data from various sources at the backend server.
  • Yet another object of the invention is to provide connectivity with one or more external sensing devices for capturing additional details regarding the individual user. Yet another object of the invention is to reduce battery consumption and transmission cost of the system by pre-processing the sensor information on the mobile communication device itself.
  • the present invention provides a method and system for identifying personal context of at least one user having a portable mobile communication device (102) at a particular location for deriving social interaction information of the user.
  • the social interaction information of the user may be the physical interaction between the two users.
  • a system and method for capturing and processing multi-sensor data received from a mobile communication device for the purpose of personal context analysis is provided.
  • the mobile communication device may be a mobile phone, tablet or any such mobile device with adequate processing power and suitably adapted with required sensors.
  • the mobile communication device may also be connected to one or more external sensors such as ultrasound sensors, EEG sensors, and the like.
  • a low-energy-consuming and low-sampling-rate data acquisition/capturing method is utilized for capturing the sensory data.
  • the sensing process will be aware of the device context, such as battery level, memory usage etc., towards an efficient and robust sensing strategy. Further, onboard analytics of the sensor data on the mobile communication device itself is utilized for extracting one or more parameters.
  • accelerometer analytics for activity detection, baseband communication using ultrasound for localization and proximity detection,
  • microphone-captured audio analytics for emotion detection and neighborhood sensing, camera for ambient lighting detection, and optional analysis of external physiological sensors such as an EEG sensor for further insights into user context.
  • the system enables sending the onboard analyzed parameters from the mobile communication device, acting as a gateway, to a back-end system over a network connection such as the Internet.
  • the onboard analyzed parameters may be encrypted for security/privacy purposes.
  • the mobile devices may collaborate and exchange information with one another to get more context information.
  • a method for assigning a predefined confidence value to the identified personal context of the user; obtaining precise current location information of the user; grouping at least two users having similar current location information at the particular location with a predefined density criteria; estimating proximity for deriving an accurate straight line distance between at least two users in a group at the particular location by utilizing the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102) of the user; and deriving the social interaction, in the form of physical interaction information, of the user with the other user by fusing the current location information of the user, the derived accurate straight line distance between at least two users in a group at the particular location and a web sensor (214).
  • All this parametric information collected from sensors may then be further analyzed at the back-end server for creating individual and aggregated user context that can be used for Organizational Behavior Analysis, Workspace Ergonomics Analysis, discovering the user's physical interaction network and also for measuring and analyzing user response in User Studies etc.
  • the overall system is also beneficial for learning the user's personal context in general and applying the knowledge to create adaptive intelligent systems that respond with action or information that is relevant to the user, and for capturing and analyzing user-specific real-time service consumption data. This information, stored in proper formats, will also enhance population modeling and mass behavioral modeling of people in a city while doing urban city modeling.
  • a system for the mobile communication device (102) which further comprises of internal sensors (104), a processing module (110), external sensors (118), an internal memory (116), a transmission module (114), a back end server (120), and a switching module (112).
  • the system further comprises of a fusion engine (212), a front end application (124) and a database (122).
  • a system further comprises of a localization module (202); a confidence value assignment module (204); a current location identification module (206); a proximity estimation module (208) and a grouping module (210).
  • the above said system and method are preferably for identifying personal context of at least one user having a portable mobile communication device (102) at a particular location for deriving the social interaction information of the user, but may also be used for many other applications.
  • Figure 1 illustrates components of a system for identifying personal context, in accordance with an embodiment.
  • Figure 2 represents a block diagram for personal context based social interaction system (200), in accordance with an embodiment.
  • Figure 3 illustrates a logical flow (300) for identifying personal context, in accordance with an embodiment.
  • Figure 4 illustrates the logical flow (400) for personal context based social interaction, in accordance with an embodiment.
  • Figure 5 depicts the normalized histogram representing number of phones being detected at the same location, in accordance with an exemplary embodiment of the present invention.
  • modules may include a self-contained component in a hardware circuit comprising logic gates, semiconductor devices, integrated circuits or any other discrete components.
  • the module may also be a part of any software program executed by any hardware entity, for example a processor.
  • the implementation of a module as a software program may include a set of logical instructions to be executed by the processor or any other hardware entity.
  • a module may be incorporated with the set of instructions or a program by means of an interface.
  • Figure 1 refers to a system (100) for personal context identification in accordance with an exemplary embodiment of the invention.
  • the system (100) is constructed using one or more modules functioning independently or in combination to perform personal context analysis.
  • the system (100) comprises of a mobile communication device (102) that further comprises of internal sensors (104) like accelerometer, camera, GPS, microphones and the like for sensing user activities, location, audio, and ambient light.
  • the internal sensors further provide means for localization and proximity detection using ultrasonic and upper audio band.
  • An external sensor interface (108) connects external sensors (118), like a wearable EEG having low energy consumption and a low sampling rate, with the mobile communication device (102).
  • a processing module (110) and internal memory (116) perform onboard analytics of the sensor data captured by the various internal and external sensors.
  • the mobile communication device (102) further comprises a transmission module (114) for transmitting relevant information to the backend server (120) and a switching module (112) for switching between other mobile device applications (106) of the internal sensors (104) and personal context analysis when an interrupt is generated from regular activities of the mobile communication device (102).
  • the processing module (110) processes the sensed data, separates the context data and transmits only the relevant information for personal context identification to the backend server (120) with the help of transmission module (114).
  • the backend server (120) processes and analyzes the information received from the transmission module (114) present in the mobile communication device (102).
  • the backend server (120) sorts and processes each and every piece of information related to a specific user. This information is stored into the database (122) and is accessed by the frontend applications (124) through the backend server (120) for creating individual and aggregated user context that can be used for Organizational Behavior Analysis, Workspace Ergonomics Analysis and also for measuring and analyzing user response in User Studies etc.
  • the frontend application (124), through the backend server (120), is further adapted for rendering the derived social interaction information of the user in the form of a statistical representation.
  • the statistical representation may include the multimodal social graph with edges having multiple attributes, which can be queried along multiple dimensions on demand.
  • the social graph can be prepared by reading the attributes from two fold sensor input, the first being the analyzed output of the physical sensors, and the second being the feed from web based soft sensors. The analyzed data is used to generate the nodes and edges of the social graph.
  • Figure 2 represents the block diagram for personal context based social interaction deriving system (200).
  • the system comprises of a localization module (202); a confidence value assignment module (204); a current location identification module (206); a proximity estimation module (208); a grouping module (210); and a fusion engine (212) hosted by the back end server (120).
  • the system (200) is configured to derive personal context based social interaction.
  • the personal context of the user comprises of the identity of the user i.e. who the person is, the proximity of the user i.e. the closeness and the duration of the user to other individuals or users, the activities of the user i.e. working in a cubicle, discussion room, corridor and the like, and the location of the user.
  • the social interaction of the user is the physical interaction of the user with other individuals.
  • the system (200) comprises of the localization module (202) adapted to locate the user within a predefined range of the various sources for identifying the personal context of the user at a particular location.
  • the personal context of the user is identified using the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102). Further the confidence value assignment module (204) is used to assign the predefined confidence values to the identified personal context of the user, depending on the sources from which the user is localized.
  • Precise current location information of the user inside the particular location is obtained using the current location identification module (206).
  • the precise current location information is obtained by fusing the assigned confidence value of personal context of the user using a fusion engine (212).
  • the system groups the two users located with similar current location information using the grouping module (210). The grouping of the two users is decided based on certain predefined density criteria.
  • the accurate straight line distance between the two users in a group, at the particular location, is derived using the proximity estimation module (208).
  • the fusion engine (212), which is hosted by the back end server (120), is used to derive the social interaction information of the user by fusing the current location information of the user and the derived accurate straight line distance between the two users in a group at the particular location.
  • the external sensors (118) are selected from the group comprising of access control entry/exit information from the building management system (BMS), surveillance using a depth camera like a 3D camera, wearable EEG, ultrasound and the like.
  • the internal sensors (104) embedded in the mobile communication device (102) are selected from the group comprising of accelerometer, magnetometer, speech recognizer, gyroscope, Wi-Fi signature and the like.
  • the Wi-Fi fingerprinting and/or triangulation based on RSSI can also provide indoor localization.
  • the accelerometer, magnetometer and gyroscope are combined to sense complex motion like walking or climbing stairs or elevator or sitting/standing etc.
  • the personal context of the user is selected from the group comprising of identity, proximity, activity and location.
  • the identity of the user may be captured by means of the user's mobile phone which is tagged to the user, the user's smart card while the user passes through the BMS, the user's skeletal structure and gait measurements taken using 3D depth cameras, and the like.
  • the proximity of the user may be captured by means of the Bluetooth of the user, proximity sensors located in the building structure, infrared proximity sensing and the like.
  • the activity of the user may be captured via the social interaction of the user with the people around, which may comprise information such as, during a presentation or a group discussion, who is listening and who is asking questions; gesture recognition may also be used to identify interactions like handshakes, card exchanges etc.
  • the location of the user may be detected using the localization module (202).
  • the confidence value assignment module (204) assigns the predefined confidence value to the identified personal context of the user based on the source of information.
  • the predefined confidence value or confidence score of the internal source and external source varies between 0% and 100%, but may not be inclusive of either bound.
  • the predefined confidence value of the building management system (BMS) data is 100 %.
  • the speech recognition software may provide a detection score which may be considered as the confidence value.
  • the localization range from the speech data may be of the order of 10 sq. m., but is not limited to that. The case may be similar for the Wi-Fi signature and accelerometer data.
  • the confidence scores may vary with the sources from which they are captured; depending on how much data is captured and the environment in which the data is captured, the same sensor and analytics might produce anywhere from very high to very low confidence values.
  • the joint probability function which is used for finding the location information of an individual user is obtained by the following equation: P_j(loc | x) = \prod_{i=1}^{n} P_{ji}(loc | x)    (1)
  • the email files and archives are parsed and read using many commercial or open-source email tools, including the JavaMail API.
  • mobile-based proximity sensing is used pair-wise for the different users of the group of people. From the "mobile proximity sensing", the distance d between individuals j and i is given by the probability distribution function for the pair (j and i) to be at distance d, P_{jim}(d | x).
  • the confidence score S_{ji}^{m} associated with the observation d_{ji}^{m} is P_{jim}(d_{ji}^{m} | x).
  • the confidence score S_{ji}^{c} associated with the observation d_{ji}^{c} is P_{jic}(d_{ji}^{c} | x). Further, d_{ji} = distance(loc_j, loc_i) is the distance between the locations of individual j (loc_j) and individual i (loc_i).
  • the grouping module (210) groups two users having similar current location information at the particular location according to the predefined density criteria. Here P_{ji}(loc_j | x) is the probability distribution function of the location of the j-th individual given the observation from the i-th sensor data.
  • P_{ji}(loc_j | x) is calculated from Bayes' theorem: P_{ji}(loc | x) = p(x | loc) p(loc) / \sum_{k} p(x | loc_k) p(loc_k).
  • the location of individual j as observed by the i-th sensor is loc_j^{i} = \arg\max_{loc \in S} P_{ji}(loc | x).
  • S is the physical space in which the individual j can belong.
  • the confidence score associated with the observation loc_j^{i} is S_j^{i}.
  • the current location identification module (206) obtains precise current location information of the user within the particular location.
  • the proximity estimation module (208) is used to estimate the proximity for deriving accurate straight line distance between at least two users in a group at the current location by utilizing the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102) of the user.
  • the proximity estimation module (208) may use indoor proximity detection using Bluetooth, audio on mobile, 3D depth-camera, real life communications discovery by scanning the e-mail headers of the individual which provides information like To, CC and Subject fields giving an indication of the users' communication with the other people, and the like.
  • the Bluetooth may be used to detect proximity since commercial phones come with class II Bluetooth which provides only short range.
  • the 3D depth cameras provide accurate measurement of distance and direction.
  • the predefined density criteria may be derived employing a density based clustering method.
  • the clustering algorithm is used to create groups.
  • the clustering algorithm may follow constraints such as: the minimum group size is G_min and the maximum group size is G_max.
  • the maximum distance of an individual from the centroid of the group is d_{jc,max}.
  • the density-based clustering (DBSCAN) algorithm may be used to form groups with a density criterion such that there have to be N individuals per unit area.
  • the clustering algorithm gives the core groups as clusters, and individuals not belonging to any group are treated as outliers (an illustrative clustering sketch is given after this list).
  • the fusion engine (212) is adapted for deriving the social interaction information of the user.
  • the fusion engine (212) is hosted by a back end server (120).
  • the fusion engine is used to fuse the data or information received from the derived accurate straight line distance between the two users in a group at the particular location and a web sensor (214).
  • in the fusion engine (212), the data from all these sensors as well as the web-based soft sensors are fed to the multimodal fusion module.
  • Each sensor may have an error probability and also a confidence score with which it reports a reading.
  • the fusion module reads data from multiple sensors along with mentioned properties for each reading.
  • the engine then infers commonality from the reporting of multiple sensors and comes up with an aggregated attribute for the edge between the vertices under consideration. For example, "proximity" reported by a set of sensors may be fused with audio analysis to arrive at a conclusion such as "conversation".
  • This, augmented with the location of a conference room, can be used to deduce a "meeting", whereas the same will be deduced as a "chat" if the location changes to the coffee machine (a rule-based fusion sketch is given after this list).
  • Another aspect of multimodal sensor fusion may be used for error reduction and cancellation. For example, the 3D camera may report proximity between two people at a location with a moderate degree of confidence, yet the location of one of the persons does not match the location derived from the accelerometer; in such a case the 3D camera data may be rejected as a "false positive".
  • the web sensor (214) may be selected from the group comprising of social networking website profile of the user, email headers of the user, RSS feed and social blog profile of the user.
  • social networking sites like Facebook, Twitter etc. provide access to various information, such as profile data, in the form of structured data. This structured data is gathered and separately parsed to extract the interests of the individual. The interests may provide an important property for an edge of the social graph, as two people having a common interest are likely to be connected across it.
  • structured data mining and unstructured data mining from the user's blogs and social posts may be used for forming edge attributes.
  • the email headers for the person may be scanned to understand real life communications of the individual, which may provide information like To, CC and Subject fields giving an indication of the user's communication with the other people.
  • the location information for the j-th individual, loc_j, is given by equation (2).
  • the corresponding confidence value S_j^{i} of the detected location loc_j^{i} is obtained. This is done for all N individuals.
  • the pair-wise distance between individuals j and i is computed by "mobile proximity sensing" and "3D camera data" as d_{ji}^{m} and d_{ji}^{c} respectively.
  • the location information of j, as derived from "mobile proximity sensing" of the j-i pair, is a sphere centered on loc_i with radius d_{ji}^{m}.
  • P(loc_j | x) \propto P(loc_j) * P(d_{ji}^{m} | x)
  • loc_j^{m} = loc_i + d_{ji}^{m}    (5)
  • the final fused location for the j-th individual is obtained as the weighted sum of the different observations, where the weights are the confidence scores of the individual observations: loc_j = \sum_{k} S_j^{k} loc_j^{k} / \sum_{k} S_j^{k}.
  • N is the number of individuals in the proximity.
  • FIG. 3 illustrates the logical flow (300); as shown in step (302), the internal sensors (104) present in the mobile communication device (102) are activated for sensing information related to various activities done by a user.
  • the processing module (110) checks for presence of any external sensors and connects with them through communication means like USB, Wi-Fi, Bluetooth and the like.
  • the processing module (110) checks whether any user activity, such as a download or browsing, is going on on the mobile communication device (102), as shown in step (308); if so, the processing module (110) waits for a predefined time, step (306), and again checks for user activity.
  • in steps (310) and (312), once no such user activity is found to be going on, the processing module (110) instructs the internal sensors (104) and external sensors (118) to sense user activity and transfer the captured data to the processing module (110).
  • the processing module (110) then analyzes this data, separates context information and transmits only relevant information to the transmission module (114) which further transfers this information to the backend server (120) as shown in step (314).
  • the processing module (110) performs the following activities - accelerometer analytics for activity detection, baseband communication using ultrasound for localization and proximity detection, microphone-captured audio analytics for emotion detection and neighborhood sensing, camera for ambient lighting detection, and analysis of external physiological sensors such as an EEG sensor for further insights into user context. At step (316) the back end server (120) and its underlying fusion and decision module perform analysis to identify the user's personal context and store this information in the database (122). At step (318) this stored information is used by the front end applications (124) for creating individual and aggregated user context that can be used for Organizational Behavior Analysis, Workspace Ergonomics Analysis, discovering the user's physical interaction network and also for measuring and analyzing user response in User Studies etc.
  • Figure 4 depicts the logical flow of the personal context based social interaction deriving method (400); as shown in step (402), the process starts by locating the user within a predefined range at the particular location.
  • at step (404), the personal context of the user is identified using the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102).
  • the identified personal context of the user is assigned with the predefined confidence value.
  • the precise current location information of the user is obtained within the particular location.
  • the accurate straight line distance between at least two users at the current location is derived.
  • the two users having similar current location information at the particular location are grouped together with the predefined density criteria.
  • step (414) wherein the social interaction information of the user in a group at the particular location is derived.
  • FIG 5 depicts the normalized histogram representing the number of phones being detected at the same location, in which a static microphone sensor "i" whose location is known is considered.
  • a mobile phone (j) may be used to play a sound at a known volume.
  • the fixed microphone "i" receives the sound and computes a distance "d_kj".
  • the distance may be computed with the principle that at the receiver using microphone sensors and the like, the attenuation of the volume of the sound is inversely proportional to the distance from the sound source. This computation is performed for N such phones.
  • a histogram is computed on number of phones being detected at the same location.
  • a sample normalized histogram plot is shown in Figure 5. It is clear from Figure 5 that the maximum number of phones is detected at a distance of 11 units; this is the actual distance d_k. However, there may be a certain non-zero number of phones whose distance is detected as different from d_k. This may be due to errors in observation, differences in phone model and environmental effects. This process is repeated for all the loc_k, or equivalently d_k. Every time the receiver observes a distance "d_kj", there is a confidence score "S_jk" associated with the observation (an illustrative ranging sketch follows this list).
  • the P(x | loc) function shown in Figure 5 gives S_jk, i.e. it is used to derive the probability distribution of the observed value given a distance d_k.
  • the probability of the location loc_k observed by the i-th sensor is P_{ji}(loc_k | x). This is obtained using the Bayes equation.
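By way of a non-limiting illustration of the density-based grouping described above, the following Python sketch uses scikit-learn's DBSCAN to group users whose fused locations are close together; the coordinates, eps and min_samples values are invented for illustration and merely stand in for the density criterion, the maximum distance within a group and the minimum group size G_min mentioned in the text.

```python
# Illustrative sketch only: grouping co-located users with density-based
# clustering (DBSCAN), in the spirit of the grouping module (210).
# The eps/min_samples values are placeholders, not taken from the patent.
import numpy as np
from sklearn.cluster import DBSCAN

# Fused 2-D indoor locations (metres) for N users, e.g. the output of equation (2).
locations = np.array([
    [1.0, 1.2], [1.3, 0.9], [0.8, 1.1],   # three users near a cubicle
    [6.5, 7.0], [6.8, 7.2],               # two users near a meeting room
    [15.0, 2.0],                          # an isolated user (outlier)
])

# eps plays the role of the maximum neighbourhood distance, min_samples the
# role of the minimum group size G_min.
clustering = DBSCAN(eps=1.5, min_samples=2).fit(locations)

for user_id, label in enumerate(clustering.labels_):
    if label == -1:
        print(f"user {user_id}: outlier (not in any group)")
    else:
        print(f"user {user_id}: group {label}")
```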
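The following sketch illustrates, in a deliberately simplified rule-based form, how the fusion engine (212) could combine per-sensor observations and their confidence scores into an aggregated edge attribute such as "meeting" or "chat". The data structure, field names and thresholds are assumptions for illustration only, not the patented fusion method.

```python
# Illustrative sketch only: rule-based multimodal fusion of per-sensor
# observations into an aggregated interaction label. Field names and
# thresholds are hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    attribute: str      # e.g. "proximity", "audio", "location"
    value: str          # e.g. "near", "conversation", "conference_room"
    confidence: float   # confidence score reported by the sensor, 0.0 .. 1.0

def fuse_edge(observations: List[Observation]) -> str:
    """Infer an aggregated edge attribute for a pair of users."""
    def conf(attr: str, value: str) -> float:
        scores = [o.confidence for o in observations
                  if o.attribute == attr and o.value == value]
        return max(scores, default=0.0)

    near = conf("proximity", "near")
    talking = conf("audio", "conversation")

    if near > 0.5 and talking > 0.5:
        if conf("location", "conference_room") > 0.5:
            return "meeting"        # conversation held in a conference room
        if conf("location", "coffee_machine") > 0.5:
            return "chat"           # same conversation near the coffee machine
        return "conversation"
    return "co-located" if near > 0.5 else "no interaction"

obs = [Observation("proximity", "near", 0.8),
       Observation("audio", "conversation", 0.7),
       Observation("location", "conference_room", 0.9)]
print(fuse_edge(obs))               # -> meeting
```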
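The following sketch illustrates the acoustic ranging idea behind Figure 5: a tone played at a known volume attenuates with distance, so the received amplitude yields a distance estimate, and a normalized histogram over many phones peaks near the true distance. The free-field 1/d attenuation model and all numeric values are assumptions for illustration.

```python
# Illustrative sketch only: estimating phone-to-microphone distance from the
# attenuation of a tone played at a known volume, and histogramming the
# distances detected for N phones (cf. Figure 5). The 1/d model is assumed.
import numpy as np

REFERENCE_AMPLITUDE = 1.0   # amplitude that would be measured at 1 unit distance

def distance_from_amplitude(received_amplitude):
    # Free-field model: amplitude falls off as 1/d, so d = A_ref / A_received.
    return REFERENCE_AMPLITUDE / received_amplitude

rng = np.random.default_rng(0)
true_distance = 11.0                      # the actual distance d_k (in units)
n_phones = 200

# Each phone's tone is received with multiplicative noise (phone model,
# environment), so the estimated distances scatter around d_k.
received = (REFERENCE_AMPLITUDE / true_distance) * rng.normal(1.0, 0.05, n_phones)
estimated = distance_from_amplitude(received)

counts, edges = np.histogram(estimated, bins=15)
normalized = counts / counts.sum()        # normalized histogram as in Figure 5
peak_bin = counts.argmax()
print("most phones detected near distance",
      round((edges[peak_bin] + edges[peak_bin + 1]) / 2, 1))
```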

Abstract

A method and system for identifying personal context of a user having a portable mobile communication device at a particular location for deriving social interaction information of the user, wherein the user within a predefined range is identified using the personal context of the user at the particular location and the identified personal context of the user is assigned a confidence value. Further, the current location information of the user within the particular location is obtained by fusing the assigned confidence values. Further, the proximity of the user in the current location is estimated by finding the accurate straight line distance between users. Further, the two users having similar current location information at the particular location are grouped together with the predefined density criteria. Finally, the social interaction information of the user is derived by multimodal sensor data fusion at the fusion engine and represented using a human network graph.

Description

A SYSTEM AND METHOD FOR IDENTIFYING AND ANALYZING PERSONAL
CONTEXT OF A USER
FIELD OF THE INVENTION
This invention generally relates to the field of personal context identification of a person. More particularly, the invention relates to a system and method for personal context identification for deriving social interaction information of the person.
BACKGROUND OF THE INVENTION
Analysis of a person's behavior has been an important aspect that has a plurality of applications in the fields of marketing, organizational development etc. Due to this, the field of personal context analysis is gaining wide importance. Specifically, organizations employing a large number of employees are keen to analyze the behavior of an individual for faster and better growth of the organization. The increasing need for analysis in terms of personal context has led to increasing growth in the fields of Organizational Behavior Analysis, Workspace Ergonomics Analysis, Discovering the user's physical interaction network, User Studies Analysis, Market Study and Real-time Usage capture etc.
A number of technologies are available in the market for analyzing the social behavior of a person using reality mining techniques and other related aspects such as work cultures, which have dedicated software and hardware requirements. Such systems analyze Organizational Behavior based on user context sensing via specially designed devices that incorporate all required sensors. These devices interact with each other and a server to gather information associated with an individual. However, such systems and devices pose a threat as data related to a number of individuals is required to be transmitted to a back-end server for further processing, thereby raising privacy concerns. Moreover, further refinement and data processing at a distantly located server leads to heavy transmission costs. Further, such data transmission leads to extra usage of battery power in order to transmit each and every sensed detail to the back-end server without processing it.
Furthermore, an additional device is required to be deployed at an extra cost in order to track an individual's behavior. Such a device does not generally have always-on connectivity to dump the collected user data to the back-end server for further analysis - it needs to be docked to a special data collection station to transfer the data. Also, such available devices do not have a provision to connect to additional external sensors wirelessly, so the extensibility of the system to newer applications is limited.
Also, current solutions for context recognition and analysis using reality mining techniques are dependent on wearable sensors and mobile devices for sensing the user's activity, location and proximity with respect to other users. Different algorithms are used to arrive at conclusions about the user's attributes in the real world, but such results are often inaccurate due to errors in sensor readings and changes in the ambient environment. Such inaccuracies can cause discrepancies and malfunctioning in the case of ubiquitous applications. Furthermore, the sensors used are very limited in the kind of data they provide, and most of the time specialized sensors need to be deployed.
As a result there is a growing need to integrate a personal context analysis system with a more efficient, widely available and user-friendly device which is easy to carry and simple to operate, thereby eliminating the need for a separate special device or data collection system. There is also a need to process the raw sensory data at the sensing device itself to preserve battery life of the device (radio communication takes up most of the battery life), thereby also addressing the privacy preservation and data transmission cost concerns. Moreover, a provision to connect to additional external sensors through existing communication means like USB, Wi-Fi or Bluetooth will also lead to a better grasp of an individual's behavior. Further, the personal context analysis will also lead to creating a social network based on real life fixations and affinities of the user.
OBJECTS OF THE INVENTION
The principal object of the present invention is to implement a real-time personal context identification system using reality mining techniques by means of a widely available mobile communication device.
Another significant object of the invention is to use the existing sensing means present in a mobile communication device, such as an on-board microphone-speaker combination, accelerometer, camera, etc., soft/virtual sensors like the networking website profile of the user, email headers of the user, Rich Site Summary (RSS) feed and social blog profile of the user, along with building management system (BMS) access control, which work on real-time data brought into the phone using various data communication methods to capture an individual's behavior.
Another significant object of the invention is to represent the social network of the user in the form of graphs depicting the social interaction information of the user: interaction while working in a cubicle, interaction while leading a meeting, interaction of a presenter in a session, being a passive listener in a meeting, interaction during a group discussion and the like.
Another object of the invention is to assign the confidence value to the existing sensing means capturing the user's information.
Another object of the invention is to group the users having similar location information. Another object of the invention is to fuse the multimodal data from various sources at the backend server.
Yet another object of the invention is to provide connectivity with one or more external sensing devices for capturing additional details regarding the individual user. Yet another object of the invention is to reduce battery consumption and transmission cost of the system by pre-processing the sensor information on the mobile communication device itself.
SUMMARY OF THE INVENTION
Before the present methods, systems, and hardware enablement are described, it is to be understood that this invention is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments of the present invention which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present invention.
The present invention provides a method and system for identifying personal context of at least one user having a portable mobile communication device (102) at a particular location for deriving social interaction information of the user. The social interaction information of the user may be the physical interaction between two users.
In an embodiment of the present invention, a system and method for capturing and processing multi-sensor data received from a mobile communication device for the purpose of personal context analysis is provided. In an aspect, the mobile communication device may be a mobile phone, tablet or any such mobile device with adequate processing power and suitably adapted with the required sensors. The mobile communication device may also be connected to one or more external sensors such as ultrasound sensors, EEG sensors, and the like. A low-energy-consuming and low-sampling-rate data acquisition/capturing method is utilized for capturing the sensory data. The sensing process will be aware of the device context, such as battery level, memory usage etc., towards an efficient and robust sensing strategy. Further, onboard analytics of the sensor data on the mobile communication device itself is utilized for extracting one or more parameters. For example, accelerometer analytics for activity detection, baseband communication using ultrasound for localization and proximity detection, microphone-captured audio analytics for emotion detection and neighborhood sensing, camera for ambient lighting detection and optional analysis of external physiological sensors such as an EEG sensor for further insights into user context. The system enables sending the onboard analyzed parameters from the mobile communication device, acting as a gateway, to a back-end system over a network connection such as the Internet. In an aspect, the onboard analyzed parameters may be encrypted for security/privacy purposes. In another aspect, the mobile devices may collaborate and exchange information with one another to get more context information.
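As a minimal sketch of the device-context-aware, low-sampling-rate capture and onboard analytics described above, the following Python pseudo-implementation checks battery and memory before sensing and transmits only derived parameters; every function and object name here is a hypothetical placeholder rather than an API defined by the invention.

```python
# Illustrative sketch only: device-context-aware sensing with onboard
# analytics. All device/backend method names are hypothetical placeholders.
import json
import time

LOW_BATTERY = 0.2        # below this fraction, sensing is suspended (assumed)
SAMPLE_PERIOD_S = 30     # low sampling rate to limit power consumption (assumed)

def classify_activity(accel_samples):
    # Placeholder analytics: threshold on acceleration spread to flag movement.
    return "moving" if max(accel_samples) - min(accel_samples) > 0.5 else "still"

def classify_emotion(audio_clip):
    # Placeholder for microphone-audio emotion analytics.
    return "neutral"

def sensing_loop(device, backend):
    while True:
        # Be aware of the device context (battery, memory) before sensing.
        if device.battery_level() < LOW_BATTERY or device.memory_pressure():
            time.sleep(5 * SAMPLE_PERIOD_S)
            continue

        raw = {
            "accel": device.read_accelerometer(),
            "audio": device.record_audio(seconds=2),
            "light": device.read_ambient_light(),
        }

        # Onboard analytics: only derived parameters leave the phone,
        # reducing transmission cost and preserving privacy.
        params = {
            "activity": classify_activity(raw["accel"]),
            "emotion": classify_emotion(raw["audio"]),
            "ambient_light": raw["light"],
            "timestamp": time.time(),
        }
        backend.send(json.dumps(params))   # may be encrypted before sending
        time.sleep(SAMPLE_PERIOD_S)
```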
In an embodiment of the invention a method is provided for assigning a predefined confidence value to the identified personal context of the user; obtaining precise current location information of the user; grouping at least two users having similar current location information at the particular location with a predefined density criteria; estimating proximity for deriving an accurate straight line distance between at least two users in a group at the particular location by utilizing the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102) of the user; and deriving the social interaction, in the form of physical interaction information, of the user with the other user by fusing the current location information of the user, the derived accurate straight line distance between at least two users in a group at the particular location and a web sensor (214).
All this parametric information collected from sensors may then be further analyzed at the back-end server for creating individual and aggregated user context that can be used for Organizational Behavior Analysis, Workspace Ergonomics Analysis, discovering the user's physical interaction network and also for measuring and analyzing user response in User Studies etc. The overall system is also beneficial for learning the user's personal context in general and applying the knowledge to create adaptive intelligent systems that respond with action or information that is relevant to the user, and for capturing and analyzing user-specific real-time service consumption data. This information, stored in proper formats, will also enhance population modeling and mass behavioral modeling of people in a city while doing urban city modeling.
In an embodiment of the invention a system is provided for the mobile communication device (102) which further comprises of internal sensors (104), a processing module (110), external sensors (118), an internal memory (116), a transmission module (114), a back end server (120), and a switching module (112). The system further comprises of a fusion engine (212), a front end application (124) and a database (122).
In another embodiment of the invention a system further comprises of a localization module (202); a confidence value assignment module (204); a current location identification module (206); a proximity estimation module (208) and a grouping module (210).
The above said system and method are preferably for identifying personal context of at least one user having a portable mobile communication device (102) at a particular location for deriving the social interaction information of the user, but may also be used for many other applications.
BRIEF DESCRIPTION OF DRAWINGS
The foregoing summary, as well as the following detailed description of preferred embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, there is shown in the drawings example constructions of the invention; however, the invention is not limited to the specific system and method disclosed in the drawings:
Figure 1 illustrates components of a system for identifying personal context, in accordance with an embodiment. Figure 2 represents a block diagram for personal context based social interaction system (200), in accordance with an embodiment.
Figure 3 illustrates a logical flow (300) for identifying personal context, in accordance with an embodiment.
Figure 4 illustrates the logical flow (400) for personal context based social interaction, in accordance with an embodiment.
Figure 5 depicts the normalized histogram representing number of phones being detected at the same location, in accordance with an exemplary embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Some embodiments of this invention, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the preferred systems and methods are now described.
One or more components of the invention are described as modules for the understanding of the specification. For example, a module may include a self-contained component in a hardware circuit comprising logic gates, semiconductor devices, integrated circuits or any other discrete components. The module may also be a part of any software program executed by any hardware entity, for example a processor. The implementation of a module as a software program may include a set of logical instructions to be executed by the processor or any other hardware entity. Further, a module may be incorporated with the set of instructions or a program by means of an interface.
The disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms.
Figure 1 refers to a system (100) for personal context identification in accordance with an exemplary embodiment of the invention. The system (100) is constructed using one or more modules functioning independently or in combination to perform personal context analysis. The system (100) comprises of a mobile communication device (102) that further comprises of internal sensors (104) like accelerometer, camera, GPS, microphones and the like for sensing user activities, location, audio, and ambient light. The internal sensors further provide means for localization and proximity detection using ultrasonic and upper audio band. An external sensor interface (108) connects external sensors (118), like a wearable EEG having low energy consumption and a low sampling rate, with the mobile communication device (102). A processing module (110) and internal memory (116) perform onboard analytics of the sensor data captured by the various internal and external sensors. The mobile communication device (102) further comprises a transmission module (114) for transmitting relevant information to the backend server (120) and a switching module (112) for switching between other mobile device applications (106) of the internal sensors (104) and personal context analysis when an interrupt is generated from regular activities of the mobile communication device (102).
The processing module (110) processes the sensed data, separates the context data and transmits only the relevant information for personal context identification to the backend server (120) with the help of the transmission module (114). The backend server (120) processes and analyzes the information received from the transmission module (114) present in the mobile communication device (102). The backend server (120) then sorts and processes each and every piece of information related to a specific user. This information is stored into the database (122) and is accessed by the frontend applications (124) through the backend server (120) for creating individual and aggregated user context that can be used for Organizational Behavior Analysis, Workspace Ergonomics Analysis and also for measuring and analyzing user response in User Studies etc.
The frontend application (124), through the backend server (120), is further adapted for rendering the derived social interaction information of the user in the form of a statistical representation. The statistical representation may include the multimodal social graph with edges having multiple attributes, which can be queried along multiple dimensions on demand. The social graph can be prepared by reading the attributes from a two-fold sensor input, the first being the analyzed output of the physical sensors, and the second being the feed from web-based soft sensors. The analyzed data is used to generate the nodes and edges of the social graph.
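A minimal sketch of such a multimodal social graph, assuming the networkx library and invented attribute names, is given below; edges carry attributes derived both from analyzed physical sensors and from web-based soft sensors and can be queried along a chosen dimension.

```python
# Illustrative sketch only: a multimodal social graph with multi-attribute
# edges, queried along one dimension. Node and attribute names are made up.
import networkx as nx

g = nx.Graph()

# Edge attributes derived from analyzed physical sensors ...
g.add_edge("alice", "bob", interaction="meeting", duration_min=30, confidence=0.8)
# ... and from web-based soft sensors (e.g. a common interest from profiles).
g.add_edge("alice", "carol", interaction="email", common_interest="robotics",
           confidence=0.6)

# Query the graph along one dimension, e.g. all face-to-face meetings.
meetings = [(u, v, d) for u, v, d in g.edges(data=True)
            if d.get("interaction") == "meeting"]
print(meetings)
```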
Figure 2 represents the block diagram for personal context based social interaction deriving system (200). The system comprises of a localization module (202); a confidence value assignment module (204); a current location identification module (206); a proximity estimation module (208); a grouping module (210); and a fusion engine (212) hosted by the back end server (120).
According to an exemplary embodiment of the present invention, the system (200) is configured to derive personal context based social interaction. The personal context of the user comprises of the identity of the user i.e. who the person is, the proximity of the user i.e. the closeness and the duration of the user to other individuals or users, the activities of the user i.e. working in a cubicle, discussion room, corridor and the like, and the location of the user. The social interaction of the user is the physical interaction of the user with other individuals. The system (200) comprises of the localization module (202) adapted to locate the user within a predefined range of the various sources for identifying the personal context of the user at a particular location. The personal context of the user is identified using the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102). Further, the confidence value assignment module (204) is used to assign the predefined confidence values to the identified personal context of the user, depending on the sources from which the user is localized. Precise current location information of the user inside the particular location is obtained using the current location identification module (206). The precise current location information is obtained by fusing the assigned confidence values of the personal context of the user using a fusion engine (212). Further, the system groups the two users located with similar current location information using the grouping module (210). The grouping of the two users is decided based on certain predefined density criteria. The accurate straight line distance between the two users in a group, at the particular location, is derived using the proximity estimation module (208). The fusion engine (212), which is hosted by the back end server (120), is used to derive the social interaction information of the user by fusing the current location information of the user and the derived accurate straight line distance between the two users in a group at the particular location.
The external sensors (118) are selected from the group comprising of access control entry/exit information from the building management system (BMS), surveillance using a depth camera like a 3D camera, wearable EEG, ultrasound and the like. The internal sensors (104) embedded in the mobile communication device (102) are selected from the group comprising of accelerometer, magnetometer, speech recognizer, gyroscope, Wi-Fi signature and the like.
The localization module (202) captures indoor location using access control logs of the buildings, Wi-Fi signatures, magnetometer signatures, accelerometer analytics on mobile phones and the like. Wi-Fi fingerprinting and/or triangulation based on RSSI can also provide indoor localization. The accelerometer, magnetometer and gyroscope are combined to sense complex motion like walking, climbing stairs, taking an elevator, sitting/standing etc. The personal context of the user is selected from the group comprising of identity, proximity, activity and location. The identity of the user may be captured by means of the user's mobile phone which is tagged to the user, the user's smart card while the user passes through the BMS, the user's skeletal structure and gait measurements taken using 3D depth cameras and the like. The proximity of the user may be captured by means of the Bluetooth of the user, proximity sensors located in the building structure, infrared proximity sensing and the like. The activity of the user may be captured via the social interaction of the user with the people around, which may comprise information such as, during a presentation or a group discussion, who is listening and who is asking questions; gesture recognition may also be used to identify interactions like handshakes, card exchanges etc. The location of the user may be detected using the localization module (202).
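As one hedged illustration of Wi-Fi-signature-based indoor localization, the sketch below matches an observed RSSI signature against a small fingerprint database using a nearest-neighbour rule; the access-point names, RSSI values and locations are made up for the example.

```python
# Illustrative sketch only: nearest-neighbour Wi-Fi fingerprinting for indoor
# localization. The fingerprint database and RSSI values are invented.
def match_fingerprint(observed, database):
    """Return the location whose stored RSSI signature is closest to the
    observed one (Euclidean distance over the access points they share)."""
    def dist(sig_a, sig_b):
        aps = set(sig_a) & set(sig_b)
        if not aps:
            return float("inf")
        return sum((sig_a[ap] - sig_b[ap]) ** 2 for ap in aps) ** 0.5

    return min(database, key=lambda loc: dist(observed, database[loc]))

fingerprints = {
    "cubicle_12":   {"ap1": -45, "ap2": -70, "ap3": -80},
    "meeting_room": {"ap1": -65, "ap2": -50, "ap3": -60},
    "cafeteria":    {"ap1": -80, "ap2": -75, "ap3": -40},
}
observed_rssi = {"ap1": -47, "ap2": -68, "ap3": -78}
print(match_fingerprint(observed_rssi, fingerprints))   # -> cubicle_12
```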
The confidence value assignment module (204) assigns the predefined confidence value to the identified personal context of the user based on the source of the information. The predefined confidence value or confidence score of an internal source and of an external source varies between 0% and 100%, not necessarily inclusive of either bound. For example, the predefined confidence value of the building management system (BMS) data is 100%. Speech recognition software may provide a detection score which may be considered as the confidence value. The localization range from the speech data may be of the order of 10 sq. m., but is not limited to that; similar ranges may apply to Wi-Fi signature and accelerometer data. The confidence scores may vary with the sources from which they are captured: depending on how much data is captured and the environment in which it is captured, the same sensor and analytics might produce confidence values ranging from very high to very low.
In an embodiment of the present invention, the joint probability function which is used for finding the location information of an individual user is obtained by the following equation:

$L_j^l(loc_j) = \prod_{i=1}^{n} P_{ji}(loc_j \mid x)$    (1)

where $P_{ji}(loc_j \mid x)$ is the probability distribution function of the location of the jth individual given the observation $x$ from the ith sensor. $P_{ji}(loc_j \mid x)$ is calculated from Bayes' theorem, given below:

$P_{ji}(loc \mid x) = \dfrac{p(x \mid loc)\, p(loc)}{\sum_{k} p(x \mid loc_k)\, p(loc_k)}$

where $p(loc)$ is the prior probability of the location of j. Thus the location of individual j is given by

$loc_j^l = \arg\max_{loc \in S} L_j^l(loc)$    (2)

where $S$ is the physical space in which the individual j can belong. The confidence score associated with the observation $L_j^l(loc_j^l)$ is $S_j^l$.
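The following sketch illustrates equations (1) and (2) numerically: per-sensor likelihoods over a small candidate set of locations are multiplied and the maximum a posteriori location is taken together with its confidence score. The sensor names, candidate locations and probability values are illustrative assumptions, not data from the disclosure.

```python
# Illustrative sketch (not the patented implementation): combining per-sensor
# likelihoods P_ji(loc | x) via equation (1) and picking the MAP location as in
# equation (2).
import numpy as np

def fuse_location(per_sensor_likelihoods):
    """per_sensor_likelihoods: list of dicts mapping candidate location -> P_ji(loc|x).
    Returns (loc_j, S_j): the argmax location and its confidence score L_j(loc_j)."""
    candidates = set.intersection(*(set(p) for p in per_sensor_likelihoods))
    scores = {loc: np.prod([p[loc] for p in per_sensor_likelihoods]) for loc in candidates}
    loc_j = max(scores, key=scores.get)
    return loc_j, scores[loc_j]

wifi  = {"corridor": 0.2, "cubicle": 0.7, "cafeteria": 0.1}
accel = {"corridor": 0.3, "cubicle": 0.6, "cafeteria": 0.1}
bms   = {"corridor": 0.1, "cubicle": 0.8, "cafeteria": 0.1}
print(fuse_location([wifi, accel, bms]))   # -> ('cubicle', ~0.34)
```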
In an embodiment of the present invention, mobile based proximity sensing is used for pair-wise proximity estimation between the different users of the group of people. From the "mobile proximity sensing", the distance d between the individuals j and i is characterized by the probability distribution function for the pair j and i to be at distance d, given by $P_{jim}(d \mid x)$.
$P_{jim}(d \mid x)$ is calculated from Bayes' theorem, as below:

$P_{jim}(d \mid x) = \dfrac{p(x \mid d)\, p(d)}{\sum_{k} p(x \mid d_k)\, p(d_k)}$

where $p(d)$ is the prior probability of the distance between j and i. Hence,

$d_{ji}^m = \arg\max_{d \in S} P_{jim}(d \mid x)$    (3)
The confidence score $S_{ji}^m$ associated with the observation $d_{ji}^m$ is $P_{jim}(d_{ji}^m \mid x)$.
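A minimal sketch of equation (3) follows, assuming a Gaussian observation model p(x | d) over a discretised grid of candidate distances; the grid, the uniform prior and the noise parameter are illustrative assumptions and are not specified by the disclosure.

```python
# Sketch of equation (3): posterior over candidate distances given an observation x,
# followed by the MAP distance d_ji^m and its confidence score S_ji^m.
import numpy as np

def map_distance(x, candidates, prior, sigma=1.0):
    """P_jim(d|x) proportional to p(x|d) * p(d); returns (d_ji^m, S_ji^m)."""
    likelihood = np.exp(-0.5 * ((x - candidates) / sigma) ** 2)   # p(x|d), assumed Gaussian
    posterior = likelihood * prior
    posterior /= posterior.sum()                                  # Bayes normalisation
    i = int(np.argmax(posterior))
    return candidates[i], posterior[i]

d_grid = np.arange(0.5, 10.5, 0.5)            # candidate distances in metres
prior = np.ones_like(d_grid) / d_grid.size    # uniform prior p(d)
print(map_distance(x=3.2, candidates=d_grid, prior=prior))
```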
The distance d between the individuals j and i using "3D camera data" is characterized by the probability distribution function for the pair j and i to be at distance d, given by $P_{jic}(d \mid x)$. Hence,

$d_{ji}^c = \arg\max_{d \in S} P_{jic}(d \mid x)$

The confidence score $S_{ji}^c$ associated with the observation $d_{ji}^c$ is $P_{jic}(d_{ji}^c \mid x)$. Further, $d_{ji}^l = \mathrm{distance}(loc_j, loc_i)$ is the distance between the locations of individual j ($loc_j$) and individual i ($loc_i$).
In an embodiment of the present invention, the grouping module (210) groups two users having similar current location information at the particular location with the predefined density criteria. The predefined density criteria may be derived by employing a density based clustering method.
In an embodiment of the present invention, the current location identification module (206) obtains precise current location information of the user within the particular location.
In an embodiment of the present invention, the proximity estimation module (208) is used to estimate the proximity for deriving accurate straight line distance between at least two users in a group at the current location by utilizing the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102) of the user.
In an embodiment of the present invention, the proximity estimation module (208) may use indoor proximity detection using Bluetooth, audio on mobile, a 3D depth-camera, real life communications discovery by scanning the e-mail headers of the individual, which provide information such as the To, CC and Subject fields giving an indication of the user's communication with other people, and the like. Bluetooth may be used to detect proximity since commercial phones come with class II Bluetooth, which provides only short range. The 3D depth cameras provide accurate measurement of distance and direction between people and hence provide a very good source for detecting proximity. The e-mail files and archives are parsed and read using many commercial or open source e-mail tools, including the Java mail API.
In an embodiment of the present invention, a clustering algorithm is used to create the groups. The clustering algorithm may use parameters such as a minimum group size Gmin, a maximum group size Gmax, and a maximum distance d_jcmax of an individual from the centroid of the group. The density based clustering (DBSCAN) algorithm may be used to form groups with a density criterion ε such that there have to be N individuals per unit area. The clustering algorithm gives the core groups as clusters, and individuals not belonging to any group are treated as outliers.
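As an illustration of this grouping step, the sketch below applies DBSCAN (here via scikit-learn's implementation, which is an assumption of the example) to fused 2-D indoor coordinates; the coordinates, the eps value standing in for the density criterion ε, and the minimum group size are made-up values.

```python
# Sketch of the grouping step using density-based clustering (DBSCAN) on fused
# 2-D indoor coordinates per user.
import numpy as np
from sklearn.cluster import DBSCAN

locations = np.array([[1.0, 1.2], [1.3, 1.0], [1.1, 1.1],   # three users near a cubicle
                      [8.0, 7.9], [8.2, 8.1],               # two users near a meeting room
                      [15.0, 2.0]])                         # an isolated user

labels = DBSCAN(eps=1.0, min_samples=2).fit_predict(locations)
print(labels)   # e.g. [0 0 0 1 1 -1]; label -1 marks individuals treated as outliers
```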
In an embodiment of the present invention, the fusion engine (212) is adapted for deriving the social interaction information of the user. The fusion engine (212) is hosted by a back end server (120). The fusion engine is used to fuse the current location information of the user, the derived accurate straight line distance between the two users in a group at the particular location, and data or information received from a web sensor (214).
In an embodiment of the present invention, the data from all these sensors as well as the web based soft-sensors are fed to the multimodal fusion module of the fusion engine (212). Each sensor may have an error probability and also a confidence score with which it reports a reading. The fusion module reads data from multiple sensors along with the mentioned properties for each reading. The engine then infers commonality from the reporting of multiple sensors and comes up with an aggregated attribute for the edge between the vertices under consideration. For example, "proximity" reported by a set of sensors may be fused with audio analysis to arrive at a conclusion such as "conversation". This, augmented with a location of a conference room, can be used to deduce a "meeting", whereas the same will be deduced as a "chat" if the location changes to the coffee machine. Another aspect of multimodal sensor fusion may be used for error reduction and cancellation. For example, a 3D camera may report proximity between two people at a location with a moderate degree of confidence while the location for one of the persons does not match the location derived from the accelerometer; in such a case the 3D camera data may be rejected as a "false positive".
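A hedged sketch of this kind of rule-based multimodal inference is given below: proximity plus detected speech is fused into "conversation", refined by location into "meeting" or "chat", and a low-confidence camera reading is rejected when it contradicts the accelerometer-derived location. The thresholds, labels and function signature are illustrative assumptions, not the patented fusion engine.

```python
# Illustrative rule-based fusion of proximity, audio and location cues.
def fuse_interaction(proximity, audio_speech, location, camera_conf, accel_location):
    # Error cancellation: drop a moderately confident 3D-camera proximity reading
    # if it contradicts the location derived from the accelerometer.
    if camera_conf < 0.6 and location != accel_location:
        return "false positive - camera reading rejected"
    if proximity and audio_speech:
        if location == "conference room":
            return "meeting"
        if location == "coffee machine":
            return "chat"
        return "conversation"
    return "co-located, no interaction detected"

print(fuse_interaction(True, True, "conference room", 0.9, "conference room"))  # meeting
print(fuse_interaction(True, True, "coffee machine", 0.9, "coffee machine"))    # chat
```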
In an embodiment, the web sensor (214) may be selected from the group comprising the social networking website profile of the user, the e-mail headers of the user, RSS feeds and the social blog profile of the user. Social networking sites like Facebook, Twitter etc. provide access to various information such as profile data, which is in the form of structured data. This structured data is gathered and separately parsed to extract the interests of the individual. The interests may provide an important property for an edge of the social graph, as two people having a common interest are likely to be connected across it. Structured data mining, as well as unstructured data mining from the user's blogs and social posts, may be used for forming edge attributes. The e-mail headers of the person may be scanned to understand real life communications of the individual, which may provide information such as the To, CC and Subject fields giving an indication of the user's communication with other people.
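The e-mail header scanning described above could, for instance, be sketched as follows with the Python standard library, counting To and CC addresses to weight edges of the social graph; the sample message and addresses are made up, and the disclosure itself mentions tools such as the Java mail API rather than this code.

```python
# Illustrative "web sensor": weight social-graph edges from e-mail To/CC headers.
from email import message_from_string
from email.utils import getaddresses
from collections import Counter

def edge_weights(raw_messages, owner):
    """Return a Counter of (owner, contact) edge weights from To/CC headers."""
    edges = Counter()
    for raw in raw_messages:
        msg = message_from_string(raw)
        recipients = getaddresses(msg.get_all("To", []) + msg.get_all("Cc", []))
        for _, addr in recipients:
            if addr and addr != owner:
                edges[(owner, addr)] += 1
    return edges

sample = "To: alice@example.com\nCc: bob@example.com\nSubject: design review\n\nbody"
print(edge_weights([sample], owner="carol@example.com"))
```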
In an embodiment of the present invention, the location information for the jth individual, $loc_j^l$, is given by equation (2), and the corresponding confidence value of the detected location is $S_j^l = L_j^l(loc_j^l)$. This is done for all N individuals. The pair-wise distances computed by "mobile proximity sensing" and "3D camera data" are $d_{ji}^m$ and $d_{ji}^c$ for individuals j and i. The location information of j as derived from "mobile proximity sensing" of the j–i pair is a sphere centered on $loc_i^l$ with radius $d_{ji}^m$. Thus the probability of the location of j derived from the location of i and the distance between the j–i pair is given by

$P_{ji}^m(loc_j \mid x) = P(loc_i^l) \cdot P(d_{ji}^m)$
The combined probability of the location of j derived from all other individuals is given as

$L_j^m(loc_j) = \prod_{i=1,\, i \neq j}^{N} P_{ji}^m(loc_j \mid x)$
Finally, the location of j derived from "mobile proximity sensing" is given as

$loc_j^m = \arg\max_{loc \in S} L_j^m(loc)$    (4)

The corresponding score is given as $S_j^m = L_j^m(loc_j^m)$.
The location of j as computed from "3D camera data", obtained from the distance of the j–i pair, is given as

$loc_j^c = loc_i^l + d_{ji}^c$    (5)

The location of j obtained in equation (5) is derived through the location of i; hence the score associated with equation (5) is $S_j^c = S_i^l \cdot S_{ji}^c$.
Thus the final fused location for the jth individual is obtained as the weighted sum of the locations obtained from the different observations, where the weights are the confidence scores of the individual observations:

$loc_j = \dfrac{S_j^l\, loc_j^l + S_j^m\, loc_j^m + \sum_{i=1,\, i \neq j}^{N} S_j^c\, loc_j^c}{S_j^l + S_j^m + \sum_{i=1,\, i \neq j}^{N} S_j^c}, \quad \forall j = 1, 2, \ldots, N$    (6)

where N is the number of individuals in the proximity and the camera-derived terms $S_j^c$ and $loc_j^c$ are computed per j–i pair.
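A small numeric sketch of equation (6) follows: the final location of individual j is the confidence-weighted combination of the estimates $loc_j^l$, $loc_j^m$ and $loc_j^c$. The coordinates and scores are illustrative values, and the camera-derived term is shown for a single pair for simplicity.

```python
# Sketch of equation (6): confidence-weighted average of location estimates.
import numpy as np

def fuse_final_location(estimates):
    """estimates: list of (location as np.array, confidence score S)."""
    scores = np.array([s for _, s in estimates])
    locs = np.stack([loc for loc, _ in estimates])
    return (scores[:, None] * locs).sum(axis=0) / scores.sum()

loc_l, s_l = np.array([2.0, 3.0]), 0.8   # from equation (2), localization
loc_m, s_m = np.array([2.4, 3.1]), 0.5   # from equation (4), mobile proximity sensing
loc_c, s_c = np.array([2.1, 2.8]), 0.7   # from equation (5), 3D camera data
print(fuse_final_location([(loc_l, s_l), (loc_m, s_m), (loc_c, s_c)]))
```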
Figure 3 illustrates the logical flow (300). As shown in step (302), the internal sensors (104) present in the mobile communication device (102) are activated for sensing information related to various activities performed by a user. In step (304) the processing module (110) checks for the presence of any external sensors and connects with them through communication means such as USB, Wi-Fi, Bluetooth and the like. As shown in steps (306) and (308), the processing module (110) checks whether any user activity, such as a download or browsing session, is going on in the mobile communication device (102); if so, the processing module (110) waits for a predefined time and again checks for user activity in step (306). As presented in steps (310) and (312), if no user activity is going on, the processing module (110) instructs the internal sensors (104) and external sensors (118) to sense user activity and transfer the captured data to the processing module (110). The processing module (110) then analyzes this data, separates the context information, and transmits only the relevant information to the transmission module (114), which further transfers this information to the backend server (120), as shown in step (314). Specific to the sensors under consideration, the processing module (110) performs the following activities: accelerometer analytics for activity detection, baseband communication using ultrasound for localization and proximity detection, analytics on microphone-captured audio for emotion detection and neighborhood sensing, camera-based ambient lighting detection, and analysis of external physiological sensors such as an EEG sensor for further insights into user context. At step (316) the back end server (120) and its underlying fusion and decision module perform analysis to identify the user's personal context and store this information in the database (122). At step (318) this stored information is used by the front end applications (124) for creating individual and aggregated user context that can be used for Organizational Behavior Analysis, Workspace Ergonomics Analysis, discovering the user's physical interaction network, and measuring and analyzing user response in user studies.
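For illustration, the handset-side flow of Figure 3 might be sketched as the following polling loop, in which sensing is deferred while the user is actively using the device and only the extracted context is transmitted. Every function below is a stand-in stub assumed for the example; none of them is the actual device API of the described system.

```python
# Pseudocode-style sketch of the Figure 3 flow with stub sensors and transport.
import random
import time

def user_activity_in_progress():            # stub for the checks in steps (306)/(308)
    return random.random() < 0.3

def read_sensors():                          # stub for internal/external sensor reads
    return {"accelerometer": [0.1, 0.0, 9.8], "audio_level_db": 42}

def extract_context(raw):                    # stub on-board analytics (step 312)
    z = raw["accelerometer"][2]
    return {"activity": "sitting" if abs(z - 9.8) < 0.5 else "moving"}

def transmit_to_backend(ctx):                # stub transmission module (step 314)
    print("sent:", ctx)

def sensing_loop(cycles=3, poll_interval=0.1):
    for _ in range(cycles):
        if user_activity_in_progress():      # defer sensing while the user is busy
            time.sleep(poll_interval)
            continue
        transmit_to_backend(extract_context(read_sensors()))
        time.sleep(poll_interval)

sensing_loop()
```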
Referring to Figure 4, which depicts the logical flow of the personal context based social interaction deriving method (400): as shown in step (402), the process starts by locating the user within a predefined range at the particular location. At step (404) the personal context of the user is identified using the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102). At step (406) the identified personal context of the user is assigned the predefined confidence value. At step (408) the precise current location information of the user is obtained within the particular location. At step (410) the accurate straight line distance between at least two users at the current location is derived. At step (412) the two users having similar current location information at the particular location are grouped together using the predefined density criteria. The process ends at step (414), wherein the social interaction information of the user in a group at the particular location is derived.

Referring to Figure 5, as an exemplary embodiment of the present invention, which depicts the normalized histogram representing the number of phones detected at the same location: a microphone sensor "i" which is static and whose location is known is considered. At a particular location loc_k, a mobile phone (j) may be used to play a sound at a known volume. The fixed microphone "i" receives the sound and computes a distance d_kj. The distance may be computed using the principle that, at the receiver using microphone sensors and the like, the attenuation of the volume of the sound is inversely proportional to the distance from the sound source. This computation is performed for N such phones, and a histogram is computed of the number of phones detected at the same location. A sample normalized histogram plot is shown in Figure 5. It is clear from Figure 5 that the maximum number of phones is detected at a distance of 11 units; this is the actual distance d_k. However, there may be a non-zero number of phones whose detected distance differs from d_k. This may be due to errors in observation, differences in phone model and environmental effects. This process is repeated for all loc_k, or equivalently all d_k. Every time the receiver observes a distance d_kj, there is a confidence score S_jk associated with the observation. The
P(x | loc) function shown in Figure 5, together with S_jk, is used to derive the probability distribution of the observed value given a distance d_k. During the actual detection process, based on the received sound from the mobile phone, the probability of the location loc_k observed by the ith sensor is P_ji(loc_k | x); this is obtained using the Bayes equation.
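The calibration idea behind Figure 5 can be sketched as follows: assuming the received volume is inversely proportional to distance, a distance estimate is computed for each of N phones and summarised as a normalized histogram whose peak lies near the actual distance d_k. The constant of proportionality, noise level and number of phones are assumptions made for the example.

```python
# Sketch of the Figure 5 calibration: distance from sound attenuation, histogram over N phones.
import numpy as np

def distance_from_volume(received, emitted, k=1.0):
    """If received is approximately k * emitted / d, then d is approximately k * emitted / received."""
    return k * emitted / received

rng = np.random.default_rng(0)
emitted, true_d = 1.0, 11.0
received = emitted / true_d + rng.normal(0, 0.002, size=200)   # noisy readings from N = 200 phones
estimates = distance_from_volume(received, emitted)
hist, edges = np.histogram(estimates, bins=15)
hist = hist / hist.sum()                                       # normalized histogram (as in Figure 5)
print(edges[np.argmax(hist)])                                  # bin edge of the peak, near d_k = 11
```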

Claims

WE CLAIM:
1. A method for identifying personal context of at least one user having a portable mobile communication device (102) at a particular location for deriving social interaction information of the user, the method comprising: a. locating at least one user within a predefined range by identifying personal context of the user at the particular location using a localization module (202), wherein the personal context of the user is identified using external sensors (118) and internal sensors (104) embedded in the mobile communication device (102); b. assigning a predefined confidence value to the identified personal context of the user using a confidence value assignment module (204);
c. obtaining precise current location information of the user within the particular location by fusing assigned confidence value of personal context of the user using a current location identification module (206) and fusion engine (212) hosted by a back end server (120);
d. estimating proximity for deriving accurate straight line distance between at least two users at the current location by utilizing the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102) of the user using a proximity estimation module (208);
e. grouping at least two users having similar current location information at the particular location with a predefined density criteria using a grouping module (210); and
f. deriving the social interaction information of the user by fusing the current location information of the user, the derived accurate straight line distance between at least two users in a group at the particular location and a web sensor (214) using the fusion engine (212) hosted by the back end server (120).
2. The method of claim 1, wherein the personal context of the user is selected from a group comprising of identity, proximity, activity and location.
3. The method of claim 1, wherein the mobile communication device (102) is selected from the group comprising of mobile phones, smartphones, laptops, palmtops and personal digital assistants (PDA).
4. The method of claim 1, wherein the external sensors (118) are selected from the group comprising of building management system (BMS), 3D depth camera, and ultrasound.
5. The method of claim 1, wherein the internal sensors (104) are selected from the group comprising of accelerometer, magnetometer, speech recognizer, Bluetooth RSSI and wireless frequency (Wi-Fi) signature.
6. The method of claim 1, wherein the predefined density criteria for grouping at least two users having similar current location information at the particular location is derived by employing a density based clustering method.
7. The method of claim 1, wherein the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102) are physical/hard sensors.
8. The method of claim 1, wherein the web sensor (214) is selected from the group comprising of social networking website profile of the user, email headers of the user, RSS feed and social blog profile of the user.
9. The method of claim 1, wherein the external sensors (118) and the internal sensors (104) are communicatively coupled with a processing module (110), and the external sensors (118) transmit the sensory data to the processing module (110) via an external sensor interface (108) while the internal sensors (104) transmit the sensory data to the processing module (110) directly.
10. The method of claim 9, wherein the processed sensory data received from the external sensors (118) and the internal sensors (104) is stored in an internal memory (116) communicatively coupled with the processing module (110).
11. The method of claim 1, wherein the derived social interaction information of the user is displayed using a front end application (124).
12. The method of claim 1, wherein the derived social interaction information of the user is selected from a group of interaction comprising interaction while working in cubicle, interaction while leading a meeting, interaction of a presenter in a session, passive listener in meeting, interaction during a group discussion.
13. The method of claim 1, wherein the fused data and personal context of the users are stored in a database (122) which is communicatively coupled to the back end server (120).
14. The method of claim 1, further comprising switching between other mobile device applications (106) of internal sensors (104) and personal context analysis when an interrupt is generated from regular activities of the mobile communication device (102) using a switching module (112).
15. A system for identifying personal context of at least one user at a particular location for deriving social interaction information of the user, the system comprising a mobile communication device (102); external sensors (118); a back end server (120) hosting a fusion engine (212); a front end application (124); and a database (122), wherein: a. the mobile communication device (102) further comprises:
internal sensors (104) adapted for sensing personal context of the user and sending the same to a processing module (110) of the mobile communication device (102); the processing module (110) adapted to perform on-board processing of the sensory data received from the external sensors (118) and the internal sensors (104);
an internal memory (116) communicatively coupled with the processing module (110), adapted to store processed sensory data received from the external sensors (118) and the internal sensors (104);
a transmission module (114) adapted to transmit processed sensory data received from the processing module (110) to a back end server (120); and a switching module (112) adapted for switching between other mobile device applications (106) of internal sensors (104) and personal context analysis when an interrupt is generated from regular activities of the mobile communication device (102);
b. the external sensors (118) adapted for sensing personal context of the user and sending the same to the processing module (110) via an external sensor interface (108) of the mobile communication device (102);
c. the back end server (120) hosting a fusion engine (212) adapted for fusing an assigned confidence value of personal context of the user, the current location information of the user and the derived accurate straight line distance between at least two users in a group at the particular location;
d. a front end application (124) adapted for displaying derived social interaction information of the user; and
e. a database (122) communicatively coupled to the back end server (120), adapted to store fused data and personal context of the users.
16. The system of claim 15, further comprising: a. a localization module (202), adapted for locating at least one user within a predefined range by identifying personal context of the user at the particular location, wherein the personal context of the user is identified using the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102);
b. a confidence value assignment module (204), adapted for assigning the predefined confidence value to the identified personal context of the user;
c. a current location identification module (206), adapted for obtaining precise current location information of the user within the particular location by fusing assigned confidence value of personal context of the user using a fusion engine (212);
d. a proximity estimation module (208), adapted for estimating proximity for deriving accurate straight line distance between at least two users at the current location by utilizing the external sensors (118) and the internal sensors (104) embedded in the mobile communication device (102) of the user;
e. a grouping module (210), adapted for grouping at least two users having similar current location information at the particular location with a predefined density criteria; and
f. the fusion engine (212), adapted for deriving the social interaction information of the user by fusing the current location information of the user and derived accurate straight line distance between at least two users in a group at the particular location.
17. The system of claim 15, wherein the mobile communication device (102) is selected from the group comprising of mobile phones, smartphones, laptops, palmtops and personal digital assistants (PDA).
18. The system of claim 15, wherein the external sensors (118) are selected from the group comprising of building management system (BMS), 3D depth camera, wearable EEG, and ultrasound.
19. The system of claim 15, wherein the internal sensors (104) are selected from the group comprising of accelerometer, magnetometer, speech recognizer and Wi-Fi signature.
PCT/IN2013/000045 2012-02-02 2013-01-22 A system and method for identifying and analyzing personal context of a user WO2013118144A2 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2014555389A JP5951802B2 (en) 2012-02-02 2013-01-22 System and method for identifying and analyzing a user's personal context
US14/376,536 US9560094B2 (en) 2012-02-02 2013-01-22 System and method for identifying and analyzing personal context of a user
AU2013217184A AU2013217184A1 (en) 2012-02-02 2013-01-22 A system and method for identifying and analyzing personal context of a user
EP13746227.1A EP2810426A4 (en) 2012-02-02 2013-01-22 A system and method for identifying and analyzing personal context of a user
CN201380014086.7A CN104335564B (en) 2012-02-02 2013-01-22 For identify and analyze user personal scene system and method
AU2016200905A AU2016200905B2 (en) 2012-02-02 2016-02-12 A system and method for identifying and analyzing personal context of a user

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN313MU2012 2012-02-02
IN313/MUM/2012 2012-02-02

Publications (2)

Publication Number Publication Date
WO2013118144A2 true WO2013118144A2 (en) 2013-08-15
WO2013118144A3 WO2013118144A3 (en) 2013-10-17

Family

ID=48948129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2013/000045 WO2013118144A2 (en) 2012-02-02 2013-01-22 A system and method for identifying and analyzing personal context of a user

Country Status (6)

Country Link
US (1) US9560094B2 (en)
EP (1) EP2810426A4 (en)
JP (1) JP5951802B2 (en)
CN (1) CN104335564B (en)
AU (2) AU2013217184A1 (en)
WO (1) WO2013118144A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104765730A (en) * 2014-01-02 2015-07-08 株式会社理光 Method and device for recommending interested people
CN104811903A (en) * 2015-03-25 2015-07-29 惠州Tcl移动通信有限公司 Method for establishing communication group and wearable device capable of establishing communication group
CN105519074A (en) * 2014-06-30 2016-04-20 华为技术有限公司 User data processing method and device
US9661010B2 (en) 2014-11-21 2017-05-23 Honeywell International Inc. Security log mining devices, methods, and systems
EP3561815A1 (en) * 2018-04-27 2019-10-30 Tata Consultancy Services Limited A unified platform for domain adaptable human behaviour inference

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9953304B2 (en) * 2012-12-30 2018-04-24 Buzd, Llc Situational and global context aware calendar, communications, and relationship management
US10034144B2 (en) * 2013-02-22 2018-07-24 International Business Machines Corporation Application and situation-aware community sensing
US9509643B1 (en) 2013-11-12 2016-11-29 Twitter, Inc. Network-based content discovery using messages of a messaging platform
US10909462B2 (en) * 2015-05-21 2021-02-02 Tata Consultancy Services Limited Multi-dimensional sensor data based human behaviour determination system and method
WO2016199463A1 (en) * 2015-06-12 2016-12-15 ソニー株式会社 Information processing device, information processing method, and program
WO2017115145A1 (en) 2015-12-31 2017-07-06 Delta Faucet Company Water sensor
US9711056B1 (en) * 2016-03-14 2017-07-18 Fuvi Cognitive Network Corp. Apparatus, method, and system of building and processing personal emotion-based computer readable cognitive sensory memory and cognitive insights for enhancing memorization and decision making skills
CN105974360A (en) * 2016-04-27 2016-09-28 沈阳云飞科技有限公司 Monitoring analysis method based on ADL and apparatus thereof
US10254378B1 (en) * 2016-07-05 2019-04-09 Phunware, Inc. Mobile device localization based on relative received signal strength indicators
US10087046B2 (en) * 2016-10-12 2018-10-02 Otis Elevator Company Intelligent building system for altering elevator operation based upon passenger identification
US20180146062A1 (en) * 2016-11-18 2018-05-24 Futurewei Technologies, Inc. Channel recommendation system and method
WO2018211655A1 (en) * 2017-05-18 2018-11-22 三菱電機株式会社 Position detection device, elevator control device, and elevator system
CN107346112B (en) * 2017-05-27 2020-02-14 深圳奥比中光科技有限公司 System and terminal equipment fusing multi-sensor information
CN107370869B (en) * 2017-05-27 2020-02-18 深圳奥比中光科技有限公司 Mobile terminal and mobile phone
US10127825B1 (en) 2017-06-13 2018-11-13 Fuvi Cognitive Network Corp. Apparatus, method, and system of insight-based cognitive assistant for enhancing user's expertise in learning, review, rehearsal, and memorization
EP3682373A4 (en) * 2017-09-15 2021-06-09 Tandemlaunch Inc System and method for classifying passive human-device interactions through ongoing device context awareness
US10511446B2 (en) 2017-09-22 2019-12-17 Cisco Technology, Inc. Methods and apparatus for secure device pairing for secure network communication including cybersecurity
US10853736B2 (en) * 2017-11-17 2020-12-01 Microsoft Technology Licensing, Llc Preventing notification blindness
CN110059795A (en) * 2018-01-18 2019-07-26 中国科学院声学研究所 A kind of mobile subscriber's node networking method merging geographical location and temporal characteristics
CN108845679B (en) * 2018-05-02 2019-08-02 上海交通大学 Full keyboard inputs acquisition methods on remote subscriber's touch screen
US10594549B2 (en) 2018-05-18 2020-03-17 Nant Holdings Ip, Llc Fine grained network management to edge device features
US11080438B2 (en) * 2018-05-30 2021-08-03 International Business Machines Corporation Building-information management system with directional wind propagation and diffusion
EP3798838B1 (en) * 2019-09-30 2022-04-06 Fujitsu Limited Method and system for reconciling values of a feature
US11650662B2 (en) * 2020-10-05 2023-05-16 Dell Products L.P. Automated display viewing angle alignment
US11589188B1 (en) 2021-05-27 2023-02-21 T-Mobile Usa, Inc. Device-based timely emergency call routing

Family Cites Families (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5844415A (en) * 1994-02-03 1998-12-01 Massachusetts Institute Of Technology Method for three-dimensional positions, orientation and mass distribution
US20030126258A1 (en) * 2000-02-22 2003-07-03 Conkright Gary W. Web based fault detection architecture
US8086672B2 (en) * 2000-06-17 2011-12-27 Microsoft Corporation When-free messaging
DE10218313B4 (en) * 2002-04-24 2018-02-15 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Digital motion picture camera
US6823190B2 (en) * 2002-12-03 2004-11-23 International Business Machines Corporation System and method to anonymously test for proximity of mobile users without revealing individual phase space coordinates
PL1616342T3 (en) * 2003-04-19 2007-05-31 Haake Andre Safety strip for a striking edge safety device or closing edge safety device
US8373660B2 (en) * 2003-07-14 2013-02-12 Matt Pallakoff System and method for a portable multimedia client
DE10331964B4 (en) * 2003-07-15 2016-05-04 Robert Bosch Gmbh Device for side impact detection and pressure sensor
US20050076013A1 (en) 2003-10-01 2005-04-07 Fuji Xerox Co., Ltd. Context-based contact information retrieval systems and methods
US7219255B2 (en) * 2004-07-27 2007-05-15 Mks Instruments, Inc. Failsafe switching of intelligent controller method and device
DE102004064002B4 (en) * 2004-08-04 2019-05-09 Continental Automotive Gmbh System for monitoring a sensor device
US7295103B2 (en) * 2004-12-22 2007-11-13 The Goodyear Tire & Rubber Company Integrated sensor system and method for a farm tire
US8583139B2 (en) * 2004-12-31 2013-11-12 Nokia Corporation Context diary application for a mobile terminal
US7400594B2 (en) * 2005-05-03 2008-07-15 Eaton Corporation Method and system for automated distributed pairing of wireless nodes of a communication network
JP2008072414A (en) * 2006-09-14 2008-03-27 Hitachi Ltd Sensor net system and sensor node
US20080130972A1 (en) * 2006-11-30 2008-06-05 General Electric Company Storing imaging parameters
US20080243439A1 (en) * 2007-03-28 2008-10-02 Runkle Paul R Sensor exploration and management through adaptive sensing framework
US8013731B2 (en) 2007-07-03 2011-09-06 3M Innovative Properties Company Apparatus and method for processing data collected via wireless network sensors
WO2009043020A2 (en) * 2007-09-28 2009-04-02 The Trustees Of Dartmouth College System and method for injecting sensed presence into social networking applications
JP2009129338A (en) * 2007-11-27 2009-06-11 Sony Corp Interpersonal relationship evaluation device, interpersonal relationship evaluation method, interpersonal relationship evaluation system, and terminal device
WO2009070138A1 (en) * 2007-11-29 2009-06-04 David Stackpole Dynamic geosocial networking
EP2250066B1 (en) * 2008-02-08 2017-11-15 ALSTOM Transport Technologies Railway sensor communication system and method
US8587402B2 (en) * 2008-03-07 2013-11-19 Palm, Inc. Context aware data processing in mobile computing device
US8290885B2 (en) * 2008-03-13 2012-10-16 Sony Corporation Information processing apparatus, information processing method, and computer program
WO2010001483A1 (en) * 2008-07-04 2010-01-07 パイオニア株式会社 Relationship estimation device and method
JP5137781B2 (en) 2008-10-30 2013-02-06 株式会社エヌ・ティ・ティ・ドコモ Mobile device and application switching method
US8279089B2 (en) * 2008-11-20 2012-10-02 Ellenberger & Poensgen Gmbh Method and device for monitoring the function of a safety unit
JP2010145228A (en) * 2008-12-18 2010-07-01 Sanyo Electric Co Ltd Position display apparatus and current position determination method
US8219513B2 (en) * 2008-12-19 2012-07-10 Eastman Kodak Company System and method for generating a context enhanced work of communication
JP2010165097A (en) * 2009-01-14 2010-07-29 Ntt Docomo Inc Personal relationship estimation device, and personal relationship estimation method
DE102009008125B4 (en) * 2009-02-09 2020-01-23 IAD Gesellschaft für Informatik, Automatisierung und Datenverarbeitung mbH Modular, expandable measuring device with an access-protected area
US20100284290A1 (en) 2009-04-09 2010-11-11 Aegis Mobility, Inc. Context based data mediation
TWI385544B (en) 2009-09-01 2013-02-11 Univ Nat Pingtung Sci & Tech Density-based data clustering method
CN102577494B (en) * 2009-09-28 2016-03-16 瑞典爱立信有限公司 Support the method and apparatus of the social network analysis in communication network
EP2520130B1 (en) * 2009-12-31 2020-12-09 Nokia Technologies Oy Method and apparatus for performing multiple forms of communications in one session
US20110172918A1 (en) * 2010-01-13 2011-07-14 Qualcomm Incorporated Motion state detection for mobile device
US20110246490A1 (en) 2010-04-01 2011-10-06 Sony Ericsson Mobile Communications Ab Updates with context information
CN103069412A (en) * 2010-06-29 2013-04-24 诺基亚公司 Method and apparatus for context-based grouping
EP2591466B1 (en) * 2010-07-06 2019-05-08 Sparkup Ltd. Method and system for book reading enhancement
US9055020B2 (en) * 2010-09-27 2015-06-09 Nokia Technologies Oy Method and apparatus for sharing user information
DE102011002706B4 (en) * 2011-01-14 2013-12-19 Siemens Aktiengesellschaft Device and method for protecting a security module against manipulation attempts in a field device
US20120319989A1 (en) * 2011-06-20 2012-12-20 Chris Argiro Video-game controller assemblies designed for progressive control of actionable-objects displayed on touchscreens: expanding the method and breadth of touch-input delivery
US20130103759A1 (en) * 2011-10-21 2013-04-25 Nokia Corporation Method and apparatus for providing data sharing schemes to provision device services
US10853531B2 (en) * 2011-11-02 2020-12-01 Nokia Technologies Oy Method and apparatus for context sensing inference

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP2810426A4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104765730A (en) * 2014-01-02 2015-07-08 株式会社理光 Method and device for recommending interested people
CN105519074A (en) * 2014-06-30 2016-04-20 华为技术有限公司 User data processing method and device
EP3145156A4 (en) * 2014-06-30 2017-05-31 Huawei Technologies Co. Ltd. User data processing method and device
US9661010B2 (en) 2014-11-21 2017-05-23 Honeywell International Inc. Security log mining devices, methods, and systems
CN104811903A (en) * 2015-03-25 2015-07-29 惠州Tcl移动通信有限公司 Method for establishing communication group and wearable device capable of establishing communication group
EP3561815A1 (en) * 2018-04-27 2019-10-30 Tata Consultancy Services Limited A unified platform for domain adaptable human behaviour inference

Also Published As

Publication number Publication date
US20140351337A1 (en) 2014-11-27
JP5951802B2 (en) 2016-07-13
CN104335564B (en) 2017-03-08
AU2016200905B2 (en) 2017-02-02
WO2013118144A3 (en) 2013-10-17
EP2810426A4 (en) 2015-09-02
US9560094B2 (en) 2017-01-31
AU2016200905A1 (en) 2016-05-12
EP2810426A2 (en) 2014-12-10
AU2013217184A1 (en) 2014-08-21
CN104335564A (en) 2015-02-04
JP2015510636A (en) 2015-04-09

Similar Documents

Publication Publication Date Title
AU2016200905B2 (en) A system and method for identifying and analyzing personal context of a user
Gu et al. Paws: Passive human activity recognition based on wifi ambient signals
US9342532B2 (en) System and method for real-time map-based lost and found
CN101938691B (en) Multi-modal proximity detection
US11510033B2 (en) Creation and consumption of transient user profiles
US10204292B2 (en) User terminal device and method of recognizing object thereof
Matic et al. Analysis of social interactions through mobile phones
CN108304758A (en) Facial features tracking method and device
CN107705251A (en) Picture joining method, mobile terminal and computer-readable recording medium
CN104182488A (en) Search method, server and client
Bouchaud et al. IoT Forensic: identification and classification of evidence in criminal investigations
WO2017000347A1 (en) Method and apparatus for protecting mobile terminal
Honnef et al. Zero-effort indoor continuous social distancing monitoring system
CN107704514A (en) A kind of photo management method, device and computer-readable recording medium
Ehatisham-ul-Haq et al. Using smartphone accelerometer for human physical activity and context recognition in-the-wild
Hanzl et al. Analyses of human behaviour in public spaces
Zheng et al. Missile: A system of mobile inertial sensor-based sensitive indoor location eavesdropping
CN112231768B (en) Data processing method and device, computer equipment and storage medium
Gala et al. Real-time indoor geolocation tracking for assisted healthcare facilities
Incel et al. Arservice: a smartphone based crowd-sourced data collection and activity recognition framework
Williams et al. Graph mining indoor tracking data for social interaction analysis
CN113837393B (en) Wireless perception model robustness detection method based on probability and statistical evaluation
Álvarez-García et al. Detecting social interactions in working environments through sensing technologies
Walse et al. A Survey on Framework for Context Awareness Using Smartphone
Ustebay et al. A Comparative Analysis of N-Nearest Neighbors (N3) and Binned Nearest Neighbors (BNN) Algorithms for Indoor Localization

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13746227

Country of ref document: EP

Kind code of ref document: A2

ENP Entry into the national phase

Ref document number: 2014555389

Country of ref document: JP

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2013746227

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2013746227

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 14376536

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2013217184

Country of ref document: AU

Date of ref document: 20130122

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13746227

Country of ref document: EP

Kind code of ref document: A2