US20080218331A1 - Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness - Google Patents
- Publication number: US20080218331A1
- Application number: US 11/715,338
- Authority: US (United States)
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G08B25/016: Personal emergency signalling and security systems
- G06F3/012: Head tracking input arrangements
- G08B13/19621: Portable surveillance camera
- G08B21/02: Alarms for ensuring the safety of persons
- G08B25/14: Central alarm receiver or annunciator arrangements
Definitions
- FIG. 8 is an exemplary embodiment in which user 102 and first responder 100 each have displays 124. This allows first responder 100 to receive the augmented video displayed on the video see-through display of user 102. In the case where there are multiple first responders 100, each responder 100 could receive video generated from any AR system used by another responder 100. Multiple video sources can be provided to user 102 and each first responder 100 in any known manner (e.g. split screen, multiple displays, switching sources, etc.). It should be noted that a responder can receive video on a display in implementations where the user display and camera are not co-located, as in FIG. 7.
- The elements shown in FIGS. 4-8 can be combined in any number of ways when appropriate (e.g. tracking device 122 and computer 110 can be combined within the same physical device). Further, the elements shown can be distinct physical devices that communicate with each other in any appropriate manner. For example, sensors 116 a and 116 b can communicate with computer 110 via radio communications, across a network using network protocols, or using any other appropriate method of electronic communications.
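As a concrete illustration of the last point, a sensor report could be carried to computer 110 as a small datagram. The wire format, the field names, and the use of JSON over UDP below are illustrative assumptions only; the patent does not specify a protocol.

```python
import json
import socket

def encode_sensor_report(sensor_id, position, readings):
    """Serialize one report from a sensor 116 a/116 b as a datagram payload.

    Field names are hypothetical; any self-describing format would do.
    """
    return json.dumps({
        "sensor_id": sensor_id,    # e.g. "116b-temp-03"
        "position": position,      # [x, y, z] in building coordinates
        "readings": readings,      # e.g. {"temperature_f": 435}
    }).encode("utf-8")

def send_sensor_report(sock, addr, sensor_id, position, readings):
    """Transmit a report to computer 110 over UDP."""
    sock.sendto(encode_sensor_report(sensor_id, position, readings), addr)

def parse_sensor_report(datagram):
    """Decode a report on the receiving side before updating database 118."""
    return json.loads(datagram.decode("utf-8"))
```

The same encoding could equally travel over a radio link or any other medium; only the transport differs.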
- FIG. 9 illustrates the concept of superimposing information on top of a real world exterior building view using the example of firefighters inside a burning building.
- The view of the building could be provided from a video camera, which could be mounted on a truck near the building, handled by a cameraman, or mounted on the captain in such a way that it represents the captain's view.
- The image could also be generated by using an optical see-through display.
- The image in FIG. 9 provides a perspective of the environment from outside the actual environment.
- The locations of responders called "John" and "Joe" are superimposed on top of the real-life view of the building. It should be noted that although John and Joe are represented by the same symbol (i.e. a cross), such a representation is not intended to be limiting and each responder could be represented by a unique symbol.
- John's and Joe's names are displayed next to their respective symbols, each with a percentage.
- The percentage represents the level of oxygen that each responder has in his oxygen tank.
- John's oxygen tank is 80% full and Joe's tank is 90% full. This can provide the captain with an idea of how much time John and Joe have to operate inside the building.
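The time estimate mentioned above is simple arithmetic on the tank-level reading. The full-tank duration used below is an assumed figure for illustration only; the patent does not specify one, and real SCBA duration varies with breathing rate.

```python
def estimated_minutes_remaining(tank_percent, rated_minutes=30.0):
    """Rough operating-time estimate from a tank-level reading.

    `rated_minutes` is a hypothetical full-tank duration used only to
    illustrate the calculation the captain might make mentally.
    """
    return rated_minutes * (tank_percent / 100.0)

# John at 80% and Joe at 90% of an assumed 30-minute tank:
john = estimated_minutes_remaining(80)   # 24.0 minutes
joe = estimated_minutes_remaining(90)    # 27.0 minutes
```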
- Avatars can alternatively be used to represent the first responders or any of the information received from them. There are numerous known avatars used in the electronic gaming art which could be incorporated into the system. Further, graphical information can represent the combined status of John and Joe, e.g. an indicator that represents a combined oxygen level. Alternatively, both could be shown using an aggregated symbol (e.g. for a team of responders operating close together, to reduce display clutter).
- In this example, the sensors are temperature sensors dropped somewhere in the building on fire. One such sensor is supplying a temperature reading of 435 degrees, as shown.
- Other types of sensors and additional temperature sensors can be placed throughout the building.
- Exemplary systems can also be implemented in the following applications: a system showing "blue force" (friendly) locations in military operations in urban environments, a system showing locations of workers inside a building, a system used by security personnel showing the location of an alarm sensor that has been triggered, and a system used by maintenance personnel to show the location of and data about a sensor or a set of sensors in a plant or building.
- FIG. 10 illustrates the principle of a first responder 100 having an augmented view.
- The system in FIG. 10 is not implemented any differently from the systems in FIGS. 4-8. The difference is the position of the person with the augmented view of the environment 108.
- In the systems of FIGS. 4-8, the person with the augmented view (user 102) is outside of the environment 108, whereas in FIG. 10 that person (first responder 100) is inside the environment.
- Because first responder 100 has an augmented view, information from sensors can be superimposed on his view of the environment, which is distinct from the user's view.
- An example of such a view is shown in FIG. 11, in which the firefighter's view of a room is overlaid with information such as who is in the room and where they are, as well as values coming from sensors that have been placed in the environment.
- First responders 100 see graphics superimposed on their individual views. A first responder 100 might use a helmet-mounted, wrist-mounted or PDA/tablet display to see the information aligned with the real world environment 108.
- This display 124 would show the same information, such as the locations of and data about responders 100 and sensors 116 b, or any other useful information. If a responder 100 needs assistance, it now becomes easy for other responders to come to help, because they can see where the responder 100 is with respect to the environment 108 and how to get to the responder 100 while avoiding obstacles.
Abstract
Description
- The present application is related to the following co-pending U.S. Patent applications, the entire contents of each of which are incorporated herein by reference:
- 1. U.S. application Ser. No. 11/441,241 entitled "System and Method to Display Maintenance and Operation Instructions of an Apparatus Using Augmented Reality," filed May 26, 2006;
- 2. U.S. application Ser. No. 11/______ entitled "Augmented Reality-Based System and Method Providing Status and Control of Unmanned Vehicles," filed Mar. 8, 2007; and
- 3. U.S. application Ser. No. 11/516,545 entitled "Method and System for Geo-Referencing and Visualization of Detected Contaminants," filed Sep. 7, 2006.
- This relates to showing the location of, and relevant data about, personnel and sensors within an environment on top of a view of the environment.
- The following description, given with respect to the attached drawings, may be better understood with reference to the non-limiting examples of the drawing, wherein the drawings show:
- FIG. 1: Prior art indoor personnel location information system;
- FIG. 2: Prior art indoor personnel location information system;
- FIG. 3: Prior art indoor personnel location information system;
- FIG. 4: Exemplary indoor personnel location information system;
- FIG. 5: Exemplary indoor personnel location information system incorporating an optical see-through display;
- FIG. 6: Exemplary indoor personnel location information system incorporating a video display;
- FIG. 7: Exemplary indoor personnel location information system incorporating a video display;
- FIG. 8: Exemplary indoor personnel location information system incorporating a video display;
- FIG. 9: Exemplary view of information superimposed on top of a view of an environment from the perspective of a responder outside of the environment;
- FIG. 10: Exemplary indoor personnel location system; and
- FIG. 11: Exemplary view of information superimposed on top of a view of an environment from the perspective of a responder within the environment.
- It has long been desirable to provide enhanced situational awareness to first responders. For example, providing first responders with more information about their surrounding environment could improve rescue operations. Prior art devices have attempted to provide enhanced situational awareness by combining a virtual representation of an environment (e.g. a map or 3D representation of a building) with status information received from first responders, and by having a user interpret the relevance of the combination and communicate it to the first responders.
- FIG. 1 illustrates one of the simplest ways prior art systems provide information to first responders. In FIG. 1, first responder 100 and user 102 communicate across communication channel 106 using respective communication devices 104 a and 104 b. Communication devices 104 a and 104 b are typically radio transceivers, and communication channel 106 is the physical medium through which the communication devices are linked. In the case where communication devices 104 a and 104 b are radio transceivers, communication channel 106 is simply air. User 102 is located some distance away from responder 100 and has a perspective of the first responder's surrounding environment 108 that allows user 102 to provide responder 100 with information about the responder's environment 108 not immediately available to responder 100.
- FIG. 2 illustrates a prior art system that enhances the prior art system of FIG. 1 by incorporating a computer 110 that provides a map 112 which can be viewed by user 102 on a display 114. Map 112 provides the user 102 with more information about environment 108. This information can be communicated by the user 102 to the first responder 100. Map 112 is typically a static 2D or 3D representation of a portion of environment 108.
- FIG. 3 illustrates a prior art system that enhances the prior art system of FIG. 2 by equipping a first responder 100 with a sensor 116 that allows the first responder's location to be monitored. This allows computer 110 to plot the first responder's location on the map 112. Thus, display 114 provides the user 102 with information about a first responder's location, where the first responder's location is superimposed on map 112.
- Prior art systems illustrated in FIGS. 1 and 2 do not combine a dynamic representation of an environment with information received from first responders. As a consequence, user 102 typically must create a mental picture of the location of each first responder 100 with respect to the environment 108 by using communications received from each responder or team of responders. Even if a map or 3D display of a virtual view of the environment is used, as in the system illustrated in FIG. 3, this view is not aligned with the real environment 108, and therefore requires mental integration to relate received information to the environment 108. Further, given the large number of buildings in a metropolitan area, it is very rare that a map or a 3D model will be available for every building.
- The following example provides an illustration of exemplary prior art systems. In an event where a one-story building is on fire, firefighters (i.e. first responders) arrive and enter the building. As they move around, they let the captain (i.e. user) know roughly where they are in the building (e.g. "I am entering the North-East corner"). The captain can use a map of the building to plot the locations of the firefighters in the building, or a more modern system might automatically plot the locations on a map, given that there are sensors able to sense the locations of the responders in the building. The captain can then communicate information to the firefighters about their locations based on information from the map and/or the captain's view of the building. Alternatively, the captain might use his view of the building for his own use without communicating information to the firefighters. In this example, dynamic information about the building (e.g. what parts are on fire and/or going down) is not combined with information received from the firefighters (e.g. locations). That is, the captain must look to the map to determine where the firefighters are located, look to the building to see which parts are on fire, and integrate both types of information to determine if any firefighters are in danger. Only after such integration can the captain communicate to the firefighters whether they are in danger.
- The system described herein uses augmented reality to show information received from first responders on top of a live view of the first responders' environment. By having information received from first responders on top of a dynamic, live, real-time view of an environment, the system can provide an enhanced representation of the environment. This enhanced representation can be used to provide enhanced situational awareness for first responders. For example, in the scenario described above, if a part of the building can be seen as becoming weak under fire, the captain will immediately be able to determine if any of the firefighters are in danger by looking at the information superimposed on the dynamic view of the environment. The captain can then call the person at this location to leave that area of the building. Further, if a responder is down, it is also possible to see the responder's location with respect to the actual building, which can be useful in determining the best way to reach the responder given the current conditions of the building.
- The system can also show the locations and values of sensors placed within the environment superimposed on top of a real-time view of the environment. For example, when a temperature sensor is dropped by a firefighter, the sensor's own tracking system (or the last location of the firefighter at the time he dropped the sensor) provides the location of the sensor. By showing data coming from the sensor on top of a real-time view of the environment, the captain can directly relate the sensor reading with a location in the environment.
- The situation awareness system described herein makes use of Augmented Reality (AR) technology to provide the necessary view to the user. AR is like Virtual Reality, but instead of using completely artificial images (e.g. maps or 3D models), AR superimposes 3D graphics on a view of the real world. A very simple example of AR is used in football games to show the first down with a yellow line. An example of an AR system that can be employed is one described in the examples of U.S. application Ser. No. 11/441,241 in combination with the present disclosure.
- An AR visualization system comprises: a spatial database, a graphical computer, a viewpoint tracking device, and a display device.
- The working principle of an Augmented Reality system is described below. A display device that displays dynamic images corresponding to a user's view is tracked. That is, the display's position and orientation are measured by a viewpoint tracking device. A spatial database and a graphical computer associate information with a real world environment. Associated information is superimposed on top of the dynamic display image in accordance with the display's position and orientation, thereby creating an augmented image.
- FIG. 4 shows an exemplary embodiment of an AR system used to provide a responder 100 with enhanced situational awareness.
- Computer 110 collects information from sensors 116 a worn by first responder 100 and sensors 116 b placed throughout surrounding environment 108. Sensors 116 a include sensors that allow a first responder's location to be monitored and can include sensors that provide information about the state of the first responder's health (e.g. temperature, heart rate, etc.), the first responder's equipment (e.g. capacity and/or power level of equipment), and conditions within the first responder's immediate proximity (e.g. temperature, air content). Sensors 116 b can include any sensors that can gather information about the conditions of the environment 108. Examples of such sensors 116 b include temperature sensors, radiation sensors, smoke detectors, gas sensors, wind sensors, pressure sensors, humidity sensors and the like. It should be noted that although FIG. 4 shows a single first responder 100, such a representation is not intended to be limiting and any number of first responders 100 could be located within the environment 108. Sensors 116 a and 116 b communicate with computer 110 using a communication medium similar to communication medium 106.
- Computer 110 updates database 118 with the information received from sensors 116 a and 116 b, and database 118 stores that information. Database 118 may additionally contain model information about the environment 108, such as a 3D model of a building. Model information may be used to provide advanced functionality in the system, but is not necessary for the basic system implementation. Graphical computer 110 continuously renders information from the database 118, thereby showing a first responder's location within the environment 108 and generating graphics from the current information received from sensors 116 a and 116 b.
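As a sketch of how a spatial database like database 118 might store and update such records, consider the following; the class and field names are hypothetical and not taken from the patent.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TrackedEntry:
    """One world-anchored record (a responder or a sensor).

    Field names are illustrative assumptions, not the patent's schema.
    """
    entry_id: str                     # e.g. "responder-john" or "sensor-116b-03"
    position: tuple                   # (x, y, z) in environment coordinates
    readings: dict = field(default_factory=dict)  # e.g. {"oxygen_pct": 80}
    updated_at: float = 0.0           # timestamp of the last report

class SpatialDatabase:
    """Minimal stand-in for database 118."""

    def __init__(self):
        self._entries = {}

    def update(self, entry_id, position, readings):
        """Apply a new report from sensors 116 a/116 b, as computer 110 does."""
        self._entries[entry_id] = TrackedEntry(
            entry_id, position, dict(readings), time.time())

    def snapshot(self):
        """Current entries for the renderer to draw each frame."""
        return list(self._entries.values())
```

Each render pass would then read `snapshot()` and project every entry's position into the current view.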
Computer 110 also receives information about the viewpoint of the display device 124 captured by the tracking device 122. Computer 110 takes information from database 118 and tracking information about the viewpoint of the display device 124 and renders current information from sensors 116 a and 116 b for the viewpoint of display device 124 by using a common 3D projection process. By measuring in real time the position and orientation of the display 124 (i.e. determining the user's viewpoint), it is possible to align information rendered from the spatial database 118 with the corresponding viewpoint. - The
display device 124 is able to show the image generated by the graphical computer 110 superimposed on a view of the surrounding environment 108 as “seen” by or through the display device 124. Thus, user 102 has a global perspective of environment 108 with information superimposed thereon and is able to use this enhanced global perspective of environment 108 to communicate information to first responder 100, thereby efficiently providing first responder 100 with information about environment 108 that would not otherwise be available. -
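The "common 3D projection process" mentioned above can be sketched with a simple pinhole camera model: given the tracked position and orientation of the display, a world point stored in the database is mapped to 2D display coordinates. This is a simplified, assumed implementation (yaw-only rotation, illustrative focal length), not the patent's actual method:

```python
# Sketch of projecting a database world point onto a tracked display.
# Pinhole model; the display is assumed to look along +z when yaw = 0.

import math

def project(world_pt, cam_pos, cam_yaw_deg, focal_px=800.0,
            width=1280, height=720):
    """Return (u, v) pixel coordinates, or None if the point is behind the display."""
    # Translate into the display's frame, then rotate about the vertical axis.
    dx = world_pt[0] - cam_pos[0]
    dy = world_pt[1] - cam_pos[1]
    dz = world_pt[2] - cam_pos[2]
    yaw = math.radians(cam_yaw_deg)
    x_cam = math.cos(yaw) * dx - math.sin(yaw) * dz
    z_cam = math.sin(yaw) * dx + math.cos(yaw) * dz
    y_cam = dy
    if z_cam <= 0:
        return None                       # behind the viewpoint: not drawn
    # Perspective divide, then shift to pixel coordinates (v grows downward).
    u = width / 2 + focal_px * x_cam / z_cam
    v = height / 2 - focal_px * y_cam / z_cam
    return (u, v)

# A point 5 m straight ahead of an untranslated, unrotated display lands at
# the center of the image.
print(project((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 0.0))  # (640.0, 360.0)
```

Re-running this projection every frame with the freshly measured pose is what keeps the rendered graphics aligned with the real-world viewpoint.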
FIG. 5 shows display device 124 implemented with an optical see-through display. Optical see-through displays show the image generated by the graphical computer 110 superimposed on a view of the surrounding environment 108 by using an optical beam splitter that lets through half of the light coming from environment 108 in front while reflecting half of the light coming from a display 124 showing the image generated by the graphical computer 110, in effect combining the real world environment 108 and the graphics. See-through displays are typically in the form of goggles worn by the user 102, but could also be a head-up display as used in fighter jets. -
FIG. 6 shows the display device 124 implemented with a video see-through display. Video see-through displays show the image generated by the graphical computer 110 superimposed on a view of environment 108 by using a video camera 126 to capture video of environment 108 and showing it on the display 124 after the image from the graphical computer 110 has been overlaid on top of it using video rendering device 128. In the case of a video see-through display, the camera capturing the view of the real world environment 108 and the display showing this video can be co-located in a single display device as shown in FIG. 6 or placed at different locations as shown in FIG. 7 . Video displays can be implemented using various types of display technologies and can be located anywhere in proximity to user 102. In the firefighter example, display 124 could be a screen inside a truck, or a tablet computer or PDA outside the truck. - The three exemplary configurations (optical see-through, co-located camera and display, and camera and display at different locations) described above are mentioned to aid understanding of the implementation of an AR system and are not intended to be limiting. Any AR system that is able to superimpose graphics that appear attached to the real world could be used.
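The video see-through path can be sketched as a simple keyed overlay: camera pixels are shown except where the graphical computer drew something. This stands in for the video rendering device's compositing step and is an illustrative assumption (pure-Python frames rather than real images):

```python
# Sketch of video see-through compositing: graphics pixels replace camera
# pixels wherever the overlay is non-transparent. Frames are nested lists.

TRANSPARENT = None   # overlay pixels where no graphics were rendered

def composite(camera_frame, overlay_frame):
    """Overlay rendered graphics onto the camera video, pixel by pixel."""
    out = []
    for cam_row, ov_row in zip(camera_frame, overlay_frame):
        out.append([cam if ov is TRANSPARENT else ov
                    for cam, ov in zip(cam_row, ov_row)])
    return out

camera = [["env"] * 4 for _ in range(3)]     # video of the environment
overlay = [[TRANSPARENT] * 4 for _ in range(3)]
overlay[1][2] = "435F"                       # a rendered sensor label
augmented = composite(camera, overlay)
print(augmented[1])  # ['env', 'env', '435F', 'env']
```

An optical see-through display performs the analogous combination optically with a beam splitter, so no compositing step is needed in software.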
-
FIG. 8 is an exemplary embodiment in which user 102 and first responder 100 each have displays 124. This allows first responder 100 to receive the augmented video displayed on the user's 102 video see-through display. In the case where there are multiple first responders 100, each responder 100 could receive video generated from any AR system used by another responder 100. Multiple video sources can be provided to user 102 and each first responder 100 in any known manner, e.g. split screen, multiple displays, switching sources, etc. It should be noted that a responder can receive video on a display in implementations where the user display and camera are not co-located, as in FIG. 7 . - It should be noted that the elements shown in
FIGS. 4-8 can be combined in any number of ways when appropriate (e.g. tracking device 122 and computer 110 can be combined within the same physical device). Further, the elements shown can be distinct physical devices that communicate with each other in any appropriate manner. For example, sensors 116 a and 116 b may communicate with computer 110 via radio communications, across a network using network protocols, or using any other appropriate method of electronic communications. -
FIG. 9 illustrates the concept of superimposing information on top of a real world exterior building view using the example of firefighters inside a burning building. In this example, the view of the building could be provided from a video camera mounted on a truck near the building, handled by a cameraman, or mounted on the captain in such a way that it represents the captain's view. The image could also be generated by using an optical see-through display. The image in FIG. 9 provides a perspective of the environment from outside the actual environment. As shown, the locations of responders called “John” and “Joe” are superimposed on top of the real-life view of the building. It should be noted that although John and Joe are represented by the same symbol (i.e. a cross), such a representation is not intended to be limiting and each responder could be represented by a unique symbol. - Also displayed next to John and Joe's names is information regarding the status of each. In this example, the percentage represents the level of oxygen that each has in his oxygen tank. Here John's oxygen tank is 80% full and Joe's tank is 90% full. This can provide the captain with an idea of how much time John and Joe have to operate inside the building. Avatars can alternatively be used to represent the first responders or any of the information received from them. There are numerous known avatars used in the electronic gaming art which could be incorporated into the system. Further, graphical information can represent the combined status of John and Joe, e.g. an indicator that represents a combined oxygen level. Alternatively, both could be shown using an aggregated symbol (e.g. a team of responders operating close by) to reduce display clutter.
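The combined-status indicator mentioned above (one aggregated symbol for several nearby responders, to reduce display clutter) could be computed as follows. This is a sketch under assumed conventions: showing the team at its centroid with the worst-case oxygen level, since that is the binding constraint on remaining operating time:

```python
# Sketch of aggregating nearby responders into one display symbol.
# Field names and the "minimum oxygen" policy are illustrative assumptions.

def aggregate_team(responders):
    """Combine responders into one symbol placed at their centroid."""
    xs = [r["pos"][0] for r in responders]
    ys = [r["pos"][1] for r in responders]
    centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
    # Show the worst-case tank level so the captain sees the limiting member.
    worst_o2 = min(r["o2_pct"] for r in responders)
    label = "+".join(r["name"] for r in responders)
    return {"pos": centroid, "label": label, "o2_pct": worst_o2}

team = aggregate_team([
    {"name": "John", "pos": (10.0, 4.0), "o2_pct": 80},
    {"name": "Joe",  "pos": (12.0, 4.0), "o2_pct": 90},
])
print(team["label"], team["o2_pct"])  # John+Joe 80
```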
- Shown above the representations of John and Joe is data coming from sensors that have been dropped inside the building. In this exemplary embodiment, the sensors are temperature sensors dropped somewhere in the building on fire. One such sensor is supplying a temperature reading of 435 degrees as shown. Other types of sensors and additional temperature sensors can be placed throughout the building.
- Although the principles of the exemplary system are illustrated using the example of a situational-awareness system for firefighters, exemplary systems can also be implemented in the following applications: a system showing “blue force” (friendly) locations in military operations in urban environments, a system showing locations of workers inside a building, a system used by security personnel showing the location of an alarm sensor that has been triggered, and a system used by maintenance personnel to show the location of and data about a sensor or a set of sensors in a plant/building.
-
FIG. 10 illustrates the principle of a first responder 100 having an augmented view. The system in FIG. 10 is not implemented any differently from the systems in FIGS. 4-8 . The difference is the position of the person with the augmented view of the environment 108. In FIGS. 4-8 , this person (user 102) is outside of the environment 108. In FIG. 10 , this person (first responder 100) is inside the environment. When first responder 100 has an augmented view, he has information from sensors superimposed on his own view of the environment, which is distinct from the user's view. An example of such a view is shown in FIG. 11 , in which the firefighter's view of a room is overlaid with information such as who is in the room and where they are, as well as values coming from sensors that have been placed in the environment. - In this case where
first responders 100 see graphics superimposed on their individual views, first responders 100 might be using a helmet-mounted, wrist-mounted or PDA/tablet display to see the information aligned with the real world environment 108. This display 124 would show the same information, such as the locations of and data about responders 100 and sensors 116 b, or any other useful information. If a responder 100 needs assistance, it now becomes easy for other responders to come to help because they see where the responder 100 is with respect to the environment 108 and they can see how to get to the responder 100 while avoiding obstacles. - While the invention has been described in connection with what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not to be limited to the disclosed embodiment, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
Claims (18)
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/715,338 US20080218331A1 (en) | 2007-03-08 | 2007-03-08 | Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness |
CA002679427A CA2679427A1 (en) | 2007-03-08 | 2008-03-07 | Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness |
JP2009552743A JP5553405B2 (en) | 2007-03-08 | 2008-03-07 | Augmented reality-based system and method for indicating the location of personnel and sensors in a closed structure and providing enhanced situational awareness |
EP08726549A EP2126866A4 (en) | 2007-03-08 | 2008-03-07 | Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness |
PCT/US2008/003038 WO2008112149A2 (en) | 2007-03-08 | 2008-03-07 | Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness |
AU2008226932A AU2008226932B2 (en) | 2007-03-08 | 2008-03-07 | Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness |
US13/354,167 US9324229B2 (en) | 2007-03-08 | 2012-01-19 | System and method to display maintenance and operational instructions of an apparatus using augmented reality |
JP2014061266A JP2014123406A (en) | 2007-03-08 | 2014-03-25 | Augmented reality-based system and method for showing locations of personnel and sensor inside occluded structure and providing increased situation awareness |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/715,338 US20080218331A1 (en) | 2007-03-08 | 2007-03-08 | Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/354,167 Continuation US9324229B2 (en) | 2007-03-08 | 2012-01-19 | System and method to display maintenance and operational instructions of an apparatus using augmented reality |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080218331A1 true US20080218331A1 (en) | 2008-09-11 |
Family
ID=39741080
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/715,338 Abandoned US20080218331A1 (en) | 2007-03-08 | 2007-03-08 | Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness |
US13/354,167 Active 2027-08-18 US9324229B2 (en) | 2007-03-08 | 2012-01-19 | System and method to display maintenance and operational instructions of an apparatus using augmented reality |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/354,167 Active 2027-08-18 US9324229B2 (en) | 2007-03-08 | 2012-01-19 | System and method to display maintenance and operational instructions of an apparatus using augmented reality |
Country Status (6)
Country | Link |
---|---|
US (2) | US20080218331A1 (en) |
EP (1) | EP2126866A4 (en) |
JP (2) | JP5553405B2 (en) |
AU (1) | AU2008226932B2 (en) |
CA (1) | CA2679427A1 (en) |
WO (1) | WO2008112149A2 (en) |
Cited By (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070271340A1 (en) * | 2006-05-16 | 2007-11-22 | Goodman Brian D | Context Enhanced Messaging and Collaboration System |
US20100313146A1 (en) * | 2009-06-08 | 2010-12-09 | Battelle Energy Alliance, Llc | Methods and systems relating to an augmented virtuality environment |
WO2012022930A1 (en) * | 2010-08-16 | 2012-02-23 | Stuart Graham Edwards | A building having an emergency information facility |
US8185101B1 (en) | 2008-04-10 | 2012-05-22 | Sandia Corporation | Handheld portable real-time tracking and communications device |
DE102011009952A1 (en) * | 2011-02-01 | 2012-08-02 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Method for determining position and location of astronaut in spacecraft, involves transmitting three-dimensional co-ordinates of detected three-dimensional position of each point from spacecraft to control station |
WO2013006642A3 (en) * | 2011-07-05 | 2013-03-14 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display |
FR2990286A1 (en) * | 2012-05-07 | 2013-11-08 | Schneider Electric Ind Sas | METHOD FOR HIGHLY REALIZED DISPLAY OF INFORMATION RELATING TO TARGET EQUIPMENT ON A SCREEN OF AN ELECTRONIC DEVICE, COMPUTER PROGRAM PRODUCT, ELECTRONIC DEVICE AND ELECTRIC EQUIPMENT THEREFOR |
WO2013181749A1 (en) * | 2012-06-08 | 2013-12-12 | Thales Canada Inc. | Integrated combat resource management system |
US8644673B2 (en) | 2011-03-22 | 2014-02-04 | Fmr Llc | Augmented reality system for re-casting a seminar with private calculations |
US8872640B2 (en) | 2011-07-05 | 2014-10-28 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring health and ergonomic status of drivers of vehicles |
US20150097673A1 (en) * | 2013-10-08 | 2015-04-09 | HYPERION S.r.l. | System of electronic devices for protection and security of places, persons, and goods |
US20150228117A1 (en) * | 2013-03-14 | 2015-08-13 | Glen Anderson | Asynchronous representation of alternate reality characters |
US9129430B2 (en) | 2013-06-25 | 2015-09-08 | Microsoft Technology Licensing, Llc | Indicating out-of-view augmented reality images |
US20150302655A1 (en) * | 2014-04-18 | 2015-10-22 | Magic Leap, Inc. | Using a map of the world for augmented or virtual reality systems |
US9424579B2 (en) | 2011-03-22 | 2016-08-23 | Fmr Llc | System for group supervision |
US9460561B1 (en) | 2013-03-15 | 2016-10-04 | Bentley Systems, Incorporated | Hypermodel-based panorama augmentation |
US9462977B2 (en) | 2011-07-05 | 2016-10-11 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9492120B2 (en) | 2011-07-05 | 2016-11-15 | Saudi Arabian Oil Company | Workstation for monitoring and improving health and productivity of employees |
US9615746B2 (en) | 2011-07-05 | 2017-04-11 | Saudi Arabian Oil Company | Floor mat system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9693734B2 (en) | 2011-07-05 | 2017-07-04 | Saudi Arabian Oil Company | Systems for monitoring and improving biometric health of employees |
US9710788B2 (en) | 2011-07-05 | 2017-07-18 | Saudi Arabian Oil Company | Computer mouse system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9722472B2 (en) | 2013-12-11 | 2017-08-01 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for harvesting human energy in the workplace |
US9761045B1 (en) | 2013-03-15 | 2017-09-12 | Bentley Systems, Incorporated | Dynamic and selective model clipping for enhanced augmented hypermodel visualization |
WO2017205052A1 (en) * | 2016-05-26 | 2017-11-30 | Motorola Solutions, Inc. | System and method for completing a call utilizing a head-mounted display and a communication device |
US9889311B2 (en) | 2015-12-04 | 2018-02-13 | Saudi Arabian Oil Company | Systems, protective casings for smartphones, and associated methods to enhance use of an automated external defibrillator (AED) device |
US9949640B2 (en) | 2011-07-05 | 2018-04-24 | Saudi Arabian Oil Company | System for monitoring employee health |
US10108783B2 (en) | 2011-07-05 | 2018-10-23 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring health of employees using mobile devices |
US10114451B2 (en) | 2011-03-22 | 2018-10-30 | Fmr Llc | Augmented reality in a virtual tour through a financial portfolio |
US10307104B2 (en) | 2011-07-05 | 2019-06-04 | Saudi Arabian Oil Company | Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US10354439B2 (en) | 2016-10-24 | 2019-07-16 | Charles C. Carrington | System for generating virtual building plan data based upon stored and scanned building data and related methods |
CN110168618A (en) * | 2017-01-09 | 2019-08-23 | 三星电子株式会社 | Augmented reality control system and method |
US10475351B2 (en) | 2015-12-04 | 2019-11-12 | Saudi Arabian Oil Company | Systems, computer medium and methods for management training systems |
US10510812B2 (en) | 2017-11-09 | 2019-12-17 | Lockheed Martin Corporation | Display-integrated infrared emitter and sensor structures |
CN110770678A (en) * | 2017-06-16 | 2020-02-07 | 微软技术许可有限责任公司 | Object holographic enhancement |
US10594951B2 (en) | 2018-02-07 | 2020-03-17 | Lockheed Martin Corporation | Distributed multi-aperture camera array |
US10628770B2 (en) | 2015-12-14 | 2020-04-21 | Saudi Arabian Oil Company | Systems and methods for acquiring and employing resiliency data for leadership development |
US10642955B2 (en) | 2015-12-04 | 2020-05-05 | Saudi Arabian Oil Company | Devices, methods, and computer medium to provide real time 3D visualization bio-feedback |
US10652529B2 (en) | 2018-02-07 | 2020-05-12 | Lockheed Martin Corporation | In-layer Signal processing |
US10690910B2 (en) | 2018-02-07 | 2020-06-23 | Lockheed Martin Corporation | Plenoptic cellular vision correction |
US10698201B1 (en) | 2019-04-02 | 2020-06-30 | Lockheed Martin Corporation | Plenoptic cellular axis redirection |
US10802582B1 (en) * | 2014-04-22 | 2020-10-13 | sigmund lindsay clements | Eye tracker in an augmented reality glasses for eye gaze to input displayed input icons |
US10824132B2 (en) | 2017-12-07 | 2020-11-03 | Saudi Arabian Oil Company | Intelligent personal protective equipment |
US10838250B2 (en) | 2018-02-07 | 2020-11-17 | Lockheed Martin Corporation | Display assemblies with electronically emulated transparency |
US10866413B2 (en) | 2018-12-03 | 2020-12-15 | Lockheed Martin Corporation | Eccentric incident luminance pupil tracking |
US10922881B2 (en) * | 2018-11-02 | 2021-02-16 | Star Global Expert Solutions Joint Stock Company | Three dimensional/360 degree (3D/360°) real-time full information smart management integrated mapping system (SMIMS) and process of generating the same |
US10930709B2 (en) | 2017-10-03 | 2021-02-23 | Lockheed Martin Corporation | Stacked transparent pixel structures for image sensors |
US10951883B2 (en) | 2018-02-07 | 2021-03-16 | Lockheed Martin Corporation | Distributed multi-screen array for high density display |
WO2021054941A1 (en) * | 2019-09-17 | 2021-03-25 | Hewlett-Packard Development Company, L.P. | Notification delivery |
US10979699B2 (en) | 2018-02-07 | 2021-04-13 | Lockheed Martin Corporation | Plenoptic cellular imaging system |
US20220060877A1 (en) * | 2020-08-19 | 2022-02-24 | Cade Robertson | Emergency gps signal system |
US11616941B2 (en) | 2018-02-07 | 2023-03-28 | Lockheed Martin Corporation | Direct camera-to-display system |
US11892624B2 (en) | 2021-04-27 | 2024-02-06 | Microsoft Technology Licensing, Llc | Indicating an off-screen target |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9600933B2 (en) | 2011-07-01 | 2017-03-21 | Intel Corporation | Mobile augmented reality system |
US20150150641A1 (en) * | 2011-10-28 | 2015-06-04 | Navigate Surgical Technologies, Inc. | System and method for determining the three-dimensional location and orientation of identification markers |
US20130137076A1 (en) * | 2011-11-30 | 2013-05-30 | Kathryn Stone Perez | Head-mounted display based education and instruction |
US9096920B1 (en) * | 2012-03-22 | 2015-08-04 | Google Inc. | User interface method |
US9361730B2 (en) | 2012-07-26 | 2016-06-07 | Qualcomm Incorporated | Interactions of tangible and augmented reality objects |
US9129429B2 (en) | 2012-10-24 | 2015-09-08 | Exelis, Inc. | Augmented reality on wireless mobile devices |
TWI486629B (en) | 2012-11-21 | 2015-06-01 | Ind Tech Res Inst | Optical-see-through head mounted display system and interactive operation |
US9483875B2 (en) | 2013-02-14 | 2016-11-01 | Blackberry Limited | Augmented reality system with encoding beacons |
US9607584B2 (en) * | 2013-03-15 | 2017-03-28 | Daqri, Llc | Real world analytics visualization |
US9892489B1 (en) * | 2013-08-20 | 2018-02-13 | Rockwell Collins, Inc. | System for and method of providing a virtual cockpit, control panel, or dashboard using augmented reality |
US9530057B2 (en) * | 2013-11-26 | 2016-12-27 | Honeywell International Inc. | Maintenance assistant system |
US9740935B2 (en) | 2013-11-26 | 2017-08-22 | Honeywell International Inc. | Maintenance assistant system |
US20160342839A1 (en) * | 2014-03-20 | 2016-11-24 | Hewlett Packard Enterprise Development Lp | Identifying electronic components for augmented reality |
US9826297B2 (en) * | 2014-10-29 | 2017-11-21 | At&T Intellectual Property I, L.P. | Accessory device that provides sensor input to a media device |
US9955059B2 (en) * | 2014-10-29 | 2018-04-24 | Kabushiki Kaisha Toshiba | Electronic device, method, and computer program product |
US9508248B2 (en) | 2014-12-12 | 2016-11-29 | Motorola Solutions, Inc. | Method and system for information management for an incident response |
US10065750B2 (en) * | 2015-02-10 | 2018-09-04 | Honeywell International Inc. | Aircraft maintenance systems and methods using wearable device |
US10382746B1 (en) | 2015-09-22 | 2019-08-13 | Rockwell Collins, Inc. | Stereoscopic augmented reality head-worn display with indicator conforming to a real-world object |
US10528021B2 (en) | 2015-10-30 | 2020-01-07 | Rockwell Automation Technologies, Inc. | Automated creation of industrial dashboards and widgets |
FI3374737T3 (en) * | 2015-11-10 | 2023-01-13 | Robust vision-inertial pedestrian tracking with heading auto-alignment | |
DE102015015695A1 (en) * | 2015-12-04 | 2017-06-08 | Audi Ag | A display system and method for operating a display system |
US10313281B2 (en) | 2016-01-04 | 2019-06-04 | Rockwell Automation Technologies, Inc. | Delivery of automated notifications by an industrial asset |
US10318570B2 (en) | 2016-08-18 | 2019-06-11 | Rockwell Automation Technologies, Inc. | Multimodal search input for an industrial search platform |
US10401839B2 (en) | 2016-09-26 | 2019-09-03 | Rockwell Automation Technologies, Inc. | Workflow tracking and identification using an industrial monitoring system |
US10545492B2 (en) | 2016-09-26 | 2020-01-28 | Rockwell Automation Technologies, Inc. | Selective online and offline access to searchable industrial automation data |
US10319128B2 (en) | 2016-09-26 | 2019-06-11 | Rockwell Automation Technologies, Inc. | Augmented reality presentation of an industrial environment |
US10354350B2 (en) | 2016-10-18 | 2019-07-16 | Motorola Solutions, Inc. | Method and system for information management for an incident response |
DE102016120081A1 (en) * | 2016-10-21 | 2018-04-26 | Minimax Gmbh & Co. Kg | Method for commissioning and / or maintenance of a fire alarm and / or extinguishing control panel and device therefor |
US10388075B2 (en) * | 2016-11-08 | 2019-08-20 | Rockwell Automation Technologies, Inc. | Virtual reality and augmented reality for industrial automation |
US10735691B2 (en) | 2016-11-08 | 2020-08-04 | Rockwell Automation Technologies, Inc. | Virtual reality and augmented reality for industrial automation |
US10866631B2 (en) | 2016-11-09 | 2020-12-15 | Rockwell Automation Technologies, Inc. | Methods, systems, apparatuses, and techniques for employing augmented reality and virtual reality |
US10942252B2 (en) * | 2016-12-26 | 2021-03-09 | Htc Corporation | Tracking system and tracking method |
US10895628B2 (en) * | 2016-12-29 | 2021-01-19 | Htc Corporation | Tracking system, tracking device and tracking method |
EP3593227B1 (en) * | 2017-03-10 | 2021-09-15 | Brainlab AG | Augmented reality pre-registration |
US20180357922A1 (en) | 2017-06-08 | 2018-12-13 | Honeywell International Inc. | Apparatus and method for assessing and tracking user competency in augmented/virtual reality-based training in industrial automation systems and other systems |
US10902536B2 (en) | 2017-06-14 | 2021-01-26 | International Business Machines Corporation | Cognitive emergency task coordination |
US10445944B2 (en) | 2017-11-13 | 2019-10-15 | Rockwell Automation Technologies, Inc. | Augmented reality safety automation zone system and method |
DE102017222534B3 (en) | 2017-12-12 | 2019-06-13 | Volkswagen Aktiengesellschaft | A method, computer readable storage medium having instructions, apparatus, and system for gauging augmented reality goggles in a vehicle, vehicle suitable for the method, and augmented reality goggles suitable for the method |
US10678238B2 (en) * | 2017-12-20 | 2020-06-09 | Intel IP Corporation | Modified-reality device and method for operating a modified-reality device |
US20190042843A1 (en) * | 2018-04-05 | 2019-02-07 | Intel Corporation | Cable detection for ar/vr computing method and apparatus |
US10853014B2 (en) | 2018-04-17 | 2020-12-01 | Rockwell Collins, Inc. | Head wearable device, system, and method |
FR3089672B1 (en) * | 2018-12-05 | 2021-12-03 | Thales Sa | Method and display and interaction system on board a cockpit |
US10498029B1 (en) * | 2019-07-15 | 2019-12-03 | Bao Tran | Cellular system |
US10461421B1 (en) * | 2019-05-07 | 2019-10-29 | Bao Tran | Cellular system |
US11486961B2 (en) * | 2019-06-14 | 2022-11-01 | Chirp Microsystems | Object-localization and tracking using ultrasonic pulses with reflection rejection |
DE102020105196A1 (en) * | 2020-02-27 | 2021-09-02 | Audi Aktiengesellschaft | Method for operating data glasses in a motor vehicle and a system with a motor vehicle and data glasses |
EP3885818A1 (en) * | 2020-03-25 | 2021-09-29 | Siemens Aktiengesellschaft | Device and method for spectroscopic surface analysis |
JP7040827B1 (en) | 2021-01-14 | 2022-03-23 | 株式会社ロックガレッジ | Search support system and rescue support program |
Citations (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US706214A (en) * | 1894-09-08 | 1902-08-05 | Avery Mfg Company | Straw-stacker. |
US5506862A (en) * | 1993-06-25 | 1996-04-09 | Digital Wireless Corp. | Digital implementation of spread spectrum communications system |
US6064335A (en) * | 1997-07-21 | 2000-05-16 | Trimble Navigation Limited | GPS based augmented reality collision avoidance system |
US6078142A (en) * | 1995-12-04 | 2000-06-20 | Industrial Technology Research Institute | Low power consumption driving method for field emitter displays |
US6112015A (en) * | 1996-12-06 | 2000-08-29 | Northern Telecom Limited | Network management graphical user interface |
US6166744A (en) * | 1997-11-26 | 2000-12-26 | Pathfinder Systems, Inc. | System for combining virtual images with real-world scenes |
US6215498B1 (en) * | 1998-09-10 | 2001-04-10 | Lionhearth Technologies, Inc. | Virtual command post |
US6281790B1 (en) * | 1999-09-01 | 2001-08-28 | Net Talon Security Systems, Inc. | Method and apparatus for remotely monitoring a site |
US6317127B1 (en) * | 1996-10-16 | 2001-11-13 | Hughes Electronics Corporation | Multi-user real-time augmented reality system and method |
US20020008153A1 (en) * | 2000-03-15 | 2002-01-24 | Ebersole John Franklin | Instrumented firefighter's nozzle and method |
US6348877B1 (en) * | 1999-06-17 | 2002-02-19 | International Business Machines Corporation | Method and system for alerting a pilot to the location of other aircraft |
US20020039085A1 (en) * | 2000-03-15 | 2002-04-04 | Ebersole John Franklin | Augmented reality display integrated with self-contained breathing apparatus |
US20020044104A1 (en) * | 1999-03-02 | 2002-04-18 | Wolfgang Friedrich | Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus |
US20020057340A1 (en) * | 1998-03-19 | 2002-05-16 | Fernandez Dennis Sunga | Integrated network for monitoring remote objects |
US20020069072A1 (en) * | 1999-03-02 | 2002-06-06 | Wolfgang Friedrich | Augmented-reality system with voice-based recording of information data, in particular of service reports |
US20020074370A1 (en) * | 2000-12-18 | 2002-06-20 | Quintana W. Vincent | Apparatus and method for using a wearable computer in testing and diagnostic applications |
US6421031B1 (en) * | 1993-10-22 | 2002-07-16 | Peter A. Ronzani | Camera display system |
US20020101568A1 (en) * | 2001-01-30 | 2002-08-01 | Eberl Heinrich A. | Interactive data view and command system |
US6434416B1 (en) * | 1998-11-10 | 2002-08-13 | Olympus Optical Co., Ltd. | Surgical microscope |
US6453168B1 (en) * | 1999-08-02 | 2002-09-17 | Itt Manufacturing Enterprises, Inc | Method and apparatus for determining the position of a mobile communication device using low accuracy clocks |
US20020140708A1 (en) * | 2001-03-27 | 2002-10-03 | Frank Sauer | Augmented reality guided instrument positioning with depth determining graphics |
US20020140694A1 (en) * | 2001-03-27 | 2002-10-03 | Frank Sauer | Augmented reality guided instrument positioning with guiding graphics |
US20020140709A1 (en) * | 2001-03-27 | 2002-10-03 | Frank Sauer | Augmented reality guided instrument positioning with modulated guiding graphics |
US6466232B1 (en) * | 1998-12-18 | 2002-10-15 | Tangis Corporation | Method and system for controlling presentation of information to a user based on the user's condition |
US20020160343A1 (en) * | 2000-03-15 | 2002-10-31 | Ebersole John Franklin | Method of simulating nozzle spray interaction with fire, smoke and other aerosols and gases |
US6474159B1 (en) * | 2000-04-21 | 2002-11-05 | Intersense, Inc. | Motion-tracking |
US20020174367A1 (en) * | 1999-09-01 | 2002-11-21 | Kimmel David E. | Method and apparatus for remotely monitoring a site |
US20020191003A1 (en) * | 2000-08-09 | 2002-12-19 | Hobgood Andrew W. | Method for using a motorized camera mount for tracking in augmented reality |
US20020197591A1 (en) * | 1999-03-15 | 2002-12-26 | Ebersole John Franklin | Method for simulating multi-layer obscuration from a viewpoint |
US20030034300A1 (en) * | 2001-08-01 | 2003-02-20 | Srinivasan Vadake R. | Plug flow anaerobic digester |
US20030037449A1 (en) * | 2001-08-23 | 2003-02-27 | Ali Bani-Hashemi | Augmented and virtual reality guided instrument positioning using along-the-line-of-sight alignment |
US20030040914A1 (en) * | 2000-01-27 | 2003-02-27 | Siemens Ag | System and method for eye tracking controlled speech processing |
US20030050785A1 (en) * | 2000-01-27 | 2003-03-13 | Siemens Aktiengesellschaft | System and method for eye-tracking controlled speech processing with generation of a visual feedback signal |
US6653990B1 (en) * | 1998-03-06 | 2003-11-25 | Societe Rasterland S.A. | System for displaying realistic virtual three-dimensional images in real time |
US6675091B2 (en) * | 2001-11-20 | 2004-01-06 | Siemens Corporate Research, Inc. | System and method for tracking, locating, and guiding within buildings |
US20040080467A1 (en) * | 2002-10-28 | 2004-04-29 | University Of Washington | Virtual image registration in augmented display field |
US20040105427A1 (en) * | 2000-08-22 | 2004-06-03 | Wolfgang Friedrich | System and method for communication between a mobile data processing device and a stationary data processing device |
US20040113885A1 (en) * | 2001-05-31 | 2004-06-17 | Yakup Genc | New input devices for augmented reality applications |
US6757068B2 (en) * | 2000-01-28 | 2004-06-29 | Intersense, Inc. | Self-referenced tracking |
US20040149036A1 (en) * | 2000-04-21 | 2004-08-05 | Eric Foxlin | Motion-tracking |
US20040212630A1 (en) * | 2002-07-18 | 2004-10-28 | Hobgood Andrew W. | Method for automatically tracking objects in augmented reality |
US6822648B2 (en) * | 2001-04-17 | 2004-11-23 | Information Decision Technologies, Llc | Method for occlusion of movable objects and people in augmented reality scenes |
US6873256B2 (en) * | 2002-06-21 | 2005-03-29 | Dorothy Lemelson | Intelligent building alarm |
US6922632B2 (en) * | 2002-08-09 | 2005-07-26 | Intersense, Inc. | Tracking, auto-calibration, and map-building system |
US20050168403A1 (en) * | 2003-12-17 | 2005-08-04 | Ebersole John F.Jr. | Method and system for accomplishing a scalable, multi-user, extended range, distributed, augmented reality environment |
US20050195279A1 (en) * | 2002-07-18 | 2005-09-08 | Andrew Wesley Hobgood | Method for using a wireless motorized camera mount for tracking in augmented reality |
US20050203380A1 (en) * | 2004-02-17 | 2005-09-15 | Frank Sauer | System and method for augmented reality navigation in a medical intervention procedure |
US20050256396A1 (en) * | 2004-05-17 | 2005-11-17 | Canon Kabushiki Kaisha | Image composition system, image composition method, and image composition apparatus |
US7002551B2 (en) * | 2002-09-25 | 2006-02-21 | Hrl Laboratories, Llc | Optical see-through augmented reality modified-scale display |
US20060043314A1 (en) * | 2004-09-02 | 2006-03-02 | Abraham Katzir | Thermoluminescence measurements and dosimetry with temperature control of the thermoluminescent element |
US20060071775A1 (en) * | 2004-09-22 | 2006-04-06 | Otto Kevin L | Remote field command post |
US20060075356A1 (en) * | 2004-10-04 | 2006-04-06 | Faulkner Lawrence Q | Three-dimensional cartographic user interface system |
US20060232499A1 (en) * | 2001-08-09 | 2006-10-19 | Ebersole John F | Method and Apparatus for Using Thermal Imaging and Augmented Reality |
US7126558B1 (en) * | 2001-10-19 | 2006-10-24 | Accenture Global Services Gmbh | Industrial augmented reality |
US20060241792A1 (en) * | 2004-12-22 | 2006-10-26 | Abb Research Ltd. | Method to generate a human machine interface |
US20070016372A1 (en) * | 2005-07-14 | 2007-01-18 | Gm Global Technology Operations, Inc. | Remote Perspective Vehicle Environment Observation System |
US20070018880A1 (en) * | 2005-07-14 | 2007-01-25 | Huston Charles D | GPS Based Situational Awareness and Identification System and Method |
US20070098238A1 (en) * | 2005-10-31 | 2007-05-03 | Pere Obrador | Imaging methods, imaging systems, and articles of manufacture |
US20070276590A1 (en) * | 2006-05-24 | 2007-11-29 | Raytheon Company | Beacon-Augmented Pose Estimation |
US20080204361A1 (en) * | 2007-02-28 | 2008-08-28 | Science Applications International Corporation | System and Method for Video Image Registration and/or Providing Supplemental Data in a Heads Up Display |
JP2009538487A (en) * | 2006-05-26 | 2009-11-05 | ITT Manufacturing Enterprises, Inc. | System and method for displaying device maintenance and operation instructions using augmented reality |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3208939B2 (en) | 1993-08-09 | 2001-09-17 | 日産自動車株式会社 | Monitoring device with head-mounted display device |
CA2174510A1 (en) * | 1993-10-22 | 1995-04-27 | John C. C. Fan | Head-mounted display system |
JPH0923449A (en) * | 1995-07-05 | 1997-01-21 | Hitachi Ltd | Actual-photographic picture and three-dimensional model composing device |
DE19639615C5 (en) | 1996-09-26 | 2008-11-06 | Brainlab Ag | Reflector referencing system for surgical and medical instruments |
JPH10207394A (en) * | 1997-01-16 | 1998-08-07 | Shimadzu Corp | Head mounted display system |
WO1999005580A2 (en) | 1997-07-23 | 1999-02-04 | Duschek Horst Juergen | Method for controlling an unmanned transport vehicle and unmanned transport vehicle system therefor |
US6522312B2 (en) | 1997-09-01 | 2003-02-18 | Canon Kabushiki Kaisha | Apparatus for presenting mixed reality shared among operators |
JP2000102036A (en) | 1998-09-22 | 2000-04-07 | Mr System Kenkyusho:Kk | Composite actual feeling presentation system, composite actual feeling presentation method, man-machine interface device and man-machine interface method |
US6708142B1 (en) | 1999-01-14 | 2004-03-16 | University Of Central Florida | Automatic motion modeling of rigid bodies using collision detection |
JP4550184B2 (en) * | 1999-07-02 | 2010-09-22 | オリンパス株式会社 | Observation optical system |
GB2352289B (en) * | 1999-07-14 | 2003-09-17 | Dennis Majoe | Position and orientation detection system |
JP3423676B2 (en) | 2000-07-19 | 2003-07-07 | キヤノン株式会社 | Virtual object operation device and virtual object operation method |
WO2002029700A2 (en) | 2000-10-05 | 2002-04-11 | Siemens Corporate Research, Inc. | Intra-operative image-guided neurosurgery with augmented reality visualization |
CA2477553A1 (en) | 2002-03-12 | 2003-09-25 | Menache, Llc | Motion tracking system and method |
JP2003337963A (en) | 2002-05-17 | 2003-11-28 | Seiko Epson Corp | Device and method for image processing, and image processing program and recording medium therefor |
JP4381767B2 (en) * | 2003-10-10 | 2009-12-09 | 能美防災株式会社 | Disaster prevention integrated system |
JP4566786B2 (en) | 2004-05-14 | 2010-10-20 | キヤノン株式会社 | Position and orientation measurement method and information processing apparatus |
JP2006000999A (en) * | 2004-06-21 | 2006-01-05 | Nec Fielding Ltd | Monitoring system, its method, monitoring robot and program |
US8665087B2 (en) * | 2004-11-10 | 2014-03-04 | Bae Systems Information And Electronic Systems Integration Inc. | Wearable or portable device including sensors and an image input for establishing communications interoperability and situational awareness of events at an incident site |
KR100687737B1 (en) * | 2005-03-19 | 2007-02-27 | 한국전자통신연구원 | Apparatus and method for a virtual mouse based on two-hands gesture |
JP2007018173A (en) * | 2005-07-06 | 2007-01-25 | Canon Inc | Image processing method and image processor |
DE502005009238D1 (en) | 2005-10-07 | 2010-04-29 | Brainlab Ag | Medical-technical marker device |
US8217856B1 (en) * | 2011-07-27 | 2012-07-10 | Google Inc. | Head-mounted display that displays a visual representation of physical interaction with an input interface located outside of the field of view |
- 2007
- 2007-03-08 US US11/715,338 patent/US20080218331A1/en not_active Abandoned
- 2008
- 2008-03-07 EP EP08726549A patent/EP2126866A4/en not_active Withdrawn
- 2008-03-07 JP JP2009552743A patent/JP5553405B2/en not_active Expired - Fee Related
- 2008-03-07 CA CA002679427A patent/CA2679427A1/en not_active Abandoned
- 2008-03-07 WO PCT/US2008/003038 patent/WO2008112149A2/en active Application Filing
- 2008-03-07 AU AU2008226932A patent/AU2008226932B2/en not_active Ceased
- 2012
- 2012-01-19 US US13/354,167 patent/US9324229B2/en active Active
- 2014
- 2014-03-25 JP JP2014061266A patent/JP2014123406A/en not_active Withdrawn
Patent Citations (84)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US706214A (en) * | 1894-09-08 | 1902-08-05 | Avery Mfg Company | Straw-stacker. |
US5506862A (en) * | 1993-06-25 | 1996-04-09 | Digital Wireless Corp. | Digital implementation of spread spectrum communications system |
US6421031B1 (en) * | 1993-10-22 | 2002-07-16 | Peter A. Ronzani | Camera display system |
US6078142A (en) * | 1995-12-04 | 2000-06-20 | Industrial Technology Research Institute | Low power consumption driving method for field emitter displays |
US6317127B1 (en) * | 1996-10-16 | 2001-11-13 | Hughes Electronics Corporation | Multi-user real-time augmented reality system and method |
US6112015A (en) * | 1996-12-06 | 2000-08-29 | Northern Telecom Limited | Network management graphical user interface |
US6064335A (en) * | 1997-07-21 | 2000-05-16 | Trimble Navigation Limited | GPS based augmented reality collision avoidance system |
US6166744A (en) * | 1997-11-26 | 2000-12-26 | Pathfinder Systems, Inc. | System for combining virtual images with real-world scenes |
US6653990B1 (en) * | 1998-03-06 | 2003-11-25 | Societe Rasterland S.A. | System for displaying realistic virtual three-dimensional images in real time |
US20020057340A1 (en) * | 1998-03-19 | 2002-05-16 | Fernandez Dennis Sunga | Integrated network for monitoring remote objects |
US6215498B1 (en) * | 1998-09-10 | 2001-04-10 | Lionhearth Technologies, Inc. | Virtual command post |
US6434416B1 (en) * | 1998-11-10 | 2002-08-13 | Olympus Optical Co., Ltd. | Surgical microscope |
US6466232B1 (en) * | 1998-12-18 | 2002-10-15 | Tangis Corporation | Method and system for controlling presentation of information to a user based on the user's condition |
US7324081B2 (en) * | 1999-03-02 | 2008-01-29 | Siemens Aktiengesellschaft | Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus |
US20020044104A1 (en) * | 1999-03-02 | 2002-04-18 | Wolfgang Friedrich | Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus |
US20020069072A1 (en) * | 1999-03-02 | 2002-06-06 | Wolfgang Friedrich | Augmented-reality system with voice-based recording of information data, in particular of service reports |
US20030003430A1 (en) * | 1999-03-15 | 2003-01-02 | Ebersole John Franklin | Method for simulating flow of an extinguishing agent |
US6500008B1 (en) * | 1999-03-15 | 2002-12-31 | Information Decision Technologies, Llc | Augmented reality-based firefighter training system and method |
US6809743B2 (en) * | 1999-03-15 | 2004-10-26 | Information Decision Technologies, Llc | Method of generating three-dimensional fire and smoke plume for graphical display |
US6809744B2 (en) * | 1999-03-15 | 2004-10-26 | Information Decision Technologies, Llc | Method for simulating flow of an extinguishing agent |
US20030017438A1 (en) * | 1999-03-15 | 2003-01-23 | Ebersole John Franklin | Method of generating three-dimensional fire and smoke plume for graphical display |
US20020197591A1 (en) * | 1999-03-15 | 2002-12-26 | Ebersole John Franklin | Method for simulating multi-layer obscuration from a viewpoint |
US6989831B2 (en) * | 1999-03-15 | 2006-01-24 | Information Decision Technologies, Llc | Method for simulating multi-layer obscuration from a viewpoint |
US6348877B1 (en) * | 1999-06-17 | 2002-02-19 | International Business Machines Corporation | Method and system for alerting a pilot to the location of other aircraft |
US6453168B1 (en) * | 1999-08-02 | 2002-09-17 | Itt Manufacturing Enterprises, Inc | Method and apparatus for determining the position of a mobile communication device using low accuracy clocks |
US20020174367A1 (en) * | 1999-09-01 | 2002-11-21 | Kimmel David E. | Method and apparatus for remotely monitoring a site |
US6281790B1 (en) * | 1999-09-01 | 2001-08-28 | Net Talon Security Systems, Inc. | Method and apparatus for remotely monitoring a site |
US20050177375A1 (en) * | 2000-01-27 | 2005-08-11 | Siemens Ag | System and method for eye tracking controlled speech processing |
US6853972B2 (en) * | 2000-01-27 | 2005-02-08 | Siemens Aktiengesellschaft | System and method for eye tracking controlled speech processing |
US6889192B2 (en) * | 2000-01-27 | 2005-05-03 | Siemens Aktiengesellschaft | Generating visual feedback signals for eye-tracking controlled speech processing |
US20030040914A1 (en) * | 2000-01-27 | 2003-02-27 | Siemens Ag | System and method for eye tracking controlled speech processing |
US20030050785A1 (en) * | 2000-01-27 | 2003-03-13 | Siemens Aktiengesellschaft | System and method for eye-tracking controlled speech processing with generation of a visual feedback signal |
US6757068B2 (en) * | 2000-01-28 | 2004-06-29 | Intersense, Inc. | Self-referenced tracking |
US20040201857A1 (en) * | 2000-01-28 | 2004-10-14 | Intersense, Inc., A Delaware Corporation | Self-referenced tracking |
US20020160343A1 (en) * | 2000-03-15 | 2002-10-31 | Ebersole John Franklin | Method of simulating nozzle spray interaction with fire, smoke and other aerosols and gases |
US6607038B2 (en) * | 2000-03-15 | 2003-08-19 | Information Decision Technologies, Llc | Instrumented firefighter's nozzle and method |
US6616454B2 (en) * | 2000-03-15 | 2003-09-09 | Information Decision Technologies, Llc | Method of simulating nozzle spray interaction with fire, smoke and other aerosols and gases |
US7110013B2 (en) * | 2000-03-15 | 2006-09-19 | Information Decision Technology | Augmented reality display integrated with self-contained breathing apparatus |
US20020008153A1 (en) * | 2000-03-15 | 2002-01-24 | Ebersole John Franklin | Instrumented firefighter's nozzle and method |
US20020039085A1 (en) * | 2000-03-15 | 2002-04-04 | Ebersole John Franklin | Augmented reality display integrated with self-contained breathing apparatus |
US20040149036A1 (en) * | 2000-04-21 | 2004-08-05 | Eric Foxlin | Motion-tracking |
US6474159B1 (en) * | 2000-04-21 | 2002-11-05 | Intersense, Inc. | Motion-tracking |
US20020191003A1 (en) * | 2000-08-09 | 2002-12-19 | Hobgood Andrew W. | Method for using a motorized camera mount for tracking in augmented reality |
US6903707B2 (en) * | 2000-08-09 | 2005-06-07 | Information Decision Technologies, Llc | Method for using a motorized camera mount for tracking in augmented reality |
US20040105427A1 (en) * | 2000-08-22 | 2004-06-03 | Wolfgang Friedrich | System and method for communication between a mobile data processing device and a stationary data processing device |
US20020074370A1 (en) * | 2000-12-18 | 2002-06-20 | Quintana W. Vincent | Apparatus and method for using a wearable computer in testing and diagnostic applications |
US6962277B2 (en) * | 2000-12-18 | 2005-11-08 | Bath Iron Works Corporation | Apparatus and method for using a wearable computer in testing and diagnostic applications |
US7245273B2 (en) * | 2001-01-30 | 2007-07-17 | David Parker Dickerson | Interactive data view and command system |
US20020101568A1 (en) * | 2001-01-30 | 2002-08-01 | Eberl Heinrich A. | Interactive data view and command system |
US20020140694A1 (en) * | 2001-03-27 | 2002-10-03 | Frank Sauer | Augmented reality guided instrument positioning with guiding graphics |
US20050093889A1 (en) * | 2001-03-27 | 2005-05-05 | Frank Sauer | Augmented reality guided instrument positioning with guiding graphics |
US6856324B2 (en) * | 2001-03-27 | 2005-02-15 | Siemens Corporate Research, Inc. | Augmented reality guided instrument positioning with guiding graphics |
US20020140709A1 (en) * | 2001-03-27 | 2002-10-03 | Frank Sauer | Augmented reality guided instrument positioning with modulated guiding graphics |
US20020140708A1 (en) * | 2001-03-27 | 2002-10-03 | Frank Sauer | Augmented reality guided instrument positioning with depth determining graphics |
US6822648B2 (en) * | 2001-04-17 | 2004-11-23 | Information Decision Technologies, Llc | Method for occlusion of movable objects and people in augmented reality scenes |
US20040113885A1 (en) * | 2001-05-31 | 2004-06-17 | Yakup Genc | New input devices for augmented reality applications |
US7215322B2 (en) * | 2001-05-31 | 2007-05-08 | Siemens Corporate Research, Inc. | Input devices for augmented reality applications |
US20030034300A1 (en) * | 2001-08-01 | 2003-02-20 | Srinivasan Vadake R. | Plug flow anaerobic digester |
US20060232499A1 (en) * | 2001-08-09 | 2006-10-19 | Ebersole John F | Method and Apparatus for Using Thermal Imaging and Augmented Reality |
US20030037449A1 (en) * | 2001-08-23 | 2003-02-27 | Ali Bani-Hashemi | Augmented and virtual reality guided instrument positioning using along-the-line-of-sight alignment |
US7126558B1 (en) * | 2001-10-19 | 2006-10-24 | Accenture Global Services Gmbh | Industrial augmented reality |
US6675091B2 (en) * | 2001-11-20 | 2004-01-06 | Siemens Corporate Research, Inc. | System and method for tracking, locating, and guiding within buildings |
US6873256B2 (en) * | 2002-06-21 | 2005-03-29 | Dorothy Lemelson | Intelligent building alarm |
US7071898B2 (en) * | 2002-07-18 | 2006-07-04 | Information Decision Technologies, Llc | Method for using a wireless motorized camera mount for tracking in augmented reality |
US7138963B2 (en) * | 2002-07-18 | 2006-11-21 | Metamersion, Llc | Method for automatically tracking objects in augmented reality |
US20050195279A1 (en) * | 2002-07-18 | 2005-09-08 | Andrew Wesley Hobgood | Method for using a wireless motorized camera mount for tracking in augmented reality |
US20040212630A1 (en) * | 2002-07-18 | 2004-10-28 | Hobgood Andrew W. | Method for automatically tracking objects in augmented reality |
US6922632B2 (en) * | 2002-08-09 | 2005-07-26 | Intersense, Inc. | Tracking, auto-calibration, and map-building system |
US7002551B2 (en) * | 2002-09-25 | 2006-02-21 | Hrl Laboratories, Llc | Optical see-through augmented reality modified-scale display |
US20040080467A1 (en) * | 2002-10-28 | 2004-04-29 | University Of Washington | Virtual image registration in augmented display field |
US7046214B2 (en) * | 2003-12-17 | 2006-05-16 | Information Decision Technologies, Llc | Method and system for accomplishing a scalable, multi-user, extended range, distributed, augmented reality environment |
US20050168403A1 (en) * | 2003-12-17 | 2005-08-04 | Ebersole John F.Jr. | Method and system for accomplishing a scalable, multi-user, extended range, distributed, augmented reality environment |
US20050203380A1 (en) * | 2004-02-17 | 2005-09-15 | Frank Sauer | System and method for augmented reality navigation in a medical intervention procedure |
US20050256396A1 (en) * | 2004-05-17 | 2005-11-17 | Canon Kabushiki Kaisha | Image composition system, image composition method, and image composition apparatus |
US20060043314A1 (en) * | 2004-09-02 | 2006-03-02 | Abraham Katzir | Thermoluminescence measurements and dosimetry with temperature control of the thermoluminescent element |
US20060071775A1 (en) * | 2004-09-22 | 2006-04-06 | Otto Kevin L | Remote field command post |
US20060075356A1 (en) * | 2004-10-04 | 2006-04-06 | Faulkner Lawrence Q | Three-dimensional cartographic user interface system |
US20060241792A1 (en) * | 2004-12-22 | 2006-10-26 | Abb Research Ltd. | Method to generate a human machine interface |
US20070018880A1 (en) * | 2005-07-14 | 2007-01-25 | Huston Charles D | GPS Based Situational Awareness and Identification System and Method |
US20070016372A1 (en) * | 2005-07-14 | 2007-01-18 | Gm Global Technology Operations, Inc. | Remote Perspective Vehicle Environment Observation System |
US20070098238A1 (en) * | 2005-10-31 | 2007-05-03 | Pere Obrador | Imaging methods, imaging systems, and articles of manufacture |
US20070276590A1 (en) * | 2006-05-24 | 2007-11-29 | Raytheon Company | Beacon-Augmented Pose Estimation |
JP2009538487A (en) * | 2006-05-26 | 2009-11-05 | ITT Manufacturing Enterprises, Inc. | System and method for displaying device maintenance and operation instructions using augmented reality |
US20080204361A1 (en) * | 2007-02-28 | 2008-08-28 | Science Applications International Corporation | System and Method for Video Image Registration and/or Providing Supplemental Data in a Heads Up Display |
Cited By (113)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070271340A1 (en) * | 2006-05-16 | 2007-11-22 | Goodman Brian D | Context Enhanced Messaging and Collaboration System |
US8185101B1 (en) | 2008-04-10 | 2012-05-22 | Sandia Corporation | Handheld portable real-time tracking and communications device |
US20100313146A1 (en) * | 2009-06-08 | 2010-12-09 | Battelle Energy Alliance, Llc | Methods and systems relating to an augmented virtuality environment |
US8732592B2 (en) * | 2009-06-08 | 2014-05-20 | Battelle Energy Alliance, Llc | Methods and systems relating to an augmented virtuality environment |
GB2496825A (en) * | 2010-08-16 | 2013-05-22 | Stuart Graham Edwards | A building having an emergency information facility |
CN103124991A (en) * | 2010-08-16 | 2013-05-29 | 斯图尔特·格雷厄姆·爱德华兹 | A building having an emergency information facility |
US8723666B2 (en) | 2010-08-16 | 2014-05-13 | Stuart Graham Edwards | Building having an emergency information facility |
WO2012022930A1 (en) * | 2010-08-16 | 2012-02-23 | Stuart Graham Edwards | A building having an emergency information facility |
DE102011009952A1 (en) * | 2011-02-01 | 2012-08-02 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Method for determining position and location of astronaut in spacecraft, involves transmitting three-dimensional co-ordinates of detected three-dimensional position of each point from spacecraft to control station |
US10455089B2 (en) | 2011-03-22 | 2019-10-22 | Fmr Llc | Augmented reality system for product selection |
US9264655B2 (en) | 2011-03-22 | 2016-02-16 | Fmr Llc | Augmented reality system for re-casting a seminar with private calculations |
US10114451B2 (en) | 2011-03-22 | 2018-10-30 | Fmr Llc | Augmented reality in a virtual tour through a financial portfolio |
US9973630B2 (en) | 2011-03-22 | 2018-05-15 | Fmr Llc | System for group supervision |
US8644673B2 (en) | 2011-03-22 | 2014-02-04 | Fmr Llc | Augmented reality system for re-casting a seminar with private calculations |
US9424579B2 (en) | 2011-03-22 | 2016-08-23 | Fmr Llc | System for group supervision |
US9962083B2 (en) | 2011-07-05 | 2018-05-08 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring and improving biomechanical health of employees |
US9693734B2 (en) | 2011-07-05 | 2017-07-04 | Saudi Arabian Oil Company | Systems for monitoring and improving biometric health of employees |
US10052023B2 (en) | 2011-07-05 | 2018-08-21 | Saudi Arabian Oil Company | Floor mat system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9844344B2 (en) | 2011-07-05 | 2017-12-19 | Saudi Arabian Oil Company | Systems and method to monitor health of employee when positioned in association with a workstation |
US9833142B2 (en) | 2011-07-05 | 2017-12-05 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for coaching employees based upon monitored health conditions using an avatar |
WO2013006642A3 (en) * | 2011-07-05 | 2013-03-14 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display |
US10058285B2 (en) | 2011-07-05 | 2018-08-28 | Saudi Arabian Oil Company | Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US10307104B2 (en) | 2011-07-05 | 2019-06-04 | Saudi Arabian Oil Company | Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9256711B2 (en) | 2011-07-05 | 2016-02-09 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display |
US8872640B2 (en) | 2011-07-05 | 2014-10-28 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring health and ergonomic status of drivers of vehicles |
AU2012279054B2 (en) * | 2011-07-05 | 2016-02-25 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display |
US9830576B2 (en) | 2011-07-05 | 2017-11-28 | Saudi Arabian Oil Company | Computer mouse for monitoring and improving health and productivity of employees |
CN103765426A (en) * | 2011-07-05 | 2014-04-30 | 沙特阿拉伯石油公司 | Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display |
US9830577B2 (en) | 2011-07-05 | 2017-11-28 | Saudi Arabian Oil Company | Computer mouse system and associated computer medium for monitoring and improving health and productivity of employees |
US9462977B2 (en) | 2011-07-05 | 2016-10-11 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9492120B2 (en) | 2011-07-05 | 2016-11-15 | Saudi Arabian Oil Company | Workstation for monitoring and improving health and productivity of employees |
US9808156B2 (en) | 2011-07-05 | 2017-11-07 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring and improving biomechanical health of employees |
US9526455B2 (en) | 2011-07-05 | 2016-12-27 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9615746B2 (en) | 2011-07-05 | 2017-04-11 | Saudi Arabian Oil Company | Floor mat system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9949640B2 (en) | 2011-07-05 | 2018-04-24 | Saudi Arabian Oil Company | System for monitoring employee health |
US9710788B2 (en) | 2011-07-05 | 2017-07-18 | Saudi Arabian Oil Company | Computer mouse system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
US9805339B2 (en) | 2011-07-05 | 2017-10-31 | Saudi Arabian Oil Company | Method for monitoring and improving health and productivity of employees using a computer mouse system |
US10108783B2 (en) | 2011-07-05 | 2018-10-23 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for monitoring health of employees using mobile devices |
US10206625B2 (en) | 2011-07-05 | 2019-02-19 | Saudi Arabian Oil Company | Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees |
WO2013167523A1 (en) * | 2012-05-07 | 2013-11-14 | Schneider Electric Industries Sas | Method for displaying, in augmented reality, information relating to target equipment on a screen of an electronic device, and associated computer program product, electronic device and electrical equipment |
FR2990286A1 (en) * | 2012-05-07 | 2013-11-08 | Schneider Electric Ind Sas | METHOD FOR HIGHLY REALIZED DISPLAY OF INFORMATION RELATING TO TARGET EQUIPMENT ON A SCREEN OF AN ELECTRONIC DEVICE, COMPUTER PROGRAM PRODUCT, ELECTRONIC DEVICE AND ELECTRIC EQUIPMENT THEREFOR |
WO2013181749A1 (en) * | 2012-06-08 | 2013-12-12 | Thales Canada Inc. | Integrated combat resource management system |
US10319145B2 (en) * | 2013-03-14 | 2019-06-11 | Intel Corporation | Asynchronous representation of alternate reality characters |
CN104981833A (en) * | 2013-03-14 | 2015-10-14 | 英特尔公司 | Asynchronous representation of alternate reality characters |
US20150228117A1 (en) * | 2013-03-14 | 2015-08-13 | Glen Anderson | Asynchronous representation of alternate reality characters |
US9761045B1 (en) | 2013-03-15 | 2017-09-12 | Bentley Systems, Incorporated | Dynamic and selective model clipping for enhanced augmented hypermodel visualization |
US9460561B1 (en) | 2013-03-15 | 2016-10-04 | Bentley Systems, Incorporated | Hypermodel-based panorama augmentation |
US9501873B2 (en) | 2013-06-25 | 2016-11-22 | Microsoft Technology Licensing, Llc | Indicating out-of-view augmented reality images |
US9761057B2 (en) | 2013-06-25 | 2017-09-12 | Microsoft Technology Licensing, Llc | Indicating out-of-view augmented reality images |
US9129430B2 (en) | 2013-06-25 | 2015-09-08 | Microsoft Technology Licensing, Llc | Indicating out-of-view augmented reality images |
US9286791B2 (en) * | 2013-10-08 | 2016-03-15 | HYPERION S.r.l. | Protection and security system including three-dimensional virtual reality |
US20150097673A1 (en) * | 2013-10-08 | 2015-04-09 | HYPERION S.r.l. | System of electronic devices for protection and security of places, persons, and goods |
US9722472B2 (en) | 2013-12-11 | 2017-08-01 | Saudi Arabian Oil Company | Systems, computer medium and computer-implemented methods for harvesting human energy in the workplace |
US10013806B2 (en) | 2014-04-18 | 2018-07-03 | Magic Leap, Inc. | Ambient light compensation for augmented or virtual reality |
US9767616B2 (en) | 2014-04-18 | 2017-09-19 | Magic Leap, Inc. | Recognizing objects in a passable world model in an augmented or virtual reality system |
US9928654B2 (en) | 2014-04-18 | 2018-03-27 | Magic Leap, Inc. | Utilizing pseudo-random patterns for eye tracking in augmented or virtual reality systems |
US9911233B2 (en) | 2014-04-18 | 2018-03-06 | Magic Leap, Inc. | Systems and methods for using image based light solutions for augmented or virtual reality |
US9911234B2 (en) | 2014-04-18 | 2018-03-06 | Magic Leap, Inc. | User interface rendering in augmented or virtual reality systems |
US9972132B2 (en) | 2014-04-18 | 2018-05-15 | Magic Leap, Inc. | Utilizing image based light solutions for augmented or virtual reality |
US11205304B2 (en) | 2014-04-18 | 2021-12-21 | Magic Leap, Inc. | Systems and methods for rendering user interfaces for augmented or virtual reality |
US9984506B2 (en) | 2014-04-18 | 2018-05-29 | Magic Leap, Inc. | Stress reduction in geometric maps of passable world model in augmented or virtual reality systems |
US9996977B2 (en) | 2014-04-18 | 2018-06-12 | Magic Leap, Inc. | Compensating for ambient light in augmented or virtual reality systems |
US10008038B2 (en) | 2014-04-18 | 2018-06-26 | Magic Leap, Inc. | Utilizing totems for augmented or virtual reality systems |
US9881420B2 (en) | 2014-04-18 | 2018-01-30 | Magic Leap, Inc. | Inferential avatar rendering techniques in augmented or virtual reality systems |
US10043312B2 (en) | 2014-04-18 | 2018-08-07 | Magic Leap, Inc. | Rendering techniques to find new map points in augmented or virtual reality systems |
US9852548B2 (en) | 2014-04-18 | 2017-12-26 | Magic Leap, Inc. | Systems and methods for generating sound wavefronts in augmented or virtual reality systems |
US10909760B2 (en) | 2014-04-18 | 2021-02-02 | Magic Leap, Inc. | Creating a topological map for localization in augmented or virtual reality systems |
US10109108B2 (en) | 2014-04-18 | 2018-10-23 | Magic Leap, Inc. | Finding new points by render rather than search in augmented or virtual reality systems |
US10665018B2 (en) | 2014-04-18 | 2020-05-26 | Magic Leap, Inc. | Reducing stresses in the passable world model in augmented or virtual reality systems |
US10115233B2 (en) * | 2014-04-18 | 2018-10-30 | Magic Leap, Inc. | Methods and systems for mapping virtual objects in an augmented or virtual reality system |
US10115232B2 (en) * | 2014-04-18 | 2018-10-30 | Magic Leap, Inc. | Using a map of the world for augmented or virtual reality systems |
US9766703B2 (en) | 2014-04-18 | 2017-09-19 | Magic Leap, Inc. | Triangulation of points using known points in augmented or virtual reality systems |
US10127723B2 (en) * | 2014-04-18 | 2018-11-13 | Magic Leap, Inc. | Room based sensors in an augmented reality system |
US10186085B2 (en) | 2014-04-18 | 2019-01-22 | Magic Leap, Inc. | Generating a sound wavefront in augmented or virtual reality systems |
US10198864B2 (en) | 2014-04-18 | 2019-02-05 | Magic Leap, Inc. | Running object recognizers in a passable world model for augmented or virtual reality |
US9761055B2 (en) | 2014-04-18 | 2017-09-12 | Magic Leap, Inc. | Using object recognizers in an augmented or virtual reality system |
US10262462B2 (en) | 2014-04-18 | 2019-04-16 | Magic Leap, Inc. | Systems and methods for augmented and virtual reality |
US20150302656A1 (en) * | 2014-04-18 | 2015-10-22 | Magic Leap, Inc. | Using a map of the world for augmented or virtual reality systems |
US20150302642A1 (en) * | 2014-04-18 | 2015-10-22 | Magic Leap, Inc. | Room based sensors in an augmented reality system |
US10846930B2 (en) | 2014-04-18 | 2020-11-24 | Magic Leap, Inc. | Using passable world model for augmented or virtual reality |
US10825248B2 (en) * | 2014-04-18 | 2020-11-03 | Magic Leap, Inc. | Eye tracking systems and method for augmented or virtual reality |
US20150302655A1 (en) * | 2014-04-18 | 2015-10-22 | Magic Leap, Inc. | Using a map of the world for augmented or virtual reality systems |
US9922462B2 (en) | 2014-04-18 | 2018-03-20 | Magic Leap, Inc. | Interacting with totems in augmented or virtual reality systems |
US10802582B1 (en) * | 2014-04-22 | 2020-10-13 | sigmund lindsay clements | Eye tracker in an augmented reality glasses for eye gaze to input displayed input icons |
US10475351B2 (en) | 2015-12-04 | 2019-11-12 | Saudi Arabian Oil Company | Systems, computer medium and methods for management training systems |
US10642955B2 (en) | 2015-12-04 | 2020-05-05 | Saudi Arabian Oil Company | Devices, methods, and computer medium to provide real time 3D visualization bio-feedback |
US9889311B2 (en) | 2015-12-04 | 2018-02-13 | Saudi Arabian Oil Company | Systems, protective casings for smartphones, and associated methods to enhance use of an automated external defibrillator (AED) device |
US10628770B2 (en) | 2015-12-14 | 2020-04-21 | Saudi Arabian Oil Company | Systems and methods for acquiring and employing resiliency data for leadership development |
WO2017205052A1 (en) * | 2016-05-26 | 2017-11-30 | Motorola Solutions, Inc. | System and method for completing a call utilizing a head-mounted display and a communication device |
US11069132B2 (en) | 2016-10-24 | 2021-07-20 | Charles C. Carrington | System for generating virtual building plan data based upon stored and scanned building data and related methods |
US10354439B2 (en) | 2016-10-24 | 2019-07-16 | Charles C. Carrington | System for generating virtual building plan data based upon stored and scanned building data and related methods |
CN110168618A (en) * | 2017-01-09 | 2019-08-23 | 三星电子株式会社 | Augmented reality control system and method |
CN110770678A (en) * | 2017-06-16 | 2020-02-07 | 微软技术许可有限责任公司 | Object holographic enhancement |
US11257292B2 (en) | 2017-06-16 | 2022-02-22 | Microsoft Technology Licensing, Llc | Object holographic augmentation |
US11659751B2 (en) | 2017-10-03 | 2023-05-23 | Lockheed Martin Corporation | Stacked transparent pixel structures for electronic displays |
US10930709B2 (en) | 2017-10-03 | 2021-02-23 | Lockheed Martin Corporation | Stacked transparent pixel structures for image sensors |
US10510812B2 (en) | 2017-11-09 | 2019-12-17 | Lockheed Martin Corporation | Display-integrated infrared emitter and sensor structures |
US10998386B2 (en) | 2017-11-09 | 2021-05-04 | Lockheed Martin Corporation | Display-integrated infrared emitter and sensor structures |
US10824132B2 (en) | 2017-12-07 | 2020-11-03 | Saudi Arabian Oil Company | Intelligent personal protective equipment |
US11146781B2 (en) | 2018-02-07 | 2021-10-12 | Lockheed Martin Corporation | In-layer signal processing |
US10594951B2 (en) | 2018-02-07 | 2020-03-17 | Lockheed Martin Corporation | Distributed multi-aperture camera array |
US10951883B2 (en) | 2018-02-07 | 2021-03-16 | Lockheed Martin Corporation | Distributed multi-screen array for high density display |
US10652529B2 (en) | 2018-02-07 | 2020-05-12 | Lockheed Martin Corporation | In-layer Signal processing |
US10979699B2 (en) | 2018-02-07 | 2021-04-13 | Lockheed Martin Corporation | Plenoptic cellular imaging system |
US11616941B2 (en) | 2018-02-07 | 2023-03-28 | Lockheed Martin Corporation | Direct camera-to-display system |
US10838250B2 (en) | 2018-02-07 | 2020-11-17 | Lockheed Martin Corporation | Display assemblies with electronically emulated transparency |
US10690910B2 (en) | 2018-02-07 | 2020-06-23 | Lockheed Martin Corporation | Plenoptic cellular vision correction |
US10922881B2 (en) * | 2018-11-02 | 2021-02-16 | Star Global Expert Solutions Joint Stock Company | Three dimensional/360 degree (3D/360°) real-time full information smart management integrated mapping system (SMIMS) and process of generating the same |
US10866413B2 (en) | 2018-12-03 | 2020-12-15 | Lockheed Martin Corporation | Eccentric incident luminance pupil tracking |
US10698201B1 (en) | 2019-04-02 | 2020-06-30 | Lockheed Martin Corporation | Plenoptic cellular axis redirection |
WO2021054941A1 (en) * | 2019-09-17 | 2021-03-25 | Hewlett-Packard Development Company, L.P. | Notification delivery |
US20220060877A1 (en) * | 2020-08-19 | 2022-02-24 | Cade Robertson | Emergency gps signal system |
US11892624B2 (en) | 2021-04-27 | 2024-02-06 | Microsoft Technology Licensing, Llc | Indicating an off-screen target |
Also Published As
Publication number | Publication date |
---|---|
EP2126866A4 (en) | 2011-06-01 |
AU2008226932A2 (en) | 2009-11-19 |
AU2008226932A1 (en) | 2008-09-18 |
US20120120070A1 (en) | 2012-05-17 |
AU2008226932B2 (en) | 2011-09-22 |
EP2126866A2 (en) | 2009-12-02 |
CA2679427A1 (en) | 2008-09-18 |
JP2010520559A (en) | 2010-06-10 |
JP5553405B2 (en) | 2014-07-16 |
US9324229B2 (en) | 2016-04-26 |
WO2008112149A2 (en) | 2008-09-18 |
JP2014123406A (en) | 2014-07-03 |
WO2008112149A3 (en) | 2009-12-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2008226932B2 (en) | Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness | |
US11282248B2 (en) | Information display by overlay on an object | |
US20210043007A1 (en) | Virtual Path Presentation | |
US20020196202A1 (en) | Method for displaying emergency first responder command, control, and safety information using augmented reality | |
CN104660995B (en) | A kind of disaster relief rescue visible system | |
US20200020162A1 (en) | Virtual Path Display | |
US20030210228A1 (en) | Augmented reality situational awareness system and method | |
CN106909215A (en) | Based on the fire-fighting operation three-dimensional visualization command system being accurately positioned with augmented reality | |
WO2007133209A1 (en) | Advanced augmented reality system and method for firefighter and first responder training | |
CN209221402U (en) | A kind of portable fire-fighting individual device based on multisensor | |
CN107807575A (en) | A kind of infrared positioning AR of fireman shows the helmet and fire command system | |
US20110248847A1 (en) | Mobile asset location in structure | |
EP3062297A1 (en) | Emergency guidance system and method | |
KR20160070503A (en) | Method and system for providing a position of co-operated firemen by using a wireless communication, method for displaying a position of co-operated firefighter, and fire hat for performing the method | |
GB2535723A (en) | Emergency guidance system and method | |
TWI755834B (en) | Visual image location system | |
CN207337157U (en) | A kind of infrared positioning AR of fireman shows the helmet and fire command system | |
GB2535728A (en) | Information system and method | |
WO2016135448A1 (en) | Emergency guidance system and method | |
Bretschneider et al. | Head mounted displays for fire fighters | |
Streefkerk et al. | Evaluating a multimodal interface for firefighting rescue tasks | |
EP3062220A1 (en) | Information system and method | |
WO2016135447A1 (en) | Information system and method | |
WO2023286115A1 (en) | Display controller, display system, display method and computer readable medium | |
CN211323223U (en) | AR fire helmet based on distributed network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: ITT MANUFACTURING ENTERPRISES, INC., DELAWARE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BAILLOT,YOHAN;REEL/FRAME:019076/0358. Effective date: 20070308 |
| AS | Assignment | Owner name: EXELIS INC., VIRGINIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ITT MANUFACTURING ENTERPRISES LLC (FORMERLY KNOWN AS ITT MANUFACTURING ENTERPRISES, INC.);REEL/FRAME:027604/0136. Effective date: 20111221 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |