US20030142068A1 - Selective real image obstruction in a virtual reality display apparatus and method - Google Patents

Selective real image obstruction in a virtual reality display apparatus and method

Info

Publication number
US20030142068A1
Authority
US
United States
Prior art keywords
obstruction
image
real
area
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/339,090
Inventor
Michael DeLuca
Joan DeLuca
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US10/339,090
Publication of US20030142068A1
Status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/22Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/156Mixing image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/371Image reproducers using viewer tracking for tracking viewers with different interocular distances; for tracking rotational head movements around the vertical axis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/373Image reproducers using viewer tracking for tracking forward-backward translational head movements, i.e. longitudinal movements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/10Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/286Image signal generators having separate monoscopic and stereoscopic modes
    • H04N13/289Switching between monoscopic and stereoscopic modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/324Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/327Calibration thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/334Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spectral multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/337Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/341Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens

Definitions

  • This invention generally relates to the area of image displays and more particularly to transparent image displays and virtual reality user interfaces.
  • Graphical user interfaces have become a standard for interfacing between a user and a computer. Such interfaces are in wide use in computer operating system interfaces produced by Apple, Microsoft and others. These interfaces are limited in that they are intended for interfacing between a user and a computer having a two dimensional display such as a CRT or LCD.
  • A user activates the interface with a keyboard and/or a pointing device such as a mouse pointing to an icon on the display.
  • Advancements have been made with the advent of the touch screen, which allows a user to approximately contact the icon or intended area of the graphical user interface in order to use the interface.
  • However, contact with the touch screen can contaminate the display area of the screen with fingerprints and other types of smudges.
  • Also, constant physical contact with the touch screen can result in its mechanical failure. Thus, what is needed is a way to contact user interface images without contacting a keyboard, a mouse or the display itself.
  • Three dimensional image displays are improving.
  • Several types of three dimensional displays are known, including stereoscopic displays, which display a virtual three dimensional image using filters to highlight images intended for each eye of the viewer, thereby providing a stereoscopic or three dimensional effect.
  • Such systems alternately flash images for the left and right eye of the user and require a filter for each eye, usually included in glasses worn by the viewer.
  • Systems in public use which require glasses may have color filters, orthogonally polarized lenses, or actively switched lenses, and the display is correspondingly modulated with left and right eye images to provide the three dimensional effect.
  • Stereoscopic displays which do not require glasses have also been described; descriptions are included in U.S. Pat. No. 4,987,487, Jan. 22, 1991, to Ichinose et al.
  • The aforesaid stereoscopic displays allow the viewer to simultaneously observe both a stereoscopic object, appearing to be generally set apart in three dimensions from the image projection means, and a physical object, such as the hand of the user, in approximately the same perceived space. What is needed is a method and apparatus by which the intersection of the physical object and the stereoscopic object can form a user interface with a computer system.
  • Stereoscopic headsets are capable of generating independent images for each eye and thus provide a three-dimensional virtual reality image for the viewer.
  • Such headsets have the advantage of providing the experience of a substantially large display system, such as a movie theater screen, at a significantly reduced price and in a substantially small area.
  • Some headsets are opaque while others are transparent.
  • Opaque headsets entirely block the user's view of real images normally observable when a headset is not worn.
  • Opaque headsets have the advantage of enhancing the virtual reality image but the disadvantage of preventing the viewer from observing real images. The inability of the observer to view real images while wearing the headset inhibits most normal social functions such as walking or having a normal conversation with others in the observer's vicinity.
  • On the other hand, transparent headsets allow the observer to see both real images and virtual reality images projected by the headset, the virtual reality images appearing superimposed upon reality's real images.
  • This has the advantage of allowing the user to view reality while wearing such a headset, thus enabling the user to conduct most normal social functions such as walking or carrying on a normal conversation.
  • However, the quality of the virtual reality image may be compromised when superimposed upon real images because the real images may distract the user from the content of the virtual reality image, thus detracting from the virtual reality experience.
  • The control signal may cause modification of the displayed image or control another device.
  • The display system is also capable of extending the physical object with a three dimensional extension image and then using the extended image to determine the intersection.
  • FIG. 1 shows a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display.
  • FIG. 2 shows the display of the stereoscopic interface image.
  • FIG. 3 shows determination of the position of the stereoscopic interface image.
  • FIG. 4 shows a physical object intersecting the stereoscopic interface image.
  • FIG. 5 shows a stereoscopic extension of the physical object intersecting the stereoscopic interface image.
  • FIG. 6 shows a stereoscopic extension image of the physical object intersecting the stereoscopic interface image wherein the intersection is behind the display.
  • FIG. 7 shows a block diagram of the user interface system operating in accordance with the present invention.
  • FIG. 8 shows a flow chart of a process operating in accordance with the present invention.
  • FIG. 9 shows active real image obstruction in a virtual reality display system.
  • FIG. 10 shows selective real image obstruction in the virtual reality display system.
  • FIG. 11 shows a headset embodiment of the present invention.
  • FIG. 12 shows a front view of the headset embodiment of the present invention.
  • FIG. 13 shows a top view operation of a headset with active reality obstruction.
  • FIG. 14 shows an example of a view of a transparent display system without real image obstruction.
  • FIG. 15 shows an example view of the transparent display system with real image obstruction.
  • FIG. 16 shows a flowchart of a method operating in accordance with the present invention.
  • FIG. 1 shows a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display.
  • The user 100 has left and right eyes 110 and 120 which are used to view a display 200 which projects a three dimensional stereoscopic object 245 in a space between the user and the display.
  • The stereoscopic object has a stereoscopic interface image 250.
  • Images from video cameras 310 and 320 are used to determine the position of physical objects within the space, such as the position of the user 100 and the user's finger 400.
  • A control signal is generated in response to the intersection of the interface image 250 and a physical object 400.
  • For example, the stereoscopic object 245 projected by the display 200 could be the image of an open book, including readable text on pages of the book.
  • Interface image 250 could be an icon indicating that contact with the icon would cause a page in the book to turn.
  • Upon contact with the icon, a control signal is generated causing a new image 245 of a book to be displayed with a turned page.
  • Since the stereoscopic three dimensional image has the advantage of being projected in a space, no physical contact with a keyboard, mouse or touch screen is needed to generate a control signal to turn a page of the book. Rather, an intuitive action of a user appearing to make physical contact with a three dimensional image in the space causes generation of the control signal.
  • The user sees the interface image in a three dimensional space and simply uses a finger to touch the interface image to cause a response.
  • Furthermore, the user has an actual view of the finger, with which the user has had a lifetime to become familiar, touching a virtual stereoscopic object similar to the way the user has spent a lifetime touching physical objects. This provides for an intuitive interface.
  • The stereoscopic projector 200 can be any of several display means capable of displaying three dimensional images. Some projectors require the user to wear colored, polarized or actively switched filter glasses (not shown) to observe the three dimensional image, while others are totally contained within a display headset worn by the user; yet another requires only a display separate from the user and no glasses at all. While all displays capable of displaying a three dimensional image are contemplated, the latter is preferred because of the convenience to the user of requiring no physical contact with the means necessary to display three dimensional images.
  • FIG. 2 shows the display of the stereoscopic interface image.
  • Display 200 displays an image 210 for viewing by the left eye 110 of the user 100 while image 220 is displayed for viewing by the right eye 120 of user 100.
  • Stereoscopic interface image 250 appears to occur in a space between the user 100 and the display 200 at a position indicated by the intersection of a line from eye 110 to image 210 and a second line from eye 120 to image 220.
  • FIG. 3 shows determination of the position of the stereoscopic interface image.
  • The position is dependent upon the distance between images 210 and 220, the distance between the eyes 110 and 120 of the user 100, and the position of the user, including distance D1 between the display 200 and the user.
  • The size of display 200 is predetermined and the image 250 is determined by the computer generating the image. Consequently the distance between images 210 and 220 is also predetermined.
  • The distance between the eyes 110 and 120 can be entered by the user as a calibration procedure prior to operating the user interface means, or can be determined by pattern recognition from images recorded by cameras 310 and 320.
  • The position of the user, including the distance between the user and the display, can be determined by pattern recognition from the images recorded by cameras 310 and 320 to determine a common point relative to the user.
  • Pattern recognition of images of faces and other physical objects is well known; such descriptions can be found in references including U.S. Pat. No. 5,680,481, Oct. 21, 1997, to Prasad et al. entitled Facial feature extraction method and apparatus for a neural network acoustic and visual speech recognition system, U.S. Pat. No. 5,715,325, Feb. 3, 1998, to Bang et al. entitled Apparatus and method for detecting a face in a video image, and U.S. Pat. No. 5,719,951, Feb. 17, 1998, to Shackleton et al.
  • The common point may be the area between the eyes of the user.
  • The identification of the common point may be simplified by adding a fiducial mark at the desired point to assist in identifying the desired point and its corresponding angle.
  • Such a fiducial mark could be a colored dot placed between the eyes or at the tip of the nose, or marks on glasses worn by the user; the mark could be further illuminated to simplify pattern recognition of images received by the video cameras.
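  • As a rough illustration of how such a colored fiducial mark might be located in a camera frame, the following Python sketch thresholds for a bright red dot and converts its centroid column to a bearing angle under a simple pinhole camera model; the color thresholds, focal length and function names are assumptions for illustration, not part of the original disclosure:

        import numpy as np

        def fiducial_bearing(frame, focal_px, center_px):
            """Locate a red fiducial dot in an RGB frame and return its
            horizontal bearing (radians) from the camera's optical axis."""
            r = frame[..., 0].astype(int)
            g = frame[..., 1].astype(int)
            b = frame[..., 2].astype(int)
            mask = (r > 200) & (g < 80) & (b < 80)   # bright red pixels only
            ys, xs = np.nonzero(mask)
            if xs.size == 0:
                return None                          # fiducial not visible
            centroid_col = xs.mean()                 # dot centroid (column index)
            return np.arctan((centroid_col - center_px) / focal_px)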
  • Triangulation is performed to determine the position of the user, including D1.
  • D1 is a geometric solution of the predetermined distance between cameras 310 and 320 and the angles A1 and A2 found from the images recorded by cameras 310 and 320.
  • The position, including D2, of interface image 250 is readily geometrically determined from the aforesaid determinations.
  • Alternately, the three dimensional display means can be constructed such that the position of the user and the distance D1 are predetermined in order for the user to correctly view the stereoscopic effect.
  • Furthermore, the distance between the eyes 110 and 120 can also be predetermined to be an average distance between the eyes of a number of users. This simplifies determination of the position of interface image 250 without departing from the spirit and scope of the invention.
  • While FIG. 3 shows determining the position of interface image 250 from a top view, it should be appreciated that a similar analysis applies to determining the position of interface image 250 from a side view, thus providing a three dimensional position of the user 100 and the interface image 250.
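  • A minimal sketch of the geometry described above, assuming angles A1 and A2 are measured from the camera baseline and that the perceived image position follows the two-line intersection of FIG. 2 by similar triangles (variable names and units are illustrative):

        import math

        def triangulate(baseline_m, a1, a2):
            """Intersect the two camera rays: cameras 310 and 320 sit
            baseline_m apart; a1 and a2 are angles (radians) from the
            baseline to the common point. Returns (offset, D1)."""
            t1, t2 = math.tan(a1), math.tan(a2)
            x = baseline_m * t2 / (t1 + t2)        # offset along the baseline
            d1 = baseline_m * t1 * t2 / (t1 + t2)  # distance D1 from the baseline
            return x, d1

        def interface_depth(d1, eye_sep, image_sep):
            """Distance D2 of perceived image 250 in front of the display,
            given viewer distance D1, eye spacing, and the crossed on-screen
            separation of images 210 and 220 (similar triangles)."""
            return d1 * image_sep / (eye_sep + image_sep)

        # e.g. D1 = 1.0 m, 64 mm eye spacing, 32 mm image separation:
        # D2 = 1.0 * 0.032 / 0.096, roughly 0.33 m in front of the display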
  • FIG. 4 shows a physical object intersecting the stereoscopic interface image.
  • Physical object 400 can be any physical object where the position of the object can be determined.
  • The physical object corresponds to the tip of the finger of the user.
  • Pattern recognition is used to determine the position of the physical object and the tip of the finger of the user.
  • A fiducial mark, such as the aforementioned colored or illuminated dot, may be added to assist pattern recognition.
  • Upon intersection of the physical object with the interface image 250, a control signal is generated.
  • The control signal may result in modification of the image or control of another device such as a printer or modem.
  • FIG. 4 shows a computer system which stereoscopically projects a three dimensional object having an interface image in a space observable by a user.
  • The user controls the movement of a physical object within the space while observing both the three dimensionally projected object and the physical object.
  • The computer system monitors the position of the user to determine the position of the interface image within the space and further monitors the movement of the physical object to determine its position.
  • A control signal is generated in response to the position of the physical object intersecting the position of the interface image.
  • For example, a word processing program is indicated by an interface image such as an icon including the letter “W” three dimensionally projected within the space.
  • The word processing program is activated when the user's finger moves within the space to touch the projected icon.
  • The interface allows the user to observe the projected icon, the physical finger and their intersection within the space.
  • FIG. 5 shows a stereoscopic extension of the physical object intersecting the stereoscopic interface image.
  • The physical object is shown as a bar 450 having first and second ends 452 and 454 with a stereoscopic extension image 255 projecting from end 454.
  • The orientation and position of the physical object are determined by determining the positions of end points 452 and 454 from images recorded by cameras 310 and 320.
  • The end points can be found by pattern recognition or by the addition of differing colored fiducial marks at either end of the bar.
  • The position of end point 452 may be determined from angles A6 and A8 of images from cameras 310 and 320 respectively, while the position of end point 454 may be determined from angles A5 and A7 from cameras 310 and 320 respectively.
  • While FIG. 5 shows determining the positions of the end points from a top view, it should be appreciated that a similar analysis applies to determining the positions of the end points from a side view, thus providing three dimensional positions of end points 452 and 454.
  • From these positions, the orientation of the physical object 450 may be determined.
  • A stereoscopic extension image 255 is created such that the extension image appears to be an extension of the physical object.
  • The extension image 255 is shown as a line extending along the line of physical object 450 with an arrow head tip.
  • The length and shape of the extension image are predetermined and may vary from application to application.
  • The stereoscopic extension image 255 is created by displaying images 215 and 225 on display 200 for viewing by eyes 110 and 120 respectively.
  • A control signal is generated when the position of a predetermined portion of the stereoscopic extension image, such as the tip of the arrow head, intersects the position of the stereoscopic interface image.
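  • The extension image tip and the intersection test described above might be computed as in the following sketch; the tolerance value and helper names are assumptions for illustration only:

        import numpy as np

        def extension_tip(end_452, end_454, length):
            """Extend the line through endpoints 452 -> 454 beyond end 454
            by a predetermined length; returns the 3D tip of extension 255."""
            p1 = np.asarray(end_452, dtype=float)
            p2 = np.asarray(end_454, dtype=float)
            direction = (p2 - p1) / np.linalg.norm(p2 - p1)
            return p2 + length * direction

        def intersects(tip, interface_pos, tol=0.01):
            """True when the tip lies within tol (meters, assumed) of the
            stereoscopic interface image 250, triggering a control signal."""
            return np.linalg.norm(tip - np.asarray(interface_pos, dtype=float)) < tol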
  • FIG. 6 shows a stereoscopic extension image of the physical object intersecting the stereoscopic interface image wherein the intersection is behind the display 200 .
  • FIG. 6 is similar to FIG. 5 in that both show a stereoscopic extension image, 255 and 255′, intersecting a stereoscopic interface image, 250 and 250′. However, in FIG. 5 the intersection is in front of display 200, while in FIG. 6 the intersection is behind display 200.
  • The position and orientation of physical object 450 are determined by determining the positions of end points 452 and 454 via cameras 310 and 320 and angles A5′, A6′, A7′ and A8′.
  • The resulting extension image 255′ is shown to have a substantially longer predetermined length than image 255 of FIG. 5.
  • If display 200 were not a heads-up stereoscopic display, but rather a conventional LCD or CRT, then the intersection between a physical object and an interface image could not occur if the position of the interface image were behind the display, because either the space is physically occupied by another object or the user could not see the physical intersection through the display.
  • Thus, the extension image has the advantage of enabling intersections to occur in positions appearing behind the display 200, or in other positions out of reach of the user, while allowing the user to directly view the physical object used to cause the intersection.
  • Physical object 450 has been referred to as a bar, but it should be appreciated that the physical object could be any of a number of physical objects, including the finger of the user, where one end is the finger tip and the other end is a joint of the finger. Fiducial marks could be added to the points on the finger to facilitate pattern recognition of images recorded by the cameras. While the extension image is shown as a line with an arrow head, other types of extension images may be used depending upon the application. The stereoscopic extension may be considered a virtual end effect for a physical handle; a wide variety of end effects may be created by the computer system.
  • For example, a paint brush could be used for painting a virtual object, the handle being the physical object and the brush bristles and paint color being the end effect, while the interface image appears as a paint canvas mounted on a three dimensional easel image.
  • Alternately, the physical object could be the handle and the end effect extension image the blade of a scalpel, while the stereoscopic interface image is part of a three dimensional image simulating surgery.
  • In a game application, the stereoscopic extension image could be a laser beam, rocket, bullet or bolt of lightning appearing to emanate from the finger of the user along a three dimensional vector defined by the finger, and the stereoscopic interface image may be a villain or enemy tank moving in three dimensions.
  • The position and orientation of the user 100 and physical object 450 have been described as being determined by two cameras with pattern recognition which triangulate in order to determine the corresponding position and orientation.
  • In a headset embodiment, the cameras could preferably be mounted on the headset for visually monitoring physical objects in the same space in which the user observes the projected stereoscopic images.
  • It should be appreciated that other techniques may be used to determine the aforesaid positions and orientations without departing from the spirit and scope of the invention.
  • FIG. 7 shows a block diagram of the user interface system operating in accordance with the present invention.
  • A stereoscopic display 200 displays stereoscopic images generated by stereoscopic image generation means 212 in a manner known in the art.
  • The stereoscopic display may be a CRT or LCD screen requiring filter glasses to be worn by the user to direct the appropriate image to the corresponding eye of the user. Alternately, it may be a heads up stereoscopic display worn by the user.
  • Preferably, display 200 is a display means especially adapted to displaying stereoscopic images without the aid of devices worn by the user.
  • Cameras 310 and 320 produce images which are analyzed by pattern recognizers 312 and 322, which identify certain points of the image and their locations within the image.
  • The pattern recognition may be performed with or without the aid of fiducial marks.
  • The locations of the points from pattern recognizers 312 and 322 are analyzed by coordinate determining means 314, which analyzes the angles relative to each point from each camera and, knowing the predetermined distance between the cameras, is able to determine the desired positions and orientations.
  • Coordinate determining means 314 also makes available the position of the user and the position and orientation of the physical object so that the stereoscopic image generator 212 may generate the stereoscopic extension image in response thereto.
  • Coordinate determining means 314 also makes available the position of the user to coordinate determining means 214, which determines the position of the interface image relative to the user from the distance between the left eye and right eye images displayed on display 200 together with the user's position, including the distance between the user and the display and the spacing between the eyes of the user. The positions of the physical object and interface image are then compared by intersection monitor 322, which generates a control signal in response to a substantial coincidence of the position of the physical object, or its stereoscopic extension image, with the position of the stereoscopic interface image.
  • FIG. 8 shows a flow chart of a process operating in accordance with the present invention.
  • A stereoscopic image is displayed.
  • Step 802 determines the position of the user as previously described. Note that in alternate embodiments the position of the user may be predetermined.
  • Next, the position of the stereoscopic interface image relative to the user is determined.
  • Step 806 determines the position and orientation of the physical object, and step 810 asks if an extension image is desired. If so, step 812 causes the display of the extension image and step 814 redetermines the position and orientation of the physical object with the extension image.
  • Then step 816 determines if there is an intersection between the interface image and the physical object or its extension image. If so, step 818 generates a control signal which in step 820 modifies the displayed image and/or controls another device.
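  • One iteration of this flow chart could be organized as in the sketch below; the display and tracker objects are hypothetical wrappers around the stereoscopic image generator and the camera subsystem, not components named by the disclosure, and extension_tip/intersects are the helpers from the earlier sketch:

        def interface_process(display, tracker, extension_length=None, tol=0.01):
            display.show()                                     # display stereoscopic image
            user_pos = tracker.locate_user()                   # step 802
            iface_pos = display.interface_position(user_pos)   # position of image 250
            obj = tracker.locate_object()                      # step 806
            if extension_length is not None:                   # step 810: extension desired?
                display.show_extension(obj)                    # step 812
                tip = extension_tip(obj.end_452, obj.end_454,
                                    extension_length)          # step 814
            else:
                tip = obj.tip
            if intersects(tip, iface_pos, tol):                # step 816
                signal = display.generate_control_signal()     # step 818
                display.apply(signal)                          # step 820: modify image or control device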
  • FIG. 9 shows active real image obstruction in a virtual reality display system.
  • Display means 200 is a transparent display means preferably capable of displaying a stereoscopic image 250 appearing in front of the display means or a stereoscopic image 250′ appearing behind the display means.
  • An image 251 or 252 could appear in coincidence with the display if a non-stereoscopic display were implemented; such non-stereoscopic virtual reality images are produced by displays including “teleprompters”.
  • Image 252 could alternately be a stereoscopic image if display 200 were a stereoscopic display.
  • Images produced by display 200 correspond to virtual reality images when viewed by the user.
  • Reality also has numerous real images, including real images 850 having portions 852 and 854, corresponding to images normally observable by an observer with the naked eye 110.
  • The transparency of display 200 allows the virtual reality images to appear superimposed upon real images represented by reality 850.
  • Such a system is shown in U.S. Pat. No. 5,491,510 to Gove entitled System and method for simultaneously viewing a scene and an obstructed object, or U.S. Pat. No. 5,694,142 to Dumoulin et al. entitled Interactive digital arrow (D'ARROW) three-dimensional (3D) pointing, which are hereby incorporated by reference.
  • The invention also includes an active real image obstructer, or real image obstruction means, 860.
  • The active real image obstructer modifies the transparency of portions of the viewing system.
  • The active reality obstructer preferably includes a multiplicity of individually addressable and electronically controlled light valves, and preferably includes a gray scale liquid crystal display (LCD) having pixels capable of electronically switching between substantially transparent, partially transparent and substantially opaque states. Such LCDs are known to those familiar with the art.
  • The result is the selective obstruction of real images 850.
  • Obstructer 860 has a portion 864 where light valves substantially inhibit viewing of real images 854.
  • Display 200 also displays image 250, 250′ or 251.
  • The resulting display system provides a view wherein virtual image 252 appears superimposed upon or combined with real image 852, while virtual image 250, 250′ or 251 appears on a substantially opaque background.
  • An opaque area 864 is formed by light valves of the obstructer inhibiting viewing of real images 854.
  • Thus, real images 854 do not interfere with the viewing of virtual image 250, 250′ or 251.
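  • Treating obstructer 860 as a two dimensional array of gray scale light valves, the selective obstruction might be modeled as a per-pixel transparency map, as in this sketch (conventions assumed: 1.0 fully transparent, 0.0 opaque):

        import numpy as np

        def obstruction_mask(shape, opaque_regions, level=0.0):
            """Build a transparency map for obstructer 860; opaque_regions
            are (row0, row1, col0, col1) rectangles such as portion 864."""
            mask = np.ones(shape)                  # default: fully transparent
            for r0, r1, c0, c1 in opaque_regions:
                mask[r0:r1, c0:c1] = level         # inhibit real images 854
            return mask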
  • FIG. 10 shows selective real image obstruction in the virtual reality display system.
  • Video cameras 310 and 320 determine the position of the viewer in order to determine the position of the perceived virtual reality image 250′.
  • The transparency of display 200 allows viewing of an image of real physical object 450. Since object 450 is on the opposite side of the display system, a second set of video cameras 310′ and 320′ is used to determine the position of selecting object 450.
  • Real image obstruction 864′ is modified to enable viewing of selector 450 so that both the real image of object 450 and virtual image 250′ may be viewed by the viewer.
  • Image 250′ has an interface image, and object 450 corresponds to a selecting device, such as a finger of the user.
  • Real image obstruction 864′ is modified to facilitate the user observing the object in order to guide the object to a desired interface image. Note that if reality obstruction 864′ of FIG. 10 were not modified in response to selecting object 450, the view of selecting object 450 would be obstructed.
  • Selecting object 450 is recognized independently of other real images 854 obstructed by obstruction 864. The independent recognition of selecting object 450 may be facilitated with the use of fiducial marks on the pointing device, such as a ring or colored dots on the finger.
  • Alternately, pattern recognition could be performed to identify a predetermined pointing device such as the user's finger. Since the positions of the user, the selecting object and the display system are all known, sufficient information is available to adjust the reality obstruction 864′ to facilitate viewing of the pointing device by the user.
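  • Continuing the transparency-map sketch above, revealing the selecting object amounts to restoring transparency over its tracked outline (the bounding-box representation is an assumption for illustration):

        def reveal_selector(mask, selector_box):
            """Modify obstruction 864' so the real image of selecting
            object 450 remains visible; selector_box = (row0, row1, col0,
            col1) as located by cameras 310' and 320'."""
            r0, r1, c0, c1 = selector_box
            mask[r0:r1, c0:c1] = 1.0               # restore full transparency
            return mask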
  • In an alternate embodiment, a “heads-up” display includes active reality obstruction.
  • A transparent display is coupled with real image obstructer 860 and placed a substantial distance from the viewer in order that the display may be viewed by both eyes of the viewer.
  • The obstructer 860 allows viewing of some of the virtual images projected by the display combined with real images viewed through the display.
  • The reality obstructer inhibits viewing of real images in other portions of the display in order to enhance viewing of corresponding virtual images or to facilitate better viewing of real images through the display system.
  • In this embodiment, the display system of FIG. 1 to FIG. 6 is modified such that display 200 includes a transparent display and obstructer 860 is located behind the display.
  • Interface image 250 can be further incorporated into the display system to provide for a heads-up stereoscopic user interface.
  • FIG. 11 shows a headset embodiment of the present invention.
  • The headset is capable of displaying stereoscopic images with a transparent display projection system. Such a system is shown in U.S. Pat. No. 5,886,822 to Spitzer entitled Image combining system for eyeglasses and face masks, which is hereby incorporated by reference.
  • The display system includes an image projector system 870 and 872. Images are generated by image generator 870 and reflected into an eye of the viewer by reflector means 872. When a corresponding system is used with the second eye of the user, a stereoscopic image is generated. Also included is obstructer 860, which obstructs real images in certain parts of the viewing area.
  • Video camera means 310′ is used to monitor real images and adjust the reality obstruction in response thereto. For example, real image obstruction can be modified to enable viewing of selecting object 450. Since the system is part of a headset, the location of each of the user's eyes is substantially predetermined. As a result, video cameras 310 and 320, which were previously used to determine the location of the user, become optional. As previously discussed, the system enables the stereoscopic user interface by facilitating the user's view of the intersection between a real object and a stereoscopic interface image.
  • FIG. 12 shows a front view of the headset embodiment of the present invention.
  • Each lens has a virtual reality image projector and a real image obstructer (860 and 872, and 860′ and 872′, respectively).
  • Preferably, the obstruction created by obstructers 860 and 860′ is also stereoscopic, occupying a perceived space at a distance in front of the user.
  • The perceived distance of the obstruction is adjusted to enhance the user's view. For example, if stereoscopic image 250 is projected to appear one half meter in front of the viewer, then the stereoscopic obstruction would also be generated to appear one half meter in front of the viewer in order to enhance the viewing of the image.
  • Video cameras 310′ and 320′ monitor real images viewed by the user. This facilitates the stereoscopic user interface by locating a selecting object position, as well as facilitating adjustment of the real image obstructers in response to real images.
  • Alternately, the headset display system of FIG. 11 or FIG. 12 can be implemented with filter glasses (polarized, colored or actively switched) viewing a common display panel.
  • FIG. 13 shows a top view operation of a headset with active reality obstruction.
  • The headset includes reality obstructers 860 and 860′ for obstructing real images viewed by eyes 110 and 120.
  • A virtual reality image 250′ is projected at a distance in front of the viewer.
  • An obstruction is created substantially between lines 870 and 872 on obstructer 860 and substantially between lines 874 and 876 on obstructer 860′.
  • Since virtual image 250′ is created by the display system, its position is inherently known by the display system.
  • Thus, the reality obstruction corresponding to virtual image 250′ may be determined by the display system at the time of generation of image 250′. Note that if the display system changes the position or size of image 250′, then the position or size of the corresponding obstruction may also be changed.
  • FIG. 13 further shows how the active obstructions are modified in response to real objects.
  • FIG. 13 shows objects 450 and 880 .
  • Monitoring means, such as video cameras 310′ and 320′, may be added to the system to determine the position and/or character of the real objects.
  • The obstruction is correspondingly modified to facilitate viewing of the real image of the selecting object.
  • Thus, the real image of selecting object 450 may be viewed. This is a top view graphical analysis; a similar graphical analysis can be performed for the side view, thus generating a two dimensional real image obstruction area on obstructers 860 and 860′.
  • The monitoring means determines the character of the real images and obstructs their view if they interfere with viewing other virtual reality images or other real images.
  • For example, real object 880 corresponds to a relatively bright object, such as a streetlight or the sun. Such objects produce relatively bright images that tend to interfere with viewing of either virtual reality images or other real images.
  • Video cameras 310′ and 320′ determine the location and relative brightness of the real image of object 880. Then a stereoscopic obstruction is generated in substantial coincidence with the real image of object 880 by generating obstructions substantially between lines 882 and 884 on obstructer 860 and substantially between lines 886 and 888 on obstructer 860′.
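  • The per-eye extents of such a stereoscopic obstruction follow from similar triangles: the object's horizontal span at its depth is projected back onto each obstructer plane through the corresponding eye. A sketch, with coordinates in a head-fixed frame and depths measured from the eyes (assumed conventions):

        def eye_obstruction_interval(eye_x, obstructer_depth, obj_x0, obj_x1, obj_depth):
            """Project the horizontal extent [obj_x0, obj_x1] of a real
            object (e.g. bright object 880) at obj_depth onto an obstructer
            plane (860 or 860') at obstructer_depth, as seen from an eye at
            lateral position eye_x. Returns the interval to darken, e.g.
            lines 882/884 for one eye and 886/888 for the other."""
            s = obstructer_depth / obj_depth       # similar-triangle scale factor
            lo = eye_x + (obj_x0 - eye_x) * s
            hi = eye_x + (obj_x1 - eye_x) * s
            return lo, hi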
  • The stereoscopic projectors 870 and 870′ are optional. It is further noted that such stereoscopic obstructions may be created by the aforementioned heads-up display system of FIG. 10. In such a heads-up application, if only the enhancement of viewing real images is sought, the projector 200 can become optional.
  • FIG. 14 shows an example of a view of a transparent display system without real image obstruction.
  • FIG. 15 shows an example view of the transparent display system with real image obstruction.
  • The view of reality 850 includes a real image of a building and the sun 880.
  • A first virtual reality image 890 of a digital time of day clock is displayed superimposed on real image 850 in both FIG. 14 and FIG. 15.
  • FIG. 14 also has a second virtual reality image 892 of a streaming information display, such as real time stock prices. The information streams from right to left towards the bottom of the display and is superimposed upon real image 850.
  • Real image 850 tends to interfere with the viewing of the streaming information of virtual reality image 892.
  • FIG. 15 shows a real image obstruction 893 in substantial coincidence with virtual reality image 892.
  • Real image obstruction 893 substantially reduces visual interference from the real image and improves the viewing of the streaming information.
  • FIG. 14 also shows a third superimposed virtual reality image 894 of a video image.
  • In this example, the video image is that of a lecture given on the building being observed.
  • In FIG. 14, image 894 is superimposed upon the sun 880, which is substantially brighter, making at least portions of the video image difficult if not impossible to view.
  • The image from bright object 880 may also interfere with viewing of other virtual reality images as well as the real image of the building.
  • FIG. 15 shows a real image obstruction generated in substantial coincidence with virtual reality image 894, which substantially enhances its viewing.
  • Image 894 of FIG. 14 and FIG. 15 also includes a stereoscopic interface image 250′, which in this example may be used to raise or lower the volume of an accompanying audio portion of the video image 894, a sound track of the speaker's lecture on the real image of the building being viewed.
  • The real image of selector 450 is also shown.
  • FIG. 15 shows the real image obstruction in substantial coincidence with the real image of selector 450 being removed to facilitate viewing of the real image of selecting object 450.
  • The block diagram also shows the active real image obstruction means 860 in substantial coincidence with display 200 in order to create the previously described active obstruction of real images in a virtual reality display system.
  • Obstructer 860 is controlled by obstruction controller 902, which in the preferred embodiment switches selected light valves of obstructer 860 off or on, or partially off or on, in order to create a desired level of transparency or opaqueness.
  • The obstruction controller adjusts the size, location and/or transparency of obstructions in response to manual inputs from the user from manual input means 904, inputs from an information signal used to generate images displayed on display 200, the coordinates of virtual images produced by the system from coordinates means 214, and/or inputs from coordinates means 314 for determining coordinates of real objects having real images.
  • Stereoscopic image generator 212 communicates the coordinates of the virtual images to obstruction controller 902, which generates the obstructions accordingly.
  • The amount of transparency of each obstruction may be varied from substantially totally transparent through several levels of decreasing transparency to substantially opaque. If the virtual reality image having a corresponding real image obstruction is moved or resized, the corresponding real image obstruction is also moved or resized by obstruction controller 902. Such a virtual reality image may be moved in response to a manual input from the viewer.
  • For example, the viewer may be watching a high definition movie in a small window with a corresponding partially transparent real image obstruction while traveling through an airport on the way to a seat on an airplane.
  • The transparency of the other portions of the viewing area allows the viewer to do many things while viewing the movie in the small window.
  • The viewer may use the real images to navigate the crowds and corridors of the airport, communicate with airline and security personnel and find an assigned seat on the airplane.
  • The viewer may then desire to substantially enlarge the viewing window of the movie and further reduce the transparency of the movie's real image obstruction in order to improve the viewing of the movie.
  • Such adjustments can be made via manual inputs.
  • The manual adjustments may be made via several means, including switches or buttons associated with the display system, or via a stereoscopic user interface system as previously described.
  • Alternately, the virtual reality image and its corresponding real image obstruction may be moved or resized in response to an information signal used to generate virtual reality images. For example, suppose a substantially real time stock quote information stream is being displayed, and the user desires to receive alerts based upon a certain financial trigger. In the event of the trigger, the size of the image could be doubled to facilitate a second display of information related to the trigger. This is an example of the size of the virtual reality image, and its corresponding real image obstruction, varying in response to the information signal used for generating the virtual reality image. Furthermore, the transparency of the corresponding real image obstruction could be reduced in response to the trigger. This is an example of changing the size and/or transparency of the real image obstruction in response to an information signal used for generating the virtual reality image.
  • Obstruction controller 902 also receives coordinate information from coordinates means 314 in order to modify obstructions to facilitate viewing of a real image of a selecting object in an embodiment implementing the aforementioned stereoscopic user interface.
  • Coordinates means 314 also provides coordinates of substantially bright portions of the real image in order that corresponding real image obstructions may be generated to reduce the viewed brightness of the bright portions of the real image. This has the advantage of improving the viewing of both virtual reality images and other real images.
  • Coordinates means 314 is capable of determining the ambient brightness of the real images.
  • The ambient brightness may be used to adjust the transparency of the entire obstructer 860.
  • For example, if the ambient brightness were to double, the transparency of each pixel of the obstructer 860 would be substantially halved in order to maintain a substantially constant contrast ratio between the virtual reality images and the real images.
  • Thus, if first and second portions of the obstructer had 100% and 50% transparency respectively, then upon a doubling of the ambient light the first and second portions of the obstructer would be correspondingly adjusted to 50% and 25% transparency.
  • Other non-linear adjusting relationships between obstructer transparency, real image ambient light and virtual reality image brightness are also anticipated.
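  • Under the linear relationship described above, the whole transparency map can simply be rescaled by the ratio of old to new ambient brightness, as in this sketch (sensor interfacing omitted; the mask is the numpy array from the earlier sketches):

        def rescale_transparency(mask, old_ambient, new_ambient):
            """Keep the virtual/real contrast ratio roughly constant:
            doubling ambient light halves every light valve's transparency.
            mask holds per-pixel transparency values in [0, 1]."""
            return (mask * (old_ambient / new_ambient)).clip(0.0, 1.0)

        # 100% and 50% transparent regions become 50% and 25% when ambient doubles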
  • Note that in some embodiments the means for determining coordinates of real images is not necessarily needed.
  • In such an embodiment, the video cameras 310 and 320, pattern recognizers 312 and 322, and coordinate determining means 314 could be replaced by a simpler and lower cost ambient light sensor.
  • FIG. 16 shows a flowchart of a method operating in accordance with the present invention.
  • Step 910 checks for either a manual input or information including the virtual image indicating the need for a change in real image obstruction. If found, step 915 resizes and/or moves the image and/or real image obstruction, and/or changes the transparency of the obstruction. From either step 910 or 915, step 920 checks for a change in ambient brightness of the real image and, if found, adjusts the overall transparency of the obstruction means in step 925. Thereafter, step 930 checks for a substantially bright area in the real image and, if found, step 935 adjusts the transparency of an obstruction corresponding to the bright area. Thereafter, step 940 determines if a selecting object is included in the real image and, if found, step 945 adjusts the transparency of the obstruction to facilitate viewing of the selecting object. Thereafter the program returns to step 910.
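  • The loop of FIG. 16 might be organized as follows; controller and sensors are hypothetical wrappers around obstruction controller 902 and cameras 310′/320′, with method names invented for illustration:

        def obstruction_control_step(controller, sensors):
            if controller.change_requested():             # step 910: manual input or signal
                controller.resize_move_or_fade()          # step 915
            if sensors.ambient_changed():                 # step 920
                controller.adjust_overall_transparency()  # step 925
            bright_area = sensors.find_bright_area()      # step 930
            if bright_area is not None:
                controller.obstruct(bright_area)          # step 935
            selector = sensors.find_selecting_object()    # step 940
            if selector is not None:
                controller.reveal(selector)               # step 945
            # control returns to step 910 on the next pass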
  • In an alternate embodiment, the real image obstruction is a passive real image obstruction of a predetermined area of the display system.
  • For example, real image obstruction 893 of FIG. 14 and FIG. 15 could be a permanent blacked out obstruction on the headset, such as a thick black stripe towards the bottom of each lens of the headset.
  • In this embodiment, streaming stock information image 892 could always be seen without interference from real images; other virtual reality images, such as 890 and 894, would always appear combined with or superimposed upon real images.
  • The thick black stripe real image obstructer may be substantially unrelated to the display means used for projecting virtual reality images. This also enables active real image obstructer 860 to become an optional component of the display system.
  • A further enhancement of the selective obstruction includes blocking only a certain spectrum of visible light of real images with either the passive or active real image obstructer. This enhances viewing of a virtual image in the color region of the real image obstruction.
  • For example, real image obstructer 893 may selectively block blue light, causing real images in area 893 to appear substantially yellow. This substantially enhances viewing of the information stream 892, particularly if the information stream is projected with a blue color.
  • Alternately, the entire display system can selectively block or filter a portion of the spectrum of light from real images (such as blue light), with the display system projecting desired information (such as text, graphs or line art) in a corresponding (blue) color in order to enhance viewing of the virtual (text, graph or line art) images when viewed superimposed upon or combined with filtered light from the real images.
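  • As a toy model of this spectral obstruction, suppressing the blue channel of the viewed real image leaves a yellow-tinted background against which blue virtual content stands out; this is illustrative only, since the physical filter acts on incoming light rather than on pixel values:

        import numpy as np

        def block_blue(real_rgb):
            """Approximate region 893's spectral filter: removing the blue
            band makes the obstructed real image appear substantially
            yellow, so blue projected text remains distinct."""
            filtered = np.array(real_rgb, dtype=float, copy=True)
            filtered[..., 2] = 0.0                 # suppress the blue band
            return filtered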

Abstract

A virtual reality system (200-322) stereoscopically projects virtual reality images, including a three dimensional image (245) having an interface image (250′), in a space observable by a user (100). The display system includes a substantially transparent display means (200) which also allows real images of real objects (850) to be combined or superimposed with the virtual reality images. Selective areas or characteristics of the real images are obstructed by a selective real image obstructer (860) to enhance viewing of selected virtual reality images while providing for viewing of real images, or of virtual images combined with real images, in other viewing areas. The display system includes either a stereoscopic headset display system or a heads-up display system. The selective real image obstructer is a gray scale liquid crystal display included with the display system, providing for adjustment of the size, shape and/or transparency of the obstruction of real images. The obstruction of real images may be adjusted in response to information for generating the virtual image, manual inputs, or processing of real images by video cameras (310′ and 320′). Other selective real image obstructions include filtering a portion of the spectrum of visible light associated with the real images.

Description

    RELATED APPLICATIONS
  • This application is a divisional of application Ser. No. 09/464,976 filed Jan. 31, 2000 which is a continuation-in-part of application Ser. No. 09/108,814 filed Jul. 1, 1998 which issued as U.S. Pat. No. 6,064,354 on May 16, 2000.[0001]
  • FIELD OF THE INVENTION
  • This invention generally relates to the area of image displays and more particularly to transparent image displays and virtual reality user interfaces. [0002]
  • BACKGROUND OF THE INVENTION
  • Graphical user interfaces have become a standard for interfacing between a user and a computer. Such interfaces are in wide use in computer operating system interfaces produced by Apple, Microsoft and others. These interfaces are limited in that they are intended for interfacing between a user and a computer having a two dimensional display such as a CRT or LCD. A user activates the interface with a keyboard and/or a pointing device such as a mouse pointing to an icon on the display. Advancements have been made with the advent of the touch screen, which allows a user to approximately contact the icon or intended area of the graphical user interface in order to use the interface. However, contact with the touch screen can contaminate the display area of the screen with fingerprints and other types of smudges. Also, constant physical contact with the touch screen can result in its mechanical failure. Thus, what is needed is a way to contact user interface images without contacting a keyboard, a mouse or the display itself. [0003]
  • Three dimensional image displays are improving. Several types of three dimensional displays are known, including stereoscopic displays, which display a virtual three dimensional image using filters to highlight images intended for each eye of the viewer, thereby providing a stereoscopic or three dimensional effect. Such systems alternately flash images for the left and right eye of the user and require a filter for each eye, usually included in glasses worn by the viewer. Systems in public use which require glasses may have color filters, orthogonally polarized lenses, or actively switched lenses, and the display is correspondingly modulated with left and right eye images to provide the three dimensional effect. Furthermore, stereoscopic displays which do not require glasses have been described; descriptions are included in U.S. Pat. No. 4,987,487, Jan. 22, 1991, to Ichinose et al. entitled Method of stereoscopic images display which compensates electronically for viewer head movement, and U.S. Pat. No. 5,365,370, Nov. 15, 1994, to Hudgins entitled Three dimensional viewing illusion with 2D display. Yet another stereoscopic display system is completely contained in a headset worn apparatus as described in U.S. Pat. No. 5,673,151, Sep. 30, 1997, to Dennis entitled Image correction in a virtual reality and heads up display. The aforesaid patents are incorporated by reference. The aforesaid stereoscopic displays allow the viewer to simultaneously observe both a stereoscopic object, appearing to be generally set apart in three dimensions from the image projection means, and a physical object, such as the hand of the user, in approximately the same perceived space. What is needed is a method and apparatus by which the intersection of the physical object and the stereoscopic object can form a user interface with a computer system. [0004]
  • Stereoscopic headsets are capable of generating independent images for each eye and thus provide a three-dimensional virtual reality image for the viewer. Such headsets have the advantage of providing the experience of a substantially large display system, such as a movie theater screen, at a significantly reduced price and in a substantially small area. Some headsets are opaque while others are transparent. Opaque headsets entirely block the user's view of real images normally observable when a headset is not worn. Opaque headsets have the advantage of enhancing the virtual reality image but the disadvantage of preventing the viewer from observing real images. The inability of the observer to view real images while wearing the headset inhibits most normal social functions such as walking or having a normal conversation with others in the observer's vicinity. On the other hand, transparent headsets allow the observer to see both real images and virtual reality images projected by the headset, the virtual reality images appearing superimposed upon the real images of reality. This has the advantage of allowing the user to view reality while wearing such a headset, thus enabling the user to conduct most normal social functions such as walking or carrying on a normal conversation. However, the quality of the virtual reality image may be compromised when superimposed upon real images because the real images may distract the user from the content of the virtual reality image, thus detracting from the virtual reality experience. [0005]
  • Thus, what is needed is a virtual reality viewing system that provides for the advantages of both transparent and opaque viewing systems while reducing the disadvantages of both. [0006]
  • OBJECT OF THE INVENTION
  • It is therefore an object of the invention to provide a three dimensional display system capable of determining an intersection of a physical object with a three dimensionally displayed object in a space where the three dimensional object is viewed and generating a control signal in response thereto. The control signal may cause modification of the displayed image or control another device. The display system is also capable of extending the physical object with a three dimensional extension image and then using the extended image to determine the intersection. [0007]
  • It is another object of the present invention to provide a transparent display system for viewing real objects beyond the display which actively obstructs transparency to enhance viewing of displayed virtual images and real images viewable through the display.[0008]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display. [0009]
  • FIG. 2 shows the display of the stereoscopic interface image. [0010]
  • FIG. 3 shows determination of the position of the stereoscopic interface image. [0011]
  • FIG. 4 shows a physical object intersecting the stereoscopic interface image. [0012]
  • FIG. 5 shows a stereoscopic extension of the physical object intersecting the stereoscopic interface image. [0013]
  • FIG. 6 shows a stereoscopic extension image of the physical object intersecting the stereoscopic interface image wherein the intersection is behind the display. [0014]
  • FIG. 7 shows a block diagram of the user interface system operating in accordance with the present invention. [0015]
  • FIG. 8 shows a flow chart of a process operating in accordance with the present invention. [0016]
  • FIG. 9 shows active real image obstruction in a virtual reality display system. [0017]
  • FIG. 10 shows selective real image obstruction in the virtual reality display system. [0018]
  • FIG. 11 shows a headset embodiment of the present invention. [0019]
  • FIG. 12 shows a front view of the headset embodiment of the present invention. [0020]
  • FIG. 13 shows a top view operation of a headset with active reality obstruction. [0021]
  • FIG. 14 shows an example of a view of a transparent display system without real image obstruction. [0022]
  • FIG. 15 shows an example view of the transparent display system with real image obstruction. [0023]
  • FIG. 16 shows a flowchart of a method operating in accordance with the present invention.[0024]
  • DETAILED DESCRIPTION OF THE INVENTION
  • [0025] FIG. 1 shows a perspective view of a user causing an intersection of a physical object with a three dimensional stereoscopic object projected by a display. The user 100 has left and right eyes 110 and 120 which are used to view a display 200 which projects a three dimensional stereoscopic object 245 in a space between the user and the display. The stereoscopic object has a stereoscopic interface image 250. Using pattern recognition and triangulation, images from video cameras 310 and 320 are used to determine the position of physical objects within the space, such as the position of the user 100 and the user's finger 400. As will be described herein, a control signal is generated in response to the intersection of the interface image 250 and a physical object 400. For example, the stereoscopic object 245 projected by the display 200 could be the image of an open book, including readable text on pages of the book. Interface image 250 could be an icon indicating that contact with the icon would cause a page in the book to turn. When the finger tip 400 of the user touches the icon 250, a control signal is generated causing a new image 245 of a book to be displayed with a turned page. Because the stereoscopic three dimensional image is projected in a space, no physical contact with a keyboard, mouse or touch screen is needed to generate a control signal to turn a page of the book. Rather, an intuitive action of a user appearing to make physical contact with a three dimensional image in the space causes generation of the control signal. The user sees the interface image in a three dimensional space and simply uses a finger to touch the interface image to cause a response. The user has an actual view of the finger, with which the user has had a lifetime to become familiar, touching a virtual stereoscopic object similar to the way the user has spent a lifetime touching physical objects. This provides for an intuitive interface.
  • [0026] The stereoscopic projector 200 can be any of several display means capable of displaying three dimensional images. Some projectors require the user to wear colored, polarized or active image filter glasses (not shown) to observe the three dimensional image, while others are totally contained within a display headset worn by the user; yet another requires only a display separate from the user and no glasses at all. While all displays capable of displaying a three dimensional image are contemplated, the latter is preferred because of the convenience to a user of requiring no physical contact with the means necessary to display three dimensional images.
  • [0027] FIG. 2 shows the display of the stereoscopic interface image. Display 200 displays an image 210 for viewing by the left eye 110 of the user 100 while image 220 is displayed for viewing by the right eye 120 of user 100. As a result, stereoscopic interface image 250 appears to occur in a space between the user 100 and the display 200 at a position indicated by the intersection of a line from eye 110 to image 210 and a second line from eye 120 to image 220.
  • [0028] FIG. 3 shows determination of the position of the stereoscopic interface image. The position is dependent upon the distance between images 210 and 220, the distance between the eyes 110 and 120 of the user 100 and the position of the user, including distance D1 between the display 200 and the user. Preferably, the size of display 200 is predetermined and the image 250 is determined by the computer generating the image. Consequently the distance between images 210 and 220 is also predetermined. The distance between the eyes 110 and 120 can be entered by the user as a calibration procedure prior to operating the user interface means, or can be determined by pattern recognition from images recorded by cameras 310 and 320. The position of the user, including the distance between the user and the display, can be determined by pattern recognition of the images recorded by cameras 310 and 320 to determine a common point relative to the user. Pattern recognition of images of faces and other physical objects is well known; descriptions can be found in references including U.S. Pat. No. 5,680,481, Oct. 21, 1997, to Prasad et al. entitled Facial feature extraction method and apparatus for a neural network acoustic and visual speech recognition system, U.S. Pat. No. 5,715,325, Feb. 3, 1998, to Bang et al. entitled Apparatus and method for detecting a face in a video image, and U.S. Pat. No. 5,719,951, Feb. 17, 1998, to Shackeleton et al. entitled Normalized image feature processing, which are hereby incorporated by reference. The common point may be the area between the eyes of the user. Alternately, the identification of the common point may be simplified by adding a fiducial mark at the desired point to assist in identifying the desired point and its corresponding angle. Such a mark could be a colored dot placed between the eyes or at the tip of the nose, or marks on glasses worn by the user; the mark could be further illuminated to simplify pattern recognition of images received by the video camera. Thereafter, triangulation is performed to determine the position of the user including D1. D1 is a geometric solution determined from the predetermined distance between cameras 310 and 320 and angles A1 and A2 found from images recorded by cameras 310 and 320. Thus, the position, including D2, of interface image 250 is readily geometrically determined from the aforesaid determinations. It should be appreciated that the three dimensional display means can be constructed such that the position of the user and the distance D1 are predetermined in order for the user to correctly view the stereoscopic effect. Furthermore, the distance between the eyes 110 and 120 can also be predetermined to be an average distance between eyes of a number of users. This simplifies determination of the position of interface image 250 without departing from the spirit and scope of the invention. FIG. 3 shows determining the position of interface image 250 from a top view; it should be appreciated that a similar analysis applies to determining the position of interface image 250 from a side view, thus providing a three dimensional position of the user 100 and the interface image 250.
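By way of illustration, the top-view triangulation just described reduces to a few lines of code. The following Python sketch assumes cameras at either end of a known baseline and sighting angles measured from that baseline; the function name, coordinate conventions and numbers are illustrative assumptions rather than anything specified here.

```python
import math

def triangulate(baseline_m, a1_rad, a2_rad):
    """Locate a point sighted by two cameras at (0, 0) and (baseline_m, 0).

    a1_rad and a2_rad are the angles between the baseline and each
    camera's line of sight to the point. Returns (x, d): the offset
    along the baseline and the perpendicular distance from it (D1 in
    the top-view analysis above).
    """
    d = baseline_m * math.tan(a1_rad) * math.tan(a2_rad) / (
        math.tan(a1_rad) + math.tan(a2_rad))
    x = d / math.tan(a1_rad)
    return x, d

# Cameras 0.5 m apart, both sighting the user at 60 degrees: the point
# lies midway along the baseline, about 0.43 m in front of it.
print(triangulate(0.5, math.radians(60), math.radians(60)))
```

Repeating the same calculation in the vertical plane yields the full three dimensional position.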
  • [0029] FIG. 4 shows a physical object intersecting the stereoscopic interface image. Physical object 400 can be any physical object where the position of the object can be determined. In FIG. 1, the physical object corresponds to the tip of the finger of the user. Pattern recognition is used to determine the position of the physical object and the tip of the finger of the user. Alternately, a fiducial mark such as the aforementioned colored or illuminated dot may be added to assist pattern recognition. Once the desired point is identified from the images recorded by cameras 310 and 320, angles A3 and A4 may be determined. Given angles A3 and A4, and the predetermined distance between cameras 310 and 320, the position of the physical object 400 may be geometrically determined. FIG. 4 shows determining the position of the physical object from a top view; it should be appreciated that a similar analysis applies to determining the position of the physical object from a side view, thus providing a three dimensional position of physical object 400. Upon determination of a substantial intersection of the position of interface image 250 and physical object 400, a control signal is generated. The control signal may result in modification of the image or the control of another device such as a printer or modem.
  • FIG. 4 shows a computer system which stereoscopically projects a three dimensional object having an interface image in a space observable by a user. The user controls the movement of a physical object within the space while observing both the three dimensionally projected object and the physical object. The computer system monitors the position of the user to determine the position of the interface image within the space and further monitors the movement of the physical object to determine its position. A control signal is generated in response to the position of the physical object intersecting the position of the interface image. For example, a word processing program is indicated by an interface image such as an icon including the letter “W” three dimensionally projected within the space. The word processing program is activated when the user's finger moves within the space to touch the projected icon. The interface allows the user to observe the projected icon, physical finger and their intersection within the space. [0030]
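The "substantial intersection" that triggers the control signal can be modeled as a simple proximity test between the two estimated three dimensional positions. A minimal sketch, assuming triangulated coordinates in meters and an invented tolerance:

```python
import math

def intersects(interface_pos, object_pos, tolerance_m=0.02):
    """True when the tracked physical object substantially coincides
    with the perceived position of the interface image."""
    return math.dist(interface_pos, object_pos) <= tolerance_m

icon = (0.10, 0.20, 0.50)       # perceived position of the interface image
fingertip = (0.10, 0.21, 0.50)  # triangulated position of the finger
if intersects(icon, fingertip):
    print("control signal: activate the selected program")
```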
  • [0031] FIG. 5 shows a stereoscopic extension of the physical object intersecting the stereoscopic interface image. In this alternative embodiment, the physical object is shown as a bar 450 having first and second ends 452 and 454 with a stereoscopic extension image 255 projecting from end 454. The orientation and position of the physical object are determined by determining the positions of end points 452 and 454 from images recorded by cameras 310 and 320. The end points can be found by pattern recognition or by adding differing colored fiducial marks at either end of the bar. The position of end point 452 may be determined from angles A6 and A8 of images from cameras 310 and 320 respectively, while the position of end point 454 may be determined from angles A5 and A7 from cameras 310 and 320 respectively. FIG. 5 shows determining the position of the end points from a top view; it should be appreciated that a similar analysis applies to determining the position of the end points from a side view, thus providing three dimensional positions of end points 452 and 454. From the positions of the two end points, the orientation of the physical object 450 may be determined. In response to the determined position and orientation of physical object 450 and the determined position of user 100, a stereoscopic extension image 255 is created such that the extension image appears to be an extension of the physical object. In FIG. 5, the extension image 255 is shown as a line extending along the line of physical object 450 with an arrow head tip. The length and shape of the extension image is predetermined and may vary from application to application. The stereoscopic extension image 255 is created by displaying images 215 and 225 on display 200 for viewing by eyes 110 and 120 respectively. A control signal is generated when the position of a predetermined portion of the stereoscopic extension image, such as the tip of the arrow head, intersects the position of the stereoscopic interface image.
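Computing where the extension image's tip should appear is straightforward vector arithmetic on the two triangulated end points. A sketch under assumed names and coordinates:

```python
import math

def extension_tip(end_a, end_b, length_m):
    """Project a virtual extension of predetermined length beyond
    end_b of a bar whose end points end_a and end_b were triangulated."""
    direction = [b - a for a, b in zip(end_a, end_b)]
    norm = math.sqrt(sum(c * c for c in direction))
    unit = [c / norm for c in direction]
    return tuple(b + length_m * u for b, u in zip(end_b, unit))

# A 0.5 m arrow extending from end point 454 along the bar's axis; the
# tip position then feeds the same intersection test as before.
tip = extension_tip((0.0, 0.0, 0.9), (0.1, 0.0, 0.8), 0.5)
print(tip)
```

The stereoscopic left and right images 215 and 225 would then be rendered so that the tip is perceived at this position.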
  • [0032] FIG. 6 shows a stereoscopic extension image of the physical object intersecting the stereoscopic interface image wherein the intersection is behind the display 200. FIG. 6 is similar to FIG. 5 in that both show a stereoscopic extension image, 255 and 255′, intersecting a stereoscopic interface image, 250 and 250′. However, in FIG. 5 the intersection is in front of display 200, while in FIG. 6 the intersection is behind display 200. The position and orientation of physical object 450 are determined by determining the positions of end points 452 and 454 via cameras 310 and 320 and angles A5′, A6′, A7′ and A8′. In this case the resulting extension image 255′ is shown to have a substantially longer predetermined length than image 255 of FIG. 5. If display 200 were not a heads-up stereoscopic display, but rather a conventional LCD or CRT, then the intersection between a physical object and an interface image could not occur if the position of the interface image were behind the display, because either the space is physically occupied by another object or the user could not see the physical intersection through the display. The extension image has the advantage of enabling intersections to occur in positions appearing behind the display 200, or in other positions out of reach of the user, while allowing the user to directly view the physical object used to cause the intersection.
  • [0033] Physical object 450 has been referred to as a bar, but it should be appreciated that the physical object could be any of a number of physical objects including the finger of the user, where one end is the finger tip and the other end is a joint of the finger. Fiducial marks could be added to the points on the finger to facilitate pattern recognition of images recorded by the cameras. While the extension image is shown as a line with an arrow head, other types of extension images may be used depending upon the application. The stereoscopic extension may be considered a virtual end effect for a physical handle; a wide variety of end effects may be created by the computer system. For example, a paint brush could be used for painting a virtual object, the handle being the physical object and the brush bristles and paint color being the end effect, while the interface image appears as a paint canvas mounted on a three dimensional easel image. In a medical application, the physical object could be the handle and the end effect extension image the blade of a scalpel, while the stereoscopic interface image is part of a three dimensional image simulating surgery. Alternately, in a game application, the stereoscopic extension image could be a laser beam, rocket, bullet or bolt of lightning appearing to emanate from the finger of the user along a three dimensional vector defined by the finger, while the stereoscopic interface image may be a villain or an enemy tank moving in three dimensions.
  • [0034] It should also be appreciated that the position and orientation of the user 100 and physical object 450 have been described as being determined by two cameras with pattern recognition which triangulate in order to determine the corresponding position and orientation. In a heads-up stereoscopic headset display, the cameras could preferably be mounted on the headset for visually monitoring physical objects in the same space in which the user observes the projected stereoscopic images. In alternate embodiments other techniques may be used to determine the aforesaid positions and orientations without departing from the spirit and scope of the invention.
  • [0035] FIG. 7 shows a block diagram of the user interface system operating in accordance with the present invention. A stereoscopic display 200 displays stereoscopic images generated by stereoscopic image generation means 212 in a manner known in the art. The stereoscopic display may be a CRT or LCD screen requiring filter glasses to be worn by the user to direct the appropriate image to the corresponding eye of the user. Alternately, it may be a heads-up stereoscopic display worn by the user. Preferably, display 200 is a display means especially adapted to displaying stereoscopic images without the aid of devices worn by the user. Cameras 310 and 320 produce images which are analyzed by pattern recognizers 312 and 322, which identify certain points of the image and their locations within the image. As previously described, the pattern recognition may be performed with or without the aid of fiducial marks. The locations of the points from pattern recognizers 312 and 322 are analyzed by coordinate determining means 314, which analyzes the angles relative to each point from each camera and, knowing the predetermined distance between the cameras, is able to determine the desired positions and orientations. Coordinate determining means 314 also makes available the position of the user and the position and orientation of the physical object so that the stereoscopic image generator 212 may generate the stereoscopic extension image in response thereto. Coordinate determining means 314 also makes available the position of the user to coordinate determining means 214, which determines the position of the interface image relative to the user by combining the distance between the left eye and right eye images displayed on display 200 with the user's position, including the distance between the user and the display and the spacing between the eyes of the user. The positions of the physical object and interface image are then compared by intersection monitor 322, which generates a control signal in response to a substantial coincidence of the position of the physical object, or its stereoscopic extension image, with the position of the stereoscopic interface image.
  • [0036] FIG. 8 shows a flow chart of a process operating in accordance with the present invention. In step 800, a stereoscopic image is displayed. Step 802 determines the position of the user as previously described. Note that in alternate embodiments the position of the user may be predetermined. Then in step 804 the position of the stereoscopic interface image relative to the user is determined. Step 806 determines the position and orientation of the physical object, and step 810 asks if an extension image is desired. If so, step 812 causes the display of the extension image and step 814 redetermines the position and orientation of the physical object with the extension image. Then step 816 determines if there is an intersection between the interface image and the physical object or its extension image. If so, step 818 generates a control signal which in step 820 modifies the displayed image and/or controls another device.
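The flow of FIG. 8 maps naturally onto a polling loop. The sketch below is structural only: `system` is a hypothetical object bundling the display, cameras and pattern recognition, and every method name on it is an assumption rather than an interface from the text.

```python
def run_interface(system):
    """One possible control loop mirroring steps 800-820 of FIG. 8."""
    while True:
        system.display_stereoscopic_image()             # step 800
        user = system.locate_user()                     # step 802
        target = system.locate_interface_image(user)    # step 804
        obj = system.locate_physical_object()           # step 806
        if system.extension_desired():                  # step 810
            system.display_extension_image(obj)         # step 812
            obj = system.locate_extended_object()       # step 814
        if system.intersection(target, obj):            # step 816
            signal = system.generate_control_signal()   # step 818
            system.apply(signal)                        # step 820
```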
  • Thus what has been provided is a method and apparatus by which the intersection of a physical object and a stereoscopic object can be determined and be used to form a user interface with a computer system. [0037]
  • [0038] FIG. 9 shows active real image obstruction in a virtual reality display system. Display means 200 is a transparent display means preferably capable of displaying a stereoscopic image 250 appearing in front of the display means or stereoscopic image 250′ appearing behind the display means. Alternately, an image 251 or 252 could appear in coincidence with the display if a non-stereoscopic display were implemented; such non-stereoscopic virtual reality images are produced by displays including "teleprompters". Image 252 could alternately be a stereoscopic image if display 200 were a stereoscopic display. Images produced by display 200 correspond to virtual reality images when viewed by the user. Reality also has numerous real images, including real images 850 having portions 852 and 854 corresponding to images normally observable by an observer with the naked eye 110.
  • [0039] The transparency of display 200 allows the virtual reality images to appear superimposed upon real images represented by reality 850. Such a system is shown in U.S. Pat. No. 5,491,510 to Gove entitled System and method for simultaneously viewing a scene and an obstructed object, or U.S. Pat. No. 5,694,142 to Dumoulin et al. entitled Interactive digital arrow (D'ARROW) three-dimensional (3D) pointing, which are hereby incorporated by reference. Thus, virtual image 252 appears superimposed upon real image 852.
  • [0040] The invention also includes an active real image obstructer, or real image obstruction means, 860. The active real image obstructer modifies the transparency of portions of the viewing system. The active reality obstructer preferably includes a multiplicity of individually addressable and electronically controlled light valves, and preferably includes a gray scale liquid crystal display (LCD) having pixels capable of electronically switching between substantially transparent, partially transparent and substantially opaque states. Such LCDs are known to those familiar with the art. The result is the selective obstruction of real images 850. For example, obstructer 860 has a portion 864 where light valves substantially inhibit viewing of real images 854. In addition to image 252, display 200 also displays image 250, 250′ or 251. The resulting display system provides a view wherein virtual image 252 appears superimposed upon or combined with real image 852, while virtual image 250, 250′ or 251 appears on a substantially opaque background. An opaque area 864 is formed by light valves of the obstructer inhibiting viewing of real images 854. Thus, real images 854 do not interfere with the viewing of virtual image 250, 250′ or 251. This enhances the viewing of virtual reality image 250, 250′ or 251 by providing a dark background free of real images 854, while enabling virtual reality image 252 to be viewed along with real image 852.
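Driving the gray scale LCD obstructer amounts to writing transparency levels into a two dimensional array of light valves. A minimal sketch, with the panel resolution, region coordinates and function name all invented for illustration:

```python
import numpy as np

def obstruct(mask, top, left, height, width, transparency=0.0):
    """Set a rectangular block of light valves to a transparency level
    between opaque (0.0) and fully clear (1.0)."""
    mask[top:top + height, left:left + width] = transparency
    return mask

mask = np.ones((480, 640))  # fully transparent panel: all real images pass

# Opaque backdrop (area 864) behind virtual image 250/251, leaving the
# rest of the panel clear so image 252 still combines with real image 852.
obstruct(mask, top=100, left=380, height=200, width=220)
```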
  • [0041] FIG. 10 shows selective real image obstruction in the virtual reality display system. As discussed with respect to previous figures, video cameras 310 and 320 determine the position of the viewer in order to determine the position of the perceived virtual reality image 250′. The transparency of display 200 allows viewing of an image of real physical object 450. Since object 450 is on the opposite side of the display system, a second set of video cameras 310′ and 320′ is used to determine the position of selecting object 450. In response to determining the position of object 450, real image obstruction 864′ is modified to enable viewing of selector 450 so that both the real image of object 450 and virtual image 250′ may be viewed by the viewer.
  • [0042] In an embodiment including a stereoscopic user interface system, image 250′ has an interface image, and object 450 corresponds to a selecting device, such as a finger of the user. Real image obstruction 864′ is modified to facilitate the user observing the object in order to guide the object to a desired interface image. Note that if reality obstruction 864′ of FIG. 10 were not modified in response to selecting object 450, the view of selecting object 450 would be obstructed. In the preferred embodiment, selecting object 450 is recognized independently of other real images 854 obstructed by obstruction 864. The independent recognition of selecting object 450 may be facilitated with the use of fiducial marks on the pointing device, such as a ring or colored dots on the finger. Alternately, pattern recognition could be performed to identify a predetermined pointing device such as the user's finger. Since the positions of the user, the selecting object and the display system are all known, sufficient information is available to adjust the reality obstruction 864′ to facilitate viewing of the pointing device by the user.
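Re-opening the obstruction over the selecting object is the inverse operation: the tracked bounding box of the object is driven back toward full transparency. A sketch continuing the same mask representation (the bounding box numbers are invented):

```python
import numpy as np

mask = np.zeros((480, 640))  # start from a fully opaque obstruction 864

def unobstruct(mask, bbox, transparency=1.0):
    """Re-open a window over the tracked selecting object so its real
    image stays visible while it approaches the interface image."""
    top, left, height, width = bbox
    mask[top:top + height, left:left + width] = transparency
    return mask

# Bounding box reported by cameras 310'/320' for the fingertip 450.
unobstruct(mask, bbox=(150, 400, 60, 40))
```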
  • [0043] In another embodiment, a heads-up display includes active reality obstruction. In this embodiment a transparent display is coupled with real image obstructer 860 and placed a substantial distance from the viewer in order that the display may be viewed by both eyes of the viewer. The obstructer 860 allows some of the virtual images projected by the display to be combined with real images viewed through the display. For other virtual images, the reality obstructer inhibits viewing of real images in other portions of the display in order to enhance viewing of corresponding virtual images or to facilitate better viewing of real images through the display system. In an example of this embodiment, the display system of FIG. 1 to FIG. 6 is modified such that display 200 includes a transparent display and obstructer 860 is located behind the display. Interface image 250 can be further incorporated into the display system to provide for a heads-up stereoscopic user interface.
  • [0044] FIG. 11 shows a headset embodiment of the present invention. The headset is capable of displaying stereoscopic images with a transparent display projection system. Such a system is shown in U.S. Pat. No. 5,886,822 to Spitzer entitled Image combining system for eyeglasses and face masks, which is hereby incorporated by reference. The display system includes an image projector system 870 and 872. Images are generated by image generator 870 and reflected into an eye of the viewer by reflector means 872. When a corresponding system is used with the second eye of the user, a stereoscopic image is generated. Also included is obstructer 860, which obstructs real images in certain parts of the viewing area. Video camera means 310′ is used to monitor real images and adjust the reality obstruction in response thereto. For example, the real image obstruction can be modified to enable viewing of selecting object 450. Since the system is part of a headset, the location of each of the user's eyes is substantially predetermined. As a result, video cameras 310 and 320, which were previously used to determine the location of the user, become optional. As previously discussed, the system enables the stereoscopic user interface by facilitating the user's view of the intersection between a real object and a stereoscopic interface image.
  • [0045] FIG. 12 shows a front view of the headset embodiment of the present invention. Each lens has a virtual reality image projector and a real image obstructer: 872 and 860 for one lens, and 872′ and 860′ for the other. In the system of FIG. 11 and FIG. 12, the obstruction created by obstructers 860 and 860′ is also stereoscopic, occupying a perceived space at a distance in front of the user. In the preferred embodiment, the perceived distance of the obstruction is adjusted to enhance the user's view. For example, if stereoscopic image 250 is projected to appear one half meter in front of the viewer, then the stereoscopic obstruction would also be generated to appear one half meter in front of the viewer in order to enhance the viewing of the image. Video cameras 310′ and 320′ monitor real images viewed by the user. This facilitates the stereoscopic user interface by locating a selecting object position as well as facilitating adjustment of the real image obstructers in response to real images.
  • In alternate embodiments, the headset display system of FIG. 11 or FIG. 12 can be implemented with filter glasses (polarized, colored or actively switched) viewing a common display panel. [0046]
  • [0047] FIG. 13 shows a top view operation of a headset with active reality obstruction. The headset includes reality obstructers 860 and 860′ for obstructing real images viewed by eyes 110 and 120. A virtual reality image 250′ is projected at a distance in front of the viewer. In order to create a stereoscopic obstruction in coincidence with virtual image 250′, an obstruction is created substantially between lines 870 and 872 on obstructer 860 and substantially between lines 874 and 876 on obstructer 860′. Note that since virtual image 250′ is created by the display system, its position is inherently known by the display system. Thus, the reality obstruction corresponding to virtual image 250′ may be determined by the display system at the time of generation of image 250′. Note that if the display system changes the position or size of image 250′, then the position or size of the corresponding obstruction may also be changed.
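The per-eye geometry of FIG. 13 can be expressed as a similar-triangles projection: the obstruction on each panel lies where the line of sight from that eye to the perceived image crosses the panel. A sketch assuming a simple head-fixed coordinate frame (all names and distances illustrative):

```python
def panel_point(eye, target, panel_depth):
    """Where the line of sight from `eye` to perceived point `target`
    crosses an obstructer panel `panel_depth` meters in front of the
    eye; z points away from the viewer."""
    t = panel_depth / (target[2] - eye[2])
    return tuple(e + t * (p - e) for e, p in zip(eye, target))

left_eye, right_eye = (-0.032, 0.0, 0.0), (0.032, 0.0, 0.0)
image_point = (0.0, 0.0, 0.5)  # virtual image 250' perceived 0.5 m out

# Different offsets on obstructers 860 and 860' yield an obstruction
# perceived at the same 0.5 m depth as the virtual image.
print(panel_point(left_eye, image_point, 0.02))
print(panel_point(right_eye, image_point, 0.02))
```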
  • [0048] FIG. 13 further shows how the active obstructions are modified in response to real objects. FIG. 13 shows objects 450 and 880. Monitoring means such as video cameras 310′ and 320′ may be added to the system to determine the position and/or character of the real objects. Upon determining the size and position of selecting object 450, the obstruction is correspondingly modified to facilitate viewing of the real image of the selecting object. By modifying the stereoscopic obstruction to occur substantially between lines 870 and 877 on obstructer 860 and substantially between lines 874 and 878 on obstructer 860′, the real image of selecting object 450 may be viewed. This is a top view graphical analysis. A similar graphical analysis can be performed for the side view, thus generating a two dimensional real image obstruction area on obstructers 860 and 860′.
  • [0049] In another embodiment of the invention, the monitoring means determines the character of the real images and obstructs their view if they interfere with viewing other virtual reality images or other real images. For example, real object 880 corresponds to a relatively bright object, such as a streetlight or the sun. Such objects produce relatively bright images that tend to interfere with viewing of either virtual reality images or other real images. Video cameras 310′ and 320′ determine the location and relative brightness of the real image of object 880. Then a stereoscopic obstruction is generated in substantial coincidence with the real image of object 880 by generating obstructions substantially between lines 882 and 884 on obstructer 860 and substantially between lines 886 and 888 on obstructer 860′. Note that in this application, the stereoscopic projectors 870 and 870′ are optional. It is further noted that such stereoscopic obstructions may be created by the aforementioned heads-up display system of FIG. 10. In such a heads-up application, if only the enhancement of viewing real images is sought, the projector 200 can become optional.
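Locating a bright offender such as the sun can be as simple as thresholding the monitoring cameras' frames and darkening the corresponding valves. A sketch with an invented threshold and a synthetic frame:

```python
import numpy as np

def bright_regions(frame, threshold=240):
    """Boolean map of pixels bright enough (a streetlight, the sun)
    to warrant obstruction; frame is an 8-bit grayscale camera image."""
    return frame > threshold

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
# Drop the transparency over bright pixels to 20%, leave the rest clear.
mask = np.where(bright_regions(frame), 0.2, 1.0)
```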
  • [0050] FIG. 14 shows an example of a view of a transparent display system without real image obstruction. FIG. 15 shows an example view of the transparent display system with real image obstruction. The view of reality 850 includes a real image of a building and the sun 880. A first virtual reality image 890 of a digital time of day clock is displayed superimposed on real image 850 in both FIG. 14 and FIG. 15. FIG. 14 also has a second virtual reality image 892 of a streaming information display, such as real time stock prices. The information streams from right to left towards the bottom of the display and is superimposed upon real image 850. Real image 850 tends to interfere with the viewing of the streaming information of virtual reality image 892. FIG. 15 shows a real image obstruction 893 in substantial coincidence with virtual reality image 892. Real image obstruction 893 substantially reduces visual interference from the real image and improves the viewing of the streaming information. FIG. 14 also shows a third superimposed virtual reality image 894 of a video image. In this example the video image is that of a lecture given on the building being observed. However, image 894 is superimposed upon the sun 880, which is substantially brighter, making at least portions of the video image difficult if not impossible to view. The image from bright object 880 may also interfere with viewing of other virtual reality images as well as the real image of the building. In FIG. 15, a real image obstruction is generated in substantial coincidence with virtual reality image 894, which substantially enhances its viewing. Further, creating a real image obstruction in substantial coincidence with or including the bright image 880 also enhances viewing of the other virtual reality images 890 and 892 as well as the real image of the building. Image 894 of FIG. 14 and FIG. 15 also includes a stereoscopic interface image 250′ which in this example may be used to raise or lower the volume of an accompanying audio portion of the video image 894, a sound track of the speaker's lecture on the real image of the building being viewed. The real image of selector 450 is also shown. FIG. 15 shows the real image obstruction in substantial coincidence with the real image of selector 450 being removed to facilitate viewing of the real image of selecting object 450.
  • [0051] Referring back to the block diagram of FIG. 7, the block diagram also shows the active real image obstruction means 860 in substantial coincidence with display 200 in order to create the previously described active obstruction of real images in a virtual reality display system. Obstructer 860 is controlled by obstruction controller 902, which in the preferred embodiment switches selected light valves of obstructer 860 off or on, or partially off or on, in order to create a desired level of transparency or opaqueness. The obstruction controller adjusts the size, location and/or transparency of obstructions in response to manual inputs from the user from manual input means 904, inputs from an information signal used to generate images displayed on display 200, the coordinates of virtual images produced by the system from coordinates means 214 and/or inputs from coordinates means 314 for determining coordinates of real objects having real images.
  • [0052] As previously described, it may be desirable for certain virtual reality images to be superimposed or combined with real images while other virtual reality images are viewed substantially only as virtual reality images with little or no interference from real images. Stereoscopic image generator 212 communicates the coordinates of the virtual images to obstruction controller 902, which generates the obstructions accordingly. The amount of transparency of each obstruction may be varied between substantially totally transparent through several levels of decreasing transparency to substantially opaque. If the virtual reality image having a corresponding real image obstruction is moved or resized, the corresponding real image obstruction is also moved or resized by obstruction controller 902. Such a virtual reality image may be moved in response to a manual input from the viewer.
  • For example, the viewer may be watching a high definition movie in a small window with a corresponding partially transparent real image obstruction while traveling through an airport and on the way to a seat on the airplane. The transparency of the portions of the viewing area allows the viewer to do many things while viewing the movie in the small window. The viewer may use the real images to navigate the crowds and corridors of the airport, communicate with airline and security personnel, and find an assigned seat on the airplane. Once seated, the viewer may desire to substantially enlarge the viewing window of the movie and further reduce the transparency of the movie's real image obstruction in order to improve the viewing of the movie. Such adjustments can be done via manual inputs. The manual adjustments may be made via several means including switches or buttons associated with the display system or via a stereoscopic user interface system as previously described. [0053]
  • The virtual reality image and its corresponding real image obstruction may be moved or resized in response to an information signal used to generate virtual reality images. For example, a substantially real time stock quote information stream is being displayed, and the user desires to receive alerts based upon a certain financial trigger. In the event of the trigger, the size of the image could be doubled to facilitate a second display of information related to the trigger. This is an example of the size of the virtual reality image, and its corresponding real image obstruction, varying in response to the information signal used for generating the virtual reality image. Furthermore, the transparency of the corresponding real image obstruction could be reduced in response to the trigger. This is an example of changing the size and/or transparency of the real image obstruction in response to an information signal used for generating the virtual reality image. [0054]
  • [0055] Obstruction controller 902 also receives coordinate information from coordinates means 314 in order to modify obstructions to facilitate viewing of a real image of a selecting object in an embodiment implementing the aforementioned stereoscopic user interface.
  • [0056] Coordinates means 314 also provides coordinates of substantially bright portions of the real image in order that corresponding real image obstructions may be generated to reduce the viewed brightness of the bright portions of the real image. This has the advantage of improving the viewing of both virtual reality images and other real images.
  • [0057] Coordinates means 314 is capable of determining the ambient brightness of the real images. In one embodiment, the ambient brightness may be used to adjust the transparency of the entire obstructer 860. For example, if the ambient brightness of the real images doubled, the transparency of each pixel of the obstructer 860 would be substantially halved in order to maintain a substantially constant contrast ratio between the virtual reality images and the real images. If first and second portions of the obstructer had 100% and 50% transparency respectively, then upon a doubling of the ambient light, the first and second portions of the obstructer would be correspondingly adjusted to 50% and 25% transparency. Other non-linear adjusting relationships between obstructer transparency, real image ambient light and virtual reality image brightness are also anticipated. In a simplified embodiment which detects substantially only ambient light, the means for determining coordinates of real images is not necessarily needed. Thus, the video cameras 310 and 320, pattern recognizers 312 and 322, and coordinate determining means 314 could be replaced by a simpler and lower cost ambient light sensor.
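The constant-contrast rule described here is a simple inverse scaling of every valve's transparency with ambient light. A sketch reproducing the 100%/50% example (the function name is assumed):

```python
def rescale_transparency(pixels, old_ambient, new_ambient):
    """Scale light valve transparencies inversely with ambient
    brightness to hold the virtual/real contrast ratio roughly
    constant; non-linear curves are equally possible."""
    factor = old_ambient / new_ambient
    return [min(1.0, t * factor) for t in pixels]

# Ambient light doubles: 100% and 50% transparent regions fall to 50% and 25%.
print(rescale_transparency([1.0, 0.5], old_ambient=1.0, new_ambient=2.0))
```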
  • [0058] FIG. 16 shows a flowchart of a method operating in accordance with the present invention. Step 910 checks for either a manual input or information associated with the virtual image indicating the need for a change in real image obstruction. If found, then step 915 resizes and/or moves the image and/or real image obstruction, and/or changes the transparency of the obstruction. From either step 910 or 915, step 920 checks for a change in ambient brightness of the real image and, if found, adjusts the overall transparency of the obstruction means in step 925. Thereafter, step 930 checks for a substantially bright area in the real image and, if found, step 935 adjusts the transparency of an obstruction corresponding to the bright area. Thereafter, step 940 determines if a selecting object is included in the real image and, if found, step 945 adjusts the transparency of the obstruction to facilitate viewing of the selecting object. Thereafter the program returns to step 910.
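As with FIG. 8, the method of FIG. 16 can be read as one pass of a polling loop. The sketch below is structural only; `sys` is a hypothetical facade over the obstruction controller and sensors, and its method names are assumptions:

```python
def obstruction_pass(sys):
    """One pass through steps 910-945 of FIG. 16."""
    if sys.manual_or_image_change():             # step 910
        sys.resize_move_or_fade_obstruction()    # step 915
    if sys.ambient_brightness_changed():         # step 920
        sys.adjust_overall_transparency()        # step 925
    if sys.bright_area_present():                # step 930
        sys.obstruct_bright_area()               # step 935
    if sys.selecting_object_present():           # step 940
        sys.unobstruct_selecting_object()        # step 945
```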
  • [0059] In an alternate embodiment of selective real image obstruction of a virtual reality display system, the real image obstruction is a passive real image obstruction of a predetermined area of the display system. For example, real image obstruction 893 of FIG. 14 and FIG. 15 could be a permanent blacked out obstruction on the headset, a thick black stripe towards the bottom of each lens of the headset. In this way streaming stock information image 892 could always be seen without interference from real images; other virtual reality images, such as 890 and 894, would always appear combined with or superimposed upon real images. In this embodiment, the thick black stripe real image obstructer may be substantially unrelated to the display means used for projecting virtual reality images. This also enables active real image obstructer 860 to become an optional component of the display system. A further enhancement of the selective obstruction includes blocking only a certain portion of the spectrum of visible light of real images with either the passive or active real image obstructer. This enhances viewing of a virtual image in the color region of the real image obstruction. For example, real image obstructer 893 may selectively block blue light, causing real images in area 893 to appear substantially yellow. This substantially enhances viewing of the information stream 892, particularly if the information stream is projected with a blue color. As a further embodiment, the entire display system can selectively block or filter a portion of the spectrum of light from real images (such as blue light) while projecting desired information (such as text, graphs or line art) with a corresponding (blue) color in order to enhance viewing of the virtual (text, graph or line art) images when viewed superimposed upon or combined with filtered light from the real images. Other virtual reality images (such as color video) may be projected in full color and viewed in combination with filtered light from the real images, or the real images may be substantially totally obstructed by an active image obstructer blocking substantially the entire spectrum of visible light as previously disclosed.
  • Thus, what has been provided is a virtual reality viewing system that provides for the advantages of both transparent and opaque viewing systems while reducing the disadvantages of both by actively obstructing views of real images to enhance views of select virtual reality images as well as views of other real images.[0060]

Claims (25)

We claim:
1. A method of displaying a viewable virtual image within a display area having viewable real images, the method comprising the step of:
obstructing a view of real images in an obstruction area for obstructing views of real images within a portion of the display area, wherein the obstruction area is substantially independent of substantial obstructions resulting from a means for projecting the virtual image, thereby enhancing viewing of the virtual image in coincidence with the obstruction area while enabling viewing of the virtual image and the real images beyond the obstruction area.
2. The method of claim 1 further comprising the step of actively selecting the obstruction area from a plurality of obstruction areas within the display area.
3. The method according to claim 1 further comprising the step of
selectively modifying characteristics of the obstruction area in response to a modification signal.
4. The method according to claim 3 wherein said step of selectively modifying modifies a size, shape or location of the obstruction area.
5. The method according to claim 3 wherein said step of selectively modifying modifies an amount of obstruction of real images within the obstruction area between states including a first state wherein real images within the obstruction area are substantially visible and at least a second state wherein real images within the obstruction area are not substantially visible.
6. The method according to claim 5 further comprising the steps of:
determining an amount of ambient light associated with the real images; and
generating the modification signal in response thereto, wherein said step of selectively modifying modifies the amount of obstruction in response to said step of determining.
7. The method according to claim 3 further comprising the steps of:
receiving a manual input; and
generating the modification signal in response thereto, wherein said step of modifying is made in response to the manual input.
8. The method according to claim 3 wherein the virtual image is generated in response to an information signal and said step of selectively modifying is responsive to the information signal.
9. The method according to claim 1 wherein the obstruction area occurs in a predetermined portion of the display area.
10. A display system comprising:
a first display projector for projecting a virtual image within a viewable area wherein the virtual image appears substantially combined with real images visible within the viewable area; and
a first real image obstruction means for obstructing viewing of real images within an obstruction area within the viewable area, thereby enhancing viewing of the virtual image appearing within the obstruction area while enabling viewing of the virtual image and real images in the viewable area and beyond the obstruction area.
11. The display system according to claim 10 comprised within a headset display system for a viewer having first and second eyes for viewing the virtual image and the real images further comprising:
a second display projector substantially equivalent to said first display projector; and
a second real image obstruction means substantially equivalent to said first real image obstruction means, and further wherein
the first eye views virtual and real images from said first display projector and said first real image obstruction means and
the second eye views virtual and real images from said second display projector and said second image obstruction means.
12. The headset display system according to claim 11 wherein the first and second display projectors display a stereoscopic virtual image viewable by the viewer, the virtual image includes a stereoscopic interface image appearing within an interface space viewable by the viewer, said headset display system further comprising
a monitoring means for determining an intersection of a physical object having a real image with the interface space including the stereoscopic interface image and for generating a control signal in response thereto.
13. The headset display system of claim 12 wherein the physical object is a selecting object having a real image including a selecting image and further wherein
said first and second real image obstruction means are responsive to said monitoring means for obstructing viewing of real images within an obstruction area substantially including the interface image yet enable viewing of the selecting image, thereby allowing the viewer to view the intersection of the selecting object with the stereoscopic interface image.
14. The headset display system according to claim 13 wherein said monitoring means further identifies the selecting object as a finger tip of the viewer.
15. The display system according to claim 10 further comprising
an obstruction controller coupled to said first real image obstruction means for selectively modifying characteristics of the obstruction area.
16. The display system according to claim 15 wherein
said obstruction controller modifies a size, shape or location of the obstruction area or an amount of obstruction of real images appearing within the obstruction area in response to a manual input.
17. The display system according to claim 15 wherein
said first display projector projects the virtual image in response to information included within an information signal, and
said obstruction controller modifies a size, shape or location of the obstruction area or an amount of obstruction of real images appearing within the obstruction area in response to the information signal.
18. The display system according to claim 15 further comprising
an ambient light monitoring means for determining an ambient light level of the real images and for generating a brightness signal in response thereto, wherein
said obstruction controller modifies an amount of obstruction of real images appearing within the obstruction area in response to the brightness signal.
19. The display system according to claim 11 further comprising:
an obstruction controller coupled to said first and second real image obstruction means for selectively modifying characteristics of the obstruction area;
a light monitoring means for determining a substantially high light level of a bright real image and a location thereof and for generating a light source signal in response thereto, wherein
said obstruction controller increases an amount of obstruction of the bright real image in substantial coincidence with the location in response to the light source signal.
20. In a headset display system, a method of enhancing viewing of a virtual image viewed in combination with real images comprised of a spectrum of light comprising the steps of:
substantially filtering a portion of the spectrum of light of the real images; and
displaying the virtual image in a light spectrum substantially within the filtered portion of the spectrum of light of the real images.
21. The method according to claim 20 wherein the virtual image includes a substantial amount of text or graphic information.
22. The method according to claim 20 wherein the virtual image is displayed in a viewing area of the headset display system and said step of substantially filtering filters the spectrum of light of real images for substantially the entire viewing area.
23. The method according to claim 22 further comprising the steps of:
substantially obstructing the entire spectrum of light of the real images within an obstruction area; and
displaying a second virtual image within the obstruction area.
24. The method according to claim 20 wherein the virtual image is displayed in a viewing area of the headset display system and said step of substantially filtering filters the spectrum of light of real images for a portion of the entire viewing area.
25. The method according to claim 24 further comprising the step of actively modifying a size or shape of the portion of the entire viewing area.
US10/339,090 1998-07-01 2003-01-09 Selective real image obstruction in a virtual reality display apparatus and method Abandoned US20030142068A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/339,090 US20030142068A1 (en) 1998-07-01 2003-01-09 Selective real image obstruction in a virtual reality display apparatus and method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US09/108,814 US6064354A (en) 1998-07-01 1998-07-01 Stereoscopic user interface method and apparatus
US09/494,976 US6559813B1 (en) 1998-07-01 2000-01-31 Selective real image obstruction in a virtual reality display apparatus and method
US10/339,090 US20030142068A1 (en) 1998-07-01 2003-01-09 Selective real image obstruction in a virtual reality display apparatus and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/494,976 Division US6559813B1 (en) 1998-07-01 2000-01-31 Selective real image obstruction in a virtual reality display apparatus and method

Publications (1)

Publication Number Publication Date
US20030142068A1 true US20030142068A1 (en) 2003-07-31

Family

ID=22324190

Family Applications (4)

Application Number Title Priority Date Filing Date
US09/108,814 Expired - Fee Related US6064354A (en) 1998-07-01 1998-07-01 Stereoscopic user interface method and apparatus
US09/494,976 Expired - Lifetime US6559813B1 (en) 1998-07-01 2000-01-31 Selective real image obstruction in a virtual reality display apparatus and method
US09/539,536 Expired - Lifetime US6243054B1 (en) 1998-07-01 2000-03-31 Stereoscopic user interface method and apparatus
US10/339,090 Abandoned US20030142068A1 (en) 1998-07-01 2003-01-09 Selective real image obstruction in a virtual reality display apparatus and method

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US09/108,814 Expired - Fee Related US6064354A (en) 1998-07-01 1998-07-01 Stereoscopic user interface method and apparatus
US09/494,976 Expired - Lifetime US6559813B1 (en) 1998-07-01 2000-01-31 Selective real image obstruction in a virtual reality display apparatus and method
US09/539,536 Expired - Lifetime US6243054B1 (en) 1988-07-01 2000-03-31 Stereoscopic user interface method and apparatus

Country Status (3)

Country Link
US (4) US6064354A (en)
AU (1) AU4837299A (en)
WO (1) WO2000002187A1 (en)

Cited By (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020047835A1 (en) * 2000-09-11 2002-04-25 Tomoaki Kawai Image display apparatus and method of displaying image data
US20030179192A1 (en) * 2002-03-20 2003-09-25 Allen William J. Method and apparatus for image display
WO2006005799A1 (en) * 2004-05-11 2006-01-19 Ritva Laijoki-Puska Method and arrangement for presenting a virtual landscape
US20060017654A1 (en) * 2004-07-23 2006-01-26 Romo Justin R Virtual reality interactivity system and method
US20060109510A1 (en) * 2004-11-23 2006-05-25 Simon Widdowson Methods and systems for determining object layouts
US20070003134A1 (en) * 2005-06-30 2007-01-04 Myoung-Seop Song Stereoscopic image display device
US20070035561A1 (en) * 2005-04-11 2007-02-15 Systems Technology, Inc. System for combining virtual and real-time environments
FR2895811A1 (en) * 2006-01-04 2007-07-06 Bernard Szajner Real time control of relief images for use in computer based three- dimensional imaging systems based on correlation between observer movement data and possible relief projection data
US20080059578A1 (en) * 2006-09-06 2008-03-06 Jacob C Albertson Informing a user of gestures made by others out of the user's line of sight
US20080170749A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Controlling a system based on user behavioral signals detected from a 3d captured image stream
US20080169914A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Warning a vehicle operator of unsafe operation behavior based on a 3d captured image stream
US20080170748A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Controlling a document based on user behavioral signals detected from a 3d captured image stream
US20080170776A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Controlling resource access based on user gesturing in a 3d captured image stream of the user
US20080172261A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Adjusting a consumer experience based on a 3d captured image stream of a consumer response
US20080170118A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Assisting a vision-impaired user with navigation based on a 3d captured image stream
US20080169929A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Warning a user about adverse behaviors of others within an environment based on a 3d captured image stream
US20080170123A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Tracking a range of body movement based on 3d captured image streams of a user
US7425946B1 (en) * 2003-08-15 2008-09-16 Britton Rick A Remote camouflage keypad for alarm control panel
US20100060575A1 (en) * 2008-09-05 2010-03-11 Keizo Ohta Computer readable recording medium recording image processing program and image processing apparatus
US20110012830A1 (en) * 2009-07-20 2011-01-20 J Touch Corporation Stereo image interaction system
US20110074918A1 (en) * 2009-09-30 2011-03-31 Rovi Technologies Corporation Systems and methods for generating a three-dimensional media guidance application
US20110137727A1 (en) * 2009-12-07 2011-06-09 Rovi Technologies Corporation Systems and methods for determining proximity of media objects in a 3d media environment
US20120013612A1 (en) * 2010-07-13 2012-01-19 Lg Electronics Inc. Electronic apparatus and method for displaying graphical user interface as 3d image
US20120069180A1 (en) * 2009-05-26 2012-03-22 Panasonic Electric Works Co., Ltd. Information presentation apparatus
US20120081529A1 (en) * 2010-10-04 2012-04-05 Samsung Electronics Co., Ltd Method of generating and reproducing moving image data by using augmented reality and photographing apparatus using the same
US20120119988A1 (en) * 2009-08-12 2012-05-17 Shimane Prefectural Government Image recognition apparatus, operation determining method and computer-readable medium
US20120120051A1 (en) * 2010-11-16 2012-05-17 Shu-Ming Liu Method and system for displaying stereoscopic images
US20120287235A1 (en) * 2011-05-13 2012-11-15 Ahn Mooki Apparatus and method for processing 3-dimensional image
US20130009949A1 (en) * 2011-07-05 2013-01-10 Texas Instruments Incorporated Method, system and computer program product for re-convergence of a stereoscopic image
US20130121528A1 (en) * 2011-11-14 2013-05-16 Sony Corporation Information presentation device, information presentation method, information presentation system, information registration device, information registration method, information registration system, and program
US20130169530A1 (en) * 2011-12-29 2013-07-04 Khalifa University Of Science And Technology & Research (Kustar) Human eye controlled computer mouse interface
US20130222424A1 (en) * 2010-11-08 2013-08-29 Ntt Docomo, Inc. Object display device and object display method
WO2013050953A3 (en) * 2011-10-04 2013-09-12 Nokia Corporation Methods, apparatuses, and computer program products for restricting overlay of an augmentation
US20130293687A1 (en) * 2011-01-14 2013-11-07 Sharp Kabushiki Kaisha Stereoscopic image processing apparatus, stereoscopic image processing method, and program
US20130335404A1 (en) * 2012-06-15 2013-12-19 Jeff Westerinen Depth of field control for see-thru display
US20140009424A1 (en) * 2011-03-25 2014-01-09 Kyocera Corporation Electronic device, control method, and control program
US20140125557A1 (en) * 2012-11-02 2014-05-08 Atheer, Inc. Method and apparatus for a three dimensional interface
WO2014069722A1 (en) * 2012-10-30 2014-05-08 Samsung Electronics Co., Ltd. Three-dimensional display device and user interfacing method therefor
US20140132726A1 (en) * 2012-11-13 2014-05-15 Lg Electronics Inc. Image display apparatus and method for operating the same
US20140195983A1 (en) * 2012-06-30 2014-07-10 Yangzhou Du 3d graphical user interface
US8854802B2 (en) 2010-10-22 2014-10-07 Hewlett-Packard Development Company, L.P. Display with rotatable display screen
CN104093240A (en) * 2014-06-30 2014-10-08 Guangdong Jiulian Technology Co., Ltd. System for intelligently adjusting environment light
US8878773B1 (en) 2010-05-24 2014-11-04 Amazon Technologies, Inc. Determining relative motion as input
US8892357B2 (en) 2010-09-20 2014-11-18 Honeywell International Inc. Ground navigational display, system and method displaying buildings in three-dimensions
US8942434B1 (en) 2011-12-20 2015-01-27 Amazon Technologies, Inc. Conflict resolution for pupil detection
US8947351B1 (en) * 2011-09-27 2015-02-03 Amazon Technologies, Inc. Point of view determinations for finger tracking
WO2015066037A1 (en) * 2013-10-28 2015-05-07 Brown University Virtual reality methods and systems
US9041734B2 (en) 2011-07-12 2015-05-26 Amazon Technologies, Inc. Simulating three-dimensional features
CN104661015A (en) * 2015-02-06 2015-05-27 Wuhan Yeqi Industrial Design Co., Ltd. Virtual reality simulation display equipment of 3D real scene
EP2669885A3 (en) * 2012-05-28 2015-06-10 Acer Incorporated Transparent display device and transparency adjustment method thereof
TWI499964B (en) * 2013-09-17 2015-09-11 Genius Toy Taiwan Co Ltd Method, system and computer program product for real-time touchless interaction
US9164581B2 (en) 2010-10-22 2015-10-20 Hewlett-Packard Development Company, L.P. Augmented reality display system and method of display
US20150332084A1 (en) * 2004-08-12 2015-11-19 Bioscrypt, Inc. Device for biometrically controlling a face surface
WO2015195549A1 (en) * 2014-06-16 2015-12-23 Vladimir Vaganov 3d digital painting
US9223415B1 (en) 2012-01-17 2015-12-29 Amazon Technologies, Inc. Managing resource usage for task performance
US20160041625A1 (en) * 2010-07-20 2016-02-11 Apple Inc. Adaptive Projector
US9269012B2 (en) 2013-08-22 2016-02-23 Amazon Technologies, Inc. Multi-tracker object tracking
US9268407B1 (en) * 2012-10-10 2016-02-23 Amazon Technologies, Inc. Interface elements for managing gesture control
US9317113B1 (en) 2012-05-31 2016-04-19 Amazon Technologies, Inc. Gaze assisted object recognition
US9440484B2 (en) 2010-06-01 2016-09-13 Vladimir Vaganov 3D digital painting
US9489102B2 (en) 2010-10-22 2016-11-08 Hewlett-Packard Development Company, L.P. System and method of modifying lighting in a display system
WO2016191049A1 (en) * 2015-05-26 2016-12-01 Microsoft Technology Licensing, Llc Mixed-reality headset
US9588352B2 (en) 2015-01-19 2017-03-07 Samsung Display Co., Ltd. Autostereoscopic image display device with a difference image map
US9734622B2 (en) 2010-06-01 2017-08-15 Vladimir Vaganov 3D digital painting
US9766796B2 (en) 2011-06-07 2017-09-19 Sony Corporation Information processing apparatus, information processing method, and program
US10055013B2 (en) 2013-09-17 2018-08-21 Amazon Technologies, Inc. Dynamic object tracking for user interfaces
US10088924B1 (en) 2011-08-04 2018-10-02 Amazon Technologies, Inc. Overcoming motion effects in gesture recognition
US20180321816A1 (en) * 2017-05-08 2018-11-08 International Business Machines Corporation Finger direction based holographic object interaction from a distance
US10217264B2 (en) 2010-06-01 2019-02-26 Vladimir Vaganov 3D digital painting
US10481677B2 (en) * 2006-09-27 2019-11-19 Sony Corporation Display apparatus and display method
US10643398B2 (en) 2018-07-23 2020-05-05 Microsoft Technology Licensing, Llc Depth ray layer for reduced visual noise
US10922870B2 (en) 2010-06-01 2021-02-16 Vladimir Vaganov 3D digital painting
US11922594B2 (en) 2021-01-27 2024-03-05 Qualcomm Incorporated Context-aware extended reality systems

Families Citing this family (344)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0895190A3 (en) * 1997-07-18 2001-01-17 Artwings Co., Ltd. Motion detection system
US6198485B1 (en) * 1998-07-29 2001-03-06 Intel Corporation Method and apparatus for three-dimensional input entry
US6344860B1 (en) * 1998-11-27 2002-02-05 Seriate Solutions, Inc. Methods and apparatus for a stereoscopic graphic user interface
US6614422B1 (en) * 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6522395B1 (en) 1999-04-30 2003-02-18 Canesta, Inc. Noise reduction techniques suitable for three-dimensional information acquirable with CMOS-compatible image sensor ICS
US6512838B1 (en) 1999-09-22 2003-01-28 Canesta, Inc. Methods for enhancing performance and data acquired from three-dimensional image systems
JP3632563B2 (en) * 1999-10-19 2005-03-23 Toyota Industries Corporation Image positional relationship correction device, steering assist device including the image positional relationship correction device, and image positional relationship correction method
JP4052498B2 (en) 1999-10-29 2008-02-27 Ricoh Co., Ltd. Coordinate input apparatus and method
JP2001184161A (en) 1999-12-27 2001-07-06 Ricoh Co Ltd Method and device for inputting information, writing input device, method for managing written data, method for controlling display, portable electronic writing device, and recording medium
AU2001251221A1 (en) * 2000-03-29 2001-10-08 Indelelink Corporation Personalized computer peripheral
US6803906B1 (en) 2000-07-05 2004-10-12 Smart Technologies, Inc. Passive touch system and method of detecting user input
CA2412878C (en) * 2000-07-05 2015-02-03 Smart Technologies Inc. Camera-based touch system
US6611253B1 (en) * 2000-09-19 2003-08-26 Harel Cohen Virtual input environment
CA2328795A1 (en) 2000-12-19 2002-06-19 Advanced Numerical Methods Ltd. Applications and performance enhancements for detail-in-context viewing technology
US8026944B1 (en) * 2001-04-12 2011-09-27 Sony Corporation Method and apparatus for hosting a network camera with image degradation
CA2345803A1 (en) * 2001-05-03 2002-11-03 Idelix Software Inc. User interface elements for pliable display technology implementations
US8416266B2 (en) 2001-05-03 2013-04-09 Noregin Assets N.V., L.L.C. Interacting with detail-in-context presentations
US7213214B2 (en) 2001-06-12 2007-05-01 Idelix Software Inc. Graphical user interface with zoom for detail-in-context presentations
US9760235B2 (en) 2001-06-12 2017-09-12 Callahan Cellular L.L.C. Lens-defined adjustment of displays
US7084886B2 (en) 2002-07-16 2006-08-01 Idelix Software Inc. Using detail-in-context lenses for accurate digital image cropping and measurement
US6478432B1 (en) 2001-07-13 2002-11-12 Chad D. Dyner Dynamically generated interactive real imaging device
CA2361341A1 (en) 2001-11-07 2003-05-07 Idelix Software Inc. Use of detail-in-context presentation on stereoscopically paired images
CA2370752A1 (en) * 2002-02-05 2003-08-05 Idelix Software Inc. Fast rendering of pyramid lens distorted raster images
US20030179249A1 (en) * 2002-02-12 2003-09-25 Frank Sauer User interface for three-dimensional data sets
US7904826B2 (en) * 2002-03-29 2011-03-08 Microsoft Corporation Peek around user interface
US7203911B2 (en) * 2002-05-13 2007-04-10 Microsoft Corporation Altering a display on a viewing device based upon a user proximity to the viewing device
US20040001144A1 (en) * 2002-06-27 2004-01-01 Mccharles Randy Synchronization of camera images in camera-based touch system to enhance position determination of fast moving objects
US6857746B2 (en) * 2002-07-01 2005-02-22 Io2 Technology, Llc Method and system for free-space imaging display and interface
US8120624B2 (en) 2002-07-16 2012-02-21 Noregin Assets N.V. L.L.C. Detail-in-context lenses for digital image cropping, measurement and online maps
CA2393887A1 (en) 2002-07-17 2004-01-17 Idelix Software Inc. Enhancements to user interface for detail-in-context data presentation
US8570378B2 (en) * 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US20060256081A1 (en) * 2002-07-27 2006-11-16 Sony Computer Entertainment America Inc. Scheme for detecting and tracking user manipulation of a game controller body
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
JP4298407B2 (en) * 2002-09-30 2009-07-22 Canon Inc. Video composition apparatus and video composition method
CA2406131A1 (en) 2002-09-30 2004-03-30 Idelix Software Inc. A graphical user interface using detail-in-context folding
CA2449888A1 (en) 2003-11-17 2005-05-17 Idelix Software Inc. Navigating large images using detail-in-context fisheye rendering techniques
US20070097109A1 (en) * 2005-10-18 2007-05-03 Idelix Software Inc. Method and system for generating detail-in-context presentations in client/server systems
US6954197B2 (en) * 2002-11-15 2005-10-11 Smart Technologies Inc. Size/scale and orientation determination of a pointer in a camera-based touch system
CA2411898A1 (en) 2002-11-15 2004-05-15 Idelix Software Inc. A method and system for controlling access to detail-in-context presentations
US7629967B2 (en) 2003-02-14 2009-12-08 Next Holdings Limited Touch screen signal processing
US8508508B2 (en) * 2003-02-14 2013-08-13 Next Holdings Limited Touch screen signal processing with single-point calibration
US8456447B2 (en) 2003-02-14 2013-06-04 Next Holdings Limited Touch screen signal processing
US7063256B2 (en) * 2003-03-04 2006-06-20 United Parcel Service Of America Item tracking and processing systems and methods
US7090134B2 (en) * 2003-03-04 2006-08-15 United Parcel Service Of America, Inc. System for projecting a handling instruction onto a moving item or parcel
US7532206B2 (en) 2003-03-11 2009-05-12 Smart Technologies Ulc System and method for differentiating between pointers used to contact touch surface
JP4134785B2 (en) * 2003-03-28 2008-08-20 Denso Corporation Display device
EP1614159B1 (en) * 2003-04-11 2014-02-26 Microsoft Corporation Method and system to differentially enhance sensor dynamic range
US7176438B2 (en) * 2003-04-11 2007-02-13 Canesta, Inc. Method and system to differentially enhance sensor dynamic range using enhanced common mode reset
US6783252B1 (en) * 2003-04-21 2004-08-31 Infocus Corporation System and method for displaying projector system identification information
CA2530987C (en) * 2003-07-03 2012-04-17 Holotouch, Inc. Holographic human-machine interfaces
US7874917B2 (en) 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US7274356B2 (en) 2003-10-09 2007-09-25 Smart Technologies Inc. Apparatus for determining the location of a pointer within a region of interest
JP4478863B2 (en) * 2003-11-19 2010-06-09 Sony Corporation Display device, bidirectional communication system, and display information utilization method
US7355593B2 (en) 2004-01-02 2008-04-08 Smart Technologies, Inc. Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region
US7232986B2 (en) * 2004-02-17 2007-06-19 Smart Technologies Inc. Apparatus for detecting a pointer within a region of interest
JP2007531951A (en) * 2004-04-05 2007-11-08 Michael A. Vesely Horizontal perspective display
US7486302B2 (en) * 2004-04-14 2009-02-03 Noregin Assets N.V., L.L.C. Fisheye lens graphical user interfaces
US7460110B2 (en) 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
US7492357B2 (en) 2004-05-05 2009-02-17 Smart Technologies Ulc Apparatus and method for detecting a pointer relative to a touch surface
US7538759B2 (en) 2004-05-07 2009-05-26 Next Holdings Limited Touch panel display system with illumination and detection provided from a single edge
US8120596B2 (en) 2004-05-21 2012-02-21 Smart Technologies Ulc Tiled touch system
US8106927B2 (en) 2004-05-28 2012-01-31 Noregin Assets N.V., L.L.C. Graphical user interfaces and occlusion prevention for fisheye lenses with line segment foci
US7295220B2 (en) * 2004-05-28 2007-11-13 National University Of Singapore Interactive system and method
US20050275915A1 (en) * 2004-06-01 2005-12-15 Vesely Michael A Multi-plane horizontal perspective display
US9317945B2 (en) 2004-06-23 2016-04-19 Callahan Cellular L.L.C. Detail-in-context lenses for navigation
US7561717B2 (en) * 2004-07-09 2009-07-14 United Parcel Service Of America, Inc. System and method for displaying item information
US7714859B2 (en) 2004-09-03 2010-05-11 Shoemaker Garth B D Occlusion reduction and magnification for multidimensional data presentations
US20060050070A1 (en) * 2004-09-07 2006-03-09 Canon Kabushiki Kaisha Information processing apparatus and method for presenting image combined with virtual image
CN100573231C (en) * 2004-09-08 2009-12-23 Nippon Telegraph and Telephone Corporation 3D displaying method and device
US7995078B2 (en) 2004-09-29 2011-08-09 Noregin Assets, N.V., L.L.C. Compound lenses for multi-source data presentation
US20060126926A1 (en) * 2004-11-30 2006-06-15 Vesely Michael A Horizontal perspective representation
US7307675B2 (en) * 2004-12-07 2007-12-11 Planar Systems, Inc. Display panel with backlighting structure and selectively transmissive window therethrough
US20060152482A1 (en) * 2005-01-07 2006-07-13 Chauncy Godwin Virtual interface and control device
US7580036B2 (en) 2005-04-13 2009-08-25 Catherine Montagnese Detail-in-context terrain displacement algorithm with optimizations
DE102005017313A1 (en) * 2005-04-14 2006-10-19 Volkswagen Ag Method for displaying information in a means of transport and instrument cluster for a motor vehicle
US8487910B2 (en) * 2005-05-02 2013-07-16 Smart Technologies Ulc Large scale touch system and methods for interacting with same
US20060252978A1 (en) * 2005-05-09 2006-11-09 Vesely Michael A Biofeedback eyewear system
US20060250391A1 (en) * 2005-05-09 2006-11-09 Vesely Michael A Three dimensional horizontal perspective workstation
US8717423B2 (en) 2005-05-09 2014-05-06 Zspace, Inc. Modifying perspective of stereoscopic images based on changes in user viewpoint
US7875132B2 (en) * 2005-05-31 2011-01-25 United Technologies Corporation High temperature aluminum alloys
US20070043466A1 (en) * 2005-08-18 2007-02-22 Vesely Michael A Stereoscopic display using polarized eyewear
US20070040905A1 (en) * 2005-08-18 2007-02-22 Vesely Michael A Stereoscopic display using polarized eyewear
US8031206B2 (en) 2005-10-12 2011-10-04 Noregin Assets N.V., L.L.C. Method and system for generating pyramid fisheye lens detail-in-context presentations
GB0522968D0 (en) 2005-11-11 2005-12-21 Popovich Milan M Holographic illumination device
WO2007064633A1 (en) * 2005-11-29 2007-06-07 The Board Of Trustees Of The University Of Illinois Virtual reality display system
US8279168B2 (en) * 2005-12-09 2012-10-02 Edge 3 Technologies Llc Three-dimensional virtual-touch human-machine interface system and method therefor
US20070165007A1 (en) * 2006-01-13 2007-07-19 Gerald Morrison Interactive input system
US20070205994A1 (en) * 2006-03-02 2007-09-06 Taco Van Ieperen Touch system and method for interacting with the same
GB0718706D0 (en) 2007-09-25 2007-11-07 Creative Physics Ltd Method and apparatus for reducing laser speckle
US7983473B2 (en) 2006-04-11 2011-07-19 Noregin Assets, N.V., L.L.C. Transparency adjustment of a presentation
EP1847963A1 (en) * 2006-04-20 2007-10-24 Koninklijke KPN N.V. Method and system for displaying visual information on a display
US8972902B2 (en) 2008-08-22 2015-03-03 Northrop Grumman Systems Corporation Compound gesture recognition
US9696808B2 (en) 2006-07-13 2017-07-04 Northrop Grumman Systems Corporation Hand-gesture recognition method
US8589824B2 (en) * 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
US8180114B2 (en) 2006-07-13 2012-05-15 Northrop Grumman Systems Corporation Gesture recognition interface system with vertical display
US8234578B2 (en) 2006-07-25 2012-07-31 Northrop Grumman Systems Corporation Networked gesture collaboration system
US8432448B2 (en) 2006-08-10 2013-04-30 Northrop Grumman Systems Corporation Stereo camera intrusion detection system
US20080055194A1 (en) * 2006-08-31 2008-03-06 Motorola, Inc. Method and system for context based user interface information presentation and positioning
US8395658B2 (en) * 2006-09-07 2013-03-12 Sony Computer Entertainment Inc. Touch screen-like user interface that does not require actual touching
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
USRE48417E1 (en) 2006-09-28 2021-02-02 Sony Interactive Entertainment Inc. Object direction using video input combined with tilt angle information
WO2008038205A2 (en) * 2006-09-28 2008-04-03 Koninklijke Philips Electronics N.V. 3D menu display
US8781151B2 (en) 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US9442607B2 (en) 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method
US8144148B2 (en) 2007-02-08 2012-03-27 Edge 3 Technologies Llc Method and system for vision-based interaction in a virtual environment
JP2008219788A (en) * 2007-03-07 2008-09-18 Toshiba Corp Stereoscopic image display device, and method and program therefor
US8115753B2 (en) 2007-04-11 2012-02-14 Next Holdings Limited Touch screen system with hover and click input methods
WO2008152932A1 (en) * 2007-06-13 2008-12-18 Nec Corporation Image display device, image display method and image display program
US8094137B2 (en) 2007-07-23 2012-01-10 Smart Technologies Ulc System and method of detecting contact on a display
US9026938B2 (en) 2007-07-26 2015-05-05 Noregin Assets N.V., L.L.C. Dynamic detail-in-context user interface for application access and content access on electronic displays
AU2008280952A1 (en) 2007-08-30 2009-03-19 Next Holdings Ltd Low profile touch panel systems
CN101802760B (en) 2007-08-30 2013-03-20 Next Holdings Limited Optical touch screen with improved illumination
US7881901B2 (en) * 2007-09-18 2011-02-01 Gefemer Research Acquisitions, Llc Method and apparatus for holographic user interface communication
US20090102603A1 (en) * 2007-10-19 2009-04-23 Fein Gene S Method and apparatus for providing authentication with a user interface system
CN101174332B (en) * 2007-10-29 2010-11-03 Zhang Jianzhong Method, device and system for interactively combining real-time scene in real world with virtual reality scene
US20090109174A1 (en) * 2007-10-30 2009-04-30 Fein Gene S Method and Apparatus for User Interface in Electronic Devices With Visual Display Units
US8212768B2 (en) * 2007-10-31 2012-07-03 Fimed Properties Ag Limited Liability Company Digital, data, and multimedia user interface with a keyboard
US8127251B2 (en) * 2007-10-31 2012-02-28 Fimed Properties Ag Limited Liability Company Method and apparatus for a user interface with priority data
US8477098B2 (en) 2007-10-31 2013-07-02 Gene S. Fein Method and apparatus for user interface of input devices
US20090109215A1 (en) 2007-10-31 2009-04-30 Fein Gene S Method and apparatus for user interface communication with an image manipulator
US8139110B2 (en) 2007-11-01 2012-03-20 Northrop Grumman Systems Corporation Calibration of a gesture recognition interface system
US9377874B2 (en) 2007-11-02 2016-06-28 Northrop Grumman Systems Corporation Gesture recognition light and video image projector
US8405636B2 (en) 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US8166421B2 (en) * 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US8902193B2 (en) 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
US8345920B2 (en) 2008-06-20 2013-01-01 Northrop Grumman Systems Corporation Gesture recognition interface system with a light-diffusive screen
US9191238B2 (en) * 2008-07-23 2015-11-17 Yahoo! Inc. Virtual notes in a reality overlay
JP4625515B2 (en) * 2008-09-24 2011-02-02 Fujifilm Corporation Three-dimensional imaging apparatus, method, and program
DE102008049407A1 (en) * 2008-09-29 2010-04-01 Carl Zeiss Ag Display device and display method
KR20100041006A (en) 2008-10-13 2010-04-22 LG Electronics Inc. A user interface controlling method using three dimension multi-touch
US8339378B2 (en) * 2008-11-05 2012-12-25 Smart Technologies Ulc Interactive input system with multi-angle reflector
US20100225588A1 (en) * 2009-01-21 2010-09-09 Next Holdings Limited Methods And Systems For Optical Detection Of Gestures
JP5136442B2 (en) * 2009-01-27 2013-02-06 Brother Industries, Ltd. Head mounted display
WO2010103482A2 (en) * 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
US11726332B2 (en) 2009-04-27 2023-08-15 Digilens Inc. Diffractive projection apparatus
US9335604B2 (en) 2013-12-11 2016-05-10 Milan Momcilo Popovich Holographic waveguide display
JP5409107B2 (en) * 2009-05-13 2014-02-05 Nintendo Co., Ltd. Display control program, information processing apparatus, display control method, and information processing system
US20100309197A1 (en) * 2009-06-08 2010-12-09 Nvidia Corporation Interaction of stereoscopic objects with physical objects in viewing area
US8692768B2 (en) 2009-07-10 2014-04-08 Smart Technologies Ulc Interactive input system
JP2011035592A (en) * 2009-07-31 2011-02-17 Nintendo Co., Ltd. Display control program and information processing system
DE102009037835B4 (en) 2009-08-18 2012-12-06 Metaio Gmbh Method for displaying virtual information in a real environment
US9341846B2 (en) 2012-04-25 2016-05-17 Rockwell Collins Inc. Holographic wide angle display
JP5405264B2 (en) * 2009-10-20 2014-02-05 Nintendo Co., Ltd. Display control program, library program, information processing system, and display control method
US20110095977A1 (en) * 2009-10-23 2011-04-28 Smart Technologies Ulc Interactive input system incorporating multi-angle reflecting structure
JP4754031B2 (en) * 2009-11-04 2011-08-24 Nintendo Co., Ltd. Display control program, information processing system, and program used for stereoscopic display control
KR101657565B1 (en) * 2010-04-21 2016-09-19 LG Electronics Inc. Augmented Remote Controller and Method of Operating the Same
KR20110118421A (en) * 2010-04-23 2011-10-31 LG Electronics Inc. Augmented remote controller, augmented remote controller controlling method and the system for the same
KR101694159B1 (en) * 2010-04-21 2017-01-09 LG Electronics Inc. Augmented Remote Controller and Method of Operating the Same
US20110164032A1 (en) * 2010-01-07 2011-07-07 Prime Sense Ltd. Three-Dimensional User Interface
EP2355526A3 (en) 2010-01-14 2012-10-31 Nintendo Co., Ltd. Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method
US8717360B2 (en) 2010-01-29 2014-05-06 Zspace, Inc. Presenting a view within a three dimensional scene
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US8482859B2 (en) 2010-02-28 2013-07-09 Osterhout Group, Inc. See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US8472120B2 (en) 2010-02-28 2013-06-25 Osterhout Group, Inc. See-through near-eye display glasses with a small scale image source
US9759917B2 (en) * 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US8488246B2 (en) 2010-02-28 2013-07-16 Osterhout Group, Inc. See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US8467133B2 (en) 2010-02-28 2013-06-18 Osterhout Group, Inc. See-through display with an optical assembly including a wedge-shaped illumination system
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US8477425B2 (en) 2010-02-28 2013-07-02 Osterhout Group, Inc. See-through near-eye display glasses including a partially reflective, partially transmitting optical element
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9285589B2 (en) * 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
CN102906623A (en) 2010-02-28 2013-01-30 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US20110227911A1 (en) * 2010-03-22 2011-09-22 Lg Electronics Inc. Image display device and method for operating the same
US9990062B2 (en) * 2010-03-26 2018-06-05 Nokia Technologies Oy Apparatus and method for proximity based input
EP2372512A1 (en) 2010-03-30 2011-10-05 Harman Becker Automotive Systems GmbH Vehicle user interface unit for a vehicle electronic device
US8558756B2 (en) 2010-04-30 2013-10-15 International Business Machines Corporation Displaying messages on created collections of displays
US9693039B2 (en) 2010-05-27 2017-06-27 Nintendo Co., Ltd. Hand-held electronic device
US8780059B2 (en) 2010-05-28 2014-07-15 Nokia Corporation User interface
US9030536B2 (en) 2010-06-04 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for presenting media content
US8593398B2 (en) 2010-06-25 2013-11-26 Nokia Corporation Apparatus and method for proximity based input
US8640182B2 (en) 2010-06-30 2014-01-28 At&T Intellectual Property I, L.P. Method for detecting a viewing apparatus
US9787974B2 (en) 2010-06-30 2017-10-10 At&T Intellectual Property I, L.P. Method and apparatus for delivering media content
US8593574B2 (en) 2010-06-30 2013-11-26 At&T Intellectual Property I, L.P. Apparatus and method for providing dimensional media content based on detected display capability
US20120005624A1 (en) 2010-07-02 2012-01-05 Vesely Michael A User Interface Elements for Use within a Three Dimensional Scene
US8918831B2 (en) 2010-07-06 2014-12-23 At&T Intellectual Property I, Lp Method and apparatus for managing a presentation of media content
US9049426B2 (en) 2010-07-07 2015-06-02 At&T Intellectual Property I, Lp Apparatus and method for distributing three dimensional media content
US8643569B2 (en) 2010-07-14 2014-02-04 Zspace, Inc. Tools for use within a three dimensional scene
US9232274B2 (en) 2010-07-20 2016-01-05 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
US9560406B2 (en) 2010-07-20 2017-01-31 At&T Intellectual Property I, L.P. Method and apparatus for adapting a presentation of media content
US9032470B2 (en) 2010-07-20 2015-05-12 At&T Intellectual Property I, Lp Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
WO2012011044A1 (en) 2010-07-20 2012-01-26 Primesense Ltd. Interactive reality augmentation for natural interaction
KR101709497B1 (en) 2010-07-29 2017-02-23 LG Electronics Inc. Mobile terminal and operation control method thereof
KR101685980B1 (en) 2010-07-30 2016-12-13 LG Electronics Inc. Mobile terminal and method for controlling the same
US8994716B2 (en) 2010-08-02 2015-03-31 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US8605136B2 (en) 2010-08-10 2013-12-10 Sony Corporation 2D to 3D user interface content data conversion
KR101763592B1 (en) * 2010-08-16 2017-08-01 LG Electronics Inc. Method for processing image of display system outputting 3 dimensional contents and display system enabling of the method
US8438502B2 (en) 2010-08-25 2013-05-07 At&T Intellectual Property I, L.P. Apparatus for controlling three-dimensional images
US9141189B2 (en) 2010-08-26 2015-09-22 Samsung Electronics Co., Ltd. Apparatus and method for controlling interface
KR101674957B1 (en) 2010-08-31 2016-11-10 LG Electronics Inc. Mobile terminal and method for controlling thereof
KR101685982B1 (en) 2010-09-01 2016-12-13 LG Electronics Inc. Mobile terminal and method for controlling 3 dimension display thereof
KR101708696B1 (en) 2010-09-15 2017-02-21 LG Electronics Inc. Mobile terminal and operation control method thereof
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US8947511B2 (en) 2010-10-01 2015-02-03 At&T Intellectual Property I, L.P. Apparatus and method for presenting three-dimensional media content
US9110564B2 (en) 2010-11-05 2015-08-18 Lg Electronics Inc. Mobile terminal, method for controlling mobile terminal, and method for displaying image of mobile terminal
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8589822B2 (en) 2010-12-10 2013-11-19 International Business Machines Corporation Controlling three-dimensional views of selected portions of content
TW201228360A (en) * 2010-12-22 2012-07-01 Largan Precision Co Ltd Stereo display device
US9354718B2 (en) * 2010-12-22 2016-05-31 Zspace, Inc. Tightly coupled interactive stereo display
DE102010056042A1 (en) * 2010-12-23 2012-06-28 Yxlon International Gmbh Method and device for visual inspection of a test object to be checked by means of X-ray radiation
JP5711962B2 (en) 2010-12-27 2015-05-07 Sony Computer Entertainment Inc. Gesture operation input processing apparatus and gesture operation input processing method
US9618972B2 (en) 2011-01-20 2017-04-11 Blackberry Limited Three-dimensional, multi-depth presentation of icons in association with differing input components of a user interface
US9582144B2 (en) 2011-01-20 2017-02-28 Blackberry Limited Three-dimensional, multi-depth presentation of icons associated with a user interface
EP3527121B1 (en) 2011-02-09 2023-08-23 Apple Inc. Gesture detection in a 3d mapping environment
TWI529572B (en) * 2011-02-23 2016-04-11 PixArt Imaging Inc. Method for detecting operation object and touch device
CN102654802B (en) * 2011-03-04 2015-01-07 PixArt Imaging Inc. Detecting method for manipulation object and touch control device
US9274349B2 (en) 2011-04-07 2016-03-01 Digilens Inc. Laser despeckler based on angular diversity
US8810598B2 (en) 2011-04-08 2014-08-19 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US20120274545A1 (en) * 2011-04-28 2012-11-01 Research In Motion Limited Portable electronic device and method of controlling same
US8786529B1 (en) 2011-05-18 2014-07-22 Zspace, Inc. Liquid crystal variable drive voltage
JP5670255B2 (en) * 2011-05-27 2015-02-18 Kyocera Corporation Display device
US9445046B2 (en) 2011-06-24 2016-09-13 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content with telepresence
US8947497B2 (en) 2011-06-24 2015-02-03 At&T Intellectual Property I, Lp Apparatus and method for managing telepresence sessions
US9602766B2 (en) 2011-06-24 2017-03-21 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
US9030522B2 (en) 2011-06-24 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for providing media content
JP5774387B2 (en) 2011-06-28 2015-09-09 Kyocera Corporation Display device
JP5864144B2 (en) * 2011-06-28 2016-02-17 Kyocera Corporation Display device
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US8587635B2 (en) 2011-07-15 2013-11-19 At&T Intellectual Property I, L.P. Apparatus and method for providing media services with telepresence
JP5922349B2 (en) * 2011-07-27 2016-05-24 Kyocera Corporation Display device, control system and control program
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US10670876B2 (en) 2011-08-24 2020-06-02 Digilens Inc. Waveguide laser illuminator incorporating a despeckler
WO2016020630A2 (en) 2014-08-08 2016-02-11 Milan Momcilo Popovich Waveguide laser illuminator incorporating a despeckler
EP2748670B1 (en) 2011-08-24 2015-11-18 Rockwell Collins, Inc. Wearable data display
DE102011112618A1 (en) * 2011-09-08 2013-03-14 Eads Deutschland Gmbh Interaction with a three-dimensional virtual scenario
CN103108197A (en) 2011-11-14 2013-05-15 Nvidia Corporation Priority level compression method and priority level compression system for three-dimensional (3D) video wireless display
WO2013074997A1 (en) * 2011-11-18 2013-05-23 Infinite Z, Inc. Indirect 3d scene positioning control
EP2788839A4 (en) * 2011-12-06 2015-12-16 Thomson Licensing Method and system for responding to user's selection gesture of object displayed in three dimensions
US8412413B1 (en) * 2011-12-21 2013-04-02 Delphi Technologies, Inc. Vehicle windshield display with obstruction detection
US20150010265 (en) 2012-01-06 2015-01-08 Milan Momcilo Popovich Contact image sensor using switchable Bragg gratings
US9829715B2 (en) 2012-01-23 2017-11-28 Nvidia Corporation Eyewear device for transmitting signal and communication method thereof
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US9377863B2 (en) 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US9456744B2 (en) 2012-05-11 2016-10-04 Digilens, Inc. Apparatus for eye tracking
TWI576771B (en) * 2012-05-28 2017-04-01 Acer Incorporated Transparent display device and transparency adjustment method thereof
CN103514841A (en) * 2012-06-15 2014-01-15 Acer Incorporated Transparent display device and transparency adjustment method thereof
US20180048750A1 (en) * 2012-06-15 2018-02-15 Muzik, Llc Audio/video wearable computer system with integrated projector
US9578224B2 (en) 2012-09-10 2017-02-21 Nvidia Corporation System and method for enhanced monoimaging
US9407961B2 (en) * 2012-09-14 2016-08-02 Intel Corporation Media stream selective decode based on window visibility state
US9933684B2 (en) * 2012-11-16 2018-04-03 Rockwell Collins, Inc. Transparent waveguide display providing upper and lower fields of view having a specific light output aperture configuration
JP5974238B2 (en) * 2012-12-25 2016-08-23 Toshiba Medical Systems Corporation Image processing system, apparatus, method, and medical image diagnostic apparatus
CN104076910B (en) * 2013-03-25 2017-11-03 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
US9323338B2 (en) 2013-04-12 2016-04-26 Usens, Inc. Interactive input system and method
US20140354602A1 (en) * 2013-04-12 2014-12-04 Impression.Pi, Inc. Interactive input system and method
WO2014188149A1 (en) 2013-05-20 2014-11-27 Milan Momcilo Popovich Holographic waveguide eye tracker
KR20140139847A (en) * 2013-05-28 2014-12-08 Samsung Display Co., Ltd. Stereoscopic display device, image processing device and image processing method
US9280259B2 (en) 2013-07-26 2016-03-08 Blackberry Limited System and method for manipulating an object in a three-dimensional desktop environment
WO2015017242A1 (en) * 2013-07-28 2015-02-05 Deluca Michael J Augmented reality based user interfacing
US9727772B2 (en) 2013-07-31 2017-08-08 Digilens, Inc. Method and apparatus for contact image sensing
US9390598B2 (en) 2013-09-11 2016-07-12 Blackberry Limited Three dimensional haptics hybrid modeling
US9158115B1 (en) * 2013-09-16 2015-10-13 Amazon Technologies, Inc. Touch control for immersion in a tablet goggles accessory
US9208765B1 (en) 2013-09-18 2015-12-08 American Megatrends, Inc. Audio visual presentation with three-dimensional display devices
US9411511B1 (en) * 2013-09-19 2016-08-09 American Megatrends, Inc. Three-dimensional display devices with out-of-screen virtual keyboards
US9582516B2 (en) 2013-10-17 2017-02-28 Nant Holdings Ip, Llc Wide area augmented reality location-based services
US9883173B2 (en) 2013-12-25 2018-01-30 3Di Llc Stereoscopic display
US10652525B2 (en) 2013-10-31 2020-05-12 3Di Llc Quad view display system
US11343487B2 (en) 2013-10-31 2022-05-24 David Woods Trackable glasses system for perspective views of a display
US10116914B2 (en) * 2013-10-31 2018-10-30 3Di Llc Stereoscopic display
US9986228B2 (en) 2016-03-24 2018-05-29 3Di Llc Trackable glasses system that provides multiple views of a shared display
US20150185599A1 (en) * 2013-12-31 2015-07-02 Brian Mullins Audio based on captured image data of visual content
KR20150081599A (en) * 2014-01-06 2015-07-15 Samsung Electronics Co., Ltd. Display apparatus and method for controlling the same
US10935788B2 (en) 2014-01-24 2021-03-02 Nvidia Corporation Hybrid virtual 3D rendering approach to stereovision
US9928728B2 (en) 2014-05-09 2018-03-27 Sony Interactive Entertainment Inc. Scheme for embedding a control signal in an audio signal using pseudo white noise
US10613585B2 (en) * 2014-06-19 2020-04-07 Samsung Electronics Co., Ltd. Transparent display apparatus, group play system using transparent display apparatus and performance methods thereof
US9298010B2 (en) 2014-08-08 2016-03-29 Marissa J. Sundquist Wearable optical display with audio functionality
US9779633B2 (en) 2014-08-08 2017-10-03 Greg Van Curen Virtual reality system enabling compatibility of sense of immersion in virtual space and movement in real space, and battle training system using same
US9599821B2 (en) 2014-08-08 2017-03-21 Greg Van Curen Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space
US10359736B2 (en) 2014-08-08 2019-07-23 Digilens Inc. Method for holographic mastering and replication
KR101594839B1 (en) * 2014-09-12 2016-02-17 Korea University Industry-Academic Cooperation Foundation Method and apparatus for providing prefrontal activity game
WO2016042283A1 (en) 2014-09-19 2016-03-24 Milan Momcilo Popovich Method and apparatus for generating input images for holographic waveguide displays
US10423222B2 (en) 2014-09-26 2019-09-24 Digilens Inc. Holographic waveguide optical tracker
EP3245551B1 (en) 2015-01-12 2019-09-18 DigiLens Inc. Waveguide light field displays
CN107873086B (en) 2015-01-12 2020-03-20 Digilens Inc. Environmentally isolated waveguide display
WO2016116733A1 (en) 2015-01-20 2016-07-28 Milan Momcilo Popovich Holographic waveguide lidar
US9632226B2 (en) 2015-02-12 2017-04-25 Digilens Inc. Waveguide grating device
US11468639B2 (en) * 2015-02-20 2022-10-11 Microsoft Technology Licensing, Llc Selective occlusion system for augmented reality devices
JP2016162162A (en) * 2015-03-02 2016-09-05 Ricoh Co., Ltd. Contact detection device, projector device, electronic blackboard device, digital signage device, projector system, and contact detection method
JP6053845B2 (en) * 2015-03-09 2016-12-27 Sony Interactive Entertainment Inc. Gesture operation input processing device, three-dimensional display device, and gesture operation input processing method
WO2016146963A1 (en) 2015-03-16 2016-09-22 Milan Momcilo Popovich Waveguide device incorporating a light pipe
WO2016156776A1 (en) 2015-03-31 2016-10-06 Milan Momcilo Popovich Method and apparatus for contact image sensing
CN104765156B (en) * 2015-04-22 2017-11-21 BOE Technology Group Co., Ltd. Three-dimensional display apparatus and 3D display method
CN104820497B (en) * 2015-05-08 2017-12-22 Donghua University 3D interactive display system based on augmented reality
US10109110B2 (en) 2015-06-29 2018-10-23 International Business Machines Corporation Reality augmentation to eliminate, or de-emphasize, selected portions of base image
US10397469B1 (en) * 2015-08-31 2019-08-27 Snap Inc. Dynamic image-based adjustment of image capture parameters
CN105278108A (en) * 2015-09-10 2016-01-27 Shanghai Lixin Optical Technology Co., Ltd. Double-screen stereo imaging augmented reality system
US9937420B2 (en) 2015-09-29 2018-04-10 Sony Interactive Entertainment Inc. Method and apparatus for the projection of images, video, and/or holograms generated by a computer simulation
WO2017060665A1 (en) 2015-10-05 2017-04-13 Milan Momcilo Popovich Waveguide display
CN105280111B (en) * 2015-11-11 2018-01-09 Wuhan China Star Optoelectronics Technology Co., Ltd. Transparent display
US9767606B2 (en) * 2016-01-12 2017-09-19 Lenovo (Singapore) Pte. Ltd. Automatic modification of augmented reality objects
US10296088B2 (en) * 2016-01-26 2019-05-21 Futurewei Technologies, Inc. Haptic correlated graphic effects
EP3398007A1 (en) 2016-02-04 2018-11-07 DigiLens, Inc. Holographic waveguide optical tracker
US9906981B2 (en) 2016-02-25 2018-02-27 Nvidia Corporation Method and system for dynamic regulation and control of Wi-Fi scans
JP6895451B2 (en) 2016-03-24 2021-06-30 Digilens Inc. Methods and Devices for Providing Polarized Selective Holography Waveguide Devices
JP6734933B2 (en) 2016-04-11 2020-08-05 Digilens Inc. Holographic Waveguide Device for Structured Light Projection
US10019831B2 (en) * 2016-10-20 2018-07-10 Zspace, Inc. Integrating real world conditions into virtual imagery
WO2018102834A2 (en) 2016-12-02 2018-06-07 Digilens, Inc. Waveguide device with uniform output illumination
US10545346B2 (en) 2017-01-05 2020-01-28 Digilens Inc. Wearable heads up displays
US10146300B2 (en) 2017-01-25 2018-12-04 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Emitting a visual indicator from the position of an object in a simulated reality emulation
CA3045780A1 (en) * 2017-01-30 2018-08-02 Novartis Ag Systems and method for augmented reality ophthalmic surgical microscope projection
JP2018137505A (en) * 2017-02-20 2018-08-30 Seiko Epson Corporation Display device and control method thereof
US10471478B2 (en) 2017-04-28 2019-11-12 United Parcel Service Of America, Inc. Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same
US11023109B2 (en) * 2017-06-30 2021-06-01 Microsoft Technology Licensing, LLC Annotation using a multi-device mixed interactivity system
US10338400B2 (en) 2017-07-03 2019-07-02 Holovisions LLC Augmented reality eyewear with VAPE or wear technology
US10859834B2 (en) 2017-07-03 2020-12-08 Holovisions Space-efficient optical structures for wide field-of-view augmented reality (AR) eyewear
EP3698214A4 (en) 2017-10-16 2021-10-27 Digilens Inc. Systems and methods for multiplying the image resolution of a pixelated display
KR20200108030A (en) 2018-01-08 2020-09-16 Digilens Inc. System and method for high throughput recording of holographic gratings in waveguide cells
US10914950B2 (en) 2018-01-08 2021-02-09 Digilens Inc. Waveguide architectures and related methods of manufacturing
US10690851B2 (en) 2018-03-16 2020-06-23 Digilens Inc. Holographic waveguides incorporating birefringence control and methods for their fabrication
US10504289B2 (en) 2018-05-01 2019-12-10 Dell Products, Lp Method and apparatus for securely displaying private information using an augmented reality headset
CN112105983B (en) * 2018-05-08 2023-07-07 Apple Inc. Enhanced visual ability
US11402801B2 (en) 2018-07-25 2022-08-02 Digilens Inc. Systems and methods for fabricating a multilayer optical structure
WO2020084625A1 (en) * 2018-10-25 2020-04-30 Beyeonics Surgical Ltd. Ui for head mounted display system
JP2022520472A (en) 2019-02-15 2022-03-30 Digilens Inc. Methods and apparatus for providing holographic waveguide displays using integrated gratings
DE102019105764B3 (en) 2019-03-07 2020-08-06 Gestigon Gmbh Method for calibrating a user interface and user interface
KR20210134763A (en) 2019-03-12 2021-11-10 Digilens Inc. Holographic waveguide backlights and related manufacturing methods
KR20220016990A (en) 2019-06-07 2022-02-10 Digilens Inc. Waveguides incorporating transmission and reflection gratings and related manufacturing methods
JP2022543571A (en) 2019-07-29 2022-10-13 Digilens Inc. Method and Apparatus for Multiplying Image Resolution and Field of View for Pixelated Displays
EP4022370A4 (en) 2019-08-29 2023-08-30 Digilens Inc. Evacuating bragg gratings and methods of manufacturing
JP7330507B2 (en) * 2019-12-13 2023-08-22 Agama-X Co., Ltd. Information processing device, program and method
DE102020202624A1 (en) 2020-03-02 2021-09-02 Carl Zeiss Meditec Ag Head-worn visualization system
US11938907B2 (en) 2020-10-29 2024-03-26 Oliver Crispin Robotics Limited Systems and methods of servicing equipment
US11685051B2 (en) 2020-10-29 2023-06-27 General Electric Company Systems and methods of servicing equipment
US11935290B2 (en) * 2020-10-29 2024-03-19 Oliver Crispin Robotics Limited Systems and methods of servicing equipment
US11874653B2 (en) 2020-10-29 2024-01-16 Oliver Crispin Robotics Limited Systems and methods of servicing equipment
US11915531B2 (en) 2020-10-29 2024-02-27 General Electric Company Systems and methods of servicing equipment
US11794107B2 (en) * 2020-12-30 2023-10-24 Activision Publishing, Inc. Systems and methods for improved collision detection in video games

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4649425A (en) 1983-07-25 1987-03-10 Pund Marvin L Stereoscopic display
FR2572812B1 (en) * 1984-11-06 1987-03-06 Sintra Alcatel Sa VISUALIZATION SYSTEM INCLUDING A TRANSPARENT EMISSIVE SCREEN
US4812829A (en) 1986-05-17 1989-03-14 Hitachi, Ltd. Three-dimensional display device and method for pointing displayed three-dimensional image
US4808979A (en) 1987-04-02 1989-02-28 Tektronix, Inc. Cursor for use in 3-D imaging systems
US4987487A (en) 1988-08-12 1991-01-22 Nippon Telegraph And Telephone Corporation Method of stereoscopic images display which compensates electronically for viewer head movement
JP3129719B2 (en) * 1989-04-21 2001-01-31 Paruka Co., Ltd. Video display device
JP2762677B2 (en) * 1990-04-24 1998-06-04 Sony Corporation Optical device
CA2087523C (en) 1990-07-17 1997-04-15 Mark Andrew Shackleton Method of processing an image
US5025314A (en) * 1990-07-30 1991-06-18 Xerox Corporation Apparatus allowing remote interactive use of a plurality of writing surfaces
WO1992007350A1 (en) 1990-10-15 1992-04-30 National Biomedical Research Foundation Three-dimensional cursor control device
US5239373A (en) * 1990-12-26 1993-08-24 Xerox Corporation Video computational shared drawing space
US5168531A (en) * 1991-06-27 1992-12-01 Digital Equipment Corporation Real-time recognition of pointing information from video
US5162779A (en) 1991-07-22 1992-11-10 International Business Machines Corporation Point addressable cursor for stereo raster display
US5303085A (en) 1992-02-07 1994-04-12 Rallison Richard D Optically corrected helmet mounted display
EP0554492B1 (en) * 1992-02-07 1995-08-09 International Business Machines Corporation Method and device for optical input of commands or data
US5838458A (en) 1992-02-25 1998-11-17 Tsai; Irving Method and apparatus for linking designated portions of a received document image with an electronic address
US5680481A (en) 1992-05-26 1997-10-21 Ricoh Corporation Facial feature extraction method and apparatus for a neural network acoustic and visual speech recognition system
US5311220A (en) 1992-06-10 1994-05-10 Dimension Technologies, Inc. Autostereoscopic display
US5721788A (en) 1992-07-31 1998-02-24 Corbis Corporation Method and system for digital image signatures
US5365370A (en) 1993-06-10 1994-11-15 Hudgins J Stephen Three dimensional viewing illusion with 2D display
US5694142A (en) * 1993-06-21 1997-12-02 General Electric Company Interactive digital arrow (d'arrow) three-dimensional (3D) pointing
US5491510A (en) 1993-12-03 1996-02-13 Texas Instruments Incorporated System and method for simultaneously viewing a scene and an obscured object
JPH08139994A (en) * 1994-11-09 1996-05-31 Hitachi Ltd Image synthesis system
US5715325A (en) 1995-08-30 1998-02-03 Siemens Corporate Research, Inc. Apparatus and method for detecting a face in a video image
US5886822A (en) 1996-10-08 1999-03-23 The Microoptical Corporation Image combining system for eyeglasses and face masks
US6008946A (en) 1997-11-07 1999-12-28 Honeywell, Inc. Ambient light display illumination for a head-mounted display
US6118414A (en) 1997-12-02 2000-09-12 Kintz; Gregory J. Virtual reality system and method
US5913591A (en) 1998-01-20 1999-06-22 University Of Washington Augmented imaging using a silhouette to improve contrast

Cited By (119)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020047835A1 (en) * 2000-09-11 2002-04-25 Tomoaki Kawai Image display apparatus and method of displaying image data
US20030179192A1 (en) * 2002-03-20 2003-09-25 Allen William J. Method and apparatus for image display
US7019736B2 (en) * 2002-03-20 2006-03-28 Hewlett-Packard Development Company, L.P. Method and apparatus for image display
US7425946B1 (en) * 2003-08-15 2008-09-16 Britton Rick A Remote camouflage keypad for alarm control panel
WO2006005799A1 (en) * 2004-05-11 2006-01-19 Ritva Laijoki-Puska Method and arrangement for presenting a virtual landscape
US20060017654A1 (en) * 2004-07-23 2006-01-26 Romo Justin R Virtual reality interactivity system and method
US20150332084A1 (en) * 2004-08-12 2015-11-19 Bioscrypt, Inc. Device for biometrically controlling a face surface
US7609847B2 (en) * 2004-11-23 2009-10-27 Hewlett-Packard Development Company, L.P. Methods and systems for determining object layouts
US20060109510A1 (en) * 2004-11-23 2006-05-25 Simon Widdowson Methods and systems for determining object layouts
US7479967B2 (en) * 2005-04-11 2009-01-20 Systems Technology Inc. System for combining virtual and real-time environments
US20070035561A1 (en) * 2005-04-11 2007-02-15 Systems Technology, Inc. System for combining virtual and real-time environments
US8111906B2 (en) * 2005-06-30 2012-02-07 Samsung Mobile Display Co., Ltd. Stereoscopic image display device
US20070003134A1 (en) * 2005-06-30 2007-01-04 Myoung-Seop Song Stereoscopic image display device
FR2895811A1 (en) * 2006-01-04 2007-07-06 Bernard Szajner Real-time control of relief images for use in computer-based three-dimensional imaging systems based on correlation between observer movement data and possible relief projection data
EP1806933A3 (en) * 2006-01-04 2013-08-21 Laurence Frison Method of interacting in real time with virtual three-dimensional images
EP1806933A2 (en) * 2006-01-04 2007-07-11 Laurence Frison Method of interacting in real time with virtual three-dimensional images
US7725547B2 (en) 2006-09-06 2010-05-25 International Business Machines Corporation Informing a user of gestures made by others out of the user's line of sight
US20080059578A1 (en) * 2006-09-06 2008-03-06 Jacob C Albertson Informing a user of gestures made by others out of the user's line of sight
US10481677B2 (en) * 2006-09-27 2019-11-19 Sony Corporation Display apparatus and display method
US20080170749A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Controlling a system based on user behavioral signals detected from a 3d captured image stream
US20080172261A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Adjusting a consumer experience based on a 3d captured image stream of a consumer response
US20080169929A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Warning a user about adverse behaviors of others within an environment based on a 3d captured image stream
WO2008084053A1 (en) * 2007-01-12 2008-07-17 International Business Machines Corporation Adjusting a consumer experience based on a 3d captured image stream of a consumer response
US8295542B2 (en) 2007-01-12 2012-10-23 International Business Machines Corporation Adjusting a consumer experience based on a 3D captured image stream of a consumer response
US20080170118A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Assisting a vision-impaired user with navigation based on a 3d captured image stream
US7792328B2 (en) 2007-01-12 2010-09-07 International Business Machines Corporation Warning a vehicle operator of unsafe operation behavior based on a 3D captured image stream
US7801332B2 (en) 2007-01-12 2010-09-21 International Business Machines Corporation Controlling a system based on user behavioral signals detected from a 3D captured image stream
US7840031B2 (en) 2007-01-12 2010-11-23 International Business Machines Corporation Tracking a range of body movement based on 3D captured image streams of a user
US20080169914A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Warning a vehicle operator of unsafe operation behavior based on a 3d captured image stream
US7877706B2 (en) 2007-01-12 2011-01-25 International Business Machines Corporation Controlling a document based on user behavioral signals detected from a 3D captured image stream
US9412011B2 (en) 2007-01-12 2016-08-09 International Business Machines Corporation Warning a user about adverse behaviors of others within an environment based on a 3D captured image stream
US9208678B2 (en) 2007-01-12 2015-12-08 International Business Machines Corporation Predicting adverse behaviors of others within an environment based on a 3D captured image stream
US20080170123A1 (en) * 2007-01-12 2008-07-17 Jacob C Albertson Tracking a range of body movement based on 3d captured image streams of a user
US7971156B2 (en) 2007-01-12 2011-06-28 International Business Machines Corporation Controlling resource access based on user gesturing in a 3D captured image stream of the user
US20080170776A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Controlling resource access based on user gesturing in a 3d captured image stream of the user
US20080170748A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Controlling a document based on user behavioral signals detected from a 3d captured image stream
US8269834B2 (en) 2007-01-12 2012-09-18 International Business Machines Corporation Warning a user about adverse behaviors of others within an environment based on a 3D captured image stream
US10354127B2 (en) 2007-01-12 2019-07-16 Sinoeast Concept Limited System, method, and computer program product for alerting a supervising user of adverse behavior of others within an environment by providing warning signals to alert the supervising user that a predicted behavior of a monitored user represents an adverse behavior
US8577087B2 (en) 2007-01-12 2013-11-05 International Business Machines Corporation Adjusting a consumer experience based on a 3D captured image stream of a consumer response
US8588464B2 (en) 2007-01-12 2013-11-19 International Business Machines Corporation Assisting a vision-impaired user with navigation based on a 3D captured image stream
US20120113004A1 (en) * 2008-09-05 2012-05-10 Nintendo Co., Ltd. Computer readable recording medium recording image processing program and image processing apparatus
US8223120B2 (en) 2008-09-05 2012-07-17 Nintendo Co., Ltd. Computer readable recording medium recording image processing program and image processing apparatus
US8284158B2 (en) * 2008-09-05 2012-10-09 Nintendo Co., Ltd. Computer readable recording medium recording image processing program and image processing apparatus
US20100060575A1 (en) * 2008-09-05 2010-03-11 Keizo Ohta Computer readable recording medium recording image processing program and image processing apparatus
US20120069180A1 (en) * 2009-05-26 2012-03-22 Panasonic Electric Works Co., Ltd. Information presentation apparatus
US20110012830A1 (en) * 2009-07-20 2011-01-20 J Touch Corporation Stereo image interaction system
US20120119988A1 (en) * 2009-08-12 2012-05-17 Shimane Prefectural Government Image recognition apparatus, operation determining method and computer-readable medium
US9535512B2 (en) 2009-08-12 2017-01-03 Shimane Prefectural Government Image recognition apparatus, operation determining method and computer-readable medium
US8890809B2 (en) * 2009-08-12 2014-11-18 Shimane Prefectural Government Image recognition apparatus, operation determining method and computer-readable medium
US20110078634A1 (en) * 2009-09-30 2011-03-31 Rovi Technologies Corporation Systems and methods for navigating a three-dimensional media guidance application
US8970669B2 (en) 2009-09-30 2015-03-03 Rovi Guides, Inc. Systems and methods for generating a three-dimensional media guidance application
US8291322B2 (en) * 2009-09-30 2012-10-16 United Video Properties, Inc. Systems and methods for navigating a three-dimensional media guidance application
US20110074918A1 (en) * 2009-09-30 2011-03-31 Rovi Technologies Corporation Systems and methods for generating a three-dimensional media guidance application
US20110137727A1 (en) * 2009-12-07 2011-06-09 Rovi Technologies Corporation Systems and methods for determining proximity of media objects in a 3d media environment
US8878773B1 (en) 2010-05-24 2014-11-04 Amazon Technologies, Inc. Determining relative motion as input
US9557811B1 (en) 2010-05-24 2017-01-31 Amazon Technologies, Inc. Determining relative motion as input
US10521951B2 (en) 2010-06-01 2019-12-31 Vladimir Vaganov 3D digital painting
US10217264B2 (en) 2010-06-01 2019-02-26 Vladimir Vaganov 3D digital painting
US9440484B2 (en) 2010-06-01 2016-09-13 Vladimir Vaganov 3D digital painting
US10922870B2 (en) 2010-06-01 2021-02-16 Vladimir Vaganov 3D digital painting
US9734622B2 (en) 2010-06-01 2017-08-15 Vladimir Vaganov 3D digital painting
US9030467B2 (en) * 2010-07-13 2015-05-12 Lg Electronics Inc. Electronic apparatus and method for displaying graphical user interface as 3D image
US20120013612A1 (en) * 2010-07-13 2012-01-19 Lg Electronics Inc. Electronic apparatus and method for displaying graphical user interface as 3d image
US20160041625A1 (en) * 2010-07-20 2016-02-11 Apple Inc. Adaptive Projector
US9740298B2 (en) * 2010-07-20 2017-08-22 Apple Inc. Adaptive projector for projecting content into a three-dimensional virtual space
US8892357B2 (en) 2010-09-20 2014-11-18 Honeywell International Inc. Ground navigational display, system and method displaying buildings in three-dimensions
US20120081529A1 (en) * 2010-10-04 2012-04-05 Samsung Electronics Co., Ltd Method of generating and reproducing moving image data by using augmented reality and photographing apparatus using the same
US9164581B2 (en) 2010-10-22 2015-10-20 Hewlett-Packard Development Company, L.P. Augmented reality display system and method of display
US9489102B2 (en) 2010-10-22 2016-11-08 Hewlett-Packard Development Company, L.P. System and method of modifying lighting in a display system
US8854802B2 (en) 2010-10-22 2014-10-07 Hewlett-Packard Development Company, L.P. Display with rotatable display screen
US9454836B2 (en) * 2010-11-08 2016-09-27 Ntt Docomo, Inc. Object display device and object display method
US20130222424A1 (en) * 2010-11-08 2013-08-29 Ntt Docomo, Inc. Object display device and object display method
US20120120051A1 (en) * 2010-11-16 2012-05-17 Shu-Ming Liu Method and system for displaying stereoscopic images
US20130293687A1 (en) * 2011-01-14 2013-11-07 Sharp Kabushiki Kaisha Stereoscopic image processing apparatus, stereoscopic image processing method, and program
US9430081B2 (en) * 2011-03-25 2016-08-30 Kyocera Corporation Electronic device, control method, and control program
US20140009424A1 (en) * 2011-03-25 2014-01-09 Kyocera Corporation Electronic device, control method, and control program
US20120287235A1 (en) * 2011-05-13 2012-11-15 Ahn Mooki Apparatus and method for processing 3-dimensional image
US9766796B2 (en) 2011-06-07 2017-09-19 Sony Corporation Information processing apparatus, information processing method, and program
US20130009949A1 (en) * 2011-07-05 2013-01-10 Texas Instruments Incorporated Method, system and computer program product for re-convergence of a stereoscopic image
US9041734B2 (en) 2011-07-12 2015-05-26 Amazon Technologies, Inc. Simulating three-dimensional features
US10088924B1 (en) 2011-08-04 2018-10-02 Amazon Technologies, Inc. Overcoming motion effects in gesture recognition
US8947351B1 (en) * 2011-09-27 2015-02-03 Amazon Technologies, Inc. Point of view determinations for finger tracking
US9418292B2 (en) 2011-10-04 2016-08-16 Here Global B.V. Methods, apparatuses, and computer program products for restricting overlay of an augmentation
WO2013050953A3 (en) * 2011-10-04 2013-09-12 Nokia Corporation Methods, apparatuses, and computer program products for restricting overlay of an augmentation
US20130121528A1 (en) * 2011-11-14 2013-05-16 Sony Corporation Information presentation device, information presentation method, information presentation system, information registration device, information registration method, information registration system, and program
US8948451B2 (en) * 2011-11-14 2015-02-03 Sony Corporation Information presentation device, information presentation method, information presentation system, information registration device, information registration method, information registration system, and program
US8942434B1 (en) 2011-12-20 2015-01-27 Amazon Technologies, Inc. Conflict resolution for pupil detection
US20130169530A1 (en) * 2011-12-29 2013-07-04 Khalifa University Of Science And Technology & Research (Kustar) Human eye controlled computer mouse interface
US9075453B2 (en) * 2011-12-29 2015-07-07 Khalifa University of Science, Technology & Research (KUSTAR) Human eye controlled computer mouse interface
US9223415B1 (en) 2012-01-17 2015-12-29 Amazon Technologies, Inc. Managing resource usage for task performance
EP2669885A3 (en) * 2012-05-28 2015-06-10 Acer Incorporated Transparent display device and transparency adjustment method thereof
US9563272B2 (en) 2012-05-31 2017-02-07 Amazon Technologies, Inc. Gaze assisted object recognition
US9317113B1 (en) 2012-05-31 2016-04-19 Amazon Technologies, Inc. Gaze assisted object recognition
US20130335404A1 (en) * 2012-06-15 2013-12-19 Jeff Westerinen Depth of field control for see-thru display
US9430055B2 (en) * 2012-06-15 2016-08-30 Microsoft Technology Licensing, Llc Depth of field control for see-thru display
US20140195983A1 (en) * 2012-06-30 2014-07-10 Yangzhou Du 3d graphical user interface
US9268407B1 (en) * 2012-10-10 2016-02-23 Amazon Technologies, Inc. Interface elements for managing gesture control
US10180766B2 (en) 2012-10-30 2019-01-15 Samsung Electronics Co., Ltd. Three-dimensional display device and user interfacing method therefor
WO2014069722A1 (en) * 2012-10-30 2014-05-08 Samsung Electronics Co., Ltd. Three-dimensional display device and user interfacing method therefor
US10782848B2 (en) 2012-11-02 2020-09-22 Atheer, Inc. Method and apparatus for a three dimensional interface
US20200387290A1 (en) * 2012-11-02 2020-12-10 Atheer, Inc. Method and apparatus for a three dimensional interface
US10241638B2 (en) * 2012-11-02 2019-03-26 Atheer, Inc. Method and apparatus for a three dimensional interface
US11789583B2 (en) * 2012-11-02 2023-10-17 West Texas Technology Partners, Llc Method and apparatus for a three dimensional interface
US20140125557A1 (en) * 2012-11-02 2014-05-08 Atheer, Inc. Method and apparatus for a three dimensional interface
US20140132726A1 (en) * 2012-11-13 2014-05-15 Lg Electronics Inc. Image display apparatus and method for operating the same
US9269012B2 (en) 2013-08-22 2016-02-23 Amazon Technologies, Inc. Multi-tracker object tracking
TWI499964B (en) * 2013-09-17 2015-09-11 Genius Toy Taiwan Co Ltd Method, system and computer program product for real-time touchless interaction
US10055013B2 (en) 2013-09-17 2018-08-21 Amazon Technologies, Inc. Dynamic object tracking for user interfaces
WO2015066037A1 (en) * 2013-10-28 2015-05-07 Brown University Virtual reality methods and systems
WO2015195549A1 (en) * 2014-06-16 2015-12-23 Vladimir Vaganov 3d digital painting
CN104093240A (en) * 2014-06-30 2014-10-08 广东九联科技股份有限公司 System for intelligently adjusting ambient light
US9588352B2 (en) 2015-01-19 2017-03-07 Samsung Display Co., Ltd. Autostereoscopic image display device with a difference image map
CN104661015A (en) * 2015-02-06 2015-05-27 武汉也琪工业设计有限公司 Virtual reality simulation display device for 3D real scenes
WO2016191049A1 (en) * 2015-05-26 2016-12-01 Microsoft Technology Licensing, Llc Mixed-reality headset
US10824293B2 (en) * 2017-05-08 2020-11-03 International Business Machines Corporation Finger direction based holographic object interaction from a distance
US20180321816A1 (en) * 2017-05-08 2018-11-08 International Business Machines Corporation Finger direction based holographic object interaction from a distance
US10643398B2 (en) 2018-07-23 2020-05-05 Microsoft Technology Licensing, Llc Depth ray layer for reduced visual noise
US11182977B2 (en) 2018-07-23 2021-11-23 Microsoft Technology Licensing, Llc Depth ray layer for reduced visual noise
US11922594B2 (en) 2021-01-27 2024-03-05 Qualcomm Incorporated Context-aware extended reality systems

Also Published As

Publication number Publication date
US6559813B1 (en) 2003-05-06
US6064354A (en) 2000-05-16
WO2000002187A1 (en) 2000-01-13
US6243054B1 (en) 2001-06-05
AU4837299A (en) 2000-01-24

Similar Documents

Publication Publication Date Title
US6559813B1 (en) Selective real image obstruction in a virtual reality display apparatus and method
US6417969B1 (en) Multiple viewer headset display apparatus and method with second person icon display
US11656468B2 (en) Steerable high-resolution display having a foveal display and a field display with intermediate optics
JP4468370B2 (en) Three-dimensional display method, apparatus and program
Youngblut et al. Review of virtual environment interface technology
US6917370B2 (en) Interacting augmented reality and virtual reality
RU2541936C2 (en) Three-dimensional display system
JP2005164916A (en) Stereoscopic display device
JP2000258723A (en) Video display device
EP2954487A1 (en) Improvements in and relating to image making
WO2018204738A1 (en) Head tracking based depth fusion
Peterson et al. Label segregation by remapping stereoscopic depth in far-field augmented reality
US5880734A (en) Peripheral vision simulator for immersive 3D virtual environments
US10852767B2 (en) Handwriting support device
US11399154B2 (en) Presentation system and presentation method
KR100815505B1 (en) 3d displaying method, device and program
JP4547960B2 (en) Video display system and video generation method
Lacoche et al. Dealing with frame cancellation for stereoscopic displays in 3d user interfaces
KR102542641B1 (en) Apparatus and operation method for rehabilitation training using hand tracking
CN111247473A (en) Display apparatus and display method using device for providing visual cue
CN111727924B (en) Mixed reality fish tank system in stereoscopic display environment and generation method
Orlosky Adaptive display of virtual content for improving usability and safety in mixed and augmented reality
JP2005283704A6 (en) Video display system and video generation method
US6614426B1 (en) Method and device for displaying simulated 3D space as an image
Yoshikawa et al. Studies of vection field II: a method for generating smooth motion pattern

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION