CA2604513A1 - System and method for dynamically correcting parallax in head borne video systems - Google Patents
- Publication number
- CA2604513A1 CA002604513A CA2604513A
- Authority
- CA
- Canada
- Prior art keywords
- offset
- video data
- display device
- rows
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W40/00—Communication routing or communication path finding
- H04W40/24—Connectivity information management, e.g. connectivity discovery or connectivity update
- H04W40/248—Connectivity information update
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0129—Head-up displays characterised by optical features comprising devices for correcting parallax
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W8/00—Network data management
- H04W8/005—Discovery of network devices, e.g. terminals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W84/00—Network topologies
- H04W84/18—Self-organising networks, e.g. ad-hoc networks or sensor networks
Abstract
A dynamically corrected parallax system includes a head borne video source for imaging an object and providing video data. A controller electronically offsets the video data provided from the head borne video source to form offset video data. A display device receives the offset video data and displays the offset video data to a user's eye. The display device is configured for placement directly in front of the user's eye as a vision aid, and the head borne video source is configured for displacement to a side of the user's eye. The offset video data corrects parallax due to horizontal and/or vertical displacement between the display device and the head borne video source. The display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes an offset of a number of columns of pixels in the X direction of the X,Y array, and/or another offset of a number of rows of pixels in the Y direction of the X,Y array.
Description
SYSTEM AND METHOD FOR DYNAMICALLY CORRECTING PARALLAX
IN HEAD BORNE VIDEO SYSTEMS
FIELD OF THE INVENTION
The present invention relates, in general, to a system for parallax correction.
More specifically, the present invention relates to a system and method for dynamically correcting parallax in a head mounted display (HMD), which is placed directly in front of a user's eye.
BACKGROUND OF THE INVENTION
Vision aid devices which are worn on the head are typically located directly in front of the aided eye or eyes. As these systems migrate from direct view optical paths to digital camera aids, the system configuration requires that a head mounted display (HMD) be placed directly in front of the user's aided eye, with one inch of eye relief. This placement of the HMD prevents the co-location of the camera aperture directly in front of the aided eye. The camera aperture must be moved either in front of the HMD or to one side of the HMD.
If, for example, the digital camera is placed 100 mm to the side of the optical axis of the aided eye, then a displacement is created between the aperture of the digital camera and the image display of the digital camera, the display typically being centered about the optical axis of the aided eye. This displacement creates a disparity between the apparent positions of objects viewed through the camera and the actual positions of the objects seen in object space (or real space). This offset between perceived space and object space is referred to as parallax.
21682208.1
FIG. 1 provides an example of parallax error. As shown, the user is viewing environment 10 through a head mounted video device. The user sees tool 12 at close range and attempts to pick up the tool. Because of parallax, the perceived position of tool 12 is incorrect. The true position of tool 12 in object space is shown by dotted tool 14.
In the case of the user viewing an object through a head mounted video device, parallax reduces the usefulness of the video system. The human psycho-visual system is unconsciously attuned to perceiving the world through its natural entrance aperture, which is the pupil in the human eye. The hand-to-eye coordination inherent in manual tasks is based on this innate property. Normal human movement tasks, such as walking and running, depend on this subconscious process. A fixed system, which is aligned to remove parallax at some fixed distance, is misaligned at all other distances. This is especially true when the video system is aligned to remove parallax of an object at far range and the user attempts to locate another object at close range, such as tool 12 in FIG. 1, which is located within an arm's length of the user.
As will be explained, the present invention addresses the parallax problem by providing a system for dynamically realigning the video image so that the image coincides with the real world at all distances.
SUMMARY OF THE INVENTION
To meet this and other needs, and in view of its purposes, the present invention provides a dynamically corrected parallax system including a head borne video source for imaging an object and providing video data. A controller is included for electronically offsetting the video data provided from the head borne video source to form offset video data. A display device receives the offset video data and displays the offset video data to a user's eye. The display device is configured for placement directly in front of the user's eye as a vision aid, and the head borne video source is configured for displacement to a side of the user's eye. The offset video data corrects parallax due to displacement between the display device and the head borne video source.
The display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes an offset of a number of columns of pixels in the X direction of the X,Y array. The offset video data, alternatively, may include an offset of a number of rows of pixels in the Y direction of the X,Y array. The offset video data may also include an offset of a number of columns of pixels in the X direction of the X,Y array and another offset of a number of rows of pixels in the Y direction of the X,Y array.
Geometrically, the optical axis of the user's eye extends a distance of D to an object imaged by the video source, and an optical axis of the aperture of the video source extends in a direction parallel to the optical axis of the user's eye. The displacement to a side is a horizontal displacement distance of d in a Frankfort plane between the optical axis of the user's eye and the optical axis of the aperture of the video source.
The offset video data is based on the horizontal displacement distance d and the distance D to the object.
Furthermore, a horizontal offset angle θD is formed, as follows:

θD = tan⁻¹(d/D),

where d is a horizontal displacement distance between the optical axis of the user's eye and the optical axis of the aperture of the video source.

The display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes the following horizontal offset:

offset_columns = (#Columns / FOV_horz) × θD

where offset_columns is the amount of horizontal offset in columns, FOV_horz is the horizontal field-of-view of the video source, and #Columns is the total number of columns of the display device.

Further yet, a vertical offset angle ΦD may also be formed, where ΦD = tan⁻¹(d′/D), where d′ is a vertical displacement distance between the optical axis of the user's eye and the optical axis of the aperture of the video source. The offset video data includes the following vertical offset:

offset_rows = (#Rows / FOV_vert) × ΦD

where offset_rows is the amount of vertical offset in rows, FOV_vert is the vertical field-of-view of the video source, and #Rows is the total number of rows in the display device.
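As an illustration of the two offsets just defined, the following sketch computes them directly from the displacement distances, focal distance, fields-of-view, and display resolution. The function name and the rounding to whole pixels are assumptions for illustration, not taken from the claims.

```python
import math

def parallax_offsets(d_horz_mm, d_vert_mm, dist_mm,
                     hfov_deg, vfov_deg, columns, rows):
    """Offsets per the summary formulas:
    theta_D = tan^-1(d/D),  offset_columns = (#Columns / FOV_horz) * theta_D
    phi_D   = tan^-1(d'/D), offset_rows    = (#Rows / FOV_vert)   * phi_D
    """
    theta_d = math.degrees(math.atan2(d_horz_mm, dist_mm))  # horizontal offset angle
    phi_d = math.degrees(math.atan2(d_vert_mm, dist_mm))    # vertical offset angle
    offset_columns = round(columns / hfov_deg * theta_d)    # columns to shift
    offset_rows = round(rows / vfov_deg * phi_d)            # rows to shift
    return offset_columns, offset_rows

# Example: 103 mm horizontal displacement only, object at 1 m,
# 1280x1024 display with a 40 x 30 degree field-of-view.
print(parallax_offsets(103, 0, 1000, 40, 30, 1280, 1024))  # → (188, 0)
```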
The dynamically corrected parallax system includes a display electronics module disposed between the video source and the display device for converting the video data from the video source into digital video data. The display electronics module is configured to receive an offset command from the controller and modify the digital video data into the offset video data. The display electronics module and the controller may be integrated in a single unit. A focus position encoder may be coupled to the controller for determining a distance D to an object imaged by the video source, where the distance D is used to correct the parallax.
The display device may be a helmet mounted display (HMD), or part of a head mounted night vision goggle.
Another embodiment of the present invention includes a dynamically correcting parallax method for a head borne camera system having a video source and a display device, where the display device is configured for placement directly in front of a user's eye as a vision aid, and the video source is configured for displacement to a side of the user's eye. The method includes the steps of: (a) imaging an object, by the video source, to provide video data; (b) determining a focus distance to an object; (c) offsetting the video data to form offset video data based on the focus distance determined in step (b) and a displacement distance between the user's eye and an aperture of the video source; and (d) displaying the offset video data by the display device.
It is understood that the foregoing general description and the following detailed description are exemplary, but are not restrictive, of the invention.
BRIEF DESCRIPTION OF THE DRAWING
The invention is best understood from the following detailed description when read in connection with the accompanying drawings. Included in the drawing are the following figures:
FIG. 1 depicts a geometry of a parallax offset between an object as imaged by a camera and the same object as seen in object space by a viewer;
FIG. 2 is a block diagram of a system for dynamically correcting parallax in a to head borne video system, in accordance with an embodiment of the present invention;
FIG. 3A is a top view of an object as viewed by a user and imaged by a video camera, where a display of the image is displaced from the aperture of the camera by a horizontal displacement distance;
FIG. 3B is a side view of an object as viewed by a user and imaged by a video camera, where a display of the image is displaced from the aperture of the camera by a vertical displacement distance;
FIG. 4 is a plot of the number of columns required to be shifted on a display as a function of viewing distance to the object-of-interest, in accordance with an embodiment of the present invention; and
FIG. 5 is a plot of the number of columns required to be shifted on a display as a function of viewing distance to the object-of-interest, with a bias angle introduced in the imaging angle of the camera, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
As will be explained, the present invention dynamically realigns the video image so that the image coincides with the real world at all distances. To do this, the present invention determines the range to the object of interest, so that dynamic alignment may be accomplished based on the determined range. In one embodiment, the invention uses an absolute position of the camera's focus mechanism (or angular orientation of a manual focus knob) to determine the distance to the user's object-of-interest and then applies an appropriate amount of parallax correction to the image shown on the user's display. In this manner, the apparent location of an object-of-interest is correctly perceived at its true position in object space.
In one embodiment of the invention, the video is provided to the user on a digital display device, such as an LCD or LED display. These displays consist of an array of rows and columns of pixels. By controlling the timing of the video data sent to the display, the present invention induces an offset in the image as the image is displayed to the user.
By shifting the image in display space, the present invention removes the disparity between the apparent position of an object and its actual position in object space.
A consequence of shifting the image on the display is lost rows and/or columns of pixels in the direction of the image shift. Rows and/or columns of pixels on the opposite edges of the display show arbitrary intensity values, because (assuming a one-to-one relationship in pixel resolution between the camera and the display) these pixels are no longer within the field-of-view of the camera and, therefore, do not provide image data.
Thus, shifting the image introduces a reduction in the effective user's field-of-view, because of the reduced usable image size. This negative effect may be minimized, however, by setting the camera pointing angle for convergence at a distance much closer than the far field.
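To make the edge effect concrete, here is a minimal, illustrative sketch of a rightward column shift on a tiny hypothetical 2×5 "frame" (the function name and fill value are assumptions): the rightmost columns fall off the display, and the vacated left columns hold only a placeholder value because no camera data maps to them.

```python
def shift_right(image, offset_columns, fill=0):
    """Shift each row right by offset_columns pixels.
    Pixels shifted past the right edge are lost; vacated left-edge
    pixels carry no camera data and are filled with a placeholder.
    """
    return [[fill] * offset_columns + row[:len(row) - offset_columns]
            for row in image]

frame = [[1, 2, 3, 4, 5],
         [6, 7, 8, 9, 10]]
print(shift_right(frame, 2))
# → [[0, 0, 1, 2, 3], [0, 0, 6, 7, 8]]
```

Two columns of usable image are lost per row, which is exactly the reduction in effective field-of-view described above.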
Referring next to FIG. 2, there is shown a system for dynamically correcting parallax in a head borne video system, generally designated as 20. System 20 includes video source 23 providing video data to display electronics module 24, the latter forming digital pixel data for viewing on display device 25. Also included in system 20 is a focus position encoder, designated as 21, for providing focus position data to microcontroller 22.
The focus position encoder 21 encodes, as shown, the orientation of focus knob 26 disposed on video source 23. Microcontroller 22 converts the focus position data received from the position encoder 21 into X,Y offset control signals, as will be explained later. The X,Y offset control signals are provided to display electronics 24 which, in turn, provides the offset video data for viewing on display device 25.
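The control flow just described — focus position in, X,Y offset commands out — can be sketched as follows. The knob-angle-to-distance mapping below is a purely hypothetical linear calibration (a real encoder would use the lens's calibrated focus scale), and the default displacement and field-of-view values are taken from the worked example later in the description.

```python
import math

def knob_angle_to_distance_mm(angle_deg):
    """Hypothetical focus-scale calibration: 0 deg -> 300 mm (near limit),
    270 deg -> 10 m (far limit), linear in between. Illustrative only."""
    return 300.0 + (angle_deg / 270.0) * (10000.0 - 300.0)

def xy_offset_commands(angle_deg, d_mm=103.0, d_prime_mm=0.0,
                       hfov_deg=40.0, vfov_deg=30.0, columns=1280, rows=1024):
    """Role of microcontroller 22: convert a focus position reading into
    X,Y offset control signals for display electronics 24.
    d_prime_mm defaults to 0 (camera displaced horizontally only, as in FIG. 3A)."""
    dist = knob_angle_to_distance_mm(angle_deg)
    theta = math.degrees(math.atan(d_mm / dist))        # horizontal offset angle
    phi = math.degrees(math.atan(d_prime_mm / dist))    # vertical offset angle
    return (round(columns / hfov_deg * theta),          # X offset, in columns
            round(rows / vfov_deg * phi))               # Y offset, in rows

print(xy_offset_commands(270))  # far focus limit → small X offset
```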
It will be appreciated that video source 23 may be any camera device configured to be placed on the side of the optical axis of a user's eye. In the embodiment shown in FIG. 2, video source 23 includes manual focus knob 26 which allows the user to adjust the lens of the video camera to focus on an object-of-interest. Display device 25 may be any display which is configured to be placed about the optical axis of the user's eye. The display device provides an offset pixel image of the image represented by the video data received from video source 23. The X,Y array of pixels displayed on display device 25 and the video data provided by video source 23 may have a one-to-one correspondence, or may have any other relationship, such as a correspondence resulting from a reduced resolution display versus a high resolution video camera.
As another embodiment, focus knob 26 may be controlled by a motor (not shown) to allow for a zoom lens operation of video source 23. In this embodiment, focus position encoder 21 may determine the focal length to an object-of-interest by including a zoom lens barrel. A focal length detecting circuit may be included to detect and output the focal length of the zoom lens barrel. As a further embodiment, video source 23 may include a range finder, such as an infrared range finder, which may focus an infrared beam onto a target and receive a reflected infrared beam from the target. A position sensitive device included in focus position encoder 21 may detect the displacement of the reflected beam and provide an encoded signal of the range, or position of the target.
The microcontroller may be any type of controller having a processor execution capability provided by a software program stored in a medium, or a hardwired program provided by an integrated circuit. The manner in which microcontroller 22 computes the X,Y offset control signals is described next.
Referring to FIGS. 3A and 3B, camera 23 is shown offset by a displacement distance from a user's eye 32. FIGS. 3A and 3B are similar to each other, except that camera 23 is oriented to a horizontal, right side of the user's eye 32 by a horizontal displacement distance of d in FIG. 3A, whereas it is displaced vertically (above or below the user's eye) by a vertical displacement distance of d′ in FIG. 3B. The horizontal displacement distance and/or the vertical displacement distance is typically in the vicinity of 100 millimeters. The camera 23 has an optical axis designated as 37 and the user's eye has an optical axis designated as 35. Both optical axes are shown parallel to each other.
The user is aided in the viewing of object 31 by way of display device 25. As shown in FIG. 3A, camera 23 is imaging object 31 at a horizontal offset angle of θD. In FIG. 3B, however, camera 23 is imaging object 31 at a vertical offset angle of ΦD. In both figures, object 31 is displayed as a pixel image on display device 25 for viewing by the user.
The focal distance, which may be adjustable, is the distance D between the user's eye and the object-of-interest 31.
Using FIG. 3A, a method for calculating the X offset control signal by microcontroller 22 is exemplified below. In this example, the units of the X offset are in horizontal pixels, which may be equivalent to columns of pixels on video display 25. For the purpose of this example, it is assumed that the horizontal displacement distance d is 103 mm; the field-of-view (FOV) of camera 23 is 40 degrees along the horizontal (HFOV) axis; the horizontal resolution of display device 25 is 1280 pixels; the optical axis of camera 23 is parallel to the optical axis of the unaided eye 32; the aperture of the camera is on the viewer's Frankfort plane, in line with the unaided eye; and the object-of-interest 31 is at a focal distance of D.

The horizontal offset angle θD is given by equation (1) as follows:

θD = tan⁻¹(d/D) (Eq. 1)

The correction factor C_horz (for a 40 degree FOV and a 1280 pixel horizontal display resolution) is given by equation (2), in units of columns per degree, as follows:

C_horz = #Columns / FOV_horz (Eq. 2)
       = 1280/40
       = 32 columns/degree

Here, #Columns is the total number of columns in the digital display, or 1280 columns (in this example). The image shift on the display device, or the amount of offset-in-columns, is given by equation (3) below, where θD is the horizontal offset angle between the camera's line of sight 36 and the camera's optical axis 37.

offset_columns = C_horz × θD (Eq. 3)

In a similar manner, using FIG. 3B, a method for calculating the Y offset control signal by microcontroller 22 is exemplified below. In this example, the units of the Y offset are in vertical pixels, which may be equivalent to rows of pixels on video display 25. For the purpose of this example, it is assumed that the vertical displacement distance d′ is 103 mm; the field-of-view (FOV) of camera 23 is 30 degrees along the vertical (VFOV) axis; the vertical resolution of display device 25 is 1024 pixels; the optical axis of camera 23 is parallel to the optical axis of the unaided eye 32; the aperture of the camera is in a vertical line with the unaided eye; and the object-of-interest 31 is at a focal distance of D.

The vertical offset angle ΦD is given by equation (4) as follows:

ΦD = tan⁻¹(d′/D) (Eq. 4)

The correction factor C_vert (for a 30 degree vertical FOV and a 1024 pixel vertical display resolution) is given by equation (5), in units of rows per degree, as follows:

C_vert = #Rows / FOV_vert (Eq. 5)
       = 1024/30
       ≈ 34 rows/degree

Here, #Rows is the total number of rows in the digital display, or 1024 rows (in this example). The image shift on the display device, or the amount of offset-in-rows, is given by equation (6) below, where ΦD is the vertical offset angle between the camera's line of sight 36 and the camera's optical axis 37.
offset_rows = C_vert × ΦD (Eq. 6)

Referring next to FIG. 4, there is shown a plot of the offset-in-#columns versus the distance between the observer (the user's eye) and the observed object (object-of-interest). More specifically, FIG. 4 plots the horizontal image offset, in number-of-columns, required to compensate for the parallax induced by a 103 mm horizontal displacement between an observer and the video camera. For a camera located to the right of the aided eye, the parallax correcting image shift in the display is towards the right.
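The numbered equations can be checked with a short sketch. The focal distance D = 1 m is an arbitrary illustrative choice, since the worked example above leaves D unspecified; everything else follows the stated assumptions (d = d′ = 103 mm, 40° × 30° FOV, 1280 × 1024 display).

```python
import math

d, d_prime, D = 103.0, 103.0, 1000.0   # mm; D = 1 m chosen for illustration
hfov, vfov = 40.0, 30.0                # degrees
columns, rows = 1280, 1024

c_horz = columns / hfov                           # Eq. 2: 32 columns/degree
c_vert = rows / vfov                              # Eq. 5: ~34 rows/degree
theta_d = math.degrees(math.atan(d / D))          # Eq. 1: horizontal offset angle
phi_d = math.degrees(math.atan(d_prime / D))      # Eq. 4: vertical offset angle
offset_columns = c_horz * theta_d                 # Eq. 3
offset_rows = c_vert * phi_d                      # Eq. 6

print(round(c_horz), round(c_vert))               # → 32 34
print(round(offset_columns), round(offset_rows))  # → 188 201
```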
The plot shown in FIG. 4 is for a camera/HMD system with a matched HFOV of 40 degrees. As can be seen, the amount of image shift required to remove the parallax increases nonlinearly as the observer focuses to shorter and shorter distances. At a focus distance of 2 feet, 25% of the viewable area of an SXGA high resolution display will be shifted out of view, thereby reducing the effective display HFOV by approximately 25%.
To avoid the loss of HFOV at close focus distances, the optical axis of the camera may be biased to the left, thereby reducing the horizontal offset angle θD.
A similar plot to the plot shown in FIG. 4 may be made for an offset-in-#rows versus the distance between the observer (the user's eye) and the observed object (object-of-interest).
Lastly, FIG. 5 shows a resulting horizontal image offset in #columns with the same assumptions as those made for FIG. 4, except that a bias angle of 4.8 degrees has been introduced. At this camera angle, the display offset required to remove parallax is reduced to zero at 4 feet. At 2 feet, the required offset is 152 columns, or 12% of the HFOV, as compared to 24% of the HFOV in FIG. 4. Beyond a distance of 4 feet, the display offset becomes negative, which means that the video image must be shifted toward the opposite edge, or end, of the display. This camera angle thus introduces a parallax error with an opposite sign. For a focal distance of 10 feet, the horizontal display offset required to compensate for parallax is -93 columns, or 7.2% of the HFOV. At a 40 foot distance, the horizontal display offset is -139 columns, or 11% of the HFOV.
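The bias-angle variant can be sketched as below: the camera is toed in by a fixed bias so that the display offset is zero near the convergence distance, positive closer in, and negative farther out. The function name is an assumption, and the computed column counts reproduce the trend of FIG. 5 but may differ from the figure's quoted values by a column or two of rounding.

```python
import math

MM_PER_FOOT = 304.8
C_HORZ = 1280 / 40.0      # 32 columns/degree, as in the running example
BIAS_DEG = 4.8            # bias angle from the FIG. 5 discussion
D_MM = 103.0              # horizontal displacement between eye and camera

def biased_offset_columns(dist_feet):
    """Display offset with a fixed camera bias: positive at close range,
    roughly zero at the ~4 foot convergence distance, negative beyond it."""
    theta_d = math.degrees(math.atan(D_MM / (dist_feet * MM_PER_FOOT)))
    return round(C_HORZ * (theta_d - BIAS_DEG))

for feet in (2, 4, 10, 40):
    print(feet, biased_offset_columns(feet))
```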
The embodiments described above may be used by any head borne camera system, including a head mounted night vision goggle and a head mounted reality mediator device.
Although the invention is illustrated and described herein with reference to specific embodiments, the invention is not intended to be limited to the details shown. Rather, various modifications may be made in the details within the scope and range of equivalents of the claims and without departing from the invention.
21682208.1
IN HEAD BORNE VIDEO SYSTEMS
FIELD OF THE INVENTION
The present invention relates, in general, to a system for parallax correction.
More specifically, the present invention relates to a system and method for dynamically correcting parallax in a head mounted display (HMD), which is placed directly in front of a s user's eye.
BACKGROUND OF THE INVENTION
Vision aid devices which are worn on the head are typically located directly in front of the aided eye or eyes. As these systems migrate from direct view optical paths to digital camera aids, the system configuration requires that a head mounted display (HMD) io be placed directly in front of the user's aided eye, with one inch of eye relief. This placement of the HMD prevents the co-location of the camera aperture directly in front of the aided eye. The camera aperture must be moved either in front of the HMD or to one side of the HMD.
If, for example, the digital camera is placed 100 mm to the side of the optical 15 axis of the aided eye, then a displacement is created between the aperture of the digital camera and the image display of the digital camera, the display typically centered about the optical axis of the aided eye. This displacement creates a disparity between the apparent positions of objects viewed through the camera, and the actual positions of the objects seen 21682208.1 in object space (or real space). This offset in perceived space and object space is referred to as parallax.
FIG. 1 provides an example of parallax error. As shown, the user is viewing environment 10 through a head mounted video device. The user sees tool 12 at close range and attempts to pick up the tool. Because of parallax, the perceived position of tool 12 is incorrect. The true position of tool 12 in object space is shown by dotted tool 14.
In the case of the user viewing an object through a head mounted video device, parallax reduces the usefulness of the video system. The human psycho-visual system is unconsciously attuned to perceiving the world through its natural entrance aperture, which is the pupil in the human eye. The hand-to-eye coordination inherent in manual tasks is based on this innate property. Normal human movement tasks, such as walking and running, depend on this subconscious process. A fixed system, which is aligned to remove parallax at some fixed distance, is miss-aligned at all other distances. This is especially true when the video system is aligned to remove parallax of an object at far range is and the user attempts to locate another object at close range, such as tool 12 on FIG. 1 which is located within an arms length of the user.
As will be explained, the present invention addresses the parallax problem by providing a system for dynamically realigning the video image so that the image coincides with the real world at all distances.
21682208.1 SUMMARY OF THE INVENTION
To meet this and other needs, and in view of its purposes, the present invention provides a dynamically corrected parallax system including a head borne video source for imaging an object and providing video data. A controller is included for electronically offsetting the video data provided from the head borne video source to form offset video data. A display device receives the offset video data and displays the offset video data to a user's eye. The display device is configured for placement directly in front of the user's eye as a vision aid, and the head borne video source is configured for displacement to a side of the user's eye. The offset video data corrects parallax due to displacement between the display device and the head borne video source.
The display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes an offset of a number of columns of pixels in the X
direction of the X,Y array. The offset video data, alternatively, may include an offset of a number of rows of pixels in the Y direction of the X,Y array. The offset video data may also include an offset of a number of columns of pixels in the X direction of the X,Y array and another offset of a number of rows of pixels in the Y direction of the X,Y
array.
Geometrically, the optical axis of the user's eye extends a distance of D to an object imaged by the video source, and an optical axis of the aperture of the video source extends in a direction parallel to the optical axis of the user's eye. The displacement to a side is a horizontal displacement distance of d in a Frankfort plane between the optical axis of the user's eye and the optical axis of the aperture of the video source.
The offset video data is based on the horizontal displacement distance d and the distance D to the object.
21682208.1 Furthermore, a horizontal offset angle 0D is formed, as follows:
OD = tan l d/D, where d is a horizontal displacement distance between the optical axis of the user's eye and the optical axis of the aperture of the video source.
The display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes the following horizontal offset:
offsetco,umns = #Columns/FOVnoa*eD
where offsetco,umns is the amount of horizontal offset in columns, FOVnoa is the horizontal field-of-view of the video source, and #Columns is the total number of columns of the display device.
Further yet, a vertical offset angle OD may also be formed, where (DD=tanld/D, where d' is a vertical displacement distance between the optical axis of the user's eye and the optical axis of the aperture of the video source. The offset video data includes the following vertical offset:
offset,o,s = # Rows/ FOVvert* OD
21682208.1 where offsetro,s is the amount of vertical offset in rows, FOVVert is the vertical field-of-view of the video source, and #Rows is the total number of rows in the display device.
The dynamically corrected parallax system includes a display electronics module disposed between the video source and the display device for converting the video data from the video source into digital video data. The display electronics module is configured to receive an offset command from the controller and modify the digital video data into the offset video data. The display electronics module and the controller may be integrated in a single unit. A focus position encoder may be coupled to the controller for io determining a distance D to an object imaged by the video source, where the distance D is used to correct the parallax.
The display device may be a helmet mounted display (HMD), or part of a head mounted night vision goggle.
Another embodiment of the present invention includes a dynamically is correcting parallax method for a head borne camera system having a video source and a display device, where the display device is configured for placement directly in front of a user's eye as a vision aid, and the video source is configured for displacement to a side of the user's eye. The method includes the steps of: (a) imaging an object, by the video source, to provide video data; (b) determining a focus distance to an object;
(c) offsetting 20 the video data to form offset video data based on the focus distance determined in step (b) and a displacement distance between the user's eye and an aperture of the video source;
and (d) displaying the offset video data by the display device.
21682208.1 It is understood that the foregoing general description and the following detailed description are exemplary, but are not restrictive, of the invention.
BRIEF DESCRIPTION OF THE DRAWING
The invention is best understood from the following detailed description when read in connection with the accompanying drawings. Included in the drawing are the following figures:
FIG. 1 depicts a geometry of a parallax offset between an object as imaged by a camera and the same object as seen in object space by a viewer;
FIG. 2 is a block diagram of a system for dynamically correcting parallax in a head borne video system, in accordance with an embodiment of the present invention;
FIG. 3A is a top view of an object as viewed by a user and imaged by a video camera, where a display of the image is displaced from the aperture of the camera by a horizontal displacement distance;
FIG. 3B is a side view of an object as viewed by a user and imaged by a video camera, where a display of the image is displaced from the aperture of the camera by a vertical displacement distance;
FIG. 4 is a plot of the number of columns required to be shifted on a display as a function of viewing distance to the object-of-interest, in accordance with an embodiment of the present invention; and

FIG. 5 is a plot of the number of columns required to be shifted on a display as a function of viewing distance to the object-of-interest, with a bias angle introduced in the imaging angle of the camera, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
As will be explained, the present invention dynamically realigns the video image so that the image coincides with the real world at all distances. To do this, the present invention determines the range to the object of interest, so that dynamic alignment may be accomplished based on the determined range. In one embodiment, the invention uses an absolute position of the camera's focus mechanism (or angular orientation of a manual focus knob) to determine the distance to the user's object-of-interest and then applies an appropriate amount of parallax correction to the image shown on the user's display. In this manner, the apparent location of an object-of-interest is correctly perceived at its true position in object space.
In one embodiment of the invention, the video is provided to the user on a digital display device, such as an LCD or LED display. These displays consist of an array of rows and columns of pixels. By controlling the timing of the video data sent to the display, the present invention induces an offset in the image as the image is displayed to the user.
By shifting the image in display space, the present invention removes the disparity between the apparent position of an object and its actual position in object space.
A consequence of shifting the image on the display is lost rows and/or columns of pixels in the direction of the image shift. Rows and/or columns of pixels on the opposite edges of the display show arbitrary intensity values, because (assuming a one-to-one relationship in pixel resolution between the camera and the display) these pixels are no longer within the field-of-view of the camera and, therefore, do not provide image data.
Thus, shifting the image introduces a reduction in the effective user's field-of-view, because of the reduced usable image size. This negative effect may be minimized, however, by setting the camera pointing angle for convergence at a distance much closer than the far field.
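The lost-edge behavior described above can be sketched with a plain 2-D array standing in for the display's pixel buffer. This is an illustrative sketch only; the function name and the zero-fill for vacated columns are assumptions (on a real display those pixels would simply show arbitrary intensity values):

```python
def shift_columns(image, offset):
    """Shift each row of a 2-D pixel array by `offset` columns.

    A positive offset shifts the image toward the right edge: the
    rightmost `offset` columns fall off the display and are lost, and
    the vacated columns on the left carry no camera data (filled with
    zeros here). A negative offset shifts left instead.
    """
    width = len(image[0])
    shifted = []
    for row in image:
        if offset >= 0:
            shifted.append([0] * offset + row[:width - offset])
        else:
            shifted.append(row[-offset:] + [0] * (-offset))
    return shifted

frame = [[1, 2, 3, 4],
         [5, 6, 7, 8]]
print(shift_columns(frame, 1))   # [[0, 1, 2, 3], [0, 5, 6, 7]]
```

In either direction the usable field-of-view shrinks by `offset` columns, which is the reduction in effective image size discussed above.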
Referring next to FIG. 2, there is shown a system for dynamically correcting parallax in a head borne video system, generally designated as 20. System 20 includes video source 23 providing video data to display electronics module 24, the latter forming digital pixel data for viewing on display device 25. Also included in system 20 is a focus position encoder, designated as 21, for providing focus position data to microcontroller 22.
The focus position encoder 21 encodes, as shown, the orientation of focus knob 26 disposed on video source 23. Microcontroller 22 converts the focus position data received from the position encoder 21 into X,Y offset control signals, as will be explained later. The X,Y offset control signals are provided to display electronics 24 which, in turn, provides the offset video data for viewing on display device 25.
It will be appreciated that video source 23 may be any camera device configured to be placed on the side of the optical axis of a user's eye. In the embodiment shown in FIG. 2, video source 23 includes manual focus knob 26 which allows the user to adjust the lens of the video camera to focus on an object-of-interest. Display device 25 may be any display which is configured to be placed about the optical axis of the user's eye. The display device provides an offset pixel image of the image represented by the video data received from video source 23. The X,Y array of pixels displayed on display device 25 and the video data provided by video source 23 may have a one-to-one correspondence, or may have any other relationship, such as a correspondence resulting from a reduced resolution display versus a high resolution video camera.
As another embodiment, focus knob 26 may be controlled by a motor (not shown) to allow for a zoom lens operation of video source 23. In this embodiment, focus position encoder 21 may determine the focal length to an object-of-interest by including a zoom lens barrel. A focal length detecting circuit may be included to detect and output the focal length of the zoom lens barrel. As a further embodiment, video source 23 may include a range finder, such as an infrared range finder, which may focus an infrared beam onto a target and receive a reflected infrared beam from the target. A position sensitive device included in focus position encoder 21 may detect the displacement of the reflected beam and provide an encoded signal of the range, or position of the target.
The microcontroller may be any type of controller having a processor execution capability provided by a software program stored in a medium, or a hardwired program provided by an integrated circuit. The manner in which microcontroller 22 computes the X,Y offset control signals is described next.
Referring to FIGS. 3A and 3B, camera 23 is shown offset by a displacement distance from a user's eye 32. FIGS. 3A and 3B are similar to each other, except that camera 23 is oriented to a horizontal, right side of a user's eye 32 by a horizontal displacement distance of d in FIG. 3A, whereas it is oriented to a vertical side of (above or below) the user's eye by a vertical displacement distance of d' in FIG. 3B. The horizontal displacement distance and/or the vertical displacement distance is typically in the vicinity of 100 millimeters. The camera 23 has an optical axis designated as 37 and the user's eye has an optical axis designated as 35. Both optical axes are shown parallel to each other.
The user is aided in the viewing of object 31 by way of display device 25. As shown in FIG. 3A, camera 23 is imaging object 31 at a horizontal offset angle of θD. In FIG. 3B, however, camera 23 is imaging object 31 at a vertical offset angle of ΦD. In both figures, object 31 is displayed as a pixel image on display device 25 for viewing by the user.
The focal distance, which may be adjustable, is the distance D between the user's eye and the object-of-interest 31.
Using FIG. 3A, a method for calculating the X offset control signal by microcontroller 22 is exemplified below. In this example, the units of the X offset are in horizontal pixels, which may be equivalent to columns of pixels on video display 25. For the purpose of this example, it is assumed that: the horizontal displacement distance d is 103 mm; the field-of-view (FOV) of camera 23 is 40 degrees along the horizontal (HFOV) axis; the horizontal resolution of display device 25 is 1280 pixels; the optical axis of camera 23 is parallel to the optical axis of the unaided eye 32; the aperture of the camera is on the viewer's Frankfort plane, in line with the unaided eye; and the object-of-interest 31 is at a focal distance of D.

The horizontal offset angle θD is given by equation (1) as follows:

θD = tan⁻¹(d/D) (Eq. 1)

The correction factor C_horz (for a 40 degree FOV and a 1280 pixel horizontal display resolution) is given by equation (2), in units of columns per degree, as follows:

C_horz = #Columns/FOV_horz = 1280/40 = 32 columns/degree (Eq. 2)

Here, #Columns is the total number of columns in the digital display, or 1280 columns (in this example). The image shift on the display device, or the amount of offset-in-columns, is given by equation (3) below, where θD is the horizontal offset angle between the camera's line of sight 36 and the camera's optical axis 37:
offset_columns = C_horz * θD (Eq. 3)

In a similar manner, using FIG. 3B, a method for calculating the Y offset control signal by microcontroller 22 is exemplified below. In this example, the units of the Y offset are in vertical pixels, which may be equivalent to rows of pixels on video display 25. For the purpose of this example, it is assumed that: the vertical displacement distance d' is 103 mm; the field-of-view (FOV) of camera 23 is 30 degrees along the vertical (VFOV) axis; the vertical resolution of display device 25 is 1024 pixels; the optical axis of camera 23 is parallel to the optical axis of the unaided eye 32; the aperture of the camera is in a vertical line with the unaided eye; and the object-of-interest 31 is at a focal distance of D.

The vertical offset angle ΦD is given by equation (4) as follows:

ΦD = tan⁻¹(d'/D) (Eq. 4)

The correction factor C_vert (for a 30 degree vertical FOV and a 1024 pixel vertical display resolution) is given by equation (5), in units of rows per degree, as follows:

C_vert = #Rows/FOV_vert = 1024/30 ≈ 34 rows/degree (Eq. 5)

Here, #Rows is the total number of rows in the digital display, or 1024 rows (in this example). The image shift on the display device, or the amount of offset-in-rows, is given by equation (6) below, where ΦD is the vertical offset angle between the camera's line of sight 36 and the camera's optical axis 37:
offset_rows = C_vert * ΦD (Eq. 6)

Referring next to FIG. 4, there is shown a plot of the offset-in-#columns versus the distance between the observer (the user's eye) and the observed object (object-of-interest). More specifically, FIG. 4 plots the horizontal image offset, in number-of-columns, required to compensate for the parallax induced by a 103 mm horizontal displacement between an observer and the video camera. For a camera located to the right of the aided eye, the parallax correcting image shift in the display is towards the right.
The plot shown in FIG. 4 is for a camera/HMD system with a matched HFOV of 40 degrees. As can be seen, the amount of image shift required to remove the parallax increases nonlinearly as the observer focuses to shorter and shorter distances. At a focus distance of 2 feet, 25% of the viewable area of an SXGA high resolution display will be shifted out of view, thereby reducing the effective display HFOV by approximately 25%.
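Equations (1) through (3) can be checked numerically. The sketch below (Python, using the 103 mm / 40-degree / 1280-column assumptions of this example) reproduces the roughly one-quarter loss of viewable columns at a 2-foot focus distance:

```python
import math

D_MM = 103.0      # horizontal eye-to-aperture displacement d
HFOV_DEG = 40.0   # horizontal field-of-view of the camera
COLUMNS = 1280    # horizontal resolution of the display

def offset_columns(distance_mm):
    """Columns of image shift needed at focus distance D (Eq. 1-3)."""
    theta_d = math.degrees(math.atan(D_MM / distance_mm))  # Eq. 1
    c_horz = COLUMNS / HFOV_DEG                            # Eq. 2: 32 col/deg
    return c_horz * theta_d                                # Eq. 3

cols = offset_columns(2 * 304.8)  # 2 feet, in millimeters
print(round(cols), round(100 * cols / COLUMNS))  # 307 24  (~24% of the HFOV)
```

At 2 feet the required shift is about 307 of 1280 columns, consistent with the "approximately 25%" figure quoted for FIG. 4; the nonlinearity comes from the arctangent in equation (1).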
To avoid the loss of HFOV at close focus distances, the optical axis of the camera may be biased to the left, thereby reducing the horizontal offset angle θD.
A similar plot to the plot shown in FIG. 4 may be made for an offset-in-#rows versus the distance between the observer (the user's eye) and the observed object (object-of-interest).
Lastly, FIG. 5 shows a resulting horizontal image offset in #columns with the same assumptions as those made for FIG. 4, except that a bias angle of 4.8 degrees has been introduced. At this camera angle, the display offset required to remove parallax is reduced to zero at 4 feet. At 2 feet, the required offset is 152 columns, or 12% of the HFOV, as compared to 24% of the HFOV in FIG. 4. Beyond a distance of 4 feet, the display offset becomes negative, which means that the video image must be shifted toward the opposite edge, or end of the display. This camera angle thus introduces a parallax error with an opposite sign. For a focal distance of 10 feet, the horizontal display offset required to compensate for parallax is -93 columns, or 7.2% of the HFOV. At 40 feet distance, the horizontal display offset is 139 columns, or 11% of the HFOV.
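The FIG. 5 values can be reproduced by subtracting the camera bias angle from the offset angle θD before applying the columns-per-degree factor. The sketch below assumes the bias is chosen to zero the offset at exactly 4 feet (about 4.83 degrees, quoted as 4.8 degrees in the text):

```python
import math

D_MM = 103.0                 # horizontal displacement d
HFOV_DEG = 40.0
COLUMNS = 1280
FOUR_FEET_MM = 4 * 304.8

# Bias the camera axis so the parallax offset is zero at 4 feet (~4.83 deg).
BIAS_DEG = math.degrees(math.atan(D_MM / FOUR_FEET_MM))

def biased_offset_columns(distance_mm):
    """Display offset (in columns) with the camera axis biased left."""
    theta_d = math.degrees(math.atan(D_MM / distance_mm))
    return round((theta_d - BIAS_DEG) * COLUMNS / HFOV_DEG)

for feet in (2, 4, 10, 40):
    print(feet, biased_offset_columns(feet * 304.8))
# 2 -> 152, 4 -> 0, 10 -> -93, 40 -> -139 (negative: shift toward opposite edge)
```

These match the 152-column, -93-column and 139-column (magnitude) values quoted above, confirming that the bias simply shifts where the offset-versus-distance curve crosses zero.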
The embodiments described above may be used by any head borne camera system, including a head mounted night vision goggle and a head mounted reality mediator device.
Although the invention is illustrated and described herein with reference to specific embodiments, the invention is not intended to be limited to the details shown.
Rather, various modifications may be made in the details within the scope and range of equivalents of the claims and without departing from the invention.
Claims (20)
1. A dynamically corrected parallax system comprising a head borne video source for imaging an object and providing video data, a controller for electronically offsetting the video data provided from the head borne video source to form offset video data, and a display device for receiving the offset video data and displaying the offset video data to a user's eye, wherein the display device is configured for placement directly in front of the user's eye as a vision aid, and the head borne video source is configured for displacement to a side of the user's eye, and the offset video data corrects parallax due to displacement between the display device and the head borne video source.
2. The dynamically corrected parallax system of claim 1 wherein the display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes an offset of a number of columns of pixels in the X direction of the X,Y array.
3. The dynamically corrected parallax system of claim 1 wherein the display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes an offset of a number of rows of pixels in the Y direction of the X,Y array.
4. The dynamically corrected parallax system of claim 1 wherein the display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes an offset of a number of columns of pixels in the X direction of the X,Y array and another offset of a number of rows of pixels in the Y direction of the X,Y array.
5. The dynamically corrected parallax system of claim 1 wherein an optical axis of the user's eye extends a distance of D to an object imaged by the video source, an optical axis of an aperture of the video source extends in a direction parallel to the optical axis of the user's eye, the displacement to a side is a horizontal displacement distance of d in a Frankfort plane between the optical axis of the user's eye and the optical axis of the aperture of the video source, and the offset video data is based on the horizontal displacement distance d and the distance D to the object.
6. The dynamically corrected parallax system of claim 5 wherein a horizontal offset angle .theta.D is formed, as follows:
.theta.D = tan-1 d/D, where d is a horizontal displacement distance between the optical axis of the user's eye and the optical axis of the aperture of the video source, and the display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes the following horizontal offset:
offset columns = #Columns/FOV horz*.theta.D
where offset columns is the amount of horizontal offset in columns, FOV horz is the horizontal field-of-view of the video source, and #Columns is the total number of columns in the display device.
7. The dynamically corrected parallax system of claim 5 wherein a vertical offset angle .PHI.D is formed, as follows:
.PHI.D = tan-1 d'/D, where d' is a vertical displacement distance between the optical axis of the user's eye and the optical axis of the aperture of the video source; and the display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes the following vertical offset:
offset rows = #Rows/FOV vert*.PHI.D
where offset rows is the amount of vertical offset in rows, FOV vert is the vertical field-of-view of the video source, and #Rows is the total number of rows in the display device.
8. The dynamically corrected parallax system of claim 1 including a display electronics module disposed between the video source and the display device for converting the video data from the video source into digital video data, wherein the display electronics module is configured to receive an offset command from the controller and modify the digital video data into the offset video data.
9. The dynamically corrected parallax system of claim 8 wherein the display electronics module and the controller are integrated in a single unit.
10. The dynamically corrected parallax system of claim 1 including a focus position encoder coupled to the controller for determining a distance D to an object imaged by the video source, wherein the distance D is used to correct the parallax.
11. The dynamically corrected parallax system of claim 1 wherein the display device is a helmet mounted display (HMD).
12. The dynamically corrected parallax system of claim 1 wherein the display device and the video source are part of a head mounted night vision goggle.
13. In a head borne camera system having a video source and a display device, wherein the display device is configured for placement directly in front of a user's eye as a vision aid, and the video source is configured for displacement to a side of the user's eye, a method of dynamically correcting parallax comprising the steps of:
(a) imaging an object, by the video source, to provide video data;
(b) determining a focus distance to an object;
(c) offsetting the video data to form offset video data based on the focus distance determined in step (b) and a displacement distance between the user's eye and an aperture of the video source; and (d) displaying the offset video data by the display device;
wherein offsetting the video data corrects the parallax.
14. The method of claim 13 wherein the video data is used to form an X,Y array of respective columns and rows of pixels, and step (c) includes moving the video data by a number of columns of pixels in the X direction of the X,Y array to form the offset video data.
15. The method of claim 13 wherein the video data is used to form an X,Y array of respective columns and rows of pixels, and step (c) includes moving the video data by a number of rows of pixels in the Y direction of the X,Y array to form the offset video data.
16. The method of claim 13 wherein the video data is used to form an X,Y array of respective columns and rows of pixels, and step (c) includes moving the video data by a number of columns of pixels in the X direction and a number of rows of pixels in the Y direction of the X,Y array to form the offset video data.
17. The method of claim 13 wherein step (a) includes providing analog video data, and step (c) includes converting the analog video data into digital video data prior to offsetting the video data.
18. The method of claim 13 wherein the video source is configured for displacement to a right side of the user's eye, and step (c) includes shifting an image of the object to the right side of the display device.
19. The method of claim 18 including the step of:
biasing the aperture of the video source toward an optical axis of the user's eye to minimize an amount of offset produced in step (c).
20. The method of claim 13 wherein step (b) includes encoding an angular orientation of a focus knob disposed on the video source to determine the focus distance to the object.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/545,644 | 2006-10-10 | ||
US11/545,644 US8130261B2 (en) | 2006-10-10 | 2006-10-10 | System and method for dynamically correcting parallax in head borne video systems |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2604513A1 true CA2604513A1 (en) | 2008-04-10 |
Family
ID=39032336
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002604513A Abandoned CA2604513A1 (en) | 2006-10-10 | 2007-09-27 | System and method for dynamically correcting parallax in head borne video systems |
Country Status (7)
Country | Link |
---|---|
US (1) | US8130261B2 (en) |
EP (1) | EP1912392A2 (en) |
JP (1) | JP5160852B2 (en) |
CN (1) | CN101163236B (en) |
AU (1) | AU2007219287B2 (en) |
CA (1) | CA2604513A1 (en) |
IL (1) | IL186235A (en) |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8648897B2 (en) * | 2006-10-10 | 2014-02-11 | Exelis, Inc. | System and method for dynamically enhancing depth perception in head borne video systems |
EP2237504B1 (en) | 2009-03-30 | 2013-07-17 | The Boeing Company | Mobile AD HOC network |
JP4875127B2 (en) * | 2009-09-28 | 2012-02-15 | パナソニック株式会社 | 3D image processing device |
HU0900696D0 (en) * | 2009-11-05 | 2009-12-28 | Holakovszky Laszlo | Binocular display device |
CA3043204C (en) * | 2009-11-19 | 2021-08-31 | Esight Corp. | Apparatus and method for a dynamic "region of interest" in a display system |
CN102098442B (en) * | 2010-12-24 | 2012-09-19 | 中国科学院长春光学精密机械与物理研究所 | Method and system for calibrating non-overlap ratio of optical axis and visual axis of zoom camera |
CN104115491A (en) * | 2012-02-22 | 2014-10-22 | 索尼公司 | Display apparatus, image processing apparatus, image processing method, and computer program |
CN103369212B (en) * | 2012-03-28 | 2018-06-05 | 联想(北京)有限公司 | A kind of image-pickup method and equipment |
US9091849B2 (en) | 2012-08-31 | 2015-07-28 | Industrial Technology Research Institute | Wearable display and adjusting method thereof |
US9998687B2 (en) * | 2012-09-12 | 2018-06-12 | Bae Systems Information And Electronic Systems Integration Inc. | Face mounted extreme environment thermal sensor system |
CN102905076B (en) * | 2012-11-12 | 2016-08-24 | 深圳市维尚境界显示技术有限公司 | The device of a kind of 3D stereoscopic shooting Based Intelligent Control, system and method |
KR20140090552A (en) * | 2013-01-09 | 2014-07-17 | 엘지전자 주식회사 | Head Mounted Display and controlling method for eye-gaze calibration |
US9619021B2 (en) | 2013-01-09 | 2017-04-11 | Lg Electronics Inc. | Head mounted display providing eye gaze calibration and control method thereof |
JP6314339B2 (en) * | 2014-01-16 | 2018-04-25 | コニカミノルタ株式会社 | Eyeglass type display device |
JP6347067B2 (en) * | 2014-01-16 | 2018-06-27 | コニカミノルタ株式会社 | Eyeglass type display device |
CN103901622B (en) * | 2014-04-23 | 2016-05-25 | 成都理想境界科技有限公司 | 3D wears viewing equipment and corresponding video player |
WO2016088227A1 (en) * | 2014-12-03 | 2016-06-09 | 日立マクセル株式会社 | Video display device and method |
US10162412B2 (en) | 2015-03-27 | 2018-12-25 | Seiko Epson Corporation | Display, control method of display, and program |
WO2018010040A1 (en) * | 2016-07-11 | 2018-01-18 | 王民良 | Image reality augmentation method and surgical guide of applying same to wearable glasses |
JP2018148257A (en) * | 2017-03-01 | 2018-09-20 | セイコーエプソン株式会社 | Head mounted display device and control method for the same |
US11150475B2 (en) | 2017-04-12 | 2021-10-19 | Hewlett-Packard Development Company, L.P. | Transfer to head mounted display |
CN107346175B (en) * | 2017-06-30 | 2020-08-25 | 联想(北京)有限公司 | Gesture position correction method and augmented reality display device |
US11290942B2 (en) | 2020-08-07 | 2022-03-29 | Rockwell Collins, Inc. | System and method for independent dominating set (IDS) based routing in mobile AD hoc networks (MANET) |
US11737121B2 (en) | 2021-08-20 | 2023-08-22 | Rockwell Collins, Inc. | System and method to compile and distribute spatial awareness information for network |
US11296966B2 (en) | 2019-11-27 | 2022-04-05 | Rockwell Collins, Inc. | System and method for efficient information collection and distribution (EICD) via independent dominating sets |
US11726162B2 (en) | 2021-04-16 | 2023-08-15 | Rockwell Collins, Inc. | System and method for neighbor direction and relative velocity determination via doppler nulling techniques |
US11665658B1 (en) | 2021-04-16 | 2023-05-30 | Rockwell Collins, Inc. | System and method for application of doppler corrections for time synchronized transmitter and receiver |
TWI741536B (en) * | 2020-03-20 | 2021-10-01 | 台灣骨王生技股份有限公司 | Surgical navigation image imaging method based on mixed reality |
CN111866493B (en) * | 2020-06-09 | 2022-01-28 | 青岛小鸟看看科技有限公司 | Image correction method, device and equipment based on head-mounted display equipment |
US11037359B1 (en) * | 2020-06-24 | 2021-06-15 | Microsoft Technology Licensing, Llc | Real-time rendering stylized passthrough images |
US11646962B1 (en) | 2020-10-23 | 2023-05-09 | Rockwell Collins, Inc. | Zero overhead efficient flooding (ZOEF) oriented hybrid any-cast routing for mobile ad hoc networks (MANET) |
US11481960B2 (en) * | 2020-12-30 | 2022-10-25 | Meta Platforms Technologies, Llc | Systems and methods for generating stabilized images of a real environment in artificial reality |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4398799A (en) | 1980-03-04 | 1983-08-16 | Pilkington P.E. Limited | Head-up displays |
US6590573B1 (en) * | 1983-05-09 | 2003-07-08 | David Michael Geshwind | Interactive computer system for creating three-dimensional image information and for converting two-dimensional image information for three-dimensional display systems |
GB2189365A (en) | 1986-03-20 | 1987-10-21 | Rank Xerox Ltd | Imaging apparatus |
US4924247A (en) | 1988-03-11 | 1990-05-08 | Asahi Kogaku Kogyo Kabushiki Kaisha | Apparatus and method for correcting and adjusting parallax in electronic camera |
US5173726A (en) | 1991-06-04 | 1992-12-22 | Eastman Kodak Company | Automatic parallax correction in deploying lens camera |
US5500671A (en) | 1994-10-25 | 1996-03-19 | At&T Corp. | Video conference system and method of providing parallax correction and a sense of presence |
JPH09269528A (en) | 1996-01-31 | 1997-10-14 | Canon Inc | Camera with parallax correction function |
US5787313A (en) | 1997-02-20 | 1998-07-28 | Eastman Kodak Company | Hybrid camera including viewfinder with masks for parallax correction and image format indication |
CA2233047C (en) * | 1998-02-02 | 2000-09-26 | Steve Mann | Wearable camera system with viewfinder means |
JP3261115B2 (en) * | 1999-09-22 | 2002-02-25 | 富士重工業株式会社 | Stereo image processing device |
US6560029B1 (en) * | 2001-12-21 | 2003-05-06 | Itt Manufacturing Enterprises, Inc. | Video enhanced night vision goggle |
JP2004289548A (en) * | 2003-03-24 | 2004-10-14 | Olympus Corp | Image adjuster and head-mounted display device |
JP4388790B2 (en) * | 2003-11-06 | 2009-12-24 | 富士フイルム株式会社 | Endoscope light source device socket |
CN2665738Y (en) | 2003-11-28 | 2004-12-22 | 王小光 | Multipurpose image forming apparatus |
US7225548B2 (en) * | 2004-05-17 | 2007-06-05 | Sr2 Group, Llc | System and method for aligning multiple sighting devices |
CN1721915A (en) * | 2004-07-14 | 2006-01-18 | 文化传信科技(澳门)有限公司 | Image display system and method thereof |
WO2006060746A2 (en) * | 2004-12-03 | 2006-06-08 | Infrared Solutions, Inc. | Visible light and ir combined image camera with a laser pointer |
JP2006208451A (en) * | 2005-01-25 | 2006-08-10 | Konica Minolta Photo Imaging Inc | Video display device |
US20060250322A1 (en) * | 2005-05-09 | 2006-11-09 | Optics 1, Inc. | Dynamic vergence and focus control for head-mounted displays |
CN1716081A (en) * | 2005-07-22 | 2006-01-04 | 中国科学院上海光学精密机械研究所 | Head carried type self lighting video photographic system |
- 2006-10-10 US US11/545,644 patent/US8130261B2/en active Active
- 2007-09-24 IL IL186235A patent/IL186235A/en active IP Right Grant
- 2007-09-25 AU AU2007219287A patent/AU2007219287B2/en not_active Ceased
- 2007-09-27 CA CA002604513A patent/CA2604513A1/en not_active Abandoned
- 2007-09-28 EP EP07117578A patent/EP1912392A2/en not_active Withdrawn
- 2007-10-09 JP JP2007263328A patent/JP5160852B2/en not_active Expired - Fee Related
- 2007-10-09 CN CN2007101809367A patent/CN101163236B/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
AU2007219287B2 (en) | 2013-08-01 |
CN101163236A (en) | 2008-04-16 |
JP5160852B2 (en) | 2013-03-13 |
JP2008134616A (en) | 2008-06-12 |
IL186235A0 (en) | 2008-01-20 |
AU2007219287A1 (en) | 2008-04-24 |
EP1912392A2 (en) | 2008-04-16 |
US8130261B2 (en) | 2012-03-06 |
IL186235A (en) | 2013-01-31 |
CN101163236B (en) | 2013-05-08 |
US20080084472A1 (en) | 2008-04-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8130261B2 (en) | System and method for dynamically correcting parallax in head borne video systems | |
US8648897B2 (en) | System and method for dynamically enhancing depth perception in head borne video systems | |
US11223820B2 (en) | Augmented reality displays with active alignment and corresponding methods | |
US10869024B2 (en) | Augmented reality displays with active alignment and corresponding methods | |
JP6248931B2 (en) | Image display device and image display method | |
US8330846B2 (en) | Image pickup apparatus | |
EP0618471B1 (en) | Image display apparatus | |
KR101446767B1 (en) | Stereoscopic imaging device | |
US10382699B2 (en) | Imaging system and method of producing images for display apparatus | |
US10187634B2 (en) | Near-eye display system including a modulation stack | |
JP2009017207A (en) | Stereoscopic television system and stereoscopic television receiver | |
US20120313936A1 (en) | Stereoscopic display system and stereoscopic glasses | |
US20090059364A1 (en) | Systems and methods for electronic and virtual ocular devices | |
KR101690646B1 (en) | Camera driving device and method for see-through displaying | |
US20230035023A1 (en) | Aerial image display device | |
CN109997067B (en) | Display apparatus and method using portable electronic device | |
CN110088662B (en) | Imaging system and method for generating background image and focusing image | |
JP2019106723A (en) | Display device and display method using context display and projector | |
CN113973199A (en) | Light-transmitting display system and image output method and processing device thereof | |
JP4227187B2 (en) | 3D image viewing glasses | |
US20220400247A1 (en) | Augmented reality display device and method | |
US20220252883A1 (en) | Head mounted display apparatus | |
CN101533156A (en) | Head-mounted stereoscopic display manufactured by utilizing single chip display device | |
KR20160081715A (en) | 2d/3d switchable autostereoscopic display device and dirving method thereof | |
CN117499613A (en) | Method for preventing 3D dizziness for tripod head device and tripod head device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request | ||
FZDE | Discontinued | Effective date: 20160408 |