WO2007130539A2 - Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods - Google Patents

Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods

Info

Publication number
WO2007130539A2
Authority
WO
WIPO (PCT)
Prior art keywords
reference markers, display, processor, gis, markers
Application number
PCT/US2007/010774
Other languages
French (fr)
Other versions
WO2007130539A3 (en
Inventor
Guillermo E. Gutierrez
Timothy B. Faulkner
Original Assignee
Harris Corporation
Application filed by Harris Corporation
Priority to BRPI0711291-2A (BRPI0711291A2)
Priority to JP2009509723A (JP2009535734A)
Priority to EP07794526A (EP2024961A4)
Priority to CA002651318A (CA2651318A1)
Publication of WO2007130539A2
Publication of WO2007130539A3

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 - Geographic models
    • G06T 19/00 - Manipulating 3D models or images for computer graphics


Abstract

A geographic information system (GIS) (20) may include a display (21), a GIS database (22), and a processor (23). The processor (23) may cooperate with the display (21) and the GIS database (22) to display a three-dimensional (3D) geospatial image including a plurality of spaced-apart reference markers (30a-30l) therein. The reference markers (30a-30l) may have different visual characteristics indicative of different relative positions within the 3D geospatial image. The processor (23) may also associate with each reference marker (30a-30l) selectively displayable position data. The reference markers (30a-30l) may have different sizes and/or colors, for example.

Description

GEOGRAPHIC INFORMATION SYSTEM (GIS) FOR DISPLAYING 3D GEOSPATIAL IMAGES WITH REFERENCE MARKERS AND RELATED METHODS
The present invention relates to the field of image processing systems, and, more particularly, to geographic information systems (GIS) and related methods.
In certain applications it is desirable to provide digital representations of three-dimensional (3D) objects or images. By way of example, such applications may include mapping programs (e.g., Google Earth), architectural design applications (e.g., Pro/E, CATIA), digital design and modeling tools (e.g., Maya, 3D Studio Max), and three-dimensional visualization analysis tools.
One challenge of displaying and interacting with digital 3D images on a computer is that this is traditionally accomplished using two-dimensional (2D) interaction mechanisms. More particularly, in 3D application domains, a 3D object is typically created/edited using only 2D input/output devices such as a monitor or display, mouse, keyboard, and/or joystick. This is usually done in one of two ways. The first way is to create or place a 3D object in the scene, which can be a cumbersome multi-step process. The object is first created or placed in a two-dimensional plane and then manipulated in the third dimension. While multiple points of view are often displayed simultaneously, the process may still be relatively unintuitive to the user.
In accordance with another approach, objects can be natively placed directly in 3D space, but usually only relative to a pre-existing 3D object which already has a spatial context in the current coordinate system. One example of an application which allows objects to be natively placed in a 3D space is the InReality™ sitemodel viewer from the present Assignee, Harris Corp. InReality™ also provides sophisticated interaction within a 3D virtual scene, allowing users to easily move through a geospatially accurate virtual environment with the capability of immersion at any location within a scene.
Various approaches have been developed for arranging or placing graphical objects on a display. One example of a 2D arrangement for object placement on windows is disclosed in U.S. Patent No. 5,883,625 to Crawford et al. This patent is directed to a system and method for automatically arranging objects inside a container of a graphical user interface (GUI). Selectable grid styles are provided for arranging cells into different configurations inside the container. The cells may be placed in different grid styles, such as rectangular, rhombus-shaped, or circular. Furthermore, identifiers are used for placing objects such as icons or buttons in each cell and ordering the objects for other user applications.
While such approaches may be helpful for interacting with 2D images, they may not be of use for working with 3D images. While certain haptic (i.e., technology that interfaces with the user via the sense of touch) and inherently 3D input devices do exist which attempt to facilitate interaction with 3D data, such devices are typically expensive, require specialized hardware/software, have a substantial learning curve, and/or are not readily available. In view of the foregoing background, it is therefore an object of the present invention to provide a system and related methods for facilitating interaction with 3D data, such as 3D geospatial images, for example.
This and other objects, features, and advantages are provided by a geographic information system (GIS) which may include a display, a GIS database, and a processor. More particularly, the processor may cooperate with the display and the GIS database to display a three-dimensional (3D) geospatial image including a plurality of spaced-apart reference markers therein. The reference markers may have different visual characteristics indicative of different relative positions within the 3D geospatial image. The processor may also associate with each reference marker selectively displayable position data.
By way of example, the different visual characteristics may include different sizes and/or different colors. An input device may also be connected to the processor, and the processor may selectively display position data for a given reference marker based upon the input device. For example, in the case of a mouse, the processor may display the position data when a mouse cursor is moved to point at the given reference marker. By way of example, the selectively displayable position data may include latitude, longitude, and height coordinates.
The input device may also cooperate with the processor to draw a line between a pair of reference markers, and the processor may cooperate with the display to display a distance between the pair of reference markers based upon the line. In addition, the input device may further cooperate with the processor to select a given reference marker from among the plurality of reference markers. Also, the 3D geospatial image may include a ground surface below the given reference marker. As such, the processor may cooperate with the display to draw a vertical reference line between the ground surface and the given reference marker upon selection thereof. The reference markers may be semi-transparent geometric objects, such as semi-transparent spheres, for example. The processor may also cooperate with the display to selectively change the spacing between the reference markers based upon the input device. In some embodiments, the spacing between at least some of the reference markers may be non-uniform and/or non-linear. The input device may be used for selecting reference markers. As such, the 3D geospatial image may include at least one polygon, and the processor may determine an orientation of the at least one polygon based upon an order of selection of reference markers associated therewith.
Furthermore, the processor may selectively display the plurality of reference markers with the 3D geospatial image based upon the input device. For example, if the input device is a keyboard, the processor may display the reference markers when a given key(s) is depressed, and remove the reference markers from the display when the given key(s) is released.
A three-dimensional (3D) geospatial image display method aspect may include displaying the 3D geospatial image on a display with a plurality of spaced-apart reference markers therein. The reference markers may have different visual characteristics indicative of different relative positions within the 3D geospatial image. The method may further include associating with each reference marker selectively displayable position data.
FIG. 1 is a schematic block diagram of an exemplary geographic information system (GIS) in accordance with the invention.
FIG. 2 is a sample display of a 3D image with reference markers in accordance with the invention with selectively displayed position data.
FIG. 3 is a sample display of the 3D image of FIG. 2 displaying a distance between a pair of reference markers and with a different spacing between reference markers. FIG. 4 is a sample display of the 3D image of FIG. 2 displaying a vertical reference line from the ground surface in the image to a reference marker, and the associated height.
FIG. 5 is a sample display of the 3D image of FIG. 2 with an alternative embodiment of the reference markers having different colors to indicate different relative positions within the image.
FIG. 6 is a sample display of another 3D image including semi-transparent spherical reference markers in accordance with the invention.
FIG. 7 is a sample display illustrating a 3D geospatial image display method in accordance with the invention.
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout, and prime and multiple prime notation are used to indicate similar elements in alternate embodiments. Referring initially to FIG. 1, a geographic information system (GIS) 20 illustratively includes a display 21, a GIS (or other 3D image) database 22, and a processor 23 (e.g., a computer CPU). Moreover, input devices such as a mouse 24 and a keyboard 25 are connected to the processor 23 for allowing a user to interact with and manipulate data (e.g., image data) displayed on the display 21. Other input devices such as a joystick (not shown) may also be used, as will be appreciated by those skilled in the art.
Generally speaking, the processor 23 cooperates with the display 21 and the GIS database 22 to display a three-dimensional (3D) geospatial image stored in the GIS database, along with a plurality of spaced-apart reference markers 30a-30l therein. In FIGS. 2-5, the 3D image is simply a ground (e.g., terrain) surface or grid so that the reference markers 30a-30l are more easily identifiable. Moreover, the reference markers 30a-30l are spheres in these embodiments, but other geometric shapes or markers may also be used.
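As a concrete illustration of this kind of layout, the following Python sketch generates a regular grid of spaced-apart marker positions over a terrain surface. It is only a sketch under assumed interfaces; names such as `make_marker_grid` and the `terrain` callable are illustrative and do not come from the patent.

```python
# Hypothetical sketch: build a regular grid of reference-marker positions over a
# terrain surface. The terrain is modeled as a callable returning elevation for
# an (x, y) pair; all names here are illustrative, not from the patent.
from dataclasses import dataclass

@dataclass
class ReferenceMarker:
    x: float          # easting (scene units)
    y: float          # northing (scene units)
    z: float          # elevation of the marker above the terrain sample

def make_marker_grid(terrain, x_range, y_range, spacing, height_offset=2.0):
    """Return spaced-apart markers laid out on a regular grid."""
    markers = []
    x = x_range[0]
    while x <= x_range[1]:
        y = y_range[0]
        while y <= y_range[1]:
            markers.append(ReferenceMarker(x, y, terrain(x, y) + height_offset))
            y += spacing
        x += spacing
    return markers

if __name__ == "__main__":
    flat_terrain = lambda x, y: 0.0          # stand-in for a real terrain model
    grid = make_marker_grid(flat_terrain, (0, 10), (0, 10), spacing=5.0)
    print(len(grid), "markers")              # 3 x 3 = 9 markers
```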
The reference markers 30a-30l advantageously have different visual characteristics indicative of different relative positions within the 3D geospatial image to help users more readily distinguish the relative positions of object vertices, boundaries, elevations, etc., within an image. By way of example, in FIGS. 2-4 the different visual characteristics of the reference markers 30a-30l are their different relative sizes. For example, the reference marker 30a which is in the foreground is larger than the reference marker 30l in the background, which indicates to the user that the reference marker 30a is "closer" with respect to the particular angle at which the user is viewing the 3D image (i.e., closer from the user's vantage point).
Other visual characteristics besides size may be used to help visually indicate to a user the relative position of reference markers within an image. For example, in the alternative embodiment illustrated in FIG. 5, reference markers 30a'-30i' have different colors (illustrated by different grayscale shades) to indicate their relative positions within the image. In this example, the darker colored reference markers appear in the foreground, and as the markers get farther away from the user's vantage point their color becomes lighter, although other arrangements may also be used. In some embodiments, both color and size may be used to indicate relative positions within an image, as will be appreciated by those skilled in the art. Moreover, individual reference markers may be colorized based upon elevation from the ground surface 31 (in a geo-referenced context), or more generally, based upon a distance from a pre-defined point or surface. The processor 23 may also advantageously associate with each reference marker selectively displayable position data. Thus, in the case of a 3D geospatial image of a particular city or other location in which the points in the image are referenced to actual latitude, longitude, and/or height/elevation coordinates for the particular city, etc., the processor 23 will associate respective position data with each reference marker 30a-30l based upon its position within the image, as will be appreciated by those skilled in the art. Of course, for applications other than GIS (e.g., architectural design applications, digital design modeling tools, etc.), the position data may be referenced to a particular object in a scene based upon a scale, etc., as will be appreciated by those skilled in the art. In particular, the processor 23 may cause the display 21 to display the position data associated with a given reference marker 30 when the user selects the given reference marker. In the example illustrated in FIG. 2, the user has selected the reference marker 30a by moving a mouse cursor 32 to point thereto, which causes the processor to generate a pop-up window 33 displaying the latitude, longitude, and height/elevation coordinates associated with this particular reference marker. In other cases, selection could be performed by pressing a given mouse button or keyboard key, for example. Additionally, the given reference marker's current coordinates may be displayed and updated in real time as the density of the reference markers is changed, if desired, as will be discussed further below.
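The size and color cues just described, along with the selectively displayable position data, can be sketched as follows. This is a hypothetical illustration only; the perspective scaling rule, the gray ramp, and names such as `apparent_radius` and `position_tooltip` are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the visual cues: marker size scaled by distance from
# the viewer's vantage point, marker color mapped to elevation, and a tooltip
# string of selectively displayable position data for a hovered marker.
import math

def apparent_radius(marker_xyz, eye_xyz, base_radius=1.0):
    """Farther markers are drawn smaller, mimicking perspective depth cues."""
    d = math.dist(marker_xyz, eye_xyz)
    return base_radius / max(d, 1e-6)

def elevation_color(height_m, max_height_m=100.0):
    """Map elevation to a dark-to-light gray ramp (dark = near the ground)."""
    t = min(max(height_m / max_height_m, 0.0), 1.0)
    shade = int(255 * t)
    return (shade, shade, shade)              # RGB triple

def position_tooltip(lat, lon, height_m):
    """Selectively displayable position data for a hovered marker."""
    return f"lat {lat:.6f}, lon {lon:.6f}, height {height_m:.1f} m"

if __name__ == "__main__":
    print(apparent_radius((0, 0, 0), (0, 0, 10)))
    print(elevation_color(25.0))
    print(position_tooltip(28.5383, -81.3792, 5.0))
```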
The mouse 24 may also be used to draw a line 34 between a pair of reference markers 30a and 30g, as seen in FIG. 3. This may be done by simply selecting a first reference marker (here the reference marker 30a), such as by clicking a mouse button when the mouse pointer 32 is pointing thereto, and then dragging the line 34 to the second reference marker 30g and releasing the mouse button. Of course, other approaches for selecting and/or drawing lines between reference markers may also be used, as will be appreciated by those skilled in the art. The processor 23 may also display the pop-up window 33, which in this example displays a distance between the two reference markers (i.e., 2 m). This feature may be particularly beneficial for city planners, etc., who need to determine a distance from one point in a 3D scene (such as the top of one building) to another point (e.g., the top of another building), for example.
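A minimal sketch of the distance readout, assuming the two selected markers carry 3D scene coordinates, might look like this; the straight-line metric and the `marker_distance` name are illustrative.

```python
# Hypothetical sketch: once a line has been dragged between two reference
# markers, the distance shown in the pop-up can be the straight 3D distance
# between their scene coordinates.
import math

def marker_distance(a, b):
    """Straight-line 3D distance between two (x, y, z) marker positions."""
    return math.dist(a, b)

if __name__ == "__main__":
    top_of_building_a = (0.0, 0.0, 30.0)
    top_of_building_b = (40.0, 30.0, 30.0)
    print(f"{marker_distance(top_of_building_a, top_of_building_b):.1f} m")  # 50.0 m
```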
Yet another similar feature is that the mouse 24 (or keyboard 25 or other appropriate input devices) may be used to select a given reference marker 30a so that the processor 23 may cause a vertical reference line 35 to be drawn between the ground surface and the given reference marker upon selection thereof, as seen in FIG. 4. That is, the vertical reference line 35 provides a helpful reference for the user to determine where the ground surface 31 directly beneath the given reference marker 30a is located. In addition, the pop-up window may also be generated on the display 21 by the processor 23 with an indication of the distance between the ground surface 31 and the given reference marker 30a (here, 5 m).
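The vertical reference line and height readout can be sketched as follows, assuming access to a ground-elevation lookup for the point directly beneath the marker; the `ground_elevation_at` callable is a hypothetical stand-in for whatever elevation model the GIS uses.

```python
# Hypothetical sketch: when a marker is selected, drop a vertical line to the
# ground surface directly beneath it and report the height, as in FIG. 4.
def vertical_reference(marker_xyz, ground_elevation_at):
    """Return the line endpoints and the marker's height above the ground."""
    x, y, z = marker_xyz
    ground_z = ground_elevation_at(x, y)
    line = ((x, y, ground_z), (x, y, z))      # straight vertical segment
    return line, z - ground_z

if __name__ == "__main__":
    flat_ground = lambda x, y: 0.0            # stand-in for a real elevation model
    line, height = vertical_reference((10.0, 20.0, 5.0), flat_ground)
    print(line, f"height {height:.1f} m")     # matches the 5 m example above
```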
The reference markers may be semi-transparent geometric objects, such as semi-transparent spheres 30'', for example, as shown in FIG. 6. In particular, the spheres 30'' in the illustrated example delineate points on an object 40, which could be a building (i.e., a manmade structure), elevated terrain, etc. When a given reference marker intersects the object 40, the processor 23 may advantageously display only those portions of the given reference marker outside of the object, as shown, to further help the user appreciate the relative position and boundaries of the object while not obscuring the object itself. The processor 23 may also cause the display 21 to selectively change the spacing between the reference markers 30a-30l based upon one of the input devices. For example, the processor 23 may change the spacing (i.e., density) of the reference markers 30a-30l based upon a scroll wheel of the mouse 24, which may be done in combination with pressing a particular key (e.g., the CTRL key) on the keyboard 25. Thus, the user is able to quickly and conveniently change the spacing of the reference markers 30a-30l to suit the particular image or zoom level that the user is working with. Of course, the reference marker density may also be automatically updated as the user changes zoom level, if desired.
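One possible way to wire the scroll-wheel density control is sketched below with illustrative names; the 10% step per wheel notch and the CTRL gating are assumptions consistent with, but not mandated by, the description above.

```python
# Hypothetical sketch of the density control: a scroll-wheel event (gated on the
# CTRL key) changes the inter-marker spacing, after which the grid would be
# regenerated. Event and class names are illustrative, not from any GUI toolkit.
class MarkerGridController:
    def __init__(self, spacing=10.0, min_spacing=1.0, max_spacing=100.0):
        self.spacing = spacing
        self.min_spacing = min_spacing
        self.max_spacing = max_spacing

    def on_scroll(self, wheel_steps, ctrl_held):
        """Each wheel step tightens or loosens the grid by roughly 10%."""
        if not ctrl_held:
            return self.spacing                # leave spacing unchanged
        factor = 0.9 ** wheel_steps            # scroll up -> denser grid
        self.spacing = min(self.max_spacing,
                           max(self.min_spacing, self.spacing * factor))
        return self.spacing

if __name__ == "__main__":
    ctrl = MarkerGridController()
    print(ctrl.on_scroll(wheel_steps=3, ctrl_held=True))   # denser grid
    print(ctrl.on_scroll(wheel_steps=-3, ctrl_held=True))  # back toward original
```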
The processor 23 may also selectively display the reference markers 30a-30l with the 3D geospatial image, i.e., only display them when requested by the user. For example, this may be done based upon one of the input devices such as the keyboard 25. More particularly, a specific key(s) on the keyboard 25 may be assigned for causing the processor 23 to display the reference markers 30a-30l when pressed or held down by the user (e.g., the space bar), and then "hide" the reference markers when the user releases the designated key(s). Of course, other methods may be used for instructing the processor 23 to display the reference markers 30a-30l (as well as performing the various functions described above), such as drop-down menu items, buttons on a button bar, etc., as will be appreciated by those of skill in the art.
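A hold-to-show toggle of this kind might be sketched as follows; the handler names are illustrative and would be mapped onto whatever key-press/key-release callbacks the viewer's GUI toolkit provides.

```python
# Hypothetical sketch: show the reference markers only while a designated key
# (the space bar in the example above) is held down, and hide them on release.
class MarkerVisibilityToggle:
    def __init__(self, toggle_key="space"):
        self.toggle_key = toggle_key
        self.visible = False

    def on_key_press(self, key):
        if key == self.toggle_key:
            self.visible = True                # overlay the marker matrix

    def on_key_release(self, key):
        if key == self.toggle_key:
            self.visible = False               # hide the markers again

if __name__ == "__main__":
    toggle = MarkerVisibilityToggle()
    toggle.on_key_press("space");   print(toggle.visible)   # True
    toggle.on_key_release("space"); print(toggle.visible)   # False
```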
A three-dimensional (3D) geospatial image display method aspect will now be described with reference to FIG. 7. Beginning at Block 70, the method illustratively includes displaying a 3D geospatial image on the display 21 with a plurality of spaced-apart reference markers 30a-30l, at Block 72. As noted above, the reference markers 30a-30l preferably have different visual characteristics indicative of different relative positions within the 3D geospatial image (e.g., size, color, etc.).
The method may further include associating with each reference marker selectively displayable position data, at Block 74, as discussed further above. The processor 23 then cooperates with the mouse 24 and/or keyboard 25 to determine when a given reference marker 30 is selected, at Block 76. When this occurs, the processor 23 then performs the appropriate action, such as displaying the respective position data associated with the given reference marker 30, as noted above, at Block 78, thus concluding the illustrated method (Block 80).
The reference markers 30a-30l may be expanded to span an entire viewable scene (i.e., view frustum), or just portions thereof in different situations or implementations. Moreover, the reference markers 30a-30l may also advantageously be used to place pre-defined objects in the 3D scene, or to define entirely new objects by successively selecting markers, for example. Preferably the grid or matrix of reference markers 30a-30l will have a regular spacing by default. However, additional user or context-definable parameters may be used to automatically increase the sphere density in certain areas, causing the dynamic increasing and decreasing of the grid density to be non-uniform or even non-linear throughout the extent of the grid, as will be appreciated by those skilled in the art.
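One way to realize such non-uniform density is to let the local grid spacing shrink near a user-defined area of interest; the linear falloff below is purely illustrative, since the description only requires that the density may vary non-uniformly or non-linearly across the grid.

```python
# Hypothetical sketch of non-uniform grid density: spacing shrinks near an
# area of interest (AOI) and relaxes back to the default elsewhere.
import math

def local_spacing(point_xy, aoi_center_xy, base_spacing=10.0,
                  min_spacing=2.0, influence_radius=50.0):
    """Return the grid spacing to use at `point_xy`."""
    d = math.dist(point_xy, aoi_center_xy)
    if d >= influence_radius:
        return base_spacing
    t = d / influence_radius                   # 0 at the AOI center, 1 at the edge
    return min_spacing + t * (base_spacing - min_spacing)

if __name__ == "__main__":
    aoi = (100.0, 100.0)
    print(local_spacing((100.0, 100.0), aoi))  # densest at the area of interest
    print(local_spacing((100.0, 160.0), aoi))  # default spacing far away
```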
Operational details of one exemplary embodiment of the computer system 20 will now be described to provide still further understanding. The keyboard 25 spacebar brings in (i.e., overlays) the matrix of reference markers 30a-30l (i.e., spheres), which are appropriately sized to match the context, over the whole image scene or some portion thereof. The keyboard 25 and/or joystick may be used to move the camera view around in the scene. Further, a scroll wheel on the mouse 24 dynamically increases/decreases the matrix density (i.e., inter-sphere spacing). Optionally, the dynamic grid density adjustment does not necessarily need to be uniform or linear across the entire matrix/grid, as noted above. Each time the mouse pointer 32 moves over a selectable reference sphere, (a) if there is a ground surface portion below the sphere, a straight vertical reference line 35 is automatically drawn to the ground 31 to show exactly over what ground point that sphere lies, and (b) if the scene is within a GIS context (i.e., has an origin), the latitude/longitude/height coordinates of the given sphere are shown, preferably even if no ground exists below. Optionally, the spheres may be colorized based upon height/elevation or distance from a certain point (showing an appropriate color bar legend on the side of the scene). In addition, clicking on a given sphere may select it and optionally close out a polygon (or volprint, as discussed in U.S. Patent No. 6,915,310 to Gutierrez et al., which is assigned to the present Assignee and is hereby incorporated herein by reference in its entirety) if more than one sphere is selected. In a degenerate polygon case, two selected spheres make a line, as will be appreciated by those skilled in the art.
The above-described computer system 20 and methods may provide several advantages. For example, they may provide full 3D context relatively fast and with few operations required by a user, as well as providing a GIS (latitude/longitude/height) context for any 3D point in a scene. Furthermore, radial colorization may be provided based upon a distance from a point or object, or planar colorization based upon a distance from a surface (e.g., the ground). Other advantages may include dynamic density calibration, as well as non-uniformity in dynamic density calibration (i.e., areas of interest can be adjusted to have a higher density than the rest of the matrix). Moreover, polygon orientation (i.e., winding, in computer graphics terms, which is used to determine whether a polygon is front-facing or back-facing) may optionally be automatically deduced from the order in which the user selects the spheres.
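The winding deduction can be sketched as follows: compute a normal for the polygon formed by the spheres in selection order (Newell's method is one common choice, used here as an assumption rather than a requirement of the patent) and compare it with the view direction to classify the polygon as front-facing or back-facing.

```python
# Hypothetical sketch of deducing polygon orientation (winding) from the order
# in which the user selects the spheres, using Newell's method for the normal.
def newell_normal(vertices):
    """Approximate normal of a (possibly non-planar) polygon from ordered vertices."""
    nx = ny = nz = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1, z1 = vertices[i]
        x2, y2, z2 = vertices[(i + 1) % n]
        nx += (y1 - y2) * (z1 + z2)
        ny += (z1 - z2) * (x1 + x2)
        nz += (x1 - x2) * (y1 + y2)
    return (nx, ny, nz)

def is_front_facing(vertices, view_dir):
    """Front-facing if the winding's normal points back toward the viewer."""
    nx, ny, nz = newell_normal(vertices)
    dot = nx * view_dir[0] + ny * view_dir[1] + nz * view_dir[2]
    return dot < 0.0

if __name__ == "__main__":
    # Counter-clockwise selection in the x-y plane, viewed looking down -z.
    square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
    print(is_front_facing(square, view_dir=(0, 0, -1)))          # True
    print(is_front_facing(list(reversed(square)), (0, 0, -1)))   # False
```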

Claims

CLAIMS
1. A geographic information system (GIS) comprising: a display; a GIS database; and a processor cooperating with said display and said GIS database to display a three-dimensional (3D) geospatial image including a plurality of spaced-apart reference markers therein, said reference markers having different visual characteristics indicative of different relative positions within the 3D geospatial image; said processor also associating with each reference marker selectively displayable position data.
2. The GIS of Claim 1 wherein the different visual characteristics comprise different sizes.
3. The GIS of Claim 1 wherein the different visual characteristics comprise different colors.
4. The GIS of Claim 1 further comprising an input device connected to said processor, and wherein said processor selectively displays position data for a given reference marker based upon said input device.
5. The GIS of Claim 1 further comprising an input device cooperating with said processor to draw a line between a pair of reference markers; and wherein said processor cooperates with said display to display a distance between the pair of reference markers based upon the line.
6. A three-dimensional (3D) geospatial image display method comprising: displaying the 3D geospatial image on a display with a plurality of spaced-apart reference markers therein, the reference markers having different visual characteristics indicative of different relative positions within the 3D geospatial image; and associating with each reference marker selectively displayable position data.
7. The method of Claim 6 wherein the different visual characteristics comprise at least one of different sizes and different colors.
8. The method of Claim 6 further comprising: drawing a line between a pair of reference markers; and displaying a distance between the pair of reference markers based upon the line.
9. The method of Claim 6 wherein the 3D geospatial image comprises a ground surface; and further comprising: selecting a given reference marker from among the plurality of reference markers above the ground surface; and drawing a vertical reference line between the ground and the given reference marker upon selection thereof.
10. The method of Claim 6 wherein the reference markers comprise semi-transparent geometric objects.
PCT/US2007/010774 2006-05-04 2007-05-03 Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods WO2007130539A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
BRPI0711291-2A BRPI0711291A2 (en) 2006-05-04 2007-05-03 Geographic Information System (GIS) and three-dimensional geospatial image display method (3d)
JP2009509723A JP2009535734A (en) 2006-05-04 2007-05-03 Geographic information system and associated method for displaying a three-dimensional geospatial image with a reference sign
EP07794526A EP2024961A4 (en) 2006-05-04 2007-05-03 Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods
CA002651318A CA2651318A1 (en) 2006-05-04 2007-05-03 Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/381,628 US20070257903A1 (en) 2006-05-04 2006-05-04 Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods
US11/381,628 2006-05-04

Publications (2)

Publication Number Publication Date
WO2007130539A2 true WO2007130539A2 (en) 2007-11-15
WO2007130539A3 WO2007130539A3 (en) 2008-07-24

Family

ID=38660784

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/010774 WO2007130539A2 (en) 2006-05-04 2007-05-03 Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods

Country Status (9)

Country Link
US (1) US20070257903A1 (en)
EP (1) EP2024961A4 (en)
JP (1) JP2009535734A (en)
KR (1) KR20090007623A (en)
CN (1) CN101438341A (en)
BR (1) BRPI0711291A2 (en)
CA (1) CA2651318A1 (en)
TW (1) TW200813885A (en)
WO (1) WO2007130539A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441675B (en) * 2008-12-18 2011-01-26 上海城市发展信息研究中心 Communication path building method based on city underground structures

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8274506B1 (en) * 2008-04-28 2012-09-25 Adobe Systems Incorporated System and methods for creating a three-dimensional view of a two-dimensional map
KR101562827B1 (en) * 2008-10-23 2015-10-23 삼성전자주식회사 Apparatus and method for manipulating virtual object
WO2011057026A2 (en) * 2009-11-05 2011-05-12 Aptima, Inc. Systems and methods to define and monitor a scenario of conditions
US9123160B1 (en) 2011-10-30 2015-09-01 Lockheed Martin Corporation Concurrent mesh generation in a computer simulation
US9147283B1 (en) * 2011-10-30 2015-09-29 Lockhead Martin Corporation Water surface visualization during a simulation
CN103150305B (en) * 2011-12-06 2016-05-11 泰瑞数创科技(北京)有限公司 The real time data processing of the 3-dimensional digital earth and management system
CN103366635B (en) * 2013-07-30 2015-06-10 武汉大学 Method for dynamically marking mobile object in electronic map
CN103971414A (en) * 2014-04-30 2014-08-06 深圳职业技术学院 Method and system for making visualized true three-dimensional map
US9770216B2 (en) * 2014-07-02 2017-09-26 Covidien Lp System and method for navigating within the lung
CA2953694A1 (en) * 2014-07-02 2016-01-07 Covidien Lp Alignment ct
CN104268937A (en) * 2014-09-26 2015-01-07 北京超图软件股份有限公司 Method and device for creating water surface effects in three-dimensional geographic information system (GIS)
JP6304077B2 (en) * 2015-03-10 2018-04-04 三菱電機株式会社 Line-of-sight display device
SE1530070A1 (en) 2015-05-19 2016-05-17 Advanced Technical Solutions In Scandinavia Ab Base member and an RFID member for 3D image creation
KR20170001632A (en) 2015-06-26 2017-01-04 주식회사 파베리안 Control system for collecting 3-dimension modeling data and method thereof
US20180253445A1 (en) * 2015-10-02 2018-09-06 Entit Software Llc Geo-positioning information indexing
US10339708B2 (en) * 2016-11-01 2019-07-02 Google Inc. Map summarization and localization
KR20180051288A (en) * 2016-11-08 2018-05-16 삼성전자주식회사 Display apparatus and control method thereof
US10565802B2 (en) * 2017-08-31 2020-02-18 Disney Enterprises, Inc. Collaborative multi-modal mixed-reality system and methods leveraging reconfigurable tangible user interfaces for the production of immersive, cinematic, and interactive content
US11464576B2 (en) 2018-02-09 2022-10-11 Covidien Lp System and method for displaying an alignment CT
CN108981698B (en) * 2018-05-29 2020-07-14 杭州视氪科技有限公司 Visual positioning method based on multi-mode data
CN111445569B (en) * 2019-11-28 2023-04-14 成都理工大学 Sedimentary geological evolution dynamic simulation method
CN114510841B (en) * 2022-02-21 2022-10-04 深圳市格衡土地房地产资产评估咨询有限公司 Virtual image modeling-based removal visualization system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5050090A (en) * 1989-03-30 1991-09-17 R. J. Reynolds Tobacco Company Object placement method and apparatus
US5535134A (en) * 1994-06-03 1996-07-09 International Business Machines Corporation Object placement aid
US5883625A (en) * 1996-04-22 1999-03-16 Ast Research, Inc. Arrangement system for object placement on windows
JP3052286B2 (en) * 1997-08-28 2000-06-12 防衛庁技術研究本部長 Flight system and pseudo visual field forming device for aircraft
JP2002107161A (en) * 2000-10-03 2002-04-10 Matsushita Electric Ind Co Ltd Course-guiding apparatus for vehicles
US6915310B2 (en) * 2002-03-28 2005-07-05 Harris Corporation Three-dimensional volumetric geo-spatial querying
US7658610B2 (en) * 2003-02-26 2010-02-09 Align Technology, Inc. Systems and methods for fabricating a dental template with a 3-D object placement
WO2006041937A2 (en) * 2004-10-04 2006-04-20 Solid Terrain Modeling Three-dimensional cartographic user interface system
US7873240B2 (en) * 2005-07-01 2011-01-18 The Boeing Company Method for analyzing geographic location and elevation data and geocoding an image with the data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP2024961A4 *

Also Published As

Publication number Publication date
TW200813885A (en) 2008-03-16
WO2007130539A3 (en) 2008-07-24
CN101438341A (en) 2009-05-20
EP2024961A4 (en) 2012-05-30
JP2009535734A (en) 2009-10-01
CA2651318A1 (en) 2007-11-15
EP2024961A2 (en) 2009-02-18
BRPI0711291A2 (en) 2011-08-23
US20070257903A1 (en) 2007-11-08
KR20090007623A (en) 2009-01-19

Similar Documents

Publication Publication Date Title
US20070257903A1 (en) Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods
US11809681B2 (en) Reality capture graphical user interface
US7084886B2 (en) Using detail-in-context lenses for accurate digital image cropping and measurement
CN103309605B (en) Information processing unit and information processing method
US8928657B2 (en) Progressive disclosure of indoor maps
US8042056B2 (en) Browsers for large geometric data visualization
US9569066B2 (en) Interface for navigating imagery
US7983473B2 (en) Transparency adjustment of a presentation
US10353535B2 (en) Multi-view display viewing zone layout and content assignment
US20110267372A1 (en) Compound Lenses for Multi-Source Data Presentation
US20120139915A1 (en) Object selecting device, computer-readable recording medium, and object selecting method
US20040125138A1 (en) Detail-in-context lenses for multi-layer images
CN105103112A (en) Apparatus and method for manipulating the orientation of object on display device
US20150248211A1 (en) Method for instantaneous view-based display and selection of obscured elements of object models
CN105046748B (en) The 3D photo frame apparatus of image can be formed in a kind of three-dimensional geologic scene
US9483878B2 (en) Contextual editing using variable offset surfaces
US9159300B2 (en) Seat layout display apparatus, seat layout display method, and program thereof
Wu et al. An interactive and flexible information visualization method
Röhlig et al. Visibility widgets: Managing occlusion of quantitative data in 3d terrain visualization
Rohs et al. Which one is better? Information navigation techniques for spatially aware handheld displays
US20130090895A1 (en) Device and associated methodology for manipulating three-dimensional objects
WO2020084192A1 (en) Method, arrangement, and computer program product for three-dimensional visualization of augmented reality and virtual reality environments
Lee et al. Mirage: A touch screen based mixed reality interface for space planning applications
Chen et al. A two-point map-based interface for architectural walkthrough
JP2017228186A (en) View point field candidate presentation program and view point field candidate presentation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 07794526; Country of ref document: EP; Kind code of ref document: A2)
WWE Wipo information: entry into national phase (Ref document number: 2009509723; Country of ref document: JP)
WWE Wipo information: entry into national phase (Ref document number: 2651318; Country of ref document: CA)
WWE Wipo information: entry into national phase (Ref document number: 200780016248.5; Country of ref document: CN)
NENP Non-entry into the national phase (Ref country code: DE)
WWE Wipo information: entry into national phase (Ref document number: 2007794526; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 1020087029549; Country of ref document: KR)
ENP Entry into the national phase (Ref document number: PI0711291; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20081104)