US20050144574A1 - Constraining user movement in virtual environments - Google Patents

Constraining user movement in virtual environments

Info

Publication number
US20050144574A1
US20050144574A1 (application US11/052,150)
Authority
US
United States
Prior art keywords
user
permitted zone
zone
permitted
point
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/052,150
Inventor
Nelson Chang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Hewlett Packard Development Co LP
Priority to US11/052,150
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. (Assignor: CHANG, NELSON LIANG AN)
Publication of US20050144574A1
Status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/003: Navigation within 3D models or images

Definitions

  • In one exemplary implementation, the zones outside a polygonal permitted zone can be characterized by constraint lines surrounding the permitted zone. This implementation is described in more detail in Section IV below.
  • At step 140, a user input to move to a new location in the virtual environment is received.
  • The input may be a new coordinate value of the new location, relative vector values (e.g., an offset and a direction), and/or other types of input indicative of the new location.
  • At step 150, it is determined whether the new location is outside the permitted zone.
  • When the new location is outside the permitted zone, a point within the permitted zone that is proximate to the new location is calculated.
  • A proximate point may be the nearest point from the current location to a boundary, the nearest point from the new location to a boundary, the nearest point from the current location to a boundary while maintaining the direction indicated by the user, or any other point within a permitted zone between the current location and the new location.
  • A proximate point can be efficiently calculated by determining in which zone (e.g., the permitted zone or any of the zones surrounding the permitted zone) the new location lies. Based on the determined zone, a reasonable approximation can be made to quickly calculate the edge of the permitted zone to which the user should be sent. For example, if the new location is located in the upper-left corner zone of a rectangular gridded virtual environment, then the proximate point can be approximated as the point at the lower-right corner of that corner zone.
  • When the boundary is perpendicular to the line connecting the current and new locations, the intersection point will be the point on the boundary of the permitted zone that is actually nearest to the new location (as well as the current location). Otherwise, the actual nearest point can be found by moving along the boundary until the line between the instantaneous boundary position and the new location (or the current location) is perpendicular to the boundary; it is not necessary to compute the intersection point in this case.
  • Where speed matters more than exactness, the intersection point of the boundary with a line connecting the current location and the new location can be treated as the proximate point (i.e., a sufficiently good approximation). If it is more important to minimize the distance to the new location, the boundary-moving technique of the previous paragraph can be used to obtain the actual nearest point as the proximate point.
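The intersection-point calculation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes the permitted zone is convex, is stored as constraint lines (a_k, b_k, c_k) with the positive half plane facing the interior (the Section IV convention), and that the current location is inside the zone. The function names are illustrative.

```python
def constraint_value(line, point):
    """Evaluate a_k*u + b_k*v + c_k for constraint line k = (a_k, b_k, c_k).

    Positive values lie inside the permitted zone, negative values outside,
    following the sign convention of Section IV.
    """
    a, b, c = line
    u, v = point
    return a * u + b * v + c


def intersection_proximate_point(lines, current, new):
    """Clip the segment current -> new against the zone's half planes and
    return the last point of the segment still inside the permitted zone
    (the boundary intersection, or `new` itself if `new` is permitted)."""
    t = 1.0
    for line in lines:
        f_cur = constraint_value(line, current)
        f_new = constraint_value(line, new)
        if f_new < 0 <= f_cur:
            # The segment exits this half plane; find the crossing parameter.
            t = min(t, f_cur / (f_cur - f_new))
    return (current[0] + t * (new[0] - current[0]),
            current[1] + t * (new[1] - current[1]))


# A unit-square permitted zone: u >= 0, u <= 1, v >= 0, v <= 1.
UNIT_SQUARE = [(1, 0, 0), (-1, 0, 1), (0, 1, 0), (0, -1, 1)]
```

For the unit-square zone, a move from (0.5, 0.5) toward (1.5, 0.5) is cut at the boundary, while a move that stays inside the zone is returned unchanged.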
  • The user is then moved to the proximate point.
  • For example, the user can be moved in a straight-line path from the current location to the proximate point.
  • Alternatively, the user can be moved in a zig-zag path, for example, if an impassable obstacle (e.g., a wall) is in the way of a straight-line path.
  • The process then returns to step 140 for a next input to move.
  • Otherwise, if the new location is within the permitted zone, at step 180 the user is moved to that location based on the input. The process returns to step 140 for a next input to move.
  • A user's input at step 140 may indicate a new location located in another permitted zone.
  • FIG. 2 illustrates an exemplary process to resolve this situation.
  • At step 210, it is determined whether the new location is located within another permitted zone.
  • A point within a permitted zone that is proximate to the new location is calculated.
  • The proximate point may be within the permitted zone where the user is currently located, or it may be within another permitted zone that is closer to the new location but not necessarily the permitted zone where the user is currently located.
  • At step 230, the user is moved to the proximate point.
  • The process returns to step 140 for a next input to move.
  • Steps 180 and 240 may be performed sequentially in any order or simultaneously, and the process then returns to step 140 for a next input to move.
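The combined logic of FIG. 1 and FIG. 2 can be sketched as a single dispatch step. This is a hedged sketch, not the patent's implementation: it uses axis-aligned rectangles as a stand-in for permitted zones and accepts the proximate-point calculation as a pluggable function, since the patent leaves that calculation to design choice. All names are illustrative.

```python
from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned permitted zone: xmin <= x <= xmax and ymin <= y <= ymax."""
    xmin: float
    xmax: float
    ymin: float
    ymax: float

    def contains(self, p):
        x, y = p
        return self.xmin <= x <= self.xmax and self.ymin <= y <= self.ymax


def resolve_move(current, new, permitted_zones, proximate_point):
    """One iteration of the FIG. 1 / FIG. 2 loop: if the requested location
    lies in any permitted zone, move there directly (steps 180/240);
    otherwise move to a proximate point (steps 150-170 / 230)."""
    for zone in permitted_zones:
        if zone.contains(new):
            return new
    return proximate_point(current, new)
```

A trivial "stay put" policy can serve as the proximate-point placeholder when experimenting; any of the proximate-point choices described above can be substituted.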
  • FIG. 3A illustrates an exemplary partitioned area 300 in a virtual environment.
  • The partitioned area 300 includes multiple doorways 310a-350a which may lead to other partitioned areas (not shown).
  • FIG. 3B illustrates an exemplary permitted zone 360 within the partitioned area 300.
  • In this example, a spacing parameter p (or distance) from each boundary of the permitted zone to the walls of the partitioned area is the same throughout.
  • FIG. 3B also illustrates (smaller) additional permitted zones 310b-350b to provide users with access to the multiple doorways 310a-350a.
  • Here, the spacing parameter p is set to zero for the additional permitted zones. In other words, users have access up to the boundaries surrounding the additional permitted zones 310b-350b.
  • Alternatively, one or more spacing parameters may be implemented in one or more of the additional permitted zones 310b-350b.
  • The additional permitted zones 310b-350b can be treated as other permitted zones (i.e., treated just like the permitted zone 360).
  • Alternatively, the permitted zone 360 and the additional permitted zones 310b-350b may be considered a single (polygonal) permitted zone.
  • FIG. 4 illustrates the exemplary partitioned area 300 of FIG. 3B being divided into nine zones.
  • The permitted zone 360 of FIG. 3B is the middle zone, zone 5.
  • Zones 1-4 and 6-9 are indicated by dashed lines, and each extends to infinity (or at least to the edge of the virtual environment). Three examples are provided below; in each example, the user is currently located in zone 5.
  • In the first example, the user inputs a move to P1 in zone 3.
  • P1 is not located in a permitted zone.
  • A possible proximate point to P1 is the point of zone 5 nearest to P1, P1′, which is approximated as the upper right-hand corner point of the permitted zone.
  • Accordingly, the user is moved to P1′ instead of to P1 in zone 3, which is not located in a permitted zone.
  • In the second example, the user's input of a new location P2 falls within the additional permitted zone 350b, which lies in zone 4.
  • The additional permitted zone 350b is a permitted zone, so the user is moved to P2 in the additional permitted zone 350b.
  • Once there, areas outside the additional permitted zone 350b can also be divided into multiple zones to facilitate quicker determination of the proper location for the next move.
  • In the third example, the user's input of a new location P3 falls in zone 6, and not within a permitted zone.
  • Here, a proximate point can be within zone 5 or zone 6, depending on design choice.
  • In one implementation, the user is moved to a proximate point (e.g., the nearest point) to P3 in zone 5 (where the user started).
  • In another implementation, the user is moved to a proximate point (e.g., the nearest point) to P3 in the additional permitted zone 330b in zone 6.
  • In the latter case, the additional permitted zone 330b becomes the current permitted zone, and all areas outside of the additional permitted zone 330b can be divided into multiple zones to facilitate quicker determination of the proper location for the next move.
  • In an exemplary implementation, zones can be characterized based on constraint lines surrounding a polygonal permitted zone. This is particularly useful for representing the zones in a manner that can be readily stored and manipulated by a computer.
  • FIG. 5 illustrates an exemplary polygonal permitted zone 500 in a two-dimensional virtual environment having a generalized u,v coordinate system.
  • The polygonal permitted zone 500 is characterized by constraint lines I-V.
  • Each constraint line I-V separates two half planes.
  • The half planes separated by a constraint line k can be characterized as a_k u + b_k v + c_k > 0 (the positive half plane) or a_k u + b_k v + c_k < 0 (the negative half plane).
  • By convention, the half plane outside the polygonal permitted zone 500 will be referred to as the “negative” half plane.
  • In FIG. 6, the permitted zone 500 is zone 11.
  • Each zone in FIG. 6 can be characterized by overlapping half planes, as shown in the table illustrated in FIG. 7.
  • The table in FIG. 7 may be used to quickly deduce the zone in which a given point (u, v) is located. For example, if one determines that (u, v) is located in the positive half plane of constraint line I, then one can eliminate zones 1 to 3, and so forth.
  • The table in FIG. 7 is merely exemplary. One skilled in the art will recognize that, depending on design choice, other calculations can be implemented to quickly determine a zone at least partially bounded by constraint lines.
  • For example, zone 3 is the area of overlap between the negative half planes a_1 u + b_1 v + c_1 < 0 and a_3 u + b_3 v + c_3 < 0.
  • Likewise, zone 1 can be characterized by the negative half planes of constraint lines I and II.
  • Zone 2 can be characterized by the negative half plane of constraint line I and the positive half planes of constraint lines II and III.
  • In general, the zone in which a new location is located can be ascertained based on the constraint lines surrounding a permitted zone (e.g., by evaluating the sign of a_k u + b_k v + c_k for each relevant constraint line k and determining the zone from the table of FIG. 7).
  • Once that zone is known, a proximate point to the new location can be quickly determined by applying the techniques described in Section III above.
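The sign-based zone lookup above can be sketched as follows. This is an illustration under assumptions: a square permitted zone stands in for the polygon of FIG. 5, and the zone table is a hypothetical analogue of the FIG. 7 table (each zone lists required signs for only its relevant constraint lines, with the others as don't-cares). Names are illustrative.

```python
def half_plane_signs(lines, point):
    """Return '+' or '-' for each constraint line a_k*u + b_k*v + c_k at
    `point` ('+' meaning the positive half plane, per the Section IV
    convention that the interior is positive)."""
    u, v = point
    return tuple('+' if a * u + b * v + c >= 0 else '-' for a, b, c in lines)


def locate_zone(zone_table, lines, point):
    """Deduce the zone of `point` in the manner of the FIG. 7 table: each
    zone is specified by the required signs of its relevant constraint
    lines only; unlisted lines are don't-cares."""
    signs = half_plane_signs(lines, point)
    for zone, required in zone_table.items():
        if all(signs[k] == s for k, s in required.items()):
            return zone
    return None


# Square permitted zone as a stand-in: line 0: u >= 0, line 1: u <= 1,
# line 2: v >= 0, line 3: v <= 1 (positive half plane faces the interior).
LINES = [(1, 0, 0), (-1, 0, 1), (0, 1, 0), (0, -1, 1)]
ZONES = {
    'permitted':   {0: '+', 1: '+', 2: '+', 3: '+'},
    'left':        {0: '-', 2: '+', 3: '+'},
    'right':       {1: '-', 2: '+', 3: '+'},
    'below':       {0: '+', 1: '+', 2: '-'},
    'above':       {0: '+', 1: '+', 3: '-'},
    'lower-left':  {0: '-', 2: '-'},
    'lower-right': {1: '-', 2: '-'},
    'upper-left':  {0: '-', 3: '-'},
    'upper-right': {1: '-', 3: '-'},
}
```

As with FIG. 7, a single sign evaluation eliminates several candidate zones at once, so the lookup touches each constraint line at most once per query.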
  • FIG. 9 illustrates an exemplary rectangular permitted zone (zone 5 ) surrounded by four constraint lines I-IV that may extend to the boundaries of the virtual environment.
  • In this example, constraint lines I and II have constant x values of A and B, respectively.
  • Constraint lines III and IV have constant y values of C and D, respectively.
  • Zones 1 - 9 can be readily characterized based on these constraint lines. Any point within a given zone will have an identifiable relationship with respect to the constant x, y values of the constraint lines for that zone.
  • FIG. 10 illustrates an exemplary table characterizing each zone based on its relationship to the constant values (of that zone's constraint lines).
  • Based on such a table, a proximate point (e.g., an (x, y) value) can be calculated directly from the constant values of the relevant constraint lines.
  • This choice of proximate point is merely exemplary.
  • Other types of proximate points (e.g., the nearest point to the new location, the nearest point to the old location, etc.) may alternatively be implemented in accordance with design choice.
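For the rectangular permitted zone of FIG. 9, one natural proximate point is obtained by clamping each coordinate to the constant values of the constraint lines. This is a sketch under the assumption A < B and C < D; the clamp yields the nearest point of the rectangle to the new location, which is one of the proximate-point choices mentioned above, not necessarily the patent's preferred one.

```python
def clamp_proximate_point(new, A, B, C, D):
    """Nearest point of the rectangular permitted zone (A <= x <= B,
    C <= y <= D, per the constraint lines of FIG. 9) to `new`: clamp each
    coordinate of the new location to the range set by its constraint
    lines, leaving coordinates that are already permitted unchanged."""
    x, y = new
    return (min(max(x, A), B), min(max(y, C), D))
```

A point already inside zone 5 is returned unchanged, so the same function can be applied unconditionally after the zone lookup.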
  • In some implementations, multiple partitioned areas are interconnected.
  • For example, multiple partitioned areas may be interconnected by openings (e.g., doors) or passageways (e.g., hallways).
  • FIG. 11 illustrates two partitioned areas 1110 and 1120 interconnected by a passageway 1130.
  • Each partitioned area 1110, 1120 includes a primary permitted zone 1140, 1160, respectively.
  • In addition, each partitioned area 1110, 1120 may include one or more additional permitted zones 1170, 1180 to enable users to go through doorways (not shown).
  • Within the passageway 1130, a spacing parameter p may be set to a non-zero value or to a zero value (i.e., equivalent to not having a spacing parameter). For example, if a passageway is narrow, it may not be practical to limit the user's movement within the passageway. However, if a passageway is being used to display images of objects, a spacing parameter (albeit a small one) may be implemented to prevent the user from getting too close to the images.
  • While the user is in the passageway, the passageway 1130 is the permitted zone, and all areas outside the passageway 1130 can be divided into multiple zones as described in various implementations herein.
  • Permitted zones may also be linked together to create a path in a partitioned area.
  • FIG. 12 illustrates an exemplary partitioned area 1200 having multiple permitted zones 1210 - 1250 within the area.
  • Each permitted zone can be implemented by selecting appropriate spacing parameter(s) for that zone.
  • This implementation can be used, for example, to constrain a user to a guided tour through a partitioned area.
  • In a three-dimensional virtual environment, the zones can be characterized by constraint planes (a generalization of constraint lines) in three-dimensional space. Based on these zones, one can efficiently determine where and how to move the user (e.g., by determining the nearest point to the new location within a permitted zone).
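The constraint-plane generalization can be sketched by extending the half-plane sign test one dimension: each plane is stored as coefficients (a, b, c, d), and a point is inside the permitted volume when every value a·x + b·y + c·z + d is non-negative. This mirrors the 2D convention of Section IV; the unit cube and function names below are illustrative assumptions.

```python
def plane_value(plane, point):
    """Evaluate a*x + b*y + c*z + d for constraint plane (a, b, c, d)."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d


def inside_permitted_3d(planes, point):
    """True if `point` lies in every positive half space, i.e., within the
    permitted volume bounded by the constraint planes."""
    return all(plane_value(p, point) >= 0 for p in planes)


# A unit cube as a permitted volume: six planes with interiors on the
# positive side (x >= 0, x <= 1, y >= 0, y <= 1, z >= 0, z <= 1).
CUBE = [(1, 0, 0, 0), (-1, 0, 0, 1),
        (0, 1, 0, 0), (0, -1, 0, 1),
        (0, 0, 1, 0), (0, 0, -1, 1)]
```

The same sign patterns used in the 2D tables then classify the 3D zones surrounding the permitted volume.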
  • FIG. 13 illustrates an exemplary system for generating virtual environments in which exemplary implementations described herein may be applied.
  • The exemplary system 1300 includes a graphics processing unit 1310, a rendering engine 1320, a user output interface 1330, a user input interface 1340, and memory 1350.
  • The graphics processing unit 1310 functions to receive object data 10A and generate two- and/or three-dimensional imaging data 10B corresponding to the specified type of virtual environment, using techniques that are well known in the field of graphics imaging.
  • Rendering engine 1320 receives imaging data 10B and generates rendering data 11A outputted through user output interface 1330.
  • For example, rendering data 11A might be configured for a monitor (not shown) or other form of output device to display the three-dimensional virtual environment including the representative images.
  • User input interface 1340 might include computer-implemented hardware and/or software that allows a user to interact with the three-dimensional environment. For instance, the user input interface 1340 might allow the user to navigate through and view the three-dimensional environment by moving a displayed cursor using a user input device (e.g., a keyboard, a mouse, a joystick, a pressure-sensitive screen, etc.).
  • Graphics processing unit 1310 may function to generate the representative images of the data objects, or the representative image data may be stored within the memory 1350 and linked to a data object database.
  • The embodiments described herein may be implemented in an operating environment, such as the system 1300, comprising software installed on a computer, in hardware, or in a combination of software and hardware.
  • The software and/or hardware would typically include some type of computer-readable media which can store data and logic instructions (such as those which, when executed, authenticate a user having a biometric authentication datum using a pass code) that are accessible by the computer or the processing logic within the hardware.
  • Such media might include, without limitation, hard disks, floppy disks, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROMs), and the like.

Abstract

An exemplary method for constraining a user's movement in a virtual environment includes determining a user's current location in the virtual environment, determining a permitted zone based on the current location, dividing areas outside the permitted zone into multiple zones, obtaining an input to move to a new location, calculating a point within the permitted zone proximate to the new location based on one or more of the multiple zones, and moving the user to the proximate point.

Description

    RELATED APPLICATION
  • This application is a continuation-in-part application of pending U.S. patent application Ser. No. 10/021,648 filed on Oct. 30, 2001, which is hereby incorporated by reference in its entirety.
  • BACKGROUND
  • Computer systems are commonly used to display objects in a virtual environment. The objects may comprise text, still images, video, audio, graphic symbols, icons, any other type of computer-representable items, and/or combinations of any of the foregoing.
  • The virtual environment (and the objects therein) may be two-dimensional (e.g., a maze game) or three-dimensional (e.g., a simulation of a city having low-rise homes as well as towering skyscrapers). Depending on the background of the user, three-dimensional applications may be depicted in a series of two-dimensional views. For example, a computer-aided blueprinting application may depict top, side and end views of a three-dimensional building because draftsmen and architects are used to working with those kinds of two-dimensional views. However, for other applications or users, it is inconvenient to use a series of two-dimensional views, and it is preferable to have a three-dimensional view. For example, laymen are generally not accustomed to looking at two-dimensional blueprints, so an architect might prefer to present a design for a home for client approval using a perspective view, while retaining the two-dimensional views for construction purposes.
  • Whatever the application, and whether in two or three dimensions, users may have difficulty navigating through the virtual environment. For instance, difficulties may arise due to inexperience with digital navigation tools (e.g., joystick, keyboard, etc.), the user interface, the complexity of the environment, etc. For example, a user may navigate too close to a displayed image to view the entire image, and also thereby lose track of the context and how that particular image fits within the overall environment. Moreover, some or all of a virtual environment may not have been designed to be viewed at near distances (e.g., some parts of the environment may not properly render when viewed too closely).
  • In addition, many virtual environments have a number of boundaries that constrain the user's movement. For example, when traversing a maze, the user must navigate between—but cannot cross over—the lines defining the maze. Similarly, a user exploring a virtual house should not be permitted to bump into any of the walls, but should instead move from room to room through virtual doorways.
  • Thus, a market exists for a computer-implemented process that provides constraints on the user's movements in virtual environments, whether in two or three dimensions.
  • SUMMARY
  • An exemplary method for constraining a user's movement in a virtual environment includes determining a user's current location in the virtual environment, determining a permitted zone based on the current location, dividing an area outside the permitted zone into multiple zones, obtaining an input to move to a new location, calculating a point within the permitted zone proximate to the new location based on one or more of the multiple zones, and moving the user to the proximate point.
  • Other embodiments and implementations are also described below.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 illustrates an exemplary process for constraining a user's movement to a permitted zone in a virtual environment.
  • FIG. 2 illustrates an exemplary process for determining a user's new location if the new location is in another permitted zone.
  • FIG. 3A illustrates an exemplary partitioned area in a partitioned virtual environment.
  • FIG. 3B illustrates an exemplary permitted zone within the partitioned area.
  • FIG. 4 illustrates an exemplary division of areas outside of the permitted zone into multiple zones.
  • FIG. 5 illustrates an exemplary polygonal permitted zone surrounded by constraint lines.
  • FIG. 6 illustrates an exemplary division of zones outside of the polygonal permitted zone.
  • FIG. 7 illustrates an exemplary specification of the zones outside of the polygonal permitted zone based on the constraint lines.
  • FIG. 8 illustrates an exemplary zone outside of the polygonal permitted zone.
  • FIG. 9 illustrates an exemplary rectangular permitted zone surrounded by constraint lines.
  • FIG. 10 illustrates an exemplary specification of the zones outside of the rectangular permitted zone based on the constraint lines.
  • FIG. 11 illustrates exemplary multiple partitioned areas separated by a passageway.
  • FIG. 12 illustrates an exemplary path in an exemplary partitioned area.
  • FIG. 13 illustrates an exemplary system and operating environment.
  • DETAILED DESCRIPTION
  • I. Overview
  • Section II describes, in a general sense, exemplary processes for constraining a user's movement within a virtual environment.
  • Section III describes an exemplary partitioned area in which the user's movement may be constrained, and illustrates more specifically an application of the exemplary process of Section II to determine a permitted zone within the exemplary partitioned area.
  • Section IV describes exemplary processes for dividing zones outside of a permitted zone based on constraint lines surrounding the permitted zone.
  • Section V describes other exemplary implementations and aspects.
  • Section VI describes an exemplary system and operating environment.
  • II. An Exemplary Process for Constraining a User's Movement within a Virtual Environment
  • FIG. 1 illustrates, in a general sense, an exemplary process for constraining a user's movement within a virtual environment that is partitioned into various sub-areas. For example and without limitation, in a simple form of partitioning, the virtual environment might be gridded into rectangular sub-areas having the overall pattern of a checkerboard or tic-tac-toe board. Of course, a virtual environment can be partitioned into sub-areas of any shape, not simply rectangular, and sub-areas may have different shapes. At any given time, the user is located in one of the sub-areas, and the user can move within one of the sub-areas or from one sub-area to another. For continuity with the terminology used in U.S. patent application Ser. No. 10/021,648 from which this patent claims priority, each of the sub-areas into which the domain is partitioned will be referred to hereafter as a “partitioned area.” Some exemplary partitioned areas are illustrated in FIGS. 3A, 3B and 4. These Figures will be referred to in the exemplary process steps described below for clarity purposes. The virtual environment may be a two-dimensional environment or a three-dimensional environment; however, for ease of explanation, the exemplary process will be illustrated in the context of a two-dimensional virtual environment.
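For the simple checkerboard-style partitioning just described, locating the partitioned area that contains the user reduces to integer division of the coordinates by the cell dimensions. This is a sketch for an axis-aligned grid anchored at the origin; the function name and the (column, row) indexing are assumptions, not taken from the patent.

```python
def partitioned_area_of(point, cell_width, cell_height):
    """Return the (column, row) index of the rectangular partitioned area
    containing `point`, for a virtual environment gridded into cells of
    size cell_width x cell_height with a corner at the origin.

    Python's floor division handles negative coordinates consistently,
    which matters if the environment extends in all directions."""
    x, y = point
    return (int(x // cell_width), int(y // cell_height))
```

The returned index identifies the "active" area of step 120, from which the permitted zone is then derived.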
  • At step 110, a user's current location in a virtual environment is determined. In two dimensions, the user's current location may be indicated by L=(x, y) in the x- and y-coordinate system.
  • At step 120, a permitted zone is determined based on the user's current location. In an exemplary implementation, the partitioned area where the user is currently located can be referred to as the “active” area. Within the active area, the permitted zone can be specified by a spacing parameter, p, which is the minimum distance away from one or more walls in the active area. For example, in FIG. 3B, the partitioned area 300 is the active area and the permitted zone 360 within the active area is specified by the parameter p.
  • The spacing parameter is not necessarily a constant but can vary as a function of the walls of an active area. For example, a different spacing parameter may be assigned with respect to each wall of an active area. Further, the spacing parameter may also be set to zero (which is equivalent to not having a spacing parameter at all); in this implementation, the user may be able to move right up to the edge of a wall or even pass through a wall, depending on design choice. The term wall as used herein shall include any type of boundary (visible or transparent) partially or wholly separating one partitioned area from another partitioned area, or restricting the user to a portion of the active area. In general, p can be any predetermined value depending on design choice, and the permitted zone is a sub-area within the active area which may be specified by one or more spacing parameters greater than or equal to zero from the wall(s) of the active area.
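For a rectangular active area, the effect of per-wall spacing parameters can be sketched as follows (a hypothetical illustration; the function and parameter names are not from the patent):

```python
def permitted_zone(area, p_left=0.0, p_right=0.0, p_bottom=0.0, p_top=0.0):
    """Shrink a rectangular active area by per-wall spacing parameters.

    area is (x_min, y_min, x_max, y_max); the returned tuple is the permitted
    zone.  A spacing parameter of zero for a wall lets the user move right up
    to that wall, matching the zero-spacing case described in the text.
    """
    x_min, y_min, x_max, y_max = area
    zone = (x_min + p_left, y_min + p_bottom, x_max - p_right, y_max - p_top)
    if zone[0] > zone[2] or zone[1] > zone[3]:
        raise ValueError("spacing parameters leave no permitted zone")
    return zone
```

With all spacing parameters at zero, the permitted zone coincides with the active area, so the user may move right up to its walls.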
  • At step 130, areas outside of the permitted zone are divided into multiple zones. As a simple example, if all the partitioned areas are rectangular, then the areas outside the permitted zone could be divided into eight zones surrounding the permitted zone, with each zone corresponding to one of the non-active partitioned areas. The zones outside the permitted zone extend to infinity (if the virtual environment is unbounded), or to the respective edge(s) of the virtual environment (if it is bounded). For example, in FIG. 4, zone 5 represents the permitted zone and the areas outside of zone 5 have been divided into eight zones surrounding zone 5.
  • One skilled in the art will recognize that the partitioned areas can be of shapes other than rectangular. For example, the partitioned areas may be squares, circles, ellipses, ovals, or other shapes, including other polygons. In addition, partitioned areas in a virtual environment need not all share the same shape or size. In general, the number (and shapes) of the zones outside the permitted zone depends on the shape of the particular permitted zone, and the number of zones outside a permitted zone can be selected based on design choice.
  • In an exemplary implementation, the zones outside a polygonal permitted zone can be characterized by constraint lines surrounding the permitted zone. This exemplary implementation will be described in more detail in Section IV below.
  • At step 140, a user input to move to a new location in the virtual environment is received. In an exemplary implementation, the input may be a new coordinate value of the new location, relative vector values (e.g., an offset and a direction), and/or other types of input indicative of the new location.
  • At step 150, whether the new location is outside the permitted zone is determined.
  • At step 160, if the new location is outside the permitted zone, a point within the permitted zone that is proximate to the new location is calculated. Depending on a design choice, a proximate point may be a nearest point from the current location to a boundary, a nearest point from the new location to a boundary, a nearest point from the current location to a boundary while maintaining the direction indicated by the user, or any other point within a permitted zone between the current location and the new location.
  • In an exemplary implementation, a proximate point can be efficiently calculated by determining the zone (e.g., the permitted zone or any of the zones surrounding the permitted zone) in which the new location is located. Based on the determined zone, a reasonable approximation can be made to quickly calculate the point on the edge of the permitted zone to which the user should be sent. For example, if the new location is located in the upper left corner zone of a rectangular gridded virtual environment, then the proximate point can be approximated as the point at the lower right corner of that corner zone (i.e., the upper left corner of the permitted zone).
  • Alternatively, if the new location is within a side zone (rather than a corner zone), then a line connecting the current location (x_current, y_current) to the new location (x_new, y_new) can be described by the equation (y − y_current) = (y_new − y_current)/(x_new − x_current) * (x − x_current). If the boundary of the permitted zone is characterized as a function y = f(x), then the intersection of the line and the boundary can be calculated straightforwardly, since there are two equations and two unknowns. The intersection point (x_intersection, y_intersection) lies on the boundary of the permitted zone. In a vertical move, one can simply set x_new = x_current and y_new = y_boundary (and in a horizontal move, set y_new = y_current and x_new = x_boundary) for a faster calculation. If the line between the current and new locations is perpendicular to the boundary, the intersection point is also the point on the boundary of the permitted zone that is actually nearest to the new location (as well as to the current location). Otherwise, the actual nearest point can be found by moving along the boundary until the line between the instantaneous boundary position and the new location is perpendicular to the boundary; it is not necessary to compute the intersection point in this case.
  • If it is desired to maintain the same direction of movement as that originally expected by the user in specifying the new location, the intersection point on the boundary intersecting a line connecting the current location and the new location can be considered as the proximate point (i.e., to a sufficiently good approximation). If it is more important to minimize the distance to the new location, the boundary moving technique of the previous paragraph can be used to obtain the actual nearest point as the proximate point.
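The line-boundary intersection described above can be sketched for the common case of an axis-aligned (vertical or horizontal) boundary; the function name and argument convention are illustrative, not from the patent:

```python
def boundary_intersection(current, new, boundary_axis, boundary_value):
    """Intersect the segment from current to new with an axis-aligned boundary.

    boundary_axis is 'x' for a vertical boundary (x = boundary_value) or 'y'
    for a horizontal one (y = boundary_value) -- the two-equations/two-unknowns
    calculation in the text, specialized to straight boundaries.
    """
    (xc, yc), (xn, yn) = current, new
    if boundary_axis == 'x':
        # Parametrize the segment and solve for where x reaches the boundary.
        t = (boundary_value - xc) / (xn - xc)
        return (boundary_value, yc + t * (yn - yc))
    else:
        # Horizontal boundary: solve for where y reaches the boundary.
        t = (boundary_value - yc) / (yn - yc)
        return (xc + t * (xn - xc), boundary_value)
```

A purely vertical or horizontal move is handled more simply, as the text notes, by setting the moving coordinate directly to the boundary value.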
  • At step 170, the user is moved to the proximate point. In an exemplary implementation, the user can be moved in a straight line path from the current location to the proximate point. In another exemplary implementation, the user can be moved in a zig-zag path, for example, if an impassable obstacle (e.g., wall) is in the way of a straight-line path. The process returns to step 140 for a next input to move.
  • Referring back to step 150, if the new location is within the permitted zone, then at step 180, the user is moved to that location based on the input. The process returns to step 140 for a next input to move.
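Steps 140–180 can be summarized in a short sketch for a rectangular permitted zone, here using nearest-point clamping as the proximate-point rule (one of several design choices mentioned above; the function name is illustrative):

```python
def constrain_move(new, zone):
    """Steps 150-180 in miniature for a rectangular permitted zone.

    zone is (x_min, y_min, x_max, y_max).  If the requested new location lies
    inside the zone, the user is moved there directly (step 180); otherwise
    the user is moved to the nearest point inside the zone (steps 160-170).
    """
    x_min, y_min, x_max, y_max = zone
    x, y = new
    if x_min <= x <= x_max and y_min <= y <= y_max:
        return new
    # Nearest-point proximate rule: clamp each coordinate to the zone.
    return (min(max(x, x_min), x_max), min(max(y, y_min), y_max))
```

Other proximate-point rules (e.g., the direction-maintaining intersection point) would replace only the final clamping step.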
  • From time to time, a user's input at step 140 may indicate a new location located in another permitted zone. FIG. 2 illustrates an exemplary process to resolve this situation.
  • If it is determined that the new location is located outside the permitted zone at step 150, then at step 210, it is determined whether the new location is located within another permitted zone.
  • If the new location is not located within another permitted zone, at step 220, a point within a permitted zone that is proximate to the new location is calculated. For example, the proximate point may be within the permitted zone where the user is currently located; or the proximate point may be within another permitted zone that is closer to the new location but not necessarily in the permitted zone where the user is currently located.
  • At step 230, the user is moved to the proximate point. The process returns to step 140 for a next input to move.
  • Referring back to step 210, if the new location is within another permitted zone, then at step 240, areas outside the new permitted zone are divided into multiple zones and, at step 180, the user is moved to that location based on the input. In general, steps 180 and 240 may be performed sequentially in any order or simultaneously. The process returns to step 140 for a next input to move.
  • The processes illustrated above are merely exemplary. Those skilled in the art will appreciate that other processes and/or steps may be used in accordance with the requirements of a particular implementation.
  • III. An Exemplary Partitioned Area within which a User's Movement may be Constrained
  • The exemplary processes described above in Section II can be more specifically illustrated in an exemplary two-dimensional virtual environment shown in FIGS. 3A-3B and 4.
  • FIG. 3A illustrates an exemplary partitioned area 300 in a virtual environment. The partitioned area 300 includes multiple doorways 310 a-350 a which may lead to other partitioned areas (not shown).
  • FIG. 3B illustrates an exemplary permitted zone 360 within the partitioned area 300. In this example, the spacing parameter p (or distance) from each boundary of the permitted zone to the walls of the partitioned area is the same throughout. In addition, FIG. 3B illustrates (smaller) additional permitted zones 310 b-350 b to provide users with access to the multiple doorways 310 a-350 a. In an exemplary implementation, the spacing parameter p is set to zero for the additional permitted zones. In other words, users have access up to the boundaries surrounding the additional permitted zones 310 b-350 b. In another exemplary implementation, one or more spacing parameters (the same as or different from the spacing parameter(s) for permitted zone 360) may be implemented in one or more of the additional permitted zones 310 b-350 b. In one implementation, the additional permitted zones 310 b-350 b can be treated as other permitted zones (i.e., treated just like the permitted zone 360). Alternatively, the permitted zone 360 and the additional permitted zones 310 b-350 b may be considered as a single (polygonal) permitted zone.
  • FIG. 4 illustrates the exemplary partitioned area 300 of FIG. 3B being divided into nine zones. The permitted zone 360 of FIG. 3B is the middle zone, or zone 5. Zones 1-4 and 6-9 are indicated by dashed lines and each extends to infinity (or at least to the edge of the virtual environment). Three examples are provided below. In each example, a user is currently located in zone 5.
  • In a first example, the user inputs a move to P1 in zone 3, which is not a permitted zone. A possible proximate point to P1 is the point in zone 5 nearest to P1, P1′, which is approximated as the upper right hand corner point of zone 5. The user is therefore moved to P1′ instead of P1.
  • In a second example, the user's input of a new location P2 is located in additional permitted zone 350 b in zone 4. The additional permitted zone 350 b is a permitted zone, so the user is moved to P2 in the additional permitted zone 350 b. In an exemplary implementation, areas outside the additional permitted zone 350 b can also be divided into multiple zones to facilitate quicker determination of the proper location for the next move.
  • In a third example, the user's input of a new location P3 is located in zone 6, and not within a permitted zone. In this example, a proximate point can be within zone 5 or zone 6, depending on design choice. For example, in a first scenario, the user is moved to a proximate point (e.g., the nearest point) to P3 in zone 5 (where the user started). In a second scenario, the user is moved to a proximate point (e.g., the nearest point) to P3 in the additional permitted zone 330 b in zone 6. In this scenario, the additional permitted zone 330 b becomes the current permitted zone and all areas outside of the additional permitted zone 330 b can be divided into multiple zones to facilitate quicker determination of the proper location for the next move.
  • By dividing the areas outside the permitted zone where the user is currently located into multiple zones, one can efficiently and quickly determine a proximate point between a new location and that (or another) permitted zone and move the user accordingly.
  • IV. Exemplary Processes for Dividing Zones Based on Constraint Lines Surrounding a Permitted Zone
  • In an exemplary implementation, zones can be characterized based on constraint lines surrounding a polygonal permitted zone. This is particularly useful for representing the zones in a manner that can be readily stored and manipulated by a computer.
  • FIG. 5 illustrates an exemplary polygonal permitted zone 500 in a two-dimensional virtual environment having a generalized u,v coordinate system. The polygonal permitted zone 500 is characterized by constraint lines I-V. Each constraint line I-V separates two half planes. Each constraint line can be characterized as a_k*u + b_k*v + c_k = 0 in the generalized u,v coordinate system. The half planes separated by each constraint line can be characterized either as a_k*u + b_k*v + c_k > 0 (the positive half plane) or as a_k*u + b_k*v + c_k < 0 (the negative half plane). As a matter of convenience in this exemplary embodiment, the half plane outside the polygonal permitted zone 500 will be referred to as the "negative" half plane.
  • FIG. 6 illustrates exemplary zones 1-11 characterizable by the constraint lines I-V (corresponding to k=1 to 5, respectively). In this example, the permitted zone 500 is zone 11. Each zone in FIG. 6 can be characterized by overlapping half planes as shown in the table illustrated in FIG. 7.
  • In an exemplary implementation, the table in FIG. 7 may be used to quickly deduce the zone in which a given point (u, v) is located. For example, if one determines that (u, v) is located in the positive half plane of constraint line I, then one can eliminate zones 1 to 3, and so forth. The table in FIG. 7 is merely exemplary. One skilled in the art will recognize that, depending on design choice, other calculations can be implemented to quickly determine a zone at least partially bounded by constraint lines.
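The half-plane test can be sketched as follows. The constraint-line representation follows the a*u + b*v + c = 0 form used above, while the example lines for a unit-square permitted zone are this sketch's own assumption:

```python
def half_plane_signs(point, constraint_lines):
    """For each constraint line (a, b, c) with a*u + b*v + c = 0, report the
    half plane containing point (u, v): +1 for positive, -1 for negative, and
    0 when the point lies on the line.  The resulting sign tuple plays the
    role of a row key into a zone table like the one in FIG. 7.
    """
    u, v = point

    def sign(x):
        return (x > 0) - (x < 0)

    return tuple(sign(a * u + b * v + c) for a, b, c in constraint_lines)


# Example (hypothetical): four constraint lines bounding a unit-square
# permitted zone, oriented so the inside is the positive half plane of
# every line.
UNIT_SQUARE = [(1, 0, 0), (-1, 0, 1), (0, 1, 0), (0, -1, 1)]
```

Here half_plane_signs((0.5, 0.5), UNIT_SQUARE) yields (1, 1, 1, 1), the all-positive signature of the permitted zone itself, while any -1 entry places the point in one of the outer zones.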
  • In FIG. 8, constraint lines I and III can be characterized by a_1*u + b_1*v + c_1 = 0 and a_3*u + b_3*v + c_3 = 0, respectively. In this example, zone 3 is the area of overlap between the negative half planes a_1*u + b_1*v + c_1 < 0 and a_3*u + b_3*v + c_3 < 0. Similarly, zone 1 can be characterized by the negative half planes of constraint lines I and II, and zone 2 can be characterized by the negative half plane of constraint line I and the positive half planes of constraint lines II and III.
  • Thus, the zone in which a new location is located can be ascertained based on the constraint lines surrounding a permitted zone (e.g., by evaluating the sign of a_k*u + b_k*v + c_k for each relevant constraint line k and determining the zone from FIG. 7). After determining the zone of a new location, a proximate point to the new location can be quickly determined by applying the techniques described in Section III above.
  • The foregoing describes a generalized process for characterizing polygonal zones in a virtual environment. A rectangular permitted zone (having rectangular surrounding zones) can alternatively be handled by a simpler technique. FIG. 9 illustrates an exemplary rectangular permitted zone (zone 5) surrounded by four constraint lines I-IV that may extend to the boundaries of the virtual environment. In an x,y coordinate system, constraint lines I and II have constant x values of A and B, respectively, and constraint lines III and IV have constant y values of C and D, respectively. Zones 1-9 can be readily characterized based on these constraint lines. Any point within a given zone will have an identifiable relationship with respect to the constant x, y values of the constraint lines for that zone. For example, any point having a value of x greater than A and less than B, and a value of y greater than C and less than D, must be in zone 5. FIG. 10 illustrates an exemplary table characterizing each zone based on its relationship to the constant values (of that zone's constraint lines).
  • Referring back to FIG. 9, upon determining the zone where the new location (x_2, y_2) is located and knowing the current location (x_1, y_1), one can determine a proximate point (x, y) very quickly. For example, a direction-maintaining proximate point may be determined by solving the equation y − y_1 = ((y_2 − y_1)/(x_2 − x_1)) * (x − x_1), where either x or y is a known constant value determined by the boundary type (e.g., x = constant for a vertical boundary and y = constant for a horizontal boundary). In this example, the proximate point (x, y) is the intersection point on the boundary of zones 2 and 5, where y = D. A person skilled in the art will recognize that this proximate point is merely exemplary. Other types of proximate points (e.g., the nearest point to the new location, the nearest point to the old location, etc.) may alternatively be implemented in accordance with design choice.
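The constant-value comparisons of FIG. 10 can be sketched as a zone classifier. The 1–9 numbering below (rows from the top left, with zone 5 in the middle) is this sketch's assumption about the figure's layout:

```python
def zone_of(point, A, B, C, D):
    """Classify a point against a rectangular permitted zone bounded by
    x = A (left), x = B (right), y = C (bottom), and y = D (top).

    Returns a zone number 1-9 laid out as
        1 2 3
        4 5 6
        7 8 9
    with zone 5 being the permitted zone itself.
    """
    x, y = point
    col = 0 if x < A else (2 if x > B else 1)
    row = 0 if y > D else (2 if y < C else 1)  # row 0 is the top band
    return 3 * row + col + 1
```

For instance, a point with A < x < B and C < y < D lands in zone 5, matching the example in the text.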
  • V. Other Exemplary Implementations and Aspects
  • 1. Two or More Partitioned Areas Connected by Passageways
  • In a typical virtual environment, multiple partitioned areas are interconnected. For example, multiple partitioned areas may be interconnected by openings (e.g., doors) or passageways (e.g., hallways).
  • FIG. 11 illustrates two partitioned areas 1110 and 1120 being interconnected by a passageway 1130. In an exemplary implementation, each partitioned area 1110, 1120 includes a primary permitted zone 1140, 1160, respectively. In addition, each partitioned area 1110, 1120 may include one or more additional permitted zones 1170, 1180 to enable users to go through doorways (not shown).
  • Depending on the size of a passageway, a spacing parameter p may be set to a non-zero value or to a zero value (i.e., equivalent to not having a spacing parameter). For example, if a passageway is narrow, it may not be practical to limit the user's movement within the passageway. However, if a passageway is being used to display images of objects, a spacing parameter (albeit a small one) may be implemented to prevent the user from getting too close to the images.
  • In FIG. 11, if a user's current position is in the passageway 1130, the passageway 1130 is the permitted zone and all areas outside the passageway 1130 can be divided into multiple zones as described in various implementations herein.
  • 2. Creating a Path in a Partitioned Area
  • Permitted zones may also be linked together to create a path in a partitioned area.
  • FIG. 12 illustrates an exemplary partitioned area 1200 having multiple permitted zones 1210-1250 within the area.
  • In an exemplary implementation, each permitted zone can be implemented by selecting appropriate spacing parameter(s) for that zone. For example, permitted zone 1210 has the following spacing parameters for each of its four boundaries: upper boundary, p=0; lower boundary, p=8; right boundary, p=6.5; left boundary, p=4.
  • This implementation can be used, for example, to constrain a user to a guided tour through a partitioned area.
  • 3. Application in a Three-Dimensional Virtual Environment
  • The various exemplary implementations described herein are generally described in the context of two-dimensional space. One skilled in the art will readily appreciate that the exemplary implementations can be adapted to constrain user movements in three-dimensional space, in which each permitted zone is a three-dimensional volume.
  • For example, in the case of a rectangular permitted zone, instead of dividing the virtual environment into nine zones in two-dimensional space (8 outer zones surrounding a central permitted zone), one could divide the virtual environment into 27 zones in three-dimensional space (26 outer zones surrounding a central permitted zone). In an exemplary implementation, the zones can be characterized by constraint planes (a more generalized form of intersection by constraint lines) in the three-dimensional space. Based on these zones, one can efficiently determine where and how to move the user (e.g., by determining the nearest point to the new location within a permitted zone).
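A sketch of the 27-zone idea, assuming a rectangular-box permitted zone (names illustrative): classifying each axis independently as below/inside/above yields 3 × 3 × 3 = 27 zone labels, and the nearest point within the permitted zone follows by clamping each coordinate.

```python
def zone_index_3d(p, box):
    """Classify point p = (x, y, z) against a box permitted zone given as
    box = ((x_lo, y_lo, z_lo), (x_hi, y_hi, z_hi)).

    Returns a tuple of -1/0/+1 per axis: (0, 0, 0) is the central permitted
    zone; the other 26 combinations label the surrounding outer zones.
    """
    lo, hi = box

    def side(v, a, b):
        return -1 if v < a else (1 if v > b else 0)

    return tuple(side(v, a, b) for v, a, b in zip(p, lo, hi))


def nearest_point_3d(p, box):
    """Nearest point to p within the box permitted zone: per-axis clamping."""
    lo, hi = box
    return tuple(min(max(v, a), b) for v, a, b in zip(p, lo, hi))
```

The same per-axis decomposition generalizes the constraint-plane characterization mentioned above for axis-aligned boxes.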
  • VI. An Exemplary System and Operating Environment
  • FIG. 13 illustrates an exemplary system for generating virtual environments in which exemplary implementations described herein may be applied. The exemplary system 1300 includes a graphics processing unit 1310, a rendering engine 1320, a user output interface 1330, a user input interface 1340, and memory 1350. The graphics processing unit 1310 functions to receive object data 10A and generate two- and/or three-dimensional imaging data 10B corresponding to the specified type of virtual environment using techniques that are well known in the field of graphics imaging. Rendering engine 1320 receives imaging data 10B and generates rendering data 11A outputted through user output interface 1330. For instance, rendering data 11A might be configured for a monitor (not shown) or other form of output device to display the three-dimensional virtual environment including the representative images. User input interface 1340 might include computer-implemented hardware and/or software that allows a user to interact with the three-dimensional environment. For instance, the user input interface 1340 might allow the user to navigate through and view the three-dimensional environment by moving a displayed cursor using a user input device (e.g., a keyboard, a mouse, a joystick, a pressure-sensitive screen, etc.). Graphics processing unit 1310 may function to generate the representative images of the data objects, or the representative image data may be stored within the memory 1350 and linked to a data object database.
  • The embodiments described herein may be implemented in an operating environment, such as the system 1300, comprising software installed on a computer, in hardware, or in a combination of software and hardware.
  • The software and/or hardware would typically include some type of computer-readable media which can store data and logic instructions that are accessible by the computer or the processing logic within the hardware. Such media might include, without limitation, hard disks, floppy disks, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROMs), and the like.
  • VII. Conclusion
  • The foregoing examples illustrate certain exemplary embodiments from which other embodiments, variations, and modifications will be apparent to those skilled in the art. The inventions should therefore not be limited to the particular embodiments discussed above, but rather are defined by the claims. Furthermore, some of the claims may include alphanumeric identifiers to distinguish the elements thereof. Such identifiers are merely provided for convenience in reading, and should not necessarily be construed as requiring or implying a particular order of steps, or a particular sequential relationship among the claim elements.

Claims (33)

1. A method for constraining a user's movement in a virtual environment, comprising:
determining a user's current location in a virtual environment;
determining a permitted zone based on said current location;
dividing an area outside said permitted zone into multiple zones;
obtaining an input to move to a new location;
calculating a point within said permitted zone proximate to said new location based on one or more of said multiple zones; and
moving said user to said proximate point.
2. The method of claim 1, wherein said calculating said proximate point includes selecting a nearest corner of said one of said multiple zones.
3. The method of claim 1, wherein said calculating said proximate point is constrained by maintaining a similar direction, relative to said current location, as said new location.
4. The method of claim 1, wherein said calculating said proximate point is constrained by minimizing the distance between said new location and a boundary of said one of said multiple zones.
5. The method of claim 1, wherein said determining a permitted zone includes:
determining a partitioned area within said virtual environment where said user is currently located; and
determining said permitted zone within said partitioned area by subtracting a spacing parameter from one or more walls of said partitioned area.
6. The method of claim 1, wherein said calculating includes:
determining a zone of said multiple zones where said new location is located; and
determining a point in said permitted zone that is proximate to said new location based on said zone of said multiple zones.
7. The method of claim 1, wherein said moving said user includes:
moving said user in a straight line path from said current location to said proximate point.
8. The method of claim 1, wherein said moving said user includes:
moving said user in a zig-zag path from said current location to said proximate point.
9. The method of claim 1, wherein said dividing includes:
characterizing said multiple zones based on constraint lines surrounding said permitted zone.
10. The method of claim 1, wherein said permitted zone enables said user access to an adjoining partitioned area.
11. The method of claim 1, wherein said permitted zone is a passageway connecting multiple partitioned areas.
12. The method of claim 1, further comprising:
generating multiple permitted zones to guide said user in a path through a partitioned area.
13. An apparatus for constraining a user's movement in a virtual environment, comprising:
a processor; and
a memory, said memory comprising logic instructions that, when executed:
determine a user's current location in a virtual environment;
determine a permitted zone based on said current location;
divide an area outside said permitted zone into multiple zones;
obtain an input to move to a new location;
calculate a point within said permitted zone proximate to said new location based on one or more of said multiple zones; and
move said user to said proximate point.
14. The apparatus of claim 13, wherein said logic instructions to calculate said proximate point include logic instructions to select a nearest corner of said one of said multiple zones.
15. The apparatus of claim 13, wherein said logic instructions to calculate said proximate point are constrained to maintain a similar direction, relative to said current location, as said new location.
16. The apparatus of claim 13, wherein said logic instructions to calculate said proximate point are constrained to minimize the distance between said new location and a boundary of said one of said multiple zones.
17. The apparatus of claim 13, wherein said logic instructions to determine a permitted zone are constrained to determine said permitted zone based on one or more spacing parameters.
18. The apparatus of claim 13, wherein said logic instructions to divide include logic instructions that characterize said multiple zones based on constraint lines surrounding said permitted zone.
19. A computer-readable medium for constraining a user's movement in a virtual environment, comprising:
logic instructions that, when executed:
determine a user's current location in a virtual environment;
determine a permitted zone based on said current location;
divide an area outside said permitted zone into multiple zones;
obtain an input to move to a new location;
calculate a point within said permitted zone proximate to said new location based on one or more of said multiple zones; and
move said user to said proximate point.
20. The computer-readable medium of claim 19, wherein said logic instructions to calculate said proximate point include logic instructions to select a nearest corner of said one of said multiple zones.
21. The computer-readable medium of claim 19, wherein said logic instructions to calculate said proximate point are constrained to maintain a similar direction, relative to said current location, as said new location.
22. The computer-readable medium of claim 19, wherein said logic instructions to calculate said proximate point are constrained to minimize the distance between said new location and a boundary of said one of said multiple zones.
23. The computer-readable medium of claim 19, wherein said logic instructions to determine a permitted zone include logic instructions that, when executed:
determine a partitioned area within said virtual environment where said user is currently located; and
determine said permitted zone within said partitioned area by subtracting one or more spacing parameters from one or more walls of said partitioned area.
24. The computer-readable medium of claim 19, wherein said logic instructions to calculate include logic instructions that, when executed:
determine a zone of said multiple zones where said new location is located; and
determine a point on said permitted zone that is closest to said zone.
25. The computer-readable medium of claim 19, wherein said logic instructions to determine a permitted zone are constrained to determine said permitted zone based on one or more spacing parameters.
26. The computer-readable medium of claim 19, wherein said logic instructions to divide include logic instructions that, when executed:
characterize said multiple zones based on constraint lines surrounding said permitted zone.
27. The computer-readable medium of claim 19, wherein said permitted zone enables said user access to an adjoining partitioned area.
28. The computer-readable medium of claim 19, wherein said permitted zone is a passageway connecting multiple partitioned areas.
29. The computer-readable medium of claim 19, further comprising logic instructions that, when executed:
generate multiple permitted zones to guide said user in a path through a partitioned area.
30. An apparatus for constraining a user's movement in a virtual environment, comprising:
means for determining a user's current location in a virtual environment;
means for determining a permitted zone based on said current location;
means for dividing an area outside said permitted zone into multiple zones;
means for obtaining an input to move to a new location;
means for calculating a point within said permitted zone proximate to said new location based on at least one of said multiple zones; and
means for moving said user to said proximate point.
31. The apparatus of claim 30, wherein said calculating said proximate point is constrained to maintain a similar direction, relative to said current location, as said new location.
32. The apparatus of claim 30, wherein said calculating said proximate point is constrained to minimize the distance between said new location and a boundary of said one of said multiple zones.
33. The apparatus of claim 30, further comprising:
means for generating multiple permitted zones to guide said user in a path through a partitioned area.
US11/052,150 2001-10-30 2005-02-07 Constraining user movement in virtual environments Abandoned US20050144574A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/052,150 US20050144574A1 (en) 2001-10-30 2005-02-07 Constraining user movement in virtual environments

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/021,648 US6907579B2 (en) 2001-10-30 2001-10-30 User interface and method for interacting with a three-dimensional graphical environment
US11/052,150 US20050144574A1 (en) 2001-10-30 2005-02-07 Constraining user movement in virtual environments

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/021,648 Continuation-In-Part US6907579B2 (en) 2001-10-30 2001-10-30 User interface and method for interacting with a three-dimensional graphical environment

Publications (1)

Publication Number Publication Date
US20050144574A1 true US20050144574A1 (en) 2005-06-30

Family

ID=21805376

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/021,648 Expired - Lifetime US6907579B2 (en) 2001-10-30 2001-10-30 User interface and method for interacting with a three-dimensional graphical environment
US11/052,150 Abandoned US20050144574A1 (en) 2001-10-30 2005-02-07 Constraining user movement in virtual environments

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/021,648 Expired - Lifetime US6907579B2 (en) 2001-10-30 2001-10-30 User interface and method for interacting with a three-dimensional graphical environment

Country Status (6)

Country Link
US (2) US6907579B2 (en)
EP (1) EP1442356B1 (en)
JP (1) JP4130409B2 (en)
DE (1) DE60231229D1 (en)
TW (1) TW583597B (en)
WO (1) WO2003038593A1 (en)


Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7269632B2 (en) 2001-06-05 2007-09-11 Xdyne, Inc. Networked computer system for communicating and operating in a virtual reality environment
US6907579B2 (en) * 2001-10-30 2005-06-14 Hewlett-Packard Development Company, L.P. User interface and method for interacting with a three-dimensional graphical environment
US20040027394A1 (en) * 2002-08-12 2004-02-12 Ford Global Technologies, Inc. Virtual reality method and apparatus with improved navigation
US7853899B1 (en) * 2002-12-30 2010-12-14 Sap Aktiengesellschaft Configuring and extending a user interface
US10151599B1 (en) * 2003-03-13 2018-12-11 Pamala Meador Interactive virtual reality tour
US20060114251A1 (en) * 2004-02-11 2006-06-01 Miller Jacob J Methods for simulating movement of a computer user through a remote environment
EP1621988A3 (en) * 2004-07-24 2012-04-04 Samsung Electronics Co., Ltd Three-Dimensional Motion Graphic User Interface and method and apparatus for providing the same.
KR100631763B1 (en) * 2004-07-26 2006-10-09 삼성전자주식회사 3D motion graphic user interface and method and apparatus for providing same
US7395191B2 (en) 2004-07-28 2008-07-01 Blueridge Analytic, Inc. Computer-implemented land planning system and method designed to generate at least one conceptual fit solution to a user-defined land development problem
KR100643276B1 (en) * 2004-08-07 2006-11-10 삼성전자주식회사 Three dimensional motion graphic user interface and method and apparatus for providing this user interface
KR100755684B1 (en) * 2004-08-07 2007-09-05 삼성전자주식회사 Three dimensional motion graphic user interface and method and apparatus for providing this user interface
JP4500632B2 (en) * 2004-09-07 2010-07-14 キヤノン株式会社 Virtual reality presentation apparatus and information processing method
US8418075B2 (en) 2004-11-16 2013-04-09 Open Text Inc. Spatially driven content presentation in a cellular environment
US8001476B2 (en) * 2004-11-16 2011-08-16 Open Text Inc. Cellular user interface
US20060123351A1 (en) * 2004-12-08 2006-06-08 Evil Twin Studios, Inc. System and method for communicating objects status within a virtual environment using translucency
US20060256109A1 (en) * 2005-03-18 2006-11-16 Kristin Acker Interactive floorplan viewer
US7710423B2 (en) * 2005-03-21 2010-05-04 Microsoft Corporation Automatic layout of items along an embedded one-manifold path
US20070011617A1 (en) * 2005-07-06 2007-01-11 Mitsunori Akagawa Three-dimensional graphical user interface
US20070192727A1 (en) * 2006-01-26 2007-08-16 Finley William D Three dimensional graphical user interface representative of a physical work space
US9477495B2 (en) 2006-08-17 2016-10-25 International Business Machines Corporation Conservative class preloading for real time Java execution
JP4786486B2 (en) * 2006-09-19 2011-10-05 キヤノンソフトウェア株式会社 Information processing apparatus, information processing apparatus control method, and program
JP4296521B2 (en) * 2007-02-13 2009-07-15 ソニー株式会社 Display control apparatus, display control method, and program
JP2008210348A (en) * 2007-02-28 2008-09-11 Univ Of Tokyo Image display device
US9043707B2 (en) * 2007-03-28 2015-05-26 Autodesk, Inc. Configurable viewcube controller
US7782319B2 (en) * 2007-03-28 2010-08-24 Autodesk, Inc. Three-dimensional orientation indicator and controller
US8326442B2 (en) * 2007-05-25 2012-12-04 International Business Machines Corporation Constrained navigation in a three-dimensional (3D) virtual arena
US8686991B2 (en) 2007-09-26 2014-04-01 Autodesk, Inc. Navigation system for a 3D virtual scene
US9454283B1 (en) * 2008-01-07 2016-09-27 The Mathworks, Inc. Three-dimensional visualization
US9098647B2 (en) * 2008-03-10 2015-08-04 Apple Inc. Dynamic viewing of a three dimensional space
US8089479B2 (en) * 2008-04-11 2012-01-03 Apple Inc. Directing camera behavior in 3-D imaging system
US10134044B1 (en) 2008-05-28 2018-11-20 Excalibur Ip, Llc Collection and use of fine-grained user behavior data
WO2010022386A2 (en) * 2008-08-22 2010-02-25 Google Inc. Navigation in a three dimensional environment on a mobile device
KR101037497B1 (en) * 2009-08-21 2011-05-26 한국과학기술원 Three-dimensional navigation system for contents guide and method thereof
US9358158B2 (en) * 2010-03-16 2016-06-07 Kci Licensing, Inc. Patterned neo-epithelialization dressings, systems, and methods
US8913056B2 (en) * 2010-08-04 2014-12-16 Apple Inc. Three dimensional user interface effects on a display by using properties of motion
US9411413B2 (en) 2010-08-04 2016-08-09 Apple Inc. Three dimensional user interface effects on a display
US8560960B2 (en) 2010-11-23 2013-10-15 Apple Inc. Browsing and interacting with open windows
US9851866B2 (en) 2010-11-23 2017-12-26 Apple Inc. Presenting and browsing items in a tilted 3D space
EP2669781B1 (en) * 2012-05-30 2016-08-17 Dassault Systèmes A user interface for navigating in a three-dimensional environment
US9786097B2 (en) 2012-06-22 2017-10-10 Matterport, Inc. Multi-modal method for interacting with 3D models
US10139985B2 (en) 2012-06-22 2018-11-27 Matterport, Inc. Defining, displaying and interacting with tags in a three-dimensional model
US10163261B2 (en) * 2014-03-19 2018-12-25 Matterport, Inc. Selecting two-dimensional imagery data for display within a three-dimensional model
US10108693B2 (en) 2013-03-14 2018-10-23 Xdyne, Inc. System and method for interacting with virtual maps
JP6121564B2 (en) * 2013-11-29 2017-04-26 京セラドキュメントソリューションズ株式会社 Information processing apparatus, image forming apparatus, and information processing method
WO2015168167A1 (en) * 2014-04-28 2015-11-05 Invodo, Inc. System and method of three-dimensional virtual commerce environments
JP6239218B1 (en) * 2016-02-10 2017-11-29 三菱電機株式会社 Display control apparatus, display system, and display method
CN112799559B (en) * 2021-02-01 2023-10-31 北京海天维景科技有限公司 Anti-lost interaction method and device applied to digital sand table

Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4089524A (en) * 1977-01-18 1978-05-16 Gremlin Industries, Inc. Digitally controlled electronic game
US4862373A (en) * 1987-05-13 1989-08-29 Texas Instruments Incorporated Method for providing a collision free path in a three-dimensional space
US4905147A (en) * 1986-10-15 1990-02-27 Logg George E Collision detection system for video system
US5047916A (en) * 1988-03-25 1991-09-10 Kabushiki Kaisha Toshiba Method and apparatus of free space enumeration for collision avoidance
US5050883A (en) * 1990-02-07 1991-09-24 Adolph E. Goldfarb Self-contained competitive game for developing spatial sense in young children
US5287446A (en) * 1990-10-15 1994-02-15 Sierra On-Line, Inc. System and methods for intelligent movement on computer displays
US5411272A (en) * 1992-11-20 1995-05-02 Sega Of America, Inc. Video game with spiral loop graphics
US5577961A (en) * 1994-06-28 1996-11-26 The Walt Disney Company Method and system for restraining a leader object in a virtual reality presentation
US5588914A (en) * 1994-06-28 1996-12-31 The Walt Disney Company Method and system for guiding a user in a virtual reality presentation
US5769718A (en) * 1996-05-15 1998-06-23 Rieder; William R. Video game apparatus and medium readable by a computer stored with video game program
US5889951A (en) * 1996-05-13 1999-03-30 Viewpoint Corporation Systems, methods, and computer program products for accessing, leasing, relocating, constructing and modifying internet sites within a multi-dimensional virtual reality environment
US6106399A (en) * 1997-06-16 2000-08-22 Vr-1, Inc. Internet audio multi-user roleplaying game
US6123619A (en) * 1999-03-23 2000-09-26 Square Co., Ltd. Method of generating maps with fixed and random portions and use of same in video games
US6139433A (en) * 1995-11-22 2000-10-31 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control due to environmental conditions
US6215498B1 (en) * 1998-09-10 2001-04-10 Lionhearth Technologies, Inc. Virtual command post
US6319129B1 (en) * 1999-09-30 2001-11-20 Konami Corporation Method and a video game system of generating a field map
US6380952B1 (en) * 1998-04-07 2002-04-30 International Business Machines Corporation System for continuous display and navigation in a virtual-reality world
US20030080956A1 (en) * 2001-10-30 2003-05-01 Chang Nelson Liang An Apparatus and method for distributing representative images in partitioned areas of a three-dimensional graphical environment
US20030080960A1 (en) * 2001-10-30 2003-05-01 Chang Nelson Liang An Layout design apparatus and method for three-dimensional graphical environments
US20030081012A1 (en) * 2001-10-30 2003-05-01 Chang Nelson Liang An User interface and method for interacting with a three-dimensional graphical environment
US20030081010A1 (en) * 2001-10-30 2003-05-01 An Chang Nelson Liang Automatically designed three-dimensional graphical environments for information discovery and visualization
US20030108695A1 (en) * 2001-08-28 2003-06-12 Freek Michael A. Polyethylene terephthalate disposable tumblers
US20050086612A1 (en) * 2003-07-25 2005-04-21 David Gettman Graphical user interface for an information display system
US6909443B1 (en) * 1999-04-06 2005-06-21 Microsoft Corporation Method and apparatus for providing a three-dimensional task gallery computer interface

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734805A (en) * 1994-06-17 1998-03-31 International Business Machines Corporation Apparatus and method for controlling navigation in 3-D space
US6219045B1 (en) * 1995-11-13 2001-04-17 Worlds, Inc. Scalable virtual world chat client-server system
US6097393A (en) * 1996-09-03 2000-08-01 The Takshele Corporation Computer-executed, three-dimensional graphical resource management process and system
US6057856A (en) * 1996-09-30 2000-05-02 Sony Corporation 3D virtual reality multi-user interaction with superimposed positional information display for each user
US6154211A (en) * 1996-09-30 2000-11-28 Sony Corporation Three-dimensional, virtual reality space display processing apparatus, a three dimensional virtual reality space display processing method, and an information providing medium
US20010055039A1 (en) * 1996-09-30 2001-12-27 Koichi Matsuda Three-dimensional virtual reality space display processing apparatus, a three-dimensional virtual reality space display processing method, and an information providing medium
US6271843B1 (en) * 1997-05-30 2001-08-07 International Business Machines Corporation Methods systems and computer program products for transporting users in three dimensional virtual reality worlds using transportation vehicles
US5883628A (en) * 1997-07-03 1999-03-16 International Business Machines Corporation Climability: property for objects in 3-D virtual environments
JP3561114B2 (en) * 1997-07-28 2004-09-02 富士通株式会社 3D information browsing device
US5907328A (en) * 1997-08-27 1999-05-25 International Business Machines Corporation Automatic and configurable viewpoint switching in a 3D scene
JP3046578B2 (en) * 1998-06-11 2000-05-29 株式会社ナムコ Image generation device and information storage medium
US6414679B1 (en) * 1998-10-08 2002-07-02 Cyberworld International Corporation Architecture and methods for generating and displaying three dimensional representations
WO2000020987A2 (en) 1998-10-08 2000-04-13 Cyberworld, International Corp. Systems and methods for displaying three dimensional representations and an associated separate web window
US6388688B1 (en) * 1999-04-06 2002-05-14 Vergics Corporation Graph-based visual navigation through spatial environments
US6346938B1 (en) * 1999-04-27 2002-02-12 Harris Corporation Computer-resident mechanism for manipulating, navigating through and mensurating displayed image of three-dimensional geometric model
US6636210B1 (en) * 2000-03-03 2003-10-21 Muse Corporation Method and system for auto-navigation in a three dimensional viewing environment
EP1261906A2 (en) 2000-03-10 2002-12-04 Richfx Ltd. Natural user interface for virtual reality shopping systems


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060267966A1 (en) * 2005-05-24 2006-11-30 Microsoft Corporation Hover widgets: using the tracking state to extend capabilities of pen-operated devices
US20060267967A1 (en) * 2005-05-24 2006-11-30 Microsoft Corporation Phrasing extensions and multiple modes in one spring-loaded control
US20070168890A1 (en) * 2006-01-13 2007-07-19 Microsoft Corporation Position-based multi-stroke marking menus
US7603633B2 (en) * 2006-01-13 2009-10-13 Microsoft Corporation Position-based multi-stroke marking menus
US20070257915A1 (en) * 2006-05-08 2007-11-08 Ken Kutaragi User Interface Device, User Interface Method and Information Storage Medium
US8890895B2 (en) * 2006-05-08 2014-11-18 Sony Corporation User interface device, user interface method and information storage medium
US9223471B2 (en) 2010-12-28 2015-12-29 Microsoft Technology Licensing, Llc Touch screen control
US20130141428A1 (en) * 2011-11-18 2013-06-06 Dale L. Gipson Computer-implemented apparatus, system, and method for three dimensional modeling software
US20150278211A1 (en) * 2014-03-31 2015-10-01 Microsoft Corporation Using geographic familiarity to generate search results
US9619523B2 (en) * 2014-03-31 2017-04-11 Microsoft Technology Licensing, Llc Using geographic familiarity to generate search results
US10371541B2 (en) 2014-03-31 2019-08-06 Microsoft Technology Licensing, Llc Using geographic familiarity to generate navigation directions
US11532139B1 (en) * 2020-06-07 2022-12-20 Apple Inc. Method and device for improved pathfinding

Also Published As

Publication number Publication date
EP1442356B1 (en) 2009-02-18
JP4130409B2 (en) 2008-08-06
DE60231229D1 (en) 2009-04-02
WO2003038593A1 (en) 2003-05-08
US20030081012A1 (en) 2003-05-01
US6907579B2 (en) 2005-06-14
TW200301444A (en) 2003-07-01
EP1442356A1 (en) 2004-08-04
TW583597B (en) 2004-04-11
JP2005531824A (en) 2005-10-20

Similar Documents

Publication Publication Date Title
US20050144574A1 (en) Constraining user movement in virtual environments
US20230083703A1 (en) Capturing Environmental Features Using 2D and 3D Scans
US6426757B1 (en) Method and apparatus for providing pseudo-3D rendering for virtual reality computer user interfaces
US5463722A (en) Automatic alignment of objects in two-dimensional and three-dimensional display space using an alignment field gradient
US6323859B1 (en) Method and system for interactively determining and displaying geometric relationship between three dimensional objects based on predetermined geometric constraints and position of an input device
US6094196A (en) Interaction spheres of three-dimensional objects in three-dimensional workspace displays
de Berg et al. Delaunay triangulations: height interpolation
Fan et al. A three-step approach of simplifying 3D buildings modeled by CityGML
US7439972B2 (en) Method of generating a computer readable model
US11182513B2 (en) Generating technical drawings from building information models
Ma Bisectors and Voronoi diagrams for convex distance functions
Zheng et al. SmartCanvas: Context‐inferred Interpretation of Sketches for Preparatory Design Studies
EP2589933B1 (en) Navigation device, method of predicting a visibility of a triangular face in an electronic map view
Hürst et al. Dynamic versus static peephole navigation of VR panoramas on handheld devices
US20020180809A1 (en) Navigation in rendered three-dimensional spaces
Huhnt Reconstruction of edges in digital building models
US20230185984A1 (en) Generating Technical Drawings From Building Information Models
Asghari et al. Developing an integrated approach to validate 3D ownership spaces in complex multistorey buildings
US9454554B1 (en) View dependent query of multi-resolution clustered 3D dataset
AU2020221451A1 (en) Generating technical drawings from building information models
Žalik et al. Construction of a non-symmetric geometric buffer from a set of line segments
Ieronutti et al. Automatic derivation of electronic maps from X3D/VRML worlds
Sinyukov et al. CWave: theory and practice of a fast single-source any-angle path planning algorithm
US20030001906A1 (en) Moving an object on a drag plane in a virtual three-dimensional space
Izaki et al. Visibility polygon traversal algorithm

Legal Events

Date Code Title Description
AS Assignment

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHANG, NELSON LIANG AN;REEL/FRAME:016261/0737

Effective date: 20050204

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION