WO1997022918A1 - Computer-controlled system for producing three-dimensional navigable photographs of areas and method thereof - Google Patents

Computer-controlled system for producing three-dimensional navigable photographs of areas and method thereof

Info

Publication number
WO1997022918A1
Authority
WO
WIPO (PCT)
Prior art keywords
computer
camera apparatus
select
controlled system
photographic images
Prior art date
Application number
PCT/US1996/020039
Other languages
French (fr)
Inventor
Michael H. Zimmerman
Scott A. Moore
Original Assignee
Mediamaxx Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mediamaxx Incorporated filed Critical Mediamaxx Incorporated
Priority to AU14637/97A priority Critical patent/AU1463797A/en
Publication of WO1997022918A1 publication Critical patent/WO1997022918A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering

Abstract

A computer-controlled system (10) for producing and combining photographic images to create a virtual reality environment includes a movable camera apparatus (200) for creating at least one digital photographic image and a computer (100) including a memory (102), a CPU (104) and an interface (106) for controlling the movable camera apparatus to move to a select location and produce a select number of digital photographic images at the select location. The computer (100) then combines the digital photographic images together to form the virtual reality environment.

Description

COMPUTER-CONTROLLED SYSTEM FOR PRODUCING THREE-DIMENSIONAL NAVIGABLE PHOTOGRAPHS OF AREAS AND METHOD THEREOF
Field of the Invention
This invention relates generally to computer-controlled systems and pertains more particularly to a computer-controlled system for producing three-dimensional navigable photographs of areas and the method thereof.
Background of the Invention
Computer technology has developed so that particular areas, such as retail establishments, homes and other spaces, can be displayed on personal computers in a realistic three-dimensional form or in a "virtual reality" environment. Apple Computer, Inc. of Cupertino, California has developed technology under its QuickTime VR® trademark which allows users to create such realistic areas which are then playable on a personal computer. This technology permits the user to create a "node" by stitching and blending together a series of side-by-side still-life digital photographs to form one long panoramic photograph. This created panoramic photograph results in a single "node" which is a 360° panorama of the area around the user.
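As a simplified illustration of what a single node is, the sketch below joins equally sized side-by-side frames into one long strip. Real panorama authoring tools such as QuickTime VR also warp and blend the overlapping edges, so this is only a schematic of the idea, not the actual stitching algorithm; the use of the Pillow imaging library is an assumption for the sketch.

```python
from PIL import Image  # Pillow; assumed available for this illustration


def naive_panorama(frame_paths, out_path="panorama.jpg"):
    """Join side-by-side frames into one long strip.

    Real stitchers also warp and blend overlapping edges; this sketch only
    concatenates equally sized frames to show the idea of a single 360-degree
    "node" image built from a series of still photographs."""
    frames = [Image.open(p) for p in frame_paths]
    width, height = frames[0].size
    strip = Image.new("RGB", (width * len(frames), height))
    for i, frame in enumerate(frames):
        strip.paste(frame, (i * width, 0))   # place each frame to the right of the last
    strip.save(out_path)
    return out_path
```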
Each node can then be combined with other nodes to create a fully navigable multi-node "movie." With a multi-node movie, a user can not only spin around a room, for example, and "see" a 360° panorama of the viewing area, but also "walk" through the room shown in the panoramic multi-node movie and explore different rooms of a store, for example, simply by branching from one node to another. The connection between nodes is accomplished by photographing intermediate non-panoramic step frames animating the transition from one node to another.
In order to create this multi-node movie, a photographer must photograph each node one at a time. A camera must be set on a tripod which must be perfectly level at all times during the photography session, the center plane of the camera lens must be precisely bisected by the vertical axis in the center of the tripod which rotates the camera, the lens plane must be situated at exactly a 90-degree angle to level ground, and the camera must be rotated precisely and uniformly between each shot. This filming of a multi-node movie is extremely exacting and tedious since, in order to provide the appearance of smooth and continuous motion, each incremental movement of the camera from one node to the next node must be precisely measured and cannot have any bias from left to right. In addition, great accuracy must be maintained and duplicated each time a node is to be photographed. Thus, if the same parameters are not maintained during the photography session of each node, then the session would have to be re-shot because the movie would appear to be disjointed and unconnected and would not correctly produce a feeling of three-dimensionality or virtual reality. Such action is very time-consuming and can become a very expensive process if mistakes are made.
Accordingly, other systems are being developed to create digital photographic images for producing a multi-node movie which have increased accuracy while also being simple and easy to use.
It is therefore an object of the present invention to provide a computer-controlled system and method for producing three-dimensional navigable photographs of areas which is accurate and precise.
It is an additional object of the present invention to provide a computer-controlled system and method for producing three-dimensional navigable photographs of areas which is simple and easy to use.
It is a further object of the present invention to provide a computer-controlled system and method for producing three-dimensional navigable photographs of areas which is economical to manufacture and is not labor intensive. These and further objects and advantages of the present invention will become clearer in light of the following detailed description of an illustrative embodiment of this invention described in connection with the drawings.
Summary of the Invention
In accordance with the principles of the present invention, the above and other objectives are realized in a computer-controlled system for producing and combining photographic images to create a virtual reality environment which includes a movable camera apparatus for creating at least one digital photographic image and a computer for controlling the movable camera apparatus to move to a select location and produce a select number of digital photographic images at the select location. The computer then combines the digital photographic images together to form the virtual reality environment.
Brief Description of the Drawings
The above and other features and aspects of the present invention will become more apparent upon reading the following detailed description in conjunction with the accompanying drawings, in which:
FIG. 1 shows a block diagram of the components of the computer-controlled system and method for producing three-dimensional navigable photographs of areas in accordance with the principles of the present invention; and
FIG. 2 shows an illustrative diagram of a digital map for use in the computer-controlled system and method of FIG. 1.
All of the figures are drawn for ease of explanation of the basic teachings of the present invention only; the extension of the figures with respect to a particular application of the present invention will be within the skill of a practitioner of the art after the teachings of the present invention have been read, followed and understood.
Detailed Description
The present invention is directed to a computer-controlled system 10 and method which provides three-dimensional navigable photographs for selected areas to create a virtual reality environment. The computer-controlled system and method of the present invention comprises a movable camera apparatus for creating at least one digital photographic image and computing means for controlling the movable camera apparatus to move to a select location and produce a select number of digital photographic images at the select location. Computing means then combines the digital photographic images together to form a three-dimensional virtual reality environment.
The computer-controlled system 10 produces three-dimensional navigable digital photographs in order to create a multi-node movie to allow a user to see a selected particular area, such as a retail establishment, home, etc., in a three-dimensional or virtual reality environment. The computer-controlled system 10 allows a user to capture quickly, efficiently and automatically all necessary photographic components of each node of the movie, automatically process the digital photographs into a single node to create the panoramic image by stitching and dicing these images together, and automatically capture the intermediate, non-panoramic step-frames. In addition, the computer-controlled system 10 allows users to quickly add "hot spots" or selection items which can themselves be selected and viewed. These images and hot spots are then automatically combined and compressed into a composite multi-node movie. The computer-controlled system 10 in its preferred embodiment comprises a hardware and software system as illustrated in FIG. 1.
In the present illustrative case, it is assumed that computing means 100, such as Apple Macintosh® computers which are commercially available from Apple Computer, Inc. of Cupertino, California, are used in the computer-controlled system 10. In the present embodiment, the computer 100 includes memory 102, a central processing unit 104 and an interface 106 for interfacing between the computer 100 and other components. In the present embodiment, the other components comprise display means or monitor 110 for viewing information, input means or input device 120, such as a terminal, pointing device, mouse, bar code reader, etc., for inputting information, and output means or printer 130 for printing information.
In the present embodiment, a movable camera apparatus 200 comprises: camera means or a computer-controlled digital camera 210 for creating digital photographic images, specifically configured to photograph 360° panoramas; moving means or a computer-controlled motorized device 230 which rotates the camera in precise, programmable increments for a total of 360° while ensuring that the lens plane remains centered over the vertical rotational axis; leveling means or a computer-controlled motorized leveling device 240 which ensures that the camera lens is stationed at all times at a 90° angle to level ground; support structure or a motorized base 220 which transports the entire system linearly from one point to another; driving means or motor 250 for driving the motorized device 230, leveling device 240 and the support structure 220; and supply means or battery 260 for supplying power to the motor 250 to drive the various devices of the movable camera apparatus 200. In the present embodiment, the computer 100 is a low-energy-consumption CPU such as an Apple Macintosh® PowerPC 603 computer, which is commercially available from Apple Computer, Inc. of Cupertino, California, with 64MB or 128MB of RAM, 2GB or 4GB of hard disk space, an interchangeable battery pack capable of powering the system for up to eight hours or more and a portable battery recharging station.
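For concreteness, the apparatus components above could be mirrored in software roughly as follows. This is only a sketch of one possible control-side model; none of these class or method names come from the patent, and the hardware calls are left as stubs.

```python
from dataclasses import dataclass


@dataclass
class DigitalCamera:               # camera 210
    resolution: tuple

    def capture(self):
        raise NotImplementedError  # would trigger the camera over its control link


@dataclass
class Rotator:                     # motorized rotating device 230
    increment_deg: float

    def rotate(self, degrees):
        raise NotImplementedError  # precise rotation about the vertical axis


@dataclass
class CameraRig:                   # movable camera apparatus 200 as a whole
    camera: DigitalCamera
    rotator: Rotator
    battery_hours: float           # capacity of battery 260

    def move_to(self, position):
        raise NotImplementedError  # motorized base 220 driven by motor 250

    def level(self):
        raise NotImplementedError  # leveling device 240 holds the lens at 90° to level ground
```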
In the present embodiment, the system 10 also comprises remote control means or a remote controller 140 for controlling the functioning and movement of the camera apparatus 200 through the computer 100 as well as storing means or storage device 150, such as an internal hard disk, CD-ROM disk, magnetic tape, etc. for storing the application software, work-in-progress and final photographic images.
Although the present embodiment utilizes cabling to communicate between the computer 100 and the camera apparatus 200, infrared networking technology can also be used to operate the camera apparatus 200 as well as permit viewing by the user of what the camera apparatus 200 is photographing. Such infrared networking technology, which is commercially available from a variety of vendors, helps to free the camera apparatus 200 from the difficult task of transporting heavy computer hardware thereon in order to store data. Rather, the data would be transmitted between the camera apparatus 200 and a remote computing system.
In order to create this multi-node movie, the camera apparatus 200 of the system 10 is controlled by the computer 100. The computer 100 has stored therein control commands for controlling the movement of the camera. The computer 100 also has stored therein a "digital map" or path set up by the user prior to the beginning of the photography session for the camera apparatus 200 to follow a particular route in order to create the multi-node movie. This digital map is inputted to the computer 100 to outline the route the user wants the camera apparatus 200 to follow to create each node of the movie and ultimately to form the multi-node movie. The camera apparatus 200 and the movement thereof are then controlled by the computer 100 based upon the information input to form the digital map.
As shown in FIG. 2, for example, which illustrates a digital map for a retail establishment, the computer 100 can be programmed so as to control the camera apparatus 200 to follow any particular path in the store. As illustrated, for example, each aisle of the store, as indicated by "A", "B", "C", "D", "E" and "F" in FIG. 2, having shelves therebetween, can represent a "leg" of the digital map which, when combined, forms the "journey" of the camera apparatus 200. Each "N" on the map represents a select location for the camera apparatus 200 to create a node for the movie. Arrows as illustrated in FIG. 2 represent the direction the camera apparatus 200 can travel in order to create the multi-node movie. Select locations for creating a primary non-panoramic step frame and/or a secondary non-panoramic step frame are also illustrated. As illustrated in FIG. 2, a "START" point, an "END" point and arrows indicated thereon define the path which the camera apparatus 200 can be controlled to follow in order to create the multi-node movie. However, this digital map is merely illustrative in that the camera apparatus 200 can follow a variety of different paths to create the desired multi-node movie. In addition, a digital map can also be generated which allows for random movement of the camera apparatus 200 and its spacing in order to create a multi-node movie with maximum navigational freedom and flexibility.
This digital map can be generated in a variety of ways, such as pictorially by drawing it using a graphical interface, or by generating a map by "walking" or powering the system to each leg's beginning and ending point, etc., in a trial run. For example, two store aisles might be composed of three "legs": one leg down the first aisle, a second leg across and around the end of shelves of the aisle and a third leg returning the camera apparatus 200 almost to the original point of departure at the end of the second aisle. A fourth leg would be added to return the camera apparatus 200 precisely to its starting point.
However, the system 10 is not limited to the above-described digital map, but can use an infinite variety of different legs and journeys which can be programmed in order to create the multi-node movie.
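One way to represent such a digital map in software is as an ordered sequence of legs, each listing the stops at which a panorama node or a step-frame is to be captured. The sketch below is an assumption about data layout, not the patent's format; the coordinates and the two-aisle example only loosely follow FIG. 2.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]          # (x, y) floor position; units are assumed


@dataclass
class Waypoint:
    position: Point
    kind: str                        # "node" (360° panorama) or "step" (single step-frame)


@dataclass
class Leg:
    name: str                        # e.g. aisle "A" in FIG. 2
    waypoints: List[Waypoint]        # ordered stops along the leg


@dataclass
class DigitalMap:
    legs: List[Leg]                  # the combined legs form the "journey"


# Illustrative two-aisle journey: down aisle A, around the end shelves,
# back up aisle B, and a final leg returning toward the START point.
aisle_map = DigitalMap(legs=[
    Leg("A",      [Waypoint((0, 2), "node"), Waypoint((0, 3), "step"),
                   Waypoint((0, 4), "node")]),
    Leg("A-to-B", [Waypoint((1, 5), "step"), Waypoint((2, 4), "node")]),
    Leg("B",      [Waypoint((2, 3), "step"), Waypoint((2, 2), "node")]),
    Leg("return", [Waypoint((1, 1), "step")]),
])
```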
In an illustrative embodiment, the system 10 can operate as follows: 1) at a first node, the camera 210 of the camera apparatus 200 is instructed by the computer 100 to digitally capture a photograph at point zero, to save the photograph to memory 102 or the storage device 150 using a three- or four-digit sequential number as its name, to rotate the camera 210 on its vertical axis a predefined and uniform number of degrees to the right or left by driving the motor 250 which in turn drives the motorized device 230, and to repeat the process until enough photographs or a select number of photographs to prepare a 360° photographic panorama have been completed.
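The single-node capture loop just described might look roughly like the following sketch; camera.capture() and rotator.rotate() stand in for the (unspecified) control interfaces of the camera 210 and motorized device 230, and the file-naming scheme is an assumption.

```python
def capture_node(camera, rotator, storage_dir, increment_deg=30.0, start_index=0):
    """Capture enough frames at one node to build a 360-degree panorama.

    `camera.capture()` and `rotator.rotate(deg)` are assumed interfaces for
    the digital camera 210 and the motorized rotating device 230."""
    frames = []
    shots = int(round(360.0 / increment_deg))            # e.g. 12 shots spaced 30° apart
    for i in range(shots):
        image = camera.capture()                         # photograph at the current heading
        name = f"{storage_dir}/{start_index + i:04d}.jpg"  # sequential four-digit file name
        image.save(name)                                 # save to memory 102 / storage 150
        frames.append(name)
        rotator.rotate(increment_deg)                    # uniform rotation about the vertical axis
    return frames
```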
Once all the necessary photographs have been taken, the computer 100 of the system 10 begins to automatically process, i.e., stitch, dice and compress, the photographs for a single node. In what could be a simultaneous process, the computer 100 instructs the camera apparatus 200 of the system 10 to move a specified increment towards the next node by sending instructions to the power supply 260 to drive the motor 250, which in turn drives the motorized device 230. At each specified increment or select location, the camera apparatus 200 is also instructed to capture a single intermediate, non-panoramic step-frame.
Upon the arrival of the camera apparatus 200 at its next node, for example three to four feet from the first node, the computer 100 instructs the camera 210 of the camera apparatus 200 to capture another 360° panorama and controls the functioning of the motorized device 230 and the leveling device 240 to produce the desired images. The computer 100 can then begin processing the step-frames and the node panoramas into a linear multi-node movie. When the digital map stored in the computer 100 has been completed by the camera apparatus 200, the computer 100, based on the programming set by the user, can either wait for further operation instructions, for example, adjustments to the stitching, programming of "hot spots", etc., or automatically complete assembly of the multi-node "movie."
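Combining the map, the node capture loop and the step-frames, a complete journey could be driven by a loop such as the sketch below, which reuses the capture_node and DigitalMap sketches above. The rig and stitcher objects and their methods are assumptions, since the patent does not specify a software interface.

```python
def run_journey(rig, digital_map, stitcher):
    """Traverse the stored digital map in order, capturing a 360° panorama at
    each "node" waypoint and a single step-frame at each "step" waypoint, then
    assemble everything into a linear multi-node movie."""
    panoramas, step_frames = [], []
    for leg in digital_map.legs:
        for wp in leg.waypoints:
            rig.move_to(wp.position)          # motorized base 220 repositions the rig
            if wp.kind == "node":
                rig.level()                   # leveling device 240 squares the lens to 90°
                frames = capture_node(rig.camera, rig.rotator, storage_dir="frames")
                panoramas.append(stitcher.stitch(frames))    # stitch/dice/compress one node
            else:
                step_frames.append(rig.camera.capture())     # intermediate non-panoramic frame
    return stitcher.assemble_movie(panoramas, step_frames)   # final linear multi-node movie
```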
If, however, the user chooses to add "hot spots," the system 10 displays on the monitor 110 the completed multi-node movie creating the three-dimensional or virtual reality environment. Using an input device 120, such as a mouse, "touch" screen or electronic "pen" capability by drawing directly on the monitor 110, etc., the user chooses a hot spot or selection item on display in the panoramic image by "sketching" over the entire area of the item. This sketching occurs by tracing an outline of the entire area of the item and filling the space therein. Other methods of item identification can be implemented as well. For example, the user can also select one point on the computer's monitor 110 within the desired selection item and then allow the computer 100 to define the appropriate area of the selection item by selecting all contiguous, similarly colored screen pixels.
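The point-and-grow selection described above amounts to a flood fill over contiguous, similarly colored pixels. A minimal sketch follows, assuming the panorama is available as a two-dimensional array of RGB tuples and using a simple per-channel tolerance; neither the data layout nor the tolerance scheme is specified by the patent.

```python
from collections import deque


def select_region(pixels, seed, tolerance=16):
    """Return the set of (x, y) coordinates of contiguous pixels whose colour is
    within `tolerance` of the seed pixel — one way to auto-define a hot spot.

    `pixels` is assumed to be a 2-D list of (r, g, b) tuples indexed as pixels[y][x]."""
    height, width = len(pixels), len(pixels[0])
    sx, sy = seed
    seed_colour = pixels[sy][sx]

    def close(colour):
        return all(abs(a - b) <= tolerance for a, b in zip(colour, seed_colour))

    region, queue = set(), deque([seed])
    while queue:
        x, y = queue.popleft()
        if (x, y) in region or not (0 <= x < width and 0 <= y < height):
            continue
        if not close(pixels[y][x]):
            continue
        region.add((x, y))
        # Grow into the four neighbouring pixels.
        queue.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return region
```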
Once the area of the selection item within the panorama image has been defined, the user uses an input device 120 such as a bar-code reader to collect selection item identification information (SKU number, manufacturer, product description, list price, etc.) from labeling on the item's packaging or other reference. The computer 100 then permanently associates this item identification information (SKU number or other unique identification number) with the area in the panorama image. The computer 100 also gives the user the opportunity to associate any other existing graphics or data, such as a photograph, a video clip, a spreadsheet or a text file, with this selection item. This association process is completed for any or all items in each panorama image which the user wishes to be "interactive," that is, subject to user selection, viewing and/or further action within the panorama image.
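The association step can be modeled as a hot-spot record that binds the defined pixel region to the scanned identification data and any attached media. The field names and lookup structure in the sketch below are illustrative only, not drawn from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, FrozenSet, List, Tuple


@dataclass
class HotSpot:
    region: FrozenSet[Tuple[int, int]]   # pixel coordinates defined by sketching or flood fill
    sku: str                             # unique identification number from the bar-code reader
    manufacturer: str = ""
    description: str = ""
    list_price: float = 0.0
    attachments: List[str] = field(default_factory=list)  # photos, video clips, spreadsheets, text


# One hot-spot table per panorama node, keyed by SKU for quick lookup on selection.
node_hot_spots: Dict[str, HotSpot] = {}


def register_hot_spot(region, sku, **details):
    """Permanently associate the selection item's identification data with its area."""
    node_hot_spots[sku] = HotSpot(region=frozenset(region), sku=sku, **details)
    return node_hot_spots[sku]
```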
For example, in the present embodiment, the hot spot or selection item can be a photographic image of a toy on a shelf in a store. While viewing the panoramic images taken by the camera apparatus 200 in the store, the user can choose the toy, outline the shape of the toy and fill in its definition. A bar code reader 120 can then be used to input information to the computer 100 concerning this toy. Thus, when the multi-node movie is viewed by a customer, the customer can select the toy and receive a variety of information concerning the toy as well as purchase the toy based upon the information stored in the computer 100. Once this information has been developed, retrieved and stored, the system 10 then completes the panoramic "movie" by combining, compressing, compiling and saving all elements into one final, finished data file.
The system 10 as identified is intended to be used for the cost-effective development of navigable virtual reality PC-based movies specifically to create a selected area, such as a shopping environment, which enables the user to explore an almost unlimited number of stores in a "virtual" shopping mall. These movies could be distributed through a variety of storage devices, such as floppy diskette or CD-ROM, or communication means, such as the Internet, Interactive TV and Interactive Cable TV.
The system 10 is not limited to the above, but can be used to create movies so that realtors could let potential buyers "tour" a home via the Internet, travel packages can be shown to let potential customers "tour" an ocean liner or hotel room before making reservations, manufacturers could show their facilities to people in distant locations, etc. In addition, the system 10 can be used to allow computer game designers to significantly reduce the expense incurred by rendering or photographing individual photographic frames in producing an interactive game. Although FIG. 1 is directed to a computer-controlled system 10 with a single camera apparatus 200, any number of apparatuses 200 could be used in various areas of a store, for example, one apparatus in each aisle of a retail establishment, to yield even greater efficiencies. Nor is the present embodiment limited to digital cameras 210; it can also include a slit camera which can be used to produce one continuous, seamless panorama, which would avoid having to stitch the photographs together to form a panorama.
Although the system 10 of the present embodiment uses a Macintosh® computer, the system 10 is implementable in other embodiments on computers other than Macintosh® computers. Further, the present invention is implementable in other graphical user interface environments, such as Windows 95, IBM OS/2 and Sun Solaris, as well as in World Wide Web browsers such as Netscape, etc. Also, the system 10 is not limited to use with QuickTime VR but can be used with other navigable movie or photograph format applications which have been developed to form single-node panoramas or multi-node movies. Further, the system 10, as illustrated in FIG. 1, is not limited to the components as shown, but can comprise a variety of different hardware and software configurations in order to create navigable three-dimensional photographs for selected areas.
In all cases it is understood that the above-described arrangements are merely illustrative of the many possible specific embodiments which represent applications of the present invention. Accordingly, numerous and varied other arrangements can be readily devised in accordance with the principles of the present invention without departing from the spirit and scope of the invention. For example, the present invention, while directed to retail establishments, can also be applied to viewing homes, travel promotions, etc., where three-dimensional viewing or a virtual reality environment is desired to be photographed.

Claims

What Is Claimed Is:
1. A computer-controlled system for producing and combining photographic images to create a virtual reality environment, comprising: a movable camera apparatus for creating at least one digital photographic image; and computing means for controlling said movable camera apparatus to move to a select location and produce a select number of digital photographic images at the select location, said computing means combining the digital photographic images together to form the virtual reality environment.
2. A computer-controlled system according to claim 1, wherein said computing means includes a memory for storing a path along which the movable camera apparatus is to be moved.
3. A computer-controlled system according to claim 2, wherein said path comprises a number of select locations to which the computing means moves the movable camera apparatus and instructs the movable camera apparatus to produce a select number of digital photographic images at each select location.
4. A computer-controlled system according to claim 3, wherein at a select number of select locations, a plurality of digital photographic images are taken and combined to form a panoramic image.
5. A computer-controlled system according to claim 4, wherein at a select number of select locations, a primary non-panoramic step frame is taken.
6. A computer-controlled system according to claim 5, wherein at a select number of select locations, a secondary non-panoramic step frame is taken.
7. A computer-controlled system according to claim 1, wherein said computing means is located on the movable camera apparatus.
8. A computer-controlled system according to claim 1, wherein said movable camera apparatus is controllable by remote-control means.
9. A method for producing and combining photographic images to create a virtual reality environment, comprising the steps of: moving a movable camera apparatus to a select location by means of computing means and controlling the movable camera apparatus to produce a select number of digital photographic images at the select location by means of the computing means; and combining the digital photographic images together by means of the computing means to form the virtual reality environment.
10. A method according to claim 9, wherein said moving step further comprises moving the movable camera apparatus along a path stored in a memory of the computing means.
11. A method according to claim 10, wherein said path comprises a number of select locations, the computing means moving the movable camera apparatus to each of the number of select locations and instructing the movable camera apparatus to produce a select number of digital photographic images at each select location.
PCT/US1996/020039 1995-12-20 1996-12-20 Computer-controlled system for producing three-dimensional navigable photographs of areas and method thereof WO1997022918A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU14637/97A AU1463797A (en) 1995-12-20 1996-12-20 Computer-controlled system for producing three-dimensional navigable photographs of areas and method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US911195P 1995-12-20 1995-12-20
US60/009,111 1995-12-20

Publications (1)

Publication Number Publication Date
WO1997022918A1 (en) 1997-06-26

Family

ID=21735639

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1996/020039 WO1997022918A1 (en) 1995-12-20 1996-12-20 Computer-controlled system for producing three-dimensional navigable photographs of areas and method thereof

Country Status (2)

Country Link
AU (1) AU1463797A (en)
WO (1) WO1997022918A1 (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5159458A (en) * 1988-09-13 1992-10-27 Canon Kabushiki Kaisha Camera sensing when a memory cartridge is installed
US5130794A (en) * 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5479597A (en) * 1991-04-26 1995-12-26 Institut National De L'audiovisuel Etablissement Public A Caractere Industriel Et Commercial Imaging system for producing a sequence of composite images which combine superimposed real images and synthetic images
US5394517A (en) * 1991-10-12 1995-02-28 British Aerospace Plc Integrated real and virtual environment display system
US5255096A (en) * 1992-04-10 1993-10-19 Boyle William M Video time code synchronized robot control apparatus
US5255096B1 (en) * 1992-04-10 1997-12-23 William M Boyle Video time code synchronized robot control apparatus
US5422653A (en) * 1993-01-07 1995-06-06 Maguire, Jr.; Francis J. Passive virtual reality
US5497188A (en) * 1993-07-06 1996-03-05 Kaye; Perry Method for virtualizing an environment
US5566280A (en) * 1993-09-20 1996-10-15 Kabushiki Kaisha Toshiba 3D dynamic image production system with automatic viewpoint setting
US5531227A (en) * 1994-01-28 1996-07-02 Schneider Medical Technologies, Inc. Imaging device and method
US5526041A (en) * 1994-09-07 1996-06-11 Sensormatic Electronics Corporation Rail-based closed circuit T.V. surveillance system with automatic target acquisition
US5572248A (en) * 1994-09-19 1996-11-05 Teleport Corporation Teleconferencing method and system for providing face-to-face, non-animated teleconference environment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0989747A2 (en) * 1998-09-28 2000-03-29 Canon Kabushiki Kaisha Television lens apparatus adapted for a virtual studio
EP0989747A3 (en) * 1998-09-28 2000-11-29 Canon Kabushiki Kaisha Television lens apparatus adapted for a virtual studio

Also Published As

Publication number Publication date
AU1463797A (en) 1997-07-14

Similar Documents

Publication Publication Date Title
US6972757B2 (en) Pseudo 3-D space representation system, pseudo 3-D space constructing system, game system and electronic map providing system
US9704193B2 (en) System and method for constructing and displaying active virtual reality cyber malls, show rooms, galleries, stores, museums, and objects within
US8972897B2 (en) Information presentation in virtual 3D
CN102129812B (en) Viewing media in the context of street-level images
US20080033641A1 (en) Method of generating a three-dimensional interactive tour of a geographic location
CN110874818B (en) Image processing and virtual space construction method, device, system and storage medium
US20060114251A1 (en) Methods for simulating movement of a computer user through a remote environment
WO2005034041A1 (en) Apparatus and method for creating 3-dimensional image
Jensen et al. Alpha: a nonproprietary OS for large, complex, distributed real-time systems
EP2022010A1 (en) Virtual display method and apparatus
Wüst et al. Applying the 3D GIS DILAS to archaeology and cultural heritage projects requirements and first results
US20030090487A1 (en) System and method for providing a virtual tour
CN108364353A (en) The system and method for guiding viewer to watch the three-dimensional live TV stream of scene
JP2004139294A (en) Multi-viewpoint image processing program, system, and marker
WO1997022918A1 (en) Computer-controlled system for producing three-dimensional navigable photographs of areas and method thereof
Sundaram et al. Plane detection and product trail using augmented reality
Andersen et al. HMD-guided image-based modeling and rendering of indoor scenes
Chan et al. Orientation-aware handhelds for panorama-based museum guiding system
Guven et al. Interaction techniques for exploring historic sites through situated media
CN112256771A (en) Exhibition platform system for expo
KR20000050196A (en) Three dimensions imagination system for displaying viewing direction and changing image of object by viewing direction, method for emboding it
Vlahakis et al. 3D interactive, on-site visualization of ancient Olympia
CN110634190A (en) Remote camera VR experience system
CN111552192A (en) Robot tourist exhibition room Internet of things system
Liarokapis et al. Design experiences of multimodal mixed reality interfaces

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IL IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG UZ VN AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 97522954

Format of ref document f/p: F

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase