CA2199619A1 - Virtual reality imaging system - Google Patents

Virtual reality imaging system

Info

Publication number
CA2199619A1
CA2199619A1
Authority
CA
Canada
Prior art keywords
data
phenomena
image
user
multidimensional space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002199619A
Other languages
French (fr)
Inventor
William Loring Myers
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University Corp for Atmospheric Research UCAR
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of CA2199619A1 publication Critical patent/CA2199619A1/en
Abandoned legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/10: Geometric effects
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10: TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S: TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S345/00: Computer graphics processing and selective visual display systems
    • Y10S345/949: Animation processing method
    • Y10S345/958: Collision avoidance

Abstract

The virtual reality imaging system (10) takes a multidimensional space that contains real world objects and phenomena, be they static or dynamic in nature, and enables a user to define a point and/or a path through this multidimensional space. The apparatus (10) then displays (11) the view to the user that would be seen from the point and/or path through the multidimensional space. This view is filtered (301-303) through user definable characteristics that refine the real world phenomena and objects to a perspective that is of interest to the user. This filtered view presents the user with a virtual view of the reality contained within this multidimensional space, which virtual reality presents data to the user of only objects, views and phenomena that are of particular interest to the user.

Description

VIRTUAL REALITY IMAGING SYSTEM

FIELD OF THE INVENTION
This invention relates to computer generated images and, in particular, to a system that creates a visual image of a multidimensional space to present a filtered image of various three dimensional phenomena and features that are contained within the multidimensional space as viewed from any predefined locus within the space.
PROBLEM
It is a problem in complex computer controlled systems that deal with real world phenomena to present a representation of the phenomena in a manner that is both informative to the user and in a simple presentation format. Computer generated graphics are ubiquitous and are typically used to present an accurate representation of an object, a phenomenon, a multidimensional space and interactions therebetween. Computer generated graphics are also used extensively in simulation systems to present an image of a real world situation or a hypothetical situation to a user for training, analysis or other purposes.
Computer generated graphics have become extremely sophisticated and can represent extremely complex and fanciful situations in a manner that is virtually lifelike. The application of computer graphics spans many technologies and applications.
One area in which computer graphics has yet to make a significant impact is the area of real time display of complex real world phenomena. Some elementary work has taken place in this area, but systems of great flexibility and adaptability that can handle extremely complex phenomena are presently unavailable. This is because the volume of data that must be processed to present an accurate display represents a significant processing task and, when coupled with a requirement to provide a display in real time, exceeds the processing capability of present processors. It is therefore a problem to visually display complex multidimensional and real time phenomena in a large multidimensional space in a simple manner that maps the derived reality to a predefined user's viewpoint.
One reference that discloses a system that produces filtered images of the three dimensional space is published European Patent Application WO-94/08312, which discloses a virtual reality imaging system. This system filters the data received from sensors to present an image to the user of only the characteristics of time varying phenomena extant in the three dimensional space that are of interest to the user.

SOLUTION
The above described problems are solved and a technical advance achieved in the field by the virtual reality image generation system of the present invention. This apparatus takes a multidimensional space that contains real world objects and phenomena, be they static or dynamic in nature, and enables a user to define a point and/or a path through this multidimensional space. The apparatus then displays the view to the user that would be seen from the point and/or path through the multidimensional space. This view is filtered through user definable characteristics that refine the real world phenomena and objects to a perspective that is of interest to the user. This filtered view presents the user with a virtual view of the reality contained within this multidimensional space, which virtual reality presents data to the user of only objects, views and phenomena that are of particular interest to the user. This apparatus highlights, emphasizes, deletes, and reorients the reality contained within the multidimensional space to present an image to the user of only what the user needs to see to accomplish a stated task. The selective presentation of information in real time of real world phenomena enables the user to process the reduced data set contained in the image presented by this apparatus to perform a designated task in a manner that heretofore was impossible. In addition, the phenomena that is displayed is stored and processed in an efficient manner. The phenomena is reduced to a compact data representation to simplify the processing task and data communications.
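The user-definable filtering described above can be sketched as a per-user predicate applied to the objects and phenomena in the space. This is a minimal illustration only; the record fields (`kind`, `hazard`) and the particular filter criteria are assumptions for the sketch, not details taken from the patent.

```python
# Minimal sketch of per-user view filtering: each record in the
# multidimensional space carries a kind and a hazard score, and a user's
# filter keeps only what that user needs to see. Field names are illustrative.
def filter_view(records, wanted_kinds, min_hazard=0.0):
    """Return only the records a given user has asked to see."""
    return [r for r in records
            if r["kind"] in wanted_kinds and r["hazard"] >= min_hazard]

scene = [
    {"kind": "cloud",      "hazard": 0.0},   # extraneous for a pilot display
    {"kind": "aircraft",   "hazard": 0.8},
    {"kind": "wind_shear", "hazard": 0.9},
]

# A pilot's filter: suppress clouds, keep traffic and wind shear hazards.
pilot_view = filter_view(scene, {"aircraft", "wind_shear"}, min_hazard=0.5)
```

The same scene could be filtered differently for an air traffic controller, matching the patent's point that each user sees a distinct distillation of one shared data set.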
The preferred embodiment described herein is that of an airport operations system wherein an airport is located in a predetermined location in a multidimensional space and is surrounded by various three dimensional topological surface features. The three dimensional air space surrounding the airport is typically managed by air traffic controllers to route aircraft in the vicinity of the airport into arrival and departure patterns that avoid the topological features, various weather conditions around the airport, and other aircraft that share the airspace with a particular flight. This problem is extremely complex in nature in that the multidimensional space around the airport contains fixed objects such as the airport and its surrounding topological features as well as dynamic phenomena such as meteorological events that are beyond the control of the air traffic controllers as well as dynamic phenomena, such as the aircraft, that can be indirectly controlled by the air traffic controllers. The dynamic phenomena vary in time and space and the movement of the aircraft within this multidimensional space must be managed in real time in response to real time and sometimes sudden changes in the meteorological phenomena as well as the position of other aircraft.
No known system even remotely approaches providing the air traffic controllers, the pilots or other potential users with a reasonable distillation of all of the data contained within the multidimensional space around an airport. Existing airport operations include a significant amount of data acquisition instrumentation to provide the air traffic controllers as well as the pilots of the aircraft with data relating to weather, air traffic and spatial relationships of the aircraft with respect to the airport and the ground level. The problem with this apparatus is that all of the data acquisition instrumentation is configured into individual units, each adapted to present one set of narrowly defined relevant information to the user with little attempt to integrate the plurality of systems into a universal instrument that can be adapted to controllably provide an image of the multidimensional space to the various users, with each image being presented to a user in terms of their specific need for information. This is especially important since the air traffic controller has a significantly different need for information than the pilot of the aircraft. The data output by these diverse systems varies greatly in both format and content and is not easily integrated into a single system that can represent the multidimensional space and its contents.
The apparatus of the present invention obtains data from a multitude of data acquisition sources and controllably melds this information into a database that represents all the information of interest relating to this multidimensional space. Graphic processing apparatus responds to user input to define a predetermined point or path (interactively or on a predefined basis) through the multidimensional space as well as certain visualization characteristics for each individual user. The graphic processing apparatus thence, in real time, presents the user with a customized view of the multidimensional space in a visual form by deleting information that is extraneous or confusing and presenting only the data that is of significant relevance to the particular user as defined by the filter. In an airport operation environment, low level wind shear alert systems (LLWAS) use ground-based sensors to generate data indicative of the presence and locus of meteorological phenomena such as wind shear and gust fronts in the vicinity of the airport. In addition, terminal doppler weather radar (TDWR) may also be present at the airport to identify the presence and locus of meteorological phenomena in the region surrounding the airport to enable the air traffic controllers to guide the aircraft around undesirable meteorological phenomena such as thunderstorms. Image data is available in the form of LANDSAT data indicative of topological surface features surrounding the airport. This system can also use other digital image data such as aviation charts, road maps, night light imaging, etc. Air traffic control radar is also available to indicate the presence and locus of aircraft within the space around the airport for air traffic control purposes. Collectively, these systems provide data representative of the immutable characteristics of the multidimensional space as well as the dynamic phenomena contained in the air space, including meteorological events and aircraft operations. It is not uncommon for airport operations to take place in a zero visibility mode wherein the pilot's ability to obtain a visual image of air space in front of the aircraft is impaired to the point where the pilot is flying blind. Further, some aviation weather hazards are not detectable by the naked eye in clear air conditions, e.g., dry microbursts or turbulent regions. The pilot must rely on the air traffic controllers and radar contained within the aircraft to ensure that the pilot does not fly the aircraft on a collision course with a solid object, such as another aircraft or the topological features surrounding the airport.
The virtual reality imaging system of the present invention converts the data obtained from the multitude of systems into compact data representations of the phenomena of interest to the user. These compact data representations from the various data collection systems can be merged and the information contained therein simply distilled into a visualization of the flight path presently in front of the aircraft. This apparatus can delete extraneous information, such as clouds, fog, etc. and illustrate to the pilot and/or the air traffic controller only phenomena that would be of significant interest to the pilot, such as dangerous meteorological phenomena and other aircraft, to present the pilot with a clear image of hazards within the multidimensional space to permit the pilot to chart a course through these hazards without the pilot being able to see these dangers with the naked eye.
The specific example noted above is simply one of many applications of this concept, which operates to filter vast amounts of data typically found in a visual imaging situation to present a "clearer image" to the user as defined by the specific needs of the user. The user therefore sees only what they need to see and can complete tasks that heretofore were impossible due to the visual overload experienced in many situations, such as flying an aircraft through fog or clouds or not being able to identify a wind shear event in a meteorological phenomenon of significant extent and complexity. An additional capability of this system is the prediction of future states of the dynamic phenomena. Data is collected by the multitude of data acquisition systems over a plurality of sampling intervals and can be extrapolated through trend analysis or through model computations on the data available to illustrate the state of the dynamic phenomena in future sampling intervals. This capability enables the air traffic control supervisor to model the weather activity around the airport to provide information to plan airport operations for the immediate future.
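The trend extrapolation described above can be illustrated with a least-squares line fit over past sampling intervals, projected one interval ahead. The gust front speeds below are invented for illustration; as the text notes, a real system could instead use physically based model computations.

```python
# Illustrative trend extrapolation: fit a least-squares line to a phenomenon's
# measured values at past sampling intervals and project it into the future.
def extrapolate(samples, steps_ahead=1):
    """Linear least-squares fit of (interval, value) pairs, evaluated ahead."""
    n = len(samples)
    ts = list(range(n))
    mean_t = sum(ts) / n
    mean_v = sum(samples) / n
    slope = (sum((t - mean_t) * (v - mean_v) for t, v in zip(ts, samples))
             / sum((t - mean_t) ** 2 for t in ts))
    intercept = mean_v - slope * mean_t
    return intercept + slope * (n - 1 + steps_ahead)

gust_front_speed = [10.0, 12.0, 14.0, 16.0]   # knots, one value per interval
predicted = extrapolate(gust_front_speed)     # next interval of a linear trend
```

For this perfectly linear series the projection continues the 2-knot-per-interval trend; noisy real measurements would of course carry more uncertainty.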

WO 96/07988 (PCT/US95/11223)

BRIEF DESCRIPTION OF THE DRAWING
Figure 1 illustrates in block diagram form the overall architecture of the apparatus of the present invention;
Figures 2 - 4 illustrate in flow diagram form the operation of the various segments of the improved weather alert system;
Figure 5 illustrates in block diagram form the overall architecture of the improved weather alert system;
Figure 6 illustrates a plot of a typical airport configuration, including LLWAS and TDWR installations and typical weather conditions;
Figures 7 - 12 illustrate an example of converting the compact data representation of a phenomena to a three-dimensional object representation;
Figures 13 - 17 illustrate typical visual images produced by this apparatus;
Figure 18 illustrates additional detail of the renderer; and
Figure 19 illustrates in flow diagram form the operation of the presentation subsystem.
DETAILED DESCRIPTION
Figure 1 illustrates in block diagram form the overall architecture of the virtual reality imaging system 10 of the present invention. Within the virtual reality imaging system 10, a data acquisition subsystem 1 functions to collect and produce the real time data that is representative of the multidimensional space and the features and phenomena extant therein. Graphics subsystem 2 functions to utilize the real time data that is produced by the data acquisition subsystem 1 to produce the visual displays required by the plurality of users. To accomplish this, a shared database 3 is used into which the real time data is written by the data acquisition subsystem 1 and accessed by the various processing elements of graphics subsystem 2. A user data input device is provided to enable a user or a plurality of users to enter data into the graphics subsystem 2 indicative of the particular information that each of the plurality of users desires to have displayed on the corresponding display device 11.
In operation, the data acquisition subsystem 1 comprises a plurality of data acquisition apparatus 21-2n, each of which produces data representative of measurements performed on the phenomena or features that are located in the multidimensional space. These data acquisition apparatus 21-2n can process the real time measurement data into compact data representations of the phenomena and features, which compact data representations are transmitted to graphics subsystem 2 for processing into the visual images. The graphics subsystem 2 converts the compact data representations produced by the plurality of data acquisition apparatus 21-2n into visualizations as defined by each of the users of the virtual reality imaging system 10. This visualization is produced by performing a database traversal to present the data in a form and format of interest to each of the users.
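The shared-database arrangement (acquisition apparatus writing compact records, the graphics side reading back a temporally coordinated set per sampling interval) can be sketched roughly as follows. The class, method names, and record layout are assumptions for illustration, not the patent's actual data structures.

```python
# Sketch of the shared database: one compact record per apparatus per
# sampling interval on the write side; on the read side, a coordinated view
# is only released once every expected apparatus has reported that interval.
from collections import defaultdict

class SharedDatabase:
    def __init__(self):
        self._by_interval = defaultdict(dict)   # interval -> {apparatus: record}

    def write(self, interval, apparatus_id, record):
        self._by_interval[interval][apparatus_id] = record

    def coordinated_view(self, interval, expected_ids):
        """Return all records for an interval once it is complete, else None."""
        records = self._by_interval.get(interval, {})
        if set(records) != set(expected_ids):
            return None                          # interval not yet complete
        return records

db = SharedDatabase()
db.write(0, "llwas", {"wind_shear": True})
db.write(0, "tdwr", {"microburst_shapes": 2})
view = db.coordinated_view(0, ["llwas", "tdwr"])
```

Gating the read on interval completeness mirrors the text's point that the acquisition elements operate in a time coordinated manner so the graphics subsystem renders temporally consistent views.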
Aviation Weather Display System
A typical application of this apparatus is an aviation weather display system whose data acquisition subsystems make use of a plurality of aviation weather instrumentation that are used in and about an airport installation. The aviation weather instrumentation may include ground based sensors such as radar, lightning detection networks, and wind sensors as well as airborne sensors, such as sounding balloons or aircraft based sensors. Each of the aviation weather instrumentation produces raw data indicative of real time meteorological phenomena, topological features and aircraft operations in the multidimensional space, which real time data is processed by the data acquisition subsystem 1 to produce compact representations of the real time data. These data processing steps often include filtering, feature extraction, and correlation/integration of more than one data stream. Furthermore, this processed data may be used as input to physically based models, which attempt to predict the evolving phenomena based on the stored measurements.
From the compact data representations the graphics subsystem 2 generates generalized graphical representations of the phenomena and features. This involves the creation of an object or objects which exist in a virtual multidimensional space. In an aviation weather display application, this virtual reality imaging system 10 must operate in real time since significantly delayed data affects the validity and functionality of the system as a whole. The visualization presented to the user typically includes frame of reference information such as terrain, overlaid with identifiable features in the form of highways, range rings or icons representing municipalities or airports. Furthermore, the terrain surface can be colored by texture mapping it with an image such as a LANDSAT image or a digital map. This system can also use other digital image data such as aviation charts, road maps, night light imaging, etc. In order to integrate the plurality of data streams that are produced in a data acquisition subsystem 1, the graphics subsystem 2 must perform numerous operations such as distance culling, relative level of detail determination and rendering to create user recognizable images from the raw data or compact data representations that are stored in database 3.
Data Acquisition Subsystem Architecture
Figure 1 illustrates the major subcomponents of a typical data acquisition apparatus 21. In a typical configuration, a plurality of sensors 201 are used to make measurements, during a sampling interval of predetermined duration and repetition frequency, of one or more characteristics of a particular phenomena or feature within the multidimensional space.
The output signals from the plurality of sensors 201 are received by data filtering and feature extraction element 202 which functions to filter the data received from the plurality of sensors 201 to remove ambient noise or unwanted signal components therefrom. The data filtering and feature extraction element 202 also functions to convert the raw data received from the plurality of sensors 201 into a definition of the particular phenomena or feature that is being monitored by this particular data acquisition apparatus 21. An example of such a capability is the use of an improved low level wind shear detection apparatus which converts the wind magnitude measurements from a plurality of ground based sensors into data representative of wind shear events within the multidimensional space. To accomplish this, the raw data obtained from the sensors 201 must be converted into a form to extract the wind shear events from the plurality of wind measurements taken throughout the multidimensional space. The resultant information is used by compact data representation apparatus 204 to produce a set of data indicative of the extracted feature in a convenient, efficient manner. This can be in the form of gridded data sets, feature extent and location data as well as other possible representations. Furthermore, the data acquisition apparatus can include a predictive element 203 which uses the data obtained from data filtering and feature extraction apparatus 202 to extrapolate into one or more predetermined future sampling intervals to identify a future temporal state of the feature or phenomena that is being measured. The data output by the predictive element 203 is also forwarded to compact data representation element 204 for inclusion in the data set that is produced therein. The resultant compact data representations are transmitted to the graphics subsystem 2.
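The chain of elements just described (202 filters and extracts, 203 predicts, 204 compacts) might be sketched as a single pipeline function. The noise threshold, the simple 10% growth "prediction", and the record fields are all invented for illustration; they stand in for the patent's far richer processing.

```python
# Compressed sketch of one data acquisition apparatus: raw sensor samples are
# noise-filtered and a feature extracted (element 202), a predictive step
# projects it forward (element 203), and the result is reduced to a compact
# record (element 204). All numeric choices here are illustrative assumptions.
def run_apparatus(raw_samples, noise_floor=1.0):
    filtered = [s for s in raw_samples if abs(s) > noise_floor]   # 202: filter
    feature = {"peak": max(filtered), "extent": len(filtered)}    # 202: extract
    feature["predicted_peak"] = feature["peak"] * 1.1             # 203: predict
    return feature                                                # 204: compact

compact = run_apparatus([0.2, 3.0, 5.0, 0.5, 4.0])
```

The compact record, not the raw sample stream, is what would be written toward the graphics subsystem, matching the text's emphasis on reducing tremendous data volumes to manageable amounts.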
It is obvious that if the feature being monitored is temporally and spatially static, the data that is produced is invariant and need not be updated during successive sampling intervals. However, most phenomena that are monitored in this environment tend to be temporally and in many cases spatially varying and the operation of the data acquisition apparatus 1 is on a time sampled basis, with a set of data being produced at the end of each sampling interval. The plurality of data acquisition elements 21-2n preferably operate in a time coordinated manner to produce synchronized sets of data sets in the database 3 so that graphics subsystem 2 can produce temporally coordinated views of the phenomena and features located in the multidimensional space on a once per sampling interval basis or over a plurality of sampling intervals, dependent on the amount of data that must be processed. In a real time environment, the plurality of data acquisition apparatus 21-2n function to collect tremendous amounts of data and reduce the data to manageable amounts for use by the graphics subsystem 2.
The improved low-level wind shear alert system, illustrated in block diagram form in Figure 5, provides an improved method of identifying the presence and locus of wind shear in a predefined area. This low level wind shear alert system enhances the operational effectiveness of the existing LLWAS system by mapping the two-dimensional wind velocity, measured at a number of locations, to a geographical indication of wind shear events. This resultant geographical indication is displayed in color-graphic form to the air traffic control personnel and can also be transmitted via a telemetry link to aircraft in the vicinity of the airport for display therein. In addition, gust fronts are tracked and their progress through the predefined area displayed to the users.
This low-level wind shear alert system can also integrate data and processed information received from a plurality of sources, such as anemometers and Doppler radar systems, to produce low level wind shear alerts of significantly improved accuracy over those of prior systems. In particular, the apparatus of the improved low level wind shear alert system makes use of the data and processed information produced by the existing Low-Level Wind Shear Alert System (LLWAS) as well as that produced by the Terminal Doppler Weather Radar (TDWR) to precisely identify the locus and magnitude of low-level wind shear events within a predetermined area. This is accomplished by the use of a novel integration system that utilizes the data and processed information received from these two systems (LLWAS & TDWR) in such a way that the limitations of the two stand-alone systems are ameliorated. This integration scheme, while addressing these limitations, simultaneously maintains the strengths of the two stand-alone systems. This technique then provides the best possible wind shear hazard alert information. Furthermore, this integration methodology addresses the operator interaction problem discussed above. The integration is fully automated, requires no meteorological interpretation by the users and produces the required graphical and alphanumeric information in an unambiguous format. Lastly, this integration technique is implemented fully without any major software modifications nor without any hardware modifications to the existing stand-alone systems.
The TDWR apparatus uses a 5 cm. C-band Doppler radar system to measure radial winds when atmospheric scatterers are present. This system processes the radar return signals to create a field of radially oriented line segments indicative of the radial velocity data received from the radar. The TDWR apparatus bounds isolated sets of segments that are above a predetermined threshold to define an area which would contain a specific, potential low-level wind shear event. The bounding is such that it incorporates the smallest area which includes all of the line segments above the predetermined threshold. A predefined geometric shape is used to produce this bounding and the characteristics of this geometric shape are adapted in order to encompass all of the required data points in the minimal area.
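The bounding step can be illustrated with an axis-aligned rectangle around the above-threshold segments. The patent's "predefined geometric shape" is not specified to be a rectangle, so this is one possible shape only, and the segment data below are invented.

```python
# Illustrative bounding: given radially oriented line segments whose radial
# velocities exceed a threshold, find the smallest axis-aligned box that
# contains all of their endpoints. A rectangle stands in for the patent's
# unspecified "predefined geometric shape".
def bound_segments(segments, threshold):
    pts = [p for seg in segments if seg["velocity"] >= threshold
           for p in (seg["start"], seg["end"])]
    if not pts:
        return None                     # nothing above threshold to bound
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (min(xs), min(ys), max(xs), max(ys))

segs = [
    {"start": (0, 0), "end": (2, 0), "velocity": 18.0},
    {"start": (1, 1), "end": (3, 1), "velocity": 20.0},
    {"start": (9, 9), "end": (9, 10), "velocity": 4.0},   # below threshold
]
box = bound_segments(segs, threshold=15.0)   # -> (0, 0, 3, 1)
```

The below-threshold segment is excluded, so the box stays minimal around the potential wind shear event, as the text requires.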
The apparatus of the improved low-level wind shear alert system is divided into two independent sections: detection of wind shear with loss situations (microbursts, etc.) and detection of wind shear with gain situations (gust fronts, etc.). The TDWR system outputs wind shear with loss data in the form of microburst shapes. The enhanced low level wind shear alert system generates equivalent LLWAS microburst shapes using the triangle and edge divergence values produced by the existing LLWAS apparatus. The LLWAS microburst shapes are validated by using auxiliary information from LLWAS and TDWR to eliminate marginal and false-detection LLWAS microburst shapes. The resultant two sets of microburst shapes are then combined for alarm generation purposes. The wind shear with gain portion of this system simply divides the coverage area into two regions, with TDWR producing wind shear with gain runway alarms for wind shear events that occur outside of the LLWAS sensor network while the LLWAS runway oriented gain alarms are produced for wind shear events that occur inside of the LLWAS sensor network.
This integration architecture enables the concurrent use of a plurality of sensor-based systems to provide the wind shear detection function with increased accuracy. Both ground-based and aircraft-based sensor systems can be used to provide wind data for this apparatus.

The mapping of diverse forms of input data into a common data structure (predefined geometric shapes) avoids the necessity of modifying existing sensor systems and simplifies the production of information displays for the user. The use of a common information display apparatus and format renders the combination of systems transparent to the user.
Improved Low-Level Wind Shear Detection System
Adverse weather conditions, especially those affecting airport operation, are a significant safety concern for airline operators. Low level wind shear is of significant interest because it has caused a number of major air carrier accidents. Wind shear is a change in wind speed and/or direction between any two points in the atmosphere. It is generally not a serious hazard for aircraft en route between airports at normal cruising altitudes, but strong, sudden low-level wind shear in the terminal area can be deadly for an aircraft on approach or departure from an airport. The most hazardous form of wind shear is the microburst, an outflow of air from a small scale but powerful downward gush of cold, heavy air that can occur beneath or from a storm or rain shower or even in rain free air under a harmless looking cumulus cloud. As this downdraft reaches the earth's surface, it spreads out horizontally like a stream of water sprayed straight down on a concrete driveway from a garden hose. An aircraft that flies through a microburst at low altitude first encounters a strong headwind, then a downdraft, and finally a tailwind that produces a sharp reduction in air speed and a sudden loss of lift. This loss of lift can cause an airplane to stall and crash when flying at a low speed, such as when approaching an airport runway for landing or departing on takeoff. It is therefore desirable to provide pilots with a runway specific alert when a fifteen knot or greater headwind loss or gain situation is detected in the region where the aircraft are below one thousand feet above ground level and within three nautical miles of the runway ends.
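The fifteen-knot alert rule stated above can be sketched as a comparison of headwind components at two points along an approach. The wind vectors and course below are invented; a real system would also check the altitude and runway-distance limits mentioned in the text.

```python
# Sketch of the runway-specific alert rule: alert when the headwind component
# changes by fifteen knots or more between two points. Winds are (east, north)
# components in knots; the course is degrees clockwise from north.
import math

def headwind(wind_u, wind_v, course_deg):
    """Headwind component (knots) of a wind vector against a given course."""
    rad = math.radians(course_deg)
    cx, cy = math.sin(rad), math.cos(rad)       # unit vector along the course
    return -(wind_u * cx + wind_v * cy)         # positive when wind opposes it

def shear_alert(w1, w2, course_deg, limit=15.0):
    loss = headwind(w1[0], w1[1], course_deg) - headwind(w2[0], w2[1], course_deg)
    return abs(loss) >= limit

# Northbound approach (course 0 degrees): a 20 kt headwind at the first point
# becomes a 5 kt tailwind at the second, a 25 kt headwind loss.
alert = shear_alert((0.0, -20.0), (0.0, 5.0), course_deg=0.0)
```

This headwind-then-tailwind transition is exactly the microburst encounter sequence the paragraph describes.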
Figure 6 illustrates a top view of a typical airport installation wherein the airport is within the region indicated on the horizontal axis by the line labeled L and a Terminal Doppler Weather Radar system 502 is located a distance D from the periphery of the airport. Included within the bounds of the airport are a plurality of Low Level Wind Shear Alert System sensors 505. The sensors 505 are typically anemometers located two to four kilometers apart and are used to produce a single plane, two dimensional picture of the wind velocity within the region of the airport. The Terminal Doppler Weather Radar 502, in contrast, consists of a one dimensional (radial) beam which scans all runways (R1-R4) and flight paths but can measure only a radial horizontal outflow component of wind. The nominal TDWR scan strategy produces one surface elevation scan per minute and scans aloft of the operational region to an altitude of at least twenty thousand feet every two and a half minutes. This strategy is intended to provide frequent updates of surface outflow while monitoring for features aloft to indicate that a microburst is imminent. Microbursts (M1-M8) are recognized primarily by surface outflow although they can be anticipated to a certain extent by monitoring features and events in the region above the airport location. Thunderstorms typically produce a powerful downward gush of cold heavy air which spreads out horizontally as it reaches the earth's surface. One segment of this downflow spreads out away from the TDWR radar while an opposing segment spreads out towards the TDWR radar. It is generally assumed that these outflows are symmetrical for the purpose of detecting microburst wind shears. Because most microbursts do not have purely symmetrical horizontal outflows, the TDWR system can have problems detecting or estimating the true intensity of asymmetrical microburst outflows. As can be seen from Figure 6, the anemometers 505 of the Low Level Wind Shear Alert System are sited on both sides of airport runways R1-R4 but do not extend to the full three mile distance from the end of the runway as is desirable. Therefore, the anemometers 505 can only detect horizontal airflows that occur in their immediate vicinity (M2, M3, M5-M8) even though there can be horizontal airflow outside the anemometer network (M1, M4) that can impact airport operations but are outside of the range of the limited number of anemometers 505 sited at an airport.
Improved Wind Shear Alert System Architecture

Figure 5 illustrates in block diagram form the overall architecture of the improved low-level wind shear alert system 100. This low-level wind shear alert system 100 integrates the ground level wind data collected by one set of stationary ground level sensors (anemometers) 505 with the higher altitude wind data collected by a second sensor (Doppler radar) 502 in order to accurately identify both the locus and magnitude of low level wind shear conditions within a predetermined area. The two sets of data inputs illustrated in this embodiment of the invention include the data produced by existing data processing systems associated with the sensors in order to preprocess the data prior to integration into the unified precise output presented to the end user.

The sensor systems include the existing Low Level Wind Shear Alert System (LLWAS) front end processing 101, which is an anemometer-based wind shear alert system used to detect the presence and identify the locus of wind shear events at or near ground level. The LLWAS system 101 generates data indicative of the wind velocity (magnitude and direction) at each of a plurality of fixed sites 505 located within a predefined area. The collected wind velocity data is then preprocessed by the LLWAS system 101 to identify the locus and magnitude of wind shears at ground level by identifying the divergence or convergence that occurs in the measured wind velocity throughout the predefined area.
Similarly, the second set of sensors is the Terminal Doppler Weather Radar (TDWR) 502, which uses a Doppler radar system to measure low-level wind shear activity in the predefined area. The TDWR system 502 searches its radar scan for segments of the radar beam of monotonically increasing radial velocity. These regions and areas of radial convergence are identified as the locus of wind shear events.
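The search for beam segments of monotonically increasing radial velocity can be sketched as a simple run scan (an illustrative sketch; the threshold value and function name are assumptions, not values from the specification):

```python
def find_divergence_runs(radial_velocity, min_shear=5.0):
    """Scan one beam of radial velocities (m/s, ordered by range gate) for
    runs of monotonically increasing velocity. A run whose total velocity
    gain reaches min_shear is kept as a candidate surface-outflow segment.
    Returns (start_gate, end_gate) index pairs."""
    runs = []
    start = 0
    for i in range(1, len(radial_velocity) + 1):
        # Close the current run when the increase stops or the beam ends.
        if i == len(radial_velocity) or radial_velocity[i] <= radial_velocity[i - 1]:
            if radial_velocity[i - 1] - radial_velocity[start] >= min_shear:
                runs.append((start, i - 1))
            start = i
    return runs

beam = [-8, -6, -2, 1, 4, 3, 3, 5, 6]
print(find_divergence_runs(beam))  # [(0, 4)]
```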
The integration system 103 that has been developed for the integration of TDWR 502 and LLWAS 101 uses a product-level technique and is divided into two independent sections: the detection of windshear-with-loss situations (microbursts, etc.) and windshear-with-gain situations (gust fronts, etc.).
The outputs from the windshear-with-loss portion of the TDWR system 502 are microburst shapes, which are used both as graphical information and to generate the textual runway alerts. As an integration "add-on" to the existing LLWAS system 101, an enhanced LLWAS section 102 was developed to generate LLWAS microburst shapes. These shapes are computed using triangle and edge divergence values obtained from the LLWAS system 101. Even though the methods used to generate these shapes are quite different, these LLWAS microburst shapes are identical, in both form and content, to the TDWR microburst shapes. This allows for the same alert-generation logic to be applied, and for the common graphical display 116 of microburst detections.

The TDWR/LLWAS (windshear-with-loss) microburst integration 114 is essentially the combined use of microburst shapes from each sub-system 112, 502. This combination, however, is not a spatial merger of the shapes: each shape is considered as a separate entity. Furthermore, the LLWAS microburst shapes have been passed through a validation process in symmetry test 113. By this we mean that auxiliary information 703 from both TDWR and LLWAS is utilized in an attempt to eliminate certain of the "weaker" LLWAS

microburst shapes - ones that could generate nuisance or false alarms. The motivation and implementation for this procedure is described below. However, as an alternative to this process, the sensor data from each of the sub-systems 112, 502 could be merged to produce a composite set of shapes indicative of the merged data. This alternative process is noted herein in the context of this system realization.

Once a set of microburst shapes is produced by the enhanced LLWAS apparatus 102 and integration apparatus 103, these shapes are transmitted to the Terminal Doppler Weather Radar system 502, which contains the runway loss alert generation process. Similarly, the integration apparatus 103 receives LLWAS runway oriented gain data and TDWR gust front data in gust front integration apparatus 115. The LLWAS runway-oriented-gain data includes data from the gust front tracking system 119, which uses the LLWAS
anemometer wind vectors to detect, track, and graphically display gust fronts within the predetermined area. LLWAS runway-oriented-gain (ROG) is also used for detection of generic wind shear with gain hazards within the LLWAS network. This is not necessarily tied to a specific gust front detection. Wind shear with gain situations can occur independently of gust fronts, e.g. the leading edge of a microburst outflow, or a larger-scale frontal passage. The selected data is then transmitted to the TDWR system 502, where a runway gain alert generation process produces an alarm indicative of the presence of a wind shear with gain hazard.

An alarm arbitration process in TDWR system 502 selects the alarm produced by either the runway loss alert generation process or the runway gain alert generation process to present to TDWR displays 116. The existing displays 116 consist of the TDWR Geographic Situation Display (GSD), which illustrates in graphical form the microburst shapes and gust fronts and indicates which runways are in alert status. The TDWR and LLWAS Ribbon Display Terminal (RDT) gives an alphanumeric message indicating alert status, event type, location and magnitude for each operational runway.
It is obvious from the above description that the existing LLWAS 101 and TDWR 502 systems are utilized as much as possible without modification, to minimize cost and impact on existing installations. It is also possible to implement these features in other system configurations. Any other data collection system can be similarly integrated with the existing TDWR system 502 or the existing LLWAS system by the application of the philosophy described above: for example, the addition of another Doppler radar, or another anemometer network.
Shape Generation Philosophy

The LLWAS microburst shape computations are based upon the detection of divergence in the surface winds. These triangle and edge divergence estimates are mapped onto a rectangular grid. Contiguous "clumps" of above-threshold grid points are collected and then used to generate microburst shapes. To compensate for the spatial under-sampling of the true surface wind field inherent in the LLWAS data, a "symmetry hypothesis" is used in generating the location, extent, and magnitude (loss estimate) for these microburst shapes.
This hypothesis is applied as if a symmetric microburst were centered at each (above threshold) grid point. In general, microburst outflows are not symmetric. However, the spatial superposition of these symmetric "grid-point-microbursts" in a given clump does a very good job of approximating a non-symmetric event.
While a given detected divergence may be real, the LLWAS data alone cannot be used to determine whether it is truly associated with a microburst. Therefore, the application of the symmetry hypothesis may not always be valid. The problem is two-sided.
If the symmetry hypothesis is always used, it could generate false alarms in certain non-microburst conditions, for example, strong surface winds setting up in a persistent divergent pattern. On the other hand, if the symmetry assumptions are never used, wind shear warnings for valid microburst events could be delayed, inaccurate, or even eliminated. The issue is then to determine whether a given LLWAS-detected divergence is associated with a microburst and hence determine whether the symmetry hypothesis should be applied.
The algorithm that was developed combines "features-aloft" information from TDWR three-dimensional reflectivity structures and microburst precursors (both projected down to the surface) with detected "strong" surface divergence (microburst shapes) from both TDWR 502 and LLWAS 101. This information is then synthesized both spatially and temporally to create a set of symmetric discs. The intent of these discs is to indicate a region of the atmosphere within and/or above the disc (i.e. a cylinder) where there is good likelihood of microburst activity. This "region" could be in space: the detection of the surface outflow, or microburst features above the surface (reflectivity and/or velocity signatures). It could also be in time; that is, a microburst is either going to occur, is in progress, or has recently been present.

These discs are then examined for "closeness" to those LLWAS microburst shapes that are to be validated. If this proximity criterion is met, the LLWAS microburst shape is "validated" and passed onwards. That is, the use of the symmetry hypothesis is assumed to be appropriate in this case, and this LLWAS microburst shape is to be used for generating wind shear warnings and to be displayed on the GSD. If the proximity test fails, the LLWAS shape is discarded. However, in this latter circumstance, there could be a valid wind shear hazard occurring that is not associated with a microburst, or possibly a microburst that is not being correctly identified in the symmetry disc calculations. To prevent this type of missed detection, the LLWAS Runway-Oriented-Loss (ROL) information 703 is then used as a fall-back to generate any appropriate wind shear warnings.
Enhanced LLWAS System Preprocessing
The enhanced LLWAS system creates a grid point table for use in creating microburst shapes. This process is illustrated in Figure 3 and is activated at system initialization. As a preprocessing step, a set of pointers is generated which maps triangle and edge microburst detection areas to an analysis grid. During real-time operation, LLWAS triangle and edge divergence values are then mapped onto the grid, applying a magnitude value at each grid point. This set of grid point magnitudes is used with the clumps produced by clump shape generation apparatus 111 to generate a set of low level wind shear alert system microburst shapes. The operation for the mapping of triangles and edges to the grid is a "first-time-through", preprocessing step. This is done this way since the "pointer" information is solely a function of a given site's LLWAS anemometer network geometry, which doesn't change.
The preprocessing generation of location specific table data is initiated at step 1201, where the anemometer location values are retrieved from memory and, at step 1202, the site adaptable parameters needed to modify the calculations are also retrieved from memory.
At step 1203, a grid is created by computing the number of grid points in an x and y Cartesian coordinate set of dimensions, based on the number of input data points, to create a minimal size xy grid to perform the computations. At step 1204, a set of grid pointers is produced to map the divergence estimates that are above a threshold value to the particular points in the grid system created at step 1203. This is to locate the center of a microburst that would be causing an alarm. Since a number of grid points are above the divergence element threshold value, it is difficult to denote the location where the microburst is to be centered which would cause these elements to create the alarm. Each sensor or network element is tested by placing a mathematical microburst at each grid point, and each one of the grid points so tested that would cause the given network element to be in alarm status is then associated with that particular network element. As a result, a set of grid points associated with each Low Level Wind Shear Alert System 101 triangle and edge is produced to create the element grid point pointers. In order to perform this calculation, a symmetrical microburst model is used: a simplistic half sine wave model which is time invariant and symmetric in both space and magnitude and is only a function of amplitude and a maximum radius. Even though a real microburst may be spatially asymmetrical, it can be approximated by a linear superposition of a number of symmetrical microbursts, at least to a first order mathematical expansion, which produces sufficient specificity for this calculation process. Once the above steps have been performed, the processing of measurement data begins at step 1205, where the Low Level Wind Shear Alert System triangle and edge divergence values are used to generate the corresponding sets of ratios of the divergence values to the thresholds, estimated loss values and alarm status. Associated with these grid points are two sets of magnitude values: the low level wind shear alert system divergence to threshold ratios and associated estimated loss values. The purpose of these two sets of magnitude information lies in the fact that, although the measured quantity is wind-field divergence (or windshear), the reported output value to the users is a runway-oriented loss value. Hence a mapping from divergence to loss is needed.
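The half sine wave microburst model lends itself to a short sketch (the exact functional form, amplitude, and sampling below are illustrative assumptions; the text specifies only a time invariant half sine wave parameterized by amplitude and maximum radius):

```python
import math

def half_sine_outflow(r, amplitude, max_radius):
    """Radial outflow speed of the model microburst: a time invariant half
    sine wave that is zero at the center, peaks midway out, and returns to
    zero at the maximum radius (one plausible reading of the model)."""
    if r <= 0.0 or r >= max_radius:
        return 0.0
    return amplitude * math.sin(math.pi * r / max_radius)

def edge_divergence(r1, r2, amplitude, max_radius):
    """Crude divergence estimate across an edge: difference of modeled
    outflow speeds at two sensor distances from the center, per unit
    separation - the kind of value compared against an edge threshold."""
    return (half_sine_outflow(r2, amplitude, max_radius)
            - half_sine_outflow(r1, amplitude, max_radius)) / (r2 - r1)

# Ratio of the modeled divergence to an assumed edge threshold of 2.0:
print(edge_divergence(0.5, 1.5, 12.0, 4.0) / 2.0 > 1.0)  # True: would alarm
```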
The following data processing steps are done at each update of information from the LLWAS system:
1. Input of triangle and edge divergence values from the LLWAS system.
2. Computation of "ratios" (divergence/threshold) for each triangle and edge.
3. Mapping of triangle and edge ratios to the grid.
4. Clumping of grid points.
5. Shape generation from clumps.
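These five steps can be tied together in a small driver sketch (all names and data structures below are hypothetical placeholders for the stages listed above):

```python
def llwas_update(divergences, thresholds, element_to_grid_points):
    """One update cycle following the five steps above. Inputs: per-element
    divergence estimates, per-element thresholds, and the precomputed
    element -> grid point table (all illustrative structures)."""
    # Step 2: ratio of divergence to threshold for each triangle/edge.
    ratios = {e: divergences[e] / thresholds[e] for e in divergences}
    # Step 3: map above-threshold element ratios onto the grid, keeping the
    # largest ratio assigned to each grid point.
    grid = {}
    for elem, ratio in ratios.items():
        if ratio >= 1.0:
            for point in element_to_grid_points[elem]:
                grid[point] = max(grid.get(point, 0.0), ratio)
    # Steps 4 and 5 (clumping, then shape generation) would consume `grid`.
    return grid

grid = llwas_update({"T1": 6.0, "E1": 1.0}, {"T1": 3.0, "E1": 4.0},
                    {"T1": [(0, 0), (0, 1)], "E1": [(5, 5)]})
print(sorted(grid))  # [(0, 0), (0, 1)]: only T1 exceeds its threshold
```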
Clump Generation Theory
Figure 2 illustrates in flow diagram form the clump generation process 111, which receives algorithm products from the Low Level Wind Shear Alert System 101 to produce an indication of the location of wind shear events. This routine accepts as an input the triangle and edge divergences produced by the Low Level Wind Shear Alert System 101.
The clump generation process 111 then generates clumps of points that are above a certain input threshold level. These clumps are then output to the low level wind shear alert system shape generation algorithm 112. The grid points are the data collection points within the predefined area around the airport, which area is presumed to be a two dimensional rectangular area having a set of coordinates in the standard two dimensional rectilinear mathematical orientation, with positive x values to the east and positive y values to the north. The clumps are generated by first finding grid points that are above a given threshold value.
In the pre-processing stage, a grid with 0.5 km by 0.5 km spacing is constructed over a region which covers the anemometer network 505. A simulated microburst is placed at each grid point and the divergence is computed for each network element. If the computed divergence for a given element is above that element's threshold, an "association" is made between the grid point and that element. In this manner, a table is constructed that connects all of the grid points to the network triangles and edges via a hypothetical divergence detection. This table is then employed in real-time using an inverse logic.
Given that a network element detects a divergence above its threshold, a set of grid points (via the table) is associated with that divergence, since from the theoretical analysis these points are potential microburst locations.
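The preprocessing table and its real-time inverse use can be sketched as follows (a minimal sketch; the predicate standing in for the simulated-microburst divergence test is an assumed interface, not the patent's):

```python
def build_association_table(grid_points, elements, detects):
    """Preprocessing: place a simulated microburst at each grid point and
    record which network elements (triangles/edges) would alarm. `detects`
    is a caller-supplied predicate detects(element, grid_point) standing in
    for the modeled divergence-versus-threshold test."""
    table = {elem: set() for elem in elements}
    for point in grid_points:
        for elem in elements:
            if detects(elem, point):
                table[elem].add(point)
    return table

def candidate_locations(table, alarming_elements):
    """Real-time inverse logic: union of the grid points associated with
    every element currently reporting an above-threshold divergence."""
    points = set()
    for elem in alarming_elements:
        points |= table[elem]
    return points

# Toy network: an element "detects" a microburst within one grid unit of it.
sites = {"T1": (0, 0), "E1": (3, 0)}
grid = [(x, 0) for x in range(5)]
table = build_association_table(
    grid, sites, lambda e, p: abs(sites[e][0] - p[0]) <= 1)
print(sorted(candidate_locations(table, ["E1"])))  # [(2, 0), (3, 0), (4, 0)]
```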
Once these subsets of grid points have been identified, they are processed to generate "clumps" of contiguous groups of grid points. By contiguous it is meant that adjacent up, down, right, or left points are considered, not those along the diagonals. Three sets of clumps are generated to include grid point threshold data representative of "low-level", "high-level", and "low-level density" collections of grid points. The "low-level" and "high-level" grid points are indicative of the magnitude of the estimated wind divergence at those particular grid points. The "high-level" grid points are representative of a secondary threshold used to distinguish the grid points that have significantly exceeded the initial threshold. This secondary threshold thereby differentiates wind shears of significant magnitude from those of moderate magnitude.
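The grouping of above-threshold grid points into clumps under this four-neighbor definition of contiguity can be sketched as a flood fill (an illustrative sketch; the same routine would be run once per threshold level):

```python
def find_clumps(points):
    """Group above-threshold grid points into clumps of contiguous points,
    where contiguous means adjacent up, down, left or right - diagonal
    contact does not join two clumps."""
    remaining = set(points)
    clumps = []
    while remaining:
        seed = remaining.pop()
        clump, frontier = {seed}, [seed]
        while frontier:
            x, y = frontier.pop()
            for nbr in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nbr in remaining:
                    remaining.remove(nbr)
                    clump.add(nbr)
                    frontier.append(nbr)
        clumps.append(clump)
    return clumps

# Two clumps: (1,1) and (2,2) touch only diagonally, so they stay separate.
print(len(find_clumps([(0, 1), (1, 1), (2, 2), (2, 3)])))  # 2
```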
"Low level del~:lyr grid-point clumps ue idPntic~l to those for the low level and high level process (1~ s~ above but ~ ~nl a colldP~r ~l;Qn of a large number of grid points, which llullll~C would be overly large or the r~ "1 g~lllet~ic pattern would be conca~ _ or c ~t~ nd~ in nature. An example of such a problem would be a c~lle~tion of grid 30 points that are located in a figure eight shape. In order to reduce the coll~ction of grid points into small, convex and compact patterns, a density ~ i.lg OpC~aliOn iS pc,Çolllled on the lowlevelgridpointvalues. Inorderton~c~lnpli~hthis~theorigina~ ;ludeofeachgnd wo 96/07988 2 1 9 9 6 1 9 PCT/US95/11223 point is mnltirlied by a local neighborhood occ lp-q-tion density weight to compute a new mq.~itllde value at each grid point to thereby more ac~ u~alely reconfigure the geomellic pattern ofthese grid points. The density weight is a norm-qli7ed value b~lween zero and one which is generated by any one of a number of mqthem~Atir.ql methods depending upon a 5 given point's location in the grid. For c~.lpl~, the neighborhood set of points for a given interior point are the eight adjAcent points incl~l~lin~ the diagonals and the given point itsel~
The number of points in this set that are above a threshold value is summed, and this total number is divided by the number of grid points that are in the original neighborhood set.
These density weighted points are then formed into clumps in an identical fashion as for the low level and high level calculations to form the low level density geometric clumps. This procedure condenses the collection of grid points into more compact patterns and also separates overly extended clumps into a set of smaller, compact clumps.
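The density weighting described above can be sketched as follows (for simplicity this sketch always divides by the full nine-point neighborhood, i.e. it treats every point as interior; the text notes that the divisor depends on the point's location in the grid):

```python
def density_weighted(magnitudes, threshold):
    """Multiply each above-threshold grid point's magnitude by its local
    neighborhood occupation density: the fraction of the 3x3 neighborhood
    (the eight adjacent points, diagonals included, plus the point itself)
    that is also above threshold. `magnitudes` maps (x, y) -> value."""
    weighted = {}
    for (x, y), mag in magnitudes.items():
        if mag < threshold:
            continue
        neighborhood = [(x + dx, y + dy)
                        for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        occupied = sum(1 for p in neighborhood
                       if magnitudes.get(p, 0.0) >= threshold)
        # For an interior point the divisor is nine; the weight is in (0, 1].
        weighted[(x, y)] = mag * occupied / len(neighborhood)
    return weighted

out = density_weighted({(0, 0): 2.0, (0, 1): 2.0, (5, 5): 2.0}, 1.0)
print(out[(5, 5)] < out[(0, 0)])  # True: the isolated point is damped most
```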
Preferred Geometric Shape

A single preferred geometric shape is used throughout these computations in order to have consistency and simplicity of the calculations. The preferred shape disclosed herein is a semi-rectilinear oval akin to the shape of a band-aid, that is, a rectangle with semi-circle "end-caps" (these microburst shapes are the same as the TDWR shapes). This shape is mathematically defined by an axis line segment having two end points and a radius used at each of the end points to define a semicircle. This geometric shape is produced for each clump such that the axis line segment has the minimum weighted squared distance from all of the grid points that are within this given clump and, furthermore, this shape encloses all of the clump's grid points. In cases where the shape is overly large or concave in nature, the shape is partitioned to create a number of smaller shapes which enclose the grid points. This shape is produced such that it is of minimum area after satisfying these conditions. As a further processing step, a least-squares size reduction is then performed to "trim" overly large shapes.

In computing the shapes for microbursts, the magnitude information used is the ratio of the calculated divergence to the threshold that is mapped from triangles and edges into the grid points. A given grid point's ratio value is generated as follows. First, a ratio for each LLWAS network element (triangle and/or edge) is computed. This ratio is the ratio of that element's detected divergence estimate to that element's divergence threshold value. This predetermined threshold, designed to indicate hazardous wind-field divergence, is computed based upon a mathematical microburst simulation and takes into account the geometrical nature of the given triangle or edge. Another set of magnitude information used is an associated loss value estimate for each point, based on these divergences. The microburst shapes are calculated at the "wind shear alert" (WSA) level using the low level density clumps, least squares shape size reduction and the statistical shape magnitude estimate. The other set of geometric shapes is at the "microburst alert" (MBA) level using the high level clumps, least squares reduction and the maximum value of magnitude computation.

Clump Generation Process

Figure 2 illustrates in detailed flow diagram form the clump generation process 111, which process is initiated at step 1102, where the data is received from the associated low level wind shear alert system 101 and stored in memory. At step 1103, the clump generation process 111 converts the low level magnitude points into local occupied neighbor density weighted magnitude values. This process, as discussed above, uses all of the low level input magnitude values and computes new values for these points based on the density of adjacent data points that have exceeded the initial predetermined threshold. Each given data point that is above the input threshold value is given a density weight, which is a number between zero and one indicative of the number of contiguous grid points, including the given point, that are above the input threshold value, divided by the total number of contiguous points.
That is, for an interior point the density weight is the number of neighboring points above the input threshold value divided by nine. This is because the contiguous points are defined as the adjacent points to the left, right, up, down and the four diagonal points in this xy Cartesian coordinate system. Once this set of density weighted values has been computed, processing advances to step 1104, where the initial grouping of data points is accomplished by grouping the grid points that have exceeded the threshold value into contiguous groupings. Concurrently with the operations on the low level density data points, or subsequent thereto, steps 1105 and 1106 are executed on the high level magnitude points to perform the same contiguous grouping function of steps 1102 and 1103. The set of groupings is then used at step 1106 by the shape driver to generate the predetermined geometric shapes of minimum area.
Using points that are still inside the shape after radius reduction, the least squares reduced axis segment is computed to produce a new reduced axis line segment. The resultant reduced shape axis line segment is then converted into the original, non-rotated Cartesian coordinate system and the overall magnitude for the shape is computed. The resultant shape consists of a line whose end points represent the centers of semicircles of predetermined radius, which end point semicircles, when connected by straight line segments, create a band-aid shape to enclose all of the data points in a minimal area whose magnitude has been calculated. Similar processing of the input data takes place for the high level magnitude points in steps 1106 and 1107, the processing of which can occur sequentially or in parallel with the operation of steps 1104 and 1105. Once the shapes and their magnitudes have been calculated for both the low level density magnitude points and the high level magnitude points, processing exits at step 1109.
Shape Production

As noted above, this predetermined geometric shape is a band-aid shape which is defined by an axis line segment having two end points and a radius used at the end points to produce two semicircular shapes. This process is illustrated in flow diagram form in Figure 3. The process is initiated by retrieving all of the grid points in one of the above noted sets and storing these in memory. Using these stored grid points, the measured or calculated magnitude of each grid point in a clump is normalized. Once all of the grid point values in the set have been normalized, a weighted least squares line is fit through these points using a standard weighted least squares technique. This produces the best line fit through all of the valid points in the input set of grid points. Once the weighted least squares line has been produced, the ends of this line segment are calculated by projecting all of the data points in the set onto the computed least squares line. The process uses the coordinates of each of the data points and the slope of the computed least squares line through these points. The coordinates of the clump points are put into a rotated coordinate system such that the least-squares line is horizontal. The output from this calculation is the clump point coordinates in this rotated system and the axis line segment end points, also in this coordinate system. The first set of coordinate values of this rotated end point is the leftmost point on the line, representative of the smallest x value in the rotated xy Cartesian coordinate system, and the second coordinate output is the rightmost point, representative of the largest x value in this Cartesian coordinate system. Once the ends of the shape line segment have been determined, all of the subsequent computations are done in the rotated coordinate system. The radius of the shape that encloses the points and is of minimum area is calculated by using a one dimensional smooth-function (i.e., monotonic) minimization routine.
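The axis-finding step, a weighted least squares line fit followed by rotation into a coordinate system where the line is horizontal, can be sketched as follows (an illustrative sketch; a production version would also handle near-vertical clumps, which a y-on-x regression cannot):

```python
import math

def weighted_axis(points, weights):
    """Fit a weighted least squares line through clump points, rotate the
    points so that line is horizontal, and return the rotation angle, the
    rotated points, and the axis endpoints (min and max rotated x)."""
    w = sum(weights)
    xm = sum(wi * x for wi, (x, _) in zip(weights, points)) / w
    ym = sum(wi * y for wi, (_, y) in zip(weights, points)) / w
    sxx = sum(wi * (x - xm) ** 2 for wi, (x, _) in zip(weights, points))
    sxy = sum(wi * (x - xm) * (y - ym) for wi, (x, y) in zip(weights, points))
    angle = math.atan2(sxy, sxx)  # slope sxy/sxx, expressed as an angle
    c, s = math.cos(angle), math.sin(angle)
    rotated = [(c * x + s * y, -s * x + c * y) for x, y in points]
    xs = [x for x, _ in rotated]
    return angle, rotated, (min(xs), max(xs))

angle, rot, ends = weighted_axis([(0, 0), (1, 1), (2, 2), (3, 3)], [1] * 4)
print(round(math.degrees(angle)))  # 45
```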
Shape Area Minimization

The minimization function is then activated to compute the radius that minimizes the shape area and, using this new radius, a review is made to determine whether the axis line segment end points can be modified in view of the determined radius. This is done by projecting the valid data points in the current set onto the computed least squares line and calculating new end points as discussed above. Once this is done, the axis length is reduced, if possible, by moving the axis end points towards the axis segment barycenter using a weighted least squares reduction of the horizontal distance from clump points to the closest shape boundary. By closest, it is meant that these points are partitioned into three sets: a set whose x values are less than the shape's barycenter, a set whose x values are greater than the shape's barycenter, and a set of points that were originally associated with the shape but after radius reduction are now outside the shape. The normalized weights are selected to be a function of a point's magnitude and its distance to the axis segment barycenter. The process uses the current axis line segment end points, computes the barycenter of the current axis line segment, and initializes the minimization iteration interval.
If the shape so generated is too large, it is dissected into a plurality of shapes. The test of excessive size is whether the length of the axis line segment plus twice the radius is greater than a predetermined threshold. If so, the axis line segment is divided into smaller and potentially overlapping pieces. The grid data points originally associated with the original clump are then associated with the corresponding sub-shapes. If there is an overlap of the multiple shapes, the grid data points can be associated with more than one shape. The resultant plurality of shapes more accurately reflects the concurrent existence of multiple adjacent or overlapping wind shear events.
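The dissection test and split can be sketched as follows (the overlap amount and the split policy are illustrative assumptions; the text states only that oversized shapes are divided into smaller, potentially overlapping pieces):

```python
def split_axis(x1, x2, radius, max_extent, overlap=0.5):
    """If a shape's total extent (axis length plus twice the radius, in the
    rotated x direction) exceeds max_extent, divide its axis into smaller,
    potentially overlapping sub-axes; each sub-axis then carries its own
    shape, and grid points may belong to more than one of them."""
    if (x2 - x1) + 2.0 * radius <= max_extent:
        return [(x1, x2)]
    piece = max_extent - 2.0 * radius  # axis length each sub-shape may carry
    axes, start = [], x1
    while start < x2:
        end = min(start + piece, x2)
        axes.append((start, end))
        start = end - overlap  # overlap keeps adjacent events covered
        if end >= x2:
            break
    return axes

print(split_axis(0.0, 2.0, 1.0, 10.0))      # [(0.0, 2.0)]: small enough
print(len(split_axis(0.0, 9.0, 1.0, 5.0)))  # several overlapping pieces
```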
Least Squares Shape Size Reduction

This process provides a simple, efficient and mathematically rigorous method for more precisely indicating the hazardous microburst region. The original microburst shape algorithm, still used in the TDWR system, requires that all of the shear segments 804 (the "runs of radial divergence") be enclosed within the microburst shape(s) 803 (Figure 8). If the locus of these shear segments 804 is overly extended and/or fairly concave in nature, the "all-enclosing" shape 803 can be too large. That is, it may contain non-hazardous regions 805. This can generate false alarm warnings, as a runway alarm is generated when any portion of a microburst shape 803 intersects a pre-defined box 802 around a given runway 801. This same situation applies with the LLWAS microburst shapes, where herein we are concerned with overly extended and/or concave grid point clumps, as opposed to shear-segment clusters, though the concept is identical. The solution to this documented "overwarning" problem has been developed in the context of the least-squares reduction of the shape-set for the LLWAS microburst shapes in the apparatus of the present invention.
A further contribution to the "overwarning" problem is in the generation of the "magnitude" of the runway alert. That is, after a given microburst shape 803 intersects a "runway alert-box" 802, a magnitude for the alert must be computed. Again, the technique used for the TDWR stand-alone system is fairly simplistic and tends to over-estimate the hazard magnitude. These over-estimates are often viewed as false alarms by the pilots.
Therefore, again in the context of the LLWAS microburst shapes, a simple, efficient and mathematically rigorous method is used in the apparatus of the present invention. This algorithm employs a statistical estimate for a given microburst shape's magnitude.
A shape is defined by two axis end points, $(X_{e1}, Y_{e1})$ and $(X_{e2}, Y_{e2})$, $[X_{e1} \le X_{e2}]$, and a radius $R$ (Figure 7). The shape is generated initially by finding the line which, in a least squares sense (weighted by magnitude), best fits the set of points in a given "clump". These clump points essentially reflect the divergence magnitude at those points in space, as estimated from the LLWAS wind field.
The radius is then found by an iterative procedure which minimizes the area of the shape while simultaneously requiring that all points in the clump are enclosed. This technique is identical to the procedure used for TDWR, which uses "segment endpoints" as opposed to "points in a clump". Next, we try to reduce the shape size so that it gives a better fit to the points. This is done because the original criterion, that all points be enclosed, tends to result in overly-large shapes when the clump is fairly concave. A further undesired complication occurs because of the generally "weaker-magnitude" points on the edges of the clump. This can be conceptualized by considering a symmetrical microburst outflow.
The clump points can be viewed as describing contour levels of divergence, the "center" of the clump being the "center" of the microburst outflow. The highest level of divergence would be at the center of the microburst outflow, then monotonically decreasing in magnitude with increasing distance from the center. The shape's radius is first reduced, then the axis length. Both are done using a weighted least squares technique.
Reduction of the Shape Radius What we do here is reduce the (weighted) ~ t~nce of the (originally) enclosed 5 points, (Y~,~, Y,~), to the shape bOulldaly.
We have that R = d~ + d~, where R is the original radius, d ~is the perp.~n~ r tlist~nce from the point to the shape axis (or axis endpoint if X~ ~
Xl, or X~ 2 Xc~)~ and d~ is the dict~nre from the point to the boundary.
Therefore, we ~n;n;...;7e d~ = R - d~, which leads to the weighted least squares10 equation for R, the new radius:

~ Wk(~ dk )=~~

which has the solution:

R ~ Wk dk, when we choose a set of norm~ PA ~";gllts W~ , W~
We define the ~ 1IL~ to be:

mkdk wk= ~ ~ d where m~ is the given m~itlldç at each point. This ~~eigllil,g is used to remove the bias 15 generated by the relative higher density of the internal points. This can be understood by considering a shape which is a disc, and whose cor. ~ ent clump-points all have equal ...~gr.;l.ldes If the~.e;glfilg r..,..,~;0,- onlyco ;dered...a~ dçc thentheleastsquares radius re~uction would always ~ttP-mpt to make a new disc of minim~l-radius~ The use of the r~ r~e values in the ..e;~ n~ function is d~ A to coullteract this t~n~Pncy.20 Furthe.,llore, we choose a coofdlndle system rotated such that the axis is ho.;,~

    (X_k, Y_k) → (X'_k, Y'_k)    (denoting rotated coordinates; for simplicity the primes are dropped below)

In this coordinate system, the d⊥k's are given by:

    d⊥k = [(X_k − X_e1)² + (Y_k − Y_c)²]^(1/2),   X_k < X_e1
    d⊥k = |Y_k − Y_c|,                            X_e1 ≤ X_k ≤ X_e2
    d⊥k = [(X_k − X_e2)² + (Y_k − Y_c)²]^(1/2),   X_k > X_e2

where Y_c is the Y-coordinate of the (now horizontal) axis.

Reduction of the Shape Axis Length

Next, we reduce the axis length by (separately) moving the axis segment endpoints toward the segment mid-point. We use a least squares reduction of the horizontal (in rotated coordinates) distance from a given point to the (closest) boundary. Note: the axis is reduced only when the axis length is longer than a threshold length (approximately 1 km). By "closest", we mean that the clump points are partitioned into three sets: a set whose X-coordinates are less than the shape axis segment mid-point, X̄; one "greater-than" X̄; and a third set consisting of those points that (after radius reduction) are outside the shape. We do not use this third set of points, since their (horizontal) distance to the boundary is (now) undefined.

    X̄ = (X_e1 + X_e2) / 2

Therefore, the problem we are trying to solve (for a generic endpoint "e") is:

    d_k = d_k(X_e, X̂_e)

where d_k is the horizontal (X) distance from point k to the boundary, X̂_e is the (eventual) least squares estimate, and X_e and X̂_e are respectively the original and least squares endpoints.

The new endpoint we want satisfies:

    Σ_j w_j d_j = 0

where the set of points j refers to either points greater than X̄ for the "right" endpoint, or less than X̄ for the "left" endpoint, respectively. The weights are chosen to be:
    w_j = m_j |X_j − X̄| / Σ_i m_i |X_i − X̄|

where:

    Σ_j w_j = 1

As before, the weights are chosen to reduce over-bias by points close to X̄.
The horizontal (X) distance to the boundary, d_j, is given by:

    d_j = L_j − ΔX_j = (R̂² − Y_j²)^(1/2) − (X_j − X̂_e)

where L_j is the horizontal distance from the point (X_j, Y_j) to the least squares reduced boundary, and ΔX_j is the horizontal distance between X_j and X̂_e:

    L_j = (R̂² − Y_j²)^(1/2)    (R̂ is the least squares reduced radius)
    ΔX_j = X_j − X̂_e

The value we want to minimize is then:

    d_j − X̂_e = (R̂² − Y_j²)^(1/2) − X_j

Therefore, the new endpoint, X̂_e, is given by (again in rotated coordinates):

    X̂_e = Σ_j w_j [X_j − (R̂² − Y_j²)^(1/2)]

where:

    w_j = m_j |X_j − X̄| / Σ_i m_i |X_i − X̄|
Note: the same values result for points between X̄ and X̂_e, and between X̂_e and the boundary. Furthermore, the same result applies to points on either side of X̄. That is, the same equations apply equally for both sets of points "j" (partitioned based upon being less-than or greater-than X̄).
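For concreteness, the two reduction steps above can be sketched in Python. This is a minimal illustration only: the function names and argument layout are invented here, the endpoint formula follows the "right" endpoint sign convention, and degenerate inputs (zero total weight, |Y_j| > R̂) are not handled.

```python
import math

def reduce_radius(mags, d_axis):
    """Weighted least-squares radius reduction: R_hat = sum(w_k * d_k), with
    normalized weights w_k = m_k * d_k / sum(m_j * d_j), where d_k is each
    point's perpendicular distance to the shape axis."""
    raw = [m * d for m, d in zip(mags, d_axis)]
    total = sum(raw)
    weights = [r / total for r in raw]
    return sum(w * d for w, d in zip(weights, d_axis))

def reduce_endpoint(xs, ys, mags, x_mid, r_hat):
    """Move one axis endpoint using the points on one side of the axis
    mid-point x_mid (rotated coordinates, axis horizontal):
    X_hat_e = sum(w_j * (X_j - sqrt(R_hat^2 - Y_j^2))), with weights
    w_j = m_j * |X_j - x_mid| / sum(m_i * |X_i - x_mid|)."""
    raw = [m * abs(x - x_mid) for x, m in zip(xs, mags)]
    total = sum(raw)
    weights = [r / total for r in raw]
    return sum(w * (x - math.sqrt(r_hat * r_hat - y * y))
               for w, x, y in zip(weights, xs, ys))
```

The magnitude-times-distance weighting mirrors the bias argument above: equal-magnitude points far from the axis (or mid-point) dominate, so the fit is not dragged inward by the dense interior points.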
LLWAS Microburst Shapes - Magnitude Computation

This routine computes an overall magnitude estimate for a given shape. The technique is to assume a Student's t-statistic distribution for the magnitudes of the set of points associated with the shape. The shape magnitude is then the percentile value given by the mean magnitude plus "K" standard deviations. This is an application of the well-known "confidence interval" technique from statistical theory. This distribution was chosen for its applicability to small sample sets and its approximation to a normal distribution for sample sets of around thirty elements or more. Furthermore, the value of "K" that has been used (K = 1.0) was chosen to approximate an 80 to 90% percentile value over a wide range of degrees of freedom (which is the number of points minus one).
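A minimal sketch of this magnitude estimate (mean plus K sample standard deviations); the function name is invented, and the fallback for fewer than two points is an assumption of this sketch, not part of the described routine.

```python
import statistics

def shape_magnitude(mags, k=1.0):
    """Overall shape magnitude as an upper-percentile estimate: the mean
    magnitude plus K sample standard deviations (n - 1 degrees of freedom).
    K = 1.0 approximates an 80-90% percentile over a wide range of degrees
    of freedom under the Student's t assumption."""
    if len(mags) < 2:
        return mags[0] if mags else 0.0
    return statistics.mean(mags) + k * statistics.stdev(mags)
```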
Symmetry Test

Symmetry test apparatus 113 validates the microburst shapes produced by microburst shapes generator 112, based on the spatial information produced by the features aloft and shape information obtained from the Terminal Doppler Weather Radar system 502. This validation determines if there is supporting evidence that a given LLWAS microburst shape is truly associated with a microburst. That is, the shape that is generated from the detection of surface wind field divergence can be associated with either a microburst or some other type of wind field anomaly, such as thermal activity, noisy winds, etc. Simplifying assumptions are implicit in the generation of microburst shapes, and these assumptions are based on the association of the surface divergence with a microburst. In non-microburst situations these assumptions can lead to the generation of unwanted false alarms. This symmetry test procedure 113 removes the unwanted alarms by reviewing reflectivity and microburst precursor information from the Terminal Doppler Weather Radar system 502. These inputs are combined spatially and temporally to form symmetry disks whose presence indicates the possible existence of a microburst within or above its boundary. The given microburst shape that is to be validated by the symmetry test 113 is then tested for its proximity to a symmetry disk. Therefore, a weak microburst shape that is close to a symmetry disk is validated, and those that are not are presumed to be an erroneous detection. This symmetry test 113 is initiated at step 1301 with retrieval of site-specific parameters from memory to modify the calculations based on local climatological conditions and sensor configuration. At step 1302, a rectangular grid in the xy Cartesian coordinate system is produced, consisting of the minimal size grid necessary to analyze the calculated shapes. At step 1303, the microburst shapes are selected whose magnitudes are equal to or greater than a site-adaptable threshold. At step 1304, the present grid point values are computed based on current Terminal Doppler Weather Radar features aloft information and any Terminal Doppler Weather Radar or Low Level Wind Shear Alert System microburst shapes.
The features aloft inputs are in the form of disks described by an xy center coordinate, a radius, and a type: low reflectivity, storm cell, reflectivity core or microburst precursor disks. A magnitude value for each of these features aloft disks is assigned based upon its type. The microburst shapes used herein are those that have been filtered previous to this routine and exceed the predetermined threshold values. Therefore, all of the Low Level Wind Shear Alert System and Terminal Doppler Weather Radar shapes computed are screened to come up with a composite set of shapes that exceed a given threshold value. For each disk that impacts the analysis grid that has been produced, specific grid points within that disk have their magnitude updated based on the nature of the disk. Each grid point magnitude value is time filtered with a single pole recursive filter to enforce a sense of time continuity. This set of filtered magnitudes is then the output of this routine to the create symmetry disks step 1305. The disk magnitudes are selected by appropriately choosing base or minimum values for each input set so that the features aloft disk type relates to the value of the actual loss magnitudes. Once these grid values have been established, at step 1305 the symmetry disks are created using a slightly modified version of the clump and shape generation algorithm described above. Once these shapes have been created at step 1305, at step 1306 the symmetry test is performed to validate the weaker Low Level Wind Shear Alert System microburst shapes. The LLWAS microburst shapes and symmetry disks are the input to this step. Any Low Level Wind Shear Alert System microburst shape whose magnitude is equal to or above a threshold value automatically passes the test. Otherwise, a circumscribing disk is created around each of these weak shapes and a test is performed to see whether a given Low Level Wind Shear Alert System disk is close to any symmetry disk.
If it is, then that Low Level Wind Shear Alert System shape passes the test. The output of this process is a list of logical values, one for each of the input Low Level Wind Shear Alert System microburst shapes, to indicate the results of this symmetry test, with a true value indicating that the shape has passed the test and is valid for use in creating a microburst alert.
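The per-grid-point time filtering mentioned above is a standard single-pole (first-order IIR) recursive filter. A sketch, with the smoothing constant alpha as a hypothetical site-adaptable parameter:

```python
def time_filter(prev_filtered, new_value, alpha=0.5):
    """Single-pole recursive filter enforcing time continuity on a grid
    point's magnitude: y[t] = alpha * x[t] + (1 - alpha) * y[t-1]."""
    return alpha * new_value + (1.0 - alpha) * prev_filtered
```

Applied once per update cycle, a step change in the input approaches the new value geometrically rather than instantly, which suppresses single-scan noise in the grid.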
Microburst Integration

The microburst integration apparatus 114 is the driver of the microburst portion of the integration apparatus. This apparatus converts the Terminal Doppler Weather Radar microburst shapes, the validated microburst shapes output by symmetry test apparatus 113, and the Low Level Wind Shear Alert System microburst shapes into runway specific alerts for any regions on the operational runways (arrival R1, departure R1, etc.) that are defined for the physical runways R1-R4 in the associated predetermined area which are affected by the shapes. The regions so affected are combined with the Low Level Wind Shear Alert System runway oriented loss alarms. The Low Level Wind Shear Alert System inputs to this microburst integration apparatus 114 are the runway oriented losses that are the outputs produced by the Low Level Wind Shear Alert System 101. The microburst integration apparatus 114 produces arrays containing the magnitude and location of any loss alarm as mapped onto the runway configuration within the predetermined area. The microburst integration apparatus 114 receives Terminal Doppler Weather Radar microburst shapes from the Terminal Doppler Weather Radar system 502 and converts these by mapping them into runway specific locus and magnitude indications to produce runway alarms. In addition, microburst shapes that are received from the Low Level Wind Shear Alert System 101, as validated by the symmetry test apparatus 113, are also converted into runway alarms once they have sufficient magnitude or the symmetry test of symmetry test apparatus 113 validates their existence.
In addition, any Low Level Wind Shear Alert System runway oriented losses, as produced by Low Level Wind Shear Alert System 101, that are concurrent with any Low Level Wind Shear Alert microburst shapes are converted into alarms and combined with the above noted Terminal Doppler Weather Radar microburst shapes and Low Level Wind Shear Alert System microburst shapes, and output as a combination of alarms.
(1) Generation of Runway Specific Alerts:
(a) Find alerts that would be generated individually by TDWR and validated LLWAS microburst shapes. This is done by the integrated TDWR logic, which finds the intersection of a given shape with an "alert box" (nominally a rectangle around the operational runway path, 1/2 nautical mile to either side and extending to 3 N.Mi. off the runway end). This is done for each microburst shape. [The LLWAS-generated runway-oriented-loss (ROL) value(s) are only used when an LLWAS microburst shape is generated but then not validated via the symmetry-test algorithm.] Then the overall alert for the given operational runway is computed by finding the "worst-case" magnitude and "first-encountered" location from all the "interesting" shapes and the ROLs for the runway.
(2) Display Information:
(a) The above logic is for generating the runway alerts. That information is then relayed to the ribbon display terminals for the air traffic controllers, who then transmit it to any impacted aircraft. The same information is also displayed on the geographical situation display by "lighting-up" the appropriate runway locations.
(b) The TDWR and validated LLWAS microburst shapes are also displayed on the geographic display terminals.
The above-mentioned "worst-case" magnitude and "first encountered" logic is further applied down-stream after the gust-front integration alerts are separately generated. That is, there can be - and often are - multiple types of alerts for a given operational runway. Again, to avoid user-interpretation and confusion issues, only one alert is generated for a given operational runway at a given time. Therefore, the above logic is applied for all alerts for a runway. That is, alerts are separately generated for losses (microbursts, etc.) and gains (gust fronts, etc.), then a single "worst-case" alert is generated. However, microburst alerts (losses ≥ 30 knots) always take precedence. That is, if there is concurrently a 35 knot loss and a 45 knot gain, the 35 knot loss is used. This is because a wind shear that would generate a very hazardous loss (i.e. ≥ 30 knots) is considered to be more significant for the aircraft.
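The precedence rule can be sketched as follows; the (kind, magnitude) tuple format is an invented representation for illustration, not the apparatus's actual data structure.

```python
def select_alert(alerts):
    """Pick the single alert issued for an operational runway.

    alerts: list of (kind, magnitude_knots) tuples, kind in {"loss", "gain"}.
    A microburst-level loss (>= 30 knots) always takes precedence over any
    gain; otherwise the worst-case (largest magnitude) alert wins."""
    if not alerts:
        return None
    microbursts = [a for a in alerts if a[0] == "loss" and a[1] >= 30]
    if microbursts:
        return max(microbursts, key=lambda a: a[1])
    return max(alerts, key=lambda a: a[1])
```

With the 35 knot loss / 45 knot gain example from the text, the loss wins despite its smaller magnitude.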
Additional Data Acquisition Subsystems

The above description of the improved low level wind shear alert system 100 is simply illustrative of the type of aviation weather apparatus that is available for use in implementing the virtual reality imaging system 10 in an aviation weather application. Additional data acquisition apparatus can include lightning detectors, gust front tracking systems, weather radar to identify the presence and locus of storm cells and precipitation, icing condition detection systems, aircraft tracking radar, etc. Each of these systems produces data indicative of the presence, locus, nature and severity of various meteorological phenomena of interest to aviation operations. In addition, topological data such as a LANDSAT image of the land surface within the predetermined multidimensional space is also available. Other spatial features of the multidimensional space, such as aircraft operations, restricted airspace, airport locations, etc. are also data inputs that are available in the form of static or dynamic data from existing instrumentation, or input to graphics subsystem 2 as initialization data. In summary, there are numerous sources of data relevant to the user's needs, and the graphics subsystem 2 integrates these data sources and filters the received data to create a simplified image of the multidimensional space for the user, to enable the user to perform a desired task without being overwhelmed by the quantity of data and without having to ignore major sources of data due to the user's inability to absorb and process the quantity of data that is available.
Graphics Subsystem Architecture

Figure 1 also illustrates additional detail of an implementation of the graphics subsystem 2. The virtual reality imaging system 10 can serve a plurality of users, with each user defining a particular image set that is to be displayed. Therefore, the graphics subsystem 2 can be equipped with a number of graphics processing apparatus 31-3m, or a single graphics processing apparatus can process data for multiple users. Each of graphics processing apparatus 31-3m receives data input from one or more data acquisition apparatus 21-2n in the form of raw data or compact data representations. The graphics processing apparatus 31-3m convert the received data into images as described below.
Within a graphics processing apparatus 31, graphical object generator module 4 functions to convert the raw data or compact data representations received from an associated data acquisition apparatus 21 into graphical objects that are later manipulated to produce the required images. Each graphical object generator module 4 includes a plurality of graphical object generators 41-4k that are described in additional detail below. The graphical objects that are produced by a graphical object generator module 4 are stored in database 3, along with viewing data input by user interface 5. The user input interface 5 can be a simple terminal device, such as a keyboard, to define a single user selected view, or can be a device to input a continuous stream of data indicative of a continuously changing user defined view. This latter device can be a set of sensors worn by the user that sense the user's head position, to thereby enable the virtual reality imaging system 10 to present the instantaneous virtual field of view that is presently in the user's field of vision.
The database 3 is also connected to presentation subsystem 6 that converts the data stored in database 3 into a visual image for the user, as illustrated in flow diagram form in Figure 19. Presentation subsystem 6 includes element 301, which functions at step 1901 to initialize the visual display that is to be produced for the one or more users. This function clears the processing elements that comprise the graphics processing apparatus 31 and determines at step 1902 which of the sets of data contained in database 3 are to be used to produce the graphic images for the selected user. Element 302 determines which characteristics or parameters of the data contained in the database 3 are to be displayed to this particular designated user. This is accomplished by determine viewing parameters element 302 retrieving at step 1903 the filter parameters from user interface definition section 3A of database 3 that define the objects and features that are of interest to this user. The determined parameters are then transported at step 1904, along with the raw data obtained from database 3, to the render graphical objects element 303, which performs the data merging, transposition and whatever other processing steps are required to produce the visual image. The image that is produced by the render graphical objects element 303 is then transmitted by the appropriate transmission media to the display 11 that corresponds to the particular user.
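Steps 1901-1904 can be summarized as a sketch; `database`, `render` and `display` are hypothetical stand-ins for database 3, render element 303 and display 11, and the dictionary layout is invented for illustration.

```python
def present_view(database, user_id, render, display):
    """Select the user's data sets (step 1902), fetch the user's filter
    parameters from the interface definition section (step 1903), then
    render the image and ship it to the user's display (step 1904)."""
    datasets = database["datasets"][user_id]
    params = database["user_interface"][user_id]
    image = render(datasets, params)
    display(image)
    return image
```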
Image Presentation

Examples of the views that are created by this apparatus are illustrated in Figures 13-17. These views are in the context of an airport weather system, wherein the displays illustrate an overview of the weather in a predetermined space, as viewed from above in Figure 13, and in successive views in Figures 14-17 that are presented to a pilot or an air traffic controller along the flight path taken by an aircraft to approach and line up with a particular selected runway 79 at the airport. As can be seen from these views, there are a plurality of weather phenomena in the multidimensional space. The weather phenomena include wind shear events 91-98, thunderstorms P and gust fronts G. The display illustrates not the phenomena per se, but filtered representations thereof that indicate to the pilot of the aircraft only the significant features thereof, in order to enable the pilot to avoid any sections of the phenomena that are dangerous to the operation of the aircraft. In particular, the thunderstorms P may include wind shear events 91-98 that are extremely dangerous for aircraft operations. The view from the cockpit of the weather phenomena may be totally obscured due to rain, fog, clouds or snow, and the illustrations provided in Figures 13-17 are indicative of how these visually obscured events can be eliminated by the apparatus of the virtual reality imaging system 10 to provide the pilot with a clear indication of the presence of hazards, some of which may not be visually detectable by the naked eye, in the path of the aircraft or adjacent thereto. The pilot can therefore avoid these hazards using the virtual reality presented by the apparatus of the present invention. By flying along the clear flight path as indicated by the display, the pilot can avoid all weather phenomena that are obscured by the visually occluding phenomena, without having to be instructed by the air traffic controllers.
Furthermore, the air traffic controllers or the pilot can make use of the capability of this system to visually determine a proposed flight path through the weather to identify preferred routes for aircraft operations. This capability can be initiated via user interface 5, wherein an air traffic controller moves a cursor on the screen of a display, such as 11, or types an aircraft identifier on a keyboard to select one of the plurality of aircraft A in the multidimensional space. This aircraft selection is translated, using aircraft position, altitude and heading data received from an aircraft tracking radar data acquisition subsystem and stored in database 3, into a set of coordinates indicative of a point in the multidimensional space. A predefined field of view for an aircraft of a particular type is also retrieved from the database 3 and used to create the graphic image for the user.
An example of a visually obscuring event is precipitation. The range of intensity of precipitation can be divided into a plurality of categories, for example on a range of 0 (clear) to 6 (nasty). Level 1 precipitation is characterized by clouds and/or light rain that causes some minimum impact on visibility, such that aircraft flying through level 1 precipitation usually will rely on instruments for guidance rather than exclusively on visual guidance. Level 3 precipitation is characterized by clouds and moderate rain with a more significant impact on visibility. Lightning is possible in level 3 precipitation and often emanates from the higher level precipitation regions that are typically embedded in a level 3 region, and can strike outside the higher level region. Aircraft can usually fly through level 3 precipitation, but it is typically avoided whenever possible due to the air turbulence encountered therein. A level 5 precipitation region is characterized by clouds, heavy rain and/or hail, with lightning and heavy turbulence often present. A level 5 region of precipitation represents a region to avoid due to the hazards encountered in flying through this region.
In visually representing these various regions of precipitation, the level 1 iso-surface represents the rough "extent of the weather", while the higher level regions represent "weather impacted airspace" that lies within the level 1 region. The iso-surfaces that are displayed on display 11 can be opaque or semi-transparent. If opaque, only the lowest level precipitation iso-surface is displayed, since the other higher level regions are nested inside of this iso-surface and cannot be seen. If a semi-transparent display is selected, then the nested regions of higher precipitation can be seen through the semi-transparent exterior iso-surface as darker iso-surfaces, or as regions displayed by an iso-surface of a contrasting color.
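The opaque versus semi-transparent choice reduces to a simple selection rule; a sketch with an invented function name:

```python
def visible_isosurfaces(levels_present, semi_transparent):
    """Choose which precipitation iso-surfaces to draw.  With an opaque
    exterior, only the lowest level is drawn (higher levels are nested
    inside it and hidden); with a semi-transparent exterior, all nested
    levels are drawn, lowest first."""
    levels = sorted(levels_present)
    if not levels:
        return []
    return levels if semi_transparent else levels[:1]
```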
Rendering Process and Apparatus

The preferred embodiment of the virtual reality imaging system of the present invention makes use of weather phenomena as one of the objects displayed to the user. In order to better understand the operation of the graphics subsystem 2, the rendering of a microburst shape is described in additional detail. The concept of rendering is related to apparatus that creates a synthetic image of an object. The renderer apparatus creates a shaded synthetic image of the object based upon three-dimensional geometric descriptions, a definition of surface attributes of the object, and a model of the illumination present in the space in which the object resides. The final image produced by the renderer apparatus is spatially correct in that surfaces are ordered correctly from the observer, and the surfaces appear illuminated within the scope of the illumination model. Surfaces may be displayed in a manner to enhance the texture of the surface to highlight that feature, or reflection images on the object can be displayed. Renderer 303 can be a separate processing element, or can be software running on a processor shared by other elements displayed in Figure 1.
If an object is to be rendered, a description of the object is passed from database 3 to the renderer 303, where the object definition is merged with other so retrieved object definitions to create the image.
This apparatus functions in a manner analogous to the operation of a camera. The camera's position, aiming direction and type of lens must all be specified to produce the visual image. This information is created by determine viewing parameters element 302 from the data stored in database 3 by the user via user interface 5. The viewing parameters are used by the renderer 303 to determine the field of view of the camera, and to delete objects or sections of objects that are obscured by the presence of other objects that are located closer to the camera and in the line of sight of the camera. As the camera moves along a path, the camera views the objects in a different perspective. The renderer 303 creates an image for each predetermined interval of time and/or space as the camera traverses the path.

The representation of each object in the predetermined multidimensional space can be accomplished by defining the object as an interconnection of a plurality of polygons and lines. For a polygon that comprises a triangle, its definition is accomplished by specifying the location in three-dimensional space of the three vertices of the triangle. The renderer 303 uses the data defining the three vertices to determine whether the two-dimensional space encompassed by the three sides of the triangle that interconnect the three vertices is within the field of view of the camera. If so, the renderer must also determine whether the triangle is partially or fully obscured by other triangles already retrieved from the database, which triangles define the surfaces of other objects. Additional complexity is added to this task by the inclusion of color, texture and opacity as defining terms of the object. When the renderer 303 traverses all objects defined in the database and included within the field of vision, the graphical image is completed and the resultant image is transferred to the display 11.
A microburst graphical image comprises a collection of the primitive graphical objects, such as triangles and lines. If triangles are used to define the surfaces of a microburst, a plurality of triangles are assembled together to create a multi-faceted series of surfaces to project the image of a solid object. Data defining the vertices of each triangle, and other relevant surface features, are stored in the graphical object segment of the database 3. When the renderer 303 traverses the database 3, it uses the data obtained from user interface virtual reality definition database 3A to identify the filter parameters that are used to define the objects and features that are of interest to the specific user. These filter parameters can be microburst magnitude, proximity to an airport runway or aircraft flight path, direction of movement, etc. The additional objects are also retrieved from the database 3, such as other aircraft, precipitation, gust fronts, terrain features, etc. Each object is defined in terms of the polygonal shapes and their location and extent within the predefined volume. Additional primitives can be included in this system, such as object transparency, native color, graphical representation color, etc.
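A triangle-based surface description of this kind can be sketched with simple records; the field names and the centroid helper are illustrative inventions, not the patent's data layout.

```python
from dataclasses import dataclass

@dataclass
class Vertex:
    x: float
    y: float
    z: float

@dataclass
class Triangle:
    """One facet of an object's surface: three vertices located in
    three-dimensional space, plus surface attributes used by the renderer."""
    v0: Vertex
    v1: Vertex
    v2: Vertex
    color: tuple = (255, 255, 255)   # native color
    opacity: float = 1.0             # 1.0 = fully opaque

def centroid(tri):
    """Average of the three vertices (handy for coarse depth ordering)."""
    return ((tri.v0.x + tri.v1.x + tri.v2.x) / 3.0,
            (tri.v0.y + tri.v1.y + tri.v2.y) / 3.0,
            (tri.v0.z + tri.v1.z + tri.v2.z) / 3.0)
```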
Depth Buffer

To illustrate the rendering of an image in further detail, the following description elaborates on typical image rendering concepts as related to the processing elements of renderer 303 that are disclosed in Figure 18.

A fundamental element for image processing is a depth buffer 181 that is used at step 1905 to characterize an object in the multidimensional space that is displayed to the user. In depth buffer processing, each pixel that is stored in memory contains not only data values that describe the color assigned to this pixel, but also data values that are termed a "depth" parameter. The depth parameter is a measure of the distance from the viewer's eye to the spot on the object represented by this pixel. On system initialization, all the pixel color values are set to the background color and the depth parameters are set to the maximum value. This set of data represents an "empty stage", wherein the user is presented with a view of the multidimensional space absent any objects present therein. As each object is added to the multidimensional space, it is rendered pixel by pixel, and the depth value of each pixel is computed and compared to the depth value stored in the memory. If the newly computed depth value is less than the depth value presently stored in memory for this pixel, a determination is made that the newly rendered object is in the foreground, closer to the viewer than the previously rendered object that is presently represented by the pixel. The color and depth values for this pixel are then updated to those computed for the newly rendered object. If the depth value for the newly rendered object is greater than the value presently stored in memory, then the newly rendered object is in the background behind the previously rendered object, and no change is made to the pixel values.

The result of this straightforward process is a bit map of pixel values, which represents a displayable image that shows the front surfaces of the front-most objects in the multidimensional space. The objects appear solid to the viewer and all hidden surfaces are suppressed. The resultant image clearly represents the spatial relationship among the objects. The computation of depth values is fairly simple and can be performed in real time by dedicated hardware. This process is fast but requires an extensive amount of memory. In addition, this process does not address the issue of shadows, transparent or reflective objects.
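The depth-buffer update described above can be sketched directly: pixel depths start at infinity (the "maximum value") and colors at the background, and a fragment replaces a stored pixel only if it is closer to the viewer. The fragment tuple layout is an invented illustration.

```python
import math

def zbuffer_compose(width, height, fragments, background=(0, 0, 0)):
    """Depth-buffer composition.  fragments is an iterable of
    (x, y, depth, color) tuples; smaller depth means closer to the viewer.
    Returns the final color grid (rows of pixels)."""
    color = [[background] * width for _ in range(height)]
    depth = [[math.inf] * width for _ in range(height)]
    for x, y, d, c in fragments:
        if d < depth[y][x]:          # new fragment is in the foreground
            depth[y][x] = d
            color[y][x] = c
    return color
```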
Color Determination

The depth buffer process only determines the geometric position of objects in the space. The color assigned to each pixel is a function of the lighting, viewing geometry, and the object's visual surface properties. One method of color allocation is termed the Phong lighting model. The color determination process 182 at step 1906 makes use of a number of variables that define the object's surface and environment. These environment variables include a unit surface normal vector, which is a vector that is perpendicular to the surface of the object at the point where the color of the object is being evaluated. The unit eye vector is a vector from the viewer's eye to the point on the object's surface where the color of this object is being evaluated. Finally, the unit light vectors are vectors from the point on the object's surface where the color is being evaluated to each light source in the multidimensional space that illuminates the object. The object surface variables include the base color of the object and the diffuse color, which is the color of the reflection component that is scattered equally in all directions. The specular color is the color of shiny highlights and need not be the same as the diffuse color. Finally, the specular exponent is a scalar value that controls the "tightness" of the highlights, as a diffuser lens controls intensity.
Texture Mapping

Texture mapping is a set of techniques 183 that increase the visual detail at step 1907 of computer images. These techniques add surface detail to geometric objects without incrementally modelling those details using additional geometry. The increased detail is accomplished by mapping a predetermined image (texture map) onto the object's surface. The predetermined image is itself stored in memory as a two-dimensional or three-dimensional table of sampled data. As each pixel of the object surface is computed, selected elements from the texture map are used to replace or alter some material properties of the primitive object.

The texture mapping process follows a plurality of steps, starting with a texture synthesis step which produces the texture map data. This step can be performed prior to object rendering or during object rendering. Once the texture map is produced, the selected point on the primitive object is parameterized to uniquely identify the location of the selected point in the texture space. The texture map is accessed using the data that uniquely identifies the selected point, and the corresponding texture at that point on the texture map is returned and used to alter the selected property of the object. This texture mapping process is described in terms of applying a discrete texture map to an object image.
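The parameterize-and-access steps correspond to a texture lookup; a nearest-neighbour sketch, where the (u, v) coordinates in [0, 1] are an assumed parameterization of the surface point:

```python
def sample_texture(texture, u, v):
    """Nearest-neighbour lookup in a 2-D texture map.

    texture: rows of texels (a table of sampled data); (u, v) in [0, 1]
    identify the surface point's location in texture space.  The returned
    texel replaces or modulates a material property at that point."""
    rows = len(texture)
    cols = len(texture[0])
    i = min(int(v * rows), rows - 1)   # clamp v = 1.0 to the last row
    j = min(int(u * cols), cols - 1)   # clamp u = 1.0 to the last column
    return texture[i][j]
```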
These techniques can all be used to enhance the image presented to the user. The image processing can be performed at a central location and the final image data transmitted to the display device, or the processing can be distributed, with compact data representations of the objects being transmitted to the display device along with the object surface characteristic data noted above. The display device can then assemble the image from the received data. In either case, the compact data representation of an object, such as that described above, reduces the quantity of data that must be transmitted, stored and processed.
At step 1908 the generated image data is transmitted to the selected display device 11 for presentation to the user. At step 1909, a determination is made of whether a single static image is to be presented to the user or a series of views taken along a defined path through the multidimensional space and/or over a length of time or at a future time. If a single static view is desired, processing exits at step 1910. If a sequence of images is desired, processing advances to step 1911 where a determination is made whether the sequence of images represents the view as seen from a path through the multidimensional space. If not, processing advances to step 1913.
The path definition occurs at step 1912 where the path definition processing is accomplished via one of a number of possible methods. The path can be defined in its totality so that the path definition operation simply redefines the user viewing parameters to identify the next point along the path at which an image is to be presented to the user.
This view point selection is typically a function of the speed of image processing and whether the user desires a slow motion view or "real time" view. In addition, if the end of the path is reached, processing exits at step 1910. Alternatively, the path can be dynamically defined by the user via a "joystick" type of input device to "fly" a path through the multidimensional space. In this instance, at step 1912, the input device is queried to identify the next viewing point desired by the user. Once a viewing point is selected, processing advances to step 1913.
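For the fully defined path case, the view point selection just described can be sketched as a simple stepping function. The names and the "frames behind" mechanism for relating step size to rendering speed are illustrative assumptions.

```python
def next_view_point(path, index, frames_behind=1, slow_motion=False):
    """Pick the next viewing point along a fully defined path.

    path          -- ordered list of (x, y, z) viewing points
    index         -- index of the current viewing point
    frames_behind -- path points elapsed while the last image rendered; a
                     "real time" view skips ahead by this many when
                     rendering is slow
    slow_motion   -- advance one point per image regardless of render speed
    Returns (point, new_index), or (None, index) when the path is exhausted
    (corresponding to processing exiting at step 1910).
    """
    step = 1 if slow_motion else max(1, frames_behind)
    new_index = index + step
    if new_index >= len(path):
        return None, index      # end of path reached
    return path[new_index], new_index
```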
At step 1913, a determination is made whether a present view or time sequenced view is desired by the user. A selection of present view results in processing advancing to step 1905 where the image generation proceeds as described above. If a time sequenced view is elected, processing advances to step 1914 where the image data is manipulated. The user can select from a number of possible temporal views. A replay of past views or future views, or a combination of past-present, present-future, or past-present-future views, is possible. Temporal processing step 1914 is typically implemented in temporal processor 184, which can maintain a buffer of the n last most recent views, as defined using the compact data representation, for view replay purposes. In addition, temporal processor 184 can extrapolate past and present image data to create a view of the multidimensional space at a point in the future. Temporal processor 184 manipulates the available data at step 1914 to produce data indicative of the state of the various objects extant in the multidimensional space at the selected point in time. This temporally adjusted data is then used at step 1905 and the following image processing steps to produce the image desired by the user.
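The replay-and-extrapolate behaviour of the temporal processor can be sketched as below. The buffer of the n most recent views is kept in a bounded deque; linear extrapolation from the two newest views, and the dictionary-of-positions state format, are assumptions made purely for illustration.

```python
from collections import deque

class TemporalProcessor:
    """Sketch of temporal processor 184: a buffer of the n most recent
    views plus linear extrapolation of object state to a future time."""

    def __init__(self, n=8):
        self.views = deque(maxlen=n)    # (time, {object_id: position}) pairs

    def record(self, t, state):
        self.views.append((t, state))

    def state_at(self, t):
        """Replay a buffered past view, or extrapolate beyond the newest."""
        if not self.views:
            return None
        for vt, state in self.views:
            if vt == t:
                return state            # exact replay of a buffered view
        if len(self.views) < 2:
            return self.views[-1][1]    # nothing to extrapolate from
        (t0, s0), (t1, s1) = self.views[-2], self.views[-1]
        # Assume each object keeps its most recent velocity.
        frac = (t - t1) / (t1 - t0)
        return {k: tuple(p1 + frac * (p1 - p0)
                         for p0, p1 in zip(s0[k], s1[k]))
                for k in s1}
```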
Compact Data Representation Conversion To Object Model

In order to illustrate the use of polygonal shapes to define object surfaces, the microburst shapes described above are illustrated in three-dimensional form. The compact data representation of the microburst shown in Figure 8 is a band-aid shaped two-dimensional region on a surface that is used to indicate the location and extent of an aviation hazard. This shape can be defined in terms of two points and a radius as noted by their Cartesian coordinates (XC1, YC1) and (XC2, YC2), and a radius R, where the Cartesian coordinate system is defined in relation to the multidimensional space. The magnitude of the wind shear in this microburst is additional data that defines this wind shear event and may not be used initially to produce a representation of the wind shear event since, as described above, all wind shear events in excess of a predetermined magnitude are considered to be a hazard and should be displayed. However, the user's filter definition may include microburst severity parameters that modulate the standard wind shear event threshold data to display more or fewer microbursts as a function of the user's filter. In addition, the microbursts can be color-coded to indicate their intensity.
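Under the representation just described (two Cartesian points and a radius R), a point-in-region test reduces to a distance check against the segment joining the two points. The function below is an illustrative sketch; its name, and the treatment of the band-aid shape as the set of points within R of the segment (a 2-D "stadium" region), are assumptions.

```python
import math

def in_microburst(px, py, c1, c2, radius):
    """Test whether surface point (px, py) lies inside the band-aid
    shaped region defined by centre points c1, c2 and a radius."""
    (x1, y1), (x2, y2) = c1, c2
    dx, dy = x2 - x1, y2 - y1
    seg_len2 = dx * dx + dy * dy
    # Project the point onto the segment, clamping to the endpoints so the
    # region is capped by semicircles at each end.
    t = 0.0 if seg_len2 == 0 else max(
        0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / seg_len2))
    nx, ny = x1 + t * dx, y1 + t * dy
    return math.hypot(px - nx, py - ny) <= radius
```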
A microburst is a surface related phenomenon, in that the aircraft hazard only exists when the aircraft is in close proximity to the ground, such as in a final approach to an airport runway or immediately upon takeoff. However, a graphical image of the microburst must have three-dimensional extent to ensure that the user can visually detect its presence in the display 11, especially when viewed from an obtuse angle. The compact data representation of a microburst as shown in Figure 7 provides a two-dimensional definition of the surface locus and extent impacted by the microburst. The surface is typically not featureless, and this two-dimensional compact data representation must be translated into a three-dimensional representation of the event. Therefore, the database 3 is queried to obtain topological data that defines the surface feature in and around the microburst impact area.
The renderer 303 maps the two-dimensional locus and extent data into a three-dimensional representation of the actual surface area impacted by the microburst as illustrated in Figure 8. In addition, there can be a spreading effect at the surface, and the microburst may be rendered in a manner that the top of the microburst is slightly smaller than its extent at the surface, somewhat like a plateau. Surface detail can be added to the three-dimensional rendering of the microburst using the texture surface technique disclosed above. The surface detail can also be varied as a function of the user selected threshold to highlight phenomena of significant interest to the user.
Figures 7-12 illustrate details of the method used by this apparatus to convert the two-dimensional compact data representation of a microburst into a three-dimensional graphic image representation. In Figure 7, the two-dimensional band-aid shape is represented including datum points Pi and surface normals representative of the two-dimensional tangent vector at each datum point on the curve. Once each datum point is identified, its location on the three-dimensional surface is defined such that Pi = (xi, yi, zi).
The series of datum points can then be illustrated diagrammatically in Figure 8 as a perimeter line that follows the topology of the surface to include the locus and extent defined by the two-dimensional band-aid shape. The series of datum points so defined in three-dimensional space each have the property that two data components (xi, yi) define the datum point location on the band-aid perimeter in two-dimensional space. A third component defines the height of the datum point above a base level in the multidimensional space. The normal vectors generated for each datum point define the "outward" direction of the surface of the microburst at that datum point. The normal vectors can be used for shading and to provide the appearance of a smooth surface.
Once the series Pi of datum points is defined as described above, a second series of datum points Qi is defined in three-dimensional space to define the top perimeter of the microburst. Figure 9 illustrates the relation of the first and second series of datum points in two-dimensional space. The second series of datum points is selected so that its perimeter defines a band-aid shape of locus and extent less than that of the perimeter defined by the first series of datum points, so that the second series is inscribed within the first series of datum points. The third component of each datum point in the second series of datum points is selected to be a fixed height above the surface location of a corresponding datum point in the first series of datum points. The resultant three-dimensional shape is illustrated in Figure 10 and resembles a plateau whose top surface follows the contours of the surface on which it rests.
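One way to derive the inscribed top ring from the base ring is to shrink each base point toward the ring's centroid and raise it by the fixed height. This is a sketch under stated assumptions: the disclosure fixes only that the Q series is inscribed within the P series and sits a fixed height above it, not this particular centroid-scaling construction.

```python
def top_perimeter(base_points, inset=0.8, height=500.0):
    """Derive the top ring Q from the base ring P.

    base_points -- list of (x, y, z) datum points on the surface perimeter
    inset       -- scale factor (< 1) shrinking the ring toward its
                   centroid so Q is inscribed within P
    height      -- fixed height of each Q point above the corresponding
                   P point
    """
    n = len(base_points)
    cx = sum(p[0] for p in base_points) / n
    cy = sum(p[1] for p in base_points) / n
    return [(cx + inset * (x - cx), cy + inset * (y - cy), z + height)
            for (x, y, z) in base_points]
```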
To enable the renderer to perform its task, the microburst shape defined by the two series of datum points is converted to a plurality of surface defining polygons. Figure 11 illustrates the definition of one side surface of the microburst while Figure 12 illustrates the definition of the top of the microburst. In particular, in Figure 11, a triangular segment of surface results from connecting the two series of datum points with lines that define the edges of each triangle. A first triangle is defined by connecting points (P1, Q1, Q2) while a second triangle is defined by connecting points (P1, P2, Q2). This process is continued until the last defined triangle connects to the original or starting points P1, Q1. A similar process is used to define the top of the microburst shape.
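The side-wall tiling just described, a first triangle (P1, Q1, Q2) followed by (P1, P2, Q2) and so on around until the strip closes on the starting points, can be sketched as follows; the function name is illustrative.

```python
def side_triangles(P, Q):
    """Tile the microburst side wall with triangles from the two rings.

    For each index i (wrapping at the end of the rings), emit
    (P[i], Q[i], Q[i+1]) and (P[i], P[i+1], Q[i+1]), matching the
    pattern described for Figure 11.
    """
    n = len(P)
    tris = []
    for i in range(n):
        j = (i + 1) % n     # wrap so the last triangles close the loop
        tris.append((P[i], Q[i], Q[j]))
        tris.append((P[i], P[j], Q[j]))
    return tris
```

The vertices of each resulting triangle locate it precisely in the multidimensional space, which is what the renderer consumes in the following paragraph.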
The exterior surfaces of the microburst are thus defined by a series of triangles, each of which is precisely located within the multidimensional space by the Cartesian coordinates of the vertices of each triangle. The renderer can then use the vertices and the surface normal vectors to denote surfaces in the field of vision of the user and to represent the obscuration of one object by another. Once the field of vision is thus defined, the additional attributes of the various objects are used in conjunction with the user-defined filters to transform the visual image into the virtual reality defined by the user. In particular, the types of objects displayed can be defined by the user to eliminate the visually obscuring effects of precipitation, fog, or clouds. In addition, the surface defined by the triangles and the surface normal vectors can be visually displayed using shading as a function of the magnitude and direction of each surface defining vector to provide the user with an accurate three-dimensional solid representation of the object. Furthermore, time sequential values for the location of each datum point can be used to provide a "moving image" representation of the object to illustrate its movement in the multidimensional space over a number of time-sequential time intervals. This time series of data points can also be extrapolated to predict the future movement of the microburst. As can be seen from this example, the microburst can be defined by the compact data representation, which can then be expanded to create a three-dimensional rendering of the object for display to the user.

Display Example

Figures 13-17 illustrate an example of a series of displays that can be used in an aircraft weather display application of the virtual reality imaging system 10. Figure 13 illustrates a top view of a predetermined space, which view includes a plurality of features and phenomena.
The predetermined space is delineated by boundary lines B, and range circles R1-R3 can be included to denote range from a predetermined point located in the predetermined space. Temporally constant features located in this space are roads H1-H6 and natural topological features, such as mountains M. Other features shown on Figure 13 are airport runway 79 and aircraft A. Temporally and spatially varying phenomena that are present in this space are regions of precipitation P, wind shear events 91-98, and gust front G. Collectively, these elements represent the items of interest to aircraft operations in the predetermined space. It is obvious that a level of precipitation is selected for the operation of display 11 to delineate in Figure 13 only the extent of the selected level of precipitation, to minimize the complexity of the display shown in Figure 13. This feature enables the viewer to minimize the extent of the precipitation regions displayed to only the levels of interest. For example, if only level 3 or higher regions of precipitation are displayed, then the visual obscuration presented by the level 1 and 2 precipitation is deleted from the display, thereby enhancing the visibility for the viewer, such as a pilot.
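The level-based filtering described above can be sketched as a one-line threshold over the displayed phenomena. The list-of-pairs format and the names are illustrative assumptions, not the apparatus's actual data structures.

```python
def filter_phenomena(phenomena, min_level=3):
    """Keep only phenomena at or above the user's selected intensity
    level, so that low-level precipitation does not visually obscure
    the display.

    phenomena -- list of (name, level) pairs
    """
    return [(name, level) for name, level in phenomena if level >= min_level]

# Illustrative precipitation regions at levels 1 through 4.
regions = [("P1", 1), ("P2", 2), ("P3", 3), ("P4", 4)]
```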
A more useful display for a pilot is shown in Figure 14, where the user via user interface 5 defines a point in the predetermined space and a field of view. This data is translated into a perspective three-dimensional type of display to illustrate the view from the selected point, with the obscuring data filtered out by the renderer 303. As noted above, the low level precipitation and other aircraft A can be removed from the view and only objects of interest displayed thereon. The display thereby presents an image of the potentially threatening weather phenomena that confronts the pilot, in the form of precipitation P of at least level 3, wind shear events 91-98 and surface topology S. Note that the boundaries B of the predetermined space are also shown to indicate the extent of the data present in the field of view. As the aircraft decreases its altitude and approaches the runway 79, the field of view and point in the predetermined space change as the aircraft traverses the flight path.
The virtual reality imaging system 10 periodically updates the data to update the point in space, field of view, as well as selected characteristics for display, to create ever changing images, such as that shown in Figure 15, as the aircraft traverses the predetermined space to circle the runway 79 and line up for an approach. The view on approach is shown in Figure 16, where the runway 79 is clearly seen, as are regions of precipitation P and wind shear events 96-98. These wind shear events 96-98 in reality may not be visible to the pilot, and the display via a computer generated rendering provides the pilot with information of weather related phenomena that otherwise is unavailable. The pilot can determine the proximity of the wind shear events 96-98 to the runway in determining whether to continue the approach to runway 79. In addition, renderer 303 can be activated to extrapolate the data to illustrate a likely progression of wind shear events 96-98 that is likely to occur during the aircraft approach. This predicted scenario can be quickly displayed on display 11 to enable the pilot to determine the closest approach of wind shear events 96-98 during the entire landing operation. Figure 17 illustrates a timewise sequential display following that of Figure 16 to indicate the display that would be seen further along the flight path as the runway 79 is approached. This sequence of aircraft operation can be simulated in real time by the user providing flight path defining data via a joystick or other such control device to "fly" a proposed flight path through the predetermined space.
Summary

As can be seen from the above examples, the virtual reality imaging system displays features and phenomena, which can be temporally and/or spatially varying, in a manner that filters out the characteristics of the features and phenomena that are not of interest to the viewer. The image presented to the viewer is a condensation of all the data collected by the plurality of data acquisition systems, and some of the data presented represents features or phenomena that are not visible to the viewer with the naked eye. Thus, this apparatus operates in real time to provide each user with a customized view of a predetermined space to enable the user to perform a desired task.


Claims (14)

I CLAIM:
1. Apparatus for presenting a user with a virtual image of phenomena located in a predefined multidimensional space, comprising:
means (201) for generating data indicative of at least one time varying phenomena extant in a multidimensional space, which multidimensional space has predefined extent in a plurality of dimensions; said apparatus further CHARACTERIZED BY:
means (204) for converting said generated data to a compact data representation indicative of a presence and locus of said at least one time varying phenomena in said multidimensional space as well as indicative of an extent of said phenomena in said multidimensional space, said compact data representation defining exterior surfaces of said time varying phenomena;
means (3) for storing data defining a plurality of characteristics of said time varying phenomena that are to be displayed to a user;
means (301, 302) for extracting data, that satisfies said plurality of characteristics defined by said stored data, from said compact data representation; and means (303), responsive to said extracted data, for producing an image representative of a three dimensional view of at least a portion of said multidimensional space to display said phenomena, substantially temporally concurrent with the generation of said data used to produce said image.
2. The apparatus of claim 1 further comprising:
means (202) for storing data representative of at least one texture map, which texture map is indicative of features of a surface of said phenomena, and wherein said image producing means accesses data in said storing means corresponding to a selected one of said texture maps to produce said image.
3. The apparatus of claim 2 wherein said image producing means (303) retrieves said data indicative of exterior surfaces of said phenomena concurrent with accessing data in said storing means corresponding to said selected texture map, said apparatus further comprises:

means for altering said surface to incorporate features of said texture map.
4. The apparatus of claim 3 wherein said image producing means (303) converts said compact data representation data indicative of said exterior surfaces to a pixel by pixel image of said exterior surface of said phenomena.
5. The apparatus of claim 1 wherein said image producing means (303) retrieves locus information of said phenomena from said compact data representation to define position of exterior surfaces of said phenomena in said multidimensional space.
6. The apparatus of claim 5 wherein said image producing means (303) arbitrates among a plurality of phenomena to identify segments of exterior surfaces of ones of said phenomena to be represented in a foreground of a field of view from a user.
7. The apparatus of claim 6 wherein said image producing means (303) combines said identified segments of exterior surfaces to present an inter-object spatial relationship view to said user.
8. A method for presenting a user with a virtual image of phenomena located in a predefined multidimensional space, comprising the steps of:
generating data indicative of at least one time varying phenomena extant in a multidimensional space, which multidimensional space has predefined extent in a plurality of dimensions; said method further CHARACTERIZED BY the steps of converting said generated data to a compact data representation indicative of a presence and locus of said at least one time varying phenomena in said multidimensional space as well as indicative of an extent of said phenomena in said multidimensional space, said compact data representation defining exterior surfaces of said time varying phenomena;
storing data in a memory defining a plurality of characteristics of said time varying phenomena that are to be displayed to a user;
extracting data, that satisfies said plurality of characteristics defined by said stored data, from said compact data representation; and producing, in response to said extracted data, an image representative of a three dimensional view of at least a portion of said multidimensional space to display said phenomena, substantially temporally concurrent with the generation of said data used to produce said image.
9. The method of claim 8 further comprising:
storing data in said memory representative of at least one texture map, which texture map is indicative of features of a surface of said phenomena; and wherein said step of image producing accesses data in said memory corresponding to a selected one of said texture maps to produce said image.
10. The method of claim 9 wherein said step of image producing retrieves said data from said memory indicative of exterior surfaces of said phenomena concurrent with accessing data in said memory corresponding to said selected texture map;
altering said surface to incorporate features of said texture map.
11. The method of claim 10 wherein said step of image generation converts said compact data representation data indicative of said exterior surfaces to a pixel by pixel image of said exterior surface of said phenomena.
12. The method of claim 8 wherein said step of image generation retrieves locus information of said phenomena from said compact data representation to define position of exterior surfaces of said phenomena in said multidimensional space.
13. The method of claim 12 wherein said step of image producing arbitrates among a plurality of phenomena to identify segments of exterior surfaces of ones of said phenomena to be represented in a foreground of a field of view from a user.
14. The method of claim 13 wherein said step of image producing combines said identified segments of exterior surfaces to present an inter-object spatial relationship view to said user.

CA002199619A 1994-09-08 1995-09-08 Virtual reality imaging system Abandoned CA2199619A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/302,640 1994-09-08
US08/302,640 US5490239A (en) 1992-10-01 1994-09-08 Virtual reality imaging system

Publications (1)

Publication Number Publication Date
CA2199619A1 true CA2199619A1 (en) 1996-03-14

Family

ID=23168613

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002199619A Abandoned CA2199619A1 (en) 1994-09-08 1995-09-08 Virtual reality imaging system

Country Status (7)

Country Link
US (1) US5490239A (en)
EP (1) EP0780009B1 (en)
AT (1) ATE200157T1 (en)
AU (1) AU691976B2 (en)
CA (1) CA2199619A1 (en)
DE (1) DE69520504T2 (en)
WO (1) WO1996007988A1 (en)

Families Citing this family (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5751289A (en) * 1992-10-01 1998-05-12 University Corporation For Atmospheric Research Virtual reality imaging system with image replay
US5760752A (en) * 1993-07-15 1998-06-02 Nec Corporation Image display apparatus for displaying an image corresponding to an image state at the transmitting end
US5819016A (en) * 1993-10-05 1998-10-06 Kabushiki Kaisha Toshiba Apparatus for modeling three dimensional information
JP3214776B2 (en) * 1994-04-13 2001-10-02 株式会社東芝 Virtual environment display device and method
US6085256A (en) * 1994-08-19 2000-07-04 Sony Corporation Cyber space system for providing a virtual reality space formed of three dimensional pictures from a server to a user via a service provider
JP3632705B2 (en) * 1994-08-31 2005-03-23 ソニー株式会社 Interactive image providing method, server device, providing method, user terminal, receiving method, image providing system, and image providing method
JPH08110950A (en) * 1994-09-08 1996-04-30 Sony Corp Plotting device and method utilizing hierarchical approximation system for graphic data
US6219062B1 (en) * 1995-03-10 2001-04-17 Hitachi, Ltd. Three-dimensional graphic display device
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US5729471A (en) * 1995-03-31 1998-03-17 The Regents Of The University Of California Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
CA2180899A1 (en) 1995-07-12 1997-01-13 Yasuaki Honda Synchronous updating of sub objects in a three dimensional virtual reality space sharing system and method therefore
JP3461980B2 (en) * 1995-08-25 2003-10-27 株式会社東芝 High-speed drawing method and apparatus
JP3785700B2 (en) 1995-12-18 2006-06-14 ソニー株式会社 Approximation method and apparatus
AU718608B2 (en) 1996-03-15 2000-04-20 Gizmoz Israel (2002) Ltd. Programmable computer graphic objects
US6181302B1 (en) * 1996-04-24 2001-01-30 C. Macgill Lynde Marine navigation binoculars with virtual display superimposing real world image
US6563520B1 (en) 1996-05-01 2003-05-13 Light And Sound Design Ltd. Virtual reality interface for show control
US5903270A (en) * 1997-04-15 1999-05-11 Modacad, Inc. Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
US6341044B1 (en) 1996-06-24 2002-01-22 Be Here Corporation Panoramic imaging arrangement
US6373642B1 (en) 1996-06-24 2002-04-16 Be Here Corporation Panoramic imaging arrangement
US6493032B1 (en) 1996-06-24 2002-12-10 Be Here Corporation Imaging arrangement which allows for capturing an image of a view at different resolutions
US6459451B2 (en) 1996-06-24 2002-10-01 Be Here Corporation Method and apparatus for a panoramic camera to capture a 360 degree image
US6331869B1 (en) 1998-08-07 2001-12-18 Be Here Corporation Method and apparatus for electronically distributing motion panoramic images
US6721952B1 (en) 1996-08-06 2004-04-13 Roxio, Inc. Method and system for encoding movies, panoramas and large images for on-line interactive viewing and gazing
JPH10111953A (en) * 1996-10-07 1998-04-28 Canon Inc Image processing method, device therefor and recording medium
JP3148133B2 (en) * 1996-10-30 2001-03-19 三菱電機株式会社 Information retrieval device
JPH10134208A (en) * 1996-10-31 1998-05-22 Sony Corp Shape data approximating method and plotting device
US5982372A (en) * 1996-11-14 1999-11-09 International Business Machines Corp. Visual metaphor for shortcut navigation in a virtual world
US5907568A (en) * 1996-11-22 1999-05-25 Itt Manufacturing Enterprises, Inc. Integrated precision approach radar display
JP3785709B2 (en) * 1996-12-13 2006-06-14 ソニー株式会社 Shape data approximation method and drawing apparatus
US6125328A (en) 1997-02-10 2000-09-26 Baron Services, Inc. System and method for projecting storms using NEXRAD attributes
US6188960B1 (en) 1997-02-10 2001-02-13 Baron Services, Inc. System and method for predicting storm direction
US5762612A (en) * 1997-02-28 1998-06-09 Campbell; Craig Multimodal stimulation in virtual environments
US5923324A (en) * 1997-04-04 1999-07-13 International Business Machines Corporation Viewer interactive three-dimensional workspace with interactive three-dimensional objects and corresponding two-dimensional images of objects in an interactive two-dimensional workplane
DE19716958A1 (en) * 1997-04-17 1998-10-22 Zbigniew Rybczynski Optical imaging system
US6466254B1 (en) 1997-05-08 2002-10-15 Be Here Corporation Method and apparatus for electronically distributing motion panoramic images
US6356296B1 (en) 1997-05-08 2002-03-12 Behere Corporation Method and apparatus for implementing a panoptic camera system
US6184867B1 (en) * 1997-11-30 2001-02-06 International Business Machines Corporation Input for three dimensional navigation using two joysticks
US6088698A (en) * 1998-02-27 2000-07-11 Oracle Corporation Method and apparatus for incrementally generating a virtual three-dimensional world
US6348927B1 (en) * 1998-02-27 2002-02-19 Oracle Cor Composing a description of a virtual 3D world from values stored in a database and generated by decomposing another description of a virtual 3D world
JP2000067270A (en) 1998-06-12 2000-03-03 Sony Corp Method for approximating shape data, information processor and medium
US6405133B1 (en) * 1998-07-30 2002-06-11 Avidyne Corporation Displaying lightning strikes
DE19832974A1 (en) * 1998-07-22 2000-01-27 Siemens Ag Arrangement for generating virtual industrial system model compares system component information with real system image data to identify components in image data
US6252539B1 (en) 1998-07-10 2001-06-26 Kavouras, Inc. System for processing weather information
US6199008B1 (en) 1998-09-17 2001-03-06 Noegenesis, Inc. Aviation, terrain and weather display system
US6163756A (en) 1998-10-20 2000-12-19 Baron Services, Inc. System and method for detecting and displaying wind shear
US6310619B1 (en) 1998-11-10 2001-10-30 Robert W. Rice Virtual reality, tissue-specific body model having user-variable tissue-specific attributes and a system and method for implementing the same
US6369818B1 (en) 1998-11-25 2002-04-09 Be Here Corporation Method, apparatus and computer program product for generating perspective corrected data from warped information
US6175454B1 (en) 1999-01-13 2001-01-16 Behere Corporation Panoramic imaging arrangement
US6408257B1 (en) * 1999-08-31 2002-06-18 Xerox Corporation Augmented-reality display method and system
US7184051B1 (en) * 1999-09-10 2007-02-27 Sony Computer Entertainment Inc. Method of and apparatus for rendering an image simulating fluid motion, with recording medium and program therefor
GB9921584D0 (en) * 1999-09-13 1999-11-17 Realstreets Ltd A method and system for simulating visiting of real geographical areas
US6400313B1 (en) * 2000-01-12 2002-06-04 Honeywell International Inc. Projection of multi-sensor ray based data histories onto planar grids
DE60137660D1 (en) * 2000-03-17 2009-04-02 Panasonic Corp Map display and navigation device
US6753784B1 (en) * 2001-03-28 2004-06-22 Meteorlogix, Llc GIS-based automated weather alert notification system
US20020147991A1 (en) * 2001-04-10 2002-10-10 Furlan John L. W. Transmission of panoramic video via existing video infrastructure
US7640098B2 (en) 2001-07-31 2009-12-29 Stenbock & Everson, Inc. Process for generating travel plans on the internet
WO2003036558A1 (en) 2001-10-24 2003-05-01 Nik Multimedia, Inc. User definable image reference points
US7602991B2 (en) * 2001-10-24 2009-10-13 Nik Software, Inc. User definable image reference regions
US6809738B2 (en) 2001-12-21 2004-10-26 Vrcontext S.A. Performing memory management operations to provide displays of complex virtual environments
US6791549B2 (en) 2001-12-21 2004-09-14 Vrcontext S.A. Systems and methods for simulating frames of complex virtual environments
US20030212536A1 (en) * 2002-05-08 2003-11-13 Cher Wang Interactive real-scene tour simulation system and method of the same
US7515156B2 (en) * 2003-01-08 2009-04-07 Hrl Laboratories, Llc Method and apparatus for parallel speculative rendering of synthetic images
DK1735454T3 (en) 2004-03-25 2017-08-28 Novozymes Inc PROCEDURES FOR DEVELOPMENT OR CONVERSION OF PLANT CELL CELL POLICY ACCHARIDES
US7479967B2 (en) 2005-04-11 2009-01-20 Systems Technology Inc. System for combining virtual and real-time environments
US7292178B1 (en) 2005-07-28 2007-11-06 Rockwell Collins, Inc. Aircraft hazard detection and alerting in terminal areas
US20080300696A1 (en) * 2005-12-22 2008-12-04 Koninklijke Philips Electronics, N.V. Environment Adaptation for Schizophrenic User
US7492305B1 (en) 2006-09-27 2009-02-17 Rockwell Collins, Inc. Weather profile display system and method with uncertainty indication
US8185881B2 (en) * 2007-06-19 2012-05-22 International Business Machines Corporation Procedure summaries for pointer analysis
US20100218078A1 (en) * 2007-08-28 2010-08-26 Martin Gerard Channon Graphical user interface (gui) for scientific reference comprising a three-dimentional, multi-framed unification of concept presentations
US8868338B1 (en) 2008-11-13 2014-10-21 Google Inc. System and method for displaying transitions between map views
US20120058523A1 (en) 2009-02-17 2012-03-08 Edenspace Systems Corporation Tempering of cellulosic biomass
US9454847B2 (en) * 2009-02-24 2016-09-27 Google Inc. System and method of indicating transition between street level images
US7982658B2 (en) * 2009-03-31 2011-07-19 Honeywell International Inc. Systems and methods for assessing weather in proximity to an airborne aircraft
US20120079627A1 (en) 2009-05-29 2012-03-29 Edenspace Systems Corporation Plant gene regulatory elements
CN102542583A (en) * 2010-12-24 2012-07-04 北京金山软件有限公司 Method and device for displaying ambient effect in two-dimensional image
AU2011200830B2 (en) * 2011-02-25 2014-09-25 Canon Kabushiki Kaisha Method, apparatus and system for modifying quality of an image
DK2805189T3 (en) * 2012-01-18 2019-01-02 Earth Networks Inc Using lightning data to generate proxy reflectivity data
CN102800130B (en) * 2012-07-04 2014-08-20 哈尔滨工程大学 Water level-close aircraft maneuvering flight visual scene simulation method
US9672747B2 (en) 2015-06-15 2017-06-06 WxOps, Inc. Common operating environment for aircraft operations
US10486061B2 (en) 2016-03-25 2019-11-26 Zero Latency Pty Ltd. Interference damping for continuous game play
US10717001B2 (en) 2016-03-25 2020-07-21 Zero Latency PTY LTD System and method for saving tracked data in the game server for replay, review and training
US9916496B2 (en) 2016-03-25 2018-03-13 Zero Latency PTY LTD Systems and methods for operating a virtual reality environment using colored marker lights attached to game objects
US10071306B2 (en) 2016-03-25 2018-09-11 Zero Latency PTY LTD System and method for determining orientation using tracking cameras and inertial measurements
US10421012B2 (en) 2016-03-25 2019-09-24 Zero Latency PTY LTD System and method for tracking using multiple slave servers and a master server
US10751609B2 (en) 2016-08-12 2020-08-25 Zero Latency PTY LTD Mapping arena movements into a 3-D virtual world

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4752836A (en) * 1984-09-07 1988-06-21 Ivex Corporation Method and apparatus for reproducing video images to simulate movement within a multi-dimensional space
WO1988002156A2 (en) * 1986-09-11 1988-03-24 Hughes Aircraft Company Digital simulation system for generating realistic scenes
FR2613509B1 (en) * 1987-04-03 1989-06-09 Thomson Cgr METHOD FOR CALCULATING AND REPRESENTING IMAGES OF VIEWS OF AN OBJECT
US5432895A (en) * 1992-10-01 1995-07-11 University Corporation For Atmospheric Research Virtual reality imaging system
US5396583A (en) * 1992-10-13 1995-03-07 Apple Computer, Inc. Cylindrical to planar image mapping using scanline coherence

Also Published As

Publication number Publication date
DE69520504T2 (en) 2001-08-09
AU3583995A (en) 1996-03-27
AU691976B2 (en) 1998-05-28
DE69520504D1 (en) 2001-05-03
US5490239A (en) 1996-02-06
EP0780009B1 (en) 2001-03-28
EP0780009A1 (en) 1997-06-25
WO1996007988A1 (en) 1996-03-14
ATE200157T1 (en) 2001-04-15

Similar Documents

Publication Publication Date Title
CA2199619A1 (en) Virtual reality imaging system
EP0663091B1 (en) Virtual reality imaging system and method
US5751289A (en) Virtual reality imaging system with image replay
US5845874A (en) System and method for creating visual images of aircraft wake vortices
US5583972A (en) 3-D weather display and weathercast system
EP1476720B1 (en) Apparatus for the display of weather and terrain information on a single display
US5566073A (en) Pilot aid using a synthetic environment
EP0547202B1 (en) Improved low-level windshear alert system
CN104457735A (en) 4D trajectory displaying method based on World Wind
Moller et al. Synthetic vision for enhancing poor visibility flight operations
Zhang et al. A 3d visualization system for hurricane storm-surge flooding
WO1998026306A1 (en) 3-d weather display and weathercast system
Below et al. 4D flight guidance displays: an approach to flight safety enhancement
Schafhitzel et al. Increasing situational awareness in DVE with advanced synthetic vision
Von Viebahn The 4D-display
Etherington et al. Synthetic vision information system
May et al. Controlled digital elevation data decimation for flight applications
Vonder Haar et al. Four-dimensional imaging for meteorological applications
CN111091617A (en) Aircraft accident prediction and three-dimensional visualization system
Hembree et al. Incorporation of a cloud simulation into a flight mission rehearsal system: Prototype demonstration
CN114125705A (en) ADS-B base station monitoring range estimation method based on mathematical morphology
Brown Displays for air traffic control: 2D, 3D and VR - a preliminary investigation
Montag Visual weather simulation using meteorological databases
Kraut et al. StormGen: A Proposed Solution to Weather Simulation in NextGen Research
Haar et al. GeoRGE-3D: A Minicomputer-Based System For Interactive 3-D Rendering Of Digital Environmental Data Sets

Legal Events

Date Code Title Description
FZDE Discontinued

Effective date: 20020909