WO2002018874A1 - Real or near real time earth imaging system - Google Patents

Real or near real time earth imaging system

Info

Publication number
WO2002018874A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
image
earth
imaging
user
Prior art date
Application number
PCT/AU2001/001073
Other languages
French (fr)
Inventor
Alan Robert Burns
Original Assignee
Marine Research Wa Pty Ltd
Priority date
Filing date
Publication date
Application filed by Marine Research Wa Pty Ltd filed Critical Marine Research Wa Pty Ltd
Priority to AU2001281605A priority Critical patent/AU2001281605A1/en
Publication of WO2002018874A1 publication Critical patent/WO2002018874A1/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/025 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures by scanning the object

Definitions

  • the present invention relates to earth imaging systems and more particularly to earth satellite based imaging systems for producing images of the earth in real or near real-time.
  • Imaging equipment located on board satellites has been used for producing images of, and data related to, the earth's surface.
  • these imaging systems are constrained by the stationary orbit, or the orbit path, of the satellites on which the imaging and data collection equipment is located. This means that only an image of, or data relating to, the earth's surface falling under the satellite's orbit can be obtained.
  • the areas of the earth surface falling under its orbit can only be imaged at certain time periods corresponding to when the satellite is aligned with these areas.
  • the USA in particular has been sensitive to allowing private access to imaging satellites that can achieve images of the earth's surface having resolutions of less than 10 metres.
  • relevant regulatory authorities in the USA have relaxed these requirements and have permitted image resolutions to be provided to the public down to 1 metre.
  • a further aspect of imaging satellites in current use is that the images and data are collected by expensive purpose-engineered sensors. These sensors generally do not use all of the available natural radiation from the earth to generate an image.
  • the earth's sun is believed to emit most of its energy in the so-called "optical" frequencies of the electromagnetic spectrum.
  • the earth, due to its lower, non-incandescent temperature, is believed to emit most of its energy at frequencies that are non-optical and of weaker intensity.
  • an imaging system for providing image information of a desired portion of the earth's surface to a user of the system in real or near real time comprising:-
  • each platform having one or more imaging sensors
  • any portion within the footprint coverage being capable of being imaged any time with a spatial resolution of approximately 1 metre or less and a temporal resolution of approximately 25 images per second or more;
  • communication means for conveying information between a user requesting an image of the desired portion of the earth covered by the footprints and the network of platforms providing same, including:
  • request delivery means to deliver a processed request for an image from the user to the platform(s) required to generate the image requested
  • image delivery means to deliver the image generated by the platform(s) to the user.
  • the platforms form a constellation of satellites.
  • the invention may use aircraft such as high altitude balloons or unmanned remote controlled aircraft to locate said sensors for collecting said image data.
  • the aircraft are preferably arranged so as to form a constellation.
  • the imaging system includes an image request processing station including:
  • (b) data processing means to generate processed image request data including:
  • location processing means for determining the location and size of the portion to be viewed and the time duration of the imaging
  • resource allocation means to determine requisite resources such as platforms and sensors to provide the image
  • the imaging system includes image data acquisition and processing means disposed on each platform including:
  • control and processing means to process the stored data to generate processed image data ready for transport;
  • image data transport means to transport processed image data via said image delivery means.
  • the imaging system includes an image receipt processing station for processing received processed image data including:
  • pre-processing means to calibrate, enhance and classify the initially processed image data if necessary
  • transport processing means to process the extracted image data ready for delivery to the user
  • transmission means to transmit the processed extracted image data to the user.
  • the imaging system includes an earth imaging application means including :-
  • selection means to select a portion of the earth that is desired to be imaged by the user
  • zooming means to fine tune the selection of the portion
  • resolution means to select the desired resolution of the image
  • a user image display interface integrated with the user request interface to display the requested image in real time or near real time including:
  • image processing means to process the received extracted image data from the image processing station for display
  • the image data comprises three-dimensional data of the earth's surface whereby the combined image is a three dimensional image.
  • more than one sensor is provided on each platform to collect optical data. It is also preferable that at least one sensor on each platform is provided to detect and collect a number of frequencies beyond optical frequencies such as micro-wave, infra-red, ultra-violet or gamma radiation frequencies.
  • said platforms are adapted so that image data of at least one area of the earth's surface is collected by at least two sensors at any one point in time whereby said platforms collect said three dimensional data.
  • said three dimensional data of an area is collected by a single sensor from two different points of an orbit of a platform containing said sensor.
  • said sensor collects said data sets at a rate of 25 data sets per second or higher as the platform traverses its orbit path.
  • said platforms are further adapted to communicate with each other so that said image data of an area collected by at least two sensors is filtered by said platforms so that said combining means receives a selected set of image data of said area.
  • an imaging system comprising a large constellation of platforms collectively having a first set of passive electromagnetic radiation sensors and a second set of active electromagnetic radiation sensors, each sensor being adapted to collect data from the earth's surface; identification means for identifying an area on the earth's surface from where data has been collected by said passive sensor; comparison means for comparing said identified area with a target area; whereby in response to a positive result of said comparison, said active sensor is adapted to collect data from said identified area.
  • said imaging system comprises a set of images of the earth's surface; said comparison means adapted to compare said data collected by said passive sensor with said set of images.
  • Preferably said second set of active sensors are wide range sensors.
  • said sets of active sensors and said passive sensors are adapted to receive electromagnetic radiation from a spread of frequencies, such as optical, infra-red, ultra-violet and microwave frequencies; said set of image data derived from said spread of frequencies.
  • a system for collecting data related to the earth comprising a constellation of platforms some or all of which contain a set of sensors for detecting electromagnetic radiation emitted from the earth's surface; processing means for receiving at least two sets of data; each said set of data being received from a separate one of said platforms; and said processing means combining said at least two sets of data to generate a representation of the earth.
  • said sensors operate to detect a pre-determined set of electromagnetic radiation frequencies including optical, infra-red, ultra-violet and microwave frequencies whereby said representation is generated, at least in part, from data collected from electromagnetic frequencies outside the optical range of frequencies.
  • said unified representation comprises a graphical or visual representation of said data.
  • an imaging system for generating graphical representations of the earth comprising input means for receiving input data from a large constellation of platforms orbiting the earth; processing means for processing said data to generate a set of image data of the earth; and imaging means for generating graphical representations of the earth from said set of image data.
  • the platforms are satellites.
  • said input data comprises communications signals received by a constellation of communications satellites; and said processing means is adapted to process said communications signal to identify sets of image data of the earth.
  • said constellation of communications satellites have a non-stationary orbit and collectively receive on a continuous basis communication signals from a predetermined area of the earth's surface, whereby said processing means updates said set of image data on a real or near real time basis.
  • said set of image data comprises data in three dimensions.
  • said imaging means access said data in three dimensions whereby said graphical representations are three-dimensional representations.
  • the data signals may be obtained from telecommunications receivers located in high altitude balloons and aircraft, including high altitude unmanned reconnaissance/communications aircraft.
  • a method for providing image information of a desired portion of the earth's surface to a user in substantially real time comprising:-
  • Figure 1 is a schematic representation of a constellation of satellites in accordance with the first embodiment
  • Figure 2 is a representation of a typical set of footprints produced by a constellation of satellites such as that of Figure 1;
  • Figure 3 is a block diagram of an imaging system using the constellation of satellites of Figure 1 ;
  • Figure 4 is a flow chart of operation of the system in Figure 3;
  • Figures 5A and 5B are block diagrams of a compression system located on the satellites of Figure 1;
  • Figures 6A and 6B are block diagrams of two different configurations of a ground station and imaging station in Figure 3;
  • Figure 7 is a schematic representation of a combined optical and microwave sensor used in the satellites of Figure 1.
  • Figure 8a is a representation of bands of frequencies in the radio spectrum
  • Figure 8b is a representation of bands of frequencies in the micro-wave spectrum
  • Figure 9 is a transmission diagram of the earth's atmosphere
  • Figure 10 is a table of frequencies allocated to passive sensor applications
  • Figure 11 is a table of frequencies allocated to remote sensing applications
  • Figure 12 is a further representation of the electromagnetic spectrum showing bands of natural phenomenon
  • Figure 13 is a top level flow chart of the imaging system in accordance with the second embodiment
  • Figure 14 is a top level schematic for the imaging system of the second embodiment
  • Figure 15 is a flow chart showing the procedure for a user making an imaging request in accordance with the second embodiment
  • Figure 16 is a schematic showing the ground segment of the second embodiment
  • Figure 17 is a flow chart showing the procedure for processing a user request in accordance with the second embodiment
  • Figure 18 is a laser communications system developed by Matra Marconi Space
  • Figure 19 is a schematic drawing of intra- (vertical) and inter-plane (horizontal) link arrangements in a satellite constellation
  • Figure 20 is a schematic of network links in the Teledesic constellation
  • Figure 21 is an example of a networking configuration for the imaging system of the second embodiment
  • Figure 22 is a schematic of the use of a satellite in a different orbit to relay image data from a LEO imaging satellite to earth;
  • Figure 23 is a flow chart showing the procedure for the real time uplink of data in accordance with the second embodiment
  • Figure 24 is a flow chart showing the procedure for the real time downlink of data in accordance with the second embodiment
  • Figure 25 is a flow chart showing the procedure for image acquisition in accordance with the second embodiment
  • Figure 26 is a flow chart showing the image processing procedure in accordance with the second embodiment
  • Figure 27 is a flow chart showing the procedure for image display in accordance with the second embodiment
  • Figure 28 is a diagram showing how a single polar orbiting satellite images the earth's surface
  • Figure 29 is a footprint on the earth's surface, projected from an imaging satellite, whereby the satellite has a single moveable sensor and therefore can image any area within the circular region;
  • Figure 30 is a geometry for calculating the footprint area
  • Figure 31 shows schematic projections of imaging satellite footprints on the earth's surface in which the white circles show the actual sensor beams and the shaded circles show the area able to be imaged by the sensor; and wherein: Figure 31a shows the situation where the individual satellite footprints do not overlap; whilst Figure 31b shows the case of minimum overlap for total coverage where the grey sensor beam is an example of a region being imaged by two satellites simultaneously, allowing for instantaneous stereo imaging.
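  • The footprint geometry of Figure 30 is not reproduced here, but the idea can be illustrated with a minimal flat-earth sketch, assuming a nadir-pointing satellite at a given altitude with a sensor steerable over a given half-cone angle; the altitude and angle used below are illustrative values, not taken from the specification.

```python
import math

def footprint_radius_km(altitude_km: float, half_angle_deg: float) -> float:
    """Radius of the circular region a nadir-pointing, steerable sensor can
    reach, on a locally flat earth, for the given half-cone angle."""
    return altitude_km * math.tan(math.radians(half_angle_deg))

def footprint_area_km2(altitude_km: float, half_angle_deg: float) -> float:
    """Area of that circular footprint (flat-earth approximation)."""
    radius = footprint_radius_km(altitude_km, half_angle_deg)
    return math.pi * radius * radius

# Illustrative values: a LEO satellite at 780 km with a +/-60 degree steerable sensor.
print(round(footprint_radius_km(780, 60)))  # ~1351 km
print(round(footprint_area_km2(780, 60)))   # ~5.7 million km^2
```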
  • the first embodiment is directed towards an imaging system adapted to produce images of the earth, in near real-time.
  • the imaging system operates through use of a computing device connected to a network, such as the Internet or a direct subscriber network such as a news organisation or government entity.
  • the user activates an earth imaging application which displays a map or image of the earth.
  • the user selects a region of the earth that they are interested in.
  • the user zooms into this region so as to make a further selection in respect of an area of specific interest.
  • the user is presented with several options such as whether they wish to view a two dimensional image of the selected area, a 3 Dimensional (3D) image of the selected area, a video image in real or near real-time.
  • a graphical representation of the earth can alternatively be presented that is generated from information obtained from electromagnetic frequencies other than optical frequencies. For example, information from frequencies in the ultra-violet, infra- red or micro-wave ranges may be used in generating a graphical representation of the earth.
  • a three dimensional representation may be a data set having spatial information on the selected region. This data set of spatial information enables a view of the region to be represented on a two dimensional screen. The spatial information enables the region to be viewed from a number of different angles. It also enables a user to navigate the region as the spatial information enables a continuous set of such two dimensional images to be generated.
  • Navigation of a region is seen as the display of a series of these two dimensional images. Each new image is from a new perspective/angle of the region that is slightly different to the previous angle/perspective. The change in perspective accords with the direction of navigation through the region and the speed with which the region is navigated. This provides the perception that the user is moving through a three dimensional space.
  • Images displayed to the user of a selected area are in near real-time.
  • Near real time means that the image viewed by the user may be displayed with a delay of from 10 to 15 seconds after the raw data has been collected and processed by an imaging system.
  • the data is updated at a rate of approximately, or more than, 25 images per second as the satellite traverses its orbit path.
  • Referring to FIG. 1, a schematic representation of a constellation of satellites 22 used by the first embodiment is shown, providing image data from an imaging platform 15 thereon, in real-time or near real-time, to a user 14 of an imaging system 11.
  • a constellation of satellites is a group of satellites orbiting the earth that are networked to each other in a constellar network 21 so that the imaging platforms 15 thereon provide coverage of the earth's surface 13 that is greater than can be provided by the imaging platform 15 of a single satellite 22.
  • Such constellations typically comprise Low Earth Orbiting (LEO) satellites, though they may comprise Medium Earth Orbit (MEO) satellites and GEO-stationary satellites.
  • a constellation may comprise a series of satellites from some or all of the various orbit categories.
  • the orbit paths 17 of satellites 22 in a constellation of satellites having a non-stationary orbit are typically arranged so that a pre-determined area on the earth's surface is continuously in sight of the imaging platform 15 of at least one satellite of the constellation at any one time.
  • a constellation of GEO-stationary satellites is organised so that a pre-determined area of the earth is continuously in sight of the imaging platforms of the satellites in the constellation.
  • global hand held mobile communications networks such as the Iridium network http://www.iridium.com, use constellations of LEO satellites without imaging platforms to provide continuous telecommunications service to hand held communications terminals across a large proportion of the earth's surface.
  • the use of a constellation of LEO satellites is particularly advantageous in an imaging system as it allows high-resolution images to be obtained compared with satellites in Geo-Stationary orbit (GEO satellites).
  • An alternate advantage is that lower cost sensing systems for imaging platforms, such as Charged Coupled Devices (CCDs) may be used in place of the high cost systems for imaging platforms used in GEO-stationary satellites.
  • the use of LEO satellites to form an earth imaging system will allow the earth to be imaged through passive sensing of electromagnetic radiation emitted from the earth's surface.
  • the radiation may be either naturally occurring radiation or man made radiation.
  • the lower attenuation of passively sensed signals by LEO satellites due to the shorter transmission path of these signals to the LEO satellites, compared with their transmission path to GEO satellites, will enable the signals to be detected and an image or representation of the earth, or a selected region of the earth to be imaged.
  • passive sensing from LEO satellites enables passive detection of man made electromagnetic signals such as telecommunication signals, radio signals, television signals and radar signals. These signals are emitted from the earth's surface or are reflected, scattered or faded from the earth's surface by buildings, cars, foliage, bodies of water etc. These reflected, scattered and faded signals represent noise in a received/detected telecommunications, radio, television, or radar signal. However, the reflected, scattered and faded signals contain information as to the object that they have been reflected off or scattered by or faded by. Processing of these reflected, scattered or faded signals produces information on these objects that enables graphical representations of the earth's surface to be compiled.
  • noise in man made signals may also contain information on the earth as will naturally occurring radiation. All of these types of signal sources, including naturally occurring radiation, may be used to form an image or representation of the earth.
  • the signals detected may be signals from cellular communications terminals. Signals from base stations of cellular telecommunications networks will be detected and processed as these signals originate from a source with a fixed location on the earth's surface.
  • Such reflected signals can be sensed from two or more points of the satellite's orbit path. This enables two or more sets of data on each point in the satellite's footprint to be collected. This data may be used to generate a stereoscopic image, or three-dimensional data to generate a three dimensional image in respect of the areas covered by the footprint.
  • the data sets will be off-set in time. This off-set will correspond with the time it takes the satellite to move from a first position where the first data set is collected to a second position in its orbit where a second data set is collected. Collecting data sets and processing these data sets at a rate of 25 data sets or images per second, or greater, should enable near real time images/representation at video frame rates to be generated and displayed to users.
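  • As a rough illustration of the two-position collection described above, the sketch below estimates the stereo baseline available between two acquisitions of the same area taken from different points of a circular LEO orbit at a 25 frames-per-second collection rate; the altitude and frame separation are illustrative assumptions, not values from the specification.

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, earth's gravitational parameter
R_EARTH = 6371.0        # km, mean earth radius

def orbital_speed_km_s(altitude_km: float) -> float:
    """Speed of a satellite in a circular orbit at the given altitude."""
    return math.sqrt(MU_EARTH / (R_EARTH + altitude_km))

def stereo_baseline_km(altitude_km: float, frame_rate_hz: float, frames_apart: int) -> float:
    """Distance the satellite travels between the two acquisitions forming a
    stereo pair, i.e. the baseline available for three-dimensional data."""
    return orbital_speed_km_s(altitude_km) * frames_apart / frame_rate_hz

# Illustrative: 780 km altitude, 25 frames per second, acquisitions 250 frames (10 s) apart.
print(round(orbital_speed_km_s(780), 2))           # ~7.47 km/s
print(round(stereo_baseline_km(780, 25, 250), 1))  # ~74.7 km baseline
```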
  • the invention may process reflection, scattering and fading noise in telecommunications signals received by telecommunications satellites in order to produce such an image.
  • information received by base transceiver stations in cellular communications networks may also be used.
  • sensors are located in satellites of a constellation and are used for collecting image data relating to the earth's surface, the earth's crust and atmosphere.
  • the sensors operate over a discrete set of electromagnetic frequencies, such as micro-wave frequencies, infra-red frequencies, optical frequencies, ultra-violet frequencies and gamma radiation frequencies.
  • This data is preferably relayed to one or more ground stations which process the data before forwarding selected sets of this data in accordance with user requests.
  • the satellites are adapted to communicate with each other by means analogous to a local area network (LAN) or similar network configuration. This communication is also used to notify each satellite of selected data that they are required to relay to a ground station.
  • the vast majority of the satellites are low earth orbiting satellites, such as those used for hand held satellite phones. These satellites have a low orbit and so are replaced every few years, allowing updating of imaging apparatus.
  • the low altitude orbit of these satellites allows conventional imaging technology such as charged coupled devices (CCDs) to be used. Alternatively, higher resolution images can be achieved through use of standard equipment.
  • Referring to FIG. 2, a schematic representation of sensor footprints 23 of image sensors 19 located in a constellation of satellites, such as the constellation of satellites in Figure 1, is shown.
  • the footprints 23 indicate that most areas of land mass on the earth's surface do or will fall within the footprint of two or more satellites at any one point in time. This enables the system to collect two or more sources of data relating to particular areas on the earth's surface at the same time. Two or more sources of data enable data sets of three-dimensional data relating to these areas to be collected, or stereoscoping of the data.
  • Two or more data streams also enable the imaging system to select between different streams of image data for an area selected by an operator. This enables the system to selectively control the amount of data transmitted by any one satellite in the constellation. Where the satellites in the constellation have non-stationary orbits, overlapping sensor footprints do or will enable image data collection of selected areas to be handed off from one satellite to another.
  • aircraft instead of satellites.
  • Such aircraft may include high altitude balloons, and unmanned remotely controlled aircraft.
  • Such systems preferably utilise these aircraft in a similar manner to a constellation of satellites.
  • Referring to FIG. 3, a block diagram of an imaging system 11 according to the present embodiment is shown.
  • the system 11 comprises a ground station 300 that receives image data from the imaging platform 15 via a satellite receiver 320.
  • the satellite receiver is in communication with a constellation of satellites 22 such as the constellation detailed in Figure 1.
  • the constellation of satellites is represented by a single satellite 325.
  • the ground station 300 may be in communication with one or more other ground stations that are in turn in communication with different satellites 22 in the constellation 325.
  • Each ground station 300 may in turn be in communication with one or more imaging stations 330.
  • a user 14 of the system logs into the imaging station by means of a communications network, such as a public switched telephone network (PSTN) or the Internet 305, using a personal computer (PC) 310.
  • Each ground station 300 receives data from the satellites that it is in communication with. This data is processed by the ground station before it is forwarded to the imaging station 330.
  • the imaging station 330 operates to combine the streams of data from the various ground stations 300 so that an image of various points on the earth's surface 13, preferably any point, can be viewed by a user 14 on their PC 310 in real-time or near real time.
  • the system has two levels of encryption for the image data.
  • the first level of encryption applies to information transmitted between satellites 22 of the constellation 325 and a ground station 300, and a ground station 300 and an imaging station 330.
  • the second level of encryption is applied to image data transmitted between an imaging station 330 and a user 14.
  • the use of encryption enables the image data to be transmitted over a public network such as the Internet 305 without the data being interpreted by un-authorised users.
  • the two levels of encryption isolate end users 14 from data transmitted by a ground station 300 to an imaging station 330.
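  • The specification does not name a particular cipher for the two levels of encryption, so the sketch below illustrates the arrangement with symmetric keys from the Python cryptography package: one key for the satellite/ground-station/imaging-station hops (level 1) and a separate key for the imaging-station-to-user link (level 2), which keeps end users isolated from the internal traffic. This is an assumed implementation, not the method of the specification.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Level 1: shared between satellites, ground stations and imaging stations.
level1_key = Fernet.generate_key()
# Level 2: shared only between an imaging station and an end user.
level2_key = Fernet.generate_key()

def ground_to_imaging(image_bytes: bytes) -> bytes:
    """Level-1 encryption for the satellite / ground-station / imaging-station hops."""
    return Fernet(level1_key).encrypt(image_bytes)

def imaging_to_user(level1_ciphertext: bytes) -> bytes:
    """The imaging station removes level-1 encryption, then re-encrypts with the
    user-facing level-2 key, isolating end users from the internal traffic."""
    image_bytes = Fernet(level1_key).decrypt(level1_ciphertext)
    return Fernet(level2_key).encrypt(image_bytes)

def user_receive(level2_ciphertext: bytes) -> bytes:
    return Fernet(level2_key).decrypt(level2_ciphertext)

assert user_receive(imaging_to_user(ground_to_imaging(b"image tile"))) == b"image tile"
```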
  • Referring to FIG. 4, a flow chart detailing a method by which the embodiment of the imaging system operates is shown.
  • the process of obtaining a real-time or near real-time image of a point on the earth's surface commences at step 400 where a user 14 selects by means of an application running on a networked computing device 310, an area of the earth's surface 13 that the user wishes to view.
  • the user's terminal transmits to the imaging station 330 request data that identifies the area that the user wishes to view, the resolution to which the area is to be viewed, and other information such as whether the image is to be a two dimensional image, a 3D image, a still image, or a video image.
  • the image may be derived from data sets of three-dimensional information that is obtained from electromagnetic frequencies outside of the visible spectrum.
  • the image station 330 identifies, from a database of reference images, reference image data that corresponds with the area identified by the user's terminal.
  • this reference image data is compressed and then encrypted and at step 420 the encrypted reference data is transmitted to the satellites whose footprints traverse the area selected.
  • the selected satellites receive the data and store it in a module referred to as a compression/comparison module.
  • the data stored in the compression/comparison module has been decrypted and uncompressed.
  • image data received by the satellite's sensors is compared with the reference data stored in the compression/comparison module. This comparison enables data relating to an area selected by a user to be extracted from the received data.
  • the satellite proceeds, at step 435, to compress the extracted data. This may be achieved by subtracting the reference data from the received data so that only the difference between the two signals is transmitted. Other means of data compression may be used, however it is preferable that the compression technique be an inter-frame compression technique, such as a technique that transmits the difference between successive image frames.
  • at step 445 the compressed data received by the ground station is decrypted and expanded before it is transmitted to the imaging station for combining with other received data, after which it is transmitted to the requesting user.
  • the process then terminates at step 450.
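  • A minimal sketch of the reference-subtraction step described at step 435 is given below, assuming 8-bit image frames held as NumPy arrays; a real onboard implementation would follow the subtraction with the encoding and encryption stages described above.

```python
import numpy as np

def compress_frame(sensed: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Onboard step: keep only the difference from the stored reference image."""
    return sensed.astype(np.int16) - reference.astype(np.int16)

def reconstruct_frame(difference: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Ground-station step: add the difference back onto the same reference."""
    return (reference.astype(np.int16) + difference).astype(np.uint8)

reference = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
sensed = reference.copy()
sensed[100:110, 200:220] = 255                   # a small change in an otherwise static scene
difference = compress_frame(sensed, reference)   # mostly zeros, highly compressible
restored = reconstruct_frame(difference, reference)
assert np.array_equal(restored, sensed)
```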
  • Referring to FIGS. 5A and 5B, block diagrams of an embodiment that utilises a sensor system installed on the imaging platform of some or all of the satellites 22 in a constellation of satellites, such as the constellation of satellites in Figure 1, are shown.
  • Figure 5A provides a general overview of the configuration of the sensing system whilst Figure 5B provides a more detailed view of the configuration.
  • the sensor system comprises a sensor 500 that detects electromagnetic radiation emitted from the earth's surface in a predetermined range of frequencies and polarisations such as micro-wave and infra-red radiation, visible radiation and ultra-violet and gamma radiation.
  • the sensor 500 outputs its received data to a compression module 505 forming part of a compression/encryption stage 507.
  • This data is output on a frame-by-frame basis at a rate of 25 frames per second.
  • the compression module 505 also receives reference data from a data store 510 after being treated by a control/processing stage 512.
  • the data store 510 also receives data from the sensor for use in compression of the data by a technique such as an intra-frame compression technique.
  • the compression module 505 operates to compress each frame of data received from the sensor in accordance with a compression algorithm. Once a frame of data is compressed, the compression module 505 forwards the frame to an encoder 515, which operates to encode each frame, the encoder also forming part of the compression/encryption stage 507. Once the frame has been encoded it is transmitted by a transmitter 520 to another satellite 22 from where it is transmitted to a ground station 300. Alternatively the satellite's orbit may enable it to transmit encoded frames directly to a ground station.
  • the onboard satellite sensor system also comprises a receiver 525 that receives data, such as reference data either directly from a ground station 300 or from another satellite 22. This data is forwarded to a decoder and expander module 530 that forms part of a decompression/decryption stage 532.
  • the decoder and expander module 530 operates to decode the received data and to expand the received data for use by a control module 535 and for storage in the data store 510.
  • the control module 535 receives expanded data from the decoder and expander module.
  • the control module 535 operates to compare data received from the sensor 500 with reference data received from the decoder and expander module 530. This comparison enables the control module 535 to determine which portion of the data fed to the data store 510 from the sensor 500 corresponds with data received from the decoder and expander module 530.
  • control module 535 controls the extraction and compression processes performed by the compression module 505 on the data received by the sensor 500.
  • Referring to FIGS. 6A and 6B, block diagrams of a ground station 300 and an imaging station 330 according to the present embodiment are detailed. These figures show two slightly different configurations that may be adopted for the ground station and imaging station arrangements.
  • compressed data transmitted by a satellite is received by receiver 605, which passes the received data onto a decoder 610.
  • the decoder 610 under the operation of control module 615 decodes the received data before passing it onto a decompression module 620 and control module 615.
  • Control module 615 processes the data to detect the area on the earth's surface that has been transmitted by the satellite. Having determined this area the control module instructs the decompression module 620 to access reference data from the reference data store 625 that corresponds with the area imaged by the satellite. This reference data enables the decompression module 620 to perform the reverse process of the compression process performed onboard the satellite.
  • the expanded data is then passed to an image combining module 630 that operates to combine the decompressed image with other image data, possibly collected by other satellites, so as to produce a real-time or near real-time image of the selected area.
  • the image is compressed by means of an image compression technique, such as an enhanced compressed wavelet technique, and is then transferred via a PSTN or Internet network using image transferring software (such as Image Web Server available from Earth Resource Mapping (ERM), www.ermapper.com) to an updating module 640 that is housed in an imaging station 330.
  • the updating module 640 transfers the updated image to an image data store 645.
  • the image data store is accessible by users of the system through a server 650 using image transferring software, such as IWS above.
  • the receiver 605 of the ground station 300 receives compressed data from a satellite and passes it through a decompression/decryption stage 655, whereby the expanded data is stored in a data storage area 660.
  • This expanded data is then processed by an image formation module 665 to form an initial image, which is then compressed and encrypted via a compression/encryption stage 670 for transmitting via a PSTN or the Internet to the imaging station 330 where it may be further processed for the user requesting the same.
  • the imaging station 330 comprises a decompression/decryption stage 675 for expanding the compressed and encrypted image received from the ground station 300 and a data storage area 680 for storing the expanded image data.
  • the expanded image data is then processed further by an image processing module 685 to improve the image before being compressed and encrypted via a compression/encryption stage 690 and delivered to the user via the server 650 and the Internet 635.
  • An alternate embodiment may operate to transfer decoded data directly from the decoder 610 to the updating module 640.
  • the updating module may then perform the decompression and imaging combining function of the decompression module 620 and the image-combining module 630 respectively.
  • the processing of the data may be performed by high powered personal computers (PCs) and workstations, such as those using Microsoft's™ Windows™ operating system or the UNIX™ and LINUX™ operating systems, which also run on Sun Microsystems™ workstations.
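  • The image-combining module 630 is described only functionally; the sketch below assumes each satellite delivers a decompressed tile together with its offset in a common ground grid, and simply overwrites overlapping pixels, whereas a real combiner would blend or select between streams as described above.

```python
import numpy as np

def combine_tiles(tiles, mosaic_shape):
    """Place (row_offset, col_offset, pixels) tiles from different satellites
    into one output raster.  Later tiles overwrite earlier ones where the
    footprints overlap; a real combiner would blend or select by quality."""
    mosaic = np.zeros(mosaic_shape, dtype=np.uint8)
    for row, col, pixels in tiles:
        height, width = pixels.shape
        mosaic[row:row + height, col:col + width] = pixels
    return mosaic

tile_a = (0, 0, np.full((100, 100), 80, dtype=np.uint8))     # tile from satellite A
tile_b = (60, 60, np.full((100, 100), 160, dtype=np.uint8))  # overlapping tile from satellite B
mosaic = combine_tiles([tile_a, tile_b], (200, 200))
print(mosaic[10, 10], mosaic[70, 70])  # 80 160
```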
  • Referring to FIG. 7, an imaging platform 15 is shown comprising a combined passive and active sensor arrangement that combines to form an imaging sensor module 19.
  • a passive sensor is used for passive remote sensing which is a process that records terrestrial and reflected extraterrestrial electromagnetic radiation.
  • Active sensors are used for active remote sensing which is a process that generates electromagnetic radiation to illuminate the surface of the planet and to record a reflected portion of the transmitted signal.
  • the imaging platform 15 comprises three passive sensors, namely a lagging sensor 705, a central sensor 710 and a leading sensor 715. Adjacent each passive sensor is located an active sensor. These active sensors comprise a lagging sensor 720, a central sensor 735 and a leading sensor 740. Each active sensor comprises a transmitter 725 and a receiver 730. Multiple passive/active sensor arrangements may be required for sensing different frequencies of the electromagnetic radiation.
  • the transmitter 725 illuminates a preselected area of the earth's surface with electromagnetic radiation.
  • the receiver 730 detects the radiation reflected by the earth's surface in the band of frequencies transmitted by the transmitter 725.
  • Each passive sensor comprises a bundle of wave-guides, such as optic fibres, which interface to a fixed focal length lens. This combination of lens and optic fibres forms a scanning head that scans areas of the earth's surface within the sensor's footprint.
  • Each fibre of the optic fibres is linked to a sensor such as a charged coupled device (CCD).
  • the CCD sensor may be located at a convenient point on the spacecraft.
  • CCD sensors are lightweight and robust allowing them to be located on a satellite with relative ease. They are also relatively inexpensive.
  • each CCD has its own controller that enables it to compare image data collected by its optic fibres with the reference data received from the ground station.
  • the three passive sensors operate to provide stereo imaging of the earth's surface. Provision is made for selecting between the three sets of image data collected by the sensor 700 in order to produce a three-dimensional image of a selected area on the earth's surface.
  • the three active sensors 720, 735 and 740 respectively, also have their image data combined to form a three-dimensional image of the electromagnetic data received from the earth's surface.
  • In operating the active sensors, it is preferable that the passive sensors first identify the area of the earth's surface that is to be imaged by the active sensors. Once the passive sensors have identified the requested area, the active sensors are then directed towards the requested area, whereupon the transmitters 725 are operated to illuminate the selected area with electromagnetic radiation.
  • an adaptive process is used to perform comparisons of image data collected by the sensors with reference data received from a ground station.
  • An example may be a system where an initial comparison is performed, based on wavelet parameters. Should the correlation of the wavelet parameters fall within predefined values then a more detailed comparison may be performed.
  • an alternate system collects two or more sets of image data from one or more satellites. These separate sets of image data are combined by the ground station to generate a three-dimensional image of a selected area.
  • the compression scheme used is a high-level compression scheme that is adapted to transmit an update of the reference image from the satellite to the ground station, rather than transmitting all of the data collected by the sensors.
  • interfacing between these two satellites is provided in order to ensure that the combined images are compatible to produce a three-dimensional image.
  • interfacing between the two satellites is also required to ensure that the data can be combined efficiently.
  • the sensors on the satellite continually supply image data to a ground station or the imaging station at a low data rate.
  • the ground station or imaging station operate to geometrically and radiometrically correct the received data to remove cloud and other influences whereby an accurate image of the planet is maintained by an imaging station in near real time.
  • the compressed data transmitted by the satellites represents an update of the current vegetative and human activity on the earth's surface.
  • the comparison process between the received data and reference data is based on several factors such as: histograms in each received band; edge structures; band ratios; textures; time; polarisations; power attenuation (active); and known cavities.
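  • The comparison factors listed above are not specified in detail; the sketch below illustrates two of them, per-band histograms and a crude edge-structure measure, compared between received and reference data. The thresholds are purely illustrative assumptions.

```python
import numpy as np

def histogram_overlap(received: np.ndarray, reference: np.ndarray, bins: int = 64) -> float:
    """Overlap (0..1) between the intensity histograms of the two images."""
    h1, _ = np.histogram(received, bins=bins, range=(0, 256))
    h2, _ = np.histogram(reference, bins=bins, range=(0, 256))
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return float(np.minimum(h1, h2).sum())

def edge_density(image: np.ndarray) -> float:
    """Crude edge-structure measure: mean gradient magnitude."""
    grad_y, grad_x = np.gradient(image.astype(float))
    return float(np.hypot(grad_x, grad_y).mean())

def frames_match(received: np.ndarray, reference: np.ndarray) -> bool:
    """Combine two of the listed factors; the thresholds are illustrative only."""
    histogram_ok = histogram_overlap(received, reference) > 0.8
    edge_ok = abs(edge_density(received) - edge_density(reference)) < 5.0
    return histogram_ok and edge_ok
```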
  • telecommunications signals received by telecommunications satellites are also processed.
  • the signals are processed to identify noise produced by objects that reflect transmitted signals. Processing this noise enables information relating to the reflecting objects to be obtained. Similar processing may be performed on signals received by base transceiver stations of cellular telecommunications systems.
  • Referring to FIGS. 8a and 8b, representations of the radio and microwave spectra respectively, and the frequency bands over which various types of navigation, communication and radar systems operate, are shown. These are examples of man made radiation that can be detected and be used for generating a representation of the earth.
  • Figure 8a identifies the well known band designations of the radio spectrum as ULF, VLF, LF, MF, HF, VHF, UHF, SHF and EHF, and the different types of communication media that utilise these bands.
  • Figure 8b is a graphical representation of the microwave spectrum that extends between approximately 0.3 GHz and 100.0 GHz.
  • the microwave spectrum is divided into a number of bands of frequencies identified by the letters P, L, S, C, X, K, Q, V and W.
  • Various frequencies allocated to these bands are detailed in Figure 8b. Signals detected in these bands should correspond with transmitting devices of the systems detailed in Figure 8a.
  • the transmission axis represents the degree of attenuation caused by the earth's atmosphere on a one-way (ie passive) transmission of electromagnetic radiation emitted from the earth's surface and received by a satellite in space.
  • This transmission diagram is measured under clear sky conditions.
  • the transmission diagram shows that water vapour in the earth's atmosphere absorbs electromagnetic radiation at two frequencies of approximately 22 GHz, and approximately 180 GHz.
  • the diagram also represents that oxygen in the atmosphere absorbs electromagnetic radiation emitted from the earth at two frequencies of just under 60 GHz and at approximately 120 GHz.
  • the transmission diagram also demonstrates that, in general terms, higher frequency electromagnetic radiation is attenuated by the atmosphere to a greater degree than lower frequency electromagnetic radiation. It should be noted that the 100% transmission as defined on the vertical axis is the transmission of 1 GHz radiation through the earth's atmosphere.
  • the transmission diagram demonstrates that at certain frequencies the earth's atmosphere operates as a band stop filter. Frequencies outside of these bands may be detected by use of sensors on board satellites. By detecting frequencies close to these bands, data relating to the present composition of the earth's atmosphere may be collected. As the composition of the earth's atmosphere changes, the transmission properties of the earth's atmosphere should change. Other windows and band stop frequencies may occur at different frequencies as the composition of the atmosphere changes.
  • the transmission diagram also demonstrates that outside of these attenuating bands, the earth's atmosphere presents a number of natural windows, such as the window centred on 35 GHz, 90 GHz and 135 GHz, which provide local maxima in the atmosphere's transmission properties.
  • through use of these band stop filter frequencies and window frequencies, in combination with a constellation of satellites, data representing a snap shot of the earth's surface and the earth's atmosphere, or at least a predetermined portion of the earth's atmosphere and the earth's surface, may be collected and transmitted to a ground station for later processing.
  • the use of a constellation of satellites also enables two-dimensional and three- dimensional data to be collected.
  • This data may also be collected in real-time or near real-time and may be presented as either two dimensional or three dimensional data at video frame rates of 25 frames per second. In this way live data relating to electromagnetic radiation emitted from the earth's surface may be detected.
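  • The absorption features and window frequencies quoted above can be encoded in a simple frequency-selection check, as sketched below; only the centre frequencies come from the transmission-diagram discussion, while the band half-widths are illustrative assumptions.

```python
# Absorption features quoted in the text (centre frequency in GHz); the
# half-widths are illustrative assumptions, not values from the specification.
ABSORPTION_BANDS_GHZ = [(22.0, 2.0), (60.0, 4.0), (120.0, 3.0), (180.0, 5.0)]
WINDOW_CENTRES_GHZ = [35.0, 90.0, 135.0]   # natural windows noted in the text

def is_outside_absorption(frequency_ghz: float) -> bool:
    """Reject frequencies falling inside a water-vapour or oxygen absorption band."""
    return all(abs(frequency_ghz - centre) > half_width
               for centre, half_width in ABSORPTION_BANDS_GHZ)

def nearest_window(frequency_ghz: float) -> float:
    """Suggest the closest natural atmospheric window."""
    return min(WINDOW_CENTRES_GHZ, key=lambda window: abs(window - frequency_ghz))

print(is_outside_absorption(60.0), nearest_window(60.0))  # False 35.0
print(is_outside_absorption(94.0), nearest_window(94.0))  # True 90.0
```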
  • a table detailing frequency allocations for passive sensors is shown.
  • the frequencies are represented in Giga Hertz and the data in the table is labelled according to the applications that are allowed to operate in the nominated frequency bands.
  • the band of frequencies between 0.404 and 0.406 GHz is labelled with an "a". This indicates that these bands of frequencies are reserved for radio astronomy and hence no transmission of electromagnetic radiation at these frequencies from the earth's surface is permitted.
  • Those frequencies labelled with a "p" operate within a band of frequencies whose primary use is for applications that transmit electromagnetic radiation.
  • Those frequencies labelled with an "s" are within a band of frequencies whose secondary use is for applications that transmit electromagnetic radiation.
  • FIG 11 details radar frequency allocations for active remote sensing applications in Giga Hertz.
  • Active remote sensing is the use of electromagnetic radiation to illuminate a selected area on the earth's surface. Electromagnetic radiation reflected from this selected area due to this illumination is detected by a satellite performing the illumination. These systems are active systems in the sense that they transmit electromagnetic radiation.
  • the imaging system of the present embodiment utilises these and other frequency allocations to collect data relating to electromagnetic radiation emitted from the earth's surface that is outside of the visible spectrum of electromagnetic radiation. It is preferable that further frequencies such as those representing infra-red radiation, ultra violet radiation and gamma radiation are detected.
  • radiation from communication systems is also used to collect data that enables a representation of the radiation emitted by communication systems on the earth, to be developed.
  • Other naturally occurring sources of electromagnetic radiation are also detected and are used for generating an image/representation of the earth.
  • Transmissions from mobile phones and other satellite phones, and the reflections of these transmissions off buildings and other objects on the earth's surface, are detected by a sensor on a satellite. Collection of this data from mobile telephones and other telecommunication systems is also used to generate a data representation, that is a graphical representation, of the earth's surface or at least part of the earth's surface.
  • the second embodiment is substantially similar to the first embodiment, being directed towards an imaging system comprising a large constellation of satellites that can image all, or some portion, of the earth's surface continuously and transfer this information to an end-user with minimal delay.
  • a particular feature of the present embodiment is the production of 'movies' of user-selected portions of the earth. This can be achieved from continuous observation, and a sufficiently fast revisit period.
  • the real-time imaging system is sub-divided into two distinct sections. Firstly, an imaging portion which relates to how the information is acquired and secondly a non-imaging portion which relates to how the information is processed and transferred to the end-user.
  • the requirements for the first section are large-area (preferably whole earth) coverage at high spatial resolution (approximately 1 m or less) combined with high temporal resolution (preferably approximately 25 'images' per second or more to allow 'movies'), and for the second section, very fast transfer and processing.
  • Existing imaging satellite systems do not meet these requirements for several reasons. This is chiefly due to the very large number of satellites needed to achieve the high space and time resolution and also because imaging satellites can, at present, only transfer data when they are within range of a ground station.
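  • A rough order-of-magnitude estimate of why a very large number of satellites is needed can be made by dividing the earth's surface area by the coverage area of one footprint and allowing some overlap; the coverage radius and overlap factor below are illustrative assumptions, not figures from the specification.

```python
import math

R_EARTH_KM = 6371.0

def satellites_needed(coverage_radius_km: float, overlap_factor: float = 1.5) -> int:
    """Very rough count: the earth's surface area divided by one footprint's area,
    inflated by an overlap factor so that coverage can be continuous and overlapping."""
    earth_area = 4 * math.pi * R_EARTH_KM ** 2
    footprint_area = math.pi * coverage_radius_km ** 2
    return math.ceil(overlap_factor * earth_area / footprint_area)

# Illustrative only: with a 1000 km coverage radius per satellite, roughly
# 244 satellites are needed for overlapping whole-earth coverage.
print(satellites_needed(1000.0))
```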
  • the embodied method for meeting the requirements for good coverage and fast transfer combines a large number of satellites utilising optical and microwave sensors with a network of inter-satellite links.
  • the first feature allows high spatial and temporal resolution under all lighting conditions while the second will allow transfer of data when a given satellite is not in range of any ground station.
  • the image processing and associated support systems will be similar to existing satellite systems.
  • a top-level schematic of the imaging process is shown in figure 13. The process commences with the user requesting an image, this request is then processed to determine how the request is to be implemented and then the information is transferred to the imaging satellite.
  • the satellite's sensors acquire the requested image, which is then transferred to ground and processed. Lastly, the image is displayed on a visual output device for the user.
  • the real-time aspect of the imaging system means that the user-request will be real-time interactive, in contrast to all other satellite imaging user-request interfaces. These interfaces have a similar 'preview' system but then deliver the requested image in non-real-time so that the user must wait until the image is received before making any further requests based on that image. Since the images are displayed with no (or minimal) delay after the request it is possible for the user to interactively make requests based on the real-time images being displayed.
  • the system operates via a computing device connected to a network, such as the Internet or a direct subscriber network such as a news organisation or government entity.
  • the connection, or communications link may include land-lines, microwave links and fibre-optics.
  • the system is secured against unauthorised usage by some suitable means such as a password requirement or the need for proprietary software.
  • the procedure for the user request is shown in figure 15. In operation, a user firstly logs in to the system by means of a communications network, such as a public switched telephone network (PSTN) or the Internet, or another method.
  • the user next activates an earth imaging application, and selects a region of the Earth in which they are interested, which may include zooming in to refine the target area.
  • Having selected the area of interest, the user is then possibly presented with several options such as the required resolution, the frequencies to be sampled, e.g. optical, infrared or microwave, and a choice of view format, e.g. two- or three-dimensional images of the selected area, or static or video images in real or near real-time.
  • the user's terminal transmits the total request information to the imaging server by means of a communications network, such as a public switched telephone network (PSTN) or the Internet or another method.
  • the system that the user logs in to has a number of functional elements. These can be separated into:
  • the ground station which handles the transmission and reception of all data, both acquired image data and system control and state information.
  • TT&C Telemetry, Tracking and Control
  • SCC Satellite Control Center
  • the Image Station which provides the image processing and associated processing for the ALTRAV system.
  • the ground stations and associated TT&C centers are all able to communicate with one another and with the image stations and the SCC.
  • It is also preferable for the image stations to be able to communicate with each other.
  • Where a user has a dedicated ground station, this could also have an associated image station or stations.
  • the flowchart for the request processing and scheduling process is shown in figure 17.
  • When the user request is received by the imaging station, it is initially processed to determine the location (i.e. latitude, longitude and area coverage) and time (i.e. start and finish times) required. This information is processed to determine whether the user has the right to access the image information requested. If the request is allowed, then the system determines what resources are required, in terms of satellites and sensors, and this information is then used for scheduling. Resource allocation is done by each imaging station, which would be in communication with other stations so as to determine the total requests at any one time. Alternatively the resource allocation could be done in some more centralised fashion. Scheduling/resource allocation would be performed using either existing scheduling software/methods or some other system.
  • the system would first obtain the position of all satellites, throughout the duration of the request, which would be calculated from a knowledge of their orbital speeds and trajectories using orbital dynamical theory. From this, a subset of those satellites which would satisfy the user- request requirements could be determined. The system would also calculate when each individual satellite should begin and end image acquisition. From these satellites and their sensor specifications the system would then determine what sensor settings would be needed, this would include the sensor pointing directions and/or sensor selection, if there is more than one sensor on the satellite as is the preferred case. The system would then schedule the requested imaging task using the existing tasks, their resource requirements and assigned priorities, if appropriate.
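  • The position-prediction and satellite-selection step can be sketched as below, assuming each satellite exposes a ground-track function (time to sub-satellite latitude/longitude, in practice produced by an orbit propagator from the satellite's orbital elements) and a fixed coverage radius; a request is then satisfiable by any satellite whose footprint covers the target during the request window.

```python
import math
from typing import Callable, List, Tuple

R_EARTH_KM = 6371.0

def great_circle_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Haversine distance between two points on the earth's surface."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi, d_lam = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(d_phi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2
    return 2 * R_EARTH_KM * math.asin(math.sqrt(a))

GroundTrack = Callable[[float], Tuple[float, float]]  # time (s) -> (lat, lon)

def select_satellites(ground_tracks: List[Tuple[str, GroundTrack]],
                      target: Tuple[float, float],
                      t_start: float, t_end: float,
                      coverage_radius_km: float,
                      step_s: float = 1.0) -> List[str]:
    """Return the satellites whose footprint covers the target at some time
    within the request window."""
    selected = []
    for name, track in ground_tracks:
        t = t_start
        while t <= t_end:
            lat, lon = track(t)
            if great_circle_km(lat, lon, target[0], target[1]) <= coverage_radius_km:
                selected.append(name)
                break
            t += step_s
    return selected
```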
  • the system would preferably determine the optimal uplink and downlink transmission paths for the request information and captured image. This is done at a TT&C center, which would also determine any other satellite housekeeping information that may need to be transferred (uplinked) to the satellite.
  • the routing would require a knowledge of the state of the network, in terms of available inter-satellite links (ISLs), existing network traffic, and the position of the satellites and ground stations for earth- satellite transfer. The availability of ISLs will depend on the physical link state and the position and orientation of the satellites for inter-orbital plane ISLs.
  • the system determines, using standard network theory, the fastest route for request and other information to reach the satellite (here termed uplink), and the fastest path for image and other information to return, via the network, to ground (here termed downlink).
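  • The "fastest route" can be found with a standard shortest-path algorithm over a graph whose nodes are satellites and ground stations and whose edge weights approximate link latency; the topology and latencies in the sketch below are illustrative only.

```python
import heapq

def fastest_route(links, source, destination):
    """Dijkstra shortest path over a dict of the form {node: {neighbour: latency_ms}}."""
    best = {source: 0.0}
    previous = {}
    queue = [(0.0, source)]
    while queue:
        cost, node = heapq.heappop(queue)
        if node == destination:
            break
        if cost > best.get(node, float("inf")):
            continue
        for neighbour, latency in links.get(node, {}).items():
            new_cost = cost + latency
            if new_cost < best.get(neighbour, float("inf")):
                best[neighbour] = new_cost
                previous[neighbour] = node
                heapq.heappush(queue, (new_cost, neighbour))
    path, node = [], destination
    while node != source:
        path.append(node)
        node = previous[node]
    return [source] + path[::-1], best[destination]

# Illustrative topology: imaging satellite -> two relay satellites -> ground station.
links = {
    "imaging_sat": {"relay_1": 8.0, "relay_2": 12.0},
    "relay_1": {"relay_2": 5.0, "ground": 20.0},
    "relay_2": {"ground": 9.0},
    "ground": {},
}
print(fastest_route(links, "imaging_sat", "ground"))  # (['imaging_sat', 'relay_2', 'ground'], 21.0)
```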
  • the satellites also have some routing capabilities so that the data routing is done 'in transit'.
  • Satellite to satellite communication is necessary to achieve a wide global coverage of the whole earth, or partial coverage, in as near to real time as possible.
  • the real-time data transfer component of the imaging satellite system is similar to the existing telecommunications network and has certain distinct elements. Firstly, there is the method for communicating between satellites, the physical layer, which is characterised by the physical transfer method used (e.g. laser or radio) and the means of implementing this. Secondly, there is the network layer, or linking scheme, used to route the data either up to the target imaging satellite (uplink) or back down from the imaging satellite to ground (downlink). In addition, there is also the protocol method used for the transmission of data across the links.
•   In the physical layer, inter-satellite links (ISLs) are provided.
•   When transferring data between two satellites there are two methods that may be used: radio links or laser links.
  • the system uses laser links as these have higher capacity, are more concentrated and require less power than radio links.
  • Proposed systems have capacities in the order of Gigabits per second. Also, due to the optical frequency of the laser, there is no possibility of interference with radio communications.
  • the ISL system is small, light and consumes relatively little power.
  • a system using laser ISLs must provide a means for producing a laser beam, a method for integrating the image data with the laser carrier and a method for directing this beam towards another satellite.
  • the system needs a method for receiving the laser signals from other satellites and relaying these to another satellite or to ground.
  • An example of a system designed for the telecommunications satellite constellations is shown in figure 18.
  • the system includes a method of tracking the ISL beam from the imaging satellite to the data-relay satellite.
•   A system similar to that used with the SILEX/SPOT4 system is used. In this method, the satellites 'search' for each other using a wide angle beam to roughly locate each other. Next, a finer beam is used to refine this pointing and finally, when the satellites are aligned, they 'lock-on' to each other and data transfer begins.
  • the ISL system therefore needs some method of directing the beam to allow for the relative motion of the sending and receiving satellites. This is either a movable mirror (as in figure 18) or an optical phased array, similar to the phased antenna arrays used in radio.
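A minimal sketch of the coarse-search, fine-pointing and lock-on sequence described above follows. The angular thresholds, the iteration limit and the measure/steer callbacks are assumptions made for illustration; the actual SILEX-style acquisition logic is not specified in this document.

```python
# Sketch of the coarse-search / fine-pointing / lock-on sequence. The angular
# thresholds, the iteration limit and the measure/steer callbacks are
# illustrative assumptions only.
def acquire_isl(measure_pointing_error, steer_beam,
                coarse_threshold_deg=0.5, fine_threshold_deg=0.001,
                max_iterations=1000):
    """Drive the beam-steering mechanism (movable mirror or optical phased
    array) until the residual pointing error is small enough to lock on."""
    state = "COARSE_SEARCH"
    for _ in range(max_iterations):
        error_deg = measure_pointing_error()      # e.g. from a received beacon
        if state == "COARSE_SEARCH":
            if error_deg < coarse_threshold_deg:
                state = "FINE_POINTING"
            else:
                steer_beam(step_deg=0.1)          # wide-angle raster search
        elif state == "FINE_POINTING":
            if error_deg < fine_threshold_deg:
                return "LOCKED"                   # data transfer may begin
            steer_beam(step_deg=error_deg / 2)    # close the loop on the error
    return "FAILED"

# Toy usage with a simulated pointing error that shrinks as the beam is steered.
error = [2.0]
def measure(): return error[0]
def steer(step_deg): error[0] = max(0.0, error[0] - step_deg)
print(acquire_isl(measure, steer))                # -> 'LOCKED'
```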
  • the other critical feature in the data transfer chain is the initial link between the ground station and the space-based network segment.
  • This physical link is a microwave link similar to existing uplinks in communication systems.
•   the attenuation of a microwave uplink is lower than that of optical links.
  • Satellites in a constellation are generally arranged in several discrete orbits (or orbital planes) with multiple satellites in each orbit. Links can then be intra-plane or inter-plane. In the present embodiment both are used, as shown in figure 19.
  • This sort of linking arrangement is used in telecommunications constellations, for instance the Teledesic network shown in figure 20.
•   Within an orbital plane, satellites stay at fixed distances apart, so it is simpler to link intra-plane rather than inter-plane, where the satellites may move relative to each other.
  • the imaging system has a data transfer path which goes between the imaging satellite and either directly to ground, if this is possible, or via one or more 'routing satellites' if direct ground contact cannot be achieved.
  • These 'routing' satellites may or may not have imaging capabilities. Imaging capabilities are not necessary for the data transfer portion of the system, however, in the present embodiment, the 'routing' satellites also have an imaging capacity.
  • the 'routing satellites' are GEO-, MEO- or LEO-based, depending upon the particular embodiment chosen.
•   In this arrangement, the intermediate satellites have imaging capabilities.
  • the system is also designed to minimise the time from the image request to the display of the image, by optimal use of the satellites and their communications capabilities. This gives maximum flexibility to the design of the overall system.
  • the satellites are from a third party design and the system's sensors are 'piggy-backed' onto them.
  • An example of this would be to place the sensors onto Teledesic's proposed satellite constellation.
  • One main advantage of using third-party satellites is that the communications infrastructure is already in place, thus reducing costs and time.
•   Because the third-party satellites are not specifically manufactured for the imaging system of the present embodiment, significant redesign of the third-party system may be necessary. For example, space on the platform would be needed to house the sensors, and the imaging system would need to be integrated into the existing one provided on the particular platform.
•   In other embodiments, the relay satellites do not have imaging capabilities.
•   the relay satellites are in geostationary orbit (similar to the system planned for use with SPOT4), in Medium Earth Orbit (MEO) or in LEO.
  • the system operates in a similar fashion to that shown in figure 22.
•   The advantages of geostationary relay satellites are their large area of coverage, hence only a few are needed, and their stationary position relative to the earth's surface, which simplifies ground reception.
•   Alternatively, a network of MEO satellites equipped with ISLs is used in a similar manner to relay the image data back to earth.
  • Using a network of MEO satellites has the advantage of requiring a smaller number than a LEO constellation and the satellites do not have to be designed for imaging but only for communications.
  • Another scheme using 'non-imaging' routing satellites in a further embodiment is to relay the image data from the sensing satellites to ground via a separate constellation/network of communications satellites.
  • These satellites are distinguished from the 'imaging' relay satellites described above by being separate to the imaging system.
  • the satellites are from the Teledesic system or some other external provider.
•   Advantages of using third-party relay satellites are their large area of coverage and inter-satellite links (ISLs), which are important for providing 'real-time' ground reception. Additionally, by not being a direct part of the imaging system, the development and maintenance of the transfer-to-earth portion of the system is simplified.
•   The real-time uplink procedure, shown in figure 23, commences with the transfer of compressed and encrypted data from the ground station to a space-based network node (i.e. a satellite). This satellite examines the data to determine if it is the intended recipient of the information. If not, it determines which inter-satellite link to use for data transfer, either by examining the data for routing information or by performing some routing to determine the next optimal step in the data transfer chain.
•   When the ISL has been determined the satellite then attempts to establish the physical link to the intended satellite. Slightly different methods may be required depending on whether the link is intra- or inter-plane, however the general approach is as described above. Once the link is established, the data will be transferred using the designated communications protocol. Finally, the receiving satellite examines the data to determine if it is the intended recipient of the data and takes the appropriate action.
  • the overall method of operation for real-time downlinking used in the present embodiment is shown in figure 24. It commences with the transfer of compressed and encrypted data from the imaging satellite to the ground station, if this is possible, or to a space-based network node (i.e. a satellite). If direct transfer to ground is not possible, the satellite determines which ISL to transfer the data across, either by examining the data for routing information or by performing some routing to determine the next optimal step in the data transfer chain.
•   When the ISL has been determined, the satellite then attempts to establish the physical link to the intended satellite. Slightly different methods may be required depending on whether the link is intra- or inter-plane, however the general approach is as described above. Once the link has been established, the data is transferred using the designated communications protocol. Finally, the receiving satellite examines the data to determine if it can transmit to ground and takes the appropriate action.
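The per-node decision made at each hop of the uplink and downlink procedures (figures 23 and 24) can be summarised as below. The packet fields and the compute_next_hop callback are hypothetical; the function only returns the action to be taken so that the deliver / transmit-to-ground / forward-over-ISL branching can be exercised without satellite hardware.

```python
# Sketch of the per-node decision in figures 23 and 24. The packet fields and
# the compute_next_hop callback are hypothetical; the function only returns the
# action to take so that it can be exercised without satellite hardware.
def next_action(node_id, ground_in_view, packet, compute_next_hop):
    """Deliver locally, transmit to ground, or forward over the chosen ISL."""
    if packet["destination"] == node_id:
        return ("deliver_locally", None)
    if packet.get("direction") == "downlink" and ground_in_view:
        return ("transmit_to_ground", None)
    route = packet.get("route") or []
    # Either follow routing information carried with the data, or route 'in transit'.
    next_hop = route.pop(0) if route else compute_next_hop(packet)
    return ("forward_over_isl", next_hop)

# Example: a relay satellite with a precomputed route and no ground station in view.
packet = {"destination": "ground_B", "direction": "downlink", "route": ["sat_7", "ground_B"]}
print(next_action("sat_3", False, packet, compute_next_hop=lambda p: None))
# -> ('forward_over_isl', 'sat_7')
```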
•   the sensors used in the real time imaging system detect EM radiation emitted by the earth's surface over a range of frequencies such as microwave, infrared, visible, ultraviolet and gamma radiation. Since, at present, no single sensor can cover this wide frequency range, only a generic sensor system will be described. A notable exception to this is the case of active sensors that require a transmitter, in addition to the general embodiment of a passive sensor.
•   a particular feature of the image acquisition process in the present embodiment is the use of both optical and microwave sensors on the same satellite to allow all-weather day/night imaging capability.
  • the system comprises a data receiver that receives data, such as control instructions and reference data, either directly from a ground station or from another satellite. This data is processed, including decryption and decompression, to determine what the satellite is required to do to perform the instructions.
  • the sensor component outputs data to a data storage area (DSA) and to the control and processing module (CPM), which performs compression, encryption and other image processing such as geo-locating the image.
  • the method adopted uses sensors with independently directable beams which allow more area to be imaged, within an 'effective footprint', than with a single fixed beam.
  • the system would have a multiplicity of sensors that are able to move in 2 dimensions to record data from a given target that lies in the satellite's effective footprint.
•   This system, assuming a sufficient number of satellites, gives partial coverage with a fast refresh rate, or global coverage with a slower image refresh rate.
  • This method is able to be implemented using radar sensors, with the beams being directed using steerable antennas.
  • the effective footprints are made to be overlapping to allow stereo imaging.
  • a method is adopted using a number of sensors, each imaging an area within the footprint through a static lens.
  • This method has a large number of sensors on the satellite all observing different fixed areas within the footprint simultaneously.
•   An example of such a system is an optical/IR passive sensor comprising a bundle of wave-guides, such as optical (i.e. glass) fibre, that interface to a fixed focal length lens.
  • Each fibre in the fibre optic bundle is linked to a separate optical sensor such as a charged-coupled device (CCD), which is located at a convenient point on the spacecraft.
  • the processing scheme adopted in the present embodiment is shown in figure 25.
  • the image acquisition process commences when the satellite receives the image request, and any other information, from the ground station. If the data has been compressed/encrypted, then it is de-compressed/de-encrypted and passed to the control and processing module (CPM).
  • the CPM analyses the information and then instructs the sensor instruments in the necessary fashion to implement the request. This may include such things as the angle of sensor pointing or the particular sensors to utilise.
  • Data received by the satellite's sensors is sent to a data storage area (DSA), on a frame-by-frame basis, at a rate of >25 frames per second. Due to onboard memory constraints, it may be necessary to compress the data stored in the DSA.
  • the sensed data is sent to the CPM for processing and/or compression.
•   the system uses some form of compression to reduce the quantity of data that must be downlinked to the ground station. Downlink data may also be encrypted for security reasons.
  • An inter-frame compression technique is used, i.e. one that transmits the difference between successive image frames. This is achieved by a comparison between the current image and previous images, which are stored in the DSA.
•   An adaptive process is used to perform this comparison. An example of this is where an initial comparison is performed, based on wavelet parameters. Should the correlation of the wavelet parameters fall within predefined values then a more detailed comparison is performed. The comparison process is also based on several factors such as histograms in each received band, edge structures, band ratios and textures. Alternatively, other forms of image compression are applied. In the present embodiment, some, or all, of the 'image processing' (section Image Processing and accompanying figures) is done on-board. This includes the processing required by synthetic-aperture radar techniques or image calibration.
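A rough sketch of the adaptive inter-frame idea described above is given below: a cheap coarse comparison decides whether only a difference frame needs to be downlinked. Coarse block means stand in for the wavelet parameters mentioned in the text, and the correlation threshold is an invented example value.

```python
# Sketch of the adaptive inter-frame idea: a cheap coarse comparison decides
# whether a full key frame or only a difference frame is downlinked. Coarse
# block means stand in for the wavelet parameters mentioned above, and the
# correlation threshold is an invented example value.
import numpy as np

def coarse_features(frame, block=16):
    """Block-averaged image, used as a cheap stand-in for wavelet coefficients."""
    h, w = frame.shape
    return frame[:h - h % block, :w - w % block] \
        .reshape(h // block, block, w // block, block).mean(axis=(1, 3))

def encode_frame(current, previous, coarse_corr_threshold=0.95):
    """Return ('full', frame) or ('diff', difference) for downlinking."""
    if previous is None:
        return "full", current
    corr = np.corrcoef(coarse_features(current).ravel(),
                       coarse_features(previous).ravel())[0, 1]
    if corr < coarse_corr_threshold:
        # Scene changed too much: send a whole (key) frame.
        return "full", current
    # Otherwise transmit only the difference from the previous frame.
    return "diff", current.astype(np.int16) - previous.astype(np.int16)
```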
  • the system also performs a comparison between the current image and geo-located reference data (i.e. data whose geographic location is known) to allow the user-selected area to be extracted from the sensed data. Extracting the requested data from the sensed data further reduces the volume of downlinked data. In other embodiments, some, or all, of the on-board processing is done on the ground so that the satellite essentially transmits raw data.
•   To enable more than one area to be imaged at a time, a parallel computing structure is adopted. Such a structure enables the above process to be repeated simultaneously for a number of areas within the same region.
  • the system transmits the data to ground via the optimal data transfer route.
  • other information such as satellite status, network and ISL status and satellite pointing direction is also downlinked.
•   The image-processing scheme of the present embodiment, shown in figure 26, outlines the overall structure of the processing required to produce an image from raw sensor data. For simplicity, it will be assumed that all processing is done in the imaging station; however, it is possible that both the satellites and the end-user software will do a proportion of the image processing.
  • the imaging station receives data from a ground station and decompresses/decrypts the data. This data is then pre-processed, enhanced, and possibly classified, before the area selected by the user is extracted and then compressed/encrypted before finally being transmitted to the user.
  • Image pre-processing involves calibrating the image to remove any systematic errors, or offsets, that may have been introduced and the processing involved in converting synthetic-aperture radar data into an image.
  • the calibration occurs in two main areas: geometric and radiometric calibration.
  • Geometric calibration corrects for the orientation or pointing of the satellite, relative to the fixed earth co-ordinates of latitude and longitude. This is done either by comparison with an archived geo-located reference image, or by a knowledge of the pointing vector of the satellite and the sensor which then allows appropriate transformations to be made to the image.
  • Radiometric calibration allows the relationship between the incident radiation intensity and the sensor output to be quantitatively determined. This is done by periodic comparison with an EM radiation source of known strength such as the sun or, for an active sensor system, against a target of known backscattering cross-section.
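The two calibration steps can be illustrated as follows. The radiometric step fits a linear gain and offset against observations of a source of known intensity; the geometric step is reduced here to finding a whole-image pixel offset against a geo-located reference image. Both are simplified assumptions for illustration, not the patent's prescribed algorithms.

```python
# Simplified calibration sketch. Radiometric: fit a linear gain/offset against
# observations of a source of known intensity. Geometric: reduced here to a
# brute-force whole-image pixel offset against a geo-located reference image.
# Both are assumptions for illustration, not the patent's prescribed algorithms.
import numpy as np

def radiometric_calibration(sensor_counts, known_intensities):
    """Least-squares fit of intensity = gain * counts + offset."""
    gain, offset = np.polyfit(np.asarray(sensor_counts, float),
                              np.asarray(known_intensities, float), 1)
    return gain, offset

def apply_radiometric(image, gain, offset):
    return gain * image.astype(float) + offset

def geometric_offset(image, reference, max_shift=5):
    """Pixel shift (dy, dx) that best aligns the image with the reference."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(image, dy, axis=0), dx, axis=1)
            score = np.corrcoef(shifted.ravel(), reference.ravel())[0, 1]
            if score > best_score:
                best, best_score = (dy, dx), score
    return best
```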
  • Image enhancement involves the removal of random errors, e.g. noise, from the image and additional processing to improve the image, such as incorporating stereoscopic effects.
  • the system performs processes such as contrast modification, edge detection and noise reduction. If non-optical frequencies have been used then the system will use false colouring and overlays of multiple frequencies to form the image.
  • Stereoscopic processes are also performed on the image to produce a three-dimensional effect. This includes a change in perspective transformation, to give a 'fly-through' effect.
  • image classification is performed.
  • This is a technique which uses 'pattern recognition' to group the image information in ways that makes visual interpretation simpler, or more efficient, for the end-user.
  • the human brain performs image classification automatically. For example rather than observing a random array of brightness and colours the brain recognises a region as being a dwelling, therefore classifying all the information in that region as 'house', as opposed to, say, 'vehicle'.
  • An example of this type of image classification in an imaging system is a microwave image having radar backscatter being processed to identify friendly or hostile objects.
  • this pattern recognition uses neural networks or some other process.
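As one possible realisation of the classification step, the sketch below applies a nearest-centroid rule to per-region feature vectors (for example backscatter statistics, band ratios or texture measures); a neural network, as mentioned above, could be substituted. The class names and feature values are purely illustrative.

```python
# One possible realisation of the classification step: a nearest-centroid rule
# over per-region feature vectors (e.g. backscatter statistics, band ratios,
# textures). A neural network could be substituted; the class names and
# feature values below are purely illustrative.
import numpy as np

def train_centroids(features, labels):
    """features: (n_regions, n_features) array; labels: list of class names."""
    labels = np.array(labels)
    return {c: features[labels == c].mean(axis=0) for c in set(labels.tolist())}

def classify_region(feature_vector, centroids):
    return min(centroids, key=lambda c: np.linalg.norm(feature_vector - centroids[c]))

train_x = np.array([[0.9, 0.2], [0.8, 0.3], [0.1, 0.7], [0.2, 0.8]])
train_y = ["house", "house", "vehicle", "vehicle"]
centroids = train_centroids(train_x, train_y)
print(classify_region(np.array([0.85, 0.25]), centroids))  # -> 'house'
```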
•   the image display is handled differently than in many current user interfaces. These interfaces generally have a 'preview' system, similar to that in the user-request section of the present imaging system, but then dispatch the requested data in non-real time to the user. With the present system, there is no (or minimal) delay between the user request and image display. In addition, once the initial image has been obtained, continuous (refresh rate of approximately 25 Hz or more) imaging of the requested region is possible, which is a particular feature of the present embodiment.
  • the system operates via a computing device connected to a network, such as the Internet or a direct subscriber network such as a news organisation or government entity.
•   the connection, or communications link, includes land-lines, microwave links and fibre-optics.
  • the user's system receives the image data over the network from the imaging station and then performs some image processing on the data before displaying it in the requested format on an output device.
  • Some, or all, of the 'image processing' is done by the user display device.
  • the level of data processing is tailored to the user's requirements and to capabilities of the user's system. For example, if the end-user display terminal were a small hand-held device then, at present, it would be preferable to do the majority of the image processing at the imaging station. Conversely, if the end-users were a government or scientific organisation then it may be preferable, or necessary, for a larger proportion of the image processing to be done by the user's system.
  • the image is displayed. This is done using some digital visual output device, such as a computer monitor, flat screen TV or palmtop display. This process is shown schematically in figure 27.
  • satellites in LEO are provided with the imaging sensors, the sensors being as close as possible to earth to achieve the highest spatial resolution.
•   A LEO-based sensor, however, has a smaller footprint than the same sensor in a higher orbit, so this configuration presents a worst-case scenario in establishing the number of satellites required to obtain global coverage of the earth.
  • Each satellite has optical/IR and active microwave based sensors on board.
  • the maximum spatial resolution of 1 metre for an optical/IR sensor and for microwave sensors is a constraint provided for national security reasons, and not technological reasons.
  • the technology for greater resolution is available for classified purposes, but as yet, has not become available for commercial purposes.
  • LEOs in the telecommunications and earth imaging industry are usually circular, polar orbits. This orbit ensures that every satellite will pass over an area on the earth at some point in time and have the ability to perform its operational function, with the exception of the far north and south regions. To achieve coverage in these areas, polar elliptical orbits are needed. In the case of circular polar orbits, the satellite is travelling from pole to pole at a constant speed, while the earth is rotating about the polar axis. Figure 28 shows the effect.
  • the imaging system of the present embodiment comprises a purpose-built LEO imaging satellite constellation travelling in elliptical polar orbits having both microwave and optical sensors.
  • the satellite network and imaging sensors are configured to cover 95% of the earth's surface.
  • the altitude of the orbit of these satellites is approximately 640 km and the footprint of each satellite's image sensor is circular, as shown in figure 29.
•   the angle of view from nadir, θ in figure 29, is approximately 27° for all sensor types.
•   the surface area of the whole earth can be calculated as: Surface area = 4π × (Earth radius)² ≈ 5.1 × 10⁸ km²
•   the sensing method adopted by the imaging system of the present embodiment involves having "all-weather" sensors that can move through 360° in azimuth and have a 27° scan angle from nadir. Using this scan angle the sensor can use its 'spot beam' to map out a much larger area. This scan angle is similar to that achieved by the newest of the SPOT imaging satellites, SPOT 5. For simplicity, it is assumed that there is a single type of sensor on the satellite. However, in the present embodiment multiple sensors are used in practice to provide multiple-user access and multi-frequency sensing capabilities. To achieve a 55° (i.e. approximately ±27°) viewing angle, at all azimuths, the sensor needs to be moveable in two planes.
  • the size of the circular footprints is a function of the orbit height of the satellite and the sensor scan angle.
  • To calculate the area of the footprint consider figure 30.
•   the area of the circular footprint in figure 29 is given by:
•   Footprint area = π × (Footprint radius)²
•   the effective area covered by a single satellite is the area which is uniquely (i.e. only) imaged by that satellite. This is given by the square inside the circles in figure 31b.
  • the number of satellites required to image a given proportion of the earth's surface is determined by the size of the area and the coverage of a single satellite. The number is given by:
•   Number of satellites = Area to be covered / Area of coverage
•   a large constellation of around 2000 satellites is needed to cover 95% of the earth's surface. If further overlapping of the footprints is required in order to achieve instantaneous stereo images, this number may need to be increased.
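The footprint and constellation-sizing relationships above can be worked through numerically as follows, using the 640 km altitude and 27° scan angle quoted earlier and a spherical earth. The helper names are illustrative; the result of roughly 2,200 satellites is consistent with the "around 2000" figure stated above.

```python
# Worked version of the footprint and constellation-sizing relations above,
# assuming a 640 km orbit, a 27 degree scan angle from nadir and a spherical
# earth. The helper names are illustrative; the result of roughly 2,200
# satellites is consistent with the "around 2000" figure quoted in the text.
from math import radians, sin, asin, pi

EARTH_RADIUS_KM = 6371.0

def footprint_radius_km(altitude_km, scan_angle_deg):
    """Ground radius reachable by a sensor scanned up to scan_angle from nadir."""
    eta = radians(scan_angle_deg)
    r = EARTH_RADIUS_KM + altitude_km
    # Sine rule in the earth-centre / satellite / ground-point triangle.
    ground_angle = pi - asin(r * sin(eta) / EARTH_RADIUS_KM)
    earth_central_angle = pi - eta - ground_angle
    return EARTH_RADIUS_KM * earth_central_angle

r = footprint_radius_km(640.0, 27.0)            # roughly 330 km
footprint_area = pi * r ** 2                    # roughly 3.4e5 km^2
effective_area = 2 * r ** 2                     # square inscribed in the circular footprint
earth_area = 4 * pi * EARTH_RADIUS_KM ** 2      # roughly 5.1e8 km^2
n_satellites = 0.95 * earth_area / effective_area
print(round(r), round(footprint_area), round(n_satellites))   # roughly 331, 343000, 2200
```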
•   To reduce this number, the system could either have multiple moveable sensors on the one platform, a larger scan angle, or be in a different orbit. Increasing the scan angle would give a larger effective footprint, and hence a larger effective area of coverage. Having a larger effective footprint would reduce the number of satellites required for whole-earth coverage, which is desirable.
  • the number of satellites required for coverage is primarily related to the effective image footprint of a single satellite.
  • the footprint size is, in turn, critically dependent on the orbit height and sensor scan angle.
  • the number of satellites required is largely determined by these system parameters and if these are changed then this number may change considerably.
  • the system may configure the satellite constellation to cover a reduced portion of the earth's surface, say North America, which would also reduce the number of satellites required.

Abstract

An imaging system (11) for providing image information of a desired portion of the earth's surface (13) to a user (14) of the system in real or near real time. A large number of imaging platforms (15) in non-geostationary earth orbits (17) are provided so that each platform (15) has one or more imaging sensors (19). The platforms (15) are arranged in a constellar network (21) so that the footprint coverage (23) of the imaging sensors (19) of each platform (15) is adjacently and overlappingly disposed with respect to the footprints (23) of each adjacent platform (15) thereto. The footprints (23) contiguously and concurrently cover a substantial part of the earth's surface (13) continuously and dynamically. Any portion within the footprint coverage (23) is capable of being imaged at any time with a spatial resolution of approximately 1 metre or less and a temporal resolution of approximately 25 images per second or more. A communication means (25) is also provided for conveying information between a user (14) requesting an image of the desired portion of the earth (13) covered by the footprints (23) and the network of platforms providing the image. A request delivery means (27) is included to deliver a processed request for an image from the user (14) to the platform(s) (15) required to generate the image requested and an image delivery means (29) is provided to deliver the image generated by the platform(s) (15) to the user (14).

Description

"Real or Near Real Time Earth Imaging System"
Field of the Invention
The present invention relates to earth imaging systems and more particularly to earth satellite based imaging systems for producing images of the earth in real or near real-time.
Throughout the specification, unless the context requires otherwise, the word "comprise" or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated integer or group of integers but not the exclusion of any other integer or group of integers.
Background Art
Imaging equipment located on board satellites has been used for producing images of, and data related to, the earth's surface.
However, these imaging systems are constrained by the stationary orbit, or the orbit path, of the satellites on which the imaging and data collection equipment is located. This means that only an image of, or data relating to, the earth's surface falling under the satellite's orbit can be obtained.
Further, where a satellite is in a non-stationary orbit, the areas of the earth surface falling under its orbit can only be imaged at certain time periods corresponding to when the satellite is aligned with these areas.
Although there has been much development of satellite technology, and in particular the production and installation of large constellar networks of satellites in low earth orbits (LEO) and medium earth orbits (MEO), this has been driven primarily by the commercial gains that are achievable with the use of satellites in the telecommunications sector. Accordingly, these large constellar satellite networks are virtually exclusively dedicated to private telecommunication networks and have had nothing to do with imaging. Unfortunately, because the same commercial gains are not realisable in the imaging industry, imaging satellites are typically limited in number to 2 or 3 satellites in a network orbiting the earth, which makes it impossible for these networks to provide for instantaneous real time imaging on demand of selected parts of the earth's surface.
Another reason for the lack of development and implementation of imaging satellites in the private sector on a similar scale to the take up of satellite technology in the telecommunications industry is that of national security and the deployment of imaging satellites for military purposes.
The USA in particular has been sensitive to allowing private access to imaging satellites that can achieve images of the earth's surface having resolutions of less than 10 metres. However recently, relevant regulatory authorities in the USA have relaxed these requirements and have permitted image resolutions to be provided to the public down to 1 metre.
A further aspect of imaging satellites in current use is that the images and data are collected by expensive purpose-engineered sensors. These sensors generally do not use all of the available natural radiation from the earth to generate an image.
The earth's sun is believed to emit most of its energy in the so-called "optical" frequencies of the electromagnetic spectrum. The earth, by contrast, due to its lower non-incandescent temperature, is believed to emit most of its energy in frequencies that are non-optical and which are of weaker strength.
Humans have evolved so that their brains process and recognise optical frequencies because the sun emits and the moon reflects strong levels of optical frequency electromagnetic radiation.
Some animals and insects "see" other radiation bands, i.e. snakes, infra-red and bees, microwaves. A great deal of information about the earth has not been collected in these frequency ranges.
Disclosure of the Invention
It is an object of the invention to provide for instantaneous real or near real time imaging on demand of selected parts of the earth's surface.
It is a preferred object of the present invention to supplement such real or near real time imaging to provide for images that include collected and converted radiation from a range of electromagnetic frequencies, including but extending beyond the optical range, into a form, that can be viewed by humans.
It is a further preferred object to provide for viewing a composite image of the earth's surface or a substantial portion of the earth's surface, in real time or near real time across a wide range of these frequencies.
According to a first aspect of the present invention there is provided an imaging system for providing image information of a desired portion of the earth's surface to a user of the system in real or near real time comprising:-
(a) a large number of imaging platforms in non-geostationary earth orbits,
(i) each platform having one or more imaging sensors;
(ii) the platforms being arranged in a constellar network so that the footprint coverage of the imaging sensors of each platform is adjacently and overlappingly disposed with respect to the footprints of each adjacent platform thereto, so that the footprints contiguously and concurrently cover a substantial part of the earth's surface continuously and dynamically; and
(iii) any portion within the footprint coverage being capable of being imaged any time with a spatial resolution of approximately 1 metre or less and a temporal resolution of approximately 25 images per second or more;
(b) communication means for conveying information between a user requesting an image of the desired portion of the earth covered by the footprints and the network of platforms providing same, including:
(i) request delivery means to deliver a processed request for an image from the user to the platform(s) required to generate the image requested; and
(ii) image delivery means to deliver the image generated by the platform(s) to the user.
Preferably, the platforms form a constellation of satellites. Alternatively the invention may use aircraft such as high altitude balloons or unmanned remote controlled aircraft to locate said sensors for collecting said image data. The aircraft are preferably arranged so as to form a constellation.
Preferably, the imaging system includes an image request processing station including:
(a) an image requesting server to receive compiled requested information from the user;
(b) data processing means to generate processed image request data including:
(i) location processing means for determining the location and size of the portion to be viewed and the time duration of the imaging;
(ii) permission processing means to determine user access;
(iii) resource allocation means to determine requisite resources such as platforms and sensors to provide the image;
(iv) scheduling means to schedule the use of the resources amongst competing requests; and
(v) request data transport means to transport processed image request data via the request delivery means.
Preferably, the imaging system includes image data acquisition and processing means disposed on each platform including:
(a) data receiving means for receiving the processed request data;
(b) data processing means to process the received processed request data and to control the imaging sensors in accordance therewith;
(c) a data storage area to store data output from the imaging sensors;
(d) control and processing means to process the stored data to generate processed image data ready for transport; and
(e) image data transport means to transport processed image data via said image delivery means.
Preferably, the imaging system includes an image receipt processing station for processing received processed image data including:
(a) receiving means to receive and initially process the transported image data for decompression, decryption and the like;
(b) pre-processing means to calibrate, enhance and classify the initially processed image data if necessary;
(c) extracting means to extract the portion of the image originally selected by the user;
(d) transport processing means to process the extracted image data ready for delivery to the user;
(e) transmission means to transmit the processed extracted image data to the user.
Preferably, the imaging system includes an earth imaging application means including:-
(a) a user request interface that is real-time interactive having:
(i) selection means to select a portion of the earth that is desired to be imaged by the user;
(ii) zooming means to fine tune the selection of the portion;
(iii) resolution means to select the desired resolution of the image;
(iv) frequency sampling means to select the desired sampling frequency of the image;
(v) view formatting means to select the dimension in which the image is viewed and video or static views;
(b) compilation means for compiling requested information;
(c) transmission means to transmit requested information to the image requesting server;
(d) a user image display interface integrated with the user request interface to display the requested image in real time or near real time, including:
(i) image processing means to process the received extracted image data from the image processing station for display; and
(ii) display means for displaying the processed image data to the user.
Preferably the image data comprises three-dimensional data of the earth's surface whereby the combined image is a three dimensional image.
Preferably, more than one sensor is provided on each platform to collect optical data. It is also preferable that at least one sensor on each platform is provided to detect and collect a number of frequencies beyond optical frequencies such as micro-wave, infra-red, ultra-violet or gamma radiation frequencies.
Preferably, said platforms are adapted so that image data of at least one area of the earth's surface is collected by at least two sensors at any one point in time whereby said platforms collect said three dimensional data.
Preferably said three dimensional data of an area is collected by a single sensor from two different points of an orbit of a platform containing said sensor. Preferably said sensor collects said data sets at a rate of 25 data sets per second or higher as the platform traverses its orbit path.
Preferably said platforms are further adapted to communicate with each other so that said image data of an area collected by at least two sensors is filtered by said platforms so that said combining means receives a selected set of image data of said area.
According to a further aspect of the present invention there is provided an imaging system comprising a large constellation of platforms collectively having a first set of passive electromagnetic radiation sensors and a second set of active electromagnetic radiation sensors, each sensor being adapted to collect data from the earth's surface; identification means for identifying an area on the earth's surface from where data has been collected by said passive sensor; comparison means for comparing said identified area with a target area; whereby in response to a positive result of said comparison, said active sensor is adapted to collect data from said identified area.
Preferably, said imaging system comprises a set of images of the earth's surface; said comparison means adapted to compare said data collected by said passive sensor with said set of images.
Preferably said second set of active sensors are wide range sensors.
Preferably, said sets of active sensors and said passive sensors are adapted to receive electromagnetic radiation from a spread of frequencies, such as optical, infra-red, ultra-violet and microwave frequencies; said set of image data derived from said spread of frequencies.
According to yet another aspect of the present invention there is provided a system for collecting data related to the earth; said system comprising a constellation of platforms some or all of which contain a set of sensors for detecting electromagnetic radiation emitted from the earth's surface; processing means for receiving at least two sets of data; each said set of data being received from a separate one of said platforms; and said processing means combining said at least two sets of data to generate a representation of the earth.
Preferably, said sensors operate to detect a pre-determined set of electromagnetic radiation frequencies including optical, infra-red, ultra-violet and microwave frequencies whereby said representation is generated, at least in part, from data collected from electromagnetic frequencies outside the optical range of frequencies.
Preferably said unified representation comprises a graphical or visual representation of said data.
According to a further aspect of the present invention there is provided an imaging system for generating graphical representations of the earth comprising input means for receiving input data from a large constellation of platforms orbiting the earth; processing means for processing said data to generate a set of image data of the earth; and imaging means for generating graphical representations of the earth from said set of image data.
Preferably, the platforms are satellites.
Preferably, said input data comprises communications signals received by a constellation of communications satellites; and said processing means is adapted to process said communications signal to identify sets of image data of the earth.
Preferably, said constellation of communications satellites have a non-stationary orbit and collectively receive on a continuous basis communication signals from a predetermined area of the earth's surface, whereby said processing means updates said set of image data on a real or near real time basis.
Preferably, said set of image data comprises data in three dimensions.
Preferably, said imaging means access said data in three dimensions whereby said graphical representations are three-dimensional representations.
Alternatively, the data signals may be obtained from telecommunications receivers located in high altitude balloons, aircraft, including high altitude unmanned reconnaissance/communications aircraft.
In accordance with a further aspect of the present invention, there is provided a method for providing image information of a desired portion of the earth's surface to a user in substantially real time comprising:-
(a) imaging the earth from a plurality of platforms located above the earth's surface;
(b) arranging the platforms in a constellar network with the footprint coverage of the imaging that can be achieved from each platform adjacently and overlappingly disposed with respect to the footprint coverage of the imaging from each adjacent platform thereto so that the footprints contiguously and concurrently cover a substantial part of the earth's surface continuously and dynamically;
(c) spatially resolving the imaging of any portion of the earth's surface within the footprint coverage at any time to approximately 1 metre or less and temporally resolving the imaging to approximately 25 images per second or more;
(d) conveying information between a user requesting an image of the desired portion of the earth covered by the footprints and the network of platforms providing the imaging, including:
(i) delivering a request for an image from the user to the platform(s) required to perform the imaging to generate the image requested; and
(ii) delivering the image generated by the platform(s) to the user.
Brief Description of the Drawings
The invention will be better understood in the light of the following description of several embodiments. The description will be made with reference to the accompanying drawings in which:
Figure 1 is a schematic representation of a constellation of satellites in accordance with the first embodiment;
Figure 2 is a representation of a typical set of footprints produced by a constellation of satellites such as that of Figure 1;
Figure 3 is a block diagram of an imaging system using the constellation of satellites of Figure 1 ;
Figure 4 is a flow chart of operation of the system in Figure 3;
Figures 5A and 5B are block diagrams of a compression system located on the satellites of Figure 1;
Figures 6A and 6B are block diagrams of two different configurations of a ground station and imaging station in Figure 3;
Figure 7 is a schematic representation of a combined optical and microwave sensor used in the satellites of Figure 1;
Figure 8a is a representation of bands of frequencies in the radio spectrum;
Figure 8b is a representation of bands of frequencies in the micro-wave spectrum;
Figure 9 is a transmission diagram of the earth's atmosphere;
Figure 10 is a table of frequencies allocated to passive sensor applications;
Figure 11 is a table of frequencies allocated to remote sensing applications;
Figure 12 is a further representation of the electromagnetic spectrum showing bands of natural phenomenon;
Figure 13 is a top level flow chart of the imaging system in accordance with the second embodiment;
Figure 14 is a top level schematic for the imaging system of the second embodiment;
Figure 15 is a flow chart showing the procedure for a user making an imaging request in accordance with the second embodiment;
Figure 16 is a schematic showing the ground segment of the second embodiment;
Figure 17 is a flow chart showing the procedure for processing a user request in accordance with the second embodiment;
Figure 18 is a laser communications system developed by Matra - Marconi Space;
Figure 19 is a schematic drawing of intra- (vertical) and inter-plane (horizontal) link arrangements in a satellite constellation;
Figure 20 is a schematic of network links in the Teledesic constellation;
Figure 21 is an example of a networking configuration for the imaging system of the second embodiment;
Figure 22 is a schematic of the use of a satellite in a different orbit to relay image data from a LEO imaging satellite to earth;
Figure 23 is a flow chart showing the procedure for the real time uplink of data in accordance with the second embodiment;
Figure 24 is a flow chart showing the procedure for the real time downlink of data in accordance with the second embodiment;
Figure 25 is a flow chart showing the procedure for image acquisition in accordance with the second embodiment;
Figure 26 is a flow chart showing the image processing procedure in accordance with the second embodiment;
Figure 27 is a flow chart showing the procedure for image display in accordance with the second embodiment;
Figure 28 is a diagram showing how a single polar orbiting satellite images the earth's surface;
Figure 29 is a footprint on the earth's surface, projected from an imaging satellite, whereby the satellite has a single moveable sensor and therefore can image any area within the circular region;
Figure 30 is a geometry for calculating the footprint area; and
Figure 31 shows schematic projections of imaging satellite footprints on the earth's surface in which the white circles show the actual sensor beams and the shaded circles show the area able to be imaged by the sensor; and wherein: Figure 31a shows the situation where the individual satellite footprints do not overlap; whilst Figure 31b shows the case of minimum overlap for total coverage where the grey sensor beam is an example of a region being imaged by two satellites simultaneously, allowing for an instantaneous stereo image.
Best Mode(s) for Carrying out the Invention
The first embodiment is directed towards an imaging system adapted to produce images of the earth, in near real-time.
In describing the first embodiment, reference will be made to figures 1 to 12 of the drawings.
From a user's point of view the imaging system operates through use of a computing device connected to a network, such as the Internet or a direct subscriber network such as a news organisation or government entity.
In operation, the user activates an earth imaging application which displays a map or image of the earth. The user selects a region of the earth that they are interested in. The user then zooms into this region so as to make a further selection in respect of an area of specific interest. Upon selecting the area of interest and identifying the resolution that they require, the user is presented with several options such as whether they wish to view a two dimensional image of the selected area, a 3 Dimensional (3D) image of the selected area, or a video image in real or near real-time.
Rather than an image generated from purely optical data, a graphical representation of the earth can alternatively be presented that is generated from information obtained from electromagnetic frequencies other than optical frequencies. For example, information from frequencies in the ultra-violet, infra-red or micro-wave ranges may be used in generating a graphical representation of the earth. To generate a three dimensional representation of a selected region of the earth, at least two separate data sources, each having a different aspect to the selected region, are required. A three dimensional representation may be a data set having spatial information on the selected region. This data set of spatial information enables a view of the region to be represented on a two dimensional screen. The spatial information enables the region to be viewed from a number of different angles. It also enables a user to navigate the region, as the spatial information enables a continuous set of such two dimensional images to be generated.
Navigation of a region is seen as the display of a series of these two dimensional images. Each new image is from a new perspective/angle of the region that is slightly different to the previous angle/perspective. The change in perspective accords with the direction of navigation through the region and the speed with which the region is navigated. This provides the perception that the user is moving through a three dimensional space.
Images displayed to the user of a selected area are in near real-time. Near real time means that the image viewed by the user may be displayed with a delay of from 10 to 15 seconds after the raw data has been collected and processed by an imaging system. The data is updated at a rate of approximately, or more than, 25 images per second as the satellite traverses its orbit path.
Referring now to Figure 1 , and other figures as appropriate, a schematic representation of a constellation of satellites 22 used by the first embodiment is shown to provide image data from an imaging platform 15 thereon, in real-time or near real-time, to a user 14 of an imaging system 11.
A constellation of satellites is a group of satellites orbiting the earth that are networked to each other in a constellar network 21 so that the imaging platforms 15 thereon provide coverage of the earth's surface 13 that is greater than can be provided by the imaging platform 15 of a single satellite 22.
Such constellations typically comprise Low Earth Orbiting (LEO) satellites, though they may comprise Medium Earth Orbit (MEO) satellites and GEO-stationary satellites. A constellation may comprise a series of satellites from some or all of the various orbit categories.
The orbit paths 17 of satellites 22 in a constellation of satellites having a non-stationary orbit are typically arranged so that a pre-determined area on the earth's surface is continuously in sight of the imaging platform 15 of at least one satellite of the constellation at any one time. Similarly a constellation of GEO-stationary satellites is organised so that a pre-determined area of the earth is continuously in sight of the imaging platforms of the satellites in the constellation.
For example, global hand held mobile communications networks, such as the Iridium network http://www.iridium.com, use constellations of LEO satellites without imaging platforms to provide continuous telecommunications service to hand held communications terminals across a large proportion of the earth's surface.
The use of a constellation of LEO satellites is particularly advantageous in an imaging system as it allows high-resolution images to be obtained compared with satellites in Geo-Stationary orbit (GEO satellites). An alternate advantage is that lower cost sensing systems for imaging platforms, such as Charged Coupled Devices (CCDs) may be used in place of the high cost systems for imaging platforms used in GEO-stationary satellites.
The use of LEO satellites to form an earth imaging system will allow the earth to be imaged through passive sensing of electromagnetic radiation emitted from the earth's surface. The radiation may be either naturally occurring radiation or man made radiation. The lower attenuation of passively sensed signals received by LEO satellites, due to the shorter transmission path of these signals to the LEO satellites compared with their transmission path to GEO satellites, will enable the signals to be detected and an image or representation of the earth, or a selected region of the earth, to be generated.
For example passive sensing from LEO satellites enables passive detection of man made electromagnetic signals such as telecommunication signals, radio signals, television signals and radar signals. These signals are emitted from the earth's surface or are reflected, scattered or faded from the earth's surface by buildings, cars, foliage, bodies of water etc. These reflected, scattered and faded signals represent noise in a received/detected telecommunications, radio, television, or radar signal. However, the reflected, scattered and faded signals contain information as to the object that they have been reflected off or scattered by or faded by. Processing of these reflected, scattered or faded signals produces information on these objects that enables graphical representations of the earth's surface to be compiled.
Other forms of noise in man made signals may also contain information on the earth as will naturally occurring radiation. All of these types of signal sources, including naturally occurring radiation, may be used to form an image or representation of the earth.
For example, the signals detected may be signals from cellular communications terminals. Signals from base stations of cellular telecommunications networks will be detected and processed as these signals originate from a source with a fixed location on the earth's surface.
Where a satellite has a non-stationary orbit, such reflected signals can be sensed from two or more points of the satellite's orbit path. This enables two or more sets of data on each point in the satellite's footprint to be collected. This data may be used to generate a stereoscopic image or, as three dimensional data, to generate a three dimensional image in respect of the areas covered by the footprint. The data sets will be off-set in time. This off-set will correspond with the time it takes the satellite to move from a first position where the first data set is collected to a second position in its orbit where a second data set is collected. Collecting data sets and processing these data sets at a rate of 25 data sets or images per second, or greater, should enable near real time images/representations at video frame rates to be generated and displayed to users.
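A back-of-envelope sketch of the time offset between the two viewing positions described above follows, for a circular orbit at the 640 km altitude used elsewhere in this document. The 50 km stereo baseline is an assumed example value.

```python
# Back-of-envelope sketch of the time offset between the two viewing positions
# described above, for a circular orbit at the 640 km altitude used elsewhere
# in this document. The 50 km stereo baseline is an assumed example value.
from math import sqrt

MU = 398600.4418           # km^3/s^2
EARTH_RADIUS_KM = 6371.0

def orbital_speed_km_s(altitude_km):
    return sqrt(MU / (EARTH_RADIUS_KM + altitude_km))

def stereo_time_offset_s(baseline_km, altitude_km=640.0):
    """Time for the satellite to travel between the two imaging positions."""
    return baseline_km / orbital_speed_km_s(altitude_km)

print(round(orbital_speed_km_s(640.0), 2))   # ~7.54 km/s
print(round(stereo_time_offset_s(50.0), 1))  # ~6.6 s between the two data sets
```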
Other embodiments of the invention may process reflection and scattering and fading noise in telecommunications signals received by telecommunications satellites in order to produce such an image. Alternatively, such information received by base transceiver stations in cellular communications networks may also be used.
In the present embodiment sensors are located in satellites of a constellation and are used for collecting image data relating to the earth's surface, the earth's crust and atmosphere. The sensors operate over a discrete set of electromagnetic frequencies, such as micro-wave frequencies, infra-red frequencies, optical frequencies, ultra-violet frequencies and gamma radiation frequencies.
This data is preferably relayed to one or more ground stations which process the data before forwarding selected sets of this data in accordance with user requests.
In order to relay data from each satellite to a ground station the satellites are adapted to communicate with each other by means analogous to a local area network (LAN) or similar network configuration. This communication is also used to notify each satellite of selected data that they are required to relay to a ground station.
In the preferred embodiment, the vast majority of the satellites, if not all of the satellites, are low earth orbiting satellites, such as those used for hand held satellite phones. These satellites have a low orbit and so are replaced every few years, allowing updating of imaging apparatus. The low altitude orbit of these satellites allows conventional imaging technology such as charged coupled devices (CCDs) to be used. Alternatively, higher resolution images can be achieved through use of standard equipment.
Referring now to Figure 2, a schematic representation of sensor footprints 23 of image sensors 19 located in a constellation of satellites is shown, such as the constellation of satellites in figure 1.
The footprints 23 indicate that most areas of land mass on the earth's surface do or will fall within the footprint of two or more satellites at any one point in time. This enables the system to collect two or more sources of data relating to particular areas on the earth's surface at the same time. Two or more sources of data enable data sets of three-dimensional data relating to these areas to be collected, or stereoscoping of the data.
Two or more data streams also enable the imaging system to select between different streams of image data for an area selected by an operator. This enables the system to selectively control the amount of data transmitted by any one satellite in the constellation. Where the satellites in the constellation have non-stationary orbits, overlapping sensor footprints do or will enable image data collection of selected areas to be handed off from one satellite to another.
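The hand-off between overlapping footprints described above can be sketched as a simple selection rule: for each frame, choose the covering satellite whose sub-satellite point is closest to the target. The data layout and the 330 km footprint radius are illustrative assumptions.

```python
# Sketch of selecting between overlapping data streams and handing collection
# off as satellites move. Satellite positions are assumed to be available as
# sub-satellite latitude/longitude; the 330 km footprint radius is illustrative.
from math import radians, sin, cos, acos

EARTH_RADIUS_KM = 6371.0

def ground_distance_km(lat1, lon1, lat2, lon2):
    p1, p2, dlon = radians(lat1), radians(lat2), radians(lon2 - lon1)
    return EARTH_RADIUS_KM * acos(
        min(1.0, sin(p1) * sin(p2) + cos(p1) * cos(p2) * cos(dlon)))

def choose_satellite(target, candidates, footprint_radius_km=330.0):
    """Pick the covering satellite whose sub-point is closest to the target;
    calling this for every frame hands collection off as satellites move away."""
    covering = []
    for sat in candidates:
        d = ground_distance_km(target[0], target[1], sat["lat"], sat["lon"])
        if d <= footprint_radius_km:
            covering.append((d, sat["id"]))
    return min(covering)[1] if covering else None

sats = [{"id": "sat_1", "lat": 0.5, "lon": 1.0}, {"id": "sat_2", "lat": 3.0, "lon": 3.0}]
print(choose_satellite((0.0, 0.0), sats))   # -> 'sat_1'
```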
Other embodiments may use aircraft instead of satellites. Such aircraft may include high altitude balloons, and unmanned remotely controlled aircraft. Such systems preferably utilise these aircraft in a similar manner to a constellation of satellites.
Referring now to Figure 3, a block diagram of an imaging system 11 according to the present embodiment is shown.
The system 11 comprises a ground station 300 that receives image data from the imaging platform 15 via a satellite receiver 320. The satellite receiver is in communication with a constellation of satellites 22 such as the constellation detailed in Figure 1. The constellation of satellites is represented by a single satellite 325.
The ground station 300 may be in communication with one or more other ground stations that are in turn in communication with different satellites 22 in the constellation 325.
Each ground station 300 may in turn be in communication with one or more imaging stations 330. A user 14 of the system logs into the imaging station by means of a communications network, such as a public switched telephone network (PSTN) or the Internet 305, using a personal computer (PC) 310. Each ground station 300 receives data from the satellites that it is in communication with. This data is processed by the ground station before it is forwarded to the imaging station 330. The imaging station 330 operates to combine the streams of data from the various ground stations 300 so that an image of various points on the earth's surface 13, preferably any point, can be viewed by a user 14 on their PC 310 in real-time or near real time.
The system has two levels of encryption for the image data. The first level of encryption applies to information transmitted between satellites 22 of the constellation 325 and a ground station 300, and a ground station 300 and an imaging station 330. The second level of encryption is applied to image data transmitted between an imaging station 330 and a user 14. The use of encryption enables the image data to be transmitted over a public network such as the Internet 305 without the data being interpreted by un-authorised users. The two levels of encryption isolate end users 14 from data transmitted by a ground station 300 to an imaging station 330.
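The two encryption levels can be sketched as below, using the Fernet symmetric scheme from the third-party 'cryptography' package purely as an example cipher; the patent does not specify an algorithm, and key distribution and management are omitted.

```python
# Two-level encryption sketch using the Fernet symmetric scheme from the
# third-party 'cryptography' package purely as an example cipher; the patent
# does not specify an algorithm, and key distribution/management is omitted.
from cryptography.fernet import Fernet

space_segment_key = Fernet.generate_key()  # level 1: satellite <-> ground/imaging station
user_key = Fernet.generate_key()           # level 2: imaging station <-> end user

def downlink(image_bytes: bytes) -> bytes:
    """Space segment side: protect data on the satellite-to-station leg."""
    return Fernet(space_segment_key).encrypt(image_bytes)

def imaging_station_forward(level1_token: bytes) -> bytes:
    """Imaging station: strip level-1 protection and re-encrypt for the user."""
    image_bytes = Fernet(space_segment_key).decrypt(level1_token)
    return Fernet(user_key).encrypt(image_bytes)

def user_receive(level2_token: bytes) -> bytes:
    return Fernet(user_key).decrypt(level2_token)

assert user_receive(imaging_station_forward(downlink(b"raw image frame"))) == b"raw image frame"
```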
Referring now to Figure 4, a flow chart detailing a method by which the embodiment of the imaging system operates is shown. The process of obtaining a real-time or near real-time image of a point on the earth's surface commences at step 400, where a user 14 selects, by means of an application running on a networked computing device 310, an area of the earth's surface 13 that the user wishes to view.
At step 405 the user's terminal transmits to the imaging station 330 request data that identifies the area that the user wishes to view, the resolution at which the area is to be viewed, and other information such as whether the image is to be a two-dimensional image, a three-dimensional image, a still image, or a video image. The image may be derived from data sets of three-dimensional information that are obtained from electromagnetic frequencies outside of the visible spectrum.
At step 410 the image station 330 identifies, from a database of reference images, reference image data that corresponds with the area identified by the user's terminal. At step 415 this reference image data is compressed and then encrypted and at step 420 the encrypted reference data is transmitted to the satellites whose footprints traverse the area selected.
At step 425 the selected satellites receive the data and store it in a module referred to as a compression/comparison module. The data is decrypted and decompressed before being stored in the compression/comparison module.
At step 430 image data received by the satellite's sensors is compared with the reference data stored in the compression/comparison module. This comparison enables data relating to an area selected by a user to be extracted from the received data.
Having extracted the required data from the received data, the satellite proceeds, at step 435, to compress the extracted data. This may be achieved by subtracting the reference data from the received data so that only the difference between the two signals is transmitted. Other means of data compression may be used; however, it is preferable that the compression technique be an inter-frame compression technique, such as a technique that transmits the difference between successive image frames.
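To make the reference-subtraction of step 435 concrete, the following is a minimal sketch only; the use of NumPy arrays, 8-bit imagery and zlib as the entropy coder are illustrative assumptions rather than features of the specification.

```python
import numpy as np
import zlib

def compress_frame(frame: np.ndarray, reference: np.ndarray) -> bytes:
    """Keep only the difference between the sensed frame and the reference
    image held in the compression/comparison module (step 435)."""
    diff = frame.astype(np.int16) - reference.astype(np.int16)  # signed difference
    return zlib.compress(diff.tobytes())                        # entropy-code before encryption

# Example: an 8-bit frame that differs from its reference only in a small region.
reference = np.zeros((512, 512), dtype=np.uint8)
frame = reference.copy()
frame[100:110, 200:220] = 180                                   # new activity on the ground
payload = compress_frame(frame, reference)
print(f"{frame.nbytes} raw bytes reduced to {len(payload)} bytes for downlink")
```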
Once the requested data has been compressed it is then encrypted before the data is transmitted, at step 440, to a ground station.
At step 445 the compressed data received by the ground station is decrypted and expanded before it is transmitted to the imaging station for combining with other received data, after which it is transmitted to the requesting user. The process then terminates at step 450.
To enable a satellite to image more than one area at a time, a parallel computing structure is adopted. Such a structure enables the process of Figure 4 to be repeated in parallel for a number of selected areas. Referring now to Figures 5A and 5B, block diagrams of an embodiment that utilises a sensor system installed on the imaging platform of some or all of the satellites 22 in a constellation of satellites, such as the constellation of satellites in Figure 1, are shown. Figure 5A provides a general overview of the configuration of the sensing system whilst Figure 5B provides a more detailed view of the configuration.
The sensor system comprises a sensor 500 that detects electromagnetic radiation emitted from the earth's surface in a predetermined range of frequencies and polarisations such as micro-wave and infra-red radiation, visible radiation and ultra-violet and gamma radiation.
As shown in Figure 5B, the sensor 500 outputs its received data to a compression module 505 forming part of a compression/encryption stage 507. This data is output on a frame-by-frame basis at a rate of 25 frames per second. The compression module 505 also receives reference data from a data store 510 after it has been treated by a control/processing stage 512. The data store 510 also receives data from the sensor for use in compression of the data by a technique such as an inter-frame compression technique.
The compression module 505 operates to compress each frame of data received from the sensor in accordance with a compression algorithm. Once a frame of data is compressed, the compression module 505 forwards the frame to an encoder 515, which operates to encode each frame, the encoder also forming part of the compression/encryption stage 507. Once the frame has been encoded it is transmitted by a transmitter 520 to another satellite 22 from where it is transmitted to a ground station 300. Alternatively the satellite's orbit may enable it to transmit encoded frames directly to a ground station.
The onboard satellite sensor system also comprises a receiver 525 that receives data, such as reference data either directly from a ground station 300 or from another satellite 22. This data is forwarded to a decoder and expander module 530 that forms part of a decompression/decryption stage 532. The decoder and expander module 530 operates to decode the received data and to expand the received data for use by a control module 535 and for storage in the data store 510.
The control module 535 receives expanded data from the decoder and expander module. The control module 535 operates to compare data received from the sensor 500 with reference data received from the decoder and expander module 530. This comparison enables the control module 535 to determine which portion of the data fed to the data store 510 from the sensor 500 corresponds with data received from the decoder and expander module 530.
Having determined which portion of the data received by the sensor 500 is required for transmission to a ground station, the control module 535 controls the extraction and compression processes performed by the compression module 505 on the data received by the sensor 500.
Referring now to Figures 6A and 6B, block diagrams of a ground station 300 and an imaging station 330 according to the present embodiment are detailed. These figures show two slightly different configurations that may be adopted for the ground station and imaging station arrangements.
Commencing firstly with the configuration shown in Figure 6A, compressed data transmitted by a satellite is received by receiver 605, which passes the received data onto a decoder 610. The decoder 610 under the operation of control module 615 decodes the received data before passing it onto a decompression module 620 and control module 615.
Control module 615 processes the data to detect the area on the earth's surface that has been transmitted by the satellite. Having determined this area, the control module instructs the decompression module 620 to access reference data from the reference data store 625 that corresponds with the area imaged by the satellite. This reference data enables the decompression module 620 to perform the reverse of the compression process performed onboard the satellite. The expanded data is then passed to an image combining module 630 that operates to combine the decompressed image with other image data, possibly collected by other satellites, so as to produce a real-time or near real-time image of the selected area. The image is compressed by means of an image compression technique, such as an enhanced compressed wavelet technique, and is then transferred via a PSTN or Internet network using image transferring software (such as Image Web Server, available from Earth Resource Mapping (ERM), www.ermapper.com) to an updating module 640 that is housed in an imaging station 330. The updating module 640 transfers the updated image to an image data store 645. The image data store is accessible by users of the system through a server 650 using image transferring software, such as the Image Web Server referred to above.
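The ground-side reversal and the image-combining step can be illustrated with the following sketch; the patch geometry, the zlib decoding and the simple paste-into-canvas mosaicking are assumptions made for illustration and merely stand in for the decompression module 620 and the image combining module 630.

```python
import numpy as np
import zlib

def expand_patch(payload: bytes, reference_patch: np.ndarray) -> np.ndarray:
    """Reverse the onboard differencing: decoded difference plus reference."""
    diff = np.frombuffer(zlib.decompress(payload), dtype=np.int16)
    diff = diff.reshape(reference_patch.shape)
    return (reference_patch.astype(np.int16) + diff).astype(np.uint8)

def combine_patches(canvas_shape, patches):
    """Place each satellite's expanded patch at its (row, col) offset within
    the selected area, in the manner of the image combining module 630."""
    canvas = np.zeros(canvas_shape, dtype=np.uint8)
    for (row, col), patch in patches:
        canvas[row:row + patch.shape[0], col:col + patch.shape[1]] = patch
    return canvas

# Illustrative use: two satellites each cover half of a 200 x 400 pixel request.
ref = np.full((200, 200), 90, dtype=np.uint8)
left = zlib.compress((np.full((200, 200), 95, dtype=np.int16) - ref).tobytes())
right = zlib.compress((np.full((200, 200), 80, dtype=np.int16) - ref).tobytes())
mosaic = combine_patches((200, 400), [((0, 0), expand_patch(left, ref)),
                                      ((0, 200), expand_patch(right, ref))])
print(mosaic.shape, mosaic[0, 0], mosaic[0, 399])                # (200, 400) 95 80
```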
In the alternative configuration shown in Figure 6B, the receiver 605 of the ground station 300 receives compressed data from a satellite and passes it through a decompression/decryption stage 655, whereby the expanded data is stored in a data storage area 660. This expanded data is then processed by an image formation module 665 to form an initial image, which is then compressed and encrypted via a compression/encryption stage 670 for transmitting via a PSTN or the Internet to the imaging station 330 where it may be further processed for the user requesting the same.
The imaging station 330 comprises a decompression/decryption stage 675 for expanding the compressed and encrypted image received from the ground station 300 and a data storage area 680 for storing the expanded image data. The expanded image data is then processed further by an image processing module 685 to improve the image before being compressed and encrypted via a compression/encryption stage 690 and delivered to the user via the server 650 and the Internet 635.
An alternate embodiment may operate to transfer decoded data directly from the decoder 610 to the updating module 640. The updating module may then perform the decompression and image-combining functions of the decompression module 620 and the image-combining module 630 respectively. Advantageously, the processing of the data may be performed by high-powered personal computers (PCs) and workstations, such as those using Microsoft's™ Windows™ operating system or the UNIX™ and LINUX™ operating systems, which also run on Sun Microsystems™ workstations.
Referring now to Figure 7, a schematic representation of an imaging platform 15 according to the present embodiment is shown, comprising a combined passive and active sensor arrangement that together form an imaging sensor module 19.
A passive sensor is used for passive remote sensing which is a process that records terrestrial and reflected extraterrestrial electromagnetic radiation. Active sensors are used for active remote sensing which is a process that generates electromagnetic radiation to illuminate the surface of the planet and to record a reflected portion of the transmitted signal.
The imaging platform 15 comprises three passive sensors, namely a lagging sensor 705, a central sensor 710 and a leading sensor 715. Adjacent each passive sensor is located an active sensor. These active sensors comprise a lagging sensor 720, a central sensor 735 and a leading sensor 740. Each active sensor comprises a transmitter 725 and a receiver 730. Multiple passive/active sensor arrangements may be required for sensing different frequencies of the electromagnetic radiation.
The transmitter 725 illuminates a preselected area of the earth's surface with electromagnetic radiation. The receiver 730 detects the radiation reflected by the earth's surface in the band of frequencies transmitted by the transmitter 725.
Each passive sensor comprises a bundle of wave-guides, such as optic fibres, which interface to a fixed focal length lens. This combination of lens and optic fibres forms a scanning head that scans areas of the earth's surface within the sensor's footprint. Each of the optic fibres is linked to a sensor such as a charge-coupled device (CCD). The CCD sensor may be located at a convenient point on the spacecraft. Advantageously, CCD sensors are lightweight and robust, allowing them to be located on a satellite with relative ease. They are also relatively inexpensive.
In the present embodiment, each CCD has its own controller that enables it to compare image data collected by its optic fibres with the reference data received from the ground station.
The three passive sensors operate to provide stereo imaging of the earth's surface. Provision is made for selecting between the three sets of image data collected by the sensor 700 in order to produce a three-dimensional image of a selected area on the earth's surface.
The three active sensors 720, 735 and 740 respectively, also have their image data combined to form a three-dimensional image of the electromagnetic data received from the earth's surface.
In operating the active sensors, it is preferable that the passive sensors first identify the area of the earth's surface that is to be imaged by the active sensors. Once the passive sensors have identified the requested area, the active sensors are then directed towards the requested area, whereupon the transmitters 725 are operated to illuminate the selected area with electromagnetic radiation.
Within the system it is preferable that an adaptive process is used to perform comparisons of image data collected by the sensors with reference data received from a ground station. An example may be a system where an initial comparison is performed, based on wavelet parameters. Should the correlation of the wavelet parameters fall within predefined values then a more detailed comparison may be performed.
In producing a three-dimensional image, an alternate system collects two or more sets of image data from one or more satellites. These separate sets of image data are combined by the ground station to generate a three-dimensional image of a selected area.
The compression scheme used is a high-level compression scheme that is adapted to transmit an update of the reference image from the satellite to the ground station, rather than transmitting all of the data collected by the sensors.
Where data from two satellites is used to produce a three dimensional image, interfacing between these two satellites is provided in order to ensure that the combined images are compatible to produce a three-dimensional image. Similarly, where data from two or more satellites is combined to produce an image that extends across two footprints, interfacing between the two satellites is also required to ensure that the data can be combined efficiently.
In order to overcome problems of pitch, roll, yaw, and orbit variations of the satellite, the sensors on the satellite continually supply image data to a ground station or the imaging station at a low data rate. The ground station or imaging station operates to geometrically and radiometrically correct the received data to remove cloud and other influences, whereby an accurate image of the planet is maintained by an imaging station in near real time. Thus, in this arrangement, the compressed data transmitted by the satellites represents an update of the current vegetative and human activity on the earth's surface.
The comparison process between the received data and reference data is based on several factors such as: histograms in each received band; edge structures; band ratios; textures; time; polarisations; power attenuation (active); and known cavities.
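The adaptive comparison described above, an inexpensive wavelet-parameter check followed, only when warranted, by a more detailed comparison using factors such as per-band histograms and edge structures, might be sketched as follows; the Haar-style approximation, the thresholds and the weighting are all illustrative assumptions.

```python
import numpy as np

def haar_approximation(img: np.ndarray, levels: int = 3) -> np.ndarray:
    """Coarse wavelet-style approximation: repeated 2x2 averaging (Haar LL band)."""
    a = img.astype(np.float64)
    for _ in range(levels):
        a = 0.25 * (a[0::2, 0::2] + a[1::2, 0::2] + a[0::2, 1::2] + a[1::2, 1::2])
    return a

def detailed_score(img: np.ndarray, ref: np.ndarray) -> float:
    """Finer comparison using two of the listed factors: histograms and edge structure."""
    h_img, _ = np.histogram(img, bins=64, range=(0, 255), density=True)
    h_ref, _ = np.histogram(ref, bins=64, range=(0, 255), density=True)
    hist_dist = float(np.abs(h_img - h_ref).sum())
    edge_dist = float(np.abs(np.gradient(img.astype(float))[0]
                             - np.gradient(ref.astype(float))[0]).mean())
    return hist_dist + 0.01 * edge_dist           # weighting chosen arbitrarily for the sketch

def adaptive_compare(img: np.ndarray, ref: np.ndarray, coarse_threshold: float = 0.9):
    """Cheap wavelet-parameter check first; the detailed pass runs only when the
    coarse correlation falls within the predefined acceptance range."""
    coarse = np.corrcoef(haar_approximation(img).ravel(),
                         haar_approximation(ref).ravel())[0, 1]
    if coarse < coarse_threshold:
        return None                                # scenes too different; skip detailed pass
    return detailed_score(img, ref)

rng = np.random.default_rng(0)
scene = rng.integers(0, 255, (256, 256)).astype(np.uint8)
print(adaptive_compare(scene, scene))              # identical scenes -> score of 0.0
```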
Other embodiments are provided that use satellites and/or aircraft with non- stationary orbits to collect three-dimensional data through use of a single sensor (whether active or passive). To collect such three-dimensional data, electromagnetic radiation from the same area on the earth's surface is sensed from two different points on the satellite's orbit. This provides two separate data sets on the one area that are recorded with different attitudes between the satellite and the area. The two sets of data on the one area with separate aspects enables three-dimensional data on the area to be produced.
In the present embodiment, telecommunications signals received by telecommunications satellites are also processed. The signals are processed to identify noise produced by objects that reflect transmitted signals. Processing this noise enables information relating to the reflecting objects to be obtained. Similar processing may be performed on signals received by base transceiver stations of cellular telecommunications systems.
Referring now to Figures 8a and 8b, representations of the radio and microwave spectrums respectively and the frequency bands over which various types of navigation, communication and radar systems operate, are shown. These are examples of man made radiation that can be detected and be used for generating a representation of the earth.
Figure 8a identifies the well known band designations of the radio spectrum as ULF, VLF, LF, MF, HF, VHF, UHF, SHF and EHF, and the different types of communication media that utilise these bands.
Figure 8b is a graphical representation of the microwave spectrum that extends between approximately 0.3 GHz and 100.0 GHz. The microwave spectrum is divided into a number of bands of frequencies identified by the letters P, L, S, C, X, K, Q, V and W. Various frequencies allocated to these bands are detailed in Figure 8b. Signals detected in these bands should correspond with transmitting devices of the systems detailed in Figure 8a.
Referring now to Figure 9, a graph of the transmission characteristics of electromagnetic radiation versus frequency in gigahertz is shown. The transmission axis represents the degree of attenuation caused by the earth's atmosphere on a one-way (i.e. passive) transmission of electromagnetic radiation emitted from the earth's surface and received by a satellite in space. This transmission diagram is measured under clear sky conditions. The transmission diagram shows that water vapour in the earth's atmosphere absorbs electromagnetic radiation at two frequencies, of approximately 22 GHz and approximately 180 GHz. The diagram also shows that oxygen in the atmosphere absorbs electromagnetic radiation emitted from the earth at two frequencies, of just under 60 GHz and of approximately 120 GHz.
The transmission diagram also demonstrates that, in general terms, higher frequency electromagnetic radiation is attenuated by the atmosphere to a greater degree than lower frequency electromagnetic radiation. It should be noted that the 100% transmission as defined on the vertical axis is the transmission of 1 GHz radiation through the earth's atmosphere.
The transmission diagram demonstrates that at certain frequencies the earth's atmosphere operates as a band stop filter. Frequencies outside of these bands may be detected by use of sensors on board satellites. By detecting frequencies close to these bands, data relating to the present composition of the earth's atmosphere may be collected. As the composition of the earth's atmosphere changes, the transmission properties of the earth's atmosphere should change. Other windows and band stop frequencies may occur at different frequencies as the composition of the atmosphere changes.
The transmission diagram also demonstrates that, outside of these attenuating bands, the earth's atmosphere presents a number of natural windows, such as the windows centred on 35 GHz, 90 GHz and 135 GHz, which provide local maxima in the atmosphere's transmission properties.
Through the use of these band stop filter frequencies, and these window frequencies in combination with a constellation of satellites, data representing a snap shot of the earth's surface and the earth's atmosphere, or at least a predetermined portion of the earth's atmosphere and the earth's surface may be collected and transmitted to a ground station for later processing.
The use of a constellation of satellites also enables two-dimensional and three- dimensional data to be collected. This data may also be collected in real-time or near real-time and may be presented as either two dimensional or three dimensional data at video frame rates of 25 frames per second. In this way live data relating to electromagnetic radiation emitted from the earth's surface may be detected.
Referring now to Figure 10, a table detailing frequency allocations for passive sensors is shown. The frequencies are represented in gigahertz and the data in the table is labelled according to the applications that are allowed to operate in the nominated frequency bands. For example, the band of frequencies between 0.404 and 0.406 GHz is labelled with an "a". This indicates that these bands of frequencies are reserved for radio astronomy and hence no transmission of electromagnetic radiation at these frequencies from the earth's surface is permitted. Those frequencies labelled with a "p" operate within a band of frequencies whose primary use is for applications that transmit electromagnetic radiation. Those frequencies labelled with an "s" are within a band of frequencies whose secondary use is for applications that transmit electromagnetic radiation.
Figure 11 details radar frequency allocations, in gigahertz, for active remote sensing applications. Active remote sensing is the use of electromagnetic radiation to illuminate a selected area on the earth's surface. Electromagnetic radiation reflected from this selected area due to this illumination is detected by the satellite performing the illumination. These systems are active systems in the sense that they transmit electromagnetic radiation.
The imaging system of the present embodiment utilises these and other frequency allocations to collect data relating to electromagnetic radiation emitted from the earth's surface that is outside of the visible spectrum of electromagnetic radiation. It is preferable that further frequencies such as those representing infra-red radiation, ultra violet radiation and gamma radiation are detected.
Referring now to Figure 12, a representation of the electromagnetic spectrum from a frequency of approximately 1 Hertz up to a frequency of 10²² Hertz is illustrated. The horizontal lines indicate the approximate spectral ranges of various physical phenomena and practical applications. The spectrum details certain frequency bands such as infra-red radiation, ultra violet radiation, visible radiation, cosmic rays, microwaves, etc. Further information, such as the frequency bands where "electron spin magnetism", "molecular rotation and vibration", and "molecular energy in atoms and molecules" occur, is also detailed.
Location of passive sensors onboard a satellite, tuned to these various frequencies of naturally occurring radiation, enables data relating to these and other phenomena to be collected. As the data is collected by satellites within the satellite constellation of Figure 1, real-time or near real-time representations of these phenomena for most of the earth's surface and/or atmosphere can be collected at any one time. Such a composite representation enables behaviour of these phenomena at various points on the earth's surface and/or atmosphere to be observed and composed.
Similarly, radiation from communication systems is also used to collect data that enables a representation of the radiation emitted by communication systems on the earth, to be developed. Other naturally occurring sources of electromagnetic radiation are also detected and are used for generating an image/representation of the earth.
Transmissions from mobile phones and other satellite phones, and the reflections of these transmissions off buildings and other objects on the earth's surface, are detected by a sensor on a satellite. Collection of this data from mobile telephones and other telecommunication systems is also used to generate a data representation, that is a graphical representation, of the earth's surface or at least part of the earth's surface.
Advantageously, much of the radiation that emanates from the earth will pass through cloud and water vapour, enabling it to be detected in the presence of cloud bands and other forms of meteorological phenomena. These phenomena will also be detectable at night as they occur at frequencies outside of the visible spectrum.

Now describing a second specific embodiment, reference is made to figures 13 to 31.
The second embodiment is substantially similar to the first embodiment, being directed towards an imaging system comprising a large constellation of satellites that can image all, or some portion, of the earth's surface continuously and transfer this information to an end-user with minimal delay. A particular feature of the present embodiment is the production of 'movies' of user-selected portions of the earth. This can be achieved from continuous observation, and a sufficiently fast revisit period.
In the present embodiment, the real-time imaging system is sub-divided into two distinct sections: firstly, an imaging portion, which relates to how the information is acquired, and secondly a non-imaging portion, which relates to how the information is processed and transferred to the end-user. The requirements for the first section are large-area (preferably whole earth) coverage at high spatial resolution (approximately 1 m or less) combined with high temporal resolution (preferably approximately 25 'images' per second or more, to allow 'movies'), and for the second section, very fast transfer and processing.
Current imaging satellite systems do not meet these requirements for several reasons. This is chiefly due to the very large number of satellites needed to achieve the high space and time resolution and also because imaging satellites can, at present, only transfer data when they are within range of a ground station.
The embodied method for meeting the requirements for good coverage and fast transfer combines a large number of satellites utilising optical and microwave sensors with a network of inter-satellite links. The first feature allows high spatial and temporal resolution under all lighting conditions while the second will allow transfer of data when a given satellite is not in range of any ground station. The image processing and associated support systems will be similar to existing satellite systems. A top-level schematic of the imaging process is shown in figure 13. The process commences with the user requesting an image; this request is then processed to determine how the request is to be implemented and then the information is transferred to the imaging satellite. The satellite's sensors acquire the requested image, which is then transferred to ground and processed. Lastly, the image is displayed on a visual output device for the user.
The system to implement this process is shown in figure 14 and will be a combination of:
1. A large satellite constellation size, giving global coverage.
2. A variety of sensors on each satellite, giving high resolution under all conditions.
3. Inter-linking of the satellites, giving real-time image transfer.
The real-time aspect of the imaging system means that the user-request will be real-time interactive, in contrast to all other satellite imaging user-request interfaces. These interfaces have a similar 'preview' system but then deliver the requested image in non-real-time so that the user must wait until the image is received before making any further requests based on that image. Since the images are displayed with no (or minimal) delay after the request it is possible for the user to interactively make requests based on the real-time images being displayed.
From a user's point of view, the system operates via a computing device connected to a network, such as the Internet or a direct subscriber network such as a news organisation or government entity. The connection, or communications link, may include land-lines, microwave links and fibre-optics. The system is secured against unauthorised usage by some suitable means such as a password requirement or the need for proprietary software. The procedure for the user request is shown in figure 15.

In operation, a user firstly logs in to the system by means of a communications network, such as a public switched telephone network (PSTN) or the Internet or another method. The user next activates an earth imaging application, and selects a region of the Earth in which they are interested, which may include zooming in to refine the target area. Having selected the area of interest the user is, possibly, then presented with several options such as the required resolution, the frequencies to be sampled, e.g. optical, infrared or microwave, and a choice of view format, e.g. two- or three-dimensional images of the selected area, or static or video images in real or near real-time. Finally, the user's terminal transmits the total request information to the imaging server by means of a communications network, such as a public switched telephone network (PSTN) or the Internet or another method.
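A request assembled in this way might be represented as a simple structured record before serialisation and transmission; the field names, units and JSON encoding below are purely illustrative assumptions and not part of the described message format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ImageRequest:
    """Illustrative request record built from the user's selections."""
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float
    resolution_m: float = 1.0            # requested ground resolution in metres
    bands: tuple = ("optical",)          # e.g. "optical", "infrared", "microwave"
    view: str = "2d"                     # "2d" or "3d"
    mode: str = "video"                  # "still" or "video"
    frame_rate_hz: float = 25.0          # only meaningful for video requests

# The user's terminal would serialise something like this for transmission.
request = ImageRequest(-32.1, -31.9, 115.7, 115.9, bands=("optical", "microwave"))
print(json.dumps(asdict(request)))
```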
The system that the user logs in to has a number of functional elements. These can be separated into:
I) The ground station which handles the transmission and reception of all data, both acquired image data and system control and state information.
II) The Telemetry, Tracking and Control (TT&C) center associated with the ground station which generally performs lower level satellite control operations.
III) The Satellite Control Center (SCC), which handles the overall control of the satellite network and generally provides higher-level satellite control.
IV) The Image Station, which provides the image processing and associated processing for the ALTRAV system.
These functional elements and their interrelationships are illustrated in figure 16.
Note that there will preferably be several ground stations and associated TT&C centers all able to communicate with one another and with the image stations and the SCC. For processing reasons and for scheduling it is also preferable for the image stations to be able to communicate with each other. For the case where a user has a dedicated ground station this could also have an associated image station/s.
The flowchart for the request processing and scheduling process is shown in figure 17. When the user request is received by the imaging station it is initially processed to determine the location (i.e. latitude, longitude and area coverage) and time (i.e. start and finish times) required. This information is processed to determine whether the user has the right to access the image information requested. If the request is allowed, then the system determines what resources are required, in terms of satellites and sensors, and this information is then used for scheduling. Resource allocation is done by each imaging station, which would be in communication with other stations so as to determine the total requests at any one time. Alternatively the resource allocation could be done in some more centralised fashion. Scheduling/resource allocation would be performed using either existing scheduling software/methods or some other system.
To determine the resource requirements the system would first obtain the position of all satellites throughout the duration of the request, which would be calculated from a knowledge of their orbital speeds and trajectories using orbital dynamical theory. From this, a subset of those satellites which would satisfy the user-request requirements could be determined. The system would also calculate when each individual satellite should begin and end image acquisition. From these satellites and their sensor specifications the system would then determine what sensor settings would be needed; this would include the sensor pointing directions and/or sensor selection, if there is more than one sensor on the satellite as is the preferred case. The system would then schedule the requested imaging task using the existing tasks, their resource requirements and assigned priorities, if appropriate.
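The selection of a satellite subset able to satisfy a request might, in its simplest form, reduce to a footprint-containment test against propagated sub-satellite positions, as in the sketch below; the spherical-earth distance, the flat-earth footprint radius (orbit height × tan α, as used later in this description) and the example coordinates are all assumptions for illustration.

```python
import math

EARTH_RADIUS_KM = 6378.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    cos_angle = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dlon)
    return EARTH_RADIUS_KM * math.acos(max(-1.0, min(1.0, cos_angle)))

def footprint_radius_km(orbit_height_km=640.0, scan_angle_deg=27.0):
    """Footprint radius = orbit height x tan(scan angle), as used later in the text."""
    return orbit_height_km * math.tan(math.radians(scan_angle_deg))

def satellites_covering(target_lat, target_lon, subsatellite_points):
    """Subset of satellites whose footprint contains the requested point.
    subsatellite_points: {sat_id: (lat, lon)} as produced by orbit propagation."""
    radius = footprint_radius_km()
    return [sat for sat, (lat, lon) in subsatellite_points.items()
            if great_circle_km(target_lat, target_lon, lat, lon) <= radius]

# Illustrative propagated positions at one instant of the request window.
positions = {"SAT-041": (-31.8, 115.6), "SAT-112": (-25.0, 131.0)}
print(satellites_covering(-32.0, 115.8, positions))          # -> ['SAT-041']
```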
In addition to the processing of request information, the system would preferably determine the optimal uplink and downlink transmission paths for the request information and captured image. This is done at a TT&C center, which would also determine any other satellite housekeeping information that may need to be transferred (uplinked) to the satellite. The routing would require a knowledge of the state of the network, in terms of available inter-satellite links (ISLs), existing network traffic, and the position of the satellites and ground stations for earth-satellite transfer. The availability of ISLs will depend on the physical link state and the position and orientation of the satellites for inter-orbital plane ISLs. The system determines, using standard network theory, the fastest route for request and other information to reach the satellite (here termed uplink), and the fastest path for image and other information to return, via the network, to ground (here termed downlink). The satellites also have some routing capabilities so that the data routing is done 'in transit'.
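The 'standard network theory' route computation can be illustrated with an ordinary shortest-path search over a snapshot of the available links; the node names, latency weights and the use of Dijkstra's algorithm here are illustrative assumptions, not a statement of the routing method actually adopted.

```python
import heapq

def shortest_route(links, source, destination):
    """Dijkstra's algorithm over the currently available ground/inter-satellite links.
    links: {node: {neighbour: latency_ms}} describing the instantaneous network."""
    best, previous = {source: 0.0}, {}
    queue = [(0.0, source)]
    while queue:
        cost, node = heapq.heappop(queue)
        if node == destination:
            break
        if cost > best.get(node, float("inf")):
            continue
        for neighbour, latency in links.get(node, {}).items():
            new_cost = cost + latency
            if new_cost < best.get(neighbour, float("inf")):
                best[neighbour], previous[neighbour] = new_cost, node
                heapq.heappush(queue, (new_cost, neighbour))
    if destination not in best:
        return None
    path, node = [destination], destination
    while node != source:
        node = previous[node]
        path.append(node)
    return list(reversed(path)), best[destination]

# Illustrative snapshot: a ground station, two relay satellites and the imaging satellite.
links = {"GROUND": {"SAT-A": 20.0},
         "SAT-A": {"SAT-B": 15.0, "IMAGER": 60.0},
         "SAT-B": {"IMAGER": 10.0}}
print(shortest_route(links, "GROUND", "IMAGER"))   # -> (['GROUND', 'SAT-A', 'SAT-B', 'IMAGER'], 45.0)
```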
Once the total information to be transmitted has been obtained, including image acquisition, routing, satellite control information and image reference database information, it is then transmitted by the ground station. Some form of compression is used on the data to reduce the amount of information which must be transmitted.
Satellite to satellite communication is necessary to achieve a wide global coverage of the whole earth, or partial coverage, in as near to real time as possible.
The real-time data transfer component of the imaging satellite system is similar to the existing telecommunications network and has certain distinct elements. Firstly, there is the method for communicating between satellites, the physical layer, which is characterised by the physical transfer method used (e.g. laser or radio) and the means of implementing this. Secondly, there is the network layer, or linking scheme, used to route the data either up to the target imaging satellite (uplink) or back down from the imaging satellite to ground (downlink). In addition, there is also the protocol method used for the transmission of data across the links.
In the physical layer, inter-satellite links (ISLs) are provided. When transferring data between two satellites there are two methods that may be used, radio links or laser links. In the present embodiment the system uses laser links as these have higher capacity, are more concentrated and require less power than radio links. Proposed systems have capacities in the order of Gigabits per second. Also, due to the optical frequency of the laser, there is no possibility of interference with radio communications. The ISL system is small, light and consumes relatively little power.
A system using laser ISLs must provide a means for producing a laser beam, a method for integrating the image data with the laser carrier and a method for directing this beam towards another satellite. In addition to this, the system needs a method for receiving the laser signals from other satellites and relaying these to another satellite or to ground. An example of a system designed for the telecommunications satellite constellations is shown in figure 18.
The system includes a method of tracking the ISL beam from the imaging satellite to the data-relay satellite. A similar system to that used with the SILEX/SPOT4 system is used. In this method, the satellites 'search' for each other using a wide angle beam to roughly locate each other. Next, a finer beam is used to refine this pointing and finally, when the satellites are aligned, they 'lock-on' to each other and data-transfer begins. The ISL system therefore needs some method of directing the beam to allow for the relative motion of the sending and receiving satellites. This is either a movable mirror (as in figure 18) or an optical phased array, similar to the phased antenna arrays used in radio.
The other critical feature in the data transfer chain is the initial link between the ground station and the space-based network segment. This physical link is a microwave link similar to existing uplinks in communication systems. The attenuation of a microwave uplink is lower than that of optical links. In other embodiments it is possible to use some other link type, such as optical lasers, which have higher capacity.
In addition to the physical method of transferring data between satellites, it is necessary to have some method of routing the data via these satellites to earth.
This will depend on the network between the satellites, i.e. how and in what fashion they are linked. Satellites in a constellation are generally arranged in several discrete orbits (or orbital planes) with multiple satellites in each orbit. Links can then be intra-plane or inter-plane. In the present embodiment both are used, as shown in figure 19. This sort of linking arrangement is used in telecommunications constellations, for instance the Teledesic network shown in figure 20. Within an orbital plane, satellites stay at fixed distances apart so it is simpler to link intra-plane rather than inter-plane where the satellites may move relative to each other.
There are several methods suitable for transferring image data from a constellation of satellites to a ground station. For simplicity of explanation, it is assumed that the system has only one ground reception station, however, in the preferred embodiment, a multiplicity of ground stations are used.
There are a large number of configurations that allow real-time data transfer, due to the combination of imaging/non-imaging satellites used and the orbits which these satellites are in.
Possible network configurations for the ISL network are the following:
1. Imaging Satellite → Imaging Satellite → Ground

2. Imaging Satellite → Other Orbit → Ground

3. Imaging Satellite → Communications (non-imaging) Satellite → Ground
Possible orbits are:
1. GEO based satellites
2. MEO based satellites
These are shown in figure 21.
In the networking scheme adopted in the present embodiment, the imaging system has a data transfer path which goes between the imaging satellite and either directly to ground, if this is possible, or via one or more 'routing satellites' if direct ground contact cannot be achieved. These 'routing' satellites may or may not have imaging capabilities. Imaging capabilities are not necessary for the data transfer portion of the system, however, in the present embodiment, the 'routing' satellites also have an imaging capacity. The 'routing satellites' are GEO-, MEO- or LEO-based, depending upon the particular embodiment chosen.
With the intermediate satellites having imaging capabilities, there are two options. Firstly, all of the imaging satellites are purpose-built for the variety of sensors. This requires adequate resources and the communications and data-management network/infrastructure to support this. The system is also designed to minimise the time from the image request to the display of the image, by optimal use of the satellites and their communications capabilities. This gives maximum flexibility to the design of the overall system.
With the second option, all the satellites are from a third party design and the system's sensors are 'piggy-backed' onto them. An example of this would be to place the sensors onto Teledesic's proposed satellite constellation. One main advantage of using third-party satellites is that the communications infrastructure is already in place, thus reducing costs and time. However, since the third-party satellites are not specifically manufactured for the imaging system of the present embodiment, significant redesign of the third-party system may be necessary. For example, space on the platform would be needed to house the sensors, and the imaging system would need to be integrated into the existing one provided on the particular platform. This second issue may not be too difficult for a proposed 'multi-media internet-in-the-sky' system, like Teledesic, as these are designed to handle image data as well as other forms of data, so the onboard switching would be tapped into and utilised. However, if an existing telecommunications constellation was to be used, the integration of the sensors with the on-board processors may prove to be more difficult, as these satellites only deal with speech data and not images or movies.
In one embodiment where the relay satellites do not have imaging capabilities, the relay satellites are in geostationary orbit (similar to the system planned for use with SPOT4), in Medium Earth Orbit (MEO), or in LEO. The system operates in a similar fashion to that shown in figure 22.
Advantages of using geostationary relay satellites are their large area of coverage, meaning that only a few are needed, and their stationary position relative to the earth's surface, which simplifies the ground reception.
In another embodiment, a network of MEO satellites, equipped with ISLs, are used in a similar manner to relay the image data back to earth. Using a network of MEO satellites has the advantage of requiring a smaller number than a LEO constellation and the satellites do not have to be designed for imaging but only for communications.
In a further embodiment still, using a LEO constellation for imaging, it is advantageous to design 'routing' satellites with no imaging capabilities.
Another scheme using 'non-imaging' routing satellites in a further embodiment is to relay the image data from the sensing satellites to ground via a separate constellation/network of communications satellites. These satellites are distinguished from the 'imaging' relay satellites described above by being separate to the imaging system. For example, the satellites are from the Teledesic system or some other external provider. Advantages of using third-party relay satellites are their large area of coverage and inter-satellite links (ISLs), which are important for providing 'real-time' ground reception. Additionally, by not being a direct part of the imaging system, the development and maintenance of the transfer-to-earth portion of the system is simplified.
In addition to providing the actual network linking, particular communication protocols are used for transferring data along the links. For both the ground-satellite and inter-satellite links, a system specification for the data format is made to allow optimal data transfer rates. The data transfer protocol has provision for error-checking. The method of operation for the real-time uplink used in the present embodiment is shown in figure 23. The procedure commences with the transfer of compressed and encrypted data from the ground station to a space-based network node (i.e. a satellite). This satellite examines the data to determine if it is the intended recipient of the information. If not, it determines which inter-satellite link to use for data transfer, either by examining the data for routing information or by performing some routing to determine the next optimal step in the data transfer chain.
When the ISL has been determined the satellite then attempts to establish the physical link to the intended satellite. Slightly different methods may be required depending on whether the link is intra- or inter-plane, however the general approach is as described above. Once the link is established, the data will be transferred using the designated communications protocol. Finally, the receiving satellite examines the data to determine if it is the intended recipient of the data and takes the appropriate action.
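The per-hop decision just described (deliver if this node is the intended recipient, otherwise pass the data along the embedded route or an on-board routing choice) can be reduced to a few lines; the packet layout and the fallback of simply taking the first available link are illustrative assumptions.

```python
def forward(packet, node_id, available_links):
    """One hop of the figure 23 uplink procedure for a single network node."""
    if packet["destination"] == node_id:
        return ("deliver", packet["payload"])          # this satellite services the request
    route = packet.get("route", [])
    if node_id in route and route.index(node_id) + 1 < len(route):
        next_hop = route[route.index(node_id) + 1]     # follow routing information in the data
    else:
        next_hop = next(iter(available_links))         # placeholder for on-board routing
    return ("transfer", next_hop)                      # establish the ISL, then transmit

packet = {"destination": "IMAGER", "payload": b"compressed+encrypted request",
          "route": ["GROUND", "SAT-A", "SAT-B", "IMAGER"]}
print(forward(packet, "SAT-A", {"SAT-B": 15.0}))       # -> ('transfer', 'SAT-B')
print(forward(packet, "IMAGER", {}))                   # -> ('deliver', b'compressed+encrypted request')
```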
The overall method of operation for real-time downlinking used in the present embodiment is shown in figure 24. It commences with the transfer of compressed and encrypted data from the imaging satellite to the ground station, if this is possible, or to a space-based network node (i.e. a satellite). If direct transfer to ground is not possible, the satellite determines which ISL to transfer the data across, either by examining the data for routing information or by performing some routing to determine the next optimal step in the data transfer chain.
When the ISL has been determined, the satellite then attempts to establish the physical link to the intended satellite. Slightly different methods may be required depending on whether the link is intra- or inter-plane, however the general approach is as described above. Once the link has been established, the data is transferred using the designated communications protocol. Finally, the receiving satellite examines the data to determine if it can transmit to ground and takes the appropriate action.
The sensors used in the real time imaging system detect EM radiation emitted by the earth's surface over a range of frequencies such as microwave, infrared, visible, ultraviolet and gamma radiation. Since, at present, a single sensor cannot cover this wide frequency range, only a generic sensor system will be described. A notable exception to this is the case for active sensors, which require a transmitter in addition to the general embodiment of a passive sensor.
A particular feature of the image acquisition process in the present embodiment is the use of both optical and microwave sensors on the same satellite to allow all- weather day/night imaging capability.
The system comprises a data receiver that receives data, such as control instructions and reference data, either directly from a ground station or from another satellite. This data is processed, including decryption and decompression, to determine what the satellite is required to do to perform the instructions. The sensor component outputs data to a data storage area (DSA) and to the control and processing module (CPM), which performs compression, encryption and other image processing such as geo-locating the image.
A large number of satellites are required to achieve a ground resolution of 1 m or better and an update frequency of 25Hz. To solve this problem two methods are available.
In the present embodiment the method adopted uses sensors with independently directable beams which allow more area to be imaged, within an 'effective footprint', than with a single fixed beam. The system would have a multiplicity of sensors that are able to move in 2 dimensions to record data from a given target that lies in the satellite's effective footprint. This system, assuming a sufficient number of satellites, gives partial coverage with a fast refresh rate, or global coverage with a slower image refresh rate. This method is able to be implemented using radar sensors, with the beams being directed using steerable antennas. In addition, by using movable sensors, the effective footprints are made to be overlapping to allow stereo imaging.
In another embodiment, a method is adopted using a number of sensors, each imaging an area within the footprint through a static lens. This method has a large number of sensors on the satellite all observing different fixed areas within the footprint simultaneously. An example of such a system is an optical/IR passive sensor comprising a bundle of wave-guides, such as optical (i.e. glass) fibre, that interface to a fixed focal length lens. Each fibre in the fibre optic bundle is linked to a separate optical sensor such as a charge-coupled device (CCD), which is located at a convenient point on the spacecraft.
The processing scheme adopted in the present embodiment is shown in figure 25. The image acquisition process commences when the satellite receives the image request, and any other information, from the ground station. If the data has been compressed/encrypted, then it is de-compressed/de-encrypted and passed to the control and processing module (CPM). The CPM analyses the information and then instructs the sensor instruments in the necessary fashion to implement the request. This may include such things as the angle of sensor pointing or the particular sensors to utilise.
Data received by the satellite's sensors is sent to a data storage area (DSA), on a frame-by-frame basis, at a rate of 25 or more frames per second. Due to onboard memory constraints, it may be necessary to compress the data stored in the DSA. The sensed data is sent to the CPM for processing and/or compression. The system uses some form of compression to reduce the quantity of data that must be downlinked to the ground station. Downlink data may also be encrypted for security reasons.
An inter-frame compression technique is used, i.e. one that transmits the difference between successive image frames. This is achieved by a comparison between the current image and previous images, which are stored in the DSA. An adaptive process is used to perform this comparison. An example of this is where an initial comparison is performed, based on wavelet parameters. Should the correlation of the wavelet parameters fall within predefined values then a more detailed comparison is performed. The comparison process is also based on several factors such as histograms in each received band, edge structures, band ratios and textures. Alternatively, other forms of image compression are applied.

In the present embodiment, some, or all, of the 'image processing' (section Image Processing and accompanying figures) is done on-board. This includes the processing required by synthetic-aperture radar techniques or image calibration. Additionally, the system also performs a comparison between the current image and geo-located reference data (i.e. data whose geographic location is known) to allow the user-selected area to be extracted from the sensed data. Extracting the requested data from the sensed data further reduces the volume of downlinked data. In other embodiments, some, or all, of the on-board processing is done on the ground so that the satellite essentially transmits raw data.
To enable a satellite to image more than one area at a time, a parallel computing structure is adopted. Such a structure enables the above process to be repeated simultaneously for a number of areas within the same region. Finally, the system transmits the data to ground via the optimal data transfer route. In addition to the image data, other information such as satellite status, network and ISL status and satellite pointing direction is also downlinked.
The image-processing scheme of the present embodiment, shown in figure 26, outlines the overall structure of the processing required to produce an image from raw sensor data. For simplicity, it will be assumed that all processing is done in the imaging station; however, it is possible that both the satellites and the end-user software will do a proportion of the image processing.
The imaging station receives data from a ground station and decompresses/decrypts the data. This data is then pre-processed, enhanced, and possibly classified, before the area selected by the user is extracted and then compressed/encrypted before finally being transmitted to the user.
Image pre-processing involves calibrating the image to remove any systematic errors, or offsets, that may have been introduced and the processing involved in converting synthetic-aperture radar data into an image. The calibration occurs in two main areas: geometric and radiometric calibration. Geometric calibration corrects for the orientation or pointing of the satellite, relative to the fixed earth co-ordinates of latitude and longitude. This is done either by comparison with an archived geo-located reference image, or by a knowledge of the pointing vector of the satellite and the sensor which then allows appropriate transformations to be made to the image.
Radiometric calibration allows the relationship between the incident radiation intensity and the sensor output to be quantitatively determined. This is done by periodic comparison with an EM radiation source of known strength such as the sun or, for an active sensor system, against a target of known backscattering cross-section.
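As a simple illustration of radiometric calibration, a linear gain and offset can be derived from a dark reading and a reading of a source of known strength, and then applied to convert raw sensor counts to radiances; the numbers and the assumption of a purely linear sensor response are illustrative only.

```python
import numpy as np

def radiometric_calibration(dn_dark, dn_known, radiance_known):
    """Derive a linear gain/offset from a dark reading and a reading of a
    source of known strength (e.g. the sun or a calibrated target)."""
    gain = radiance_known / (dn_known - dn_dark)
    offset = -gain * dn_dark
    return gain, offset

def to_radiance(dn, gain, offset):
    """Apply the calibration so that raw digital numbers become radiances."""
    return gain * dn.astype(np.float64) + offset

gain, offset = radiometric_calibration(dn_dark=12.0, dn_known=4012.0, radiance_known=100.0)
image_dn = np.array([[12, 2012], [3012, 4012]], dtype=np.uint16)
print(to_radiance(image_dn, gain, offset))        # [[0. 50.] [75. 100.]] in assumed radiance units
```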
Image enhancement involves the removal of random errors, e.g. noise, from the image and additional processing to improve the image, such as incorporating stereoscopic effects. The system performs processes such as contrast modification, edge detection and noise reduction. If non-optical frequencies have been used then the system will use false colouring and overlays of multiple frequencies to form the image. Stereoscopic processes are also performed on the image to produce a three-dimensional effect. This includes a change in perspective transformation, to give a 'fly-through' effect.
After the image has been pre-processed and enhanced, image classification is performed. This is a technique which uses 'pattern recognition' to group the image information in ways that make visual interpretation simpler, or more efficient, for the end-user. The human brain performs image classification automatically. For example, rather than observing a random array of brightness and colours, the brain recognises a region as being a dwelling, therefore classifying all the information in that region as 'house', as opposed to, say, 'vehicle'. An example of this type of image classification in an imaging system is a microwave image having radar backscatter being processed to identify friendly or hostile objects. In the present embodiment, this pattern recognition uses neural networks or some other process.

Once the image has been processed, the portion requested by the user is then extracted. This is usually done after the image processing as the techniques may require large image sizes for statistical purposes; however, this may be done at an earlier stage to reduce the volume of data. Finally, the data is compressed/encrypted before being transmitted to the end-user's system.
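A very reduced stand-in for the image classification step described above is shown here, labelling band vectors by their nearest class centroid; the class names, the two-band feature space and the centroid values are all invented for illustration, and the embodiment itself contemplates neural networks or other pattern-recognition processes.

```python
import numpy as np

# Illustrative class centroids in a two-band feature space (values are assumptions).
CENTROIDS = {"water": np.array([30.0, 10.0]),
             "vegetation": np.array([60.0, 180.0]),
             "built": np.array([150.0, 120.0])}

def classify(pixels):
    """Nearest-centroid labelling of an (N, 2) array of band vectors."""
    names = list(CENTROIDS)
    centroids = np.stack([CENTROIDS[name] for name in names])       # shape (3, 2)
    distances = np.linalg.norm(pixels[:, None, :] - centroids, axis=2)
    return [names[i] for i in distances.argmin(axis=1)]

print(classify(np.array([[28.0, 12.0], [152.0, 118.0]])))           # -> ['water', 'built']
```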
In the present embodiment, the image display is handled differently than in many current user interfaces. These interfaces generally have a 'preview' system, similar to that in the user-request section of the present imaging system, but then dispatch the requested data in non-real time to the user. With the present system, there is no (or minimal) delay between the user-request and image display. In addition, once the initial image has been obtained, continuous (refresh rate of approximately 25 Hz or more) imaging of the requested region is possible, which is a particular feature of the present embodiment.
From a user's point of view, the system operates via a computing device connected to a network, such as the Internet or a direct subscriber network such as a news organisation or government entity. The connection, or communications link, includes land-lines, microwave links and fibre-optics. The user's system receives the image data over the network from the imaging station and then performs some image processing on the data before displaying it in the requested format on an output device.
Some, or all, of the 'image processing' (section Image Processing and accompanying figures) is done by the user display device. The level of data processing is tailored to the user's requirements and to the capabilities of the user's system. For example, if the end-user display terminal were a small hand-held device then, at present, it would be preferable to do the majority of the image processing at the imaging station. Conversely, if the end-users were a government or scientific organisation then it may be preferable, or necessary, for a larger proportion of the image processing to be done by the user's system.
Following the image processing, if any, performed on the data, the image is displayed. This is done using some digital visual output device, such as a computer monitor, flat screen TV or palmtop display. This process is shown schematically in figure 27.
Now describing the specific configuration of the satellite network, having particular regard to how real time global coverage is achieved, reference is made to figures 28 to 31.
In the present embodiment, satellites in LEO are provided with the imaging sensors, the sensors being as close as possible to earth to achieve the highest spatial resolution. A LEO-based sensor, however, has a smaller footprint than the same sensor in a higher orbit, so this configuration presents a worst-case scenario in establishing the number of satellites required to obtain global coverage of the earth.
Each satellite has optical/IR and active microwave based sensors on board. At present, the maximum spatial resolution of 1 metre for an optical/IR sensor and for microwave sensors is a constraint provided for national security reasons, and not technological reasons. Thus the technology for greater resolution is available for classified purposes, but as yet, has not become available for commercial purposes.
LEOs in the telecommunications and earth imaging industry are usually circular, polar orbits. This orbit ensures that every satellite will pass over an area on the earth at some point in time and have the ability to perform its operational function, with the exception of the far north and south regions. To achieve coverage in these areas, polar elliptical orbits are needed. In the case of circular polar orbits, the satellite is travelling from pole to pole at a constant speed, while the earth is rotating about the polar axis. Figure 28 shows the effect.
Accordingly, the imaging system of the present embodiment comprises a purpose-built LEO imaging satellite constellation travelling in elliptical polar orbits having both microwave and optical sensors. The satellite network and imaging sensors are configured to cover 95% of the earth's surface. The altitude of the orbit of these satellites is approximately 640 km and the footprint of each satellite's image sensor is circular, as shown in figure 29.
The angle of view from nadir, α in figure 29, is approximately 27° for all sensor types.
The surface area of the whole earth can be calculated as:
Area = 4πR²
where the radius of the earth, R, is approximately 6378 km, and so:
Surface area of earth = 4π(6378)²
≈ 5.112 × 10⁸ km²
Therefore, the total area to be covered, which is assumed to be 95% of the earth's surface, is:
Surface area of earth covered = 0.95 × 5.112 × 10⁸
= 4.856 × 10⁸ km²
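The figures above can be reproduced with the following short calculation (Python, for illustration only):

    import math

    R_EARTH = 6378.0                              # km, radius used in the text
    earth_area = 4.0 * math.pi * R_EARTH ** 2     # ~5.112e8 km^2
    covered_area = 0.95 * earth_area              # 95% coverage target, ~4.856e8 km^2
    print(f"Surface area of earth   : {earth_area:.4e} km^2")
    print(f"Area to be covered (95%): {covered_area:.4e} km^2")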
Now having regard to the coverage of a single sensor, consideration will be made of the sensor parameters and the sensor footprint.
The sensing method adopted by the imaging system of the present embodiment involves "all-weather" sensors that can move through 360° in azimuth and have a 27° scan angle from nadir. Using this scan angle, the sensor can use its 'spot beam' to map out a much larger area than the spot itself covers at any instant. This scan angle is similar to that achieved by the newest of the SPOT imaging satellites, SPOT 5. For simplicity, it is assumed that there is a single type of sensor on the satellite; however, in the present embodiment multiple sensors are used in practice to provide multiple-user access and multi-frequency sensing capabilities. To achieve a 54° (i.e. ±27°) viewing angle at all azimuths, the sensor needs to be moveable in two planes.
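For illustration only, the sketch below converts a sensor pointing command (azimuth and off-nadir scan angle) into an approximate ground offset of the spot beam from the sub-satellite point, using a simple flat-earth approximation; the function name and interface are assumptions made for this example and are not part of the described system.

    import math

    def spot_offset(orbit_height_km, azimuth_deg, off_nadir_deg):
        # Approximate ground offset (east, north, in km) of the sensor spot beam
        # from the sub-satellite point, ignoring earth curvature.
        ground_range = orbit_height_km * math.tan(math.radians(off_nadir_deg))
        az = math.radians(azimuth_deg)
        return ground_range * math.sin(az), ground_range * math.cos(az)

    # Maximum reach of the spot beam at the 27 degree scan-angle limit (any azimuth):
    east, north = spot_offset(640.0, azimuth_deg=90.0, off_nadir_deg=27.0)
    print(f"spot beam reaches ~{east:.0f} km from the sub-satellite point")   # ~326 km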
As can be seen from figure 29, the size of the circular footprints is a function of the orbit height of the satellite and the sensor scan angle. To calculate the area of the footprint, consider figure 30.
The area of the circular footprint in figure 30 is given by:
Footprint area = π × (Footprint Radius)²
= π × (Orbit Height × tan(α))²
= π × (640 × tan(27°))²
= 334073.12 km²
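The same footprint figure follows directly from the orbit height and scan angle, as the short illustrative function below shows:

    import math

    def footprint_area(orbit_height_km, scan_angle_deg):
        # Area (km^2) of the circular sensor footprint for a given orbit height
        # and scan angle from nadir, per figures 29 and 30.
        radius = orbit_height_km * math.tan(math.radians(scan_angle_deg))
        return math.pi * radius ** 2

    print(f"{footprint_area(640.0, 27.0):.2f} km^2")   # ~334073 km^2, as above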
It is not possible to provide complete coverage of the earth's surface with circular footprints unless those footprints have a certain degree of overlap. This can be seen in figure 31, which shows the situation for no overlap (figure 31a) and the minimum overlap required for total coverage (figure 31b). As can be seen in figure 31a, with no overlap of the footprints the sensor beam cannot image some areas.
The effective area covered by a single satellite is the area that is uniquely (i.e. only) imaged by that satellite. This is given by the square inside the circles in figure 31b.
The area of this square is given by:
Effective Footprint Area = 2 × (Footprint Radius)²
= 2 × (Orbit Height × tan(α))²
= 2 × (640 × tan(27°))²
= 212677.58 km²
The number of satellites required to image a given proportion of the earth's surface is determined by the size of the area and the coverage of a single satellite. The number is given by:
Number of satellites = Area to be covered / Area of coverage
= 4.856 × 10⁸ / 212677.58
≈ 2283 satellites.
Thus, given these constraints, in the present embodiment a large constellation of around 2,300 satellites is needed to cover 95% of the earth's surface. If further overlapping of the footprints is required in order to achieve instantaneous stereo images, this number may need to be increased. Alternatively, the system could have multiple moveable sensors on the one platform, a larger scan angle, or a different orbit. Increasing the scan angle would give a larger effective footprint, and hence a larger effective area of coverage, for each satellite. Having a larger effective footprint would reduce the number of satellites required for whole-earth coverage, which is desirable.
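The dependence of the constellation size on the scan angle can be illustrated with the sketch below, which repeats the effective-footprint and satellite-count calculation for several scan angles under the same simple tiling model; the figures are illustrative only.

    import math

    R_EARTH = 6378.0   # km

    def satellites_required(orbit_height_km, scan_angle_deg, coverage_fraction=0.95):
        # Inscribed-square tiling model used in the text: each satellite uniquely
        # covers a square of area 2*r^2 inside its circular footprint of radius
        # r = h * tan(alpha). Flat-footprint geometry; optimistic at large angles.
        area_to_cover = coverage_fraction * 4.0 * math.pi * R_EARTH ** 2
        radius = orbit_height_km * math.tan(math.radians(scan_angle_deg))
        return area_to_cover / (2.0 * radius ** 2)

    for alpha in (27.0, 35.0, 45.0):
        print(f"scan angle {alpha:4.1f} deg -> ~{satellites_required(640.0, alpha):.0f} satellites")
    # ~2283 satellites at 27 deg; a larger scan angle (or a higher orbit) enlarges
    # the effective footprint and so reduces the number of satellites required.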
It is important to note that the number of satellites required for coverage is primarily related to the effective image footprint of a single satellite. The footprint size is, in turn, critically dependent on the orbit height and the sensor scan angle. Thus, the number of satellites required is largely determined by these system parameters, and if they are changed then this number may change considerably. Alternatively, the satellite constellation may be configured to cover a reduced portion of the earth's surface, say North America, which would also reduce the number of satellites required.
It should be appreciated that the scope of the present invention is not limited to the scope of the particular embodiments described herein. As is evident from the description, there are different combinations of features and variations that can be provided and that would be apparent to a skilled person in the field of the invention which, whilst not being one of the specific embodiments, nonetheless fall within the scope and spirit of the invention.

Claims

The Claims Defining the Invention are as Follows
1. An imaging system for providing image information of a desired portion of the earth's surface to a user of the system in real or near real time comprising:-
(a) a large number of imaging platforms in non-geostationary earth orbits,
(i) each platform having one or more imaging sensors;
(ii) the platforms being arranged in a constellar network so that the footprint coverage of the imaging sensors of each platform is adjacently and overlappingly disposed with respect to the footprints of each adjacent platform thereto, so that the footprints contiguously and concurrently cover a substantial part of the earth's surface continuously and dynamically; and
(iii) any portion within the footprint coverage being capable of being imaged any time with a spatial resolution of approximately 1 metre or less and a temporal resolution of approximately 25 images per second or more;
(b) communication means for conveying information between a user requesting an image of the desired portion of the earth covered by the footprints and the network of platforms providing same, including:
(i) request delivery means to deliver a processed request for an image from the user to the platform(s) required to generate the image requested; and
(ii) image delivery means to deliver the image generated by the platform(s) to the user.
2. An imaging system as claimed in claim 1, wherein the platforms form a constellation of satellites.
3. An imaging system as claimed in claim 1, wherein said sensors for collecting said image data are located on aircraft such as high altitude balloons or unmanned remote controlled aircraft.
4. An imaging system as claimed in claim 3, wherein said aircraft are arranged so as to form said constellation.
5. An imaging system as claimed in any one of the preceding claims, including an image request processing station comprising:
(a) an image requesting server to receive compiled requested information from the user;
(b) data processing means to generate processed image request data including:
(i) location processing means for determining the location and size of the portion to be viewed and the time duration of the imaging;
(ii) permission processing means to determine user access;
(iii) resource allocation means to determine requisite resources such as platforms and sensors to provide the image;
(iv) scheduling means to schedule the use of the resources amongst competing requests; and
(v) request data transport means to transport processed image request data via the request delivery means.
6. An imaging system as claimed in any one of the preceding claims, including image data acquisition and processing means disposed on each platform including:
(a) data receiving means for receiving the processed request data;
(b) data processing means to process the received processed request data and to control the imaging sensors in accordance therewith;
(c) a data storage area to store data output from the imaging sensors;
(d) control and processing means to process the stored data to generate processed image data ready for transport; and
(e) image data transport means to transport processed image data via said image delivery means.
7. An imaging system as claimed in any one of the preceding claims, including an image receipt processing station for processing received processed image data including:
(a) receiving means to receive and initially process the transported image data for decompression, decryption and the like;
(b) pre-processing means to calibrate, enhance and classify the initially processed image data if necessary;
(c) extracting means to extract the portion of the image originally selected by the user;
(d) transport processing means to process the extracted image data ready for delivery to the user;
(e) transmission means to transmit the processed extracted image data to the user.
8. An imaging system as claimed in any one of the preceding claims, including an earth imaging application means comprising:-
(a) a user request interface that is real-time interactive having:
(i) selection means to select a portion of the earth that is desired to be imaged by the user;
(ii) zooming means to fine tune the selection of the portion;
(iii) resolution means to select the desired resolution of the image;
(iv) frequency sampling means to select the desired sampling frequency of the image;
(v) view formatting means to select the dimension in which the image is viewed and video or static views;
(b) compilation means for compiling requested information;
(c) transmission means to transmit requested information to the image requesting server;
(d) a user image display interface integrated with the user request interface to display the requested image in real time or near real time, including:
(i) image processing means to process the received extracted image data from the image processing station for display; and
(ii) display means for displaying the processed image data to the user.
9. An imaging system as claimed in claim 6, or claim 7 or 8 as dependent on claim 6, wherein said image data comprises three-dimensional data of the earth's surface whereby the combined image is a three dimensional image.
10. An imaging system as claimed in claim 6, or claim 7 or 8 as dependent on claim 6, wherein said image data comprises two-dimensional data of the earth's surface whereby the combined image is a three dimensional or stereoscopic image.
11. An imaging system as claimed in any one of the preceding claims, wherein more than one sensor is provided on each platform to collect optical data.
12. An imaging system as claimed in any one of the preceding claims, wherein at least one sensor on each platform is provided to detect and collect a number of frequencies beyond optical frequencies such as micro-wave, infra-red, ultra-violet or gamma radiation frequencies.
13. An imaging system as claimed in any one of the preceding claims, wherein said platforms are adapted so that image data of at least one area of the earth's surface is collected by at least two sensors at any one point in time whereby said platforms collect said three dimensional data.
14. An imaging system as claimed in any one of claims 1 to 12, wherein said three dimensional data of an area is collected by a single sensor from two different points of an orbit of a platform containing said sensor.
15. An imaging system as claimed in any one of the preceding claims, wherein said sensor collects said data sets at a rate of 25 data sets per second or higher as the platform traverses its orbit path.
16. An imaging system as claimed in any one of the preceding claims, wherein said platforms are further adapted to communicate with each other so that said image data of an area collected by at least two sensors is filtered by said platforms so that said combining means receives a selected set of image data of said area.
17. An imaging system comprising a large constellation of platforms collectively having a first set of passive electromagnetic radiation sensors and a second set of active electromagnetic radiation sensors, each sensor being adapted to collect data from the earth's surface; identification means for identifying an area on the earth's surface from where data has been collected by said passive sensor; comparison means for comparing said identified area with a target area; whereby in response to a positive result of said comparison, said active sensor is adapted to collect data from said identified area.
18. An imaging system as claimed in claim 17, further comprising a set of images of the earth's surface; said comparison means adapted to compare said data collected by said passive sensor with said set of images.
19. An imaging system as claimed in claim 17 or 18, wherein said second set of active sensors are wide range sensors.
20. An imaging system as claimed in any one of claims 17 to 19, wherein said sets of active sensors and passive sensors are adapted to receive electromagnetic radiation from a spread of frequencies, such as optical, infrared, ultra-violet and microwave frequencies; and said set of image data being derived from said spread of frequencies.
21. A system for collecting data related to the earth; said system comprising a constellation of platforms some or all of which contain a set of sensors for detecting electromagnetic radiation emitted from the earth's surface; processing means for receiving at least two sets of data; each said set of data being received from a separate one of said platforms; and said processing means combining said at least two sets of data to generate a representation of the earth.
22. A system as claimed in claim 21, wherein said sensors operate to detect a pre-determined set of electro-magnetic radiation frequencies including optical, infra-red, ultra-violet and microwave frequencies whereby said representation is generated, at least in part, from data collected from electromagnetic frequencies outside the optical range of frequencies.
23. A system as claimed in claim 21 or 22, wherein said representation comprises a graphical or visual representation of said data.
24. An imaging system for generating graphical representations of the earth comprising input means for receiving input data from a large constellation of platforms orbiting the earth; processing means for processing said data to generate a set of image data of the earth; and imaging means for generating graphical representations of the earth from said set of image data.
25. An imaging system as claimed in claim 24, wherein the platforms are satellites.
26. An imaging system as claimed in claim 25, wherein said input data comprises communications signals received by a constellation of communications satellites; and said processing means is adapted to process said communications signal to identify sets of image data of the earth.
27. An imaging system as claimed in claim 26, wherein said constellation of communications satellites have a non-stationary orbit and collectively receive communication signals on a continuous basis from a predetermined area of the earth's surface, whereby said processing means updates said set of image data on a real or near real time basis.
28. An imaging system as claimed in any one of claims 24 to 27, wherein said set of image data comprises data in three dimensions.
29. An imaging system as claimed in any one of claims 24 to 28, wherein said imaging means access said data in three dimensions whereby said graphical representations are three-dimensional representations.
30. An imaging system as claimed in any one of claims 24 to 29, wherein, alternatively or additionally to obtaining data signals from satellites, said data signals are obtained from telecommunications receivers located in high altitude balloons or aircraft, including high altitude unmanned reconnaissance/communications aircraft.
31. A method for providing image information of a desired portion of the earth's surface to a user in substantially real time comprising:-
(a) imaging the earth from a plurality of platforms located above the earth's surface;
(b) arranging the platforms in a constellar network with the footprint coverage of the imaging that can be achieved from each platform adjacently and overlappingly disposed with respect to the footprint coverage of the imaging from each adjacent platform thereto so that the footprints contiguously and concurrently cover a substantial part of the earth's surface continuously and dynamically;
(c) spatially resolving the imaging of any portion of the earth's surface within the footprint coverage at any time to approximately 1 metre or less and temporally resolving the imaging to approximately 25 images per second or more;
(d) conveying information between a user requesting an image of the desired portion of the earth covered by the footprints and the network of platforms providing the imaging, including:
(i) delivering a request for an image from the user to the platform(s) required to perform the imaging to generate the image requested; and
(ii) delivering the image generated by the platform(s) to the user.
32. An imaging system for providing image information of a desired portion of the earth's surface to a user of the system substantially as herein described with reference to the accompanying drawings as appropriate.
33. A method for providing image information of a desired portion of the earth's surface to a user substantially as herein described with reference to the accompanying drawings as appropriate.
PCT/AU2001/001073 2000-08-28 2001-08-28 Real or near real time earth imaging system WO2002018874A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001281605A AU2001281605A1 (en) 2000-08-28 2001-08-28 Real or near real time earth imaging system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPQ9741 2000-08-28
AUPQ9741A AUPQ974100A0 (en) 2000-08-28 2000-08-28 Real or near real time earth imaging system

Publications (1)

Publication Number Publication Date
WO2002018874A1 true WO2002018874A1 (en) 2002-03-07

Family

ID=3823795

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2001/001073 WO2002018874A1 (en) 2000-08-28 2001-08-28 Real or near real time earth imaging system

Country Status (2)

Country Link
AU (1) AUPQ974100A0 (en)
WO (1) WO2002018874A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115468533B (en) * 2022-11-10 2023-02-28 南京英田光学工程股份有限公司 Rapid orientation device and orientation method for laser communication ground station

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4602257A (en) * 1984-06-15 1986-07-22 Grisham William H Method of satellite operation using synthetic aperture radar addition holography for imaging
WO1986001592A1 (en) * 1984-08-24 1986-03-13 Hughes Aircraft Company System and method for mapping geosynchronous real image data into idealized images
US4814607A (en) * 1986-04-26 1989-03-21 Messerschmitt-Bolkow-Blohm Gmbh Method and apparatus for image recording of an object
US5248979A (en) * 1991-11-29 1993-09-28 Trw Inc. Dual function satellite imaging and communication system using solid state mass data storage
US6084989A (en) * 1996-11-15 2000-07-04 Lockheed Martin Corporation System and method for automatically determining the position of landmarks in digitized images derived from a satellite-based imaging system
US6271877B1 (en) * 1999-06-25 2001-08-07 Astrovision, Inc. Direct broadcast imaging satellite system apparatus and method for providing real-time, continuous monitoring of earth from geostationary earth orbit

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1484628A1 (en) * 2003-06-07 2004-12-08 Zeiss Optronik GmbH System and method for generating three-dimensional images
US9511729B1 (en) 2009-07-23 2016-12-06 Rockwell Collins, Inc. Dynamic resource allocation
US8594972B2 (en) 2010-06-11 2013-11-26 The Johns Hopkins University System and method for tomographic retrieval of parameter profile from traveling path
US10063311B2 (en) 2013-07-17 2018-08-28 Hughes Network Systems, Llc System and architecture for space-based and mobile terrestrial sensor vehicles, and end-to-end network for aggregation and processing of sensor data
WO2015009981A1 (en) 2013-07-17 2015-01-22 Hughes Network Systems, Llc System and architecture for space-based and mobile terrestrial sensor vehicles
EP3022852A4 (en) * 2013-07-17 2017-03-22 Hughes Network Systems, LLC System and architecture for space-based and mobile terrestrial sensor vehicles
GB2518951A (en) * 2013-08-09 2015-04-08 Boeing Co Demand based field of view (FOV) allocation for remote sensing systems
US9774829B2 (en) 2013-08-09 2017-09-26 The Boeing Company Demand based field of view (FOV) allocation for remote sensing systems
GB2518951B (en) * 2013-08-09 2018-07-18 Boeing Co Demand based field of view (FOV) allocation for remote sensing systems
US10230925B2 (en) 2014-06-13 2019-03-12 Urthecast Corp. Systems and methods for processing and providing terrestrial and/or space-based earth observation video
JP2018506919A (en) * 2015-02-03 2018-03-08 クラウド コンステレーション コーポレイション Space-based electronic data storage and transfer network system
US10871561B2 (en) 2015-03-25 2020-12-22 Urthecast Corp. Apparatus and methods for synthetic aperture radar with digital beamforming
US10615513B2 (en) 2015-06-16 2020-04-07 Urthecast Corp Efficient planar phased array antenna assembly
US11754703B2 (en) 2015-11-25 2023-09-12 Spacealpha Insights Corp. Synthetic aperture radar imaging apparatus and methods
US10955546B2 (en) 2015-11-25 2021-03-23 Urthecast Corp. Synthetic aperture radar imaging apparatus and methods
US11506778B2 (en) 2017-05-23 2022-11-22 Spacealpha Insights Corp. Synthetic aperture radar imaging apparatus and methods
US11378682B2 (en) 2017-05-23 2022-07-05 Spacealpha Insights Corp. Synthetic aperture radar imaging apparatus and methods for moving targets
US11525910B2 (en) 2017-11-22 2022-12-13 Spacealpha Insights Corp. Synthetic aperture radar apparatus and methods
CN110207671A (en) * 2018-12-29 2019-09-06 中国科学院软件研究所 A kind of space-based intelligence imaging system
EP3772189A1 (en) 2019-07-30 2021-02-03 Thomas Kyo Choi Direct-to-user earth observation satellite system
US10587335B1 (en) 2019-07-30 2020-03-10 Thomas Kyo Choi Direct-to-user Earth observation satellite system
CN112068130A (en) * 2020-09-01 2020-12-11 上海卫星工程研究所 Stationary orbit microwave imaging method and system based on whole-satellite two-dimensional scanning motion
CN112068130B (en) * 2020-09-01 2024-04-09 上海卫星工程研究所 Static orbit microwave imaging method and system based on whole-satellite two-dimensional scanning motion

Also Published As

Publication number Publication date
AUPQ974100A0 (en) 2000-09-21

Similar Documents

Publication Publication Date Title
WO2002018874A1 (en) Real or near real time earth imaging system
WO2003040653A1 (en) Improved real or near real time earth imaging system and method for providing imaging information
CA3067604C (en) System and method for widespread low cost orbital satellite access
US5584047A (en) Methods and apparatus for augmenting satellite broadcast system
JP2004501343A (en) Direct broadcast imaging satellite system apparatus and method
US5248979A (en) Dual function satellite imaging and communication system using solid state mass data storage
US20240103156A1 (en) System, method, and satellites for surveillance imaging and earth observation using synthetic aperture radar imaging
US7379088B2 (en) System and method for real-time image control and processing for use in wide area space based surveillance
EP0578316B1 (en) Method and system for pointing one antenna in the direction of the other
US20180062735A1 (en) Cloud mask data acquisition and distribution using geosynchronous communications or broadcast satellites
EP3420309B1 (en) Image sensor and method for a geostationary orbiting satellite
US20050083412A1 (en) Internet interactive realtime video image acquisition system based in low earth orbit
US6452538B1 (en) Satellite system for monitoring space
CA3196175A1 (en) Satellite image sensor and method
Suzuki et al. Overview of ALOS-2 and ALOS-3
EP0454034B1 (en) Radio frequency earth observation device, and space system with such a device
US11496679B2 (en) Real-time satellite imaging system
Jolly et al. A Canadian spaceborne hyperspectral mission
Bajpai et al. A vision for a national global operational environmental satellite system (NGOESS)
Gascon et al. Sentinel-2 optical high resolution mission for GMES land operational services
Tomar et al. Gain & size considerations of satellite subscriber terminals on earth in future intelligent satellites
Cohen et al. EPS, the European contribution to the NOAA/EUMETSAT Initial Joint Polar system (IJPS)
Zhou Future earth observing satellites
Selivanov et al. Radio-television complexes of the Meteor-Priroda satellites
JOINT PUBLICATIONS RESEARCH SERVICE ARLINGTON VA JPRS Report, Science & Technology, China, 16th International Congress of the International Society for Photogrammetry and Remote Sensing--Vol. 1

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP