CA1207927A - Three-dimensional display system - Google Patents

Three-dimensional display system

Info

Publication number
CA1207927A
CA1207927A CA000430825A
Authority
CA
Canada
Prior art keywords
image
display
depth
data
accordance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired
Application number
CA000430825A
Other languages
French (fr)
Inventor
Tsu Y. Shen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lexidata Corp
Original Assignee
Lexidata Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lexidata Corp filed Critical Lexidata Corp
Application granted granted Critical
Publication of CA1207927A publication Critical patent/CA1207927A/en
Expired legal-status Critical Current

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/10: Geometric effects
    • G06T15/40: Hidden part removal
    • G06T15/405: Hidden part removal using Z-buffer

Abstract

Abstract of the Disclosure: A display system for displaying a three-dimensional image on a two-dimensional raster display screen, wherein a host processor supplies input information on the geometric elements, e.g. polygons, which make up the image to a local display processor, which processes said input information and provides data concerning the location, color, intensity and depth of the points which make up the surfaces of the polygons. The depth data is stored in a depth buffer which is part of the display processor, and the color and intensity data is stored either directly in a frame buffer or as color index data in the frame buffer, which is used to address the desired color and intensity stored in a color look-up table. The color and intensity video data is supplied to a suitable display means, such as a cathode ray tube.

Description

THREE-DIMENSIONAL DISPLAY SYSTEM

Introduction
This invention relates generally to systems for displaying three-dimensional shaded images on a two-dimensional raster display screen and, more particularly, to an improved system for permitting the display to be built up incrementally in any selected order of locations on said screen and for displaying a complete image in a time period substantially less than that required in presently known systems of this general type.

Background of the Invention

In providing three-dimensional representations of images on a two-dimensional display means, such as the raster display screen of a cathode ray tube, for example, it is necessary to provide mechanisms for eliminating hidden surfaces and for shading visible surfaces in the image so that a general
three-dimensional representational effect will be provided. In conventional display systems for producing such an effect, each point on the overall image to be displayed is created at a first processor (normally referred to as a "host" processor), but the data representing each such point is not supplied for display on the raster display screen until every point forming the overall image has been created and the data defining such complete image has been formed and stored at the host processor. The data is then supplied to a second processor (normally referred to as a local "display" processor) which is associated with the raster display means and which converts the overall point data into video data for providing a line-by-line scan display of the image to be represented.
In such systems the host processor normally performs a number of functions. The host processor provides for "viewing transformation", that is, the transforming of input information
received from a data base and representing the geometry of the image to be displayed into data which provides for an appropriate scaling of the image or for a rotational movement of the image, or for an enlargement of different regions of the image. The host processor further provides for a "volume clipping"
operation, i.e., determining a selected region of an overall image which is required to be displayed, a process sometimes referred to as "windowing".
Although many approaches have been used in the art to provide three-dimensional displays, using various techniques for solving "hidden surface" and "shading" problems, one known approach has been to use a depth buffer, sometimes referred to as a Z-buffer, in the system. In such a system the host processor provides suitable information for permitting "hidden surface" or "hidden line"
removal, i.e., data which defines the depth relationships among the points which form the image so that points on one portion of the image which are behind points on other portions of the image at the same display location in the overall visual representation are effectively removed, i.e., not used, in the displayed image.

Such depth information is normally stored in the host processor as an array of data points in a suitable portion (the depth buffer portion) of the processor memory.
The host computer also performs the function of shading visible surfaces of the image, i.e., it creates suitable intensity values for each visible point on the overall image so as to provide a three-dimensional shading effect in the image display.
The host processor further provides for "scan conversion" or "rasterization" of the data so that the data is translated into the necessary data for making up the image needed at the display means in terms of the color and intensity, as well as the locations, of all of the points which form the image.
Such data is then supplied from the host processor to the display processor for display on a cathode ray tube screen, for example, on a line-by-line basis.
When a depth, or Z, buffer approach is used, the hidden surface removal problem is solved at the host processor by the use of a suitable algorithm, sometimes referred to as a depth buffer or Z-buffer algorithm, which processes the input data and stores the necessary depth relationship information required to determine each visible point of the three-dimensional image, such processed depth information being stored in the depth buffer portion of memory. Data defining the location, color and intensity of each point of the image to be displayed is stored in a suitable storage means which can be referred to as the "image" buffer portion of memory. Once all of the data required to define the image is so stored, the image information can then be transferred to a local display processor where it is placed into a conventional frame buffer means so that the image is then displayed on the raster display screen on a line-by-line basis.
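The host-resident depth-buffer flow just described can be sketched in outline as follows. This is an illustrative simplification, not the patented implementation: the tiny resolution, the tuple format of the points, and the function names are assumptions, and the convention that a larger depth value means "nearer the viewer" matches the one the patent adopts later.

```python
# Sketch of the prior-art flow: the host fully rasterizes the image into
# non-displayable image and depth buffers before any transfer occurs.

WIDTH, HEIGHT = 8, 8  # tiny illustrative resolution

def host_rasterize(points):
    """Build complete image and depth buffers at the host.

    `points` is an iterable of (x, y, depth, color) tuples; a larger
    depth value is taken to mean nearer the viewer.
    """
    image = [[0] * WIDTH for _ in range(HEIGHT)]   # color at each location
    zbuf  = [[0] * WIDTH for _ in range(HEIGHT)]   # 0 = background (furthest)
    for x, y, depth, color in points:
        if depth > zbuf[y][x]:          # nearer than what is stored?
            zbuf[y][x] = depth          # keep the nearer depth
            image[y][x] = color         # and its color
    return image, zbuf

def transfer_to_display(image):
    """Only after the whole image exists may it go to the display
    processor, line by line from top to bottom."""
    for row in image:
        yield row                        # one scan line at a time
```

Note that no scan line can be yielded until `host_rasterize` has returned in full, which is precisely the waiting-time disadvantage the patent goes on to identify.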
A major disadvantage of such an approach is that the entire image must be created and stored at the host processor before any data concerning the points which define the image can be supplied to the display processor. It is entirely impractical in such a system to attempt to transfer such information from the host processor to the display processor incrementally as image data for each point is created. The overall time for such a process would be too long for any practical advantage to be obtained. Accordingly, no matter what hidden surface, or depth buffer, algorithm is used by the host processor to provide the desired depth relationship data, the need for storing all the image information at the host processor permits the image to be displayed only when all such information is created and made available at the host. Normally, then, the most effective way to display the information is on a conventional line-by-line basis from top to bottom of a display screen.
Accordingly, once the user wishes to use the information in the data base to display an image, the user must wait for data defining the complete image to be duplicated in a non-displayable form at the host processor before it can be transferred to the display processor and the image begins to appear on the display screen. Moreover, when a change is made in the image by the user, the change must be created at the appropriate locations by

the host processor and the entire image must be re-displayed on a scanned line-by-line basis before the revised image can be seen by the user. Nor can an overall image comprising several distinct objects be presented to the user on the display screen on an object-by-object basis.
Such a system effectively acts in the nature of a raster plotting device rather than a random access display device and requires a relatively long waiting time between the time a request for an image display is made and the time the first line of the image can even begin to be displayed on the screen. Once such display begins, the host processor must supply every data point from the image buffer at the host processor to the frame buffer of the display processor, a significant burden.
In order to improve the operation of such a system, it is desirable to devise a system which avoids the relatively long waiting time required before a user-requested image can be created and displayed. It is further desirable for such an improved system to permit a change at any location of the image without requiring the host processor to re-calculate the entire image to be displayed before such change can be seen at the display screen.
Moreover, it is desirable for a system to permit a "build-up" of the overall image so that various locations of the image can be displayed independently of other locations thereof and the display of the overall image can be continuously built up in any selected order of its component parts or objects. The build-up of an overall image in a selected order can be generally

referred to as an incremental construction of the overall image.

Summary of the Invention
In accordance with the invention, a display system is arranged so that the processing of user-supplied input information defining a three-dimensional image, which input information is required to produce both intensity and color data and depth relationship data for the image, is performed not at the host processor but at the local display processor.
Accordingly, the display processor, rather than the host processor, is used to create image data defining the location, color and intensity of each point of the overall image. The display processor also processes and stores depth data which defines the corresponding depth relationships of the image points at each location of the overall image, such data being stored in a depth buffer means which also forms a part of the local display processor. Image data representing the image color and intensity can be stored directly in a frame buffer means which is part of the display processor and then supplied as video data to the display or, alternatively, color index data representing such image information can be stored in the frame buffer and used to address a color look-up-table to obtain the color and intensity information, which is then supplied therefrom as video data to the display. Because both the image information and the depth information are immediately available for use by the local display processor, video image data defining each point of the overall image can be immediately supplied for display at the display means as soon as it has been created, and the overall image can be continuously "built up" on a display screen for immediate viewing by the user.
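The two frame-buffer alternatives described above, direct color values versus color indices resolved through a pre-loaded look-up table, might be modeled as follows. The function names and the tiny table contents are illustrative assumptions, not part of the disclosure.

```python
# The frame buffer may hold either final color values or indices into a
# user-loaded color look-up table (LUT); both paths end in video data.

def video_direct(frame_buffer_value):
    """Direct path: the stored value already is the (r, g, b) video data."""
    return frame_buffer_value

def video_indexed(frame_buffer_index, lut):
    """Indexed path: the stored value addresses the pre-loaded LUT,
    which supplies the (r, g, b) video data."""
    return lut[frame_buffer_index]

# A tiny illustrative LUT pre-loaded by the user (index -> (r, g, b)).
lut = {0: (0, 0, 0), 1: (255, 0, 0), 2: (0, 255, 0)}
```

The indexed path is what lets the user re-color a displayed image simply by reloading the table, without touching the frame buffer contents.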
The host processor in such a system is used to perform geometry transformation functions, as discussed above, but need not be used to create all the data for duplicating an entire image at the host processor before any portion of the image is available for display. The host processor is then available for other purposes by the user. Because the functions of hidden surface removal, visible surface shading, scan conversion and point display drawing are performed by the display processor, the overall image to be displayed is constructed incrementally and selected portions thereof can be displayed immediately at any location and in any order (i.e., a random access display system).
The user sees the image as it is being constructed without waiting for the complete image to be processed before it can begin to be displayed, as on a line-by-line basis. Such a display system is then more responsive to the user than previously available systems and avoids the time-consuming and tedious process of displaying the three-dimensional image, with its hidden surface removal and shading problems, on a line-by-line basis, often a source of great annoyance to a user.
Other advantages of such a system in accordance with the invention, such as its ability to incrementally construct the overall image on an object-by-object basis, its ability to make relatively rapid corrections to portions of the image, and its ability to create realistic three-dimensional shading of the image at the local display processor, will become apparent as a more specific description thereof is supplied below.
The invention may be summarized, according to a first broad aspect, as a system for displaying a three-dimensional representation of an image on a two-dimensional display means, said system comprising: host processor means responsive to input data from a data base source defining said image for generating host data representing the configuration and orientation of one or more three-dimensional geometric elements comprising said image to be displayed on said display means;
display processor means responsive to said host data for providing video data to said raster display means; said display processor means including: means responsive to host data defining selected points of said geometric elements for providing depth data defining the depth relationships of points at each location required to form said image and video image data defining the color and intensity at each location required to form said image; means for storing said depth data; means for storing said video image data; and means responsive to said video image data for supplying said data to said display means for displaying thereon all visible points required to produce a three-dimensional representation of said image.
According to another broad aspect, the invention provides a display processing system for processing input information for use in displaying a three-dimensional image on a two-dimensional display means, said system comprising means for successively receiving in any order input image information defining at least part of each of a plurality of geometric elements in a three-dimensional image; and means for successively processing the received input image information with respect to each successive geometric element and for successively supplying processed image information to a display means for successively displaying thereon the visible portions of each successive geometric element and for preventing the display of the non-visible portions of each successive geometric element so that the overall three-dimensional image is incrementally built up on the display means on a geometric element by geometric element basis.
According to a further broad aspect, the invention provides a display processing system for use in displaying a three-dimensional representation of an image on a two-dimensional display means, said system comprising processing means responsive to input image information defining said image for processing said input image information so as to provide a translucent effect in the display of at least one selected region of said three-dimensional image, said processing means processing said image information so as to determine for said selected region a selected ratio of the number of points in said selected region which are to be displayed to the number of points in said selected region which are not to be displayed, said selected ratio thereby determining the degree of translucency of said selected region; and so as to supply, for use by the display means, processed image information with respect to those points in said selected region which are to be displayed by the display means.
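The translucency recited in this aspect can be illustrated with a short sketch: only a chosen fraction of a region's points is passed on for display. Here the fraction is expressed as `keep` displayed points out of every `out_of` total, from which the claimed displayed-to-hidden ratio keep:(out_of - keep) follows; the regular modulo pattern used to pick the points is an assumption for illustration, since the claim recites only the ratio itself.

```python
def translucent_points(points, keep, out_of):
    """Select which points of a region to display so that `keep` out of
    every `out_of` points are drawn; the resulting fraction sets the
    degree of translucency (keep == out_of means opaque)."""
    return [p for i, p in enumerate(points) if i % out_of < keep]
```

Skipping points in this way lets whatever lies behind the region remain partially visible on the screen.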
According to yet another broad aspect, the invention provides a system for displaying a three-dimensional image on a two-dimensional display means, said system comprising a display means; means for obtaining from a data base data with respect to each of a plurality of geometric elements of a three-dimensional image and for providing input image information defining at least part of each of said geometric elements; means for successively receiving in any order the input image information defining at least part of each of said geometric elements; and means for successively processing the received input image information with respect to each successive geometric element and for successively supplying processed image information to said display means for successively displaying thereon the visible portions of each successive geometric element and for preventing the display of the non-visible portions of each successive geometric element so that the overall three-dimensional image is incrementally built up on the display means on a geometric element by geometric element basis.
The invention also comprises methods consistent with the preceding paragraphs reciting apparatus.
Description of the Invention

The system of the invention can be described in more detail with the help of the drawings, wherein:
FIG. 1 shows a block diagram of a conventional system known to the prior art;
FIG. 1A shows a more detailed representation of the system shown in FIG. 1;
FIG. 2 shows a block diagram of a system in accordance with the invention;
FIG. 2A shows a more detailed representation of the system shown in FIG. 2;
FIGS. 3 and 4 show graphical representations of a comparison of the relative time periods in which display occurs using the conventional approach of FIG. 1 and the approach of the invention in FIG. 2, respectively;


FIG. 5 shows a simplified flow chart for drawing a point of an overall image using conventional depth buffer techniques;
FIG. 6 shows a more complex flow chart for drawing each point of an overall image in accordance with the system of the invention;
FIG. 7 shows an exemplary flow chart, useful in the system of the invention, for producing an image of a polygon which forms a part of an image to be displayed; and

FIG. 8 shows a diagrammatic representation of an exemplary polygon formed in accordance with the process of FIG. 7.
As can be seen in the diagrams of FIGS. 1 and 1A, an approach used by prior art systems for providing a display of three-dimensional graphics information on a two-dimensional display screen utilizes a host processor 10 which receives object data, i.e., data from a suitable data base 11 which defines an overall image which is to be displayed in three-dimensional image form on a cathode ray tube raster display screen 13. The host processor may include, for example, any well-known processor (e.g., a Model VAX 780 computer made and sold by Digital Equipment Corporation of Maynard, Massachusetts, or a Model 68000 microprocessor made and sold by Motorola Company of Phoenix, Arizona) and further includes host memory 14 for storing image data concerning the location, color and intensity of each point which makes up the overall image which is to be displayed, i.e., a non-displayable image buffer memory portion 14A. Such non-displayable image buffer portion of memory 14 effectively stores sequential data defining the intensity and color of each of an array of data points of the overall image, as shown in FIG. 1A.
Further, the host processor memory 14 normally includes a suitable depth buffer memory portion 14B which stores depth data defining the depth relationships of the array of points which form the overall image. For example, if two or more portions of an overall image overlap, certain locations on the image have one or more points in common, and at each such common location a decision must be made as to which portion is in front of one or more other portions so that common points on the latter portions are "removed" or "hidden", i.e., they are not shown in the final displayed image. Any suitable algorithm can be used by the host processor for such purpose, one such algorithm being described, for example, in the text Principles of Interactive Computer Graphics, William M. Newman and Robert F. Sproull, McGraw-Hill Publishing Co., Second Ed., 1979, Chapt. 24, "Hidden Surface Elimination", pages 369 et seq.
Accordingly, a non-displayed duplicate of the overall data which is required to create the desired image display on the cathode ray tube raster screen is formed and maintained at the host computer memory 14 in a fully completed form prior to the transfer of such data to the local display processor 12. Once the input data has been so processed to form point data which is stored in the image and depth buffer portions of memory 14, the stored image data is transferred to the display processor, the microprocessor 15 of which merely converts such data into video signals via a video frame buffer 16 which supplies such video data to a display, such as a cathode ray tube display 13, so as to draw each point on the raster screen thereof.
As can be seen in FIG. 1, the functions performed by the host processor and the display processor are indicated below each processor block. The only function performed by the local display processor is to provide the video data displaying each of the points in the scanning process once the point data has been

lZ~79Z7 34525 completely obtained from the host processor.
FIGS. 2 and 2A show the novel approach of one embodiment of the invention wherein the local display processor 20 is arranged to provide the functions of hidden surface removal, visible surface shading and scan conversion as well as the function of providing video data for displaying the image points on the cathode ray tube display screen. By providing local processing of image and depth buffer data at the display processor 20 rather than at a host processor 21, the two most time-consuming tasks for generating a shaded three-dimensional image can be overlapped. Thus, the host processor 21 performs the geometry transformation operation and, if necessary, a volume clipping operation, such operations being performed concurrently with the point data processing operation which is performed by the display processor. There is no need to duplicate the entire display image at the host processor.
The host processor then supplies appropriate command signals to the display processor, as discussed in more detail below, in response to which the display processor performs the necessary operations for creating the displayed image on the CRT display 13.
Thus, for example, input data defining the vertices of a geometric element, e.g. a polygon, which forms a part of the overall image to be displayed can be supplied by the host processor 21. As the host processor provides data concerning the vertices of such polygon to the local display processor 20, it is suitably processed, as discussed below, for example, to provide intensity and color video data to the CRT display via a video frame buffer 25 and a color look-up-table 27 in the particular embodiment being described herein. The display processor also processes depth data for supply to a depth, or Z, buffer 26, also located at the local processor 20. As the input data is so processed, the visible portions of the polygon can be immediately displayed on the raster screen of the CRT. As data concerning each subsequent polygon is transferred from the host processor to the local display processor, the visible portions of each polygon are immediately displayed and the overall image is incrementally built up. Such a process not only eliminates the user's anxiety while waiting for a display of some kind to appear on the screen but also allows separate objects in the overall image to be constructed instead of constructing the entire image on a line-by-line basis.
In the particular embodiment shown, for example, processing of the input data in response to commands from the host processor can be arranged to provide color index data with respect to the color and intensity of each point of the image for storage in the video frame buffer 25. Such index acts as a "pointer" to a specified location (address) in a color look-up-table corresponding to such index, which then provides the required video color (red-green-blue) data to the CRT for display. Alternatively, data directly specifying the color for each image point can be stored in the frame buffer 25 and such color data can be supplied directly to the CRT from the frame buffer.


FIGS. 3 and 4 show a graphical comparison of the time periods involved in utilizing the approach of the system of FIGS. 2 and 2A as compared with that of the system of FIGS. 1 and 1A.
As can be seen in FIG. 3, in previously known systems using Z-buffers, the host processor must perform the viewing transformation (and volume clipping operation, if necessary) initially before it performs any hidden surface removal process (using a depth buffer algorithm) or any shading operation. The host processor conventionally stores all of the data which results before transferring any data concerning the complete image to the display processor, which can then display the overall image on the display screen using a conventional line-by-line raster scanning operation.
In contrast, as shown in FIG. 4, the host processor of the system shown in FIGS. 2 and 2A performs only the viewing transformation (and volume clipping) operations on the input data and transfers such data, e.g. describing a geometric element such as a polygon, immediately to the local display processor, which then performs the appropriate algorithms in response to host processor commands for producing the image and depth data for each element. The display processor can then immediately display the visible surfaces of each polygon, for example, on the screen as it is processed and while the host processor is processing subsequent polygons in the overall image. Such simultaneous host processor and display processor operations proceed until the entire image which is to be displayed has been built up on an element by element (polygon by polygon), i.e., an incremental,

basis. Geometric elements other than polygons which can be so processed, for example, are three-dimensional points, lines or other surfaces. The overall time, as can be seen in FIG. 4, is reduced considerably from the time required using the system of FIGS. 1 and 1A. There is no extensive waiting time before a user can see any part of the image, since the image is being continuously built up substantially from the beginning when the initial geometric element is processed and displayed.
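The overlapped, element-by-element operation compared in FIGS. 3 and 4 amounts to a simple pipeline: each element is transformed and handed to the display side at once, rather than after the whole image is ready. The stand-in functions and dictionary format below are assumptions for illustration only.

```python
# Pipeline sketch: each geometric element is transformed by the "host"
# and immediately handed to the "display processor", so visible parts
# appear on screen while later elements are still being transformed.

def host_transform(element):
    """Stand-in for the host's viewing transformation / volume clipping."""
    return {"name": element, "transformed": True}

def display_element(element, screen):
    """Stand-in for rasterizing one element's visible points locally."""
    screen.append(element["name"])

def build_up(elements):
    """Incrementally build the image, one element at a time, in any
    order the user chooses; nothing waits for the full image."""
    screen = []
    for e in elements:
        display_element(host_transform(e), screen)
    return screen
```

Each iteration of the loop corresponds to one element appearing on the display while subsequent elements are still upstream, which is the source of the time saving shown in FIG. 4.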
As can be seen in the more specific block diagram of FIG. 2A, the microprocessor 24, which operates in combination with the video frame buffer means 25 and the depth buffer means 26, can be of any well-known type, such as the Model Z-8000 microprocessor made and sold by Zilog Company, Cupertino, California, and can be appropriately programmed, in accordance with the particular processor language associated with the processor model selected, to perform various algorithms for processing the data supplied from the host processor. One such algorithm provides information relating to the location, color and intensity of each point which makes up the overall image for storage in the frame buffer means 25, directly as an array of image color/intensity data or as an array of color indices for selecting the required color and intensity from a color look-up-table. A suitable algorithm for such operation, which would be well known to the art, can be found, for example, in the above-referenced text of Newman and Sproull at pages 398 et seq. Another algorithm provides information defining the depth relationships of points at common locations on the overall
image (i.e., the hidden surface information which provides the three-dimensional effect as discussed above) for storage as an array of depth data in the depth buffer means 26.
Shading can be obtained by linearly interpolating data between two points supplied by the host processor, as discussed in more detail below, which linearly interpolated data can then be supplied as appropriate color indices for the intervening points to the display means 13 via color look-up-table 27. The latter table must be pre-loaded by the user in accordance with the user's desired color display, the table responding to the image color index data from frame buffer 25 to provide the appropriate combination of red, green and blue information to display the geometric element involved on a cathode ray tube display screen.
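The linear interpolation of shading data between two host-supplied endpoints might be sketched as follows; the use of integer color indices and the rounding scheme are assumptions for illustration.

```python
def interpolate_indices(idx_a, idx_b, steps):
    """Linearly interpolate color-index values between two endpoint
    indices supplied by the host, yielding one index for each of
    `steps` points (endpoints included)."""
    if steps < 2:
        return [idx_a]
    return [round(idx_a + (idx_b - idx_a) * i / (steps - 1))
            for i in range(steps)]
```

Fed through the pre-loaded look-up table, the interpolated indices give the intervening points a smooth gradation of color and intensity between the two endpoints.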
Control of whether a particular point is actually displayed is determined by the "depth" information stored in depth buffer 26, since processed points which are defined as "hidden" will not be displayed on the screen of display means 13.
The microprocessor provides the timing for producing, in a particular embodiment, for example, a 640 x 512 image resolution for the image and depth buffers, as shown in FIG. 2A, at a 30 Hz frame rate, for example. The buffers can be well-known memory modules using standard chips, such as the Model 4116 dynamic memory made and sold by Motorola Company, Phoenix, Arizona, which can be used for each of the buffers. Such modules utilize 12 memory planes, for example, as shown in FIG. 2A. It is not necessary, however, to utilize the same memory module for each buffer, and different module types can be used in accordance
with the invention. The color look-up-table 27 responds to a 12-bit input address signal, for example, from the image buffer, and provides 8-bit output signals per color channel, for example.
The color and shading of a particular point is determined by the 12-bit index value, for example, which is written into the frame buffer for each point to be displayed. This value is used as an address input to the color look-up-table 27 from the frame buffer to select from the table the particular color and shades thereof at such address as pre-loaded by the user.
Depth (or "Z") values in depth buffer 26 are, in the particular embodiment being described, for example, 12-bit unsigned integers, with zero representing a "background" value, i.e., a value furthest from the viewer. Each point with a depth, or Z, value larger than a previous Z value at the same location is considered a visible point and the new Z value is stored into the depth buffer. Subsequent Z values are then compared with the current Z value to determine which points are visible on the displayed image.
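Under the convention just described (12-bit unsigned depth values, zero for the furthest background, larger values nearer the viewer), the visibility test and depth-buffer update reduce to a single comparison. The dictionary-keyed buffer below is an illustrative assumption standing in for the hardware memory planes.

```python
DEPTH_BITS = 12
BACKGROUND = 0                       # furthest possible value
MAX_DEPTH = (1 << DEPTH_BITS) - 1    # nearest possible value

def update_depth(zbuf, x, y, z):
    """Return True (point visible) and record z if the new point is
    nearer than whatever is stored at (x, y); otherwise leave the
    buffer unchanged and return False."""
    if z > zbuf[(x, y)]:
        zbuf[(x, y)] = z
        return True
    return False
```

A location never written remains at the background value, so the first non-background point processed at any location is always visible.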
In providing for the display (i.e., the drawing) of each point of the image as it is processed by display processor 20 in the system of FIG. 2A, appropriate operations are performed by microprocessor 24 in accordance with the simplified flow chart depicted in FIG. 5. For illustrative purposes, such flow chart depicts the simplest operation involved, where it is assumed that the point to be displayed is one in which no translucent effect is to be established in the displayed image, which does not involve any change to be made from an already established zero coordinate point for the image to be displayed, which does not involve the "sectioning" of the image to be displayed, and which is to be used to provide the desired three-dimensional effect (i.e., the point to be displayed is not merely used to provide a two-dimensional image effect). Under such conditions the contents of the depth buffer 26 at the location of the current point under consideration for display are read. The depth value of the current point is then compared with the previously stored depth value in the depth buffer at such location. If the current point is greater than the already stored value, then the current point is in front of the previously stored point in the image to be displayed. Such comparison then conditionally permits the desired color intensity value to be written into the image buffer 25, which value can be immediately transferred to the display means via the color look-up table 27 for immediate display on the screen at the identified location with the desired color and intensity. At the same time the depth value of the current point is written into the depth buffer so that a subsequent point being processed at the same location of the image can be compared with it.
If the depth value of the current point under consideration is equal to or less than the previously stored depth value in the depth buffer at such location, the current point is not displayed.
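The FIG. 5 comparison can be summarized in a short sketch. The function and buffer names below are illustrative assumptions, not anything defined by the patent; the convention follows the text, in which larger Z values are nearer the viewer and zero is the background:

```python
# Sketch of the simple draw-point test of FIG. 5 (no translucency,
# sectioning, complement depth, or 2D mode).  Larger Z means nearer
# the viewer; zero is the furthest ("background") value.

def draw_point(x, y, z, color_index, depth_buffer, image_buffer):
    """Conditionally write a point; return True if it was displayed."""
    if z > depth_buffer[y][x]:            # current point is in front
        image_buffer[y][x] = color_index  # visible: write its color index
        depth_buffer[y][x] = z            # remember its depth
        return True
    return False                          # equal or behind: hidden
```

A point at the same depth as the stored value is hidden, matching the "equal to or less than" rule above.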
The flow chart of FIG. 6 represents a more complicated operation for drawing a point of the displayed image which takes into consideration other conditions including those discussed above. As can be seen therein, before reading the depth buffer contents the translucency condition is first tested. Thus, if the user desires to provide a translucent image at the location of the current point under consideration, the translucency enablement (YES) overrides the reading of the depth buffer.
Before drawing the point involved, the translucency pattern, i.e., the degree or density of translucency, must be set by the user, and a determination as to whether such pattern has been set is made before proceeding with the process depicted in the flow chart. In accordance therewith the user may desire that a surface portion which normally would be hidden (all points in such portion would be eliminated and not be displayed) be made partially visible, such partial visibility providing a translucent effect in the image at the location of such portion. In such case the degree of translucency will be determined by the ratio of the number of points on such surface which are to be made visible to the number of points which are to be hidden. For example, the pattern may be such as to make half of the normally hidden points of such a surface portion visible and to make half of the points hidden. Accordingly, before proceeding with a decision to display or not display the points on such surface, the processor must determine whether a translucency effect is to be established and, if so, must determine that the translucency pattern (the "degree" thereof) is set.
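A minimal sketch of such a screen-aligned pattern test, assuming the 8 by 8 pattern size described for the DSPATT command; the checkerboard shown makes exactly half of the normally hidden points visible, and all names are illustrative:

```python
# Illustrative screen-aligned 8x8 translucency pattern.  Each row is
# one byte; the checkerboard makes half the points visible.

CHECKER = [0b10101010 if row % 2 == 0 else 0b01010101 for row in range(8)]

def translucent_write_allowed(x, y, pattern=CHECKER):
    """The pattern is aligned to screen boundaries (x, y taken mod 8)."""
    return (pattern[y % 8] >> (7 - (x % 8))) & 1 == 1
```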
If the translucency pattern is so set, or if translucency is not to be enabled at all, the next condition examined is whether the currently established zero point for the coordinate system used for the image display is to be changed or not. Normally, the coordinate zero depth point is at the back plane of the image. However, in some cases the user may desire that the front plane of the image be used as the zero depth point (sometimes referred to as the complement depth). The processor accordingly determines whether the normal zero depth is to be used or whether the complement depth is to be enabled.
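The complement-depth selection amounts to reflecting each incoming depth value about the maximum representable value; a one-line sketch, assuming the 12-bit unsigned depths of the described embodiment:

```python
# Complementing an incoming Z value moves the zero-depth reference
# from the back plane to the front plane.  12-bit unsigned depths
# are assumed, per the embodiment described.

Z_MAX = (1 << 12) - 1  # 4095

def complement_depth(z):
    return Z_MAX - z
```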
Once the zero depth point is defined, either by enabling the complement or using the conventional back plane zero depth, the next condition to be examined is whether the point currently being considered is in a portion of the image which represents a "sectioning" of the image. If sectioning is to occur ("Sectioning Enabled" is YES), the current point is examined to see if its depth value is less than or equal to the depth limit of the section being taken. If less than or equal to such limit, the current point is a candidate for display, while if it is not less than or equal to such limit, it is not displayed and examination thereof ends.
The next condition examined is whether a two-dimensional or a three-dimensional image is to be displayed. In the former case the depth buffer is simply disabled (YES) (since depth information has no significance) and the current point is written into the image buffer and can be immediately displayed on the display screen via the color look-up table. If, on the other hand, a three-dimensional image is required, the depth buffer is not disabled (NO) and the contents thereof are read at the location of the current point under consideration, as discussed above, to determine whether the current point is a visible one or not.
At such stage, once the contents of the depth buffer at the current point are read, a determination is made as to whether the current point intersects with a previous point at the location involved and, if so, whether the current point should also be displayed together with the previous point so that the intensity at that location is accordingly emphasized ("contouring"). If both intensity points at an intersection are to be displayed (contouring enabled is YES), a determination must be made as to whether the current point is at the same depth as the previous point stored at such location in the depth buffer and, if it is not, further examination of the current point ends. If it is, the current point is then displayed via the color look-up table.
If no "contouring" is required, one further condition is examined to determine whether the current point under examination should be displayed if it is either equal to or greater than the depth of the point presently stored in the depth buffer at the location in question. If so (YES), the depth (Z) value of the point is compared with the depth buffer value, and if it is not less than the latter the currently examined point is displayed. If it is less than the latter, no further examination of such point is made.
If the current point is to be displayed only when it is greater than the depth (Z) value of the depth buffer (a normal condition for most three-dimensional images), the value of the current point under examination is compared with the depth value previously stored in the depth buffer at the location in question and, if greater than the latter value, it is displayed via the color look-up table. If not, examination of the current point ends and it is not displayed.
When the current point is displayed, a decision must further be made as to whether its depth value is to be stored in the depth buffer at the location involved. In some cases (as when a "cursor" point is being displayed) the current point may be displayed but its depth value is not stored in the depth buffer, in which case it is treated as a "phantom" point (effectively a temporarily displayed point only). If it is not to be treated as a phantom, or non-stored, point, its depth value is written into the depth buffer at the location in question.
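Taken together, the FIG. 6 tests can be sketched as a single decision sequence. This is a loose illustration rather than the patent's implementation; all names, defaults, and keyword flags are assumptions:

```python
# Condensed sketch of the FIG. 6 draw-point decision sequence:
# translucency pattern, complement depth, sectioning, 2D mode,
# contouring, greater-or-equal writes, and phantom (cursor) points.

def draw_point_3d(x, y, z, shade, depth_buf, image_buf, *,
                  translucent=False, pattern_ok=True,
                  complement=False, z_max=4095,
                  sectioning=False, z_limit=4095,
                  two_dimensional=False, contouring=False,
                  write_ge=False, phantom=False):
    if translucent and not pattern_ok:
        return False                      # pattern hides this point
    if complement:
        z = z_max - z                     # front plane becomes zero depth
    if sectioning and z > z_limit:
        return False                      # point lies outside the section
    if two_dimensional:
        image_buf[y][x] = shade           # depth has no significance
        return True
    stored = depth_buf[y][x]
    if contouring:
        visible = (z == stored)           # emphasize equal-depth points
    elif write_ge:
        visible = (z >= stored)
    else:
        visible = (z > stored)            # the normal 3D condition
    if visible:
        image_buf[y][x] = shade
        if not phantom:                   # phantom points leave depth alone
            depth_buf[y][x] = z
    return visible
```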
An exemplary use of the procedure described above with reference to the flow charts of FIG. 5 or FIG. 6 is depicted in the flow chart of FIG. 7 for displaying a polygon in a three-dimensional representation. A convenient technique for drawing a polygon is to break the polygon into a plurality of separate triangles and to determine the image and depth values along parallel segments defined by the end points thereof along two sides of each triangle, beginning at the top of the triangle and moving downward from segment to segment and from left to right along each segment. Such overall procedure is depicted for the pentagon 30 shown in FIG. 8, for example, wherein three triangles 31, 32 and 33 are defined as making up the pentagon.

The location of points on triangle 31 is defined as including end points 34, 40 and 41, for example. The surface is defined by the points along successive segments 35, 36, 37, ... etc. from left (at the edge 38 of triangle 31) to right (at the edge 39 of triangle 31).
When the points on a segment are displayed (see block 40 of the flow chart in FIG. 7), the process is repeated for each successive segment until the display of the triangle, or that portion of the triangle which is visible in the three-dimensional representation involved, is completed. The same process is then repeated for the next triangle, and so on, until the overall polygon, or visible portion thereof, is completed.
In each case the display of each point (or non-display if the point is determined to be not visible on the overall three-dimensional image) which is performed in block 40 is performed using the flow chart depicted in FIG. 6 and described above.
In accordance with the particular embodiment being described, the host processor 21 supplies information defining the vertices of the triangle 31, i.e., the location and information for producing depth and color index data for each vertex. The microprocessor 24 determines the depth data therefor, using a suitable depth algorithm, as mentioned above, and also determines the color index values for points along the lines forming the sides of the triangle (lines 34-40, 34-41 and 40-41). The processor then uses a suitable algorithm to linearly interpolate the color index (shading) data for points on such lines between the end points, i.e., the end points of each segment 35, 36, 37 ... etc. One such linear interpolation algorithm is discussed in the Newman and Sproull text at page 398, referred to above.
The processor then performs the same linear interpolation for points between the end points of each segment using the same algorithm. Accordingly, when such linear interpolations are completed the color indices for all of the points making up the triangle are determined and are stored in frame buffer 25, and the triangle can be displayed with the desired color and shading using the video data supplied from the color look-up table 27 in response to such color indices.
Similar data can be generated for each triangle which makes up the polygon so that the local display processor can provide the three-dimensional image thereof with appropriate color and shading from the information on the vertices thereof supplied by the host processor.
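The two-stage interpolation described above, first along the triangle edges and then along each horizontal segment, reduces to repeated one-dimensional linear interpolation; a small sketch with illustrative names:

```python
# One-dimensional linear interpolation, applied once along each
# triangle edge and again along each horizontal segment to obtain
# shade (or depth) values for every pixel.

def lerp(a, b, t):
    """Linearly interpolate between a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate_segment(x1, x2, v1, v2):
    """Values for each integer x from x1 to x2 inclusive (x1 <= x2)."""
    if x1 == x2:
        return [v1]
    return [lerp(v1, v2, (x - x1) / (x2 - x1)) for x in range(x1, x2 + 1)]
```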
As discussed above, the host processor 21 supplies suitable commands to the display processor to request that certain operations be performed with respect to the image which is to be displayed. Suitable commands (e.g., 16-bit command signals) which can be so supplied are described below. More specific information on such commands is set forth in Appendix A, such information being summarized in general below. The specific forms thereof and encodings therefor in Appendix A are exemplary, and other specific embodiments of such commands may be devised by those in the art to perform substantially the same function.

A first such command, designated herein as a DSBUFF (buffer select) command, effectively specifies which of the buffers 25 and 26 (frame or depth) is to be used for reading and writing.
The DS3MOD (three-dimensional drawing mode) command specifies how three-dimensional drawings should be done and, in effect, places the processor in a three-dimensional drawing mode for performing the processes, for example, set forth in the flow charts of FIGS. 5-7.
The DSPATT command specifies the pattern to be used when displaying an image having a translucent effect at selected portions thereof (e.g., an 8x8 pattern).
The DS3PNT command specifies a single point to be written into the buffer for use in drawing a point as set forth, for example, in the flow chart of FIG. 6.
The DS3SEG command specifies depth and a constant color index value to be written into the buffers for a single horizontal segment (e.g. a horizontal segment of a triangle) which is to have a constant shading.
The DSSSEG command specifies values for a horizontal segment having a smooth (linearly interpolated) shading.
The DS3POL command specifies values for a polygon having constant surface shading.
The DSSPOL command specifies values for a polygon having a smooth (linearly interpolated) shading along its sides and along its horizontal segments, as discussed above.
The DS3VEC command specifies values for a vector (i.e., a line along any selected direction) having a constant shading.
The DSSVEC command specifies values for a vector having a smooth (linearly interpolated) shading.
The above commands can be supplied from the host processor to the display processor using any appropriate interface mechanism which permits commands specified at the host to be placed into the form required for transfer to and use by the display processor.
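As one illustration, a host could assemble such a command as a simple list of 16-bit words; the packet layout below follows the DS3PNT entry in Appendix A (function code 59 followed by x, y, z, shd), while the function name and the list representation are assumptions, and the transport mechanism itself is interface-specific and not shown:

```python
# Hedged sketch of assembling a DS3PNT command packet of 16-bit
# words for transfer to the display processor.

def ds3pnt_packet(x, y, z, shd):
    FC_DS3PNT = 59                                # function code per Appendix A
    words = [FC_DS3PNT, x, y, z, shd]
    assert all(0 <= w < 1 << 16 for w in words)   # 16-bit command words
    return words
```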
While the embodiment discussed above represents a particular preferred embodiment of the invention, variations thereof will occur to those in the art within the spirit and scope of the invention. Hence, the invention is not to be limited to the particular embodiment described except as defined by the appended claims.

APPENDIX A
DSBUFF  Select Display Buffer

FORMAT      fc
            blct

INPUT       fc   - 56 (function code).
            blct - Word containing bit settings.

OUTPUT      None.

DESCRIPTION This function specifies which of the display buffers (banks) will be used for reading and writing, which will be used for the depth buffer, and which for viewing (on a monitor). There is also a bit for writing into all buffers (for clearing both buffers simultaneously, for example), and one for waiting until vertical retrace before updating buffer selections (for smooth double buffering of images). These buffer selections are in effect for all 3400 commands, including the basic IDOS/EGOS commands. The depth buffer is only used in the SOLIDVIEW firmware, however.

            These selections are made by setting bits in the blct argument as shown below:

            Bit #: 15 14 13 12 11 10  9  8  7  6  5  4  3  2  1  0
            Name:  Reserved               W  A   R/W    DEP   VIEW

            where:

            VIEW - Selects the viewing buffer.
            DEP  - Selects the depth buffer.
            R/W  - Selects the read/write buffer.
            A    - When set, selects writing to all buffers (reading is still done from the R/W buffer).
            W    - When set, causes the system to wait until the start of vertical retrace before changing buffer selections.

            Default settings for R/W, VIEW, W, and A are zero, and one for DEP.
DS3MOD  Set 3D Drawing Mode

FORMAT      fc
            cmd
            bts
            mask
            lmt
            brk
            xoff
            yoff

INPUT       fc   - 57 (function code).
            cmd  - Command word containing bit settings.
            bts  - Word specifying the number of normal vector bits.
            mask - Plane enable mask (max. 12 bits).
            lmt  - Maximum allowable z value.
            brk  - First line of the second half of a split-depth buffer.
            xoff - R/W buffer offset from the depth buffer when an
            yoff   offset or split buffer is specified.

OUTPUT      None.

DESCRIPTION This function specifies how three-dimensional drawing should be done. It affects only the 3D draw commands: DS3PNT, DS3SEG, DSSSEG, DS3POL, DSSPOL, DS3VEC, and DSSVEC.

            Argument cmd is a command word containing several bit settings as shown below:

            Bit #: 15 14 13 12 11 10  9  8  7  6  5  4  3  2  1  0
            Name:  Reserved        D  P  Y  T  S  C  E  Z  N  MAP

            where:

            D   - Disable all references to the depth buffer.
            P   - (Phantom write) disable only the writes (not reads or tests) to the depth buffer.
            Y   - Move origin to lower left corner.
            T   - (Translucency) enable write with pattern.
            S   - Enable z-clipping (sectioning).
            C   - Write only on equal z-values (contouring).
            E   - Enable write on greater-than-or-equal z-values (rather than just greater than).
            Z   - Complement incoming z-values.
            N   - Enable normal vector interpolation.
            MAP - Depth buffer mapping:
                  00 - No offset double buffer.
                  01 - R/W buffer x,y offset.
                  10 - x,y offset & double buffer.

            Argument bts is a word specifying the number of bits for the first and second components of the normal vector:

            Bit #: 15 14 13 12 11 10  9  8  7  6  5  4  3  2  1  0
            Name:  Reserved      first component     second component

            The mask argument is a plane enable mask for depth buffer reads and writes, up to a maximum of 12 bits.

            The lmt argument specifies the maximum allowable z value when z clipping is enabled.

            The brk argument specifies the first line of the second half of a split depth buffer. The lines above this have their x and y coordinates swapped and are offset by the xoff value when referencing the depth buffer. The lines at brk and below are offset by the yoff value. This word is ignored unless split depth buffer mapping is enabled.

            Arguments xoff and yoff specify the offset of the depth buffer from the R/W buffer when an offset or split buffer is specified. Otherwise they are ignored.

            The defaults for all arguments are zeroes except for the 12-bit mask argument which is all ones.

DSPATT  Load Pattern

FORMAT      fc
            dat1
            dat2
            dat3
            dat4

INPUT       fc        - 58 (function code).
            dat1-dat4 - Four words containing pattern.

OUTPUT      None.

DESCRIPTION This function specifies an 8 by 8 pattern to be used when displaying polygons and segments when DS3MOD enables pattern mode. The pattern is aligned to screen boundaries and is initially undefined. The pattern is contained in arguments dat1 through dat4, where the high order byte of dat1 is the top row of the pattern, and the low order byte of dat4 is the bottom row.

DS3PNT  Draw 3D Point

FORMAT      fc
            x
            y
            z
            shd

INPUT       fc  - 59 (function code).
            x   - x-coordinate of point.
            y   - y-coordinate of point.
            z   - Depth value.
            shd - Shade value to write into the read/write buffer.

OUTPUT      None.

DESCRIPTION This function specifies a single point to be conditionally written into the buffers. This is particularly useful if the host decomposes surfaces directly into pixels.


DS3SEG  Draw 3D Constant-Shaded Horizontal Segment

FORMAT      fc
            x1
            y
            z1
            shd
            x2
            z2

INPUT       fc  - 60 (function code).
            x1  - x-coordinate of first endpoint.
            y   - y-coordinate of both endpoints.
            z1  - Depth of first endpoint.
            shd - Shade value of the segment.
            x2  - x-coordinate of second endpoint.
            z2  - Depth of second endpoint.

OUTPUT      None.

DESCRIPTION This function specifies a single horizontal scan segment of constant shade to be conditionally written to the buffers.

            The z values for the endpoints are specified by arguments z1 and z2, while those for pixels along the line are determined by interpolation between z1 and z2.


DSSSEG  Draw 3D Smooth-Shaded Horizontal Segment

FORMAT      fc
            x1
            y
            z1
            shd1
            x2
            z2
            shd2

INPUT       fc   - 61 (function code).
            x1   - x-coordinate of first endpoint.
            y    - y-coordinate of both endpoints.
            z1   - Depth of first endpoint.
            shd1 - Shade value of first endpoint.
            x2   - x-coordinate of second endpoint.
            z2   - Depth of second endpoint.
            shd2 - Shade value of second endpoint.

OUTPUT      None.

DESCRIPTION This function conditionally writes a single horizontal scan line of variable shading to the buffers. As with DS3SEG, z-values for pixels along the line are interpolated between z1 and z2. Likewise, shade values along the line are interpolated between shd1 and shd2. Format of the shd1 and shd2 arguments is determined by the DS3MOD command.

DS3POL  Draw 3D Constant-Shaded Polygon

FORMAT      fc
            shd
            cnt
            x1
            y1
            z1
            ...
            xn
            yn
            zn

INPUT       fc    - 62 (function code).
            shd   - Shade value to write to the R/W buffer.
            cnt   - Number of polygon vertices.
            x1-xn - x-coordinates of polygon vertices.
            y1-yn - y-coordinates of polygon vertices.
            z1-zn - Depths of polygon vertices.

OUTPUT      None.

DESCRIPTION This function conditionally draws a polygon of constant shade. The coordinates and depths of the polygon vertices are specified with xyz arguments, and the z-values of the pixels within the polygon are calculated by interpolating the z values at the vertices. The number of vertices is specified with argument cnt, which must be greater than 2.

DSSPOL  Draw 3D Smooth-Shaded Polygon

FORMAT      fc
            cnt
            x1
            y1
            z1
            shd1
            ...
            xn
            yn
            zn
            shdn

INPUT       fc        - 63 (function code).
            cnt       - Number of polygon vertices.
            x1-xn     - x-coordinates of polygon vertices.
            y1-yn     - y-coordinates of polygon vertices.
            z1-zn     - Depths of polygon vertices.
            shd1-shdn - Shade values of polygon vertices.

OUTPUT      None.

DESCRIPTION This function conditionally draws a polygon with a smoothly varying shade. The coordinates of the vertices are specified with xyz coordinate triplets, and the shading of each vertex is specified by arguments shd1-shdn. The depths and shade values of pixels within the polygon are calculated by interpolating the depth and shade values at the vertices. The number of vertices is specified by argument cnt, which must be greater than 2.

            Format of the shd arguments is determined by the DS3MOD command.

DS3VEC  Draw 3D Constant-Shaded Vector

FORMAT      fc
            x1
            y1
            z1
            shd
            x2
            y2
            z2

INPUT       fc    - 64 (function code).
            x1,y1 - Coordinates of first endpoint.
            z1    - Depth of first endpoint.
            shd   - Shade value of the vector.
            x2,y2 - Coordinates of second endpoint.
            z2    - Depth of second endpoint.

OUTPUT      None.

DESCRIPTION This function conditionally draws a vector of a constant shade. The coordinates and depths of the endpoints are specified with xyz triplets, and the shade is specified with argument shd. The depths of the pixels comprising the vector are calculated by interpolating the z-values at the endpoints.
DSSVEC  Draw 3D Smooth-Shaded Vector

FORMAT      fc
            x1
            y1
            z1
            shd1
            x2
            y2
            z2
            shd2

INPUT       fc    - 65 (function code).
            x1,y1 - Coordinates of first endpoint.
            z1    - Depth of first endpoint.
            shd1  - Shade of first endpoint.
            x2,y2 - Coordinates of second endpoint.
            z2    - Depth of second endpoint.
            shd2  - Shade of second endpoint.

OUTPUT      None.

DESCRIPTION This function conditionally draws a single vector with a smoothly varying shade. The coordinates and depths of the endpoints are specified by xyz argument triplets. The shade of the endpoints is specified with arguments shd1 and shd2.

            The depth and shading of the pixels comprising the vector are calculated by interpolating the depth and shade values at the endpoints.

            Format of the shd arguments is determined by the DS3MOD command.


Claims (48)

WHAT IS CLAIMED IS:
1. A system for displaying a three-dimensional representation of an image on a two-dimensional display means, said system comprising:
host processor means responsive to input data from a data base source defining said image for generating host data representing the configuration and orientation of one or more three-dimensional geometric elements comprising said image to be displayed on said display means;
display processor means responsive to said host data for providing video data to said raster display means; said display processor means including:
means responsive to host data defining selected points of said geometric elements for providing depth data defining the depth relationships of points at each location required to form said image and video image data defining the color and intensity at each location required to form said image;
means for storing said depth data;
means for storing said video image data; and means responsive to said video image data for supplying said data to said display means for displaying thereon all visible points required to produce a three-dimensional representation of said image.
2. A system in accordance with claim 1 wherein the means for providing said depth data and said video image data is a microprocessor means.
3. A system in accordance with claim 2 wherein said display processor means provides said display data to said display means substantially immediately as host data is supplied from said host processor to said display processor means.
4. A system in accordance with claim 3 wherein said video image data storing means comprises a frame buffer means.
5. A system in accordance with claim 4 wherein said frame buffer means stores color and intensity values which can be directly supplied to said display means.
6. A system in accordance with claim 4 wherein said frame buffer means stores color index values relating to said video image data; and further including color look-up table means for storing color and intensity values and responsive to said color index values for supplying color and intensity values which can be supplied to said display means.
7. A system in accordance with claim 6 wherein said display means is a cathode ray tube.
8. A system in accordance with claim 6 wherein said microprocessor means includes means for determining the depth value of a currently processed point at a selected location of said image relative to the depth value previously stored in said depth buffer means at said selected location.
9. A system in accordance with claim 8 wherein said determining means includes means for reading the depth value at said selected location of said depth buffer means;
means for comparing the depth value of a currently processed point at said selected location with the depth value read from said depth buffer means at said selected location; and means responsive to said comparison for controlling the writing of the color index values of said currently processed point into said frame buffer means.
10. A system in accordance with claim 9 wherein said controlling means writes the color index values of said currently processed point into said frame buffer means when the depth value thereof is greater than the depth value read from said depth buffer means.
11. A system in accordance with claim 9 wherein said microprocessor means further includes means for determining whether the three-dimensional representation of said image to be displayed is to provide an image one or more portions of which display a translucency effect.
12. A system in accordance with claim 11 wherein said microprocessor means further includes means for selecting a reference depth value which is to be used as the zero depth value reference for the three-dimensional representation of the image on said display means.
13. A system in accordance with claim 12 wherein said microprocessor means further includes means for determining whether the depth value of a currently processed data point is less than or equal to a selected limit depth value when a portion of said three-dimensional image representation is to provide an image, one or more portions of which are to be displayed in cross-section on said display means.
14. A system in accordance with claim 13 wherein said microprocessor means further includes means for disabling the control of the writing of depth data into said depth buffer means in response to said comparison when the image to be displayed is to be a two-dimensional image representation.
15. A system in accordance with claim 14 wherein said microprocessor means further includes means for controlling the display of one or more points having a common depth at a selected location of said image so that the intersection of said one or more points is displayed with greater intensity on said display means.
16. A system in accordance with claim 14 wherein said microprocessor means further includes means responsive to said comparing means to control the display of said currently processed point at a selected location of said image when its depth value is either equal to or greater than the currently stored depth value in said depth buffer at said location.
17. A system in accordance with claim 1 wherein said host processor supplies input information concerning the vertices of one or more geometric elements which form said image; and said microprocessor means responds to said vertex input information to produce depth and image data concerning said vertices and further produces depth and image information of the remaining points defining the surfaces of said geometric elements.
18. A system in accordance with claim 17 wherein the depth and image data concerning said remaining points produced by said microprocessor means are produced by linear interpolation of the value of the image data of said vertices.
19. A system in accordance with claim 18 wherein said microprocessor means determines the values of said image data at points along the lines between vertices by linearly interpolating between the values of said image data at said vertices.
20. A system in accordance with claim 19 wherein said microprocessor means further determines the values of the image data along horizontal segments having as end points the points along said lines between said vertices, said values being determined by linearly interpolating the values of said image data at said end points.
21. A system in accordance with claim 4 wherein said host processor includes means for providing a command signal which specifies which one of the depth buffer and frame buffer is to be used for storing and for supplying information provided by said microprocessor means concerning said depth data and said image data.
22. A system in accordance with claim 1 wherein said host processor includes means for providing a command signal which specifies that said display processor means is to be placed in an operating mode for providing said three-dimensional image representation.
23. A system in accord with claim 1 wherein said host processor includes means for providing a command signal which specifies a pattern of image data for use in one or more selected portions of said image which produces a translucent effect in said displayed image representation.
24. A system in accord with claim 1 wherein said host processor includes means for providing a command signal which specifies single points with respect to which image and depth data are to be stored for use in displaying said image representation.
25. A system in accord with claim 1 wherein said host processor includes means for providing a command signal which specifies the image and depth values which are to be stored for use in producing a single horizontal segment of said image having a constant shading for use in displaying said horizontal segment in said image representation.
26. A system in accord with claim 1 wherein said host processor includes means for providing a command signal which specifies the image and depth values to be stored for use in providing a single horizontal segment of said image having a smooth linearly interpolated shading for use in displaying said horizontal segment in said image representation.
27. A system in accord with claim 1 wherein said host processor includes means for providing a command signal which specifies the image and depth values to be stored for use in producing a polygon of said image having a constant shading over the surface of said polygon for use in displaying said polygon in said image representation.
28. A system in accord with claim 1 wherein said host processor includes means for providing a command signal which specifies the image and depth value to be stored for use in producing a polygon of said image having a smooth linearly interpolated shading over said surface for use in displaying said polygon in said image representation.
29. A system in accord with claim 1 wherein said host processor includes means for providing a command signal which specifies the image and depth values which are to be stored for use in producing a single vector of said image having a constant shading for use in displaying said vector in said image representation.
30. A system in accord with claim 1 wherein said host processor includes means for providing a command signal which specifies the image and depth values to be stored for use in providing a single vector of said image having a smooth linearly interpolated shading for use in displaying said vector in said image representation.
31. A display processing system for processing input information for use in displaying a three-dimensional image on a two-dimensional display means, said system comprising means for successively receiving in any order input image information defining at least part of each of a plurality of geometric elements in a three-dimensional image; and means for successively processing the received input image information with respect to each successive geometric element and for successively supplying processed image information to a display means for successively displaying thereon the visible portions of each successive geometric element and for preventing the display of the non-visible portions of each successive geometric element so that the overall three-dimensional image is incrementally built up on the display means on a geometric element by geometric element basis.
32. A system in accordance with claim 31 wherein said processing and supplying means supplies the processed image information to the display means as the input image information is being received and processed for each successive geometric element.
33. A system in accordance with claim 31 wherein said processing and supplying means includes display memory means; and processor means having direct access to said display memory means and being responsive to said input image information for processing said image information, for storing said processed image information in said display memory means and for supplying said processed image information from said display memory means substantially immediately to the display means for displaying said three-dimensional image.
34. A system in accordance with claim 33 wherein said processor means comprises a microprocessor means.
35. A system in accordance with claim 31 wherein said input image information includes image data defining the vertices of one or more geometric elements which form the image to be displayed; and said processing and supplying means responds to said input image information for supplying processed image information defining the remaining points which define the surface of said geometric elements.
36. A system in accordance with claim 31 wherein said processing and supplying means supplies processed image information which produces a translucent effect in at least one selected region of the image to be displayed.
37. A system in accordance with claim 31 wherein said processing and supplying means supplies processed image information defining geometric elements in the form of polygons each having a smooth linearly interpolated shading over its surface for use in displaying said polygons in the image to be displayed.
38. A display processing system for use in displaying a three-dimensional representation of an image on a two-dimensional display means, said system comprising processing means responsive to input image information defining said image for processing said input image information so as to provide a translucent effect in the display of at least one selected region of said three-dimensional image, said processing means processing said image information so as to determine for said selected region a selected ratio of the number of points in said selected region which are to be displayed to the number of points in said selected region which are not to be displayed, said selected ratio thereby determining the degree of translucency of said selected region; and so as to supply, for use by the display means, processed image information with respect to those points in said selected region which are to be displayed by the display means.
39. A system in accordance with claim 36 wherein said processing means processes said input image information so as to determine for said selected region a selected ratio of the number of points which are to be displayed in said selected region to the number of points which are not to be displayed in said selected region, said selected ratio thereby determining the degree of translucency of said selected region; and so as to supply, for use by the display means, processed image information with respect to those points in said selected region which are to be displayed by the display means.
40. A method for processing image information for use in the display of a three-dimensional image on a two-dimensional display means comprising the steps of successively receiving in any order information defining at least part of each of a plurality of geometric elements in a three-dimensional image; successively processing the received input image information with respect to each successive geometric element; and successively supplying processed image information with respect to each successive geometric element to a display means for successively displaying thereon the visible portions of each successive geometric element and for preventing the display of the non-visible portions of each successive geometric element so that the overall three-dimensional image is incrementally built up on the display means on a geometric element by geometric element basis.
41. A method in accordance with claim 40 wherein said supplying step includes supplying said processed image information to the display means as the input image information is being received and processed for each successive geometric element.
42. A method in accordance with claim 40 wherein said processing step includes storing said processed image information so as to make said processed image information available for use as it is being processed; and said supplying step includes supplying said stored processed information substantially immediately for use by said display means as said input image information is being processed.
43. A method in accordance with claim 40 wherein said processing step further includes the step of causing one or more selected portions of the three-dimensional image to be displayed with greater intensity than other portions thereof.
44. A method in accordance with claim 40 wherein said processing step further includes the step of producing a translucent effect in at least one selected region of the image to be displayed.
45. A method for displaying a three-dimensional image on a two-dimensional display means comprising the steps of processing input image information to provide a translucent effect in the display of at least one selected region of said three-dimensional image, said processing including determining for said selected region a selected ratio of the number of points in said selected region which are to be displayed to the number of points in said selected region which are not to be displayed, said selected ratio determining the degree of translucency of said selected region; and supplying for use by the display means processed image information with respect to those points in said selected region which are to be displayed by the display means.
46. A method in accordance with claim 44 wherein the step of providing said translucent effect includes determining a selected ratio of the number of points in said selected region which are to be displayed to the number of points in said selected region which are not to be displayed; and supplying for use by the display means image information with respect to those points in said selected region which are to be displayed by the display means.
47. A system for displaying a three-dimensional image on a two-dimensional display means, said system comprising a display means; means for obtaining from a data base data with respect to each of a plurality of geometric elements of a three-dimensional image and for providing input image information defining at least part of each of said geometric elements; means for successively receiving in any order the input image information defining at least part of each of said geometric elements; and means for successively processing the received input image information with respect to each successive geometric element and for successively supplying processed image information to said display means for successively displaying thereon the visible portions of each successive geometric element and for preventing the display of the non-visible portions of each successive geometric element so that the overall three-dimensional image is incrementally built up on the display means on a geometric element by geometric element basis.
48. A method for processing image information for use in the display of a three-dimensional image on a two-dimensional display means comprising the steps of obtaining from a data base data with respect to each of a plurality of geometric elements of a three-dimensional image; providing input image information defining at least part of each of said geometric elements; successively receiving in any order the input image information defining at least part of each of said geometric elements; successively processing the received input image information with respect to each successive geometric element; and successively supplying processed image information with respect to each successive geometric element to a display means for successively displaying thereon the visible portions of each successive geometric element and for preventing the display of the non-visible portions of each successive geometric element so that the overall three-dimensional image is incrementally built up on the display means on a geometric element by geometric element basis.
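Claims 31 and 40 describe building the image element by element, with per-point depth comparison suppressing non-visible portions, and claims 38 and 45 describe translucency as a ratio of displayed to non-displayed points. The following is an illustrative sketch of those two ideas, not from the patent itself; all names and dimensions are hypothetical.

```python
# Minimal depth-buffered display sketch: elements arrive in any order and the
# image is incrementally built up, with a point displayed only if it is nearer
# than whatever is already stored at that position (claims 31/40). A keep_ratio
# below 1.0 displays only that fraction of a segment's points, giving the
# ratio-based translucent effect of claims 38/45.

WIDTH, HEIGHT = 8, 4
FAR = float("inf")

image = [[0] * WIDTH for _ in range(HEIGHT)]    # stored image (shade) values
depth = [[FAR] * WIDTH for _ in range(HEIGHT)]  # stored depth values

def draw_point(x, y, z, shade):
    """Display the point only if it is nearer than the stored depth."""
    if z < depth[y][x]:
        depth[y][x] = z
        image[y][x] = shade

def draw_span(y, x0, x1, z0, z1, shade, keep_ratio=1.0):
    """Horizontal segment with linearly interpolated depth.

    keep_ratio < 1.0 skips points so that roughly that fraction of the
    span is displayed, producing a translucent pattern.
    """
    n = max(x1 - x0, 1)
    kept = 0
    for i, x in enumerate(range(x0, x1 + 1)):
        z = z0 + (z1 - z0) * i / n
        if keep_ratio >= 1.0 or (i + 1) * keep_ratio >= kept + 1:
            kept += 1
            draw_point(x, y, z, shade)

# A far opaque segment, then a nearer half-translucent one over part of it.
draw_span(1, 0, 7, z0=5, z1=5, shade=1)
draw_span(1, 2, 5, z0=2, z1=2, shade=2, keep_ratio=0.5)
print(image[1])  # near segment shows at every other covered point
```

With keep_ratio=0.5, two of the near segment's four points are displayed and two are suppressed, a 1:1 displayed-to-suppressed ratio as in the claimed degree-of-translucency control.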
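Several claims distinguish constant shading (claims 25, 27, 29) from smooth linearly interpolated shading (claims 26, 28, 30). A hypothetical sketch of that distinction for a single vector, with names of my own choosing:

```python
def shade_vector(n_points, s0, s1, smooth=True):
    """Return the shade at each of n_points along a vector.

    smooth=True interpolates linearly from s0 to s1 (as in claim 30);
    smooth=False holds the constant value s0 (as in claim 29).
    """
    if not smooth or n_points == 1:
        return [s0] * n_points
    step = (s1 - s0) / (n_points - 1)
    return [s0 + step * i for i in range(n_points)]

print(shade_vector(5, 0.0, 1.0))                 # smooth ramp along the vector
print(shade_vector(5, 0.0, 1.0, smooth=False))   # constant shading
```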
CA000430825A 1983-01-17 1983-06-21 Three-dimensional display system Expired CA1207927A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US458,362 1983-01-17
US06/458,362 US4475104A (en) 1983-01-17 1983-01-17 Three-dimensional display system

Publications (1)

Publication Number Publication Date
CA1207927A true CA1207927A (en) 1986-07-15

Family

ID=23820489

Family Applications (1)

Application Number Title Priority Date Filing Date
CA000430825A Expired CA1207927A (en) 1983-01-17 1983-06-21 Three-dimensional display system

Country Status (4)

Country Link
US (1) US4475104A (en)
EP (1) EP0116737A3 (en)
JP (1) JPS59129897A (en)
CA (1) CA1207927A (en)

Families Citing this family (142)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4209852A (en) * 1974-11-11 1980-06-24 Hyatt Gilbert P Signal processing and memory arrangement
US7907793B1 (en) 2001-05-04 2011-03-15 Legend Films Inc. Image sequence depth enhancement system and method
US8396328B2 (en) 2001-05-04 2013-03-12 Legend3D, Inc. Minimal artifact image sequence depth enhancement system and method
US5488952A (en) * 1982-02-24 1996-02-06 Schoolman Scientific Corp. Stereoscopically display three dimensional ultrasound imaging
JPS5952380A (en) * 1982-09-17 1984-03-26 Victor Co Of Japan Ltd Interpolating device
GB2130854B (en) * 1982-10-10 1986-12-10 Singer Co Display system
US6590573B1 (en) * 1983-05-09 2003-07-08 David Michael Geshwind Interactive computer system for creating three-dimensional image information and for converting two-dimensional image information for three-dimensional display systems
US4594673A (en) * 1983-06-28 1986-06-10 Gti Corporation Hidden surface processor
US4549275A (en) * 1983-07-01 1985-10-22 Cadtrak Corporation Graphics data handling system for CAD workstation
US4667306A (en) * 1983-07-20 1987-05-19 Ramtek Corporation Method and apparatus for generating surface-fill vectors
US4615013A (en) * 1983-08-02 1986-09-30 The Singer Company Method and apparatus for texture generation
IL72685A (en) * 1983-08-30 1988-08-31 Gen Electric Advanced video object generator
US4730261A (en) * 1983-10-25 1988-03-08 Ramtek Corporation Solids modelling generator
US4646075A (en) * 1983-11-03 1987-02-24 Robert Bosch Corporation System and method for a data processing pipeline
US4550315A (en) * 1983-11-03 1985-10-29 Burroughs Corporation System for electronically displaying multiple images on a CRT screen such that some images are more prominent than others
US4771275A (en) * 1983-11-16 1988-09-13 Eugene Sanders Method and apparatus for assigning color values to bit map memory display locations
US4586038A (en) * 1983-12-12 1986-04-29 General Electric Company True-perspective texture/shading processor
US4649499A (en) * 1984-03-07 1987-03-10 Hewlett-Packard Company Touchscreen two-dimensional emulation of three-dimensional objects
US4808988A (en) * 1984-04-13 1989-02-28 Megatek Corporation Digital vector generator for a graphic display system
US4679040A (en) * 1984-04-30 1987-07-07 The Singer Company Computer-generated image system to display translucent features with anti-aliasing
US4649498A (en) * 1984-05-08 1987-03-10 The University Of Rochester Computer systems for curve-solid classification and solid modeling
US4631691A (en) * 1984-05-14 1986-12-23 Rca Corporation Video display device simulation apparatus and method
US4697178A (en) * 1984-06-29 1987-09-29 Megatek Corporation Computer graphics system for real-time calculation and display of the perspective view of three-dimensional scenes
US4734690A (en) * 1984-07-20 1988-03-29 Tektronix, Inc. Method and apparatus for spherical panning
US4685070A (en) * 1984-08-03 1987-08-04 Texas Instruments Incorporated System and method for displaying, and interactively excavating and examining a three dimensional volume
US4821212A (en) * 1984-08-08 1989-04-11 General Electric Company Three dimensional texture generator for computed terrain images
JPH0746391B2 (en) * 1984-09-14 1995-05-17 株式会社日立製作所 Graphic seeding device
GB2166316B (en) * 1984-10-31 1987-10-28 Sony Corp Video signal processing circuits
JPH0769971B2 (en) * 1984-11-20 1995-07-31 松下電器産業株式会社 Three-dimensional figure processing method
US4748572A (en) * 1984-12-05 1988-05-31 The Singer Company Video processor architecture with distance sorting capability
JP2526857B2 (en) * 1984-12-27 1996-08-21 ソニー株式会社 Image signal conversion method
JPS61219079A (en) * 1985-03-25 1986-09-29 ヤマハ株式会社 Information processor
JPH0681275B2 (en) * 1985-04-03 1994-10-12 ソニー株式会社 Image converter
US4737921A (en) * 1985-06-03 1988-04-12 Dynamic Digital Displays, Inc. Three dimensional medical image display system
US4729098A (en) * 1985-06-05 1988-03-01 General Electric Company System and method employing nonlinear interpolation for the display of surface structures contained within the interior region of a solid body
US4710876A (en) * 1985-06-05 1987-12-01 General Electric Company System and method for the display of surface structures contained within the interior region of a solid body
US4679041A (en) * 1985-06-13 1987-07-07 Sun Microsystems, Inc. High speed Z-buffer with dynamic random access memory
US5025400A (en) * 1985-06-19 1991-06-18 Pixar Pseudo-random point sampling techniques in computer graphics
US5239624A (en) * 1985-06-19 1993-08-24 Pixar Pseudo-random point sampling techniques in computer graphics
US4897806A (en) * 1985-06-19 1990-01-30 Pixar Pseudo-random point sampling techniques in computer graphics
DE3650494T2 (en) * 1985-07-05 1996-10-02 Dainippon Printing Co Ltd DESIGNING METHOD AND DEVICE OF THREE-DIMENSIONAL CONTAINERS
JP2604710B2 (en) * 1985-08-26 1997-04-30 ソニー株式会社 Image conversion device
US4719585A (en) * 1985-08-28 1988-01-12 General Electric Company Dividing cubes system and method for the display of surface structures contained within the interior region of a solid body
US4758965A (en) * 1985-10-09 1988-07-19 International Business Machines Corporation Polygon fill processor
GB2181929B (en) * 1985-10-21 1989-09-20 Sony Corp Methods of and apparatus for video signal processing
US4745407A (en) * 1985-10-30 1988-05-17 Sun Microsystems, Inc. Memory organization apparatus and method
US5095301A (en) * 1985-11-06 1992-03-10 Texas Instruments Incorporated Graphics processing apparatus having color expand operation for drawing color graphics from monochrome data
US5294918A (en) * 1985-11-06 1994-03-15 Texas Instruments Incorporated Graphics processing apparatus having color expand operation for drawing color graphics from monochrome data
US4692880A (en) * 1985-11-15 1987-09-08 General Electric Company Memory efficient cell texturing for advanced video object generator
US4811245A (en) * 1985-12-19 1989-03-07 General Electric Company Method of edge smoothing for a computer image generation system
JPS62165279A (en) * 1986-01-17 1987-07-21 Fanuc Ltd Graphic extracting system
JPS62168281A (en) * 1986-01-20 1987-07-24 Fanuc Ltd Graphic extracting system
US4928231A (en) * 1986-03-06 1990-05-22 Hewlett-Packard Company Apparatus for forming flow-map images using two-dimensional spatial filters
JPS62231379A (en) * 1986-03-31 1987-10-09 Namuko:Kk Picture synthesizing device
US4901251A (en) * 1986-04-03 1990-02-13 Advanced Micro Devices, Inc. Apparatus and methodology for automated filling of complex polygons
US4805116A (en) * 1986-04-23 1989-02-14 International Business Machines Corporation Interpolated display characteristic value generator
JPH0752470B2 (en) * 1986-06-30 1995-06-05 株式会社日立製作所 3D figure filling display device
US4816813A (en) * 1986-09-19 1989-03-28 Nicolet Instrument Corporation Raster scan emulation of conventional analog CRT displays
US4870599A (en) * 1986-09-26 1989-09-26 International Business Machines Corporation Traversal method for a graphics display system
US4875097A (en) * 1986-10-24 1989-10-17 The Grass Valley Group, Inc. Perspective processing of a video signal
US4912657A (en) * 1986-10-30 1990-03-27 Synthesis, Inc. Method and systems for generating parametric designs
US4882692A (en) * 1986-10-30 1989-11-21 Transformercad, Inc. Methods and systems for generating parametric designs
US5197120A (en) * 1986-10-30 1993-03-23 Synthesis, Inc. Methods and systems for generating parametric designs
US4879668A (en) * 1986-12-19 1989-11-07 General Electric Company Method of displaying internal surfaces of three-dimensional medical images
US4988985A (en) * 1987-01-30 1991-01-29 Schlumberger Technology Corporation Method and apparatus for a self-clearing copy mode in a frame-buffer memory
US4903217A (en) * 1987-02-12 1990-02-20 International Business Machines Corp. Frame buffer architecture capable of accessing a pixel aligned M by N array of pixels on the screen of an attached monitor
JP2541539B2 (en) * 1987-02-13 1996-10-09 日本電気株式会社 Graphic processing device
US5029111A (en) * 1987-04-29 1991-07-02 Prime Computer, Inc. Shared bit-plane display system
JPS63271673A (en) * 1987-04-30 1988-11-09 Toshiba Corp Three-dimensional display device
US4791583A (en) * 1987-05-04 1988-12-13 Caterpillar Inc. Method for global blending of computer modeled solid objects using a convolution integral
US4825391A (en) * 1987-07-20 1989-04-25 General Electric Company Depth buffer priority processing for real time computer image generating systems
US4935879A (en) * 1987-08-05 1990-06-19 Daikin Industries, Ltd. Texture mapping apparatus and method
GB2207840B (en) * 1987-08-07 1991-09-25 Philips Electronic Associated Method of and apparatus for modifying data stored in a random access memory
GB2210540A (en) * 1987-09-30 1989-06-07 Philips Electronic Associated Method of and arrangement for modifying stored data,and method of and arrangement for generating two-dimensional images
US4991122A (en) * 1987-10-07 1991-02-05 General Parametrics Corporation Weighted mapping of color value information onto a display screen
US5379371A (en) * 1987-10-09 1995-01-03 Hitachi, Ltd. Displaying method and apparatus for three-dimensional computer graphics
JP2667835B2 (en) * 1987-10-09 1997-10-27 株式会社日立製作所 Computer Graphics Display
JPH07122905B2 (en) * 1987-10-21 1995-12-25 ダイキン工業株式会社 Polygon fill control device
US4888711A (en) * 1987-11-16 1989-12-19 General Electric Company Image interpretation method and apparatus using faces for constraint satisfaction
DE3854600T2 (en) * 1987-12-04 1996-06-05 Evans & Sutherland Computer Co Method for using barycentric coordinates as for polygon interpolation.
US5088054A (en) * 1988-05-09 1992-02-11 Paris Ii Earl A Computer graphics hidden surface removal system
US5068644A (en) * 1988-05-17 1991-11-26 Apple Computer, Inc. Color graphics system
US4970499A (en) * 1988-07-21 1990-11-13 Raster Technologies, Inc. Apparatus and method for performing depth buffering in a three dimensional display
US5091960A (en) * 1988-09-26 1992-02-25 Visual Information Technologies, Inc. High-speed image rendering method using look-ahead images
DE3887517T2 (en) * 1988-09-29 1994-05-11 Toshiba Kawasaki Kk Control device of the buffer containing the depth information.
US5101365A (en) * 1988-10-31 1992-03-31 Sun Microsystems, Inc. Apparatus for extending windows using Z buffer memory
US5003497A (en) * 1988-12-15 1991-03-26 Sun Micosystems Inc Method for three-dimensional clip checking for computer graphics
JP2762502B2 (en) * 1988-12-29 1998-06-04 ダイキン工業株式会社 Stereoscopic display method and apparatus
US5222203A (en) * 1989-01-20 1993-06-22 Daikin Industries, Ltd. Method and apparatus for displaying translucent surface
US5446479A (en) * 1989-02-27 1995-08-29 Texas Instruments Incorporated Multi-dimensional array video processor system
US5121469A (en) * 1989-03-20 1992-06-09 Grumman Aerospace Corporation Method and apparatus for processing and displaying multivariate time series data
JPH04140892A (en) * 1990-02-05 1992-05-14 Internatl Business Mach Corp <Ibm> Apparatus and method for encoding control data
US5220646A (en) * 1990-04-30 1993-06-15 International Business Machines Corporation Single pass hidden line removal using z-buffers
US5252953A (en) * 1990-05-22 1993-10-12 American Film Technologies, Inc. Computergraphic animation system
US5201035A (en) * 1990-07-09 1993-04-06 The United States Of America As Represented By The Secretary Of The Air Force Dynamic algorithm selection for volume rendering, isocontour and body extraction within a multiple-instruction, multiple-data multiprocessor
JP3350043B2 (en) * 1990-07-27 2002-11-25 株式会社日立製作所 Graphic processing apparatus and graphic processing method
US5305430A (en) * 1990-12-26 1994-04-19 Xerox Corporation Object-local sampling histories for efficient path tracing
US5640496A (en) * 1991-02-04 1997-06-17 Medical Instrumentation And Diagnostics Corp. (Midco) Method and apparatus for management of image data by linked lists of pixel values
US5189626A (en) * 1991-03-27 1993-02-23 Caterpillar Inc. Automatic generation of a set of contiguous surface patches on a computer modeled solid
JPH0797413B2 (en) * 1991-05-16 1995-10-18 インターナショナル・ビジネス・マシーンズ・コーポレイション Pick method and apparatus in graphics system
US5546105A (en) * 1991-07-19 1996-08-13 Apple Computer, Inc. Graphic system for displaying images in gray-scale
JP3416894B2 (en) * 1992-06-24 2003-06-16 日本電信電話株式会社 Computer controlled display system
US5321809A (en) * 1992-09-11 1994-06-14 International Business Machines Corporation Categorized pixel variable buffering and processing for a graphics system
GB9302271D0 (en) * 1993-02-05 1993-03-24 Robinson Max The visual presentation of information derived for a 3d image system
GB2293079B (en) * 1993-05-10 1997-07-02 Apple Computer Computer graphics system having high performance multiple layer z-buffer
US5583974A (en) * 1993-05-10 1996-12-10 Apple Computer, Inc. Computer graphics system having high performance multiple layer Z-buffer
US5581680A (en) * 1993-10-06 1996-12-03 Silicon Graphics, Inc. Method and apparatus for antialiasing raster scanned images
US5528738A (en) * 1993-10-06 1996-06-18 Silicon Graphics, Inc. Method and apparatus for antialiasing raster scanned, polygonal shaped images
US5515484A (en) * 1993-10-06 1996-05-07 Silicon Graphics, Inc. Method and apparatus for rendering volumetric images
US5748946A (en) * 1995-02-17 1998-05-05 International Business Machines Corporation Method and apparatus for improved graphics picking using auxiliary buffer information
US5790125A (en) * 1996-04-22 1998-08-04 International Business Machines Corporation System and method for use in a computerized imaging system to efficiently transfer graphics information to a graphics subsystem employing masked span
JPH1091811A (en) * 1996-07-01 1998-04-10 Sun Microsyst Inc Graphical picture re-scheduling mechanism
US6222552B1 (en) 1996-07-26 2001-04-24 International Business Machines Corporation Systems and methods for caching depth information of three-dimensional images
US5923333A (en) * 1997-01-06 1999-07-13 Hewlett Packard Company Fast alpha transparency rendering method
US6002407A (en) 1997-12-16 1999-12-14 Oak Technology, Inc. Cache memory and method for use in generating computer graphics texture
US6230382B1 (en) * 1998-05-11 2001-05-15 Vought Aircraft Industries, Inc. System and method for assembling an aircraft
US7102633B2 (en) * 1998-05-27 2006-09-05 In-Three, Inc. Method for conforming objects to a common depth perspective for converting two-dimensional images into three-dimensional images
US7116324B2 (en) * 1998-05-27 2006-10-03 In-Three, Inc. Method for minimizing visual artifacts converting two-dimensional motion pictures into three-dimensional motion pictures
US20050231505A1 (en) * 1998-05-27 2005-10-20 Kaye Michael C Method for creating artifact free three-dimensional images converted from two-dimensional images
US7116323B2 (en) * 1998-05-27 2006-10-03 In-Three, Inc. Method of hidden surface reconstruction for creating accurate three-dimensional images converted from two-dimensional images
DE19835215C2 (en) * 1998-08-05 2000-07-27 Mannesmann Vdo Ag Combination instrument
GB9909163D0 (en) 1999-04-21 1999-06-16 Image Scan Holdings Plc Automatic defect detection
US6621918B1 (en) 1999-11-05 2003-09-16 H Innovation, Inc. Teleradiology systems for rendering and visualizing remotely-located volume data sets
US9031383B2 (en) 2001-05-04 2015-05-12 Legend3D, Inc. Motion picture project management system
US9286941B2 (en) 2001-05-04 2016-03-15 Legend3D, Inc. Image sequence enhancement and motion picture project management system
US8401336B2 (en) 2001-05-04 2013-03-19 Legend3D, Inc. System and method for rapid image sequence depth enhancement with augmented computer-generated elements
US8897596B1 (en) 2001-05-04 2014-11-25 Legend3D, Inc. System and method for rapid image sequence depth enhancement with translucent elements
US7039723B2 (en) 2001-08-31 2006-05-02 Hinnovation, Inc. On-line image processing and communication system
EP2249312A1 (en) 2009-05-06 2010-11-10 Thomson Licensing Layered-depth generation of images for 3D multiview display devices
CN102110308A (en) * 2009-12-24 2011-06-29 鸿富锦精密工业(深圳)有限公司 Three-dimensional solid graph display system and method
US8730232B2 (en) 2011-02-01 2014-05-20 Legend3D, Inc. Director-style based 2D to 3D movie conversion system and method
US9282321B2 (en) 2011-02-17 2016-03-08 Legend3D, Inc. 3D model multi-reviewer system
US9113130B2 (en) 2012-02-06 2015-08-18 Legend3D, Inc. Multi-stage production pipeline system
US9407904B2 (en) 2013-05-01 2016-08-02 Legend3D, Inc. Method for creating 3D virtual reality from 2D images
US9288476B2 (en) 2011-02-17 2016-03-15 Legend3D, Inc. System and method for real-time depth modification of stereo images of a virtual reality environment
US9241147B2 (en) 2013-05-01 2016-01-19 Legend3D, Inc. External depth map transformation method for conversion of two-dimensional images to stereoscopic images
US9007365B2 (en) 2012-11-27 2015-04-14 Legend3D, Inc. Line depth augmentation system and method for conversion of 2D images to 3D images
US9547937B2 (en) 2012-11-30 2017-01-17 Legend3D, Inc. Three-dimensional annotation system and method
FI20135001L (en) * 2013-01-02 2014-07-03 Tekla Corp Computer-aided modeling
US9007404B2 (en) 2013-03-15 2015-04-14 Legend3D, Inc. Tilt-based look around effect image enhancement method
US9438878B2 (en) 2013-05-01 2016-09-06 Legend3D, Inc. Method of converting 2D video to 3D video using 3D object models
US9609307B1 (en) 2015-09-17 2017-03-28 Legend3D, Inc. Method of converting 2D video to 3D video using machine learning

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3621214A (en) * 1968-11-13 1971-11-16 Gordon W Romney Electronically generated perspective images
US3602702A (en) * 1969-05-19 1971-08-31 Univ Utah Electronically generated perspective images
US3899662A (en) * 1973-11-30 1975-08-12 Sperry Rand Corp Method and means for reducing data transmission rate in synthetically generated motion display systems
US3944997A (en) * 1974-04-18 1976-03-16 Research Corporation Image generator for a multiterminal graphic display system
US4127849A (en) * 1975-11-03 1978-11-28 Okor Joseph K System for converting coded data into display data
GB1532275A (en) * 1976-01-28 1978-11-15 Nat Res Dev Apparatus for controlling raster-scan displays
NL179417C (en) * 1976-06-22 1986-09-01 Hollandse Signaalapparaten Bv BRIGHTNESS CONTROL DEVICE FOR DISPLAYING VIDEO SIGNALS ON A GRID SCAN DISPLAY.
JPS5326534A (en) * 1976-08-25 1978-03-11 Hitachi Ltd Video display device
US4056713A (en) * 1976-10-01 1977-11-01 Digital Equipment Corporation Display processing unit for drawing vectors
US4121283A (en) * 1977-01-17 1978-10-17 Cromemco Inc. Interface device for encoding a digital image for a CRT display
SU834692A1 (en) * 1977-10-19 1981-05-30 Институт Автоматики И Электрометриисо Ah Cccp Device for output of halftone images of three-dimensional objects onto television receiver screen
US4222048A (en) * 1978-06-02 1980-09-09 The Boeing Company Three dimension graphic generator for displays with hidden lines
US4303986A (en) * 1979-01-09 1981-12-01 Hakan Lans Data processing system and apparatus for color graphics display
US4238826A (en) * 1979-02-12 1980-12-09 Aai Corporation Method and apparatus for image signal generation and image display
US4254467A (en) * 1979-06-04 1981-03-03 Xerox Corporation Vector to raster processor
US4439761A (en) * 1981-05-19 1984-03-27 Bell Telephone Laboratories, Incorporated Terminal generation of dynamically redefinable character sets
US4439760A (en) * 1981-05-19 1984-03-27 Bell Telephone Laboratories, Incorporated Method and apparatus for compiling three-dimensional digital image information
US4412296A (en) * 1981-06-10 1983-10-25 Smiths Industries, Inc. Graphics clipping circuit

Also Published As

Publication number Publication date
JPS59129897A (en) 1984-07-26
US4475104A (en) 1984-10-02
EP0116737A3 (en) 1985-05-29
EP0116737A2 (en) 1984-08-29

Similar Documents

Publication Publication Date Title
CA1207927A (en) Three-dimensional display system
US4609917A (en) Three-dimensional display system
US6147695A (en) System and method for combining multiple video streams
US5909219A (en) Embedding a transparency enable bit as part of a resizing bit block transfer operation
EP0758118A2 (en) A volume rendering apparatus and method
EP0568358B1 (en) Method and apparatus for filling an image
US5877769A (en) Image processing apparatus and method
JPS62231380A (en) Picture synthesizing device
WO1983002509A1 (en) Method and apparatus for controlling the display of a computer generated raster graphic system
WO1996027857A1 (en) Hardware architecture for image generation and manipulation
US4970499A (en) Apparatus and method for performing depth buffering in a three dimensional display
WO1991012588A1 (en) Method and apparatus for providing a visually improved image by converting a three-dimensional quadrilateral to a pair of triangles in a computer system
JPH10334273A (en) Three-dimension image processing unit and three-dimension image processing method
EP0353952B1 (en) Reduced viewport for graphics display
US5327501A (en) Apparatus for image transformation
JP2000228779A (en) Image processor and image processing method
KR100576973B1 (en) Method of and system for graphics detection and rendering
US4748442A (en) Visual displaying
EP0168981B1 (en) Method and apparatus for spherical panning
US6919898B2 (en) Method and apparatus for ascertaining and selectively requesting displayed data in a computer graphics system
JP2755204B2 (en) Polyhedron display method and polyhedron display device
JPH05342368A (en) Method and device for generating three-dimensional picture
JPH10124039A (en) Graphic display device
Stock Introduction to Digital Computer Graphics for Video
JPS63247868A (en) Display device for 3-dimensional pattern

Legal Events

Date Code Title Description
MKEX Expiry

Effective date: 20030715