US7864197B2 - Method of background colour removal for porter and duff compositing - Google Patents

Info

Publication number: US7864197B2
Application number: US10/525,417
Other versions: US20060103671A1 (en)
Prior art keywords: image, opacity, color, composite image, channel
Inventor: Craig Matthew Brown
Assignee: Canon Kabushiki Kaisha (original assignee Canon Inc; assignor Craig Matthew Brown)
Legal status: Expired - Fee Related

Classifications

    • G Physics; G06 Computing, Calculating or Counting; G06T Image data processing or generation, in general
    • G06T 7/40: Image analysis; analysis of texture
    • G06T 15/503: 3D [Three Dimensional] image rendering; lighting effects; blending, e.g. for anti-aliasing
    • G06T 11/60: 2D [Two Dimensional] image generation; editing figures and text; combining figures or text
    • G06T 3/00: Geometric image transformation in the plane of the image

Definitions

  • the present invention relates generally to the rendering of graphical objects and, in particular, to a method and apparatus for compositing a group of graphical objects using background colour removal, and to a computer program product including a computer readable medium having recorded thereon a computer program for compositing a group of graphical objects using background colour removal.
  • a typical graphical rendering system accepts a number of input graphical objects and combines the objects to produce a resultant image.
  • Such systems typically store software coded descriptions of graphical objects displayed on a page, using a “Page Description Language (‘PDL’)”.
  • PDL describes each such graphical object including the attributes of each object. Such attributes include the size, shape and colour of each object.
  • a PDL also describes various other attributes of graphical objects such as the opacity (i.e., alpha channel) associated with an object and the compositing operation used to draw the object, as will be discussed below.
  • Graphical rendering systems are typically configured to combine graphical objects described in a PDL in order to create various formats of output.
  • the output of a graphical rendering system can include a bitmap or rendering commands sent directly to an output system renderer.
  • Many rendering systems perform such a combination of objects by placing the objects, one at a time, into a destination bitmap. As each object is placed into the destination bitmap, a resultant destination bitmap is generated that contains any previously drawn objects plus the object most recently placed in the bitmap. This result is then in turn used as the input for placing further objects.
  • Some rendering systems may generate the final result of a compositing operation for a single pixel at a time, or for a group of pixels by combining each of the objects in an image multiple times, rather than once.
  • each input object determines the effect that an object has on a final image.
  • the ordering of objects is also important in that each object is placed into the resultant bitmap in turn, where later objects may partially or completely obscure earlier objects.
  • the objects are typically rendered sequentially on a page, where each subsequent object may partially or completely obscure preceding objects.
  • FIG. 1( a ) shows an image 100 resulting from the rendering of three objects A, B and C.
  • FIG. 1( b ) shows the operations involved during the rendering of the objects A, B and C.
  • initially object B is rendered OVER object A to produce the image 103 .
  • object C is rendered OVER the objects A and B to produce the image 100 shown in FIG. 1( a ).
  • as object C was the final object to be rendered, object C completely obscures objects A and B in the region 101 where object C overlaps objects A and B.
  • the process of rendering objects one on top of another in this manner is conventionally known as the “Painter's Algorithm”.
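The Painter's Algorithm can be sketched as a loop that composites each object, in drawing order, onto an accumulating destination bitmap. The sketch below is illustrative only; the pixel representation and the OVER helper are this sketch's own choices, not the patent's:

```python
# Painter's Algorithm sketch: each object is drawn, in order, OVER the
# result of all previously drawn objects, so later objects may partially
# or completely obscure earlier ones (cf. region 101 in FIG. 1(a)).
# Pixels are premultiplied RGBA tuples with components in [0.0, 1.0].

def over(src, dst):
    """Porter-Duff OVER for a single premultiplied RGBA pixel."""
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    return (sr + dr * (1 - sa),
            sg + dg * (1 - sa),
            sb + db * (1 - sa),
            sa + da * (1 - sa))

def paint(objects, width, height):
    """Render objects back-to-front into a destination bitmap.

    Each object is a function (x, y) -> premultiplied RGBA tuple, or
    None where the object does not cover the pixel.
    """
    dest = [[(0.0, 0.0, 0.0, 0.0)] * width for _ in range(height)]
    for obj in objects:                      # drawing order matters
        for y in range(height):
            for x in range(width):
                s = obj(x, y)
                if s is not None:
                    dest[y][x] = over(s, dest[y][x])
    return dest
```

An opaque object supplied last wins wherever it covers the bitmap, which is the obscuring behaviour described above for object C.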
  • Input and output colour information of a graphical rendering system is typically described in terms of an intensity value associated with each colour component of a pixel.
  • each of the Red, Green, and Blue colour components of a pixel is represented by an 8 bit (or byte) value.
  • the value of each of the bytes represents the intensity of a particular colour component (i.e., Red, Green or Blue).
  • each 24 bit pixel has an associated opacity value (i.e., alpha channel) ranging between 0% opacity and 100% opacity.
  • An opacity value of 0% indicates that a pixel is completely transparent whilst an opacity value of 100% indicates that the pixel is completely opaque.
  • Opacity values allow for a plurality of objects to be placed one on top of another to produce a resultant image, where one or more of the objects may be partially obscured by one or more other transparent objects.
  • the operation of combining objects using opacity values is referred to as compositing.
  • a partially opaque object representing a piece of tinted glass can be placed OVER one or more other objects to produce a resultant image.
  • a graphics rendering system combines the colour and opacity values representing the glass with the colour and opacity values of the other objects. The image produced by this combination depicts the other objects as seen through the tinted glass.
  • FIG. 2( a ) shows a partially opaque object F (e.g. having an opacity value of 50%) composited onto a completely opaque object E (i.e., having an opacity value of 100%) to produce a resultant image 200 .
  • FIG. 2( b ) shows the operations involved during the rendering of the objects E and F.
  • the resultant pixel values represent the combination of object E and object F.
  • FIG. 3( a ) shows the object F composited onto the object E to produce an image 300 , where object E is completely opaque and object F is partially opaque.
  • FIG. 3( b ) shows the operations involved during the rendering of the objects E and F, where the compositing operation used to produce the image 300 is an intersection (IN) operation.
  • FIG. 18 shows the result of each of the above compositing operators together with a variety of other conventional compositing operators, which are conventionally known as “Porter-Duff Compositing Operators”.
  • Graphical rendering systems can also be configured to combine objects into a group before processing. Generally, such groups of objects are processed as though the objects of the group are joined to produce a single object. The single object can then be placed onto a background image. Objects grouped in such a manner can have operations applied to the group as a whole after all group member objects are combined and before the group is placed on the background image.
  • FIG. 4( a ) shows the result of rendering three objects A, B and C to produce the same image 100 seen in FIG. 1( a ). However, as shown in FIG. 4( b ), in this instance the objects B and C are initially combined to form group X. Group X is then composited onto object A to produce the resultant image 100 .
  • each of the objects within a group can be drawn onto a background image (e.g. object A) using a different compositing operation.
  • different objects of a group may be intersected with a background image or may act as a lighting condition on the background image.
  • the technique of initially combining the objects of a group to produce a single object (e.g. group X), and then placing the combined result on a background, is not suitable for all compositing operations.
  • Such a technique is not suitable because the compositing operations associated with each of the individual objects of a group, which would otherwise be performed to combine each object separately with the background image, are not performed when the objects are grouped.
  • FIG. 5( a ) shows an image 501 resulting from the compositing of an object E onto an object A, using an OVER operator.
  • An object F is then composited onto the image 501 using an intersection (IN) operator to produce the image 500 shown in FIG. 5( b ).
  • the compositing expression for the image 500 is, therefore, (F IN (E OVER A)).
  • FIG. 6( a ) shows the operations involved in the rendering of the objects E, A and F, where objects E and F are initially grouped together to produce group Y.
  • the compositing expression for the image 600 is ((F IN E) OVER A).
  • the operations of FIG. 6( a ) produce an image 600 shown in FIG. 6( b ).
  • the image 500 is different to the image 600 .
  • the grouping of the objects E and F results in the operation (F OVER A) not being performed in producing the image 600 .
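The difference between the image 500 and the image 600 can be checked numerically on a single pixel. The colour choices below are hypothetical (A opaque white, E opaque red, F 50%-opaque blue, all premultiplied); they are not values taken from the figures:

```python
# Single-pixel check that (F IN (E OVER A)) differs from ((F IN E) OVER A).
# Pixels are premultiplied RGBA in [0.0, 1.0]; the colours are hypothetical.

def over(s, d):
    """Porter-Duff OVER: source placed on top of destination."""
    return tuple(sc + dc * (1 - s[3]) for sc, dc in zip(s, d))

def in_(s, d):
    """Porter-Duff IN: source kept only in proportion to destination opacity."""
    return tuple(sc * d[3] for sc in s)

A = (1.0, 1.0, 1.0, 1.0)   # opaque white background
E = (1.0, 0.0, 0.0, 1.0)   # opaque red
F = (0.0, 0.0, 0.5, 0.5)   # 50%-opaque blue, premultiplied

ungrouped = in_(F, over(E, A))    # F IN (E OVER A), as for image 500
grouped   = over(in_(F, E), A)    # (F IN E) OVER A, as for image 600

print(ungrouped)   # (0.0, 0.0, 0.5, 0.5)
print(grouped)     # (0.5, 0.5, 1.0, 1.0)
```

In the grouped case the background A survives beneath the partially transparent group, whereas the ungrouped expression removes it through the IN operator, so the two results differ.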
  • a conventional rendering system typically takes a copy of the background image and then renders each of the objects in the group onto the copy. Such a rendering system then calculates the percentage of background colour in the resultant image copy and removes this background colour from the image copy. The rendering system then applies operations to the group as a whole. The result of such operations is then composited onto the original background using conventional blending operations.
  • when each object of a group of objects adds colour and opacity to a background image, conventional rendering systems using the above process produce an aesthetically satisfactory result for certain conventional compositing operations (i.e., OVER, ATOP, ROVER, Multiply and Plus).
  • however, when one object of such a group removes colour or opacity data from the background image, conventional rendering systems are unable to satisfactorily remove the background image colour from the representation of the group composited onto the background.
  • as a result, a group of objects cannot be composited as a whole onto a background image in order to produce an aesthetically satisfactory result for all compositing operations.
  • a method of compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said method comprising steps of:
  • a method of compositing a grouped plurality of graphical objects with an image, each said object and said image having associated colour and opacity component values, said method comprising steps of:
  • an apparatus for compositing a grouped plurality of graphical objects with an image, each said object and said image having associated colour and opacity component values, the apparatus comprising a memory for storing a program;
  • a computer program product having a computer readable medium having a computer program recorded therein for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values;
  • a computer program product having a computer readable medium having a computer program recorded therein for compositing a grouped plurality of graphical objects with an image, each said object and said image having associated colour and opacity component values.
  • FIG. 1( a ) shows an image resulting from the rendering of three objects
  • FIG. 1( b ) shows the operations involved during the rendering of the objects of FIG. 1( a );
  • FIG. 2( a ) shows a partially opaque object composited onto a completely opaque object
  • FIG. 2( b ) shows the operations involved during the rendering of the objects of FIG. 2( a );
  • FIG. 3( a ) shows an object composited onto another object
  • FIG. 3( b ) shows the operations involved during the rendering of the objects of FIG. 3( a );
  • FIG. 4( a ) shows an image resulting from the rendering of three objects
  • FIG. 4( b ) shows the operations involved during the rendering of the objects of FIG. 4( a );
  • FIG. 5( a ) shows the operations involved during the rendering of three objects
  • FIG. 5( b ) shows an image resulting from the operations of FIG. 5( a );
  • FIG. 6( a ) shows the operations involved during the rendering of three objects
  • FIG. 6( b ) shows an image resulting from the operations of FIG. 6( a );
  • FIG. 7 is a flow diagram showing a method of compositing a group of graphical objects in accordance with one embodiment
  • FIG. 8( a ) shows a green object that is partially transparent
  • FIG. 8( b ) shows a group object
  • FIG. 9 shows a clear background image upon which the objects of FIGS. 8( a ) and 8 ( b ) will be composited, in accordance with an example
  • FIG. 10 shows the object of FIG. 8( a ) composited onto the background image of FIG. 9 ;
  • FIG. 11 shows a duplicate of the background image of FIG. 10 and an extra opacity channel
  • FIG. 12 shows an updated duplicate background image and an updated alpha channel following the compositing of an object of FIG. 8( b ) onto the duplicate background image of FIG. 11 ;
  • FIG. 13 shows a further updated duplicate background image and an updated alpha channel following the compositing of another object of FIG. 8( b ) onto the duplicate background image of FIG. 12 ;
  • FIG. 14 shows a further updated duplicate background image following background colour removal
  • FIG. 15 shows a further updated duplicate background image following the application of group opacity to the updated duplicate background image of FIG. 14 ;
  • FIG. 16 shows an updated version of the original background image of FIG. 10 following the compositing of the group object of FIG. 8( b );
  • FIG. 17 is a schematic block diagram of a general purpose computer upon which arrangements described can be practiced.
  • FIG. 18 shows the result of each of the above compositing operators together with a variety of other conventional compositing operators.
  • a method 700 (see FIG. 7 ) of compositing a group of graphical objects onto a background image, in accordance with an embodiment of the present invention, is described below with reference to FIGS. 7 to 18 .
  • the method 700 allows a grouped plurality of objects to be composited onto a background image to produce the same result as if each of the objects had been rendered separately onto the background image.
  • the method 700 allows such a composition to be performed when an operation is to be applied to a grouped plurality of objects as a whole.
  • the principles of the method 700 described herein have general applicability to any rendering system that accepts input graphical objects and generates a final image.
  • the final image may be represented in any suitable format (e.g. pixels or rendering commands).
  • the input objects for such a rendering system can be generated using a graphical user interface, where a user can group a plurality of objects together in order to process the group of objects as a single object (or group object).
  • a group object allows any operation (e.g. group opacity or input filtering) that can be performed on a single object to be performed on the group object.
  • Each separate input object of such a rendering system includes an associated compositing operation used to composite the object onto a background image.
  • the method 700 is preferably practiced using a general-purpose computer system 1700 , such as that shown in FIG. 17 wherein the processes of FIGS. 7 to 18 may be implemented as software, such as an application program executing within the computer system 1700 .
  • the steps of method 700 are effected by instructions in the software that are carried out by the computer.
  • the instructions may be formed as one or more code modules, each for performing one or more particular tasks.
  • the software may also be divided into two separate parts, in which a first part performs the method 700 and a second part manages a user interface between the first part and the user.
  • the software may be stored in a computer readable medium, including the storage devices described below, for example.
  • the software is loaded into the computer from the computer readable medium, and then executed by the computer.
  • a computer readable medium having such software or computer program recorded on it is a computer program product.
  • the use of the computer program product in the computer preferably effects an advantageous apparatus for implementing the method 700 .
  • the computer system 1700 is formed by a computer module 1701 , input devices such as a keyboard 1702 and mouse 1703 , output devices including a printer 1715 , a display device 1714 and loudspeakers 1717 .
  • a Modulator-Demodulator (Modem) transceiver device 1716 is used by the computer module 1701 for communicating to and from a communications network 1720 , for example connectable via a telephone line 1721 or other functional medium.
  • the modem 1716 can be used to obtain access to the Internet, and other network systems, such as a Local Area Network (LAN) or a Wide Area Network (WAN), and may be incorporated into the computer module 1701 in some implementations.
  • the computer module 1701 typically includes at least one processor unit 1705 , and a memory unit 1706 , for example formed from semiconductor random access memory (RAM) and read only memory (ROM).
  • the module 1701 also includes a number of input/output (I/O) interfaces including an audio-video interface 1707 that couples to the video display 1714 and loudspeakers 1717 , an I/O interface 1713 for the keyboard 1702 and mouse 1703 and optionally a joystick (not illustrated), and an interface 1708 for the modem 1716 and printer 1715 .
  • the modem 1716 may be incorporated within the computer module 1701 , for example within the interface 1708 .
  • a storage device 1709 is provided and typically includes a hard disk drive 1710 and a floppy disk drive 1711 .
  • a magnetic tape drive (not illustrated) may also be used.
  • a CD-ROM drive 1712 is typically provided as a non-volatile source of data.
  • the components 1705 to 1713 of the computer module 1701 typically communicate via an interconnected bus 1704 and in a manner which results in a conventional mode of operation of the computer system 1700 known to those in the relevant art. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun SPARCstations or like computer systems evolved therefrom.
  • the application program is resident on the hard disk drive 1710 and read and controlled in its execution by the processor 1705 .
  • Intermediate storage of the program and any data fetched from the network 1720 may be accomplished using the semiconductor memory 1706 , possibly in concert with the hard disk drive 1710 .
  • the application program may be supplied to the user encoded on a CD-ROM or floppy disk and read via the corresponding drive 1712 or 1711 , or alternatively may be read by the user from the network 1720 via the modem device 1716 .
  • the software can also be loaded into the computer system 1700 from other computer readable media.
  • computer readable medium refers to any storage or transmission medium that participates in providing instructions and/or data to the computer system 1700 for execution and/or processing.
  • storage media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 1701 .
  • transmission media include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
  • the method 700 of compositing a group of graphical objects may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub functions of the method 700 .
  • dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.
  • compositing operations are generally described below in terms of premultiplied alpha (i.e., opacity), unless otherwise indicated.
  • Many graphical rendering systems store pixel colour component values as pre-multiplied alpha values in order to reduce the complexity of compositing operations.
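The conversion between the two storage forms can be sketched as follows (helper names are illustrative, not from the patent):

```python
# Converting between non-premultiplied and premultiplied pixel storage.
# Premultiplying folds the opacity into each colour component once, so a
# compositing operation avoids recomputing colour-times-alpha products.

def premultiply(r, g, b, a):
    """(Rc, Gc, Bc, a) -> (Rca, Gca, Bca, a), components in [0.0, 1.0]."""
    return (r * a, g * a, b * a, a)

def unpremultiply(ra, ga, ba, a):
    """Inverse conversion; a fully transparent pixel carries no colour."""
    if a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    return (ra / a, ga / a, ba / a, a)
```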
  • RGBA: Red, Green, Blue, Alpha.
  • Table 1 below shows the definition of terms used throughout the following description to represent compositing operations. As shown in Table 1, the term ‘S’ below is used to refer to a colour component value stored in a source buffer. The term ‘D’ below is used to refer to a colour component value stored in a destination buffer.
  • Sc: Non-premultiplied source colour component.
  • Sca: Premultiplied source colour component.
  • Sra, Sga, Sba: Premultiplied source colour components for Red, Green and Blue, respectively.
  • Sa: Source opacity component.
  • Dc: Non-premultiplied destination colour component.
  • Dca: Premultiplied destination colour component.
  • Dra, Dga, Dba: Premultiplied destination colour components for Red, Green and Blue, respectively.
  • Da: Destination opacity component.
  • D′: The result of a destination buffer following a compositing step (i.e., the updated buffer).
  • equations (1) and (2) are used to determine the result of compositing source pixel values with destination pixel values.
  • a source pixel value may be associated with either a single object or a group of objects represented as a single object.
  • a value is determined using equations (1) and (2) for each colour component (e.g. R, G, B) and for the alpha channel.
  • equations (1) and (2) are resolved in terms of premultiplied alpha values prior to rendering.
  • Dca′ = X·f(Sc, Dc)·Sa·Da + Y·Sca·(1 − Da) + Z·Dca·(1 − Sa)  (1)
  • Da′ = X·Sa·Da + Y·Sa·(1 − Da) + Z·Da·(1 − Sa)  (2), where X, Y and Z are operator-dependent values and f( ) is an operator-dependent blending function defined per compositing operation.
  • Table 2 defines some compositing operations in terms of X, Y, Z and f( ). A person skilled in the relevant art would appreciate that other compositing operations known in the relevant art can be defined in a similar fashion.
  • Table 2 also includes a column, Colour (D ca ′), indicating an equation for determining the destination colour component value for a particular compositing operation (e.g. source pixel OVER destination pixel).
  • Table 2 also includes a column, Da′, showing an equation for determining the destination opacity component value, and a column, Da(d)′, showing an equation for determining the opacity of the background remaining in a group buffer.
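Equations (1) and (2) can be implemented once and specialised per operator by the triple (X, Y, Z) and the blending function f( ), in the spirit of Table 2. The sketch below hard-codes four illustrative entries using their conventional Porter-Duff definitions; it is a reading of the equations, not the patent's implementation:

```python
# Generic premultiplied compositor driven by equations (1) and (2):
#   Dca' = X*f(Sc, Dc)*Sa*Da + Y*Sca*(1 - Da) + Z*Dca*(1 - Sa)
#   Da'  = X*Sa*Da           + Y*Sa*(1 - Da)  + Z*Da*(1 - Sa)
# Four illustrative entries follow their conventional Porter-Duff
# definitions; the patent's Table 2 is the authoritative list.

OPS = {
    #         X  Y  Z  f(Sc, Dc)
    "OVER":  (1, 1, 1, lambda sc, dc: sc),
    "ROVER": (1, 1, 1, lambda sc, dc: dc),
    "IN":    (1, 0, 0, lambda sc, dc: sc),
    "XOR":   (0, 1, 1, None),   # X = 0, so f( ) is never evaluated
}

def composite(op, src, dst):
    """Composite one premultiplied RGBA source pixel onto a destination pixel."""
    x, y, z, f = OPS[op]
    sa, da = src[3], dst[3]
    out = []
    for sca, dca in zip(src[:3], dst[:3]):
        # Recover non-premultiplied components, guarding transparent pixels.
        sc = sca / sa if sa else 0.0
        dc = dca / da if da else 0.0
        blended = f(sc, dc) if x else 0.0
        out.append(x * blended * sa * da + y * sca * (1 - da) + z * dca * (1 - sa))
    out.append(x * sa * da + y * sa * (1 - da) + z * da * (1 - sa))
    return tuple(out)
```

For OVER this reduces to Sca + Dca·(1 − Sa), the familiar premultiplied form.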
  • the method 700 of compositing a group of graphical objects (hereinafter ‘the group object’) onto a background image will now be described, with reference to FIG. 7 .
  • the method 700 will be further described below with reference to an example shown in FIGS. 8( a ) to 16 .
  • a background image (D o ) is generated by compositing one or more graphical objects to produce the background image (D o ).
  • the background image (D o ) includes pixel value background colour components (e.g. RGB colour components) and opacity (i.e., alpha channel).
  • each one of the objects in the group object includes colour components and opacity.
  • the method 700 is preferably implemented as software resident in memory 1706 and controlled in its execution by the processor 1705 .
  • the method 700 preferably uses bitmap rendering, where each object is completely drawn into a destination bitmap one at a time.
  • other rendering techniques can be utilised in accordance with the described embodiments, including scan line rendering (i.e., complete single scan lines are generated one at a time) or pixel rendering (i.e., the final value of each pixel is generated one at a time).
  • the method begins at step 701 , where the processor 1705 duplicates the background image (D o ) to produce a duplicate background image (D 1 ). Also at step 701 , the processor 1705 stores the original background image (D o ), in memory 1706 , for later compositing with the group object. In the method 700 , the duplicate background image (D 1 ) is used for generating each of the objects in the group object. Once the group object is generated in (D 1 ), the group object is composited back onto the background image (D o ).
  • the method 700 continues at the next step 702 , where the processor 1705 generates an alpha channel (D a(d) ) for storing the opacity values of the background image (D 0 ) remaining in the duplicate background image (D 1 ), during the operation of compositing the group object onto the background image (D 1 ).
  • the duplicate background image (D 1 ) initially contains identical values to the background image (D 0 ).
  • the alpha channel (D a(d) ) is initially set to fully opaque.
  • the following steps 703 , 704 , 705 and 706 of the method 700 composite each of the objects of the group object, into the duplicate background image (D 1 ) and the alpha channel (D a(d) ).
  • if, at step 703, the processor 1705 determines that each of the objects of the group object has been rendered into the duplicate background image (D 1 ), then the method 700 proceeds to step 707. Otherwise, the method 700 proceeds to step 704, where the colour component values associated with a current object of the group object are composited into the duplicate background image (D 1 ).
  • the colour components of the current object are composited at step 704 using an equation shown in the colour column, D ca ′, of Table 2, depending on the compositing operation being performed.
  • the processor 1705 composites the alpha channel of the current object of the group object onto an alpha channel (not shown) associated with the duplicate of the background image (D 1 ).
  • the compositing of the alpha channel of the current object of the group object in step 705 is performed using an equation shown in the opacity column (D a ′) of Table 2, depending on the compositing operation being used.
  • the method 700 continues at the next step 706 , where the processor 1705 composites the alpha channel of the current object of the group object onto the alpha channel, D a(d) ′, generated at step 702 .
  • following step 706, the method 700 returns to step 703.
  • at step 707, if the group object has an associated group opacity, a filter effect or any other operation acting on the group object as a whole, then the method 700 proceeds to step 708. Otherwise, the method 700 proceeds to step 712, where the processor 1705 replaces the background image (D 0 ) with the duplicate background image including the group object (D 1 ), and the method 700 concludes.
  • the processor 1705 removes any of the background colour left in the duplicate background image (D 1 ) from the duplicate background image (D 1 ) prior to applying the group opacity or effect, for example, to the group object rendered in the duplicate background image (D 1 ).
  • the processor 1705 determines the updated premultiplied colour component value (D ca1 ′), and the updated destination opacity component value (D a1 ′), for each pixel of the duplicate background image (D 1 ).
  • at step 709, the processor 1705 inverts the background opacity channel value (D a1(d) ) in accordance with equation (5).
  • the result of step 709 represents the amount of the original background image (D 0 ) removed from the duplicate background image (D 1 ) during the execution of steps 703 to 706 .
  • Da1(d)′ = 1 − Da1(d)  (5)
  • the processor 1705 applies the group effect (e.g. the group opacity or the filter effect) to the group object to produce a completed group object.
  • the method 700 concludes at the next step 711 , where the processor 1705 composites the completed group object onto the original background image (D 0 ).
  • the colour and opacity component values of the original background image (D 0 ) are calculated at step 711 in accordance with the following equations (6) and (7):
  • Dca0′ = X·f(Dc1, Dc0)·Da1·Da0 + Y·Dca1·(1 − Da0) + Z·Dca0·(1 − Da1(d))  (6)
  • Da0′ = X·Da1·Da0 + Y·Da1·(1 − Da0) + Z·Da0·(1 − Da1(d))  (7)
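Equations (6) and (7) can be transcribed directly for one pixel. The sketch below fixes the OVER case (X = Y = Z = 1, f(Dc1, Dc0) = Dc1); the function and argument names are this sketch's own, not the patent's:

```python
# Direct transcription of equations (6) and (7) for one pixel, fixing the
# OVER case (X = Y = Z = 1, f(Dc1, Dc0) = Dc1).  d1 is the group buffer
# pixel, d0 the original background pixel, and da1d the background
# opacity channel value Da1(d) used by the Z terms in place of Da1.

def composite_group_over(d1, da1d, d0):
    da1, da0 = d1[3], d0[3]
    out = []
    for dca1, dca0 in zip(d1[:3], d0[:3]):
        dc1 = dca1 / da1 if da1 else 0.0          # non-premultiplied group colour
        out.append(dc1 * da1 * da0                # X term: f(Dc1, Dc0) = Dc1
                   + dca1 * (1 - da0)             # Y term
                   + dca0 * (1 - da1d))           # Z term of equation (6)
    out.append(da1 * da0 + da1 * (1 - da0) + da0 * (1 - da1d))   # equation (7)
    return tuple(out)
```

The only difference from ordinary OVER is the Z term, which attenuates the original background by (1 − Da1(d)) instead of (1 − Da1).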
  • the method 700 will now be further described below by way of an example shown in FIGS. 8( a ) to 16 .
  • FIG. 8( a ) shows a 100% green object 802 that is 60% opaque.
  • FIG. 8( b ) shows a group object 805 .
  • the group object 805 contains two objects 803 and 804 .
  • the first object 803 is red and is 80% opaque.
  • the “XOR” compositing operation is associated with the red object 803 . As such, the red object 803 will be composited onto any background image using the XOR compositing operation.
  • the second object 804 of the group object 805 , is purple and is 70% opaque.
  • the “ROVER” compositing operation is associated with the object 804 .
  • the purple object 804 will be composited onto any background image using the ROVER compositing operation.
  • the group object 805 is 50% opaque and the “OVER” compositing operation is associated with the group object. As such, the group object 805 will be composited onto any background image using the OVER compositing operation.
  • other rendering techniques can be utilised in accordance with the example, including scan line rendering or pixel rendering, for example.
  • the object 802 is initially composited onto the background image (D 0 ), resulting in an updated background image (D 0 ′) 1001 , as seen in FIG. 10 .
  • the object 802 is composited onto the background image (D 0 ) using the compositing operation (i.e., OVER) associated with the object 802 .
  • the background image (D_0′) 1001 is duplicated to produce a duplicate background image (D_1) 1101, as seen in FIG. 11.
  • the background image (D_0′) 1001 can be stored in memory 1706 for later compositing with the group object 805.
  • an alpha channel (D_a(d)) 1103 is generated, as shown in FIG. 11.
  • the updated destination values following the creation of the duplicate background image (D_1) 1101, and the alpha channel (D_a(d)) 1103, are listed below:
  • FIG. 12 shows an updated duplicate background image (D 1 ′) 1201 following the compositing of the object 803 onto the duplicate background image (D 1 ) 1101 including the object 802 , as at steps 703 to 706 of the method 700 .
  • FIG. 12 also shows the updated alpha channel (D a(d) ′) 1203 following the compositing of the object 803 .
  • the updated destination values for the region of overlap 1204 of the objects 802 and 803 , as seen in FIG. 12 , following the compositing of the object 803 onto the duplicate background image (D 1 ) 1101 are listed below:
  • FIG. 13 shows an updated duplicate background image (D 1 ′) 1301 following the compositing of the object 804 onto the previously updated duplicate background image (D 1 ′) 1201 , as at steps 703 to 706 of the method 700 .
  • FIG. 13 also shows the updated alpha channel (D a(d) ′) 1303 following the compositing of the object 804 .
  • the object 804 is composited onto the previously updated background image (D 1 ′) 1201 using the ROVER compositing operation.
  • the equations for colour D_ca1′, opacity D_a1′ and D_a1(d)′ for the ROVER compositing operation are listed below:
  • D_a1(d)′ = D_a1(d)×(1 − S_a)
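A per-pixel sketch of this ROVER step, including the extra alpha channel update, might look as follows. The colour and opacity updates shown here are the conventional Porter-Duff destination-over forms, which are assumed for illustration since they are not reproduced above; only the D_a1(d) update follows the equation given in the text, and the function name is hypothetical:

```python
def composite_rover(sca, sa, dca1, da1, da1d):
    """Sketch of compositing a source object onto the duplicate background
    with ROVER (destination over source), updating the extra alpha channel.

    sca, sa    : premultiplied colour and opacity of the source object
    dca1, da1  : premultiplied colour and opacity of the duplicate image
    da1d       : extra alpha channel tracking the original background
    """
    new_dca1 = dca1 + sca * (1.0 - da1)   # destination colour stays on top
    new_da1 = da1 + sa * (1.0 - da1)      # combined opacity
    new_da1d = da1d * (1.0 - sa)          # tracking update from the text
    return new_dca1, new_da1, new_da1d
```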
  • the updated destination values for the region of overlap 1304 of the objects 802 , 803 and 804 , as seen in FIG. 13 , following the compositing of the object 804 onto the previously updated duplicate background image (D 1 ′) 1201 are listed below:
  • FIG. 14 shows an updated duplicate background image (D 1 ′) 1401 following the removal of any remaining colour attributed to the previously updated duplicate background image (D 1 ′) 1301 , as at step 708 , and inverting the result of D a1(d) , as at step 709 of the method 700 .
  • the processor 1705 determines the updated premultiplied colour component value (D ca1 ′), and the updated destination opacity component value (D a1 ′), for each pixel of the updated duplicate background image (D 1 ′) 1401 .
  • FIG. 14 also shows the updated alpha channel (D a(d) ′) 1402 following the inversion step 709 .
  • the equations for colour D_ca1′, opacity D_a1′, and D_a1(d)′ for the operation of step 708 are listed as follows:
  • D_ca1′ = D_ca1 − D_ca0×D_a1(d)
  • D_a1′ = D_a1 − D_a0×D_a1(d)
  • D_a1(d)′ = 1 − D_a1(d)
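Steps 708 and 709 can be sketched per pixel as follows. The function and variable names are illustrative only, and values are scalar single-channel quantities:

```python
def remove_background(dca1, da1, da1d, dca0, da0):
    """Sketch of steps 708 and 709: subtract the residual original
    background from the duplicate image, then invert the tracking
    channel so it represents the amount of group object per pixel.

    dca1, da1 : premultiplied colour and opacity of the duplicate image
    da1d      : fraction of the original background still present
    dca0, da0 : premultiplied colour and opacity of the original background
    """
    new_dca1 = dca1 - dca0 * da1d   # remove remaining background colour
    new_da1 = da1 - da0 * da1d      # remove remaining background opacity
    new_da1d = 1.0 - da1d           # equation (5): invert tracking alpha
    return new_dca1, new_da1, new_da1d
```

Where no object was drawn on a pixel (D_a1(d) = 1), both colour and opacity are removed entirely, leaving a fully transparent pixel in the group representation.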
  • FIG. 15 shows an updated duplicate background image (D 1 ′) 1501 following the application of group opacity to the previously updated duplicate background image (D 1 ′) 1401 and the previously updated alpha channel (D a(d) ′) 1402 , as at step 710 of the method 700 .
  • the equations for colour D_ca1′, opacity D_a1′, and D_a1(d)′ for the operation of step 710 are listed as follows:
  • D_ca1′ = D_ca1 × 0.5
  • D_a1′ = D_a1 × 0.5
  • D_a1(d)′ = D_a1(d) × 0.5
  • the updated destination values for the region of overlap 1503 as seen in FIG. 15 , following the operation of step 710 on the previously updated duplicate background image (D 1 ′) 1401 and opacity (D a(d) ′) 1402 are listed below:
  • FIG. 16 shows an updated original background image (D 0 ′) 1601 following the compositing of the group object 805 of the previously updated duplicate background image (D 1 ′) 1501 of FIG. 15 , onto the background image (D 0 ′) 1001 , as at step 711 of the method 700 .
  • the group object 805 is composited onto the background image (D 0 ′) 1001 using the OVER compositing operation.
  • f(S_c, D_c) = S_c
  • X = 1, Y = 1 and Z = 1.
  • the equations for colour D_ca1′ and opacity D_a1′ for the OVER compositing operation are listed below:
  • the principles of the method 700 have general applicability to any rendering system that accepts input graphical objects and generates a final image.
  • the final image may be represented in any suitable format (e.g. pixels or rendering commands).
  • the input objects for such a rendering system can be generated using a graphical user interface, where a user can group a plurality of objects together in order to process the group of objects as a single object (or group object).
  • a group object allows any operation (e.g. group opacity or input filtering) that can be performed on a single object to be performed on the group object.
  • Each separate input object of such a rendering system includes an associated compositing operation used to composite the object onto a background image.
  • the method 700 utilises equations (1) and (2) to determine the result of compositing source pixel values with destination pixel values.
  • a source pixel value may be associated with either a single input object or a group of objects represented as a single object.
  • a value is determined using equations (1) and (2) for each colour component (e.g. R, G, B) and for the alpha channel.
  • the source colour component values are stored in a source buffer and the destination colour component values are stored in a destination buffer.
  • the source and destination buffers can be in the form of image bitmaps substantially identical in size and position.
  • a plurality of such image bitmaps can be layered to form an “image file”, where each layer of the image file comprises bitmap information for each pixel in the layer.
  • the source colour component values ‘S’, as described above, may be accessed from one or more layers of a source image file. The destination colour component values ‘D’, as described above, resulting from the combination of the source colour component values may be output to a further image file or to an output device.
  • the operation used to combine the source pixels from each of the layers of the source image file can be specified in the source image file.
  • Such a source image file may also specify the grouping of image layers, where a subset of the source image file is first composited together to produce a resultant value, which is subsequently combined with the other layers of the source image file in accordance with the method 700 described above.
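The layered source-image-file structure described above can be sketched as a nested list, where a nested sub-list represents a grouped subset that is composited together first. This hypothetical sketch uses only the OVER operator and scalar single-channel premultiplied values; it illustrates the naive flattening of grouped layers discussed in the text, not the background-removal machinery of method 700:

```python
def over(sca, sa, dca, da):
    """Porter-Duff OVER on premultiplied (colour, opacity) values."""
    return sca + dca * (1.0 - sa), sa + da * (1.0 - sa)

# Operator names as they might be specified in the source image file.
OPS = {"over": over}

def flatten(layers, dca=0.0, da=0.0):
    """Composite a (possibly nested) list of layers onto an initially
    transparent destination, depth first.  A nested list is a grouped
    subset: it is flattened to a single resultant value, which is then
    combined with the other layers."""
    for layer in layers:
        if isinstance(layer, list):          # grouped sub-file
            gca, ga = flatten(layer)
            dca, da = over(gca, ga, dca, da)
        else:
            sca, sa, op = layer              # (colour, opacity, operator)
            dca, da = OPS[op](sca, sa, dca, da)
    return dca, da
```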
  • vector graphics representations of input objects to a rendering system may be converted by the rendering system to pixel based objects in the form of an image file.
  • the image file can then be included as an object in a further vector graphics file and then be processed in accordance with the method 700 by the rendering system and a resultant destination image file can be subsequently converted back to vector based graphics objects for rendering.
  • the aforementioned preferred method(s) comprise a particular control flow. There are many other variants of the preferred method(s) which use different control flows without departing from the spirit or scope of the invention. Furthermore, one or more of the steps of the preferred method(s) may be performed in parallel rather than sequentially.
  • the word “comprising” means “including principally but not necessarily solely” or “having” or “including”, and not “consisting only of”. Variations of the word “comprising”, such as “comprise” and “comprises” have correspondingly varied meanings.

Abstract

A method of representing an amount of image color in a composite image includes the steps of generating at least one additional opacity channel for use in creating the composite image, and compositing at least one graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image. The composite image has composite image color and composite image opacity, and the composite image color and composite image opacity are derived from one or more of the object color, the object opacity, the image color and the image opacity. An additional step includes compositing the object opacity with the additional opacity channel to create an updated opacity channel, with the updated opacity channel representing an amount of the image color remaining in the composite image following the compositing of the at least one graphical object with the image.

Description

FIELD OF THE INVENTION
The present invention relates generally to the rendering of graphical objects and, in particular, to a method and apparatus for compositing a group of graphical objects using background colour removal, and to a computer program product including a computer readable medium having recorded thereon a computer program for compositing a group of graphical objects using background colour removal.
BACKGROUND
Many computer implemented graphical rendering systems have been developed in order to allow for the creation of graphical images comprising a combination of graphical objects. A typical graphical rendering system accepts a number of input graphical objects and combines the objects to produce a resultant image. Such systems typically store software coded descriptions of graphical objects displayed on a page, using a “Page Description Language (‘PDL’)”. A PDL describes each such graphical object including the attributes of each object. Such attributes include the size, shape and colour of each object. A PDL also describes various other attributes of graphical objects such as the opacity (i.e., alpha channel) associated with an object and the compositing operation used to draw the object, as will be discussed below.
Graphical rendering systems are typically configured to combine graphical objects described in a PDL in order to create various formats of output. For example, the output of a graphical rendering system can include a bitmap or rendering commands sent directly to an output system renderer. Many rendering systems perform such a combination of objects by placing the objects, one at a time, into a destination bitmap. As each object is placed into the destination bitmap, a resultant destination bitmap is generated that contains any previously drawn objects plus the object most recently placed in the bitmap. This result is then in turn used as the input for placing further objects. Some rendering systems may generate the final result of a compositing operation for a single pixel at a time, or for a group of pixels by combining each of the objects in an image multiple times, rather than once.
The attributes of each input object determine the effect that an object has on a final image. The ordering of objects is also important in that each object is placed into the resultant bitmap in turn, where later objects may partially or completely obscure earlier objects. Where a PDL describes a plurality of overlapping objects, the objects are typically rendered sequentially on a page, where each subsequent object may partially or completely obscure preceding objects. For example, FIG. 1( a) shows an image 100 resulting from the rendering of three objects A, B and C. FIG. 1( b) shows the operations involved during the rendering of the objects A, B and C. As seen in FIG. 1( b), initially object B is rendered OVER object A to produce the image 103. Then object C is rendered OVER the objects A and B to produce the image 100 shown in FIG. 1( a). As object C was the final object to be rendered, object C completely obscures objects A and B in a region 101 where object C overlaps objects A and B. The process of rendering objects one on top of another in this manner is conventionally known as the “Painter's Algorithm”.
Input and output colour information of a graphical rendering system is typically described in terms of an intensity value associated with each colour component of a pixel. For example, for conventional 24 bit Red, Green, and Blue (‘RGB’) colour pixel format, each of the Red, Green and Blue colour components of a pixel is represented by an 8 bit (or byte) value. The value of each of the bytes represents the intensity of a particular colour component (i.e., Red, Green or Blue). Further, each 24 bit pixel has an associated opacity value (i.e., alpha channel) ranging between 0% opacity and 100% opacity. An opacity value of 0% indicates that a pixel is completely transparent whilst an opacity value of 100% indicates that the pixel is completely opaque.
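As a concrete illustration of this pixel format, the sketch below packs four 8-bit channel intensities into a single 32-bit value and maps an 8-bit alpha value to the 0%-100% opacity range. The function names and ARGB channel ordering are illustrative assumptions, not taken from the patent:

```python
def pack_rgba(r, g, b, a):
    """Pack four 8-bit channel intensities (0-255) into one 32-bit
    pixel value, with alpha in the most significant byte (ARGB)."""
    return (a << 24) | (r << 16) | (g << 8) | b

def opacity_percent(a):
    """Map an 8-bit alpha value to the 0%-100% opacity range, where
    0% is completely transparent and 100% completely opaque."""
    return 100.0 * a / 255.0
```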
Opacity values allow for a plurality of objects to be placed one on top of another to produce a resultant image, where one or more of the objects may be partially obscured by one or more other transparent objects. The operation of combining objects using opacity values is referred to as compositing. For example, a partially opaque object representing a piece of tinted glass can be placed OVER one or more other objects to produce a resultant image. In order to produce the resultant image, a graphics rendering system combines the colour and opacity values representing the glass with the colour and opacity values of the other objects. The image produced by this combination depicts the other objects as seen through the tinted glass.
As another example of rendering objects using opacity values, FIG. 2( a) shows a partially opaque object F (e.g. having an opacity value of 50%) composited onto a completely opaque object E (i.e., having an opacity value of 100%) to produce a resultant image 200. Again, FIG. 2( b) shows the operations involved during the rendering of the objects E and F. In a region 201 of the image 200, where object F overlaps object E, the resultant pixel values represent the combination of object E and object F.
More recently, graphical rendering systems have mathematically extended the Painter's Algorithm process to include operations such as intersecting objects or colour combinations. Such operations can give the effect of intersecting objects or of shining a light onto an object. For example, FIG. 3( a) shows the object F composited onto the object E to produce an image 300, where object E is completely opaque and object F is partially opaque. FIG. 3( b) shows the operations involved during the rendering of the objects E and F, where the compositing operation used to produce the image 300 is an intersection (IN) operation.
FIG. 18 shows the result of each of the above compositing operators together with a variety of other conventional compositing operators, which are conventionally known as “Porter-Duff Compositing Operators”.
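The conventional Porter-Duff operators of FIG. 18 can be expressed with a common parameterisation (f, X, Y, Z), matching the general form used later in equations (6) and (7). The following sketch is illustrative; the operator table, function names and single-channel scalar values are assumptions, not reproduced from the patent:

```python
# Porter-Duff operators in the general parameterised form:
#   D'ca = X*f(Sc,Dc)*Sa*Da + Y*Sca*(1-Da) + Z*Dca*(1-Sa)
#   D'a  = X*Sa*Da + Y*Sa*(1-Da) + Z*Da*(1-Sa)
OPERATORS = {
    #        f(Sc, Dc)                X  Y  Z
    "over":  (lambda sc, dc: sc,      1, 1, 1),
    "rover": (lambda sc, dc: dc,      1, 1, 1),
    "in":    (lambda sc, dc: sc,      1, 0, 0),
    "atop":  (lambda sc, dc: sc,      1, 0, 1),
    "xor":   (lambda sc, dc: 0.0,     0, 1, 1),
    "plus":  (lambda sc, dc: sc + dc, 1, 1, 1),
}

def composite(op, sca, sa, dca, da):
    """Composite one premultiplied source pixel onto a destination pixel
    using the named Porter-Duff operator."""
    f, X, Y, Z = OPERATORS[op]
    sc = sca / sa if sa else 0.0     # un-premultiply source colour
    dc = dca / da if da else 0.0     # un-premultiply destination colour
    out_ca = X * f(sc, dc) * sa * da + Y * sca * (1 - da) + Z * dca * (1 - sa)
    out_a = X * sa * da + Y * sa * (1 - da) + Z * da * (1 - sa)
    return out_ca, out_a
```

Note, for example, that "plus" reduces to D_ca′ = S_ca + D_ca when the three terms are expanded.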
Graphical rendering systems can also be configured to combine objects into a group before processing. Generally, such groups of objects are processed as though the objects of the group are joined to produce a single object. The single object can then be placed onto a background image. Objects grouped in such a manner can have operations applied to the group as a whole after all group member objects are combined and before the group is placed on the background image.
FIG. 4( a) shows the result of rendering three objects A, B and C to produce the same image 100 seen in FIG. 1( a). However, as shown in FIG. 4( b), in this instance the objects B and C are initially combined to form group X. Group X is then composited onto object A to produce the resultant image 100.
In an extension of the example of FIG. 4( a), each of the objects within a group (e.g. the group X) can be drawn onto a background image (e.g. object A) using a different compositing operation. For example, different objects of a group may be intersected with a background image or may act as a lighting condition on the background image. The technique of initially combining objects to produce a single object, then placing the combined result on a background, is not suitable in this case. The compositing operations associated with the individual objects of a group, which would otherwise combine each object separately with the background image, are not performed when the objects are grouped.
As an example, FIG. 5( a) shows an image 501 resulting from the compositing of an object E onto an object A, using an OVER operator. An object F is then composited onto the image 501 using an intersection (IN) operator to produce the image 500 shown in FIG. 5( b). The compositing expression for the image 500 is, therefore, (F IN (E OVER A)).
In contrast, FIG. 6( a) shows the operations involved in the rendering of the objects E, A and F, where objects E and F are initially grouped together to produce group Y. The compositing expression for the image 600 is ((F IN E) OVER A). The operations of FIG. 6( a) produce an image 600 shown in FIG. 6( b). As can be seen from a comparison of FIGS. 5( b) and 6(b), due to the different association of the objects A, E and F, the image 500 is different to the image 600. The grouping of the objects E and F results in the operation (F OVER A) not being performed in producing the image 600.
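The difference between the two associations can be checked numerically. In this hypothetical sketch, objects are scalar (premultiplied colour, opacity) pairs and the standard Porter-Duff OVER and IN formulas are assumed:

```python
def over(s, d):
    """Porter-Duff OVER on (premultiplied colour, opacity) pairs."""
    sca, sa = s
    dca, da = d
    return sca + dca * (1.0 - sa), sa + da * (1.0 - sa)

def pd_in(s, d):
    """Porter-Duff IN: the source is retained only where the
    destination is present."""
    sca, sa = s
    dca, da = d
    return sca * da, sa * da

A = (1.0, 1.0)   # completely opaque background object
E = (0.5, 0.5)   # 50% opaque object
F = (0.6, 0.6)   # 60% opaque object

ungrouped = pd_in(F, over(E, A))   # F IN (E OVER A), as in FIG. 5(b)
grouped = over(pd_in(F, E), A)     # (F IN E) OVER A, as in FIG. 6(b)
```

The two results differ because the grouping prevents F from being intersected with the already-composited background, mirroring the difference between the images 500 and 600.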
To render a grouped plurality of objects onto a background image, a conventional rendering system typically takes a copy of the background image and then renders each of the objects in the group onto the copy. Such a rendering system then calculates the percentage of background colour in the resultant image copy and removes this background colour from the image copy. The rendering system then applies operations to the group as a whole. The result of such operations is then composited onto the original background using conventional blending operations.
Where one object of a group of objects adds colour and opacity to a background image, conventional rendering systems using the above process produce an aesthetically satisfactory result for certain conventional compositing operations (i.e., OVER, ATOP, ROVER, Multiply and Plus). However, where one object of such a group removes colour or opacity data from the background image, conventional rendering systems are unable to satisfactorily remove the background image colour from the representation of the group composited onto the background. As such, a group of objects cannot be composited as a whole onto a background image in order to produce an aesthetically satisfactory result for all compositing operations.
Thus, a need clearly exists for a method of compositing graphical objects, which allows a grouped plurality of objects to be composited onto a background image for substantially all compositing operations.
SUMMARY
It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements.
According to one aspect of the present invention there is provided a method of compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said method comprising the steps of:
generating at least one opacity channel having associated opacity component values;
compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image; and
compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel, said altered opacity channel thereby representing the opacity component values associated with said image remaining in said image following composition with said colour and opacity components of said at least one object.
According to another aspect of the present invention there is provided a method of compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said method comprising the steps of:
generating at least one opacity channel having associated opacity component values;
compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image;
compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel; and
utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with said colour and opacity component values of said at least one object.
According to still another aspect of the present invention there is provided a method of compositing a grouped plurality of graphical objects with an image, each said object and said image having associated colour and opacity component values, said method comprising the steps of:
generating at least one opacity channel having associated opacity component values;
compositing the colour and opacity component values of each of said objects with the colour and opacity component values of said image;
compositing said opacity component values of each of said objects with that of said at least one opacity channel to produce an altered opacity channel; and
utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with the colour and opacity component values of each of said objects.
According to still another aspect of the present invention there is provided an apparatus for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said apparatus comprising:
means for generating at least one opacity channel having associated opacity component values;
means for compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image; and
means for compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel, said altered opacity channel thereby representing the opacity component values associated with said image remaining in said image following composition with said colour and opacity components of said at least one object.
According to still another aspect of the present invention there is provided an apparatus for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said apparatus comprising:
means for generating at least one opacity channel having associated opacity component values;
means for compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image;
means for compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel; and utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with said colour and opacity component values of said at least one object.
According to still another aspect of the present invention there is provided an apparatus for compositing a grouped plurality of graphical objects with an image, each said object and said image having associated colour and opacity component values, said apparatus comprising:
means for generating at least one opacity channel having associated opacity component values;
means for compositing the colour and opacity component values of each of said objects with the colour and opacity component values of said image;
means for compositing said opacity component values of each of said objects with that of said at least one opacity channel to produce an altered opacity channel; and utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with the colour and opacity component values of each of said objects.
According to still another aspect of the present invention there is provided an apparatus for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said apparatus comprising:
a memory for storing a program; and
a processor for executing said program, said program comprising:
    • code for generating at least one opacity channel having associated opacity component values;
    • code for compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image; and
    • code for compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel, said altered opacity channel thereby representing the opacity component values associated with said image remaining in said image following composition with said colour and opacity components of said at least one object.
According to still another aspect of the present invention there is provided an apparatus for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said apparatus comprising:
a memory for storing a program; and
a processor for executing said program, said program comprising:
    • code for generating at least one opacity channel having associated opacity component values;
    • code for compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image;
    • code for compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel; and
    • code for utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with said colour and opacity component values of said at least one object.
According to still another aspect of the present invention there is provided an apparatus for compositing a grouped plurality of graphical objects with an image, each said object and said image having associated colour and opacity component values, said apparatus comprising:
a memory for storing a program; and
a processor for executing said program, said program comprising:
    • code for generating at least one opacity channel having associated opacity component values;
    • code for compositing the colour and opacity component values of each of said objects with the colour and opacity component values of said image;
    • code for compositing said opacity component values of each of said objects with that of said at least one opacity channel to produce an altered opacity channel; and
    • code for utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with the colour and opacity component values of each of said objects.
According to still another aspect of the present invention there is provided a computer program for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said program comprising:
code for generating at least one opacity channel having associated opacity component values;
code for compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image; and
code for compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel, said altered opacity channel thereby representing the opacity component values associated with said image remaining in said image following composition with said colour and opacity components of said at least one object.
According to still another aspect of the present invention there is provided a computer program for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said program comprising:
code for generating at least one opacity channel having associated opacity component values;
code for compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image;
code for compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel; and
code for utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with said colour and opacity component values of said at least one object.
According to still another aspect of the present invention there is provided a computer program for compositing a grouped plurality of graphical objects with an image, each said object and said image having associated colour and opacity component values, said program comprising:
code for generating at least one opacity channel having associated opacity component values;
code for compositing the colour and opacity component values of each of said objects with the colour and opacity component values of said image;
code for compositing said opacity component values of each of said objects with that of said at least one opacity channel to produce an altered opacity channel; and
code for utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with the colour and opacity component values of each of said objects.
According to still another aspect of the present invention there is provided a computer program product having a computer readable medium having a computer program recorded therein for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said computer program product comprising:
computer program code means for generating at least one opacity channel having associated opacity component values;
computer program code means for compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image; and
computer program code means for compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel, said altered opacity channel thereby representing the opacity component values associated with said image remaining in said image following composition with said colour and opacity components of said at least one object.
According to still another aspect of the present invention there is provided a computer program product having a computer readable medium having a computer program recorded therein for compositing at least one graphical object with an image, said object and said image having associated colour and opacity component values, said computer program product comprising:
computer program code means for generating at least one opacity channel having associated opacity component values;
computer program code means for compositing the colour and opacity component values of said at least one object with the colour and opacity component values of said image;
computer program code means for compositing said opacity component values of said at least one object with that of said at least one opacity channel to produce an altered opacity channel; and
computer program code means for utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with said colour and opacity component values of said at least one object.
According to still another aspect of the present invention there is provided a computer program product having a computer readable medium having a computer program recorded therein for compositing a grouped plurality of graphical objects with an image, each said object and said image having associated colour and opacity component values, said computer program product comprising:
computer program code means for generating at least one opacity channel having associated opacity component values;
computer program code means for compositing the colour and opacity component values of each of said objects with the colour and opacity component values of said image;
computer program code means for compositing said opacity component values of each of said objects with that of said at least one opacity channel to produce an altered opacity channel; and
computer program code means for utilising said altered opacity channel to remove the colour and opacity component values of said image remaining in said image following composition with the colour and opacity component values of each of said objects.
Other aspects of the invention are also disclosed.
BRIEF DESCRIPTION OF THE DRAWINGS
Some aspects of the prior art and one or more embodiments of the present invention are described with reference to the drawings and appendices, in which:
FIG. 1( a) shows an image resulting from the rendering of three objects;
FIG. 1( b) shows the operations involved during the rendering of the objects of FIG. 1( a);
FIG. 2( a) shows a partially opaque object composited onto a completely opaque object;
FIG. 2( b) shows the operations involved during the rendering of the objects of FIG. 2( a);
FIG. 3( a) shows an object composited onto another object;
FIG. 3( b) shows the operations involved during the rendering of the objects of FIG. 3( a);
FIG. 4( a) shows an image resulting from the rendering of three objects;
FIG. 4( b) shows the operations involved during the rendering of the objects of FIG. 4( a);
FIG. 5( a) shows the operations involved during the rendering of three objects;
FIG. 5( b) shows an image resulting from the operations of FIG. 5( a);
FIG. 6( a) shows the operations involved during the rendering of three objects;
FIG. 6( b) shows an image resulting from the operations of FIG. 6( a);
FIG. 7 is a flow diagram showing a method of compositing a group of graphical objects in accordance with one embodiment;
FIG. 8( a) shows a green object that is partially transparent;
FIG. 8( b) shows a group object;
FIG. 9 shows a clear background image upon which the objects of FIGS. 8( a) and 8(b) will be composited, in accordance with an example;
FIG. 10 shows the object of FIG. 8( a) composited onto the background image of FIG. 9;
FIG. 11 shows a duplicate of the background image of FIG. 10 and an extra opacity channel;
FIG. 12 shows an updated duplicate background image and an updated alpha channel following the compositing of an object of FIG. 8( b) onto the duplicate background image of FIG. 11;
FIG. 13 shows a further updated duplicate background image and an updated alpha channel following the compositing of another object of FIG. 8( b) onto the duplicate background image of FIG. 12;
FIG. 14 shows a further updated duplicate background image following background colour removal;
FIG. 15 shows a further updated duplicate background image following the application of group opacity to the updated duplicate background image of FIG. 14;
FIG. 16 shows an updated version of the original background image of FIG. 10 following the compositing of the group object of FIG. 8( b);
FIG. 17 is a schematic block diagram of a general purpose computer upon which arrangements described can be practiced; and
FIG. 18 shows the result of each of the above compositing operators together with a variety of other conventional compositing operators.
DETAILED DESCRIPTION INCLUDING BEST MODE
It is to be noted that the discussions contained in the “Background” section and that above relating to prior art arrangements relate to documents or devices which form public knowledge through their respective publication and/or use. Such discussions should not be interpreted as a representation by the present inventor(s) or the patent applicant that such documents or devices in any way form part of the common general knowledge in the art.
A method 700 (see FIG. 7) of compositing a group of graphical objects onto a background image, in accordance with an embodiment of the present invention, is described below with reference to FIGS. 7 to 18. The method 700 allows a grouped plurality of objects to be composited onto a background image to produce the same result as if each of the objects had been rendered separately onto the background image. In particular, the method 700 allows such a composition to be performed when an operation is to be applied to a grouped plurality of objects as a whole.
The principles of the method 700 described herein have general applicability to any rendering system that accepts input graphical objects and generates a final image. The final image may be represented in any suitable format (e.g. pixels or rendering commands). The input objects for such a rendering system can be generated using a graphical user interface, where a user can group a plurality of objects together in order to process the group of objects as a single object (or group object). Such a group object allows any operation (e.g. group opacity or input filtering) that can be performed on a single object to be performed on the group object. Each separate input object of such a rendering system includes an associated compositing operation used to composite the object onto a background image.
The method 700 is preferably practiced using a general-purpose computer system 1700, such as that shown in FIG. 17 wherein the processes of FIGS. 7 to 18 may be implemented as software, such as an application program executing within the computer system 1700. In particular, the steps of method 700 are effected by instructions in the software that are carried out by the computer. The instructions may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part performs the method 700 and a second part manages a user interface between the first part and the user. The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer from the computer readable medium, and then executed by the computer. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer preferably effects an advantageous apparatus for implementing the method 700.
The computer system 1700 is formed by a computer module 1701, input devices such as a keyboard 1702 and mouse 1703, output devices including a printer 1715, a display device 1714 and loudspeakers 1717. A Modulator-Demodulator (Modem) transceiver device 1716 is used by the computer module 1701 for communicating to and from a communications network 1720, for example connectable via a telephone line 1721 or other functional medium. The modem 1716 can be used to obtain access to the Internet, and other network systems, such as a Local Area Network (LAN) or a Wide Area Network (WAN), and may be incorporated into the computer module 1701 in some implementations.
The computer module 1701 typically includes at least one processor unit 1705 and a memory unit 1706, for example formed from semiconductor random access memory (RAM) and read only memory (ROM). The module 1701 also includes a number of input/output (I/O) interfaces including an audio-video interface 1707 that couples to the video display 1714 and loudspeakers 1717, an I/O interface 1713 for the keyboard 1702 and mouse 1703 and optionally a joystick (not illustrated), and an interface 1708 for the modem 1716 and printer 1715. In some implementations, the modem 1716 may be incorporated within the computer module 1701, for example within the interface 1708. A storage device 1709 is provided and typically includes a hard disk drive 1710 and a floppy disk drive 1711. A magnetic tape drive (not illustrated) may also be used. A CD-ROM drive 1712 is typically provided as a non-volatile source of data. The components 1705 to 1713 of the computer module 1701 typically communicate via an interconnected bus 1704, in a manner which results in a conventional mode of operation of the computer system 1700 known to those in the relevant art. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun SPARCstations, or like computer systems evolved therefrom.
Typically, the application program is resident on the hard disk drive 1710 and read and controlled in its execution by the processor 1705. Intermediate storage of the program and any data fetched from the network 1720 may be accomplished using the semiconductor memory 1706, possibly in concert with the hard disk drive 1710. In some instances, the application program may be supplied to the user encoded on a CD-ROM or floppy disk and read via the corresponding drive 1712 or 1711, or alternatively may be read by the user from the network 1720 via the modem device 1716. Still further, the software can also be loaded into the computer system 1700 from other computer readable media. The term “computer readable medium” as used herein refers to any storage or transmission medium that participates in providing instructions and/or data to the computer system 1700 for execution and/or processing. Examples of storage media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 1701. Examples of transmission media include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
The method 700 of compositing a group of graphical objects may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub functions of the method 700. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.
For ease of explanation, compositing operations are generally described below in terms of premultiplied alpha (i.e., opacity), unless otherwise indicated. Many graphical rendering systems store pixel colour component values as pre-multiplied alpha values in order to reduce the complexity of compositing operations. For example, in a Red, Green, Blue, Alpha (RGBA) colour environment, to represent a value of 50% opaque Red as pre-multiplied RGBA, the colour component values stored are R=0.5 (i.e., 100% Red×50% opacity), G=0, B=0, A=0.5.
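As an illustration of the premultiplied representation described above, the following is a minimal sketch of the conversion in both directions (the function names are illustrative, not part of the described method):

```python
# Sketch: converting between straight (non-premultiplied) and premultiplied
# RGBA. Premultiplied storage keeps each colour component scaled by its
# opacity, simplifying later compositing arithmetic.

def premultiply(r, g, b, a):
    """Scale each colour component by the opacity before storage."""
    return (r * a, g * a, b * a, a)

def unpremultiply(ra, ga, ba, a):
    """Recover straight colour; the fully transparent case is undefined,
    so return zeros there by convention."""
    if a == 0:
        return (0.0, 0.0, 0.0, 0.0)
    return (ra / a, ga / a, ba / a, a)

# 100% Red at 50% opacity is stored as R=0.5, G=0, B=0, A=0.5
stored = premultiply(1.0, 0.0, 0.0, 0.5)
```

Here `premultiply(1.0, 0.0, 0.0, 0.5)` yields exactly the stored values given in the text for 50% opaque Red.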
Table 1 below shows the definition of terms used throughout the following description to represent compositing operations. As shown in Table 1, the term ‘S’ below is used to refer to a colour component value stored in a source buffer. The term ‘D’ below is used to refer to a colour component value stored in a destination buffer.
TABLE 1
Term           Definition
Sc             Non-premultiplied source colour component.
Sca            Premultiplied source colour component.
Sra, Sga, Sba  Premultiplied source colour components for Red, Green and Blue, respectively.
Sa             Source opacity component.
Dc             Non-premultiplied destination colour component.
Dca            Premultiplied destination colour component.
Dra, Dga, Dba  Premultiplied destination colour components for Red, Green and Blue, respectively.
Da             Destination opacity component.
Da(d)          Destination opacity component for a group buffer, containing the opacity of the background channel in the group buffer.
D&lt;n&gt;           Destination buffer &lt;n&gt;, where n = 0 represents the background, n = 1 represents groups in a top level object, and n = 2 represents nested groups.
D′             The result of a destination buffer following a compositing step (i.e., the updated buffer).
The following equations (1) and (2) are used to determine the result of compositing source pixel values with destination pixel values. A source pixel value may be associated with either a single object or a group of objects represented as a single object. For each pixel, a value is determined using equations (1) and (2) for each colour component (e.g. R, G, B) and for the alpha channel. Depending on the compositing operation to be performed, equations (1) and (2) are resolved in terms of premultiplied alpha values prior to rendering.
Dca′ = X · f(Sc, Dc) · Sa · Da + Y · Sca · (1 − Da) + Z · Dca · (1 − Sa)  (1)
Da′ = X · Sa · Da + Y · Sa · (1 − Da) + Z · Da · (1 − Sa)  (2)
where:
  • X: represents the intersection of the opacity of a source pixel value and a destination pixel value;
  • Y: represents the intersection of the source pixel value and the inverse of the destination pixel value;
  • Z: represents the intersection of the inverse of the source pixel value and the destination pixel value; and
  • f( ): represents a function of the source and destination colour component values as defined by the particular compositing operator used to perform the operation (see Table 2 below).
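Equations (1) and (2) can be expressed directly in code. The following is a sketch under the assumption of scalar per-channel values; the function name and the divide-by-zero guard are illustrative, not taken from the patent:

```python
# A generic per-pixel Porter-Duff step, parameterised by X, Y, Z and
# f(Sc, Dc), following equations (1) and (2). Sca/Dca are premultiplied;
# f receives straight (non-premultiplied) colours.

def composite(Sca, Sa, Dca, Da, X, Y, Z, f):
    # Recover straight colours for f; guard the fully transparent case.
    Sc = Sca / Sa if Sa else 0.0
    Dc = Dca / Da if Da else 0.0
    # Equation (1): new premultiplied destination colour.
    Dca_new = X * f(Sc, Dc) * Sa * Da + Y * Sca * (1 - Da) + Z * Dca * (1 - Sa)
    # Equation (2): new destination opacity.
    Da_new = X * Sa * Da + Y * Sa * (1 - Da) + Z * Da * (1 - Sa)
    return Dca_new, Da_new
```

For example, the OVER operator corresponds to X = Y = Z = 1 with f(Sc, Dc) = Sc.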
Table 2 defines some compositing operations in terms of X, Y, Z and f( ). A person skilled in the relevant art would appreciate that other compositing operations known in the relevant art can be defined in a similar fashion. Table 2 also includes a column, Colour (Dca′), indicating an equation for determining the destination colour component value for a particular compositing operation (e.g. source pixel OVER destination pixel). Table 2 also includes a column, Da′, showing an equation for determining the destination opacity component value, and a column Da(d)′ showing an equation for determining the destination opacity component value for a group buffer containing the opacity of the background channel in the group buffer.
TABLE 2
Operation  f(Sc, Dc)  X, Y, Z  Colour (Dca′)  Opacity (Da′)  Da(d)′

Clear:     f(Sc, Dc) = 0;  X = 0, Y = 0, Z = 0
           Dca′ = 0;  Da′ = 0;  Da(d)′ = 0

Src:       f(Sc, Dc) = Sc;  X = 1, Y = 1, Z = 0
           Dca′ = Sc·Sa·Da + Sca·(1 − Da) = Sca
           Da′ = Sa·Da + Sa·(1 − Da) = Sa
           Da(d)′ = 0

Dst:       f(Sc, Dc) = Dc;  X = 1, Y = 0, Z = 1
           Dca′ = Dc·Sa·Da + Dca·(1 − Sa) = Dca
           Da′ = Sa·Da + Da·(1 − Sa) = Da
           Da(d)′ = Da(d)·(1 − Sa)

OVER:      f(Sc, Dc) = Sc;  X = 1, Y = 1, Z = 1
           Dca′ = Sc·Sa·Da + Sca·(1 − Da) + Dca·(1 − Sa) = Sca + Dca·(1 − Sa)
           Da′ = Sa·Da + Sa·(1 − Da) + Da·(1 − Sa) = Sa + Da − Sa·Da
           Da(d)′ = Da(d)·(1 − Sa)

ROVER:     f(Sc, Dc) = Dc;  X = 1, Y = 1, Z = 1
           Dca′ = Dc·Sa·Da + Sca·(1 − Da) + Dca·(1 − Sa) = Dca + Sca·(1 − Da)
           Da′ = Sa·Da + Sa·(1 − Da) + Da·(1 − Sa) = Sa + Da − Sa·Da
           Da(d)′ = Da(d)·(1 − Sa)

IN:        f(Sc, Dc) = Sc;  X = 1, Y = 0, Z = 0
           Dca′ = Sc·Sa·Da;  Da′ = Sa·Da;  Da(d)′ = 0

RIN:       f(Sc, Dc) = Dc;  X = 1, Y = 0, Z = 0
           Dca′ = Dc·Sa·Da;  Da′ = Sa·Da;  Da(d)′ = 0

OUT:       f(Sc, Dc) = 0;  X = 0, Y = 1, Z = 0
           Dca′ = Sca·(1 − Da);  Da′ = Sa·(1 − Da);  Da(d)′ = 0

ROUT:      f(Sc, Dc) = 0;  X = 0, Y = 0, Z = 1
           Dca′ = Dca·(1 − Sa);  Da′ = Da·(1 − Sa);  Da(d)′ = Da(d)·(1 − Sa)

ATOP:      f(Sc, Dc) = Sc;  X = 1, Y = 0, Z = 1
           Dca′ = Sc·Sa·Da + Dca·(1 − Sa) = Sca·Da + Dca·(1 − Sa)
           Da′ = Sa·Da + Da·(1 − Sa) = Da
           Da(d)′ = Da(d)·(1 − Sa)

RATOP:     f(Sc, Dc) = Dc;  X = 1, Y = 1, Z = 0
           Dca′ = Dc·Sa·Da + Sca·(1 − Da) = Dca·Sa + Sca·(1 − Da)
           Da′ = Sa·Da + Sa·(1 − Da) = Sa
           Da(d)′ = 0

XOR:       f(Sc, Dc) = 0;  X = 0, Y = 1, Z = 1
           Dca′ = Sca·(1 − Da) + Dca·(1 − Sa)
           Da′ = Sa·(1 − Da) + Da·(1 − Sa) = Sa + Da − 2·Sa·Da
           Da(d)′ = Da(d)·(1 − Sa)

Plus:      f(Sc, Dc) = Sc + Dc;  X = 1, Y = 1, Z = 1
           Dca′ = (Sc + Dc)·Sa·Da + Sca·(1 − Da) + Dca·(1 − Sa) = Sca + Dca
           Da′ = Sa·Da + Sa·(1 − Da) + Da·(1 − Sa) = Sa + Da − Sa·Da
           Da(d)′ = Da(d)·(1 − Sa)

Multiply:  f(Sc, Dc) = Sc·Dc;  X = 1, Y = 1, Z = 1
           Dca′ = Sc·Dc·Sa·Da + Sca·(1 − Da) + Dca·(1 − Sa) = Sca·Dca + Sca·(1 − Da) + Dca·(1 − Sa)
           Da′ = Sa·Da + Sa·(1 − Da) + Da·(1 − Sa) = Sa + Da − Sa·Da
           Da(d)′ = Da(d)·(1 − Sa)
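A few rows of Table 2 can be checked numerically against their simplified Colour and Opacity columns; the following sketch (dictionary layout and names are illustrative) instantiates the X, Y, Z and f parameters and applies the generic equations (1) and (2):

```python
# Sketch: selected rows of Table 2 as (f, X, Y, Z) parameters, applied via
# the generic Porter-Duff equations (1) and (2).

OPS = {
    # name: (f(Sc, Dc), X, Y, Z)
    "Clear": (lambda Sc, Dc: 0.0,     0, 0, 0),
    "Src":   (lambda Sc, Dc: Sc,      1, 1, 0),
    "OVER":  (lambda Sc, Dc: Sc,      1, 1, 1),
    "XOR":   (lambda Sc, Dc: 0.0,     0, 1, 1),
    "Plus":  (lambda Sc, Dc: Sc + Dc, 1, 1, 1),
}

def apply_op(name, Sca, Sa, Dca, Da):
    f, X, Y, Z = OPS[name]
    Sc = Sca / Sa if Sa else 0.0   # straight colours for f
    Dc = Dca / Da if Da else 0.0
    Dca_new = X * f(Sc, Dc) * Sa * Da + Y * Sca * (1 - Da) + Z * Dca * (1 - Sa)
    Da_new = X * Sa * Da + Y * Sa * (1 - Da) + Z * Da * (1 - Sa)
    return Dca_new, Da_new
```

For any inputs, the OVER result agrees with the simplified column Sca + Dca·(1 − Sa), and the XOR opacity agrees with Sa + Da − 2·Sa·Da.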
The method 700 of compositing a group of graphical objects (hereinafter ‘the group object’) onto a background image, will now be described, with reference to FIG. 7. The method 700 will be further described below with reference to an example shown in FIGS. 8( a) to 16.
Prior to executing the method 700, a background image (D0) is generated by compositing one or more graphical objects. The background image (D0) includes pixel value background colour components (e.g. RGB colour components) and opacity (i.e., an alpha channel). Similarly, each of the objects in the group object includes colour components and opacity.
The method 700 is preferably implemented as software resident in memory 1706 and controlled in its execution by the processor 1705. The method 700 preferably uses bitmap rendering, where each object is completely drawn into a destination bitmap one at a time. Alternatively, other rendering techniques can be utilised in accordance with the described embodiments, including scan line rendering (i.e., complete single scan lines are generated one at a time) or pixel rendering (i.e., the final value of each pixel is generated one at a time).
The method begins at step 701, where the processor 1705 duplicates the background image (D0) to produce a duplicate background image (D1). Also at step 701, the processor 1705 stores the original background image (D0) in memory 1706 for later compositing with the group object. In the method 700, the duplicate background image (D1) is used for generating each of the objects in the group object. Once the group object has been generated in (D1), the group object is composited back onto the background image (D0).
The method 700 continues at the next step 702, where the processor 1705 generates an alpha channel (Da(d)) for storing the opacity values of the background image (D0) remaining in the duplicate background image (D1) during the operation of compositing the group object onto the duplicate background image (D1). The duplicate background image (D1) initially contains identical values to the background image (D0). Further, the alpha channel (Da(d)) is initially set to fully opaque.
The following steps 703, 704, 705 and 706 of the method 700 composite each of the objects of the group object into the duplicate background image (D1) and the alpha channel (Da(d)). At step 703, if the processor 1705 determines that each of the objects of the group object has been rendered into the duplicate background image (D1), then the method 700 proceeds to step 707. Otherwise, the method 700 proceeds to step 704, where the colour component values associated with a current object of the group object are composited into the duplicate background image (D1). The colour components of the current object are composited at step 704 using an equation shown in the colour column, Dca′, of Table 2, depending on the compositing operation being performed.
At the next step 705, the processor 1705 composites the alpha channel of the current object of the group object onto an alpha channel (not shown) associated with the duplicate of the background image (D1). The compositing of the alpha channel of the current object of the group object in step 705 is performed using an equation shown in the opacity column (Da′) of Table 2, depending on the compositing operation being used.
The method 700 continues at the next step 706, where the processor 1705 composites the alpha channel of the current object of the group object onto the alpha channel, Da(d), generated at step 702. The compositing of the alpha channel in step 706 is performed using an equation shown in the Da(d)′ column of Table 2, depending on the compositing operation being used. For example, for the compositing operation “ATOP”, the function f(Sc, Dc) = Sc, X = 1, Y = 0 and Z = 1, as seen in Table 2. Therefore, Dca1′ = Sca · Da1 + Dca1 · (1 − Sa), Da1′ = Da1 and Da1(d)′ = Da1(d) · (1 − Sa). Following step 706, the method 700 returns to step 703.
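Steps 703 to 706 amount to a loop over the objects of the group, updating the duplicate background and the extra channel per object. The following is a sketch for a single colour channel, using the ATOP row worked above (function and variable names are illustrative, not the patent's code):

```python
# Sketch of steps 703-706: each object's colour and alpha are composited
# into the duplicate background D1, and its alpha into the extra channel
# Da1(d). Each operator maps (Sca, Sa, Dca, Da, Dad) to updated values.

def atop(Sca, Sa, Dca, Da, Dad):
    # ATOP row of Table 2: Dca' = Sca·Da + Dca·(1 − Sa), Da' = Da,
    # Da(d)' = Da(d)·(1 − Sa)
    return Sca * Da + Dca * (1 - Sa), Da, Dad * (1 - Sa)

def render_group(objects, Dca, Da, Dad=1.0):
    """objects: list of (Sca, Sa, op) tuples for one colour channel."""
    for Sca, Sa, op in objects:
        Dca, Da, Dad = op(Sca, Sa, Dca, Da, Dad)
    return Dca, Da, Dad
```

Compositing one 50%-opaque object ATOP a destination with Dca = 0.2 and Da = 0.6 leaves the destination opacity unchanged while attenuating Da(d), as the ATOP row predicts.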
Once each of the objects of the group object has been rendered into the duplicate background image (D1), the method 700 proceeds to step 707. At step 707, if the group object has an associated group opacity, a filter effect or any other operation acting on the group object as a whole, then the method 700 proceeds to step 708. Otherwise, the method 700 proceeds to step 712, where the processor 1705 replaces the background image (D0) with the duplicate background image (D1) including the group object, and the method 700 concludes.
At step 708, the processor 1705 removes any of the background colour left in the duplicate background image (D1) from the duplicate background image (D1) prior to applying the group opacity or effect, for example, to the group object rendered in the duplicate background image (D1). In order to execute step 708, the processor 1705 determines the updated premultiplied colour component value (Dca1′), and the updated destination opacity component value (Da1′), for each pixel of the duplicate background image (D1). The processor 1705 determines the values (Dca1′) and (Da1′) in accordance with the following equations:
Dca1′ = Dca1 − Dca0 · Da1(d)  (3)
Da1′ = Da1 − Da0 · Da1(d)  (4)
The method 700 continues at the next step 709, where the processor 1705 inverts (Da1(d)). The result of step 709 represents the amount of the original background image (D0) removed from the duplicate background image (D1) during the execution of steps 703 to 706:

Da1(d)′ = 1 − Da1(d)  (5)
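Equations (3) to (5) can be sketched as a small per-channel function (the name is illustrative):

```python
# Sketch of steps 708 and 709 (equations (3)-(5)): remove the remaining
# background colour and opacity from the duplicate buffer D1 using the
# extra channel Da1(d), then invert that channel.

def remove_background(Dca1, Da1, Dca0, Da0, Da1d):
    Dca1_new = Dca1 - Dca0 * Da1d   # equation (3)
    Da1_new = Da1 - Da0 * Da1d      # equation (4)
    Da1d_new = 1 - Da1d             # equation (5)
    return Dca1_new, Da1_new, Da1d_new
```

Applied to the worked example's green channel (Dca1 = 0.12, Da1 = 0.832, background Dca0 = Da0 = 0.6, Da1(d) = 0.06), this reproduces the values derived later in the text.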
At the next step 710, the processor 1705 applies the group effect (e.g. the group opacity or the filter effect) to the group object to produce a completed group object. The method 700 concludes at the next step 711, where the processor 1705 composites the completed group object onto the original background image (D0). The colour and opacity component values of the original background image (D0) are calculated at step 711 in accordance with the following equations (6) and (7):
Dca0′ = X · f(Dc1, Dc0) · Da1 · Da0 + Y · Dca1 · (1 − Da0) + Z · Dca0 · (1 − Da1(d))  (6)
Da0′ = X · Da1 · Da0 + Y · Da1 · (1 − Da0) + Z · Da0 · (1 − Da1(d))  (7)
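Specialising equations (6) and (7) to the OVER operator (X = Y = Z = 1, f returning the source colour), the X and Y terms collapse so the group's premultiplied values pass through unchanged, while the Z term uses Da1(d) in place of the usual source alpha. A sketch (function name illustrative):

```python
# Sketch of step 711, equations (6)-(7), for the OVER operator: the
# completed group in D1 is composited back onto the original background D0,
# with the extra channel Da1(d) replacing Sa in the Z term.

def composite_group_over(Dca1, Da1, Da1d, Dca0, Da0):
    # For OVER, X·f·Da1·Da0 + Y·Dca1·(1 − Da0) simplifies to Dca1.
    Dca0_new = Dca1 + Dca0 * (1 - Da1d)
    Da0_new = Da1 + Da0 * (1 - Da1d)
    return Dca0_new, Da0_new
```

Applied to the worked example's green channel after group opacity (Dca1 = 0.042, Da1 = 0.398, Da1(d) = 0.47, background 0.6), this reproduces the final values derived later in the text.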
The method 700 will now be further described below by way of an example shown in FIGS. 8( a) to 16.
FIG. 8( a) shows a 100% green object 802 that is 60% opaque. The colour and opacity component values of the green object 802 are therefore, R=0, G=0.6, B=0 and A=0.6. FIG. 8( b) shows a group object 805. The group object 805 contains two objects 803 and 804. The first object 803 is red and is 80% opaque. The colour and opacity component values of the red object 803 are therefore, R=0.8, G=0, B=0 and A=0.8. The “XOR” compositing operation is associated with the red object 803. As such, the red object 803 will be composited onto any background image using the XOR compositing operation.
The second object 804, of the group object 805, is purple and is 70% opaque. The colour and opacity component values of the purple object 804 are therefore, R=0.7, G=0, B=0.7 and A=0.7. The “ROVER” compositing operation is associated with the object 804. As such, the purple object 804 will be composited onto any background image using the ROVER compositing operation. The group object 805 is 50% opaque and the “OVER” compositing operation is associated with the group object. As such, the group object 805 will be composited onto any background image using the OVER compositing operation.
FIG. 9 shows a clear (i.e., Dca0 = 0 and Da0 = 0) background image (D0) 901, upon which the object 802 and the group object 805 will be composited, using bitmap rendering, in accordance with the example. Again, other rendering techniques, such as scan line rendering or pixel rendering, can be utilised in accordance with the example.
In the example, the object 802 is initially composited onto the background image (D0), resulting in an updated background image (D0′) 1001, as seen in FIG. 10. The object 802 is composited onto the background image (D0) using the compositing operation (i.e., OVER) associated with the object 802.
As at steps 701 and 702 of the method 700, the background image (D0′) 1001 is duplicated to produce a duplicate background image (D1) 1101, as seen in FIG. 11. The background image (D0′) 1001 can be stored in memory 1706 for later compositing with the group object 805. As at step 702, an alpha channel (Da(d)) 1103, as shown in FIG. 11, is also generated for storing the opacity values of the background image (D0) 901 remaining in the duplicate background image (D1) 1101, following the compositing of the group object (i.e., objects 803 and 804) onto the duplicate background image (D1) 1101. The alpha channel Da(d) is initially set to one, representing a fully opaque object (i.e., Da(d) = 1). For the region 1104 formed by the object 802, as seen in FIG. 11, the updated destination values following the creation of the duplicate background image (D1) 1101 and the alpha channel (Da(d)) 1103 are listed below:
(i) Dra1′ = Dra0 = 0
(ii) Dga1′ = Dga0 = 0.6
(iii) Dba1′ = Dba0 = 0
(iv) Da1′ = Da0 = 0.6
(v) Da1(d)′ = 1
FIG. 12 shows an updated duplicate background image (D1′) 1201 following the compositing of the object 803 onto the duplicate background image (D1) 1101 including the object 802, as at steps 703 to 706 of the method 700. FIG. 12 also shows the updated alpha channel (Da(d)′) 1203 following the compositing of the object 803. The object 803 is composited onto the duplicate background image (D1) 1101 using the XOR compositing operation. As such, from Table 2, f(Sc,Dc)=0, X=0, Y=1 and Z=1. Further, the equations for colour Dca1′, opacity Da1′ and Da1(d)′ for the XOR compositing operation are listed below:
Dca1′ = Sca · (1 − Da1) + Dca1 · (1 − Sa)  (i)
Da1′ = Sa · (1 − Da1) + Da1 · (1 − Sa)  (ii)
Da1(d)′ = Da1(d) · (1 − Sa)  (iii)
The updated destination values for the region of overlap 1204 of the objects 802 and 803, as seen in FIG. 12, following the compositing of the object 803 onto the duplicate background image (D1) 1101 are listed below:
(i) Dra1′ = Sra · (1 − Da1) + Dra1 · (1 − Sa) = 0.8 · (1 − 0.6) + 0 · (1 − 0.8) = 0.32
(ii) Dga1′ = Sga · (1 − Da1) + Dga1 · (1 − Sa) = 0 · (1 − 0.6) + 0.6 · (1 − 0.8) = 0.12
(iii) Dba1′ = Sba · (1 − Da1) + Dba1 · (1 − Sa) = 0 · (1 − 0.6) + 0 · (1 − 0.8) = 0
(iv) Da1′ = Sa · (1 − Da1) + Da1 · (1 − Sa) = 0.8 · (1 − 0.6) + 0.6 · (1 − 0.8) = 0.44
(v) Da1(d)′ = Da1(d) · (1 − Sa) = 1 · (1 − 0.8) = 0.2
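The XOR arithmetic above can be checked directly; a small sketch with the example's values (variable names illustrative):

```python
# Check of the XOR compositing step: object 803 (Sra = Sa = 0.8) onto the
# duplicate background, where the green object 802 gives Dga1 = Da1 = 0.6.
Sra, Sga, Sba, Sa = 0.8, 0.0, 0.0, 0.8
Dra1, Dga1, Dba1, Da1, Da1d = 0.0, 0.6, 0.0, 0.6, 1.0

Dra1_new = Sra * (1 - Da1) + Dra1 * (1 - Sa)  # ~ 0.32
Dga1_new = Sga * (1 - Da1) + Dga1 * (1 - Sa)  # ~ 0.12
Dba1_new = Sba * (1 - Da1) + Dba1 * (1 - Sa)  # 0.0
Da1_new = Sa * (1 - Da1) + Da1 * (1 - Sa)     # ~ 0.44
Da1d_new = Da1d * (1 - Sa)                    # ~ 0.2
```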
FIG. 13 shows an updated duplicate background image (D1′) 1301 following the compositing of the object 804 onto the previously updated duplicate background image (D1′) 1201, as at steps 703 to 706 of the method 700. FIG. 13 also shows the updated alpha channel (Da(d)′) 1303 following the compositing of the object 804. The object 804 is composited onto the previously updated background image (D1′) 1201 using the ROVER compositing operation. As such, from Table 2, f(Sc,Dc)=Dc, X=1, Y=1 and Z=1. Further, the equations for colour Dca1′, opacity Da1′ and Da1(d)′ for the ROVER compositing operation are listed below:
(i) Dca1′ = Dca1 · Sa + Sca · (1 − Da1) + Dca1 · (1 − Sa) = Dca1 + Sca · (1 − Da1)
(ii) Da1′ = Da1 · Sa + Sa · (1 − Da1) + Da1 · (1 − Sa) = Da1 + Sa · (1 − Da1)
(iii) Da1(d)′ = Da1(d) · (1 − Sa)
The updated destination values for the region of overlap 1304 of the objects 802, 803 and 804, as seen in FIG. 13, following the compositing of the object 804 onto the previously updated duplicate background image (D1′) 1201 are listed below:
(i) Dra1′ = Dra1 + Sra · (1 − Da1) = 0.32 + 0.7 · (1 − 0.44) = 0.712
(ii) Dga1′ = Dga1 + Sga · (1 − Da1) = 0.12 + 0 · (1 − 0.44) = 0.12
(iii) Dba1′ = Dba1 + Sba · (1 − Da1) = 0 + 0.7 · (1 − 0.44) = 0.392
(iv) Da1′ = Da1 + Sa · (1 − Da1) = 0.44 + 0.7 · (1 − 0.44) = 0.832
(v) Da1(d)′ = Da1(d) · (1 − Sa) = 0.2 · (1 − 0.7) = 0.06
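The ROVER arithmetic above can likewise be checked; a small sketch with the example's values (variable names illustrative):

```python
# Check of the ROVER compositing step: object 804 (Sra = Sba = Sa = 0.7)
# onto the buffer left by the XOR step (Dra1 = 0.32, Dga1 = 0.12, Da1 = 0.44).
Sra, Sga, Sba, Sa = 0.7, 0.0, 0.7, 0.7
Dra1, Dga1, Dba1, Da1, Da1d = 0.32, 0.12, 0.0, 0.44, 0.2

Dra1_new = Dra1 + Sra * (1 - Da1)  # ~ 0.712
Dga1_new = Dga1 + Sga * (1 - Da1)  # ~ 0.12
Dba1_new = Dba1 + Sba * (1 - Da1)  # ~ 0.392
Da1_new = Da1 + Sa * (1 - Da1)     # ~ 0.832
Da1d_new = Da1d * (1 - Sa)         # ~ 0.06
```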
FIG. 14 shows an updated duplicate background image (D1′) 1401 following the removal of any remaining colour attributed to the original background image from the previously updated duplicate background image (D1′) 1301, as at step 708, and the inversion of Da1(d), as at step 709 of the method 700. As described above, in order to produce the updated duplicate background image (D1′) 1401, the processor 1705 determines the updated premultiplied colour component value (Dca1′) and the updated destination opacity component value (Da1′) for each pixel of the updated duplicate background image (D1′) 1401. FIG. 14 also shows the updated alpha channel (Da(d)′) 1402 following the inversion step 709. The equations for colour Dca1′, opacity Da1′ and Da1(d)′ for the operations of steps 708 and 709 are listed as follows:
(i) Dca1′ = Dca1 − Dca0 · Da1(d)
(ii) Da1′ = Da1 − Da0 · Da1(d)
(iii) Da1(d)′ = 1 − Da1(d)
The updated destination values for the region of overlap 1403, as seen in FIG. 14, following the operation of steps 708 and 709 on the previously updated duplicate background image (D1′) 1301 are listed below:
(i) Dra1′ = Dra1 − Dra0 · Da1(d) = 0.712 − 0 · 0.06 = 0.712
(ii) Dga1′ = Dga1 − Dga0 · Da1(d) = 0.12 − 0.6 · 0.06 = 0.084
(iii) Dba1′ = Dba1 − Dba0 · Da1(d) = 0.392 − 0 · 0.06 = 0.392
(iv) Da1′ = Da1 − Da0 · Da1(d) = 0.832 − 0.6 · 0.06 = 0.796
(v) Da1(d)′ = 1 − Da1(d) = 1 − 0.06 = 0.94
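The background-removal arithmetic above can be checked in the same way; a small sketch with the example's values (variable names illustrative):

```python
# Check of the background-removal step (equations (3)-(5)): the green
# background contributes Dga0 = Da0 = 0.6, and Da1(d) = 0.06 of it remains.
Dra0, Dga0, Dba0, Da0 = 0.0, 0.6, 0.0, 0.6
Dra1, Dga1, Dba1, Da1, Da1d = 0.712, 0.12, 0.392, 0.832, 0.06

Dra1_new = Dra1 - Dra0 * Da1d  # ~ 0.712
Dga1_new = Dga1 - Dga0 * Da1d  # ~ 0.084
Dba1_new = Dba1 - Dba0 * Da1d  # ~ 0.392
Da1_new = Da1 - Da0 * Da1d     # ~ 0.796
Da1d_new = 1 - Da1d            # ~ 0.94
```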
FIG. 15 shows an updated duplicate background image (D1′) 1501 following the application of group opacity to the previously updated duplicate background image (D1′) 1401 and the previously updated alpha channel (Da(d)′) 1402, as at step 710 of the method 700. The equations for colour Dca1′, opacity Da1′, and Da1(d)′ for the operation of step 710 are listed as follows:
(i) Dca1′ = Dca1 · 0.5
(ii) Da1′ = Da1 · 0.5
(iii) Da1(d)′ = Da1(d) · 0.5
The updated destination values for the region of overlap 1503, as seen in FIG. 15, following the operation of step 710 on the previously updated duplicate background image (D1′) 1401 and opacity (Da(d)′) 1402 are listed below:
(i) Dra1′ = 0.712 · 0.5 = 0.356
(ii) Dga1′ = 0.084 · 0.5 = 0.042
(iii) Dba1′ = 0.392 · 0.5 = 0.196
(iv) Da1′ = 0.796 · 0.5 = 0.398
(v) Da1(d)′ = 0.94 · 0.5 = 0.47
FIG. 16 shows an updated original background image (D0′) 1601 following the compositing of the group object 805 of the previously updated duplicate background image (D1′) 1501 of FIG. 15, onto the background image (D0′) 1001, as at step 711 of the method 700. The group object 805 is composited onto the background image (D0′) 1001 using the OVER compositing operation. As such, from Table 2, f(Sc, Dc) = Sc, X = 1, Y = 1 and Z = 1. Further, the equations for colour Dca1′ and opacity Da1′ for the OVER compositing operation are listed below:
(i) Dca1′ = Sca · Da1 + Sca · (1 − Da1) + Dca1 · (1 − Da1(d)) = Sca + Dca1 · (1 − Da1(d))
(ii) Da1′ = Sa · Da1 + Sa · (1 − Da1) + Da1 · (1 − Da1(d)) = Sa + Da1 · (1 − Da1(d))
The updated destination values for the region of overlap 1602, as seen in FIG. 16, following the operation of step 711 on the background image (D0′) 1001 are listed below:
(i) Dra1′ = Sra + Dra1 · (1 − Da1(d)) = 0.356 + 0 · (1 − 0.47) = 0.356
(ii) Dga1′ = Sga + Dga1 · (1 − Da1(d)) = 0.042 + 0.6 · (1 − 0.47) = 0.36
(iii) Dba1′ = Sba + Dba1 · (1 − Da1(d)) = 0.196 + 0 · (1 − 0.47) = 0.196
(iv) Da1′ = Sa + Da1 · (1 − Da1(d)) = 0.398 + 0.6 · (1 − 0.47) = 0.716
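The whole example can be replayed end to end; the following sketch runs the XOR, ROVER, background-removal, group-opacity and final OVER steps in sequence on (R, G, B, A) premultiplied tuples (all structure and names here are illustrative, not the patent's code):

```python
# End-to-end check of the example of FIGS. 8-16: object 803 (XOR), then
# object 804 (ROVER), background colour removal, 50% group opacity, and
# OVER back onto the green background, per steps 701-711.
green = (0.0, 0.6, 0.0, 0.6)    # D0' after the 60%-opaque green object 802
red = (0.8, 0.0, 0.0, 0.8)      # object 803, composited with XOR
purple = (0.7, 0.0, 0.7, 0.7)   # object 804, composited with ROVER

D1, Dd = list(green), 1.0       # duplicate background and extra channel

# XOR: c' = Sc·(1 − Da) + Dc·(1 − Sa); Dd' = Dd·(1 − Sa)
Sa, Da = red[3], D1[3]
D1 = [s * (1 - Da) + d * (1 - Sa) for s, d in zip(red, D1)]
Dd *= (1 - Sa)

# ROVER: c' = Dc + Sc·(1 − Da); Dd' = Dd·(1 − Sa)
Sa, Da = purple[3], D1[3]
D1 = [d + s * (1 - Da) for s, d in zip(purple, D1)]
Dd *= (1 - Sa)

# Steps 708-709: remove remaining background, then invert Dd
D1 = [d - g * Dd for d, g in zip(D1, green)]
Dd = 1 - Dd

# Step 710: 50% group opacity
D1 = [d * 0.5 for d in D1]
Dd *= 0.5

# Step 711: OVER back onto D0', with Dd in place of the source alpha
D0 = [d1 + d0 * (1 - Dd) for d1, d0 in zip(D1, green)]
```

The final buffer matches the values of region 1602 derived above: R = 0.356, G = 0.36, B = 0.196, A = 0.716.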
As described above, the principles of the method 700 have general applicability to any rendering system that accepts input graphical objects and generates a final image. The final image may be represented in any suitable format (e.g. pixels or rendering commands). The input objects for such a rendering system can be generated using a graphical user interface, where a user can group a plurality of objects together in order to process the group of objects as a single object (or group object). Such a group object allows any operation (e.g. group opacity or input filtering) that can be performed on a single object to be performed on the group object. Each separate input object of such a rendering system includes an associated compositing operation used to composite the object onto a background image.
As also described above, the method 700 utilises equations (1) and (2) to determine the result of compositing source pixel values with destination pixel values. A source pixel value may be associated with either a single input object or a group of objects represented as a single object. For each pixel, a value is determined using equations (1) and (2) for each colour component (e.g. R, G, B) and for the alpha channel. The source colour component values are stored in a source buffer and the destination colour component values are stored in a destination buffer.
In one implementation of the method 700, the source and destination buffers can be in the form of image bitmaps substantially identical in size and position. A plurality of such image bitmaps can be layered to form an “image file”, where each layer of the image file comprises bitmap information for each pixel in the layer. In such an implementation, the source colour component values ‘S’, as described above, may be accessed from one or more layers of a source image file and the destination colour component values ‘D’, as described above, resulting from the combination of the source colour component values may be output to a further image file or to an output device. The operation used to combine the source pixels from each of the layers of the source image file can be specified in the source image file. Such a source image file may also specify the grouping of image layers, where a subset of the source image file is first composited together to produce a resultant value, which is subsequently combined with the other layers of the source image file in accordance with the method 700 described above.
As an example of the implementation described directly above, vector graphics representations of input objects to a rendering system may be converted by the rendering system to pixel based objects in the form of an image file. The image file can then be included as an object in a further vector graphics file and then be processed in accordance with the method 700 by the rendering system and a resultant destination image file can be subsequently converted back to vector based graphics objects for rendering.
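Flattening the layers of such an image file can be sketched as folding the "over" operator across the layer stack. The layer values and the restriction to a single pixel and a single operator are illustrative assumptions, not part of the patent:

```python
from functools import reduce

def over(src, dst):
    """Premultiplied Porter-Duff 'over' (source on top of destination)."""
    k = 1.0 - src[3]
    return tuple(s + d * k for s, d in zip(src, dst))

# Hypothetical "image file": a bottom-to-top list of single-pixel layers,
# each a premultiplied (r, g, b, a) tuple.
layers = [
    (0.0, 0.0, 0.5, 0.5),    # bottom layer
    (0.25, 0.0, 0.0, 0.25),  # middle layer
    (0.0, 0.3, 0.0, 0.3),    # top layer
]

# Flatten by compositing each layer over the accumulated result.
flattened = reduce(lambda acc, layer: over(layer, acc), layers,
                   (0.0, 0.0, 0.0, 0.0))
```

A real image file would additionally name the operator for each layer and could mark a subset of layers as a group to be flattened first, as described above.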
The aforementioned preferred method(s) comprise a particular control flow. There are many other variants of the preferred method(s) which use different control flows without departing from the spirit or scope of the invention. Furthermore, one or more of the steps of the preferred method(s) may be performed in parallel rather than sequentially.
The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive. For example, various techniques can be used to perform the rendering operations in accordance with the embodiments described above. Where bitmap rendering is used, each object is completely drawn into the bitmap one at a time. Where a copy of a background image is made, or extra storage is needed to store a pixel, a buffer can be used. Alternatively, other rendering techniques can be utilised in accordance with the described embodiments including scan line rendering where complete single scan lines are generated, one at a time, or pixel rendering where the final value of each pixel is generated one at a time.
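The background-removal idea described above can be loosely illustrated at a single pixel: an additional opacity channel starts fully opaque, is multiplied by each object's transparency as the object is composited, and finally tells how much background colour to subtract from the composite. This is a simplified sketch restricted to the "over" operator, not the patent's full method:

```python
def over(src, dst):
    """Premultiplied Porter-Duff 'over' on (r, g, b, a) tuples."""
    k = 1.0 - src[3]
    return tuple(s + d * k for s, d in zip(src, dst))

background = (0.2, 0.2, 0.2, 1.0)  # premultiplied background pixel
objects = [(0.4, 0.0, 0.0, 0.4), (0.0, 0.0, 0.3, 0.3)]  # partially transparent

composite = background
bg_remaining = 1.0  # additional opacity channel, initially fully opaque
for obj in objects:
    composite = over(obj, composite)
    bg_remaining *= 1.0 - obj[3]  # attenuate by the object's opacity

# Remove the background contribution still present in the composite:
group_only = tuple(c - b * bg_remaining
                   for c, b in zip(composite, background))
```

With "over", `group_only` equals the result of compositing the objects over a fully transparent destination, which is what allows a group effect to be applied before the group is recombined with the background.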
In the context of this specification, the word “comprising” means “including principally but not necessarily solely” or “having” or “including”, and not “consisting only of”. Variations of the word “comprising”, such as “comprise” and “comprises” have correspondingly varied meanings.

Claims (24)

The claims defining the invention are as follows:
1. A computer-implemented method of representing an amount of image color in a composite image, said method comprising the steps of:
generating at least one additional opacity channel for use in creating the composite image, the additional opacity channel initially set to fully opaque to represent that all of the image color is visible;
compositing at least one partially transparent graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image, the composite image having composite image color and composite image opacity, the composite image color and composite image opacity being derived from one or more of the object color, the object opacity, the image color and the image opacity;
compositing the object opacity with the additional opacity channel to update the additional opacity channel, the updated additional opacity channel thereby being less than fully opaque and representing an amount of the image color remaining in the composite image following the compositing of the at least one graphical object with the image; and
storing at least the updated additional opacity channel in a computer-readable memory,
wherein the steps are performed on a processor.
2. A method according to claim 1, further comprising the step of utilizing the updated additional opacity channel to remove the image color and image opacity remaining in the composite image following composition with the object color and object opacity.
3. A method according to claim 2, further comprising the step of utilizing the updated additional opacity channel to composite the object color and object opacity with the image color and image opacity.
4. A method according to any one of claims 1 to 3, wherein the at least one object is one object of a grouped plurality of objects.
5. A method according to claim 4, further comprising the step of applying a group effect to the grouped plurality of objects.
6. A method according to claim 4, further comprising the step of compositing object color and object opacity of each object of the grouped plurality of objects with the image color and image opacity.
7. A method according to claim 1, further comprising the step of inverting the opacity values of the updated additional opacity channel.
8. A method according to claim 1, further comprising the step of copying the image to form an image copy.
9. A method according to claim 8, further comprising the step of compositing the object color and object opacity with color and opacity component values of the image copy.
10. A method according to claim 9, wherein the updated additional opacity channel represents opacity component values associated with the image copy remaining in the image copy following composition of the object color and object opacity with the color and opacity component values of the image copy.
11. A method according to claim 9, further comprising the step of utilizing the updated additional opacity channel to remove the color and opacity component values of the image copy remaining in the image copy following composition of the object color and object opacity with the color and opacity component values of the image copy.
12. A method according to claim 11, further comprising the step of utilizing the updated additional opacity channel to composite the object color and object opacity with the image color and image opacity.
13. A method according to claim 1, wherein the object color and object opacity are accessed from an image file.
14. A method according to claim 1, wherein the image color and image opacity are accessed from an image file.
15. The method according to claim 4, further comprising the step of compositing the composite image with the image using a group opacity.
16. A computer-implemented method of representing an amount of image color in a composite image, said method comprising the steps of:
generating at least one additional opacity channel for use in creating the composite image, the additional opacity channel initially set to fully opaque to represent that all of the image color is visible;
compositing at least one partially transparent graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image, the composite image having composite image color and composite image opacity, the composite image color and composite image opacity being derived from one or more of the object color, the object opacity, the image color and the image opacity;
compositing the object opacity with the additional opacity channel to update the additional opacity channel, the updated additional opacity channel thereby being less than fully opaque and representing an amount of the image color remaining in the composite image following the compositing of the at least one graphical object with the image;
storing at least the updated additional opacity channel in a computer-readable memory; and
utilizing the stored updated additional opacity channel to remove the remaining image color in the composite image,
wherein the steps are performed on a processor.
17. A method according to claim 16, further comprising the step of utilizing the updated additional opacity channel to composite the object color and object opacity with the image color and image opacity component.
18. A method according to either one of claims 16 or 17, wherein the at least one object is one object of a grouped plurality of objects.
19. An apparatus for representing an amount of image color in a composite image, said apparatus comprising:
means for generating at least one additional opacity channel for use in creating the composite image, the additional opacity channel initially set to fully opaque to represent that all of the image color is visible;
means for compositing at least one partially transparent graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image, the composite image having composite image color and composite image opacity, the composite image color and composite image opacity being derived from one or more of the object color, the object opacity, the image color and the image opacity; and
means for compositing the object opacity with the additional opacity channel to update the additional opacity channel, the updated additional opacity channel thereby being less than fully opaque and representing an amount of the image color remaining in the composite image following said compositing of the at least one graphical object with the image.
20. An apparatus representing an amount of image color in a composite image, said apparatus comprising:
means for generating at least one additional opacity channel for use in creating the composite image, the additional opacity channel initially set to fully opaque to represent that all of the image color is visible;
means for compositing at least one partially transparent graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image, the composite image having composite image color and composite image opacity, the composite image color and composite image opacity being derived from one or more of the object color, the object opacity, the image color and the image opacity; and
means for compositing the object opacity with the additional opacity channel to update the additional opacity channel, the updated additional opacity channel thereby being less than fully opaque and representing an amount of the image color remaining in the composite image following the compositing of the at least one graphical object with the image, and utilizing the updated additional opacity channel to remove the remaining image color in the composite image.
21. An apparatus for representing an amount of image color in a composite image, said apparatus comprising:
a memory for storing data and a computer program; and
a processor coupled to said memory for executing said computer program, said computer program comprising:
code for generating at least one additional opacity channel for use in creating the composite image, the additional opacity channel initially set to fully opaque to represent that all of the image color is visible;
code for compositing at least one partially transparent graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image, the composite image having composite image color and composite image opacity, the composite image color and composite image opacity being derived from one or more of the object color, the object opacity, the image color and the image opacity; and
code for compositing the object opacity with the additional opacity channel to update the additional opacity channel, the updated additional opacity channel thereby being less than fully opaque and representing an amount of the image color remaining in the composite image following the compositing of the at least one graphical object with the image.
22. An apparatus for representing an amount of image color in a composite image, said apparatus comprising:
a memory for storing data and a computer program; and
a processor coupled to said memory for executing said computer program, said computer program comprising:
code for generating at least one additional opacity channel for use in creating the composite image, the additional opacity channel initially set to fully opaque to represent that all of the image color is visible;
code for compositing at least one partially transparent graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image, the composite image having composite image color and composite image opacity, the composite image color and composite image opacity being derived from one or more of the object color, the object opacity, the image color and the image opacity;
code for compositing the object opacity with the additional opacity channel to update the additional opacity channel, the updated additional opacity channel thereby being less than fully opaque and representing an amount of the image color remaining in the composite image following the compositing of the at least one graphical object with the original image; and
code for utilizing the updated additional opacity channel to remove the remaining image color in the composite image.
23. A non-transitory computer storage medium having a computer program recorded therein for representing an amount of image color in a composite image, said computer program comprising:
code for generating at least one additional opacity channel for use in creating the composite image, the additional opacity channel initially set to fully opaque to represent that all of the image color is visible;
code for compositing at least one partially transparent graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image, the composite image having composite image color and composite image opacity, the composite image color and composite image opacity being derived from one or more of the object color, the object opacity, the image color and the image opacity; and
code for compositing the object opacity with the additional opacity channel to update the additional opacity channel, the updated additional opacity channel thereby being less than fully opaque and representing an amount of the image color remaining in the composite image following the compositing of the at least one graphical object with the image.
24. A non-transitory computer storage medium having a computer program recorded therein for representing an amount of image color in a composite image, said computer program comprising:
code for generating at least one additional opacity channel for use in creating the composite image, the additional opacity channel initially set to fully opaque to represent that all of the image color is visible;
code for compositing at least one partially transparent graphical object having object color and object opacity, with an image having image opacity and the image color, to create the composite image, the composite image having composite image color and composite image opacity, the composite image color and composite image opacity being derived from one or more of the object color, the object opacity, the image color and the image opacity;
code for compositing the object opacity with the additional opacity channel to update the additional opacity channel, the updated additional opacity channel thereby being less than fully opaque and representing an amount of the original image color remaining in the composite image following the compositing of the at least one graphical object with the image; and
code for utilizing the updated additional opacity channel to remove the remaining image color in the composite image.
US10/525,417 2002-10-30 2003-10-29 Method of background colour removal for porter and duff compositing Expired - Fee Related US7864197B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AU2002952382 2002-10-30
AU2002952382A AU2002952382A0 (en) 2002-10-30 2002-10-30 Method of Background Colour Removal for Porter and Duff Compositing
PCT/AU2003/001427 WO2004040514A1 (en) 2002-10-30 2003-10-29 Method of background colour removal for porter and duff compositing

Publications (2)

Publication Number Publication Date
US20060103671A1 US20060103671A1 (en) 2006-05-18
US7864197B2 true US7864197B2 (en) 2011-01-04

Family

ID=28795775

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/525,417 Expired - Fee Related US7864197B2 (en) 2002-10-30 2003-10-29 Method of background colour removal for porter and duff compositing

Country Status (7)

Country Link
US (1) US7864197B2 (en)
EP (1) EP1556835B1 (en)
JP (1) JP4366317B2 (en)
KR (1) KR100664632B1 (en)
CN (1) CN1703724B (en)
AU (1) AU2002952382A0 (en)
WO (1) WO2004040514A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060290961A1 (en) * 2005-06-22 2006-12-28 Xerox Corporation System and method for conveying rendering intents
US20080278519A1 (en) * 2007-05-11 2008-11-13 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method and apparatus for fast flicker-free displaying overlapped sparse graphs with optional shape
US20080307342A1 (en) * 2007-06-08 2008-12-11 Apple Inc. Rendering Semi-Transparent User Interface Elements
US20150029215A1 (en) * 2013-07-29 2015-01-29 Oracle International Corporation Interactive intersection areas
US20160328633A1 (en) * 2015-05-05 2016-11-10 Canon Kabushiki Kaisha Parallelising per-pixel compositing
US11030816B2 (en) * 2014-12-01 2021-06-08 Thinkware Corporation Electronic apparatus, control method thereof, computer program, and computer-readable recording medium

Families Citing this family (14)

Publication number Priority date Publication date Assignee Title
AU2003901416A0 (en) * 2003-03-27 2003-04-10 Canon Kabushiki Kaisha Graphical object group management system
US7486337B2 (en) * 2003-12-22 2009-02-03 Intel Corporation Controlling the overlay of multiple video signals
US7145578B2 (en) * 2004-03-18 2006-12-05 Canon Kabushiki Kaisha Scalable object recognition architecture
AU2005201013B2 (en) * 2004-03-18 2007-04-05 Canon Kabushiki Kaisha Scalable Object Recognition Architecture
JP4600648B2 (en) * 2004-08-30 2010-12-15 富士ゼロックス株式会社 Drawing command processing device and drawing command processing method, image drawing device and image drawing method, drawing command generating device and drawing command generating method
US7692831B2 (en) 2004-12-13 2010-04-06 Canon Kabushiki Kaisha Image processing apparatus and method thereof
US7924284B2 (en) * 2005-04-22 2011-04-12 Microsoft Corporation Rendering highlighting strokes
JP4724517B2 (en) 2005-09-29 2011-07-13 キヤノン株式会社 Image processing system, image processing method, program, and storage medium
US8638341B2 (en) * 2007-10-23 2014-01-28 Qualcomm Incorporated Antialiasing of two-dimensional vector images
JP5197043B2 (en) * 2008-02-12 2013-05-15 キヤノン株式会社 Image forming apparatus, image forming method, storage medium, and program
AU2009225336B2 (en) * 2009-10-13 2011-08-04 Canon Kabushiki Kaisha Method of compositing variable alpha fills supporting group opacity
JP5383906B2 (en) * 2010-04-15 2014-01-08 三菱電機株式会社 Image drawing apparatus and image drawing method
AU2010241218B2 (en) * 2010-11-03 2013-10-31 Canon Kabushiki Kaisha Method, apparatus and system for associating an intermediate fill with a plurality of objects
CN112714357B (en) * 2020-12-21 2023-10-13 北京百度网讯科技有限公司 Video playing method, video playing device, electronic equipment and storage medium

Citations (21)

Publication number Priority date Publication date Assignee Title
EP0924654A2 (en) 1997-12-22 1999-06-23 Adobe Systems Incorporated Transparency processing in a page description language
EP0924652A2 (en) 1997-12-22 1999-06-23 Adobe Systems Incorporated Blending image data using layers
US5917937A (en) * 1997-04-15 1999-06-29 Microsoft Corporation Method for performing stereo matching to recover depths, colors and opacities of surface elements
US6014147A (en) 1994-07-25 2000-01-11 Canon Information Systems Research Australia Pty Ltd Computer machine architecture for creating images from graphical elements and a method of operating the architecture
US6028583A (en) 1998-01-16 2000-02-22 Adobe Systems, Inc. Compound layers for composited image manipulation
US6034694A (en) 1997-06-30 2000-03-07 Sun Microsystems, Inc. Method and apparatus for pixel composition
JP2000149035A (en) 1998-09-11 2000-05-30 Canon Inc Method and device for processing graphic object for high- speed raster form rendering
US6115049A (en) 1996-09-30 2000-09-05 Apple Computer, Inc. Method and apparatus for high performance antialiasing which minimizes per pixel storage and object data bandwidth
US6130676A (en) * 1998-04-02 2000-10-10 Avid Technology, Inc. Image composition system and process using layers
JP2001209819A (en) 1999-12-22 2001-08-03 Adobe Syst Inc Range limit of mixed mode for two-dimensional synthesis using isolated groups
US6301382B1 (en) * 1996-06-07 2001-10-09 Microsoft Corporation Extracting a matte of a foreground object from multiple backgrounds by triangulation
JP2001283213A (en) 2000-02-29 2001-10-12 Canon Inc Image processor and its method
JP2002056396A (en) 2000-04-18 2002-02-20 Canon Inc Plotting method and device for graphic object image
US20020027563A1 (en) * 2000-05-31 2002-03-07 Van Doan Khanh Phi Image data acquisition optimisation
US20020149600A1 (en) 2001-04-09 2002-10-17 Marinus Van Splunter Method of blending digital pictures
US6532022B1 (en) * 1997-10-15 2003-03-11 Electric Planet, Inc. Method and apparatus for model-based compositing
US20030189568A1 (en) * 2002-04-09 2003-10-09 Alkouh Homoud B. Image with depth of field using z-buffer image data and alpha blending
US20030193508A1 (en) 2002-04-11 2003-10-16 Sun Microsystems, Inc Method and apparatus to calculate any porter-duff compositing equation using pre-defined logical operations and pre-computed constants
US6828985B1 (en) 1998-09-11 2004-12-07 Canon Kabushiki Kaisha Fast rendering techniques for rasterised graphic object based images
US6903738B2 (en) * 2002-06-17 2005-06-07 Mitsubishi Electric Research Laboratories, Inc. Image-based 3D modeling rendering system
US7102651B1 (en) 1999-12-22 2006-09-05 Adobe Systems Incorporated Hierarchical 2-D color compositing with blending mode and opacity controls at all levels

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
AU727503B2 (en) * 1996-07-31 2000-12-14 Canon Kabushiki Kaisha Image filtering method and apparatus
JP3870422B2 (en) * 1997-02-20 2007-01-17 ソニー株式会社 Video signal processing apparatus and method, image composition apparatus, and editing apparatus
JP3059956B2 (en) * 1998-03-19 2000-07-04 コナミ株式会社 IMAGE CREATING DEVICE, IMAGE CREATING METHOD, AND READABLE RECORDING MEDIUM CONTAINING IMAGE CREATING PROGRAM
US6337925B1 (en) * 2000-05-08 2002-01-08 Adobe Systems Incorporated Method for determining a border in a complex scene with applications to image masking
JP2002314990A (en) * 2001-04-12 2002-10-25 Auto Network Gijutsu Kenkyusho:Kk System for visually confirming periphery of vehicle

Patent Citations (27)

Publication number Priority date Publication date Assignee Title
US6014147A (en) 1994-07-25 2000-01-11 Canon Information Systems Research Australia Pty Ltd Computer machine architecture for creating images from graphical elements and a method of operating the architecture
US6301382B1 (en) * 1996-06-07 2001-10-09 Microsoft Corporation Extracting a matte of a foreground object from multiple backgrounds by triangulation
US6115049A (en) 1996-09-30 2000-09-05 Apple Computer, Inc. Method and apparatus for high performance antialiasing which minimizes per pixel storage and object data bandwidth
US5917937A (en) * 1997-04-15 1999-06-29 Microsoft Corporation Method for performing stereo matching to recover depths, colors and opacities of surface elements
US6034694A (en) 1997-06-30 2000-03-07 Sun Microsystems, Inc. Method and apparatus for pixel composition
US6532022B1 (en) * 1997-10-15 2003-03-11 Electric Planet, Inc. Method and apparatus for model-based compositing
US6289364B1 (en) 1997-12-22 2001-09-11 Adobe Systems, Inc. Transparency processing in a page description language
EP0924652A2 (en) 1997-12-22 1999-06-23 Adobe Systems Incorporated Blending image data using layers
EP0924654A2 (en) 1997-12-22 1999-06-23 Adobe Systems Incorporated Transparency processing in a page description language
US6028583A (en) 1998-01-16 2000-02-22 Adobe Systems, Inc. Compound layers for composited image manipulation
US6130676A (en) * 1998-04-02 2000-10-10 Avid Technology, Inc. Image composition system and process using layers
JP2000149035A (en) 1998-09-11 2000-05-30 Canon Inc Method and device for processing graphic object for high- speed raster form rendering
US7046253B2 (en) 1998-09-11 2006-05-16 Canon Kabushiki Kaisha Processing graphic objects for fast rasterised rendering
US6483519B1 (en) * 1998-09-11 2002-11-19 Canon Kabushiki Kaisha Processing graphic objects for fast rasterised rendering
US6828985B1 (en) 1998-09-11 2004-12-07 Canon Kabushiki Kaisha Fast rendering techniques for rasterised graphic object based images
JP2001209819A (en) 1999-12-22 2001-08-03 Adobe Syst Inc Range limit of mixed mode for two-dimensional synthesis using isolated groups
US7151546B1 (en) 1999-12-22 2006-12-19 Adobe Systems Incorporated Restricting scope of blending modes in 2-D compositing using isolated groups
US7102651B1 (en) 1999-12-22 2006-09-05 Adobe Systems Incorporated Hierarchical 2-D color compositing with blending mode and opacity controls at all levels
JP2001283213A (en) 2000-02-29 2001-10-12 Canon Inc Image processor and its method
US6741261B2 (en) 2000-02-29 2004-05-25 Canon Kabushiki Kaisha Alpha-channel compositing system
JP2002056396A (en) 2000-04-18 2002-02-20 Canon Inc Plotting method and device for graphic object image
US7277102B2 (en) 2000-04-18 2007-10-02 Canon Kabushiki Kaisha Rendering graphic object based images
US20020027563A1 (en) * 2000-05-31 2002-03-07 Van Doan Khanh Phi Image data acquisition optimisation
US20020149600A1 (en) 2001-04-09 2002-10-17 Marinus Van Splunter Method of blending digital pictures
US20030189568A1 (en) * 2002-04-09 2003-10-09 Alkouh Homoud B. Image with depth of field using z-buffer image data and alpha blending
US20030193508A1 (en) 2002-04-11 2003-10-16 Sun Microsystems, Inc Method and apparatus to calculate any porter-duff compositing equation using pre-defined logical operations and pre-computed constants
US6903738B2 (en) * 2002-06-17 2005-06-07 Mitsubishi Electric Research Laboratories, Inc. Image-based 3D modeling rendering system

Non-Patent Citations (17)

Title
"Clipping, Masking and Compositing", SVG 1.1, Chapter 14 (pp. 1-29), Mar. 25, 2002, (http://lists.w3.org/Archives/Member/w3c-archive/2002Mar/0051.html).
"Current Support for SVG Adobe SVG Viewer", Version 3.0 (Build 76), 2001 (http://www.adobe.com/svg/indepth/pdfs/CurrentSupport.pdf).
Adobe PDF 1.4 Specification-Additional Pages, Dec. 2001, Addison-Wesley, pp. 1-3, 23, 77, and 705. *
Adobe PDF 1.4 Specification-Additional Pages, Dec. 2001, Addison-Wesley, pp. 131-133. *
Alvy Ray Smith, "Paint", Computer Graphics Lab, New York Institute of Technology, Technical Memo No. 7, Jul. 20, 1978 (pp. 1-24).
Alvy Ray Smith, "Tint Fill," ACM Journal, Computer Graphics Lab, New York Institute of Technology (pp. 276-283), 1979.
Chapter 7, Adobe PDF 1.4 Specification (pp. 409-470), Dec. 2001 (http://partners.adobe.com/asn/developer/acrosdk/docs/filefmtspecs/PDFReference.zip).
Craig Northway, "Understand Compositing and Color Extensions in SVG 1.2 in 30 minutes!", Jun. 29, 2009, Sections 8 and 9.
European Search Report dated Apr. 12, 2009, in related corresponding EP 03 769 035.1.
European Search Report dated Jul. 6, 2009, in related corresponding EP 03 76 9035.
James D. Foley, et al., "Computer Graphics: Theory and Practice" (1st Ed.), pp. 830-839, 17.6 Image compositing, (Mar. 23, 2001).
Japanese Office Action dated Mar. 16, 2009, in related corresponding Japanese Patent Appln. No. 2004-547278.
Jonathan Knudsen, "Java 2D Graphics", O'Reilly Media, Inc., May 5, 1999, Section 5.2.3.
Ola Andersson, et al., "Scalable Vector Graphics (SVG) 1.1 Specification-W3C Candidate Recommendation", Apr. 30, 2002, Sections 14.15 and 15.12.
Ola Andersson, et al., "Scalable Vector Graphics (SVG) 1.2 Specification-W3C Working Draft", Oct. 27, 2004, Section 10.1.4.
Ola Andersson, et al., "Scalable Vector Graphics (SVG) 1.2-W3C Working Draft", Nov. 15, 2002, Section 4.1.
Thomas Porter et al., "Compositing Digital Images", Computer Graphics (vol. 18, No. 3, Jul. 1984), pp. 253-259.

Cited By (14)

Publication number Priority date Publication date Assignee Title
US8477323B2 (en) * 2005-06-22 2013-07-02 Xerox Corporation System and method for conveying rendering intents
US20060290961A1 (en) * 2005-06-22 2006-12-28 Xerox Corporation System and method for conveying rendering intents
US20080278519A1 (en) * 2007-05-11 2008-11-13 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method and apparatus for fast flicker-free displaying overlapped sparse graphs with optional shape
US7969439B2 (en) * 2007-05-11 2011-06-28 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method and apparatus for fast flicker-free displaying overlapped sparse graphs with optional shape
US10181204B2 (en) 2007-06-08 2019-01-15 Apple Inc. Rendering semi-transparent user interface elements
US20080307342A1 (en) * 2007-06-08 2008-12-11 Apple Inc. Rendering Semi-Transparent User Interface Elements
US11074725B2 (en) 2007-06-08 2021-07-27 Apple Inc. Rendering semi-transparent user interface elements
US10607377B2 (en) 2007-06-08 2020-03-31 Apple Inc. Rendering semi-transparent user interface elements
US9607408B2 (en) * 2007-06-08 2017-03-28 Apple Inc. Rendering semi-transparent user interface elements
US9317943B2 (en) * 2013-07-29 2016-04-19 Oracle International Corporation Interactive intersection areas
US20150029215A1 (en) * 2013-07-29 2015-01-29 Oracle International Corporation Interactive intersection areas
US11030816B2 (en) * 2014-12-01 2021-06-08 Thinkware Corporation Electronic apparatus, control method thereof, computer program, and computer-readable recording medium
US9727982B2 (en) * 2015-05-05 2017-08-08 Canon Kabushiki Kaisha Parallelising per-pixel compositing
US20160328633A1 (en) * 2015-05-05 2016-11-10 Canon Kabushiki Kaisha Parallelising per-pixel compositing

Also Published As

Publication number Publication date
EP1556835A1 (en) 2005-07-27
JP4366317B2 (en) 2009-11-18
KR20050051719A (en) 2005-06-01
CN1703724B (en) 2010-05-05
WO2004040514A1 (en) 2004-05-13
US20060103671A1 (en) 2006-05-18
EP1556835B1 (en) 2015-03-25
AU2002952382A0 (en) 2002-11-14
JP2006504191A (en) 2006-02-02
EP1556835A4 (en) 2009-08-19
CN1703724A (en) 2005-11-30
KR100664632B1 (en) 2007-01-04

Similar Documents

Publication Publication Date Title
US7864197B2 (en) Method of background colour removal for porter and duff compositing
JP3678428B2 (en) Method and apparatus for chroma key, transparency, and fog operation
US6987518B2 (en) Graphics and video integration with alpha and video blending
US20060256136A1 (en) Compositing two-dimensional and three-dimensional image layers
JP4240395B2 (en) Image composition apparatus, electronic device, image composition method, control program, and readable recording medium
US20050200867A1 (en) Compositing list caching for a raster image processor
JP2001357410A (en) Graphic system for composing three-dimensional images generated separately
US7554554B2 (en) Rendering apparatus
EP1306810A1 (en) Triangle identification buffer
JP2000137825A (en) Fast rendering method for image using raster type graphic object
US20130135339A1 (en) Subpixel Compositing on Transparent Backgrounds
US6927778B2 (en) System for alpha blending and method thereof
JPH07146931A (en) Picture generating method
US20020135587A1 (en) System and method for implementing accumulation buffer operations in texture mapping hardware
US7965299B2 (en) Implementing compositing operations on images
US7982746B2 (en) Simplification of alpha compositing in the presence of transfer functions
US20050195220A1 (en) Compositing with clip-to-self functionality without using a shape channel
EP0855682B1 (en) Scan line rendering of convolutions
US6980220B1 (en) Run-based compositing
AU746985B2 (en) Run-based compositing
AU743218B2 (en) Fast rendering techniques for rasterised graphic object based images
AU2005200948B2 (en) Compositing list caching for a raster image processor
Anholt High Performance X Servers in the Kdrive Architecture.
AU2005200528B2 (en) Compositing with clip-to-self functionality without using a shape channel
AU2005201868A1 (en) Removing background colour in group compositing

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROWN, CRAIG MATTHEW;REEL/FRAME:017005/0420

Effective date: 20050819

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552)

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230104