WO1997009830A1 - Method and apparatus for determining the position of a TV camera for use in a virtual studio

Info

Publication number: WO1997009830A1
Authority: WIPO (PCT)
Application number: PCT/GB1996/002227
Other languages: French (fr)
Inventor
Alexander Steinberg
Zinovy Livshits
Itzhak Wilf
Moshe Nissim
Michael Tamir
Avi Sharir
David Aufhauser
Original Assignee
Orad Hi-Tec Systems Limited
Application filed by Orad Hi-Tec Systems Limited
Priority/family publications: DE69601880T2, BR9606555A, PL325423A1, JPH11503588A, IL123372A, US6304298B1, AU6935396A, EP0848886B1, NO981010L

Classifications

    • H04N5/2224 Studio circuitry, devices and equipment related to virtual studio applications
    • H04N5/272 Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • H04N9/75 Chroma key


Abstract

A method of determining the position of a TV camera relative to a patterned panel being viewed by the TV camera including the steps of: identifying a plurality of edge points of the pattern from the video signal produced by said camera and using these edge points to calculate the perspective of the pattern relative to the camera.

Description

Method and Apparatus for Determining the Position of a TV
Camera for Use in A Virtual Studio
The present invention relates to methods and apparatus for creating
virtual images and for determining the relative position of a TV camera.
Chroma-key panels are known for use in TV studios. By focusing
a TV camera onto a chroma-key background (or panel) and positioning a
foreground object in front of the panel, a combined picture can be created
in which the foreground object appears against a virtual background which
can be, for example, a still picture or a video sequence.
A problem which arises from this basic technique is that the camera
cannot be allowed to move because the virtual background and the
foreground object (possibly a TV presenter) will not move synchronously
as in real life.
In JP 57-93788 a chroma-key panel is used which includes a series
of equidistant parallel lines, figure 11, of two different shades of backing
colour to monitor any changes in zoom which are manifested as changes
in the frequency of the video signal. The boundaries of a chroma-key
window are detected in order to fit the inserted image in size and position
to the chroma-key window.
Perspective can be solved by using a two-shade pattern with
characteristic features. Such features may include characters, symbols,
vertices of polygons etc. Whenever enough of the image features can be matched with the physical pattern, the perspective can be solved.
For the purpose of the present invention, the description will generally be confined to the use of a TV camera within a virtual studio, but
it is to be understood that the invention can be used for general tracking
of a TV camera or an object on which it is positioned.
In co-pending Israeli Patent Application No. 109,487 to the same
applicant, the use of chroma-key patterned panels is disclosed. These
panels have a defined pattern which allows the video signals generated by
the TV camera to be processed to ascertain the position of the camera.
A problem which arises in the above prior art systems is that for
large zoom-in factors the features in the Field of View (FOV) are reduced
in number. Also, for a substantial occlusion the recognition of robust
features may be difficult. Since in the present invention movement and zoom
of the camera are permitted and the foreground object is also allowed to
move, these circumstances are very likely to occur.
In addition, large perspective distortion makes the recognition of
features very difficult, in particular when said features comprise characters,
graphical symbols etc.
If the camera loses synchronism between the foreground real object
and the virtual background then the effect will be a loss of reality in the
composite picture. Thus, as explained above, earlier systems were
limited to a static camera and later systems, although allowing camera movement, may still be subject to a loss of synchronism between
foreground and background.
Obviously, if none of the patterned chroma-key background is
visible then synchronism cannot be maintained, but it is also not necessary
since no virtual background will be shown.
As the camera zooms in to the foreground object, the background
chroma-key panel will become more occluded by the foreground object
and the characteristic pattern will be broken and/or distorted in the case
of large perspective views.
It is an object of the present invention to provide a TV camera
position determination apparatus and method for measuring the position of
a TV camera relative to a panel when part of the panel is occluded by a
foreground object.
It is also an object of the present invention to provide a virtual
studio system in which the TV camera is able to be moved laterally with
respect to a foreground object and to a background chroma-key panel; in
which the camera is able to zoom in and out with respect to the
foreground object without losing synchronism between the foreground
object and the virtual background, even when the chroma-key panel is
substantially completely occluded by the foreground object.
It is also a further object of the present invention to provide a
camera positioning apparatus in which the position of a TV camera relative to a patterned panel can be determined even when a substantial part of the panel is obscured by an occluding object.
The present invention therefore provides a method of determining
the position of a TV camera relative to a patterned panel being viewed by
the TV camera including the steps of identifying a plurality of edge points
of the pattern from the video signal produced by said camera and using
these edge points to calculate the perspective of the pattern relative to the
camera.
Preferably the method comprises the steps of identifying a plurality
of said first edge points and a plurality of said second points; and
producing an edge image.
Preferably two or more families of edges are used such that the
edges of each family lie on a set of parallel lines comprising at least two
lines. Preferably the orientations of the families are sufficiently far apart
such that an edge point can be assigned to a specific family by means of
its orientation only.
In a specific embodiment said patterned panel comprises a pattern
of vertical and horizontal straight edges defining lines delineating a colour
difference and in which each edge point is situated on one of said
horizontal or vertical straight lines.
In a first embodiment said plurality of first edge points are clustered
to associate edge points to specific lines using a slope and intercept process.
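By way of illustration only, the slope-and-intercept clustering of the first embodiment may be sketched as a coarse Hough-style vote, in which each edge mini-line votes in a quantised (slope, intercept) bin and populated bins correspond to candidate lines. The function name and bin sizes below are assumptions for illustration, not part of the disclosure.

```python
from collections import defaultdict

def cluster_edges(edges, slope_bin=0.05, intercept_bin=2.0):
    """Group edge mini-lines by quantised slope and intercept.

    edges -- iterable of (x, y, slope) local edge elements
    Returns a dict mapping (slope bin, intercept bin) -> list of points.
    """
    bins = defaultdict(list)
    for x, y, slope in edges:
        intercept = y - slope * x          # intercept of the mini-line
        key = (round(slope / slope_bin), round(intercept / intercept_bin))
        bins[key].append((x, y))
    return bins
```

Edge points lying on the same straight line share a bin, so the largest bins identify the lines.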
In a second embodiment steps of processing the video signal
relating to said first and said second plurality of edge points comprise the
steps of analysing all detected edge points and grouping together edge
points into a first plurality of groups corresponding to horizontal lines and
a second plurality of groups corresponding to vertical lines.
Preferably the edge points in the first and second plurality of groups
are allocated preliminarily to specific horizontal and vertical lines.
Preferably the step of allocation is followed by computation of the
vanishing points of the horizontal and vertical lines, said vanishing points
being computed within a defined location error.
The perspective projection of any set of parallel lines which are not
parallel to the image plane, will converge to a vanishing point. In the
singular case where the lines are parallel to the image plane, the vanishing
point is at infinity.
Preferably the method also includes the step of projecting the edges
corresponding to horizontal edges to obtain an edge projection profile map
comprising peaks and troughs.
Preferably in the projection process a vertical accumulator array
H[y] is cleared to zero. Then for each horizontal edge, the line
connecting the vanishing point (previously computed for horizontal edges) with the edge is computed. That line is then intersected with the vertical
axis (x=0). The cell of the accumulator array which corresponds to the
intersection point is then incremented. Peaks in that array correspond to
candidate lines.
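The accumulator projection just described may be sketched as follows; the variable names and the use of image rows as accumulator cells are illustrative assumptions.

```python
def project_edges(edges, vanishing_point, height):
    """Accumulate x = 0 intercepts of lines through the vanishing point.

    edges           -- iterable of (x, y) horizontal-edge coordinates
    vanishing_point -- (vx, vy), assumed finite and off the x = 0 axis
    height          -- number of accumulator cells (image rows)
    """
    vx, vy = vanishing_point
    histogram = [0] * height              # vertical accumulator array H[y]
    for x, y in edges:
        if x == vx:                       # degenerate: line misses x = 0
            continue
        # Line through (vx, vy) and (x, y), evaluated at x = 0.
        y0 = vy + (y - vy) * (0 - vx) / (x - vx)
        cell = int(round(y0))
        if 0 <= cell < height:
            histogram[cell] += 1          # peaks mark candidate lines
    return histogram
```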
Preferably the method further includes the step of assigning each
horizontal edge to a most probable peak and producing a list of edges for
each of a plurality of candidate lines indicated by the peak.
Preferably a line is specified for each list of edges, edges not
corresponding to any specified line being disregarded.
The method steps are then preferably repeated for vertical edges
and lines.
In the method an accurate video image edge line pattern is produced
and in which the known pattern on the panel is compared with the edge line pattern.
This comparison preferably comprises a first step of identifying a
first horizontal line in the accurate video image edge pattern, identifying
a second horizontal line in the accurate video image pattern, calculating
the distance between said first and second video image lines, comparing
the calculated distance between the video image lines with the known
pattern to produce a horizontal position and scale determination, repeating
said steps to produce a vertical position and scale determination and,
from said horizontal and vertical position and scale determinations, determining the position of the TV camera relative to the panel. Once all positions and scales have been determined, the matching
between the pattern and the image is now complete. Preferably, that
matching is used to solve for the final, accurate perspective transformation
between the pattern and the image.
Preferably, the perspective transformation is used to solve for the
position of the TV camera relative to the panel.
Preferably the patterned panel comprises a chroma-key panel having
two separately identifiable chroma-key colours. Preferably the patterned
panel comprises two or more distance coded families of lines.
In a further preferred embodiment the patterned panel comprises
two or more families of lines such that the lines of each family intersect
at a common point.
The present invention also provides apparatus for determining the
position of a TV camera relative to a patterned panel being viewed by the
TV camera including:
means for identifying a plurality of edge points of the pattern from
the video signal produced by said camera and means for processing these
edge points to calculate the perspective of the pattern relative to the
camera.
Embodiments of the present invention will now be described, by
way of example with reference to the accompanying drawings in which :-
Figure 1 shows a patterned panel for use in the present invention;
Figure 2 shows a close up of a portion of the panel of Figure 1
with an occluding object obscuring part of the pattern;
Figure 3 shows a perspective view of Figure 2 from one side;
Figure 4 shows a complex perspective view from one side and
above;
Figure 5 illustrates the process for identification of edge points;
Figure 6 illustrates diagrammatically the initial vanishing point
calculation for the edge points;
Figure 7 illustrates diagrammatically the rectified line image;
Figure 8 illustrates the projected line images for the horizontal
lines;
Figure 9 shows the accurate video lines after final processing for
comparison with the pattern of Figure 1;
Figure 10 illustrates the inventive concept of using coded bundles
of lines;
Figure 11 shows the top level flow of processing, and
Figure 12 illustrates the line detection process.
With reference now to the drawings, Figure 1 shows a patterned
panel 10 which comprises a plurality of vertical and horizontal lines
12, 14. These lines may be formed from narrow lines or stripes of different
colour, their function being to provide a plurality of defined edges.
For chroma-key panels the colours of the lines or stripes will
preferably be different shades of the same colour.
The lines need not be horizontal or vertical but will preferably
always be parallel straight lines with a predetermined angular relationship
between the generally horizontal and vertical lines. Preferably in any
pattern two or more families of edges are provided such that the edges of
each family lie on a set of parallel lines comprising at least two lines.
Also preferably the orientations of the families are far apart such that an
edge point can be assigned to a specific family by means of its orientation
only.
The TV camera 20 indicated diagrammatically is shown in Figure
1 viewing the panel directly from the front.
In Figure 2 the video image viewed by camera 20 is shown. The
TV camera 20 is operated to zoom in to the area 10' shown dotted in
Figure 1 and an occluding object 30 of irregular shape is shown occluding
part of the pattern. The pattern in Figure 2 is therefore not continuous
and it may be seen that there are no continuous horizontal lines in the
zoomed video image.
In Figure 2 only one occluding object is shown but there may be
several producing further discontinuities in the lines.
In Figure 3 the camera has been moved to create a simple
perspective which illustrates that the generally horizontal lines 14 are no longer parallel, and in Figure 4, in the more complex perspective, neither the
horizontal nor the vertical lines are parallel.
With the change in size of the pattern, the discontinuities in the lines
and the non-parallel image, matching the video image pattern in Figure 4
with a pattern of the panel stored in digital format will be extremely
difficult since no part of the video image corresponds to the stored pattern.
The method of the present invention provides a means for
determining the position of the TV camera from the video image of Figure
4.
Preferably in the pattern of Figure 1 the line spacings are not all
equal, such that distance ratios in sets of adjacent lines are unique within
the family of either horizontal or vertical lines. Thus if it is possible to
identify the line spacing between two vertical lines 121, 122 and two
horizontal lines 141, 142 then the area of the pattern forming part of the
video image can be identified.
However, because of the unknown magnification or zoom of the
TV camera, the unknown complex perspective and occlusion, the lines
appear totally different from the pattern in Figure 1.
The method comprises identifying a large plurality of edge points
144 as shown in Figure 4. Each edge point may be considered to
comprise a mini-line having slope and intercept as indicated by angle 148.
It may also have a nominal direction, if it is on a line of any thickness, as indicated by arrow 146. The locations of these edge points are stored
digitally to provide an initial edge point map. As can be seen in Figure
4 there may be substantial blank areas in the centre portion where the
occlusion occurs, but within this area there may be false edge points not
correctly belonging to the pattern which will be recorded and will need
to be discarded.
The edge points are allocated in groups to specific lines in the
horizontal and vertical directions using the Hough transform [J.
Illingworth and J. Kittler, A survey of the Hough transform, Computer
Vision, Graphics and Image Processing, 26, pp. 139-161 (1986)].
Alternatively the initial parallelism of line sets is used to provide
approximate positions of line sets in the horizontal and vertical directions.
It may be seen from Figure 4 that none of the lines are either
horizontal or vertical due to the perspective change. These terms are
therefore used herein generally to refer to lines which are substantially
horizontal or vertical, that is to say nearer to the horizontal rather than to
the vertical and vice versa.
With reference now to Figure 5, each line, as approximately
determined either by grouping of the edge points and/or by computation of
the initial parallelism of the line sets, is projected to an approximate
vanishing point for both horizontal (150) and vertical lines (152). As
shown, the lines will not intersect at a single point because of the errors and thus a "circle" of error 150, 152 is allowed, the centre of the circle,
for example, being considered to be the vanishing point. When the
camera is looking perpendicular to the panel, the vanishing point is at
infinity. Working in a homogeneous coordinate system, the latter case can
be handled as well.
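The homogeneous-coordinate treatment mentioned above can be illustrated briefly: representing a line ax + by + c = 0 by the triple (a, b, c), the intersection of two lines is their cross product, and parallel lines yield a point whose third (w) component is zero, i.e. a vanishing point at infinity. The line coefficients below are illustrative.

```python
def cross(p, q):
    """Cross product of two homogeneous line (or point) triples."""
    return (p[1] * q[2] - p[2] * q[1],
            p[2] * q[0] - p[0] * q[2],
            p[0] * q[1] - p[1] * q[0])

# Two converging lines, y = 0.1x and y = 0.2x - 10, meet at (100, 10).
meet = cross((0.1, -1, 0), (0.2, -1, -10))

# Two parallel lines, y = 0 and y = 5: the w component is exactly zero,
# so the "intersection" is a direction, not a finite point.
inf = cross((0, 1, 0), (0, 1, -5))
```

Because a point at infinity is just another triple with w = 0, the perpendicular-viewing case needs no special handling.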
With reference to Figure 7, the horizontal vanishing point is used
to cluster the horizontal edge points into lines. The line connecting the
vanishing point Ph with edge point El is intersected with the vertical axis.
The process is repeated for all horizontal edge points. Clearly, for real
lines which are characterised by a multitude of edge points, the
intersection points will tend to accumulate as shown in Figure 7. False
edges or very short visible lines will contribute more randomly. In Figure
8, the intersections provide a histogram type waveform. The process is
described for horizontal lines but will be repeated for the vertical lines.
Each edge point is reassessed by assigning it to the most probable
peak and a revised list of edges is then stored for each probable candidate
such as 160, 161, 162 in Figure 8.
Those edge points which are found not to correspond to a probable
candidate are discarded, thus, for horizontal lines, a list of edge points
has now been produced which will accurately align with the horizontal
lines 141, 142, line 141 being, for example, aligned with peak 161 and
line 142 with peak 162 by means of a list of edge points for each line. The lines are therefore accurately detected.
This process is then repeated for the vertical lines.
The edge points assigned to a most probable peak are processed to
find a line passing through these points in some optimal sense. For
example, the least-squared error line can be computed. The vanishing
points can now be computed more accurately, as the most probable
intersection points of a set of horizontal (or vertical) lines.
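A minimal illustration of the least-squared error line mentioned above, assuming the nearly horizontal case y = mx + b with vertical residuals; the function name is illustrative.

```python
def fit_line(points):
    """Least-squares fit of y = m*x + b to (x, y) edge points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    # Closed-form normal-equation solution for slope and intercept.
    denom = n * sxx - sx * sx
    m = (n * sxy - sx * sy) / denom
    b = (sy - m * sx) / n
    return m, b
```

For nearly vertical lines the roles of x and y would be swapped to keep the residuals well conditioned.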
Let the vanishing point of the horizontal bundle be given in
homogeneous coordinates by (Xh, Yh, Wh). Also let the vanishing point
of the vertical bundle (or set of lines) be given by (Xv, Yv, Wv). These
points correspond to vanishing points (1,0,0) and (0,1,0) of the parallel
bundles on the panel. From this correspondence, the perspective
transformation can be solved up to the shift and scale determinations for
both bundles. Applying the inverse transformation to the detected lines
produces an accurate grill pattern as shown in Figure 9.
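The statement that the transformation is solved "up to the shift and scale determinations" can be illustrated as follows: a homography whose first two columns are the measured vanishing points maps the panel directions (1, 0, 0) and (0, 1, 0) onto them, while the third column (the image of the panel origin) and the per-axis scales remain free until the grid is matched. All numerical values below are illustrative assumptions.

```python
def apply_h(H, p):
    """Apply a 3x3 homography H to a homogeneous point p."""
    return tuple(sum(H[r][c] * p[c] for c in range(3)) for r in range(3))

Vh = (0.9, 0.1, 0.001)    # measured horizontal vanishing point (assumed)
Vv = (-0.05, 1.0, 0.002)  # measured vertical vanishing point (assumed)
t = (320.0, 240.0, 1.0)   # free column: image of the panel origin

# Columns of H are the images of (1,0,0), (0,1,0) and (0,0,1).
H = [[Vh[0], Vv[0], t[0]],
     [Vh[1], Vv[1], t[1]],
     [Vh[2], Vv[2], t[2]]]

assert apply_h(H, (1, 0, 0)) == Vh    # panel x-direction maps to Vh
assert apply_h(H, (0, 1, 0)) == Vv    # panel y-direction maps to Vv
```

Any rescaling of the first two columns and any choice of the third column preserves the vanishing-point correspondence, which is exactly the remaining shift and scale ambiguity.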
This pattern is then matched against the stored pattern (Figure 1)
for each axis independently. In the search process each line L4 may be
any line in the horizontal pattern. L5 is, however, the next line and,
the distance pattern being unique, the lines can be identified. If we assume
that no lines are missing then we have a matching solution in the
horizontal direction and by a similar process we will have a matching
solution in the vertical direction.
If some lines are missing then a score is determined for the number of other matching lines and a search can be conducted, using the
knowledge of the matched lines, for any missing lines. If these are totally
obscured then a decision can be taken on a match using a threshold value
for the scores for both vertical and horizontal directions.
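As an illustrative sketch only (assuming no detected lines are missing), the matching against the stored pattern can exploit the fact that ratios of consecutive line spacings are invariant under the remaining shift and scale ambiguity; the function name and tolerance are assumptions.

```python
def match_lines(detected, pattern, tol=1e-3):
    """Locate the detected lines inside the stored pattern.

    detected -- sorted rectified line positions (shift/scale unknown)
    pattern  -- sorted line positions of the stored panel pattern
    Returns the pattern index where the detected run aligns, else None.
    """
    def ratios(pos):
        gaps = [b - a for a, b in zip(pos, pos[1:])]
        return [g2 / g1 for g1, g2 in zip(gaps, gaps[1:])]

    d = ratios(detected)
    for i in range(len(pattern) - len(detected) + 1):
        p = ratios(pattern[i:i + len(detected)])
        # Spacing ratios are shift- and scale-invariant, so equal
        # ratio sequences identify the matching window.
        if all(abs(a - b) < tol for a, b in zip(d, p)):
            return i
    return None
```

With missing lines, as the text explains, a score over all candidate alignments and a threshold would replace this exact comparison.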
To obtain the exact vanishing points and perspective, the corrected
list of edge points for each line is used to provide accurate line equations,
thereby enabling the vanishing points to be accurately calculated.
Having matched the lines, one knows not only the perspective
distortion as before but also the shifts and scales. This completes the
determination of the perspective transformation and thereby the position
of the TV camera relative to the panel.
The system can provide such information either in the case that one
or more lines in the pattern are obscured totally or in the event that the
lines are discontinuous. The system can, therefore, work with high
camera zoom parameters where only a very small fraction of the panel is
visible.
With reference now to Figure 10, the concept of a parallel family
of lines can be extended to an intersecting family using an alternative
system of coded bundles 200 (Fig. 10a) (families of lines). The lines are
not parallel, yet one can use basically the same techniques.
Consider two parallel coded bundles 202', 204' ("primary bundles") which are transformed by a known perspective transformation (the "pre-
transformation") in the panel design process to two intersecting bundles
("pattern bundles"). These bundles are further transformed by the
(unknown) camera perspective transformation and appear as "image
bundles" 202", 204" (Fig. 10c).
Clearly, the combination of the pre-transformation and the camera
transformation is an unknown perspective transformation. We proceed as
in the usual algorithm to find that unknown transformation (between the
primary bundles Fig. 10a and the image bundles Fig. 10c). Once that
transformation is known, we use the pre-transformation to extract the
camera transformation (between the pattern bundles and the image
bundles).
Figure 11 shows the top level flow of the processing, starting from
a video signal 1100 and producing an estimate of the perspective
transformation 1102 from the panel to the image. To reduce the number
of false edges due to foreground objects, a chroma-keyer 1104 is used to
segment the background (which contains the pattern information) from the
foreground. This segmentation is performed based on a key signal which
describes the distance of a specific pixel from the backing colour
(preferably blue or green). To further reduce the number of false edges
the key image is preferably filtered 1106 to remove isolated features and
pixels near the border of foreground objects. This filtering is preferably done using morphological image processing [Serra, J., Image Analysis and
Mathematical Morphology, Academic Press, London 1982].
Edge detection is then applied 1108 to the background image. The
method is not sensitive to the specific edge detector used. For a survey
see [A. Rosenfeld and A. Kak, Digital Picture Processing, Academic
Press 1982, Vol. 2, pp. 84-112].
Preferably the edge detection process consists of the following
steps:
1. Smoothing the image to reduce the effect of image noise.
2. Computing a gradient vector (magnitude and direction) at
each pixel, by means of x and y spatial derivatives.
3. Thresholding the gradient magnitude and suppressing pixels
where the gradient response does not have a local maximum.
This suppression step is necessary to obtain thin edge
contours.
4. Storing the edge points in an edge array.
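Steps 2 to 4 above may be sketched compactly as follows; smoothing and non-maximum suppression are omitted for brevity, and the threshold value is an illustrative assumption.

```python
import math

def detect_edges(image, threshold=0.5):
    """Gradient-based edge detection on a 2-D list of grey values.

    Returns an edge array of (x, y, magnitude, direction) tuples.
    """
    edges = []
    h, w = len(image), len(image[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Central-difference x and y spatial derivatives (step 2).
            gx = (image[y][x + 1] - image[y][x - 1]) / 2.0
            gy = (image[y + 1][x] - image[y - 1][x]) / 2.0
            mag = math.hypot(gx, gy)
            if mag > threshold:           # threshold the magnitude (step 3)
                # Store the edge point in the edge array (step 4).
                edges.append((x, y, mag, math.atan2(gy, gx)))
    return edges
```

The stored direction is what later allows each edge point to be assigned to the horizontal or vertical family by orientation alone.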
The line detection process is further described with reference to
Figure 12 for horizontal lines. Vertical lines are processed in a similar
manner.
From a list of horizontal edge points an approximate vanishing
point is computed 1202. Each edge is projected through a vanishing point 1204 to produce a projection histogram which is analysed 1206, to find
the peaks. The list of peaks is compared with each edge point to assign
an edge point to a peak and to then fit the lines 1208 to provide a list of
lines.

Claims

1. A method of determining the position of a TV camera relative to
a patterned panel being viewed by the TV camera including the steps of:
identifying a plurality of edge points of the pattern from the video
signal produced by said camera and using these edge points to calculate
the perspective of the pattern relative to the camera.
2. A method as claimed in claim 1 comprising the steps of:
identifying a plurality of first edge points and a plurality of second
edge points; and
producing an edge image.
3. A method as claimed in claim 2 in which said patterned panel
comprises a pattern of vertical and horizontal straight edges defining lines
delineating a colour difference and in which each edge point is situated on
one of said horizontal or vertical straight lines.
4. A method as claimed in claim 3 in which said plurality of first edge
points are clustered to associate edge points to specific lines using a slope
and intercept process.
5. A method as claimed in claim 3 in which said steps of processing the video signal relating to said first and said second plurality of edge
points comprise the steps of:
analysing all detected edge points and grouping together edge points
into a first plurality of groups corresponding to horizontal lines and a
second plurality of groups corresponding to vertical lines.
6. A method as claimed in claim 5 in which the edge points in the first
and second plurality of groups are allocated preliminarily to specific
horizontal and vertical lines.
7. A method as claimed in claim 6 in which the step of allocation is
followed by computation of the vanishing points of the horizontal and
vertical lines, said vanishing points being computed within a defined
location error.
8. A method as claimed in claim 7 further including the step of:
projecting the edges corresponding to horizontal edges to obtain an
edge projection profile map comprising peaks and troughs.
9. A method as claimed in claim 8 further including the step of:
assigning each horizontal edge to a most probable peak and
producing a list of edges for each of a plurality of candidate lines
indicated by the peak.
10. A method as claimed in claim 9 in which a line is specified for each
list of edges, edges not corresponding to any specified line being
disregarded.
11. A method as claimed in claim 10 in which the steps are repeated
for vertical edges and lines.
12. A method as claimed in claim 11 in which accurate vanishing points
are computed from the specified lines.
13. A method as claimed in claim 11 in which the perspective
transformation is solved up to the shift and scale determinations for both
families of lines.
14. A method as claimed in claim 13 in which an accurate line pattern
is produced by means of inverse perspective transformation and in which
the known pattern on the panel is compared with the edge line pattern.
15. A method as claimed in claim 14 in which said comparison
comprises a first step of identifying a first horizontal line in the accurate
video image edge pattern, identifying a second horizontal line in the
accurate video image pattem, calculating the distance between said first and second video image lines, comparing the calculated distance between
the video image lines with the known pattern to produce a horizontal
position and scale determination, repeating said steps to produce a vertical
position and scale determination and from said horizontal and vertical
position and scale determinations determining the position of the TV
camera relative to the panel.
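Claim 15 measures the separation of two identified lines in the rectified image and compares it with the known panel pattern to recover position and scale. With a distance-coded pattern (claim 17), the ratio of consecutive spacings is scale-invariant and identifies the observed lines uniquely. A sketch under those assumptions (the function name and tolerance are invented for the example):

```python
def match_lines(detected, pattern, tol=0.05):
    """Identify which pattern lines were observed by comparing the
    ratio of consecutive spacings, then solve for scale and shift.

    detected: >= 3 sorted line positions in the rectified image.
    pattern:  sorted line positions of the known panel (world units).
    Returns (index, scale, shift) such that
    detected[k] ~ scale * pattern[index + k] + shift, or None.
    """
    r_det = (detected[2] - detected[1]) / (detected[1] - detected[0])
    for i in range(len(pattern) - 2):
        r_pat = (pattern[i + 2] - pattern[i + 1]) / (pattern[i + 1] - pattern[i])
        if abs(r_det - r_pat) < tol:
            scale = (detected[1] - detected[0]) / (pattern[i + 1] - pattern[i])
            shift = detected[0] - scale * pattern[i]
            return i, scale, shift
    return None
```

Repeating the match for the vertical family gives the second scale/shift pair, from which the camera position relative to the panel follows.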
16. A method as claimed in any one of claims 1 to 15 in which the
patterned panel comprises a chroma-key panel having two separately
identifiable chroma-key colours.
17. A method as claimed in any one of claims 1 to 15 in which the
patterned panel comprises two or more distance coded families of lines.
18. A method as claimed in any one of claims 1 to 15 in which the
patterned panel comprises two or more families of lines such that the lines
of each family intersect at a common point.
19. A method as claimed in claim 16 in which the determination of the
position of the TV camera relative to the panel is used to calculate the
perspective of a background video picture relative to a foreground object.
20. Apparatus for determining the position of a TV camera relative to a patterned panel being viewed by the TV camera including:
means for identifying a plurality of edge points of the pattern from
the video signal produced by said camera and means for processing these
edge points to calculate the perspective of the pattern relative to the
camera.
21. Apparatus as claimed in claim 20 in which the patterned panel is a
chroma-key panel.
22. Apparatus as claimed in claim 21 further including further
processing means for processing said calculated position of the camera,
background scene storage means for storage of background scene,
perspective displacement means to adjust the perspective of a background
scene in accordance with the calculated camera position and video display
means for displaying the background scene in a correct perspective on said
chroma-key background panel with foreground objects interposed between
said camera and said background panel.
23. Apparatus as claimed in any one of claims 20 to 22 in which the
patterned panel comprises two or more distance coded families of lines.
24. Apparatus as claimed in any one of claims 20 to 22 in which the
patterned panel comprises two or more families of lines such that the lines
of each family intersect at a common point.
PCT/GB1996/002227 1995-09-08 1996-09-09 Method and apparatus for determining the position of a tv camera for use in a virtual studio WO1997009830A1 (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
DE69601880T DE69601880T2 (en) 1995-09-08 1996-09-09 METHOD AND DEVICE FOR CREATING THE POSITION OF A TELEVISION CAMERA FOR USE IN A VIRTUAL STUDIO
BR9606555A BR9606555A (en) 1995-09-08 1996-09-09 Method and apparatus for determining the position of a TV camera in relation to a configured panel that is viewed by the TV camera
PL96325423A PL325423A1 (en) 1995-09-08 1996-09-09 Method of and apparatus for determining actual position of a televidion camera for use at an virtual television studio
JP9511005A JPH11503588A (en) 1995-09-08 1996-09-09 Method and apparatus for positioning television camera used in virtual studio
IL12337296A IL123372A (en) 1995-09-08 1996-09-09 Method and apparatus for determining the position of a tv camera for use in a virtual studio
US08/765,898 US6304298B1 (en) 1995-09-08 1996-09-09 Method and apparatus for determining the position of a TV camera for use in a virtual studio
AU69353/96A AU6935396A (en) 1995-09-08 1996-09-09 Method and apparatus for determining the position of a tv camera for use in a virtual studio
EP96930236A EP0848886B1 (en) 1995-09-08 1996-09-09 Method and apparatus for determining the position of a tv camera for use in a virtual studio
NO981010A NO981010L (en) 1995-09-08 1998-03-06 Method and apparatus for determining the position of a television camera for use in a virtual studio

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9518432A GB2305050A (en) 1995-09-08 1995-09-08 Determining the position of a television camera for use in a virtual studio employing chroma keying
GB9518432.1 1995-09-08

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US08/765,898 A-371-Of-International US6304298B1 (en) 1995-09-08 1996-09-09 Method and apparatus for determining the position of a TV camera for use in a virtual studio
US09/921,160 Continuation US20010048483A1 (en) 1995-09-08 2001-08-02 Method and apparatus for determining the position of a TV camera for use in a virtual studio

Publications (1)

Publication Number Publication Date
WO1997009830A1 true WO1997009830A1 (en) 1997-03-13

Family

ID=10780450

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1996/002227 WO1997009830A1 (en) 1995-09-08 1996-09-09 Method and apparatus for determining the position of a tv camera for use in a virtual studio

Country Status (16)

Country Link
US (2) US6304298B1 (en)
EP (1) EP0848886B1 (en)
JP (1) JPH11503588A (en)
CN (1) CN1104816C (en)
AT (1) ATE178180T1 (en)
AU (1) AU6935396A (en)
BR (1) BR9606555A (en)
DE (1) DE69601880T2 (en)
GB (1) GB2305050A (en)
HU (1) HUP9900176A2 (en)
IL (1) IL123372A (en)
NO (1) NO981010L (en)
PL (1) PL325423A1 (en)
TR (1) TR199800397T2 (en)
WO (1) WO1997009830A1 (en)
ZA (1) ZA967588B (en)


Families Citing this family (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE46310E1 (en) 1991-12-23 2017-02-14 Blanding Hovenweep, Llc Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US5903454A (en) 1991-12-23 1999-05-11 Hoffberg; Linda Irene Human-factored interface corporating adaptive pattern recognition based controller apparatus
USRE47908E1 (en) 1991-12-23 2020-03-17 Blanding Hovenweep, Llc Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
USRE48056E1 (en) 1991-12-23 2020-06-16 Blanding Hovenweep, Llc Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US10361802B1 (en) 1999-02-01 2019-07-23 Blanding Hovenweep, Llc Adaptive pattern recognition based control system and method
US6850252B1 (en) 1999-10-05 2005-02-01 Steven M. Hoffberg Intelligent electronic appliance system and method
GB9607541D0 (en) * 1996-04-11 1996-06-12 Discreet Logic Inc Processing image data
GB2312125B (en) * 1996-04-11 1998-07-01 Discreet Logic Inc Processing image data
IL123733A0 (en) * 1997-03-27 1998-10-30 Rt Set Ltd Method for compositing an image of a real object with a virtual scene
US5930740A (en) * 1997-04-04 1999-07-27 Evans & Sutherland Computer Corporation Camera/lens calibration apparatus and method
DE69831181T2 (en) * 1997-05-30 2006-05-18 British Broadcasting Corp. location
GB2329292A (en) * 1997-09-12 1999-03-17 Orad Hi Tec Systems Ltd Camera position sensing system
IL137619A0 (en) * 1998-02-18 2001-07-24 Gmd Gmbh Camera tracking system for a virtual television or video studio
RU2161871C2 (en) * 1998-03-20 2001-01-10 Латыпов Нурахмед Нурисламович Method and device for producing video programs
US6912293B1 (en) * 1998-06-26 2005-06-28 Carl P. Korobkin Photogrammetry engine for model construction
US6965397B1 (en) 1999-11-22 2005-11-15 Sportvision, Inc. Measuring camera attitude
GB2356998A (en) * 1999-12-02 2001-06-06 Sony Uk Ltd Video signal processing
US6778699B1 (en) * 2000-03-27 2004-08-17 Eastman Kodak Company Method of determining vanishing point location from an image
EP1189171A2 (en) * 2000-09-08 2002-03-20 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and apparatus for generating picture in a virtual studio
WO2003005303A2 (en) * 2001-07-02 2003-01-16 Matchlight Software, Inc. System and method for discovering and categorizing attributes of a digital image
US6873732B2 (en) * 2001-07-09 2005-03-29 Xerox Corporation Method and apparatus for resolving perspective distortion in a document image and for calculating line sums in images
DE10139846C1 (en) * 2001-08-14 2003-02-06 Daimler Chrysler Ag Method for estimating positions and locations uses alignment of image data for a camera of model structures in order to increase long-duration stability and autonomics of aerodynamic vehicles/missiles.
EP1488413B1 (en) * 2002-03-22 2012-02-29 BRITISH TELECOMMUNICATIONS public limited company Anomaly recognition method for data streams
ATE315257T1 (en) * 2002-03-22 2006-02-15 British Telecomm COMPARISON OF PATTERNS
GB0229625D0 (en) * 2002-12-19 2003-01-22 British Telecomm Searching images
JP4508553B2 (en) * 2003-06-02 2010-07-21 カシオ計算機株式会社 Captured image projection device and captured image correction method
US7116342B2 (en) * 2003-07-03 2006-10-03 Sportsmedia Technology Corporation System and method for inserting content into an image sequence
JP2005122323A (en) * 2003-10-14 2005-05-12 Casio Comput Co Ltd Photographing apparatus, image processor, and image processing method and program for photographing device
JP4363151B2 (en) * 2003-10-14 2009-11-11 カシオ計算機株式会社 Imaging apparatus, image processing method thereof, and program
GB0328326D0 (en) 2003-12-05 2004-01-07 British Telecomm Image processing
JP3925521B2 (en) * 2004-08-19 2007-06-06 セイコーエプソン株式会社 Keystone correction using part of the screen edge
EP1789910B1 (en) 2004-09-17 2008-08-13 British Telecommunications Public Limited Company Analysis of patterns
EP1732030A1 (en) * 2005-06-10 2006-12-13 BRITISH TELECOMMUNICATIONS public limited company Comparison of patterns
WO2007012798A1 (en) * 2005-07-28 2007-02-01 British Telecommunications Public Limited Company Image analysis
EP1798961A1 (en) * 2005-12-19 2007-06-20 BRITISH TELECOMMUNICATIONS public limited company Method for focus control
EP2100254A2 (en) 2006-11-30 2009-09-16 Canon U.S. Life Sciences, Inc. Systems and methods for monitoring the amplification and dissociation behavior of dna molecules
US20080195938A1 (en) * 2006-12-14 2008-08-14 Steven Tischer Media Content Alteration
CN101267493B (en) * 2007-03-16 2011-01-19 富士通株式会社 Correction device and method for perspective distortion document image
JP4966231B2 (en) * 2008-03-13 2012-07-04 株式会社アイデンティファイ Studio system
GB2465793A (en) * 2008-11-28 2010-06-02 Sony Corp Estimating camera angle using extrapolated corner locations from a calibration pattern
JP5541031B2 (en) * 2010-09-16 2014-07-09 セイコーエプソン株式会社 Projector and projector control method
US9215383B2 (en) 2011-08-05 2015-12-15 Sportsvision, Inc. System for enhancing video from a mobile camera
JPWO2015162910A1 (en) * 2014-04-24 2017-04-13 パナソニックIpマネジメント株式会社 In-vehicle display device, control method for in-vehicle display device, and program
WO2018104904A1 (en) * 2016-12-08 2018-06-14 Eden Nir Methods and systems for image layer separation
US10594995B2 (en) * 2016-12-13 2020-03-17 Buf Canada Inc. Image capture and display on a dome for chroma keying
US10304210B2 (en) * 2017-05-25 2019-05-28 GM Global Technology Operations LLC Method and apparatus for camera calibration
US11094083B2 (en) * 2019-01-25 2021-08-17 Adobe Inc. Utilizing a critical edge detection neural network and a geometric model to determine camera parameters from a single digital image
DE102021106488A1 (en) 2020-12-23 2022-06-23 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Background display device, background display system, recording system, camera system, digital camera and method for controlling a background display device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5793788A (en) * 1980-12-03 1982-06-10 Nippon Hoso Kyokai <Nhk> Chroma-key device
WO1994005118A1 (en) * 1992-08-12 1994-03-03 British Broadcasting Corporation Derivation of studio camera position and motion from the camera image
WO1995030312A1 (en) * 1994-04-29 1995-11-09 Orad, Inc. Improved chromakeying system

Family Cites Families (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2974190A (en) 1957-12-09 1961-03-07 Columbia Broadcasting Syst Inc Electronic matting apparatus
JPS4934385A (en) 1972-07-28 1974-03-29
US3848082A (en) * 1973-01-16 1974-11-12 Atlantic Res Corp System for transmitting and utilizing supplemental data via television systems
JPS5637586B2 (en) 1973-07-02 1981-09-01
US3973239A (en) 1973-10-17 1976-08-03 Hitachi, Ltd. Pattern preliminary processing system
JPS5723295B2 (en) 1973-12-28 1982-05-18
US4200890A (en) 1977-07-11 1980-04-29 Nippon Electric Company, Ltd. Digital video effects system employing a chroma-key tracking technique
GB2013448B (en) * 1978-01-30 1983-02-23 Quantel Ltd Measurement of chroma key area in television systems
US4394680A (en) 1980-04-01 1983-07-19 Matsushita Electric Industrial Co., Ltd. Color television signal processing apparatus
US4396939A (en) 1980-06-09 1983-08-02 Nippon Electric Co., Ltd. Chromakey effect apparatus
CA1187166A (en) 1981-07-09 1985-05-14 Kaichi Yamamoto Digital chromakey apparatus
US4393394A (en) 1981-08-17 1983-07-12 Mccoy Reginald F H Television image positioning and combining system
JPS5846783A (en) 1981-09-12 1983-03-18 Sony Corp Chromakey device
US4409611A (en) 1981-09-24 1983-10-11 Vlahos-Gottschalk Research Corp., (Now) Ultimatte Corp. Encoded signal color image compositing
US4566126A (en) 1982-04-30 1986-01-21 Fuji Electric Company, Ltd. Pattern discriminator
US4409418A (en) * 1982-07-07 1983-10-11 Phillips Petroleum Company Isomerization process
JPS5972285A (en) 1982-10-18 1984-04-24 Nec Corp Chromakey signal generator
JPS5992678A (en) 1982-11-19 1984-05-28 Nec Corp Key signal detector
US4547897A (en) 1983-02-01 1985-10-15 Honeywell Inc. Image processing for part inspection
JPS60194696A (en) 1984-03-15 1985-10-03 Toshiba Corp Digital chromakey device
NL8402541A (en) * 1984-08-20 1986-03-17 Philips Nv AMPLIFIER CIRCUIT.
DE3704289A1 (en) 1987-02-12 1988-08-25 Broadcast Television Syst METHOD FOR CHECKING A COLOR PUNCH SIGNAL DECODER
DE3810328A1 (en) 1988-03-26 1989-10-05 Bosch Gmbh Robert METHOD AND CIRCUIT FOR COMBINING TWO TELEVISION SIGNALS
GB8826880D0 (en) * 1988-11-17 1988-12-21 Dickson J W Vehicle control
US4979021A (en) 1989-11-30 1990-12-18 Thomas Milton L Optical chromakey field
FR2661061B1 (en) * 1990-04-11 1992-08-07 Multi Media Tech METHOD AND DEVICE FOR MODIFYING IMAGE AREA.
CN1020986C (en) * 1990-07-28 1993-05-26 大连理工大学 Microcomputerized visual monitor system for secondary winder
JP3166173B2 (en) 1991-07-19 2001-05-14 プリンストン エレクトロニック ビルボード,インコーポレイテッド Television display with selected and inserted mark
GB9119964D0 (en) 1991-09-18 1991-10-30 Sarnoff David Res Center Pattern-key video insertion
US5638116A (en) * 1993-09-08 1997-06-10 Sumitomo Electric Industries, Ltd. Object recognition apparatus and method
IL108957A (en) 1994-03-14 1998-09-24 Scidel Technologies Ltd System for implanting an image into a video stream
US5488675A (en) 1994-03-31 1996-01-30 David Sarnoff Research Center, Inc. Stabilizing estimate of location of target region inferred from tracked multiple landmark regions of a video image
US5436672A (en) * 1994-05-27 1995-07-25 Symah Vision Video processing system for modifying a zone in successive images
US5892554A (en) * 1995-11-28 1999-04-06 Princeton Video Image, Inc. System and method for inserting static and dynamic images into a live video broadcast
GB9601101D0 (en) 1995-09-08 1996-03-20 Orad Hi Tech Systems Limited Method and apparatus for automatic electronic replacement of billboards in a video image
US5917553A (en) * 1996-10-22 1999-06-29 Fox Sports Productions Inc. Method and apparatus for enhancing the broadcast of a live event
US5930740A (en) * 1997-04-04 1999-07-27 Evans & Sutherland Computer Corporation Camera/lens calibration apparatus and method
US6122014A (en) * 1998-09-17 2000-09-19 Motorola, Inc. Modified chroma keyed technique for simple shape coding for digital video


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 006, no. 176 (E - 130) 10 September 1982 (1982-09-10) *
SOMMERHÄUSER F.: "Das virtuelle Studio - Grundlagen einer neuen Studioproduktionstechnik", FKT FERNSEH- UND KINO-TECHNIK, vol. 50, no. 1, January 1996 (1996-01-01), HEIDELBERG (DE), pages 11 - 22, XP000555564 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998044723A1 (en) * 1997-04-02 1998-10-08 Orad Hi-Tec Systems Limited Virtual studio
WO1998050889A1 (en) * 1997-05-06 1998-11-12 Dimensions As Method for image processing
US6546153B1 (en) 1997-05-06 2003-04-08 Dimensions As Method for image processing
JP4775678B2 (en) * 1999-04-14 2011-09-21 リベイ グラス インコーポレイテッド Cooling system for glassware making machine
EP3104330A1 (en) 2015-06-09 2016-12-14 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Methods for tracking at least one object and method for replacing at least one object with a virtual object in a motion picture signal recorded by a camera
US10110822B2 (en) 2015-06-09 2018-10-23 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for tracking at least one object and method for replacing at least one object by a virtual object in a moving image signal recorded by a camera

Also Published As

Publication number Publication date
BR9606555A (en) 1998-12-15
PL325423A1 (en) 1998-07-20
CN1196151A (en) 1998-10-14
GB9518432D0 (en) 1995-11-08
AU6935396A (en) 1997-03-27
NO981010D0 (en) 1998-03-06
DE69601880D1 (en) 1999-04-29
CN1104816C (en) 2003-04-02
DE69601880T2 (en) 1999-07-29
IL123372A (en) 2002-08-14
ZA967588B (en) 1997-03-10
GB2305050A (en) 1997-03-26
ATE178180T1 (en) 1999-04-15
IL123372A0 (en) 1998-09-24
EP0848886B1 (en) 1999-03-24
NO981010L (en) 1998-03-09
US20010048483A1 (en) 2001-12-06
TR199800397T2 (en) 1999-09-21
US6304298B1 (en) 2001-10-16
JPH11503588A (en) 1999-03-26
HUP9900176A2 (en) 1999-04-28
EP0848886A1 (en) 1998-06-24

Similar Documents

Publication Publication Date Title
EP0848886B1 (en) Method and apparatus for determining the position of a tv camera for use in a virtual studio
EP0758515B1 (en) Improved chromakeying system
US5710875A (en) Method and apparatus for processing 3-D multiple view images formed of a group of images obtained by viewing a 3-D object from a plurality of positions
US5436672A (en) Video processing system for modifying a zone in successive images
US6671399B1 (en) Fast epipolar line adjustment of stereo pairs
US6094501A (en) Determining article location and orientation using three-dimensional X and Y template edge matrices
US6914599B1 (en) Image processing apparatus
US20110102461A1 (en) Mosaic oblique images and methods of making and using same
US6181345B1 (en) Method and apparatus for replacing target zones in a video sequence
Zheng et al. A novel projective-consistent plane based image stitching method
CN108447022B (en) Moving target joining method based on single fixing camera image sequence
CN111127318A (en) Panoramic image splicing method in airport environment
Böhm Multi-image fusion for occlusion-free façade texturing
EP0780003B1 (en) Method and apparatus for determining the location of a reflective object within a video field
JP3800905B2 (en) Image feature tracking processing method, image feature tracking processing device, and three-dimensional data creation method
US5995662A (en) Edge detecting method and edge detecting device which detects edges for each individual primary color and employs individual color weighting coefficients
Subramanyam Automatic image mosaic system using steerable Harris corner detector
EP0834151B1 (en) Object recognition method
Heinrichs et al. Robust spatio-temporal feature tracking
JPH06243258A (en) Depth detector
Zheng et al. Computing 3D models of rotating objects from moving shading
Tsai et al. Facade texture generation and mapping using digital videos
Zhang et al. An epipolar geometry constraint based view synthesis algorithm
Deng et al. Correction and rectification of light fields
Fernandes et al. Computing box dimensions from single perspective images in real time

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 96196803.6

Country of ref document: CN

AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 08765898

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 1996930236

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 1998/00397

Country of ref document: TR

ENP Entry into the national phase

Ref document number: 1997 511005

Country of ref document: JP

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 1996930236

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

WWG Wipo information: grant in national office

Ref document number: 1996930236

Country of ref document: EP