WO2001050747A2 - System and method of adaptive timing estimation for horizontal overscan data - Google Patents


Info

Publication number
WO2001050747A2
WO2001050747A2 PCT/US2000/034720
Authority
WO
WIPO (PCT)
Prior art keywords
data
video signal
encoded
computer
readable medium
Prior art date
Application number
PCT/US2000/034720
Other languages
French (fr)
Other versions
WO2001050747A3 (en)
Inventor
Craig Ranta
Original Assignee
Microsoft Corporation
Priority date
Filing date
Publication date
Application filed by Microsoft Corporation
Priority to AU24453/01A
Publication of WO2001050747A2
Publication of WO2001050747A3


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/08: Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/084: Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the horizontal blanking interval only
    • H04N7/085: Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the horizontal blanking interval only, the inserted signal being digital
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/025: Systems for the transmission of digital non-picture data, e.g. of text during the active part of a television frame
    • H04N7/035: Circuits for the digital non-picture data signal, e.g. for slicing of the data signal, for regeneration of the data-clock signal, for error detection or correction of the data signal

Definitions

  • This invention relates generally to the field of computer systems and, more particularly, to a system and method for detecting digital data encoded in a horizontal overscan portion of a video signal.
  • Ancillary digital data has been transmitted on analog television signals via various methods for several years. This digital data is used today for closed-caption displays, interactive television, and commercial distribution of real-time data such as stock quotes and weather reports. Various schemes are used to encode digital data onto the signal, each of which has advantages and disadvantages. Horizontal overscan data insertion, invented by Microsoft, is a new method of broadcasting ancillary digital data on NTSC and PAL television signals and has many desirable characteristics that make it superior to other methods such as VBI (vertical blanking insertion) and field luminance modulation (ref. U.S. Patent #4,807,031).
  • the data broadcast receiver can be coupled to a wireless data transmitter, which removes the need for a cable between the interactive device and the ancillary data receiver. This allows a wider variety of devices and, in particular, allows interactive television educational toys for children to be developed without the hazard of the child becoming entangled in a cord to the ancillary data receiver.
  • control data should be temporally synchronized with the video signal so that the actions of the controlled devices operate in synchronism with the programming information displayed on the television or monitor.
  • control data should be easily concatenated with a standard video signal for transmission in a variety of broadcast media using standard equipment.
  • control data should not interfere with the video signal or visibly disrupt the display of the video signal.
  • sufficient bandwidth should be provided in the upstream communication link (e.g., a broadcast-level communication link) to fully satisfy the bandwidth requirements of the downstream communication link (e.g., local wireless communication link).
  • beyond the bandwidth of the upstream communication link required for control data, it would be advantageous for additional bandwidth to be available in the upstream communication link for transmitting additional information for other data sinks to provide advertising, subscription, or emergency warning services, such as e-mail, foreign language subtitling, telephone pages, weather warnings, configuration data for a set-top box, and so forth. It would also be advantageous for the bandwidth of the upstream communication link to be adjustable to meet the cost and performance needs of a wide variety of consumers.
  • the protocol for the upstream communication link should be addressable so that several wireless controlled devices, as well as other data sinks, may be controlled simultaneously.
  • the protocol should also be error tolerant and accommodate forward compatibility for future wireless controlled devices and other services that may be provided through the broadcast media. All of these attributes should be implemented at a cost that is feasible to deploy in connection with a system that is primarily intended to be a children's entertainment product.
  • horizontal overscan data receivers are presently used in consumer products and toys to receive signals from the controllers. Controllers send signals, such as video signals, to these receivers so that the consumer products and toys can interact with consumers.
  • horizontal overscan receivers rely on the presence of a horizontal synchronization pulse in the horizontal pre-visible overscan region of the video signal.
  • a video data pulse containing encoded horizontal overscan data appears in a fixed time window or horizontal overscan window following the horizontal synchronization pulse.
  • the horizontal overscan receiver expects to see this data in a predetermined time window on a predetermined number of lines of the video image field.
  • Horizontal picture shift occurs when the active video data shifts from its expected horizontal data position. If the active video data shifts to the left or right by more than approximately 400 ns, then active video data is found in the fixed time window or horizontal overscan window where the receiver expects to find horizontal overscan data. Such a shift in the active video signal corrupts the video data, thus affecting the quality and content of the received data signal.
  • a variety of different hardware and processing equipment can be introduced into the video stream as it travels from the originating source, through satellite systems, and to the consumer via cable.
  • Each type or brand of video processing equipment introduces a different amount of distortion into the fixed time window or horizontal overscan window. This distortion varies the amount of horizontal picture shift experienced by the horizontal overscan data receiver.
  • two different amplifiers connected to the same cable broadcast system will introduce different amounts of distortion into the video signal. Thus, each amplifier will create a varying amount of horizontal picture shift upon the video signal.
  • Conventional methods for recovering horizontal overscan data encoded in a video signal use a fixed timing window in the area where horizontal overscan data is expected to reside.
  • a data pulse is expected between 9.2 and 10.6 microseconds after the horizontal reference synchronization point (HREF). If horizontal phase shift causes active video to shift left of the expected data range, then video beginning at 10.2 microseconds (the beginning of the viewable picture area) will shift into the data window and cause decoding errors. Alternatively, if the horizontal phase shift causes video to shift right, then horizontal overscan data will shift out of the expected data window and cause decoding errors.
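The fixed-window failure modes described above can be sketched as follows. This is an illustrative Python sketch, not from the patent: the single 9.7 microsecond sampling instant is an assumed detection point inside the expected data pulse (9.2-10.2 microseconds), and the phase shift is applied to the whole picture.

```python
SAMPLE_US = 9.7                          # assumed fixed sampling instant
DATA_START_US, DATA_END_US = 9.2, 10.2   # nominal overscan data pulse
ACTIVE_VIDEO_START_US = 10.2             # nominal start of visible raster

def classify_sample(shift_us: float) -> str:
    """What the fixed sample sees when the picture shifts by shift_us.

    Negative shift_us moves the picture left (earlier in time);
    positive shift_us moves it right (later in time).
    """
    data_start = DATA_START_US + shift_us
    data_end = DATA_END_US + shift_us
    video_start = ACTIVE_VIDEO_START_US + shift_us
    if video_start <= SAMPLE_US:
        return "active video"   # left shift: picture leaks into the sample
    if data_start <= SAMPLE_US < data_end:
        return "overscan data"  # sample still lands on the data pulse
    return "no data"            # right shift: data moved past the sample
```

With no shift, the sample recovers overscan data; shifting more than a few hundred nanoseconds in either direction yields decoding errors, matching the ~400 ns tolerance cited earlier.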
  • the present invention meets the needs described above in a system and method for data recovery from a video signal encoded with horizontal overscan data. Furthermore, the present invention provides a system and method for counteracting horizontal picture or phase shift in a video signal. The present invention also provides a system and method that corrects for the presence of horizontal phase shift and is relatively inexpensive and non-complex.
  • the invention is an adaptive timing module with an adaptive timing processor.
  • the adaptive timing module is configured for extracting and decoding digital data encoded in a horizontal overscan portion of a video signal.
  • the adaptive timing module conducts a sweeping operation through a timing search range within a plurality of scan lines over multiple fields of the video signal to detect a horizontal position within the scan lines associated with the digital data. Based on the sweeping operation, the adaptive timing module determines a desired horizontal detection position within the scan lines. The adaptive timing module then detects digital data encoded at the desired horizontal detection position of subsequent fields of the video signal.
  • the adaptive timing module conducts a sweeping operation through a timing search range within a plurality of scan lines over multiple fields of the video signal by dividing the timing search range into a plurality of equal sub-portions. Each sub-portion of the timing search range is scanned for the presence of a special data sequence within the scan lines associated with the digital data.
  • the adaptive timing module stores the data detected within each sub-portion, and determines a center point or average of the positions of the sub-portions where a valid sequence is detected. The module then determines a desired horizontal detection position within the scan lines by locking onto the center point or average of the sub-portions where a valid sequence is detected.
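The sweep-and-lock procedure described in the two bullets above can be sketched as follows. This Python sketch is illustrative: the number of sub-portions and the probe() callback (which would wrap the hardware's ISDW detector) are assumptions; the patent specifies only the strategy of sweeping equal sub-portions and locking onto the center or average of the positions where a valid sequence is detected.

```python
SEARCH_START_US, SEARCH_END_US = 8.8, 11.0  # timing search range
NUM_SUBPORTIONS = 11                        # assumed number of equal steps

def sweep_and_lock(probe):
    """probe(offset_us) -> True if a valid ISDW is detected at that offset.

    Sweeps the center of each equal sub-portion of the search range,
    records the offsets that yield a valid sequence, and returns their
    average as the locked detection position (None if nothing was found).
    """
    step = (SEARCH_END_US - SEARCH_START_US) / NUM_SUBPORTIONS
    hits = []
    for i in range(NUM_SUBPORTIONS):
        offset = SEARCH_START_US + (i + 0.5) * step  # center of sub-portion i
        if probe(offset):
            hits.append(offset)
    if not hits:
        return None                  # no valid ISDW found; sweep again
    return sum(hits) / len(hits)     # lock onto the average hit position
```

For example, if valid ISDWs are detected only between 9.0 and 10.0 microseconds, the module locks onto the midpoint of that run.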
  • the adaptive timing module conducts a sweeping operation through a timing search range between 8.8 and 11.0 microseconds from a horizontal synchronization pulse or a timing signal that indicates the beginning of a scan line.
  • the horizontal position can include a specific data sequence, such as an intelligent signal detect word (ISDW), that indicates the beginning of a field of digital data.
  • the adaptive timing module determines a desired horizontal detection position within the scan lines by comparing the observed data sequence to a stored data sequence, such as a stored intelligent signal detect word (ISDW).
  • the adaptive timing module repeatedly detects digital data encoded at the desired horizontal detection position of subsequent fields of the video signal until a reset condition is enabled.
  • a reset condition includes the elapse of a predetermined length of time, or manually triggering a reset button.
  • the invention may also be embodied in a display device for recovering data from a video signal divided into frames, wherein each frame comprises a plurality of horizontal scan lines consecutively illuminated on the display device, wherein each scan line comprises a prescan portion comprising a pre-data encoding zone, and wherein the display device scans the prescan portion for the presence of encoded data in the pre-data encoding zone over a plurality of subsequent frames.
  • the display device determines a set of sampling positions within a prescan portion, and sweeps over the set of sampling positions for the presence of encoded data.
  • the display device detects encoded data within the prescan portion.
  • the display device determines a center point or average location of the sampling positions.
  • the display device locks onto the center point of the sampling positions, and uses the center point or average location of the sampling positions for recovering subsequent data from the video signal. That the invention overcomes the drawbacks of the prior art and achieves the advantages described above will become apparent from the following detailed description of the exemplary embodiments and the appended drawings and claims.
  • FIG. 1 is a block diagram of a duplex wireless control environment including a controller and a controlled device.
  • FIG. 2 is a functional block diagram that illustrates the components of a system incorporating an adaptive timing module for recovering data from a television signal encoded with horizontal overscan data in accordance with the present invention.
  • FIG. 3a is a waveform diagram illustrating a data bit value "one" encoded in the horizontal overscan portion of a scan line of an encoded video signal.
  • FIG. 3b is a waveform diagram illustrating a data bit value "zero" encoded in the horizontal overscan portion of a scan line of an encoded video signal.
  • FIG. 4a is a diagram illustrating the location of data bits in a portion of a frame of an encoded video signal.
  • FIG. 4b is a diagram illustrating the location of data bits in two interlaced fields of a frame of an encoded video signal.
  • FIG. 5a is a diagram illustrating a timing window divided into equally sized sub-portions.
  • FIG. 5b is a diagram illustrating a set of fields divided into equally sized sub-portions.
  • FIG. 5c is a diagram illustrating a flag table for determining a selected sampling point within a set of scanned fields.
  • FIG. 6 is a logic flow diagram illustrating a method for recovering data from a television signal encoded with horizontal overscan data.
  • FIG. 7 is a logic flow diagram illustrating a method for sweeping a timing window for an intelligent signal detection word (ISDW).
  • FIG. 8 is a logic flow diagram illustrating a method for locking onto a selected sample point.
  • FIG. 9 is a logic flow diagram illustrating an example of a method for recovering data from a television signal encoded with horizontal overscan data in accordance with the present invention.
  • the invention may be implemented as an adaptive timing software module that counteracts horizontal picture shift and permits the recovery of horizontal overscan data from an encoded television signal.
  • the adaptive timing module exposes a standard interface that client programs may access to communicate with the adaptive timing module.
  • the object-oriented architecture permits a number of different client programs, such as application programs, and the like, to use the adaptive timing module.
  • the adaptive timing module can be used with an "actimates" application program.
  • hardware devices such as a display device or a data decoder may communicate with the adaptive timing module through the standard interface.
  • the interface exposed by the adaptive timing module allows the module to receive encoded data from an audio/video signal source.
  • the adaptive timing module receives encoded data from the audio/video signal source, and recovers data encoded within the audio/video signal.
  • the adaptive timing module can also be used in either a simplex or duplex environment, including a "REALMATION" system as described in U.S. Application No. 08/885,385 entitled “Method and System for Encoding Data in the Horizontal Overscan Portion of a Video Signal” filed on June 30, 1997, which is assigned to a common assignee and incorporated herein by reference.
  • FIG. 1 illustrates an exemplary simplex environment for embodiments of the present invention.
  • This simplex environment may be operated as a learning and entertainment system for a child.
  • the simplex environment includes a controller 11 that controls a controlled device 60.
  • the controller 11 includes an audio/video signal source 56, a wireless modulator 90, an antenna 98, and a display device 57 including a speaker 59.
  • the controller 11 transmits control data to the controlled device 60 via an antenna 98 and a RF communication channel 15.
  • the wireless modulator 90 interfaces with the audio/video signal source 56 and the display device 57 through a standard video interface. Over this standard video interface, the wireless modulator 90 receives a video signal encoded with control data (encoded video) from the audio/video signal source 56.
  • the wireless modulator 90 extracts the control data from the encoded video signal, and then transfers the control data to a controlled device 60 through the RF communication channel 15.
  • the wireless modulator 90 passes the video signal to the display device 57.
  • the audio/video signal source 56 also interfaces with the speaker 59 in the display device 57. Over this interface, the audio/video signal source 56 provides audio for an audio/video presentation.
  • a child 75 can observe the audio/video presentation on the display device 57 and the speaker 59 while the wireless modulator 90 transmits control data to one or more controlled devices 60. The reception of the control data causes the controlled device 60 to move and talk as though it is a character in the audio/video presentation.
  • An adaptive timing module 100 is deployed with the controller 11 as part of the wireless modulator 90.
  • the adaptive timing module 100 permits the controller 11 to improve the recovery of control data from the encoded video signal and to counteract horizontal phase shift by scanning the video signal for a selected sampling point. Using the selected sampling point, the controller 11 extracts the control data from the encoded video signal and generates the RF-modulated control signals for transmission to the controlled device 60. There is no need to modify the encoded video signal before passing it to the display device 57.
  • the controller 11 receives the encoded video signal, which is a standard video signal that has been modified to include digital information in the horizontal overscan intervals of the scan lines, which are invisible to the display device 57.
  • the display device 57 can receive and display the encoded video signal without modification.
  • typically, conventional methods and techniques are used to combine control data with the video signal by encoding the control data onto the video signal (i.e., generating an encoded video data stream).
  • One such encoding technique includes modulating the luminance of the horizontal overscan area of the video signal on a line-by-line basis. For example, the overscan area of each scan line may be modulated to represent a single control data bit.
  • the field boundaries of the video signal provide a framing structure for the control data, in which each frame contains a fixed number of data words.
  • FIG. 2 is a block diagram illustrating the various components that define the wireless modulator 90.
  • Each of the components of the wireless modulator 90 may be implemented in hardware, software, or a combination of hardware and software.
  • the adaptive timing module 100 is associated with the video data detector 91 of the wireless modulator 90.
  • the video data detector 91 receives an encoded video signal 102 originating from an audio/video signal source 56, and utilizes the adaptive timing module 100 to recover control data from the encoded video signal and to counteract horizontal phase shift.
  • the adaptive timing module 100 determines a selected sampling point in the encoded video signal 102.
  • the adaptive timing module 100 extracts the control data from the encoded video signal 102, provides the control data to the data error processor 99, and simultaneously provides the encoded video signal 102 to the display device 57.
  • the data error processor 99 analyzes the control data to detect and attempt to correct any errors that may exist in the control data.
  • the protocol handler 93 receives the recovered and verified control data and assembles message packets for transmission to one or more controlled devices, represented by the controlled device 60.
  • upon assembling a message packet, the protocol handler 93 provides the message packet to a data encoder 94.
  • the data encoder 94 encodes the data and provides the encoded data to the RF transmitter 96.
  • the RF transmitter 96 receives the encoded data and modulates a predefined RF carrier (i.e., a predefined RF channel approved for use in connection with the wireless communication system) with the encoded data.
  • the RF transmitter then transmits the modulated carrier through the antenna 98.
  • the various components of the computer system 20 or the wireless modulator 90 may temporarily store the control data in a data buffer, such as the representative data buffer 92.
  • the display device 57 receives the video signal from the video data detector 91 or data decoder or another source along with an audio signal from the audio/video signal source 56.
  • the display device 57 and the speaker 59 then display the audio/visual presentation defined by the video signal, typically including a series of scenes depicted on the display device 57 and the speaker 59, in a conventional manner.
  • the audio/video presentation on the display device 57 and the control data that is transmitted from antenna 98 are synchronized so that the controlled device 60 behaves as a character in the scene depicted on the display device 57.
  • the processes of detecting the control data, correcting any errors, encoding the control data, and then modulating a carrier may introduce a slight delay.
  • embedding the control data within the video data in the encoded video signal effectively synchronizes the operation of the controlled device with the scene depicted on the display device 57.
  • the video signal received by the display device 57 and the control data transmitted from antenna 98 are synchronized because they are obtained from the same area of the original encoded video signal, in which context sensitive control data is embedded within a video signal.
  • the encoded video signal may be separated in realtime into control data and related video data so that the controlled devices move and/or talk in a manner that relates to the audio/video presentation.
  • the audio/video signal source 56 may be any of a variety of conventional video sources, such as a video camera, a broadcast or cable television signal, a video tape player, the Internet transmitting a video signal, a computer generating a video signal, and so forth.
  • the video signal may be any type of video signal that includes a plurality of frames that each include a plurality of scan lines.
  • the video signal may be a standard 525-line, two-field interlaced NTSC television signal that includes 30 frames per second, each frame including two fields of 262.5 interlaced lines, as is well known to those skilled in the art.
  • a video data encoder 94 merges encoded data with the lines of the video signal to create an encoded video signal 102, as described in detail with respect to FIGs. 3a-b and 4a-b.
  • a protocol is defined for the encoded data that is addressable, forwardly compatible, error tolerant, and feasible to deploy in connection with a system that is primarily intended to be a children's entertainment product. This protocol is described in detail with respect to U.S. Application Serial No. 08/795,710 entitled “PROTOCOL FOR A WIRELESS CONTROL SYSTEM” filed on February 4, 1997, which is assigned to a common assignee and incorporated herein by reference.
  • the video data encoder 94 transmits the encoded video signal 102 to a video data detector 91 or adaptive timing module 100, which may be a remote device that receives the encoded video signal 102 by way of a broadcast-level transmission.
  • a video data detector 91 or adaptive timing module 100 may be a local device, for example in an intercom application.
  • the encoded data does not interfere with the transmission of the underlying video signal.
  • the encoded video signal 102 may be transmitted using any type of video transmission media, such as a broadcast-level cable television signal, a video tape player, the Internet transmitting a video signal, a computer generating a video signal, and so forth.
  • the encoded video signal 102 may be passed directly from the video data detector 91 or adaptive timing module 100 to the display device 57, which displays the underlying video signal undisturbed by the encoded data.
  • the video data detector 91 detects the presence of the encoded data in the encoded video signal 102 by detecting the presence of an intelligent signal detection word (ISDW), as described with reference to FIGs. 3a-b and 4a-b.
  • a single ISDW is transmitted in the same location of each field of the encoded video signal 102, such as lines 23-29 in field-1 and lines 286-292 in field-2, of a standard interlaced 525-line NTSC television signal.
  • a consecutive series of the ISDWs defines a dynamic validation sequence in which each ISDW varies in at least two bits from the immediately preceding signal detection word.
  • the dynamic validation sequence may be the binary representation of 8, 1, 10, 3, 12, 5, 14, 7.
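The stated property of this sequence can be verified directly. The sketch below is illustrative Python; the wrap-around check assumes the eight-word sequence repeats cyclically, which the patent text does not state explicitly.

```python
# Dynamic validation sequence from the description above (4-bit words).
ISDW_SEQUENCE = [8, 1, 10, 3, 12, 5, 14, 7]

def bit_distance(a: int, b: int) -> int:
    """Hamming distance between two words."""
    return bin(a ^ b).count("1")

# Each word differs from its predecessor in at least two bits
# (including the assumed wrap-around from 7 back to 8).
distances = [bit_distance(ISDW_SEQUENCE[i], ISDW_SEQUENCE[(i + 1) % 8])
             for i in range(8)]
assert all(d >= 2 for d in distances)
```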
  • the adaptive timing module 100 corrects horizontal overscan or phase shift errors in the encoded video signal 102.
  • the adaptive timing module 100 includes an adaptive timing processor 104 to execute a routine to determine a set of sampling positions and sub-portions within a prescan portion of the encoded video signal 102.
  • the adaptive timing processor 104 sweeps over the set of sampling positions and sub-portions for the presence of encoded data.
  • the adaptive timing processor 104 detects encoded data such as an ISDW within the prescan portion
  • the adaptive timing processor 104 uses the sub-portions containing encoded data to determine a selected sampling point such as a center point or average location of the sub-portions containing encoded data.
  • the adaptive timing processor 104 locks onto the selected sampling point and uses the selected sampling point for recovering subsequent data from the encoded video signal 102.
  • the adaptive timing processor 104 reads the data, if any, in the specified lines, corrects the data for correctable errors that may have occurred in the ISDW bits, and detects the presence of the ISDW. In each frame, the ISDW is typically followed by a number of content words. If the adaptive timing processor 104 detects the presence of the ISDW in the encoded video signal 102, the adaptive timing processor 104 extracts the content words from the encoded video signal and assembles the content words into a serial data communication signal 106. The adaptive timing processor 104 then transmits the serial data communication signal 106 to a data error processor 99.
  • the data error processor 99 strips out the error correction bits, corrects any correctable errors in the content bits, and assembles the corrected content words into a 9-bit error corrected data stream.
  • This 9-bit error corrected data stream is transmitted to a protocol handler 93, which includes a number of data handlers that detect and route device-specific control data to their associated data sinks.
  • the addressing protocol for the content data is described with reference to U.S. Application Serial No. 08/795,710 entitled "PROTOCOL FOR A WIRELESS CONTROL SYSTEM" filed on February 4, 1997, which is assigned to a common assignee and incorporated herein by reference.
  • FIGs. 3a and 3b show the location of the encoded data in the context of a single scan line 120, 120' of an encoded video signal 102.
  • FIG. 3a is a waveform diagram illustrating a data bit value "one" 128 encoded in the horizontal overscan portion of a scan line 120 of the encoded video signal 102.
  • the scan line represents one line of one frame displayed on the display device 57.
  • the vertical axis represents the magnitude of the signal waveform 120 in units of IRE and the horizontal axis represents time in microseconds, as is familiar to those skilled in the art.
  • Although FIGs. 3a-b are not drawn precisely to scale, important reference points are marked in the units of their corresponding axes.
  • the horizontal synchronization pulse 122 is followed by a sinusoidal color burst 124 (the approximate envelope is shown), which is used as a calibration signal for the display device 57.
  • the color burst 124 is followed by a waveform representing the visible raster 126 (the approximate envelope is shown), which creates and typically overlaps slightly the visible image on the display device 57.
  • the waveform 120 includes a pre-visible horizontal overscan area 127 or prescan portion of the horizontal overscan data stream, approximately from 9.2 microseconds to 10.2 microseconds after H-REF, that occurs after the color burst 124 and before the visible raster 126.
  • a video data encoder 94 locates a pre-visible (i.e., before the visible raster 126) data bit "one" 128 by driving the waveform 120 to a predetermined high value, such as 80 IRE, in the interval from 9.2 microseconds to 10.2 microseconds after H-REF. Because the pulse denoting the data bit "one" 128 occurs after the calibration interval of the color burst 124 and before the visible raster 126, it does not interfere with the operation of the display device 57 or appear on the image displayed.
  • FIG. 3b is a waveform diagram illustrating a data bit value "zero" 128' encoded in the horizontal overscan portion of a scan line of the encoded video signal 102.
  • the video data encoder 94 locates the pre-visible data bit "zero" 128' by driving the waveform 120 to a predetermined low value, such as 7.5 IRE, in the interval from 9.2 microseconds to 10.2 microseconds after H-REF.
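The two-level encoding in FIGs. 3a-b amounts to one-bit luminance keying of the pre-visible interval. The sketch below is illustrative; the 50 IRE slicing threshold is an assumption (the patent specifies only the 80 IRE and 7.5 IRE drive levels).

```python
ONE_LEVEL_IRE = 80.0    # level driven for a data bit "one" (FIG. 3a)
ZERO_LEVEL_IRE = 7.5    # level driven for a data bit "zero" (FIG. 3b)
THRESHOLD_IRE = 50.0    # assumed slicing level, not from the patent

def encode_bit(bit: int) -> float:
    """IRE level to drive during the 9.2-10.2 us pre-visible interval."""
    return ONE_LEVEL_IRE if bit else ZERO_LEVEL_IRE

def decode_level(ire: float) -> int:
    """Recover the bit from the IRE level sampled in that interval."""
    return 1 if ire >= THRESHOLD_IRE else 0
```

The wide separation between the two levels is what makes the bit robust against the modest amplitude distortion introduced by downstream video equipment.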
  • each 16-bit content word includes nine data bits, and each frame includes 13 content words.
  • encoding one bit per scan line produces a bandwidth for the data encoded in a typical 59.94 Hertz NTSC video signal of 7,013 Baud. This bandwidth is sufficient to provide a data sink with sufficient data to control several wireless controlled devices 60 in the manner described above.
  • the 7,013 Baud one-bit-per-scan-line bandwidth of the encoded data is also sufficient to control several other data sinks to provide additional services, such as advertising, subscription, and emergency warning information for transmission to the display device 57 and other data sinks.
  • these services might include e-mail, foreign language subtitling, intercom capability, telephone pages, weather warnings, configuration data for a set-top box, and so forth.
  • the 7,013 Baud one-bit-per-scan-line bandwidth is preferred because it provides sufficient bandwidth for the "REALMATION" system and minimizes the cost of the system components, in particular the video data encoder 94 and the video data detector 91.
  • the bandwidth may be increased, however, by locating a second pulse in the post-visual horizontal overscan area 130, which occurs after the visible raster 126 and before the horizontal blanking interval 132 (during which the electron gun in the CRT of the display device 57 sweeps back from the end of the just-completed scan line to the beginning of the next scan line). The bandwidth may be further increased by enabling each pulse 128, 130 to represent more than just two (1, 0) states. For example, for 3 states (cf. the 1.0, 1.5, 2.0 DDM pulse widths), an analog of the "REALMATION" DDM protocol could be used.
  • the pulse could represent 3 bits; for 16 states, the pulse could represent 4 bits, and so forth.
  • With two 16-state pulses, each scan line would be able to transmit eight bits. This would increase the bandwidth from 7,013 Baud to 56,104 Baud, which might be worth the increased cost of the video data encoder 94 and the video data detector 91 for future applications.
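The scaling from pulse states to bits per scan line can be sketched as follows; the two-pulse, 16-state configuration is the one the text uses to reach 56,104 Baud:

```python
import math

ONE_BIT_BAUD = 7013  # one-bit-per-scan-line rate quoted in the text

def bits_per_pulse(states: int) -> int:
    # A pulse that can take 2**k distinct states carries k bits.
    return int(math.log2(states))

# 8 states -> 3 bits, 16 states -> 4 bits, as in the text.
# Two pulses per line (pre- and post-visible) at 16 states each:
bits_per_line = 2 * bits_per_pulse(16)   # 8 bits per scan line
print(bits_per_line * ONE_BIT_BAUD)      # 56104
```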
  • FIGs. 4a and 4b show the location of encoded data in the context of a standard NTSC video frame.
  • FIG. 4a is a diagram illustrating the location of data bits in a portion of a standard 525-line two-field interlaced NTSC video signal.
  • Each frame of the video data includes a vertical blanking interval 140 (during which the electron gun in the CRT of the display device 57 sweeps back and up from the end of the just completed frame to the beginning of the next frame) followed by an active video interval 142, which includes a number of left-to-right scan lines that sequentially paint the display device 57 from the top to the bottom of the screen.
  • The last two lines of the vertical blanking interval are typically reserved for closed caption data 146 and vertical blanking data 148, which may already be dedicated to other purposes.
  • The bottom of each field is typically corrupted by head-switching noise present in the output of helical-scan video tape players of consumer formats such as VHS and 8mm. Therefore, the horizontal overscan portion of individual scan lines provides the preferred location for the encoded data bits 128, 128' of the encoded video signal 102.
  • FIG. 4b is a diagram illustrating the location of data bits in the two interlaced fields of the standard NTSC video frame. That is, FIG. 4b shows the location of the encoded data in the context of a complete NTSC 525-line two-field interlaced video frame.
  • The frame of video data includes lines 1-262 in field-1 152 interlaced with lines 263-525 in field-2 154.
  • Field-1 152 includes a vertical blanking interval 140a and an active video interval 142a.
  • The vertical blanking interval 140a includes lines 1-22 and concludes with line 21, which may include closed caption data 146a, and line 22, which may include vertical blanking data 148a.
  • An ISDW 156a is encoded in lines 23-29 and content data 158a is encoded in lines 30-237.
  • Field-2 154 includes a vertical blanking interval 140b and an active video interval 142b.
  • The vertical blanking interval 140b includes lines 263-284 and concludes with line 283, which may include closed caption data 146b, and line 284, which may include vertical blanking data 148b.
  • An ISDW 156b is encoded in lines 286-292 and content data 158b is encoded in lines 293-500.
  • Each ISDW preferably includes a plurality of data bits and a plurality of error correction bits defining a correction sequence that allows a single-bit error in the data bits to be detected and corrected.
  • The ISDW may include a seven-bit Hamming code (i.e., four data bits and three error correction bits) in the format shown below in Table 1.
  • In each field 152, 154 of a video frame, up to 13 16-bit content words 158 may follow the ISDW 156, as shown below in Table 2.
  • Each content word preferably includes a plurality of data bits 164 and a plurality of error correction bits 166 defining a correction sequence that allows a single-bit error in the data bits to be detected and corrected.
  • The content word may include a seven-bit Hamming code (i.e., four data bits and three error correction bits) and a nine-bit Hamming code (i.e., five data bits and four error correction bits) in the format shown below in Table 3.
  • Hamming codes are preferred because of their simplicity and small computation requirement.
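A minimal sketch of the seven-bit Hamming code's single-bit error correction is shown below. It uses the textbook Hamming(7,4) bit layout; the patent's own bit ordering in Tables 1-3 may differ:

```python
# Hamming(7,4) sketch: 4 data bits + 3 parity bits, corrects any
# single-bit error. Bit positions follow the textbook convention,
# not necessarily the patent's table format.

def hamming74_encode(d):                # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

def hamming74_decode(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3     # 1-based position of the bad bit
    if syndrome:
        c = list(c)
        c[syndrome - 1] ^= 1            # correct the single-bit error
    return [c[2], c[4], c[5], c[6]]

word = hamming74_encode([1, 0, 1, 1])
word[5] ^= 1                            # flip one bit in transit
print(hamming74_decode(word))           # [1, 0, 1, 1] -- error corrected
```

The simplicity noted in the text is visible here: encoding and decoding are a handful of XORs, which suits a low-cost consumer decoder.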
  • FIGs. 5a and 5b illustrate the determination of a selected sampling point in a prescan portion of a horizontal overscan data stream.
  • FIG. 5a shows a diagram illustrating the division of a prescan portion of a single field in a standard 525-line two-field interlaced NTSC video signal.
  • The adaptive timing processor 104 determines a predefined timing window 202 over the pre-visible horizontal overscan area 127 of the horizontal overscan data stream 204 of a single field 206.
  • The adaptive timing processor 104 uses the same predefined timing window 202 over a range of a predefined number of fields 206.
  • For example, as shown in FIG. 5a, the adaptive timing processor 104 can define a timing window 202 in the prescan portion 127 of the encoded video signal 102 extending from 8.8 microseconds to 11.0 microseconds after the H-REF over a range of six or more fields 206a-m of the video signal 204.
  • The adaptive timing processor 104 uses a predefined increment "n" to divide the timing window 202 into "n" relatively equally sized sub-portions 208a-n using "n+1" sampling points 210.
  • The adaptive timing processor 104 sweeps each sub-portion 208a-n for the presence of an ISDW 212 within the timing window 202.
  • For example, the adaptive timing processor 104 sets a series of six sampling points 210, which divide a timing window 202 into five relatively equally sized sub-portions 208a-n within a single field 206.
  • The adaptive timing processor 104 then sweeps each of the five sub-portions 208a-n between adjacent sampling points 210 of the field 206 for the presence of an ISDW 212.
  • The presence of an ISDW 212 in the field 206 of the video signal 102 is distinguished by a pattern identification word consisting of four bits.
  • The value of the pattern identification word in each contiguous field cyclically sequences through a defined set of values.
  • The presence of the pattern identification word distinguishes an encoded video signal from a normal video signal; in a normal video signal, random noise appears in place of the pattern identification word.
  • An adaptive timing processor 104 attempting to recover control data from an encoded video signal 102 therefore determines whether the signal is an encoded video signal by detecting the presence of the pattern identification word.
  • The pattern identification word provides an additional layer of integrity to the recovered control data beyond that of simple checksum error detection.
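The cyclic pattern identification check might be sketched as below. The actual four-bit value set is not given in the text, so the CYCLE values here are hypothetical:

```python
# Sketch of distinguishing an encoded signal from noise via the 4-bit
# pattern identification word. The cyclic value set CYCLE is a
# hypothetical example, not the patent's defined sequence.
CYCLE = [0b0101, 0b1010, 0b0011, 0b1100]

def looks_encoded(words):
    """True if consecutive 4-bit field words follow the expected cycle."""
    try:
        start = CYCLE.index(words[0])
    except ValueError:
        return False                     # first word is not in the cycle
    return all(w == CYCLE[(start + i) % len(CYCLE)]
               for i, w in enumerate(words))

print(looks_encoded([0b1010, 0b0011, 0b1100, 0b0101]))  # True
print(looks_encoded([0b0110, 0b1001, 0b1111, 0b0000]))  # False (noise)
```

Because random noise is unlikely to reproduce the full cycle across contiguous fields, this check adds integrity beyond a per-word checksum, as the text notes.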
  • FIG. 5b shows a diagram illustrating the division of several fields of a horizontal overscan data stream into sub-portions.
  • The adaptive timing processor 104 scans "m" fields 206a-m for the presence of an ISDW 212.
  • When the adaptive timing processor 104 detects the presence of an ISDW 212 within a particular sub-portion 208a-n of the timing window 202, the adaptive timing processor 104 sets a flag 214a-m for that sub-portion location.
  • The timing phase is then adjusted so that a different sub-portion location is scanned by the adaptive timing processor 104.
  • In this manner, the adaptive timing processor 104 determines the correct timing phase for scanning subsequent fields 206a-m and their respective sub-portions 208a-n for the presence of an ISDW 212.
  • For example, the adaptive timing processor 104 can scan a particular sub-portion in each of eight fields 206a-m.
  • The adaptive timing processor 104 selects the third sub-portion 208c, between sampling points "Tmin + 2Tn" and "Tmin + 3Tn" as illustrated in FIG. 5a, of each field 206a-m to scan. If a valid ISDW 212 is detected in the third sub-portion 208c of any of the scanned fields 206a-m, a flag 214a is set for the particular sub-portion 208c and field 206a-m, indicating the presence of an ISDW 212 in that sub-portion 208c for that field 206a-m.
  • The adaptive timing processor 104 repeats the scan for another particular sub-portion 208 in all of the fields 206a-m until all of the sub-portions 208 for all of the fields 206a-m have been scanned for an ISDW 212.
  • Each timing phase is measured for six fields 206a-m to allow time to scan for an ISDW 212.
  • The number of sub-portions 208 and fields 206a-m scanned by the adaptive timing processor 104 can be varied; increasing the number of sub-portions or fields, or both, increases the scan time.
  • FIG. 5c shows a flag table for determining a selected sampling point within a set of scanned fields.
  • When an ISDW 212a-m is detected, the adaptive timing processor 104 sets a flag 214a-m indicating the particular sub-portion 208a-n in which the ISDW 212a-m was detected.
  • The adaptive timing processor 104 uses the table of checked flags 214a-m, or the stored sub-portion locations of the detected ISDW's 212a-m, to determine a selected sampling point 216. For example, the adaptive timing processor 104 determines the center point or average location of the sub-portion positions where an ISDW 212 has been detected over a range of eight fields 206.
  • The adaptive timing processor 104 uses the center point or average location of the sub-portion positions to set a selected sampling point 216.
  • The selected sampling point 216 designates a "lock-on" position for the adaptive timing processor 104 to use for locating encoded data in subsequent scans.
  • FIG. 5d shows a diagram illustrating a subsequent video signal 218 with a selected sampling point 216 for the adaptive timing processor 104 to "lock on" to.
  • the adaptive timing processor 104 determines the selected sampling point 216, and uses the selected sampling point 216 to find the ISDW 212 in subsequent data fields 220a-m.
  • the selected sampling point 216 represents an optimum location within subsequent data fields 220a-m to find the ISDW 212.
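Deriving the "lock-on" point from the flag table might look like the following sketch. The per-field hit indices are hypothetical, while the window geometry (8.8 to 11.0 microseconds after H-REF, five sub-portions) follows FIG. 5a:

```python
# Sketch of deriving the "lock-on" sampling point as the centre of the
# sub-portions in which a valid ISDW was flagged. The hit indices are
# hypothetical example data, not values from the patent.
T_MIN, T_MAX, N_SUB = 8.8, 11.0, 5
width = (T_MAX - T_MIN) / N_SUB        # 0.44 us per sub-portion

# Sub-portion index where an ISDW was flagged, per scanned field:
hits = [2, 2, 3, 2, 3, 2, 3, 2]        # eight fields, as in FIG. 5b

avg_index = sum(hits) / len(hits)      # 2.375
# Centre of the average sub-portion, in microseconds after H-REF:
lock_on = T_MIN + (avg_index + 0.5) * width
print(lock_on)                         # about 10.07 us after H-REF
```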
  • FIG. 6 is a logic flow diagram illustrating a method for recovering data from a television signal encoded with horizontal overscan data. The steps illustrated by FIG. 6 are performed by an adaptive timing module 100 operating with an adaptive timing processor 104. Step 302 starts routine 300 performed by the adaptive timing processor 104. Step 302 is followed by routine 304, in which the adaptive timing processor 104 sweeps the timing window 202 for an ISDW 212.
  • Routine 304 is further described in FIG. 7.
  • Routine 304 is followed by decision block 306, in which the adaptive timing processor 104 determines whether an ISDW 212 has been located within the timing window 202 of the video signal 102.
  • The adaptive timing processor 104 examines the data received from each sub-portion 208 of each field 206 of the video signal 102 for a pattern identification word consisting of four bits. The presence of the pattern identification word distinguishes an encoded video signal from a normal video signal. If an ISDW 212 is not detected, then the "NO" branch is followed to step 308, in which the adaptive timing processor 104 resets a flag 214 indicating a valid ISDW. Step 308 returns to routine 304 to continue sweeping the timing window 202 for an ISDW 212.
  • If an ISDW 212 is detected, then the "YES" branch is followed to routine 310, in which the adaptive timing processor 104 locks onto a selected sampling point 216.
  • the selected sampling point 216 is used by the adaptive timing processor 104 to optimize locating ISDW's 212 in a subsequent encoded video signal. Routine 310 is further described in FIG. 8.
  • Routine 310 returns to step 312, in which the adaptive timing processor 104 decodes the data in the ISDW 212.
  • The ISDW 212 contains a plurality of data bits and a plurality of error correction bits defining a correction sequence that allows a single-bit error in the data bits to be detected and corrected.
  • A consecutive series of ISDW's 212 defines a dynamic validation sequence indicating the presence of video data following each ISDW 212.
  • Step 312 is followed by decision block 314, in which the adaptive timing processor 104 determines whether the ISDW 212 is no longer detected.
  • For example, an ISDW 212 in a television broadcast signal may be briefly interrupted by an event that does not contain encoded data, such as a single commercial break, after which the television broadcast signal will continue to be broadcast.
  • The adaptive timing processor 104 waits for a predetermined amount of time, such as an acquisition delay, to determine whether the ISDW 212 is discontinued. In such cases, the adaptive timing processor 104 retains the last "lock-on" position to use for locating encoded data in subsequent scans of the signal. If an ISDW 212 continues to be detected in decision block 314, then the "NO" branch is followed to return to step 312, in which the adaptive timing processor 104 continues to decode data in the ISDW 212.
  • If the ISDW 212 is no longer detected, the "YES" branch is followed to decision block 316, in which the adaptive timing processor 104 determines whether a reset condition is enabled. For example, in other cases, an ISDW in a television broadcast signal will no longer be detected when the signal is interrupted by an event that does not contain encoded data, such as a commercial break. After a series of commercial breaks, the correct data recovery timing may be lost. In such a case, the adaptive timing processor waits for a predetermined amount of time, such as an acquisition delay, before determining that the ISDW is no longer detected.
  • Decision block 316 checks for the presence of a reset condition.
  • A reset condition is caused by a triggering event, such as the elapse of a predetermined amount of time or the manual activation of a reset switch.
  • If a reset condition is enabled, routine 304 begins the sweep or scan routine again to reacquire an ISDW.
  • Otherwise, in step 312, the last "lock-on" position determined by the adaptive timing processor 104 is used for locating encoded data in subsequent scans of the signal 204.
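The FIG. 6 control flow can be summarized as a small loop. Here sweep(), detect(), decode(), and reset_requested() are stand-ins for the hardware-specific steps, not the patent's API:

```python
# High-level sketch of the FIG. 6 recovery loop: sweep for an ISDW,
# lock on, decode while locked, and fall back to the last lock-on
# position (or a full re-sweep on a reset condition) on signal loss.

def recover(fields, sweep, detect, decode, reset_requested):
    """Acquire an ISDW lock, decode while locked, handle signal loss."""
    lock_on = None
    for field in fields:
        if lock_on is None:
            # Routines 304/310: sweep the timing window and lock on.
            lock_on = sweep(field)
            continue
        if detect(field, lock_on):
            decode(field, lock_on)      # step 312: decode the ISDW data
        elif reset_requested():
            lock_on = None              # blocks 314/316: full reacquire
        # Otherwise keep the last lock-on through brief interruptions.
    return lock_on
```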
  • FIG. 7 is a logic flow diagram illustrating a method for sweeping a timing window for an intelligent signal detection word (ISDW). The steps illustrated by FIG. 7 are performed by an adaptive timing module 100 operating with an adaptive timing processor 104. Routine 400 begins following step 304 in FIG. 6. In step 402, the adaptive timing processor 104 receives an encoded video signal 102 from an audio/video signal source 56.
  • Step 402 is followed by step 404, in which the adaptive timing processor 104 locates a horizontal reference point (H-REF) within the encoded video signal 102.
  • The H-REF typically precedes a prescan portion 127 of the encoded video signal 102.
  • Step 404 is followed by step 406, in which the adaptive timing processor 104 locates a timing window 202 between a predetermined range of approximately 8.8 to 11.0 microseconds after the H-REF.
  • The predetermined range can be set to other values as long as the range covers the expected position of the horizontal overscan data area 127.
  • The expected position of the horizontal overscan data area 127 is between 9.2 and 10.2 microseconds after the H-REF.
  • Step 406 is followed by step 408, in which the adaptive timing processor 104 divides each video field 206 into "n" equally-sized sub-portions 208 by selecting sampling points 210 along the width of each video field 206a-m. For example, as shown in FIGs. 5a-c, each video field 206a-m is divided by a set of sampling points 210 into five sub-portions 208.
  • Step 408 is followed by step 410, in which the adaptive timing processor 104 sets a timing phase defining a predetermined number of video fields 206a-m to be scanned by the routine 400. For example, as shown in FIGs. 5a-c, the number of video fields 206a-m scanned is eight fields.
  • Step 410 is followed by step 412, in which the data within each video field 206 is sent to the adaptive timing processor 104 to determine the presence of an ISDW 212 within each sub-portion 208.
  • the adaptive timing processor 104 receives the data within each sub-portion 208, and processes the data to determine the presence of the pattern identification word distinguishing an encoded video signal from a normal video signal.
  • Step 412 is followed by step 414, in which the routine 400 returns to decision block 306 in FIG. 6.
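Steps 406 and 408 (placing the timing window and its sampling points) might be sketched as follows, using the 8.8 to 11.0 microsecond window and five sub-portions from the figures:

```python
# Sketch of steps 406-408: place "n+1" sampling points across the
# timing window, yielding "n" equally sized sub-portions.
T_MIN, T_MAX, N = 8.8, 11.0, 5   # microseconds after H-REF, per FIG. 5a

points = [T_MIN + i * (T_MAX - T_MIN) / N for i in range(N + 1)]
print([round(p, 2) for p in points])
# [8.8, 9.24, 9.68, 10.12, 10.56, 11.0]
```

Note that the expected data position (9.2 to 10.2 microseconds) falls well inside this window, so a horizontal phase shift of a few hundred nanoseconds still lands the data pulse in one of the swept sub-portions.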
  • FIG. 8 is a logic flow diagram illustrating a method for locking onto a selected sampling point. The steps illustrated by FIG. 8 are performed by an adaptive timing module 100 operating with an adaptive timing processor 104. Routine 500 begins following the "YES" branch of decision block 306 in FIG. 6. In step 502, the adaptive timing processor 104 increments a flag 214 indicating the presence of a valid ISDW within a sub-portion 208 of a field 206a-m.
  • Step 502 is followed by step 504, in which the adaptive timing processor 104 scans all "m" of the video fields 206a-m for an ISDW 212.
  • Each of the video fields 206a-m is divided into sub-portions 208, and the adaptive timing processor 104 sweeps each sub-portion 208 of each field 206a-m for a valid ISDW 212 signal.
  • Step 504 is followed by step 506, in which the adaptive timing processor 104 stores the location of the valid ISDW in a storage device such as RAM or a data buffer 92.
  • A table containing video signal fields and the locations of detected ISDW's can be generated by the adaptive timing processor 104.
  • Step 506 is followed by step 508, in which the adaptive timing processor 104 uses the stored positions of the valid ISDW's within the fields 206a-m, and calculates a selected sampling point 216 for decoding subsequent data within the encoded video signal 102.
  • The adaptive timing processor 104 uses the stored ISDW locations in the storage device 92 to calculate a selected sampling point 216, such as a center point of the sub-portion locations where a valid ISDW 212 was found within each field 206a-m.
  • Using the center point of the detected valid ISDW's permits the adaptive timing processor 104 to estimate the magnitude of the horizontal phase or shift error.
  • Step 508 is followed by step 510, in which the adaptive timing processor 104 uses the selected sampling point 216 to "lock on" to a position in subsequent fields 220a-b for scanning data 218 in the encoded video signal 102. For example, as shown in FIGs. 5c-d, using the center point of the detected valid ISDW positions creates an estimated location, or selected sampling point 216, for optimizing detection of subsequent ISDW's 222 within the same encoded video signal 102.
  • Step 510 is followed by step 512, in which the routine returns to step 312 in FIG. 6, where data is decoded by the adaptive timing processor 104.
  • FIG. 9 is a logic flow diagram illustrating an exemplary method for recovering data from a television signal encoded with horizontal overscan data in accordance with the present invention. The steps illustrated by FIG. 9 are performed by an adaptive timing module 100 operating with an adaptive timing processor 104. Routine 600 begins with the start block 602.
  • Step 602 is followed by step 604, in which the adaptive timing processor 104 sets a series of sampling windows or sub-portions 208 within a timing window 202 of an encoded video signal 102. That is, the adaptive timing processor 104 divides a timing window 202 where a pre-visible horizontal overscan area 127 is expected to be into a number of sub-portions 208.
  • A timing window 202 can be defined between Tmin and Tmax, wherein Tmin is approximately 8.8 microseconds after H-REF and Tmax is approximately 11.0 microseconds after H-REF, when the pre-visible horizontal overscan area 127 is expected to be located between 9.2 and 10.2 microseconds after H-REF.
  • As shown in FIG. 5a, the timing window 202 is divided into a series of five sampling windows or sub-portions 208.
  • Step 604 is followed by step 606, in which the adaptive timing processor 104 waits for eight video fields 206a-m to capture or detect an ISDW 212.
  • The number of video fields 206a-m is a preselected number based upon the available processor time and capacity. A lesser or greater number of video fields 206a-m can be selected and scanned to capture or detect an ISDW 212. As shown in FIG. 5b, eight fields 206a-m are scanned by the adaptive timing processor 104 for the presence of an ISDW 212.
  • Step 606 is followed by decision block 608, in which the adaptive timing processor 104 determines whether a valid ISDW 212 is detected. If a valid ISDW 212 is detected, then the "YES" branch is followed to step 610. In step 610, the adaptive timing processor 104 sets a flag 214 indicating a valid ISDW 212 in the sampling window or sub-portion 208. As shown in FIGs. 5b-c, a flag 214a-m can be set indicating a valid ISDW 212 in a particular sampling window or sub-portion 208 for each field 206a-m. Step 610 is followed by decision block 612, in which the adaptive timing processor 104 determines whether all of the sampling windows or sub-portions 208 have been checked or scanned by the adaptive timing processor 104 for a valid ISDW 212.
  • If a valid ISDW 212 is not detected, then the "NO" branch is followed to step 614, in which the adaptive timing processor 104 sets a flag 214 indicating that a valid ISDW 212 is not present in the timing window 202.
  • Step 614 is followed by step 616, in which the adaptive timing processor 104 increments the sampling window or sub-portion 208 by Tn. As shown in FIG. 5a, a field 206 is divided into increments, each with the width Tn.
  • Step 616 is followed by decision block 612, in which the adaptive timing processor 104 determines whether all of the sampling windows or sub-portions 208 have been checked or scanned by the adaptive timing processor 104 for a valid ISDW 212. If not all of the sampling windows or sub-portions 208 have been checked, then the "NO" branch is followed to step 618, returning to step 606, in which the adaptive timing processor 104 scans eight video fields 206a-m to capture or detect an ISDW 212.
  • If all of the sampling windows or sub-portions 208 have been checked, then decision block 620 determines whether at least one sampling point or sub-portion 208 contains a valid ISDW 212. If none of the sampling points or sub-portions 208 contains a valid ISDW 212, then the "NO" branch is followed to step 624, returning to step 606, in which the adaptive timing processor 104 scans six video fields 206a-m to capture or detect an ISDW.
  • If at least one sampling point or sub-portion 208 contains a valid ISDW 212, then in step 626 the adaptive timing processor 104 determines an optimum timing sample point, or selected sampling point 216.
  • An optimum timing sample point or a selected sampling point 216 can be an average location or a center point between two or more ISDW sampling point or sub-portion positions. Other similar types of optimum timing sample points or selected sampling points can be calculated by the adaptive timing processor 104 for use with the routine 600.
  • Step 626 is followed by step 628, in which the adaptive timing processor 104 sets a flag 214 indicating a valid ISDW 212 at the sampling point or sub-portion 208 location. Furthermore, step 628 enables data decoding of the encoded video signal 102 using the calculated optimum timing sample point or selected sampling point 216. The adaptive timing processor 104 uses the optimum timing sample point or selected sampling point 216 to decode subsequent data 220 within the encoded video signal 102.
  • Step 628 is followed by decision block 630, in which the adaptive timing processor 104 determines whether the ISDW 212 is still valid. If the ISDW 212 is still valid, then the "YES" branch is followed to step 632, returning to step 628 where the adaptive timing processor 104 continues data decoding of the encoded video signal 102 using the calculated optimum timing sample point or selected sampling point 216.
  • If the ISDW 212 is no longer valid, then the "NO" branch is followed to step 634, in which the adaptive timing processor 104 starts an invalid ISDW timer.
  • Step 634 also disables data decoding of the encoded video signal.
  • Step 634 is followed by decision block 636, in which the adaptive timing processor 104 determines whether the invalid ISDW timer has expired. If the invalid ISDW timer has expired, then the "YES" branch is followed to step 638, in which the routine 600 begins again.
  • If the invalid ISDW timer has not expired, then the "NO" branch is followed to decision block 640, which determines whether an ISDW 212 is present in the sampling window or sub-portion 208.
  • If no ISDW 212 is detected by the adaptive timing processor 104, then the "NO" branch is followed to step 644, returning to decision block 636 to determine whether the invalid ISDW timer has expired. However, if an ISDW 212 is detected, then the "YES" branch is followed to step 646, returning to step 628 and continuing the data decoding with the calculated sample point or selected sampling point 216.
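The invalid-ISDW timer of steps 634-646 might be sketched as below; the timeout and polling interval are assumptions, since the text specifies neither:

```python
import time

# Sketch of the invalid-ISDW timer (FIG. 9, steps 634-646): when the
# ISDW disappears, decoding is disabled and the receiver keeps probing
# the locked sampling point until either the ISDW returns or the timer
# expires and the whole acquisition routine restarts. timeout_s and
# poll_s are assumed values, not taken from the patent.

def track_after_loss(probe, timeout_s=5.0, poll_s=0.1):
    """Return True to resume decoding at the old lock-on point,
    False to restart the full sweep (routine 600)."""
    deadline = time.monotonic() + timeout_s    # step 634: start timer
    while time.monotonic() < deadline:         # block 636: expired yet?
        if probe():                            # block 640: ISDW present?
            return True                        # step 646: resume decoding
        time.sleep(poll_s)                     # step 644: keep waiting
    return False                               # step 638: restart routine
```

This is what lets the receiver ride through a commercial break without losing its lock-on position, while still recovering from a genuine loss of the encoded signal.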
  • The invention thus provides an adaptive timing module for recovering data from a video signal encoded with horizontal overscan data. Furthermore, the present invention provides a system and method for counteracting horizontal picture or phase shift in a video signal. The present invention also provides a system and method for correcting horizontal picture or phase shift without using complex or expensive devices. It should be understood that the foregoing relates only to the exemplary embodiments of the present invention, and that numerous changes may be made therein without departing from the spirit and scope of the invention as defined by the following claims.

Abstract

The adaptive timing module is configured for recovering data encoded in a field of a video signal, and further configured for counteracting horizontal phase or picture shift. An adaptive timing processor defines a timing window where the prescan portion of an encoded video signal is expected to be. The timing window is divided into sub-portions with a set of sampling points. The adaptive timing processor conducts a sweeping operation over each sub-portion of the prescan portion of the encoded video signal for the presence of a pre-data encoding zone. When a pre-data encoding zone is detected, the adaptive timing processor stores the position of the sub-portion with the pre-data encoding zone, or sets a flag to indicate the presence of a pre-data encoding zone for that particular sub-portion in a particular field. After scanning one or more fields for the presence of a pre-data encoding zone, the adaptive timing processor uses the stored positions of pre-data encoding zones, or the flags indicating the pre-data encoding zones, to determine a selected sampling point. The adaptive timing processor uses the selected sampling point to lock on to a location for decoding subsequent data in the encoded video signal.

Description

SYSTEM AND METHOD OF ADAPTIVE TIMING ESTIMATION FOR
HORIZONTAL OVERSCAN DATA
REFERENCE TO RELATED APPLICATIONS
This application incorporates by reference U.S. Application Serial No. 08/885,385 entitled "METHOD AND SYSTEM FOR ENCODING DATA IN THE HORIZONTAL OVERSCAN PORTION OF A VIDEO SIGNAL" filed on June 30, 1997, which is assigned to a common assignee. This application further incorporates by reference U.S. Application entitled "METHOD AND SYSTEM FOR DECODING DATA IN THE HORIZONTAL OVERSCAN PORTION OF A VIDEO SIGNAL" which is assigned to a common assignee and filed concurrently herewith.
TECHNICAL FIELD
This invention relates generally to the field of computer systems and, more particularly, to a system and method for detecting digital data encoded in a horizontal overscan portion of a video signal.
BACKGROUND OF THE INVENTION
Ancillary digital data has been transmitted on analog television signals via various methods for several years. This digital data is used today for the purposes of closed-caption displays, interactive television, and commercial distribution of real time data such as stock quotes and weather reports. Various schemes are used to encode digital data onto the signal, each of which has advantages and disadvantages. Horizontal overscan data insertion, invented by Microsoft, is a new method of broadcasting ancillary digital data onto NTSC and PAL television signals and has many desirable characteristics which make it superior to other methods such as VBI (vertical blanking insertion) and field luminance modulation (ref. U.S. Patent #4,807,031).
Interactive toys, games, and learning products for the home are particularly useful applications of data broadcast technology. The data broadcast receiver can be coupled to a wireless data transmitter which removes the need for a cable between the interactive device and the ancillary data receiver. This allows a wider variety of devices and in particular allows television interactive educational toys for children to be developed without the hazards of becoming entangled in a cord to the ancillary data receiver.
In order to effectively broadcast the control data in connection with a video signal, several often competing objectives should be attained. First, as noted above, the control data should be temporally synchronized with the video signal so that the actions of the controlled devices operate in synchronism with the programming information displayed on the television or monitor. Second, the control data should be easily concatenated with a standard video signal for transmission in a variety of broadcast media using standard equipment. Third, the control data should not interfere with the video signal or visibly disrupt the display of the video signal. Fourth, sufficient bandwidth should be provided in the upstream communication link (e.g., a broadcast-level communication link) to fully satisfy the bandwidth requirements of the downstream communication link (e.g., local wireless communication link). In addition, it would be advantageous for additional bandwidth to be available in the upstream communication link for transmitting additional information for other data sinks to provide advertising, subscription, or emergency warning services, such as e-mail, foreign language subtitling, telephone pages, weather warnings, configuration data for a set-top box, and so forth. It would also be advantageous for the bandwidth of the upstream communication link to be adjustable to meet the cost and performance needs of a wide variety of consumers.
As with the downstream wireless communication link, the protocol for the upstream communication link should be addressable so that several wireless controlled devices, as well as other data sinks, may be controlled simultaneously. The protocol should also be error tolerant and accommodate forward compatibility for future wireless controlled devices and other services that may be provided through the broadcast media. All of these attributes should be implemented at a cost that is feasible to deploy in connection with a system that is primarily intended to be a children's entertainment product.
Conventional horizontal overscan data receivers are presently used in consumer products and toys to receive signals from the controllers. Controllers send signals such as video signals to these receivers so that consumer products and toys can be interactive with consumers. To provide a synchronized video signal, horizontal overscan receivers rely on the presence of a horizontal synchronization pulse in the horizontal previsible overscan region of the video signal. A video data pulse containing encoded horizontal overscan data appears in a fixed time window or horizontal overscan window following the horizontal synchronization pulse. The horizontal overscan receiver expects to see this data in a predetermined time window on a predetermined number of lines of the video image field. Because the expected time window for occurrence of the data pulse is fixed and predetermined, shifting of the data pulse earlier or later than the expected position can cause data errors in existing systems. Conventional horizontal overscan data receivers are therefore sensitive to a phenomenon known as horizontal picture shift, or horizontal phase shift. Horizontal picture shift occurs when the active video data shifts from its expected horizontal data position. If the active video data shifts to the left or right by more than approximately 400 ns, then active video data is found in the fixed time window or horizontal overscan window where the receiver expects to find horizontal overscan data. Such a shift in the active video signal corrupts the video data, thus affecting the quality and content of the received data signal.
A variety of different hardware and processing equipment can be introduced into the video stream as it travels from the originating source, through satellite systems, and to the consumer via cable. Each type or brand of video processing equipment introduces a different amount of distortion into the fixed time window or horizontal overscan window. This distortion varies the amount of horizontal picture shift experienced by the horizontal overscan data receiver. For example, two different amplifiers connected to the same cable broadcast system will introduce different amounts of distortion into the video signal. Thus, each amplifier will create a varying amount of horizontal picture shift upon the video signal. Conventional methods for recovering horizontal overscan data encoded in a video signal use a fixed timing window in the area where horizontal overscan data is expected to reside. Typically, a data pulse is expected between 9.2 and 10.6 microseconds after the horizontal reference synchronization point (H-REF). If horizontal phase shift causes active video to shift left of the expected data range, then video beginning at 10.2 microseconds (the beginning of the viewable picture area) will shift into the data window and cause decoding errors. Alternatively, if the horizontal phase shift causes video to shift right, then horizontal overscan data will shift out of the expected data window and cause decoding errors. Using conventional methods for recovering horizontal overscan data requires television broadcasters to maintain timing parameters to within +/- 100 nanoseconds of the original timing for proper decoding of the horizontal overscan data by a consumer decoder.
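The fixed-window vulnerability just described can be modeled in a few lines. This is a hypothetical sketch for illustration only: the 9.2-10.2 microsecond pulse interval is the nominal data-pulse interval discussed elsewhere in this specification, while the function, its name, and the single-sampling-instant model are assumptions, not part of the patent.

```python
# Simplified model of a conventional (non-adaptive) decoder: it samples the
# overscan data pulse at one fixed instant after H-REF. Horizontal phase
# shift moves the pulse; once the fixed sampling instant falls outside the
# shifted pulse, the decoder reads active video or noise instead of data.

NOMINAL_PULSE_US = (9.2, 10.2)   # nominal data-pulse interval after H-REF

def sample_hits_pulse(sample_time_us, shift_us, pulse=NOMINAL_PULSE_US):
    """Return True if a fixed sampling instant still lands inside the
    data pulse after the whole signal has shifted by shift_us."""
    start, end = pulse[0] + shift_us, pulse[1] + shift_us
    return start <= sample_time_us <= end
```

Sampling at the nominal pulse center (9.7 microseconds) tolerates roughly half a microsecond of shift in this toy model; the text indicates a real fixed-window decoder tolerates far less (on the order of 100-400 ns), which is what motivates the adaptive approach.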
Furthermore, devices employed to maintain this timing accuracy are expensive and degrade the video signal slightly. Many broadcasters do not want to invest in expensive pieces of equipment to correct horizontal phase shift.
Thus, there is a need in the art for a system and method that improves the method for data recovery from a video signal encoded with horizontal overscan data.
There is a further need in the art for a system and method that counteracts horizontal picture shift and permits the recovery of horizontal overscan data from an encoded video signal.
Furthermore, there is a need in the art for a system and method that corrects horizontal phase shift and is relatively inexpensive and non-complex.
SUMMARY OF THE INVENTION
The present invention meets the needs described above in a system and method for data recovery from a video signal encoded with horizontal overscan data. Furthermore, the present invention provides a system and method for counteracting horizontal picture or phase shift in a video signal. The present invention also provides a system and method that corrects for the presence of horizontal phase shift and is relatively inexpensive and non-complex.
Generally described, the invention is an adaptive timing module with an adaptive timing processor. The adaptive timing module is configured for extracting and decoding digital data encoded in a horizontal overscan portion of a video signal. The adaptive timing module conducts a sweeping operation through a timing search range within a plurality of scan lines over multiple fields of the video signal to detect a horizontal position within the scan lines associated with the digital data. Based on the sweeping operation, the adaptive timing module determines a desired horizontal detection position within the scan lines. The adaptive timing module then detects digital data encoded at the desired horizontal detection position of subsequent fields of the video signal.
More particularly described, the adaptive timing module conducts a sweeping operation through a timing search range within a plurality of scan lines over multiple fields of the video signal by dividing the timing search range into a plurality of equal sub-portions. Each sub-portion of the timing search range is scanned for the presence of a special data sequence within the scan lines associated with the digital data. The adaptive timing module stores the data detected within each sub-portion, and determines a center point or average of the positions of the sub-portions where a valid sequence is detected. The module then determines a desired horizontal detection position within the scan lines by locking onto the center point or average of the sub-portions where a valid sequence is detected.
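The sweep-and-lock procedure described in this paragraph can be sketched as follows. The detection callback, the function name, and the five-sub-portion split are illustrative assumptions; only the overall procedure (divide the search range into equal sub-portions, scan each for a valid sequence, lock onto the center or average of the hits) comes from the text.

```python
# Sketch of the sweeping operation: divide the timing search range into
# equal sub-portions, test each sub-portion's midpoint for a valid data
# sequence, then lock onto the average position of the successful tests.

def sweep_for_data(detect, range_start_us=8.8, range_end_us=11.0, n_sub=5):
    """detect(offset_us) -> True if a valid sequence (e.g. an ISDW) is
    observed at that offset. Returns the locked sampling offset, or None."""
    width = (range_end_us - range_start_us) / n_sub
    hits = []
    for i in range(n_sub):
        midpoint = range_start_us + (i + 0.5) * width
        if detect(midpoint):
            hits.append(midpoint)
    if not hits:
        return None                      # nothing found; sweep again
    return sum(hits) / len(hits)         # center/average of detected positions
```

With five sub-portions over 8.8-11.0 microseconds, a detector that fires between 9.2 and 10.2 microseconds yields hits at the 9.46 and 9.90 microsecond midpoints, so the module locks onto their average, 9.68 microseconds.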
In another aspect of the invention, the adaptive timing module conducts a sweeping operation through a timing search range between 8.8 and 11.0 microseconds from a horizontal synchronization pulse or a timing signal that indicates the beginning of a scan line. The horizontal position can include a specific data sequence, such as an intelligent signal detect word (ISDW), that indicates the beginning of a field of digital data. The adaptive timing module then determines a desired horizontal detection position within the scan lines by comparing the observed data sequence to a stored data sequence, such as a stored intelligent signal detect word (ISDW). In yet another aspect of the invention, the adaptive timing module repeatedly detects digital data encoded at the desired horizontal detection position of subsequent fields of the video signal until a reset condition is enabled. A reset condition includes the elapse of a predetermined length of time, or manually triggering a reset button.
The invention may also be embodied in a display device for recovering data from a video signal divided into frames, wherein each frame comprises a plurality of horizontal scan lines consecutively illuminated on the display device, wherein each scan line comprises a prescan portion comprising a pre-data encoding zone, and wherein the display device scans the prescan portion for the presence of encoded data in the pre-data encoding zone over a plurality of subsequent frames. The display device determines a set of sampling positions within a prescan portion, and sweeps over the set of sampling positions for the presence of encoded data. The display device detects encoded data within the prescan portion. In another aspect of the display device, the display device determines a center point or average location of the sampling positions. The display device locks onto the center point of the sampling positions, and uses the center point or average location of the sampling positions for recovering subsequent data from the video signal. That the invention improves over the drawbacks of the prior art and accomplishes the advantages described above will become apparent from the following detailed description of the exemplary embodiments and the appended drawings and claims.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a duplex wireless control environment including a controller and a controlled device.
FIG. 2 is a functional block diagram that illustrates the components of a system incorporating an adaptive timing module for recovering data from a television signal encoded with horizontal overscan data in accordance with the present invention.
FIG. 3a is a waveform diagram illustrating a data bit value "one" encoded in the horizontal overscan portion of a scan line of an encoded video signal.

FIG. 3b is a waveform diagram illustrating a data bit value "zero" encoded in the horizontal overscan portion of a scan line of an encoded video signal.
FIG. 4a is a diagram illustrating the location of data bits in a portion of a frame of an encoded video signal.

FIG. 4b is a diagram illustrating the location of data bits in two interlaced fields of a frame of an encoded video signal.
FIG. 5a is a diagram illustrating a timing window divided into equally sized sub-portions.
FIG. 5b is a diagram illustrating a set of fields divided into equally sized sub-portions.
FIG. 5c is a diagram illustrating a flag table for determining a selected sampling point within a set of scanned fields.
FIG. 6 is a logic flow diagram illustrating a method for recovering data from a television signal encoded with horizontal overscan data.

FIG. 7 is a logic flow diagram illustrating a method for sweeping a timing window for an intelligent signal detection word (ISDW).
FIG. 8 is a logic flow diagram illustrating a method for locking onto a selected sample point.
FIG. 9 is a logic flow diagram illustrating an example of a method for recovering data from a television signal encoded with horizontal overscan data in accordance with the present invention.
DETAILED DESCRIPTION OF DISCLOSED EMBODIMENTS
The invention may be implemented as an adaptive timing software module that counteracts horizontal picture shift and permits the recovery of horizontal overscan data from an encoded television signal. As an object-oriented program, the adaptive timing module exposes a standard interface that client programs may access to communicate with the adaptive timing module. The object-oriented architecture permits a number of different client programs, such as application programs, and the like, to use the adaptive timing module. For example, the adaptive timing module can be used with an "actimates" application program. Furthermore, hardware devices such as a display device or a data decoder may communicate with the adaptive timing module through the standard interface.
The interface exposed by the adaptive timing module allows the module to receive encoded data from an audio/video signal source. The adaptive timing module receives encoded data from the audio/video signal source, and recovers data encoded within the audio/video signal.
Although the specification describes an exemplary simplex environment for an embodiment of the adaptive timing module, the adaptive timing module can also be used in either a simplex or duplex environment, including a "REALMATION" system as described in U.S. Application No. 08/885,385 entitled "Method and System for Encoding Data in the Horizontal Overscan Portion of a Video Signal" filed on June 30, 1997, which is assigned to a common assignee and incorporated herein by reference.
FIG. 1 illustrates an exemplary simplex environment for embodiments of the present invention. This simplex environment may be operated as a learning and entertainment system for a child. The simplex environment includes a controller 11 that controls a controlled device 60. The controller 11 includes an audio/video signal source 56, a wireless modulator 90, an antenna 98, and a display device 57 including a speaker 59. The controller 11 transmits control data to the controlled device 60 via an antenna 98 and an RF communication channel 15. To accomplish this task, the wireless modulator 90 interfaces with the audio/video signal source 56 and the display device 57 through a standard video interface. Over this standard video interface, the wireless modulator 90 receives a video signal encoded with control data (encoded video) from the audio/video signal source 56. The wireless modulator 90 extracts the control data from the encoded video signal, and then transfers the control data to a controlled device 60 through the RF communication channel 15.
In addition, the wireless modulator 90 passes the video signal to the display device 57. The audio/video signal source 56 also interfaces with the speaker 59 in the display device 57. Over this interface, the audio/video signal source 56 provides audio for an audio/video presentation. Thus, a child 75 can observe the audio/video presentation on the display device 57 and the speaker 59 while the wireless modulator 90 transmits control data to one or more controlled devices 60. The reception of the control data causes the controlled device 60 to move and talk as though it is a character in the audio/video presentation.
An adaptive timing module 100 is deployed with the controller 11 as part of the wireless modulator 90. The adaptive timing module 100 permits the controller 11 to improve the recovery of control data from the encoded video signal and to counteract horizontal phase shift by scanning the video signal for a selected sampling point. Using the selected sampling point, the controller 11 extracts the control data from the encoded video signal and generates the RF-modulated control signals for transmission to the controlled device 60. There is no need to modify the encoded video signal before passing it to the display device 57. Typically, the controller 11 receives the encoded video signal, which is a standard video signal that has been modified to include digital information in the horizontal overscan intervals of the scan lines, which are invisible to the display device 57. Thus, the display device 57 can receive and display the encoded video signal without modification.
Typically, conventional methods and techniques are used to combine control data with the video signal by encoding the control data onto the video signal (i.e., generating an encoded video data stream). One such encoding technique includes modulating the luminance of the horizontal overscan area of the video signal on a line-by-line basis. For example, the overscan area of each scan line may be modulated to represent a single control data bit. Furthermore, the field boundaries of the video signal provide a framing structure for the control data, in which each frame contains a fixed number of data words.
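As a sketch of that luminance-modulation technique: the IRE levels below (80 IRE for a "one", 7.5 IRE for a "zero") are the values this specification gives for FIGs. 3a-b, but the helper functions and the simple threshold decoder are illustrative assumptions, not the patent's actual encoder.

```python
# One control-data bit per scan line: drive the overscan interval of each
# line to a high or low luminance (IRE) level, then recover the bits by
# thresholding the sampled level.

ONE_IRE = 80.0    # "one" level (per FIG. 3a)
ZERO_IRE = 7.5    # "zero" level (per FIG. 3b)

def encode_bits_to_ire(bits):
    """Map control bits to per-scan-line overscan luminance levels."""
    return [ONE_IRE if b else ZERO_IRE for b in bits]

def decode_ire_to_bits(levels, threshold=40.0):
    """Recover control bits by thresholding sampled overscan luminance."""
    return [1 if level >= threshold else 0 for level in levels]
```

The wide separation between the two levels makes the threshold decision robust against modest amplitude distortion, which is why timing (not amplitude) is the dominant error source addressed by the invention.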
FIG. 2 is a block diagram illustrating the various components that define the wireless modulator 90. Each of the components of the wireless modulator 90 may be implemented in hardware, software, or a combination of hardware and software. The adaptive timing module 100 is associated with the video data detector 91 of the wireless modulator 90. The video data detector 91 receives an encoded video signal 102 originating from an audio/video signal source 56, and utilizes the adaptive timing module 100 to recover control data from the encoded video signal and to counteract horizontal phase shift. The adaptive timing module 100 determines a selected sampling point in the encoded video signal 102. The adaptive timing module 100 extracts the control data from the encoded video signal 102, provides the control data to the data error processor 99, and simultaneously provides the encoded video signal 102 to the display device 57.
The data error processor 99 analyzes the control data to detect and attempt to correct any errors that may exist in the control data. After correcting any errors in the control data, the protocol handler 93 receives the recovered and verified control data and assembles message packets for transmission to one or more controlled devices, represented by the controlled device 60. Upon assembling a message packet, the protocol handler 93 provides the message packet to a data encoder 94. The data encoder 94 encodes the data and provides the encoded data to the RF transmitter 96. The RF transmitter 96 receives the encoded data and modulates a predefined RF carrier (i.e., a predefined RF channel approved for use in connection with the wireless communication system) with the encoded data. The RF transmitter then transmits the modulated carrier through the antenna 98. During processing of the control data, the various components of the computer system 20 or the wireless modulator 90 may temporarily store the control data in a data buffer, such as the representative data buffer 92.
The display device 57 receives the video signal from the video data detector 91 or data decoder or another source along with an audio signal from the audio/video signal source 56. The display device 57 and the speaker 59 then display the audio/visual presentation defined by the video signal, typically including a series of scenes depicted on the display device 57 and the speaker 59, in a conventional manner.
As noted previously, the audio/video presentation on the display device 57 and the control data that is transmitted from antenna 98 are synchronized so that the controlled device 60 behaves as a character in the scene depicted on the display device 57. The processes of detecting the control data, correcting any errors, encoding the control data, and then modulating a carrier may introduce a slight delay. Nevertheless, embedding the control data within the video data in the encoded video signal effectively synchronizes the operation of the controlled device with the scene depicted on the display device 57. In other words, the video signal received by the display device 57 and the control data transmitted from antenna 98 are synchronized because they are obtained from the same area of the original encoded video signal, in which context-sensitive control data is embedded within a video signal. Thus, the encoded video signal may be separated in real time into control data and related video data so that the controlled devices move and/or talk in a manner that relates to the audio/video presentation.
The audio/video signal source 56 may be any of a variety of conventional video sources, such as a video camera, a broadcast or cable television signal, a video tape player, the Internet transmitting a video signal, a computer generating a video signal, and so forth. The video signal may be any type of video signal that includes a plurality of frames that each include a plurality of scan lines. For example, the video signal may be a standard 525-line, two-field interlaced NTSC television signal that includes 30 frames per second, each frame including two fields of 262.5 interlaced lines, as is well known to those skilled in the art.
A video data encoder 94 merges encoded data with the lines of the video signal to create an encoded video signal 102, as described in detail with respect to FIGs. 3a-b and 4a-b. A protocol is defined for the encoded data that is addressable, forwardly compatible, error tolerant, and feasible to deploy in connection with a system that is primarily intended to be a children's entertainment product. This protocol is described in detail with respect to U.S. Application Serial No. 08/795,710 entitled "PROTOCOL FOR A WIRELESS CONTROL SYSTEM" filed on February 4, 1997, which is assigned to a common assignee and incorporated herein by reference.
The video data encoder 94 transmits the encoded video signal 102 to a video data detector 91 or adaptive timing module 100, which may be a remote device that receives the encoded video signal 102 by way of a broadcast-level transmission. Alternatively, a video data detector 91 or adaptive timing module 100 may be a local device, for example in an intercom application. The encoded data does not interfere with the transmission of the underlying video signal. Thus, the encoded video signal 102 may be transmitted using any type of video transmission media, such as a broadcast-level cable television signal, a video tape player, the Internet transmitting a video signal, a computer generating a video signal, and so forth. In addition, because the encoded data is located in the pre-visible or post-visible portions of the video signal, the encoded data does not visibly interfere with the operation of typical televisions or monitors. Therefore, the encoded video signal 102 may be passed directly from the video data detector 91 or adaptive timing module 100 to the display device 57, which displays the underlying video signal undisturbed by the encoded data.
Utilizing the adaptive timing module 100, the video data detector 91 detects the presence of the encoded data in the encoded video signal 102 by detecting the presence of an intelligent signal detection word (ISDW), as described with reference to FIGs. 3a-b and 4a-b. Preferably, a single ISDW is transmitted in the same location of each field of the encoded video signal 102, such as lines 23-29 in field-1 and lines 286-292 in field-2, of a standard interlaced 525-line NTSC television signal. A consecutive series of the ISDWs defines a dynamic validation sequence in which each ISDW varies in at least two bits from the immediately preceding signal detection word. For example, the dynamic validation sequence may be the binary representation of 8, 1, 10, 3, 12, 5, 14, 7.
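The dynamic validation property stated above — each ISDW in the sequence 8, 1, 10, 3, 12, 5, 14, 7 differs from its predecessor in at least two bits — is easy to verify. The helper function below is illustrative, not from the patent:

```python
# Verify that consecutive words of the dynamic validation sequence differ
# in at least two bit positions (checked cyclically, so the wrap from 7
# back to 8 is included).

VALIDATION_SEQUENCE = [8, 1, 10, 3, 12, 5, 14, 7]

def min_adjacent_hamming_distance(seq):
    """Smallest Hamming distance between consecutive words, cyclically."""
    return min(bin(a ^ b).count("1")
               for a, b in zip(seq, seq[1:] + seq[:1]))
```

For this sequence the minimum adjacent distance is 2 (e.g. 8 = 1000 to 1 = 0001 flips two bits), so a stuck or repeated word, or random noise that happens to match one value, is unlikely to masquerade as the sequence across consecutive fields.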
The adaptive timing module 100 corrects horizontal overscan or phase shift errors in the encoded video signal 102. The adaptive timing module 100 includes an adaptive timing processor 104 to execute a routine to determine a set of sampling positions and sub-portions within a prescan portion of the encoded video signal 102. The adaptive timing processor 104 sweeps over the set of sampling positions and sub-portions for the presence of encoded data. When the adaptive timing processor 104 detects encoded data such as an ISDW within the prescan portion, the adaptive timing processor 104 uses the sub-portions containing encoded data to determine a selected sampling point such as a center point or average location of the sub-portions containing encoded data. The adaptive timing processor 104 locks onto the selected sampling point and uses the selected sampling point for recovering subsequent data from the encoded video signal 102.
The adaptive timing processor 104 reads the data, if any, in the specified lines, corrects the data for correctable errors that may have occurred in the ISDW bits, and detects the presence of the ISDW. In each frame, the ISDW is typically followed by a number of content words. If the adaptive timing processor 104 detects the presence of the ISDW in the encoded video signal 102, the adaptive timing processor 104 extracts the content words from the encoded video signal and assembles the content words into a serial data communication signal 106. The adaptive timing processor 104 then transmits the serial data communication signal 106 to a data error processor 99.
The data error processor 99 strips out the error correction bits, corrects any correctable errors in the content bits, and assembles the corrected content words into a 9-bit error corrected data stream. This 9-bit error corrected data stream is transmitted to a protocol handler 93, which includes a number of data handlers that detect and route device-specific control data to their associated data sinks. The addressing protocol for the content data is described with reference to U.S. Application Serial No. 08/795,710 entitled "PROTOCOL FOR A WIRELESS CONTROL SYSTEM" filed on February 4, 1997, which is assigned to a common assignee and incorporated herein by reference.
Although the various components and modules have been described separately, one skilled in the art should recognize that the components and modules could be combined in various ways and that new program components and modules could be created to accomplish similar results.
FIGs. 3a and 3b show the location of the encoded data in the context of a single scan line 120, 120' of an encoded video signal 102. FIG. 3a is a waveform diagram illustrating a data bit value "one" 128 encoded in the horizontal overscan portion of a scan line 120 of the encoded video signal 102. The scan line represents one line of one frame displayed on the display device 57. The vertical axis represents the magnitude of the signal waveform 120 in units of IRE and the horizontal axis represents time in microseconds, as is familiar to those skilled in the art. Although FIGs. 3a-b are not drawn precisely to scale, important reference points are marked in the units of their corresponding axis. The waveform 120 for the scan line begins with a horizontal synchronization pulse 122 down to -40 IRE, which is a timing signal that indicates the beginning of the scan line (i.e., time = 0) when the leading edge of the pulse passes through -20 IRE to establish the horizontal reference point "H-REF." The horizontal synchronization pulse 122 is followed by a sinusoidal color burst 124 (the approximate envelope is shown), which is used as a calibration signal for the display device 57. The color burst 124 is followed by a waveform representing the visible raster 126 (the approximate envelope is shown), which creates and typically overlaps slightly the visible image on the display device 57. The waveform 120 includes a pre-visible horizontal overscan area 127 or prescan portion of the horizontal overscan data stream, approximately from 9.2 microseconds to 10.2 microseconds after H-REF, that occurs after the color burst 124 and before the visible raster 126. A video data encoder 94 locates a pre-visible (i.e., before the visible raster 126) data bit "one" 128 by driving the waveform 120 to a predetermined high value, such as 80 IRE, in the interval from 9.2 microseconds to 10.2 microseconds after H-REF.
Because the pulse denoting the data bit "one" 128 occurs after the calibration interval of the color burst 124 and before the visible raster 126, it does not interfere with the operation of the display device 57 or appear on the image displayed.
FIG. 3b is a waveform diagram illustrating a data bit value "zero" 128' encoded in the horizontal overscan portion of a scan line of the encoded video signal 102. The video data encoder 94 locates the pre-visible data bit "zero" 128' by driving the waveform 120 to a predetermined low value, such as 7.5 IRE, in the interval from 9.2 microseconds to 10.2 microseconds after H-REF.
As noted above, each 16-bit content word includes nine data bits, and each field includes 13 content words. Thus, encoding one bit per scan line produces a bandwidth for the data encoded in a typical 59.94 Hertz NTSC video signal of 7,013 Baud. This bandwidth is sufficient to provide a data sink with sufficient data to control several wireless controlled devices 60 in the manner described above. See also the related patent application, U.S. Application Serial No. 08/795,710 entitled "PROTOCOL FOR A WIRELESS CONTROL SYSTEM" filed on February 4, 1997, which is assigned to a common assignee and incorporated herein by reference.
The 7,013 Baud one-bit-per-scan-line bandwidth of the encoded data is also sufficient to control several other data sinks to provide additional services, such as advertising, subscription, and emergency warning information for transmission to the display device 57 and other data sinks. For example, these services might include e-mail, foreign language subtitling, intercom capability, telephone pages, weather warnings, configuration data for a set-top box, and so forth. At present, the 7,013 Baud one-bit-per-scan-line bandwidth is preferred because it provides sufficient bandwidth for the "REALMATION" system and minimizes the cost of the system components, in particular the video data encoder 94 and the video data detector 91. The bandwidth may be increased, however, by locating a second pulse in the post-visual horizontal overscan area 130, which occurs after the visible raster 126 and before the horizontal blanking interval 132 (during which the electron gun in the CRT of the display device 57 sweeps back from the end of the just completed scan line to the beginning of the next scan line). And the bandwidth may be further increased by enabling each pulse 128, 130 to represent more than just two (1,0) states. For example, for 3 states (cf. the 1.0, 1.5, 2.0 DDM pulse widths), an analog of the "REALMATION" DDM protocol could be used. For 4 states, the pulse could represent 2 bits (e.g., 100-80 IRE = 1,1; 70-50 IRE = 1,0; 40-20 IRE = 0,0; 10 to -40 IRE = 0,1). For 8 states, the pulse could represent 3 bits; for 16 states, the pulse could represent 4 bits, and so forth. For example, if data pulses are used in both the pre-visual horizontal overscan area 127 and the post-visual horizontal overscan area 130, each data pulse having 16 states, each scan line would be able to transmit eight bits.
This would increase the bandwidth from 7,013 Baud to 56,104 Baud, which might be worth the increased cost for the video data encoder 94 and the video data detector 91 for future applications.
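The two bandwidth figures can be checked arithmetically. This reads the 59.94 Hz rate as the NTSC field rate, with 13 content words of nine data bits each following the ISDW in every field (per FIG. 4b); that interpretation is an assumption, but it reproduces both quoted numbers:

```python
# Worked check of the 7,013 Baud and 56,104 Baud figures.
FIELDS_PER_SECOND = 59.94     # NTSC field rate (two fields per ~30 Hz frame)
WORDS_PER_FIELD = 13          # 16-bit content words following the ISDW
DATA_BITS_PER_WORD = 9        # data bits per content word (rest is Hamming)

one_bit_per_line = WORDS_PER_FIELD * DATA_BITS_PER_WORD * FIELDS_PER_SECOND
eight_bits_per_line = one_bit_per_line * 8   # two pulses x 16 states per line
```

13 × 9 × 59.94 = 7,012.98 ≈ 7,013 Baud for the one-bit-per-line scheme, and eight bits per line gives 56,103.84 ≈ 56,104 Baud.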
FIGs. 4a and 4b show the location of encoded data in the context of a standard NTSC video frame. FIG. 4a is a diagram illustrating the location of data bits in a portion of a standard 525-line two-field interlaced NTSC video signal. Each frame of the video data includes a vertical blanking interval 140 (during which the electron gun in the CRT of the display device 57 sweeps back and up from the end of the just completed frame to the beginning of the next frame) followed by an active video interval 142, which includes a number of left-to-right scan lines that sequentially paint the display device 57 from the top to the bottom of the screen. At the end of the vertical blanking interval 140, the last two pulses are typically reserved for closed caption data 146 and vertical blanking data 148, which may be already dedicated to other purposes. In addition, the bottom of each field is typically corrupted by head switching noise present in the output of helical-scan video tape players of consumer formats such as VHS and 8mm. Therefore, the horizontal overscan portion of individual scan lines provides the preferred location for encoded data bits 128, 128' of the encoded video signal 102.
FIG. 4b is a diagram illustrating the location of data bits in the two interlaced fields of the standard NTSC video frame. That is, FIG. 4b shows the location of the encoded data in the context of a complete NTSC 525-line two-field interlaced video frame. The frame of video data includes lines 1-262 in field-1 152 interlaced with lines 263-525 in field-2 154. Field-1 152 includes a vertical blanking interval 140a and an active video interval 142a. The vertical blanking interval 140a includes lines 1-22 and concludes with line 21, which may include closed caption data 146a, and line 22, which may include vertical blanking data 148a. An ISDW 156a is encoded in lines 23-29 and content data 158a is encoded in lines 30-237. Field-2 154 includes a vertical blanking interval 140b and an active video interval 142b. The vertical blanking interval 140b includes lines 263-284 and concludes with line 283, which may include closed caption data 146b, and line 284, which may include vertical blanking data 148b. An ISDW 156b is encoded in lines 286-292 and content data 158b is encoded in lines 293-500.
Each ISDW preferably includes a plurality of data bits and a plurality of error correction bits defining a correction sequence that allows a single-bit error in the data bits to be detected and corrected. For example, the ISDW may include a seven-bit Hamming code (i.e., four data bits and three error correction bits) in the format shown below in Table 1.
[Table image not reproduced: ISDW format — a seven-bit Hamming code comprising four data bits and three error correction bits.]
Table 1
In each field 152, 154 of a video frame, up to 13 16-bit content words 158 may follow the ISDW 156, as shown below in Table 2.
[Table image not reproduced: layout of the ISDW and the up to 13 content words within each field.]
Table 2

Each content word preferably includes a plurality of data bits 164 and a plurality of error correction bits 166 defining a correction sequence that allows a single-bit error in the data bits to be detected and corrected. For example, the content word may include a seven-bit Hamming code (i.e., four data bits and three error correction bits) and a nine-bit Hamming code (i.e., five data bits and four error correction bits) in the format shown below in Table 3.
[Table image not reproduced: content word format — a seven-bit Hamming code and a nine-bit Hamming code together carrying nine data bits and seven error correction bits in each 16-bit word.]
Table 3
Although many other, often more sophisticated, data correction techniques may be used, Hamming codes are preferred because of their simplicity and small computation requirement.
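As an illustration of the single-error correction the text relies on, here is a textbook Hamming(7,4) encoder/decoder. The patent does not give its exact bit ordering, so the parity-bit placement below is a standard textbook convention, not the ISDW's actual layout:

```python
# Hamming(7,4): four data bits protected by three parity bits, allowing
# any single flipped bit in the 7-bit codeword to be located and corrected.

def hamming74_encode(d):
    """d = [d1, d2, d3, d4] -> codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4        # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4        # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4        # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(codeword):
    """Correct up to one bit error, then return the four data bits."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    error_position = s1 + 2 * s2 + 4 * s3   # 1-based; 0 means no error
    if error_position:
        c[error_position - 1] ^= 1          # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]
```

The three syndrome bits directly encode the position of a single error, which is why decoding needs only a handful of XORs — the "small computation requirement" noted above.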
FIGs. 5a and 5b illustrate the determination of a selected sampling point in a prescan portion of a horizontal overscan data stream. FIG. 5a shows a diagram illustrating the division of a prescan portion of a single field in a standard 525-line two-field interlaced NTSC video signal. The adaptive timing processor 104 determines a predefined timing window 202 over the pre-visible horizontal overscan area 127 of the horizontal overscan data stream 204 of a single field 206. The adaptive timing processor 104 uses the same predefined timing window 202 over a range of a predefined number of fields 206. For example, as shown in FIG. 5b, the adaptive timing processor 104 can define a timing window 202 in the prescan portion 127 of the encoded video signal 102 comprising 8.8 microseconds to 11.0 microseconds after the H-REF over a range of six or more fields 206a-m of the video signal 204.
Using a predefined increment "n", the adaptive timing processor 104 divides the timing window 202 into "n" relatively equally sized sub-portions 208a-n using "n+1" sampling points 210. The adaptive timing processor 104 sweeps each sub-portion 208a-n for the presence of an ISDW 212 within the timing window 202. For example, the adaptive timing processor 104 sets a series of six sampling points 210 that divide a timing window 202 into five relatively equally sized sub-portions 208a-n within a single field 206. The adaptive timing processor 104 sweeps each of the five sub-portions 208a-n between adjacent sampling points 210 of the field 206 for the presence of an ISDW 212.
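As a numerical illustration of this division (a sketch; the window bounds and increment are the example values from the text):

```python
def sampling_points(t_min_us, t_max_us, n):
    """Divide the timing window [t_min, t_max] (microseconds after H-REF)
    into n equally sized sub-portions bounded by n + 1 sampling points."""
    step = (t_max_us - t_min_us) / n
    return [t_min_us + i * step for i in range(n + 1)]

# The example window of 8.8 to 11.0 microseconds with n = 5 yields six
# sampling points bounding five 0.44-microsecond sub-portions.
points = sampling_points(8.8, 11.0, 5)
```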
The presence of an ISDW 212 in the field 206 of the video signal 102 is distinguished by a pattern identification word consisting of four bits. The value of the pattern identification word in each contiguous field cyclically sequences through a defined set of values. The presence of the pattern identification word distinguishes an encoded video signal from a normal video signal. In a normal video signal, random noise appears in place of the pattern identification word. An adaptive timing processor 104 attempting to recover control data from an encoded video signal 102 therefore determines whether the signal is an encoded video signal by detecting the presence of the pattern identification word. Thus, the pattern identification word provides an additional layer of integrity to the recovered control data beyond that of simple checksum error detection.
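A minimal sketch of this discrimination test follows. The four-value cycle used here is hypothetical — the actual pattern sequence is defined by the encoder and is not given in the text — but the test is the same: words read from contiguous fields must follow the cycle at some phase, which random noise will almost never do.

```python
# Hypothetical 4-bit cycle; the real sequence is defined by the encoder.
PATTERN_CYCLE = [0x5, 0xA, 0x3, 0xC]

def is_encoded_signal(observed_words):
    """Return True if 4-bit words from contiguous fields follow the cyclic
    pattern sequence at some phase, distinguishing encoded video from
    random noise at the same scan position."""
    n = len(PATTERN_CYCLE)
    for phase in range(n):
        if all(w == PATTERN_CYCLE[(phase + i) % n]
               for i, w in enumerate(observed_words)):
            return True
    return False
```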
FIG. 5b shows a diagram illustrating the division of several fields of a horizontal overscan data stream into sub-portions. Using a predefined increment "m", the adaptive timing processor 104 scans "m" number of fields 206a-m for the presence of an ISDW 212. When the adaptive timing processor 104 detects the presence of an ISDW 212 within a particular sub-portion 208a-n of the timing window 202, the adaptive timing processor 104 sets a flag 214a-m for the particular sub-portion location. After the adaptive timing processor 104 has scanned a particular sub-portion location in each of a particular number of fields 206a-m, the timing phase is adjusted so that a different sub-portion location is scanned by the adaptive timing processor 104. After all of the sub-portions 208a-n have been scanned for "m" number of fields 206a-m, the adaptive timing processor 104 determines the correct timing phase for scanning subsequent fields 206a-m and their respective sub-portions 208a-n for the presence of an ISDW 212.
For example, the adaptive timing processor 104 can scan a particular sub-portion in each of eight fields 206a-m. The adaptive timing processor 104 selects the third sub-portion 208c, between sampling points "Tmin + T2n" and "Tmin + T3n" as illustrated in FIG. 5a, of each field 206a-m to scan. If a valid ISDW 212 is detected in the third sub-portion 208c of any of the scanned fields 206a-m, a flag 214a is set for the particular sub-portion 208c and field 206a-m indicating the presence of an ISDW 212 in the particular sub-portion 208c for the particular field 206a-m. After all of the particular sub-portions 208c have been scanned in the particular fields 206a-m, the adaptive timing processor 104 repeats the scan for another particular sub-portion 208 in all of the particular fields 206a-m until all of the sub-portions 208 for all of the fields 206a-m have been scanned for an ISDW 212. Typically, each timing phase will be measured for six fields 206a-m to allow time to scan for an ISDW 212. However, the number of sub-portions 208 and fields 206a-m scanned by the adaptive timing processor 104 can be varied, with an increased number of sub-portions or fields, or both, increasing the scan time.
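The sweep order described above — hold one timing phase for a block of fields, then move to the next sub-portion — can be sketched as follows. The callback name and counts are illustrative; `read_subportion` stands in for whatever hardware sampling returns a valid-ISDW decision at a given timing phase.

```python
def sweep_for_isdw(read_subportion, n_subportions=5, n_fields=8):
    """Sweep each sub-portion location across a block of fields, recording
    which (field, sub-portion) samples contained a valid ISDW.

    read_subportion(field, sub) -- hypothetical sampling callback that
    returns True when a valid ISDW is decoded at that timing phase.
    """
    flags = []
    for sub in range(n_subportions):      # one timing phase at a time
        for field in range(n_fields):     # held for a block of fields
            if read_subportion(field, sub):
                flags.append((field, sub))
    return flags
```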
FIG. 5c shows a flag table for determining a selected sampling point within a set of scanned fields. When an ISDW 212a-m is detected in a particular scanned sub-portion 208a-n of a particular field 206a-m, the adaptive timing processor 104 sets a flag 214a-m indicating the particular sub-portion 208a-n in which the ISDW 212a-m was detected. The adaptive timing processor 104 uses the table of checked flags 214a-m or the stored sub-portion locations of the detected ISDW 212a-m to determine a selected sampling point 216. For example, an adaptive timing processor 104 determines the center point or average location of the sub-portion positions where an ISDW 212 has been detected over a range of eight fields 206. The adaptive timing processor 104 uses the center point or average location of the sub-portion positions to set a selected sampling point 216. The selected sampling point 216 designates a "lock-on" position for the adaptive timing processor 104 to use for locating encoded data in subsequent scans.
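The flag-table averaging can be sketched as follows (function and argument names are invented for illustration; the sub-portion boundaries reuse the example window of FIGs. 5a-c):

```python
def selected_sampling_point(flags, points):
    """Average the centres of the sub-portions flagged as containing an
    ISDW to produce the "lock-on" sampling time (microseconds after H-REF).

    flags  -- sub-portion index per field where an ISDW was detected
              (fields with no detection are simply omitted)
    points -- the n + 1 sampling points bounding the n sub-portions
    """
    if not flags:
        return None
    # the centre of sub-portion i lies midway between its two boundaries
    centres = [(points[i] + points[i + 1]) / 2 for i in flags]
    return sum(centres) / len(centres)
```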
FIG. 5d shows a diagram illustrating a subsequent video signal 218 with a selected sampling point 216 for the adaptive timing processor 104 to "lock on" to. The adaptive timing processor 104 determines the selected sampling point 216, and uses the selected sampling point 216 to find the ISDW 212 in subsequent data fields 220a-m. The selected sampling point 216 represents an optimum location within subsequent data fields 220a-m to find the ISDW 212.

FIG. 6 is a logic flow diagram illustrating a method for recovering data from a television signal encoded with horizontal overscan data. The steps illustrated by FIG. 6 are performed by an adaptive timing module 100 operating with an adaptive timing processor 104. Step 302 starts routine 300 performed by the adaptive timing processor. Step 302 is followed by routine 304, in which the adaptive timing processor
104 sweeps a timing window 202 in a received video signal 204 for the presence of an intelligent signal detection word (ISDW) 212. Other similar types of signals or markers can be located by the adaptive timing processor 104 when programmed into the routine 304 executed by the adaptive timing processor 104. Routine 304 is further described in FIG. 7.
Routine 304 returns to decision block 306, in which the adaptive timing processor 104 determines whether an ISDW 212 has been located within the timing window 202 of the video signal 102. The adaptive timing processor 104 examines the data received from each sub-portion 208 of each field 206 of the video signal 102 for a pattern identification word consisting of four bits. The presence of the pattern identification word distinguishes an encoded video signal from a normal video signal. If an ISDW 212 is not detected, then the "NO" branch is followed to step 308, in which the adaptive timing processor 104 resets a flag 214 indicating a valid ISDW. Step 308 returns to routine 304 to continue sweeping the timing window 202 for an ISDW 212.
If an ISDW 212 is detected, then the "YES" branch is followed to routine 310, in which the adaptive timing processor 104 locks onto a selected sampling point 216. The selected sampling point 216 is used by the adaptive timing processor 104 to optimize locating ISDW's 212 in a subsequent encoded video signal. Routine 310 is further described in FIG. 8.
Routine 310 returns to step 312, in which the adaptive timing processor 104 decodes the data in the ISDW 212. As described previously in FIG. 4b, the ISDW 212 contains a plurality of data bits and a plurality of error correction bits defining a correction sequence that allows a single-bit error in the data bits to be detected and corrected. Furthermore, a consecutive series of ISDW's 212 defines a dynamic validation sequence indicating the presence of video data following each ISDW 212.

Step 312 is followed by decision block 314, in which the adaptive timing processor 104 determines whether the ISDW 212 is no longer detected. For example, in some cases, an ISDW 212 in a television broadcast signal may be briefly interrupted by an event that does not contain encoded data, such as a single commercial break, after which the television broadcast signal will continue to be broadcast. The adaptive timing processor 104 waits for a predetermined amount of time, such as an acquisition delay, to determine whether the ISDW 212 is discontinued. In such cases, the adaptive timing processor 104 retains the last "lock-on" position to use for locating encoded data in subsequent scans of the signal. If an ISDW 212 continues to be detected in decision block 314, then the "NO" branch is followed back to step 312, in which the adaptive timing processor 104 continues to decode data in the ISDW 212.
If an ISDW 212 is no longer detected in decision block 314, then the "YES" branch is followed to decision block 316, in which the adaptive timing processor 104 determines whether a reset condition is enabled. For example, in other cases, an ISDW in a television broadcast signal will no longer be detected when the signal is interrupted by an event that does not contain encoded data, such as a commercial break. After a series of commercial breaks, the correct data recovery timing may be lost. In such a case, the adaptive timing processor waits for a predetermined amount of time, such as an acquisition delay, before determining that the ISDW is no longer detected.
Decision block 316 checks for the presence of a reset condition. A reset condition is caused by a triggering event such as the elapse of a predetermined amount of time, or manually activating a reset switch. When a reset condition is detected by the adaptive timing processor 104, then the "YES" branch is followed to routine 304, in which the sweep or scan routine begins again to reacquire an ISDW. If a reset condition is not detected by the adaptive timing processor 104, the "NO" branch is followed to step 312, in which the last "lock on" position determined by the adaptive timing processor 104 is used for locating encoded data in subsequent scans of the signal 204.
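The hold-and-reacquire behaviour of decision blocks 314 and 316 can be sketched as a small state machine. This is a hedged illustration: the class and state names are invented, and the acquisition delay is counted here in fields rather than in a particular unit of time.

```python
class AdaptiveTimingTracker:
    """Minimal sketch of the hold/reacquire logic of FIG. 6.

    While the ISDW is detected, the tracker stays locked and decodes at
    the lock-on point.  When the ISDW disappears (e.g. a commercial
    break), the lock-on position is retained until the acquisition delay
    elapses; only then does a reset condition trigger a fresh sweep.
    """

    def __init__(self, acquisition_delay_fields=180):
        self.delay = acquisition_delay_fields  # fields to wait before reset
        self.missing = 0
        self.state = "SWEEPING"

    def on_field(self, isdw_valid):
        if isdw_valid:
            self.missing = 0
            self.state = "LOCKED"        # decode at the lock-on point
        elif self.state == "LOCKED":
            self.missing += 1
            if self.missing >= self.delay:
                self.state = "SWEEPING"  # reset: reacquire the ISDW
        return self.state
```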
FIG. 7 is a logic flow diagram illustrating a method for sweeping a timing window for an intelligent signal detection word (ISDW). The steps illustrated by FIG. 7 are performed by an adaptive timing module 100 operating with an adaptive timing processor 104. Routine 400 begins following step 304 in FIG. 6. In step 402, the adaptive timing processor 104 receives an encoded video signal 102 from an audio/video signal source 56.
Step 402 is followed by step 404, in which the adaptive timing processor 104 locates a horizontal reference point (H-REF) within the encoded video signal 102. As shown in FIGs. 3a-b, the H-REF typically precedes a prescan portion 127 of the encoded video signal 102.
Step 404 is followed by step 406, in which the adaptive timing processor 104 locates a timing window 202 spanning a predetermined range of approximately 8.8 to 11.0 microseconds after the H-REF. The predetermined range can be set to other values as long as the range covers the expected position of the horizontal overscan data area 127. For example, as shown in FIGs. 5a-c, the expected position of the horizontal overscan data area 127 is between 9.2 and 10.2 microseconds.
Step 406 is followed by step 408, in which the adaptive timing processor 104 divides each video field 206 into "n" number of equally-sized sub-portions 208 by selecting sampling points 210 along the width of each video field 206a-m. For example, as shown in FIGs. 5a-c, each video field 206a-m is divided by a set of sampling points 210 into five sub-portions 208.
Step 408 is followed by step 410, in which the adaptive timing processor 104 sets a timing phase defining a predetermined number of video fields 206a-m to be scanned by the routine 400. For example, as shown in FIGs. 5a-c, the number of video fields 206a-m scanned is eight fields.

Step 410 is followed by step 412, in which the data within each video field 206 is sent to the adaptive timing processor 104 to determine the presence of an ISDW 212 within each sub-portion 208. The adaptive timing processor 104 receives the data within each sub-portion 208, and processes the data to determine the presence of the pattern identification word distinguishing an encoded video signal from a normal video signal.

Step 412 is followed by step 414, in which the routine 400 returns to decision block 306 in FIG. 6, in which the adaptive timing processor 104 determines whether a valid ISDW 212 has been located within the scanned sub-portion 208.

FIG. 8 is a logic flow diagram illustrating a method for locking onto a selected sampling point. The steps illustrated by FIG. 8 are performed by an adaptive timing module 100 operating with an adaptive timing processor 104. Routine 500 begins following the "YES" branch of decision block 306 in FIG. 6. In step 502, the adaptive timing processor 104 increments a flag 214 indicating the presence of a valid ISDW within a sub-portion 208 of a field 206a-m.
Step 502 is followed by step 504, in which the adaptive timing processor 104 scans all of the "n" number of the video fields 206a-m for an ISDW 212. Each of the video fields 206a-m is divided into sub-portions 208, in which the adaptive timing processor 104 sweeps each sub-portion 208 of each field 206a-m for a valid ISDW 212 signal.
Step 504 is followed by step 506, in which the adaptive timing processor 104 stores the location of the valid ISDW in a storage device such as RAM or a data buffer 92. For example, as described in FIG. 5c, a table containing video signal fields and the locations of detected ISDW's can be generated by the adaptive timing processor 104.
Step 506 is followed by step 508, in which the adaptive timing processor 104 uses the stored positions of the valid ISDW's within the fields 206a-m, and calculates a selected sampling point 216 for decoding subsequent data within the encoded video signal 102. For example, as shown in FIG. 5c, the adaptive timing processor 104 uses stored ISDW locations in the storage device 92 to calculate a selected sampling point 216, such as a center point of the sub-portion locations where a valid ISDW 212 was found within each field 206a-m. Furthermore, using the center point of the detected valid ISDW's permits the adaptive timing processor 104 to estimate the magnitude of the horizontal phase or shift error.

Step 508 is followed by step 510, in which the adaptive timing processor 104 uses the selected sampling point 216 to "lock on" to a position in subsequent fields 220a-b for scanning data 218 in the encoded video signal 102. For example, as shown in FIGs. 5c-d, using the center point of the detected valid ISDW positions creates an estimated location or selected sampling point 216 for optimizing detection of subsequent ISDW's 222 within the same encoded video signal 102.
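The phase-error estimate mentioned above is simply the offset of the computed lock-on point from the nominal data position. In this sketch the 9.7-microsecond nominal centre is illustrative — the midpoint of the 9.2 to 10.2 microsecond range given earlier — not a value stated in the text.

```python
def horizontal_phase_error_us(lock_on_us, nominal_us=9.7):
    """Estimated horizontal phase (shift) error: how far the measured
    lock-on sampling point has drifted from the nominal data position,
    in microseconds after H-REF (positive = data arrives late)."""
    return lock_on_us - nominal_us
```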
Step 510 is followed by step 512, in which the routine returns to step 312 in FIG. 6, where data is decoded by the adaptive timing processor 104.
FIG. 9 is a logic flow diagram illustrating an exemplary method for recovering data from a television signal encoded with horizontal overscan data in accordance with the present invention. The steps illustrated by FIG. 9 are performed by an adaptive timing module 100 operating with an adaptive timing processor 104. Routine 600 begins with the start block 602.
Step 602 is followed by step 604, in which the adaptive timing processor 104 sets a series of sampling windows or sub-portions 208 within a timing window 202 of an encoded video signal 102. That is, the adaptive timing processor 104 divides a timing window 202, where a pre-visible horizontal overscan area 127 is expected to be, into a number of sub-portions 208. For example, a timing window 202 can be defined between Tmin and Tmax, wherein Tmin is approximately 8.8 microseconds after H-REF and Tmax is approximately 11.0 microseconds after H-REF, when the pre-visible horizontal overscan area 127 is expected to be located between 9.2 and 10.2 microseconds after H-REF. As shown in FIG. 5a, the timing window 202 is divided into a series of five sampling windows or sub-portions 208.

Step 604 is followed by step 606, in which the adaptive timing processor 104 waits for eight video fields 206a-m to capture or detect an ISDW 212. The number of video fields 206a-m is a preselected number based upon the available processor time and capacity. A lesser or greater number of video fields 206a-m can be selected and scanned to capture or detect an ISDW 212. As shown in FIG. 5b, eight fields 206a-m are scanned by the adaptive timing processor 104 for the presence of an ISDW 212.
Step 606 is followed by decision block 608, in which the adaptive timing processor 104 determines whether a valid ISDW 212 is detected. If a valid ISDW 212 is detected, then the "YES" branch is followed to step 610. In step 610, the adaptive timing processor 104 sets a flag 214 indicating a valid ISDW 212 in the sampling window or sub-portion 208. As shown in FIGs. 5b-c, a flag 214a-m can be set indicating a valid ISDW 212 in a particular sampling window or sub-portion 208 for each field 206a-m.

Step 610 is followed by decision block 612, in which the adaptive timing processor 104 determines whether all of the sampling windows or sub-portions 208 have been checked or scanned by the adaptive timing processor 104 for a valid ISDW 212.
If a valid ISDW 212 is not detected by decision block 608, then the "NO" branch is followed to Step 614. In step 614, the adaptive timing processor 104 sets a flag 214 indicating that a valid ISDW 212 is not present in the timing window 202.
Step 614 is followed by step 616, in which the adaptive timing processor 104 increments the sampling window or sub-portion 208 by Tn. As shown in FIG. 5a, a field 206 is divided into increments, each with the width of Tn.

Step 616 is followed by decision block 612, in which the adaptive timing processor 104 determines whether all of the sampling windows or sub-portions 208 have been checked or scanned by the adaptive timing processor 104 for a valid ISDW 212. If not all of the sampling windows or sub-portions 208 have been checked, then the "NO" branch is followed to step 618, returning to step 606, in which the adaptive timing processor 104 scans eight video fields 206a-m to capture or detect an ISDW 212. If all of the sampling windows or sub-portions 208 have been checked, then the "YES" branch is followed to step 620, and then to decision block 622.

Decision block 622 determines whether at least one sampling point or sub-portion 208 contains a valid ISDW 212. If none of the sampling points or sub-portions 208 contains a valid ISDW 212, then the "NO" branch is followed to step 624, returning to step 606, in which the adaptive timing processor 104 scans eight video fields 206a-m to capture or detect an ISDW.
If at least one sampling point or sub-portion 208 contains a valid ISDW 212, then the "YES" branch is followed to step 626, in which the adaptive timing processor 104 determines an optimum timing sample point or selected sampling point 216. An optimum timing sample point or selected sampling point 216 can be an average location or a center point between two or more ISDW sampling point or sub-portion positions. Other similar types of optimum timing sample points or selected sampling points can be calculated by the adaptive timing processor 104 for use with the routine 600.
Step 626 is followed by step 628, in which the adaptive timing processor 104 sets a flag 214 indicating a valid ISDW 212 at the sampling point or sub-portion 208 location. Furthermore, step 628 enables data decoding of the encoded video signal 102 using the calculated optimum timing sample point or selected sampling point 216. The adaptive timing processor 104 uses the optimum timing sample point or selected sampling point 216 to decode subsequent data 220 within the encoded video signal 102.
Step 628 is followed by decision block 630, in which the adaptive timing processor 104 determines whether the ISDW 212 is still valid. If the ISDW 212 is still valid, then the "YES" branch is followed to step 632, returning to step 628 where the adaptive timing processor 104 continues data decoding of the encoded video signal 102 using the calculated optimum timing sample point or selected sampling point 216.
If the ISDW 212 is not valid, then the "NO" branch is followed to step 634, in which the adaptive timing processor 104 starts an invalid ISDW timer.
Furthermore, step 634 disables data decoding of the encoded video signal. Step 634 is followed by decision block 636, in which the adaptive timing processor 104 determines whether the invalid ISDW timer has expired. If the invalid ISDW timer has expired, then the "YES" branch is followed to step 638, in which the routine 600 begins again.
If the invalid ISDW timer has not expired, then the "NO" branch is followed to step 640, which is followed by decision block 642. Decision block 642 determines whether an ISDW 212 is present in the sampling window or sub-portion 208.
If no ISDW 212 is detected by the adaptive timing processor 104, then the "NO" branch is followed to step 644, returning to decision block 636 to determine whether the ISDW invalid timer has expired. However, if an ISDW 212 is detected, then the "YES" branch is followed to step 646, returning to step 628 continuing the data decoding with the calculated sample point or selected sampling point 216.
In view of the foregoing, it will be appreciated that the invention provides an adaptive timing module for recovering data from a video signal encoded with horizontal overscan data. Furthermore, the present invention provides a system and method for counteracting horizontal picture or phase shift in a video signal. The present invention also provides a system and method for correcting horizontal picture or phase shift without using complex or expensive devices. It should be understood that the foregoing relates only to the exemplary embodiments of the present invention, and that numerous changes may be made therein without departing from the spirit and scope of the invention as defined by the following claims.

CLAIMS

The invention claimed is:
1. A computer-readable medium having computer-executable instructions comprising: an adaptive timing processor for extracting digital data encoded in a horizontal overscan portion of a video signal, operable to perform the steps of:
conducting a sweeping operation through a timing search range within a plurality of scan lines over multiple fields of the video signal to detect a horizontal position within the scan lines associated with the digital data;
based on the sweeping operation, determining a desired horizontal detection position within the scan lines; and
detecting digital data encoded at the desired horizontal detection position of subsequent fields of the video signal.
2. The computer-readable medium of Claim 1, wherein the timing search range comprises a range between 8.8 and 11.0 microseconds from a horizontal synchronization pulse or a timing signal that indicates the beginning of a scan line.
3. The computer-readable medium of Claim 1, wherein the horizontal overscan encoded data comprises a specific data sequence that indicates the beginning of a field of digital data.
4. The computer-readable medium of Claim 1, wherein the horizontal overscan encoded data comprises an intelligent signal detect word (ISDW).
5. The computer-readable medium of Claim 3, wherein the step of determining a desired horizontal detection position within the scan lines further comprises comparing the desired horizontal position to a measured horizontal position.
6. The computer-readable medium of Claim 4, wherein the step of determining a desired horizontal detection position within the scan lines further comprises comparing the decoded data to a predefined intelligent signal detect word (ISDW).
7. The computer-readable medium of Claim 1, wherein the step of conducting a sweeping operation through a timing search range within a plurality of scan lines over multiple fields of the video signal further comprises:
dividing the timing search range into a plurality of equal portions;
scanning each portion of the timing search range for data within the scan lines associated with the digital data;
storing whether data was detected within each portion; and
determining a center point of the portions where valid data is detected.
8. The computer-readable medium of Claim 7, wherein the step of determining a desired horizontal detection position within the scan lines further comprises locking onto the center point of the portions where valid data is detected.
9. The computer-readable medium of Claim 1, further comprising the step of repeating the step of detecting digital data encoded at the desired horizontal detection position of subsequent fields of the video signal until a reset condition is enabled.
10. The computer-readable medium of Claim 9, further comprising the step of, in response to a reset condition, repeating the steps of Claim 1.
11. The computer-readable medium of Claim 9, wherein the reset condition comprises the elapse of a predefined length of time.
12. The computer-readable medium of Claim 9, wherein the reset condition comprises manually triggering a reset button.
13. A display device for recovering data from a video signal divided into frames, wherein each frame comprises a plurality of horizontal scan lines consecutively illuminated on the display device, wherein each scan line comprises a prescan portion comprising a pre-data encoding zone, wherein the display device scans the prescan portion for the presence of encoded data in the pre-data encoding zone over a plurality of subsequent frames, operable to perform the steps of:
determining a set of sampling sub-portions within a prescan portion;
sweeping over the set of sampling points for the presence of encoded data; and
detecting encoded data within the prescan portion.
14. The display device of Claim 13, further comprising the steps: determining a center point of the sampling points; locking onto the center point of the sampling points; and using the center point of the sampling points for recovering subsequent data from the video signal.
15. The display device of Claim 13, wherein the encoded data comprises a predefined intelligent signal detect word (ISDW).
16. The display device of Claim 13, wherein the prescan portion comprises a range between 8.8 and 11.0 microseconds from a horizontal synchronization pulse or a timing signal that indicates the beginning of a scan line.
17. The display device of Claim 14, wherein the step of determining a center point of the sampling points further comprises:
dividing the prescan portion into a plurality of equally sized sub-portions;
scanning each sub-portion for the presence of encoded data;
detecting encoded data in a sub-portion; and
storing the sampling position associated with the detected encoded data of each sub-portion.
18. The display device of Claim 13, further comprising the step of triggering a reset condition.
19. The display device of Claim 13, further comprising the step of repeating the step of detecting encoded data within the prescan portion until a reset condition is enabled.
20. The display device of Claim 19, further comprising the step of, in response to a reset condition, repeating the steps of Claim 13.
21. The display device of Claim 18, wherein the reset condition comprises the elapse of a predefined length of time.
22. The display device of Claim 18, wherein the reset condition comprises manually triggering a reset button.
23. A method of adjusting a decoder for horizontal phase shift while recovering digital data encoded in a horizontal overscan portion of a video signal, comprising:
conducting a sweeping operation through a timing search range within a plurality of scan lines over multiple fields of the video signal to detect encoded data within the scan lines associated with the digital data;
based on the sweeping operation, determining a desired horizontal detection position within the scan lines; and
detecting digital data encoded at the desired horizontal detection position of subsequent fields of the video signal.
24. The method of Claim 23, wherein the timing search range comprises a range between 8.8 and 11.0 microseconds from a horizontal synchronization pulse or a timing signal that indicates the beginning of a scan line.
25. The method of Claim 23, wherein the encoded data comprises a specific data sequence that indicates the beginning of a field of digital data.
26. The method of Claim 23, wherein the encoded data comprises an intelligent signal detect word (ISDW).
27. The method of Claim 25, wherein the step of determining a desired horizontal detection position within the scan lines further comprises comparing the desired data sequence to the received data sequence.
28. The method of Claim 26, wherein the step of determining a desired horizontal detection position within the scan lines further comprises comparing the received data sequence to a stored intelligent signal detect word (ISDW).
29. The method of Claim 23, wherein the step of conducting a sweeping operation through a timing search range within a plurality of scan lines over multiple fields of the video signal further comprises:
dividing the timing search range into a plurality of equal portions;
scanning each portion of the timing search range for a horizontal position within the scan lines associated with the digital data;
storing whether valid data was detected within each portion; and
determining a center point of the portions where valid data is detected.
30. The method of Claim 29, wherein the step of determining a desired horizontal detection position within the scan lines further comprises locking onto the center point of the portions where valid data is detected.
31. The method of Claim 23, further comprising the step of repeating the step of detecting digital data encoded at the desired horizontal detection position of subsequent fields of the video signal until a reset condition is enabled.
32. The method of Claim 31, further comprising the step of, in response to a reset condition, repeating the steps of Claim 23.
33. The method of Claim 31, wherein the reset condition comprises the elapse of a predetermined length of time.
34. The method of Claim 31, wherein the reset condition comprises manually triggering a reset button.
PCT/US2000/034720 1999-12-30 2000-12-20 System and method of adaptive timing estimation for horizontal overscan data WO2001050747A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU24453/01A AU2445301A (en) 1999-12-30 2000-12-20 System and method of adaptive timing estimation for horizontal overscan data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/476,177 1999-12-30
US09/476,177 US6704058B2 (en) 1999-12-30 1999-12-30 System and method of adaptive timing estimation for horizontal overscan data

Publications (2)

Publication Number Publication Date
WO2001050747A2 true WO2001050747A2 (en) 2001-07-12
WO2001050747A3 WO2001050747A3 (en) 2002-02-21

US8015581B2 (en) 2007-01-05 2011-09-06 Verizon Patent And Licensing Inc. Resource data configuration for media content access systems and methods
US8634814B2 (en) 2007-02-23 2014-01-21 Locator IP, L.P. Interactive advisory system for prioritizing content
US8103965B2 (en) 2007-06-28 2012-01-24 Verizon Patent And Licensing Inc. Media content recording and healing statuses
US8051447B2 (en) 2007-12-19 2011-11-01 Verizon Patent And Licensing Inc. Condensed program guide for media content access systems and methods
US8923649B2 (en) 2011-09-06 2014-12-30 Cisco Technology, Inc. System and method for calibrating display overscan using a mobile device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993018614A1 (en) * 1992-03-11 1993-09-16 Thomson Consumer Electronics, Inc. Auxiliary video data decoder with large phase tolerance
WO1999000979A1 (en) * 1997-06-30 1999-01-07 Microsoft Corporation Method and system for encoding data in the horizontal overscan portion of a video signal

Family Cites Families (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3493674A (en) 1965-05-28 1970-02-03 Rca Corp Television message system for transmitting auxiliary information during the vertical blanking interval of each television field
US3743767A (en) 1971-10-04 1973-07-03 Univ Illinois Transmitter and receiver for the transmission of digital data over standard television channels
US3900887A (en) 1973-01-18 1975-08-19 Nippon Steel Corp Method of simultaneous multiplex recording of picture and data and of regenerating such record and apparatus therefor
US3891792A (en) 1974-06-25 1975-06-24 Asahi Broadcasting Television character crawl display method and apparatus
US3993861A (en) 1975-03-24 1976-11-23 Sanders Associates, Inc. Digital video modulation and demodulation system
JPS6050469B2 (en) 1976-10-18 1985-11-08 東京デザイン工芸株式会社 model equipment
US4186413A (en) 1977-11-14 1980-01-29 Sanders Associates, Inc. Apparatus for receiving encoded messages on the screen of a television receiver and for redisplay thereof on the same receiver screen in a readable format
FR2477816A1 (en) 1980-03-04 1981-09-11 Telediffusion Fse TELEVISION SYSTEM USING AN IMAGE SUPERIOR CODE
US4862268A (en) 1980-03-31 1989-08-29 General Instrument Corporation Addressable cable television control system with video format data transmission
US4665431A (en) 1982-06-24 1987-05-12 Cooper J Carl Apparatus and method for receiving audio signals transmitted as part of a television video signal
US4638359A (en) 1983-05-19 1987-01-20 Westinghouse Electric Corp. Remote control switching of television sources
DE3318919C2 (en) 1983-05-25 1985-03-21 TeleMetric S.A., Internationale Gesellschaft für Fernsehzuschauerforschung, Zug Method and apparatus for collecting data on television viewing behavior of television viewers
US4540176A (en) 1983-08-25 1985-09-10 Sanders Associates, Inc. Microprocessor interface device
JPS61156405A (en) 1984-12-28 1986-07-16 Nintendo Co Ltd Robot composite system
US4660033A (en) 1985-07-29 1987-04-21 Brandt Gordon C Animation system for walk-around costumes
GB2178584A (en) 1985-08-02 1987-02-11 Gray Ventures Inc Method and apparatus for the recording and playback of animation control signals
US4864607A (en) 1986-01-22 1989-09-05 Tomy Kogyo Co., Inc. Animated annunciator apparatus
US5108341A (en) 1986-05-28 1992-04-28 View-Master Ideal Group, Inc. Toy which moves in synchronization with an audio source
US4771344A (en) 1986-11-13 1988-09-13 James Fallacaro System for enhancing audio and/or visual presentation
US4846693A (en) 1987-01-08 1989-07-11 Smith Engineering Video based instructional and entertainment system using animated figure
US4840602A (en) 1987-02-06 1989-06-20 Coleco Industries, Inc. Talking doll responsive to external signal
US4847699A (en) 1987-07-16 1989-07-11 Actv, Inc. Method for providing an interactive full motion synched compatible audio/visual television display
US4847700A (en) 1987-07-16 1989-07-11 Actv, Inc. Interactive television system for providing full motion synched compatible audio/visual displays from transmitted television signals
US4855827A (en) 1987-07-21 1989-08-08 Worlds Of Wonder, Inc. Method of providing identification, other digital data and multiple audio tracks in video systems
US4807031A (en) 1987-10-20 1989-02-21 Interactive Systems, Incorporated Interactive video method and apparatus
US4969041A (en) 1988-09-23 1990-11-06 Dubner Computer Systems, Inc. Embedment of data in a video signal
US4930019A (en) 1988-11-29 1990-05-29 Chi Wai Chu Multiple-user interactive audio/video apparatus with automatic response units
US5021878A (en) 1989-09-20 1991-06-04 Semborg-Recrob, Corp. Animated character system with real-time control
US5198893A (en) 1989-09-20 1993-03-30 Semborg Recrob, Corp. Interactive animated charater immediately after the title
US5191615A (en) 1990-01-17 1993-03-02 The Drummer Group Interrelational audio kinetic entertainment system
NL9000130A (en) 1990-01-19 1990-05-01 Philips Nv VIDEO SYSTEM.
US5200822A (en) 1991-04-23 1993-04-06 National Broadcasting Company, Inc. Arrangement for and method of processing data, especially for identifying and verifying airing of television broadcast programs
US5453795A (en) * 1991-07-02 1995-09-26 Thomson Consumer Electronics, Inc. Horizontal line counter insensitive to large phase shifts of video
US5243423A (en) 1991-12-20 1993-09-07 A. C. Nielsen Company Spread spectrum digital data transmission over TV video
US5463423A (en) * 1992-03-11 1995-10-31 Thomson Consumer Electronics, Inc. Auxiliary video data detector and data slicer
WO1993023955A1 (en) 1992-05-12 1993-11-25 Holman Michael J Electronic redeemable coupon system
EP0572740B1 (en) * 1992-06-01 1998-09-09 THOMSON multimedia Auxiliary video data slicer
US5270480A (en) 1992-06-25 1993-12-14 Victor Company Of Japan, Ltd. Toy acting in response to a MIDI signal
JP3257081B2 (en) * 1992-10-08 2002-02-18 ソニー株式会社 Data demodulator
KR950703393A (en) 1992-10-19 1995-09-20 제프리 스코트 쟈니 Control device that operates and speaks according to video signal and wireless signal
KR940017376A (en) 1992-12-21 1994-07-26 오오가 노리오 Transmission method, reception method, communication method and two-way bus system
US5450134A (en) 1993-01-12 1995-09-12 Visual Automation Systems, Inc. Video facility management system for encoding and decoding video signals to facilitate identification of the video signals
US5523794A (en) 1993-04-16 1996-06-04 Mankovitz; Roy J. Method and apparatus for portable storage and use of data transmitted by television signal
US5398071A (en) 1993-11-02 1995-03-14 Texas Instruments Incorporated Film-to-video format detection for digital television
US5483289A (en) * 1993-12-22 1996-01-09 Matsushita Electric Industrial Co., Ltd. Data slicing circuit and method
WO1995029558A1 (en) 1994-04-20 1995-11-02 Shoot The Moon Products, Inc. Method and apparatus for nesting secondary signals within a television signal
KR0178718B1 (en) * 1994-06-10 1999-05-01 김광호 Detection clock generator for digital data on complex image signal and data detector by detection clock
EP0710022A3 (en) 1994-10-31 1998-08-26 AT&T Corp. System and method for encoding digital information in a television signal
JPH08340497A (en) 1995-06-14 1996-12-24 Hitachi Ltd Receiving device for television signal
US5752880A (en) 1995-11-20 1998-05-19 Creator Ltd. Interactive doll
US6377308B1 (en) * 1996-06-26 2002-04-23 Intel Corporation Method and apparatus for line-specific decoding of VBI scan lines
US5812207A (en) * 1996-12-20 1998-09-22 Intel Corporation Method and apparatus for supporting variable oversampling ratios when decoding vertical blanking interval data
US6415439B1 (en) 1997-02-04 2002-07-02 Microsoft Corporation Protocol for a wireless control system
US5977951A (en) 1997-02-04 1999-11-02 Microsoft Corporation System and method for substituting an animated character when a remote control physical character is unavailable
US6072532A (en) 1997-02-18 2000-06-06 Scientific-Atlanta, Inc. Method and apparatus for generic insertion of data in vertical blanking intervals
US6057889A (en) 1997-09-26 2000-05-02 Sarnoff Corporation Format-responsive video processing system
US6094228A (en) 1997-10-28 2000-07-25 Ciardullo; Daniel Andrew Method for transmitting data on viewable portion of a video signal
US6281939B1 (en) * 1998-11-12 2001-08-28 Microsoft Corporation Method and apparatus for decoding data encoded in the horizontal overscan portion of a video signal
WO2000044460A2 (en) 1999-01-28 2000-08-03 Pleschutznig Lanette O Interactively programmable toy

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993018614A1 (en) * 1992-03-11 1993-09-16 Thomson Consumer Electronics, Inc. Auxiliary video data decoder with large phase tolerance
WO1999000979A1 (en) * 1997-06-30 1999-01-07 Microsoft Corporation Method and system for encoding data in the horizontal overscan portion of a video signal

Also Published As

Publication number Publication date
TW532038B (en) 2003-05-11
US6704058B2 (en) 2004-03-09
AU2445301A (en) 2001-07-16
US20030169367A1 (en) 2003-09-11
WO2001050747A3 (en) 2002-02-21

Similar Documents

Publication Publication Date Title
US6704058B2 (en) System and method of adaptive timing estimation for horizontal overscan data
US4408225A (en) Subscription television decoder
AU699288B2 (en) Animated "on-screen" display provisions for an mpeg video signal processing system
CA2294948C (en) Method and system for encoding data in the horizontal overscan portion of a video signal
US4458268A (en) Sync displacement scrambling
EP1097578B1 (en) Method and arrangement for transmitting and receiving encoded images
IE61641B1 (en) Interactive video method and apparatus
US6281939B1 (en) Method and apparatus for decoding data encoded in the horizontal overscan portion of a video signal
EP0472572B1 (en) Data transmission in the active picture period
US6694518B1 (en) Method and apparatus for carrying data across high definition analog component video interfaces
JPH0334792A (en) Video signal transmission system
US6556247B1 (en) Method and system for decoding data in the horizontal overscan portion of a video signal
JPS6040067Y2 (en) Reception determination device for multiplexed signals
KR100202221B1 (en) Caption information additional methode of tv brodcasting
JPS6253994B2 (en)
KR0164735B1 (en) Addition information display apparatus and method thereof
KR19980020289A (en) Subtitle Signal Format of Viewer Selective Subtitle Broadcasting
JPH08149510A (en) Two-dimensional three-dimensional video image conversion method
KR970011699B1 (en) Caption image interpolation system for satellite broadcasting
JPS6059887A (en) Color television receiver
KR100188286B1 (en) Caption signal format of caption broadcasting televiewer option
JPH0498983A (en) Teletext multiplex receiver
JPH07212726A (en) Multiplex broadcast method and receiver
JPH0888840A (en) Teletext system discrimination device
JPH06225175A (en) Method and apparatus for synchronizing control function with video signal in television receiver

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP