CA2247637A1 - Video data editing apparatus, optical disc for use as a recording medium of a video data editing apparatus, and computer-readable recording medium storing an editing program - Google Patents

Video data editing apparatus, optical disc for use as a recording medium of a video data editing apparatus, and computer-readable recording medium storing an editing program

Info

Publication number
CA2247637A1
CA2247637A1
Authority
CA
Canada
Prior art keywords
data
audio
video
video object
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002247637A
Other languages
French (fr)
Inventor
Tomoyuki Okada
Kazuhiro Tsuga
Hiroshi Hamasaka
Shinichi Saeki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Corp
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of CA2247637A1 publication Critical patent/CA2247637A1/en
Abandoned legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/036Insert-editing
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/30Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
    • G11B27/3027Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is digitally coded
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/326Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is a video-frame or a video-field (P.I.P.)
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/327Table of contents
    • G11B27/329Table of contents on a disc [VTOC]
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/21Disc-shaped record carriers characterised in that the disc is of read-only, rewritable, or recordable type
    • G11B2220/215Recordable discs
    • G11B2220/216Rewritable discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • G11B2220/2562DVDs [digital versatile discs]; Digital video discs; MMCDs; HDCDs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • G11B2220/2562DVDs [digital versatile discs]; Digital video discs; MMCDs; HDCDs
    • G11B2220/2575DVD-RAMs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/40Combinations of multiple record carriers
    • G11B2220/45Hierarchical combination of record carriers, e.g. HDD for fast access, optical discs for long term storage or tapes for backup
    • G11B2220/455Hierarchical combination of record carriers, e.g. HDD for fast access, optical discs for long term storage or tapes for backup said record carriers being in one device and being used as primary and secondary/backup media, e.g. HDD-DVD combo device, or as source and target media, e.g. PC and portable player
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/775Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/84Television signal recording using optical recording
    • H04N5/85Television signal recording using optical recording on discs or drums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/806Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal
    • H04N9/8063Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal using time division multiplex of the PCM audio and PCM video signals

Abstract

One or more video objects are recorded on an optical disc. When a user indicates a linking edit that links sections of the video objects, video object units (VOBUs) that include picture data at the end of a former section and VOBUs that include picture data at the start of a latter section are read from the optical disc, and the audio packs and video packs are separated from these read VOBUs. Next, the video packs are re-encoded, and some of the audio packs that were originally in the former section are multiplexed into the latter section.
The result of the multiplexing is then recorded onto the optical disc.

Description


FP-98141

TITLE OF THE INVENTION

VIDEO DATA EDITING APPARATUS, OPTICAL DISC FOR USE AS A RECORDING MEDIUM OF A VIDEO DATA EDITING APPARATUS, AND COMPUTER-READABLE RECORDING MEDIUM STORING AN EDITING PROGRAM

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a video data editing apparatus that uses an optical disc as an editing medium for video data, a computer-readable recording medium that stores an editing program, an optical disc for use as a recording medium of a video data editing apparatus, and a reproduction apparatus for an optical disc.

2. Description of the Background Art
Video editors in the film and broadcasting industries make full use of their skill and experience when editing the great variety of video productions that reach the market. While movie fans and home video makers may not possess such skill or experience, many are still inspired by professional editing to try video editing for themselves. This creates a demand for a domestic video editing apparatus that can perform advanced video editing while still being easy to use.
While video editing generally involves a variety of operations, domestic video editing apparatuses that are likely to appear on the market in the near future will especially require an advanced scene linking function. Such a function links a number of scenes to form a single work.
When linking scenes using conventional domestic equipment, the user connects two video cassette recorders to form a dubbing system. The operations performed when linking scenes using this kind of dubbing system are described below.
Fig. 1A shows a video editing setup using video cassette recorders that are respectively capable of recording and playing back video signals. The setup of Fig. 1A includes the video cassette 301 that records the source video, the video cassette 302 for recording the editing result, and two video cassette recorders 303 and 304 for playing back and recording video images on the video cassettes 301 and 302. In this example, the user attempts to perform the editing operation shown in Fig. 1B using the setup of Fig. 1A.
Fig. 1B shows the relationship between the source materials to be edited and the editing result. In this example, the user plays back scene 505 that is located between time t5 and time t10 of the source materials, scene 506 that is located between times t13 and t21, and scene 507 that is located between times t23 and t25, and attempts to produce an editing result that is composed only of these scenes.
With the setup of Fig. 1A, the user sets the video cassette 301 including the source materials into the video cassette recorder 303 and the video cassette 302 for recording the editing result into the video cassette recorder 304.
After setting the video cassettes 301 and 302, the user presses the fast-forward button on the operation panel of the video cassette recorder 303 (as shown in Fig. 1A) to search for the start of scene 505. Next, the user presses the play button on the operation panel of the video cassette recorder 303 (as shown in Fig. 1A) to reproduce scene 505. At the same time, the user presses the record button on the operation panel of the video cassette recorder 304 (as shown in Fig. 1A) to commence recording. When scene 505 has finished, the user stops the operation of both video cassette recorders 303 and 304. The user then fast-forwards the video cassette to the start of scene 506, and then simultaneously commences the playback by video cassette recorder 303 and the recording by video cassette recorder 304. After completing the above process for scenes 506 and 507, the user has the video cassette recorders 303 and 304 respectively rewind the video cassettes 301 and 302 to complete the editing operation.
If the scene linking operation described above could be performed with ease at the home, users would then be able to easily manage programs that have been recorded on a large number of magnetic tape cassettes.
A first problem with the video editing setup described above, though, is that the source material and the editing result need to be recorded on separate recording media, meaning that two video cassette recorders need to be used for playing back and recording the respective recording media. This greatly increases the scale of the video editing setup. A second problem is that the need to reproduce the video images between times t5-t10, t13-t21, and t23-t25 using video cassette recorder 303 makes the video editing very time-consuming. Here, the longer the video excerpts that compose the editing result, the longer the reproduction time and the editing time, meaning that the editing of long source materials can take an extremely long time.
To complete the above linking process in a short time with small-scale equipment, it would be ideal for the pieces of the recording media that record the desired video images to be simply linked together, as with the conventional splicing of sections of magnetic tape. When the source materials are stored as analog video signals, there may be no significant problems should the sections of magnetic tape storing the desired materials be spliced together. However, when linking system streams that have been highly compressed according to MPEG techniques, there is the problem that the video reproduction can be interrupted or disturbed at the joins between the spliced sections.
Here, the expression "system stream" refers to video data and audio data that are multiplexed together, with such streams also being called "audio-visual data (AV data)" in this specification.
One of the causes of the above problem is the assigning of variable-length code to video frames in a video stream. When encoding a video stream, an optimal amount of code is assigned to each display cycle in order to strike a good balance between the complexity of the image to be displayed and the amount of data that is already stored in the buffer of the video decoder.

Since precise calculations are performed when assigning code within a video stream, it can be ensured that no underflows or overflows will occur in the decoder buffer when a single video stream is reproduced in its original form. However, when separately encoded former and latter video streams are linked together, the latter video stream will be inputted into the buffer of a video decoder with no consideration given to the amount of data already accumulated in the video decoder buffer at the end of reproduction of the former video stream. When this happens, there is a clear possibility of an overflow or an underflow occurring in the video decoder buffer. When linking partial sections of a system stream in the same way as in Fig. 1B, there is the possibility of an overflow or an underflow occurring in the video decoder buffer when the reproduction proceeds from a former section to a latter section.
Reproduction of video without interruptions and disturbances is called seamless reproduction. To perform seamless reproduction of linked sections, it is necessary to temporarily convert the former section and latter section into video signals and audio signals and to then re-encode these signals to convert the signals of the former section and latter section into a single video stream and audio stream. The time taken by such re-encoding is proportional to the amount of data in the video streams and audio streams of the edited source materials. As a result, when the source materials contain a large amount of data, this process will be very time-consuming.
To perform AV synchronization during the reproduction of an MPEG system stream, the time stamps that show the respective reproduction times of a video stream and an audio stream must be consecutive. Conventionally, MPEG standards have focused on the consecutive reproduction of one stream from start to end, so that seamless reproduction has not been possible for two MPEG streams which do not have consecutive time stamps. As a result, during editing, it has been necessary to give at least one of the linked MPEG streams time stamps that are continuous with the time stamps in the other MPEG stream, meaning that the latter MPEG stream has had to be re-encoded in its entirety.

SUMMARY OF THE INVENTION
It is a first object of the present invention to provide a video data editing apparatus that can perform an editing operation that seamlessly links video streams or parts of video streams in a short time using only a single recording medium.

It is a second object of the present invention to provide a video data editing apparatus that can perform an editing operation that seamlessly links video streams or parts of video streams, these including system streams that have been encoded after a precise calculation that ensures no underflow or overflow will occur in a buffer.
The main points worthy of attention for the realization of the first and second objects are as follows.
To seamlessly link video streams or parts of video streams on a single recording medium in a short time, the part of the data to be re-encoded to achieve the seamless link should be as short as possible. However, due to the data construction of system streams, this cannot be achieved by simply reducing the size of the re-encoded data part.
Such system streams are composed of an arrangement of a plurality of video packs and audio packs. In such a data construction, however, the video data and audio data to be reproduced simultaneously at a given time are not necessarily arranged properly in the video packs and audio packs. As a result, if the re-encoded area is simply decreased, audio data which is to be reproduced with re-encoded video data, and which should itself be re-encoded, will be outside the area subject to re-encoding. In the present invention, however, the re-encoded area is set as an integer multiple of video object units. Each video object unit includes a plurality of sets of picture data (video data), with a reproduction period of around 0.5 seconds in the embodiments, and in many cases will also include the audio data to be simultaneously reproduced with the picture data. A "one-second rule" (described in the embodiments) is used under MPEG standards, so that the audio data to be reproduced simultaneously with video data will be included within the same video object unit. By performing a re-encode for an integer multiple of video object units, the part that needs to be re-encoded for seamless linking can be greatly reduced.
The first object can be achieved by a video data editing apparatus that performs editing to enable seamless reproduction of at least two video objects that are recorded on an optical disc, each video object including a plurality of video object units, and each video object unit including sets of picture data, the video data editing apparatus including: a reading unit for reading at least one of a former video object unit sequence and a latter video object unit sequence from a video object recorded on the optical disc, the former video object unit sequence being composed of a predetermined number of video object units positioned at an end of a former video object to be reproduced first, and the latter video object unit sequence being composed of a predetermined number of video object units positioned at a start of a latter video object to be reproduced second; an encoding unit for re-encoding the sets of picture data included in at least one of the former video object unit sequence and the latter video object unit sequence to enable the former video object and the latter video object to be reproduced seamlessly; and a writing unit for rewriting at least one of the former video object and the latter video object on the optical disc after encoding by the encoding unit.
With the stated construction, the edited materials are video objects recorded on a single optical disc, with the re-encoded data being video object unit sequences that are smaller than the video objects. As a result, when the video objects to be seamlessly linked are extremely long, for example, it is sufficient to read and re-encode only the video object units at the end of the first video object before recording the resulting data back onto the same optical disc. Consequently, an editing operation that enables the seamless reproduction of video objects can be completed in a short time.
The second object can be achieved by a video data editing apparatus where the encoding unit re-encodes at least one of the sets of picture data included in the former video object unit sequence and the sets of picture data included in the latter video object unit sequence using a target amount of code, the target amount of code being an amount whereby no overflow will occur in a video buffer of a video decoder, even when the sets of picture data included in the former video object unit sequence are present in the video buffer at the same time as the sets of picture data included in the latter video object unit sequence.
With the stated construction, a case where picture data included in a latter video object is accumulated in a decoder buffer while picture data included in a former video object is still present in the decoder buffer is considered, with re-encoding being performed to ensure that no overflows occur in the buffer of the video decoder. As a result, when separately encoded video objects are linked, seamless reproduction of the resulting single video object will be possible.
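The condition on the target amount of code can be pictured as a leaky-bucket check on the video decoder buffer. The following is a simplified sketch under assumed conditions (constant fill rate, one frame drained per display period; none of the names are taken from the specification):

    def link_overflows(former_sizes, latter_sizes, fill_rate_bps, buffer_bits,
                       frame_period=1 / 29.97):
        """Rough check that the chosen target code amounts will not overflow
        the video buffer across the link.  `former_sizes` and `latter_sizes`
        are per-frame code sizes in bits, in decode order; data arrives at
        fill_rate_bps and one frame is drained every frame_period seconds."""
        occupancy = 0.0
        for size in list(former_sizes) + list(latter_sizes):
            occupancy += fill_rate_bps * frame_period   # bits delivered this frame
            if occupancy > buffer_bits:
                return True                             # overflow at the link
            occupancy -= size                           # frame removed at its decode time
            occupancy = max(occupancy, 0.0)             # buffer cannot go negative
        return False

If this check reports an overflow, the target amount of code for the re-encoded pictures is too small relative to the delivery rate and must be increased.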
Here, each set of picture data may include data that is to be decoded for one video frame, the information generating unit further adding, to the seamless linking information, a presentation end time when reproduction of the sets of picture data in the former video object unit sequence ends and a presentation start time when reproduction of the sets of picture data in the latter video object unit sequence starts, the certain offset being found by subtracting the presentation start time of the latter video object unit sequence from the presentation end time of the former video object unit sequence.
With the stated construction, the presentation end time information for the picture data included in the former video object and the presentation start time information for the picture data included in the latter video object are generated and written onto the optical disc. If a reproduction apparatus is of the extended-STD type, where one of a standard time measured by an STC and a sum of the standard time and an offset is used by a video decoder during decode operations, the reproduction apparatus will read the presentation end time information and the presentation start time information from the optical disc and use these to calculate the offset to be added to the standard time. As a result, even if the SCR, DTS, and PTS are not continuous at the link boundary of a former video object and a latter video object, seamless reproduction will still be possible at this boundary.
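In other words, the offset used by such an extended-STD player is simply the difference between the two time stamps written in the seamless linking information. A minimal sketch (the argument names follow the VOB_V_E_PTM / VOB_V_S_PTM style used for the figures; treating the stamps as 90 kHz clock ticks is the usual MPEG convention and an assumption here):

    def stc_offset(former_vob_v_e_ptm: int, latter_vob_v_s_ptm: int) -> int:
        """Offset added to the STC while the latter video object is decoded:
        presentation end time of the former video object unit sequence minus
        presentation start time of the latter one (90 kHz clock ticks)."""
        return former_vob_v_e_ptm - latter_vob_v_s_ptm

    # An extended-STD reproduction apparatus compares the latter object's
    # DTS/PTS against STC + stc_offset(...), so those time stamps need not
    # be continuous with the former object's stamps at the link boundary.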
Here, each video object unit may include a plurality of sets of picture data and a plurality of sets of audio data, and the video data editing apparatus may further include: a separating unit for separating sets of picture data and sets of audio data from the former video object unit sequence and the latter video object unit sequence read by the reading unit; and a multiplexing unit for multiplexing at least one of the sets of picture data, which include one of picture data and re-encoded picture data, separated from the former video object unit sequence with the sets of audio data read from the former video object unit sequence, and for multiplexing the sets of picture data, which include one of picture data and re-encoded picture data, separated from the latter video object unit sequence with the sets of audio data separated from the latter video object unit sequence, the writing unit writing data outputted by the multiplexing unit onto the optical disc.
With the stated construction, while a plurality of sets of picture data of the latter video object are being accumulated in the decoder buffer, a plurality of sets of audio data included in the former video object are read from the optical disc, and, to enable simultaneous reproduction, a plurality of sets of picture data in the re-encoded video object units of the latter video object are multiplexed with the plurality of sets of audio data in the former video object. As a result, even if the video stream is encoded with a variable bit rate (VBR) while the audio stream is encoded with a constant bit rate (CBR), successive reproduction of a plurality of sets of audio data will still be possible while the video stream waits in the buffer for its decode time to be reached.
Here, a plurality of sets of audio data that should be reproduced for a plurality of audio frames from the first audio frame to the second audio frame may be stored as a first audio pack group, wherein if a data size of the first audio pack group is not an integer multiple of 2 kilobytes (KB), one of stuffing data and a padding packet may be used to make the data size of the first audio pack group an integer multiple of 2 KB, and wherein the plurality of sets of audio data that should be reproduced for a plurality of audio frames starting from the third audio frame may be stored as a second audio pack group, with the multiplexing unit multiplexing sets of picture data and sets of audio data so that the first audio pack group is located before the second audio pack group.
With the stated construction, it is possible to avoid audio reproduction of a plurality of sets of audio data in the former video object that coincides with the audio reproduction of sets of audio data in the latter video object. Also, synchronization between audio reproduction and video reproduction can be maintained for the latter video object.
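The 2 KB alignment of the first audio pack group amounts to rounding its size up to a whole number of packs. A sketch of that rounding only (pack and packet headers, and the choice between stuffing data and a padding packet, are left out):

    PACK_SIZE = 2 * 1024  # one DVD pack occupies one 2 KB sector

    def pad_first_audio_pack_group(payload: bytes) -> bytes:
        """Round the first audio pack group up to an integer multiple of 2 KB.
        A real multiplexer would insert stuffing data or a padding packet with
        proper headers; here the shortfall is simply filled with zero bytes."""
        shortfall = (-len(payload)) % PACK_SIZE
        return payload + b"\x00" * shortfall

    group = pad_first_audio_pack_group(b"\xff" * 5000)
    assert len(group) % PACK_SIZE == 0   # 5000 bytes are padded out to 6144 (3 packs)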
It is also possible to have a video data editing apparatus that performs editing to enable seamless reproduction of a former section and a latter section, the former section and the latter section being located in at least one video object that is recorded on an optical disc, each video object including a plurality of video object units and each video object unit including sets of picture data, the video data editing apparatus including: a reading unit for reading a former video object unit sequence and a latter video object unit sequence from a video object recorded on the optical disc, the former video object unit sequence being composed of video object units positioned at an end of the former section that is to be reproduced first, and the latter video object unit sequence being composed of video object units positioned at a start of the latter section that is to be reproduced second; an encoding unit for re-encoding the sets of picture data included in at least one of the former video object unit sequence and the latter video object unit sequence to enable the former section and the latter section to be reproduced seamlessly; and a writing unit for rewriting at least one of the former section and the latter section on the optical disc after encoding by the encoding unit.
With the stated construction, when the edited materials are parts of video objects recorded on the same optical disc, re-encoding is performed for object units that are smaller than the parts of video objects. As a result, when the parts of video objects to be seamlessly linked are extremely long, for example, it is sufficient to read and re-encode only the object units at the end of a first video object before recording the resulting data back onto the same optical disc. Consequently, an editing operation that enables the seamless reproduction of object units can be completed in a short time.
Here, when a picture type of a final set of picture data in a display order of the former section is a Bidirectionally Predictive Picture (B picture), the re-encoding unit may perform re-encoding to convert the final set of picture data to a Predictive Picture (P picture) whose information components are dependent on only sets of picture data that are reproduced earlier than the final set of picture data.
With the stated construction, if it is necessary to convert a picture type when linking video objects that have a coding order and display order that comply with MPEG standard, the transition of the buffer state is properly estimated without ignoring the increase in buffer occupancy that results from this conversion of picture type. As a result, re-encoding can be performed using a more suitable data amount.

BRIEF DESCRIPTION OF THE DRAWINGS
These and other objects, advantages and features of the invention will become apparent from the following description thereof taken in conjunction with the accompanying drawings which illustrate a specific embodiment of the invention. In the drawings:
Fig. 1A shows a conventional video editing setup using video cassette recorders that are capable of playing back and recording video signals;
Fig. 1B shows the relationship between the source materials and the editing result;
Fig. 2A shows the outward appearance of a DVD-RAM disc that is the recordable optical disc used in the embodiments of the present invention;
Fig. 2B shows the recording areas on a DVD-RAM;
Fig. 2C shows the cross-section and surface of a DVD-RAM cut at a sector header;
Fig. 3A shows the zones 0 to 23 on a DVD-RAM;
Fig. 3B shows the zones 0 to 23 arranged into a horizontal sequence;
Fig. 3C shows the logical sector numbers (LSN) in the volume area;
Fig. 3D shows the logical block numbers (LBN) in the volume area;
Fig. 4A shows the contents of the data recorded in the volume area;
Fig. 4B shows the hierarchical structure of the data definitions used in MPEG standard;
Fig. 5A shows a plurality of sets of picture data arranged in display order and a plurality of sets of picture data arranged in coding order;
Fig. 5B shows the correspondence between audio frames and audio data;
Fig. 6A shows a detailed hierarchy of the logical formats in the data construction of a VOB (Video Object);
Fig. 6B shows the partial deletion of a VOB;

Fig. 6C shows the logical format of a video pack arranged at the start of a VOB;
Fig. 6D shows the logical format of other video packs arranged in a VOB;
Fig. 6E shows the logical format of an audio pack;
Fig. 6F shows the logical format of a pack header;
Fig. 6G shows the logical format of a system header;
Fig. 6H shows the logical format of a packet header;
Fig. 7A shows a video frame and the occupancy of the video buffer;
Fig. 7B shows an audio frame and an ideal transition in the buffer state of the audio buffer;
Fig. 7C shows an audio frame and the actual transition in the buffer state of the audio buffer;
Fig. 7D shows the detailed transfer period of each set of picture data;
Fig. 8A shows how audio packs, which store the audio data to be reproduced in a plurality of audio frames, and video packs, which store the picture data that is to be reproduced in a plurality of video frames, may be recorded;

Fig. 8B shows a key to the notation used in Fig. 8A;
Fig. 9 shows how audio packs, which store the audio data to be reproduced in a plurality of audio frames, and video packs, which store the picture data that is to be reproduced in a plurality of video frames, may be recorded;
Fig. 10A shows the transition in the buffer state for the first part of a video stream;
Fig. 10B shows the transition in the buffer state for the last part of a video stream;
Fig. 10C shows the transition in the buffer state across two VOBs, when the video stream whose last part causes the buffer state shown in Fig. 10B is seamlessly linked to the video stream whose former part causes the buffer state shown in Fig. 10A;
Fig. 11A is a graph where the SCRs of video packs included in a VOB are plotted in the order in which the video packs are arranged;
Fig. 11B shows an example where the first SCR in section B matches the last SCR in section A;
Fig. 11C shows an example where the first SCR in section D is higher than the last SCR in section C;
Fig. 11D shows an example where the last SCR in section E is higher than the first SCR in section F;

Fig. 11E shows the graph for the continuity of VOBs of Fig. 11A for two specific VOBs;
Fig. 12A shows a detailed expansion of the data hierarchy in the RTRW management file;
Fig. 12B shows the PTM descriptor format;
Fig. 12C shows the data construction of the audio gap location information;
Fig. 13 shows the buffer occupancy for each of a former VOB and a latter VOB;
Fig. 14A shows examples of audio frames and video frames;
Fig. 14B shows the time difference g1 that appears at the end of the audio data and picture data when the reproduction time of picture data and the reproduction time of audio data are aligned at the start of a VOB;
Fig. 14C shows the audio pack G3 including the audio gap and the audio pack G4, audio pack G3 including (i) the sets of audio data y-2, y-1, and y, which are located at the end of VOB#1, and (ii) the Padding_Packet, and audio pack G4 including the sets of audio data u, u+1, and u+2, which are located at the start of VOB#2;
Fig. 14D shows into which of VOBU#1, VOBU#2, and VOBU#3 at the start of VOB#2 the audio pack G3 including the audio gap is arranged;
Figs. 15A to 15D show the procedure for the regeneration of the audio gap when the VOBUs located at the start of VOB#2, out of the VOBs #1 and #2 that are to be reproduced seamlessly, are deleted;
Fig. 16 shows an example system configuration using the video data editing apparatus of the first embodiment;
Fig. 17 is a block diagram showing the hardware construction of the DVD recorder 70;
Fig. 18 shows the construction of the MPEG encoder 2;
Fig. 19 shows the construction of the MPEG decoder 4;
Fig. 20 is a timing chart showing the timing for the switching of switches SW1 to SW4;
Fig. 21 is a flowchart showing the procedure of the seamless processing;
Fig. 22 is also a flowchart showing the procedure of the seamless processing;
Figs. 23A and 23B show the analysis of the transition in the buffer state for audio packs;
Fig. 23C shows the area that is to be read from the former VOB in step S106;
Fig. 23D shows the area that is to be read from the latter VOB in step S107;
Fig. 24A shows the audio frames in the audio stream that correspond to the audio frames x, x+1, y, u, u+1, u+2 used in Fig. 22;
Fig. 24B shows the case when First_SCR+STC_offset corresponds to a boundary between audio frames in the former VOB;
Fig. 24C shows the case when the video reproduction start time VOB_V_S_PTM+STC_offset corresponds to a boundary between audio frames in the former VOB;
Fig. 24D shows the case when the presentation end time of the video frame y corresponds to a boundary between audio frames in the latter VOB;
Fig. 25 shows how the audio packs storing audio data for a plurality of audio frames and the video packs storing video data for each video frame are multiplexed;
Fig. 26 shows an example of the section of a VOB that is specified using time information for a pair of C_V_S_PTM and C_V_E_PTM;
Fig. 27A shows the area that is to be read from the former cell in step S106;
Fig. 27B shows the area that is to be read from the latter cell in step S107;
Fig. 28A shows an example of the linking of sets of cell information that are specified as the editing boundaries in a VOBU;
Fig. 28B shows the processing for the three rules for reconstructing GOPs when correcting the display order and coding order;
Fig. 29A shows the processing when changing a picture type of picture data in the former cell;
Fig. 29B shows the procedure for measuring the change in the buffer occupancy when changing a picture type in the former cell;
Fig. 30A shows the processing when changing the picture type of the latter cell;
Fig. 30B shows the procedure for measuring the change in the buffer occupancy when changing a picture type in the latter cell;
Fig. 31 is a flowchart showing the procedure for the seamless processing;
Fig. 32 is also a flowchart showing the procedure for the seamless processing;
Fig. 33 is also a flowchart showing the procedure for the seamless processing;
Fig. 34 shows the audio frames in the audio stream that correspond to the audio frames x, x+1, and y used in the flowchart of Fig. 31;
Fig. 35 shows the hierarchical directory structure;
Fig. 36 shows the information, aside from the sector management table and AV block management table shown in Fig. 6, in the management information for the file system;
Fig. 37 shows the linked relations shown by the arrows in Fig. 6 within the directory structure;
Fig. 38A shows the data construction of file entries in greater detail;
Fig. 38B shows the data construction of the allocation descriptors;
Fig. 38C shows the recorded state of the upper 2 bits in the data showing the extent length;
Fig. 39A shows the detailed data construction of the file identification descriptor for a directory;
Fig. 39B shows the detailed data construction of the file identification descriptor for a file;
Fig. 40 is a model showing the buffering in the track buffer of AV data read from the DVD-RAM;
Fig. 41 is a functional block diagram showing the construction of the DVD recorder 70 divided by function;
Fig. 42 shows an example of an interactive screen displayed on the TV monitor 72 under the control of the recording-editing-reproduction control unit 12;
Fig. 43 is a flowchart showing the processing by the recording-editing-reproduction control unit 12 for a virtual edit and for a real edit;
Figs. 44A to 44F show a supplementary example to illustrate the processing of the AV data editing unit 15 in the flowchart of Fig. 43;
Figs. 45A to 45E show a supplementary example to illustrate the processing of the AV data editing unit 15 in the flowchart of Fig. 43;
Figs. 46A to 46F show a supplementary example to illustrate the processing of the AV data editing unit 15 in the flowchart of Fig. 43;
Fig. 47A shows the relationship between the extents and the in-memory data, in terms of time;
Fig. 47B shows the positional relationship between the extents, the In area and the Out area;
Fig. 48A is a flowchart showing the processing by the AV file system unit 11 when executing a "SPLIT" command;
Fig. 48B is a flowchart showing the processing when a "SHORTEN" command is issued;
Fig. 49 is a flowchart showing the processing when a "MERGE" command is issued;
Fig. 50 is a flowchart for the case when the former extent is below AV block length but the latter extent is at least equal to AV block length;

Figs. 51A-51B are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 50;
Figs. 52A to 52C are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 50;
Figs. 53A to 53D are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 50;
Figs. 54A-54D are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 50;
Fig. 55 is a flowchart for the case when the former extent is at least equal to AV block length but the latter extent is below AV block length;
Figs. 56A-56B are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 55;
Figs. 57A-57C are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 55;
Figs. 58A-58D are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 55;
Figs. 59A-59D are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 55;
Fig. 60 is a flowchart for the case when both the former extent and the latter extent are below AV block length;
Figs. 61A-61D are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 60;
Figs. 62A-62C are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 60;
Figs. 63A-63C are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 60;
Figs. 64A-64D are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 60;
Fig. 65 is a flowchart for the case when both the former extent and the latter extent are at least equal to AV block length;
Figs. 66A-66D are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 65;
Fig. 67 is a flowchart showing the case when both the former extent and the latter extent are at least equal to AV block length but the data sizes of the In area and Out area are insufficient;
Figs. 68A-68E are a supplementary example showing the processing of the AV file system unit 11 in the flowchart of Fig. 67;
Figs. 69A-69D are a supplementary example showing the processing of the defragmentation unit 16;
Fig. 70A shows the detailed hierarchical content of the RTRW management file in the fourth embodiment;
Fig. 70B is a flowchart showing the logical format of the original PGC information in the fourth embodiment;
Fig. 70C is a flowchart showing the logical format of the user-defined PGC information in the fourth embodiment;
Fig. 70D shows the logical format of the title search pointer;
Fig. 71 shows the inter-relationships between the AV file, the extents, the VOBs, the VOB information, the original PGC information, and the user-defined PGC information, with the unified elements being enclosed in the frames drawn with the heavy lines;
Fig. 72 shows an example of a user-defined PGC and an original PGC;
Fig. 73 shows the part that corresponds to the cell to be deleted using diagonal shading;
Fig. 74A shows which ECC blocks are freed into empty areas by a real edit using the user-defined PGC information #2;
Fig. 74B shows examples of VOBs, VOB information, and PGC information after a real edit;
Fig. 75 is a functional block diagram showing the construction of the DVD recorder 70 divided according to function;
Fig. 76 shows an example of original PGC information that has been generated by the user-defined PGC information generator 25 when recording an AV file;
Fig. 77A shows an example of graphics data that is displayed on the TV monitor 72 under the control of the recording-editing-reproduction control unit 12;
Fig. 77B shows an example of the PGC information and cell information that are displayed as a list of operation targets;
Fig. 78A is a flowchart showing the processing during partial reproduction of a title;
Fig. 78B shows how only the section between the presentation start time C_V_S_PTM and the presentation end time C_V_E_PTM is reproduced, out of the VOBUs between the VOBU (START) and the VOBU (END);
Figs. 79A, 79B show the user pressing the mark key while viewing video images on the TV monitor 72;
Figs. 80A, 80B show how data is inputted and outputted between the components shown in Fig. 75 when a marking operation is performed;
Fig. 81 is a flowchart showing the processing of the editing multi-stage control unit 26 when defining user-defined PGC information;
Fig. 82 is a flowchart showing the processing of the editing multi-stage control unit 26 when defining user-defined PGC information;
Fig. 83 is a flowchart showing the processing of the recording-editing-reproduction control unit 12 during a preview and a real edit;
Fig. 84 is a flowchart showing the update processing for the PGC information after a real edit;
Fig. 85 shows an example of the interactive screen that is displayed on the TV monitor 72 to have the user make a selection of cell information as an element in a set of user-defined PGC information during a virtual edit;
Figs. 86A, 86B show the relationship between the user operation of the remote controller 71 and the display processing that accompanies the user operation;
Figs. 87A to 87D show the relationship between the user operation of the remote controller 71 and the display processing that accompanies the user operation;
Figs. 88A, 88B show the relationship between the user operation of the remote controller 71 and the display processing that accompanies the user operation;
Figs. 89A, 89B show the relationship between the user operation of the remote controller 71 and the display processing that accompanies the user operation;
Fig. 90 shows an example of the interactive screen that has the user select a set of user-defined PGC information for a preview (using the play key) or a real edit (using the real edit key);
Fig. 91 shows an example of the original PGC information table and user-defined PGC information table, when the user-defined PGC information #2 composed of CELL#2B, CELL#4B, CELL#10B, and CELL#5B and the user-defined PGC information #3 composed of CELL#3C, CELL#6C, CELL#8C, CELL#9C have been defined;
Figs. 92A-92B show the relationship between the user operation of the remote controller 71 and the display processing that accompanies the user operation;
Figs. 93A-93C show the relationship between the user operation of the remote controller 71 and the display processing that accompanies the user operation;
Figs. 94A-94C show the relationship between the user operation of the remote controller 71 and the display processing that accompanies the user operation; and
Fig. 95 shows the original PGC information table and the user-defined PGC information table after the processing of VOBs in a real edit.

DESCRIPTION OF THE PREFERRED EMBODIMENTS
The following embodiments describe a video data editing apparatus and the optical disc which the video data editing apparatus uses as a recording medium. For ease of explanation, the explanation is divided into four embodiments that deal with the physical structure of the optical disc, the logical structure, the hardware structure of the video data editing apparatus, and the functional construction of the video data editing apparatus.
The first embodiment explains the physical structure of the optical disc and the hardware structure of the video data editing apparatus, as well as the seamless linking of video objects as the first basic example of video editing.
The second embodiment explains the seamless linking of partial sections of video objects as the second basic example. The third embodiment deals with the functional construction of the video data editing apparatus and the procedure for realizing video editing within a file system. The fourth embodiment describes the data structures and procedure of the video data editing apparatus when performing a two-stage editing process composed of virtual editing and real editing of two types of program chain called a user-defined PGC and an original PGC.

(1-1) Physical Structure of a Recordable Optical Disc
Fig. 2A shows the external appearance of a DVD-RAM disc that is a recordable optical disc. As shown in this drawing, the DVD-RAM is loaded into a video data editing apparatus having been placed into a cartridge 75. This cartridge 75 protects the recording surface of the DVD-RAM, and has a shutter 76 which opens and closes to allow access to the DVD-RAM enclosed inside.
Fig. 2B shows the recording area of a DVD-RAM disc which is a recordable optical disc. As shown in the figure, the DVD-RAM has a lead-in area at its innermost periphery and a lead-out area at its outermost periphery, with the data area in between. The lead-in area records the necessary reference signals for the stabilization of a servo during access by an optical pickup, and identification signals to prevent confusion with other media. The lead-out area records the same type of reference signals as the lead-in area. The data area, meanwhile, is divided into sectors which are the smallest unit by which the DVD-RAM can be accessed. Here, the size of each sector is set at 2 KB.
Fig. 2C shows the cross-section and surface of a DVD-RAM cut at the header of a sector. As shown in the figure, each sector is composed of a pit sequence that is formed in the surface of a reflective film, such as a metal film, and a concave-convex part.
The pit sequence is composed of 0.4 µm-1.87 µm pits that are carved into the surface of the DVD-RAM to show the sector address. The concave-convex part is composed of a concave part called a "groove" and a convex part called a "land". Each groove and land has a recording mark composed of a metal film capable of phase change attached to its surface. Here, the expression "capable of phase change" means that the recording mark can be in a crystalline state or a non-crystalline state depending on whether the metal film has been exposed to a light beam. Using this phase change characteristic, data can be recorded into this concave-convex part. While it is only possible to record data onto the land part of an MO (Magneto-Optical) disc, data can be recorded onto both the land and the groove parts of a DVD-RAM, meaning that the recording density of a DVD-RAM exceeds that of an MO disc. Error correction information is provided on a DVD-RAM for each group of 16 sectors. In this specification, each group of 16 sectors that is given an ECC (Error Correcting Code) is called an ECC block.
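Because error correction is applied per group of 16 sectors, the ECC block that holds a given sector, and the amount of user data one block covers, follow directly from the 2 KB sector size. A small sketch of that arithmetic (the function names are illustrative, not taken from the specification):

    SECTOR_SIZE = 2 * 1024          # user data per DVD-RAM sector
    SECTORS_PER_ECC_BLOCK = 16      # one ECC block covers 16 sectors

    def ecc_block_of(sector_number: int) -> int:
        """Index of the ECC block that contains the given sector."""
        return sector_number // SECTORS_PER_ECC_BLOCK

    def ecc_block_user_bytes() -> int:
        """User data covered by one ECC block: 16 x 2 KB = 32 KB."""
        return SECTORS_PER_ECC_BLOCK * SECTOR_SIZE

    assert ecc_block_of(15) == 0 and ecc_block_of(16) == 1
    assert ecc_block_user_bytes() == 32 * 1024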
On a DVD-RAM, the data area is divided ~o ~e~eral zone~ to realize ~otation control called Z-CLV~Zone-Constant Linear Velocity) during recording and reproduction.
Fig. 3A show~ the plurality of zones provided on a DvD-RAM. A3 shown in the figure, a DVD-RAM i~ divided to 24 zones numbered zone 0-zone 23. Each zone is a group of tracks that are acceQsed using the sa~e angular velocity In this embodiment, each zone includes 18~8 tracks. The rotational angular velocity of the DVD-RAM
i~ ~et separately for each zone, with this ~elocity being higher the closer a zone is located to the inner periphery of the disc. Division of the data area into zones ensures that the optical pickup can move at a constant velocity ~hile performing acce~Y within a single zone. By doing ~o, the recording density of DVD-RAM is raised, and rotation control during recording ~UM IYY~ Y~ :u~ 4~uu4 and reproduction is made ea~ier.
Fig. 3B shows a horizontal arrangement of the lead-in area, the lead-out area, and the zones 0-23 that are shown in Fig. 3A.
The lead-in area and lead-out area each include a defect management area (DMA: Defect Management Area).
This defect management area records position information showing the positions of sectors found to include defects and replacement position information showing whether the sectors used for replacing defective sectors are located in any of the replacement areas.
Each zone has a user area, in addition to a replacement area and an unused area that are provided at the boundary with the next zone. A user area is an area that the file system can use as a recording area. The replacement area is used to replace defective sectors when such defective sectors are found. The unused area is an area that is not used for recording data. Only two tracks are used as the unused area, with such unused area being provided to prevent mistaken identification of sector addresses. The reason for this is that while sector addresses are recorded at a same position in adjacent tracks within the same zone, for Z-CLV the recording positions of sector addresses are different for adjacent tracks at the boundaries between zones.

In this way, sectors which are not used for data recording exist at the boundaries between zones. On a DVD-RAM, logical sector numbers (LSN: Logical Sector Number) are assigned to physical sectors of the user area in order starting from the inner periphery to consecutively show only the sectors used for recording data. As shown in Fig. 3C, the area that records user data and is composed of sectors that have been assigned an LSN is called the volume area.
The volume area is used for recording AV files that are each composed of a plurality of VOBs and an RTRW (Real Time Rewritable) management file that is the management information for the AV files. These AV files and RTRW management file are in fact recorded in a file system according to ISO/IEC 13346, although this will not be explained in the present embodiment. The file system is dealt with in detail in the third embodiment below.

(1-2) Data Recorded in the Volume Area
Fig. 4A shows the content of the data recorded in the volume area of a DVD-RAM.
The video stream and audio stream shown on the fifth level of Fig. 4A are divided into units of around 2KB, as shown on the fourth level. The units obtained through this division are interleaved into VOB#1 and VOB#2 in the AV file shown on the third level as video packs and audio packs in compliance with MPEG standard.
The AV file is split into a plurality of extents as shown on the second level, in compliance with ISO/IEC 13346, and these extents are each stored in an empty area within one zone in the volume area, as shown on the first level of Fig. 4A.
Information for VOB#1-VOB#3 is recorded in an RTRW management file as the VOB#1 information, VOB#2 information, and VOB#3 information shown on the fifth level. In the same way as an AV file, this RTRW management file is divided into a plurality of extents that are recorded in empty areas in the volume area.
The following explanation will deal with video streams, audio streams, and VOBs separately, having first explained the hierarchical structure of MPEG standard and DVD-RAM standard which define the data structures of these elements.
Fig. 4B shows the hierarchical structure of the data definitions used under MPEG standard. The data structure for MPEG standard is composed of an elementary stream layer and a system layer.
The elementary stream layer shown in Fig. 4B includes a video layer that defines the data structure of video streams, an MPEG-Audio layer that defines the data structure of an MPEG-Audio stream, an AC-3 layer that defines the data structure of an audio stream under Dolby-AC3 methods, and a Linear-PCM layer that defines the data structure of an audio stream under Linear-PCM methods. The presentation start time (Presentation Start Time) and presentation end time (Presentation End Time) are defined within the elementary stream layer, though, as shown by the separate boxes used for the video layer, MPEG-Audio layer, AC-3 layer, and Linear-PCM layer, the data structures of the video stream and the audio stream are independent of each other. The presentation start time and presentation end time of a video frame and the presentation start time and presentation end time of an audio frame are similarly not synchronized. The system layer shown in Fig. 4B defines the packs, packets, DTS and PTS that are described later.
In Fig. 4B, the system layer is shown in a separate box to the video layer and audio layer, showing that the packs, packets, DTS and PTS are independent of the data structures of the video streams and audio streams.
While the above layer structure is used for MPEG standard, DVD-RAM standard includes the system layer under MPEG standard shown in Fig. 4B and an elementary stream layer. In addition to the packs, packets, DTS, and PTS described above, DVD standard defines the data structures of the VOBs shown in Fig. 4A.

(1-2-1) Video Stream
The video stream shown in Fig. 5A has a data structure that is defined by the video layer shown in Fig. 4B. Each video stream is composed of an arrangement of a plurality of sets of picture data that each correspond to one frame of video images. This picture data is a video signal according to NTSC (National Television Standards Committee) or PAL (Phase-Alternation Line) standard that has been compressed using MPEG techniques. Sets of picture data produced by compressing a video signal under NTSC standard are displayed by video frames that have a frame interval of around 33msec (1/29.97 seconds to be precise), while sets of picture data produced by compressing a video signal under PAL standard are displayed by video frames that have a frame interval of 40msec. The top level of Fig. 5A shows examples of video frames. In Fig. 5A, the sections indicated between the "<" and ">" symbols are each a video frame, with the "<" symbol showing the presentation start time (Presentation_Start_Time) for each video frame and the ">" symbol showing the presentation end time (Presentation_End_Time) for each video frame. This notation for video frames is also used in the following drawings. The sections which are enclosed by these symbols each include a plurality of video fields.
As shown in Fig. 5A, the picture data that should be displayed for a video frame is inputted into a decoder before the Presentation_Start_Time of the video frame and must be taken from the buffer by the decoder at the Presentation_Start_Time.
When compression is performed in accordance with MPEG standards, the spatial frequency characteristics within the image of one frame and the time-related correlation with images that are displayed before or after the one frame are used. By doing so, each set of picture data is converted into one of a Bidirectionally Predictive (B) Picture, a Predictive (P) Picture, or an Intra (I) Picture. A B picture is used where compression is performed using the time-related correlation with images that are reproduced both before and after the present image. A P picture is used where compression is performed using the time-related correlation with images that are reproduced before the present image. An I picture is used where compression is performed using the spatial frequency characteristics within one frame without using time-related correlation with other images. Fig. 5A shows B pictures, P pictures, and I pictures as all having the same size, although it should be noted that there is in fact great variation in their sizes.
When decoding a B picture or a P picture that uses the time-related correlation between frames, it is necessary to refer to the images that are to be reproduced before or after the picture being decoded. For example, when decoding a B picture, the decoder has to wait until the decoding of the following image has been completed.
As a result, an MPEG video stream defines the coding order of each picture in addition to defining the display order of the pictures. In Fig. 5A, the second and third levels respectively show the sets of picture data arranged in display order and in coding order.
In Fig. 5A, the reference target of one of the B pictures is shown by the broken line to be the following I picture. In the display order, this I picture follows the B picture, though since the B picture is compressed using time-related correlation with the I picture, the decoding of the B picture has to wait for the decoding of the I picture to be completed. As a result, the coding order defines that the I picture comes before the B picture. This rearranging of the display order of pictures when generating the coding order is called "reordering".
As shown on the third level of Fig. 5A, each set of picture data is divided into 2KB units after being arranged into the coding order. The resulting 2KB units are stored as a video pack sequence, as shown on the bottom level of Fig. 5A.
When a sequence of B pictures and P pictures is used, problems can be caused, such as by special reproduction features that perform decoding starting midway through the video stream. To prevent such problems, an I picture is inserted into the video data at 0.5 second intervals. Each sequence of picture data starting from an I picture and continuing as far as the next I picture is called a GOP (Group Of Pictures), with GOPs being defined in the system layer of MPEG standard as the unit for MPEG compression. On the third level of Fig. 5A, the dotted vertical line shows the boundary between the present GOP and the following GOP. In each GOP, the picture type of the picture data that is arranged last in the display order is a P picture, while the picture type of the picture data that is arranged first in the coding order must be an I picture.
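As a simple illustration of the reordering described above, the following sketch converts one GOP from display order into a coding order in which every anchor picture (I or P) is emitted before the B pictures that precede it in display order. The picture-type string and the function name are invented for the example and do not reproduce any particular encoder's behaviour.

    #include <stdio.h>
    #include <string.h>

    /* Toy reordering of one GOP from display order into coding order:
     * each anchor picture (I or P) is emitted ahead of the B pictures
     * displayed just before it, because those B pictures cannot be
     * decoded until the anchor has been decoded. */
    static void display_to_coding(const char *display, char *coding) {
        char pending[64];                 /* B pictures waiting for their anchor */
        size_t npend = 0, out = 0;
        for (size_t i = 0; display[i] != '\0'; i++) {
            if (display[i] == 'B') {
                pending[npend++] = 'B';
            } else {                      /* 'I' or 'P': an anchor picture */
                coding[out++] = display[i];
                memcpy(coding + out, pending, npend);
                out += npend;
                npend = 0;
            }
        }
        memcpy(coding + out, pending, npend);   /* trailing Bs, if any */
        coding[out + npend] = '\0';
    }

    int main(void) {
        char coding[64];
        display_to_coding("IBBPBBPBBP", coding);
        printf("display order: IBBPBBPBBP\ncoding  order: %s\n", coding);
        return 0;
    }

Running the sketch prints the coding order "IPBBPBBPBB"-style sequence, showing the I and P pictures pulled ahead of the B pictures that reference them.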

(1-2-2) Audio Stream
The audio stream is data that has been compressed according to one of Dolby-AC3 method, MPEG method, and Linear-PCM. Like a video stream, an audio stream is generated using audio frames that have a fixed frame interval. Fig. 5B shows the correspondence between the audio frames and audio data. In detail, the reproduction period of an audio frame is 32msec for Dolby-AC3, 24msec for MPEG, and around 1.67msec (1/600 sec to be precise) for Linear-PCM.
The top level of Fig. 5B shows example audio frames. In Fig. 5B, each section indicated between the "<" and ">" symbols is an audio frame, with the "<" symbol showing the presentation start time and the ">" symbol showing the presentation end time. This notation for audio frames is also used in the following drawings. The audio data that should be displayed for an audio frame is inputted into a decoder before the presentation start time of the audio frame and should be taken out of the buffer by the decoder at the presentation start time.
The bottom level of Fig. 5B shows an example of how the audio data to be reproduced in each frame is stored in audio packs. In this figure, the audio data to be reproduced for audio frames f81, f82 is stored in audio pack A71, the audio data to be reproduced for audio frame f84 is stored in audio pack A72, and the audio data to be reproduced for audio frames f86, f87 is stored in audio pack A73. The audio data to be reproduced for audio frame f83 is divided between the audio pack A71 that comes first and the audio pack A72 which comes later. In the same way, the audio data to be reproduced for audio frame f85 is divided between the audio pack A72 that comes first and the audio pack A73 which comes later. The reason the audio data to be reproduced for one audio frame is stored divided between two audio packs is that the boundaries between audio frames and video frames do not match the boundaries between packs. The reason that such boundaries do not match is that the data structure of packs under MPEG standard is independent of the data structure of video streams and audio streams.
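The splitting of an audio frame across two packs can be made concrete with the short sketch below, which packs fixed-size audio frames into 2KB audio packs. The per-pack payload size, the frame size, and the frame/pack numbering are assumptions chosen only for illustration; they will not reproduce the exact split shown in Fig. 5B.

    #include <stdio.h>

    /* Toy packing of fixed-size audio frames into 2KB audio packs.
     * PAYLOAD is an assumed usable size after the pack/packet headers. */
    #define PAYLOAD     2016   /* assumed usable bytes per 2KB audio pack */
    #define FRAME_BYTES 1536   /* e.g. Dolby-AC3 at 384 kbps, 32 ms frame */

    int main(void) {
        int pack = 1, room = PAYLOAD;
        for (int frame = 1; frame <= 7; frame++) {
            int left = FRAME_BYTES;
            printf("audio frame f8%d ->", frame);
            while (left > 0) {
                int chunk = left < room ? left : room;
                printf(" %d bytes in pack A7%d", chunk, pack);
                left -= chunk;
                room -= chunk;
                if (room == 0) { pack++; room = PAYLOAD; }
            }
            printf("\n");
        }
        return 0;
    }

Because the frame size does not divide the pack payload, some frames inevitably straddle a pack boundary, which is exactly the situation described above.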
(1-2-3) Data Structure of VOBs
The VOBs (Video Objects) #1, #2, #3, ... shown in Fig. 4A are program streams under ISO/IEC 13818-1 that are obtained by multiplexing a video stream and an audio stream, although these VOBs do not have a program_end_code at the end.

Fig. 6A shows the detailed hierarchy for the logical construction of VOBs. This means that the logical format located on the highest level of Fig. 6A is shown in more detail in the lower levels.
The video stream that is located on the highest level in Fig. 6A is shown divided into a plurality of GOPs on the second level, with these GOPs having been shown in Fig. 5A. As in Fig. 5A, the picture data in GOP units is divided into a large number of 2KB units. On the other hand, the audio stream shown on the left of the highest level in Fig. 6A is divided into a large number of approximately 2KB units on the third level in the same way as in Fig. 5B. The picture data for a GOP unit that is divided into 2KB units is interleaved with the audio stream that is similarly divided into approximately 2KB units. This produces the pack sequence on the fourth level of Fig. 6A. This pack sequence forms a plurality of VOBUs (Video Object Units) that are shown on the fifth level, with the VOB (Video Object) shown on the sixth level being composed of a plurality of these VOBUs arranged in a time series. In Fig. 6A, the guidelines drawn using broken lines show the relations between the data in the data structures on adjacent levels. By referring to the guidelines in Fig. 6A, it can be seen that the VOBUs on the fifth level correspond to the pack sequence on the fourth level and the picture data in GOP units shown on the second level.
As can be seen by tracing the guidelines, each VOBU is a unit that includes at least one GOP composed of picture data with a reproduction period of around 0.4 to 1.0 second and audio data that has been interleaved with this picture data. At the same time, each VOBU is composed of an arrangement of video packs and audio packs under MPEG standard. The unit called a GOP under MPEG standard is defined by the system layer, although when only video data is specified by a GOP, as shown on the second level of Fig. 6A, the audio data and other data (such as sub-picture data and control data) that is multiplexed with the video data is not indicated by the GOP. Under DVD-RAM standard, the expression "VOBU" is used for a unit that corresponds to a GOP, with this unit being a general name for at least one GOP composed of picture data with a reproduction period of around 0.4 to 1.0 second and the audio data that has been interleaved with this picture data.
Here, it is possible for parts of a VOB to be deleted, with the minimum unit being one VOBU. As one example, the video stream recorded on a DVD-RAM as a VOB may contain images for a commercial that are not wanted by the user. The VOBUs in this VOB include at least one GOP that composes the commercial and audio data that is interleaved with this picture data, so that if only the VOBUs in the VOB that correspond to the commercial can be deleted, the user will then be able to watch the video stream without having to watch the commercial.
Here, even if one VOBU is deleted, for example, the VOBUs on either side of the deleted VOBU will include a part of the video stream in GOP units that each have an I picture located at their front. This means that a normal decode and reproduction process are possible, even after the deletion of the VOBU.
Fig. 6B shows an example where part of a VOB is deleted. This VOB originally includes VOBU#1, VOBU#2, VOBU#3, VOBU#4, ... VOBU#7. When the deletion of VOBU#2, VOBU#4, and VOBU#6 is indicated, the areas that were originally occupied by these VOBUs are freed and so are shown as empty areas on the second level of Fig. 6B.
When the VOB is reproduced thereafter, the reproduction order is VOBU#1, VOBU#3, VOBU#5, and VOBU#7.
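The effect of deleting in VOBU units can be illustrated with the following toy example, which simply drops the freed VOBUs from the reproduction sequence. The array contents mirror the example of Fig. 6B; the code is illustrative only.

    #include <stdio.h>

    /* Toy illustration of partial deletion in VOBU units: freed VOBUs are
     * dropped from the reproduction sequence, and the remaining VOBUs stay
     * decodable because each still starts with its own GOP (I picture). */
    int main(void) {
        int vobu[]    = {1, 2, 3, 4, 5, 6, 7};
        int deleted[] = {0, 1, 0, 1, 0, 1, 0};   /* VOBU#2, #4, #6 freed */
        int n = sizeof vobu / sizeof vobu[0];

        printf("reproduction order after deletion:");
        for (int i = 0; i < n; i++)
            if (!deleted[i])
                printf(" VOBU#%d", vobu[i]);
        printf("\n");   /* -> VOBU#1 VOBU#3 VOBU#5 VOBU#7 */
        return 0;
    }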
The video packs and audio packs included in a VOBU each have a data length of 2KB. This 2KB size matches the sector size of a DVD-RAM, so that each video pack and audio pack is recorded in a separate sector.
The arrangement of video packs and audio packs corresponds to the arrangement of an equal number of consecutive logical sectors, and the data held within these packs is read from the DVD-RAM. This is to say, the arrangement of video packs and audio packs refers to the order in which these packs are read from the DVD-RAM. Since each video pack is approximately 2KB in size, if the data size of the video stream for one VOBU is several hundred KB, for example, the video stream will be stored having been divided into several hundred video packs.

(1-2-3-1) Data Structure of Video Packs and Audio Packs
Figs. 6C to 6E show the logical format of the video packs and audio packs stored in a VOBU. Normally, a plurality of packets are inserted into one pack in an MPEG system stream, although under DVD-RAM standard, the number of packets that may be inserted into one pack is restricted to one. Fig. 6C shows the logical format of a video pack arranged at the start of a VOBU. As shown in Fig. 6C, the first video pack in a VOBU is composed of a pack header, a system header, a packet header, and video data that is part of the video stream.
Fig. 6D shows the logical format of the video packs that do not come first in the VOBU. As shown in Fig. 6D, these video packs are each composed of a pack header, a packet header, and video data, with no system header.

Fig. 6E shows the logical format of the audio packs. As shown in Fig. 6E, each audio pack is composed of a pack header, a packet header, a sub_stream_id showing whether the compression method used for the audio stream included in the present pack is Linear-PCM or Dolby-AC3, and audio data that is part of the audio stream and has been compressed according to the indicated method.
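The three pack layouts just described can be summarized, purely as a sketch, with the C structures below. The byte counts for the header fields are placeholders, not the bit-exact MPEG or DVD-RAM encodings; only the overall composition (which headers appear in which pack, and the 2KB total size) reflects the description above.

    #include <stdint.h>
    #include <stdio.h>

    enum { PACK_SIZE = 2048 };

    struct first_video_pack {           /* first video pack of a VOBU          */
        uint8_t pack_header[14];        /* Pack_Start_Code, SCR, mux rate      */
        uint8_t system_header[18];      /* rate bound / buffer bound info      */
        uint8_t packet_header[19];      /* may carry PTS and DTS               */
        uint8_t video_data[PACK_SIZE - 14 - 18 - 19];
    };

    struct video_pack {                 /* any other video pack: no sys header */
        uint8_t pack_header[14];
        uint8_t packet_header[19];
        uint8_t video_data[PACK_SIZE - 14 - 19];
    };

    struct audio_pack {
        uint8_t pack_header[14];
        uint8_t packet_header[19];
        uint8_t sub_stream_id;          /* Linear-PCM or Dolby-AC3             */
        uint8_t audio_data[PACK_SIZE - 14 - 19 - 1];
    };

    int main(void) {
        printf("first video pack: %zu bytes\n", sizeof(struct first_video_pack));
        printf("other video pack: %zu bytes\n", sizeof(struct video_pack));
        printf("audio pack:       %zu bytes\n", sizeof(struct audio_pack));
        return 0;
    }

Each structure totals 2048 bytes, matching the one-pack-per-sector arrangement noted earlier.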

(1-2-3-2) Buffer Control within a VOB
The video stream and audio stream are stored in video packs and audio packs as described above.
However, in order to seamlessly reproduce VOBs, it is not sufficient to simply store the video stream and audio stream in video packs and audio packs; it is also necessary to suitably arrange the video packs and audio packs to ensure that buffer control will be uninterrupted. The buffers referred to here are input buffers for temporarily storing the video stream and the audio stream before input into a decoder. Hereinafter, the separate buffers are referred to as the video buffer and the audio buffer, with specific examples being shown as the video buffer 4b and the audio buffer 4d in Fig. 19. Uninterrupted buffer control refers to input control for the buffers that ensures that overflow or underflow do not occur for either input buffer. This is described in more detail later, but is fundamentally achieved by assigning time stamps (showing the correct times for the input, output, and display of data) that are standardized for an MPEG stream to the pack header and packet header shown in Fig. 6D and Fig. 6E. If no underflows or overflows occur for the video buffer and audio buffer, no interruptions will occur in the reproduction of the video streams and audio streams. As will be clear from this specification, it is very important that buffer control is uninterrupted.
There is a time limitation whereby each set of audio data needs to be transferred to the audio buffer and decoded by the presentation start time of the audio frame to be reproduced by such data, but since audio streams are encoded using fixed-length encoding with a relatively small amount of data, the data that is required for the reproduction of each audio frame can be stored in audio packs. These audio packs are transferred to the audio buffer during reproduction, meaning that the time limitation described above can be easily managed.
Fig. 7A is a figure showing the ideal buffer operation for the audio buffer. This figure shows how the buffer occupancy changes for a sequence of audio frames. In this specification, the term "buffer occupancy" refers to the extent to which the capacity of a buffer is being used to store data. The vertical axis of Fig. 7A shows the occupancy of the audio buffer, while the horizontal axis represents time. This time axis is split into 32msec sections, which matches the reproduction period of each audio frame in the Dolby-AC3 method. By referring to this graph, it can be seen that the occupancy of the buffer changes over time to exhibit a sawtooth pattern.
The height of each triangular tooth that composes the sawtooth pattern represents the amount of data in the part of the audio stream to be reproduced in each audio frame.
The gradient of each triangular tooth represents the transfer rate of the audio stream. This transfer rate is the same for all audio frames.
During the period corresponding to one triangular tooth, audio data is accumulated with a constant transfer rate during the display period (32msec) of the audio frame preceding the audio frame that is reproduced by this audio data. At the presentation end time of the preceding audio frame (this time representing the decode time for the present frame), the audio data for the present frame is instantly outputted from the audio buffer. The reason a sawtooth pattern is achieved is that the processing from the storage in the buffer to output from the buffer is continually repeated.
As one example, assume that transfer of an audio stream to the audio buffer begins at time T1. This audio data should be reproduced at time T2, so that the amount of data stored in the audio buffer will gradually increase between time T1 and time T2 due to the transfer of this audio data. However, because this transferred audio data is output at the presentation end time of the preceding audio frame, the audio buffer will be cleared of audio data at that point, so that the occupancy of the audio buffer returns to 0. In Fig. 7A, the same pattern is repeated between time T2 and time T3, between time T3 and time T4, and so on.
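A minimal simulation of this ideal sawtooth is sketched below: during each 32msec audio frame the data for the next frame trickles in at a constant rate and is removed in one piece at the frame boundary. The transfer rate and frame size are assumptions chosen only so that the shape is visible; they are not values taken from the specification.

    #include <stdio.h>

    int main(void) {
        const double frame_sec   = 0.032;      /* Dolby-AC3 frame period        */
        const double in_rate     = 48000.0;    /* assumed transfer rate, bytes/s */
        const double frame_bytes = 1536.0;     /* bytes consumed per audio frame */
        double occupancy = 0.0;

        for (int frame = 0; frame < 5; frame++) {
            occupancy += in_rate * frame_sec;          /* fill during the frame */
            printf("end of frame %d: peak %.0f bytes", frame, occupancy);
            occupancy -= frame_bytes;                  /* instant decode        */
            printf(" -> after decode %.0f bytes\n", occupancy);
        }
        return 0;
    }

With these numbers the occupancy rises to a peak and returns exactly to zero at every frame boundary, tracing the triangular teeth of Fig. 7A.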
The buffer operation shown in Fig. 7A is the ideal buffer operation state for the premise where the audio data to be reproduced in each audio frame is stored in one audio pack. In reality, however, it is normal for audio data that will be reproduced in several different audio frames to be stored in one audio pack, as shown in Fig. 5B. Fig. 7B shows a more realistic operation for the audio buffer. In this figure, audio pack A31 stores audio data A21, A22, and A23 which should respectively be decoded by the presentation end times of audio frames f21, f22, and f23. As shown in Fig. 7B, only the decoding of audio data A21 will be completed at the presentation end time of audio frame f21, with the decoding of the other sets of audio data A22 and A23 being respectively completed by the presentation end times of the following audio frames f22 and f23. Of the audio data included in this audio pack, audio data A21 should be decoded first, with the decoding of this audio data needing to be completed by the presentation end time of audio frame f21.
Accordingly, this audio pack should be read from the DVD-RAM during the reproduction period of the audio frame f21.
Video streams are encoded with variable code length due to the large differences in code size between the different types of pictures (I pictures, P pictures, and B pictures) used in compression methods that use time-related correlation. Video streams also include a significant amount of data, so that it is difficult to complete the transfer of the picture data for a video frame, especially the picture data for an I picture, by the presentation end time of the preceding video frame.
Fig. 7C is a graph showing video frames and the occupancy of the video buffer. In Fig. 7C, the vertical axis represents the occupancy of the video buffer, while the horizontal axis represents time. This horizontal axis is split into 33msec sections which each match the reproduction period of a video frame under NTSC standard. By referring to this graph, it can be seen that the occupancy of the video buffer changes over time to exhibit a sawtooth pattern.
The height of each triangular tooth that composes the sawtooth pattern represents the amount of data in the part of the video stream to be reproduced in each video frame. As mentioned before, the amount of data in each video frame is not equal, since the amount of code for each video frame is dynamically assigned according to the complexity of the frame.
The gradient of each triangular tooth shows the transfer rate of the video stream. The approximate transfer rate of the video stream is calculated by subtracting the output rate of the audio stream from the output rate of the track buffer. This transfer rate is the same during each frame period.
During the period corresponding to one triangular tooth in Fig. 7C, picture data is accumulated with a constant transfer rate during the display period (33msec) of the video frame preceding the video frame that is reproduced by this picture data. At the presentation end time of the preceding video frame (this time representing the decode time for the present picture data), the picture data for the present frame is instantly outputted from the video buffer. The reason a sawtooth pattern is achieved is that the processing from the storage in the video buffer to output from the video buffer is continually repeated.
When the image to be displayed in a given video frame is complex, a larger amount of code needs to be assigned to this frame. When a larger amount of code is assigned, this means that the pre-storage of data in the video buffer needs to be commenced well in advance. Normally, the period from the transfer start time, at which the transfer of picture data into the video buffer is commenced, to the decode time for the picture data is called the VBV (Video Buffering Verifier) delay. In general, the more complex the image, the larger the amount of assigned code and the longer the VBV delay.
As can be seen from Fig. 7C, the transfer of the picture data that is decoded at the presentation end time T16 of the preceding video frame is commenced at time T11. The transfer of picture data that is decoded at the presentation end time T18 of the preceding video frame, meanwhile, is commenced at time T12. The transfer of the picture data for other video frames can be seen to be commenced at times T14, T15, T17, T19, T20, and T21.
Fig. 7D shows the transfer of sets of picture data in more detail. When considering the situation in Fig. 7C, the transfer of the picture data to be decoded at time T24 in Fig. 7D needs to be completed in the "Tf_Period" between the start time T23 of the "VBV delay" and the start of the transfer of the picture data for the next video frame to be reproduced. The increase in the occupancy of the buffer that occurs from this Tf_Period onwards is caused by the transfer of the picture data for the image to be displayed in the next video frame.
The picture data accumulated in the video buffer waits for the time T24 at which the picture data is to be decoded. At the decode time T24, the image A is decoded, which clears part of the picture data stored in the video buffer, thereby reducing the total occupancy of the video buffer.
When considering the above situation, it can be seen that while it is sufficient for the transfer of audio data to be reproduced in a certain audio frame to be commenced around one frame in advance, the transfer of picture data for a certain video frame needs to be commenced well before the decode time of such picture data. In other words, the audio data which should be reproduced in a certain audio frame should be inputted into the audio buffer at around the same time as picture data for a video frame that is well in advance of the audio frame. This means that when the audio stream and video stream are multiplexed into an MPEG stream, audio data needs to be multiplexed well before the corresponding picture data. As a result, a VOBU is in fact composed of audio data and of video data that will be reproduced later than that audio data.
The arrangement of the plurality of video packs and audio packs has been described as reflecting the transfer order of the data included in the packs.
Accordingly, to have the audio data to be reproduced in an audio frame read at approximately the same time as the picture data to be reproduced in a video frame that is well ahead of the audio frame, the audio packs and video packs that store the audio data and picture data in question need to be arranged into a same part of the VOB.
Fig. 8A shows how the audio packs, which store the audio data to be reproduced in each audio frame, and the video packs, which store the picture data to be reproduced in each video frame, should be stored.
In Fig. 8A, the rectangles marked with "V" and "A" show each video pack and audio pack. Fig. 8B shows the meaning of the width and height of each of these rectangles. As shown in Fig. 8B, the height of each rectangle shows the bitrate used to transfer the pack. As a result, packs that have a tall height are transferred with a high bitrate, which means that the pack can be inputted into a buffer relatively quickly. Packs that are not tall, however, are transferred with a low bitrate, and so take a relatively long time to be transferred into the buffer.
The picture data V11 that is decoded at time T11 in Fig. 8B is transferred during the period k11. Since the transfer and decoding of the audio data A11 are performed during this period k11, the video packs that store the video data V11 and the audio pack that stores the audio data A11 are arranged into a similar position, as shown in the lower part of Fig. 8A.
The picture data V12 that is decoded at time T12 in Fig. 8A is transferred during the period k12. Since the transfer and decoding of the audio data A12 are performed during this period k12, the video packs that store the video data V12 and the audio pack that stores the audio data A12 are arranged into a similar position, as shown in the lower part of Fig. 8A.
In the same way, the audio data A13, A14, and A15 are arranged into similar positions as the picture data V13 and V14 whose transfer is commenced at the output time of these sets of audio data.
Note that when picture data with a large amount of assigned code, such as picture data V16, is accumulated in the buffer, a plurality of sets of audio data A15, A16, and A17 are multiplexed during k16, which is the transfer period of the picture data V16.
Fig. 9 shows how audio packs that store a plurality of sets of audio data to be reproduced in a plurality of audio frames and video packs that store picture data to be reproduced in each video frame may be stored. In Fig. 9, audio pack A31 stores the audio data A21, A22, and A23 that is to be reproduced for audio frames f21, f22, and f23. Of the audio data that is stored in the audio pack A31, the first audio data to be decoded is the audio data A21. Since the audio data A21 needs to be decoded at the presentation end time of the audio frame f20, this audio data A21 needs to be read from the DVD-RAM together with the picture data V11 whose transfer is performed during the same period (period k11) as the audio frame f20. As a result, the audio pack A31 is arranged near the video packs that store the picture data V11.
When considering that an audio pack can store audio data which should be decoded for several audio frames, and that audio packs are arranged in similar positions to video packs that are composed of picture data which should be decoded in the future, it may seem that the audio data and picture data to be decoded at the same time should be stored in audio packs and video packs that are at distant positions within a VOB.
However, there will be no cases where video packs which store picture data that will be decoded one second or more later are stored alongside audio data that should be decoded at the same time. This is because MPEG standard defines the upper limit for the time data can be accumulated in the buffer, with all data having to be outputted from the buffer within one second of being inputted into the buffer. This restriction is called the "one-second rule" for MPEG standard. Because of the one-second rule, even if audio data and picture data that are to be decoded at the same time are arranged into distant positions, the audio pack that stores the audio data to be decoded at a given time will definitely be stored within a range of 3 VOBUs from the VOBU that stores the picture data to be decoded at the same given time.
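The one-second rule can be checked mechanically: for every pack, the time between its input (SCR) and its decoding (DTS) must not exceed one second. The sketch below performs that check over an invented pack list, with times expressed in 90 kHz clock ticks; the data values are only examples.

    #include <stdint.h>
    #include <stdio.h>

    #define TICKS_PER_SEC 90000LL

    struct pack { long long scr, dts; };

    /* Returns 1 if no pack stays in the buffer for more than one second. */
    static int obeys_one_second_rule(const struct pack *p, int n) {
        for (int i = 0; i < n; i++)
            if (p[i].dts - p[i].scr > TICKS_PER_SEC)
                return 0;
        return 1;
    }

    int main(void) {
        struct pack vob[] = {
            {      0,  60000 },     /* buffered for ~0.67 s : fine      */
            {  30000, 115000 },     /* buffered for ~0.94 s : fine      */
            {  60000, 160000 },     /* buffered for ~1.11 s : violation */
        };
        printf("one-second rule %s\n",
               obeys_one_second_rule(vob, 3) ? "satisfied" : "violated");
        return 0;
    }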


(1-2-3-2-2) Buffer Control between VOBs
The following explanation deals with the buffer control that is performed when reproducing two or more VOBs successively. Fig. 10A shows the buffer state for the first part of a video stream. In Fig. 10A, the input of the pack that includes the picture data is commenced at the point indicated as First_SCR during the video frame f71, with the amount of data shown as BT2 being transferred by the presentation end time of the video frame f72. Similarly, the amount of data BT3 has been accumulated in the buffer by the presentation end time of the video frame f73. This data is read from the video buffer by the video decoder at the presentation end time of the video frame f74, with this time being indicated hereafter by the notation First_DTS. In this way, the state of the buffer changes as shown in Fig. 10A, with no data for a preceding video stream at the start and the accumulated amount of data gradually increasing to trace a triangular shape. Note here that Fig. 10A is drawn with the premise that a video pack is inputted at the time First_SCR, although when the pack positioned at the front of a VOB is a different pack, the start of the increase in the amount of buffered data will not match the time First_SCR. Also, the reason the Last_SCR is positioned midway through a video frame is that the data structure of the pack is unrelated to the data structure of the video data.
Fig. 10B shows the buffer state during the latter part of a video stream. In this drawing, the input of data into the video buffer is completed at the time Last_SCR that is located midway through video frame f61. After this, only the data amount B3 of the accumulated video data is taken from the video buffer at the presentation end time of video frame f61. Following this, it can be seen that only the data amount B4 is taken from the video buffer at the presentation end time of video frame f62, and only the data amount B5 is taken at the presentation end time of video frame f63, this latter time also being called the Last_DTS.
For the latter part of a VOB, the input of video packs and audio packs is completed by the time shown as Last_SCR in Fig. 10B, so that the amount of data stored in the video buffer will thereafter decrease in steps at the decoding of video frames f61, f62, f63 and f64. As a result, the occupancy of the buffer decreases in steps at the end of a video stream, as shown in Fig. 10B.
Fig. 10C shows the buffer state across VOBs. In more detail, this drawing shows the case where the latter part of a video stream that causes the buffer state shown in Fig. 10B is seamlessly linked to the former part of another video stream that causes the buffer state shown in Fig. 10A.
When these two video streams are seamlessly linked, the First_DTS of the former part of the second video stream to be reproduced needs to follow after the video frame with the Last_DTS of the latter part of the first video stream. In other words, the decoding of the first video frame in the second video stream needs to be performed after the decoding of the video frame with the final decode time in the first video stream. If the interval between the Last_DTS of the latter part of the first video stream and the First_DTS of the former part of the second video stream is equivalent to one video frame, the picture data of the latter part of the first video stream will coexist in the video buffer with the picture data of the former part of the second video stream, as shown in Fig. 10C.
In Fig. 10C, it is assumed that the video frames f71, f72, and f73 shown in Fig. 10A match the video frames f61, f62, and f63 shown in Fig. 10B. In such conditions, at the presentation end time of video frame f71, the picture data BE1 of the latter part of the first video stream and the picture data BT1 of the former part of the second video stream are present in the video buffer. At the presentation end time of the video frame f72, the picture data BE2 of the latter part of the first video stream and the picture data BT2 of the former part of the second video stream are present in the video buffer. At the presentation end time of the video frame f73, the picture data BE3 of the latter part of the first video stream and the picture data BT3 of the former part of the second video stream are present in the video buffer. As the decoding of video frames progresses, the picture data of the latter part of the first video stream decreases in steps, while the picture data of the former part of the second video stream gradually increases. These decreases and increases occur concurrently, so that the buffer state shown in Fig. 10C exhibits a sawtooth pattern which closely resembles the buffer state shown for VOBs in Fig. 7C.
It should be noted here that each of the total BT1+BE1 of the data amount BT1 and the data amount BE1, the total BT2+BE2 of the data amount BT2 and the data amount BE2, and the total BT3+BE3 of the data amount BT3 and the data amount BE3 is below the capacity of the video buffer. Here, if any of these totals BT1+BE1, BT2+BE2, or BT3+BE3 exceeds the capacity of the video buffer, an overflow will occur in the video buffer. If the highest of these totals is expressed as Bv1+Bv2, this value Bv1+Bv2 must be within the capacity of the video buffer.
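The overflow check described above amounts to verifying that, at every matched frame boundary, the residue of the former VOB and the preload of the latter VOB fit together in the video buffer. The following sketch does exactly that; the byte counts and the assumed buffer capacity are invented for the example.

    #include <stdio.h>

    int main(void) {
        const long capacity = 224 * 1024;                /* assumed capacity for the sketch */
        const long BE[] = { 180000, 120000,  60000 };    /* former VOB residue at f71-f73   */
        const long BT[] = {  40000,  90000, 140000 };    /* latter VOB preload at f71-f73   */
        long worst = 0;

        for (int i = 0; i < 3; i++)
            if (BE[i] + BT[i] > worst)
                worst = BE[i] + BT[i];

        printf("worst case Bv1+Bv2 = %ld bytes -> %s\n", worst,
               worst <= capacity ? "seamless join is safe"
                                 : "video buffer would overflow");
        return 0;
    }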

(1-2-3-3) Pack Header, System Header, Packet Header
The information for the buffer control described above is written as time stamps in the pack header, the system header, and the packet header shown in Figs. 6F-6H. Figs. 6F-6H show the logical formats of the pack header, the system header, and the packet header. As shown in Fig. 6F, the pack header includes a Pack_Start_Code, an SCR (System Clock Reference) showing the time at which the data stored in the present pack should be inputted into the video buffer and audio buffer, and a Program_mux_rate. In a VOB, the first SCR is set as the initial value of the STC (System Time Clock) that is provided as a standard feature in a decoder under MPEG standard.
The system header shown in Fig. 6G is only appended to the video pack that is located at the start of a VOBU. This system header includes maximum rate information (shown as the "Rate.bound.info" in Fig. 6G) showing the transfer rate to be requested of the reproduction apparatus when inputting the data, and buffer size information (shown as "Buffer.bound.info" in Fig. 6G) showing the highest buffer size to be requested of the reproduction apparatus when inputting the data in the VOBU.
The packet header shown in Fig. 6H includes a DTS (Decoding Time Stamp) showing the decoding time and, for a video stream, a PTS (Presentation Time Stamp) showing the time at which data should be outputted after reordering the decoded video stream. The PTS and DTS are set based on the presentation start time of a video frame or audio frame. In the data construction, a PTS and a DTS can be set for all packs, although it is rare for such information to be assigned to the picture data that should be displayed for every video frame. It is common for such information to be assigned once in a GOP, which is to say once every 0.5 seconds of reproduction time. Every video pack and audio pack is assigned an SCR, however.
For a video stream, it is common for a PTS to be assigned to each video frame in a GOP, though for an audio stream, it is common for a PTS to be assigned every one or two audio frames. For an audio stream, there will be no difference between the display order and the coding order, so that no DTS is required. When one audio pack stores all of the audio data that is to be reproduced for two or more audio frames, a PTS is written at the start of the audio pack.
As one example, the audio pack A71 shown in Fig. 5B may be given the presentation start time of the audio frame f81 as the PTS. On the other hand, the audio pack A72 that stores the divided audio frame f83 must be given the presentation start time of the audio frame f84, not the presentation start time of the audio frame f83, as the PTS. This is also the case for the audio pack A73, which must be given the presentation start time of the audio frame f86, not the presentation start time of the audio frame f85, as the PTS.
(1-2-3-4) Continuity of Time Stamps
The following is an explanation of the values that are set as the PTS, DTS, and SCR for video packs and audio packs, as shown in Figs. 6F to 6H.
Fig. 11A is a graph showing the values of the SCR of the packs included in a VOB in the order that the packs are arranged in the VOB. The horizontal axis shows the order of the video packs, while the vertical axis shows the value of the SCR which is assigned to each pack.
The first value of the SCR in Fig. 11A is not zero, and is instead a predetermined value shown as Init1. The reason the first value of the SCR is not zero is that the VOBs that are processed by a video editing apparatus are subjected to many editing operations, so that there are many cases where the first part of a VOB will have already been deleted. It should be obvious that the initial value of the SCR of a VOB that has just been encoded will be zero, although the present embodiment assumes that the initial value of the SCR for a VOB is not zero, as shown in Fig. 11A.
In Fig. 11A, the closer a video pack is to the start of the VOB, the lower the value of the SCR of that video pack, and the further a video pack is from the start of the VOB, the higher the value of the SCR of that video pack. This characteristic is referred to as the "continuity of time stamps", with the same continuity being exhibited by the DTS. Though the coding order of video packs is such that a latter video pack may in fact be displayed before a former video pack, meaning that the PTS of the latter pack has a lower value than the former pack, the PTS will still exhibit a rough continuity in the same way as the SCR and the DTS.
The SCR of audio packs exhibits continuity in the same way as for video packs.
The continuity of the SCR, DTS, and PTS is a prerequisite for the proper decoding of VOBs. The following is an explanation of the values used for the SCR to maintain this continuity.
In Fig. 11B, the straight line showing the values of SCR in the section B is an extension of the straight line showing the values of SCR in the section A. This means that there is continuity between the values of SCR between section A and section B.
In Fig. 11C, the first value of SCR in the section D is higher than the largest value on the straight line showing the values of SCR in the section C. However, in this case also, the closer a pack is to the start of the VOB, the lower the value of SCR, and the further a video pack is from the start of the VOB, the higher the value of SCR. This means that there is continuity of the time stamps between section C and section D.
Here, when the difference in time stamps is large, these stamps are naturally non-continuous. Under MPEG standard, the difference between pairs of time stamps, such as SCRs, must not exceed 0.7 seconds, so that areas in the data where this value is exceeded are treated as being non-continuous.
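The continuity test just described can be expressed as a small check over the SCR sequence: values must never go backwards, and no jump between consecutive packs may exceed 0.7 seconds. The sample data below is invented; SCR values are given in 90 kHz clock ticks.

    #include <stdio.h>

    #define MAX_GAP_TICKS (90000LL * 7 / 10)   /* 0.7 seconds */

    static int scr_is_continuous(const long long *scr, int n) {
        for (int i = 1; i < n; i++) {
            if (scr[i] < scr[i - 1])                  return 0;  /* goes backwards */
            if (scr[i] - scr[i - 1] > MAX_GAP_TICKS)  return 0;  /* gap too large  */
        }
        return 1;
    }

    int main(void) {
        long long section_cd[] = { 1000, 1400, 1800, 40000, 40400 };  /* jump < 0.7 s   */
        long long section_ef[] = { 90000, 90400, 5000 };              /* goes backwards */
        printf("C-D: %s\n", scr_is_continuous(section_cd, 5) ? "continuous" : "non-continuous");
        printf("E-F: %s\n", scr_is_continuous(section_ef, 3) ? "continuous" : "non-continuous");
        return 0;
    }

Streams that fail this check would be managed as separate VOBs, as described in the following paragraphs.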
In Fig. 11D, the last value of SCR in section E is higher than the first value on the straight line showing the values of SCR in section F. In this case, the continuity wherein the closer a pack is to the start of the VOB, the lower the value of SCR, and the further a video pack is from the start of the VOB, the higher the value of SCR, is no longer valid, so that there is no continuity in the time stamps between section E and section F.
When there is no continuity in the time stamps, as in the example of section E and section F, the former and latter sections are managed as separate VOBs.
It should be noted that the details of buffer control between VOBs and the multiplexing method are described in detail in the PCT publications WO 97/13367 and WO 97/13363.

(1-2-4) AV Files
An AV file is a file that records at least one VOB that is to be reproduced consecutively. When a plurality of VOBs are held within one AV file, these VOBs are reproduced in the order they are stored in the AV file. For the example in Fig. 4, the three VOBs, VOB#1, VOB#2, and VOB#3, are stored in one AV file, with these VOBs being reproduced in the order VOB#1 -> VOB#2 -> VOB#3. When VOBs are stored in this way, the buffer state for the video stream positioned at the end of the first VOB to be reproduced and the video stream positioned at the start of the next VOB to be reproduced will be as shown in Fig. 10C. Here, if the highest amount of data Bv1+Bv2 to be stored in the buffer exceeds the capacity of the buffer, or if the first time stamp in the VOB to be reproduced second is not continuous with the last time stamp in the VOB to be reproduced first, there is the danger that seamless reproduction will not be possible for the first and second VOBs.

(1-3) Logical Construction of the RTRW Management File
The following is an explanation of the RTRW management file. The RTRW management file is information showing attributes for each VOB stored in an AV file.
Fig. 12A shows the detailed hierarchical structure in which data is stored in the RTRW management file. The logical format shown on the right of Fig. 12A is a detailed expansion of the data shown on the left, with the broken lines serving as guidelines to clarify which parts of the data structure are being expanded.
By referring to the data structure in Fig. 12B, it can be seen that the RTRW management file records VOB information for VOB#1, VOB#2, VOB#3, ... VOB#6, and that this VOB information is composed of VOB general information, stream attribute information, a time map table, and seamless linking information.

(1-3-1) VOB General Information
The "VOB general information" refers to the VOB-ID that is uniquely assigned to each VOB in an AV file and to the VOB reproduction period information of each VOB.

(1-3-2) Stream Attribute Information
The stream attribute information is composed of video attribute information and audio attribute information.
The video attribute information includes video format information that indicates one of MPEG2 and MPEG1, and a display method that indicates one of NTSC and PAL/SECAM. When the video attribute information indicates NTSC, an indication such as "720x480" or "352x240" may be given as the display resolution, and an indication such as "4:3" or "16:9" may be given as the aspect ratio. The presence/absence of copy prevention control for an analog video signal may also be indicated, as may the presence/absence of a copy guard for a video cassette recorder which damages the AGC circuit of a VTR by changing the signal amplitude during the blank period of a video signal.
The audio attribute information shows the encoding method, which may be one of MPEG2, Dolby Digital, or Linear-PCM, the sampling frequency (such as 48kHz), a bitrate when a fixed bitrate is used, or a bitrate marked with "VBR" when a variable bitrate is used.
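As an illustrative C view of the attributes just listed, the sketch below collects them into two structures. The enumerators, field names, and example values are paraphrases for the sketch only; they are not the on-disc encoding defined by the RTRW management file.

    #include <stdio.h>

    enum video_format   { VF_MPEG1, VF_MPEG2 };
    enum display_method { DM_NTSC, DM_PAL_SECAM };
    enum audio_coding   { AC_MPEG2, AC_DOLBY_DIGITAL, AC_LINEAR_PCM };

    struct video_attributes {
        enum video_format   format;
        enum display_method display;
        int  width, height;          /* e.g. 720x480 or 352x240 for NTSC */
        int  aspect_w, aspect_h;     /* 4:3 or 16:9                      */
        int  analog_copy_protect;    /* presence/absence flags           */
        int  vcr_copy_guard;
    };

    struct audio_attributes {
        enum audio_coding coding;
        int  sampling_hz;            /* e.g. 48000                       */
        int  bitrate_bps;            /* fixed bitrate, or 0 ...          */
        int  variable_bitrate;       /* ... with this flag set for "VBR" */
    };

    int main(void) {
        struct video_attributes v = { VF_MPEG2, DM_NTSC, 720, 480, 4, 3, 1, 0 };
        struct audio_attributes a = { AC_DOLBY_DIGITAL, 48000, 384000, 0 };
        printf("video: %dx%d, audio: %d Hz\n", v.width, v.height, a.sampling_hz);
        return 0;
    }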

The time map table shows the size of each VOBU that composes the VOB and the reproduction period of each VOBU. To improve accessing capabilities, representative VOBUs are selected at a predetermined interval, such as a multiple of ten seconds, and the addresses and reproduction times of these representative VOBUs are given relative to the start of the VOB.
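The access improvement can be pictured with the following sketch, which uses a small table of representative VOBUs to turn a target playback time into a nearby address relative to the start of the VOB. The entries and the 10-second spacing are invented example values, not data read from a disc.

    #include <stdio.h>

    struct time_map_entry {
        double   time_sec;       /* reproduction time relative to VOB start */
        unsigned address;        /* sector offset relative to VOB start     */
    };

    /* Returns the address of the last representative VOBU whose time is
     * not later than the target time t. */
    static unsigned seek_address(const struct time_map_entry *map, int n, double t) {
        int best = 0;
        for (int i = 0; i < n; i++)
            if (map[i].time_sec <= t)
                best = i;
        return map[best].address;
    }

    int main(void) {
        struct time_map_entry map[] = {
            {  0.0,     0 }, { 10.0,  5200 }, { 20.0, 10600 }, { 30.0, 15750 },
        };
        printf("start reading near sector %u for t=23s\n",
               seek_address(map, 4, 23.0));   /* -> the entry for 20 s */
        return 0;
    }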

(1-3-3) Seamless Linking Information
The seamless linking information is information which enables the consecutive reproduction of the plurality of VOBs in the AV file to be performed seamlessly. This seamless linking information includes the seamless flag, the video presentation start time VOB_V_S_PTM, the video presentation end time VOB_V_E_PTM, the First_SCR, the Last_SCR, the audio gap start time A_STP_PTM, the audio gap length A_GAP_LEN, and the audio gap location information A_GAP_LOC.

(1-3-3-1) Seamless Flag
The seamless flag is a flag showing whether the VOB corresponding to the present seamless linking information is reproduced seamlessly following the end of reproduction of the VOB positioned immediately before the present VOB in the AV file. When this flag is set at "01", the reproduction of the present VOB (the latter VOB) is performed seamlessly, while when the flag is set at "00", the reproduction of the present VOB is not performed seamlessly.
In order to perform the reproduction of a plurality of VOBs seamlessly, the relationship between the former VOB and the latter VOB must be as follows.
(1) Both VOBs must use the same display method (NTSC, PAL, etc.) for the video stream as given in the video attribute information.
(2) Both VOBs must use the same encoding method (AC-3, MPEG, Linear-PCM) for the audio stream as given in the audio attribute information.
Failure to comply with the above conditions prevents seamless reproduction from being performed.
When a different display method is used for a video stream or a different encoding method is used for an audio stream, the video decoder and audio decoder will have to stop their respective operations to switch the display method, decoding method, and/or bit rate.
As one example, when two audio streams that are to be reproduced consecutively are such that the former audio stream has been encoded according to AC-3 methods and the latter according to MPEG methods, an audio decoder will have to stop decoding to switch the stream attributes when the stream switches from AC-3 to MPEG. A similar situation also occurs for a video decoder when the video stream changes. The seamless flag is only set to "01" when both of the above conditions (1) and (2) are satisfied. If
any one of the above conditions (1) and (2) is not satisfied, the seamless flag is set at "00".
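A sketch of how the two conditions determine the flag is given below: the flag can only be "01" when the former and latter VOB share the same video display method and the same audio encoding method. The enum names and the decision function are illustrative only.

    #include <stdio.h>

    enum display_method { DM_NTSC, DM_PAL_SECAM };
    enum audio_coding   { AC_MPEG, AC_AC3, AC_LINEAR_PCM };

    struct vob_attr { enum display_method display; enum audio_coding audio; };

    static const char *seamless_flag(struct vob_attr former, struct vob_attr latter) {
        if (former.display == latter.display && former.audio == latter.audio)
            return "01";                       /* seamless reproduction allowed */
        return "00";                           /* decoder must stop and switch  */
    }

    int main(void) {
        struct vob_attr a = { DM_NTSC, AC_AC3 };
        struct vob_attr b = { DM_NTSC, AC_AC3 };
        struct vob_attr c = { DM_NTSC, AC_MPEG };
        printf("a->b: flag %s\n", seamless_flag(a, b));   /* 01 */
        printf("a->c: flag %s\n", seamless_flag(a, c));   /* 00 */
        return 0;
    }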
(1-3-3-2) Video Presentation Start Time VOB_V_S_PTM
The video presentation start time VOB_V_S_PTM shows the time at which reproduction of the first video field in the video streams composing a VOB is to start.
This time is given in PTM descriptor format. PTM descriptor format is a format whereby the time is expressed with an accuracy of 1/27,000,000 seconds or 1/90,000 seconds (=300/27,000,000 seconds).
This accuracy of 1/90,000 seconds is set considering the common multiples of the frame frequencies of NTSC signals, PAL signals, Dolby AC-3, and MPEG Audio, while the accuracy of 1/27,000,000 seconds is set considering the frequency of the STC.
Fig. 12B shows the PTM descriptor format. In this drawing, the PTM descriptor format is composed of a base element (PTM base) that shows the quotient when the presentation start time is divided by 1/90,000 seconds and an extension element (PTM extension) that shows the remainder when the same presentation start time is divided by the base element to an accuracy of 1/27,000,000 seconds.
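The base/extension split can be shown with a short conversion sketch: a time counted in 27 MHz ticks is stored as a quotient in 1/90,000-second units (ticks/300) plus a remainder in the range 0-299. The struct layout and field widths below are illustrative only.

    #include <stdint.h>
    #include <stdio.h>

    struct ptm_descriptor {
        uint64_t base;        /* quotient in 1/90,000 second units   */
        uint32_t extension;   /* remainder, 0..299, in 27 MHz ticks  */
    };

    static struct ptm_descriptor to_ptm(uint64_t ticks_27mhz) {
        struct ptm_descriptor p = { ticks_27mhz / 300, (uint32_t)(ticks_27mhz % 300) };
        return p;
    }

    static uint64_t from_ptm(struct ptm_descriptor p) {
        return p.base * 300 + p.extension;
    }

    int main(void) {
        uint64_t one_ntsc_frame = 27000000ULL * 1001 / 30000;   /* ~33.37 ms */
        struct ptm_descriptor p = to_ptm(one_ntsc_frame);
        printf("base=%llu extension=%u (round trip %s)\n",
               (unsigned long long)p.base, p.extension,
               from_ptm(p) == one_ntsc_frame ? "ok" : "lossy");
        return 0;
    }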

(1-3-3-3) Video Presentation End Time VOB_V_E_PTM
The video presentation end time VOB_V_E_PTM shows the time at which reproduction of the last video field in the video streams composing a VOB ends. This time is also given in PTM descriptor format.

(1-3-3-4) Relation between Video Presentation Start Time VOB_V_S_PTM and Video Presentation End Time VOB_V_E_PTM
The following is an explanation of the relation between the VOB_V_E_PTM of a former VOB and the VOB_V_S_PTM of a latter VOB, when the former VOB and latter VOB are to be seamlessly reproduced. The latter VOB is fundamentally to be reproduced after all of the video packs included in the former VOB, so that if the VOB_V_S_PTM of the latter VOB is not equal to the VOB_V_E_PTM of the former VOB, the time stamps will not be continuous, meaning that the former VOB and latter VOB cannot be reproduced seamlessly. However, when the two VOBs have been encoded completely separately, the encoder will have assigned a unique time stamp to each video pack and audio pack during encoding, so that the condition for the VOB_V_S_PTM of the latter VOB to be equal to the VOB_V_E_PTM of the former VOB becomes problematic.
Fig. 13 shows the state of the buffer for the former VOB and the latter VOB. In the graphs in Fig. 13, the vertical axis shows the occupancy of the buffer while the horizontal axis represents time. The times representing the SCR, PTS, video presentation end time VOB_V_E_PTM, and video presentation start time VOB_V_S_PTM have been plotted. In the upper part of Fig. 13, the picture data that is reproduced last in the former VOB is inputted into the video buffer by the time indicated as the Last_SCR of the video pack composed by this picture data, with the reproduction processing of this data waiting until the PTS that is the presentation start time is reached (if the last pack inputted into an MPEG decoder is an audio or other pack, this condition is not valid). Here, the video presentation end time VOB_V_E_PTM shows the point where the display period h1 of this final video frame has expired, starting from this PTS. This display period h1 is the period taken to draw an image from the first field that composes one screen-sized image to the final field.
In the lower part of Fig. 13, the picture data that should be displayed first in the latter VOB is inputted into the video buffer at the time First_SCR, with the reproduction of this data waiting until the PTS indicating the presentation start time. In this drawing, the video packs of the former and latter VOBs are respectively assigned an SCR with the first value "0", a video presentation end time VOB_V_E_PTM, and a video presentation start time VOB_V_S_PTM. For this example, it can be seen that VOB_V_S_PTM of latter VOB < VOB_V_E_PTM of former VOB.
The following is an explanation of why seamless reproduction is possible even for the condition VOB_V_S_PTM of latter VOB < VOB_V_E_PTM of former VOB.
Under DVD-RAM standard, an extended STD model (hereinafter "E-STD") is defined as the standard model for the reproduction apparatus, as shown in Fig. 19. In general, an MPEG decoder has an STC (System Time Clock) for measuring a standard time, with the video decoder and audio decoder referring to the standard time shown by the STC to perform decode processing and reproduction processing. In addition to the STC, however, the E-STD has an adder for adding an offset to the standard time outputted by the STC, so that either of the standard time outputted by the STC and the addition result of the adder may be selected and outputted to the video decoder and the audio decoder. With this construction, even if the time stamps for different VOBs are not continuous, the output of the adder may be supplied to the decoder to have the decoder behave as if the time stamps of the VOBs were continuous. As a result, seamless reproduction is still possible even when the VOB_V_E_PTM of the former VOB and the VOB_V_S_PTM of the latter VOB are not continuous, as in the above example. The difference between the VOB_V_S_PTM of the latter VOB and the VOB_V_E_PTM of the former VOB can be used as the offset to be added by the adder. This is normally referred to as the "STC_offset". As a result, a reproduction apparatus of the E-STD model finds the STC_offset according to the formula shown below, which uses the VOB_V_S_PTM of the latter VOB and the VOB_V_E_PTM of the former VOB. After finding the STC_offset, the reproduction apparatus then sets the result in the adder.

STC_offset = VOB_V_E_PTM of former VOB - VOB_V_S_PTM of latter VOB

The reason the VOB_V_S_PTM of the latter VOB and the VOB_V_E_PTM of the former VOB are written in the seamless linking information is to enable the decoder to perform the above calculation and set the STC_offset in the adder.
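As a minimal sketch of the calculation above (illustrative only; the structure and field names are assumptions, and the times are assumed to be in a common clock unit), the STC_offset could be computed as follows in C.

    #include <stdint.h>

    /* Hypothetical VOB information holding the two presentation times
       described above, in a common clock unit. */
    struct vob_times {
        int64_t vob_v_s_ptm;  /* video presentation start time */
        int64_t vob_v_e_ptm;  /* video presentation end time   */
    };

    /* STC_offset = VOB_V_E_PTM of former VOB - VOB_V_S_PTM of latter VOB */
    static int64_t calc_stc_offset(const struct vob_times *former,
                                   const struct vob_times *latter)
    {
        return former->vob_v_e_ptm - latter->vob_v_s_ptm;
    }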
Fig. 11E is a graph that has been plotted for two VOBs in each of which the time stamps are continuous, as shown in Fig. 11A. The time stamp of the first pack in VOB#1 has the initial value Init1, with the packs following thereafter having increasingly higher values as their time stamps. In the same way, the time stamp of the first pack in VOB#2 has the initial value Init2, with the packs following thereafter having increasingly higher values as their time stamps. In Fig. 11E, the final value of the time stamps in VOB#1 is higher than the first value of the time stamps in VOB#2, so that it can be seen that the time stamps are not continuous across the two VOBs. When the decoding of the first pack in VOB#2 is desired following the final pack of VOB#1 regardless of the non-continuity of the time stamps, an STC_offset can be added to the time stamps in VOB#2, thereby shifting the time stamps in VOB#2 from the solid line shown in Fig. 11E to the broken line that continues as an extension of the time stamps in VOB#1. As a result, the shifted time stamps in VOB#2 can be seen to be continuous with the time stamps in VOB#1.

(1-3-3-5) First_SCR
The First_SCR shows the SCR of the first pack in a VOB, written in PTM descriptor format.

(1-3-3-6) Last_SCR
The Last_SCR shows the SCR of the last pack in a VOB, written in PTM descriptor format.

(1-3-3-7) Relationship between the First_SCR and Last_SCR
As described above, since the reproduction of VOBs is performed by a decoder of the E-STD type, the Last_SCR of the former VOB and the First_SCR of the latter VOB do not need to satisfy the condition that the Last_SCR of the former VOB equal the First_SCR of the latter VOB. However, when using an STC_offset, the following relationship must be satisfied.

Last_SCR of former VOB + time required to transfer one pack ≤ STC_offset + First_SCR of latter VOB

Here, if the Last_SCR of the former VOB and the First_SCR of the latter VOB do not satisfy the above relation, this means that the packs that compose the former VOB are transferred into the video buffer and audio buffer at the same time as the packs that compose the latter VOB. This violates the MPEG standard and the decoder model of E-STD, where packs are transferred one at a time in the order of the pack sequence. By referring to Fig. 10C, it can be seen that the Last_SCR of the former VOB matches the First_SCR of the latter VOB + STC_offset, so that the above relationship is satisfied.
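A minimal check of the relationship above might look as follows in C; this is a sketch only, with the times assumed to be in a common clock unit and pack_transfer_time standing for the time required to transfer one pack.

    #include <stdbool.h>
    #include <stdint.h>

    /* True when the pack transfers of the former and latter VOB do not
       overlap, i.e.
       Last_SCR(former) + one-pack transfer time
           <= STC_offset + First_SCR(latter). */
    static bool pack_transfers_do_not_overlap(int64_t last_scr_former,
                                              int64_t first_scr_latter,
                                              int64_t stc_offset,
                                              int64_t pack_transfer_time)
    {
        return last_scr_former + pack_transfer_time
               <= stc_offset + first_scr_latter;
    }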
When a VOB is reproduced using a decoder of the E-STD type, of particular note is the time at which switching is performed between outputting the standard time outputted by the STC and outputting the standard time with the offset added by the adder. Since no information for this switching is given in the time stamps of a VOB, there is the risk that improper timing will be used when switching to the output value of the adder.
The First_SCR and Last_SCR are effective for informing the decoder of the correct timing to switch to the output value of the adder. While the STC is counting, the decoder compares the standard time outputted by the STC with the First_SCR and Last_SCR. When the standard time outputted by the STC matches the First_SCR or Last_SCR, the decoder switches from the standard time outputted by the STC to the output value of the adder.
When reproducing VOBs, standard reproduction reproduces the latter VOB after reproducing the former VOB, while "rewind reproduction" (backward picture search) reproduces the former VOB after the latter VOB. Accordingly, the Last_SCR is used for switching the value used by the decoder during standard reproduction, and the First_SCR is used for switching the value used by the decoder during rewind reproduction. During rewind reproduction, the latter VOB is decoded starting from the last VOBU to the first VOBU, and when the first video pack in the latter VOB has been decoded, the former VOB is decoded starting from the last VOBU to the first VOBU. In other words, during rewind reproduction, the time at which the decoding of the first video pack in the latter VOB is complete is the time at which the value used by the decoder needs to be switched. To inform a video data editing apparatus of the E-STD type of this time, the First_SCR of each VOB is provided in the RTRW management file.
A more detailed explanation of the techniques used for E-STD and the STC_offset is given in PCT Publication WO97/13364.

(1-3-3-8) Audio Gap Start Time "A_STP_PTM"
When an audio reproduction gap exists in a VOB, the audio gap start time "A_STP_PTM" shows the halt start time at which the audio decoder should halt its operation. This audio gap start time is given in PTM descriptor format. One audio gap start time A_STP_PTM is indicated for one VOB.

(1-3-3-9) Audio Gap Length "A_GAP_LEN"
The audio gap length "A_GAP_LEN" shows how long the audio decoder should stop its operation, starting from the halt start time indicated as the audio gap start time "A_STP_PTM". The length of this audio gap length A_GAP_LEN is restricted to being less than the length of one audio frame.

(1-3-3-10) Inevitability of Audio Gaps
The following is an explanation of why a period where an audio gap occurs needs to be specified by the audio gap start time A_STP_PTM and audio gap length A_GAP_LEN.
Since video streams and audio streams are reproduced with different cycles, the total reproduction time of a video stream contained in a VOB does not match the total reproduction time of the audio stream. For example, if the video stream is for the NTSC standard and the audio stream is for Dolby AC-3, the total reproduction time of the video stream will be an integer multiple of 33 msec and the total reproduction time of the audio stream will be an integer multiple of 32 msec, as shown in Fig. 14A.
If seamless reproduction of two VOBs is performed without regard to these differences in total reproduction time, it will be necessary to align the reproduction time of one set of the picture data and the reproduction time of the audio data to synchronize the reproduction of the picture data with the audio data. In order to align such reproduction times, a difference in total time appears at either the start or the end of the picture data or audio data.
In Fig. 14B, the reproduction time of the picture data is aligned with the reproduction time of the audio data at the start of a VOB, so that the time difference g1 is present at the end of the picture data and audio data. Since the time difference g1 is present at the end of VOB#1, when seamless reproduction of VOB#1 and VOB#2 is attempted, the reproduction of the audio stream in VOB#2 is performed to fill in the time difference g1, meaning that the reproduction of the audio stream in VOB#2 starts at time g0. The audio decoder uses a fixed frame rate when reproducing an audio stream, so that the decoding of audio streams is continuously performed with a fixed cycle. When VOB#2 that is to be reproduced following VOB#1 has already been read from the DVD-RAM, the audio decoder can commence the decoding of VOB#2 as soon as it has completed the decoding of the audio stream in VOB#1.
To prevent the audio stream in the next VOB from being reproduced too early during seamless reproduction, the audio gap information in the stream is managed on the host side of a reproduction apparatus, so that during the audio gap period, the host needs to halt the operation of the audio decoder. This reproduction halt period is the audio gap, and starts from the audio gap start time A_STP_PTM and continues for the period indicated as A_GAP_LEN.
Processing to specify audio gaps is also performed within a stream. More specifically, the PTS of an audio frame immediately after an audio gap is written in the packet header of an audio packet, so that it is possible to specify when the audio gap ends.
However, problems arise with this specifying method when several sets of audio data that should be reproduced for several audio frames are stored in a single audio packet. In more detail, when several sets of audio data to be reproduced for several audio frames are stored in a single audio packet, it is only possible to provide a PTS for the first out of the plurality of audio frames in this packet. In other words, a PTS cannot be provided for the remaining audio frames in the packet. If the audio data that is to be reproduced for the audio frames located both before and after an audio gap is arranged into the same packet, it will not be possible to provide a PTS for the audio frame located immediately after the audio gap. As a result, it will not be possible to specify the audio gap, meaning that the audio gap will be lost. To avoid this, the audio frame located immediately after an audio gap is processed so as to be arranged at the front of the next audio pack, so that the PTS (audio gap start time A_STP_PTM and audio gap length A_GAP_LEN) of the audio frame immediately after the audio gap can be clarified within the stream.
Whenever necessary, a Padding_Packet, as prescribed by the MPEG standard, may be inserted immediately after the audio data in an audio packet that stores the audio data to be reproduced immediately before an audio gap. Fig. 14C shows audio pack G3, which includes an audio gap and which includes the audio data y-2, y-1, y to be reproduced for the audio frames y-2, y-1, y located at the latter part of VOB#1 shown in Fig. 14B, together with a Padding_Packet. This drawing also shows audio pack G4, which includes the audio frames u+1, u+2, and u+3 that are positioned at the front of VOB#2. The above-mentioned audio pack G4 is the pack that includes the audio data that is to be reproduced for the audio frame immediately after the audio gap, while audio pack G3 is the pack that is located immediately before this pack.
If the audio data to be reproduced for the audio frame located immediately after the audio gap is included in a pack, the pack located immediately before such a pack is called an "audio pack including an audio gap".
Here, the audio pack G3 is positioned toward the end of the video pack sequence in a VOBU, with no picture data with a later reproduction time being included in VOB#1. However, it is assumed that the reproduction of VOB#2 will follow the reproduction of VOB#1, so that the picture data included in VOB#2 is the picture data that should be read corresponding to the audio frames y-2, y-1, and y.
If this is the case, the audio pack G3 that includes the audio gap may be positioned within any of the first three VOBUs in VOB#2 without violating the "one-second rule". Fig. 14D shows that this audio pack G3 that includes the audio gap may be positioned within any of VOBU#1, VOBU#2, and VOBU#3 at the start of VOB#2.
The operation of the audio decoder needs to be temporarily halted for the period of the audio gap.
This is because the audio decoder will try to perform the decode processing even during the audio gap, so that the host control unit that performs the core control processing in a reproduction apparatus has to indicate an audio pause to the decoder once the reproduction of picture data and audio data has ended, thereby temporarily halting the audio decoder. This indication is shown as the ADPI (Audio Decoder Pause Information) in Fig. 19.
By doing so, the operation of the audio decoder can be stopped during the period of the audio gap.
However, this does not mean that the audio output can be stopped regardless of how an audio gap appears in the data.
This is because it is normal for the control unit to be composed of a standard microcomputer and software, so that depending on the circumstances for stopping the operation of the audio decoder, should audio gaps repeatedly occur during a short period of time, there is the possibility of the control unit not issuing the halt indication sufficiently early. As one example, when VOBs of approximately one second in length are reproduced consecutively, it becomes necessary to give a halt indication to the audio decoder at intervals of around one second. When the control unit is composed of a standard microcomputer and software, there is the possibility that the control unit will not be able to halt the audio decoder for the period where such audio gaps are present.
When reproducing VOBs, the reproduction time of the picture data and the reproduction time of the audio data have been aligned several times, with it being necessary to provide the audio decoder with a halt indication every time. When the control unit is composed of a standard microcomputer and software, there is the possibility that the control unit will not be able to halt the audio decoder for the period where such audio gaps are present. For this reason, the following restrictions are enforced so that audio gaps only occur once within a predetermined period.
First, to allow the control unit to perform the halt operation with ease, the reproduction period of VOBs is set at 1.5 seconds or above, thereby reducing the frequency with which audio gaps may occur.

Second, the alignment of the reproduction time of the picture data and the reproduction time of the audio data is only performed once in each VOB. By doing so, there will only be one audio gap in each VOB.
Third, the period of each audio gap is restricted to being less than one audio frame.
Fourth, the audio gap start time VOB_A_STP_PTM is set with the video presentation start time VOB_V_S_PTM of the following VOB as a standard, so that the audio gap start time VOB_A_STP_PTM is restricted to being within one audio frame of the following video presentation start time VOB_V_S_PTM.
As a result, VOB_V_S_PTM - (reproduction period of one audio frame) ≤ A_STP_PTM ≤ VOB_V_S_PTM.

If an audio gap that satisfies the above formula occurs, the first image in the following VOB will just have been displayed, so that even if there is no audio output at this time, this will not be particularly conspicuous to the user.
By providing the above restrictions, when audio gaps appear during seamless reproduction, the interval between the audio gaps will be at least "1.5 seconds - reproduction period of two audio frames". More specifically, by substituting actual values, the reproduction period of each audio frame will be 32 msec when Dolby AC-3 is used, so that the minimum interval between audio gaps is 1436 msec. This interval means that there is a high probability of the control unit being able to perform the halt control processing well within the deadline for the processing.
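The fourth restriction and the minimum-interval arithmetic can be illustrated by the following C sketch (not part of the specification; the units and names are assumptions, with the times in a common clock unit and the audio frame length in the same unit or in milliseconds as noted).

    #include <stdbool.h>
    #include <stdint.h>

    /* Fourth restriction:
       VOB_V_S_PTM - one audio frame <= A_STP_PTM <= VOB_V_S_PTM. */
    static bool audio_gap_start_is_valid(int64_t a_stp_ptm,
                                         int64_t vob_v_s_ptm,
                                         int64_t audio_frame_len)
    {
        return vob_v_s_ptm - audio_frame_len <= a_stp_ptm
               && a_stp_ptm <= vob_v_s_ptm;
    }

    /* Minimum interval between audio gaps: 1.5 s minus two audio frames.
       With 32 ms Dolby AC-3 frames, 1500 - 2 * 32 = 1436 ms. */
    static int64_t min_gap_interval_msec(int64_t audio_frame_msec)
    {
        return 1500 - 2 * audio_frame_msec;
    }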

(1-3-3-11) Audio Gap Location Information
The audio gap location information "A_GAP_LOC" is a 3-bit value that shows into which of the three VOBUs located at the start of the latter VOB the audio pack including the audio gap has been inserted. When the first bit in this value is "1", this shows the audio gap is present in VOBU#1. In the same way, the values "2" and "3" respectively show that the audio gap is present in VOBU#2 or VOBU#3.
The reason this flag is necessary is that it will be necessary to regenerate the audio gap when the latter of two VOBs that are to be seamlessly reproduced has been partially deleted.
The partial deletion of a VOB refers to the deletion of a plurality of VOBUs that are located at the start or the end of a VOB. As one example, there are many cases during video editing when the user wishes to remove the opening credit sequence. The deletion of the VOBUs which include this opening credit sequence is called the "partial deletion of a VOB".
When performing partial deletion, audio packs including an audio gap that are moved to a latter VOB require special attention. As described above, the audio gap is determined according to the video presentation start time VOB_V_S_PTM of the latter VOB, so that when some of the VOBUs are deleted from the latter VOB, the picture data that has the video presentation start time VOB_V_S_PTM that determines the audio gap and the VOBUs for this picture data will be deleted.
The audio gap is multiplexed into one of the first three VOBUs at the start of a VOB. Accordingly, when a part of a VOB, such as the first VOBU, is deleted, it will not be clear as to whether the audio gap has been destroyed as a result of this deletion. Since the number of audio gaps that may be provided within one VOB is limited to one, it is also necessary to delete a previous audio gap that is no longer needed once a new audio gap has been generated.
As shown in Fig. 14D, the audio pack G3 that includes the audio gap needs to be inserted into one of VOBU#1 to VOBU#3 in VOB#2 so as to comply with the one-second rule, so that the audio pack that includes this audio gap needs to be taken out of the packs included in VOBU#1 to VOBU#3. While this involves a maximum of three VOBUs, the immediate extraction of only the audio pack G3 that includes the audio gap is technically very difficult. This means that stream analysis is required. Here, each VOBU includes several hundred packs, so that a significant amount of processing is required to refer to the content of all such packs.
The audio gap location information A_GAP_LOC uses a 3-bit flag to show into which of the three VOBUs at the start of a latter VOB an audio pack including an audio gap has been inserted, so that only one VOBU needs to be searched when looking for the audio gap. This facilitates the extraction of the audio pack G3 including the audio gap.
Figs. 15A to 15E show the procedure for the regeneration of the audio gap by the video data editing apparatus when the VOBUs located at the start of VOB#2 have been deleted, out of two VOBs, VOB#1 and VOB#2, that are to be reproduced seamlessly.
As shown in Fig. 15A, the VOBUs "VOBU#98", "VOBU#99", and "VOBU#100" are located at the end of VOB#1, and the VOBUs "VOBU#1", "VOBU#2", and "VOBU#3" are located at the start of VOB#2. In this example, the user instructs the video data editing apparatus to perform a partial deletion to delete VOBU#1 and VOBU#2 in VOB#2.
In this case, the audio pack G3 that includes the audio gap is required, out of the audio data stored in VOBU#100, but it is known for certain that this audio pack G3 including the audio gap will be arranged into one of VOBU#1, VOBU#2, and VOBU#3 in VOB#2. To find the VOBU into which the audio pack G3 including the audio gap has been arranged, the video data editing apparatus refers to the audio gap location information A_GAP_LOC. When the audio gap location information A_GAP_LOC is set as shown in Fig. 15B, it can be seen that the audio pack G3 including the audio gap is located in VOBU#3 in VOB#2.
Once the video data editing apparatus knows that the audio pack G3 including the audio gap is located in VOBU#3, the video data editing apparatus will know whether the audio gap was multiplexed into the area that was subjected to the partial deletion. In the present example, the audio gap is not included in the deleted area, so that the value of A_GAP_LOC is only amended by the number of VOBUs that were deleted.
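Purely as an illustration (the encoding of the 3-bit flag and all names below are assumptions rather than the specification's definitions), the amendment of A_GAP_LOC after a partial deletion might be handled as in the following C sketch, which treats the flag as the index of the VOBU holding the audio pack including the audio gap.

    #include <stdbool.h>

    struct gap_loc_update {
        bool gap_was_deleted;  /* the gap pack sat inside the deleted VOBUs */
        int  new_a_gap_loc;    /* amended location, 0 if no gap remains     */
    };

    /* a_gap_loc: 1, 2 or 3 for VOBU#1 to VOBU#3, 0 when no gap is recorded.
       deleted_leading_vobus: number of VOBUs removed from the start. */
    static struct gap_loc_update
    amend_a_gap_loc(int a_gap_loc, int deleted_leading_vobus)
    {
        struct gap_loc_update r = { false, 0 };
        if (a_gap_loc == 0)
            return r;                                /* nothing to amend   */
        if (a_gap_loc <= deleted_leading_vobus)
            r.gap_was_deleted = true;                /* gap must be remade */
        else
            r.new_a_gap_loc = a_gap_loc - deleted_leading_vobus;
        return r;
    }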
This completes the explanation of the VOBs, video stream, audio stream, and VOB information that is stored on an optical disc for the present invention.

(1-4) System Construction of the Video Data Editing Apparatus
The video data editing apparatus of the present embodiment is provided with functions for both a DVD-RAM reproduction apparatus and a DVD-RAM recording apparatus. Fig. 16 shows an example of the system construction that includes the video data editing apparatus of the present embodiment. As shown in Fig. 16, this system includes a video data editing apparatus (hereinafter DVD recorder 70), a remote controller 71, a TV monitor 72 that is connected to the DVD recorder 70, and an antenna 73. The DVD recorder 70 is conceived as a device to be used in place of a conventional video cassette recorder for the recording of television broadcasts, but also features editing functions. The system illustrated in Fig. 16 shows the case when the DVD recorder 70 is used as a domestic video editing apparatus. The DVD-RAM described above is used by the DVD recorder 70 as the recording medium for recording television broadcasts.
When a DVD-RAM is loaded into the DVD recorder 70, the DVD recorder 70 compresses a video signal received via the antenna 73 or a conventional NTSC signal and records the result onto the DVD-RAM as VOBs. The DVD recorder 70 also decompresses the video streams and audio streams included in the VOBs recorded on a DVD-RAM and outputs the resulting video signal or NTSC signal and audio signal to the TV monitor 72.

(1-4-1) Hardware Construction of the DVD Recorder 70
Fig. 17 is a block diagram showing the hardware construction of the DVD recorder 70. As shown in Fig. 17, the DVD recorder 70 is composed of a control unit 1, an MPEG encoder 2, a disc access unit 3, an MPEG decoder 4, a video signal processing unit 5, a remote controller 71, a bus 7, a remote control signal reception unit 8, and a receiver 9.
The arrows drawn with solid lines in Fig. 17 show the physical connections that are achieved by the circuit wiring inside the DVD recorder 70. The broken lines, meanwhile, show the logical connections that indicate the input and output of various kinds of data on the connections shown with the solid lines during a video editing operation. The numerals (1) to (5) assigned to the broken lines show how VOBUs and the picture data and audio data that compose VOBUs are transferred on the physical connections when the DVD recorder 70 re-encodes VOBUs.

The control unit 1 is the host-side control unit that includes the CPU 1a, the processor bus 1b, the bus interface 1c, the main storage 1d, and the ROM 1e. By executing programs stored in the ROM 1e, the control unit 1 records, reproduces, and edits VOBs.
The MPEG encoder 2 operates as follows. When the receiver 9 receives an NTSC signal via the antenna 73, or when a video signal outputted by a domestic video camera is received via the video input terminals provided at the back of the DVD recorder 70, the MPEG encoder 2 encodes the NTSC signal or video signal to produce VOBs and outputs the generated VOBs to the disc access unit 3 via the bus 7. As a process that particularly relates to video editing, the MPEG encoder 2 receives an input of the decoding result of the MPEG decoder 4 from the connection line C1 via the bus 7, as shown by the broken line (4), and outputs the encoding result for this data to the disc access unit 3 via the bus 7, as shown by the broken line (5).
The disc access unit 3 includes a track buffer 3a, an ECC processing unit 3b, and a drive mechanism 3c for a DVD-RAM, and accesses the DVD-RAM in accordance with control by the control unit 1.
In more detail, when the control unit 1 gives an indication for recording on the DVD-RAM and the VOBs encoded by the MPEG encoder 2 have been successively outputted as shown by the broken line (5), the disc access unit 3 stores the received VOBs in the track buffer 3a, and, once ECC processing has been performed by the ECC processing unit 3b, controls the drive mechanism 3c to successively record these VOBs onto the DVD-RAM.
On the other hand, when the control unit 1 indicates a data read from a DVD-RAM, the disc access unit 3 controls the drive mechanism 3c to successively read VOBs from the DVD-RAM, and, once the ECC processing unit 3b has performed ECC processing on these VOBs, stores the result in the track buffer 3a.
The drive mechanism 3c mentioned here includes a platter for setting the DVD-RAM, a spindle motor for clamping and rotating the DVD-RAM, an optical pickup for reading a signal recorded on the DVD-RAM, and an actuator for the optical pickup. Reading and writing operations are achieved by controlling these components of the drive mechanism 3c, although such control does not form part of the gist of the present invention. Since such control can be achieved using well-known methods, no further explanation will be given in this specification.
When VOBs that have been read from the DVD-RAM by the disc access unit 3 are outputted as shown by the broken line (1), the MPEG decoder 4 decodes these VOBs to obtain uncompressed digital video data and an audio signal. The MPEG decoder 4 outputs the uncompressed digital video data to the video signal processing unit 5 and outputs the audio signal to the TV monitor 72. During a video editing operation, the MPEG decoder 4 outputs the decoding result for a video stream and audio stream to the bus 7 via the connection lines C2 and C3, as shown by the broken lines (2) and (3) in Fig. 17. The decoding result outputted to the bus 7 is outputted to the MPEG encoder 2 via the connection line C1, as shown by the broken line (4). The video signal processing unit 5 converts the image data outputted by the MPEG decoder 4 into a video signal for the TV monitor 72. On receiving graphics data from outside, the video signal processing unit 5 converts the graphics data into an image signal and performs signal processing to combine this image signal with the video signal.
The remote control signal reception unit 8 receives a remote controller signal and informs the control unit 1 of the key code included in the signal so that the control unit 1 can perform control in accordance with user operations of the remote controller 71.

(1-4-1-1) Internal Construction of the MPEG Encoder 2
Fig. 18 is a block diagram showing the construction of the MPEG encoder 2. As shown in Fig. 18, the MPEG encoder 2 is composed of a video encoder 2a, a video buffer 2b for storing the output of the video encoder 2a, an audio encoder 2c, an audio buffer 2d for storing the output of the audio encoder 2c, a stream encoder 2e for multiplexing the encoded video stream in the video buffer 2b and the encoded audio stream in the audio buffer 2d, an STC (System Time Clock) unit 2f for generating the synchronization clock of the MPEG encoder 2, and an encoder control unit 2g for controlling and managing these components of the MPEG encoder 2.

(1-4-1-2) Internal Construction of the MPEG Decoder 4
Fig. 19 shows the construction of the MPEG decoder 4. As shown in Fig. 19, the MPEG decoder 4 is composed of a demultiplexer 4a, a video buffer 4b, a video decoder 4c, an audio buffer 4d, an audio decoder 4e, a reordering buffer 4f, an STC unit 4g, switches SW1 to SW4, and a decoder control unit 4k.
The demultiplexer 4a refers to the header of a packet that has been read from a VOB and judges whether the various packs are video packs or audio packs. The demultiplexer 4a outputs the video data in packs judged to be video packs to the video buffer 4b and the audio data in packs judged to be audio packs to the audio buffer 4d.
The video buffer 4b is a buffer for accumulating the video data that is outputted by the demultiplexer 4a. Each set of picture data in the video buffer 4b is stored until its decode time, when it is taken from the video buffer 4b.
The video decoder 4c takes out sets of picture data from the video buffer 4b at their respective decode times and instantly decodes the data.
The audio buffer 4d is a buffer for accumulating the audio data outputted by the demultiplexer 4a.
The audio decoder 4e successively decodes the audio data stored in the audio buffer 4d in frame units. On receiving ADPI (Audio Decoder Pause Information) issued by the control unit 1, the audio decoder 4e halts the decode processing for audio frame data. The ADPI is issued by the control unit 1 when the present time reaches the audio gap start time A_STP_PTM shown by the seamless linking information.
The reordering buffer 4f is a buffer for storing the decoding result of the video decoder 4c when it has decoded an I picture or P picture. The reason the decoding results for I pictures or P pictures are stored is that the encoding order was originally produced by rearranging the display order. Accordingly, after every B picture that should be displayed before the decoding results stored in the reordering buffer 4f has been decoded, the reordering buffer 4f outputs the decoding results of the hitherto stored I pictures and P pictures as an NTSC signal.
The STC unit 4g generates the synchronization clock that shows the system clock for use in the MPEG decoder 4.
The adder 4h outputs a value produced by adding the STC_offset to the standard clock shown by the synchronization clock as the offset standard clock. The control unit 1 calculates this STC_offset by finding the difference between the video presentation start time VOB_V_S_PTM and the video presentation end time VOB_V_E_PTM that are given in the seamless linking information, and sets the STC_offset in the adder 4h.
The switch SW1 supplies the demultiplexer 4a with the standard time measured by the STC unit 4g or the offset standard time outputted by the adder 4h.
The switch SW2 supplies the audio decoder 4e with the standard time measured by the STC unit 4g or the offset standard time outputted by the adder 4h. The supplied standard time or offset standard time is used to collate the decode time and presentation start time of each audio frame.
The switch SW3 supplies the video decoder 4c with the standard time measured by the STC unit 4g or the offset standard time outputted by the adder 4h. The supplied standard time or offset standard time is used to collate the decode time of each set of picture data.
The switch SW4 supplies the reordering buffer 4f with the standard time measured by the STC unit 4g or the offset standard time outputted by the adder 4h. The supplied standard time or offset standard time is used to collate the presentation start time of each set of picture data.
The decoder control unit 4k receives a decode processing request from the control unit 1 for an integer multiple of VOBUs, which is to say an integer multiple of GOPs, and has the decode processing performed by all of the components from the demultiplexer 4a to the reordering buffer 4f. Also, on receiving a valid/invalid indication for the reproduction output of the decoding result, the decoder control unit 4k has the decoding results of the video decoder 4c and the audio decoder 4e outputted to the outside if the indication is valid, or prohibits the output of the decoding results of the video decoder 4c and the audio decoder 4e to the outside if the indication is invalid.
The valid/invalid indication can be given for a smaller unit than a video stream, such as for a video field. Information that indicates the valid section of the reproduction output in video field units is called valid reproduction section information.

(1-4-1-2-1) Timing for the Switching of Switches SW1-SW4
Fig. 20 is a timing chart of the timing for the switching of switches SW1 to SW4. This timing chart shows the switching of switches SW1 to SW4 when seamless reproduction of VOB#1 and VOB#2 is performed. The upper part of Fig. 20 shows the pack sequences that compose VOB#1 and VOB#2, while the middle part shows the video frames and the lower part shows the audio frames.
The timing for the switching of switch SW1 is the point where the pack sequence that is transferred to the MPEG decoder 4 changes from VOB#1 to VOB#2. This time is indicated as the Last_SCR in the seamless linking information of VOB#1.
The timing for the switching of switch SW2 is the point where all of the audio data in the VOB that is stored in the audio buffer 4d before the switching of switch SW1, which is to say VOB#1, has been decoded.
The timing for the switching of switch SW3 is the point where all of the video data in the VOB that is stored in the video buffer 4b before the switching time (T1) of switch SW1, which is to say VOB#1, has been decoded. The timing for the switching of switch SW4 is the point during the reproduction of VOB#1 where the last video frame has been reproduced.
The programs stored in the ROM 1e include modules that enable two VOBs that have been recorded on the DVD-RAM to be reproduced seamlessly.

(1-4-1-2-2) Procedure for the Seamless Processing of VOBs
Figs. 21 and 22 are flowcharts showing the procedure that seamlessly links two VOBs in an AV file.
Figs. 23A and 23B show an analysis of the buffer state for each video pack. Figs. 24A and 25 show the audio frames in the audio stream that correspond to the audio frames x, x+1, y-1, y, u+1, u+2, and u+3 mentioned in Fig. 22. The following is an explanation of the re-encoding of VOBs. In step S102 of Fig. 21, the control unit 1 performs the calculation "VOB_V_E_PTM of the former VOB minus VOB_V_S_PTM of the latter VOB" to obtain the STC_offset. In step S103, the control unit 1 analyzes the changes in the occupancy of the buffer from the First_SCR of the former VOB to the decode end time of all of the data in the former VOB. Figs. 23A and 23B show the analysis process for the occupancy of the buffer performed in step S103.
When video pack #1 and video pack #2 are included in the former VOB as shown in Fig. 23A, the SCR#1, SCR#2, and DTS#1 included in these video packs are plotted on the time axis. After this, the data size of the data included in video pack #1 and video pack #2 is calculated.
A line is plotted starting from SCR#1 with the bitrate information in the pack header as the gradient, until the data size of video pack #1 has been plotted. After this, the data size of video pack #2 is plotted starting from SCR#2. Next, the data size of the picture data P1 that is to be decoded is removed at DTS#1. This data size of picture data P1 is obtained by analyzing the bitstream.
After plotting the data sizes of the video packs and picture data in this way, the buffer state of the video buffer 4b from the First_SCR to the DTS can be plotted as a graph. By using the same procedure for all of the video data and audio data in a VOB, a graph showing the state of the buffer can be obtained, as shown in Fig. 23B.
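A greatly simplified C sketch of this occupancy analysis is given below; it is illustrative only, assumes the packs have already been parsed into the hypothetical structure shown and are ordered by SCR, and models each pack as fully transferred before the picture scheduled at its DTS is removed.

    #include <stddef.h>
    #include <stdint.h>

    struct pack_event {
        int64_t scr;       /* time the pack starts entering the buffer */
        int64_t size;      /* bytes carried by the pack                */
        int64_t dts;       /* decode time of a picture, 0 if none      */
        int64_t pic_size;  /* bytes removed from the buffer at dts     */
    };

    /* Returns the peak occupancy of the video buffer over the pack
       sequence, sampled after each pack transfer and before the removal
       of the picture (if any) decoded at that pack's DTS. */
    static int64_t peak_buffer_occupancy(const struct pack_event *ev, size_t n)
    {
        int64_t filled = 0, drained = 0, peak = 0;
        for (size_t i = 0; i < n; i++) {
            filled += ev[i].size;
            if (filled - drained > peak)
                peak = filled - drained;
            if (ev[i].dts != 0)
                drained += ev[i].pic_size;
        }
        return peak;
    }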
In step S104, the control unit 1 performs the same analysis as in step S103 for the latter VOB, and so analyzes the changes in the occupancy of the video buffer from the First_SCR of the latter VOB to the decode end time Last_DTS of all the data in the latter VOB.
In step S105, the control unit 1 analyzes the changes in the occupancy of the video buffer from the First_SCR of the latter VOB + STC_offset to the Last_DTS of the former VOB. This period, from the First_SCR of the latter VOB + STC_offset to the Last_DTS of the data in the former VOB, is when the first picture data of the latter VOB is being transferred to the video buffer 4b while the last picture data of the former VOB is still stored in the video buffer 4b.
When the video data of the former VOB and the latter VOB coexist in the buffer, the buffer state will be as shown in Fig. 10C. In Fig. 10C, the video buffer 4b stores video data of both the former VOB and the latter VOB during the period from the First_SCR+STC_offset to the Last_DTS, with Bv1+Bv2 representing the highest occupancy of the video buffer 4b during this period.
In step S106, the control unit 1 controls the disc access unit 3 to read the three VOBUs that are located at the end of the former VOB. After this, in step S107 the control unit 1 controls the disc access unit 3 to read the three VOBUs that are located at the front of the latter VOB.
Fig. 23C shows the area that should be read from the former VOB in step S106. In Fig. 23C, the former VOB includes the VOBUs #98-#105, so that the VOBUs #103 to #105 are read as the VOBUs that include the picture data V_END that should be decoded last. Fig. 23D shows the area that should be read from the latter VOB in step S107. In Fig. 23D, the latter VOB includes the VOBUs #1-#7, so that when VOBU #1 comes first, VOBUs #1 to #3 should be read as the VOBUs that include the picture data V_TOP.
According to the one-second rule, there is a possibility that the audio data and picture data that should be reproduced within one second are stored across three VOBUs, so that by reading the three VOBUs at the start and end of a VOB, in step S106, all of the picture data and audio data to be reproduced between a point one second before the presentation end time of the picture data V_END located at the end of the former VOB and this presentation end time itself can be read together. Also, in step S107, all of the picture data and audio data to be reproduced between the presentation start time of the picture data V_TOP located at the start of the latter VOB and a point one second after this presentation start time can be read together. It should be noted that the reads in this flowchart are performed in VOBU units, although the reads may instead be performed for the picture data and audio data that is to be reproduced in one second, out of all of the picture data and audio data included in a VOBU. In this embodiment, the number of VOBUs that correspond to one second is three, although any number of VOBUs may be re-encoded. Reads may alternatively be performed for picture data and audio data that is to be reproduced in a period longer than one second.
Next, in step S108 the control unit 1 controls the demultiplexer 4a to separate the VOBUs for the first part and the last part into a video stream and an audio stream, and has the video decoder 4c and the audio decoder 4e decode these streams. During normal reproduction, the decoding results of the video decoder 4c and the audio decoder 4e would be outputted as video and audio. When re-encoding is performed, however, these decoding results should be inputted into the MPEG encoder 2, so that the control unit 1 has the video stream and the audio stream of the decoding results output to the bus 7, as shown by the arrows (2) and (3) that are drawn with broken lines in Fig. 17. The video stream and the audio stream that are the decoding results are transferred via the bus 7 in order to the MPEG encoder 2, as shown by the broken line (4).
After this, the control unit 1 calculates the amount of code for the re-encoding of the decoded video stream and decoded audio stream by the MPEG encoder 2.
First, in step S109, the control unit 1 judges whether the accumulated amount of data in the buffer exceeds the upper limit of the buffer at any point in the decoding when the former VOB and the latter VOB coexist in the buffer. In the present embodiment, this is achieved by judging whether the value Bv1+Bv2 calculated in step S105 exceeds the upper limit of the buffer. If this value does not exceed the upper limit, the processing advances to step S112, or if the value does exceed the upper limit, the control unit 1 subtracts the excess amount of code A from the calculated amount and assigns the resulting amount of code to the decoded VOBU sequence.
If the amount of code is decreased, this means the picture quality of the video stream will decrease during the reproduction of these VOBUs. However, overflows in the video buffer 4b must be prevented when seamlessly linking two VOBs, so this method that decreases picture quality is used. In step S111, the control unit 1 controls the MPEG encoder 2 to re-encode the decoding results of the video decoder 4c and the audio decoder 4e according to the amount of code assigned in step S110.
Here, a decode is first performed to temporarily convert the pixel values in the video data into digital data in a YUV coordinate system. Digital data in such a YUV coordinate system is digital data for the signals (luminance signal (Y), chrominance signals (U, V)) that specify colors for a color TV, and this digital data is then re-encoded to produce sets of picture data. The technique used for the assigning of an amount of code is that described in MPEG2 DIS (Draft International Standard) Test Model 3. Re-encoding to reduce the amount of code is achieved by processes such as replacing the quantization coefficients. Note that the amount of code from which the excess amount A has been subtracted may be assigned to only the latter VOB or to only the former VOB.
In step S112, the control unit 1 calculates which part of the decoding result for the audio data taken from the former VOB corresponds to the audio frame x that includes the STC_offset+First_SCR of the latter VOB. In Fig. 24A, the graph shows the buffer state for the former VOB and latter VOB, while the lower part shows the audio frames of the audio data separated from the former VOB and the audio frames of the audio data separated from the latter VOB. The audio frame sequences in the lower part of Fig. 24A show the correspondence between each audio frame and the time axis of the graph in the upper part. The descending line drawn from the point shown as First_SCR+STC_offset in the graph intersects one audio frame out of the audio frame sequence for the former VOB. The audio frame that intersects this descending line is the audio frame x, and the audio frame x+1 following immediately after is the final audio data included in the former VOB. It should be noted that the data in the audio frames x and x+1 is included in the audio data that should be reproduced during a period that is indicated by points 1.0 seconds before and after the reproduction period of the final picture data V_END, with this being included in the three VOBUs read in step S105.
Fig. 24B shows the case where the First_SCR+STC_offset matches an audio frame boundary in the former VOB. In this case, the audio frame immediately before the boundary is set as the audio frame x.
In step S113, the control unit 1 calculates the audio frame y+1 that includes the STC_offset+VOB_V_S_PTM of the latter VOB. In Fig. 24A, the descending line drawn from the video presentation start time VOB_V_S_PTM+STC_offset in the graph intersects one audio frame in the audio frame sequence of the former VOB. The audio frame that intersects this descending line is the audio frame y+1. Here, the audio frames up to the preceding audio frame y are the valid audio frames that are still used after the editing has been performed, out of the original audio data included in the former VOB.
Fig. 24C shows the case where the video presentation start time VOB_V_S_PTM+STC_offset matches an audio frame boundary in the former VOB. In this case, the audio frame immediately before the video presentation start time VOB_V_S_PTM+STC_offset is set as the audio frame y.
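A minimal C sketch of the frame calculations in steps S112 and S113 follows; it is illustrative only, assumes a constant audio frame duration and times measured relative to the first audio frame of the former VOB, and the function name is hypothetical. Frame x would then be the frame containing First_SCR + STC_offset, and frame y+1 the frame containing VOB_V_S_PTM + STC_offset.

    #include <stdint.h>

    /* Returns the 0-based index of the audio frame that contains time t,
       where first_frame_ptm is the presentation time of frame 0 and
       frame_duration is constant, with t assumed to lie after frame 0.
       When t falls exactly on a frame boundary, the frame immediately
       before the boundary is returned (the cases of Figs. 24B and 24C). */
    static int64_t frame_containing(int64_t t,
                                    int64_t first_frame_ptm,
                                    int64_t frame_duration)
    {
        int64_t rel = t - first_frame_ptm;
        if (rel % frame_duration == 0)
            return rel / frame_duration - 1;  /* boundary case */
        return rel / frame_duration;
    }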
In step S114, the audio data from the audio frame x+2 to the audio frame y is taken from the former audio data. In Fig. 24A, the audio frames from audio frame y+1 onwards have been drawn with a broken line, showing that this part is not multiplexed into the VOB. It should be noted that the audio frames that have been moved to the latter VOB will have been assigned time stamps for the former VOB, so that these audio frames are reassigned time stamps for the latter VOB.
In step S115, the audio frame u immediately after the audio frame that includes the boundary between the audio frames y and y+1 is detected from the audio frame sequence of the latter VOB. When a descending line is drawn from the boundary of the audio frames y and y+1, this line will intersect one of the audio frames in the audio frame sequence of the latter VOB. The audio frame that follows this intersected audio frame is the audio frame u.
Fig. 24D shows the case where the presentation end time of the audio frame y matches an audio frame boundary in the latter VOB. In this case, the audio frame immediately after this presentation end time is set as the audio frame u.
In step S116, the audio pack G4, which includes an audio data sequence where the audio data reproduced for the audio frame u is arranged at the front, is generated from the audio stream in the latter VOB. In Fig. 24A, the audio frames that precede audio frame u have been drawn with a broken line, with this audio data shown using a broken line not being multiplexed into the latter VOB. As a result of steps S114-S116 above, the audio data from the first audio frame to the audio frame x+1 is multiplexed into the former VOB. The audio data from the audio frame x+2 to the audio frame y and the audio data from the audio frame u to the final audio frame is multiplexed into the latter VOB. By performing multiplexing in this way, the audio frames for the audio data at the end of the former VOB will be read from the DVD-RAM at the same time as picture data that is to be reproduced further ahead in the reproduction.
At this point, when the audio data in the former VOB is not present as far as frame y, which is to say the audio data is short, silent audio frame data is inserted to compensate for the insufficient number of frames. In the same way, when the audio data in the latter VOB is not present starting from audio frame u, which is to say the audio data is short, silent audio frame data is inserted to compensate for the insufficient number of frames.
When the audio frames from the audio frame x+2 to the audio frame y in the former VOB and the audio data from the audio frame u to the final audio frame in the latter VOB are multiplexed into the latter VOB, attention needs to be paid to the AV synchronization.

As shown in Fig. 24A, a reproduction gap occurs between the audio frame y and the audio frame u, and if multiplexing is performed without regard to this reproduction gap, a loss of synchronization will occur whereby the audio frame u will be reproduced before the corresponding video frame.
To prevent the increase of such time lags between audio and video, a time stamp showing the audio frame u may be assigned to the audio packet. To do so, in step S117, a Padding_Packet or stuffing bytes are inserted into the pack which includes the data of the audio frame y so that the audio frame u is not stored in the pack storing the audio frame y. As a result, the audio frame u is located at the start of the next pack.
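As a rough sketch of the step S117 padding decision (the 2048-byte size matches DVD packs, but the threshold for choosing a Padding_Packet over stuffing bytes and all names are assumptions), the remaining space to fill could be computed as follows in C.

    #define PACK_SIZE 2048  /* DVD pack size in bytes */

    /* bytes_used: bytes already occupied in the pack that ends with the
       data of audio frame y.  Returns the number of padding bytes needed
       so that audio frame u starts at the front of the next pack, and
       reports whether a Padding_Packet (rather than stuffing bytes in the
       packet header) would be used for the larger gaps. */
    static int padding_needed(int bytes_used, int *use_padding_packet)
    {
        int remaining = PACK_SIZE - bytes_used;
        *use_padding_packet = (remaining >= 7);  /* threshold is illustrative */
        return remaining;
    }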
In step S118, the VOBU sequence that is located at the end of the former VOB is generated by multiplexing the audio data up to the audio frame x+1, out of the audio data extracted from the VOBUs located at the end of this former VOB, with the video data that has been re-encoded.
In step S119, the audio data from audio frame x+2 onwards is multiplexed with the video data that is extracted from the VOBUs located at the start of the latter VOB to generate the VOBUs that should be arranged at the front of the latter VOB.
In detail, the control unit 1 has the audio pack G3, which includes the audio data sequence from the first audio frame x+2 to the audio frame y and the Padding_Packet, and the audio pack G4, which includes the audio data sequence from the audio frame u onwards in the latter VOB, multiplexed with the re-encoded video data, and has the stream encoder 2e generate the VOBUs that are to be arranged at the start of the latter VOB. As a result of this multiplexing, the audio frames at the end of the audio data of the former VOB will be read from the DVD-RAM at the same time as sets of picture data that will be reproduced at a later time.
Fig. 25 shows how the audio packs that store a plurality of sets of audio data to be reproduced for a plurality of audio frames are multiplexed with video packs that store picture data that is to be reproduced for a plurality of video frames.
In Fig. 25, the transfer of the picture data V_TOP that should be decoded at the start of the latter VOB will be completed within the period Tf_Period. The pack sequence arranged below this period Tf_Period in Fig. 25 shows the packs that compose the picture data V_TOP.
In Fig. 25, the audio pack G3 that includes the audio gap stores the sets of audio data x+2, y-1, y that are to be reproduced for the audio frames x+2, y-1, y. Of the sets of audio data stored in this audio pack, the first to be decoded is the audio data x+2. This audio data x+2 should be decoded at the presentation end time of the audio frame x+1, and so should be read from the DVD-RAM together with the picture data V_TOP whose pack sequence is transferred during the same period (Tf_Period) as the audio frame x+1. As a result, this audio data is inserted between the video pack sequence P51, which stores the picture data V_TOP, and the video pack sequence P52, as shown at the bottom of Fig. 25.
In the audio pack G4 that stores the sets of audio data u, u+1, and u+2 that are to be reproduced for the audio frames u, u+1, and u+2, the audio data u is to be decoded first. This audio data u should be decoded at the presentation end time of the audio frame u-1, so that this audio data u should be read from the DVD-RAM together with the picture data V_NXT whose pack sequence is transferred during the same period. As a result, this audio data u is inserted between the video pack sequence P52, which stores the picture data V_TOP, and the video pack sequence P53, which stores the picture data V_NXT, as shown at the bottom of Fig. 25.
As shown above, the audio pack G3 that includes the audio gap is inserted between the video pack sequences P51 and P52, while the audio pack G4 is inserted between the video pack sequences P52 and P53, thereby completing the multiplexing.
After this, in step S120 the control unit 1 inserts the First_SCR and Last_SCR of the former VOB and latter VOB, the seamless flag, the VOB_V_E_PTM, and the VOB_V_S_PTM into the seamless linking information for the former VOB. In steps S121 and S122, the control unit 1 writes all of the information relating to the audio gap, which is to say the audio gap start time A_STP_PTM, the audio gap length A_GAP_LEN, and the audio gap location information A_GAP_LOC, into the seamless linking information.
After the above processing, the control unit 1 has the end of the former VOB, the start of the latter VOB, and the seamless linking information written onto the DVD-RAM.
The video packs and audio packs that store the video data and audio data obtained through the above re-encoding are assigned SCRs with ascending values. The initial value of the assigned SCRs is the value of the SCR of the pack originally located at the start of the area subjected to the re-encoding.
Since the SCRs show the time at which the respective video packs and audio packs should be inputted into the video buffer 4b and the video decoder 4c, if there is a change in the amount of data before and after re-encoding, it will be necessary to update the values of the SCRs. Even if this is the case, however, the decoding process will still be carried out correctly provided that the SCRs for the re-encoded first part of the latter VOB are below the SCRs of the video packs in the remaining part of the latter VOB that was not re-encoded.
The PTS and the DTS are assigned in accordance with the video frames and audio frames, so that there will be no significant change in their values when re-encoding is performed. As a result, continuity of the DTS and PTS is maintained between the data not subjected to re-encoding and the data in the re-encoded area.
To reproduce two VOBs seamlessly, non-continuity in the time stamps must be avoided. To do so, the control unit 1 judges in step S123 of Fig. 22 whether overlapping SCRs have appeared. If this judgement is negative, the processing in the flowchart of Fig. 22 ends. If overlapping SCRs have appeared, the control unit 1 proceeds to step S124, where it calculates the excess amount A based on the number of packs that have the overlapping SCRs. The control unit 1 then returns to step S110 to repeat the re-encoding, basing the amount of assigned code for the repeated re-encoding on this excess amount A.
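A minimal C sketch of the step S123 check is shown below; it is illustrative only and simply verifies that the SCRs of the re-encoded pack sequence remain strictly ascending, counting the packs whose SCRs overlap.

    #include <stddef.h>
    #include <stdint.h>

    /* Counts packs whose SCR does not strictly increase relative to the
       previous pack; a non-zero result corresponds to the "overlapping
       SCRs" case tested in step S123. */
    static size_t count_overlapping_scrs(const int64_t *scr, size_t n_packs)
    {
        size_t overlaps = 0;
        for (size_t i = 1; i < n_packs; i++)
            if (scr[i] <= scr[i - 1])
                overlaps++;
        return overlaps;
    }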
As shown by the arrow (5) in Fig. 17, the six VOBUs that have been newly multiplexed by the processing in Fig. 22 are outputted to the disc access unit 3. The disc access unit 3 then writes the VOBU sequence onto the DVD-RAM.
It should be noted that while the flowcharts of Figs. 21 and 22 describe the seamless linking of two VOBs, the same processing may be used to link two sections of the same VOB. For the example shown in Fig. 6B, when deleting the VOBUs #2, #4, #6, and #8, the VOBU located before each deleted part may be seamlessly linked to the VOBU located after the deleted part by the processing in Figs. 21 and 22.
The following is a description of the reproduction procedure for seamlessly reproducing two VOBs that have been seamlessly linked by the processing described above. When the user indicates the seamless reproduction of two or more VOBs recorded in an AV file, the control unit 1 first refers to the seamless flag in the seamless linking information of the latter VOB. If this seamless flag is "on", the control unit 1 subtracts the video presentation start time VOB_V_S_PTM of the latter VOB from the video presentation end time VOB_V_E_PTM of the former VOB to obtain the STC_offset. The control unit 1 then has the adder 4h add the STC_offset to the standard time measured by the STC unit 4g.
After this, the buffer input time First_SCR of the former VOB indicated by the seamless linking information is compared with the standard time measured by the STC unit 4g. When the standard time reaches this First_SCR, the control unit 1 controls the switch SW1 to switch to output the offset standard time outputted by the adder 4h instead of the standard time outputted by the STC unit 4g. After this, the control unit 1 switches the states of the switches SW2-SW4 in accordance with the timing chart in Fig. 20.
With the present embodiment, seamless reproduction of a plurality of VOBs can be achieved by reading and re-encoding only the respective ends and starts of the VOBs. Since the re-encoded data is only the VOBUs located at the start and end of the VOBs, the re-encoding of VOBs can be achieved in a very short time.
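The reproduction-side setup described above can be summarized by the following C sketch; it is an illustration only, the structure and function names are assumptions, and the switch control is reduced to choosing which clock value is supplied to the demultiplexer.

    #include <stdbool.h>
    #include <stdint.h>

    struct seamless_link_info {
        bool    seamless_flag;
        int64_t first_scr;     /* buffer input time of the former VOB       */
        int64_t vob_v_e_ptm;   /* presentation end time of the former VOB   */
        int64_t vob_v_s_ptm;   /* presentation start time of the latter VOB */
    };

    /* Returns the clock value to feed to the demultiplexer for the current
       standard time: the plain STC value, or the STC value with the
       STC_offset added (the output of adder 4h) once First_SCR is reached. */
    static int64_t clock_for_demultiplexer(const struct seamless_link_info *s,
                                           int64_t stc_time)
    {
        if (!s->seamless_flag)
            return stc_time;
        int64_t stc_offset = s->vob_v_e_ptm - s->vob_v_s_ptm;
        if (stc_time >= s->first_scr)
            return stc_time + stc_offset;  /* switch SW1 to the adder output */
        return stc_time;
    }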
Note that while the present embodiment describes a case where seamless linking information is managed for each VOB, the information that is required for the seamless linking of VOBs may be collectively provided.
As one example, the video presentation end time VOB_V_E_PTM and the video presentation start time VOB_V_S_PTM that are used to calculate the STC_offset are described as being given in two separate sets of VOB information, though these may be given as the seamless linking information of the latter VOB. When doing so, it is desirable for the VOB information to include information for the presentation end time of the previous VOB (PREV_VOB_V_E_PTM).
In the same way, it is preferable for information that is the final SCR in the former VOB (PREV_VOB_LAST_SCR) to be included in the seamless linking information of the latter VOB.
In the present embodiment, the DVD recorder 70 was described as being a device that takes the place of a conventional (non-portable) domestic VCR, although when a DVD-RAM is used as the recording medium for a computer, the following system setup may be used. The disc access unit 3 may function as a DVD-RAM drive device, and may be connected to a computer bus via an interface that complies with the SCSI, IDE, or IEEE 1394 standard. In such a case, the DVD recorder 70 will include a control unit 1, an MPEG encoder 2, a disc access unit 3, an MPEG decoder 4, a video signal processing unit 5, a remote controller 71, a bus 7, a remote control signal reception unit 8, and a receiver 9.
In the above embodiment, VOBs were described as being a multiplexed combination of a video stream and an audio stream, although sub-picture data produced by subjecting data for subtitles to run-length encoding may also be multiplexed into VOBs. A video stream composed of sets of still image data may also be multiplexed.
In addition, the above embodiment describes the case where the re-encoding of data is performed by the MPEG encoder 2 after the VOBs have been decoded by the MPEG decoder 4. However, during the re-encoding the VOBs may instead be directly inputted from the disc access unit 3 to the MPEG encoder 2 without prior decoding.
The present embodiment describes the case where one picture is depicted using one frame, although there are cases where one picture is in fact depicted using 1.5 frames, such as for a video stream where 3:2 pulldown is used with images for 24 frames per second being subjected to compression, in the same way as with film materials.
The processing module software represented by the flowcharts in this first embodiment (Figs. 21-22) may be realized by a machine language program which may be distributed and sold having been recorded on a recording medium. Examples of such a recording medium are an IC card, an optical disc, or a floppy disc. The machine language program recorded on the recording medium may then be installed into a standard personal computer. By executing the installed machine language programs, the standard personal computer can achieve the functions of the video data editing apparatus of the present embodiment.

Second Embodiment
While the first embodiment deals with a premise that seamless linking is performed for VOBs, this second embodiment describes the seamless linking of a plurality of parts of VOBs. In this second embodiment, these parts of a VOB are specified using time information that indicates video fields. The video fields referred to here are units that are smaller than one video frame, with the time information for video fields being expressed using the PTS of video packs.
The parts of a VOB that are specified using time information for video fields are called cells, and the information used for indicating these cells is called cell information. Cell information is recorded in the RTRW management file as one element in the PGC information. The details of the data construction and generation of cell information and PGC information are given in the fourth embodiment.
Fig. 26 shows examples of the cells indicated by the video fields for the start and the end. In Fig. 26, the sets of time information C_V_S_PTM and C_V_E_PTM specify the video fields at the start and end of a cell.
In Fig. 26, the time information C_V_S_PTM is the presentation start time of a video field at which the P picture in VOBU#100 that forms one part of the present VOB should be reproduced. In the same way, the time information C_V_E_PTM is the presentation end time of a video field at which the B picture in VOBU#105 that forms one part of the same VOB should be reproduced. As shown in Fig. 26, the time information C_V_S_PTM and C_V_E_PTM specify a section from a P picture to a B picture as a cell.
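As a rough illustration of how a cell might be represented in memory, the following sketch pairs the two time stamps with the VOB they refer to; the field names, types, and helper method are assumptions for illustration only, not the format defined by the embodiment.

from dataclasses import dataclass

@dataclass
class CellInfo:
    # One cell: a section of a VOB bounded by two video-field times.
    vob_id: int       # VOB that the cell is taken from
    c_v_s_ptm: int    # presentation start time of the first video field
    c_v_e_ptm: int    # presentation end time of the last video field

    def duration(self) -> int:
        # Length of the cell in the same time units as the PTMs.
        return self.c_v_e_ptm - self.c_v_s_ptm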

(2-1) Reconstruction of GOPs
When seamlessly linking parts of a VOB that are indicated by time information, it becomes necessary to use two processes that were not required in the first embodiment. First, the construction of the GOPs has to be reconstructed to convert the section indicated by the time information into a separate VOB, and second, the increases in buffer occupancy due to the reconstruction of GOPs have to be estimated.
The reconstruction of GOPs refers to a process that changes the construction of GOPs so that the section indicated as a cell has the proper display order and coding order.
More specifically, when a section to be linked is indicated by cell information, there can be cases where an editing boundary is defined midway through a VOBU, as shown in Fig. 28A. If this is the case, the two cells to be linked will not have a proper display order or coding order.
In order to rectify the display order and coding order, the reconstruction of GOPs is performed using processing based on the three rules shown in Fig. 28B.
When the final picture data in the display order of a former cell is a B picture, the processing based on the first rule re-encodes this picture data to convert it into a P picture (or an I picture). The P picture in the forward direction that was referred to by the B picture is located before the B picture in the coding order. However, this P picture will not be displayed after the editing, and so is deleted from the VOB.
When the first picture data in the encoding order of the latter cell is a P picture, the processing based on the second rule re-encodes this picture data to convert it to an I picture.
When the first set or consecutive sets of picture data in the display order of the latter cell is/are B pictures, the processing based on the third rule re-encodes this picture data to convert it to picture data whose display does not rely on the correlation with other images that have previously been reproduced. Hereinafter, images formed of picture data that only relies on correlation with images that are yet to be displayed will be called Forward-B pictures.
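The three rules above amount to a decision on the new picture type at each editing boundary. The sketch below is only an illustration under assumed names; the PictureType values and the per-rule functions are not data structures defined by the embodiment.

from enum import Enum

class PictureType(Enum):
    I = "I"
    P = "P"
    B = "B"
    FORWARD_B = "Forward-B"  # relies only on pictures still to be displayed

def rule1_former_cell_last_displayed(picture: PictureType) -> PictureType:
    # Rule 1: the last displayed picture of the former cell may not remain a
    # B picture; it is re-encoded as a P picture (an I picture also works).
    return PictureType.P if picture is PictureType.B else picture

def rule2_latter_cell_first_coded(picture: PictureType) -> PictureType:
    # Rule 2: the first picture in coding order of the latter cell may not
    # remain a P picture; it is re-encoded as an I picture.
    return PictureType.I if picture is PictureType.P else picture

def rule3_latter_cell_first_displayed(picture: PictureType) -> PictureType:
    # Rule 3: leading B pictures in display order of the latter cell are
    # re-encoded so that they no longer depend on already-displayed images.
    return PictureType.FORWARD_B if picture is PictureType.B else picture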

(2-2) Estimating the Increase in Buffer Occupancy
When the picture types of certain images have been changed by the processing based on the three rules described above, the processing for estimating the increases in buffer occupancy estimates the sizes of these converted sets of picture data.
When the reconstruction described above is performed for the former cell, the final picture data in the reproduction order of the former cell is converted from a B picture to a P picture or an I picture, thereby increasing the size of this data.
When the reconstruction described above is performed for the latter cell, the picture data located at the start of the coding order of the latter cell is converted from a P picture to an I picture, and the picture type of the video data located at the front of the display order is converted to a Forward-B picture. This also increases the size of the data.
The following is an explanation of the procedure for estimating the increases in data size that accompany the conversion in picture type. Figs. 29A and 29B will be used to explain this procedure.
In Fig. 29A, the first cell continues as far as the B picture B3. According to the above rules, the video data editing apparatus has to convert this B picture B3 to the P picture P1'. When the B picture B3 is dependent on the P picture P2 that is reproduced after the B picture B3, the picture type conversion process will incorporate the necessary information of the P picture P2 into the P picture P1' that is produced by the conversion process.
In view of this procedure, the video data editing apparatus can estimate the data size of the P picture P1' that is obtained by the conversion process using a sum of the size of the B picture B3 and the size of the P picture P2. This estimation method merely represents one potential method, however, so that other methods are equally possible. By determining the amount of code for use in re-encoding based on the estimated buffer occupancy, the video data editing apparatus can assign an optimal amount of code to the former cell and latter cell.
Figs. 30A and 30B show how the increases in buffer occupancy that accompany changes in picture type within the latter cell are estimated.

In Fig. 30A, the data from the B picture B3 onwards belongs to the latter VOB. Each cell is determined based on the display time for the start of the cell, so that the B picture B3 is the picture data located at the start of the display order of the latter cell. As a result, the video data editing apparatus needs to convert the B picture B3 into the Forward-B picture B' according to the rules given above. When this B picture B3 had an information component that is dependent on the previously reproduced P picture P2, this information component of the P picture P2 will have been incorporated into the Forward-B picture B' during the picture type conversion.
In view of this procedure, the video data editing apparatus can estimate the data size of the Forward-B picture B' that is obtained by the conversion process using a sum of the size of the B picture B3 and the size of the P picture P2. For the latter VOB, the video data editing apparatus needs to convert the picture type of the picture data located at the start of the coding order.
By referring to the display order of the latter VOB in Fig. 28A, it can be seen that the P picture P3 is the picture data that is to be displayed immediately after the B picture B3. The P picture P3 is stored in the reordering buffer 4f of the video data editing apparatus until the decoding of the B picture B3 is complete, and so is only displayed after the decoding of the B picture B3 has been performed. By having the reordering buffer 4f reorder the picture data in this way, the P picture P3 will precede the B picture B3 in the coding order even though the P picture P3 is displayed after the B picture B3.
According to the rules described earlier, the video data editing apparatus needs to convert the picture data P3 detected as the first picture data in the coding order into an I picture. When this P picture has an information component that relies on the I picture that is reproduced before the P picture P3, this information component of the I picture will have been incorporated into the P picture P3 during the picture type conversion.
In view of this procedure, the video data editing apparatus can estimate the data size of the I picture I' that is obtained by the conversion process using a sum of the size of the P picture P3 and the size of the preceding I picture. Based on the buffer occupancy that is estimated in this way, the video data editing apparatus can then assign optimal amounts of code to the former and latter cells to be used in the re-encoding.
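The size estimates above all follow the same pattern: the converted picture is estimated as the sum of the original picture and the reference picture whose information it absorbs. A minimal sketch of that pattern, with made-up function names, might look as follows; as the text notes, this is only one possible estimation method.

def estimated_converted_size(original_size: int, absorbed_reference_size: int) -> int:
    # Estimate the size of a picture after a type conversion by adding the
    # size of the reference picture whose information is folded into it.
    # Examples from the text:
    #   B3 -> P1' (former cell):      size(B3) + size(P2)
    #   B3 -> Forward-B B' (latter):  size(B3) + size(P2)
    #   P3 -> I'  (latter cell):      size(P3) + size(preceding I)
    return original_size + absorbed_reference_size

def estimated_occupancy_increase(conversions: list[tuple[int, int]]) -> int:
    # Sum of the estimated growth over all converted pictures; this is the
    # increase added to the buffer occupancy in steps S130 to S132.
    return sum(estimated_converted_size(orig, ref) - orig
               for orig, ref in conversions)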

(2-3) Processing for Seamlessly Connecting Cells
Figs. 31 to 33 are flowcharts showing the procedure that links two cells to enable seamless reproduction of the two. Note that many of the steps in these flowcharts are the same as the steps in the flowcharts shown in Figs. 21 and 22, with the term "VOB" having been replaced with the term "cell". These steps have been given the same reference numerals as in the first embodiment, and their explanation has been omitted.
Fig. 34 shows the audio frames in the audio stream that correspond to the audio frame x, the audio frame x+1, and the audio frame y that are used in Fig. 31.
In step S102, the control unit 1 refers to the time information specifying the end of the cell to be reproduced first (hereinafter called the "former cell") and the time information specifying the start of the cell to be reproduced second (hereinafter called the "latter cell"), and subtracts the C_V_S_PTM of the latter cell from the C_V_E_PTM of the former cell to obtain the STC_offset.
In step S103, the control unit 1 analyzes the changes in the buffer occupancy from the First_SCR of the former cell to the decode end time Last_DTS of all of the data in the former cell.
In step S104, the control unit 1 performs the same analysis as in step S103 for the latter cell, and so analyzes the changes in the buffer occupancy from the First_SCR of the latter cell to the decode end time Last_DTS of all of the data in the latter cell.
In step S130, the control unit 1 estimates the increase α in the buffer occupancy that accompanies the changes in picture type for the latter cell, in accordance with the procedure shown in Figs. 30A and 30B. In step S131, the control unit 1 estimates the increase β in the buffer occupancy that accompanies the changes in picture type for the former cell, in accordance with the procedure shown in Figs. 29A and 29B. In step S132, the control unit 1 adds the estimated increases α and β to the respective buffer occupancies for the latter and former cells.
In step S105, the control unit 1 analyzes the changes in the occupancy of the video buffer from the First_SCR of the latter cell + STC_offset to the Last_DTS of the former cell.
As shown in Fig. 10C of the first embodiment, the highest occupancy Bv1+Bv2 of the video buffer 4b is obtained for the period where video data for both the former cell and latter cell is stored in the video buffer 4b.
In step S106, the control unit 1 controls the disc access unit 3 to read the three VOBUs believed to include the picture data located at the end of the former cell from the DVD-RAM. After this, in step S107 the control unit 1 controls the disc access unit 3 to read the three VOBUs believed to include the picture data located at the start of the latter cell.
Fig. 27A shows the area that should be read from the former cell in step S106, while Fig. 27B shows the area that should be read from the latter cell in step S107. The VOB shown in Fig. 27A includes VOBUs #98 to #107, with VOBUs #99 to #105 being indicated as the former cell. When the picture data to be reproduced last in the former cell is the picture data Bend, this picture data will be included in one of VOBUs #103 to #105 in accordance with the one-second rule, so that VOBU#103 to VOBU#105 will be read as the VOBU sequence that includes the picture data to be reproduced last.

The VOB shown in Fig. 27B includes the VOBUs #498 to #507, and of these, VOBUs #500 to #506 are indicated as the latter cell. When the picture data to be displayed first in this latter cell is the picture data PTOP, this picture data PTOP will be included in VOBUs #500 to #502, so that VOBUs #500 to #502 will be read as the VOBU sequence that includes the picture data to be displayed first. These VOBUs include all of the picture data that depends on the picture data PTOP and the picture data Bend, in addition to the audio data that is to be reproduced at the same time as the picture data PTOP and the picture data Bend. As a result, all of the picture data that is required for the conversion of picture types is read by this operation.
It should be noted that the reads in this flowchart are performed in VOBU units, although the reads may instead be performed for the picture data and audio data that is to be reproduced in one second, out of all of the picture data and audio data included in a VOBU. In the present embodiment, the number of VOBUs that correspond to one second of reproduction is given as three, although any number of VOBUs may be used. Reads may alternatively be performed for picture data and audio data that is to be reproduced in a period longer than one second.

After these reads are complete, in step S108 the control unit 1 controls the demultiplexer 4a to separate the video data and audio data from the VOBUs located at the end of the former cell and the start of the latter cell.
In step S109, the control unit 1 judges whether the accumulated amount of data in the buffer exceeds the upper limit of the buffer at any point in the decoding when the former cell and the latter cell coexist in the buffer. More specifically, this is achieved by judging whether the value Bv1+Bv2 calculated in step S105 exceeds the upper limit of the buffer.
If this value does not exceed the upper limit, the processing advances to step S133, or if the value does exceed the upper limit, the control unit 1 assigns an amount of code based on the excess amount A to the former cell and latter cell in step S110. Note that the re-encoding performed in this case may only be performed for one of the former VOB and latter VOB, or for both.
In step S111, the video data obtained from the two cells is re-encoded according to the amount of code assigned in step S110.
In step S133, the First_SCR that has been newly assigned to the re-encoded video data in the latter cell is obtained. In this latter cell, the first picture data in the display order and the first picture data in the coding order will have been converted into picture types with larger amounts of picture data, so it should be obvious that the value First_SCR+STC_offset will indicate an earlier time than before.
In step S112, the control unit 1 calculates the audio data, out of the audio data separated from the former cell, that corresponds to the audio frame x which includes the sum of the STC_offset and the First_SCR that is newly assigned to the video data in the latter VOB. In Fig. 34, the upper and lower graphs respectively show the transition in the buffer occupancy due to the video data in the former cell and latter cell. The lower graph in Fig. 34 shows the audio frames of the audio data separated from the former cell.
The audio frame sequence below the lower graph in Fig. 34 shows each audio frame against the time axis of the graph given above it. The buffer occupancy for the new latter cell obtained as a result of the re-encoding increases by the amount α1. Note that this amount α1 differs from the increased amount α that was estimated in step S132. Due to this amount α1, the First_SCR that is newly assigned to the latter video data indicates an earlier time.
As can be seen from the lower graph in Fig. 34, the new value of First_SCR+STC_offset is positioned at a time which is Tα1 earlier than before. In Fig. 34, the descending guideline drawn from the new value of First_SCR+STC_offset intersects one audio frame in the audio frame sequence of the former cell. This intersected audio frame is the audio frame x, with the following audio frame x+1 being the final audio frame in the former cell.
Since the value of the sum of the STC_offset and the new First_SCR of the latter cell indicates an earlier time, this means that an earlier frame is indicated as the audio frame x. As a result, when a read is commenced for the video data in the latter cell, the audio data that should be read from the former cell together with this video data is comparatively larger than in the first embodiment.
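As a rough sketch of how the audio frame x might be located, assuming a fixed audio frame duration and with illustrative function and variable names that are not defined by the embodiment:

def audio_frame_index(time: int, first_audio_pts: int, frame_duration: int) -> int:
    # Index of the audio frame of the former cell whose presentation period
    # contains the given time (all values in the same clock units).
    return (time - first_audio_pts) // frame_duration

# The frame intersected by the descending guideline in Fig. 34 would then be
#   x = audio_frame_index(new_first_scr + stc_offset, first_audio_pts, frame_duration)
# and the final audio frame kept in the former cell is x + 1.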
Hereafter, the processing in steps S113 to S119 is performed so that the stream encoder 2e performs the multiplexing shown in Fig. 25.
After this, in step S120 the First_SCR, Last_SCR, the seamless flag, the C_V_E_PTM, and the C_V_S_PTM for the former and latter cells are inserted into the seamless linking information of the former cell. The control unit 1 then performs the processing in steps S121 and S122. Of the data for the six VOBUs obtained through the re-encoding, the three VOBUs arranged at the start (the first VOBUs) originally formed part of the former cell, and so are appended to the end of the former cell. Similarly, the three VOBUs arranged at the end (the latter VOBUs) originally formed part of the latter cell, and so are inserted at the start of the latter cell.
While one of the former and latter cells that have been given re-encoded data is managed having been assigned the same identifier as the VOB from which it was taken, the other of the two cells is managed having been assigned a different identifier to the VOB from which it was taken. This means that after this division, the former cell and latter cell are managed as separate VOBs. This is because there is a high possibility of the time stamps not being continuous at the boundary between the former cell and the latter cell.
As in the first embodiment, in step S123 the control unit 1 judges whether the values of the SCR are continuous. If so, the control unit 1 ends the processing in the flowcharts of Figs. 31 to 33. If not, the control unit 1 calculates the excess amount A based on the number of packs given overlapping SCRs, determines an amount of code based on the excess amount A, and returns to step S109 to repeat the re-encoding.
As a result of the above processing, cells are re-encoded, with the cells indicated by the cell information being set as separate VOBs. This means that VOB information for the newly generated VOBs needs to be provided in the RTRW management file. The following is an explanation of how this VOB information for cells is defined.
The "video stream att~i~ute informa~ion~ include~
compres~ion mode informatlon, TV system informa~ion, aspect ratio information, and re~olution information, although this information may ~e set to ~atch the information for the VOB~s) from which the cell~ were ~aken.
The "audio stream attribute information" includes an encoding mode, the presence/absence of dynamic range control, a sampling frequency, and a number of channels, although this information may be set to match the information for the VOB(s) from which the cells were taken.
The "time map table" i~ compo~ed of the ~ize of each VOBU that compo~e~ the VOB and the display period of each VOBU, although a corresponding part of the information given for the VO~(s) from which the cells were taken may be used, with the sizes and di~play ~ lYY~ 6~0~ UI/~4~4~ 44 ,_ periods only belng ~me~ded for VOBUs that ha~e been re-encoded.
The following is an explanation of the "seamless linking information" that was generated in step S133.
This seamless linking information is composed of a seamless flag, a video presentation start time VOB_V_S_PTM, a video presentation end time VOB_V_E_PTM, a First_SCR, a Last_SCR, an audio gap start time A_STP_PTM, and an audio gap length A_GAP_LEN. These elements are written into the seamless linking information one at a time.
Only when the relationship between the former cell and the latter cell satisfies the following conditions (1) and (2) is the seamless flag set at "01". If either condition is not satisfied, the seamless flag is set at "00".
(1) Both cells must use the same display method (NTSC, PAL, etc.) for the video stream as given in the video attribute information.
(2) Both cells must use the same encoding method (AC-3, MPEG, Linear-PCM) for the audio stream as given in the audio attribute information.
The "video presentation start time VOB_V_S_PTM" is updated to the presentation start time after re-encoding.

The "video presentation end time VOB_V_E_PTM" is updated to the presentation end time after re-encoding.
The "First_SCR" i5 updated to the SCR of the firs~
pack after re-encoding.
The "Last_SC~" is updated to the SCR o~ the final pack after ~e-encoding.
The "audio gap start time A_STP_PTM" is set at the pre~entation end time of the audio frame y that i9 the final audio frame to be reproduced for the audio data tha~ is moved to the latter cell in Fig. 34.
The "audio gap length A_GAP_LEN" i~ ~et a~ the period from the presentation end time of the final audio frame y to be reproduced using the audio data that i~
moved to the latter cell in Fig. 34 to the presentation ~tart time of the audio frame u.
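The seamless linking information just described could be collected into a record like the sketch below; the field names follow the text, but the Python types and the builder function are assumptions made only for illustration.

from dataclasses import dataclass

@dataclass
class SeamlessLinkingInfo:
    seamless_flag: str     # "01" when conditions (1) and (2) hold, else "00"
    vob_v_s_ptm: int       # presentation start time after re-encoding
    vob_v_e_ptm: int       # presentation end time after re-encoding
    first_scr: int         # SCR of the first pack after re-encoding
    last_scr: int          # SCR of the final pack after re-encoding
    a_stp_ptm: int         # audio gap start time (end of audio frame y)
    a_gap_len: int         # audio gap length

def seamless_flag(same_tv_system: bool, same_audio_coding: bool) -> str:
    # Condition (1): same display method (NTSC, PAL, ...) for the video stream.
    # Condition (2): same encoding method (AC-3, MPEG, Linear PCM) for audio.
    return "01" if (same_tv_system and same_audio_coding) else "00"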
Once the VOB information has been generated as described above, an RTRW management file including this new VOB information is recorded onto the DVD-RAM. By doing so, the two cells that are indicated by the cell information can be recorded on the DVD-RAM as two VOBs that are to be reproduced seamlessly.
As described above, this second embodiment can process cells in a VOB or VOBs so as to have the cells seamlessly reproduced by merely reading and re-encoding the end of the former cell and the start of the latter cell. Since only the VOBUs located at the start and end of the respective cells are re-encoded, this re-encoding of cells can be achieved in a very short time.
It should be noted that while the present embodiment describes the case where video fields are used as the unit when indicating cells, video frames may be used instead.
The processing module software represented by the flowcharts in this second embodiment (Figs. 31-33) may be realized by a machine language program which may be distributed and sold having been recorded on a recording medium. Examples of such a recording medium are an IC card, an optical disc, or a floppy disc. The machine language program recorded on the recording medium may then be installed into a standard personal computer. By executing the installed machine language programs, the standard personal computer can achieve the functions of the video data editing apparatus of the present embodiment.

Third Embodiment
The third embodiment of the present invention manages AV files in a file system and allows greater freedom in video editing.

(3-1) Directory Structure on a DVD-RAM
The RTRW management file and AV files of the first embodiment are arranged in the directories shown in Fig. 35 within a file system that complies with ISO/IEC 13346. In Fig. 35, the ovals represent directories and the rectangles represent files. The root directory includes a directory called "RTRW" and two files called "File1.DAT" and "File2.DAT". The RTRW directory includes three files called "Movie1.VOB", "Movie2.VOB", and "RTRWM.IFO".

(3-1-1) File System Management Information in the Directories
The following is a description of the management information used for managing the RTRW management file and AV files in the directory structure shown in Fig. 35. Fig. 36 shows the file system management information in the directory structure of Fig. 35.
Fig. 36 shows the volume area shown in Fig. 3D, the sectors, and the stored contents of sectors in a hierarchy.
Arrows in this drawing show the order in which the storage positions of the file "Movie1.VOB" are specified by the present management information.
The first level in the hierarchy in Fig. 36 shows the volume area shown in Fig. 3D.
The second level in the hierarchy shows file set descriptors, end descriptors, file entries, and directories, out of the entire management information. The information on this second level complies with a file system that is standardized under ISO/IEC 13346. File systems that are standardized under ISO/IEC 13346 manage directories in a hierarchy.

The management information in Fig. 36 is arranged in accordance with the directory structure. However, a recording region is only shown for the AV file "Movie1.VOB".
The file set descriptor (LBN 80) on the second level shows information such as the LBN of the sector that stores the file entry for the root directory. The end descriptor (LBN 81) shows the end of the file set descriptor.
A file entry (such as LBN 82, 584, 3585) is stored for each file (or directory), and shows a storage position for a file or directory. File entries for files and file entries for directories have the same format, so that hierarchical directories can be freely constructed.
A directory (such as LBN 83, 585, 3585) shows storage positions for file entries of the files or directories included in the directory.
Three file entries and two directories are shown on the third level in the hierarchy. The file entries and directories are tracked by the file system and have a data construction that enables the storage position of a specified file to be indicated regardless of the construction of the hierarchy in the directory structure.

Each file entry includes an allocation descriptor that shows a storage position of a file or directory. When the data recorded in a file is divided into a plurality of extents, the file entry includes a plurality of allocation descriptors, one for each extent.
The expression "extent" refers here to a section of data included in a file that should preferably be stored in consecutive regions. When, for example, the size of a VOB to be recorded in an AV file is large, but there are no consecutive regions for storing the VOB, the AV file cannot be recorded on the DVD-RAM.
However, when there is a plurality of small consecutive regions distributed across the partition area, by dividing the VOBs to be recorded in the AV file, the resulting divided sections of the VOBs may be stored into the distributed consecutive areas.
By dividing VOBs in this way, the probability of being able to store VOBs as AV files increases, even when the number of consecutive regions and the length of the partition area are limited. To improve the efficiency with which data is recorded on a DVD-RAM, the VOBs recorded in one AV file are divided into a plurality of extents, with these extents being recorded in separate consecutive areas on the disc without regard to the order of the extents.
It should be noted that the expression "consecutive region" here refers to a region composed of ECC blocks that are logically or physically consecutive.
As one example, the file entries with the LBN 82 and 584 in Fig. 36 each include a single allocation descriptor, which means that the file is not divided into a plurality of extents (which is to say, is composed of a single extent).
The file entry 3585, meanwhile, has two allocation descriptors, which means that the data to be stored in the file is composed of two extents.
Each directory includes a file identification descriptor showing a storage position of a file entry for each file and each directory included in the directory.
When tracing a route through the file entries and directories, the storage position of the file "root/RTRW/Movie1.VOB" can be found by following the order given as file set descriptor → file entry (root) → directory (root) → file entry (RTRW) → directory (RTRW) → file entry (Movie1.VOB) → file (extents #1 and #2 of Movie1.VOB).
Fig. 37 shows the link relationship between the file entries and directories on this route in another format that traces the directory construction. In this drawing, the root directory includes file identification descriptors for the directory of the parent directory (the parent of the root being the root itself), the RTRW directory, the File1.DAT file, and the File2.DAT file. The RTRW directory includes file identification descriptors for each of the directory of the parent directory (root), the Movie1.VOB file, the Movie2.VOB file, and the RTRWM.IFO file. In the same way, the storage position of the Movie1.VOB file is specified by tracing this route.
3-l-? n~tA Con~truction of a File Entry Fig. 38A show the data construction of a file entry in more detail. As shown in Fig. 38A, a file entry includes a de~criptor tag, an ICB tag, an allocation descriptor length, expanded attributes, and an allocation descriptor. In thi~
figure, the legend BP" represents "bit position", while the legend "RBP" repre~ent~ relative bit position".

The descriptor tag is a tag showing the present ent~y is a file entry. For a DvD-RAM, a variety of tags are used, lS such as the file entry descriptor and the ~pace bitmap descriptor For a file entry, a value "261" is u~ed as the deqcriptor tag indicating a file entry.
The ICB tag ~how~ attribute information for ~he file entry itself.
The expanded attrlbutes are information showing the attributes with a higher-le~el content than the content specified by the attribute information field in the file entry.
The allocation descriptor field store~ a~ many alloca~ion descriptors as there are extents that compose the file. Each allocation descriptor sho~s the logical ~lock FROM 1998~ 9~16~ (~) 22: 38/~22: 07/~$g4300403388 ? 52 ~

num~e~ ~LBN) that indicate~ ~he storage position Or an extent for a file or a directory. The data con~truction of an allocation de~eriptor i-~ ~hown in Fig. 38B.
The allocation descriptor in Fig. 3~B includes da~a ~howing the extent length and a logical bloc~ num~e~ ~howing the ~torage po-~ition of the extent. Howe~er, the top two bits of the data indicating the extent length show the storage state of the extent storage area. The meanings of the various values are as sho~n in Fig. 38C.
~3-1-3) DAtA Con~trllction of the F~le I~t~f~c~t;on D~scrjp~o~s for D~rectorte~ ~n~ F~le~
Figs. 39A and 39B show the detailed data con~truction of the file identification descriptors for directories and files in the various directories. These t~o types of file identification descriptors ha~e the same format, and so each include management informatio~, identification information, a director~ name length, an address shouing the logical block number that stores the file entry for the directory or file, expansion information, and a directory name In this ~ay, the add~ess of a file entry is associated uith a directory name or file name.

(3-1-4) Minimum Size of an AV Block
When a VOB to be recorded in an AV file is divided into a plurality of extents, the data length of each extent must exceed the data length of an AV block. The expression "AV block" here refers to the minimum amount of data for which there is no danger of underflow for the track buffer 3a when reading a VOB from the DVD-RAM.
To guarantee consecutive reproduction, the minimum size of an AV block is defined in relation to the track buffer provided in a reproduction apparatus. The following explanation deals with how this minimum size of an AV block is found.
(3-1-5) Minimum Size of an AV Block Area
First, the rationale behind the need to determine the minimum size of an AV block for guaranteeing uninterrupted reproduction is described.
Fig. 40 shows a model of how a reproduction apparatus that reproduces video objects buffers AV data read from the DVD-RAM in the track buffer. This model shows the minimum requirements of a reproduction apparatus for uninterrupted reproduction to be guaranteed.
In the upper part of Fig. 40, the reproduction apparatus subjects the AV data it reads from the DVD-RAM to ECC processing, temporarily accumulates the resulting data in the track buffer, which is a FIFO memory, and then outputs the data from the track buffer to the decoder. In the illustrated example, Vr is the input transfer rate of the track buffer (or in other words, the rate at which data is read from the optical disc), and Vo is the output transfer rate of the track buffer (decoder input rate), where Vr > Vo. In the present model, Vr = 11 Mbps.
The lower part of Fig. 40 is a graph showing the changes in the amount of data in the track buffer for the present model. In this graph, the vertical axis represents the amount of data in the buffer, while the horizontal axis represents time. This graph assumes that the AV block #k that includes a defective sector is read following the AV block #j that includes no defective sectors.
The period T1 shown on the time axis shows the time required to read all AV data in the AV block #j that includes no defective sectors. During this period T1, the amount of data in the track buffer increases at the rate (Vr-Vo).
The period T2 (hereinafter called the "jump period") shows the time required by the optical pickup to jump from the AV block #j to the AV block #k. This jump period includes the seek time for the optical pickup and the time taken for the rotation of the optical disc to stabilize. In the worst case scenario of a jump from the inner periphery to the outer periphery of the optical disc, the jump time is assumed to be around 1500 ms for the present model. During the jump period T2, the amount of data in the track buffer decreases at a rate of Vo.
The periods T3 to T5 show the time taken to read all AV data in the AV block #k that includes a defective sector.

Of these periods, the period T4 shows the time taken to skip to the next ECC block from a present ECC block that includes a defective sector. This skip operation involves skipping a present ECC block if one or more of the 16 sectors is defective and jumping to the next ECC block. This means that in an AV block, instead of merely logically replacing each defective sector in an ECC block with a replacement sector (or a replacement ECC block), use of each ECC block (all 16 sectors) with a defective sector is stopped. This method is called the ECC block skip method.
the di~c. This is presumed to be around 105ms for the present model. In pe~iod~ T3 and T5, the amount of data in the buffer increases at a rate given as Vr-VO, while during period T4, the amount decrea~es at the rate VO
When "~_ecc" represents the total nu~ber of ~CC block~
in an Av block, the size of an AV block is given by the formula "N_ecc~16~Z048" bits. To ensure consecutive reproduction is performed, the ml r~ m value of N_ecc i9 found as described below.
In period T2, AV data is only read from the track buffer with no concurrent replenishing of Av data. During thi~ period T2, should the amount of data in the buffer reach zero, an underflo~ ~ill occur in the decoder. In such case, the uninterrupted reproduction of AV data cannot be , rr~uM lYY~ YHloa ~ u/~ :u~ UUC~U~D~ r ~0 -guaran~eed. A~ a result, the relation shown as ~quation 1 below needs to be satisfied to guarantee the uninterrupted reproduction of Av data (which is to ~ay, to ensure that no underflow occurs).

F.cr~tion ~
(buffered data amo~nt B)2(consumed data amount R) The buffered da~a a~ount B is ~he amount Or data stored in the buffer at the end of the period T1. The consumed data amount R is the total amount of data read during the period TZ
The ~uffered data amount B is given by ~quation 2 below.

Equation 2
(buffered data amount B) = (period T1) * (Vr - Vo)
                         = (read time for 1 AV block) * (Vr - Vo)
                         = (AV block size / Vr) * (Vr - Vo)
                         = (N_ecc * 16 * 8 * 2048 / Vr) * (Vr - Vo)
                         = (N_ecc * 16 * 8 * 2048) * (1 - Vo/Vr)

The consumed data amount R is given by Equation 3 below.


Equation 3
(consumed data amount R) = T2 * Vo

Substituting Equations 2 and 3 into the respective sides of Equation 1 gives Equation 4 below.

Equation 4
(N_ecc * 16 * 8 * 2048) * (1 - Vo/Vr) >= T2 * Vo

By rearranging Equation 4, it can be seen that the number N_ecc of ECC blocks that guarantees consecutive reproduction must satisfy Equation 5 below.

Equation 5
N_ecc >= T2 * Vo / ((16 * 8 * 2048) * (1 - Vo/Vr))

In Equation 5, T2 is the jump period described above, which has a maximum of 1.5 s. Vr, meanwhile, has a fixed value, which for the model in the upper part of Fig. 40 is 11 Mbps. Vo is expressed by the following Equation 6 that takes the variable bit rate of the AV block that includes a number N_ecc of ECC blocks into consideration. Note that Vo is not the maximum value of the logical transfer rate for output from the track buffer, but is given by the equation below as the effective input rate of variable rate AV data into the decoder. The AV block length here is given as the number N_pack of packs in an AV block composed of N_ecc ECC blocks ((N_ecc - 1) * 16 < N_pack <= N_ecc * 16).

Equation 6
Vo = AV block length (bits) * (1 / AV block reproduction time (sec))
   = (N_pack * 2048 * 8) * (27M / (SCR_first_next - SCR_first_current))

In the above equation, SCR_first_next is the SCR of the first pack in the next AV block, while SCR_first_current is the SCR of the first pack in the present AV block. Each SCR shows the time at which the corresponding pack should be outputted from the track buffer to the decoder. The unit for SCRs is 1/27,000,000 of a second (one period of the 27 MHz clock).
As shown in the above Equations 5 and 6, the minimum size of an AV block can theoretically be calculated in accordance with the actual bit rate of the AV data.
Equation 5 applies to a case where no defective sectors exist on the optical disc. When such sectors are present, the number of ECC blocks N_ecc required to ensure uninterrupted reproduction is as described below.
It is presumed here that the AV block area includes ECC blocks with defective sectors, the number of which is represented as "dN_ecc". No AV data is recorded into the dN_ecc defective ECC blocks due to the ECC block skipping described above. The loss time Ts caused by skipping the dN_ecc defective ECC blocks is represented as "T4 * dN_ecc", where "T4" represents the ECC block skip time for the model shown in Fig. 40.
To ensure the uninterrupted reproduction of the AV data when defective sectors are included, the AV block area needs to include at least the number of ECC blocks represented by Equation 7.

Equation 7
N_ecc >= dN_ecc + Vo * (Tj + Ts) / ((16 * 8 * 2048) * (1 - Vo/Vr))

As described above, the size of the AV block area is calculated from Equation 5 when no defective sector is present, and from Equation 7 when defective sectors are present.
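A small sketch of Equations 5 to 7 follows, in which Vo is first derived from the SCRs as in Equation 6. The variable names come from the text; the rounding up to an integer block count and the default skip time of 105 ms per defective block are added assumptions.

import math

BITS_PER_ECC_BLOCK = 16 * 8 * 2048   # 16 sectors of 2048 bytes, in bits
SCR_CLOCK = 27_000_000               # SCR ticks per second (27 MHz)

def effective_vo(n_pack: int, scr_first_next: int, scr_first_current: int) -> float:
    # Equation 6: effective decoder input rate of the AV block, in bits/s.
    return (n_pack * 2048 * 8) * SCR_CLOCK / (scr_first_next - scr_first_current)

def min_n_ecc(vo: float, vr: float, t_jump: float, dn_ecc: int = 0,
              t_skip_per_block: float = 0.105) -> int:
    # Equations 5 and 7: minimum number of ECC blocks in an AV block.
    # With dn_ecc == 0 this reduces to Equation 5; otherwise the skipped
    # blocks and the loss time Ts = T4 * dN_ecc are added as in Equation 7.
    ts = t_skip_per_block * dn_ecc
    n = vo * (t_jump + ts) / (BITS_PER_ECC_BLOCK * (1 - vo / vr))
    return dn_ecc + math.ceil(n)

# Example with the model's figures: Vr = 11 Mbps and a 1.5 s worst-case jump:
#   min_n_ecc(vo=8_000_000, vr=11_000_000, t_jump=1.5)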
It should be noted here that ~hen AV data is compo~ed of a plurality of AV blocks, the first and last AV ~locks do not need to not Yatisfy Equation ~ or 7. Thi~ i~ because the tLming at Uhich decoding is commenced for the first AV
block can be delayed, which is to say, the supply of data to the decoded may be delayed until ~uf~icient data is accumulated in thc buffer, tnereby ensuring uninterrupted reproduction between the first and second Av block~. The last AV block, meanwhile, is no~ followed by any particular AV data, meaning that the reproduction may simply end with thiq last AV block.

, . ~ . , .

L 1 77U~ 7nlUU ~ JUU'sUJJUU ! UIJ

(3-2) Functional Blocks of the DVD Recorder 70
Fig. 41 is a function block diagram showing the construction of the DVD recorder 70 divided into functions.
Each function in Fig. 41 is realized by the CPU 1a in the control unit 1 executing a program in the ROM 1e to control the hardware shown in Fig. 17.
The DVD recorder 70 of Fig. 41 includes the disc recording unit 100, the disc reading unit 101, the common file system unit 10, the AV file system unit 11, the recording-editing-reproduction control unit 12, the AV data recording unit 13, the AV data reproduction unit 14, and the AV data editing unit 15.

(3-2-1) Disc Recording Unit 100 and Disc Reading Unit 101
The disc recording unit 100 operates as follows. On receiving an input of the logical sector number from which recording is to start and the data to be recorded from the common file system unit 10 and the AV file system unit 11, the disc recording unit 100 moves the optical pickup to the appropriate logical sector number and has the optical pickup record data in ECC block units (16 sectors) into the indicated sectors on the disc. When the amount of data to be recorded is below 16 sectors, the disc recording unit 100 first reads the data, subjects it to ECC processing, and records it onto the disc as an ECC block.
The disc reading unit 101 operates as follows. On receiving an input of a logical sector number from which data is to be read and a number of sectors from the common file system unit 10 and the AV file system unit 11, the disc reading unit 101 moves the optical pickup to the appropriate logical sector number and has the optical pickup read data in ECC block units from the indicated logical sectors. The disc reading unit 101 has ECC processing performed on the read data and transfers only the required sector data to the common file system unit 10. As with the disc recording unit 100, the disc reading unit 101 reads VOBs in units of 16 sectors for each ECC block, thereby reducing the overheads.
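A minimal sketch of the ECC-block-aligned access just described follows; the constants and function names are assumptions used only for illustration.

SECTORS_PER_ECC_BLOCK = 16
SECTOR_SIZE = 2048  # bytes

def ecc_block_of(lsn: int) -> int:
    # ECC block index that contains a given logical sector number.
    return lsn // SECTORS_PER_ECC_BLOCK

def sectors_to_read(lsn: int, count: int) -> range:
    # Whole ECC blocks that must be read to cover the requested sectors;
    # only the requested sector data is then passed on to the caller.
    first = ecc_block_of(lsn) * SECTORS_PER_ECC_BLOCK
    last = ecc_block_of(lsn + count - 1) * SECTORS_PER_ECC_BLOCK + SECTORS_PER_ECC_BLOCK
    return range(first, last)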

(3-2-2) Common File System Unit 10
The common file system unit 10 provides the recording-editing-reproduction control unit 12, the AV data recording unit 13, the AV data reproduction unit 14, and the AV data editing unit 15 with the standard functions for accessing the data format standardized under ISO/IEC 13346. These standard functions provided by the common file system unit 10 control the disc recording unit 100 and the disc reading unit 101 to read or write data onto or from the DVD-RAM in directory units and file units.
Representative examples of the standard functions provided by the common file system unit 10 are as follows.
1. Having the disc recording unit 100 record a file entry and output the file identification descriptor to the recording-editing-reproduction control unit 12, the AV data recording unit 13, the AV data reproduction unit 14, and the AV data editing unit 15.
2. Converting a recorded area on the disc that includes one file into an empty area.
3. Controlling the disc reading unit 101 to read the file identification descriptor of a specified file from a DVD-RAM.
4. Controlling the disc recording unit 100 to record data present in the memory onto the disc as a non-AV file.
5. Controlling the disc reading unit 101 to read an extent that composes a file recorded on the disc.
6. Controlling the disc reading unit 101 to move the optical pickup to a desired position in the extents that compose a file.
To use any of the functions (1) to (6), the recording-editing-reproduction control unit 12 to AV data editing unit 15 may issue a command to the common file system unit 10 to indicate the file to be read or recorded as a parameter. Such commands are called common file system-oriented commands.
Various types of common file system-oriented commands are available, such as "(1) CREATE", "(2) DELETE", "(3) OPEN/CLOSE", "(4) WRITE", "(5) READ", and "(6) SEEK". Such commands are respectively assigned to the functions (1) to (6).

In the present embodiment, the assignment of commands to the standard functions is as follows. To use function (1), the recording-editing-reproduction control unit 12 to AV data editing unit 15 may issue a "CREATE" command to the common file system unit 10. To use function (2), the recording-editing-reproduction control unit 12 to AV data editing unit 15 may issue a "DELETE" command to the common file system unit 10. In the same way, to respectively use functions (3), (4), (5), and (6), the recording-editing-reproduction control unit 12 to AV data editing unit 15 may issue an "OPEN/CLOSE", "WRITE", "READ", or "SEEK" command to the common file system unit 10.

(3-2-3) AV File System Unit 11
The AV file system unit 11 provides the AV data recording unit 13, AV data reproduction unit 14, and AV data editing unit 15 with extended functions which are only necessary when recording or editing an AV file. These extended functions cannot be provided by the common file system unit 10.
The following are representative examples of these extended functions.
(7) Writing a VOB that has been encoded by the MPEG encoder 2 onto a DVD-RAM as an AV file.
(8) Cutting out an indicated part of the VOBs recorded in an AV file and setting the part as a different file.
(9) Clearing an indicated part of the VOBs recorded in an AV file.
(10) Linking two AV files that are present on the DVD-RAM with VOBUs that have been re-encoded according to the procedure in the first and second embodiments.
To use the extended functions (7) to (10), the recording-editing-reproduction control unit 12 to AV data editing unit 15 may issue a command to the AV file system unit 11 to indicate the file to be recorded, linked, or cut out. Such commands are called AV file system-oriented commands. Here, the AV file system-oriented commands "AV-WRITE", "SPLIT", "SHORTEN", and "MERGE" are available, with these being respectively assigned to the functions (7) to (10).
In the present embodiment, the assignment of commands to the extended functions is as follows. To use the function (7), the AV data recording unit 13 to AV data editing unit 15 may issue an "AV-WRITE" command. To use the function (8), the AV data recording unit 13 to AV data editing unit 15 may issue a "SPLIT" command. Similarly, to use the function (9) or (10), the AV data recording unit 13 to AV data editing unit 15 may issue a "SHORTEN" or "MERGE" command. With function (10), the extent of the file after linking is as long as or longer than an AV block.
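A compact way to picture the two command sets is a dispatch table mapping each command name to the numbered function it invokes; the tables below simply restate the assignments from the text, and the dictionaries themselves are an illustrative assumption, not part of the apparatus.

# Common file system-oriented commands -> standard functions (1)-(6)
COMMON_FS_COMMANDS = {
    "CREATE": 1, "DELETE": 2, "OPEN/CLOSE": 3,
    "WRITE": 4, "READ": 5, "SEEK": 6,
}

# AV file system-oriented commands -> extended functions (7)-(10)
AV_FS_COMMANDS = {
    "AV-WRITE": 7,   # write an encoded VOB onto the DVD-RAM as an AV file
    "SPLIT": 8,      # cut out part of a VOB and set it as a different file
    "SHORTEN": 9,    # clear an indicated part of the VOBs in an AV file
    "MERGE": 10,     # link two AV files using re-encoded VOBUs
}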

(3-2-4) Recording-Editing-Reproduction Control Unit 12
The recording-editing-reproduction control unit 12 issues an OPEN/CLOSE command that indicates directory names as parameters to the common file system unit 10, and by doing so has the common file system unit 10 read a plurality of file identification descriptors from the DVD-RAM. The recording-editing-reproduction control unit 12 then analyzes the directory structure of the DVD-RAM from the file identification descriptors and receives a user indication of a file or directory to be operated upon.
On receiving the user indication of the target file or directory, the recording-editing-reproduction control unit 12 identifies the desired operation content based on the user operation notified by the remote control signal reception unit 8, and issues instructions to have the AV data recording unit 13, the AV data reproduction unit 14, and the AV data editing unit 15 perform the appropriate processing for the file or directory indicated as the operation target.
To have the user indicate the operation target, the recording-editing-reproduction control unit 12 outputs graphics data, which visually represents the directory structure, the total number of AV files, and the data sizes of empty areas on the present disc, to the video signal processing unit 5. The video signal processing unit 5 converts this graphics data into an image signal and has it displayed on the TV monitor 72.
Fig. 42 shows an example of the graphics data displayed on the TV monitor 72 under the control of the recording-editing-reproduction control unit 12. During the display of this graphics data, the display color of any of the files or directories may change to show potential operation targets.
This change in color is used to focus the attention of the user, and so is called the "focus state". Display using the normal color, meanwhile, is called the "normal state".
When the user presses the mark key on the remote controller 71, the display of the file or directory that is currently in the focus state returns to the normal state and a different, newly-indicated file or directory is displayed in the focus state. When any of the files or directories is in the focus state, the recording-editing-reproduction control unit 12 waits for the user to press the "confirm" key on the remote controller 71.
When the user presses the enter key, the recording-editing-reproduction control unit 12 identifies the file or directory that is currently in the focus state as a potential operation target. In this way, the recording-editing-reproduction control unit 12 can identify the file or directory that is the operation target.
To identify the operation content, however, the recording-editing-reproduction control unit 12 determines what operation content has been assigned to the key code received from the remote control signal reception unit 8.
As shown on the left side of Fig. 41, keys with the legends "PLAY", "REWIND", "STOP", "FAST FORWARD", "RECORD", "MARK", "VIRTUAL EDIT", and "REAL EDIT" are present on the remote controller 71. In this way, the recording-editing-reproduction control unit 12 identifies the operation content indicated by the user according to the key code received from the remote control signal reception unit 8.

(3-2-4-1) Operation Contents That Can Be Received by the Recording-Editing-Reproduction Control Unit 12
The operation contents are classified into operation contents that are provided on conventional domestic AV equipment, and operation contents that are specially provided for video editing. As specific examples, "play", "rewind", "stop", "fast forward", and "record" all fall into the former category, while "mark", "virtual edit", and "real edit" all fall into the latter category.
A "play" operation has the DVD recorder 70 play back a zo VOB that i~ recorded in an AV file that is specified as the operation target.
A "rewind" operation ha~ the DvD recorder 70 rapidly play bac~ a pre~ently reproduced vOB in re~erse.
A "stop" operation has the DVD recorder 70 stop the 2S reproduction o~ the present VOB.
A "fast ~orward" operation has the ~vD recorder 70 I' l ~ Ul~L 1 7 7 U ~t 7 r~ 1 U U ~ U/ ~fJ~ \J / / A~ '~ 's J U \) ~ U J J u u I u u rapidly play back the pre~ent VOB in the forward direction.
A "record" operation ha~ the DVD recorder 70 generate a new AV file in the direc~ory indicated a~ the op~ration target and ~rite the VOB to be recorded into the new AV
file.
These operations in the former category are well-known to users as functions of conventional domestic AV equipment, such as video cassette recorders and CD players. The operations in the latter category are performed by users when, to use an analogy of editing a conventional movie film, sections of movie film are cut out and spliced together to produce a new movie sequence.
A "mark" operation ha3 the DVD recorder 70 replay a VOB
included in the Av file indicated a6 the operation target and marks desired images out of the video Lmages replayed by the VOB. To use the analo~y of editin~ a ~ovie ~ilm, this "mar~" operation involve~ the ma~ing of point~ where the film i~ to be cut.
A "~irtual edit" operation ha~ the DVD recorder 70 select a plu~ality of pairs of tuo points lndlcated by a mark operation as reproduction start points and reproduction end point~ and then define a logical reproduction route by a~signing a reproduction order to the~e pairs of point~.
In a virtual edit operation, the section defined by one pair of a reproduction start point and reproduction end point selected by the user is called a "cell". The reproduction route defined by assigning a reproduction order to the cells is called a "program chain".
A real "edit" operation ha~ the DvD recorder 70 cut out each ~ection indicated ~s a cell from an Av file recorded on a DvD-RAN, ~et the cut-out section~ as ~eparate files, and link a plurality of cut-out ~ection~ in accordance with the reproduction order ~ho~n by a program chain. Such edit operations are analogous to the cutting of a ~ovie film at the marXed poYitlons and the ~plicing of the cut ~ections together. In these edit operations, the extent of the linked files is equal to or greater than the length of one AV ~lock.
The recording-editing-reproduction control unit 12 controls which of the AV data recording unit 13 to the AV data editing unit 15 are used when performing the operation contents described above. In addition to specifying the operation target and operation content, the recording-editing-reproduction control unit 12 chooses the appropriate component(s) for the operation content out of the AV data recording unit 13 to AV data editing unit 15 and outputs instructions informing these components of the operation content.
The following is a description of example instructions that the recording-editing-reproduction control unit 12 gives to the AV data recording unit 13, the AV data reproduction unit 14, and the AV data editing unit 15 using combinations of an operation target and an operation content.
In Fig. 42, the directory "DVD_Video" is in the focus state, so that if the user presses the "RECORD" key, the recording-editing-reproduction control unit 12 identifies the directory "DVD_Video" as the operation target and "record" as the operation content. The recording-editing-reproduction control unit 12 selects the AV data recording unit 13 as the component capable of performing a record operation, and instructs the AV data recording unit 13 to generate a new AV file in the directory indicated as the operation target.
When the file "AV_~ILE#1" is in the focus state and the u~er presses the "PLAY" key on the remote controller 71, the reco~ing-editing-reproduction control unit 12 identifie~
the file "AV_FI~E#1" as the operation target and "play" as the operation content. Tho recording-editing-reproduction control unit 12 selects the Av data reproduction unit 14 a~
the component capable of performlng a play operation, and in~tructs the Av data reproduction unit 14 to reproduce the AV file indicated a~ the operation target.
When the file "AV_FILE~l" is in the focus state and the u~er pre~es the "MARK" key on the remote controller 71, the recording-editing-reproduction control unit 12 identifie~
the file "AV_FI~E#1~ as the operation target and ~mark" a3 the operation content. The recording-editing-reproduction ~ "~"" 1 J JUT Jll I UU ~ / ~L . -:U/ ~'j)~ . U 1/ A~JUu~u~J~u I l I
~, .' control unit 12 selects the AV data editing unit 15 as the component capable of performing a ~ark operation, and instructs the AV data editing unit 15 to perform a ~larking operation for the AV file indica~ed as ~he operation targe~.

(3-2-5) AV Data Recording Unit 13
The AV data recording unit 13 controls encoding operations of the MPEG encoder 2 while issuing common file system-oriented commands and AV file system-oriented commands in a predetermined order to the common file system unit 10 and the AV file system unit 11. By doing so, the AV data recording unit 13 makes use of the functions (1) to (10) and realizes recording operations.

(3-2-6) AV Data Reproduction Unit 14
The AV data reproduction unit 14 controls decoding operations of the MPEG decoder 4, while issuing common file system-oriented commands and AV file system-oriented commands in a predetermined order to the common file system unit 10 and the AV file system unit 11. By doing so, the AV data reproduction unit 14 makes use of the functions (1) to (10) and realizes "play", "rewind", "fast forward", and "stop" operations.

(3-2-7) AV Data Editing Unit 15
The AV data editing unit 15 controls the decoding operations of the MPEG decoder 4, while issuing common file system-oriented commands and AV file system-oriented commands in a predetermined order to the common file system unit 10 and the AV file system unit 11. By doing so, the AV data editing unit 15 makes use of the functions (1) to (10) and realizes "mark", "virtual edit", and "real edit" operations.
In more detail, on receiving instructions from the recording-editing-reproduction control unit 12 to mark the AV file indicated as the operation target, the AV data editing unit 15 has the AV data reproduction unit 14 reproduce the indicated AV file and monitors when the user presses the "MARK" key on the remote controller 71. When the user presses the "MARK" key during the reproduction, the AV data editing unit 15 writes information called a "mark point" onto the disc as a non-AV file. This mark point information shows the time in seconds from the start of the reproduction of the AV file to the point where the user pressed the "MARK" key.
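The mark point described above can be pictured as a small record appended to a non-AV file. The following is a minimal sketch in C of such a record and of writing it out; the structure name, field layout, file name, and the example times are assumptions made for illustration only and are not the format defined by this embodiment.

```c
#include <stdio.h>

/* Illustrative mark point record: the time, in seconds, from the start of
 * the reproduction of the AV file to the point where the "MARK" key was
 * pressed.  The field layout is an assumption for this sketch. */
struct mark_point {
    unsigned int mark_id;       /* #1, #2, ... in order of marking        */
    unsigned int relative_time; /* seconds from the start of reproduction */
};

/* Append one mark point to a non-AV file (a hypothetical "MARKS.DAT"). */
static int write_mark_point(const char *path, unsigned int id, unsigned int t)
{
    struct mark_point mp = { id, t };
    FILE *fp = fopen(path, "ab");
    if (fp == NULL)
        return -1;
    size_t written = fwrite(&mp, sizeof mp, 1, fp);
    fclose(fp);
    return written == 1 ? 0 : -1;
}

int main(void)
{
    /* In the spirit of Fig. 44B: marks #1..#8 at times t1..t8 (example values). */
    unsigned int times[8] = { 60, 125, 180, 240, 300, 355, 410, 470 };
    for (unsigned int i = 0; i < 8; i++)
        write_mark_point("MARKS.DAT", i + 1, times[i]);
    return 0;
}
```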
On receiving instructions from the recording-editing-reproduction control unit 12 for a virtual edit operation, the AV data editing unit 15 generates information that defines a logical reproduction route in accordance with the user key operations of the remote controller 71. The AV data editing unit 15 then controls the common file system unit 10 so that this information is written onto the DVD-RAM as a non-AV file.
On receiving instructions from the recording-editing-reproduction control unit 12 for a real edit operation, the AV data editing unit 15 cuts out the sections of the DVD-RAM indicated as cells and sets the cut-out sections as separate files which it links to form a sequence of cells.
When linking a plurality of files, the AV data editing unit 15 performs processing so that seamless reproduction of images will be achieved. This means that there will be no interruptions in the image display when a linked AV file is reproduced. The AV data editing unit 15 links extents to make all extents, except for the last extent to be reproduced, equal to or greater than the AV block length.

(3-2-7-1) Processing for Virtual Edits and Real Edits by the AV Data Editing Unit 15
Fig. 43 is a flowchart for the processing of virtual edit and real edit operations. Figs. 44A to 44F are figures showing a supplementary example of the processing by the AV data editing unit 15 according to the flowchart of Fig. 43. The following describes the editing processes of the AV data editing unit 15 with reference to the flowchart of Fig. 43 and the example in Figs. 44A to 44F.
The AV file shown in Fig. 44A is already stored on the DVD-RAM. With this AV file indicated as the operation target, the user presses the "PLAY" key on the remote controller 71. The recording-editing-reproduction control unit 12 detects key operations, so that when the user presses the "MARK" key, the AV data editing unit 15 has the AV data reproduction unit 14 commence the reproduction of the AV file in step S1.
After the start of reproduction, the reproduction proceeds as far as the time t1 in Fig. 44B when the user next presses the "MARK" key. In response to this, the AV data editing unit 15 sets the mark point #1, which expresses a relative time code for time t1, into the present AV file. The user subsequently presses the "MARK" key a total of seven times at times t2, t3, t4, ... t8. In response, the AV data editing unit 15 sets the mark points #2, #3, #4, #5, ... #8, which express relative time codes for times t2, t3, t4, ... t8, into the present AV file, as shown in Fig. 44B.
After the execution of step S1, the processing proceeds to step S2 where the AV data editing unit 15 has the user indicate pairs of mark points. The AV data editing unit 15 then determines the cells to be reproduced within the present AV file in accordance with the selected pairs of mark points.
In Fig. 44C, the user indicates that mark points #1 and #2 form pair (1), mark points #3 and #4 form pair (2), mark points #5 and #6 form pair (3), and mark points #7 and #8 form pair (4).
In this way, the AV data editing unit 15 sets the AV data within each pair of points as a separate cell, and so in the present example sets the four cells, Cell#1, Cell#2, Cell#3, and Cell#4. Note that in the present example, the AV data editing unit 15 may alternatively set the pair of Mark#2 and Mark#3 as one cell, and the pair of Mark#4 and Mark#5 as another cell.
Next, in step S3, the AV data editing unit 15 generates a program chain by assigning a reproduction order to the cells it has produced. In Fig. 44D, Cell#1 is the first in the reproduction route (shown by the legend "1st" in the drawing), Cell#2 is the second in the reproduction route (shown by the legend "2nd" in the drawing), and Cells #3 and #4 respectively are the third and fourth in the reproduction route. By doing so, the AV data editing unit 15 treats the plurality of cells as a program chain, based on the chosen reproduction order. Note that Fig. 44D shows the simplest reproduction order of cells, with the setting of other orders, such as Cell#3 → Cell#1 → Cell#2 → Cell#4, being equally possible.
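The cells and program chain produced in steps S2 and S3 can be modeled as simple structures. The sketch below is only an illustration; the structure and field names are assumptions and not the recording format used by this embodiment.

```c
#include <stdio.h>

/* A cell is the section of an AV file between one pair of mark points.
 * The names below are assumptions made for this sketch only. */
struct cell {
    unsigned int start_mark;  /* reproduction start point, e.g. Mark#1 */
    unsigned int end_mark;    /* reproduction end point,   e.g. Mark#2 */
};

/* A program chain is a reproduction order assigned to a set of cells. */
struct program_chain {
    struct cell cells[16];
    unsigned int order[16];   /* order[0] holds the index of the 1st cell */
    unsigned int num_cells;
};

int main(void)
{
    /* The four pairs of Fig. 44C, played in the order of Fig. 44D. */
    struct program_chain pgc = {
        .cells = { {1, 2}, {3, 4}, {5, 6}, {7, 8} },
        .order = { 0, 1, 2, 3 },   /* Cell#1 -> Cell#2 -> Cell#3 -> Cell#4 */
        .num_cells = 4
    };
    /* An alternative order such as Cell#3 -> Cell#1 -> Cell#2 -> Cell#4
     * would simply be { 2, 0, 1, 3 }. */
    for (unsigned int i = 0; i < pgc.num_cells; i++) {
        const struct cell *c = &pgc.cells[pgc.order[i]];
        printf("play Mark#%u to Mark#%u\n", c->start_mark, c->end_mark);
    }
    return 0;
}
```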
In step S6, the AV data editing unit 15 monitors whether the user has indicated the reproduction of the program chain. In step S5, the AV data editing unit 15 monitors whether the user has indicated an editing operation for the program chain. When the user indicates reproduction, the AV data editing unit 15 instructs the AV data reproduction unit 14 to reproduce the program chain indicated for reproduction.
On receiving reproduction instructions from the AV data editing unit 15, the AV data reproduction unit 14 has the optical pickup seek Mark#1, which is the reproduction start position for Cell#1, as shown in Fig. 44E. Once the optical pickup has moved to Mark#1 in the AV file in accordance with the SEEK command, the AV data editing unit 15 has the section between Mark#1 and Mark#2 read by issuing a READ command to the common file system unit 10. In this way, the VOBUs in Cell#1 are read from the DVD-RAM, before being sequentially decoded by the MPEG decoder 4 and displayed as images on the TV monitor 72.
Once the VOBUs have been decoded as far as Mark#2, the AV data editing unit 15 has the same processing performed for the remaining cells. By doing so, the AV data editing unit 15 has only the sections indicated as Cells #1, #2, #3, and #4 reproduced.
The AV file shown in Fig. 44A is a movie that was broadcast on television. Fig. 44F shows the image content of the different sections in this AV file. The section between time t0 and time t1 is the credit sequence V1 which shows the cast and director of the movie. The section between time t1 and time t2 is the first broadcast sequence V2 of the movie itself. The section between time t2 and time t3 is a commercial sequence V3 that was inserted into the TV broadcast. The section between time t3 and time t4 is the second broadcast sequence V4 in the movie. The section between time t5 and time t6 is the third broadcast sequence V5 in the movie.
Here, times t1, t2, t3, t4, t5, and t6 are set as Mark#1, Mark#2, Mark#3, Mark#4, Mark#5, and Mark#6, and pairs of marks are set as cells. The display order of the cells is set as a program chain.
When performing a read as shown in Fig. 44E, the AV data editing unit 15 has the credit sequence V1 skipped, so that the reproduction starts with the first movie sequence V2 given between the time t1 and the time t2. Following this, the AV data editing unit 15 has the commercial sequence V3 skipped, and has the second movie sequence V4 between the time t3 and the time t4 reproduced.
The following is a description of the operation of the AV data editing unit 15 when the user indicates a real edit operation, with reference to Figs. 45A to 45E and Figs. 46A to 46F. Figs. 45A to 45E show a supplementary example of the processing of the AV data editing unit 15 in the flowchart of Fig. 43. The variables mx, Af in the flowchart of Fig. 43 and Figs. 45A to 45E indicate a position in the AV file. The following explanation deals with the processing of the AV data editing unit 15 for a real edit operation.
First, in step S8, the AV data editing unit 15 determines at least two sections that are to be cut out from the present AV file in accordance with the program chain that was generated during a virtual edit operation.
~he "source AV file" in Fig. 4SA has been given the m~rk point~ Mark#l~ #2, ~3, .... #8. The cells that have been set for this source AV file are defined by pairs of the ! mar~ points Mark#l, #2, #3, ..... #8, so that the A~ data editing unit 15 treat~ the mark points in each pair as an editing start point and an editing end point, respe~ti~ely.
As a result, the AV data editing unit 15 treats the pair of Marks #1 and #2 as the editing start point "In(1)" and the editing end point "Out(1)". The AV data editing unit 15 similarly treats the pair of Marks #3 and #4 as the editing start point "In(2)" and the editing end point "Out(2)", the pair of Marks #5 and #6 as the editing start point "In(3)" and the editing end point "Out(3)", and the pair of Marks #7 and #8 as the editing start point "In(4)" and the editing end point "Out(4)".
The period between Mark#1 and Mark#2 corresponds to the first movie sequence V2 between the time t1 and the time t2 shown in Fig. 44F. Similarly, the period between Mark#3 and Mark#4 corresponds to the second movie sequence V4 between the time t3 and the time t4 shown in Fig. 44F, and the period between Mark#5 and Mark#6 corresponds to the third movie sequence V5 between the time t5 and the time t6. Accordingly, by indicating this real edit operation, the user obtains an AV file that only includes the movie sequences V2, V4, and V5.
Next, in step S9, the AV data editing unit 15 issues a SPLIT command to the AV file system unit 11 to have the determined split region divided into mx AV files (where mx is an integer no less than 2). The AV data editing unit 15 treats each closed area indicated by a pair of an editing start point and an editing end point in Fig. 45A as an area to be cut out, and so cuts out the four AV files shown in Fig. 45B.
The AV data editing unit 15 hereafter specifies one of the cut-out mx AV files using the variable Af, with the cut-out files being numbered AV file Af1, Af2, Af3, ... Afmx. In step S10, the AV data editing unit 15 sets the variable Af at "1" to initialize the variable Af. In step S11, the AV data editing unit 15 issues a READ command to the AV file system unit 11 for the VOBUs (hereinafter called the "last part") located at the end of the AV file Af and the VOBUs (hereinafter called the "first part") located at the start of the AV file Af+1. After issuing these commands, in step S12 the AV data editing unit 15 uses the same procedure as the second embodiment to re-encode the last part of AV file Af and the first part of AV file Af+1.
After the re-encoding, the AV data editing unit 15 issues a SHORTEN command to the AV file system unit 11 for the last part of the AV file Af and the first part of the AV file Af+1 (Af2).
In Fig. 45C, the last part of the AV file Af1 and the first part of the AV file Af2 are read as a result of the READ command and are re-encoded. As a result of the re-encode process, the re-encoded data produced by re-encoding the read data is accumulated in the memory of the DVD recorder 70. In step S13, the AV data editing unit 15 issues a SHORTEN command, which results in the area formerly occupied by the read last and first parts being deleted.
It should be noted that the deletion performed in this way results in one of the two following cases.
The first case is when one of the AV file Af and the AV file Af+1, whose sections to be re-encoded have been deleted, has a continuous length that is equal to or greater than the AV block length, while the continuous length of the other AV file is below the data size of an AV block. Since the length of an AV block is set at the length which prevents underflows from occurring, if AV file Af or Af+1 is reproduced in a state where its continuous length is shorter than the length of an AV block, an underflow will occur in the track buffer.
The second case is where the data size of the data (in-memory data) that has been re-encoded and stored in the memory is below the data size (length) of an AV block. When the data size of the in-memory data is large and so would occupy a region on a DVD-RAM that is equal to or greater than one AV block, the data may be stored at a different position on the DVD-RAM away from the AV files Af and Af+1. However, when the data size of the in-memory data is smaller than one AV block, the data cannot be stored at a different position on the DVD-RAM away from the AV files Af and Af+1.
This is for the following reasons. During a read performed for in-memory data that is smaller than the size of an AV block but is stored at a separate position, a sufficient amount of data cannot be accumulated in the track buffer. Should the jump from the in-memory data to the AV file Af+1 take a relatively long time, an underflow will occur in the track buffer while the jump is taking place.
In Fig. 45D, the broken lines show that the last part of the AV file Af1 and the first part of the AV file Af2 have been deleted. This results in the length of the AV file Af1 being below the length of an AV block, and in the length of the in-memory data being below the length of an AV block.
If this AV file Af1 is left as it is, there is the risk that an underflow will occur when jumping from the AV file Af1 to the AV file Af2. To prevent the occurrence of such underflows, in step S14 the AV data editing unit 15 issues a MERGE command for the AV file Af and the AV file Af+1.
As shown in Fig. 45E and Fig. 46A, this processing results in the linking of the AV file Af1 and the re-encoded VOBUs so that the continuous length of the recording region for all the extents forming the AV file Af1 ends up equal to or longer than the length of an AV block. After issuing the MERGE command, the AV data editing unit 15 judges in step S15 whether the variable Af matches the number of AV files mx-1. If the numbers do not match, the AV data editing unit 15 increments the variable Af in step S16 and returns to step S11. In this way, the AV data editing unit 15 repeats the processing in steps S11 to S14.
After the variable Af has been incremented to become "2", the AV data editing unit 15 issues a READ command so that the last part of the AV file Af2 (after the previous linking) and the first part of the AV file Af3 are read, as shown in Fig. 46B. Once the VOBUs in this last part and first part have been re-encoded, the resulting re-encoded data is stored in the memory of the DVD recorder 70.
The regions on the DVD-RAM that were originally occupied by the first part and the last part are deleted as a result of the SHORTEN command that the AV data editing unit 15 issues in step S13. As a result, the remaining AV file Af3 has a continuous length that is below the length of an AV block. The AV data editing unit 15 issues a MERGE command to the AV file system unit 11 for the AV files Af2 and Af3, as shown in Figs. 46D and 46E. This procedure is repeated until the variable Af is equal to the value mx-1.
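Steps S9 to S16 amount to a loop over the cut-out files. The sketch below restates that control flow in C; the helper functions are printed stand-ins for the SPLIT, READ, re-encode, SHORTEN, and MERGE operations described above and are not the actual interfaces of the AV file system unit 11.

```c
#include <stdio.h>

/* Stand-in declarations; names and signatures are assumptions for this sketch. */
static void split_av_file(int mx) { printf("SPLIT into %d AV files\n", mx); }
static void read_parts(int af)    { printf("READ last part of Af%d and first part of Af%d\n", af, af + 1); }
static void re_encode(int af)     { printf("re-encode boundary of Af%d/Af%d into memory\n", af, af + 1); }
static void shorten(int af)       { printf("SHORTEN Af%d and Af%d\n", af, af + 1); }
static void merge(int af)         { printf("MERGE Af%d with in-memory data and Af%d\n", af, af + 1); }

/* Control flow of steps S9-S16: cut the source file into mx AV files,
 * then link each adjacent pair until Af reaches mx-1. */
static void real_edit(int mx)
{
    split_av_file(mx);                       /* step S9            */
    for (int af = 1; af <= mx - 1; af++) {   /* steps S10, S15, S16 */
        read_parts(af);                      /* step S11           */
        re_encode(af);                       /* step S12           */
        shorten(af);                         /* step S13           */
        merge(af);                           /* step S14           */
    }
}

int main(void)
{
    real_edit(4);   /* four cut-out files, as in the example of Figs. 45A-45E */
    return 0;
}
```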
As a result of the above processing, the extents in the storage area only contain the movie sequences V2, V4, and V5. These extents each have a continuous length that is above the length of an AV block, so that it is guaranteed that there will be no interruptions to the image display during the reproduction of these AV files.
The period between the Mark#1 and the Mark#2 corresponds to the first movie sequence V2. The period between the Mark#3 and the Mark#4 corresponds to the second movie sequence V4, and the period between the Mark#5 and the Mark#6 corresponds to the third movie sequence V5. As a result, by performing an edit operation, the user can obtain a sequence composed of AV files for only the movie sequences V2, V4, and V5.

(3-2-7-1-2) Processing of the AV File System Unit 11 When a SPLIT Command Is Issued
The following explanation deals with the details of the processing by the AV file system unit 11 when providing extended functions in response to a SPLIT command. Fig. 48A shows the operation of the AV file system unit 11 when providing extended functions in response to a SPLIT command.
In this flowchart, one out of the mx pairs of an editing start point (In point) and an editing end point (Out point) is indicated using the variable h. In step S22, the value "1" is substituted into the variable h so that the first pair of In point and Out point are processed.
The AV file system unit 11 generates a file entry (h) in step S31, and adds the file identifier (h) for the file entry (h) in a directory file of a temporary directory.
In step S33, the AV file system unit 11 calculates the first address s of the sequence of u logical blocks (where u ≥ 1) from the logical block corresponding to the In point (h) to the logical block corresponding to the Out point (h), and the number of occupied blocks r.
In step S34, the AV file system unit 11 generates u allocation descriptors within the file entry (h). In step S35, the AV file system unit 11 records the first address s of the sequence of u logical blocks and the number of occupied blocks r into each of the u allocation descriptors. In step S36, the AV file system unit 11 judges whether the variable h has reached the value mx-1.
If the variable h has not reached this value, the AV file system unit 11 increments the variable h and returns to step S31. By doing so, the AV file system unit 11 repeats the processing in steps S31 to S35 until the variable h reaches the value mx-1, and so cuts out the closed sections within each of the mx-1 pairs of an In point and an Out point as AV files.
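Restated as code, the loop of steps S22 and S31 to S36 looks roughly like the sketch below. The structures and the address helpers are simplified assumptions introduced for illustration; they are not the file entry or allocation descriptor formats of the actual file system, and only one allocation descriptor per pair is generated here (u = 1).

```c
#include <stdio.h>

/* Assumed, simplified forms of a file entry and allocation descriptor. */
struct alloc_desc { unsigned long first_lbn; unsigned long num_blocks; };
struct file_entry { struct alloc_desc ad[8]; unsigned int num_ad; };

/* Hypothetical helpers mapping an In/Out point to a logical block number. */
static unsigned long in_point_lbn(int h)  { return 1000UL * (unsigned long)h; }
static unsigned long out_point_lbn(int h) { return 1000UL * (unsigned long)h + 400UL; }

/* Steps S22, S31-S36: build one file entry per In/Out pair.
 * The loop bound (h up to mx-1) follows the text above. */
static void split(struct file_entry *entries, int mx)
{
    for (int h = 1; h <= mx - 1; h++) {              /* steps S22, S36 */
        struct file_entry *fe = &entries[h - 1];     /* step S31       */
        unsigned long s = in_point_lbn(h);           /* step S33       */
        unsigned long r = out_point_lbn(h) - s + 1;
        fe->ad[0].first_lbn  = s;                    /* steps S34, S35 */
        fe->ad[0].num_blocks = r;
        fe->num_ad = 1;
    }
}

int main(void)
{
    struct file_entry entries[4];
    split(entries, 4);
    for (int i = 0; i < 3; i++)
        printf("file entry %d: first LBN %lu, %lu blocks\n",
               i + 1, entries[i].ad[0].first_lbn, entries[i].ad[0].num_blocks);
    return 0;
}
```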

(3-2-7-1-3) Processing of the AV File System Unit 11 When a SHORTEN Command Is Issued
The following explanation deals with the processing of the AV file system unit 11 when providing extended functions in response to a SHORTEN command. Fig. 48 is a flowchart showing the content of this processing.
In step S38, the AV file system unit 11 calculates both the first address c of the logical block sequence between the deletion start address and the deletion end address that specify the area to be deleted and the number of occupied blocks d. In step S45, the AV file system unit 11 accesses the allocation descriptors of the AV file whose first or last part is to be deleted. In step S46, the AV file system unit 11 judges whether the area to be deleted is the first part of the extent of an AV file.
If the area to be deleted is the first part of an extent ("Yes" in step S46), the AV file system unit 11 advances to step S47 and updates the storage first address p of the extent to the storage first address p+c*d in the allocation descriptor.
After this, in step S48 the AV file system unit 11 updates the data size q of the extent, given as the number q of occupied blocks in the allocation descriptor, to the data size q-c*d. On the other hand, if in step S46 the AV file system unit 11 finds that the area to be deleted is the last part of an AV file, the AV file system unit 11 proceeds directly to step S48, and updates the data size q of the extent, given as the number q of occupied blocks in the allocation descriptor, to the data size q-c*d.
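Steps S46 to S48 reduce to a small update of one allocation descriptor. The sketch below is a simplifying assumption of that update: deleting d occupied blocks from the first part of an extent is modeled as advancing its start address by d and shrinking its size by d, and deleting from the last part as shrinking its size by d (the text's p+c*d and q-c*d notation is kept only in the comments). The structure reuses the simplified form from the previous sketch.

```c
#include <stdio.h>

struct alloc_desc { unsigned long first_lbn; unsigned long num_blocks; };

/* Sketch of steps S46-S48.  The text writes the updates as p -> p+c*d and
 * q -> q-c*d; here, as an assumption, the deletion of d blocks is modeled
 * directly on the descriptor. */
static void shorten_extent(struct alloc_desc *ad, unsigned long d, int first_part)
{
    if (first_part) {            /* "Yes" in step S46 */
        ad->first_lbn  += d;     /* step S47 */
        ad->num_blocks -= d;     /* step S48 */
    } else {                     /* last part of the AV file */
        ad->num_blocks -= d;     /* step S48 */
    }
}

int main(void)
{
    struct alloc_desc ad = { 5000UL, 120UL };
    shorten_extent(&ad, 16UL, 1);   /* delete 16 blocks from the first part */
    printf("extent now starts at LBN %lu and spans %lu blocks\n",
           ad.first_lbn, ad.num_blocks);
    return 0;
}
```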

(3-2-7-1-4) Processing of the AV File System Unit 11 When a MERGE Command Is Issued
The following explanation deals with the processing content of the AV file system unit 11 when providing extended functions in response to a MERGE command. The following explanation aims to clarify the procedure used to process the areas surrounded by the dot-dash lines y3, y4 in Fig. 45E and Fig. 46D.
In response to a MERGE command, the AV file system unit 11 arranges the AV files Af and Af+1, which were partially deleted as a result of the SPLIT and SHORTEN commands, and the re-encoded data (in-memory data), which is present in the memory of the DVD recorder 70 as a result of the re-encoding, onto the DVD-RAM in a way that enables the seamless reproduction of the AV file Af, the data in the memory, and the AV file Af+1 in that order.
Fig. 47A shows an example of the AV data processed by the AV file system unit 11 when providing extended functions in response to a MERGE command. In Fig. 47A, the AV files x and y have been processed according to a SPLIT command.
The virtual editing is assumed to have defined a reproduction route whereby the AV data is reproduced in the order AV file x → in-memory data → AV file y.
Fig. 47A shows an example reproduction route for the AV data in the AV files x and y. In Fig. 47A, the horizontal axis represents time, so that the reproduction route can be seen to set the display order as AV file x → in-memory data → AV file y.
Of the AV data in AV file x, the data part m located at the end of AV file x is stored in a consecutive area of the DVD-RAM, with this being called the "former extent".
Of the AV data in AV file y, the data part n located at the start of AV file y is also stored in a consecutive area of the DVD-RAM, with this being called the "latter extent".
As a result of the "SPLIT" .~ ~nd, AV flles x and y are obtained uith certain sections of Av data ha~ing been cut ~w~y However, while the file system mdnages the areas on the disc that correspond to the cut-away data as if they were empty, the data of the original Av file is in ~act left as i~ is in the logical ~lock~ on the DVD-R~l.
It is assumed that when the reproduction route is set by the user, the user does not need to consider the way in which AV blocks on the DVD-RAM store the cut-away AV files.
As a result, there is no way in which the positions on the DVD-RAM storing the former and latter extents can be identified for certain. Even if the reproduction route specifies the order as AV file x → AV file y, there is still the possibility of AV data that is unrelated to the present reproduction route being present on the disc between the former and the latter extent.
In view of the above consideration, the linking of AV files cut away by a SPLIT command does not assume that the former extent and latter extent are recorded at consecutive positions on the DVD-RAM, and so should instead assume that the former extent and latter extent are recorded at completely unrelated positions on the DVD-RAM.
Here, it should be assumed that at least one "other file extent", which is unrelated to the reproduction route indicating the AV files x and y, is present between the storage regions of the former extent and the latter extent.
Fig. 47B shows a representation of the positional relationship of the storage areas on the DVD-RAM of the former extent and the latter extent, in view of the above consideration.
The AV file x including the former extent is partially cut away as a result of the SPLIT command, and so includes an empty area that follows the former extent. This area is called the Out area. As described above, this Out area in fact still logically includes the data of the AV file x that was cut out, although the AV file system unit 11 treats the area as an empty area since the SPLIT command has already been issued.
The AV file y including the latter extent is partially cut away as a result of the SPLIT command, and so includes an empty area that precedes the latter extent. This area is called the In area. As described above, this In area in fact still logically includes the data of the AV file y that was cut out, although the AV file system unit 11 treats the area as an empty area since the SPLIT command has already been issued.
In Fig. 47B, the former extent is stored at a preceding position to the latter extent, though this merely illustrates one example, so that it is perfectly possible for the latter extent to be stored at a preceding position to the former extent.
In the present example, the other file extent is present between the former extent and the latter extent.
While the In area and the Out area are ideal for the recording of the in-memory data, the continuous length of the In area and the Out area is restricted due to the presence of the other file extent between the former extent and the latter extent.
In step S62 in the flowchart of Fig. 49, the AV file system unit 11 calculates the data size of the Out area and the data size of the In area.
On finding the data sizes of the In area and the Out area, the AV file system unit 11 refers to the data size m of the former extent and the data size n of the latter extent and judges whether the former extent could cause an underflow in the track buffer during reproduction.
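The case analysis of Fig. 49 (steps S62 to S69) can be summarized as a selection on the sizes m, n, and k relative to the AV block length B. The dispatcher below is only an illustrative reading of that flowchart: the enum names are invented for this sketch, i and j are used here for the sizes of the Out area and the In area, and the detailed placement for each case is left to the handlers described in the following subsections.

```c
#include <stdio.h>

/* Which of the subsections (3-2-7-1-4-1) to (3-2-7-1-4-4) applies.
 * B is the AV block length; m, n, k are the sizes of the former extent,
 * the latter extent, and the in-memory data; i and j are taken here as
 * the sizes of the Out area and the In area. */
enum merge_case {
    FORMER_SHORT,     /* m <  B, n >= B : Fig. 50 */
    LATTER_SHORT,     /* m >= B, n <  B : Fig. 55 */
    BOTH_SHORT,       /* m <  B, n <  B : Fig. 60 */
    IN_MEMORY_ALONE,  /* k >= B: record the in-memory data in an empty area */
    FITS_IN_GAPS,     /* k fits in the In/Out areas : Fig. 65 */
    EXCEEDS_GAPS      /* k exceeds the In/Out areas : Fig. 67 */
};

static enum merge_case select_case(unsigned long m, unsigned long n,
                                   unsigned long k, unsigned long i,
                                   unsigned long j, unsigned long B)
{
    if (m < B && n >= B) return FORMER_SHORT;      /* step S63 */
    if (m >= B && n < B) return LATTER_SHORT;      /* step S64 */
    if (m < B && n < B)  return BOTH_SHORT;        /* step S65 */
    if (k >= B)          return IN_MEMORY_ALONE;   /* step S66 */
    if (k <= i + j)      return FITS_IN_GAPS;      /* (3-2-7-1-4-3) */
    return EXCEEDS_GAPS;                           /* step S69, (3-2-7-1-4-4) */
}

int main(void)
{
    unsigned long B = 4096;
    printf("selected case: %d\n", (int)select_case(1000, 5000, 800, 600, 300, B));
    return 0;
}
```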

(3-2-7-1-4-1) Processing When the Former Extent m is Less Than the AV Block Length
When the former extent m is shorter than the AV block length and the latter extent n is at least equal to the AV block length, an underflow may occur for the former extent m. The processing proceeds to step S70 in Fig. 50.
Fig. 50 is a flowchart for when the former extent m is shorter than the AV block length and the latter extent n is at least equal to the AV block length. The processing by the AV file system unit 11 in Fig. 50 is explained with reference to Figs. 51, 52, and 53. Figs. 51, 52, and 53 show the relationships among the data sizes of the extents m and n, the In area and the Out area i and j, the in-memory data k, and the AV block B, as well as the areas in which each piece of the data is recorded and the areas to which the data is moved.
The former extent is shorter than the AV block length. As a result, an underflow would occur if no remedial action were taken. Accordingly, the flowchart in Fig. 50 shows the processing to determine the appropriate storage location for the former extent and the in-memory data.
In step S70, it is judged whether the sum of the sizes of the former extent and the in-memory data is equal to or greater than the AV block length. If so, the processing proceeds to step S71, and it is judged whether the Out area is larger than the in-memory data. When the Out area is larger than the in-memory data, the in-memory data is written in the Out area to make the consecutive length of the former extent at least equal to the AV block length.

Fig. 51A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship i ≥ k, m+k ≥ B. In Fig. 51B, when the in-memory data is recorded in the Out area, the consecutive length of the former extent becomes at least equal to the AV block length.
On the other hand, when the Out area is smaller than the in-memory data, data is moved. Fig. 52A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship i < k, m+k ≥ B.
In Fig. 52A, the former extent is first read into the memory, and in Fig. 52B the former extent is written in an empty area in the same zone as the former extent. After the former extent has been moved, the in-memory data is written immediately after the moved former extent, as shown in Fig. 52C.
When the sum of the sizes of the former extent and the in-memory data is less than the AV block length, the processing proceeds to step S72. In step S72, it is judged whether the sum of the sizes of the former extent, the latter extent, and the in-memory data is at least equal to two AV block lengths. When the sum of the sizes is less than the AV block length, even if data is moved, the size remains less than the AV block length, so that an underflow occurs. When the sum of the sizes is less than two AV block lengths, even if the former extent, the in-memory data, and the latter extent are written in a logical block, the recording time will not be too long. In the flowchart in Fig. 50, when the sum of the sizes of the in-memory data, the former extent, and the latter extent is less than two AV blocks, the processing proceeds from step S72 to step S73, and the former extent and the latter extent are moved.
Fig. 53A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship i < k, m+k < B, B ≤ m+n+k < 2B. In this case, a search is performed for an empty area in the same zone as the former extent and the latter extent. When an empty area is found, the former extent is read into the memory and is written in the empty area to move the former extent to the empty area, as shown in Fig. 53B. After the move, the in-memory data is written just after the moved former extent, as shown in Fig. 53C. After the in-memory data has been written, the latter extent is read into the memory and is written immediately after the occupied area of the in-memory data to move the latter extent to the empty area, as shown in Fig. 53D.
When the sum of the sizes of the in-memory data, the former extent, and the latter extent is at least equal to two AV block lengths, the processing proceeds from step S72 to step S74. When the sum of the sizes is equal to or greater than two AV block lengths, it will take a long time to write the data into the logical block. Meanwhile, a simple method in which the former extent is moved and the in-memory data is written just after the moved former extent should not be adopted in view of the access speed. Here, it should be especially noted that the processing proceeds from step S72 to step S74 because the sum of the sizes of the in-memory data and the former extent is less than the AV block length. The reason why the sum of the sizes of the in-memory data and the former extent is less than the AV block length yet the sum of the sizes of the in-memory data, the former extent, and the latter extent is at least equal to two AV block lengths is that the latter extent size is relatively large, with the difference between the latter extent size and the AV block length being large. As a result, when the sum of the sizes of the former extent and the in-memory data is less than the AV block length, part of the data in the latter extent may be added to the sum, with there being no risk of the remaining data size of the latter extent being insufficient.
When the sum of the sizes of the in-memory data, the former extent, and the latter extent is at least equal to two AV block lengths, the processing proceeds from step S72 to step S74, and the data are linked in the manner shown in Figs. 54A to 54D.
Fig. 54A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship m+k < B, m+n+k ≥ 2B. In this case, a search is performed for an empty area in the same zone as the former extent and the latter extent. When such an empty area is found, the former extent is read into the memory and is then written in the empty area to move the former extent, as shown in Fig. 54B. Next, the in-memory data is written immediately after the moved former extent, as shown in Fig. 54C. When the in-memory data has been written, a set of data that is sufficiently large to make the size of the data in this empty area equal to the AV block size is moved from the start of the latter extent to just after the in-memory data, as shown in Fig. 54D.
After the former extent, the in-memory data, and the front part of the latter extent are linked in the above-described procedure, the file entries of the AV file Af that includes the former extent and the AV file Af+1 are integrated. One integrated file entry is obtained, and the processing ends.
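Restated as code, the decisions of steps S70 to S74 look roughly like the sketch below. The write and move helpers are printed placeholders rather than real disc operations, and the thresholds simply follow the description above.

```c
#include <stdio.h>

/* Placeholder actions; in the recorder these are reads, writes and moves
 * of logical blocks on the DVD-RAM. */
static void write_in_out_area(void)       { printf("write in-memory data in the Out area\n"); }
static void move_former_then_append(void) { printf("move former extent to an empty area, then append in-memory data\n"); }
static void move_former_and_latter(void)  { printf("move former extent, in-memory data and latter extent to an empty area\n"); }
static void borrow_from_latter(void)      { printf("move former extent and in-memory data, then take data from the start of the latter extent\n"); }

/* Sketch of Fig. 50 (former extent m shorter than AV block length B).
 * i is the size of the Out area, k the size of the in-memory data,
 * n the size of the latter extent. */
static void merge_former_short(unsigned long m, unsigned long n,
                               unsigned long k, unsigned long i,
                               unsigned long B)
{
    if (m + k >= B) {                    /* step S70 */
        if (i >= k)                      /* step S71 */
            write_in_out_area();         /* Fig. 51 */
        else
            move_former_then_append();   /* Fig. 52 */
    } else if (m + n + k < 2 * B) {      /* step S72 */
        move_former_and_latter();        /* step S73, Fig. 53 */
    } else {
        borrow_from_latter();            /* step S74, Fig. 54 */
    }
}

int main(void)
{
    merge_former_short(1500, 5000, 900, 1200, 4096);
    return 0;
}
```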

(3-2-7-1-4-2) Processing When the Latter Extent n is Shorter Than the AV Block Length
When the judgment "No" is given in step S63 in the flowchart of Fig. 49, the processing proceeds to step S64 where it is judged whether the former extent m is at least equal to the AV block length but the latter extent n is shorter than the AV block length. In other words, in step S64, it is judged whether an underflow may occur for the latter extent.
Fig. 55 is a flowchart for when the latter extent is shorter than the AV block length and the former extent is at least equal to the AV block length. The processing by the AV file system unit 11 in the flowchart in Fig. 55 is explained with reference to Figs. 56, 57, 58, and 59. Figs. 56, 57, 58, and 59 show the relationships among the data sizes of the extents m and n, the In area and the Out area i and j, the in-memory data k, and the AV block B, as well as the areas in which each piece of the data is recorded and the areas to which the data is moved.
In step S75, it is judged whether the sum of the sizes of the latter extent and the in-memory data is at least equal to the AV block length. If so, the processing proceeds from step S75 to step S76, where it is judged whether the In area is larger than the in-memory data. Fig. 56A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship j ≥ k, n+k ≥ B. In Fig. 56B, the recording of the in-memory data in the In area results in the consecutive length of the latter extent becoming at least equal to the AV block length.
On the other hand, when the In area is smaller than the in-memory data, data is moved. Fig. 57A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship j < k, n+k ≥ B. In this case, a search is performed for an empty area in the same zone as the former extent and the latter extent. When such an empty area is found, the in-memory data is written in the empty area, as shown in Fig. 57B. The latter extent is then read into the memory and is written immediately after the occupied area of the in-memory data, as shown in Fig. 57C.
When the sum of the sizes of the latter extent and the in-memory data is less than the AV block length, the processing proceeds from step S75 to step S77. In step S77, it is judged whether the sum of the sizes of the former extent, the latter extent, and the in-memory data is at least equal to two AV block lengths.
When the sum of the sizes is less than two AV block lengths, the processing proceeds to step S78. Fig. 58A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship j < k, n+k < B, m+n+k < 2B. In step S78, the AV file system unit 11 searches for an empty area in the same zone as the former extent and the latter extent. When such an empty area is found, the former extent is read into the memory and is written into the empty area to move the former extent to the empty area, as shown in Fig. 58B. Next, the in-memory data is written immediately after the moved former extent, as shown in Fig. 58C. When the in-memory data has been written, the latter extent is read into the memory and is written immediately after the area occupied by the in-memory data to move the latter extent to the empty area, as shown in Fig. 58D.
When the sum of the sizes of the in-memory data, the former extent, and the latter extent is at least equal to two AV block lengths, the processing proceeds from step S77 to step S79, and the data are linked in the manner shown in Figs. 59A to 59D.
Fig. 59A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship n+k < B, m+n+k ≥ 2B. In this case, a search is performed for an empty area in the same zone as the former extent and the latter extent. When such an empty area is found, data with a data size of (the AV block length - (n+k)) is moved from the end of the former extent to the empty area, as shown in Fig. 59B. As shown in Fig. 59C, the in-memory data is written immediately after this data moved from the former extent. When the in-memory data has been written, the latter extent is moved to immediately after the occupied area of the in-memory data, as shown in Fig. 59D.
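The latter-extent case mirrors the previous one. The sketch below restates steps S75 to S79 with the same kind of printed placeholder actions; the names and thresholds are only an illustration of the description above.

```c
#include <stdio.h>

static void write_in_in_area(void)       { printf("write in-memory data in the In area\n"); }
static void write_then_move_latter(void) { printf("write in-memory data in an empty area, then move the latter extent after it\n"); }
static void move_all_three(void)         { printf("move former extent, in-memory data and latter extent to an empty area\n"); }
static void borrow_from_former(void)     { printf("move (B - (n+k)) of data from the end of the former extent, then the in-memory data and the latter extent\n"); }

/* Sketch of Fig. 55 (latter extent n shorter than AV block length B).
 * j is the size of the In area, k the size of the in-memory data,
 * m the size of the former extent. */
static void merge_latter_short(unsigned long m, unsigned long n,
                               unsigned long k, unsigned long j,
                               unsigned long B)
{
    if (n + k >= B) {                    /* step S75 */
        if (j >= k)                      /* step S76 */
            write_in_in_area();          /* Fig. 56 */
        else
            write_then_move_latter();    /* Fig. 57 */
    } else if (m + n + k < 2 * B) {      /* step S77 */
        move_all_three();                /* step S78, Fig. 58 */
    } else {
        borrow_from_former();            /* step S79, Fig. 59 */
    }
}

int main(void)
{
    merge_latter_short(5000, 1500, 900, 1200, 4096);
    return 0;
}
```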
When the judgement "No" is given in step S64 in the flowchart in Fig. 49, the processing proceeds to step S65, where it is judged whether the both the former extent m and the latter extent n are shorter than the AV block length is judged. In other words, it is judged whether an underflow . .

FROM 1998~ 9~17~(~) 0:40/~22:21/~ 4600149702 P 18 -may occur for ~oth the former extent m and the lattcr oxtent n.
Fig. 60 is a flowchart for when both the former extent and the latter extent are shorter than the AV block length. The processing by the AV file system unit 11 in the flowchart in Fig. 60 is explained with reference to Figs. 61, 62, 63, and 64. Figs. 61, 62, 63, and 64 show the relationships among the data sizes of the extents m and n, the In area and the Out area i and j, the in-memory data k, and the AV block B, as well as the areas in which each piece of the data is recorded and the areas to which the data is moved.
In step S80 in this flowchart, it is judged whether the sum of the sizes of the in-memory data, the former extent, and the latter extent is at least equal to the AV block length.
If not, the processing proceeds to step S81. In this case, the sum of the sizes of the former extent, the in-memory data, and the latter extent is shorter than the AV block length. As a result, it is judged whether there is an extent which follows the latter extent. When no extent follows the latter extent, the latter extent is at the end of the AV file that is created by the linking of data, so that no additional processing is needed. When an extent follows the latter extent, an underflow may occur since the sum of the sizes of the former extent, the in-memory data, and the latter extent is less than the AV block length. In order to avoid such an underflow, the extent following the latter extent is linked to the latter extent by the link processing shown in Figs. 61A-61D. Fig. 61A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship m+n+k < B. In step S81, the AV file system unit 11 writes the in-memory data in the In area, as shown in Fig. 61B. When the in-memory data has been written in the In area, the AV file system unit 11 reads the latter extent into the memory and writes the read latter extent immediately after the area occupied by the in-memory data to move the latter extent, as shown in Fig. 61C.
Then, as shown in Fig. 61D, the AV file system unit 11 takes data whose size is (the AV block length - (the former extent + the in-memory data + the latter extent)) from the extent following the latter extent. The AV file system unit 11 links this data with the former extent, the in-memory data, and the latter extent.
When the sum of the sizes of the former extent, the latter extent, and the in-memory data is at least equal to the AV block length, the processing proceeds to step S82.
In step S82, the AV file system unit 11 judges whether the data size of the Out area following the former extent is less than the sum of the sizes of the latter extent and the in-memory data. If not, the processing proceeds to step S83. Fig. 62A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship i ≥ n+k, m+n+k ≥ B. In step S83, the AV file system unit 11 writes the in-memory data into the Out area, as shown in Fig. 62B. After writing the in-memory data, the AV file system unit 11 reads the latter extent into the memory and writes the latter extent immediately after the occupied area of the in-memory data to move the latter extent.
When the data size of the Out area following the former extent is less than the sum of the sizes of the latter extent and the in-memory data, the processing proceeds from step S82 to step S84. In step S84, it is judged whether the data size of the In area preceding the latter extent is less than the sum of the sizes of the former extent and the in-memory data. If not, the processing proceeds to step S85.
Fig. 63A shows an arrangement of the former extent, the latter extent, the In area, and the Out area on the DVD-RAM in a relationship i < n+k, m+n+k ≥ B. In step S85, the AV file system unit 11 writes the in-memory data in the In area, as shown in Fig. 63B. After writing the in-memory data, the AV file system unit 11 reads the former extent into the memory and writes the former extent into a storage area immediately before the occupied area of the in-memory data to move the former extent to the In area, as shown in Fig. 63C.
When the ~udgement "No" is given ln step S84, the processing proceeds to step S86. Fig. 64A ~hows an FROM ~ 1998~ 9~17a(~) 0:4l~22:2~ 4600149702 P 21 arrangement Or the former extent, the latter exte~t, the In area, and the Out area on the DvD-RAM in a relationship i~n+k, jCml k, m+n~k2E~. In step S86, it is judged whether the sum of the ~izes of the former extent, the latter extent, and the in-memory data i~ more than t~o AV block leng~hs. If not, the AV file system unit ll ~earche~ for an e~pty area in the ~ame zone as the former extent. When an empty area is found, the AV file ~y~tem unit ll reads the former extent into the memory and writes the read former extent into the empty area to move the former extent to the empty area, as shown in Fig. 64B. After the move, the AV
file ~y~tem unit ll write~ the in-memory data into a ~torage area immediately after the moved former extent, as shown in Fig. 64C. After ~riting the in-memory data, the AV file system unit ll reads the latter extent into the memory and writes the latter extent into a storage area just after the occupied area of the in-memory to mo~e the latter extent to the em~ty area, a~ ~hown in Fig. 64D.
When the combined size of the former extent, the latter extent, and the in-memory data exceeds two AV block lengths, it is judged which of the Out area and the In area is the larger. When the Out area is the larger, a part of the in-memory data is recorded in the Out area to make the continuous length equal to the AV block length. The remaining part of the in-memory data is recorded in a different empty area, and the latter extent is moved to a position directly after this remaining part of the in-memory data.
When the In area is the larger, the AV file system unit 11 moves the former extent to an empty area and records a first part of the in-memory data to make the continuous length equal to the AV block length. After this, the remaining part of the in-memory data is recorded in the In area.
As a result of the above processing for moving extents, the total consecutive length can be kept equal to or below two AV block lengths.
After the former extent, the in-memory data, and the front part of the latter extent are linked in the above-described procedure, the file entries of the AV file Af that includes the former extent and the AV file Af+1 are integrated. One integrated file entry is obtained, and the processing ends.

(3-2-7-1-4-3) Processing When Both the Former Extent and the Latter Extent are at Least Equal to the AV Block Length
When the judgement "No" is given in step S65 in the flowchart of Fig. 49, the processing proceeds to step S66 where it is judged whether the in-memory data is at least equal to the AV block length. If so, the in-memory data is recorded in an empty area and the processing ends. When the judgment "No" is given in step S66 in the flowchart of Fig. 49, the AV file system unit 11 judges whether the former extent m is at least equal to the AV block length and the latter extent n is at least equal to the AV block length, but the in-memory data is smaller than the combined size of the In area i and the Out area j. Fig. 65 is a flowchart for the case when the former extent and the latter extent are at least equal to the AV block length.
Figs. 66A-66D show a supplementary example of the processing of the AV file system unit 11 in Fig. 65. In Fig. 66A, the former extent and latter extent are both at least equal to the AV block length. Figs. 66B-66D show how the in-memory data and extents are recorded in the In area, Out area, and other empty areas as a result of the steps in Fig. 65.
In this case, there is no risk of an underflow occurring for either the former or the latter extent. It would be ideal, however, if the in-memory data could be recorded in at least one of the Out area following the AV file Af and the In area preceding the AV file Af+1 without having to move the former or latter extent.
In step S87 of the flowchart in Fig. 65, it is judged whether the size of the Out area exceeds the data size of the in-memory data. If so, the in-memory data is simply recorded into the Out area in step S88, as shown in Fig. 66B.
If the size of the Out area is below the data size of the in-memory data, the processing proceeds to step S89, where it is judged whether the size of the In area exceeds the data size of the in-memory data. If so, the in-memory data is simply recorded into the In area in step S90, as shown in Fig. 66C. If the in-memory data cannot be recorded into either the Out area or the In area, the processing proceeds to step S91 where the in-memory data is divided into two parts that are respectively recorded in the Out area and the In area, as shown in Fig. 66D.
After the former extent, the in-memory data, and the front part of the latter extent are linked in the above-described procedure, the file entries of the AV file Af that includes the former extent and the AV file Af+1 are integrated. One integrated file entry is obtained, and the processing ends.
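When both extents are already at least one AV block long, only the in-memory data has to be placed. The sketch below restates steps S87 to S91; the function and variable names are assumptions for illustration.

```c
#include <stdio.h>

static void record_in_out_area(void)  { printf("record in-memory data in the Out area\n"); }
static void record_in_in_area(void)   { printf("record in-memory data in the In area\n"); }
static void split_between_areas(void) { printf("divide in-memory data between the Out area and the In area\n"); }

/* Sketch of Fig. 65: both extents are at least one AV block long and the
 * in-memory data k fits in the combined In/Out areas. */
static void place_in_memory_data(unsigned long k, unsigned long out_area,
                                 unsigned long in_area)
{
    if (out_area > k)             /* step S87 */
        record_in_out_area();     /* step S88, Fig. 66B */
    else if (in_area > k)         /* step S89 */
        record_in_in_area();      /* step S90, Fig. 66C */
    else
        split_between_areas();    /* step S91, Fig. 66D */
}

int main(void)
{
    place_in_memory_data(900, 600, 1200);
    return 0;
}
```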

(3-2-7-1-4-4) Processing When Both the Former Extent and the Latter Extent are at Least Equal to the AV Block Length
In step S69 in the flowchart of Fig. 49, it is judged whether the former extent m is at least equal to the AV block length and the latter extent n is at least equal to the AV block length, but the size of the in-memory data k exceeds the combined size of the Out area j and the In area i.
Fig. 67 is a flowchart showing the processing when both the former extent and the latter extent are at least equal to the AV block length but the combined size of the In area and the Out area is below the data size of the in-memory data.
Figs. 68A-68E show supplementary examples for the processing of the AV file system unit 11 in the flowchart of Fig. 67. In Fig. 68A, both the former extent and the latter extent are at least equal to the AV block length. Figs. 68B-68D show how the extents and in-memory data are recorded in the In area, Out area, and other empty areas as a result of the steps in Fig. 67.
In this case, both the former extent and the latter extent are at least equal to the AV block length, so that there is no risk of an underflow occurring, although the recording area of the in-memory data must have a continuous length that is at least equal to the AV block length.
In step S92, it is judged whether the total size of the former extent and the in-memory data is at least equal to two AV block lengths.
If the total size exceeds two AV block lengths, the processing advances to step S93 where data whose size is (AV block length - data size of in-memory data k) is read from the end of the former extent and moved to an empty area where the in-memory data is also recorded. This results in the recording state of this empty area and both extents being equal to the AV block length, as shown in Fig. 68B.
I~ the judgement "No" is given in step S92, the processing advances to step S94, where it is judged whether the total size of the latter extent and the in-memory data is at lea~t equal to two AV block lengths. If so, the ! processing follows the pattern in step S92, since an exce~ively long logical block write operation i~ to be avoided and since a relatively large amount o~ data can be ., _ . .

'98~09,q17~ ) 10:49 3~G R I CHE S ~ F~T P23i45 mo~ed fro~ the latter extent ~ithout any risk of the latter extent ending up ~horter than AV block length.
If the total size of the latter extent and the in-memory data is at least equal to two AV block lengths, the processing advances to step S95, where data whose size is (AV block length - data size of in-memory data k) is read from the start of the latter extent and moved to an empty area in the same zone as the former and latter extents, where the in-memory data is then also recorded. This results in the recording state of this empty area and both extents being equal to the AV block length, as shown in Fig. 68C.
If the total size of the former extent and the in-memory data is below two AV block lengths, and the total size of the latter extent and the in-memory data is below two AV block lengths, the total data amount to be written into logical blocks will be less than two AV block lengths, so that the move processing can be performed without concern for the time taken by the write processing involved. Accordingly, when the total size of the former extent and the in-memory data is below two AV block lengths, and the total size of the latter extent and the in-memory data is below two AV block lengths, the processing advances to step S96, where the larger of the former extent and the latter extent is found. In this situation, either the former or the latter extent may be moved, although in the present embodiment it is ideal for the smaller of the two to be moved, hence this judgement in step S96. When the former extent is the smaller of the two, in step S97 the former extent is moved, with the in-memory data then being recorded at a position immediately after the moved former extent. When doing so, the continuous length of the data recorded in this empty area will be below two AV block lengths, as shown in Fig. 68D.
When the latter extent is the smaller of the two, in step S98 the latter extent is moved, with the in-memory data then being recorded at a position immediately before the moved latter extent. When doing so, the continuous length of the data recorded in this empty area will be below two AV block lengths, as shown in Fig. 68E.
After the former extent, the in-memory data, and the front part of the latter extent are linked in the above-described procedure, the file entries of the AV file Af that includes the former extent and the AV file Af+1 are integrated. One integrated file entry is obtained, and the processing ends.
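This last case must carve a contiguous run of at least one AV block out of the extents themselves. The sketch below restates steps S92 to S98; as before, the actions are printed placeholders and the names are assumptions for illustration.

```c
#include <stdio.h>

static void take_from_former_end(void)   { printf("move (B - k) of data from the end of the former extent, record in-memory data after it\n"); }
static void take_from_latter_start(void) { printf("move (B - k) of data from the start of the latter extent, record in-memory data with it\n"); }
static void move_former_whole(void)      { printf("move the whole former extent, record in-memory data after it\n"); }
static void move_latter_whole(void)      { printf("move the whole latter extent, record in-memory data before it\n"); }

/* Sketch of Fig. 67: both extents m and n are at least one AV block B
 * long, but the in-memory data k does not fit in the In/Out areas. */
static void merge_exceeds_gaps(unsigned long m, unsigned long n,
                               unsigned long k, unsigned long B)
{
    if (m + k >= 2 * B) {            /* step S92 */
        take_from_former_end();      /* step S93, Fig. 68B */
    } else if (n + k >= 2 * B) {     /* step S94 */
        take_from_latter_start();    /* step S95, Fig. 68C */
    } else if (m <= n) {             /* step S96: move the smaller extent */
        move_former_whole();         /* step S97, Fig. 68D */
    } else {
        move_latter_whole();         /* step S98, Fig. 68E */
    }
}

int main(void)
{
    merge_exceeds_gaps(9000, 5000, 900, 4096);
    return 0;
}
```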
Flowcharts ~or "ME~GE" processing in a variety of clrcumstances ha~e been explained, with it being possible to limit the data size of the moved a~d recorded data to two AV
¦ block lengths in the worst case scenario. However, this does not mean that there are no cases where data that exceeds two ~ blocks lengths need~ to be written, w~th the following two cases de~cribing such exceptions where data that exceeds two AV blocks lengths need4 to be written.


In the first exception, an empty area with a continuous length of two AV block lengths is required, although only separate empty areas of AV block length are available. In this case, to create an empty area with a continuous length of two AV block lengths, AV data for one AV block length must be moved. In the second exception, in step S81 of Fig. 60, the moving of data from the extent following the latter extent results in the remaining part of that extent becoming below the AV block length. In this case, a further move operation becomes necessary, with the total amount of moved data in the entire processing exceeding two AV block lengths.
While the above explanation only deals with the linking of two AV files and in-memory data, a MERGE command may be executed for only one AV file and in-memory data. This case is the same as when adding data to the final extent in an AV file, so that the total size after such addition needs to be at least equal to the AV block size. As a result, the in-memory data is recorded into the Out area following this final extent. When the Out area is too small to record all the in-memory data, the remaining part of the in-memory data may be recorded in a separate empty AV block.
The above linking process has been explained on the premise of seamless reproduction within a file, although it may also be used for seamless reproduction across files.
Seamless reproduction across files refers to a branch in reproduction from a present AV file to another AV file. In the same way as described above, when linking two AV files and in-memory data, the continuous length of each extent must be at least equal to the AV block length, so that a thorough link procedure must be used.
This completes the explanation of the linking procedure used by the AV file system unit 11.

(3-2-7-1-5) Updating of the VOB Information and PGC Information
The following is an explanation of the updating of the VOB information (time map table, seamless linking information) and PGC information (cell information) when executing a SPLIT command or MERGE command.
First, the processing when a SPLIT command has been executed will be explained. Out of the plurality of AV files that are obtained by the execution of the SPLIT command, one AV file is assigned the same AV_File ID as the AV file which recorded the VOB from which it was split. The AV_File IDs of the other AV files split from the AV file, however, need to be assigned new values. VOBs that were originally recorded as an AV file will lose several sections due to the execution of a SPLIT command, so that the marks that indicated the lost sections need to be deleted. In the same way, the cell information that gave these marks as the start points and end points needs to be deleted from the RTRW management file.
In addition to deleting the mark points, it is necessary to generate new cell information that indicates the video presentation start frame of the AV file as C_V_S_PTM and the video presentation end frame of the AV file as C_V_E_PTM, and to add this new cell information to the RTRW management file.
The VOB information that includes the seamless linking information and time map table is divided into a plurality of parts when the corresponding VOB is divided. In more detail, if mx VOBs are produced by the division, the VOB information is divided to give mx time map tables and mx sets of seamless linking information.
The video presentation start time VOB_V_S_PTM and the video presentation end time VOB_V_E_PTM of a VOB generated by the processing that accompanies the execution of the SPLIT command are respectively set based on the C_V_S_PTM and C_V_E_PTM indicated by the start point and end point in the cell information used for the SPLIT command. The LAST_SCR and FIRST_SCR in the seamless linking information are also updated.
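The splitting of VOB information described above can be sketched as follows. This is only an illustrative Python model; the class and field names (VobInfo, time_map entries keyed by "ptm", and so on) are assumptions made for the sketch, not the format used on the disc.

from dataclasses import dataclass, field

@dataclass
class VobInfo:
    vob_v_s_ptm: int        # video presentation start time of the VOB
    vob_v_e_ptm: int        # video presentation end time of the VOB
    time_map: list = field(default_factory=list)  # simplified per-VOBU entries
    first_scr: int = 0      # seamless linking information (simplified)
    last_scr: int = 0

def split_vob_info(info, cells):
    """Produce one VobInfo per cell used for the SPLIT command.

    Each part takes its presentation start/end times from the cell's
    C_V_S_PTM / C_V_E_PTM; the time map is partitioned accordingly.
    FIRST_SCR and LAST_SCR would also be recomputed for each part.
    """
    parts = []
    for c_v_s_ptm, c_v_e_ptm in cells:
        part_entries = [e for e in info.time_map
                        if c_v_s_ptm <= e["ptm"] < c_v_e_ptm]
        parts.append(VobInfo(c_v_s_ptm, c_v_e_ptm, part_entries))
    return parts

Splitting one VOB with mx cells in this way yields mx sets of VOB information, matching the "mx time map tables and mx sets of seamless linking information" described above.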
The following is a description of how the information is updated when a MERGE command has been executed. The execution of a MERGE command results in one AV file being produced from a plurality of AV files, so that the VOBs that are included in this plurality of AV files will be composed of sets of frame data that are not interrelated, meaning that the time stamps across these AV files will not be continuous. Since these are managed as a VOB that differs from the plurality of VOBs that were originally included in different AV files, separate VOB_IDs will be assigned to these VOBs.
The other necessary processing is as described in the second embodiment. However, the C_V_E_PTM in the cell information that specifies a split area needs to be increased by the number of frames included in the part of the former VOBU that has been encoded. Similarly, the C_V_S_PTM in the cell information that specifies a split area in a latter AV file needs to be decreased by the number of frames included in the part of the latter VOBU that has been encoded.
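The adjustment of the cell times after a MERGE amounts to widening the cell boundaries by the number of re-encoded frames. The sketch below shows this arithmetic; the 90 kHz PTM unit and the NTSC frame duration are only example assumptions, as are the helper name and the dictionary layout of a cell.

FRAME_90KHZ = 3003  # one 29.97 fps frame in 90 kHz clock ticks (example value)

def adjust_cells_after_merge(former_cell, latter_cell,
                             former_reencoded_frames, latter_reencoded_frames):
    """Widen the former cell's end and the latter cell's start to cover
    the frames that were re-encoded during the merge."""
    former_cell["c_v_e_ptm"] += former_reencoded_frames * FRAME_90KHZ
    latter_cell["c_v_s_ptm"] -= latter_reencoded_frames * FRAME_90KHZ
    return former_cell, latter_cell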
(3-2-3) The Defragmentation Unit 16
The defragmentation unit 16 is connected to a fixed magnetic disc apparatus. This defragmentation unit 16 reads an extent, out of the extents recorded on the DVD-RAM that have been subjected to link processing or other processing, that has an empty area on either side of its recording area, and writes this extent onto the fixed magnetic disc apparatus to generate backup data in the fixed magnetic disc apparatus. After writing all of such an extent onto the fixed magnetic disc apparatus, the defragmentation unit 16 reads the generated backup data and writes the backup data for the backed-up extent into the empty area adjacent to the extent.
Here, extents which have an empty area adjacent to their recording area are extents that have been generated by the AV file system unit 11 executing a "SPLIT" command or a "SHORTEN" command. These empty areas equate to areas that have been cleared and not since used as the recording area of the in-memory data or as the moved-to area for an extent when a MERGE command has been performed.
Figs. 69A-69D show an example that illustrates the operation of the defragmentation unit 16. In Fig. 69A, extent #x is shown as an extent with empty areas i, j on both sides of its recording area. As shown in Fig. 69A, the defragmentation unit 16 detects this extent, reads it from the DVD recorder 70, and writes it onto the fixed magnetic disk apparatus.
As a result of this write operation, backup data is generated in the fixed magnetic disk apparatus, as shown in Fig. 69B. After this, the defragmentation unit 16 reads the backup data from the fixed magnetic disk apparatus, as shown in Fig. 69C, and writes the extent onto the DVD-RAM so as to use both the current recording area of the extent #x and the empty area j following this recording area. This creates a continuous empty area of length i+j before the extent #x, as shown in Fig. 69D. By next performing this processing for the extent #y, the continuous length of the empty area can be further increased.
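The backup-then-move sequence of Figs. 69A-69D can be summarised by the following sketch. All of the I/O calls here (dvd.read, fixed_disk.write_backup, and so on) are hypothetical placeholders standing in for the disc access hardware, not an actual drive API.

def defragment_extent(dvd, fixed_disk, extent):
    """Move one extent so that the empty areas around it become contiguous."""
    data = dvd.read(extent.start, extent.length)       # Fig. 69A: read extent #x
    fixed_disk.write_backup(extent.id, data)            # Fig. 69B: back it up first
    backup = fixed_disk.read_backup(extent.id)          # Fig. 69C: read the backup
    # Fig. 69D: rewrite the extent using its current area plus the empty area j
    # that follows it, leaving one continuous empty area of length i + j before it.
    dvd.write(extent.start + extent.following_gap_length, backup)
    # Because the backup exists on the fixed disk, this final write can be
    # re-executed if a power failure interrupts it.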


The recording performed by the defragmentation unit 16 is achieved by first storing an extent on the fixed magnetic disk apparatus, so that even if a power failure occurs for the DVD recorder 70 during the writing of the extent back onto the DVD-RAM, this writing processing can still be re-executed. By generating backup data before moving the extents to free large empty areas on the DVD-RAM, there is no risk of losing the data in an extent when there is a power failure for the DVD recorder 70.
With the present embodiment described above, the editing of a plurality of AV files can be freely performed by the user. Even if a plurality of fragmental AV files with short continuous lengths are generated, the DVD recorder 70 will be able to link these short AV files to generate AV files with continuous lengths that are at least equal to AV block length. As a result, problems caused by the fragmentation of AV files can be managed, and uninterrupted reproduction can be performed for the AV data that is recorded in these AV files.
During the link processing, it is judged whether the total size of the data to be written is at least equal to two AV block lengths, and if so, the moved amount of prerecorded AV data is restricted. As a result, it can be guaranteed that the total size of the data to be written is below two AV block lengths, so that the linking can be completed in a short amount of time.

Even when it is necessary, as a result of a user editing operation for a plurality of files, to record re-encoded data with a short continuous length, the DVD recorder 70 will record this re-encoded data at a recording position that allows the re-encoded data to be linked with the AV data that precedes or follows it during reproduction. This means that the fragmented recording of re-encoded data is prevented from the outset, so that uninterrupted reproduction will be possible for the AV data that is recorded in such an AV file.
It should be noted here that the movement of data may also be performed so as to avoid excessive separation on the disc of two sets of AV data that have been linked together. In such a case, the data produced by linking the sets of data that are physically separated on the disc is arranged in a manner that ensures uninterrupted reproduction of the two sets of AV data will be possible. However, when special reproduction such as fast forward is performed, excessive separation of the data on the disc will result in jerky reproduction of the data.
To ensure smooth reproduction in such a case, when two sets of AV data are linked, if one of the sets of data has a consecutive length that is several times a predetermined amount and an empty block of suitable size is positioned between the two sets of data, the data may be moved to this empty block. By doing so, smooth reproduction can be ensured for both normal reproduction and special reproduction.
It should be noted here that the time information may be taken from the mark points in the cell information and managed, together with information such as addresses taken from the time map table, in the form of a table.
By doing so, this information can be presented to the user as potential selections in a screen showing the initial pre-editing state.
Reduced images (known as "thumbnails") may also be generated for each mark point and stored as separate files, with pointer information also being produced for each thumbnail. When displaying the cell information at the pre-editing stage, these thumbnails may be displayed to show the potential selections that can be made by the user.
Also, while the present embodiment describes a case where video data and audio data are handled, this is not an essential limitation for the technique of the present invention. For a DVD-ROM, sub-picture data for subtitles that has been run-length encoded and still images may also be handled.
The processing of the AV file system unit 11 (Figs. 48A, 48B, 49-50, 55, 60, 65, 67) that was described in this third embodiment using flowcharts can be achieved by a machine language program. Such a machine language program may be distributed and sold having been recorded on a recording medium. Examples of such a recording medium are an IC card, an optical disc, or a floppy disc. The machine language program recorded on the recording medium may then be installed into a standard personal computer. By executing the installed machine language programs, the standard personal computer can achieve the functions of the video data editing apparatus of this third embodiment.

Fourth Embodiment
The fourth embodiment of the present invention performs a two-stage editing process composed of virtual edits and real edits using two types of program chain, namely user-defined PGCs and original PGCs. To define the user-defined PGCs and the original PGCs, a new table is added to the RTRW management file of the first embodiment.

(4-1) RTRW Management File
The following is a description of the construction of the RTRW management file in this fourth embodiment.
In the fourth embodiment, the RTRW management file is recorded in the same directory as AV files (the RTRW directory), and has the content shown in Fig. 70A.
Fig. 70A shows a detailed expansion of the stored content of the RTRW management file in the fourth embodiment. This is to say, the logical format located on the right side of Fig. 70A shows the logical format located on the left side in more detail, with the broken guidelines in Fig. 70A showing the correspondence between the left and right sides.
From the logical format of VOBs shown in Fig. 70A, the RTRW management file can be seen to include an original PGC information table, a user-defined PGC information table, and a title search pointer, in addition to the VOB information of the first embodiment.

(4-1-2) Content of the Original PGC Information
The original PGC information table is composed of a plurality of sets of original PGC information. Each set of original PGC information is information that indicates either the VOBs that are stored in an AV file present in the RTRW directory or sections within these VOBs, in accordance with the order in which these are arranged in the AV file. Each set of original PGC information corresponds to one of the VOBs recorded in an AV file present in the RTRW directory, so that when an AV file is recorded in the RTRW directory, sets of original PGC information are generated by the video data editing apparatus and recorded in the RTRW management file.
Fig. 70B shows the data format of a set of original PGC information. Each set of original PGC information is composed of a plurality of sets of cell information, with each set of cell information being composed of a cell ID (CELL #1, #2, #3, #4... in Fig. 70B) that is a unique identifier assigned to the set of cell information, an AV file ID (AVF_ID in Fig. 70B), a VOB_ID, a C_V_S_PTM, and a C_V_E_PTM.
The AV file ID is a column for writing the identifier of the AV file that corresponds to the set of cell information.
The VOB_ID is a column for writing the identifier of a VOB that is included in the AV file. When a plurality of VOBs are included in the AV file that corresponds to the set of cell information, this VOB_ID indicates which of the plurality of VOBs corresponds to the present set of cell information.
The cell start time C_V_S_PTM (abbreviated to C_V_S_PTM in the drawings) shows the start time of the cell indicated by the present cell information, and so has a column for writing the PTS that is assigned to the start time of the first video field in the section, using PTM descriptor format.
The cell end time C_V_E_PTM (abbreviated to C_V_E_PTM in the drawings) shows the end time of the cell indicated by the present cell information, and so has a column for writing the end time of the final video field in the section, using PTM descriptor format.
The time information given as the cell start time C_V_S_PTM and cell end time C_V_E_PTM shows the start time for an encoding operation by the video encoder and the end time for the encoding operation, with these corresponding to the mark points inserted by the user. The cell end time C_V_E_PTM in each set of cell information in a set of original PGC information matches the cell start time C_V_S_PTM of the next set of cell information in the given order. Since this relationship is established between the sets of cell information, an original PGC indicates all of the sections in a VOB without omitting any of the sections. As a result, an original PGC is unable to indicate sections of a VOB in an order where the sections are interchanged.
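The data format of Fig. 70B can be represented compactly as follows. The field names follow the text (cell ID, AVF_ID, VOB_ID, C_V_S_PTM, C_V_E_PTM); the Python class names and types are assumptions introduced only for this sketch.

from dataclasses import dataclass
from typing import List

@dataclass
class CellInformation:
    cell_id: int     # CELL #1, #2, ... (unique identifier of the cell)
    avf_id: str      # AV file ID (AVF_ID in Fig. 70B)
    vob_id: int      # identifier of the VOB inside that AV file
    c_v_s_ptm: int   # cell start time, written in PTM descriptor format
    c_v_e_ptm: int   # cell end time, written in PTM descriptor format

@dataclass
class OriginalPgcInformation:
    pgc_number: int
    cells: List[CellInformation]   # arranged in the order recorded in the AV file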

(4-1-3) Content of the User-Defined PGC Information
The user-defined PGC information table is composed of a plurality of sets of user-defined PGC information. The data format of sets of user-defined PGC information is shown in Fig. 70C. Like the sets of original PGC information, the sets of user-defined PGC information are composed of a plurality of sets of cell information, each of which is composed of an AV file ID, a VOB_ID, a C_V_S_PTM, and a C_V_E_PTM.
A set of user-defined PGC information is composed of a plurality of sets of cell information in the same way as a set of original PGC information, although the nature and arrangement of these sets of cell information differ from those in a set of original PGC information. While a set of original PGC information indicates that the sections in a video object are to be sequentially reproduced in the order in which the sets of cell information are arranged, a set of user-defined PGC information is not restricted to indicating that the sections in a video object are to be reproduced in the order in which they are arranged.
The sections indicated by the sets of cell information in a user-defined PGC can be the same as the sections indicated by the sets of original PGC information, or a part (partial section) of one of the sections indicated by a set of original PGC information. Note that it is possible for the section indicated by one set of cell information to overlap a section indicated by another set of cell information.


There also may be gaps between a section that is indicated by one set of cell information and a section that is indicated by another set of cell information. This means that sets of user-defined PGC information do not need to indicate every section in a VOB, so that one or more parts of a VOB may not be indicated. While original PGCs have strict limitations concerning their reproduction orders, user-defined PGCs are not subject to such limitations, so that the reproduction order of cells may be freely defined. As a specific example, the reproduction order of the cells in a user-defined PGC may be the inverse of the order in which the cells are arranged. Also, a user-defined PGC may indicate sections of VOBs that are recorded in different AV files.
Original PGCs indicate the partial sections in one AV file or one VOB in accordance with the order in which the AV file or VOBs are arranged, so that original PGCs may be said to respect the arrangement of the indicated data. User-defined PGCs, however, have no such restriction, and so are able to indicate the sections in the user's desired order. As a result, these user-defined PGCs are ideal for storing reproduction orders that are provisionally determined by the user for the linking of a plurality of sections in VOBs during a video data editing operation.
Original PGCs are associated to AV files and the VOBs in AV files, and the cells in an original PGC only indicate sections in these VOBs. User-defined PGCs, meanwhile, are not limited to being associated to particular VOBs, so that the sets of cell information included in user-defined PGC information may indicate sections in different VOBs. As another difference, an original PGC is generated when recording an AV file, while a user-defined PGC may be generated at any point following the recording of an AV file.
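Using the CellInformation sketch given above, the contrast between the two kinds of PGC information can be expressed as a simple check. The helper below is hypothetical and only mirrors the rules stated in the text: original PGC information must cover one VOB contiguously and in recorded order, while user-defined PGC information is free to skip, reorder, overlap, or point into other AV files and VOBs.

def satisfies_original_pgc_rules(cells):
    """True if this cell list could legally serve as original PGC information."""
    if not cells:
        return False
    same_source = all(c.avf_id == cells[0].avf_id and c.vob_id == cells[0].vob_id
                      for c in cells)
    contiguous_in_order = all(a.c_v_e_ptm == b.c_v_s_ptm
                              for a, b in zip(cells, cells[1:]))
    return same_source and contiguous_in_order

A user-defined PGC may fail either of these tests; an original PGC may not.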

(4-1-4) Unity of the PGC Information - Video Attribute Information - AV File
The following is an explanation of the inter-relatedness of the AV files, VOBs, and sets of PGC information. Fig. 71 shows the inter-relatedness of the AV files, VOBs, time map table, and sets of PGC information, with the elements that form a unified body being enclosed within the frames drawn using thick black lines. Note that in Fig. 71, the term "PGC information" has been abbreviated to "PGCI".
In Fig. 71, the AV file #1, the VOB information #1, and the original PGC information #1 composed of the sets of cell information #1 to #3 have been arranged within the same frame, while the AV file #2, the VOB information #2, and the original PGC information #2 composed of the sets of cell information #1 to #3 have been arranged within a different frame.
These combinations of an AV file (or VOB), VOB information, and original PGC information that are present in the same frame in Fig. 71 are called an "original PGC" under the DVD-RAM standard. A video data editing apparatus that complies with the DVD-RAM standard treats the units called original PGCs as a management unit called a video title.
For the example in Fig. 71, the combination of the AV file #1, the VOB information #1, and original PGC information #1 is called the original PGC #1, while the combination of the AV file #2, the VOB information #2, and original PGC information #2 is called the original PGC #2.
When recording an original PGC, in addition to recording the encoded VOBs on the DVD-RAM, it is necessary to generate VOB information and original PGC information for these VOBs. The recording of an original PGC is therefore regarded as complete when all three of the AV file, VOB information table, and original PGC information have been recorded onto the DVD-RAM. Putting this another way, the recording of encoded VOBs on a DVD-RAM as an AV file is not itself regarded as completing the recording of an original PGC on the DVD-RAM.
This is also the case for deletion, so that original PGCs are deleted as a whole. Putting this another way, when any of an AV file, VOB information, and original PGC information is deleted, the other elements in the same original PGC are also deleted.
The reproduction of an original PGC is performed by the user indicating the original PGC information. This means that the user does not give direct indications for the reproduction of a certain AV file or VOBs.
It should be noted here that an original PGC may also be reproduced in part. Such partial reproduction of an original PGC is performed by the user indicating sets of cell information that are included in the original PGC, although reproduction of a section that is smaller than a cell, such as a VOBU, cannot be indicated.
The following describes the reproduction of a user-defined PGC. In Fig. 71, it can be seen that the user-defined PGC information #3, composed of the cells #1 to #4, is included in a separate frame to the original PGCs #1 and #2 described earlier. This shows that for the DVD-RAM standard, the user-defined PGC information is not in fact AV data, and is instead managed as a separate title. As a result, a video data editing apparatus defines the user-defined PGC information in the RTRW management file, and by doing so is able to complete the generation of a user-defined PGC. For user-defined PGCs, there is a relationship whereby the production of a user-defined PGC equates to the definition of a set of user-defined PGC information.
When deleting a user-defined PGC, it is sufficient to delete the user-defined PGC information from the RTRW management file, with the user-defined PGC being regarded as not existing thereafter.
The units for reproduction of a user-defined PGC are the same as for an original PGC. This means that the reproduction of a user-defined PGC is performed by the user indicating the user-defined PGC information. It is also possible for user-defined PGCs to be partially reproduced. Such partial reproduction of a user-defined PGC is achieved by the user indicating cells that are included in the user-defined PGC.
The differences between original PGCs and user-defined PGCs are as described above, but, from the viewpoint of the user, there is no need to be aware of such differences. This is because the entire reproduction or partial reproduction of both types of PGCs is performed in the same way by respectively indicating the PGC information or cell information. As a result, both kinds of PGCs are managed in the same way using a unit called a "video title".
The following is an explanation of the reproduction of original PGCs and user-defined PGCs. The arrows drawn with broken lines in Fig. 71 show how certain sets of data refer to other data. Arrows y2, y4, y6, and y8 show the relationship between each VOBU in a VOB and the time codes included in the time map table in the VOB information, while y1, y3, y5, and y7 show the relationship between the time codes included in the time map table in the VOB information and the sets of cell information.
Here, it is assumed that the user has indicated one of the PGCs, so that a video title is to be reproduced. When the indicated PGC is the original PGC #1, the set of cell information #1 located at the front of the original PGC information #1 is extracted by the reproduction apparatus. Next, the reproduction apparatus refers to the AV file and VOB identifiers included in the extracted set of cell information #1, and specifies the AV file #1, the VOB #1, and the time map table #1 for this VOB as the AV file and VOB corresponding to this cell information.
The specified time map table #1 includes the size of each VOBU that composes the VOB and the reproduction period of each VOBU. To improve the data accessing ability, the specified time map table #1 also includes the address and elapsed time relative to the start of the VOB for representative VOBUs that are selected at a constant interval, such as a multiple of ten seconds. As a result, by referring to the time map table using the cell start time C_V_S_PTM, as shown by the arrow y1, the reproduction apparatus can specify the VOBU in the AV file that corresponds to the cell start time C_V_S_PTM included in the set of cell information #1, and so can specify the first address of this VOBU. By doing so, the reproduction apparatus can determine the first address of the VOBU that corresponds to this cell start time C_V_S_PTM, can access VOBU #1 as shown by the arrow y2, and so can start reading the VOBU sequence that starts from VOBU #1.
Since the set of cell information #1 also includes the cell end time C_V_E_PTM, the reproduction apparatus can access the time map table using this cell end time C_V_E_PTM, as shown by the arrow y3, to specify the VOBU in the AV file that corresponds to the cell end time C_V_E_PTM included in the set of cell information #1. As a result, the reproduction apparatus can determine the first address of the VOBU that corresponds to the cell end time C_V_E_PTM. When the VOBU that corresponds to the cell end time C_V_E_PTM is VOBU #10, for example, the reproduction apparatus will stop reading the VOBU sequence on reaching VOBU #10, as shown by arrow y4.
By accessing the AV file via the cell information #1 and the VOB information #1, the reproduction apparatus can read only the section indicated by the cell information #1, out of the data in VOB #1 that is included in AV file #1. If reads are also performed for the cell information #2, #3, and #4, all VOBUs that are included in VOB #1 can be read and reproduced.
When reproduction is performed for an original PGC as described above, the sections in the VOB can be reproduced in the order in which they are arranged in the VOB.
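The address lookup via the time map table (arrows y1 to y4) can be illustrated with the following simplified sketch. The flat per-VOBU layout used here, with the coarse "representative" entries omitted, is an assumption that only mirrors the description above; the function names are likewise illustrative.

def find_vobu_for_ptm(time_map, ptm):
    """Return (index, first_address) of the VOBU whose presentation period
    contains the given PTM value, by accumulating VOBU sizes and periods."""
    address = time_map["start_address"]
    elapsed = time_map["start_ptm"]
    for i, (size, period) in enumerate(time_map["vobus"]):
        if elapsed <= ptm < elapsed + period:
            return i, address
        address += size
        elapsed += period
    raise ValueError("PTM lies outside this VOB")

def cell_read_range(time_map, c_v_s_ptm, c_v_e_ptm):
    """First addresses of VOBU(START) and VOBU(END) for one cell."""
    _, start_addr = find_vobu_for_ptm(time_map, c_v_s_ptm)
    _, end_addr = find_vobu_for_ptm(time_map, c_v_e_ptm)
    return start_addr, end_addr

In a real apparatus the representative entries recorded at a constant interval would be consulted first, so that the linear walk only covers a short stretch of the table.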
The following explanation is for when the user indicates the reproduction of a video title indicated by one of the user-defined PGCs.
When the indicated PGC is the user-defined PGC #1, the reproduction apparatus extracts the set of cell information #1 that is positioned at the front of the user-defined PGC information #1 for this user-defined PGC #1. Next, the reproduction apparatus refers to the time map table #1 using the cell start time C_V_S_PTM included in this cell information #1, as shown by the arrow y5, and specifies the VOBU in VOB #1 that corresponds to this cell start time C_V_S_PTM included in the cell information #1. In this case, the reproduction apparatus specifies VOBU #11 as the VOBU that corresponds to the cell start time C_V_S_PTM, accesses VOBU #11 as shown by the arrow y6, and starts reading a VOBU sequence that starts from VOBU #11.
The cell information #1 included in the user-defined PGC #1 also includes the cell end time C_V_E_PTM, so that the reproduction apparatus refers to the time map table using this cell end time C_V_E_PTM, as shown by the arrow y7, and specifies the VOBU in VOB #1 that corresponds to the cell end time C_V_E_PTM that is included in the cell information #1. When the VOBU that corresponds to the cell end time C_V_E_PTM is VOBU #21, for example, the reproduction apparatus will stop reading the VOBU sequence on reaching VOBU #21, as shown by arrow y8.
As described above, after accessing the AV file via the cell information #1 and VOB information #1, the reproduction apparatus performs the same processing for the cell information #2, #3, and #4 included in the user-defined PGC information #1. After extracting the cell information #2, which is located at a position following the cell information #1, the reproduction apparatus refers to the AV file identifier included in the extracted cell information #2 and so determines that AV file #2 corresponds to this cell information and that time map table #2 corresponds to this AV file.
The specified time map table #2 includes the size of each VOBU that composes the VOB and the reproduction period of each VOBU. To improve the data accessing ability, the specified time map table #2 also includes the address and elapsed time relative to the start of the VOB for representative VOBUs that are selected at a constant interval, such as a multiple of ten seconds.
As a result, by referring to the time map table using the cell start time C_V_S_PTM, as shown by the arrow y9, the reproduction apparatus can specify the VOBU in the AV file that corresponds to the cell start time C_V_S_PTM included in the set of cell information #2, and so can specify the first address of this VOBU. By doing so, the reproduction apparatus can determine the first address of the VOBU that corresponds to this cell start time C_V_S_PTM, can access VOBU #2 as shown by the arrow y10, and so can start reading the VOBU sequence that starts from VOBU #2.
Since the set of cell information #2 also includes the cell end time C_V_E_PTM, the reproduction apparatus can access the time map table using this cell end time C_V_E_PTM, as shown by the arrow y11, to specify the VOBU in the AV file that corresponds to the cell end time C_V_E_PTM included in the set of cell information #2. As a result, the reproduction apparatus can determine the first address of the VOBU that corresponds to the cell end time C_V_E_PTM. When the VOBU that corresponds to the cell end time C_V_E_PTM is VOBU #11, for example, the reproduction apparatus will stop reading the VOBU sequence on reaching VOBU #11, as shown by arrow y12. By reproducing the user-defined PGC information in this way, the desired sections in VOBs included in two AV files may be reproduced in the given order.
This completes the explanation of the unity of AV file, VOB information, and PGC information. The following is a description of the title search pointer shown in Fig. 70.

(4-1-5) Content of the Title Search Pointer
The title search pointer is information for managing the VOB information, time map table, PGC information, and AV files recorded on a DVD-RAM in the units called video titles that were described above. Each title search pointer is composed of the PGC number that is assigned to a set of original PGC information or a set of user-defined PGC information, a title type, and a title recording history.
Each title type corresponds to one of the PGC numbers, and is set at the value "00" to show that the AV title with the corresponding PGC number is an original type PGC, or is set at the value "01" to show that the AV title with the corresponding PGC number is a user-defined PGC. The title recording history shows the date and time at which the corresponding PGC information was recorded onto the DVD-RAM.
When the RTRW directory on a DVD-RAM is indicated, a reproduction apparatus that complies with the DVD-RAM standard reads the title search pointers from the RTRW management file and so can instantly know how many original PGCs and user-defined PGCs are given in each directory on the DVD-RAM and when each of these video titles was recorded in the RTRW management file.
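One title search pointer entry and the counting it enables can be sketched as follows. The "00"/"01" title-type codes follow the text; the class layout and helper function are assumptions introduced for this illustration.

from dataclasses import dataclass

ORIGINAL_PGC = "00"
USER_DEFINED_PGC = "01"

@dataclass
class TitleSearchPointer:
    pgc_number: int               # PGC number of the PGC information it refers to
    title_type: str               # "00" = original PGC, "01" = user-defined PGC
    title_recording_history: str  # date and time the PGC information was recorded

def count_titles(pointers):
    """What a reproduction apparatus can know instantly from the pointers."""
    originals = sum(p.title_type == ORIGINAL_PGC for p in pointers)
    user_defined = sum(p.title_type == USER_DEFINED_PGC for p in pointers)
    return originals, user_defined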

(4-1-6) Interchangeability of User-Defined PGCs and Original PGCs in a Real Edit
The user-defined PGC information defined in a virtual edit can be used to indicate the linking order for cells in a real edit, as shown in this fourth embodiment.
Also, once a real edit has been performed as described in the fourth embodiment, if a set of user-defined PGC information is converted into a set of original PGC information, original PGC information can be easily generated for the VOBs obtained by this linking.
This is because the data construction of the user-defined PGC information and the original PGC information only differs in the value given as the title type, and because the sections of a VOB obtained by a real edit are the sections that were indicated by the user-defined PGC information before the real edit.
The following is an explanation of the procedure for a real edit in this fourth embodiment, and of the process for updating user-defined PGC information to original PGC information. Fig. 72 shows an example of a user-defined PGC and an original PGC.
In Fig. 72, original PGC information #1 includes only cell #1, and forms part of an original PGC with VOB #1 and the VOB information. On the other hand, user-defined PGC information #2 forms a user-defined PGC using only cell #1, cell #2, and cell #3.
In Fig. 72, cell #1 indicates the section from VOBU #1 to VOBU #i, as shown by the broken arrows y51 and y52, while cell #2 indicates the section from VOBU #i+1 to VOBU #j, as shown by the broken arrows y53 and y54, and cell #3 indicates the section from VOBU #j+1 to VOBU #k+2, as shown by the broken arrows y55 and y56. In the following example, cell #2 is deleted from the user-defined PGC information, and the user indicates a real edit using the user-defined PGC information #2 composed of the cells #1 and #3. In Fig. 73, the area that corresponds to the deleted cell is shown using shading.
Cell #2, which is deleted here, indicates one of the video frames, out of the plurality of sets of picture data included in VOBU #i+1 shown within the frame w11, using the cell start time C_V_S_PTM. Cell #2 also indicates one of the video frames, out of the plurality of sets of picture data included in VOBU #j+1 shown within the frame w12, using the cell end time C_V_E_PTM.
If a real edit is performed using the user-defined PGC information #2, VOBUs #i-1, #i, and #i+1 located at the end of cell #1 and VOBUs #j, #j+1, and #j+2 located at the start of cell #2 will be subjected to re-encoding. This re-encoding is performed according to the procedure described in the first and second embodiments, and the linking of the extents is then performed according to the procedure described in the third embodiment.
Fig. 74A shows the ECC blocks on the DVD-RAM that are freed by a real edit performed using user-defined PGC information #2. As shown on the second level of Fig. 74A, VOBUs #i, #i+1, and #i+2 are recorded in the AV block #m, and VOBUs #j, #j+1, and #j+2 are recorded in the AV block #n.
As shown in Fig. 73, cell #2 indicates the picture data included in VOBU #i+1 as the C_V_S_PTM, and the picture data included in VOBU #j+1 as the C_V_E_PTM. As a result, a SPLIT command and a SHORTEN command of the second embodiment are issued to free the area from the ECC block occupied by VOBU #i+2 to the ECC block occupied by VOBU #j, as shown by the frames w13 and w14 in Fig. 74A. However, the ECC blocks occupied by VOBUs #i and #i+1 and the ECC blocks occupied by VOBUs #j+1 and #j+2 are not freed.
Fig. 74B shows an example of a VOB, VOB information, and PGC information after a real edit. Since the area corresponding to cell #2 has been deleted, VOB #1 is divided into (new) VOB #1 and VOB #2.

When the SPLIT command is issued, the VOB information for VOB #1 is divided into VOB information #1 and VOB information #2. The time map table included in this VOB information is also divided into the time map table #1 and the time map table #2. Although not illustrated, the seamless linking information is also divided. The VOBUs in VOB #1 and VOB #2 are referred to by a reproduction apparatus via these divided time map tables.
The user-defined PGC information and original PGC information have the same data construction, with only the value of the title types differing. The sections of VOBs obtained after a real edit were originally indicated by the user-defined PGC information #2 before the real edit, so that the user-defined PGC information #2 is converted into original PGC information. Since this user-defined PGC information #2 is used to define the original PGC information, there is no need for a separate process to generate new original PGC data after a real edit.

(4-2) Functional Blocks of the DVD Recorder 70
Fig. 75 is a functional block diagram showing the construction of the DVD recorder 70 in this fourth embodiment. Each function shown in Fig. 75 is realized by the CPU 1a executing the programs in the ROM 1e and controlling the hardware shown in Fig. 17.
The DVD player shown in Fig. 75 is composed of a disc recording unit 100, a disc reading unit 101, a common file system unit 10, an AV file system unit 11, and a recording-editing-reproduction control unit 12, in the same way as the video data editing apparatus described in the third embodiment. The present embodiment differs from the third embodiment, however, in that the AV data recording unit 13 is replaced with the title recording control unit 22, the AV data reproduction unit 14 is replaced with the title reproduction control unit 23, and the AV data editing unit 15 is replaced with the editing multi-stage control unit 26. This DVD player also includes a PGC information table work area 21, an RTRW management file work area 24, and a user-defined PGC information generator 25, in place of the defragmentation unit 16.
(4-2-1) Recording-Editing-Reproduction Control Unit 12
The recording-editing-reproduction control unit 12 in this fourth embodiment receives a user indication of a directory in the directory structure on the DVD-RAM as the operation target. On receiving the user indication of the operation target, the recording-editing-reproduction control unit 12 specifies the operation content according to the user operation that has been reported by the remote control signal reception unit 8. At the same time, the recording-editing-reproduction control unit 12 gives instructions so that processing corresponding to the operation content is performed for the directory that is the operation target by the title recording control unit 22, the title reproduction control unit 23, or any of the other components.
Fig. 77A shows an example of graphics data that is displayed on the TV monitor 72 under the control of the recording-editing-reproduction control unit 12. When any of the directories has been set into the focus state, the recording-editing-reproduction control unit 12 waits for the user to press the enter key. When the user does so, the recording-editing-reproduction control unit 12 specifies the directory that is presently in the focus state as the current directory.

(4-2-2) PGC Information Table Work Area 21
The PGC information table work area 21 is a memory area that has a standardized logical format so that sets of PGC information can be successively defined. This PGC information table work area 21 has internal regions that are managed as a matrix. The plurality of sets of PGC information that are present in the PGC information table work area 21 are arranged in different columns, while a plurality of sets of cell information are arranged in different rows. In the PGC information table work area 21, any set of cell information in a stored set of PGC information can be accessed using a combination of a row number and a column number.
Fig. 76 shows examples of sets of original PGC information that are stored in the PGC information table work area 21. It should be noted here that when the recording of an AV file is completed, the user-defined PGC information table will be empty (shown as "NULL" in Fig. 76). In Fig. 76, the original PGC information #1 includes the set of cell information #1 showing the section between the start time t0 and the end time t1, the set of cell information #2 showing the section between the start time t1 and the end time t2, the set of cell information #3 showing the section between the start time t2 and the end time t3, and the set of cell information #4 showing the section between the start time t3 and the end time t4.
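The matrix-like addressing of the work area can be modelled by the toy sketch below. The dictionary-of-coordinates layout and the class name are purely illustrative assumptions; they only show how any stored set of cell information is reachable through a (row, column) pair.

class PgcInfoTableWorkArea:
    def __init__(self):
        self._cells = {}                    # (row, column) -> cell information

    def store(self, row, column, cell_info):
        self._cells[(row, column)] = cell_info

    def fetch(self, row, column):
        """Access any stored set of cell information by row and column number."""
        return self._cells[(row, column)]

work_area = PgcInfoTableWorkArea()
work_area.store(1, 1, {"c_v_s_ptm": "t0", "c_v_e_ptm": "t1"})   # cf. Fig. 76
print(work_area.fetch(1, 1))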

(4-2-3) Title Recording Control Unit 22
The title recording control unit 22 records VOBs onto the DVD-RAM in the same way as the AV data recording unit 13 in the third embodiment, although in doing so the title recording control unit 22 also stores a time map table in the RTRW management file work area 24, generates VOB information, and generates original PGC information which it stores in the PGC information table work area 21.
When generating original PGC information, the title recording control unit 22 follows the procedure described below. First, on receiving notification of a pressing of the record key from the recording-editing-reproduction control unit 12, the title recording control unit 22 secures a row area in the PGC information table work area 21. Next, after the AV data recording unit 13 has assigned an AV file identifier and a VOB identifier to the VOB to be newly recorded, the title recording control unit 22 obtains these identifiers and stores them in the secured row area corresponding to a newly assigned PGC number.
Next, when encoding is started for the VOB, the title recording control unit 22 instructs the MPEG encoder 2 to output the PTS of the first video frame. When the encoder control unit 2g has outputted this PTS for the first video frame, the title recording control unit 22 stores this value and waits for the user to perform a marking operation.
Fig. 80A shows how data input and output are performed between the components shown in Fig. 75 when a marking operation is performed. While viewing the video images displayed on the TV monitor 72, the user presses the mark key on the remote controller 71. This marking operation is reported to the title recording control unit 22 via the route shown in Fig. 80A. The title recording control unit 22 then obtains the PTS for the point where the user pressed the mark key from the encoder control unit 2g, as shown in Fig. 80A, and sets this as time information.
The title recording control unit 22 repeatedly performs the above processing while a VOB is being encoded. If the user presses the stop key during the generation of the VOB, the title recording control unit 22 instructs the encoder control unit 2g to output the presentation end time for the last video frame to be encoded. Once the encoder control unit 2g has outputted this presentation end time for the last video frame to be encoded, the title recording control unit 22 stores this as time information.
By repeating the above processing until the encoding of a VOB is complete, the title recording control unit 22 ends up storing the AV file identifier, the VOB identifier, the presentation start time of the first video frame, the presentation start time of each video frame corresponding to a point where a marking operation was performed, and the presentation end time of the final video frame.
On completing the above generation, the ti~le recording control unit 22 as~ociates this original PGC
information to the a~signed PGC number, and, in ~he PGC
information table work area 21, generates a title search pointer that has type information showing that this PGC
information is original PGC information, and a title recording hi~tory showing the date and time at whic~ the recording of this PGC information wa~ completed.
It should be noted here that if the tltle reproduction control unit 23 can detect when there is a large change in the content of ~cene~, the u~er-defined PGC information generator 25 may automatically obtain '98~09~17~(~)1124 ~RICHES ~ pfi Pl6/44 the PTS for the points at which such ~cene changes occur and automatically set the9e PTS in sets o~ cell information.
The generation of a time map table or V0~
information does not form part of the gist o~ this embodimenr, and so will not be explained.

~
~4-?-4) Title R~ro~llct;on Control Unit ~3 The title reproduction control unit 23 performs ~eproduction or partial reproduction for any of the ~itles recorded in the current directory that is indicated by the recording-editing-~eproduction control unit 12.
Thi~ is described in more detail below. When, as ~hown in Fig. ~7A, one o~ the directories is selected as the current directory and the user gives an indication ~or the reproduction of one of the title stored in this directory, the title reproduction control unit 23 displays the screen image shown in Fig. 77A, reads the original PGC in~ormation ta~le and user-defined PGC
information table in the RTRW management file in this directo~y, and has the user select the complete reproduction or partial reproduction of one of the original PGCs or user-defined PGCs in the current directory. Fig. 77~ shows the PGCs and cells that are '98~0~17~(~)1125 ~RICH~S ~ ifi P17/4 displayed as the list of potential operation targets.
The sets of PGC information and cell information t~at represent these PGCs and cells are the same a~ those shown in the example of Fig. 76.
The original PGCs that appear in this interactive screen are shown in a simple graph that shows time in the horizontal axis, with the each original PGC being displayed along with the date and time at which it was recorded. In Fig. 77B, the ~enu at the bottcm right of the screen shows whether complete reproduction or partial ~eproduction i~ to be performed for the video title in the current directory. By pre~sing the "1" or "2" key on the remote controller 71, the user can select complete reproduction or partial reproduction or the video title If the user selects complete reproduction, the title reproduction control unit 23 has the user ~elect one of the PGC~ a~ the operation target, while if the user selects partial reproduction, the title reproduction control unit 23 ha~ the u~er ~elect one of the cells as the operation target.
When complete reproduction ha~ been selected for a PGC, the tltle reproduction control unit 23 extracts the cells ~rom the PGC selected as the operation target and, by re~erring a time map table such a~ that ~hown in Fig.
71, reproduces the ~ections indicated by the cells one 98~0~17E3 (1~ 26 ~G R I C H E S ~ Pl~/44 ~y one. On completing the reproduction of the ~ections, the title reproduction control unit 23 ha~ the interactive ~creen s~o~n in Fig. 77B displayed and wait~
for the next selection of cell information.
Fig. 78A is a flowchart ~howing the processing when partially reproducing sets of cell information. First, in step S271, the title reproduction control unit 23 reads the C V_S_PTM and C_V_E_PTM from the cell information to reproduced out o~ the original PGC
information or user-defined PGC information. Next, in ~tep S272, the tit~e reproduction control unit 23 ~pecifies the addre~s of the vOBU (START) that includes the picture data a~igned C V S_PTM.
In step S273, the title reproduction control unit 23 ~pecifies the address of the vOBU (END) that includes the picture data assigned C_V_E_PTM, and in step S274, the title reproduction control unit 23 reads the section from VOBU ~START) to VOBU (END) f~om the present VOB.
In ~tep S275, the title reproduction control unit 23 instructs the MPEG decoder 4 to decode the read VOBUs.
In 4tep S276, the title reproduction control unit 23 output~ the cell pre~entation start ti~e ~C v S_PTM) and cell presentation end time (C_V_E_PTM) to the decoder control unit ~k of the MPEG decoder 4 a~ valid reproduc~ion section information, together with a decode '98~09~17E(~)1126 ~RICHES ~ P19i4 processing request.
The reason the title reproduction control unit 23 outputs the valid reproduction section information to the MPEG decoder 4 is that the decoder control unit 4k in the MPEG decoder 9 will try to decode even picture data that is not within the section indicated by the cell. In more detail, the uni~ for the decode processing of the MPEG decoder g i-~ a vOBU, so that the MPEG decoder 4 will decode the entire section from VOBU(START) to VOBU(END), and in doing so will have picture data outside the section indicated by the cell reproduced A cell indicates a ~ection in units of video fields, so that a method for prohi~iting the decoding and reproduction of picture data outside the section is necessary. To prohi~it the reproduction of ; such picture data, the title reproduction control unit 23 outputs valid reproduction section information to the title reproduction control unit 23 Fig. 78B shows how only the section between the cell presentation start time (C_V S PTM) and the cell presentation end time (C V_E_PT~), out of the area ~etween the VOBU (START) and the VOBU (END), is reproduced.
By receiving this valid reproduction section information, the MPEG decoder 4 can stop the display output of an appropriate number of video fields f~om the , .

Y~IY~ Kl~ ot~ Ui~
._ start of the VOBU (START) to C V S PTM and the display output of an appropriate number of video fields from C_V_E_PTM to the VOBU (~ND). For the hardware construction shown in Fig. 17, the disc access unit 3 reads the VOBU se~uence and outputs this to the MPEG
decoder 4 via the logical connection (1). The MPEG
decoder 4 decodes this VOBU sequence and prohibits the reproduction output of the part that precedes C V S_PTM
and the part that follows C_V_~_PT~. A~ a re~ult, only the section indicated by the cell info~mation is reproduced.
Since one et of original PGC infonmation or user-defined PGC information includeq a plurality of sets of cell informa~ion, the procedure shown in Fig. 78A may be repeated for each set of cell information included ln one set o~ PGC information.

I

~4-~-5) RTRW M~n~q~m~nt Fil~ Wo~k ~re~ ~4 The RTRW management file work area 24 is a ~ork area for arranging the original PGC information table composed of the plurality o~ ~et~ of original PGC
information generated in the PGC information table work area 21, the user-defined PGC information table composed of a plurality of sets of user-defined PGC information, 2s the title search pointer-~, and the set-~ of vOB

Y~ K1 ~ 'O ~ Ot~ 1J

information, in accordance with the logical format sho~n in Fig. 70. The common file system unit 10 writes the data arranged in the RTRW management file work area 24 into the RTRW directory aq non-AV files, and in doing so ~tore~ a RTRW management file in the ~TRW directory.

(4-2-6~ U~er-~efine~ PGC Tnfor~tlon G~n~r~tor ~5 ~ he user-defined PGC in~ormation generator 25 generates user-defined PGC information ~ased on one set of PGC information recorded in the RTRW management file of the current directory. Two ~ypes of cell information can be pre~ent in the u~er-defined PGC information ~called sets of user-defined cell information), with these being a first type that indicates an area inside a section indicated by cell information in an existing set of PGC information, and a second type that indicate~ the same section a~ a ~et of cell information in an existing ~et of PGC information. The user-defined PGC
information generator 25 generates the~e two types of cell information using different methods.
To generate the first type of user-defined cell information that indicate~ an area in~ide a section indicated ~y exlsting cell information, the user-defined PGC information generator 25 ha~ the title reproduction control unit 23 perform partial reproduction for the Y~:UY~ gC iG K 1 ~ ot ~

section indicated by the existing cell infor~ation.
I D~ring the partial reproduction for this section, the ! user-defined PGC information generator 25 monitors when the user performs marking operations, and generates sets of cell information with the times of the marking operations as the start point and end point. In this way, the user-defined PGC information generator 25 generate~ u~er-defined PGC information composed of this first type of cell information.
Figs. 79A and 79B show how the ~ser uses the TV
monitor 72 and remote controller 71 when generating user-defined PGC information. Fig. 80B shows the data input and outp~t between the components shown in Fig. 75 when a marking operation is performed. As shown in Fig.
79A, the u~er views the ~ideo images displayed on the TV
monitor 72 and presses the mark key on the remote controller 71 at the beginning of a desired ~cene.
i After this, the desired scene ends, a~ shown in Fig.
79~, and the video image~ change to a content in which the user has no intere~t Accordingly, the user pre~Yes the mark key again.
Thi~ marking operation i~ reported to the user-defined PGC in~ormation generator 25 via the route ~hown a~ ~, ~, ~ in Fig. ~OB. The user-defined PGC
! 25 information generator 25 then o~tain~ the PTS of the points when the user pressed the mark key from the MPEG
decoder 4, as shown by ~ in Fig. ~OB, and stores the PTS
as time information. The user-defined PGC information generator 25 then generates a set of cell information by attaching the appropriate Av file identifier and VOB
identifier to a pair of stored PTS that are the start point and end point of a section, and stores this cell information in a newly secured row area the PGC
info~ation table work area 21, as shown by ~ in Fig.
lo 80B.
When generating user-defined PGC information that indicates a section indicated by an existing set of cell in~ormation, the user-defined PGC information generator 25 merely copies the existing cell information into a different row area in the PGC information ta~le work area 21.
In more detail, the user-defined PGC information generator 25 secures a row area fo~ one row in the RTRW
management file work area 24, and a~igns a new user-defined PGC information identifier to this row area.
Once the cell information that should be u~ed inthe present u~er-defined PGC information has been indicated, out of the qets of cell information in the PGC information already ~tored in the PGC information table work area 21, using a combination of a row number '98~09~17~ 29 ~RICHES ~ fi P24/41 and a column num~er, the user-defined PGC information generator 25 read~ the cell information and copies it into a ne~ly secured row area in the PGC information table work area 21.

~4-2-7~ F.~itin~ Ml~lti-St~e Control Unit ~6 The editing multi-stage control unit 26 controls the title reproduction control unit 23, the user-defined PGC information generator 25, and the seamless linking unit 20 to perform a multi-stage editing process including:
1. virtual edits achieved by defining user-defined PGC information;
2. pre~iews which allow the user to view the video images that would be obtained by a real edit, based on the result of a virtual edit;
3. seamles~ linking, as de~cri~ed in the first and second embodiments; and 4. real edits performed by linking Av files as descri~ed in the third embodiment (4-?-7-1) Pro~-e~llre for M-llti-st~ge F.~itina k~y th~
F.~itin~ Mlllti-~t~e Contro] Unit Z6 The following i5 a description of the specific procedure ~or the multi-stage control performed by the -Y~Y~ (O~ '4 editing multi-stage control unit 26. When the user selects a virtual edit u~ing the remote controller 71 ln re~ponse to the interactive screen ~hown in Fig. 77A, the editing multi-stage control unit 26 accesses the ~TRw directory, has the common file system unit 10 read the RTRW management file from the RTRW directory, and has the RTRW management file ~tored in the RTRW
management file work area 24. Next, out of the RTRW
management file stored in the RTRW management file work area 24, the editing multi-stage control unit 26 transfers the original PGC information table, the user-defined PGC information table, and the title ~earch pointers to the PGC infor~ation table work area 21, and transfers the ti~e map table to the time map table ~ork area.
Based on the transferred original PGC information table, the editing multi-stage control unit 26 displays the interac~ive screen shown in Fig. 85, and waits for the next user indication.
Fig. 85 ~how~ an example of the interactive screen displayed by the TV monitor 72 to have the user ~elect the ~ections ~or the cells of a user-defined PGC in a virtual edit This interactive screen displays the original PGC~
and user-deflned PGCs a~ simple graphs, ~here the 'Y~Y~ Oll~J ~Kl~ Y~

horizontal axi~ represents time. The recording date and time of each original PGC and user-defined PGC is al~o displayed. This interactive ~creen displays the plurality of cells as a horizontal arrangement of rectangles. The user may select any of these rectangles using the cursor keys on the remote controller 71 These original PGC and cells are the same as those shown in Fig. 76, and the following describes the updating o~ the original PGC information table, the lo user-defined PGC information table and t~e title search pointers with Fig. 76 as the ini~ial state.
Fig. 81 is a flowchar~ Yhowing the proce~sing of the e~iting multi-~tage control unit 26 when defining a user-de~ined PGC. In this flowchart, the variable j indicates one of the plurality of original PGCs that are a~ranged vertically in the inte~active screen and the variable k indicate~3 one of the plurality of cells that are arranged horizontally in the interacti~e screen.
The variable m is the PGC number that should be assigned to the set of u~er-defined PGC information that is being newly defined in the RTRw m~nagement file, and the variable n is the cell number that ~hould be assigned to the set of cell information ~hat i~ being newly defined in the RTRW management file.
In step S201, the editing multi-stage control unit 26 substitutes a value given by adding one to the last number of the original PGC information in the RTRW management file into the variable m and "1" into the variable n. In step S202, the editing multi-stage control unit 26 adds a space for the mth user-defined PGC information to the user-defined PGC information table, and in step S203, the editing multi-stage control unit 26 waits for the user to make a key operation.
Once the user has made a key operation, in step S204 the editing multi-stage control unit 26 sets the flag for the pressed key, out of the flags that correspond to the keys on the remote controller 71, at "1", and in step S205 judges whether the Enter_Flag, which shows whether the enter key has been pressed, is "1". In step S206, the editing multi-stage control unit 26 judges whether the End_Flag, which shows whether the end key has been pressed, is "1". When both these flags are "0", the editing multi-stage control unit 26 uses the Right_Flag, Left_Flag, Down_Flag, and Up_Flag, which respectively show whether the right, left, down, or up keys have been pressed, to perform the following calculations, before substituting the calculation results into the variables k and j.

k = k + 1*(Right_Flag) - 1*(Left_Flag)
j = j + 1*(Down_Flag) - 1*(Up_Flag)

When the right key has been pressed, the Right_Flag is set at "1", so that the variable k is incremented by "1". When the down key has been pressed, the Down_Flag is set at "1", so that the variable j is incremented by "1". Conversely, when the left key has been pressed, the Left_Flag is set at "1", so that the variable k is decremented by "1". In the same way, when the up key has been pressed, the Up_Flag is set at "1", so that the variable j is decremented by "1".
After updating the values of the variables k and j in this way, the editing multi-stage control unit 26 has the cell representation in row j and column k displayed in the focus state in step S208, clears all of the flags assigned to keys on the remote controller 71 to zero in step S209, and returns to step S203 where it waits once again for a key operation. By repeating the procedure in steps S203 to S209 described above, the focus state can move up/down and left/right among the cells according to key operations made using the remote controller 71.
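As an informal illustration of the calculations in steps S203 to S209, the following short Python sketch (not part of the patent; the flag names follow the description above, while the data shapes are assumptions) shows how the pressed-key flags move the focus among the cells:

def move_focus(j, k, flags):
    # The 0/1 flags for the pressed keys act as multipliers,
    # exactly as in the calculations shown above.
    k = k + 1 * flags.get("Right_Flag", 0) - 1 * flags.get("Left_Flag", 0)
    j = j + 1 * flags.get("Down_Flag", 0) - 1 * flags.get("Up_Flag", 0)
    return j, k

# Example: starting from row 1, column 1, one press of the down key and two
# presses of the right key (compare the worked example given later in the text).
j, k = 1, 1
j, k = move_focus(j, k, {"Down_Flag": 1})    # j=2, k=1
j, k = move_focus(j, k, {"Right_Flag": 1})   # j=2, k=2
j, k = move_focus(j, k, {"Right_Flag": 1})   # j=2, k=3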
If the user presses the enter key with any of the cells in the focus state during the above processing, the editing multi-stage control unit 26 proceeds to step S251 in Fig. 82.
In step S251 of Fig. 82, the editing multi-stage control unit 26 has the user give an indication as to whether the cell information in row j and column k should be used as it is, or whether only an area within the section indicated by this cell information is to be used. When the cell information is to be used as it is, the editing multi-stage control unit 26 copies the cell representation in row j and column k to the space given as row m and column n in step S252, and defines Original_PGC#j.CELL#k as User_Defined_PGC#m.CELL#n in step S253. After this defining, in step S254 the editing multi-stage control unit 26 increments the variable n and proceeds to step S209 in Fig. 81.
When an area within the section indicated by this cell information in row j and column k should be used, the editing multi-stage control unit 26 proceeds to step S255 to have the title reproduction control unit 23 commence partial reproduction for the cell information in row j and column k.
In step S255, the editing multi-stage control unit 26 determines the circumstances for the reproduction of the cell information in row j and column k. This determination is performed since, when the section indicated by this cell information has been reproduced in part, there is no need to reproduce the section once again from the start, with it being preferable in this case for the reproduction of the section indicated by the cell information in row j and column k to commence at the position where the previous reproduction was terminated (step S266), this point being called the reproduction termination point t.
On the other hand, when the cell information in row j and column k has not been reproduced, the section indicated by the cell information in row j and column k is reproduced from the start in step S265, with the processing then returning to step S256 and entering the loop formed of steps S256 and S257. Step S256 waits for the reproduction of the cell to end, while step S257 waits for the user to press the mark key. When the judgement "Yes" is given in step S257, the processing advances to step S258, where the time information for the pressing of the mark key is obtained, and then to step S259. In step S259, the editing multi-stage control unit 26 judges whether two sets of time information have been obtained. If not, the processing returns to step S256, or if so, the processing advances to step S260 where the obtained two sets of time information are set as the start point and end point.
One of the sets of time information obtained here is the start of the video scene which was marked by the user during its display on the TV monitor 72, while the other set of time information is the end of this video scene. These sets of time information are interpreted as marking a section in the original PGC which is especially wanted by the user as material for a video edit. Accordingly, user-defined PGC information should be generated for this section, so that cell information is generated in the PGC information table work area 21.
The processing then advances to step S261.
In step S261, the user-defined PGC information generator 25 obtains the VOB_ID and AV file ID in Original_PGC#j.CELL#k. In step S262, the user-defined PGC information generator 25 generates User_Defined_PGC#m.CELL#n using the obtained start point and end point, VOB_ID, and AV file ID. In step S263, the end point information is stored as the reproduction termination point t and in step S254, the variable n is incremented, before the processing returns to step S209.
As a result of the above processing, new user-defined cell information is generated from the cell information in row j and column k. After this, another cell is set into the focus state and another set of user-defined cell information is generated from this cell, so that a set of user-defined PGC information is gradually defined one cell at a time.
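The generation of a set of user-defined cell information from two mark-key times (steps S258 to S262) can be pictured with the following Python sketch; the class layout and field names are hypothetical and only stand in for the cell information described above:

from dataclasses import dataclass

@dataclass
class CellInfo:
    av_file_id: int
    vob_id: int
    start_time: float   # presentation time (illustrative unit: seconds)
    end_time: float

def define_partial_cell(original_cell, mark_times):
    # Two mark-key times become the start point and end point of a new
    # user-defined cell that reuses the original cell's VOB_ID and AV file ID.
    if len(mark_times) != 2:
        raise ValueError("two mark points (start and end) are required")
    start, end = sorted(mark_times)
    return CellInfo(original_cell.av_file_id, original_cell.vob_id, start, end)

# Example: the user marks 12.0 s and 47.5 s while the cell in row j, column k plays.
original = CellInfo(av_file_id=1, vob_id=2, start_time=0.0, end_time=120.0)
new_cell = define_partial_cell(original, [12.0, 47.5])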


It should be noted here that if the reproduction based on the cell information in row j and column k in the loop process shown as step S256 to step S257 ends without a marking operation having been made, the processing will return to step S254.
When it is determined that the end key has been pressed, the judgement "Yes" is given in step S206 in Fig. 80B and the processing advances to step S213. In step S213, a menu is displayed to have the user indicate whether a next user-defined PGC is to be defined. When the user wishes to define a new user-defined PGC and gives an indication of such, in step S214 the variable m is incremented, the variable n is initialized, and the processing proceeds to steps S209 and S203.

(4-2-7-2) Specific Example of the Definition of User-Defined PGC Information
The following is a description of the operation when defining user-defined PGC information from a plurality of sets of original PGC information that are displayed in the interactive screen image of Fig. 85.
Figs. 86A and 86B show the relationship between the user operations made via the remote controller 71 and the display processing that accompanies the various user operations. Fig. 87A through Fig. 90 also illustrate examples of such operations, and are referred to in the following explanation of these operations.
As shown in Fig. 85, once the cell #1 which is in row 1 and column 1 has been set in the focus state, the user presses the enter key, as shown in Fig. 86B. As a result, the judgement "Yes" is given in step S205 and the processing proceeds to the flowchart in Fig. 82. In steps S251 to S266 of the flowchart in Fig. 82, the first cell information CELL#1A in the user-defined PGC #1 is generated based on the Original_PGC#1.CELL#1 shown in Fig. 86A. Once this generation is complete, the variable n is incremented in step S254, and the processing returns to step S203 via step S209 with the value of the variable n at "2". In this example, the user presses the down key once, as shown in Fig. 87B, and the right key twice, as shown in Figs. 87C and 87D.
In step S204, the flags that correspond to the keys that have been pressed are set at "1".
As a result of the first press of the down key:
k = 1 (= 1 + 1*0 - 1*0); j = 2 (= 1 + 1*1 - 1*0)
As a result of the first press of the right key:
k = 2 (= 1 + 1*1 - 1*0); j = 2 (= 2 + 1*0 - 1*0)
As a result of the second press of the right key:
k = 3 (= 2 + 1*1 - 1*0); j = 2 (= 2 + 1*0 - 1*0)
As shown in Fig. 87A, the cell #7 located in row 2 and column 3 is set in the focus state.
Once the cell in row 2 and column 3 has been set in the focus state, the user presses the enter key, as shown in Fig. 88B, so that the judgement "Yes" is given in step S205 and the processing advances to the flowchart in Fig. 82. The cell information #7A, which is the second set of cell information in UserDefined_PGC#1, is then generated based on the Original_PGC#2.CELL#7 located in row 2 and column 3 of the original PGC information table (see Fig. 88A).
After the second set of cell information has been generated, the above processing is repeated. The user presses the enter key as shown in Fig. 89B, so that the cell information #11A and the cell information #3A are respectively generated as the third and fourth sets of cell information in UserDefined_PGC#1.
The processing returns to step S203 and, in the present example, the user then presses the end key. As a result, the End_Flag corresponding to the end key is set at "1", and the processing advances to step S213.

Since the end key has been pressed, the editing multi-stage control unit 26 regards the definition of the user-defined PGC information #1 as complete. In step S213, the user is asked to indicate whether he/she wishes to define another set of user-defined PGC information (the user-defined PGC information #2) that follows this defined user-defined PGC information #1.
If the user wishes to do so, the variable m is incremented, the variable n is initialized, and the processing proceeds to step S209.
By repeating the above processing, the user-defined PGC information #2 and the user-defined PGC information #3 are defined. As shown in Fig. 91, this user-defined PGC information #2 is composed of cell #2B, cell #4B, cell #10B, and cell #5B, and the user-defined PGC information #3 is composed of cell #3C, cell #6C, cell #8C, and cell #9C.
Fig. 91 shows the contents of the user-defined PGC information table, the original PGC information table, and the title search pointers at the end of the virtual edit process.
If the user presses the end key at this point, the interactive screen shown in Fig. 90 will be displayed in step S215 in Fig. 81, and the editing multi-stage control unit 26 waits for the user to select a set of user-defined PGC information using the up and down keys.

Here, the user can select a preview by pressing the play key, and can select a real edit by pressing the real edit key, with the user-defined PGC information table not being recorded yet.
If the user gives an indication for an operation that records a user-defined PGC, the user-defined PGC information table that includes the new user-defined PGC generated in the PGC information table work area 21 is transferred to the RTRW management file work area 24, where it is written into the part of the RTRW management file written in the RTRW management file work area 24 that corresponds to the user-defined PGC information table.
At the same time, file system commands are issued so that a title search pointer for the newly generated user-defined PGC information is added to the title search pointers that are already present in the RTRW management file transferred to the RTRW management file work area 24.
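A minimal sketch of this record operation, using plain Python dictionaries and lists as stand-ins for the RTRW management file, the PGC information table work area, and the title search pointers (the container shapes and key names are assumptions, not structures defined by the patent):

def record_user_defined_pgc(rtrw_mgmt, pgc_table_work_area, new_pgc):
    # Add the newly generated user-defined PGC to the work-area table.
    pgc_table_work_area.append(new_pgc)
    # Overwrite the user-defined PGC information table part of the management file.
    rtrw_mgmt["user_defined_pgc_table"] = list(pgc_table_work_area)
    # Equivalent of the file system commands: add a title search pointer for
    # the new user-defined PGC to those already present.
    rtrw_mgmt.setdefault("title_search_pointers", []).append(
        {"pgc_number": new_pgc["pgc_number"], "type": "user_defined"})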
Fig. 83 is a flowchart showing the processing during a preview or a real edit. The following is a description of the processing when performing a preview of a VOB linking operation, with reference to this flowchart in Fig. 83.
Figs. 92A-92B and 93A-93C show the relationship between operations made using the remote controller 71 and the display processing that accompanies these operations. In step S220 of the flowchart of Fig. 83, the first number in the user-defined PGC information table is substituted into the variable j, and in step S221, a key operation is awaited. When the user makes a key operation, in step S222 the flag corresponding to the key pressed by the user is set at "1".
In step S223, it is judged whether the Play_Flag, which shows whether the play key has been pressed, is "1", and in step S224, it is judged whether the RealEdit_Flag, which shows whether the real edit key has been pressed, is "1". When both these flags are "0", the processing proceeds to step S225 where the following calculation is performed using the values of the Up_Flag and Down_Flag that respectively show whether the up and down keys have been pressed. The result of this calculation is substituted into the variable j.

j = j + 1*(Down_Flag) - 1*(Up_Flag)

When the user has pressed the up key, the Up_Flag will be set at "1", meaning that the variable j is decremented. Conversely, when the user has pressed the down key, the Down_Flag will be set at "1", meaning that the variable j is incremented. Once the variable j has been updated in this way, in step S226 the image on the display corresponding to the PGC information positioned on row j is set in the focus state. In step S227, all of the flags corresponding to keys on the remote controller 71 are cleared to zero and the processing returns to step S221 where another key operation is awaited. This processing in steps S221 to S227 is repeated, with the focus state moving to a different set of PGC information in accordance with user operations of the up and down keys on the remote controller 71.
If the user presses the play key while the above processing is being repeated, with one of the sets of PGC information in the focus state, the Play_Flag is set at "1", the judgement "Yes" is given in step S223, and the processing proceeds to step S228. In step S228, the editing multi-stage control unit 26 instructs the title reproduction control unit 23 to reproduce the VOBs in accordance with the PGC, out of the user-defined PGCs, that has been indicated by the user.
When the PGC indicated by the user is a user-defined PGC, the cells included in the user-defined PGC will indicate sections out of the plurality of sections in one or more VOBs in a user-defined order. Such reproduction will not satisfy the necessary conditions for seamless reproduction that were described in the first and second embodiments, so that image display and output will be stopped at the boundary of a cell during reproduction before advancing to the next cell. Since the necessary conditions for seamless reproduction of cells are not satisfied, image display and audio output will be interrupted. However, the object of this operation is only to give the user a preview of the linking result for a plurality of scenes, so that this object is still achieved regardless of such interruptions.
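A rough Python sketch of such a preview pass over the cells of a user-defined PGC is shown below (the Section tuple and the player callback are illustrative stand-ins; they are not interfaces defined by the patent):

from collections import namedtuple

Section = namedtuple("Section", "vob_id start end")

def preview(sections, play_section):
    # Reproduce each indicated section in the user-defined order. Because the
    # seamless conditions are not guaranteed between cells, a real decoder
    # would briefly stop picture and audio output at every cell boundary.
    for s in sections:
        play_section(s.vob_id, s.start, s.end)

# Example with a stand-in player callback:
preview([Section(1, 12.0, 47.5), Section(3, 5.0, 20.0)],
        lambda vob, a, b: print(f"VOB#{vob}: {a:.1f}s-{b:.1f}s"))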

(4-2-7-3) Processing for a Preview of a Multi-Stage Edit and for a Real Edit
The operation for the linking of VOBs in a real edit is described below.
Figs. 94A to 94C show the relationship between user operations of the remote controller 71 and the display processing that accompanies these key operations. The user presses the up key as shown in Fig. 94B to have cell #1A set into the focus state, and this is reflected in the display screen displayed on the TV monitor 72 as shown in Fig. 94A. If the user then presses the real edit key, as shown in Fig. 94C, the judgement "Yes" is made in step S224 in Fig. 83, and the processing from step S8 to step S16 in the flowchart of Fig. 43 described in the third embodiment is performed.
After completing this processing in the third embodiment, the processing advances to step S237 in Fig. 84. After the variable n is set at "1" in step S237, a search is performed for the Original_PGC#j.CELL#k which was used when generating the UserDefined_PGC#m.CELL#n in step S238, and in step S239 it is judged whether this Original_PGC#j exists. If so, this Original_PGC#j is deleted in step S240, or if not, a search is performed for the UserDefined_PGC#q that was generated from this Original_PGC#j in step S241.

In step S242, it is determined whether there is at least one such UserDefined_PGC#q, and if so, all such UserDefined_PGC#q are deleted in step S243. In step S244, it is judged whether the value of variable n matches the last number of the cell information, and if not, the processing advances to step S245 where the variable n is incremented to indicate the next set of cell information in the PGC information #q before the processing returns to step S238. The loop process in step S238 to step S245 is repeated until the variable n reaches the last number of the cell information in the PGC information #q.
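In outline, the deletion loop of steps S237 to S245 behaves like the following Python sketch; the table layouts, the "source_original_pgc" field, and the function name are assumptions used only to make the control flow concrete:

def purge_after_real_edit(edited_pgc_number, user_defined_table, original_table):
    edited = user_defined_table[edited_pgc_number]
    for cell in edited["cells"]:                      # the loop over variable n
        src = cell["source_original_pgc"]             # Original_PGC#j for this cell
        original_table.pop(src, None)                 # delete Original_PGC#j (step S240)
        # Delete every other user-defined PGC generated from the same
        # original PGC (steps S241 to S243).
        for q in [q for q, pgc in user_defined_table.items()
                  if q != edited_pgc_number
                  and any(c["source_original_pgc"] == src for c in pgc["cells"])]:
            del user_defined_table[q]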

The sections indicated by the user-defined PGC information #1 are all of VOBs #1, #2, and #3, so that these are all subjected to the real edit. The sets of original PGC information that were used to generate the cell information included in user-defined PGC information #1 indicate VOBs that are subjected to the real edit, so that all of these sets of original PGC information are deleted. The sets of user-defined PGC information that were generated from these sets of PGC information also indicate VOBs that are subjected to the real edit, so that all of these sets of user-defined PGC information are also deleted.
The judgement "Yes" is made in step S244, so that the processing advances to step S246, and, out of the freed PGC numbers obtained by deleting the sets of original PGC information, the lowest number is obtained as the PGC number #c. Next, in step S247, the cell information is updated using the AV file ID assigned to the AV file and the VOB_ID after the MERGE command, and in step S248 the PGC number of the UserDefined_PGC#q is updated to the PGC number #c. In the title search pointers, meanwhile, the type information is updated to the original type.
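Steps S246 to S248 amount to a renumbering pass; the following Python sketch shows one way this could look, with all container shapes and key names being assumptions rather than structures taken from the patent:

def promote_to_original(freed_numbers, user_defined_table, q,
                        title_search_pointers, new_av_file_id, merged_vob_id):
    c = min(freed_numbers)                       # step S246: lowest freed PGC number
    pgc = user_defined_table.pop(q)
    for cell in pgc["cells"]:                    # step S247: post-MERGE identifiers
        cell["av_file_id"] = new_av_file_id
        cell["vob_id"] = merged_vob_id
    pgc["pgc_number"] = c                        # step S248: renumber to #c
    for ptr in title_search_pointers:            # mark the pointer as original type
        if ptr.get("pgc_number") == q:
            ptr["pgc_number"] = c
            ptr["type"] = "original"
    return c, pgc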
Fig. 95 shows examples of the PGC information table and the title search pointers after the deletion of sets of original PGC information and user-defined PGC information that accompanies a real edit.
Since the VOBs #1, #2, and #3 indicated by the sections in user-defined PGC information #1 are subjected to the real edit, the original PGC information #1, the original PGC information #2, the original PGC information #3, the user-defined PGC information #2, and the user-defined PGC information #3 will already have been deleted. Conversely, what was formerly the user-defined PGC information #1 has been defined as the original PGC information #1.
Once the PGC information has been updated in the PGC information table work area 21 as described above, the new original PGC information is transferred to the RTRW management file work area 24 where it is used to overwrite the RTRW management file presently stored in the RTRW management file work area 24. At the same time, the title search pointer for this newly generated original PGC information is transferred to the RTRW management file work area 24 where it is used to overwrite the title search pointers already present in the RTRW management file.
Once the user-defined PGC information table and title search pointers have been written, file system commands are issued so that the RTRW management file stored in the RTRW management file work area 24 is written into the RTRW directory.
With this present embodiment, the sections to be used as materials for a real edit are indicated by user-defined cell information, with these being freely arranged to provisionally decide the reproduction route.
When the user wishes to set a reproduction route of the editing materials, this can be achieved without having to temporarily produce a VOB, so that the editing of video materials can be performed in a short time using a simple method. This also means that there is no need to use more of the storage capacity of the DVD-RAM to store a temporarily produced VOB.
If the provisional determination of scene linking can be achieved by merely defining a set of user-defined PGC information, the user can produce many variations of the reproduction route in a short time. The sets of user-defined cell information are indicated using time information for sections in VOBs, so that the indicated VOBs can be maintained in the state in which they were already recorded.
The user can generate a plurality of sets of user-defined PGC information for different reproduction routes and then view previews of these routes to find the most suitable of these reproduction routes. The user can then indicate a real edit for his/her preferred reproduction route, and so process the VOBs in accordance with the selected user-defined PGC information. This means that the user can perform a bold editing process that directly rewrites the VOBs that are already stored on an optical disc. While the original VOBs will be effectively deleted from the disc, the user is able to verify the result of this before giving the real edit indication, making this not a particular problem for the present invention.
Once a real edit has been performed, the title type in the title search pointer of the user-defined PGC information used for the real edit will be set to "original type PGC information", so that this can be used as the base for following video editing operations.
As described above, a single video data editing apparatus that uses only one optical disc can perform advanced video editing whereby a user can select one out of a plurality of freely chosen potential arrangements of the source material. As a result, by using the present video data editing apparatus, a large number of video enthusiasts will be able to perform advanced editing operations that were considered out of the reach of conventional domestic video equipment.
It should be noted here that the time information may be taken from the mark points in the cell information and managed with information such as addresses taken from the time map table in the form of a table.
By doing so, this information can be presented to the user as potential selections in a screen showing the pre-editing state.
Reduced images (known as "thumbnails") may also be generated for each mark point and stored as separate files, with pointer information also being produced for each thumbnail. When displaying the cell information at the pre-editing stage, these thumbnails may be displayed to show the potential selections that can be made by the user.
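One possible arrangement of such thumbnail files and their pointer information is sketched below in Python; the directory layout, file naming, and JSON pointer table are assumptions made for illustration only:

import json
from pathlib import Path

def save_mark_point_thumbnails(mark_points, thumb_dir):
    # mark_points: list of (time_code, reduced_image_bytes) pairs, one per mark point.
    thumb_dir = Path(thumb_dir)
    thumb_dir.mkdir(parents=True, exist_ok=True)
    pointers = []
    for i, (time_code, image_bytes) in enumerate(mark_points):
        path = thumb_dir / f"mark_{i:04d}.jpg"
        path.write_bytes(image_bytes)            # the reduced image itself
        pointers.append({"time": time_code, "file": path.name})
    # Pointer information for each thumbnail, used by the pre-editing screen.
    (thumb_dir / "pointers.json").write_text(json.dumps(pointers, indent=2))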
The processing of components such as the title reproduction control unit 23 (see Fig. 78) and the processing of the editing multi-stage control unit 26 (Figs. 81 to 84) that was described in this fourth embodiment using flowcharts can be achieved by a machine language program. Such a machine language program may be distributed and sold having been recorded on a recording medium. Examples of such a recording medium are an IC card, an optical disc, or a floppy disc. The machine language program recorded on the recording medium may then be installed into a standard personal computer. By executing the installed machine language programs, the standard personal computer can achieve the functions of the video data editing apparatus of this fourth embodiment.
As a final note regarding the relationship between VOBs and original PGC information, it is preferable for one set of original PGC information to be provided for each VOB.
Although the present invention has been fully described by way of examples with reference to the accompanying drawings, it is to be noted that various changes and modifications will be apparent to those skilled in the art. Therefore, unless such changes and modifications depart from the scope of the present invention, they should be construed as being included therein.

Claims (31)

1. A video data editing apparatus that performs editing to enable seamless reproduction of at least two video objects that are recorded on an optical disc, each video object including a plurality of video object units, and each video object unit including sets of picture data, the video data editing apparatus comprising:
reading means for reading at least one of a former video object unit sequence and a latter video object unit sequence from a video object recorded on the optical disc, the former video object unit sequence being composed of a predetermined number of video object units positioned at the end of a former video object to be reproduced first, and the latter video object unit sequence being composed of a predetermined number of video object units positioned at a start of a latter video object to be reproduced second;
encoding means for re-encoding the sets of picture data included in at least one of the former video object unit sequence and the latter video object unit sequence to enable the former video object and the latter video object to be reproduced seamlessly; and writing means for rewriting at least one of the former video object and the latter object on the optical disc after encoding by the encoding means.
2. The video data editing apparatus of Claim 1, wherein the encoding means re-encodes at least one of the sets of picture data included in the former video object unit sequence and the sets of picture data included in the latter video object unit sequence using a target amount of code, the target amount of code being an amount whereby no overflow will occur in a video buffer of a video decoder, even when the sets of picture data included in the former video object unit sequence are present in the video buffer at a same time as the sets of picture data included in the latter video object unit sequence.
3. The video data editing apparatus of Claim 2, wherein the plurality of sets of picture data are stored in a plurality of video packs, each video pack being assigned an input time stamp showing an input time for input into the video buffer, and one of the video packs being assigned a decode time stamp showing at what time the one of the sets of picture data should be taken from the video buffer, the video data editing apparatus further comprising analyzing means for calculating an amount of data that will be stored in the video buffer for each of the plurality of video frames in an analyzed period between a final input time in a final video pack in the former video object unit sequence to a final decode time of a set of picture data in the former video object unit sequence that should be decoded last, the analyzing means calculating the amount of data by referring to the input time stamps and decode time stamp assigned to each video pack in the former video object unit sequence and the latter video object unit sequence and totaling a data size of each video pack corresponding to the analyzed period, and the analyzing means calculating the target amount of code based on the calculated amount of data for each video frame and a buffer capacity of the video buffer, the re-encoding means re-encoding at least one of the sets of picture data included at an end of the former video object unit sequence and the sets of picture data included at a start of the latter video object unit sequence using the target amount of code.
4. The video data editing apparatus of Claim 3, further comprising:
information generating means for generating seamless linking information that includes a time stamp assigned to a final pack in the former video object unit sequence, a time stamp assigned to a first pack in the latter video object unit sequence, and a seamless flag showing whether reproduction is to be seamlessly performed for the former video object unit sequence and the latter video object unit sequence, an input time at which a first pack in the latter video object unit sequence is inputted into a buffer being found by adding a certain offset to a time stamp assigned to the first pack in the latter video object unit sequence, the writing means writing the generated seamless linking information onto the optical disc.
5. The video data editing apparatus of Claim 4, wherein each set of picture data includes data that is to be decoded for one video frame, the information generating means further adding a presentation end time when reproduction of the sets of picture data in the former video object unit sequence ends and a presentation start time when reproduction of the sets of picture data in the latter video object unit sequence starts to the seamless linking information, the certain offset being found by subtracting the presentation start time of the latter video object unit sequence from the presentation end time of the former video object unit sequence.
6. The video data editing apparatus of Claim 2, wherein each video object unit includes a plurality of sets of picture data and a plurality of sets of audio data, the video data editing apparatus further comprising:

separating means for separating sets of picture data and sets of audio data from the former video object unit sequence and the latter video object unit sequence read by the reading means; and multiplexing means for multiplexing at least one of the sets of picture data, which include one of picture data and re-encoded picture data, separated from the former video object unit sequence with the sets of audio data read from the former video object unit sequence, and for multiplexing the sets of picture data, which include one of picture data and re-encoded picture data, separated from the latter video object unit sequence with the sets of audio data separated from the latter video object unit sequence, the writing means writing data outputted by the multiplexing means onto the optical disc.
7. The video data editing apparatus of Claim 6, wherein the plurality of sets of audio data in the former video object unit sequence and the latter video object unit sequence are reproduced for a plurality of audio frames, the video data editing apparatus further comprising analyzing means for specifying a period between a first audio frame and a second audio frame, out of the plurality of audio frames in the former video object unit sequence, for taking a first audio data sequence that should be reproduced during the specified period from the former video object unit sequence, and for extracting a second audio data sequence that should be reproduced starting from a third audio frame in the plurality of audio frames in the latter video object unit sequence, the first audio frame being the second next audio frame from an audio frame that corresponds to a time at which a first pack in the latter video object unit sequence is inputted, the second audio frame being located immediately before an audio frame in the former video object unit sequence that corresponds to a presentation start time of reproduction of a first set of picture data in the latter video object unit sequence, the third audio frame being located immediately after the audio frame in the latter video object unit sequence that corresponds to a presentation end time of reproduction of the second audio frame, and the multiplexing means multiplexing the sets of picture data and the sets of audio data so that the first audio data sequence is located at a position before the second audio data sequence.
8. The video data editing apparatus of Claim 7, further comprising:
generating means for specifying a presentation end time of the second audio frame as a decode processing halt time for an audio decoder, and for generating halt control information that indicates a decode processing halt time and a period from the presentation end time of the second audio frame to the presentation start time of the third audio frame as a halt period for a processing halt by the audio decoder, the writing means writing the generated halt control information onto the optical disc.
9. The video data editing apparatus of Claim 8, wherein a plurality of sets of audio data that should be reproduced for a plurality of audio frames from the first audio frame to the second audio frame are stored as a first audio pack group, wherein if a data size of the first pack group is not an integer multiple of 2 kilobytes (KB), one of stuffing data and a padding packet is used to make the data size of the first audio pack group an integer multiple of 2KB, and wherein the plurality of sets of audio data that should be reproduced for a plurality of audio frames starting from the third audio frame are stored as a second audio pack group, with the multiplexing means multiplexing sets of picture data and sets of audio data so that the first audio pack group is located before the second audio pack group.
10. The video data editing apparatus of Claim 9, wherein the analyzing means generates location information that shows which video object unit, out of the video object units in the latter video object unit sequence, includes a final pack in the first audio pack group, the writing means writing the generated location information onto the optical disc.
11. A video data editing apparatus that performs editing to enable seamless reproduction of a former section and a latter section, the former section and the latter section being located in at least one video object that is recorded on an optical disc, each video object including a plurality of video object units and each video object unit including sets of picture data, the video data editing apparatus comprising:
reading means for reading a former video object unit sequence and a latter video object unit sequence from a video object recorded on the optical disc, the former video object unit sequence being composed of video object units positioned at an end of the former section that is to be reproduced first, and the latter video object unit sequence being composed of video object units positioned at a start of a latter section that is to be reproduced second;
encoding means for re-encoding the sets of picture data included in at least one of the former video object unit sequence and the latter video object unit sequence to enable the former section and the latter section to be reproduced seamlessly; and writing means for rewriting at least one of the former section and the latter section on the optical disc after encoding by the encoding means.
12. The video data editing apparatus of Claim 11, wherein the encoding means re-encodes at least one of the sets of picture data included in the former video object unit sequence and the sets of picture data included in the latter video object unit sequence using a target amount of code, the target amount of code being an amount whereby no overflow will occur in a video buffer of a video decoder, even when the sets of picture data included in the former video object unit sequence are present in the video buffer at a same time as the sets of picture data included in the latter video object unit sequence.
13. The video data editing apparatus of Claim 12, wherein when a picture type of a final set of picture data in a display order of the former section is a Bidirectionally Predictive Picture (B picture), the re-encoding means performs re-encoding to convert the final set of picture data to a Predictive Picture (P picture) whose information components are dependent on only sets of picture data that are reproduced earlier than the final set of picture data.
14. The video data editing apparatus of Claim 13, further comprising:
analyzing means for analyzing, when a picture type of a final set of picture data in a display order of the former section is a B picture, an increase in data size that accompanies a conversion of the B picture into a P picture by the encoding means, based on a data size of a set of picture data to be reproduced after the final set of picture data in the display order, the encoding means re-encoding sets of picture data at an end of the former video object unit sequence using a target code amount that ensures an underflow will not occur in a video buffer even when picture data with the analyzed increase in data size is accumulated in the video buffer.
15. The video data editing apparatus of Claim 12, wherein when a picture type of a first set of picture data in a coding order of the latter section is a P picture, the re-encoding means performs re-encoding to convert the first set of picture data to an Intra Picture (I picture) whose information components are not dependent on other sets of picture data.
16. The video data editing apparatus of Claim 15, further comprising:
analyzing means for analyzing, when a picture type of a first set of picture data in a coding order of the latter section is a P picture, an increase in data size that accompanies a conversion of the P picture into an I picture by the encoding means, based on a data size of a set of picture data to be reproduced before the first set of picture data in the display order, the encoding means re-encoding sets of picture data at a start of the latter video object unit sequence using a target code amount that ensures an underflow will not occur in a video buffer even when picture data with the analyzed increase in data size is accumulated in the video buffer.
17. The video data editing apparatus of Claim 12, wherein when a picture type of a first set of picture data in a display order of the latter section is a B
picture, the re-encoding means performs re-encoding to convert the first set of picture data to a forward predictive picture whose information components are only dependent on sets of picture data that are reproduced after the first set of picture data.
18. The video data editing apparatus of Claim 17, further comprising:
analyzing means for analyzing, when a picture type of a first set of picture data in a display order of the latter section is a B picture, an increase in data size that accompanies a conversion of the B picture into a forward predictive picture by the encoding means, based on a data size of a set of picture data to be reproduced after the first set of picture data, the encoding means re-encoding sets of picture data at a start of the latter video object unit sequence using a target code amount that ensures an underflow will not occur in a video buffer even when picture data with the analyzed increase in data size is accumulated in the video buffer.
19. The video data editing apparatus of Claim 18, further comprising:
separating means for separating sets of picture data and sets of audio data from the former video object unit sequence and the latter video object unit sequence read by the reading means; and multiplexing means for multiplexing at least one of the sets of picture data, which include one of picture data and re-encoded picture data, separated from the former video object unit sequence with the sets of audio data read from the former video object unit sequence, and for multiplexing the sets of picture data, which include one of picture data and re-encoded picture data, separated from the latter video object unit sequence with the sets of audio data separated from the latter video object unit sequence;
the writing means writing data outputted by the multiplexing means onto the optical disc.
20. The video data editing apparatus of Claim 19, wherein the plurality of sets of audio data in the former video object unit sequence and the latter video object unit sequence are reproduced for a plurality of audio frames, the video data editing apparatus further comprising analyzing means for specifying a period between a first audio frame and a second audio frame, out of the plurality of audio frames in the former video object unit sequence, for taking a first audio data sequence that should be reproduced during the specified period from the former video object unit sequence, and for extracting a second audio data sequence that should be reproduced starting from a third audio frame in the plurality of audio frames in the latter video object unit sequence, the first audio frame being the second next audio frame from an audio frame that corresponds to a time at which a first pack in the latter video object unit sequence is inputted, the second audio frame being located immediately before an audio frame in the former video object unit sequence that corresponds to a presentation start time of reproduction of a first set of picture data in the latter video object unit sequence, the third audio frame being located immediately after the audio frame in the latter video object unit sequence that corresponds to a presentation end time of reproduction of the second audio frame, and the multiplexing means multiplexing the sets of picture data and the sets of audio data so that the first audio data sequence is located at a position before the second audio data sequence.
21. The video data editing apparatus of Claim 20, further comprising:
generating means for specifying a presentation end time of the second audio frame as a decode processing halt time for an audio decoder, and for generating halt control information that indicates a decode processing halt time and a period from the presentation end time of the second audio frame to the presentation start time of the third audio frame as a halt period for a processing halt by the audio decoder, the writing means writing the generated halt control information onto the optical disc.
22. The video data editing apparatus of Claim 21, further comprising:
wherein a plurality of sets of audio data that should be reproduced for a plurality of audio frames from the first audio frame to the second audio frame are stored as a first audio pack group, wherein if a data size of the first pack group is not an integer multiple of 2 kilobytes (KB), one of stuffing data and a padding packet is used to make the data size of the first audio pack group an integer multiple of 2KB, and wherein the plurality of sets of audio data that should be reproduced for a plurality of audio frames starting from the third audio frame are stored as a second audio pack group, with the multiplexing means multiplexing sets of picture data and sets of audio data so that the first audio pack group is located before the second audio pack group.
23. The video data editing apparatus of Claim 22, further comprising:
wherein the analyzing means generates location information that shows which video object unit, out of the video object units in the latter video object unit sequence, includes a final pack in the first audio pack group, the writing means writing the generated location information onto the optical disc.
24. An optical disc, comprising:
a data area recording a plurality of video objects that include a plurality of video object units, each video object unit including a plurality of sets of picture data and a plurality of sets of audio data, the plurality of video objects having a display order, one of the video object units in a following video object, which is a video object out of the plurality of video objects that is to be reproduced following a preceding video object in the display order, including a first audio data sequence and the second audio data sequence, the first audio data sequence being a plurality of sets of audio data that should be reproduced during a specified period between a first audio frame and a second audio frame, out of a plurality of audio frames in the preceding video object, the second audio data sequence being a plurality of sets of audio data that should be reproduced from a third audio frame onwards, out of a plurality of audio frames in the following video object, the first audio frame being a second next audio frame from an audio frame that corresponds to a time at which a first pack in the following video object is inputted, the second audio frame being located immediately before an audio frame in the preceding video object unit sequence that corresponds to a presentation start time of video frames in the following video object, and the third audio frame being located immediately after the audio frame in the following video object that corresponds to a presentation end time of the second audio frame; and an index area storing a set of seamless linking information for each video object in the data area, the seamless linking information enabling seamless reproduction of a combination of two out of the plurality of video objects recording in the data area, each set of seamless linking information including:
audio gap start time information indicating a presentation end time of the second audio frame as a decode processing halt time of an audio decoder;
audio gap period information indicating a period between the presentation end time of the second audio frame to a presentation start time of the third audio frame as a decode halt period of the audio decoder; and location information indicating which video object unit, out of the video object units in the following video object, includes the first audio pack sequence.
25. The optical disc of Claim 24, wherein the first audio data sequence is stored in a first audio data pack group, the first audio data pack group being positioned before a second audio data pack group in the following video object, the second audio data pack group being an audio data sequence that is reproduced for a plurality of audio frames in the following video object, the location information indicating a video object unit that includes a final pack in the first audio data pack group.
26. A video data editing apparatus for an optical disc, the optical disc comprising:
a data area recording a plurality of video objects that include a plurality of video object units, each video object unit including a plurality of sets of picture data and a plurality of set of audio data, the plurality of video objects having a display order, one of the video object units in a following video object, which is a video object out of the plurality of video objects that is to be reproduced following a preceding video object in the display order, including a first audio data sequence and the second audio data sequence, the first audio data sequence being a plurality of sets of audio data that should be reproduced during a specified period between a first audio frame and a second audio frame, out of a plurality of audio frames in the preceding video object, the second audio data sequence being a plurality of sets of audio data that should be reproduced from a third audio frame onwards, out of a plurality of audio frames in the following video object, the first audio frame being a second next audio frame from an audio frame that corresponds to a time at which a first pack in the following video object is inputted, the second audio frame being located immediately before an audio frame in the preceding video object unit sequence that corresponds to a presentation start time of video frames in the following video object, and the third audio frame being located immediately after the audio frame in the following video object that corresponds to a presentation end time of the second audio frame; and an index area storing a set of seamless linking information for each video object in the data area, the seamless linking information enabling seamless reproduction of a combination of two out of the plurality of video objects recording in the data area, each set of seamless linking information including:

audio gap start time information indicating a presentation end time of the second audio frame as a decode processing halt time of an audio decoder;
audio gap period information indicating a period between the presentation end time of the second audio frame to a presentation start time of the third audio frame as a decode halt period of the audio decoder; and location information indicating which video object unit, out of the video object units in the following video object, includes the first audio pack sequence, the video data editing apparatus comprising:
reception means for receiving an indication of a part to be deleted, out of a plurality of video object units that are located at the front of a latter video object;
reading means for referring to the location information in the seamless linking information and reading the video object unit, out of the plurality of video object units in the following video object, in which the first audio data sequence is located; and deleting means for deleting a plurality of video object units, which correspond to the part to be deleted, and the audio gap.
27. The video data editing apparatus of Claim 26, further comprising:
extracting means for extracting, from the first audio data sequence and the second audio data sequence, audio data sequences to be newly arranged in the following video object, based on the video presentation start time of a set of picture data to be reproduced first in the following video object from which the part has been deleted;
arranging means for storing the audio data sequence extracted from the first audio data sequence in a first audio pack group and the audio data sequence extracted from the second audio data sequence in a second audio pack group, and for arranging the first audio pack group and the second audio pack group into video object unit in the following video object.
28. The video data editing apparatus of Claim 27, further comprising:
updating means for updating the audio gap start time information and the audio gap period information, based on the audio data sequences extracted by the extracting means, and updating the location information, based on a location of the first audio data sequence used by the arranging means.
29. A computer-readable recording medium recording an editing program that performs editing that enables seamless reproduction of two video objects on an optical disc, each video object including a plurality of video object units, and each video object unit including sets of picture data, the editing program comprising the following steps:
a reading step for reading at least one of a former video object unit sequence and a latter video object unit sequence from a video object recorded on the optical disc, the former video object unit sequence being composed of a predetermined number of video object units positioned at the end of a former video object to be reproduced first, and the latter video object unit sequence being composed of a predetermined number of video object units positioned at a start of a latter video object to be reproduced second;
an encoding step for re-encoding the sets of picture data included in at least one of the former video object unit sequence and the latter video object unit sequence to enable the former video object and the latter video object to be reproduced seamlessly; and a writing step for rewriting at least one of the former video object and the latter object on the optical disc after encoding by the encoding step.
30. A computer-readable recording medium storing an editing program that edits parts of video objects recorded on an optical disc to enable seamless reproduction for the parts, each video object including a plurality of video object units and each video object unit including sets of picture data for a given reproduction period including a plurality of video frames, each set of picture data being reproduced together with one audio frame, each part being a section between one video frame and another video frame, the editing program including the following steps:
a reading step for reading a former video object unit sequence and a latter video object unit sequence from a video object recorded on the optical disc, the former video object unit sequence being composed of video object units positioned at an end of the former section that is to be reproduced first, and the latter video object unit sequence being composed of video object units positioned at a start of a latter section that is to be reproduced second;
an encoding step for re-encoding the sets of picture data included in at least one of the former video object unit sequence and the latter video object unit sequence to enable the former section and the latter section to be reproduced seamlessly; and a writing step for rewriting at least one of the former section and the latter section on the optical disc after encoding by the encoding step.
31. A computer-readable recording medium storing an editing program that edits an optical disc, the optical disc comprising:
a data area recording a plurality of video objects that include a plurality of video object units, each video object unit including a plurality of sets of picture data and a plurality of sets of audio data, the plurality of video objects having a display order, one of the video object units in a following video object, which is a video object out of the plurality of video objects that is to be reproduced following a preceding video object in the display order, including a first audio data sequence and the second audio data sequence, the first audio data sequence being a plurality of sets of audio data that should be reproduced during a specified period between a first audio frame and a second audio frame, out of a plurality of audio frames in the preceding video object, the second audio data sequence being a plurality of sets of audio data that should be reproduced from a third audio frame onwards, out of a plurality of audio frames in the following video object, the first audio frame being a second next audio frame from an audio frame that corresponds to a time at which a first pack in the following video object is inputted, the second audio frame being located immediately before an audio frame in the preceding video object unit sequence that corresponds to a presentation start time of video frames in the following video object, and the third audio frame being located immediately after the audio frame in the following video object that corresponds to a presentation end time of the second audio frame; and an index area storing a set of seamless linking information for each video object in the data area, the seamless linking information enabling seamless reproduction of a combination of two out of the plurality of video objects recording in the data area, each set of seamless linking information including:
audio gap start time information indicating a presentation end time of the second audio frame as a decode processing halt time of an audio decoder;
audio gap period information indicating a period between the presentation end time of the second audio frame to a presentation start time of the third audio frame as a decode halt period of the audio decoder; and location information indicating which video object unit, out of the video object units in the following video object, includes the first audio pack sequence, the editing program including the following steps:
a reception step for receiving an indication of a part to be deleted, out of a plurality of video object units that are located at the front of a latter video object;
a reading step for referring to the location information in the seamless linking information and reading the video object unit, out of the plurality of video object units in the following video object, in which the first audio data sequence is located; and a deleting step for deleting a plurality of video object units, which correspond to the part to be deleted, and the audio gap.
CA002247637A 1997-09-17 1998-09-17 Video data editing apparatus, optical disc for use as a recording medium of a video data editing apparatus, and computer-readable recording medium storing an editing program Abandoned CA2247637A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP25199597 1997-09-17
JP9-251995 1997-09-17

Publications (1)

Publication Number Publication Date
CA2247637A1 true CA2247637A1 (en) 1999-03-17

Family

ID=17231092

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002247637A Abandoned CA2247637A1 (en) 1997-09-17 1998-09-17 Video data editing apparatus, optical disc for use as a recording medium of a video data editing apparatus, and computer-readable recording medium storing an editing program

Country Status (10)

Country Link
US (2) US6148140A (en)
EP (2) EP0903742B1 (en)
KR (1) KR100532738B1 (en)
CN (1) CN1137488C (en)
CA (1) CA2247637A1 (en)
DE (1) DE69812258T2 (en)
ID (1) ID21786A (en)
MY (1) MY115908A (en)
TW (1) TW388027B (en)
WO (1) WO1999014757A2 (en)

Families Citing this family (125)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8370746B2 (en) 1992-12-14 2013-02-05 Monkeymedia, Inc. Video player with seamless contraction
US8381126B2 (en) 1992-12-14 2013-02-19 Monkeymedia, Inc. Computer user interface with non-salience deemphasis
US5623588A (en) 1992-12-14 1997-04-22 New York University Computer user interface with non-salience deemphasis
CA2670077C (en) * 1997-09-17 2010-09-21 Panasonic Corporation A recording apparatus, computer-readable recording medium, file management system and optical disc for recording video objects
US6370325B2 (en) 1997-09-17 2002-04-09 Matsushita Electric Industrial Co., Ltd. Optical disc recording apparatus, computer-readable recording medium recording a file management program, and optical disc
DE19818819A1 (en) * 1997-11-20 1999-05-27 Mitsubishi Electric Corp Screen setting editor for screen display linked to personal computer
EP2261920A3 (en) * 1998-02-23 2011-03-09 Kabushiki Kaisha Toshiba Information storage medium, information playback method and apparatus and information recording method
US20020067913A1 (en) * 1998-05-15 2002-06-06 Hideo Ando Information recording method and information reproducing method
JP3383580B2 (en) * 1998-05-15 2003-03-04 株式会社東芝 Information storage medium, information recording / reproducing apparatus and method
DE19828072A1 (en) * 1998-06-24 1999-12-30 Thomson Brandt Gmbh Procedure for recording and playing back a broadcast program contribution
JP3356691B2 (en) 1998-07-07 2002-12-16 株式会社東芝 Information recording medium, recording method and reproducing method thereof
US6233389B1 (en) 1998-07-30 2001-05-15 Tivo, Inc. Multimedia time warping system
US7558472B2 (en) 2000-08-22 2009-07-07 Tivo Inc. Multimedia signal processing system
US8577205B2 (en) 1998-07-30 2013-11-05 Tivo Inc. Digital video recording system
US6553086B1 (en) * 1998-10-02 2003-04-22 Lg Electronics, Inc. Method and apparatus for recording time information for digital data streams
US6408128B1 (en) * 1998-11-12 2002-06-18 Max Abecassis Replaying with supplementary information a segment of a video
JP2000165802A (en) 1998-11-25 2000-06-16 Matsushita Electric Ind Co Ltd Stream edit system and edit method
KR100326144B1 (en) * 1999-02-09 2002-03-13 구자홍 Method and apparatus for creating search information of recorded digital data streams and searching the recorded streams by using the search information
AU764308B2 (en) * 1999-02-17 2003-08-14 Matsushita Electric Industrial Co., Ltd. Information recording medium, apparatus and method for performing after-recording on the recording medium
DE60013781T2 (en) * 1999-02-25 2005-09-29 Macrovision Corp., Santa Clara Controlling the copying of a video signal by means of watermarks and related additional data
US6393158B1 (en) * 1999-04-23 2002-05-21 Monkeymedia, Inc. Method and storage device for expanding and contracting continuous play media seamlessly
US10051298B2 (en) 1999-04-23 2018-08-14 Monkeymedia, Inc. Wireless seamless expansion and video advertising player
EP1054404B1 (en) * 1999-05-18 2006-03-01 Deutsche Thomson-Brandt Gmbh Method of marking digital data
EP1054405A1 (en) 1999-05-18 2000-11-22 Deutsche Thomson-Brandt Gmbh Method for marking digital data
US7184648B1 (en) * 1999-06-25 2007-02-27 Koninklijke Philips Electronics N.V. Incomplete streams
US6788876B1 (en) * 1999-06-28 2004-09-07 Matsushita Electric Industrial Co., Ltd. Information recording medium, information recording/reproduction system apparatus, and information recording/reproduction method
US7292781B1 (en) 1999-07-07 2007-11-06 Matsushita Electric Industrial Co., Ltd. AV data recording device and method, disk recorded by the AV data recording device and method, AV data reproducing device and method therefor
US6418273B1 (en) 1999-09-07 2002-07-09 The Original San Francisco Toymakers Video compact disc player
AU7706500A (en) 1999-09-20 2001-04-24 Tivo, Inc. Closed caption tagging system
JP4389365B2 (en) 1999-09-29 2009-12-24 ソニー株式会社 Transport stream recording apparatus and method, transport stream playback apparatus and method, and program recording medium
CN100409350C (en) * 1999-09-30 2008-08-06 松下电器产业株式会社 Information recording medium and system controller
KR100561329B1 (en) * 1999-09-30 2006-03-16 마쯔시다덴기산교 가부시키가이샤 Information recording medium and system controller
WO2001037560A1 (en) * 1999-11-15 2001-05-25 Matsushita Electric Industrial Co., Ltd. Video searching method and video searching apparatus
JP4328989B2 (en) 1999-11-24 2009-09-09 ソニー株式会社 REPRODUCTION DEVICE, REPRODUCTION METHOD, AND RECORDING MEDIUM
CN1383678A (en) * 2000-04-21 2002-12-04 索尼公司 Encoding device and method, recorded medium, and program
JP4599740B2 (en) * 2000-04-21 2010-12-15 ソニー株式会社 Information processing apparatus and method, recording medium, program, and recording medium
US7236687B2 (en) * 2000-04-21 2007-06-26 Sony Corporation Information processing apparatus and method, program, and recording medium
US7027711B2 (en) * 2000-06-26 2006-04-11 Matsushita Electric Industrial Co., Ltd. Editing apparatus for an optical disk, computer-readable storage medium, and computer program
US20040013403A1 (en) * 2000-06-26 2004-01-22 Shin Asada Edit apparatus, reproduction apparatus, edit method, reproduction method, edit program reproduction program, and digital record medium
JP2002056650A (en) * 2000-08-15 2002-02-22 Pioneer Electronic Corp Information recorder, information recording method and recording medium with record control program recorded therein
JP2002056609A (en) * 2000-08-15 2002-02-22 Pioneer Electronic Corp Information recorder, information recording method and information recoring medium recorded with recording control program
JP4467737B2 (en) * 2000-08-16 2010-05-26 パイオニア株式会社 Information recording apparatus, information recording method, and information recording medium on which recording control program is recorded
JP4304888B2 (en) * 2000-09-04 2009-07-29 ソニー株式会社 Recording medium, editing apparatus and editing method
EP1209683A3 (en) * 2000-11-24 2003-08-13 Pioneer Corporation Information recording apparatus
MXPA02007340A (en) * 2000-11-29 2002-12-09 Matsushita Electric Ind Co Ltd Optical disk, recorder, reproducer, program, computer-readable record medium and method.
US7043484B2 (en) * 2000-12-05 2006-05-09 Dvdemand Technologies Inc. System and method for producing storage media images
US20020106191A1 (en) * 2001-01-05 2002-08-08 Vm Labs, Inc. Systems and methods for creating a video montage from titles on a digital video disk
US7356250B2 (en) 2001-01-05 2008-04-08 Genesis Microchip Inc. Systems and methods for creating a single video frame with one or more interest points
JP2002278996A (en) * 2001-03-22 2002-09-27 Sony Corp Recorder, recording method and recording medium
TWI236294B (en) 2001-05-11 2005-07-11 Ulead Systems Inc Method and device for capturing digital video
WO2002099804A1 (en) * 2001-06-04 2002-12-12 Matsushita Electric Industrial Co., Ltd. Recording apparatus, recording medium, reproduction apparatus, program, and method
JP3747806B2 (en) * 2001-06-11 2006-02-22 ソニー株式会社 Data processing apparatus and data processing method
EP1271522A1 (en) * 2001-06-23 2003-01-02 Deutsche Thomson-Brandt Gmbh A method for controlling the editing of DVD stream recorded data and editing management system
JP3663626B2 (en) * 2001-09-18 2005-06-22 ソニー株式会社 Video signal processing apparatus and method, program, information recording medium, and data structure
US7747144B2 (en) * 2001-10-01 2010-06-29 Sony Corporation Information processing apparatus, information processing method, recording medium, control
US7650058B1 (en) * 2001-11-08 2010-01-19 Cernium Corporation Object selective video recording
EP1463056A4 (en) * 2001-11-29 2006-07-05 Sharp Kk Data recording method; data deletion method; data display method; recording apparatus; recording medium; and program
AU2003221415A1 (en) * 2002-03-18 2003-09-29 Sharp Kabushiki Kaisha Data recording method, data recording device, data recording medium, data reproduction method, and data reproduction device
JP2004013571A (en) * 2002-06-07 2004-01-15 Pioneer Electronic Corp Operation explanation device, operation explanation method, operation explaining program and information recording medium with operation explaining program recorded
US7295753B2 (en) * 2002-06-25 2007-11-13 International Business Machines Corporation Personal video recording with further compression of recorded shows
JP3989312B2 (en) * 2002-07-05 2007-10-10 富士通株式会社 Cache memory device and memory allocation method
JP2004055083A (en) * 2002-07-23 2004-02-19 Pioneer Electronic Corp Data reproducing device and data reproducing method
AU2003264414A1 (en) 2002-09-12 2004-04-30 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction device, program, reproduction method, and recording method
JP2004199767A (en) 2002-12-18 2004-07-15 Sony Corp Data processor, data processing method and program
EP1579411B1 (en) * 2002-12-20 2012-10-10 Trident Microsystems (Far East) Ltd. Apparatus for re-ordering video data for displays using two transpose steps and storage of intermediate partially re-ordered video data
JP4223800B2 (en) * 2002-12-20 2009-02-12 ソニー株式会社 Recording apparatus and recording method
WO2004059650A1 (en) * 2002-12-24 2004-07-15 Matsushita Electric Industrial Co., Ltd. Recording and reproduction apparatus, recording apparatus, editing apparatus, information recording medium, recording and reproduction method, recording method, and editing method
TWI261821B (en) * 2002-12-27 2006-09-11 Toshiba Corp Information playback apparatus and information playback method
US20040146211A1 (en) * 2003-01-29 2004-07-29 Knapp Verna E. Encoder and method for encoding
US8194751B2 (en) * 2003-02-19 2012-06-05 Panasonic Corporation Moving picture coding method and moving picture decoding method
KR101027249B1 (en) * 2003-02-21 2011-04-06 파나소닉 주식회사 Recording medium, playback device, recording method, and playback method
US8290353B2 (en) * 2003-02-27 2012-10-16 Panasonic Corporation Data processing device and method
KR20040081903A (en) * 2003-03-17 2004-09-23 삼성전자주식회사 Information storage medium and method of recording and/or reproducing on the same
KR20040081904A (en) * 2003-03-17 2004-09-23 삼성전자주식회사 Information storage medium and method for recording and/or reproducing data on/from the same
CN101404171B (en) * 2003-04-04 2011-08-31 日本胜利株式会社 Audio/video recording apparatus and recording method
CN1820318A (en) * 2003-07-08 2006-08-16 皇家飞利浦电子股份有限公司 Apparatus for and method of recording digital information signals
JP3675464B2 (en) * 2003-10-29 2005-07-27 ソニー株式会社 Moving picture coding apparatus and moving picture coding control method
KR20050049924A (en) * 2003-11-24 2005-05-27 엘지전자 주식회사 Method for managing and reproducing a playlist file of high density optical disc
KR20070000471A (en) * 2004-01-06 2007-01-02 엘지전자 주식회사 Recording medium and method and apparatus for reproducing and recording text subtitle streams
KR20050072255A (en) * 2004-01-06 2005-07-11 엘지전자 주식회사 Method for managing and reproducing a subtitle of high density optical disc
JP4537083B2 (en) 2004-01-28 2010-09-01 キヤノン株式会社 Data processing apparatus and control method thereof
RU2377669C2 (en) * 2004-02-10 2009-12-27 ЭлДжи ЭЛЕКТРОНИКС ИНК. Recording medium with data structure for managing different data, and method and device for recording and playing back
KR20070028323A (en) * 2004-02-10 2007-03-12 엘지전자 주식회사 Recording medium having a data structure for managing data streams associated with different languages and recording and reproducing methods and apparatuses
EP1716566A1 (en) * 2004-02-10 2006-11-02 LG Electronic Inc. Recording medium having a data structure for managing font information for text subtitles and recording and reproducing methods and apparatuses
KR20070028325A (en) * 2004-02-10 2007-03-12 엘지전자 주식회사 Text subtitle decoder and method for decoding text subtitle streams
BRPI0507596A (en) * 2004-02-10 2007-07-03 Lg Electronics Inc physical recording medium, method and apparatus for decoding a text subtitle stream
EP1714281A2 (en) * 2004-02-10 2006-10-25 LG Electronic Inc. Recording medium and method and apparatus for decoding text subtitle streams
KR100716973B1 (en) * 2004-02-21 2007-05-10 삼성전자주식회사 Information storage medium containing text subtitle data synchronized with AV data, and reproducing method and apparatus
KR20060129067A (en) * 2004-02-26 2006-12-14 엘지전자 주식회사 Recording medium and method and apparatus for recording and reproducing text subtitle streams
JP2005302152A (en) * 2004-04-12 2005-10-27 Sony Corp Composite type storage device, data writing method, and program
KR100599175B1 (en) * 2004-06-21 2006-07-12 삼성전자주식회사 Apparatus and method for editing of optical-disc
US20080098051A1 (en) * 2004-07-28 2008-04-24 Koninklijke Philips Electronics, N.V. Managing Data Space on a Record Carrier
CN101010952A (en) * 2004-09-01 2007-08-01 松下电器产业株式会社 Image reproduction method and image reproduction device
JP4085392B2 (en) * 2004-09-02 2008-05-14 ソニー株式会社 Recording / reproducing apparatus, method thereof, and program
JP4791969B2 (en) * 2004-09-24 2011-10-12 パナソニック株式会社 Data processing device
KR20060028849A (en) * 2004-09-30 2006-04-04 삼성전자주식회사 Compression-encoding apparatus and storage method for moving-picture
DK2408202T3 (en) 2004-11-19 2017-08-28 Tivo Solutions Inc Method and device for secure transfer and playback of multimedia content
TWI289997B (en) * 2004-12-02 2007-11-11 Sony Corp Encoding device, method, and program
JP4356624B2 (en) * 2005-02-07 2009-11-04 株式会社日立製作所 Recording / reproducing apparatus, recording apparatus, recording / reproducing method, and recording method
KR20060101654A (en) 2005-03-21 2006-09-26 삼성전자주식회사 A dvd recorder and a cell unit editing method of the dvd recorder
JP4604806B2 (en) * 2005-04-12 2011-01-05 ソニー株式会社 Recording device
US8145528B2 (en) 2005-05-23 2012-03-27 Open Text S.A. Movie advertising placement optimization based on behavior and content analysis
US8141111B2 (en) 2005-05-23 2012-03-20 Open Text S.A. Movie advertising playback techniques
US9648281B2 (en) 2005-05-23 2017-05-09 Open Text Sa Ulc System and method for movie segment bookmarking and sharing
EP2309738A1 (en) * 2005-05-23 2011-04-13 Thomas S. Gilley Distributed scalable media environment
WO2007014216A2 (en) 2005-07-22 2007-02-01 Cernium Corporation Directed attention digital video recordation
AU2006277225B2 (en) 2005-08-09 2011-06-09 Panasonic Corporation Recording medium, playback apparatus, method and program
MY162080A (en) * 2005-11-07 2017-05-31 Koninl Philips Electronics Nv Method and apparatus for editing a program on an optical disc
KR100793752B1 (en) * 2006-05-02 2008-01-10 엘지전자 주식회사 The display device for having the function of editing the recorded data partially and method for controlling the same
CN101461246B (en) 2006-06-02 2013-03-20 松下电器产业株式会社 Coding device and editing device
JP4704972B2 (en) * 2006-07-24 2011-06-22 ルネサスエレクトロニクス株式会社 Stream editing method and stream editing apparatus
KR100806488B1 (en) * 2006-10-11 2008-02-21 삼성에스디에스 주식회사 System and method for performance test in outside channel combination environment
JP4948147B2 (en) 2006-12-15 2012-06-06 富士通株式会社 Method and apparatus for editing composite content file
US20080238928A1 (en) * 2007-03-30 2008-10-02 Bimal Poddar Frame buffer compression for desktop composition
KR101345386B1 (en) * 2007-09-19 2013-12-24 삼성전자주식회사 Method and apparatus for editting mass multimedia data
US8111974B2 (en) * 2007-10-24 2012-02-07 International Business Machines Corporation Enabling complete viewing content for selected programming
US9215467B2 (en) 2008-11-17 2015-12-15 Checkvideo Llc Analytics-modulated coding of surveillance video
EP2425621A2 (en) * 2009-04-28 2012-03-07 Vubites India Private Limited Method and apparatus for splicing a compressed data stream
US9307262B2 (en) * 2011-01-13 2016-04-05 Texas Instruments Incorporated Methods and systems for facilitating multimedia data encoding utilizing configured buffer information
US8861926B2 (en) * 2011-05-02 2014-10-14 Netflix, Inc. Audio and video streaming for media effects
US8682139B2 (en) * 2011-05-02 2014-03-25 Netflix, Inc. L-cut stream startup
CN102946542B (en) * 2012-12-07 2015-12-23 杭州士兰微电子股份有限公司 Mirror image video interval code stream recompile and seamless access method and system are write
US10623821B2 (en) 2013-09-10 2020-04-14 Tivo Solutions Inc. Method and apparatus for creating and sharing customized multimedia segments
CN111757143B (en) * 2020-07-22 2023-01-17 四川新视创伟超高清科技有限公司 Video playing method based on cloud streaming media picture cutting
CN113708736A (en) * 2021-08-27 2021-11-26 北京安达维尔科技股份有限公司 LOC and VDB shared numerical control automatic gain loop and method

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3467832B2 (en) * 1994-04-20 2003-11-17 ソニー株式会社 Recording method and recording device
US5696557A (en) * 1994-08-12 1997-12-09 Sony Corporation Video signal editing apparatus
US5559562A (en) * 1994-11-01 1996-09-24 Ferster; William MPEG editor method and apparatus
CA2168327C (en) * 1995-01-30 2000-04-11 Shinichi Kikuchi A recording medium on which a data containing navigation data is recorded, a method and apparatus for reproducing a data according to navigation data, a method and apparatus for recording a data containing navigation data on a recording medium.
US5802240A (en) * 1995-04-28 1998-09-01 Sony Corporation Video editing apparatus
US5819004A (en) * 1995-05-08 1998-10-06 Kabushiki Kaisha Toshiba Method and system for a user to manually alter the quality of previously encoded video frames
US6026232A (en) * 1995-07-13 2000-02-15 Kabushiki Kaisha Toshiba Method and system to replace sections of an encoded video bitstream
US20020044757A1 (en) * 1995-08-04 2002-04-18 Sony Corporation Information carrier, device for reading and device for providing the information carrier and method of transmitting picture information
TW305043B (en) * 1995-09-29 1997-05-11 Matsushita Electric Ind Co Ltd
TW436777B (en) * 1995-09-29 2001-05-28 Matsushita Electric Ind Co Ltd A method and an apparatus for reproducing bitstream having non-sequential system clock data seamlessly therebetween
TW385431B (en) * 1995-09-29 2000-03-21 Matsushita Electric Ind Co Ltd A method and an apparatus for encoding a bitstream with plural possible searching reproduction paths information useful in multimedia optical disk
TW303570B (en) * 1995-09-29 1997-04-21 Matsushita Electric Ind Co Ltd
JP3644650B2 (en) * 1995-12-13 2005-05-11 三菱電機株式会社 Television receiver with built-in disk device
JPH09282849A (en) * 1996-04-08 1997-10-31 Pioneer Electron Corp Information-recording medium and recording apparatus and reproducing apparatus therefor
US5838873A (en) * 1996-05-31 1998-11-17 Thomson Consumer Electronics, Inc. Packetized data formats for digital data storage media

Also Published As

Publication number Publication date
DE69812258D1 (en) 2003-04-24
EP1300850A2 (en) 2003-04-09
DE69812258T2 (en) 2003-09-25
EP0903742B1 (en) 2003-03-19
ID21786A (en) 1999-07-22
EP1300850A3 (en) 2004-06-09
TW388027B (en) 2000-04-21
KR100532738B1 (en) 2005-12-01
WO1999014757A2 (en) 1999-03-25
WO1999014757A3 (en) 1999-06-03
EP0903742A2 (en) 1999-03-24
CN1243597A (en) 2000-02-02
KR20000069007A (en) 2000-11-25
CN1137488C (en) 2004-02-04
US6263150B1 (en) 2001-07-17
EP0903742A3 (en) 1999-06-02
MY115908A (en) 2003-09-30
US6148140A (en) 2000-11-14

Similar Documents

Publication Publication Date Title
CA2247637A1 (en) Video data editing apparatus, optical disc for use as a recording medium of a video data editing apparatus, and computer-readable recording medium storing an editing program
CA2247629C (en) Video data editing apparatus and computer-readable recording medium storing an editing program
CA2247626C (en) Optical disc, video data editing apparatus, computer-readable recording medium storing an editing program, reproduction apparatus for the optical disc, and computer-readable recording medium storing an reproduction program
US20010043799A1 (en) Optical disc, video data editing apparatus, computer-readable recording medium storing an editing program, reproduction apparatus for the optical disc, and computer-readable recording medium storing an reproduction program
JP3050311B2 (en) Optical disk, recording device and reproducing device
JP3948979B2 (en) Information recording medium, information recording apparatus and method, information reproducing apparatus and method, information recording / reproducing apparatus and method, computer program for recording or reproduction control, and data structure including control signal
JP2000078519A (en) Video data editing device and computer readable recording medium for recording editing program
JP3954406B2 (en) Information recording medium, information recording apparatus and method, information reproducing apparatus and method, information recording / reproducing apparatus and method, computer program for recording or reproduction control, and data structure including control signal
JP3410695B2 (en) Playback apparatus, playback method, and computer-readable recording medium
JPH11155131A (en) Video data editing device, optical disk used for editing medium by the video data editing device and computer-readable record medium recorded with editing program
JP2002093125A (en) Optical disk, video data editing device, computer-readable recording medium with recorded editing program, reproducing device for optical disk, and computer- readable recording medium with recorded reproducing program
MXPA99004448A (en) Video data editing apparatus, optical disc for use as a recording medium of a video data editing apparatus, and computer-readable recording medium storing an editing program
MXPA99004453A (en) Video data editing apparatus and computer-readable recording medium storing an editing program
MXPA99004447A (en) Optical disc, video data editing apparatus, computer-readable recording medium storing an editing program, reproduction apparatus for the optical disc, and computer-readable recording medium storing a reproduction program

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued