WO2001099403A2 - Video processing system - Google Patents

Video processing system

Info

Publication number
WO2001099403A2
WO2001099403A2 (PCT/US2001/019130)
Authority
WO
WIPO (PCT)
Prior art keywords
video
scene
video data
user
Prior art date
Application number
PCT/US2001/019130
Other languages
French (fr)
Other versions
WO2001099403A3 (en)
WO2001099403A9 (en)
Inventor
Sai-Wai Fu
Hon Pun Sit
Subutai Ahmad
Sadie Louise Honey
Adwait Ullal
Jeffrey Lane Edwards
Original Assignee
Yesvideo.Com
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed
Application filed by Yesvideo.Com filed Critical Yesvideo.Com
Priority to JP2002504123A priority Critical patent/JP4942276B2/en
Priority to AU2001268432A priority patent/AU2001268432A1/en
Priority to EP01946372A priority patent/EP1310086B1/en
Priority to DE60143663T priority patent/DE60143663D1/en
Publication of WO2001099403A2 publication Critical patent/WO2001099403A2/en
Publication of WO2001099403A9 publication Critical patent/WO2001099403A9/en
Publication of WO2001099403A3 publication Critical patent/WO2001099403A3/en

Links

Classifications

    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/73Querying
    • G06F16/738Presentation of query results
    • G06F16/739Presentation of query results in form of a video summary, e.g. the video summary being a video sequence, a composite still image or having synthesized frames
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/783Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/7847Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content
    • G06F16/785Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content using colour or luminescence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/783Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/7847Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content
    • G06F16/786Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content using motion, e.g. object motion or camera motion
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/11Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/87Regeneration of colour television signals
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2508Magnetic discs
    • G11B2220/2516Hard disks
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • G11B2220/2545CDs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • G11B2220/2562DVDs [digital versatile discs]; Digital video discs; MMCDs; HDCDs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/40Combinations of multiple record carriers
    • G11B2220/41Flat as opposed to hierarchical combination, e.g. library of tapes or discs, CD changer, or groups of record carriers that together store one title
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/90Tape-like record carriers
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/022Electronic editing of analogue information signals, e.g. audio or video signals
    • G11B27/024Electronic editing of analogue information signals, e.g. audio or video signals on tapes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/84Television signal recording using optical recording
    • H04N5/85Television signal recording using optical recording on discs or drums

Definitions

  • the present invention relates generally to computer systems and more particularly to methods and apparatus for collecting, editing and distributing video content.
  • the invention provides a method for producing a video disc and includes acquiring video data from a source. If the video data is not digitized, then the video data is digitized.
  • the method includes generating scene indexes for the video data including a representative still image for each scene and combining the video data and scene indexes along with a media player on a video disc.
  • the video player is operable to play the video data in accordance with the scene indexes, including playing a scene from the video data on a client computer while displaying the representative stills for the other scenes available for display on the video disc.
  • the step of acquiring can include capturing the video data from an analog source or a digital source.
  • the step of generating scene indexes can include detecting a transition between consecutive frames in the video data, determining when the transition indicates a scene break and marking the end of the previous scene and a beginning of a new scene at a point in time that corresponds to the initially detected transition.
  • the step of detecting a transition can include detecting a color difference between the frames and determining if a difference between frames exceeds a preset threshold.
  • the method can further include cropping one or more of the frames prior to the comparison to eliminate effects from the boundary of the image frame.
  • the step of detecting a transition can include detecting a motion difference between the frames.
  • the step of detecting a transition can include determining if a difference between frames exceeds a preset threshold.
  • the step of determining when a transition indicates a scene break can include comparing plural frames to a last frame thought to be part of a preceding scene.
  • the step of generating representative stills for each scene can include selecting a first frame from each scene or a frame from an introductory group of frames from each scene.
  • the step of selecting a frame can include determining a color distribution for plural frames in a scene and selecting a frame from the introductory group that is a best match to the determined color distribution.
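The still-selection step above can be sketched as follows. The frame representation (flat lists of RGB tuples), the coarse histogram binning, and the introductory-group size are assumptions for illustration, not details fixed by the patent text.

```python
# Hypothetical sketch of representative-still selection: compute an average
# color distribution over a scene and pick the introductory frame closest to it.

def color_histogram(frame, bins=8):
    """Count pixels into coarse per-channel buckets; frame is a list of (r, g, b)."""
    hist = {}
    for (r, g, b) in frame:
        key = (r * bins // 256, g * bins // 256, b * bins // 256)
        hist[key] = hist.get(key, 0) + 1
    total = sum(hist.values())
    return {k: v / total for k, v in hist.items()}

def histogram_distance(h1, h2):
    """L1 distance between two sparse, normalized histograms."""
    keys = set(h1) | set(h2)
    return sum(abs(h1.get(k, 0.0) - h2.get(k, 0.0)) for k in keys)

def pick_representative_still(scene_frames, intro_count=10, bins=8):
    """Return the index of the introductory frame whose color distribution
    best matches the scene-wide average distribution."""
    hists = [color_histogram(f, bins) for f in scene_frames]
    avg = {}  # average distribution across the whole scene
    for h in hists:
        for k, v in h.items():
            avg[k] = avg.get(k, 0.0) + v / len(hists)
    intro = hists[:intro_count]
    return min(range(len(intro)), key=lambda i: histogram_distance(intro[i], avg))
```

Choosing the best match to the scene-wide distribution, rather than blindly taking the first frame, avoids picking a still from a brief transition at the scene's opening.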
  • the method can include creating a contact sheet for distribution with the video disc that includes each representative still for the scenes detected on the video disc.
  • the video disc can be a compact disc or a digital video disc.
  • the invention provides a method for producing a video based product that includes acquiring video data and generating temporal indices including analyzing the video data to detect the temporal indices.
  • the temporal indices indicate a division of the video data into distinct segments.
  • the method includes providing a media player operable to play the video data on a client computer in accordance with the temporal indices and packaging the video data, the temporal indices and media player on a physical medium for delivery to the client computer.
  • the method can include digitizing the video data prior to packaging the video data and generating representative stills for one or more segments.
  • the media player can be operable to display one or more of the representative stills while playing the video data on the client computer.
  • the method can include providing a media editor operable to generate one or more edit lists. Each edit list defines a set of operations to be performed on the video data, allowing editing operations defined on one computer to be replicated on the video data on another computer.
  • the method can include editing the video data in accordance with the edit lists on the other computer and distributing the edited video to user-designated distributees.
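The edit-list idea above, operations defined on one computer and replayed on another, might be sketched as follows. The operation names (`trim`, `delete`, `reorder`), the segment model of `(start_frame, end_frame)` pairs, and the JSON encoding are all assumptions chosen only to illustrate a replayable edit list, not details from the patent.

```python
# Hypothetical illustration of an edit list: a serializable sequence of
# operations defined on one machine and replayed on another.

import json

def make_edit_list(operations):
    """Serialize a list of (op, params) pairs so another computer can replay them."""
    return json.dumps([{"op": op, "params": params} for op, params in operations])

def apply_edit_list(segments, edit_list_json):
    """Replay an edit list against a list of (start_frame, end_frame) segments."""
    segments = list(segments)
    for entry in json.loads(edit_list_json):
        op, p = entry["op"], entry["params"]
        if op == "trim":       # shrink one segment's boundaries
            i = p["segment"]
            segments[i] = (p["start"], p["end"])
        elif op == "delete":   # drop a segment entirely
            segments.pop(p["segment"])
        elif op == "reorder":  # move a segment to a new position
            segments.insert(p["to"], segments.pop(p["from"]))
    return segments
```

Because the list is plain data rather than edited video, only the small edit description needs to travel to the central site; the edits are then re-applied there to the high quality source.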
  • the packaging step can include producing a physical manifestation of the content to be delivered.
  • the physical manifestation can be a video disc.
  • the packaging step can include producing a streaming version of the video data in accordance with the temporal indices, receiving a request to webcast the streaming version of the video data and streaming that version to a requestor.
  • the distribution of the edited video can include providing the edit list to a central distribution site, generating the edited video at the central distribution site and directly delivering the edited video to the distributees.
  • An efficient and inexpensive system for collecting, digitizing and editing video content.
  • the system includes digitizing equipment for digitizing analog and digital video input for distribution over the Internet.
  • the system includes scene detection mechanisms for parsing the digitized content into plural scenes which can then be edited or otherwise manipulated by the user.
  • the system provides tools for the user to easily combine plural scenes into a single composition (an album) which can be burned into a compact disc (CD) for distribution to plural sources.
  • the composition can be viewed locally with streaming content, or can be burned into a CD.
  • the content on the CD can be a high resolution or a streaming format.
  • the system provides fully automated digital video editing services to allow users to manipulate each scene, combine scenes and integrate other input including audio and digital still pictures.
  • the system combines an Internet web hosting service for viewing digitized content with video processing tools to facilitate the distribution of content to plural distributees.
  • the system provides a menu of still frames, one for each detected scene in a digitized input.
  • the system provides video search tools that can analyze scenes to search for particular content based on keywords.
  • CDs produced by the system can be archived or otherwise backed up to make the original high quality content available for later derivative work. Only desired high quality content will be archived, through the interaction of the user in creating an edited album on the Web. The user can select from a high quality or a lower quality streaming version of the content when publishing an album. High quality still images can be extracted from the high quality content in a published album. High quality stills can also be extracted from the high quality content archived by the system.
  • FIG. 1 is a schematic block diagram of a system for capturing, editing and distributing video content.
  • FIG. 2a is a block diagram of a capture and digitization module.
  • FIG. 2b is a flow diagram for a method for detecting scenes in a digitized video.
  • FIG. 3 is a flow diagram of a high-level process for offering digitized video products for sale over a computer network such as the Internet.
  • FIG. 4a shows a user interface for a login screen.
  • FIG. 4b shows a user interface for a start page.
  • FIG. 5a is a flow diagram for a process for editing a video and creating an album.
  • FIG. 5b shows a user interface for an editing page.
  • FIG. 5c shows a user interface for a preview page.
  • FIG. 5d shows a user interface for an options page.
  • FIG. 5e shows a user interface for an alternative editing page.
  • FIG. 5f shows a user interface for a second alternative editing page.
  • FIG. 5g shows a user interface when a selection button of FIG. 5f is activated.
  • FIG. 5h shows a user interface when a view button of FIG. 5f is activated.
  • FIG. 5i shows a user interface for trimming a scene.
  • FIG. 6 shows a user interface for a media player.
  • FIG. 7 shows an operational flow for the system of FIG. 1.
  • FIG. 8 shows a production flow for the system of FIG. 1.
  • Video Data refers to an image stream, audio stream or synchronized image and audio stream.
  • Physical media refers to means for storing digitized content and can include a video disc, floppy disc, zip drive, minidisc, magnetic tape, CD-ROM, VCD and DVD.
  • Segment refers to a definable portion of video data. Tools described below can be used to locate segments of the video data. Portions of the description below are described with reference to a scene. A scene is a type of segment often associated with an image stream. While the description sets forth particular details of scene detection and other scene features, those of ordinary skill in the art will recognize that the invention is equally suited to process other video data types.
  • the system 100 includes a local video processing system 60 and server system 70.
  • Local video processing system 60 captures and digitizes content and provides digitized video to server system 70.
  • Server system 70 maintains a database 74 of digitized video and one or more servers.
  • Database 74 may itself be a database server that includes one or more storage means for storing streaming and high resolution video and other data.
  • the servers execute one or more applications to host video editing services as will be described in greater detail below.
  • Server system 70 includes a website that can be accessed to retrieve, manipulate, order and distribute digitized video to one or more distributees. The details of the website, editing tools and distribution services provided by server system 70 are described in greater detail below.
  • Server system 70 can be linked to by a user using a client computer 80 via a network 82 (e.g., the Internet).
  • the user can login, review and edit video that has been captured, combine the captured/edited content with other media and preview the results (i.e., a storyboard) in real time.
  • a "storyboard" is a working area presented by a user interface provided by server system 70 to the user operating client computer 80.
  • One or more scenes are added to the storyboard as the user develops a finished product referred to as an album.
  • An album includes a name and a representative still. Albums can be edited and eventually published.
  • Publication can include the creation of a high resolution version of the digitized content and may include the production of a physical manifestation of the digitized content (physical media) for distribution to one or more distributees.
  • an album can be published on-line and viewed by others in a streaming format.
  • the user can view a streaming video version of the digitized content stored in the database 74 in server system 70.
  • Streaming video server 78 can download to the user via the network 82 a streaming version of a scene, storyboard or album.
  • the streaming video version can be a low resolution version of the original digitized content stored in the database 74.
  • the user can use a browser to order a physical manifestation of the storyboard/album.
  • the user can also allow others to access an album or distribute multiple copies of the physical manifestation to other distributees.
  • the processes invoked by the browser are described in greater detail below.
  • System 100 includes a production system 90 that is used to produce a published version of a selected album as well as produce the physical manifestations of the album for distribution to the distributees.
  • the published version of the album can be a high resolution, streaming or other version of the original digitized video content that is stored in the database 74 of the server system 70.
  • an information stream can be produced to deliver a version of the content to the distributees.
  • the information stream can be delivered by a delivery system such as the World Wide Web using an internet enabled set top box (using the file transfer protocol ftp), DVD player or personal computer, a cable system incorporating a video-on-demand set top box, or satellite system (satellite narrowcast). These and other delivery systems can be used to deliver a streaming version of the digitized content.
  • Local Video Processing System
  • Local video processing system 60 includes a capture and digitization module 62, a scene detection module 64, one or more streaming video processors 66, splitters 68 and local storage 69 (not shown).
  • capture and digitization module 62 includes an input module 102, an input monitoring multiplexor 104, a digitization source multiplexor 106, a digitization module 108, a digitization control module 110, content monitoring module 112, content monitoring multiplexor 114 and network connection 116 for interfacing with a network (e.g., local area network (LAN), intranet, Internet) that couples the digitization module 62 and the rest of the local video processing system 60.
  • Input module 102 includes plural means for reading input received from a user.
  • Input can be received from a user by US Mail, delivery (e.g., FedEx, UPS), through a designated receiving site (e.g., a drop off center, kiosk, photo shop), or can be uploaded directly by the user.
  • Input can be analog or digital. If the input has already been digitized, then the input can be provided directly to digitization control module 110.
  • input module 102 includes a video cassette player (VHS, SVHS or 8mm format), a compact disc player (video compact disc (VCD) and digital video disc (DVD)) and a camcorder for reading input.
  • Input can be of the form of analog or digital tape (VHS, SVHS or 8mm tape), VCDs, DVDs or direct input from a video recording device such as an 8mm HI-8 camcorder.
  • Input module 102 provides as an output plural input streams, one from each input device, that can be passed to both the input monitoring multiplexor 104 and digitization source multiplexor 106.
  • the input module input stream can be coupled directly to the digitization control module using a FireWire connection (IEEE-1394 interface) or other direct input means.
  • Input monitoring multiplexor 104 receives as inputs a video stream on each of its input ports and provides a single selected stream as an output on its output port.
  • input monitoring multiplexor 104 receives as inputs two video streams from the input module (a stream from a video cassette player and the compact disc player) and a feedback stream from the digitization control module 110.
  • the output of the input monitoring multiplexor 104 is coupled to an input of the content monitoring module 112. In this way, the video output from each input device can be viewed by a quality control monitor for the system.
  • Digitization source multiplexor 106 receives as inputs video streams on each of its input ports and provides a single selected stream as an output on its output port.
  • digitization source multiplexor 106 receives as input three video streams from the input module (one from each of the video cassette player, compact disc player and camcorder).
  • the output of the digitization source multiplexor 106 is coupled to the input of digitization module 108.
  • Digitization module 108 can include plural devices for digitizing the video input received from the input module 102.
  • digitization module includes a controller 170 (e.g., an Osprey 200 video capture card available from ViewCast.com), and two digitizers 172 (a Digital Video Creator available from Dazzle Multimedia and Studio MP10 available from Pinnacle Systems). Each device (controller 170 and digitizers 172) is coupled by a bi-directional communications bus to the digitization control module 110. In one implementation, controller 170 is included as part of digitization control module 110.
  • Digitization control module 110 controls the configuration and selection of the devices in the digitization module 108. Depending on the configuration, one or more of the devices will operate on the video stream received from the digitization source multiplexor 106 and provide output to both the content monitoring multiplexor 114 and the digitization control module 110.
  • each digitizer 172 provides a digitized stream that contains the digitized video as an output to the digitization control module 110.
  • the digitized content can be rendered to produce a video stream that is provided as an output to content monitoring multiplexor 114.
  • Digitization control module 110 can also perform a synchronization function for the data transfers between the digitization module 108 and input module 102. Digitization control module 110 can activate input module 102 and digitization module 108 in an appropriate sequence so that output of input module 102 can feed into the input of digitization module 108 without any human intervention.
  • Content monitoring multiplexor 114 receives as inputs a video stream on each of its input ports and provides a single selected stream as an output on its output port. In one implementation, content monitoring multiplexor 114 receives as inputs two video streams from the digitization module (a stream from each digitizer 172). The output of the content monitoring multiplexor 114 is coupled to a second input of the content monitoring module 112. In this way, the video output from each digitizer 172 can be viewed by a quality control monitor for the system.
  • Content monitoring module 112 includes a video monitor for viewing video streams processed by the system 100.
  • the content monitoring module 112 includes two inputs, one from the digitization module 108 and one from the input module 102 (via their respective multiplexors).
  • Digitization control module 110 controls the operation of the digitization module 108. Digitization control module 110 receives as an input a digitized video stream that can be manipulated for further processing. Digitization control module 110 associates with each processed digitized stream a video identifier (ID) associated with the source (user) of the input. The output from digitization control module 110 (a digitized video stream) is coupled by network connection 116 to remaining portions of the local video processing system 60. In one implementation, the digitized video stream is encoded prior to output. In one implementation, the output format is an MPEG format.
  • digitization control module 110 can add information to the digitized video prior to transfer to the remaining portion of the local video processing system 60. For example, production screens or customized title screens can be added to the digitized video to indicate the source of origin of the digitization services.
  • the digitized content produced in the local video processing system 60 can be stored locally in local storage 69 on disc or other physical media.
  • scene detection module 64 includes an algorithm for detecting scene changes in the digitized video.
  • Scene detection module 64 receives as an input the digitized stream from the capture and digitization module 62 (e.g., the MPEG file) and provides as an output scene information.
  • the scene information includes scene identification information, bad video segment information as well as a representative still image for the scene.
  • the scene detection module 64 provides as an output a JPEG still for each scene.
  • the method 200 is used to detect when the video camera was turned on or off (i.e., a scene break).
  • the output of the method is a list of segments, where each segment contains beginning and end frame numbers.
  • the method begins by retrieving a first image frame (202). If no more frames are to be processed (204), the process ends (230). Otherwise, the next image frame is retrieved (206). The current image frame is cropped (208). The image frame is cropped to include only the interior portion of the image area, since the outside boundary may contain camera noise.
  • the method only considers pixels within the rectangle (bw, bh) - (W-bw, H-bh).
  • bw is set to 0.1*W and bh is set to 0.1*H.
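A minimal sketch of this border crop, assuming a frame is represented as a list of pixel rows (the patent does not specify a frame data structure):

```python
# Sketch of the border crop described above: keep only pixels inside the
# rectangle (bw, bh)-(W-bw, H-bh), with bw = 0.1*W and bh = 0.1*H, so camera
# noise at the frame boundary does not affect the frame comparison.

def crop_interior(frame, border=0.1):
    """frame: list of rows, each row a list of pixels. Returns the interior region."""
    H = len(frame)
    W = len(frame[0])
    bh = int(border * H)  # rows trimmed from top and bottom
    bw = int(border * W)  # columns trimmed from left and right
    return [row[bw:W - bw] for row in frame[bh:H - bh]]
```

All subsequent color and motion comparisons would then be computed on the cropped frames only.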
  • a color difference and a motion difference between the current image frame and the previous one are computed (210).
  • a check is made to determine if the method is in the "tentative" mode (212). If not, another check is made to determine if a time-out counter (TOC) is non-zero (213). If not, the process continues at step 214. If the TOC is non-zero, then the TOC is decremented by one unit (240). If, after decrementing, the TOC is now zero (242), the method declares a true scene break and records a frame identifier associated with the frame located previously (in step 218) as the end of a scene and the next frame identifier as the start of the next (new) scene in the list of scenes (244). Thereafter, or if the result of the comparison in step 242 is non-zero, the process continues at step 214.
• In step 214, the motion and color differences are compared to preset thresholds. If both the motion and the color differences are not above the preset threshold(s), then the process continues at step 204. If both the motion and the color differences are above the preset thresholds, then a possible scene break has been detected and the method enters the tentative mode (216).
  • the tentative mode is defined as a time period after detecting an initial indication of a scene break over which the frame-to-frame motion and/or color difference continues to exceed the threshold(s).
  • the thresholds are determined by performing a regression analysis on a database of home camcorder footage. The image frame immediately before the detected break is identified (218) and the process continues at step 204.
  • the method has detected a true scene break, but will wait for a number of frames equal to the initial time out value before declaring a scene break.
  • This waiting period can be used to detect another scene break (within the time-out period) which has an even larger motion and color difference.
• the system includes the tentative mode. While in the tentative mode, the system compares new image frames to the previously stored image frame (identified in step 218) for a timeout period. In one implementation, the timeout value is determined to be 1/3 of a second. If by the end of the timeout, the image is still very different from the stored image, the system declares the detected scene break to be a true scene break and turns off tentative mode.
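The tentative-mode loop described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: frames are modeled as flat lists of grayscale pixel values, a single `frame_diff` measure stands in for the separate color and motion tests, and the refinement of replacing the stored break with a larger-difference break during the wait is omitted.

```python
# A minimal, self-contained sketch of the tentative-mode scene-break loop.
# frame_diff is a stand-in for the histogram and changed-pixel measures
# described elsewhere in the text.

def frame_diff(a, b):
    """Fraction of pixels that differ significantly between two frames."""
    changed = sum(1 for pa, pb in zip(a, b) if abs(pa - pb) > 30)
    return changed / len(a)

def detect_scenes(frames, diff_thresh=0.5, timeout=3):
    """Return (start, end) frame-index pairs for each detected scene."""
    scenes = []
    scene_start = 0
    stored, stored_idx = None, None   # frame identified in "step 218"
    toc = 0                           # time-out counter (TOC)
    for i in range(1, len(frames)):
        if toc > 0:
            toc -= 1
            if toc == 0:
                # end of time-out: is the image still very different
                # from the stored frame? If so, declare a true break.
                if frame_diff(stored, frames[i]) > diff_thresh:
                    scenes.append((scene_start, stored_idx))
                    scene_start = stored_idx + 1
                stored = None
        elif frame_diff(frames[i - 1], frames[i]) > diff_thresh:
            # possible scene break: enter tentative mode and wait
            stored, stored_idx = frames[i - 1], i - 1
            toc = timeout
    scenes.append((scene_start, len(frames) - 1))
    return scenes

# Two flat "scenes": dark frames followed by bright frames.
frames = [[0] * 16] * 6 + [[255] * 16] * 6
print(detect_scenes(frames))  # → [(0, 5), (6, 11)]
```

The time-out structure means a break is only committed several frames after it is first noticed, which is exactly the property the one-pass representative-frame extraction below has to work around.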
• the system first computes a 2D color histogram of the image frame.
• the two color axes are used to compute a representative color model of the image frame.
  • the system creates a 2D array, H, where each bin, H(r',g'), represents a rectangular region in normalized red and green space. Each bin is initially set to zero.
• For each pixel in the image, the pixel's normalized color values (r',g') are computed and the count in H(r',g') is incremented by one.
• the actual color difference is computed by comparing the histogram for the current image frame and the previous image frame (or identified image frame).
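The 2D normalized-color histogram can be sketched as below. The bin count `BINS` and the use of one minus the normalized histogram intersection as the difference measure are illustrative assumptions; the patent specifies the normalized red/green binning but not these exact parameters.

```python
# Sketch of the 2D color histogram H over normalized red/green space.
# Each pixel's chromaticity (r', g') selects one bin; the color difference
# between two frames is taken here as 1 - normalized histogram intersection
# (0 = identical color distribution, 1 = completely disjoint).

BINS = 8

def color_histogram(pixels):
    """pixels: list of (r, g, b) tuples; returns a BINS x BINS array H."""
    h = [[0] * BINS for _ in range(BINS)]
    for r, g, b in pixels:
        s = r + g + b or 1                    # avoid division by zero on black
        rn = min(int(r / s * BINS), BINS - 1)  # normalized red bin
        gn = min(int(g / s * BINS), BINS - 1)  # normalized green bin
        h[rn][gn] += 1
    return h

def color_difference(h1, h2, npixels):
    inter = sum(min(a, b) for r1, r2 in zip(h1, h2)
                for a, b in zip(r1, r2))
    return 1 - inter / npixels

red = [(200, 10, 10)] * 100
blue = [(10, 10, 200)] * 100
h_red, h_blue = color_histogram(red), color_histogram(blue)
print(color_difference(h_red, h_red, 100))   # → 0.0
print(color_difference(h_red, h_blue, 100))  # → 1.0
```

Normalizing by r+g+b makes the model largely insensitive to brightness changes, which is why only two color axes are needed.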
  • the system computes a motion difference by calculating the total number of pixels that have changed significantly between the two images.
• For each pixel (x,y), m(x,y) is set to 1 if the change in the pixel's value between the two images exceeds a threshold, and to 0 otherwise. The motion difference is then MD = sum( m(x,y) ) over all pixels.
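The motion difference just defined can be written out directly; the per-pixel threshold value used here is illustrative.

```python
# m(x, y) is 1 where a pixel's intensity change between the two frames
# exceeds a threshold; MD is the total count of such pixels.

def motion_difference(img_a, img_b, threshold=25):
    """img_a, img_b: 2D lists of grayscale values with identical shape."""
    md = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            if abs(pa - pb) > threshold:
                md += 1        # m(x, y) = 1
    return md

a = [[10, 10], [10, 10]]
b = [[10, 200], [10, 200]]
print(motion_difference(a, b))  # → 2
```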
• the process for selecting a representative still for a scene includes picking the first frame of the scene. Although this strategy works very well for professionally produced footage, it does not work well for home camcorder footage. The first few frames of home footage often contain garbage because the person has not quite set up the shot yet.
• a better choice for the representative frame is selected by analyzing the entire segment and selecting an image that best approximates the rest of the segment. In one implementation, the system restricts this representative image to be within the first few seconds of the scene (the "segment intro").
• the system computes a color model of the entire scene by computing the average color histogram of the entire scene. This average histogram is compared with every image in the segment intro using histogram intersection (see previous discussion). The image frame with the smallest color difference is chosen.
  • a one pass algorithm is used to extract the representative frames while the scene break detection is going on.
• the system keeps track of the running sum of all the bins in the histogram.
  • the system computes the average histogram by dividing each bin by the total number of images. This average histogram is compared against the frames in the buffer. The best match is selected and output as the representative frame.
  • a slight trick is used due to the timeout period.
  • the system does not know that a scene has been detected until a timeout (e.g., 1/3 of a second) after the break was initially processed.
• Home camcorder footage often includes segments in which there is not a valid video signal on the tape. This can occur if the lens cap is left on while recording, if the tape is accidentally fast forwarded during filming (leaving an unrecorded portion of the tape) or by other means. The most common cause of this problem occurs when a tape is removed before it has been filled to 100% of capacity, producing a tape that includes a final segment with no video signal.
• the system can perform an analysis on the representative still selected for each segment and remove those segments that do not have a valid video signal. This of course assumes that if the representative still has a valid signal, then the remainder of the detected segment will also have a valid signal. Since the representative frame is by definition the frame that best represents the entire segment, this assumption holds very well in practice.
• the system computes statistics on each of the color channels (e.g., three in an RGB device) and compares them to preset thresholds.
• the system checks to see if the standard deviation computed is less than a threshold (e.g., StdDevR < ThresholdR). If so, then the still image is labeled a bad image and the associated segment is labeled a bad segment (e.g., scene).
  • the threshold values are determined by performing a regression analysis on a database of home camcorder footage.
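The bad-still test above can be sketched as follows. The threshold value is illustrative (the patent derives its thresholds by regression analysis on camcorder footage), and requiring every channel to be nearly constant is an assumed reading of the per-channel test.

```python
# Flag a representative still as "bad" (no valid video signal) when the
# standard deviation of every color channel falls below its threshold.
from statistics import pstdev

def is_bad_still(pixels, threshold=5.0):
    """pixels: list of (r, g, b) tuples for the representative still."""
    for channel in range(3):
        values = [p[channel] for p in pixels]
        if pstdev(values) >= threshold:
            return False      # this channel carries real signal
    return True               # all channels nearly constant: no video signal

lens_cap = [(3, 2, 3)] * 50 + [(2, 3, 2)] * 50   # nearly uniform dark frame
normal = [(i % 256, (i * 7) % 256, (i * 13) % 256) for i in range(100)]
print(is_bad_still(lens_cap))  # → True
print(is_bad_still(normal))    # → False
```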
  • a check can be made of each image (still) in a segment. When the color statistics for each frame exactly match, a bad segment is declared.
• Streaming video processor 66 receives as an input the digitized video (e.g., the MPEG file) and scene detection information (e.g., scene identifiers and JPEG stills).
• streaming video processor 66 operates on the digitized video to produce a streaming version (encoded version) that can be easily downloaded or viewed over a network (e.g., the Internet).
  • two parallel streaming processors 66a and 66b are provided that produce streaming video output streams at two resolutions and bit rates.
  • Streaming video processor 66a provides a streaming video output for supporting a 56k modem configuration while streaming video processor 66b provides a streaming video output for supporting a digital subscriber line (DSL) configuration.
  • video output processor 66 outputs a RealVideo format file and any accompanying SMIL files necessary for previewing the RealVideo format file by the user.
  • the output of video processor 66 is provided as an input to splitter 68.
  • Splitter 68 takes the scene detection information and produces individual encoded files, one for each scene.
• the output of the splitter 68 is provided as an input to server system 70.
• Two parallel splitters 68a and 68b are provided that produce encoded output files at two resolutions and bit rates.
  • Splitter 68a provides as an output scene- based encoded files to support a 56k modem configuration while splitter 68b provides as an output scene-based encoded files to support a DSL configuration.
  • splitter 68 outputs RealVideo format file(s) and any accompanying SMIL files necessary for previewing the RealVideo format file by the user.
  • Server system 70 includes a batch processor 72, a database 74, a client server 76 and streaming video server 78.
  • Batch processor 72 receives as an input encoded (e.g., scene-detected RealVideo files) and digitized video files (e.g., the digitized MPEG file) from local video processing system 60.
  • the connection between server system 70 and local video processing system 60 can be optimized to support needed bandwidth and cost structure.
• the connection between the devices is a high-speed T1 link.
• this connection is a physical device such as a removable hard disc or an Iomega Jaz disc.
• Batch processor 72 writes all of the information associated with the files to the database 74 and copies all files into an appropriate directory structure.
  • all files associated with a digitized video are stored in a single directory associated with the user's ID. Other directory structures can be used.
• Client server 76 receives notice from batch processor 72 when the data transfer to the database 74 has been completed and the video content is available for processing.
  • Client server 76 includes plural applications for interfacing with the user and the various other system components.
  • client server 76 includes an E-mail application that can be invoked to send notification to the user that the digitized video is available for review.
  • Client server 76 hosts a website that can be visited by the user.
  • Client server 76 is connected to one or more client computers 80 by a network 82 such as the Internet.
  • Client server 76 includes a web front end (not shown) that manages the communications with the client computers 80.
• the website can include plural applications that when executed allow the user to view, edit, manipulate, archive and order copies of the digitized content.
  • the website architecture and user interface are described in greater detail below.
• the user can view a streaming video version of the digitized content stored in the database 74 in server system 70.
  • Streaming video server 78 can download to the user via the network 82 a streaming version of a scene, storyboard or album.
• the streaming video version can be a low resolution version of the original digitized content stored in the database 74.
• the video material is stored for a predefined length of time at server system 70.
  • server system 70 sends E-mails at 10 and 14 days that warn of imminent deletion of material.
  • material can be deleted after a pre-defined period (e.g., 21 days). Any in-process albums will be altered to remove the deleted material.
  • An E-mail can be sent after deletion that informs the user how to send in an archive CD for reposting of material.
  • the client computers 80 can be connected to various input devices (digital video camera, digital still camera and storage means) so that a user can upload captured digital images, video clips or previously digitized video scenes to the client computer 80.
  • the client computer 80 can execute digital video processing software such as Ulead Video Studio3.0 SE or image processing software such as ADOBE PHOTOSHOP® in order to create and/or edit digital video or still images.
  • the client computer 80 includes a storage medium (not shown) such as a hard disk for storing the digital video or still images.
  • the client computer 80 is connected to the network 82, for example, using a modem or network interface card.
• the system can be implemented as a browser-based system in accordance with the standard protocols for communicating over the World Wide Web.
• a user of the client computer 80 can execute a browser 84 to connect to and interact with the client server 76.
  • client server 76 includes a web front end that manages the communications with the client computer 80.
• the user of the client computer 80 can upload digital content to the client server 76.
  • the web front end receives the uploaded digital content and stores the content in database 74.
• the user of the client computer 80 can also order content made from edited or raw content as will be described in greater detail below.
  • the client server 76 includes or is connected to a production system 90.
• Production system 90 receives the selected digitized content from the client server 76 and generates a physical manifestation (e.g., a CD or DVD) of the content.
  • the production system receives an edit list that identifies the content to be processed and the content is retrieved from the database 74 (e.g., from the database server).
  • the items generated by the production system 90 can be shipped to the user using a conventional shipping service such as the United States Postal Service or Federal Express.
  • the production system includes a user interface that is presented to the user to allow for a one click publication process.
  • the user interface can include for example a finished button (563 of FIG. 5b) that automatically publishes a finished album.
  • the automatic publication process includes the generation of scenes, high resolution content, streaming content, contact sheets and other materials for inclusion in a finished product.
  • the publication services are described in greater detail below.
  • the CDs produced by production system 90 can be archived or otherwise backed up to allow for high quality content to be available for later derivative work.
  • the product shipped includes the high resolution video (e.g., MPEG files), a Media Player and an auto play file.
  • a streaming version ofthe selected content is also included in the shipped product.
  • a contact sheet can be created and printed for inclusion with the product. After the scene detection process has been completed, a contact sheet can be generated.
• the contact sheet is a sheet (e.g., of paper) that contains a number N of thumbnail-sized key frame images (stills) representing the detected segments of the input video as well as the movie times associated with each key frame. In one implementation, the maximum number of thumbnails is 30.
• two distinct special cases are considered: when the number of detected segments M is greater than N, and when the number of detected segments is less than N.
• In the first case, where M>N and more segments are detected than can be printed on a single sheet, only the first N thumbnails are printed on the sheet.
• when fewer segments are detected than can fill the sheet, an algorithm can be selected to fill up the sheet.
  • the algorithm selects a longest segment, divides the segment into equal pieces and selects new representative stills for the segments (as necessary). Alternatively, only a subset ofthe stills can be printed.
• the selection of the number of stills to be printed on the contact sheet does not affect the underlying digitized content (i.e., no new scene breaks are established due to the division).
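The two contact-sheet cases can be sketched as follows. Repeatedly halving the longest segment is one way to realize "divides the segment into equal pieces"; the patent does not pin down the exact strategy, and the splits here are display-only, leaving the underlying scene list untouched.

```python
# Choose which segments get thumbnails on the contact sheet.
# Segments are (start, end) frame-number pairs.

MAX_THUMBS = 30

def contact_sheet_segments(segments, n=MAX_THUMBS):
    if len(segments) >= n:
        return segments[:n]          # case M > N: print only the first N
    display = list(segments)         # case M < N: split to fill the sheet
    while len(display) < n:
        # pick the longest display segment and split it in half
        i = max(range(len(display)),
                key=lambda k: display[k][1] - display[k][0])
        start, end = display[i]
        if end - start < 1:
            break                    # nothing left to split
        mid = (start + end) // 2
        display[i:i + 1] = [(start, mid), (mid + 1, end)]
    return display

print(contact_sheet_segments([(0, 99), (100, 119)], n=4))
# → [(0, 24), (25, 49), (50, 99), (100, 119)]
```

New representative stills would then be selected for the split pieces as necessary, as the text notes.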
  • the images can be added to a digital image representative ofthe contact sheet, and subsequently printed.
  • the album author's name, title, and date of creation can also be added to the digital image prior to printing.
  • a new set of scenes associated with the published product can be included.
  • a user can click on a scene and the video segment corresponding to that scene will be played immediately.
  • Other features include an A-B playback loop and random/shuffle play options.
• a user interface for a media player associated with the final product is shown in FIG. 6. Labels are printed and attached to the product (CD and jewel case) that include the title of the album or content included.
  • Album CDs may be ordered via E-mail, phone, and mail. Upon receipt of payment within the appropriate time period the product is shipped as described above. Multiple products (e.g., CDs) may be shipped to multiple addresses.
  • FIG. 3 is a flow diagram of a high-level process 300 of offering digitized video products for sale over a computer network such as the Internet.
  • content is received (block 302).
  • a user can send an 8mm tape for processing to the local video processing system 60.
  • the content is digitized (304) and a scene detection process is invoked (306).
• a user executing a browser on the client computer 80 can access the web front end of the client server 76 and upload a digitized video or other digital content directly to the client server 76, bypassing the local video processing system 60.
  • the client server either invokes a local scene detection process or forwards the digitized video that is received to the local video processing system 60 for scene detection services.
  • the user can upload or otherwise send material that has been previously digitized by the local video processing system (e.g., a published CD).
  • the material is reposted to the web site after which it is handled as if it had just been digitized by the local video processing system 60.
• the local video processing system 60 operates on the digitized video to produce one or more encoded streaming versions of the digitized video (308).
• the digitized video, scene detection information and encoded streaming versions of the digitized video are then stored in database 74 accessible by client server 76 (310).
• the web front end of the client server 76 can be configured to allow the user to view scenes stored in the database 74 and select one or more scenes for subsequent processing or inclusion in an album as described below. More specifically, the user logs in to the website using a video identifier for the content that was previously provided as part of block 302 (312).
  • FIG. 4a shows one example of a user interface (i.e., the "login page") 400 presented to the user of client computer 80 for accessing digitized video stored at server system 70.
  • the login includes a video identifier (ID) box 402 for indicating the identifier for the video product that is to be edited/viewed.
  • a video ID is assigned.
  • the video ID is a unique identifier to a particular product.
  • the video ID may be required to be accompanied with a password for security purposes.
  • the login page includes an interface specification checkbox 404 for specifying the connection type for the current session.
  • a user logs in they are presented with a wizard (not shown) that gathers basic information including preferred bit stream (required) and other information about them. Information that the customer specifies can be changed later through a preferences page (not shown).
  • a user may request that their password be reset to the default or a reminder sent to them.
• the default password can be the last four digits of the customer's home phone number.
  • the user is redirected to an appropriate page. If a video ID is provided, the user may be directly presented an editing page for the selected video. Alternatively, the user may be presented with a start page as described below, from which the user can select a video for editing.
  • the user can edit the raw input (314), select scenes for inclusion in an album (316) and publish the album (318).
• the publication of the album includes the creation of a high resolution copy of the selected content (or a streaming version).
  • an order is received for the published album (320).
  • the order can be received by the web front end from the user's client computer 80.
  • the order can be received in other ways including, for example, via electronic mail, Internet Relay Chat, the telephone, and/or the mail.
• the order will include information specifying (or otherwise referring or pointing to) a type of product, the published album to incorporate into the product, a quantity of the product being ordered, payment information, and delivery information.
  • the order is fulfilled (322).
• the order can be fulfilled by burning or otherwise generating the product (e.g., the DVD or CD) and shipping it to the user.
• the start page includes links to Published Albums (finished albums 411) and Unfinished Albums 413, and can include an Inbox (raw material) page for recently uploaded content.
  • published albums are listed. Displayed are the title, description, first key frame (representative thumbnail still), and length of each album.
• clicking on an album plays a streaming video (e.g., Real Video) version.
  • the key frames are at least 80 X 60 pixels.
  • On each start page appears a list of videos that are available for editing. Title or ID, number of scenes, and length are displayed as a link that leads to an editing page for that video.
  • the user is taken to an editing page for the selected video.
  • the user can directly order an album or delete an album by selecting an appropriate button.
• Referring to FIG. 5a, a process 500 for editing a video and creating an album is shown. The process begins by selecting an unpublished album or video to be edited.
• a storyboard edit page is presented.
  • a user interface 550 presented by server system 70 for a storyboard edit page is shown in FIG. 5b.
  • the video title or ID and length will be displayed along with a key frame (e.g., the individual JPEGs from a JPEGs directory in database 74) from each scene.
• Controls presented in conjunction with the storyboard edit page can be manipulated by a user to select a scene for subsequent processing or inclusion in an album. A plurality of representative "thumbnail" images for scenes stored in the image database 74 (also referred to here as "thumbnails" or key frames) are displayed in the user interface 550.
  • the user interface includes two frames, one for clips (scenes) 554 and one for a storyboard 556.
  • the clips frame 554 includes individual scenes associated with a single video ID (i.e. raw input scenes).
• the clip frame 554 can be used to display all of the scenes associated with a previously created album.
  • Each scene includes a thumbnail 552.
• the user can select ones of the scenes from the clip frame 554 for inclusion in the storyboard frame 556 (504).
  • the user interface 550 can be configured in a conventional manner so that a user can select a particular scene for inclusion in a storyboard by clicking on the thumbnail 552 associated with that scene.
• the user interface 550 can include a plurality of buttons (or other user interface controls) associated with one or more scenes stored in the database 74; a user can click on (or otherwise actuate) one of the buttons in order to select the one or more scenes associated with that button.
  • the selected scene is then retrieved from the database 74 and used for subsequent processing.
• the scene can be received and selected in other ways, including for example, as an attachment to an E-mail or embodied on a storage medium such as DVD or CD-ROM.
• a preview of the scene can be displayed (506).
• When the digitized video is processed by the local video processing system 60, one or more encoded versions of the digitized video are created to allow for streaming video downloads to the user.
• Streaming video server 78 processes these requests and provides a streaming video preview of the selected scene. In one implementation, by clicking on a key frame on the editing page the user can view a preview of that scene.
  • a Real Player module pops up, plays the scene, then disappears.
  • the Real Player module pops up and plays just that scene but also displays the key frames for the other available scenes.
  • the Real Player area is embedded in the editing page. Clicking on a key frame plays just that scene.
  • a user interface 580 presented by streaming video server 78 is shown.
  • the user interface 580 includes a streaming video frame 582, scene frame 584 and one or more controls.
  • the streaming video frame 582 displays the encoded video stream associated with the selected scene.
  • the streaming video frame can include one or more controls for manipulating or otherwise editing the selected scene.
• the streaming video frame 582 includes controls for cropping the selected frame to reduce the size/duration of the selected video clip.
  • Scene frame 584 includes plural thumbnails 552 associated with the scenes presented in the clip frame of user interface 550.
  • a new scene can be selected from the scene frame 584 and displayed in the video streaming frame 582 by the user selecting a desired scene.
  • various controls are included on the user interface 580 for editing or otherwise managing the selected scene. Controls can include a tracking control for changing the video tracking, extraction control for selecting a frame or portion of a scene and a deletion control for deleting a scene from the clip frame 554.
• a scene can be selected to be added to the storyboard frame 556 (508). To do so, the user can actuate a control (an add control) presented under a selected scene in the clip frame 554.
• the user can also deselect scenes included in the storyboard frame 556. To do so, the user can actuate a control (delete control 560) presented under a selected scene in the storyboard frame 556. In addition, the user can reorganize selected frames. To do so, the user can actuate a control (arrow controls 562 and 564) presented under a selected scene in the clip frame 554.
  • user interface 550 includes an upload button 567 that can be selected to upload other digital content (e.g., digital stills) or other scenes.
• By invoking the upload button 567, an upload page can be presented that includes raw scenes, or options for uploading other digital content from the client computer or server computer.
  • a user can add scenes from more than one video to an album.
  • the editing page will not allow a user to create an album that is longer in length than 60 minutes, as this is the maximum length that can be burned onto a conventional CD.
• User interface 550 includes an options button 559 that can be selected by the user to customize an album. By clicking on this button, the user will be taken to a page where they can edit the title and description of the album, create a custom title screen, and set the security level of the album (private, shared, or public). If the album is designated as shared, the user may create or change a password associated with the album, or delete it. If the password or security level is changed on a published album, an E-mail can be sent to the user with album ID, password, and guest login instructions.
  • buttons include a change theme button 591, an add music button 592, an add titles button 593 and change permissions button 594.
  • the change theme button 591 can be invoked by a user to add frames, artwork, borders, fonts or other stylized features to the album.
  • the add music button 592 can be invoked by the user to add a music track to the album to be mixed with or supersede the existing audio track.
  • the add titles button 593 can be invoked to add title pages to the album to introduce the album or individual scenes.
  • the change permissions button 594 can be invoked to change the access rights to published albums.
• the user interface 590 can include a change title box 595 that can be used to change the title of the album.
  • the user can preview the album (512).
  • the user can preview the album by selecting a preview button 565 on the user interface 550.
• the preview function includes the display of the entire album using the streaming video scenes that are selected for inclusion in the album. All title pages that are included are rendered and the entire finished product is displayed.
  • the user can publish the album indicating that the album is finished (514).
• the publication process includes the generation of a high resolution or streaming version of the storyboard and all included options.
  • the high resolution version is stored in database 74.
  • the user can publish the album by selecting a finish button 563 in the user interface 550. When finished, the user will be prompted to indicate whether the album should be published or remain unpublished.
  • the entries are assimilated and the final videos are created.
• the following steps are asynchronous; that is, the user is not required to stay online waiting for the steps to complete. Instead, a message will appear stating that they are free to move about the site and that they will be notified via E-mail when their album is published and ready for their viewing. Steps to be completed include extracting new stills for any trimmed scenes, creation of a customized title screen, creation of a final streaming version of the album (which is moved to the appropriate directory), and creation of the final high resolution version of the storyboard.
• an MPEG-1 file and CD directory structure are created and stored at the processing site.
  • the final high resolution version can include proprietary productions screens and customized title screens.
  • a contact sheet is also created and stored at the processing site.
  • the user can invite friends to view or order an album (516).
• the user interface 550 includes an invite button 569 that can be invoked to create an E-mail to be distributed with instructions to one or more invitees to view the published album.
• If an album is designated as shared, the user receives an E-mail with the Album ID, password and instructions for viewing. The user can then forward this information to their guests.
• On the guests' login page, the guest will enter an Album ID and a password associated with this album, as well as their preferred bit stream. The guest will then be taken to a screen that shows the key frames of the published album and instructions on how to view the album.
  • a Real Player pops up and plays the album.
  • the server system 70 will include functionality to set the number of guest views of an album. If an Album is designated as public, a URL is generated for that album and sent via E-mail to the user. That URL will then point directly to the public album. This completes the album creation and publication process.
• A third frame, a collection frame 570, is included in the user interface 550.
  • the collection frame presents albums that have been previously compiled by the user. Each album has associated with it a single representative thumbnail which can be viewed by the user. An album from the collection frame 570 can be selected by the user. Each scene in the selected album is then displayed in the clip frame 554. Scenes from the selected album can be included in the storyboard as described above.
  • the editing page includes a user interface 595 that presents a storyboard 556, a video selection button 557, a preview button 565 and finished button 563.
• the storyboard 556 includes scenes and controls (trim control 593, delete control 595 and shift control 583). Trim control allows a user to trim an individual scene. A user interface presented for trimming a scene is shown in FIG. 5i. The user interface includes controls for selecting an amount of time to trim from the beginning and end of the scene along with preview functions. Referring again to FIG. 5f, the selection of the delete control 595 will delete the selected scene from the storyboard. The user can manipulate the shift control 583 to reorganize scenes in a storyboard.
• An example of the user interface 597 shown when actuating the selection button 557 is shown in FIG. 5g. Thereafter, a user can select one of the available videos for review by selecting a view button 598.
  • the user interface presented 599 when selecting the view button is shown in FIG. 5h.
  • a user can select individual scenes for inclusion in storyboard 556 by selecting a box 589 and the stay or go buttons 587 and 585.
  • a reset button is provided to clear the selections from boxes 589.
  • the stay button causes the selected scenes to be added to the storyboard but the user interface remains on the current page.
• the go button 585 adds the scenes and transports the user back to the edit page (e.g., user interface 595).

Operational Flow
• the operational flow 700 includes three loops: an archiving loop 702, a production loop 704 and a sharing loop 706.
  • a customer provides content to be processed by the system.
  • the system digitizes and encodes the content and provides a value added function of scene detection.
• High quality digitized content is stored and a low resolution version is passed to the server system for viewing and editing by the customer.
  • a customer 710 provides a tape for processing (i.e. video tape source acquisition 712) after which digitization and encoding operations 714 are initiated.
• the digitized and encoded original content is stored locally 716 for future processing (in the production cycle).
  • the high quality encoded and digitized video is converted 718 to a streaming format.
  • the high quality encoded and digitized video is also processed by a scene detection system to detect individual segments in the original content 719.
  • the scene detected streaming version of the high resolution content can be edited and organized into an album that can be published.
  • the scene detection information and streaming format data for any received tape is provided to the database server 724.
  • the database server stores the streaming format content and scene detection information in a local storage 726.
  • the customer accesses a browser 730 to view the content.
  • a web server 720 presents the user with web-based editing tools 732 for manipulating the scene detected content as described above to form an album of content.
  • the user can also view scenes by accessing a streaming server 722.
  • the web server 720 can also allow the customer to access other content that can be included in an album for example from a multimedia jukebox 734.
  • content can be directly provided by a third party to the server system and bypass the archiving loop. That is, the customer can access a strategic partner 750, for example through the Internet, and provide content to the strategic partner.
  • the strategic partner can intake and process content and provide content through middleware 752 to the server system for inclusion in a customer album.
  • the third party can provide upload content (this content goes into the upload loop) and customer support (e.g., the third party has its own web site and provides support to the customers).
  • the tape can be sent to the third party or directly to the server system. After processing, material can be posted through the third party website.
  • the customer produces an album as a finished product in the production loop.
  • the album is published 740.
  • an edit list is produced that describes the final published content.
  • a mastering and duplication service can produce a CD from the edit list.
  • the album can be published on the Web, and cast to one or more recipients.
  • a customer is directed to a particular album that can be viewed.
  • the customer accesses the album and can in turn edit portions, order copies and share the content just as the original producer of the album can.
  • a production flow 800 for the system is shown.
  • the operation begins when the customer places an order for an album 802.
  • the customer accesses the web server home page 804 and traverses to an order page 806.
  • the customer identifies a product to be ordered and is presented with pricing, shipping and other information.
  • a confirmation page is presented to confirm the order details 808. If confirmed, the customer is thanked 810 and sent a confirming e-mail 812 that is received by the customer 814.
  • a mailer kit is sent to the customer 816, received by the customer 818 and sent back with a tape to the production center 820.
  • the production center receives the mailer from the customer 822 and generates an email 824 that is sent to the customer 826 indicating that the mailer (e.g. tape) has been received.
  • the production system identifies the kit from, for example, a bar code on the mailer, and associates the mailer kit with an order 828.
  • the tape is digitized and encoded 830 to create the high resolution content. Thereafter, streaming versions of the digitized content are produced 832 and files are moved to the web server 834 and archived 835.
  • An email is generated 836 and sent to the customer 838 notifying the customer that the digitized content is available for editing at the website.
  • the customer can manipulate/edit the content 840 and then send a notification to the production system to publish an album 842.
  • the production system retrieves the high quality content from the archive and produces data to be included in the CD as described above 850.
  • the CD is burned 852 and a quality assurance test is performed 854.
  • a back-up CD can be created 856.
  • a contact sheet is created and printed 858 along with a CD label 860.
  • the label is attached to the finished CD 862 and shipped to the customer 864.
  • a video identifier with tracking information for the CD is entered into a database 866 and a confirmation email is generated 868. The confirmation email is sent to the customer 870 and, if all goes well, the finished product is received by the customer 872.
  • the website includes a master table of contents for aggregating content from multiple sources.
  • the master table of contents links information from local sources, a home page and other URLs for inclusion in an album.
  • a CD quality still image can be extracted from the original high quality digitized video source.
  • the still images can be distributed as any form of image-based product including mugs, prints, t-shirts or distributed using E-mail.
  • published albums and other content can be viewed using a Multimedia Jukebox™.
  • the Multimedia Jukebox™ can be used to organize videos, audio, text and images all in one place with one-click access.
  • the input content can be divided upon detection of a change in speech pattern (e.g., a different person speaking or upon the detection of the beginning of a speech) or upon the appearance or disappearance of an individual from a scene.
  • the system can create (either automatically or under the direction of the user) an edit list that defines a series of start and stop points in the video content. Associated with each edit list can be one or more filters or other effects to be applied to the segment.
  • the system allows the user to create a playlist.
  • a playlist editor can be used to guide the display of a particular sequence of content.
  • each album can have an associated table of contents that describes all of the content in the album.
  • a play list can be created for the album that sequences through some or all of the content in the album in the order specified by the user.
  • a playlist player is included in the content provided to each of the distributees.
  • the playlist player receives as an input the playlist and plays the contents of the album in accordance with the instructions contained therein.
  • the playlist allows the user to select only the segments they want to see, in the order they choose.
  • the playlist editor can include tools for allowing the user to edit or otherwise manipulate the content of an album.
  • the playlist can include tools for trimming segments, applying filters to segments or other editing functions such as applying titles or other special effects.
  • the playlist editor allows the user to trim the length of any included segment, show the same segment multiple times, and add titles, special effects, etc., to display a personalized version of the content. Others can select their own playlists, which can be shared as they like.
  • the playlist editor includes tools for performing all of the editing functions described above with the web-based editing system.
  • Playlists can be uploaded to the central system to allow the user to distribute personalized content to different distributees. For example, digitized content can be shared with everyone: Once one or more personalized playlists are produced, the playlist(s) can be uploaded and then processed as if the order had been prepared on-line.
  • the editing page presented in the user interface can include a "share" button. Invoking the share button, a user can order new physical media (e.g., CDs) of a personalized nature.
  • the physical media may include video tapes.
  • a user can specify the format of the output content that is to be delivered to the distributees. For example, a VHS copy of an album can be ordered for those without access to a CD-ROM drive.
  • the user may order an Internet webcast.
  • the webcast can be up to 5 minutes for up to 20 viewings with additional length and viewings available at minimal extra cost.
  • the process of delivering content to the distributees can include a sharing option.
  • For example, once the user creates an album, the user has a number of choices to make. Choices range from the format of the delivery (CD-R, DVD, online broadcast, etc.), the distribution list (with whom the user wants to share content), the distribution means (the system can provide copies to the user or distribute the copies to each recipient), and storage options.
  • the user can save albums, scenes and other content (playlists, edit lists, stills and other content) in a personal multimedia jukebox.
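The playlist mechanism described in the bullets above — user-chosen segments, optional trimming, repeats, and ordering — can be sketched as a small player loop. The sketch below is illustrative only: the field names (`segment`, `trim_start`, `trim_end`, `repeat`) and the frame-list representation of video are assumptions, not the patent's actual format.

```python
# Hypothetical playlist player: given a playlist (ordered segment
# references with optional trim and repeat counts), it plays the album
# segments in the user-chosen order.
def play(playlist, album):
    """album: dict mapping segment id -> list of frames (stand-in for video)."""
    shown = []
    for item in playlist:
        seg = album[item["segment"]]
        start = item.get("trim_start", 0)          # trim from the beginning
        end = len(seg) - item.get("trim_end", 0)   # trim from the end
        for _ in range(item.get("repeat", 1)):     # same segment multiple times
            shown.extend(seg[start:end])
    return shown

album = {"intro": list(range(10)), "party": list(range(100, 120))}
playlist = [
    {"segment": "party", "trim_start": 5, "repeat": 2},  # shown twice
    {"segment": "intro"},                                # reordered after it
]
frames = play(playlist, album)
print(len(frames))  # 2*15 + 10 = 40
```

Because the playlist is plain data, it can be uploaded to the central system and replayed there, which is how personalized versions could be distributed without re-sending the video itself.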

Abstract

A method and apparatus for producing video content. The method includes acquiring video data from a source (102). If the video data is not digitized, then the video data is digitized. The method includes generating scene indexes for the video data including a representative still image for each scene and combining the video data and scene indexes along with a media player (110) on a video disc. The video player (110) is operable to play the video data in accordance with the scene indexes including playing a scene from the video data on a client computer (108) while displaying the representative stills for other of the scenes available for display on the video disc.

Description

Video Processing System
The present invention relates generally to computer systems and more particularly to methods and apparatus for collecting, editing and distributing video content.
BACKGROUND Video camcorders have been around for many years and provide non-professional users an easy and inexpensive mechanism for capturing life moments. Conventional video footage recorded by non-professional users suffers from three major problems that have no practical solutions. The longevity of a conventional video tape is approximately 10 years, after which the tapes degrade rather quickly. Homeowners and renters alike typically store video tapes in non-secure storage means that are susceptible to theft and damage (e.g., fire, flood and other natural disasters). Finally, most video tape recorded by conventional non-professional users includes more junk than real footage. That is, non-professional users of camcorders tend not to set up their shots and as such over-record, creating undesirable junk footage. Conventional editing tools, where available, are difficult to use and very time consuming. As such, most non-professional users keep all of the raw footage on tape without editing out the junk.
Conventional solutions to these problems are either inadequate or too expensive. Tape to tape duplication services are available, but costs are not trivial and the duplicate tapes suffer from the same limitations discussed above. Professional encoding of video tapes to optical disks is very expensive, typically on the order of $60 per minute.
Home equipment for digital encoding and editing, where available, is expensive and time consuming to operate.
SUMMARY In one aspect the invention provides a method for producing a video disc and includes acquiring video data from a source. If the video data is not digitized, then the video data is digitized. The method includes generating scene indexes for the video data including a representative still image for each scene and combining the video data and scene indexes along with a media player on a video disc. The video player is operable to play the video data in accordance with the scene indexes including playing a scene from the video data on a client computer while displaying the representative stills for other of the scenes available for display on the video disc.
Aspects of the invention can include one or more of the following features. The step of acquiring can include capturing the video data from an analog source or a digital source. The step of generating scene indexes can include detecting a transition between consecutive frames in the video data, determining when the transition indicates a scene break and marking the end of the previous scene and a beginning of a new scene at a point in time that corresponds to the initially detected transition. The step of detecting a transition can include detecting a color difference between the frames and determining if a difference between frames exceeds a preset threshold.
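The color-difference test described above can be illustrated with a coarse histogram comparison between consecutive frames. This is a hedged sketch, not the patented implementation: the bin count, the normalized L1 distance, and the threshold value are all illustrative choices.

```python
# Illustrative sketch: compare coarse color histograms of consecutive
# frames and flag a possible scene break when the difference exceeds a
# preset threshold.
def color_histogram(frame, bins=4):
    """frame: list of (r, g, b) pixels; returns a normalized joint histogram."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in frame:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    n = float(len(frame))
    return [h / n for h in hist]

def color_difference(frame_a, frame_b, bins=4):
    ha, hb = color_histogram(frame_a, bins), color_histogram(frame_b, bins)
    # Normalized L1 distance between distributions: 0.0 (identical) .. 1.0.
    return sum(abs(a - b) for a, b in zip(ha, hb)) / 2.0

THRESHOLD = 0.5  # would be tuned on sample footage, not a value from the patent

dark = [(10, 10, 10)] * 100
bright = [(240, 240, 240)] * 100
print(color_difference(dark, dark))    # 0.0 -> same scene
print(color_difference(dark, bright))  # 1.0 -> possible scene break
```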
The method can further include cropping one or more of the frames prior to the comparison to eliminate effects from the boundary of the image frame. The step of detecting a transition can include detecting a motion difference between the frames. The step of detecting a transition can include determining if a difference between frames exceeds a preset threshold.
The step of determining when a transition indicates a scene break can include comparing plural frames to a last frame thought to be part of a preceding scene. The step of generating representative stills for each scene can include selecting a first frame from each scene or a frame from an introductory group of frames from each scene. The step of selecting a frame can include determining a color distribution for plural frames in a scene and selecting a frame from the introductory group that is a best match to the determined color distribution. The method can include creating a contact sheet for distribution with the video disc that includes each representative still for the scenes detected on the video disc. The video disc can be a compact disc or a digital video disc.
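The best-match still selection described above — choose, from a scene's introductory frames, the one whose color distribution best matches the scene's overall distribution — can be sketched as follows. For simplicity the frames here are lists of gray values and the "color distribution" is a one-dimensional histogram; a real implementation would use full color statistics, and the `intro_count` parameter is an assumption.

```python
# Hedged sketch: pick a representative still from a scene's opening
# frames by finding the frame whose histogram is closest to the mean
# histogram over the whole scene.
def histogram(frame, bins=8):
    hist = [0.0] * bins
    step = 256 // bins
    for v in frame:
        hist[v // step] += 1
    return [h / len(frame) for h in hist]

def representative_still(scene_frames, intro_count=5):
    hists = [histogram(f) for f in scene_frames]
    # Mean color distribution over the whole scene.
    mean = [sum(col) / len(hists) for col in zip(*hists)]
    best, best_dist = 0, float("inf")
    for i, h in enumerate(hists[:intro_count]):  # introductory group only
        dist = sum(abs(a - b) for a, b in zip(h, mean))
        if dist < best_dist:
            best, best_dist = i, dist
    return best

# A dark frame, a bright frame, a mixed frame, another dark frame: the
# mixed frame is the closest match to the scene's average distribution.
scene = [[30] * 50, [200] * 50, [30] * 25 + [200] * 25, [30] * 50]
print(representative_still(scene))  # 2
```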
In another aspect the invention provides a method for producing a video based product that includes acquiring video data and generating temporal indices including analyzing the video data to detect the temporal indices. The temporal indices indicate a division of the video data into distinct segments. The method includes providing a media player operable to play the video data on a client computer in accordance with the temporal indices and packaging the video data, the temporal indices and media player on a physical medium for delivery to the client computer.
Aspects of the invention can include one or more of the following features. The method can include digitizing the video data prior to packaging the video data and generating representative stills for one or more segments. The media player can be operable to display one or more of the representative stills while playing the video data on the client computer.
The method can include providing a media editor operable to generate one or more edit lists. Each edit list can define a set of operations to be performed on the video data by another computer, so as to allow editing operations defined on one computer to be replicated on another computer. The method can include editing the video data in accordance with the edit lists on the other computer and distributing the edited video to user designated distributees. The packaging step can include producing a physical manifestation of the content to be delivered. The physical manifestation can be a video disc. The packaging step can include producing a streaming version of the video data in accordance with the temporal indices, receiving a request to webcast the streaming version of the video data and streaming the streaming version to a requestor. The distribution of the edited video can include providing the edit list to a central distribution site, generating the edited video at the central distribution site and directly delivering the edited video to the distributees.
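The edit-list idea — record operations on one computer, then replay them against the copy of the video held by another — can be sketched with a simple serializable structure. The class name, fields, and toy `reverse` filter below are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass, field

# Hypothetical edit list: an ordered set of start/stop points plus named
# filters, compact enough to send to a central site where the same edits
# are replayed against the high-resolution copy of the video.
@dataclass
class EditEntry:
    start: int                                  # start frame of kept segment
    stop: int                                   # stop frame (exclusive)
    filters: list = field(default_factory=list)

# Illustrative filters; a real system would apply video effects here.
FILTERS = {"reverse": lambda seg: seg[::-1]}

def apply_edit_list(frames, edit_list):
    """Replay the edit list against a frame sequence (stand-in for video)."""
    out = []
    for entry in edit_list:
        segment = frames[entry.start:entry.stop]
        for name in entry.filters:
            segment = FILTERS[name](segment)
        out.extend(segment)
    return out

low_res = list(range(20))   # edits defined against the streaming copy...
high_res = list(range(20))  # ...replayed against the archived copy
edits = [EditEntry(2, 6), EditEntry(10, 14, ["reverse"])]
assert apply_edit_list(low_res, edits) == apply_edit_list(high_res, edits)
print(apply_edit_list(high_res, edits))  # [2, 3, 4, 5, 13, 12, 11, 10]
```

Because the edit list carries only frame indices and filter names, the two machines never need to exchange the video itself to produce identical results.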
Aspects of the invention can include one or more of the following advantages. An efficient and inexpensive system is provided for collecting, digitizing and editing video content. The system includes digitizing equipment for digitizing analog and digital video input for distribution over the Internet. The system includes scene detection mechanisms for parsing the digitized content into plural scenes which can then be edited or otherwise manipulated by the user. The system provides tools for the user to easily combine plural scenes into a single composition (an album) which can be burned into a compact disc (CD) for distribution to plural sources. The composition can be viewed locally with streaming content, or can be burned into a CD. The content on the CD can be a high resolution or a streaming format. The system provides fully automated digital video editing services to allow users to manipulate each scene, combine scenes and integrate other input including audio and digital still pictures. The system combines an Internet web hosting service for viewing digitized content with video processing tools to facilitate the distribution of content to plural distributees. The system provides a menu of still frames, one for each detected scene in a digitized input. The system provides video search tools that can analyze scenes to search for particular content based on keywords.
CDs produced by the system can be archived or otherwise backed up to make available the original high quality content for later derivative work. Only desired high quality content will be archived through the interaction of the user in creating an edited album on the Web. The user can select from a high quality or a lower quality streaming version of the content when publishing an album. High quality still images can be extracted from the high quality content in a published album. High quality stills can also be extracted from the high quality content archived by the system.
These and other advantages will be evident from the description below, the claims and the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a schematic block diagram of a system for capturing, editing and distributing video content.
FIG. 2a is a block diagram of a capture and digitization module. FIG. 2b is a flow diagram of a method for detecting scenes in a digitized video.
FIG. 3 is a flow diagram of a high-level process for offering digitized video products for sale over a computer network such as the Internet. FIG. 4a shows a user interface for a login screen. FIG. 4b shows a user interface for a start page. FIG. 5a is a flow diagram for a process for editing a video and creating an album.
FIG. 5b shows a user interface for an editing page. FIG. 5c shows a user interface for a preview page. FIG. 5d shows a user interface for an options page. FIG. 5e shows a user interface for an alternative editing page. FIG. 5f shows a user interface for a second alternative editing page.
FIG. 5g shows a user interface when a selection button of FIG. 5f is activated. FIG. 5h shows a user interface when a view button of FIG. 5f is activated. FIG. 5i shows a user interface for trimming a scene. FIG. 6 shows a user interface for a media player. FIG. 7 shows an operational flow for the system of FIG. 1. FIG. 8 shows a production flow for the system of FIG. 1.
DETAILED DESCRIPTION
As used herein the term "Video Data" refers to an image stream, audio stream or synchronized image and audio stream. "Physical media," as used herein, refers to means for storing digitized content and can include a video disc, floppy disc, zip drive, minidisc, magnetic tape, CD-ROM, VCD and DVD.
"Segment," as used herein, refers to a definable portion of video data. Tools described below can be used to locate segments of the video data. Portions of the description below are described with reference to a scene. A scene is a type of segment often associated with an image stream. While the description sets forth particular details of scene detection and other scene features, those of ordinary skill in the art will recognize that the invention is equally suited to process other video data types.
Referring now to FIG. 1, a system 100 is shown for capturing, editing and distributing video content. The system 100 includes a local video processing system 60 and server system 70. Local video processing system 60 captures and digitizes content and provides digitized video to server system 70. Server system 70 maintains a database 74 of digitized video and one or more servers. Database 74 may itself be a database server that includes one or more storage means for storing streaming and high resolution video and other data. The servers execute one or more applications to host video editing services as will be described in greater detail below. Server system 70 includes a website that can be accessed to retrieve, manipulate, order and distribute digitized video to one or more distributees. The details of the website, editing tools and distribution services provided by server system 70 are described in greater detail below. Server system 70 can be linked to by a user using a client computer 80 via a network 82 (e.g., the Internet). The user can login, review and edit video that has been captured, combine the captured/edited content with other media and preview the results (i.e., a storyboard) in real time. For the purposes of these discussions, a "storyboard" is a working area presented by a user interface provided by server system 70 to the user operating client computer 80. One or more scenes are added to the storyboard as the user develops a finished product referred to as an album. An album includes a name and a representative still. Albums can be edited and eventually published. Publication can include the creation of a high resolution version of the digitized content and may include the production of a physical manifestation of the digitized content (physical media) for distribution to one or more distributees. Alternatively, an album can be published on-line and viewed by others in a streaming format.
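The storyboard and album concepts introduced above can be summarized with a small, purely illustrative data model; the class and field names below are assumptions, not structures from the patent.

```python
from dataclasses import dataclass, field

# Illustrative data model: a storyboard is a working sequence of scenes;
# publishing it yields an album with a name and a representative still.
@dataclass
class Scene:
    scene_id: str
    start_frame: int
    end_frame: int

@dataclass
class Album:
    name: str
    representative_still: str  # e.g. a JPEG filename
    scenes: list

@dataclass
class Storyboard:
    scenes: list = field(default_factory=list)

    def add_scene(self, scene):
        self.scenes.append(scene)

    def publish(self, name):
        # The first scene's still stands in for the album's representative
        # image in this sketch.
        still = self.scenes[0].scene_id + ".jpg" if self.scenes else ""
        return Album(name, still, list(self.scenes))

sb = Storyboard()
sb.add_scene(Scene("scene01", 0, 450))
sb.add_scene(Scene("scene02", 600, 900))
album = sb.publish("Summer Vacation")
print(album.name, len(album.scenes))  # Summer Vacation 2
```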
In one implementation, the user can view a streaming video version of the digitized content stored in the database 74 in server system 70. Streaming video server 78 can download to the user via the network 82 a streaming version of a scene, storyboard or album. The streaming video version can be a low resolution version of the original digitized content stored in the database 74. After the user has reviewed and/or modified a storyboard, the user can use a browser to order a physical manifestation of the storyboard/album. The user can also allow others to access an album or distribute multiple copies of the physical manifestation to other distributees. The processes invoked by the browser are described in greater detail below. System 100 includes a production system 90 that is used to produce a published version of a selected album as well as produce the physical manifestations of the album for distribution to the distributees. The published version of the album can be a high resolution, streaming or other version of the original digitized video content that is stored in the database 74 of the server system 70. In addition to the delivery of a physical manifestation of the digitized content, an information stream can be produced to deliver a version of the content to the distributees. The information stream can be delivered by a delivery system such as the World Wide Web using an internet enabled set top box (using the file transfer protocol ftp), DVD player or personal computer, a cable system incorporating a video-on-demand set top box, or satellite system (satellite narrowcast). These and other delivery systems can be used to deliver a streaming version of the digitized content.
Local Video Processing System
Local video processing system 60 includes a capture and digitization module 62, a scene detection module 64, one or more streaming video processors 66, splitters 68 and local storage 69 (not shown). Referring now to FIG. 2a, capture and digitization module 62 includes an input module 102, an input monitoring multiplexor 104, a digitization source multiplexor 106, a digitization module 108, a digitization control module 110, content monitoring module 112, content monitoring multiplexor 114 and network connection 116 for interfacing with a network (e.g., local area network (LAN), intranet, Internet) that couples the digitization module 62 and the rest of the local video processing system 60.
Input module 102 includes plural means for reading input received from a user. Input can be received from a user by US Mail, delivery (e.g., FedEx, UPS), through a designated receiving site (e.g., a drop off center, kiosk, photo shop), or can be uploaded directly by the user. Input can be analog or digital. If the input has already been digitized, then the input can be provided directly to digitization control module 110.
Otherwise, all other forms of input are digitized using digitization module 108. In one implementation, input module 102 includes a video cassette player (VHS, SVHS or 8mm format), a compact disc player (video compact disc (VCD) and digital video disc (DVD)) and a camcorder for reading input. Input can be of the form of analog or digital tape (VHS, SVHS or 8mm tape), VCDs, DVDs or direct input from a video recording device such as an 8mm Hi-8 camcorder. Input module 102 provides as an output plural input streams, one from each input device, that can be passed to both the input monitoring multiplexor 104 and digitization source multiplexor 106. Alternatively, the input module input stream can be coupled directly to the digitization control module using a FireWire connection (IEEE-1394 interface) or other direct input means.
Input monitoring multiplexor 104 receives as inputs a video stream on each of its input ports and provides a single selected stream as an output on its output port. In one implementation, input monitoring multiplexor 104 receives as inputs two video streams from the input module (a stream from a video cassette player and the compact disc player) and a feedback stream from the digitization control module 110. The output of the input monitoring multiplexor 104 is coupled to an input of the content monitoring module 112. In this way, the video output from each input device can be viewed by a quality control monitor for the system.
Digitization source multiplexor 106 receives as inputs video streams on each of its input ports and provides a single selected stream as an output on its output port. In one implementation, digitization source multiplexor 106 receives as input three video streams from the input module (one from each of the video cassette player, compact disc player and camcorder). The output of the digitization source multiplexor 106 is coupled to the input of digitization module 108. In this way, the video output stream from each input device can be selected for digitization by the digitization module 108. Digitization module 108 can include plural devices for digitizing the video input received from the input module 102. In one implementation, digitization module includes a controller 170 (e.g., an Osprey 200 video capture card available from ViewCast.com), and two digitizers 172 (a Digital Video Creator available from Dazzle Multimedia and Studio MP10 available from Pinnacle Systems). Each device (controller 170 and digitizers 172) is coupled by a bi-directional communications bus to the digitization control module 110. In one implementation, controller 170 is included as part of digitization control module 110.
Digitization control module 110 controls the configuration and selection of the devices in the digitization module 108. Depending on the configuration, one or more of the devices will operate on the video stream received from the digitization source multiplexor 106 and provide output to both the content monitoring multiplexor 114 and the digitization control module 110. In one implementation, each digitizer 172 provides a digitized stream that contains the digitized video as an output to the digitization control module 110. In addition, the digitized content can be rendered to produce a video stream that is provided as an output to content monitoring multiplexor 114.
Digitization control module 110 can also perform a synchronization function for the data transfers between the digitization module 108 and input module 102. Digitization control module 110 can activate input module 102 and digitization module 108 in an appropriate sequence so that output of input module 102 can feed into the input of digitization module 108 without any human intervention.
Content monitoring multiplexor 114 receives as inputs a video stream on each of its input ports and provides a single selected stream as an output on its output port. In one implementation, content monitoring multiplexor 114 receives as inputs two video streams from the digitization module (a stream from each digitizer 172). The output of the content monitoring multiplexor 114 is coupled to a second input of the content monitoring module 112. In this way, the video output from each digitizer 172 can be viewed by a quality control monitor for the system.
Content monitoring module 112 includes a video monitor for viewing video streams processed by the system 100. In one implementation, the content monitoring module 112 includes two inputs, one from the digitization module 108 and one from the input module 102 (via their respective multiplexors).
Digitization control module 110 controls the operation of the digitization module 108. Digitization control module 110 receives as an input a digitized video stream that can be manipulated for further processing. Digitization control module 110 associates with each processed digitized stream a video identifier (ID) associated with the source (user) of the input. The output from digitization control module 110 (a digitized video stream) is coupled by network connection 116 to remaining portions of the local video processing system 60. In one implementation, the digitized video stream is encoded prior to output. In one implementation, the output format is an MPEG format.
In one implementation, digitization control module 110 can add information to the digitized video prior to transfer to the remaining portion of the local video processing system 60. For example, production screens or customized title screens can be added to the digitized video to indicate the source of origin of the digitization services.
The digitized content produced in the local video processing system 60 can be stored locally in local storage 69 on disc or other physical media.
I. Scene Detection
Referring again to FIG. 1, scene detection module 64 includes an algorithm for detecting scene changes in the digitized video. Scene detection module 64 receives as an input the digitized stream from the capture and digitization module 62 (e.g., the MPEG file) and provides as an output scene information. In one implementation, the scene information includes scene identification information, bad video segment information as well as a representative still image for the scene. In one implementation, the scene detection module 64 provides as an output a JPEG still for each scene.
Referring now to FIG. 2b, a method 200 invoked by the scene detection module 64 for detecting scenes in the digitized content is shown. The method 200 is used to detect when the video camera was turned on or off (i.e., a scene break). The output of the method is a list of segments, where each segment contains beginning and end frame numbers. The method begins by retrieving a first image frame (202). If no more frames are to be processed (204), the process ends (230). Else, the next image frame is retrieved (206). The current image frame is cropped (208). The image frame is cropped to include only the interior portion of the image area since the outside boundary may contain camera noise. Where the image frame dimensions are W x H, the method only considers pixels within the rectangle (bw,bh) - (W-bw,H-bh). In one implementation, bw is set to 0.1*W and bh is set to 0.1*H.
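The cropping step just described — keeping only the interior rectangle (bw,bh) - (W-bw,H-bh) with bw = 0.1*W and bh = 0.1*H — can be sketched as:

```python
# Sketch of the crop step: keep only the interior of a W x H frame,
# discarding a border of bw = border_frac*W and bh = border_frac*H
# pixels, since the boundary may contain camera noise.
def crop_interior(frame, border_frac=0.1):
    """frame: list of rows (each a list of pixels)."""
    h, w = len(frame), len(frame[0])
    bh, bw = int(border_frac * h), int(border_frac * w)
    return [row[bw:w - bw] for row in frame[bh:h - bh]]

frame = [[(x, y) for x in range(20)] for y in range(10)]  # 20 x 10 frame
inner = crop_interior(frame)
print(len(inner), len(inner[0]))  # 8 16
```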
Thereafter, a color difference and a motion difference between the current image frame and the previous one are computed (210). A check is made to determine if the method is in the "tentative" mode (212). If not, another check is made to determine if a time-out counter (TOC) is non-zero (213). If not, the process continues at step 214. If the TOC is non-zero, then the TOC is decremented by one unit (240). If, after decrementing, the TOC is now zero (242), the method declares a true scene break and records a frame identifier associated with the frame located previously (in step 218) as the end of a scene and the next frame identifier as the start of the next (new) scene in the list of scenes (244). Thereafter, or if the result of the comparison in step 242 is non-zero, the process continues at step 214.
In step 214, the motion and color differences are compared to preset thresholds. If the motion and color differences are not both above the preset threshold(s), then the process continues at step 204. If both the motion and the color differences are above the preset thresholds, then a possible scene break has been detected and the method enters the tentative mode (216). The tentative mode is defined as a time period after detecting an initial indication of a scene break over which the frame-to-frame motion and/or color difference continues to exceed the threshold(s). In one implementation, the thresholds are determined by performing a regression analysis on a database of home camcorder footage. The image frame immediately before the detected break is identified (218) and the process continues at step 204.
If the method is already in the tentative mode, then a check is made to determine if both the motion and the color difference are above preset thresholds (for the identified frame and the current frame) (220). If so, then the method switches back to the "normal mode" (exits tentative mode (222)) and then compares the frame identified in step 218 with the current frame (223). The comparison includes the computation of the color and motion differences between the two frames and a comparison of the results to a preset threshold(s). If the differences do not exceed the threshold(s) (224), then the tentative scene break is cancelled (including deleting the scene identified in step 218) (226) and the method continues at step 204. If the differences exceed the thresholds, then the TOC counter is initialized to a preset value (228). At this point, the method has detected a true scene break, but will wait for a number of frames equal to the initial time-out value before declaring a scene break. This waiting period can be used to detect another scene break (within the time-out period) which has an even larger motion and color difference.
Thereafter, the process continues at step 204.
In home camcorder footage, camera flashes and fast motions (e.g., someone walking in front of the camera) can often cause color and motion differences to exceed the preset thresholds for a short period of time. In order to reduce these false positives, the system includes the tentative mode. While in the tentative mode, the system compares new image frames to the previously stored image frame (identified in step 218) for a timeout period. In one implementation, the timeout value is set to 1/3 of a second. If, by the end of the timeout, the image is still very different from the stored image, the system declares the detected scene break to be a true scene break and turns off tentative mode.
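The confirmation idea behind the tentative mode can be sketched roughly as follows. This is a much-simplified sketch: it collapses the TOC mechanics of steps 213-244 into a single timeout check, and `diff_exceeds` is a hypothetical predicate standing in for the combined color-and-motion threshold test:

```python
def detect_scene_breaks(diff_exceeds, n_frames, timeout=10):
    """Simplified sketch of the tentative-mode idea in method 200: a candidate
    break (frame-to-frame difference above threshold) is only declared a true
    scene break if, `timeout` frames later, the footage is still very different
    from the frame stored just before the candidate (step 218).
    `diff_exceeds(i, j)` -> True when the color AND motion differences between
    frames i and j exceed the preset thresholds. `timeout` ~ 1/3 s, i.e. 10
    frames at 30 fps. Returns (start, end) frame-number pairs per scene."""
    scenes, start, candidate = [], 0, None
    for k in range(1, n_frames):
        if candidate is None:
            if diff_exceeds(k - 1, k):       # possible break: enter tentative mode
                candidate = k - 1            # frame identified in step 218
        elif k - candidate >= timeout:       # timeout elapsed
            if diff_exceeds(candidate, k):   # still very different: true break
                scenes.append((start, candidate))
                start = candidate + 1
            candidate = None                 # else: flash/fast motion, cancel
    scenes.append((start, n_frames - 1))     # close the final scene
    return scenes
```

A camera flash that subsides within the timeout window never produces a break, because the footage matches the stored frame again when the timeout expires.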
A. Computing Color Difference
There are many standard ways of computing color differences. In one implementation, the system first computes a 2D color histogram of the image frame. The two color axes are used to compute a representative color model of the image frame. The two axes are normalized red (r' = r/(r+g+b)) and normalized green (g' = g/(r+g+b)). Thereafter, the system creates a 2D array, H, where each bin, H(r',g'), represents a rectangular region in normalized red and green space. Each bin is initially set to zero. For each pixel in the image, the pixel's normalized color values (r',g') are computed and the count in H(r',g') is incremented by one. The actual color difference is computed by comparing the histogram for the current image frame and the previous image frame (or identified image frame). The color difference between the two image frames is the histogram intersection: the system accumulates, for each bin location in each histogram, the count of the smaller of the two bins [CD = sum( min(H1(i,j),H2(i,j)) ) / N, where i and j are indexed over all the bins in the histograms and where N is the total number of pixels in the image frame].
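The normalized red/green histogram and the intersection formula can be sketched as follows (pure Python over flat pixel lists; the 16-bin quantization is an assumption, and note that the intersection as written is a similarity measure, so identical frames score 1.0):

```python
def normalized_rg_histogram(frame, bins=16):
    """2D histogram H over normalized red/green: r' = r/(r+g+b), g' = g/(r+g+b).
    `frame` is a flat list of (r, g, b) pixels; returns a bins x bins count grid.
    The bin count is an illustrative assumption."""
    hist = [[0] * bins for _ in range(bins)]
    for r, g, b in frame:
        total = (r + g + b) or 1                     # guard against black pixels
        rn = min(int(r / total * bins), bins - 1)
        gn = min(int(g / total * bins), bins - 1)
        hist[rn][gn] += 1
    return hist

def color_difference(h1, h2, n_pixels):
    """Histogram intersection: CD = sum(min(H1(i,j), H2(i,j))) / N, as in the
    text. CD = 1.0 means the two frames have identical color distributions."""
    overlap = sum(min(a, b)
                  for row1, row2 in zip(h1, h2)
                  for a, b in zip(row1, row2))
    return overlap / n_pixels
```

Because r' and g' are chromaticity coordinates, this comparison is relatively insensitive to overall brightness changes between frames.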
B. Computing motion difference
There are many standard ways of computing motion differences. In one implementation, the system computes a motion difference by calculating the total number of pixels that have changed significantly between the two images. The system uses the intensity value of each pixel I(x,y) to do the comparison. Let m(x,y) = 1 if |I1(x,y) - I2(x,y)| > threshold, 0 otherwise. Then the motion difference MD = sum( m(x,y) ).
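The motion difference reduces to a thresholded pixel count; a minimal sketch over flattened intensity arrays (the default threshold value is an assumption):

```python
def motion_difference(frame1, frame2, threshold=25):
    """MD = sum(m(x,y)), where m(x,y) = 1 if |I1(x,y) - I2(x,y)| > threshold,
    0 otherwise. Frames are flat sequences of per-pixel intensity values."""
    return sum(1 for i1, i2 in zip(frame1, frame2) if abs(i1 - i2) > threshold)
```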
C. Extracting Representative Frames
In one implementation, the process for selecting a representative still for a scene includes picking the first frame of the scene. Although this strategy works very well for professionally produced footage, it does not work well for home camcorder footage. The first few frames of home footage often contain garbage because the person has not quite set up the shot yet. In one implementation, a better choice for the representative frame is selected by analyzing the entire segment and selecting an image that best approximates the rest of the segment. In one implementation, the system restricts this representative image to be within the first few seconds of the scene (the "segment intro").
To select a better representative frame, the system computes a color model of the entire scene by computing the average color histogram of the entire scene. This average histogram is compared with every image in the segment intro using histogram intersection (see previous discussion). The image frame with the smallest color difference is chosen.
In one implementation, a one-pass algorithm is used to extract the representative frames while the scene break detection is going on. As soon as a new scene is detected, all the successive images in the segment intro (usually 5 seconds = 150 frames) are stored in a buffer. In addition, the system keeps track of the running sum of all the bins in the histogram. When the end of the segment is detected, the system computes the average histogram by dividing each bin by the total number of images. This average histogram is compared against the frames in the buffer. The best match is selected and output as the representative frame. In one implementation, a slight trick is used due to the timeout period. More specifically, the system does not know that a scene has been detected until a timeout (e.g., 1/3 of a second) after the break was initially processed. The system maintains a second additional buffer sized in accordance with the timeout period (e.g., 1/3 second = 10 frames for a 30 fps video segment) to make sure the system does not miss any frames.
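The average-histogram selection can be sketched as follows; for brevity this sketch averages over the buffered intro histograms only, whereas the one-pass algorithm described above keeps a running sum over the entire scene:

```python
def representative_frame(intro_histograms):
    """Return the index of the intro frame whose 2D color histogram has the
    largest intersection with (i.e., smallest color difference from) the
    average histogram. `intro_histograms` is a list of equal-sized 2D grids;
    in the text the average would cover the whole scene, not just the intro."""
    bins = len(intro_histograms[0])
    n = len(intro_histograms)
    avg = [[sum(h[i][j] for h in intro_histograms) / n for j in range(bins)]
           for i in range(bins)]

    def intersection(h):
        return sum(min(h[i][j], avg[i][j])
                   for i in range(bins) for j in range(bins))

    return max(range(n), key=lambda k: intersection(intro_histograms[k]))
```

The frame maximizing the intersection with the average is the one that best approximates the segment as a whole.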
D. Bad Video Segment Detection
Home camcorder footage often includes segments in which there is not a valid video signal on the tape. This can occur if the lens cap is left on while recording, if the tape is accidentally fast forwarded during filming (leaving an unrecorded portion of the tape) or by other means. The most common cause of this problem occurs when a tape is removed before it has been filled to 100% of capacity, producing a tape that includes a final segment with no video signal.
In order to automatically detect and remove such bad video segments, the system can perform an analysis on the representative still selected for each segment and remove those segments that do not have a valid video signal. This of course assumes that if the representative still has a valid signal, then the remainder of the detected segment will also have a valid signal. Since the representative frame is by definition the frame that best represents the entire segment, this assumption is very good in practice.
To determine whether or not the representative still has a valid video signal, the system computes statistics on each of the color channels (e.g., three in an RGB device) and compares them to preset thresholds. In one implementation, the system computes the standard deviation of the red, green and blue color components [for example, for the red component: StdDevR = sum((Rk-AvgR)*(Rk-AvgR))/N, where AvgR is the average value of the red component throughout the image, N is the total number of pixels in the image and Rk is the value of the red component of the kth pixel in the image, where k ranges from 0 to N-1].
The system then checks to see if the standard deviation computed is less than a threshold [e.g., StdDevR < ThresholdR]. If so, then the still image is labeled a bad image and the associated segment is labeled a bad segment (e.g., scene). In one implementation, the threshold values are determined by performing a regression analysis on a database of home camcorder footage.
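The per-channel spread test can be sketched as follows; the default threshold is an illustrative assumption (the text derives thresholds by regression), and the expression shown, matching the text's formula, is strictly the variance rather than the standard deviation:

```python
def is_bad_still(pixels, threshold=10.0):
    """Flag a representative still that lacks a valid video signal.
    Per channel: StdDev = sum((Ck - AvgC)^2) / N, as written in the text.
    A near-uniform image (e.g., lens cap on) has near-zero spread in every
    channel. `pixels` is a flat list of (r, g, b) tuples."""
    n = len(pixels)
    for channel in range(3):
        values = [p[channel] for p in pixels]
        avg = sum(values) / n
        spread = sum((v - avg) ** 2 for v in values) / n
        if spread >= threshold:
            return False      # at least one channel varies: valid signal
    return True               # all channels flat: label the segment as bad
```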
In an alternative approach, a check can be made of each image (still) in a segment. When the color statistics for each frame exactly match, a bad segment is declared.
Referring again to FIG. 1, after the scene detection process has been performed, the digitized video (e.g., MPEG file) and scene detection information (e.g., scene identifiers and JPEG stills) are provided to streaming video processor(s) 66 and splitter 68. Streaming video processor 66 operates on the digitized video to produce a streaming version (encoded version) that can be easily downloaded or viewed over a network (e.g., the Internet). In one implementation, two parallel streaming processors 66a and 66b are provided that produce streaming video output streams at two resolutions and bit rates.
Streaming video processor 66a provides a streaming video output for supporting a 56k modem configuration while streaming video processor 66b provides a streaming video output for supporting a digital subscriber line (DSL) configuration. In one implementation, video output processor 66 outputs a RealVideo format file and any accompanying SMIL files necessary for previewing the RealVideo format file by the user.
The output of video processor 66 is provided as an input to splitter 68. Splitter 68 takes the scene detection information and produces individual encoded files, one for each scene. The output of the splitter 68 is provided as an input to server system 70. In one implementation, two parallel splitters 68a and 68b are provided that produce encoded output files at two resolutions and bit rates. Splitter 68a provides as an output scene-based encoded files to support a 56k modem configuration while splitter 68b provides as an output scene-based encoded files to support a DSL configuration. In one implementation, splitter 68 outputs RealVideo format file(s) and any accompanying SMIL files necessary for previewing the RealVideo format file by the user.
Server System
Server system 70 includes a batch processor 72, a database 74, a client server 76 and streaming video server 78.
Batch processor 72 receives as an input encoded (e.g., scene-detected RealVideo) files and digitized video files (e.g., the digitized MPEG file) from local video processing system 60. The connection between server system 70 and local video processing system 60 can be optimized to support needed bandwidth and cost structure. In one implementation, the connection between the devices is a high-speed T1 link. In another implementation, this connection is a physical device such as a removable hard disc or an Iomega Jaz disc. Batch processor 72 writes all of the information associated with the files to the database 74 and copies all files into an appropriate directory structure. In one implementation, all files associated with a digitized video are stored in a single directory associated with the user's ID. Other directory structures can be used.
Client server 76 receives notice from batch processor 72 when the data transfer to the database 74 has been completed and the video content is available for processing.
Client server 76 includes plural applications for interfacing with the user and the various other system components. In one implementation, client server 76 includes an E-mail application that can be invoked to send notification to the user that the digitized video is available for review. Client server 76 hosts a website that can be visited by the user. Client server 76 is connected to one or more client computers 80 by a network 82 such as the Internet. Client server 76 includes a web front end (not shown) that manages the communications with the client computers 80. The website can include plural applications that when executed allow the user to view, edit, manipulate, archive and order copies of the digitized content. The website architecture and user interface are described in greater detail below. In one implementation, the user can view a streaming video version of the digitized content stored in the database 74 in server system 70. Streaming video server 78 can download to the user via the network 82 a streaming version of a scene, storyboard or album. The streaming video version can be a low resolution version of the original digitized content stored in the database 74.
In one implementation, the video material is stored for a predefined length of time at server system 70. In one implementation, server system 70 sends E-mails at 10 and 14 days that warn of imminent deletion of material. Ultimately, material can be deleted after a pre-defined period (e.g., 21 days). Any in-process albums will be altered to remove the deleted material. An E-mail can be sent after deletion that informs the user how to send in an archive CD for reposting of material.
Client Computer
The client computers 80 can be connected to various input devices (digital video camera, digital still camera and storage means) so that a user can upload captured digital images, video clips or previously digitized video scenes to the client computer 80. Alternatively, or in addition, the client computer 80 can execute digital video processing software such as Ulead Video Studio 3.0 SE or image processing software such as ADOBE PHOTOSHOP® in order to create and/or edit digital video or still images. The client computer 80 includes a storage medium (not shown) such as a hard disk for storing the digital video or still images.
The client computer 80 is connected to the network 82, for example, using a modem or network interface card. The system can be implemented as a browser-based system in accordance with the standard protocols for communicating over the World Wide Web. In such an implementation, a user of the client computer 80 can execute a browser 84 to connect to and interact with the client server 76. As described above, client server 76 includes a web front end that manages the communications with the client computer 80. The user of the client computer 80 can upload digital content to the client server 76. The web front end receives the uploaded digital content and stores the content in database 74.
Production System
The user of the client computer 80 can also order content made from edited or raw content as will be described in greater detail below. The client server 76 includes or is connected to a production system 90. Production system 90 receives the selected digitized content from the client server 76 and generates a physical manifestation (e.g., DVD or CD) of the content from the selected digitized content. Alternatively, the production system receives an edit list that identifies the content to be processed and the content is retrieved from the database 74 (e.g., from the database server). The items generated by the production system 90 can be shipped to the user using a conventional shipping service such as the United States Postal Service or Federal Express.
In one implementation, the production system includes a user interface that is presented to the user to allow for a one click publication process. The user interface can include for example a finished button (563 of FIG. 5b) that automatically publishes a finished album. The automatic publication process includes the generation of scenes, high resolution content, streaming content, contact sheets and other materials for inclusion in a finished product. The publication services are described in greater detail below.
In one implementation, the CDs produced by production system 90 can be archived or otherwise backed up to allow for high quality content to be available for later derivative work.
In one implementation, the product shipped (CD) includes the high resolution video (e.g., MPEG files), a Media Player and an auto play file. In another implementation, a streaming version of the selected content is also included in the shipped product. A contact sheet can be created and printed for inclusion with the product. After the scene detection process has been completed, a contact sheet can be generated. The contact sheet is a sheet (e.g., of paper) that contains a number N of thumbnail-sized key frame images (stills) representing the detected segments of the input video as well as the movie times associated with each key frame. In one implementation, the maximum number of thumbnails is 30. When generating the contact sheet, two distinct special cases are considered: when the number of detected segments M is greater than N, and when the number of detected segments M is less than N.
In one implementation, when presented with the first case where M>N and more segments are detected than can be printed on a single sheet, only the first N thumbnails are printed on the sheet. In the second case where N>M, an algorithm can be selected to fill up the sheet. In one implementation, the algorithm selects a longest segment, divides the segment into equal pieces and selects new representative stills for the pieces (as necessary). Alternatively, only a subset of the stills can be printed. In one implementation, the selection of the number of stills to be printed on the contact sheet does not affect the underlying digitized content (i.e., no new scene breaks are established due to the division).
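The two contact-sheet cases can be sketched as follows; the function and its even-split bookkeeping are illustrative assumptions (the sheet layout itself is not modeled, and the underlying scene list is left untouched):

```python
def fill_contact_sheet(segment_bounds, n_max=30):
    """Sketch of the contact-sheet fill rule. `segment_bounds` is a list of
    (start_frame, end_frame) pairs for the detected segments. If more segments
    were detected than fit (M > N), only the first N are used; if fewer
    (N > M), the longest segment is divided into equal pieces so extra
    representative stills can be drawn from it. No new scene breaks are added
    to the underlying content; this only affects what is printed."""
    m, out = len(segment_bounds), []
    if m >= n_max:                               # case M > N: first N only
        return segment_bounds[:n_max]
    longest = max(range(m),
                  key=lambda i: segment_bounds[i][1] - segment_bounds[i][0])
    pieces = n_max - m + 1                       # split the longest this many ways
    for i, (s, e) in enumerate(segment_bounds):
        if i != longest:
            out.append((s, e))
        else:
            step = (e - s + 1) // pieces
            for p in range(pieces):
                ps = s + p * step
                pe = e if p == pieces - 1 else s + (p + 1) * step - 1
                out.append((ps, pe))
    return out
```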
After the key frames are selected, the images can be added to a digital image representative of the contact sheet, and subsequently printed. The album author's name, title, and date of creation can also be added to the digital image prior to printing. A new set of scenes associated with the published product can be included. A user can click on a scene and the video segment corresponding to that scene will be played immediately. Other features include an A-B playback loop and random/shuffle play options. A user interface for a media player associated with the final product is shown in FIG. 6. Labels are printed and attached to the product (CD and jewel case) that include the title of the album or content included. In one implementation, the product (e.g., CD) and contact sheet, along with the original tape (if any), are sent to the user.
Album CDs may be ordered via E-mail, phone, and mail. Upon receipt of payment within the appropriate time period, the product is shipped as described above. Multiple products (e.g., CDs) may be shipped to multiple addresses.
Process for Ordering Video Products
FIG. 3 is a flow diagram of a high-level process 300 of offering digitized video products for sale over a computer network such as the Internet. First, content is received (block 302). For example, a user can send an 8mm tape for processing to the local video processing system 60. The content is digitized (304) and a scene detection process is invoked (306). In one implementation, a user executing a browser on the client computer 80 can access the web front end of the client server 76 and upload a digitized video or other digital content directly to the client server 76, bypassing the local video processing system 60. In this implementation, the client server either invokes a local scene detection process or forwards the digitized video that is received to the local video processing system 60 for scene detection services. Alternatively, the user can upload or otherwise send material that has been previously digitized by the local video processing system (e.g., a published CD). The material is reposted to the web site, after which it is handled as if it had just been digitized by the local video processing system 60.
Then, the local video processing system 60 operates on the digitized video to produce one or more encoded streaming versions of the digitized video (308). The digitized video, scene detection information and encoded streaming versions of the digitized video are then stored in database 74 accessible by client server 76 (310).
The web front end of the client server 76 can be configured to allow the user to view scenes stored in the database 74 and select one or more scenes for subsequent processing or inclusion in an album as described below. More specifically, the user logs in to the website using a video identifier for the content that was previously provided as part of block 302 (312).
FIG. 4a shows one example of a user interface (i.e., the "login page") 400 presented to the user of client computer 80 for accessing digitized video stored at server system 70. The login page includes a video identifier (ID) box 402 for indicating the identifier for the video product that is to be edited/viewed. When a user uploads or otherwise delivers the video content to the system, a video ID is assigned. The video ID is a unique identifier for a particular product. In one implementation, the video ID may be required to be accompanied by a password for security purposes. In addition, the login page includes an interface specification checkbox 404 for specifying the connection type for the current session. The first time a user logs in, they are presented with a wizard (not shown) that gathers basic information including preferred bit stream (required) and other information about them. Information that the customer specifies can be changed later through a preferences page (not shown). On subsequent logins, a user may request that their password be reset to the default or a reminder sent to them. The default password can be the last four digits of the customer's home phone number. After finishing the wizard or after a successful login, the user is redirected to an appropriate page. If a video ID is provided, the user may be directly presented an editing page for the selected video. Alternatively, the user may be presented with a start page as described below, from which the user can select a video for editing. Once selected, the user can edit the raw input (314), select scenes for inclusion in an album (316) and publish the album (318). The publication of the album includes the creation of a high resolution copy of the selected content (or a streaming version). Next, an order is received for the published album (320). For example, the order can be received by the web front end from the user's client computer 80.
The order can be received in other ways including, for example, via electronic mail, Internet Relay Chat, the telephone, and/or the mail. Typically, the order will include information specifying (or otherwise referring or pointing to) a type of product, the published album to incorporate into the product, a quantity of the product being ordered, payment information, and delivery information. After the order has been received, the order is fulfilled (322). For example, the order can be fulfilled by burning or otherwise generating the product (e.g., the DVD or CD) and delivering the product to the customer.
Referring to FIG. 4b, a user interface 410 presented by server system 70 for a start page is shown. The start page includes links to Published Albums (finished albums 411) and Unfinished Albums 413, and can include an Inbox (raw material) page for recently uploaded content. On each start page published albums are listed. Displayed are the title, description, first key frame (representative thumbnail still), and length of each album. By clicking on a key frame a viewer can see a streaming video (e.g., RealVideo) version of the associated album. In one implementation, the key frames are at least 80 x 60 pixels. On each start page appears a list of videos that are available for editing. Title or ID, number of scenes, and length are displayed as a link that leads to an editing page for that video. By clicking on the video or by selecting an edit button, the user is taken to an editing page for the selected video. In addition, the user can directly order an album or delete an album by selecting an appropriate button.
Referring now to FIG. 5 a, a process 500 for editing a video and creating an album is shown. The process begins by selecting an unpublished album or video to be edited
(502). When the selection is made from the start page or other page in the user interface, a storyboard edit page is presented. One example of a user interface 550 presented by server system 70 for a storyboard edit page is shown in FIG. 5b. On this page the video title or ID and length will be displayed along with a key frame (e.g., the individual JPEGs from a JPEGs directory in database 74) from each scene. Controls presented in conjunction with the storyboard edit page can be manipulated by a user to select a scene for subsequent processing or inclusion in an album. A plurality of representative "thumbnail" images for scenes stored in the image database 74 (also referred to here as "thumbnails" or key frames) are displayed in the user interface 550. The user interface includes two frames, one for clips (scenes) 554 and one for a storyboard 556. In one implementation, the clips frame 554 includes individual scenes associated with a single video ID (i.e., raw input scenes). Alternatively, the clip frame 554 can be used to display all of the scenes associated with a previously created album. Each scene includes a thumbnail 552.
The user can select ones of the scenes from the clip frame 554 for inclusion in the storyboard frame 556 (504). The user interface 550 can be configured in a conventional manner so that a user can select a particular scene for inclusion in a storyboard by clicking on the thumbnail 552 associated with that scene. In addition, or instead, the user interface 550 can include a plurality of buttons (or other user interface controls) associated with one or more scenes stored in the database 74; a user can click on (or otherwise actuate) one of the buttons in order to select the one or more scenes associated with that button. The selected scene is then retrieved from the database 74 and used for subsequent processing. The scene can be received and selected in other ways, including, for example, as an attachment to an E-mail or embodied on a storage medium such as DVD or CD-ROM. Once a scene is selected, a preview of the scene can be displayed (506). As described above, when the digitized video is processed by the local video processing system 60, one or more encoded versions of the digitized video are created to allow for streaming video downloads to the user. Streaming video server 78 processes these requests and provides a streaming video preview of the selected scene. In one implementation, by clicking on a key frame on the editing page the user can view a RealVideo version of the selected scene. In one implementation, a Real Player module pops up, plays the scene, then disappears. Alternatively, the Real Player module pops up and plays just that scene but also displays the key frames for the other available scenes. In another implementation, the Real Player area is embedded in the editing page. Clicking on a key frame plays just that scene. Referring to FIG. 5c, a user interface 580 presented by streaming video server 78 is shown. The user interface 580 includes a streaming video frame 582, a scene frame 584 and one or more controls. The streaming video frame 582 displays the encoded video stream associated with the selected scene. The streaming video frame can include one or more controls for manipulating or otherwise editing the selected scene. For example, in one implementation, the streaming video frame 582 includes controls for cropping the selected frame to reduce the size/duration of the selected video clip. Scene frame 584 includes plural thumbnails 552 associated with the scenes presented in the clip frame of user interface 550. In one implementation, a new scene can be selected from the scene frame 584 and displayed in the video streaming frame 582 by the user selecting a desired scene. In one implementation, various controls are included on the user interface 580 for editing or otherwise managing the selected scene. Controls can include a tracking control for changing the video tracking, an extraction control for selecting a frame or portion of a scene and a deletion control for deleting a scene from the clip frame 554.
At any time (either before or after preview), a scene can be selected to be added to the storyboard frame 556 (508). To do so, the user can actuate a control (add control 558) presented under a selected scene in the clip frame 554. The user can also deselect scenes included in the storyboard frame 556. To do so, the user can actuate a control (delete control 560) presented under a selected scene in the storyboard frame 556. In addition, the user can reorganize selected frames. To do so, the user can actuate a control (arrow controls 562 and 564) presented under a selected scene in the clip frame 554.
Other content can be added to a storyboard. In one implementation, user interface 550 includes an upload button 567 that can be selected to upload other digital content (e.g., digital stills) or other scenes. By invoking the upload button 567, an upload page can be presented that includes raw scenes, or options for uploading other digital content from the client computer or server computer. A user can add scenes from more than one video to an album. In one implementation, the editing page will not allow a user to create an album that is longer in length than 60 minutes, as this is the maximum length that can be burned onto a conventional CD.
When the user has completed adding scenes to the storyboard frame 556, customized options can be included (510). User interface 550 includes an options button 559 that can be selected by the user to customize an album. By clicking on this button, the user will be taken to a page where they can edit the title and description of the album, create a custom title screen, and set the security level of the album (private, shared, or public). If the album is designated as shared, the user may create or change a password associated with the album, or delete it. If the password or security level is changed on a published album, an E-mail can be sent to the user with the album ID, password, and guest login instructions.
A user interface 590 for an options page is shown in FIG. 5d. User interface 590 can include plural buttons for customizing the album. In the implementation shown, buttons include a change theme button 591, an add music button 592, an add titles button 593 and a change permissions button 594. The change theme button 591 can be invoked by a user to add frames, artwork, borders, fonts or other stylized features to the album. The add music button 592 can be invoked by the user to add a music track to the album to be mixed with or supersede the existing audio track. The add titles button 593 can be invoked to add title pages to the album to introduce the album or individual scenes. The change permissions button 594 can be invoked to change the access rights to published albums. In addition, the user interface 590 can include a change title box 595 that can be used to change the title of the album.
After any customization has been added, the user can preview the album (512). In one implementation, the user can preview the album by selecting a preview button 565 on the user interface 550. The preview function includes the display of the entire album using the streaming video scenes that are selected for inclusion in the album. All title pages that are included are rendered and the entire finished product is displayed.
At any time, the user can publish the album, indicating that the album is finished (514). The publication process includes the generation of a high resolution or streaming version of the storyboard and all included options. The high resolution version is stored in database 74. In one implementation, the user can publish the album by selecting a finish button 563 in the user interface 550. When finished, the user will be prompted to indicate whether the album should be published or remain unpublished. When the user chooses to publish the album, the entries are assimilated and the final videos are created. The following steps are asynchronous; that is, the user is not required to stay online waiting for the steps to complete. Instead, a message will appear stating that the user is free to move about the site and will be notified via E-mail when the album is published and ready for viewing. Steps to be completed include extracting new stills for any trimmed scenes, creation of a customized title screen, creation of a final streaming version of the album (which is moved to the appropriate directory), and creation of the final high resolution version of the storyboard. In one implementation, a final MPEG-1 and CD directory structure is created and stored at the processing site. The final high resolution version can include proprietary production screens and customized title screens. A contact sheet is also created and stored at the processing site. When all publishing steps are complete, an E-mail is sent to the user with specifics on how to order a product (a CD of the album) and an indication that the album must be ordered within a specified time period. If the album is marked as shared, this E-mail includes an album ID, password, and instructions for guest login. If an album is designated as public, a URL is generated for that album and sent via E-mail to the user. That URL will then point directly to the public album. In one implementation, a check is made to determine if the addition of the published album will violate the user's maximum storage space allocated in the database 74. If so, the album is not published; instead, the user is directed to delete existing albums to create space for the new one.
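The storage check described above reduces to comparing the new album's size against the user's remaining allocation before committing the publication. A minimal sketch, with hypothetical names and units (the patent does not specify how sizes are measured):

```python
def can_publish(album_size, used, quota):
    """Return True only if adding the album stays within the user's
    storage allocation; otherwise the album is not published and the
    user must delete existing albums to create space."""
    return used + album_size <= quota

# Sizes here are illustrative (e.g., megabytes).
print(can_publish(album_size=700, used=1500, quota=2000))  # -> False
print(can_publish(album_size=400, used=1500, quota=2000))  # -> True
```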
After publication, the user can invite friends to view or order an album (516). In one implementation, the user interface 550 includes an invite button 569 that can be invoked to create an E-mail to be distributed, with instructions, to one or more invitees to view the published album. When an album is designated as shared, the user receives an E-mail with the album ID, password and instructions for viewing. The user can then forward this information to their guests. On the guests' login page, the guest will enter an album ID and a password associated with the album, as well as their preferred bit stream. The guest will then be taken to a screen that shows the key frames of the published album and instructions on how to view the album. In one implementation, when a guest clicks on a key frame, a Real Player pops up and plays the album. In one implementation, the server system 70 will include functionality to set the number of guest views of an album. If an album is designated as public, a URL is generated for that album and sent via E-mail to the user. That URL will then point directly to the public album. This completes the album creation and publication process.
Referring to FIG. 5e, in one implementation a third frame, a collection frame 570, is included in the user interface 550. The collection frame presents albums that have been previously compiled by the user. Each album has associated with it a single representative thumbnail which can be viewed by the user. An album from the collection frame 570 can be selected by the user. Each scene in the selected album is then displayed in the clip frame 554. Scenes from the selected album can be included in the storyboard as described above.
Referring to FIG. 5f, in one implementation the editing page includes a user interface 595 that presents a storyboard 556, a video selection button 557, a preview button 565 and a finished button 563.
The storyboard 556 includes scenes and controls (trim control 593, delete control 595 and shift control 583). The trim control 593 allows a user to trim an individual scene. A user interface presented for trimming a scene is shown in FIG. 5i. The user interface includes controls for selecting an amount of time to trim from the beginning and end of the scene, along with preview functions. Referring again to FIG. 5f, selection of the delete control 595 will delete the selected scene from the storyboard. The user can manipulate the shift control 583 to reorganize scenes in a storyboard.
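Trimming in this sense only moves a scene's in and out points; the archived video itself is untouched. A minimal sketch, assuming scenes are represented as (start, stop) offsets in seconds (a representation not specified by the patent):

```python
def trim(scene, head=0.0, tail=0.0):
    """Return new (start, stop) offsets for a scene after removing
    `head` seconds from its beginning and `tail` seconds from its end."""
    start, stop = scene
    new_start, new_stop = start + head, stop - tail
    if new_start >= new_stop:
        raise ValueError("trim amounts exceed scene length")
    return (new_start, new_stop)

# Trim 5 s from the front and 15 s from the back of a 60 s scene.
print(trim((10.0, 70.0), head=5.0, tail=15.0))  # -> (15.0, 55.0)
```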
When the selection button 557 is actuated, a list of other videos that can be reviewed is presented. An example of the user interface 597 shown when actuating the selection button 557 is shown in FIG. 5g. Thereafter, a user can select one of the available videos for review by selecting a view button 598. The user interface 599 presented when selecting the view button is shown in FIG. 5h. A user can select individual scenes for inclusion in storyboard 556 by selecting a box 589 and the stay or go buttons 587 and 585. A reset button is provided to clear the selections from boxes 589. The stay button 587 causes the selected scenes to be added to the storyboard while the user interface remains on the current page. The go button 585 adds the scenes and transports the user back to the edit page (e.g., user interface 595).

Operational Flow
Referring now to FIG. 7, an operational flow 700 for system 100 is shown. The operational flow includes three loops: an archiving loop 702, a production loop 704 and a sharing loop 706.
In the archiving loop, a customer provides content to be processed by the system. The system digitizes and encodes the content and provides a value-added function of scene detection. High quality digitized content is stored, and a low resolution version is passed to the server system for viewing and editing by the customer. More specifically, a customer 710 provides a tape for processing (i.e., video tape source acquisition 712), after which digitization and encoding operations 714 are initiated. The digitized and encoded original content is stored locally 716 for future processing (in the production cycle). The high quality encoded and digitized video is converted 718 to a streaming format. The high quality encoded and digitized video is also processed by a scene detection system to detect individual segments in the original content 719.
In the production loop, the scene detected streaming version of the high resolution content can be edited and organized into an album that can be published. The scene detection information and streaming format data for any received tape is provided to the database server 724. The database server stores the streaming format content and scene detection information in a local storage 726. The customer accesses a browser 730 to view the content. A web server 720 presents the user with web-based editing tools 732 for manipulating the scene detected content as described above to form an album of content. The user can also view scenes by accessing a streaming server 722. The web server 720 can also allow the customer to access other content that can be included in an album, for example, from a multimedia jukebox 734.
Alternatively, content can be directly provided by a third party to the server system, bypassing the archiving loop. That is, the customer can access a strategic partner 750, for example through the Internet, and provide content to the strategic partner. The strategic partner can intake and process content and provide content through middleware 752 to the server system for inclusion in a customer album. The third party can provide uploaded content (this content goes into the upload loop) and customer support (e.g., the third party has its own web site and provides support to the customers). When a customer orders a service, the tape can be sent to the third party or directly to the server system. After processing, material can be posted through the third party website.
In each case, the customer produces an album as a finished product in the production loop. When the customer has completed the editing process, the album is published 740. An edit list is produced that describes the final published content. A mastering and duplication service can produce a CD from the edit list. Alternatively, the album can be published on the Web and cast to one or more recipients.
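An edit list of this kind can be as simple as an ordered set of (start, stop) spans referencing the archived originals, which a mastering service replays against the high quality content. A hypothetical sketch (the identifiers and the seconds-based representation are illustrative, not the system's actual format):

```python
from dataclasses import dataclass

@dataclass
class Edit:
    source: str   # identifier of an archived tape or clip
    start: float  # seconds into the source
    stop: float   # seconds into the source

# The published album is described entirely by references into the
# archived originals; no video data travels with the list itself.
edit_list = [
    Edit("tape-001", 12.0, 45.5),
    Edit("tape-001", 120.0, 180.0),
    Edit("tape-002", 0.0, 30.0),
]

def total_runtime(edits):
    """Runtime of the finished product implied by the edit list."""
    return sum(e.stop - e.start for e in edits)

print(total_runtime(edit_list))  # -> 123.5
```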
In the sharing loop, a customer is directed to a particular album that can be viewed. The customer accesses the album and can in turn edit portions, order copies and share the content, just as the original producer of the album can.
Production Flow for a CD
Referring now to FIG. 8, a production flow 800 for the system is shown. The operation begins when the customer places an order for an album 802. The customer accesses the web server home page 804 and traverses to an order page 806. At the order page, the customer identifies a product to be ordered and is presented with pricing, shipping and other information. After an order has been completed, a confirmation page is presented to confirm the order details 808. If confirmed, the customer is thanked 810 and sent a confirming e-mail 812 that is received by the customer 814.
A mailer kit is sent to the customer 816, received by the customer 818 and sent back with a tape to the production center 820. The production center receives the mailer from the customer 822 and generates an email 824 that is sent to the customer 826 indicating that the mailer (e.g., tape) has been received. The production system identifies the kit from, for example, a bar code on the mailer, and associates the mailer kit with an order 828. The tape is digitized and encoded 830 to create the high resolution content. Thereafter, streaming versions of the digitized content are produced 832 and files are moved to the web server 834 and archived 835. An email is generated 836 and sent to the customer 838 notifying the customer that the digitized content is available for editing at the website. The customer can manipulate/edit the content 840 and then send a notification to the production system to publish an album 842. The production system retrieves the high quality content from the archive and produces data to be included in the CD as described above 850. The CD is burned 852 and a quality assurance test is performed 854. A back-up CD can be created 856. A contact sheet is created and printed 858 along with a CD label 860. The label is attached to the finished CD 862 and shipped to the customer 864. A video identifier with tracking information for the CD is entered into a database 866 and a confirmation email is generated 868. The confirmation email is sent to the customer 870 and, if all goes well, the finished product is received by the customer 872.
Alternative Implementations
In one implementation, the website includes a master table of contents for aggregating content from multiple sources. The master table of contents links information from local storage, a home page and other URLs for inclusion in an album.
In one implementation, in addition to the video products produced, a CD quality still image can be extracted from the original high quality digitized video source. The still images can be distributed as any form of image-based product including mugs, prints, t-shirts or distributed using E-mail.
In another implementation, published albums and other content can be viewed using a Multimedia Jukebox™. The Multimedia Jukebox™ can be used to organize videos, audio, text, images all in one place with one-click access.
In another implementation, rather than detecting scene breaks, other temporal indices can be used to trigger segmentation of the input content. Character recognition, subject recognition, voice recognition or other technologies can be employed to divide the input content. For example, the input content can be divided upon detection of a change in speech pattern (e.g., a different person speaking, or upon detection of the beginning of a speech) or upon the appearance or disappearance of an individual from a scene.
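The segmentation approaches above share one shape: scan the stream and cut wherever some detector fires between consecutive frames. A minimal sketch with hypothetical names; the detector here is a simple per-frame feature-difference threshold standing in for the color, motion, or speech detectors the text mentions:

```python
def segment(frames, is_boundary):
    """Split a frame sequence into scenes wherever the supplied
    detector reports a transition between consecutive frames."""
    scenes, current = [], [frames[0]]
    for prev, cur in zip(frames, frames[1:]):
        if is_boundary(prev, cur):
            scenes.append(current)
            current = []
        current.append(cur)
    scenes.append(current)
    return scenes

def color_jump(prev, cur, threshold=50):
    # Stand-in detector: a large jump in a per-frame feature
    # (e.g., average color) exceeding a preset threshold.
    return abs(cur - prev) > threshold

# Frames reduced to a single illustrative feature value each.
frames = [10, 12, 11, 90, 92, 15, 14]
print(segment(frames, color_jump))  # -> [[10, 12, 11], [90, 92], [15, 14]]
```

Because the detector is passed in as a predicate, the same scan works unchanged whether the trigger is a color difference, a motion difference, or a change of speaker.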
In one implementation, the system can create (either automatically or under the direction of the user) an edit list that defines a series of start and stop points in the video content. Associated with each edit list can be one or more filters or other effects to be applied to the segment. In one implementation, the system allows the user to create a playlist. A playlist editor can be used to guide the display of a particular sequence of content. For example, each album can have an associated table of contents that describes all of the content in the album. A playlist can be created for the album that sequences through some or all of the content in the album in the order specified by the user. In one implementation, a playlist player is included in the content provided to each of the distributees. The playlist player receives the playlist as an input and plays the contents of the album in accordance with the instructions contained therein. The playlist allows the user to select only the segments they want to see, in the order they choose. In one implementation, in addition to the selection of content, the playlist editor can include tools for allowing the user to edit or otherwise manipulate the content of an album. For example, the playlist editor can include tools for trimming segments, applying filters to segments, or performing other editing functions such as applying titles or other special effects. The playlist editor allows the user to trim the length of any included segment, show the same segment multiple times, and add titles, special effects, etc., to display a personalized version of the content. Others can select their own playlists, which can be shared as they like.
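A playlist in this sense is just an ordering (with possible repeats and per-entry additions such as titles) over an album's table of contents, which the playlist player steps through. A sketch under assumed names and a deliberately toy data model:

```python
# Table of contents: scene id -> content (stand-ins for video segments).
album = {
    "s1": "opening",
    "s2": "birthday",
    "s3": "beach",
}

# The same segment may appear multiple times and in any order,
# optionally with a per-entry addition such as a title.
playlist = [
    {"scene": "s2", "title": "The Party"},
    {"scene": "s3"},
    {"scene": "s2"},  # repeated segment
]

def play(album, playlist):
    """Play the album's contents in the order the playlist dictates."""
    shown = []
    for entry in playlist:
        if "title" in entry:
            shown.append("TITLE: " + entry["title"])
        shown.append(album[entry["scene"]])
    return shown

print(play(album, playlist))
# -> ['TITLE: The Party', 'birthday', 'beach', 'birthday']
```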
In one implementation, the playlist editor includes tools for performing all of the editing functions described above with the web-based editing system. Playlists can be uploaded to the central system to allow the user to distribute personalized content to different distributees. For example, digitized content can be shared with everyone: once one or more personalized playlists are produced, the playlist(s) can be uploaded and then processed as if the order had been prepared on-line. In one implementation, the editing page presented in the user interface can include a "share" button. By invoking the share button, a user can order new physical media (e.g., CDs) of a personalized nature.
Uploading is simple and quick, since the playlist player only sends the central system a small text file (i.e., the playlist information). In this implementation, the edited CDs are just as high quality as the original physical media produced.
In another implementation, the physical media may include video tapes. A user can specify the format of the output content that is to be delivered to the distributees. For example, a VHS copy of an album can be ordered for those without access to a CD-ROM drive.
In one implementation, rather than receiving a physical manifestation that includes the digitized video content, the user may order an Internet webcast. In one implementation, the webcast can be up to 5 minutes for up to 20 viewings with additional length and viewings available at minimal extra cost.
As described above, the process of delivering content to the distributees can include a sharing option. For example, once the user creates an album, the user has a number of choices to make. Choices include the format of the delivery (CD-R, DVD, online broadcast, etc.), the distribution list (with whom the user wants to share content), the distribution means (the system can provide copies to the user or distribute the copies to each recipient), and storage options. In one implementation, the user can save album scenes and other content (playlists, edit lists, stills and other content) in a personal multimedia jukebox.
The present invention has been described in terms of specific embodiments, which are illustrative of the invention and not to be construed as limiting. Other embodiments are within the scope of the following claims.

Claims

WHAT IS CLAIMED:
1. A method for producing a video disc, comprising: acquiring video data from a source; if the video data is not digitized, then digitizing the video data; generating scene indexes for the video data, including a representative still image for each scene; and combining the video data and scene indexes along with a media player on a video disc, the media player operable to play the video data in accordance with the scene indexes, including playing a scene from the video data on a client computer while displaying the representative stills for others of the scenes available for display on the video disc.
2. The method of claim 1 wherein the step of acquiring includes capturing the video data from an analog source.
3. The method of claim 1 wherein the step of acquiring includes capturing the video data from a digital source.
4. The method of claim 1 wherein the step of generating scene indexes includes detecting a transition between consecutive frames in the video data; determining when the transition indicates a scene break; and marking the end of the previous scene and a beginning of a new scene at a point in time that corresponds to the initially detected transition.
5. The method of claim 4 wherein the step of detecting a transition includes detecting a color difference between the frames.
6. The method of claim 5 wherein the step of detecting a transition includes determining if a difference between frames exceeds a preset threshold.
7. The method of claim 5 further comprising cropping one or more of the frames prior to the comparison to eliminate effects from the boundary of the image frame.
8. The method of claim 4 wherein the step of detecting a transition includes detecting a motion difference between the frames.
9. The method of claim 8 wherein the step of detecting a transition includes determining if a difference between frames exceeds a preset threshold.
10. The method of claim 8 further comprising cropping one or more of the frames prior to the comparison to eliminate effects from the boundary of the image frame.
11. The method of claim 4 wherein the step of detecting a transition includes detecting a color and motion difference between the frames.
12. The method of claim 11 wherein the step of detecting a transition includes determining if a difference between frames exceeds a preset threshold.
13. The method of claim 11 further comprising cropping one or more of the frames prior to the comparison to eliminate effects from the boundary of the image frame.
14. The method of claim 4 wherein the step of determining when a transition indicates a scene break includes comparing plural frames to a last frame thought to be part of a preceding scene; if for each frame a transition is detected, then determining that a scene break is indicated; and otherwise, determining that no scene break is indicated and continuing to search for a next scene break.
15. The method of claim 1 wherein the step of generating representative stills for each scene includes selecting a first frame from each scene.
16. The method of claim 1 wherein the step of generating representative stills for each scene includes selecting a frame from an introductory group of frames from each scene.
17. The method of claim 1 wherein the step of selecting a frame includes determining a color distribution for plural frames in a scene and selecting a frame from the introductory group that is a best match to the determined color distribution.
18. The method of claim 1 further comprising creating a contact sheet for distribution with the video disc that includes each representative still for the scenes detected on the video disc.
19. The method of claim 1 wherein the video disc is a compact disc.
20. The method of claim 1 wherein the video disc is a digital video disc.
21. A method for producing a video based product, comprising: acquiring video data; generating temporal indices, including analyzing the video data to detect the temporal indices, the temporal indices indicating a division of the video data into distinct segments; providing a media player operable to play the video data on a client computer in accordance with the temporal indices; and packaging the video data, the temporal indices and the media player on a physical medium for delivery to the client computer.
22. The method of claim 20 further comprising digitizing the video data prior to packaging the video data.
23. The method of claim 20 further comprising generating representative stills for one or more segments.
24. The method of claim 22 wherein the media player is operable to display one or more of the representative stills while playing the video data on the client computer.
25. The method of claim 22 further comprising generating representative stills for each segment.
26. The method of claim 20 further comprising: providing a media editor operable to generate one or more edit lists, each edit list defining a set of operations to be performed on the video data by another computer, so as to allow editing operations defined on one computer to be replicated on the video data on another computer; editing the video data in accordance with the edit lists on the other computer; and distributing the edited video to user designated distributees.
27. The method of claim 25 wherein the packaging step includes producing a physical manifestation of the content to be delivered.
28. The method of claim 26 wherein the physical manifestation is a video disc.
29. The method of claim 25 wherein the packaging step includes producing a streaming version of the video data in accordance with the temporal indices.
30. The method of claim 28 wherein the packaging step includes receiving a request to webcast the streaming version of the video data; and streaming the streaming version to a requestor.
31. The method of claim 25 wherein the distribution of the edited video includes providing the edit list to a central distribution site, generating the edited video at the central distribution site and directly delivering the edited video to the distributees.
PCT/US2001/019130 2000-06-16 2001-06-14 Video processing system WO2001099403A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2002504123A JP4942276B2 (en) 2000-06-16 2001-06-14 Video processing system
AU2001268432A AU2001268432A1 (en) 2000-06-16 2001-06-14 Video processing system
EP01946372A EP1310086B1 (en) 2000-06-16 2001-06-14 Video processing system
DE60143663T DE60143663D1 (en) 2000-06-16 2001-06-14 VIDEO PROCESSING SYSTEM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/595,615 2000-06-16
US09/595,615 US6882793B1 (en) 2000-06-16 2000-06-16 Video processing system

Publications (3)

Publication Number Publication Date
WO2001099403A2 true WO2001099403A2 (en) 2001-12-27
WO2001099403A9 WO2001099403A9 (en) 2002-07-18
WO2001099403A3 WO2001099403A3 (en) 2002-10-10

Family

ID=24383971

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/019130 WO2001099403A2 (en) 2000-06-16 2001-06-14 Video processing system

Country Status (6)

Country Link
US (5) US6882793B1 (en)
EP (1) EP1310086B1 (en)
JP (1) JP4942276B2 (en)
AU (1) AU2001268432A1 (en)
DE (1) DE60143663D1 (en)
WO (1) WO2001099403A2 (en)

US20060200745A1 (en) * 2005-02-15 2006-09-07 Christopher Furmanski Method and apparatus for producing re-customizable multi-media
JP4588642B2 (en) * 2005-03-15 2010-12-01 富士フイルム株式会社 Album creating apparatus, album creating method, and program
US7596751B2 (en) * 2005-04-22 2009-09-29 Hewlett-Packard Development Company, L.P. Contact sheet based image management
US20060271855A1 (en) * 2005-05-27 2006-11-30 Microsoft Corporation Operating system shell management of video files
US20070088844A1 (en) * 2005-06-07 2007-04-19 Meta Interfaces, Llc System for and method of extracting a time-based portion of media and serving it over the Web
US9432710B2 (en) * 2005-07-08 2016-08-30 At&T Intellectual Property I, L.P. Methods systems, and products for conserving bandwidth
US7739599B2 (en) * 2005-09-23 2010-06-15 Microsoft Corporation Automatic capturing and editing of a video
US20070101394A1 (en) * 2005-11-01 2007-05-03 Yesvideo, Inc. Indexing a recording of audiovisual content to enable rich navigation
US9467322B2 (en) * 2005-12-27 2016-10-11 Rovi Solutions Corporation Methods and apparatus for integrating media across a wide area network
US8607287B2 (en) 2005-12-29 2013-12-10 United Video Properties, Inc. Interactive media guidance system having multiple devices
US9681105B2 (en) 2005-12-29 2017-06-13 Rovi Guides, Inc. Interactive media guidance system having multiple devices
US20070154171A1 (en) * 2006-01-04 2007-07-05 Elcock Albert F Navigating recorded video using closed captioning
US20070154176A1 (en) * 2006-01-04 2007-07-05 Elcock Albert F Navigating recorded video using captioning, dialogue and sound effects
US20070226432A1 (en) * 2006-01-18 2007-09-27 Rix Jeffrey A Devices, systems and methods for creating and managing media clips
JP4797653B2 (en) * 2006-01-31 2011-10-19 フリュー株式会社 Image providing apparatus and method, and program
US9032297B2 (en) * 2006-03-17 2015-05-12 Disney Enterprises, Inc. Web based video editing
US20080065990A1 (en) * 2006-03-22 2008-03-13 Harrison Fredrick W Integrated product branding method
CN101867679B (en) * 2006-03-27 2013-07-10 三洋电机株式会社 Thumbnail generating apparatus and image shooting apparatus
US20070233741A1 (en) * 2006-03-31 2007-10-04 Masstech Group Inc. Interface for seamless integration of a non-linear editing system and a data archive system
US20070253675A1 (en) * 2006-04-28 2007-11-01 Weaver Timothy H Methods, systems, and products for recording media
US7647464B2 (en) * 2006-04-28 2010-01-12 At&T Intellectual Property, I,L.P. Methods, systems, and products for recording media to a restoration server
US8805164B2 (en) 2006-05-24 2014-08-12 Capshore, Llc Method and apparatus for creating a custom track
US20080008440A1 (en) * 2006-05-24 2008-01-10 Michael Wayne Shore Method and apparatus for creating a custom track
US20080002942A1 (en) * 2006-05-24 2008-01-03 Peter White Method and apparatus for creating a custom track
US20070274683A1 (en) * 2006-05-24 2007-11-29 Michael Wayne Shore Method and apparatus for creating a custom track
US8831408B2 (en) 2006-05-24 2014-09-09 Capshore, Llc Method and apparatus for creating a custom track
US7929551B2 (en) * 2006-06-01 2011-04-19 Rovi Solutions Corporation Methods and apparatus for transferring media across a network using a network interface device
US7945142B2 (en) * 2006-06-15 2011-05-17 Microsoft Corporation Audio/visual editing tool
US8392947B2 (en) * 2006-06-30 2013-03-05 At&T Intellectual Property I, Lp System and method for home audio and video communication
US8667540B2 (en) * 2006-07-07 2014-03-04 Apple Partners, Lp Web-based video broadcasting system having multiple channels
US8577204B2 (en) * 2006-11-13 2013-11-05 Cyberlink Corp. System and methods for remote manipulation of video over a network
US8375302B2 (en) * 2006-11-17 2013-02-12 Microsoft Corporation Example based video editing
US9417758B2 (en) * 2006-11-21 2016-08-16 Daniel E. Tsai AD-HOC web content player
US20080215984A1 (en) * 2006-12-20 2008-09-04 Joseph Anthony Manico Storyshare automation
US20080155422A1 (en) * 2006-12-20 2008-06-26 Joseph Anthony Manico Automated production of multiple output products
US7711733B2 (en) * 2007-02-07 2010-05-04 At&T Intellectual Property I,L.P. Methods, systems, and products for targeting media for storage to communications devices
US7650368B2 (en) * 2007-02-07 2010-01-19 At&T Intellectual Property I, L.P. Methods, systems, and products for restoring electronic media
US9083938B2 (en) * 2007-02-26 2015-07-14 Sony Computer Entertainment America Llc Media player with networked playback control and advertisement insertion
KR101120017B1 (en) * 2007-02-27 2012-03-26 삼성전자주식회사 Method and apparatus for recording optical disc
US10127480B1 (en) 2007-03-09 2018-11-13 R. B. III Associates, Inc. System for automated decoration
US20080235588A1 (en) * 2007-03-20 2008-09-25 Yahoo! Inc. Media player playlist creation and editing within a browser interpretable document
US20080235580A1 (en) * 2007-03-20 2008-09-25 Yahoo! Inc. Browser interpretable document for controlling a plurality of media players and systems and methods related thereto
JP2008262280A (en) * 2007-04-10 2008-10-30 Sony Corp Information processing system, information processor, server device, information processing method and program
US9146925B2 (en) * 2007-05-04 2015-09-29 Manuel Ignacio Tijerino User defined internet jukebox kiosks set top box
US7778973B2 (en) * 2007-05-18 2010-08-17 Tat Kuen Choi System, method, and program for sharing photos via the internet
WO2008143167A1 (en) * 2007-05-21 2008-11-27 Mitsubishi Electric Corporation Image difference detection method and apparatus, scene change detection method and apparatus, as well as image difference value detection method and apparatus
US20090019492A1 (en) * 2007-07-11 2009-01-15 United Video Properties, Inc. Systems and methods for mirroring and transcoding media content
US20090077556A1 (en) * 2007-09-19 2009-03-19 Martin Kay Nohr Image media modifier
US20090106807A1 (en) * 2007-10-19 2009-04-23 Hitachi, Ltd. Video Distribution System for Switching Video Streams
US8768137B2 (en) * 2007-12-14 2014-07-01 Microsoft Corporation Program segments display bar
KR20090093105A (en) * 2008-02-28 2009-09-02 삼성전자주식회사 Content playing apparatus and method
KR20090098247A (en) * 2008-03-13 2009-09-17 삼성전자주식회사 Image processing apparatus, image processing system having image processing apparatus and control method thereof
US20090249406A1 (en) * 2008-03-31 2009-10-01 Broadcom Corporation Mobile video device with enhanced video navigation
US8079054B1 (en) * 2008-04-14 2011-12-13 Adobe Systems Incorporated Location for secondary content based on data differential
US8307395B2 (en) * 2008-04-22 2012-11-06 Porto Technology, Llc Publishing key frames of a video content item being viewed by a first user to one or more second users
US8601526B2 (en) 2008-06-13 2013-12-03 United Video Properties, Inc. Systems and methods for displaying media content and media guidance information
US10282391B2 (en) 2008-07-03 2019-05-07 Ebay Inc. Position editing tool of collage multi-media
US8365092B2 (en) 2008-07-03 2013-01-29 Ebay Inc. On-demand loading of media in a multi-media presentation
US8893015B2 (en) 2008-07-03 2014-11-18 Ebay Inc. Multi-directional and variable speed navigation of collage multi-media
US8320738B2 (en) * 2008-07-17 2012-11-27 Indata Corporation Video management system and method
CN102099802B (en) * 2008-07-28 2015-05-27 索尼公司 Client device, information processing system and associated methodology of accessing networked services
JP5499457B2 (en) * 2008-10-21 2014-05-21 富士通モバイルコミュニケーションズ株式会社 Terminal device
US9442933B2 (en) * 2008-12-24 2016-09-13 Comcast Interactive Media, Llc Identification of segments within audio, video, and multimedia items
US8713016B2 (en) 2008-12-24 2014-04-29 Comcast Interactive Media, Llc Method and apparatus for organizing segments of media assets and determining relevance of segments to a query
US11531668B2 (en) * 2008-12-29 2022-12-20 Comcast Interactive Media, Llc Merging of multiple data sets
TW201031208A (en) * 2009-02-06 2010-08-16 Wistron Corp Media management device, system and method thereof
JP2010231771A (en) * 2009-03-05 2010-10-14 Fujifilm Corp Data reproduction device, content delivery system, and content delivery method
US8176043B2 (en) 2009-03-12 2012-05-08 Comcast Interactive Media, Llc Ranking search results
US8681239B2 (en) 2009-04-07 2014-03-25 Panasonic Corporation Image capturing device, image capturing method, program, and integrated circuit
US8533223B2 (en) 2009-05-12 2013-09-10 Comcast Interactive Media, LLC. Disambiguation and tagging of entities
US8494341B2 (en) * 2009-06-30 2013-07-23 International Business Machines Corporation Method and system for display of a video file
US9892730B2 (en) 2009-07-01 2018-02-13 Comcast Interactive Media, Llc Generating topic-specific language models
US9014546B2 (en) 2009-09-23 2015-04-21 Rovi Guides, Inc. Systems and methods for automatically detecting users within detection regions of media devices
CA2881453A1 (en) 2010-01-26 2011-08-04 Touchtunes Music Corporation Digital jukebox device with improved user interfaces, and associated methods
EP2384001A1 (en) * 2010-04-29 2011-11-02 Alcatel Lucent Providing of encoded video applications in a network environment
US8423555B2 (en) 2010-07-09 2013-04-16 Comcast Cable Communications, Llc Automatic segmentation of video
US8806340B2 (en) * 2010-09-01 2014-08-12 Hulu, LLC Method and apparatus for embedding media programs having custom user selectable thumbnails
US20130227416A1 (en) * 2011-01-06 2013-08-29 Edward Massena Device for logging, editing and production of video programs for activities of local interest
US9201571B2 (en) 2011-01-06 2015-12-01 It's Relevant, LLC Logging, editing and production system for activities of local interest and related video
KR101770298B1 (en) * 2011-01-11 2017-08-22 삼성전자주식회사 digital image photographing apparatus and method for controling the same
US9792363B2 (en) * 2011-02-01 2017-10-17 Vdopia, INC. Video display method
US20130031589A1 (en) * 2011-07-27 2013-01-31 Xavier Casanova Multiple resolution scannable video
US10467289B2 (en) 2011-08-02 2019-11-05 Comcast Cable Communications, Llc Segmentation of video according to narrative theme
US20130036442A1 (en) * 2011-08-05 2013-02-07 Qualcomm Incorporated System and method for visual selection of elements in video content
US9185152B2 (en) * 2011-08-25 2015-11-10 Ustream, Inc. Bidirectional communication on live multimedia broadcasts
US20130132843A1 (en) * 2011-11-23 2013-05-23 BenchFly Inc. Methods of editing personal videographic media
US9565476B2 (en) * 2011-12-02 2017-02-07 Netzyn, Inc. Video providing textual content system and method
US8805418B2 (en) 2011-12-23 2014-08-12 United Video Properties, Inc. Methods and systems for performing actions based on location-based rules
US9100668B2 (en) * 2012-05-31 2015-08-04 Matthew Nathan Lehrer Graphics correction engine
US9137428B2 (en) 2012-06-01 2015-09-15 Microsoft Technology Licensing, Llc Storyboards for capturing images
US9304823B2 (en) * 2012-07-17 2016-04-05 Adobe Systems Incorporated Method and apparatus for optimizing download operations
US9082092B1 (en) * 2012-10-01 2015-07-14 Google Inc. Interactive digital media items with multiple storylines
KR101328199B1 (en) * 2012-11-05 2013-11-13 넥스트리밍(주) Method and terminal and recording medium for editing moving images
US9081521B2 (en) 2012-11-09 2015-07-14 Xerox International Partners Networked printing systems for providing a la carte reproduction services
US9286016B2 (en) 2012-11-09 2016-03-15 Xerox International Partners Networked printing systems
US20140132970A1 (en) * 2012-11-09 2014-05-15 Xerox International Partners Networked printing systems
US9195413B2 (en) 2012-11-09 2015-11-24 Xerox International Partners Networked printing systems
US20160035392A1 (en) * 2012-11-22 2016-02-04 Didja, Inc. Systems and methods for clipping video segments
CN105659618B (en) * 2012-12-19 2019-06-28 交互数字Ce专利控股公司 Method and apparatus for being detected automatically to image/video resolution ratio and its color double sampling
US9106720B1 (en) 2013-01-02 2015-08-11 Amazon Technologies, Inc. Personalized smart-list video channels
US20140226955A1 (en) * 2013-02-12 2014-08-14 Takes Llc Generating a sequence of video clips based on meta data
US9538232B2 (en) * 2013-03-14 2017-01-03 Verizon Patent And Licensing Inc. Chapterized streaming of video content
FR3004054A1 (en) * 2013-03-26 2014-10-03 France Telecom GENERATING AND RETURNING A FLOW REPRESENTATIVE OF AUDIOVISUAL CONTENT
US9674563B2 (en) 2013-11-04 2017-06-06 Rovi Guides, Inc. Systems and methods for recommending content
US10867635B2 (en) * 2013-11-11 2020-12-15 Vimeo, Inc. Method and system for generation of a variant video production from an edited video production
US9449231B2 (en) * 2013-11-13 2016-09-20 Aol Advertising Inc. Computerized systems and methods for generating models for identifying thumbnail images to promote videos
US20150222686A1 (en) * 2014-02-06 2015-08-06 Elijah Aizenstat System and a method for sharing interactive media content by at least one user with at least one recipient over a communication network
WO2015148727A1 (en) * 2014-03-26 2015-10-01 AltSchool, PBC Learning environment systems and methods
US9635337B1 (en) * 2015-03-27 2017-04-25 Amazon Technologies, Inc. Dynamically generated media trailers
US20170076752A1 (en) * 2015-09-10 2017-03-16 Laura Steward System and method for automatic media compilation
US20170085941A1 (en) * 2015-09-23 2017-03-23 Rovi Guides, Inc. Systems and methods to detect events in programming from multiple channels
US10158904B2 (en) 2015-09-23 2018-12-18 Rovi Guides, Inc. Systems and methods to combine programming from multiple channels
US10708673B2 (en) 2015-09-25 2020-07-07 Qualcomm Incorporated Systems and methods for video processing
KR20180056655A (en) * 2015-09-25 2018-05-29 퀄컴 인코포레이티드 Systems and methods for video processing
US9641566B1 (en) 2016-09-20 2017-05-02 Opentest, Inc. Methods and systems for instantaneous asynchronous media sharing
CN113434223A (en) * 2020-03-23 2021-09-24 北京字节跳动网络技术有限公司 Special effect processing method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5485611A (en) 1994-12-30 1996-01-16 Intel Corporation Video database indexing and method of presenting video database index to a user
US5818439A (en) 1995-02-20 1998-10-06 Hitachi, Ltd. Video viewing assisting method and a video playback system therefor
US5852435A (en) 1996-04-12 1998-12-22 Avid Technology, Inc. Digital multimedia editing and data management system
EP1074989A1 (en) 1999-08-05 2001-02-07 Hewlett-Packard Company Video data conversion mechanism

Family Cites Families (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3058333B2 (en) * 1989-07-25 2000-07-04 ソニー株式会社 Image retrieval apparatus and method
JP2865467B2 (en) * 1991-11-22 1999-03-08 富士写真フイルム株式会社 Image data supply system
US5546191A (en) 1992-02-25 1996-08-13 Mitsubishi Denki Kabushiki Kaisha Recording and reproducing apparatus
JP3108585B2 (en) * 1993-07-28 2000-11-13 日本電信電話株式会社 Video image print access method and system
JP3340532B2 (en) * 1993-10-20 2002-11-05 株式会社日立製作所 Video search method and apparatus
US5642294A (en) 1993-12-17 1997-06-24 Nippon Telegraph And Telephone Corporation Method and apparatus for video cut detection
JP3676413B2 (en) * 1994-05-16 2005-07-27 三菱電機株式会社 Video data management method
US5635982A (en) * 1994-06-27 1997-06-03 Zhang; Hong J. System for automatic video segmentation and key frame extraction for video sequences having both sharp and gradual transitions
US5835667A (en) * 1994-10-14 1998-11-10 Carnegie Mellon University Method and apparatus for creating a searchable digital video library and a system and method of using such a library
US5805733A (en) * 1994-12-12 1998-09-08 Apple Computer, Inc. Method and system for detecting scenes and summarizing video sequences
JP3367268B2 (en) 1995-04-21 2003-01-14 株式会社日立製作所 Video digest creation apparatus and method
JP3656282B2 (en) 1995-05-15 2005-06-08 ソニー株式会社 Data recording apparatus, data reproducing apparatus, data recording method, data reproducing method, and data recording medium
US5930493A (en) 1995-06-07 1999-07-27 International Business Machines Corporation Multimedia server system and method for communicating multimedia information
JPH09128408A (en) * 1995-08-25 1997-05-16 Hitachi Ltd Interactive recording/reproducing medium and reproducing device
WO1997014250A1 (en) 1995-10-11 1997-04-17 Sony Corporation Data processing system
JP3752298B2 (en) * 1996-04-01 2006-03-08 オリンパス株式会社 Image editing device
US5767922A (en) 1996-04-05 1998-06-16 Cornell Research Foundation, Inc. Apparatus and process for detecting scene breaks in a sequence of video frames
GB2312078B (en) 1996-04-12 1999-12-15 Sony Corp Cataloguing video information
US6154601A (en) 1996-04-12 2000-11-28 Hitachi Denshi Kabushiki Kaisha Method for editing image information with aid of computer and editing system
TW332293B (en) * 1996-04-23 1998-05-21 Matsushita Electric Ind Co Ltd Editing control apparatus and editing control method
US5920360A (en) 1996-06-07 1999-07-06 Electronic Data Systems Corporation Method and system for detecting fade transitions in a video signal
US6047292A (en) * 1996-09-12 2000-04-04 Cdknet, L.L.C. Digitally encoded recording medium
JPH10150629A (en) * 1996-11-15 1998-06-02 Sony Corp Transmission and reception system, receiver and transmitter
US5930492A (en) * 1997-03-19 1999-07-27 Advanced Micro Devices, Inc. Rapid pipeline control using a control word and a steering word
JP3932631B2 (en) * 1997-03-21 2007-06-20 松下電器産業株式会社 Compressed video data cut detection device
JPH10276388A (en) * 1997-03-28 1998-10-13 Sony Corp Device, method for processing and reproducing image and recording medium
JP4000623B2 (en) * 1997-05-26 2007-10-31 ソニー株式会社 Video signal recording apparatus and video signal recording method
US6125229A (en) * 1997-06-02 2000-09-26 Philips Electronics North America Corporation Visual indexing system
US6029194A (en) * 1997-06-10 2000-02-22 Tektronix, Inc. Audio/video media server for distributed editing over networks
US6195458B1 (en) * 1997-07-29 2001-02-27 Eastman Kodak Company Method for content-based temporal segmentation of video
JPH1169281A (en) * 1997-08-15 1999-03-09 Media Rinku Syst:Kk Summary generating device and video display device
US6233428B1 (en) * 1997-09-17 2001-05-15 Bruce Fryer System and method for distribution of child care training materials and remote monitoring of child care centers
US6134531A (en) * 1997-09-24 2000-10-17 Digital Equipment Corporation Method and apparatus for correlating real-time audience feedback with segments of broadcast programs
KR100750520B1 (en) * 1997-09-25 2007-08-21 소니 가부시끼 가이샤 Encoded stream generating device and method, data transmission system and method, and editing system and method
US6028603A (en) * 1997-10-24 2000-02-22 Pictra, Inc. Methods and apparatuses for presenting a collection of digital media in a media container
JPH11146325A (en) * 1997-11-10 1999-05-28 Hitachi Ltd Video retrieval method, device therefor, video information generating method and storage medium storing its processing program
JP3895849B2 (en) * 1997-11-11 2007-03-22 トムソン ライセンシング Non-linear video editing system
US6363380B1 (en) * 1998-01-13 2002-03-26 U.S. Philips Corporation Multimedia computer system with story segmentation capability and operating program therefor including finite automation video parser
JP3597689B2 (en) * 1998-01-21 2004-12-08 株式会社東芝 Information recording medium and information recording medium processing device
JPH11215472A (en) * 1998-01-22 1999-08-06 Pioneer Electron Corp Information recording apparatus and information reproducing apparatus
US6611624B1 (en) * 1998-03-13 2003-08-26 Cisco Systems, Inc. System and method for frame accurate splicing of compressed bitstreams
JP3944807B2 (en) * 1998-04-02 2007-07-18 ソニー株式会社 Material selection device and material selection method
US6154771A (en) * 1998-06-01 2000-11-28 Mediastra, Inc. Real-time receipt, decompression and play of compressed streaming video/hypervideo; with thumbnail display of past scenes and with replay, hyperlinking and/or recording permissively initiated retrospectively
US6307550B1 (en) * 1998-06-11 2001-10-23 Presenter.Com, Inc. Extracting photographic images from video
US6366296B1 (en) * 1998-09-11 2002-04-02 Xerox Corporation Media browser using multimodal analysis
US6564380B1 (en) * 1999-01-26 2003-05-13 Pixelworld Networks, Inc. System and method for sending live video on the internet
US7075683B1 (en) * 1999-02-15 2006-07-11 Canon Kabushiki Kaisha Dynamic image digest automatic editing system and dynamic image digest automatic editing method
US6262724B1 (en) * 1999-04-15 2001-07-17 Apple Computer, Inc. User interface for presenting media information
US7313809B1 (en) * 1999-04-16 2007-12-25 Apple, Inc. Convergence-enabled DVD and web system
WO2001028238A2 (en) 1999-10-08 2001-04-19 Sarnoff Corporation Method and apparatus for enhancing and indexing video and audio signals
JP2001320667A (en) 2000-05-12 2001-11-16 Sony Corp Service providing device and method, reception terminal and method, and service providing system
US6882793B1 (en) 2000-06-16 2005-04-19 Yesvideo, Inc. Video processing system
US6721361B1 (en) * 2001-02-23 2004-04-13 Yesvideo.Com Video processing system including advanced scene break detection methods for fades, dissolves and flashes
US20020183984A1 (en) * 2001-06-05 2002-12-05 Yining Deng Modular intelligent multimedia analysis system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1310086A4

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8867894B2 (en) 2000-06-16 2014-10-21 Yesvideo, Inc. Video processing system
US9390755B2 (en) 2000-06-16 2016-07-12 Yesvideo, Inc. Video processing system
WO2003102955A3 (en) * 2002-05-31 2004-07-01 Koninkl Philips Electronics Nv Method of capturing scenes and recorder with means of performing this method
WO2003102955A2 (en) * 2002-05-31 2003-12-11 Koninklijke Philips Electronics N.V. Method of capturing scenes and recorder with means of performing this method
EP1378910A2 (en) * 2002-06-25 2004-01-07 Eastman Kodak Company Software and system for customizing a presentation of digital images
EP1378910A3 (en) * 2002-06-25 2004-10-13 Eastman Kodak Company Software and system for customizing a presentation of digital images
US8285085B2 (en) 2002-06-25 2012-10-09 Eastman Kodak Company Software and system for customizing a presentation of digital images
US7236960B2 (en) 2002-06-25 2007-06-26 Eastman Kodak Company Software and system for customizing a presentation of digital images
EP1734757A2 (en) 2002-08-08 2006-12-20 Samsung Electronics Co., Ltd. Video recording/reproducing apparatus and method of displaying a menu guide
EP1734757A3 (en) * 2002-08-08 2007-02-14 Samsung Electronics Co., Ltd. Video recording/reproducing apparatus and method of displaying a menu guide
WO2004088663A3 (en) * 2003-04-04 2004-12-02 Bbc Technology Holdings Ltd Media processor
WO2004088663A2 (en) * 2003-04-04 2004-10-14 Bbc Technology Holdings Limited Media processor
WO2005036875A1 (en) * 2003-10-06 2005-04-21 Disney Enterprises, Inc. System and method of playback and feature control for video players
AU2004306754B2 (en) * 2003-10-06 2009-09-17 Disney Enterprises, Inc. System and method of playback and feature control for video players
US8112711B2 (en) 2003-10-06 2012-02-07 Disney Enterprises, Inc. System and method of playback and feature control for video players
EP1533812A3 (en) * 2003-11-18 2005-07-06 Pioneer Corporation Information processing apparatus, information editing apparatus, information processing system, playback apparatus, playback system, method therefor, program therefor, and recording medium storing the program
EP1533813A3 (en) * 2003-11-18 2005-07-06 Pioneer Corporation Information processing apparatus, information editing apparatus, information processing system, playback apparatus, playback system, method therefor, program therefor, and recording medium storing the program
EP1533813A2 (en) * 2003-11-18 2005-05-25 Pioneer Corporation Information processing apparatus, information editing apparatus, information processing system, playback apparatus, playback system, method therefor, program therefor, and recording medium storing the program
EP1533812A2 (en) * 2003-11-18 2005-05-25 Pioneer Corporation Information processing apparatus, information editing apparatus, information processing system, playback apparatus, playback system, method therefor, program therefor, and recording medium storing the program
EP1818938A1 (en) * 2006-02-08 2007-08-15 Ricoh Company, Ltd. Content reproducing apparatus, content reproducing method and computer program product
CN101025676B (en) * 2006-02-08 2011-04-06 株式会社理光 Content reproducing apparatus, content reproducing method
US7992097B2 (en) 2006-12-22 2011-08-02 Apple Inc. Select drag and drop operations on video thumbnails across clip boundaries
US8943433B2 (en) 2006-12-22 2015-01-27 Apple Inc. Select drag and drop operations on video thumbnails across clip boundaries
US9959907B2 (en) 2006-12-22 2018-05-01 Apple Inc. Fast creation of video segments
US9830063B2 (en) 2006-12-22 2017-11-28 Apple Inc. Modified media presentation during scrubbing
WO2008079568A1 (en) * 2006-12-22 2008-07-03 Apple Inc. Fast creation of video segments
US8020100B2 (en) 2006-12-22 2011-09-13 Apple Inc. Fast creation of video segments
US9335892B2 (en) 2006-12-22 2016-05-10 Apple Inc. Select drag and drop operations on video thumbnails across clip boundaries
US9280262B2 (en) 2006-12-22 2016-03-08 Apple Inc. Select drag and drop operations on video thumbnails across clip boundaries
US8943410B2 (en) 2006-12-22 2015-01-27 Apple Inc. Modified media presentation during scrubbing
US8260090B2 (en) 2007-03-15 2012-09-04 Sony Corporation Information processing apparatus, imaging apparatus, image display control method and computer program
EP1971134A3 (en) * 2007-03-15 2011-08-10 Sony Corporation Information processing apparatus, imaging apparatus, image display control method and computer program
WO2009040538A1 (en) * 2007-09-25 2009-04-02 British Telecommunications Public Limited Company Multimedia content assembling for viral marketing purposes
EP2104104A1 (en) * 2008-03-20 2009-09-23 British Telecommunications Public Limited Company Multimedia content assembling for viral marketing purposes
US8774604B2 (en) 2010-06-15 2014-07-08 Sony Corporation Information processing apparatus, information processing method, and program
EP2398021A3 (en) * 2010-06-15 2012-01-04 Sony Corporation Information processing apparatus, information processing method, and program
KR101428959B1 (en) 2011-02-18 2014-08-11 애플 인크. Method, system, and computer-readable medium for performing video context popups
CN103380461A (en) * 2011-02-18 2013-10-30 苹果公司 Video context popups
US8467663B2 (en) 2011-02-18 2013-06-18 Apple Inc. Video context popups
WO2012112904A1 (en) * 2011-02-18 2012-08-23 Apple Inc. Video context popups

Also Published As

Publication number Publication date
WO2001099403A3 (en) 2002-10-10
US20150104154A1 (en) 2015-04-16
EP1310086A2 (en) 2003-05-14
JP2004501574A (en) 2004-01-15
DE60143663D1 (en) 2011-01-27
US9390755B2 (en) 2016-07-12
US7668438B2 (en) 2010-02-23
JP4942276B2 (en) 2012-05-30
US20100115410A1 (en) 2010-05-06
US20050281535A1 (en) 2005-12-22
EP1310086A4 (en) 2004-08-18
AU2001268432A1 (en) 2002-01-02
US6882793B1 (en) 2005-04-19
US8630529B2 (en) 2014-01-14
EP1310086B1 (en) 2010-12-15
WO2001099403A9 (en) 2002-07-18
US20130294750A1 (en) 2013-11-07
US8867894B2 (en) 2014-10-21

Similar Documents

Publication Publication Date Title
US9390755B2 (en) Video processing system
US6721361B1 (en) Video processing system including advanced scene break detection methods for fades, dissolves and flashes
EP1378910B1 (en) Software and system for customizing a presentation of digital images
US9038108B2 (en) Method and system for providing end user community functionality for publication and delivery of digital media content
JP5112287B2 (en) Method and system for providing distributed editing and storage of digital media over a network
US8972862B2 (en) Method and system for providing remote digital media ingest with centralized editorial control
US8644679B2 (en) Method and system for dynamic control of digital media content playback and advertisement delivery
US8126313B2 (en) Method and system for providing a personal video recorder utilizing network-based digital media content
CA2615100C (en) Method and system for remote digital editing using narrow band channels
US8977108B2 (en) Digital media asset management system and method for supporting multiple users
US20060236221A1 (en) Method and system for providing digital media management using templates and profiles
US20070089151A1 (en) Method and system for delivery of digital media experience via common instant communication clients
US20070133609A1 (en) Providing end user community functionality for publication and delivery of digital media content
US9210482B2 (en) Method and system for providing a personal video recorder utilizing network-based digital media content
WO2002084598A1 (en) Method and system for streaming media manager
US20030081249A1 (en) Easy printing of visual images extracted from a collection of visual images

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: C2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: C2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

COP Corrected version of pamphlet

Free format text: PAGES 5/18, 6/18 AND 8/18-16/18, DRAWINGS, REPLACED BY NEW PAGES 5/18, 6/18 AND 8/18-16/18

AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2002 504123

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 2001946372

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2001946372

Country of ref document: EP