WO2001082195A1 - Systems and methods for integrating virtual advertisements into recreated events - Google Patents

Systems and methods for integrating virtual advertisements into recreated events

Info

Publication number
WO2001082195A1
Authority
WO
WIPO (PCT)
Prior art keywords
advertisement
event
recreated
inventory
advertisements
Prior art date
Application number
PCT/US2001/013475
Other languages
French (fr)
Inventor
Brian K. Mitchell
Jeffery D. Boyken
Daniel E. Fugit
Robert V. Wells
Ralph T. Drensek
William A. Koons
Original Assignee
Anivision, Inc.
Priority date
Filing date
Publication date
Application filed by Anivision, Inc. filed Critical Anivision, Inc.
Priority to AU2001259171A priority Critical patent/AU2001259171A1/en
Publication of WO2001082195A1 publication Critical patent/WO2001082195A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising

Definitions

  • the invention relates generally to three-dimensional computer graphics, and, more particularly, to systems and methods that enable marketing to viewers through in-scene advertisements while they view rendered real events.
  • game developers want to increase profits by receiving advertisement revenue from sponsors.
  • game developers have included advertisements by displaying sponsor logos as a backdrop or by adding splash screens during game initialization.
  • sponsor logos have been attached to objects within the game.
  • the advertisements are often not viewable because of the object's orientation or because the object is outside of the viewable window.
  • the gaming industry is not interested in advertisements that divert the audience's attention from the game. Consumers tolerate intrusive advertisements when the product or service is provided free of charge, but generally are annoyed otherwise.
  • Current advertising approaches are inadequate to incorporate significant advertising material without antagonizing the consumers.
  • Advertisement material also needs to be able to be altered based upon collected information about the consumer, and it needs to be prominently displayed without antagonizing the consumer.
  • the present invention provides a new marketing medium using in scene advertisements to consumers viewing recreated events.
  • the system inserts advertisements into recreated events without distracting from the action provided by the event.
  • the system can collect metric and demographic information about the consumer and dynamically select advertisement inventory based upon the collected information.
  • advertising indicia are inserted into scene graphs and displayed to a consumer as part of a recreated event.
  • Event data can be captured from a real event by active or passive sources. Advertising inventory is determined based upon the advertising objectives.
  • the event data is decomposed into high-level objects.
  • the high-level objects are dynamically decomposed based upon the advertisement objectives, and the requisite advertisement inventory is generated and mapped into a scene graph.
  • the scene graph is displayed as part of the recreated event.
  • consumer metric information can be collected, and the selected advertising inventory to be included in a scene graph can be altered based upon information known about the consumer.
  • each advertisement in the advertisement inventory can be activated by a consumer. This interactive advertisement inventory, upon triggering, can cause the display of additional advertisement material.
  • An object of the invention is to enable displayed advertisements to face a camera while the object to which the advertisement is associated moves in the event. Consequently, a model node is associated with an object in a scene graph of a recreated event.
  • Model nodes include a predetermined region of space for advertisement inventory. From the advertisement inventory, selected advertisements are associated with the object, and a line of sight is calculated from a virtual camera to an advertisement that is required for the advertisement to face the virtual camera. If needed, the advertisement is rotated or translated within the predetermined region in order to keep an angle between an advertisement plane and the line of sight between 0 degrees and 180 degrees.
  • a model node is associated with an object in a scene graph of a recreated event.
  • the recreated event can be derived from a real event.
  • the model node includes a displacement attribute.
  • an advertisement is associated with the object. Based upon the displacement attribute, an offset from the object's location in the event is determined, and based upon the offset the object can be displaced when the object moves out of a rendered window's frustum. By moving the object, the advertisement associated with the object can remain displayed in the rendered window.
  • Yet another object is to enable advertisements to be prominently displayed by changing the orientation of an object.
  • An advertisement is associated with the object.
  • a model node is associated with an object in a scene graph of a recreated event.
  • the model node includes an orientation attribute.
  • the object is rotated or translated based upon the orientation attribute such that a predetermined angle range between an object plane and a line of sight from a virtual camera is maintained. Maintaining the object's orientation displays the advertisement prominently within a rendered window.
  • a further object is to enable advertisements to be viewed by removing obstructing obstacles.
  • Model nodes are associated with each object in a scene graph of a recreated event, which may be derived from a real event.
  • the model nodes include a priority attribute and an advertisement is associated with a high priority model node.
  • Each model node is tested with a line of sight vector and obstructing objects that obstruct a view to higher priority model nodes are displaced. Moving obstructing objects enables the advertisement to be displayed within a rendered window.
  • Another further object is to accent advertisements by incorporating the advertisement as part of the action.
  • objects can be morphed into a virtual advertisement upon a triggering action within the event.
  • a virtual advertisement is an advertisement displayed by morphing an object in the event.
  • Still another object is to be able to display advertisements on any object within a scene graph of a recreated event.
  • the event data is decomposed into high-level objects. Each high-level object is decomposed. Any decomposed object can be associated with advertisement material.
  • the advertisement inventory comprises selectable advertisements that can be inserted into the scene graph. Based upon desired advertisement objectives, selected advertisement inventory can be mapped onto any decomposed object within the scene graph.
  • Yet a further object is to display advertisements based upon the results in a recreated event of a participant's action that is being controlled by the consumer.
  • Each participant can be controlled by the consumer upon selection by the consumer.
  • the consumer controls the action of the participant.
  • result advertisements are displayed.
  • the displayed advertisements can be dynamically altered based upon the first result and the replay result.
  • the consumer can control participants in similar plays later in the event.
  • the displayed advertisements can be dynamically altered based upon the first result and the later result.
  • other advertisements can be displayed based upon a comparison of the controlled action results and the result in the real event.
  • Figure 1 illustrates a functional block diagram of an exemplary virtual view advertising system
  • Figure 2 illustrates a functional block diagram of an exemplary capture system
  • Figure 3 illustrates a functional block diagram of an exemplary event recreation system
  • Figure 4 illustrates a functional block diagram of an exemplary scene generation system
  • Figure 5 illustrates a functional block diagram of an exemplary interactive viewing system
  • Figure 6 illustrates an exemplary screen shot of a stand alone animation as a dynamic advertisement
  • Figure 7 illustrates an exemplary screen shot of an animation morph as a dynamic advertisement
  • Figure 8 illustrates exemplary screen shots of an object displacement as a dynamic advertisement
  • the exemplary embodiment of the present invention is called the Virtual View Advertisement System (VVA System).
  • the exemplary system is designed to incorporate advertisement objectives within the Virtual View Sport System disclosed in Patent Number 6,124,862.
  • Patent Number 6,124,862 is incorporated by reference in its entirety.
  • numerous other systems can dynamically place advertisements on objects within captured real events.
  • the present invention will be described with reference to the insertion of advertisement material into objects within the context of sporting events.
  • the VVA System may be applied to any situation in which pre-selected objects are tracked and converted into a virtual environment, such as military maneuvers, space exploration, political speeches, and parades, etc.
  • Figure 1 illustrates the overall system architecture of an exemplary VVA System 10.
  • the exemplary embodiment comprises five major subsystems: a capture system 40, a recreation system 42, a distribution system 44, an archive system 46, and an interactive viewing system 48. These subsystems interact utilizing commercial off the shelf electronic components that provide high speed connectivity to manage and present advertisement inventory for alternative media formats including animation and sound recreated from real events.
  • the capture system 40 collects and processes event data from various sources.
  • the event data attained by the capture system 40 is provided to the event recreation system 42 in order to render the event into a 3-D recreation and integrate advertisements into the scene.
  • the capture system 40 is described in further detail in reference to Figure 2.
  • the event recreation system 42 recreates the event in three dimensional animation and identifies the inventory required to present the animation.
  • the inventory includes all models, textures, sounds, advertisements, and state data to present the animation.
  • the inventory and recreated event are provided to the archive system 46 for storage.
  • the distribution system 44 provides the event recreation system 42 with requests for recreated events.
  • the event recreation system 42 provides the distribution system 44 with the recreated event either in real-time from the capture system 40 or from archived event data stored in the archive system 46.
  • the advertisements inserted into any scene provided to the distribution system 44 are selected primarily based upon demographic and metric information, which is collected by the interactive viewing system 48 and stored in the archive system 46.
  • the event recreation system 42 is described in further detail in reference to Figure 3.
  • the distribution system 44 receives a consumer's request for an event and distributes the archived inventory and recreated event data to the interactive viewing system 48 for presentation to the consumer 20.
  • the archive system 46 stores data for non-real-time distribution of events; stores inventory models, textures, and sounds; and stores advertisement metric data and information about the consumer 20.
  • the interactive viewing system 48 presents the recreated event to the consumer 20, collects the metric data, and responds to consumer interaction.
  • the interactive viewing system 48 is described in greater detail in reference to Figure 5.
  • Figure 2 illustrates an exemplary capture system 40 that receives sensor data from video sources 30 and global positioning satellite (GPS) sources 32.
  • GPS global positioning satellite
  • a set of conventional television cameras 30 strategically positioned around the event provides the raw video data stream used to generate three dimensional positional data based on what each camera 30 views in two dimensions.
  • Two or more cameras 30 should be used depending on the application and the amount of accuracy desired, and additional high-resolution cameras 30 may be incorporated if extreme accuracy is required.
  • the GPS source 32 can provide a data stream for precise positioning of an object being tracked.
  • the image processing system 210 collects and processes frame data.
  • the image processing system 210 processes video frames and camera orientation from the video cameras 30 and stores the information in frame buffers.
  • a series of camera interface cards provides the buffering of camera frame data.
  • a bus expansion card can be utilized to provide high-speed connectivity to other system components.
  • the buffered frames are polled and associated with a sighting time.
  • Image array processor boards can provide the image processing. Each image processing board hosts a single instruction multiple data stream architecture (SIMD)-type array of high-speed image processors.
  • SIMD single instruction multiple data stream architecture
  • Edge detection is performed on each frame to identify bodies and their body centroid or object type with position and orientation.
  • Standard image software techniques such as segmentation, feature extraction, and classification are utilized to break down each body or object image in a frame.
  • Each resulting extracted group is classified according to preselected image templates.
  • the body centroid, body identification, and body sighting time, or object type, position and orientation with starting time is provided to the track correlation system 220.
  • the tracking correlation system 220 creates three dimensional positional data for objects viewed by the cameras 30. This three dimensional data is continuously updated to correspond with targeted figures' movements and relative position of other objects such as the relative position of a ball.
  • the input from the GPS source 32 can aid in the positioning of an object and provide state data to the track correlation system 220.
  • a track correlation board processes the object position data from each of the image array boards and correlates each object position data with historical 3-D object position data, known as track files, that are stored in memory.
  • a shared memory area is accessible through a common DMA interface connected to all the systems comprising the VVA System through common electronic subsystem buses. Two-dimensional position data for each object relative to the frame is passed to the track correlation system 220.
  • the track correlation system 220 processes object position data and correlates each object position data with historical three-dimensional object position data.
  • the track correlation system 220 formulates 3-D object and body tracks based on 2D object and body state information.
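The association step described above can be pictured with a much-simplified sketch: each incoming position measurement is matched against the historical track files and either updates the nearest track or starts a new one. The gating distance, the data layout, and the assumption that the 2-D camera data has already been resolved into 3-D points are simplifications made for illustration, not the patent's algorithm.

```python
import math

def correlate(track_files, measurement, gate=2.0):
    """track_files: dict of track_id -> last known (x, y, z) position."""
    best_id, best_dist = None, gate
    for track_id, last_pos in track_files.items():
        d = math.dist(last_pos, measurement)
        if d < best_dist:
            best_id, best_dist = track_id, d
    if best_id is None:                       # nothing within the gate: start a new track file
        best_id = "track_%d" % len(track_files)
    track_files[best_id] = measurement        # update the track file history
    return best_id

tracks = {"player_12": (10.0, 0.0, 40.0)}
print(correlate(tracks, (10.5, 0.0, 41.0)))   # close to player_12: same track
print(correlate(tracks, (60.0, 0.0, 5.0)))    # far from every track: new track file
```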
  • the object propagation system 230 generates realistic animated characters from positional data and video frames. Preprogrammed knowledge of body movements corrects and improves body states estimated by the track correlation system 220. Measured body states can be altered by recognizing certain gestures, such as a player running, to smooth the state and present a more realistic visual simulation. Smoothing reduces jerkiness and compensates for any missing measurements.
  • physics of motion and interaction with other bodies can be modeled to improve body states.
  • Articulated parts of each body are included in the body state calculations.
  • Each body includes articulated body parts such as two feet, two knees, two torso points, one abdomen, two shoulder points, two elbows, two hands, and one head.
  • the body propagation system 230 applies human movement algorithms to smooth the body articulated parts which are stored in the body state file. After each object has been decomposed and the movements smoothed for fluid motion, the event data is provided to the event recreation system 42.
  • Figure 3 illustrates an exemplary event recreation system 42.
  • the event recreation system 42 generates realistic, animated characters from positional data and video frames. Actual video data is overlaid onto the characters to produce very lifelike appearances. Advertisements can be dynamically placed on any participant and object such that the advertisements appear to be attached to the participant or object in its exact location in the real event.
  • the event recreation system 42 includes a model texturing board and a series of computers to generate the realistic, animated characters from positional data and video frames.
  • a bus expansion card can provide high-speed connectivity between the track correlation system and the object propagation system through the direct memory access (DMA) interface. Body position data and real-time visual models are synchronized with the interactive viewing system 48 through the DMA interface.
  • DMA direct memory access
  • Recreated event data contains state data for all objects in the recreated scene in addition to information about specific inventory for the scene. For instance, recreated event data for a football scene will include references to the stadium models, textures, and sounds; spectator models, textures, and sounds; each participant's models, textures, and sounds; and advertising materials to be incorporated into the scenes.
  • the recreated event data can be sent to the archive system 46 for storage or passed directly to the distribution system 44 for real-time distribution.
  • the distribution system 44 receives and processes consumer requests for event data and distributes customized inventory and event data to interactive viewing system 48 for display to the consumer 20.
  • the distribution system 44 receives the recreated event data from the event recreation system 42 and inventory data from the archive system 46. Each time that a request is made to the distribution system 44, it passes the request to the event recreation system 42.
  • the inventory management system 310 processes the request and determines if the event inventory should be modified.
  • Baseline advertising objectives are manually set prior to receiving the event data from the capture system. Advertising objectives can also be automatically set by known techniques based upon the advertisement inventory. These objectives identify the type of advertising that is to be applied to the scene. Many advertisements may fall within a particular type and may include a priority attribute.
  • Each advertisement inventory is dynamically modified based on consumer metrics and priority. For instance, if the objective is to include hand advertisements in the scene, consumer metrics may determine that the consumer is a fan of a particular advertiser such as NIKE, and therefore, a NIKE brand batting glove advertisement will be placed in the scene.
  • the priority is a dynamic attribute and can change each time the event is recreated.
  • All advertising inventory is stored and maintained in the archive system 46.
  • Each advertisement has attributes that characterize the advertisement and define the type of object to which it can be applied. For instance, batting glove advertisements are applied to hand objects. The node of the hand object defines the region where the batting glove ad is to be applied within the scene.
  • Event inventory may be modified to meet specific advertisement goals or to reflect changes in user preferences.
  • the inventory management system 310 determines the inventory, comprising models, textures, and sounds, that is required to recreate the event and determines what advertisement inventory is to be included in the event.
  • the event data is then retrieved from the archive system 46 and passed back to the distribution system 44 for delivery to the interactive viewing system 48.
  • the event recreation system 42 generates photo-realistic 3-D visual models based on the event data provided by the tracking correlation system 40.
  • the system 42 accesses information from databases that hold the measurements for each body part of a participant and from which it generates a 3-D wire-frame representation of the participant.
  • Video frame data is utilized to generate model textures for each body part and maps these textures onto the wire-frame model held by the tracking correlation system 40, as previously generated, to create the realistic 3-D visual model.
  • a user interface 340 enables an inventory operator 350 to designate which objects should display advertisements.
  • the inventory operator 350 can predetermine the model nodes associated with the objects.
  • Each object in the scene has a node in the animation scene graph. For example, in a football highlight, there are nodes for each object that make up a football player.
  • the football player may be composed of a head node, a torso node, a hip node, two arm nodes, and two leg nodes.
  • the object decomposition is defined a priori. However, each time a highlight is recreated, the football player node may be further decomposed to include hand and finger nodes.
  • the object model utilized is determined by the advertising objectives.
  • Model nodes contain attributes that determine how the model should be displayed in the scene in addition to the actions that are to be performed by selection of the consumer 20.
  • Display attributes include orientation with respect to the camera location, displacement from state data, and presentation priority in the field of view.
  • a billboard node on a stadium wall can include orientation display attributes such that a billboard maintains a 90 degree angle with the virtual camera when displayed in a rendered window.
  • the displacement attribute determines the 3-D offset from the model's location in the event data that the model can be displaced when the object moves out of the rendered window frustum.
  • the displacement attribute is used to move the object such that it stays within the rendered view.
  • An example of the displacement attribute is a football player running down the sidelines: a billboard on the wall moves down the wall with the player in order to stay in view.
  • the presentation priority attribute is designed to ensure that the specific models are displayed in the rendered window's forefront.
  • each model node is tested with a line-of-sight vector, and adjustments to the state data are made to models that obstruct the view of higher priority model nodes. This ensures that the advertisement inventory will be prominently displayed in the rendered window.
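A minimal sketch of such a line-of-sight test follows, using a simple point-to-segment distance check and a fixed sideways nudge; the node dictionaries, priorities, and clearance threshold are illustrative assumptions, not the patented method.

```python
import math

def _dist_point_to_segment(p, a, b):
    ax, ay, az = a; bx, by, bz = b; px, py, pz = p
    abx, aby, abz = bx - ax, by - ay, bz - az
    apx, apy, apz = px - ax, py - ay, pz - az
    ab_len_sq = abx * abx + aby * aby + abz * abz
    t = 0.0 if ab_len_sq == 0 else max(0.0, min(1.0, (apx * abx + apy * aby + apz * abz) / ab_len_sq))
    closest = (ax + t * abx, ay + t * aby, az + t * abz)
    return math.dist(p, closest)

def clear_line_of_sight(camera, ad_node, other_nodes, clearance=2.0):
    """Displace lower-priority nodes that obstruct the view of ad_node."""
    for node in other_nodes:
        if node["priority"] >= ad_node["priority"]:
            continue
        if _dist_point_to_segment(node["position"], camera, ad_node["position"]) < clearance:
            x, y, z = node["position"]
            node["position"] = (x + clearance, y, z)   # nudge the obstruction aside

camera = (0.0, 5.0, 0.0)
billboard = {"name": "sponsor_a", "priority": 10, "position": (0.0, 5.0, 50.0)}
tower = {"name": "camera_tower", "priority": 1, "position": (0.5, 5.0, 25.0)}
clear_line_of_sight(camera, billboard, [tower])
print(tower["position"])   # the tower has been displaced out of the sight line
```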
  • the advertisement objects include attributes that determine if the ad in the scene should be adjusted such that it always faces the virtual camera.
  • Each model node in the scene graph has a predetermined region of space for advertisement inventory.
  • the interactive view system 48 calculates a line of sight from the virtual camera to the advertisement that is required to face the virtual camera.
  • the advertisement is translated and rotated within the region to maintain the angle between the advertisement plane and the line of sight between 0 and 180 degrees. For example, an advertisement can be placed on a baseball bat with the bat being in the exact location as in the real event. However, during a rotation of the bat in the real event, the advertisement can be adjusted in the recreated event such that the advertisement always faces the virtual camera.
  • the inventory operator 350 can select and insert the automatic and manual triggers to be incorporated in the event.
  • Action within the scene can trigger objects to morph into virtual advertisements. For instance, when a long pass is thrown, a football might morph into a branded plane such as those owned by Delta or Southwest Airlines. Morphing can be triggered by specific actions or inserted in the timeline to occur at a specific instance in time. An event such as scoring of a touchdown can trigger an advertisement to be displayed.
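One way to picture trigger-driven morphing is a small registry that swaps an object's model when a matching action occurs in the event data. This is only an illustrative sketch; the class names, the action strings, and the idea of representing a morph as a model swap are assumptions rather than the patent's implementation.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SceneObject:
    name: str
    model: str             # current model/texture set applied to the node

@dataclass
class MorphTrigger:
    action: str             # e.g. "long_pass", "touchdown", or a timeline instant
    target: str              # name of the object to morph
    ad_model: str            # advertisement model to morph into

class TriggerRegistry:
    def __init__(self) -> None:
        self._triggers: Dict[str, List[MorphTrigger]] = {}

    def register(self, trigger: MorphTrigger) -> None:
        self._triggers.setdefault(trigger.action, []).append(trigger)

    def fire(self, action: str, scene: Dict[str, SceneObject]) -> None:
        # Morph every registered target when the action occurs in the event data.
        for trig in self._triggers.get(action, []):
            if trig.target in scene:
                scene[trig.target].model = trig.ad_model

scene = {"football": SceneObject("football", "models/football")}
registry = TriggerRegistry()
registry.register(MorphTrigger("long_pass", "football", "ads/branded_airplane"))
registry.fire("long_pass", scene)   # the football now renders as the branded plane
print(scene["football"].model)
```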
  • Audio advertisements can be inserted within the action of the event.
  • a virtual public address system within the animation can be utilized to deliver audio advertisements during a break in the action.
  • a whistle, buzzer, or the expiration of clock time can signal a break in the action.
  • triggers within the event data can trigger audio advertisements from the participants within the recreation. For instance, when a quarterback is sacked in a recreated football event, an audio advertisement delivered by the quarterback for headache powder, such as the product sold by Goody's, might be triggered.
  • All advertisements within the recreated scene will be interactive via hotspots. Consumers are able to view other forms of media for the product by clicking on the hot spot. Activation of a hot spot can send the consumer to the sponsor's web site, initiate a purchase, or cause the display of a video clip, etc.
  • advertisements can be tailored to specific participants. For example, selecting a specific player can provide a list of products endorsed by the player.
  • advertisements can be incorporated as a result of action within a recreated event controlled by the consumer 20.
  • a rendered event derived from a real event
  • the consumer can take control of the actions of one or more of the participants in the recreated event and change the outcome of a portion of the real event.
  • Consumers can interact within recreated events by standard techniques known in the gaming industry.
  • the subsequent outcome, as produced by the consumer, can trigger an advertisement based upon its results.
  • numerous such outcomes can be combined over a period of viewing time or combined over multiple viewing times to produce combined results that can trigger different advertisements.
  • the method for combining such outcomes can coincide with scoring methods used in the real event, allowing for triggered advertisements based upon how well the consumer fared in comparison to the participants in the real event.
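A toy sketch of outcome-driven selection is shown below, assuming a score-style comparison between the consumer's result and the real-event result; the mapping of outcomes to advertisements is invented purely for illustration.

```python
def select_result_ad(consumer_result: int, real_result: int, prior_results=()):
    if consumer_result > real_result:
        ad = "champagne_celebration"           # the consumer beat the real outcome
    elif consumer_result == real_result:
        ad = "sponsor_replay_banner"
    else:
        ad = "better_luck_next_time_promo"
    # Repeating the same outcome on a replay can rotate in a different creative.
    if prior_results and prior_results[-1] == consumer_result:
        ad += "_alternate"
    return ad

print(select_result_ad(consumer_result=7, real_result=3))     # beat the real play
print(select_result_ad(7, 3, prior_results=(7,)))             # same result on a replay
```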
  • the virtual viewing system 330 renders the video generated by the virtual camera utilizing entity states and visual models.
  • the virtual view system 330 is described in greater detail in reference to Figure 4.
  • the virtual viewing system 330 interacts with the interactive viewing system 48 to display a scene from a desired view, orientation, or distance.
  • the virtual view system 330 responds to consumer commands to rotate a camera angle, zoom in or out, or provide a view desired by the consumer. Any time the virtual viewing system 330 displays an advertisement or a consumer activates a hot spot, metric data is collected and stored in the archive system 46.
  • the archive system 46 stores event data for later display.
  • An archive server can provide real-time recording and playback of body position information and visual models from a series of redundant array of drives (RAID) that provide fast redundant storage and retrieval of historical, virtual data on demand.
  • RAID redundant array of drives
  • the archive system 46 delivers the event data to the inventory management system 310.
  • the inventory management system 310 can insert all new advertisements anywhere within the event.
  • the virtual view advertisement system 10 is designed such that there is complete control over the specific advertisements that are presented to the individual consumers 20.
  • the process for applying advertising inventory starts with manually setting the advertisement objectives. Then, the event data is received from the capture system. Event data is decomposed into high-level objects such as participants, audience, officials, and playing fields. The high-level objects are dynamically decomposed based upon advertising objectives. Consumer metric data is retrieved from the archive system. Based upon consumer metric information, advertisement inventory is retrieved from the archive system 46. The advertisement inventory is mapped to the scene graph and the recreated event data is generated.
  • Figure 4 illustrates an exemplary scene generation system 340.
  • the scene generation system 340 generates virtual scenes. Event data is received and processed in real time from the capture system 40. Event data that is received from the capture system 40 is decomposed into unique models for the event.
  • the virtual event consists of participants, spectators, and venue objects.
  • Each of the objects can be decomposed into various levels of detail by the object decomposition subsection 410.
  • the model level of detail is partially determined by the advertisement inventory for the event. For example, if the event is to include a virtual advertisement for a wristwatch on a particular participant, the model for that participant will include objects for the participant's arms, including wrist nodes. The wrist node will contain attributes that describe how the model should respond when presented in the interactive client viewing system 48. The inventory will include the watch textures, which will be applied to the arm model in the rendered scene.
  • the inventory management system 310 determines the inventory, comprising models, textures, sounds, and advertisements, that is required to recreate the event and determines what advertisement inventory is to be included in the event.
  • the inventory models, textures, sounds, as well as advertisement material are stored in the archive system and provided to the inventory mapping function to provide texture to the objects.
  • Advertising inventory consists of digitized image data files in targa, jpeg, or other formats and attributes maintained in the archive system database. Advertising attributes include size, local axis origin, object type, and priority.
  • the inventory management system uses these attributes in addition to inventory objectives, consumer metric data, and activated manual triggers to determine what advertising inventory to include in the scene.
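The selection step can be sketched as a filter over inventory records carrying the attributes listed above. The record fields mirror those attributes, but the scoring rule (favoring brands found in the consumer metrics, then priority) is an assumption made for illustration only.

```python
# Hypothetical inventory records; file names, brands, and fields are illustrative.
ads = [
    {"file": "nike_glove.tga",    "object_type": "hand",  "priority": 5, "brand": "NIKE"},
    {"file": "generic_glove.tga", "object_type": "hand",  "priority": 2, "brand": "ACME"},
    {"file": "watch.jpg",         "object_type": "wrist", "priority": 4, "brand": "OMEGA"},
]

def select_ad(inventory, objective_type, consumer_metrics):
    # Keep only ads that apply to the object type named by the advertising objective.
    candidates = [ad for ad in inventory if ad["object_type"] == objective_type]
    favourites = consumer_metrics.get("favourite_brands", set())
    # Favour brands the consumer already responds to, then fall back to priority.
    return max(candidates,
               key=lambda ad: (ad["brand"] in favourites, ad["priority"]),
               default=None)

print(select_ad(ads, "hand", {"favourite_brands": {"NIKE"}}))   # the NIKE glove wins
```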
  • the inventory mapping subsection 430 generates photo-realistic 3-D visual models.
  • the inventory mapping subsection 430 accesses information from shared databases holding measurements for each of the body parts of a sports figure, from which it generates a 3-D wire-frame representation of the participant.
  • the inventory mapping subsection 430 utilizes video frame data to generate model textures for each body part. The textures are mapped onto the wire-frame model.
  • the scene rendering subsection 420 utilizes body state information previously calculated by the track correlation system 220 and texture information generated by the inventory mapping subsection 430 to provide coherent body position information and visual models to the interactive view system 48.
  • Body position data and real-time visual models are synchronized with the interactive view system 48 through the DMA interface.
  • FIG. 5 illustrates an exemplary interactive viewing system 48.
  • the interactive viewing system 48 is the consumer's interface for viewing recreated events. Inventory and other event data is downloaded to the consumer's computer for event recreation.
  • the scene generation system 320 is capable of reproducing events in three dimensional animation including the insertion of advertisement material.
  • the distribution system 44 provides the models, textures, sounds, and advertisements material for rendering of the event desired by the consumer from consumer inventory storage file 510 in the archive system 46.
  • a user interface 520 enables the consumer 20 to interact with the virtual viewing system 330.
  • the user interface 520 processes virtual camera commands and replay commands from the consumer 20.
  • the consumer 20 can use a joy stick in the interface console or select from pre-selected viewpoints to move the viewing perspective to virtually any location.
  • the consumer 20 controls the virtual perspective through standard imaging software.
  • the user interface 520 serves to transmit navigation commands such as pointing, zooming, and focusing to the virtual viewing system 330.
  • the user interface 520 collects statistical information about the consumer 20. Upon accessing the VVA System 10, the consumer 20 is prompted to provide information about himself. The information can include age, sex, and zip code, etc. The zip code and other collected demographic information can yield statistical information such as approximate family and income demographics.
  • the user interface 520 provides the selection information to the metric collection system 530, which sends it to the metric storage file 540 for storage in the archive system 46.
  • the scene generation system 320 utilizes entity state data to position objects in a 3-dimensional scene and pre-programmed visual model data to determine how bodies appear in the generated 3-D scene from the consumer's view.
  • the scene rendering function responds to virtual commands to alter the scene as desired.
  • the virtual viewing system reformats the generated 3-D event into a protocol suitable for transmission.
  • the virtual viewing system 330 also collects viewing metric data for all advertisements displayed to the consumer 20 and advertisements selected by the consumer 20. This metric data is provided to a metric collection system 530 that stores the information in metric storage file 540 with the archive system 46.
  • the archive system 46 stores the metric information for distribution.
  • the advertisements provided to a consumer 20 can be altered to ensure certain metrics are obtained. In addition, advertisements can be altered based on demographic information about the consumer.
  • the football may morph into an advertisement for an airline; a consumer based in Atlanta, Georgia may view an airline advertisement for Southwest Airlines, while a consumer from Denver may view an advertisement for Frontier Airlines.
  • the scene generation system 320 can insert or change advertisements displayed to the consumer 20 based upon demographics or metric desired.
  • Figure 6 illustrates an exemplary stand alone animation as a dynamic advertisement screen shot 600.
  • Manually pre-determined advertising objectives preset dynamic advertisements to occur upon a triggering action.
  • a branded advertisement 620 is automatically displayed upon a triggering action.
  • the champagne bottle 620 is displayed and uncorked in celebration.
  • the brand label 625 is prominently displayed in the scene graph.
  • a consumer can control the action of the event being viewed.
  • the YOU MAKE THE PLAY button 660 allows the consumer to replay action in the event.
  • the consumer controls the action of a selected participant in the action.
  • the PLAY console 650 enables the consumer 20 to direct the actions performed by selected participant.
  • the techniques for controlling the actions of participants in recreated events are well known in the art.
  • Other buttons 670 enable the consumer to control additional aspects of the recreation.
  • the consumer has selected to control the actions of one of the football players 630.
  • the consumer successfully directed the football player 630 and scored a touchdown.
  • the results of the consumer-controlled action trigger an advertisement.
  • a success as indicated by the touchdown signal 615, can trigger one advertisement such as the uncorking of a champagne bottle advertisement 620. Differing results can trigger different advertisements to be displayed.
  • advertisements can be dynamically altered based upon a comparison of the results of the controlled action and the results in the real event.
  • a different advertisement can be dynamically displayed based upon preset criteria.
  • Figure 7 illustrates an exemplary animation morph as a dynamic advertisement screen shot 700.
  • Manually pre-determined advertising objectives preset dynamic advertisements to occur upon a triggering action.
  • an object in the scene is automatically morphed into an advertisement upon a triggering action.
  • the virtual view advertising system 10 automatically morphs the football 710 into an airline advertisement 730 indicating the services of the sponsor.
  • the virtual view advertising system 10 can present different sponsorship indicia on the airline advertisement 730 based upon viewer demographics.
  • the advertisement can be altered to display the major airline carrier servicing the geographic area of the consumer.
  • a consumer can control the action of the event being viewed.
  • the YOU MAKE THE PLAY button 760 allows the consumer to replay action in the event.
  • the consumer controls the action of a selected participant in the action.
  • the PLAY console 750 enables the consumer 20 to direct the actions performed by a selected participant.
  • the techniques for controlling the actions of participants in recreated events are well known in the art.
  • Other buttons 770 enable the consumer to control additional aspects of the recreation. In the illustrated screen shot 700, the consumer has selected to control the actions of the quarterback 720. As illustrated, the consumer successfully directed the quarterback 720 to complete a pass to the receiver 740. Hence, the results of a consumer's action trigger an advertisement.
  • a success, as indicated by the receiver 740 catching the ball 710, can automatically trigger an advertisement such as an airline advertisement 730. Differing results can trigger different advertisements to be displayed. In addition, advertisements can be dynamically altered based upon a comparison of the results of the controlled action and the results in the real event. Furthermore, even if the consumer replays the same action and achieves the same result, a different advertisement can be dynamically displayed based upon preset criteria.
  • Figure 8 illustrates an exemplary object displacement as a dynamic advertisement in screen shots 800' and 800".
  • Objects in a recreated event are associated with model nodes.
  • the model nodes include a displacement attribute.
  • the virtual view advertisement system 10 determines an offset based upon the displacement attribute.
  • the offset establishes the displacement from the object's location in the real event by which the object can be displaced when the object moves out of the rendered window frustum.
  • the object nodes contain a priority attribute. Objects that block the line of sight to a higher priority object can be shifted to enable unfettered display of the higher priority object.
  • an object in a displayed rendered window 800' is automatically displaced to continue displaying the advertisement in another rendered window 800".
  • Sponsor A billboard 810 is automatically displaced from its original position 812 in the first displayed rendered window 800' to another billboard position 814 in the next displayed rendered window 800".
  • an object, such as a sideline camera tower or any other obstruction, is moved from its actual position 850 to a displaced location 850' to ensure that the Sponsor A billboard 810 is viewed without obstruction.
  • the disclosed methods and systems provide for the display of advertisement material in recreated events.
  • the methods and systems can map interactive advertisement materials onto any object in a recreated scene or dynamically present an advertisement based upon automatic or manual triggers. Therefore, the methods and systems disclosed have industrial applicability for advertisement, entertainment, and other industries.

Abstract

The virtual viewing advertising system (330) provides marketing using in scene advertisements to consumers viewing recreated events. Advertisements are inserted into recreated events without distracting from the action provided by the real event. Event data from a real event is captured either by active or passive sources. An inventory operator sets the advertisement objectives that identify the type of advertisement that is to be applied to a recreated event. The event data is decomposed (410) into high-level objects. The high-level objects are dynamically decomposed (410) based upon the advertisement objectives. The requisite advertisement inventory (430) is generated and mapped into a scene graph. The scene graph is displayed as part of the recreated event. In addition, consumer metric information can be collected, and the selected advertising inventory to be included in a scene graph can be altered based upon information known about the consumer.

Description

DESCRIPTION
SYSTEMS AND METHODS FOR INTEGRATING VIRTUAL
ADVERTISEMENTS INTO RECREATED EVENTS
The following PCT application claims priority to United States Provisional Application number 60/199,887 filed on April 26, 2000 entitled "Method and Apparatus for Integrating Advertisements Within a Three-Dimensional Rendered Image With Interaction to Live Events." All information disclosed in that prior pending provisional application is incorporated herein by reference.
Technical Field
The invention relates generally to three-dimensional computer graphics, and, more particularly, to systems and methods that enable marketing to viewers through in-scene advertisements while they view rendered real events.
Background Art
In the art of computer gaming, sporting events have been represented in animation with the intent of immersing the user into the action. One such system is the Virtual View Sport System disclosed in Patent Number 6,124,862. The Virtual View Sport System captures an event and processes the captured event data to generate a three-dimensional recreation of the event. A consumer viewing the recreated event is immersed into the action and can watch the event from any desired view.
Naturally, game developers want to increase profits by receiving advertisement revenue from sponsors. To this extent, game developers have included advertisements by displaying sponsor logos as a backdrop or by adding splash screens during game initialization. In addition, sponsor logos have been attached to objects within the game. However, the advertisements are often not viewable because of the object's orientation or because the object is outside of the viewable window. Furthermore, the gaming industry is not interested in advertisements that divert the audience's attention from the game. Consumers tolerate intrusive advertisements when the product or service is provided free of charge, but generally are annoyed otherwise. Current advertising approaches are inadequate to incorporate significant advertising material without antagonizing the consumers.
Sponsors desire to know how often a target audience views their advertising indicia. Furthermore, sponsors wish for their advertising indicia to be constantly in the view of the consumers. Advertising exposure can be measured on the basis of the number of units sold. However, in the unit-sold metric model, advertisers are limited to promoting their products when a new game version is released, which typically occurs annually. In addition, sponsors generally want their advertisements targeted to particular demographics.
Therefore, what is desired in the industry is a system that offers complete control of the advertisements offered to the consumers and offered in relation to collected metric and demographic information. Advertisement material also needs to be able to be altered based upon collected information about the consumer, and it needs to be prominently displayed without antagonizing the consumer.
Disclosure of Invention
The present invention provides a new marketing medium using in scene advertisements to consumers viewing recreated events. The system inserts advertisements into recreated events without distracting from the action provided by the event. The system can collect metric and demographic information about the consumer and dynamically select advertisement inventory based upon the collected information.
Generally described, advertising indicia are inserted into scene graphs and displayed to a consumer as part of a recreated event. Event data can be captured from a real event by active or passive sources. Advertising inventory is determined based upon the advertising objectives. The event data is decomposed into high-level objects. The high-level objects are dynamically decomposed based upon the advertisement objectives, and the requisite advertisement inventory is generated and mapped into a scene graph. The scene graph is displayed as part of the recreated event. In addition, consumer metric information can be collected, and the selected advertising inventory to be included in a scene graph can be altered based upon information known about the consumer. Furthermore, each advertisement in the advertisement inventory can be activated by a consumer. This interactive advertisement inventory, upon triggering, can cause the display of additional advertisement material.
An object of the invention is to enable displayed advertisements to face a camera while the object to which the advertisement is associated moves in the event. Consequently, a model node is associated with an object in a scene graph of a recreated event. Model nodes include a predetermined region of space for advertisement inventory. From the advertisement inventory, selected advertisements are associated with the object, and a line of sight is calculated from a virtual camera to an advertisement that is required for the advertisement to face the virtual camera. If needed, the advertisement is rotated or translated within the predetermined region in order to keep an angle between an advertisement plane and the line of sight between 0 degrees and 180 degrees.
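A minimal billboarding sketch of this line-of-sight adjustment is given below, restricted to a yaw-only rotation about the vertical axis; the class names and the single-axis simplification are assumptions, not the patented method.

```python
import math

class AdPlane:
    def __init__(self, position, yaw=0.0):
        self.position = position      # (x, y, z) centre of the ad within its region
        self.yaw = yaw                # rotation about the vertical axis, in radians

def face_camera(ad: AdPlane, camera_position) -> None:
    # Line of sight from the virtual camera to the advertisement.
    dx = camera_position[0] - ad.position[0]
    dz = camera_position[2] - ad.position[2]
    # Rotate the ad about the vertical axis so its normal points along the
    # line of sight, keeping the ad plane visible to the camera.
    ad.yaw = math.atan2(dx, dz)

billboard = AdPlane(position=(10.0, 2.0, 40.0))
face_camera(billboard, camera_position=(0.0, 5.0, 0.0))
print(round(math.degrees(billboard.yaw), 1))   # yaw needed to face the camera
```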
Another objective is to enable advertisements to move in order to remain displayed in a rendered window. Consequently, a model node is associated with an object in a scene graph of a recreated event. The recreated event can be derived from a real event. The model node includes a displacement attribute. Additionally, an advertisement is associated with the object. Based upon the displacement attribute, an offset from the object's location in the event is determined, and based upon the offset the object can be displaced when the object moves out of a rendered window's frustum. By moving the object, the advertisement associated with the object can remain displayed in the rendered window.
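The displacement behaviour can be sketched with a one-dimensional stand-in for the frustum test: if the object drifts outside the visible interval, it is shifted back toward it by no more than the displacement attribute allows. The function name, the axis-aligned view test, and the numbers are illustrative assumptions.

```python
def keep_in_view(object_x, view_left, view_right, max_displacement):
    """Return a displaced x position that stays within the rendered view,
    limited to max_displacement from the object's real-event location."""
    if view_left <= object_x <= view_right:
        return object_x                       # already visible, no displacement
    target = min(max(object_x, view_left), view_right)
    shift = max(-max_displacement, min(max_displacement, target - object_x))
    return object_x + shift

# A sideline billboard at x=95 follows the play when the window only spans 20..80.
print(keep_in_view(95.0, view_left=20.0, view_right=80.0, max_displacement=10.0))  # 85.0
```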
Yet another object is to enable advertisements to be prominently displayed by changing the orientation of an object. An advertisement is associated with the object. In order to aid in the prominent display of the advertisement, a model node is associated with an object in a scene graph of a recreated event. The model node includes an orientation attribute. The object is rotated or translated based upon the orientation attribute such that a predetermined angle range between an object plane and a line of sight from a virtual camera is maintained. Maintaining the object's orientation displays the advertisement prominently within a rendered window.
A further object is to enable advertisements to be viewed by removing obstructing obstacles. Model nodes are associated with each object in a scene graph of a recreated event, which may be derived from a real event. The model nodes include a priority attribute and an advertisement is associated with a high priority model node. Each model node is tested with a line of sight vector and obstructing objects that obstruct a view to higher priority model nodes are displaced. Moving obstructing objects enables the advertisement to be displayed within a rendered window.
Another further object is to accent advertisements by incorporating the advertisement as part of the action. In a rendered event, objects can be morphed into a virtual advertisement upon a triggering action within the event. A virtual advertisement is an advertisement displayed by morphing an object in the event.
Still another object is to be able to display advertisements on any object within a scene graph of a recreated event. After receiving the event data, the event data is decomposed into high-level objects. Each high-level object is decomposed. Any decomposed object can be associated with advertisement material. The advertisement inventory comprises selectable advertisements that can be inserted into the scene graph. Based upon desired advertisement objectives, selected advertisement inventory can be mapped onto any decomposed object within the scene graph.
Yet a further object is to display advertisements based upon the results in a recreated event of a participant's action that is being controlled by the consumer. Each participant can be controlled by the consumer upon selection by the consumer. Upon triggering the participant, the consumer controls the action of the participant. Based upon the first results of the controlled action, result advertisements are displayed. In addition, if the consumer replays the action, the displayed advertisements can be dynamically altered based upon the first result and the replay result. Furthermore, the consumer can control participants in similar plays later in the event. The displayed advertisements can be dynamically altered based upon the first result and the later result. In addition, other advertisements can be displayed based upon a comparison of the controlled action results and the result in the real event. Thus, advertisements can be inserted into the recreated event and dynamically altered based upon the results of recreated action controlled by the consumer.
Other features and advantages of the present invention will become apparent from a reading of the following description as well as a study of the appended drawings.
Brief Description of Drawings
Figure 1 illustrates a functional block diagram of an exemplary virtual view advertising system; Figure 2 illustrates a functional block diagram of an exemplary capture system;
Figure 3 illustrates a functional block diagram of an exemplary event recreation system;
Figure 4 illustrates a functional block diagram of an exemplary scene generation system;
Figure 5 illustrates a functional block diagram of an exemplary interactive viewing system;
Figure 6 illustrates an exemplary screen shot of a stand alone animation as a dynamic advertisement; Figure 7 illustrates an exemplary screen shot of an animation morph as a dynamic advertisement; and,
Figure 8 illustrates exemplary screen shots of an object displacement as a dynamic advertisement.
Best Mode for Carrying Out the Invention
Referring to the drawings for a better understanding of the function and structure of the invention, the exemplary embodiment of the present invention is called the Virtual View Advertisement System (VVA System). The exemplary system is designed to incorporate advertisement objectives within the Virtual View Sport System disclosed in Patent Number 6,124,862. Patent Number 6,124,862 is incorporated by reference in its entirety. However, it will be understood by those skilled in the art that numerous other systems can dynamically place advertisements on objects within captured real events. For the purpose of illustration and not limitation, the present invention will be described with reference to the insertion of advertisement material into objects within the context of sporting events.
Nonetheless, it will be understood by those skilled in the art that the VVA System may be applied to any situation in which pre-selected objects are tracked and converted into a virtual environment, such as military maneuvers, space exploration, political speeches, and parades, etc.
Figure 1 illustrates the overall system architecture of an exemplary VVA System 10. The exemplary embodiment comprises five major subsystems: a capture system 40, a recreation system 42, a distribution system 44, an archive system 46, and an interactive viewing system 48. These subsystems interact utilizing commercial off-the-shelf electronic components that provide high-speed connectivity to manage and present advertisement inventory for alternative media formats including animation and sound recreated from real events. The capture system 40 collects and processes event data from various sources.
These sources include both active and passive sources such as digital video 30 and global positioning satellite (GPS) source 32 data. The event data attained by the capture system 40 is provided to the event recreation system 42 in order to render the event into a 3-D recreation and integrate advertisements into the scene. The capture system 40 is described in further detail in reference to Figure 2. The event recreation system 42 recreates the event in three dimensional animation and identifies the inventory required to present the animation. The inventory includes all models, textures, sounds, advertisements, and state data to present the animation. The inventory and recreated event are provided to the archive system 46 for storage. The distribution system 44 provides the event recreation system 42 with requests for recreated events. The event recreation system 42 provides the distribution system 44 with the recreated event either in real-time from the capture system 40 or from archived event data stored in the archive system 46. The advertisements inserted into any scene provided to the distribution system 44 are selected primarily based upon demographic and metric information, which is collected by the interactive viewing system 48 and stored in the archive system 46. The event recreation system 42 is described in further detail in reference to Figure 3. The distribution system 44 receives a consumer's request for an event and distributes the archived inventory and recreated event data to the interactive viewing system 48 for presentation to the consumer 20. The archive system 46 stores data for non-real-time distribution of events; stores inventory models, textures, and sounds; and stores advertisement metric data and information about the consumer 20. The interactive viewing system 48 presents the recreated event to the consumer 20, collects the metric data, and responds to consumer interaction. The interactive viewing system 48 is described in greater detail in reference to Figure 5.
Figure 2 illustrates an exemplary capture system 40 that receives sensor data from video sources 30 and global positioning satellite (GPS) sources 32. A set of conventional television cameras 30 strategically positioned around the event provides the raw video data stream used to generate three dimensional positional data based on what each camera 30 views in two dimensions. Two or more cameras 30 should be used depending on the application and the amount of accuracy desired, and additional high-resolution cameras 30 may be incorporated if extreme accuracy is required. The GPS source 32 can provide a data stream for precise positioning of an object being tracked.
The image processing system 210 collects and processes frame data. The image processing system 210 processes video frames and camera orientation from the video cameras 30 and stores the information in frame buffers. A series of camera interface cards provides the buffering of camera frame data. A bus expansion card can be utilized to provide high-speed connectivity to other system components. The buffered frames are polled and associated with a sighting time. Image array processor boards can provide the image processing. Each image processing board hosts a single instruction multiple data stream architecture (SIMD)-type array of high-speed image processors.
Edge detection is performed on each frame to identify bodies and their body centroid or object type with position and orientation. Standard image software techniques such as segmentation, feature extraction, and classification are utilized to break down each body or object image in a frame. Each resulting extracted group is classified according to preselected image templates. The body centroid, body identification, and body sighting time, or object type, position and orientation with starting time is provided to the track correlation system 220.
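As a rough stand-in for the image-processor boards, the per-frame step can be sketched with OpenCV (version 4 assumed): edge detection, contour-based segmentation, centroid extraction via image moments, and a crude size rule in place of template classification. The thresholds, the synthetic frame, and the size rule are assumptions made for illustration.

```python
import cv2
import numpy as np

def extract_objects(frame_gray, sighting_time):
    edges = cv2.Canny(frame_gray, 50, 150)                     # edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)    # segmentation
    detections = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] == 0:
            continue
        centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])  # body centroid
        area = cv2.contourArea(contour)
        # Stand-in "classification": a crude size rule instead of template matching.
        label = "body" if area > 500 else "object"
        detections.append({"centroid": centroid, "type": label,
                           "time": sighting_time})
    return detections

# Synthetic 2-D frame with one bright blob standing in for a tracked body.
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(frame, (160, 120), 30, 255, -1)
print(extract_objects(frame, sighting_time=0.04))
```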
The track correlation system 220 creates three dimensional positional data for objects viewed by the cameras 30. This three dimensional data is continuously updated to correspond with the targeted figures' movements and the relative position of other objects, such as the relative position of a ball. The input from the GPS source 32 can aid in the positioning of an object and provide state data to the track correlation system 220. Two-dimensional position data for each object relative to the frame is passed to the track correlation system 220. A track correlation board processes the object position data from each of the image array boards and correlates each object position data with historical 3-D object position data, known as track files, that are stored in memory. A shared memory area is accessible through a common DMA interface connected to all the systems comprising the virtual view advertisement system 10 through common electronic subsystem buses. The track correlation system 220 formulates 3-D object and body tracks based on the 2-D object and body state information. The object propagation system 230 generates realistic animated characters from positional data and video frames. Preprogrammed knowledge of body movements corrects and improves the body states estimated by the track correlation system 220. Measured body states can be altered by recognizing certain gestures, such as a player running, to smooth the state and present a more realistic visual simulation. Smoothing reduces jerkiness and compensates for any missing measurements.
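The correlation of per-frame sightings with historical track files described above can be pictured with a short sketch. This is illustrative only, assuming a simple nearest-neighbor gate; the Track class, the correlate() function, and the gating distance are hypothetical and not the actual track correlation board.

```python
# Illustrative sketch: correlate per-frame 2-D object sightings with
# historical track files by nearest-neighbor gating.
from dataclasses import dataclass, field
import math

@dataclass
class Track:
    track_id: int
    x: float              # last correlated position in frame coordinates
    y: float
    history: list = field(default_factory=list)

def correlate(tracks: list[Track], sightings: list[tuple[float, float]],
              gate: float = 25.0) -> list[Track]:
    """Assign each sighting to the nearest track within the gate,
    or open a new track when nothing is close enough."""
    next_id = max((t.track_id for t in tracks), default=0) + 1
    for sx, sy in sightings:
        nearest = min(tracks, key=lambda t: math.hypot(t.x - sx, t.y - sy), default=None)
        if nearest and math.hypot(nearest.x - sx, nearest.y - sy) <= gate:
            nearest.history.append((nearest.x, nearest.y))
            nearest.x, nearest.y = sx, sy
        else:
            tracks.append(Track(next_id, sx, sy))
            next_id += 1
    return tracks

tracks = [Track(1, 100.0, 50.0)]
print(correlate(tracks, [(103.0, 52.0), (400.0, 300.0)]))
```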
Additionally, the physics of motion and interaction with other bodies can be modeled to improve body states. Articulated parts of each body are included in the body state calculations. Each body includes articulated body parts such as two feet, two knees, two torso points, one abdomen, two shoulder points, two elbows, two hands, and one head. The object propagation system 230 applies human movement algorithms to smooth the articulated body parts, which are stored in the body state file. After each object has been decomposed and its movements smoothed for fluid motion, the event data is provided to the event recreation system 42.
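One simple way to picture the smoothing of an articulated part's measured positions is exponential smoothing. The sketch below is illustrative only; smooth_part(), the smoothing factor, and the sample data are hypothetical, not the human movement algorithms of the specification.

```python
# Illustrative sketch: exponential smoothing of an articulated body part's
# measurements to reduce jerkiness and bridge missing frames.
def smooth_part(measurements, alpha=0.4):
    """measurements: list of (x, y, z) tuples, or None for a dropped frame."""
    smoothed, last = [], None
    for m in measurements:
        if m is None:               # missing measurement: carry the estimate forward
            smoothed.append(last)
            continue
        if last is None:
            last = m
        else:
            last = tuple(alpha * mi + (1.0 - alpha) * li for mi, li in zip(m, last))
        smoothed.append(last)
    return smoothed

# e.g. a hand node over five frames, with one dropout
print(smooth_part([(0, 0, 0), (1, 0, 0), None, (3, 0, 0), (4, 0, 0)]))
```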
Figure 3 illustrates an exemplary event recreation system 42. The event recreation system 42 generates realistic, animated characters from positional data and video frames. Actual video data is overlaid onto the characters to produce very lifelike appearances. Advertisements can be dynamically placed on any participant or object such that the advertisements appear to be attached to the participant or object in its exact location in the real event. The event recreation system 42 includes a model texturing board and a series of computers to generate the realistic, animated characters from positional data and video frames. A bus expansion card can provide high-speed connectivity between the track correlation system and the object propagation system through the direct memory access (DMA) interface. Body position data and real-time visual models are synchronized with the interactive viewing system 48 through the DMA interface.
Recreated event data contains state data for all objects in the recreated scene in addition to information about the specific inventory for the scene. For instance, recreated event data for a football scene will include references to the stadium models, textures, and sounds; spectator models, textures, and sounds; each participant's models, textures, and sounds; and the advertising materials to be incorporated into the scenes. The recreated event data can be sent to the archive system 46 for storage or passed directly to the distribution system 44 for real-time distribution. The distribution system 44 receives and processes consumer requests for event data and distributes customized inventory and event data to the interactive viewing system 48 for display to the consumer 20. The distribution system 44 receives the recreated event data from the event recreation system 42 and inventory data from the archive system 46. Each time a request is made to the distribution system 44, it passes the request to the event recreation system 42.
The inventory management system 310 processes the request and determines if the event inventory should be modified. Baseline advertising objectives are manually set prior to receiving the event data from the capture system. Advertising objectives can also be automatically set by known techniques based upon the advertisement inventory. These objectives identify the type of advertising that is to be applied to the scene. Many advertisements may fall within a particular type, and each may include a priority attribute. The advertisement inventory is dynamically modified based on consumer metrics and priority. For instance, if the objective is to include hand advertisements in the scene, consumer metrics may determine that the consumer is a fan of a particular advertiser such as NIKE, and therefore a NIKE brand batting glove advertisement will be placed in the scene. The priority is a dynamic attribute and can change each time the event is recreated.
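The selection described above can be sketched as a filter on objective type, a preference for brands the consumer favors, and a tie-break on priority. The sketch is illustrative only; the data records, select_ad(), and the "favored_brands" metric are hypothetical.

```python
# Illustrative sketch: choose an advertisement for a scene given the advertising
# objective (e.g. "hand" placements), consumer metrics, and per-ad priority.
ADS = [
    {"name": "NIKE batting glove",    "object_type": "hand",      "brand": "NIKE",   "priority": 2},
    {"name": "Generic batting glove", "object_type": "hand",      "brand": "ACME",   "priority": 1},
    {"name": "Soda billboard",        "object_type": "billboard", "brand": "SODACO", "priority": 3},
]

def select_ad(objective_type, consumer_metrics, ads=ADS):
    candidates = [a for a in ads if a["object_type"] == objective_type]
    favored = [a for a in candidates
               if a["brand"] in consumer_metrics.get("favored_brands", [])]
    pool = favored or candidates          # prefer brands the consumer favors
    return max(pool, key=lambda a: a["priority"], default=None)

print(select_ad("hand", {"favored_brands": ["NIKE"]}))   # -> NIKE batting glove
```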
All advertising inventory is stored and maintained in the archive system 46. Each advertisement has attributes that characterize the advertisement and define the type of object to which it can be applied. For instance, batting glove advertisements are applied to hand objects. The hand object's node defines the region where the batting glove ad is to be applied within the scene. Event inventory may be modified to meet specific advertisement goals or to reflect changes in user preferences. The inventory management system 310 determines the inventory, comprising models, textures, and sounds, that is required to recreate the event and determines what advertisement inventory is to be included in the event. The event data is then retrieved from the archive system 46 and passed back to the distribution system 44 for delivery to the interactive viewing system 48.
The event recreation system 42 generates photo-realistic 3-D visual models based on the event data provided by the track correlation system 220. The system 42 accesses information from databases that hold the measurements for each body part of a participant, from which it generates a 3-D wire-frame representation of the participant. Video frame data is utilized to generate model textures for each body part, and these textures are mapped onto the previously generated wire-frame model to create the realistic 3-D visual model.
A user interface 340 enables an inventory operator 350 to designate which objects should display advertisements. The inventory operator 350 can predetermine the model nodes associated with the objects. Each object in the scene has a node in the animation scene graph. For example, in a football highlight, there are nodes for each object that makes up a football player. For instance, the football player may be composed of a head node, a torso node, a hip node, two arm nodes, and two leg nodes. The object decomposition is defined a priori; it may, however, differ each time a highlight is recreated. In the example above, the football player node may be further decomposed to include hand and finger nodes. The object model utilized is determined by the advertising objectives. Model nodes contain attributes that determine how the model should be displayed in the scene in addition to the actions that are to be performed upon selection by the consumer 20. Display attributes include orientation with respect to the camera location, displacement from state data, and presentation priority in the field of view. For example, a billboard node on a stadium wall can include orientation display attributes such that the billboard maintains a 90 degree angle with the virtual camera when displayed in a rendered window. Additionally, the displacement attribute determines the 3-D offset from the model's location in the event data by which the model can be displaced when the object moves out of the rendered window frustum. The displacement attribute is used to move the object such that it stays within the rendered view. For example, as a football player runs down the sidelines, a billboard on the stadium wall can move along the wall with the player in order to stay in view.
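The orientation display attribute can be pictured as a yaw computed so that the billboard's normal points at the virtual camera. The sketch below is illustrative only and works in the ground plane; billboard_yaw(), orient_node(), and the "face_camera" flag are hypothetical names.

```python
# Illustrative sketch: an orientation attribute that keeps a billboard node
# facing the virtual camera (its plane at 90 degrees to the viewing direction).
import math

def billboard_yaw(billboard_pos, camera_pos):
    """Yaw angle (radians) that points the billboard's normal at the camera."""
    dx = camera_pos[0] - billboard_pos[0]
    dz = camera_pos[2] - billboard_pos[2]
    return math.atan2(dx, dz)

def orient_node(node, camera_pos):
    if node.get("face_camera"):                 # the orientation display attribute
        node["yaw"] = billboard_yaw(node["position"], camera_pos)
    return node

stadium_billboard = {"position": (10.0, 2.0, -40.0), "yaw": 0.0, "face_camera": True}
print(orient_node(stadium_billboard, camera_pos=(0.0, 5.0, 0.0)))
```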
The presentation priority attribute is designed to ensure that specific models are displayed in the rendered window's forefront. When the scene is rendered by the scene rendering system, each model node is tested with a line of sight vector, and adjustments to the state data are made to models that obstruct the view of higher priority model nodes. This ensures that an advertisement inventory item will be prominently displayed in the rendered window. The advertisement objects include attributes that determine if the ad in the scene should be adjusted such that it always faces the virtual camera. Each model node in the scene graph has a predetermined region of space for an advertisement inventory item. The interactive viewing system 48 calculates a line of sight from the virtual camera to an advertisement that is required to face the virtual camera. The advertisement is translated and rotated within the region to maintain the angle between the advertisement plane and the line of sight between 0 and 180 degrees. For example, an advertisement can be placed on a baseball bat with the bat being in the exact location as in the real event. However, during a rotation of the bat in the real event, the advertisement can be adjusted in the recreated event such that the advertisement always faces the virtual camera.
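The line-of-sight test and adjustment of obstructing lower-priority nodes can be sketched geometrically. This is illustrative only; the obstructs() distance-to-segment test and the fixed sideways shift are hypothetical simplifications of the state-data adjustments described above.

```python
# Illustrative sketch: test model nodes against a line of sight from the virtual
# camera to a high-priority advertisement and nudge obstructing lower-priority nodes.
import numpy as np

def obstructs(camera, target, node_pos, radius=1.0):
    """True if node_pos lies within `radius` of the camera->target segment."""
    c, t, p = map(np.asarray, (camera, target, node_pos))
    seg = t - c
    u = np.clip(np.dot(p - c, seg) / np.dot(seg, seg), 0.0, 1.0)
    return np.linalg.norm(p - (c + u * seg)) < radius

def enforce_priority(camera, nodes):
    nodes = sorted(nodes, key=lambda n: n["priority"], reverse=True)
    for hi in nodes:
        for lo in nodes:
            if lo["priority"] < hi["priority"] and obstructs(camera, hi["pos"], lo["pos"]):
                lo["pos"] = tuple(np.asarray(lo["pos"]) + np.array([2.0, 0.0, 0.0]))  # shift aside
    return nodes

nodes = [{"name": "Sponsor A billboard", "pos": (0, 2, -30), "priority": 5},
         {"name": "camera tower",        "pos": (0, 2, -15), "priority": 1}]
print(enforce_priority(camera=(0, 2, 0), nodes=nodes))
```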
In addition, the inventory operator 350 can select and insert the automatic and manual triggers to be incorporated in the event. Action within the scene can trigger objects to morph into virtual advertisements. For instance, when a long pass is thrown, a football might morph into a branded plane such as those owned by Delta or Southwest Airlines. Morphing can be triggered by specific actions or inserted in the timeline to occur at a specific instant in time. An event such as the scoring of a touchdown can trigger an advertisement to be displayed.
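An automatic morph trigger of this kind can be sketched as a rule evaluated against the scene graph. The fragment below is illustrative only; the 20-yard threshold, the node names, and check_morph_trigger() are hypothetical.

```python
# Illustrative sketch: an automatic trigger that morphs the football into a
# branded object once a pass travels more than a threshold distance in the air.
LONG_PASS_YARDS = 20.0

def check_morph_trigger(scene_graph, pass_air_yards):
    """Replace the football node's model when the long-pass trigger fires."""
    if pass_air_yards > LONG_PASS_YARDS and "football" in scene_graph:
        scene_graph["football"]["model"] = "branded_airliner"       # morph target
        scene_graph["football"]["advertisement"] = "airline_sponsor"
    return scene_graph

scene = {"football": {"model": "football", "advertisement": None}}
print(check_morph_trigger(scene, pass_air_yards=27.5))
```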
Audio advertisements can be inserted within the action of the event. For instance, in sporting events, a virtual public address system within the animation can be utilized to deliver audio advertisements during a break in the action. A whistle, buzzer, or the expiration of clock time can signal a break in the action. Furthermore, triggers within the event data can trigger audio advertisements from the participants within the recreation. For instance, when a quarterback is sacked in a recreated football event, an audio advertisement delivered by the quarterback for headache powder, such as the product sold by Goody's, might be triggered.
All advertisements within the recreated scene will be interactive via hot spots. Consumers are able to view other forms of media for the product by clicking on the hot spot. Activation of a hot spot can send the consumer to the sponsor's web site, initiate a purchase, or cause the display of a video clip, etc. In addition, advertisements can be tailored to specific participants. For example, selecting a specific player can provide a list of products endorsed by the player.
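Hot spot activation can be pictured as a dispatch on the action attached to the advertisement, with a metric record written as a side effect. The sketch is illustrative only; the action names, fields, and URL are hypothetical.

```python
# Illustrative sketch: dispatch the action attached to an advertisement hot spot
# when the consumer clicks it, logging the click as metric data.
def activate_hot_spot(hot_spot, metrics_log):
    metrics_log.append({"ad": hot_spot["ad"], "event": "clicked"})   # metric collection
    action = hot_spot["action"]
    if action == "open_site":
        return f"navigate to {hot_spot['url']}"
    if action == "play_clip":
        return f"play video {hot_spot['clip']}"
    if action == "purchase":
        return f"start checkout for {hot_spot['sku']}"
    return "no-op"

log = []
print(activate_hot_spot({"ad": "Sponsor A", "action": "open_site",
                         "url": "https://example.com"}, log))
print(log)
```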
In addition, advertisements can be incorporated as a result of action within a recreated event controlled by the consumer 20. In a rendered event derived from a real event, the consumer can take control of the actions of one or more of the participants in the recreated event and change the outcome of a portion of the real event. Consumers can interact with recreated events using standard techniques known in the gaming industry. The subsequent outcome, as produced by the consumer, can trigger an advertisement based upon the results of the outcome produced by the consumer. Furthermore, numerous such outcomes can be combined over a period of viewing time, or over multiple viewing times, to produce combined results that can trigger different advertisements. Furthermore, the method for combining such outcomes can coincide with the scoring methods used in the real event, allowing for triggered advertisements based upon how well the consumer fared in comparison to the participants in the real event.
The virtual viewing system 330 renders the video generated by the virtual camera utilizing entity states and visual models. The virtual viewing system 330 is described in greater detail in reference to Figure 4. The virtual viewing system 330 interacts with the interactive viewing system 48 to display a scene from a desired view, orientation, or distance. The virtual viewing system 330 responds to consumer commands to rotate a camera angle, zoom in or out, or provide a view desired by the consumer. Any time the virtual viewing system 330 displays an advertisement or a consumer activates a hot spot, metric data is collected and stored in the archive system 46.
The archive system 46 stores event data for later display. An archive server can provide real-time recording and playback of body position information and visual models from a redundant array of drives (RAID) that provides fast, redundant storage and retrieval of historical, virtual data on demand. When an archived event is chosen for viewing, the archive system 46 delivers the event data to the inventory management system 310. The inventory management system 310 can insert all-new advertisements anywhere within the event. The virtual view advertisement system 10 is designed such that there is complete control over the specific advertisements that are presented to individual consumers 20.
The process for applying advertising inventory starts with manually setting the advertisement objectives. Then, the event data is received from the capture system. Event data is decomposed into high-level objects such as participants, audience, officials, and playing fields. The high-level objects are dynamically decomposed based upon the advertising objectives. Consumer metric data is retrieved from the archive system. Based upon the consumer metric information, advertisement inventory is retrieved from the archive system 46. The advertisement inventory is mapped to the scene graph, and the recreated event data is generated.

Figure 4 illustrates an exemplary scene generation system 340. The scene generation system 340 generates virtual scenes. Event data is received and processed in real time from the capture system 40. Event data that is received from the capture system 40 is decomposed into unique models for the event. The virtual event consists of participants, spectators, and venue objects. Each of the objects can be decomposed to various levels of detail by the object decomposition subsection 410. The model level of detail is partially determined by the advertisement inventory for the event. For example, if the event is to include a virtual advertisement for a wristwatch on a particular participant, the model for that participant will include objects for the participant's arms, including wrist nodes. The wrist node will contain attributes that describe how the model should respond when presented in the interactive client viewing system 48. The inventory will include the watch textures, which will be applied to the arm model in the rendered scene.
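The inventory-driven level of detail described above can be sketched as a lookup that adds finer nodes when a planned advertisement needs them. This is illustrative only; the base node list, the rule table, and decompose_participant() are hypothetical.

```python
# Illustrative sketch: decompose a participant object to a finer level of detail
# when the advertisement inventory calls for it (e.g. a wristwatch ad needs wrist nodes).
BASE_NODES = ["head", "torso", "hip", "left_arm", "right_arm", "left_leg", "right_leg"]

EXTRA_NODES_BY_AD_TYPE = {
    "wristwatch":    ["left_wrist", "right_wrist"],
    "batting_glove": ["left_hand", "right_hand"],
}

def decompose_participant(ad_types):
    """Return the participant's node list given the ads planned for the scene."""
    nodes = list(BASE_NODES)
    for ad_type in ad_types:
        nodes.extend(EXTRA_NODES_BY_AD_TYPE.get(ad_type, []))
    return nodes

print(decompose_participant(["wristwatch"]))   # includes wrist nodes for the watch texture
```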
The inventory management system 310 determines the inventory, comprising the models, textures, sounds, and advertisements that are required to recreate the event, and determines what advertisement inventory is to be included in the event. The inventory models, textures, and sounds, as well as the advertisement material, are stored in the archive system and provided to the inventory mapping function to provide texture to the objects. Advertising inventory consists of digitized image data files in Targa, JPEG, or other formats and attributes maintained in the archive system database. Advertising attributes include size, local axis origin, object type, and priority. The inventory management system uses these attributes in addition to inventory objectives, consumer metric data, and activated manual triggers to determine what advertising inventory to include in the scene.
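One possible shape for an advertisement inventory record carrying the attributes named above is sketched below. The field names and sample values are hypothetical; only the attribute list (size, local axis origin, object type, priority) comes from the description.

```python
# Illustrative sketch: an advertisement inventory record with its attributes
# and the digitized image file it references.
from dataclasses import dataclass

@dataclass
class AdInventoryItem:
    name: str
    image_file: str              # e.g. a Targa or JPEG texture
    size: tuple                  # (width, height) in model units
    local_axis_origin: tuple     # (x, y, z) attachment origin on the target object
    object_type: str             # kind of scene node it may be applied to
    priority: int                # higher value wins when placements are contested

watch_ad = AdInventoryItem(
    name="Sponsor wristwatch",
    image_file="watch_face.tga",
    size=(0.04, 0.04),
    local_axis_origin=(0.0, 0.0, 0.0),
    object_type="wrist",
    priority=3,
)
print(watch_ad)
```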
The inventory mapping subsection 430 generates photo-realistic 3-D visual models. The inventory mapping subsection 430 accesses information from shared databases holding measurements for each of the body parts of a sports figure, from which it generates a 3-D wire-frame representation of the participant. The inventory mapping subsection 430 utilizes video frame data to generate model textures for each body part. The textures are mapped onto the wire-frame model.
The scene rendering subsection 420 utilizes body state information previously calculated by the track correlation system 220 and texture information generated by the inventory mapping subsection 430 to provide coherent body position information and visual models to the interactive viewing system 48. Body position data and real-time visual models are synchronized with the interactive viewing system 48 through the DMA interface.
Figure 5 illustrates an exemplary interactive viewing system 48. The interactive viewing system 48 is the consumer's interface for viewing recreated events. Inventory and other event data are downloaded to the consumer's computer for event recreation. The scene generation system 320 is capable of reproducing events in three dimensional animation, including the insertion of advertisement material. The distribution system 44 provides the models, textures, sounds, and advertisement material for rendering of the event desired by the consumer from the consumer inventory storage file 510 in the archive system 46. A user interface 520 enables the consumer 20 to interact with the virtual viewing system 330.
The user interface 520 processes virtual camera commands and replay commands from the consumer 20. The consumer 20 can use a joystick in the interface console or select from pre-selected viewpoints to move the viewing perspective to virtually any location. The consumer 20 controls the virtual perspective through standard imaging software. The user interface 520 serves to transmit navigation commands such as pointing, zooming, and focusing to the virtual viewing system 330. In addition, the user interface 520 collects statistical information about the consumer 20. Upon accessing the virtual view advertisement system 10, the consumer 20 is prompted to provide information about himself. The information can include age, sex, and zip code. The zip code and other collected demographic information can yield statistical information such as approximate family and income demographics. When a consumer 20 selects a hot spot, the user interface 520 provides the selection information to the metric collection system 530, which sends it to the metric storage file 540 for storage in the archive system 46.
The scene generation system 320 utilizes entity state data to position objects in a 3-dimensional scene and pre-programmed visual model data to determine how bodies appear in the generated 3-D scene from the consumer's view. The scene rendering function responds to virtual camera commands to alter the scene as desired. The virtual viewing system reformats the generated 3-D event into a protocol suitable for transmission. The virtual viewing system 330 also collects viewing metric data for all advertisements displayed to the consumer 20 and advertisements selected by the consumer 20. This metric data is provided to a metric collection system 530 that stores the information in the metric storage file 540 within the archive system 46. The archive system 46 stores the metric information for distribution. The advertisements provided to a consumer 20 can be altered to ensure certain metrics are obtained. In addition, advertisements can be altered based on demographic information about the consumer. For example, during a long pass play in a football game, the football may morph into an advertisement for an airline; a consumer based in Atlanta, Georgia may view an airline advertisement for Southwest Airlines, while a consumer from Denver may view an advertisement for Frontier Airlines. The scene generation system 320 can insert or change advertisements displayed to the consumer 20 based upon the demographics or metrics desired.
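The demographic substitution in the airline example can be sketched as a lookup keyed on the consumer's home market. This is illustrative only; the mapping table, the default value, and airline_ad_for() are hypothetical and merely mirror the example above.

```python
# Illustrative sketch: swap the morph target of the long-pass advertisement
# based on the consumer's home market.
AIRLINE_BY_MARKET = {
    "Atlanta": "Southwest Airlines",
    "Denver":  "Frontier Airlines",
}

def airline_ad_for(consumer_profile, default="National Airline"):
    market = consumer_profile.get("home_market")
    return AIRLINE_BY_MARKET.get(market, default)

print(airline_ad_for({"home_market": "Denver"}))   # -> Frontier Airlines
```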
Figure 6 illustrates an exemplary stand alone animation as a dynamic advertisement screen shot 600. Manually pre-determined advertising objectives preset dynamic advertisements to occur upon a triggering action.
In the illustrated screen shot 600, a branded advertisement 620 is automatically displayed upon a triggering action. Upon determining that a football player 630 has scored a touchdown, as indicated by the touchdown signal 610 given by a referee 615, the champagne bottle 620 is displayed and uncorked in celebration. The brand label 625 is prominently displayed in the scene graph. In addition, a consumer can control the action of the event being viewed. The YOU MAKE THE PLAY button 660 allows the consumer to replay action in the event. The consumer controls the action of a selected participant in the action. The PLAY console 650 enables the consumer 20 to direct the actions performed by the selected participant. The techniques for controlling the actions of participants in recreated events are well known in the art. Other buttons 670 enable the consumer to control additional aspects of the recreation.
In the illustrated screen shot 600, the consumer has selected to control the actions of one of the football players 630. As illustrated, the consumer successfully directed the football player 630 and scored a touchdown. The result of the consumer-controlled action triggers an advertisement. A success, as indicated by the touchdown signal 610, can trigger one advertisement such as the uncorking of a champagne bottle advertisement 620. Differing results can trigger different advertisements to be displayed. In addition, advertisements can be dynamically altered based upon a comparison of the results of the controlled action and the results in the real event. Furthermore, even if the consumer replays the same action and achieves the same result, a different advertisement can be dynamically displayed based upon preset criteria.
Figure 7 illustrates an exemplary animation morph as a dynamic advertisement screen shot 700. Manually pre-determined advertising objectives preset dynamic advertisements to occur upon a triggering action.
In the illustrated screen shot 700, an object in the scene is automatically morphed into an advertisement upon a triggering action. In a long yardage football pass play, such as where the football 710 is in the air for more than 20 yards, the virtual view advertising system 10 automatically morphs the football 710 into an airline advertisement 730 indicating the services of the sponsor. The virtual view advertising system 10 can present different sponsorship indicia on the airline advertisement 730 based upon viewer demographics. The advertisement can be altered to display the major airline carrier servicing the geographic area of the consumer.
In addition, a consumer can control the action of the event being viewed. The YOU MAKE THE PLAY button 760 allows the consumer to replay action in the event. The consumer controls the action of a selected participant in the action. The PLAY console 750 enables the consumer 20 to direct the actions performed by a selected participant. The techniques for controlling the actions of participants in recreated events are well known in the art. Other buttons 770 enable the consumer to control additional aspects of the recreation. In the illustrated screen shot 700, the consumer has selected to control the actions of the quarterback 720. As illustrated, the consumer successfully directed the quarterback 720 to complete a pass to the receiver 740. Hence, the result of a consumer's action triggers an advertisement. A success, as indicated by the receiver 740 catching the ball 710, can automatically trigger an advertisement such as an airline advertisement 730. Differing results can trigger different advertisements to be displayed. In addition, advertisements can be dynamically altered based upon a comparison of the results of the controlled action and the results in the real event. Furthermore, even if the consumer replays the same action and achieves the same result, a different advertisement can be dynamically displayed based upon preset criteria.
Figure 8 illustrates an exemplary object displacement as dynamic advertisement screen shots 800' and 800". Objects in a recreated event are associated with model nodes. The model nodes include a displacement attribute. The virtual view advertisement system 10 determines an offset based upon the displacement attribute. The offset establishes the displacement from the object's location in the real event by which the object can be displaced when the object moves out of the rendered window frustum. Also, the object nodes contain a priority attribute. Objects that block the line-of-sight to a higher priority object can be shifted to enable unobstructed display of the higher priority object.
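The displacement attribute can be pictured as a bounded correction that pulls an out-of-view object back toward the edge of the rendered window. The sketch below is illustrative only and one-dimensional; displace_into_view() and the sample values are hypothetical.

```python
# Illustrative sketch: apply a displacement attribute so an object that drifts
# outside the rendered window's horizontal extent is moved back toward the edge,
# limited by its maximum allowed offset.
def displace_into_view(node_x, view_left, view_right, max_offset):
    """Return the displaced x position and the offset actually applied."""
    if view_left <= node_x <= view_right:
        return node_x, 0.0                      # already inside the visible slice
    target = view_left if node_x < view_left else view_right
    offset = max(-max_offset, min(max_offset, target - node_x))
    return node_x + offset, offset

# Sponsor A billboard at x=55 with the view spanning x=[10, 50]; offset capped at 8
print(displace_into_view(55.0, 10.0, 50.0, 8.0))   # -> (50.0, -5.0)
```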
In the illustrated screen shots 800' and 800", an object in a displayed rendered window 800' is automatically displaced to continue displaying the advertisement in another rendered window 800". Sponsor A billboard 810 is automatically displaced from its original position 812 in the first displayed rendered window 800' to another billboard position 814 in the next displayed rendered window 800". In addition, an object, such as a sideline camera tower or any other obstruction, is moved from its actual position 850 to a displaced location 850' to ensure that Sponsor A billboard 810 is viewed without obstruction.

In view of the foregoing, it will be appreciated that the invention provides an improved system for incorporating advertisements into recreated events. It should be understood that the foregoing relates only to the exemplary embodiments of the present invention, and that numerous changes may be made therein without departing from the spirit and scope of the invention as defined by the following claims. Accordingly, it is the claims set forth below, and not merely the foregoing illustration, which are intended to define the exclusive rights of the invention.
Industrial Applicability

The disclosed methods and systems provide for the display of advertisement material in recreated events. The methods and systems can map interactive advertisement materials onto any object in a recreated scene or dynamically present an advertisement based upon automatic or manual triggers. Therefore, the methods and systems disclosed have industrial applicability for advertisement, entertainment, and other industries.

Claims

Having set forth the nature of the present invention, what is claimed is:
1. A method for displaying advertisement material characterized by the steps of: a. setting advertisement objectives that identify the type of advertisements that are to be applied to a recreated event; b. receiving event data; c. decomposing said event data into high-level objects; d. dynamically decomposing said high-level objects based upon said advertisement objectives; e. mapping selected advertisement inventory into a scene graph; and f. generating recreated event data.
2. The method of claim 1, further characterized by the steps of: a. collecting metric information; and b. dynamically altering said selected advertisement inventory based upon said collected metric information.
3. The method of claim 1, further characterized by the steps of: a. collecting consumer demographic information; and b. dynamically altering said selected advertisement inventory based upon said collected consumer demographic information.
4. The method of claim 1, wherein the step of receiving event data comprises receiving event data captured from a real event.
5. The method of claim 1, further characterized by the step of associating a model node with a decomposed object, wherein said model node includes a predetermined region of space for an advertisement.
6. The method of claim 5, further characterized by the steps of: a. calculating a line of sight from a virtual camera to said advertisement that is required for said advertisement to face said virtual camera; and, b. rotating or translating said advertisement within said predetermined region in order to keep an angle between an advertisement plane and said line of sight between 0 degrees and 180 degrees.
7. The method of claim 1, further characterized by the step of associating a model node with at least one high-level object, wherein said model node includes a displacement attribute.
8. The method of claim 7, further characterized by the steps of: a. determining, based upon said displacement attribute, an offset from an object location in a real event that said at least one high-level object can be displaced when said at least one high-level object moves out of a rendered window frustum; and b. moving said at least one high-level object based upon said offset such that said at least one object stays within a rendered window.
9. The method of claim 1, further characterized by steps of: a. associating a model node with at least one high-level object, wherein said model node includes an orientation attribute; and b. rotating or translating said at least one high-level object based upon said orientation attribute such that a predetermined angle range between an object plane and a line of sight from a virtual camera is maintained.
10. The method of claim 9, wherein said predetermined angle range comprises a 90 degree angle.
11. The method of claim 1, further characterized by the steps of: a. associating model nodes with each high-level object, said model nodes include a priority attribute; b. testing each model node with a line of sight vector; and c. moving obstructing objects that obstruct a view to higher priority model nodes.
12. The method of claim 1, further characterized by the step of morphing an event object into a virtual advertisement upon a triggering action.
13. The method of claim 1, wherein said selected advertisement inventory comprises interactive advertisement inventory, wherein said interactive advertisement inventory is activateable to display additional advertisement material.
14. A method for displaying advertisement material characterized by the steps of: a. associating a model node with an object in a scene graph of a recreated event, wherein said model node includes a predetermined region of space for advertisement inventory; b. associating an advertisement with said object; c. calculating a line of sight from a virtual camera to said advertisement that is required for said advertisement to face the virtual camera; and d. rotating or translating the advertisement within said predetermined region in order to keep an angle between an advertisement plane and said line of sight between 0 degrees and 180 degrees.
15. The method of claim 14, wherein said recreated event is derived from a real event.
16. A method for displaying advertisement material characterized by the steps of: a. associating a model node with an object in a scene graph of a recreated event, the model node includes a displacement attribute; b. associating an advertisement with said object; c. determining, based upon said displacement attribute, an offset from an object location in said recreated event that said object can be displaced when said object moves out of a rendered window frustum; d. moving said object based upon said offset such that said object stays within a rendered window; and e. generating recreated event data.
17. The method of claim 16, wherein said recreated event is derived from a real event.
18. A method for displaying advertisement material characterized by the steps of: a. associating a model node with an object in a scene graph of a recreated event, the model node includes an orientation attribute; b. associating an advertisement with said object; c. rotating or translating said object based upon said orientation attribute such that a predetermined angle range between an object plane and a line of sight from a virtual camera is maintained; and d. generating recreated event data.
19. The method of claim 18, wherein the predetermined angle range is 90 degrees.
20. The method of claim 18, wherein said recreated event is derived from a real event.
21. A method for displaying advertisements characterized by the steps of: a. associating model nodes with each object in a scene graph of a recreated event, the model nodes include a priority attribute; b. testing each model node with a line of sight vector; c. moving obstructing objects that obstruct a view to higher priority model nodes; d. associating an advertisement with a high priority model node; and e. generating recreated event data.
22. The method of claim 21, wherein said recreated event is derived from a real event.
23. A method for displaying advertisement material characterized by the steps of: a. receiving event data; b. rendering an event; c. morphing an object into a virtual advertisement upon a triggering action; and d. generating recreated event data.
24. A system for displaying advertising material characterized by: a. a capture system for collecting event data; b. an inventory management system, coupled to said capture system, for identifying inventory including advertisements for display in a recreated event; c. a scene generation system, coupled to said inventory system, for decomposing said event data into high-level objects, wherein said scene generation system dynamically decomposes said high-level objects based upon advertising objectives; d. said scene recreation system maps said inventory onto decomposed objects; and e. wherein said scene recreation system recreates an event in three dimensions with objects bearing said advertisements.
25. The system of claim 24, further characterized by an interactive viewing system for providing the recreated event to a consumer.
26. The system of claim 25, further characterized by an archive system, coupled to said scene recreation system, for storing said inventory.
27. A system for displaying advertisement material characterized by: a. means for capturing a real event; b. means for rendering the real event; c. means for determining what advertisements to insert into a recreated real event based upon collected consumer metrics; d. means for inserting the advertisements within the recreated real event; and e. means for generating recreated event data.
28. A method for displaying advertisement material characterized by the steps of: a. setting advertisement objectives; b. receiving event data; c. decomposing said event data into high-level objects; d. dynamically decomposing each of said high-level objects based upon said advertisement objectives; e. maintaining advertisement inventory, wherein said advertisement inventory comprises selectable advertisements that are insertable into a scene graph, wherein said selectable advertisements comprise attributes that define object types to which said selectable advertisements may be associated; f. mapping selected advertisement inventory onto any said decomposed object in said scene graph; and g. generating recreated rendered event data.
29. The method of claim 28, further characterized by the steps of: a. collecting metric information; and b. dynamically altering said selected advertisement inventory based upon said collected metric information.
30. The method of claim 28, further characterized by the steps of: a. collecting consumer demographic information; and b. dynamically altering said selected advertisement inventory based upon said collected consumer demographic information.
31. The method of claim 28, wherein said advertising objectives are changeable upon receipt of any request for said recreated event data.
32. The method of claim 28, wherein the step of receiving event data comprises receiving event data captured from a real event.
33. The method of claim 28 wherein said selectable advertisements are activateable to display additional advertisement material.
34. A method for displaying advertisement material characterized by the steps of: a. setting advertisement objectives; b. receiving event data; c. decomposing said event data into high-level objects, wherein said high-level objects include participants in an event; d. dynamically decomposing each of said high-level objects based upon said advertisement objectives; e. maintaining advertisement inventory, wherein said advertisement inventory comprises selectable advertisements that are insertable into a scene graph, wherein said selectable advertisements comprise attributes that define object types to which said selectable advertisements may be associated; f. mapping selected advertisement inventory onto any said decomposed object in said scene graph; g. said participants triggerable resulting in control by a consumer of recreation action in a recreated event; h. providing for display result advertisement based upon a first result of said recreation action; and i. generating recreated rendered event data.
35. The method of claim 34 further characterized by the steps of: a. collecting metric information; and b. dynamically altering said selected advertisement inventory based upon said collected metric information.
36. The method of claim 34 further characterized by the steps of: a. collecting consumer demographic information; and b. dynamically altering said selected advertisement inventory based upon said collected consumer demographic information.
37. The method of claim 34 wherein said advertising objectives are changeable upon receipt of any request for said recreated event data.
38. The method of claim 34 further characterized by the steps of: a. storing said first result; b. said participants re-triggerable resulting in further control by said consumer of later recreation action in said recreated event; and c. dynamically altering said selected advertisement inventory based upon said first result and a later result of said later recreation action.
39. The method of claim 34 further characterized by the steps of: a. storing said first result; b. said participants re-triggerable resulting in further control by said consumer of a replay of said recreation action in said recreated event; and c. dynamically altering said selected advertisement inventory based upon said first result and a replay result of said replay action.
40. The method of claim 34 further characterized by the steps of: a. comparing said first result to a real result in said event; and b. dynamically altering said selected advertisement inventory based upon said first result and said real result.
PCT/US2001/013475 2000-04-26 2001-04-26 Systems and methods for integrating virtual advertisements into recreated events WO2001082195A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001259171A AU2001259171A1 (en) 2000-04-26 2001-04-26 Systems and methods for integrating virtual advertisements into recreated events

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US19988700P 2000-04-26 2000-04-26
US60/199,887 2000-04-26

Publications (1)

Publication Number Publication Date
WO2001082195A1 true WO2001082195A1 (en) 2001-11-01

Family

ID=22739422

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/013475 WO2001082195A1 (en) 2000-04-26 2001-04-26 Systems and methods for integrating virtual advertisements into recreated events

Country Status (2)

Country Link
AU (1) AU2001259171A1 (en)
WO (1) WO2001082195A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5105184A (en) * 1989-11-09 1992-04-14 Noorali Pirani Methods for displaying and integrating commercial advertisements with computer software
US5105184B1 (en) * 1989-11-09 1997-06-17 Noorali Pirani Methods for displaying and integrating commercial advertisements with computer software
US5946664A (en) * 1995-06-30 1999-08-31 Sony Corporation Apparatus and method for executing a game program having advertisements therein
US6012984A (en) * 1997-04-11 2000-01-11 Gamesville.Com,Inc. Systems for providing large arena games over computer networks
US6124862A (en) * 1997-06-13 2000-09-26 Anivision, Inc. Method and apparatus for generating virtual views of sporting events
US6243856B1 (en) * 1998-02-03 2001-06-05 Amazing Media, Inc. System and method for encoding a scene graph
US6196920B1 (en) * 1998-03-31 2001-03-06 Masque Publishing, Inc. On-line game playing with advertising
US6036601A (en) * 1999-02-24 2000-03-14 Adaboy, Inc. Method for advertising over a computer network utilizing virtual environments of games

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8645828B2 (en) 2001-01-08 2014-02-04 Telstra Corporation Limited Management system for a contact center
EP2120201A1 (en) * 2008-05-15 2009-11-18 Research In Motion Limited Method and system to avoid fake metrics in advertising
US20110043524A1 (en) * 2009-08-24 2011-02-24 Xuemin Chen Method and system for converting a 3d video with targeted advertisement into a 2d video for display
US8803906B2 (en) * 2009-08-24 2014-08-12 Broadcom Corporation Method and system for converting a 3D video with targeted advertisement into a 2D video for display
CN103297811A (en) * 2012-02-24 2013-09-11 北京明日时尚信息技术有限公司 Method for realizing video advertisement in intelligently embedding mode
CN109791666A (en) * 2016-09-30 2019-05-21 特里弗股份有限公司 Advertisement based on target is implanted into platform
EP3519930A4 (en) * 2016-09-30 2020-03-18 Trivver, Inc. Objective based advertisement placement platform
EP4254962A1 (en) * 2022-03-31 2023-10-04 Canon Kabushiki Kaisha Image processing apparatus, control method and program

Also Published As

Publication number Publication date
AU2001259171A1 (en) 2001-11-07

Similar Documents

Publication Publication Date Title
US10948982B2 (en) Methods and systems for integrating virtual content into an immersive virtual reality world based on real-world scenery
US10789764B2 (en) Systems and associated methods for creating a viewing experience
US6124862A (en) Method and apparatus for generating virtual views of sporting events
US20210233304A1 (en) Systems and associated methods for creating a viewing experience
CN110249631B (en) Display control system and display control method
US10582191B1 (en) Dynamic angle viewing system
US20070198939A1 (en) System and method for the production of presentation content depicting a real world event
US6072504A (en) Method and apparatus for tracking, storing, and synthesizing an animated version of object motion
US20070296723A1 (en) Electronic simulation of events via computer-based gaming technologies
JP2021511729A (en) Extension of the detected area in the image or video data
US20230008567A1 (en) Real-time system for generating 4d spatio-temporal model of a real world environment
US7868914B2 (en) Video event statistic tracking system
US20100271367A1 (en) Method and apparatus for combining a real world event and a computer simulation
US10272340B2 (en) Media system and method
US20200359079A1 (en) Augmented reality apparatus and method
US20080068463A1 (en) system and method for graphically enhancing the visibility of an object/person in broadcasting
US9087380B2 (en) Method and system for creating event data and making same available to be served
US10406440B2 (en) Method of collecting advertisement exposure data of 360 VR game replay video
US11328559B2 (en) System and method for enabling wagering event between sports activity players with stored event metrics
Bebie et al. A Video‐Based 3D‐Reconstruction of Soccer Games
US20220270447A1 (en) System and method for enabling wagering event between sports activity players with stored event metrics
WO2001082195A1 (en) Systems and methods for integrating virtual advertisements into recreated events
CN116719176B (en) Intelligent display system of intelligent exhibition hall
JP6609078B1 (en) Content distribution system, content distribution method, and content distribution program
GB2589917A (en) Data processing method and apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP