US20080183575A1 - Back-channel media delivery system - Google Patents

Back-channel media delivery system

Info

Publication number
US20080183575A1
Authority
US
United States
Prior art keywords
media
rendering
sensors
rendering device
sensed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/761,761
Inventor
Robert E. Kaplan
Stuart Graham
Mars Tanumihardja
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vulcan Portals Inc
Vulcan IP Holdings Inc
Original Assignee
Vulcan Portals Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vulcan Portals Inc
Priority to US11/761,761
Assigned to VULCAN IP HOLDINGS INC. (assignment of assignors interest). Assignors: GRAHAM, STUART; TANUMIHARDJA, MARS; KAPLAN, ROBERT E.
Priority to PCT/US2008/052511 (WO2008095028A1)
Publication of US20080183575A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201 Market modelling; Market analysis; Collecting market data
    • G06Q30/0204 Market segmentation
    • G06Q30/0205 Location or geographical consideration
    • G06Q30/0207 Discounts or incentives, e.g. coupons or rebates
    • G06Q30/0241 Advertisements
    • G06Q30/0242 Determining effectiveness of advertisements
    • G06Q30/0251 Targeted advertisements
    • G06Q30/0255 Targeted advertisements based on user history
    • G06Q30/0256 User search
    • G06Q30/0257 User requested
    • G06Q30/0258 Registration
    • G06Q30/0273 Determination of fees for advertising
    • G06Q30/0277 Online advertisement

Definitions

  • a suitable eye tracking algorithm may process the captured images in order to determine whether the subject's eyes are pointed at the back-channel media delivery system. Attention tracking using only the eyes may be advantageous in certain lighting situations or where the particular illumination results in accentuation of the eyes within the captured images. Attention tracking using both the eyes and other aspects of the face or head may be advantageous since although a person's face may be generally facing the back-channel media delivery system, they may not be looking directly at the system. Instead, for example, they may be looking at something behind or to the side of the system. Use of eye tracking may thus permit attention tracking sensors to be more accurate. An attention tracking sensor incorporated into the embodiment depicted in FIG. 1A would allow the back-channel media delivery system to determine whether each individual in the defined field is looking at the display 150 from moment to moment.
  • Attention tracking sensors would permit embodiments of the back-channel media delivery system to gather information on how long each individual looks at the screen. Gathering such information on a second-by-second basis permits gauging the effectiveness of a particular instance of media content, or different time segments within that instance, in getting and maintaining the attention of people.
  • one embodiment of the back-channel media delivery system could dynamically alter rendering of media in response to changing interest in the media being rendered. For example, suppose a person is watching the display 150 of the embodiment depicted in FIG. 1A . An attention tracking environmental sensor connected to such an embodiment could detect that the person is no longer paying attention or has averted their gaze in some manner, and that information could cause the system to start rendering a different instance of media. Alternatively, other embodiments of the system might change the volume of sound, change the brightness of the display, or other parameters of media playback in an attempt to regain the person's attention.
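  • A small hypothetical sketch of that kind of feedback policy follows; the player interface, thresholds and adjustment steps are invented for illustration and are not part of the disclosure:

```python
class Player:
    """Stand-in for the media player system; the interface is invented for illustration."""
    def __init__(self):
        self.volume = 0.5
        self.brightness = 0.7
    def set_volume(self, v): self.volume = v
    def set_brightness(self, b): self.brightness = b
    def play_next(self): print("rendering a different instance of media")

def react_to_attention(player, attentive_count, seconds_since_last_attention):
    """Adapt playback when attention-tracking sensors report a loss of attention."""
    if attentive_count > 0:
        return "keep_playing"
    if seconds_since_last_attention < 5:
        # Brief lapse: raise volume and brightness to try to regain attention.
        player.set_volume(min(player.volume + 0.1, 1.0))
        player.set_brightness(min(player.brightness + 0.1, 1.0))
        return "adjusted_output"
    player.play_next()  # sustained loss of attention: switch to a different instance of media
    return "switched_media"

p = Player()
print(react_to_attention(p, attentive_count=0, seconds_since_last_attention=2))   # adjusted_output
print(react_to_attention(p, attentive_count=0, seconds_since_last_attention=12))  # switched_media
```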
  • An attention tracking environmental sensor could also permit more accurate determination of a person's size, shape, height or the speed with which they move. Such information could be used by embodiments of the invention to generate probabilistic demographic information. Such information is useful and valuable in and of itself. Such information might also, however, be used by a playlist rule within an embodiment of the invention to custom tailor a media selection suitable for the person most likely to be watching the display at that moment.
  • the set-top box 100 may also periodically synchronize with the backend server 180 .
  • the backend server 180 receives the playback history log file 124 and can also upload new media content 185 and playlist rules 186 to the set-top box 100 .
  • the set-top box may continually communicate with the backend server 180 allowing the playback history to be communicated to the backend server 180 in real-time.
  • the playback history log file 124 may be reformatted and exported as some digitally transmittable format prior to being transmitted to the backend server 180 .
  • the transmission makes use of HTTP over TCP/IP protocols between set-top box 100 and the backend server 180 , which could be connected via an Ethernet network.
  • the connection could also be wireless using an 802.11x Wi-Fi network, Bluetooth connectivity, Cellular connectivity, radio frequency, or some variation thereof.
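  • A hedged sketch of such a transmission over HTTP follows; the requests library, the endpoint URL and the box-identifier field are assumptions, not part of the disclosure:

```python
import requests  # pip install requests

def export_playback_history(log_path, server_url, box_id):
    """POST the playback history log file to the backend server over HTTP."""
    with open(log_path, "rb") as log_file:
        response = requests.post(
            server_url,
            files={"playback_history": log_file},   # multipart upload of the log file
            data={"box_id": box_id},                # identifies which set-top box sent it
            timeout=30,
        )
    response.raise_for_status()
    return response.status_code

# Hypothetical endpoint on the backend server 180:
# export_playback_history("playback_history.log", "http://backend.example.com/logs", "stb-0042")
```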
  • the transmitted playback history log file 124 is collected, stored, and analyzed on the backend server 180 and available for various reporting functionality as needed by the user of the system.
  • the backend server 180 is able to support the simultaneous collection of playback history log file 124 from multiple set-top boxes 100 .
  • the playback history log files 124 are aggregated and processed by an analysis program 181 that executes on the backend server 180 .
  • the analysis program 181 generates reports, and can further allow users to interactively query and view the imported playback history log file 124 and aggregated information.
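  • An illustrative sketch of the kind of aggregation such an analysis program might perform over collected records (per-media plays, impression totals and average dwell time) follows; the JSON-lines record format and field names are assumptions:

```python
import json
from collections import defaultdict

def aggregate_playback_history(log_lines):
    """Aggregate JSON-lines playback records into per-media impression and dwell statistics."""
    totals = defaultdict(lambda: {"plays": 0, "impressions": 0, "dwell_sum": 0.0, "dwell_n": 0})
    for line in log_lines:
        rec = json.loads(line)
        entry = totals[rec["media_id"]]
        entry["plays"] += 1
        entry["impressions"] += rec.get("impressions", 0)
        for dwell in rec.get("dwell_times_s", []):
            entry["dwell_sum"] += dwell
            entry["dwell_n"] += 1
    report = {}
    for media_id, entry in totals.items():
        avg_dwell = entry["dwell_sum"] / entry["dwell_n"] if entry["dwell_n"] else 0.0
        report[media_id] = {"plays": entry["plays"],
                            "impressions": entry["impressions"],
                            "avg_dwell_s": round(avg_dwell, 1)}
    return report

sample = ['{"media_id": "ad_A", "impressions": 3, "dwell_times_s": [30.0, 12.5, 4.0]}',
          '{"media_id": "ad_A", "impressions": 1, "dwell_times_s": [8.0]}']
print(aggregate_playback_history(sample))
```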
  • FIG. 1B depicts an embodiment of the back-channel media delivery system wherein the environmental data server 170 is integrated into the set-top box 100 .
  • Such an embodiment obviates the need for network connections between the environmental data server 170 and the set-top box 100 as well as the need for separate server hardware for the environmental data server.
  • the environmental sensors 171 are likewise directly connected to the set-top box 100 . The functionality of these embodiments is otherwise identical to the embodiments discussed above.
  • FIG. 2A is a data flow diagram that describes the flow of data within the embodiment of system depicted in FIG. 1A above. Operation of this embodiment of the back-channel media delivery system typically begins with the content management system 111 determining the next media to render in accordance with the playlist rules 122 .
  • the content management system 111 communicates the location of the next media to the media player system 112 .
  • the media player system 112 retrieves the appropriate media file from the media content files 121 and then typically renders the media on, for example, the display screen 150 .
  • When the media player system 112 begins to render the media, it generates a media playback begin event which is communicated to the logger program 114.
  • When the media player system 112 stops rendering that particular media, it generates a media playback end event that is also communicated to the logger program 114.
  • While the media is being rendered, the environmental sensors 171 detect impressions and generate count events which are communicated to the environmental data server 170.
  • the traffic count and type of traffic is passed from the environmental data server 170 to the logger program 114 .
  • the logger program 114 logs the media ID, the timestamp and the traffic count and other environmental data to the playback log file 124 . Periodically, the playback log will be exported to the backend server 180 .
  • FIG. 2B is a data flow diagram that describes the flow of data within the embodiment of the system wherein the environmental data server 170 is integrated into the set-top box 100 and as depicted in FIG. 1B .
  • the data flow shown in FIG. 2B is essentially the same as that of FIG. 2A except that the environmental sensors 171 communicate directly with set-top box 100 and its integrated environmental data server 170 .
  • FIG. 2C is a data flow diagram of an example embodiment of a back-channel media delivery system where the collected environmental data is used as feedback to help determine the next media to play.
  • the media player system 112 is rendering a media content file and the logger program 114 is likewise creating the playback history log 124 .
  • This feedback is used in conjunction with the playlist rules 122 to determine the next media content to render.
  • the data flow depicted in this Figure is otherwise identical to that of FIG. 2A .
  • FIG. 3 shows a flow chart diagram for one implementation of the log media playback history routine 301 of the logging system 114 .
  • a media playback event is received from the media player system 112 at step 302 .
  • The logging system 114 checks the playback event type at step 303. If the playback event type is a “Play Start” event, the event details are written to the log file at step 304, including the media identifier and timestamp T1 information. The timestamp T1 is stored in memory at step 305 for use later when a “Play End” event is received, and the routine ends at step 310.
  • If the playback event type is a “Play End” event, the timestamp T1 that was stored in memory is retrieved at step 306.
  • Environmental sensor count data for the time interval between timestamp T1 and the current timestamp T2 is retrieved from the environmental data 123 at step 307.
  • The event details are written to a playback history log file 124 at step 308, including the media identifier, the timestamp T2, and the environmental sensor count data.
  • Once the playback history log file 124 has been completed, it may be exported to the backend server 180 for further analysis.
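  • The routine of FIG. 3 could be sketched roughly as below; the step numbers follow the figure, while the sensor-count lookup and log format are assumptions made for illustration:

```python
import json
import time

class PlaybackLogger:
    """Rough sketch of the log-media-playback-history routine of FIG. 3."""

    def __init__(self, log_path, environmental_data):
        self.log_path = log_path
        self.environmental_data = environmental_data  # assumed list of (timestamp, count) tuples
        self.start_timestamps = {}                    # media_id -> T1, kept in memory (step 305)

    def _write(self, record):
        with open(self.log_path, "a", encoding="utf-8") as log:
            log.write(json.dumps(record) + "\n")

    def _counts_between(self, t1, t2):
        # Step 307: retrieve sensor counts for the interval [T1, T2].
        return sum(count for ts, count in self.environmental_data if t1 <= ts <= t2)

    def handle_event(self, event_type, media_id):      # steps 302/303: receive and check the event
        now = time.time()
        if event_type == "Play Start":
            self._write({"event": "Play Start", "media_id": media_id, "t1": now})  # step 304
            self.start_timestamps[media_id] = now                                   # step 305
        elif event_type == "Play End":
            t1 = self.start_timestamps.pop(media_id, now)                           # step 306
            impressions = self._counts_between(t1, now)                             # step 307
            self._write({"event": "Play End", "media_id": media_id,                 # step 308
                         "t2": now, "impressions": impressions})
        # step 310: routine ends

logger = PlaybackLogger("playback_history.log", environmental_data=[])
logger.handle_event("Play Start", "ad_A")
logger.environmental_data.append((time.time(), 1))   # a sensed impression during playback
logger.handle_event("Play End", "ad_A")
```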
  • FIG. 4 depicts a high-level block diagram of a back-channel media delivery system according to one embodiment of the invention.
  • the back-channel media delivery system 400 includes a media delivery device 410 and a backend server 440 .
  • the media delivery device 410 includes a computing device 420 .
  • the media device 410 also includes environmental sensors 415 and a rendering device 425 coupled to the computing device 420 .
  • the computing device 420 selects media stored on the computing device 420 for rendering on the rendering device 425 . Alternatively, the computing device 420 may select media stored elsewhere.
  • the computing device 420 then renders the media on the rendering device 425 .
  • the computing device 420 gathers environmental data from the environmental sensors 415 .
  • The computing device 420 compiles the playback history 435 and transmits it to the backend server 440.
  • the backend server 440 is used to process and analyze the back-channel data. From this data, new playback rules may be devised for use by the media delivery device 410 during future renderings of new media on the rendering device 425 . New media and playback rules 430 are then transmitted to the media delivery device 410 .
  • Receiving the playback history 435 at the backend server 440 and sending the new media and playback rules 430 to the media delivery device 410 may be accomplished in numerous ways. For example, and as was discussed more fully above, the communication may take place via various types of wired or wireless connections or via non-volatile media.

Abstract

A back-channel media delivery system that may be used for tracking the number and type of human impressions of media content rendered by the system during the time the media was rendered is provided. The back-channel media delivery system includes a rendering device for rendering media, an environmental sensor for sensing impressions and other environmental variables, and a computing device configured to play media on the rendering device and gather data related to the external states detected by the environmental sensor. The system may include rules that interpret that data and may cause the system to custom select, tailor or control future playback of media on the system.

Description

    RELATED APPLICATION DATA
  • This application claims benefit of U.S. Provisional Patent Application Ser. No. 60/898,855, entitled BACK-CHANNEL MEDIA DELIVERY SYSTEM, filed Jan. 31, 2007, which application is incorporated herein by reference.
  • TECHNICAL FIELD
  • The present invention relates to techniques for rendering media content on a media delivery device that tracks human impressions of the media content, as well as other environmental data, during the time it was rendered by the media delivery device.
  • BACKGROUND OF THE INVENTION
  • It has always been difficult for advertisers to gauge the effectiveness of their advertisements, particularly where the advertising is done through traditional modes of advertising such as television or newspaper. Generally speaking, television networks and newspaper publishers have only approximate statistics on the number of viewers or readers within a given market. Newspaper publishers, for example, can approximate the number of newspapers that are read on any given day based on subscription and other sales data. Of course, not every person who receives a newspaper is going to read every advertisement within that paper. Consequently, newspaper publishers and those who purchase advertisements from the publishers have only a loose idea of how many people are exposed to or actually read their advertisements. Likewise with television advertising, the viewership of any given program, and the commercials that run during such programs, is not known with precision. The so-called ‘ratings’ for television programs are gathered statistically and, again, calculating the number of people who are reached with any given advertisement is imprecise. Ideally, advertisers would like more substantive feedback about who is viewing their advertising content and how it is being viewed.
  • With the rise of Internet advertising, advertisers are given more direct and immediate feedback on who is viewing their advertisements. Suppose, for example, that an advertiser purchases advertisements on the website of a major internet search engine such as Google. The advertisement provider, Google in this case, gathers data on the precise number of times that a given advertisement is actually rendered during a page view. Likewise, the advertisement provider can gather data representing the precise number of times a given advertisement is actually clicked by the viewer of the advertisement. Such feedback is invaluable because it allows advertisers to get feedback on the exact, rather than approximate, number of impressions the advertising made on the target audience. An ‘impression’ is any exposure a person has to an advertisement. In the context of a newspaper, an advertisement has an impression every time a person turns to the page of the paper where the advertisement is located. Since it is not possible to know with any certainty what pages of a newspaper are ever actually viewed by a person, it is not possible to know with any certainty how many impressions a newspaper-based advertisement receives. A similar problem exists with television advertising because, as was discussed above, television ‘ratings’ are statistical estimates and calculating the number of people reached with any given advertisement is imprecise.
  • In addition to impression information, the feedback provided by an internet advertisement provider such as Google also provides valuable information about how effective an internet-based advertisement is in generating an inquiry (i.e., it tells you how many impressions actually result in a click on the advertisement). Data generated by, and fed back from, an advertising channel is more commonly known as ‘back-channel data.’ Back-channel data has increasingly become the currency driving the Internet advertising business. Absolute measurement, as opposed to statistical analysis, is key to advertiser, corporate and content-programmer confidence.
  • Although television, newspaper and magazine advertising channels continue to be very important, other forms of advertising such as audio, video and electronic signage in retail spaces, hotels, restaurants and other public places are becoming increasingly prevalent. Such advertising media might comprise playback of DVD's, computer generated media or animation, set-top box video and audio, satellite dish video, streaming internet protocol television (‘IPTV’), still pictures, or even audio. Some such systems have the capability to report on what media content was played at what time and to schedule the time at which particular media is played. While these are very valuable controls for advertisers who wish to control their messaging, there is currently no mechanism for reporting how many people were or are exposed to an impression of such media content. Likewise, there is no mechanism for adapting the media content to account for local variables and conditions detected during media playback.
  • There is therefore a need for a media delivery system that gathers data about the number and type of human impressions of media content delivered by a content rendering device for cross-correlation of such impression data with the media content. Such a system may also alter the media content it delivers based on such data.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1A and 1B are detailed block diagrams of example embodiments of back-channel media delivery systems.
  • FIGS. 2A-2C are data flow diagrams of operation of example embodiments of back-channel media delivery systems.
  • FIG. 3 is a flow diagram of an example embodiment of a routine for a logging system of a back-channel media delivery system.
  • FIG. 4 is a high level block diagram of an example embodiment of a back-channel media delivery system.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Techniques are described below for consolidating and correlating information about media content that is rendered at a specific time by a set-top box coupled to a display with information about the number of impressions the content made on people within some detectable proximity of the display. Although described in terms of a set-top box and display, it should be understood that such media rendering and display devices, as well as other related components, are only exemplary. Other types of media, such as still pictures or audio, may also be rendered by embodiments of the invention by an appropriate display or playback device and information about the number and type of impressions of such content likewise collected, consolidated and correlated. In particular, although embodiments of the invention are described in terms of a set-top box, it will be understood that any computing device or devices capable of performing the disclosed functions of the set-top box will suffice and in no way does such a device or devices need to be literally on top of a television set. Likewise, although the media delivery system has been described in terms of advertising and advertising media, embodiments of the invention are not so limited. Embodiments of the invention may, therefore, render media that is not specifically advertising related.
  • FIG. 1A depicts a back-channel media delivery system according to one embodiment of the invention. The system includes a set-top box 100, a display 150, environmental sensors 171, an environmental data server 170 and a backend server 180. The set-top box 100 includes, among other things, a content management system 111, a media player system 112 and a logging system 114. Although discussed in terms of such components and programs, alternative embodiments of the invention are possible and it will be understood that the embodiments discussed below are for illustrative purposes only. In an alternative embodiment, the set-top box 100 may, for example, include only the content management system 111 and the logging system 114 whereas the media player system 112 is physically separate from the set-top box 100. Indeed, any of the functions of the content management system 111, the media player 112 and the logging system 114 may be performed by devices or systems that are physically separate.
  • The media player system 112, along with the display 150 or other content presentation devices 160, is used to render instances of media content that embody and convey the message intended for the audience. As will be discussed more fully below, media content 121 stored in storage device 120 is selected by the content management system 111 for playback and is processed and rendered on the display 150, or other content presentation devices 160, by the media player system 112. Examples of storage devices 120 include hard drives, flash memory, remote servers, network attachable storage and other types of non-volatile storage and memory devices. Typically, the media content 121 is rendered as digital or analog signals which are routed to input/output (I/O) connections 130 on the set-top box 100. For example, in the case of video media, video signals are routed to the display I/O connection 130. Other types of media, such as audio or pictures, may be routed to other devices through their respective connectors 139. The I/O connections 130 further include a network I/O connection 132 for routing signals between the set-top box 100 and a network. The network I/O connection 132 might comprise, for example, a modem connection or an 802.11x WiFi connection. A pluggable device port I/O connection 133 can be used to connect the set-top box 100 to a pluggable device, as will be described in more detail below.
  • The output signals are then electronically transferred from these I/O connectors 130 to an appropriate device, for example, from the display I/O connection to the display 150 or from the I/O connectors 139 to some other media content presentation device 160. In at least some embodiments, the output and input connectors follow A/V industry standard formats (e.g., Component, Composite, VGA, DVI, and HDMI). Such embodiments of the set-top box can process and render, for example, at least one of the following digital media formats using an associated CODEC: MP3, MPEG2, MPEG4, AVI and Windows Media files such as WMA (for audio) and WMV (for video). It will be understood that these digital media formats are only for illustrative purposes and other types of media might be rendered by the media player system 112.
  • The back-channel media delivery system also includes environmental sensors 171. These sensors are responsible for detecting a myriad of environmental states, signals and conditions indicative of a human impression of the media content rendered by the media player system 112. In the embodiment of FIG. 1A, one or more sensors 171 are configured to count foot-traffic in the vicinity of the back-channel media delivery system and these sensors are connected to an environmental data server 170. In one embodiment, the count of foot-traffic corresponds to the count of impressions. As will be discussed more fully below, the count of impressions is stored and used with playlist rules or for post-processing. The sensors 171 may be connected through, for example, a wired connection such as an Ethernet, RS-232 serial, USB or modem connection, or they may connect wirelessly through, for example, an 802.11x Wi-Fi network, or a Bluetooth or Infra-Red connection. An environmental data program 113a is executed on the data server 170 and processes the signals received by the environmental data server 170 from the sensors 171. The environmental data program 113a can use various parameterized algorithms to determine whether the sensors 171 have detected a valid impression. The environmental data 175, which includes the number of valid impressions, is transmitted to the set-top box 100 for further processing and, as will be discussed in more detail below, for use by the content management system 111. In at least some embodiments, in addition to being used for counting the number of impressions, the environmental sensors 171 are capable of capturing “dwell time” of a person in an area in the vicinity of the back-channel media delivery system. Dwell time is a measure of how long the person or persons remained in proximity of the sensors 171 or back-channel media delivery system. That is, in these embodiments the sensors 171 are capable of determining when a person is in proximity to the back-channel media delivery system 100, and additionally, determining how long they stay in proximity.
  • The number of valid impressions and other environmental data is transmitted to the set-top box 100 via an I/O connection 131 such as, for example, the network connection 132 or through the pluggable device port 133. Some examples of sensor types include, but are not limited to, thermal imaging camera sensors, infrared sensors, pressure sensors, video imaging camera sensors, sonar sensors, laser sensors, audio sensors, motion sensors and RFID tag sensors. In some embodiments, the environmental sensors 171 are integrated into or attached to the display device 150, and in other embodiments, the sensors 171 may be integrated into or attached to the set-top box 100 itself. In other embodiments, the environmental sensors 171 may be installed anywhere within a suitable vicinity of the display device 150, for example, on the wall, ceiling or floor, within windows or doors, or self-contained and free standing.
  • The environmental data 175 is processed by the logging system 114 executing on the set-top box 100. The data is stored in the storage 120 temporarily or permanently on the set-top box as environmental data 123. In one embodiment, the media player system 112 communicates with the logging system 114 via an inter-process-communication mechanism, either in an event-driven or polling fashion, to provide playback information, such as playback state and metadata, of the media content the media player system 112 is rendering. The logger program 114 aggregates and correlates the media playback information, duration of the media content, and a timestamp of when the media content was rendered together with the environmental data 123. The resulting output of the logging system 114 is a playback history log file 124.
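  • As a purely illustrative sketch (not part of the patent disclosure), the record such a logging system might append for each rendering could look like the following; the JSON-lines format and field names are assumptions:

```python
import json
import time

def write_playback_record(log_path, media_id, start_ts, end_ts, impressions, dwell_times):
    """Append one playback history record, correlating playback with sensed data."""
    record = {
        "media_id": media_id,
        "start": start_ts,                      # when rendering began (epoch seconds)
        "end": end_ts,                          # when rendering ended
        "duration_s": round(end_ts - start_ts, 3),
        "impressions": impressions,             # valid impressions counted over the interval
        "dwell_times_s": dwell_times,           # per-person dwell times, if sensors support it
    }
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")    # one JSON record per line

# Example: a 30-second spot seen by three people
write_playback_record("playback_history.log", "ad_A", time.time() - 30, time.time(),
                      impressions=3, dwell_times=[30.0, 12.5, 4.0])
```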
  • As previously discussed, instances of media rendered by the back-channel media delivery system may include a variety of different types of media such as video, audio or still pictures. In one embodiment, such media is managed by the content management system 111 which is part of the set-top box 100. The content management system 111 enables a user to define playlist rules 122 that govern what media content 121 is to be loaded onto the storage 120 of the set-top box 100 for playback as well as for defining playlist rules that govern when or how often instances of media content are to be rendered on the display device 150 or other content presentation devices 160.
  • In some embodiments, the playlist rules that govern the playback of media content, along with the media content files, are transferred onto the set-top box 100 from an external location such as another networked computing device commonly known as the backend server 180. In other embodiments, where network connectivity is not available, such rules and media content may be transferred from an external data store onto a removable memory storage device (not shown) (e.g., a Universal Serial Bus (USB) flash memory drive), and then transferred from the removable memory storage device onto the set-top box 100 by connecting the removable memory storage device to a compatible I/O connection 130 on the set-top box 100, for example, a USB port. Although discussed in terms of a USB flash drive, other modes of transferring playlist rules and media content are possible. For example, other forms of portable, non-volatile storage such as DVDs, CDs, tape or floppy disk or Memory Cards such as Compact Flash, Secure Digital Card, MultiMedia Card, SmartMedia, Memory Stick, Memory Stick PRO, xD-Picture Card or a Micro Drive might be used instead. In alternative embodiments where Internet connectivity is not possible, the back-channel media delivery system communicates with the backend server 180 via a modem or other data connection.
  • Turning to the playlist rules, many different rules can be specified, with the number and type of rules related to the capabilities of the set-top box 100 and environmental sensors 171. An example of a simple playlist rule is one that is time based. For example, the content management system 111 can be instructed via a playlist rule to play a certain media selection according to the current time of day, day of the week, or a combination of the two. The rules can further specify a sequential, random or weighted randomization of media selections during a given time period. Different rules can be applied to different times of the day and on different days of the week. For example, suppose an embodiment of the invention were placed on a commuter train for playing advertisements or other media to commuters. In such a situation, the audience would be different during the rush hour commute than it would be, for example, at noon. Likewise, the audience on such a train would be different on the weekend than it would be during an ordinary mid-week work day. Playlist rules allow embodiments of the invention to be sensitive to these differences and enable an advertiser, for example, to tailor the selection and playback of media accordingly.
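  • As a hedged illustration of such time-based rules with weighted randomization, the following sketch evaluates a small rule table; the rule fields, weights and helper names are assumptions rather than the patent's specification:

```python
import random
from datetime import datetime

# Each rule lists eligible media with relative weights for a daypart and set of weekdays.
PLAYLIST_RULES = [
    {"days": {0, 1, 2, 3, 4}, "start_hour": 6,  "end_hour": 9,  "media": {"commuter_ad": 3, "coffee_ad": 1}},
    {"days": {5, 6},          "start_hour": 10, "end_hour": 18, "media": {"family_ad": 2, "matinee_ad": 2}},
]

def pick_next_media(now=None):
    """Return a media id chosen by weighted randomization from the rule matching the current time."""
    now = now or datetime.now()
    for rule in PLAYLIST_RULES:
        if now.weekday() in rule["days"] and rule["start_hour"] <= now.hour < rule["end_hour"]:
            media, weights = zip(*rule["media"].items())
            return random.choices(media, weights=weights, k=1)[0]
    return None  # no rule matched; the caller could fall back to a default rotation

print(pick_next_media(datetime(2007, 6, 11, 7, 30)))  # a Monday during the morning commute
```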
  • Another example of a playlist rule is one which specifies that the same media should not be repeated within a given period of time. In the commuter train example above, it is likely that almost the same audience would be on board the train from, for example, the suburbs into the city. Once the train has emptied at its destination, the playlist rules could permit the media selections to repeat because presumably, a new audience would be present to see the media content.
  • Playlist rules may also specify quotas for specific media with promotion or demotion of playback priority based on the number of impressions each media has received. For example, suppose a particular advertisement, ‘ad A’, is targeted to receive 100 impressions in a month. Suppose that ‘ad B’ is targeted to receive only 50 impressions per month. Further suppose that ‘ad B’ has already received 40 impressions while ‘ad A’ has received only 30. The content management system 111 may, in such a situation, boost the priority of ‘ad A’ so that it plays more frequently and likewise decrease the play priority of ‘ad B’ so it plays less frequently. In this way, the back-channel media delivery system can increase the likelihood that each advertisement will receive its targeted number of impressions. The play priority for any given piece of media may also be specified based on a premium service where advertisers, for example, pay a premium for more impressions or for playback priority.
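  • The quota behaviour described above might be approximated as in the sketch below, where media falling furthest behind its impression target receives the greatest playback weight; the proportional-shortfall policy is an assumption, one of many possible schemes:

```python
def playback_weight(target_impressions, received_impressions):
    """Weight media by how far behind its impression quota it is.

    'ad A' targeted at 100 with 30 received is further behind (0.70 remaining)
    than 'ad B' targeted at 50 with 40 received (0.20 remaining), so 'ad A'
    is promoted and 'ad B' demoted.
    """
    if target_impressions <= 0:
        return 0.0
    remaining = max(target_impressions - received_impressions, 0)
    return remaining / target_impressions

quotas = {"ad_A": (100, 30), "ad_B": (50, 40)}
weights = {ad: playback_weight(target, received) for ad, (target, received) in quotas.items()}
print(weights)  # {'ad_A': 0.7, 'ad_B': 0.2} -> 'ad A' plays more often
```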
  • In other embodiments, media can receive a higher playback priority because of its particular perishability. That is, certain media content is particularly time sensitive and in recognition of this, such media will receive a higher playback priority to hopefully increase the number of impressions. Examples of such media could involve sporting events (e.g., the Super Bowl), the season finale of a popular television show or media content related to an election.
  • More complex playlist rules can be used by the content management system 111 in conjunction with environmental data 123 provided by the environmental sensors 171. In one embodiment, the environmental sensors 171 act as traffic counters that simply count the number of persons passing in proximity to the back-channel media delivery system. The traffic count is provided as feedback to the back-channel media delivery system as was previously discussed. According to a particular rule, the content management system 111 may then prioritize the playback of specific media items or groups of media for playback during times of high traffic. It can also be specified in the playlist rules that the same media should not be played back-to-back. The playlist rules can also be used to have the content management system 111 prioritize specific media items or a group of media items to playback when the system senses a high traffic-count. A high-traffic count might be characterized in different ways. For example, the content management system 111 might consider passing a certain traffic-count threshold as “high-traffic.” Alternatively, the content management system 111 could characterize a large change in traffic within a certain period of time as high traffic and ignore the raw number of traffic counts altogether.
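  • Both characterizations of “high traffic” mentioned above (crossing an absolute threshold, or a large change within a short period) could be tested with logic like this sketch; the threshold values and window size are illustrative assumptions:

```python
from collections import deque

class TrafficMonitor:
    """Track per-interval traffic counts and flag high-traffic conditions."""

    def __init__(self, absolute_threshold=50, delta_threshold=20, window=5):
        self.absolute_threshold = absolute_threshold   # counts per interval considered "high"
        self.delta_threshold = delta_threshold         # rise within the window considered "high"
        self.recent = deque(maxlen=window)             # most recent interval counts

    def add_count(self, count):
        self.recent.append(count)

    def is_high_traffic(self):
        if not self.recent:
            return False
        above_threshold = self.recent[-1] >= self.absolute_threshold
        large_change = (self.recent[-1] - min(self.recent)) >= self.delta_threshold
        return above_threshold or large_change

monitor = TrafficMonitor()
for count in (5, 8, 7, 31, 12):
    monitor.add_count(count)
    print(count, monitor.is_high_traffic())
```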
  • Although traffic counting sensors are perhaps the most common type of environmental sensors 171, more sophisticated sensors are capable of sensing and measuring more complex data for use with more complex playlist rules and data gathering. For example, as previously discussed, some sensors are capable of measuring the dwell time of a person or persons within a proximity of the back-channel media delivery system. Such capability is useful for helping determine the efficacy of any given advertisement. For example, it is advantageous to know that while ‘ad A’ was watched by 10 persons, only 2 of them stayed for the entire 30 second duration of the advertisement. This data is even more valuable if it is also known that ‘ad B’ was watched by 10 persons and 8 of them stayed for the entire 30 second duration of the advertisement. The simplest sensors would detect only proximity and determine how long a person or persons are in proximity to the back-channel media delivery system.
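One simple way to reduce such dwell-time measurements to a per-advertisement figure is a completion rate, sketched below; the definition of "completed" and the example dwell times used for viewers who left early are assumptions.

```python
def completion_rate(dwell_times_s, ad_duration_s=30.0):
    """Fraction of observed viewers who stayed for the full advertisement."""
    if not dwell_times_s:
        return 0.0
    completed = sum(1 for t in dwell_times_s if t >= ad_duration_s)
    return completed / len(dwell_times_s)

# With the numbers from the text (illustrative dwell times for early leavers):
# completion_rate([30] * 2 + [10] * 8)  -> 0.2 for 'ad A'
# completion_rate([30] * 8 + [10] * 2)  -> 0.8 for 'ad B'
```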
  • Another type of sensor could read the information stored in an RFID tag. Such a tag might be placed in products sold in a store. In one embodiment, the sensors in a back-channel media delivery system could determine the quantity and type of products a person has in their shopping cart as they approach the system in a retail store. Such data could then be used to select media for playback that is tailored for that particular person and their buying habits. Alternatively, RFID tags could be placed in the name tags of persons attending a large convention. The RFID tag could store information about that particular person's area of expertise. The RFID tag could then be read as the person moves about the convention, and media content that would be of interest to such a person could be adaptively rendered by a back-channel media delivery system as they pass by.
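A sketch of how RFID reads might drive media selection appears below; the tag payload format, the category-to-media mapping and the "most frequent category wins" policy are all hypothetical.

```python
from collections import Counter

# Hypothetical mapping from product categories (read off RFID tags in a shopping
# cart) to media items targeted at shoppers holding such products.
CATEGORY_TO_MEDIA = {
    "pet_food": "pet_supplies_promo",
    "infant_formula": "baby_products_promo",
    "camping_gear": "outdoor_brand_spot",
}

def media_for_cart(tag_payloads):
    """Pick media whose target category appears most often among the scanned tags.
    Each payload is assumed, for this sketch, to be a dict with a 'category' field."""
    categories = Counter(p.get("category") for p in tag_payloads)
    for category, _ in categories.most_common():
        if category in CATEGORY_TO_MEDIA:
            return CATEGORY_TO_MEDIA[category]
    return None  # nothing recognized; fall back to the normal playlist rules
```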
  • In yet another embodiment, environmental sensors could detect the height and weight of a person in proximity to the back-channel media delivery system. Such information might be particularly useful when coupled with other information. For example, the playlist rules might direct the system to interpret a shorter, lighter person detected by the sensors 171 during after-school hours, between 3 and 5 P.M., as a child. In such instances, the playlist rules can further control the content management system to render media content intended for children.
  • Embodiments of the invention might also include one or more environmental sensors capable of tracking the attention of persons in proximity to the back-channel media delivery system. There are many possible configurations of attention tracking sensors. Some attention tracking sensors, for example, can track the attention of a subject through the measurement or detection of aspects of the subject's face. One such attention tracking sensor might, for example, use a camera and suitable illumination to capture images of an area in proximity to the back-channel media delivery system. Suitable processing of the images could be used to determine the locations of people within the image and, in particular, where those people are actually looking. Such processing could, for example, detect whether a person is looking at the screen based on, for example, the angle of their facial features within the captured images. As is known in the art, there are complex algorithms such as the mean shift algorithm that allow for face recognition and face tracking, and such algorithms may be advantageously employed in an attention tracking sensor.
  • Alternatively, methods exist for attention tracking based on tracking only the eyes. A suitable eye tracking algorithm may process the captured images in order to determine whether the subject's eyes are pointed at the back-channel media delivery system. Attention tracking using only the eyes may be advantageous in certain lighting situations or where the particular illumination results in accentuation of the eyes within the captured images. Attention tracking using both the eyes and other aspects of the face or head may also be advantageous because, although a person's face may be generally facing the back-channel media delivery system, they may not be looking directly at the system. Instead, for example, they may be looking at something behind or to the side of the system. Use of eye tracking may thus permit attention tracking sensors to be more accurate. An attention tracking sensor incorporated into the embodiment depicted in FIG. 1A would allow the back-channel media delivery system to determine whether each individual in the defined field is looking at the display 150 from moment to moment.
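As one rough illustration of face-based attention sensing (not the mean shift tracker mentioned above, and far simpler than a production attention sensor), OpenCV's stock frontal-face detector can serve as a crude proxy for "faces oriented toward the display"; the detector parameters and the way frames are obtained are assumptions.

```python
# Crude proxy for "is anyone facing the display?", sketched with OpenCV's stock
# frontal-face Haar cascade. A frontal detector fires mostly when a face is
# oriented toward the camera, i.e. roughly toward the display it is mounted on.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def count_faces_toward_camera(frame) -> int:
    """Count frontal faces in a single captured BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)
```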
  • Attention tracking sensors would permit embodiments of the back-channel media delivery system to gather information on how long each individual looks at the screen. Gathering such information on a second-by-second basis permits gauging the effectiveness of a particular instance of media content, or different time segments within that instance, in getting and maintaining the attention of people. Likewise, through the use of an appropriate playlist rule, one embodiment of the back-channel media delivery system could dynamically alter rendering of media in response to changing interest in the media being rendered. For example, suppose a person is watching the display 150 of the embodiment depicted in FIG. 1A. An attention tracking environmental sensor connected to such an embodiment could detect that the person is no longer paying attention or has averted their gaze in some manner, and that information could cause the system to start rendering a different instance of media. Alternatively, other embodiments of the system might change the volume of sound, change the brightness of the display, or other parameters of media playback in an attempt to regain the person's attention.
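A toy reaction policy of the kind described above might look like the following; the player object and its methods are hypothetical stand-ins for whatever control interface the media player system 112 exposes.

```python
def react_to_attention(attentive_count: int, player) -> None:
    """Illustrative policy: switch media when nobody is watching, or nudge the
    volume up when attention is low, in an attempt to regain interest.
    'player' and its methods are hypothetical, not part of the patent."""
    if attentive_count == 0:
        player.skip_to_next_media()
    elif attentive_count < 2:
        player.set_volume(player.get_volume() + 0.1)
```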
  • An attention tracking environmental sensor could also permit more accurate determination of a person's size, shape, height or the speed with which they move. Such information could be used by embodiments of the invention to generate probabilistic demographic information. Such information is useful and valuable in and of itself. It might also, however, be used by a playlist rule within an embodiment of the invention to custom tailor a media selection for the person most likely to be watching the display at that moment.
  • With further reference to FIG. 1A, although the set-top box 100 is capable of functioning more or less autonomously using playlist rules and environmental data, the set-top box 100 may also periodically synchronize with the backend server 180. The backend server 180 receives the playback history log file 124 and can also upload new media content 185 and playlist rules 186 to the set-top box 100. Alternatively, the set-top box may continually communicate with the backend server 180, allowing the playback history to be communicated to the backend server 180 in real time. The playback history log file 124 may be reformatted and exported in a digitally transmittable format prior to being transmitted to the backend server 180. In some embodiments, the transmission makes use of HTTP over TCP/IP protocols between the set-top box 100 and the backend server 180, which could be connected via an Ethernet network. The connection could also be wireless, using an 802.11x Wi-Fi network, Bluetooth connectivity, cellular connectivity, radio frequency, or some variation thereof. The transmitted playback history log file 124 is collected, stored, and analyzed on the backend server 180 and made available for various reporting functions as needed by the user of the system. The backend server 180 is able to support the simultaneous collection of playback history log files 124 from multiple set-top boxes 100. The playback history log files 124 are aggregated and processed by an analysis program 181 that executes on the backend server 180. The analysis program 181 generates reports, and can further allow users to interactively query and view the imported playback history log files 124 and aggregated information.
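A minimal sketch of the HTTP upload step, using only the Python standard library, is shown below; the server URL and JSON payload shape are assumptions, since the patent specifies only that HTTP over TCP/IP may be used.

```python
# Sketch of the periodic synchronization step: POST the playback history log to
# the backend over HTTP. URL and payload format are illustrative assumptions.
import json
import urllib.request

def upload_playback_history(log_records, server_url="http://backend.example/logs"):
    payload = json.dumps({"playback_history": log_records}).encode("utf-8")
    request = urllib.request.Request(
        server_url, data=payload,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status  # e.g. 200 on success
```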
  • FIG. 1B depicts an embodiment of the back-channel media delivery system wherein the environmental data server 170 is integrated into the set-top box 100. Such an embodiment obviates the need for network connections between the environmental data server 170 and the set-top box 100 as well as the need for separate server hardware for the environmental data server. In some embodiments, the environmental sensors 171 are likewise directly connected to the set-top box 100. The functionality of these embodiments is otherwise identical to the embodiments discussed above.
  • FIG. 2A is a data flow diagram that describes the flow of data within the embodiment of system depicted in FIG. 1A above. Operation of this embodiment of the back-channel media delivery system typically begins with the content management system 111 determining the next media to render in accordance with the playlist rules 122. The content management system 111 communicates the location of the next media to the media player system 112. The media player system 112 retrieves the appropriate media file from the media content files 121 and then typically renders the media on, for example, the display screen 150. When the media player system 112 begins to render the media, it generates a media playback begin event which is communicated to the logger program 114. When the media player system 112 stops rendering that particular media, it generates a media playback end event that is also communicated to the logger program 114.
  • While the media is being rendered, the environmental sensors 171 begin detecting impressions and generate count events which are communicated to the environmental data server 170. The traffic count and type of traffic is passed from the environmental data server 170 to the logger program 114. During the period of time between the media playback begin event and end event, the logger program 114 logs the media ID, the timestamp and the traffic count and other environmental data to the playback log file 124. Periodically, the playback log will be exported to the backend server 180.
  • FIG. 2B is a data flow diagram that describes the flow of data within the embodiment of the system wherein the environmental data server 170 is integrated into the set-top box 100 and as depicted in FIG. 1B. The data flow shown in FIG. 2B is essentially the same as that of FIG. 2A except that the environmental sensors 171 communicate directly with set-top box 100 and its integrated environmental data server 170.
  • FIG. 2C is a data flow diagram of an example embodiment of a back-channel media delivery system where the collected environmental data is used as feedback to help determine the next media to play. During the time that the media player system 112 is rendering a media content file and the logger program 114 is likewise creating the playback history log 124, information about the traffic count and other environmental feedback is communicated back to the content management system 111. This feedback is used in conjunction with the playlist rules 122 to determine the next media content to render. The data flow depicted in this Figure is otherwise identical to that of FIG. 2A.
  • FIG. 3 shows a flow chart diagram for one implementation of the log media playback history routine 301 of the logging system 114. A media playback event is received from the media player system 112 at step 302. The logging system 114 checks the playback event type at step 303. If the playback event type is a “Play Start” event, the event details are written to the log file at step 304, including the media identifier and timestamp T1 information. The timestamp T1 is stored in memory at step 305 for use later when a “Play End” event is received. The routine then ends at step 310. If the playback event type is a “Play End” event, the timestamp T1 that was stored in memory is retrieved at step 306. Environmental sensor count data for the time interval between timestamp T1 and the current timestamp T2 is retrieved from the environmental data 123 at step 307. The event details are written to a playback history log file 124 at step 308, including the media identifier, the timestamp T2, and the environmental sensor count data. Once the playback history log file 124 has been completed, it may be exported to the backend server 180 for further analysis.
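The flow of FIG. 3 can be restated as a short Python sketch; the event dictionaries, in-memory structures and the get_sensor_counts helper are assumptions standing in for the structures named in the figure.

```python
import time

# Sketch of the FIG. 3 logging routine. Only the sequence of steps follows the
# flow chart; the data structures below are illustrative stand-ins.
pending_start = {}          # media_id -> timestamp T1, held until "Play End"
playback_history_log = []   # stands in for the playback history log file 124

def log_media_playback(event, get_sensor_counts):
    media_id, now = event["media_id"], time.time()
    if event["type"] == "Play Start":
        playback_history_log.append(
            {"event": "start", "media_id": media_id, "t1": now})  # step 304
        pending_start[media_id] = now                             # step 305: remember T1
    elif event["type"] == "Play End":
        t1 = pending_start.pop(media_id, None)                    # step 306: recall T1
        counts = get_sensor_counts(t1, now)                       # step 307: data for T1..T2
        playback_history_log.append(
            {"event": "end", "media_id": media_id, "t2": now, "counts": counts})  # step 308
```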
  • FIG. 4 depicts a high level block diagram of a back-channel media delivery system according to one embodiment of the invention. Although FIG. 4 illustrates a particular embodiment, it will be understood that alternative embodiments are possible as is evident from the embodiments and variations described above. The back-channel media delivery system 400 includes a media delivery device 410 and a backend server 440. The media delivery device 410 includes a computing device 420. The media delivery device 410 also includes environmental sensors 415 and a rendering device 425 coupled to the computing device 420. The computing device 420 selects media stored on the computing device 420 for rendering on the rendering device 425. Alternatively, the computing device 420 may select media stored elsewhere. The computing device 420 then renders the media on the rendering device 425. While rendering the media, the computing device 420 gathers environmental data from the environmental sensors 415. The computing device 420 compiles the playback history 435 and transmits this history to the backend server 440. As was discussed more fully above, the backend server 440 is used to process and analyze the back-channel data. From this data, new playback rules may be devised for use by the media delivery device 410 during future renderings of new media on the rendering device 425. New media and playback rules 430 are then transmitted to the media delivery device 410. It will be understood that receiving the playback history 435 by the backend server 440 and sending the new media and playback rules 430 to the media delivery device 410 may be accomplished in numerous ways. For example, and as was discussed more fully above, the communication may take place via various types of wired or wireless connections or via non-volatile media.
  • From the foregoing it will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made by one skilled in the art without deviating from the invention. Accordingly, the invention is not limited except as by the appended claims.

Claims (39)

1. A method of rendering media, the method comprising:
selecting a first instance of media;
rendering the first instance of media on a first rendering device;
sensing at least one environmental state external to the first rendering device to provide a first sensed state; and
saving a first record of the first sensed state.
2. The method of claim 1 wherein the at least one environmental state comprises at least one of: the presence of a person in proximity to the first rendering device, the duration of the presence of a person in proximity to the first rendering device, the height or weight of a person in proximity to the first rendering device, the count of the number of persons sensed over time, the rate of change of persons sensed within a given time period, the number of people looking at the first rendering device, how long a person looks at the first rendering device, the actual time during which a person is looking at the first rendering device, the size, shape or height of a person who is looking at the rendering device, the speed with which someone in proximity to the first rendering device moves, and the presence of a radio frequency identification (‘RFID’) tag in proximity to the first rendering device.
3. The method of claim 2 further comprising:
selecting a second instance of media according to the first record;
rendering the second instance of media on a second rendering device;
sensing at least one environmental state external to the second rendering device to provide a second sensed state; and
saving a second record of the second sensed state.
4. The method of claim 3 further comprising:
prior to selecting the first instance of media, defining a plurality of rules for selecting media.
5. The method of claim 4 wherein the first rendering device comprises at least one of: a television screen, a video monitor, an electronic sign, and an audio playback system.
6. The method of claim 5 wherein the second rendering device comprises at least one of: a television screen, a video monitor, an electronic sign, and an audio playback system.
7. The method of claim 5 wherein the first rendering device is the same device as the second rendering device.
8. The method of claim 1 wherein sensing at least one environmental state comprises sensing with at least one sensor.
9. The method of claim 8 wherein the at least one sensor comprises at least one of: thermal imaging camera sensors, infrared sensors, pressure sensors, video imaging camera sensors, sonar sensors, laser sensors, attention tracking sensors, and radio frequency identification (‘RFID’) tag sensors.
10. The method of claim 9 wherein attention tracking sensors comprise at least one of: eye tracking sensors and face tracking sensors.
11. The method of claim 10 wherein the face tracking sensors track the attention of a subject based on the angle the subject's face makes with at least one rendering device.
12. The method of claim 1 wherein media comprises media encoded as at least one of: MP3, MPEG2, MPEG4, AVI, WMA and WMV.
13. The method of claim 3 wherein selecting the first and second instances of media comprises selecting the first and second instances of media according to at least one rule for selecting media.
14. The method of claim 13 wherein the at least one rule for selecting media comprises a rule that permits rendering an instance of media depending on at least one of: the time of day, the day of the week, the media previously rendered, playback quotas, perishability and the sensing of at least one of the plurality of environmental states.
15. The method of claim 3 wherein the first and second records of the first and second sensed states, respectively, comprise:
a record of the time of day when the sensing occurred;
data related to the type of environmental state sensed; and
a record identifying the instance of media being rendered at that time.
16. The method of claim 1 further comprising:
periodically communicating with a backend server to upload the records of the sensed states and to obtain new instances of media for rendering.
17. A media playback system comprising:
at least one rendering device;
at least one environmental sensor;
a computing device coupled to the at least one rendering device and the at least one environmental sensor and configured to:
select a first instance of media stored in the computing device;
render the first instance of media on the at least one rendering device;
while rendering the first instance of media:
use the at least one environmental sensor to sense at least one environmental state external to the computing device to provide sensed states; and
save a record of the sensed states.
18. The media playback system of claim 17 wherein the at least one environmental state comprises at least one of: the presence of a person in proximity to the media playback system, the duration of the presence of a person in proximity to the media playback system, the height or weight of a person in proximity to the system, the count of the number of persons sensed over time, the rate of change of persons sensed within a given time period, the number of people looking at the first rendering device, how long a person looks at the first rendering device, the actual time during which a person is looking at the first rendering device, the size, shape or height of a person who is looking at the rendering device, the speed with which someone in proximity to the first rendering device moves, and the presence of a radio frequency identification (‘RFID’) tag in proximity to the media playback system.
19. The media playback system of claim 17 wherein the at least one rendering device comprises at least one of: a television screen, a video monitor, an electronic sign, and an audio playback system.
20. The media playback system of claim 17 wherein the at least one environmental sensor comprises at least one of: thermal imaging camera sensors, infrared sensors, pressure sensors, video imaging camera sensors, sonar sensors, laser sensors, attention tracking sensors, and radio frequency identification (‘RFID’) tag sensors.
21. The media playback system of claim 20 wherein attention tracking sensors comprise at least one of: eye tracking sensors and face tracking sensors.
22. The media playback system of claim 21 wherein the face tracking sensors track the attention of a subject based on the angle the subject's face makes with at least one rendering device.
23. The media playback system of claim 17 wherein the computing device is further configured to render media encoded as at least one of: MP3, MPEG2, MPEG4, AVI, WMA and WMV.
24. The media playback system of claim 17 wherein the computing device is further configured to select the first instance according to at least one rendering rule.
25. The media playback system of claim 24 wherein the at least one rendering rule comprises a rule that permits rendering an instance of media depending on at least one of: the time of day, the day of the week, the media previously rendered, playback quotas, perishability and the record of sensed states.
26. The media playback system of claim 17 wherein the computing device is further configured to:
select a second instance of media according to the record of the sensed states; and
render the second instance of media on at least one rendering device.
27. The media playback system of claim 17 further comprising:
a backend server coupled to the computing device, the computing device further configured to:
periodically communicate with the backend server;
upload the records of sensed states to the backend server; and
download new instances of media or new rules to the computing device.
28. A method of defining rendering criteria for rendering media on a media delivery device comprising:
receiving rendering information from the media delivery device;
creating rules based on the rendering information; and
sending the rules to the media delivery device wherein the rules define the criteria for rendering the media on the media delivery device.
29. The method of claim 28 further comprising:
sending media for rendering to the media delivery device.
30. The method of claim 28 wherein the rules that define criteria for rendering the media depend on at least one of: the time of day, the day of the week, the media previously rendered, playback quotas, perishability and the record of sensed states.
31. The method of claim 28 wherein the rendering information comprises:
environmental data sensed by the media playback device during rendering of media by the device;
a record of the time of day when the media playback device sensed the environmental data; and
a record identifying the media being rendered when the media playback system sensed the environmental data.
32. The method of claim 31 wherein the environmental data comprises information indicating at least one of: the presence of a person in proximity to the media playback system, the duration of the presence of a person in proximity to the media playback system, the height or weight of a person in proximity to the media playback system, the count of the number of persons sensed over time, the rate of change of persons sensed within a given time period, the number of people looking at the first rendering device, how long a person looks at the first rendering device, the actual time during which a person is looking at the first rendering device, the size, shape or height of a person who is looking at the rendering device, the speed with which someone in proximity to the first rendering device moves, and the presence of a radio frequency identification (‘RFID’) tag in proximity to the media playback system.
33. The method of claim 31 wherein the environmental data is sensed by at least one environmental sensor.
34. The method of claim 33 wherein the at least one environmental sensor comprises at least one of: thermal imaging camera sensors, infrared sensors, pressure sensors, video imaging camera sensors, sonar sensors, laser sensors, attention tracking sensors, and radio frequency identification (‘RFID’) tag sensors.
35. The method of claim 34 wherein attention tracking sensors comprise at least one of: eye tracking sensors and face tracking sensors.
36. The method of claim 35 wherein the face tracking sensors track the attention of a subject based on the angle the subject's face makes with at least one rendering device.
37. The method of claim 28 wherein receiving rendering information and sending media for rendering and rules comprises receiving and sending, respectively, through at least one of: an Ethernet connection, an RS-232 serial connection, a USB connection, an 802.11x wireless connection, a Bluetooth connection, an infra-red connection and non-volatile storage.
38. The method of claim 37 wherein non-volatile storage comprises at least one of: a DVD, a CD, a magnetic tape or floppy disk and a memory card.
39. The method of claim 38 wherein a memory card comprises at least one of: a Compact Flash card, a Secure Digital Card, a MultiMedia Card, SmartMedia, a MemoryStick or MemoryStick PRO, an xD-Picture Card and a Micro Drive.
US11/761,761 2007-01-31 2007-06-12 Back-channel media delivery system Abandoned US20080183575A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/761,761 US20080183575A1 (en) 2007-01-31 2007-06-12 Back-channel media delivery system
PCT/US2008/052511 WO2008095028A1 (en) 2007-01-31 2008-01-30 Back-channel media delivery system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US89885507P 2007-01-31 2007-01-31
US11/761,761 US20080183575A1 (en) 2007-01-31 2007-06-12 Back-channel media delivery system

Publications (1)

Publication Number Publication Date
US20080183575A1 true US20080183575A1 (en) 2008-07-31

Family

ID=39669016

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/761,761 Abandoned US20080183575A1 (en) 2007-01-31 2007-06-12 Back-channel media delivery system
US11/981,636 Active 2030-09-20 US9171317B2 (en) 2007-01-31 2007-10-30 Back-channel media delivery system
US12/011,331 Active 2032-11-19 US9105040B2 (en) 2007-01-31 2008-01-25 System and method for publishing advertising on distributed media delivery systems

Family Applications After (2)

Application Number Title Priority Date Filing Date
US11/981,636 Active 2030-09-20 US9171317B2 (en) 2007-01-31 2007-10-30 Back-channel media delivery system
US12/011,331 Active 2032-11-19 US9105040B2 (en) 2007-01-31 2008-01-25 System and method for publishing advertising on distributed media delivery systems

Country Status (2)

Country Link
US (3) US20080183575A1 (en)
WO (2) WO2008095028A1 (en)


Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080249850A1 (en) * 2007-04-03 2008-10-09 Google Inc. Providing Information About Content Distribution
AU2008301216A1 (en) 2007-09-17 2009-03-26 Seeker Wireless Pty Limited Systems and methods for triggering location based voice and/or data communications to or from mobile radio terminals
KR101303672B1 (en) * 2007-10-15 2013-09-16 삼성전자주식회사 Device and method of sharing contents by devices
US20100063862A1 (en) * 2008-09-08 2010-03-11 Thompson Ronald L Media delivery system and system including a media delivery system and a building automation system
US8364011B2 (en) * 2009-05-06 2013-01-29 Disney Enterprises, Inc. System and method for providing a personalized media consumption experience
US20110105084A1 (en) * 2009-10-30 2011-05-05 Openwave Systems, Inc. Back-channeled packeted data
US20110320259A1 (en) * 2010-06-25 2011-12-29 Wavemarket, Inc. Location based advertising system and method
US8438590B2 (en) 2010-09-22 2013-05-07 General Instrument Corporation System and method for measuring audience reaction to media content
US8725174B2 (en) 2010-10-23 2014-05-13 Wavemarket, Inc. Mobile device alert generation system and method
US8990108B1 (en) 2010-12-30 2015-03-24 Google Inc. Content presentation based on winning bid and attendance detected at a physical location information in real time
US20150154639A1 (en) * 2012-04-04 2015-06-04 Google Inc. Serving advertisements based on user physical activity
US9489531B2 (en) 2012-05-13 2016-11-08 Location Labs, Inc. System and method for controlling access to electronic devices
US9554190B2 (en) 2012-12-20 2017-01-24 Location Labs, Inc. System and method for controlling communication device use
US9232495B2 (en) 2013-12-06 2016-01-05 Location Labs, Inc. Device association-based locating system and method
US10013639B1 (en) 2013-12-16 2018-07-03 Amazon Technologies, Inc. Analyzing digital images based on criteria
CN105354191A (en) * 2014-08-18 2016-02-24 联想(北京)有限公司 Information processing method and electronic device
US10235690B2 (en) * 2015-03-11 2019-03-19 Admobilize Llc. Method and system for dynamically adjusting displayed content based on analysis of viewer attributes
CN105022795B (en) * 2015-06-30 2018-06-05 成都蓝码科技发展有限公司 A kind of new media cloud distribution platform and its implementation towards big data
CN106028391B (en) * 2016-05-19 2020-12-01 腾讯科技(深圳)有限公司 People flow statistical method and device
US10825043B2 (en) * 2017-01-31 2020-11-03 Oath Inc. Methods and systems for processing viewability metrics
US11379877B2 (en) * 2019-08-29 2022-07-05 Skopus Media Inc. Measuring effectiveness of images displayed on a moving object


Family Cites Families (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7316025B1 (en) 1992-11-16 2008-01-01 Arbitron Inc. Method and apparatus for encoding/decoding broadcast or recorded segments and monitoring audience exposure thereto
US5884028A (en) * 1994-07-29 1999-03-16 International Business Machines Corporation System for the management of multiple time-critical data streams
EP1643340B1 (en) * 1995-02-13 2013-08-14 Intertrust Technologies Corp. Secure transaction management
US5740549A (en) 1995-06-12 1998-04-14 Pointcast, Inc. Information and advertising distribution system and method
US7895076B2 (en) * 1995-06-30 2011-02-22 Sony Computer Entertainment Inc. Advertisement insertion, profiling, impression, and feedback
US5995134A (en) 1995-12-14 1999-11-30 Time Warner Cable Method and apparatus for enticing a passive television viewer by automatically playing promotional presentations of selectable options in response to the viewer's inactivity
US5974398A (en) 1997-04-11 1999-10-26 At&T Corp. Method and apparatus enabling valuation of user access of advertising carried by interactive information and entertainment services
US6542185B1 (en) * 1998-01-07 2003-04-01 Intel Corporation Method and apparatus for automated optimization of white and color balance on video camera
US6152563A (en) 1998-02-20 2000-11-28 Hutchinson; Thomas E. Eye gaze direction tracker
US8290351B2 (en) 2001-04-03 2012-10-16 Prime Research Alliance E., Inc. Alternative advertising in prerecorded media
US7340439B2 (en) 1999-09-28 2008-03-04 Chameleon Network Inc. Portable electronic authorization system and method
US6526335B1 (en) 2000-01-24 2003-02-25 G. Victor Treyz Automobile personal computer systems
TWI238319B (en) 2000-03-24 2005-08-21 Norio Watanabe Commercial effect measuring system, commercial system, and appealing power sensor
US20020111146A1 (en) 2000-07-18 2002-08-15 Leonid Fridman Apparatuses, methods, and computer programs for displaying information on signs
US20020065046A1 (en) 2000-07-18 2002-05-30 Vert, Inc. Apparatuses, methods, and computer programs for showing information on a vehicle having multiple displays
US6937996B1 (en) 2000-08-29 2005-08-30 Charles Bradley Forsythe Method and system for selecting and purchasing media advertising
WO2002041262A1 (en) 2000-11-16 2002-05-23 Laurence Haquet System for analysing people's movements
US6645078B1 (en) 2001-02-16 2003-11-11 International Game Technology Casino gambling apparatus with person detection
US7174029B2 (en) 2001-11-02 2007-02-06 Agostinelli John A Method and apparatus for automatic selection and presentation of information
US8561095B2 (en) 2001-11-13 2013-10-15 Koninklijke Philips N.V. Affective television monitoring and control in response to physiological data
MXPA04006108A (en) * 2001-12-21 2005-03-31 Thinking Pictures Inc Method, system and apparatus for media distribution and viewing verification.
US20030126013A1 (en) 2001-12-28 2003-07-03 Shand Mark Alexander Viewer-targeted display system and method
KR100438841B1 (en) * 2002-04-23 2004-07-05 삼성전자주식회사 Method for verifying users and updating the data base, and face verification system using thereof
US20040128198A1 (en) * 2002-05-15 2004-07-01 Linwood Register System and method for computer network-based enterprise media distribution
US20040073482A1 (en) 2002-10-15 2004-04-15 Wiggins Randall T. Targeted information content delivery using a combination of environmental and demographic information
JP2004157842A (en) 2002-11-07 2004-06-03 Nec Corp Eco drive diagnostic system and its method and business system using the same
US8292433B2 (en) 2003-03-21 2012-10-23 Queen's University At Kingston Method and apparatus for communication between humans and devices
US20040111360A1 (en) 2003-07-14 2004-06-10 David Albanese System and method for personal and business information exchange
JP2007503045A (en) 2003-08-18 2007-02-15 ユー−マーケティング インテレクチュアル プロパティーズ プライベート リミテッド Spontaneous delivery sales system and method
US20050149396A1 (en) 2003-11-21 2005-07-07 Marchex, Inc. Online advertising system and method
JP2005293491A (en) 2004-04-05 2005-10-20 Hitachi Ltd Server system
US8321269B2 (en) * 2004-10-26 2012-11-27 Validclick, Inc Method for performing real-time click fraud detection, prevention and reporting for online advertising
US20070073589A1 (en) 2004-11-04 2007-03-29 Vergeyle David L Electronic capture of promotions
US20060170670A1 (en) 2004-11-10 2006-08-03 Burke Joel S Interactive electronic dispaly, methods and apparatus for targeted propagation of sign content, systems for capturing and sending photographs and video, as a means of integrated customer service, information capture and marketing
US20080154671A1 (en) 2005-03-15 2008-06-26 Delk Louis D Emissions Tracking, Such as Vehicle Emissions Tracking, and Associated Systems and Methods
WO2006121986A2 (en) 2005-05-06 2006-11-16 Facet Technology Corp. Network-based navigation system having virtual drive-thru advertisements integrated with actual imagery from along a physical route
US9558498B2 (en) 2005-07-29 2017-01-31 Excalibur Ip, Llc System and method for advertisement management
US20070073585A1 (en) 2005-08-13 2007-03-29 Adstreams Roi, Inc. Systems, methods, and computer program products for enabling an advertiser to measure user viewing of and response to advertisements
US20070073579A1 (en) * 2005-09-23 2007-03-29 Microsoft Corporation Click fraud resistant learning of click through rate
US20070105536A1 (en) 2005-11-07 2007-05-10 Tingo George Jr Methods and apparatus for providing SMS notification, advertisement and e-commerce systems for university communities
US20070179852A1 (en) 2005-11-17 2007-08-02 Intent Media Works Holding, Llc Media distribution systems
JP2007143754A (en) 2005-11-25 2007-06-14 Aruze Corp Game machine
US20080004953A1 (en) 2006-06-30 2008-01-03 Microsoft Corporation Public Display Network For Online Advertising
US7693869B2 (en) 2006-09-06 2010-04-06 International Business Machines Corporation Method and apparatus for using item dwell time to manage a set of items
US20080147488A1 (en) * 2006-10-20 2008-06-19 Tunick James A System and method for monitoring viewer attention with respect to a display and determining associated charges
US20080167992A1 (en) 2007-01-05 2008-07-10 Backchannelmedia Inc. Methods and systems for an accountable media advertising application

Patent Citations (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5841987A (en) * 1994-08-19 1998-11-24 Thomson Consumer Electronics, Inc. Simple bus and interface system for consumer digital equipment
US7574727B2 (en) * 1997-07-23 2009-08-11 Touchtunes Music Corporation Intelligent digital audiovisual playback system
US5973683A (en) * 1997-11-24 1999-10-26 International Business Machines Corporation Dynamic regulation of television viewing content based on viewer profile and viewing history
US7364068B1 (en) * 1998-03-11 2008-04-29 West Corporation Methods and apparatus for intelligent selection of goods and services offered to conferees
US6631356B1 (en) * 1999-03-15 2003-10-07 Vulcan Portals, Inc. Demand aggregation through online buying groups
US6968565B1 (en) * 2000-02-25 2005-11-22 Vulcan Patents Llc Detection of content display observers with prevention of unauthorized access to identification signal
US20020062481A1 (en) * 2000-02-25 2002-05-23 Malcolm Slaney Method and system for selecting advertisements
US6574793B1 (en) * 2000-02-25 2003-06-03 Interval Research Corporation System and method for displaying advertisements
US6904408B1 (en) * 2000-10-19 2005-06-07 Mccarthy John Bionet method, system and personalized web content manager responsive to browser viewers' psychological preferences, behavioral responses and physiological stress indicators
US20050288954A1 (en) * 2000-10-19 2005-12-29 Mccarthy John Method, system and personalized web content manager responsive to browser viewers' psychological preferences, behavioral responses and physiological stress indicators
US20020125993A1 (en) * 2001-03-09 2002-09-12 Koninklijke Philips Electronics N.V. Apparatus and method for delivering an audio and/or video message to an individual
US7584150B2 (en) * 2001-10-12 2009-09-01 Hitachi, Ltd. Recording method, recording medium, and recording system
US20040003393A1 (en) * 2002-06-26 2004-01-01 Koninlkijke Philips Electronics N.V. Method, system and apparatus for monitoring use of electronic devices by user detection
US7027054B1 (en) * 2002-08-14 2006-04-11 Avaworks, Incorporated Do-it-yourself photo realistic talking head creation system and method
US20060188109A1 (en) * 2003-07-18 2006-08-24 Sony Corporation Reproducer and method for controlling reproduction
US7584353B2 (en) * 2003-09-12 2009-09-01 Trimble Navigation Limited Preventing unauthorized distribution of media content within a global network
US7505621B1 (en) * 2003-10-24 2009-03-17 Videomining Corporation Demographic classification using image components
US7319779B1 (en) * 2003-12-08 2008-01-15 Videomining Corporation Classification of humans into multiple age categories from digital images
US20070089125A1 (en) * 2003-12-22 2007-04-19 Koninklijke Philips Electronic, N.V. Content-processing system, method, and computer program product for monitoring the viewer's mood
US7302475B2 (en) * 2004-02-20 2007-11-27 Harris Interactive, Inc. System and method for measuring reactions to product packaging, advertising, or product features over a computer-based network
US20060100980A1 (en) * 2004-10-27 2006-05-11 Bellsouth Intellectual Property Corporation Methods and systems for delivering yellow pages content to a media delivery device
US20060174261A1 (en) * 2004-11-19 2006-08-03 Image Impact, Inc. Method and system for quantifying viewer awareness of advertising images in a video source
US20060117341A1 (en) * 2004-11-26 2006-06-01 Park Ju-Hee Method and apparatus to transmit data broadcasting content and method and apparatus to receive data broadcasting content
US20060147192A1 (en) * 2005-01-05 2006-07-06 Jian Zhang Best shooting moment selectable digital camera apparatus
US7664124B2 (en) * 2005-05-31 2010-02-16 At&T Intellectual Property, I, L.P. Methods, systems, and products for sharing content
US20060282465A1 (en) * 2005-06-14 2006-12-14 Corescient Ventures, Llc System and method for searching media content
US20070150340A1 (en) * 2005-12-28 2007-06-28 Cartmell Brian R Advertising technique
US20090142038A1 (en) * 2006-03-06 2009-06-04 Masaru Nishikawa Compressed coded data playback apparatus, and decoding/playback method of compressed coded data in the same apparatus
US20080183560A1 (en) * 2007-01-31 2008-07-31 Vulcan Portals, Inc. Back-channel media delivery system
US20090048908A1 (en) * 2007-01-31 2009-02-19 Vulcan Portals, Inc. Media delivery system
US20100106597A1 (en) * 2008-10-29 2010-04-29 Vulcan Portals, Inc. Systems and methods for tracking consumers

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080183560A1 (en) * 2007-01-31 2008-07-31 Vulcan Portals, Inc. Back-channel media delivery system
US20080189168A1 (en) * 2007-01-31 2008-08-07 Vulcan Portals, Inc. System and method for publishing advertising on distributed media delivery systems
US9171317B2 (en) 2007-01-31 2015-10-27 Vulcan Ip Holdings, Inc. Back-channel media delivery system
US9105040B2 (en) 2007-01-31 2015-08-11 Vulcan Ip Holdings, Inc System and method for publishing advertising on distributed media delivery systems
US20090048908A1 (en) * 2007-01-31 2009-02-19 Vulcan Portals, Inc. Media delivery system
US8504738B2 (en) 2007-06-28 2013-08-06 Apple Inc. Media management and routing within an electronic device
US9712658B2 (en) 2007-06-28 2017-07-18 Apple Inc. Enhancements to data-driven media management within an electronic device
US10523805B2 (en) 2007-06-28 2019-12-31 Apple, Inc. Enhancements to data-driven media management within an electronic device
US20090005891A1 (en) * 2007-06-28 2009-01-01 Apple, Inc. Data-driven media management within an electronic device
US10430152B2 (en) 2007-06-28 2019-10-01 Apple Inc. Data-driven media management within an electronic device
US20110213901A1 (en) * 2007-06-28 2011-09-01 Apple Inc. Enhancements to data-driven media management within an electronic device
US8041438B2 (en) 2007-06-28 2011-10-18 Apple Inc. Data-driven media management within an electronic device
US8095694B2 (en) 2007-06-28 2012-01-10 Apple Inc. Enhancements to data-driven media management within an electronic device
US8111837B2 (en) 2007-06-28 2012-02-07 Apple Inc. Data-driven media management within an electronic device
US8140714B2 (en) 2007-06-28 2012-03-20 Apple Inc. Media management and routing within an electronic device
US8171177B2 (en) * 2007-06-28 2012-05-01 Apple Inc. Enhancements to data-driven media management within an electronic device
US20090005892A1 (en) * 2007-06-28 2009-01-01 Guetta Anthony J Data-driven media management within an electronic device
US8635377B2 (en) 2007-06-28 2014-01-21 Apple Inc. Enhancements to data-driven media management within an electronic device
US8694140B2 (en) 2007-06-28 2014-04-08 Apple Inc. Data-driven media management within an electronic device
US8694141B2 (en) 2007-06-28 2014-04-08 Apple Inc. Data-driven media management within an electronic device
US20090187967A1 (en) * 2007-06-28 2009-07-23 Andrew Rostaing Enhancements to data-driven media management within an electronic device
US9411495B2 (en) 2007-06-28 2016-08-09 Apple Inc. Enhancements to data-driven media management within an electronic device
US8943225B2 (en) 2007-06-28 2015-01-27 Apple Inc. Enhancements to data driven media management within an electronic device
US9659090B2 (en) 2007-06-28 2017-05-23 Apple Inc. Data-driven media management within an electronic device
US20090049005A1 (en) * 2007-08-17 2009-02-19 Graywolf Sensing Solutions Method and system for collecting and analyzing environmental data
US7788294B2 (en) * 2007-08-17 2010-08-31 Graywolf Sensing Solutions, Llc Method and system for collecting and analyzing environmental data
US8700451B2 (en) 2008-10-29 2014-04-15 Vulcan Ip Holdings Inc. Systems and methods for tracking consumers
US20100106597A1 (en) * 2008-10-29 2010-04-29 Vulcan Portals, Inc. Systems and methods for tracking consumers
US20110106990A1 (en) * 2009-10-30 2011-05-05 International Business Machines Corporation Efficient handling of queued-direct i/o requests and completions
US8934645B2 (en) 2010-01-26 2015-01-13 Apple Inc. Interaction of sound, silent and mute modes in an electronic device
US9792083B2 (en) 2010-01-26 2017-10-17 Apple Inc. Interaction of sound, silent and mute modes in an electronic device
US10387109B2 (en) 2010-01-26 2019-08-20 Apple Inc. Interaction of sound, silent and mute modes in an electronic device
US8989884B2 (en) 2011-01-11 2015-03-24 Apple Inc. Automatic audio configuration based on an audio output device
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics

Also Published As

Publication number Publication date
US20080189168A1 (en) 2008-08-07
US9171317B2 (en) 2015-10-27
US9105040B2 (en) 2015-08-11
WO2008095028A1 (en) 2008-08-07
US20080183560A1 (en) 2008-07-31
WO2008095130A1 (en) 2008-08-07

Similar Documents

Publication Publication Date Title
US20080183575A1 (en) Back-channel media delivery system
US20170109781A1 (en) Media delivery system
US10346860B2 (en) Audience attendance monitoring through facial recognition
JP7207836B2 (en) A system for evaluating audience engagement
US8700451B2 (en) Systems and methods for tracking consumers
KR101652030B1 (en) Using viewing signals in targeted video advertising
JP6054448B2 (en) Targeted video advertising
CN102244807B (en) Adaptive video zoom
JP5230440B2 (en) Selective advertising display for multimedia content
KR101094119B1 (en) Method and system for managing an interactive video display system
US8752115B2 (en) System and method for aggregating commercial navigation information
US11533536B2 (en) Audience attendance monitoring through facial recognition
US20080244635A1 (en) Method to encourage digital video recording users to view advertisements by providing compensation offers
JP5559360B2 (en) Data highlighting and extraction
US20160063318A1 (en) Systems, methods, and devices for tracking attention of viewers of a display device
KR20090001680A (en) Centralized advertising system and method thereof
TW201424352A (en) Marketing method and computer system thereof for cloud system
US20080243604A1 (en) Method to dispose of compensation offers on a digital video recorder
TWI518641B (en) Advertisement broadcasting method and its apparatus
JP2010108259A (en) Multimedia document transmission and feedback system
TW201011666A (en) Multimedia information transmission and feedback system integrated with effectiveness evaluation
JP2023519608A (en) Systems and methods for collecting data from user devices
TWI396097B (en) Data feed back system and method

Legal Events

Date Code Title Description
AS Assignment

Owner name: VULCAN IP HOLDINGS INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KAPLAN, ROBERT E.;GRAHAM, STUART;TANUMIHARDJA, MARS;REEL/FRAME:019661/0291;SIGNING DATES FROM 20070726 TO 20070730

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION