WO2007036890A2 - Improving living lights with color coherency - Google Patents

Improving living lights with color coherency

Info

Publication number
WO2007036890A2
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
content
color
generating
lighting script
Prior art date
Application number
PCT/IB2006/053524
Other languages
French (fr)
Other versions
WO2007036890A3 (en)
Inventor
Mauro Barbieri
Srinivas Gutta
Original Assignee
Koninklijke Philips Electronics N.V.
U.S. Philips Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V., U.S. Philips Corporation
Publication of WO2007036890A2
Publication of WO2007036890A3


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/73 Colour balance circuits, e.g. white balance circuits or colour temperature control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/155 Coordinated control of two or more light sources

Abstract

A display device includes a screen configured to display content. At least one light source provides ambient light under the control of a processor. The processor is configured to generate a lighting script from coherent pixels of at least one portion of the content including discarding incoherent pixels of the at least one portion, and to control the light source(s) in accordance with the lighting script.

Description

IMPROVING LIVING LIGHTS WITH COLOR COHERENCY
The present invention relates to systems, devices, and methods that generate a lighting script from content displayed or to be displayed on a screen, where the lighting script is configured to drive light sources for providing light output related to or associated with the content.
Display devices that enhance the television viewing experience through lighting applications have been introduced. A particularly interesting application is to project light on a wall from light sources surrounding a display, as in the so-called AmbiLight television (TV) from Philips Electronics depicted in FIG 1, which shows a display device 100 such as a flat TV hung on a wall. Controllable light sources 110, such as LED bars, are provided around the TV, surrounding a display area or screen 120 where images or content are displayed. The light sources 110 emit surround light 130, for example, to wash the back wall (on which the TV is hung) with what is referred to as AmbiLight. The AmbiLight concept aims at enlarging the perceived image size through lighting effects or ambient light 130 surrounding the displayed image. The color of the ambient light 130 emitted from the light units or sources 110 is dynamically changed, for example, by sampling the video content that is currently being watched by the user.
The ambient light feature, or AmbiLight, primarily involves automatic triggering of light effects based on the content that is currently being consumed by the user, where consumption refers to users or viewers watching movies, listening to music, browsing photos, etc. Furthermore, the content may be live or stored locally, for example, on a digital versatile disc (DVD) or the hard drive of a personal video recorder (PVR). Alternatively or in addition to the ambient light 130, a TV may be provided with or coupled to a device with a surround light feature having light speakers, e.g., tower speakers with on-board light units, which may be integrated with the TV and/or a home entertainment system or provided as stand-alone devices. The light speakers are triggered based on the audio and/or visual content that is being consumed by the user, thereby creating an immersive experience for the user. The light effects may be realized by real-time analysis of the content stored or displayed through a unit such as a PVR or DVD player, a TV integrated with a processing platform, a home theatre system, etc. Instead of real-time content analysis, the light effects may be stored as a script, e.g., on a DVD as a separate track, or may be downloaded from the Internet, and the script executed when the content is rendered, thus generating light effects associated with or related to the content. Of course, the script may be generated in real-time during rendering, e.g., displaying, the content on the TV or any display device.
Ambient light 130 is automatically triggered when the content is rendered, e.g., when video content is displayed on the display area 120, by analyzing the color information in the content. The color of the ambient light 130 depends on the color information inherent in the content, particularly, the color of the content around the boundary of the scene or edges of the display area 120 of the TV 100.
The intensities and colors of the ambient light 130 depend on the video content displayed on the display area 120. For example, if the video content near one side of the TV 100 has a green color, then green light is provided from this side to backwash the back wall near this light with green light. Similarly, if the video content near another side of the TV 100 has a yellow color, then yellow light is provided from this other side to backwash the corresponding wall portion with yellow light, for example. Thus, the ambient light 130 with various attributes, such as various colors and intensities, is provided from the four sides of the TV 100 to illuminate the surrounding environment, such as the back wall, with a desired AmbiLight or light scenario. The lighting scenario is either provided with the video content via a script accompanying the content, for example, or derived from the video content through content analysis, such as in real-time as the video is being rendered. Systems with active or passive ambient light features are described in International Publication No. WO 2006/003603 A1, entitled "Passive Diffuser Frame System for Ambient Lighting Using a Video Display Unit as a Light Source", filed on June 27, 2005, published on January 12, 2006, and in International Publication No. WO 2006/003604 A1, entitled "Active Frame System for Ambient Lighting Using a Video Display as a Signal Source", filed on June 27, 2005, published on January 12, 2006, both of which are assigned to the assignee hereof, and the contents of both of which are incorporated herein by reference in their entirety. These International Publications also describe various active diffuser systems where light output from the displayed content or the light source is modified or controlled in various ways to provide ambient light that follows or relates to the displayed content, such as via using various combinations of sensors, electromagnetic couplers, modulators and light source(s).
The light source(s) 110 that provide ambient light 130 may be controlled by a processor in response to the content information displayed on the display image area 120, such as via RF signals received by the display device from a broadcast, or video signals derived from the RF signals using modulators or other circuits needed for such controlled operation, as described in the above-referenced International Publication Nos. WO 2006/003603 and WO 2006/003604, as well as in International Publication No. WO 2005/062608, entitled "Supplementary Visual Display System", filed on December 20, 2004, published on July 7, 2005, and assigned to the assignee hereof, the content of which is incorporated herein by reference in its entirety. Accordingly, the ambient light 130 will be related to, and/or derived from, the content displayed on the display image area 120 under the control of a processor, for example.
The human perception of color is a complex process. When dealing with visual data and color representations, several simplifying assumptions are typically made. For example, color features are treated at the pixel level. This assumes that the perception of a color is not influenced by the surrounding colors. In computing the color associated with a particular image or region in an image, all pixels are considered and contribute to the final result regardless of the fact that the value of some of the pixels might strongly depend on noise inherently present in the image. Furthermore, a "segment" of N pixels of the same color is considered to have the same influence as if the N pixels were scattered throughout the whole image. This clearly violates the human perception of colors in a complex image. Accordingly, there is a need for better content analysis or script generation used for controlling light sources to provide light having attributes, such as color and intensity, that more closely match the displayed content.
One object of the present systems is to overcome the disadvantage of conventional ambient light displays.
This and other objects are achieved by systems and devices comprising a screen configured to display content. At least one light source provides ambient light under the control of a processor. The processor is configured to generate a lighting script from coherent pixels of at least one portion of the content including discarding incoherent pixels of the at least one portion, and to control the at least one light source in accordance with the lighting script.
Further areas of applicability of the present systems will become apparent from the detailed description provided hereinafter. It should be understood that the detailed description and specific examples, while indicating exemplary embodiments of the present systems and devices, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
These and other features, aspects, and advantages of the apparatus and systems of the present invention will become better understood from the following description, appended claims, and accompanying drawings, where:
FIG 1 shows a conventional display device;
FIG 2 shows a system according to one embodiment;
FIG 3 shows another system according to one embodiment; and
FIG 4 shows a display device according to one embodiment.
The following description of certain exemplary embodiments is merely exemplary in nature and is in no way intended to limit the invention, its application, or uses. In the following detailed description of embodiments of the present systems and devices, reference is made to the accompanying drawings, which form a part hereof and in which are shown, by way of illustration, specific embodiments in which the described devices may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the presently disclosed systems, and it is to be understood that other embodiments may be utilized and that structural and logical changes may be made without departing from the spirit and scope of the present system.
The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present system is defined only by the appended claims. The leading digit of the reference numbers in the figures herein typically corresponds to the figure number, with the exception that identical components which appear in multiple figures are identified by the same reference numbers. Moreover, for the purpose of clarity, detailed descriptions of well-known devices, circuits, and methods are omitted so as not to obscure the description of the present system.
FIG 2 shows a device 200 in accordance with an embodiment of the present system. The device 200 includes a processor 210 operationally coupled, via any type of link such as a wired or wireless link, to a content source 220, at least one display 230, and at least one controllable light source 240. The content 250 from the content source 220 may be video and/or audio content which may be live or recorded, e.g., broadcast by a broadcaster or received from the Internet. The content source 220 and a further memory 260 may also include any suitable type of memory where the content 250 and other data are stored or recorded, including a remote memory accessible through a network, local or wide area network such as the Internet, or a local memory of the device 200, such as a hard drive, or a removable memory including a DVD, for example.
The memory storing the content 250 or the further memory 260 may also store application data as well as other desired data accessible by the processor 210 for configuring it to perform operation acts in accordance with the present system. The operation acts include controlling the display(s) 230 to display the content 250 (upon processing thereof, such as decoding and generating video signals for display on the display 230), generating a lighting script, and executing the generated lighting script to control the light source(s) 240 to provide ambient light 130 synchronous with and related to the content 250 displayed on the display 230 in the display area 120 shown in FIG 1, for example.
Clearly, the processor 210, the memory of the content source 220, the further memory 260, the display 230 and/or the light source(s) 240 may all or partly be a portion of a single (fully or partially) integrated unit such as a TV, computer system, personal digital assistant (PDA), mobile phone or any consumer electronics device such as a set-top box, a digital video recorder (DVR), juke-box, media centre, e-hub, home theatre in a box (HTiB), etc. For example, instead of being integrated in a single device, the processor may be distributed, where the script may be generated by a stand-alone device connected to the display/TV 230 or a set-top box and the like.
The methods of the present system are particularly suited to be carried out by a computer software program, such program containing modules corresponding to one or more of the individual steps or acts described and/or envisioned by the present system. Such program may of course be embodied in a computer-readable medium, such as an integrated chip, a peripheral device or memory, such as the memory 260 or other memory coupled to or accessible by the processor 210. The computer-readable medium and/or memory 260 and/or the memory associated with the content source 220 for storing the content 250 may be any recordable medium (e.g., RAM, ROM, removable memory, CD-ROM, hard drives, DVD, floppy disks or memory cards) or may be a transmission medium (e.g., a network comprising fiber-optics, the world-wide web, cables, or a wireless channel using time-division multiple access, code-division multiple access, or other radio-frequency channel). Any medium known or developed that can store and/or transmit information suitable for use with a computer system may be used as the computer-readable medium and/or memory.
Additional memories may also be used. The computer-readable medium storing the content 250, the memory 260, and/or any other memories may be long-term, short-term, or a combination of long-term and short-term memories. These memories configure the processor 210 to implement the methods, operational acts, and functions disclosed herein. The memories may be distributed or local, and the processor 210, where additional processors may be provided, may also be distributed or may be singular. The memories may be implemented as electrical, magnetic or optical memory, or any combination of these or other types of storage devices. Moreover, the term "memory" should be construed broadly enough to encompass any information able to be read from or written to an address in the addressable space accessed by a processor. With this definition, information on a network is still within memory 260 or the memory associated with the content source 220 where the content 250 is stored, for instance, because the processor 210 may retrieve the information from the network for operation in accordance with the present system.
The processor 210 is capable of providing control signals to control the light source(s) 240 and/or performing operations in response to executing the script generated by the processor 210 from the content 250, such as from edge pixels of the content, and executing instructions stored in the memory 260. The processor 210 may be an application-specific or general-use integrated circuit(s). Further, the processor 210 may be a dedicated processor for performing in accordance with the present system or may be a general-purpose processor wherein only one of many functions operates for performing in accordance with the present system. The processor 210 may operate utilizing a program portion, multiple program segments, or may be a hardware device, such as a decoder, demodulator, or a renderer such as a TV, DVD player/recorder, PDA, PVR, set-top box, mobile phone, etc., utilizing a dedicated or multi-purpose integrated circuit(s). Any type of processor may be used, such as a dedicated or shared one. The processor may include micro-processors, central processing units (CPUs), digital signal processors (DSPs), ASICs, or any other processor(s) or controller(s) such as digital optical devices, or analog electrical circuits that perform the same functions, and employ electronic techniques and architecture. The processor is typically under software control, for example, and has or communicates with memory that stores the software and other data such as user preferences.
The processor is configured to relate the ambient light to the video information displayed on the relevant display image area, such as the content displayed near the edges of the screen. The processor may also include any intelligent device that may allow controlling, directly or indirectly, the light source(s) 240 so that the character of the light output therefrom changes, such as by changing any light attributes including color, hue, saturation, intensity, or other photometric qualities, e.g., specular reflection properties, retroreflective properties, etc. This may also include controlling an on/off duty cycle for a plurality of light generating devices, controlling modulators, changing the luminous output of an electroluminescent device, or any other modifications which change the ambient light character directly or indirectly as a function of a content information signal (RF, video, audio or the like) and/or as a function of the video image(s) displayed on the display 230.
The light source(s) 240 may be any controllable light source(s) capable of providing light of various attributes, such as various intensity levels, different colors, hue, saturation and the like, including any one of or combination(s) of LEDs, incandescent, fluorescent, halogen, or high intensity discharge (HID) lights, which may have a ballast for control of the various light attributes. However, LEDs are particularly well-suited light sources as they can be easily configured to provide light with changing light attributes (such as changing colors, intensity, hue, saturation and other attributes), and typically have electronic drive circuitry for control and adjustment of the various light attributes. Of course, the LEDs may include individually controllable red, green and blue LEDs that in combination provide any desired color, intensity and the like.
Different types of light sources may be used alone or in combination with each other, such as incandescent, gaseous discharge, fluorescent, phosphorescent, laser, photo-luminescent, electro-luminescent, cathode-luminescent, galvano-luminescent, crystallo-luminescent, kine-luminescent, thermo-luminescent, tribo-luminescent, sono-luminescent and/or radio-luminescent sources, as described in International Publication No. WO 2005/062608 A2. Further, the light sources may be provided on the back boundary of the display and/or at the side edges of the display, e.g., the TV 100 shown in FIG 1.
Although the following description is directed to an embodiment using a TV, it should be understood that any display device may be provided with or connected to the present systems and devices to provide ambient light to the surrounding environment, in any desired direction, whether or not a back wall is included in the environment. Any display type may be used, such as cathode ray tube (CRT), liquid crystal display (LCD), light emitting diode (LED) displays, organic light emitting diode (OLED) displays, plasma display panels (PDPs), field emission displays (FEDs), or any other displays such as projection displays, thin-film printed optically-active polymer displays, or displays using any other technology.
In the present systems and devices, the computation of the lighting script is reduced by discarding pixels that are not "coherent" where coherence of a color in an image includes the degree to which pixels of that color are members of large similarly colored regions. An article by Greg Pass et al. entitled "Comparing Images Using Color Coherence Vectors," which is incorporated herein by reference in its entirety, discusses discarding incoherent pixels for use in content-based image retrieval, including comparing images as well as searching, querying and retrieving similar images from a large database of images, such as a collection of digital photographs.
Lighting scripts computed by considering only coherent pixels are perceptually better and less noisy. The color space of the image is discretized into N distinct colors or color buckets, and each pixel is associated with a particular color bucket. Given an image or any sub-segment of it, each pixel is also classified as either coherent or incoherent, based on whether or not it is part of a large similarly colored region.
The image is slightly blurred by replacing pixel values with the average value in a small local neighborhood, such as the eight adjacent pixels. This eliminates small variations between neighboring pixels. Further, the pixels within a given color bucket are classified as either coherent or incoherent. A coherent pixel is part of a large group of contiguous pixels of the same or substantially similar color, while an incoherent pixel is not. The candidate coherent pixel groups are determined by computing connected components within a given discretized color bucket. A connected component C is a maximal set of pixels such that, for any two pixels p and q in C, there is a path in C between p and q, i.e., a sequence of pixels within the color bucket that begins at p and ends at q, where any two sequential pixels are adjacent to each other. Two pixels are considered to be adjacent if one pixel is among the eight closest neighbors of the other. When the computation of the connected components is complete, each pixel belongs to exactly one connected component. Each pixel is then classified as either coherent or incoherent depending on the size, in pixels, of its connected component: a pixel is coherent if the size of its connected component exceeds a fixed threshold value T; otherwise, the pixel is incoherent. The threshold value T may be any desired value and may be set, for example, to 1% of the total image area 120 (FIG 1).
Incoherent pixels are then discarded from the successive computation of the light script.
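By way of illustration only, the following Python sketch shows one way the above classification could be implemented (NumPy and SciPy are assumed to be available; the function names, bucket edges and all other identifiers are hypothetical and not part of the claimed system):

    import numpy as np
    from scipy import ndimage

    def blur3x3(values):
        # Slight blur: replace each pixel value with the average over a small
        # local neighborhood (here a 3x3 window covering the pixel and its
        # eight adjacent pixels), eliminating small variations between
        # neighboring pixels.
        return ndimage.uniform_filter(values.astype(float), size=3, mode="nearest")

    def classify_coherence(values, bucket_edges, threshold):
        # Discretize the color values into buckets, e.g., edges [10, 20, 30, 40]
        # put values 10-19 in bucket 1, 20-29 in bucket 2 and 30-39 in bucket 3.
        buckets = np.digitize(values, bucket_edges)
        eight_connected = np.ones((3, 3), dtype=int)  # 8-neighbor adjacency
        coherent, incoherent = {}, {}
        for b in np.unique(buckets):
            mask = buckets == b
            # Connected components of same-bucket pixels under 8-connectivity.
            labels, n = ndimage.label(mask, structure=eight_connected)
            sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
            # A pixel is coherent if its component size exceeds the threshold T.
            coherent[int(b)] = int(sum(s for s in sizes if s > threshold))
            incoherent[int(b)] = int(sum(s for s in sizes if s <= threshold))
        return coherent, incoherent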
The following example from the above-noted Pass article demonstrates the classification of pixels as coherent and the discarding of incoherent pixels. Let the threshold value T equal 4 (T=4) and assume there are three color buckets R, G, and B. Assume the following matrix includes the resulting color values of the pixels:
22 10 21 22 15 16
24 21 13 20 14 17
23 17 38 23 17 16
25 25 22 14 15 21
27 22 12 11 21 20
24 21 10 12 22 23
The following matrix is obtained upon discretizing the color space and assigning the pixels to individual color buckets 1, 2, 3 associated with the colors R, G, B, respectively, where pixel color values from 10-19 are classified as red 'R' and assigned to bucket 1, pixel color values from 20-29 are classified as green 'G' and assigned to bucket 2, and pixel color values from 30-39 are classified as blue 'B' and assigned to bucket 3.
2 1 2 2 1 1
2 2 1 2 1 1
2 1 3 2 1 1
2 2 2 1 1 2
2 2 1 1 2 2
2 2 1 1 2 2
Next, the connected components are computed, where contiguous pixels of the same color are grouped together and the groups are labeled with letters such as A, B, C, etc. The image matrix now becomes:
B C B B A A
B B C B A A
B C D B A A
B B B A A E
B B A A E E
B B A A E E
Table 1 shows the size, i.e., the number of pixels, of each connected component where, for example, the connected component labeled 'A' in the above matrix contains twelve pixels and belongs to color bucket 1, e.g., red.

Label           A    B    C    D    E
Color bucket    1    2    1    3    2
Size           12   15    3    1    5

TABLE 1
Since the threshold value T was assumed to be 4, the pixels in groups C and D are classified as incoherent since their sizes (3 and 1, respectively) do not exceed 4. As the sizes of groups A, B and E (12, 15 and 5, respectively) are greater than 4, the pixels in these groups are classified as coherent.
From Table 1, Table 2 may be derived, showing the three color buckets 1, 2, 3 and the associated coherent pixels α as well as the incoherent pixels β where, for example as seen from Table 1, color bucket 2 includes 15+5=20 coherent pixels α (from groups B and E), and color bucket 1 includes 12 coherent pixels α (from group A) and 3 incoherent pixels β (from group C).

Color bucket     1    2    3
Coherent α      12   20    0
Incoherent β     3    0    1

TABLE 2
As noted, the incoherent pixels β are discarded, resulting in the three color buckets containing only coherent pixels α, namely 12, 20 and 0 coherent pixels in color buckets 1, 2 and 3, respectively, as seen from Table 2.
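For instance, running the hypothetical sketch given earlier on this example matrix (without the blurring step, which the worked example omits) reproduces these counts:

    img = np.array([
        [22, 10, 21, 22, 15, 16],
        [24, 21, 13, 20, 14, 17],
        [23, 17, 38, 23, 17, 16],
        [25, 25, 22, 14, 15, 21],
        [27, 22, 12, 11, 21, 20],
        [24, 21, 10, 12, 22, 23],
    ])
    alpha, beta = classify_coherence(img, bucket_edges=[10, 20, 30, 40], threshold=4)
    print(alpha)  # {1: 12, 2: 20, 3: 0}  coherent pixels per color bucket
    print(beta)   # {1: 3, 2: 0, 3: 1}   incoherent pixels, to be discarded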
Once the coherency or incoherency of the pixels has been ascertained and the incoherent pixels β discarded, ambient light is generated based on the coherent pixels of the content or portions of the content, particularly, edges or ends of the content displayed near the border or frame of the display.
FIG 3 shows an ambient lighting system 300 with light speakers positioned throughout a room 310, namely, eleven independently controllable LED lighting speakers or units. Each of the eleven light sources or units takes one RGB coordinate from the corresponding color space. The eleven light units include four light speakers 322, 324, 326, 328 at the corners of the room 310, one unit under a couch 330 referred to as a sub-light speaker 340, and six independently controlled LED units of the center-light speaker 350, which are situated behind or at the four edges of the flat TV 360. Of course, only the center-light speaker 350 may be used, which may be a stand-alone unit or integrated with and part of the TV 360, as shown in greater detail in FIG 4. FIG 3 also shows other furniture in the room 310, such as three chairs 370 and a table 380.
FIG 4 shows the TV 360 surrounded in the back and/or at the sides by six independently controlled LED units L1 to L6. Each light unit L1 to L6 is triggered by extracting the average color information from a corresponding non-overlapping border or content region R1 to R6 at the edges of the TV 360 near the light units L1 to L6. Illustratively, each region R1 to R6 of the content displayed on the TV 360 has the same width W of 100 pixels, although differing widths may also be used. As an example, if the size of the frame were 720x576 pixels, then the size of regions R1, R2, R4 and R5 would be 360x100 pixels. Similarly, the size of regions R3 and R6 would be 100x376 pixels.
Ambient light 130 is produced by first acquiring a video signal and decoding the video signal into a set of frames that are displayed on the TV 360. Color information is extracted from the content (frames) currently being displayed, where the displayed content around the boundary or edges of the TV 360, such as regions R1 to R6, is used for color extraction, for example. The color information of the content is transformed or mapped from the RGB space onto the color space of the LEDs and the displayed color space. In addition, the transformed color information of the content, e.g., the RGB coordinates, is transmitted to the LED units or light sources L1 to L6 so as to trigger them to provide ambient light 130 having light attributes that are associated with the content displayed at the respective regions R1 to R6.
For the illustrative example where the size of the frame is 720x576 pixels, since the video signal is decoded into a set of frames (e.g., 25 frames per second) in the RGB color space, the resulting image size would be 720x576x3, which is a 3D matrix where each 2D matrix of size 720x576 corresponds to one of the Red, Green and Blue channels.
From each such border region R1 to R6, all the pixels in each region are averaged to get an average color for that border region. In one embodiment, the average color information for each region is extracted, for each channel, by summing up all the pixels in that region and dividing by the total number of pixels in that region. The equation for the extraction of the average color information for each region for one channel is shown below:
K_red = (1 / (n * m)) * sum_{i=1..n} sum_{j=1..m} M_red(i, j)

If the region under consideration is R1, then M_red is of size 360x100, with n equal to 360 and m equal to 100. The above equation gives us the average of all the pixels for the red channel. Thus, the average color for a particular region is a triplet:

K_AVE = (K_red, K_green, K_blue)
The same procedure is repeated for all the regions and for all the channels within each region.
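A minimal sketch of this per-region, per-channel averaging, in the same illustrative Python style (the height x width x 3 frame layout and the region coordinates are assumptions for the example):

    def region_average(frame, top, left, height, width):
        # frame is an H x W x 3 array holding the R, G and B channels of one
        # decoded frame; each channel of the region is averaged separately.
        region = frame[top:top + height, left:left + width, :].astype(float)
        return region.reshape(-1, 3).mean(axis=0)  # (K_red, K_green, K_blue)

    # Example: region R1 of a 720x576 frame (stored as 576 rows x 720 columns),
    # 360 pixels wide and 100 pixels high along the top edge.
    frame = np.zeros((576, 720, 3))
    k_ave = region_average(frame, top=0, left=0, height=100, width=360)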
Next, in order to set the lights, a mapping transformation is performed between the TV and the light units. This is achieved via a standard set of equations that take as input the measured color primaries from each LED unit. The color primaries for the red, green, blue and the reference white color components are acquired by using a color spectrometer. Once the primaries are obtained, the transformation process is as follows:
(a) Given a set of chromaticity coordinates (red, green and blue primaries) and the reference white, a transformation matrix is computed for mapping the average color information onto the XYZ color gamut space for both the flat TV 360 as well as the LED units L1-L6 surrounding the TV 360 and/or other light speakers in the room 310 shown in FIG 3, for example. This gives us two sets of equations:
[X; Y; Z] = M1 * [R; G; B] for the flat TV
[X; Y; Z] = M2 * [R'; G'; B'] for the LEDs
(b) The mapped RGB values for the light units may be found by solving the following:
[R'; G'; B'] = M2^-1 * M1 * [R; G; B]
In step (a) above, [R; G; B] corresponds to the triplet which is the computed average color information for a particular region for all channels. The general method for computing the matrix M is shown below:
Given the chromaticity coordinates of an RGB system (x_r, y_r), (x_g, y_g) and (x_b, y_b) and the white point (x_w, y_w), the method to compute the 3x3 matrix for converting RGB to XYZ is as follows:

[X Y Z] = [R G B][M]

where

[M] = | S_r*X_r  S_r*Y_r  S_r*Z_r |
      | S_g*X_g  S_g*Y_g  S_g*Z_g |
      | S_b*X_b  S_b*Y_b  S_b*Z_b |

with

X_r = x_r    Y_r = y_r    Z_r = 1 - (x_r + y_r)
X_g = x_g    Y_g = y_g    Z_g = 1 - (x_g + y_g)
X_b = x_b    Y_b = y_b    Z_b = 1 - (x_b + y_b)
X_w = x_w    Y_w = y_w    Z_w = 1 - (x_w + y_w)

and the scaling factors (S_r, S_g, S_b) obtained by solving

[X_w/Y_w  1  Z_w/Y_w] = [S_r  S_g  S_b] * | X_r  Y_r  Z_r |
                                          | X_g  Y_g  Z_g |
                                          | X_b  Y_b  Z_b |
The above method is used for obtaining M1 and M2, and [R'; G'; B'] is obtained by following step (b) above. Thus, [R'; G'; B'] is the transformed color information for a particular region. The same process is repeated for obtaining [R'; G'; B'] for each of the six regions R1-R6. The transformed color information is then sent to the light units so that they may be triggered. It should be noted that the whole process is repeated for all 25 frames in each second. It should further be noted that the width W of each region R1-R6 may be varied and the number of regions in the frame may also be varied. As an example, instead of using six regions R1-R6, one may use only four regions. In such a case, the transformed color information for the upper region (e.g., regions R1 and R2 combined into a single upper region) may be sent to both the LED panels L1, L2 located at the top of the TV 360. Of course, as would be apparent to one skilled in the art from reading this disclosure, the described procedure may be realized in software or via a programmable hardware platform such as an EPLD, etc.
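As a sketch only, steps (a) and (b) might be implemented as follows in the same illustrative Python style, with the chromaticity coordinates taken from the spectrometer measurements described above (no measured values are given in the text, so the arguments here are placeholders):

    def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
        # Rows are (X_i, Y_i, Z_i) = (x_i, y_i, 1 - (x_i + y_i)) for the red,
        # green and blue primaries, as in the method above.
        rows = np.array([[x, y, 1.0 - (x + y)] for (x, y) in (xy_r, xy_g, xy_b)])
        xw, yw = xy_w
        white = np.array([xw / yw, 1.0, (1.0 - (xw + yw)) / yw])
        # Scaling factors (S_r, S_g, S_b) chosen so that RGB = (1, 1, 1) maps
        # onto the reference white.
        s = white @ np.linalg.inv(rows)
        return rows * s[:, None]  # [M], used as [X Y Z] = [R G B][M]

    def map_tv_to_led(rgb_triplet, m_tv, m_led):
        # Step (b) in row-vector form: [R' G' B'] = [R G B] * M1 * inv(M2).
        return np.asarray(rgb_triplet, dtype=float) @ m_tv @ np.linalg.inv(m_led)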
Of course, it is to be appreciated that any one of the above embodiments or processes may be combined with one or more other embodiments or processes to provide even further improvements in providing ambient light associated with the watched or displayed content, for example.
Finally, the above-discussion is intended to be merely illustrative of the present system and should not be construed as limiting the appended claims to any particular embodiment or group of embodiments. Thus, while the present system has been described in particular detail with reference to specific exemplary embodiments thereof, it should also be appreciated that numerous modifications and alternative embodiments may be devised by those having ordinary skill in the art without departing from the broader and intended spirit and scope of the present system as set forth in the claims that follow. The specification and drawings are accordingly to be regarded in an illustrative manner and are not intended to limit the scope of the appended claims.
In interpreting the appended claims, it should be understood that: a) the word "comprising" does not exclude the presence of other elements or acts than those listed in a given claim; b) the word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements; c) any reference signs in the claims do not limit their scope; d) several "means" may be represented by the same or different item(s) or hardware or software implemented structure or function; e) any of the disclosed elements may be comprised of hardware portions (e.g., including discrete and integrated electronic circuitry), software portions (e.g., computer programming), and any combination thereof; f) hardware portions may be comprised of one or both of analog and digital portions; g) any of the disclosed devices or portions thereof may be combined together or separated into further portions unless specifically stated otherwise; and h) no specific sequence of acts or steps is intended to be required unless specifically indicated.

Claims

CLAIMS:
1. A display device comprising: a screen configured to display content; at least one light source configured to provide ambient light; a processor configured to: generate a lighting script from coherent pixels of at least one portion of the content including discarding incoherent pixels of the at least one portion; and control the at least one light source in accordance with the lighting script.
2. The display device of claim 1, wherein generating the lighting script further includes blurring image pixels of the at least one portion of the content.
3. The display device of claim 1, wherein generating the lighting script further includes replacing image pixel values of the at least one portion of the content with an average pixel value of the image pixel values of the at least one portion.
4. The display device of claim 1, wherein generating the lighting script further includes replacing an image pixel value of a pixel of the at least one portion of the content with an average pixel value, the average pixel value of the pixel being determined by averaging pixel values of pixels adjacent to the pixel.
5. The display device of claim 1, wherein the coherent pixels are part of a group of contiguous pixels having a substantially similar color.
6. The display device of claim 1, wherein generating the lighting script further includes associating at least one of the coherent pixels and incoherent pixels with color buckets.
7. The display device of claim 1, wherein the at least one portion of the content includes non-overlapping portions near edges of the screen.
8. The display device of claim 1, wherein the at least one portion of the content includes portions near edges of the screen, and generating the lighting script further includes averaging pixels of the at least one portion to determine an average color of the at least one portion.
9. A system comprising: a screen configured to display content; at least one light source configured to provide light related to the content; a processor configured to: generate a lighting script from coherent pixels of at least one portion of the content including discarding incoherent pixels of the at least one portion; and control the at least one light source in accordance with the lighting script.
10. The system of claim 9, wherein the at least one light source includes at least one of light speakers and a lighting source attached to the screen.
11. The system of claim 9, wherein generating the lighting script further includes blurring image pixels of the at least one portion of the content.
12. The system of claim 9, wherein generating the lighting script further includes replacing image pixel values of the at least one portion of the content with an average pixel value of the image pixel values of the at least one portion.
13. The system of claim 9, wherein generating the lighting script further includes replacing an image pixel value of a pixel of the at least one portion of the content with an average pixel value, the average pixel value of the pixel being determined by averaging pixel values of pixels adjacent to the pixel.
14. The system of claim 9, wherein the coherent pixels are part of a group of contiguous pixels having a substantially similar color.
15. The system of claim 9, wherein generating the lighting script further includes associating at least one of the coherent pixels and incoherent pixels with color buckets.
16. The system of claim 9, wherein the at least one portion of the content includes non-overlapping portions near edges of the screen.
17. The system of claim 9, wherein the at least one portion of the content includes portions near edges of the screen, and generating the lighting script further includes averaging pixels of the at least one portion to determine an average color of the at least one portion.
18. A method for controlling at least one light source comprising the acts of: displaying content on a screen; generating a lighting script from coherent pixels of the content including discarding incoherent pixels of the content; and controlling the at least one light source in accordance with the lighting script.
19. The method of claim 18, wherein the generating act includes averaging the color of the coherent pixels in at least one portion of the content.
20. A computer program product stored on a computer readable medium, the computer program, when executed by a processor, being configured to perform the method as claimed in claim 19.
PCT/IB2006/053524 2005-09-30 2006-09-27 Improving living lights with color coherency WO2007036890A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US72290305P 2005-09-30 2005-09-30
US60/722,903 2005-09-30
US82611706P 2006-09-19 2006-09-19
US60/826,117 2006-09-19

Publications (2)

Publication Number Publication Date
WO2007036890A2 true WO2007036890A2 (en) 2007-04-05
WO2007036890A3 WO2007036890A3 (en) 2007-07-05

Family

ID=37772651

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2006/053524 WO2007036890A2 (en) 2005-09-30 2006-09-27 Improving living lights with color coherency

Country Status (1)

Country Link
WO (1) WO2007036890A2 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008072152A1 (en) * 2006-12-11 2008-06-19 Koninklijke Philips Electronics N.V. Visual display system with varying illumination
WO2010028706A1 (en) * 2008-09-15 2010-03-18 Deutsche Telekom Ag Interactive advertising display
WO2011073811A1 (en) * 2009-12-15 2011-06-23 Koninklijke Philips Electronics N.V. Dynamic ambience lighting system
DE102017119125A1 (en) * 2017-08-22 2019-02-28 Roccat GmbH Apparatus and method for generating moving light effects
WO2019233800A1 (en) 2018-06-08 2019-12-12 Signify Holding B.V. Adjusting parameters of light effects specified in a light script
CN112954854A (en) * 2021-03-09 2021-06-11 生迪智慧科技有限公司 Control method, device and equipment for ambient light and ambient light system
DE102007008164B4 (en) 2007-02-19 2022-01-27 Airbus Operations Gmbh Lighting adaptation to an image display on vehicle interior surfaces

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1225546A2 (en) * 2001-01-18 2002-07-24 Lg Electronics Inc. Method for setting dominant color using spatial coherency
US6611297B1 (en) * 1998-04-13 2003-08-26 Matsushita Electric Industrial Co., Ltd. Illumination control method and illumination device
WO2004006570A1 (en) * 2002-07-04 2004-01-15 Koninklijke Philips Electronics N.V. Method of and system for controlling an ambient light and lighting unit
WO2005069640A1 (en) * 2004-01-06 2005-07-28 Koninklijke Philips Electronics, N.V. Ambient light script command encoding
WO2005069637A1 (en) * 2004-01-05 2005-07-28 Koninklijke Philips Electronics, N.V. Ambient light derived form video content by mapping transformations through unrendered color space

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6611297B1 (en) * 1998-04-13 2003-08-26 Matsushita Electric Industrial Co., Ltd. Illumination control method and illumination device
EP1225546A2 (en) * 2001-01-18 2002-07-24 Lg Electronics Inc. Method for setting dominant color using spatial coherency
WO2004006570A1 (en) * 2002-07-04 2004-01-15 Koninklijke Philips Electronics N.V. Method of and system for controlling an ambient light and lighting unit
WO2005069637A1 (en) * 2004-01-05 2005-07-28 Koninklijke Philips Electronics, N.V. Ambient light derived form video content by mapping transformations through unrendered color space
WO2005069640A1 (en) * 2004-01-06 2005-07-28 Koninklijke Philips Electronics, N.V. Ambient light script command encoding

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PASS G ET AL: "Comparing Images Using Color Coherence Vectors", Proceedings of ACM Multimedia 96, Boston, Nov. 18-22, 1996, New York, ACM, US, 18 November 1996 (1996-11-18), pages 65-73, XP000734710, ISBN: 0-89791-871-1, cited in the application *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008072152A1 (en) * 2006-12-11 2008-06-19 Koninklijke Philips Electronics N.V. Visual display system with varying illumination
US8174488B2 (en) 2006-12-11 2012-05-08 Koninklijke Philips Electronics N.V. Visual display system with varying illumination
DE102007008164B4 (en) 2007-02-19 2022-01-27 Airbus Operations Gmbh Lighting adaptation to an image display on vehicle interior surfaces
WO2010028706A1 (en) * 2008-09-15 2010-03-18 Deutsche Telekom Ag Interactive advertising display
WO2011073811A1 (en) * 2009-12-15 2011-06-23 Koninklijke Philips Electronics N.V. Dynamic ambience lighting system
DE102017119125A1 (en) * 2017-08-22 2019-02-28 Roccat GmbH Apparatus and method for generating moving light effects
US11340711B2 (en) 2017-08-22 2022-05-24 Voyetra Turtle Beach, Inc. Device and method for generating moving light effects, and salesroom having such a system
WO2019233800A1 (en) 2018-06-08 2019-12-12 Signify Holding B.V. Adjusting parameters of light effects specified in a light script
CN112954854A (en) * 2021-03-09 2021-06-11 生迪智慧科技有限公司 Control method, device and equipment for ambient light and ambient light system
CN112954854B (en) * 2021-03-09 2023-04-07 生迪智慧科技有限公司 Control method, device and equipment for ambient light and ambient light system

Also Published As

Publication number Publication date
WO2007036890A3 (en) 2007-07-05

Similar Documents

Publication Publication Date Title
RU2352081C2 (en) Selection of dominating colour with application of perception laws for creation of surrounding lighting obtained from video content
JP4399087B2 (en) LIGHTING SYSTEM, VIDEO DISPLAY DEVICE, AND LIGHTING CONTROL METHOD
US8233033B2 (en) Supplementary visual display system
US20100265414A1 (en) Combined video and audio based ambient lighting control
KR101170408B1 (en) Dominant color extraction for ambient light derived from video content mapped through unrendered color space
EP2080418B1 (en) Method for color transition for ambient or general illumination system
US20100177247A1 (en) Ambient lighting
US8576325B2 (en) Generating still images and video by capture of images projected by light passing through a display screen
US20120242251A1 (en) Ambience lighting system using global content characteristics
WO2007036890A2 (en) Improving living lights with color coherency
US20060062424A1 (en) Method of and system for controlling an ambient light and lighting unit
JP2008505384A (en) Ambient light generation from broadcasts derived from video content and influenced by perception rules and user preferences
JP2008536165A (en) Color conversion unit to reduce stripes
US20090096917A1 (en) Scanning Projector Ambient Lighting System
KR20150049895A (en) Apparatus for preventing image sticking in display device
KR20110106317A (en) A display system, control unit, method, and computer program product for providing ambient light with 3d sensation
JP2015537248A (en) Using ambient light for copy protection of video content displayed on the screen
WO2007072339A2 (en) Active ambient light module
JP5266559B2 (en) System, method and computer readable medium for displaying light radiation
US20160267698A1 (en) Simplified lighting compositing
JP2013218153A (en) Illumination control device, control system, and illumination control method
JP5166794B2 (en) Viewing environment control device and viewing environment control method
CN117979500A (en) Atmosphere lamp control method, atmosphere lamp control device and computer readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06821160

Country of ref document: EP

Kind code of ref document: A2