WO2017129554A1 - Controlled lighting for three-dimensional trajectories - Google Patents


Info

Publication number
WO2017129554A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
general
dimensional trajectory
sensors
light sources
Prior art date
Application number
PCT/EP2017/051408
Other languages
French (fr)
Inventor
Harry Broers
Ruben Rajagopalan
Original Assignee
Philips Lighting Holding B.V.
Priority date
Filing date
Publication date
Application filed by Philips Lighting Holding B.V. filed Critical Philips Lighting Holding B.V.
Publication of WO2017129554A1 publication Critical patent/WO2017129554A1/en

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • H05B47/155 Coordinated control of two or more light sources
    • H05B47/175 Controlling the light source by remote control
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present invention is directed generally to lighting control. More particularly, various inventive methods and apparatus disclosed herein relate to controlling one or more properties of light emitted to illuminate a three-dimensional trajectory.
  • LEDs light-emitting diodes
  • Functional advantages and benefits of LEDs include high energy conversion and optical efficiency, durability, lower operating costs, and many others.
  • Recent advances in LED technology have provided efficient and robust full-spectrum lighting sources that enable a variety of lighting effects in many applications.
  • Some of the fixtures embodying these sources feature a lighting module, including one or more LEDs capable of producing different colors, e.g., red, green, and blue, as well as a processor for independently controlling the output of the LEDs in order to generate a variety of colors and color-changing lighting effects, for example, as discussed in detail in U.S. Patent Nos. 6,016,038 and 6,211,626, incorporated herein by reference.
  • Lighting infrastructure for a public venue may be designed to illuminate relatively few different types of events.
  • a basketball game may be illuminated in one way that best enables the audience to see the players, the ball, and the baskets over a typically wooden court.
  • a hockey game may be illuminated in a different way than a basketball game because a hockey rink is made of ice.
  • An equestrian show may be illuminated in yet another way that best enables the audience to see the riders on their horses.
  • vertical luminance may also be measured at some distance above and/or away from a target surface, e.g., by placing light meters, mannequins, or sticks at a few desired heights.
  • some types of events may feature activity that does not always occur close to a surface, e.g., with objects that travel through free space along three-dimensional trajectories.
  • In motocross events, motorcycles often perform a number of jumps as part of the race, and these jumps may be a highlight of the audience experience.
  • ski jumpers may be airborne through much of their performance.
  • underwater performances such as those that may be featured in aqua parks (e.g., underwater mermaid shows, synchronized swimming, etc.) may feature performers travelling along three-dimensional trajectories.
  • the present disclosure is directed to inventive methods and apparatus for controlling one or more properties of light emitted to illuminate a three-dimensional trajectory.
  • a general three-dimensional trajectory followed by one or more objects (e.g., motorcycles, ski jumpers, acrobats, etc.) may be identified.
  • one or more light quality attributes along the general three-dimensional trajectory may be determined.
  • a lighting system may be controlled to emit light having one or more properties selected based on the determined one or more light quality attributes along the general three-dimensional trajectory.
  • a method of controlling a plurality of light sources may include: identifying, based on data obtained from one or more sensors, a general three-dimensional trajectory to be followed by one or more objects through free space of an environment illuminated by the plurality of light sources; identifying, based on data obtained from the one or more sensors, one or more light quality characteristics at one or more points along the general three-dimensional trajectory; selecting one or more properties of light to be emitted by one or more of the plurality of light sources based on the identified one or more light quality characteristics; and operating the one or more of the plurality of light sources to emit light having the one or more selected properties while an object travels along the general three-dimensional trajectory.
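The claimed steps, identify a trajectory, sample light quality along it, and select emission properties for the sources to use, can be sketched in code. This is an illustrative sketch only: the function and field names, the lux target, and the fixed 4000 K white point are assumptions for the example, not part of the claims.

```python
from dataclasses import dataclass
from typing import Callable, Sequence, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class LightCommand:
    intensity: float          # relative output, 0.0 to 1.0
    color_temperature_k: int  # correlated color temperature in Kelvin

def plan_lighting(trajectory: Sequence[Point3D],
                  measure_lux: Callable[[Point3D], float],
                  target_lux: float) -> LightCommand:
    """Sample light quality at points along an identified trajectory and
    select emission properties; a controller would then operate the light
    sources with the returned command while objects travel the corridor."""
    samples = [measure_lux(p) for p in trajectory]
    avg = sum(samples) / len(samples)
    # An under-lit corridor scales intensity up, clamped at full output.
    intensity = min(1.0, target_lux / avg) if avg > 0 else 1.0
    return LightCommand(intensity=intensity, color_temperature_k=4000)
```

For instance, a corridor measured at an average of 400 lux against an 800 lux target would be driven at full output, while a 200 lux target would halve the intensity.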
  • the one or more sensors includes a camera, and identifying the general three-dimensional trajectory includes identifying the general three-dimensional trajectory based on video data obtained by the camera.
  • the video data depicts an object travelling through the environment.
  • the video data depicts a test pattern placed within the general three-dimensional trajectory.
  • the test pattern is affixed to an aerial drone that follows the general three-dimensional trajectory.
  • the method may include equipping an object passing through the three-dimensional trajectory with a radio frequency beacon.
  • a camera may be operated to automatically keep the radio frequency beacon within a field of view of the camera.
  • identifying the general three-dimensional trajectory may include identifying the general three-dimensional trajectory based on sensor data provided by a motion sensor attached to an object travelling through the environment.
  • At least one of the one or more sensors is affixed to an aerial drone.
  • identifying the one or more light quality characteristics comprises traversing the aerial drone along the general three-dimensional trajectory.
  • at least one of the one or more sensors includes one or more light sensors affixed to at least one of the one or more objects as it travels through the general three-dimensional trajectory.
  • identifying the one or more light quality characteristics is based on data collected by the one or more light sensors affixed to the at least one of the one or more objects.
  • the term “LED” should be understood to include any electroluminescent diode or other type of carrier injection/junction-based system that is capable of generating radiation in response to an electric signal.
  • the term LED includes, but is not limited to, various semiconductor-based structures that emit light in response to current, light emitting polymers, organic light emitting diodes (OLEDs), electroluminescent strips, and the like.
  • LED refers to light emitting diodes of all types (including semi-conductor and organic light emitting diodes) that may be configured to generate radiation in one or more of the infrared spectrum, ultraviolet spectrum, and various portions of the visible spectrum (generally including radiation wavelengths from approximately 400 nanometers to approximately 700 nanometers).
  • Some examples of LEDs include, but are not limited to, various types of infrared LEDs, ultraviolet LEDs, red LEDs, blue LEDs, green LEDs, yellow LEDs, amber LEDs, orange LEDs, and white LEDs (discussed further below).
  • LEDs may be configured and/or controlled to generate radiation having various bandwidths (e.g., full widths at half maximum, or FWHM) for a given spectrum (e.g., narrow bandwidth, broad bandwidth), and a variety of dominant wavelengths within a given general color categorization.
  • an LED configured to generate essentially white light may include a number of dies which respectively emit different spectra of electroluminescence that, in combination, mix to form essentially white light.
  • a white light LED may be associated with a phosphor material that converts electroluminescence having a first spectrum to a different second spectrum.
  • electroluminescence having a relatively short wavelength and narrow bandwidth spectrum "pumps" the phosphor material, which in turn radiates longer wavelength radiation having a somewhat broader spectrum.
  • an LED does not limit the physical and/or electrical package type of an LED.
  • an LED may refer to a single light emitting device having multiple dies that are configured to respectively emit different spectra of radiation (e.g., that may or may not be individually controllable).
  • an LED may be associated with a phosphor that is considered as an integral part of the LED (e.g., some types of white LEDs).
  • the term LED may refer to packaged LEDs, non-packaged LEDs, surface mount LEDs, chip-on-board LEDs, T-package mount LEDs, radial package LEDs, power package LEDs, LEDs including some type of encasement and/or optical element (e.g., a diffusing lens), etc.
  • the term “light source” should be understood to refer to any one or more of a variety of radiation sources, including, but not limited to, LED-based sources (including one or more LEDs as defined above), incandescent sources (e.g., filament lamps, halogen lamps), fluorescent sources, phosphorescent sources, high-intensity discharge sources (e.g., sodium vapor, mercury vapor, and metal halide lamps), lasers, other types of electroluminescent sources, pyro-luminescent sources (e.g., flames), candle-luminescent sources (e.g., gas mantles, carbon arc radiation sources), photo-luminescent sources (e.g., gaseous discharge sources), cathode luminescent sources using electronic excitation, galvano-luminescent sources, crystallo-luminescent sources, kine-luminescent sources, thermo-luminescent sources, triboluminescent sources, sonoluminescent sources, radioluminescent sources, and luminescent polymers.
  • a given light source may be configured to generate electromagnetic radiation within the visible spectrum, outside the visible spectrum, or a combination of both.
  • a light source may include as an integral component one or more filters (e.g., color filters), lenses, or other optical components.
  • light sources may be configured for a variety of applications, including, but not limited to, indication, display, and/or illumination.
  • illumination source is a light source that is particularly configured to generate radiation having a sufficient intensity to effectively illuminate an interior or exterior space.
  • sufficient intensity refers to sufficient radiant power in the visible spectrum generated in the space or environment (the unit “lumens” often is employed to represent the total light output from a light source in all directions, in terms of radiant power or “luminous flux”) to provide ambient illumination (i.e., light that may be perceived indirectly and that may be, for example, reflected off of one or more of a variety of intervening surfaces before being perceived in whole or in part).
  • the term “spectrum” should be understood to refer to any one or more frequencies (or wavelengths) of radiation produced by one or more light sources. Accordingly, the term “spectrum” refers to frequencies (or wavelengths) not only in the visible range, but also frequencies (or wavelengths) in the infrared, ultraviolet, and other areas of the overall electromagnetic spectrum. Also, a given spectrum may have a relatively narrow bandwidth (e.g., a FWHM having essentially few frequency or wavelength components) or a relatively wide bandwidth (several frequency or wavelength components having various relative strengths). It should also be appreciated that a given spectrum may be the result of a mixing of two or more other spectra (e.g., mixing radiation respectively emitted from multiple light sources).
  • color is used interchangeably with the term “spectrum.”
  • the term “color” generally is used to refer primarily to a property of radiation that is perceivable by an observer (although this usage is not intended to limit the scope of this term). Accordingly, the terms “different colors” implicitly refer to multiple spectra having different wavelength components and/or bandwidths. It also should be appreciated that the term “color” may be used in connection with both white and non-white light.
  • color temperature generally is used herein in connection with white light, although this usage is not intended to limit the scope of this term.
  • Color temperature essentially refers to a particular color content or shade (e.g., reddish, bluish) of white light.
  • the color temperature of a given radiation sample conventionally is characterized according to the temperature in degrees Kelvin (K) of a black body radiator that radiates essentially the same spectrum as the radiation sample in question.
  • Black body radiator color temperatures generally fall within a range of approximately 700 degrees K (typically considered the first visible to the human eye) to over 10,000 degrees K; white light generally is perceived at color temperatures above 1500-2000 degrees K.
  • Lower color temperatures generally indicate white light having a more significant red component or a “warmer feel,” while higher color temperatures generally indicate white light having a more significant blue component or a “cooler feel.”
  • fire has a color temperature of approximately 1,800 degrees K
  • a conventional incandescent bulb has a color temperature of approximately 2,848 degrees K
  • early morning daylight has a color temperature of approximately 3,000 degrees K
  • overcast midday skies have a color temperature of approximately 10,000 degrees K.
  • a color image viewed under white light having a color temperature of approximately 3,000 degree K has a relatively reddish tone
  • the same color image viewed under white light having a color temperature of approximately 10,000 degrees K has a relatively bluish tone.
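The warm/cool distinction above can be captured in a small helper. The 3500 K and 5000 K thresholds are illustrative assumptions, chosen only to be consistent with the examples in this disclosure (fire and incandescent bulbs fall below them, overcast daylight well above):

```python
def color_feel(cct_k: float) -> str:
    """Coarse perceptual label for a correlated color temperature in Kelvin.
    The thresholds (3500 K, 5000 K) are illustrative, not from this disclosure."""
    if cct_k < 3500:
        return "warm"     # significant red component, e.g. fire, incandescent bulbs
    if cct_k > 5000:
        return "cool"     # significant blue component, e.g. overcast daylight
    return "neutral"
```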
  • the term “lighting fixture” is used herein to refer to an implementation or arrangement of one or more lighting units in a particular form factor, assembly, or package.
  • the term “lighting unit” is used herein to refer to an apparatus including one or more light sources of same or different types.
  • a given lighting unit may have any one of a variety of mounting arrangements for the light source(s), enclosure/housing arrangements and shapes, and/or electrical and mechanical connection configurations. Additionally, a given lighting unit optionally may be associated with (e.g., include, be coupled to and/or packaged together with) various other components (e.g., control circuitry) relating to the operation of the light source(s).
  • LED-based lighting unit refers to a lighting unit that includes one or more LED- based light sources as discussed above, alone or in combination with other non LED-based light sources.
  • a “multi-channel” lighting unit refers to an LED-based or non LED-based lighting unit that includes at least two light sources configured to respectively generate different spectra of radiation, wherein each different source spectrum may be referred to as a “channel” of the multi-channel lighting unit.
  • controller is used herein generally to describe various apparatus relating to the operation of one or more light sources.
  • a controller can be implemented in numerous ways (e.g., such as with dedicated hardware) to perform various functions discussed herein.
  • a “processor” is one example of a controller which employs one or more microprocessors that may be programmed using software (e.g., microcode) to perform various functions discussed herein.
  • a controller may be implemented with or without employing a processor, and also may be implemented as a combination of dedicated hardware to perform some functions and a processor (e.g., one or more programmed microprocessors and associated circuitry) to perform other functions. Examples of controller components that may be employed in various embodiments of the present disclosure include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
  • a processor or controller may be associated with one or more storage media (generically referred to herein as “memory,” e.g., volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM, floppy disks, compact disks, optical disks, magnetic tape, etc.).
  • the storage media may be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform at least some of the functions discussed herein.
  • Various storage media may be fixed within a processor or controller or may be transportable, such that the one or more programs stored thereon can be loaded into a processor or controller so as to implement various aspects of the present invention discussed herein.
  • program or “computer program” are used herein in a generic sense to refer to any type of computer code (e.g., software or microcode) that can be employed to program one or more processors or controllers.
  • the term “addressable” is used herein to refer to a device (e.g., a light source in general, a lighting unit or fixture, a controller or processor associated with one or more light sources or lighting units, other non-lighting related devices, etc.) that is configured to receive information (e.g., data) intended for multiple devices, including itself, and to selectively respond to particular information intended for it.
  • the term “addressable” often is used in connection with a networked environment (or a “network,” discussed further below), in which multiple devices are coupled together via some communications medium or media.
  • one or more devices coupled to a network may serve as a controller for one or more other devices coupled to the network (e.g., in a master/slave relationship).
  • a networked environment may include one or more dedicated controllers that are configured to control one or more of the devices coupled to the network.
  • multiple devices coupled to the network each may have access to data that is present on the communications medium or media; however, a given device may be "addressable" in that it is configured to selectively exchange data with (i.e., receive data from and/or transmit data to) the network, based, for example, on one or more particular identifiers (e.g., "addresses") assigned to it.
  • network refers to any interconnection of two or more devices (including controllers or processors) that facilitates the transport of information (e.g., for device control, data storage, data exchange, etc.) between any two or more devices and/or among multiple devices coupled to the network.
  • networks suitable for interconnecting multiple devices may include any of a variety of network topologies and employ any of a variety of communication protocols.
  • any one connection between two devices may represent a dedicated connection between the two systems, or alternatively a non-dedicated connection.
  • a non-dedicated connection may carry information not necessarily intended for either of the two devices (e.g., an open network connection).
  • various networks of devices as discussed herein may employ one or more wireless, wire/cable, and/or fiber optic links to facilitate information transport throughout the network.
  • user interface refers to an interface between a human user or operator and one or more devices that enables communication between the user and the device(s).
  • user interfaces that may be employed in various implementations of the present disclosure include, but are not limited to, switches, potentiometers, buttons, dials, sliders, a mouse, keyboard, keypad, various types of game controllers (e.g., joysticks), track balls, display screens, various types of graphical user interfaces (GUIs), touch screens, microphones and other types of sensors that may receive some form of human-generated stimulus and generate a signal in response thereto.
  • three-dimensional trajectory may refer to a trajectory travelled by one or more objects through “free space.”
  • Free space may refer to any location that is remote from any surface. For example, a motocross participant, ski jumper, or an acrobat will travel through at least one midair and/or airborne trajectory. A three-dimensional trajectory followed by an underwater performer similarly will occur partially or mostly away from the surface of the water and away from any underlying surfaces.
  • FIG. 1 illustrates an example scenario in which disclosed techniques may be employed, in accordance with various embodiments.
  • Fig. 2 depicts an example scenario in which an unmanned aerial vehicle is used to detect light quality characteristics in a three-dimensional trajectory corridor, in accordance with various embodiments.
  • FIG. 3 schematically depicts example components that may be configured to perform techniques described herein.
  • Fig. 4 depicts an example method for performing selected aspects of the present disclosure.
  • Fig. 5 depicts an example computing system.
  • Lighting infrastructure for a public venue may be designed to illuminate relatively few different types of events, and the different available illumination settings may be tailored towards illuminating a surface such as court or rink.
  • some types of events may feature activity that occurs relatively far from a surface, such as airborne or underwater activity that occurs along one or more three-dimensional trajectories that are at least partially remote from any surface.
  • Existing lighting infrastructure is not equipped to detect light quality so that objects passing through three-dimensional trajectories can be properly illuminated. Thus, there is a need in the art to properly detect and illuminate three-dimensional trajectories to be travelled by objects.
  • Applicants have recognized and appreciated that it would be beneficial to provide methods and systems for controlling one or more properties of light emitted to illuminate a three-dimensional trajectory.
  • various embodiments and implementations of the present invention are directed to determining a general three-dimensional trajectory to be traveled by one or more objects, to determining one or more lighting quality characteristics along the three-dimensional trajectory, and to adjusting light output by a lighting system to better illuminate objects that travel along the trajectory.
  • a lighting system 100 is nominally configured to illuminate a surface 102 within a venue 104.
  • Surface 102 may be used for a variety of entertainment and/or sports activities, and may be covered with various types of natural or artificial turfs, one or more ice rinks, dirt (e.g., for rodeos or automobile rallies/races), a basketball court, or other types of flooring that may be used to host other types of events, such as gymnastics competitions, fairs, expositions, and so forth.
  • lighting system 100 includes two sets, 106a and 106b, of individual stadium lights 108 (only four stadium lights 108 are referenced in Fig. 1 to avoid cluttering the drawings).
  • each set 106 may include more or fewer stadium lights 108 than are depicted in Fig. 1, stadium lights 108 may be in different configurations within each set 106 than is depicted, and one set 106 of stadium lights 108 may include a different number of stadium lights 108 than another set 106 in a different configuration.
  • Each stadium light 108 may come in various forms.
  • the stadium lights 108 may be LED lights, in which case it may be possible to select and/or control one or more properties of light they emit. These properties of emitted light may include but are not limited to hue, saturation, color temperature, dynamic effects (e.g., blinking, flashing, etc.), intensity, a direction in which the emitted light is aimed, beam width, and so forth.
  • each stadium light 108 may be individually controllable to emit light having one or more selected properties. Additionally or alternatively, in some embodiments, groups of stadium lights 108 (e.g., each set 106) may be controlled to collectively emit light having one or more selected properties. While examples described herein include LED-based light sources, other light source types may be employed, including but not limited to halogen, incandescent, fluorescent, so-called planar light sources that employ carbon nanotubes, and so forth.
  • venue 104 is being used for a motocross event.
  • one or more motocross participants may ride along surface 102, at various points hitting one or more jumps 110.
  • When the participants hit a jump 110 at sufficient speed, they may become airborne for some distance until they once again land on surface 102.
  • a single rider 112 is schematically depicted in Fig. 1 at various stages of approaching, going over, and landing after jump 110.
  • a general three-dimensional trajectory 114 that will be travelled by multiple motocross participants can be identified.
  • general three-dimensional trajectory 114 may be defined as a three-dimensional “corridor” through which multiple objects, such as motocross participants, may be expected to travel while participating in an event.
  • lighting system 100 may be operated in various ways (e.g., configured to implement one or more predefined lighting scenes) to emit light having various properties selected to properly illuminate surface 102 for various events.
  • light emitted by lighting system 100 to have various properties selected to illuminate surface 102 in a particular way may not effectively illuminate motocross participants as they travel well above surface 102 through general three-dimensional trajectory 114.
  • Similar issues may arise with other events that include participants that travel through free space, including but not limited to ski jump events, acrobat events (e.g., as part of a circus), monster truck rallies, underwater performances, aquatic diving events, and so forth.
  • general three-dimensional trajectory 114 may be detected and/or identified based on data obtained from one or more sensors.
  • multiple objects may be observed in free space, e.g., in midair, using the sensors. Data from those sensors may be analyzed collectively to identify general three-dimensional trajectory 114 in a manner that captures the trajectories of all such objects, rather than a single object.
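One simple way to collapse several observed runs into such a general trajectory, assuming each run has already been resampled to the same number of points, is to take a per-index centerline plus a radius wide enough to contain every observed position at that index. The corridor representation below is an illustrative assumption, not a construct defined in this disclosure:

```python
from statistics import mean
from typing import List, Tuple

Point3D = Tuple[float, float, float]

def corridor(trajectories: List[List[Point3D]]) -> List[Tuple[Point3D, float]]:
    """Collapse several observed runs (resampled to the same number of
    points) into a general corridor: at each index, a centerline point plus
    a radius large enough to contain every observed position there."""
    result = []
    for pts in zip(*trajectories):
        cx = mean(p[0] for p in pts)
        cy = mean(p[1] for p in pts)
        cz = mean(p[2] for p in pts)
        # Radius = distance from the centerline to the farthest observation.
        r = max(((p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2) ** 0.5
                for p in pts)
        result.append(((cx, cy, cz), r))
    return result
```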
  • one or more light quality characteristics at one or more points along general three-dimensional trajectory 114 may be identified based on data obtained from the same one or more sensors or different sensors. Based on the identified one or more light quality characteristics, one or more properties of light to be emitted by lighting system 100 may be selected.
  • Lighting system 100 may be operated to emit light having the one or more selected properties while one or more objects such as motocross participants travel along general three-dimensional trajectory 114.
  • One example of a type of sensor that may be used to detect/identify general three-dimensional trajectory 114 is a camera 116. Only one camera 116 is depicted in Fig. 1, but this is not meant to be limiting. Any number of cameras 116 may be employed to identify/detect general three-dimensional trajectory 114. Camera 116 may take various forms and employ various technologies. For example, in some embodiments, camera 116 may be a two-dimensional or even a three-dimensional camera that detects light in the visible spectrum. Additionally or alternatively, camera 116 may detect light in the infrared spectrum. In some embodiments, camera 116 may be a thermal imaging camera configured to detect heat.
  • Camera 116 may be operated to detect three-dimensional trajectory 114 in various ways.
  • video frames contained in a signal produced by camera 116 may be analyzed, in real time or after the fact.
  • Video frames may be examined to determine, for instance, a location of single rider 112 at each point in time along a time interval.
  • single rider 112 (or multiple motocross participants) may be provided with radio frequency (“RF”) beacons (also referred to as “tags”).
  • Camera 116 may be configured with an RF receiver configured to detect radio waves emitted by the RF beacons.
  • camera 116 may point its field of view towards a location at which camera 116 detects a tag, so that in effect, camera 116 "follows" objects such as single rider 112 as they travel through general three-dimensional trajectory 114.
  • general three-dimensional trajectory 114 may be identified and/or detected based on video footage produced by such a camera 116 and/or a path followed by a field of view of camera 116.
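Aiming a pan/tilt camera at a detected tag position reduces to two angle computations. The coordinate conventions below (pan measured from the +x axis, tilt from horizontal) are assumptions made for the sketch, not conventions stated in this disclosure:

```python
from math import atan2, degrees, hypot
from typing import Tuple

Point3D = Tuple[float, float, float]

def pan_tilt(camera: Point3D, tag: Point3D) -> Tuple[float, float]:
    """Pan and tilt angles (degrees) that point the camera's optical axis
    at a detected tag position; 0 deg pan = +x axis, 0 deg tilt = horizontal."""
    dx, dy, dz = (t - c for t, c in zip(tag, camera))
    pan = degrees(atan2(dy, dx))               # rotation in the ground plane
    tilt = degrees(atan2(dz, hypot(dx, dy)))   # elevation above horizontal
    return pan, tilt
```

Re-evaluating these angles each time the RF receiver reports a new tag position would keep the beacon within the camera's field of view as the rider travels the corridor.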
  • camera 116 may remain stationary.
  • An object travelling along general three-dimensional trajectory 114 may pass through a field of view of camera 116.
  • Video frames of that footage may be analyzed using standard object detection image processing techniques (e.g., edge detection) to determine multiple locations of the object along a timeline. Together, those multiple locations may reveal general three-dimensional trajectory 114.
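A minimal stand-in for that per-frame analysis, using a brightness-threshold centroid in place of a real edge detector, might look like the following; the grayscale frame format and the threshold value are assumptions for the example:

```python
from typing import List, Optional, Tuple

Frame = List[List[int]]  # grayscale pixel values, 0-255, rows of columns

def centroid(frame: Frame, threshold: int = 128) -> Optional[Tuple[float, float]]:
    """Image-plane centroid of all pixels brighter than the threshold, a
    toy stand-in for real object detection (edge detection, background
    subtraction, etc.). Returns None when nothing is detected."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

def track(frames: List[Frame]) -> List[Tuple[float, float]]:
    # One detected location per frame; consecutive locations trace the
    # object's path across the camera's field of view over time.
    return [c for c in (centroid(f) for f in frames) if c is not None]
```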
• Another example of a type of sensor that may be used to detect and/or identify general three-dimensional trajectory 114 is a position sensor.
  • an object such as single rider 112 may be equipped with one or more position sensors, such as accelerometers, gyroscopes, altimeters, global positioning system (“GPS”) sensors, etc. These position sensors may record and/or provide a signal indicative of their location while the object travels through midair along general three-dimensional trajectory 114. The signal may then be used to identify and/or detect general three-dimensional trajectory 114.
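As one illustration of how a position-sensor signal might yield a trajectory, the sketch below twice-integrates accelerometer samples (dead reckoning). The regular sampling interval and known initial state are assumptions made for the example; a practical system would fuse GPS, gyroscope, and/or altimeter data to bound integration drift.

```python
# Illustrative sketch: dead reckoning from 3-axis accelerometer
# samples. Assumes a fixed sampling interval dt and a known initial
# position/velocity; both are simplifying assumptions.

def integrate_accelerations(samples, dt, p0=(0.0, 0.0, 0.0), v0=(0.0, 0.0, 0.0)):
    """Twice-integrate (ax, ay, az) samples into a list of (x, y, z)
    positions, starting from position p0 and velocity v0."""
    position = list(p0)
    velocity = list(v0)
    path = [tuple(position)]
    for accel in samples:
        for axis in range(3):
            velocity[axis] += accel[axis] * dt  # v += a * dt
            position[axis] += velocity[axis] * dt  # p += v * dt
        path.append(tuple(position))
    return path
```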
  • one or more objects may be equipped with an RF beacon.
  • One or more RF receivers may be configured to receive an RF signal emitted by the beacon.
  • the signals provided by these receivers may be analyzed, e.g., using techniques such as triangulation, to determine trajectories of the objects (and hence, to identify general three-dimensional trajectory 114).
• In some embodiments, more than one of the aforementioned sensors may be used to identify general three-dimensional trajectory 114.
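The triangulation-style analysis mentioned above can be illustrated with a minimal two-dimensional trilateration sketch, assuming three receivers at known positions and idealized, noise-free range estimates derived from the beacon's RF signal; real measurements would require least-squares fitting over many noisy samples.

```python
# Illustrative sketch: locate an RF beacon in 2D from three
# receivers via trilateration. Subtracting pairs of circle
# equations yields a linear system in (x, y).

def trilaterate(r1, r2, r3):
    """Each argument is ((x, y), distance). Returns the (x, y)
    position consistent with all three range measurements."""
    (x1, y1), d1 = r1
    (x2, y2), d2 = r2
    (x3, y3), d3 = r3
    a = 2 * (x2 - x1)
    b = 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2)
    e = 2 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a * e - b * d  # zero when the receivers are collinear
    x = (c * e - b * f) / det
    y = (a * f - c * d) / det
    return (x, y)
```

Extending this to three dimensions requires a fourth receiver; repeating the fix over time yields the beacon's trajectory.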
  • one or more light quality characteristics at one or more points along general three-dimensional trajectory 114 may be identified based on data obtained from the same one or more sensors that were used to identify/detect general three-dimensional trajectory 114, or different sensors. Light quality characteristics within general three-dimensional trajectory 114 may be determined in various ways.
• one or more cameras 116 may be used to determine one or more light quality characteristics in general three-dimensional trajectory 114 (e.g., after three-dimensional trajectory 114 is identified or at the same time). For example, light reflected off one or more objects passing through general three-dimensional trajectory 114 (e.g., reflected from single rider 112) may be detected and analyzed, in real time based on a video signal or after the fact by analyzing video footage.
  • a test pattern may be affixed to an object (e.g., on clothes worn by single rider 112) as the object passes through general three-dimensional trajectory 114. The test pattern may be selected depending on which light quality characteristics are of most interest.
  • a large textile with a test pattern printed on it may be hoisted above general three-dimensional trajectory 114 to form a plane that is parallel to a direction of travel through, and cuts through (e.g., bisects), general three-dimensional trajectory 114.
  • Light reflected from multiple positions on the textile may be analyzed to identify light quality characteristics.
  • Sensors other than camera(s) 116 may be used to identify light quality characteristics within general three-dimensional trajectory 114 as well.
  • objects passing through general three-dimensional trajectory 114 may be equipped with light sensors. These light sensors may be configured to detect one or more properties of light emitted by, for instance, lighting system 100, particularly as the objects pass through general three-dimensional trajectory 114.
  • single rider 112 could be provided with a lux level meter to calculate an intensity of light experienced at multiple points within general three-dimensional trajectory 114.
• In addition to motocross participants (e.g., single rider 112), other objects may be traversed along general three-dimensional trajectory 114 to identify one or more light quality characteristics.
• an unmanned aerial vehicle ("UAV," also referred to as a "drone") 120 may be provided with light sensors (not depicted). UAV 120 may then be piloted through general three-dimensional trajectory 114 so that the light sensor(s) can obtain one or more readings at various points.
  • a user may manually pilot UAV 120 through general three-dimensional trajectory 114.
• UAV 120 may be automatically piloted through general three-dimensional trajectory 114, e.g., as a result of a flight plan being provided to UAV 120 or to a remote computing device (not depicted) that controls it.
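One illustrative way to evaluate readings gathered along the trajectory (whether by a rider-mounted lux meter or a drone-mounted light sensor) is sketched below. The uniformity metric (minimum over average), the 0.7 uniformity target, and the 500 lx minimum are assumed figures for the example, not values from the disclosure.

```python
# Illustrative sketch: evaluate lux samples collected at points
# along a trajectory. Thresholds are assumed example values.

def assess_light_quality(samples, min_lux=500.0, uniformity_target=0.7):
    """samples: list of ((x, y, z), lux) pairs. Returns the measured
    uniformity ratio (min/avg), whether it meets the target, and the
    positions that fall below the minimum illuminance."""
    levels = [lux for _, lux in samples]
    uniformity = min(levels) / (sum(levels) / len(levels))
    dark_points = [pos for pos, lux in samples if lux < min_lux]
    return uniformity, uniformity >= uniformity_target, dark_points
```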
  • one or more computing devices may select one or more properties of light to be emitted by lighting system 100. Lighting system 100 may then be operated to emit light having the one or more selected properties while one or more objects such as motocross participants travel along general three-dimensional trajectory 114.
• the computing devices may execute an algorithm that uses current lighting homogeneity information (i.e., the desired lighting homogeneity employed to uniformly illuminate surface 102) to calculate the lighting compensation required to correct for any inhomogeneity.
• Suppose one or more light quality characteristics measured at one or more positions along general three-dimensional trajectory 114 fail to satisfy a criterion, e.g., the object is too dark as it travels through the air. It may be determined that an intensity of emitted light that passes through those points along general three-dimensional trajectory 114 should be increased, at all times or only when an object is passing through general three-dimensional trajectory 114. For example, one or more stadium lights 108 that are aimed at those one or more positions may be operated to emit light having an increased intensity.
  • one or more stadium lights 108 may be rotated and/or tilted, e.g., by a motor (not depicted), to aim more directly at the one or more positions within general three-dimensional trajectory 114, with or without also adjusting light output intensity.
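A minimal sketch of the intensity-compensation step might look as follows, assuming light output is roughly proportional to a fixture's drive level and that the mapping from fixtures to trajectory positions is already known; both are simplifying assumptions made for illustration.

```python
# Illustrative sketch: scale a fixture's drive level so measured
# illuminance at a trajectory point approaches a target value.
# Assumes output is proportional to drive level.

def compensate(drive_level, measured_lux, target_lux, max_level=1.0):
    """Return a new drive level, clamped to the fixture's maximum.
    If nothing was measured, fall back to full output."""
    if measured_lux <= 0:
        return max_level
    return min(max_level, drive_level * target_lux / measured_lux)
```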
  • FIG. 3 schematically depicts various components that may cooperate to perform techniques described herein.
• a lighting system controller 340 is communicatively coupled with, and configured to control, a plurality of light sources 308_1-308_N of a lighting system 300, which in this example are LED-based light sources 308. Lighting system controller 340 is also communicatively coupled with a trajectory identification engine 342 and a light quality analysis engine 344. Two components may be "communicatively coupled" if they are capable of exchanging any sort of analog and/or digital signal between them, e.g., data, electricity, and so forth.
• each of components 340, 342, and 344 may be communicatively coupled via one or more computing networks (not depicted) that employ various wired or wireless technologies, including but not limited to Wi-Fi, Ethernet, USB, Serial, Bluetooth, ZigBee, and so forth. While components 340, 342, and 344 are depicted separately, this is not meant to be limiting.
• Trajectory identification engine 342 may be configured to obtain, from one or more sensors, signals indicative of one or more objects passing through an environment. Based on these signals, trajectory identification engine 342 may be configured to identify and/or detect one or more general three-dimensional trajectories followed by the objects, such as general three-dimensional trajectory 114 of Fig. 1. Trajectory identification engine 342 is depicted in Fig. 3 as being communicatively coupled with one or more cameras 316.
• The signal provided to trajectory identification engine 342 may be a real time video signal, or may comprise recorded video frames that may be analyzed after the fact.
• Trajectory identification engine 342 may provide data indicative of a sensed three-dimensional trajectory to light quality analysis engine 344.
  • This data may come in various forms, such as a plurality of three-dimensional position coordinates.
  • the position coordinates may be accompanied by dimension data indicative of one or more dimensions (e.g., width, height) of a trajectory "corridor" associated with the three-dimensional trajectory.
• a trajectory "corridor" may encompass composite position coordinates from a plurality of travelling objects, with the goal of ensuring that the ultimately calculated trajectory corridor substantially or completely encompasses all objects that were observed to determine the three-dimensional trajectory.
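The corridor computation described above might be sketched as follows, assuming the observed paths have been resampled to a common number of samples; the per-axis margin is an illustrative safety allowance, not a disclosed parameter.

```python
# Illustrative sketch: build a trajectory "corridor" enclosing the
# paths of several observed objects, as per-sample axis-aligned
# bounds expanded by a safety margin.

def corridor(paths, margin=0.5):
    """paths: list of runs, each a list of (x, y, z) samples of equal
    length. Returns per-sample (min_corner, max_corner) bounds that
    enclose every run, expanded by margin on each axis."""
    bounds = []
    for samples in zip(*paths):
        lo = tuple(min(p[i] for p in samples) - margin for i in range(3))
        hi = tuple(max(p[i] for p in samples) + margin for i in range(3))
        bounds.append((lo, hi))
    return bounds
```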
  • Light quality analysis engine 344 may be configured to identify one or more light quality characteristics at one or more points along the general three-dimensional trajectory based on data from one or more sensors. For example, and as is depicted in Fig. 3, light quality analysis engine 344 is communicatively coupled with one or more cameras 316, which may be the same cameras as were communicatively coupled with trajectory identification engine 342, or may be different cameras. As described above, a video signal and/or recorded video frames obtained by camera 316 may be used by light quality analysis engine 344 to detect one or more properties of light emitted by one or more light sources 308 of lighting system 300. For example, light quality analysis engine 344 may detect light reflected off one or more objects, and/or light reflected off one or more test patterns that are affixed to the objects or that are standalone.
• Light quality analysis engine 344 may additionally or alternatively be communicatively coupled with one or more light sensors 350.
• light sensors 350 may be affixed to one or more objects that traverse a trajectory of interest, such as single rider 112, UAV 120, or even to a device such as a pole manually hoisted through the trajectory by a person.
  • light sensor 350 takes the form of an LED photodiode used to detect light, but this is not limiting.
• Light sensor 350 may take various other forms, such as a photosensor, photodetector, active-pixel sensor, phototransistor, photovoltaic cell, lux level meter, charge-coupled device, bolometer, and so forth.
• Light quality analysis engine 344 may provide data indicative of the one or more light quality characteristics it detected to various components, such as lighting system controller 340.
• Lighting system controller 340 may be configured to select one or more properties of light to be emitted by one or more light sources 308. Based on the light quality characteristics provided by light quality analysis engine 344, lighting system controller 340 may cause one or more light sources 308 to emit light having one or more of the selected properties. Suppose a contrast observed between one or more objects traversing a three-dimensional trajectory and a surface underneath the three-dimensional trajectory is insufficient. This might occur, for example, if ski jumpers are insufficiently discernable from the snow underneath them. In various embodiments, one or more light sources 308 may be controlled to emit light having one or more properties (e.g., an aimed direction, beam width, color, intensity, etc.) selected to increase the contrast.
  • ski jumpers may be illuminated with warm white light, whereas the underlying snow may be illuminated with relatively cool white light, to increase contrast between the two.
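The contrast scenario above can be illustrated with a simple Weber-contrast check; the 0.3 threshold is an assumed figure for the example only, not a value from the disclosure.

```python
# Illustrative sketch: decide whether an object stands out enough
# against the surface beneath it, using Weber contrast.

def weber_contrast(object_luminance, background_luminance):
    """(L_object - L_background) / L_background."""
    return (object_luminance - background_luminance) / background_luminance

def needs_contrast_boost(object_luminance, background_luminance, threshold=0.3):
    """True when the object is insufficiently discernable from the
    surface beneath it (e.g., a ski jumper against snow)."""
    return abs(weber_contrast(object_luminance, background_luminance)) < threshold
```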
  • Fig. 4 schematically depicts an example method 400 of controlling one or more properties of light emitted to illuminate a three-dimensional trajectory, in accordance with various implementations.
  • This system may include various components of various computer systems. For instance, some operations may be performed at lighting system controller 340, while other operations may be performed by one or more of trajectory identification engine 342 and/or light quality analysis engine 344.
• While operations of method 400 are shown in a particular order, this is not meant to be limiting. One or more operations may be reordered, omitted, or added.
• At block 402, the system may identify a general three-dimensional trajectory to be followed by one or more objects based on data from one or more sensors.
• At block 404, the system may identify one or more light quality characteristics at one or more points along the general three-dimensional trajectory identified at block 402.
• At block 406, the system may select one or more properties of light to be emitted by one or more light sources based on the light quality characteristics identified at block 404.
• At block 408, the system may operate the one or more light sources to emit light having the one or more properties of light selected at block 406.
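The four blocks of method 400 can be sketched as a single control pass; the callables and data shapes below are assumptions made for illustration, not part of the disclosed embodiments.

```python
# Illustrative sketch of the control flow of method 400
# (blocks 402-408), wired together with caller-supplied functions.

def control_lighting(identify_trajectory, measure_light, select_properties, apply_properties):
    """Run one pass of method 400: identify the trajectory, measure
    light quality along it, select light properties, then emit."""
    trajectory = identify_trajectory()                # block 402
    characteristics = measure_light(trajectory)       # block 404
    properties = select_properties(characteristics)   # block 406
    return apply_properties(properties)               # block 408
```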
  • FIG. 5 is a block diagram of an example computer system 510.
  • Computer system 510 typically includes at least one processor 514 which communicates with a number of peripheral devices via bus subsystem 512. These peripheral devices may include a storage subsystem 524, including, for example, a memory subsystem 525 and a file storage subsystem 526, user interface output devices 520, user interface input devices 522, and a network interface subsystem 516. The input and output devices allow user interaction with computer system 510.
  • Network interface subsystem 516 provides an interface to outside networks and is coupled to corresponding interface devices in other computer systems.
  • User interface input devices 522 may include a keyboard, pointing devices such as a mouse, trackball, touchpad, or graphics tablet, a scanner, a touchscreen incorporated into the display, audio input devices such as voice recognition systems, microphones, and/or other types of input devices.
• In general, use of the term "input device" is intended to include all possible types of devices and ways to input information into computer system 510 or onto a communication network.
  • User interface output devices 520 may include a display subsystem, a printer, a fax machine, or non-visual displays such as audio output devices.
  • the display subsystem may include a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), a projection device, or some other mechanism for creating a visible image.
  • the display subsystem may also provide non-visual display such as via audio output devices.
• In general, use of the term "output device" is intended to include all possible types of devices and ways to output information from computer system 510 to the user or to another machine or computer system.
  • Storage subsystem 524 stores programming and data constructs that provide the functionality of some or all of the modules described herein.
  • the storage subsystem 524 may include the logic to perform selected aspects of method 400, as well as one or more of the operations performed by lighting system controller 340, trajectory identification engine 342, light quality analysis engine 344, and so forth.
  • Memory 525 used in the storage subsystem 524 can include a number of memories including a main random access memory (RAM) 530 for storage of instructions and data during program execution and a read only memory (ROM) 532 in which fixed instructions are stored.
  • a file storage subsystem 526 can provide persistent storage for program and data files, and may include a hard disk drive, a floppy disk drive along with associated removable media, a CD-ROM drive, an optical drive, or removable media cartridges.
  • the modules implementing the functionality of certain implementations may be stored by file storage subsystem 526 in the storage subsystem 524, or in other machines accessible by the processor(s) 514.
  • Bus subsystem 512 provides a mechanism for letting the various components and subsystems of computer system 510 communicate with each other as intended. Although bus subsystem 512 is shown schematically as a single bus, alternative implementations of the bus subsystem may use multiple busses.
  • Computer system 510 can be of varying types including a workstation, server, computing cluster, blade server, server farm, or any other data processing system or computing device. Due to the ever-changing nature of computers and networks, the description of computer system 510 depicted in Fig. 5 is intended only as a specific example for purposes of illustrating some implementations. Many other configurations of computer system 510 are possible having more or fewer components than the computer system depicted in Fig. 5.
  • inventive embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed.
  • inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein.
  • a reference to "A and/or B", when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • the phrase "at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified.
  • At least one of A and B can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

Abstract

Systems and methods are provided for controlling a plurality of light sources. In various embodiments, based on data obtained from one or more sensors, a general three-dimensional trajectory (114) to be followed by one or more objects through free space of an environment illuminated by the plurality of light sources may be identified (402). One or more light quality characteristics along the general three-dimensional trajectory may be identified (404) based on data obtained from the one or more sensors. One or more properties of light to be emitted by one or more of the plurality of light sources may be selected (406) based on the identified one or more light quality characteristics. The one or more of the plurality of light sources may be operated (408) to emit light having the one or more selected properties while an object travels along the general three-dimensional trajectory.

Description

CONTROLLED LIGHTING FOR THREE-DIMENSIONAL TRAJECTORIES
Technical Field
[0001] The present invention is directed generally to lighting control. More particularly, various inventive methods and apparatus disclosed herein relate to controlling one or more properties of light emitted to illuminate a three-dimensional trajectory.
Background
[0002] Digital lighting technologies, i.e., illumination based on semiconductor light sources, such as light-emitting diodes (LEDs), offer a viable alternative to traditional fluorescent, HID, and incandescent lamps. Functional advantages and benefits of LEDs include high energy conversion and optical efficiency, durability, lower operating costs, and many others. Recent advances in LED technology have provided efficient and robust full-spectrum lighting sources that enable a variety of lighting effects in many applications. Some of the fixtures embodying these sources feature a lighting module, including one or more LEDs capable of producing different colors, e.g., red, green, and blue, as well as a processor for independently controlling the output of the LEDs in order to generate a variety of colors and color-changing lighting effects, for example, as discussed in detail in U.S. Patent Nos. 6,016,038 and 6,211,626, incorporated herein by reference.
[0003] Lighting infrastructure for a public venue may be designed to illuminate relatively few different types of events. For example, a basketball game may be illuminated in one way that best enables the audience to see the players, the ball, and the baskets over a typically wooden court. A hockey game may be illuminated in a different way than a basketball game because a hockey rink is made of ice. An equestrian show may be illuminated in yet another way that best enables the audience to see the riders on their horses. For some sporting events, vertical luminance may also be measured at some distance above and/or away from a target surface, e.g., by placing light meters, mannequins, or sticks at a few desired heights.
[0004] However, some types of events may feature activity that does not always occur close to a surface, e.g., with objects that travel through free space along three-dimensional trajectories. For example, in motocross events, motorcycles often perform a number of jumps as part of the race, and these jumps may be a highlight of the audience experience. As another example, ski jumpers may be airborne through much of their performance. As yet another example, underwater performances such as those that may be featured in aqua parks (e.g., underwater mermaid shows, synchronized swimming, etc.) may feature performers travelling along three-dimensional trajectories. While somewhat flexible, existing lighting infrastructure is not equipped to detect light quality along every point of a three-dimensional trajectory so that, for instance, objects passing through midair can be properly illuminated. Thus, there is a need in the art to properly detect and illuminate three-dimensional trajectories to be traveled by objects.
Summary
[0005] The present disclosure is directed to inventive methods and apparatus for controlling one or more properties of light emitted to illuminate a three-dimensional trajectory. For example, in some embodiments, a general three-dimensional trajectory followed by one or more objects (e.g., motorcycles, ski jumpers, acrobats, etc.) may be detected, e.g., using various types of sensors. Then, one or more light quality attributes along the general three-dimensional trajectory may be determined. Finally, a lighting system may be controlled to emit light having one or more properties selected based on the determined one or more light quality attributes along the general three-dimensional trajectory.
[0006] Generally, in one aspect, a method of controlling a plurality of light sources may include: identifying, based on data obtained from one or more sensors, a general three-dimensional trajectory to be followed by one or more objects through free space of an environment illuminated by the plurality of light sources; identifying, based on data obtained from the one or more sensors, one or more light quality characteristics at one or more points along the general three-dimensional trajectory; selecting one or more properties of light to be emitted by one or more of the plurality of light sources based on the identified one or more light quality characteristics; and operating the one or more of the plurality of light sources to emit light having the one or more selected properties while an object travels along the general three-dimensional trajectory.
[0007] In various embodiments, the one or more sensors includes a camera, and identifying the general three-dimensional trajectory includes identifying the general three-dimensional trajectory based on video data obtained by the camera. In various versions, the video data depicts an object travelling through the environment. In various versions, the video data depicts a test pattern placed within the general three-dimensional trajectory. In various versions, the test pattern is affixed to an aerial drone that follows the general three-dimensional trajectory.
[0008] In various embodiments, the method may include equipping an object passing through the three-dimensional trajectory with a radio frequency beacon. In various versions, a camera may be operated to automatically keep the radio frequency beacon within a field of view of the camera. In various embodiments, identifying the general three-dimensional trajectory may include identifying the general three-dimensional trajectory based on sensor data provided by a motion sensor attached to an object travelling through the environment.
[0009] In various embodiments, at least one of the one or more sensors is affixed to an aerial drone. In various versions, identifying the one or more light quality characteristics comprises traversing the aerial drone along the general three-dimensional trajectory. In various embodiments, at least one of the one or more sensors includes one or more light sensors affixed to at least one of the one or more objects as it travels through the general three-dimensional trajectory. In various embodiments, identifying the one or more light quality characteristics is based on data collected by the one or more light sensors affixed to the at least one of the one or more objects.
[0010] As used herein for purposes of the present disclosure, the term "LED" should be understood to include any electroluminescent diode or other type of carrier injection/junction- based system that is capable of generating radiation in response to an electric signal. Thus, the term LED includes, but is not limited to, various semiconductor-based structures that emit light in response to current, light emitting polymers, organic light emitting diodes (OLEDs), electroluminescent strips, and the like. In particular, the term LED refers to light emitting diodes of all types (including semi-conductor and organic light emitting diodes) that may be configured to generate radiation in one or more of the infrared spectrum, ultraviolet spectrum, and various portions of the visible spectrum (generally including radiation wavelengths from approximately 400 nanometers to approximately 700 nanometers). Some examples of LEDs include, but are not limited to, various types of infrared LEDs, ultraviolet LEDs, red LEDs, blue LEDs, green LEDs, yellow LEDs, amber LEDs, orange LEDs, and white LEDs (discussed further below). It also should be appreciated that LEDs may be configured and/or controlled to generate radiation having various bandwidths (e.g., full widths at half maximum, or FWHM) for a given spectrum (e.g., narrow bandwidth, broad bandwidth), and a variety of dominant wavelengths within a given general color categorization.
[0011] For example, one implementation of an LED configured to generate essentially white light (e.g., a white LED) may include a number of dies which respectively emit different spectra of electroluminescence that, in combination, mix to form essentially white light. In another implementation, a white light LED may be associated with a phosphor material that converts electroluminescence having a first spectrum to a different second spectrum. In one example of this implementation, electroluminescence having a relatively short wavelength and narrow bandwidth spectrum "pumps" the phosphor material, which in turn radiates longer wavelength radiation having a somewhat broader spectrum.
[0012] It should also be understood that the term LED does not limit the physical and/or electrical package type of an LED. For example, as discussed above, an LED may refer to a single light emitting device having multiple dies that are configured to respectively emit different spectra of radiation (e.g., that may or may not be individually controllable). Also, an LED may be associated with a phosphor that is considered as an integral part of the LED (e.g., some types of white LEDs). In general, the term LED may refer to packaged LEDs, non-packaged LEDs, surface mount LEDs, chip-on-board LEDs, T-package mount LEDs, radial package LEDs, power package LEDs, LEDs including some type of encasement and/or optical element (e.g., a diffusing lens), etc.
[0013] The term "light source" should be understood to refer to any one or more of a variety of radiation sources, including, but not limited to, LED-based sources (including one or more LEDs as defined above), incandescent sources (e.g., filament lamps, halogen lamps), fluorescent sources, phosphorescent sources, high-intensity discharge sources (e.g., sodium vapor, mercury vapor, and metal halide lamps), lasers, other types of electroluminescent sources, pyro-luminescent sources (e.g., flames), candle-luminescent sources (e.g., gas mantles, carbon arc radiation sources), photo-luminescent sources (e.g., gaseous discharge sources), cathode luminescent sources using electronic satiation, galvano-luminescent sources, crystallo-luminescent sources, kine-luminescent sources, thermo-luminescent sources, triboluminescent sources, sonoluminescent sources, radioluminescent sources, and luminescent polymers.
[0014] A given light source may be configured to generate electromagnetic radiation within the visible spectrum, outside the visible spectrum, or a combination of both. Hence, the terms "light" and "radiation" are used interchangeably herein. Additionally, a light source may include as an integral component one or more filters (e.g., color filters), lenses, or other optical components. Also, it should be understood that light sources may be configured for a variety of applications, including, but not limited to, indication, display, and/or illumination. An
"illumination source" is a light source that is particularly configured to generate radiation having a sufficient intensity to effectively illuminate an interior or exterior space. In this context, "sufficient intensity" refers to sufficient radiant power in the visible spectrum generated in the space or environment (the unit "lumens" often is employed to represent the total light output from a light source in all directions, in terms of radiant power or "luminous flux") to provide ambient illumination (i.e., light that may be perceived indirectly and that may be, for example, reflected off of one or more of a variety of intervening surfaces before being perceived in whole or in part).
[0015] The term "spectrum" should be understood to refer to any one or more frequencies (or wavelengths) of radiation produced by one or more light sources. Accordingly, the term "spectrum" refers to frequencies (or wavelengths) not only in the visible range, but also frequencies (or wavelengths) in the infrared, ultraviolet, and other areas of the overall electromagnetic spectrum. Also, a given spectrum may have a relatively narrow bandwidth (e.g., a FWHM having essentially few frequency or wavelength components) or a relatively wide bandwidth (several frequency or wavelength components having various relative strengths). It should also be appreciated that a given spectrum may be the result of a mixing of two or more other spectra (e.g., mixing radiation respectively emitted from multiple light sources).
[0016] For purposes of this disclosure, the term "color" is used interchangeably with the term "spectrum." However, the term "color" generally is used to refer primarily to a property of radiation that is perceivable by an observer (although this usage is not intended to limit the scope of this term). Accordingly, the terms "different colors" implicitly refer to multiple spectra having different wavelength components and/or bandwidths. It also should be appreciated that the term "color" may be used in connection with both white and non-white light.
[0017] The term "color temperature" generally is used herein in connection with white light, although this usage is not intended to limit the scope of this term. Color temperature essentially refers to a particular color content or shade (e.g., reddish, bluish) of white light. The color temperature of a given radiation sample conventionally is characterized according to the temperature in degrees Kelvin (K) of a black body radiator that radiates essentially the same spectrum as the radiation sample in question. Black body radiator color temperatures generally fall within a range of approximately 700 degrees K (typically considered the first visible to the human eye) to over 10,000 degrees K; white light generally is perceived at color temperatures above 1500-2000 degrees K.
[0018] Lower color temperatures generally indicate white light having a more significant red component or a "warmer feel," while higher color temperatures generally indicate white light having a more significant blue component or a "cooler feel." By way of example, fire has a color temperature of approximately 1,800 degrees K, a conventional incandescent bulb has a color temperature of approximately 2,848 degrees K, early morning daylight has a color temperature of approximately 3,000 degrees K, and overcast midday skies have a color temperature of approximately 10,000 degrees K. A color image viewed under white light having a color temperature of approximately 3,000 degrees K has a relatively reddish tone, whereas the same color image viewed under white light having a color temperature of approximately 10,000 degrees K has a relatively bluish tone. [0019] The term "lighting fixture" is used herein to refer to an implementation or arrangement of one or more lighting units in a particular form factor, assembly, or package. The term "lighting unit" is used herein to refer to an apparatus including one or more light sources of same or different types. A given lighting unit may have any one of a variety of mounting arrangements for the light source(s), enclosure/housing arrangements and shapes, and/or electrical and mechanical connection configurations. Additionally, a given lighting unit optionally may be associated with (e.g., include, be coupled to and/or packaged together with) various other components (e.g., control circuitry) relating to the operation of the light source(s). An "LED-based lighting unit" refers to a lighting unit that includes one or more LED-based light sources as discussed above, alone or in combination with other non LED-based light sources.
A "multi-channel" lighting unit refers to an LED-based or non LED-based lighting unit that includes at least two light sources configured to respectively generate different spectrums of radiation, wherein each different source spectrum may be referred to as a "channel" of the multi-channel lighting unit.
[0020] The term "controller" is used herein generally to describe various apparatus relating to the operation of one or more light sources. A controller can be implemented in numerous ways (e.g., such as with dedicated hardware) to perform various functions discussed herein. A "processor" is one example of a controller which employs one or more microprocessors that may be programmed using software (e.g., microcode) to perform various functions discussed herein. A controller may be implemented with or without employing a processor, and also may be implemented as a combination of dedicated hardware to perform some functions and a processor (e.g., one or more programmed microprocessors and associated circuitry) to perform other functions. Examples of controller components that may be employed in various embodiments of the present disclosure include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
[0021] In various implementations, a processor or controller may be associated with one or more storage media (generically referred to herein as "memory," e.g., volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM, floppy disks, compact disks, optical disks, magnetic tape, etc.). In some implementations, the storage media may be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform at least some of the functions discussed herein. Various storage media may be fixed within a processor or controller or may be transportable, such that the one or more programs stored thereon can be loaded into a processor or controller so as to implement various aspects of the present invention discussed herein. The terms "program" or "computer program" are used herein in a generic sense to refer to any type of computer code (e.g., software or microcode) that can be employed to program one or more processors or controllers.
[0022] The term "addressable" is used herein to refer to a device (e.g., a light source in general, a lighting unit or fixture, a controller or processor associated with one or more light sources or lighting units, other non-lighting related devices, etc.) that is configured to receive information (e.g., data) intended for multiple devices, including itself, and to selectively respond to particular information intended for it. The term "addressable" often is used in connection with a networked environment (or a "network," discussed further below), in which multiple devices are coupled together via some communications medium or media.
[0023] In one network implementation, one or more devices coupled to a network may serve as a controller for one or more other devices coupled to the network (e.g., in a master/slave relationship). In another implementation, a networked environment may include one or more dedicated controllers that are configured to control one or more of the devices coupled to the network. Generally, multiple devices coupled to the network each may have access to data that is present on the communications medium or media; however, a given device may be "addressable" in that it is configured to selectively exchange data with (i.e., receive data from and/or transmit data to) the network, based, for example, on one or more particular identifiers (e.g., "addresses") assigned to it.
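By way of a non-limiting illustration (not part of the original disclosure; the class and message names below are hypothetical), the "addressable" behavior described above may be sketched as follows: every device coupled to the shared medium receives every message, but each device selectively responds only to messages bearing its own assigned address.

```python
# Hypothetical sketch of "addressable" devices on a shared communications
# medium: all devices receive all data, but each acts only on messages
# carrying the identifier ("address") assigned to it.

class AddressableLight:
    def __init__(self, address):
        self.address = address
        self.level = 0  # current dimming level (0-100)

    def receive(self, message):
        # Selectively respond: ignore traffic intended for other devices.
        if message["address"] == self.address:
            self.level = message["level"]

def broadcast(devices, message):
    # The shared medium delivers the same data to every coupled device.
    for device in devices:
        device.receive(message)

lights = [AddressableLight(addr) for addr in ("A1", "A2", "A3")]
broadcast(lights, {"address": "A2", "level": 80})
# Only the light addressed "A2" changes its level; the others ignore the message.
```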
[0024] The term "network" as used herein refers to any interconnection of two or more devices (including controllers or processors) that facilitates the transport of information (e.g., for device control, data storage, data exchange, etc.) between any two or more devices and/or among multiple devices coupled to the network. As should be readily appreciated, various implementations of networks suitable for interconnecting multiple devices may include any of a variety of network topologies and employ any of a variety of communication protocols.
Additionally, in various networks according to the present disclosure, any one connection between two devices may represent a dedicated connection between the two systems, or alternatively a non-dedicated connection. In addition to carrying information intended for the two devices, such a non-dedicated connection may carry information not necessarily intended for either of the two devices (e.g., an open network connection). Furthermore, it should be readily appreciated that various networks of devices as discussed herein may employ one or more wireless, wire/cable, and/or fiber optic links to facilitate information transport throughout the network.
[0025] The term "user interface" as used herein refers to an interface between a human user or operator and one or more devices that enables communication between the user and the device(s). Examples of user interfaces that may be employed in various implementations of the present disclosure include, but are not limited to, switches, potentiometers, buttons, dials, sliders, a mouse, keyboard, keypad, various types of game controllers (e.g., joysticks), track balls, display screens, various types of graphical user interfaces (GUIs), touch screens, microphones and other types of sensors that may receive some form of human-generated stimulus and generate a signal in response thereto.
[0026] The term "three dimensional trajectory" as used herein may refer to a trajectory travelled by one or more objects through "free space." "Free space" may refer to any location that is remote from any surface. For example, a motocross participant, ski jumper, or an acrobat will travel through at least one midair and/or airborne trajectory. A three-dimensional trajectory followed by an underwater performer similarly will occur partially or mostly away from the surface of the water and away from any underlying surfaces.
[0027] It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.
Brief Description of the Drawings
[0028] In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention.
[0029] Fig. 1 illustrates an example scenario in which disclosed techniques may be employed, in accordance with various embodiments.
[0030] Fig. 2 depicts an example scenario in which an unmanned aerial vehicle is used to detect light quality characteristics in a three-dimensional trajectory corridor, in accordance with various embodiments.
[0031] Fig. 3 schematically depicts example components that may be configured to perform techniques described herein.
[0032] Fig. 4 depicts an example method for performing selected aspects of the present disclosure.
[0033] Fig. 5 depicts an example computing system.
Detailed Description
[0034] Lighting infrastructure for a public venue may be designed to illuminate relatively few different types of events, and the different available illumination settings may be tailored towards illuminating a surface such as a court or rink. However, some types of events may feature activity that occurs relatively far from a surface, such as airborne or underwater activity that occurs along one or more three-dimensional trajectories that are at least partially remote from any surface. Existing lighting infrastructure is not equipped to detect light quality so that objects passing through three-dimensional trajectories can be properly illuminated. Thus, there is a need in the art to properly detect and illuminate three-dimensional trajectories to be travelled by objects. More generally, Applicants have recognized and appreciated that it would be beneficial to provide methods and systems for controlling one or more properties of light emitted to illuminate a three-dimensional trajectory. In view of the foregoing, various embodiments and implementations of the present invention are directed to determining a general three-dimensional trajectory to be traveled by one or more objects, to determining one or more lighting quality characteristics along the three-dimensional trajectory, and to adjusting light output by a lighting system to better illuminate objects that travel along the trajectory.
[0035] Referring to Fig. 1, in one embodiment, a lighting system 100 is nominally configured to illuminate a surface 102 within a venue 104. Surface 102 may be used for a variety of entertainment and/or sports activities, and may be covered with various types of natural or artificial turfs, one or more ice rinks, dirt (e.g., for rodeos or automobile rallies/races), a basketball court, or other types of flooring that may be used to host other types of events, such as gymnastics competitions, fairs, expositions, and so forth. In this example, lighting system 100 includes two sets, 106a and 106b, of individual stadium lights 108 (only four stadium lights 108 are referenced in Fig. 1 to avoid cluttering the drawings). However, this is for illustrative purposes only, and is not meant to be limiting. More or fewer sets 106 of stadium lights 108 may be employed to illuminate surface 102. Additionally, each set 106 may include more or fewer stadium lights 108 than are depicted in Fig. 1, stadium lights 108 may be in different configurations within each set 106 than is depicted, and one set 106 of stadium lights 108 may include a different number of stadium lights 108 than another set 106 in a different configuration.
[0036] Each stadium light 108 may come in various forms. In some embodiments, at least some of the stadium lights 108 may be LED lights, in which case it may be possible to select and/or control one or more properties of light they emit. These properties of emitted light may include but are not limited to hue, saturation, color temperature, dynamic effects (e.g., blinking, flashing, etc.), intensity, a direction in which the emitted light is aimed, beam width, and so forth. In some embodiments, each stadium light 108 may be individually controllable to emit light having one or more selected properties. Additionally or alternatively, in some embodiments, groups of stadium lights 108 (e.g., each set 106) may be controlled to collectively emit light having one or more selected properties. While examples described herein include LED-based light sources, other light source types may be employed, including but not limited to halogen, incandescent, fluorescent, so called planar light sources that employ carbon nanotubes, and so forth.
[0037] In Fig. 1, venue 104 is being used for a motocross event. As part of the event, one or more motocross participants may ride along surface 102, at various points hitting one or more jumps 110. When the participants hit a jump 110 at sufficient speed, they may become airborne for some distance until they once again land on surface 102. For example, a single rider 112 is schematically depicted in Fig. 1 at various stages of approaching, going over, and landing after jump 110. Assuming other motocross participants approach jump 110 at similar speeds, a general three-dimensional trajectory 114 that will be travelled by multiple motocross participants can be identified. In some embodiments, general three-dimensional trajectory 114 may be defined as a three-dimensional "corridor" through which multiple objects, such as motocross participants, may be expected to travel while participating in an event.
[0038] As noted in the background, lighting system 100 may be operated in various ways (e.g., configured to implement one or more predefined lighting scenes) to emit light having various properties selected to properly illuminate surface 102 for various events. However, light emitted by lighting system 100 to have various properties selected to illuminate surface 102 in a particular way may not effectively illuminate motocross participants as they travel well above surface 102 through general three-dimensional trajectory 114. Similar issues may arise with other events that include participants that travel through free space, including but not limited to ski jump events, acrobat events (e.g., as part of a circus), monster truck rallies, underwater performances, aquatic diving events, and so forth.
[0039] Accordingly, in various embodiments, general three-dimensional trajectory 114 may be detected and/or identified based on data obtained from one or more sensors. In some embodiments, multiple objects may be observed in free space, e.g., in midair, using the sensors. Data from those sensors may be analyzed collectively to identify general three-dimensional trajectory 114 in a manner that captures the trajectories of all such objects, rather than a single object. Once general three-dimensional trajectory 114 is identified and/or detected, one or more light quality characteristics at one or more points along general three-dimensional trajectory 114 may be identified based on data obtained from the same one or more sensors or different sensors. Based on the identified one or more light quality
characteristics, one or more properties of light to be emitted by lighting system 100 may be selected. Lighting system 100 may be operated to emit light having the one or more selected properties while one or more objects such as motocross participants travel along general three-dimensional trajectory 114.
[0040] One example of a type of sensor that may be used to detect/identify general three-dimensional trajectory 114 is a camera 116. Only one camera 116 is depicted in Fig. 1, but this is not meant to be limiting. Any number of cameras 116 may be employed to identify/detect general three-dimensional trajectory 114. Camera 116 may take various forms and employ various technologies. For example, in some embodiments, camera 116 may be a two-dimensional or even a three-dimensional camera that detects light in the visible spectrum. Additionally or alternatively, camera 116 may detect light in the infrared spectrum. In some embodiments, camera 116 may be a thermal imaging camera configured to detect heat.
[0041] Camera 116 may be operated to detect three-dimensional trajectory 114 in various ways. In some embodiments, video frames contained in a signal produced by camera 116 may be analyzed, in real time or after the fact. Video frames may be examined to determine, for instance, a location of single rider 112 at each point in time along a time interval. To capture objects in a field of view of camera 116, in some embodiments, single rider 112 (or multiple motocross participants) may be provided with radio frequency ("RF") beacons (also referred to as "tags"). Camera 116 may be configured with an RF receiver configured to detect radio waves emitted by the RF beacons. By rotating and/or tilting its base, camera 116 may point its field of view towards a location at which camera 116 detects a tag, so that in effect, camera 116 "follows" objects such as single rider 112 as they travel through general three-dimensional trajectory 114. In some such embodiments, general three-dimensional trajectory 114 may be identified and/or detected based on video footage produced by such a camera 116 and/or a path followed by a field of view of camera 116. In other embodiments, camera 116 may remain stationary. An object travelling along general three-dimensional trajectory 114 may pass through a field of view of camera 116. Video frames of that footage may be analyzed using standard object detection image processing techniques (e.g., edge detection) to determine multiple locations of the object along a time line. Together, those multiple locations may reveal general three-dimensional trajectory 114.
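By way of a non-limiting illustration (not part of the original disclosure), the stationary-camera approach above may be sketched in code. The sketch assumes, for simplicity, that per-frame detections have already been converted from pixel locations to timestamped world coordinates; a per-axis quadratic least-squares fit then approximates the ballistic arc of an airborne object such as single rider 112.

```python
# Hypothetical sketch: estimating a general three-dimensional trajectory from
# timestamped object positions (e.g., produced by edge detection on successive
# video frames; the pixel-to-world conversion is assumed to have been done).
# Each axis is fit as a quadratic, approximating a ballistic arc.

def _fit_quadratic(ts, vs):
    """Least-squares fit of v ~ a + b*t + c*t^2 via the 3x3 normal equations."""
    s = [sum(t ** k for t in ts) for k in range(5)]  # sums of t^0 .. t^4
    m = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    rhs = [sum(v * t ** k for t, v in zip(ts, vs)) for k in range(3)]
    # Gaussian elimination (forward elimination, then back substitution).
    for i in range(3):
        for j in range(i + 1, 3):
            f = m[j][i] / m[i][i]
            m[j] = [mj - f * mi for mj, mi in zip(m[j], m[i])]
            rhs[j] -= f * rhs[i]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coeffs[i] = (rhs[i] - sum(m[i][j] * coeffs[j] for j in range(i + 1, 3))) / m[i][i]
    return coeffs  # [a, b, c]

def fit_trajectory(samples):
    """samples: list of (t, (x, y, z)) detections. Returns a callable position(t)."""
    ts = [t for t, _ in samples]
    fits = [_fit_quadratic(ts, [p[axis] for _, p in samples]) for axis in range(3)]
    return lambda t: tuple(a + b * t + c * t * t for a, b, c in fits)
```

For example, feeding in detections sampled along a parabolic jump returns a function that can be evaluated at any intermediate time to interpolate the rider's position within general three-dimensional trajectory 114.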
[0042] Another example of a type of sensor that may be used to detect and/or identify general three-dimensional trajectory 114 is a position sensor. For example, an object such as single rider 112 may be equipped with one or more position sensors, such as accelerometers, gyroscopes, altimeters, global positioning system ("GPS") sensors, etc. These position sensors may record and/or provide a signal indicative of their location while the object travels through midair along general three-dimensional trajectory 114. The signal may then be used to identify and/or detect general three-dimensional trajectory 114. In some embodiments, in addition to or instead of the aforementioned position sensors, one or more objects may be equipped with an RF beacon. One or more RF receivers may be configured to receive an RF signal emitted by the beacon. The signals provided by these receivers may be analyzed, e.g., using techniques such as triangulation, to determine trajectories of the objects (and hence, to identify general three-dimensional trajectory 114). In some embodiments, more than one of the
aforementioned sensors may be used to identify general three-dimensional trajectory 114.
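The RF-receiver technique above may be sketched as follows (a non-limiting illustration, not part of the original disclosure; the receiver coordinates are hypothetical). Shown in two dimensions for brevity: strictly speaking this is trilateration from measured ranges, in which subtracting the first circle equation from the others linearizes the problem into a small linear solve.

```python
# Hypothetical sketch of locating an RF beacon from ranges measured at fixed
# receivers. Subtracting the first receiver's circle equation from the others
# yields a linear system in the beacon's (x, y) position.

def locate_beacon(receivers, ranges):
    """receivers: [(x, y), ...] of at least 3 non-collinear receivers;
    ranges: measured beacon-to-receiver distances. Returns (x, y)."""
    (x0, y0), r0 = receivers[0], ranges[0]
    rows, b = [], []
    for (xi, yi), ri in zip(receivers[1:3], ranges[1:3]):
        # 2(xi - x0)x + 2(yi - y0)y = r0^2 - ri^2 + xi^2 - x0^2 + yi^2 - y0^2
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        b.append(r0 ** 2 - ri ** 2 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2)
    (a, c), (d, e) = rows
    det = a * e - c * d          # nonzero for non-collinear receivers
    x = (b[0] * e - c * b[1]) / det   # Cramer's rule on the 2x2 system
    y = (a * b[1] - d * b[0]) / det
    return x, y
```

With noisy real-world ranges, more than three receivers and a least-squares solve would be used instead; the closed-form version above conveys the principle.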
[0043] Once general three-dimensional trajectory 114 is identified and/or detected, one or more light quality characteristics at one or more points along general three-dimensional trajectory 114 may be identified based on data obtained from the same one or more sensors that were used to identify/detect general three-dimensional trajectory 114, or different sensors. Light quality characteristics within general three-dimensional trajectory 114 may be determined in various ways.
[0044] In some embodiments, one or more cameras 116 may be used to determine one or more light quality characteristics in general three-dimensional trajectory 114 (e.g., after three-dimensional trajectory 114 is identified or at the same time). For example, light reflected off one or more objects passing through general three-dimensional trajectory 114 (e.g., reflected from single rider 112) may be detected and analyzed, in real time based on a video signal or after the fact by analyzing video footage. In some embodiments, a test pattern may be affixed to an object (e.g., on clothes worn by single rider 112) as the object passes through general three-dimensional trajectory 114. The test pattern may be selected depending on which light quality characteristics are of most interest. In other embodiments, a large textile with a test pattern printed on it may be hoisted above general three-dimensional trajectory 114 to form a plane that is parallel to a direction of travel through, and cuts through (e.g., bisects), general three-dimensional trajectory 114. Light reflected from multiple positions on the textile may be analyzed to identify light quality characteristics.
[0045] Sensors other than camera(s) 116 may be used to identify light quality characteristics within general three-dimensional trajectory 114 as well. For example, in some embodiments, objects passing through general three-dimensional trajectory 114 may be equipped with light sensors. These light sensors may be configured to detect one or more properties of light emitted by, for instance, lighting system 100, particularly as the objects pass through general three-dimensional trajectory 114. For example, single rider 112 could be provided with a lux level meter to calculate an intensity of light experienced at multiple points within general three-dimensional trajectory 114.
[0046] In some embodiments, in addition to or instead of providing objects such as motocross participants (e.g., single rider 112) with light sensors, other objects may be moved along general three-dimensional trajectory 114 to identify one or more light quality
characteristics. For example, and referring to Fig. 2, an unmanned aerial vehicle ("UAV," also referred to as a "drone") 120 may be provided with light sensors (not depicted). UAV 120 may then be piloted through general three-dimensional trajectory 114 so that the light sensor(s) can obtain one or more readings at various points. In some embodiments, a user may manually pilot UAV 120 through general three-dimensional trajectory 114. In other embodiments, UAV 120 may be automatically piloted through general three-dimensional trajectory 114, e.g., as a result of a flight plan being provided to UAV 120 or to a remote computing device (not depicted) that controls it.
[0047] Once light quality characteristics along general three-dimensional trajectory 114 are known, one or more computing devices may select one or more properties of light to be emitted by lighting system 100. Lighting system 100 may then be operated to emit light having the one or more selected properties while one or more objects such as motocross participants travel along general three-dimensional trajectory 114. In some embodiments, the computing devices may execute an algorithm that uses current lighting homogeneity information (i.e., the desired lighting homogeneity employed to uniformly illuminate surface 102) to calculate the lighting compensation required to correct for any inhomogeneity along the trajectory.
[0048] Suppose one or more light quality characteristics measured at one or more positions along general three-dimensional trajectory 114 fail to satisfy a criterion, e.g., the object is too dark as it travels through the air. It may be determined that an intensity of emitted light that passes through those points along general three-dimensional trajectory 114 should be increased, at all times or only when an object is passing through general three-dimensional trajectory 114. For example, one or more stadium lights 108 that are aimed at those one or more positions may be operated to emit light having an increased intensity. Additionally or alternatively, one or more stadium lights 108 may be rotated and/or tilted, e.g., by a motor (not depicted), to aim more directly at the one or more positions within general three-dimensional trajectory 114, with or without also adjusting light output intensity.
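The compensation step above may be sketched as follows (a non-limiting illustration, not part of the original disclosure; the function and dictionary names are hypothetical). Each measured lux value along the trajectory is compared against a minimum criterion, and the fixture aimed at a failing point is assigned a proportional intensity boost, clamped to the fixture's maximum output.

```python
# Hypothetical sketch of intensity compensation: where measured lux along the
# trajectory fails a minimum criterion, scale up the output of the stadium
# light aimed at that point, clamped to what the fixture can deliver.

def compensation_factors(samples, target_lux, max_boost=2.0):
    """samples: {fixture_id: measured_lux at the point that fixture illuminates};
    returns per-fixture dimming scales (1.0 = leave unchanged)."""
    scales = {}
    for fixture_id, measured in samples.items():
        if measured >= target_lux:
            scales[fixture_id] = 1.0  # criterion satisfied; leave as-is
        else:
            # Proportional boost toward the target, limited by fixture headroom.
            scales[fixture_id] = min(target_lux / measured, max_boost)
    return scales
```

In practice the mapping from trajectory points to fixtures would come from the lighting system's aiming data, and the boost could be applied only while an object is detected within the trajectory corridor.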
[0049] Fig. 3 schematically depicts various components that may cooperate to perform techniques described herein. A lighting system controller 340 is communicatively coupled with, and configured to control, a plurality of light sources 308-1 to 308-N of a lighting system 300, which in this example are LED-based light sources 308. Lighting system controller 340 is also
communicatively coupled with a trajectory identification engine 342 and a light quality analysis engine 344. As used herein, two components may be "communicatively coupled" if they are capable of exchanging any sort of analog and/or digital signal between them, e.g., if they are capable of exchanging any sort of data, electricity, and so forth. For example, each of components 340, 342, and 344 may be communicatively coupled via one or more computing networks (not depicted) that employ various wired or wireless technologies, including but not limited to Wi-Fi, Ethernet, USB, Serial, Bluetooth, ZigBee, and so forth. While each of components 340, 342, and 344 is depicted separately, this is not meant to be limiting. In various embodiments, one or more aspects of any of these components may be implemented in a single computing device, or across multiple computing devices communicatively coupled with each other. [0050] Trajectory identification engine 342 may be configured to obtain, from one or more sensors, signals indicative of one or more objects passing through an environment. Based on these signals, trajectory identification engine 342 may be configured to identify and/or detect one or more general three-dimensional trajectories followed by the objects, such as general three-dimensional trajectory 114 of Fig. 1. Trajectory identification engine 342 is depicted in Fig. 3 as communicatively coupled with one or more cameras 316, an RF beacon 346, and/or a GPS unit 348, one or more of which may be operated as described above to obtain signals indicative of a general three-dimensional trajectory travelled by multiple objects. In the case of camera 316, and as was mentioned previously, the signal provided to trajectory identification engine 342 may be a real time video signal, or may comprise recorded video frames that may be analyzed after the fact.
[0051] Trajectory identification engine 342 may provide data indicative of a sensed three-dimensional trajectory to light quality analysis engine 344. This data may come in various forms, such as a plurality of three-dimensional position coordinates. In some embodiments, the position coordinates may be accompanied by dimension data indicative of one or more dimensions (e.g., width, height) of a trajectory "corridor" associated with the three-dimensional trajectory. For example, objects such as motocross participants or ski jumpers are unlikely to traverse the exact same trajectory, if for no other reason than the participants will not all be the exact same size, will not likely be travelling at the exact same speed, etc. Accordingly, in some embodiments, a trajectory "corridor" may encompass composite position coordinates from a plurality of travelling objects, with the goal of ensuring that an ultimate calculated trajectory corridor substantially or completely encompasses all objects that were observed to determine the three-dimensional trajectory.
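The corridor computation above may be sketched as follows (a non-limiting illustration, not part of the original disclosure; the function name and binning scheme are hypothetical). Samples from every observed run are binned by normalized progress along the run, and each bin keeps the bounding box of all positions that fell into it, so the resulting corridor encompasses every observed trajectory.

```python
# Hypothetical sketch of building a trajectory "corridor" from composite
# position coordinates of multiple observed runs: bin samples by normalized
# progress (0..1 along each run) and keep a per-bin bounding box.

def corridor(runs, bins=10):
    """runs: list of runs, each a list of (x, y, z) samples in travel order.
    Returns a list of (lo, hi) bounding boxes, one per bin (None if empty)."""
    boxes = [None] * bins
    for run in runs:
        n = len(run)
        for i, point in enumerate(run):
            b = min(int(i / n * bins), bins - 1)  # normalized progress -> bin
            if boxes[b] is None:
                boxes[b] = (list(point), list(point))
            else:
                lo, hi = boxes[b]
                for axis in range(3):
                    lo[axis] = min(lo[axis], point[axis])
                    hi[axis] = max(hi[axis], point[axis])
    return [None if box is None else (tuple(box[0]), tuple(box[1])) for box in boxes]
```

The per-bin width and height of these boxes correspond to the "dimension data" the engine may attach to the position coordinates it passes on.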
[0052] Light quality analysis engine 344 may be configured to identify one or more light quality characteristics at one or more points along the general three-dimensional trajectory based on data from one or more sensors. For example, and as is depicted in Fig. 3, light quality analysis engine 344 is communicatively coupled with one or more cameras 316, which may be the same cameras as were communicatively coupled with trajectory identification engine 342, or may be different cameras. As described above, a video signal and/or recorded video frames obtained by camera 316 may be used by light quality analysis engine 344 to detect one or more properties of light emitted by one or more light sources 308 of lighting system 300. For example, light quality analysis engine 344 may detect light reflected off one or more objects, and/or light reflected off one or more test patterns that are affixed to the objects or that are standalone.
[0053] Additionally or alternatively, light quality analysis engine 344 may be
communicatively coupled with one or more light sensors 350. As described above, light sensors 350 may be affixed to one or more objects that traverse a trajectory of interest, such as single rider 112, UAV 120, or even to a device such as a pole manually hoisted through the trajectory by a person. In Fig. 3, light sensor 350 takes the form of an LED photodiode used to detect light, but this is not limiting. Light sensor 350 may take various other forms, such as a photosensor, photodetector, active-pixel sensor, phototransistor, photovoltaic cell, lux level meter, charge-coupled device, bolometer, and so forth.
[0054] Light quality analysis engine 344 may provide data indicative of the one or more light quality characteristics it detected to various components, such as lighting system
controller 340. Lighting system controller 340 may be configured to select one or more properties of light to be emitted by one or more light sources 308. Based on the light quality characteristics provided by light quality analysis engine 344, lighting system controller 340 may cause one or more light sources 308 to emit light having one or more of the selected properties. Suppose a contrast observed between one or more objects traversing a three-dimensional trajectory and a surface underneath the three-dimensional trajectory is insufficient. This might occur, for example, if ski jumpers are insufficiently discernable from the snow underneath them. In various embodiments, one or more light sources 308 may be controlled to emit light having one or more properties (e.g., an aimed direction, beam width, color, intensity, etc.) selected to increase the contrast. Thus, for instance, ski jumpers may be illuminated with warm white light, whereas the underlying snow may be illuminated with relatively cool white light, to increase contrast between the two. Note that it is not necessary to alter emitted light that will illuminate the actual objects. In some instances, it may be beneficial to leave the light that is cast upon objects as they pass through midair unaltered, and to alter light that illuminates the underlying surface (or walls, or ramps, etc.) instead (e.g., by decreasing intensity, changing hue, etc.).
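The contrast check above may be sketched as follows (a non-limiting illustration, not part of the original disclosure; the function name, threshold, and use of Weber contrast are hypothetical choices). When measured contrast between object and background luminance falls below a criterion, the sketch dims the surface-aimed fixtures rather than altering the light cast on the objects themselves, consistent with the option described above.

```python
# Hypothetical sketch: Weber-contrast check between an object traversing the
# trajectory and the surface beneath it. If contrast is insufficient, return a
# dimming scale for the surface-aimed fixtures; the object lighting is left
# unaltered. Assumes the object is brighter than the background.

def surface_dim_scale(object_lum, background_lum, min_contrast=0.3):
    """Return a dimming scale (<= 1.0) to apply to surface-aimed fixtures."""
    contrast = (object_lum - background_lum) / background_lum  # Weber contrast
    if contrast >= min_contrast:
        return 1.0  # object already discernible; leave the lighting alone
    # Dim the background so the desired Weber contrast is reached:
    # (object - new_bg) / new_bg = min_contrast  =>  new_bg = object / (1 + min_contrast)
    return object_lum / (background_lum * (1.0 + min_contrast))
```

An analogous rule could adjust color temperature instead of intensity, e.g., warming the object illumination while cooling the surface illumination.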
[0055] Fig. 4 schematically depicts an example method 400 of controlling one or more properties of light emitted to illuminate a three-dimensional trajectory, in accordance with various implementations. For convenience, the operations of the flow chart are described with reference to a system that performs the operations. This system may include various components of various computer systems. For instance, some operations may be performed at lighting system controller 340, while other operations may be performed by one or more of trajectory identification engine 342 and/or light quality analysis engine 344. Moreover, while operations of method 400 are shown in a particular order, this is not meant to be limiting. One or more operations may be reordered, omitted or added.
[0056] At block 402, the system may identify a general three-dimensional trajectory to be followed by one or more objects based on data from one or more sensors. At block 404, the system may identify one or more light quality characteristics at one or more points along the general three-dimensional trajectory identified at block 402. At block 406, the system may select one or more properties of light to be emitted by one or more light sources based on the light quality characteristics identified at block 404. At block 408, the system may operate one or more light sources to emit light having the one or more properties of light selected at block 406.
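Blocks 402 through 408 may be tied together as in the following sketch (a non-limiting illustration, not part of the original disclosure; the engine interfaces are stand-ins for components 340, 342, and 344 and are purely illustrative).

```python
# Hypothetical sketch of method 400: identify the trajectory, measure light
# quality along it, select light properties, and operate the light sources.
# The engine/controller objects are illustrative stand-ins for components
# 342, 344, and 340 of Fig. 3.

def run_pipeline(trajectory_engine, quality_engine, controller, sensor_data):
    trajectory = trajectory_engine.identify(sensor_data)        # block 402
    characteristics = quality_engine.analyze(trajectory)        # block 404
    properties = controller.select_properties(characteristics)  # block 406
    controller.emit(properties)                                 # block 408
    return properties
```

As noted above, the same components may be distributed across one or several computing devices; the sketch only fixes the order of operations, not their placement.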
[0057] Fig. 5 is a block diagram of an example computer system 510. Computer system 510 typically includes at least one processor 514 which communicates with a number of peripheral devices via bus subsystem 512. These peripheral devices may include a storage subsystem 524, including, for example, a memory subsystem 525 and a file storage subsystem 526, user interface output devices 520, user interface input devices 522, and a network interface subsystem 516. The input and output devices allow user interaction with computer system 510. Network interface subsystem 516 provides an interface to outside networks and is coupled to corresponding interface devices in other computer systems.

[0058] User interface input devices 522 may include a keyboard, pointing devices such as a mouse, trackball, touchpad, or graphics tablet, a scanner, a touchscreen incorporated into the display, audio input devices such as voice recognition systems, microphones, and/or other types of input devices. In general, use of the term "input device" is intended to include all possible types of devices and ways to input information into computer system 510 or onto a communication network.
[0059] User interface output devices 520 may include a display subsystem, a printer, a fax machine, or non-visual displays such as audio output devices. The display subsystem may include a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), a projection device, or some other mechanism for creating a visible image. The display subsystem may also provide non-visual display such as via audio output devices. In general, use of the term "output device" is intended to include all possible types of devices and ways to output information from computer system 510 to the user or to another machine or computer system.
[0060] Storage subsystem 524 stores programming and data constructs that provide the functionality of some or all of the modules described herein. For example, the storage subsystem 524 may include the logic to perform selected aspects of method 400, as well as one or more of the operations performed by lighting system controller 340, trajectory identification engine 342, light quality analysis engine 344, and so forth.
[0061] These software modules are generally executed by processor 514 alone or in combination with other processors. Memory 525 used in the storage subsystem 524 can include a number of memories including a main random access memory (RAM) 530 for storage of instructions and data during program execution and a read only memory (ROM) 532 in which fixed instructions are stored. A file storage subsystem 526 can provide persistent storage for program and data files, and may include a hard disk drive, a floppy disk drive along with associated removable media, a CD-ROM drive, an optical drive, or removable media cartridges. The modules implementing the functionality of certain implementations may be stored by file storage subsystem 526 in the storage subsystem 524, or in other machines accessible by the processor(s) 514.
[0062] Bus subsystem 512 provides a mechanism for letting the various components and subsystems of computer system 510 communicate with each other as intended. Although bus subsystem 512 is shown schematically as a single bus, alternative implementations of the bus subsystem may use multiple busses.
[0063] Computer system 510 can be of varying types including a workstation, server, computing cluster, blade server, server farm, or any other data processing system or computing device. Due to the ever-changing nature of computers and networks, the description of computer system 510 depicted in Fig. 5 is intended only as a specific example for purposes of illustrating some implementations. Many other configurations of computer system 510 are possible having more or fewer components than the computer system depicted in Fig. 5.
[0064] While several inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.
[0065] All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
[0066] The indefinite articles "a" and "an," as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean "at least one."
[0067] The phrase "and/or," as used herein in the specification and in the claims, should be understood to mean "either or both" of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with "and/or" should be construed in the same fashion, i.e., "one or more" of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the "and/or" clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to "A and/or B", when used in conjunction with open-ended language such as "comprising" can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
[0068] As used herein in the specification and in the claims, "or" should be understood to have the same meaning as "and/or" as defined above. For example, when separating items in a list, "or" or "and/or" shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as "only one of" or "exactly one of," or, when used in the claims, "consisting of," will refer to the inclusion of exactly one element of a number or list of elements. In general, the term "or" as used herein shall only be interpreted as indicating exclusive alternatives (i.e. "one or the other but not both") when preceded by terms of exclusivity, such as "either," "one of," "only one of," or "exactly one of." "Consisting essentially of," when used in the claims, shall have its ordinary meaning as used in the field of patent law.
[0069] As used herein in the specification and in the claims, the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, "at least one of A and B" (or, equivalently, "at least one of A or B," or, equivalently "at least one of A and/or B") can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
[0070] It should also be understood that, unless clearly indicated to the contrary, in any methods claimed herein that include more than one step or act, the order of the steps or acts of the method is not necessarily limited to the order in which the steps or acts of the method are recited.
[0071] In the claims, as well as in the specification above, all transitional phrases such as "comprising," "including," "carrying," "having," "containing," "involving," "holding," "composed of," and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases "consisting of" and "consisting essentially of" shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03. It should be understood that certain expressions and reference signs used in the claims pursuant to Rule 6.2(b) of the Patent Cooperation Treaty ("PCT") do not limit the scope.

Claims

What is claimed is:
1. A method of controlling a plurality of light sources (108), comprising:
identifying (402), based on data obtained from one or more sensors, a general three-dimensional trajectory (114) to be followed by one or more objects through free space of an environment illuminated by the plurality of light sources;
identifying (404), based on data obtained from the one or more sensors, one or more light quality characteristics at one or more points along the general three-dimensional trajectory;
selecting (406) one or more properties of light to be emitted by one or more of the plurality of light sources based on the identified one or more light quality characteristics; and

operating (408) the one or more of the plurality of light sources to emit light having the one or more selected properties while an object travels along the general three-dimensional trajectory.
2. The method of claim 1, wherein the one or more sensors includes a camera (116, 316), and identifying the general three-dimensional trajectory comprises identifying the general three-dimensional trajectory based on video data obtained by the camera.
3. The method of claim 2, wherein the video data depicts an object travelling through the environment.
4. The method of claim 2, wherein the video data depicts a test pattern placed within the general three-dimensional trajectory.
5. The method of claim 4, wherein the test pattern is affixed to an aerial drone (120) that follows the general three-dimensional trajectory.
6. The method of claim 2, further comprising:
equipping an object passing through the three-dimensional trajectory with a radio frequency beacon (346); and
operating the camera to automatically keep the radio frequency beacon within a field of view of the camera.
7. The method of claim 1, wherein identifying the general three-dimensional trajectory comprises identifying the general three-dimensional trajectory based on sensor data provided by a motion sensor (348) attached to an object travelling through the environment.
8. The method of claim 1, wherein at least one of the one or more sensors is affixed to an aerial drone (120).
9. The method of claim 8, wherein identifying the one or more light quality characteristics comprises traversing the aerial drone along the general three-dimensional trajectory.
10. The method of claim 1, wherein at least one of the one or more sensors comprises one or more light sensors (350) affixed to at least one of the one or more objects as it travels through the general three-dimensional trajectory.
11. The method of claim 10, wherein identifying the one or more light quality characteristics is based on data collected by the one or more light sensors affixed to the at least one of the one or more objects.
12. A lighting system, comprising:
a controller (340);
a plurality of light sources (108, 308) communicatively coupled with the controller;

a first group of one or more sensors (316, 346, 348) communicatively coupled with the controller to detect a general three-dimensional trajectory (114) travelled by one or more objects through free space of an environment; and
a second group of one or more sensors (316, 350) communicatively coupled with the controller to measure one or more attributes of light emitted by the one or more light sources into the general three-dimensional trajectory;
wherein the controller selects one or more properties of light to be emitted by at least one of the plurality of light sources based on one or more measured attributes of light, and operates the at least one of the plurality of light sources to emit light having the one or more selected properties at least while an object travels along the general three-dimensional trajectory.
13. The lighting system of claim 12, wherein the first group of one or more sensors includes a camera (116, 316), and wherein the controller is configured to identify the general three-dimensional trajectory based on video data obtained by the camera.
14. The lighting system of claim 12, wherein the first group of one or more sensors includes a motion sensor (348) attached to an object travelling through the environment, wherein the controller is configured to identify the general three-dimensional trajectory based on sensor data provided by the motion sensor.
15. The lighting system of claim 14, wherein the object comprises an aerial drone (120).
PCT/EP2017/051408 2016-01-29 2017-01-24 Controlled lighting for three-dimensional trajectories WO2017129554A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662288505P 2016-01-29 2016-01-29
US62/288,505 2016-01-29
EP16161525.7 2016-03-22
EP16161525 2016-03-22

Publications (1)

Publication Number Publication Date
WO2017129554A1 (en) 2017-08-03

Family

ID=55661217

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/051408 WO2017129554A1 (en) 2016-01-29 2017-01-24 Controlled lighting for three-dimensional trajectories

Country Status (1)

Country Link
WO (1) WO2017129554A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6016038A (en) 1997-08-26 2000-01-18 Color Kinetics, Inc. Multicolored LED lighting method and apparatus
US6211626B1 (en) 1997-08-26 2001-04-03 Color Kinetics, Incorporated Illumination components
WO2009003279A1 (en) * 2007-06-29 2009-01-08 Carmanah Technologies Corp. Intelligent area lighting system
WO2010040197A1 (en) * 2008-10-10 2010-04-15 Institut National D'optique Selective and adaptive illumination of a target
WO2013014582A1 (en) * 2011-07-22 2013-01-31 Koninklijke Philips Electronics N.V. Control unit and method for lighting control
WO2014174412A2 (en) * 2013-04-25 2014-10-30 Koninklijke Philips N.V. Adaptive outdoor lighting control system based on user behavior


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"United States Patent Office Manual of Patent Examining Procedures"


Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17701143; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 EP: PCT application non-entry in European phase (Ref document number: 17701143; Country of ref document: EP; Kind code of ref document: A1)