WO2014035642A1 - Light painting live view - Google Patents

Light painting live view

Info

Publication number: WO2014035642A1 (international application PCT/US2013/054454)
Authority: WO (WIPO PCT)
Prior art keywords: image, camera, display, memory, live view
Priority date: 2012-08-28 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2013-08-12
Publication date: 2014-03-06
Other languages: French (fr)
Inventors: Ryan Harrison WARNBERG, Michelle Kirstin McSWAIN
Original assignee: MRI Lightpainting LLC
Application filed by MRI Lightpainting LLC
Publication of WO2014035642A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof


Abstract

Methods and apparatus, including computer program products, for a light painting live view. A method includes, in a device comprising at least a processor, a memory, a display and a camera device having an on-screen viewfinder, accessing the camera, capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession, rendering the captured frames on a graphical processing unit (GPU), sending the captured frames through a shader program, generating at least two images, a first image saved to the memory and a second image displayed on the display, and rendering the first image into the second image to generate a final image.

Description

LIGHT PAINTING LIVE VIEW

CROSS REFERENCE TO RELATED APPLICATIONS
[001] This application claims the benefit of U.S. Provisional Application No. 61/693,795, filed August 28, 2012. The disclosure of the prior application is considered part of and is incorporated by reference in the disclosure of this application.
BACKGROUND OF THE INVENTION
[002] The present invention generally relates to devices having a camera feature, and more particularly to a light painting live view.
[003] Like cameras, smartphones, such as the Apple iPhone®, Samsung Galaxy®, Blackberry Q10® and the like, and tablet computers running, for example, Google's Android® operating system (O/S) and Apple's iOS® O/S, include among their features built-in cameras for taking photos. Applications executing in the smartphones and tablet computers enable control of the built-in cameras, including light painting.
[004] In general, light painting is a photographic technique, often performed at night or in a dark area, in which a photographer can introduce different lighting elements during a single long-exposure photograph. Light painting enables the capture of light trails, light graffiti tags, and so forth.
SUMMARY OF THE INVENTION
[005] The following presents a simplified summary of the innovation in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is intended to neither identify key or critical elements of the invention nor delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description that is presented later.
[006] The present invention provides methods and apparatus, including computer program products, for a light painting live view.
[007] In general, in one aspect, the invention features a method including, in a device including at least a processor, a memory, a display and a camera device having an on-screen viewfinder, accessing the camera, capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession, rendering the captured frames on a graphical processing unit (GPU), sending the captured frames through a shader program, generating at least two images, a first image saved to the memory and a second image displayed on the display, and rendering the first image into the second image to generate a final image.
[008] In another aspect, the invention features a method including, in a device including at least a processor, a memory, a display and a camera device, executing a light painting live view process in conjunction with the camera to provide a long exposure camera that displays a creation of an exposure in real time.
[009] In still another aspect, the invention features an apparatus including a processor, a memory, a display, and a camera device, the memory including a light painting live view process, the light painting live view process including accessing the camera, capturing individual frames of footage, each of the captured frames being displayed through the onscreen viewfinder in cumulative succession, rendering the captured frames on a graphical processing unit (GPU), sending the captured frames through a shader program, generating at least two images, a first image saved to the memory and a second image displayed on the display, and rendering the first image into the second image to generate a final image.
[0010] These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The invention will be more fully understood by reference to the detailed description, in conjunction with the following figures, wherein:
[0012] FIG. 1 is a block diagram of an exemplary smartphone.
[0013] FIG. 2 is a flow diagram of an exemplary light painting live view process.
DETAILED DESCRIPTION
[0014] The subject innovation is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It may be evident, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the present invention.
[0015] As used in this application, the terms "component," "system," "platform," and the like can refer to a computer-related entity or an entity related to an operational machine with one or more specific functionalities. The entities disclosed herein can be either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers. Also, these components can execute from various computer readable media having various data structures stored thereon. The components may communicate via local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the Internet with other systems via the signal).
[0016] In addition, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or." That is, unless specified otherwise, or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, if X employs A, X employs B, or X employs both A and B, then "X employs A or B" is satisfied under any of the foregoing instances. Moreover, articles "a" and "an" as used in the subject specification and annexed drawings should generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form.
[0017] As shown in FIG. 1, an exemplary device 10 includes at least a processor 15, a memory 20, a display unit 25, a camera 30 and a graphical processing unit (GPU) 35.
Example devices 10 include DSLR cameras, smartphones, tablet computers, personal digital assistants, digital televisions, computers, laptops, devices with an integrated digital camera such as the Nintendo® DS, wearable devices, devices with a digital camera, and so forth. The GPU 35 is an electronic circuit designed to rapidly manipulate and alter memory 20 to accelerate the creation of images in a frame buffer intended for output to the display unit 25.
[0018] The memory 20 can include at least an operating system (O/S) 40, such as Windows®, Linux®, Google's Android®, Apple's iOS®, or a proprietary OS, and a light painting live view process 100.
[0019] Light painting is a photographic technique in which exposures are made by moving a hand-held light source or by moving the camera. The term light painting also encompasses images lit from outside the frame with hand-held light sources. By moving the light source, the light can be used to selectively illuminate parts of the subject or to "paint" a picture by shining it directly into the camera lens. Light painting requires a slow shutter speed, usually a second or more. Light painting can take on the characteristics of a quick pencil sketch.
[0020] Light painting by moving the camera, also called camera painting, is the antithesis of traditional photography. At night, or in a dark room, the camera can be taken off the tripod and used like a paintbrush. An example is using the night sky as the canvas, the camera as the brush and cityscapes (amongst other light sources) as the palette. Putting energy into moving the camera by stroking lights, making patterns and laying down backgrounds can create abstract artistic images.
[0021] Light painting can be done interactively using a webcam. The painted image can already be seen while drawing by using a monitor or projector.
[0022] Another technique used in the creation of light art is the projection of images on to irregular surfaces (faces, bodies, buildings, and so forth), in effect "painting" them with light. A photograph or other fixed portrayal of the resulting image is then made.
[0023] The light painting live view process 100 executes in conjunction with the camera 30 to provide a long exposure camera that displays the creation of the exposure in real time.
[0024] The device 10 can support a variety of applications, such as a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a web browsing application, a digital music player application, and/or a digital video player application.
[0025] The light painting live view process 100 is a light painting application. In light painting, a user can use a light source to draw shapes and patterns in front of a camera set to a long exposure. The light painting live view process 100 enables the user behind the camera 30 within the device 10 (or tablet computer) to watch the shapes or patterns that are being created, as they are being created. In prior approaches, the user must wait until the end of the exposure to see what has been made or created.
[0026] As shown in FIG. 2, the light painting live view process 100 accesses (105) the camera, which captures individual frames of footage, each of the captured frames displayed on a viewfinder in cumulative succession.
[0027] While frames are being captured by the camera, the light painting live view process 100 renders (110) the captured frames on a graphical processing unit (GPU), which is a user-facing camera "viewfinder" feature of the light painting live view process 100.
[0028] For every frame that is being captured to create an image, the light painting live view process 100 also sends (115) the frame through a shader program (also referred to as a vertex and fragment program) to the graphical processing unit (GPU). In general, a shader is a computer program that is used to do shading, produce special effects and/or do postprocessing. Shaders calculate rendering effects on graphics hardware with a high degree of flexibility. Most shaders are coded for a graphics processing unit (GPU), though this is not a strict requirement. The position, hue, saturation, brightness, and contrast of all pixels, vertices, or textures used to construct a final image can be altered on the fly, using algorithms defined in the shader, and can be modified by external variables or textures introduced by the program calling the shader.
[0029] Sending (115) the captured frames through the shader creates two images, one image saved (120) to the device's memory and the other image displayed (125) by the light painting live view process 100 for the user to see as if they were watching a video. The light painting live view process 100 uses frames from the camera as the input of the shader program and a progress frame as the output of the shader program. Through additive blending, one image is rendered (130) into the other by the light painting live view process 100, i.e., the image that is being drawn progressively is rendered to the display.
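This per-frame flow can be illustrated with a short C sketch against the OpenGL ES 2.0 API. It is an illustration only, not the patent's code: the names onCameraFrame, drawFullScreenQuad and drawProgressTextureToScreen are hypothetical helpers, and the max blend equation is written GL_MAX_EXT because OpenGL ES 2.0 exposes it through the EXT_blend_minmax extension (on desktop OpenGL it is simply GL_MAX).

    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    /* Hypothetical helpers assumed to exist elsewhere in the application. */
    void drawFullScreenQuad(void);
    void drawProgressTextureToScreen(void);

    /* One iteration of the live-view loop: blend the newest camera frame
     * into the progress image, then show the progress image on screen. */
    void onCameraFrame(GLuint inputTex, GLuint progressFBO,
                       GLuint shaderProgram, float ambientBrightness)
    {
        /* Render the camera frame into the intermediate output image,
         * keeping the per-channel maximum so light trails accumulate. */
        glBindFramebuffer(GL_FRAMEBUFFER, progressFBO);
        glUseProgram(shaderProgram);
        glUniform1f(glGetUniformLocation(shaderProgram, "u_brightness"),
                    ambientBrightness);
        glEnable(GL_BLEND);
        glBlendEquation(GL_MAX_EXT);  /* GL_MAX on desktop OpenGL */
        glBindTexture(GL_TEXTURE_2D, inputTex);
        drawFullScreenQuad();
        glDisable(GL_BLEND);

        /* Show the accumulated painting on the device's display. */
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        drawProgressTextureToScreen();
    }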
[0030] Once the user signals the light painting live view process 100 to stop, the light painting live view process 100 converts (135) the image rendered into the memory to a Joint Photographic Experts Group (JPEG) file and projects (140) the JPEG file as a final image on the display.
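One plausible realization of this conversion step, sketched in C under the assumption that the intermediate output image's framebuffer is currently bound and that the single-header stb_image_write library (or any equivalent JPEG encoder) is available; the patent does not name a specific encoder.

    #include <stdlib.h>
    #include <GLES2/gl2.h>
    #define STB_IMAGE_WRITE_IMPLEMENTATION
    #include "stb_image_write.h"  /* assumed available; any JPEG encoder works */

    /* Read the currently bound framebuffer (the intermediate output image)
     * and compress it to a JPEG file. Returns nonzero on success. */
    int saveFramebufferAsJpeg(const char *path, int width, int height)
    {
        unsigned char *rgba = malloc((size_t)width * height * 4);
        unsigned char *rgb  = malloc((size_t)width * height * 3);
        if (!rgba || !rgb) { free(rgba); free(rgb); return 0; }

        /* GL_RGBA/GL_UNSIGNED_BYTE is always a legal read format in ES 2.0. */
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, rgba);

        /* Flip vertically (OpenGL's origin is bottom-left, image files are
         * top-down) and drop the alpha channel, since JPEG has no alpha. */
        for (int y = 0; y < height; y++) {
            const unsigned char *src = rgba + (size_t)(height - 1 - y) * width * 4;
            unsigned char *dst = rgb + (size_t)y * width * 3;
            for (int x = 0; x < width; x++) {
                dst[x * 3 + 0] = src[x * 4 + 0];
                dst[x * 3 + 1] = src[x * 4 + 1];
                dst[x * 3 + 2] = src[x * 4 + 2];
            }
        }

        int ok = stbi_write_jpg(path, width, height, 3, rgb, 90 /* quality */);
        free(rgba);
        free(rgb);
        return ok;
    }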
[0031] As described above, a user initiates the light painting live view process 100, which generates a home screen graphical user interface (GUI). The GUI includes a main navigation bar that includes a pictorial rendering of a small camera. When the small camera is tapped, the light painting live view process 100 opens the camera built into the device. The camera screen appears as though it's a video screen, ready for capture. The navigation bar shows a button to tap to begin image capture.
[0032] A video capture session is initiated and anything that passes in front of the camera will leave a trail, similar to a long exposure on a single-lens reflex/digital single-lens reflex (SLR/DSLR) camera. The difference is that the user sees the trail as it is created, in real time, like a mixture of a stop motion video and an Etch-A-Sketch®.
[0033] This is viewed through the viewfinder of the light painting live view process 100, which is a screen that accesses the forward-facing camera on the device. Anything viewed by that camera is seen through the light painting live view process 100 viewfinder.
[0034] Exposures can be set for one second, or they can run as long as the user has memory in their device to store the image/video data. The exposure can also be stopped by tapping the same button used to start the exposure.
[0035] The user can move their camera around to capture trails, or they can make their own trails with a light of their own.
[0036] For every frame that is being captured to create the image, the captured frame is sent through a shader program into the GPU.
[0037] A GL_MAX blend operation, which specifies how source and destination colors are combined, is responsible for producing the light painting, but to control the output a fragment shader program is used. The fragment shader is run on each pixel of an image, producing for each input pixel a corresponding output pixel. The fragment shader supports an "Ambient Light Amount" feature of the capture settings. By taking a brightness parameter between 0 and 1, the fragment shader enables throttling the effect of light input on the painting.
[0038] The following is one example of fragment shader source code:

    precision mediump float;

    varying vec2 v_uv;
    uniform sampler2D u_diffuseTexture;
    uniform float u_brightness;

    void main(void)
    {
        // sample color
        vec4 color = texture2D(u_diffuseTexture, v_uv);
        // calculate luminance intensity
        float lumIntensity = color.x * 0.299 + color.y * 0.587 + color.z * 0.114;
        // clamp and exaggerate luminance intensity
        lumIntensity = min(1.0, lumIntensity);
        lumIntensity = lumIntensity * lumIntensity;
        lumIntensity = max(u_brightness, lumIntensity);
        // output final color
        gl_FragColor = color * lumIntensity;
    }
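The luminance coefficients 0.299, 0.587 and 0.114 in the shader are the standard Rec. 601 luma weights. The C sketch below shows one conventional way such a shader stage might be compiled and how the "Ambient Light Amount" setting could drive the u_brightness uniform; the helper names and the clamping to [0, 1] are illustrative assumptions, not text from the patent.

    #include <stdio.h>
    #include <GLES2/gl2.h>

    /* Compile one shader stage and report errors; a conventional helper. */
    static GLuint compileShader(GLenum stage, const char *source)
    {
        GLuint shader = glCreateShader(stage);
        glShaderSource(shader, 1, &source, NULL);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof log, NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
            glDeleteShader(shader);
            return 0;
        }
        return shader;
    }

    /* Push the "Ambient Light Amount" capture setting into the shader.
     * Clamping to [0, 1] mirrors the brightness range described above. */
    static void setAmbientLightAmount(GLuint program, float amount)
    {
        if (amount < 0.0f) amount = 0.0f;
        if (amount > 1.0f) amount = 1.0f;
        glUseProgram(program);
        glUniform1f(glGetUniformLocation(program, "u_brightness"), amount);
    }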
[0059] The light painting live view process 100 then generates images in stages:
[0060] Image Stages/Names Stage
[0061] 1. Raw Image - this is the image data coming from the device's video camera, frame-by-frame, stored in a buffer managed by the operating system.
[0062] 2. Input Image - this is the image used as an input to the fragment shader program, stored in an OpenGL texture. A texture is an OpenGL Object that contains one or more images that all have the same image format. A texture can be used in two ways. It can be the source of a texture access from a shader, or it can be used as a render target. The raw image is copied into the input image.
[0063] 3. Intermediate Output Image - this is the output of the fragment shader program, stored in an OpenGL texture. The input image is rendered into the intermediate output image, using a custom OpenGL frame buffer backed by an OpenGL texture (see the sketch after this list). In general, frame buffer objects are a mechanism for rendering to images other than the default OpenGL frame buffer. They are OpenGL Objects that allow you to render directly to textures, as well as blitting from one frame buffer to another.
[0064] 4. Preview Image - this is the output of the fragment shader program, shown on the device's display. The input image is rendered to the screen, using the default OpenGL frame buffer backed by the device's display.
[0065] 5. Output Image - this is the output of copying and compressing the data from the intermediate output image to a JPEG representation. The output image may be saved to the device's camera roll, shared via email, Facebook® or Twitter®, or uploaded to a server.
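Stages 2 and 3 correspond to routine OpenGL ES 2.0 object setup. A minimal C sketch follows, assuming RGBA textures; the patent does not specify formats, sizes or filtering, so those choices are illustrative.

    #include <GLES2/gl2.h>

    /* Create the input/intermediate texture pair and a frame buffer object
     * backed by the intermediate texture, as in stages 2 and 3 above. */
    void createPaintingTargets(int w, int h,
                               GLuint *inputTex, GLuint *progressTex,
                               GLuint *progressFBO)
    {
        GLuint tex[2];
        glGenTextures(2, tex);
        for (int i = 0; i < 2; i++) {
            glBindTexture(GL_TEXTURE_2D, tex[i]);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
            /* Allocate storage; camera frames are later copied in. */
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        }
        *inputTex = tex[0];
        *progressTex = tex[1];

        /* Frame buffer object rendering directly into the progress texture. */
        glGenFramebuffers(1, progressFBO);
        glBindFramebuffer(GL_FRAMEBUFFER, *progressFBO);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, *progressTex, 0);
        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
            /* handle error: the texture format is not renderable */
        }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }

Each arriving raw frame (stage 1) would then be uploaded into the input texture, for example with glTexSubImage2D, before the blend pass that produces the intermediate output image (stage 3).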
[0066] Through additive blending, one image is rendered into the other in the order laid out above. The image that is being drawn progressively is rendered to the display.
[0067] Blending Modes Stage
[0068] To produce a light painting, the pixels of the intermediate output image are blended with the pixels of the input image. The output of that blending process is then used to replace the previous value of each pixel of the intermediate output image.
[0069] The OpenGL blend mode "GL_MAX" is used to blend the pixels. The maximum of the two pixel values is the output of the operation.
[0070] The following describes the effect of the GL_MAX blend mode on pixel values (taken from the OpenGL documentation at http://www.opengl.org/sdk/docs/man/xhtml/glBlendEquation.xml):
[0071] Mode: GL_MAX
[0072] RGB components: Rr = max(Rs, Rd), Gr = max(Gs, Gd), Br = max(Bs, Bd)
[0073] Alpha component: Ar = max(As, Ad)
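As a concrete illustration of GL_MAX: blending a source pixel with RGB (0.9, 0.2, 0.1) against a destination pixel holding (0.3, 0.6, 0.1) yields (0.9, 0.6, 0.1). Each channel retains the brightest value it has ever received, which is why a moving light leaves a persistent trail while the dark, static background never accumulates.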
[0080] When all is done, the output image is displayed:
[0081] 5. Output Image - this is the output of copying and compressing the data from the intermediate output image to a JPEG representation. The output image may be saved to the device's camera roll, shared via email, Facebook® or Twitter®, or uploaded to a server.
[0082] While the above describes a particular order of operations performed by certain embodiments of the invention, it should be understood that such order is exemplary, as alternative embodiments may perform the operations in a different order, combine certain operations, overlap certain operations, or the like. References in the specification to a given embodiment indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic.
[0083] While given components of the system have been described separately, one of ordinary skill will appreciate that some of the functions may be combined or shared in given instructions, program sequences, code portions, and the like.
[0084] The foregoing description does not represent an exhaustive list of all possible implementations consistent with this disclosure or of all possible variations of the implementations described. A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the systems, devices, methods and techniques described here. Accordingly, other implementations are within the scope of the following claims.
[0085] What is claimed is:

Claims

1. A method comprising:
in a device comprising at least a processor, a memory, a display and a camera device having an on-screen viewfinder, accessing the camera;
capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession;
rendering the captured frames on a graphical processing unit (GPU);
sending the captured frames through a shader program;
generating at least two images, a first image saved to the memory and a second image displayed on the display; and
rendering the first image into the second image to generate a final image.
2. The method of claim 1 further comprising:
compressing the final image; and
converting the compressed final image to a Joint Photographic Experts Group (JPEG) file.
3. The method of claim 2 further comprising projecting the JPEG file on the display.
4. The method of claim 1 wherein the shader program receives input from the camera and outputs a progress frame.
5. The method of claim 1 wherein the device is a smartphone or tablet computer.
6. The method of claim 1 wherein the device is selected from the group consisting of a DSLR camera, a smartphone, a tablet computer, a personal digital assistant, a digital television, a computer, a laptop, a device with an integrated digital camera, a wearable device, and a device with a digital camera.
7. The method of claim 1 wherein generating the at least two images comprises:
an image/name stage; and
a blending modes stage.
8. The method of claim 7 wherein the image/name stage comprises:
storing image data coming from the camera in a buffer in the memory;
using the stored image as an input to the shader program; and
outputting an intermediate image from the shader program to the display, the intermediate image blended with the input images.
9. The method of claim 8 wherein the blending modes stage comprises:
blending pixels of the intermediate output image with pixels of the input image; and replacing previous values of pixels with pixels of the intermediate output image.
10. The method of claim 8 wherein the input is an OpenGL texture.
11. A method comprising:
in a device comprising at least a processor, a memory, a display and a camera device having an on-screen viewfinder, executing a light painting live view process in conjunction with the camera to provide a long exposure camera that displays a creation of an exposure in real time.
12. The method of claim 11 wherein the device is a smartphone or tablet computer.
13. The method of claim 11 wherein the device is selected from the group consisting of a DSLR camera, a smartphone, a tablet computer, a personal digital assistant, a digital television, a computer, a laptop, a device with an integrated digital camera, a wearable device, and a device with a digital camera.
14. An apparatus comprising:
a processor;
a memory; a display; and
a camera device having an on-screen viewfinder;
the memory comprising a light painting live view process, the light painting live view process comprising:
accessing the camera;
capturing individual frames of footage, each of the captured frames being displayed through the on-screen viewfinder in cumulative succession;
rendering the captured frames on a graphical processing unit (GPU);
sending the captured frames through a shader program;
generating at least two images, a first image saved to the memory and a second image displayed on the display; and
rendering the first image into the second image to generate a final image.
15. The apparatus of claim 14 wherein the light painting live view process further comprises: compressing the final image; and
converting the compressed final image to a Joint Photographic Experts Group (JPEG) file.
16. The apparatus of claim 15 wherein the light painting live view process further comprises projecting the JPEG file on the display.
17. The apparatus of claim 14 wherein the shader program receives input from the camera and outputs a progress frame.
18. The apparatus of claim 14 wherein generating the at least two images comprises:
an image/name stage; and
a blending modes stage.
19. The apparatus of claim 18 wherein the image/name stage comprises:
storing image data coming from the camera in a buffer in the memory;
using the stored image as an input to the shader program; and outputting an intermediate image from the shader program to the display, the intermediate image blended with the input images.
20. The apparatus of claim 19 wherein the blending modes stage comprises:
blending pixels of the intermediate output image with pixels of the input image; and replacing previous values of pixels with pixels of the intermediate output image.
21. The apparatus of claim 19 wherein the input is an OpenGL texture.
PCT/US2013/054454 2012-08-28 2013-08-12 Light painting live view WO2014035642A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261693795P 2012-08-28 2012-08-28
US61/693,795 2012-08-28

Publications (1)

Publication Number Publication Date
WO2014035642A1 2014-03-06

Family

ID=50184134

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/054454 WO2014035642A1 (en) 2012-08-28 2013-08-12 Light painting live view

Country Status (1)

Country Link
WO (1) WO2014035642A1 (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070031062A1 (en) * 2005-08-04 2007-02-08 Microsoft Corporation Video registration and image sequence stitching
US20090273686A1 (en) * 2008-05-02 2009-11-05 Nokia Corporation Methods, computer program products and apparatus providing improved image capturing
CN102497508A (en) * 2011-12-13 2012-06-13 刘桂荣 Practical light painting photo taking method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ROGGE, LAUREN.: "Integration of visual effects into the Virtual Video Camera system", 16 December 2009 (2009-12-16), Retrieved from the Internet <URL:http://www.cg.tu-bs.de/media/publications/integration-visual-effects-virtual-video-camera-system.pdf> [retrieved on 20131027] *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016011859A1 (en) * 2014-07-23 2016-01-28 努比亚技术有限公司 Method for filming light painting video, mobile terminal, and computer storage medium
WO2016011877A1 (en) * 2014-07-23 2016-01-28 努比亚技术有限公司 Method for filming light painting video, mobile terminal, and storage medium
US10129488B2 (en) 2014-07-23 2018-11-13 Nubia Technology Co., Ltd. Method for shooting light-painting video, mobile terminal and computer storage medium
CN104202521A (en) * 2014-08-28 2014-12-10 深圳市中兴移动通信有限公司 Shooting method and shooting device
CN104202521B (en) * 2014-08-28 2016-05-25 努比亚技术有限公司 Image pickup method and filming apparatus
CN105959588A (en) * 2016-05-30 2016-09-21 努比亚技术有限公司 Mobile terminal, light-painted photograph shooting device and method
WO2018119632A1 (en) * 2016-12-27 2018-07-05 深圳市大疆创新科技有限公司 Image processing method, device and equipment
CN114697555A (en) * 2022-04-06 2022-07-01 百富计算机技术(深圳)有限公司 Image processing method, device, equipment and storage medium
CN114697555B (en) * 2022-04-06 2023-10-27 深圳市兆珑科技有限公司 Image processing method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
US9813638B2 (en) Lightpainting live view
US11558558B1 (en) Frame-selective camera
KR101864059B1 (en) Mobile terminal and shooting method thereof
US9264630B2 (en) Method and apparatus for creating exposure effects using an optical image stabilizing device
US10116879B2 (en) Method and apparatus for obtaining an image with motion blur
US9019400B2 (en) Imaging apparatus, imaging method and computer-readable storage medium
US9591347B2 (en) Displaying simulated media content item enhancements on mobile devices
KR101766614B1 (en) Photographing method with slow shutter speed and photographing apparatus thereof
WO2016019770A1 (en) Method, device and storage medium for picture synthesis
US10148880B2 (en) Method and apparatus for video content stabilization
KR20160128366A (en) Mobile terminal photographing method and mobile terminal
JP2018513640A (en) Automatic panning shot generation
US9420181B2 (en) Electronic camera, computer readable medium recording imaging control program thereon and imaging control method
WO2014035642A1 (en) Light painting live view
WO2016011877A1 (en) Method for filming light painting video, mobile terminal, and storage medium
JP2008217785A (en) Display controller and image data converting method
CN114630053B (en) HDR image display method and display device
CN106162024A (en) Photo processing method and device
TW201340705A (en) Image pickup device and image preview system and image preview method thereof
CN103297660A (en) Real-time interaction special effect camera shooting and photographing method
US11792511B2 (en) Camera system utilizing auxiliary image sensors
JP7378963B2 (en) Image processing device, image processing method, and computer program
KR102146853B1 (en) Photographing apparatus and method
TW201722137A (en) Method and related camera device for generating pictures with object moving trace
TW201724838A (en) Method and related camera device for generating pictures with rotation trace

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13832192

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13832192

Country of ref document: EP

Kind code of ref document: A1