WO2013177467A1 - Systems and methods to display rendered images - Google Patents

Systems and methods to display rendered images

Info

Publication number
WO2013177467A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
rendered
images
base
base image
Application number
PCT/US2013/042529
Other languages
French (fr)
Inventor
Jonathan COON
Original Assignee
1-800 Contacts, Inc.
Application filed by 1-800 Contacts, Inc.
Publication of WO2013177467A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/60 Editing figures and text; Combining figures or text
    • G06T 3/14
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/16 Cloth

Definitions

  • FIG. 1 is a block diagram illustrating one embodiment of an environment in which the present systems and methods may be implemented
  • FIG. 2 is a block diagram illustrating one example of the rendered image display module
  • FIG. 3 is a block diagram illustrating one example of how a rendered image and a base image may be layered
  • FIG. 4 is a block diagram illustrating one example of a rendered object movie based on multiple sets of layered images
  • FIG. 5 is a block diagram illustrating one example of an image file that includes more than one rendered image
  • FIG. 6 is a block diagram illustrating one example of how an image file may be used to generate a rendered object movie
  • FIG. 7 is a block diagram illustrating one example of how multiple image files may be used to generate a rendered object movie
  • FIG. 8 is a block diagram illustrating one example of how a single image file with multiple sets of rendered images may be used to generate a rendered object movie
  • FIG. 9 is a block diagram illustrating one example of how a single image file with multiple sets of rendered images may be used to generate disparate rendered object movies;
  • FIG. 10 is a flow diagram illustrating one embodiment of a method to display rendered images
  • FIG. 11 is a flow diagram illustrating another embodiment of a method to display rendered images.
  • FIG. 12 depicts a block diagram of a computer system suitable for implementing the present systems and methods.
  • a computer-implemented method to display a rendered image is described.
  • a base image is obtained.
  • a rendered image is obtained.
  • the rendered image is matched to a location on the base image.
  • the rendered image is overlaid onto the base image at the location to generate a set of layered images.
  • the set of layered images is displayed.
  • the rendered image may include a rendered version of at least a portion of the base image.
  • the location on the base image may correspond to the at least a portion of the base image.
  • the set of layered images may include the base image and the rendered image.
  • the rendered image may cover the at least a portion of the base image.
  • an image file having a plurality of rendered images may be obtained.
  • obtaining a rendered image may include determining a rendered image from the plurality of rendered images based on the base image.
  • obtaining a rendered image may include selecting a pixel area of the image file that corresponds to the determined rendered image.
  • each of the plurality of rendered images may correspond to the same rendering scheme.
  • a computing device configured to display a rendered image is also described.
  • the computing device includes instructions stored in memory that is in electronic communication with a processor.
  • the instructions being executable by the processor to obtain a base image.
  • the instructions also being executable by the processor to obtain a rendered image.
  • the instructions additionally being executable by the processor to match the rendered image to a location on the base image.
  • the instructions also being executable by the processor to overlay the rendered image onto the base image at the location to generate a set of layered images.
  • the instructions further being executable by the processor to display the set of layered images.
  • a computer-program product to display a rendered image is additionally described.
  • the computer-program product includes a non-transitory computer-readable storage medium that stores computer executable instructions.
  • the instructions being executable by the processor to obtain a base image.
  • the instructions also being executable by the processor to obtain a rendered image.
  • the instructions additionally being executable by the processor to match the rendered image to a location on the base image.
  • the instructions also being executable by the processor to overlay the rendered image onto the base image at the location to generate a set of layered images.
  • the instructions further being executable by the processor to display the set of layered images.
  • a base image of the user may be rendered to include the product.
  • a portion of the base image may be rendered. Therefore, a portion of the image may not be rendered.
  • this may allow for efficient storing, transferring, and/or processing of rendered images. This may be particularly beneficial when the same base image is used for displaying multiple renderings (as in the case of virtually trying-on products, for example). For instance, the same base image may be quickly modified to show different renderings by overlaying the appropriate rendering image. In the case that rendered images are obtained from a server, the reduced size of each rendering image and the reuse of the same base image may allow the user to virtually try-on and shop simultaneously (e.g., the products are presented to the user in the context of a rendered virtual try-on).
  • a user may desire to virtually try-on a pair of glasses (so that the user may see how the glasses look on their face/head, for example).
  • the user may provide one or more images of the user's face/head (e.g., the base images).
  • each base image may be rendered with a virtual pair of glasses.
  • a region that includes the rendered pair of glasses (a strip of a user's head that includes the region around the eyes and ears of the user, for example) may be stored as the rendered image.
  • multiple rendered images may be formed. For example, a rendered image may be generated for each pair of glasses.
  • a rendered image may be generated for each position on the face that a particular pair of glasses may be positioned at. This may allow a user to change glasses and/or change the position of a pair of glasses on their face simply by overlaying the proper rendered image on the base image.
  • FIG. 1 is a block diagram illustrating one embodiment of an environment 100 in which the present systems and methods may be implemented.
  • a device 102 may be coupled to a memory 108 through an interface 106.
  • the device 102 may include a rendered image display module 104 that implements the systems and methods described herein.
  • the rendered image display module 104 may access data stored in the memory 108.
  • the rendered image display module 104 may obtain one or more base images 110 and one or more rendered images 112 from the memory 108.
  • the rendered image display module 104 may match a rendered image 112 with a particular portion of a base image 110.
  • the rendered image display module 104 may overlay the rendered image 112 onto the base image 110 to cover the particular portion of the base image 110.
  • the rendered image display module 104 may display the layered images (e.g., the rendered image 112 and the base image 110). In some cases, displaying the set of layered images creates the illusion that the layered images are a single rendered image.
  • the rendered image display module 104 is discussed in greater detail below.
  • the memory 108 may be local to the device 102.
  • the memory 108 may be within the device 102 and/or directly attached to the device 102.
  • the memory 108 may be remote from the device 102.
  • the memory 108 may be hosted by a server.
  • the device 102 may access the one or more base images 110 and the one or more rendered images 112 through a network (via the server, for example).
  • a first memory 108 may be local to the device 102 and a second memory 108 may be remote from the device 102.
  • the rendered image display module 104 may obtain a base image 110 and/or a rendered image 112 from either memory 108.
  • Examples of interface 106 include system buses, serial AT attachment (SATA) interfaces, universal serial bus (USB) interfaces, wired networks, wireless networks, cellular networks, satellite networks, etc. In some cases, the interface 106 may be the internet.
  • FIG. 2 is a block diagram illustrating one example of the rendered image display module 104-a.
  • the rendered image display module 104-a may be one example of the rendered image display module 104 illustrated in FIG. 1.
  • the rendered image display module 104-a may include an obtaining module 202, a matching module 204, a layering module 206, and a displaying module 208.
  • the obtaining module 202 may obtain one or more base images 110 and one or more rendered images 112.
  • the obtaining module 202 may obtain a base image 110.
  • a base image 110 may be associated with a perspective (e.g., an x, y, z orientation).
  • the obtaining module 202 may obtain a rendered image 112 based on the perspective of the base image 110. For example, the obtaining module 202 may obtain a rendered image 112 that corresponds (has the same perspective, for example) to a base image 110.
  • the obtaining module 202 may obtain a base image 110 and/or a rendered image 112 by initiating a read access to the memory 108.
  • the obtaining module 202 may obtain a base image 110 and/or a rendered image 112 by receiving the base image 110 and/or the rendered image 112 from a server.
  • the obtaining module 202 may request the base image 110 and/or the rendered image 112 from the server and may receive the base image 110 and/or the rendered image 112 from the server in response to the request.
  • the obtaining module 202 may obtain one or more image files (image files 502 as illustrated in FIGs. 5, 6, 7, 8, or 9, for example).
  • the obtaining module 202 may obtain the one or more image files from a server as described previously.
  • an image file may include multiple rendered images 112.
  • the obtaining module 202 may determine one or more rendered images 112 from the multiple rendered images 112.
  • the obtaining module 202 may determine a rendered image 112 that corresponds to a base image 110, that corresponds to a particular rendering scheme (associated with a particular set of rendered images, for example), and/or that meets a predetermined criterion.
  • the obtaining module 202 may obtain a rendered image 112 by selecting the determined rendered image 112 from the image file 502.
  • the obtaining module 202 may extract a pixel area from the image file 502 to obtain the rendered image 112.
  • the matching module 204 may match a rendered image 112 to a location in its corresponding base image 110.
  • the matching module 204 may match the determined rendered image 112 to the location in the base image 110 that the rendered image 112 was rendered from.
  • the rendered image 112 may be a rendered version of a portion of the base image.
  • the rendered image 112 may be matched with the corresponding un-rendered portion of the base image 110.
  • the matching module 204 may match the rendered image 112 with the base image 110 so that a layering of the images may create the illusion of a single rendered image.
  • the layering module 206 may overlay a rendered image 112 onto a base image 110.
  • the rendered image 112 may cover a portion of the base image 110.
  • the rendered image 112 may be a rendered version of the portion of the base image 110 that it is covering.
  • the layering module 206 may overlay the rendered image 112 onto the base image 110 based on the determined matching from the matching module 204.
  • the layering module 206 may overlay multiple rendering images 112 onto a single base image 110.
  • the displaying module 208 may display the overlaying rendered image 112 and the underlying base image 110.
  • the layered images create the illusion that the base image 110 has been rendered.
  • FIG. 3 is a block diagram illustrating one example of how a rendered image 112 and a base image 110 may be layered.
  • the rendered image 112 may cover a portion of the base image 306.
  • the rendered image 112 may be a rendered version of a portion of the base image 306.
  • the rendered image 112 may be matched with the base image 110 so that the rendered image 112 may cover the portion of the base image 306 that it was rendered from.
  • the pixel location 302 of the rendered image 112 may correspond to a rendered version of the pixel location 304 of the base image 110.
  • the rendered image 112 may be matched so that the rendered image 112 may cover the portion of the base image 306.
  • pixel location 302 may be matched to pixel location 304 so that the rendered image 112 may cover the portion of the base image 306.
  • the rendered image 112 may be layered over the base image 110 based on the determined matching.
  • the rendered image 112 may be overlaid onto the base image 110 so that the rendered image 112 covers the portion of the base image 306 that the rendered image 112 was rendered from.
  • although FIG. 3 illustrates the case of layering one rendered image 112 onto a base image 110, more than one rendered image 112 may be layered onto a single base image 110.
  • one or more rendered images 112 that are layered with a base image 110 may be referred to as a set of layered images 308.
  • the set of layered images 308 may create the illusion of a single rendered image.
  • the set of layered images 308 may provide the same effect as if the entire base image 110 were rendered as a single image.
  • FIG. 4 is a block diagram illustrating one example of a rendered object movie 402 based on multiple sets of layered images 308.
  • each set of layered images 308 may be an example of the set of layered images 308 illustrated in FIG. 3.
  • different base images 110 may depict an object in different orientations.
  • a first base image 110-a-1 may depict the object in a first orientation
  • the second base image 110-a-2 may depict the object in a second orientation
  • the nth base image 110-a-n may depict the object in an nth orientation.
  • these base images 110-a-n may be ordered to create an object movie of the object.
  • the base images 110-a-n may be ordered so that the orientation of the object appears to rotate as two or more consecutive base images 110 are cycled.
  • each rendered image 112 may correspond to a particular base image 110.
  • a first rendered image 112-a-1 may correspond to the first base image 110-a-1
  • the second rendered image 112-a-2 may correspond to the second base image 110-a-2
  • the nth rendered image 112-a-n may correspond to the nth base image 110-a-n.
  • each rendered image 112 may be a rendered version of a portion of its corresponding base image 110.
  • a first set of layered images 308-a-1 may include the first rendered image 112-a-1 and the first base image 110-a-1
  • a second set of layered images 308-a-2 may include the second rendered image 112-a-2 and the second base image 110-a-2
  • an nth set of layered images 308-a-n may include the nth rendered image 112-a-n and the nth base image 110-a-n.
  • the first set of layered images 308-a-1, the second set of layered images 308-a-2, and the nth set of layered images 308-a-n may be ordered to form a rendered object movie 402.
  • the rendered object movie 402 may create the illusion of an object movie of rendered base images.
  • the systems and methods described herein may be used for virtually trying-on glasses.
  • the base images 110 may be images of a user's face/head in various orientations.
  • a first base image 110-a-1 may be an image of a user's face/head in a left facing orientation
  • a second base image 110-a-2 may be an image of a user's face/head in a center facing orientation
  • a third base image 110-a-3 may be an image of a user's face/head in a right facing orientation.
  • each rendered image 112 may be a rendered version of a portion of a base image 110.
  • the first rendered image 112-a-1 may include a portion of the user's face/head with a rendered set of glasses (rendered based on the left facing orientation of the first base image 110-a-1, for example)
  • the second rendered image 112-a-2 may include a portion of the user's face/head with the rendered set of glasses (rendered based on the center facing orientation of the second base image 110-a-2)
  • the third rendered image 112-a-3 may include a portion of the user's face/head with the rendered set of glasses (rendered based on the right facing orientation of the third base image 110-a-3).
  • the portion of the user's face/head may be a strip that includes the eye and ear regions of the user's face/head.
  • each rendered image 112 may be layered with its corresponding base image 110 to create sets of layered images 308.
  • these sets of layered images 308 may be ordered to create a rendered object movie 402 that allows the rendered glasses to be viewed on the user's face/head from the various orientations of the underlying base images 110.
  • base images 110 may be used as the basis for a rendered object movie 402 (of a virtual try-on, for example).
  • base images 110-a-1 through 110-a-8 may be images of the user's face/head in various (e.g., decreasingly) left facing orientations
  • the ninth base image 110-a-9 may be an image of the user's face/head in a center facing orientation
  • base images 110-a-10 through 110-a-17 may be images of the user's face/head in various (e.g., increasingly) right facing orientations.
  • rendered images 112 based on these base images 110 and layered on their corresponding base images 110 may enable the efficient creation of a rendered object movie 402 (e.g., virtual try-on). In some cases, more (or fewer) than seventeen base images 110 may be used.
  • FIG. 5 is a block diagram illustrating one example of an image file 502 that includes more than one rendered image 112.
  • the image file 502 may include at least one rendered image 112 for each base image 110.
  • the first rendered image 112-a-1 may be a rendered version of a portion of the first base image 110-a-1
  • the second rendered image 112-a-2 may be a rendered version of a portion of the second base image 110-a-2
  • the nth rendered image 112-a-n may be a rendered version of a portion of the nth base image 110-a-n.
  • the first rendered image 112-a-1 may be a rendered image 112 for the first base image 110-a-1
  • the second rendered image 112-a-2 may be a rendered image 112 for the second base image 110-a-2
  • the nth rendered image 112-a-n may be a rendered image 112 for the nth base image 110-a-n.
  • disparate pixel areas of the image file 502 may correspond to disparate rendered images 112.
  • a first pixel area 504-a-1 of the image file 502 may correspond to a first rendered image 112-a-1
  • a second pixel area 504-a-2 of the image file 502 may correspond to a second rendered image 112-a-2
  • an nth pixel area 504-a-n of the image file 502 may correspond to an nth rendered image 112-a-n.
  • each pixel area 504 may be a strip that is 80 pixels high and as wide as the image of the image file 502.
  • the first rendered image 112-a-1 may be obtained by selecting pixels 0-80 from the image file 502
  • the second rendered image 112-a-2 may be obtained by selecting pixels 80-160 from the image file 502, and so forth.
  • the dimensions and/or location of a pixel area 504 in the image file 502 may differ.
  • one or more configuration settings may define the dimensions and/or location of a pixel area 504 that should be selected to obtain a particular rendered image 112.
  • the location of a particular rendered image 112 may be consistent across multiple image files 502.
  • an image file 502 may include one or more sets of rendered images.
  • a set of rendered images may allow an entire object movie to be rendered with a particular rendering scheme (e.g., a first style of glasses in a first position, a second style of glasses in a first position, a first style of glasses in a second position, etc.).
  • each rendered image 112 in the set of rendered images may apply the particular rendering scheme to its corresponding base image 110.
  • a rendered object movie 402 may be rendered based on a single image file 502.
  • FIG. 6 is a block diagram illustrating one example of how an image file 502 may be used to generate a rendered object movie 402.
  • each of the rendered images 112 in the image file 502 is part of a set of rendered images 112-a.
  • a set of rendered images 112-a may include a rendered image 112 for each base image 110 that applies a particular rendering scheme to each base image 110.
  • the first rendered image 112-a-1 may apply a first rendering scheme to the first base image 110-a-1
  • the second rendered image 112-a-2 may apply the first rendering scheme to the second base image 110-a-2
  • the nth rendered image 112-a-n may apply the first rendering scheme to the nth base image 110-a-n.
  • the first rendering scheme may be to render a particular pair of glasses in a particular position. In some cases this may allow for the creation of a rendered object movie 402 with a particular rendering scheme using a single image file 502.
  • a first set of layered images 308-a-1 may be obtained by covering a portion of the first base image 110-a-1 with the first rendered image 112-a-1
  • a second set of layered images 308-a-2 may be obtained by covering a portion of the second base image 110-a-2 with the second rendered image 112-a-2
  • an nth set of layered images 308-a-n may be obtained by covering a portion of the nth base image 110-a-n with the nth rendered image 112-a-n.
  • the sets of layered images may be ordered to create a rendered object movie 402.
  • the rendered object movie 402 may be an object movie with a single rendering scheme.
  • the rendered object movie 402 may be an object movie of a virtual try-on of a pair of glasses.
  • the first rendered image 112-a-1 may include a particular pair of glasses rendered in a first orientation at a particular position on the user's face/head
  • the second rendered image 112-a-2 may include the particular pair of glasses rendered in a second orientation at the particular position on the user's face/head
  • the nth rendered image 112-a-n may include the particular pair of glasses rendered in an nth orientation at the particular position on the user's face/head.
  • the rendered images 112 for a rendered object movie 402 of a particular product at a particular position may be stored in and/or extracted from a single image file 502.
  • FIG. 7 is a block diagram illustrating one example of how multiple image files 502 may be used to generate a rendered object movie 402.
  • a first image file 502-a-1 may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second image file 502-a-2 may include a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).
  • a first rendering scheme may be selected (e.g., a first pair of glasses rendered in a first position).
  • a first set of rendered images 112-a may be used to generate each set of layered images 308.
  • a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 from the first image file 502-a-1 onto the first base image 110-a-1.
  • the rendering scheme may be changed.
  • a second rendering scheme may subsequently be selected.
  • a second set of rendered images 112-b may be used to generate each set of layered images 308.
  • a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 from the second image file 502-a-2 onto the second base image 110-a-2
  • an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image from the second image file 502-a-2 onto the nth base image 110-a-n.
  • the first set of layered images 308-a-1 is based on the first set of rendered images 112-a and the second and nth sets of layered images 308-b-2, 308-b-n are based on the second set of rendered images 112-b.
  • each of the sets of layered images 308 may be based on the first set of rendered images 112-a and, after the change in rendering schemes, each of the sets of layered images 308 may be based on the second set of rendered images 112-b.
  • a user may change glasses positions during the display of the rendered object movie 402.
  • the sets of layered images 308 may be based on the first glasses position and, after the change in position, the sets of layered images 308 may be based on the second glasses position.
  • FIG. 8 is a block diagram illustrating one example of how a single image file 502-b with multiple sets of rendered images may be used to generate a rendered object movie 402.
  • an image file 502-b may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).
  • a first rendering scheme may be selected (e.g., a first pair of glasses rendered in a first position).
  • a first set of rendered images 112-a may be used to generate each set of layered images 308.
  • a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 onto the first base image 110-a-1.
  • the rendering scheme may be changed.
  • a second rendering scheme may subsequently be selected.
  • a second set of rendered images 112-b may be used to generate each set of layered images 308.
  • a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 onto the second base image 110-a-2, and an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image onto the nth base image 110-a-n.
  • the first set of layered images 308-a-1 is based on the first set of rendered images 112-a and the second and nth sets of layered images 308-b-2, 308-b-n are based on the second set of rendered images 112-b.
  • each of the sets of layered images 308 may be based on the first set of rendered images 112-a and, after the change in rendering schemes, each of the sets of layered images 308 may be based on the second set of rendered images 112-b.
  • multiple sets of rendered images may be included in the same image file 502-b.
  • FIG. 9 is a block diagram illustrating one example of how a single image file 502 with multiple sets of rendered images may be used to generate disparate rendered object movies 402.
  • the image file 502-b may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).
  • a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 from the first set of rendered images 112-a onto the first base image 110-a-1
  • a second set of layered images 308-a-2 may be obtained by overlaying the second rendered image 112-a-2 from the first set of rendered images 112-a onto the second base image 110-a-2
  • an nth set of layered images 308-a-n may be obtained by overlaying the nth rendered image from the first set of rendered images 112-a onto the nth base image 110-a-n.
  • a first set of layered images 308-b-1 may be obtained by overlaying the first rendered image 112-b-1 from the second set of rendered images 112-b onto the first base image 110-a-1
  • a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 from the second set of rendered images 112-b onto the second base image 110-a-2
  • an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image from the second set of rendered images 112-b onto the nth base image 110-a-n.
  • the same base images 110 may be used for disparate rendered object movies 402 because the particular rendered images 112 are overlaid onto the base images 110.
  • the first set of rendered images 112-a may have a first rendering scheme (e.g., a first pair of glasses in a first position) and the second set of rendered images 112-b may have a second rendering scheme (e.g., a first pair of glasses in a second position).
  • the first rendered object movie 402-a-1 may be a virtual try-on for the first pair of glasses in the first position and the second rendered object movie 402-a-2 may be a virtual try-on for the first pair of glasses in the second position.
  • FIG. 10 is a flow diagram illustrating one embodiment of a method 1000 to display rendered images.
  • the method 1000 may be implemented by the rendered image display module 104 illustrated in FIGS. 1 or 2.
  • a base image may be obtained.
  • a rendered image may be obtained.
  • the rendered image may be matched to a location on the base image. For example, the rendered image may be matched to a pixel location on the base image so that the rendered image covers the portion of the base image that the rendered image was rendered from.
  • the rendered image may be overlaid onto the base image at the location to generate a set of layered images.
  • the set of layered images may be displayed.
  • FIG. 11 is a flow diagram illustrating another embodiment of a method 1100 to display rendered images.
  • the method 1100 may be implemented by the rendered image display module 104 illustrated in FIGS. 1 or 2.
  • a base image having a first perspective may be obtained.
  • an image file having a plurality of rendered images may be obtained.
  • a rendered image may be selected from the plurality of rendered images based at least upon the first perspective. For example, the rendered image may be rendered based on the first perspective.
  • the rendered image may be matched to a location on the base image.
  • the rendered image may be overlaid onto the base image at the location to generate a set of layered images.
  • the set of layered images may be displayed.
  • FIG. 12 depicts a block diagram of a computer system 1210 suitable for implementing the present systems and methods.
  • Computer system 1210 includes a bus 1212 which interconnects major subsystems of computer system 1210, such as a central processor 1214, a system memory 1217 (typically RAM, but which may also include ROM, flash RAM, or the like), an input/output controller 1218, an external audio device, such as a speaker system 1220 via an audio output interface 1222, an external device, such as a display screen 1224 via display adapter 1226, a keyboard 1232 (interfaced with a keyboard controller 1233) (or other input device), multiple USB devices 1292 (interfaced with a USB controller 1291), and a storage interface 1234. Also included are a mouse 1246 (or other point-and-click device) and a network interface 1248 (coupled directly to bus 1212).
  • Bus 1212 allows data communication between central processor 1214 and system memory 1217, which may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown), as previously noted.
  • the RAM is generally the main memory into which the operating system and application programs are loaded.
  • the ROM or flash memory can contain, among other code, the Basic Input-Output system (BIOS) which controls basic hardware operation such as the interaction with peripheral components or devices.
  • the rendered image display module 104 to implement the present systems and methods may be stored within the system memory 1217.
  • Applications resident with computer system 1210 are generally stored on and accessed via a non-transitory computer readable medium, such as a hard disk drive (e.g., fixed disk 1244) or other storage medium. Additionally, applications can be in the form of electronic signals modulated in accordance with the application and data communication technology when accessed via interface 1248.
  • Storage interface 1234 can connect to a standard computer readable medium for storage and/or retrieval of information, such as a fixed disk drive 1244.
  • Fixed disk drive 1244 may be a part of computer system 1210 or may be separate and accessed through other interface systems.
  • Network interface 1248 may provide a direct connection to a remote server via a direct network link to the Internet via a POP (point of presence).
  • Network interface 1248 may provide such connection using wireless techniques, including digital cellular telephone connection, Cellular Digital Packet Data (CDPD) connection, digital satellite data connection, or the like.
  • Many other devices or subsystems (not shown) may be connected in a similar manner (e.g., document scanners, digital cameras, and so on). Conversely, all of the devices shown in FIG. 12 need not be present to practice the present systems and methods.
  • the devices and subsystems can be interconnected in different ways from that shown in FIG. 12.
  • the operation of a computer system such as that shown in FIG. 12 is readily known in the art and is not discussed in detail in this application.
  • Code to implement the present disclosure can be stored in a non-transitory computer-readable medium such as one or more of system memory 1217 or fixed disk 1244.
  • the operating system provided on computer system 1210 may be iOS®, MS-DOS®, MS-WINDOWS®, OS/2®, UNIX®, Linux®, or another known operating system.

Abstract

A computer-implemented method to display a rendered image is described. A base image is obtained. A rendered image is obtained. The rendered image is matched to a location on the base image. The rendered image is overlaid onto the base image at the location to generate a set of layered images. The set of layered images is displayed.

Description

SYSTEMS AND METHODS TO DISPLAY RENDERED IMAGES
RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Application No.
61/650,983 entitled SYSTEMS AND METHODS TO VIRTUALLY TRY-ON PRODUCTS, filed on May 23, 2012; and U.S. Application No. 13/662,118 entitled SYSTEMS AND METHODS TO DISPLAY RENDERED IMAGES, filed on October 26, 2012, both of which are incorporated herein in their entirety by this reference.
BACKGROUND
[0002] The use of computer systems and computer-related technologies continues to increase at a rapid pace. This increased use of computer systems has influenced the advances made to computer-related technologies. Indeed, computer systems have increasingly become an integral part of the business world and the activities of individual consumers. For example, computers have opened up an entire industry of internet shopping. In many ways, online shopping has changed the way consumers purchase products. However, in some cases, consumers may avoid shopping online. For example, it may be difficult for a consumer to know if they will look good in and/or with a product without seeing themselves in and/or with the product. In many cases, this challenge may deter a consumer from purchasing a product online. Therefore, improving the online shopping experience may be desirable. In various situations, it may be desirable to render an image.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the instant disclosure.
[0004] FIG. 1 is a block diagram illustrating one embodiment of an environment in which the present systems and methods may be implemented;
[0005] FIG. 2 is a block diagram illustrating one example of the rendered image display module;
[0006] FIG. 3 is a block diagram illustrating one example of how a rendered image and a base image may be layered;
[0007] FIG. 4 is a block diagram illustrating one example of a rendered object movie based on multiple sets of layered images;
[0008] FIG. 5 is a block diagram illustrating one example of an image file that includes more than one rendered images;
[0009] FIG. 6 is a block diagram illustrating one example of how an image file may be used to generate a rendered object movie;
[0010] FIG. 7 is a block diagram illustrating one example of how multiple image files may be used to generate a rendered object movie;
[0011] FIG. 8 is a block diagram illustrating one example of how a single image file with multiple sets of rendered images may be used to generate a rendered object movie;
[0012] FIG. 9 is a block diagram illustrating one example of how a single image file with multiple sets of rendered images may be used to generate disparate rendered object movies;
[0013] FIG. 10 is a flow diagram illustrating one embodiment of a method to display rendered images;
[0014] FIG. 11 is a flow diagram illustrating another embodiment of a method to display rendered images; and
[0015] FIG. 12 depicts a block diagram of a computer system suitable for implementing the present systems and methods.
[0016] While the embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the instant disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
DISCLOSURE OF THE INVENTION
[0017] A computer-implemented method to display a rendered image is described. A base image is obtained. A rendered image is obtained. The rendered image is matched to a location on the base image. The rendered image is overlaid onto the base image at the location to generate a set of layered images. The set of layered images is displayed.
[0018] The rendered image may include a rendered version of at least a portion of the base image. The location on the base image may correspond to the at least a portion of the base image.
[0019] The set of layered images may include the base image and the rendered image. In some cases, the rendered image may cover the at least a portion of the base image.
[0020] In some configurations, an image file having a plurality of rendered images may be obtained. In one example, obtaining a rendered image may include determining a rendered image from the plurality of rendered images based on the base image. In another example, obtaining a rendered image may include selecting a pixel area of the image file that corresponds to the determined rendered image. In some cases, each of the plurality of rendered images may correspond to the same rendering scheme.
[0021] A computing device configured to display a rendered image is also described. The computing device includes instructions stored in memory that is in electronic communication with a processor. The instructions being executable by the processor to obtain a base image. The instructions also being executable by the processor to obtain a rendered image. The instructions additionally being executable by the processor to match the rendered image to a location on the base image. The instructions also being executable by the processor to overlay the rendered image onto the base image at the location to generate a set of layered images. The instructions further being executable by the processor to display the set of layered images.
[0022] A computer-program product to display a rendered image is additionally described. The computer-program product includes a non-transitory computer-readable storage medium that stores computer executable instructions. The instructions being executable by the processor to obtain a base image. The instructions also being executable by the processor to obtain a rendered image. The instructions additionally being executable by the processor to match the rendered image to a location on the base image. The instructions also being executable by the processor to overlay the rendered image onto the base image at the location to generate a set of layered images. The instructions further being executable by the processor to display the set of layered images.
BEST MODE(S) FOR CARRYING OUT THE INVENTION
[0023] In various situations, it may be desirable to render an image. For example, it may be desirable for a user to virtually try-on a product so that the user may see what they would look like in and/or with the product. In this example, a base image of the user may be rendered to include the product. Typically, only a portion of the base image may be rendered. Therefore, a portion of the image may not be rendered. In these situations, it may be desirable to store only the rendered portion of the image as a rendered image. Alternatively, it may be desirable to store the region that includes the rendered portion of the image as the rendered image. This may allow the full rendered image to be created by overlaying the rendered image onto the base image at the location where the rendered image was rendered from.
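As a concrete illustration of the overlay idea above, the following sketch pastes a stored rendered strip back onto an unmodified base image at a recorded offset. It is a minimal example using the Pillow imaging library, not the patent's implementation; the file names, the RGBA format, and the recorded offset are assumptions made for illustration.

```python
# Minimal sketch (not the patent's implementation): store only the rendered
# strip plus the offset it was rendered from, then rebuild the full view by
# pasting the strip back onto the unmodified base image.
# Requires Pillow: pip install pillow. File names are hypothetical.
from PIL import Image

def overlay_rendered_strip(base_path, strip_path, offset):
    """Return the base image with the rendered strip pasted at `offset`.

    `offset` is the (x, y) pixel location on the base image that the strip
    was rendered from, e.g. recorded when the strip was generated.
    """
    base = Image.open(base_path).convert("RGBA")
    strip = Image.open(strip_path).convert("RGBA")
    layered = base.copy()                # keep the original base image intact
    layered.paste(strip, offset, strip)  # third argument uses the strip's alpha as a mask
    return layered

# Example usage with hypothetical files: a head shot and an 80-pixel-high
# strip that contains the rendered glasses region.
# layered = overlay_rendered_strip("head_center.png", "glasses_strip_center.png", (0, 240))
# layered.show()
```

Because only the strip is stored, switching to a different rendering (a different pair of glasses, or the same pair in a different position) amounts to pasting a different strip onto the same cached base image.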
[0024] In some configurations, this may allow for efficient storing, transferring, and/or processing of rendered images. This may be particularly beneficial when the same base image is used for displaying multiple renderings (as in the case of virtually trying-on products, for example). For instance, the same base image may be quickly modified to show different renderings by overlaying the appropriate rendering image. In the case that rendered images are obtained from a server, the reduced size of each rendering image and the reuse of the same base image may allow the user to virtually try-on and shop simultaneously (e.g., the products are presented to the user in the context of a rendered virtual try-on).
[0025] In one example, a user may desire to virtually try-on a pair of glasses (so that the user may see how the glasses look on their face/head, for example). The user may provide one or more images of the user's face/head (e.g., the base images). In some configurations, each base image may be rendered with a virtual pair of glasses. In one example, a region that includes the rendered pair of glasses (a strip of a user's head that includes the region around the eyes and ears of the user, for example) may be stored as the rendered image. In some configurations, multiple rendered images may be formed. For example, a rendered image may be generated for each pair of glasses. As another example, a rendered image may be generated for each position on the face that a particular pair of glasses may be positioned at. This may allow a user to change glasses and/or change the position of a pair of glasses on their face simply by overlaying the proper rendered image on the base image.
[0026] Referring now to the figures, FIG. 1 is a block diagram illustrating one embodiment of an environment 100 in which the present systems and methods may be implemented. In one embodiment, a device 102 may be coupled to a memory 108 through an interface 106. In some configurations, the device 102 may include a rendered image display module 104 that implements the systems and methods described herein.
[0027] In some configurations, the rendered image display module 104 may access data stored in the memory 108. For example, the rendered image display module 104 may obtain one or more base images 110 and one or more rendered images 112 from the memory 108. In some configurations, the rendered image display module 104 may match a rendered image 112 with a particular portion of a base image 110. In some configurations, the rendered image display module 104 may overlay the rendered image 112 onto the base image 110 to cover the particular portion of the base image 110. In some configurations, the rendered image display module 104 may display the layered images (e.g., the rendered image 112 and the base image 110). In some cases, displaying the set of layered images creates the illusion that the layered images are a single rendered image. The rendered image display module 104 is discussed in greater detail below.
[0028] In one embodiment, the memory 108 may be local to the device 102. For example, the memory 108 may be within the device 102 and/or directly attached to the device 102. In another embodiment, the memory 108 may be remote from the device 102. For example, the memory 108 may be hosted by a server. For instance, the device 102 may access the one or more base images 110 and the one or more rendered images 112 through a network (via the server, for example). In yet another embodiment, a first memory 108 may be local to the device 102 and a second memory 108 may be remote from the device 102. In this case, the rendered image display module 104 may obtain a base image 110 and/or a rendered image 112 from either memory 108. Examples of interface 106 include system buses, serial AT attachment (SATA) interfaces, universal serial bus (USB) interfaces, wired networks, wireless networks, cellular networks, satellite networks, etc. In some cases, the interface 106 may be the internet.
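For the remote-memory case, a device might fetch the base image once and then request individual rendered images over the network as needed. The sketch below shows one hedged way this could look using HTTP; the server URL, endpoint layout, and file names are hypothetical, since the document does not specify a transport.

```python
# Hedged sketch of obtaining images from a remote memory (a server), one way
# the "remote" case above could look. The URL and endpoint layout are
# hypothetical; the document does not specify a transport or API.
import io

import requests
from PIL import Image

BASE_URL = "https://example.com/try-on"  # hypothetical server

def fetch_image(name):
    """Download an image by name and return it as a Pillow image."""
    response = requests.get(f"{BASE_URL}/{name}", timeout=10)
    response.raise_for_status()
    return Image.open(io.BytesIO(response.content)).convert("RGBA")

# The same call works for base images and for rendered strips, so the base
# image can be fetched once and cached while different strips are requested.
# base = fetch_image("head_center.png")
# strip = fetch_image("glasses_style1_center.png")
```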
[0029] FIG. 2 is a block diagram illustrating one example of the rendered image display module 104-a. The rendered image display module 104-a may be one example of the rendered image display module 104 illustrated in FIG. 1. In some configurations, the rendered image display module 104-a may include an obtaining module 202, a matching module 204, a layering module 206, and a displaying module 208.
[0030] In one embodiment, the obtaining module 202 may obtain one or more base images 110 and one or more rendered images 112. In one example, the obtaining module 202 may obtain a base image 110. In some cases, a base image 110 may be associated with a perspective (e.g., an x, y, z orientation). In one example, the obtaining module 202 may obtain a rendered image 112 based on the perspective of the base image 110. For example, the obtaining module 202 may obtain a rendered image 112 that corresponds (has the same perspective, for example) to a base image 110.
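One simple way to obtain a rendered image that corresponds to a base image's perspective is to key the available rendered images by an orientation label. The mapping below is an assumption made for illustration; the document only requires that the rendered image share the base image's perspective.

```python
# Sketch of selecting a rendered image that corresponds to a base image's
# perspective. Keying strips by an orientation label is an assumption made
# for illustration; the document only requires that the perspectives match.
RENDERED_STRIPS = {
    # orientation label -> file holding the rendered strip for that view
    "left":   "glasses_strip_left.png",
    "center": "glasses_strip_center.png",
    "right":  "glasses_strip_right.png",
}

def rendered_strip_for(base_perspective):
    """Return the rendered strip that shares the base image's perspective."""
    try:
        return RENDERED_STRIPS[base_perspective]
    except KeyError:
        raise ValueError(f"no rendered image for perspective {base_perspective!r}")
```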
[0031] In one example, the obtaining module 202 may obtain a base image 110 and/or a rendered image 112 by initiating a read access to the memory 108. In another example, the obtaining module 202 may obtain a base image 110 and/or a rendered image 112 by receiving the base image 110 and/or the rendered image 112 from a server. In some cases, the obtaining module 202 may request the base image 110 and/or the rendered image 112 from the server and may receive the base image 110 and/or the rendered image 112 from the server in response to the request.
[0032] In one embodiment, the obtaining module 202 may obtain one or more image files (image files 502 as illustrated in FIGs. 5, 6, 7, 8, or 9, for example). For example, the obtaining module 202 may obtain the one or more image files from a server as described previously. In some cases, an image file may include multiple rendered images 112. In these cases, the obtaining module 202 may determine one or more rendered images 112 from the multiple rendered images 112. For example, the obtaining module 202 may determine a rendered image 112 that corresponds to a base image 110, that corresponds to a particular rendering scheme (associated with a particular set of rendered images, for example), and/or that meets a predetermined criterion. In some cases, the obtaining module 202 may obtain a rendered image 112 by selecting the determined rendered image 112 from the image file 502. In one example, the obtaining module 202 may extract a pixel area from the image file 502 to obtain the rendered image 112.
[0033] In one embodiment, the matching module 204 may match a rendered image 112 to a location in its corresponding base image 110. For example, the matching module 204 may match the determined rendered image 112 to the location in the base image 110 that the rendered image 112 was rendered from. In some cases, the rendered image 112 may be a rendered version of a portion of the base image. In these cases, the rendered image 112 may be matched with the corresponding un-rendered portion of the base image 110. In some cases, the matching module 204 may match the rendered image 112 with the base image 110 so that a layering of the images may create the illusion of a single rendered image.
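The document does not prescribe how this matching is performed. The simplest reading is that the location each rendered image was rendered from is recorded when the image is generated and looked up at display time. As a hedged alternative, if no such metadata were available, the location could be estimated by registering the strip against the base image, for example with normalized cross-correlation as sketched below; this can work because most of the strip's pixels are unchanged from the base image.

```python
# Hedged sketch of locating a rendered strip on its base image by template
# matching. This is only one possible approach; recording the crop box when
# the strip is rendered avoids the search entirely. Requires opencv-python.
import cv2

def locate_strip(base_path, strip_path):
    """Return the (x, y) location on the base image that best matches the strip."""
    base = cv2.imread(base_path, cv2.IMREAD_GRAYSCALE)
    strip = cv2.imread(strip_path, cv2.IMREAD_GRAYSCALE)
    scores = cv2.matchTemplate(base, strip, cv2.TM_CCOEFF_NORMED)
    _, _, _, best_location = cv2.minMaxLoc(scores)
    return best_location

# offset = locate_strip("head_center.png", "glasses_strip_center.png")
```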
[0034] In one embodiment, the layering module 206 may overlay a rendered image 112 onto a base image 110. In some configurations, the rendered image 112 may cover a portion of the base image 110. In one example, the rendered image 112 may be a rendered version of the portion of the base image 110 that it is covering. For instance, the layering module 206 may overlay the rendered image 112 onto the base image 110 based on the determined matching from the matching module 204. In some cases, the layering module 206 may overlay multiple rendering images 112 onto a single base image 110.
[0035] In one embodiment, the displaying module 208 may display the overlaying rendered image 112 and the underlying base image 110. In some cases, the layered images create the illusion that the base image 110 has been rendered.
[0036] FIG. 3 is a block diagram illustrating one example of how a rendered image 112 and a base image 110 may be layered. In some configurations, the rendered image 112 may cover a portion of the base image 306. In some cases, the rendered image 112 may be a rendered version of a portion of the base image 306. In some configurations, the rendered image 112 may be matched with the base image 110 so that the rendered image 112 may cover the portion of the base image 306 that it was rendered from. In one example, the pixel location 302 of the rendered image 112 may correspond to a rendered version of the pixel location 304 of the base image 110. In this example, the rendered image 112 may be matched so that the rendered image 112 may cover the portion of the base image 306. For example, pixel location 302 may be matched to pixel location 304 so that the rendered image 112 may cover the portion of the base image 306.
[0037] In some configurations, the rendered image 112 may be layered over the base image 110 based on the determined matching. For example, the rendered image 112 may be overlaid onto the base image 110 so that the rendered image 112 covers the portion of the base image 306 that the rendered image 112 was rendered from. Although FIG. 3 illustrates the case of layering one rendered image 112 onto a base image 110, it is noted that more than one rendered image 112 may be layered onto a single base image 110. In some configurations, one or more rendered images 112 that are layered with a base image 110 may be referred to as a set of layered images 308. In some cases, the set of layered images 308 may create the illusion of a single rendered image. For example, the set of layered images 308 may provide the same effect as if the entire base image 110 were rendered as a single image.
[0038] FIG. 4 is a block diagram illustrating one example of a rendered object movie 402 based on multiple sets of layered images 308. In some configurations, each set of layered images 308 may be an example of the set of layered images 308 illustrated in FIG. 3.
[0039] In some configurations, different base images 110 may depict an object in different orientations. For example, a first base image 110-a-1 may depict the object in a first orientation, the second base image 110-a-2 may depict the object in a second orientation, and the nth base image 110-a-n may depict the object in an nth orientation. In one example, these base images 110-a-n may be ordered to create an object movie of the object. For example, the base images 110-a-n may be ordered so that the orientation of the object appears to rotate as two or more consecutive base images 110 are cycled.
[0040] In some configurations, each rendered image 112 may correspond to a particular base image 110. For example, a first rendered image 112-a-1 may correspond to the first base image 110-a-1, the second rendered image 112-a-2 may correspond to the second base image 110-a-2, and the nth rendered image 112-a-n may correspond to the nth base image 110-a-n. For instance, each rendered image 112 may be a rendered version of a portion of its corresponding base image 110. In some configurations, a first set of layered images 308-a-1 may include the first rendered image 112-a-1 and the first base image 110-a-1, a second set of layered images 308-a-2 may include the second rendered image 112-a-2 and the second base image 110-a-2, and an nth set of layered images 308-a-n may include the nth rendered image 112-a-n and the nth base image 110-a-n. In some cases, the first set of layered images 308-a-1, the second set of layered images 308-a-2, and the nth set of layered images 308-a-n may be ordered to form a rendered object movie 402. In some configurations, the rendered object movie 402 may create the illusion of an object movie of rendered base images.
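The sketch below assembles ordered sets of layered images into a simple animation. The patent describes cycling the sets interactively; writing an animated GIF with Pillow is just one easy way to visualize the ordering, and the file names, offsets, and frame timing are illustrative assumptions.

```python
# Sketch of ordering sets of layered images into a simple "object movie".
# Each frame is one set of layered images (base image plus rendered strip);
# the frame order controls the apparent rotation of the object.
from PIL import Image

def build_object_movie(base_paths, strip_paths, offsets, out_path="try_on.gif"):
    """Compose one layered frame per base image and save them in order."""
    frames = []
    for base_path, strip_path, offset in zip(base_paths, strip_paths, offsets):
        base = Image.open(base_path).convert("RGBA")
        strip = Image.open(strip_path).convert("RGBA")
        frame = base.copy()
        frame.paste(strip, offset, strip)  # one set of layered images
        frames.append(frame.convert("RGB"))
    frames[0].save(out_path, save_all=True, append_images=frames[1:],
                   duration=100, loop=0)

# Calling this with, say, seventeen base images, their seventeen rendered
# strips, and the recorded offsets would yield a short rotation of the
# rendered try-on (paths and offsets are hypothetical).
```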
[0041] In one example, the systems and methods described herein may be used for virtually trying-on glasses. In this example the base images 110 may be images of a user's face/head in various orientations. For instance, a first base image 110-a-1 may be an image of a user's face/head in a left facing orientation, a second base image 110-a-2 may be an image of a user's face/head in a center facing orientation, and a third base image 110-a-3 may be an image of a user's face/head in a right facing orientation. In this example, each rendered image 112 may be a rendered version of a portion of a base image 110. For example, the first rendered image 112-a-1 may include a portion of the user's face/head with a rendered set of glasses (rendered based on the left facing orientation of the first base image 110-a-1, for example), the second rendered image 112-a-2 may include a portion of the user's face/head with the rendered set of glasses (rendered based on the center facing orientation of the second base image 110-a-2), and the third rendered image 112-a-3 may include a portion of the user's face/head with the rendered set of glasses (rendered based on the right facing orientation of the third base image 110-a-3). In one example, the portion of the user's face/head may be a strip that includes the eye and ear regions of the user's face/head. In some configurations, each rendered image 112 may be layered with its corresponding base image 110 to create sets of layered images 308. In one example, these sets of layered images 308 may be ordered to create a rendered object movie 402 that allows the rendered glasses to be viewed on the user's face/head from the various orientations of the underlying base images 110.
[0042] In a similar example, seventeen base images 110 may be used as the basis for a rendered object movie 402 (of a virtual try-on, for example). In one embodiment, base images 110-a-1 through 110-a-8 may be images of the user's face/head in various (e.g., decreasingly) left facing orientations, the ninth base image 110-a-9 may be an image of the user's face/head in a center facing orientation, and base images 110-a-10 through 110-a-17 may be images of the user's face/head in various (e.g., increasingly) right facing orientations. As described above, rendered images 112 based on these base images 110 and layered on their corresponding base images 110 may enable the efficient creation of a rendered object movie 402 (e.g., virtual try-on). In some cases, more (or fewer) than seventeen base images 110 may be used.
[0043] FIG. 5 is a block diagram illustrating one example of an image file 502 that includes more than one rendered image 112. In some configurations, the image file 502 may include at least one rendered image 112 for each base image 110. In one example, the first rendered image 112-a-1 may be a rendered version of a portion of the first base image 110-a-1, the second rendered image 112-a-2 may be a rendered version of a portion of the second base image 110-a-2, and the nth rendered image 112-a-n may be a rendered version of a portion of the nth base image 110-a-n. In this example, the first rendered image 112-a-1 may be a rendered image 112 for the first base image 110-a-1, the second rendered image 112-a-2 may be a rendered image 112 for the second base image 110-a-2, and the nth rendered image 112-a-n may be a rendered image 112 for the nth base image 110-a-n.
[0044] In some cases, disparate pixel areas of the image file 502 may correspond to disparate rendered images 112. For example, a first pixel area 504-a-1 of the image file 502 may correspond to a first rendered image 112-a-1, a second pixel area 504-a-2 of the image file 502 may correspond to a second rendered image 112-a-2, and an nth pixel area 504-a-n of the image file 502 may correspond to an nth rendered image 112-a-n.
[0045] In one example, each pixel area 504 may be a strip that is 80 pixels high and as wide as the image in the image file 502. In this example, the first rendered image 112-a-1 may be obtained by selecting pixel rows 0-80 from the image file 502, the second rendered image 112-a-2 may be obtained by selecting pixel rows 80-160 from the image file 502, and so forth. In other examples, the dimensions and/or location of a pixel area 504 in the image file 502 may differ. In some configurations, one or more configuration settings may define the dimensions and/or location of a pixel area 504 that should be selected to obtain a particular rendered image 112. In one example, the location of a particular rendered image 112 may be consistent across multiple image files 502.
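A minimal sketch of reading one rendered image out of such an image file by cropping its pixel area, assuming Pillow and the 80-pixel strip layout described above; the index-to-row mapping and the file name are hypothetical.

```python
from PIL import Image

STRIP_HEIGHT = 80  # assumed configuration setting: each rendered image is an 80-pixel-high strip

def extract_rendered_image(image_file_path: str, index: int) -> Image.Image:
    """Crop the pixel area holding the index-th rendered image out of the image file."""
    sheet = Image.open(image_file_path).convert("RGBA")
    top = index * STRIP_HEIGHT
    # Each pixel area is assumed to span the full width of the image file.
    return sheet.crop((0, top, sheet.size[0], top + STRIP_HEIGHT))

# For example, the second rendered image would come from pixel rows 80-160:
second_rendered = extract_rendered_image("rendered_images.png", 1)
```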
[0046] In some configurations, an image file 502 may include one or more sets of rendered images. In one example, a set of rendered images may allow an entire object movie to be rendered with a particular rendering scheme (e.g., a first style of glasses in a first position, a second style of glasses in a first position, a first style of glasses in a second position, etc.). For example, each rendered image 112 in the set of rendered images may apply the particular rendering scheme to its corresponding base image 110. In some cases, a rendered object movie 402 may be rendered based on a single image file 502.
[0047] FIG. 6 is a block diagram illustrating one example of how an image file 502 may be used to generate a rendered object movie 402. In one embodiment, each of the rendered images 112 in the image file 502 is part of a set of rendered images 112-a.
[0048] In some cases, a set of rendered images 112-a may include, for each base image 110, a rendered image 112 that applies a particular rendering scheme to that base image 110. For example, the first rendered image 112-a-1 may apply a first rendering scheme to the first base image 110-a-1, the second rendered image 112-a-2 may apply the first rendering scheme to the second base image 110-a-2, and the nth rendered image 112-a-n may apply the first rendering scheme to the nth base image 110-a-n. In one example, the first rendering scheme may be to render a particular pair of glasses in a particular position. In some cases, this may allow for the creation of a rendered object movie 402 with a particular rendering scheme using a single image file 502.
[0049] In some configurations, a first set of layered images 308-a-1 may be obtained by covering a portion of the first base image 110-a-1 with the first rendered image 112-a-1, a second set of layered images 308-a-2 may be obtained by covering a portion of the second base image 110-a-2 with the second rendered image 112-a-2, and an nth set of layered images 308-a-n may be obtained by covering a portion of the nth base image 110-a-n with the nth rendered image 112-a-n. In some configurations, the sets of layered images may be ordered to create a rendered object movie 402. In the case that the rendered images 112 are from the same set of rendered images 112-a, the rendered object movie 402 may be an object movie with a single rendering scheme.
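Assembling the rendered object movie then amounts to compositing one layered set per orientation and keeping the frames in order. The following is a minimal sketch under the same assumptions as above (Pillow, 80-pixel strips, hypothetical file names and offset), not a definitive implementation.

```python
from PIL import Image

STRIP_HEIGHT = 80  # assumed strip height, matching the earlier sketch

def build_layered_frames(base_paths, image_file_path, strip_top=140):
    """Overlay the i-th rendered strip onto the i-th base image, preserving order."""
    sheet = Image.open(image_file_path).convert("RGBA")
    frames = []
    for i, base_path in enumerate(base_paths):
        base = Image.open(base_path).convert("RGBA")
        # Crop the pixel area for the i-th rendered image from the image file.
        strip = sheet.crop((0, i * STRIP_HEIGHT, sheet.size[0], (i + 1) * STRIP_HEIGHT))
        frame = base.copy()
        frame.paste(strip, (0, strip_top), strip)
        frames.append(frame)
    return frames  # cycling these frames in order yields the rendered object movie

# Hypothetical base images in orientation order (left facing through right facing).
frames = build_layered_frames(
    [f"base_{i:02d}.png" for i in range(17)], "glasses_style1_position1.png"
)
```

Cycling the resulting frames forward and backward gives the appearance of the object rotating with the rendered glasses held in place.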
[0050] In one example, the rendered object movie 402 may be an object movie of a virtual try-on of a pair of glasses. In this scenario, the first rendered image 112-a-1 may include a particular pair of glasses rendered in a first orientation at a particular position on the user's face/head, the second rendered image 112-a-2 may include the particular pair of glasses rendered in a second orientation at the particular position on the user's face/head, and the nth rendered image 112-a-n may include the particular pair of glasses rendered in an nth orientation at the particular position on the user's face/head. In some configurations, the rendered images 112 for a rendered object movie 402 of a particular product at a particular position may be stored in and/or extracted from a single image file 502.
[0051] FIG. 7 is a block diagram illustrating one example of how multiple image files 502 may be used to generate a rendered object movie 402. In some cases, a first image file 502-a-1 may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second image file 502-a-2 may include a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).
[0052] In one example, a first rendering scheme may be selected (e.g., a first pair of glasses rendered in a first position). In this case, a first set of rendered images 112-a may be used to generate each set of layered images 308. For instance, a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 from the first image file 502-a-1 onto the first base image 110-a-1. In some cases, the rendering scheme may be changed. For example, a second rendering scheme may subsequently be selected, in which case a second set of rendered images 112-b may be used to generate each set of layered images 308. For instance, a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 from the second image file 502-a-2 onto the second base image 110-a-2, and an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image from the second image file 502-a-2 onto the nth base image 110-a-n. As illustrated in FIG. 7, the first set of layered images 308-a-1 is based on the first set of rendered images 112-a and the second and nth sets of layered images 308-b-2, 308-b-n are based on the second set of rendered images 112-b. However, before the change in rendering schemes, each of the sets of layered images 308 may be based on the first set of rendered images 112-a, and after the change in rendering schemes, each of the sets of layered images 308 may be based on the second set of rendered images 112-b.
[0053] For example, a user may change glasses positions during the display of the rendered object movie 402. In this scenario, previous to the change in position, the sets of layered images 308 may be based on the first glasses position, and after the change in position, the sets of layered images 308 may be based on the second glasses position.
[0054] FIG. 8 is a block diagram illustrating one example of how a single image file 502-b with multiple sets of rendered images may be used to generate a rendered object movie 402. In some configurations, an image file 502-b may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).
[0055] In one example, a first rendering scheme may be selected (e.g., a first pair of glasses rendered in a first position). In this case, a first set of rendered images 112-a may be used to generate each set of layered images 308. For instance, a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 onto the first base image 110-a-1. In some cases, the rendering scheme may be changed. For example, a second rendering scheme may subsequently be selected, in which case a second set of rendered images 112-b may be used to generate each set of layered images 308. For instance, a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 onto the second base image 110-a-2, and an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image onto the nth base image 110-a-n. As illustrated in FIG. 8, the first set of layered images 308-a-1 is based on the first set of rendered images 112-a and the second and nth sets of layered images 308-b-2, 308-b-n are based on the second set of rendered images 112-b. However, before the change in rendering schemes, each of the sets of layered images 308 may be based on the first set of rendered images 112-a, and after the change in rendering schemes, each of the sets of layered images 308 may be based on the second set of rendered images 112-b. In this example, multiple sets of rendered images may be included in the same image file 502-b.
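A change of rendering scheme can be sketched as selecting the strip from a different set within the same image file for all subsequently composed frames. The vertical stacking of sets, the strip height, the frame count, and the file name below are assumptions made for illustration, not details taken from the disclosure.

```python
from PIL import Image

STRIP_HEIGHT = 80            # assumed height of each rendered image strip
FRAME_COUNT = 17             # assumed number of base images / orientations
SET_HEIGHT = FRAME_COUNT * STRIP_HEIGHT  # rows assumed to be occupied by one full set

def rendered_image_for(image_file_path, frame_index, scheme_changed):
    """Select a rendered image for a frame, switching sets after a rendering-scheme change.

    Assumed layout: set 0 occupies the top SET_HEIGHT rows of the image file and
    set 1 occupies the next SET_HEIGHT rows.
    """
    sheet = Image.open(image_file_path).convert("RGBA")
    set_index = 1 if scheme_changed else 0
    top = set_index * SET_HEIGHT + frame_index * STRIP_HEIGHT
    return sheet.crop((0, top, sheet.size[0], top + STRIP_HEIGHT))

# Frames composed before the user changes the glasses position draw on the first set;
# frames composed afterwards draw on the second set.
before = rendered_image_for("glasses_multi_set.png", 3, scheme_changed=False)
after = rendered_image_for("glasses_multi_set.png", 3, scheme_changed=True)
```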
[0056] FIG. 9 is a block diagram illustrating one example of how a single image file 502 with multiple sets of rendered images may be used to generate disparate rendered object movies 402. In some configurations, the image file 502-b may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).
[0057] In some configurations, a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 from the first set of rendered images 112-a onto the first base image 110-a-1, a second set of layered images 308-a-2 may be obtained by overlaying the second rendered image 112-a-2 from the first set of rendered images 112-a onto the second base image 110-a-2, and an nth set of layered images 308-a-n may be obtained by overlaying the nth rendered image from the first set of rendered images 112-a onto the nth base image 110-a-n. Similarly, a first set of layered images 308-b-1 may be obtained by overlaying the first rendered image 112-b-1 from the second set of rendered images 112-b onto the first base image 110-a-1, a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 from the second set of rendered images 112-b onto the second base image 110-a-2, and an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image from the second set of rendered images 112-b onto the nth base image 110-a-n. As illustrated in FIG. 9, the same base images 110 may be used for disparate rendered object movies 402 because the particular rendered images 112 are overlaid onto the base images 110.
[0058] In one example, the first set of rendered images 112-a may have a first rendering scheme (e.g., a first pair of glasses in a first position) and the second set of rendered images 112-b may have a second rendering scheme (e.g., the first pair of glasses in a second position). In this example, the first rendered object movie 402-a-1 may be a virtual try-on for the first pair of glasses in the first position and the second rendered object movie 402-a-2 may be a virtual try-on for the first pair of glasses in the second position.
[0059] FIG. 10 is a flow diagram illustrating one embodiment of a method 1000 to display rendered images. In some configurations, the method 1000 may be implemented by the rendered image display module 104 illustrated in FIGS. 1 or 2.
[0060] At step 1002, a base image may be obtained. At step 1004, a rendered image may be obtained. At step 1006, the rendered image may be matched to a location on the base image. For example, the rendered image may be matched to a pixel location on the base image so that the rendered image covers the portion of the base image that the rendered image was rendered from. At step 1008, the rendered image may be overlaid onto the base image at the location to generate a set of layered images. At step 1010, the set of layered images may be displayed.
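A compact sketch of steps 1002-1010 follows, assuming Pillow. The matching step is simplified here to a location supplied alongside the rendered image; the file names and coordinates are hypothetical, and other ways of determining the location are possible.

```python
from PIL import Image

def display_layered_image(base_path, rendered_path, location):
    """Minimal walk-through of steps 1002-1010 under the stated assumptions."""
    base = Image.open(base_path).convert("RGBA")          # step 1002: obtain a base image
    rendered = Image.open(rendered_path).convert("RGBA")  # step 1004: obtain a rendered image
    x, y = location                                       # step 1006: location assumed known up front
    layered = base.copy()
    layered.paste(rendered, (x, y), rendered)             # step 1008: overlay to form the layered set
    layered.show()                                        # step 1010: display the layered images
    return layered

# Hypothetical file names and pixel location.
display_layered_image("base.png", "rendered_strip.png", location=(0, 140))
```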
[0061] FIG. 11 is a flow diagram illustrating another embodiment of a method 1100 to display rendered images. In some configurations, the method 1100 may be implemented by the rendered image display module 104 illustrated in FIGS. 1 or 2.
[0062] At step 1102, a base image having a first perspective may be obtained. At step 1104, an image file having a plurality of rendered images may be obtained. At step 1106, a rendered image may be selected from the plurality of rendered images based at least upon the first perspective. For example, the rendered image may be rendered based on the first perspective. At step 1108, the rendered image may be matched to a location on the base image. At step 1110, the rendered image may be overlaid onto the base image at the location to generate a set of layered images. At step 1112, the set of layered images may be displayed.
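A similar sketch of steps 1102-1112, where the first perspective is represented by a frame index used to select the matching strip from the image file; the strip layout, the default offset, and the index encoding are assumptions for illustration only.

```python
from PIL import Image

STRIP_HEIGHT = 80  # assumed strip height for each rendered image in the image file

def display_for_perspective(base_path, image_file_path, perspective_index, location=(0, 140)):
    """Minimal walk-through of steps 1102-1112, with the first perspective encoded as an index."""
    base = Image.open(base_path).convert("RGBA")           # step 1102: base image having a first perspective
    sheet = Image.open(image_file_path).convert("RGBA")    # step 1104: image file with a plurality of rendered images
    top = perspective_index * STRIP_HEIGHT                 # step 1106: select the strip rendered for that perspective
    rendered = sheet.crop((0, top, sheet.size[0], top + STRIP_HEIGHT))
    layered = base.copy()
    layered.paste(rendered, location, rendered)            # steps 1108-1110: match and overlay
    layered.show()                                         # step 1112: display the layered images
    return layered
```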
[0063] FIG. 12 depicts a block diagram of a computer system 1210 suitable for implementing the present systems and methods. Computer system 1210 includes a bus 1212 which interconnects major subsystems of computer system 1210, such as a central processor 1214, a system memory 1217 (typically RAM, but which may also include ROM, flash RAM, or the like), an input/output controller 1218, an external audio device, such as a speaker system 1220 via an audio output interface 1222, an external device, such as a display screen 1224 via display adapter 1226, a keyboard 1232 (interfaced with a keyboard controller 1233) (or other input device), multiple USB devices 1292 (interfaced with a USB controller 1291), and a storage interface 1234. Also included are a mouse 1246 (or other point-and-click device) and a network interface 1248 (coupled directly to bus 1212).
[0064] Bus 1212 allows data communication between central processor 1214 and system memory 1217, which may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown), as previously noted. The RAM is generally the main memory into which the operating system and application programs are loaded. The ROM or flash memory can contain, among other code, the Basic Input-Output system (BIOS) which controls basic hardware operation such as the interaction with peripheral components or devices. For example, the rendered image display module 104 to implement the present systems and methods may be stored within the system memory 1217. Applications resident with computer system 1210 are generally stored on and accessed via a non-transitory computer readable medium, such as a hard disk drive (e.g., fixed disk 1244) or other storage medium. Additionally, applications can be in the form of electronic signals modulated in accordance with the application and data communication technology when accessed via interface 1248.
[0065] Storage interface 1234, as with the other storage interfaces of computer system 1210, can connect to a standard computer readable medium for storage and/or retrieval of information, such as a fixed disk drive 1244. Fixed disk drive 1244 may be a part of computer system 1210 or may be separate and accessed through other interface systems. Network interface 1248 may provide a direct connection to a remote server via a direct network link to the Internet via a POP (point of presence). Network interface 1248 may provide such connection using wireless techniques, including digital cellular telephone connection, Cellular Digital Packet Data (CDPD) connection, digital satellite data connection, or the like.
[0066] Many other devices or subsystems (not shown) may be connected in a similar manner (e.g., document scanners, digital cameras, and so on). Conversely, all of the devices shown in FIG. 12 need not be present to practice the present systems and methods. The devices and subsystems can be interconnected in different ways from that shown in FIG. 12. The operation of a computer system such as that shown in FIG. 12 is readily known in the art and is not discussed in detail in this application. Code to implement the present disclosure can be stored in a non-transitory computer-readable medium such as one or more of system memory 1217 or fixed disk 1244. The operating system provided on computer system 1210 may be iOS®, MS-DOS®, MS-WINDOWS®, OS/2®, UNIX®, Linux®, or another known operating system.
[0067] While the foregoing disclosure sets forth various embodiments using specific block diagrams, flowcharts, and examples, each block diagram component, flowchart step, operation, and/or component described and/or illustrated herein may be implemented, individually and/or collectively, using a wide range of hardware, software, or firmware (or any combination thereof) configurations. In addition, any disclosure of components contained within other components should be considered exemplary in nature since many other architectures can be implemented to achieve the same functionality.
[0068] The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
[0069] Furthermore, while various embodiments have been described and/or illustrated herein in the context of fully functional computing systems, one or more of these exemplary embodiments may be distributed as a program product in a variety of forms, regardless of the particular type of computer-readable media used to actually carry out the distribution. The embodiments disclosed herein may also be implemented using software modules that perform certain tasks. These software modules may include script, batch, or other executable files that may be stored on a computer-readable storage medium or in a computing system. In some embodiments, these software modules may configure a computing system to perform one or more of the exemplary embodiments disclosed herein.
[0070] The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the present systems and methods and their practical applications, to thereby enable others skilled in the art to best utilize the present systems and methods and various embodiments with various modifications as may be suited to the particular use contemplated.
[0071] Unless otherwise noted, the terms "a" or "an," as used in the specification and claims, are to be construed as meaning "at least one of." In addition, for ease of use, the words "including" and "having," as used in the specification and claims, are interchangeable with and have the same meaning as the word "comprising." In addition, the term "based on" as used in the specification and the claims is to be construed as meaning "based at least upon."

Claims

What is claimed is:
1. A computer-implemented method to display a rendered image, comprising:
obtaining a base image;
obtaining a rendered image;
matching the rendered image to a location on the base image;
overlaying the rendered image onto the base image at the location to generate a set of layered images; and
displaying the set of layered images.
2. The method of claim 1, wherein the rendered image comprises a rendered version of at least a portion of the base image.
3. The method of claim 2, wherein the location on the base image corresponds to the at least a portion of the base image.
4. The method of claim 2, the set of layered images comprising:
the base image; and
the rendered image, wherein the rendered image covers the at least a portion of the base image.
5. The method of claim 1, further comprising obtaining an image file having a plurality of rendered images.
6. The method of claim 5, wherein obtaining a rendered image comprises:
determining a rendered image from the plurality of rendered images based on the base image; and
selecting a pixel area of the image file that corresponds to the determined rendered image.
7. The method of claim 5, wherein each of the plurality of rendered images corresponds to the same rendering scheme.
8. The method of claim 1, wherein the set of layered images creates an illusion that the overlaying rendered image and the underlying base image are a single image.
9. The method of claim 1, wherein the set of layered images is displayed as part of an object movie.
10. A computing device configured to display a rendered image, comprising:
a processor;
memory in electronic communication with the processor;
instructions stored in the memory, the instructions being executable by the processor to:
obtain a base image;
obtain a rendered image;
match the rendered image to a location on the base image;
overlay the rendered image onto the base image at the location to generate a set of layered images; and
display the set of layered images.
11. The computing device of claim 10, wherein the rendered image comprises a rendered version of at least a portion of the base image.
12. The computing device of claim 11, wherein the location on the base image corresponds to the at least a portion of the base image.
13. The computing device of claim 11, the set of layered images comprising:
the base image; and
the rendered image, wherein the rendered image covers the at least a portion of the base image.
14. The computing device of claim 10, wherein the instructions are further executable by the processor to obtain an image file having a plurality of rendered images.
15. The computing device of claim 14, wherein the instructions executable by the processor to obtain a rendered image comprise instructions executable by the processor to:
determine a rendered image from the plurality of rendered images based on the base image; and
select a pixel area of the image file that corresponds to the determined rendered image.
16. The computing device of claim 14, wherein each of the plurality of rendered images corresponds to the same rendering scheme.
17. The computing device of claim 10, wherein the set of layered images creates an illusion that the overlaying rendered image and the underlying base image are a single image.
18. The computing device of claim 10, wherein the set of layered images is displayed as part of an object movie.
19. A computer-program product to display a rendered image, the computer-program product comprising a non-transitory computer-readable storage medium that stores computer executable instructions, the instructions being executable by a processor to:
obtain a base image;
obtain a rendered image;
match the rendered image to a location on the base image;
overlay the rendered image onto the base image at the location to generate a set of layered images; and
display the set of layered images.
20. The computer-program product of claim 19, wherein the rendered image comprises a rendered version of at least a portion of the base image.
Family Cites Families (465)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3927933A (en) 1973-08-06 1975-12-23 Humphrey Instruments Inc Apparatus for opthalmological prescription readout
DE2934263C3 (en) 1979-08-24 1982-03-25 Fa. Carl Zeiss, 7920 Heidenheim Method and device for the automatic measurement of the vertex power in the main sections of toric spectacle lenses
US4522474A (en) 1980-05-20 1985-06-11 Slavin Sidney H Spinning optics device
US4698564A (en) 1980-05-20 1987-10-06 Slavin Sidney H Spinning optics device
US4534650A (en) 1981-04-27 1985-08-13 Inria Institut National De Recherche En Informatique Et En Automatique Device for the determination of the position of points on the surface of a body
US4539585A (en) 1981-07-10 1985-09-03 Spackova Daniela S Previewer
US4467349A (en) 1982-04-07 1984-08-21 Maloomian Laurence G System and method for composite display
EP0092364A1 (en) 1982-04-14 1983-10-26 The Hanwell Optical Co. Limited A method of and apparatus for dimensioning a lens to fit a spectacle frame
JPS5955411A (en) 1982-09-24 1984-03-30 Hoya Corp Determination method of optimum thickness for spectacle lens
US4613219A (en) 1984-03-05 1986-09-23 Burke Marketing Services, Inc. Eye movement recording apparatus
JPS6180222A (en) 1984-09-28 1986-04-23 Asahi Glass Co Ltd Method and apparatus for adjusting spectacles
US4781452A (en) 1984-11-07 1988-11-01 Ace Ronald S Modular optical manufacturing system
US5281957A (en) 1984-11-14 1994-01-25 Schoolman Scientific Corp. Portable computer and head mounted display
DE3517321A1 (en) 1985-05-14 1986-11-20 Fa. Carl Zeiss, 7920 Heidenheim MULTIFOCAL EYEWEAR LENS WITH AT LEAST ONE SPECTACLE
US5139373A (en) 1986-08-14 1992-08-18 Gerber Optical, Inc. Optical lens pattern making system and method
US4724617A (en) 1986-08-14 1988-02-16 Gerber Scientific Products, Inc. Apparatus for tracing the lens opening in an eyeglass frame
US4845641A (en) 1986-09-19 1989-07-04 Hoya Corporation Method of forming a synthetic image in simulation system for attachment of spectacles
KR910000591B1 (en) 1986-10-30 1991-01-26 가부시기가이샤 도시바 Glasses frame picture process record method and it's system
FR2636143B1 (en) 1988-09-08 1990-11-02 Briot Int DATA TRANSMISSION DEVICE FOR FACILITATING AND ACCELERATING THE MANUFACTURE OF GLASSES
US4957369A (en) 1989-01-23 1990-09-18 California Institute Of Technology Apparatus for measuring three-dimensional surface geometries
US5255352A (en) 1989-08-03 1993-10-19 Computer Design, Inc. Mapping of two-dimensional surface detail on three-dimensional surfaces
IE67140B1 (en) 1990-02-27 1996-03-06 Bausch & Lomb Lens edging system
FR2678405B1 (en) 1991-06-25 1993-09-24 Chainet Patrice INSTALLATION FOR ASSISTING THE CASH MANAGEMENT OF A DISTRIBUTION CENTER, AS WELL AS THE PROCESS IMPLEMENTED FOR SAID INSTALLATION.
US5257198A (en) 1991-12-18 1993-10-26 Schoyck Carol G Van Method of transmitting edger information to a remote numerically controlled edger
EP1147852B1 (en) 1992-06-24 2005-11-16 Hoya Corporation Spectacle lens production
SE470440B (en) 1992-08-12 1994-03-14 Jan Erik Juto Method and apparatus for rhino-osteometric measurement
US5280570A (en) 1992-09-11 1994-01-18 Jordan Arthur J Spectacle imaging and lens simulating system and method
US5428448A (en) 1993-10-20 1995-06-27 Augen Wecken Plasticos S.R.L. De C.V. Method and apparatus for non-contact digitazation of frames and lenses
DE4427071A1 (en) 1994-08-01 1996-02-08 Wernicke & Co Gmbh Procedure for determining boundary data
US5550602A (en) 1994-11-09 1996-08-27 Johannes Braeuning Apparatus and method for examining visual functions
JP3543395B2 (en) 1994-11-17 2004-07-14 株式会社日立製作所 Service provision and usage
DE69609403D1 (en) 1995-02-01 2000-08-24 Bausch & Lomb IRONING PIECE FOR GLASSES
US5774129A (en) 1995-06-07 1998-06-30 Massachusetts Institute Of Technology Image analysis and synthesis networks using shape and texture information
US5844573A (en) 1995-06-07 1998-12-01 Massachusetts Institute Of Technology Image compression by pointwise prototype correspondence using shape and texture information
US6016150A (en) 1995-08-04 2000-01-18 Microsoft Corporation Sprite compositor and method for performing lighting and shading operations using a compositor to combine factored image layers
US5592248A (en) 1995-11-16 1997-01-07 Norton; Ross A. Computerized method for fitting eyeglasses
US5682210A (en) 1995-12-08 1997-10-28 Weirich; John Eye contact lens video display system
US5720649A (en) 1995-12-22 1998-02-24 Gerber Optical, Inc. Optical lens or lap blank surfacing machine, related method and cutting tool for use therewith
US5988862A (en) 1996-04-24 1999-11-23 Cyra Technologies, Inc. Integrated system for quickly and accurately imaging and modeling three dimensional objects
DE19616526A1 (en) 1996-04-25 1997-11-06 Rainer Jung Machine for the machining of optical materials for the production of optical parts
WO1998013721A1 (en) 1996-09-26 1998-04-02 Mcdonnell Douglas Corporation Head mounted display with fibre optic image transfer from flat panel
US5809580A (en) 1996-12-20 1998-09-22 Bausch & Lomb Incorporated Multi-sport goggle with interchangeable strap and tear-off lens system
DE19702287C2 (en) 1997-01-23 1999-02-11 Wernicke & Co Gmbh Method for determining the course of the facets on the edge of spectacle lenses to be processed and for controlling the processing of shapes in accordance with the determined course of the facets
CN1216120A (en) 1997-02-06 1999-05-05 博士伦公司 Electric connection configuration for electro-optical device
US5983201A (en) 1997-03-28 1999-11-09 Fay; Pierre N. System and method enabling shopping from home for fitted eyeglass frames
US6420698B1 (en) 1997-04-24 2002-07-16 Cyra Technologies, Inc. Integrated system for quickly and accurately imaging and modeling three-dimensional objects
EP1032934A2 (en) 1997-05-15 2000-09-06 Palantir Software, Inc. Multimedia interface with user interaction tracking
AU753161B2 (en) 1997-05-16 2002-10-10 Hoya Corporation System for making spectacles to order
IT1293363B1 (en) 1997-05-29 1999-02-25 Killer Loop Eyewear Srl INTERCONNECTION DEVICE, PARTICULARLY FOR GLASSES
US6492986B1 (en) 1997-06-02 2002-12-10 The Trustees Of The University Of Pennsylvania Method for human face shape and motion estimation based on integrating optical flow and deformable models
US6208347B1 (en) 1997-06-23 2001-03-27 Real-Time Geometry Corporation System and method for computer modeling of 3D objects and 2D images by mesh constructions that incorporate non-spatial data such as color or texture
EP0901105A1 (en) 1997-08-05 1999-03-10 Canon Kabushiki Kaisha Image processing apparatus
US6647146B1 (en) 1997-08-05 2003-11-11 Canon Kabushiki Kaisha Image processing apparatus
DE69823116D1 (en) 1997-08-05 2004-05-19 Canon Kk Image processing method and device
US6018339A (en) 1997-08-15 2000-01-25 Stevens; Susan Automatic visual correction for computer screen
WO1999015945A2 (en) 1997-09-23 1999-04-01 Enroute, Inc. Generating three-dimensional models of objects defined by two-dimensional image data
USD426847S (en) 1997-09-30 2000-06-20 Bausch & Lomb Incorporated Eyewear
US5880806A (en) 1997-10-16 1999-03-09 Bausch & Lomb Incorporated Eyewear frame construction
US6249600B1 (en) 1997-11-07 2001-06-19 The Trustees Of Columbia University In The City Of New York System and method for generation of a three-dimensional solid model
US6139143A (en) 1997-12-11 2000-10-31 Bausch & Lomb Incorporated Temple for eyewear having an integrally formed serpentine hinge
US6310627B1 (en) 1998-01-20 2001-10-30 Toyo Boseki Kabushiki Kaisha Method and system for generating a stereoscopic image of a garment
AU753506B2 (en) 1998-02-03 2002-10-17 Tsuyoshi Saigo Simulation system for wearing glasses
DE19804428A1 (en) 1998-02-05 1999-08-19 Wernicke & Co Gmbh Method for marking or drilling holes in spectacle lenses and device for carrying out the method
US6356271B1 (en) 1998-02-17 2002-03-12 Silicon Graphics, Inc. Computer generated paint stamp seaming compensation
US6144388A (en) 1998-03-06 2000-11-07 Bornstein; Raanan Process for displaying articles of clothing on an image of a person
US6233049B1 (en) 1998-03-25 2001-05-15 Minolta Co., Ltd. Three-dimensional measurement apparatus
USD433052S (en) 1998-05-06 2000-10-31 Luxottica Leasing S.P.A. Eyewear
WO1999056942A1 (en) 1998-05-07 1999-11-11 Luxottica Leasing S.P.A. Eyewear frame construction
US6139141A (en) 1998-05-20 2000-10-31 Altair Holding Company Auxiliary eyeglasses with magnetic clips
US6072496A (en) 1998-06-08 2000-06-06 Microsoft Corporation Method and system for capturing and representing 3D geometry, color and shading of facial expressions and other animated objects
AU4558299A (en) 1998-06-12 1999-12-30 Bausch & Lomb Incorporated Eyewear with replaceable lens system
US5926248A (en) 1998-06-26 1999-07-20 Bausch & Lomb, Incorporated Sunglass lens laminate
US6999073B1 (en) 1998-07-20 2006-02-14 Geometrix, Inc. Method and system for generating fully-textured 3D
US6563499B1 (en) 1998-07-20 2003-05-13 Geometrix, Inc. Method and apparatus for generating a 3D region from a surrounding imagery
USD422014S (en) 1998-07-31 2000-03-28 Luxottica Leasing S.P.A. Eyewear temple
USD425543S (en) 1998-08-21 2000-05-23 Luxottica Leasing S.P.A. Eyewear
US6095650A (en) 1998-09-22 2000-08-01 Virtual Visual Devices, Llc Interactive eyewear selection system
KR100294923B1 (en) 1998-10-02 2001-09-07 윤종용 3-D mesh coding/decoding method and apparatus for error resilience and incremental rendering
JP4086429B2 (en) 1998-10-12 2008-05-14 Hoya株式会社 Evaluation method and apparatus for spectacle lens
US6222621B1 (en) 1998-10-12 2001-04-24 Hoyo Corporation Spectacle lens evaluation method and evaluation device
US6307568B1 (en) 1998-10-28 2001-10-23 Imaginarix Ltd. Virtual dressing over the internet
USD432156S (en) 1998-11-19 2000-10-17 Luxottica Leasing S.P.A. Eyewear
US6466205B2 (en) 1998-11-19 2002-10-15 Push Entertainment, Inc. System and method for creating 3D models from 2D sequential image data
USD420037S (en) 1998-11-19 2000-02-01 Luxottica Leasing S.P.A. Eyewear
US6132044A (en) 1998-11-20 2000-10-17 Luxottica Leasing S.P.A Filter for a special purpose lens and method of making filter
JP4025442B2 (en) 1998-12-01 2007-12-19 富士通株式会社 3D model conversion apparatus and method
DK1138011T3 (en) 1998-12-02 2004-03-08 Univ Manchester Determination of facial subspace
US6281903B1 (en) 1998-12-04 2001-08-28 International Business Machines Corporation Methods and apparatus for embedding 2D image content into 3D models
US6024444A (en) 1998-12-18 2000-02-15 Luxottica Leasing S.P.A. Eyewear lens retention apparatus and method
KR100292837B1 (en) 1999-01-28 2001-06-15 장두순 online ticket sales system and method for the same
US6456287B1 (en) 1999-02-03 2002-09-24 Isurftv Method and apparatus for 3D model creation based on 2D images
USD424095S (en) 1999-02-03 2000-05-02 Luxottica Leasing S.P.A. Eyewear front
USD427227S (en) 1999-02-09 2000-06-27 Luxottica Leasing S.P.A. Eyewear
EP1793262B1 (en) 1999-02-12 2014-10-29 Hoya Corporation Spectacle lens and manufacturing method therefor
USD423034S (en) 1999-02-19 2000-04-18 Luxottica Leasing S.P.A. Eyewear
US6305656B1 (en) 1999-02-26 2001-10-23 Dash-It Usa Inc. Magnetic coupler and various embodiments thereof
USD426568S (en) 1999-03-01 2000-06-13 Luxottica Leasing S.P.A. Eyewear
USD420379S (en) 1999-03-01 2000-02-08 Luxottica Leasing S.P.A. Eyewear front
USD417883S (en) 1999-03-03 1999-12-21 Luxottica Leasing S.P.A. Eyewear temple
JP3599268B2 (en) 1999-03-08 2004-12-08 株式会社ソニー・コンピュータエンタテインメント Image processing method, image processing apparatus, and recording medium
US20110026606A1 (en) 1999-03-11 2011-02-03 Thomson Licensing System and method for enhancing the visibility of an object in a digital picture
DE69934478T2 (en) 1999-03-19 2007-09-27 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method and apparatus for image processing based on metamorphosis models
USD421764S (en) 1999-04-05 2000-03-21 Luxottica Leasing S.P.A. Eyewear front
USD427225S (en) 1999-04-08 2000-06-27 Luxottica Leasing S.P.A. Goggle
IT1312246B1 (en) 1999-04-09 2002-04-09 Francesco Lauricella APPARATUS WITH PHOTOSENSITIVE CHESSBOARD FOR THE CONTROL OF PERSONALCOMPUTER BY CONCENTRATED LIGHT EMITTING, APPLIED TO THE HEAD
WO2000064168A1 (en) 1999-04-19 2000-10-26 I Pyxidis Llc Methods and apparatus for delivering and viewing distributed entertainment broadcast objects as a personalized interactive telecast
USD422011S (en) 1999-04-29 2000-03-28 Luxottica Leasing S.P.A. Eyewear front
USD434788S (en) 1999-12-15 2000-12-05 Luxottica Leasing S.P.A. Eyewear
USD424094S (en) 1999-04-30 2000-05-02 Luxottica Leasing S.P.A. Eyewear
USD423556S (en) 1999-04-30 2000-04-25 Luxottica Leasing S.P.A. Eyewear
USD423552S (en) 1999-04-30 2000-04-25 Luxottica Leasing S.P.A. Eyewear
USD424096S (en) 1999-05-12 2000-05-02 Luxottica Leasing S.P.A. Eyewear front
USD425542S (en) 1999-05-27 2000-05-23 Luxottica Leasing S.P.A. Eyewear
US6415051B1 (en) 1999-06-24 2002-07-02 Geometrix, Inc. Generating 3-D models using a manually operated structured light source
USD423553S (en) 1999-06-24 2000-04-25 Luxottica Leasing S.P.A. Eyewear
USD423554S (en) 1999-06-24 2000-04-25 Luxottica Leasing S.P.A. Eyewear front
US6760488B1 (en) 1999-07-12 2004-07-06 Carnegie Mellon University System and method for generating a three-dimensional model from a two-dimensional image sequence
USD423557S (en) 1999-09-01 2000-04-25 Luxottica Leasing S.P.A. Eyewear temple
US6922494B1 (en) 1999-09-24 2005-07-26 Eye Web, Inc. Automated image scaling
US6650324B1 (en) 1999-10-29 2003-11-18 Intel Corporation Defining surface normals in a 3D surface mesh
USD430591S (en) 1999-11-01 2000-09-05 Luxottica Leasing S.P.A. Eyewear temple
WO2001032074A1 (en) 1999-11-04 2001-05-10 Stefano Soatto System for selecting and designing eyeglass frames
US6583792B1 (en) 1999-11-09 2003-06-24 Newag Digital, Llc System and method for accurately displaying superimposed images
AU772362B2 (en) 1999-11-09 2004-04-22 University Of Manchester, The Object class identification, verification or object image synthesis
US7663648B1 (en) * 1999-11-12 2010-02-16 My Virtual Model Inc. System and method for displaying selected garments on a computer-simulated mannequin
US6671538B1 (en) 1999-11-26 2003-12-30 Koninklijke Philips Electronics, N.V. Interface system for use with imaging devices to facilitate visualization of image-guided interventional procedure planning
US7234937B2 (en) 1999-11-30 2007-06-26 Orametrix, Inc. Unified workstation for virtual craniofacial diagnosis, treatment planning and therapeutics
WO2001045029A2 (en) 1999-12-10 2001-06-21 Lennon Jerry W Customer image capture and use thereof in a retailing system
US6980690B1 (en) 2000-01-20 2005-12-27 Canon Kabushiki Kaisha Image processing apparatus
US6377281B1 (en) 2000-02-17 2002-04-23 The Jim Henson Company Live performance control of computer graphic characters
DE10007705A1 (en) 2000-02-19 2001-09-06 Keune Thomas Method for matching spectacles to potential wearer via Internet, in which wearer records images of themselves wearing reference marker, using digital camera connected to computer and these are transmitted to server
US6419549B2 (en) 2000-02-29 2002-07-16 Asahi Kogaku Kogyo Kabushiki Kaisha Manufacturing method of spectacle lenses and system thereof
JP4898055B2 (en) 2000-03-08 2012-03-14 ぴあ株式会社 Ticket transfer system
US7657083B2 (en) 2000-03-08 2010-02-02 Cyberextruder.Com, Inc. System, method, and apparatus for generating a three-dimensional representation from one or more two-dimensional images
US6807290B2 (en) 2000-03-09 2004-10-19 Microsoft Corporation Rapid computer modeling of faces for animation
JP2001330806A (en) 2000-03-17 2001-11-30 Topcon Corp Spectacle frame synthesizing system and spectacle frame sales method
EP1136869A1 (en) 2000-03-17 2001-09-26 Kabushiki Kaisha TOPCON Eyeglass frame selecting system
JP3853756B2 (en) 2000-03-17 2006-12-06 株式会社トプコン Eyeglass lens processing simulation equipment
AU2001249703B2 (en) 2000-03-30 2005-07-14 Q2100, Inc. Apparatus and system for the production of plastic lenses
JP2001340296A (en) 2000-03-30 2001-12-11 Topcon Corp Optimetric system
US7149665B2 (en) 2000-04-03 2006-12-12 Browzwear International Ltd System and method for simulation of virtual wear articles on virtual models
WO2001078630A1 (en) 2000-04-14 2001-10-25 Pavel Efimovich Golikov Method for increasing visual working capacity when one is working with display facilities, light-filter devices for performing said method and method for producing these devices
DE10033983A1 (en) 2000-04-27 2001-10-31 Frank Mothes Appliance for determining eyeglass centering data with the aid of video camera and computer
US7224357B2 (en) 2000-05-03 2007-05-29 University Of Southern California Three-dimensional modeling based on photographic images
US6968075B1 (en) 2000-05-09 2005-11-22 Chang Kurt C System and method for three-dimensional shape and size measurement
EP1299787A4 (en) 2000-05-18 2005-02-02 Visionix Ltd Spectacles fitting system and fitting methods useful therein
WO2001091016A1 (en) 2000-05-25 2001-11-29 Realitybuy, Inc. A real time, three-dimensional, configurable, interactive product display system and method
US6535223B1 (en) 2000-05-31 2003-03-18 Schmidt Laboratories, Inc. Method and system for determining pupiltary distant and element height
US7489768B1 (en) 2000-06-01 2009-02-10 Jonathan Strietzel Method and apparatus for telecommunications advertising
FR2810770B1 (en) 2000-06-23 2003-01-03 France Telecom REFINING A TRIANGULAR MESH REPRESENTATIVE OF AN OBJECT IN THREE DIMENSIONS
DE10031042A1 (en) 2000-06-26 2002-01-03 Autodesk Inc A method for generating a 2D view of a 3D model, the 3D model comprising at least one object
FI20001688A (en) 2000-07-20 2002-01-21 Juhani Wahlgren Method and apparatus for simultaneous text display of theater, opera, etc. performances or event or fiction information
US7062722B1 (en) 2000-08-22 2006-06-13 Bruce Carlin Network-linked interactive three-dimensional composition and display of saleable objects in situ in viewer-selected scenes for purposes of promotion and procurement
US7523411B2 (en) 2000-08-22 2009-04-21 Bruce Carlin Network-linked interactive three-dimensional composition and display of saleable objects in situ in viewer-selected scenes for purposes of object promotion and procurement, and generation of object advertisements
US6791584B1 (en) 2000-09-05 2004-09-14 Yiling Xie Method of scaling face image with spectacle frame image through computer
US6386562B1 (en) 2000-09-11 2002-05-14 Hui Shan Kuo Scooter having changeable steering mechanism
US6664956B1 (en) 2000-10-12 2003-12-16 Momentum Bilgisayar, Yazilim, Danismanlik, Ticaret A. S. Method for generating a personalized 3-D face model
EP1336891B1 (en) 2000-10-27 2010-09-01 Hoya Corporation Production method for spectacle lens and supply system for spectacle lens
US7736147B2 (en) 2000-10-30 2010-06-15 Align Technology, Inc. Systems and methods for bite-setting teeth models
US6792401B1 (en) 2000-10-31 2004-09-14 Diamond Visionics Company Internet-based modeling kiosk and method for fitting and selling prescription eyeglasses
CA2326087A1 (en) 2000-11-16 2002-05-16 Craig Summers Inward-looking imaging system
US6493073B2 (en) 2000-12-11 2002-12-10 Sheldon L. Epstein System and method for measuring properties of an optical component
IL140494A0 (en) 2000-12-22 2002-02-10 Pneumatic control system for a biopsy device
GB0101371D0 (en) 2001-01-19 2001-03-07 Virtual Mirrors Ltd Production and visualisation of garments
US7016824B2 (en) 2001-02-06 2006-03-21 Geometrix, Inc. Interactive try-on platform for eyeglasses
GB2372165A (en) 2001-02-10 2002-08-14 Hewlett Packard Co A method of selectively storing images
DE10106562B4 (en) 2001-02-13 2008-07-03 Rodenstock Gmbh Method for demonstrating the influence of a particular spectacle frame and the optical glasses used in this spectacle frame
US6726463B2 (en) 2001-02-20 2004-04-27 Q2100, Inc. Apparatus for preparing an eyeglass lens having a dual computer system controller
US6808381B2 (en) 2001-02-20 2004-10-26 Q2100, Inc. Apparatus for preparing an eyeglass lens having a controller
US7051290B2 (en) 2001-02-20 2006-05-23 Q2100, Inc. Graphical interface for receiving eyeglass prescription information
US6893245B2 (en) 2001-02-20 2005-05-17 Q2100, Inc. Apparatus for preparing an eyeglass lens having a computer system controller
US6950804B2 (en) 2001-02-26 2005-09-27 Pika Media Systems and methods for distributing targeted multimedia content and advertising
US6943789B2 (en) 2001-03-16 2005-09-13 Mitsubishi Electric Research Labs, Inc Conversion of adaptively sampled distance fields to triangles
US7034818B2 (en) 2001-03-16 2006-04-25 Mitsubishi Electric Research Laboratories, Inc. System and method for converting range data to 3D models
US6873720B2 (en) 2001-03-20 2005-03-29 Synopsys, Inc. System and method of providing mask defect printability analysis
ITBO20010218A1 (en) 2001-04-13 2002-10-13 Luxottica S P A FRAME FOR GLASSES WITH PERFECTED STRATIFORM COATING
US7717708B2 (en) 2001-04-13 2010-05-18 Orametrix, Inc. Method and system for integrated orthodontic treatment planning using unified workstation
US7156655B2 (en) 2001-04-13 2007-01-02 Orametrix, Inc. Method and system for comprehensive evaluation of orthodontic treatment using unified workstation
US7003515B1 (en) 2001-05-16 2006-02-21 Pandora Media, Inc. Consumer item matching method and system
JP3527489B2 (en) 2001-08-03 2004-05-17 株式会社ソニー・コンピュータエンタテインメント Drawing processing method and apparatus, recording medium storing drawing processing program, drawing processing program
US20030030904A1 (en) 2001-08-13 2003-02-13 Mei-Ling Huang Stereoscopic viewing assembly with adjustable fields of vision in two dimensions
US7123263B2 (en) 2001-08-14 2006-10-17 Pulse Entertainment, Inc. Automatic 3D modeling system and method
DE10140656A1 (en) 2001-08-24 2003-03-13 Rodenstock Optik G Process for designing and optimizing an individual lens
JP2005502111A (en) 2001-08-31 2005-01-20 ソリッドワークス コーポレイション Simultaneous use of 2D and 3D modeling data
US7103211B1 (en) 2001-09-04 2006-09-05 Geometrix, Inc. Method and apparatus for generating 3D face models from one camera
US6961439B2 (en) 2001-09-26 2005-11-01 The United States Of America As Represented By The Secretary Of The Navy Method and apparatus for producing spatialized audio signals
US7634103B2 (en) 2001-10-01 2009-12-15 L'oreal S.A. Analysis using a three-dimensional facial image
US7081893B2 (en) 2001-10-10 2006-07-25 Sony Computer Entertainment America Inc. System and method for point pushing to render polygons in environments with changing levels of detail
US7209557B2 (en) 2001-10-18 2007-04-24 Lenovo Singapore Pte, Ltd Apparatus and method for computer screen security
US6682195B2 (en) 2001-10-25 2004-01-27 Ophthonix, Inc. Custom eyeglass manufacturing method
US7434931B2 (en) 2001-10-25 2008-10-14 Ophthonix Custom eyeglass manufacturing method
KR100450823B1 (en) 2001-11-27 2004-10-01 삼성전자주식회사 Node structure for representing 3-dimensional objects using depth image
US20030110099A1 (en) 2001-12-11 2003-06-12 Philips Electronics North America Corporation Virtual wearing of clothes
US7221809B2 (en) 2001-12-17 2007-05-22 Genex Technologies, Inc. Face recognition system and method
US20040217956A1 (en) 2002-02-28 2004-11-04 Paul Besl Method and system for processing, compressing, streaming, and interactive rendering of 3D color image data
JP2003271965A (en) 2002-03-19 2003-09-26 Fujitsu Ltd Program, method and device for authentication of hand- written signature
WO2003084448A1 (en) 2002-04-11 2003-10-16 Sendo Co., Ltd. Color-blindness correcting eyeglass and method for manufacturing color-blindness correcting eyeglass
US20040004633A1 (en) 2002-07-03 2004-01-08 Perry James N. Web-based system and method for ordering and fitting prescription lens eyewear
US7492364B2 (en) 2002-07-23 2009-02-17 Imagecom, Inc. System and method for creating and updating a three-dimensional model and creating a related neutral file format
JP2004062565A (en) 2002-07-30 2004-02-26 Canon Inc Image processor and image processing method, and program storage medium
US6775128B2 (en) 2002-10-03 2004-08-10 Julio Leitao Protective cover sleeve for laptop computer screens
US6825838B2 (en) 2002-10-11 2004-11-30 Sonocine, Inc. 3D modeling system
DE502004010054D1 (en) 2003-02-22 2009-10-29 Hans-Joachim Ollendorf Method for determining the pupillary distance
EP1599829A1 (en) 2003-03-06 2005-11-30 Animetrics, Inc. Viewpoint-invariant detection and identification of a three-dimensional object from two-dimensional imagery
US7711155B1 (en) 2003-04-14 2010-05-04 Videomining Corporation Method and system for enhancing three dimensional face modeling using demographic classification
US7242807B2 (en) 2003-05-05 2007-07-10 Fish & Richardson P.C. Imaging of biometric information based on three-dimensional shapes
US20040223631A1 (en) 2003-05-07 2004-11-11 Roman Waupotitsch Face recognition based on obtaining two dimensional information from three-dimensional face shapes
TW594594B (en) 2003-05-16 2004-06-21 Ind Tech Res Inst A multilevel texture processing method for mapping multiple images onto 3D models
US7421097B2 (en) 2003-05-27 2008-09-02 Honeywell International Inc. Face identification verification using 3 dimensional modeling
US20040257364A1 (en) 2003-06-18 2004-12-23 Basler Gregory A. Shadow casting within a virtual three-dimensional terrain model
GB2403883B (en) 2003-07-08 2007-08-22 Delcam Plc Method and system for the modelling of 3D objects
US7814436B2 (en) 2003-07-28 2010-10-12 Autodesk, Inc. 3D scene orientation indicator system with scene orientation change capability
AU2003259166A1 (en) 2003-08-06 2005-03-07 Catalina Marketing International, Inc. Delivery of targeted offers for movie theaters and other retail stores
US7151545B2 (en) 2003-08-06 2006-12-19 Landmark Graphics Corporation System and method for applying accurate three-dimensional volume textures to arbitrary triangulated surfaces
US7212664B2 (en) 2003-08-07 2007-05-01 Mitsubishi Electric Research Laboratories, Inc. Constructing heads from 3D models and 2D silhouettes
US7426292B2 (en) 2003-08-07 2008-09-16 Mitsubishi Electric Research Laboratories, Inc. Method for determining optimal viewpoints for 3D face modeling and face recognition
US20050111705A1 (en) 2003-08-26 2005-05-26 Roman Waupotitsch Passive stereo sensing for 3D facial shape biometrics
KR100682889B1 (en) 2003-08-29 2007-02-15 삼성전자주식회사 Method and Apparatus for image-based photorealistic 3D face modeling
JP2005100367A (en) 2003-09-02 2005-04-14 Fuji Photo Film Co Ltd Image generating apparatus, image generating method and image generating program
US20050190264A1 (en) 2003-09-12 2005-09-01 Neal Michael R. Method of interactive system for previewing and selecting eyewear
MXPA06003890A (en) 2003-10-06 2006-07-03 Disney Entpr Inc System and method of playback and feature control for video players.
WO2005038700A1 (en) 2003-10-09 2005-04-28 University Of York Image recognition
US7290201B1 (en) 2003-11-12 2007-10-30 Xilinx, Inc. Scheme for eliminating the effects of duty cycle asymmetry in clock-forwarded double data rate interface applications
US7889209B2 (en) 2003-12-10 2011-02-15 Sensable Technologies, Inc. Apparatus and methods for wrapping texture onto the surface of a virtual object
US20050208457A1 (en) 2004-01-05 2005-09-22 Wolfgang Fink Digital object recognition audio-assistant for the visually impaired
JPWO2005076210A1 (en) * 2004-02-05 2007-10-18 ソフトバンクモバイル株式会社 Image processing method, image processing apparatus, and mobile communication terminal apparatus
FR2866718B1 (en) 2004-02-24 2006-05-05 Essilor Int CENTRAL-BLOCKING DEVICE OF AN OPHTHALMIC LENSES LENS, AUTOMATIC DETECTION METHOD AND ASSOCIATED MANUAL CENTERING METHODS
US7154529B2 (en) 2004-03-12 2006-12-26 Hoke Donald G System and method for enabling a person to view images of the person wearing an accessory before purchasing the accessory
JP2005269022A (en) 2004-03-17 2005-09-29 Ricoh Co Ltd Encoder and encoding method, encoded data editor and editing method and program, and recording medium
EP1728467A4 (en) 2004-03-26 2009-09-16 Hoya Corp Spectacle lens supply system, spectacle wearing parameter measurement device, spectacle wearing inspection system, spectacle lens, and spectacle
US7441895B2 (en) 2004-03-26 2008-10-28 Hoya Corporation Spectacle lens supply system, spectacle wearing parameter measurement apparatus, spectacle wearing test system, spectacle lens, and spectacle
US20050226509A1 (en) 2004-03-30 2005-10-13 Thomas Maurer Efficient classification of three dimensional face models for human identification and other applications
US20070013873A9 (en) 2004-04-29 2007-01-18 Jacobson Joseph M Low cost portable computing device
US7630580B1 (en) 2004-05-04 2009-12-08 AgentSheets, Inc. Diffusion-based interactive extrusion of 2D images into 3D models
US7436988B2 (en) 2004-06-03 2008-10-14 Arizona Board Of Regents 3D face authentication and recognition based on bilateral symmetry analysis
US7804997B2 (en) 2004-06-10 2010-09-28 Technest Holdings, Inc. Method and system for a three dimensional facial recognition system
US7133048B2 (en) 2004-06-30 2006-11-07 Mitsubishi Electric Research Laboratories, Inc. Variable multilinear models for facial synthesis
US20060012748A1 (en) 2004-07-15 2006-01-19 Parikumar Periasamy Dynamic multifocal spectacle frame
US7218323B1 (en) 2004-08-13 2007-05-15 Ngrain (Canada) Corporation Method and system for rendering voxel data while addressing multiple voxel set interpenetration
SE528068C2 (en) 2004-08-19 2006-08-22 Jan Erik Solem Med Jsolutions Three dimensional object recognizing method for e.g. aircraft, involves detecting image features in obtained two dimensional representation, and comparing recovered three dimensional shape with reference representation of object
US7219995B2 (en) 2004-08-25 2007-05-22 Hans-Joachim Ollendorf Apparatus for determining the distance between pupils
DE102004059448A1 (en) 2004-11-19 2006-06-01 Rodenstock Gmbh Method and apparatus for manufacturing a spectacle lens; System and computer program product for manufacturing a spectacle lens
US7324110B2 (en) 2004-12-09 2008-01-29 Image Metrics Limited Method and system for cleaning motion capture data
US20060127852A1 (en) 2004-12-14 2006-06-15 Huafeng Wen Image based orthodontic treatment viewing system
KR100511210B1 (en) 2004-12-27 2005-08-30 G&G Commerce Co., Ltd. Method for converting a 2D image into a pseudo-3D image and user-adapted total coordination method using artificial intelligence, and service business method thereof
US20110102553A1 (en) 2007-02-28 2011-05-05 Tessera Technologies Ireland Limited Enhanced real-time face models from stereo imaging
US7533453B2 (en) 2005-01-24 2009-05-19 Yancy Virgil T E-facet optical lens
US7860301B2 (en) 2005-02-11 2010-12-28 Macdonald Dettwiler And Associates Inc. 3D imaging system
US20120021835A1 (en) 2005-02-11 2012-01-26 Iprd Labs Llc Systems and methods for server based video gaming
US20060212150A1 (en) 2005-02-18 2006-09-21 Sims William Jr Method of providing 3D models
KR20070119018A (en) 2005-02-23 2007-12-18 Craig Summers Automatic scene modeling for the 3d camera and 3d video
JP4473754B2 (en) 2005-03-11 2010-06-02 Toshiba Corporation Virtual fitting device
US20060216680A1 (en) 2005-03-24 2006-09-28 Eharmony.Com Selection of relationship improvement content for users in a relationship
US7760923B2 (en) 2005-03-24 2010-07-20 Optasia Medical Limited Method and system for characterization of knee joint morphology
DE102005014775B4 (en) 2005-03-31 2008-12-11 Nokia Siemens Networks Gmbh & Co.Kg Method, communication arrangement and communication device for controlling access to at least one communication device
US9001215B2 (en) 2005-06-02 2015-04-07 The Invention Science Fund I, Llc Estimating shared image device operational capabilities or resources
US7830384B1 (en) 2005-04-27 2010-11-09 Image Metrics Limited Animating graphical objects using input video
US7415152B2 (en) 2005-04-29 2008-08-19 Microsoft Corporation Method and system for constructing a 3D representation of a face from a 2D representation
US7609859B2 (en) 2005-06-14 2009-10-27 Mitsubishi Electric Research Laboratories, Inc. Method and system for generating bi-linear models for faces
EP1897033A4 (en) 2005-06-16 2015-06-24 Strider Labs Inc System and method for recognition in 2d images using 3d class models
US7756325B2 (en) 2005-06-20 2010-07-13 University Of Basel Estimating 3D shape and texture of a 3D object based on a 2D image of the 3D object
US7953675B2 (en) 2005-07-01 2011-05-31 University Of Southern California Tensor voting in N dimensional spaces
US7961914B1 (en) 2005-07-12 2011-06-14 Smith Robert J D Portable storage apparatus with integral biometric-based access control system
CN100403974C (en) 2005-07-27 2008-07-23 李信亮 Method for preparing eyeglasses based on an uploaded photograph of the customer's head and optometry data
ITBO20050524A1 (en) 2005-08-05 2007-02-06 Luxottica Srl Lens for goggles and eyeglasses
JP4659554B2 (en) 2005-08-09 2011-03-30 Menicon Co., Ltd. Ophthalmic lens manufacturing system and manufacturing method
DE102005038859A1 (en) 2005-08-17 2007-03-01 Rodenstock Gmbh Tool for calculating the performance of progressive lenses
US8218836B2 (en) 2005-09-12 2012-07-10 Rutgers, The State University Of New Jersey System and methods for generating three-dimensional images from two-dimensional bioluminescence images and visualizing tumor shapes and locations
US7563975B2 (en) 2005-09-14 2009-07-21 Mattel, Inc. Music production system
DE102005048436B3 (en) 2005-10-07 2007-03-29 Buchmann Deutschland Gmbh Blank lenses adapting method, for spectacle frame, involves testing whether adjustment of one lens requires corresponding concurrent adjustment of another lens to maintain preset tolerances corresponding to vertical amplitude of fusion
US7755619B2 (en) 2005-10-13 2010-07-13 Microsoft Corporation Automatic 3D face-modeling from video
GB2431793B (en) 2005-10-31 2011-04-27 Sony Uk Ltd Image processing
US7768528B1 (en) 2005-11-09 2010-08-03 Image Metrics Limited Replacement of faces in existing video
US20070104360A1 (en) 2005-11-09 2007-05-10 Smedia Technology Corporation System and method for capturing 3D face
WO2007054561A1 (en) * 2005-11-10 2007-05-18 Bracco Research Sa Instantaneous visualization of contrast agent concentration in imaging applications
KR100735564B1 (en) 2005-12-02 2007-07-04 Samsung Electronics Co., Ltd. Apparatus, system, and method for mapping information
JP3962803B2 (en) 2005-12-16 2007-08-22 International Business Machines Corporation Head detection device, head detection method, and head detection program
JP4868171B2 (en) 2005-12-27 2012-02-01 NEC Corporation Data compression method and apparatus, data restoration method and apparatus, and program
KR101334173B1 (en) 2006-01-11 2013-11-28 Samsung Electronics Co., Ltd. Method and apparatus for encoding/decoding graphic data
US7856125B2 (en) 2006-01-31 2010-12-21 University Of Southern California 3D face reconstruction from 2D images
US7587082B1 (en) 2006-02-17 2009-09-08 Cognitech, Inc. Object recognition based on 2D images and 3D models
JP4990917B2 (en) 2006-02-23 2012-08-01 Imaginestics LLC A method that allows a user to draw a component as input to search for the component in the database
US8026917B1 (en) 2006-05-01 2011-09-27 Image Metrics Ltd Development tools for animated character rigging
US8433157B2 (en) 2006-05-04 2013-04-30 Thomson Licensing System and method for three-dimensional object reconstruction from two-dimensional images
US20070262988A1 (en) 2006-05-09 2007-11-15 Pixar Animation Studios Method and apparatus for using voxel mip maps and brick maps as geometric primitives in image rendering process
EP1862110A1 (en) 2006-05-29 2007-12-05 Essilor International (Compagnie Generale D'optique) Method for optimizing eyeglass lenses
US7573489B2 (en) 2006-06-01 2009-08-11 Industrial Light & Magic Infilling for 2D to 3D image conversion
US7573475B2 (en) 2006-06-01 2009-08-11 Industrial Light & Magic 2D to 3D image conversion
WO2008002630A2 (en) 2006-06-26 2008-01-03 University Of Southern California Seamless image integration into 3d models
JP5053373B2 (en) 2006-06-29 2012-10-17 Thomson Licensing Adaptive pixel-based filtering
DE102006030204A1 (en) 2006-06-30 2008-01-03 Rodenstock Gmbh Pair of spectacle lenses in anisometropia
DE102006033491A1 (en) 2006-07-19 2008-01-31 Rodenstock Gmbh Device and method for determining a wearing position of spectacles, computer program device
DE102006033490A1 (en) 2006-07-19 2008-01-31 Rodenstock Gmbh Apparatus and method for determining a position of a spectacle lens relative to a spectacle frame, computer program device
EP1881457B1 (en) 2006-07-21 2017-09-13 Dassault Systèmes Method for creating a parametric surface symmetric with respect to a given symmetry operation
US20080136814A1 (en) 2006-09-17 2008-06-12 Chang Woo Chu System and method for generating 3-d facial model and animation using one video camera
JP5330246B2 (en) 2006-09-29 2013-10-30 Thomson Licensing Automatic parameter estimation for adaptive pixel-based filtering
US8073196B2 (en) 2006-10-16 2011-12-06 University Of Southern California Detection and tracking of moving objects from a moving platform in presence of strong parallax
US20080112610A1 (en) 2006-11-14 2008-05-15 S2, Inc. System and method for 3d model generation
US7656402B2 (en) 2006-11-15 2010-02-02 Tahg, Llc Method for creating, manufacturing, and distributing three-dimensional models
US8330801B2 (en) 2006-12-22 2012-12-11 Qualcomm Incorporated Complexity-adaptive 2D-to-3D video sequence conversion
TW200828043A (en) 2006-12-29 2008-07-01 Cheng-Hsien Yang Terminal try-on simulation system and operating and applying method thereof
US8199152B2 (en) 2007-01-16 2012-06-12 Lucasfilm Entertainment Company Ltd. Combining multiple session content for animation libraries
US8542236B2 (en) 2007-01-16 2013-09-24 Lucasfilm Entertainment Company Ltd. Generating animation libraries
WO2008087556A2 (en) 2007-01-16 2008-07-24 Optasia Medical, Limited Image processing systems and methods
US8130225B2 (en) 2007-01-16 2012-03-06 Lucasfilm Entertainment Company Ltd. Using animation libraries for object identification
US8644601B2 (en) 2007-01-19 2014-02-04 Thomson Licensing Reducing contours in digital images
US8303113B2 (en) 2007-01-25 2012-11-06 Rodenstock Gmbh Method for calculating a spectacle lens having a variable position of the reference points
ES2483171T3 (en) 2007-01-25 2014-08-05 Rodenstock Gmbh Reference points for ortho position
JP5490546B2 (en) 2007-01-25 2014-05-14 Rodenstock GmbH Method for optimizing spectacle lenses
US7699300B2 (en) 2007-02-01 2010-04-20 Toshiba Tec Kabushiki Kaisha Sheet post-processing apparatus
US8200502B2 (en) 2007-02-14 2012-06-12 Optivision, Inc. Frame tracer web browser component
US20110071804A1 (en) 2007-02-21 2011-03-24 Yiling Xie Method And The Associated Mechanism For 3-D Simulation Stored-Image Database-Driven Spectacle Frame Fitting Services Over Public Network
US7665843B2 (en) 2007-02-21 2010-02-23 Yiling Xie Method and the associate mechanism for stored-image database-driven spectacle frame fitting services over public network
US20080201641A1 (en) 2007-02-21 2008-08-21 Yiling Xie Method And The Associated Mechanism For 3-D Simulation Stored-Image Database-Driven Spectacle Frame Fitting Services Over Public Network
JP5049356B2 (en) 2007-02-28 2012-10-17 DigitalOptics Corporation Europe Limited Separation of directional lighting variability in statistical face modeling based on texture space decomposition
US8286083B2 (en) 2007-03-13 2012-10-09 Ricoh Co., Ltd. Copying documents from electronic displays
BRPI0721462A2 (en) 2007-03-23 2013-01-08 Thomson Licensing 2d image region classification system and method for 2d to 3d conversion
ATE549692T1 (en) 2007-03-26 2012-03-15 Thomson Licensing Method and device for detecting objects of interest in a football video by color segmentation and shape analysis
US20080240588A1 (en) 2007-03-29 2008-10-02 Mikhail Tsoupko-Sitnikov Image processing method and image processing apparatus
DE102007020031A1 (en) 2007-04-27 2008-10-30 Rodenstock Gmbh Glasses, method of making glasses and computer program product
US20080271078A1 (en) 2007-04-30 2008-10-30 Google Inc. Momentary Electronic Program Guide
US8059917B2 (en) 2007-04-30 2011-11-15 Texas Instruments Incorporated 3-D modeling
US20080278633A1 (en) 2007-05-09 2008-11-13 Mikhail Tsoupko-Sitnikov Image processing method and image processing apparatus
US20080279478A1 (en) 2007-05-09 2008-11-13 Mikhail Tsoupko-Sitnikov Image processing method and image processing apparatus
US8009880B2 (en) 2007-05-11 2011-08-30 Microsoft Corporation Recovering parameters from a sub-optimal image
US8212812B2 (en) 2007-05-21 2012-07-03 Siemens Corporation Active shape model for vehicle modeling and re-identification
WO2008147809A1 (en) 2007-05-24 2008-12-04 Schlumberger Canada Limited Near surface layer modeling
US20080297503A1 (en) 2007-05-30 2008-12-04 John Dickinson System and method for reconstructing a 3D solid model from a 2D line drawing
GB2449855A (en) 2007-06-05 2008-12-10 Steven Harbutt System and method for measuring pupillary distance
US7848548B1 (en) 2007-06-11 2010-12-07 Videomining Corporation Method and system for robust demographic classification using pose independent model from sequence of face images
US20080310757A1 (en) 2007-06-15 2008-12-18 George Wolberg System and related methods for automatically aligning 2D images of a scene to a 3D model of the scene
US20090010507A1 (en) 2007-07-02 2009-01-08 Zheng Jason Geng System and method for generating a 3d model of anatomical structure using a plurality of 2d images
FI120325B (en) 2007-07-04 2009-09-15 Theta Optics Ltd Oy Method of making glasses
DE102007032564A1 (en) 2007-07-12 2009-01-15 Rodenstock Gmbh Method for checking and / or determining user data, computer program product and device
WO2009023012A1 (en) 2007-08-16 2009-02-19 Nasir Wajihuddin Interactive custom design and building of toy vehicle
ES2565244T3 (en) 2007-10-05 2016-04-01 Essilor International (Compagnie Générale d'Optique) A method to provide an ophthalmic lens for glasses by calculating or selecting a design
US8090160B2 (en) 2007-10-12 2012-01-03 The University Of Houston System Automated method for human face modeling and relighting with application to face recognition
US8144153B1 (en) 2007-11-20 2012-03-27 Lucasfilm Entertainment Company Ltd. Model production for animation libraries
US20090135177A1 (en) 2007-11-20 2009-05-28 Big Stage Entertainment, Inc. Systems and methods for voice personalization of video content
US20090129402A1 (en) 2007-11-21 2009-05-21 Simple Star, Inc. Method and System For Scheduling Multimedia Shows
KR100914845B1 (en) 2007-12-15 2009-09-02 Electronics and Telecommunications Research Institute Method and apparatus for 3d reconstructing of object by using multi-view image information
KR100914847B1 (en) 2007-12-15 2009-09-02 Electronics and Telecommunications Research Institute Method and apparatus for creating 3d face model by using multi-view image information
KR100940862B1 (en) 2007-12-17 2010-02-09 Electronics and Telecommunications Research Institute Head motion tracking method for 3d facial model animation from a video stream
US8160345B2 (en) 2008-04-30 2012-04-17 Otismed Corporation System and method for image segmentation in generating computer models of a joint to undergo arthroplasty
EP2031434B1 (en) 2007-12-28 2022-10-19 Essilor International An asynchronous method for obtaining spectacle features to order
EP2037314B1 (en) 2007-12-28 2021-12-01 Essilor International A method and computer means for choosing spectacle lenses adapted to a frame
KR101432177B1 (en) 2008-01-21 2014-08-22 Samsung Electronics Co., Ltd. Portable device and photography processing method thereof, and photography processing system having the same
US8217934B2 (en) 2008-01-23 2012-07-10 Adobe Systems Incorporated System and methods for rendering transparent surfaces in high depth complexity scenes using hybrid and coherent layer peeling
US9305389B2 (en) 2008-02-28 2016-04-05 Autodesk, Inc. Reducing seam artifacts when applying a texture to a three-dimensional (3D) model
US8260006B1 (en) 2008-03-14 2012-09-04 Google Inc. System and method of aligning images
DE102008015189A1 (en) 2008-03-20 2009-10-01 Rodenstock Gmbh Rescaling the target astigmatism for other additions
GB2458388A (en) 2008-03-21 2009-09-23 Dressbot Inc A collaborative online shopping environment, virtual mall, store, etc. in which payments may be shared, products recommended and users modelled.
US8351649B1 (en) 2008-04-01 2013-01-08 University Of Southern California Video feed target tracking
WO2009126261A2 (en) 2008-04-11 2009-10-15 Thomson Licensing System and method for enhancing the visibility of an object in a digital picture
WO2009128783A1 (en) 2008-04-14 2009-10-22 Xid Technologies Pte Ltd An image synthesis method
US8374422B2 (en) 2008-04-14 2013-02-12 Xid Technologies Pte Ltd. Face expressions identification
US8274506B1 (en) 2008-04-28 2012-09-25 Adobe Systems Incorporated System and methods for creating a three-dimensional view of a two-dimensional map
KR101085390B1 (en) 2008-04-30 2011-11-21 Core Logic Inc. Image presenting method and apparatus for 3D navigation, and mobile apparatus comprising the same apparatus
WO2009135183A1 (en) 2008-05-02 2009-11-05 Zentech, Inc. Automated generation of 3d models from 2d computer-aided design (cad) drawings
US8737721B2 (en) 2008-05-07 2014-05-27 Microsoft Corporation Procedural authoring
EP2280651A2 (en) 2008-05-16 2011-02-09 Geodigm Corporation Method and apparatus for combining 3d dental scans with other 3d data sets
US8126249B2 (en) 2008-05-30 2012-02-28 Optasia Medical Limited Methods of and system for detection and tracking of osteoporosis
US8204299B2 (en) 2008-06-12 2012-06-19 Microsoft Corporation 3D content aggregation built into devices
US20090316945A1 (en) 2008-06-20 2009-12-24 Akansu Ali N Transportable Sensor Devices
US8284190B2 (en) 2008-06-25 2012-10-09 Microsoft Corporation Registration of street-level imagery to 3D building models
EP2291795A1 (en) 2008-07-02 2011-03-09 C-True Ltd. Face recognition system and method
US8131063B2 (en) 2008-07-16 2012-03-06 Seiko Epson Corporation Model-based object image processing
US8155411B2 (en) 2008-07-22 2012-04-10 Pie Medical Imaging B.V. Method, apparatus and computer program for quantitative bifurcation analysis in 3D using multiple 2D angiographic images
US8248417B1 (en) 2008-08-28 2012-08-21 Adobe Systems Incorporated Flattening 3D images
CN102138333B (en) 2008-08-29 2014-09-24 汤姆逊许可公司 View synthesis with heuristic view blending
EP2340516A1 (en) 2008-09-04 2011-07-06 Essilor International (Compagnie Générale D'Optique) Method for providing finishing parameters
EP2161611A1 (en) 2008-09-04 2010-03-10 Essilor International (Compagnie Générale D'Optique) Method for optimizing the settings of an ophtalmic system
CN102227748A (en) 2008-10-03 2011-10-26 3M创新有限公司 Systems and methods for multi-perspective scene analysis
US20120075296A1 (en) 2008-10-08 2012-03-29 Strider Labs, Inc. System and Method for Constructing a 3D Scene Model From an Image
US8160325B2 (en) 2008-10-08 2012-04-17 Fujifilm Medical Systems Usa, Inc. Method and system for surgical planning
WO2010042990A1 (en) 2008-10-16 2010-04-22 Seeing Machines Limited Online marketing of facial products using real-time face tracking
KR20100050052A (en) 2008-11-05 2010-05-13 김영준 Virtual glasses wearing method
EP3406222B1 (en) 2008-11-20 2021-11-10 Align Technology, Inc. Orthodontic systems and methods including parametric attachments
US8982122B2 (en) 2008-11-24 2015-03-17 Mixamo, Inc. Real time concurrent design of shape, texture, and motion for 3D character animation
TW201023092A (en) 2008-12-02 2010-06-16 Nat Univ Tsing Hua 3D face model construction method
TWI382354B (en) 2008-12-02 2013-01-11 Nat Univ Tsing Hua Face recognition method
EP2202560A1 (en) 2008-12-23 2010-06-30 Essilor International (Compagnie Générale D'Optique) A method for providing a spectacle ophthalmic lens by calculating or selecting a design
IT1392435B1 (en) 2008-12-23 2012-03-09 Luxottica Srl Multilayer film depicting a two-dimensional colored image visible only through a polarized filter, and process for making it.
IT1392436B1 (en) 2008-12-23 2012-03-09 Luxottica Srl Multilayer film depicting a two-dimensional colored image visible only through a polarized filter, and process for making it.
IT1392623B1 (en) 2008-12-23 2012-03-16 Luxottica Srl Display device for encrypted images visible only through a polarized filter, and process for making it.
DE102009005214A1 (en) 2009-01-20 2010-07-22 Rodenstock Gmbh Automatic progressive lens design modification
DE102009005206A1 (en) 2009-01-20 2010-07-22 Rodenstock Gmbh Variable progressive lens design
WO2010084019A1 (en) 2009-01-23 2010-07-29 Rodenstock Gmbh Controlling designs using a polygonal design
KR101670282B1 (en) 2009-02-10 2016-10-28 Thomson Licensing Video matting based on foreground-background constraint propagation
US8355079B2 (en) 2009-02-10 2013-01-15 Thomson Licensing Temporally consistent caption detection on videos using a 3D spatiotemporal method
US8605989B2 (en) 2009-02-13 2013-12-10 Cognitech, Inc. Registration and comparison of three dimensional objects in facial imaging
US8260039B2 (en) 2009-02-25 2012-09-04 Seiko Epson Corporation Object model fitting using manifold constraints
US8208717B2 (en) 2009-02-25 2012-06-26 Seiko Epson Corporation Combining subcomponent models for object image modeling
US8260038B2 (en) 2009-02-25 2012-09-04 Seiko Epson Corporation Subdivision weighting for robust object model fitting
US8204301B2 (en) 2009-02-25 2012-06-19 Seiko Epson Corporation Iterative data reweighting for balanced model learning
US8605942B2 (en) 2009-02-26 2013-12-10 Nikon Corporation Subject tracking apparatus, imaging apparatus and subject tracking method
US8860723B2 (en) 2009-03-09 2014-10-14 Donya Labs Ab Bounded simplification of geometrical computer data
USD616918S1 (en) 2009-03-27 2010-06-01 Luxottica Group S.P.A. Eyeglass
US8372319B2 (en) 2009-06-25 2013-02-12 Liguori Management Ophthalmic eyewear with lenses cast into a frame and methods of fabrication
WO2011002933A2 (en) 2009-06-30 2011-01-06 Museami, Inc. Vocal and instrumental audio effects
US20110001791A1 (en) 2009-07-02 2011-01-06 Emaze Imaging Techonolgies Ltd. Method and system for generating and displaying a three-dimensional model of physical objects
US8553973B2 (en) 2009-07-07 2013-10-08 University Of Basel Modeling methods and systems
US9020259B2 (en) 2009-07-20 2015-04-28 Thomson Licensing Method for detecting and adapting video processing for far-view scenes in sports video
WO2011011059A1 (en) 2009-07-21 2011-01-27 Thomson Licensing A trajectory-based method to detect and enhance a moving object in a video sequence
US9582889B2 (en) 2009-07-30 2017-02-28 Apple Inc. Depth mapping based on pattern matching and stereoscopic information
ES2356457B1 (en) 2009-07-31 2012-09-07 Innovaciones Via Solar, S.L Orientable launcher for pyrotechnic shells.
US9384214B2 (en) 2009-07-31 2016-07-05 Yahoo! Inc. Image similarity from disparate sources
US8275590B2 (en) 2009-08-12 2012-09-25 Zugara, Inc. Providing a simulation of wearing items such as garments and/or accessories
US8803950B2 (en) 2009-08-24 2014-08-12 Samsung Electronics Co., Ltd. Three-dimensional face capturing apparatus and method and computer-readable medium thereof
US8537200B2 (en) 2009-10-23 2013-09-17 Qualcomm Incorporated Depth map generation techniques for conversion of 2D video data to 3D video data
WO2011058177A1 (en) 2009-11-13 2011-05-19 Essilor International (Compagnie Generale D'optique) A method for providing a spectacle ophthalmic lens by calculating or selecting a design
JP5463866B2 (en) 2009-11-16 2014-04-09 Sony Corporation Image processing apparatus, image processing method, and program
US20120306874A1 (en) 2009-12-14 2012-12-06 Agency For Science, Technology And Research Method and system for single view image 3D face synthesis
WO2011081639A2 (en) 2009-12-14 2011-07-07 Thomson Licensing Object-aware video encoding strategies
WO2011084130A1 (en) 2009-12-16 2011-07-14 Thomson Licensing Human interaction trajectory-based system
GB2476968B (en) 2010-01-15 2011-12-14 Gareth Edwards Golf grip training aid
WO2011090790A1 (en) 2010-01-22 2011-07-28 Thomson Licensing Methods and apparatus for sampling-based super resolution video encoding and decoding
US8781253B2 (en) 2010-01-22 2014-07-15 Thomson Licensing Method and apparatus for video object segmentation
KR101791919B1 (en) 2010-01-22 2017-11-02 Thomson Licensing Data pruning for video compression using example-based super-resolution
WO2011097306A1 (en) 2010-02-04 2011-08-11 Sony Corporation 2d to 3d image conversion based on image content
IT1397832B1 (en) 2010-02-08 2013-02-04 Luxottica Srl Apparatus and process for making an eyeglass frame of thermoplastic material.
US20110211816A1 (en) 2010-02-22 2011-09-01 Richard Edwin Goedeken Method and apparatus for synchronized workstation with two-dimensional and three-dimensional outputs
US20120314023A1 (en) 2010-02-24 2012-12-13 Jesus Barcons-Palau Split screen for 3d
US20120320153A1 (en) 2010-02-25 2012-12-20 Jesus Barcons-Palau Disparity estimation for stereoscopic subtitling
US20110227934A1 (en) 2010-03-19 2011-09-22 Microsoft Corporation Architecture for Volume Rendering
MX2012010842A (en) 2010-03-22 2013-04-03 Luxxotica Us Holdings Corp Ion beam assisted deposition of ophthalmic lens coatings.
US8194072B2 (en) 2010-03-26 2012-06-05 Mitsubishi Electric Research Laboratories, Inc. Method for synthetically relighting images of objects
US9959453B2 (en) 2010-03-28 2018-05-01 AR (ES) Technologies Ltd. Methods and systems for three-dimensional rendering of a virtual augmented replica of a product image merged with a model image of a human-body feature
US9141864B2 (en) 2010-04-08 2015-09-22 Vidyo, Inc. Remote gaze control system and method
US8459792B2 (en) 2010-04-26 2013-06-11 Hal E. Wilson Method and systems for measuring interpupillary distance
DE102011009473B4 (en) 2010-04-28 2022-03-17 Rodenstock Gmbh Computer-implemented method for calculating a spectacle lens with viewing-angle-dependent prescription data, device for calculating or optimizing a spectacle lens, computer program product, storage medium, method for manufacturing a spectacle lens, and use of a spectacle lens
DE102010018549B4 (en) 2010-04-28 2022-08-18 Rodenstock Gmbh Computer-implemented method for calculating a spectacle lens taking into account the rotation of the eye, device for calculating or optimizing a spectacle lens, computer program product, storage medium, method for manufacturing a spectacle lens, device for manufacturing a spectacle lens and use of a spectacle lens
US8482593B2 (en) * 2010-05-12 2013-07-09 Blue Jeans Network, Inc. Systems and methods for scalable composition of media streams for real-time multimedia communication
US8295589B2 (en) 2010-05-20 2012-10-23 Microsoft Corporation Spatially registering user photographs
DE102010021763A1 (en) 2010-05-27 2011-12-01 Carl Zeiss Vision Gmbh Method for producing a spectacle lens and spectacle lens
US8411092B2 (en) 2010-06-14 2013-04-02 Nintendo Co., Ltd. 2D imposters for simplifying processing of plural animation objects in computer graphics generation
KR101654777B1 (en) 2010-07-19 2016-09-06 Samsung Electronics Co., Ltd. Apparatus and method for scalable encoding 3d mesh, and apparatus and method for scalable decoding 3d mesh
US8861800B2 (en) 2010-07-19 2014-10-14 Carnegie Mellon University Rapid 3D face reconstruction from a 2D image and methods using such rapid 3D face reconstruction
US20120038665A1 (en) 2010-08-14 2012-02-16 H8it Inc. Systems and methods for graphing user interactions through user generated content
US9519396B2 (en) 2010-09-28 2016-12-13 Apple Inc. Systems, methods, and computer-readable media for placing an asset on a three-dimensional model
US20120256906A1 (en) 2010-09-30 2012-10-11 Trident Microsystems (Far East) Ltd. System and method to render 3d images from a 2d source
US8307560B2 (en) 2010-10-08 2012-11-13 Levi Strauss & Co. Shaped fit sizing system
FR2966038B1 (en) 2010-10-14 2012-12-14 Magellan Interoptic Method for measuring the pupillary distance of a person and associated device
WO2012051654A1 (en) 2010-10-20 2012-04-26 Luxottica Retail Australia Pty Ltd An equipment testing apparatus
WO2012054972A1 (en) 2010-10-26 2012-05-03 Luxottica Retail Australia Pty Ltd A merchandise retailing structure
WO2012054983A1 (en) 2010-10-29 2012-05-03 Luxottica Retail Australia Pty Ltd Eyewear selection system
TWI476729B (en) 2010-11-26 2015-03-11 Inst Information Industry System for combining two-dimensional images and three-dimensional models, and computer program product thereof
US9529939B2 (en) 2010-12-16 2016-12-27 Autodesk, Inc. Surfacing algorithm for designing and manufacturing 3D models
KR101796190B1 (en) 2010-12-23 2017-11-13 Electronics and Telecommunications Research Institute Apparatus and method for generating digital clone
US8533187B2 (en) 2010-12-23 2013-09-10 Google Inc. Augmentation of place ranking using 3D model activity in an area
US8447099B2 (en) 2011-01-11 2013-05-21 Eastman Kodak Company Forming 3D models using two images
US20120176380A1 (en) 2011-01-11 2012-07-12 Sen Wang Forming 3d models using periodic illumination patterns
US8861836B2 (en) 2011-01-14 2014-10-14 Sony Corporation Methods and systems for 2D to 3D conversion from a portrait image
US9129438B2 (en) 2011-01-18 2015-09-08 NedSense Loft B.V. 3D modeling and rendering from 2D images
US8885050B2 (en) 2011-02-11 2014-11-11 Dialogic (Us) Inc. Video quality monitoring
US8553956B2 (en) 2011-02-28 2013-10-08 Seiko Epson Corporation 3D current reconstruction from 2D dense MCG images
US20130088490A1 (en) 2011-04-04 2013-04-11 Aaron Rasmussen Method for eyewear fitting, recommendation, and customization using collision detection
US9070208B2 (en) 2011-05-27 2015-06-30 Lucasfilm Entertainment Company Ltd. Accelerated subsurface scattering determination for rendering 3D objects
US8520075B2 (en) 2011-06-02 2013-08-27 Dialogic Inc. Method and apparatus for reduced reference video quality measurement
KR101608253B1 (en) 2011-08-09 2016-04-01 인텔 코포레이션 Image-based multi-view 3d face generation
US20130271451A1 (en) 2011-08-09 2013-10-17 Xiaofeng Tong Parameterized 3d face generation
KR101381439B1 (en) 2011-09-15 2014-04-04 Toshiba Corporation Face recognition apparatus, and face recognition method
US8743051B1 (en) 2011-09-20 2014-06-03 Amazon Technologies, Inc. Mirror detection-based device functionality
US8766979B2 (en) 2012-01-20 2014-07-01 Vangogh Imaging, Inc. Three dimensional data compression
US8813378B2 (en) 2012-05-17 2014-08-26 Carol S. Grove System and method for drafting garment patterns from photographs and style drawings

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6661433B1 (en) * 2000-11-03 2003-12-09 Gateway, Inc. Portable wardrobe previewing device
US20050162419A1 (en) * 2002-03-26 2005-07-28 Kim So W. System and method for 3-dimension simulation of glasses
JP2004272530A (en) * 2003-03-07 2004-09-30 Digital Fashion Ltd Virtual fitting display device and method, virtual fitting display program, and computer readable recording medium with the same program recorded
KR20080086945A (en) * 2006-12-29 2008-09-29 이상민 Apparatus and method for coordination simulation for on-line shopping mall
US20110234591A1 (en) * 2010-03-26 2011-09-29 Microsoft Corporation Personalized Apparel and Accessories Inventory and Display

Also Published As

Publication number Publication date
US9996959B2 (en) 2018-06-12
US9483853B2 (en) 2016-11-01
US20170046863A1 (en) 2017-02-16
US20130342575A1 (en) 2013-12-26

Similar Documents

Publication Publication Date Title
US9996959B2 (en) Systems and methods to display rendered images
US9286715B2 (en) Systems and methods for adjusting a virtual try-on
EP2852934B1 (en) Systems and methods for rendering virtual try-on products
EP2905945B1 (en) Inter-terminal image sharing method, terminal device and communication system
CN107223270B (en) Display data processing method and device
RU2749643C1 (en) Head-mounted display device and method performed thereby
US20130314413A1 (en) Systems and methods for scaling a three-dimensional model
US10025482B2 (en) Image effect extraction
CN104125397B (en) Data processing method and electronic equipment
CN106445129A (en) Method, device and system for displaying panoramic picture information
CN107492144A (en) Shadow processing method and electronic equipment
CN108961424B (en) Virtual information processing method, device and storage medium
CN113426112A (en) Game picture display method and device, storage medium and electronic equipment
CN106034067A (en) Picture display method, apparatus and system of instant messaging client
CN109901904B (en) Application picture adjusting method in wearable device and wearable device
CN108647069B (en) Interface display method and device, storage medium and electronic device
CN108919951B (en) Information interaction method and device
US20180174345A1 (en) Non-transitory computer-readable storage medium, display control device and display control method
CN104867109A (en) Display method and electronic equipment
US20150062116A1 (en) Systems and methods for rapidly generating a 3-d model of a user
US20230088963A1 (en) System and method for scene reconstruction with plane and surface reconstruction
US11670063B2 (en) System and method for depth map guided image hole filling
JP6937722B2 (en) Image processing device and image processing method
CN107679175B (en) Method and system for batch digitization of plain gold jewelry products
CN112950455A (en) Image display method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13794650

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13794650

Country of ref document: EP

Kind code of ref document: A1