CA2706699C - Product modeling system and method - Google Patents
Product modeling system and method Download PDFInfo
- Publication number
- CA2706699C · CA2706699A (Canada)
- Authority
- CA
- Canada
- Prior art keywords
- product
- design
- markers
- piece
- visual representation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 73
- 238000013461 design Methods 0.000 claims abstract description 98
- 239000003550 marker Substances 0.000 claims description 35
- 230000000007 visual effect Effects 0.000 claims description 25
- 238000003384 imaging method Methods 0.000 claims description 14
- 238000013507 mapping Methods 0.000 claims description 14
- 239000000463 material Substances 0.000 claims description 14
- 241001465754 Metazoa Species 0.000 claims description 10
- 230000001413 cellular effect Effects 0.000 claims description 2
- 239000000049 pigment Substances 0.000 claims 2
- 239000003086 colorant Substances 0.000 description 6
- 238000012545 processing Methods 0.000 description 5
- 239000000758 substrate Substances 0.000 description 5
- 239000011800 void material Substances 0.000 description 4
- 239000004744 fabric Substances 0.000 description 3
- 230000001788 irregular Effects 0.000 description 3
- 230000003287 optical effect Effects 0.000 description 3
- 230000015572 biosynthetic process Effects 0.000 description 2
- 239000002131 composite material Substances 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 238000002156 mixing Methods 0.000 description 2
- 238000007781 pre-processing Methods 0.000 description 2
- 230000003595 spectral effect Effects 0.000 description 2
- 238000003786 synthesis reaction Methods 0.000 description 2
- 230000002123 temporal effect Effects 0.000 description 2
- 238000012935 Averaging Methods 0.000 description 1
- 238000010171 animal model Methods 0.000 description 1
- 238000004040 coloring Methods 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 238000003708 edge detection Methods 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 230000002085 persistent effect Effects 0.000 description 1
- 238000012805 post-processing Methods 0.000 description 1
- 230000005855 radiation Effects 0.000 description 1
- 238000012958 reprocessing Methods 0.000 description 1
- 230000004044 response Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/20—Linear translation of a whole image or part thereof, e.g. panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G06T5/94—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/08—Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Business, Economics & Management (AREA)
- Software Systems (AREA)
- Development Economics (AREA)
- General Engineering & Computer Science (AREA)
- Accounting & Taxation (AREA)
- Computer Hardware Design (AREA)
- Economics (AREA)
- Finance (AREA)
- Marketing (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Geometry (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
- Details Of Garments (AREA)
Abstract
A product modeling system and method are provided. In one embodiment, the product modeling system is used to model a piece of apparel, such as a shirt, with a design, wherein the model with the design is used to display the piece of apparel with the design to a consumer.
Description
PRODUCT MODELING SYSTEM AND METHOD
Appendices
Appendix A (2 pages) contains an example of the pseudocode for finding a set of markers on a product;
Appendix B (1 page) contains an example of the code for remapping the color of an image using normalized ordinal color distribution;
Appendix C (4 pages) contains an example of the code for building a color map in normalized histogram order with an index from a matching color space;
Appendix D (2 pages) contains an example of the code for building a look-up table to remap the colors from a source sphere to a destination sphere; and
Appendix E (3 pages) contains an example of the code for remapping the color of the source image with a source sphere color map to a destination image with the color map of the sphere color object.
Appendices A-E form part of the specification and are incorporated herein by reference.
Field
The invention relates generally to a system and method for modeling a piece of apparel.
Background
Electronic commerce (E-commerce) is a thriving business in which various different products and services are sold to a plurality of consumers using an E-commerce site. The E-commerce site may include a website that allows a plurality of consumers to gain access to the website using a network, such as the Internet. The website may have a plurality of web pages wherein these web pages have images of a plurality of different products that the consumer may purchase. The images contained in the plurality of web pages are two dimensional images. The website may also include a secure commerce portion that allows the consumer to select one or more items, place those items in an electronic shopping cart and, when done shopping, check out and pay for the items that remain in the electronic shopping cart using various payment services, such as PayPal or a credit card.
One limitation with these typical E-commerce systems is that the product available on the website, such as a shirt, may be modeled by a human model to show the product and its design, but is shown to the consumer as a "flat" image since it is shown to the consumer on the display of the computer being used by the consumer. Thus, the actual design of the product and how the product looks in real life is often difficult to determine from those images. This may result in consumers not purchasing the product which is undesirable.
Another limitation of these typical E-commerce systems is that the product available on the website, such as a shirt, cannot be customized by the consumer with a design on the product. Thus, the consumer cannot see the customized product with the design and this also may result in consumers not purchasing the product which is undesirable. Thus, it is desirable to provide a system and method that provides better models for products and it is to this end that the system and method are directed.
Summary According to an aspect of the invention, there is provided an apparatus for modeling a product, comprising: a plurality of markers that are capable of forming a marker pattern on a product and the marker pattern does not occlude a surface of the product; an imaging device that is capable of taking a single image of the product on an object and the plurality of markers and the product is one of an item worn by a human being and an item worn by an animal; and a computing device that captures a contour of a design area when that design area is on a product that is represented on an object in the single image, electronically applies a user design to the product and generates a visual representation of the user design on the product when on the object using the captured lighting, texture and contours of the product and the object so that the visual representation of the design on the product has the contour of the surface of the product.
A further aspect of the invention provides a method for product modeling, comprising: providing, using an imaging device, a contour of a design area when that design area is on a product that is represented on an object in the single image generated by imaging, using a single image, the product on an object using a plurality of markers that form a marker pattern on the product and the marker pattern does not occlude a surface of the product, wherein the product is one of an item worn by a human being and an item worn by an animal;
electronically applying, using a computer, a user design to the product; and generating, using the computer, a visual representation of the design on the product when on the object using the captured lighting, texture and contours of the product and the object so that the visual representation of the design on the product has the contour of the surface of the product.
There is also provided an apparatus for modeling a product, comprising: a plurality of markers that are capable of forming a marker pattern on a product that does not occlude a surface of the product; an imaging device that is capable of taking a single image of the product on an object and the plurality of markers, wherein the product is one of an item worn by a human being and an item worn by an animal; and a computing device that captures a contour of a design area when that design area is on a product that is represented on an object in the single image, electronically applies a user design to the product and generates a visual representation of the user design on the product when on the object using the captured lighting, texture and contours of the product and the object so that the visual representation of the design on the product has the contour of the surface of the product.
In accordance with a still further aspect of the invention, there is provided a method for product modeling, comprising: providing, using an imaging device, a contour of a design area when that design area is on a product that is represented on an object in the single image generated by imaging, using a single image, the product on an object using a plurality of markers that form a marker pattern on the product that does not occlude a surface of the product; electronically applying, using a computer, a user design to the product; and generating, using the computer, a visual representation of the user design on the product when on the object using the captured lighting, texture and contours of the product and the object so that the visual representation of the user design on the product has the contour of the surface of the product.
Brief Description of the Drawings
Figure 1 is a block diagram illustrating an exemplary implementation of the product modeling system;
Figure 2 illustrates an exemplary implementation of a product modeling method;
Figure 3 illustrates further details of an exemplary implementation of a product modeling method;
Figures 4A and 4B illustrate further details of the process for marking a product in the product modeling method;
Figures 5A and 5B illustrate further details of the process for generating images of a product in the product modeling method;
Figures 6A-6D illustrate further details of the process for preprocessing the model in the product modeling method;
Figures 7A-7C illustrate further details of the post-processing process in the product modeling method;
Figure 8A illustrates an example of a design to be placed on a piece of apparel;
Figure 8B illustrates a typical image of a piece of apparel with the design in a typical system;
Figures 8C-8D illustrate the design on a piece of apparel in the product modeling system;
Figure 8E illustrates the process for placing the design on the model; and
Figures 9A-9C illustrate a process for changing the background against which the piece of apparel with the design is displayed in the product modeling system.
Detailed Description of One or More Embodiments
The system and method are particularly applicable to a system and method for modeling a shirt implemented in software on a computer and it is in this context that the system and method are illustrated and described. It will be appreciated, however, that the system and method can be used for various products wherein the products may include other apparel and other products in which it is desirable to provide better models of the products.
For example, the system may be used for any type of garment or piece of apparel, any item that can be worn or used by a human being or pet, such as a hat, backpack, dog sweater, etc.,
and/or any other product in which it is desirable to be able to display the product on a model.
In addition, the system may be used with any product in which it is desirable to be able to display the product (with an irregular surface) with a design on it, such as a skateboard or a shoe. In addition, the system may be used to display a design on any item with an irregular surface, such as a wall, an automobile body, a pencil and the like. Furthermore, the system may be used to identify a product/item in a video wherein a design can be inserted into the product/item in the video. In addition, the system and method can be implemented in software (shown in the illustrated implementation), hardware or a combination of hardware and software and may also be implemented on a stand-alone computing device (shown in the illustrated implementation), a web server, a terminal, a peer-to-peer system and the like so that the system and method are not limited to the particular implementation of the
system or method.
Figure 1 is a block diagram illustrating an exemplary implementation of the product modeling system 100. In this implementation, the system is implemented on a stand-alone computing device, such as a personal computer, and the product modeling system is implemented as one or more pieces of software comprising a plurality of lines of computer code that are executed by a processing unit in the computing device to implement the product modeling system. The product modeling system, however, can also be implemented on other computing devices and computing systems, such as a networked computing system, a client/server system, a peer-to-peer system, an ASP model type system, a laptop computer, a mobile device, a mobile cellular phone or any other computing device with sufficient processing power, memory and connectivity to implement the product modeling system and method as described below.
The exemplary implementation of the system may include a display device 102 to permit a consumer to view the product with the design generated by the product modeling system, a chassis 104 and one or more input/output devices 105, such as a keyboard and mouse, that allow the consumer to interface with the computing device and the product modeling system. The chassis 104 may house a processing unit 106, such as an Intel processor, a persistent storage device 108, such as a hard disk drive, and a memory 110, wherein the memory may store the software modules/applications being executed by the processing unit when the product modeling system is being implemented on the computing device. The computing device may also include a product modeling store 112,
such as a software implemented database, and the memory may store an operating system 114 that controls the operations of the computing device and a product modeling module 116 that has a plurality of lines of computer code wherein the plurality of lines of computer code are executed by the processing unit to implement the product modeling system and method as described below.
For purposes of illustrating the product modeling system and method, a product modeling method for a piece of apparel, such as a t-shirt, with a design is described below.
However, the product modeling system may also be used for other products, such as other apparel and other products in which it is desirable to provide better models of the products.
For example, the system may be used for any type of garment or piece of apparel, any item that can be worn or used by a human being or pet, such as a hat, backpack, dog sweater, etc.
and/or any other product in which it is desirable to be able to display the product on a model.
In addition, the system may be used with any product in which it is desirable to be able to display the product (with an irregular surface) with a design on it, such as a skateboard or a shoe. In addition, the system may be used to display a design on any item with an irregular surface, such as a wall, an automobile body, a pencil and the like. Furthermore, the system may be used to identify a product/item in a video wherein a design can be inserted into the product/item in the video. The output of the product modeling method (an image of the product with a design shown on the product) may be used for various purposes.
For example, the output may be used to generate a plurality of product displays with designs on a website that allows consumers to see the products. The example described below is a system in which the product modeling system is tied to a product marketing and selling company wherein the product marketing and selling company has control of the models and images of the product modeling system. In another implementation/embodiment of the product modeling system, the system may permit a consumer to provide their own images/models, such as models of the actual consumer, so that the consumer can upload the image to a service and then have the selected design displayed on the model of the actual consumer wherein the service provides:
1) the model components (to create the model form); 2) a tool to upload/modify the model images to the service; and 3) a tool to display the model with the design to the consumer.
Figure 2 illustrates an exemplary implementation of a product modeling method that displays a model with a design on the model wherein the model is a realistic representation of a person with a piece of apparel that has the design on the piece of apparel. In the methods shown in Figures 2 and 3, the processes described below may be performed by the product modeling module 116 described above. A consumer may select a design (122), such as the design shown in Figure 8A, and a warp process (124) may be performed to generate a warped design (128). The consumer may also select a background (126) for the model, such as the backgrounds shown in Figures 9A-9C. Once the background and design are chosen by the consumer, the design is warped and then surface shading (130) and a surface specular process (132) are performed. Once these processes are completed, the model is created with the design (134) wherein the model with the design is shown to the consumer.
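A minimal sketch of this rendering pipeline follows, assuming the warped design, the shading and specular maps captured from the garment photograph, and the design-area mask are all available as NumPy arrays. The function name, argument names and the simple multiply/add lighting model are illustrative assumptions, not the implementation described in the patent or its appendices.

```python
import numpy as np

def render_preview(warped_design, shading, specular, design_mask, photo):
    """Composite a consumer's design onto a photographed garment model.

    warped_design: HxWx3 design already warped into photo space (steps 124/128)
    shading:       HxWx1 diffuse shading captured from the garment (step 130)
    specular:      HxWx1 specular highlights of the garment (step 132)
    design_mask:   HxWx1 alpha of the printable design area
    photo:         HxWx3 photograph of the garment on the model
    All images are floats in [0, 1].
    """
    shaded = warped_design * shading                # modulate design by shading
    lit = np.clip(shaded + specular, 0.0, 1.0)      # add specular highlights
    # Step 134: blend the lit design into the original photograph.
    return photo * (1.0 - design_mask) + lit * design_mask
```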
In one embodiment, the model with the design is displayed to the consumer to assist the consumer in previewing the product with the design before the consumer purchases the product with the design, such as through an E-commerce website. Now, the product modeling method is described in more detail.
Figure 3 illustrates further details of an exemplary implementation of a product modeling method 140 when used with a piece of apparel. The product modeling method (and the processes set forth below) is implemented, in one embodiment and implementation, as a plurality of lines of computer code that are part of the product modeling module and that are executed by a processing unit 106 that is part of the product modeling system.
In the method, a piece of apparel is created with a plurality of markers (142) that are used to capture information about the piece of apparel when the piece of apparel is worn by a human model.
The plurality of markers may be a marker pattern that encodes, in two dimensions, a flexible substrate and that may be detected when the flexible substrate is placed on a complex three dimensional surface, wherein the coverage area of the marker pattern does not substantially occlude the substrate that it encodes. For example, the plurality of markers may cover a predetermined percentage, such as 50%, of the piece of apparel, which allows the system to capture information about the piece of apparel when the piece of apparel is worn by a human model. In one implementation, the plurality of markers may form a grid.
In more detail, the markers that form a grid on a flat surface (the piece of apparel on a flat surface, when the markers are properly positioned on the piece of apparel) may be used to map to a grid of markers on a non-flat surface (the piece of apparel when worn on a human model). As shown in Figure 4A, the grid of markers 186 on the flat surface is mapped to a grid 187 with the same markers in the same positions on a non-flat surface so that the mapping between the grid on the flat surface and the grid on the non-flat surface is determined.
The system may interpolate the marker locations to generate a mapping from the plurality of markers to the grid on the flat surface and may then store the mapping to avoid recalculation of the mapping each time. In one embodiment, the markers may be a number of non-visible lines that form a grid. In another embodiment, the markers may be a plurality of optical markers 190 that may be affixed to a piece of apparel 192 as shown in Figure 4B, which permits the optical tagging of the piece of apparel to map the surface of the piece of apparel when worn by a human model.
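A minimal sketch of this interpolation step, under the assumption that the marker correspondences are available as NumPy arrays of (x, y) positions; the use of SciPy's griddata, the cubic interpolation method, the cache file and all names are implementation choices for illustration, not taken from the patent.

```python
import numpy as np
from scipy.interpolate import griddata

def build_surface_map(flat_pts, photo_pts, design_shape, cache_path=None):
    """Interpolate sparse marker correspondences into a dense mapping that
    gives, for every pixel of the flat design grid, its position in the
    photograph of the garment worn by the model.

    flat_pts:  Nx2 (x, y) marker positions on the flat garment
    photo_pts: Nx2 (x, y) positions of the same markers detected in the photo
    """
    h, w = design_shape
    ys, xs = np.mgrid[0:h, 0:w]
    map_x = griddata(flat_pts, photo_pts[:, 0], (xs, ys), method='cubic')
    map_y = griddata(flat_pts, photo_pts[:, 1], (xs, ys), method='cubic')
    mapping = np.dstack([map_x, map_y])
    if cache_path is not None:
        np.save(cache_path, mapping)   # store to avoid recomputing each time
    return mapping                      # h x w x 2 array of photo coordinates
```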
The optical markers may be made of a reflective material, a colorized material or a diffraction pattern. The reflective material may be a retro-reflective material.
The colorized material may be a pigmented material. The markers may have various shapes (including the dot shape shown in Figure 4B) and sizes and the method is not limited to any particular shape of the markers. In one embodiment, the plurality of markers may be a film material that has the retro-reflective material in a particular shape. In yet another embodiment, the markers may be a set of markers that form a grid wherein the markers are placed onto the piece of apparel electronically or by other means. In one embodiment in which the product modeling system is used by a business entity that sells apparel, each piece of apparel is placed onto a plurality of human models of different shapes and/or sizes (as shown in Figures 5A and 5B) so that the consumer can then choose a model for the piece of apparel that is closest to the intended wearer of the piece of apparel. In another embodiment in which each consumer may create his own model for a piece of apparel, the consumer is provided with the markers (either electronically or as physical markers) so that the consumer can affix the markers to a piece of apparel and then perform the other processes described below. In yet another embodiment, the product modeling system may allow a plurality of users (such as a community of users) to generate a plurality of models that may then be uploaded to the product modeling system.
Once the one or more pieces of apparel are prepared with the markers, an image for each piece of apparel on each different human model may be generated (150), such as by using a camera to take a picture of the piece of apparel being worn by a human model.
Prior to taking the image of the piece of apparel with the markers on the human model, the lighting for taking the image is determined. When the user/consumer generates the models, the product modeling system may provide instructions for taking an image of the piece of apparel, such as using a flash, using a particular exposure, etc. In one implementation of the product modeling system, the product modeling system may download a piece of code directly to a user/consumer's camera, such as a digital camera, to set up the camera properly to take the image of the product or item. In particular, the surface model and illumination model for each piece of apparel is determined, which also allows the color and lighting for the image to be accurately determined.
Once the images of the piece of apparel on a plurality of human models in a plurality of different poses are taken, the model for the piece of apparel on a particular model in a particular pose is preprocessed (160) by the product modeling system. During the preprocessing, the product modeling system may detect the plurality of markers on the piece of apparel image, remove the marker images from the image of the piece of apparel and then generate a representation of the surface of the piece of apparel when worn by the human model.
In one implementation, the markers may be detected by a distinguishing feature of the markers (spectral difference, reflective difference, textural difference and/or temporal difference), refined by matching geometric properties of the pattern (local pattern finding) and reconstructed by matching the known pattern (local patterns assembled into a known complete pattern). The reconstructed pattern may then be used to model the shape of the flexible substrate. The product modeling system may have a plurality of local samples of the original unmarked substrate so that the marker pattern can be replaced using the textures of the unmarked substrate as an example, which yields an unmarked image suitable for commercial use. The preprocessing process is shown in Figures 6A-6C, with Figure 6A illustrating the image of the piece of apparel with the markers, Figure 6B illustrating the plurality of markers identified on the piece of apparel and Figure 6C illustrating the image of the piece of apparel with the markers removed. Appendix A (2 pages), incorporated herein by reference, contains an example of the pseudocode for identifying the markers on a product in one implementation of the product modeling system. The steps of the marker identification process for one implementation are set forth in Appendix A. In one implementation, the markers are detected by visible detection. In another implementation of the system, the markers may be detected by a temporal process in which infrared radiation may be used to image the markers at several different times and then the pattern of the markers is detected based on the images of the markers at several different times.
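Appendix A itself is not reproduced here; the sketch below is a hypothetical, much-simplified illustration of the detect-then-refine idea for bright retro-reflective dot markers. The brightness threshold and blob-size limits are assumed values, and the final assembly against the known complete pattern is only indicated by a comment.

```python
import cv2
import numpy as np

def find_marker_centroids(image_bgr, bright_thresh=220, min_area=4, max_area=400):
    """Locate candidate marker dots in a photograph of the marked garment."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Distinguishing feature: retro-reflective dots image as very bright spots.
    _, mask = cv2.threshold(gray, bright_thresh, 255, cv2.THRESH_BINARY)
    # Local pattern finding: keep only blobs whose area is plausible for a dot.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    keep = [i for i in range(1, n)
            if min_area <= stats[i, cv2.CC_STAT_AREA] <= max_area]
    # A full implementation would now assemble these local detections into the
    # known complete marker pattern (grid assembly) before further processing.
    return np.array([centroids[i] for i in keep])   # Nx2 (x, y) dot positions
```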
During the identification of the markers, the product modeling system may use various techniques. For example, edge detection may be used to identify each marker and the spacing between the markers, which can then be used to generate the grid of markers on the surface of the piece of apparel when worn on a human model and thus allows the surface of that piece of apparel on the particular human model in a particular pose to be accurately determined. Alternatively, the system may threshold at the white color based on the color calibration and then locate elements above the threshold and then also remove the background, including elements of the human model such as jewelry, an eye or the background behind the human model. The system may also use histograms to identify the markers and the background.
The marker images (once identified) may be removed from the image of the piece of apparel (as shown in Figure 6C) by various processes. For example, the markers may be removed by, for each marker location, identifying the texture adjacent to the marker and then filling in the location of the marker with the texture in the adjacent area.
Alternatively, the system may use image coherence and synthesize the image to remove the markers in the image.
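As a rough stand-in for either removal approach, OpenCV's Telea inpainting can fill the marker locations from the surrounding fabric; this is a convenient approximation for illustration, not the specific fill or synthesis method described above, and the disc radius is an assumed value.

```python
import cv2
import numpy as np

def remove_markers(image_bgr, marker_centroids, radius=6):
    """Paint out detected markers by inpainting a small disc around each one."""
    mask = np.zeros(image_bgr.shape[:2], dtype=np.uint8)
    for x, y in marker_centroids.astype(int):
        cv2.circle(mask, (int(x), int(y)), radius, 255, -1)   # pixels to fill
    # Fill the masked pixels from the adjacent, unmarked fabric texture.
    return cv2.inpaint(image_bgr, mask, 3, cv2.INPAINT_TELEA)
```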
To generate the representation of the contours of the surface of the piece of apparel when worn by a particular human model in a particular pose, the system maps the positions of the markers 190 relative to each other, as shown in Figure 6D, into a set of contour curves 194 that represent the surface of the piece of apparel when worn by a particular human model in a particular pose. Since the system has information about the markers and the grid that they form on a flat surface as shown in Figure 4A, the system is able to determine the contours of the surface of the piece of apparel when worn by a particular human model in a particular pose.
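A minimal sketch of how one row of detected marker positions might be turned into a smooth contour curve; the spline routines, smoothing factor and sample count are implementation choices, not taken from the patent.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def contour_curve(row_points, samples=200, smoothing=1.0):
    """Fit a smooth curve through the markers of one grid row.

    row_points: Mx2 (x, y) marker positions belonging to one row, in order.
    Returns a densely sampled polyline approximating the garment contour.
    """
    tck, _ = splprep([row_points[:, 0], row_points[:, 1]], s=smoothing)
    u = np.linspace(0.0, 1.0, samples)
    xs, ys = splev(u, tck)
    return np.column_stack([xs, ys])
```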
Once the contours of the surface are determined and the preprocessing is completed, the model of the piece of apparel when worn by a particular human model in a particular pose may be retouched (162) as needed. Then, the model is post-processed (170) by the product modeling system. During the post-processing process, the product modeling system colorizes the model using a color mapping module that is part of the product modeling system.
The colorizing allows each model for each piece of apparel on a particular human model in a particular pose to have the piece of apparel converted into any colors, such as the two different colors shown in Figures 7A and 7B. As shown in Figure 7C, the system may use a color calibration card with a known spectral response for each session to calibrate images for the same session. To change the color for the piece of apparel, the fabric may be mapped onto a sphere as shown in Figure 7C, which is then mapped to the model to change the color of the model.
Appendices B-E, incorporated herein by reference, illustrate, for a particular implementation of the product modeling system, the code for 1) remapping the color of an image using normalized ordinal color distribution; 2) building a color map in normalized histogram order with an index from a matching color space; 3) building a look-up table to remap the colors from a source sphere to a destination sphere; and 4) remapping the color of the source image with a source sphere color map to a destination image with the color map of the sphere color object. Using the code set forth in these appendices (and the process steps described in these appendices), the color mapping process: 1) builds a color map (the BuildMap code in Appendix C) for the source image using a sphere to build a histogram and then a sorted table; 2) builds a remap table (the BuildReMap table code in Appendix D); and 3) remaps the image colors (the code in Appendices B and E) onto the product. The system may also layer color and texture so that the colorized model of the particular piece of apparel on the particular human model in the particular pose more accurately emulates different fabrics and/or threads of the fabric, which results, for example, in an accurate emulation of the printed ink of the design on the piece of apparel with the particular type of fabric.
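The appendix code is not reproduced here; the sketch below is a hypothetical, greatly simplified per-channel version of the ordinal (normalized-histogram-order) remapping idea, standing in for the BuildMap/BuildReMap/remap steps. A real implementation works on full colors sampled from the source and destination spheres rather than on independent channels.

```python
import numpy as np

def ordinal_remap_channel(src_channel, src_sphere, dst_sphere, bins=256):
    """Remap one channel of the source image so its intensities occupy the same
    ordinal (rank) positions as the destination sphere's color distribution.

    src_channel: uint8 image channel to recolor
    src_sphere:  pixels sampled from the source color sphere (same channel)
    dst_sphere:  pixels sampled from the destination color sphere (same channel)
    """
    # Normalized cumulative histograms of the two reference spheres.
    src_cdf = np.cumsum(np.histogram(src_sphere, bins, (0, 255))[0]).astype(float)
    dst_cdf = np.cumsum(np.histogram(dst_sphere, bins, (0, 255))[0]).astype(float)
    src_cdf /= src_cdf[-1]
    dst_cdf /= dst_cdf[-1]
    # Look-up table: for each source level, the destination level that sits at
    # the same normalized histogram rank (the remap-table step).
    lut = np.interp(src_cdf, dst_cdf, np.arange(bins))
    return lut[src_channel].astype(np.uint8)
```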
Once the colorization is completed, the model for a particular piece of apparel on a particular human model in a particular pose is integrated into a service (180), such as a website that has the pieces of apparel with particular designs for sale to consumers.
When the model is integrated into the service, the product modeling system may perform warp mapping (182) on a design selected by the consumer and permit the user to select a particular background (184). An example design is shown in Figure 8A.
The exemplary design shown on a piece of apparel in a typical system with a flat image is shown in Figure 8B. Using the product modeling system, a mapping between the design image and the surface contour of the model for the particular piece of apparel on the particular human model in the particular pose (see for example Figure 8D) is done so that the design is shown on the model, as shown in Figure 8E, in a more realistic three dimensional manner.
During the warp mapping (which may be a bicubic image warp), a grid of the design 200 is mapped to the surface contour grid 202, which is then placed onto the piece of apparel to generate the more realistic model for the piece of apparel with the design as shown in Figure 8D. In the mapping process, a point in the design is mapped to the surface contour grid which is in turn mapped onto the piece of apparel. The image background can be easily exchanged by the product modeling system as shown in Figures 9A-9C.
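A minimal sketch of applying such a warp with OpenCV, assuming a dense inverse map that gives, for every photo pixel inside the design area, its (x, y) position in the flat design (such an inverse map could be interpolated from the same marker correspondences, taken in the photo-to-flat direction); the names are illustrative and this is not the patent's bicubic warp code.

```python
import cv2
import numpy as np

def warp_design_onto_photo(design, inverse_map):
    """Pull design pixels onto the garment photo through a dense inverse map.

    design:      flat design image
    inverse_map: HxWx2 float array; for each photo pixel, the (x, y) coordinate
                 in the flat design that should appear there
    """
    map_x = inverse_map[..., 0].astype(np.float32)
    map_y = inverse_map[..., 1].astype(np.float32)
    # Bicubic resampling of the design into the garment's surface-contour grid.
    return cv2.remap(design, map_x, map_y, cv2.INTER_CUBIC)
```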
Although the example provided herein is for a piece of apparel (a shirt) worn by a human being, the product modeling system may be used for various different products (other pieces of apparel, other garments, hats, shoes, pet clothing, inanimate objects such as cups) with various different models (human models, animal models, inanimate models such as robots or mannequins) and with any number of different poses for the models since the above example is merely illustrative.
While the foregoing has been with reference to a particular embodiment of the invention, it will be appreciated by those skilled in the art that changes in this embodiment may be made. The scope of the claims should not be limited by the embodiments set forth in the examples, but should be given the broadest interpretation consistent with the description as a whole.
and/or any other product in which it is desirable to be able to display the product on a model.
In addition, the system rnay be used with any product in which it is desirable to be able to display the product (with an inegubtr surface) with a design on it, such as a skateboard, a shoe. In addition, the system May be used to display a design on any item with an irregular surface, such as a wall, automobile body, a pencil and the hire. Furthermore, the system may be used to identify a product/item in a video v/herein a design can be inserted into the product/item in the video. The output of the product modeling method (an image of the product with a design shown on the ptoduct) may be used for various purposes.
For example, the output may be used to generate a plurality of product displays with designs on a website that allows couturiers to see-the products. The example described below is a system in which the product modeling system is tied to a product marketing and selling company wherein the product marketing and selling company has control of models and images of the product IS modeling system. In another implemedtation/emhodiment of the product modeling trystern, the stem may permit a consumer to provide their 011111 images/models, such as models of the actual consumer, so that the mummer can upload the image to a service and then have the selected design displayed on the model of the actual consumer wherein the service provides:
1) the model components (to create the model faun); 2) a tool to upload/modify the model images to the service; and 3) a tool to displaythe model with the design to the consumer.
Figure 2 nlustrates an exemplary implementation of a product modeling method that displays a model with a design on the model wherein the model is a realistic representation of a person with a piece of apparel thathas the design on the piece of apparel The methods shown in Figures 2 and 3, the processes described below may be performed by the product modeling module 116 described above. A consumer may select a design (122) such as the design shown in Figure SA and s warp process (124) may be performed to generate a warp design (128). The consumer may also select a background (126) for the model such as the backgrounds shown in Figures 9A-9C. Once the background and design ere, chosen by the consumer, the design is warped and then surface shading (130) and a surface specular process (132) is performed. Once these processes are completed, the -model is created with the design (134) wherein the model with the design is shown to the consumer.
In one embodiment, the model with the design is displayed to the consumer to assist the consumer in previewing the product with the design before the coneumer purchases the product with the design, such as throtigh an E-commerce website. Now, the product modeling method is described in more detail.
Figure 3 illustrates further details aim exemplary implementation of a product modeling method 140 when used with a piece otappareL The product modeling method (end the processes set forth below) are implemented, ill one embodiment and implementation, as a plurality alines of computer code that Mt part of the product modeling roodule that are =meted by a processing unit 106 that is part of the product modeling system.
In the method, a piece of apparel is created with a plurality &markers (142) that are used to capture information about the piece of apparel when the piece of apparel is worn by a human model.
The plurality of markets ns.ay be a mathir pattern that encodes, in two dimensions, a flexible substrate that may be detected when, the flexible. substrate is placed on a complex three dimensional surface ',herein the coverage area of the =akar patient does not substantially occlude the substrate that it encodes. For example, the plurality of markers may cover a predetermined percentage, auth as 50%, of the piece of apparel, that alio*
th.e system to capture information about the piece of apparel when the piece of apparel is wom by a human model. In one insplementation, the plurality of markers rimy form a grid.
Inmate detail, the markers that form a grid on a flat surface (the piece of apparel filt .01111BUTfitee, when die markers are properly positioned on the piece of apparel) may be used to map to a grid of markers on a non-flat surface (the piece of apparel when worn on a human model). As shown in Figure 4A, the grid of markers 186 on the flat surface are mapped to a grid 187 with the same markers in the seine positions on a non-flat surface so that the mapping between the grid on the flat surface and the grid owthe non-flat surface is determined.
The system may interpolate the marker locations to generate a mapping front the plurality of markers to the grid on the flat surface and may then store the-mapping to avoid recalculation of the mapping each time. In OM embodirnent, the markers may be a number of non-visible lines that form a grid. In another embodiMent, the marker's may bc a plurality of optical markers 190 that may be affixed to a piece of apparel 192 as shown in Figure 4B that permits the optical tagging of the piece of apparel to map the surface of the piece of apparel when worn by a human model.
The optical markers. may be made of a reflectiye material, a colorized material or a diffraction pattern_ The ;effective material may be retro-reilective material.
The colorized material may be pigmented material. The markers may have various shapes (including the = 79150-117 dot shape shown in Figure 4B) and sizes and the method is not limited to any particular shape of the merkers. In one embocfunent, the plurality of markers may be a film material that has the retro,reflective material in a particular shape. In yet another embodiment, the makers may he a set ofmarkers that form a grid wherein the marbss arc placed onto the piece of apparel electronically or by other means, In one embodiment in which the product modeling system is used by a business entity that sells apparel, each. piece of apparel is placed unto a plurality of humeri models of different shapes andlor sizes (as shown in Figures 5A and 5B) so that the consumer can then choose smodel for the piece of apparel that is closest to the intended wearer of the piece of apparel. In another embodiment in which each consumer nuty 1 0 create his own model for a piece of apparel, the consumer is provided 'glib the markem (either electronically or as physical markers) so that the consumer can affix the markers to a piece of apparel and then performs the other pm:asset describedbelow. bt yet another embodiment, the product modeling system may allow a plurality of users (such as a community of users) to generate a plurality of models that may then be uploaded to the product modeling System.
Once theme ormore pieces of apparel are prepared With the markers, an image for each piece of apparel on each different human model may be generated (150) such as by using a camera to take a pica= of the piece of apparel being worn bya hturum model.
Prior to taking the image of the piece of apparel with the medusa on the human model, the lighting for taking the image is determined. When the user/consumer generates the models, the product modeling system may provide instmctions for taking an image of the piece of append such as using a flash, using a particular exposure, etc... In One hnplententation of the product modeling system, the product modeling system may download a piece of code directly to a user/consumer's camera, such as a digital camera, to set up the camera properly to take the.
image et the product or item. In particular, the surface model and illuminstion model for each piece of apparel is determined vvitioh also allows the color and lighting for the imago to be accurately determined.
Once the image alba piece of speared on a phuality of human models in a plurality of different poses are taken, the model for the piece Of apparel on a particular model in a particular pose are preprocessed (160) by theproduet modeling system. During the pnsprocessing, the product modeling system may detect the plurality of markets on the piece reapply' image, remove the.marker intages from thc image of the piece of apparel and then generate a representation of the smilax of the piece of apparel when worn by the lmman model.
In one implementation, the markers may be detected by a distinguishing feature of the markers (spectral difference, reflective difference, textural difference and/or temporal difference), refined by matching geometric properties of the pattern (local pattern finding) and reconstructed by matching the known pattern (local patterns assembled into a known complete pattern). The reconstructed pattern may then be used to model the shape of the flexible substrate. The product modeling system may have a plurality of local samples of the original unmarked substrate so that the marker pattern can be replaced using the textures of the unmarked substrate as an example that yields an unmarked image suitable for commercial use. The preprocessing process is shown in Figures 6A-6C with Figure 6A illustrating the image of the piece of apparel with the markers, Figure 6B illustrating the plurality of markers identified on the piece of apparel and Figure 6C illustrating the image of the piece of apparel with the markers removed. Appendix A (2 pages), incorporated herein by reference, contains an example of the pseudocode for identifying the markers on a product in one implementation of the product modeling system. The steps of the marker identification process for one implementation are set forth in Appendix A. In one implementation, the markers are detected by visible detection. In another implementation of the system, the markers may be detected by a temporal process in which infrared radiation may be used to image the markers at several different times and then the pattern of the markers is detected based on the images of the markers at several different times.
During the identification of the markers, the product modeling system may use various techniques. For example, edge detection may be used to identify each marker and the spacing between the markers that can then be used to generate the grid of markers on the surface of the piece of apparel when worn on a human model that thus allows the surface of that piece of apparel on the particular human model in a particular pose to be accurately determined. Alternatively, the system may threshold at the white color based on the color calibration and then locate elements above the threshold and then also remove the background including elements of the human model such as jewelry, an eye or the background behind the human model. The system may also use histograms to identify the markers and the background.
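As a rough illustration of the thresholding and histogram techniques described above, the following is a minimal C++ sketch that picks a threshold at the first interior minimum of a luminance histogram and flags brighter pixels as marker candidates; the 8-bit grayscale layout, the 3-bin smoothing and the fallback value are illustrative assumptions rather than details taken from the appendices.

// Hypothetical sketch: histogram-minimum thresholding for marker candidates.
#include <cstdint>
#include <vector>

// Build a 256-bin histogram of an 8-bit grayscale image.
static std::vector<int> buildHistogram(const std::vector<uint8_t>& gray) {
    std::vector<int> histo(256, 0);
    for (uint8_t v : gray) histo[v]++;
    return histo;
}

// Return the first interior valley of a lightly smoothed histogram.
static int findHistogramMinimum(const std::vector<int>& histo) {
    std::vector<int> s(256, 0);
    for (int i = 1; i < 255; i++)
        s[i] = (histo[i - 1] + histo[i] + histo[i + 1]) / 3;   // 3-bin box smoothing
    for (int i = 2; i < 254; i++)
        if (s[i] < s[i - 1] && s[i] <= s[i + 1])
            return i;                                          // first valley found
    return 128;                                                // fallback (assumption)
}

// Mark pixels brighter than the threshold as marker candidates.
static std::vector<uint8_t> thresholdMarkers(const std::vector<uint8_t>& gray, int threshold) {
    std::vector<uint8_t> mask(gray.size(), 0);
    for (size_t i = 0; i < gray.size(); i++)
        mask[i] = (gray[i] > threshold) ? 255 : 0;
    return mask;
}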
The marker images (once identified) may be removed from the image of the piece of apparel (as shown in Figure 6C) by various processes. For example, the markers may be removed by, for each marker location, identifying the texture adjacent the marker and then filling in the location of the marker with the texture in the adjacent area. Alternatively, the system may use image coherence and synthesize the image to remove the markers in the image.
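A minimal sketch of the fill-from-adjacent-texture idea is shown below, assuming an 8-bit grayscale image in row-major order and a square neighborhood; the appendix code instead uses multi-frequency local texture synthesis, so this averaging fill only illustrates the simpler alternative described in this paragraph.

// Hypothetical sketch: replace each masked (marker) pixel with the average
// of nearby unmasked pixels.
#include <cstdint>
#include <vector>

void fillMarkers(std::vector<uint8_t>& gray, const std::vector<uint8_t>& mask,
                 int width, int height, int radius) {
    std::vector<uint8_t> out = gray;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            if (!mask[y * width + x]) continue;          // keep unmarked pixels as-is
            int sum = 0, count = 0;
            for (int dy = -radius; dy <= radius; dy++) {
                for (int dx = -radius; dx <= radius; dx++) {
                    int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                    if (mask[ny * width + nx]) continue; // skip other marker pixels
                    sum += gray[ny * width + nx];
                    count++;
                }
            }
            if (count > 0) out[y * width + x] = (uint8_t)(sum / count);
        }
    }
    gray.swap(out);
}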
To generate the representation of the contours of the surface of the piece of apparel when worn by a particular human model in a particular pose, the system maps the position of the markers 190 relative to each other as shown in Figure 6D into a set of contour curves 194 that represent the surface of the piece of apparel when worn by a particular human model in a particular pose. Since the system has information about the markers and the grid that they form on a flat surface as shown in Figure 4A, the system is able to determine the contours of the surface of the piece of apparel when worn by a particular human model in a particular pose.
Once the contours of the surface are determined and the preprocessing is completed, the model of the piece of apparel when worn by a particular human model in a particular pose may be retouched (162) as needed. Then, the model is post-processed (170) by the product modeling system. During the post-processing process, the product modeling system colorizes the model using a color mapping module that is part of the product modeling system.
The colorizing allows each model for each piece of apparel on a particular human model in a particular pose to have the piece of apparel converted into any colors such as the two different colors shown in Figures 7A and 7B. As shown in Figure 7C, the system may use the color calibration card with a known spectral response for each session to calibrate images for the same session. To change the color for the piece of apparel, the fabric may be wrapped onto a sphere as shown in Figure 7C which is then mapped to the model to change the color of the model.
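As a hedged sketch of how a calibration card can be used, the code below derives per-channel gains from a measured card patch against its known reference value and applies them to an interleaved RGB image; the single white patch and the simple gain model are assumptions made for illustration, since the description only states that a card with a known spectral response is imaged in each session.

// Hypothetical sketch: per-channel gain calibration from a measured card patch.
#include <algorithm>
#include <cstdint>
#include <vector>

struct RGB { float r, g, b; };

void calibrateImage(std::vector<uint8_t>& rgb,       // interleaved 8-bit RGB pixels
                    RGB measuredPatch,               // patch color as captured this session
                    RGB referencePatch) {            // known color of the same patch
    float gain[3] = { referencePatch.r / measuredPatch.r,
                      referencePatch.g / measuredPatch.g,
                      referencePatch.b / measuredPatch.b };
    for (size_t i = 0; i + 2 < rgb.size(); i += 3)
        for (int c = 0; c < 3; c++)
            rgb[i + c] = (uint8_t)std::min(255.0f, rgb[i + c] * gain[c]);
}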
Appendices B-E, incorporated herein by reference, illustrate, for a particular implementation of the product modeling system, the code for 1) remapping the color of an image using normalized ordinal color distribution; 2) building a color map in normalized histogram order with an index from a matching color space; 3) building a look-up table to remap the colors from a source sphere to a destination sphere; and 4) remapping the color of the source image with a source sphere color map to a destination image with the color map of the sphere color object. Using the code set forth in these appendices (and the process steps described in these appendices), the color mapping process: 1) builds a color map (the BuildMap code in Appendix C) for the source image using a sphere to build a histogram and then a sorted table; 2) builds a remap table (the BuildRemapTable code in Appendix D); and 3) remaps the image colors (the code in Appendices B and E) onto the product. The system may also layer color and texture so that the colorized model of the particular piece of apparel on the particular human model in the particular pose more accurately emulates different fabrics and/or threads of the fabric which results, for example, in an accurate emulation of the printed ink of the design on the piece of apparel with the particular type of fabric.
Once the colorization is completed, the model for a particular piece of apparel on a particular human model in a particular pose is integrated into a service (180) such as a website that has the pieces of apparel with particular designs for sale to consumers.
When the model is integrated into the service, the product modeling system may perform warp mapping (182) on a design selected by the consumer and permit the user to select a particular background (184). An example design is shown in Figure 8A.
The exemplary design shown on a piece of apparel in a typical system with a flat image is shown in Figure 8B. Using the product modeling system, a mapping between the design image and the surface contour of the model for the particular piece of apparel on the particular human model in the particular pose (see for example Figure 8D) is done so that the design is shown on the model as shown in Figure 8E in a more realistic three dimensional manner.
During the warp mapping (that may be a bicubic image warp), a grid of the design 200 is mapped to the surface contour grid 202 which is then placed onto the piece of apparel to generate the more realistic model for the piece of apparel with the design as shown in Figure 8D. In the mapping process, a point in the design is mapped to the surface contour grid which is in turn mapped onto the piece of apparel. The image background can be easily exchanged by the product modeling system as shown in Figures 9A-9C.
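A minimal sketch of the grid-to-grid mapping is given below: a point in the flat design, expressed in normalized coordinates, is located in its contour-grid cell and interpolated to a position on the garment image. Bilinear interpolation is used here only to keep the example short; the description notes that the warp may be a bicubic image warp, and the grid layout is an assumption for illustration.

// Hypothetical sketch: map a design point (u, v in [0, 1]) through the
// surface contour grid onto the garment image.
#include <vector>

struct Point2f { float x, y; };

Point2f warpDesignPoint(float u, float v,
                        const std::vector<Point2f>& contourGrid,  // rows x cols image positions
                        int cols, int rows) {
    float gx = u * (cols - 1), gy = v * (rows - 1);
    int c0 = (int)gx, r0 = (int)gy;
    if (c0 > cols - 2) c0 = cols - 2;                 // clamp to the last full cell
    if (r0 > rows - 2) r0 = rows - 2;
    float fx = gx - c0, fy = gy - r0;
    const Point2f& p00 = contourGrid[r0 * cols + c0];
    const Point2f& p10 = contourGrid[r0 * cols + c0 + 1];
    const Point2f& p01 = contourGrid[(r0 + 1) * cols + c0];
    const Point2f& p11 = contourGrid[(r0 + 1) * cols + c0 + 1];
    Point2f top = { p00.x + (p10.x - p00.x) * fx, p00.y + (p10.y - p00.y) * fx };
    Point2f bot = { p01.x + (p11.x - p01.x) * fx, p01.y + (p11.y - p01.y) * fx };
    return { top.x + (bot.x - top.x) * fy, top.y + (bot.y - top.y) * fy };
}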
Although the example provided herein is for a piece of apparel (a shirt) worn by a human being, the product modeling system may be used for various different products (other pieces of apparel, other garments, hats, shoes, pet clothing, inanimate objects such as cups) with various different models (human models, animal models, inanimate models such as robots or mannequins) and with any number of different poses for the models since the above example is merely illustrative.
While the foregoing has been with reference to a particular embodiment of the invention, it will be appreciated by those skilled in the art that changes in this embodiment may be made. The scope of the claims should not be limited by the embodiments set forth in the examples, but should be given the broadest interpretation consistent with the description as a whole.
APPENDIX A
EXEMPLARY CODE FOR LOCATING MARKERS
/**************************************************************************
 * Procedure:
 *   FindAndSynthMarkers
 * Description:
 *   Finds a marker pattern in a photograph,
 *   Saves the Marker Array to a file,
 *   Fills the found Markers using multi-frequency texture synthesis,
 *   Finds the Background Alpha Channel.
 **************************************************************************/
void ModelShot::FindAndSynthMarkers(Image *aSrcImage, Image *aDstImage)
{
    // find the markers
    // Save the RGB image to destination
    CopyRGBChannels(aSrcImage, aDstImage);
    // Choose Monochrome scalers based on shirt color
    ChooseMonochromeScalers(SKU_COLOR, aMonochromeScaler);
    // Convert the Source image to monochrome with good contrast
    ConvertSourceImageToMonoChrome(aMonochromeScaler, aSrcImage, aMonoImage);
    // Apply a Gaussian Difference Bandpass filter to increase marker contrast
    ApplyBandPassFilter(aMonoImage, LowMarkerSize, HighMarkerSize);
    // find first histogram minima from full luminance, this is the marker threshold
    MarkerThreshold = FindHistogramMinima(aMonoImage, 1);
    // produce marker image by thresholding
    ApplyThreshold(aMonoImage, MarkerImage, MarkerThreshold);
    // build marker array by filling each found region and averaging pixel locations
    BuildMarkerArray(MarkerImage, aMarkerArray);
    SaveMarkerArray(aMarkerArray, aMarkerArrayFile);
    // save the found markers as a channel
    CopyChannel(MarkerImage, aDstImage, MarkerChannel);
    // find the background
    // find second histogram minima from full luminance, this is the background threshold
    BackgroundThreshold = FindHistogramMinima(aDstImage, 2);
    // Produce background alpha by thresholding
    ApplyThreshold(aDstImage, MarkerImage, BackgroundThreshold);
    // save the found background alpha as a channel
    CopyChannel(MarkerImage, aDstImage, backgroundChannel);
    // fill the markers
    // separate the high frequency image component
    HighPassFilter(aDstImage, HighPassImage, 2.0);
    // separate the low frequency component, downsample to improve texture synthesis speed
    DownsampleImage(aDstImage, LowPassImage, 2.0);
    // increase the size of the markers to cover color artifacts from bloom
    ExpandAlpha(aDstImage, MarkerChannel, HighSynthSelection, BloomDist);
    // downsample the selection for the low frequency component
    DownsampleImage(HighSynthSelection, LowSynthSelection, 2.0);
    // fill the LowPassImage markers with texture examples from the local 11 x 11 neighborhood
    LocalTextureSynthesis(LowPassImage, LowSynthSelection, 11, 11);
    // fill the HighPassImage with texture examples from the local 5 x 5 neighborhood
    LocalTextureSynthesis(HighPassImage, HighSynthSelection, 5, 5);
}
APPENDIX B
EXEMPLARY CODE FOR REMAPPING IMAGE COLOR
/**************************************************************************
 * Procedure:
 *   RemapImageColor
 * Description:
 *   Remaps the Image Color using normalized ordinal color distribution
 *   of similar source and destination color reference photos.
 *   Reference photos need not be aligned, only have proportionally
 *   illuminated color areas.
 **************************************************************************/
void ModelShot::RemapImageColor(Image *aSrcImage,
                                Image *aSrcColorReference,
                                Image *aDstImage,
                                Image *aDstColorReference)
{
    // Make a Source Color map
    SphereColor *aSrcMap = SphereColor::NewL(iService);
    aSrcMap->BuildMap(aSrcColorReference, 3, 0x00007fff);
    // Make a Dest Color map
    SphereColor *aDstMap = SphereColor::NewL(iService);
    aDstMap->BuildMap(aDstColorReference, 3, 0x00007fff);
    aDstMap->RemapImageComposite(aSrcImage, aSrcMap, aDstImage);
}
APPENDIX C
EXEMPLARY CODE FOR BUILDMAP
/**************************************************************************
 * Procedure:
 *   BuildMap
 * Description:
 *   Builds a color map in normalized histogram order, with an index
 *   to map from a matching color space.
 *   Uses these steps:
 *     1. Finds a monochrome scale for the image that maximizes the
 *        resolution for this color sample.
 *     2. Builds a Histogram from this monochrome value for the image,
 *        annotated with the average color for each bin.
 *     3. Builds a map of color normalized to the amount of each
 *        luminance found in the reference image.
 *     4. Builds a lookup to go from the reference color space to the
 *        normalized map space.
 **************************************************************************/
BOOL SphereColor::BuildMap(Image *aSphereImage, INT32 aMapChannel, INT32 aMapThreshold)
{
    INT32 anEntryCount = 1 << SPHERE_COLOR_SHIFT;
    UINT16 *aBuffer = (UINT16*)aSphereImage->image_ptr;
    INT32 anImageSize = aSphereImage->width * aSphereImage->height;
    INT64 anAccumColor[3];
    INT32 aFoundPixelCount = 0;
    sphereHistoAccum *aTempColorEntries;
    INT32 i;
    BOOL aResult = FALSE;

    iService->AllocL(anEntryCount * sizeof(sphereHistoAccum), 'cent');
    aTempColorEntries = (sphereHistoAccum*)iService->GetAlloc();
    iService->PushAllocL(aTempColorEntries);
    memset(aTempColorEntries, 0, anEntryCount * sizeof(sphereHistoAccum));

    // accumulate the average color of the pixels above the map threshold
    anAccumColor[0] = 0;
    anAccumColor[1] = 0;
    anAccumColor[2] = 0;
    for (i = 0; i < anImageSize; i++)
    {
        if (aBuffer[aMapChannel] > aMapThreshold)
        {
            anAccumColor[0] += aBuffer[0];
            anAccumColor[1] += aBuffer[1];
            anAccumColor[2] += aBuffer[2];
            aFoundPixelCount++;
        }
        aBuffer += aSphereImage->channels;
    }
    if (aFoundPixelCount > 0)
    {
        anAccumColor[0] = anAccumColor[0] / aFoundPixelCount;
        anAccumColor[1] = anAccumColor[1] / aFoundPixelCount;
        anAccumColor[2] = anAccumColor[2] / aFoundPixelCount;
        CalcMonoScaler(anAccumColor);
    }
    // build a luminance histogram annotated with the average color of each bin
    aBuffer = (UINT16*)aSphereImage->image_ptr;
    for (i = 0; i < anImageSize; i++)
    {
        if (aBuffer[aMapChannel] > aMapThreshold)
        {
            UINT32 aLuminance = ((aBuffer[0] * (UINT32)iMonoScaler[0]) >> 16) +
                                ((aBuffer[1] * (UINT32)iMonoScaler[1]) >> 16) +
                                ((aBuffer[2] * (UINT32)iMonoScaler[2]) >> 16);
            INT32 aLumIndex = aLuminance >> (16 - SPHERE_COLOR_SHIFT);
            aTempColorEntries[aLumIndex].shaColorCount++;
            aTempColorEntries[aLumIndex].shaAverageColor[0] += aBuffer[0];
            aTempColorEntries[aLumIndex].shaAverageColor[1] += aBuffer[1];
            aTempColorEntries[aLumIndex].shaAverageColor[2] += aBuffer[2];
            aTempColorEntries[aLumIndex].shaLuminance += aLuminance;
        }
        aBuffer += aSphereImage->channels;
    }
    if (aFoundPixelCount > 256)
    {
        double anIncrement = (REAL)aFoundPixelCount / (REAL)anEntryCount;
        double aRunningCount = 0;
        UINT32 j;
        aResult = TRUE;
        if (this->iHisto == NULL)
        {
            iService->AllocL(anEntryCount * sizeof(sphereHistoEntry), 'big ');
            this->iHisto = (sphereHistoEntry*)iService->GetAlloc();
        }
        memset(iHisto, 0, anEntryCount * sizeof(sphereHistoEntry));
        for (i = 0; i < anEntryCount; i++)
        {
            INT32 aColorCount = aTempColorEntries[i].shaColorCount;
            if (aColorCount != 0)
            {
                aTempColorEntries[i].shaAverageColor[0] /= aColorCount;
                aTempColorEntries[i].shaAverageColor[1] /= aColorCount;
                aTempColorEntries[i].shaAverageColor[2] /= aColorCount;
            }
        }
        // normalize the histogram into the map entries
        for (i = 0; i < anEntryCount; i++)
        {
            double aNextCount = aRunningCount + aTempColorEntries[i].shaColorCount;
            double aHistoValue = (aRunningCount / anIncrement);
            UINT32 aRunStart = (UINT32)floor(aHistoValue);
            UINT32 aRunEnd = (UINT32)ceil(aNextCount / anIncrement);
            UINT32 aRunDiff;
            INT32 aLumShift = (16 - SPHERE_COLOR_SHIFT);
            INT32 aLumIncrement = 1 << aLumShift;
            if (aRunStart > (UINT32)anEntryCount - 1)
                aRunStart = (UINT32)anEntryCount - 1;
            if (aRunEnd > (UINT32)anEntryCount)
                aRunEnd = (UINT32)anEntryCount;
            aRunDiff = aRunEnd - aRunStart;
            iHisto[i].shaIndex = aRunStart;
            iHisto[i].shaHistoFract = (UINT16)((aHistoValue - aRunStart) * 0x0000FFFE);
            if (iHisto[i].shaHistoFract > 0x0000fffe)
                iHisto[i].shaHistoFract = 0x0000fffe;
            if (aRunDiff)
            {
                UINT32 aRunScaler = 0x00010000 / aRunDiff;
                for (j = aRunStart; j < aRunEnd; j++)
                {
                    INT32 aFract = ((j - aRunStart) * aRunScaler);
                    this->iHisto[j].shaLuminance = (i << aLumShift);
                    this->iHisto[j].shaLuminance += (aLumIncrement * aFract) >> 16;
                    INT32 aColorScaler = 0;
                    if (aTempColorEntries[i].shaLuminance > 0)
                        aColorScaler = (this->iHisto[j].shaLuminance << 16) /
                                       aTempColorEntries[i].shaLuminance;
                    this->iHisto[j].shaColor[0] =
                        (UINT16)((aTempColorEntries[i].shaAverageColor[0] * aColorScaler) >> 16);
                    this->iHisto[j].shaColor[1] =
                        (UINT16)((aTempColorEntries[i].shaAverageColor[1] * aColorScaler) >> 16);
                    this->iHisto[j].shaColor[2] =
                        (UINT16)((aTempColorEntries[i].shaAverageColor[2] * aColorScaler) >> 16);
                }
            }
            aRunningCount = aNextCount;
        }
        this->iHisto[anEntryCount-1].shaColor[0] = this->iHisto[anEntryCount-2].shaColor[0];
        this->iHisto[anEntryCount-1].shaColor[1] = this->iHisto[anEntryCount-2].shaColor[1];
        this->iHisto[anEntryCount-1].shaColor[2] = this->iHisto[anEntryCount-2].shaColor[2];
    }
    iService->PopAndDestroyAlloc(aTempColorEntries);
    return(aResult);
}
APPENDIX D
EXEMPLARY CODE FOR BUILD REMAP TABLE
/**************************************************************************
 * Procedure:
 *   BuildRemapTable
 * Description:
 *   Builds a look-up table to remap the colors from a source sphere to
 *   a dest sphere.
 *   This function builds a luminance-based look up.
 *   For each luminance entry it does these things:
 *     1. Look up the index from this luminance into the normalized
 *        color reference space.
 *     2. Interpolate between two entries in the reference map using
 *        this luminance's fractional weight.
 *     3. Store the interpolated value in the entry for this luminance.
 **************************************************************************/
void SphereColor::BuildRemapTable(SphereColor *aSourceSphere)
{
    INT32 anEntryCount = 1 << SPHERE_COLOR_SHIFT;
    UINT16 *aDstColor;
    INT32 aLumShift = (16 - SPHERE_COLOR_SHIFT);
    INT32 i;

    if (iRemap == NULL)
    {
        iService->AllocL(anEntryCount * sizeof(UINT16) * 3, 'map ');
        this->iRemap = (UINT16*)iService->GetAlloc();
    }
    aDstColor = this->iRemap;
    for (i = 0; i < anEntryCount; i++)
    {
        // for each entry in the table...
        // map from luminance into normalized Histogram order
        INT32 aHistoIndex0 = aSourceSphere->iHisto[i].shaIndex;
        // if this is not the last entry...
        if ((i + 1) < anEntryCount)
        {
            // interpolate between this and the next entry for smoothness
            UINT32 aHistoBlend1 = aSourceSphere->iHisto[i].shaHistoFract;
            UINT32 aHistoBlend0 = 0x0000ffff - aHistoBlend1;
            INT32 aHistoIndex1 = aSourceSphere->iHisto[i+1].shaIndex;
            UINT16 *aDstColor0 = this->iHisto[aHistoIndex0].shaColor;
            UINT16 *aDstColor1 = this->iHisto[aHistoIndex1].shaColor;
            aDstColor[0] = (UINT16)(((aDstColor0[0] * aHistoBlend0) >> 16) +
                                    ((aDstColor1[0] * aHistoBlend1) >> 16));
            aDstColor[1] = (UINT16)(((aDstColor0[1] * aHistoBlend0) >> 16) +
                                    ((aDstColor1[1] * aHistoBlend1) >> 16));
            aDstColor[2] = (UINT16)(((aDstColor0[2] * aHistoBlend0) >> 16) +
                                    ((aDstColor1[2] * aHistoBlend1) >> 16));
        }
        else
        {
            // last entry, no interpolation
            UINT16 *aHistoColor = this->iHisto[aHistoIndex0].shaColor;
            aDstColor[0] = aHistoColor[0];
            aDstColor[1] = aHistoColor[1];
            aDstColor[2] = aHistoColor[2];
        }
        aDstColor += 3;
    }
}
APPENDIX E
EXEMPLARY CODE FOR REMAP IMAGE COMPOSITE
/**************************************************************************
 * Procedure:
 *   RemapImageComposite
 * Description:
 *   Remaps the Color of aSourceImage with aSourceSphere color map to
 *   aDestImage with the color map of this SphereColor object.
 *   aMapChannel is the index of the alpha channel with the region to be
 *   remapped indicated in white. The remapped color is then blended into
 *   the image.
 *   This function does the following:
 *     Builds a remapTable to go from the source color space to the dest
 *     color space.
 *     Using the source color space's Monochrome scaler, find a luminance
 *     for each pixel.
 *     Use the luminance to look up the new color value in the remap table.
 *     Composite the new color value into the dest image.
 **************************************************************************/
void SphereColor::RemapImageComposite(Image *aSourceImage,
                                      SphereColor *aSourceSphere,
                                      Image *aDestImage,
                                      INT32 aMapChannel)
{
    if (aSourceImage && aSourceSphere && aDestImage)
    {
        if (aSourceImage->iElemType == elem16bit &&
            aDestImage->iElemType == elem16bit)
        {
            if (aSourceImage->height == aDestImage->height &&
                aSourceImage->width == aDestImage->width &&
                aMapChannel > 2 &&
                aSourceImage->channels > aMapChannel &&
                aDestImage->channels > 3)
            {
                if (iHisto && aSourceSphere->iHisto)
                {
                    INT32 aPixelCount = aSourceImage->height * aSourceImage->width;
                    UINT16 *aSrcBuffer = (UINT16*)aSourceImage->image_ptr;
                    UINT16 *aDstBuffer = (UINT16*)aDestImage->image_ptr;
                    UINT32 *srcMonoScaler = aSourceSphere->iMonoScaler;
                    INT32 aLumShift = (16 - SPHERE_COLOR_SHIFT);
                    INT32 i;
                    BuildRemapTable(aSourceSphere);
                    for (i = 0; i < aPixelCount; i++)
                    {
                        // for every pixel in the image...
                        if (aSrcBuffer[aMapChannel] > 0x000000ff)
                        {
                            // fetch the blending values for alpha coloring
                            UINT32 anAlphaBlend0 = aSrcBuffer[aMapChannel];
                            UINT32 anAlphaBlend1 = 0x0000ffff - anAlphaBlend0;
                            // find the luminance using this color space's monochrome scaler
                            UINT32 aLuminance = ((aSrcBuffer[0] * (UINT32)srcMonoScaler[0]) >> 16) +
                                                ((aSrcBuffer[1] * (UINT32)srcMonoScaler[1]) >> 16) +
                                                ((aSrcBuffer[2] * (UINT32)srcMonoScaler[2]) >> 16);
                            // convert the luminance value to an index for the look-up
                            INT32 aLumIndex = aLuminance >> aLumShift;
                            // look up the replacement color for blending
                            UINT16 *aBlendColor = iRemap + (aLumIndex * 3);
                            // alpha blend the color into the destination image
                            aDstBuffer[0] = (UINT16)(((aBlendColor[0] * anAlphaBlend0) >> 16) +
                                                     ((aDstBuffer[0] * anAlphaBlend1) >> 16));
                            aDstBuffer[1] = (UINT16)(((aBlendColor[1] * anAlphaBlend0) >> 16) +
                                                     ((aDstBuffer[1] * anAlphaBlend1) >> 16));
                            aDstBuffer[2] = (UINT16)(((aBlendColor[2] * anAlphaBlend0) >> 16) +
                                                     ((aDstBuffer[2] * anAlphaBlend1) >> 16));
                            INT32 aSum = aDstBuffer[3];
                            aSum += aSrcBuffer[aMapChannel];
                            if (aSum > 0x0000ffff)
                                aSum = 0x0000ffff;
                            aDstBuffer[3] = aSum;
                        }
                        aSrcBuffer += aSourceImage->channels;
                        aDstBuffer += aDestImage->channels;
                    }
                }
            }
        }
    }
}
Claims (36)
1. An apparatus for modeling a product, comprising:
a plurality of markers that are capable of forming a marker pattern on a product and the marker pattern does not occlude a surface of the product;
an imaging device that is capable of taking a single image of the product on an object and the plurality of markers and the product is one of an item worn by a human being and an item worn by an animal; and a computing device that captures a contour of a design area when that design area is on a product that is represented on an object in the single image, electronically applies a user design to the product and generates a visual representation of the user design on the product when on the object using the captured lighting, texture and contours of the product and the object so that the visual representation of the design on the product has the contour of the surface of the product.
2. The apparatus of claim 1, wherein the computing device generates a web page that displays the visual representation of the product with the design on the web page to a consumer.
3. The apparatus of claim 2, wherein the object further comprises a human model, a mannequin or an animal.
4. The apparatus of claim 1, wherein each marker further comprises a piece of pigment.
5. The apparatus of claim 1, wherein each marker further comprises a piece of reflective material.
6. The apparatus of claim 5, wherein the piece of reflective material further comprises a piece of retro-reflective material.
7. The apparatus of claim 1, wherein each marker further comprises a circular marker.
8. The apparatus of claim 1, wherein the plurality of markers further comprise a grid of lines on the product not visible to a human.
9. The apparatus of claim 1, wherein the product further comprises a piece of apparel, a garment, an item worn by a human being or an item worn by an animal.
10. The apparatus of claim 1, wherein the computing device maps one or more points on the design to one or more points on the contour of the surface of the product on an object.
11. The apparatus of claim 1, wherein the computing device colorizes the surface of the product on the object prior to generating the visual representation of the design on the product.
12. The apparatus of claim 11, wherein the computing device texturizes the surface of the product on the object prior to generating the visual representation of the design on the product.
13. The apparatus of claim 1, wherein the imaging device further comprises a camera.
14. The apparatus of claim 1, wherein the computing device further comprises a networked computing system, a client/server system, a peer-to-peer system, an ASP model type system, a laptop computer, a mobile device or a mobile cellular phone.
15. The apparatus of claim 1, wherein the marker pattern further comprises a grid of markers on the product.
16. A method for product modeling, comprising:
providing, using an imaging device, a contour of a design area when that design area is on a product that is represented on an object in the single image generated by imaging, using a single image, the product on an object using a plurality of markers that form a marker pattern on the product and the marker pattern does not occlude a surface of the product, wherein the product is one of an item worn by a human being and an item worn by an animal;
electronically applying, using a computer, a user design to the product; and generating, using the computer, a visual representation of the design on the product when on the object using the captured lighting, texture and contours of the product and the object so that the visual representation of the design on the product has the contour of the surface of the product.
17. The method of claim 16, wherein providing the contour of the design area further comprises placing the plurality of markers on the product to create a grid, imaging the product on the object with the grid to generate an imaged product and capturing the contour of the surface of the product when the product is on the object based on the imaged product.
18. The method of claim 17 further comprising displaying the visual representation of the product with the design on a web page to a consumer.
19. The method of claim 17, wherein the object further comprises a human model, a mannequin or an animal.
20. The method of claim 17, wherein placing the plurality of markers on the product further comprises affixing the plurality of markers to a physical product.
21. The method of claim 17, wherein placing the plurality of markers on the product further comprises electronically affixing the plurality of markers to an image of a physical product.
22. The method of claim 17, wherein each marker further comprises a piece of pigment.
23. The method of claim 17, wherein each marker further comprises a piece of reflective material.
24. The method of claim 23, wherein the piece of reflective material further comprises a piece of retro-reflective material.
25. The method of claim 17, wherein each marker further comprises a circular marker.
26. The method of claim 17, wherein placing the plurality of markers on the product further comprises placing a grid of lines on the product not visible to a human.
27. The method of claim 16, wherein the product further comprises a piece of apparel, a garment, an item worn by a human being or an item worn by an animal.
28. The method of claim 16, wherein generating the visual representation of the design on the product further comprises mapping one or more points on the design to one or more points on the contour of the surface of the product on the object.
29. The method of claim 28, wherein the mapping the one or more points further comprising using a warp mapping.
30. The method of claim 29, wherein the warp mapping further comprises using a bicubic image warp.
31. The method of claim 16, wherein generating the visual representation of the design on the product further comprises colorizing the surface of the product on the object prior to generating the visual representation of the design on the product.
32. The method of claim 31, wherein colorizing the surface of the product further comprises using a color calibration card.
33. The method of claim 31, wherein generating the visual representation of the design on the product further comprises texturizing the surface of the product on the object prior to generating the visual representation of the design on the product.
34. The method of claim 16, wherein the marker pattern further comprises a grid of markers on the product.
35. An apparatus for modeling a product, comprising:
a plurality of markers that are capable of forming a marker pattern on a product that does not occlude a surface of the product;
an imaging device that is capable of taking a single image of the product on an object and the plurality of markers, wherein the product is one of an item worn by a human being and an item worn by an animal; and a computing device that captures a contour of a design area when that design area is on a product that is represented on an object in the single image, electronically applies a user design to the product and generates a visual representation of the user design on the product when on the object using the captured lighting, texture and contours of the product and the object so that the visual representation of the design on the product has the contour of the surface of the product.
36. A method for product modeling, comprising:
providing, using an imaging device, a contour of a design area when that design area is on a product that is represented on an object in the single image generated by imaging, using a single image, the product on an object with a single image using a plurality of markers that form a marker pattern on the product that does not occlude a surface of the product;
electronically applying, using a computer, a user design to the product; and generating, using the computer, a visual representation of the user design on the product when on the object using the captured lighting, texture and contours of the product and the object so that the visual representation of the user design on the product has the contour of the surface of the product.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/925,716 US8174521B2 (en) | 2007-10-26 | 2007-10-26 | Product modeling system and method |
US11/925,716 | 2007-10-26 | ||
PCT/US2008/081215 WO2009055738A1 (en) | 2007-10-26 | 2008-10-24 | Product modeling system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2706699A1 CA2706699A1 (en) | 2009-04-30 |
CA2706699C true CA2706699C (en) | 2013-10-01 |
Family
ID=40580071
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2706699A Active CA2706699C (en) | 2007-10-26 | 2008-10-24 | Product modeling system and method |
Country Status (8)
Country | Link |
---|---|
US (4) | US8174521B2 (en) |
EP (1) | EP2215603A4 (en) |
JP (1) | JP4951709B2 (en) |
KR (1) | KR101243429B1 (en) |
CN (1) | CN101933048B (en) |
AU (1) | AU2008316632B2 (en) |
CA (1) | CA2706699C (en) |
WO (1) | WO2009055738A1 (en) |
Families Citing this family (60)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10127480B1 (en) | 2007-03-09 | 2018-11-13 | R. B. III Associates, Inc. | System for automated decoration |
US11157977B1 (en) | 2007-10-26 | 2021-10-26 | Zazzle Inc. | Sales system using apparel modeling system and method |
US8917424B2 (en) | 2007-10-26 | 2014-12-23 | Zazzle.Com, Inc. | Screen printing techniques |
US8174521B2 (en) | 2007-10-26 | 2012-05-08 | Zazzle.Com | Product modeling system and method |
US9147213B2 (en) * | 2007-10-26 | 2015-09-29 | Zazzle Inc. | Visualizing a custom product in situ |
US8170367B2 (en) * | 2008-01-28 | 2012-05-01 | Vistaprint Technologies Limited | Representing flat designs to be printed on curves of a 3-dimensional product |
US10719862B2 (en) | 2008-07-29 | 2020-07-21 | Zazzle Inc. | System and method for intake of manufacturing patterns and applying them to the automated production of interactive, customizable product |
US9087355B2 (en) * | 2008-08-22 | 2015-07-21 | Zazzle Inc. | Product customization system and method |
US8947455B2 (en) | 2010-02-22 | 2015-02-03 | Nike, Inc. | Augmented reality design system |
US9213920B2 (en) | 2010-05-28 | 2015-12-15 | Zazzle.Com, Inc. | Using infrared imaging to create digital images for use in product customization |
CN102339023B (en) * | 2010-07-16 | 2013-11-13 | 华宝通讯股份有限公司 | Mechanical device control system |
US8516392B2 (en) * | 2010-08-31 | 2013-08-20 | Daniel Reuven Ostroff | Interactive generic configurator program |
US9110673B2 (en) * | 2010-08-31 | 2015-08-18 | Daniel Reuven Ostroff | System and method of creating and remotely editing interactive generic configurator programs |
US8711175B2 (en) * | 2010-11-24 | 2014-04-29 | Modiface Inc. | Method and system for simulating superimposition of a non-linearly stretchable object upon a base object using representative images |
US9111306B2 (en) | 2011-01-10 | 2015-08-18 | Fujifilm North America Corporation | System and method for providing products from multiple websites |
EP3664428B1 (en) * | 2011-08-31 | 2021-04-28 | Zazzle Inc. | Tiling process for digital image retrieval |
US20130090894A1 (en) * | 2011-10-11 | 2013-04-11 | Tomer DIKERMAN | System and method for computer-aided user-specific design |
US10969743B2 (en) | 2011-12-29 | 2021-04-06 | Zazzle Inc. | System and method for the efficient recording of large aperture wave fronts of visible and near visible light |
US9371712B2 (en) | 2012-03-09 | 2016-06-21 | Halliburton Energy Services, Inc. | Cement set activators for set-delayed cement compositions and associated methods |
US9255454B2 (en) | 2012-03-09 | 2016-02-09 | Halliburton Energy Services, Inc. | Set-delayed cement compositions comprising pumice and associated methods |
US8851173B2 (en) | 2012-03-09 | 2014-10-07 | Halliburton Energy Services, Inc. | Set-delayed cement compositions comprising pumice and associated methods |
US9328281B2 (en) | 2012-03-09 | 2016-05-03 | Halliburton Energy Services, Inc. | Foaming of set-delayed cement compositions comprising pumice and hydrated lime |
US9212534B2 (en) * | 2012-03-09 | 2015-12-15 | Halliburton Energy Services, Inc. | Plugging and abandoning a well using a set-delayed cement composition comprising pumice |
US9796904B2 (en) | 2012-03-09 | 2017-10-24 | Halliburton Energy Services, Inc. | Use of MEMS in set-delayed cement compositions comprising pumice |
US10195764B2 (en) | 2012-03-09 | 2019-02-05 | Halliburton Energy Services, Inc. | Set-delayed cement compositions comprising pumice and associated methods |
US9505972B2 (en) | 2012-03-09 | 2016-11-29 | Halliburton Energy Services, Inc. | Lost circulation treatment fluids comprising pumice and associated methods |
US9255031B2 (en) | 2012-03-09 | 2016-02-09 | Halliburton Energy Services, Inc. | Two-part set-delayed cement compositions |
US9790132B2 (en) | 2012-03-09 | 2017-10-17 | Halliburton Energy Services, Inc. | Set-delayed cement compositions comprising pumice and associated methods |
US9580638B2 (en) | 2012-03-09 | 2017-02-28 | Halliburton Energy Services, Inc. | Use of synthetic smectite in set-delayed cement compositions |
US10082001B2 (en) | 2012-03-09 | 2018-09-25 | Halliburton Energy Services, Inc. | Cement set activators for cement compositions and associated methods |
US9856167B2 (en) | 2012-03-09 | 2018-01-02 | Halliburton Energy Services, Inc. | Mitigation of contamination effects in set-delayed cement compositions comprising pumice and hydrated lime |
US9227872B2 (en) | 2012-03-09 | 2016-01-05 | Halliburton Energy Services, Inc. | Cement set activators for set-delayed cement compositions and associated methods |
US9534165B2 (en) | 2012-03-09 | 2017-01-03 | Halliburton Energy Services, Inc. | Settable compositions and methods of use |
US9328583B2 (en) | 2012-03-09 | 2016-05-03 | Halliburton Energy Services, Inc. | Set-delayed cement compositions comprising pumice and associated methods |
US10202751B2 (en) | 2012-03-09 | 2019-02-12 | Halliburton Energy Services, Inc. | Set-delayed cement compositions comprising pumice and associated methods |
US9665981B2 (en) * | 2013-01-07 | 2017-05-30 | R.B. Iii Associates Inc | System and method for generating 3-D models from 2-D views |
CN102864566B (en) * | 2012-09-29 | 2014-02-12 | 加宝利服装有限公司 | Fabric manufacture method, manufacture control method, manufacture control device and manufacture system |
US20140201023A1 (en) * | 2013-01-11 | 2014-07-17 | Xiaofan Tang | System and Method for Virtual Fitting and Consumer Interaction |
US8712566B1 (en) | 2013-03-14 | 2014-04-29 | Zazzle Inc. | Segmentation of a product markup image based on color and color differences |
US9704296B2 (en) | 2013-07-22 | 2017-07-11 | Trupik, Inc. | Image morphing processing using confidence levels based on captured images |
US8958663B1 (en) * | 2013-09-24 | 2015-02-17 | Zazzle Inc. | Automated imaging of customizable products |
US20160168363A1 (en) * | 2014-06-27 | 2016-06-16 | Api Intellectual Property Holdings, Llc | Nanocellulose-polymer composites, and processes for producing them |
US9734631B2 (en) | 2014-07-22 | 2017-08-15 | Trupik, Inc. | Systems and methods for image generation and modeling of complex three-dimensional objects |
GB201420090D0 (en) * | 2014-11-12 | 2014-12-24 | Knyttan Ltd | Image to item mapping |
EP3038053B1 (en) | 2014-12-22 | 2019-11-06 | Reactive Reality GmbH | Method and system for generating garment model data |
CN107533769B (en) * | 2015-03-16 | 2021-04-02 | 彩滋公司 | Automated computer-aided design in the preparation of cut thermoadhesive films |
CN113693325B (en) * | 2015-08-10 | 2022-07-08 | 彩滋公司 | System and method for customizing digital indicia of a product |
US10176636B1 (en) * | 2015-12-11 | 2019-01-08 | A9.Com, Inc. | Augmented reality fashion |
US10192346B2 (en) * | 2016-09-28 | 2019-01-29 | Pixar | Generating UV maps for modified meshes |
JP6915629B2 (en) * | 2016-12-27 | 2021-08-04 | ソニーグループ株式会社 | Product design system and design image correction device |
US11651179B2 (en) | 2017-02-20 | 2023-05-16 | 3M Innovative Properties Company | Optical articles and systems interacting with the same |
US11246366B2 (en) * | 2017-05-31 | 2022-02-15 | Nike, Inc. | Selective deposition of reflective materials for an apparel item |
US10679539B2 (en) * | 2017-08-10 | 2020-06-09 | Outward, Inc. | Two-dimensional compositing |
US11314971B2 (en) | 2017-09-27 | 2022-04-26 | 3M Innovative Properties Company | Personal protective equipment management system using optical patterns for equipment and safety monitoring |
CN112262363A (en) | 2018-02-27 | 2021-01-22 | 利惠商业有限公司 | Laser arrangement design tool |
US20190272663A1 (en) * | 2018-03-05 | 2019-09-05 | Vida & Co. | Simulating display of a 2d design on an image of a 3d object |
US10650584B2 (en) * | 2018-03-30 | 2020-05-12 | Konica Minolta Laboratory U.S.A., Inc. | Three-dimensional modeling scanner |
US20210073886A1 (en) | 2019-08-29 | 2021-03-11 | Levi Strauss & Co. | Digital Showroom with Virtual Previews of Garments and Finishes |
CN113487727B (en) * | 2021-07-14 | 2022-09-02 | 广西民族大学 | Three-dimensional modeling system, device and method |
KR102536525B1 (en) * | 2022-06-29 | 2023-05-26 | 주식회사 쓰리디뱅크 | Filming Device for 3D Scanning Using a Special Mannequin Equipped With a Marker |
Family Cites Families (64)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6280891B2 (en) | 1994-05-04 | 2001-08-28 | Hologram Industries S.A. | Multi-layer assembly and method for marking articles and resulting marked articles |
JPH0944556A (en) * | 1995-08-03 | 1997-02-14 | Sanyo Electric Co Ltd | Trial wearing simulation method |
US5850222A (en) * | 1995-09-13 | 1998-12-15 | Pixel Dust, Inc. | Method and system for displaying a graphic image of a person modeling a garment |
JPH09204478A (en) * | 1996-01-25 | 1997-08-05 | Topcon Corp | Merchandise information system |
JPH09273017A (en) * | 1996-04-02 | 1997-10-21 | Toray Ind Inc | Method for displaying wearing state of wear, production of wear and apparatus for assisting production of wear |
JPH1153427A (en) * | 1997-06-06 | 1999-02-26 | Toray Ind Inc | Method and device for setting color pattern of image |
US6310627B1 (en) | 1998-01-20 | 2001-10-30 | Toyo Boseki Kabushiki Kaisha | Method and system for generating a stereoscopic image of a garment |
US6097310A (en) * | 1998-02-03 | 2000-08-01 | Baker Hughes Incorporated | Method and apparatus for mud pulse telemetry in underbalanced drilling systems |
US6173211B1 (en) | 1998-04-15 | 2001-01-09 | Gerber Technology, Inc. | Apparatus and method for fabric printing of nested |
US6836695B1 (en) | 1998-08-17 | 2004-12-28 | Soft Sight Inc. | Automatically generating embroidery designs from a scanned image |
US20050131571A1 (en) | 1998-11-23 | 2005-06-16 | Darryl Costin | Internet customization of apparel |
JP4438971B2 (en) * | 1999-10-20 | 2010-03-24 | グンゼ株式会社 | Data three-dimensional apparatus and computer-readable recording medium recording data three-dimensional program |
JP2001160095A (en) * | 1999-12-03 | 2001-06-12 | Soft Ryutsu Kk | Virtual mall system, commodity information transmission method for the mall system, recording medium with computer program for realizing the mall system recorded thereon and a recording medium with computer program recorded thereon for allowing external computer accessing the mall system to display commodity |
US7302114B2 (en) | 2000-01-18 | 2007-11-27 | Branders.Com, Inc. | Methods and apparatuses for generating composite images |
US6196146B1 (en) | 2000-03-23 | 2001-03-06 | Pulse Microsystems Ltd. | Web based embroidery system and method |
JP2001273446A (en) * | 2000-03-28 | 2001-10-05 | Akesesu:Kk | Article fitting system |
US7149665B2 (en) | 2000-04-03 | 2006-12-12 | Browzwear International Ltd | System and method for simulation of virtual wear articles on virtual models |
US7216092B1 (en) | 2000-04-14 | 2007-05-08 | Deluxe Corporation | Intelligent personalization system and method |
US6968075B1 (en) * | 2000-05-09 | 2005-11-22 | Chang Kurt C | System and method for three-dimensional shape and size measurement |
US20020099524A1 (en) | 2000-05-31 | 2002-07-25 | 3M Innovative Properties Company | Process and system for designing a customized artistic element package |
US6546309B1 (en) * | 2000-06-29 | 2003-04-08 | Kinney & Lange, P.A. | Virtual fitting room |
JP2002058045A (en) * | 2000-08-08 | 2002-02-22 | Komatsu Ltd | System and method for entering real object into virtual three-dimensional space |
AU2001278318A1 (en) | 2000-07-24 | 2002-02-05 | Jean Nicholson Prudent | Modeling human beings by symbol manipulation |
US6473671B1 (en) | 2000-09-11 | 2002-10-29 | He Yan | 3-D modeling of prototype garments |
US7918808B2 (en) | 2000-09-20 | 2011-04-05 | Simmons John C | Assistive clothing |
BE1013816A6 (en) | 2000-10-30 | 2002-09-03 | Douelou Nv | Production of made-to-order clothing, e.g. via the Internet, where the customer inputs their age, weight, height, and collar size into a controller, which then determines the clothing pattern |
JP2002140578A (en) * | 2000-10-31 | 2002-05-17 | Fujitsu Ltd | Sale supporting method, sale supporting device and recording medium |
US6564118B1 (en) | 2000-12-28 | 2003-05-13 | Priscilla Swab | System for creating customized patterns for apparel |
GB0101371D0 (en) * | 2001-01-19 | 2001-03-07 | Virtual Mirrors Ltd | Production and visualisation of garments |
US6842532B2 (en) | 2001-02-08 | 2005-01-11 | The Hong Kong Polytechnic University | Three dimensional measurement, evaluation and grading system for fabric/textile structure/garment appearance |
FI20011814A (en) | 2001-09-14 | 2003-03-15 | Jari Ruuttu | A method of obtaining a particular product through a public information network such as the Internet |
US7479956B2 (en) * | 2001-10-19 | 2009-01-20 | Unique Solutions Design Ltd. | Method of virtual garment fitting, selection, and processing |
WO2003064170A1 (en) | 2002-01-30 | 2003-08-07 | Gerber Scientific Products, Inc. | Apparatus and method for printing and cutting customized wall decorations |
GB0225789D0 (en) | 2002-03-25 | 2002-12-11 | Makemyphone Ltd | Method and apparatus for creating image production file for a custom imprinted article |
JP2004086662A (en) * | 2002-08-28 | 2004-03-18 | Univ Waseda | Clothes try-on service providing method and clothes try-on system, user terminal device, program, program for mounting cellphone, and control server |
ES2211357B1 (en) | 2002-12-31 | 2005-10-16 | Reyes Infografica, S.L. | Computer-assisted method for designing garments |
JP4246516B2 (en) * | 2003-02-14 | 2009-04-02 | 独立行政法人科学技術振興機構 | Human video generation system |
US20040194344A1 (en) | 2003-04-05 | 2004-10-07 | Tadin Anthony G. | User-customizable insoles for footwear and method of customizing insoles |
US20040227752A1 (en) * | 2003-05-12 | 2004-11-18 | Mccartha Bland | Apparatus, system, and method for generating a three-dimensional model to represent a user for fitting garments |
US20040236455A1 (en) * | 2003-05-22 | 2004-11-25 | Kimberly-Clark Worldwide, Inc. | Method of designing a product in a virtual environment |
US20050177453A1 (en) | 2003-12-02 | 2005-08-11 | Anton John T. | Method and system for customization of consumer products |
JP2005236888A (en) * | 2004-02-23 | 2005-09-02 | Hitachi Software Eng Co Ltd | Photograph seal vending machine, and picture editing method |
US20060020486A1 (en) | 2004-04-02 | 2006-01-26 | Kurzweil Raymond C | Machine and method to assist user in selecting clothing |
US7260445B2 (en) | 2004-06-24 | 2007-08-21 | Basf Aktiengesellschaft | System and method for customized industrial textile coloration |
JP4473754B2 (en) * | 2005-03-11 | 2010-06-02 | 株式会社東芝 | Virtual fitting device |
JP2007011543A (en) * | 2005-06-29 | 2007-01-18 | Dainippon Printing Co Ltd | Article-wearing simulation system, article-wearing simulation method, and the like |
US7881818B2 (en) | 2005-10-07 | 2011-02-01 | Esko Ip Nv | Flexible packaging incorporating two-dimensional graphics |
FR2894783B1 (en) * | 2005-12-19 | 2011-06-10 | Lectra Sa | Device and method for designing a garment |
US20070208633A1 (en) | 2006-03-06 | 2007-09-06 | Ranjie Singh | Display article customization system and method of use |
CN1828671A (en) * | 2006-04-14 | 2006-09-06 | 浙江大学 | Image-based mesh texture mapping method for a garment virtual display system |
CN1877629A (en) * | 2006-06-20 | 2006-12-13 | 东华大学 | Method for displaying textile lining on internet |
US7920939B2 (en) | 2006-09-30 | 2011-04-05 | Vistaprint Technologies Limited | Method and system for creating and manipulating embroidery designs over a wide area network |
US7996756B2 (en) | 2007-09-12 | 2011-08-09 | Vistaprint Technologies Limited | System and methods for displaying user modifiable server-rendered images |
JP5237378B2 (en) | 2007-10-04 | 2013-07-17 | Nike International Ltd. | Display cards and methods for custom-made items |
US8174521B2 (en) * | 2007-10-26 | 2012-05-08 | Zazzle.Com | Product modeling system and method |
US9147213B2 (en) * | 2007-10-26 | 2015-09-29 | Zazzle Inc. | Visualizing a custom product in situ |
US20090122329A1 (en) | 2007-11-07 | 2009-05-14 | Skinit, Inc. | Customizing print content |
US8170367B2 (en) | 2008-01-28 | 2012-05-01 | Vistaprint Technologies Limited | Representing flat designs to be printed on curves of a 3-dimensional product |
US20100169185A1 (en) | 2008-06-18 | 2010-07-01 | Keith Cottingham | Self-Designed Maquettes Using Rapid Prototyping |
US8411090B2 (en) * | 2008-08-27 | 2013-04-02 | The Chinese University Of Hong Kong | Methods for flattening a 3D surface into a 2D piece |
US9213920B2 (en) * | 2010-05-28 | 2015-12-15 | Zazzle.Com, Inc. | Using infrared imaging to create digital images for use in product customization |
US8516392B2 (en) * | 2010-08-31 | 2013-08-20 | Daniel Reuven Ostroff | Interactive generic configurator program |
US8711175B2 (en) * | 2010-11-24 | 2014-04-29 | Modiface Inc. | Method and system for simulating superimposition of a non-linearly stretchable object upon a base object using representative images |
US8712566B1 (en) * | 2013-03-14 | 2014-04-29 | Zazzle Inc. | Segmentation of a product markup image based on color and color differences |
2007
- 2007-10-26 US US11/925,716 patent/US8174521B2/en active Active
2008
- 2008-10-24 CA CA2706699A patent/CA2706699C/en active Active
- 2008-10-24 AU AU2008316632A patent/AU2008316632B2/en active Active
- 2008-10-24 JP JP2010531297A patent/JP4951709B2/en active Active
- 2008-10-24 CN CN200880122986.2A patent/CN101933048B/en active Active
- 2008-10-24 EP EP08843251A patent/EP2215603A4/en not_active Ceased
- 2008-10-24 WO PCT/US2008/081215 patent/WO2009055738A1/en active Application Filing
- 2008-10-24 KR KR1020107011507A patent/KR101243429B1/en active IP Right Grant
2012
- 2012-05-04 US US13/464,551 patent/US8514220B2/en active Active
2013
- 2013-08-15 US US13/968,142 patent/US8878850B2/en active Active
2014
- 2014-11-03 US US14/531,918 patent/US9947076B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US8514220B2 (en) | 2013-08-20 |
US9947076B2 (en) | 2018-04-17 |
US20150054849A1 (en) | 2015-02-26 |
US8174521B2 (en) | 2012-05-08 |
KR20100112112A (en) | 2010-10-18 |
AU2008316632B2 (en) | 2012-02-09 |
CN101933048A (en) | 2010-12-29 |
US8878850B2 (en) | 2014-11-04 |
US20130328877A1 (en) | 2013-12-12 |
AU2008316632A1 (en) | 2009-04-30 |
EP2215603A4 (en) | 2010-12-08 |
CN101933048B (en) | 2015-04-01 |
JP4951709B2 (en) | 2012-06-13 |
KR101243429B1 (en) | 2013-03-13 |
US20090109214A1 (en) | 2009-04-30 |
CA2706699A1 (en) | 2009-04-30 |
EP2215603A1 (en) | 2010-08-11 |
US20120249552A1 (en) | 2012-10-04 |
JP2011501326A (en) | 2011-01-06 |
WO2009055738A1 (en) | 2009-04-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2706699C (en) | Product modeling system and method | |
US11640672B2 (en) | Method and system for wireless ultra-low footprint body scanning | |
US10628666B2 (en) | Cloud server body scan data system | |
US9990764B2 (en) | Virtual try on simulation service | |
CN112513913 (en) | Digital showroom for garments |
ES2272346T3 (en) | System and method for visualizing personal appearance |
CN107924555 (en) | System and method for digital marking of customized products |
EP3562134B1 (en) | Using infrared imaging to create digital images for use in product customization | |
KR102202843B1 (en) | System for providing online clothing fitting service using three dimentional avatar | |
US20130307851A1 (en) | Method for virtually trying on footwear | |
US10445856B2 (en) | Generating and displaying an actual sized interactive object | |
US20230383452A1 (en) | 3D Preview of Laser-Finished Garments | |
US11948057B2 (en) | Online garment design and collaboration system and method | |
US20220036421A1 (en) | Sales system using apparel modeling system and method | |
WO2018182938A1 (en) | Method and system for wireless ultra-low footprint body scanning | |
JP2017188071A (en) | Pattern change simulation device, pattern change simulation method and program | |
Wang | Fashion Coordination System Based on Depth Camera Capture and Virtual Reality | |
CN116757772 (en) | Intelligent management system for clothing rental and retail |
Tingare et al. | Implementation of Virtual Dressing Room using Kinect along with OpenCV |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request |