US20140028801A1 - Multispectral Binary Coded Projection - Google Patents

Multispectral Binary Coded Projection

Info

Publication number
US20140028801A1
US20140028801A1
Authority
US
United States
Prior art keywords
projector
scene
patterns
camera
binary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/839,457
Inventor
Siu-Kei Tin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to US13/839,457
Assigned to CANON KABUSHIKI KAISHA. Assignors: TIN, SIU-KEI
Publication of US20140028801A1
Status: Abandoned

Classifications

    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/254: Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • G01B 11/2509: Measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object; colour coding
    • G01B 11/2513: Measuring contours or curvatures by projecting a pattern on the object, with several lines being projected in more than one direction, e.g. grids, patterns
    • G01N 21/31: Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N 21/49: Scattering, i.e. diffuse reflection within a body or fluid
    • G01N 21/55: Specular reflectivity

Definitions

  • the present disclosure relates to illumination of an object with structured light, and measurement of light reflected therefrom, for purposes which include measurement of a material property of the object, such as by differentiating between different types of materials.
  • Binary coded patterns are robust to depth discontinuities, but multiple patterns are required. For example, to project Gray codes in 10 bits (10-bit depth resolution), a minimum of 10 projected patterns are required. This makes high-speed 3D scanning of dynamic scenes difficult. Combining 3 bit planes into one color pattern using the 3 channels of an RGB projector can reduce the number of projected patterns, but due to the crosstalk between the channels of a broadband RGB projector, a hyperspectral camera with narrow band sensitivity must be used in some systems. Hyperspectral imaging requires a long capture time for each scene and is not suitable for real-time capture of dynamic scenes or when fast sorting is desired.
  • Imaging spectroscopy such as described by Kim, et al., 2012, combines 3D measurement techniques and hyperspectral imaging, but requires a long capture time, making it unsuitable for real-time classification of materials.
  • color coded patterns, such as the rainbow projections, rely on spatial coding. They assume that depth does not change abruptly across the object. In other words, they are not robust to depth discontinuities.
  • in a first aspect, an object is illuminated with structured light that is also spectrally structured, and light reflected therefrom is measured spectrally, for purposes which include derivation of a three-dimensional (3D) measurement of the object, such as depth and/or contours of the object.
  • in a second aspect, an object is illuminated with structured light that is also spectrally structured, and light reflected therefrom is measured spectrally, for purposes which include measurement of a material property of the object, such as by differentiating between different types of materials.
  • in a third aspect, an object is illuminated with structured light that is also spectrally structured, and light reflected therefrom is measured spectrally, for purposes which include derivation of a three-dimensional (3D) measurement of the object; the 3D measurement is used to calculate a reflectance property, such as by using calculations of a surface normal vector of the object from the 3D measurement, and estimating the reflectance property from the 3D measurement, surface normal vector and spectral measurements of reflected light, for purposes which include measurement of a material property of the object, such as by differentiating between different types of materials.
  • the terms “multichannel projector” and “multi-primary projector” are generally equivalent to each other and may be used interchangeably.
  • the terms “spectral structured light” and “multispectral coded projection” should generally be considered equivalent to each other and may be used interchangeably, with the term “multispectral binary coded projection” being a special case.
  • a method of measuring depth of a scene using a multichannel projector with relatively-narrow spectrally-banded channels at predetermined wavelengths and a multispectral camera with relatively-broad spectrally-banded channels, wherein the spectral sensitivities of the camera channels at the wavelengths of the projector channels form a well-conditioned matrix, comprises projecting a multispectral binary pattern onto the scene, wherein the multispectral binary pattern is composed of binary images, one for each channel of the projector; capturing one or more images of the scene with the multispectral camera; and recovering the depth of the scene by analyzing the captured one or more images.
  • the number of projector channels is more than 3 and the number of camera channels is at least the number of projector channels. In some embodiments, the number of projector channels is 10, each channel of the projector is monochromatic, each channel of the projector is associated with a monochromatic laser source, or each channel of the projector is associated with a monochromatic LED light source.
  • the spectral sensitivity of each channel of the multispectral camera responds to at most one wavelength of a channel of the projector; a binary image is designated for projection in a projector channel by a rule that minimizes error; the binary images for the channels of the projector are ranked according to a significance value, and a binary image with a higher significance value is designated for a projector channel with a wavelength at which camera spectral sensitivity is higher; or information about the spectral reflectance in the scene is available, and a binary image with higher significance value is designated for a projector channel with a wavelength at which a combination of camera spectral sensitivity and spectral reflectance is higher.
  • analyzing the captured image includes converting the multispectral image into multiple binary images, one for each projector channel; the conversion is performed by comparing each channel of an image based on the multispectral image with a channel-dependent threshold value; the channel-dependent threshold value is also pixel-location dependent and is derived from a second multispectral image captured with an inverse multispectral binary pattern; or the image based on the multispectral image is a multispectral radiance image.
  • the wavelengths of the monochromatic channels are chosen to correspond to the wavelengths where the spectrum of the ambient light has low intensities
  • recovering the depth of the scene is performed by triangulation, or the triangulation includes determining a plane corresponding to a column of projector pixels and a line corresponding to a camera pixel and includes calculation of an intersection between the plane and the line.
  • some of the embodiments employ multispectral cameras (relatively broadband) instead of hyperspectral cameras (relatively narrow band). This is advantageous because multispectral imaging does not require high spectral resolution like hyperspectral imaging does and because acquisition time is much shorter, allowing real-time capture of dynamic scenes.
  • structured light techniques are extended in scope and concept to “spectral structured light” that enables real-time determination of spectral reflectance, in some cases by capturing as few as two (2) multispectral images using a multi-primary projector and a multispectral camera. If the 3D position and surface normal vector of a scene point are known or can be derived, the spectral reflectance can be recovered from the radiance of the scene point, which in turn can be measured by a multispectral camera. 3D position of a scene point can ordinarily be obtained readily, such as through triangulation techniques. The surface normal vector of a scene point can ordinarily be obtained by applying techniques of the first aspect described above.
  • 10-bit Gray code patterns are used as in conventional structured light, but the bit plane images are projected simultaneously using separate channels of the 10-channel projector with monochromatic primary in each channel. In some embodiments, only 2 multispectral images are captured by the multispectral camera, corresponding to the 10-bit Gray code and its inverse (complementary) code.
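As a concrete illustration of this scheme (a minimal sketch, not taken from the patent; the projector resolution, function name, and array layout are assumptions), the 10 Gray-code bit planes and their inverse patterns could be generated as one binary image per projector channel:

```python
import numpy as np

def gray_code_bit_planes(num_bits=10, width=1024, height=768):
    """Return (patterns, inverse_patterns): arrays of shape (num_bits, height, width),
    one binary image per projector channel (plane 0 is the most significant bit)."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)              # binary-reflected Gray code of each column index
    planes = np.array([(gray >> (num_bits - 1 - i)) & 1 for i in range(num_bits)], dtype=np.uint8)
    patterns = np.repeat(planes[:, np.newaxis, :], height, axis=1)  # vertical stripes: constant along rows
    inverse_patterns = 1 - patterns        # the complementary code swaps 0 and 1
    return patterns, inverse_patterns
```

Each plane would then be loaded into the corresponding projector channel so that all bit planes are projected simultaneously.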
  • this process of determining 3D position is applied to a collection of scene points.
  • the collection of determined 3D positions is referred to as a 3D map.
  • the collection of scene points may be determined by a preprocessing step applied to the multispectral images captured by the multispectral camera, such as a step of segmentation.
  • the step of segmentation may determine scene points that belong to a same material.
  • the surface normal map is not determined based on the 3D map. Instead, a photometric stereo method is used to determine the surface normal vectors directly, wherein one or more projectors or illumination sources may be used.
  • a method of estimating a material property of a scene by projecting one or more patterns onto the scene using a projector and capturing one or more images of the scene using a camera comprises determining 3D positions of a collection of scene points from the one or more captured images, determining surface normal vectors of the scene points in the collection of scene points from the one or more captured images, calculating a local reflectance value for each of the scene points in the collection of scene points, and fitting a global reflectance model to the local reflectance values so as to obtain an estimate of a material property parameter.
  • the projector is a multi-primary projector with multiple channels
  • the camera is a multispectral camera with at least as many channels as the number of projector channels
  • the one or more patterns are patterns that are composed of binary patterns in each projector channel.
  • the binary patterns are based on the Gray code; the one or more patterns consist of two patterns such that for each projector channel, the composing binary pattern in one pattern is complementary to the composing binary pattern in the other pattern; or determining the 3D position of a scene point from the collection of scene points includes determining a projector-pixel coordinate and a camera-pixel coordinate corresponding to the scene point, followed by a triangulation calculation.
  • determining the projector pixel coordinate includes recovering a binary representation of a code with each binary digit corresponding to a projector channel, recovering a binary representation of a code comprises comparing the values of the complementary binary patterns at the scene point, determining surface normal vectors comprises applying a spatial smoothing operation to the determined 3D positions in a moving window followed by fitting a tangent plane in the moving window, or determining surface normal vectors is based on a photometric stereo algorithm.
  • calculating a local reflectance value for each of the scene points is based on the 3D position and surface normal vector of the scene point, as well as the values of the one or more captured images at a pixel corresponding to the scene point; the global reflectance model to be fitted is a constant reflectance value to be determined; or the global reflectance model to be fitted is a BRDF model (“bidirectional reflectance distribution function”).
  • these systems and methods enable real-time determination of material properties by capturing very few multispectral images, such as only one multispectral image or only two multispectral images or three or more multispectral images.
  • Some embodiments may be implemented as a method or methods according to any of the disclosure herein. Some embodiments may be implemented as an apparatus or apparatuses according to any of the disclosure herein. Representative embodiments of such apparatus may be implemented as one or more processors constructed to execute stored process steps together with memory which stores the process steps described herein for execution by the processor(s). Other representative embodiments of such apparatus may be implemented as units constructed to execute processes described herein, with such units, for example, being implemented by computer hardware in combination with software which when executed by the computer hardware causes the computer hardware to execute such processes. Some further embodiments may be implemented as non-transitory computer-readable storage media which retrievably stores computer-executable process steps which when executed by a computer cause the computer to execute such process steps.
  • FIG. 1 is a view showing a multichannel projector for projecting spectral structured light on an object and a multispectral camera for multispectral capture of light reflected therefrom.
  • FIGS. 2 and 2A are views showing the relation between relative sensitivity of the multispectral camera as compared to the wavelengths of light projected by the multichannel projector.
  • FIG. 3 is an example of input binary images based on Gray code, and shows ten (10) binary images.
  • FIG. 4 shows inverse binary patterns to those of FIG. 3 .
  • FIG. 5 is a view for explaining 3D triangulation for determining the 3D position of a scene point.
  • FIG. 6 shows an example of a 3D-reconstruction-error distribution for two different camera sensitivity matrices, one with a relatively low condition number and the other with a relatively high condition number.
  • FIG. 7 shows an example embodiment of an automated sorting system such as might be used in a recycling facility, which combines 3D depth reconstruction and material classification based on projection of spectral structured light, together with material classification logic for automated sorting of objects based on material and/or based on shape.
  • FIG. 8 is a view for explaining the 3D position of a point on the surface of an object in a scene, and for explaining the surface normal to the point.
  • FIG. 9 is a flowchart for explaining the determination of spectral reflectance values for points in a scene, by using 3D positions of the points and by using normal vectors to the points, in order to develop a local reflectance map.
  • FIG. 10 is an example of a local reflectance map.
  • FIG. 11 is a view for explaining one example of calculating a local reflectance value, by using two or more patterns that include binary patterns in each projector channel.
  • FIG. 12 shows the radiance profiles of I 0 , I 2 , and I 3 of FIG. 11 , along a horizontal row of pixels.
  • FIG. 13 is a view for explaining a geometry of a ray for a “v-groove” scene.
  • FIG. 14 is an RGB visualization of the simulation of a spectral Gray code pattern (left) and its inverse pattern (right).
  • FIG. 15 is a view for explaining a simulation framework.
  • FIG. 16 is a flowchart for explaining the determination of spectral reflectance values for points in a scene, by using 3D positions of the points and by using normal vectors to the points, in order to develop a local reflectance map.
  • An example embodiment of a multispectral binary coded projection system 100 is shown in FIG. 1 .
  • the multichannel projector 101 has n channels capable of accepting n binary images 102 such as binary images PROJ_IMG 1 , PROJ_IMG 2 , . . . , PROJ_IMG n .
  • the binary images are generated by a control module 103 and transmitted to the multichannel projector 101 .
  • a multispectral camera 105 with m channels is also controlled by the control module 103 to capture an image having multiple channels, such as camera images CAM_IMG 1 , CAM_IMG 2 , . . . , CAM_IMG m shown at reference numeral 106 , with the captured image being captured synchronously with the projected image.
  • each of the channels of the projector 101 is monochromatic or narrow band with a characteristic peak wavelength, e.g., λ 1 , λ 2 , . . . , λ n .
  • the spectral sensitivity of each channel of the multispectral camera 105 responds to at most one wavelength of a channel of the projector.
  • An example is shown in FIG. 2 , where the spectral sensitivities of the camera channels are denoted S 1 (λ), S 2 (λ), . . . , S m (λ). The figure illustrates that it is not strictly necessary for the peak sensitivity of a camera channel to occur exactly at the corresponding projector wavelength.
  • although FIG. 2 shows spectral sensitivities that are narrow banded, the spectral sensitivities are not required to be narrow banded.
  • FIG. 2A shows spectral sensitivities that have broadband sensitivity.
  • the camera sensitivity matrix Σ corresponds to a diagonal matrix.
  • each camera-sensitivity curve may respond to multiple wavelengths of the projector, and the matrix Σ is band diagonal.
  • Tables 1 and 2 show two examples of the matrix Σ. In the first example of the matrix Σ, denoted Σ 1 , the entries of the matrix are shown in the following table:
  • some embodiments include a matrix Σ that is well-conditioned.
  • the wavelengths of the monochromatic channels are chosen to correspond to the wavelengths where the spectrum of the ambient light has low intensities. In some embodiments, some of the wavelengths lie in the NIR (near infra-red) spectrum, where the intensities of the spectrum of the ambient light are low.
  • the projector comprises monochromatic laser sources. In some embodiments, the projector comprises monochromatic LED sources.
  • some channels of the projector are not monochromatic, but are nevertheless narrow banded such that none of the channels overlaps in spectrum with other channels, and the spectral sensitivity of each channel of the multispectral camera responds to at most one channel of the projector.
  • the input binary images 102 depicted in the FIG. 1 embodiment as PROJ_IMG 1 , PROJ_IMG 2 , . . . , PROJ_IMG n , are based on Gray code.
  • other codes are used.
  • the high frequency binary codes disclosed in Gupta, M., et al., Structured light 3D scanning in the presence of global illumination, CVPR 2011, may be used.
  • These binary codes are related to the Gray codes via an XOR operation so that they share similar benefits of the Gray codes, while the relatively high spatial frequency of these codes provides more robustness to the adverse effect of global illumination effects such as interreflections.
  • Other binary codes, including the natural binary codes, may be used in an embodiment where the operating environment and/or scene can be controlled so as to reduce recovery errors of the binary codes.
  • for conventional Gray code, it has been considered that when capturing a scene illuminated with a binary coded projection, the scene is captured by a monochrome (e.g., grayscale) camera. Analysis of the captured images then recovers the Gray code at each scene point, from which the depth map can be recovered by a technique based on triangulation.
  • Each channel gives rise to a monochrome (e.g., grayscale) image similar to the conventional Gray code, but with the m channel multispectral camera of the embodiment described herein, there are m different channels captured as compared to the single channel captured with a monochrome camera.
  • the operating environment is controlled (e.g., low ambient light) and the scene/objects have a known range of spectral reflectance (e.g., a lower bound of the reflectance at wavelengths λ 1 , λ 2 , . . . , λ n can be predetermined).
  • a threshold value t i can be predetermined during calibration such that a binary image BIN_IMG i can be determined by thresholding: For a pixel p,
  • Σ + may be taken as the Moore-Penrose pseudo-inverse of Σ.
  • c(p) is a correction factor that depends on the pixel location p and that accounts for irradiance falloff from the optical axis of the camera. A common approximation for this factor is the “cosine 4th power”. A more precise estimate of this falloff factor can be obtained by camera calibration.
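As a hedged illustration of this radiance-estimation step (a sketch under assumptions, not the patent's implementation; the array shapes, variable names, and the placement of the falloff correction are assumptions), the per-channel radiance images could be recovered from one multispectral capture as follows:

```python
import numpy as np

def estimate_radiance(cam_image, sigma, falloff):
    """Estimate per-projector-channel radiance images from a multispectral capture.

    cam_image: (H, W, m) multispectral camera values
    sigma:     (m, n) camera sensitivity matrix (camera channel i at projector wavelength j)
    falloff:   (H, W) correction factor c(p), e.g. the "cosine 4th power" approximation
    Returns an (H, W, n) stack of estimated radiance images."""
    sigma_pinv = np.linalg.pinv(sigma)                # Moore-Penrose pseudo-inverse of sigma
    corrected = cam_image / falloff[..., np.newaxis]  # undo irradiance falloff from the optical axis
    return corrected @ sigma_pinv.T                   # apply the pseudo-inverse at every pixel

# The inversion is only stable when sigma is well-conditioned; per the text,
# np.linalg.cond(sigma) should be at most about 100, preferably 10 or lower.
```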
  • binarization can be performed by thresholding the radiance images, similar to the previous embodiment, by directly using the camera images:
  • the thresholds t i are determined in turn by a predetermined ratio α i :
  • α i = 0.5.
  • radiance images are estimated by inverting Σ.
  • if Σ is ill-conditioned (e.g., if it has a high condition number), the quantization errors in the camera images (e.g., from 16-bit integer encoding by the A/D converter in the camera image processing) are amplified.
  • Acceptable values for the condition number are 100 and lower, more preferably 10 and lower.
  • a second multispectral binary pattern is projected that is the inverse of the first multispectral binary pattern.
  • an inverse multispectral binary pattern is obtained by interchanging the values 0 and 1, as shown in FIG. 4 .
  • a “null pattern”, i.e., a pattern of zeroes, is projected.
  • the control module 103 calculates this inverse pattern and controls the multichannel projector 101 to project this pattern after projecting the first pattern.
  • the control module 103 also controls the multispectral camera 105 to capture the scene with the inverse pattern after capturing the scene with the first pattern.
  • the resulting multispectral image corresponds to a set of grayscale images which are denoted here as “primed” images CAM_IMG 1 ′, CAM_IMG 2 ′, . . . , CAM_IMG m ′.
  • a binary image BIN_IMG i can be determined by comparing CAM_IMG i with CAM_IMG i ′:
  • binarization is performed by comparing radiance images:
  • the radiance images are estimated from the camera images similar to the procedure described previously.
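For illustration, a minimal sketch of both binarization variants described above (the one-capture thresholding and the two-capture comparison with the inverse pattern); the function names, shapes, and the exact form of the thresholds are assumptions:

```python
import numpy as np

def binarize_by_comparison(radiance, radiance_inverse):
    """Recover one bit plane per projector channel by comparing the radiance images
    captured under a pattern and under its inverse (complementary) pattern.

    radiance, radiance_inverse: (H, W, n) stacks of per-channel radiance images.
    Returns an (H, W, n) array of {0, 1} values."""
    return (radiance > radiance_inverse).astype(np.uint8)

def binarize_by_threshold(radiance, thresholds):
    """Single-capture variant: compare each channel against a per-channel threshold t_i
    determined during calibration (e.g. via a predetermined ratio alpha_i)."""
    return (radiance > np.asarray(thresholds)[np.newaxis, np.newaxis, :]).astype(np.uint8)
```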
  • the spectral power distributions of the projector channels, P 1 (λ), . . . , P n (λ), are not monochrome or narrow band, or otherwise may have significant cross-talk with the spectral sensitivities S 1 (λ), . . . , S m (λ) of the multispectral camera.
  • the converted binary images are used to determine a binary code of length n for each scene point corresponding to each pixel of the camera.
  • the recovered binary code is an n-bit Gray code, and the Gray code is further inverted to obtain a decimal index number that indicates the projector pixel column that is visible to the scene point.
  • the recovered binary code is an n-bit binary code with an associated decimal index number corresponding to a sequential or serial index of the binary code, and the decimal index number, which indicates the projector pixel column that is visible to the scene point, is recovered by an inversion process and/or a table lookup process. The depth of the scene point is then determined by a triangulation-based method.
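As an illustration of the code-inversion step (a minimal sketch; the stacking order of the bit planes is an assumption), an n-bit Gray code recovered at every camera pixel can be converted to the decimal projector-column index as follows:

```python
import numpy as np

def gray_to_column_index(bit_planes):
    """Convert per-pixel Gray codes to projector column indices.

    bit_planes: (n, H, W) array of recovered bits, bit_planes[0] being the most significant.
    Returns an (H, W) integer image of decoded projector pixel columns."""
    n = bit_planes.shape[0]
    gray = np.zeros(bit_planes.shape[1:], dtype=np.int64)
    for i in range(n):                      # pack the bits into one Gray-coded integer per pixel
        gray = (gray << 1) | bit_planes[i].astype(np.int64)
    binary = gray.copy()                    # Gray-to-binary: XOR with progressively larger right shifts
    shift = 1
    while shift < n:
        binary ^= binary >> shift
        shift <<= 1
    return binary
```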
  • the method of triangulation may be implemented based on the geometries and orientations shown in FIG. 5 .
  • a method of triangulation involves a line-plane intersection 141 , wherein the line corresponds to a ray 142 emanating from the camera pixel 143 in question, and the plane is a light plane 145 that corresponds to the determined projector pixel column 146 .
  • the determined intersection gives the 3D position of the scene point or, equivalently, the (projective) depth of the scene point along the camera ray.
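A minimal sketch of that line-plane intersection, assuming the camera ray and the projector light plane have already been expressed in a common calibrated coordinate frame (the function and variable names are assumptions):

```python
import numpy as np

def triangulate_point(cam_center, ray_dir, plane_point, plane_normal):
    """Intersect a camera ray with the light plane of the decoded projector column.

    cam_center, ray_dir:       camera optical center and ray direction for the camera pixel
    plane_point, plane_normal: a point on the light plane and the plane's normal vector
    Returns the 3D scene point, or None if the ray is (nearly) parallel to the plane."""
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:
        return None
    t = np.dot(plane_normal, np.asarray(plane_point) - np.asarray(cam_center)) / denom
    return np.asarray(cam_center) + t * np.asarray(ray_dir)
```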
  • FIG. 6 shows one example of a 3D-reconstruction-error distribution for the two camera sensitivity matrices Σ 1 and Σ 2 given in Tables 1 and 2 above. It shows that high condition numbers result in high reconstruction errors.
  • selection of which binary pattern to project in which projector channel is based on a rule that minimizes error. In some embodiments, the selection of which binary pattern to project in which projector channel depends on a combination of the camera spectral sensitivity and the target material reflectance.
  • the projector channels have narrow band characteristic wavelengths λ 1 , λ 2 , . . . , λ n .
  • denote by ρ(λ) the spectral reflectance of the target material.
  • for each channel, consider the quantity κ i = Σ ii · ρ(λ i ), i.e., the camera sensitivity at λ i weighted by the target reflectance at λ i .
  • These quantities are then sorted into non-increasing order of magnitude, resulting in a permutation of indices σ i , e.g., κ σ1 ≥ κ σ2 ≥ . . . ≥ κ σn .
  • projector channel σ i projects binary pattern PROJ_IMG i .
  • the larger κ i is for a projector channel, the more significant the bit plane in the code that is projected in that channel.
  • in some embodiments, ρ(λ i ) is taken to be 1, and κ i is completely determined by the camera sensitivity at that wavelength. The principle behind these embodiments is that a higher κ i results in lower error (smaller condition number) in that channel, and should be preferred for recovery of high-significance bits.
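A small sketch of this designation rule under the notation above (the helper name and the example numbers are assumptions):

```python
import numpy as np

def assign_bit_planes(camera_sensitivities, reflectances):
    """Designate Gray-code bit planes to projector channels.

    camera_sensitivities: camera sensitivity at each projector wavelength
                          (e.g. the diagonal entries of the sensitivity matrix)
    reflectances:         target spectral reflectance at each projector wavelength
                          (use 1.0 when no reflectance information is available)
    Returns 'order' such that projector channel order[k] carries the k-th most
    significant bit plane."""
    kappa = np.asarray(camera_sensitivities) * np.asarray(reflectances)
    return np.argsort(-kappa)              # channels in non-increasing order of kappa

# Example: the channel with the largest kappa carries the most significant bit plane.
order = assign_bit_planes([0.9, 0.4, 0.7], [1.0, 1.0, 1.0])   # -> array([0, 2, 1])
```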
  • the following Table 3 shows the cumulative distribution function of 3D-reconstruction errors for two cases.
  • the first case uses the above embodiment with the non-decreasing sequence of κ i .
  • the second case uses the reverse ordering, i.e., non-increasing sequence of κ i .
  • the reverse-ordering embodiment provides an advantage of 0.5% for lower errors. To put the results in context, 0.5% of 1 megapixel is 5,000 pixels. Furthermore, this advantage can be achieved by simply using a more optimal choice of projector channels for the patterns, without incurring any cost in terms of extra equipment or acquisition efficiency.
  • the depth sensor is incorporated into a material camera, which in turn is part of a system for classifying and sorting materials, such as in a recycling facility.
  • FIG. 7 depicts an example arrangement.
  • a conveyor belt paradigm 160 provides a dynamic scene environment that requires fast capture and decision.
  • a conveyor belt 161 transports objects 162 a , 162 b , etc. past an inspection station indicated generally at 163 , where the object subjected to inspection is illuminated with multispectral binary coded projection by sensor 165 .
  • Sensor 165 may comprise system 100 shown in FIG. 1 .
  • Sensor 165 obtains a 3D reconstruction of the inspected object, as described above.
  • sensor 165 determines the type of material for the inspected object, such as by distinguishing between plastic and rubber. Material classification may be based in part on spectral reflectances from the object, and it may in part rely on the 3D reconstruction from sensor 165 in order to determine expected spectral reflectances.
  • module 166 for material classification logic activates robotics shown generally at 167 , to sort the inspected objects into respective bins 168 a , 168 b etc.
  • some of the embodiments employ multispectral cameras (relatively broadband) instead of hyperspectral cameras (relatively narrow band). This is advantageous because multispectral imaging does not require high spectral resolution like hyperspectral imaging does and because acquisition time is much shorter, allowing real-time capture of dynamic scenes.
  • Material classification using optical methods has many practical applications, such as food inspection and recycling.
  • One material property to measure is spectral reflectance.
  • some systems and methods extend the standard structured-light technique for three-dimensional measurement to “spectral structured light”, which enables real-time determination of spectral reflectance by capturing very few multispectral images, such as only one multispectral image or only two multispectral images or three or more multispectral images, using a multi-primary projector and a multispectral camera.
  • Imaging spectroscopy such as described by Kim, et al., 2012, combines 3D measurement techniques and hyperspectral imaging, but requires a long capture time, making it unsuitable for real-time classification of materials.
  • structured light techniques are extended in scope and concept to “spectral structured light” that enables real-time determination of spectral reflectance, in some cases by capturing as few as two (2) multispectral images using a multichannel projector and a multispectral camera. Simulation results, using an NVIDIA OptiX framework for a ray tracing simulation of the multispectral images, are also discussed.
  • For the simulation scene, some embodiments used a “V-groove” scene, which is commonly used for testing in structured light experiments. Despite its simplicity, the scene presents difficulties, such as the interreflection effect, and non-uniform illumination due to the relative position of the projector and the V-groove.
  • the spectral reflectance can be recovered from the radiance of the scene point, which in turn can be measured by a multispectral camera.
  • 10-bit Gray code patterns are used as in conventional structured light, but the bit plane images are projected simultaneously using separate channels of the 10-channel projector with monochromatic primary in each channel.
  • only 2 multispectral images are captured by the multispectral camera, corresponding to the 10-bit Gray code and its inverse (complementary) code.
  • in some embodiments, multiple grayscale patterns (i.e., not binary patterns) are used.
  • the multiple grayscale patterns are projected simultaneously by projecting each one of the multiple grayscale patterns in a corresponding one of the multiple channels of the multi-primary projector.
  • only 2 multispectral images are captured by the multispectral camera, corresponding to the multiple grayscale patterns and their inverse (complementary) patterns. For example, if a grayscale pattern is a sinusoidal pattern, then its inverse pattern is the inverted sinusoidal pattern, i.e., with a 180 degree phase difference.
  • Each simulated multispectral image resulted in a stack of 10 spectral radiance images.
  • the totality of 20 radiance images were processed analogously to the standard structured light post-processing to generate a 3D map.
  • the 3D map was further processed to produce a surface normal map after a pre-processing (smoothing) step.
  • the radiance images, 3D map, and surface normal map were then used to generate a stack of 10 reflectance maps.
  • a RANSAC (RANdom SAmple Consensus) analysis was then applied to the reflectance maps to estimate the spectral reflectances.
  • the simulation obtained a spectral RMS error of 0.97% (less than 1%). In many applications, this is considered a good estimate of spectral reflectance.
  • the tests validate the utility and versatility of a spectral structured light imaging system for fast (2-shot) determination of spectral reflectance of materials.
  • the flexibility and extensibility of the OptiX ray tracing framework allows fast implementation of simulation of arbitrary projector-camera systems and scenes beyond conventional RGB colors.
  • Other systems and methods may include simulations of more complex scenes and non-Lambertian materials and include determinations of other material properties, such as BRDF.
  • a multichannel projector 181 with N channels and a multispectral camera 182 having M channels, with M being at least N channels, are provided and geometrically calibrated in a triangulation configuration, as shown in FIG. 8 .
  • for example, N = 10.
  • P is a typical scene point and n P is the surface normal vector at P. If the 3D position of P and the surface normal vector at P are known, and the projector-camera system is geometrically calibrated, then d P , the distance between the scene point P and the optical center of the projector, can be determined. Similarly, the angles indicated in FIG. 8 , including θ P and φ P , can also be determined, such as by geometric calibration of the projector-camera system as mentioned in the foregoing paragraph.
  • v i is the projector radiance (predetermined in a projector calibration, for example) in the i-th projector channel due to a projector pixel Q affecting the scene point P
  • L i is the reflected radiance at P at wavelength λ i as recorded by the multispectral camera
  • Equation (9) may take on an additional multiplicative constant on the right hand side. With proper choice of units for radiance and distance, this constant may be reduced to unity. More generally, the right hand side of Equation (9) may take the form
  • R 0 ( θ P , φ P , λ i ) ≈ R( θ P , φ P , λ i ), so that the above equation takes the following form, provided that the i-th channel of projector pixel Q is turned on:
  • R 0 ( θ P , φ P , λ i ) can be considered an approximation of the spectral reflectance ρ(λ i ) of the Lambertian surface at wavelength λ i .
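As a hedged worked example: if Equation (9) has the standard Lambertian form L i = v i · ρ(λ i ) · cos θ P / d P ² (an assumption consistent with the quantities defined above, not a quotation of the patent's equation), then the local reflectance at an on-pixel can be estimated by solving for ρ:

```python
def local_reflectance(L_i, v_i, d_P, cos_theta_P, eps=1e-9):
    """Estimate R0(theta_P, phi_P, lambda_i), an approximation of rho(lambda_i),
    assuming the Lambertian relation L_i = v_i * rho * cos(theta_P) / d_P**2.

    L_i:         reflected radiance recorded by the camera at lambda_i
    v_i:         projector radiance of the pixel Q illuminating the scene point
    d_P:         distance from the projector's optical center to the scene point
    cos_theta_P: cosine of the angle between the surface normal and the direction to the projector"""
    return L_i * d_P ** 2 / (v_i * cos_theta_P + eps)
```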
  • the 3D position of P and the surface normal vector at P can be determined using a method of triangulation as shown in the flow diagram depicted in FIG. 9 .
  • one or more patterns composed of binary patterns in each projector channel are projected onto an object in the scene.
  • the binary patterns may be based on the Gray code or any acceptable binary code as described above.
  • Multispectral images are captured in step S 901 .
  • the 3D position of P equivalently the depth of P, can be determined by a method of triangulation, in step S 902 .
  • the one or more patterns may include two patterns such that for each projector channel, the composing binary pattern in one pattern is complementary, or inverse, to the composing binary pattern in the other pattern, in which case there is a corresponding multispectral capture of the complementary/inverse projections.
  • this process of determining 3D position is applied to a collection of scene points.
  • the collection of determined 3D positions is referred to as a 3D map, as also shown in step S 902 .
  • the collection of scene points may be determined by a preprocessing step applied to the multispectral images captured by the multispectral camera, such as a step of segmentation.
  • the step of segmentation may determine scene points that belong to a same material.
  • a surface normal map is determined at step S 903 .
  • some embodiments determine the surface normal map. In some embodiments, this can be determined from the already determined 3D map. Because taking derivatives may amplify inherent noise in the 3D map, a spatial smoothing operation is applied to the determined 3D positions in a moving window of size W ⁇ W. For example, a moving averaging operation can be applied to a 3D map in an 11 ⁇ 11 window. After spatial smoothing, a tangent plane is fitted to the 3D positions in the W ⁇ W window, for example in the least-squares sense. The normal vector at a scene point is then taken to be a unit vector perpendicular to the fitted tangent plane.
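A simplified sketch of this step (an assumption-laden variant that uses local gradients of the smoothed 3D map in place of an explicit least-squares plane fit; the window size and the use of SciPy are assumptions):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def surface_normal_map(points, window=11):
    """Estimate unit surface normals from a 3D map.

    points: (H, W, 3) array of 3D positions (the 3D map).
    Returns an (H, W, 3) array of unit normal vectors (sign not disambiguated)."""
    # Spatial smoothing: moving average over a window x window neighborhood.
    smoothed = np.stack([uniform_filter(points[..., k], size=window) for k in range(3)], axis=-1)
    # Tangent directions approximated by image-space derivatives of the smoothed map.
    du = np.gradient(smoothed, axis=1)
    dv = np.gradient(smoothed, axis=0)
    normals = np.cross(du, dv)
    norms = np.linalg.norm(normals, axis=-1, keepdims=True)
    return normals / np.clip(norms, 1e-12, None)
```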
  • the surface normal map is not determined based on the 3D map. Instead, a photometric stereo method, perhaps using an entirely independent illumination and/or camera system, is used to determine the surface normal vectors directly, wherein one or more projectors or illumination sources may be used.
  • a local reflectance map for each wavelength λ i can be calculated, as shown at step S 904 .
  • the local reflectance map for each wavelength λ i is calculated based on the 3D position and surface normal vector of a scene point, as well as the values of the one or more captured images at a pixel corresponding to the scene point, as described in one of the equations (9) or (10) above.
  • FIG. 10 shows an example of a local reflectance map for a wavelength.
  • a scene point that is not illuminated (i.e., the corresponding projector pixel is not turned on in the channel corresponding to the wavelength) does not contribute a value to the local reflectance map, whereas a scene point that is illuminated (i.e., the corresponding projector pixel is turned on in the channel corresponding to the wavelength) appears as a lit pixel with a local reflectance value calculated from Equation (10).
  • a local reflectance map R 0 ( θ P , φ P , λ i ) is theoretically a constant function in P, i.e., independent of θ P and φ P . Due to illumination conditions, interreflections, or other global illumination effects, along with errors associated with finite projector and camera resolution, the local reflectance map may not be constant even for a Lambertian surface. For example, in the illustrative example shown in FIG. 10 , the local reflectance map is seen to take values ranging from 0.25 to 0.30.
  • a reflectance property is calculated from local reflectance values in the local reflectance map, at step S 905 , by fitting them to a global reflectance model.
  • a global reflectance model is a constant global reflectance model.
  • Another such model is an analytical BRDF (bidirectional reflectance distribution function) model; more precisely, this is a family of models, since there is an abundance of analytic BRDF models.
  • a reflectance property is the spectral reflectance ρ(λ i ) at wavelength λ i .
  • a RANSAC (RANdom SAmple Consensus) algorithm may be applied to a local reflectance map to fit a constant global reflectance model.
  • the RANSAC algorithm attempts to find a subset of pixels in the local reflectance map that are inliers.
  • fitting corresponds to an averaging of the local reflectance values for the inliers, resulting in an estimate ρ̂ i of the true reflectance ρ(λ i ), and the best fit is achieved by minimizing the variance or standard deviation.
  • the RANSAC algorithm requires as input a threshold number for the least number of inliers required. In some embodiments, this threshold number of pixels is 10% of the valid pixels in the local reflectance map.
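A minimal RANSAC-style sketch of fitting the constant global reflectance model, using the 10%-of-valid-pixels inlier threshold mentioned above (the inlier tolerance, iteration count, and function name are assumptions):

```python
import numpy as np

def ransac_constant_reflectance(local_map, valid_mask, tol=0.02, min_inlier_frac=0.10,
                                iters=200, seed=0):
    """Fit a constant reflectance value to a local reflectance map for one wavelength.

    local_map:  (H, W) local reflectance values
    valid_mask: (H, W) boolean mask of lit (valid) pixels
    Returns the estimated reflectance (mean over the best inlier set)."""
    rng = np.random.default_rng(seed)
    values = local_map[valid_mask]
    min_inliers = int(min_inlier_frac * values.size)
    best = None
    for _ in range(iters):
        candidate = rng.choice(values)                         # hypothesize a constant value
        inliers = values[np.abs(values - candidate) < tol]
        if inliers.size >= min_inliers and (best is None or inliers.std() < best.std()):
            best = inliers                                     # best fit minimizes the spread of inliers
    return float(best.mean()) if best is not None else float(values.mean())
```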
  • in some embodiments, the global reflectance model is a BRDF (bidirectional reflectance distribution function) model. Such a model may take the form f( θ P , φ P , λ i , α, σ, . . . ), where f is a function and α, σ, . . . are parameters of the model; for example, α is an albedo (reflectance) at λ i and σ is a surface roughness parameter.
  • RANSAC is applied to obtain a best fit of the BRDF model to the local reflectance map R( θ P , φ P , λ i ), resulting in estimates α̂, σ̂, . . . of the BRDF model parameters.
  • These parameters can then be used as material property parameters.
  • parameters such as albedo and root-mean-square slope of the surface microfacets can be used as material property parameters.
  • other examples of a global reflectance model for a non-Lambertian surface include a BTF (bidirectional texture function) model or a BSSRDF (bidirectional surface scattering reflectance distribution function) model.
  • these systems and methods enable real-time determination of material properties by capturing very few (1 or 2) multispectral images.
  • With specific reference to the aforementioned feature of calculating a local reflectance value for each of the scene points, as mentioned in connection with step S 904 , which is based on the 3D position and surface normal vector of the scene point as well as the values of the one or more captured images at a pixel corresponding to the scene point, the following may be stated.
  • one or more patterns that include binary patterns in each projector channel are used.
  • the one or more patterns consist of two patterns such that, for each projector channel, the included binary pattern in one pattern is complementary to the included binary pattern in the other pattern.
  • the multispectral camera records two radiance images I 0 and I 1 that are complementary, or inverse to each other.
  • although the projected patterns are binary, the recorded images would in general have more than two values, e.g., they are grayscale images instead of binary images.
  • FIG. 11 shows an example.
  • a binary mask M is shown at 171 .
  • This mask M represents pixels that are determined to correspond to projector pixels that are turned on in that channel.
  • the binary mask M is determined by performing the Boolean comparison operation I 0 >I 1 on the two radiance images I 0 (shown at 172 ) and I 1 (shown at 173 ).
  • α is typically approximately 0.5, although the actual value depends on the images I 0 and I 1 , in the sense that α is calculated from M which in turn is calculated from images I 0 and I 1 .
  • a correction image I 2 (shown at 175 ) is then determined.
  • the correction image I 2 appears relatively dark and corresponds to “noise” due to interreflections.
  • the correction image I 2 is determined using the following equation:
  • I 2 = I 1 ⊙ M · α/(1 − α)   Equation (11)
  • Pixel-wise multiplication by the mask M may ensure that correction is applied only to the on-pixels in I 0 .
  • the local reflectance calculation may be done only for projector pixels that are turned on for that channel, so that correction to the on-pixels is the only concern.
  • the correction may be needed to compensate for global illumination effects such as interreflections.
  • the radiance values L i at the on-pixels in I 3 are then used in the local reflectance calculations.
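A hedged sketch of the mask and correction steps around FIG. 11 and Equation (11); it assumes α is the fraction of on-pixels in M and that the corrected image I3 is obtained by subtracting I2 from I0 (both assumptions), so it should be read as illustrative only:

```python
import numpy as np

def correct_interreflections(I0, I1):
    """Per-channel correction of the global illumination component (cf. Equation (11)).

    I0, I1: radiance images captured under a binary pattern and its inverse pattern.
    Returns (M, I2, I3): on-pixel mask, correction image, and corrected image."""
    M = (I0 > I1).astype(I0.dtype)                    # projector pixels that are on in this channel
    alpha = float(np.clip(M.mean(), 1e-6, 1 - 1e-6))  # assumed: fraction of on-pixels, typically ~0.5
    I2 = I1 * M * alpha / (1.0 - alpha)               # Equation (11): estimated interreflection "noise"
    I3 = I0 - I2                                      # assumed definition of the corrected image
    return M, I2, I3
```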
  • illustrative radiance profiles of I 0 , I 2 , and I 3 along a horizontal row of pixels are shown in FIG. 12 .
  • the thin continuous line represents the profile of I 0
  • the thick dotted line and thick continuous line represent respectively the profiles of I 2 and I 3 . Comparing the profile of I 0 at on-pixels and off-pixels, it is clear that the correction of the global component of illumination is effective.
  • FIGS. 13 , 14 and 15 describe ray-tracing simulations that validate the utility and versatility of a spectral structured light imaging system for fast (2-shot) determination of spectral reflectance of materials, or more generally to the determination of a material property.
  • One common material property to measure is spectral reflectance.
  • This figure depicts a 3D Lambertian surface, shown in cross-section in the X-Z plane.
  • the equation of the Lambertian surface is
  • The V-groove scene presents difficulties such as the interreflection effect, and non-uniform illumination due to the relative position of the projector and the V-groove.
  • the multi-primary projector has 10 monochromatic channels with wavelengths λ 1 , λ 2 , . . . , λ 10 . If the radiance values of a scene point P at these wavelengths are L 1 , L 2 , . . . , L 10 , as recorded by the multispectral camera, then the spectral reflectance factors at these wavelengths can be recovered by the following equation:
  • FIG. 15 is a view for explaining a simulation framework.
  • the NVIDIA OptiX framework was used for ray tracing simulation of spectral radiance images because of its flexibility to program non-standard illumination source and arbitrary spectral data type.
  • the flexibility and extensibility of the OptiX ray tracing framework allows fast implementation of simulation of arbitrary projector-camera systems and scenes beyond conventional RGB colors.
  • a new data type is defined to support 10-channel spectral data.
  • the simulation program starts by initializing an instance ‘prd’ of a data structure ‘PerRayData’, and generating an initial camera ray.
  • the data structure ‘PerRayData’ is a custom data structure containing fields including ‘radiance’, ‘projector’, and ‘attenuation’, of a custom data type ‘spectrum’.
  • the custom data type ‘spectrum’ is defined as a structure containing a vector of ten components, each represented in an IEEE single precision floating point number.
  • a program variable ‘depth’, initially set to one, is compared to a predetermined constant ‘MAX_DEPTH’ which determines a maximum number of levels of recursion in the ray tracing simulation.
  • the OptiX built-in function rtTrace interacts with two custom functions: intersect_vgroove and closest_hit.
  • intersect_vgroove calculates the intersection point ‘hit_point’ of the current ray and the V-groove.
  • the intersection point ‘hit_point’ is then passed into the closest_hit function, which generates a new random direction, calculates projector radiance ‘prd.projector’ at ‘hit_point’, and updates attenuation factor ‘prd.attenuation’, the latter two being used in the subsequent updating of the current radiance of the ray as described above.
  • 10-bit Gray code patterns are used as in conventional structured light, but the bit plane images are projected simultaneously using separate channels of the multi-primary projector. Only 2 multispectral images are captured, corresponding to the 10-bit Gray code and its inverse (complementary) code.
  • a grayscale illustration of an RGB visualization of OptiX simulation of spectral Gray code pattern (left) and its inverse pattern (right) are shown in FIG. 14 .
  • An example pixel column is shown on the spectral Gray code pattern (left) that is associated with an example code 1010001001, resulting in non-zero reflection spectrum in wavelengths λ 1 , λ 3 , λ 7 and λ 10 .
  • a corresponding pixel column is shown on the inverse pattern (right) that is associated with inverse code 0101110110, resulting in non-zero reflection spectrum in wavelengths λ 2 , λ 4 , λ 5 , λ 6 , λ 8 and λ 9 .
  • A flow diagram for one representative reconstruction algorithm is shown in FIG. 16 .
  • multispectral images are captured in step S 1501 (here a multispectral image pair is captured), a 3D map is reconstructed in step S 1502 , such as by triangulation, a surface normal map is constructed in step S 1503 , a reflectance map is constructed in step S 1504 , and spectral reflectances are estimated in step S 1505 , such as by RANSAC analysis.
  • each of the 2 spectral Gray code patterns results in a stack of 10 spectral radiance images.
  • the totality of 20 radiance images is processed analogously to structured light post-processing described above to generate a 3D map.
  • the 3D map is further processed to produce a surface normal map after a pre-processing (smoothing) step.
  • the spectral RMS (root-mean-square) error can be derived from the individual spectral errors shown in Table 4. In this simulation, the spectral RMS error is 0.97% (less than 1%). In many applications, this is considered a good estimate of spectral reflectance.
  • example embodiments may include a computer processor such as a single core or multi-core central processing unit (CPU) or micro-processing unit (MPU), which is constructed to realize the functionality described above.
  • the computer processor might be incorporated in a stand-alone apparatus or in a multi-component apparatus, or might comprise multiple computer processors which are constructed to work together to realize such functionality.
  • the computer processor or processors execute a computer-executable program (sometimes referred to as computer-executable instructions or computer-executable code) to perform some or all of the above-described functions.
  • the computer-executable program may be pre-stored in the computer processor(s), or the computer processor(s) may be functionally connected for access to a non-transitory computer-readable storage medium on which the computer-executable program or program steps are stored.
  • access to the non-transitory computer-readable storage medium may be a local access such as by access via a local memory bus structure, or may be a remote access such as by access via a wired or wireless network or Internet.
  • the computer processor(s) may thereafter be operated to execute the computer-executable program or program steps to perform functions of the above-described embodiments.
  • example embodiments may include methods in which the functionality described above is performed by a computer processor such as a single core or multi-core central processing unit (CPU) or micro-processing unit (MPU).
  • a computer processor such as a single core or multi-core central processing unit (CPU) or micro-processing unit (MPU).
  • the computer processor might be incorporated in a stand-alone apparatus or in a multi-component apparatus, or might comprise multiple computer processors which work together to perform such functionality.
  • the computer processor or processors execute a computer-executable program (sometimes referred to as computer-executable instructions or computer-executable code) to perform some or all of the above-described functions.
  • the computer-executable program may be pre-stored in the computer processor(s), or the computer processor(s) may be functionally connected for access to a non-transitory computer-readable storage medium on which the computer-executable program or program steps are stored. Access to the non-transitory computer-readable storage medium may form part of the method of the embodiment. For these purposes, access to the non-transitory computer-readable storage medium may be a local access such as by access via a local memory bus structure, or may be a remote access such as by access via a wired or wireless network or Internet.
  • the computer processor(s) is/are thereafter operated to execute the computer-executable program or program steps to perform functions of the above-described embodiments.
  • the non-transitory computer-readable storage medium on which a computer-executable program or program steps are stored may be any of a wide variety of tangible storage devices which are constructed to retrievably store data, including, for example, any of a flexible disk (floppy disk), a hard disk, an optical disk, a magneto-optical disk, a compact disc (CD), a digital versatile disc (DVD), micro-drive, a read only memory (ROM), random access memory (RAM), erasable programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), dynamic random access memory (DRAM), video RAM (VRAM), a magnetic tape or card, optical card, nanosystem, molecular memory integrated circuit, redundant array of independent disks (RAID), a nonvolatile memory card, a flash memory device, a storage of distributed computing systems and the like.
  • the storage medium may be a function expansion unit removably inserted in and/or remotely accessed by the apparatus or system for use with the computer processor(s).

Abstract

Illumination of an object with spectral structured light, and spectral measurement of light reflected therefrom, for purposes which include derivation of a three-dimensional (3D) measurement of the object, such as depth and/or contours of the object, and/or for purposes which include measurement of a material property of the object, such as by differentiating between different types of materials.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Patent Application No. 61/677,382, filed Jul. 30, 2012, and U.S. Provisional Patent Application No. 61/759,083, filed Jan. 31, 2013, the contents of both of which are hereby incorporated by reference as if fully stated herein.
  • FIELD
  • The present disclosure relates to illumination of an object with structured light, and measurement of light reflected therefrom, for purposes which include measurement of a material property of the object, such as by differentiating between different types of materials.
  • BACKGROUND
  • In the field of 3D scanning using structured light, it has been considered to use binary coded patterns, which are black and white patterns. See, for example, Gupta et al., 2011. In addition, it has also been considered to use “rainbow projections”, which are color coded patterns. See, for example, U.S. Pat. No. 7,349,104 to Geng. Manabe et al. combine 3 bit planes of Gray code to form a color pattern using an RGB projector.
  • REFERENCES
    • Gupta, M., Agrawal, A., Veeraraghavan, A. and Narasimhan, S., Structured light 3D scanning in the presence of global illumination, CVPR 2011.
    • U.S. Pat. No. 7,349,104 Geng, System and a method for three-dimensional imaging systems
    • Manabe, Y., Parkkinen, J., Jaaskelainen, T. and Chihara, K., Three Dimensional Measurement using Color Structured Patterns and Imaging Spectrograph, ICPR 2002.
    • Kim, Harvey, Kittle, Rushmeier, Dorsey, Prum and Brady, 3D Imaging Spectroscopy for Measuring Hyperspectral Patterns on Solid Objects, SIGGRAPH 2012.
    SUMMARY
  • Binary coded patterns are robust to depth discontinuities, but multiple patterns are required. For example, to project Gray codes in 10 bits (10-bit depth resolution), a minimum of 10 projected patterns is required. This makes high-speed 3D scanning of dynamic scenes difficult. Combining 3 bit planes into one color pattern using the 3 channels of an RGB projector can reduce the number of projected patterns, but due to the crosstalk between the channels of a broadband RGB projector, a hyperspectral camera with narrow band sensitivity must be used in some systems. Hyperspectral imaging requires a long capture time for each scene and is not suitable for real-time capture of dynamic scenes or when fast sorting is desired.
  • Conventional spectroscopy is capable only of single point measurements. Imaging spectroscopy, such as described by Kim, et al., 2012, combines 3D measurement techniques and hyperspectral imaging, but requires a long capture time, making it unsuitable for real-time classification of materials.
  • Other color coded patterns, such as the rainbow projections, rely on spatial coding. They assume that depth does not change abruptly across the object. In other words, they are not robust to depth discontinuities.
  • In a first aspect described herein, an object is illuminated with structured light that is also spectrally structured, and light reflected therefrom is measured spectrally, for purposes which include derivation of a three-dimensional (3D) measurement of the object, such as depth and/or contours of the object.
  • In a second aspect described herein, an object is illuminated with structured light that is also spectrally structured, and light reflected therefrom is measured spectrally, for purposes which include measurement of a material property of the object, such as by differentiating between different types of materials. As one simple example, there is a differentiation between an object whose material is plastic and an object whose material is rubber.
  • In a third aspect described herein, an object is illuminated with structured light that is also spectrally structured, and light reflected therefrom is measured spectrally, for purposes which include derivation of a three-dimensional (3D) measurement of the object, such as depth and/or contours of the object. The 3D measurement is used to calculate a reflectance property, such as by using calculations of a surface normal vector of the object from the 3D measurement, and estimating the reflectance property from the 3D measurement, surface normal vector and spectral measurements of reflected light, for purposes which include measurement of a material property of the object, such as by differentiating between different types of materials.
  • As used in this description, the terms “multichannel projector” and “multi-primary projector” are generally equivalent to each other and may be used interchangeably. Likewise, as used in this description, the terms “spectral structured light” and “multispectral coded projection” should generally be considered equivalent to each other and may be used interchangeably, with the term “multispectral binary coded projection” being a special case.
  • In a somewhat more detailed summary of at least the first aspect, and at least partly of the second and third aspects, in some embodiments, a method of measuring depth of a scene using a multichannel projector with relatively-narrow spectrally-banded channels at predetermined wavelengths and a multispectral camera with relatively-broad spectrally-banded channels, wherein the spectral sensitivities of the camera channels at the wavelengths of the projector channels form a well-conditioned matrix, comprises projecting a multispectral binary pattern onto the scene, wherein the multispectral binary pattern is composed of binary images, one for each channel of the projector; capturing one or more images of the scene with the multispectral camera; and recovering the depth of the scene by analyzing the captured one or more images.
  • In some embodiments of the method, the number of projector channels is more than 3 and the number of camera channels is at least the number of projector channels. In some embodiments, the number of projector channels is 10, each channel of the projector is monochromatic, each channel of the projector is associated with a monochromatic laser source, or each channel of the projector is associated with a monochromatic LED light source.
  • Also, in some embodiments of the method, the spectral sensitivity of each channel of the multispectral camera responds to at most one wavelength of a channel of the projector; a binary image is designated for projection in a projector channel by a rule that minimizes error; the binary images for the channels of the projector are ranked according to a significance value, and a binary image with a higher significance value is designated for a projector channel with a wavelength at which camera spectral sensitivity is higher; or information about the spectral reflectance in the scene is available, and a binary image with a higher significance value is designated for a projector channel with a wavelength at which a combination of camera spectral sensitivity and spectral reflectance is higher.
  • Furthermore, in some embodiments, analyzing the captured image includes converting the multispectral image into multiple binary images, one for each projector channel; the conversion is performed by comparing each channel of an image based on the multispectral image with a channel-dependent threshold value; the channel-dependent threshold value is also pixel-location dependent and is derived from a second multispectral image captured with an inverse multispectral binary pattern; or the image based on the multispectral image is a multispectral radiance image.
  • Additionally, in some embodiments, the wavelengths of the monochromatic channels are chosen to correspond to the wavelengths where the spectrum of the ambient light has low intensities, recovering the depth of the scene is performed by triangulation, or the triangulation includes determining a plane corresponding to a column of projector pixels and a line corresponding to a camera pixel and includes calculation of an intersection between the plane and the line.
  • The above-described systems and methods enjoy the same robustness to depth discontinuities as binary coded patterns, such as Gray code, while avoiding an excessive number of projected patterns. Under the right circumstances, one-shot depth capture is theoretically possible, for example via thresholding in each channel of the multispectral camera.
  • In addition, some of the embodiments employ multispectral cameras (relatively broadband) instead of hyperspectral cameras (relatively narrow band). This is advantageous because multispectral imaging does not require high spectral resolution like hyperspectral imaging does and because acquisition time is much shorter, allowing real-time capture of dynamic scenes.
  • In a somewhat more detailed summary of at least the second and third aspects, and at least partly of the first aspect, structured light techniques are extended in scope and concept to “spectral structured light” that enables real-time determination of spectral reflectance, in some cases by capturing as few as two (2) multispectral images using a multi-primary projector and a multispectral camera. If the 3D position and surface normal vector of a scene point are known or can be derived, the spectral reflectance can be recovered from the radiance of the scene point, which in turn can be measured by a multispectral camera. 3D position of a scene point can ordinarily be obtained readily, such as through triangulation techniques. The surface normal vector of a scene point can ordinarily be obtained by applying techniques of the first aspect described above. In some embodiments of the spectral structured light system, 10-bit Gray code patterns are used as in conventional structured light, but the bit plane images are projected simultaneously using separate channels of the 10-channel projector with monochromatic primary in each channel. In some embodiments, only 2 multispectral images are captured by the multispectral camera, corresponding to the 10-bit Gray code and its inverse (complementary) code.
  • In some embodiments, this process of determining 3D position is applied to a collection of scene points. The collection of determined 3D positions is referred to as a 3D map. The collection of scene points may be determined by a preprocessing step applied to the multispectral images captured by the multispectral camera, such as a step of segmentation. For example, the step of segmentation may determine scene points that belong to a same material.
  • In some embodiments, the surface normal map is not determined based on the 3D map. Instead, a photometric stereo method is used to determine the surface normal vectors directly, wherein one or more projectors or illumination sources may be used.
  • In some embodiments, a method of estimating a material property of a scene by projecting one or more patterns onto the scene using a projector and capturing one or more images of the scene using a camera comprises determining 3D positions of a collection of scene points from the one or more captured images, determining surface normal vectors of the scene points in the collection of scene points from the one or more captured images, calculating a local reflectance value for each of the scene points in the collection of scene points, and fitting a global reflectance model to the local reflectance values so as to obtain an estimate of a material property parameter.
  • In some embodiments, the projector is a multi-primary projector with multiple channels, the camera is a multispectral camera with at least as many channels as the number of projector channels, and the one or more patterns are patterns that are composed of binary patterns in each projector channel. Also, in some embodiments, the binary patterns are based on the Gray code; the one or more patterns consist of two patterns such that for each projector channel, the composing binary pattern in one pattern is complementary to the composing binary pattern in the other pattern; or determining the 3D position of a scene point from the collection of scene points includes determining a projector-pixel coordinate and a camera-pixel coordinate corresponding to the scene point, followed by a triangulation calculation.
  • In some embodiments, determining the projector pixel coordinate includes recovering a binary representation of a code with each binary digit corresponding to a projector channel, recovering a binary representation of a code comprises comparing the values of the complementary binary patterns at the scene point, determining surface normal vectors comprises applying a spatial smoothing operation to the determined 3D positions in a moving window followed by fitting a tangent plane in the moving window, or determining surface normal vectors is based on a photometric stereo algorithm.
  • In some embodiments, calculating a local reflectance value for each of the scene points is based on the 3D position and surface normal vector of the scene point, as well as the values of the one or more captured images at a pixel corresponding to the scene point; the global reflectance model to be fitted is a constant reflectance value to be determined; or the global reflectance model to be fitted is a BRDF model (“bidirectional reflectance distribution function”).
  • Contrary to conventional spectroscopy or imaging spectroscopy, these systems and methods enable real-time determination of material properties by capturing very few multispectral images, such as only one multispectral image or only two multispectral images or three or more multispectral images.
  • Some embodiments may be implemented as a method or methods according to any of the disclosure herein. Some embodiments may be implemented as an apparatus or apparatuses according to any of the disclosure herein. Representative embodiments of such apparatus may be implemented as one or more processors constructed to execute stored process steps together with memory which stores the process steps described herein for execution by the processor(s). Other representative embodiments of such apparatus may be implemented as units constructed to execute processes described herein, with such units, for example, being implemented by computer hardware in combination with software which when executed by the computer hardware causes the computer hardware to execute such processes. Some further embodiments may be implemented as non-transitory computer-readable storage media which retrievably stores computer-executable process steps which when executed by a computer cause the computer to execute such process steps.
  • This brief summary has been provided so that the nature of this disclosure may be understood quickly. A more complete understanding can be obtained by reference to the following detailed description and to the attached drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a view showing a multichannel projector for projecting spectral structured light on an object and a multispectral camera for multispectral capture of light reflected therefrom.
  • FIGS. 2 and 2A are views showing the relation between relative sensitivity of the multispectral camera as compared to the wavelengths of light projected by the multichannel projector.
  • FIG. 3 is an example of input binary images based on Gray code, and shows ten (10) binary images.
  • FIG. 4 shows inverse binary patterns to those of FIG. 3.
  • FIG. 5 is a view for explaining 3D triangulation for determining the 3D position of a scene point.
  • FIG. 6 shows an example of a 3D-reconstruction-error distribution for two different camera sensitivity matrices, one with a relatively low condition number and the other with a relatively high condition number.
  • FIG. 7 shows an example embodiment of an automated sorting system such as might be used in a recycling facility, which combines 3D depth reconstruction and material classification based on projection of spectral structured light, together with material classification logic for automated sorting of objects based on material and/or based on shape.
  • FIG. 8 is a view for explaining the 3D position of a point on the surface of an object in a scene, and for explaining the surface normal to the point.
  • FIG. 9 is a flowchart for explaining the determination of spectral reflectance values for points in a scene, by using 3D positions of the points and by using normal vectors to the points, in order to develop a local reflectance map.
  • FIG. 10 is an example of a local reflectance map.
  • FIG. 11 is a view for explaining one example of calculating a local reflectance value, by using two or more patterns that include binary patterns in each projector channel.
  • FIG. 12 shows the radiance profiles of I0, I2, and I3 of FIG. 11, along a horizontal row of pixels.
  • FIG. 13 is a view for explaining a geometry of a ray for a “v-groove” scene.
  • FIG. 14 is an RGB visualization of the simulation of a spectral Gray code pattern (left) and its inverse pattern (right).
  • FIG. 15 is a view for explaining a simulation framework.
  • FIG. 16 is a flowchart for explaining the determination of spectral reflectance values for points in a scene, by using 3D positions of the points and by using normal vectors to the points, in order to develop a local reflectance map.
  • DETAILED DESCRIPTION
  • An example embodiment of a multispectral binary coded projection system 100 is shown in FIG. 1.
  • The multichannel projector 101 has n channels capable of accepting n binary images 102 such as binary images PROJ_IMG1, PROJ_IMG2, . . . , PROJ_IMGn. The binary images are generated by a control module 103 and transmitted to the multichannel projector 101. At the same time, a multispectral camera 105 with m channels is also controlled by the control module 103 to capture an image having multiple channels, such as camera images CAM_IMG1, CAM_IMG2, . . . , CAM_IMGm shown at reference numeral 106, with the captured image being captured synchronously with the projected image. In one embodiment, n=m=10, and the pixel width of both the projector and the camera is at least 1024 (=2¹⁰).
  • In some embodiments, each of the channels of the projector 101 is monochromatic or narrow band with a characteristic peak wavelength, e.g., λ1, λ2, . . . , λn. In an example embodiment, the spectral sensitivity of each channel of the multispectral camera 105 responds to at most one wavelength of a channel of the projector. An example is shown in FIG. 2, where the spectral sensitivities of the camera channels are denoted S1(λ), S2(λ), . . . , Sm(λ). The figure illustrates that it is not strictly necessary for peak sensitivity of a camera channel to match exactly at the corresponding projector wavelength. However, if the peak sensitivity does occur at the projector wavelength, the corresponding entry σij of the camera sensitivity matrix (defined below) tends to be larger, resulting in a camera sensitivity matrix that is more well-conditioned, i.e., a camera sensitivity matrix that has a smaller condition number as described below. In addition, although FIG. 2 shows spectral sensitivities that are narrow banded, the spectral sensitivities are not required to be narrow banded. FIG. 2A shows spectral sensitivities that have broadband sensitivity.
  • In general, a camera sensitivity matrix Σ=(σij) i=1, . . . , m; j=1, . . . , n is defined as follows:

  • σij = Si(λj)  Equation (1)
  • Thus, with the camera sensitivities depicted in FIG. 2, the camera sensitivity matrix Σ corresponds to a diagonal matrix.
  • In some embodiments, each camera-sensitivity curve may respond to multiple wavelengths of the projector, and the matrix Σ is band diagonal. The following Tables 1 and 2 show two examples of the matrix Σ. In the first example of the matrix Σ, denoted Σ1, the entries of the matrix are shown in the following table:
  • TABLE 1
    Camera sensitivity matrix Σ1; relatively low condition number
    0.8747 0.1251 0.0001 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000
    0.0569 0.8661 0.0770 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000
    0.0001 0.1200 0.8691 0.0109 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000
    0.0000 0.0000 0.0439 0.8796 0.0760 0.0005 0.0000 0.0000 0.0000 0.0000
    0.0000 0.0000 0.0000 0.1709 0.7053 0.1238 0.0000 0.0000 0.0000 0.0000
    0.0000 0.0000 0.0000 0.0009 0.1715 0.8109 0.0165 0.0001 0.0000 0.0000
    0.0000 0.0000 0.0000 0.0000 0.0000 0.0155 0.7445 0.2364 0.0036 0.0000
    0.0000 0.0000 0.0000 0.0000 0.0000 0.0001 0.2980 0.6065 0.0952 0.0001
    0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0046 0.1521 0.7856 0.0576
    0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0006 0.1096 0.8898
  • Notice that there is some crosstalk among the camera channels.
  • In another example of the matrix Σ, denoted Σ2, the entries of the matrix are shown in the following table:
  • TABLE 2
    Camera sensitivity matrix Σ2; relatively high condition number
    0.4094 0.3437 0.1860 0.0463 0.0116 0.0028 0.0001 0.0000 0.0000 0.0000
    0.2546 0.3252 0.2616 0.1072 0.0382 0.0122 0.0009 0.0002 0.0000 0.0000
    0.1247 0.2425 0.2898 0.1954 0.0985 0.0422 0.0052 0.0014 0.0002 0.0000
    0.0295 0.1005 0.2035 0.2666 0.2138 0.1359 0.0335 0.0131 0.0032 0.0003
    0.0067 0.0347 0.1045 0.2252 0.2558 0.2188 0.0912 0.0458 0.0151 0.0022
    0.0011 0.0085 0.0380 0.1347 0.2168 0.2493 0.1756 0.1134 0.0512 0.0114
    0.0001 0.0008 0.0058 0.0397 0.1015 0.1734 0.2457 0.2216 0.1521 0.0594
    0.0000 0.0002 0.0018 0.0174 0.0562 0.1169 0.2350 0.2506 0.2121 0.1098
    0.0000 0.0000 0.0003 0.0041 0.0187 0.0522 0.1774 0.2430 0.2817 0.2227
    0.0000 0.0000 0.0000 0.0008 0.0051 0.0190 0.1092 0.1922 0.3052 0.3685
  • There is a considerable amount of crosstalk among the camera channels, and the matrix is almost fully populated with non-zero entries.
  • While a diagonal matrix or a near-diagonal matrix is not required, some embodiments use a matrix Σ that is well-conditioned in order to reduce 3D reconstruction errors. A sketch of a conditioning check is given below.
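  • By way of a non-limiting illustration, and not as a description of any particular embodiment, the following Python sketch (assuming NumPy; the function names and the acceptability threshold are illustrative) shows one way to build the camera sensitivity matrix of Equation (1) from sampled sensitivity curves and to check its condition number:

```python
import numpy as np

def camera_sensitivity_matrix(sensitivities, wavelengths):
    """Build sigma_ij = S_i(lambda_j) (Equation (1)) from camera sensitivity
    curves S_i, sampled at the projector's characteristic wavelengths lambda_j."""
    return np.array([[S(lam) for lam in wavelengths] for S in sensitivities])

def is_well_conditioned(sigma, threshold=100.0):
    """Return True if the condition number of the camera sensitivity matrix is
    at or below the given threshold (illustratively 100; preferably 10)."""
    return np.linalg.cond(sigma) <= threshold
```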
  • In some embodiments, the wavelengths of the monochromatic channels are chosen to correspond to the wavelengths where the spectrum of the ambient light has low intensities. In some embodiments, some of the wavelengths lie in the NIR (near infra-red) spectrum, where the intensities of the spectrum of the ambient light are low. In some embodiments, the projector comprises monochromatic laser sources. In some embodiments, the projector comprises monochromatic LED sources.
  • In some embodiments, some channels of the projector are not monochromatic, but are nevertheless narrow banded such that none of the channels overlaps in spectrum with other channels, and the spectral sensitivity of each channel of the multispectral camera responds to at most one channel of the projector.
  • In some embodiments, the input binary images 102, depicted in the FIG. 1 embodiment as PROJ_IMG1, PROJ_IMG2, . . . , PROJ_IMGn, are based on Gray code. FIG. 3 shows an example of Gray code binary images for an example embodiment where n=10. Gray code is advantageous over other more naïve binary codes, such as the natural binary code, because adjacent Gray codes differ only in one bit. In the event of a recovery error for the Gray code, it is likely that the error will be limited in only one bit plane. In other words, the 3D reconstruction error tends to be lower with the Gray code.
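  • As an illustrative, non-limiting sketch of how such Gray code bit-plane images might be generated (Python with NumPy; the image dimensions and function name are merely examples), one binary image per projector channel can be produced as follows:

```python
import numpy as np

def gray_code_bit_planes(n_bits=10, width=1024, height=768):
    """Generate n_bits binary column patterns, one per projector channel, in
    which projector column c is encoded by the binary-reflected Gray code of c."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                      # binary-reflected Gray code
    planes = []
    for bit in range(n_bits - 1, -1, -1):          # most significant bit first
        row = ((gray >> bit) & 1).astype(np.uint8)
        planes.append(np.tile(row, (height, 1)))   # same code down every row
    return planes                                   # PROJ_IMG1 .. PROJ_IMGn
```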
  • In some embodiments, other codes are used. For example, the high frequency binary codes disclosed in Gupta, M., et al., Structured light 3D scanning in the presence of global illumination, CVPR 2011, may be used. These binary codes are related to the Gray codes via an XOR operation so that they share similar benefits of the Gray codes, while the relatively high spatial frequency of these codes provides more robustness to the adverse effect of global illumination effects such as interreflections. Other binary codes, including the natural binary codes, may be used in an embodiment where the operating environment and/or scene can be controlled so as to reduce recovery errors of the binary codes.
  • In conventional Gray code, it has been considered that when capturing a scene illuminated with a binary coded projection, the scene is captured by a monochrome (e.g., grayscale) camera. Analysis of the captured images then recovers the Gray code at each scene point, from which the depth map can be recovered by a technique based on triangulation.
  • In some embodiments described herein, a multispectral camera with m channels is used, and in addition m≧n, e.g., m=n. Each channel gives rise to a monochrome (e.g., grayscale) image similar to the conventional Gray code, but with the m channel multispectral camera of the embodiment described herein, there are m different channels captured as compared to the single channel captured with a monochrome camera. To recover the binary code associated with each scene point, it may be necessary to convert each grayscale image to a binary image. For each scene point, concatenating the binary value (e.g., 0 or 1) of each channel then gives the binary code in n bits.
  • In some embodiments, the operating environment is controlled (e.g., low ambient light) and the scene/objects have a known range of spectral reflectance (e.g., a lower bound of the reflectance at wavelengths λ1, λ2, . . . , λn can be predetermined). Given a specific grayscale captured image CAM_IMGi, taken from among the captured camera images CAM_IMG1, CAM_IMG2, . . . , CAM_IMGm, where 1≦i≦m, a threshold value ti can be predetermined during calibration such that a binary image BIN_IMGi can be determined by thresholding: For a pixel p,
  • BIN_IMGi(p) = 1 if CAM_IMGi(p) ≧ ti, and 0 otherwise  Equation (2)
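  • A minimal sketch of this thresholding step (Python/NumPy; names are illustrative) is:

```python
import numpy as np

def binarize_fixed_threshold(cam_imgs, thresholds):
    """Equation (2): BIN_IMG_i(p) = 1 where CAM_IMG_i(p) >= t_i, else 0.
    cam_imgs: list of m grayscale channel images; thresholds: the m values t_i."""
    return [(img >= t).astype(np.uint8) for img, t in zip(cam_imgs, thresholds)]
```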
  • In some embodiments, radiance images are first estimated from the camera images as follows: For each pixel p, the values of radiance images {RAD_IMGi(p)}i=1, 2, . . . , n are obtained according to
  • (RAD_IMG1(p), RAD_IMG2(p), . . . , RAD_IMGn(p))T = (1/c(p)) · Σ+ · (CAM_IMG1(p), CAM_IMG2(p), . . . , CAM_IMGm(p))T  Equation (3)
  • In general, Σ+ may be taken as the Moore-Penrose pseudo-inverse of Σ. In embodiments where m=n, Σ+ = Σ⁻¹ is the usual matrix inverse. c(p) is a correction factor that depends on the pixel location p and that accounts for irradiance falloff from the optical axis of the camera. A common approximation for this factor is the “cosine 4th power”. A more precise estimate of this falloff factor can be obtained by camera calibration.
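  • The following non-limiting Python/NumPy sketch illustrates this radiance estimation (the array shapes and the optional falloff map are assumptions made for the example):

```python
import numpy as np

def estimate_radiance_images(cam_imgs, sigma, falloff=None):
    """Equation (3): per pixel, RAD = (1/c(p)) * pinv(Sigma) @ CAM.
    cam_imgs: array of shape (m, H, W); sigma: (m, n) camera sensitivity matrix;
    falloff: optional (H, W) map c(p), e.g. a calibrated cosine-4th-power falloff."""
    sigma_pinv = np.linalg.pinv(sigma)                          # (n, m)
    rad = np.tensordot(sigma_pinv, cam_imgs, axes=([1], [0]))   # (n, H, W)
    if falloff is not None:
        rad = rad / falloff                                     # divide by c(p)
    return rad
```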
  • Once the radiance images are estimated, binarization can be performed by thresholding the radiance images, in a manner similar to the previous embodiment that directly used the camera images:
  • BIN_IMGi(p) = 1 if RAD_IMGi(p) ≧ ti, and 0 otherwise  Equation (4)
  • In some embodiments, the thresholds ti are determined in turn by a predetermined ratio τi:

  • ti = τi · maxp RAD_IMGi(p)  Equation (5)
  • In some embodiments, τi=0.5.
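  • As a brief illustrative sketch (Python; the default ratio follows the τi=0.5 example above, and the function name is hypothetical), the channel-dependent thresholds of Equation (5) might be computed as:

```python
def radiance_thresholds(rad_imgs, tau=0.5):
    """Equation (5): t_i = tau_i * max_p RAD_IMG_i(p), here with a common tau."""
    return [tau * img.max() for img in rad_imgs]
```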
  • In these embodiments involving radiance recovery, because radiance images are estimated by inverting Σ, if Σ is ill-conditioned (e.g., if it has a high condition number) then the quantization errors in the camera images (e.g., 16-bit integer encoding from the A/D converter in the camera image processing) may be amplified, resulting in incorrect binarized images. For example, the condition numbers of the two examples given are cond(Σ1)=2.6767 and cond(Σ2)=1.3036×10⁴, and the depth reconstruction errors of the latter are much higher.
  • Acceptable values for the condition number are 100 and lower, and more preferably 10 and lower.
  • In some embodiments, a second multispectral binary pattern is projected that is the inverse of the first multispectral binary pattern. For example, with respect to the multispectral binary pattern shown in FIG. 3, an inverse multispectral binary pattern is obtained by interchanging the values 0 and 1, as shown in FIG. 4. In other embodiments, a “null pattern”, i.e., a pattern of zeroes, is projected.
  • Reverting to FIG. 1, the control module 103 calculates this inverse pattern and controls the multichannel projector 101 to project this pattern after projecting the first pattern. The control module 103 also controls the multispectral camera 105 to capture the scene with the inverse pattern after capturing the scene with the first pattern. The resulting multispectral image corresponds to a set of grayscale images which are denoted here as “primed” images CAM_IMG1′, CAM_IMG2′, . . . , CAM_IMGm′. Given a grayscale image CAM_IMGi, where 1≦i≦m, a binary image BIN_IMGi can be determined by comparing with CAM_IMGi′:
  • BIN_IMGi(p) = 1 if CAM_IMGi(p) > CAM_IMGi′(p), and 0 otherwise  Equation (6)
  • In some embodiments, binarization is performed by comparing radiance images:
  • BIN_IMGi(p) = 1 if RAD_IMGi(p) > RAD_IMGi′(p), and 0 otherwise  Equation (7)
  • The radiance images are estimated from the camera images similar to the procedure described previously.
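  • A minimal, non-limiting sketch of this comparison-based binarization (Python/NumPy; it applies equally to camera images or to estimated radiance images) is:

```python
import numpy as np

def binarize_with_inverse(imgs, imgs_inv):
    """Equations (6)/(7): BIN_IMG_i(p) = 1 where the image captured under the
    first pattern exceeds the image captured under the inverse pattern, else 0."""
    return [(a > b).astype(np.uint8) for a, b in zip(imgs, imgs_inv)]
```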
  • In other embodiments, the spectral power distributions of the projector channels, P1(λ), . . . , Pn (λ), are not monochrome or narrow band, or otherwise may have significant cross-talk with the spectral sensitivities S1(λ), . . . , Sm(λ) of the multispectral camera. If c=(c1, . . . , cm)T denotes the channels of the multispectral camera, then a channel transformation may be applied to the camera grayscale images to obtain images in a new set of virtual projector channels q=(q1, . . . , qn)T, where q=Σ+c and Σ+ is the Moore-Penrose pseudo-inverse of Σ, a matrix where the (i,j) entry is

  • Σij = ∫0^∞ Si(λ) Pj(λ) dλ  Equation (8)
  • Then a binarization strategy similar to the ones described above is applied to the images in the virtual channels q.
  • In some embodiments, the converted binary images are used to determine a binary code of length n for each scene point corresponding to each pixel of the camera. In some embodiments, the recovered binary code is an n-bit Gray code, and the Gray code is further inverted to obtain a decimal index number that indicates the projector pixel column that is visible to the scene point. In other embodiments, the recovered binary code is an n-bit binary code with an associated decimal index number corresponding to a sequential or serial index of the binary code, and the decimal index number, which indicates the projector pixel column that is visible to the scene point, is recovered by an inversion process and/or a table lookup process. The depth of the scene point is then determined by a triangulation-based method.
  • In some embodiments, the method of triangulation may be implemented based on the geometries and orientations shown in FIG. 5. As shown in FIG. 5, a method of triangulation involves a line-plane intersection 141, wherein the line corresponds to a ray 142 emanating from the camera pixel 143 in question, and the plane is a light plane 145 that corresponds to the determined projector pixel column 146. The determined intersection gives the 3D position of the scene point or, equivalently, the (projective) depth of the scene point along the camera ray.
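  • A non-limiting sketch of the decoding and triangulation steps (Python/NumPy; the Gray-code decoding convention and the plane representation n·X + d = 0 are assumptions made for the example) is:

```python
import numpy as np

def gray_to_index(bits):
    """Invert an n-bit Gray code (most significant bit first) to the decimal
    index of the projector pixel column."""
    value = 0
    for b in bits:
        value = (value << 1) | (b ^ (value & 1))
    return value

def intersect_ray_plane(ray_origin, ray_dir, plane_normal, plane_offset):
    """Line-plane intersection for triangulation: the plane n.X + d = 0 is the
    calibrated light plane of the decoded projector column, and the line is the
    camera ray through the pixel; returns the 3D position of the scene point."""
    t = -(plane_normal @ ray_origin + plane_offset) / (plane_normal @ ray_dir)
    return ray_origin + t * ray_dir
```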
  • FIG. 6 shows one example of a 3D-reconstruction-error distribution for the two camera sensitivity matrices Σ1 and Σ2 given in Tables 1 and 2 above. It shows that high condition numbers result in high reconstruction errors.
  • In some embodiments, selection of which binary pattern to project in which projector channel is based on a rule that minimizes error. In some embodiments, the selection of which binary pattern to project in which projector channel depends on a combination of the camera spectral sensitivity and the target material reflectance. In some embodiments, the projector channels have narrow band characteristic wavelengths λ1, λ2, . . . , λn. The camera sensitivities may be such that each camera channel responds to only one of the wavelengths, which, without loss of generality, can be assumed to be in the same ordering (e.g., m=n), and the matrix Σ is a diagonal matrix, with diagonal entries σi = Si(λi). If the spectral reflectances of the target material, denoted ρ(λ), are known, then the quantities κi = σi·ρ(λi), i=1, 2, . . . , n are considered. These quantities are then sorted into non-increasing order of magnitude, resulting in a permutation of indices πi, e.g., κπ1 ≧ κπ2 ≧ . . . ≧ κπn.
  • In some embodiments, projector channel πi projects binary pattern PROJ_IMGi. In other words, the larger κi is for a projector channel, the more significant the bit plane in the code that is projected in that channel. In embodiments where the target-material reflectance is unknown, ρ(λi) is taken to be 1, and κi is completely determined by the camera sensitivity at that wavelength. The principle behind these embodiments is that a higher κi results in lower error (smaller condition number) in that channel, and should be preferred for recovery of high-significance bits. A sketch of this assignment rule appears below.
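  • As a non-limiting illustration of this channel-assignment rule (Python/NumPy; the function name and arguments are hypothetical):

```python
import numpy as np

def assign_bit_planes(camera_sens_diag, reflectance=None):
    """Rank projector channels by kappa_i = sigma_i * rho(lambda_i) and assign
    the most significant bit plane to the channel with the largest kappa.
    If the target reflectance is unknown, rho is taken to be 1."""
    sigma = np.asarray(camera_sens_diag, dtype=float)
    rho = np.ones_like(sigma) if reflectance is None else np.asarray(reflectance, dtype=float)
    kappa = sigma * rho
    order = np.argsort(-kappa)   # channel indices in non-increasing kappa
    # order[i] is the projector channel that carries bit plane i,
    # with bit plane 0 being the most significant.
    return order
```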
  • The following Table 3 shows the cumulative distribution function of 3D-reconstruction errors for two cases. The first case uses the above embodiment with the non-decreasing sequence of κi. The second case uses the reverse ordering, i.e., the non-increasing sequence of κi. It can be seen that the reverse-ordering embodiment provides an advantage of 0.5% for lower errors. To put the results in context, 0.5% of 1 megapixel is 5,000 pixels. Furthermore, this advantage can be achieved simply by using a more optimal choice of projector channels for the patterns, without incurring any cost in terms of extra equipment or acquisition efficiency.
  • TABLE 3
    Comparison of cumulative distribution
    function of 3D-reconstruction errors
    3D Reconstruction Error x                          0.01      0.02      0.03      0.04
    CDF F(x) for non-decreasing ordering (1st case)    40.27%    80.54%    99.79%    100.00%
    CDF F(x) for non-increasing ordering (2nd case)    40.75%    81.02%    99.70%    100.00%
  • In some embodiments, the depth sensor is incorporated into a material camera, which in turn is part of a system for classifying and sorting materials, such as in a recycling facility. FIG. 7 depicts an example arrangement.
  • As depicted in FIG. 7, a conveyor belt paradigm 160 provides a dynamic scene environment that requires fast capture and decision. A conveyor belt 161 transports objects 162 a, 162 b, etc. past an inspection station indicated generally at 163, where the object subjected to inspection is illuminated with multispectral binary coded projection by sensor 165. Sensor 165 may comprise system 100 shown in FIG. 1. Sensor 165 obtains a 3D reconstruction of the inspected object, as described above. In addition, sensor 165 determines the type of material for the inspected object, such as by distinguishing between plastic and rubber. Material classification may be based in part on spectral reflectances from the object, and it may in part rely on the 3D reconstruction from sensor 165 in order to determine expected spectral reflectances. These aspects are described in greater detail below.
  • Based on the determination of material type, module 166 for material classification logic activates robotics shown generally at 167, to sort the inspected objects into respective bins 168 a, 168 b etc.
  • The above-described systems and methods enjoy the same robustness to depth discontinuities as binary coded patterns, such as Gray code, while avoiding an excessive number of projected patterns. Under many ordinarily-encountered circumstances, one-shot depth capture is theoretically possible, for example via thresholding in each channel of the multispectral camera.
  • In addition, some of the embodiments employ multispectral cameras (relatively broadband) instead of hyperspectral cameras (relatively narrow band). This is advantageous because multispectral imaging does not require high spectral resolution like hyperspectral imaging does and because acquisition time is much shorter, allowing real-time capture of dynamic scenes.
  • Material classification will now be described in greater detail.
  • Material classification using optical methods has many practical applications, such as food inspection and recycling. One material property to measure is spectral reflectance. In this disclosure, some systems and methods extend the standard structured-light technique for three-dimensional measurement to “spectral structured light”, which enables real-time determination of spectral reflectance by capturing very few multispectral images, such as only one multispectral image or only two multispectral images or three or more multispectral images, using a multi-primary projector and a multispectral camera. Some findings show the feasibility of the proposed techniques using ray-tracing simulation of multispectral images.
  • Conventional spectroscopy is capable only of single point measurements. Imaging spectroscopy, such as described by Kim, et al., 2012, combines 3D measurement techniques and hyperspectral imaging, but requires a long capture time, making it unsuitable for real-time classification of materials.
  • Structured light is known in Computer Vision for 3D measurement, such as described by Gupta et al., 2011.
  • In one contribution described herein, structured light techniques are extended in scope and concept to “spectral structured light” that enables real-time determination of spectral reflectance, in some cases by capturing as few as two (2) multispectral images using a multichannel projector and a multispectral camera. Simulation results, using an NVIDIA OptiX framework for a ray tracing simulation of the multispectral images, are also discussed.
  • For the simulation scene, some embodiments used a “V-groove” scene, which is commonly used for testing in structured light experiments. Despite its simplicity, the scene presents difficulties, such as the interreflection effect, and non-uniform illumination due to the relative position of the projector and the V-groove.
  • Based on a theoretical analysis of rays, if the 3D position and surface normal vector of a scene point are known or can be derived, such as with the spectral structured light arrangement described above, the spectral reflectance can be recovered from the radiance of the scene point, which in turn can be measured by a multispectral camera. In some embodiments of the spectral structured light system, 10-bit Gray code patterns are used as in conventional structured light, but the bit plane images are projected simultaneously using separate channels of the 10-channel projector with monochromatic primary in each channel. In some embodiments, only 2 multispectral images are captured by the multispectral camera, corresponding to the 10-bit Gray code and its inverse (complementary) code.
  • In some embodiments of the spectral structured light system, multiple grayscale patterns (i.e., not binary patterns) with predetermined phase shifts are used, and the multiple grayscale patterns are projected simultaneously by projecting each one of the multiple grayscale patterns in a corresponding one of the multiple channels of the multi-primary projector. In some embodiments, only 2 multispectral images are captured by the multispectral camera, corresponding to the multiple grayscale patterns and their inverse (complementary) patterns. For example, if a grayscale pattern is a sinusoidal pattern, then its inverse pattern is the inverted sinusoidal pattern, i.e., with a 180 degree phase difference.
  • In order to simulate these multispectral images, some experiments used the NVIDIA OptiX ray-tracing framework because of its flexibility to program non-standard illumination source (our multi-primary projector) and arbitrary spectral data type.
  • Each simulated multispectral image resulted in a stack of 10 spectral radiance images. The totality of 20 radiance images was processed analogously to the standard structured light post-processing to generate a 3D map. The 3D map was further processed to produce a surface normal map after a pre-processing (smoothing) step. The radiance images, 3D map, and surface normal map were then used to generate a stack of 10 reflectance maps. Finally, RANSAC (“RANdom SAmple Consensus”) was applied to each reflectance map to find a best fit (minimum variance) reflectance value for each wavelength. The simulation obtained a spectral RMS error of 0.97% (less than 1%). In many applications, this is considered a good estimate of spectral reflectance.
  • Thus, using ray-tracing simulations, the tests validate the utility and versatility of a spectral structured light imaging system for fast (2-shot) determination of spectral reflectance of materials. The flexibility and extensibility of the OptiX ray tracing framework allows fast implementation of simulation of arbitrary projector-camera systems and scenes beyond conventional RGB colors. Other systems and methods may include simulations of more complex scenes and non-Lambertian materials and include determinations of other material properties, such as BRDF.
  • In some embodiments, a multichannel projector 181 with N channels and a multispectral camera 182 having M channels, with M being at least N channels, are provided and geometrically calibrated in a triangulation configuration, as shown in FIG. 8. In one embodiment, N=10. In addition, each channel of the projector 181 has a monochromatic primary of wavelength λi, i=1, 2, . . . , N.
  • In FIG. 8, P is a typical scene point and nP is the surface normal vector at P. If the 3D position of P and the surface normal vector at P are known, and the projector-camera system is geometrically calibrated, then dP, the distance between the scene point P and the optical center of the projector, can be determined. Similarly, the angles φP, ψP, βP as indicated in FIG. 8 can also be determined, such as by geometric calibration of the projector-camera system as mentioned in the foregoing paragraph. If vi is the projector radiance (predetermined in a projector calibration, for example) in the i-th projector channel due to a projector pixel Q affecting the scene point P, and Li is the reflected radiance at P at wavelength λi as recorded by the multispectral camera, then a “local reflectance value” R(φP, βP, λi) can be determined by the following equation, provided that the i-th channel of projector pixel Q is turned on:
  • R(φP, βP, λi) = (Li / vi) · dP² / (cos ψP cos φP)  Equation (9)
  • In some embodiments, Equation (9) may take on an additional multiplicative constant on the right hand side. With proper choice of units for radiance and distance, this constant may be reduced to unity. More generally, the right hand side of Equation (9) may take the form
  • (Li / vi) · F(ψP, dP) / cos φP
  • with F(ψP, dP) being a function of two variables ψP and dP dependent on a particular mathematical modeling of the projector, and with the particular instance of Equation (9) above corresponding to the special case
  • F(ψP, dP) = dP² / cos ψP  Equation (9A)
  • If the object surface is Lambertian, then another local reflectance value may also be used, R0P, βP, λi)=πR(φP, βP, λi), so that the above equation takes the following form, provided that the i-th channel of projector pixel Q is turned on:
  • R0(φP, βP, λi) = π · (Li / vi) · dP² / (cos ψP cos φP)  Equation (10)
  • R0(φP, βP, λi) can be considered an approximation of the spectral reflectance ρ(λi) of the Lambertian surface at wavelength λi.
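  • A non-limiting sketch of this local reflectance computation (Python/NumPy; the argument names, and the interpretation of φP as the incidence angle at the surface and ψP as the off-axis angle at the projector, are assumptions made for the example) is:

```python
import numpy as np

def local_reflectance(L_i, v_i, P, n_P, proj_center, proj_axis, lambertian=True):
    """Sketch of Equations (9)/(10) for one channel at one scene point.
    L_i: reflected radiance recorded by the camera; v_i: projector radiance;
    P, n_P: 3D position and unit surface normal of the scene point;
    proj_center, proj_axis: projector optical center and unit optical axis
    (both from geometric calibration of the projector-camera system)."""
    ray = P - proj_center                    # direction from projector to P
    d_P = np.linalg.norm(ray)
    ray = ray / d_P
    cos_phi = float(n_P @ (-ray))            # incidence angle at the surface
    cos_psi = float(proj_axis @ ray)         # angle off the projector axis
    R = (L_i / v_i) * d_P**2 / (cos_psi * cos_phi)    # Equation (9)
    return np.pi * R if lambertian else R             # Equation (10)
```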
  • As explained above, in order to determine the local reflectance value using one of the above equations, there is first a determination of the 3D position of P and the surface normal vector at P. The 3D position of P and the surface normal vector at P can be determined using a method of triangulation as shown in the flow diagram depicted in FIG. 9.
  • As shown in the embodiment of FIG. 9, for the 3D position, one or more patterns composed of binary patterns in each projector channel are projected onto an object in the scene. The binary patterns may be based on the Gray code or any acceptable binary code as described above. Multispectral images are captured in step S901. The 3D position of P, equivalently the depth of P, can be determined by a method of triangulation, in step S902. Furthermore, the one or more patterns may include two patterns such that for each projector channel, the composing binary pattern in one pattern is complementary, or inverse, to the composing binary pattern in the other pattern, in which case there is a corresponding multispectral capture of the complementary/inverse projections.
  • In some embodiments, this process of determining 3D position is applied to a collection of scene points. The collection of determined 3D positions is referred to as a 3D map, as also shown in step S902. The collection of scene points may be determined by a preprocessing step applied to the multispectral images captured by the multispectral camera, such as a step of segmentation. For example, the step of segmentation may determine scene points that belong to a same material.
  • A surface normal map is determined at step S903. For the surface normal vectors at the collection of scenes points, some embodiments determine the surface normal map. In some embodiments, this can be determined from the already determined 3D map. Because taking derivatives may amplify inherent noise in the 3D map, a spatial smoothing operation is applied to the determined 3D positions in a moving window of size W×W. For example, a moving averaging operation can be applied to a 3D map in an 11×11 window. After spatial smoothing, a tangent plane is fitted to the 3D positions in the W×W window, for example in the least-squares sense. The normal vector at a scene point is then taken to be a unit vector perpendicular to the fitted tangent plane.
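  • One non-limiting way to implement this smoothing and tangent-plane fitting (Python with NumPy and SciPy; the SVD-based least-squares plane fit and the handling of the window are choices made for this sketch) is:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def normals_from_3d_map(xyz, window=11):
    """Estimate a surface normal map from a 3D map of shape (H, W, 3): smooth the
    3D positions with a window x window moving average, then fit a tangent plane
    in each window (least squares via SVD) and take its unit normal."""
    smoothed = np.stack(
        [uniform_filter(xyz[..., k], size=window) for k in range(3)], axis=-1)
    H, W, _ = smoothed.shape
    normals = np.zeros_like(smoothed)
    half = window // 2
    for y in range(half, H - half):
        for x in range(half, W - half):
            patch = smoothed[y - half:y + half + 1,
                             x - half:x + half + 1].reshape(-1, 3)
            centered = patch - patch.mean(axis=0)
            _, _, vt = np.linalg.svd(centered, full_matrices=False)
            normals[y, x] = vt[-1]   # singular vector of least singular value
    return normals
```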
  • In some embodiments, the surface normal map is not determined based on the 3D map. Instead, a photometric stereo method, perhaps using an entirely independent illumination and/or camera system, is used to determine the surface normal vectors directly, wherein one or more projectors or illumination sources may be used.
  • When both the 3D map and the surface normal map are determined, a local reflectance map for each wavelength λi can be calculated, as shown at step S904. In the embodiment of step S904, the local reflectance map for each wavelength λi is calculated based on the 3D position and surface normal vector of a scene point, as well as the values of the one or more captured images at a pixel corresponding to the scene point, as described in one of the equations (9) or (10) above.
  • FIG. 10 shows an example of a local reflectance map for a wavelength. A scene point that is not illuminated (i.e., the corresponding projector pixel is not turned on in the channel corresponding to the wavelength) is indicated as a dark pixel. A scene point that is illuminated (i.e., the corresponding projector pixel is turned on in the channel corresponding to the wavelength) is indicated as a lit pixel with a local reflectance value calculated from Equation (10).
  • For a Lambertian surface with spectral reflectance ρ(λ), a local reflectance map R0(φP, βP, λi) is theoretically a constant function in P, i.e., independent of φP and βP. Due to illumination conditions, interreflections, or other global illumination effects, along with errors associated with finite projector and camera resolution, the local reflectance map may not be a constant even for a Lambertian surface. For example, in the illustrative example shown in FIG. 10, the local reflectance map is seen to take values ranging from 0.25 to 0.30.
  • Reverting to FIG. 9, a reflectance property is calculated from local reflectance values in the local reflectance map, at step S905, by fitting them to a global reflectance model. One such model is a constant global reflectance model. Another such model is an analytical BRDF (bidirectional reflectance distribution function) model, which is more accurately a family of analytic models. In the case of a constant global reflectance model, a reflectance property is the spectral reflectance ρ(λi) at wavelength λi. In order to obtain the spectral reflectance ρ(λi) of a Lambertian surface, a RANSAC (RANdom SAmple Consensus) algorithm may be applied to a local reflectance map to fit a constant global reflectance model. The RANSAC algorithm attempts to find a subset of pixels in the local reflectance map that are inliers. In the case of a constant global reflectance model, fitting corresponds to an averaging of the local reflectance values for the inliers, resulting in an estimate ρ̂i of the true reflectance ρ(λi), and the best fit is achieved by minimizing the variance or standard deviation. The RANSAC algorithm requires as input a threshold on the minimum number of inliers required. In some embodiments, this threshold number of pixels is 10% of the valid pixels in the local reflectance map. A sketch of such a fit is given below.
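  • A minimal, non-limiting sketch of fitting a constant global reflectance model with a RANSAC-style loop (Python/NumPy; the iteration count and inlier tolerance are arbitrary example values) is:

```python
import numpy as np

def ransac_constant_reflectance(values, n_iter=1000, tol=0.01,
                                min_inlier_frac=0.1, seed=None):
    """Fit a constant reflectance to a local reflectance map: repeatedly pick a
    candidate value, collect inliers within tol, and keep the candidate whose
    inliers have minimum variance; the inlier mean is the estimate rho_hat."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float).ravel()
    min_inliers = int(min_inlier_frac * values.size)
    best_var, best_est = np.inf, float(values.mean())
    for _ in range(n_iter):
        candidate = rng.choice(values)
        inliers = values[np.abs(values - candidate) < tol]
        if inliers.size >= min_inliers and inliers.var() < best_var:
            best_var, best_est = inliers.var(), float(inliers.mean())
    return best_est
```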
  • For a non-Lambertian surface, another embodiment of the global reflectance model to be fitted is an analytical BRDF (bidirectional reflectance distribution function) model. In general, such a model may take the form f(φP, βP, λi, ρ, σ, . . . ), where f is a function and ρ, σ, . . . are parameters of the model. For example, in the Oren-Nayar model for rough surfaces, ρ is an albedo (reflectance) at λi, and σ is a surface roughness parameter. Similarly, RANSAC is applied to obtain a best fit of the BRDF model for the local reflectance map R(φP, βP, λi), resulting in estimates ρ̂, σ̂, . . . of the BRDF model parameters. These parameters can then be used as material property parameters. In another example, the Cook-Torrance model for metallic surfaces provides parameters such as albedo and root-mean-square slope of the surface microfacets that can be used as material property parameters.
  • Further embodiments of a global reflectance model for a non-Lambertian surface include a BTF (bidirectional texture function) model or a BSSRDF (bidirectional surface scattering reflectance distribution function) model. Generally, it is preferred for the global reflectance model to be an analytical model with a few parameters for fitting to.
  • Contrary to conventional spectroscopy or imaging spectroscopy, these systems and methods enable real-time determination of material properties by capturing very few (1 or 2) multispectral images.
  • With specific reference to the aforementioned feature of calculating a local reflectance value for each of the scene points, as mentioned in connection with step S904, which calculation is based on the 3D position and surface normal vector of the scene point, as well as the values of the one or more captured images at a pixel corresponding to the scene point, the following may be stated.
  • In some embodiments for calculating the local reflectance value, one or more patterns that include binary patterns in each projector channel are used. In addition, the one or more patterns consist of two patterns such that, for each projector channel, the included binary pattern in one pattern is complementary to the included binary pattern in the other pattern. The following description focuses on a typical projector channel. For this channel, the multispectral camera records two radiance images I0 and I1 that are complementary, or inverse to each other. Even though the projected patterns are binary, the recorded images would in general have more than two values, e.g., they are grayscale images instead of binary images. FIG. 11 shows an example.
  • In FIG. 11, a binary mask M, shown at 171, is determined. This mask M represents pixels that are determined to correspond to projector pixels that are turned on in that channel. In an embodiment, the binary mask M is determined by performing the Boolean comparison operation I0>I1 on the two radiance images I0 (shown at 172) and I1 (shown at 173).
  • The fraction α of pixels that are determined to be “on-pixels” is then determined from the binary mask M. For patterns based on the Gray code, α is typically approximately 0.5, although the actual value depends on the images I0 and I1, in the sense that α is calculated from M, which in turn is calculated from images I0 and I1.
  • A correction image I2 (shown at 175) is then determined. In FIG. 11, the correction image I2 appears relatively dark and corresponds to “noise” due to interreflections. The correction image I2 is determined using the following equation:
  • I2 = I1 · M · α / (1 − α)  Equation (11)
  • Pixel-wise multiplication by the mask M may ensure that correction is applied only to the on-pixels in I0. The local reflectance calculation may be done only for projector pixels that are turned on for that channel, so that correction to the on-pixels is the only concern. The correction may be needed to compensate for global illumination effects such as interreflections.
  • Finally, a corrected radiance image I3 (shown at 176) is obtained by the equation I3=I0−I2. The radiance values Li at the on-pixels in I3 are then used in the local reflectance calculations.
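  • The following non-limiting sketch (Python/NumPy; the function name is hypothetical) gathers these steps, producing the mask M, the correction image I2 of Equation (11), and the corrected radiance image I3:

```python
import numpy as np

def correct_interreflections(I0, I1):
    """Given complementary radiance images I0 and I1 for one channel, compute
    the on-pixel mask M = (I0 > I1), the on-pixel fraction alpha, the correction
    image I2 = I1 * M * alpha / (1 - alpha) (Equation (11)), and I3 = I0 - I2."""
    M = (I0 > I1).astype(I0.dtype)
    alpha = M.mean()
    I2 = I1 * M * alpha / (1.0 - alpha)
    I3 = I0 - I2
    return M, I2, I3
```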
  • One example of illustrative radiance profiles of I0, I2, and I3 along a horizontal row of pixels is shown in FIG. 12.
  • As seen in FIG. 12, the thin continuous line represents the profile of I0, while the thick dotted line and thick continuous line represent respectively the profiles of I2 and I3. Comparing the profile of I0 at on-pixels and off-pixels, it is clear that the correction of the global component of illumination is effective.
  • FIGS. 13, 14 and 15 describe ray-tracing simulations that validate the utility and versatility of a spectral structured light imaging system for fast (2-shot) determination of the spectral reflectance of materials or, more generally, for the determination of a material property. One common material property to measure is spectral reflectance.
  • Consider a “V-groove” scene, which is commonly used as a test scene in structured light experiments, as mentioned by Gupta et al., 2011, as shown in FIG. 13.
  • This figure depicts a cross-section of a 3D Lambertian surface shown in cross-section at the X-Z plane. The equation of the Lambertian surface is

  • z=|x| where −1≦x≦1 and −1≦y≦1  Equation (12)
  • The Lambertian surface has reflectance ρ(λ) and in this figure θ=5 degrees.
  • Such a “V-groove” scene presents difficulties such as the interreflection effect, and non-uniform illumination due to the relative position of the projector and the V-groove.
  • Assume that the multi-primary projector has 10 monochromatic channels with wavelengths λ1, λ2, . . . , λ10. If the radiance values of a scene point P at these wavelengths are L1, L2, . . . , L10, as recorded by the multispectral camera, then the spectral reflectance factors at these wavelengths can be recovered by the following equation:

  • ρ(λi) ≈ π · Li · dP² / (cos ψP cos φP), i = 1, 2, . . . , 10  Equation (13)
  • In this equation, the approximate equality would be exact if there were no interreflections. To use this equation, both the 3D position and surface normal vector are determined at P.
  • FIG. 15 is a view for explaining a simulation framework. In the simulation, the NVIDIA OptiX framework was used for ray tracing simulation of spectral radiance images because of its flexibility to program a non-standard illumination source and an arbitrary spectral data type. In addition, the flexibility and extensibility of the OptiX ray tracing framework allows fast implementation of simulation of arbitrary projector-camera systems and scenes beyond conventional RGB colors.
  • Some key features of the simulation are:
  • (a) A new data type is defined to support 10-channel spectral data.
  • (b) The standard path tracing algorithm is modified to work with our projector with pinhole optics: In the standard algorithm, the path ends only when it hits a light source, which would not happen with a pinhole.
  • (c) The radiance images are averaged over 5000+ frames to minimize random noise.
  • (d) 10-bit Gray code patterns are used as in conventional structured light, but the bit plane images are projected simultaneously using separate channels of the multi-primary projector. Only 2 multispectral images are captured, corresponding to the 10-bit Gray code and its inverse (complementary) code.
  • (e) Visualization of the multispectral radiance image is provided by real time conversion of multispectral radiance images to RGB images for display.
  • In more detail, as illustrated in FIG. 15, the simulation program starts by initializing an instance ‘prd’ of a data structure ‘PerRayData’, and generating an initial camera ray. The data structure ‘PerRayData’ is a custom data structure containing fields including ‘radiance’, ‘projector’, and ‘attenuation’, of a custom data type ‘spectrum’. The custom data type ‘spectrum’ is defined as a structure containing a vector of ten components, each represented in an IEEE single precision floating point number. A program variable ‘depth’, initially set to one, is compared to a predetermined constant ‘MAX_DEPTH’ which determines a maximum number of levels of recursion in the ray tracing simulation. If the maximum number of levels of recursion is exceeded (‘depth>MAX_DEPTH’), the simulation program outputs the radiance of the ray, ‘prd.radiance’. If the maximum number of levels of recursion is not exceeded (‘depth<=MAX_DEPTH’), ray tracing is performed by the OptiX built-in function rtTrace in order to update the data structure ‘prd’. Afterwards, the current radiance ‘prd.radiance’ of the ray is also updated for each of the ten components, the program variable ‘depth’ is incremented by one, and a “continuing ray” is generated that has a new random direction calculated from a function closest_hit, which is described below. The process then goes back to the step of comparing ‘depth’ to the predetermined constant ‘MAX_DEPTH’. In the above described process, the OptiX built-in function rtTrace interacts with two custom functions: intersect_vgroove and closest_hit. intersect_vgroove calculates the intersection point ‘hit_point’ of the current ray and the V-groove. The intersection point ‘hit_point’ is then passed into the closest_hit function, which generates a new random direction, calculates projector radiance ‘prd.projector’ at ‘hit_point’, and updates attenuation factor ‘prd.attenuation’, the latter two being used in the subsequent updating of the current radiance of the ray as described above.
  • As noted in feature (d) above, the 10 bit plane images of the Gray code are projected simultaneously in separate channels of the multi-primary projector, so that only 2 multispectral images are captured: one for the 10-bit Gray code and one for its inverse (complementary) code. A grayscale illustration of an RGB visualization of the OptiX simulation of the spectral Gray code pattern (left) and its inverse pattern (right) is shown in FIG. 14. An example pixel column is shown on the spectral Gray code pattern (left) that is associated with an example code 1010001001, resulting in a non-zero reflection spectrum in wavelengths λ1, λ3, λ7 and λ10. A corresponding pixel column is shown on the inverse pattern (right) that is associated with the inverse code 0101110110, resulting in a non-zero reflection spectrum in wavelengths λ2, λ4, λ5, λ6, λ8 and λ9.
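  • As an illustration (a hedged sketch, not code from this disclosure), the per-pixel decoding implied by FIG. 14 can be performed by comparing the two captured images channel by channel and then converting the recovered Gray code to a binary projector column index. The channel ordering (channel 1 as the most significant bit) and the data layout are assumptions made for this example.

    #include <array>
    #include <cstdint>
    #include <cstdio>

    constexpr int NUM_BITS = 10;
    using Spectrum = std::array<float, NUM_BITS>;

    // Bit i is 1 when the direct pattern is brighter than its complement in
    // channel i; comparing the complementary pair makes the decision largely
    // independent of the (unknown) local reflectance.
    uint16_t decodeGray(const Spectrum& img, const Spectrum& imgInv) {
        uint16_t gray = 0;
        for (int i = 0; i < NUM_BITS; ++i)
            if (img[i] > imgInv[i]) gray |= (1u << (NUM_BITS - 1 - i));
        return gray;
    }

    // Standard Gray-to-binary conversion yields the projector column index.
    uint16_t grayToBinary(uint16_t gray) {
        uint16_t binary = gray;
        for (uint16_t shift = 1; shift < NUM_BITS; ++shift)
            binary ^= static_cast<uint16_t>(gray >> shift);
        return binary;
    }

    int main() {
        // Example from the text: code 1010001001 -> channels 1, 3, 7, 10 are "on".
        Spectrum img    = {1, 0, 1, 0, 0, 0, 1, 0, 0, 1};
        Spectrum imgInv = {0, 1, 0, 1, 1, 1, 0, 1, 1, 0};
        uint16_t gray = decodeGray(img, imgInv);
        std::printf("Gray code %u -> projector column %u\n",
                    (unsigned)gray, (unsigned)grayToBinary(gray));
        return 0;
    }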
  • A flow diagram for one representative reconstruction algorithm is shown in FIG. 16. In a manner somewhat similar to the flow of FIG. 9, multispectral images are captured in step S1501 (here a multispectral image pair is captured), a 3D map is reconstructed in step S1502, such as by triangulation, a surface normal map is constructed in step S1503, a reflectance map is constructed in step S1504, and spectral reflectances are estimated in step S1505, such as by RANSAC analysis.
  • In FIG. 16, suggestive examples are provided to the right of each of the steps, so as to indicate the general nature of graphical depictions of input/output for each step.
  • In the simulation, each of the 2 spectral Gray code patterns results in a stack of 10 spectral radiance images.
  • The totality of 20 radiance images is processed analogously to structured light post-processing described above to generate a 3D map.
  • The 3D map is further processed to produce a surface normal map after a pre-processing (smoothing) step.
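  • The sketch below illustrates one simple way this smoothing-then-normal step could be carried out: box-filter smoothing of the 3D map followed by a cross product of central-difference tangent vectors. It is a hedged, simplified stand-in for the moving-window smoothing and tangent-plane fitting described in this disclosure; the grid layout, window size and handling of missing points are illustrative assumptions.

    #include <cmath>
    #include <vector>

    struct Vec3 { double x, y, z; };

    static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3 cross(const Vec3& a, const Vec3& b) {
        return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
    }
    static Vec3 normalize(const Vec3& v) {
        double n = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
        return n > 0 ? Vec3{v.x/n, v.y/n, v.z/n} : Vec3{0, 0, 0};
    }

    // points: per-pixel 3D positions on a width x height camera grid (row major).
    std::vector<Vec3> normalMap(const std::vector<Vec3>& points, int width, int height) {
        // 3x3 box smoothing of the 3D map (the "pre-processing" step).
        std::vector<Vec3> smooth(points.size());
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x) {
                Vec3 acc{0, 0, 0}; int n = 0;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx) {
                        int xx = x + dx, yy = y + dy;
                        if (xx < 0 || yy < 0 || xx >= width || yy >= height) continue;
                        const Vec3& p = points[yy*width + xx];
                        acc.x += p.x; acc.y += p.y; acc.z += p.z; ++n;
                    }
                smooth[y*width + x] = {acc.x/n, acc.y/n, acc.z/n};
            }
        // Normal from the cross product of the two in-surface tangent directions.
        std::vector<Vec3> normals(points.size(), {0, 0, 0});
        for (int y = 1; y + 1 < height; ++y)
            for (int x = 1; x + 1 < width; ++x) {
                Vec3 tu = sub(smooth[y*width + x + 1], smooth[y*width + x - 1]);
                Vec3 tv = sub(smooth[(y+1)*width + x], smooth[(y-1)*width + x]);
                normals[y*width + x] = normalize(cross(tu, tv));
            }
        return normals;
    }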
  • Because the 3D map and the surface normal map are now determined, the radiance images, the 3D map and the surface normal map are used to generate a stack of 10 reflectance maps by applying Equation (13) above, which is reproduced here for convenience:

  • ρ(λ_i) ≈ π · L_i · d_P² / (cos ψ_P cos φ_P),  i = 1, 2, . . . , 10  (Equation (13), repeated)
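  • A minimal C++ sketch of this step is given below, with L_i taken as the measured radiance in channel i at the pixel corresponding to P, d_P as the projector-to-point distance, and ψ_P and φ_P as the two angles appearing in Equation (13); the input values used here are placeholders, not values from this disclosure.

    #include <array>
    #include <cmath>
    #include <cstdio>

    constexpr int kNumChannels = 10;
    constexpr double kPi = 3.14159265358979323846;

    // Equation (13): rho(lambda_i) ~= pi * L_i * d_P^2 / (cos(psi_P) * cos(phi_P)).
    std::array<double, kNumChannels> reflectanceAtPoint(
        const std::array<double, kNumChannels>& L,   // radiance per wavelength
        double dP, double psiP, double phiP)         // distance and angles (radians)
    {
        std::array<double, kNumChannels> rho{};
        const double geom = (dP * dP) / (std::cos(psiP) * std::cos(phiP));
        for (int i = 0; i < kNumChannels; ++i)
            rho[i] = kPi * L[i] * geom;
        return rho;
    }

    int main() {
        std::array<double, kNumChannels> L;
        L.fill(0.05);                                 // placeholder radiances
        auto rho = reflectanceAtPoint(L, 1.2, 0.3, 0.4);
        for (double r : rho) std::printf("%.3f ", r);
        std::printf("\n");
        return 0;
    }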
  • Finally, RANSAC is applied to each reflectance map to find a best-fit (minimum variance) reflectance value for each wavelength, yielding the results shown in the following Table 4; a sketch of this fitting step is given after the table.
  • TABLE 4
    Comparison of ground truth reflectance of simulated Lambertian
    surface against estimated reflectance for each wavelength
    Wavelength (nm)                 405     440     473     514.5   543.5   568.2   611.9   632.8   659     694.3
    Ground Truth Reflectance        0.498   0.606   0.582   0.412   0.313   0.268   0.233   0.240   0.245   0.225
    Estimated Reflectance           0.504   0.616   0.581   0.426   0.317   0.277   0.237   0.250   0.229   0.215
    Reflectance Error (Percentage)  0.62%   1.02%   0.08%   1.46%   0.37%   0.93%   0.43%   0.98%   1.65%   1.02%
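  • The following is a hedged C++ sketch of the per-wavelength fitting step: a RANSAC-style search over candidate reflectance values drawn from one reflectance map, scoring candidates by consensus and returning the mean of the best consensus set. The inlier tolerance, iteration count and exact scoring rule (this disclosure specifies a minimum-variance best fit) are illustrative assumptions.

    #include <cmath>
    #include <cstdio>
    #include <random>
    #include <vector>

    // Fit a single constant reflectance value to one reflectance map.
    double ransacConstantFit(const std::vector<double>& samples,
                             double inlierTol = 0.02, int iterations = 200) {
        std::mt19937 rng(12345);
        std::uniform_int_distribution<std::size_t> pick(0, samples.size() - 1);
        double best = samples.front();
        std::size_t bestCount = 0;
        double bestVar = 1e30;
        for (int it = 0; it < iterations; ++it) {
            double candidate = samples[pick(rng)];
            double sum = 0.0, sumSq = 0.0;
            std::size_t count = 0;
            for (double s : samples)                 // gather inliers of the candidate
                if (std::fabs(s - candidate) < inlierTol) {
                    sum += s; sumSq += s * s; ++count;
                }
            if (count < 2) continue;
            double mean = sum / count;
            double var  = sumSq / count - mean * mean;
            // Prefer larger consensus sets; break ties by smaller variance.
            if (count > bestCount || (count == bestCount && var < bestVar)) {
                bestCount = count; bestVar = var; best = mean;
            }
        }
        return best;
    }

    int main() {
        // Placeholder reflectance map values for one wavelength (with outliers).
        std::vector<double> map = {0.50, 0.51, 0.49, 0.50, 0.52, 0.90, 0.10, 0.50};
        std::printf("estimated reflectance: %.3f\n", ransacConstantFit(map));
        return 0;
    }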
  • The spectral RMS (root-mean-square) error can be derived from the individual spectral errors shown in Table 4. In this simulation, the spectral RMS error is 0.97% (less than 1%). In many applications, this is considered a good estimate of spectral reflectance.
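  • For example, using the rounded per-wavelength errors of Table 4, the spectral RMS error is √[(0.62² + 1.02² + 0.08² + 1.46² + 0.37² + 0.93² + 0.43² + 0.98² + 1.65² + 1.02²)/10] ≈ √0.947 ≈ 0.97%.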
  • Ray tracing simulations thus validate the utility and versatility of a spectral structured light imaging system for fast (2-shot) determination of the spectral reflectance of materials or, more generally, of a material property.
  • According to other embodiments contemplated by the present disclosure, example embodiments may include a computer processor such as a single core or multi-core central processing unit (CPU) or micro-processing unit (MPU), which is constructed to realize the functionality described above. The computer processor might be incorporated in a stand-alone apparatus or in a multi-component apparatus, or might comprise multiple computer processors which are constructed to work together to realize such functionality. The computer processor or processors execute a computer-executable program (sometimes referred to as computer-executable instructions or computer-executable code) to perform some or all of the above-described functions. The computer-executable program may be pre-stored in the computer processor(s), or the computer processor(s) may be functionally connected for access to a non-transitory computer-readable storage medium on which the computer-executable program or program steps are stored. For these purposes, access to the non-transitory computer-readable storage medium may be a local access such as by access via a local memory bus structure, or may be a remote access such as by access via a wired or wireless network or Internet. The computer processor(s) may thereafter be operated to execute the computer-executable program or program steps to perform functions of the above-described embodiments.
  • According to still further embodiments contemplated by the present disclosure, example embodiments may include methods in which the functionality described above is performed by a computer processor such as a single core or multi-core central processing unit (CPU) or micro-processing unit (MPU). As explained above, the computer processor might be incorporated in a stand-alone apparatus or in a multi-component apparatus, or might comprise multiple computer processors which work together to perform such functionality. The computer processor or processors execute a computer-executable program (sometimes referred to as computer-executable instructions or computer-executable code) to perform some or all of the above-described functions. The computer-executable program may be pre-stored in the computer processor(s), or the computer processor(s) may be functionally connected for access to a non-transitory computer-readable storage medium on which the computer-executable program or program steps are stored. Access to the non-transitory computer-readable storage medium may form part of the method of the embodiment. For these purposes, access to the non-transitory computer-readable storage medium may be a local access such as by access via a local memory bus structure, or may be a remote access such as by access via a wired or wireless network or Internet. The computer processor(s) is/are thereafter operated to execute the computer-executable program or program steps to perform functions of the above-described embodiments.
  • The non-transitory computer-readable storage medium on which a computer-executable program or program steps are stored may be any of a wide variety of tangible storage devices which are constructed to retrievably store data, including, for example, any of a flexible disk (floppy disk), a hard disk, an optical disk, a magneto-optical disk, a compact disc (CD), a digital versatile disc (DVD), micro-drive, a read only memory (ROM), random access memory (RAM), erasable programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), dynamic random access memory (DRAM), video RAM (VRAM), a magnetic tape or card, optical card, nanosystem, molecular memory integrated circuit, redundant array of independent disks (RAID), a nonvolatile memory card, a flash memory device, a storage of distributed computing systems and the like. The storage medium may be a function expansion unit removably inserted in and/or remotely accessed by the apparatus or system for use with the computer processor(s).
  • This disclosure has provided a detailed description with respect to particular representative embodiments. It is understood that the scope of the appended claims is not limited to the above-described embodiments and that various changes and modifications may be made without departing from the scope of the claims.

Claims (40)

1. A method of estimating a material property of a scene by projecting one or more patterns onto the scene using a projector and capturing one or more images of the scene using a camera, the method comprising:
determining 3D positions of a collection of scene points from the one or more captured images;
determining surface normal vectors of the scene points in the collection of scene points from the one or more captured images;
calculating a local reflectance value for each of the scene points in the collection of scene points; and
fitting a global reflectance model to the local reflectance values so as to obtain an estimate of a material property parameter.
2. The method of claim 1, wherein the projector is a multi-primary projector with multiple channels corresponding to multiple wavelengths, the camera is a multispectral camera with at least as many channels as the number of projector channels, and the global reflectance model is a global reflectance model of the multiple wavelengths.
3. The method of claim 1, wherein the projector is a multi-primary projector with multiple channels, the camera is a multispectral camera with at least as many channels as the number of projector channels, and the one or more patterns are patterns that are composed of binary patterns in each projector channel.
4. The method of claim 3, wherein the binary patterns are based on the Gray code.
5. The method of claim 3, wherein the one or more patterns consist of two patterns such that for each projector channel, the composing binary pattern in one pattern is complementary to the composing binary pattern in the other pattern.
6. The method of claim 1, wherein determining the 3D position of a scene point from the collection of scene points comprises determining a projector pixel coordinate and a camera pixel coordinate corresponding to the scene point, followed by a triangulation calculation.
7. The method of claim 6, wherein determining the projector pixel coordinate comprises recovering a binary representation of a code with each binary digit corresponding to a projector channel.
8. The method of claim 7, wherein recovering a binary representation of a code comprises comparing the values of the complementary patterns at the scene point.
9. The method of claim 1, wherein determining surface normal vectors comprises applying a spatial smoothing operation to the determined 3D positions in a moving window followed by fitting a tangent plane in the moving window.
10. The method of claim 1, wherein determining surface normal vectors is based on a photometric stereo algorithm.
11. The method of claim 1, wherein calculating a local reflectance value for each of the scene points is based on the 3D position and surface normal vector of the scene point, as well as the values of the one or more captured images at a pixel corresponding to the scene point.
12. The method of claim 1, wherein the global reflectance model to be fitted is a constant reflectance value to be determined.
13. The method of claim 1, wherein the global reflectance model to be fitted is a BRDF model.
14. An apparatus for estimating a material property of a scene by projecting one or more patterns onto the scene using a projector and capturing one or more images of the scene using a camera, the apparatus comprising:
one or more processors constructed to execute stored process steps; and
memory which stores process steps for execution by the one or more processors;
wherein the stored process steps comprise:
determining 3D positions of a collection of scene points from the one or more captured images;
determining surface normal vectors of the scene points in the collection of scene points from the one or more captured images;
calculating a local reflectance value for each of the scene points in the collection of scene points; and
fitting a global reflectance model to the local reflectance values so as to obtain an estimate of a material property parameter.
15. The apparatus of claim 14, wherein the projector is a multi-primary projector with multiple channels corresponding to multiple wavelengths, the camera is a multispectral camera with at least as many channels as the number of projector channels, and the global reflectance model is a global reflectance model of the multiple wavelengths.
16. The apparatus of claim 14, wherein the projector is a multi-primary projector with multiple channels, the camera is a multispectral camera with at least as many channels as the number of projector channels, and the one or more patterns are patterns that are composed of binary patterns in each projector channel.
17. The apparatus of claim 16, wherein the binary patterns are based on the Gray code.
18. The apparatus of claim 16, wherein the one or more patterns consist of two patterns such that for each projector channel, the composing binary pattern in one pattern is complementary to the composing binary pattern in the other pattern.
19. The apparatus of claim 14, wherein determining the 3D position of a scene point from the collection of scene points comprises determining a projector pixel coordinate and a camera pixel coordinate corresponding to the scene point, followed by a triangulation calculation.
20. The apparatus of claim 19, wherein determining the projector pixel coordinate comprises recovering a binary representation of a code with each binary digit corresponding to a projector channel.
21. The apparatus of claim 20, wherein recovering a binary representation of a code comprises comparing the values of the complementary patterns at the scene point.
22. The apparatus of claim 14, wherein determining surface normal vectors comprises applying a spatial smoothing operation to the determined 3D positions in a moving window followed by fitting a tangent plane in the moving window.
23. The apparatus of claim 14, wherein determining surface normal vectors is based on a photometric stereo algorithm.
24. The apparatus of claim 14, wherein calculating a local reflectance value for each of the scene points is based on the 3D position and surface normal vector of the scene point, as well as the values of the one or more captured images at a pixel corresponding to the scene point.
25. The apparatus of claim 14, wherein the global reflectance model to be fitted is a constant reflectance value to be determined.
26. The apparatus of claim 14, wherein the global reflectance model to be fitted is a BRDF model.
27. An apparatus for estimating a material property of a scene by projecting one or more patterns onto the scene using a projector and capturing one or more images of the scene using a camera, the apparatus comprising:
a determining unit constructed to determine 3D positions of a collection of scene points from the one or more captured images;
a determining unit constructed to determine surface normal vectors of the scene points in the collection of scene points from the one or more captured images;
a calculating unit constructed to calculate a local reflectance value for each of the scene points in the collection of scene points; and
a fitting unit constructed to fit a global reflectance model to the local reflectance values so as to obtain an estimate of a material property parameter.
28. The apparatus of claim 27, wherein the projector is a multi-primary projector with multiple channels corresponding to multiple wavelengths, the camera is a multispectral camera with at least as many channels as the number of projector channels, and the global reflectance model is a global reflectance model of the multiple wavelengths.
29. The apparatus of claim 27, wherein the projector is a multi-primary projector with multiple channels, the camera is a multispectral camera with at least as many channels as the number of projector channels, and the one or more patterns are patterns that are composed of binary patterns in each projector channel.
30. The apparatus of claim 29, wherein the binary patterns are based on the Gray code.
31. The apparatus of claim 29, wherein the one or more patterns consist of two patterns such that for each projector channel, the composing binary pattern in one pattern is complementary to the composing binary pattern in the other pattern.
32. The apparatus of claim 27, wherein determining the 3D position of a scene point from the collection of scene points comprises determining a projector pixel coordinate and a camera pixel coordinate corresponding to the scene point, followed by a triangulation calculation.
33. The apparatus of claim 32, wherein determining the projector pixel coordinate comprises recovering a binary representation of a code with each binary digit corresponding to a projector channel.
34. The apparatus of claim 33, wherein recovering a binary representation of a code comprises comparing the values of the complementary patterns at the scene point.
35. The apparatus of claim 27, wherein determining surface normal vectors comprises applying a spatial smoothing operation to the determined 3D positions in a moving window followed by fitting a tangent plane in the moving window.
36. The apparatus of claim 27, wherein determining surface normal vectors is based on a photometric stereo algorithm.
37. The apparatus of claim 27, wherein calculating a local reflectance value for each of the scene points is based on the 3D position and surface normal vector of the scene point, as well as the values of the one or more captured images at a pixel corresponding to the scene point.
38. The apparatus of claim 27, wherein the global reflectance model to be fitted is a constant reflectance value to be determined.
39. The apparatus of claim 27, wherein the global reflectance model to be fitted is a BRDF model.
40. A non-transitory computer-readable storage medium which retrievably stores computer-executable process steps which when executed by a computer cause the computer to execute a method of estimating a material property of a scene by projecting one or more patterns onto the scene using a projector and capturing one or more images of the scene using a camera, the method comprising:
determining 3D positions of a collection of scene points from the one or more captured images;
determining surface normal vectors of the scene points in the collection of scene points from the one or more captured images;
calculating a local reflectance value for each of the scene points in the collection of scene points; and
fitting a global reflectance model to the local reflectance values so as to obtain an estimate of a material property parameter.
US13/839,457 2012-07-30 2013-03-15 Multispectral Binary Coded Projection Abandoned US20140028801A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/839,457 US20140028801A1 (en) 2012-07-30 2013-03-15 Multispectral Binary Coded Projection

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201261677382P 2012-07-30 2012-07-30
US201361759083P 2013-01-31 2013-01-31
US13/839,457 US20140028801A1 (en) 2012-07-30 2013-03-15 Multispectral Binary Coded Projection

Publications (1)

Publication Number Publication Date
US20140028801A1 true US20140028801A1 (en) 2014-01-30

Family

ID=49994499

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/839,292 Active 2034-05-07 US9325966B2 (en) 2012-07-30 2013-03-15 Depth measurement using multispectral binary coded projection and multispectral image capture
US13/839,457 Abandoned US20140028801A1 (en) 2012-07-30 2013-03-15 Multispectral Binary Coded Projection

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/839,292 Active 2034-05-07 US9325966B2 (en) 2012-07-30 2013-03-15 Depth measurement using multispectral binary coded projection and multispectral image capture

Country Status (1)

Country Link
US (2) US9325966B2 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3019905B1 (en) * 2013-04-30 2020-06-03 Molecular Devices, LLC Apparatus and method for generating in-focus images using parallel imaging in a microscopy system
DE102014104903A1 (en) * 2014-04-07 2015-10-08 Isra Vision Ag Method and sensor for generating and detecting patterns on a surface
TWI524050B (en) * 2014-04-15 2016-03-01 聚晶半導體股份有限公司 Image capture device, depth generating device and method thereof
WO2016024200A2 (en) 2014-08-12 2016-02-18 Mantisvision Ltd. Structured light projection and imaging
US9846943B2 (en) * 2015-08-31 2017-12-19 Qualcomm Incorporated Code domain power control for structured light
US10012496B2 (en) * 2015-10-29 2018-07-03 Canon Kabushiki Kaisha Multispectral binary coded projection using multiple projectors
EP3529553A4 (en) * 2016-10-18 2020-06-17 Dentlytec G.P.L. Ltd. Intra-oral scanning patterns
CN108985119A (en) * 2017-05-31 2018-12-11 华为技术有限公司 The method and apparatus of structured light decoding
US10282857B1 (en) 2017-06-27 2019-05-07 Amazon Technologies, Inc. Self-validating structured light depth sensor system
KR102355301B1 (en) * 2017-09-29 2022-01-25 삼성전자 주식회사 Method and apparatus for analyzing communication environment
US10235797B1 (en) 2017-11-27 2019-03-19 Lowe's Companies, Inc. Inverse rendering of visual material properties
GB201721451D0 (en) 2017-12-20 2018-01-31 Univ Manchester Apparatus and method for determining spectral information
EP3784111A2 (en) 2018-04-25 2021-03-03 Dentlytec G.P.L. Ltd. Properties measurement device
JP2021170689A (en) * 2018-06-01 2021-10-28 ソニーグループ株式会社 Image processing device and method
WO2020168094A1 (en) * 2019-02-15 2020-08-20 Nikon Corporation Simultaneous depth profile and spectral measurement
CN112097688B (en) * 2020-09-03 2021-07-06 清华大学 Multispectral three-dimensional shape measurement method and device based on grating projection three-dimensional imaging
CN112985307B (en) * 2021-04-13 2023-03-21 先临三维科技股份有限公司 Three-dimensional scanner, system and three-dimensional reconstruction method
CN113269886B (en) * 2021-04-29 2022-10-18 中国科学院武汉岩土力学研究所 Slope three-dimensional digital twin model building method based on multi-source data fusion

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5253302A (en) * 1989-02-28 1993-10-12 Robert Massen Method and arrangement for automatic optical classification of plants
EP0665673A3 (en) * 1994-02-01 1996-06-12 Dainippon Screen Mfg Method and apparatus for producing a halftone image using a threshold matrix.
US6587701B1 (en) * 1997-04-03 2003-07-01 Miroslaw F. Stranc Method of assessing tissue viability using near-infrared spectroscopy
US7039228B1 (en) * 1999-11-19 2006-05-02 Rudolph Technologies, Inc. System and method for three-dimensional surface inspection
US6937348B2 (en) 2000-01-28 2005-08-30 Genex Technologies, Inc. Method and apparatus for generating structural pattern illumination
US7035430B2 (en) * 2000-10-31 2006-04-25 Hitachi Kokusai Electric Inc. Intruding object detection method and intruding object monitor apparatus which automatically set a threshold for object detection
JP3884321B2 (en) * 2001-06-26 2007-02-21 オリンパス株式会社 3D information acquisition apparatus, projection pattern in 3D information acquisition, and 3D information acquisition method
US6694159B2 (en) * 2001-07-16 2004-02-17 Art, Advanced Research Technologies Inc. Choice of wavelengths for multiwavelength optical imaging
US7079289B2 (en) * 2001-10-01 2006-07-18 Xerox Corporation Rank-order error diffusion image processing
US6956350B2 (en) * 2002-06-14 2005-10-18 Texas Instruments Incorporated Resonant scanning mirror driver circuit
US7113651B2 (en) 2002-11-20 2006-09-26 Dmetrix, Inc. Multi-spectral miniature microscope array
US7538761B2 (en) * 2002-12-12 2009-05-26 Olympus Corporation Information processor
DE10304111B4 (en) * 2003-01-31 2011-04-28 Sirona Dental Systems Gmbh Recording method for an image of a recording object
US8224064B1 (en) * 2003-05-21 2012-07-17 University Of Kentucky Research Foundation, Inc. System and method for 3D imaging using structured light illumination
US7349104B2 (en) 2003-10-23 2008-03-25 Technest Holdings, Inc. System and a method for three-dimensional imaging systems
JP4293044B2 (en) * 2004-04-21 2009-07-08 ブラザー工業株式会社 Image forming apparatus, image forming method, and image forming program
US8224425B2 (en) 2005-04-04 2012-07-17 Hypermed Imaging, Inc. Hyperspectral imaging in diabetes and peripheral vascular disease
US7489804B2 (en) * 2005-09-26 2009-02-10 Cognisign Llc Apparatus and method for trajectory-based identification of digital data content
US20070115484A1 (en) * 2005-10-24 2007-05-24 Peisen Huang 3d shape measurement system and method including fast three-step phase shifting, error compensation and calibration
US20070285554A1 (en) * 2005-10-31 2007-12-13 Dor Givon Apparatus method and system for imaging
JP4723362B2 (en) 2005-11-29 2011-07-13 株式会社日立ハイテクノロジーズ Optical inspection apparatus and method
US7463349B1 (en) 2006-06-02 2008-12-09 Kla-Tencor Technologies Corp. Systems and methods for determining a characteristic of a specimen
WO2008027569A2 (en) * 2006-09-01 2008-03-06 Agr International, Inc. In-line inspection system for vertically profiling plastic containers using multiple wavelength discrete spectral light sources
JP2010517460A (en) 2007-01-29 2010-05-20 ジョンイル パク Multispectral image acquisition method and apparatus
AU2008261138B2 (en) * 2008-12-19 2011-08-18 Canon Kabushiki Kaisha Measure display SFR using a camera and phase shifting
IT1393542B1 (en) * 2009-03-30 2012-04-27 Milano Politecnico PHOTO-DETECTOR AND METHOD FOR REVEALING OPTICAL RADIATION
US8908958B2 (en) * 2009-09-03 2014-12-09 Ron Kimmel Devices and methods of generating three dimensional (3D) colored models
WO2011054083A1 (en) * 2009-11-04 2011-05-12 Technologies Numetrix Inc. Device and method for obtaining three-dimensional object surface data
GB0921477D0 (en) * 2009-12-08 2010-01-20 Moor Instr Ltd Apparatus for measuring blood parameters
CN102484687B (en) * 2009-12-08 2016-03-23 惠普开发有限公司 For compensating the method for the crosstalk in 3-D display
JP5451552B2 (en) * 2010-08-09 2014-03-26 オリンパス株式会社 Microscope system, specimen observation method and program

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4846577A (en) * 1987-04-30 1989-07-11 Lbp Partnership Optical means for making measurements of surface contours
US6031233A (en) * 1995-08-31 2000-02-29 Infrared Fiber Systems, Inc. Handheld infrared spectrometer
US6297825B1 (en) * 1998-04-06 2001-10-02 Synapix, Inc. Temporal smoothing of scene analysis data for image sequence generation
US20020080136A1 (en) * 2000-10-26 2002-06-27 Cyriaque Kouadio Surface shading using stored texture map based on bidirectional reflectance distribution function
US20040135718A1 (en) * 2001-05-11 2004-07-15 Magnus Herberthson Method for determining position and velocity of targets from signals scattered by the targets
US6995762B1 (en) * 2001-09-13 2006-02-07 Symbol Technologies, Inc. Measurement of dimensions of solid objects from two-dimensional image(s)
US7433540B1 (en) * 2002-10-25 2008-10-07 Adobe Systems Incorporated Decomposing natural image sequences
US20050285860A1 (en) * 2004-06-18 2005-12-29 Hanspeter Pfister Scene reflectance functions under natural illumination
US20060044144A1 (en) * 2004-08-28 2006-03-02 Landon Duval Substance detection and alarm using a spectrometer built into a steering wheel assembly
US20070268366A1 (en) * 2006-05-17 2007-11-22 Ramesh Raskar System and method for sensing geometric and photometric attributes of a scene with multiplexed illumination and solid state optical devices
US20100046005A1 (en) * 2006-10-16 2010-02-25 Fraunhofer-Gesellschaft Zur Forderung Der Angewand Ten Forschung E.V. Electrostatice chuck with anti-reflective coating, measuring method and use of said chuck
US20090033753A1 (en) * 2007-07-31 2009-02-05 Daisuke Sato Imaging apparatus, imaging method, storage medium, and integrated circuit
US20090290811A1 (en) * 2008-05-23 2009-11-26 Samsung Electronics Co., Ltd. System and method for generating a multi-dimensional image
US20100008588A1 (en) * 2008-07-08 2010-01-14 Chiaro Technologies LLC Multiple channel locating
US20110187715A1 (en) * 2008-09-25 2011-08-04 Dolby Laboratories Licensing Corporation Improved Illumination and Light Recycling in Projection Systems
US20120035900A1 (en) * 2010-08-06 2012-02-09 Raytheon Company Remote material identification process performance prediction tool
US20120056994A1 (en) * 2010-08-30 2012-03-08 University Of Southern California Single-shot photometric stereo by spectral multiplexing
US20120087009A1 (en) * 2010-10-12 2012-04-12 Toyota Motor Engineering & Manufacturing North America, Inc. Semi-Transparent Reflectors
US20130129208A1 (en) * 2011-11-21 2013-05-23 Tandent Vision Science, Inc. Color analytics for a digital image
US20130141530A1 (en) * 2011-12-05 2013-06-06 At&T Intellectual Property I, L.P. System and Method to Digitally Replace Objects in Images or Video

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Classifying Materials from their Reflectance Properties - Peter Nillius, www.nada.kth.se/~nillius/publications/nillius_class_mat.pdf (2004) *

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9218690B2 (en) * 2012-08-29 2015-12-22 Ge Aviation Systems, Llc Method for simulating hyperspectral imagery
US20140063009A1 (en) * 2012-08-29 2014-03-06 Ge Aviation Systems Llc Method for simulating hyperspectral imagery
US20150022632A1 (en) * 2013-07-16 2015-01-22 Texas Instruments Incorporated Hierarchical Binary Structured Light Patterns
US20150022697A1 (en) * 2013-07-16 2015-01-22 Texas Instruments Incorporated Projector Auto-Focus Correction with the Aid of a Camera
US9609306B2 (en) * 2013-07-16 2017-03-28 Texas Instruments Incorporated Hierarchical binary structured light patterns
US10542248B2 (en) 2013-07-16 2020-01-21 Texas Instruments Incorporated Hierarchical binary structured light patterns
US9774833B2 (en) * 2013-07-16 2017-09-26 Texas Instruments Incorporated Projector auto-focus correction with the aid of a camera
US10648921B2 (en) 2014-06-09 2020-05-12 Keyence Corporation Image inspection apparatus, image inspection method, image inspection program, computer-readable recording medium and recording device
US10139350B2 (en) * 2014-06-09 2018-11-27 Keyence Corporation Image inspection apparatus, image inspection method, image inspection program, computer-readable recording medium and recording device
US20170186152A1 (en) * 2014-06-09 2017-06-29 Keyence Corporation Image Inspection Apparatus, Image Inspection Method, Image Inspection Program, Computer-Readable Recording Medium And Recording Device
JP2016008954A (en) * 2014-06-26 2016-01-18 株式会社豊田中央研究所 Object shape estimation apparatus and program
EP3034992A2 (en) 2014-12-16 2016-06-22 Commissariat A L'energie Atomique Et Aux Energies Alternatives Structured light projector and three-dimensional scanner comprising such a projector
CN104897616A (en) * 2015-05-26 2015-09-09 北京理工大学 Method and system for measuring multispectral bidirectional reflectance distribution function of sample of any shape
US10218883B2 (en) * 2015-07-07 2019-02-26 The Board Of Regents Of The University Of Texas System Digital imaging and analysis system
US20170131091A1 (en) * 2015-11-10 2017-05-11 Canon Kabushiki Kaisha Measuring surface geometry using illumination direction coding
US11012678B2 (en) * 2016-02-05 2021-05-18 Vatech Co., Ltd. Scanning an object in three dimensions using color dashed line pattern
US11686573B2 (en) * 2016-10-13 2023-06-27 Lmi Technologies Inc. Fringe projection for in-line inspection
US20180106593A1 (en) * 2016-10-13 2018-04-19 Lmi Technologies Inc. Fringe projection for in-line inspection
CN107271445A (en) * 2017-05-16 2017-10-20 广州视源电子科技股份有限公司 A kind of defect inspection method and device
US10458784B2 (en) * 2017-08-17 2019-10-29 Carl Zeiss Industrielle Messtechnik Gmbh Method and apparatus for determining at least one of dimensional or geometric characteristics of a measurement object
US20190056218A1 (en) * 2017-08-17 2019-02-21 Carl Zeiss Industrielle Messtechnik Gmbh Method and apparatus for determining at least one of dimensional or geometric characteristics of a measurement object
US11733524B1 (en) * 2018-02-01 2023-08-22 Meta Platforms Technologies, Llc Depth camera assembly based on near infra-red illuminator
CN110503712A (en) * 2018-05-18 2019-11-26 广达电脑股份有限公司 Improve the method and device for rebuilding threedimensional model efficiency
US10740916B2 (en) * 2018-05-18 2020-08-11 Quanta Computer Inc. Method and device for improving efficiency of reconstructing three-dimensional model
US20190355137A1 (en) * 2018-05-18 2019-11-21 Quanta Computer Inc. Method and device for improving efficiency of reconstructing three-dimensional model
US11709364B1 (en) 2019-05-22 2023-07-25 Meta Platforms Technologies, Llc Addressable crossed line projector for depth camera assembly
US11481953B2 (en) * 2019-05-28 2022-10-25 Advanced Micro Devices, Inc. Command processor based multi dispatch scheduler
US20230143816A1 (en) * 2021-11-10 2023-05-11 Ford Global Technologies, Llc Image relighting
US20230147607A1 (en) * 2021-11-10 2023-05-11 Ford Global Technologies, Llc Single-perspective image relighting
US11756261B2 (en) * 2021-11-10 2023-09-12 Ford Global Technologies, Llc Single-perspective image relighting
US11776200B2 (en) * 2021-11-10 2023-10-03 Ford Global Technologies, Llc Image relighting
US11830130B1 (en) * 2023-05-05 2023-11-28 Illuscio, Inc. Systems and methods for removing lighting effects from three-dimensional models

Also Published As

Publication number Publication date
US9325966B2 (en) 2016-04-26
US20140028800A1 (en) 2014-01-30

Similar Documents

Publication Publication Date Title
US9325966B2 (en) Depth measurement using multispectral binary coded projection and multispectral image capture
CN109477710B (en) Reflectance map estimation for point-based structured light systems
Shi et al. Self-calibrating photometric stereo
Goesele et al. Disco: acquisition of translucent objects
US10012496B2 (en) Multispectral binary coded projection using multiple projectors
CA2378867C (en) Method and system for measuring the relief of an object
Liao et al. Interreflection removal for photometric stereo by using spectrum-dependent albedo
TW201415863A (en) Techniques for generating robust stereo images
Douxchamps et al. High-accuracy and robust localization of large control markers for geometric camera calibration
CN113108720B (en) Surface three-dimensional reconstruction method based on linearly polarized light and stripe reflection
CN109191560B (en) Monocular polarization three-dimensional reconstruction method based on scattering information correction
US9958259B2 (en) Depth value measurement
Santoši et al. Evaluation of synthetically generated patterns for image-based 3D reconstruction of texture-less objects
CN113533256A (en) Method, device and equipment for determining spectral reflectivity
Chiang et al. Active stereo vision system with rotated structured light patterns and two-step denoising process for improved spatial resolution
CN113409379B (en) Method, device and equipment for determining spectral reflectivity
Lyu et al. Validation of physics-based image systems simulation with 3-D scenes
Frangez et al. Surface finish classification using depth camera data
Li et al. Spectral MVIR: Joint reconstruction of 3D shape and spectral reflectance
Ikeuchi et al. Active lighting and its application for computer vision
Sole et al. Image based reflectance measurement based on camera spectral sensitivities
Berssenbrügge et al. Characterization of the 3D resolution of topometric sensors based on fringe and speckle pattern projection by a 3D transfer function
Langmann Wide area 2D/3D imaging: development, analysis and applications
Wang et al. BRDF invariant stereo using light transport constancy
Moreno et al. Three-dimensional measurement of light-emitting diode radiation pattern: a rapid estimation

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TIN, SIU-KEI;REEL/FRAME:030022/0328

Effective date: 20130315

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION