CN100452819C - Image reading device and image forming device


Info

Publication number
CN100452819C
CN100452819C, CNB2006100826093A, CN200610082609A
Authority
CN
China
Prior art keywords
optics
light
unit
image
driver element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CNB2006100826093A
Other languages
Chinese (zh)
Other versions
CN1949817A (en)
Inventor
市川裕一
仲谷文雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fuji Xerox Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Xerox Co Ltd
Publication of CN1949817A
Application granted
Publication of CN100452819C

Landscapes

  • Image Input (AREA)
  • Exposure Or Original Feeding In Electrophotography (AREA)
  • Optical Systems Of Projection Type Copiers (AREA)
  • Control Or Security For Electrophotography (AREA)
  • Facsimile Scanning Arrangements (AREA)

Abstract

An image reading device and an image forming device are disclosed. The device includes a light source; a first guiding unit that guides the light from the light source to an object at prescribed incident angles including a first incident angle and a second incident angle; a signal generating unit that receives light and generates an image signal based on the received light; a second guiding unit that guides the light reflected from the object to the signal generating unit; and a control unit that controls the first guiding unit to guide the light from the light source to the object at at least two different incident angles including the first incident angle and the second incident angle, and controls the signal generating unit to generate image signals for the at least two different incident angles.

Description

Image reading device and image forming device
Technical field
The present invention relates to an image reading device and an image forming device that obtain information about the texture of an object, and more specifically to an image reading device and an image forming device that obtain information about the glossiness and surface irregularity of an object.
Background technology
Object surfaces have many different textures and colors. The texture of an object includes its glossiness and its surface irregularity. For example, the surface of polished metal has a smooth, glossy texture, while the surface of cloth or fabric has a distinctive uneven texture caused by its warp and weft threads.
Figure 18 shows how light is reflected from an object. It is commonly understood that when light strikes an object surface at an incident angle θ1, it is reflected at a reflection angle θ2 equal to θ1 (the law of reflection). In practice, however, light is not reflected from the object surface only at the reflection angle θ2, but also at various other angles.
This is because the reflecting surface (the object surface) is not always flat and has a certain degree of irregularity. When the reflecting surface is irregular, the light is reflected at multiple angles.
In the present invention, "specular reflection" refers to reflection from a macroscopic reflecting surface at a reflection angle substantially equal to the incident angle, and "specular light" refers to light reflected in this way. "Diffuse reflection" refers to all reflection from the macroscopic reflecting surface other than specular reflection, and "diffuse light" refers to light reflected in this way.
In the drawings, where specular light and diffuse light need to be distinguished, the symbol Lsr is attached to optical paths of specular light and the symbol Ldr is attached to optical paths of diffuse light.
The glossiness of an object is conventionally expressed by the ratio of the intensity of the specular component to that of the diffuse component of the light reflected from the object. For light reflected from a polished metal surface, for example, this ratio is relatively high, because the polished metal surface has a smooth, glossy texture. Conversely, for light reflected diffusely from a low-gloss object such as cloth or fabric, the ratio is relatively low. The glossiness of an object can therefore be read by measuring the ratio of specular light to diffuse light in the light reflected from the object.
However, the intensity of the light specularly reflected from an object often exceeds the dynamic range of the image input element of an ordinary optical image reading device. The optical guiding unit is therefore designed so that reception of specular light from the object is minimized and reception of diffuse light is maximized. Because the reflected light received by an ordinary optical image reading device with this design consists mostly of diffuse light, such a device cannot properly read the glossiness of an object.
To read the glossiness of an object, a configuration is needed that receives both the diffuse light and the specular light from the object and can obtain the glossiness from the respective reflection components. For example, the object can be illuminated by a light source to read an image consisting mainly of diffuse light (a diffuse reflection image), and then illuminated by a light source to read an image consisting mainly of specular light (a specular reflection image); a glossiness signal representing the glossiness can then be generated from these image signals.
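To illustrate this idea only (the patent does not specify any particular formula), the following is a minimal sketch in Python that derives a per-pixel glossiness signal from an assumed pair of aligned grayscale arrays, one from the diffuse reflection image and one from the specular reflection image:

```python
import numpy as np

def glossiness_signal(diffuse, specular, eps=1e-6):
    """Estimate glossiness per pixel as the ratio of the specular to the
    diffuse reflection intensity (an assumed measure, not the patent's).

    diffuse, specular: float arrays of identical shape, values in [0, 1].
    """
    diffuse = np.asarray(diffuse, dtype=float)
    specular = np.asarray(specular, dtype=float)
    # A high ratio (polished metal) indicates a glossy surface;
    # a low ratio (cloth or fabric) indicates a matte surface.
    return specular / (diffuse + eps)

# Example: a glossy pixel followed by a matte pixel
print(glossiness_signal(np.array([0.2, 0.6]), np.array([0.9, 0.1])))
```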
The irregularity of an object appears as shadows on the object, and shadows appear more readily as the incident angle of the light increases. For example, as shown in Figure 19, light striking a protrusion on the object at an incident angle θ11 casts a shadow over a region S1, while light striking the same protrusion at an incident angle θ12 (> θ11) casts a shadow over a region S2, which is larger than S1. The irregularity of the object therefore appears more clearly as the incident angle increases.
Therefore, to read the irregularity of an object, a configuration is needed in which the object is read at two different incident angles, namely a first incident angle and a second, larger incident angle. When the object is illuminated at the first incident angle, the light reflected from the object mainly represents the color of the object, based on its diffuse reflection component. When the object is illuminated at the second incident angle, the reflected light mainly represents the irregularity, based on the unevenness of the object surface. When an image is formed based on these two kinds of reflected light, both the color of the object and the irregularity of its surface can be reproduced.
As shown in the partial sectional view of the image reading device in Figure 20, this requires a light source 111 for illuminating the object O at the first incident angle θ11 and a light source 112 for illuminating the object O at the second incident angle θ12.
However, providing two light sources in an image reading device as described above requires more space and increases cost. It is therefore desirable to provide only one light source and to move it between the positions 111 and 112 shown in Figure 20. In that case, however, the light source must also move vertically (between the top and bottom of the image reading device). Image reading devices are usually required to be as compact as possible in the vertical direction, so this requirement for vertical movement of the light source is a problem.
Furthermore, if the irregularity of the object surface is very small, illumination at a given incident angle may not be sufficient to read the irregularity. In that case, shadows of sufficient size do not appear unless the second incident angle is increased further. To read the irregularity more clearly, it is therefore preferable to provide three or more light sources and to use them according to the irregularity of the object surface. However, in an image reading device that must be as small as possible in the vertical direction as described above, finding space to install three or more light sources is very difficult.
Summary of the invention
According to an aspect of the present invention, an image reading device is provided that includes: a light source; a first guiding unit that guides the light from the light source to an object at prescribed incident angles including a first incident angle and a second incident angle; a signal generating unit that receives light and generates an image signal based on the received light; a second guiding unit that guides the light reflected from the object to the signal generating unit; and a control unit that controls the first guiding unit to guide the light from the light source to the object at at least two different incident angles including the first incident angle and the second incident angle, and controls the signal generating unit to generate image signals for the at least two different incident angles.
Description of the drawings
Exemplary embodiments of the present invention will be described in detail based on the following drawings, in which:
Fig. 1 is a functional block diagram of an image forming device according to a first exemplary embodiment of the present invention;
Fig. 2 is a diagram showing the structure of the image forming device;
Fig. 3 is a diagram showing the structure of a full-rate carriage unit of the image forming device;
Fig. 4 is a diagram showing an example of the structure of a drive system of the full-rate carriage unit;
Fig. 5 is a diagram showing another example of the structure of the drive system of the full-rate carriage unit;
Fig. 6 is a diagram showing an input image of an object (fabric) obtained by a scanning operation at a 45° incident angle;
Fig. 7 is a diagram showing an input image of the same object obtained by a scanning operation at a 65° incident angle;
Fig. 8 is a diagram showing a composite image combining the input image obtained by the scanning operation at the 45° incident angle and the input image obtained by the scanning operation at the 65° incident angle;
Fig. 9 is a diagram showing the structure of an image reading device according to a second exemplary embodiment of the present invention;
Fig. 10 is a diagram showing a structure of the full-rate carriage unit according to this exemplary embodiment;
Fig. 11 is a diagram showing another structure of the full-rate carriage unit according to this exemplary embodiment;
Fig. 12 is a diagram showing a structure of the full-rate carriage unit according to a modified example of this exemplary embodiment;
Fig. 13 is a diagram showing another structure of the full-rate carriage unit according to the modified example of this exemplary embodiment;
Fig. 14 is a diagram showing a structure of the full-rate carriage unit according to a third exemplary embodiment of the present invention;
Fig. 15 is a diagram showing another structure of the full-rate carriage unit according to this exemplary embodiment;
Fig. 16 is a diagram showing a structure of the full-rate carriage unit according to a modified example of this exemplary embodiment;
Fig. 17 is a diagram showing another structure of the full-rate carriage unit according to the modified example of this exemplary embodiment;
Fig. 18 is a conceptual diagram showing how light is reflected from an object;
Fig. 19 is a diagram showing the relationship between the incident angle and the shadow on an object; and
Fig. 20 is a diagram showing an example of the structure of a full-rate carriage unit having two light sources.
Detailed description
A. First exemplary embodiment
A-1. Image forming device
Fig. 1 is a functional block diagram of the image forming device 1 according to the first exemplary embodiment of the present invention. The image forming device 1 has an image reading unit 10, an image forming unit 20, a control unit 30, a memory unit 40, an image processing unit 50, an operating unit 60 and a data input/output unit 70. The control unit 30 is a computing device provided with a CPU (central processing unit), a RAM (random access memory), a ROM (read-only memory) and the like (not shown), and controls the operation of the various units of the image forming device 1 by executing programs stored in the memory unit 40. The memory unit 40 is configured as a mass storage device such as an HDD (hard disk drive) and stores these programs.
The image reading unit 10 optically reads the surface of an object such as paper or fabric and generates and outputs an image signal corresponding to the texture of the surface. The image processing unit 50 has a plurality of image processing circuits such as ASICs (application-specific integrated circuits) or LSIs (large-scale integrated circuits), an image memory for temporarily storing image data, and so on, and each image processing circuit performs a different kind of image processing. Specifically, the image processing unit 50 performs prescribed image processing on the image signal generated by the image reading unit 10, generates image data, and outputs the image data to the image forming unit 20. The image forming unit 20 forms a toner image based on this image data on a recording sheet such as recording paper. The operating unit 60 is provided with a touch-panel display, various buttons and the like, accepts input instructions from the operator, and supplies those instructions to the control unit 30. The data input/output unit 70 is an interface device for exchanging data with external devices.
Fig. 2 is a diagram showing the structure of the image forming device 1.
The image reading unit 10 has a full-rate carriage unit 110, a half-rate carriage unit 120, a focusing lens 130, an inline sensor 140, a platen glass 150 and a platen cover 160. The full-rate carriage unit 110 optically reads the surface of the object O while being moved at a speed v in the direction of arrow C (the main scanning direction) by a drive unit such as a motor (not shown). The half-rate carriage unit 120 has mirrors 121 and 122 and guides the light from the full-rate carriage unit 110 to the focusing lens 130. The half-rate carriage unit 120 is moved by a drive unit such as a motor (not shown) in the same direction (the scanning direction) as the full-rate carriage unit 110 at half its speed (that is, v/2).
The focusing lens 130 is placed on the optical path connecting the mirror 122 and the inline sensor 140, and forms an image of the light reflected from the object O at the light-receiving position of the inline sensor 140. Depending on the required performance, the focusing lens 130 is composed of, for example, four to eight compound lenses. In this exemplary embodiment, the mirrors, lenses and other components arranged along the optical path of the reflected light are collectively referred to as a "guiding unit".
The inline sensor 140 is a signal generating unit that receives the imaged light and generates and outputs an image signal corresponding to the received light; it is, for example, a multi-line CCD linear image sensor (image input element) with on-chip color filters. In this exemplary embodiment an image sensor capable of capturing images in the three colors B (blue), G (green) and R (red) is used, and the inline sensor 140 outputs image signals of these three colors.
The platen glass 150, on which the object O is placed, is a flat, transparent glass plate. Reflection-suppressing layers (for example, multilayer dielectric films) are formed on both sides of the platen glass 150 to reduce reflections at its surfaces. The platen cover 160 is provided so as to cover the platen glass 150, blocking outside light and making the object O placed on the platen glass 150 easier to read.
With this structure, in the image reading unit 10 the full-rate carriage unit 110 illuminates the object O placed on the platen glass 150, and the inline sensor 140 receives the light reflected from the object O via the mirrors 121 and 122. The inline sensor 140 generates image signals of the three colors B (blue), G (green) and R (red) in response to the received reflected light and outputs them to the image processing unit 50. Based on these image signals, the image processing unit 50 generates image data that has undergone shading correction, color correction and various other corrections and calculations, and supplies it to the image forming unit 20.
A-2. Full-rate carriage unit
Fig. 3 is a diagram showing the full-rate carriage unit 110 in detail. As shown in Fig. 3, the full-rate carriage unit 110 has a tubular light source 111, two cylindrical convex lenses 112 and 113, a housing 114, a movable mirror 115 and a fixed mirror 116. The tubular light source 111 is, for example, a halogen lamp or a xenon fluorescent lamp. The pair of cylindrical convex lenses 112 and 113, arranged with their convex surfaces facing each other, converts the light from the tubular light source 111 into approximately parallel light. The housing 114 covers the tubular light source 111 and the cylindrical convex lenses 112 and 113 to prevent light from leaking.
The movable mirror 115 has a reflecting surface that reflects the light from the tubular light source 111 toward the object O; this light is then reflected by the object O. The fixed mirror 116 reflects the light reflected from the object O toward the half-rate carriage unit 120. The movable mirror 115 is configured so that its orientation and its horizontal position in the figure (along the main scanning direction) can be changed. There is, however, a fixed relationship between the orientation and the position of the movable mirror 115. Specifically, even when the orientation of the movable mirror 115 is changed, if the full-rate carriage unit 110 is at a predetermined position (for example, the scanning start position), the movable mirror 115 is configured to move to a position at which the light reflected by its reflecting surface illuminates a fixed position on the object O.
As shown in the example in Fig. 3, even when the orientation of the movable mirror 115 is changed and the incident angle on the object O changes accordingly, the position of the movable mirror 115 in the scanning direction changes with the incident angle so that the light always illuminates the fixed position O' on the object. Specifically, the incident angle on the object O is θ1 when the movable mirror 115 is at position A, θ2 when it is at position B, and θ3 when it is at position C (θ1 < θ2 < θ3). In other words, as the orientation of the movable mirror 115 changes and the incident angle increases, the movable mirror 115 moves to a position farther from the light source 111. The illuminated position on the object O is kept fixed even when the incident angle changes because, if the illuminated position moved in response to a change in the incident angle, the full-rate carriage unit 110 itself would have to be moved along the main scanning direction at the start of scanning to bring it to the prescribed scanning start position.
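The patent gives no dimensions, but a simplified geometry illustrates why a larger incident angle places the movable mirror 115 farther from the illuminated point O': if the reflection point on the mirror sits at an assumed height h above the platen and redirects the beam onto O' at an angle θ from the surface normal, the horizontal offset between the mirror and O' is h·tan θ. A minimal sketch under that assumption:

```python
import math

def mirror_offset_mm(theta_deg: float, h_mm: float = 30.0) -> float:
    """Horizontal distance from the fixed point O' to the mirror's reflection
    point, assuming the mirror sits h_mm above the platen (assumed value) and
    sends the beam to O' at theta_deg from the surface normal."""
    return h_mm * math.tan(math.radians(theta_deg))

for theta in (45, 65):  # the two incident angles used by the reading modes below
    print(theta, round(mirror_offset_mm(theta), 1))  # 45 -> 30.0, 65 -> ~64.3
```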
Since the incident angle on the object O can be varied by changing the orientation of the movable mirror 115, an image representing both the color and the surface texture of the object O can be obtained, for example, as follows: the object O is illuminated at a first incident angle during a first scan and at a second incident angle, larger than the first, during a second scan, and an image is produced on a recording sheet based on the image signals obtained by the inline sensor 140 for each scan.
Next, an adjusting unit for adjusting the orientation of the movable mirror 115 and its position along the scanning direction is described.
Fig. 4 shows one example of this adjusting unit. The full-rate carriage unit 110 shown in Fig. 4 has a first drive unit for rotating the movable mirror 115 and a second drive unit for moving the movable mirror 115 along the direction perpendicular to the main scanning direction (hereinafter referred to as the sub-scanning direction).
The structure of the first drive unit is as follows. The movable mirror 115 is fixed to a movable mirror support 117, and the movable mirror support 117 has a shaft 201 that extends perpendicular to the plane of Fig. 4 and parallel to the reflecting surface. The shaft 201 is inserted into a hole in a housing member (not shown) of the full-rate carriage unit 110 and is rotatably supported. A belt 202 is stretched over the shaft 201. When a roller 203 holding the belt 202 is rotated about a rotary shaft 204 by a motor, the movable mirror support 117 rotates about the shaft 201, and the orientation of the movable mirror 115 is thereby changed.
The structure of the second drive unit is as follows. The rotary shaft 204 is supported by a support member 205 attached to a stage 206. The stage 206 has a hole (not shown) whose inner surface is threaded, and this thread engages with a thread formed on the outer peripheral surface of a shaft 207. When the shaft 207 rotates, the stage 206 moves in the horizontal direction in the figure (along the sub-scanning direction), and the movable mirror support 117 on the stage moves along the sub-scanning direction with it.
The image reading unit 10 operates in two modes: a first mode for obtaining the color of the object O and a second mode for obtaining the texture of the object O. A movable mirror control unit 209 switches the incident angle on the object O according to the mode; for example, the object O is illuminated at an incident angle of 45° in the first mode and at 65° in the second mode. Specifically, for each mode the movable mirror control unit 209 stores the incident angle in association with the position of the movable mirror 115 in the sub-scanning direction and the orientation that realize that incident angle. When the control unit 30 has determined the incident angle by specifying the first mode or the second mode, the movable mirror control unit 209 controls the motor 208 to drive the roller 203 so that the movable mirror 115 takes the position and orientation that realize that incident angle.
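As an illustration only, a minimal sketch of the kind of mode table the movable mirror control unit 209 could hold; the numeric positions and orientations are hypothetical, since the patent states only that the association is stored:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MirrorSetpoint:
    incident_angle_deg: float  # angle realized on the object O
    position_mm: float         # sub-scanning position (hypothetical value)
    tilt_deg: float            # mirror orientation (hypothetical value)

# First mode reads color (45 deg); second mode reads texture (65 deg).
SETPOINTS = {
    "first_mode": MirrorSetpoint(45.0, 30.0, 22.5),
    "second_mode": MirrorSetpoint(65.0, 64.3, 32.5),
}

def setpoint_for(mode: str) -> MirrorSetpoint:
    """Look up the stored position/orientation; a real controller would then
    drive the motor 208 and roller 203 until this setpoint is reached."""
    return SETPOINTS[mode]

print(setpoint_for("second_mode"))
```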
Fig. 5 shows another example of the adjusting unit.
As shown in Fig. 5, the movable mirror 115 is fixed to the movable mirror support 117, and the movable mirror support 117 is provided with a single shaft 301 and with two projecting pins 302 and 303 on its side. Specifically, the pin 302 is arranged near the top edge of the movable mirror support 117, the pin 303 near the bottom edge, and the shaft 301 near the center. The housing member of the full-rate carriage unit 110 is provided with guide grooves 401, 402 and 403 into which the shaft 301 and the pins 302 and 303 are respectively inserted. Since the orientation of a plane (the reflecting surface) is determined by two lines on that plane, when the shaft 301 and the pins 302 and 303 are inserted into the guide grooves 401, 402 and 403 and the movable mirror support 117 is fixed at a particular position, the orientation of the movable mirror 115 is uniquely determined.
As shown in the figure, the guide grooves 401, 402 and 403 extend in different directions, so that the position and orientation of the movable mirror support 117 change continuously. Moreover, the directions of the guide grooves 401, 402 and 403 are determined so that the movable mirror 115 illuminates the fixed position on the object O even when the orientation of the movable mirror support 117 changes.
The shaft 301 is rotatably supported by a support member 304. The support member 304 has a hole whose inner surface is threaded, and this thread engages with a thread formed on the outer peripheral surface of a shaft 305. When the shaft 305 rotates, the support member 304 moves in the horizontal direction in the figure (along the sub-scanning direction). When the orientation of the movable mirror 115 is determined as described above, its position is also uniquely determined. A movable mirror control unit 307 therefore stores the incident angle of each mode (that is, the orientation of the movable mirror 115) in association with the position of the movable mirror 115 in the sub-scanning direction that realizes that incident angle; when a reading mode is specified and the incident angle is determined, the movable mirror control unit 307 drives a motor 306 so that the movable mirror 115 takes the position that realizes that incident angle.
A-3. Generating image data
As described above, the full-rate carriage unit 110 illuminates the object O and obtains information from it. This operation is hereinafter referred to as a "scanning operation". More specifically, a scanning operation in which the object O is illuminated at a 45° incident angle is called a "45° incident-angle scanning operation", and one in which the object O is illuminated at a 65° incident angle is called a "65° incident-angle scanning operation".
The image reading unit 10 performs two types of scanning operation: a scanning operation at a 45° incident angle in the first reading mode and a scanning operation at a 65° incident angle in the second reading mode. It combines the image signals obtained by these scanning operations and generates image data. The image data obtained in this way represents the color and the texture of the object O. The following description uses a piece of fabric as an example of the object O.
First, Fig. 6 shows an input image P45 obtained by performing the 45° incident-angle scanning operation on the object O (fabric) in the first reading mode. The input image P45 is displayed in color and clearly represents the color (pattern) of the object O. In other words, a 45° incident angle can be considered suitable for reading the color and pattern of the object O.
Next, Fig. 7 shows an input image P65 obtained by performing the 65° incident-angle scanning operation on the same object O (fabric) in the second reading mode. The input image P65 is a black-and-white image that contains no color. Comparing the input image P65 with the input image P45 of Fig. 6, black regions appear on the object O in the input image P65. These black regions are shadows produced by the light illuminating the object O because of the unevenness of the surface of the object O. In other words, as described with reference to Fig. 19, because the incident angle θ12 of about 65° is larger than the incident angle θ11 = 45°, the unevenness of the surface of the object O produces more shadows. A scanning operation at a larger incident angle can therefore be said to be better suited to reading the unevenness of the object O, that is, its texture. If, however, the incident angle is increased beyond about 80°, the shadow areas caused by the larger surface features of the object O become so large that detailed texture information is lost. Excessively increasing the incident angle also causes the amount of light reaching a unit area of the surface of the object O to drop noticeably. An incident angle between 60° and 70° is therefore appropriate for reading texture.
Fig. 8 shows a composite image P in which the input image P45 of Fig. 6 and the input image P65 of Fig. 7 are combined. As described above, the input image P45 clearly represents the color of the object O and the input image P65 represents its texture, so the composite image P can be said to represent both the color and the texture of the object O.
One concrete method of generating the image data is as follows.
First, the image reading unit 10 performs the scanning operation at a 45° incident angle in the first reading mode. Specifically, the movable mirror control unit 209 of the full-rate carriage unit 110 adjusts the movable mirror 115 to the position and orientation at which the incident angle on the object O is 45°. When this adjustment is complete, the full-rate carriage unit 110 moves in the direction of arrow C in Fig. 2 while the tubular light source 111 emits light. The entire surface of the object O is thus optically scanned, and the reflected light is read by the inline sensor 140. The image processing unit 50 obtains from the inline sensor 140 an image signal based on the diffuse light (a first image signal), and the signal values of the first image signal are stored in the image memory of the image processing unit 50.
Next, the image reading unit 10 performs the scanning operation at a 65° incident angle. Specifically, the movable mirror control unit 209 of the full-rate carriage unit 110 adjusts the movable mirror 115 to the position and orientation at which the incident angle is 65°. When this adjustment is complete, the full-rate carriage unit 110 moves in the direction of arrow C in Fig. 2 while the tubular light source 111 emits light. The entire surface of the object O is thus optically scanned, and the reflected light is read by the inline sensor 140. Through this process, the image processing unit 50 obtains from the inline sensor 140 an image signal based on the diffuse light (a second image signal), and the signal values of the second image signal are stored in the image memory of the image processing unit 50.
Next, the image processing unit 50 reads the signal values of the second image signal from the image memory, converts them into signal values representing a black-and-white (achromatic) image, and multiplies these signal values by a coefficient C (0 < C ≤ 1). The coefficient C is stored in the image processing unit 50. The larger the coefficient C, the more prominent the shadows on the object surface become. In other words, because the coefficient C acts as a weight on the texture represented by the second image signal, the image processing unit 50 can adjust the balance between the texture and the color of the object O by adjusting the coefficient C.
The image processing unit 50 then reads the signal values of the first image signal from the image memory and adds the product of the signal values of the second image signal and the coefficient to the signal values of the first image signal, thereby combining the two images. The image processing unit 50 performs the prescribed image processing on the signal values obtained in this way and finally outputs composite image data representing the composite image P. Color image data is thus generated in which a color image based on the first image signal is superimposed on a black-and-white image based on the second image signal.
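A minimal sketch of this compositing step, assuming the two scans are available as RGB arrays with values in [0, 1] and using an assumed luminance formula for the black-and-white conversion (the patent does not name one); the final clipping merely stands in for the prescribed image processing:

```python
import numpy as np

def composite(p45_rgb: np.ndarray, p65_rgb: np.ndarray, c: float = 0.5) -> np.ndarray:
    """Superimpose the 65-degree (texture) scan onto the 45-degree (color) scan.

    p45_rgb, p65_rgb: float arrays of shape (H, W, 3), values in [0, 1].
    c: weight of the texture signal, 0 < c <= 1 (larger c -> more prominent shadows).
    """
    # Convert the second image signal to an achromatic (black-and-white) signal.
    gray = p65_rgb @ np.array([0.299, 0.587, 0.114])   # assumed luminance weights
    # Add the weighted achromatic signal to every channel of the first signal.
    out = p45_rgb + c * gray[..., None]
    return np.clip(out, 0.0, 1.0)
```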
Note that either the 45° incident-angle scanning operation or the 65° incident-angle scanning operation may be performed first.
Once the image processing unit 50 has generated image data by the above process, the image forming unit 20 forms an image on a recording sheet based on this image data. The structure of the image forming unit 20 is now described with reference again to Fig. 2. As shown in Fig. 2, the image forming unit 20 has image forming units 210a, 210b, 210c and 210d corresponding to the colors Y (yellow), M (magenta), C (cyan) and K (black) respectively, an intermediate transfer belt 220, primary transfer rollers 230a, 230b, 230c and 230d, a secondary transfer roller 240, a backup roller 250, a paper feed unit 260 and a fixing (fusing) unit 270. The intermediate transfer belt 220 is an endless belt member moved in the direction of arrow B in the figure by a drive unit (not shown). The primary transfer rollers 230a, 230b, 230c and 230d press the intermediate transfer belt 220 against the photosensitive drums of the image forming units 210a, 210b, 210c and 210d. Toner images of the colors Y, M, C and K are formed on these photosensitive drums based on the image data and are transferred to the intermediate transfer belt 220. The secondary transfer roller 240 and the backup roller 250 press the intermediate transfer belt 220 against the recording paper at the position where the belt faces the recording paper, and transfer the toner image from the intermediate transfer belt 220 to the recording paper. The paper feed unit 260 has paper trays 261a and 261b that hold recording paper and feeds the recording paper during image formation. The fixing unit 270 has roller members for heating and pressing the recording paper, and fixes the toner image transferred to the surface of the recording paper with heat and pressure. In this way the image forming unit 20 forms an image on the recording paper based on the image data supplied by the image processing unit 50.
According to this exemplary embodiment, by changing the position and orientation of the movable mirror 115 that reflects the light from the tubular light source 111, the incident angle on the object O can be adjusted to an arbitrary value even though only a single light source is used. Because the movable mirror 115 moves in the scanning direction (horizontally), there is no need to secure a large vertical space in the image forming device, and in particular in the image reading device.
The image forming device 1 generates image data by combining a first image signal, obtained from the light reflected from the object O when it is illuminated at a 45° incident angle, with a second image signal, obtained from the light reflected from the object O when it is illuminated at a 65° incident angle, which is larger than 45°. The first image signal, obtained with light at a 45° incident angle, is mainly used to detect the color of the object O, and the second image signal, obtained with light at a 65° incident angle, is mainly used to detect its texture. The image data obtained by combining the first and second image signals therefore represents both the color and the texture of the object O, and forming an image based on this image data makes it possible to reproduce the color and texture of the object O faithfully.
B. Second exemplary embodiment
Fig. 9 is a diagram showing the structure of an image reading device 500 according to a second exemplary embodiment of the present invention. As shown in the figure, the image reading device 500 has a platen glass 150, a platen cover 160, a full-rate carriage unit 510, a half-rate carriage unit 120, a focusing lens 130, an inline sensor 140 and an operating unit 60.
The platen glass 150 is a transparent glass plate on which the object O is placed. Reflection-suppressing layers such as multilayer dielectric films are formed on both sides of the platen glass 150 to reduce reflections at its surfaces. The platen cover 160 is provided so as to cover the platen glass 150, blocking outside light and making the object O placed on the platen glass 150 easier to read. Note that in the present invention the object O is not limited to paper and may be plastic, metal, cloth or fabric.
Figs. 10 and 11 are diagrams showing the full-rate carriage unit 510 in detail.
The image reading modes of the image reading device 500 are a color reading mode (a first image reading mode), used mainly to read the color of the object, and a texture reading mode (a second image reading mode), used mainly to read the texture or glossiness of the object. Fig. 10 shows the configuration of the full-rate carriage unit 510 in the color reading mode, and Fig. 11 shows its configuration in the texture reading mode. Note that in the full-rate carriage unit 510 of this exemplary embodiment the light from the tubular light source 531 strikes the object at an incident angle of about 45°, and the light reflected at a reflection angle of about 45° with respect to this light is treated as the specular light. More precisely, this reflected light also contains diffuse light in addition to the specular light, but the component equivalent to the diffuse light is reduced by performing a prescribed calculation on the image signal generated from it. On the other hand, as in an ordinary image reading device that reads only the color of the object O, the light reflected at a reflection angle of about 0° with respect to the light incident on the object O is treated as the diffuse light.
The full-rate carriage unit 510 has a tubular light source 531, a collimating lens 530, a movable reflector 532 and mirrors 533 and 534. The tubular light source 531 is, for example, a halogen lamp or a xenon fluorescent lamp and, as shown in the figure, is arranged at a position from which it emits light toward the object O. The collimating lens 530 is a guiding unit (a first guiding unit) that converts the light (diffused light) emitted by the tubular light source 531 into parallel light and guides this parallel light to the object O. The collimating lens 530 is fixed to a support member 542 that rotates about a shaft 541. When the support member 542 is rotated about the shaft 541 by a motor 540 (a first drive unit), the collimating lens 530 can be set to the position shown in Fig. 10 or the position shown in Fig. 11 (the motor 540 is omitted from Figs. 11 to 16 described below).
When the collimating lens 530 is in the position shown in Fig. 10 in the color reading mode, the object O is illuminated with diffused light from the tubular light source 531, which is convenient for reading the color of the object O. When the collimating lens 530 is in the position shown in Fig. 11 in the texture reading mode, the light from the tubular light source 531 is converted into parallel light by the collimating lens 530 and then directed toward the object O. When the object O is illuminated with parallel light, the incident angle is the same for every ray in the beam, which makes it possible to produce the specular light component caused by the fine pattern of the object surface more quantitatively. As a result, the texture of the object can be read more accurately. Converging the light (diffused light) from the tubular light source 531 into parallel light also has the effect of securing a sufficient amount of light.
The movable reflector 532 serves as a guiding unit (a second guiding unit) and is shaped like an angle bracket (<), that is, a strip bent at its center. The movable reflector 532 is rotated about a shaft 535 by a motor 536 (a second drive unit) and can take the orientation shown in Fig. 10 or the orientation shown in Fig. 11 (the motor 536 is omitted from Fig. 11 and from Figs. 12 to 16 described below). The movable reflector 532 has a reflecting surface 532m for reflecting light and an absorbing surface 532t for absorbing light. The absorbing surface 532t is a so-called light trap, for example a sheet of black porous polyurethane, and most of the light entering the absorbing surface 532t is captured and absorbed by it.
The mirrors 533 and 534 serve as guiding units that further reflect the light reflected from the object O and guide it to the half-rate carriage unit 120. More specifically, in the color reading mode the mirror 533 (the first guiding unit) guides the diffuse light from the object O toward the half-rate carriage unit 120, while in the texture reading mode the mirror 534 (the second guiding unit) guides the specular light from the object O toward the half-rate carriage unit 120.
In the color reading mode, when the movable reflector 532 is in the position shown in Fig. 10, its reflecting surface 532m reflects the light from the tubular light source 531 toward the object O, as shown by the dotted line r1. At the same time the object O is illuminated by direct light from the tubular light source 531, as shown by the dotted line r0, so the object is illuminated from two directions (dotted lines r0 and r1) simultaneously. As shown by the dotted line r2, the diffuse light from the object O is reflected by the mirror 533 and then further reflected by the reflecting surface 532m of the movable reflector 532, so that it is guided toward the half-rate carriage unit 120. In other words, in the color reading mode the movable reflector 532 is oriented so that the reflecting surface 532m reflects the light from the tubular light source 531 toward the object O and also reflects the diffuse light from the object O, after reflection by the mirror 533, toward the half-rate carriage unit 120.
In the texture reading mode, when the movable reflector 532 is in the position shown in Fig. 11, the reflecting surface 532m moves to a position where it cannot receive the light from the tubular light source 531, so the object O is illuminated only from the direction of the tubular light source 531, that is, from a fixed direction. The resulting specular light, produced by the fine pattern of the surface of the object O, represents the texture of the object. This specular light is reflected by the mirror 534 and guided toward the half-rate carriage unit 120, as shown by the dotted line r5. In addition, the absorbing surface 532t of the movable reflector 532 moves to a position facing the object O, so that the diffuse light from the object O is absorbed by the absorbing surface 532t, as shown by the dotted line r4. Thus, in the texture reading mode the movable reflector 532 is oriented so that the light from the tubular light source 531 that is diffusely reflected by the object O is directed toward the absorbing surface 532t, while the specular light from the object O is guided to the half-rate carriage unit 120 by the mirror 534.
When switching between the color reading mode and the texture reading mode, the movable reflector 532 and the collimating lens 530 must be moved so that they do not collide. For example, when switching from the color reading mode of Fig. 10 to the texture reading mode of Fig. 11, the movable reflector 532 should first be rotated to the position shown in Fig. 11, and the collimating lens 530 should then be moved to the position shown in Fig. 11.
Note that the orientation of the movable reflector 532 and the positions of the components 532 to 534 are set so that the optical path length of light diffusely reflected by the surface of the object O until it is received by the inline sensor 140 via the mirror 533 and the movable reflector 532 equals the optical path length of light specularly reflected by the surface of the object O until it is received by the inline sensor 140 via the mirror 534. Therefore, even when the orientation of the movable reflector 532 changes according to the image reading mode, the focal position of the guiding units does not change. This structure allows the diffuse light and the specular light to be received by the same inline sensor 140 (signal generating unit) without adjusting the focal position each time.
The components of the full-rate carriage unit 510 shown in Fig. 10 have approximately the same size as the platen glass 150 in the direction perpendicular to the plane of the figure. The full-rate carriage unit 510 is moved at a speed v in the direction of arrow C in Fig. 9 by a drive unit (not shown); when the drive unit moves it in the direction of arrow C, the full-rate carriage unit 510 can scan the entire surface of the object O.
Referring again to Fig. 9, the description of the units of the image reading device 500 continues.
The half-rate carriage unit 120 has mirrors 141 and 142 and guides the light from the full-rate carriage unit 510 to the focusing lens 130. The half-rate carriage unit 120 is driven by a drive unit (not shown) and moves in the same direction as the full-rate carriage unit 510 at half its speed (that is, v/2). The focusing lens 130 is placed on the optical path connecting the mirror 142 and the inline sensor 140, and forms an image of the light from the object O at the position of the inline sensor 140. The inline sensor 140 is a light-receiving element such as a three-line color CCD (charge-coupled device) that separates the light into the three colors R (red), G (green) and B (blue), receives them, and photoelectrically converts the light of each color; it generates and outputs image signals corresponding to the amounts of light received. The operating unit 60 has an LCD or other display device and various buttons, displays information to the user and accepts input instructions from the user.
The operation of the above units is controlled by a control unit (not shown). The control unit has a computing device such as a CPU (central processing unit) and various types of memory such as a ROM (read-only memory) and a RAM (random access memory), and issues instructions to the drive units described above according to input instructions from the user so that the prescribed operations are performed and an image is read. The control unit generates image data by applying various kinds of image processing (for example, A/D conversion, gamma conversion and shading correction) to the image signals output by the inline sensor 140. The image signals output by the inline sensor 140 include an image signal based on the diffuse light and an image signal based on the specular light (which, more precisely, also contains diffuse light). The control unit generates image data containing color information by applying prescribed calculations to the former image signal, and further generates image data containing texture information by applying prescribed calculations to the latter. By superimposing the image data obtained from the former image signal on that obtained from the latter, the control unit can generate image data containing information about both color and texture. When generating this image data, the control unit performs a calculation that reduces the component equivalent to the diffuse light contained in the image signal based on the specular light.
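A minimal sketch of what that reduction could look like, under the assumption that the diffuse contribution to the specular-mode signal can be approximated as a scaled copy of the diffuse-mode signal; the patent states only that a prescribed calculation reduces the equivalent component, so the form and the factor k below are hypothetical:

```python
import numpy as np

def reduce_diffuse_component(specular_signal, diffuse_signal, k=0.2):
    """Subtract an assumed fraction k of the diffuse-mode signal from the
    specular-mode signal, which in practice also contains diffuse light."""
    cleaned = (np.asarray(specular_signal, dtype=float)
               - k * np.asarray(diffuse_signal, dtype=float))
    return np.clip(cleaned, 0.0, None)  # keep the texture signal non-negative
```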
In the second exemplary embodiment, the object O is illuminated from two directions in the color reading mode, and image data representing the appearance of the object O (mainly its color) is generated based on the diffuse light from the object O. In the texture reading mode, the object O is illuminated from one fixed direction, and image data representing the appearance of the object O (mainly its texture) is generated based on the specular light from the object O. The color of the object can therefore be read in the color reading mode and its texture in the texture reading mode, and if the two modes are used together the color and the texture of the object can be read at the same time.
Because the orientation of the movable reflector 532 (guiding unit) is changed by the motor 536 (drive unit), the movable reflector 532 can be used for both image reading modes. Compared with, for example, installing a dedicated unit for each of the two image reading modes, the structure is simpler and the number of units can be reduced. In addition, because the light from the tubular light source 531 is converted into parallel light by the collimating lens 530 and directed toward the object O in the texture reading mode, the incident angle is the same for every ray in the beam, which makes it possible to produce the specular light component caused by the fine pattern of the object surface more quantitatively, so the texture of the object can be read more accurately. Converging the light (diffused light) from the tubular light source 531 into parallel light also makes it possible to secure a sufficient amount of light.
C. Third exemplary embodiment
The third exemplary embodiment of the present invention is described next. An image reading device according to the third exemplary embodiment differs from the image reading device 500 of the second exemplary embodiment described above only in the structure of the full-rate carriage unit. Therefore, only the structure of the full-rate carriage unit is described below; components identical to those of the second exemplary embodiment are given the same reference numerals and their description is omitted.
Figs. 14 and 15 are diagrams showing the structure of a full-rate carriage unit 510b in the third exemplary embodiment. The full-rate carriage unit 510b has a movable reflector 537 in place of the movable reflector 532 of the second exemplary embodiment, and additionally has a beam splitter 539. The movable reflector 537 is a flat optical component that can rotate about a shaft 538 driven by the motor 536 (not shown) and can take the orientation shown in Fig. 14 or the orientation shown in Fig. 15. Like the movable reflector 532, the movable reflector 537 has a reflecting surface 537m for reflecting light and an absorbing surface 537t for absorbing light.
The beam splitter 539 reflects part of the incident light and transmits part of it. By design, the higher the reflectivity of one surface of the beam splitter 539 is set, the lower its transmittance becomes. In other words, the reflectivity of the front surface is set equal to or higher than one threshold, and the transmittance is set equal to or lower than another threshold (the two thresholds need not be the same). Using this characteristic, in the color reading mode the diffuse light from the object O arriving via the mirror 533 is reflected by the front surface of the beam splitter 539 with a reflectivity equal to or higher than the threshold and is guided toward the half-rate carriage unit 120. Conversely, in the texture reading mode the specular light from the object O arriving via the mirror 534 is transmitted from the back surface of the beam splitter 539 to its front surface with a transmittance equal to or lower than the threshold and is guided toward the half-rate carriage unit 120. The specular light from the object O can be several orders of magnitude stronger than the diffuse light, so when designing the beam splitter 539 the reflectivity of the front surface and the transmittance of the back surface should be set to appropriate values. The beam splitter 539 is located at the position where the optical path along which the light diffusely reflected by the object O travels (a first optical path, shown by the dotted line in Fig. 14) intersects the optical path along which the light specularly reflected by the object O travels (a second optical path, shown by the dotted line in Fig. 15).
In the color reading mode, when the collimating lens 530 is in the position shown in Fig. 14, the object O is illuminated with diffused light from the tubular light source 531. In the texture reading mode, when the collimating lens 530 is in the position shown in Fig. 15, the light from the tubular light source 531 is converted into parallel light by the collimating lens 530 and directed toward the object O.
When the movable reflector 537 is in the position shown in Fig. 14 in the color reading mode, its reflecting surface 537m reflects the light from the tubular light source 531 toward the object O. At the same time the object O is also illuminated by light (diffused light) directly from the tubular light source 531, so the object O is illuminated from two directions simultaneously. The diffuse light from the object O is reflected by the mirror 533, reflected again by the beam splitter 539, and then travels toward the half-rate carriage unit 120. In other words, in the color reading mode the movable reflector 537 is oriented so that the reflecting surface 537m reflects the light from the tubular light source 531 toward the object O and the diffuse light from the object O is guided to the half-rate carriage unit 120 by the mirror 533 and the beam splitter 539.
When the movable reflector 537 is in the position shown in Fig. 15 in the texture reading mode, the reflecting surface 537m moves to a position where it cannot receive the light from the tubular light source 531, so the object O is illuminated with parallel light only from the direction of the tubular light source 531, that is, from a fixed direction. The fine pattern of the surface of the object O therefore produces more specular light components, which represent the texture of the object. The specular light is reflected by the mirror 534, transmitted through the beam splitter 539, and then travels toward the half-rate carriage unit 120. In addition, the absorbing surface 537t of the movable reflector 537 moves to a position facing the object O, so the diffuse light from the object O is absorbed by the absorbing surface 537t. Thus, in the texture reading mode the movable reflector 537 is oriented so that the light from the tubular light source 531 that is diffusely reflected by the object O is directed toward the absorbing surface 537t, while the specular light from the object O is guided to the half-rate carriage unit 120 by the mirror 534.
Note that, as in the second exemplary embodiment, the orientation of the movable reflector 537 and the positions of the components are set so that the optical path length of light diffusely reflected by the object O until it is received by the inline sensor 140 via the mirror 533 and the movable reflector 537 equals the optical path length of light specularly reflected by the object O until it is received by the inline sensor 140 via the mirror 534. Therefore, even when the orientation of the movable reflector 537 changes according to the image reading mode, the focal position of the guiding units does not change. This structure allows the diffuse light and the specular light to be received by the same inline sensor 140 (signal generating unit) without adjusting the focal position each time.
In the third exemplary embodiment, as in the second exemplary embodiment, the color of the object can be read in the color reading mode and the texture of the object in the texture reading mode, and if the two modes are used in combination the color and the texture of the object can be read at the same time. Because the orientation of the movable reflector 537 (guiding unit) is changed by the motor 536 (drive unit), the movable reflector 537 can be used for both image reading modes. Compared with, for example, installing a dedicated unit for each of the two image reading modes, the structure is simpler and the number of units can be reduced. In addition, because the light from the tubular light source 531 is converted into parallel light by the collimating lens 530 and directed toward the object O in the texture reading mode, the incident angle is the same for every ray in the beam, which makes it possible to produce the specular light component caused by the fine pattern of the object surface more quantitatively, so the texture of the object can be read more accurately. Converging the light (diffused light) from the tubular light source 531 into parallel light also makes it possible to secure a sufficient amount of light.
D. modified example
Can carry out following modification to above-mentioned first to the 3rd exemplary embodiment.
(1) In the structure of the full-rate carriage unit described with reference to Figure 5 in the first exemplary embodiment, the orientation of the movable mirror 115 is uniquely determined when the shaft 301 and the pins 302 and 303 are inserted into the guide grooves 401, 402 and 403 and the movable mirror support 117 is fixed at a specific position. However, to determine the orientation of a surface (reflecting surface), it is sufficient to determine two lines on that surface. It is therefore sufficient to provide at least two pins on the side of the movable mirror 115 and at least two guide grooves in the housing into which these pins are inserted. Moreover, when the position of the movable mirror 115 is to be changed, moving the shaft 301 in the scanning direction changes the position of the movable mirror 115.
(2) In the first exemplary embodiment, the image processing unit 50 generates color image data in which a color image based on the first image signal and a monochrome image based on the second image signal are superimposed, but the following methods are also possible.
First, the image based on the second image signal may be a color image rather than a monochrome image. Because a monochrome image is not expressed in color, the shadow regions are more conspicuous; but even in a color image the shadow regions are darker, so they can still be identified as shadow regions and thereby express the texture.
In addition, the image processing unit 50 may generate color image data based on the first image signal and monochrome image data based on the second image signal, associate the generated color image data and monochrome image data with each other, and output them separately to the image forming unit 20. In this case, the image forming unit 20 superimposes the color image formed with the C, M and Y toners based on the color image data and the monochrome image formed with the K toner based on the monochrome image data, and forms them on the recording page.
(3) In the first exemplary embodiment, the image processing unit 50 reads the signal value of the second image signal from the image memory, converts it into a signal value representing a monochrome (achromatic) image, multiplies this signal value by a coefficient C (0 < C ≤ 1), and adds the product to the signal value of the first image signal; however, the signal value of the first image signal and the signal value of the second image signal may simply be added without using the coefficient C.
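A minimal sketch of this weighting step follows, assuming 8-bit RGB arrays and a simple channel average as the achromatic conversion; the function and array names are hypothetical and only illustrate the C-weighted addition described above:

```python
import numpy as np

def blend_signals(first_rgb, second_rgb, c=0.5):
    """Superimpose a weighted achromatic version of the second image
    signal onto the first (color) image signal.

    first_rgb, second_rgb: uint8 arrays of shape (H, W, 3).
    c: coefficient C with 0 < C <= 1.
    """
    if not 0.0 < c <= 1.0:
        raise ValueError("coefficient C must satisfy 0 < C <= 1")
    # Convert the second (texture) signal to an achromatic value; a plain
    # channel average stands in here for the actual conversion.
    gray = second_rgb.astype(np.float32).mean(axis=2, keepdims=True)
    blended = first_rgb.astype(np.float32) + c * gray
    return np.clip(blended, 0, 255).astype(np.uint8)
```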
Instead of presetting the coefficient C to, for example, C = 0.5, the operator may determine an appropriate coefficient C each time. For example, before an image is formed on the recording page, the image processing unit 50 sets the coefficient C to values between 0.1 and 1 in increments of 0.1, and displays a list of the images generated with all of these coefficients C on the display of the operating unit 60 or on a personal computer connected to the image forming apparatus 1 over a network. The closer the coefficient C is to 1, the more conspicuous the shading becomes, but the more the color is lost; the operator therefore selects, from this list, the image that to the operator's eye reproduces the texture and the color of the object O in the best balance. The image processing unit 50 supplies the image data representing the image thus designated by the operator to the image forming unit 20, and the image forming unit 20 forms an image on the recording paper based on this image data.
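The preview sweep could then look like the following sketch (illustrative only; the 0.1 step comes from the text above, while the helper and dictionary layout are assumptions):

```python
import numpy as np

def make_candidates(first_rgb, second_rgb, step=0.1):
    """Generate one preview image per coefficient C in (0, 1], so an
    operator can pick the best color/texture balance."""
    gray = second_rgb.astype(np.float32).mean(axis=2, keepdims=True)
    candidates = {}
    for i in range(1, int(round(1.0 / step)) + 1):
        c = round(i * step, 2)                     # C = 0.1, 0.2, ..., 1.0
        img = first_rgb.astype(np.float32) + c * gray
        candidates[c] = np.clip(img, 0, 255).astype(np.uint8)
    return candidates

# The operator is shown candidates[0.1] ... candidates[1.0]; the chosen
# entry is the image data passed on for image formation.
```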
(4) The first exemplary embodiment specifically discloses the case in which the first incident angle is 45° and the second incident angle is 65°, but the values of the first and second incident angles are not limited to these. For example, the first incident angle only needs to be an angle at which an object with a uniform surface can be read satisfactorily; about 45° is desirable, but a deviation of 1° to 2° from 45° is also acceptable. To emphasize the texture of the object O further, the second incident angle may be increased to about 70°, and to emphasize the color of the object O, it may be reduced to about 60°.
(5) In the first exemplary embodiment, the built-in sensor 140 serving as the signal generating unit is described as a multi-row CCD image sensor with on-chip color filters, but the present invention is not limited to this structure. For example, the signal generating unit may be a single-row image sensor configured so that the color filters are slid or rotated. With this structure the built-in sensor can be made more cheaply, but increasing the number of colors to be read brings the problem that the number of read operations increases accordingly. The number of colors read by the built-in sensor is not limited to three, and may be four or more. A larger number of colors allows the spectral reflectance to be estimated more accurately, but considering the data volume of the generated image signals and the image processing time, about three to six colors is appropriate.
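As a back-of-the-envelope illustration of that data-volume trade-off (the page size, resolution and bit depth below are assumptions, not values from the patent):

```python
def scan_data_bytes(width_px, height_px, channels, bits_per_sample=8):
    """Raw image-signal size for one scan with the given number of color channels."""
    return width_px * height_px * channels * bits_per_sample // 8

# Hypothetical A4 scan at 600 dpi (about 4960 x 7016 pixels).
for channels in (3, 4, 6):
    mb = scan_data_bytes(4960, 7016, channels) / 1e6
    print(f"{channels} colors: about {mb:.0f} MB per scan")
```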
(6) In the first exemplary embodiment a tandem image forming unit with four image forming units is described, but a rotary image forming unit is also possible. In addition, a sheet transport belt may be provided instead of the intermediate transfer belt, and the image may be transferred directly from the photosensitive drums to the recording paper without transfer from an intermediate transfer body (intermediate transfer belt).
(7) It should also be noted that the first exemplary embodiment describes the case of an image forming apparatus, but the present invention is not limited in this respect. For example, a structure equivalent to the image reading unit of this exemplary embodiment may be provided as an image reading device, and the specific effects can be obtained without providing the image processing unit or the image forming unit. In other words, the present invention may also be specified as such an image reading device.
(8) The following modification can be made to the second exemplary embodiment.
Figures 12 and 13 show the structure of a full-rate carriage unit 510a according to a modified example. Figure 12 shows the structure of the full-rate carriage unit 510a under the color read mode, and Figure 13 shows its structure under the texture read mode. In Figures 12 and 13, parts identical to those in the first exemplary embodiment are given the same reference symbols. The full-rate carriage unit 510a according to this modified example has two mirrors 534a and 534b in place of the mirror 534. By providing these two mirrors 534a and 534b, the number of reflections of the light diffusely reflected by the object O until it is received by the built-in sensor 140 via the mirror 533 and the movable reflector 532, and the number of reflections of the light specularly reflected by the object O until it is received by the built-in sensor 140 via the mirrors 534a and 534b, are both even (two). By making the numbers of reflections in the color read mode and the texture read mode both even or both odd in this way, the direction of the image along the sub-scanning direction of the reflected light can be made the same in both cases. Without the additional mirror, the number of reflections under the color read mode is two while under the texture read mode it is one, so the respective image directions do not coincide. In that case, because the directions in the sub-scanning direction of the images formed on the built-in sensor do not coincide, the order in which the images are formed on the R, G and B pixel rows of the built-in sensor is reversed in the sub-scanning direction. The conditions used for the line-matching processing in the subsequent line delay memory must then be changed, which may cause problems such as switching the processing circuitry or adding to the memory delay. Since this modified example needs no such processing, it is very convenient.
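A small sketch of why matching the parity matters downstream (illustrative only; the three-row sensor spacing and the delay values are assumptions rather than values from the patent):

```python
def line_delays(gap_lines, num_reflections, rows=("R", "G", "B")):
    """Return the per-row delay (in scan lines) needed to align the three
    color rows of a 3-line sensor for one read mode.

    gap_lines: spacing between adjacent sensor rows, in scan lines.
    num_reflections: mirror reflections between the object and the sensor;
    odd vs. even parity flips the image along the sub-scanning direction,
    which reverses the order in which the rows see the same document line.
    """
    order = rows if num_reflections % 2 == 0 else tuple(reversed(rows))
    return {row: i * gap_lines for i, row in enumerate(order)}

# Two reflections (even) vs. one reflection (odd): the delay assignment
# for R and B swaps, so the line-matching memory must be reconfigured.
print(line_delays(gap_lines=2, num_reflections=2))  # {'R': 0, 'G': 2, 'B': 4}
print(line_delays(gap_lines=2, num_reflections=1))  # {'B': 0, 'G': 2, 'R': 4}
```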
(9) Based on the same idea as the modified example of the second exemplary embodiment, the following modification can be made to the third exemplary embodiment. Figures 16 and 17 show the structure of a full-rate carriage unit 510c according to a modified example of the third exemplary embodiment. Figure 16 shows the structure of the full-rate carriage unit 510c under the color read mode, and Figure 17 shows its structure under the texture read mode. The full-rate carriage unit 510c has two mirrors 534c and 534d in place of the mirror 534. By providing these two mirrors 534c and 534d, the number of reflections of the light diffusely reflected by the object O until it is received by the built-in sensor 140 via the mirror 533 and the beam splitter 539, and the number of reflections of the light specularly reflected by the object O until it is received by the built-in sensor 140 via the mirrors 534c and 534d, are both even (two). By making the numbers of reflections in the color read mode and the texture read mode both even or both odd in this way, the direction of the image in the reflected light can be made the same in both cases.
(10) It should be noted that in the second and third exemplary embodiments the orientation of the movable reflector is changed by swinging it according to the image reading mode. If only the orientation of the movable reflector is changed, control is simpler, which is preferable. However, depending on the interior space of the image reading device, the position of the movable reflector may be changed instead of only its orientation, or the orientation and the position may be changed at the same time. In addition, the collimator lens is not limited to the shape shown in the figures; any known collimator lens may be used.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to those skilled in the art. The exemplary embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the various embodiments and the various modifications suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (14)

1. An image reading device, comprising:
a light source;
a first guiding unit that guides light from the light source to an object at prescribed incident angles including a first incident angle and a second incident angle;
a signal generating unit that receives light and generates an image signal based on the received light;
a second guiding unit that guides the light reflected from the object to the signal generating unit; and
a control unit that controls the first guiding unit so as to guide the light from the light source to the object at at least two different incident angles including the first incident angle and the second incident angle, and controls the signal generating unit so as to generate image signals for the at least two different incident angles.
2. The image reading device according to claim 1,
wherein the first guiding unit comprises:
a reflecting unit including a mirror and a unit for changing the orientation of the mirror; and
a moving unit that moves the mirror of the reflecting unit in a first direction,
and wherein the control unit controls the reflecting unit to change the orientation of the mirror so that the light from the light source irradiates a fixed position on the object while the mirror moves in the first direction.
3. The image reading device according to claim 2,
wherein the reflecting unit is rotatable about an axis that is parallel to the mirror and perpendicular to the first direction,
and wherein the first guiding unit further comprises:
a first driver unit that changes the orientation of the mirror by rotating the reflecting unit about the axis; and
a second driver unit that moves the mirror in the first direction,
and wherein the control unit controls the first driver unit to change the orientation of the mirror, and controls the moving unit and the second driver unit to move the mirror in the first direction.
4. The image reading device according to claim 2,
wherein the reflecting unit comprises:
at least two shafts or pins provided on a side of the mirror, and
at least two guide grooves that extend in different directions and into which the at least two shafts or pins are inserted,
and wherein the control unit controls a first driver unit to change the orientation and the position of the mirror by moving the shafts or pins inserted in the guide grooves in the first direction.
5. The image reading device according to claim 1, wherein the first incident angle is about 45 degrees and the second incident angle is about 60 to 70 degrees.
6. The image reading device according to claim 1, wherein the signal generating unit generates a first image signal based on the light received when the light from the light source irradiates the object at the first incident angle, and generates a second image signal based on the light received when the light from the light source irradiates the object at the second incident angle, the image reading device further comprising:
an image data generating unit that generates image data based on the first image signal and the second image signal; and
a unit that forms a toner image on a recording page based on the generated image data.
7. The image reading device according to claim 6, wherein the image data generating unit generates the image data by adding the product of the second image signal and a predetermined coefficient to the first image signal.
8. The image reading device according to claim 7, wherein the coefficient is greater than 0 and less than or equal to 1.
9. The image reading device according to claim 1,
wherein the first guiding unit and the second guiding unit are each dynamically configured from a group of the following components:
a first optical member including a first reflecting surface, a second reflecting surface, and an absorbing surface for absorbing light;
a first driver unit for changing the orientation and the position of the first optical member;
a second optical member for converting incident light into parallel light;
a second driver unit for changing the orientation and the position of the second optical member;
a third optical member for guiding the light diffusely reflected from the object to the second reflecting surface of the first optical member; and
a fourth optical member for guiding the light specularly reflected from the object to the signal generating unit,
and wherein, under a first image reading mode, the control unit configures the first guiding unit with the first optical member and the first driver unit, configures the second guiding unit with the first optical member, the third optical member and the first driver unit, controls the second driver unit to change the orientation and the position of the second optical member so that the light from the light source directly irradiates the object, and controls the first driver unit to change the orientation and the position of the first optical member so that the light reflected by the first reflecting surface of the first optical member irradiates the object at the first incident angle and the light diffusely reflected from the object is reflected by the third optical member and further reflected by the second reflecting surface of the first optical member to the signal generating unit,
and wherein, under a second image reading mode, the control unit configures the first guiding unit with the second optical member and the second driver unit, configures the second guiding unit with the fourth optical member, controls the second driver unit to place the second optical member between the light source and the object so that all the light from the light source is converted into parallel light and irradiates the object only at the first incident angle, and controls the first driver unit to place the first optical member at a position such that the light specularly reflected from the object and guided to the signal generating unit by the fourth optical member is not blocked by the first optical member and the light diffusely reflected from the object is absorbed by the absorbing surface of the first optical member.
10. The image reading device according to claim 9, wherein the second optical member is a collimator lens.
11. The image reading device according to claim 9,
wherein the signal generating unit generates an image signal based on the light diffusely reflected from the object under the first image reading mode, and generates an image signal based on the light specularly reflected from the object under the second image reading mode,
and wherein the control unit generates color information representing the color of the object based on the image signal generated under the first image reading mode, and generates texture information representing the texture of the object based on the image signal generated under the second image reading mode.
12. The image reading device according to claim 9, wherein the optical path length of the light specularly reflected from the object and guided to the signal generating unit by the fourth optical member is equal to the optical path length of the light diffusely reflected from the object and guided to the signal generating unit by the first optical member.
13. The image reading device according to claim 9, wherein the number of reflections along the optical path of the light diffusely reflected from the object and guided to the signal generating unit by the first optical member and the number of reflections along the optical path of the light specularly reflected from the object and guided to the signal generating unit by the fourth optical member are both odd or both even.
14. The image reading device according to claim 1,
wherein the first guiding unit and the second guiding unit are each dynamically configured from a group of the following components:
a beam splitter including a reflecting surface and a back surface of the reflecting surface, wherein the beam splitter reflects light incident on it from the reflecting-surface side and transmits light incident on it from the back-surface side;
a first optical member including a reflecting surface and an absorbing surface for absorbing light;
a first driver unit for changing the orientation and the position of the first optical member;
a second optical member for converting incident light into parallel light;
a second driver unit for changing the orientation and the position of the second optical member;
a third optical member for guiding the light diffusely reflected from the object to the reflecting surface of the beam splitter; and
a fourth optical member for guiding the light specularly reflected from the object to the signal generating unit,
and wherein, under a first image reading mode, the control unit configures the first guiding unit with the first optical member and the first driver unit, configures the second guiding unit with the beam splitter and the third optical member, controls the second driver unit to change the orientation and the position of the second optical member so that the light from the light source directly irradiates the object, and controls the first driver unit to change the orientation and the position of the first optical member so that the light reflected by the reflecting surface of the first optical member irradiates the object at the first incident angle and the light diffusely reflected from the object is reflected by the third optical member and further reflected by the reflecting surface of the beam splitter toward the signal generating unit,
and wherein, under a second image reading mode, the control unit configures the first guiding unit with the second optical member and the second driver unit, configures the second guiding unit with the fourth optical member, controls the second driver unit to place the second optical member between the light source and the object so that all the light from the light source is converted into parallel light and irradiates the object only at the first incident angle, and controls the first driver unit to place the first optical member at a position such that the first optical member does not block the light specularly reflected from the object, which is reflected by the fourth optical member and passes through the beam splitter toward the signal generating unit, while the absorbing surface of the first optical member absorbs the light diffusely reflected from the object.
CNB2006100826093A 2005-10-13 2006-05-18 Image reading device and image forming device Expired - Fee Related CN100452819C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2005299265 2005-10-13
JP2005299265A JP4835098B2 (en) 2005-10-13 2005-10-13 Image reading apparatus and image forming apparatus
JP2005307416 2005-10-21

Publications (2)

Publication Number Publication Date
CN1949817A CN1949817A (en) 2007-04-18
CN100452819C true CN100452819C (en) 2009-01-14

Family

ID=38019185

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2006100826093A Expired - Fee Related CN100452819C (en) 2005-10-13 2006-05-18 Image reading device and image forming device

Country Status (2)

Country Link
JP (1) JP4835098B2 (en)
CN (1) CN100452819C (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010130444A (en) * 2008-11-28 2010-06-10 Fuji Xerox Co Ltd Reader and image forming apparatus
CN102739906A (en) * 2012-06-28 2012-10-17 威海华菱光电股份有限公司 Contact image sensor
CN102801890A (en) * 2012-09-03 2012-11-28 威海华菱光电股份有限公司 Contact-type image sensor

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1065874A (en) * 1996-08-23 1998-03-06 Nikon Corp Image input device
JP2006279226A (en) * 2005-03-28 2006-10-12 Fuji Xerox Co Ltd Image pickup apparatus and image forming apparatus

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05313537A (en) * 1992-05-11 1993-11-26 Konica Corp Image forming device
JPH05333643A (en) * 1992-06-02 1993-12-17 Konica Corp Image forming device
JPH0670097A (en) * 1992-08-20 1994-03-11 Ricoh Co Ltd Picture reader
US5773808A (en) * 1996-05-17 1998-06-30 Laser; Vadim Method and apparatus for reading invisible messages
US5991038A (en) * 1997-04-23 1999-11-23 Odp Co., Ltd. Surface pattern unevenness detecting method and apparatus
CN1423800A (en) * 2000-01-21 2003-06-11 福来克斯产品公司 Automated verification system and methods for use with optical interference devices
JP2004301719A (en) * 2003-03-31 2004-10-28 Seiko Epson Corp Inspection device and inspection method on semiconductor, and semiconductor device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102706294A (en) * 2012-05-04 2012-10-03 东莞市奥普特自动化科技有限公司 Device for detecting surface evenness of object
CN102706294B (en) * 2012-05-04 2015-11-11 东莞市奥普特自动化科技有限公司 A kind of for detecting the smooth device of object surface
CN107404597A (en) * 2016-04-19 2017-11-28 京瓷办公信息系统株式会社 Image read-out, image reading method and image processing system

Also Published As

Publication number Publication date
JP2007108433A (en) 2007-04-26
CN1949817A (en) 2007-04-18
JP4835098B2 (en) 2011-12-14

Similar Documents

Publication Publication Date Title
CN102937782B (en) Recording sheet surface detection apparatus and image processing system
US7724402B2 (en) Image reader
CN100452819C (en) Image reading device and image forming device
CN202870958U (en) Image sensor unit and image reading device
US7336431B2 (en) Image reading device and image forming device
JP2006279228A (en) Image pickup apparatus
JP2006261820A (en) Imaging apparatus, image forming apparatus, and texture reading method
US20020070921A1 (en) Holographic keyboard
CN100398327C (en) Exposure system and production method for exposure system
JP2000349960A (en) Image reader and image reading method
CN100556069C (en) Cis
JP7183662B2 (en) Irradiation device and reader
JP2007116536A (en) Image reading apparatus
JPH01147956A (en) Color original reader
JP2007116535A (en) Image reading apparatus
JP4830284B2 (en) Image reading device
JP2023141225A (en) Image reading device and image forming apparatus
JPH0339977Y2 (en)
JP4034643B2 (en) Image forming apparatus
JP3678353B2 (en) Projector lighting system
JPH0466428B2 (en)
JP4214754B2 (en) Photo printing system
JP2023141226A (en) Image reading device and image forming apparatus
US20020063938A1 (en) Reflective and penetrative scanning apparatus
JPH0733234Y2 (en) Slide film printing equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090114

Termination date: 20180518

CF01 Termination of patent right due to non-payment of annual fee