CN101697105B - Camera type touch detection positioning method and camera type touch detection system - Google Patents

Camera type touch detection positioning method and camera type touch detection system

Info

Publication number
CN101697105B
CN101697105B (application CN2009101933024A)
Authority
CN
China
Prior art keywords
relative distance
image
camera head
max
distortion correction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2009101933024A
Other languages
Chinese (zh)
Other versions
CN101697105A (en)
Inventor
林道庆
曾昭兴
李响
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vtron Group Co Ltd
Original Assignee
Vtron Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vtron Technologies Ltd filed Critical Vtron Technologies Ltd
Priority to CN2009101933024A priority Critical patent/CN101697105B/en
Publication of CN101697105A publication Critical patent/CN101697105A/en
Application granted granted Critical
Publication of CN101697105B publication Critical patent/CN101697105B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a camera type touch detection positioning method and a camera type touch detection system. The method comprises the following steps: photographing, with each of a plurality of camera devices, a specific touch object placed at a plurality of preset coordinates on a screen; calculating the image distortion correction parameter of each camera device with an optical-lens image-distortion calculation method; photographing, with the plurality of camera devices, a touch object at an unknown position on the screen; correcting the images of that touch object with the corresponding image distortion correction parameters; and calculating the actual coordinates of the touch object from the corrected images. The method and system can reduce the touch detection positioning error caused by lens imaging distortion.

Description

Camera type touch detection positioning method and camera type touch detection system
Technical field
The present invention relates to a camera type touch detection positioning method, and further relates to a camera type touch detection positioning system.
Background technology
A common camera type touch detection positioning system uses the images of touch objects on the screen surface, formed against a colored-line background at the screen edge and captured by a plurality of cameras mounted along the top edge of the screen, to determine the actual coordinates of the touch point. Chinese patent CN200510100255 discloses a positioning device and positioning method for an electronic display system in which three positioning cameras each photograph an obstruction: a primary positioning camera is selected first, the approximate center coordinates of the obstruction are determined from the image information provided by the primary camera, a precise positioning camera is then chosen, and the digital position information obtained from the precise positioning camera is used to determine the exact center coordinates of the obstruction. Chinese patent application CN200810030083 discloses a multi-point positioning device and method for an interactive electronic display system that uses the imaging of a camera assembly: by comparing, for different camera combinations, the colored-line background images captured with and without an obstruction present, coordinate angle information is extracted (namely, the angle between the line connecting each touch point to the camera and the screen edge) and the positioning coordinates of each obstruction are determined, so that multi-point positioning can be achieved.
In the multi-point positioning device and method of the above interactive electronic display system, the coordinate angle information is normally extracted as follows: the distance from the center of the touch-point pattern in the image captured by each camera device to the center of that camera device's image is measured, and from it the angle between the line connecting the touch point to the camera device and the screen edge is calculated and used as the coordinate angle information of the touch point.
In fact, in camera-based multi-point positioning, once the coordinate angle information has been extracted from each camera device, the coordinate position of the touch point can be calculated simply with trigonometric functions and the screen size. See Fig. 1, a working-principle diagram of a prior-art camera type touch detection positioning system: the cameras are A, B and C, the touch point is D, and the angles between the lines from cameras A, B, C to touch point D and the screen edge are θ, φ and ψ respectively. Taking A as the coordinate origin and letting the length of AC be L, the coordinates (x, y) of point D on the screen satisfy:

x = L·tanφ / (tanθ + tanφ)
y = L·tanθ·tanφ / (tanθ + tanφ)
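As an illustration, the two-camera triangulation above can be sketched in Python (a minimal sketch; the function name and the sample values are illustrative, not from the patent):

```python
import math

def triangulate(theta, phi, L):
    """Locate touch point D from the angles seen by cameras A and C.

    theta, phi: angles (radians) between the camera-to-touch-point lines
    and the screen's top edge, measured at A (the origin) and at C
    (a distance L along the top edge).
    Returns the screen coordinates (x, y) of D, with A as the origin.
    """
    t1, t2 = math.tan(theta), math.tan(phi)
    x = L * t2 / (t1 + t2)
    y = L * t1 * t2 / (t1 + t2)
    return x, y

# Symmetric case: both angles are 45 degrees, so D lies at the
# horizontal midpoint of the edge, at a depth of L/2.
x, y = triangulate(math.radians(45), math.radians(45), L=100.0)
```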
However, when the cameras (generally two wide-angle lenses mounted at the top corners of the screen and one fish-eye lens mounted at the midpoint of the top edge) capture images, the imaging of an optical lens usually exhibits some degree of distortion. This makes the calculation of the coordinate angle information inaccurate, and a considerable error can therefore appear in the located coordinates of the touch object.
Summary of the invention
To solve the problem that the prior-art camera type touch detection positioning method does not locate touch objects accurately enough, the invention provides a camera type touch detection positioning method that locates touch objects more accurately.
A camera type touch detection positioning method comprises the following steps. A specific touch object placed at preset coordinates on the screen is photographed by each of a plurality of camera devices arranged outside the display area of the screen, the shooting angle of view of each camera device covering the whole screen. A first relative distance, from the image center of the specific touch object to the center of the entire image captured by the corresponding camera device, is obtained, and from this first relative distance and the preset coordinates the image distortion correction parameter of each camera device is calculated with an optical-lens image-distortion calculation method. A touch object to be positioned on the screen is then photographed by the plurality of camera devices. A second relative distance, from the image center of the touch object to be positioned to the center of the entire image, is obtained, and the corrected value of this second relative distance is calculated from the image distortion correction parameter with an optical-lens distortion correction method. From the corrected value of the second relative distance, the coordinate angle information (the angle between the line connecting each touch object's coordinates to the camera device and the screen edge) is calculated; the coordinate angle information extracted from the several camera devices is combined, and the actual coordinates of the touch object are calculated with trigonometric functions and the screen size.
Compared with the prior art, before positioning a touch object, the camera type touch detection positioning method of the invention first obtains each first relative distance from the image of the specific touch object and the preset coordinates, and calculates the corresponding image distortion correction parameter. Then, when a touch object to be positioned is detected and positioned, the second relative distance from the image center of that touch object, as captured by the corresponding camera device, to the center of the entire image is corrected with the image distortion correction parameter, and the actual coordinates of the touch object to be positioned are calculated from the corrected value of the second relative distance. The error introduced by the optical-lens distortion of the camera devices is thereby reduced, and the actual coordinates of the touch object are calculated more accurately.
To solve the problem that the prior-art camera type touch detection positioning system does not locate touch objects accurately enough, the invention also provides a camera type touch detection positioning system that locates touch objects more accurately.
A camera type touch detection positioning system comprises an image capture module, an image correction module and a locating module.
The image capture module photographs, through camera devices arranged outside the display area of the screen, the specific touch object at the preset coordinates on the screen, and also photographs the touch object to be positioned on the screen; the shooting angle of view of each camera device covers the whole screen;
The image correction module obtains the first relative distance from the image center of the specific touch object to the center of the entire image captured by the corresponding camera device, calculates the image distortion correction parameter of each camera device from the first relative distance and the preset coordinates with an optical-lens image-distortion calculation method, then obtains the second relative distance from the image center of the touch object to be positioned to the center of the entire image, and calculates the corrected value of the second relative distance from the image distortion correction parameter with an optical-lens distortion correction method;
The locating module calculates, from the corrected value of the second relative distance, the coordinate angle information, i.e. the angle between the line connecting each touch object's coordinates to the camera device and the screen edge, combines the coordinate angle information extracted from the several camera devices, and calculates the actual coordinates of the touch object with trigonometric functions and the screen size.
Compared with the prior art, the camera type touch detection positioning system of the present invention has the image correction module described above. Before a touch object is positioned, the image correction module first obtains each first relative distance from the image of the specific touch object and the preset coordinates, and calculates the corresponding image distortion correction parameter. Then, when a touch object to be positioned is detected and positioned, the second relative distance from the image center of that touch object, as captured by the corresponding camera device, to the center of the entire image is corrected with the image distortion correction parameter, and the locating module calculates the actual coordinates of the touch object to be positioned from the corrected value of the second relative distance. The error introduced by the optical-lens distortion of the camera devices is thereby reduced, and the actual coordinates of the touch object are calculated more accurately.
Description of drawings
Fig. 1 is a working-principle diagram of a prior-art camera type touch detection positioning system;
Fig. 2 is a flow chart of the first embodiment of the camera type touch detection positioning method of the present invention;
Fig. 3 is a schematic diagram of the image distortion correction performed by the camera type touch detection positioning method of the present invention on a camera device with a wide-angle lens;
Fig. 4 is a schematic diagram of the image, containing the effective touch detection area, captured by the camera device with a wide-angle lens in the camera type touch detection positioning method of the present invention;
Fig. 5 is a schematic diagram of the image distortion correction performed by the camera type touch detection positioning method of the present invention on a camera device with a fish-eye lens;
Fig. 6 is a schematic diagram of the image, containing the effective touch detection area, captured by the camera device with a fish-eye lens in the camera type touch detection positioning method of the present invention;
Fig. 7 is a structural diagram of the first embodiment of the camera type touch detection positioning system of the present invention;
Fig. 8 is a structural diagram of a preferred implementation of the camera type touch detection positioning system of the present invention.
Wherein, 10,20 camera type touch detection positioning systems;
11 image capture modules;
12,22 image correction module;
121,221 wide-angle lens correction parameter generation modules;
122,222 fish eye lens correction parameter generation modules;
125,225 wide-angle lens image correction module;
126,226 fish eye lens image correction module;
223 parameter look-up table module;
13 locating module.
Embodiment
See Fig. 2, which is a flow chart of the first embodiment of the camera type touch detection positioning method of the present invention.
The camera type touch detection positioning method starts at step S101.
Then, in step S102, the specific touch object at the preset coordinates on the screen is photographed by each of a plurality of camera devices.
The plurality of camera devices are arranged outside the display area of the screen, and their coordinates with respect to the screen are known. In this embodiment, three camera devices are arranged on the top edge of the screen. The first and second camera devices are placed at the two intersections of the top edge with the other two sides; each has a wide-angle lens with a 90-degree angle of view, and the angle between the optical axis of the wide-angle lens and the top edge of the screen is 45 degrees. The third camera device is placed at the midpoint of the top edge; it has a fish-eye lens with a 180-degree angle of view, and the optical axis of the fish-eye lens is perpendicular to the top edge of the screen. The third camera device may also be replaced by two camera devices with 90-degree wide-angle lenses. The shooting angle of view of each of the three camera devices covers the whole screen, and in this embodiment the lens optical axes of the three camera devices are parallel to, or at a small angle to, the plane of the screen.
The preset coordinates are known with respect to the screen, and the angle between the line from the preset coordinates to a camera device and the optical axis of that camera device can be calculated from the preset coordinates, the coordinates of the camera device, and the angle between the camera device's optical axis and the screen edge. The specific touch object performs touch actions only at the preset coordinates; that is, the coordinates of the specific touch object with respect to the screen are the preset coordinates.
In step S104, the first relative distance from the image center of the specific touch object to the center of the entire image captured by the corresponding camera device is obtained, and from this first relative distance and the preset coordinates the image distortion correction parameter is calculated with an optical-lens image-distortion calculation method.
The first relative distance, from the image center of the specific touch object to the center of the entire image captured by the camera device, is obtained from the image of the specific touch object captured by that camera device. The angle between the line from the preset coordinates to the camera device and the camera device's optical axis is obtained from the preset coordinates. The image distortion correction parameter of each camera device is then calculated with the optical-lens image-distortion calculation method. Here, the center of the entire image captured by a camera device is the point in that image corresponding to the camera device's lens optical axis.
To improve the accuracy of the image distortion correction parameter, in a preferred implementation at least two preset coordinates are set on the screen, and at least two first relative distances are obtained from the images, captured by each camera device, of the specific touch object at those preset coordinates. From each preset coordinate and its corresponding first relative distance, at least two image distortion correction parameters are calculated with the optical-lens image-distortion calculation method, and their mean is taken as the final value of the image distortion correction parameter. Setting several preset coordinates on the screen and averaging the corresponding image distortion correction parameters reduces the error and increases the accuracy of the parameter.
According to the paper "Geometric correction of images based on the distortion rate" (Applied Optics, 2006, No. 3), the formulas and computation of the wide-angle-lens image distortion correction algorithm are as follows.

The distortion rate is defined as:

D = (η − H) / H × 100%

where η is the actual imaging height, H is the ideal imaging height, and D is the distortion rate. The ideal imaging height follows as:

H = η / (1 + D)

where D = (θ − tanθ) / tanθ.

Introduce the intermediate variable h = r_max / tanθ_max. Here r is the distance from the center of the image of the touch object at the preset coordinates to the center of the entire image, r_max is the maximum value of r, and θ_max is a shape parameter whose value is determined by calibration. The θ value of each point is calculated from:

r / h = tanθ, that is, θ = arctan(r / h)

Suppose the original distorted image coordinates are (x, y) and the corrected coordinates are (X, Y). The original distorted image coordinates are then corrected according to the following formulas:

1) θ = arctan(r / h) = arctan(r·tanθ_max / r_max)

2) D = (θ − tanθ) / tanθ

3) X = x / (1 + D), Y = y / (1 + D)
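A minimal sketch of correction steps 1) to 3) in Python (assuming θ_max and r_max have already been calibrated; the function name and sample values are illustrative):

```python
import math

def correct_point(x, y, theta_max, r_max):
    """Correct one distorted pixel (x, y), measured from the image center,
    using the distortion-rate model of steps 1)-3)."""
    r = math.hypot(x, y)
    if r == 0:
        return x, y  # the image center is undistorted
    theta = math.atan(r * math.tan(theta_max) / r_max)  # step 1)
    D = (theta - math.tan(theta)) / math.tan(theta)     # step 2)
    return x / (1 + D), y / (1 + D)                     # step 3)

# For barrel distortion theta < tan(theta), so D < 0 and the corrected
# point moves outward from the image center.
X, Y = correct_point(100.0, 0.0, theta_max=1.2, r_max=320.0)
```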
See Fig. 3 and Fig. 4. Fig. 3 shows the principle of the image distortion correction performed by the camera type touch detection positioning method of the present invention on a camera device with a wide-angle lens; Fig. 4 shows the image, containing the effective touch detection area, captured by the camera device with the wide-angle lens. In Fig. 4, when y is very small, x ≈ r.
When the above distortion-rate correction method is introduced into the camera type touch detection positioning method, the image distortion correction parameter is the shape parameter θ_max.

The ideal value of the first relative distance is:

4) R = r / (1 + D) = r·tanθ / θ = r²·tanθ_max / (r_max · arctan(r·tanθ_max / r_max))

where r_max, the maximum value of r, is the distance from the edge of the entire image to its center and can be obtained from the entire image captured by each camera device.

Let α be the angle between the line from the preset coordinates to the camera device and the camera device's optical axis, and let θ be the angle of view at which the camera device sees the preset coordinates. Then:

α ≈ θ / 2 = (θ_max / 2) · (R / R_max)

where θ_max is the shape parameter and R_max is the maximum value of R. Substituting equation 4) gives:

α = r²·θ_max² / (2·r_max² · arctan(r·tanθ_max / r_max))

If the angle α between the line joining the preset coordinates to the camera lens and the optical axis of the camera device is set to 30 degrees, then:

r²·θ_max² / (2·r_max² · arctan(r·tanθ_max / r_max)) = π/6

Solving this equation for the shape parameter θ_max gives:

tan(3·r²·θ_max² / (π·r_max²)) = (r / r_max)·tanθ_max
Define the function

f(θ_max) = tan(3·r²·θ_max² / (π·r_max²)) − (r / r_max)·tanθ_max

in which θ_max is the only unknown. Its value range is taken as (π/3, π/2), i.e. (1.0472, 1.5708), with a step of 0.008725 (an angle of 0.5 degrees); an even smaller step may be used for a more accurate result. f(θ_max) is evaluated at every θ_max value in the range (π/3, π/2), and the θ_max that makes f(θ_max) closest to 0 is the shape parameter sought. This shape parameter θ_max is the image distortion correction parameter of a camera device with a wide-angle lens.
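The grid search over (π/3, π/2) described above can be sketched as follows (a sketch under the stated assumptions; r and r_max would come from the calibration image, and the values used here are illustrative):

```python
import math

def solve_theta_max(r, r_max, step=0.008725):
    """Search (pi/3, pi/2) with the given step for the theta_max that
    brings f(theta_max) = tan(3*r^2*theta_max^2 / (pi*r_max^2))
                          - (r/r_max)*tan(theta_max)
    closest to zero."""
    best, best_err = None, float("inf")
    t = math.pi / 3 + step
    while t < math.pi / 2:
        f = math.tan(3 * r * r * t * t / (math.pi * r_max * r_max)) \
            - (r / r_max) * math.tan(t)
        if abs(f) < best_err:
            best, best_err = t, abs(f)
        t += step
    return best

theta_max = solve_theta_max(r=200.0, r_max=320.0)
```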
According to " FISH EYE LENS OPTICS " (Wang Yongzhong, Science Press), flake glasses head image distortion correction algorithm is as follows:
Known fish eye lens focal length is f 0, establishing the panorama picture of fisheye lens radius is r 0, the center of the image of the touch objects on the described preset coordinates is r to first relative distance at the center of described entire image 1If r 1≤ r 0﹠amp; r 1<f 0, utilize quadrature alignment model r=fsin ω to obtain the distortion visual angle ω of current pixel point earlier, get:
ω = arcsin ( r 1 f 0 )
Under orthogonal model, radially magnification is:
β r = f cos ω r
Wherein, cos ω = 1 - sin 2 ω = 1 - ( r 1 f 0 ) 2 = f 0 2 - r 1 2 f 0
Can get thus,
Figure DEST_PATH_GSB00000502889400093
Suppose the current pixel coordinate for (x, y), the size of the entire image of lens shooting is height * width, then the distortion correction result be (X, Y),
( 12 ) - - - X = width 2 + x - cenw β r = width 2 + r 0 · ( x - cenw ) f 0 2 - r 1 2 Y = height 2 + ( y - cenh ) β r = height 2 + r 0 · ( y - cenh ) f 0 2 - r 1 2
Wherein, width is the width on the described camera head entire image horizontal direction of taking,
cenw = width 2 .
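Equation (12) can be sketched in Python as follows (a minimal sketch; the 640 × 480 image size and the values of f0 and r0 are illustrative assumptions):

```python
import math

def fisheye_correct(x, y, width, height, f0, r0):
    """Correct one pixel of a fish-eye image using equation (12).
    f0: lens focal length; r0: fish-eye panoramic image radius,
    both in the same pixel units as (x, y)."""
    cenw, cenh = width / 2, height / 2
    r1 = math.hypot(x - cenw, y - cenh)   # distance to the image center
    if r1 == 0:
        return cenw, cenh                 # the center maps to itself
    scale = r0 / math.sqrt(f0 * f0 - r1 * r1)   # this is 1 / beta_r
    return cenw + (x - cenw) * scale, cenh + (y - cenh) * scale

X, Y = fisheye_correct(400.0, 240.0, width=640, height=480, f0=400.0, r0=300.0)
```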
See Fig. 5 and Fig. 6. Fig. 5 shows the principle of the image distortion correction performed by the camera type touch detection positioning method of the present invention on a camera device with a fish-eye lens; Fig. 6 shows the image, containing the effective touch detection area, captured by the camera device with the fish-eye lens.
Applying the above derivation in the camera type touch detection positioning method of the present invention, the image distortion correction parameter is the fish-eye imaging radius r0.

The first relative distance r1 can be obtained from the image of the specific touch object captured by the camera device. Let the corrected distance be r1'. From the definition of the radial magnification β_r:

r1' = r0·r1 / √(f0² − r1²)

The corrected maximum imaging radius within the angle of view is:

r0' = r0² / √(f0² − r0²)

For a fish-eye lens with a 180-degree angle of view, the optical axis bisects the angle of view into π/2 on each side, and the maximum radius r0' in the image on either side of the optical axis corresponds to an angle of view of π/2. The angle γ corresponding to r1 can therefore be calculated proportionally:

γ = (π/2)·(r1' / r0') = (π/2)·(r0·r1 / √(f0² − r1²))·(√(f0² − r0²) / r0²) = (π/2)·(r1 / r0)·√(f0² − r0²) / √(f0² − r1²)

If the angle between the line from the preset coordinates to the fish-eye lens and the fish-eye optical axis is 45 degrees, and the focal length f0 is known from the lens instruction manual, then:

γ = (π/2)·(r1 / r0)·√(f0² − r0²) / √(f0² − r1²) = π/4

that is:

r0 = 2·f0·r1 / √(f0² + 3·r1²)

This imaging radius r0 is the image distortion correction parameter of the camera device with the fish-eye lens.
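Calibrating the fish-eye correction parameter r0 from a single preset point whose line of sight is 45 degrees off the optical axis can be sketched as follows (function names are illustrative; the f0 and r1 values are assumptions):

```python
import math

def calibrate_fisheye_radius(r1, f0):
    """Solve r0 = 2*f0*r1 / sqrt(f0^2 + 3*r1^2): the panoramic image
    radius, from the measured first relative distance r1 of a preset
    point seen 45 degrees off the optical axis."""
    return 2 * f0 * r1 / math.sqrt(f0 * f0 + 3 * r1 * r1)

def gamma(r1, r0, f0):
    """Angle corresponding to r1; equals pi/4 for the calibration point."""
    return (math.pi / 2) * (r1 / r0) * math.sqrt(f0 * f0 - r0 * r0) \
           / math.sqrt(f0 * f0 - r1 * r1)

r0 = calibrate_fisheye_radius(r1=250.0, f0=400.0)
# Self-check: plugging r0 back into gamma reproduces the 45-degree angle.
```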
In step S106, a touch object to be positioned on the screen is photographed by each of the plurality of camera devices.
After the image distortion correction parameters have been calculated, the touch object to be positioned on the screen is photographed by each of the plurality of camera devices. The touch object to be positioned is simply a touch object whose coordinates with respect to the screen are unknown.
In step S108, the second relative distance, from the image center of the touch object to be positioned as captured by a camera device to the center of the entire image, is obtained, and its corrected value is calculated from the image distortion correction parameter with the optical-lens distortion correction method.
The second relative distance is the distance from the image center of the touch object to be positioned, in the image captured by the camera device, to the center of the entire image. It is used to calculate the angle between the line connecting each touch object's coordinates to the camera device and the screen edge, i.e. the coordinate angle information.
For a camera device with a wide-angle lens, once the distortion correction parameter, i.e. the shape parameter θ_max, has been obtained, the distortion rate D can be calculated from formulas 1) and 2), and the second relative distance can then be corrected according to formula 3).

Because a touch action is judged valid only when the touch object is close to the screen, only a very small range in the Y direction of the image captured by the camera device is the effective touch detection area. Therefore, in formulas 3) and (12), when Y is very small its influence can be neglected, and the corrected value X of the horizontal projection of the second relative distance can be calculated directly:

X = x / (1 + D)

From the corrected value X of the horizontal projection of the second relative distance, the angle between the line connecting the touch object's coordinates to the camera device and the screen edge (i.e. the coordinate angle information) can be calculated, and from it the coordinates of the touch object.
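Combining formula 3) with the proportional relation α ≈ (θ_max/2)·(R/R_max) given earlier, the coordinate angle information of a wide-angle corner camera can be sketched as follows (a sketch under this embodiment's geometry, where the optical axis is 45 degrees off the top edge; names and values are illustrative):

```python
import math

def coordinate_angle(x, theta_max, r_max, R_max):
    """From the uncorrected horizontal distance x (signed, measured from
    the image center), return the angle between the camera-to-touch-object
    line and the screen edge, for a corner camera whose optical axis is
    45 degrees off the top edge."""
    if x == 0:
        return math.pi / 4  # a point on the optical axis
    theta = math.atan(abs(x) * math.tan(theta_max) / r_max)
    D = (theta - math.tan(theta)) / math.tan(theta)
    X = x / (1 + D)                       # corrected horizontal distance
    alpha = (theta_max / 2) * X / R_max   # angle off the optical axis
    return math.pi / 4 + alpha

angle = coordinate_angle(100.0, theta_max=1.2, r_max=320.0, R_max=380.0)
```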
For the camera device with the fish-eye lens, once the fish-eye imaging radius r0 has been obtained, the radial magnification β_r can be calculated according to formula (11), and then, according to formula (12):

X = width/2 + r0·(x − cenw) / √(f0² − r1²)

the corrected value X of the horizontal projection of the second relative distance can be calculated.
If the mean value of the image distortion correction parameters was calculated in step S104, then in this step the corrected value of the second relative distance is calculated from that mean value.

This step may further include building a parameter look-up table: according to the image distortion correction parameter obtained in step S104, the corrected value of the relative distance from each point of the entire image captured by a camera device to the image center is calculated with the optical-lens distortion correction method, and the table is built from each relative distance and its corrected value. Looking up the table with the second relative distance then yields the corresponding corrected value.

For a camera device with a wide-angle lens, the image distortion correction parameter, i.e. the shape parameter θ_max, is stored. For each point of the entire image captured by the camera device, its relative distance to the image center is recorded, and from these relative distances and θ_max the corrected value of each relative distance is calculated. The parameter look-up table is then built from the relative distances and their corrected values.

Once the parameter look-up table has been built, the corrected value of the second relative distance is obtained by looking up the table with the second relative distance from the image center of each touch object to be positioned to the center of the entire image.

For the fish-eye lens, the parameter look-up table can be built in the same way.

By building the parameter look-up table, the corrected value of the second relative distance can be found directly and quickly, which speeds up image correction and the overall positioning speed of the camera type touch detection positioning method.
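The parameter look-up table for a wide-angle camera device can be sketched as follows (a sketch; it precomputes the correction of steps 1) to 3) for every integer pixel distance, so that run-time correction is a single table lookup):

```python
import math

def build_lut(theta_max, r_max):
    """Precompute corrected distances for every integer pixel distance
    r in [0, r_max], so correcting a second relative distance at run
    time is one list indexing operation."""
    lut = [0.0]  # the image center needs no correction
    for r in range(1, int(r_max) + 1):
        theta = math.atan(r * math.tan(theta_max) / r_max)
        D = (theta - math.tan(theta)) / math.tan(theta)
        lut.append(r / (1 + D))
    return lut

lut = build_lut(theta_max=1.2, r_max=320.0)
corrected = lut[100]  # corrected value for a second relative distance of 100
```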
In step S110, according to the actual coordinate of the described touch objects to be positioned of the correction value of described second relative distance.
When calculating the actual coordinate of described touch objects,, can calculate the angle that is wired to described screen edge that obtains each described touch article coordinate and described camera head, promptly described coordinate angle information according to the corrected value of described second relative distance.Then, with the described coordinate angle information combination of extracting respectively in a plurality of camera heads, utilize trigonometric function and screen size to calculate the coordinate position of touch point.
The camera-type touch detection positioning method ends at step S112.
Compared with the prior art, before positioning a touch object, the camera-type touch detection positioning method of the present invention first calculates the image distortion correction parameter corresponding to each first relative distance from the images of the specific touch object and the preset coordinates. Then, when detecting and positioning a touch object to be positioned, the second relative distance from the image center of the touch object captured by the corresponding camera to the center of the whole image is corrected according to the image distortion correction parameter, and the actual coordinates of the touch object are calculated from the corrected value of the second relative distance. The error introduced by the optical-lens image distortion of the cameras is thereby reduced, and the calculation of the actual coordinates of the touch object is more accurate.
Please refer to Fig. 7, which is a structural diagram of the first embodiment of the camera-type touch detection positioning system of the present invention.
The camera-type touch detection positioning system 10 comprises an image capture module 11, an image correction module 12 and a positioning module 13.
The image capture module 11 is used to capture, through a plurality of cameras, both the specific touch object on the preset coordinates of the screen and the touch object to be positioned on the screen.
The image capture module 11 is connected to a plurality of cameras, which are arranged outside the display area of the screen and whose coordinates with respect to the screen are known. In the present embodiment, three cameras are arranged on the top edge of the screen. The first and second cameras are located at the intersections of the top edge with the two side edges; each has a wide-angle lens with a 90-degree viewing angle whose optical axis makes a 45-degree angle with the top edge of the screen. The third camera is located at the midpoint of the top edge; it has a fisheye lens with a 180-degree viewing angle whose optical axis is perpendicular to the top edge of the screen. The third camera may also be replaced by two cameras with 90-degree wide-angle lenses. The viewing range of each of the three cameras covers the whole screen, and in the present embodiment the lens optical axes of the three cameras are parallel to, or at a small angle to, the plane of the screen.
The preset coordinates are known with respect to the screen, and the angle between the line from a preset coordinate to a camera and the optical axis of that camera can be calculated from the preset coordinate, the coordinate of the camera, and the angle between the camera's optical axis and the screen edge. The specific touch object performs touch actions only on the preset coordinates, i.e. the coordinates of the specific touch object with respect to the screen are the preset coordinates.
The image correction module 12 is used to obtain the first relative distance from the image center of the specific touch object to the center of the whole image captured by the corresponding camera; to calculate the image distortion correction parameter from the first relative distance and the preset coordinates, using the image distortion calculation method of an optical lens; then to obtain the second relative distance from the image center of the touch object to be positioned captured by the camera to the center of the whole image; and to obtain the corrected value of the second relative distance from the image distortion correction parameter, using the image distortion correction method of an optical lens.
The image correction module 12 obtains, from the image of the specific touch object captured by each camera, the first relative distance from the image center of the specific touch object to the center of the whole image captured by the corresponding camera, and obtains from each preset coordinate the angle between the line from the preset coordinate to the camera and the camera's optical axis. Combining each preset coordinate, it then calculates the image distortion correction parameter of each camera using the image distortion calculation method of an optical lens. The center of the whole image captured by a camera is the point in the whole image corresponding to the camera's lens optical axis.
To improve the accuracy of the image distortion correction parameter, as a preferred implementation, the screen comprises at least two preset coordinates, and the image correction module 12 obtains at least two first relative distances from the images of the specific touch object on the at least two preset coordinates captured by each camera.
From the at least two preset coordinates and the corresponding first relative distances, the image correction module 12 calculates at least two corresponding image distortion correction parameters using the image distortion calculation method of an optical lens, averages them, and takes the mean value as the final value of the image distortion correction parameter.
By calculating the image distortion correction parameter separately for a plurality of preset coordinates on the screen and then averaging, the image correction module 12 reduces error and increases the accuracy of the image distortion correction parameter.
The image correction module 12 comprises a wide-angle-lens correction parameter generation module 121 and a wide-angle-lens image correction module 125.
According to the paper "Geometric correction of images based on the aberration rate" (Applied Optics, 2006, No. 3), the formulas and computation of the wide-angle-lens image distortion correction algorithm are as follows:
The aberration rate is defined as:

D = (η − H)/H × 100%

where η is the actual imaging height, H is the ideal imaging height, and D is the aberration rate. The ideal imaging height H follows as:

H = η/(1 + D)

where D = (θ − tan θ)/tan θ.

Introduce the intermediate variable h = r_max/tan θ_max, where r is the distance from the center of the image of the touch object on the preset coordinate to the center of the whole image, r_max is the maximum value of r, and θ_max is the shape parameter, whose value depends on the calibration result. The value of θ at each point is calculated from:

r/h = tan θ, i.e. θ = arctan(r/h)

Suppose the original distorted image coordinate is (x, y) and the corrected coordinate is (X, Y); the original distorted image coordinate is then corrected according to:

(1) θ = arctan(r/h) = arctan(r·tan θ_max / r_max)

(2) D = (θ − tan θ)/tan θ

(3) X = x/(1 + D), Y = y/(1 + D)
When the above aberration-rate-based image correction method is applied in the wide-angle-lens correction parameter generation module 121, the image distortion correction parameter is the shape parameter θ_max.
The ideal value of the first relative distance is:

(4) R = r/(1 + D) = r·tan θ/θ = r²·tan θ_max / (r_max·arctan(r·tan θ_max / r_max))

where r_max, the maximum value of r, is the distance from the edge of the whole image to its center and can be obtained from the whole image captured by each camera.
Let α be the angle between the line from the preset coordinate to the camera and the camera's optical axis, and let θ be the viewing angle at which the camera captures the preset coordinate. Then:

α ≈ θ/2 = (θ_max/2)·(R/R_max)

where θ_max is the shape parameter and R_max = r_max·tan θ_max/θ_max is the maximum value of R. Substituting equation (4) gives:

(5) α = r²·θ_max² / (2·r_max²·arctan(r·tan θ_max / r_max))
The wide-angle-lens correction parameter generation module 121 obtains the first relative distance from the image of the specific touch object captured by the camera with a wide-angle lens, obtains from the preset coordinate the angle between the line from the preset coordinate to the camera and the optical axis of the camera with the wide-angle lens, and calculates the shape parameter θ_max according to formula (5), using the distance from the edge of the whole image to its center:
If the angle α between the optical axis of the camera lens and the line from the preset coordinate to the camera is set to 30 degrees, then:

α = r²·θ_max² / (2·r_max²·arctan(r·tan θ_max / r_max)) = π/6

Solving this equation for the shape parameter θ_max gives:

tan(3·r²·θ_max² / (π·r_max²)) = (r/r_max)·tan θ_max

Let f(θ_max) = tan(3·r²·θ_max² / (π·r_max²)) − (r/r_max)·tan θ_max.

The span of θ_max, the only unknown variable in the formula, is taken as (π/3, π/2), i.e. (1.0472, 1.5708), scanned with a step of 0.008725 (a 0.5-degree angle); an even smaller step may be used for a more accurate result. Evaluating f(θ_max) over the whole span (π/3, π/2), the θ_max that brings f(θ_max) closest to 0 is the shape parameter sought. This shape parameter θ_max is the image distortion correction parameter of the camera with a wide-angle lens.
After the wide-angle-lens correction parameter generation module 121 has finished calculating the image distortion correction parameter, the image capture module 11 captures the touch object to be positioned on the screen through the plurality of cameras. A touch object to be positioned is a touch object whose coordinates with respect to the screen are unknown.
The wide-angle-lens image correction module 125 calculates the aberration rate D according to formulas (1) and (2), and then calculates the corrected value of the second relative distance according to formula (3).
Because a touch object is judged to be a valid touch action only when it is close to the screen, only a very small range in the Y direction of the image captured by the camera is the valid touch detection area. Therefore, in formula (3) and formula (12), when Y is very small its influence can be ignored, and the corrected value X of the horizontal projection of the second relative distance is calculated directly:

X = x/(1 + D)

where x is the horizontal projection of the second relative distance and X is the corrected value of that horizontal projection.
From the corrected value X of the horizontal projection of the second relative distance, the wide-angle-lens image correction module 125 can calculate the angle between the screen edge and the line connecting the coordinate of the touch object to the camera (i.e. the coordinate angle information), and thereby calculate the coordinates of the touch object.
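One way this angle computation might look, inferred from the document's small-angle relation α ≈ (θ_max/2)·(R/R_max): after correction, the distance is ideally linear in the viewing angle, so the corrected horizontal distance maps linearly onto the half view angle. The helper below is a sketch under that inference, assuming Y ≈ 0 so that r ≈ |x|; the function name and sample values are hypothetical:

```python
import math

def coordinate_angle(x, theta_max, r_max):
    """Correct the horizontal distance x via formulas (1)-(3), then map the
    result linearly to the viewing half-angle: alpha = (theta_max/2)*(X/R_max)."""
    if x == 0:
        return 0.0
    theta = math.atan(x * math.tan(theta_max) / r_max)
    corrected_x = x * math.tan(theta) / theta              # X = x / (1 + D)
    r_ideal_max = r_max * math.tan(theta_max) / theta_max  # R_max
    return (theta_max / 2.0) * (corrected_x / r_ideal_max)

# A point at the image edge (x == r_max) should map to half the lens view angle.
edge_angle = coordinate_angle(100.0, 1.2, 100.0)
```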
The image correction module 12 further comprises a fisheye-lens correction parameter generation module 122 and a fisheye-lens image correction module 126.
According to "Fish Eye Lens Optics" (Wang Yongzhong, Science Press), the fisheye-lens image distortion correction algorithm is as follows:
Let the focal length of the fisheye lens be f_0 and the imaging radius of the fisheye panorama be r_0, and let r_1 be the first relative distance from the center of the image of the touch object on the preset coordinate to the center of the whole image. If r_1 ≤ r_0 and r_1 < f_0, the distortion field angle ω of the current pixel is first obtained from the orthographic projection model r = f·sin ω:

ω = arcsin(r_1/f_0)

Under the orthographic model, the radial magnification is:

β_r = f_0·cos ω / r_0

where cos ω = √(1 − sin²ω) = √(1 − (r_1/f_0)²) = √(f_0² − r_1²)/f_0.

It follows that β_r = √(f_0² − r_1²)/r_0.

Suppose the current pixel coordinate is (x, y) and the size of the whole image captured by the lens is height × width; the distortion correction result (X, Y) is then:

(12) X = width/2 + (x − cenw)/β_r = width/2 + r_0·(x − cenw)/√(f_0² − r_1²)
     Y = height/2 + (y − cenh)/β_r = height/2 + r_0·(y − cenh)/√(f_0² − r_1²)

where width is the width of the whole image captured by the camera in the horizontal direction, cenw = width/2, and cenh = height/2.
When the above derivation is applied in the camera-type touch detection positioning system of the present invention, the image distortion correction parameter is the fisheye imaging radius r_0.
The first relative distance r_1 can be obtained from the image of the specific touch object captured by the camera. Let the corrected distance be r_1′; then, from the definition of the radial magnification β_r:

r_1′ = r_0·r_1/√(f_0² − r_1²)

The corrected maximum imaging radius within the field of view is:

r_0′ = r_0²/√(f_0² − r_0²)

For a fisheye lens with a 180-degree viewing angle, the optical axis bisects the field of view into π/2 on each side, and the maximum radius r_0′ on each side of the optical axis corresponds to a field angle of π/2. The field angle γ corresponding to r_1 can then be calculated proportionally:

(13) γ = (π/2)·(r_1′/r_0′) = (π/2)·(r_1/r_0)·√((f_0² − r_0²)/(f_0² − r_1²))
The fisheye-lens correction parameter generation module 122 obtains the angle between the line from the preset coordinate to the camera and the camera's optical axis and, from the first relative distance, this angle and the focal length of the fisheye lens, calculates the imaging radius r_0 as the image distortion correction parameter according to:

γ = (π/2)·(r_1/r_0)·√((f_0² − r_0²)/(f_0² − r_1²))

where r_1 is the first relative distance, γ is the angle between the line from the preset coordinate to the camera and the camera's optical axis, and f_0 is the focal length of the fisheye lens.
If the angle between the fisheye optical axis and the line from the preset coordinate to the fisheye lens is 45 degrees, and the focal length f_0 is obtained from the lens instruction manual, then:

γ = (π/2)·(r_1/r_0)·√((f_0² − r_0²)/(f_0² − r_1²)) = π/4

that is, r_0 = 2·f_0·r_1/√(f_0² + 3·r_1²).

The imaging radius r_0 is the image distortion correction parameter of the camera with a fisheye lens.
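The 45-degree calibration can be sketched as follows; the closed form for r_0 is the one derived above, the round-trip check uses formula (13), and the function names and sample values are assumptions:

```python
import math

def fisheye_imaging_radius(f0, r1):
    """Closed form r0 = 2*f0*r1/sqrt(f0^2 + 3*r1^2), obtained by setting
    gamma = pi/4 in formula (13) and solving for r0."""
    return 2.0 * f0 * r1 / math.sqrt(f0 ** 2 + 3.0 * r1 ** 2)

def field_angle(f0, r0, r1):
    """Formula (13): field angle gamma corresponding to image distance r1."""
    return (math.pi / 2.0) * (r1 / r0) * math.sqrt(
        (f0 ** 2 - r0 ** 2) / (f0 ** 2 - r1 ** 2))

f0, r1 = 10.0, 5.0
r0 = fisheye_imaging_radius(f0, r1)
check = field_angle(f0, r0, r1)  # should recover pi/4
```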
After the fisheye-lens correction parameter generation module 122 has calculated the fisheye imaging radius r_0 and the radial magnification β_r, the image capture module 11 captures the touch object to be positioned on the screen through the plurality of cameras. A touch object to be positioned is a touch object whose coordinates with respect to the screen are unknown.
The fisheye-lens image correction module 126 calculates the radial magnification β_r according to:

β_r = √(f_0² − r_2²)/r_0

where r_2 is the second relative distance and β_r is the radial magnification.

Then, according to formula (12):

X = width/2 + r_0·(x − cenw)/√(f_0² − r_2²)

the corrected value X of the horizontal projection of the second relative distance can be calculated, where x is the horizontal projection of the second relative distance and X is the corrected value of that horizontal projection.
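A sketch of this horizontal correction, assuming the valid touch band makes Y negligible so that r_2 equals the horizontal offset |x − cenw|; the function name and sample values are assumptions, and the model requires r_2 < f_0:

```python
import math

def fisheye_correct_x(x, f0, r0, width):
    """Formula (12) restricted to the horizontal axis, with r2 = |x - cenw|."""
    cenw = width / 2.0
    r2 = abs(x - cenw)
    beta_r = math.sqrt(f0 ** 2 - r2 ** 2) / r0  # radial magnification
    return cenw + (x - cenw) / beta_r

# A pixel 30 units right of the image center, for an assumed f0 and r0.
corrected = fisheye_correct_x(x=130.0, f0=100.0, r0=80.0, width=200.0)
```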
The positioning module 13 calculates the actual coordinates of the touch object to be positioned from the corrected value of the second relative distance.
When calculating the actual coordinates of the touch object, the corrected value of the second relative distance yields the angle between the screen edge and the line connecting each touch object's coordinate to the camera, i.e. the coordinate angle information. Then, combining the coordinate angle information extracted from the plurality of cameras, the coordinate position of the touch point is calculated using trigonometric functions and the screen size.
Compared with the prior art, the camera-type touch detection positioning system of the present invention has the image correction module which, before a touch object is positioned, first calculates each first relative distance and the image distortion correction parameter from the images of the specific touch object and the preset coordinates. Then, when a touch object to be positioned is detected and positioned, the second relative distance from the image center of the touch object captured by the corresponding camera to the center of the whole image is corrected according to the image distortion correction parameter, and the positioning module calculates the actual coordinates of the touch object to be positioned from the corrected value of the second relative distance. The error introduced by the optical-lens image distortion of the cameras is thereby reduced, and the calculation of the actual coordinates of the touch object is more accurate.
Please refer to Fig. 8, which is a structural diagram of a preferred implementation of the camera-type touch detection system of the present invention.
As a preferred implementation, in the camera-type touch detection system 20 the image correction module 22 further comprises a parameter lookup table module 223. The parameter lookup table module 223 receives the image distortion correction parameters generated by the wide-angle-lens correction parameter generation module 221 and the fisheye-lens correction parameter generation module 222, uses the image distortion correction method of an optical lens to calculate the corrected value of the relative distance from each point in the whole image captured by a camera to the center of the whole image, and builds a parameter lookup table from each relative distance and its corrected value. The wide-angle-lens image correction module 225 and the fisheye-lens image correction module 226 look up the parameter lookup table of the corresponding camera according to the second relative distance and obtain its corrected value.
For a camera with a wide-angle lens, the parameter lookup table module 223 receives the image distortion correction parameter generated by the wide-angle-lens correction parameter generation module 221, i.e. the shape parameter θ_max. For each point in the whole image captured by the camera, it records the relative distance to the center of the whole image, then calculates from these relative distances and the shape parameter θ_max the corrected value of each point's relative distance to the image center, and builds the parameter lookup table from the relative distances and their corrected values.
From the second relative distance between the center of the image of each touch object to be positioned captured by the camera and the center of the whole image, the wide-angle-lens image correction module 225 looks up the parameter lookup table and obtains the corrected value of the second relative distance.
For the camera with a fisheye lens, the parameter lookup table module 223 receives the imaging radius r_0 generated by the fisheye-lens correction parameter generation module 222 and builds the parameter lookup table in the same way.
From the second relative distance between the center of the image of each touch object to be positioned captured by the camera and the center of the whole image, the fisheye-lens image correction module 226 looks up the parameter lookup table and obtains the corrected value of the second relative distance.
With the parameter lookup table built by the parameter lookup table module 223, the corrected value of the second relative distance can be found directly and quickly, which speeds up image correction and the overall positioning speed of the camera-type touch detection system.
The above-described embodiments of the present invention do not limit the scope of protection of the present invention. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall be included within the protection scope of the claims of the present invention.

Claims (10)

1. A camera-type touch detection method, characterized by comprising the following steps:
capturing the specific touch object on the preset coordinates of a screen respectively through a plurality of cameras arranged outside the display area of the screen, wherein the viewing range of each camera covers the whole screen;
obtaining a first relative distance from the image center of the specific touch object to the center of the whole image captured by the corresponding camera, and calculating the image distortion correction parameter of each camera from the first relative distance and the preset coordinates, using the image distortion calculation method of an optical lens;
capturing a touch object to be positioned on the screen respectively through the plurality of cameras;
obtaining a second relative distance from the image center of the touch object to be positioned captured by a camera to the center of the whole image, and obtaining the corrected value of the second relative distance from the image distortion correction parameter, using the image distortion correction method of an optical lens;
calculating, from the corrected value of the second relative distance, the coordinate angle information between the screen edge and the line connecting each touch object coordinate to the camera, combining the coordinate angle information extracted from the plurality of cameras, and calculating the actual coordinates of the touch object using trigonometric functions and the screen size.
2. The camera-type touch detection method as claimed in claim 1, characterized in that the step of obtaining the corrected value of the second relative distance from the image distortion correction parameter, using the image distortion correction method of an optical lens, comprises:
calculating, from the image distortion correction parameter and using the image distortion correction method of an optical lens, the corrected value of the relative distance from each point in the whole image captured by the camera to the center of the whole image, and building a parameter lookup table from each relative distance and its corrected value;
looking up the parameter lookup table according to the second relative distance to obtain the corrected value of the second relative distance.
3. The camera-type touch detection method as claimed in claim 1 or 2, characterized in that when the lens of the camera is a wide-angle lens, the step of calculating the image distortion correction parameter from the first relative distance and the preset coordinates, using the image distortion calculation method of an optical lens, comprises:
obtaining the angle between the line from the preset coordinate to the camera and the camera's optical axis, and calculating, from the first relative distance, the angle and the distance from the edge of the whole image to its center, the shape parameter θ_max as the distortion correction parameter according to:

α = r_1²·θ_max² / (2·r_max²·arctan(r_1·tan θ_max / r_max))

where r_1 is the first relative distance, r_max is the maximum distance from the edge of the whole image to its center, and α is the angle between the line from the preset coordinate to the camera and the camera's optical axis;
the step of obtaining the corrected value of the second relative distance from the image distortion correction parameter, using the image distortion correction method of an optical lens, comprises:
calculating the aberration rate D according to:

θ = arctan(r_2·tan θ_max / r_max)

D = (θ − tan θ)/tan θ

where r_2 is the second relative distance and θ_max is the image distortion correction parameter;

then calculating the corrected value of the horizontal projection of the second relative distance according to:

X = x/(1 + D)

where x is the horizontal projection of the second relative distance and X is the corrected value of that horizontal projection.
4. The camera-type touch detection method as claimed in claim 3, characterized in that when the lens of the camera is a fisheye lens, the step of calculating the image distortion correction parameter from the first relative distance and the preset coordinates, using the image distortion calculation method of an optical lens, comprises:
obtaining the angle between the line from the preset coordinate to the camera and the camera's optical axis, and calculating, from the first relative distance, the angle and the focal length of the fisheye lens, the imaging radius r_0 as the image distortion correction parameter according to:

γ = (π/2)·(r_1/r_0)·√((f_0² − r_0²)/(f_0² − r_1²))

where r_1 is the first relative distance, γ is the angle between the line from the preset coordinate to the camera and the camera's optical axis, and f_0 is the focal length of the fisheye lens;
the step of obtaining the corrected value of the second relative distance from the image distortion correction parameter, using the image distortion correction method of an optical lens, comprises:
calculating the radial magnification β_r according to:

β_r = √(f_0² − r_2²)/r_0

where r_2 is the second relative distance and β_r is the radial magnification;

then calculating the corrected value of the horizontal projection of the second relative distance according to:

X = width/2 + r_0·(x − cenw)/√(f_0² − r_2²)

where width is the width of the whole image in the horizontal direction, cenw = width/2, x is the horizontal projection of the second relative distance, and X is the corrected value of that horizontal projection.
5. The camera-type touch detection method as claimed in claim 1, characterized in that, in the step of capturing the specific touch object on the preset coordinates of the screen through the plurality of cameras, there are at least two preset coordinates;
the step of obtaining the first relative distance from the image center of the specific touch object to the center of the whole image captured by the corresponding camera, and calculating the image distortion correction parameter from the first relative distance and the preset coordinates using the image distortion calculation method of an optical lens, comprises:
obtaining respectively at least two first relative distances, from the image centers of the specific touch object on the at least two preset coordinates to the center of the whole image captured by the camera;
calculating, respectively from the at least two first relative distances and the corresponding at least two preset coordinates, at least two image distortion correction parameters using the image distortion calculation method of an optical lens, and taking the mean value of the at least two image distortion correction parameters as the final value of the image distortion correction parameter.
6. camera type touch detection positioning system is characterized in that comprising:
An image capture module, be used for respectively the specific touch objects on the screen preset coordinates being taken by the camera head outside a plurality of viewing areas that are arranged on screen, and the touch objects to be positioned on the screen taken, wherein, the scope at the shooting visual angle of each described camera head comprises whole described screen;
An image correction module, be used to obtain first relative distance of the picture centre of described specific touch objects to the entire image center that corresponding described camera head is taken, according to described first relative distance and described preset coordinates, utilize the image deformation computing method of optical lens to calculate the image distortion correction parameter of each described camera head, the picture centre of obtaining the touch objects described to be positioned of described camera head shooting then arrives second relative distance at described entire image center, according to described image distortion correction parameter, utilize the image deformation bearing calibration of optical lens to obtain the corrected value of described second relative distance;
a locating module, configured to calculate, from the corrected value of the second relative distance, the angle information of the line connecting each touch object's coordinates to the camera device relative to the screen edge, combine the angle information extracted from the plurality of camera devices, and calculate the actual coordinates of the touch object using trigonometric functions and the screen size.
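The locating module's triangulation can be sketched as follows, assuming the two camera devices sit at the top-left and top-right corners of the screen and each reports the angle between the screen's top edge and the line to the touch object. The coordinate convention and names are illustrative, not taken from the claims.

```python
import math

def locate(alpha, beta, width):
    """Triangulate a touch point from the angles reported by two cameras
    mounted at the top-left (0, 0) and top-right (width, 0) corners.

    alpha, beta: angles (radians) between the screen's top edge and the
    line from each camera to the touch object.
    """
    ta, tb = math.tan(alpha), math.tan(beta)
    # Intersect the two sight lines: y = x*tan(alpha) = (width - x)*tan(beta)
    x = width * tb / (ta + tb)
    y = x * ta
    return x, y
```

For a point at (30, 40) on a screen 100 units wide, the two sight-line angles reproduce exactly that point.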
7. The camera type touch detection system as claimed in claim 6, characterized in that the image correction module comprises a parameter look-up table means, the parameter look-up table means being configured to calculate, from the image distortion correction parameter and using the imaging deformation correction method of the optical lens, the corrected value of the relative distance from each point in the entire image captured by the camera device to the center of the entire image, and to build a parameter look-up table from each relative distance and its corresponding corrected value;
after the image correction module obtains the second relative distance, it searches the parameter look-up table with the second relative distance to obtain the corresponding corrected value.
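A sketch of the look-up-table idea of claim 7: the per-distance correction is precomputed once for every radial distance the image can contain, so run-time positioning only indexes the table instead of evaluating trigonometry. The correction function is passed in as a parameter rather than fixed, since claims 8 and 9 define different lens models; the names here are illustrative.

```python
def build_lut(r_max, correct):
    """Precompute corrected values for every integer radial distance
    0..r_max; `correct` is any per-distance correction function."""
    return [correct(r) for r in range(int(r_max) + 1)]

def lookup(lut, r2):
    """Nearest-entry lookup for the second relative distance r2,
    clamped to the table range."""
    i = min(int(round(r2)), len(lut) - 1)
    return lut[i]
```

With a toy correction `2*r` and `r_max = 10`, a query at 3.4 rounds to entry 3 and out-of-range queries clamp to the last entry.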
8. The camera type touch detection system as claimed in claim 6 or 7, characterized in that the image correction module comprises a wide-angle lens correction parameter generation module and a wide-angle lens image correction module;
the wide-angle lens correction parameter generation module is configured to obtain the angle between the line connecting the preset coordinates to the camera device and the optical axis of the camera device, and, from the first relative distance, that angle, and the distance from the edge of the entire image to its center, calculate the angle-of-view parameter θmax as the distortion correction parameter according to the following formula:
α = r1² · θmax² / (2 · rmax² · arctan(r1 · tan θmax / rmax))
where r1 is the first relative distance, rmax is the maximum distance from the edge of the entire image to its center, and α is the angle between the line connecting the preset coordinates to the camera device and the optical axis of the camera device;
the wide-angle lens image correction module is configured to calculate the aberration rate D according to the following formulas:
θ = arctan(r2 · tan θmax / rmax)
D = (θ − tan θ) / tan θ
where r2 is the second relative distance and θmax is the image distortion correction parameter;
the corrected value of the horizontal projection distance of the second relative distance is then calculated according to the following formula:
X = x · (1 + D)
where x is the horizontal projection distance of the second relative distance, and X is the corrected value of that distance.
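The three correction formulas above (imaging angle θ, aberration rate D, and X = x · (1 + D)) can be sketched directly; this assumes r2 > 0 so that tan θ is nonzero, and the function name is illustrative.

```python
import math

def correct_wide_angle(x, r2, theta_max, r_max):
    """Correct the horizontal projection x of the second relative
    distance r2 for a wide-angle lens, following the claim's three steps.
    Requires r2 > 0 (theta = 0 would make D undefined)."""
    theta = math.atan(r2 * math.tan(theta_max) / r_max)
    # For 0 < theta < pi/2, tan(theta) > theta, so D is in (-1, 0):
    # the correction shrinks x (barrel distortion compensation).
    D = (theta - math.tan(theta)) / math.tan(theta)
    return x * (1 + D)
```

Since D lies strictly between −1 and 0, the corrected value always satisfies 0 < X < x.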
9. The camera type touch detection system as claimed in claim 8, characterized in that the image correction module further comprises a fisheye lens correction parameter generation module and a fisheye lens image correction module;
the fisheye lens correction parameter generation module is configured to obtain the angle between the line connecting the preset coordinates to the camera device and the optical axis of the camera device, and, from the first relative distance, that angle, and the focal length of the fisheye lens, calculate the imaging radius r0 as the image distortion correction parameter according to the following formula:
γ = (π / 2) · (r1 / r0) · √((f0² − r0²) / (f0² − r1²))
where r1 is the first relative distance, γ is the angle between the line connecting the preset coordinates to the camera device and the optical axis of the camera device, and f0 is the focal length of the fisheye lens;
the fisheye lens image correction module is configured to calculate the radial magnification βr according to the following formula:
βr = √(f0² − r2²) / r0
where r2 is the second relative distance and βr is the radial magnification;
the corrected value of the horizontal projection distance of the second relative distance is then calculated according to the following formula:
X = width / 2 + r0 · (x − cenw) / √(f0² − r2²)
where width is the horizontal width of the entire image, cenw is the horizontal coordinate of the center of the entire image, x is the horizontal projection distance of the second relative distance, and X is the corrected value of that distance.
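A sketch of the fisheye correction step, assuming cenw = width / 2 (the horizontal image center) and r2 < f0 so the square root is real; the function name and that cenw assumption are illustrative, not fixed by the claim text.

```python
import math

def correct_fisheye(x, r2, r0, f0, width):
    """Correct the horizontal projection x of the second relative
    distance r2 for a fisheye lens, via the radial magnification
    beta_r = sqrt(f0**2 - r2**2) / r0, so that
    X = width/2 + (x - cenw) / beta_r. Requires r2 < f0."""
    cenw = width / 2  # assumed horizontal image center
    beta_r = math.sqrt(f0 ** 2 - r2 ** 2) / r0
    return width / 2 + (x - cenw) / beta_r
```

A point imaged exactly at the horizontal center is left unchanged, while off-center points are scaled toward or away from the center by 1/βr.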
10. The camera type touch detection system as claimed in claim 6, characterized in that the preset coordinates number at least two; the image correction module obtains at least two first relative distances, from the image centers of the specific touch objects at the at least two preset coordinates to the center of the entire image captured by the camera device, respectively, calculates at least two image distortion correction parameters from the at least two first relative distances and the corresponding at least two preset coordinates, respectively, using the imaging deformation calculation method of the optical lens, and takes the mean value of the at least two image distortion correction parameters as the final value of the image distortion correction parameter.
CN2009101933024A 2009-10-26 2009-10-26 Camera type touch detection positioning method and camera type touch detection system Expired - Fee Related CN101697105B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009101933024A CN101697105B (en) 2009-10-26 2009-10-26 Camera type touch detection positioning method and camera type touch detection system


Publications (2)

Publication Number Publication Date
CN101697105A CN101697105A (en) 2010-04-21
CN101697105B true CN101697105B (en) 2011-09-14

Family

ID=42142211

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009101933024A Expired - Fee Related CN101697105B (en) 2009-10-26 2009-10-26 Camera type touch detection positioning method and camera type touch detection system

Country Status (1)

Country Link
CN (1) CN101697105B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103425356B (en) * 2010-06-25 2016-07-13 原相科技股份有限公司 Image sensing module
CN101887330B (en) * 2010-07-13 2012-10-03 世大光电(深圳)有限公司 Electronic equipment as well as single-camera object-positioning device and method thereof
CN102402339B (en) * 2010-09-07 2014-11-05 北京汇冠新技术股份有限公司 Touch positioning method, touch screen, touch system and display
CN102479379B (en) * 2010-11-19 2014-09-03 义晶科技股份有限公司 Image rectification method and relevant image rectification system
CN102023759B (en) * 2010-11-23 2013-05-01 广东威创视讯科技股份有限公司 Writing and locating method of active pen
TWI435250B (en) * 2011-03-14 2014-04-21 Wistron Corp Method for calibrating accuracy of optical touch monitor
CN102446035B (en) * 2011-08-31 2014-04-23 广东威创视讯科技股份有限公司 Method and device for discriminating color of touch pen
TWI475447B (en) * 2012-07-11 2015-03-01 Wistron Corp Optical touch system and touch point calculation method thereof
CN102981683B (en) * 2012-12-10 2016-05-25 广东威创视讯科技股份有限公司 A kind of camera optical alignment method for quickly correcting and optical axis bearing calibration thereof
CN103124334B (en) * 2012-12-19 2015-10-21 四川九洲电器集团有限责任公司 A kind of method of lens distortion calibration
CN103065318B (en) * 2012-12-30 2016-08-03 深圳普捷利科技有限公司 The curved surface projection method and device of multiple-camera panorama system
CN103257750B (en) * 2013-05-15 2016-03-30 广州视睿电子科技有限公司 The touch identification method of optical imaging touch screen and device
CN105094459B (en) * 2015-03-20 2018-11-02 淮阴工学院 A kind of optics multi-point touch locating method
CN105160632B (en) * 2015-06-30 2017-11-07 广东欧珀移动通信有限公司 A kind of distortion correction method and mobile terminal
CN107306331A (en) * 2016-04-19 2017-10-31 义晶科技股份有限公司 Image processing method and portable electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0657841A1 (en) * 1993-12-07 1995-06-14 AT&T Corp. Sensing stylus position using single 1-D imge sensor
CN1896919A (en) * 2005-07-14 2007-01-17 李银林 Touch projection screen system and its realization
CN101464754A (en) * 2008-12-19 2009-06-24 卫明 Positioning method and apparatus for implementing multi-point touch control for any plane without peripheral at four sides


Also Published As

Publication number Publication date
CN101697105A (en) 2010-04-21

Similar Documents

Publication Publication Date Title
CN101697105B (en) Camera type touch detection positioning method and camera type touch detection system
CN110211043B (en) Registration method based on grid optimization for panoramic image stitching
US7379619B2 (en) System and method for two-dimensional keystone correction for aerial imaging
CN111750820B (en) Image positioning method and system
CN105118055B (en) Camera position amendment scaling method and system
CN109903227B (en) Panoramic image splicing method based on camera geometric position relation
JP4147059B2 (en) Calibration data measuring device, measuring method and measuring program, computer-readable recording medium, and image data processing device
CN104835159A (en) Digital image correction method for continuous variable-focal-length optical imaging system
CN102169573B (en) Real-time distortion correction method and system of lens with high precision and wide field of view
CN107767422A (en) A kind of fish-eye bearing calibration, device and portable terminal
CN106886979A (en) A kind of image splicing device and image split-joint method
CN104835115A (en) Imaging method for aerial camera, and system thereof
CN103198487A (en) Automatic calibration method for video monitoring system
CN105447850A (en) Panorama stitching synthesis method based on multi-view images
CN101271575B (en) Orthogonal projection emendation method for image measurement in industry close range photography
CN102663734A (en) Fish eye lens calibration and fish eye image distortion correction method
CN104732482A (en) Multi-resolution image stitching method based on control points
CN101763643A (en) Automatic calibration method for structured light three-dimensional scanner system
CN104657982A (en) Calibration method for projector
CN104392435A (en) Fisheye camera calibration method and device
CN101794184B (en) Coordinate detection device and locating method thereof
CN106971408A (en) A kind of camera marking method based on space-time conversion thought
CN107665483A (en) Exempt from calibration easily monocular camera lens fish eye images distortion correction method
CN110033407A (en) A kind of shield tunnel surface image scaling method, joining method and splicing system
CN106886976B (en) Image generation method for correcting fisheye camera based on internal parameters

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CP03 Change of name, title or address

Address after: No. 233, Kezhu Road, High-Tech Industrial Development Zone, Guangzhou, Guangdong Province, 510670

Patentee after: Vtron Group Co., Ltd.

Address before: No. 6, Cai Road, High-Tech Industrial Development Zone, Guangzhou, Guangdong Province, 510663

Patentee before: Guangdong Weichuangshixun Science and Technology Co., Ltd.

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110914

Termination date: 20181026