CN102929447A - Method for extracting perception image effective area of optical sensor

Method for extracting perception image effective area of optical sensor

Info

Publication number
CN102929447A
CN102929447A CN2012104284249A CN201210428424A
Authority
CN
China
Prior art keywords
image
optical sensor
effective
rest position
factor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012104284249A
Other languages
Chinese (zh)
Other versions
CN102929447B (en)
Inventor
吴涛
邵枝晖
刘耀诚
李晓强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WUXI HINORA TECHNOLOGY Co Ltd
Original Assignee
WUXI HINORA TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WUXI HINORA TECHNOLOGY Co Ltd filed Critical WUXI HINORA TECHNOLOGY Co Ltd
Priority to CN201210428424.9A priority Critical patent/CN102929447B/en
Publication of CN102929447A publication Critical patent/CN102929447A/en
Application granted granted Critical
Publication of CN102929447B publication Critical patent/CN102929447B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention relates to a method for extracting a perception image effective area of an optical sensor. The optical sensor comprises an optical sensor signal processing unit, an image processing unit, an image effective area automatic extraction unit and an optical sensor calibration unit. A self-adaptive adjustment method is adopted. The extraction method comprises the following steps of: transmitting a touch screen image signal of the optical sensor signal processing unit into the image processing unit, processing the image signal, and transmitting the processed image signal into the image effective area automatic extraction unit; calculating area brightness perception factors and image self-adaptive adjustment factors from the acquired image signal by the image effective area automatic extraction unit; and finding starting and ending positions of a perception image according to the area brightness perception factors, and calibrating the area between the starting position and the ending position of the perception image as an image effective area.

Description

Method for extracting the effective area of an optical sensor perception image
Technical field
The present invention relates to optical touch technology, and in particular to a method for extracting the effective area of an optical sensor perception image by means of adaptive adjustment.
Background technology
The mainstream touch technologies on the domestic market are vector pressure sensing, resistive, infrared, capacitive and surface acoustic wave technologies. Among them, vector-pressure-sensing touch screens have already stepped off the stage of history; resistive touch screens position accurately, but they are quite expensive and are easily scratched and damaged; infrared touch screens are inexpensive, but their frames are fragile, they are prone to interference from ambient light, and they distort on curved surfaces; the design concept of capacitive panels is good, but their image distortion problem is difficult to solve fundamentally; surface acoustic wave touch screens overcome many defects of earlier touch screens, produce a clear image and are suited to a variety of settings, but water droplets or dust on the screen surface can make the touch screen sluggish or even stop it from working.
The emergence of the optical touch screen has largely solved the problems of these earlier touch screens, and it is well placed to lead the touch market. Optical touch technology is a new technology distinct from the existing touch technologies described above: the optical sensor responds quickly to fine movements, making the operator's actions more agile, smooth and accurate. Two optical modules installed at the upper-left and upper-right corners can accurately detect the positions of several of the operator's fingers or of other touch objects, so the operator can not only click, double-click and drag, but also rotate, zoom and perform similar operations freely. As shown in Fig. 1, an existing optical touch screen mainly comprises a light-emitting component 1, an optical sensor signal processing unit 2, an image processing unit (not shown) and a display unit 3; the point inside the touch screen represents an arbitrary touch point 4 during a touch operation by the operator.
In practice, one of the key factors affecting optical touch performance is the perception performance of the optical sensor, in which the extraction of the sensor's effective image area plays a fairly important role. Current methods for extracting the effective image area of an optical sensor rely mainly on manual adjustment. Fig. 2 is a block diagram of the manual adjustment of the optical sensor perception image; its functional modules comprise an optical sensor signal processing unit, an image processing unit, a manual effective-touch-area extraction unit and an optical sensor calibration unit. The optical sensor signal processing unit, generally a CCD or CMOS camera, collects and processes the image information displayed on the touch screen; the image processing unit further processes the image signal produced by the optical sensor signal processing unit; the manual extraction unit works on the processed image, and the image of the effective touch area is extracted by hand; the optical sensor calibration unit calibrates the optical sensor using the manually extracted effective image area. Under strong or dim illumination, severe light interference, or a complex optical sensor imaging structure, the precision of this manual extraction is low, which greatly degrades the perception performance of the optical sensor and in turn the overall touch experience, and it is unsuitable for the roll-out of an industrial production model.
It can be seen that an improved method for extracting the effective image area of an optical sensor is needed.
Summary of the invention
In view of the deficiencies of the prior art, the object of the present invention is to provide a method for extracting the effective area of an optical sensor perception image by means of adaptive adjustment, replacing the manual extraction unit with an automatic extraction unit. The method can automatically calculate the illumination influence factor for a specific environment and then automatically extract the effective image area of the optical sensor, thereby solving the difficulty of extracting the effective image area in complex environments.
To achieve the above object, the technical scheme of the present invention is a method for extracting the effective area of an optical sensor perception image, involving an optical sensor signal processing unit, an image processing unit, an automatic effective-image-area extraction unit and an optical sensor calibration unit, and adopting an adaptive adjustment method. The extraction method comprises: passing the touch-screen image signal of the optical sensor signal processing unit to the image processing unit, processing the image signal, and passing the processed image signal to the automatic effective-image-area extraction unit; the automatic effective-image-area extraction unit calculating the region brightness perception factor K and the adaptive adjustment factor H of the image from the received image signal; and then finding the start and end positions of the perception image according to the region brightness perception factor K, and marking the region between the start and end positions of the perception image as the effective image area.
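Purely for illustration, the data flow just described could be sketched as follows in Python; the patent discloses no code, and the function boundaries, the 50-pixel border range and the use of the gray-level standard deviation as a stand-in for the adaptive adjustment factor H are assumptions.

```python
import numpy as np

def region_brightness_factor(image: np.ndarray, border: int = 50) -> tuple[float, float]:
    """Region brightness perception factor K for the left and right boundaries:
    the mean gray value within a fixed range (here 50 columns) of each boundary."""
    return float(image[:, :border].mean()), float(image[:, -border:].mean())

def adaptive_adjustment_factor(image: np.ndarray) -> float:
    """Adaptive adjustment factor H of the image. The patent gives no formula,
    so the overall gray-level spread is used here purely as a placeholder."""
    return float(image.std())

# Sketch of the flow: signal processing unit -> image processing unit ->
# automatic effective-image-area extraction unit.
processed = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stands in for the processed touch-screen image signal
K_left, K_right = region_brightness_factor(processed)
H = adaptive_adjustment_factor(processed)
# The start and end positions are then located from K (and H); see the fuller sketch
# after the embodiment steps below. The region between them is the effective image area.
```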
Preferably, in each optical sensor perception image, taking the middle of the image as the dividing line, an initial start mark and an initial end mark of the effective image area are chosen in the left half and the right half respectively, as the initial start position and the initial end position of the effective area of the optical sensor perception image.
Preferably, the region brightness perception factor K of the left boundary and of the right boundary of each optical sensor perception image is calculated, and the brightness quality evaluation factor M of the optical sensor perception image is calculated.
Preferably, if by comparison the image gray value I is greater than or equal to the optical sensor perception-image brightness quality evaluation factor M, that row is selected and determined to be an effective row of the boundary of the effective area of the perception image; all effective rows of the boundary of the effective area are determined in turn.
Preferably, the image is traversed over all of the effective rows that have been determined, obtaining the start position and the end position of the effective area of the optical sensor perception image.
Compared with the prior art, the technical effect of the present invention is as follows. The adaptively adjusted method for extracting the effective area of the optical sensor perception image can automatically calculate the region brightness perception factor K for a specific environment and then automatically extract the effective image area. The adjustment is adaptive and real-time, the time complexity is low, the robustness of the method is significantly improved, the calibration of the optical sensor is more stable, the extraction precision is high and the operation is simple. When a touch screen is installed, there is no need to evaluate factors such as the illumination of the installation environment. When the effective area of the optical sensor perception image is selected, no professional operator is required; any operator can perform the operation. Because the present invention adopts a one-touch automatic extraction strategy, the effective image area need not be extracted manually; this is especially easy to implement under an industrial production model, is more convenient to use, and gives high mass-production efficiency.
Description of drawings
Fig. 1 is a schematic diagram of a typical optical touch screen structure;
Fig. 2 is a block diagram of the existing manual adjustment of the optical sensor perception image;
Fig. 3 is a block diagram of the automatic adjustment of the optical sensor perception image according to the present invention;
Fig. 4 is the workflow for extracting the effective area of the optical sensor perception image according to the present invention.
Embodiment
The present invention is described in further detail below with reference to the drawings and embodiments.
Fig. 1 shows an existing optical touch screen structure; the method of the present invention for extracting the effective area of an optical sensor perception image can be applied to such a device.
The difference between the existing manual-adjustment structure for the optical sensor perception image shown in Fig. 2 and the automatic-adjustment structure of the present invention is that the present invention adjusts automatically, specifically by an adaptive adjustment method, whereas the existing structure is adjusted manually.
As shown in Fig. 3, the functional modules of the automatic-adjustment structure for the optical sensor perception image of the present invention comprise an optical sensor signal processing unit, an image processing unit, an automatic effective-image-area extraction unit and an optical sensor calibration unit. Because an adaptive adjustment method is adopted, once the relevant staff have set the relevant parameters of the optical sensor at the beginning, the effective-image-area extraction unit of the system starts automatically: it first quickly obtains the image signal from the optical sensor and calculates the boundary brightness perception factor K and the image brightness quality evaluation factor M, then quickly sends the obtained parameters to the automatic effective-image-area extraction module, from which the effective image area is extracted.
Its working principle is as follows. For an optical touch screen, once the size of the touch screen and the touch environment, such as the touch mounting structure, have been determined, the influence of ambient illumination remains within a certain range; the brightness adaptive adjustment factor and the region brightness perception factor of each optical sensor perception image are then effectively preset parameters, and the effective area of the screen image captured by each optical sensor is likewise a fixed region. Accordingly, the start row and the end row of the effective image area can be calculated from the region brightness perception factor and the adaptive adjustment factor of the image.
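Since these quantities behave as per-installation presets once the mounting environment is fixed, they could be collected in a small configuration record. The sketch below is illustrative only: the fields Q, window and T anticipate the empirical quantities introduced in the workflow below, and the concrete default values (other than the border range of 50 and the window size of 3 to 5 stated in the text) are assumptions.

```python
from dataclasses import dataclass

@dataclass
class SensorPresets:
    """Per-sensor presets implied by the working principle above (illustrative names)."""
    K: float            # region brightness perception factor of the boundary
    H: float            # brightness adaptive adjustment factor
    Q: float = 0.8      # brightness reward-and-penalty factor, generally between 0 and 1
    border: int = 50    # fixed range next to the boundary used to average gray values
    window: int = 4     # sliding-window size for traversal, generally 3, 4 or 5
    T: float = 30.0     # empirical gray-level threshold for the final window test

# Example: presets recorded once when a sensor is commissioned (values are made up).
presets = SensorPresets(K=112.0, H=18.0)
```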
Fig. 4 shows the workflow for extracting the effective area of the optical sensor perception image according to the present invention; the extraction flow comprises the following steps (an illustrative code sketch of these steps is given after step 7):
1) Each optical sensor captures the screen image, producing an optical sensor perception image.
2) In each optical sensor perception image, taking the middle of the image as the dividing line, an initial start mark and an initial end mark of the effective image area are chosen in the left half and the right half respectively, as the initial start position and the initial end position of the effective area of the optical sensor perception image. The initial start and end mark positions of the effective area may be chosen as suitable empirical values based on the actual image.
3) The region brightness perception factor K of the left boundary and of the right boundary of each optical sensor perception image is calculated. K equals the average gray value of the perception image within a certain fixed range of the boundary; the size of the fixed range is determined empirically and may be 50 in the present invention.
4) The brightest gray value is sought: the highlight of the image at the initial start position and at the initial end position of the optical sensor perception image is calculated, and the gray value G corresponding to the highlight is the brightest gray value.
5) The brightness quality evaluation factor M of the optical sensor perception image is calculated; its value equals the gray value G multiplied by the brightness reward-and-penalty factor Q, where Q is generally between 0 and 1.
6) The effective rows of the boundary of the effective image area are determined, comprising:
comparing the image gray value I at the initial start position and the initial end position of every row of the optical sensor perception image of step 1) with the optical sensor perception-image brightness quality evaluation factor M obtained in step 5);
if the image gray value I is greater than or equal to the optical sensor perception-image brightness quality evaluation factor M, selecting that row and determining it to be an effective row of the boundary of the effective area of the optical sensor perception image;
determining in turn all the effective rows of the boundary of the effective area of the optical sensor perception image.
7) The image is traversed over all of the effective rows that have been determined, obtaining the start position and the end position of the effective area of the optical sensor perception image. The image is traversed with a sliding window; the window size should be chosen reasonably, and in the present invention a value of 3, 4 or 5 is generally suitable. If the window is chosen too small or too large, the image gray levels of the current window segment cannot be represented reasonably and correctly. Experience shows that the image inside the start position of the effective image area is relatively dark compared with the effective region, and the image inside the end position of the effective image area is likewise relatively dark. The traversal of the image starts leftward from the initial start position and rightward from the initial end position of the optical sensor perception image. The traversal specifically comprises:
calculating the difference between the image gray values of the left window and the right window;
comparing the difference with the brightness-related factors, and marking the positions so compared as the start position and the end position of a suspected effective image area respectively;
comparing the start positions and the end positions of all the suspected effective image areas, obtaining the minimum of all suspected start positions and the maximum of all suspected end positions, and then performing a windowed traversal of the image, thereby obtaining the actual start position and the actual end position of the effective area of the optical sensor perception image.
During the traversal, when seeking the start position of the effective area of the optical sensor perception image, the optical sensor perception image is traversed from right to left; when seeking the end position of the effective area, the optical sensor perception image is traversed from left to right.
Comparing the difference with the brightness-related factors in step 7) means comparing the calculated difference between the image gray values of the left and right windows with the brightness adaptive adjustment factor H and with the image brightness quality evaluation factor M respectively: if the difference is greater than the brightness adaptive adjustment factor H, and the average of the image gray values of the left and right windows is greater than the image brightness quality evaluation factor M, that position is marked as the start position or the end position of a suspected effective image area.
The minimum of all suspected start positions and the maximum of all suspected end positions so obtained are then used for a windowed traversal of the image, and the method of the windowed traversal is as follows:
if the minimum position is to serve as the start position of the optical sensor perception image, the windowed traversal of that row of the image starts leftward from the initial start position; if the maximum position is to serve as the end position of the optical sensor perception image, the windowed traversal of that row of the image starts rightward from the initial end position;
if the image gray value of the window is less than a preset threshold T, that position is taken as the actual start position or the actual end position, respectively, of the effective area of the optical sensor perception image, where the threshold T can be given empirically.
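The following Python sketch ties steps 2) to 7) together. It is a minimal illustration under stated assumptions, not the patented implementation: the formula for the adaptive adjustment factor H, the placement of the initial marks at one quarter and three quarters of the image width, the default values of Q, the window size and the threshold T, and the choice to refine the result on a single middle effective row are all assumptions or simplifications, since the patent leaves these as empirical choices.

```python
import numpy as np

def brightness_factors(image, start_col, end_col, border=50, Q=0.8):
    """Steps 3)-5): boundary brightness factor K (left/right) and quality factor M = G * Q."""
    K_left = float(image[:, :border].mean())
    K_right = float(image[:, -border:].mean())
    G = float(max(image[:, start_col].max(), image[:, end_col].max()))  # brightest gray value
    return K_left, K_right, G * Q

def effective_rows(image, start_col, end_col, M):
    """Step 6): rows whose gray value at the initial start/end columns is >= M."""
    return [r for r in range(image.shape[0])
            if image[r, start_col] >= M and image[r, end_col] >= M]

def suspected_boundary(row, init_col, M, H, window=4, step=-1):
    """Step 7), first pass: slide left and right windows outward from the initial mark and
    flag the first column where their difference exceeds H and their mean exceeds M."""
    cols = range(init_col, window, -1) if step < 0 else range(init_col, len(row) - window)
    for c in cols:
        left = float(row[c - window:c].mean())
        right = float(row[c:c + window].mean())
        if abs(left - right) > H and (left + right) / 2 > M:
            return c
    return init_col

def refined_boundary(row, init_col, T, window=4, step=-1):
    """Step 7), second pass: windowed traversal; the first window whose mean gray value
    falls below the empirical threshold T gives the actual boundary position."""
    cols = range(init_col, window, -1) if step < 0 else range(init_col, len(row) - window)
    for c in cols:
        if float(row[max(c - window, 0):c + window].mean()) < T:
            return c
    return init_col

def extract_effective_area(image, H, Q=0.8, T=30.0, window=4, border=50):
    """End-to-end sketch of steps 2)-7); returns (start_column, end_column) or None."""
    _, w = image.shape
    start_col, end_col = w // 4, 3 * w // 4            # step 2): empirical initial marks
    _, _, M = brightness_factors(image, start_col, end_col, border, Q)
    rows = effective_rows(image, start_col, end_col, M)
    if not rows:
        return None
    starts = [suspected_boundary(image[r], start_col, M, H, window, step=-1) for r in rows]
    ends = [suspected_boundary(image[r], end_col, M, H, window, step=+1) for r in rows]
    mid_row = image[rows[len(rows) // 2]]              # refine on one effective row (assumption)
    start = refined_boundary(mid_row, min(starts), T, window, step=-1)
    end = refined_boundary(mid_row, max(ends), T, window, step=+1)
    return start, end
```

A caller might invoke `extract_effective_area(processed, H=presets.H)` with the presets sketched earlier; the returned columns are what the optical sensor calibration unit would then work with.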
The above are only preferred embodiments of the present invention and are not intended to limit the present invention; any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (5)

1. A method for extracting the effective area of an optical sensor perception image, involving an optical sensor signal processing unit, an image processing unit, an automatic effective-image-area extraction unit and an optical sensor calibration unit, and adopting an adaptive adjustment method, the extraction method comprising:
passing the touch-screen image signal of the optical sensor signal processing unit to the image processing unit, processing the image signal, and passing the processed image signal to the automatic effective-image-area extraction unit;
the automatic effective-image-area extraction unit calculating the region brightness perception factor K and the adaptive adjustment factor H of the image from the received image signal;
then finding the start and end positions of the perception image according to the region brightness perception factor K, and marking the region between the start and end positions of the perception image as the effective image area.
2. The extraction method according to claim 1, wherein the extraction flow comprises the following steps:
1) each optical sensor capturing the screen image, producing an optical sensor perception image;
2) in each optical sensor perception image, taking the middle of the image as the dividing line, choosing in the left half and the right half respectively an initial start mark and an initial end mark of the effective image area, as the initial start position and the initial end position of the effective area of the optical sensor perception image;
3) calculating the region brightness perception factor K of the left boundary and of the right boundary of each optical sensor perception image;
4) seeking the brightest gray value: calculating the highlight of the image at the initial start position and at the initial end position of the optical sensor perception image, the gray value G corresponding to the highlight being the brightest gray value;
5) calculating the brightness quality evaluation factor M of the optical sensor perception image, the value of M being equal to the gray value G multiplied by the brightness reward-and-penalty factor Q;
6) determining the effective rows of the boundary of the effective image area, comprising:
comparing the image gray value I at the initial start position and the initial end position of every row of the optical sensor perception image of step 1) with the optical sensor perception-image brightness quality evaluation factor M obtained in step 5);
if the image gray value I is greater than or equal to the optical sensor perception-image brightness quality evaluation factor M, selecting that row and determining it to be an effective row of the boundary of the effective area of the optical sensor perception image;
determining in turn all the effective rows of the boundary of the effective area of the optical sensor perception image;
7) traversing the image over all of the effective rows that have been determined, obtaining the start position and the end position of the effective area of the optical sensor perception image, the image being traversed with a sliding window, the traversal comprising:
calculating the difference between the image gray values of the left window and the right window;
comparing the difference with the brightness-related factors, and marking the positions so compared as the start position and the end position of a suspected effective image area respectively;
comparing the start positions and the end positions of all the suspected effective image areas, obtaining the minimum of all suspected start positions and the maximum of all suspected end positions, and then performing a windowed traversal of the image, thereby obtaining the actual start position and the actual end position of the effective area of the optical sensor perception image.
3. The extraction method according to claim 2, characterized in that, in the traversal of step 7), when seeking the start position of the effective area of the optical sensor perception image, the optical sensor perception image is traversed from right to left, and when seeking the end position of the effective area of the optical sensor perception image, the optical sensor perception image is traversed from left to right.
4. The extraction method according to claim 2, characterized in that comparing the difference with the brightness-related factors in step 7) means comparing the calculated difference between the image gray values of the left and right windows with the brightness adaptive adjustment factor H and with the image brightness quality evaluation factor M respectively: if the difference is greater than the brightness adaptive adjustment factor H, and the average of the image gray values of the left and right windows is greater than the image brightness quality evaluation factor M, that position is marked as the start position or the end position of a suspected effective image area.
5. The extraction method according to claim 2, characterized in that the minimum of all suspected start positions and the maximum of all suspected end positions are used for a windowed traversal of the image, the method of the windowed traversal being as follows:
if the minimum position is to serve as the start position of the optical sensor perception image, the windowed traversal of that row of the image starts leftward from the initial start position; if the maximum position is to serve as the end position of the optical sensor perception image, the windowed traversal of that row of the image starts rightward from the initial end position;
if the image gray value of the window is less than a preset threshold T, that position is taken as the actual start position or the actual end position, respectively, of the effective area of the optical sensor perception image.
CN201210428424.9A 2012-10-19 2012-10-31 A kind of extracting method of perception image effective area of optical sensor Expired - Fee Related CN102929447B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210428424.9A CN102929447B (en) 2012-10-19 2012-10-31 A kind of extracting method of perception image effective area of optical sensor

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201210401218 2012-10-19
CN2012104012189 2012-10-19
CN201210401218.9 2012-10-19
CN201210428424.9A CN102929447B (en) 2012-10-19 2012-10-31 A kind of extracting method of perception image effective area of optical sensor

Publications (2)

Publication Number Publication Date
CN102929447A true CN102929447A (en) 2013-02-13
CN102929447B CN102929447B (en) 2015-09-09

Family

ID=47644270

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210428424.9A Expired - Fee Related CN102929447B (en) 2012-10-19 2012-10-31 A kind of extracting method of perception image effective area of optical sensor

Country Status (1)

Country Link
CN (1) CN102929447B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5724456A (en) * 1995-03-31 1998-03-03 Polaroid Corporation Brightness adjustment of images using digital scene analysis
EP1876517A1 (en) * 2006-07-03 2008-01-09 Micro-Nits Co., Ltd. Input method of pointer input system
CN102508582A (en) * 2011-11-30 2012-06-20 无锡海森诺科技有限公司 Optical touch calibration automatic adjusting method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324359A (en) * 2013-04-09 2013-09-25 上海广电光显技术有限公司 Anti-light interference picture signal processing method for optical touch control screen
CN103324359B (en) * 2013-04-09 2016-11-02 上海仪电电子多媒体有限公司 The image-signal processing method of the anti-light interference of optical touch screen
CN109002215A (en) * 2018-07-27 2018-12-14 青岛海信移动通信技术股份有限公司 A kind of terminal with touch screen determines the method and terminal of touch initial position
CN109002215B (en) * 2018-07-27 2021-03-19 青岛海信移动通信技术股份有限公司 Method for determining touch initial position of terminal with touch screen and terminal
CN113487542A (en) * 2021-06-16 2021-10-08 成都唐源电气股份有限公司 Method for extracting worn area of contact line conductor
CN113487542B (en) * 2021-06-16 2023-08-04 成都唐源电气股份有限公司 Extraction method of contact net wire abrasion area

Also Published As

Publication number Publication date
CN102929447B (en) 2015-09-09

Similar Documents

Publication Publication Date Title
CN103716594B (en) Panorama splicing linkage method and device based on moving target detecting
US9964624B2 (en) Computer vision-based object tracking system
CN100579174C (en) Motion detection method and device
CN107545592B (en) Dynamic camera calibration
CN101572803B (en) Customizable automatic tracking system based on video monitoring
US20090315869A1 (en) Digital photo frame, information processing system, and control method
CN103164022B (en) Many fingers touch method and device, portable terminal
CN104133548A (en) Method and device for determining viewpoint area and controlling screen luminance
CN102945091B (en) A kind of man-machine interaction method based on laser projection location and system
CN101437124A (en) Method for processing dynamic gesture identification signal facing (to)television set control
EP3576052A1 (en) Image processing systems and methods
CN105023552A (en) Display and brightness adjusting method thereof
CN101201695A (en) Mouse system for extracting and tracing based on ocular movement characteristic
CN101464751B (en) Electronic white board system based on image detection and detection method thereof
WO2018233254A1 (en) Terminal-based object recognition method, device and electronic equipment
CN102591533A (en) Multipoint touch screen system realizing method and device based on computer vision technology
CN111161214B (en) System and method for measuring pig weight and identifying drinking behavior based on binocular vision
CN104778460A (en) Monocular gesture recognition method under complex background and illumination
CN102929447A (en) Method for extracting perception image effective area of optical sensor
CN112560649A (en) Behavior action detection method, system, equipment and medium
CN112017210A (en) Target object tracking method and device
CN101840275A (en) Non-contact type mouse device and operation method thereof
CN102508582B (en) Optical touch calibration automatic adjusting method
CN105807989A (en) Gesture touch method and system
CN106713701A (en) Cluster motion data acquisition method and system based on image processing technology

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150909

Termination date: 20181031

CF01 Termination of patent right due to non-payment of annual fee