CN102929447B - Method for extracting the effective area of an optical sensor perceived image - Google Patents

Method for extracting the effective area of an optical sensor perceived image

Info

Publication number
CN102929447B
CN102929447B CN201210428424.9A
Authority
CN
China
Prior art keywords
image
optical sensor
effective
perception
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210428424.9A
Other languages
Chinese (zh)
Other versions
CN102929447A (en)
Inventor
吴涛
邵枝晖
刘耀诚
李晓强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WUXI HINORA TECHNOLOGY Co Ltd
Original Assignee
WUXI HINORA TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WUXI HINORA TECHNOLOGY Co Ltd filed Critical WUXI HINORA TECHNOLOGY Co Ltd
Priority to CN201210428424.9A priority Critical patent/CN102929447B/en
Publication of CN102929447A publication Critical patent/CN102929447A/en
Application granted granted Critical
Publication of CN102929447B publication Critical patent/CN102929447B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

A method for extracting the effective area of an optical sensor perceived image. The system comprises an optical sensor signal processing unit, an image processing unit, an automatic effective-image-area extraction unit, and an optical sensor calibration unit, and uses an adaptive adjustment approach. The extraction method comprises: passing the touch-screen image signal from the optical sensor signal processing unit to the image processing unit, processing the image signal, and passing the processed signal to the automatic effective-image-area extraction unit; the automatic extraction unit computes the regional luminance perception factor and the adaptive adjustment factor of the image from the received signal; the start and end positions of the perceived image are then located according to the regional luminance perception factor, and the region between them is marked as the effective image area.

Description

Method for extracting the effective area of an optical sensor perceived image
Technical field
The present invention relates to optical touch technology, and in particular to a method for extracting the effective area of an optical sensor perceived image using adaptive adjustment.
Background technology
The mainstream touch technologies on the domestic market are based on vector pressure sensing, resistive, infrared, capacitive, and surface acoustic wave techniques. Vector-pressure-sensing touch screens have largely left the stage of history. Resistive touch screens position accurately, but they are quite expensive and vulnerable to scratching and damage. Infrared touch screens are inexpensive, but their frames are fragile, they are prone to ambient-light interference, and they distort easily on curved surfaces. Capacitive screens are sound in design principle, but their image-distortion problem is hard to solve fundamentally. Surface acoustic wave touch screens overcome the various defects of earlier touch screens and are clear, durable, and suitable for many settings; their drawback is that water droplets or dust on the screen surface can make the touch screen sluggish or even unresponsive.
The advent of optical touch screens essentially resolves the difficulties of these many earlier touch screens, and optical touch is well positioned to lead the touch market. Optical touch is a new technology distinct from the existing techniques above; its sensors respond quickly to fine movements, making operation brisker, smoother, and more accurate. Two optical modules mounted at the top-left and top-right corners can accurately detect the positions of multiple fingers or other touch objects, so the operator can not only click, double-click, and drag, but also rotate and zoom freely. An existing optical touch screen, shown in Fig. 1, mainly comprises a light-emitting component 1, an optical sensor signal processing unit 2, an image processing unit (not shown), and a display unit 3; the point on the touch screen represents an arbitrary touch point 4 during the operator's touch operation.
In practice, one of the key factors affecting optical touch performance is the quality of optical sensor perception, in which the extraction of the sensor image's effective area plays an important role. The current extraction method relies mainly on manual adjustment. Fig. 2 is a block diagram of manual adjustment of the optical sensor perceived image; its functional modules comprise an optical sensor signal processing unit, an image processing unit, a manual effective-touch-area extraction unit, and an optical sensor calibration unit. The optical sensor signal processing unit, generally a CCD or CMOS camera, collects and processes the image of the touch-screen display. The image processing unit processes the image signal produced by the optical sensor signal processing unit. The manual extraction unit extracts the effective touch area by hand from the processed image. The optical sensor calibration unit calibrates the optical sensor against the manually extracted effective area. When lighting interference is severe (illumination too strong or too dim), or the sensor images in a complex environment, this manual approach extracts the effective image area with low precision, which greatly degrades sensor perception and in turn the overall perception capability of optical touch, and is unsuitable for large-scale industrial production.
An improved method for extracting the effective area of an optical sensor image is therefore needed.
Summary of the invention
In view of the deficiencies of the prior art, the object of the present invention is to provide a method for extracting the effective area of an optical sensor perceived image using adaptive adjustment, replacing the manual extraction unit with an automatic one. The method can automatically compute the illumination-influence factor for the specific environment and then automatically extract the sensor image's effective area, thereby solving the difficulty of extracting the effective image area in complex environments.
To achieve the above object, the technical scheme of the present invention is: a method for extracting the effective area of an optical sensor perceived image, involving an optical sensor signal processing unit, an image processing unit, an automatic effective-image-area extraction unit, and an optical sensor calibration unit, and using adaptive adjustment. The extraction method comprises: passing the touch-screen image signal from the optical sensor signal processing unit to the image processing unit, processing the image signal, and passing the processed signal to the automatic effective-image-area extraction unit; the automatic extraction unit computes the regional luminance perception factor K and the adaptive adjustment factor H of the image from the received signal; the start and end positions of the perceived image are then located according to K, and the region between them is marked as the effective image area.
Preferably, in each optical sensor perceived image, the image is divided at its center, and an initial start marker and an initial end marker of the effective image area are taken in the left and right halves respectively, serving as the initial start position and initial end position of the effective area.
Preferably, the regional luminance perception factor K of the left and right borders of each perceived image is computed, together with the image brightness quality evaluation factor M.
Preferably, if the image gray value I of a row is greater than or equal to M, that row is selected as an effective row of the effective-area border; all effective rows of the border are determined in turn.
Preferably, the start and end positions of the effective area are obtained by traversal image processing over all determined effective rows.
Compared with the prior art, the technical effects of the present invention are: the adaptive extraction method can automatically compute the regional luminance perception factor K for the specific environment and then automatically extract the effective image area. The adjustment is adaptive and real-time with low time complexity; the robustness of the method is significantly improved, sensor calibration is more stable, extraction precision is high, and operation is simple. When installing a touch screen, no evaluation of installation-environment factors such as illumination is needed. When selecting the effective area, no professional operator is required; any operator can perform the task. Because the invention uses a one-touch automatic extraction strategy, no manual extraction is needed; this is especially easy to realize under an industrial production model, is more convenient to use, and yields high volume-production efficiency.
Accompanying drawing explanation
Fig. 1 is a schematic diagram of a typical optical touch screen structure;
Fig. 2 is a block diagram of the existing manual adjustment of the optical sensor perceived image;
Fig. 3 is a block diagram of the automatic adjustment of the optical sensor perceived image according to the present invention;
Fig. 4 is the workflow diagram for extracting the effective area of the optical sensor perceived image according to the present invention.
Embodiment
The present invention is described in further detail below in conjunction with the drawings and embodiments.
Fig. 1 shows an existing optical touch screen structure; the extraction method of the present invention can be applied to such a device.
The manual adjustment structure shown in Fig. 2 differs from the self-adjusting structure of the present invention in that the present invention adjusts automatically, specifically by an adaptive adjustment method, whereas the existing structure is adjusted manually.
As shown in Fig. 3, the self-adjusting structure of the present invention comprises an optical sensor signal processing unit, an image processing unit, an automatic effective-image-area extraction unit, and an optical sensor calibration unit. Because an adaptive adjustment method is adopted, the effective-image-area extraction unit starts automatically once the relevant staff begin setting the sensor parameters: it first acquires the image signal from the optical sensor, computes the border luminance perception factor K and the image brightness quality evaluation factor M, and then feeds these parameters into the automatic effective-image-area extraction module, which extracts the effective image area.
Working principle: for an optical touch screen, once the screen size and the touch environment (such as the mounting structure) are determined, the influence of illumination on the environment stays within a certain range; the brightness adaptive adjustment factor and the regional luminance perception factor of each perceived image are then preset parameters, and the effective area of the screen image captured by each optical sensor is a fixed region. Accordingly, the starting row and ending row of the effective image area can be computed from the regional luminance perception factor and the image's adaptive adjustment factor.
Fig. 4 illustrates the workflow for extracting the effective area of the optical sensor perceived image according to the present invention, which comprises the following steps:
1) Capture the screen image with each optical sensor to produce an optical sensor perceived image.
2) In each optical sensor perceived image, divide the image at its center, and take an initial start marker of the effective image area in the left half and an initial end marker in the right half, serving as the initial start position and initial end position of the effective area. Suitable empirical values for these initial marker positions can be chosen according to the actual image.
3) Compute the regional luminance perception factor K of the left and right borders of each optical sensor perceived image. K equals the average gray value of the perceived image within a fixed range of the border; the size of the range can be determined empirically, and in the present invention may be 50.
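Purely as an illustration (not part of the patent), the border factor K of step 3) might be computed as below; the function name, the plain-Python list representation of a grayscale row, and the default range of 50 pixels are assumptions drawn from the text.

```python
def luminance_perception_factor(row, border_range=50):
    """Regional luminance perception factor K for one image row:
    the average gray value within a fixed range of the left and
    right borders (the patent suggests a range of about 50 pixels)."""
    n = min(border_range, len(row))
    k_left = sum(row[:n]) / n       # mean gray near the left border
    k_right = sum(row[-n:]) / n     # mean gray near the right border
    return k_left, k_right
```

On a 200-pixel row whose outer 50 pixels are uniform, this returns the two border means directly.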
4) Find the brightest gray value: compute the highlight of the image at the initial start position and initial end position of the perceived image; the gray value G corresponding to this highlight is the brightest gray value.
5) Compute the image brightness quality evaluation factor M of the perceived image: M equals the gray value G multiplied by a brightness reward-penalty factor Q, where Q is generally between 0 and 1.
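A minimal sketch of steps 4) and 5), assuming the highlight G is searched in a small neighborhood of the two initial positions; the function name, the neighborhood radius, and the default Q = 0.5 are illustrative choices, not values fixed by the patent.

```python
def brightness_quality_factor(row, start_pos, end_pos, q=0.5, radius=2):
    """Steps 4)-5): G is the brightest gray value found in a small
    neighborhood of the initial start and end positions; the quality
    evaluation factor is M = G * Q, with Q typically in (0, 1)."""
    lo = max(0, start_pos - radius)
    hi = min(len(row), end_pos + radius + 1)
    neighborhood = row[lo:start_pos + radius + 1] + row[end_pos - radius:hi]
    g = max(neighborhood)           # brightest gray value (the highlight)
    return g, g * q                 # (G, M)
```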
6) Determine the effective rows of the effective-image-area border, comprising:
Comparing the image gray value I at the initial start position and initial end position of each row of the perceived image of step 1) with the brightness quality evaluation factor M obtained in step 5);
If the gray value I is greater than or equal to M, selecting that row as an effective row of the effective-area border;
Determining all effective rows of the border in turn.
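Step 6) can be sketched as follows. The patent does not spell out whether a row must pass the comparison at both initial positions or only one; this hypothetical helper assumes both.

```python
def effective_rows(image, start_pos, end_pos, m):
    """Step 6): a row is an effective row of the effective-area border
    when its gray value I at the initial start and end positions is
    greater than or equal to the quality factor M (both positions
    assumed here)."""
    rows = []
    for idx, row in enumerate(image):
        if row[start_pos] >= m and row[end_pos] >= m:
            rows.append(idx)
    return rows
```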
7) Obtain the start position and end position of the effective area by traversal image processing over all determined effective rows. The traversal uses a sliding-window method; the window size must be chosen reasonably, and in the present invention is generally 3, 4, or 5. If the window is too small or too large, the gray levels of the current window segment cannot be represented reasonably and correctly. Empirically, the image near the start position of the effective area is relatively dark compared with the image inside the effective region, and likewise near the end position. The traversal proceeds leftward from the initial start position and rightward from the initial end position, and specifically comprises:
Computing the difference between the image gray values of the left and right windows;
Comparing the difference with the relevant luminance factors, and marking qualifying positions as suspected start and end positions of the effective image area;
Comparing the start and end positions of all suspected effective areas to obtain the minimum of the suspected start positions and the maximum of the suspected end positions, then performing a windowed traversal of the image to obtain the actual start position and actual end position of the effective area.
While traversing, when searching for the start position of the effective area the perceived image is traversed from right to left; when searching for the end position it is traversed from left to right.
The comparison with the luminance factors in step 7) means: the difference between the gray values of the left and right windows is compared with the brightness adaptive adjustment factor H and the brightness quality evaluation factor M respectively; if the difference is greater than H and the average gray value of the left and right windows is greater than M, the position is marked as a suspected start position or end position of the effective image area.
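The marking rule of step 7) might look like this in code; the exact window geometry (here, the `window` pixels immediately left and right of each position) and the sample values of H and M are assumptions for illustration only.

```python
def suspected_positions(row, window=3, h=40, m=60):
    """Step 7) marking rule: a position is a suspected boundary when
    the gray-value difference between its left and right windows
    exceeds H and the average gray of the two windows exceeds M."""
    marks = []
    for i in range(window, len(row) - window):
        left = sum(row[i - window:i]) / window      # window left of i
        right = sum(row[i:i + window]) / window     # window right of i
        if abs(right - left) > h and (left + right) / 2 > m:
            marks.append(i)
    return marks
```

On a row that jumps from dark to bright and back, positions near both edges are marked; the minimum and maximum of these marks then seed the final windowed traversal.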
The windowed traversal applied to the minimum of the suspected start positions and the maximum of the suspected end positions proceeds as follows:
If the minimum position lies in the starting portion of the optical sensor perceived image, the row image is traversed leftward with the window from the initial start position; if the maximum position lies in the ending portion, the row image is traversed rightward with the window from the initial end position;
If the gray value of the window is less than a preset threshold T, the position is taken as the actual start position or actual end position, respectively, of the effective area. The threshold T can be given empirically.
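The final windowed traversal can be sketched as below; the `direction` argument and the default threshold T = 30 are hypothetical, and the fallback of returning the suspected position when no window drops below T is an assumption the patent leaves open.

```python
def refine_position(row, pos, direction, window=3, t=30):
    """Final windowed traversal of step 7): slide the window outward
    from a suspected boundary; the first position whose window mean
    gray falls below threshold T is taken as the actual boundary."""
    step = -1 if direction == "left" else 1
    i = pos
    while 0 <= i and i + window <= len(row):
        if sum(row[i:i + window]) / window < t:
            return i                # window dark enough: boundary found
        i += step
    return pos                      # fallback: keep the suspected position
```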
The foregoing is only a preferred embodiment of the present invention and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (3)

1. A method for extracting the effective area of an optical sensor perceived image, the method involving an optical sensor signal processing unit, an image processing unit, an automatic effective-image-area extraction unit, and an optical sensor calibration unit, and using adaptive adjustment, the extraction method comprising:
passing the touch-screen image signal of the optical sensor signal processing unit to the image processing unit, processing the image signal, and passing the processed image signal to the automatic effective-image-area extraction unit;
the automatic effective-image-area extraction unit comparing, from the obtained image signal, the computed difference between the image gray values of the left and right windows with a brightness adaptive adjustment factor H and an image brightness quality evaluation factor M respectively, and, if the difference is greater than H and the average gray value of the left and right windows is greater than M, marking the position as a suspected start position or end position of the effective image area;
performing a windowed traversal on the minimum of all suspected start positions and the maximum of all suspected end positions, the windowed traversal proceeding as follows:
if the minimum position lies in the starting portion of the optical sensor perceived image, traversing the row image leftward with the window from the initial start position; if the maximum position lies in the ending portion, traversing the row image rightward with the window from the initial end position;
if the gray value of the window is less than a preset threshold T, taking the position as the actual start position or actual end position, respectively, of the effective area of the optical sensor perceived image.
2. The extraction method according to claim 1, wherein the extraction flow comprises the following steps:
1) capturing the screen image with each optical sensor to produce an optical sensor perceived image;
2) in each perceived image, dividing the image at its center and taking an initial start marker of the effective image area in the left half and an initial end marker in the right half, as the initial start position and initial end position of the effective area;
3) computing the regional luminance perception factor K of the left and right borders of each perceived image;
4) finding the brightest gray value: computing the highlight of the image at the initial start position and initial end position, the gray value G corresponding to the highlight being the brightest gray value;
5) computing the image brightness quality evaluation factor M of the perceived image, M being equal to the gray value G multiplied by a brightness reward-penalty factor Q;
6) determining the effective rows of the effective-image-area border, comprising:
comparing the image gray value I at the initial start position and initial end position of each row of the perceived image of step 1) with the factor M obtained in step 5);
if the gray value I is greater than or equal to M, selecting that row as an effective row of the effective-area border;
determining all effective rows of the border in turn;
7) obtaining the start position and end position of the effective area by traversal image processing over all determined effective rows, the traversal using a sliding-window method and comprising:
computing the difference between the image gray values of the left and right windows;
comparing the difference with the brightness adaptive adjustment factor H and the factor M respectively, and marking qualifying positions as suspected start and end positions of the effective image area;
comparing the start and end positions of all suspected effective areas to obtain the minimum of the suspected start positions and the maximum of the suspected end positions, then performing a windowed traversal to obtain the actual start position and actual end position of the effective area.
3. The extraction method according to claim 2, characterized in that, in the traversal image processing of step 7), when searching for the start position of the effective area the perceived image of the optical sensor is traversed from right to left, and when searching for the end position it is traversed from left to right.
CN201210428424.9A 2012-10-19 2012-10-31 A kind of extracting method of perception image effective area of optical sensor Expired - Fee Related CN102929447B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210428424.9A CN102929447B (en) 2012-10-19 2012-10-31 A kind of extracting method of perception image effective area of optical sensor

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201210401218.9 2012-10-19
CN201210401218 2012-10-19
CN2012104012189 2012-10-19
CN201210428424.9A CN102929447B (en) 2012-10-19 2012-10-31 A kind of extracting method of perception image effective area of optical sensor

Publications (2)

Publication Number Publication Date
CN102929447A CN102929447A (en) 2013-02-13
CN102929447B true CN102929447B (en) 2015-09-09

Family

ID=47644270

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210428424.9A Expired - Fee Related CN102929447B (en) 2012-10-19 2012-10-31 A kind of extracting method of perception image effective area of optical sensor

Country Status (1)

Country Link
CN (1) CN102929447B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324359B (en) * 2013-04-09 2016-11-02 上海仪电电子多媒体有限公司 The image-signal processing method of the anti-light interference of optical touch screen
CN109002215B (en) * 2018-07-27 2021-03-19 青岛海信移动通信技术股份有限公司 Method for determining touch initial position of terminal with touch screen and terminal
CN113487542B (en) * 2021-06-16 2023-08-04 成都唐源电气股份有限公司 Extraction method of contact net wire abrasion area

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5724456A (en) * 1995-03-31 1998-03-03 Polaroid Corporation Brightness adjustment of images using digital scene analysis
EP1876517A1 (en) * 2006-07-03 2008-01-09 Micro-Nits Co., Ltd. Input method of pointer input system
CN102508582A (en) * 2011-11-30 2012-06-20 无锡海森诺科技有限公司 Optical touch calibration automatic adjusting method


Also Published As

Publication number Publication date
CN102929447A (en) 2013-02-13


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150909

Termination date: 20181031