CN103826151B - Display system and method thereof - Google Patents


Info

Publication number
CN103826151B
CN103826151B (application CN201310722269.6A)
Authority
CN
China
Prior art keywords
image
coordinate
first feature
display device
viewing area
Prior art date
Legal status
Active
Application number
CN201310722269.6A
Other languages
Chinese (zh)
Other versions
CN103826151A (en)
Inventor
林楷庭
唐玄辉
陈静珊
汤绍康
Current Assignee
AU Optronics Corp
Original Assignee
AU Optronics Corp
Priority date
Filing date
Publication date
Application filed by AU Optronics Corp filed Critical AU Optronics Corp
Publication of CN103826151A publication Critical patent/CN103826151A/en
Application granted
Publication of CN103826151B publication Critical patent/CN103826151B/en


Landscapes

  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a display system and a display method. The display method includes capturing a first image of a first object. A position parameter indicating the relative position of the first object and the display device is determined according to a first feature coordinate of a first feature of the first object in the first image. A first display area for a second image in the display device is then determined according to the position parameter. The display system and method can thus adjust the region of the display picture in the display device according to the position of the user's eyes relative to the display device.

Description

Display system and method thereof
Technical field
The present invention relates to a display system and a method thereof, and particularly to a display system with an adjustable display area and a method thereof.
Background technology
With the progress of technology, the computer products encountered in daily life have advanced markedly, and display-related technology in particular changes with each passing day.
In general, when a user watches a display device, whether fixed or portable, the screen of the display device must first be adjusted to a particular position to obtain a better viewing experience. For example, when using a portable display device, the user often holds the device at a fixed position with one hand, which may cause the hand to ache or be sprained. Likewise, when a user lies on a sofa watching television, the neck often has to be twisted so that both eyes stay at the same horizontal height, which may injure the neck muscles. As another example, when watching a display device mounted high up, the user tends to find the images or text too small to see clearly.
The content of the invention
In view of the above problems, the present invention proposes a display device and method that can determine the relative position of the user and the display device, and accordingly adjust the position, size and shape of the display area in the display device, helping the user watch the display picture more comfortably.
According to a display method of the present invention, a first image associated with a first object is captured at a first position. A position parameter indicating the relative position of the first object and a display device is then determined according to a first feature coordinate of a first feature of the first object in the first image. A first display area of a second image in the display device is then determined according to the position parameter.
A display system according to the present invention includes a display device, a first image capture device, and a processing device. The processing device is coupled to the display device and the first image capture device. The first image capture device captures, at a first position, a first image associated with a first object. The processing device determines a position parameter according to a first feature coordinate of a first feature of the first object in the first image; the position parameter indicates the relative position of the first object and the display device. The processing device then determines, according to the position parameter, a first display area in which a second image is shown in the display device.
In summary, a display system using the disclosed method captures a first image associated with a first object with the first image capture device. The processing device determines the relative position of the first object and the display device according to the first feature coordinate of the first feature of the first object in the first image, and adjusts the region of the display picture in the display device according to that relative position, so that the user can comfortably view the picture without having to hold the display device in the same position for a long time.
The above description of the present disclosure and the following description of the embodiments demonstrate and explain the spirit and principles of the present invention, and provide further explanation of the claims of the present invention.
Description of the drawings
Fig. 1 is a functional block diagram of a display system according to an embodiment of the invention;
Fig. 2 is a flow chart of a method according to an embodiment of the invention;
Fig. 3A is a schematic diagram of an implementation according to the invention;
Fig. 3B is a top view of Fig. 3A;
Fig. 3C is a side view of Fig. 3A;
Fig. 4 is a flow chart detailing step S230 of Fig. 2 according to an embodiment of the invention;
Fig. 5 is a schematic diagram of the operation of another embodiment of the invention;
Fig. 6 is a flow chart of a method according to an embodiment of the invention;
Fig. 7A is a device diagram according to yet another embodiment of the invention;
Fig. 7B is a schematic diagram of the operation of the first image capture device in Fig. 7A;
Fig. 7C is a schematic diagram of the operation of the second image capture device in Fig. 7A;
Fig. 8 is a flow chart of a method according to an embodiment of the invention;
Fig. 9A is a schematic diagram of an implementation according to an embodiment of the invention;
Fig. 9B is a schematic diagram of an implementation according to another embodiment of the invention.
Reference numerals:
10, 30, 50, 70, 90: display system
102, 310, 510, 710: display device
104, 320, 520, 720: first image capture device
106, 330, 540, 740: processing device
322: image capture axis
324: focusing distance
340, 550, 750: first object
345, 755: first feature
512, 911: first display area
514, 913: second display area
530, 730: second image capture device
560: second object
722: first image capture axis
732: second image capture axis
756: projection point
910: arc display device
920: image capture device
R1, R2: relative positions
S210–S250, S401–S405, S601–S605, S810–S840: steps
αx: maximum horizontal viewing angle
αy: maximum vertical viewing angle
βx: horizontal viewing angle
βy: vertical viewing angle
βx1: first horizontal viewing angle
βy1: first vertical viewing angle
βx2: second horizontal viewing angle
βy2: second vertical viewing angle
Specific embodiment
The detailed features and advantages of the present invention are described in the embodiments below, the content of which is sufficient to enable any person skilled in the art to understand and implement the technical content of the present invention. From the content disclosed in this specification, the claims and the drawings, any person familiar with the related art can readily understand the related purposes and advantages of the present invention. The following embodiments further illustrate the viewpoints of the present invention, but in no way limit its scope.
Refer to Fig. 1, a functional block diagram of a display system according to an embodiment of the invention. As shown in Fig. 1, the display system 10 includes a display device 102, a first image capture device 104 and a processing device 106. The processing device 106 is coupled to the display device 102 and the first image capture device 104.
The display device 102 displays images. For example, the display device 102 may be a liquid crystal display (LCD), a light emitting diode (LED) display, an active matrix organic light emitting diode (AMOLED) display, or other equipment suitable for displaying images; this embodiment is not limited thereto.
The first image capture device 104 captures, at a first position, an image of a first object (not illustrated in Fig. 1). When capturing images, the first image capture device 104 may capture a static image of the first object at regular intervals, or may continuously capture images of the first object in a recording mode and produce a video stream. The first image capture device 104 may be a digital camera, a network camera, an infrared image capture device, a narrow-band image capture device, or other equipment suitable for capturing images; this embodiment is not limited thereto.
The processing device 106 determines the relative position of the first object and the display device 102 according to the first feature coordinate of the first feature of the first object in the image, and represents this relative position with a position parameter. It then determines, according to the position parameter, a first display area on the display device 102 in which a second image is shown. The relative position of the object and the display device 102 may be computed from the first feature coordinate with equations, or obtained from the first feature coordinate via a lookup table. The second image may be a film, an operating window or a photograph. The first display area may be the whole active area of the display device 102 or a part of it, and the shape of the first display area and the orientation of the picture may be adjusted arbitrarily. The processing device 106 may be an application-specific integrated circuit (ASIC), an advanced RISC machine (ARM), a central processing unit (CPU), a single-chip controller, or other equipment suitable for executing computation and control instructions; this embodiment is not limited thereto.
Refer to Fig. 1 together with Fig. 2, a flow chart of a method according to an embodiment of the invention. As shown, in S210, the first image capture device 104 captures a first image of a first object at a first position. In S230, the processing device 106 determines a position parameter according to the first feature coordinate of the first feature of the first object in the first image; the position parameter indicates the relative position of the first object and the display device 102. Then, in S250, the processing device 106 determines, according to the position parameter, the first display area in which a second image is shown in the display device 102.
For an embodiment of the above method, refer to Fig. 3A to Fig. 3C: Fig. 3A is a schematic diagram of an implementation according to the invention, Fig. 3B is a top view of Fig. 3A, and Fig. 3C is a side view of Fig. 3A. As shown in Fig. 3A, the display system 30 includes a display device 310, a first image capture device 320 and a processing device 330. The processing device 330 is electrically coupled to the display device 310 and the first image capture device 320. The first image capture device 320 captures a first image of a first object 340 at a first position. For example, the first object 340 may be a user, and the first image capture device 320 may be fixed directly above or directly below the display device 310, at a particular corner, or arranged in any suitable manner at a certain distance from the display device 310; this embodiment is not limited thereto.
The processing device 330 can obtain the angle between the first feature 345 (for example, the eyes of the user) and the image capture axis 322 of the first image capture device 320, that is, the viewing angle (βx, βy), according to the first feature coordinate (X1, Y1) of the region occupied by the first feature 345 in the first image (for example, the center coordinate of the pupil of the left eye) and the maximum viewing angle (αx, αy) of the first image capture device 320, where βx is the horizontal viewing angle and βy is the vertical viewing angle. It then estimates the relative position R2 of the first feature 345 and the display device 310 according to the viewing angle (βx, βy) and the relative position R1, where R1 is the relative position of the first image capture device 320 and the center of the whole display area of the display device 310. The maximum horizontal viewing angle αx of the maximum viewing angle (αx, αy) is the angle between the lines connecting the first image capture device 320 with the left and right sides of the received image, as shown in Fig. 3B; the maximum vertical viewing angle αy is the angle between the lines connecting the first image capture device 320 with the upper and lower sides of the received image, as shown in Fig. 3C. Although the relative positions R1 and R2 above both take the center of the whole display area of the display device 310 as the reference point, the invention is not limited thereto; for example, the lower-left corner, lower-right corner, upper-left corner or upper-right corner of the display device 310 may also serve as the reference point.
The processing device 330 obtains the viewing angle (βx, βy) from the first feature coordinate (X1, Y1) and the maximum viewing angle (αx, αy) as follows. Let the center coordinate of the first image be (X0, Y0) and the four corner coordinates of the first image be (Xm, Ym), (Xm, YM), (XM, Ym) and (XM, YM). Then the viewing angle (βx, βy) between the first feature 345 and the image capture axis 322 of the first image capture device 320 can be computed with the following equations, where the image capture axis 322 is the line connecting the first image capture device 320 with the center of the captured image.
k·tan(αx/2) = XM − X0 …………………………………………(1)
Equation (1) states that the tangent of half the maximum horizontal viewing angle (αx) is proportional to the difference between XM and X0.
k·tan(βx) = X1 − X0 …………………………………………(2)
Equation (2) states that the tangent of the horizontal viewing angle (βx) is proportional to the difference between X1 and X0, with the same proportionality constant k.
tan(βx) = tan(αx/2)·(X1 − X0)/(XM − X0) ………………………(3)
The tangent of the horizontal viewing angle (βx) can therefore be obtained from the tangent of half the maximum horizontal viewing angle (αx) together with X0, X1 and XM, as in equation (3). By the same method, replacing X0, X1, XM with Y0, Y1, YM and the maximum horizontal viewing angle (αx) with the maximum vertical viewing angle (αy) yields the tangent of the vertical viewing angle (βy).
After the processing device 330 has computed the viewing angle (βx, βy) between the first feature 345 and the image capture axis 322 of the first image capture device 320 with the above equations, and since the relative position R1 is known, the processing device 330 can compute the relative position R2 from R1 and the viewing angle (βx, βy), and store R2 as the position parameter. Having obtained the position parameter, the processing device 330 can determine the first display area in the display device 310 according to the position parameter, in order to show the second image.
There are numerous embodiments of computing the relative position R2 from R1 and the viewing angle (βx, βy). One simple embodiment is as follows. Since the relative position R1 is known, the processing device 330 can generate a viewing-angle-to-position conversion table from R1, which records a number of viewing angles together with the corresponding relative positions R2. When the processing device 330 receives (βx, βy), if an identical viewing angle exists in the conversion table, the corresponding relative position R2 is found by table lookup. If no identical viewing angle exists in the table, two or more of the closest viewing angles are used to obtain an interpolated value by linear approximation, quadratic approximation or Newton's method; this interpolated value is the estimate of the relative position R2. Although only the lookup-table method is described here, other computation methods may also be used to compute R2.
In one example, to compute the relative position R2 more precisely, the first image capture device 320 further transmits a focusing distance 324 to the processing device 330. The focusing distance 324 is a datum estimated by the focusing mechanism of the first image capture device 320, indicating the distance between the first image capture device 320 and the image plane in which the first object 340 lies. If the value of the focusing distance 324 is p, the distance between the first image capture device 320 and the image plane of the first object 340 is p. Take the first image capture device 320 as the origin, the image capture axis 322 as the z-axis, the vertical line orthogonal to the image capture axis 322 as the y-axis, and the horizontal line orthogonal to the image capture axis 322 as the x-axis. The coordinate of the first feature 345 relative to the first image capture device 320 is then (p·tan(βx), p·tan(βy), p). The processing device 330 can therefore compute the relative position R2 from this coordinate and the relative position R1.
With the focusing distance 324 known, the relative position R2 is computed from the coordinate of the first feature 345 relative to the first image capture device 320 and the relative position R1 as follows. Using the coordinate axes defined above, assume R1 is (R1x, R1y, R1z), defined as the coordinate of the first image capture device 320 in space minus the coordinate of the center point of the whole display area of the display device 310 in space. Let R2 be (R2x, R2y, R2z), defined as the coordinate of the first feature 345 in space minus the coordinate of the center point of the whole display area of the display device 310 in space. Then R2x equals R1x plus p·tan(βx), R2y equals R1y plus p·tan(βy), and R2z equals R1z plus p. Orthogonal coordinates are used here as the basis of operation, but coordinates convenient for other computations may also serve as the basis, without being limited by the disclosure herein.
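The componentwise computation of R2 above can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the function name and sample values are assumptions, with angles taken in degrees.

```python
import math

def relative_position_r2(r1, p, beta_x_deg, beta_y_deg):
    """Given R1 = (capture-device position minus display-center position),
    the focusing distance p, and the feature's angles to the capture axis,
    return R2 componentwise: R2 = R1 + (p*tan(beta_x), p*tan(beta_y), p)."""
    r1x, r1y, r1z = r1
    return (r1x + p * math.tan(math.radians(beta_x_deg)),
            r1y + p * math.tan(math.radians(beta_y_deg)),
            r1z + p)

# A camera 10 cm above the display center, feature on-axis at p = 100 cm.
r2 = relative_position_r2((0.0, 10.0, 0.0), 100.0, 0.0, 0.0)
```

With a feature on the capture axis, R2 differs from R1 only in the z-component, as the formula predicts.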
In one example, the first image capture device 320 does not transmit the focusing distance 324 to the processing device 330. The processing device 330 can instead estimate the relative position R2 of the first feature 345 and the first image capture device 320 from the first feature coordinate (X1, Y1) of the corresponding region in the first image, the second feature coordinate (X2, Y2), and the center coordinate (X0, Y0) of the first image. In this example, the first feature 345 is the eyes of the user. Since the spacing between the eyes of an adult user, in particular the spacing between the pupils, is roughly fixed at about 6 cm, the real distance corresponding to the difference (ΔX, ΔY) between the first feature coordinate (X1, Y1) and the second feature coordinate (X2, Y2) is about 6 cm. Accordingly, if the first viewing angle of the first feature coordinate (X1, Y1) relative to the image capture axis 322 is (βx1, βy1), and the second viewing angle of the second feature coordinate (X2, Y2) relative to the image capture axis 322 is (βx2, βy2), the processing device 330 can compute the relative position of the first feature 345, that is, the eyes of the user, and the first image capture device 320 according to the following equations.
ΔX = X2 − X1 ……………………………………………………(4)
ΔY = Y2 − Y1 ……………………………………………………(5)
Equations (4) and (5) represent the horizontal difference and the vertical difference between the first feature coordinate and the second feature coordinate. To simplify the computation and aid understanding, assume ΔY = 0, that is, the two eyes are at the same horizontal height; the real distance corresponding to ΔX is then 6 cm, and the focusing distance 324 (denoted by the symbol p) can be computed with the following equation (6):
p·|tan(βx2) − tan(βx1)| = ΔX ………………………………(6)
After the processing device 330 has computed the focusing distance 324 according to equation (6) above, it can further, in the manner described above, compute the relative position R2 of the display device 310 and the first feature 345 according to the relative position R1 of the first image capture device 320 and the display device 310.
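Equation (6) can be solved for p as follows. This is an illustrative sketch; the 6 cm spacing follows the assumption stated above, and the function name is invented.

```python
import math

EYE_SPACING_CM = 6.0  # roughly fixed adult interpupillary distance assumed above

def focusing_distance(beta_x1_deg, beta_x2_deg, spacing=EYE_SPACING_CM):
    """Solve equation (6), p * |tan(beta_x2) - tan(beta_x1)| = spacing,
    for the distance p between the capture device and the user's eyes."""
    denom = abs(math.tan(math.radians(beta_x2_deg)) -
                math.tan(math.radians(beta_x1_deg)))
    return spacing / denom
```

The closer the user stands, the larger the angular separation of the two pupils, so the computed p shrinks accordingly.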
Although the above method computes the viewing angle (βx, βy) between each feature coordinate and the image capture axis 322 by equations, in one example the equations may also be represented by a trigonometric table, producing a "coordinate-to-viewing-angle conversion table". The processing device 330 can then obtain the first viewing angle (βx1, βy1) directly from the first feature coordinate (X1, Y1) and the coordinate-to-viewing-angle conversion table, obtain the second viewing angle (βx2, βy2) from the second feature coordinate (X2, Y2) and the table, compute the focusing distance 324 according to equation (6), and, in the manner described above, compute the relative position R2 of the display device 310 and the first feature 345 according to the relative position R1 of the first image capture device 320 and the display device 310.
The above method can be summarized as follows; refer to Fig. 1 and Fig. 4, where Fig. 4 is a flow chart detailing step S230 of Fig. 2 according to an embodiment of the invention. As shown, in S401, the processing device 106 obtains the first viewing angle (βx1, βy1) from the first feature coordinate (X1, Y1) and the coordinate-to-viewing-angle conversion table. In S403, the processing device 106 obtains the second viewing angle (βx2, βy2) from the second feature coordinate (X2, Y2) and the table. In S405, the processing device 106 determines the position parameter according to the difference (ΔX, ΔY), the first viewing angle (βx1, βy1) and the second viewing angle (βx2, βy2).
Returning to Fig. 3A: in one example the relative position R2 is estimated from the first feature coordinate and the second feature coordinate. Although the interpupillary spacing of adult users is roughly fixed, each user differs individually; the spacing of a first user may be 5.5 cm while that of a second user is 7 cm. This individual difference may cause an estimation error in the relative position R2. The processing device 330 can therefore determine a second display area according to an adjust instruction input by the user; relative to the first display area, this second display area may have a rotation, a deformation or a displacement of its boundary coordinates. The processing device 330 can readjust the coordinate-to-viewing-angle conversion table according to the difference between the second display area and the first display area, and then replace the first display area with the second display area to show the second image. In this way, after the user inputs the adjust instruction, the processing device 330 can adjust the display area showing the second image more precisely for the individual user.
When the aforementioned first object 340 is a user, the first feature 345 may be, for example, the face, the eyes, the eyebrows, the ears, the region from the glabella to the nose, or the region of the nose and mouth, but is not limited thereto; other easily recognizable features may be selected as the first feature 345.
In one example, refer to Fig. 5, a schematic diagram of the operation of another embodiment of the invention. As shown in Fig. 5, the display system 50 includes a display device 510, a first image capture device 520, a second image capture device 530 and a processing device 540. The processing device 540 is electrically coupled to the display device 510, the first image capture device 520 and the second image capture device 530.
When the first image capture device 520 captures a first image of a first object 550, for example a first user, the processing device 540 shows a second image to the first object 550 in a first display area 512 of the display device 510. When the second image capture device 530 captures a third image of a second object 560, for example a second user, the processing device 540 shows a fourth image to the second object 560 in a second display area 514 of the display device 510. The method of determining the first display area 512 and the second display area 514 is as described in the embodiments of Fig. 1 and Fig. 3A to Fig. 3C.
The flow of the above method can be summarized as follows; refer to Fig. 6, a flow chart of a method according to an embodiment of the invention. As shown in Fig. 6, after the flow of Fig. 2 has been performed, in S601 the second image capture device captures a third image of the second object at a second position. In S603, the processing device determines, according to the second feature coordinate of the second feature of the second object in the third image, the second display area in which the fourth image is shown in the display device. In S605, the processing device controls the display device to show the fourth image in the second display area.
In one example, refer to Fig. 7A to Fig. 7C: Fig. 7A is a device diagram according to yet another embodiment of the invention, Fig. 7B is a schematic diagram of the operation of the first image capture device in Fig. 7A, and Fig. 7C is a schematic diagram of the operation of the second image capture device in Fig. 7A. As shown in Fig. 7A, the display system 70 includes a display device 710, a first image capture device 720, a second image capture device 730 and a processing device 740. The first image capture device 720 and the second image capture device 730 are located on the left and right sides of the display device 710 respectively, and the first image capture axis 722 of the first image capture device 720 and the second image capture axis 732 of the second image capture device 730 are both perpendicular to the display plane of the display device 710. The processing device 740 is electrically coupled to the display device 710, the first image capture device 720 and the second image capture device 730.
When a first object 750, for example a user, appears in front of the display system 70, the first image capture device 720 and the second image capture device 730 capture a first image and a third image of the first object 750 respectively. The first feature 755 of the first object 750, for example the glabella of the user, has a first viewing angle (βx1, βy1) with the first image capture axis 722 of the first image capture device 720, and a second viewing angle (βx2, βy2) with the second image capture axis 732 of the second image capture device 730. The method of computing the first viewing angle (βx1, βy1) and the second viewing angle (βx2, βy2) has been disclosed with Fig. 3A to Fig. 3C and is not repeated here. Assuming that the first image capture device 720 and the second image capture device 730 are at the same horizontal height, the first vertical viewing angle βy1 equals the second vertical viewing angle βy2. The processing device 740 can use the first horizontal viewing angle βx1 and the second horizontal viewing angle βx2 to compute the distance between the first feature 755 and the display device 710, and the projection point 756 of the first feature 755 on the display device 710. The method is as follows: assume the first image capture device 720 and the second image capture device 730 are a known distance d apart; the projection distance p between the first feature 755 and the display device 710 can then be computed from d, the first horizontal viewing angle βx1 and the second horizontal viewing angle βx2 with the following equation (7):
p·[tan(βx1) + tan(βx2)] = d ……………………………………………(7)
After computing the projection distance p, the processing device 740 can compute the position of the projection point 756 on the display device 710 from the projection distance p, the first horizontal viewing angle βx1 and the second horizontal viewing angle βx2, take the position of the projection point 756 on the display device 710 as the center point of the first display area, and determine the size of the first display area according to the projection distance p.
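Equation (7) and the projection-point computation can be sketched as follows. This is illustrative only; the function name and the sample geometry are assumptions, and the returned offset is measured from the first capture device toward the second.

```python
import math

def projection_from_two_cameras(d, beta_x1_deg, beta_x2_deg):
    """Two capture devices a known distance d apart, both axes perpendicular
    to the display plane. Equation (7), p*(tan(beta_x1)+tan(beta_x2)) = d,
    gives the projection distance p; the projection point then lies
    p*tan(beta_x1) from the first device toward the second."""
    t1 = math.tan(math.radians(beta_x1_deg))
    t2 = math.tan(math.radians(beta_x2_deg))
    p = d / (t1 + t2)
    x_from_first = p * t1  # horizontal offset of the projection point
    return p, x_from_first

# A user centered between cameras 100 cm apart, seen at 45 degrees by each
# camera, projects onto the midpoint at a distance of 50 cm.
p, x = projection_from_two_cameras(100.0, 45.0, 45.0)
```

Note that the two offsets p·tan(βx1) and p·tan(βx2) always sum to d, so the projection point is consistent whichever camera it is measured from.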
The flow of the above method can be summarized as follows; refer to Fig. 7A and Fig. 8, where Fig. 8 is a flow chart of a method according to an embodiment of the invention. As shown in Fig. 8, in S810 the first image capture device 720 captures a first image of the first object 750 at a first position. In S820, the second image capture device 730 captures a third image of the first object 750 at a second position. In S830, the processing device 740 determines the position parameter according to the first feature coordinate of the first feature 755 of the first object 750 in the first image and the second feature coordinate of the first feature 755 in the third image, where the position parameter indicates the relative position of the first object 750 and the display device 710. Then, in S840, the processing device 740 determines, according to the position parameter, the first display area in which the second image is shown in the display device 710.
In an embodiment of the present invention, refer to Fig. 9A and Fig. 9B: Fig. 9A is a schematic diagram of an implementation according to the invention, and Fig. 9B is a schematic diagram of another implementation. As shown in Fig. 9A, the display system 90 includes an arc display device 910 and an image capture device 920. The operating principles of each device in the display system 90 are as described with Fig. 1 to Fig. 7C and are not repeated here.
As shown in Fig. 9A, the user operates the display system 90 while seated. The processing device (not illustrated in Fig. 9A) can judge, from the image captured by the image capture device 920, the relative position of the user's face and the display system 90, in which the vertical relative position is not significantly larger than the horizontal relative position. The processing device therefore determines the display picture of the first display area 911 in the arc display device 910.
As shown in Fig. 9B, the user operates the display system 90 while standing. The processing device (not illustrated in Fig. 9B) can judge, from the image captured by the image capture device 920, the relative position of the user's face and the display system 90, in which the vertical relative position is significantly larger than the horizontal relative position. The processing device therefore determines the display picture of the second display area 913 in the arc display device 910.
In summary, in the display system disclosed by the present invention, the first image capture unit captures a first image associated with a first object. The processing device determines the relative position of the first object and the display device according to a first feature coordinate of a first feature of the first object in the first image, and adjusts the area of the display picture on the display device according to that relative position, so that the user does not have to keep the display device in the same position for long periods and can view the picture comfortably.
Certainly, the present invention may also have various other embodiments. Without departing from the spirit and essence of the invention, those skilled in the art can make various corresponding changes and modifications according to the present invention, and all such changes and modifications shall fall within the protection scope of the appended claims of the invention.

Claims (12)

1. A display method, characterized by comprising:
capturing, at a first position, a first image associated with a first object;
determining a position parameter according to a first feature coordinate of a first feature of the first object in the first image, wherein the position parameter indicates a relative position of the first object and a display device; and
determining, according to the position parameter, a first display area in which a second image is shown on the display device;
wherein the position parameter is further determined according to a second feature coordinate of the first feature in the first image and a difference between the first feature coordinate and the second feature coordinate;
wherein determining the position parameter comprises:
obtaining a first viewing angle according to the first feature coordinate and a coordinate-to-viewing-angle conversion table;
obtaining a second viewing angle according to the second feature coordinate and the coordinate-to-viewing-angle conversion table; and
determining the position parameter according to the difference, the first viewing angle, and the second viewing angle;
wherein the first feature coordinate and the second feature coordinate are two-dimensional coordinates.
2. The display method according to claim 1, characterized by further comprising:
determining a second display area according to an adjustment instruction;
modifying the coordinate-to-viewing-angle conversion table according to the second display area and the first display area; and
replacing the first display area with the second display area.
3. The display method according to claim 1, characterized in that the first feature is a face image or a part of the face image.
4. The display method according to claim 1, characterized by further comprising showing the second image in the first display area of the display device.
5. The display method according to claim 1, characterized by further comprising:
capturing, at a second position, a third image of a second object;
determining, according to a second feature coordinate of a second feature of the second object in the third image, a second display area in which a fourth image is shown on the display device; and
showing the fourth image in the second display area.
6. The display method according to claim 1, characterized by further comprising:
capturing, at a second position, a third image of the first object; and
determining the position parameter according to the first feature coordinate of the first feature in the first image and a second feature coordinate of the first feature in the third image.
7. A display system, characterized by comprising:
a display device;
a first image capture unit, configured to capture, at a first position, a first image of a first object; and
a processing device, coupled to the display device and the first image capture unit, configured to determine a position parameter according to a first feature coordinate of a first feature of the first object in the first image, and to determine, according to the position parameter, a first display area in which a second image is shown on the display device, wherein the position parameter indicates a relative position of the first object and the display device;
wherein the processing device further determines the position parameter according to a second feature coordinate of the first feature in the first image and a difference between the first feature coordinate and the second feature coordinate;
wherein the processing device further determines the position parameter according to a coordinate-to-viewing-angle conversion table together with the first feature coordinate, the second feature coordinate, and the difference;
wherein the first feature coordinate and the second feature coordinate are two-dimensional coordinates.
8. The display system according to claim 7, characterized in that the processing device further determines a second display area according to an adjustment instruction, modifies the coordinate-to-viewing-angle conversion table according to the second display area and the first display area, and then replaces the first display area with the second display area.
9. The display system according to claim 7, characterized in that the first feature is a face image or a part of the face image.
10. The display system according to claim 7, wherein the display device shows the second image in the first display area.
11. The display system according to claim 7, characterized by further comprising:
a second image capture unit, coupled to the processing device, configured to capture, at a second position, a third image of a second object; wherein the processing device further determines, according to a second feature coordinate of a second feature of the second object in the third image, a second display area in which a fourth image is shown on the display device, and shows the fourth image in the second display area.
12. The display system according to claim 7, characterized by further comprising:
a second image capture unit, coupled to the processing device, configured to capture, at a second position, a third image of the first object; wherein the processing device further determines the position parameter according to the first feature coordinate of the first feature in the first image and a second feature coordinate of the first feature in the third image.
CN201310722269.6A 2013-05-28 2013-12-24 Display system and method thereof Active CN103826151B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW102118838A TWI511053B (en) 2013-05-28 2013-05-28 Display system and method thereof
TW102118838 2013-05-28

Publications (2)

Publication Number Publication Date
CN103826151A CN103826151A (en) 2014-05-28
CN103826151B true CN103826151B (en) 2017-04-26

Family

ID=50760940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310722269.6A Active CN103826151B (en) 2013-05-28 2013-12-24 Display system and method thereof

Country Status (2)

Country Link
CN (1) CN103826151B (en)
TW (1) TWI511053B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW369636B (en) * 1997-03-12 1999-09-11 Toshiba Corp Semiconductor integrated circuit and its testing method
CN1457468A (en) * 2001-03-05 2003-11-19 皇家菲利浦电子有限公司 Automatic positioning of display depending upon viewer's location
US6917301B2 (en) * 1999-05-04 2005-07-12 Intellimats, Llc Floor display system with variable image orientation
CN1679319A (en) * 2002-07-05 2005-10-05 索尼株式会社 Image display system, image display method and display unit
EP2037679A1 (en) * 2006-06-23 2009-03-18 Sharp Kabushiki Kaisha Image display device, image display method, image display system, image data transmission device, program, and recording medium
TW201317832A (en) * 2011-10-28 2013-05-01 Top Victory Invest Ltd Remote control system based on user's motion and method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI369636B (en) * 2008-10-28 2012-08-01 Univ Nat Central Image system for adjusting displaying angle by detecting human face and visual simulation control apparatus thereof


Also Published As

Publication number Publication date
TWI511053B (en) 2015-12-01
CN103826151A (en) 2014-05-28
TW201445455A (en) 2014-12-01


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant