CN102804204A - Gesture recognition using chroma-keying - Google Patents

Gesture recognition using chroma-keying

Info

Publication number
CN102804204A
CN102804204A CN2010800282991A CN201080028299A
Authority
CN
China
Prior art keywords
color
target
background
module
given
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2010800282991A
Other languages
Chinese (zh)
Other versions
CN102804204B (en)
Inventor
宋嵘
J·王
Y·刘
H·张
Current Assignee
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Priority to CN201080028299.1A priority Critical patent/CN102804204B/en
Publication of CN102804204A publication Critical patent/CN102804204A/en
Application granted granted Critical
Publication of CN102804204B publication Critical patent/CN102804204B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics

Abstract

A method and system for analyzing an object are provided. The system comprises: a background arranged behind the object, wherein the color of the background can be selected from a set of colors; a first unit for setting a given color for the background such that the given color differs from the color of the object; a second unit for taking a picture including the object and the background; and a third unit for detecting at least one feature relating to the object from the picture taken by the second unit. Because the background color can be set to differ from the object color, the object portion of the picture is easily distinguished from the background portion. This results in stable recognition for different objects, especially objects of different colors.

Description

Gesture recognition using chroma-keying
Technical field
The present invention relates to image processing techniques, and in particular to analyzing a target using image processing techniques.
Background art
Devices that use image processing techniques to recognize a target, for example the shape of a human hand, are known in the art, and there are existing systems that use a camera and an electronic processing device to recognize, for example, different gestures of a human hand. In these existing systems, the image of the target is typically captured in front of a cluttered background or a background of constant color. However, different targets often have different colors (for example, because skin tones differ). When the color of the target is close to the background color, these existing systems have difficulty providing accurate and stable recognition. For this reason, their application in fields that require high accuracy and good stability is limited.
There is therefore a need for a technique that detects and analyzes a target such as a human hand with high accuracy and good stability.
Summary of the invention
It would be highly advantageous to provide a technique that detects and analyzes different targets, in particular targets of different colors, with high accuracy and good stability.
According to the present invention, a background is arranged behind the target to be analyzed, and the color of this background can be set to differ from the color of the target. In this way, after an image containing the target and the background has been captured, the target portion of the image is easily distinguished from the background portion. This achieves stable recognition for different targets, in particular targets of different colors.
According to an aspect of the present invention, a kind of system and method that target is analyzed of being used for is provided.This system comprises:
Be arranged in target background afterwards, the color of wherein said background allows to be selected from any one in one group of color;
First module is used to be provided with the color of said background, so that said color is different from the color of said target;
Unit second is used to take the image that comprises said target and said background; And
Unit the 3rd is used for detecting at least one characteristic relevant with said target according to the image by said second unit photographs.
The advantage of this system and method is that, because the background color differs strongly from the target color, the foreground (i.e., the target portion) and the background region are easily separated in the captured image. Moreover, the background color can be set according to the color of the target, so even if the target changes and its color changes with it, the system can still reliably detect features of the target from the captured image.
Preferably, the color selected for the background differs from the color of the target in chroma, in luminance, or in both. Depending on the requirements, the system can use an inexpensive monochrome camera or a color camera as the second unit.
Preferably, the color set for the background can be determined by a first module in the first unit. In one example, the first module selects a default color (for example, green) as the background color. In another example, the background color is determined from a user input indicating the background color the user wishes to select; such a system is very simple to operate. In yet another example, the background color is determined from the detected color distribution of the target together with a user input. The system can thus help a patient or user select the most suitable background color, so that a more accurate analysis of the target is obtained.
Preferably, the color distribution of the target can be determined automatically by performing the following steps:
selecting for the background a subset of the available colors in turn, so as to generate a set of backgrounds with different colors;
for each background in the set, capturing at least one image containing the target and that background; and
detecting the color distribution of the target based on a statistical analysis of the images captured for the set of backgrounds.
The system can thus detect the color distribution of the target automatically, and so determine a suitable background color precisely without user intervention.
In a preferred embodiment, the target is a part of the body (for example, a human hand), and the system further comprises a fourth unit for recognizing the posture of that body part based on the at least one feature detected by the third unit. For example, the detected feature can be the contour of the body part. The system can detect the posture of the body part accurately and stably for further analysis.
Furthermore, the system can be used as a rehabilitation system, with the target being a part of a patient's body (for example, the upper limb and hand). The rehabilitation system further evaluates whether the recognized posture of the body part is correct and then returns feedback, for example through an audio or video signal, to motivate the patient. The rehabilitation system can therefore serve as a home-based system that encourages the patient to carry out self-administered rehabilitation training.
According to a further aspect of the present invention, a device for supporting the analysis of a target is also provided, the device comprising:
a background intended to be arranged behind the target, wherein the color of the background can be selected from any one of a variety of colors; and
a first unit for setting a given color for the background so that the given color differs from the color of the target.
The device can cooperate with a camera and a computing device to form the rehabilitation system described above.
These and other aspects of the present invention will become apparent from the embodiments described below.
Description of drawings
The present invention will be described and explained in more detail below in conjunction with embodiments and with reference to the accompanying drawings, in which:
Fig. 1 is a schematic diagram of a system for analyzing a target according to an embodiment of the invention;
Fig. 2 is a flowchart of the operation of the system shown in Fig. 1;
Fig. 3 is a block diagram of the controller in Fig. 1;
Fig. 4 is a flowchart of automatically detecting the color distribution of a target according to an embodiment of the invention;
Fig. 5 is an operational flowchart of a rehabilitation system according to an embodiment of the invention;
Fig. 6 is a schematic diagram of contours obtained according to an embodiment of the invention.
Throughout the drawings, the same reference numerals denote similar or corresponding features and/or functions.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below with reference to the accompanying drawings.
The inventors have found that it is highly beneficial if the color of a background placed behind the target to be analyzed can be configured to differ from the color of the target. Because of the large color difference between the target and the background, the target portion can easily be separated from an image containing both.
Fig. 1 shows an example system 10 for analyzing a target based on the above idea. The system 10 can be, for example, a gesture recognition system. As shown in Fig. 1, the system 10 comprises a background 110, a controller 120 (first unit), a camera 130 (second unit), and a processing unit 140, where the processing unit 140 comprises, for example, a detecting unit 141 (third unit) and further optional functional units 142 and 143 (described later).
As shown in Fig. 1, the target 150 to be analyzed, for example a human hand, is placed between the background 110 and the camera 130. The position and height of the background 110 can be adjusted as needed, for example to suit different users. The camera 130 captures an image containing the target 150 and the background 110. The camera 130 can be a digital still camera or a video camera, and can be coupled to the processing unit 140 through a wired or wireless connection so that captured images are transferred from the camera 130 to the processing unit 140 for analysis. The processing unit 140 can be a notebook computer, a PDA, a cell phone, or any other electronic device with image processing capability. The processing unit 140 processes the received image to detect at least one feature relating to the target 150, such as its position, size, contour, or any other feature detectable from the image.
In the embodiment shown in Fig. 1, the color of the background 110 can be selected from any one of a variety of colors. For example, the background 110 can be a panel with embedded LEDs whose color can be any one of 100 different colors obtained by gradually increasing or decreasing the RGB values. The invention is not limited to this: the number and type of background colors can be chosen as needed, and the background 110 is not limited to an LED panel; it can also be, for example, a temperature-controlled color-changing panel. Besides the background color, the texture of the background can also be selected and changed according to the user's preference, providing a user-friendly interface.
In Fig. 1, the controller 120 is coupled to the background 110 in order to select a given color for the background 110, where this given color differs from the color of the target. The controller 120 can operate as a separate unit, for example a remote control unit, or can be integrated with the background 110 (if its structure is simple), or integrated into the processing unit 140. The given color selected for the background can differ from the target color in chroma, in luminance, or in both. For example, a color can be described in YUV or RGB format. In YUV format, Y represents luminance while U and V represent chroma. Suppose only the chroma of the target color (for example, skin color) is considered, with U in the range 80-100 and V in the range 120-140; the given color of the background should then lie outside this color range, for example with both U and V around 230. With such a setting there is a large color difference between the target color and the background color, which helps detect features of the target (for example, its contour) from the image captured by the camera 130.
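The chroma-difference criterion above can be sketched in a few lines. This is a toy illustration only; the function name and the specific U/V values are illustrative assumptions, not part of the patent:

```python
import math

def chroma_distance(uv1, uv2):
    """Euclidean distance in the (U, V) chroma plane of YUV."""
    return math.hypot(uv1[0] - uv2[0], uv1[1] - uv2[1])

# Illustrative values from the description: skin chroma around
# U = 80-100, V = 120-140; a backdrop set to U = V = 230.
skin = (90, 130)
backdrop = (230, 230)
print(chroma_distance(skin, backdrop))  # well over 150: easy to key out
```

With the skin chroma near (90, 130) and the backdrop at (230, 230), the separation exceeds 170 units, far larger than any variation within the skin range itself.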
Fig. 2 shows an exemplary operating procedure of the system 10. As shown in Fig. 2, at step S210 the system 10 is initialized and the target 150 (for example, a human hand) is placed in front of the background 110. At step S220, the controller 120 sets the color of the background to a given color by sending a control signal to the background 110. The given color differs markedly from the color of the target; for example, it can be a default color such as a green that clearly differs from the skin color. Examples of the controller 120 are described below with reference to Figs. 3 and 4. At step S230, the camera 130 captures at least one image containing the target 150 and the configured background 110, and the captured image is sent to the processing unit 140. The camera 130 can capture images on instruction from the processing unit or the controller, or it can capture images continuously. At step S240, the detecting unit 141 in the processing unit 140 performs image processing on the received image to detect at least one feature relating to the target. Because the color difference between background and target is large, the detecting unit 141 can easily segment the target portion from the captured image using known techniques; for example, the segmentation can be performed using an automatically determined threshold. In the detecting unit 141, the segmented target portion is processed further to detect features of the target, for example its position, size, and contour, for further analysis (for example, recognition). An exemplary feature-detection procedure is described in detail below with reference to Fig. 5.
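The thresholded segmentation mentioned for step S240 could look roughly like this. It is a toy sketch on a 3x3 "image" of RGB tuples; the fixed threshold and the colors are assumptions for illustration, whereas the patent determines the threshold automatically:

```python
def segment(image, bg_color, threshold=60):
    """Label each pixel as target (1) or background (0) by its color
    distance to the known backdrop color: a simple chroma-key split."""
    def dist(c1, c2):
        return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5
    return [[0 if dist(px, bg_color) < threshold else 1 for px in row]
            for row in image]

green = (0, 200, 0)          # configured backdrop color
skin = (220, 180, 150)       # hand-like foreground color
image = [[green, green, green],
         [green, skin,  green],
         [green, skin,  green]]
mask = segment(image, green)
print(mask)  # 1s mark the foreground region
```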
In Fig. 1, the controller 120 can be implemented in various ways. Fig. 3 shows some examples of the controller 120.
As shown in Fig. 3, the controller 120 comprises a first module 310 for determining the given color for the background 110. For example, as described above, during initialization the first module can select a default color (for example, green) as the given color and set the background 110 to that color.
Still referring to Fig. 3, the controller 120 can also comprise a user interface 320 for receiving user input. The user input directly indicates the background color the user wishes to select, and in this case the first module 310 takes the color indicated by the user input as the given color. This is a simple approach, because in most cases the user can subjectively choose a background color that differs from the target color.
In another example, the controller 120 further comprises a second module 330 for detecting the color distribution of the target 150. In this example, the user input from the user interface 320 indicates the colors of the target (for example, the skin color and the sleeve color). Based on the user input, the second module 330 generates the color distribution of the target 150, and the first module 310 determines the given color from this distribution. For example, the given color can be a color, among the 100 different colors, whose color distance to both the skin color and the sleeve color exceeds a threshold. The larger the color distance between the given color and each color component in the target's color distribution, the better the expected result.
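The selection rule just described (require the color distance to every target color component, such as skin and sleeve, to exceed a threshold, and prefer the largest margin) might be sketched as follows; all names and values are illustrative assumptions:

```python
def pick_backdrop(palette, target_colors, threshold=100):
    """Pick the palette color whose minimum distance to all target
    color components is largest, provided it clears the threshold."""
    def dist(c1, c2):
        return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5
    best = max(palette, key=lambda c: min(dist(c, t) for t in target_colors))
    if min(dist(best, t) for t in target_colors) < threshold:
        return None  # no sufficiently distinct color in this palette
    return best

skin, sleeve = (210, 170, 140), (60, 60, 120)
palette = [(0, 220, 0), (200, 180, 150), (90, 80, 130)]
print(pick_backdrop(palette, [skin, sleeve]))  # -> (0, 220, 0)
```

Only the green entry keeps a large margin to both the skin and the sleeve color, so it is selected.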
In another embodiment, the controller 120 can comprise only the first module 310 and the second module 330. In this example, the second module 330 detects the color distribution of the target 150 automatically, which is particularly suitable when the target color is complicated. Fig. 4 shows an exemplary procedure.
As shown in Fig. 4, the procedure begins at step S410, in which the background 110 is configured to present a set of colors in turn (for example, 100 different colors, or a subset of them), thereby generating a set of backgrounds with different colors. Step S410 can be performed by the second module 330, or by the first module 310 under the control of the second module 330. At step S420, for each background color, the camera 130 captures, under the control of the second module, at least one image containing the target 150 and the background, producing a set of images with different background colors. At step S430, this set of images is sent to the second module 330, either directly or via the processing unit 140. At step S440, a statistical analysis is performed on the set of images to detect the color distribution of the target.
In one example, at step S440 the RGB distribution of each image in the set is computed via a histogram. Because the color distribution of the target is the same or similar across the images, it can be obtained by comparing the computed distributions. In a simpler example, the captured images are accumulated together over the R, G, and B dimensions. Because the background color of these images varies over a wide range while the target portion remains unchanged, the histogram of the accumulated frames exhibits a peak region clearly higher than the rest; this peak region corresponds to the color range of the target portion, i.e., the color distribution of the target. After the color distribution of the target has been detected, the given color for the background can be chosen as the color whose color distance to the center or peak of the target's color distribution is largest.
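The accumulation idea of step S440 can be illustrated with a toy example: pixels from frames shot against different backdrop colors are pooled, and the color that persists across all frames, i.e. the histogram peak, belongs to the target. A real implementation would bin colors into a coarse RGB histogram rather than count exact tuples; everything here is illustrative:

```python
from collections import Counter

def target_color_peak(images):
    """Pool pixel colors over frames shot against changing backdrops;
    the most frequent color is the unchanged target portion."""
    counts = Counter(px for img in images for row in img for px in row)
    return counts.most_common(1)[0][0]

skin = (210, 170, 140)
frames = [
    [[(0, 200, 0), skin], [skin, (0, 200, 0)]],  # green backdrop
    [[(0, 0, 220), skin], [skin, (0, 0, 220)]],  # blue backdrop
    [[(220, 0, 0), skin], [skin, (220, 0, 0)]],  # red backdrop
]
print(target_color_peak(frames))  # the stable skin color wins
```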
Figs. 1-4 show several embodiments of the system 10. The system 10 can be used in different applications, for example as a gesture recognition system. For such a recognition system, the processing unit 140 also comprises a recognition unit 142 for recognizing the user's gesture from the features detected by the detecting unit 141. The system 10 can also be applied to other systems, for example a product sorting system.
Fig. 5 shows an example in which the system 10 of Fig. 1 is applied as a rehabilitation system. A rehabilitation system helps a patient suffering from motor dysfunction recover lost function. Existing rehabilitation systems include, for example, functional electrical stimulation (FES) systems and robots, but these are expensive and complicated, which limits their use at home. This embodiment therefore proposes an inexpensive home rehabilitation system with which a patient can carry out self-administered rehabilitation training, for example of the upper limb and hand.
In this embodiment, the system 10 of Fig. 1 serves as the rehabilitation system. The processing unit 140 further comprises a command unit (not shown) for instructing the patient, by an audio or video signal, to perform a specific action. The processing unit 140 also comprises the recognition unit 142 described above and a judging unit 143, which judges whether the posture recognized by the unit 142 is correct and returns a feedback signal to motivate the patient. In addition, the processing unit 140 can comprise a display for showing the captured images in real time.
Fig. 5 illustrates an example operational flow of this rehabilitation system. As shown in Fig. 5, at step S510 the patient, serving as the target 150, is asked to place the upper limb and hand between the background 110 and the camera 130 and to perform a specific action on command (for example, clenching a fist). At step S520, following the procedures of Figs. 2-4, the background color is selected to differ from, for example, the patient's skin color and sleeve color. At step S530, in the same manner as in Fig. 2, the camera 130 captures an image containing the patient's upper limb and hand together with the background, to capture their state. At step S540, the image is sent to the detecting unit 141 in the processing unit 140 to detect features of the upper limb and hand. For example, in step S540, after the target portion (i.e., the upper limb and hand) has been segmented, an algorithm for finding the geometric center, or an erosion algorithm, is used to detect, for example, the center of the hand. This center represents the position of the target and can in turn serve as a reference. Also in step S540, the segmented target portion (the upper limb and hand) is filtered by a high-pass filter (for example, using the Sobel operator) to obtain the contour of the upper limb and hand. Preferably, the resulting contour is further deblurred to remove blurring from the contour; the deblurring is highly beneficial where sharp edges play an important role. Fig. 6 shows two contours 610 and 620 of the upper limb and hand obtained in this way from at least two sequentially captured images.
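Step S540 combines two operations: locating the target by its geometric center and outlining it with a Sobel-style high-pass filter. The following is a self-contained toy sketch on a binary mask; the function names and the tiny mask are illustrative assumptions:

```python
def geometric_center(mask):
    """Centroid of the segmented target: a reference for its position."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def sobel_outline(mask):
    """Mark pixels where the Sobel gradient magnitude is non-zero,
    i.e. the boundary between target and backdrop."""
    kx = [(-1, 0, 1), (-2, 0, 2), (-1, 0, 1)]
    ky = [(-1, -2, -1), (0, 0, 0), (1, 2, 1)]
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * mask[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * mask[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = 1 if gx or gy else 0
    return out

mask = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
print(geometric_center(mask))  # (2.0, 2.0)
print(sobel_outline(mask))     # ring of 1s around the filled block
```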
Next, in step S550, the recognition unit 142 further processes the detected features (including position and/or contour) to identify the posture made by the patient. The recognition can be achieved in several ways. For example, the posture of the upper limb can be identified preliminarily from the detected center and the angle between the upper arm and the forearm in the detected contour. Preferably, an artificial neural network (ANN) algorithm or a maximum-likelihood method can be used in step S550. For the ANN algorithm, the input can be features of the detected contour, for example feature points on the contour (extreme points or inflection points), or the mean or standard deviation of the distances of contour points from the center. The output of the ANN identifies the recognized posture, for example the open hand 610 and the clenched fist 620 in Fig. 6. At step S560, the judging unit 143 judges whether the recognized posture matches a predefined correct posture and gives a feedback signal, for example a reward, based on the result. With this encouragement, the patient can continue the self-administered rehabilitation training. Optionally, multiple such rehabilitation systems can be connected to a rehabilitation center through a network, so that a doctor at the center can remotely assist two or more patients in rehabilitation training at the same time.
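The contour statistics named as ANN inputs (mean and standard deviation of contour-point distances to the center) can be computed as below; the "fist" and "open hand" outlines are crude illustrative stand-ins, not data from the patent:

```python
import math

def contour_features(points, center):
    """Mean and standard deviation of contour-point distances to the
    center: simple shape descriptors that could feed a classifier."""
    d = [math.hypot(x - center[0], y - center[1]) for x, y in points]
    mean = sum(d) / len(d)
    std = (sum((v - mean) ** 2 for v in d) / len(d)) ** 0.5
    return mean, std

# A round outline (fist-like) has low spread of radii; a spiky
# outline (open hand with extended fingers) has high spread.
fist = [(1, 0), (0, 1), (-1, 0), (0, -1)]
open_hand = [(3, 0), (0, 1), (-3, 0), (0, -1)]
print(contour_features(fist, (0, 0)))       # (1.0, 0.0)
print(contour_features(open_hand, (0, 0)))  # (2.0, 1.0)
```

The spread alone already separates the two toy shapes, which is why such statistics are plausible classifier inputs.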
Although the present invention has been shown and described in detail in the drawings and the foregoing description, such illustration and description are to be considered illustrative and exemplary rather than restrictive; the invention is not limited to the disclosed embodiments. For example, the invention can also be applied to recognizing the lower limbs or other body parts, or to recognizing the spatial pose of a specific device. Alternatively, the invention can be applied to product sorting or other fields of target analysis.
In practicing the invention, other variations to the disclosed embodiments can be understood and effected by those skilled in the art from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite articles "a" and "an" do not exclude a plurality. The functions of several items recited in the claims may be fulfilled by a single processor or other unit. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored or distributed on a suitable medium, for example an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, for example via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims (15)

1. A system (10) for analyzing a target (150), comprising:
-a background unit (110) arranged behind said target (150) for providing a background for said target, the color of said background unit (110) being selectable from any one of a set of colors;
-a first unit (120) for setting the color of said background unit (110) to a given color of said set of colors, said given color being different from the color of said target (150);
-a second unit (130) for capturing an image containing said target (150) and said background unit (110) set to said given color;
-a third unit (141) for detecting at least one feature relating to said target (150) from said image captured by said second unit (130).
2. The system according to claim 1, wherein said given color differs from the color of said target (150) in chroma and/or luminance.
3. The system according to claim 1, wherein said first unit (120) comprises a first module (310) for determining said given color.
4. The system according to claim 3, wherein said first unit (120) further comprises a second module (330) for detecting the color distribution of said target (150);
wherein said given color is determined based on the detected color distribution of said target (150).
5. The system according to claim 3, wherein said first unit (120) further comprises a user interface (320) for receiving a user input;
wherein said given color is determined according to said user input.
6. The system according to claim 4, wherein said first unit (120) further comprises a user interface (320) for receiving a user input;
wherein said given color is determined based on said user input and the detected color distribution of said target (150).
7. The system according to claim 1, wherein said target (150) is a part of a body, said system (10) further comprising:
a fourth unit (142) for recognizing the posture of said part of said body based on said at least one feature detected by said third unit (141).
8. A device for supporting the analysis of a target, comprising:
a background unit (110) intended to be arranged behind said target (150), wherein the color of said background unit (110) is selectable from any one of a set of colors;
a first unit (120) for providing said background unit (110) with a given color of said set of colors, said given color being different from the color of said target (150).
9. The device according to claim 8, wherein said first unit (120) further comprises a first module (310) for determining said given color.
10. The device according to claim 8, wherein said first unit (120) further comprises a user interface (320) for receiving a user input;
wherein said given color is determined according to said user input.
11. The device according to claim 8, wherein said background unit (110) is a panel with at least one embedded LED.
12. A method of analyzing a target (150), comprising the steps of:
-providing (S210) a background arranged behind said target, the color of said background being selectable from any one of a set of colors;
-providing (S220) said background with a given color of said set of colors, said given color being different from the color of said target;
-capturing (S230) an image containing said target and said background set to said given color; and
-a first detecting step (S240) of detecting at least one feature relating to said target from the captured image.
13. The method according to claim 12, further comprising a determining step of determining said given color.
14. The method according to claim 13, wherein said determining step further comprises a step of receiving a user input, and wherein said given color is determined based on said user input.
15. The method according to claim 13, wherein said determining step further comprises:
-a second detecting step of detecting the color distribution of said target; wherein said given color is determined based on the detected color distribution of said target.
CN201080028299.1A 2009-06-25 2010-06-23 Gesture recognition using chroma-keying Expired - Fee Related CN102804204B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201080028299.1A CN102804204B (en) 2009-06-25 2010-06-23 Gesture recognition using chroma-keying

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN2009101508941 2009-06-25
CN200910150894 2009-06-25
CN200910150894.1 2009-06-25
PCT/IB2010/052839 WO2010150201A1 (en) 2009-06-25 2010-06-23 Gesture recognition using chroma-keying
CN201080028299.1A CN102804204B (en) 2009-06-25 2010-06-23 Gesture recognition using chroma-keying

Publications (2)

Publication Number Publication Date
CN102804204A true CN102804204A (en) 2012-11-28
CN102804204B CN102804204B (en) 2015-07-01

Family

ID=42931884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201080028299.1A Expired - Fee Related CN102804204B (en) 2009-06-25 2010-06-23 Geture recognition using chroma- keying

Country Status (8)

Country Link
US (1) US8817125B2 (en)
EP (1) EP2446395A1 (en)
JP (1) JP5793493B2 (en)
KR (1) KR101783999B1 (en)
CN (1) CN102804204B (en)
BR (1) BRPI1010041A2 (en)
RU (1) RU2556417C2 (en)
WO (1) WO2010150201A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103903000A (en) * 2012-12-28 2014-07-02 Lenovo (Beijing) Co., Ltd. Method and system for recognizing target object from background
CN109214372A (en) * 2018-11-01 2019-01-15 Shenzhen Dorabot Robotics Co., Ltd. Attitude determination method, device and computer readable storage medium
CN110678810A (en) * 2017-05-22 2020-01-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Illumination system and recording system for volume capture
CN112836692A (en) * 2020-11-27 2021-05-25 Beijing Baidu Netcom Science and Technology Co., Ltd. Method, apparatus, device and medium for processing image

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5413673B2 (en) * 2010-03-08 2014-02-12 ソニー株式会社 Information processing apparatus and method, and program
JP6335695B2 (en) * 2014-07-09 2018-05-30 キヤノン株式会社 Information processing apparatus, control method therefor, program, and storage medium
US11386580B1 (en) * 2021-08-13 2022-07-12 Goodsize Inc. System apparatus and method for guiding user to comply with application-specific requirements

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5423554A (en) * 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
CN1411284A (en) * 2001-10-05 2003-04-16 LG Electronics Inc. Method for testing face by image
US20070200938A1 (en) * 2005-03-10 2007-08-30 Toshihiko Kaku Image-Taking System
CN101398896A (en) * 2007-09-28 2009-04-01 Samsung Electronics Co., Ltd. Device and method for extracting color characteristic with strong discernment for image forming apparatus

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09185711A (en) * 1995-12-28 1997-07-15 Kobe Steel Ltd Shape recognizing method and its device
GB9619119D0 (en) * 1996-09-12 1996-10-23 Discreet Logic Inc Processing image
US5946500A (en) * 1997-09-18 1999-08-31 Oles; Henry J. Apparatus and method for chroma replacement
JP3241327B2 (en) * 1998-08-22 2001-12-25 大聖電機有限会社 Chroma key system
US6674485B2 (en) * 1998-08-31 2004-01-06 Hitachi Software Engineering Co., Ltd. Apparatus and method for image compositing
JP2001246161A (en) * 1999-12-31 2001-09-11 Square Co Ltd Device and method for game using gesture recognizing technic and recording medium storing program to realize the method
JP2002118859A (en) * 2000-10-05 2002-04-19 Sony Corp Photographing device and photographing method
US7253832B2 (en) * 2001-08-13 2007-08-07 Olympus Corporation Shape extraction system and 3-D (three dimension) information acquisition system using the same
JP3876985B2 (en) * 2002-09-18 2007-02-07 オムロンエンタテインメント株式会社 Photo sticker vending machine and image printing method
RU2268497C2 (en) * 2003-06-23 2006-01-20 Закрытое акционерное общество "ЭЛВИИС" System and method for automated video surveillance and recognition of objects and situations
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
JP2007156950A (en) * 2005-12-07 2007-06-21 Toyota Motor Corp Vehicle operating device
US8589824B2 (en) 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
JP2008152622A (en) * 2006-12-19 2008-07-03 Mitsubishi Electric Corp Pointing device


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103903000A (en) * 2012-12-28 2014-07-02 Lenovo (Beijing) Co., Ltd. Method and system for recognizing target object from background
CN103903000B (en) * 2012-12-28 2020-01-31 Lenovo (Beijing) Co., Ltd. Method and system for identifying target object from background
CN110678810A (en) * 2017-05-22 2020-01-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Illumination system and recording system for volume capture
CN109214372A (en) * 2018-11-01 2019-01-15 Shenzhen Dorabot Robotics Co., Ltd. Attitude determination method, device and computer readable storage medium
CN109214372B (en) * 2018-11-01 2021-04-02 Shenzhen Dorabot Robot Intelligence Co., Ltd. Attitude determination method, attitude determination device and computer-readable storage medium
CN112836692A (en) * 2020-11-27 2021-05-25 Beijing Baidu Netcom Science and Technology Co., Ltd. Method, apparatus, device and medium for processing image

Also Published As

Publication number Publication date
EP2446395A1 (en) 2012-05-02
JP5793493B2 (en) 2015-10-14
WO2010150201A1 (en) 2010-12-29
RU2012102412A (en) 2013-07-27
BRPI1010041A2 (en) 2016-09-20
KR20120112359A (en) 2012-10-11
RU2556417C2 (en) 2015-07-10
US20120092519A1 (en) 2012-04-19
US8817125B2 (en) 2014-08-26
JP2012531652A (en) 2012-12-10
KR101783999B1 (en) 2017-11-06
CN102804204B (en) 2015-07-01

Similar Documents

Publication Publication Date Title
CN102804204A (en) Geture recognition using chroma- keying
US9124812B2 (en) Object image capture apparatus and method
EP2806373A2 (en) Image processing system and method of improving human face recognition
CN103312972B (en) Electronic device and its focusing method
CN101364265A (en) Method for auto configuring equipment parameter of electronic appliance and ccd camera
CN102761706A (en) Imaging device and imaging method and program
EP2373002A1 (en) Image capture device
US10282601B2 (en) Electronic device and gesture recognition method applied therein
US10127424B2 (en) Image processing apparatus, image processing method, and image processing system
CN111880640B (en) Screen control method and device, electronic equipment and storage medium
CN108781268B (en) Image processing apparatus and method
JP6884219B2 (en) Image analysis technique
CN105376524B (en) Fuzzy detection method, monitoring device and monitoring system for image picture
CN113545028A (en) Gain control for face authentication
WO2013114803A1 (en) Image processing device, image processing method therefor, computer program, and image processing system
CN114424145B (en) Proximity detection method, terminal and storage medium
CN104349112B (en) Video conference device and its method
CN108984140B (en) Display control method and system
TW201512701A (en) Image capturing apparatus and the control method thereof
CN101996324B (en) Dynamic sampling-based method for extracting laser spot
US9320113B2 (en) Method of self-calibrating a lighting device and a lighting device performing the method
US11575841B2 (en) Information processing apparatus, imaging apparatus, method, and storage medium
CN112347834B (en) Remote nursing method, equipment and readable storage medium based on personnel category attribute
CN107749942A (en) Suspension image pickup method, mobile terminal and computer-readable recording medium
KR101685419B1 (en) Apparatus and method for processing digital image

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150701