US20030137486A1 - Method and apparatus for locating a pointing elements within a digital image - Google Patents

Method and apparatus for locating a pointing elements within a digital image

Info

Publication number
US20030137486A1
Authority
US
United States
Prior art keywords
search, find, arm, tip, pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/210,525
Inventor
Alexis Wilke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Weather Central Holdings Inc
Original Assignee
AWA-TV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AWA-TV
Priority to US09/210,525
Assigned to AWA-TV (Assignors: WILKE, ALEXIS)
Publication of US20030137486A1
Assigned to WEATHER CENTRAL, INC. (Assignors: MAGIC TRAK, L.L.C.; WELCH, ANDREW D/B/A AWA-TV.COM)
Priority to US10/959,596 (published as US7170490B2)
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands


Abstract

In order to work, the optical mouse device needs to analyze images (usually from a camera feed, but it is not limited to just a camera) which have a single background color. Weather broadcasts often have a blue or green background used as a matte, but the invented optical mouse does not require any specific background color.

Description

    BACKGROUND OF THE INVENTION
  • What is referred to generically herein as an optical mouse device was first created in 1995 after more than three months of research on how to find a color ball within a video field. One area of particular interest for use of an optical mouse is in television weather broadcasts. The talent, by using an optical mouse, could, for example, move a cloud shown on a display from one position to another by moving a handheld pointing device with a colored ball at its end. [0001]
  • There were two problems: [0002]
  • 1. Because the digitized version of a video signal departs too far from the raw video signal, it is not possible to reliably find a specific color within a full video field. [0003]
  • 2. On a system with a slow processor, it is not possible to search the whole image for a single point of color and let another program work in parallel. [0004]
  • Therefore, a way to find the end (or tip) of a person's hand (or of a pointer device) was chosen to bypass these problems. No specific color is necessary to find the end. See U.S. Pat. No. 5,270,820 for a description of this technique. [0005]
  • SUMMARY OF THE INVENTION
  • In order to work, the optical mouse device needs to analyze images (usually from a camera feed, but it is not limited to just a camera) which have a single background color. Weather broadcasts often have a blue or green background used as a matte, but the invented optical mouse does not require any specific background color. [0006]
  • The object to be tracked (possibly a person, but it is not limited to just persons; e.g. it could track a robot) will be standing within the received image and will have a different color from the background, as shown in FIG. 1 and FIG. 2. The object may have parts using that background color if some areas need to appear transparent through a matte, etc. [0007]
  • When the data is coming to the computer through a video signal, the computer will need a piece of hardware to convert the video signal to a digital representation supported by the optical mouse device. Whatever the format of the images, it will have to be converted to a one-component (gray scale) or a three-component (red, green and blue) format. However, the way those are represented does not make any difference to how the optical mouse works. (FIG. 2.) [0008]
  • Then the device will search an area of the image for a pointing device (optical mouse). The pointing device position will be returned to the user of the device. The explanations found here can be used to search for a person standing in front of a camera, such as, but not limited to, a weather caster. Such a search could work in any direction for any kind of object or person and is valid for any kind of extension. [0009]
  • In this specific case, the device starts its search at the bottom of the area, where the body of a standing person should always be found (FIG. 3a). There are a few cases, explained below, when the device may find arms without finding the body of the person first. [0010]
  • Once the body of the person is found, a search for the arms is conducted (FIG. 3b). To do so, a search is started from the top of the image and continues toward the bottom. If nothing is found, there is no extended arm. [0011]
  • Finally, a search is conducted along the arm up to the tip of the arm, i.e., the end of the hand (FIG. 3c) or the pointing device (which will prolong the arm). This last search result is the position which the optical mouse device returns to the user. [0012]
  • When both arms are extended, one of them must be selected to know which is the one used to point (FIG. 3d). To accomplish this, simply select the one further away from the body: it is known where the body was found (Xb) and where the tip was found (Xt); therefore the distance |Xb−Xt| (the absolute value of the difference of both positions) is also known, and the longest distance determines which of the arms is the furthest away. Other methods could be used. Some optical mouse devices could be programmed to always return a position on the left side. [0013]
  • If no arm tip is found, a “no position” is returned. A sketch of this overall flow, in code, follows this summary. [0014]
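  • To make the overall flow concrete, the following is a minimal Python sketch of the search sequence just described (body found from the bottom, arms from the top), run on a synthetic binary image. All names and values here (img, find, STEP, the 64 x 48 field) are illustrative assumptions, not the patent's code.

    # synthetic 64 x 48 field: 1 marks the tracked object, 0 the background
    W, H = 64, 48
    img = [[0] * W for _ in range(H)]
    for y in range(20, H):               # the "body", columns 40..45
        for x in range(40, 46):
            img[y][x] = 1
    for x in range(15, 40):              # an arm extended to the left
        img[24][x] = 1
        img[25][x] = 1

    def is_object(x, y):
        return img[y][x] == 1

    def find(x, y, dx, dy, count):
        # scan from (x, y) by (dx, dy) until an object pixel is found
        for _ in range(count):
            if is_object(x, y):
                return x, y
            x, y = x + dx, y + dy
        return -1, -1                    # "no position"

    STEP = 4
    # body: scan the bottom row from the left, then from the right
    body_xl, _ = find(0, H - 1, STEP, 0, W // STEP)
    body_xr, _ = find(W - 1, H - 1, -STEP, 0, W // STEP)
    # arms: scan a column just outside each side of the body, top down
    arm_left = find(body_xl - 8, 0, 0, STEP, H // STEP)
    arm_right = find(body_xr + 8, 0, 0, STEP, H // STEP)
    print(body_xl, body_xr)              # 40 43 (within STEP of the true sides)
    print(arm_left, arm_right)           # (32, 24) (-1, -1): only a left arm

  • A real implementation would then walk along the found arm to its tip, as the FIND-TIP function in the detailed description below does.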
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a video field with a background and foreground object. [0015]
  • FIG. 2 shows a video field having a limited search area. [0016]
  • FIG. 2a shows a pixel being checked and its surrounding pixels. [0017]
  • FIG. 3a shows a search for a body portion. [0018]
  • FIG. 3b shows a search for arm portions. [0019]
  • FIG. 3c shows a search for a tip. [0020]
  • FIG. 3d shows a search for a desired arm portion. [0021]
  • FIG. 3e shows a detailed search for the tip. [0022]
  • FIG. 4 shows a special case for a body and arm search. [0023]
  • FIG. 5 shows a special case for an arm search. [0024]
  • DETAILED DESCRIPTION OF THE INVENTION
  • The following is a description of a special case referring to arms. These arms are described as the left arm and the right arm. The ‘left’ and ‘right’ words do not refer to the real left arm or right arm of the person standing in the video field. Rather, they are references to the extension seen on the left and the extension seen on the right as presented in the figures. See FIG. 3d. [0025]
  • 1. Full Image [0026]
  • The computer needs a source from which to receive a full digital image. This source can be, but is not limited to: (1) the Internet, (2) a movie or (3) a direct video feed to the computer, in which case the video signal will need to be converted to a digital representation (FIG. 1). [0027]
  • The full image sizes are represented as the variables WIDTH and HEIGHT. [0028]
  • 2. Area to be Tracked [0029]
  • The area to be tracked can be any rectangle 11 within the full video field. The definition is (X-LEFT, Y-TOP, X-RIGHT, Y-BOTTOM) in pixel coordinates. These coordinates are used to determine where to search for the body 13 and arms 15 as shown in FIG. 1. The origin of these coordinates is the top-left corner, with the Y coordinates increasing going down and the X coordinates increasing going right. [0030]
  • 3. Search Limit [0031]
  • Depending on the type of source, it may be necessary to define a limit 19 to avoid searching near the borders (FIG. 2), thus limiting problems while searching. Most of the time, a video feed will have artifacts on the border. The search limit can also be produced by defining a smaller area to be tracked. However, a full-image version of the optical mouse may rely only on a search limit. [0032]
  • The limit variable is defined as LIMIT and represents the number of pixels to skip on all borders. Note that four values could be defined to give a specific left, right, top and bottom limit. For the purpose of this discussion, we will limit ourselves to only one value. [0033]
  • 4. Background [0034]
  • The background 21 needs to be a single color; i.e. always use the same color. The color will possibly vary in intensity and chroma within a limited range defined by the user. [0035]
  • The color definitions will vary depending on the number of components of the digital representation of the image. Yet, each component can be checked against a range. A range will be defined as MIN for the lowest possible value and MAX for the highest possible value. These ranges may be defined as constants (the background is always of the same color in all the possible cases) or variables (the background can change from one usage to another; i.e. from a blue screen to a green screen). [0036]
  • Here is an algorithm to check if a pixel is of type background in a three component image (red, green and blue): [0037]
    -- the PIXEL variable is defined as a structure RGB with
    -- three fields: RED, GREEN and BLUE.
    FUNCTION IS-BACKGROUND (VAR PIXEL : RGB)
     -- check the red component
     -- RED-MIN is the minimum accepted red value
     -- RED-MAX is the maximum accepted red value
     IF PIXEL.RED < RED-MIN OR PIXEL.RED > RED-MAX THEN
      RETURN FALSE
     END IF
     -- check the green component
     IF PIXEL.GREEN < GREEN-MIN OR PIXEL.GREEN > GREEN-MAX THEN
      RETURN FALSE
     END IF
     -- check the blue component
     IF PIXEL.BLUE < BLUE-MIN OR PIXEL.BLUE > BLUE-MAX THEN
      RETURN FALSE
     END IF
     -- if all the tests went through properly, the pixel is
     -- a background pixel
     RETURN TRUE
  • Example of Ranges: [0038]
    Color of the background   Red           Green         Blue
    Blue                       0% to  15%    0% to  30%   75% to 100%
    Green                      0% to  10%   75% to 100%    0% to  40%
    White                     80% to 100%   80% to 100%   80% to 100%
  • With some optical devices, it will be better to check more than one single pixel. It is possible to check nine (9) pixels instead of one to make sure it is a background pixel, as shown in FIG. 2a. The range used to check the surrounding pixels may vary from the range used to check the middle pixel. [0039]
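  • The nine-pixel variant can be sketched as follows in Python. The range values (scaled to 0..255) and the names CENTER_RANGE and NEIGHBOR_RANGE are assumptions for illustration; the caller is assumed to stay at least one pixel inside the borders, which the LIMIT margin above already guarantees.

    # assumed per-component (min, max) ranges, scaled to 0..255;
    # the ring around the middle pixel may use a looser range
    CENTER_RANGE = ((0, 38), (0, 76), (191, 255))    # the "blue" row of the table above
    NEIGHBOR_RANGE = ((0, 50), (0, 90), (180, 255))

    def in_range(pixel, ranges):
        # pixel is an (r, g, b) tuple, ranges a (min, max) pair per component
        return all(lo <= c <= hi for c, (lo, hi) in zip(pixel, ranges))

    def is_background_3x3(img, x, y):
        # the middle pixel and all 8 of its neighbours must look like background
        if not in_range(img[y][x], CENTER_RANGE):
            return False
        return all(
            in_range(img[y + dy][x + dx], NEIGHBOR_RANGE)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )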
  • 5. Object [0040]
  • On the other hand, the object will be of all the colors which are not the background color. The area to be searched can therefore be defined with a positive (the object) and a negative (the background) set of pixels. A program could be written to transform the whole image in this way. However, to make it work in real time, it is necessary to check only a very limited number of pixels. The algorithm below shows how this works. [0041]
    FUNCTION IS-OBJECT (VAR PIXEL : RGB)
     -- we return TRUE when IS-BACKGROUND returns FALSE
     -- and vice versa
     IF IS-BACKGROUND (PIXEL) THEN
      RETURN FALSE
     ELSE
      RETURN TRUE
     END IF
  • 6. Search [0042]
  • To search for something in the screen, the optical mouse device needs to determine the sign of a pixel (positive or negative). This is done with a call to the IS-BACKGROUND or IS-OBJECT functions as defined in the Background and Object sections above. A search algorithm can look like this: [0043]
    FUNCTION FIND (VAR X, Y: INTEGER ;
      VAR X-INCREMENT, Y-INCREMENT: INTEGER ;
      VAR COUNT: INTEGER)
     -- COUNT determines the maximum number of iterations
     -- we can use to find the object
     WHILE COUNT > 0
      -- get the pixel at the position X, Y
      PIXEL := READ-PIXEL (X, Y)
      -- check if the pixel is part of the object
      IF IS-OBJECT (PIXEL) THEN
       -- we found the object, stop the search and
       -- return the position at which the object
       -- was found
       RETURN X, Y
      END IF
      -- the object was not found yet
      -- decrease the number of pixels to still be checked
      COUNT := COUNT − 1
      -- define the next X coordinate to be checked
      -- (note: X-INCREMENT can be negative to go from right to left)
      X := X + X-INCREMENT
      -- define the next Y coordinate to be checked
      -- (note: Y-INCREMENT can be negative to go from bottom to top)
      Y := Y + Y-INCREMENT
     END WHILE
     -- return a position which can't be reached as a “no position”
     -- in this example we use (−1, −1)
     RETURN −1, −1
  • In this FIND function we search for an object. The opposite could be accomplished simply by replacing the IS-OBJECT function call with IS-BACKGROUND. The variable COUNT determines how many pixels will be checked at most. The increment variables (X-INCREMENT and Y-INCREMENT) should be used with values larger than one (1) whenever possible so as to avoid testing too many pixels. [0044]
  • 7. Body [0045]
  • The first part we will search for is the body of the object. The following is an algorithm to search for a standing person. However, the optical mouse device is not specifically limited to such an object; it could be applied to a flying plane, an object attached to a ceiling, etc. [0046]
  • The search can be accomplished with the FIND function as defined in the Search section above. It will check one pixel every 20 pixels until the body is found or the whole screen has been searched. The position at which the body is found is put in variables for later use. [0047]
    PROCEDURE FIND-BODY
    -- define the STEP used to check pixels to find the body ;
    -- the STEP needs to change depending on the area and therefore
    -- the size of the standing person ;
    -- 20 is a good value for a full screen area
    STEP = 20
    -- compute the maximum number of left-right iterations within
    -- the area to be searched
    MAX-TEST = (X-RIGHT − X-LEFT − LIMIT * 2)/STEP
    -- find the left position of the body
    FIND (
     X-LEFT + LIMIT,
     Y-BOTTOM − LIMIT,
     STEP,
     0,
     MAX-TEST
    )
    RETURNING BODY-X-LEFT, BODY-Y-LEFT
    -- here, we should check whether the body was found
    -- when not, it's a special case and no body will be
    -- found searching from right to left so we could skip
    -- the following call
    -- to find the right position of the body we could also
    -- search for the background going through the body instead
    -- of searching from the other side of the screen (another
    -- FIND function would be necessary)
    -- find the right position of the body
    FIND (
     X-RIGHT − LIMIT,
     Y-BOTTOM − LIMIT,
      -STEP,
     0,
     MAX-TEST
    )
    RETURNING BODY-X-RIGHT, BODY-Y-RIGHT
  • In this algorithm the following is defined: [0048]
  • The variable STEP sets the number of pixels to be skipped on each iteration; the use of a STEP value is important to make the search fast; in this example we defined the step as 20, so if we start searching at the position (10, 120), the second pixel tested will be found at the position (30, 120), the third at (50, 120), etc. [0049]
  • The variable MAX-TEST sets the number of pixels which can be tested left to right; we use the X-LEFT and X-RIGHT coordinates as defined above; the LIMIT * 2 would instead be (LEFT-LIMIT + RIGHT-LIMIT) if the limits were specific on each side. [0050]
  • The call to the FIND function to look for the left side starts the search at the position: [0053]
  • (X-LEFT+LIMIT, Y-BOTTOM−LIMIT) [0054]
  • which is at the bottom-left of the image. The function will continue the search increasing the X coordinate by STEP (Y is not changed). The second pixel tested (if the body was not found yet) is: [0055]
  • (X-LEFT+LIMIT+STEP, Y-BOTTOM−LIMIT) [0056]
  • The FIND function tests pixels until the body is found, or MAX-TEST pixels are checked. The result is then saved in the BODY-X-LEFT and BODY-Y-LEFT variables. [0057]
  • The call to the FIND function to look for the right side starts the search at the position: [0058]
  • (X-RIGHT−LIMIT, Y-BOTTOM−LIMIT) [0059]
  • which is at the bottom-right of the image; the function will continue the search decreasing the X coordinate by STEP (Y is not changed). The second pixel tested (if the body was not found yet) is: [0060]
  • (X-RIGHT−LIMIT−STEP, Y-BOTTOM−LIMIT) [0061]
  • The FIND function tests pixels until the body is found, or MAX-TEST pixels are checked. The result is then saved in the BODY-X-RIGHT and BODY-Y-RIGHT variables. [0062]
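  • As a worked example with assumed numbers (a 720-pixel-wide search area, a LIMIT of 10 and the STEP of 20 used above), each bottom scan tests at most 35 pixels instead of roughly 700:

    # hypothetical values; only the arithmetic is the point here
    X_LEFT, X_RIGHT, LIMIT, STEP = 0, 720, 10, 20
    MAX_TEST = (X_RIGHT - X_LEFT - LIMIT * 2) // STEP
    print(MAX_TEST)   # -> 35 pixels tested per left-to-right scan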
  • 8. Arms [0063]
  • Once the body has been found, a search for the arms 15 is conducted. See FIG. 3b. In the example shown, we start looking from the top of the searched area to avoid any problems with the shadow the person may cast on the background. A shadow that is too dark would be seen as positive (i.e. as part of the standing person). [0064]
  • The search for the arms is very similar to the search for the body. Here is an example algorithm to do so: [0065]
    PROCEDURE FIND-ARMS
    -- define the STEP used to check pixels to find the arm ;
    -- it is smaller than for the body because arms are smaller
    -- the STEP needs to change depending on the area and therefore
    -- the size of the standing person
    STEP = 10
    -- compute the maximum number of top-bottom iterations
    -- within the area to be searched
    MAX-TEST = (Y-BOTTOM − Y-TOP − LIMIT * 2)/STEP
    -- find the position of the left arm
    FIND (
     BODY-X-LEFT − 20,
     Y-TOP + LIMIT,
     0,
     STEP,
     MAX-TEST
    )
    RETURNING ARM-X-LEFT, ARM-Y-LEFT
    -- find the position of the right arm
    FIND (
     BODY-X-RIGHT + 20,
     Y-TOP + LIMIT,
     0,
     STEP,
     MAX-TEST
    )
    RETURNING ARM-X-RIGHT, ARM-Y-RIGHT
  • This algorithm is very similar to the one in the Body algorithm description above. Note that the step is now used to increment the Y coordinate (X is not changed). There should be a test, which does not appear here, to make sure both sides of the body were found. The search starts from BODY-X-LEFT − 20 and BODY-X-RIGHT + 20 to avoid searching for shoulders and arms along the body (i.e. arms that are obviously not pointing). The value of MAX-TEST could also be made smaller since the arms cannot be lower than a certain point; the computation could be: [0066]
    SEARCH-HEIGHT = Y-BOTTOM − Y-TOP − LIMIT * 2
    MAX-TEST = SEARCH-HEIGHT * 0.80 / STEP [0067]
  • where we use only 80% of SEARCH-HEIGHT so the lowest 20% of the searched area will not be checked for an arm. See FIG. 3b. [0068]
  • 9. Pointing Device [0069]
  • Once arms are found, we can start searching for the tip 25 of the arm as shown in FIG. 3c. This search is more complicated, and can be accomplished by checking inside the arm or around the arm. The algorithm presented here goes around the arm. [0070]
    FUNCTION FIND-TIP (VAR X, Y: INTEGER ;
       VAR X-DIRECTION: INTEGER ;
       VAR STEP: INTEGER)
     -- loop until a RETURN is reached
     LOOP FOREVER
      SAVED-X := X
      SAVED-Y := Y
      -- move toward the tip of the arm
      X := X + STEP * X-DIRECTION
      -- clip the X position on the right
      IF X >= X-RIGHT − LIMIT THEN
       -- no tip found (or found on the border...)
       RETURN −1, −1
      END IF
      -- clip the X position on the left
      IF X <= X-LEFT + LIMIT THEN
       -- no tip found (or found on the border...)
       RETURN −1, −1
      END IF
      -- check the pixel, is it a background pixel?
      IF IS-BACKGROUND (READ-PIXEL (X, Y)) THEN
       -- the loop ends with the UNTIL
       LOOP
        -- move toward the bottom of the area
        Y := Y + STEP
        -- when we reach the bottom of the screen, we are “after”
        -- the tip of the pointing device ; so check for that case
        IF Y >= Y-BOTTOM − LIMIT THEN
         RETURN SAVED-X, SAVED-Y
        END IF
       -- check going below
       UNTIL IS-OBJECT (READ-PIXEL (X, Y))
       -- keep the position outside the arm
       Y := Y − STEP
      ELSE
       -- the pixel we moved to isn't a background pixel,
       -- therefore we need to go up instead of down to go
       -- around it ;
       -- the loop ends with the UNTIL
       LOOP
        -- move toward the top of the area
        Y := Y − STEP
        -- check whether we reached the top of the searchable area
        IF Y <= Y-TOP + LIMIT THEN
         -- this case could be treated as a special case by
         -- strong optical mouse devices as an arm touching
         -- the top of the screen ; yet in many cases the arm
         -- can never reach such a place
         RETURN −1, −1
        END IF
       UNTIL IS-BACKGROUND (READ-PIXEL (X, Y))
      END IF
      -- here we know we have a background pixel at (X, Y)
     END LOOP
  • This first algorithm can be used to find the tip of one arm. The direction (X-DIRECTION) parameter can be used to search on the left (−1) or on the right (1) as described below. [0071]
    PROCEDURE FIND-POINTERS
    -- search for the left pointing device with 5 pixels precision
    -- (direction set to −1 and step to 5)
    FIND-TIP (
     ARM-X-LEFT,
     ARM-Y-LEFT − 10,
     −1,
     5)
    RETURNING TIP-X-LEFT, TIP-Y-LEFT
    -- search for the left pointing device with 1 pixel precision
    -- (direction set to −1 and step to 1)
    FIND-TIP (
     TIP-X-LEFT − 5,
     TIP-Y-LEFT − 5,
     −1,
     1)
    RETURNING TIP-X-LEFT, TIP-Y-LEFT
    -- search for the right pointing device with 5 pixels precision
    -- (direction set to 1 and step to 5)
    FIND-TIP (
     ARM-X-RIGHT,
     ARM-Y-RIGHT − 10,
     1,
     5)
    RETURNING TIP-X-RIGHT, TIP-Y-RIGHT
    -- search for the right pointing device with 1 pixel precision
    -- (direction set to 1 and step to 1)
    FIND-TIP (
     TIP-X-RIGHT − 5,
     TIP-Y-RIGHT − 5,
     1,
     1)
     RETURNING TIP-X-RIGHT, TIP-Y-RIGHT
  • This last algorithm calls the FIND-TIP function, which determines the tip of each arm. FIND-POINTERS should check whether an arm was actually found (by FIND-ARMS) before searching for its tip. The search is repeated twice. See FIG. 3e: (1) with a step of 5, which returns a point near the tip, at about 5 pixels away, and (2) with a step of 1, which returns a point next to the tip (i.e. which is at most 1 pixel away). This technique only serves to enable the processing to occur in substantially real time. Other techniques could be used to close in on the tip instead of a new call to FIND-TIP. [0072]
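  • The coarse-then-fine idea can be shown in isolation with a one-dimensional Python sketch. The scan helper and the tip position used below are illustrative assumptions, not the FIND-TIP implementation; the search is assumed to start on the arm and move left toward the tip.

    def scan(start, step, is_object, limit):
        # walk left from start by step; return the last object x seen
        x, last = start, start
        while x > limit and is_object(x):
            last = x
            x -= step
        return last

    TIP = 123                            # hypothetical true tip position
    is_object = lambda x: x >= TIP       # the arm occupies x >= TIP

    coarse = scan(200, 5, is_object, 0)  # lands within 5 pixels of the tip
    fine = scan(coarse, 1, is_object, 0) # at most 1 pixel from the tip
    print(coarse, fine)                  # -> 125 123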
  • 10. Left or Right? [0073]
  • When two pointing devices are found (one on the right 15 and one on the left 27), a simple distance algorithm is used to determine which one will be used. See FIG. 3d. Other algorithms could be used, such as always forcing the left side or the right side to be picked. [0074]
    FUNCTION SELECT-TIP
    IF BODY-X-LEFT − TIP-X-LEFT > TIP-X-RIGHT − BODY-X-RIGHT
    THEN
     RETURN TIP-X-LEFT, TIP-Y-LEFT
    ELSE
     RETURN TIP-X-RIGHT, TIP-Y-RIGHT
    END IF
  • Since it is known which variable has the largest X coordinate, there is no need to use an absolute value function. [0075]
  • 11. Body on the Border [0076]
  • There are two special cases when the body 13 is on the border of the screen: one when the body is on the right (see FIG. 4) and one when the body is on the left. In both cases, a search is conducted for one arm 15. The search will be exactly the same as described above in the Body, Arms and Pointing Device algorithms. [0077]
  • The Body algorithm will usually return an unusable position when the person is standing on a border. A resulting position which is equal to the starting (first) position can be declared to be an invalid position (i.e. the body is on that side). [0078]
  • The handling of this special case just means that one side of the body may not exist. Therefore, the Arms algorithm needs to search only one of the two arms and set the other arm position to a ‘no position’ status. Similarly, the Pointing Device algorithm needs to search the tip of only one arm. [0079]
  • An algorithm could look like this: [0080]
     -- do some setup
     ...
     -- search for the body
     FIND-BODY
     -- check if the left position is correct
     -- (instead of 25, we should use the body STEP value)
     IF BODY-X-LEFT < X-LEFT + LIMIT + 25 THEN
      -- not a correct position
      -- we didn't find the body here
      BODY-X-LEFT := −1
      BODY-Y-LEFT := −1
     END IF
     -- check if the right position is correct
     IF BODY-X-RIGHT > X-RIGHT − LIMIT − 25 THEN
      -- not a correct position
      -- we didn't find the body here
      BODY-X-RIGHT := −1
      BODY-Y-RIGHT := −1
     END IF
     -- check if one side wasn't found
     -- search for only one arm
     IF BODY-X-LEFT = −1 AND BODY-Y-LEFT = −1 THEN
      IF BODY-X-RIGHT = −1 AND BODY-Y-RIGHT = −1 THEN
       -- both sides weren't found
       -- check for flying arms
       -- (see next point)
       FLYING-ARMS
      ELSE
       -- the left side wasn't found
       -- search the right arm only
       FIND-RIGHT-ARM
      END IF
      RETURN
     END IF
     IF BODY-X-RIGHT = −1 AND BODY-Y-RIGHT = −1 THEN
      -- the right side wasn't found
      -- search the left arm only
      FIND-LEFT-ARM
      RETURN
     END IF
  • 12. No Body Found or the “Flying Arms” [0081]
  • This special case happens whenever the person is out of the video field but is still pointing to something within the video field, as shown in FIG. 5. This special case occurs whenever the body 13 is not found at all in the area to be searched. [0082]
  • Once this special case is detected (as in the Body or the Border algorithms), a search for an arm (also called a flying arm) on the left is performed as if the body was just beyond the left limit. Similarly, if nothing is found on the left, a check on the right is performed as if the body was just beyond the right limit (this one-side-at-a-time limitation is given here because a person cannot be on both sides of the screen; it is specific to this example of the optical mouse device). [0083]
  • The following algorithm shows an example to do so: [0084]
    PROCEDURE FLYING-ARMS
     -- simulate a body on the right of the screen
     BODY-X-LEFT := X-RIGHT − LIMIT
     BODY-Y-LEFT := Y-BOTTOM − LIMIT
     -- simulate a body on the left of the screen
     BODY-X-RIGHT := X-LEFT + LIMIT
     BODY-Y-RIGHT := Y-BOTTOM − LIMIT
     -- search for the arms
     FIND-ARMS
     -- search for the pointing device
     FIND-POINTERS
  • The FLYING-ARMS procedure sets all the body coordinates to fake positions as if a body had been found. Thus the FIND-ARMS procedure can be called to perform a search for two arms. [0090]
  • 13. Nothing Found [0091]
  • In the event nothing is found (the area is only negative), the optical mouse device will return a “no position” value. [0092]
  • Because of the way it works, the optical mouse could be made to search several areas in each image. This way, more than one pointing device could be found and managed; e.g. two persons could appear in the image: one on the left and one on the right. A sketch of this extension follows. [0093]
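  • As an illustration of that extension, here is a minimal Python sketch; the rectangle list and the locate_pointer helper are assumptions standing in for the whole body/arms/tip pipeline described above, not part of the patent.

    def locate_pointer(area):
        x_left, y_top, x_right, y_bottom = area
        # ... body, arms and tip searches restricted to this rectangle ...
        return (-1, -1)   # "no position" placeholder for this sketch

    # one rectangle per person, e.g. the two halves of a 720 x 486 field
    AREAS = [
        (0, 0, 360, 486),
        (360, 0, 720, 486),
    ]

    positions = [locate_pointer(a) for a in AREAS]
    print(positions)   # one pointer position (or "no position") per area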

Claims (1)

1. A method for finding a pointing element within a digital image comprising the steps of:
a) receiving a digitized image including a background and an object to be tracked, said background and said object having a different color;
b) defining a rectangle within said image including at least a portion of said object;
c) conducting a search within the defined rectangle starting at a position adjacent a first side of a bottom portion of said rectangle and continuing in a direction towards a second side of said rectangle until said object is located and storing a first x value corresponding thereto;
d) conducting a second search within the defined rectangle starting at a position adjacent said second side of said bottom portion of said rectangle and continuing in a direction towards said first side of said rectangle until said object is located and storing a second x value corresponding thereto;
e) conducting a third search within the defined rectangle starting at a position adjacent a top portion of said rectangle and offset from said first x value and continuing in a direction towards said bottom until said object is located and storing a first y value corresponding thereto;
f) conducting a fourth search within the defined rectangle starting at a position adjacent said top portion of said rectangle and offset from said second x value and continuing in a direction towards said bottom until said object is located and storing a second y value corresponding thereto;
g) conducting a fifth search using the stored first x and second x and first y and second y positions to find a point representing one of a leftmost and a rightmost position of said object and storing x and y coordinates corresponding to said found point.
US09/210,525 1998-12-11 1998-12-11 Method and apparatus for locating a pointing elements within a digital image Abandoned US20030137486A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US09/210,525 US20030137486A1 (en) 1998-12-11 1998-12-11 Method and apparatus for locating a pointing elements within a digital image
US10/959,596 US7170490B2 (en) 1998-12-11 2004-10-06 Method and apparatus for locating a pointing element within a digital image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US09/210,525 US20030137486A1 (en) 1998-12-11 1998-12-11 Method and apparatus for locating a pointing elements within a digital image

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US10/959,596 Continuation-In-Part US7170490B2 (en) 1998-12-11 2004-10-06 Method and apparatus for locating a pointing element within a digital image
US10/959,596 Continuation US7170490B2 (en) 1998-12-11 2004-10-06 Method and apparatus for locating a pointing element within a digital image

Publications (1)

Publication Number Publication Date
US20030137486A1 true US20030137486A1 (en) 2003-07-24

Family

ID=22783251

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/210,525 Abandoned US20030137486A1 (en) 1998-12-11 1998-12-11 Method and apparatus for locating a pointing elements within a digital image
US10/959,596 Expired - Lifetime US7170490B2 (en) 1998-12-11 2004-10-06 Method and apparatus for locating a pointing element within a digital image

Family Applications After (1)

Application Number Title Priority Date Filing Date
US10/959,596 Expired - Lifetime US7170490B2 (en) 1998-12-11 2004-10-06 Method and apparatus for locating a pointing element within a digital image

Country Status (1)

Country Link
US (2) US20030137486A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070018966A1 (en) * 2005-07-25 2007-01-25 Blythe Michael M Predicted object location
US20100060581A1 (en) * 2008-05-02 2010-03-11 Moore John S System and Method for Updating Live Weather Presentations
US10368059B2 (en) * 2015-10-02 2019-07-30 Atheer, Inc. Method and apparatus for individualized three dimensional display calibration


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5937081A (en) * 1996-04-10 1999-08-10 O'brill; Michael R. Image composition system and method of using same
US6300955B1 (en) * 1997-09-03 2001-10-09 Mgi Software Corporation Method and system for mask generation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4843568A (en) * 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
US6434271B1 (en) * 1998-02-06 2002-08-13 Compaq Computer Corporation Technique for locating objects within an image

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7071914B1 (en) * 2000-09-01 2006-07-04 Sony Computer Entertainment Inc. User input device and method for interaction with graphic images
US20190250698A1 (en) * 2004-10-25 2019-08-15 I-Interactive Llc Method for controlling an application employing identification of a displayed image
US11561608B2 (en) * 2004-10-25 2023-01-24 I-Interactive Llc Method for controlling an application employing identification of a displayed image
WO2006136738A2 (en) * 2005-06-24 2006-12-28 Daniel Martin Digital pointing device on a screen
WO2006136738A3 (en) * 2005-06-24 2007-03-22 Daniel Martin Digital pointing device on a screen
US20090297039A1 (en) * 2007-04-27 2009-12-03 Brijot Imaging Systems, Inc. Software methodology for autonomous concealed object detection and threat assessment
US8437566B2 (en) * 2007-04-27 2013-05-07 Microsemi Corporation Software methodology for autonomous concealed object detection and threat assessment
US8490877B2 (en) 2010-11-09 2013-07-23 Metrologic Instruments, Inc. Digital-imaging based code symbol reading system having finger-pointing triggered mode of operation
US9569652B2 (en) 2010-11-09 2017-02-14 Metrologic Instruments, Inc. Code symbol reading system

Also Published As

Publication number Publication date
US7170490B2 (en) 2007-01-30
US20050195158A1 (en) 2005-09-08


Legal Events

Date Code Title Description
AS Assignment

Owner name: AWA-TV, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WILKE, ALEXIS;REEL/FRAME:009712/0165

Effective date: 19981211

AS Assignment

Owner name: WEATHER CENTRAL, INC., WISCONSIN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WELCH, ANDREW D/B/A AWA-TV.COM;MAGIC TRAK, L.L.C.;REEL/FRAME:014959/0886

Effective date: 20031010

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION