US20130079907A1 - Golf athleticism rating system - Google Patents

Golf athleticism rating system

Info

Publication number
US20130079907A1
Authority
US
United States
Prior art keywords
athlete
athleticism
athletic performance
results
golf
Prior art date
Legal status
Abandoned
Application number
US13/682,217
Inventor
Kristopher L Homsi
David H Annis
Current Assignee
Nike Inc
Original Assignee
Nike Inc
Priority date
Filing date
Publication date
Priority claimed from US12/559,082 (US20100129780A1)
Application filed by Nike Inc
Priority to US13/682,217 (US20130079907A1)
Priority to PCT/US2012/069784 (WO2014081442A1)
Publication of US20130079907A1
Assigned to NIKE, INC. reassignment NIKE, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HOMSI, KRISTOPHER L.
Assigned to NIKE, INC. reassignment NIKE, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ANNIS, DAVID

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063 Operations research, analysis or management
    • G06Q 10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2505/00 Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B 2505/09 Rehabilitation or training
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 Teaching not covered by other main groups of this subclass
    • G09B 19/003 Repetitive work cycles; Sequence of movements
    • G09B 19/0038 Sports

Definitions

  • aspects described herein relate to athleticism ratings and related performance measuring systems, methods, apparatuses and the like.
  • aspects are directed to athleticism ratings and performance measuring systems for determining golf athleticism for an individual or group.
  • Some current systems attempt to use objective standards to evaluate athletic potential. Oftentimes, the systems use the same data and same athletic exercises or tests regardless of the type of athletic activity for which the athlete is being evaluated.
  • the present invention seeks to overcome certain limitations and other drawbacks, and to provide new features not heretofore available.
  • the present invention generally relates to systems and methods for rating the performance of an athlete, particularly a golf athlete using various types of exercises and tests.
  • the tests or exercises may be specific to evaluating golf athleticism.
  • aspects described herein may include a portable test field or mat that may be used to perform the various exercises or tests for determining athleticism of an individual.
  • FIG. 1 illustrates a general operating environment in which one or more aspects described herein may be included.
  • FIG. 2 is a flowchart illustrating an exemplary method for collecting data for generating an athleticism rating of an athlete based on a set of athletic exercises or tests according to one or more aspects described herein.
  • FIG. 3 is an exemplary graph illustrating rankings of performances in a particular type of exercise among an athlete pool according to one or more aspects described herein.
  • FIG. 4 is an exemplary table illustrating scoring values for an athlete along with possible scoring values according to one or more aspects described herein.
  • FIG. 5 is a table illustrating an example set of athleticism scores for various athleticism tests performed by an athlete according to one or more aspects described herein.
  • FIG. 6 is a table illustrating an example set of athleticism scores corresponding to ceiling values for various athleticism tests according to one or more aspects described herein.
  • FIG. 7 is a flowchart illustrating an exemplary method for generating an athleticism rating according to one or more aspects described herein.
  • FIG. 8 is a timeline and process flow illustrating a method for performing a step test according to one or more aspects described herein.
  • FIGS. 9A-9C illustrate an example athletic activity exercise that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 10A and 10B illustrate another example athletic activity exercise that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 11A-11C illustrate another example athletic activity exercise that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 12A-12C illustrate another example athletic activity exercise, a wood-chop bounce, that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 13A-13D illustrate another example athletic activity exercise, a side-sling object launch, that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 14A and 14B illustrate an example exercise test field that may be used to perform one or more athleticism exercises/tests according to one or more aspects described herein.
  • a first aspect of the present invention is directed to a system for performing a method, the method including receiving athletic performance results from multiple types of performance tests.
  • the athletic performance results including at least two selected from the following: 1) a change in pulse of an athlete measured during a stepping exercise, 2) a broad jump distance of an athlete, 3) a lateral hop distance of an athlete, 4) a bounce distance of a ball when thrown by the athlete in a downward direction toward a target, and 5) a sling distance of a ball when thrown by the athlete using an underhanded side sling.
  • the method also comprises generating, by the computing system, a golf athleticism rating based on the at least two athletic performance results.
  • a second aspect of the present invention is directed to non-transitory computer readable media having computer-executable instructions embodied thereon that when executed by a processor perform a method for evaluating the athleticism of an athlete in golf,
  • the method comprises receiving at least two results for the athlete's performance in at least two different athletic performance tests related to golf.
  • the method further comprises comparing each of the at least two results to a corresponding distribution of test results of athletic data for athletes similar to the athlete and determining a percentile ranking for each of the at least two results.
  • the method further comprises transforming the percentile ranking for each of the at least two results to a fractional event point number for each result.
  • the percentile rankings for each of the at least two results are progressive.
  • the method further comprises determining an athleticism rating score for the athlete in golf based on the fractional event point numbers.
  • FIG. 1 is a block diagram of an athleticism measurement system 100 that includes a sensor 102 (e.g., an accelerometer, compression sensor, inertial measurement system, etc.) that is borne by an athlete during different athletic drills or tests to generate data that are used to generate an athleticism rating, such as a rating described in International patent application no. PCT/US2005/040493 for Athleticism Rating and Performance Measuring Systems and incorporated herein by reference.
  • Sensor 102 may be any type of sensor configured to detect a stimulus and provide a resulting signal.
  • sensor 102 may be configured to detect a force, such as an impact force from a person or object striking another person or object.
  • sensor 102 may be utilized to measure one or more parameters, such as, for example, velocity, acceleration, pressure, location, energy transfer, temperature, orientation, light, sound, magnetism, or a particular motion along two or more axes.
  • sensor 102 may comprise an accelerometer module.
  • the accelerometer module may be implemented with a two-axis accelerometer for measuring acceleration along two orthogonal axes.
  • the accelerometer module may be implemented with a three-axis accelerometer for measuring acceleration along three orthogonal axes.
  • sensor 102 may comprise a camera.
  • a camera may detect or measure one or more properties of an athlete or other user, before, during or after, any processes or routines disclosed herein.
  • multiple sensors may be used in measuring an athlete's performance during one or more athleticism exercises or tests.
  • the multiple sensors may be of the same type or may include different types.
  • the multiple sensors may correspond to accelerometers placed at different locations on a test field or on an athlete's body.
  • a first sensor may comprise an accelerometer while another sensor may comprise a pulse measurement sensor. Multiple sensors may be incorporated into the same physical device or may each be physically separate from the others.
  • Acceleration sensor 102 may be positioned in a shoe, on top of a shoe, fastened around the ankle or wrist, attached to waist belts or incorporated into apparel on the body of the athlete, or otherwise borne by the athlete.
  • sensor 102 may be worn or attached to any other portion of an athlete's body and/or incorporated into clothing as necessary or desired.
  • an athlete may wear a sensor around his or her head to measure head movement.
  • an athlete may wear a shirt having a heart rate monitor included therein.
  • sensor 102 communicates over a link 104 with an athleticism rating processing device 106 .
  • link 104 is a wireless, digital, low-power RF link with 1-way or 2-way transmission.
  • a wired link could alternatively be employed in some applications.
  • Athleticism rating processing device 106 may include one or more of an athleticism timing system, such as an electronic timing system, or a stopwatch, sport watch, digital music player, mobile phone, wireless athleticism measurement kiosk, etc. configured to communicate over link 104 with sensor 102 .
  • athleticism rating processing device 106 allows a user (e.g., an athlete, coach, etc.) to select an athleticism measurement mode from among multiple selectable athleticism measurement modes. During a measurement mode, athleticism rating processing device 106 obtains and stores acceleration data from sensor 102 and selected timing data. In addition, athleticism rating processing device 106 may cue the athlete to perform certain actions during an athleticism measurement or may provide feedback during or after the measurement.
  • athleticism rating processing device 106 delivers the acceleration data and the timing data by wired or wireless communication to an athleticism rating computer system 108 that calculates an athleticism rating based in part on the sensed data and/or timing data.
  • Athleticism rating computer system 108 may be disposed at the location where the athlete performs the athletic drills or tests or may be located remotely and accessed over a computer network (e.g., the Internet).
  • athleticism rating processing device 106 may calculate an athleticism rating directly.
  • athleticism rating processing device 106 and/or athleticism rating computer system 108 may be included as part of the same physical device as sensor 102 .
  • sensor 102 may include a processor and memory storing instructions for processing the athleticism data and subsequently calculating an athleticism rating.
  • athleticism rating processing device 106 , athleticism rating computer system 108 , and sensor 102 may all correspond to physically separate devices.
  • processing device 106 and rating computer system 108 may correspond to a single physical computing device or system. Any of the sensor 102 , processing device 106 and computer system 108 may also be configured to operate in multiple modes, each mode corresponding to a different sport, type of athletic activity, type of athleticism rating being determined and the like.
  • An example multi-mode athleticism movement measurement system is described in further detail in U.S. Application Pub. No. 2008/0249736 A1, entitled “MULTI-MODE ACCELERATION-BASED ATHLETICISM MEASUREMENT SYSTEM,” and filed on Sep. 28, 2007, which is hereby incorporated by reference in its entirety.
  • Processing device 106 may include one or more computing devices and components.
  • processing device 106 may include a computing unit 113 .
  • the computing unit 113 includes a processing unit 115 and a system memory 117 .
  • the processing unit 115 may be any type of processing device for executing software instructions, but will conventionally be a microprocessor device. In some arrangements, processing unit 115 may be single-core or multi-core.
  • the system memory 117 may include both a read-only memory (ROM) 119 and a random access memory (RAM) 121 .
  • both the read-only memory (ROM) 119 and the random access memory (RAM) 121 may store software instructions for execution by the processing unit 115 . Further, it is contemplated that one or more forms of memory may be non-transitory computer readable media.
  • the processing unit 115 and the system memory 117 are connected, either directly or indirectly, through a bus 123 or alternate communication structure to one or more peripheral devices.
  • the processing unit 115 or the system memory 117 may be directly or indirectly connected to additional memory storage, such as the hard disk drive 127 and the optical disk drive 129 .
  • Other types of memory may also be used, including flash memory and removable magnetic drives.
  • the processing unit 115 and the system memory 117 also may be directly or indirectly connected to one or more input devices 131 and one or more output devices 133 .
  • the input devices 131 may include, for example, a keyboard, touch screen, a remote control pad, a pointing device (such as a mouse, touchpad, stylus, trackball, or joystick), a scanner, a camera or a microphone.
  • the output devices 133 may include, for example, a monitor display, television, printer, stereo, or speakers.
  • the computing unit 113 will be directly or indirectly connected to one or more network interfaces 125 for communicating with a network.
  • This type of network interface 125 , also sometimes referred to as a network adapter or network interface card (NIC), translates data and control signals from the computing unit 113 into network messages according to one or more communication protocols, such as the Transmission Control Protocol (TCP), the Internet Protocol (IP), and the User Datagram Protocol (UDP).
  • Network adapters may be wireless or wired or combinations thereof. These protocols are well known in the art, and thus will not be discussed here in more detail.
  • a network interface 125 may employ any suitable connection agent for connecting to a network, including, for example, a wireless transceiver, a power line adapter, a modem, or an Ethernet connection. Connection agents may similarly be wireless or wired or a combination thereof.
  • a golf athleticism rating may be determined based on a battery of tests and exercises that may, in one or more arrangements, be specific to potential in the golfing arena. For example, a user's performance during the tests and exercises may be measured using sensors such as sensor 102 , collected by a processing device such as processing device 106 and used to generate an athleticism rating by a computing device such as computing system 108 . In embodiments, one or more of the performance results may be measured manually and entered, for instance, into input device 131 of processing device 106 . Additionally, it is contemplated that one or more performance results may be measured automatically and provided to the processing device 106 , in an exemplary aspect.
  • FIG. 2 illustrates an exemplary method 200 whereby a computing device may detect and/or receive data corresponding to an athlete's performance for one or more golf exercises and tests and generate an athleticism rating based thereon.
  • athlete information may be received by the computing system.
  • the athlete information may include name, gender, age, height, weight, left/right-handedness, shoe size, wingspan, and the like. This information may be used in calculating performance in a particular test or exercise. For example, if a computing system seeks to determine an athlete's leg strength, the computing system may use the athlete's weight in combination with the athlete's vertical jump height to determine leg strength or power.
  • shoe size may be used to ensure proper foot placement for a particular test or exercise.
  • a system may determine if this condition has been satisfied by detecting a heel edge of a user's footwear and determining, using the athlete's shoe size, whether the toe point of the foot overlaps the heel edge of the user's other foot.
  • An athlete's information may also be used to categorize the data. For example, the athlete's gender may be used to determine a pool of data in which to store the athlete's performance data. The various pools of data may be used as the basis for generating athleticism ratings. Thus, athleticism ratings for male and female athletes may correspond to different scales and athlete pools and might not be directly comparable.
  • the computing system may determine a series of one or more tests or exercises for the athlete.
  • the series of tests or exercises may be selected based on the athlete information. For example, some exercises or tests might not be age-appropriate for younger athletes. In a particular example, a dunking test for basketball might not be appropriate or valuable for athletes under the age of 14 due to height, muscular development and other issues.
  • Tests and exercises might also be sport-specific or athletic activity-specific. In a present embodiment, for instance, tests and exercises might be specific to golf. Example tests and exercises for evaluating golf athleticism are described in further detail below. In other arrangements, some exercises may be generic to multiple sports, but a set of exercises or tests are specific or unique to a given sport or athletic activity. Moreover, tests and exercises or sets of tests or exercises may be gender specific.
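
To make the test-selection logic above concrete, here is a minimal sketch of how a battery of tests might be filtered by sport and athlete age. The test names, the age threshold, and the AthleteInfo fields are illustrative assumptions, not values taken from this disclosure.

```python
from dataclasses import dataclass

@dataclass
class AthleteInfo:
    # Fields mirror the athlete information mentioned above; names are assumed.
    name: str
    gender: str      # e.g., "male" / "female" (may also select the norm pool)
    age: int
    handedness: str  # "left" / "right"

# Hypothetical catalog: each test lists the sports it applies to and a minimum age.
TEST_CATALOG = [
    {"test": "par 5 step test",             "sports": {"golf"},               "min_age": 0},
    {"test": "broad jump",                  "sports": {"golf", "basketball"}, "min_age": 0},
    {"test": "countermovement lateral hop", "sports": {"golf"},               "min_age": 0},
    {"test": "wood-chop bounce",            "sports": {"golf"},               "min_age": 0},
    {"test": "side-sling object launch",    "sports": {"golf"},               "min_age": 0},
    {"test": "dunk test",                   "sports": {"basketball"},         "min_age": 14},
]

def select_tests(athlete: AthleteInfo, sport: str) -> list[str]:
    """Return the battery of tests appropriate for this athlete and sport."""
    return [t["test"] for t in TEST_CATALOG
            if sport in t["sports"] and athlete.age >= t["min_age"]]

print(select_tests(AthleteInfo("A. Athlete", "female", 16, "right"), "golf"))
```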
  • the computing system may provide instructions on when and how to perform each exercise.
  • the computing system may provide audible, visual or haptic cues to indicate times at which a particular movement or action is expected.
  • an audible and/or visible cue may be provided when a user is expected to jump or throw an object.
  • Instructions may further include an animation illustrating the type of movement expected for the test.
  • the computing system may progress through the set of exercises and tests, instructing the athlete through each one.
  • the computing system may concurrently and/or subsequently request and receive performance data.
  • the computing system may, based on the determined exercise or set of exercises, generate requests for particular types of data such as a number of steps taken, a top speed, a distance an object was thrown, a distance the athlete moved, or an impact force (e.g., force of the athlete hitting the ground or another object, of an object being thrown or otherwise propelled by the athlete and the like).
  • the computing system may generate an interface including a data entry form.
  • the data entry form may include fields for particular types of performance data based on the types of exercises and tests performed.
  • the data requested by the computer system and, in some examples, the data entry form may also be unique to the sport or athletic activity.
  • Various types of data are described in additional detail below.
  • the computer system may, in step 220 , generate an athleticism score or value for each test/exercise by comparing the athlete's performance data with data from a pool of other athletes for the particular test/exercise. Based on the athlete's athleticism score for each test/exercise, an overall athleticism rating for an athletic activity or sport such as golf may be determined in step 225 . In one example, a number of points may be determined for each exercise or test and the points later combined. Either the overall athleticism rating or the exercise-specific scores or both may also be scaled. The athleticism rating may thus represent a level of potential or skill in a particular athletic activity (e.g., sport-specific athleticism rating) relative to an athlete pool. Accordingly, athleticism ratings may be comparable within the athlete pool, but might not be comparable outside of the pool. In some arrangements, the athleticism ratings may be comparable between multiple or all athlete pools.
  • determining an athleticism rating may include two general steps: 1) normalization of raw scores (e.g., test data) and 2) converting normalized scores to accumulated points. Normalization may be a prerequisite for comparing data from different tests. Step 1 ensures that subsequent comparisons are meaningful while step 2 determines the specific facets of the scoring system (e.g., is extreme performance rewarded progressively or are returns diminishing). Because the mapping developed in step 2 converts normalized test results to (fractional event) points in a standardized fashion, this scoring method can be applied universally to all tests, regardless of sport and/or measurement scale. Prudent choice of normalization and transformation functions provides a consistent rating to value performance according to predetermined properties.
  • results may be standardized on a common scale.
  • a common standardization is the z-score, which represents the (signed) number of standard deviations between the observation and the mean value.
  • when data are not normally distributed, however, z-scores are no longer appropriate, as they do not have a consistent interpretation for data from different distributions.
  • a more robust standardization is the percentile of the empirical cumulative distribution function (ECDF), u, which depends on both the raw measurement of interest, x, and the raw measurements of peers, y. In the underlying formula, II{A} denotes an indicator function equal to 1 if the event A occurs and 0 otherwise.
  • u is calculated easily by ordering and counting the combined data set consisting of all calibration data (y1, y2, . . . , yn) and the raw score to be standardized, x.
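
Because the formula itself is not reproduced in this excerpt, the following is a minimal sketch of one plausible reading of the ECDF percentile described above: order the combined data set and count the values at or below the raw score. The (n + 1) denominator is an assumption; the disclosure's exact formula may differ slightly.

```python
def ecdf_percentile(x: float, calibration: list[float]) -> float:
    """Percentile (between 0 and 1) of a raw score x within the combined data
    set consisting of the calibration data y_1..y_n and x itself.

    Assumed form: u = (# of combined values <= x) / (n + 1). The exact formula
    in the underlying disclosure may differ; in practice the floor/ceiling
    values discussed below keep scores away from the extremes."""
    combined = sorted(calibration + [x])              # order the combined data set
    count_le = sum(1 for v in combined if v <= x)     # count values at or below x
    return count_le / len(combined)

# Example: a 20.0 inch result ranked against a hypothetical calibration pool
pool = [14.0, 15.5, 16.0, 17.2, 18.1, 19.0, 20.3, 21.5, 22.0, 24.0]
print(round(ecdf_percentile(20.0, pool), 3))          # 0.636 (7 of 11 values <= 20.0)
```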
  • While the ECDFs calculated in step 1 provide a common scale by which to compare results from disparate tests, they may be inappropriate for scoring performance because they do not award points consistently with progressive rewards and percentile “anchors” (sanity checks). Therefore, it is necessary to transform (via a monotonic, 1-to-1 mapping) the computed percentiles into an appropriate point scale.
  • the transformation relies on two parameters, α and β, and produces scoring curves that are qualitatively similar to a two-parameter power law applied to raw scores.
  • the parameters α and β were chosen to satisfy approximately four rules governing the relationship between percentile of performance and points awarded.
  • the bin label corresponds to the lower bound, e.g., the bin labeled 90 contains measurements from the interval (90, 100).
  • once the norm data has been collected and sorted for a given test in the manner set forth above, its ECDF is scatter-plotted to reveal the Performance Curve. For example, non-standing vertical jump data observed in the field for 288 girls are shown as indicated in FIG. 3 . For those results not observed, e.g., 26.6 inches, that value's rank (99.37 percentile) is assigned by interpolation; the unobserved points requiring assigned ranks are shown as indicated in FIG. 3 .
  • a “ceiling” and a “floor” value are determined, which represent the boundaries of scoring for each test. Any test value at or above the ceiling earns the same number of event points. Likewise, any test value at or below the floor earns the same number of event points. These boundaries serve to keep the rating scale intact. The ceiling limits the chance that a single exceptional test result will skew an athlete's rating and thereby mask mediocre performance in other tests.
  • Each rank is transformed to fractional event points using a statistical function, as set forth above with respect to the inverse Weibull Transformation.
  • the scoring curve of event points is shown for girls' no-step vertical jump in FIG. 3 , as indicated therein, where the points are displayed as percentages, i.e., 0.50 points (awarded for a jump of 18.1 inches) are shown as fifty percent.
  • These fractional event points are also referred to as the w-score (“w” for Weibull).
  • the inverse Weibull Transformation can process non-normal (skewed) distributions of test data, as described above.
  • the transformation also allows for progressive scoring at the upper end of the performance range. Progressive scoring assigns points progressively (more generously) for test results that are more exceptional. Progressive scoring allows for accentuation of elite performance, thus making the rating more useful as a tool for talent identification.
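
The exact parameterization of the inverse Weibull transformation, and the four calibration rules for α and β, are not reproduced in this excerpt. As a hedged illustration only, the sketch below assumes the standard two-parameter Weibull quantile function; the α and β values are placeholders, not the calibrated parameters referred to above.

```python
import math

def w_score(u: float, alpha: float = 0.58, beta: float = 2.5) -> float:
    """Transform an ECDF percentile u (strictly between 0 and 1) into fractional
    event points. Assumed form: the two-parameter Weibull quantile function,
    w = alpha * (-ln(1 - u)) ** (1 / beta). Alpha and beta here are illustrative."""
    if not 0.0 < u < 1.0:
        raise ValueError("percentile must lie strictly between 0 and 1")
    return alpha * (-math.log(1.0 - u)) ** (1.0 / beta)

# The mapping is monotonic and progressive at the top end: each additional
# percentile near the ceiling earns disproportionately more points.
for u in (0.50, 0.90, 0.99, 0.999):
    print(f"{u:.3f} -> {w_score(u):.3f}")
```

With these placeholder parameters the curve happens to award roughly half a point at the median and just over one point near the 99th percentile, which is qualitatively consistent with the girls' no-step vertical jump example discussed above.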
  • FIG. 4 identifies a sample athlete, “Andrea White,” who jumped 26.5 inches during a no-step vertical jump. This value corresponds to a w-score of 1.078. The w-scores for all of her tests are found by referencing those tests' respective look-up tables. These w-scores are shown in FIG. 5 .
  • the “event scaling factor” is determined for each rating by the number of rated events and desired rating range. Ratings should generally fall within a range of 10 to 110. A boys' scaling factor is 25, for example, as the rating comprises four variables: Peak Power, Max Touch, Lane Agility, and three-quarter Court Sprint.
  • multiplied by the applicable event scaling factor, her w-score total would yield a rating of almost 130 (129.85).
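
A minimal sketch of the final aggregation described above: sum the fractional event points (w-scores) across the rated events and multiply by the event scaling factor. The w-scores below are hypothetical; the scaling factor of 25 is the four-event example given above.

```python
def athleticism_rating(w_scores: dict[str, float], scaling_factor: float) -> float:
    """Overall rating = (sum of fractional event points) * event scaling factor."""
    return sum(w_scores.values()) * scaling_factor

# Hypothetical w-scores for a four-event rating; a scaling factor of 25 targets
# a rating range of roughly 10 to 110 (4 events * ~1 point * 25 = ~100).
w_scores = {
    "peak power": 1.08,
    "max touch": 0.97,
    "lane agility": 1.10,
    "3/4 court sprint": 1.04,
}
print(round(athleticism_rating(w_scores, 25), 2))  # 104.75
```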
  • Assessing each of the various scores for each test provides the athlete with an overall athleticism rating, which may be used by the athlete in comparing their ability and/or performance to other athletes within their age group. Furthermore, the athlete may use such information to compare their skill set with those of other athletes in a particular sport (e.g., basketball, golf, etc.) to determine how their skill set compares with that of a professional athlete in the sport. While the above described tests and data may relate more to basketball or other similar activities, the same or similar algorithms, formulas, calculations and processes may be used to develop ratings for tests and exercises relating to other sports such as golf.
  • FIG. 7 illustrates a method 700 for generating an athleticism rating based on performance data collected in multiple athletic exercises or tests.
  • athletic performance data related to a particular sport are collected for a group of athletes.
  • the performance data may correspond to multiple different athletic tests or exercises. In some arrangements, those tests or exercises may be specific or unique to a particular sport or type of athletic activity/movement.
  • Athletic performance data might include, by way of example, and not limitation, a no-step vertical jump height, an approach jump reach height, a sprint time for a predetermined distance, a cycle time around a predetermined course, or the like.
  • Athletic performance data can be recorded for multiple athletes (e.g., a group of hundreds or thousands of athletes).
  • the collected athletic performance data may be normalized.
  • athletic performance test results (e.g., raw test results) for each athlete can be standardized in accordance with a common scale. Normalization enables a comparison of data corresponding with different athletic tests.
  • a normalized athletic performance datum is a percentile of the empirical cumulative distribution function (ECDF). Any method can be utilized to obtain normalized athletic performance data (i.e., athletic performance data that has been normalized).
  • the normalized athletic performance data is utilized to generate a set of ranks.
  • the set of ranks includes an assigned rank for each athletic performance test result included within a scoring table.
  • a scoring table (e.g., a lookup table) includes a set of athletic performance test results, or possibilities (e.g., potential test values or results) thereof.
  • Each athletic performance test result within a scoring table corresponds with an assigned rank and/or a fractional event point number.
  • the athletic performance data is sorted and a percentile of the empirical cumulative distribution function (ECDF) is calculated for each value. As such, the percentile of the empirical cumulative distribution function represents a rank for a specific athletic performance test result included in the scoring table.
  • each athletic performance test result is assigned a ranking number based on that test result's percentile among the normal distribution of test results.
  • the rank (e.g., percentile) may depend on the raw test measurements and may be a function of both the size of the data set and the component test values.
  • a scoring table might include observed athletic performance test results and unobserved athletic performance test results.
  • a rank that corresponds with an unobserved athletic performance test result can be assigned using interpolation of the observed athletic performance test data.
  • a fractional event point number is determined for each athletic performance test result.
  • a fractional event point number for a particular athletic performance test result is determined or calculated based on the corresponding assigned rank. That is, the set of assigned ranks, or percentiles, is transformed into an appropriate point scale.
  • a statistical function such as an inverse Weibull transformation, provides such a transformation.
  • a scoring table (e.g., a lookup table) includes a set of athletic performance test results, or possibilities thereof. Each athletic performance test result within a scoring table corresponds with an assigned rank and/or a fractional event point number.
  • a single scoring table that includes data associated with multiple tests and/or sports can be generated.
  • multiple scoring tables can be generated. For instance, a scoring table might be generated for each sport or for each athletic performance test.
  • One or more scoring tables, or a portion thereof (e.g., athletic test results, assigned ranks, fractional event point numbers, etc.) can be stored in a database.
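
Tying these pieces together, here is a minimal sketch of how a per-test scoring (lookup) table might be built and queried. The rank formula i/(n + 1), the Weibull-quantile point formula with placeholder α and β, and the linear interpolation for unobserved results are assumptions consistent with, but not copied from, the description above.

```python
import bisect
import math

def build_scoring_table(calibration: list[float], alpha: float = 0.58, beta: float = 2.5):
    """Build a per-test lookup table: each observed result maps to an ECDF-style
    percentile rank and to fractional event points (assumed Weibull-quantile form)."""
    results = sorted(calibration)
    n = len(results)
    ranks = [i / (n + 1) for i in range(1, n + 1)]
    points = [alpha * (-math.log(1.0 - u)) ** (1.0 / beta) for u in ranks]
    return results, ranks, points

def lookup_points(table, result: float) -> float:
    """Fractional event points for a (possibly unobserved) result: linear
    interpolation between observed results, clamped at the floor and ceiling."""
    results, _ranks, points = table
    if result <= results[0]:
        return points[0]                      # floor value
    if result >= results[-1]:
        return points[-1]                     # ceiling value
    i = bisect.bisect_left(results, result)   # index of the upper bracketing result
    r0, r1, p0, p1 = results[i - 1], results[i], points[i - 1], points[i]
    return p0 + (result - r0) / (r1 - r0) * (p1 - p0)

table = build_scoring_table([14.0, 15.5, 16.0, 17.2, 18.1, 19.0, 20.3, 21.5, 22.0, 24.0])
print(round(lookup_points(table, 18.6), 3))   # points for an unobserved result of 18.6
```

In a deployment like the one described, such tables could be generated per test (or per sport) and persisted to a database so that an athlete's raw result can be converted to fractional event points with a simple lookup.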
  • in step 725 , athletic performance data in association with a particular athlete is referenced (e.g., received, obtained, retrieved, identified, or the like). That is, athletic performance test results for a plurality of different athletic performance tests are referenced.
  • the set of athletic tests can be predefined in accordance with a particular sport or other physical activity.
  • An athletic performance test is designed to assess the athletic ability and/or performance of a given athlete and measures an athletic performance skill related to a particular sport or type of physical activity.
  • a fractional event point number that corresponds with each test result of the athlete is identified.
  • using a scoring table, a fractional event point number can be looked up, determined, calculated, or recognized based on the athletic performance test result for the athlete.
  • the best result from each test is translated into a fractional event point number by referencing the test result in the lookup table for each test.
  • a rank and/or a fractional event point number could be determined upon receiving an athlete's test results.
  • an algorithm can be performed in real time to calculate a fractional event point number for a specific athletic performance test result.
  • an athletic performance test result for a particular athlete can be compared to a distribution of test results of athletic data for athletes similar to the athlete, and a percentile ranking for the test result can be determined. Thereafter, the percentile ranking for the test result can be transformed to a fractional event point number.
  • in step 735 , the fractional event point number for each relevant test result for the athlete is combined or aggregated to arrive at a total point score. That is, the fractional event point numbers for the athlete's test results are summed to calculate the athlete's total point score.
  • the total point score is multiplied by an event scaling factor to produce an overall athleticism rating.
  • An event scaling factor can be determined using the number of rated events and/or desired rating range.
  • Athletic data associated with a particular athlete such as athletic test results, ranks, fractional event point numbers, total point values, overall athleticism rating, or the like, can be stored in a database.
  • While athleticism ratings may be developed for a variety of different athletic activities, sports and movements, athleticism ratings for each type of athletic activity, sport and/or movement may be based on different metrics, tests and athleticism exercises.
  • a computing system may request and/or receive input relating to endurance (e.g., using a par 5 step test), leg power (e.g., through a broad jump exercise and/or a countermovement lateral hop test), and rotational and throwing force (e.g., based on a wood-chop bounce exercise and/or a side-sling launch exercise).
  • the noted golf athleticism rating exercises or tests may more accurately measure an athlete's golf athleticism versus using other types of tests or exercises. In other arrangements, additional exercises or tests may be added to the battery or set of golf athleticism exercises as desired.
  • FIG. 8 illustrates a process 800 for performing an endurance-based par 5 step test.
  • the test is intended to simulate the amount of exercise or physical exertion involved in completing a par 5 hole on a golf course.
  • a computing system may instruct a user to perform a pulse find over a predefined amount of time. That is, the user may be asked, within a 30-second period, for example, to practice or attempt to find an athlete's pulse.
  • the athlete may be the user or may be another individual. This find period may also be used to allow an athlete's pulse to lower from peaks before taking a reading.
  • the computing system may subsequently instruct the user to perform a read of the subject athlete's pulse for a predefined pulse read time period in step 804 .
  • the predefined time period may be 30 seconds and a number of heart beats may be counted or detected over the 30-second period and multiplied by 2 to result in a beats per minute (bpm) value.
  • the counted value over the 30 second period may be used for athleticism rating calculation purposes.
  • Heart beats may be determined through manual user counting/determination or using electromechanical systems.
  • FIG. 9A illustrates an example process by which a pulse may be read.
  • an athlete 901 may measure the pulse of a subject athlete 903 by placing their fingers along a wrist area of the subject athlete 903 . The measuring athlete 901 may then count the number of beats over the predefined time period. As illustrated, athletes 901 and 903 may simultaneously or concurrently measure the pulse of the other athlete 903 and 901 , respectively.
  • an electromechanical heart rate monitor may be utilized as an alternative or in addition to other methods provided herein.
  • in step 806 , after determining the subject athlete's pulse (e.g., at the one-minute mark from the start of the exercise), the computing system may instruct the user to record the pulse data just determined and to begin practicing an athletic movement such as stepping.
  • the athletic movement may include walking up and down a set of steps.
  • Step 806 may last a total of 30 seconds (or other predefined amount) of exercise time.
  • Various time periods may be used and the time periods described herein with respect to each process step (e.g., read, find, rest, report, step, etc.) may be different or equal to the time periods for each of the other process steps.
  • the predefined amount of time corresponding to step 806 may correspond to twice the predefined period of step 804 .
  • the predefined time period of step 806 may be three times, four times, ten times, (or any multiple, whole or fractional) etc. of the predefined period of step 804 .
  • Reporting may include recording the data manually (e.g., on paper or other physical writing medium) or entry into the computing system or a combination thereof.
  • the step test may include the setting or generation of a periodic or aperiodic beat (e.g., audible or visual) with which the user is to follow with steps.
  • the computing system may generate and produce a periodic beat (audible, visual and/or haptic) having a frequency of 60 beats per minute.
  • a metronome may be used to set the beat (the computing system may also provide instructions to this effect).
  • Other mechanical, electromechanical and manual methods (e.g., manually timing and counting out the beats) may also be used to set the beat.
  • Other beat frequencies may also be used including 30 beats per minute, 25 beats per minute, 45 beats per minute, 120 beats per minute and the like.
  • the beat may be aperiodic.
  • the beat may include a first beat at time 0, followed by a second beat at time 1 second and a third beat at time 4 seconds and a fourth beat at time 10 seconds.
  • the computing system may further provide instructions on the type of movement expected at each beat or cue.
  • the computing system may instruct the athlete to step up with the left foot at the first beat, to step up with the right foot at the second beat, to step down with the left foot at the third beat and to step down with the right foot at the fourth beat and so on (repeating with same set of steps or with other sets of step movements).
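
A minimal sketch of a cue scheduler for the stepping phases: the 60 beats-per-minute cadence and the left/right step sequence come from the examples above, while the console print standing in for an audible, visual, or haptic cue is an assumption.

```python
import itertools
import time

STEP_SEQUENCE = ["step up (left)", "step up (right)", "step down (left)", "step down (right)"]

def run_stepping_phase(duration_s: float, beats_per_minute: float = 60.0) -> None:
    """Emit a cue on a periodic beat for the given duration, cycling through the
    prescribed step movements. print() stands in for the audible/visual/haptic cue."""
    interval = 60.0 / beats_per_minute
    start = time.monotonic()
    for movement in itertools.cycle(STEP_SEQUENCE):
        elapsed = time.monotonic() - start
        if elapsed >= duration_s:
            break
        print(f"{elapsed:5.1f}s  {movement}")
        time.sleep(interval)

# e.g., a 10-second practice burst at 60 bpm (the full first phase above runs 3 minutes)
run_stepping_phase(10)
```

An aperiodic schedule, as also contemplated above, could be handled the same way by iterating over an explicit list of (time, movement) pairs instead of a fixed interval.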
  • the par 5 step test may require performance with or on a physical structure having multiple elevations (e.g., a set of steps). In some instances, only two levels or elevations are necessary while in other examples, more than two levels or elevations may be required. Other arrangements of step movements may also be used.
  • the athlete may be asked to perform right up, left up, right up, left up, right down, left down, right up, left up, right up, left up and so on.
  • the instructions might also depend on the dominant foot of the athlete. For example, the athlete may be instructed to start with the dominant foot first followed by the non-dominant foot.
  • the computing system may subsequently instruct the athlete to perform actual steps (in contrast to practice steps) for a predefined amount of time in step 808 .
  • the predefined amount of step time may be 3 minutes.
  • the computing system may provide or instruct a user or other device such as a metronome to provide an audible, visual (e.g., on a display) or haptic cue (e.g., a beat).
  • the instructions may include an instruction for a user to activate a metronome or other beat generating device.
  • the computing system may further display or audibly convey the particular movement required at each beat.
  • FIGS. 9B and 9C illustrate example stepping movements that may be performed by athletes on a set of steps in conjunction with the exercise/test described in FIG. 8 .
  • a pulse find may include locating the subject's pulse, allowing the subject's pulse to lower from a peak, and/or otherwise preparing to determine the subject athlete's pulse.
  • the pulse find process may be performed over a predefined time period such as 30 seconds.
  • the computing system may subsequently instruct the user to read the subject athlete's pulse in step 812 over another predefined time period (e.g., 30 seconds, 1 minute, 45 seconds, 10 seconds, 2 minutes, 5 minutes, etc.).
  • read time periods and find time periods may be the same. In other arrangements, these time periods may be different. In yet other arrangements, each read time period and/or each find time period may be different from one or more of the other read time periods or find time periods, respectively.
  • This second stepping phase (step 814 ) may have a duration that is less than the first stepping phase (step 808 ).
  • the first stepping phase may have a duration of three minutes while the second stepping phase may have a duration of two minutes.
  • the relationship between the first stepping phase duration and the second stepping phase duration may be defined in a number of ways.
  • the first stepping phase duration may be defined as one minute more than the second stepping phase duration.
  • the first stepping phase duration may be defined as twice, three times, 10 times, etc. the second stepping phase duration.
  • the second stepping phase duration may be defined as a fraction of the first stepping phase duration (e.g., 1 ⁇ 2, 3 ⁇ 4, 2 ⁇ 3, 3/7, etc.).
  • the second stepping phase of step 814 may be followed by, similar to the first stepping phase, a find phase of 30-seconds (or another duration) at step 816 , and a read phase of 30-seconds (or another duration) at step 818 .
  • a computing system may instruct an athlete to perform a third stepping phase or round in step 820 for a predefined duration.
  • the duration of the third stepping phase may be related to the durations of the second stepping phase and the first stepping phase. For example, the duration of the third stepping phase may be 1 minute less than the duration of the second stepping phase (and two minutes less than the duration of the first stepping phase).
  • the duration of the third stepping phase may be a predefined percentage or fraction of the durations of the first and/or second stepping phases.
  • the movements required in the stepping phases may be the same (just for different durations) or may vary.
  • the athlete in the first stepping phase, the athlete may be instructed to perform right step up, left step up, right step up, left step up, right step down, etc. while in the second stepping phase, the athlete may be instructed to perform right step up, left step up, right step down, left step down.
  • the third stepping phase may further be different from the first and second stepping phases.
  • the third (and final in some examples) stepping phase 820 may be followed by a find phase of 30-seconds (or another duration) at step 816 , and a read phase of 30-seconds (or another duration) at step 818 .
  • the final report may be provided based on a user interface or electronic form generated by the computing system.
  • the electronic form may be specific to the par 5 test and request various information including the various pulse readings at the specified times.
  • the computing system may automatically take the pulse measurements during the read phases.
  • a user may perform the pulse reading process in a manual fashion (e.g., either by manually counting or using a device to measure the subject's pulse). By measuring the subject's pulse and the change in the subject's pulse after varying degrees (e.g., time or amount) of exercise, a computing system may determine the subject's endurance or ability to recover (e.g., based on the changes or differences in pulse readings at the specified times). This information may be relevant to how well an athlete would perform (e.g., endurance-wise) in golf games since golf games tend to last multiple hours and require a significant amount of walking.
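
The excerpt does not specify how the pulse readings are combined into an endurance measure, so the following is purely an illustrative sketch: it summarizes the change in pulse relative to the resting reading at each read point and the drop between successive readings. All names and readings are hypothetical.

```python
def pulse_summary(resting_bpm: float, read_bpm: list[float]) -> dict[str, list[float]]:
    """Illustrative only; the disclosure does not give this formula. Summarizes
    (1) the change in pulse relative to the resting reading at each specified
    read point and (2) the drop between successive readings, which a rating
    system might use as a recovery indicator."""
    change_from_rest = [bpm - resting_bpm for bpm in read_bpm]
    drop_between_reads = [read_bpm[i] - read_bpm[i + 1] for i in range(len(read_bpm) - 1)]
    return {"change_from_rest": change_from_rest, "drop_between_reads": drop_between_reads}

# Hypothetical readings taken after the three stepping phases of the par 5 step test
print(pulse_summary(72.0, [138.0, 126.0, 118.0]))
```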
  • the data recorded during the above par 5 step test may be requested and received by the computing system in conjunction with a corresponding instruction or may be requested and received at the end of the entire test. Alternatively, data may be collected at various intervals or points during the test (e.g., during rest periods and the like). Additionally or alternatively, any number of stepping rounds may be performed or required.
  • FIGS. 10A and 10B illustrate an example process for performing a broad jump test/exercise for evaluating golf athleticism.
  • the broad jump exercise may begin with an athlete 1001 being positioned with feet 1003 parallel and toes (or tip of the athlete's shoe) placed at a predefined point such as a baseline 1005 as illustrated in FIG. 10A .
  • the athlete 1001 may further be required to start in a crouched position in preparation for launching himself or herself as far as possible in a specified direction (e.g., direction A).
  • a sensor system (not shown) may be deployed along baseline 1005 to ensure that the athlete's feet are properly aligned.
  • if the athlete's feet are not properly aligned, the sensor may sound an alarm or provide another type of visual, audio or haptic alert/feedback cue. Additionally or alternatively, if the athlete's toes are not touching baseline 1005 , a test administrator and/or the athlete may be notified.
  • a computing system may further cue the athlete 1001 to begin the exercise.
  • the computing system may provide an audible, visual and/or haptic cue to begin a jump.
  • athlete 1001 may perform a broad jump.
  • FIG. 10B illustrates athlete 1001 mid-jump.
  • the athlete 1001 may be required to remain upright with feet stationary so as to permit accurate measurement.
  • the measurement may be taken from the baseline to the heel of athlete 1001 closest to the baseline.
  • the measurement may be taken through a manual process.
  • Alternatively, electronic sensors (e.g., RFID, weight sensors, etc.) may be used to take the measurement.
  • a system may determine a heel point of the athlete's back foot (e.g., based on knowing the athlete's shoe size). This measurement may then be used to determine a broad jump athleticism score as described herein.
  • An athlete's jump may be disqualified, not recorded or not counted for a variety of reasons. For example, if the athlete steps into the jump, the athlete's jump may be disqualified. In another example, the athlete's jump may be disqualified if the athlete's toes cross the baseline prior to the jump. In yet another example, disqualification may be based on the athlete taking a step, hopping or landing any body part other than his or her feet on the jumping surface within a 5-second period after landing and/or prior to confirmed measurement. In one or more arrangements, the athlete may be required to perform two qualified jumps. An average may then be taken as the final recorded jump value and score. In other examples, the athlete might only be allowed to use a single jump score. Accordingly, if the user's first or second jump is disqualified, the athlete might be required to base his or her athleticism score for the broad jump on the other jump.
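
A minimal sketch of scoring the broad jump trials under the two-qualified-jumps arrangement described above; the data structure, field names, and example distances are assumptions.

```python
from dataclasses import dataclass

@dataclass
class JumpTrial:
    distance_in: float          # baseline to the nearest heel, in inches
    disqualified: bool = False  # e.g., stepped into the jump or crossed the baseline

def broad_jump_result(trials: list[JumpTrial]) -> float | None:
    """Recorded value under the two-qualified-jumps arrangement: the average of
    the first two qualified distances. If only one jump qualifies, that single
    distance is used; if none qualify, no value is recorded."""
    qualified = [t.distance_in for t in trials if not t.disqualified]
    if not qualified:
        return None
    return sum(qualified[:2]) / min(len(qualified), 2)

print(broad_jump_result([JumpTrial(88.0), JumpTrial(91.5)]))                     # 89.75
print(broad_jump_result([JumpTrial(88.0, disqualified=True), JumpTrial(91.5)]))  # 91.5
```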
  • FIGS. 11A-11C illustrate an example countermovement lateral hop movement.
  • a countermovement lateral hop corresponds to an athlete performing a lateral hop from the dominant leg to the opposite leg, covering the greatest distance possible.
  • In FIG. 11A , for instance, a user positions himself or herself in an initial stance. The initial stance may involve the athlete placing the lateral outside edge of his or her dominant foot 1103 along a baseline 1105 while standing with his or her shoulder line perpendicular to baseline 1105 .
  • the athlete's foot and/or body position may be verified using various types of sensors such as laser sensors, weight sensors, and/or computing systems to perform visual analysis of the athlete's position and the like.
  • FIG. 11B illustrates an athlete initiating a hop.
  • the user may be encouraged or instructed to shift his or her weight to the dominant foot before launching into the hop.
  • the athlete may be permitted to move either foot prior to the jump, but may be required to touch the baseline 1105 with his or her dominant foot 1103 just prior to initiating the jump/hop.
  • Initiation of the hop may be defined by the lifting of the dominant foot while the non-dominant foot is in mid-air (i.e., not in contact with the test surface).
  • FIG. 11C illustrates an athlete's landing upon completing the countermovement lateral hop.
  • a measurement may be taken between the baseline 1105 and an inside lateral edge 1107 of the athlete's non-dominant foot (i.e., front/lead foot). Again, this measurement may be taken manually or through electronic means and systems.
  • a trial of the hop test may be disqualified under certain circumstances.
  • the test and measurement may be disqualified if the athlete does not begin in the prescribed initial stance as described herein.
  • the stance may be confirmed by another individual, by sensors, and/or based on visual image analysis performed by a computing system.
  • Disqualification may also result from the athlete failing to touch the baseline with the dominant/back foot immediately prior to initiating the hop/jump, the athlete's non-dominant/lead foot landing in an orientation that is not substantially parallel to the baseline, and/or failing to stabilize the landing leg/foot for a measurement time period (e.g., 5, 10, 15, 30, 60, 120 seconds, etc.).
  • the countermovement lateral hop test may require the athlete to perform two successful hops.
  • the average of the distance measurements may then be used to determine the athlete's final score or value (e.g., per the methods and algorithms discussed herein) for the countermovement lateral hop test.
  • any number of measurements may be required (e.g., 3, 5, 10, 12, etc.).
  • a further measure of golf athleticism may include a wood-chop bounce exercise/test.
  • a wood-chop bounce may include an athlete performing a cross-body rotational throw or slam. The throw or slam may be performed diagonally downwards (e.g., cross-body) with both hands on a thrown object (e.g., a ball).
  • FIGS. 12A-12C illustrate an example motion for performing the wood-chop bounce.
  • the athlete 1201 assumes an initial stance and position along a baseline 1203 .
  • the athlete may be required to place his or her dominant foot on the baseline 1203 with the width of his or her body perpendicular to the baseline 1203 .
  • ball 1205 may include sensors to detect whether the athlete is touching two points on the ball 1205 (e.g., representing that the user is using both hands to grasp the ball).
  • the athlete may next be instructed to draw the ball up and back (e.g., just above head level) and to slam or throw the ball cross-body downward and toward the ground as shown in FIG. 12B .
  • the goal of throwing the ball in this manner may be to achieve a maximum bounce and farthest second bounce or touch point.
  • the athlete may be instructed to throw the ball 1205 toward a designated point 1207 .
  • point 1207 may be five feet from the baseline (in a widthwise direction of the athlete when in the initial stance).
  • ball 1205 may be required to take the initial bounce within a testing field.
  • the testing field may be 10 feet wide (e.g., baseline 1203 may be 10 feet wide).
  • FIG. 12C illustrates ball 1205 and athlete 1201 after the ball 1205 has made an initial bounce (e.g., near or at the predefined point 1207 ).
  • when the ball 1205 touches down a second time, a marker may be placed or the position may be recorded.
  • an electronic detection system may use weight sensors or RFID tags to determine a second bounce landing point. RFID tags may be incorporated into the ball 1205 or other thrown object. The second bounce point might also be required to be within the testing field (e.g., within an area that is ten feet wide). A distance between the baseline 1203 and the second bounce landing point may be measured and recorded. The measured distance may then be used as a basis for determining a user's score or athleticism rating in the wood-chop bounce exercise or test.
  • the wood-chop bounce test includes various situations in which a test/measurement may be disqualified and the test results not recorded. For example, if the ball fails to land within the test field, the trial may be disqualified. In other examples, disqualification may result if the athlete does not begin the test in a golf stance with the dominant/back foot touching the baseline, the athlete does not throw the ball with two hands cross-body (e.g., the athlete might not be permitted to turn, open and square to the fairway when throwing), and/or the ball is bounced beyond the predefined first bounce point (e.g., striking the ground further than five feet or other predefined distance from the baseline). Various other disqualification rules may also be used in addition or as an alternative to any of the above noted disqualification criteria.
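
A minimal sketch of the geometric checks described for the wood-chop bounce, assuming a coordinate frame with the baseline along y = 0, x measured across the ten-foot-wide field, and all values in feet. Measuring the recorded distance perpendicular to the baseline is also an assumption.

```python
FIELD_WIDTH_FT = 10.0       # width of the testing field along the baseline
FIRST_BOUNCE_MAX_FT = 5.0   # first bounce must strike within 5 feet of the baseline

def score_wood_chop(first_bounce: tuple[float, float],
                    second_bounce: tuple[float, float]) -> float | None:
    """Each point is (x, y): x across the field width, y the perpendicular distance
    from the baseline, in feet. Returns the recorded distance (baseline to second
    bounce) or None if the trial is disqualified under the rules sketched above."""
    x1, y1 = first_bounce
    x2, y2 = second_bounce
    if not (0.0 <= x1 <= FIELD_WIDTH_FT and 0.0 <= x2 <= FIELD_WIDTH_FT):
        return None                    # ball left the testing field
    if y1 > FIRST_BOUNCE_MAX_FT:
        return None                    # first bounce beyond the predefined point
    return y2                          # perpendicular distance to the second bounce

print(score_wood_chop((4.8, 4.2), (6.1, 27.5)))   # 27.5
print(score_wood_chop((4.8, 6.5), (6.1, 27.5)))   # None: first bounce too far out
```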
  • An alternate or additional version of the wood-chop bounce may involve neutralizing hip movement so that an athlete does not rely on hip movement in performing the test (e.g., throwing the ball).
  • the athlete may be required to place and hold a ball or other object between his or her legs (e.g., above the knees). When throwing, the athlete is required to maintain his or her hold of the object between his or her legs. By having such a requirement, the test may minimize the contribution of hip movement during the throw.
  • the ball held between the athlete's legs may be different in size, shape, color, weight and the like from the ball thrown. In another example, the ball held between the athlete's legs may be the same in size, shape, color, weight and other attributes as the ball to be thrown.
  • FIGS. 13A-13D illustrate the movements associated with the side-sling object launch.
  • the athlete 1301 may be instructed to take an initial position/stance similar to a golf stance.
  • the user's dominant foot 1303 may be positioned on a baseline 1305 of a test field 1311 and the athlete's body may be positioned width-wise perpendicular to baseline 1305 .
  • the athlete 1301 may further be required to hold an object (such as a ball 1307 ) between his or her legs to reduce hip movement contributions to the athlete's side-sling.
  • the athlete 1301 may be required to hold the object between his or her legs (e.g., above the knees) through the entire exercise.
  • FIG. 13B illustrates athlete 1301 holding another object (e.g., ball 1309 ) that is to be thrown down a test field 1311 in direction C.
  • ball 1307 may be different in size, shape, color, weight and the like from ball 1309 .
  • ball 1307 may be identical to ball 1309 .
  • FIG. 13C the athlete 1301 is shown slinging ball 1309 down the test field 1311 .
  • the throw may be a single underhanded throw and may require the user to start by swinging his or her arm 1313 and the ball directly back and then slinging his or her arm 1313 forward, releasing the ball out into the test field 1311 .
  • the athlete 1301 may be required to follow through with the swing, ending upright as shown in FIG. 13D .
  • a distance may be measured between the baseline 1305 and the initial landing/contact point of the thrown object (e.g., ball 1309 ).
  • the distance may correspond to the perpendicular distance between the baseline 1305 and the ball 1309 .
  • This distance may then be used to determine the athlete's athleticism score or value for the side-sling object launch exercise. Multiple trials may be performed and an average may be taken in some arrangements.
  • Disqualifications may be levied if the athlete 1301 throws the ball 1309 and the ball 1309 does not land within the test field 1311 (width-wise), the athlete 1301 does not begin the test in the initial position (e.g., golf stance) with his or her back foot 1303 touching baseline 1305 , the athlete's backswing carries his or her arm 1313 outside of his or her width-wise body line (e.g., the arm 1313 crosses the plane created by the athlete's shoulders and back in the initial stance, causing the ball 1309 to deviate from the test field 1311 ), the athlete 1301 takes any kind of step with his or her front foot and/or if ball 1307 held between the athlete's legs falls out during the throw.
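  • By way of illustration only, the following Python sketch shows one way the side-sling distance measurement and the disqualification criteria described above might be reduced to a recorded test value; the function names, rule flags and example distances are assumptions of this sketch rather than features of the disclosed system.
```python
from statistics import mean

def sling_distance(landing_x_ft):
    # Recorded value: the perpendicular distance from the baseline 1305 to the
    # initial landing/contact point of the thrown ball 1309.
    return max(0.0, landing_x_ft)

def is_disqualified(landed_in_field, started_on_baseline,
                    arm_stayed_inside_body_line, front_foot_stepped,
                    held_ball_dropped):
    # Each flag mirrors one of the disqualification criteria above; how the
    # flags are detected (judges, RFID tags, weight sensors) is left open.
    return (not landed_in_field or not started_on_baseline
            or not arm_stayed_inside_body_line
            or front_foot_stepped or held_ball_dropped)

# Example: three valid (non-disqualified) trials averaged into one test value.
trials_ft = [21.5, 23.0, 22.25]
test_value = mean(sling_distance(d) for d in trials_ft)
print(round(test_value, 2))  # 22.25
```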
  • RFID tags or other sensors may also be worn by the user to detect various movements and positions.
  • an RFID tag may be placed in a heel portion of a user's shoe to detect a landing position in the broad jump test.
  • RFID tags may be incorporated into a lateral edge (inner and/or exterior) of a user's shoe to detect a landing point in the countermovement lateral hop test. The RFID tag in the lateral edge or in other portions of the shoe may also be used to detect whether the user contacted or is contacting a baseline.
  • a computing system such as processing device 106 or computer system 108 may provide automated instructions to the athletes for any of the above noted exercises and movements, positions, measurements and data submissions described herein.
  • the computing system may provide audio, visual and/or haptic cues.
  • sensors may be used to determine if the athlete is in a correct stance, determine one or more landing positions of a thrown object or the athlete, positions of an athlete's body parts and the like.
  • the computing system may further generate a data input form requesting recordation of trial data for each of the tests described herein.
  • the computing system may generate a single form or a sequence of forms having fields for the types of data to be recorded for each of the various golf athleticism tests/exercises.
  • the computing system may determine the rating or score for each exercise in addition to an overall golf athleticism rating that takes into account all of the scores and results from all of the tests/exercises performed. For example, as described herein, an athlete's fractional event points (or other scoring value) for each test/exercise may be summed to result in an overall rating.
  • the overall rating may also be subject to scaling factors (e.g., multiplying by a certain factor) to derive the scaled athleticism rating.
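  • As a minimal sketch of the aggregation just described (summing per-test fractional event points and applying a scaling factor, with per-test floors and ceilings as discussed later in this description), the following Python fragment uses placeholder w-scores and a placeholder scaling factor.
```python
def overall_rating(w_scores, scaling_factor, floor_w=0.0, ceiling_w=1.25):
    # Clamp each test's fractional event points to the floor/ceiling for that
    # test, sum across tests, then apply the event scaling factor.
    clamped = [min(max(w, floor_w), ceiling_w) for w in w_scores]
    return scaling_factor * sum(clamped)

# Hypothetical five-test golf battery with an assumed scaling factor of 18.
print(round(overall_rating([0.71, 0.58, 0.93, 0.41, 0.66], 18), 2))  # 59.22
```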
  • FIGS. 14A and 14B illustrate an example test field configuration that may be used for one or more of the golf athleticism rating tests described herein.
  • FIG. 14A illustrates a front perspective view of test field 1400 .
  • Test field 1400 may include multiple markers including 5-foot markers 1403 and submarkers 1405 .
  • a baseline 1407 may be defined at the 0-foot marker 1403 a with a staging area 1409 defined therebehind.
  • the test field 1400 may be divided width-wise into two sections 1411 a and 1411 b by a demarcation or dividing line 1413 .
  • Section 1411 a may be configured for throwing-oriented exercises while section 1411 b may be configured for jumping/hopping type tests/exercises.
  • section 1411 b might only extend length-wise for a distance smaller than the length of section 1411 a .
  • section 1411 b might only have distance markers up to a first distance while section 1411 a may have distance markers up to a second distance greater than the first distance.
  • section 1411 a may be 50 feet long while section 1411 b may be 10 feet long.
  • Other distances may be used to define each of sections 1411 a and 1411 b .
  • the widths of sections 1411 a and 1411 b may be equal or one may be greater than the other. In one particular example, the width of section 1411 a may be 10 feet.
  • FIG. 14B illustrates test field 1400 from a top down view.
  • Test field 1400 includes sections 1411 a and 1411 b that are substantially rectangular.
  • a bounce line for various tests such as the wood-chop bounce may be predefined at the 5 foot marker 1403 b .
  • the remaining portion of section 1411 a may be labeled with a “FAIRWAY” mark to indicate the throwing region.
  • Section 1411 b may be labeled with a “JUMP ZONE” label to identify section 1411 b as a jumping or hopping test area.
  • athletes may perform different tests at the same time.
  • test field 1400 may be provided as a mat that is portable.
  • the portable mat may allow test administrators and test takers to administer or take the tests in a variety of locations (i.e., the tests would not be restricted to one particular physical location). Additionally, using a portable mat may provide consistency in the results that are produced.
  • the mat may include one or more electronic devices including sensors, LED displays, other visual displays, haptic feedback devices and the like. For example, LED numbering may be used instead of drawn numbering on the mat.
  • weight sensors may be used to detect the position and weight distribution of athletes and test objects (e.g., balls).
  • a sensor may be placed at the borders of the test sections 1411 a and 1411 b including the baseline 1407 , end lines 1405 a and 1405 b and dividing line 1413 .
  • infrared or laser sensors may be used to detect whether an athlete or an object is touching (or not touching) or touches a particular point on the mat such as the bounce line, the baseline and the like.
  • a display device may display instructions for performing a test.
  • the mat may thus further include data transmission devices (e.g., wireless adapters, USB ports, serial ports and the like) to transmit detected information to one or more computing systems such as processing device 106 and/or computer system 108 .
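  • A minimal sketch, assuming a simple JSON event format, of how a detection from one of the mat's sensors might be packaged for transmission to processing device 106 or computer system 108; the field names and the use of JSON are illustrative assumptions, not part of the disclosure.
```python
import json
import time

def mat_event(sensor_id, event_type, x_ft, y_ft):
    # A detected contact (e.g., a ball bounce or a foot touching the baseline)
    # expressed as a position on the test field plus a timestamp.
    return {
        "sensor_id": sensor_id,
        "event": event_type,   # e.g., "bounce" or "baseline_contact"
        "x_ft": x_ft,          # distance down the field from baseline 1407
        "y_ft": y_ft,          # lateral position across the field width
        "timestamp": time.time(),
    }

# The serialized event could be sent over USB, a serial port or a wireless adapter.
print(json.dumps(mat_event("weight-03", "bounce", 4.8, 1.2)))
```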
  • the baseline, bounce line and other demarcations on the mat may be provided in different colors or patterns for visual differentiation. This may help the athlete identify targets, position requirements and the like.
  • the mat or portable test field may be composed of various materials and/or combinations of materials including plastics (such as artificial turf), foams, metals and the like.
  • the mat may, in one particular example, be roughly 15 feet wide and 60 feet long.
  • in such an arrangement, section 1411 a ( FIGS. 14A and 14B ) may be 10 feet wide while section 1411 b may be 5 feet wide or smaller.
  • Other lengths and widths may be used depending on the test criteria and requirements.

Abstract

Aspects of this disclosure relate to systems and methods for rating the performance of an athlete, particularly a golf athlete. The systems and methods may include instructing the user to perform multiple golf-relevant athleticism drills or tests such as a stepping exercise, a broad jump, a wood-chop bounce, a countermovement lateral hop and/or a side-sling object launch. One or more drills or tests may be performed in a hip-neutral manner so as to simulate a golfer's stance. The performance data collected from each of these tests may be input into a processing system to generate an athleticism rank and score for each test as well as an overall, multi-factorial rating of golf athleticism.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application, having attorney docket number NIKE.172005, is a Continuation-in-Part of copending U.S. Nonprovisional application having Ser. No. 12/559,082, attorney docket number NIKE.170315, filed on Sep. 14, 2009, and entitled “Athletic Performance Rating System,” which claims the benefit of U.S. Provisional Patent Application No. 61/096,603, filed on Sep. 12, 2008, entitled “Athletic Performance Rating System.” The entireties of the aforementioned applications are incorporated by reference herein.
  • TECHNICAL FIELD
  • Aspects described herein relate to athleticism ratings and related performance measuring systems, methods, apparatuses and the like. In particular, aspects are directed to athleticism ratings and performance measuring systems for determining golf athleticism for an individual or group.
  • BACKGROUND
  • Athletics contribute to the promotion of physical activity and a healthy sense of competition. Commercially, athletics play a significant role in providing entertainment to fans and generating revenue for the various leagues and players. At any level of athletics, teams, sponsors, coaches and the like seek out the best athletes. However, evaluating an athlete's level of skill or potential is often very subjective. For example, in some instances, scouts or other evaluation personnel rely upon subjective and individual-specific opinions and experiences regarding performance and the relative importance of various attributes of performance.
  • Some current systems attempt to use objective standards to evaluate athletic potential. Oftentimes, the systems use the same data and same athletic exercises or tests regardless of the type of athletic activity for which the athlete is being evaluated.
  • The present invention seeks to overcome certain limitations and other drawbacks, and to provide new features not heretofore available.
  • BRIEF SUMMARY
  • The following presents a general summary of aspects of the invention in order to provide a basic understanding of the invention and various features of it. This summary is not intended to limit the scope of the invention in any way, but it simply provides a general overview and context for the more detailed description that follows.
  • The present invention generally relates to systems and methods for rating the performance of an athlete, particularly a golf athlete, using various types of exercises and tests. In some arrangements, the tests or exercises may be specific to evaluating golf athleticism. Additionally or alternatively, aspects described herein may include a portable test field or mat that may be used to perform the various exercises or tests for determining athleticism of an individual.
  • These and other features of the invention will become apparent from the following detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete understanding of various aspects disclosed herein and certain advantages thereof may be acquired by referring to the following detailed description in consideration with the accompanying drawings, in which:
  • FIG. 1 illustrates a general operating environment in which one or more aspects described herein may be included.
  • FIG. 2 is a flowchart illustrating an exemplary method for collecting data for generating an athleticism rating of an athlete based on a set of athletic exercises or tests according to one or more aspects described herein.
  • FIG. 3 is an exemplary graph illustrating rankings of performances in a particular type of exercise among an athlete pool according to one or more aspects described herein.
  • FIG. 4 is an exemplary table illustrating scoring values for an athlete along with possible scoring values according to one or more aspects described herein.
  • FIG. 5 is a table illustrating an example set of athleticism scores for various athleticism tests performed by an athlete according to one or more aspects described herein.
  • FIG. 6 is a table illustrating an example set of athleticism scores corresponding to ceiling values for various athleticism tests according to one or more aspects described herein.
  • FIG. 7 is a flowchart illustrating an exemplary method for generating an athleticism rating according to one or more aspects described herein.
  • FIG. 8 is a timeline and process flow illustrating a method for performing a step test according to one or more aspects described herein.
  • FIGS. 9A-9C illustrate an example athletic activity exercise that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 10A and 10B illustrate another example athletic activity exercise that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 11A-11C illustrate another example athletic activity exercise that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 12A-12C illustrate another example athletic activity exercise, a wood-chop bounce, that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 13A-13D illustrate another example athletic activity exercise, a side-sling object launch, that may be used to determine a golf athleticism rating according to one or more aspects described herein.
  • FIGS. 14A and 14B illustrate an example exercise test field that may be used to perform one or more athleticism exercises/tests according to one or more aspects described herein.
  • The reader is advised that the attached drawings are not necessarily drawn to scale.
  • DETAILED DESCRIPTION
  • In the following description of various example structures and methods in accordance with the invention, reference is made to the accompanying drawings, which form a part hereof, and in which are shown by way of illustration various performance rating devices and systems using performance ratings or measuring devices in accordance with various embodiments of the invention. Additionally, it is to be understood that other specific arrangements of parts and structures may be utilized and structural and functional modifications may be made without departing from the scope of the invention.
  • A first aspect of the present invention is directed to a system for performing a method, the method including receiving athletic performance results from multiple types of performance tests. In an exemplary aspect, the athletic performance results include at least two selected from the following: 1) a change in pulse of an athlete measured during a stepping exercise, 2) a broad jump distance of an athlete, 3) a lateral hop distance of an athlete, 4) a bounce distance of a ball when thrown by the athlete in a downward direction toward a target, and 5) a sling distance of a ball when thrown by the athlete using an underhanded side sling. It is understood that when it is stated herein that at least two are selected from the following, it is intended that at least two different performance tests are selected from the listing of possible performance tests. The method also comprises generating, by the computing system, a golf athleticism rating based on the at least two athletic performance results.
  • A second aspect of the present invention is directed to non-transitory computer readable media having computer-executable instructions embodied thereon that, when executed by a processor, perform a method for evaluating the athleticism of an athlete in golf. The method comprises receiving at least two results for the athlete's performance in at least two different athletic performance tests related to golf. The method further comprises comparing each of the at least two results to a corresponding distribution of test results of athletic data for athletes similar to the athlete and determining a percentile ranking for each of the at least two results. The method further comprises transforming the percentile ranking for each of the at least two results to a fractional event point number for each result. The percentile rankings for each of the at least two results are progressive. The method further comprises determining an athleticism rating score for the athlete in golf based on the fractional event point numbers.
  • FIG. 1 is a block diagram of an athleticism measurement system 100 that includes a sensor 102 (e.g., an accelerometer, compression sensor, inertial measurement system, etc.) that is borne by an athlete during different athletic drills or tests to generate data that are used to generate an athleticism rating, such as a rating described in International patent application no. PCT/US2005/040493 for Athleticism Rating and Performance Measuring Systems and incorporated herein by reference. Sensor 102 may be any type of sensor configured to detect a stimulus and provide a resulting signal. In one embodiment, sensor 102 may be configured to detect a force, such as an impact force from a person or object striking another person or object. In certain embodiments, sensor 102 may be utilized to measure one or more parameters, such as, for example, velocity, acceleration, pressure, location, energy transfer, temperature, orientation, light, sound, magnetism, or a particular motion along two or more axes. In one embodiment, sensor 102 may comprise an accelerometer module. In a particular example, the accelerometer module may be implemented with a two-axis accelerometer for measuring acceleration along two orthogonal axes. In another embodiment, the accelerometer module may be implemented with a three-axis accelerometer for measuring acceleration along three orthogonal axes.
  • Further exemplary sensors include strain gauges, conductive ink, piezo-electric devices and/or pressure transducers. In certain embodiments, relative pressure applied to sensor 102 (e.g., versus pressure detected by another sensor or by different components of sensor 102) can be used to indicate weight distribution. In certain embodiments, sensor 102 may comprise a camera. A camera may detect or measure one or more properties of an athlete or other user, before, during or after, any processes or routines disclosed herein. Additionally, multiple sensors may be used in measuring an athlete's performance during one or more athleticism exercises or tests. The multiple sensors may be of the same type or may include different types. In one example, the multiple sensors may correspond to accelerometers placed at different locations on a test field or on an athlete's body. In another example, a first sensor may comprise an accelerometer while another sensor may comprise a pulse measurement sensor. Multiple sensors may be incorporated into the same physical device or may each be physically separate from the others.
  • Acceleration sensor 102 may be positioned in a shoe, on top of a shoe, fastened around the ankle or wrist, attached to waist belts or incorporated into apparel on the body of the athlete, or otherwise borne by the athlete. For example, sensor 102 may be worn or attached to any other portion of an athlete's body and/or incorporated into clothing as necessary or desired. For example, an athlete may wear a sensor around his or her head to measure head movement. In another example, an athlete may wear a shirt having a heart rate monitor included therein.
  • In an embodiment, sensor 102 communicates over a link 104 with an athleticism rating processing device 106. In one implementation, link 104 is a wireless, digital, low-power RF link with 1-way or 2-way transmission. A wired link could alternatively be employed in some applications. Athleticism rating processing device 106 may include one or more of an athleticism timing system, such as an electronic timing system, or a stopwatch, sport watch, digital music player, mobile phone, wireless athleticism measurement kiosk, etc. configured to communicate over link 104 with sensor 102.
  • According to one or more arrangements, athleticism rating processing device 106 allows a user (e.g., an athlete, coach, etc.) to select an athleticism measurement mode from among multiple selectable athleticism measurement modes. During a measurement mode, athleticism rating processing device 106 obtains and stores acceleration data from sensor 102 and selected timing data. In addition, athleticism rating processing device 106 may cue the athlete to perform certain actions during an athleticism measurement or may provide feedback during or after the measurement.
  • In one implementation, athleticism rating processing device 106 delivers the acceleration data and the timing data by wired or wireless communication to an athleticism rating computer system 108 that calculates an athleticism rating based in part on the sensed data and/or timing data. Athleticism rating computer system 108 may be disposed at the location where the athlete performs the athletic drills or tests or may be located remotely and accessed over a computer network (e.g., the Internet). In an alternative implementation, athleticism rating processing device 106 may calculate an athleticism rating directly. In some arrangements, athleticism rating processing device 106 and/or athleticism rating computer system 108 may be included as part of the same physical device as sensor 102. For example, sensor 102 may include a processor and memory storing instructions for processing the athleticism data and subsequently calculating an athleticism rating. In other arrangements, athleticism rating processing device 106, athleticism rating computer system 108 and sensor 102 may all correspond to physically separate devices. In yet other arrangements, processing device 106 and rating computer system 108 may correspond to a single physical computing device or system. Any of the sensor 102, processing device 106 and computer system 108 may also be configured to operate in multiple modes, each mode corresponding to a different sport, type of athletic activity, type of athleticism rating being determined and the like. An example multi-mode athleticism movement measurement system is described in further detail in U.S. Application Pub. No. 2008/0249736 A1, entitled “MULTI-MODE ACCELERATION-BASED ATHLETICISM MEASUREMENT SYSTEM,” and filed on Sep. 28, 2007, which is hereby incorporated by reference in its entirety.
  • Processing device 106 (and/or computer system 108 and sensor 102) may include one or more computing devices and components. For example, processing device 106 may include a computing unit 113. The computing unit 113 includes a processing unit 115 and a system memory 117. The processing unit 115 may be any type of processing device for executing software instructions, but will conventionally be a microprocessor device. In some arrangements, processing unit 115 may be single-core or multi-core. The system memory 117 may include both a read-only memory (ROM) 119 and a random access memory (RAM) 121. As will be appreciated by those of ordinary skill in the art, both the read-only memory (ROM) 119 and the random access memory (RAM) 121 may store software instructions for execution by the processing unit 115. Further, it is contemplated that one or more forms of memory may be non-transitory computer readable media.
  • The processing unit 115 and the system memory 117 are connected, either directly or indirectly, through a bus 123 or alternate communication structure to one or more peripheral devices. For example, the processing unit 115 or the system memory 117 may be directly or indirectly connected to additional memory storage, such as the hard disk drive 127 and the optical disk drive 129. Other types of memory may also be used, including flash memory and removable magnetic drives. The processing unit 115 and the system memory 117 also may be directly or indirectly connected to one or more input devices 131 and one or more output devices 133. The input devices 131 may include, for example, a keyboard, touch screen, a remote control pad, a pointing device (such as a mouse, touchpad, stylus, trackball, or joystick), a scanner, a camera or a microphone. The output devices 133 may include, for example, a monitor display, television, printer, stereo, or speakers.
  • Still further, the computing unit 113 will be directly or indirectly connected to one or more network interfaces 125 for communicating with a network. This type of network interface 125, also sometimes referred to as a network adapter or network interface card (NIC), translates data and control signals from the computing unit 113 into network messages according to one or more communication protocols, such as the Transmission Control Protocol (TCP), the Internet Protocol (IP), and the User Datagram Protocol (UDP). Network adapters may be wireless or wired or combinations thereof. These protocols are well known in the art, and thus will not be discussed here in more detail. A network interface 125 may employ any suitable connection agent for connecting to a network, including, for example, a wireless transceiver, a power line adapter, a modem, or an Ethernet connection. Connection agents may similarly be wireless or wired or a combination thereof.
  • Using the system 100, a golf athleticism rating may be determined based on a battery of tests and exercises that may, in one or more arrangements, be specific to potential in the golfing arena. For example, a user's performance during the tests and exercises may be measured using sensors such as sensor 102, collected by a processing device such as processing device 106 and used to generate an athleticism rating by a computing device such as computer system 108. In embodiments, one or more of the performance results may be measured manually and entered, for instance, into input device 131 of processing device 106. Additionally, it is contemplated that one or more performance results may be measured automatically and provided to the processing device 106, in an exemplary aspect.
  • FIG. 2 illustrates an exemplary method 200 whereby a computing device may detect and/or receive data corresponding to an athlete's performance for one or more golf exercises and tests and generate an athleticism rating based thereon. In step 202, for example, athlete information may be received by the computing system. The athlete information may include name, gender, age, height, weight, left/right-handedness, shoe size, wingspan, and the like. This information may be used in calculating performance in a particular test or exercise. For example, if a computing system seeks to determine an athlete's leg strength, the computing system may use the athlete's weight in combination with the athlete's vertical jump height to determine leg strength or power. In another example, shoe size may be used to ensure proper foot placement for a particular test or exercise. In a particular example, if a test requires that one foot is placed behind the other without overlap, a system may determine if this condition has been satisfied by detecting a heel edge of a user's footwear and determining, using the athlete's shoe size, whether the toe point of the foot overlaps the heel edge of the user's other foot. An athlete's information may also be used to categorize the data. For example, the athlete's gender may be used to determine a pool of data in which to store the athlete's performance data. The various pools of data may be used as the basis for generating athleticism ratings. Athleticism ratings for male and female athletes may thus correspond to different scales and athlete pools and might not be directly comparable.
  • In step 205, the computing system may determine a series of one or more tests or exercises for the athlete. The series of tests or exercises may be selected based on the athlete information. For example, some exercises or tests might not be age-appropriate for younger athletes. In a particular example, a dunking test for basketball might not be appropriate or valuable for athletes under the age of 14 due to height, muscular development and other issues. Tests and exercises might also be sport-specific or athletic activity-specific. In a present embodiment, for instance, tests and exercises might be specific to golf. Example tests and exercises for evaluating golf athleticism are described in further detail below. In other arrangements, some exercises may be generic to multiple sports, but a set of exercises or tests are specific or unique to a given sport or athletic activity. Moreover, tests and exercises or sets of tests or exercises may be gender specific.
  • In step 210, the computing system may provide instructions on when and how to perform each exercise. In some examples, the computing system may provide audible, visual or haptic cues to indicate times at which a particular movement or action is expected. In a particular example, an audible and/or visible cue may be provided when a user is expected to jump or throw an object. Instructions may further include an animation illustrating the type of movement expected for the test. The computing system may progress through the set of exercises and tests, instructing the athlete through each one. In step 215, the computing system may concurrently and/or subsequently request and receive performance data. For example, the computing system may, based on the determined exercise or set of exercises, generate requests for particular types of data such as a number of steps taken, a top speed, a distance an object was thrown, a distance the athlete moved, or an impact force (e.g., force of the athlete hitting the ground or another object, of an object being thrown or otherwise propelled by the athlete and the like). In one particular example, the computing system may generate an interface including a data entry form. The data entry form may include fields for particular types of performance data based on the types of exercises and tests performed. Thus, if an exercise or a set of exercises is unique to a type of sport or athletic activity, the data requested by the computer system (and, in some examples, the data entry form) may also be unique to the sport or athletic activity. Various types of data are described in additional detail below.
  • Upon receiving the athletic performance data, the computer system may, in step 220, generate an athleticism score or value for each test/exercise by comparing the athlete's performance data with data from a pool of other athletes for the particular test/exercise. Based on the athlete's athleticism score for each test/exercise, an overall athleticism rating for an athletic activity or sport such as golf may be determined in step 225. In one example, a number of points may be determined for each exercise or test and the points later combined. Either the overall athleticism rating or the exercise-specific scores or both may also be scaled. The athleticism rating may thus represent a level of potential or skill in a particular athletic activity (e.g., sport-specific athleticism rating) relative to an athlete pool. Accordingly, athleticism ratings may be comparable within the athlete pool, but might not be comparable outside of the pool. In some arrangements, the athleticism ratings may be comparable between multiple or all athlete pools.
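  • The following Python sketch illustrates, under assumed data structures, how athlete information might drive selection of a golf test battery and generation of a data entry form along the lines of steps 205-215; the test names, form fields and age rule shown here are examples only and are not taken from the disclosure.
```python
GOLF_TESTS = {
    "par_5_step_test": ["pre_step_pulse_bpm", "post_step_pulse_bpm"],
    "broad_jump": ["best_distance_in"],
    "countermovement_lateral_hop": ["average_distance_in"],
    "wood_chop_bounce": ["second_bounce_distance_ft"],
    "side_sling_object_launch": ["sling_distance_ft"],
}

def select_tests(athlete):
    # Illustrative age-appropriateness rule only: drop one test for athletes
    # under 14 (the description gives a basketball dunking test as its example).
    tests = list(GOLF_TESTS)
    if athlete["age"] < 14:
        tests.remove("wood_chop_bounce")
    return tests

def data_entry_form(tests):
    # One empty field per type of performance data requested for each test.
    return {test: {field: None for field in GOLF_TESTS[test]} for test in tests}

athlete = {"name": "Sample Athlete", "age": 16, "gender": "F", "handedness": "R"}
form = data_entry_form(select_tests(athlete))
print(sorted(form))
```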
  • According to one or more configurations, determining an athleticism rating may include two general steps: 1) normalization of raw scores (e.g., test data) and 2) converting normalized scores to accumulated points. Normalization may be a prerequisite for comparing data from different tests. Step 1 ensures that subsequent comparisons are meaningful while step 2 determines the specific facets of the scoring system (e.g., is extreme performance rewarded progressively or are returns diminishing). Because the mapping developed in step 2 converts normalized test results to (fractional event) points in a standardized fashion, this scoring method can be applied universally to all tests, regardless of sport and/or measurement scale. Prudent choice of normalization and transformation functions provides a consistent rating to value performance according to predetermined properties.
  • In order to compare results of different tests comprising the set or battery of tests for a particular sport such as golf, the results may be standardized on a common scale. If data is normal, a common standardization is the z-score, which represents the (signed) number of standard deviations between the observation and the mean value. However, when data are non-normal, z-scores are no longer appropriate as they do not have consistent interpretation for data from different distributions. A more robust standardization is the percentile of the empirical cumulative distribution function (ECDF), u, defined as follows:
  • $u = \frac{1}{n+1}\left[\sum_j\left(\mathbb{I}\{y_j < x\} + \tfrac{1}{2}\,\mathbb{I}\{y_j = x\}\right) + \tfrac{1}{2}\right],$
  • In the above equation, x is the raw measurement to be standardized; y1, y2, . . . , yn are the data used to calibrate the event; and $\mathbb{I}\{A\}$ is an indicator function equal to 1 if the event A occurs and 0 otherwise. Note that u depends on both the raw measurement of interest, x, and the raw measurements of peers, y.
  • The addition of ½ to the summation in square brackets and the use of (n+1) in the denominator ensure that u ∈ (0, 1) with strict inequality. Although the definition is cumbersome, u is calculated easily by ordering and counting the combined data set consisting of all calibration data (y1, y2, . . . , yn) and the raw score to be standardized, x.
  • $u = \dfrac{[\#\text{ of } y\text{'s less than } x] + 0.5\left[(\#\text{ of } y\text{'s equal to } x) + 1\right]}{\#\text{ of } y\text{'s} + 1} = \dfrac{[\#\text{ of } (y\text{'s and } x)\text{ less than } x] + 0.5\left[\#\text{ of } (y\text{'s and } x)\text{ equal to } x\right]}{\#\text{ of } (y\text{'s and } x)}$
  • Note that this definition still applies to binned data (though raw data may be used whenever possible).
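  • A short Python sketch of the counting form of u given above (the calibration values used here are the nine observations from the raw-data illustration later in this description); this is a convenience for the reader and not part of the claimed system.
```python
def ecdf_percentile(x, calibration):
    # u = ([# of y's less than x] + 0.5 * ((# of y's equal to x) + 1)) / (n + 1)
    n = len(calibration)
    less = sum(1 for y in calibration if y < x)
    equal = sum(1 for y in calibration if y == x)
    return (less + 0.5 * (equal + 1)) / (n + 1)

calibration = [16, 20, 25, 27, 19, 18, 26, 27, 15]
print(f"{ecdf_percentile(16, calibration):.2f}")  # 0.20
```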
  • Although the ECDFs calculated in step 1 provide a common scale by which to compare results from disparate tests, the ECDFs may be inappropriate for scoring performance because they do not award points consistently with progressive rewards and percentile “anchors” (sanity checks). Therefore, it is necessary to transform (via a monotonic, 1-to-1 mapping) the computed percentiles into an appropriate point scale.
  • An inverse Weibull transformation provides such a transformation and is given by
  • $w = \frac{1}{\lambda}\left[-\ln(1-u)\right]^{1/\alpha}$, where $\alpha = 1.610$ and $\lambda = 2.512$.
  • The above function relies on two parameters α and λ and produces scoring curves that are qualitatively similar to the two-parameter power-law applied to raw scores. The parameters α and λ were chosen to satisfy approximately the following four rules governing the relationship between percentile of performance and points awarded:
      • 1. The 10th percentile should achieve roughly ten percent of the nominal maximum.
      • 2. The 50th percentile should achieve roughly thirty percent of the nominal maximum.
      • 3. The 97.7th percentile should achieve roughly one hundred percent of the nominal maximum.
      • 4. The 99.9th percentile should achieve roughly one hundred twenty-five percent of the nominal maximum.
  • Because, in general, four constraints cannot be satisfied simultaneously by a two-parameter model, parameters were chosen to minimize some measure of discrepancy (in this case the sum of squared log-errors). However, estimation was relatively insensitive to the specific choice of discrepancy metric.
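  • The transformation and its anchor points can be checked numerically, as in the Python sketch below; because the two-parameter fit is approximate, the printed values land near, but not exactly at, the 10%, 30%, 100% and 125% targets.
```python
import math

ALPHA = 1.610
LAMBDA = 2.512

def inverse_weibull(u):
    # w = (1 / lambda) * (-ln(1 - u)) ** (1 / alpha), for u in (0, 1)
    return (1.0 / LAMBDA) * (-math.log(1.0 - u)) ** (1.0 / ALPHA)

for u in (0.10, 0.50, 0.977, 0.999):
    print(f"{u}: {inverse_weibull(u):.3f}")
# prints approximately 0.098, 0.317, 0.908 and 1.322
```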
  • To illustrate the method when raw (unbinned) data is available, consider scoring three performances, 12, 16, and 30, using a calibration data set consisting of nine observations: 16, 20, 25, 27, 19, 18, 26, 27, and 15.
  • For x=16, there is one observation in the calibration data (15) that is less than x and one that is equal. Therefore,
  • $u = \frac{1}{9+1}\left[\sum_j\left(\mathbb{I}\{y_j < 16\} + \tfrac{1}{2}\,\mathbb{I}\{y_j = 16\}\right) + \tfrac{1}{2}\right] = \frac{1}{10}\left[1 + \tfrac{1}{2} + \tfrac{1}{2}\right] = 0.20.$
  • A summary of calculations is given in the following table.
  • x | Σj 𝟙{yj < x} | Σj 𝟙{yj = x} | u | w
    12 | 0 | 0 | [0 + (0.5)(0) + 0.5]/(9 + 1) = 0.05 | 0.063
    16 | 1 | 1 | [1 + (0.5)(1) + 0.5]/(9 + 1) = 0.20 | 0.157
    30 | 9 | 0 | [9 + (0.5)(0) + 0.5]/(9 + 1) = 0.95 | 0.787
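  • Continuing the raw-data example, this Python sketch reproduces the u and w columns of the table above (the values agree to the rounding shown).
```python
import math

def ecdf_percentile(x, calibration):
    n = len(calibration)
    less = sum(1 for y in calibration if y < x)
    equal = sum(1 for y in calibration if y == x)
    return (less + 0.5 * (equal + 1)) / (n + 1)

def inverse_weibull(u, alpha=1.610, lam=2.512):
    return (1.0 / lam) * (-math.log(1.0 - u)) ** (1.0 / alpha)

calibration = [16, 20, 25, 27, 19, 18, 26, 27, 15]
for x in (12, 16, 30):
    u = ecdf_percentile(x, calibration)
    print(f"{x}  u={u:.2f}  w={inverse_weibull(u):.3f}")
# 12  u=0.05  w=0.063
# 16  u=0.20  w=0.157
# 30  u=0.95  w=0.787
```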
  • For backward compatibility, it may be necessary to score athletes based on binned data. Consider scoring four performances, 40, 120, 135, and 180, using a calibration data set binned as follows. Here, the bin label corresponds to the lower bound, e.g., the bin labeled 90 contains measurements from the interval (90, 100).
  • Bin Count
    <50 0
     50 2
     60 19
     70 33
     80 63
     90 39
    100 20
    110 17
    120 26
    130 14
    140 4
    150 3
    160 1
    170 4
    Total 245
  • For x=135, there are 0+2+19+33+63+39+20+17+26=219 observations that are in bins less than the one that contains x and 14 that fall in the same bin. Therefore,
  • $u = \frac{1}{245+1}\left[\sum_j\left(\mathbb{I}\{y_j < \text{bin containing } 135\} + \tfrac{1}{2}\,\mathbb{I}\{y_j \text{ in bin containing } 135\}\right) + \tfrac{1}{2}\right] = \frac{1}{246}\left[219 + 7 + \tfrac{1}{2}\right] = 0.921.$
  • A summary of calculations is given in the following table.
  • x | Σj 𝟙{yj < x} | Σj 𝟙{yj = x} | u | w
    40 | 0 | 0 | 0.002 | 0.008
    120 | 193 | 26 | 0.839 | 0.579
    135 | 219 | 14 | 0.921 | 0.709
    180 | 241 | 4 | 0.990 | 1.026
  • The standardization and transformation processes are performed exactly as in the raw data example; however, care must be taken to ensure consistent treatment of bins. All raw values contained in the same bin will result in the same standardized value and thus the same score. In short, scoring based on binned data simplifies data collection and storage at the expense of resolution (only a range, not a precise value, is recorded) and complexity (consistent treatment of bin labels).
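  • A corresponding Python sketch for the binned example; consistent with the table above, a raw score is located in the bin with the largest lower bound not exceeding it (so 135 falls in the bin labeled 130), and the "<50" bin is represented here with a lower bound of negative infinity.
```python
import math

def binned_percentile(x, bins):
    # bins: list of (lower_bound, count) pairs.
    n = sum(count for _, count in bins)
    containing = max(lb for lb, _ in bins if lb <= x)
    less = sum(c for lb, c in bins if lb < containing)
    equal = sum(c for lb, c in bins if lb == containing)
    return (less + 0.5 * (equal + 1)) / (n + 1)

def inverse_weibull(u, alpha=1.610, lam=2.512):
    return (1.0 / lam) * (-math.log(1.0 - u)) ** (1.0 / alpha)

bins = [(float("-inf"), 0), (50, 2), (60, 19), (70, 33), (80, 63), (90, 39),
        (100, 20), (110, 17), (120, 26), (130, 14), (140, 4), (150, 3),
        (160, 1), (170, 4)]
for x in (40, 120, 135, 180):
    u = binned_percentile(x, bins)
    print(f"{x}  u={u:.3f}  w={inverse_weibull(u):.3f}")
# 40  u=0.002  w=0.008
# 120  u=0.839  w=0.579
# 135  u=0.921  w=0.709
# 180  u=0.990  w=1.026
```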
  • In rare circumstances, only summary statistics (such as the mean and standard deviation) of the calibration data are available. If an assumption of normal data is made, then raw data can be standardized in Microsoft® Excel®, available from the Microsoft Corporation of Redmond, Wash., using the normsdist( ) function.
  • The above method relies heavily on the assumption of normality. Therefore if data are not normal it will, naturally, perform poorly. Due to the assumed normality, this method does not enjoy the robustness of the ECDF method based on raw or binned data and should be avoided unless there is no other alternative.
  • To illustrate this technique, assume that the mean and standard deviation of a normally distributed calibration data set are 98.48 and 24.71, respectively, and it is desirable to score x=150. In this case, u=normsdist((150−98.48)/24.71)=0.981.
  • As before,
  • $w = \frac{1}{\lambda}\left[-\ln(1-u)\right]^{1/\alpha} = \frac{1}{2.512}\left[-\ln(1-0.981)\right]^{1/1.610} = 0.924.$
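  • For the summary-statistics case, the Excel normsdist() call is simply the standard normal cumulative distribution function; the Python sketch below reproduces the standardization step of this example using math.erf in place of the spreadsheet function.
```python
import math

def normsdist(z):
    # Standard normal cumulative distribution function (Excel's normsdist()).
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mean, sd, x = 98.48, 24.71, 150
print(f"{normsdist((x - mean) / sd):.3f}")  # 0.981
```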
  • Once the norm data has been collected and sorted in the manner set forth above for a given test, its ECDF is scatter plotted to reveal the Performance Curve. For example, non-standing vertical jump data observed in the field for 288 girls are shown as indicated in FIG. 3. For those results not observed, e.g., 26.6 inches, that value's rank (99.37th percentile) is assigned by interpolation; the unobserved points requiring assigned ranks are shown as indicated in FIG. 3.
  • For each test, a “ceiling” and a “floor” value are determined, which represent the boundaries of scoring for that test. Any test value at or above the ceiling earns the same number of event points. Likewise, any test value at or below the floor earns the same number of event points. These boundaries serve to keep the rating scale intact. The ceiling limits the chance of a single exceptional test result skewing an athlete's rating, thereby masking mediocre performance in other tests.
  • Each rank is transformed to fractional event points using a statistical function, as set forth above with respect to the inverse Weibull Transformation. The scoring curve of event points is shown for girls' no-step vertical jump in FIG. 3, as indicated therein, where the points are displayed as percentages, i.e., 0.50 points (awarded for a jump of 18.1 inches) are shown as fifty percent. These fractional event points are also referred to as the w-score (“w” for Weibull).
  • The inverse Weibull Transformation can process non-normal (skewed) distributions of test data, as described above. The transformation also allows for progressive scoring at the upper end of the performance range. Progressive scoring assigns points progressively (more generously) for test results that are more exceptional. Progressive scoring allows for accentuation of elite performance, thus making the rating more useful as a tool for talent identification.
  • FIG. 4 identifies a sample athlete, “Andrea White,” who jumped 26.5 inches during a no-step vertical jump. This value corresponds to a w-score of 1.078. The w-scores for all of her tests are found by referencing those tests' respective look-up tables. These w-scores are shown in FIG. 5.
  • The fractional event points are summed for each ratings test variable to arrive at the athlete's total w-score (5.520 in FIG. 5, for example). This total is multiplied by an event scaling factor to produce a rating. For a girls' basketball rating, for example, this scaling factor is 18, and so Andrea White's overall athleticism rating is 99.36 (=5.520×18).
  • The “event scaling factor” is determined for each rating by the number of rated events and desired rating range. Ratings should generally fall within a range of 10 to 110. A boys' scaling factor is 25, for example, as the rating comprises four variables: Peak Power, Max Touch, Lane Agility, and three-quarter Court Sprint.
  • Were a female athlete to “hit the ceiling” on all six tests (shown in FIG. 6), her w-score total would yield a rating of almost 130 (129.85).
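  • The final aggregation is simple arithmetic, shown here as a Python sketch using the figures quoted above for the sample athlete; only the total w-score and the scaling factor are taken from the text, and the per-test values behind that total (FIG. 5) are not reproduced here.
```python
# Fractional event points (w-scores) summed across the rated tests, then
# multiplied by the event scaling factor to produce the rating.
total_w_score = 5.520        # the sample athlete's summed w-scores (FIG. 5)
event_scaling_factor = 18    # e.g., the girls' basketball scaling factor
print(round(total_w_score * event_scaling_factor, 2))  # 99.36
```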
  • Assessing each of the various scores for each test provides the athlete with an overall athleticism rating, which may be used by the athlete in comparing their ability and/or performance to other athletes within their age group. Furthermore, the athlete may use such information to compare their skill set with those of other athletes in a particular sport (e.g., basketball, golf, etc.) to determine how their skill set compares with that of a professional athlete in the sport. While the above described tests and data may relate more to basketball or other similar activities, the same or similar algorithms, formulas, calculations and processes may be used to develop ratings for tests and exercises relating to other sports such as golf.
  • FIG. 7 illustrates a method 700 for generating an athleticism rating based on performance data collected in multiple athletic exercises or tests. Initially, as indicated at step 702, athletic performance data related to a particular sport are collected for a group of athletes. As described herein, the performance data may correspond to multiple different athletic tests or exercises. In some arrangements, those tests or exercises may be specific or unique to a particular sport or type of athletic activity/movement. Athletic performance data might include, by way of example, and not limitation, a no-step vertical jump height, an approach jump reach height, a sprint time for a predetermined distance, a cycle time around a predetermined course, or the like. Athletic performance data can be recorded for multiple athletes (e.g., a group of hundreds or thousands of athletes).
  • In step 705, the collected athletic performance data, such as athletic performance test results, may be normalized. Accordingly, athletic performance test results (e.g., raw test results) for each athletic test performed by an athlete in association with a defined sport are normalized. That is, raw test results for each athlete can be standardized in accordance with a common scale. Normalization enables a comparison of data corresponding with different athletic tests. In one embodiment, a normalized athletic performance datum is a percentile of the empirical cumulative distribution function (ECDF). Any method can be utilized to obtain normalized athletic performance data (i.e., athletic performance data that has been normalized).
  • In step 710, the normalized athletic performance data is utilized to generate a set of ranks. The set of ranks includes an assigned rank for each athletic performance test result included within a scoring table. A scoring table (e.g., a lookup table) includes a set of athletic performance test results, or possibilities (e.g., potential test values or results) thereof. Each athletic performance test result within a scoring table corresponds with an assigned rank and/or a fractional event point number. In one embodiment, the athletic performance data is sorted and a percentile of the empirical cumulative distribution function (ECDF) is calculated for each value. As such, the percentile of the empirical cumulative distribution function represents a rank for a specific athletic performance test result included in the scoring table. In this regard, each athletic performance test result is assigned a ranking number based on that test result's percentile among the normal distribution of test results. As such, the rank (e.g., percentile) may depend on the raw test measurements and may be a function of both the size of the data set and the component test values. As can be appreciated, a scoring table might include observed athletic performance test results and unobserved athletic performance test results. A rank that corresponds with an unobserved athletic performance test result can be assigned using interpolation of the observed athletic performance test data.
  • In step 715, a fractional event point number is determined for each athletic performance test result. A fractional event point number for a particular athletic performance test result is determined or calculated based on the corresponding assigned rank. That is, the set of assigned ranks, or percentiles, is transformed into an appropriate point scale. In one embodiment, a statistical function, such as an inverse Weibull transformation, provides such a transformation.
  • In step 720, one or more scoring tables are generated. As previously mentioned, a scoring table (e.g., a lookup table) includes a set of athletic performance test results, or possibilities thereof. Each athletic performance test result within a scoring table corresponds with an assigned rank and/or a fractional event point number. In some cases, a single scoring table that includes data associated with multiple tests and/or sports can be generated. Alternatively, multiple scoring tables can be generated. For instance, a scoring table might be generated for each sport or for each athletic performance test. One or more scoring tables, or a portion thereof (e.g., athletic test results, assigned ranks, fractional event point numbers, etc.) can be stored in a database.
  • In step 725, athletic performance data in association with a particular athlete is referenced (e.g., received, obtained, retrieved, identified, or the like). That is, athletic performance test results for a plurality of different athletic performance tests are referenced. The set of athletic tests can be predefined in accordance with a particular sport or other physical activity. An athletic performance test is designed to assess the athletic ability and/or performance of a given athlete and measures an athletic performance skill related to a particular sport or type of physical activity.
  • At step 730, a fractional event point number that corresponds with each test result of the athlete is identified. Using a scoring table, a fractional event point number can be looked up, determined, calculated, or recognized based on the athletic performance test result for the athlete. In some arrangements, the best result from each test is translated into a fractional event point number by referencing the test result in the lookup table for each test. Although the above described process includes generating a scoring table having a rank and a fractional event point number that corresponds with each test result to use to lookup a fractional event point number for a specific athletic performance test result, alternative methods can be utilized to identify or determine a fractional event point number for a test result. For instance, in some cases, upon receiving an athlete's test results, a rank and/or a fractional event point number could be determined. In this regard, an algorithm can be performed in real time to calculate a fractional event point number for a specific athletic performance test result. By way of example only, an athletic performance test result for a particular athlete can be compared to a distribution of test results of athletic data for athletes similar to the athlete, and a percentile ranking for the test result can be determined. Thereafter, the percentile ranking for the test result can be transformed to a fractional event point number.
  • In step 735, the fractional event point number for each relevant test result for the athlete is combined or aggregated to arrive at a total point score. That is, the fractional event point number for each test result for the athlete is summed to calculate the athlete's total point score. At step 740, the total point score is multiplied by an event scaling factor to produce an overall athleticism rating. An event scaling factor can be determined using the number of rated events and/or desired rating range. Athletic data associated with a particular athlete, such as athletic test results, ranks, fractional event point numbers, total point values, overall athleticism rating, or the like, can be stored in a database.
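  • Putting the pieces of method 700 together, the sketch below standardizes an athlete's results against small synthetic calibration pools, transforms them to fractional event points and scales the sum; the calibration values, test names and scaling factor are all illustrative assumptions rather than data from the disclosure.
```python
import math

ALPHA, LAMBDA = 1.610, 2.512

def ecdf_percentile(x, calibration):
    n = len(calibration)
    less = sum(1 for y in calibration if y < x)
    equal = sum(1 for y in calibration if y == x)
    return (less + 0.5 * (equal + 1)) / (n + 1)

def w_score(x, calibration):
    u = ecdf_percentile(x, calibration)
    return (1.0 / LAMBDA) * (-math.log(1.0 - u)) ** (1.0 / ALPHA)

# Synthetic calibration pools (steps 702-720) and one athlete's results (725-740).
calibration = {
    "broad_jump_in": [70, 75, 80, 82, 85, 88, 90, 95, 100],
    "lateral_hop_in": [40, 45, 48, 50, 52, 55, 60, 62, 65],
}
athlete_results = {"broad_jump_in": 92, "lateral_hop_in": 58}
event_scaling_factor = 20  # illustrative; chosen so ratings fall roughly 10-110

total_w = sum(w_score(result, calibration[test])
              for test, result in athlete_results.items())
print(round(total_w * event_scaling_factor, 2))
```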
  • While athleticism ratings may be developed for a variety of different athletic activities, sports and movements, athleticism ratings for each type of athletic activity, sport and/or movement may be based on different metrics, tests and athleticism exercises. In determining an athleticism rating for golf, for instance, a computing system may request and/or receive input relating to endurance (e.g., using a par 5 step test), leg power (e.g., through a broad jump exercise and/or a countermovement lateral hop test), rotational and throwing force (e.g., based on a wood-chop bounce exercise and/or a side-sling launch exercise). The noted golf athleticism rating exercises or tests may more accurately measure an athlete's golf athleticism versus using other types of tests or exercises. In other arrangements, additional exercises or tests may be added to the battery or set of golf athleticism exercises as desired.
  • FIG. 8 illustrates a process 800 for performing an endurance-based par 5 step test. The test is intended to simulate the amount of exercise or physical exertion involved in completing a par 5 hole on a golf course. In step 802, for example, a computing system may instruct a user to perform a pulse find over a predefined amount of time. That is, the user may be asked, within a 30-second period, for example, to practice or attempt to find an athlete's pulse. The athlete may be the user or may be another individual. This find period may also be used to allow an athlete's pulse to lower from peaks before taking a reading.
  • Upon expiration of the initial 30-second find period or other time period, the computing system may subsequently instruct the user to perform a read of the subject athlete's pulse for a predefined pulse read time period in step 804. In one example, the predefined time period may be 30 seconds and a number of heart beats may be counted or detected over the 30-second period and multiplied by 2 to result in a beats per minute (bpm) value. Alternatively, the counted value over the 30 second period may be used for athleticism rating calculation purposes. Heart beats may be determined through manual user counting/determination or using electromechanical systems.
  • FIG. 9A illustrates an example process by which a pulse may be read. For example, an athlete 901 may measure the pulse of a subject athlete 903 by placing their fingers along a wrist area of the subject athlete 903. The measuring athlete 901 may then count the number of beats over the predefined time period. As illustrated, athletes 901 and 903 may simultaneously or concurrently measure the pulse of the other athlete 903 and 901, respectively. As previously provided, an electromechanical heart rate monitor may be utilized as an alternative or in addition to other methods provided herein.
  • Referring again to FIG. 8, in step 806, after determining the subject athlete's pulse (e.g., at the one minute mark from the start of the exercise), the computing system may instruct the user to record the data determined in step 804 and to begin practicing an athletic movement such as stepping. In a particular example, the athletic movement may include walking up and down a set of steps. Step 806 may last a total of 30 seconds (or other predefined amount) of exercise time. Various time periods may be used and the time periods described herein with respect to each process step (e.g., read, find, rest, report, step, etc.) may be different or equal to the time periods for each of the other process steps. In some instances, the predefined amount of time corresponding to step 806 may correspond to twice the predefined period of step 804. Alternatively, the predefined time period of step 806 may be three times, four times, ten times, etc. (or any whole or fractional multiple) of the predefined period of step 804. Reporting may include recording the data manually (e.g., on paper or other physical writing medium), entering it into the computing system, or a combination thereof.
  • The step test may include the setting or generation of a periodic or aperiodic beat (e.g., audible or visual) which the user is to follow with steps. In one example, the computing system may generate and produce a periodic beat (audible, visual and/or haptic) having a frequency of 60 beats per minute. In another example, a metronome may be used to set the beat (the computing system may also provide instructions to this effect). Other mechanical, electromechanical and manual methods (e.g., manually timing and counting out the beats) and systems may also be used for setting and/or cuing a user to a particular beat. Other beat frequencies may also be used including 30 beats per minute, 25 beats per minute, 45 beats per minute, 120 beats per minute and the like. In some arrangements, the beat may be aperiodic. For example, the beat may include a first beat at time 0, followed by a second beat at time 1 second and a third beat at time 4 seconds and a fourth beat at time 10 seconds.
  • During the practice phase, the computing system may further provide instructions on the type of movement expected at each beat or cue. For example, the computing system may instruct the athlete to step up with the left foot at the first beat, to step up with the right foot at the second beat, to step down with the left foot at the third beat and to step down with the right foot at the fourth beat and so on (repeating with same set of steps or with other sets of step movements). Accordingly, the par 5 step test may require performance with or on a physical structure having multiple elevations (e.g., a set of steps). In some instances, only two levels or elevations are necessary while in other examples, more than two levels or elevations may be required. Other arrangements of step movements may also be used. For example, the athlete may be asked to perform right up, left up, right up, left up, right down, left down, right up, left up, right up, left up and so on. The instructions might also depend on the dominant foot of the athlete. For example, the athlete may be instructed to start with the dominant foot first followed by the non-dominant foot.
  • Once the practice and report phase is completed (e.g., from step 806), the computing system may subsequently instruct the athlete to perform actual steps (in contrast to practice steps) for a predefined amount of time in step 808. In this arrangement, the predefined amount of step time may be 3 minutes. As with the practice phase, the computing system may provide or instruct a user or other device such as a metronome to provide an audible, visual (e.g., on a display) or haptic cue (e.g., a beat). In one example, the instructions may include an instruction for a user to activate a metronome or other beat generating device. The computing system may further display or audibly convey the particular movement required at each beat.
  • FIGS. 9B and 9C illustrate example stepping movements that may be performed by athletes on a set of steps in conjunction with the exercise/test described in FIG. 8.
  • Referring again to FIG. 8, upon expiration of the stepping period in process step 808, the computing system may provide an instruction for the subject athlete to stop the stepping movement and to begin performing a pulse find in step 810. As described in step 802 and FIG. 9A, a pulse find may include locating the subject's pulse, allowing the subject's pulse to lower from a peak, and/or to otherwise prepare for determining the subject athlete's pulse. The pulse find process may be performed over a predefined time period such as 30 seconds.
  • Once the pulse find time period has expired, the computing system may subsequently instruct the user to read the subject athlete's pulse in step 812 over another predefined time period (e.g., 30 seconds, 1 minute, 45 seconds, 10 seconds, 2 minutes, 5 minutes, etc.). In one or more arrangements, read time periods and find time periods may be the same. In other arrangements, these time periods may be different. In yet other arrangements, each read time period and/or each find time period may be different from one or more of the other read time periods or find time periods, respectively.
  • Next, the computing system may subsequently instruct the user to again perform a stepping exercise for another predefined amount of time in step 814. This second stepping phase may have a duration that is less than the first stepping phase (step 808). In one example, the first stepping phase may have a duration of three minutes while the second stepping phase may have a duration of two minutes. The relationship between the first stepping phase duration and the second stepping phase duration may be defined in a number of ways. For example, the first stepping phase duration may be defined as one minute more than the second stepping phase duration. In other examples, the first stepping phase duration may be defined as twice, three times, 10 times, etc. the second stepping phase duration. In yet other examples, the second stepping phase duration may be defined as a fraction of the first stepping phase duration (e.g., ½, ¾, ⅔, 3/7, etc.).
  • The second stepping phase of step 814 may be followed by, similar to the first stepping phase, a find phase of 30-seconds (or another duration) at step 816, and a read phase of 30-seconds (or another duration) at step 818. Next, a computing system may instruct an athlete to perform a third stepping phase or round in step 820 for a predefined duration. The duration of the third stepping phase may be related to the durations of the second stepping phase and the first stepping phase. For example, the duration of the third stepping phase may be 1 minute less than the duration of the second stepping phase (and two minutes less than the duration of the first stepping phase). Alternatively or additionally, the duration of the third stepping phase may be a predefined percentage or fraction of the durations of the first and/or second stepping phases. The movements required in the stepping phases may be the same (just for different durations) or may vary. For example, in the first stepping phase, the athlete may be instructed to perform right step up, left step up, right step up, left step up, right step down, etc. while in the second stepping phase, the athlete may be instructed to perform right step up, left step up, right step down, left step down. The third stepping phase may further be different from the first and second stepping phases.
  • The third (and, in some examples, final) stepping phase of step 820 may similarly be followed by a find phase of 30 seconds (or another duration) and a read phase of 30 seconds (or another duration).
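  • For reference, the example sequence of phases and durations described above for the par 5 step test could be represented as a simple schedule; the sketch below is a non-limiting configuration using the example values given in the description (a 30-second practice phase, 3-, 2- and 1-minute stepping rounds, and 30-second find and read phases):
```python
# Example schedule for the par 5 step test described above.  Each entry is
# (phase label, duration in seconds); all durations are example values and
# may be configured differently.
PAR5_SCHEDULE = [
    ("practice", 30),        # practice stepping (step 806)
    ("step round 1", 180),   # first stepping phase (step 808)
    ("find", 30),            # locate pulse (step 810)
    ("read", 30),            # count pulse (step 812)
    ("step round 2", 120),   # second stepping phase (step 814)
    ("find", 30),            # step 816
    ("read", 30),            # step 818
    ("step round 3", 60),    # third stepping phase (step 820)
    ("find", 30),            # final pulse find
    ("read", 30),            # final pulse read
]

def total_test_seconds(schedule=PAR5_SCHEDULE):
    """Return the total scheduled duration of the test in seconds."""
    return sum(duration for _, duration in schedule)
```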
  • The final report may be provided based on a user interface or electronic form generated by the computing system. The electronic form may be specific to the par 5 test and request various information including the various pulse readings at the specified times. According to one or more arrangements, in addition or as an alternative, the computing system may automatically take the pulse measurements during the read phases. In other arrangements, a user may perform the pulse reading process in a manual fashion (e.g., either by manually counting or using a device to measure the subject's pulse). By measuring the subject's pulse and the change in the subject's pulse after varying degrees (e.g., time or amount) of exercise (e.g., stepping) and rest, a computing system may determine the subject's endurance or ability to recover (e.g., based on the changes or differences in pulse readings at the specified times). This information may be relevant to how well an athlete would perform (e.g., endurance-wise) in golf since rounds of golf tend to last multiple hours and require a significant amount of walking.
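  • As one non-limiting sketch of how recorded pulse counts could be turned into a recovery indicator (the 30-second read window and the simple drop-from-previous-reading metric are assumptions for illustration; the disclosure does not mandate any particular computation):
```python
def counts_to_bpm(beat_count, window_seconds=30):
    """Convert a beat count taken over a read window into beats per minute."""
    return beat_count * (60.0 / window_seconds)

def recovery_drops(read_counts, window_seconds=30):
    """Return the drop in pulse (in BPM) between consecutive read phases.

    read_counts: the beat counts recorded at each read phase, in order.
    Larger drops following rest periods suggest a faster recovery.
    """
    bpm = [counts_to_bpm(c, window_seconds) for c in read_counts]
    return [earlier - later for earlier, later in zip(bpm, bpm[1:])]

# Example: counts of 70, 62 and 55 beats over successive 30-second reads
# correspond to 140, 124 and 110 BPM, i.e. drops of 16 and 14 BPM.
```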
  • The data recorded during the above par 5 step test may be requested and received by the computing system in conjunction with a corresponding instruction or may be requested and received at the end of the entire test. Alternatively, data may be collected at various intervals or points during the test (e.g., during rest periods and the like). Additionally or alternatively, any number of stepping rounds may be performed or required.
  • FIGS. 10A and 10B illustrate an example process for performing a broad jump test/exercise for evaluating golf athleticism. The broad jump exercise may begin with an athlete 1001 being positioned with feet 1003 parallel and toes (or the tip of the athlete's shoe) placed at a predefined point such as a baseline 1005 as illustrated in FIG. 10A. The athlete 1001 may further be required to start in a crouched position in preparation for launching himself or herself as far as possible in a specified direction (e.g., direction A). In one or more arrangements, a sensor system (not shown) may be deployed along baseline 1005 to ensure that the athlete's feet are properly aligned. For example, if the athlete 1001 crosses baseline 1005, the sensor may sound an alarm or other type of visual, audio or haptic alert/feedback cue. Additionally or alternatively, if the athlete's toes are not touching baseline 1005, a test administrator and/or the athlete may be notified.
  • A computing system may further cue the athlete 1001 to begin the exercise. For example, the computing system may provide an audible, visual and/or haptic cue to begin a jump. Upon generating the cue, athlete 1001 may perform a broad jump. FIG. 10B illustrates athlete 1001 mid-jump. Upon landing, the athlete 1001 may be required to remain upright with feet stationary so as to permit accurate measurement. The measurement may be taken from the baseline to the heel of athlete 1001 closest to the baseline. In one example, the measurement may be taken through a manual process. In other examples, electronic sensors (e.g., RFID, weight sensors, etc.) may be used to detect the position of athlete 1001's feet. By determining the position of the athlete's feet, a system may determine a heel point of the athlete's back foot (e.g., based on knowing the athlete's shoe size). This measurement may then be used to determine a broad jump athleticism score as described herein.
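  • For example (a sketch only; the toe-position sensing and the simple subtraction below are assumptions for illustration), a heel point could be estimated from a detected toe position and a shoe length derived from the athlete's shoe size:
```python
def heel_distance_from_baseline(toe_distance_inches, shoe_length_inches):
    """Estimate the broad jump measurement (baseline to back heel).

    toe_distance_inches: detected distance from the baseline to the toe of
    the athlete's back foot (e.g., from an RFID or weight sensor reading).
    shoe_length_inches: shoe length derived from the athlete's shoe size.
    The units and the subtraction itself are illustrative assumptions.
    """
    return toe_distance_inches - shoe_length_inches
```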
  • An athlete's jump may be disqualified, not recorded or not counted for a variety of reasons. For example, if the athlete steps into the jump, the athlete's jump may be disqualified. In another example, the athlete's jump may be disqualified if the athlete's toes cross the baseline prior to the jump. In yet another example, disqualification may be based on the athlete taking a step, hopping or landing any body part other than his or her feet on the jumping surface within a 5-second period after landing and/or prior to confirmed measurement. In one or more arrangements, the athlete may be required to perform two qualified jumps. An average may then be taken as the final recorded jump value and score. In other examples, the athlete might only be allowed to use a single jump score. Accordingly, if the athlete's first or second jump is disqualified, the athlete might be required to base his or her athleticism score for the broad jump on the other jump.
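  • The two-jump recording rule described above might be captured, purely as an illustration, as follows (the trial representation and field names are assumptions):
```python
def broad_jump_value(trials):
    """Return the recorded broad jump value from a list of trials.

    trials: list of (distance, qualified) tuples in the order attempted.
    If two or more trials qualify, the first two qualified distances are
    averaged; if only one qualifies, that distance is used; otherwise no
    value is recorded.
    """
    qualified = [distance for distance, ok in trials if ok]
    if len(qualified) >= 2:
        return sum(qualified[:2]) / 2.0
    if len(qualified) == 1:
        return qualified[0]
    return None  # no qualified jump recorded

# Example: a disqualified 100-inch attempt followed by a qualified 96-inch
# jump yields a recorded value of 96.
# broad_jump_value([(100, False), (96, True)])
```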
  • Another exercise or test that may be used to evaluate golf athleticism is a countermovement lateral hop test designed to measure leg power with weight transfer and stabilization upon landing. This test may be used for determining golf athleticism. FIGS. 11A-11C illustrate an example countermovement lateral hop movement. A countermovement lateral hop corresponds to an athlete hopping laterally from the dominant leg to the opposite leg, covering the greatest distance possible. In FIG. 11A, for instance, a user positions him or herself in an initial stance. The initial stance may involve the athlete placing the lateral outside edge of his or her dominant foot 1103 along a baseline 1105 while standing with his or her shoulder line perpendicular to baseline 1105. The athlete's foot and/or body position may be verified using various types of sensors such as laser sensors, weight sensors, and/or computing systems configured to perform visual analysis of the athlete's position and the like.
  • Once the athlete has established his or her initial stance and the initial stance has been verified as proper, the athlete may then be cued or otherwise instructed to hop, leading with the non-dominant foot, as far as possible in direction B. FIG. 11B illustrates an athlete initiating a hop. The user may be encouraged or instructed to shift his or her weight to the dominant foot before launching into the hop. The athlete may be permitted to move either foot prior to the jump, but may be required to touch the baseline 1105 with his or her dominant foot 1103 just prior to initiating the jump/hop. Initiation of the hop may be defined by the lifting of the dominant foot while the non-dominant foot is in mid-air (i.e., not in contact with the test surface).
  • FIG. 11C illustrates an athlete's landing upon completing the countermovement lateral hop. Once the athlete has landed, a measurement may be taken between the baseline 1105 and an inside lateral edge 1107 of the athlete's non-dominant foot (i.e., front/lead foot). Again, this measurement may be taken manually or through electronic means and systems.
  • A trial of the hop test may be disqualified under certain circumstances. For example, the test and measurement may be disqualified if the athlete does not begin in the prescribed initial stance as described herein. The stance may be confirmed by another individual, by sensors, and/or based on visual image analysis performed by a computing system. Disqualification may also result from the athlete failing to touch the baseline with the dominant/back foot immediately prior to initiating the hop/jump, the athlete's non-dominant/lead foot landing in an orientation that is not substantially parallel to the baseline, and/or the athlete failing to stabilize the landing leg/foot for a measurement time period (e.g., 5, 10, 15, 30, 60, 120 seconds, etc.). As with the broad jump, the countermovement lateral hop test may require the athlete to perform two successful hops. The average of the distance measurements may then be used to determine the athlete's final score or value (e.g., per the methods and algorithms discussed herein) for the countermovement lateral hop test. In other arrangements, only a single successful exercise/measurement may be required. In yet other arrangements, any number of measurements may be required (e.g., 3, 5, 10, 12, etc.).
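  • If the disqualification rules above were checked automatically (e.g., from sensor observations or administrator input), the check might resemble the following sketch; the boolean inputs and the 5-second stabilization default are illustrative assumptions:
```python
def lateral_hop_trial_qualifies(stance_ok, baseline_touched, landing_parallel,
                                stabilized_seconds, required_stabilization=5):
    """Return True if a countermovement lateral hop trial qualifies.

    The inputs correspond to the disqualification conditions described
    above: a proper initial stance, the dominant/back foot touching the
    baseline immediately before the hop, the lead foot landing substantially
    parallel to the baseline, and the landing leg held stable for the
    measurement time period.
    """
    return (stance_ok
            and baseline_touched
            and landing_parallel
            and stabilized_seconds >= required_stabilization)
```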
  • A further measure of golf athleticism may include a wood-chop bounce exercise/test. A wood-chop bounce may include an athlete performing a cross-body rotational throw or slam. The throw or slam may be performed diagonally downwards (e.g., cross-body) with both hands on a thrown object (e.g., a ball). FIGS. 12A-12C illustrate an example motion for performing the wood-chop bounce. In FIG. 12A, for example, the athlete 1201 assumes an initial stance and position along a baseline 1203. As with the countermovement lateral hop, the athlete may be required to place his or her dominant foot on the baseline 1203 with the width of his or her body perpendicular to the baseline 1203. Again, various sensors and computing systems or devices may be used to confirm that the athlete 1201 is in the correct stance and position. For example, ball 1205 may include sensors to detect whether the athlete is touching two points on the ball 1205 (e.g., representing that the user is using both hands to grasp the ball).
  • The athlete may next be instructed to draw the ball up and back (e.g., just above head level) and to slam or throw the ball cross-body downward and toward the ground as shown in FIG. 12B. The goal of throwing the ball in this manner may be to achieve a maximum bounce and farthest second bounce or touch point. In the illustrated example, the athlete may be instructed to throw the ball 1205 toward a designated point 1207. In a particular arrangement, point 1207 may be five feet from the baseline (in a widthwise direction of the athlete when in the initial stance). In some examples, ball 1205 may be required to take the initial bounce within a testing field. In a particular example, the testing field may be 10 feet wide (e.g., baseline 1203 may be 10 feet wide).
  • FIG. 12C illustrates ball 1205 and athlete 1201 after the ball 1205 has made an initial bounce (e.g., near or at the predefined point 1207). On the second bounce of ball 1205, a marker may be placed or the position may be recorded. In one example, an electronic detection system may use weight sensors or RFID tags to determine a second bounce landing point. RFID tags may be incorporated into the ball 1205 or other thrown object. The second bounce point might also be required to be within the testing field (e.g., within an area that is ten feet wide). A distance between the baseline 1203 and the second bounce landing point may be measured and recorded. The measured distance may then be used as a basis for determining a user's score or athleticism rating in the wood-chop bounce exercise or test.
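  • If the second bounce point were reported as coordinates relative to the baseline (e.g., by RFID or weight sensors), the recorded distance and the in-field check might be sketched as follows; the coordinate convention and the 10-foot field width are assumptions for illustration:
```python
def wood_chop_measurement(bounce_point, field_width_feet=10.0):
    """Return the measured wood-chop bounce distance, or None if out of bounds.

    bounce_point: (x, y) position of the second bounce in feet, where x is
    the perpendicular distance from the baseline and y is the lateral offset
    from one side line of the test field.
    """
    x, y = bounce_point
    if 0.0 <= y <= field_width_feet:   # second bounce must stay in the field
        return x                        # distance from baseline to second bounce
    return None                         # outside the test field: disqualified
```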
  • The wood-chop bounce test includes various situations in which a test/measurement may be disqualified and the test results not recorded. For example, if the ball fails to land within the test field, the trial may be disqualified. In other examples, disqualification may result if the athlete does not begin the test in a golf stance with the dominant/back foot touching the baseline, the athlete does not throw the ball with two hands cross-body (e.g., the athlete might not be permitted to turn, open and square to the fairway when throwing), and/or the ball is bounced beyond the predefined first bounce point (e.g., striking the ground further than five feet or other predefined distance from the baseline). Various other disqualification rules may also be used in addition or as an alternative to any of the above noted disqualification criteria.
  • An alternate or additional version of the wood-chop bounce may involve neutralizing hip movement so that an athlete does not rely on hip movement in performing the test (e.g., throwing the ball). In one example, the athlete may be required to place and hold a ball or other object between his or her legs (e.g., above the knees). When throwing, the athlete is required to maintain his or her hold of the object between his or her legs. By having such a requirement, the test may minimize the contribution of hip movement during the throw. The ball held between the athlete's legs may be different in size, shape, color, weight and the like from the ball thrown. In another example, the ball held between the athlete's legs may be the same in size, shape, color, weight and other attributes as the ball to be thrown.
  • Another test that may be used to evaluate golf athleticism is a side-sling object launch exercise. This test or exercise may be used to evaluate an athlete's arm swing power while in a particular stance or position. The side-sling object launch may be performed in a hip-neutral manner as is further described below. FIGS. 13A-13D illustrate the movements associated with the side-sling object launch. In FIG. 13A, for example, the athlete 1301 may be instructed to take an initial position/stance similar to a golf stance. For example, the user's dominant foot 1303 may be positioned on a baseline 1305 of a test field 1311 and the athlete's body may be positioned width-wise perpendicular to baseline 1305. The athlete 1301 may further be required to hold an object (such as a ball 1307) between his or her legs to reduce hip movement contributions to the athlete's side-sling. In particular, the athlete 1301 may be required to hold the object between his or her legs (e.g., above the knees) through the entire exercise.
  • FIG. 13B illustrates athlete 1301 holding another object (e.g., ball 1309) that is to be thrown down a test field 1311 in direction C. In some arrangements, ball 1307 may be different in size, shape, color, weight and the like from ball 1309. In another example, ball 1307 may be identical to ball 1309. In FIG. 13C, the athlete 1301 is shown slinging ball 1309 down the test field 1311. The throw may be a single underhanded throw and may require the user to start by swinging his or her arm 1313 and the ball directly back and then slinging his or her arm 1313 forward, releasing the ball out into the test field 1311. The athlete 1301 may be required to follow through with the swing, ending upright as shown in FIG. 13D.
  • Upon completion of the throw, a distance between the baseline 1305 and the initial landing/contact point of the thrown object, e.g., ball 1309, may be measured. The distance may correspond to the perpendicular distance between the baseline 1305 and the ball 1309. This distance may then be used to determine the athlete's athleticism score or value for the side-sling object launch exercise. Multiple trials may be performed and an average may be taken in some arrangements. Disqualifications may be levied if: the athlete 1301 throws the ball 1309 and the ball 1309 does not land within the test field 1311 (width-wise); the athlete 1301 does not begin the test in the initial position (e.g., golf stance) with his or her back foot 1303 touching baseline 1305; the athlete's backswing carries his or her arm 1313 outside of his or her width-wise body line (e.g., the arm 1313 crosses the plane created by the athlete's shoulders and back in the initial stance, causing the ball 1309 to deviate from the test field 1311); the athlete 1301 takes any kind of step with his or her front foot; and/or the ball 1307 held between the athlete's legs falls out during the throw.
  • In one or more of the above tests, RFID tags or other sensors may also be worn by the user to detect various movements and positions. For example, an RFID tag may be placed in the heel portion of a user's shoe to detect a landing position in the broad jump test. In another example, RFID tags may be incorporated into a lateral edge (inner and/or exterior) of a user's shoe to detect a landing point in the countermovement lateral hop test. The RFID tag in the lateral edge or in other portions of the shoe may also be used to detect whether the user contacted or is contacting a baseline.
  • As with the par 5 step test, a computing system such as processing device 106 or computer system 108 may provide automated instructions to the athletes for any of the above noted exercises and movements, positions, measurements and data submissions described herein. For example, the computing system may provide audio, visual and/or haptic cues. Additionally or alternatively, sensors may be used to determine if the athlete is in a correct stance, determine one or more landing positions of a thrown object or the athlete, positions of an athlete's body parts and the like. The computing system may further generate a data input form requesting recordation of trial data for each of the tests described herein. Alternatively or additionally, the computing system may generate a single form or a sequence of forms having fields for the types of data to be recorded for each of the various golf athleticism tests/exercises. Upon receipt of the data, the computing system may determine the rating or score for each exercise in addition to an overall golf athleticism rating that takes into account all of the scores and results from all of the tests/exercises performed. For example, as described herein, an athlete's fractional event points (or other scoring value) for each test/exercise may be summed to result in an overall rating. The overall rating may also be subject to scaling factors (e.g., multiplying by a certain factor) to derive the scaled athleticism rating.
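  • One possible sketch of this scoring flow, combining the empirical-percentile and inverse-Weibull approach recited in the claims with the summing and scaling described above, is shown below; the Weibull shape and scale parameters, the percentile ceiling, the scaling factor and the assumption that larger raw results are better are all placeholders for illustration:
```python
import math

def percentile_rank(result, reference_results, ceiling=0.99):
    """Empirical-CDF percentile of a result within a comparable population,
    capped at a ceiling value.  Assumes larger results are better; the
    comparison would be reversed for tests where smaller results are better."""
    below = sum(1 for r in reference_results if r <= result)
    return min(below / len(reference_results), ceiling)

def fractional_event_points(percentile, shape=2.0, scale=100.0):
    """Map a percentile to fractional event points via an inverse-Weibull
    (Weibull quantile) transformation; shape and scale are example values."""
    return scale * (-math.log(1.0 - percentile)) ** (1.0 / shape)

def golf_athleticism_rating(results, reference_sets, scaling_factor=1.0):
    """Sum the per-test fractional event points and apply a scaling factor."""
    total = sum(
        fractional_event_points(percentile_rank(result, reference))
        for result, reference in zip(results, reference_sets)
    )
    return scaling_factor * total
```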
  • FIGS. 14A and 14B illustrate an example test field configuration that may be used for one or more of the golf athleticism rating tests described herein. FIG. 14A illustrates a front perspective view of test field 1400. Test field 1400 may include multiple markers including 5-foot markers 1403 and submarkers 1405. A baseline 1407 may be defined at the 0-foot marker 1403 a with a staging area 1409 defined therebehind. The test field 1400 may be divided width-wise into two sections 1411 a and 1411 b by a demarcation or dividing line 1413. Section 1411 a may be configured for throwing-oriented exercises while section 1411 b may be configured for jumping/hopping type tests/exercises. In one arrangement, section 1411 b might only extend length-wise for a distance smaller than the length of section 1411 a. In particular, section 1411 b might only have distance markers up to a first distance while section 1411 a may have distance markers up to a second distance greater than the first distance. In a specific example, section 1411 a may be 50 feet long while section 1411 b may be 10 feet long. Other distances may be used to define each of sections 1411 a and 1411 b. The widths of sections 1411 a and 1411 b may be equal or one may be greater than the other. In one particular example, the width of section 1411 a may be 10 feet.
  • FIG. 14B illustrates test field 1400 from a top down view. Test field 1400, as described, includes sections 1411 a and 1411 b that are substantially rectangular. In section 1411 a, a bounce line for various tests such as the wood-chop bounce may be predefined at the 5 foot marker 1403 b. The remaining portion of section 1411 a may be labeled with a “FAIRWAY” mark to indicate the throwing region. Section 1411 b, on the other hand, may be labeled with a “JUMP ZONE” label to identify section 1411 b as a jumping or hopping test area. Using these predefined sections 1411 a and 1411 b, athletes may perform different tests at the same time.
  • In one or more arrangements, test field 1400 may be provided as a mat that is portable. The portable mat may allow test administrators and test takers to administer or take the tests in a variety of locations (i.e., the tests would not be restricted to one particular physical location). Additionally, using a portable mat may provide consistency in the results that are produced. In some arrangements, the mat may include one or more electronic devices including sensors, LED displays, other visual displays, haptic feedback devices and the like. For example, LED numbering may be used instead of drawn numbering on the mat. In another example, weight sensors may be used to detect the position and weight distribution of athletes and test objects (e.g., balls). In a particular example, a sensor may be placed at the borders of the test sections 1411 a and 1411 b including the baseline 1407, end lines 1405 a and 1405 b and dividing line 1413. In yet other examples, infrared or laser sensors may be used to detect whether an athlete or an object is touching (or not touching) or touches a particular point on the mat such as the bounce line, the baseline and the like. In still another example, a display device may display instructions for performing a test. The mat may thus further include data transmission devices (e.g., wireless adapters, USB ports, serial ports and the like) to transmit detected information to one or more computing systems such as processing device 106 and/or computer system 108. The baseline, bounce line and other demarcations on the mat may be provided in different colors or patterns for visual differentiation. This may help the athlete identify targets, position requirements and the like.
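  • As an illustration of the data transmission mentioned above, a detected mat sensor event might be forwarded to the computing system as in the sketch below; the JSON-over-TCP transport, host, port and message fields are assumptions, and the mat could equally use USB, serial or other wireless links:
```python
import json
import socket

def send_mat_event(event, host="192.168.0.10", port=9000):
    """Forward a mat sensor event (e.g., a baseline contact detection) to a
    computing system such as a processing device or computer system.

    event: a dictionary such as
        {"sensor": "baseline", "contact": True, "timestamp": 1700000000.0}
    """
    payload = json.dumps(event).encode("utf-8") + b"\n"
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall(payload)
```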
  • The mat or portable test field may be composed of various materials and/or combinations of materials including plastics such as artificial turf, foams, metals and the like. The mat may, in one particular example, be roughly 15 feet wide and 60 feet long. Thus, in the example described above where section 1411 a (FIGS. 14A and 14B) is 10 feet wide, the other section, section 1411 b, may be 5 feet wide or smaller. Other lengths and widths may be used depending on the test criteria and requirements.
  • While the invention has been described in detail in terms of specific examples including presently preferred modes of carrying out the invention, those skilled in the art will appreciate that there are numerous variations and permutations of the above described systems and methods. Thus, the scope of the invention should be construed broadly as set forth in the appended claims.

Claims (20)

1. A method comprising:
receiving, at a computing system having at least one processor, athletic performance results from multiple types of performance tests, the athletic performance results including at least two selected from the following:
a change in pulse of an athlete measured during a stepping exercise,
a broad jump distance of an athlete,
a lateral hop distance of an athlete,
a bounce distance of a ball when thrown by the athlete in a downward direction toward a target, and
a sling distance of a ball when thrown by the athlete using an underhanded side sling; and
generating, by the computing system, a golf athleticism rating based on the at least two athletic performance results.
2. The method of claim 1, wherein the athletic performance results include the sling distance of the ball, and wherein the method further includes receiving, at the computing system, the sling distance of the ball during an exercise in which the athlete holds an object between the athlete's legs while performing the underhanded side sling.
3. The method of claim 1, wherein the stepping exercise includes multiple rounds of stepping by the athlete, each round of stepping followed by a rest period.
4. The method of claim 1, further comprising generating an athleticism score for each athletic performance result, and wherein generating the golf athleticism rating includes combining the athleticism scores.
5. The method of claim 4, wherein generating the athleticism score for each athletic performance result includes normalizing the athletic performance result data to a common scale.
6. The method of claim 1, wherein at least one of the athletic performance results is received from an athletic performance field having one or more integrated sensors.
7. The method of claim 6, wherein the athletic performance field comprises a portable test field.
8. An apparatus comprising:
at least one processor; and
memory operatively coupled to the at least one processor and storing computer readable instructions that, when executed, cause the apparatus to:
receive athletic performance results corresponding to an athlete's performance during multiple types of athletic exercises, the athletic performance results including at least two of:
a change in pulse of an athlete measured during a stepping exercise,
a broad jump distance of an athlete,
a lateral hop distance of an athlete,
a bounce distance of a ball when thrown by the athlete in a downward direction toward a target, and
a sling distance of a ball when thrown by the athlete using an underhanded side sling; and
generate a golf athleticism rating based on the at least two athletic performance results.
9. The apparatus of claim 8, wherein the athletic performance results include the sling distance of the ball, and wherein the computer readable instructions, when executed, further cause the apparatus to receive the sling distance of the ball during an exercise in which the athlete holds an object between the athlete's legs while performing the underhanded side sling.
10. The apparatus of claim 8, wherein the stepping exercise includes multiple rounds of stepping by the athlete, each round of stepping followed by a rest period.
11. The apparatus of claim 8, further comprising computer readable instructions for generating an athleticism score for each athletic performance result, and wherein generating the golf athleticism rating includes combining the athleticism scores.
12. The apparatus of claim 11, wherein generating each athleticism score includes normalizing the athleticism result to a common scale.
13. The apparatus of claim 8, wherein at least one of the athletic performance results is received from an athletic performance field having one or more integrated sensors.
14. The apparatus of claim 13, wherein the athletic performance field comprises a portable test field.
15. One or more non-transitory computer readable media having computer-executable instructions embodied thereon that when executed by a processor perform a method for evaluating the athleticism of an athlete in golf, the method comprising:
receiving at least two results for the athlete's performance in at least two different athletic performance tests related to golf;
comparing each of the at least two results to a corresponding distribution of test results of athletic data for athletes similar to the athlete and determining a percentile ranking for each of the at least two results;
transforming the percentile ranking for each of the at least two results to a fractional event point number for each result, wherein the percentile rankings for each of the at least two results are progressive; and
determining an athleticism rating score for the athlete in golf based on the fractional event point numbers.
16. The one or more non-transitory computer readable media of claim 15, wherein determining the athleticism rating score includes combining the fractional event point numbers to determine a combined fractional event point number and applying a scaling factor to the combined fractional event point number.
17. The one or more non-transitory computer readable media of claim 16, wherein transforming the percentile ranking for the at least two results to the fractional event point number comprises applying an inverse-Weibull transformation.
18. The one or more non-transitory computer readable media of claim 17, wherein the distribution of test results of athletic data for athletes similar to the athlete is determined using the empirical cumulative distribution function.
19. The one or more non-transitory computer readable media of claim 18, wherein the percentile ranking for each of the at least two results is capped at a ceiling value.
20. The one or more non-transitory computer readable media of claim 15, wherein the athletic performance results include at least two of:
a change in pulse of an athlete measured during a stepping exercise,
a broad jump distance of an athlete,
a lateral hop distance of an athlete,
a bounce distance of a ball when thrown by the athlete in a downward direction toward a target, and
a sling distance of a ball when thrown by the athlete using an underhanded side sling.
US13/682,217 2008-09-12 2012-11-20 Golf athleticism rating system Abandoned US20130079907A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/682,217 US20130079907A1 (en) 2008-09-12 2012-11-20 Golf athleticism rating system
PCT/US2012/069784 WO2014081442A1 (en) 2012-11-20 2012-12-14 Golf athleticism rating system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US9660308P 2008-09-12 2008-09-12
US12/559,082 US20100129780A1 (en) 2008-09-12 2009-09-14 Athletic performance rating system
US13/682,217 US20130079907A1 (en) 2008-09-12 2012-11-20 Golf athleticism rating system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/559,082 Continuation-In-Part US20100129780A1 (en) 2008-09-12 2009-09-14 Athletic performance rating system

Publications (1)

Publication Number Publication Date
US20130079907A1 true US20130079907A1 (en) 2013-03-28

Family

ID=47912119

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/682,217 Abandoned US20130079907A1 (en) 2008-09-12 2012-11-20 Golf athleticism rating system

Country Status (1)

Country Link
US (1) US20130079907A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6746370B1 (en) * 2001-09-27 2004-06-08 Meridian Asset Management Inc. Method, apparatus and data processor program product capable of enabling administration of a levels-based athleticism development program
US20080114564A1 (en) * 2004-11-25 2008-05-15 Masayoshi Ihara Information Classifying Device, Information Classifying Method, Information Classifying Program, Information Classifying System
US20080188353A1 (en) * 2007-02-05 2008-08-07 Smartsport, Llc System and method for predicting athletic ability

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Hanrahan Golf School Golf Tips", 1/9/2102, retrieved from internet URL http://web.archive.org/web/20110201000000*/http://www.hanrahangolfschool.com/tips.php on 10/23/2013 *
Pratt, Steve, "Increasing Clubhead Speed by Studying the Submarine Throw", 1/31/2011, retrieved from URL https://www.hititlonger.com/index.php?/blog/article/increasing-clubhead-speed-by-studying-the-submarine-throw/ on 10/23/2013 *
Sykes, et al, "The Chester step test - a simple yet effective tool for the prediction of aerobic capacity," Physiotherapy 90 (2004), 183-188 *

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150375043A1 (en) * 2006-09-29 2015-12-31 Nike Inc. Multi-Mode Acceleration-Based Athleticism Measurement System
US11654333B2 (en) 2006-09-29 2023-05-23 Nike, Inc. Multi-mode acceleration-based athleticism measurement system
US11400343B2 (en) 2006-09-29 2022-08-02 Nike, Inc. Multi-mode acceleration-based athleticism measurement system
US10729936B2 (en) * 2006-09-29 2020-08-04 Nike, Inc. Multi-mode acceleration-based athleticism measurement system
US11707107B2 (en) 2008-06-13 2023-07-25 Nike, Inc. Footwear having sensor system
US9089182B2 (en) 2008-06-13 2015-07-28 Nike, Inc. Footwear having sensor system
US11026469B2 (en) 2008-06-13 2021-06-08 Nike, Inc. Footwear having sensor system
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
US10912490B2 (en) 2008-06-13 2021-02-09 Nike, Inc. Footwear having sensor system
US10314361B2 (en) 2008-06-13 2019-06-11 Nike, Inc. Footwear having sensor system
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
US9622537B2 (en) 2008-06-13 2017-04-18 Nike, Inc. Footwear having sensor system
US9462844B2 (en) 2008-06-13 2016-10-11 Nike, Inc. Footwear having sensor system
US10293209B2 (en) 2010-11-10 2019-05-21 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9429411B2 (en) 2010-11-10 2016-08-30 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11935640B2 (en) 2010-11-10 2024-03-19 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11817198B2 (en) 2010-11-10 2023-11-14 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11600371B2 (en) 2010-11-10 2023-03-07 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11568977B2 (en) 2010-11-10 2023-01-31 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9757619B2 (en) 2010-11-10 2017-09-12 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9389057B2 (en) 2010-11-10 2016-07-12 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US10632343B2 (en) 2010-11-10 2020-04-28 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9924760B2 (en) 2011-02-17 2018-03-27 Nike, Inc. Footwear having sensor system
US9192816B2 (en) 2011-02-17 2015-11-24 Nike, Inc. Footwear having sensor system
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
US8827815B2 (en) * 2011-02-17 2014-09-09 Nike, Inc. Location mapping
US10179263B2 (en) 2011-02-17 2019-01-15 Nike, Inc. Selecting and correlating physical activity data with image data
US20120253653A1 (en) * 2011-02-17 2012-10-04 Nike, Inc. Location Mapping
US9411940B2 (en) 2011-02-17 2016-08-09 Nike, Inc. Selecting and correlating physical activity data with image data
US10357078B2 (en) 2012-02-22 2019-07-23 Nike, Inc. Footwear having sensor system
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control
US10568381B2 (en) 2012-02-22 2020-02-25 Nike, Inc. Motorized shoe with gesture control
US11071345B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Footwear having sensor system
US9756895B2 (en) 2012-02-22 2017-09-12 Nike, Inc. Footwear having sensor system
US11793264B2 (en) 2012-02-22 2023-10-24 Nike, Inc. Footwear having sensor system
US11071344B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Motorized shoe with gesture control
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
US11918854B2 (en) 2013-02-01 2024-03-05 Nike, Inc. System and method for analyzing athletic activity
US9810591B2 (en) 2013-03-15 2017-11-07 Nike, Inc. System and method of analyzing athletic activity
US9410857B2 (en) 2013-03-15 2016-08-09 Nike, Inc. System and method for analyzing athletic activity
US10024740B2 (en) 2013-03-15 2018-07-17 Nike, Inc. System and method for analyzing athletic activity
US20150018112A1 (en) * 2013-07-10 2015-01-15 Sweet Spot Science Corp. System and method for golf swing training
US20150032235A1 (en) * 2013-07-23 2015-01-29 BADPOPCORN, Inc. Systems and methods for automated analysis of fitness data
US9682280B1 (en) * 2013-09-20 2017-06-20 Sparta Software Corporation System for analysing athletic movement
US9737758B1 (en) * 2013-09-20 2017-08-22 Sparta Software Corporation Method and system for generating athletic signatures
US11179602B2 (en) 2013-10-10 2021-11-23 Sparta Software Corporation Method and system for training athletes based on athletic signatures and prescription
US10471290B1 (en) * 2013-10-10 2019-11-12 Sparta Software Corporation Method and system for training athletes based on athletic signatures and prescription
US9737761B1 (en) * 2014-10-29 2017-08-22 REVVO, Inc. System and method for fitness testing, tracking and training
US20180075392A1 (en) * 2016-09-12 2018-03-15 Real Time Athletes, Inc. Standardized athletic evaluation system and methods for using the same
US20210145367A1 (en) * 2018-06-05 2021-05-20 Sparta Software Corporation Systems, devices, and methods for determining injury risk and athletic readiness

Similar Documents

Publication Publication Date Title
US20130079907A1 (en) Golf athleticism rating system
US20120130515A1 (en) Athletic performance rating system
US20120130514A1 (en) Athletic performance rating system
US10532248B2 (en) Monitoring of physical training events
US10729936B2 (en) Multi-mode acceleration-based athleticism measurement system
US10467926B2 (en) Conformal sensor systems for sensing and analysis
US8348809B2 (en) System for training optimisation
US10881905B2 (en) Integrated portable device and method implementing an accelerometer for detecting asymmetries in a movement of a user
US20100129780A1 (en) Athletic performance rating system
Saponara Wearable biometric performance measurement system for combat sports
JP2014512220A (en) System and method for storing and analyzing golf data
CA2750094A1 (en) Athletic performance rating system
WO2014081442A1 (en) Golf athleticism rating system
KR20160121460A (en) Fitness monitoring system
Bramley The relationship between strength, power and speed measures and playing ability in premier level competition rugby forwards
AU2014203635A1 (en) Monitoring of physical training events
Staunton Development and application of novel accelerometry-derived metrics for athlete monitoring in basketball

Legal Events

Date Code Title Description
AS Assignment

Owner name: NIKE, INC., OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ANNIS, DAVID;REEL/FRAME:030819/0121

Effective date: 20130621

Owner name: NIKE, INC., OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HOMSI, KRISTOPHER L.;REEL/FRAME:030819/0039

Effective date: 20121213

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION