US20140255890A1 - Patient support apparatus with physical therapy system - Google Patents

Patient support apparatus with physical therapy system

Info

Publication number
US20140255890A1
US20140255890A1 (application US14/197,678)
Authority
US
United States
Prior art keywords
occupant
support apparatus
patient support
physical therapy
sensor unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/197,678
Inventor
Michelle Kovach
Christopher R. O'Keefe
Varad Srivastava
Thomas F. Heil
John Sparkman
Mark Lanning
Michael S. Hood
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hill Rom Services Inc
Original Assignee
Hill Rom Services Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hill Rom Services Inc filed Critical Hill Rom Services Inc
Priority to US14/197,678
Publication of US20140255890A1
Assigned to HILL-ROM SERVICES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Kovach, Michelle, O'KEEFE, CHRISTOPHER R., Sparkman, John, Lanning, Mark, HEIL, THOMAS F., HOOD, MICHAEL, SRIVASTAVA, Varad
Assigned to JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ALLEN MEDICAL SYSTEMS, INC., ASPEN SURGICAL PRODUCTS, INC., HILL-ROM SERVICES, INC., WELCH ALLYN, INC.
Assigned to JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT. SECURITY AGREEMENT. Assignors: ALLEN MEDICAL SYSTEMS, INC., ASPEN SURGICAL PRODUCTS, INC., HILL-ROM SERVICES, INC., WELCH ALLYN, INC.
Assigned to MORTARA INSTRUMENT, INC., ALLEN MEDICAL SYSTEMS, INC., Voalte, Inc., HILL-ROM COMPANY, INC., MORTARA INSTRUMENT SERVICES, INC., WELCH ALLYN, INC., HILL-ROM SERVICES, INC., HILL-ROM, INC., ANODYNE MEDICAL DEVICE, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: JPMORGAN CHASE BANK, N.A.
Assigned to JPMORGAN CHASE BANK, N.A. SECURITY AGREEMENT. Assignors: ALLEN MEDICAL SYSTEMS, INC., ANODYNE MEDICAL DEVICE, INC., HILL-ROM HOLDINGS, INC., HILL-ROM SERVICES, INC., HILL-ROM, INC., Voalte, Inc., WELCH ALLYN, INC.

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B 24/0075 Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6891 Furniture
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G 7/00 Beds specially adapted for nursing; Devices for lifting patients or disabled persons
    • A61G 7/05 Parts, details or accessories of beds
    • A61G 7/0506 Head or foot boards

Definitions

  • the present disclosure is related to a patient support apparatus. More specifically, the present disclosure is related to a patient support apparatus configured to enable a patient to perform a physical therapy routine while being supported by the patient support apparatus.
  • a physician prescribes physical therapy to a patient to assist the patient's recovery after an injury caused by physical trauma or disease.
  • a nurse, physical therapist, or caregiver may instruct and observe the patient while the patient performs the prescribed physical therapy routine.
  • the patient may perform the physical therapy routine in a variety of locations including a hospital room or the patient's home. Additionally, the patient may perform the physical therapy routine when standing, sitting, or while being supported by a patient support apparatus.
  • One method of administering the prescribed physical therapy routine, and verifying that the physical therapy routine is correctly performed, is to hire a caregiver trained in physical therapy to monitor the patient as the patient performs the physical therapy routine. Requiring a caregiver trained in physical therapy to supervise the patient can place a large financial burden on the patient. Additionally, a hospital or other facility may have a limited number of caregivers trained in physical therapy. This may lead to fewer physical therapy sessions for all patients in need of physical therapy, or to some patients with marginal need of physical therapy receiving no physical therapy at all.
  • Another method of administering the prescribed physical therapy is to instruct the patient to perform self-guided physical therapy.
  • a patient may be unmotivated and choose not to perform the required physical therapy. Additionally, the patient may perform the physical therapy routine incorrectly causing poor results or injury to the patient.
  • Patients prescribed physical therapy may be instructed to remain in bed or may be physically unable to exit a bed. These patients may experience difficulty in performing prescribed physical therapy that requires the patient to be out of bed or incorporates the use of heavy equipment.
  • a caregiver trained in physical therapy may have a building or area within a building dedicated to physical therapy sessions. The caregiver may have to relocate to the patient's room due to the inability of the patient to exit their bed. The travel associated with the caregiver relocating may reduce the total time available to all patients for physical therapy or increase the financial burden of the patient receiving the physical therapy. By relocating to the patient's room, the caregiver may be limited to assisting one patient instead of multiple patients in a group therapy session. As such, traditional physical therapy may be difficult or costly for some patients and may lead to less physical therapy for all patients.
  • the sensor unit is operable to sense a position and motion of limbs of the occupant in the patient support apparatus and to transmit information representing the position and motion of the limbs of the occupant to the processor.
  • the graphical display is coupled to the processor and configured to display graphics based upon feedback from the processor.
  • the memory device is coupled to the processor and contains information representing an idealized set of positions and motions of limbs of an occupant to be achieved by the occupant while performing a physical therapy routine.
  • the processor is configured to update the graphical display based upon the information representing the position and motion of the limbs of the occupant performing the physical therapy routine received from the sensor unit and a comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
  • the processor may be configured to update the graphical display to display graphics instructing the occupant to move the limbs of the occupant with at least one of a first speed, a first range of motion, and a first force to progress the physical therapy. In some embodiments, the processor may be configured to update the graphical display to display graphics instructing the occupant to move the limbs of the occupant with at least one of a second speed, a second range of motion, and a second force to progress the physical therapy.
  • the one of the second speed, second range of motion, and second force may be determined by the comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
  • the information representing the idealized set of positions and motions of limbs of the occupant may be based upon a length of a limb of the occupant. In some embodiments, the information representing the idealized set of positions and motions of limbs of the occupant may be based upon an age of the occupant.
  • the sensor unit may be wirelessly connected to an assigned patient support apparatus and the sensor unit may sound an audible alarm if the sensor unit is moved to a position outside of a specified range of the assigned patient support apparatus.
  • the patient support apparatus may include at least one physiological sensor.
  • the physiological sensor may be configured to transmit information representing one of a heart rate, a respiration rate, calories burned, and a temperature of the occupant to the processor.
  • the sensor unit may include a number of weight sensors. In some embodiments, the sensor unit may be mounted on a member of the patient support apparatus.
  • the graphical display may be positioned to be visible to the occupant. In some embodiments, the graphical display may be mounted on a member of the patient support apparatus. In some embodiments, the graphical display may be included in the sensor unit.
  • the sensor unit may include an image-recording device. In some embodiments, the sensor unit may include at least one accelerometer. In some embodiments, the sensor unit includes a radio frequency sensor.
  • the sensor unit may include at least one switch.
  • the at least one switch may have an inactive and an active position. The occupant may be enabled to move the at least one switch between the inactive position and the active position.
  • the at least one switch may be configured to offer a resistance against moving between the inactive and active position.
  • the resistance offered by the switch may be variable.
  • the processor may create data relating to statistics of the occupant while performing the physical therapy routine.
  • the data may be stored in the memory device.
  • the data may be automatically transmitted to a computer network of a hospital.
  • the data relating to statistics of the occupant may include at least one of a heart rate of the occupant, a number of repetitions of the physical therapy routine performed by the occupant, and a score indicative of the caliber of the performance of the occupant.
  • the sensor unit may include an accelerometer, a switch, and a radio frequency sensor.
  • the sensor unit may include an accelerometer, a switch, the graphical display, and the processor.
  • the graphical display may be a touch screen.
  • the processor may be configured to update the graphical display based on a physical fitness of the occupant.
  • the physical fitness of the occupant may include information relating to at least an age of the occupant, a weight of the occupant, a height of the occupant, any medicines prescribed to the occupant, and a daily physical fitness level of the occupant.
  • the patient support apparatus may be configured to receive information from a computer network.
  • the physical fitness of the occupant may be determined from a medical record including information relating to the physical fitness of the occupant.
  • the medical record may be received by the patient support apparatus from the computer network.
  • the processor may be configured to end physical therapy based upon the information representing the position and motion of the limbs of the occupant performing the physical therapy routine received from the sensor unit and a comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
  • the processor may be configured to produce an alarm signal based upon the information representing the position and motion of the limbs of the occupant performing the physical therapy routine received from the sensor unit and a comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
  • the processor may be configured to update the graphical display based upon information received from the patient support apparatus.
  • the information received from the patient support apparatus may include one of an angle and a position of a portion of a mattress included in the patient support apparatus.
  • the processor may be in communication with a computer network.
  • the memory device may include a unique identifier.
  • Information transmitted to the computer network from the processor may include the unique identifier.
  • a method of monitoring an occupant's performance of a physical therapy routine in a patient support apparatus comprises several steps.
  • the steps include displaying graphics on a graphical display instructing the occupant to perform a physical therapy routine, receiving information representing a position and motion of an occupant's limbs, comparing the information representing the position and motion of the occupant's limbs to a set of information representing an optimal position and motion of limbs of an occupant performing the physical therapy routine in the patient support apparatus, and using the information representing the position and motion of the occupant's limbs and the comparison to the set of information representing the optimal position and motion of limbs of an occupant performing the physical therapy routine in a patient support apparatus as parameters to affect the graphics displayed on the graphical display.
  • the sensor is operable to sense the position and motion of a limb of the occupant in the patient support apparatus and to transmit information representing the position and motion of the limb of the occupant to the control unit.
  • the graphical display is coupled to the control unit and configured to display graphics based upon instructions from the control unit.
  • the control unit is configured to monitor the information representing the position and motion of the limb of the occupant and transmit instructions to the graphical display based on the information representing the position and motion of the limb of the occupant.
  • control unit may be configured to evaluate the position and motion of the limb of the occupant and create data representing a caliber of the performance of the occupant.
  • patient support apparatus may be configured to communicate the data representing the caliber of the performance of the occupant to a computer network of a hospital.
  • instructions transmitted by the control unit may be based on information stored on the computer network of the hospital.
  • FIG. 1 is a perspective view of a patient support apparatus in accordance with the present disclosure monitoring an occupant supported by the patient support apparatus performing a physical therapy routine;
  • FIG. 2 is a diagrammatic view of the patient support apparatus of FIG. 1 , the patient support apparatus including a control unit, a sensor unit, and a graphical display;
  • FIG. 3 is a diagrammatic view of one embodiment of the sensor unit of FIG. 2 including at least one sensing device and a sensor communications port;
  • FIG. 4 is a diagrammatic view of the patient support apparatus of FIG. 1 , the patient support apparatus configured to communicate with a number of devices included in a computer network of a healthcare facility;
  • FIG. 5 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a second embodiment of the sensor unit;
  • FIG. 6 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a third embodiment of the sensor unit;
  • FIG. 7 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a fourth embodiment of the sensor unit;
  • FIG. 8 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a fifth embodiment of the sensor unit;
  • FIG. 9 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a sixth embodiment of the sensor unit.
  • a patient support apparatus 10 is configured to enable an occupant 12 to perform physical therapy and is shown in FIG. 1 .
  • occupant 12 is prescribed physical therapy to recover from an injury.
  • patient support apparatus 10 is configured such that occupant 12 can perform at least one physical therapy routine 24 while being supported by patient support apparatus 10 , thus occupant 12 is not required to exit patient support apparatus 10 to perform at least some of the physical therapy prescribed for occupant 12 .
  • patient support apparatus 10 provides instructions for performing physical therapy routine 24 to occupant 12 .
  • occupant 12 may perform physical therapy routine 24 alone or under the supervision of a caregiver 13 .
  • Patient support apparatus 10 includes a number of devices operable to instruct and monitor the progress of occupant 12 performing physical therapy routine 24 and provide feedback to occupant 12 .
  • Patient support apparatus 10 allows for continuous and self-directed physical therapy treatment.
  • Patient support apparatus 10 increases the level of difficulty of physical therapy routine 24 to an appropriate amount of challenge as occupant 12 physically progresses. This approach accelerates patient mobility and leads to reduced hospital stays, reduced re-admissions, and reduced need for caregiver intervention.
  • patient support apparatus 10 includes a control unit 14 , a sensor unit 16 , and a graphical display 18 .
  • Graphical display 18 is configured to display graphics 20 that inform or suggest to occupant 12 how to perform a particular physical therapy routine 24 .
  • Sensor unit 16 is operable to detect the position and the movement of one or more limbs 22 of occupant 12 as occupant 12 performs the particular physical therapy routine 24 .
  • Sensor unit 16 transfers information relating to the position and movement of limbs 22 of occupant 12 to control unit 14 .
  • Control unit 14 compares the information received from sensor unit 16 with information stored in control unit 14 .
  • Control unit 14 then updates graphics 20 on graphical display 18 based on the information received from sensor unit 16 and the comparison made between the information received from sensor unit 16 and the information stored in control unit 14 . Occupant 12 is informed as to what movements are required to continue physical therapy routine 24 by the updated graphics 20 . For example, graphics 20 may indicate that the occupant 12 needs to increase a range of motion or a repetition rate to meet the current target of the physical therapy routine 24 .
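  • A minimal sketch of this compare-then-update step is given below. It is illustrative only: the data structures (LimbSample, IdealStep), tolerances, and feedback messages are assumptions, since the patent does not specify data formats or thresholds.

```python
from dataclasses import dataclass
import math

@dataclass
class LimbSample:
    x: float          # sensed limb position (e.g., from camera or accelerometer), in cm
    y: float
    z: float
    t: float          # time elapsed in the current routine step, in seconds

@dataclass
class IdealStep:
    x: float          # idealized limb position stored in the memory device
    y: float
    z: float
    max_time: float   # idealized time allowed to reach the position
    tolerance: float  # allowed positional error, in cm

def step_feedback(sample: LimbSample, ideal: IdealStep) -> str:
    """Compare one sensed sample with the idealized target and return
    a message the graphical display could show to the occupant."""
    error = math.dist((sample.x, sample.y, sample.z), (ideal.x, ideal.y, ideal.z))
    if error <= ideal.tolerance and sample.t <= ideal.max_time:
        return "Good! Move to the next position."
    if error > ideal.tolerance:
        return "Increase your range of motion to reach the target."
    return "Try to reach the target a little faster."

# Example: the occupant is 4 cm short of the target and slightly slow.
print(step_feedback(LimbSample(10.0, 0.0, 0.0, 3.5),
                    IdealStep(14.0, 0.0, 0.0, 3.0, 2.0)))
```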
  • Patient support apparatus 10 continues to guide occupant 12 through physical therapy routine 24 until physical therapy routine 24 is terminated.
  • physical therapy routine 24 may be terminated when occupant 12 completes physical therapy routine 24 .
  • Physical therapy routine 24 may be terminated if patient support apparatus 10 determines that occupant 12 is unable to complete the physical therapy routine 24 .
  • physical therapy routine 24 may be terminated if occupant 12 is unable to adequately increase their range of motion or complete the required number of repetitions as required by physical therapy routine 24 .
  • occupant 12 may voluntarily and prematurely quit performing physical therapy routine 24 . In such instances, patient support apparatus 10 may automatically terminate physical therapy routine 24 and produce an alarm signal to alert caregiver 13 that physical therapy routine 24 has been terminated.
  • patient support apparatus 10 further includes a main communications port 26 .
  • Main communications port 26 communicates information relating to the performance of physical therapy routine 24 by occupant 12 between patient support apparatus 10 and a computer network 28 , as shown in FIGS. 2 and 4 .
  • Control unit 14 included in patient support apparatus 10 is shown in FIG. 2 .
  • Control unit 14 controls the progression of physical therapy routine 24 .
  • control unit 14 is integrated with a main control unit (not shown) of patient support apparatus 10 .
  • control unit 14 is configured to control only physical therapy routine 24 .
  • Control unit 14 receives information from sensor unit 16 and updates graphics 20 displayed on graphical display 18 in response to the information received from sensor unit 16 .
  • control unit 14 includes a memory device 32 and a processor 30 .
  • Memory device 32 stores information electronically and is configured to communicate with processor 30 .
  • Processor 30 is configured to receive information from memory device 32 and transmit information to memory device 32 to store.
  • the information stored on memory device 32 includes instructions to initiate, progress, and end physical therapy routine 24 .
  • Memory device 32 includes an idealized set of positions and motions of limbs 22 of occupant 12 performing physical therapy routine 24 and supported by patient support apparatus 10 .
  • memory device 32 further includes an unchanging unique identifier.
  • the unique identifier is unique for each control unit 14 .
  • Information transmitted to computer network 28 from control unit 14 includes the unique identifier.
  • the idealized set of positions included in memory device 32 may include ideal Cartesian coordinates of an arm 22 A of occupant 12 for each position of physical therapy routine 24 .
  • the idealized set of motions includes an ideal time that it should take occupant 12 to move between each position of physical therapy routine 24.
  • the idealized set of positions and motions are based upon statistical averages of human anatomy.
  • the idealized set of positions and motions are based upon the anatomy of occupant 12 .
  • the idealized set of positions and motions are based upon the length of the arms and legs of occupant 12 or the age and physical fitness of occupant 12 .
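  • One way this adjustment of the idealized set could be expressed is by scaling population-average reference values for a particular occupant, as in the hedged sketch below; the reference arm length, the age adjustment, and the function name are invented for illustration and are not taken from the patent.

```python
# Illustrative only: scale a population-average target position by the
# occupant's arm length and relax the timing target with age.
def personalize_target(ref_xyz, ref_time_s, arm_length_cm, age_years,
                       ref_arm_length_cm=70.0):
    scale = arm_length_cm / ref_arm_length_cm
    xyz = tuple(coord * scale for coord in ref_xyz)
    # Allow 1% more time per year of age above 40, capped at +50% (assumed).
    slack = min(0.5, max(0.0, (age_years - 40) * 0.01))
    return xyz, ref_time_s * (1.0 + slack)

print(personalize_target((14.0, 0.0, 0.0), 3.0, arm_length_cm=63.0, age_years=72))
```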
  • Processor 30 included in control unit 14 communicates with graphical display 18 , memory device 32 , and sensor unit 16 .
  • Processor 30 receives information from sensor unit 16 during physical therapy routine 24 . At least some of the information received from sensor unit 16 includes information relating to the position and motion of limbs 22 of occupant 12 . Additionally, processor 30 receives information from memory device 32 relating to an idealized set of positions and motions of occupant 12 performing physical therapy routine 24 while supported by patient support apparatus 10 .
  • Processor 30 compares the information relating to the position and motion of limbs 22 of occupant 12 with the idealized set of positions and motions of occupant 12 performing physical therapy routine 24 .
  • Processor 30 communicates with graphical display 18 to update graphics 20 displayed on graphical display 18 based upon the information relating to the position and motion of limbs 22 of occupant 12 and the comparison of the information relating to the position and motion of limbs 22 of occupant 12 with the idealized set of position and motion of an occupant of patient support apparatus 10 performing physical therapy routine 24 .
  • processor 30 is further configured to update graphical display 18 based upon a physical fitness of occupant 12 .
  • the physical fitness of occupant 12 may include information relating to at least an age of occupant 12 , a weight of occupant 12 , a height of occupant 12 , any medicines prescribed to occupant 12 , and a daily physical fitness level of occupant 12 .
  • patient support apparatus 10 includes a main communications port 26
  • the physical fitness of occupant 12 may be received from computer network 28 .
  • the physical fitness of occupant 12 may be determined from a medical record including information relating to the physical fitness of occupant 12 , where the medical record is received by patient support apparatus 10 from computer network 28 .
  • the physical fitness of occupant 12 is received from computer network 28 automatically.
  • Processor 30 is configured to increase the level of difficulty of physical therapy routine 24 .
  • Processor 30 determines the level of difficulty of physical therapy routine 24 based upon the information relating to the position and motion of limbs 22 of occupant 12 and the comparison of the information relating to the position and motion of limbs 22 of occupant 12 with the idealized set of position and motion of an occupant of patient support apparatus 10 performing physical therapy routine 24 .
  • the level of difficulty is further based upon the physical fitness of occupant 12 .
  • Processor 30 increases the level of difficulty of physical therapy routine 24 by updating graphical display 18 in a way that requires occupant 12 to move with a greater speed, range of motion, and/or with more force to complete physical therapy routine 24 .
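  • A hedged sketch of such a progression from a first set of targets to a second, harder set is shown below; the 10% step and the 0.9 achievement ratio are placeholders, not values from the patent.

```python
# Illustrative difficulty progression: advance to a second speed, range of
# motion, and force only when the occupant meets the current targets.
def next_difficulty(targets, achieved):
    """targets/achieved are dicts with 'speed', 'range', and 'force' keys."""
    met_all = all(achieved[k] >= 0.9 * targets[k] for k in targets)
    if not met_all:
        return dict(targets)                          # keep the first targets
    return {k: v * 1.10 for k, v in targets.items()}  # second, harder targets

first = {"speed": 0.25, "range": 30.0, "force": 10.0}
print(next_difficulty(first, {"speed": 0.24, "range": 29.0, "force": 11.0}))
```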
  • processor 30 increases the level of difficulty of physical therapy routine 24 for occupant 12 during physical therapy routine 24. In some embodiments, processor 30 increases the level of difficulty of physical therapy routine 24 for the next physical therapy routine 24 performed by occupant 12 subsequent to a successfully completed physical therapy routine 24.
  • processor 30 disables certain features of patient support apparatus 10 while occupant 12 performs physical therapy routine 24 .
  • Processor 30 ends physical therapy routine 24 if the information from other devices (not shown) on patient support apparatus 10 or sensor unit 16 suggests occupant 12 is in a hazardous position.
  • patient support apparatus 10 may automatically end physical therapy routine 24 if occupant 12 has achieved a position in which occupant 12 is at excessive risk of falling from patient support apparatus 10 .
  • patient support apparatus 10 may automatically produce an alarm signal to alert caregiver 13 in response to detecting that occupant 12 is in a hazardous position.
  • processor 30 creates data 34 relating to the statistics of occupant 12 performing physical therapy routine 24 .
  • the statistics may be, for example, a score indicative of the caliber of the performance of occupant 12 or the number of repetitions of a particular physical therapy routine 24 performed by occupant 12 .
  • the statistics created by processor 30 include physiological measurements of occupant 12 during physical therapy routine 24 , for example, the heart rate of occupant 12 .
  • data 34 includes the progression of occupant 12 over the course of performing several progressive physical therapy routines 24 .
  • Processor 30 transmits data 34 relating to the statistics of occupant 12 to memory device 32 for memory device 32 to store.
  • data 34 relating to the statistics of occupant 12 may be transmitted to computer network 28 .
  • data 34 is transmitted to computer network 28 automatically.
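  • The kind of record data 34 might contain when it is stored and forwarded to computer network 28, tagged with the unchanging unique identifier mentioned above, is sketched below; the field names and the JSON encoding are assumptions, since the patent does not define a storage or wire format.

```python
import json
import time

# Hypothetical statistics record for one physical therapy session.
def build_session_record(unique_id, repetitions, score, heart_rate_bpm):
    return {
        "control_unit_id": unique_id,   # unchanging unique identifier
        "timestamp": time.time(),
        "repetitions": repetitions,
        "score": score,
        "heart_rate_bpm": heart_rate_bpm,
    }

record = build_session_record("CU-000123", repetitions=12, score=87, heart_rate_bpm=92)
payload = json.dumps(record)            # what the transmitter might send
print(payload)
```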
  • Graphical display 18 informs occupant 12 as to how to perform physical therapy routine 24 .
  • Graphical display 18 displays graphics 20 in response to input from processor 30 .
  • Graphics 20 displayed on graphical display 18 inform or suggest to occupant 12 how to perform physical therapy routine 24 .
  • graphical display 18 includes speakers 23 that audibly inform occupant 12 as to how to perform physical therapy routine 24.
  • Speakers 23 also produce audible sounds to enhance the experience of physical therapy routine 24 .
  • graphical display 18 expressly informs occupant 12 how to perform physical therapy routine 24 .
  • graphical display 18 displays and audibly produces words describing how occupant 12 should position and move their limbs 22 .
  • graphical display 18 indirectly informs or suggests to occupant 12 how to perform physical therapy routine 24 .
  • graphical display 18 displays graphics 20 of a person performing physical therapy routine 24 , suggesting that occupant 12 should position and move their limbs 22 in a manner similar to the person displayed on graphical display 18 .
  • graphical display 18 displays graphic 20 in a specific location on graphical display 18 , suggesting occupant 12 should position and move at least one of their limbs 22 to mirror graphic 20 .
  • Graphical display 18 may be any device capable of informing occupant 12 how to perform physical therapy routine 24 .
  • graphical display 18 is an electronic visual display capable of rendering different graphics, for example a touch-screen monitor.
  • graphical display 18 includes pre-determined shapes that are configured to be illuminated.
  • graphical display 18 may be a television or a mat 150 including images of hands and feet where a specific image of a hand is illuminated when occupant 12 needs to place a hand 22 H over the illuminated image of a hand to perform physical therapy routine 24 .
  • when graphical display 18 is included in sensor unit 16, for example when sensor unit 16 is mat 150, occupant 12 is supported by graphical display 18.
  • a number of graphical displays 18 are positioned at several locations on patient support apparatus 10.
  • graphical display 18 may be one or more television screens or illuminated graphics 20 located on a side rail 40 , a headboard 42 , or a footboard 44 of patient support apparatus 10 .
  • graphical display 18 is positioned at a single location on patient support apparatus 10.
  • graphical display 18 is coupled to footboard 44 in FIG. 1 .
  • graphical display 18 rotates so that graphical display 18 faces toward occupant 12 positioned in patient support apparatus 10 or faces away from occupant 12, for example, toward a caregiver (not shown) positioned between occupant 12 and graphical display 18.
  • patient support apparatus 10 includes main communications port 26 .
  • Main communications port 26 enables patient support apparatus 10 to communicate with computer network or system 28 of a healthcare facility as indicated diagrammatically in FIG. 4 by double-headed arrows 50 .
  • Main communications port 26 may communicate with computer network 28 through a wired or wireless datalink.
  • computer network 28 includes a nurse call system 52, an electronic medical record database 54, a nurse call/locating badge 56, one or more computers programmed with workflow process software 58 (such as, for example, NaviCare® software, which is available from Hill-Rom Company, Inc.), one or more personal digital assistants (PDAs) 60, one or more voice communications badges 62, and one or more pagers 64.
  • nurse call system 52 and badges 56 are of the type available as part of the ComLinxTM system from Hill-Rom Company, Inc.
  • Main communications port 26 includes a transmitter 70 and a receiver 72 .
  • Main communications port 26 is configured to communicate with one or more computers in computer network 28 via transmitter 70 .
  • transmitter 70 transmits data 34 relating to statistics of occupant 12 performing physical therapy routine 24 along with measurements from a physiological sensor 78 .
  • Transmitter 70 is also configured to transmit an alarm signal to one or more computers in computer network 28 to alert a caregiver 13 of the status of occupant 12 , for example, if occupant 12 is no longer performing physical therapy routine 24 .
  • Receiver 72 included in main communications port 26 receives information from the one or more computers in computer network 28 .
  • receiver 72 receives information relating to the physical fitness of occupant 12 .
  • the information relating to the physical fitness of occupant 12 may be from a medical record of occupant 12 stored in a database of computer network 28 .
  • Processor 30 is configured to update graphics 20 on graphical display 18 to instruct occupant 12 to perform physical therapy routine 24 with a difficulty suitable for the physical fitness of occupant 12 .
  • Sensor unit 16 is configured to sense the position and motion of limbs 22 of occupant 12 and transmit information relating to the position and motion of limbs 22 of occupant 12 to processor 30 as shown in FIGS. 2 and 3 .
  • Sensor unit 16 includes at least one sensing device 74 and a sensor communications port 76 .
  • sensor unit 16 further includes at least one of physiological sensor 78 , an alarm 80 , and an auxiliary feedback device 82 .
  • sensor unit 16 is a portable device physically uncoupled from patient support apparatus 10.
  • sensor unit 16 may be a tablet computer or handheld controller.
  • sensor unit 16 is omitted and components of sensor unit 16 are integral with patient support apparatus 10 .
  • sensing devices 74 may be integral with side rails 40 or footboard 44 .
  • Sensor communications port 76 communicates information between sensor unit 16 and processor 30 .
  • Sensor communications port 76 includes a transmitter 86 that transmits information to processor 30.
  • the information transmitted to processor 30 includes, for example, the information relating to the position and movement of limbs 22 of occupant 12 and measurements from physiological sensor 78 .
  • sensor communications port 76 includes a receiver 88 configured to receive information from processor 30 .
  • the information received from processor 30 includes, for example, instructions to control sensing device 74 , physiological sensor 78 , alarm 80 , and auxiliary feedback device 82 .
  • Sensor communications port 76 may communicate with processor 30 through a wired or wireless datalink.
  • Sensing device 74 is configured to detect at least information relating to the position and motion of limbs 22 of occupant 12 .
  • Sensing device 74 may be one of a number of types of sensors.
  • sensing device 74 may include an accelerometer 100 , a camera 120 , a weight sensor 110 , radio frequency sensor 140 , and a switch 130 .
  • sensor unit 16 includes multiple sensing devices 74 .
  • sensor unit 16 includes at least one accelerometer 100 and at least one switch 130 .
  • sensor unit 16 includes a video camera 120 and a number of weight sensors 110 .
  • sensing device 74 is at least one camera 120 as shown in FIG. 1 .
  • Camera 120 is configured to record images of occupant 12 .
  • the images of occupant 12 are communicated to processor 30 .
  • Processor 30 is configured to determine the movement and position of the limbs 22 of occupant 12 based on the images recorded by camera 120 .
  • the movement and position of limbs of occupant 12 are determined by the images and a comparison between consecutive images.
  • two cameras are used for a larger field of view or to enable processor 30 to determine depth of objects in the field of view of cameras 120 .
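  • A very small illustration of comparing consecutive images to estimate limb motion follows; it merely tracks the centroid of bright pixels across two synthetic frames, standing in for the real computer vision a camera-based embodiment would use, and every name in it is hypothetical.

```python
# Hedged sketch: estimate motion as the shift of the bright-pixel centroid
# between two consecutive frames represented as lists of pixel rows.
def centroid(frame):
    pts = [(r, c) for r, row in enumerate(frame)
           for c, v in enumerate(row) if v > 0]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def displacement(prev_frame, next_frame):
    (r0, c0), (r1, c1) = centroid(prev_frame), centroid(next_frame)
    return (r1 - r0, c1 - c0)      # rows moved, columns moved

frame_a = [[0, 1, 0], [0, 1, 0], [0, 0, 0]]   # "limb" in the middle column
frame_b = [[0, 0, 1], [0, 0, 1], [0, 0, 0]]   # limb moved one column to the right
print(displacement(frame_a, frame_b))          # -> (0.0, 1.0)
```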
  • Camera 120 is configured to be mounted at any location where the viewing area of camera 120 includes at least one of the limb 22 of occupant 12 required to perform physical therapy routine 24 .
  • cameras 120 may be mounted on the side rails 40 , headboard 42 , or footboard 44 of patient support apparatus 10 .
  • Camera 120 may be independently mounted to a tripod 122 next to patient support apparatus 10 .
  • Camera 120 may be any device capable of recording images of occupant 12 .
  • camera 120 may be a video camera, a digital camera, a depth camera, or stereo cameras. Camera 120 may detect in the visible light spectrum or in the infrared light spectrum.
  • sensing device 74 is accelerometer 100 as shown, as an example, in FIG. 5 .
  • Accelerometer 100 is able to measure acceleration of sensor unit 16 in a first directional axis 102 .
  • when sensor unit 16 includes accelerometers 100, sensor unit 16 includes three accelerometers 100 A, 100 B, and 100 C to receive acceleration information about three axes.
  • Accelerometer 100 A is able to measure acceleration in first directional axis 102 .
  • Accelerometer 100 B is able to measure acceleration in a second directional axis 104 orthogonal to first directional axis 102 .
  • Accelerometer 100 C is able to measure acceleration in a third directional axis 106 orthogonal to both first directional axis 102 and second directional axis 104 .
  • occupant 12 holds sensor unit 16 in one or both hands 22 H.
  • graphical display 18 instructs occupant 12 as to which hand or hands 22 H should hold sensor unit 16 .
  • Occupant 12 performing physical therapy routine 24 moves arms 22 A and hands 22 H as required to complete physical therapy routine 24 .
  • Accelerometers 100 A, 100 B, 100 C measure the acceleration of sensor unit 16 as occupant 12 moves hands 22 H.
  • Processor 30 uses the acceleration measurements to determine relative position of limbs 22 of occupant 12 between positions. As such, sensor unit 16 is able to measure the acceleration of limbs 22 of occupant 12 as well as the relative position of limbs 22 of occupant 12 during physical therapy routine 24 .
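  • Relative position can be recovered from three-axis acceleration samples by numerical integration, as in the simplified sketch below; it assumes a fixed sample interval and ignores the gravity compensation, drift correction, and filtering that a practical implementation would need.

```python
# Hedged sketch: double-integrate acceleration samples (m/s^2) taken at a
# fixed interval dt (s) to estimate relative displacement along one axis.
def integrate_axis(accel_samples, dt):
    velocity, position = 0.0, 0.0
    for a in accel_samples:
        velocity += a * dt          # first integration: acceleration -> velocity
        position += velocity * dt   # second integration: velocity -> position
    return position

def relative_displacement(ax, ay, az, dt=0.01):
    return tuple(integrate_axis(axis, dt) for axis in (ax, ay, az))

# Example: a brief push along the first directional axis only.
ax = [1.0] * 50 + [-1.0] * 50       # accelerate, then decelerate
print(relative_displacement(ax, [0.0] * 100, [0.0] * 100))
```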
  • sensing device 74 is a radio frequency sensor 140 as shown in FIG. 5 .
  • Radio frequency sensor 140 transmits and receives radio waves to determine a point in space where sensor unit 16 is pointing.
  • radio frequency sensor 140 is included in an end 142 of sensor unit 16 .
  • End 142 of sensor unit 16 is pointed at graphical display 18 coupled to footboard 44 of patient support apparatus 10 .
  • Processor 30 determines a point 144 on graphical display 18 at which sensor unit 16 is being pointed based on the information from radio frequency sensor 140.
  • Point 144 appears as a cursor 146 on graphical display 18 .
  • Occupant 12 is able to use sensor unit 16 including radio frequency sensor 140 as a pointer to control graphic display 18 .
  • sensor unit 16 is elongated and configured to be held in one hand 22 H of occupant 12 .
  • Sensor unit 16 includes radio frequency sensor 140 in end 142 of sensor unit 16 .
  • Occupant 12 is able to orient sensor unit 16 such that cursor 146 appears on graphical display 18 at a location desired by occupant 12 .
  • graphical display 18 displays several menu options.
  • Occupant 12 orients sensor unit 16 such that cursor 146 highlights a desired menu option on graphical display 18 .
  • Occupant 12 selects the highlighted menu option by depressing a button on sensor unit 16 while the menu option is highlighted.
  • a number of graphics 20 are displayed on graphical display 18 and occupant 12 is instructed to orient sensor unit 16 such that cursor 146 is moved towards and virtually contacts graphics 20 .
  • as cursor 146 is moved towards and virtually contacts graphics 20, the hand 22 H and wrist of occupant 12 orienting sensor unit 16 are strengthened.
  • sensing device 74 is a weight sensor 110 as shown in FIG. 6 .
  • Weight sensor 110 is capable of measuring an amount of weight applied to weight sensor 110 .
  • a number of weight sensors 110 are included in patient support apparatus 10.
  • patient support apparatus 10 includes a number of weight sensors 110 in a deck section 46 of patient support apparatus 10 .
  • Occupant 12 is supported by deck section 46 and, as such, the weight of occupant 12 is distributed to the number of weight sensors 110 . As occupant 12 moves in patient support apparatus 10 , the weight of occupant 12 is redistributed to the number of weight sensors 110 . The number of weight sensors 110 communicate the weight of occupant 12 measured by each weight sensor 110 to processor 30 . Processor 30 is configured to determine the movement and position of the limbs 22 of occupant 12 based on the change in the weight measured by each weight sensor 110 when occupant 12 moves.
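  • The redistribution of weight over the deck can be summarized as a center-of-pressure estimate, roughly as sketched below; the sensor coordinates and readings are invented for illustration and are not part of the patent's description.

```python
# Hedged sketch: estimate the occupant's center of pressure from four
# weight sensors at known (x, y) corners of the deck section. A shift of
# the center of pressure between readings indicates movement.
def center_of_pressure(readings):
    """readings: list of ((x, y), weight) tuples, weight in kg."""
    total = sum(w for _, w in readings)
    x = sum(pos[0] * w for pos, w in readings) / total
    y = sum(pos[1] * w for pos, w in readings) / total
    return x, y

corners = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (1.0, 2.0)]   # meters (assumed layout)
before = center_of_pressure(list(zip(corners, [20, 20, 20, 20])))
after = center_of_pressure(list(zip(corners, [30, 10, 30, 10])))  # weight shifted to one side
print(before, after)
```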
  • sensing device 74 is a switch 130 as shown in FIGS. 7 and 8 .
  • Switch 130 has an active position and an inactive position.
  • Switch 130 is in the active position when switch 130 is closed by occupant 12 .
  • Switch 130 is in the inactive position when switch 130 is opened by occupant 12 .
  • Sensor unit 16 is configured to communicate the position of switch 130 to processor 30 .
  • Switch 130 may be any device capable of changing from an inactive position to an active position by occupant 12 .
  • switch 130 may be a push button 130 P, a foot pedal switch 130 F, or a touch-screen 130 TS.
  • a number of switches 130 are coupled to patient support apparatus 10 .
  • Push buttons 130 P are initially in the inactive position. Occupant 12 can move push buttons 130 P to the active position by depressing push buttons 130 P.
  • push buttons 130 P are coupled to side rails 40 , headboard 42 , and footboard 44 . A limb 22 of occupant 12 is able to reach out and depress, and thus move to the active position, any of the number of push buttons 130 P.
  • Patient support apparatus 10 informs occupant 12 as to which push button 130 P to depress, how long to depress it, and which limb 22 to use to depress push button 130 P.
  • push button 130 P illuminates a graphic 20 of a limb 22 of occupant 12 to indicate occupant 12 should depress the illuminated push button 130 P with limb 22 until push button 130 P is no longer illuminated.
  • sensor unit 16 includes a number of push buttons 130 P to enable occupant 12 to communicate with patient support apparatus 10 .
  • occupant 12 depresses a first push button 130 P to answer in the affirmative in response to a prompt on graphical display 18 .
  • Occupant 12 instead depresses a second push button 130 P to answer in the negative in response to the prompt on graphical display 18 .
  • switches 130 are included in graphical display 18 .
  • sensing device 74 has variable resistance such that occupant 12 must press upon sensing device 74 with enough force to overcome a predetermined threshold value.
  • sensing device 74 may have a constant resistance rate such that sensing device 74 resists forces applied to sensing device 74 by occupant 12 in a linear relationship and occupant 12 must apply a force to sensing device 74 greater than the threshold value to progress the physical therapy routine 24 .
  • sensing device 74 may actively resist any force applied to sensing device 74 by occupant 12 such that sensing device 74 applies a greater force to occupant 12 than occupant 12 applies to sensing device 74 and occupant 12 must apply a force to sensing device 74 greater than the threshold value to progress the physical therapy routine 24 .
  • Sensing device 74 may apply a greater force than occupant 12 , for example, by an actuator coupled to sensing device 74 .
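  • A hedged sketch of the threshold behavior described above follows: the routine advances only when the occupant's applied force exceeds the current resistance threshold, which can be raised to increase difficulty. The step size and cap are assumptions.

```python
# Illustrative threshold check for a resistive switch or sensing device.
def switch_activated(applied_force_n, threshold_n):
    return applied_force_n > threshold_n       # force must overcome the resistance

def raise_threshold(threshold_n, step_n=2.0, max_n=40.0):
    return min(threshold_n + step_n, max_n)    # step and cap are assumed values

threshold = 10.0
print(switch_activated(12.5, threshold))       # True: the routine progresses
threshold = raise_threshold(threshold)         # harder on the next attempt
print(threshold)
```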
  • sensing device 74 is a proximity sensor 90 .
  • Proximity sensor 90 detects when proximity sensor 90 is proximate to patient support apparatus 10 .
  • Proximity sensor 90 may be any device capable of detecting the proximity of proximity sensor 90 to patient support apparatus 10 .
  • patient support apparatus 10 and proximity sensor 90 are in wireless communication with each other.
  • patient support apparatus 10 has a specific wireless communication range. Patient support apparatus 10 is able to detect proximity sensor 90 so long as proximity sensor 90 is within the specific wireless communication range of patient support apparatus 10 .
  • sensor unit 16 is assigned to a specific patient support apparatus.
  • Alarm 80 is activated if proximity sensor 90 is moved to a location outside of a specified range of patient support apparatus 10 . This is helpful, for example, to reduce the chance of losing sensor unit 16 by activating alarm 80 if sensor unit 16 is moved away from the assigned patient support apparatus 10 .
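  • The proximity check could be as simple as the sketch below; the allowed range value and the function name are assumptions, since the patent only refers to "a specified range" of the assigned apparatus.

```python
# Hedged sketch: sound alarm 80 when the sensor unit reports that it is
# outside the allowed range of its assigned patient support apparatus.
def check_proximity(distance_m, allowed_range_m=5.0):
    if distance_m > allowed_range_m:
        return "ALARM: sensor unit moved away from its assigned apparatus"
    return "ok"

print(check_proximity(2.0))
print(check_proximity(7.5))
```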
  • patient support apparatus 10 includes physiological sensor 78 as shown in FIG. 3 .
  • Physiological sensor 78 measures at least one vital sign of occupant 12 .
  • physiological sensor 78 may be a heart rate monitor, a thermometer, or a respiration sensor.
  • Physiological sensor 78 measures, for example, the heart rate, body temperature, respiration rate, skin temperature of occupant 12 , or the amount of calories burned by occupant 12 during physical therapy routine 24 .
  • physiological sensor 78 is coupled to sensor unit 16 as shown in FIG. 3 .
  • physiological sensor 78 is separated from sensor unit 16 and communicates with control unit 14 independently.
  • the measurements of the vital signs of occupant 12 are communicated to processor 30 .
  • processor 30 uses the measurements of the vital signs of occupant 12 to help determine what graphics 20 are displayed on graphical display 18 .
  • graphical display 18 displays graphics 20 that inform occupant 12 to perform a less physically demanding physical therapy routine 24 in response to the heart rate measurement of occupant 12 .
  • Processor 30 is configurable to activate an alarm 80 based on the value of the measurements. For example, processor 30 activates alarm 80 if the heart rate measurement of occupant 12 exceeds a predetermined threshold value.
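  • A minimal sketch of this kind of physiological response logic is shown below; the numeric thresholds are purely illustrative, and clinically appropriate limits would come from the occupant's care plan rather than from this example.

```python
# Hedged sketch: react to a heart rate measurement by easing the routine
# or activating alarm 80 when assumed thresholds are exceeded.
def heart_rate_action(bpm, ease_above=120, alarm_above=150):
    if bpm > alarm_above:
        return "activate alarm 80"
    if bpm > ease_above:
        return "display a less demanding physical therapy routine"
    return "continue current routine"

for bpm in (95, 130, 160):
    print(bpm, "->", heart_rate_action(bpm))
```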
  • sensor unit 16 includes alarm 80 as shown in FIG. 3 .
  • Alarm 80 may be any device capable of alerting occupant 12 and caregiver 13 .
  • alarm 80 is audible and visible.
  • Alarm 80 is configured to be activated by a number of conditions. As an example, alarm 80 is activated if a vital sign measurement from physiological sensor 78 exceeds a predefined threshold. As another example, alarm 80 is activated if sensor unit 16 is separated from patient support apparatus 10 by a distance larger than a predetermined range.
  • sensor unit 16 includes an auxiliary feedback device 82 as shown in FIG. 3 .
  • Auxiliary feedback device 82 may be any device capable of providing feedback to occupant 12 performing physical therapy routine 24 .
  • auxiliary feedback device 82 is capable of vibrating sensor unit 16 .
  • Sensor unit 16 vibrates, for example, if occupant 12 incorrectly performs physical therapy routine 24 .
  • auxiliary feedback device 82 may be a speaker, a light, or an aroma.
  • patient support apparatus 10 incorporates controls or buttons for controlling other features included in patient support apparatus 10 into physical therapy routine 24 . Incorporating the controls of features included in patient support apparatus 10 and outside of sensor unit 16 into physical therapy routine 24 increases the scope of the physical anatomy of occupant 12 that can be treated by physical therapy routine 24 . Additionally, the sensory perception of occupant 12 may be tested and treated by physical therapy routine 24 by incorporating the controls of features outside of sensor unit 16 .
  • physical therapy routine 24 may incorporate the controls for adjusting lights included in patient support apparatus 10 or the angle of a head section of a mattress included in patient support apparatus 10 . These controls are not included in sensor unit 16 and may be included in patient support apparatuses that do not include physical therapy routine 24 .
  • Physical therapy routine 24 may require occupant 12 to adjust the angle of the head section of the mattress via the buttons that control the angle of the head section.
  • Physical therapy routine 24 may progress only after occupant 12 has adjusted the angle of the head section to the required angle. By adjusting the angle of the head section correctly, occupant 12 demonstrates both the mental ability to understand and the physical capability to execute the instructions.
  • sensor unit 16 includes a combination of sensing devices 74 , physiological sensors 78 , alarms 80 , and auxiliary feedback devices 82 .
  • FIG. 1 shows one embodiment of patient support apparatus 10 where sensing device 74 includes a number of cameras 120 .
  • sensing device 74 includes a number of cameras 120 .
  • a first graphic 20 F is displayed on graphical display 18 .
  • Occupant 12 is instructed to point foot 22 F at first graphic 20 F.
  • Cameras 120 record the position and movement of occupant 12 as occupant 12 points foot 22 F at first graphic 20 F.
  • Processor 30 compares the information from cameras 120 to the information relating to the ideal position and movement of an occupant.
  • After processor 30 determines that the position and movement of foot 22 F of occupant 12 are adequate, where the adequacy is based on the comparison between the position and movement of foot 22 F of occupant 12 and the ideal position and movement of an occupant, processor 30 instructs graphical display 18 to remove first graphic 20 F and display a second graphic 20 S on graphical display 18.
  • Second graphic 20 S informs occupant 12 to point foot 22 F at second graphic 20 S.
  • Cameras 120 record the position and movement of occupant 12 as occupant 12 points foot 22 F at second graphic 20 S.
  • Processor 30 compares the information from cameras 120 to the information relating to the ideal position and movement of an occupant. After processor 30 determines that the position and movements of foot 22 F of occupant 12 are adequate, processor 30 increases repetition counter 152 by one repetition.
  • Processor 30 instructs graphical display 18 to remove second graphic 20 S and display first graphic 20 F on graphical display 18 .
  • Physical therapy routine 24 repeats in this manner until occupant 12 has completed a prescribed number of repetitions.
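  • The alternation between the two graphics and the incrementing of the repetition counter can be sketched as the loop below; sensor input is simulated, and the helper names and adequacy test are hypothetical stand-ins for the camera-based comparison described above.

```python
# Hedged sketch of the repetition loop: alternate between two target
# graphics and count a repetition each time the occupant returns from the
# second target to the first adequately.
def run_routine(samples, prescribed_reps, is_adequate):
    """samples: iterable of sensed foot positions; is_adequate: callable
    deciding whether a sample matches the currently displayed graphic."""
    current_graphic = "20F"
    reps = 0
    for sample in samples:
        if not is_adequate(sample, current_graphic):
            continue                     # keep displaying the same graphic
        if current_graphic == "20F":
            current_graphic = "20S"      # remove 20F, display 20S
        else:
            reps += 1                    # back at 20F: one full repetition
            current_graphic = "20F"
        if reps >= prescribed_reps:
            break
    return reps

# Simulated session: every sample is adequate, so 3 repetitions complete.
print(run_routine(range(6), 3, lambda s, g: True))
```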
  • FIG. 5 shows an embodiment of patient support apparatus 10 where sensing device 74 includes a number of accelerometers 100 , radio frequency sensor 140 , and a number of switches 130 .
  • sensing device 74 includes a number of accelerometers 100 , radio frequency sensor 140 , and a number of switches 130 .
  • first graphic 20 F is displayed in the upper-right corner of graphical display 18 .
  • Occupant 12 holds sensor unit 16 in hand 22 H of limb 22 requiring physical therapy.
  • Occupant 12 points end 142 of sensor unit 16 , including radio frequency sensor 140 , at first graphic 20 F and depresses one of the number of switches 130 .
  • Accelerometers 100 and radio frequency sensor 140 work together to accurately record information relating to the position and movement of limb 22 of occupant 12 .
  • Processor 30 compares the information relating to the position and movement of hand 22 H of occupant 12 and information relating to the ideal position and movement of a hand of an occupant. If processor 30 determines that the physical therapy routine 24 position was adequate, processor 30 instructs graphical display 18 to display second graphic 20 S in the upper-left corner of graphical display 18 . Second graphic 20 S informs occupant 12 as to where to move limb 22 to continue physical therapy routine 24 .
  • Occupant 12 moves limb 22 and points sensor unit 16 at second graphic 20 S and depresses one of the number of switches 130 in response to second graphic 20 S appearing on graphical display 18 .
  • Processor 30 receives information from sensor unit 16 relating to the measurements of accelerometers 100 , radio frequency sensor 140 , and switch 130 and compares the information to the idealized values of limbs 22 of an occupant 12 moving sensor unit 16 from first graphic 20 F to second graphic 20 S.
  • Processor 30 determines a score 92 based on the difference between the information relating to the movement and position of limbs 22 of occupant 12 and the idealized movement and positions of an occupant.
  • a third graphic 20 T and score 92 of occupant 12 are displayed on graphical display 18 in response to the comparison made by processor 30.
  • speaker 23 of graphical display 18 may produce an audible sound indicating that occupant 12 performed the physical therapy routine adequately.
  • graphical display 18 may remain the same, for example, if processor 30 determines the movement and position of limbs 22 of occupant 12 were not sufficient to complete that physical therapy routine 24 position. Speaker 23 produces an audible sound indicating that occupant 12 did not perform the physical therapy routine adequately.
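  • One plausible way to derive score 92 from the deviation between the sensed and idealized movements is sketched below; the 100-point scale and the penalty weight are assumptions made for illustration, not values specified in the patent.

```python
# Hedged sketch: score the performance by averaging the per-step deviation
# between the sensed path and the idealized path, then subtracting a penalty.
def performance_score(actual_path, ideal_path, penalty_per_cm=2.0):
    """Both paths are equal-length lists of (x, y, z) positions in cm."""
    deviation = sum(
        sum(abs(a - b) for a, b in zip(p_actual, p_ideal))
        for p_actual, p_ideal in zip(actual_path, ideal_path)
    ) / len(ideal_path)
    return max(0.0, 100.0 - penalty_per_cm * deviation)

ideal = [(0, 0, 0), (10, 0, 0), (20, 0, 0)]
actual = [(0, 0, 0), (9, 1, 0), (18, 2, 0)]
print(performance_score(actual, ideal))   # -> 96.0
```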
  • FIG. 6 shows an embodiment of patient support apparatus 10 where sensing device 74 includes a number of weight sensors 110 .
  • sensing device 74 includes a number of weight sensors 110 .
  • an avatar 148 representing occupant 12 is displayed on graphical display 18 .
  • Avatar 148 is configured to move on graphical display 18 relative to the amount of weight of occupant 12 distributed to each weight sensor 110 .
  • First graphic 20 F is displayed on graphical display 18 informing occupant 12 to move avatar 148 toward first graphic 20 F.
  • Occupant 12 moves limbs 22 to redistribute the weight of occupant 12 distributed to each weight sensor 110 .
  • occupant 12 leans to the left to apply more weight to weight sensors 110 located on a left side of patient support apparatus 10 .
  • Weight sensors 110 record the amount of weight distributed to each weight sensor 110 as information relating to the position and movement of limb 22 of occupant 12 .

Abstract

A patient support apparatus configured to enable an occupant of the patient support apparatus to perform a physical therapy routine while being supported by the patient support apparatus.

Description

    CROSS-REFERENCE TO RELATED U.S. APPLICATION
  • The present application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application Ser. No. 61/774,190, entitled “PATIENT SUPPORT APPARATUS WITH PHYSICAL THERAPY SYSTEM,” which was filed on Mar. 7, 2013, the entirety of which is hereby incorporated by reference.
  • BACKGROUND
  • The present disclosure is related to a patient support apparatus. More specifically, the present disclosure is related to a patient support apparatus configured to enable a patient to perform a physical therapy routine while being supported by the patient support apparatus.
  • In some instances, a physician prescribes physical therapy to a patient to assist the patient's recovery after an injury caused by physical trauma or disease. A nurse, physical therapist, or caregiver may instruct and observe the patient while the patient performs the prescribed physical therapy routine. The patient may perform the physical therapy routine in a variety of locations including a hospital room or the patient's home. Additionally, the patient may perform the physical therapy routine when standing, sitting, or while being supported by a patient support apparatus.
  • One method of administering the prescribed physical therapy routine, and verifying that the physical therapy routine is correctly performed, is to hire a caregiver, trained in physical therapy, to monitor the patient as the patient performs the physical therapy routine. Requiring a caregiver, trained in physical therapy, to supervise the patient can place a large financial burden on the patient. Additionally, a hospital or other facility may have a limited number of caregivers trained in physical therapy. This may lead to fewer physical therapy sessions for all patients in need of physical therapy, or to some patients with marginal need of physical therapy receiving no physical therapy at all.
  • Another method of administering the prescribed physical therapy is to instruct the patient to perform self-guided physical therapy. However, a patient may be unmotivated and choose not to perform the required physical therapy. Additionally, the patient may perform the physical therapy routine incorrectly causing poor results or injury to the patient.
  • Patients prescribed physical therapy may be instructed to remain in bed or may be physically unable to exit a bed. These patients may experience difficulty in performing prescribed physical therapy that requires the patient to be out of bed or incorporates the use of heavy equipment. Additionally, a caregiver trained in physical therapy may have a building or area within a building dedicated to physical therapy sessions. The caregiver may have to relocate to the patient's room due to the inability of the patient to exit their bed. The travel associated with the caregiver relocating may reduce the total time available to all patients for physical therapy or increase the financial burden of the patient receiving the physical therapy. By relocating to the patient's room, the caregiver may be limited to assisting one patient instead of multiple patients in a group therapy session. As such, traditional physical therapy may be difficult or costly for some patients and may lead to less physical therapy for all patients.
  • SUMMARY
  • The present application discloses one or more of the features recited in the appended claims and/or the following features which, alone or in any combination, may comprise patentable subject matter:
  • According to one aspect of the present disclosure, a patient support apparatus for providing physical therapy to an occupant comprises a processor, a sensor unit, a graphical display, and a memory device. The sensor unit is operable to sense a position and motion of limbs of the occupant in the patient support apparatus and to transmit information representing the position and motion of the limbs of the occupant to the processor. The graphical display is coupled to the processor and configured to display graphics based upon feedback from the processor. The memory device is coupled to the processor and contains information representing an idealized set of positions and motions of limbs of an occupant to be achieved by the occupant while performing a physical therapy routine. The processor is configured to update the graphical display based upon the information representing the position and motion of the limbs of the occupant performing the physical therapy routine received from the sensor unit and a comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
  • In some embodiments, the processor may be configured to update the graphical display to display graphics instructing the occupant to move the limbs of the occupant with at least one of a first speed, a first range of motion, and a first force to progress the physical therapy. In some embodiments, the processor may be configured to update the graphical display to display graphics instructing the occupant to move the limbs of the occupant with at least one of a second speed, a second range of motion, and a second force to progress the physical therapy. The one of the second speed, second range of motion, and second force may be determined by the comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
  • In some embodiments, the information representing the idealized set of positions and motions of limbs of the occupant may be based upon a length of a limb of the occupant. In some embodiments, the information representing the idealized set of positions and motions of limbs of the occupant may be based upon an age of the occupant.
  • In some embodiments, the sensor unit may be wirelessly connected to an assigned patient support apparatus and the sensor unit may sound an audible alarm if the sensor unit is moved to a position outside of a specified range of the assigned patient support apparatus.
  • In some embodiments, the patient support apparatus may include at least one physiological sensor. The physiological sensor may be configured to transmit information representing one of a heart rate, a respiration rate, calories burned, and a temperature of the occupant to the processor.
  • In some embodiments, the sensor unit may include a number of weight sensors. In some embodiments, the sensor unit may be mounted on a member of the patient support apparatus.
  • In some embodiments, the graphical display may be positioned to be visible to the occupant. In some embodiments, the graphical display may be mounted on a member of the patient support apparatus. In some embodiments, the graphical display may be included in the sensor unit.
  • In some embodiments, the sensor unit may include an image-recording device. In some embodiments, the sensor unit may include at least one accelerometer. In some embodiments, the sensor unit may include a radio frequency sensor.
  • In some embodiments, the sensor unit may include at least one switch. The at least one switch may have an inactive and an active position. The occupant may be enabled to move the at least one switch between the inactive position and the active position.
  • In some embodiments, the at least one switch may be configured to offer a resistance against moving between the inactive and active position. In some embodiments, the resistance offered by the switch may be variable.
  • In some embodiments, the processor may create data relating to statistics of the occupant while performing the physical therapy routine. The data may be stored in the memory device. In some embodiments, the data may be automatically transmitted to a computer network of a hospital. In some embodiments, the data relating to statistics of the occupant may include at least one of a heart rate of the occupant, a number of repetitions of the physical therapy routine performed by the occupant, and a score indicative of the caliber of the performance of the occupant.
  • In some embodiments, the sensor unit may include an accelerometer, a switch, and a radio frequency sensor. In some embodiments, the sensor unit may include an accelerometer, a switch, the graphical display, and the processor. In some embodiments, the graphical display may be a touch screen.
  • In some embodiments, the processor may be configured to update the graphical display based on a physical fitness of the occupant. The physical fitness of the occupant may include information relating to at least an age of the occupant, a weight of the occupant, a height of the occupant, any medicines prescribed to the occupant, and a daily physical fitness level of the occupant.
  • In some embodiments, the patient support apparatus may be configured to receive information from a computer network. The physical fitness of the occupant may be determined from a medical record including information relating to the physical fitness of the occupant. The medical record may be received by the patient support apparatus from the computer network.
  • In some embodiments, the processor may be configured to end physical therapy based upon the information representing the position and motion of the limbs of the occupant performing the physical therapy routine received from the sensor unit and a comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device. In some embodiments, the processor may be configured to produce an alarm signal based upon the information representing the position and motion of the limbs of the occupant performing the physical therapy routine received from the sensor unit and a comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
  • In some embodiments, the processor may be configured to update the graphical display based upon information received from the patient support apparatus. In some embodiments, the information received from the patient support apparatus may include one of an angle and a position of a portion of a mattress included in the patient support apparatus.
  • In some embodiments, the processor may be in communication with a computer network. In some embodiments, the memory device may include a unique identifier. Information transmitted to the computer network from the processor may include the unique identifier.
  • According to one aspect of the present disclosure, a method of monitoring an occupant's performance of a physical therapy routine in a patient support apparatus comprises several steps. The steps include displaying graphics on a graphical display instructing the occupant to perform a physical therapy routine, receiving information representing a position and motion of an occupant's limbs, comparing the information representing the position and motion of the occupant's limbs to a set of information representing an optimal position and motion of limbs of an occupant performing the physical therapy routine in the patient support apparatus, and using the information representing the position and motion of the occupant's limbs and the comparison to the set of information representing the optimal position and motion of limbs of an occupant performing the physical therapy routine in a patient support apparatus as parameters to affect the graphics displayed on the graphical display.
  • According to one aspect of the present disclosure, a patient support apparatus for providing physical therapy to an occupant supported by the patient support apparatus comprises a control unit, a sensor, and a graphical display. The sensor is operable to sense the position and motion of a limb of the occupant in the patient support apparatus and to transmit information representing the position and motion of the limb of the occupant to the control unit. The graphical display is coupled to the control unit and configured to display graphics based upon instructions from the control unit. The control unit is configured to monitor the information representing the position and motion of the limb of the occupant and transmit instructions to the graphical display based on the information representing the position and motion of the limb of the occupant.
  • In some embodiments, the control unit may be configured to evaluate the position and motion of the limb of the occupant and create data representing a caliber of the performance of the occupant. In some embodiments, the patient support apparatus may be configured to communicate the data representing the caliber of the performance of the occupant to a computer network of a hospital. In some embodiments, the instructions transmitted by the control unit may be based on information stored on the computer network of the hospital.
  • Additional features, which alone or in combination with any other feature(s), including those listed above and those listed in the claims, may comprise patentable subject matter and will become apparent to those skilled in the art upon consideration of the following detailed description of illustrative embodiments exemplifying the best mode of carrying out the invention as presently perceived.
  • BRIEF DESCRIPTIONS OF THE DRAWINGS
  • The detailed description particularly refers to the accompanying figures in which:
  • FIG. 1 is a perspective view of a patient support apparatus in accordance with the present disclosure monitoring an occupant supported by the patient support apparatus performing a physical therapy routine;
  • FIG. 2 is a diagrammatic view of the patient support apparatus of FIG. 1, the patient support apparatus including a control unit, a sensor unit, and a graphical display;
  • FIG. 3 is a diagrammatic view of one embodiment of the sensor unit of FIG. 2 including at least one sensing device and a sensor communications port;
  • FIG. 4 is a diagrammatic view of the patient support apparatus of FIG. 1, the patient support apparatus configured to communicate with a number of devices included in a computer network of a healthcare facility;
  • FIG. 5 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a second embodiment of the sensor unit;
  • FIG. 6 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a third embodiment of the sensor unit;
  • FIG. 7 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a fourth embodiment of the sensor unit;
  • FIG. 8 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a fifth embodiment of the sensor unit; and
  • FIG. 9 is a perspective view of another embodiment of a patient support apparatus in accordance with the present disclosure monitoring the occupant supported by the patient support apparatus performing a physical therapy routine with a sixth embodiment of the sensor unit.
  • DETAILED DESCRIPTION
  • A patient support apparatus 10 is configured to enable an occupant 12 to perform physical therapy and is shown in FIG. 1. In some instances, occupant 12 is prescribed physical therapy to recover from an injury. In the illustrative embodiment, patient support apparatus 10 is configured such that occupant 12 can perform at least one physical therapy routine 24 while being supported by patient support apparatus 10; thus, occupant 12 is not required to exit patient support apparatus 10 to perform at least some of the physical therapy prescribed for occupant 12. Furthermore, patient support apparatus 10 provides instructions for performing physical therapy routine 24 to occupant 12. As such, occupant 12 may perform physical therapy routine 24 alone or under the supervision of a caregiver 13. Patient support apparatus 10 includes a number of devices operable to instruct and monitor the progress of occupant 12 performing physical therapy routine 24 and provide feedback to occupant 12.
  • Patient support apparatus 10 allows for continuous and self-directed physical therapy treatment. Patient support apparatus 10 increases the level of difficulty of physical therapy routine 24 to an appropriate amount of challenge as occupant 12 physically progresses. This approach accelerates patient mobility and leads to reduced hospital stays, reduced re-admissions, and reduced need for caregiver intervention.
  • Referring now to FIG. 2, patient support apparatus 10 includes a control unit 14, a sensor unit 16, and a graphical display 18. Graphical display 18 is configured to display graphics 20 that inform or suggest to occupant 12 how to perform a particular physical therapy routine 24. Sensor unit 16 is operable to detect the position and the movement of one or more limbs 22 of occupant 12 as occupant 12 performs the particular physical therapy routine 24. Sensor unit 16 transfers information relating to the position and movement of limbs 22 of occupant 12 to control unit 14. Control unit 14 compares the information received from sensor unit 16 with information stored in control unit 14. Control unit 14 then updates graphics 20 on graphical display 18 based on the information received from sensor unit 16 and the comparison made between the information received from sensor unit 16 and the information stored in control unit 14. Occupant 12 is informed as to what movements are required to continue physical therapy routine 24 by the updated graphics 20. For example, graphics 20 may indicate that the occupant 12 needs to increase a range of motion or a repetition rate to meet the current target of the physical therapy routine 24.
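  • As one illustrative, non-limiting sketch of the sense-compare-update cycle described above (written in Python; the function names, data layout, and tolerance value are assumptions and are not part of this disclosure), the feedback loop could be expressed as follows:
    # Hypothetical sketch of the sense-compare-update cycle; names and values are illustrative.

    TOLERANCE = 0.05  # maximum allowed deviation between measured and ideal values (assumed units)

    def movement_is_adequate(measured, ideal, tolerance=TOLERANCE):
        """Return True when every measured value is within tolerance of the ideal value."""
        return all(abs(measured[key] - ideal[key]) <= tolerance for key in ideal)

    def next_graphic_index(current_index, adequate):
        """Advance to the next graphic only when the last movement was adequate."""
        return current_index + 1 if adequate else current_index

    # One iteration of the loop with made-up sensor readings.
    ideal_state = {"x": 0.30, "y": 0.60, "duration_s": 1.5}
    measured_state = {"x": 0.32, "y": 0.58, "duration_s": 1.48}

    graphic_index = 0
    graphic_index = next_graphic_index(graphic_index, movement_is_adequate(measured_state, ideal_state))
    print("graphic now displayed:", graphic_index)  # prints 1 because the movement was adequate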
  • Patient support apparatus 10 continues to guide occupant 12 through physical therapy routine 24 until physical therapy routine 24 is terminated. For example, physical therapy routine 24 may be terminated when occupant 12 completes physical therapy routine 24. Physical therapy routine 24 may be terminated if patient support apparatus 10 determines that occupant 12 is unable to complete the physical therapy routine 24. For example, physical therapy routine 24 may be terminated if occupant 12 is unable to adequately increase their range of motion or complete the required number of repetitions as required by physical therapy routine 24. Additionally, occupant 12 may voluntarily and prematurely quit performing physical therapy routine 24. In such instances, patient support apparatus 10 may automatically terminate physical therapy routine 24 and produce an alarm signal to alert caregiver 13 that physical therapy routine 24 has been terminated.
  • In some embodiments, patient support apparatus 10 further includes a main communications port 26. Main communications port 26 communicates information relating to the performance of physical therapy routine 24 by occupant 12 between patient support apparatus 10 and a computer network 28, as shown in FIGS. 2 and 4.
  • Control unit 14 included in patient support apparatus 10 is shown in FIG. 2. Control unit 14 controls the progression of physical therapy routine 24. In some embodiments, control unit 14 is integrated with a main control unit (not shown) of patient support apparatus 10. In other embodiments, control unit 14 is configured to control only physical therapy routine 24. Control unit 14 receives information from sensor unit 16 and updates graphics 20 displayed on graphical display 18 in response to the information received from sensor unit 16. In the illustrative embodiment, control unit 14 includes a memory device 32 and a processor 30.
  • Memory device 32 stores information electronically and is configured to communicate with processor 30. Processor 30 is configured to receive information from memory device 32 and transmit information to memory device 32 to store. The information stored on memory device 32 includes instructions to initiate, progress, and end physical therapy routine 24. Memory device 32 includes an idealized set of positions and motions of limbs 22 of occupant 12 performing physical therapy routine 24 and supported by patient support apparatus 10. In some embodiments, memory device 32 further includes an unchanging unique identifier. The unique identifier is unique for each control unit 14. Information transmitted to computer network 28 from control unit 14 includes the unique identifier.
  • The idealized set of positions included in memory device 32 may include ideal Cartesian coordinates of an arm 22A of occupant 12 for each position of physical therapy routine 24. The idealized set of motions, as another example, includes an ideal time that it should take occupant 12 to move between each position of physical therapy routine 24. In some embodiments, the idealized set of positions and motions are based upon statistical averages of human anatomy. In other embodiments, the idealized set of positions and motions are based upon the anatomy of occupant 12. For example, the idealized set of positions and motions may be based upon the length of the arms and legs of occupant 12 or the age and physical fitness of occupant 12.
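  • As one illustrative, non-limiting way to picture the idealized set of positions and motions described above (the storage format, units, scaling rule, and reference length below are assumptions and are not specified by this disclosure), the stored data could resemble the following sketch:
    # Illustrative representation of an idealized set of positions and motions.
    # Coordinates are in meters relative to the deck; times are in seconds.

    IDEAL_ROUTINE = [
        {"position": (0.20, 0.40, 0.10), "time_to_reach_s": 2.0},
        {"position": (0.20, 0.70, 0.10), "time_to_reach_s": 1.5},
        {"position": (0.50, 0.70, 0.10), "time_to_reach_s": 1.5},
    ]

    def scale_for_occupant(routine, limb_length_m, reference_length_m=0.75):
        """Scale target coordinates by the occupant's limb length (linear scaling is assumed)."""
        factor = limb_length_m / reference_length_m
        return [
            {"position": tuple(coord * factor for coord in step["position"]),
             "time_to_reach_s": step["time_to_reach_s"]}
            for step in routine
        ]

    # Example: adapt the stored routine to an occupant with a 0.60 m arm.
    personalized = scale_for_occupant(IDEAL_ROUTINE, limb_length_m=0.60)
    print(personalized[0]["position"])  # approximately (0.16, 0.32, 0.08)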
  • Processor 30 included in control unit 14 communicates with graphical display 18, memory device 32, and sensor unit 16. Processor 30 receives information from sensor unit 16 during physical therapy routine 24. At least some of the information received from sensor unit 16 includes information relating to the position and motion of limbs 22 of occupant 12. Additionally, processor 30 receives information from memory device 32 relating to an idealized set of positions and motions of occupant 12 performing physical therapy routine 24 while supported by patient support apparatus 10.
  • Processor 30 compares the information relating to the position and motion of limbs 22 of occupant 12 with the idealized set of positions and motions of occupant 12 performing physical therapy routine 24. Processor 30 communicates with graphical display 18 to update graphics 20 displayed on graphical display 18 based upon the information relating to the position and motion of limbs 22 of occupant 12 and the comparison of the information relating to the position and motion of limbs 22 of occupant 12 with the idealized set of position and motion of an occupant of patient support apparatus 10 performing physical therapy routine 24.
  • In some embodiments, processor 30 is further configured to update graphical display 18 based upon a physical fitness of occupant 12. The physical fitness of occupant 12 may include information relating to at least an age of occupant 12, a weight of occupant 12, a height of occupant 12, any medicines prescribed to occupant 12, and a daily physical fitness level of occupant 12. In embodiments where patient support apparatus 10 includes a main communications port 26, the physical fitness of occupant 12 may be received from computer network 28. In some embodiments, the physical fitness of occupant 12 may be determined from a medical record including information relating to the physical fitness of occupant 12, where the medical record is received by patient support apparatus 10 from computer network 28. In some embodiments, the physical fitness of occupant 12 is received from computer network 28 automatically.
  • Processor 30 is configured to increase the level of difficulty of physical therapy routine 24. Processor 30 determines the level of difficulty of physical therapy routine 24 based upon the information relating to the position and motion of limbs 22 of occupant 12 and the comparison of the information relating to the position and motion of limbs 22 of occupant 12 with the idealized set of position and motion of an occupant of patient support apparatus 10 performing physical therapy routine 24. In some embodiments, the level of difficulty is further based upon the physical fitness of occupant 12. Processor 30 increases the level of difficulty of physical therapy routine 24 by updating graphical display 18 in a way that requires occupant 12 to move with a greater speed, range of motion, and/or with more force to complete physical therapy routine 24.
  • In some embodiments, processor 30 increases the level of difficulty of physical therapy routine 24 for occupant 12 during physical therapy routine 24. In some embodiments, processor 30 increases the level of difficulty of physical therapy routine 24 for the next physical therapy routine 24 performed by occupant 12 subsequent to a successfully completed physical therapy routine 24.
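  • As one illustrative, non-limiting sketch of the difficulty progression described above (the 10% step size and the target names are assumptions), the adjustment of speed, range-of-motion, and force targets could be expressed as:
    # Illustrative difficulty adjustment; the 10% step size is an assumed value.

    def increase_difficulty(targets, step=0.10):
        """Raise the speed, range-of-motion, and force targets by a fixed fraction."""
        return {name: value * (1.0 + step) for name, value in targets.items()}

    targets = {"speed_m_per_s": 0.25, "range_of_motion_m": 0.30, "force_n": 10.0}

    # After an adequately performed routine, raise the targets for the next routine.
    routine_was_adequate = True
    if routine_was_adequate:
        targets = increase_difficulty(targets)

    print(targets)  # each target is now roughly 10% higher than before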
  • In some embodiments, processor 30 disables certain features of patient support apparatus 10 while occupant 12 performs physical therapy routine 24. Processor 30 ends physical therapy routine 24 if the information from other devices (not shown) on patient support apparatus 10 or sensor unit 16 suggests occupant 12 is in a hazardous position. For example, patient support apparatus 10 may automatically end physical therapy routine 24 if occupant 12 has achieved a position in which occupant 12 is at excessive risk of falling from patient support apparatus 10. Furthermore, patient support apparatus 10 may automatically produce an alarm signal used to alert caregiver 13 in response to detecting that occupant 12 is in a hazardous position.
  • In some embodiments, processor 30 creates data 34 relating to the statistics of occupant 12 performing physical therapy routine 24. The statistics may be, for example, a score indicative of the caliber of the performance of occupant 12 or the number of repetitions of a particular physical therapy routine 24 performed by occupant 12. As another example, the statistics created by processor 30 include physiological measurements of occupant 12 during physical therapy routine 24, for example, the heart rate of occupant 12. As yet another example, data 34 includes the progression of occupant 12 over the course of performing several progressive physical therapy routines 24. Processor 30 transmits data 34 relating to the statistics of occupant 12 to memory device 32 for memory device 32 to store. In embodiments where patient support apparatus 10 includes main communications port 26, data 34 relating to the statistics of occupant 12 may be transmitted to computer network 28. In some embodiments, data 34 is transmitted to computer network 28 automatically.
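  • As one illustrative, non-limiting sketch of data 34 (the field names, units, and JSON encoding are assumptions; the disclosure does not define a message format), a session record ready for storage or transmission to computer network 28 could look like this:
    import json

    # Illustrative session record; field names, units, and encoding are assumptions.
    session_record = {
        "control_unit_id": "UNIQUE-ID-PLACEHOLDER",  # stands in for the unique identifier
        "routine": "ankle-pointing",
        "repetitions": 12,
        "score": 87,                                  # caliber of performance, 0-100 assumed
        "mean_heart_rate_bpm": 94,
    }

    def serialize_for_network(record):
        """Encode the record so it could be stored or sent to a hospital computer network."""
        return json.dumps(record).encode("utf-8")

    payload = serialize_for_network(session_record)
    print(len(payload), "bytes ready to transmit")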
  • Graphical display 18 informs occupant 12 as to how to perform physical therapy routine 24. Graphical display 18 displays graphics 20 in response to input from processor 30. Graphics 20 displayed on graphical display 18 inform or suggest to occupant 12 how to perform physical therapy routine 24. In some embodiments, graphical display 18 includes speakers 23 that audibly inform occupant 12 as to how to perform physical therapy routine 24. Speakers 23 also produce audible sounds to enhance the experience of physical therapy routine 24.
  • In some embodiments, graphical display 18 expressly informs occupant 12 how to perform physical therapy routine 24. For example, graphical display 18 displays and audibly produces words describing how occupant 12 should position and move their limbs 22. In other embodiments, graphical display 18 indirectly informs or suggests to occupant 12 how to perform physical therapy routine 24. For example, graphical display 18 displays graphics 20 of a person performing physical therapy routine 24, suggesting that occupant 12 should position and move their limbs 22 in a manner similar to the person displayed on graphical display 18. In another example, graphical display 18 displays graphic 20 in a specific location on graphical display 18, suggesting occupant 12 should position and move at least one of their limbs 22 to mirror graphic 20.
  • Graphical display 18 may be any device capable of informing occupant 12 how to perform physical therapy routine 24. In some embodiments, graphical display 18 is an electronic visual display capable of rendering different graphics, for example a touch-screen monitor. In other embodiments, graphical display 18 includes pre-determined shapes that are configured to be illuminated. For example, graphical display 18 may be a television or a mat 150 including images of hands and feet where a specific image of a hand is illuminated when occupant 12 needs to place a hand 22H over the illuminated image of a hand to perform physical therapy routine 24.
  • In some embodiments, when graphical display 18 is included in sensor unit 16, for example when sensor unit 16 is mat 150, occupant 12 is supported by graphical display 18. In other embodiments, a number of graphical displays 18 are positioned at several locations on patient support apparatus 10. For example, graphical display 18 may be one or more television screens or illuminated graphics 20 located on a side rail 40, a headboard 42, or a footboard 44 of patient support apparatus 10.
  • In other embodiments, graphical display 18 is positioned at a single location on patient support apparatus 10. For example, graphical display 18 is coupled to footboard 44 in FIG. 1. In some embodiments, graphical display 18 rotates so that graphical display 18 faces toward occupant 12 positioned in patient support apparatus 10 or faces away from occupant 12, for example, toward a caregiver (not shown) positioned between occupant 12 and graphical display 18.
  • Referring now to FIG. 2, in some embodiments, patient support apparatus 10 includes main communications port 26. Main communications port 26 enables patient support apparatus 10 to communicate with computer network or system 28 of a healthcare facility as indicated diagrammatically in FIG. 4 by double-headed arrows 50. Main communications port 26 may communicate with computer network 28 through a wired or wireless datalink.
  • Included in computer network 28 are a nurse call system 52, an electronic medical record database 54, a nurse call/locating badge 56, one or more computers programmed with workflow process software 58 (such as, for example, NaviCare® software, which is available from Hill-Rom Company, Inc.), one or more personal digital assistants (PDAs) 60, one or more voice communications badges 62, and one or more pagers 64. In some embodiments, nurse call system 52 and badges 56 are of the type available as part of the ComLinx™ system from Hill-Rom Company, Inc.
  • Main communications port 26 includes a transmitter 70 and a receiver 72. Main communications port 26 is configured to communicate with one or more computers in computer network 28 via transmitter 70. For example, transmitter 70 transmits data 34 relating to statistics of occupant 12 performing physical therapy routine 24 along with measurements from a physiological sensor 78. Transmitter 70 is also configured to transmit an alarm signal to one or more computers in computer network 28 to alert a caregiver 13 of the status of occupant 12, for example, if occupant 12 is no longer performing physical therapy routine 24.
  • Receiver 72 included in main communications port 26 receives information from the one or more computers in computer network 28. For example, receiver 72 receives information relating to the physical fitness of occupant 12. The information relating to the physical fitness of occupant 12 may be from a medical record of occupant 12 stored in a database of computer network 28. Processor 30 is configured to update graphics 20 on graphical display 18 to instruct occupant 12 to perform physical therapy routine 24 with a difficulty suitable for the physical fitness of occupant 12.
  • Sensor unit 16 is configured to sense the position and motion of limbs 22 of occupant 12 and transmit information relating to the position and motion of limbs 22 of occupant 12 to processor 30 as shown in FIGS. 2 and 3. Sensor unit 16 includes at least one sensing device 74 and a sensor communications port 76. In some embodiments, sensor unit 16 further includes at least one of physiological sensor 78, an alarm 80, and an auxiliary feedback device 82. In some embodiments, sensor unit 16 is a portable device physically uncoupled from patient support apparatus 10. For example, sensor unit 16 may be a tablet computer or a handheld controller. In some embodiments, sensor unit 16 is omitted and components of sensor unit 16 are integral with patient support apparatus 10. For example, sensing devices 74 may be integral with side rails 40 or footboard 44.
  • Sensor communications port 76 communicates information between sensor unit 16 and processor 30. Sensor communications port 76 includes a transmitter 86 that transmits information to processor 30. The information transmitted to processor 30 includes, for example, the information relating to the position and movement of limbs 22 of occupant 12 and measurements from physiological sensor 78.
  • In some embodiments, sensor communications port 76 includes a receiver 88 configured to receive information from processor 30. The information received from processor 30 includes, for example, instructions to control sensing device 74, physiological sensor 78, alarm 80, and auxiliary feedback device 82. Sensor communications port 76 may communicate with processor 30 through a wired or wireless datalink.
  • Sensing device 74 is configured to detect at least information relating to the position and motion of limbs 22 of occupant 12. Sensing device 74 may be one of a number of types of sensors. For example, sensing device 74 may include an accelerometer 100, a camera 120, a weight sensor 110, radio frequency sensor 140, and a switch 130. In some embodiments, sensor unit 16 includes multiple sensing devices 74. For example, in one embodiment, sensor unit 16 includes at least one accelerometer 100 and at least one switch 130. In another embodiment, sensor unit 16 includes a video camera 120 and a number of weight sensors 110.
  • In some embodiments, sensing device 74 is at least one camera 120 as shown in FIG. 1. Camera 120 is configured to record images of occupant 12. The images of occupant 12 are communicated to processor 30. Processor 30 is configured to determine the movement and position of the limbs 22 of occupant 12 based on the images recorded by camera 120. The movement and position of limbs 22 of occupant 12 are determined from the images and a comparison between consecutive images.
  • In some embodiments, two cameras 120 are used for a larger field of view or to enable processor 30 to determine the depth of objects in the field of view of cameras 120. Camera 120 is configured to be mounted at any location where the viewing area of camera 120 includes at least one of the limbs 22 of occupant 12 required to perform physical therapy routine 24. For example, cameras 120 may be mounted on the side rails 40, headboard 42, or footboard 44 of patient support apparatus 10. Camera 120 may be independently mounted to a tripod 122 next to patient support apparatus 10. Camera 120 may be any device capable of recording images of occupant 12. For example, camera 120 may be a video camera, a digital camera, a depth camera, or a pair of stereo cameras. Camera 120 may detect in the visible light spectrum or in the infrared light spectrum.
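  • As one illustrative, non-limiting sketch of how movement could be detected from consecutive camera images (the disclosure does not commit to a particular algorithm; simple frame differencing and the threshold value below are assumptions), consider:
    # Illustrative frame-differencing sketch; frames are tiny synthetic grayscale
    # images represented as nested lists of 0-255 intensity values.

    def moving_pixel_fraction(frame_a, frame_b, threshold=30):
        """Fraction of pixels whose intensity changed by more than `threshold` between frames."""
        changed = total = 0
        for row_a, row_b in zip(frame_a, frame_b):
            for pixel_a, pixel_b in zip(row_a, row_b):
                total += 1
                if abs(pixel_a - pixel_b) > threshold:
                    changed += 1
        return changed / total

    previous_frame = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
    current_frame = [[10, 10, 10], [10, 200, 200], [10, 10, 10]]  # a limb moved into view

    print(moving_pixel_fraction(previous_frame, current_frame))  # about 0.22 (2 of 9 pixels changed)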
  • In some embodiments, sensing device 74 is accelerometer 100 as shown, as an example, in FIG. 5. Accelerometer 100 is able to measure acceleration of sensor unit 16 in a first directional axis 102. In some embodiments, when sensor unit 16 includes accelerometers 100, sensor unit 16 includes three accelerometers 100A, 100B, and 100C to receive acceleration information about three axes. Accelerometer 100A is able to measure acceleration in first directional axis 102. Accelerometer 100B is able to measure acceleration in a second directional axis 104 orthogonal to first directional axis 102. Accelerometer 100C is able to measure acceleration in a third directional axis 106 orthogonal to both first directional axis 102 and second directional axis 104.
  • In some embodiments including accelerometers 100, occupant 12 holds sensor unit 16 in one or both hands 22H. In some embodiments, graphical display 18 instructs occupant 12 as to which hand or hands 22H should hold sensor unit 16. Occupant 12 performing physical therapy routine 24 moves arms 22A and hands 22H as required to complete physical therapy routine 24. Accelerometers 100A, 100B, 100C measure the acceleration of sensor unit 16 as occupant 12 moves hands 22H. Processor 30 uses the acceleration measurements to determine relative position of limbs 22 of occupant 12 between positions. As such, sensor unit 16 is able to measure the acceleration of limbs 22 of occupant 12 as well as the relative position of limbs 22 of occupant 12 during physical therapy routine 24.
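  • As one illustrative, non-limiting sketch of estimating relative position from three-axis acceleration measurements (simple Euler integration with an assumed sample interval; this is not the disclosed implementation), consider:
    # Illustrative double integration of accelerometer samples (Euler method).
    # Samples are (ax, ay, az) in m/s^2 taken at a fixed interval dt.

    def integrate_position(samples, dt=0.05):
        velocity = [0.0, 0.0, 0.0]
        position = [0.0, 0.0, 0.0]
        for sample in samples:
            for axis in range(3):
                velocity[axis] += sample[axis] * dt    # integrate acceleration into velocity
                position[axis] += velocity[axis] * dt  # integrate velocity into position
        return position

    # A short burst of acceleration along the first directional axis, then none.
    samples = [(2.0, 0.0, 0.0)] * 10 + [(0.0, 0.0, 0.0)] * 10
    print(integrate_position(samples))  # roughly 0.78 m of displacement along the first axis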
  • In some embodiments, sensing device 74 is a radio frequency sensor 140 as shown in FIG. 5. Radio frequency sensor 140 transmits and receives radio waves to determine a point in space where sensor unit 16 is pointing. As an example, radio frequency sensor 140 is included in an end 142 of sensor unit 16. End 142 of sensor unit 16 is pointed at graphical display 18 coupled to footboard 44 of patient support apparatus 10. Processor 30 determines a point 144 on graphical display 18 at which sensor unit 16 is being pointed, based on the information from radio frequency sensor 140. Point 144 appears as a cursor 146 on graphical display 18.
  • Occupant 12 is able to use sensor unit 16 including radio frequency sensor 140 as a pointer to control graphical display 18. As an example, in FIG. 5, sensor unit 16 is elongated and configured to be held in one hand 22H of occupant 12. Sensor unit 16 includes radio frequency sensor 140 in end 142 of sensor unit 16. Occupant 12 is able to orient sensor unit 16 such that cursor 146 appears on graphical display 18 at a location desired by occupant 12. As one example, graphical display 18 displays several menu options. Occupant 12 orients sensor unit 16 such that cursor 146 highlights a desired menu option on graphical display 18. Occupant 12 selects the highlighted menu option by depressing a button on sensor unit 16 while the menu option is highlighted. In another example of occupant 12 using radio frequency sensor 140 to perform physical therapy routine 24, a number of graphics 20 are displayed on graphical display 18 and occupant 12 is instructed to orient sensor unit 16 such that cursor 146 is moved towards and virtually contacts graphics 20. As such, the hand 22H and wrist of occupant 12 orienting sensor unit 16 are strengthened.
  • In some embodiments, sensing device 74 is a weight sensor 110 as shown in FIG. 6. Weight sensor 110 is capable of measuring an amount of weight applied to weight sensor 110. In some embodiments, a number of weight sensors 110 are included in patient support apparatus 10. In one embodiment, as an example, patient support apparatus 10 includes a number of weight sensors 110 in a deck section 46 of patient support apparatus 10.
  • Occupant 12 is supported by deck section 46 and, as such, the weight of occupant 12 is distributed to the number of weight sensors 110. As occupant 12 moves in patient support apparatus 10, the weight of occupant 12 is redistributed to the number of weight sensors 110. The number of weight sensors 110 communicate the weight of occupant 12 measured by each weight sensor 110 to processor 30. Processor 30 is configured to determine the movement and position of the limbs 22 of occupant 12 based on the change in the weight measured by each weight sensor 110 when occupant 12 moves.
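  • As one illustrative, non-limiting sketch (the sensor coordinates and the center-of-pressure approach below are assumptions, not the disclosed algorithm), a shift in the occupant's weight distribution across the weight sensors could be summarized as follows:
    # Illustrative center-of-pressure computation from four corner weight sensors.
    # Each sensor is (x_m, y_m, weight_kg); coordinates are assumed deck positions.

    def center_of_pressure(sensors):
        total = sum(weight for _, _, weight in sensors)
        x = sum(x_pos * weight for x_pos, _, weight in sensors) / total
        y = sum(y_pos * weight for _, y_pos, weight in sensors) / total
        return x, y

    # Occupant lying centered, then leaning toward the left side of the deck.
    centered = [(0.0, 0.0, 20), (1.0, 0.0, 20), (0.0, 2.0, 20), (1.0, 2.0, 20)]
    leaning_left = [(0.0, 0.0, 30), (1.0, 0.0, 10), (0.0, 2.0, 30), (1.0, 2.0, 10)]

    print(center_of_pressure(centered))      # (0.5, 1.0)
    print(center_of_pressure(leaning_left))  # (0.25, 1.0): weight has shifted toward x = 0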
  • In some embodiments, sensing device 74 is a switch 130 as shown in FIGS. 7 and 8. Switch 130 has an active position and an inactive position. Switch 130 is in the active position when switch 130 is closed by occupant 12. Switch 130 is in the inactive position when switch 130 is opened by occupant 12. Sensor unit 16 is configured to communicate the position of switch 130 to processor 30. Switch 130 may be any device capable of being moved from an inactive position to an active position by occupant 12. For example, switch 130 may be a push button 130P, a foot pedal switch 130F, or a touch-screen 130TS.
  • In some embodiments, a number of switches 130 are coupled to patient support apparatus 10. Push buttons 130P are initially in the inactive position. Occupant 12 can move push buttons 130P to the active position by depressing push buttons 130P. In one embodiment, push buttons 130P are coupled to side rails 40, headboard 42, and footboard 44. A limb 22 of occupant 12 is able to reach out and depress, and thus move to the active position, any of the number of push buttons 130P. Patient support apparatus 10 informs occupant 12 as to which push button 130P to depress, the amount of time to depress it, and which limb 22 to use to depress push button 130P.
  • In some embodiments, push button 130P illuminates a graphic 20 of a limb 22 of occupant 12 to indicate that occupant 12 should depress the illuminated push button 130P with limb 22 until push button 130P is no longer illuminated. In other embodiments, sensor unit 16 includes a number of push buttons 130P to enable occupant 12 to communicate with patient support apparatus 10. For example, occupant 12 depresses a first push button 130P to answer in the affirmative in response to a prompt on graphical display 18. Occupant 12 instead depresses a second push button 130P to answer in the negative in response to the prompt on graphical display 18. In yet another embodiment, for example when graphical display 18 is a mat 150, switches 130 are included in graphical display 18.
  • In some embodiments where sensing device 74 is intended to be touched or pressed upon by occupant 12, for example, when sensing device 74 is weight sensor 110 or switch 130, sensing device 74 has variable resistance such that occupant 12 must press upon sensing device 74 with enough force to overcome a predetermined threshold value. For example, sensing device 74 may have a constant resistance rate such that sensing device 74 resists forces applied to sensing device 74 by occupant 12 in a linear relationship and occupant 12 must apply a force to sensing device 74 greater than the threshold value to progress the physical therapy routine 24. As another example, sensing device 74 may actively resist any force applied to sensing device 74 by occupant 12 such that sensing device 74 applies a greater force to occupant 12 than occupant 12 applies to sensing device 74 and occupant 12 must apply a force to sensing device 74 greater than the threshold value to progress the physical therapy routine 24. Sensing device 74 may apply a greater force than occupant 12, for example, by an actuator coupled to sensing device 74.
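  • As one illustrative, non-limiting sketch of the threshold behavior described above (the resistance rate and threshold force are assumed values), the check could be expressed as:
    # Illustrative force-threshold check for a switch or weight sensor with
    # a constant (linear) resistance rate. All values are assumptions.

    RESISTANCE_N_PER_MM = 0.8   # reaction force grows linearly with switch travel
    THRESHOLD_FORCE_N = 12.0    # force the occupant must exceed for a press to register

    def press_registers(applied_force_n):
        """A press progresses the routine only if it exceeds the threshold force."""
        return applied_force_n > THRESHOLD_FORCE_N

    def reaction_force(travel_mm):
        """Force the sensing device pushes back with at a given amount of travel."""
        return RESISTANCE_N_PER_MM * travel_mm

    print(press_registers(9.5))   # False, too gentle to progress the routine
    print(press_registers(14.0))  # True
    print(reaction_force(10.0))   # 8.0 N of resistance at 10 mm of travel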
  • In another embodiment, sensing device 74 is a proximity sensor 90. Proximity sensor 90 detects when proximity sensor 90 is proximate to patient support apparatus 10. Proximity sensor 90 may be any device capable of detecting the proximity of proximity sensor 90 to patient support apparatus 10. For example, patient support apparatus 10 and proximity sensor 90 are in wireless communication with each other. In some embodiments, patient support apparatus 10 has a specific wireless communication range. Patient support apparatus 10 is able to detect proximity sensor 90 so long as proximity sensor 90 is within the specific wireless communication range of patient support apparatus 10.
  • In some embodiments, where sensor unit 16 includes proximity sensor 90, sensor unit 16 is assigned to a specific patient support apparatus. Alarm 80 is activated if proximity sensor 90 is moved to a location outside of a specified range of patient support apparatus 10. This is helpful, for example, to reduce the chance of losing sensor unit 16 by activating alarm 80 if sensor unit 16 is moved away from the assigned patient support apparatus 10.
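  • As one illustrative, non-limiting sketch of the range check described above (the signal-strength proxy, the threshold, and the identifier are assumptions; the disclosure only requires that some range check exists), consider:
    # Illustrative out-of-range check using an assumed received-signal-strength proxy.

    ASSIGNED_APPARATUS_ID = "BED-042"   # placeholder identifier for the assigned apparatus
    RSSI_ALARM_THRESHOLD_DBM = -75      # weaker than this is treated as out of range

    def should_sound_alarm(apparatus_id, rssi_dbm):
        """Alarm if the sensor unit cannot hear its assigned apparatus strongly enough."""
        if apparatus_id != ASSIGNED_APPARATUS_ID:
            return True                 # hearing the wrong apparatus entirely
        return rssi_dbm < RSSI_ALARM_THRESHOLD_DBM

    print(should_sound_alarm("BED-042", -60))  # False, sensor unit is near its assigned bed
    print(should_sound_alarm("BED-042", -90))  # True, sensor unit has been moved out of range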
  • In some embodiments, patient support apparatus 10 includes physiological sensor 78 as shown in FIG. 3. Physiological sensor 78 measures at least one vital sign of occupant 12. For example, physiological sensor 78 may be a heart rate monitor, a thermometer, or a respiration sensor. Physiological sensor 78 measures, for example, the heart rate, body temperature, respiration rate, or skin temperature of occupant 12, or the number of calories burned by occupant 12 during physical therapy routine 24. In some embodiments, physiological sensor 78 is coupled to sensor unit 16 as shown in FIG. 3. In other embodiments, physiological sensor 78 is separated from sensor unit 16 and communicates with control unit 14 independently.
  • The measurements of the vital signs of occupant 12 are communicated to processor 30. In some embodiments, processor 30 uses the measurements of the vital signs of occupant 12 to help determine what graphics 20 are displayed on graphical display 18. For example, graphical display 18 displays graphics 20 that inform occupant 12 to perform a less physically demanding physical therapy routine 24 in response to the heart rate measurement of occupant 12. Processor 30 is configurable to activate an alarm 80 based on the value of the measurements. For example, processor 30 activates alarm 80 if the heart rate measurement of occupant 12 exceeds a predetermined threshold value.
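  • As one illustrative, non-limiting sketch of how a vital-sign measurement could gate the routine and alarm 80 (the threshold values below are assumptions, not disclosed values), consider:
    # Illustrative vital-sign gating; the thresholds are assumptions.

    HEART_RATE_EASE_BPM = 120    # above this, suggest a less demanding routine
    HEART_RATE_ALARM_BPM = 150   # above this, activate alarm 80

    def evaluate_heart_rate(heart_rate_bpm):
        if heart_rate_bpm > HEART_RATE_ALARM_BPM:
            return "activate_alarm"
        if heart_rate_bpm > HEART_RATE_EASE_BPM:
            return "display_easier_routine"
        return "continue_routine"

    print(evaluate_heart_rate(95))   # continue_routine
    print(evaluate_heart_rate(132))  # display_easier_routine
    print(evaluate_heart_rate(158))  # activate_alarm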
  • In some embodiments, sensor unit 16 includes alarm 80 as shown in FIG. 3. Alarm 80 may be any device capable of alerting occupant 12 and caregiver 13. For example, in some embodiments, alarm 80 is audible and visible. Alarm 80 is configured to be activated by a number of conditions. As an example, alarm 80 is activated if a vital sign measurement from physiological sensor 78 exceeds a predefined threshold. As another example, alarm 80 is activated if sensor unit 16 is separated from patient support apparatus 10 by a distance larger than a predetermined range.
  • In some embodiments, sensor unit 16 includes an auxiliary feedback device 82 as shown in FIG. 3. Auxiliary feedback device 82 may be any device capable of providing feedback to occupant 12 performing physical therapy routine 24. In an illustrative embodiment, auxiliary feedback device 82 is capable of vibrating sensor unit 16. Sensor unit 16 vibrates, for example, if occupant 12 incorrectly performs physical therapy routine 24. In other embodiments, auxiliary feedback device 82 may be a speaker, a light, or an aroma.
  • In some embodiments, patient support apparatus 10 incorporates controls or buttons for controlling other features included in patient support apparatus 10 into physical therapy routine 24. Incorporating the controls of features included in patient support apparatus 10 and outside of sensor unit 16 into physical therapy routine 24 increases the scope of the physical anatomy of occupant 12 that can be treated by physical therapy routine 24. Additionally, the sensory perception of occupant 12 may be tested and treated by physical therapy routine 24 by incorporating the controls of features outside of sensor unit 16.
  • For example, physical therapy routine 24 may incorporate the controls for adjusting lights included in patient support apparatus 10 or the angle of a head section of a mattress included in patient support apparatus 10. These controls are not included in sensor unit 16 and may be included in patient support apparatuses that do not include physical therapy routine 24. Physical therapy routine 24 may require occupant 12 to adjust the angle of the head section of the mattress via the buttons that control the angle of the head section. Physical therapy routine 24 may progress only after occupant 12 has adjusted the angle of the head section to the required angle. By adjusting the angle of the head section correctly, occupant 12 demonstrates both the mental ability to understand and the physical capability to execute the instructions.
  • In some embodiments, sensor unit 16 includes a combination of sensing devices 74, physiological sensors 78, alarms 80, and auxiliary feedback devices 82. FIG. 1 shows one embodiment of patient support apparatus 10 where sensing device 74 includes a number of cameras 120. As one example of occupant 12 using sensor unit 16 to perform physical therapy routine 24, a first graphic 20F is displayed on graphical display 18. Occupant 12 is instructed to point foot 22F at first graphic 20F. Cameras 120 record the position and movement of occupant 12 as occupant 12 points foot 22F at first graphic 20F. Processor 30 compares the information from cameras 120 to the information relating to the ideal position and movement of an occupant. After processor 30 determines that the position and movement of foot 22F of occupant 12 are adequate, where the adequacy is based on the comparison between the position and movement of foot 22F of occupant 12 and the ideal position and movement of an occupant, processor 30 instructs graphical display 18 to remove first graphic 20F and display a second graphic 20S on graphical display 18.
  • Second graphic 20S informs occupant 12 to point foot 22F at second graphic 20S. Cameras 120 record the position and movement of occupant 12 as occupant 12 points foot 22F at second graphic 20S. Processor 30 compares the information from cameras 120 to the information relating to the ideal position and movement of an occupant. After processor 30 determines that the position and movements of foot 22F of occupant 12 are adequate, processor 30 increases repetition counter 152 by one repetition. Processor 30 instructs graphical display 18 to remove second graphic 20S and display first graphic 20F on graphical display 18. Physical therapy routine 24 repeats in this manner until occupant 12 has completed a prescribed number of repetitions.
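  • As one illustrative, non-limiting sketch of the repetition counting described above (the adequacy results are simulated stand-ins for the camera comparison made by processor 30), the alternating-graphic loop could be expressed as:
    # Illustrative repetition loop for the two-graphic foot-pointing routine.
    # `movement_results` stands in for the adequacy decisions made by processor 30.

    def run_repetitions(prescribed_reps, movement_results):
        """Toggle between the two graphics; count one repetition each time the
        movement toward the second graphic is judged adequate."""
        target, repetitions = "first graphic", 0
        for adequate in movement_results:
            if not adequate:
                continue                      # the same graphic stays displayed
            if target == "first graphic":
                target = "second graphic"     # advance to the second graphic
            else:
                repetitions += 1              # a full repetition has been completed
                target = "first graphic"
            if repetitions >= prescribed_reps:
                break
        return repetitions, target

    # Simulated outcomes of five attempted movements (True means adequate).
    reps, showing = run_repetitions(2, [True, True, False, True, True])
    print(reps, "repetitions completed; now displaying the", showing)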
  • FIG. 5 shows an embodiment of patient support apparatus 10 where sensing device 74 includes a number of accelerometers 100, radio frequency sensor 140, and a number of switches 130. As an example of occupant 12 using the illustrative embodiment of sensor unit 16 to perform physical therapy routine 24, first graphic 20F is displayed in the upper-right corner of graphical display 18. Occupant 12 holds sensor unit 16 in hand 22H of limb 22 requiring physical therapy. Occupant 12 points end 142 of sensor unit 16, including radio frequency sensor 140, at first graphic 20F and depresses one of the number of switches 130. Accelerometers 100 and radio frequency sensor 140 work together to accurately record information relating to the position and movement of limb 22 of occupant 12.
  • Processor 30 compares the information relating to the position and movement of hand 22H of occupant 12 and information relating to the ideal position and movement of a hand of an occupant. If processor 30 determines that the physical therapy routine 24 position was adequate, processor 30 instructs graphical display 18 to display second graphic 20S in the upper-left corner of graphical display 18. Second graphic 20S informs occupant 12 as to where to move limb 22 to continue physical therapy routine 24.
  • Occupant 12 moves limb 22, points sensor unit 16 at second graphic 20S, and depresses one of the number of switches 130 in response to second graphic 20S appearing on graphical display 18. Processor 30 receives information from sensor unit 16 relating to the measurements of accelerometers 100, radio frequency sensor 140, and switch 130 and compares the information to the idealized values for the limbs of an occupant moving sensor unit 16 from first graphic 20F to second graphic 20S.
  • Processor 30 determines a score 92 based on the difference between the information relating to the movement and position of limbs 22 of occupant 12 and the idealized movement and positions of an occupant. A third graphic 20T and score 92 of occupant 12 are displayed on graphical display 18 in response to the comparison made by processor 30. Additionally, speaker 23 of graphical display 18 may produce an audible sound indicating that occupant 12 performed the physical therapy routine adequately.
  • However, graphical display 18 may remain the same, for example, if processor 30 determines the movement and position of limbs 22 of occupant 12 were not sufficient to complete that physical therapy routine 24 position. Speaker 23 produces an audible sound indicating that occupant 12 did not perform the physical therapy routine adequately.
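
In this example, score 92 reflects how far the recorded hand trajectory deviates from the idealized trajectory, and the display and speaker 23 respond differently depending on whether the step was adequate. The sketch below shows one plausible way to express that, assuming a simple linear penalty on the average deviation and hypothetical play_sound and display hooks; the patent does not specify a particular scoring formula.

```python
import math

def score_movement(observed, ideal, max_score=100.0, scale=50.0):
    """Compute a score in the spirit of score 92: observed and ideal are
    equal-length lists of (x, y, z) positions, and a smaller average
    deviation from the idealized trajectory yields a higher score."""
    error = sum(math.dist(o, i) for o, i in zip(observed, ideal)) / len(ideal)
    return max(0.0, max_score - scale * error)

def give_feedback(score, threshold=70.0, play_sound=print):
    """Advance the display and play a success tone when the score clears the
    threshold; otherwise leave the display unchanged and play a failure tone."""
    if score >= threshold:
        play_sound("success tone")   # speaker 23 indicates adequate performance
        return "advance_display"     # show third graphic 20T and the score
    play_sound("failure tone")       # occupant repeats the step
    return "keep_display"
```
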
  • FIG. 6 shows an embodiment of patient support apparatus 10 where sensing device 74 includes a number of weight sensors 110. As an example of occupant 12 using sensor unit 16 to perform physical therapy routine 24, an avatar 148 representing occupant 12 is displayed on graphical display 18. Avatar 148 is configured to move on graphical display 18 relative to the amount of weight of occupant 12 distributed to each weight sensor 110. First graphic 20F is displayed on graphical display 18 informing occupant 12 to move avatar 148 toward first graphic 20F. Occupant 12 moves limbs 22 to redistribute the weight of occupant 12 distributed to each weight sensor 110. For example, occupant 12 leans to the left to apply more weight to weight sensors 110 located on a left side of patient support apparatus 10. Weight sensors 110 record the amount of weight distributed to each weight sensor 110 as information relating to the position and movement of limb 22 of occupant 12.
  • Processor 30 compares the information relating to the position and movement of occupant 12 and information relating to the ideal position and movement of an occupant. If processor 30 determines that the physical therapy routine 24 position was adequate, processor 30 instructs graphical display 18 to display second graphic 20S on graphical display 18. Second graphic 20S informs occupant 12 as to where to move avatar 148 to continue physical therapy routine 24.
  • Processor 30 determines score 92 based on the difference between the information relating to the movement and position of occupant 12 and the idealized movement and positions of an occupant. However, graphical display 18 may remain the same, for example, if processor 30 determines the movement and position of occupant 12 were not sufficient to complete that physical therapy routine 24 position.
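
In the weight-sensor example, avatar 148 tracks how the occupant's weight is distributed across weight sensors 110, so leaning left moves the avatar left. A minimal sketch of that mapping follows, modeling the avatar position as the center of pressure over an assumed four-sensor layout; the sensor coordinates and scaling are illustrative, not taken from the patent.

```python
def avatar_position(weights, sensor_xy):
    """Map the load measured at each weight sensor to an avatar position:
    the avatar tracks the occupant's center of pressure, so shifting weight
    to the left-side sensors moves the avatar toward the left."""
    total = sum(weights)
    if total == 0:
        return (0.0, 0.0)
    x = sum(w * pos[0] for w, pos in zip(weights, sensor_xy)) / total
    y = sum(w * pos[1] for w, pos in zip(weights, sensor_xy)) / total
    return (x, y)

# Example with four corner sensors (left/right, head/foot), assumed layout:
sensors = [(-1, 1), (1, 1), (-1, -1), (1, -1)]
print(avatar_position([40, 20, 30, 10], sensors))   # more weight on the left -> negative x
```
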
  • FIG. 7 shows an embodiment of patient support apparatus 10 where sensor unit 16 includes a touch-screen graphical display 130TS and sensing device 74 includes a number of accelerometers 100 and a number of switches 130. As a first example of occupant 12 using sensor unit 16 to perform physical therapy routine 24, cursor 146 and first graphic 20F are displayed on graphical display 18. Cursor 146 is configured to move on graphical display 18 in response to occupant 12 rotating sensor unit 16. Occupant 12 moves limbs 22 to rotate sensor unit 16 to try to make cursor 146 contact first graphic 20F. Accelerometers 100 record the rotation of sensor unit 16 as information relating to the position and movement of limb 22 of occupant 12.
  • Processor 30 compares the information relating to the position and movement of occupant 12 and information relating to the ideal position and movement of an occupant. If processor 30 determines that the physical therapy routine 24 position was adequate, processor 30 instructs graphical display 18 to display second graphic 20S on graphical display 18. Second graphic 20S informs occupant 12 as to where to move cursor 146 to continue physical therapy routine 24.
  • Processor 30 determines score 92 based on the difference between the information relating to the movement and position of occupant 12 and the idealized movement and positions of an occupant. A second graphic 20S and score 92 of occupant 12 are displayed on graphical display 18 in response to the comparison made by processor 30. However, graphical display 18 may remain the same, for example, if processor 30 determines the movement and position of occupant 12 were not sufficient to complete that physical therapy routine 24 position.
  • As a second example of using the patient support apparatus of FIG. 7, occupant 12 is required to press upon touch-screen graphical display 130TS where first graphic 20F is displayed. Processor 30 instructs graphical display 18 to display a second graphic 20S when occupant 12 touches first graphic 20F.
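
In the FIG. 7 example, rotating sensor unit 16 steers cursor 146, and a step completes when the cursor contacts the displayed graphic (or, in the second example, when the occupant touches the graphic on touch-screen graphical display 130TS). The sketch below illustrates the tilt-to-cursor mapping and the contact test under assumed screen dimensions and gain; the accelerometer-to-tilt conversion is omitted for brevity.

```python
def update_cursor(cursor, tilt_deg, gain=2.0, screen=(800, 480)):
    """Move the on-screen cursor in proportion to how far the occupant has
    rotated the sensor unit about its two axes (tilt from accelerometers 100)."""
    pitch, roll = tilt_deg
    x = min(max(cursor[0] + gain * roll, 0), screen[0])
    y = min(max(cursor[1] + gain * pitch, 0), screen[1])
    return (x, y)

def target_reached(cursor, target, radius=20.0):
    """The step completes when the cursor contacts the displayed graphic."""
    return (cursor[0] - target[0]) ** 2 + (cursor[1] - target[1]) ** 2 <= radius ** 2
```
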
  • FIG. 8 shows an embodiment of patient support apparatus 10 where sensing device 74 includes a number of push buttons 130P. As an example of occupant 12 using sensor unit 16 to perform physical therapy routine 24, one of the number of push buttons 130P is illuminated. Occupant 12 moves limb 22 to illuminated push button 130P and depresses push button 130P until illuminated push button 130P is no longer illuminated. If processor 30 determines push button 130P was depressed adequately, another one of the number of push buttons 130P is illuminated.
  • In some embodiments, push buttons 130P are illuminated in a specific pattern. Occupant 12 is instructed to depress push buttons 130P in that pattern. When occupant 12 completes the pattern, the number of illuminated push buttons 130P in the pattern is increased by one. Occupant 12 continues to perform physical therapy routine 24 by depressing push buttons 130P in an order consistent with the pattern until occupant 12 depresses push buttons 130P in an order inconsistent with the pattern.
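
The growing-pattern behavior described above works like a memory game: one more button is added to the lit sequence each time the occupant reproduces it, and the routine ends at the first out-of-order press. A minimal sketch follows, with light_button() and read_button() as hypothetical stand-ins for illuminating push buttons 130P and reading which button the occupant depressed.

```python
import random

def run_pattern_routine(light_button, read_button, num_buttons=4, max_length=16):
    """Illuminate push buttons 130P in a growing pattern; stop when the
    occupant presses a button out of order or reaches max_length."""
    pattern = []
    while len(pattern) < max_length:
        pattern.append(random.randrange(num_buttons))   # extend the pattern by one button
        for b in pattern:
            light_button(b)                             # hypothetical: flash button b in order
        for expected in pattern:
            if read_button() != expected:               # hypothetical: wait for occupant's press
                return len(pattern) - 1                 # patterns completed before the miss
    return max_length
```
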
  • FIG. 9 shows an embodiment of patient support apparatus 10 where sensing device 74 includes mat 150 including a number of push buttons 130P and/or weight sensors 110. As an example of occupant 12 using sensor unit 16 to perform physical therapy routine 24, mat 150 is supported on deck section 46 and occupant 12 is supported on mat 150.
  • Graphics 20 included in mat 150 illuminate to inform occupant 12 to touch mat 150 with limbs 22 where graphics 20 are illuminated. Push buttons 130P and/or weight sensors 110 detect limbs 22 touching mat 150. Occupant 12 moves limb 22 and touches mat 150 where graphics 20 are illuminated until graphics 20 are no longer illuminated. If processor 30 determines that limbs 22 of occupant 12 were touching mat 150 adequately to satisfy the physical therapy routine 24 position, another graphic 20 on mat 150 is illuminated. In some embodiments, another graphical display 18 is coupled to footboard 44 and displays a graphic of mat 150. The graphic of mat 150 indicates which graphics 20 on the actual mat 150 are illuminated to assist occupant 12 in performing physical therapy routine 24.
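
The mat example reduces to lighting one location at a time and waiting for the collocated push buttons 130P or weight sensors 110 to confirm contact. The sketch below expresses that loop, with illuminate(), is_touched(), and the optional footboard mirroring call as hypothetical interfaces; the actual sensing and adequacy determination are performed by processor 30 as described above.

```python
def run_mat_routine(targets, illuminate, is_touched, mirror_on_footboard=None):
    """Light one mat location at a time and wait until push buttons 130P or
    weight sensors 110 at that location detect the occupant's limb."""
    for row, col in targets:                      # sequence prescribed by the routine
        illuminate(row, col, on=True)             # hypothetical mat control
        if mirror_on_footboard:
            mirror_on_footboard(row, col)         # mirror the lit cell on the footboard display
        while not is_touched(row, col):           # hypothetical sensor poll
            pass
        illuminate(row, col, on=False)            # turn off once touched adequately
```
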
  • Although certain illustrative embodiments have been described in detail above, variations and modifications exist within the scope and spirit of this disclosure as described and as defined in the following claims.

Claims (20)

1. A patient support apparatus for providing physical therapy to an occupant, the patient support apparatus comprising:
a processor,
a sensor unit operable to sense the position and motion of limbs of the occupant in the patient support apparatus and to transmit information representing the position and motion of the limbs of the occupant to the processor,
a graphical display coupled to the processor and configured to display graphics based upon feedback from the processor,
a memory device coupled to the processor and containing information representing an idealized set of positions and motions of limbs of an occupant to be achieved by the occupant while performing a physical therapy routine,
wherein the processor is configured to update the graphical display based upon (i) the information representing the position and motion of the limbs of the occupant performing the physical therapy routine received from the sensor unit and (ii) a comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
2. The patient support apparatus of claim 1, wherein the processor is configured to update the graphical display to display graphics instructing the occupant to move the limbs of the occupant with at least one of a first speed, a first range of motion, and a first force to progress the physical therapy.
3. The patient support apparatus of claim 2, wherein the processor is configured to update the graphical display to display graphics instructing the occupant to move the limbs of the occupant with at least one of a second speed, a second range of motion, and a second force to progress the physical therapy, and the at least one of the second speed, second range of motion, and second force is determined by the comparison between the information representing the position and motion of limbs of the occupant and the information representing the idealized set of positions and motions of the limbs of an occupant performing a physical therapy routine stored in the memory device.
4. The patient support apparatus of claim 1, wherein the information representing the idealized set of positions and motions of limbs of the occupant is based upon a length of a limb of the occupant.
5. The patient support apparatus of claim 1, wherein the information representing the idealized set of positions and motions of limbs of the occupant is based upon an age of the occupant.
6. The patient support apparatus of claim 1, wherein the sensor unit is wirelessly connected to an assigned patient support apparatus and the sensor unit sounds an audible alarm if the sensor unit is moved to a position outside of a specified range of the assigned patient support apparatus.
7. The patient support apparatus of claim 1 further including at least one physiological sensor, the physiological sensor configured to transmit information representing one of a heart rate, respiration rate, calories burned, and temperature of the occupant to the processor.
8. The patient support apparatus of claim 1, wherein the sensor unit includes a number of weight sensors.
9. The patient support apparatus of claim 1, wherein the sensor unit is mounted on a member of the patient support apparatus.
10. The patient support apparatus of claim 1, wherein the graphical display is positioned to be visible to the occupant.
11. The patient support apparatus of claim 1, wherein the graphical display is mounted on a member of the patient support apparatus.
12. The patient support apparatus of claim 1, wherein the graphical display is included in the sensor unit.
13. The patient support apparatus of claim 1, wherein the sensor unit includes an image recording device.
14. The patient support apparatus of claim 1, wherein the sensor unit includes at least one accelerometer.
15. The patient support apparatus of claim 1, wherein the sensor unit includes at least one switch, the at least one switch having an inactive and an active position, and the occupant is enabled to move the at least one switch between the inactive position and the active position.
16. The patient support apparatus of claim 15, wherein the at least one switch is configured to offer a resistance against moving between the inactive and active position.
17. The patient support apparatus of claim 16, wherein the resistance offered by the switch is variable.
18. The patient support apparatus of claim 1, wherein the sensor unit includes a radio frequency sensor.
19. The patient support apparatus of claim 1, wherein the processor further creates data relating to statistics of the occupant while performing the physical therapy routine, the data stored in the memory device.
20. The patient support apparatus of claim 19, wherein the data is automatically transmitted to a computer network of a hospital.
US14/197,678 2013-03-07 2014-03-05 Patient support apparatus with physical therapy system Abandoned US20140255890A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/197,678 US20140255890A1 (en) 2013-03-07 2014-03-05 Patient support apparatus with physical therapy system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361774190P 2013-03-07 2013-03-07
US14/197,678 US20140255890A1 (en) 2013-03-07 2014-03-05 Patient support apparatus with physical therapy system

Publications (1)

Publication Number Publication Date
US20140255890A1 true US20140255890A1 (en) 2014-09-11

Family

ID=51488250

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/197,678 Abandoned US20140255890A1 (en) 2013-03-07 2014-03-05 Patient support apparatus with physical therapy system

Country Status (1)

Country Link
US (1) US20140255890A1 (en)

Patent Citations (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4665388A (en) * 1984-11-05 1987-05-12 Bernard Ivie Signalling device for weight lifters
US5423554A (en) * 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
US5846086A (en) * 1994-07-01 1998-12-08 Massachusetts Institute Of Technology System for human trajectory learning in virtual environments
US5461739A (en) * 1994-07-25 1995-10-31 American Echo, Inc. Patient midsection and shoulder support apparatus for tilting examination table
US6430997B1 (en) * 1995-11-06 2002-08-13 Trazer Technologies, Inc. System and method for tracking and assessing movement skills in multidimensional space
US20010001303A1 (en) * 1996-11-25 2001-05-17 Mieko Ohsuga Physical exercise system having a virtual reality environment controlled by a users movement
US6244987B1 (en) * 1996-11-25 2001-06-12 Mitsubishi Denki Kabushiki Kaisha Physical exercise system having a virtual reality environment controlled by a user's movement
US6227974B1 (en) * 1997-06-27 2001-05-08 Nds Limited Interactive game system
US20070022384A1 (en) * 1998-12-18 2007-01-25 Tangis Corporation Thematic response to a computer user's context, such as by a wearable personal computer
CA2375333A1 (en) * 1999-05-27 2001-02-08 Nicholas William Granville Rehabilitation device
US8892219B2 (en) * 2001-03-07 2014-11-18 Motion Games, Llc Motivation and enhancement of physical and mental exercise, rehabilitation, health and social interaction
US8306635B2 (en) * 2001-03-07 2012-11-06 Motion Games, Llc Motivation and enhancement of physical and mental exercise, rehabilitation, health and social interaction
US20040101146A1 (en) * 2001-04-11 2004-05-27 Arvo Laitinen Personalized information distribution system
US7880717B2 (en) * 2003-03-26 2011-02-01 Mimic Technologies, Inc. Method, apparatus, and article for force feedback based on tension control and tracking through cables
US7641623B2 (en) * 2003-04-11 2010-01-05 Hill-Rom Services, Inc. System for compression therapy with patient support
US9220655B2 (en) * 2003-04-11 2015-12-29 Hill-Rom Services, Inc. System for compression therapy
US20090062698A1 (en) * 2004-02-05 2009-03-05 Motorika Inc. Methods and apparatuses for rehabilitation and training
US8753296B2 (en) * 2004-02-05 2014-06-17 Motorika Limited Methods and apparatus for rehabilitation and training
US20060293617A1 (en) * 2004-02-05 2006-12-28 Reability Inc. Methods and apparatuses for rehabilitation and training
US20080004550A1 (en) * 2004-02-05 2008-01-03 Motorika, Inc. Methods and Apparatus for Rehabilitation and Training
US8177732B2 (en) * 2004-02-05 2012-05-15 Motorika Limited Methods and apparatuses for rehabilitation and training
US8012107B2 (en) * 2004-02-05 2011-09-06 Motorika Limited Methods and apparatus for rehabilitation and training
US8545420B2 (en) * 2004-02-05 2013-10-01 Motorika Limited Methods and apparatus for rehabilitation and training
US20080132383A1 (en) * 2004-12-07 2008-06-05 Tylerton International Inc. Device And Method For Training, Rehabilitation And/Or Support
US20060277074A1 (en) * 2004-12-07 2006-12-07 Motorika, Inc. Rehabilitation methods
US20150087933A1 (en) * 2005-01-13 2015-03-26 Welch Allyn, Inc. Vital Signs Monitor
US7515734B2 (en) * 2006-03-27 2009-04-07 Eyecue Vision Technologies Ltd. Device, system and method for determining compliance with a positioning instruction by a figure in an image
US7899206B2 (en) * 2006-03-27 2011-03-01 Eyecue Vision Technologies Ltd. Device, system and method for determining compliance with a positioning instruction by a figure in an image
US7555437B2 (en) * 2006-06-14 2009-06-30 Care Cam Innovations, Llc Medical documentation system
US20080119763A1 (en) * 2006-11-21 2008-05-22 Jay Wiener Acquisition processing and reporting physical exercise data
US20080183049A1 (en) * 2007-01-31 2008-07-31 Microsoft Corporation Remote management of captured image sequence
US9256906B2 (en) * 2007-02-02 2016-02-09 Hartford Fire Insurance Company Systems and methods for sensor-enhanced activity evaluation
US20160171623A1 (en) * 2007-02-02 2016-06-16 Andrew J. Amigo Systems and Methods For Sensor-Based Activity Evaluation
US8638228B2 (en) * 2007-02-02 2014-01-28 Hartford Fire Insurance Company Systems and methods for sensor-enhanced recovery evaluation
US8094873B2 (en) * 2007-04-30 2012-01-10 Qualcomm Incorporated Mobile video-based therapy
US20140007009A1 (en) * 2007-04-30 2014-01-02 Qualcomm Incorporated Mobile video-based therapy
US20080267447A1 (en) * 2007-04-30 2008-10-30 Gesturetek, Inc. Mobile Video-Based Therapy
US8577081B2 (en) * 2007-04-30 2013-11-05 Qualcomm Incorporated Mobile video-based therapy
US20080281633A1 (en) * 2007-05-10 2008-11-13 Grigore Burdea Periodic evaluation and telerehabilitation systems and methods
US8882684B2 (en) * 2008-05-12 2014-11-11 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US8998830B2 (en) * 2008-05-12 2015-04-07 Earlysense Ltd. Monitoring, predicting and treating clinical episodes
US8644945B2 (en) * 2008-07-11 2014-02-04 Medtronic, Inc. Patient interaction with posture-responsive therapy
US8583252B2 (en) * 2008-07-11 2013-11-12 Medtronic, Inc. Patient interaction with posture-responsive therapy
US20110043630A1 (en) * 2009-02-26 2011-02-24 Mcclure Neil L Image Processing Sensor Systems
US20120000300A1 (en) * 2009-03-10 2012-01-05 Yoshikazu Sunagawa Body condition evaluation apparatus, condition estimation apparatus, stride estimation apparatus, and health management system
US8292837B2 (en) * 2010-03-24 2012-10-23 Tiffany Du Combination exercise-massage device
US20130246084A1 (en) * 2010-04-16 2013-09-19 University of Pittsburg - of the Commonwealth System of Higher Education Versatile and integrated system for telehealth
US20110281249A1 (en) * 2010-05-14 2011-11-17 Nicholas Gammell Method And System For Creating Personalized Workout Programs
US20120021833A1 (en) * 2010-06-11 2012-01-26 Harmonic Music Systems, Inc. Prompting a player of a dance game
US20130271602A1 (en) * 2010-08-26 2013-10-17 Blast Motion, Inc. Motion event recognition system and method
US20130128022A1 (en) * 2010-08-26 2013-05-23 Blast Motion, Inc. Intelligent motion capture element
US20130171601A1 (en) * 2010-09-22 2013-07-04 Panasonic Corporation Exercise assisting system
US20120075464A1 (en) * 2010-09-23 2012-03-29 Stryker Corporation Video monitoring system
US20120108909A1 (en) * 2010-11-03 2012-05-03 HeadRehab, LLC Assessment and Rehabilitation of Cognitive and Motor Functions Using Virtual Reality
US20120277891A1 (en) * 2010-11-05 2012-11-01 Nike, Inc. Method and System for Automated Personal Training that Includes Training Programs
US20120271143A1 (en) * 2010-11-24 2012-10-25 Nike, Inc. Fatigue Indices and Uses Thereof
US20140039351A1 (en) * 2011-03-04 2014-02-06 Stryker Corporation Sensing system for patient supports
US20120317064A1 (en) * 2011-06-13 2012-12-13 Sony Corporation Information processing apparatus, information processing method, and program
US20130123667A1 (en) * 2011-08-08 2013-05-16 Ravi Komatireddy Systems, apparatus and methods for non-invasive motion tracking to augment patient administered physical rehabilitation
US9330497B2 (en) * 2011-08-12 2016-05-03 St. Jude Medical, Atrial Fibrillation Division, Inc. User interface devices for electrophysiology lab diagnostic and therapeutic equipment
US20160320930A1 (en) * 2011-08-12 2016-11-03 St. Jude Medical, Atrial Fibrillation Division, Inc User interface devices for electrophysiology lab diagnostic and therapeutic equipment
US20150004581A1 (en) * 2011-10-17 2015-01-01 Interactive Physical Therapy, Llc Interactive physical therapy
US20140243682A1 (en) * 2011-10-20 2014-08-28 Koninklijke Philips N.V. Device and method for monitoring movement and orientation of the device
US20140371633A1 (en) * 2011-12-15 2014-12-18 Jintronix, Inc. Method and system for evaluating a patient during a rehabilitation exercise
US20130178960A1 (en) * 2012-01-10 2013-07-11 University Of Washington Through Its Center For Commercialization Systems and methods for remote monitoring of exercise performance metrics
US8652072B2 (en) * 2012-01-11 2014-02-18 Stimson Biokinematics, Llc Kinematic system
US20130252216A1 (en) * 2012-03-20 2013-09-26 Microsoft Corporation Monitoring physical therapy via image sensor
US20160086500A1 (en) * 2012-10-09 2016-03-24 Kc Holdings I Personalized avatar responsive to user physical state and context
US20140147820A1 (en) * 2012-11-28 2014-05-29 Judy Sibille SNOW Method to Provide Feedback to a Physical Therapy Patient or Athlete
US20140157911A1 (en) * 2012-12-10 2014-06-12 The Regents Of The University Of California On-bed monitoring system for range of motion exercises with a pressure sensitive bed sheet
US20150151199A1 (en) * 2013-03-06 2015-06-04 Biogaming Ltd. Patient-specific rehabilitative video games
US20140364230A1 (en) * 2013-06-06 2014-12-11 Universita' Degli Studi Di Milano Apparatus and Method for Rehabilitation Employing a Game Engine
US20150133820A1 (en) * 2013-11-13 2015-05-14 Motorika Limited Virtual reality based rehabilitation apparatuses and methods
US9135347B2 (en) * 2013-12-18 2015-09-15 Assess2Perform, LLC Exercise tracking and analysis systems and related methods of use
US20160015315A1 (en) * 2014-07-21 2016-01-21 Withings System and method to monitor and assist individual's sleep
US20160074698A1 (en) * 2014-09-16 2016-03-17 Manuel Alejandro Figueroa Article of furniture with means to be converted to physical exercise equipment, and vice versa, physical exercise equipment with means to be converted to an article of furniture
US20160258573A1 (en) * 2015-03-06 2016-09-08 Department Of Veterans Affairs Exercise machine and method for use in a supine position
US20170020440A1 (en) * 2015-07-24 2017-01-26 Johnson & Johnson Vision Care, Inc. Biomedical devices for biometric based information communication and sleep monitoring

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9901503B2 (en) 2008-03-13 2018-02-27 Optimedica Corporation Mobile patient bed
US10426685B2 (en) 2008-03-13 2019-10-01 Optimedica Corporation Mobile patient bed
JP2017169871A (en) * 2016-03-24 2017-09-28 パラマウントベッド株式会社 Bed device
JP2018000316A (en) * 2016-06-28 2018-01-11 日本光電工業株式会社 Relay device
US20190279480A1 (en) * 2016-11-30 2019-09-12 Agency For Science, Technology And Research A computer system for alerting emergency services
US11074798B2 (en) * 2016-11-30 2021-07-27 Agency for Science, Technology end Research Computer system for alerting emergency services
US11524204B2 (en) * 2017-08-11 2022-12-13 Bio-Sensing Solutions, S.L. Device for sensing force and movement in an action performed by a subject; a system and method for correcting the force and movement of the action
US20220020459A1 (en) * 2019-02-01 2022-01-20 Orthobistro Llc Patient assessment system
JP2020124528A (en) * 2020-04-01 2020-08-20 パラマウントベッド株式会社 Bed device
JP7118104B2 (en) 2020-04-01 2022-08-15 パラマウントベッド株式会社 sleeping device

Similar Documents

Publication Publication Date Title
US20140255890A1 (en) Patient support apparatus with physical therapy system
US20220125320A1 (en) Wrist-Worn Device for Coordinating Patient Care
US10733866B2 (en) Walker-assist device
US10004447B2 (en) Systems and methods for collecting and displaying user orientation information on a user-worn sensor device
KR102318887B1 (en) Wearable electronic device and method for controlling thereof
US8753296B2 (en) Methods and apparatus for rehabilitation and training
US20130178960A1 (en) Systems and methods for remote monitoring of exercise performance metrics
US20060293617A1 (en) Methods and apparatuses for rehabilitation and training
CN104991639A (en) Virtual reality rehabilitation training system and method
JP2009511220A (en) Diversity therapy apparatus and method and interactive device
JP2003523223A (en) Rehabilitation device and method
CN104983426A (en) Virtual reality bedside rehabilitation training system and method
KR102355511B1 (en) VR-Based Recognition Training System
JP2001198110A (en) Body action sensing device
EP3725279B1 (en) Patient bed having exercise therapy apparatus
CA2625748A1 (en) Interface device
CN102958488A (en) Pediatric patient-safe cpr device
JP5705458B2 (en) Biological information detection apparatus and subject monitoring system
JP2016087226A (en) Exercise assist device, exercise assist method, and program
WO2017176667A1 (en) Pressure ulcer detection methods, devices and techniques
KR20020073920A (en) The Early Rehabilitation Training System
WO2007030947A1 (en) Mapping motion sensors to standard input devices
WO2020133480A1 (en) Mobile monitoring measurement method, mobile monitoring device, system and storage medium
JP6062889B2 (en) Muscle contraction exercise support system
JP2017038959A (en) Exercise assist device, exercise assist method, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: HILL-ROM SERVICES, INC., INDIANA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KOVACH, MICHELLE;O'KEEFE, CHRISTOPHER R.;SRIVASTAVA, VARAD;AND OTHERS;SIGNING DATES FROM 20140328 TO 20150519;REEL/FRAME:036436/0887

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, ILLINOIS

Free format text: SECURITY INTEREST;ASSIGNORS:ALLEN MEDICAL SYSTEMS, INC.;HILL-ROM SERVICES, INC.;ASPEN SURGICAL PRODUCTS, INC.;AND OTHERS;REEL/FRAME:036582/0123

Effective date: 20150908

Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, IL

Free format text: SECURITY INTEREST;ASSIGNORS:ALLEN MEDICAL SYSTEMS, INC.;HILL-ROM SERVICES, INC.;ASPEN SURGICAL PRODUCTS, INC.;AND OTHERS;REEL/FRAME:036582/0123

Effective date: 20150908

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, ILLINOIS

Free format text: SECURITY AGREEMENT;ASSIGNORS:HILL-ROM SERVICES, INC.;ASPEN SURGICAL PRODUCTS, INC.;ALLEN MEDICAL SYSTEMS, INC.;AND OTHERS;REEL/FRAME:040145/0445

Effective date: 20160921

Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, IL

Free format text: SECURITY AGREEMENT;ASSIGNORS:HILL-ROM SERVICES, INC.;ASPEN SURGICAL PRODUCTS, INC.;ALLEN MEDICAL SYSTEMS, INC.;AND OTHERS;REEL/FRAME:040145/0445

Effective date: 20160921

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

AS Assignment

Owner name: ALLEN MEDICAL SYSTEMS, INC., ILLINOIS

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

Owner name: VOALTE, INC., FLORIDA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

Owner name: WELCH ALLYN, INC., NEW YORK

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

Owner name: HILL-ROM COMPANY, INC., ILLINOIS

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

Owner name: ANODYNE MEDICAL DEVICE, INC., FLORIDA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

Owner name: MORTARA INSTRUMENT SERVICES, INC., WISCONSIN

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

Owner name: MORTARA INSTRUMENT, INC., WISCONSIN

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

Owner name: HILL-ROM, INC., ILLINOIS

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

Owner name: HILL-ROM SERVICES, INC., ILLINOIS

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050254/0513

Effective date: 20190830

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., ILLINOIS

Free format text: SECURITY AGREEMENT;ASSIGNORS:HILL-ROM HOLDINGS, INC.;HILL-ROM, INC.;HILL-ROM SERVICES, INC.;AND OTHERS;REEL/FRAME:050260/0644

Effective date: 20190830

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION