US20050124851A1 - System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display - Google Patents

System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display

Info

Publication number
US20050124851A1
Authority
US
United States
Prior art keywords
subject
images
image
response
conditioning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/042,304
Inventor
David Patton
John Agostinelli
James Stephens
Edward Covannon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Carestream Health Inc
Original Assignee
Patton David L.
Agostinelli John A.
Stephens James G.
Edward Covannon
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Patton David L., Agostinelli John A., Stephens James G., and Edward Covannon
Priority to US11/042,304
Publication of US20050124851A1
Assigned to CREDIT SUISSE, CAYMAN ISLANDS BRANCH, AS ADMINISTRATIVE AGENT (FIRST LIEN OF INTELLECTUAL PROPERTY SECURITY AGREEMENT). Assignors: CARESTREAM HEALTH, INC.
Assigned to CREDIT SUISSE, CAYMAN ISLANDS BRANCH, AS ADMINISTRATIVE AGENT (SECOND LIEN INTELLECTUAL PROPERTY SECURITY AGREEMENT). Assignors: CARESTREAM HEALTH, INC.
Assigned to CARESTREAM HEALTH, INC. (ASSIGNMENT OF ASSIGNORS INTEREST; SEE DOCUMENT FOR DETAILS). Assignors: EASTMAN KODAK COMPANY
Assigned to CARESTREAM HEALTH, INC. (RELEASE OF SECURITY INTEREST IN INTELLECTUAL PROPERTY, FIRST LIEN). Assignors: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M: DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00: Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/48: Other medical applications
    • A61B5/486: Bio-feedback
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/48: Other medical applications
    • A61B5/4884: Other medical applications inducing physiological or psychological stress, e.g. applications for stress testing
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M: DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00: Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M2021/0005: Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
    • A61M2021/0044: Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the sight sense
    • A61M2021/005: Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the sight sense images, e.g. video

Definitions

  • The present invention also relates to a system which changes, manages, or helps to manage a psychological and physiological state of a subject using images.
  • The system comprises an image display device which is adapted to store a personalized preferred image response profile for a subject and to store and display a set of images from an image library, and a detector device which measures physiological characteristics of the subject, wherein the physiological characteristics are indicative of a stress level of the subject.
  • The image display device comprises a control mechanism which selects images from the set of images that include attributes matching attributes of the personalized preferred image response profile, and displays the selected images in a desired sequence in accordance with a stress level of the subject as measured by the detector device, to control a stress level of the subject.
  • The present invention also relates to a method of helping to manage a subject's psychological and physiological state, the method comprising the steps of showing a set of stimuli to the subject; measuring a physiological state of the subject as the subject views the set of stimuli; and making a recording of stimuli from the set of stimuli which provide a preferred response based on the measured physiological state of the subject, so as to create a personalized preferred response profile that defines preferred characteristics which are representative of common characteristics of the recorded stimuli.
  • FIG. 1 is a perspective view showing major components of an adaptive autostereoscopic imaging system of the present invention
  • FIG. 2 is a block diagram showing key image forming, control logic, stimulus and feedback components of the system of the present invention
  • FIG. 3 is a flow chart giving process steps for the present invention.
  • FIG. 4 is a further flow chart showing a comparison of images, which can be utilized within the system of the present invention.
  • FIG. 5 is an example of a selector device, which can be used for paired comparisons of images
  • FIG. 6 is a chart illustrating an example of a comparison of images and attributes which can be utilized with the system of the present invention
  • FIG. 7 is a flow chart of an alternative system of the present invention.
  • FIG. 8 is a flow chart showing image selection.
  • Referring to FIG. 1, there is shown an adaptive autostereoscopic display system 10 arranged as a biofeedback imaging apparatus.
  • Adaptive autostereoscopic display system 10 can be used to monitor, condition, and manage the psychological and physiological state of a subject 12 , either controlled by another person, such as a medical professional, or programmed for control by subject 12 .
  • While the present invention will be primarily described as using images for behavior modification such as stress management, it is recognized that other sensory stimuli, such as sound, smell, or touch, for example, can be provided to subject 12 within the context of the present invention.
  • The definitions of "conditioning" and "psychological state," given in the Background section of this application, apply to the present invention.
  • adaptive autostereoscopic display system 10 provides virtual autostereoscopic images at left and right viewing pupils 14 l and 14 r.
  • Left and right viewing pupil forming apparatus 36 l and 36 r each project images through a left and right ball lens assembly 30 l and 30 r onto a beamsplitter 16 .
  • Beamsplitter 16 interacts with a curved mirror 24 to form a virtual image for each viewing pupil 14 l and 14 r.
  • images can be generated using a spatial light modulator such as a liquid crystal device (LCD) or a digital micromirror device (DMD), for example.
  • images could be generated by one or more lasers, using a grating light valve or similar electromechanical device, or using an OLED.
  • Autostereoscopic image delivery system 18 comprises left viewing pupil forming apparatus 36 l for forming and positioning left viewing pupil 14 l and right viewing pupil forming apparatus 36 r for forming and positioning right viewing pupil 14 r.
  • A housing 58 provides a structure for mounting the various components of autostereoscopic image delivery system 18 and related components.
  • Referring to FIG. 2, there is shown a schematic block diagram with key control and signal paths for major components of adaptive autostereoscopic display system 10.
  • An image source 42 provides image content to an image generator 40 , part of autostereoscopic image delivery system 18 .
  • Image generator 40, comprising a digital image modifying device under the control of a control logic processor 50, then cooperates with left and right eye projection apparatus 20 l and 20 r and viewing pupil forming apparatus 36 l and 36 r (FIG. 1) to provide stereoscopic virtual images at left and right viewing pupils 14 l and 14 r.
  • Image source 42 may provide any of a number of types of images, such as, but not limited to, still images, audio-visual images, video clips, or computer-generated images.
  • Control logic processor 50 controls the operation of image generator 40, the position of projection apparatus 20, and the overall operation of a projection translation apparatus 60 within autostereoscopic image delivery system 18.
  • a beamsplitter positioning apparatus 60 b and a mirror positioning apparatus 60 m would enable adaptive autostereoscopic display system 10 to adapt to changes in position of subject 12 as well as to changes in height, head position, and the like.
  • Control logic processor 50 may also control a chair servo mechanism 66 or movable platform and can accept feedback data about subject 12 from subject feedback sensors 52 such as cameras 54 or other devices, such as photosensors, for example.
  • a manual feedback control 104 provides an alternate means for obtaining instructions from subject 12 .
  • Adaptive autostereoscopic display system 10 could be equipped with a feedback control loop for sensing and responding to a position or gesture of subject 12 .
  • adaptive autostereoscopic display system 10 could be equipped to sense and compensate for an interocular distance or a gesture of subject 12 or could use speech recognition as a sensed input.
  • adaptive autostereoscopic display system 10 could be designed to provide some measure of compensation for parallax error or “see-around” capability, adjusting left and right eye images based on eye positions observed for subject 12 .
  • Control logic processor 50 may also control other optional output devices for controlling vibration, temperature, fans, or other devices. Tactile output could be provided, such as by means of a glove (not shown) or by one or more fans or other devices. These tactile devices could also control temperature, such as the temperature of air from a fan outlet, for example. An olfactory output apparatus (not shown) could be employed as an output device for emitting an odor perceptible to subject 12 .
  • Optional audio content from an audio source 70, also under control of control logic processor 50, can be directed to a speaker system 62 and to one or more speakers 64.
  • Control logic processor 50 is a computer of some type, possibly comprising a dedicated CPU or microprocessor, programmed to generate output commands based on program instructions and conditioned by sensed input feedback data.
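  • As a concrete illustration of this control loop, the following sketch (Python, with hypothetical class, sensor, and device names that are not taken from the patent) shows a processor that reads feedback sensors each cycle and conditions its display and auxiliary output commands on the readings:

```python
# Minimal, illustrative sketch of the control loop described above.
# All names and the placeholder policy are assumptions, not the patent's design.

class ControlLogicProcessor:
    """Reads feedback sensors, then drives the display and auxiliary outputs."""

    def __init__(self, sensors, display, aux_outputs):
        self.sensors = sensors          # {name: callable returning a reading}
        self.display = display          # callable that shows an image
        self.aux_outputs = aux_outputs  # callables for audio, fans, chair servo, etc.

    def step(self, image_sequence, index):
        readings = {name: read() for name, read in self.sensors.items()}
        # Placeholder policy: advance through the sequence while the subject
        # appears relaxed; a real system would use the response profile instead.
        if readings.get("heart_rate", 100) < 80 and index + 1 < len(image_sequence):
            index += 1
        self.display(image_sequence[index])
        for output in self.aux_outputs:
            output(readings)            # let each auxiliary device react
        return index


# Tiny demonstration with stub devices.
clp = ControlLogicProcessor(
    sensors={"heart_rate": lambda: 72},
    display=lambda image: print("showing", image),
    aux_outputs=[lambda readings: None],
)
position = clp.step(["calm_1.jpg", "calm_2.jpg"], index=0)
print("sequence position:", position)
```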
  • FIG. 1 shows a detector device 11 which can be operationally associated as a feedback source with adaptive autostereoscopic display system 10 .
  • Detector device 11 can be a measuring or monitoring apparatus for obtaining feedback data from subject 12 , such as galvanic skin response, temperature of fingers or other extremities, blood pressure, pulse rate, breathing, eye movements, or other functions that indicate the level of stress or other condition of subject 12 .
  • Detector device 11 can be a device attached to the body in a non-invasive manner, could be an instrumented glove or other device that is easily worn, could be a manipulable device, or could be a non-contact sensing device, such as an optical monitor for measuring pupil dilation, for example.
  • An interconnect cable 13 or other suitable interface mechanism connects detector device 11 with a control mechanism 17 .
  • Control mechanism 17 may be programmed to interact with detector device 11 and cause adaptive autostereoscopic display system 10 to display a preferred image based on the measured physiological characteristics or level of stress of subject 12 .
  • control mechanism 17 can detect changes in stress related physiological functions of subject 12 and trigger a change in the sequence or type of images displayed by adaptive autostereoscopic display system 10 .
  • control mechanism 17 can include software that is designed to select images from a first set of images 100 or from a second set of images 102 supplied by image source 42 , where the selected set 100 or 102 is compatible with attributes defined by a personalized preferred image response profile for subject 12 .
  • Adaptive autostereoscopic display system 10 would then display the selected images in a desired sequence in accordance with the stress level of subject 12 as measured by detection device 11 , to help manage stress, for example.
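  • A minimal sketch of this trigger logic follows (Python; the threshold, the names, and the idea of switching between a neutral and a calming sequence are illustrative assumptions only):

```python
# Illustrative only: switch the displayed image sequence when a stress-related
# measurement rises noticeably above the recorded base level.
def choose_sequence(stress_now, stress_base, calming_seq, neutral_seq, tolerance=0.15):
    """Return the sequence to display for the current measurement."""
    if stress_now > stress_base * (1 + tolerance):
        return calming_seq   # stress rose past the tolerance: show calming imagery
    return neutral_seq       # otherwise keep the neutral/preferred sequence


print(choose_sequence(92, 70, ["ocean_1.jpg"], ["garden_1.jpg"]))  # -> ['ocean_1.jpg']
```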
  • First set of images 100 may be representative of a personalized preferred image response profile for subject 12 , the details of which will be described later.
  • First set of images 100 can be tailored to information obtained from subject 12 and can include a series of images based on a variety of themes (such as ocean, forest, desert, sunset, or similar themes, for example).
  • first set of images 100 may include personal images of subject 12 , such as family and friends, for example.
  • Images in first or second set of images 100 or 102 can be arranged in a preset sequence, such as from chaotic, to ordered, to placid, as might be useful for helping to modify behavior and reduce stress, for example.
  • Adaptive autostereoscopic display system 10 can further be adapted to store for display, in image source 42 , second set of images 102 from an image library or from personal images of subject 12 .
  • the stored images can be of any type, such as still images, audio-visual images, video clips, or computer-generated images, for example.
  • the personalized preferred image response profile is created by having subject 12 view a wide variety of images and measuring physiological effects on subject 12 as an indicator of psychological state.
  • the measurements can be made using observer feedback sensor 52 , camera 54 , or detector device 11 .
  • Measurements could record one or more physiological symptoms, such as, but not limited to EMG, EEG, galvanic skin response, skin temperature, heart rate, blood pressure, eye movement, or pupil dilation, for example.
  • the measurements obtained are correlated to the corresponding image or image sequence viewed by subject 12 at the time.
  • the measured results indicate how subject 12 reacts to a specific image or to a sequence of images.
  • first set of images 100 will generally be images that provide a preferred response for the type of behavior modification desired, such as lowering a stress level.
  • the personalized preferred image response profile can include data from first set of images 100 which is representative of common characteristics, or attributes of first set of images 100 that tend to provide a preferred response to the individual.
  • the personalized preferred image response profile can then be used to select images from an image library which includes second set of images 102 .
  • The selected images can then be used as personal biofeedback images by subject 12.
  • images are selected that have a desired effect for subject 12 .
  • the personalized preferred image response profile may be comprised of a set of information that describes the selected images and others that match the response profile.
  • subject 12 accesses an image library stored in image source 42 and keys in a code that links to the personalized preferred image response profile specific to subject 12 .
  • This personalized preferred image response profile is then used to select images from the image library. These selected images are displayed so that subject 12 can choose a desired set. This selected set can then be loaded as second set of images 102, for example.
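  • The attribute-matching idea behind this selection can be sketched as follows (Python; the library contents, attribute names, and overlap score are hypothetical, chosen only to illustrate the mechanism):

```python
# Illustrative sketch: rank library images by how many of their attributes
# overlap with the personalized preferred image response profile.
def select_matching_images(library, profile, min_overlap=1):
    """library: {image_name: set of attributes}; profile: set of preferred attributes."""
    scored = sorted(
        ((len(attrs & profile), name) for name, attrs in library.items()),
        reverse=True,
    )
    return [name for score, name in scored if score >= min_overlap]


library = {
    "beach_sunset.jpg": {"water", "warm colors", "open space"},
    "city_traffic.jpg": {"crowded", "high contrast"},
    "forest_path.jpg": {"green", "open space", "low contrast"},
}
profile = {"water", "open space", "low contrast"}
print(select_matching_images(library, profile))  # the beach and forest images match; the city image does not
```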
  • the personalized image response profile allows subject 12 to pick from a variety of categories such as seascapes, desert scenes, forest scenes or personal images such as from home, garden, favorite museum, pets, or other images, for example. This allows subject 12 to change the images that are used as biofeedback, reducing the risk that displayed images within second set of images 102 will have an adverse effect on the psychological state of subject 12 .
  • subject 12 interacts with detecting device 11 and loads the selected images from second set of images 102 for display by adaptive autostereoscopic display system 10 .
  • the output from detecting device 11 feeds into control mechanism 17 over interconnect cable 13 or over some other type of wireless interface.
  • Subject 12 can set a base state by recording measured levels of stress, using measurements of symptoms such as EMG, EEG, galvanic skin response, skin temperature, heart rate, blood pressure, eye movement, and the like.
  • An image from the selected images of second set of images 102 which relates to this level of stress can then be displayed on adaptive autostereoscopic display system 10.
  • Subject 12 can then begin a personal stress reduction regime and determine how the changing level of stress is manifested in the transition of images.
  • the images displayed to subject 12 could change from a chaotic state to a serene state.
  • the image transition may be from a first image to a second image, where the second image is very unlike the first image, with a corresponding measurement made of the change in stress symptoms for subject 12 , on the basis of the personalized preferred image response profile.
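  • One way to picture the mapping between the measured stress level and a position in a serene-to-chaotic image sequence is sketched below (Python; the linear mapping and the numbers are assumptions used only for illustration):

```python
# Illustrative sketch: pick an image whose place in the sequence corresponds
# to where the current stress level sits between the base level and a maximum.
def image_for_stress(level, base_level, max_level, sequence):
    """sequence is ordered from most serene (index 0) to most chaotic."""
    span = max(max_level - base_level, 1e-6)
    fraction = min(max((level - base_level) / span, 0.0), 1.0)
    return sequence[round(fraction * (len(sequence) - 1))]


sequence = ["serene.jpg", "calm.jpg", "busy.jpg", "chaotic.jpg"]
print(image_for_stress(level=55, base_level=40, max_level=90, sequence=sequence))
```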
  • Image classification may be based on resolution, color, contrast, scene content, or other characteristic.
  • subject 12 creates a profile using a profile set in an initial step 300 .
  • Subject 12 obtains an image profile set (step 301 A).
  • the profile set can include images arranged in a series of twos which are used for paired comparisons.
  • Subject 12 is then shown the first of the two images and then the second of the two images and asked to choose a preferred image (step 302).
  • FIG. 4 illustrates a flow chart with respect to the comparison of images of step 302.
  • FIG. 5 illustrates a selector device 80, connected to the system of the present invention, that can be used by subject 12 to choose, compare, and select images.
  • As shown in FIG. 5, selector device 80 includes selector buttons 81, 82, 83, and 84.
  • Selector button 81 corresponds to image A and is activated or depressed by subject 12 when image A provides a preferred response when compared to image B.
  • Selector button 83 corresponds to image B and is activated or depressed by subject 12 when image B provides the preferred response when compared to image A.
  • Selector button 82 can be activated or depressed by subject 12 when no image is preferred.
  • Selector button 84 is used to toggle between image A and image B.
  • Selector device 80 is connected via a connector 85 to control logic processor 50 ( FIG. 2 ) which records and stores the selections made by subject 12 .
  • Selector device 80 also connects to image source 42 and image generator 40 via control logic processor 50 .
  • Selector device 80 has a display panel 86 to indicate to subject 12 which image in the sequence is being displayed.
  • As part of step 302, subject 12 is then directed to choose the more relaxing image (step 302 A).
  • subject 12 can provide a direct response with respect to the preferred images by activating or pressing one of selector buttons 81 - 83 of selector device 80 .
  • preferred images can be automatically chosen based on a physiological measurement obtained from subject 12 by detecting device 11 as illustrated in FIG. 1 .
  • Next, there is a check to see if image "A" is selected (step 302 B). If the answer to step 302 B is no, there is a check to see if image "B" is selected (step 302 B′). If the answer to step 302 B′ is yes, the selection of image "B" is recorded (step 302 C). If the answer to step 302 B′ is no, then there is a recording that neither image has been selected (step 302 C′). If the answer to step 302 B is yes, there is a recording of the selection of image "A" (step 302 D).
  • After any of steps 302 C, 302 C′, or 302 D, there is a check to see if the image pair shown to subject 12 is the last image pair (step 302 E). If the answer to step 302 E is no, then the above steps are continued as noted in the flow chart of FIG. 4 until the last image pair is chosen and the complete selection of images is noted by subject 12. After step 302 E, the process proceeds to compile the results of the images chosen by subject 12 (step 302 F).
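  • The loop of steps 302 B through 302 F can be pictured in a few lines (Python; get_choice stands in for selector device 80 or an automatic physiological measurement, and its name and signature are assumptions for illustration):

```python
# Illustrative sketch of the paired-comparison loop of FIG. 4.
def run_paired_comparisons(image_pairs, get_choice):
    """Record the preferred image (or None) for each (A, B) pair, then
    compile a tally of how often each image was chosen (step 302 F)."""
    selections = [get_choice(a, b) for a, b in image_pairs]   # steps 302 B-302 D
    tally = {}
    for choice in selections:
        if choice is not None:                                # "no preference" recorded as None
            tally[choice] = tally.get(choice, 0) + 1
    return tally


pairs = [("ocean.jpg", "city.jpg"), ("forest.jpg", "ocean.jpg")]
print(run_paired_comparisons(pairs, lambda a, b: a))  # pretend the first image is always chosen
```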
  • each of the images in the profile set can have certain attributes or characteristics.
  • FIG. 6 shows a chart of sample images and their attributes. In viewing the images selected by subject 12 , predominant attributes or characteristics of the selected images could be determined and these attributes or characteristics utilized to help create the personalized preferred image response profile for subject 12 . Also, a search can be initiated using these attributes or characteristics as described in, for example U.S. Pat. No. 6,102,846, the subject matter of which is incorporated by reference.
  • In step 305, the selection results of the paired comparisons, as well as the assessment of the attributes of the chosen images or sequence of the images, provide a basis for creating a personalized preferred image response profile for subject 12 based on exhibited preferences.
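  • Deriving the profile from the predominant attributes of the chosen images might look like the following sketch (Python; the attribute names echo the spirit of FIG. 6 but are assumed here, not taken from it):

```python
# Illustrative sketch: the profile is the set of attributes that occur most
# often among the images the subject preferred.
from collections import Counter


def build_response_profile(chosen_images, image_attributes, top_n=3):
    """chosen_images: list of names; image_attributes: {name: list of attributes}."""
    counts = Counter(
        attr for name in chosen_images for attr in image_attributes.get(name, [])
    )
    return [attr for attr, _ in counts.most_common(top_n)]


attributes = {
    "ocean.jpg": ["water", "blue", "open space"],
    "lake.jpg": ["water", "green", "open space"],
    "meadow.jpg": ["green", "open space"],
}
print(build_response_profile(["ocean.jpg", "lake.jpg", "meadow.jpg"], attributes))
```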
  • the personalized preferred image response profile for subject 12 can now be used to select second set of images 102 from an image library as noted in step 306 or from personal images of subject 12 .
  • the process goes to step 400 in FIG. 8 .
  • In step 500, images from second set of images 102 are selected using the personalized preferred image response profile and viewed in categories (step 600). Thereafter, subject 12 can select image categories (step 700), view the images (step 800), select images (step 900) based on the personalized preferred image response profile, and arrange the images in a desired sequence for achieving stress reduction (step 1000) (not shown).
  • a computer program can be utilized to update the profile for subject 12 using data from the newly selected images (step 1010 ) (not shown) and the media for showing the images can be selected (step 1020 ) (not shown).
  • The images obtained by image source 42 can be provided in the form of video, photo CD, CD-ROM, floppy disk, DVD, lenticular imaging, downloaded images, or EEPROM.
  • subject 12 creates a personalized preferred image response profile by viewing a first set of images on the adaptive autostereoscopic display system 10 such as from a profile set that includes images arranged in pairs, and compares and chooses images from the first set of images, which provide a preferred response for subject 12 .
  • Those images which provide a preferred response, such as a desired stress response level, for subject 12 are chosen or are automatically selected based on measured stress levels of subject 12.
  • Selected images from this comparison are used to create a profile for subject 12 based on personal preferences.
  • the personalized preferred image response profile will define preferred characteristics that can be representative of common characteristics of the chosen images.
  • Subject 12 can then select a second set of images from an image library, where the images have characteristics that match the preferred characteristics of the personalized preferred image response profile.
  • adaptive autostereoscopic display system 10 using detector device 11 as illustrated in FIG. 1 , can be used to measure the present stress level of subject 12 .
  • adaptive autostereoscopic display system 10 can display selected images from the image library to subject 12 in a sequence chosen by subject 12 , in accordance with the measured stress level, to enable subject 12 to manage and/or control this stress level.
  • Subject 12 can be shown scenes known to induce stress and be taught, using biofeedback techniques, to control response to the images, for example.
  • subject 12 can visit a medical office (step 2000 ) to control stress using the present invention.
  • a medical practitioner seats the patient as subject 12 in adaptive autostereoscopic display system 10 (step 3000 ) as shown in FIG. 1 .
  • the practitioner controls adaptive autostereoscopic display system 10 to display images to subject 12 and records responses to these images (step 4000 ).
  • adaptive autostereoscopic display system 10 is further capable of interfacing with existing biofeedback equipment.
  • the practitioner or control software adjusts the image combination in view of the responses of subject 12 (step 5000 ).
  • the practitioner trains subject 12 to manage physiological responses to images (step 6000 ).
  • The medical data for subject 12 is then recorded and managed by adaptive autostereoscopic display system 10 (step 7000). Control logic in adaptive autostereoscopic display system 10 then continuously learns the responses of subject 12 as it builds the personalized preferred image response profile and chooses images that support improved health restoration (step 8000). Subject 12 can thereafter receive a personalized preferred image response profile for use in a self-care system (step 9000).
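  • The "continuously learns" behavior could be realized with a simple running update of per-attribute scores, as in the sketch below (Python; the exponential update rule and the relaxation-gain signal are assumptions chosen only to illustrate the idea):

```python
# Illustrative sketch: after an image is shown, nudge the stored score of each
# of its attributes toward the measured benefit (positive when stress dropped).
def update_profile(profile_scores, image_attributes, relaxation_gain, rate=0.2):
    for attr in image_attributes:
        old = profile_scores.get(attr, 0.0)
        profile_scores[attr] = (1 - rate) * old + rate * relaxation_gain
    return profile_scores


scores = {"water": 0.4, "crowded": -0.2}
print(update_profile(scores, ["water", "open space"], relaxation_gain=1.0))
```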
  • Image composition analysis can be used to help in building an image response profile, based on the attributes shown in FIG. 6 .
  • The profile can also be influenced by color analysis, timing preference, health baseline, health history, and the ability to learn and record variables for time of day, seasons, geographic location, and personal or family images, for example (step 9050). Having had the personalized preferred image response profile created by a medical practitioner, subject 12 can now go to a retailer or use an on-line remote connection, adapting the sequence described in the flow chart of FIG. 7, to obtain images that help manage stress.
  • The method and apparatus of the present invention overcome the disadvantage of generalized image selection. Rather than presenting subject 12 with images statistically chosen on the basis of the effects these images had on a large sample of individuals, the images are linked to personal responses. Subject 12 uses the personalized images in a device that uses images or feedback-controlled image properties as a biofeedback mechanism for achieving an improved psychological state.
  • adaptive autostereoscopic display system 10 could also be used for entertainment purposes.
  • the display of images or of an image sequence could be selectively adapted in order to obtain a desired type of response from subject 12 .
  • an adventure ride sequence could be speeded up, slowed down, or otherwise adapted to suit the response of each particular subject 12 .
  • the method and apparatus of the present invention also permits the use of a personalized preferred image response profile to allow subject 12 to choose images or a sequence of images from a number of different categories of images such as seascapes, desert scenes, forest scenes, other nature scenes, personal images, or computer-generated images according to data in the personalized preferred image response profile.
  • the personalized preferred image response profile can also be used for pre-selection, thereby preventing subject 12 from having to choose images or sets of images from a large library of images.
  • the images are selected by comparing the attributes of the images to the personalized preferred image response profile.
  • the method and apparatus of the present invention also permits subject 12 to use the personalized preferred image response profile to sort, compare, select and keep track of images.
  • the method and apparatus of the present invention also provides for a device which can be utilized to manage stress and at the same time is portable enough so that it can be used at home, at work, or while traveling.
  • adaptive autostereoscopic display system 10 is not limited to the use of images for creating a personalized preferred response profile and helping subject 12 to manage stress.
  • additional stimuli such as sound, smell, touch, or vibration, could also be used, alone or with images, as a basis for creating the personalized preferred response profile and helping subject 12 to manage stress.
  • adaptive autostereoscopic display system 10 could be used in other medical, therapeutic, or entertainment applications.
  • Additional types of feedback sensors could be employed to provide any of the more sophisticated sensing functions known in the virtual reality presentation arts, such as head-tracking or gaze-tracking, for example.

Abstract

An adaptive autostereoscopic display system (10) provides an apparatus for conditioning the psychological state, physiological state, or behavior of a subject (12) by displaying a stereoscopic virtual image at a left viewing pupil (14 l) and a right viewing pupil (14 r). A first set of images (100) is displayed and physiological response measurements are obtained from the subject (12). Based on the response of the subject (12), a personalized image response profile is obtained. Then, in order to condition the psychological state, physiological state, or behavior of the subject (12), a second set of images (102), based on the personalized image response profile, is displayed.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This is a divisional of application Ser. No. 10/212,342, filed Aug. 5, 2002.
  • FIELD OF THE INVENTION
  • This invention generally relates to the field of psychological health management and in particular relates to an autostereoscopic display system adapted for psychological health management and to a method for using an autostereoscopic display system for conditioning the psychological state of a subject for biofeedback, stress management, behavior modification, entertainment, and similar applications.
  • BACKGROUND OF THE INVENTION
  • The value of image display for conditioning the psychological state, physiological state, or overall behavior of a human subject is widely recognized and documented, with applications in numerous fields. In health-related fields such conditioning can be used for purposes such as stress management as well as for helping in the treatment of conditions such as anxiety, brain injury, or stroke, and other psychological and physiological conditions. In behavioral sciences, such conditioning can be applied to behavior modification, for example. In training applications, conditioning the psychological state, physiological state, or overall behavior of a human subject can be useful in conjunction with simulation systems. In entertainment fields, such conditioning, coupled with careful measurement techniques, could be used to adapt a visual entertainment experience to suit a particular human subject.
  • For the purpose of this application, it is instructive to clarify the meaning of the verb "condition" as used herein with reference to prior art devices as well as to the present invention. The verb "condition" is broadly defined in the Merriam-Webster Collegiate Dictionary as "to adapt, modify, or mold so as to conform to an environing culture" or "to modify so that an act or response previously associated with one stimulus becomes associated with another." Using this sense, the present invention is directed to an apparatus and method for conditioning the psychological state, physiological state, or behavior of a human subject by displaying images to a subject, measuring the subject's response, and adapting further display operation based upon the measured response of the subject. In broadest terms, the present invention is directed to apparatus and method solutions in health management, stress management, training, and entertainment. It is also instructive to observe that, in the broad sense used in this application, the concept of conditioning the psychological state of a subject encompasses that of modifying the physiological state or the behavioral response of the subject.
  • An area of particular interest in conditioning the psychological or physiological state of a subject relates to the measurement and management of stress. The measurement and management of a psychological and physiological state of a subject, such as stress, is a component of a health management program. In order to manage stress, which can have both a psychological and physiological component, it is useful to measure a physiological state of a subject. Useful types of measurements can include measuring galvanic skin response, temperature of fingers, toes, or other extremities, electromyographic (EMG) signals, electroencephalographic (EEG) signals, heart rate, blood pressure, etc., to determine the stress level or level of anxiety of the subject. Dilation of the eye pupil can also be a useful indicator of stress level. The results of these measurements can be converted into signals and fed back as an indication of the subject's level of stress. The subject's level of stress can be determined, measured, and compared to a predetermined base level, then converted into sound, light, heat, vibration, or images and fed back to the subject.
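  • As an illustration of how several such measurements might be combined and compared with a predetermined base level, consider the following sketch (Python; the signals, weights, and normalization are assumptions, not part of the disclosure):

```python
# Illustrative sketch: a weighted sum of relative deviations from the base level
# serves as a single stress indicator that can be fed back to the subject.
def stress_index(current, baseline, weights):
    """current/baseline: {signal: value}; weights: {signal: weight}."""
    index = 0.0
    for name, weight in weights.items():
        base = baseline.get(name)
        if base:
            index += weight * (current[name] - base) / abs(base)
    return index


baseline = {"heart_rate": 65, "skin_conductance": 2.0, "finger_temp_c": 33.5}
current = {"heart_rate": 80, "skin_conductance": 2.6, "finger_temp_c": 32.8}
# Finger temperature usually falls under stress, hence its negative weight.
weights = {"heart_rate": 0.4, "skin_conductance": 0.4, "finger_temp_c": -0.2}
print(round(stress_index(current, baseline, weights), 3))  # positive: above the base level
```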
  • Several methods for determining a change in stress levels are disclosed in U.S. Pat. No. 6,394,963 and commonly-assigned copending U.S. patent application Ser. No. 09/865,902. In employing stress-reducing techniques, the subject uses sound, light, and images to help control the stress response. Changes due to physical measures are shown to the subject by a biofeedback device by changing the sound, heat, vibration, light, or images. In the case of images, for example, the initial state may show an image out of focus; then, as the stress level decreases, the focus improves so that the image becomes more defined. In U.S. Pat. No. 5,465,729, measurements of electro-physiological quantities are used to control a presentation to a subject of a series of pre-stored audio-visual sequences. In this reference, the image does not have to provide feedback and can be used to achieve a relaxed state.
  • U.S. Pat. No. 3,855,998 shows an entertainment device that includes sensing means connected to the subject. In this reference, the sensing means can, for example, sense the subject's galvanic skin response and, according to the given measured state of the subject, provide a given type of audio-visual stimulation for a timed interval to hold the subject's attention or modify subject response to a desired state. At the end of the interval, the subject's state is again measured and a further timed audio-visual response, based on this measured state, is presented to the subject.
  • In U.S. Pat. No. 5,596,994, an automated and interactive positive motivation system is disclosed. The system of this arrangement permits a physician, counselor, or trainer to produce and send a series of motivational messages and/or questions to a subject to change or reinforce a specific behavioral response.
  • U.S. Pat. No. 6,149,586 discloses a system and method for diagnosing executive dysfunctions in patients using virtual reality (VR) technology. However, images shown to the subjects are displayed on a CRT monitor, constraining the capability of the system for achieving full engagement of the subject's attention.
  • Psychotherapists have found that mental visualization of images or guided imagery is a very effective tool for behavior modification therapy, an important factor in managing a subject's stress. Implementation of guided-imagery-based therapies can be hindered by a variety of factors, such as a subject's inability to create and properly control mental images and inability to practice and apply visualization techniques without assistance.
  • U.S. Pat. No. 6,102,846 discloses a system for managing a psychological and physiological state of a subject using images that are created according to a personalized preferred response profile and specifically tailored to the subject. The image display device disclosed in U.S. Pat. No. 6,102,846 is a high-resolution color monitor. However, as is noted above, display devices of this type are limited in providing realistic images. The subject can be too easily distracted and must exert some effort to become absorbed in the viewing experience with this type of display.
  • There is considerable interest in applying virtual reality (VR) imaging as part of behavior modification therapy, particularly for treatment of phobias and related neuroses. However, drawbacks with existing VR imaging techniques include cost and complexity, lack of realistic imaging, and an awkward viewing environment due to the need for the subject to wear goggles, headgear, or special glasses.
  • The potential value of autostereoscopic display systems is widely appreciated, particularly in entertainment and simulation fields. Autostereoscopic display systems include "immersion" systems, intended to provide a realistic viewing experience for a subject by visually surrounding the subject with a three-dimensional image having a very wide field of view. As differentiated from the more general category of stereoscopic displays, the autostereoscopic display is characterized by the absence of any requirement for a wearable item of any type, such as goggles, headgear, or special glasses, for example. That is, an autostereoscopic display attempts to provide "natural" viewing conditions for a subject.
  • Conventional display systems, such as the type disclosed in U.S. Pat. No. 6,102,846, use a color display monitor or project an image onto a screen for viewing. Optically, this type of image is termed a "real" image, with some form of display surface positioned where the image is formed in space by the optical system. However, for realistic viewing in an immersive imaging system, display of a "virtual" image, as contrasted with a real image, has distinct advantages. A virtual image, formed by an optical system, appears to be more natural in appearance than a real image, with a more lifelike light behavior. A virtual image is not projected onto a surface and therefore does not exhibit screen or monitor artifacts. The virtual image appears to the eye as if it has a spatial position, but this appearance is caused by divergence of light rays rather than by the actual formation of a focused image. A very small source object can provide the scene content for a large virtual image. A display system using a curved mirror and beamsplitter, such as is disclosed in U.S. Pat. No. 6,416,181, forms a virtual image that appears to be well behind the curved mirror in space. As a result, vergence and accommodation effects are improved over solutions using real image projection. Vergence refers to the degree to which the observer's eyes must be crossed in order to fuse the separate images of an object within the field of view. Vergence decreases, then vanishes as viewed objects become more distant. Accommodation refers to the requirement that the eye lens of the observer change shape to maintain retinal focus for the object of interest. It is known that there can be a temporary degradation of the observer's depth perception when the observer is exposed for a period of time to mismatched depth cues for vergence and accommodation. It is also known that this negative effect on depth perception can be mitigated when the accommodation cues correspond to distant image position, as can be provided using virtual imaging. In addition to providing an image that is easy for the eye to adapt to, virtual imaging allows a wide field of view.
  • It must be noted that, because of wide use of the term “virtual reality”, there is some confusion of terminology related to virtual images. In some contexts, virtual images are considered to be images that are solely computer-generated. However, for the purposes of the present application, references to “virtual images” refer to images formed optically in the manner described above and differentiated from real images. For the purposes of the present application, virtual image content may be either from natural sources or may be computer-generated. Virtual reality techniques may employ either real images, as is described with reference to U.S. Pat. Nos. 6,102,846 and 6,149,586 above, or may employ virtual images.
  • Pupil imaging also provides advantages for realistic autostereoscopic imaging. In pupil imaging, the eye pupil of the subject is optically conjugate to the projection lens pupil. This allows natural head movement if an eye-tracking and compensation mechanism is employed to adjust the viewing pupil position when the eye pupil is moved. With a system that updates the image display according to the position of left and right viewing pupils, some ability to “look around” an object can be achieved.
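  • A toy sketch of the compensation idea follows (Python; the coordinates, gain, and update rule are illustrative assumptions, and the mechanism described in the patent is optical and mechanical rather than a few lines of code):

```python
# Illustrative sketch: each frame, move the viewing pupil a fraction of the way
# toward the tracked eye pupil so that natural head movement is tolerated.
def track_viewing_pupil(viewing_xy, eye_xy, gain=0.5):
    vx, vy = viewing_xy
    ex, ey = eye_xy
    return (vx + gain * (ex - vx), vy + gain * (ey - vy))


position = (0.0, 0.0)
for eye in [(2.0, 1.0), (2.0, 1.0), (1.0, 0.5)]:   # simulated eye-tracker samples
    position = track_viewing_pupil(position, eye)
print(position)   # drifts toward the most recent eye position
```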
  • An acknowledged design goal for immersion systems is to provide the most realistic viewing environment possible. While this relates most pronouncedly to visual perception, it can also encompass auditory, tactile, and other sensory perception. It is well known to those skilled in the virtual reality art that, while the visual display is the primary component needed for an effective immersion experience, there is substantial added value in complementing visual accuracy with reinforcement using other senses of a subject. While the addition of auditory, tactile, and motion stimuli has been implemented for a more realistic and compelling motion picture experience for an audience, there is a need to provide additional sense stimuli in an autostereoscopic viewing system. Moreover, the use of such additional stimuli may be optimized using sensed feedback information from measurements obtained from a subject.
  • Thus, it can be seen that, while there have been some conventional approaches for conditioning the psychological and physiological state of a subject using displays that provide real images, there is a need for solutions that provide a more natural and realistic viewing experience. In particular, there would be benefits to providing an improved autostereoscopic imaging solution for viewing electronically processed images, where the solution provides a structurally simple apparatus, minimizes aberrations and image distortion, and meets demanding requirements for providing wide field of view with large pupil size, for compensating for subject head movement and interocular distance differences, and for providing additional sensory stimulation. At the same time, such a solution could serve as the basis for a system that enables a personalized image response profile to be developed and maintained for conditioning the psychological and physiological state of a subject or for conditioning a subject's behavior.
  • SUMMARY OF THE INVENTION
  • It is an object of the present invention to provide an improved autostereoscopic system for conditioning the psychological and physiological state of a subject or for conditioning a subject's behavior. With this object in mind, the present invention provides a system comprising:
      • (a) an autostereoscopic image display for providing a virtual image to the subject, the virtual image viewable at a right viewing pupil and a left viewing pupil;
      • (b) at least one feedback sensor for providing a physiological measurement from the subject; and
      • (c) a control logic processor for obtaining the physiological measurement from the at least one feedback sensor, for maintaining a response profile conditioned by the physiological measurement, and for controlling the selection and processing of the virtual image by the autostereoscopic image display based on the response profile.
  • A feature of the present invention is the use of an adaptive autostereoscopic imaging system for display of images to the subject. In a preferred embodiment, the system forms images using a ball lens that is conjugate to each viewing pupil due to its optical relationship to a curved mirror and a beamsplitter.
  • A further feature of the present invention is the use of system control logic for maintaining a profile for the subject, where the profile is used as a factor in determining image selection and display.
  • It is an advantage of the present invention that it provides an improved system and method that can be used therapeutically in the treatment of a wide variety of psychological and physiological disorders.
  • It is a further advantage of the present invention that it provides an immersive environment for management and conditioning of the psychological state of the subject. Distractions of external equipment, movement, or personnel are minimized so that the subject can concentrate on visual and other sensory stimuli provided by the system.
  • The present invention provides both a system and a method for helping to condition or manage the psychological and physiological state of a subject by utilizing images and other stimuli such as sound, smell, and the like. In the context of the present invention, the images viewed can be still images, audio-visual images, or video clips, for example. The apparatus and method of the present invention can be part of a personal biofeedback program for managing stress responses or modifying behavior in some way. With the method and apparatus of the present invention, it is possible to overcome the disadvantage of generalized image selection found in conventional arrangements. That is, with the apparatus and method of the present invention, visual and related stimuli are based on personal responses of the subject, rather than on generalized responses obtained from a larger group of subjects. A subject can then utilize these personal images or stimuli using an adaptive autostereoscopic display system along with, or as part of, a biofeedback mechanism for altering the subject's psychological and physiological state, so as to manage and/or reduce stress levels, for example.
  • The method of the present invention comprises the steps of creating a personalized preferred image response profile for a subject by having the subject view a first set of images and then choose images from the first set of images which provide a preferred response for the subject, wherein the personalized preferred image response profile defines preferred characteristics which are representative of common characteristics of the chosen images; selecting a second set of images from an image library which include characteristics that match the preferred characteristics of the personalized preferred image response profile; and displaying the selected second set of images to the subject to help to manage a psychological and physiological state of the subject.
  • The present invention further relates to a method of changing, managing or helping a subject to manage a psychological and physiological state using images which comprises the steps of showing a first set of images to the subject; measuring a physiological state of the subject as the subject views the first set of images; and recording images from the first set of images which provide a preferred response based on the measured physiological state of the subject, so as to create a personalized preferred image response profile that defines preferred characteristics which are representative of common characteristics of the recorded preferred images.
  • The present invention also relates to a system which changes, manages, or helps to manage a psychological and physiological state of a subject using images. The system comprises an image display device which is adapted to store a personalized preferred image response profile for a subject and to store and display a set of images from an image library; and a detector device which measures physiological characteristics of the subject, wherein the physiological characteristics are indicative of a stress level of the subject. The image display device comprises a control mechanism which selects images from the set of images that include attributes that match attributes of the personalized preferred image response profile, and displays the selected images in a desired sequence in accordance with a stress level of the subject as measured by the detector device, to control a stress level of the subject.
  • The present invention also relates to a method of helping to manage a subject's psychological and physiological state, the method comprising the steps of showing a set of stimuli to the subject; measuring a physiological state of the subject as the subject views the set of stimuli; and making a recording of stimuli from the set of stimuli which provide a preferred response based on the measured physiological state of the subject, so as to create a personalized preferred response profile that defines preferred characteristics which are representative of common characteristics of the recorded stimuli.
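As a concrete, non-authoritative illustration of the methods summarized above, the following sketch builds a profile of preferred characteristics from images a subject chose and then selects a matching second set from an image library. The attribute vocabulary, the data layout, and the matching threshold are assumptions introduced only for illustration; they are not part of the claimed method.

```python
# Illustrative sketch only: attribute names, data layout, and thresholds are assumed.
from collections import Counter

def build_profile(chosen_images):
    """Keep attributes shared by more than half of the images the subject chose."""
    counts = Counter(attr for img in chosen_images for attr in img["attributes"])
    return {attr for attr, n in counts.items() if n > len(chosen_images) / 2}

def select_second_set(library, profile, minimum_matches=2):
    """Pick library images whose attributes overlap the preferred characteristics."""
    return [img["name"] for img in library
            if len(profile & set(img["attributes"])) >= minimum_matches]

if __name__ == "__main__":
    chosen = [
        {"name": "beach_sunset", "attributes": ["water", "warm_colors", "open_space"]},
        {"name": "lake_morning", "attributes": ["water", "open_space", "mist"]},
    ]
    library = [
        {"name": "ocean_calm", "attributes": ["water", "open_space"]},
        {"name": "city_street", "attributes": ["crowds", "traffic"]},
    ]
    profile = build_profile(chosen)             # {'water', 'open_space'}
    print(select_second_set(library, profile))  # ['ocean_calm']
```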
  • These and other objects, features, and advantages of the present invention will become apparent to those skilled in the art upon a reading of the following detailed description when taken in conjunction with the drawings wherein there is shown and described an illustrative embodiment of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • While the specification concludes with claims particularly pointing out and distinctly claiming the subject matter of the present invention, it is believed that the invention will be better understood from the following description when taken in conjunction with the accompanying drawings, wherein:
  • FIG. 1 is a perspective view showing major components of an adaptive autostereoscopic imaging system of the present invention;
  • FIG. 2 is a block diagram showing key image forming, control logic, stimulus and feedback components of the system of the present invention;
  • FIG. 3 is a flow chart giving process steps for the present invention;
  • FIG. 4 is a further flow chart showing a comparison of images, which can be utilized within the system of the present invention;
  • FIG. 5 is an example of a selector device, which can be used for paired comparisons of images;
  • FIG. 6 is a chart illustrating an example of a comparison of images and attributes which can be utilized with the system of the present invention;
  • FIG. 7 is a flow chart of an alternative system of the present invention; and
  • FIG. 8 is a flow chart showing image selection.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The present description is directed in particular to elements forming part of, or cooperating more directly with, apparatus in accordance with the invention. It is to be understood that elements not specifically shown or described may take various forms well known to those skilled in the art.
  • Referring to FIG. 1, there is shown an adaptive autostereoscopic display system 10, arranged as a biofeedback imaging apparatus. Adaptive autostereoscopic display system 10 can be used to monitor, condition, and manage the psychological and physiological state of a subject 12, either controlled by another person, such as a medical professional, or programmed for control by subject 12. Although the present invention will be primarily described as using images for behavior modification such as stress management, it is recognized that other sensory stimuli such as sound, smell, touch, for example, can be provided to subject 12 within the context of the present invention. Within this detailed description, it must be emphasized that the broad definitions for the terms “conditioning” and “psychological state,” given in the Background section of this application, apply to the present invention.
  • Detailed description of the optical subsystem of adaptive autostereoscopic display system 10 is given in commonly assigned, copending U.S. patent application Ser. No. 09/854,699. In the preferred embodiment, adaptive autostereoscopic display system 10 provides virtual autostereoscopic images at left and right viewing pupils 14 l and 14 r. Left and right viewing pupil forming apparatus 36 l and 36 r each project images through a left and right ball lens assembly 30 l and 30 r onto a beamsplitter 16. Beamsplitter 16 interacts with a curved mirror 24 to form a virtual image for each viewing pupil 14 l and 14 r. Within left and right viewing pupil forming apparatus 36 l and 36 r, images can be generated using a spatial light modulator such as a liquid crystal device (LCD) or a digital micromirror device (DMD), for example. Alternately, images could be generated by one or more lasers, using a grating light valve or similar electromechanical device, or using an OLED.
  • As illustrated in FIG. 1, subject 12 is seated in an adjustable chair 32 for viewing an image projected by an autostereoscopic image delivery system 18 to left viewing pupil 14 l and to right viewing pupil 14 r. Autostereoscopic image delivery system 18 comprises left viewing pupil forming apparatus 36 l for forming and positioning left viewing pupil 14 l and right viewing pupil forming apparatus 36 r for forming and positioning right viewing pupil 14 r. A housing 58 provides a structure for mounting the various components of autostereoscopic image delivery system 18 and related components.
  • Referring to FIG. 2, there is shown a schematic block diagram with key control and signal paths for major components of adaptive autostereoscopic display system 10. An image source 42 provides image content to an image generator 40, part of autostereoscopic image delivery system 18. Image generator 40, comprising a digital image modifying device under the control of a control logic processor 50, then cooperates with left and right eye projection apparatus 20 l and 20 r and viewing pupil forming apparatus 36 l and 36 r (FIG. 1) to provide stereoscopic virtual images at left and right viewing pupils 14 l and 14 r.
  • Image source 42 may provide any of a number of types of images, such as, but not limited to, the following:
      • (a) Live images from cameras locally or remotely positioned.
      • (b) Images from film, such as conventional motion picture images.
      • (c) Images processed digitally, such as digital cinema images for example. This can include images stored on a storage medium, such as a computer hard disk or removable storage device, for example.
      • (d) Images generated digitally, such as computer simulations. This can include images stored on a storage medium, such as a computer hard disk or removable storage device, for example.
  • From the description of FIGS. 1 and 2, it can be observed that similar optical components within autostereoscopic image delivery system 18 are used to present separate left and right images to each eye of subject 12. Where the description that follows applies in general to either the left or right components, the appended “l” and “r” designators are omitted unless otherwise needed.
  • Referring again to FIGS. 1 and 2, control logic processor 50 controls the operation of image generator 40, the position of projection apparatus 20, and the overall operation of a projection translation apparatus 60 within autostereoscopic image delivery system 18. A beamsplitter positioning apparatus 60 b and a mirror positioning apparatus 60 m would enable adaptive autostereoscopic display system 10 to adapt to changes in position of subject 12 as well as to changes in height, head position, and the like. Control logic processor 50 may also control a chair servo mechanism 66 or movable platform and can accept feedback data about subject 12 from subject feedback sensors 52, such as cameras 54 or other devices, such as photosensors, for example. A manual feedback control 104 provides an alternate means for obtaining instructions from subject 12.
  • Adaptive autostereoscopic display system 10 could be equipped with a feedback control loop for sensing and responding to a position or gesture of subject 12. For example, as is described in commonly-assigned copending U.S. patent application Ser. No. 09/854,699, adaptive autostereoscopic display system 10 could be equipped to sense and compensate for an interocular distance or a gesture of subject 12 or could use speech recognition as a sensed input. In addition, adaptive autostereoscopic display system 10 could be designed to provide some measure of compensation for parallax error or “see-around” capability, adjusting left and right eye images based on eye positions observed for subject 12.
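As a rough, hypothetical sketch of the kind of compensation described above, the following Python fragment turns observed left- and right-eye positions (such as might be reported by cameras 54) into translation offsets for the corresponding viewing pupil forming apparatus, together with an interocular-distance error term. The coordinate convention, units, nominal spacing, and function name are assumptions for illustration only.

```python
# Hypothetical sketch only: observed eye positions (x, y) in millimetres, measured
# relative to a nominal, centered head position, are converted into translation
# offsets for the left and right viewing pupil forming apparatus, plus an
# interocular-distance error that the system could also compensate for.
def pupil_compensation(left_eye_mm, right_eye_mm, nominal_interocular_mm=63.0):
    (lx, ly), (rx, ry) = left_eye_mm, right_eye_mm
    interocular_error = (rx - lx) - nominal_interocular_mm
    # Each pupil forming apparatus is translated so its viewing pupil follows its eye;
    # the nominal pupil positions are assumed to sit at +/- half the nominal spacing.
    left_offset = (lx + nominal_interocular_mm / 2.0, ly)
    right_offset = (rx - nominal_interocular_mm / 2.0, ry)
    return {"left_offset_mm": left_offset,
            "right_offset_mm": right_offset,
            "interocular_error_mm": interocular_error}

if __name__ == "__main__":
    print(pupil_compensation(left_eye_mm=(-34.0, 3.0), right_eye_mm=(31.0, 3.5)))
```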
  • Control logic processor 50 may also control other optional output devices for controlling vibration, temperature, fans, or other devices. Tactile output could be provided, such as by means of a glove (not shown) or by one or more fans or other devices. These tactile devices could also control temperature, such as the temperature of air from a fan outlet, for example. An olfactory output apparatus (not shown) could be employed as an output device for emitting an odor perceptible to subject 12. Optional audio content from an audio source 70, also under control of control logic processor 50, can be directed to a speaker system 62 and to one or more speakers 64. Control logic processor 50 is a computer of some type, possibly comprising a dedicated CPU or microprocessor, programmed to generate output commands based on program instructions and conditioned by sensed input feedback data. For example, FIG. 1 shows a detector device 11 which can be operationally associated as a feedback source with adaptive autostereoscopic display system 10. Detector device 11 can be a measuring or monitoring apparatus for obtaining feedback data from subject 12, such as galvanic skin response, temperature of fingers or other extremities, blood pressure, pulse rate, breathing, eye movements, or other functions that indicate the level of stress or other condition of subject 12. Detector device 11 can be a device attached to the body in a non-invasive manner, could be an instrumented glove or other device that is easily worn, could be a manipulable device, or could be a non-contact sensing device, such as an optical monitor for measuring pupil dilation, for example. An interconnect cable 13 or other suitable interface mechanism connects detector device 11 with a control mechanism 17. Control mechanism 17 may be programmed to interact with detector device 11 and cause adaptive autostereoscopic display system 10 to display a preferred image based on the measured physiological characteristics or level of stress of subject 12.
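The way such feedback data might be reduced to a single control value can be pictured with the short sketch below; the sensor names, the assumed "normal" ranges, and the equal weighting are illustrative assumptions rather than values taken from the disclosure.

```python
# Illustrative sketch: several sensor readings are normalized against assumed
# "normal" ranges and averaged into a single stress index between 0 and 1.
NORMAL_RANGES = {
    "heart_rate_bpm": (60.0, 100.0),
    "skin_conductance_uS": (1.0, 20.0),
    "finger_temperature_C": (36.0, 33.0),  # cooler extremities read as higher stress
}

def normalize(name, value):
    low, high = NORMAL_RANGES[name]
    return max(0.0, min(1.0, (value - low) / (high - low)))

def stress_index(readings):
    """Equal-weight average of normalized readings: 0.0 relaxed, 1.0 stressed."""
    scores = [normalize(name, value) for name, value in readings.items()]
    return sum(scores) / len(scores)

if __name__ == "__main__":
    sample = {"heart_rate_bpm": 88.0, "skin_conductance_uS": 12.0,
              "finger_temperature_C": 34.5}
    print(round(stress_index(sample), 2))  # roughly 0.59 for this sample
```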
  • For example, control mechanism 17 can detect changes in stress-related physiological functions of subject 12 and trigger a change in the sequence or type of images displayed by adaptive autostereoscopic display system 10. More specifically, control mechanism 17 can include software that is designed to select images from a first set of images 100 or from a second set of images 102 supplied by image source 42, where the selected set 100 or 102 is compatible with attributes defined by a personalized preferred image response profile for subject 12. Adaptive autostereoscopic display system 10 would then display the selected images in a desired sequence in accordance with the stress level of subject 12 as measured by detector device 11, to help manage stress, for example.
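A minimal sketch of this trigger logic follows, assuming the stress reading has already been normalized to the range 0 to 1; the margin, the sequence names, and the switching policy are illustrative assumptions.

```python
# Minimal sketch of the trigger logic: a normalized stress reading that moves beyond
# an assumed margin switches between a calming and a more engaging image sequence.
def detect_stress_change(previous, current, margin=0.15):
    """Return 'rising', 'falling', or 'steady' for a normalized stress reading."""
    if current - previous > margin:
        return "rising"
    if previous - current > margin:
        return "falling"
    return "steady"

def choose_sequence(trend, calming_sequence, engaging_sequence, current_sequence):
    """Switch to calming imagery when stress rises, richer imagery as it falls."""
    if trend == "rising":
        return calming_sequence
    if trend == "falling":
        return engaging_sequence
    return current_sequence

if __name__ == "__main__":
    calming = ["quiet_lake", "still_meadow"]
    engaging = ["harbor", "forest_trail"]
    trend = detect_stress_change(previous=0.4, current=0.65)
    print(trend, choose_sequence(trend, calming, engaging, current_sequence=engaging))
```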
  • First set of images 100 may be representative of a personalized preferred image response profile for subject 12, the details of which will be described later. First set of images 100 can be tailored to information obtained from subject 12 and can include a series of images based on a variety of themes (such as ocean, forest, desert, or sunset themes, for example). Alternately, first set of images 100 may include personal images of subject 12, such as images of family and friends, for example. Images in first or second set of images 100 or 102 can be arranged in a preset sequence, such as from chaotic, to ordered, to placid, as might be useful for helping to modify behavior and reduce stress, for example. Adaptive autostereoscopic display system 10 can further be adapted to store for display, in image source 42, second set of images 102 from an image library or from personal images of subject 12. As previously discussed, the stored images can be of any type, such as still images, audio-visual images, video clips, or computer-generated images, for example.
  • Creating a Personalized Preferred Image Response Profile
  • The process of creating a personalized preferred image response profile for determining first set of images 100 for each subject 12 will now be described. The personalized preferred image response profile is created by having subject 12 view a wide variety of images and measuring physiological effects on subject 12 as an indicator of psychological state. The measurements can be made using observer feedback sensor 52, camera 54, or detector device 11. Measurements could record one or more physiological symptoms, such as, but not limited to, EMG, EEG, galvanic skin response, skin temperature, heart rate, blood pressure, eye movement, or pupil dilation, for example. The measurements obtained are correlated to the corresponding image or image sequence viewed by subject 12 at the time. The measured results indicate how subject 12 reacts to a specific image or to a sequence of images. From this data, a personalized preferred image response profile is created for subject 12. In practice, first set of images 100 will generally be images that provide a preferred response for the type of behavior modification desired, such as lowering a stress level. In this way, the personalized preferred image response profile can include data from first set of images 100 that is representative of common characteristics, or attributes, of first set of images 100 that tend to provide a preferred response for the individual. The personalized preferred image response profile can then be used to select images from an image library which includes second set of images 102. The selected images can be used as personal biofeedback images by subject 12. Thus, by using the personalized preferred image response profile, images are selected that have a desired effect for subject 12. More generally, the personalized preferred image response profile may comprise a set of information that describes the selected images and other images that match the response profile.
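One hedged way to picture the correlation step is sketched below: each viewed image is paired with a normalized physiological reading, and the images that produced the lowest readings are recorded as the preferred-response images. The fraction retained and the assumption that lower readings are preferred are illustrative choices, not requirements of the disclosure.

```python
# Illustrative sketch, not the patented method: each viewed image is paired with a
# normalized physiological reading, and the lowest-reading images are recorded as
# the preferred-response images used to build the profile.
def record_preferred_images(viewings, keep_fraction=0.25):
    """viewings: list of (image_name, stress_reading); lower readings are preferred."""
    ranked = sorted(viewings, key=lambda pair: pair[1])
    keep = max(1, int(len(ranked) * keep_fraction))
    return [name for name, _ in ranked[:keep]]

if __name__ == "__main__":
    viewings = [("surf", 0.55), ("lake", 0.20), ("city", 0.75), ("meadow", 0.25)]
    print(record_preferred_images(viewings, keep_fraction=0.5))  # ['lake', 'meadow']
```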
  • As an example, to create a personalized preferred image response profile, subject 12 accesses an image library stored in image source 42 and keys in a code that links to the personalized preferred image response profile specific to subject 12. This personalized preferred image response profile is then used to select images from the image library. These selected images are displayed so that subject 12 can choose a desired set. This selected set can then be loaded as second set of images 102, for example. The personalized preferred image response profile allows subject 12 to pick from a variety of categories such as seascapes, desert scenes, or forest scenes, or from personal images such as images of home, a garden, a favorite museum, or pets, for example. This allows subject 12 to change the images that are used as biofeedback, reducing the risk that displayed images within second set of images 102 will have an adverse effect on the psychological state of subject 12.
  • Once a personalized preferred image response profile is set up, subject 12 interacts with detector device 11 and loads the selected images from second set of images 102 for display by adaptive autostereoscopic display system 10. The output from detector device 11 feeds into control mechanism 17 over interconnect cable 13 or over another type of interface, such as a wireless link. Subject 12 can set a base state by recording measured levels of stress, using measurements of symptoms such as EMG, EEG, galvanic skin response, skin temperature, heart rate, blood pressure, eye movement, and the like. An image from the selected images of second set of images 102 which relates to this level of stress can then be displayed on adaptive autostereoscopic display system 10. At this point, subject 12 can begin a personal stress reduction regime and observe how the changing level of stress is manifested in the transition of images. For example, as the stress level decreases, the images displayed to subject 12 could change from a chaotic state to a serene state. In another arrangement, the image transition may be from a first image to a second image, where the second image is very unlike the first image, with a corresponding measurement made of the change in stress symptoms for subject 12, on the basis of the personalized preferred image response profile. Image classification may be based on resolution, color, contrast, scene content, or other characteristics.
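Assuming the selected images have been arranged from most serene to most chaotic and the stress reading has been normalized, the mapping from a measured stress level to a displayed image might look like the sketch below; the linear indexing and the 0-to-1 normalization are assumptions for illustration.

```python
# Minimal sketch, assuming the selected images are ordered from most serene to most
# chaotic and the stress reading has been normalized to the range 0 to 1.
def image_for_stress(sequence_serene_to_chaotic, stress_reading):
    """1.0 maps to the chaotic end of the sequence, 0.0 to the serene end."""
    clamped = max(0.0, min(1.0, stress_reading))
    index = round(clamped * (len(sequence_serene_to_chaotic) - 1))
    return sequence_serene_to_chaotic[index]

if __name__ == "__main__":
    sequence = ["still_meadow", "quiet_lake", "harbor", "city_traffic"]
    for reading in (0.9, 0.6, 0.3, 0.05):
        print(reading, "->", image_for_stress(sequence, reading))
```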
  • Referring now to FIG. 3, subject 12 creates a profile using a profile set in an initial step 300. Subject 12 then obtains an image profile set (step 301A). As illustrated in step 301B, the profile set can include images arranged in pairs which are used for paired comparisons. Subject 12 is then shown the first of the two images and then the second of the two images and asked to choose a preferred image (step 302). FIG. 4 illustrates a flow chart with respect to the comparison of images of step 302, while FIG. 5 illustrates a selector device 80, connected to the system of the present invention, that can be used by subject 12 to choose, compare, and select images. As shown in FIG. 5, selector device 80 includes selector buttons 81, 82, 83, and 84. Selector button 81 corresponds to image A and is activated or depressed by subject 12 when image A provides a preferred response when compared to image B. Selector button 83 corresponds to image B and is activated or depressed by subject 12 when image B provides the preferred response when compared to image A. Selector button 82 can be activated or depressed by subject 12 when neither image is preferred. Selector button 84 is used to toggle between image A and image B. Selector device 80 is connected via a connector 85 to control logic processor 50 (FIG. 2), which records and stores the selections made by subject 12. Selector device 80 also connects to image source 42 and image generator 40 via control logic processor 50. Selector device 80 has a display panel 86 to indicate to subject 12 which image in the sequence is being displayed.
  • As noted in FIG. 4, after subject 12 is shown a pair of images (step 302), subject 12 is then directed to choose the more relaxing image (step 302A). In step 302A, subject 12 can provide a direct response with respect to the preferred image by activating or pressing one of selector buttons 81-83 of selector device 80. As an alternative, preferred images can be automatically chosen based on a physiological measurement obtained from subject 12 by detector device 11 as illustrated in FIG. 1.
  • Referring again to FIG. 4, after step 302A, there is a check to see if an image “A” is selected (step 302B). If the answer to step 302B is no, there is a check to see if image “B” is selected (step 302B′). If the answer to step 302B′ is yes, the selection of image “B” is recorded (step 302C). If the answer to step 302B′ is no, then there is a recording that neither image has been selected (step 302C′). If the answer to step 302B is yes, there is a recording of the selection of image “A” (step 302D). After any of steps 302C, 302C′, or 302D, there is a check to see if the image pair shown to subject 12 is the last image pair (step 302E). If the answer to step 302E is no, the above steps are repeated as noted in the flow chart of FIG. 4 until the last image pair has been shown and subject 12 has completed the selection of images. After step 302E, the process proceeds to compile the results of the images chosen by subject 12 (step 302F).
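The branching of FIG. 4 can be condensed into a short loop, sketched below under the assumption that each selection arrives as a simple string ("A", "B", or "neither"), whether it comes from selector device 80 or from an automatic choice driven by detector device 11; the function and data names are hypothetical.

```python
# Hedged sketch of the paired-comparison loop of FIG. 4.
def run_paired_comparisons(image_pairs, get_selection):
    """Record every choice (including 'neither') and compile the chosen images."""
    log, chosen = [], []
    for image_a, image_b in image_pairs:
        selection = get_selection(image_a, image_b)   # "A", "B", or "neither"
        log.append(((image_a, image_b), selection))   # steps 302C, 302C', 302D
        if selection == "A":
            chosen.append(image_a)
        elif selection == "B":
            chosen.append(image_b)
    return log, chosen                                # step 302F compiles the results

if __name__ == "__main__":
    pairs = [("lake", "city"), ("meadow", "surf"), ("desert", "harbor")]
    scripted = iter(["A", "B", "neither"])
    log, chosen = run_paired_comparisons(pairs, lambda a, b: next(scripted))
    print(chosen)  # ['lake', 'surf']
```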
  • Referring back to FIG. 3, the chosen images or choices are thereafter recorded and stored in memory (step 303). At this point, the selection process could be continued until internal consistencies are achieved across images (step 304). For example, and with reference to FIG. 6, each of the images in the profile set can have certain attributes or characteristics. FIG. 6 shows a chart of sample images and their attributes. In viewing the images selected by subject 12, predominant attributes or characteristics of the selected images could be determined and these attributes or characteristics utilized to help create the personalized preferred image response profile for subject 12. Also, a search can be initiated using these attributes or characteristics as described in, for example, U.S. Pat. No. 6,102,846, the subject matter of which is incorporated by reference. In step 305, the selection results of the paired comparisons, as well as the assessment of the attributes of the chosen images or sequence of images, provide a basis for creating a personalized preferred image response profile for subject 12 based on exhibited preferences. The personalized preferred image response profile for subject 12 can now be used to select second set of images 102 from an image library, as noted in step 306, or from personal images of subject 12. At the conclusion of step 306, the process goes to step 400 in FIG. 8.
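The attribute-based search mentioned above might be pictured as a simple ranking of library images by how many profile attributes they share, as in the sketch below; the attribute names and the scoring rule are assumptions and do not reflect the method of U.S. Pat. No. 6,102,846.

```python
# Illustrative sketch: rank library images by overlap with the profile's attributes.
def search_library(library, profile_attributes, top_n=5):
    scored = [(len(set(img["attributes"]) & set(profile_attributes)), img["name"])
              for img in library]
    scored.sort(key=lambda item: item[0], reverse=True)
    return [name for score, name in scored[:top_n] if score > 0]

if __name__ == "__main__":
    profile = ["water", "stillness", "cool_colors"]
    library = [
        {"name": "quiet_lake", "attributes": ["water", "stillness"]},
        {"name": "city_street", "attributes": ["crowds", "traffic"]},
        {"name": "harbor", "attributes": ["water", "cool_colors"]},
    ]
    print(search_library(library, profile))  # ['quiet_lake', 'harbor']
```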
  • Referring to FIG. 8, there is shown the next stage in this process, whereby subject 12 can use adaptive autostereoscopic display system 10, as illustrated in FIG. 1, with an image library including second set of images 102. In step 500, images from second set of images 102 are selected using the personalized preferred image response profile and are viewed in categories (step 600). Thereafter, subject 12 can select image categories (step 700), view the images (step 800), select images (step 900) based on the personalized preferred image response profile, and arrange the images in a desired sequence for achieving stress reduction (step 1000) (not shown). As a further option, a computer program can be utilized to update the profile for subject 12 using data from the newly selected images (step 1010) (not shown), and the media for showing the images can be selected (step 1020) (not shown).
  • The images obtained by image source 42 can be provided in the form of video, photo CD, CD-ROM, floppy disk, DVD, lenticular imaging, downloaded files, or EEPROM, for example.
  • With the process of the present invention, subject 12 creates a personalized preferred image response profile by viewing a first set of images on adaptive autostereoscopic display system 10, such as from a profile set that includes images arranged in pairs, and by comparing and choosing the images from the first set that provide a preferred response for subject 12. In making the comparison choices between images, as illustrated in the flow charts of FIGS. 3 and 4, those images which provide a preferred response, such as a reduced stress level, for subject 12 are chosen or are automatically selected based on measured stress levels of subject 12. Selected images from this comparison are used to create a profile for subject 12 based on personal preferences. Thus, the personalized preferred image response profile will define preferred characteristics that are representative of common characteristics of the chosen images. Subject 12 can then select a second set of images from an image library, where the images have characteristics that match the preferred characteristics of the personalized preferred image response profile.
  • Having created the personalized preferred image response profile, which is then stored in the image source 42, adaptive autostereoscopic display system 10, using detector device 11 as illustrated in FIG. 1, can be used to measure the present stress level of subject 12. Based on the personalized preferred image response profile, adaptive autostereoscopic display system 10 can display selected images from the image library to subject 12 in a sequence chosen by subject 12, in accordance with the measured stress level, to enable subject 12 to manage and/or control this stress level. Subject 12 can be shown scenes known to induce stress and be taught, using biofeedback techniques, to control response to the images, for example.
  • In a further aspect of the present invention, as shown by the flow chart of FIG. 7, subject 12 can visit a medical office (step 2000) to control stress using the present invention. As illustrated in FIG. 7, a medical practitioner seats the patient as subject 12 in adaptive autostereoscopic display system 10 (step 3000) as shown in FIG. 1. The practitioner then controls adaptive autostereoscopic display system 10 to display images to subject 12 and records responses to these images (step 4000). It is noted that adaptive autostereoscopic display system 10 is further capable of interfacing with existing biofeedback equipment. Thereafter, the practitioner or control software adjusts the image combination in view of the responses of subject 12 (step 5000). Next, the practitioner trains subject 12 to manage physiological responses to images (step 6000). The medical data for subject 12 is then recorded and managed by adaptive autostereoscopic display system 10 (step 7000). Control logic in adaptive autostereoscopic display system 10 then continuously learns the responses of subject 12 as it builds the personalized preferred image response profile and chooses images that support improved health restoration (step 8000). Subject 12 can thereafter receive a personalized preferred image response profile for use in a self-care system (step 9000).
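The "continuously learns" behavior of step 8000 could, for example, be approximated by nudging per-attribute weights after each displayed image according to the measured change in the subject's state, as in the hedged sketch below; the update rule and learning rate are assumptions, not the patented algorithm.

```python
# Hedged sketch of a learning step: attributes that preceded a drop in measured
# stress gain weight in the profile; the rule and rate are illustrative assumptions.
def update_profile(weights, image_attributes, stress_before, stress_after, rate=0.1):
    improvement = stress_before - stress_after      # positive when stress fell
    for attr in image_attributes:
        weights[attr] = weights.get(attr, 0.0) + rate * improvement
    return weights

if __name__ == "__main__":
    weights = {}
    update_profile(weights, ["water", "stillness"], stress_before=0.7, stress_after=0.5)
    update_profile(weights, ["crowds"], stress_before=0.5, stress_after=0.6)
    print(weights)  # water/stillness weights rise; crowds falls below zero
```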
  • Image composition analysis can be used to help in building an image response profile, based on the attributes shown in FIG. 6. The profile can also be influenced by color analysis, timing preference, health baseline, health history, and the ability to learn and record variables for time of day, season, geography, and personal or family images, for example (step 9050). With the personalized preferred image response profile created by a medical practitioner, subject 12 can now go to a retailer or use an on-line remote connection, adapting the sequence described in the flow chart of FIG. 7, to obtain images that help manage stress.
  • In this way, the method and apparatus of the present invention overcome the disadvantage of generalized image selection. Rather than presenting subject 12 with images statistically chosen on the basis of the effects these images had on a large sample of individuals, the images are linked to personal responses. Subject 12 uses the personalized images in a device that uses images, or feedback-controlled image properties, as a biofeedback mechanism for achieving an improved psychological state.
  • It is recognized that in addition to managing stress, this method and apparatus can also be used as a tool to motivate, teach, focus, or visualize. In addition, adaptive autostereoscopic display system 10 could also be used for entertainment purposes. By measuring physiological response, the display of images or of an image sequence could be selectively adapted in order to obtain a desired type of response from subject 12. Thus, for example, an adventure ride sequence could be speeded up, slowed down, or otherwise adapted to suit the response of each particular subject 12.
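For the entertainment use just described, one simple, assumed control law would scale the playback rate of a ride sequence toward a target arousal level, as sketched below; the target, gain, and limits are illustrative values only.

```python
# Hedged sketch: adapt a ride sequence's playback rate to a measured arousal reading.
def adjust_playback_rate(current_rate, arousal, target=0.5, gain=0.5,
                         minimum=0.5, maximum=2.0):
    """Speed up when the subject is under-aroused, slow down when over-aroused."""
    adjusted = current_rate * (1.0 + gain * (target - arousal))
    return max(minimum, min(maximum, adjusted))

if __name__ == "__main__":
    rate = 1.0
    for arousal in (0.2, 0.4, 0.8):
        rate = adjust_playback_rate(rate, arousal)
        print(arousal, "->", round(rate, 2))
```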
  • The method and apparatus of the present invention also permits the use of a personalized preferred image response profile to allow subject 12 to choose images or a sequence of images from a number of different categories of images such as seascapes, desert scenes, forest scenes, other nature scenes, personal images, or computer-generated images according to data in the personalized preferred image response profile.
  • The personalized preferred image response profile can also be used for pre-selection, thereby preventing subject 12 from having to choose images or sets of images from a large library of images. The images are selected by comparing the attributes of the images to the personalized preferred image response profile.
  • The method and apparatus of the present invention also permits subject 12 to use the personalized preferred image response profile to sort, compare, select and keep track of images. With the method and apparatus of the present invention, it is also possible to generate a chart or record of stress levels for periods of time which can be shared, for example, with physicians as part of a diagnostic exercise or treatment plan.
  • The method and apparatus of the present invention also provides for a device which can be utilized to manage stress and at the same time is portable enough so that it can be used at home, at work, or while traveling.
  • Although primarily intended for image presentation, adaptive autostereoscopic display system 10 is not limited to the use of images for creating a personalized preferred response profile and helping subject 12 to manage stress. As previously discussed, additional stimuli such as sound, smell, touch, or vibration, could also be used, alone or with images, as a basis for creating the personalized preferred response profile and helping subject 12 to manage stress.
  • The invention has been described in detail with particular reference to certain preferred embodiments thereof, but it will be understood that variations and modifications can be effected by a person of ordinary skill in the art without departing from the scope of the invention as described above and as noted in the appended claims. For example, adaptive autostereoscopic display system 10 could be used in other medical, therapeutic, or entertainment applications. Additional types of feedback sensors could be employed to provide any of the more sophisticated sensing functions known in the virtual reality presentation arts, such as head-tracking or gaze-tracking, for example.
  • Thus, what is provided is an autostereoscopic display system adapted for psychological health management and a method for using an autostereoscopic display system for conditioning the psychological state of a subject for biofeedback, stress management, behavior modification, entertainment, and similar applications.
  • PARTS LIST
    • 10 Adaptive autostereoscopic display system
    • 11 Detector device
    • 12 Subject
    • 13 Interconnect cable
    • 14 Viewing pupil
    • 14 l Left viewing pupil
    • 14 r Right viewing pupil
    • 16 Beamsplitter
    • 17 Control mechanism
    • 18 Autostereoscopic image delivery system
    • 20 Projection apparatus
    • 20 l Left-eye projection apparatus
    • 20 r Right-eye projection apparatus
    • 24 Curved mirror
    • 30 l Left ball lens assembly
    • 30 r Right ball lens assembly
    • 32 Adjustable chair
    • 36 l Left viewing pupil forming apparatus
    • 36 r Right viewing pupil forming apparatus
    • 40 Image generator
    • 42 Image source
    • 50 Control logic processor
    • 52 Observer feedback sensor
    • 54 Camera
    • 58 Housing
    • 60 Projection translation apparatus
    • 60 l Left-eye projection translation apparatus
    • 60 r Right-eye projection translation apparatus
    • 60 b Beamsplitter positioning apparatus
    • 60 m Mirror positioning apparatus
    • 62 Speaker system
    • 64 Speaker
    • 66 Chair servo mechanism
    • 70 Audio source
    • 80 Selector device
    • 81 Selector button
    • 82 Selector button
    • 83 Selector button
    • 84 Selector button
    • 85 Connector
    • 86 Display panel
    • 100 First set of images
    • 102 Second set of images
    • 104 Manual feedback control
    • 300 Initial step
    • 301A Step
    • 301B Step
    • 302 Step
    • 302A Step
    • 302B Step
    • 302B′ Step
    • 302C Step
    • 302C′ Step
    • 302D Step
    • 302E Step
    • 302F Step
    • 303 Step
    • 304 Step
    • 305 Step
    • 306 Step
    • 400 Step
    • 500 Step
    • 600 Step
    • 700 Step
    • 800 Step
    • 900 Step
    • 1000 Step
    • 1010 Step
    • 1020 Step
    • 2000 Step
    • 3000 Step
    • 4000 Step
    • 5000 Step
    • 6000 Step
    • 7000 Step
    • 8000 Step
    • 9000 Step
    • 9050 Step

Claims (10)

1. A method for conditioning the psychological state of a subject, the method comprising the steps of:
(a) displaying a first set of stereoscopic images;
(b) obtaining response data from the subject;
(c) creating a personalized image response profile based on said response data; and
(d) according to said personalized image response profile, displaying a second set of stereoscopic images for obtaining a predetermined response from the subject.
2. A method for conditioning the psychological state of a subject according to claim 1 wherein the step of displaying a first set of stereoscopic images comprises displaying autostereoscopic images.
3. A method for conditioning the psychological state of a subject according to claim 1 wherein said predetermined response is a stress reduction response.
4. A method for conditioning the psychological state of a subject according to claim 1 wherein said predetermined response is a behavioral response.
5. A method for conditioning the psychological state of a subject according to claim 1 wherein said predetermined response is for entertainment of the subject.
6. A method for conditioning the psychological state of a subject according to claim 1 wherein the step of displaying said first set of stereoscopic images further comprises the step of modifying, in response to feedback data about the subject, the spatial position of a viewing pupil for the subject, comprising:
(a) generating a command, conditioned by said feedback data, said command provided for obtaining an adjustment for the spatial position of said viewing pupil; and
(b) controlling a movement of a viewing pupil forming apparatus in response to said command, said movement achieving an optical adjustment of the spatial position of said viewing pupil.
7. A method for conditioning the psychological state of a subject according to claim 6 wherein the step of controlling said movement of said viewing pupil forming apparatus comprises the step of moving a ball lens.
8. A method for conditioning the psychological state of a subject according to claim 6 wherein the step of controlling said movement of said viewing pupil forming apparatus comprises the step of moving a curved mirror.
9. A method for conditioning the psychological state of a subject according to claim 6 wherein the step of controlling said movement of said viewing pupil forming apparatus comprises the step of moving a beamsplitter.
10. A method for conditioning the psychological state of a subject according to claim 6 further comprising the step of controlling a movement of a chair on which the subject is seated, said movement adjusting the position of said viewing pupil relative to an eye of the subject.
US11/042,304 2002-08-05 2005-01-24 System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display Abandoned US20050124851A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/042,304 US20050124851A1 (en) 2002-08-05 2005-01-24 System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/212,342 US6896655B2 (en) 2002-08-05 2002-08-05 System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display
US11/042,304 US20050124851A1 (en) 2002-08-05 2005-01-24 System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/212,342 Division US6896655B2 (en) 2002-08-05 2002-08-05 System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display

Publications (1)

Publication Number Publication Date
US20050124851A1 true US20050124851A1 (en) 2005-06-09

Family

ID=31187747

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/212,342 Expired - Fee Related US6896655B2 (en) 2002-08-05 2002-08-05 System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display
US11/042,304 Abandoned US20050124851A1 (en) 2002-08-05 2005-01-24 System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/212,342 Expired - Fee Related US6896655B2 (en) 2002-08-05 2002-08-05 System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display

Country Status (1)

Country Link
US (2) US6896655B2 (en)


Families Citing this family (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004213350A (en) * 2002-12-27 2004-07-29 Seiko Epson Corp Inner force sense presenting device and image correcting method
CA2419962C (en) * 2003-02-26 2013-08-06 Patrice Renaud Method and apparatus for providing an environment to a patient
AU2005214713A1 (en) * 2004-02-13 2005-09-01 Emory University Display enhanced testing for concussions and mild traumatic brain injury
JP4481682B2 (en) * 2004-02-25 2010-06-16 キヤノン株式会社 Information processing apparatus and control method thereof
US9205062B2 (en) * 2004-03-09 2015-12-08 Mylan Pharmaceuticals, Inc. Transdermal systems containing multilayer adhesive matrices to modify drug delivery
US20050240956A1 (en) * 2004-04-22 2005-10-27 Kurt Smith Method and apparatus for enhancing wellness
US20050250082A1 (en) * 2004-05-05 2005-11-10 Mark Baldwin Interpersonal cognition method and system
US20060024650A1 (en) * 2004-07-30 2006-02-02 Epstein Gerald N Kinesiology-based self-improvement process
US20060058701A1 (en) * 2004-09-13 2006-03-16 Sensory Learning Center International, Inc. Systems and methods for providing sensory input
FR2879323B1 (en) * 2004-12-13 2008-05-16 Sagem METHOD FOR SEARCHING INFORMATION IN A DATABASE
WO2007016241A2 (en) * 2005-07-28 2007-02-08 Abbas Rashidi Interactive systems and methods for mental stimulation
US20070106127A1 (en) * 2005-10-11 2007-05-10 Alman Brian M Automated patient monitoring and counseling system
US8602791B2 (en) 2005-11-04 2013-12-10 Eye Tracking, Inc. Generation of test stimuli in visual media
FR2892940B1 (en) * 2005-11-10 2021-04-09 Olivier Lordereau BIOMEDICAL DEVICE FOR TREATMENT BY VIRTUAL IMMERSION
US8323191B2 (en) * 2005-12-23 2012-12-04 Koninklijke Philips Electronics N.V. Stressor sensor and stress management system
GB0614458D0 (en) * 2006-07-20 2006-08-30 Clare Jon Computerised hypnosis therapy device and method
JP5432120B2 (en) * 2007-04-04 2014-03-05 コーニンクレッカ フィリップス エヌ ヴェ Stress level judgment based on human performance in games or puzzles
WO2009000741A1 (en) 2007-06-22 2008-12-31 Basf Se Molding materials comprising polyaryl ethers with improved surface quality
US20090216070A1 (en) * 2008-02-25 2009-08-27 Hunt Ronald D Apparatus and method of relaxation therapy
US20090269329A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Combination Therapeutic products and systems
CA2730404C (en) * 2008-07-10 2016-08-30 Claudia Zayfert Device, system, and method for treating psychiatric disorders
US20100021874A1 (en) * 2008-07-24 2010-01-28 John Milford Cunningham Inculcating Positive Altered Personal Behavioral Patterns
EP2349002B1 (en) * 2008-09-30 2021-05-05 Cognisens Inc. Method and device for assessing, training and improving perceptual-cognitive abilities of individuals
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US20150205111A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Optical configurations for head worn computing
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9298985B2 (en) 2011-05-16 2016-03-29 Wesley W. O. Krueger Physiological biosensor system and method for controlling a vehicle or powered equipment
WO2011143655A1 (en) * 2010-05-14 2011-11-17 Advitech, Inc. System and method for prevention and control of the effects of spatial disorientation
US9994228B2 (en) 2010-05-14 2018-06-12 Iarmourholdings, Inc. Systems and methods for controlling a vehicle or device in response to a measured human response to a provocative environment
US8939885B2 (en) 2010-09-17 2015-01-27 Penelope S. Martin System and method for eliciting a relaxation response
CN102068745B (en) * 2011-01-21 2012-11-21 杨杰 Four-dimensional brain biofeedback therapeutic instrument and operating control method thereof
EP2755547A4 (en) * 2011-09-16 2015-04-01 Annidis Corp System and method for assessing retinal functionality
KR102272067B1 (en) * 2012-02-22 2021-07-05 조세린 포베르 Perceptual-cognitive-motor learning system and method
US20130237867A1 (en) * 2012-03-07 2013-09-12 Neurosky, Inc. Modular user-exchangeable accessory for bio-signal controlled mechanism
JP6096880B2 (en) * 2012-03-27 2017-03-15 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Ambient stimulus selection
US9265416B2 (en) * 2013-03-11 2016-02-23 Children's Healthcare Of Atlanta, Inc. Systems and methods for detection of cognitive and developmental conditions
US9185291B1 (en) 2013-06-13 2015-11-10 Corephotonics Ltd. Dual aperture zoom digital camera
CN105873515B (en) 2013-10-17 2020-11-20 亚特兰大儿童医疗保健公司 Method for assessing infant and child development via eye tracking
US20150277118A1 (en) 2014-03-28 2015-10-01 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US20160019715A1 (en) 2014-07-15 2016-01-21 Osterhout Group, Inc. Content presentation in head worn computing
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US9651788B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9532714B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US20150241963A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an HMD
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
US20160239985A1 (en) 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
CN104888331B (en) * 2015-03-30 2019-01-15 徐志强 Training system for treating autism
WO2016189711A1 (en) * 2015-05-27 2016-12-01 糧三 齋藤 Stress evaluation program for mobile terminal and mobile terminal provided with program
US11003246B2 (en) 2015-07-22 2021-05-11 Mentor Acquisition One, Llc External user interface for head worn computing
US10139966B2 (en) 2015-07-22 2018-11-27 Osterhout Group, Inc. External user interface for head worn computing
WO2017143128A1 (en) * 2016-02-18 2017-08-24 Osterhout Group, Inc. Haptic systems for head-worn computers
US10466491B2 (en) 2016-06-01 2019-11-05 Mentor Acquisition One, Llc Modular systems for head-worn computers
US10824253B2 (en) 2016-05-09 2020-11-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10684478B2 (en) 2016-05-09 2020-06-16 Mentor Acquisition One, Llc User interface systems for head-worn computers
FR3062307B1 (en) * 2017-01-30 2022-01-21 Herve Harounian DEVICE AND METHOD FOR COGNITIVE STIMULATION OF A PATIENT
US10453172B2 (en) 2017-04-04 2019-10-22 International Business Machines Corporation Sparse-data generative model for pseudo-puppet memory recast
US10152141B1 (en) 2017-08-18 2018-12-11 Osterhout Group, Inc. Controller movement tracking with light emitters
CN110140076B (en) 2017-11-23 2021-05-21 Corephotonics Ltd. Compact folding camera structure
CN107970035A (en) * 2017-12-13 2018-05-01 上海青研科技有限公司 Mental health response system based on eye movement data
US11559355B2 (en) 2018-07-30 2023-01-24 Boston Scientific Neuromodulation Corporation Augmented and virtual reality for use with neuromodulation therapy
JP7335186B2 (en) * 2020-02-28 2023-08-29 富士フイルム株式会社 Image processing device, image processing method and program
US20220218942A1 (en) * 2021-01-11 2022-07-14 Kevin Jain Full-sensory guided-meditation system

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4751642A (en) * 1986-08-29 1988-06-14 Silva John M Interactive sports simulation system with physiological sensing and psychological conditioning
US5502481A (en) * 1992-11-16 1996-03-26 Reveo, Inc. Desktop-based projection display system for stereoscopic viewing of displayed imagery over a wide field of view
US5304112A (en) * 1991-10-16 1994-04-19 Theresia A. Mrklas Stress reduction system and method
US5913310A (en) * 1994-05-23 1999-06-22 Health Hero Network, Inc. Method for diagnosis and treatment of psychological and emotional disorders using a microprocessor-based video game
US5546943A (en) * 1994-12-09 1996-08-20 Gould; Duncan K. Stimulating a beneficial human response by using visualization of medical scan data to achieve psychoneuroimmunological virtual reality
US5947908A (en) * 1995-07-14 1999-09-07 Morris; Ritchi Color reactivity device and method
US6057846A (en) * 1995-07-14 2000-05-02 Sever, Jr.; Frank Virtual reality psychophysiological conditioning medium
US5742263A (en) * 1995-12-18 1998-04-21 Telxon Corporation Head tracking system for a head mounted display system
JP3771973B2 (en) * 1996-09-26 2006-05-10 オリンパス株式会社 3D image display device
US6425764B1 (en) * 1997-06-09 2002-07-30 Ralph J. Lamson Virtual reality immersion therapy for treating psychological, psychiatric, medical, educational and self-help problems
US6752498B2 (en) * 2001-05-14 2004-06-22 Eastman Kodak Company Adaptive autostereoscopic display system
US6511182B1 (en) * 2001-11-13 2003-01-28 Eastman Kodak Company Autostereoscopic optical apparatus using a scanned linear image source

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3855998A (en) * 1973-03-14 1974-12-24 Hidalgo A De Entertainment device
US5465729A (en) * 1992-03-13 1995-11-14 Mindscope Incorporated Method and apparatus for biofeedback
US5596994A (en) * 1993-08-30 1997-01-28 Bro; William L. Automated and interactive behavioral and medical guidance system
US5739955A (en) * 1994-08-10 1998-04-14 Virtuality (Ip) Limited Head mounted display optics
US5767400A (en) * 1995-07-10 1998-06-16 Doryokuro Kakunenryo Kaihatsu Jigyodan Hydraulic test system mounted with borehole television set for simultaneous observation in front and lateral directions
US6012926A (en) * 1996-03-27 2000-01-11 Emory University Virtual reality system for treating patients with anxiety disorders
US6149586A (en) * 1998-01-29 2000-11-21 Elkind; Jim System and method for diagnosing executive dysfunctions using virtual reality and computer simulation
US6102846A (en) * 1998-02-26 2000-08-15 Eastman Kodak Company System and method of managing a psychological state of an individual using images
US6394963B1 (en) * 2000-06-20 2002-05-28 Eastman Kodak Company Technique for diagnosing attention deficit disorder
US6416181B1 (en) * 2000-12-15 2002-07-09 Eastman Kodak Company Monocentric autostereoscopic optical apparatus and method

Cited By (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080171914A1 (en) * 2005-02-07 2008-07-17 Koninklijke Philips Electronics N.V. Device For Determining A Stress Level Of A Person And Providing Feedback On The Basis Of The Stress Level As Determined
US8684924B2 (en) * 2005-02-07 2014-04-01 Koninklijke Philips N.V. Device for determining a stress level of a person and providing feedback on the basis of the stress level as determined
US20100066817A1 (en) * 2007-02-25 2010-03-18 Humaneyes Technologies Ltd. method and a system for calibrating and/or visualizing a multi image display and for reducing ghosting artifacts
US8520060B2 (en) * 2007-02-25 2013-08-27 Humaneyes Technologies Ltd. Method and a system for calibrating and/or visualizing a multi image display and for reducing ghosting artifacts
US20080253695A1 (en) * 2007-04-10 2008-10-16 Sony Corporation Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program
US8687925B2 (en) * 2007-04-10 2014-04-01 Sony Corporation Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program
US9754078B2 (en) * 2007-06-21 2017-09-05 Immersion Corporation Haptic health feedback monitoring
US20080319279A1 (en) * 2007-06-21 2008-12-25 Immersion Corporation Haptic Health Feedback Monitoring
US9035968B2 (en) 2007-07-23 2015-05-19 Humaneyes Technologies Ltd. Multi view displays and methods for producing the same
US9972116B2 (en) 2007-08-06 2018-05-15 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US10262449B2 (en) 2007-08-06 2019-04-16 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US9568998B2 (en) 2007-08-06 2017-02-14 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US8797331B2 (en) 2007-08-06 2014-08-05 Sony Corporation Information processing apparatus, system, and method thereof
US10529114B2 (en) 2007-08-06 2020-01-07 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US10937221B2 (en) 2007-08-06 2021-03-02 Sony Corporation Information processing apparatus, system, and method for displaying bio-information or kinetic information
US20090040231A1 (en) * 2007-08-06 2009-02-12 Sony Corporation Information processing apparatus, system, and method thereof
US20100017001A1 (en) * 2008-04-24 2010-01-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational system and method for memory modification
US9026369B2 (en) 2008-04-24 2015-05-05 The Invention Science Fund I, Llc Methods and systems for presenting a combination treatment
US20090292676A1 (en) * 2008-04-24 2009-11-26 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Combination treatment selection methods and systems
US20090312595A1 (en) * 2008-04-24 2009-12-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware System and method for memory modification
US20090312668A1 (en) * 2008-04-24 2009-12-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational system and method for memory modification
US20090319301A1 (en) * 2008-04-24 2009-12-24 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for presenting a combination treatment
US20100004762A1 (en) * 2008-04-24 2010-01-07 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational system and method for memory modification
US20090271215A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for detecting a bioactive agent effect
US20100015583A1 (en) * 2008-04-24 2010-01-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational System and method for memory modification
US20100022820A1 (en) * 2008-04-24 2010-01-28 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational system and method for memory modification
US20100030089A1 (en) * 2008-04-24 2010-02-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring and modifying a combination treatment
US20090271008A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Combination treatment modification methods and systems
US20100042578A1 (en) * 2008-04-24 2010-02-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational system and method for memory modification
US20100041958A1 (en) * 2008-04-24 2010-02-18 Searete Llc Computational system and method for memory modification
US20100041964A1 (en) * 2008-04-24 2010-02-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring and modifying a combination treatment
US20100063368A1 (en) * 2008-04-24 2010-03-11 Searete Llc, A Limited Liability Corporation Computational system and method for memory modification
US20090271375A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Combination treatment selection methods and systems
US20100069724A1 (en) * 2008-04-24 2010-03-18 Searete Llc Computational system and method for memory modification
US20100076249A1 (en) * 2008-04-24 2010-03-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational system and method for memory modification
US20100081860A1 (en) * 2008-04-24 2010-04-01 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational System and Method for Memory Modification
US20100081861A1 (en) * 2008-04-24 2010-04-01 Searete Llc Computational System and Method for Memory Modification
US20100100036A1 (en) * 2008-04-24 2010-04-22 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational System and Method for Memory Modification
US20100125561A1 (en) * 2008-04-24 2010-05-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Computational system and method for memory modification
US20100130811A1 (en) * 2008-04-24 2010-05-27 Searete Llc Computational system and method for memory modification
US20100280332A1 (en) * 2008-04-24 2010-11-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring bioactive agent use
US10786626B2 (en) 2008-04-24 2020-09-29 The Invention Science Fund I, Llc Methods and systems for modifying bioactive agent use
US10572629B2 (en) 2008-04-24 2020-02-25 The Invention Science Fund I, Llc Combination treatment selection methods and systems
US20090267758A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Systems and apparatus for measuring a bioactive agent effect
US20090271219A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for presenting a combination treatment
US20090271217A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Side effect ameliorating combination therapeutic products and systems
US20090271213A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Combination treatment selection methods and systems
US20090271011A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring bioactive agent use
US20090270688A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for presenting a combination treatment
US20090271122A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring and modifying a combination treatment
US20090271347A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring bioactive agent use
US8876688B2 (en) 2008-04-24 2014-11-04 The Invention Science Fund I, Llc Combination treatment modification methods and systems
US8930208B2 (en) 2008-04-24 2015-01-06 The Invention Science Fund I, Llc Methods and systems for detecting a bioactive agent effect
US9662391B2 (en) 2008-04-24 2017-05-30 The Invention Science Fund I Llc Side effect ameliorating combination therapeutic products and systems
US20090270693A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for modifying bioactive agent use
US20090271009A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Combination treatment modification methods and systems
US9064036B2 (en) 2008-04-24 2015-06-23 The Invention Science Fund I, Llc Methods and systems for monitoring bioactive agent use
US9649469B2 (en) 2008-04-24 2017-05-16 The Invention Science Fund I Llc Methods and systems for presenting a combination treatment
US9239906B2 (en) 2008-04-24 2016-01-19 The Invention Science Fund I, Llc Combination treatment selection methods and systems
US20090270694A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring and modifying a combination treatment
US9560967B2 (en) 2008-04-24 2017-02-07 The Invention Science Fund I Llc Systems and apparatus for measuring a bioactive agent effect
US9282927B2 (en) 2008-04-24 2016-03-15 Invention Science Fund I, Llc Methods and systems for modifying bioactive agent use
US9358361B2 (en) 2008-04-24 2016-06-07 The Invention Science Fund I, Llc Methods and systems for presenting a combination treatment
US9504788B2 (en) 2008-04-24 2016-11-29 Searete Llc Methods and systems for modifying bioactive agent use
US9449150B2 (en) 2008-04-24 2016-09-20 The Invention Science Fund I, Llc Combination treatment selection methods and systems
US20100030013A1 (en) * 2008-08-04 2010-02-04 Henry Brunelle Wall integrated multisensory therapy device
US8070669B2 (en) * 2008-08-04 2011-12-06 Gestion Ultra International Inc. Wall integrated multisensory therapy device
US20120063654A1 (en) * 2008-10-30 2012-03-15 Korea University Industrial & Academic Collaborative Foundation Computer system and computer-readable storage medium for art therapy
US8942443B2 (en) * 2008-10-30 2015-01-27 Korea University Industrial & Academic Collaboration Foundation Computer system and computer-readable storage medium for art therapy
US11462333B1 (en) * 2009-05-11 2022-10-04 Bruce A. Lev Telemedical apparatus, system, and method for providing medical services remotely
US20190355457A1 (en) * 2009-05-11 2019-11-21 Bruce A. Lev Telemedical apparatus, system, & method for providing medical services remotely
AU2010300771B2 (en) * 2009-09-29 2016-03-10 William Fabian System and method for applied kinesiology feedback
US8323216B2 (en) 2009-09-29 2012-12-04 William Fabian System and method for applied kinesiology feedback
WO2011041360A1 (en) * 2009-09-29 2011-04-07 William Fabian System and method for applied kinesiology feedback
US20110077546A1 (en) * 2009-09-29 2011-03-31 William Fabian System and Method for Applied Kinesiology Feedback
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US20140303450A1 (en) * 2013-04-03 2014-10-09 Dylan Caponi System and method for stimulus optimization through closed loop iterative biological sensor feedback
US9668693B2 (en) 2014-06-19 2017-06-06 Biofeedback Systems Design, LLC Method for improving psychophysiological function for performance under stress
US9138558B1 (en) * 2014-06-19 2015-09-22 Biofeedback Systems Design, LLC Apparatus and method for improving psychophysiological function for performance under stress
US9402581B2 (en) 2014-06-19 2016-08-02 Biofeedback Systems Design, LLC Apparatus and method for improving psychophysiological function for performance under stress
US10758180B2 (en) 2014-06-19 2020-09-01 Biofeedback Systems Design, LLC Method for improving psychophysiological function for performance under stress
US11647948B2 (en) 2014-06-19 2023-05-16 Optivio, Inc. System for training a subject to improve psychophysiological function for performance under stress
US11723579B2 (en) 2017-09-19 2023-08-15 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement
US11717686B2 (en) 2017-12-04 2023-08-08 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to facilitate learning and performance
US11273283B2 (en) 2017-12-31 2022-03-15 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to enhance emotional response
US11478603B2 (en) 2017-12-31 2022-10-25 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to enhance emotional response
US11318277B2 (en) 2017-12-31 2022-05-03 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to enhance emotional response
US11364361B2 (en) 2018-04-20 2022-06-21 Neuroenhancement Lab, LLC System and method for inducing sleep by transplanting mental states
US11452839B2 (en) 2018-09-14 2022-09-27 Neuroenhancement Lab, LLC System and method of improving sleep
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep
CN110613879A (en) * 2019-08-20 2019-12-27 Hunan University of Arts and Science Computer synchronous learning and memory analysis system and method

Also Published As

Publication number Publication date
US20040024287A1 (en) 2004-02-05
US6896655B2 (en) 2005-05-24

Similar Documents

Publication Publication Date Title
US6896655B2 (en) System and method for conditioning the psychological state of a subject using an adaptive autostereoscopic display
US11615600B1 (en) XR health platform, system and method
US6102846A (en) System and method of managing a psychological state of an individual using images
KR102450362B1 (en) Augmented Reality Systems and Methods for User Health Analysis
US10083631B2 (en) System, method and computer program for training for ophthalmic examinations
US8531354B2 (en) Image generation system
US11000669B2 (en) Method of virtual reality system and implementing such method
KR20190019180A (en) Augmented reality display system for evaluation and correction of neurological conditions including visual processing and perceptual states
JP2005524432A (en) Eyepiece display device and method for evaluation, measurement and treatment of eye diseases
MXPA04011319A (en) Interactive occlusion system.
US10043415B2 (en) System, method and computer program for training for medical examinations involving manipulation of medical tools
US20140134587A1 (en) System, method and computer program for training for medical examinations involving body parts with concealed anatomy
US6364485B1 (en) Methods and systems for relieving eye strain
Ghahramani Computation and psychophysics of sensorimotor integration
EP2482935B1 (en) System for supporting a user to do exercises
Neugebauer et al. Influence of open-source virtual-reality based gaze training on navigation performance in Retinitis pigmentosa patients in a crossover randomized controlled trial
Schubert et al. Size matters: How reaching and vergence movements are influenced by the familiar size of stereoscopically presented objects
TWI767768B (en) Virtual reality device that automatically regulates media contents according to physiological conditions
US11961197B1 (en) XR health platform, system and method
US20230320640A1 (en) Bidirectional sightline-position determination device, bidirectional sightline-position determination method, and training method
JP6713526B1 (en) Improvement of VDT syndrome and fibromyalgia
EP4325517A1 (en) Methods and devices in performing a vision testing procedure on a person
Pugnetti et al. Immersive Virtual Reality To Assist Retraining of Acquired Cognitive Deficits: First Results with a Dedicated System
Neugebauer et al. Simulating Vision Impairment in Virtual Reality--A Comparison of Visual Task Performance with Real and Simulated Tunnel Vision
Malpica Mallo Visual and multimodal perception in immersive environments

Legal Events

Date Code Title Description
AS Assignment

Owner name: CREDIT SUISSE, CAYMAN ISLANDS BRANCH, AS ADMINISTRATIVE AGENT

Free format text: FIRST LIEN OF INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:CARESTREAM HEALTH, INC.;REEL/FRAME:019649/0454

Effective date: 20070430

Owner name: CREDIT SUISSE, CAYMAN ISLANDS BRANCH, AS ADMINISTRATIVE AGENT

Free format text: SECOND LIEN INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:CARESTREAM HEALTH, INC.;REEL/FRAME:019773/0319

Effective date: 20070430

AS Assignment

Owner name: CARESTREAM HEALTH, INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EASTMAN KODAK COMPANY;REEL/FRAME:020741/0126

Effective date: 20070501

Owner name: CARESTREAM HEALTH, INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EASTMAN KODAK COMPANY;REEL/FRAME:020756/0500

Effective date: 20070501

Owner name: CARESTREAM HEALTH, INC.,NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EASTMAN KODAK COMPANY;REEL/FRAME:020741/0126

Effective date: 20070501

Owner name: CARESTREAM HEALTH, INC.,NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EASTMAN KODAK COMPANY;REEL/FRAME:020756/0500

Effective date: 20070501

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE

AS Assignment

Owner name: CARESTREAM HEALTH, INC., NEW YORK

Free format text: RELEASE OF SECURITY INTEREST IN INTELLECTUAL PROPERTY (FIRST LIEN);ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:026069/0012

Effective date: 20110225