CA2068452C - Virtual graphics display capable of presenting icons and windows to the blind computer user - Google Patents
Virtual graphics display capable of presenting icons and windows to the blind computer user
- Publication number
- CA2068452C (application CA2068452A)
- Authority
- CA
- Canada
- Prior art keywords
- computer
- user
- mouse
- information
- tactile feedback
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/007—Teaching or communicating with blind persons using both tactile and audible presentation of the information
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/014—Force feedback applied to GUI
Abstract
The disclosure teaches a computer mouse having tactile feedback to be used with audio computer output to provide a virtual graphic display to blind computer users. The mouse and audio feedback allow blind computer users to visualize computer graphic images and multiple screen windows in much the same way as these forms of computer output are visualized by persons with sight.
Description
TITLE
VIRTUAL GRAPHICS DISPLAY CAPABLE OF PRESENTING
ICONS AND WINDOWS TO THE BLIND COMPUTER USER
BACKGROUND OF THE INVENTION:
Field of the Invention:
This invention relates generally to providing computer output to blind computer users and more particularly to providing means and method for allowing blind computer users to visualize computer graphic images and multiple screen windows in much the same way as these forms of computer output are visualized by persons with sight.
Background Prior Art:
A number of electronically assisted or computerized reading systems have been proposed for use by the blind. Examples include German Patent Publication DE3901023 and US Patent 4,687,444. The German publication teaches an optical scanner 11 mounted in a hand held device having a braille output matrix 5. The device is scanned across printed text which is "read" and "recognized" by a computer 3 and converted to braille for output to the user. No provision is made for handling non-text graphics and icons.
The US patent teaches a tape "reader" which converts text encoded onto a magnetic tape into a braille output matrix 24.
The tape is read into a buffer. The buffer is then read out to a braille matrix. The tape is commonly encoded in a braille code and can be amplified to drive the matrix actuators directly. Reading of the buffer is controlled by moving a mouse similar to those used with computers, but there is only a horizontal distance output to control buffer addressing.
A device, called the Opticon, from Telesensory Company in Mountain View, California, has the capability of translating the area surrounding the display cursor into a matrix of pins such that contrasting areas of the display will map to vibration of certain of the pins. The user places a finger of one hand on the pin matrix and the other hand holds and moves a mouse that causes the cursor to move about the display.
The original concept for the Opticon used a video camera that the user would scan across a paper document and then evolved through camera lens changes to an ability to physically scan across a computer display. A further evolution was the substitution of a mouse-driven cursor for the handheld camera. The pin matrix, however, remained a separate device that is touched by a finger of the other hand.
The computer workstation for sighted users has evolved from a simple character-based display to displays of graphics, windows, and object oriented programming. This evolution continues with multimedia and mixed media screen presentations, voice annotations, and simulated perspective.
Likewise, the user interface to this graphical display has shifted from the keyboard towards the "mouse". The mouse allows the user to easily move from one area of the display to another and, by single and double clicking on its button keys, to manipulate the objects upon which the mouse-driven cursor lies.
An example of such an object manipulation is "drag and drop"
whereby an object that represents a document, such as a musical composition, can be captured and dragged to an object which represents a printer, and the document is printed; or dragged to a representation of a phonograph, and the document is transposed to an audio rendition of the music.
A paper document is a fixation of information to a substrate. That is, the text and/or graphics do not move once placed upon the paper. That is conditionally true also for a character-based computer display. A document, though scrollable, is typically positionally stable upon the screen.
Thus it is reasonable to accept the learned ability of a blind user to be able to read text characters by feeling their impressions as simulated by a tactile readout device.
As described above with respect to a sighted user, windows appear, move, expand, and contract. Likewise, menus pull down, and scroll bars and radio buttons wait to be actuated. Icons represent objects to be manipulated.
Objects, by dragging and dropping, interact with other objects and things happen -- like printing, speech articulation, new object formation, etc.
The blind user requires a more sophisticated approach to interaction with this new technology. The compensatory enhancement of the sense of touch is not alone sufficient for spatial orientation when the information presented is free flowing, non-restrained, and particularly non-character representations.
SUMMARY OF THE INVENTION
In order to appreciate the environment that is created for a blind person by use of the apparatus and method of the invention, the reader of this application is requested to imagine being in a zoo among the freely roaming animals.
The person can walk about touching and feeling the shapes of the animals, who are also moving around. Upon request an animal will speak, giving its name, how it differs from the other animals, and what it can do. This is the new era of computers within which a visually handicapped person must exist and thrive.
Accordingly, it is an advantageous effect of this invention that an image of a computer display screen is created in the mind of a blind user by tactile feedback to the thumb and/or finger of the user's hand that holds a mouse-like input/output device.
It is a further advantageous effect that a blind person can, using the invention, communicate in an object oriented graphic environment. A still further effect is that a blind user can locate, identify, verify, change, and/or otherwise manipulate information and/or objects contained within an object oriented scheme as exemplified by a graphical user interface such as Presentation Manager residing on an operating system such as OS/2, or Windows residing on DOS.
These and other advantages are obtained according to the invention by combining a computer, voice response and a mouse having touch feedback with the method of the invention. The combination of the invention operates most efficiently in an object oriented graphical environment.
The method of the invention determines that an object has been located by providing feedback to the mouse when a boundary is encountered. A contrast point on the display causes a feedback sensation in the hand moving the mouse.
To allow for rapid and circuitous movement of the mouse, it is necessary for the user feedback to be instantaneous. For this reason, feedback in the form of a mild electrical impulse, a physical vibration, or other fast response manifestation is a preferred embodiment.
Clicking of a button will cause a voiced annotation of the encounter. If the object is a cell in a spread sheet, the response will be the numerical value, the text contained within that cell, or other information such as cell location/address. If the object is a note or chord in a musical composition, a representation of that sound will be heard. Similarly, if an icon object is encountered, the icon's property and use will be articulated or, depending upon the program and the sequence of button depressions, it will be expanded so it can be more easily traced by the user with the mouse.
Clicking of a button on the mouse can cause any object encountered to be expanded so as to allow the user to explore the object by tracing. The mouse will continue to react so long as the cursor follows the path, like walking along a road by feeling the difference in the surface contrast along the edge.
BRIEF DESCRIPTION OF THE DRAWINGS:
Figure 1 shows the overall system in which the invention finds utility.
Figure 2 shows an embodiment of the mouse incorporating the invention.
Figure 3 shows an electrical tactile feedback element.
Figure 4 shows a vibratory tactile feedback element.
Figure 5 shows a feedback driver circuit.
Figure 6 shows a flow diagram of the method of operation of the invention.
Figure 7 is a representation of a display of a plurality of icons.
Figure 8 is a representation of a display of the screen of Figure 7 where one of the icons has been transformed into a menu window.
Figure 9 is a representation of a display of the screen of Figure 8 where one of the icons in the menu window has been transformed into an application window.
Figure 10 is the application window of Figure 9 after expansion to full screen display showing the operation of modifying a spread-sheet.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT:
Figure 1 shows the overall system of the preferred embodiment which includes a computer 11 connected to the mouse 17 having tactile feedback, which is shown in more detail in Figure 2. The system also has a keyboard 13 for alphanumeric input and a speaker 15 for beep, tone, and computer simulated speech output. Although not connected physically by wires to the computer, a virtual display 19 is shown in broken lines to portray the virtual display that is created in the mind of the user by means of the tactile and audio feedback that is provided by the computer.
Shown in the center of the virtual display is a set-point 31.
The computer has programmed means for re-positioning the mouse cursor to the preset set-point position 31 within the virtual display presented to the user by the tactile feedback in the mouse and the audio feedback. Dead-center is a preferable position for the set-point, as then the user has a shorter distance to traverse to get to any quadrant of the display. This is necessary when the user has lost spatial orientation and needs to re-discover the positions of objects on the virtual presentation. This is especially desirable when windows are re-sized or one window partially overlays another.
Also, the movement of addressing in the presentation buffer is not consistently proportional to mouse movement. This is due to changes in the coefficient of friction between the rolling mouse ball and the desktop surface, acceleration and damping effects from the hand control, and various similar factors. Though not desirable in normal mouse applications, it is tolerable since there is a constant visual feedback of the cursor location on the display screen. The blind user must have a reliable method for ad-hoc re-establishment of buffer addressing which is analogous to visual cursor location when the buffer is also connected to drive a visual display.
Figure 1 also shows the programs that implement the method of the invention. Included are the application programs 21, which are preferably written as object oriented programs using a language such as Smalltalk or C++. The graphical interface shell 23 is a graphical user interface such as Presentation Manager used with IBM OS/2, or Windows used with Microsoft or IBM DOS. The input and output to and from the computer is effected by the I/O driver programs 27, 28, and 29. They control the speaker, the keyboard, and the mouse respectively. The mouse of course has both input and output functions implemented in its driver program.
Although not shown in Figure 1, a visual display may also be connected to the computer to allow sighted persons to follow the activity of a blind user or to allow a user with sight to interact with the computer application using all the interfaces of sight, sound, and feel.
When creating application programs incorporating the invention, a language within the family of Object Oriented Programming Systems (OOPS) is the preferred language. Two examples of such languages are Smalltalk and C++. These languages utilize objects and object-to-object communication and interaction. An object is an encapsulation of related pieces of program code (referred to as methods) and data.
Objects have attributes and exhibit behaviors. An object can be considered as a self-contained computer program that is operating on its own data. Examples of objects are:
windows, icons, menus, spreadsheets, documents, keyboards, and printers.
The application programs, operating system, device drivers, objects, etc. have sound annotation. Spreadsheets talk, compositions play, documents read, and objects describe themselves. Sound annotation is added to the programs through the use of a device such as the IBM Audio Capture and Playback Adaptor. This is a plug-in card for the IBM PS/2 computer. Also needed is the IBM Audio Visual Connection software program product. These items allow the user and programmer to capture, compress, and store voice and audio information in the computer memory or the DASD storage device.
Inheritance is a property that is exploited in OOPS languages. Inheritance allows the programmer to create an object by calling a similar object and defining only the differences (additions and subtractions) needed to identify the new object. A data-cell in a spreadsheet, for example, will be given voice articulation of its contents through the use of the inheritance attribute.
Another unique characteristic of OOPS is polymorphism, whereby objects respond with different behavior dependent upon their interaction with other objects. For example, an object such as a pull-down menu can respond in a different mode to a blind user's mouse than it would to a normal mouse. The click from the former could cause the menu list to be read out, whereas a click from the latter would cause it to visually scroll downward.
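To make the inheritance and polymorphism just described concrete, the following is a minimal C++ sketch. The class and method names (ScreenObject, DataCell, PullDownMenu, onTactileClick) are illustrative assumptions, not identifiers from the patent or from Presentation Manager.

```cpp
// Illustrative sketch only: a spreadsheet cell inherits voice articulation,
// and a pull-down menu behaves differently for the tactile (blind user's)
// mouse than for a normal mouse. All names are hypothetical.
#include <iostream>
#include <memory>
#include <string>
#include <utility>
#include <vector>

class ScreenObject {
public:
    virtual ~ScreenObject() = default;
    virtual void onClick() { std::cout << "visual action\n"; }      // normal mouse
    virtual void onTactileClick() { speak("unnamed object"); }      // blind user's mouse
protected:
    void speak(const std::string& text) { std::cout << "[voice] " << text << "\n"; }
};

// Inheritance: only the difference (voiced contents) is defined.
class DataCell : public ScreenObject {
public:
    explicit DataCell(std::string contents) : contents_(std::move(contents)) {}
    void onTactileClick() override { speak(contents_); }
private:
    std::string contents_;
};

// Polymorphism: the same click scrolls visually or is read aloud.
class PullDownMenu : public ScreenObject {
public:
    explicit PullDownMenu(std::vector<std::string> items) : items_(std::move(items)) {}
    void onClick() override { std::cout << "menu scrolls down visually\n"; }
    void onTactileClick() override {
        for (const auto& item : items_) speak(item);
    }
private:
    std::vector<std::string> items_;
};

int main() {
    std::vector<std::unique_ptr<ScreenObject>> desktop;
    desktop.push_back(std::make_unique<DataCell>("154.20"));
    desktop.push_back(std::make_unique<PullDownMenu>(
        std::vector<std::string>{"Open", "Print", "Close"}));
    for (const auto& obj : desktop) obj->onTactileClick();  // tactile mouse click
}
```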
For those application programs which do not contain voice annotations, there are many text readout auxiliary programs available. These programs attempt to read words or will spell out the word by character. One example of this type of program is the Screen Reader by IBM.
These auxiliary programs, when executed in a computer of the invention, will cause tactile or touch feedback as objects are located. An object, such as the title block on a window, will be readable through text-to-voice conversion by the auxiliary program, but icons and graphical representations will not.
Referring now to Figure 2, a perspective view of a mouse 17 incorporating a tactile feedback area 33 according to the invention is shown. The preferred form of feedback to the user is a very mild AC signal. This AC signal is adjustable in both voltage and current so as to give a mild tingling sensation at the fingertip holding the mouse. The sensation is similar to the touching of an electrical appliance having a small leakage current that is seeking a ground return through the person's body.
The prior art includes many physical mouse designs and mouse to computer interfaces. There is also a dependency of the graphical interface layer to establish the effect upon the computer when the button(s) on the mouse is actuated. Thus, for the purpose of this invention, I have chosen to utilize the IBM mouse and the IBM Presentation Manager graphical interface residing on the OS/2 operating system.
For this configuration, the mouse has two buttons labeled 16 and 18 as indicated in Figure 2. The left button 16 is the principal usage control in the prior art IBM Presentation Manager graphical interface. A single left button click on an icon presents a menu list, and a double click produces the default selection on that list, which is, typically, to produce the window which the icon represents.
A left button click-and-hold locks the icon to the cursor and allows a user with sight to reposition the icon elsewhere on a display.
Other uses for the single and double clicking reside within the window after it has been established as a transform from the icon. For example, there are two triangles at the upper right corner of the window. Clicking on the leftmost triangle causes the window to transform back to the icon while clicking on the rightmost triangle causes the window to expand to full screen in size.
The right button 18 on the mouse has little application when used by persons with sight, except to produce a task-list upon a single click. The task list is a listing of the icons (tasks) available for the use of the user.
In the IBM Presentation Manager graphical interface residing on the OS/2 operating system, there is the ability to reconfigure the operation and sequencing of the mouse buttons by changing the mouse configuration program. This is accomplished by calling the mouse window by clicking on the mouse icon.
I choose to use the left mouse button 16 as indicated above.
I also choose to define and reconfigure the right mouse button 18 as follows (a configuration sketch is given after this list):
Single click: produce voice annotation (if available).
Second (or double) click: stop voice annotation.
Double click and hold: go to the set-point position. Release: stay at the set-point.
Single click and hold: expand the area surrounding the cursor; then single click on the left button. Release: return to the normal screen presentation.
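As a hedged illustration only, the reconfiguration above can be captured as a simple gesture-to-action table. The gesture recognition and the action routines are assumed to live elsewhere (for example in the mouse driver 29) and are not described in the patent; only the mapping is sketched.

```cpp
// Illustrative table of the right-button reconfiguration listed above.
// Enum names and the dispatch helper are assumptions for this sketch.
#include <map>

enum class Gesture { SingleClick, DoubleClick, DoubleClickHold, SingleClickHold, Release };
enum class Action  { PlayAnnotation, StopAnnotation, GoToSetPoint, ExpandAroundCursor, NoOp };

const std::map<Gesture, Action> kRightButtonActions = {
    {Gesture::SingleClick,     Action::PlayAnnotation},      // if an annotation is available
    {Gesture::DoubleClick,     Action::StopAnnotation},
    {Gesture::DoubleClickHold, Action::GoToSetPoint},        // release: stay at the set-point
    {Gesture::SingleClickHold, Action::ExpandAroundCursor},  // left click then release restores
};

Action rightButtonAction(Gesture g) {
    auto it = kRightButtonActions.find(g);
    return it == kRightButtonActions.end() ? Action::NoOp : it->second;
}
```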
The task list is available, as are all mouse button operations, as a multi-key actuation on the keyboard and, of course, the user has the ability to re-configure the operation of the buttons as he desires, as indicated above.
Now with reference to Figure 3, a conductive area 33 is shown in which a single finger will be in contact with the different voltage potentials of the tactile electrical output of the mouse 17. The conductive area 33 comprises a group of concentric circles separated by insulating space.
Circles 35 and 39 are electrically connected to terminal A
and circle 37 and center circle 41 are connected to terminal B. A finger placed onto area 33 will be able to sense the current and voltage between terminals A and B as tactile feedback from the computer.
Figure 4 shows an alternate embodiment of the tactile feedback transducer as a vibrator or tone source which will be made to vary in intensity and/or frequency as the mouse 17 is moved to present different parts of the buffer information to the user. The transducer of Figure 4 is a loudspeaker like device having a voice coil 47 and a magnet 45. The voice coil 47 is connected to a cone like member that is thereby made to vibrate creating a tactile sensation as well as producing a separate audible sensation when operated at audio frequencies. The voice coil 47 has one end connected to terminal A and the other end connected to terminal B. The mechanism of Figure 4 is similar to voice coil mechanisms which are used to drive the read/write head for movement across the rotating disks or diskettes in computer DASD storage devices.
Figure 5 shows the essential components required to furnish an AC tactile feedback signal from a low DC voltage available from the computer to which the mouse is attached, or from a battery if the mouse has a wireless connection to the computer. The DC voltage source 51 is applied to a switching circuit 53 which changes it to a sequence of pulsations under control of the feedback signal from the computer. The frequency of the pulsations is controlled by the feedback signal. The output of the switching circuit 53 is applied to the primary 55 of a transformer. The ratio of the turns in the primary winding 55 to the secondary winding 57 of the transformer determines the magnitude of the voltage available at the secondary. Taps 59, 60, and 61 on the secondary allow the magnitude of the voltage to be tailored to the user. Likewise, the current limiting resistors 63 and 65 in series with the secondary voltage allow the maximum current level to be adjusted. The resultant voltage is applied across terminals A and B to drive either the electrical transducer of Figure 3 or the vibratory transducer of Figure 4.
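The voltage and current scaling of this driver stage amounts to simple arithmetic. The short sketch below only illustrates that arithmetic; every component value in it is an assumption chosen for the example, since the patent specifies none.

```cpp
// Illustrative arithmetic for the Figure 5 driver stage: the secondary
// voltage follows the transformer turns ratio and the tap selected at
// 59/60/61, and the series resistors 63 and 65 bound the maximum current.
// All numeric values below are assumptions, not values from the patent.
#include <cstdio>

int main() {
    const double sourceVolts  = 5.0;      // low DC source 51, chopped by switching circuit 53
    const double turnsRatio   = 4.0;      // secondary turns 57 / primary turns 55
    const double tapFraction  = 0.5;      // fraction of the secondary selected by the tap
    const double seriesOhms   = 100000.0; // combined current-limiting resistance 63 + 65

    const double outputVolts  = sourceVolts * turnsRatio * tapFraction;
    const double maxCurrentA  = outputVolts / seriesOhms;   // worst case: terminals A-B shorted

    std::printf("tap output: %.1f V, current limited to %.0f microamps\n",
                outputVolts, maxCurrentA * 1e6);
    return 0;
}
```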
Referring now to Figure 6, a flow diagram of the method of operation of the invention is set out.
The method is initiated at block 71 where a set-point input is received from either the graphical interface shell program or a button click as described hereinafter with respect to block 87. The set-point input causes the program of the invention to set the presentation buffer 26 address registers to the values that define the logical address of the set-point 31. Thereafter the information at the addressed location of the buffer 26 is read out at block 75 and used to obtain the presentation information pertinent to the presentation being presented at the corresponding x and y coordinates of the virtual display. The presentation information may be thought of as virtual picture elements that will be presented to the user by tactile feedback to the mouse.
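A minimal sketch of blocks 71 and 75 follows, assuming the presentation buffer 26 is a simple rectangular array of virtual picture elements; the structure and member names are illustrative, not taken from the patent.

```cpp
// Sketch of blocks 71 and 75: the address registers are set to the
// set-point 31 (assumed here to be dead-center) and the buffer is read at
// that logical address. Layout and names are assumptions for illustration.
#include <cstdint>
#include <vector>

struct Pixel { std::uint8_t color = 0; bool windowBoundary = false; };  // virtual picture element

class PresentationBuffer {
public:
    PresentationBuffer(int width, int height)
        : width_(width), height_(height), cells_(width * height) {}

    void goToSetPoint() { x_ = width_ / 2; y_ = height_ / 2; }        // block 71

    const Pixel& read() const { return cells_[y_ * width_ + x_]; }    // block 75

    int x_ = 0, y_ = 0;   // logical address registers

private:
    int width_, height_;
    std::vector<Pixel> cells_;
};
```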
If the program of the invention determines at block 77 that the information indicates that there is something other than background color, then a tactile feedback signal is generated. The signal defines a frequency indicative of the color of the information being presented. For example, the color red is a lower frequency and blue is a higher frequency.
This signal is then sent to the mouse 17 where it is applied to the feedback input 52 of the circuits shown in Figure 5 to actuate the transducer of Figure 3 or Figure 4 at the defined frequency. If the program of the invention determines at block 79 that the presentation information pertinent to the x and y coordinates indicates that a window boundary exists at these coordinates, then an audio beep is generated in response to the programmed determination.
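Blocks 77 and 79 can be sketched as follows. The 50 to 300 Hz span, the color scale, and the stub output routines are assumptions added for illustration; they stand in for the mouse driver 29 and the speaker 15 rather than reproducing them.

```cpp
// Sketch of blocks 77 and 79. Frequency range and color encoding are
// assumptions; the print statements are stand-ins for the real drivers.
#include <cstdint>
#include <cstdio>

void driveMouseTransducer(double hz) { std::printf("tactile %.0f Hz\n", hz); }  // stand-in for driver 29
void playBeep()                      { std::printf("beep\n"); }                 // stand-in for speaker 15

// Block 77: anything other than background color yields a tactile frequency,
// lower toward the red end and higher toward the blue end of the scale.
double tactileFrequencyHz(std::uint8_t color, std::uint8_t background) {
    if (color == background) return 0.0;            // background: no sensation
    return 50.0 + (color / 255.0) * 250.0;          // assumed 50-300 Hz span
}

void presentElement(std::uint8_t color, bool windowBoundary, std::uint8_t background) {
    double hz = tactileFrequencyHz(color, background);
    if (hz > 0.0) driveMouseTransducer(hz);         // applied to feedback input 52, Figure 5
    if (windowBoundary) playBeep();                 // block 79: beep at a window boundary
}
```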
At block 81 the buttons of the mouse 17 are read by the program of the invention. If the program finds one click of the right button at block 83 in Figure 6, then the voice annotation of an object represented by the presentation at the coordinates of the virtual display is retrieved and presented to the user through speaker 15. Of course, there may be no object associated with the coordinates, or there may be no voice annotation for an object that does exist.
The voice annotation continues to be produced until the annotation either ends or until the user presses the right button twice to create a double click. The right double click also stops the voice so that the user can continue exploring the presentation as soon as the object is recognized by the user.
At block 87, the program of the invention detects a double click and hold of the right button, which is interpreted by the program as a request by the user to return to the set-point. This is done when the user loses orientation of the material being presented and wishes to be reoriented by starting again from a known position on the virtual display. While the right button is being held, the user can also move the mouse to a convenient position, thereby adjusting the position of the whole virtual display. When the button is released, the buffer is again read at block 75 to restart the presentation to the user.
A more normal step occurs at block 89 where the position outputs from the mouse 17 are read by the program of the invention. If the mouse has been moved while in contact with a surface, the position outputs will provide values to the program which are proportional to the change in x and y coordinates.
These values are then used by the program at block 93 to increment or to decrement the logical address registers which access the buffer 26. In this way, the buffer can be read at a new logical address to obtain presentation information at a new location in the virtual display.
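A sketch of blocks 89 and 93 follows. Clamping the registers to the extent of the virtual display is an added assumption to keep addressing inside the buffer; the names are again illustrative.

```cpp
// Sketch of blocks 89 and 93: relative motion reported by the mouse
// increments or decrements the logical address registers, after which the
// buffer is read again (block 75) at the new virtual-display location.
#include <algorithm>

struct AddressRegisters { int x = 0; int y = 0; };

void applyMouseMotion(AddressRegisters& reg, int dx, int dy, int width, int height) {
    reg.x = std::clamp(reg.x + dx, 0, width - 1);   // block 93: adjust x register
    reg.y = std::clamp(reg.y + dy, 0, height - 1);  // block 93: adjust y register (clamping assumed)
}
```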
In addition to the method of the invention which is programmed to control the computer, the user operates the computer under the programmed control of the invention by initiating user chosen moves of the mouse and clicks of the left and right buttons to interact with the objects being presented.
In order to better understand the invention, it is desirable to describe a synopsis of the blind user interacting with the virtual desktop display with reference to Figures 7 through 10. Using an operating system such as OS/2, the user turns on the computer and, after a time, the OS/2 Presentation Manager Graphical Screen is made available upon the virtual display by loading the buffer 26 with the objects shown in Figure 7.
By moving the mouse the user encounters and feels icons in the presentation. The user determines where these icons are by moving and exploring the virtual display area using the tactile feedback from the mouse. A click of the button as described with respect to block 83 in Figure 6 articulates the name and properties of the icon. He can drag the icon and position it to preferred positions on the desktop for future use.
Having selected an icon, a double click of the left button produces a window that can be explored. Edges of the window are located by moving and feeling with the mouse. Window edges are also uniquely identified by an audio beep.
When the cursor encounters the edge of a window, it changes from a single to a double ended arrow. This visually indicates to a user with sight that the window can be resized by click-holding the left mouse button and dragging the window edge. As described above, this function has been implemented in the invention as an audio beep for the blind user. In OOPS it is done by the previously mentioned inheritance addition to the cursor object. For base level code it will require the addition of a sub-routine to the cursor control code. Assume that the icon selected was for an application menu window as shown in Figure 8.
This selected window contains a listing of applications available and their respective icons. The user enters and explores this window with the mouse. The user determines
the window edges by feel and the audio beeps, and identifies the icons and associated text by feeling, clicking, and listening to the vocal responses.
Locating and selecting one of the icons on the menu window of Figure 8 produces another window which, perhaps, overlaps the menu window as shown in Figure 9. Assume that this latest acquired window is a spreadsheet.
The question now is whether this is the desired spreadsheet. It can be identified by clicking on its title, whereby the title is read aloud as described with respect to block 83 of Figure 6. Any cell within any row or column can be located by feeling the boundary crossings and the text within the cells. A click articulates the contents of the cell. A
standard typewriter style keyboard allows entry or modification of data within the cell which can then be clicked upon for voiced verification.
The user experience will be as if the user had a large spread sheet upon his desktop. All of the printing is raised, and when the mouse encounters this raised printing the user's fingers feel a mild electrical tingle or mechanical vibration.
When the user is at an area of perceived interest, for example a row/column intercept data point, a request by a mouse click described with respect to block 83 of Figure 6 will cause a voiced readout of what is contained at that point. For example: "one-five-four-point-two-zero", "customer-account-number", "page-one-seven".
Over time, familiarity and a learning curve will cause the user to be comfortable with determining various spread sheets by their layouts and the voice indication of the title or page number on the paper. The significance is that the user obtains the voiced information only upon request of a mouse click. The user is free to rapidly scroll across the spreadsheet in quest of the desired column or row.
Location of desired information is recollected by reference to adjacent column/row information. Text is articulated by character, word, line, or paragraph and initiated under control of the user.
Closing the spreadsheet window, by locating and clicking on the close icon, re-establishes the application icon which can be dragged to a printer icon, stored in a file, or sent to another computer terminal on the network.
Although the invention has been described in the form of the preferred embodiment for use by a blind user, it will be recognized that the advantages of the invention are also useful to people who can see. Accordingly, a visual display apparatus may be added to the computer for use by sighted persons when working alone or together with a blind user.
Likewise, many other changes in form and detail will suggest themselves to those skilled in the art of interactive computer terminal design without departing from the spirit and scope of the invention, which is measured by the following claims.
VIRTUAL GRAPHICS DISPLAY CAPABLE OF PRESENTING
ICONS AND WINDOWS TO THE BLIND COMPUTER USER
BACKGROUND OF THE INVENTION:
Field of the Invention:
This invention relates generally to providing computer output to blind computer users and more particularly to providing means and method for allowing blind computer users to visualize computer graphic images and multiple screen windows in much the same way as these forms of computer output are visualized by persons with sight.
Background Prior Art:
A number of electronically assisted or computerized reading systems have been proposed for use by the blind. Examples include German Patent Publication DE3901023 and US Patent 4,687,444-The German publication teaches an optical scanner 11 mountedin a hand held device having a braille output matrix 5. The device is scanned across printed text which is "read" and "recognized by a computer 3 and converted to braille for output to the user. No provision i5 made for handling non text graphics and icons.
The US patent teaches a tape "reader" which converts text encoded onto a magnetic tape into braille output matrix 24.
The tape is read into a buffer. The buffer is then read out to a braille matrix. The tape is commonly encoded in a braille code and can be amplified to drive the matrix actuators directly. Reading of the buffer is controlled by 2û68~2 moving a mouse similar to those used with computers but there is only a horizontal distance output to control buffer addressing.
A device, called the Opticon, from Telesensory Company in Mountain View California, has the capability of translating the area surrounding the display cursor into a matrix of pins such that contrasting areas of the display will map to vibration of certain of the pins. The user places a finger of one hand on the pin matrix and the other hand holds and moves a mouse that causes the cursor to move about the display.
The original concept for the Opticon used a video camera that the user would sc~n across a paper document and then evolved thru camera lens changes to an ability to physically scan across a computer display. Further evolution was the substitution of a mou~e driven cursor for the handheld camera. The pin matrix, however, remained a separate device that is touched by a finger of the other hand.
The computer workstation for sighted users has evolved from a simple character based display to displays of graphics, windows, and object oriented programming. This evolution continues with multimedia and mixed media screen presentations, voice annotations, and simulated prospective.
Likewise, the user interface to this graphical display has shifted from the keyboard towards the "mouse". The mouse allows the user to easily move from one area of the display to another, and by single and double clicking on it s button keys, to manipulate the objects upon which the mouse driven cursor lies.
An example of such an object manipulation is "drag and drop"
whereby an object that represents a document, such as a musical composition, can be captured and dragged to an object which represents a printer and the document is printed; or dragged to a representation of a phonograph and 2~68~52 CT9-91-00~ 3 the document is transposed to an audio rendition of the music.
A paper document is a fixation of information to a substrate. That is the text and/or graphics do not move once placed upon the paper. That is conditionally true also for a character based computer display. A document, though scrollable, is typically positional stable upon the screen.
Thus it is reasonable to accept the learned ability of a blind user to be able to read text characters by feeling their impressions as simulated by a tactile readout device.
As described above with respect to a sighted user, windows appear, move, expand, and contract. Likewise menus pull-down, and scroll bars and radio buttons wait to be actuated. Icons represent objects to be manipulated.
Objects, by dragging and dropping, interact with other objects and things happen -- like printing, speech articulation, new object formation, etc.
The blind user requires a more sophisticated approach to interaction with this new technology. The compensatory enhancement of the sense of touch is not alone sufficient for spatial orientation when the information presented is free flowing, non-restrained, and particularly non-character representations.
SUMMARY OF THE INVENTION
In order to appreciate the environment that is created for a blind person by use of the apparatus and method of the invention, the reader of this application is requested to imagine being in a zoo among the freely roaming animals.
The person can walk about touching and feeling the shapes of the animals who are also moving around. Upon request an animal will speak giving it's name, how it differs from the other animals, and what it can do. This is the new era of computers wlthin which a visually handicapped person must exist and thrive.
Accordingly it is an advantageous effect of this invention that an image of a computer display screen is created in the mind of a blind user by tactile feedback to the thumb and/or finger of the users hand that holds a mouse like input output device.
It is a further advantageous effect that a blind person can, using the invention, communicate in an object oriented graphic environment. A still further effect is that a blind user can locate, identify, verify, change, and/or otherwise manipulate information and/or objects contained within a object oriented scheme as exemplified by a graphical user interface such as Presentation Manager residing on an operating system such as OS/2 or Windows residing on DOS.
These and other advantages are obtained according to the invention by combining a computer, voice response and a mouse having touch feedback with the method of the invention. The combination of the invention operates most efficiently in an object oriented graphical environment.
The method of the invention determines that an object has been located by providing feedback to the mouse when a boundary is encountered. A contrast point on the display causes a feedback sensation in ~he hand moving the mouse.
To allow for rapid and circuitous movement of the mouse it i8 necessary for the user feedback to be instantaneous. For this reason, feedback in the form of a mild electrical impulse, a physical vibration or other fast response manifestation is a pre~erred embodiment.
Clicking of a button will cause a voiced annotation of the encounter. If the object is a cell in a spread sheet, the response will be the numerical value, the text contained within that cell, or other information such as cell location/address. If the object is a note or chord on a musical composition, a representation of that sound will be heard. Similarly, if an icon object is encountered, the icon s property and use will be articulated or depending upon the program and the sequence of button depressions, it 20684~2 will be expanded so it can be more easily traced by the user with the mouse.
Clicking of a button on the mouse can cause any object encountered to be expanded so as to allow the user to explore the object ~y tracing. The mouse will continue to react so long as the cursor follows the path. Like walking along the road by feeling the difference in the surface contrast along the edge. BRIEF DESCRIPTION OF THE DRAWINGS:
Figure 1 shows the overall system in which the invention finds utility.
Figure 2 shows an embodiment of the mouse incorporating the invention.
Figure 3 shows an electrical tactile feedback element.
Figure 4 shows a vibratory tactile feedback element.
Figure 5 shows a feedback driver circuit.
Figure 6 shows a flow diagram of the method of operation of the invention.
Figure 7 is a representation of a display of a plurality of icons.
Figure 8 is a representation of a display of the screen of Figure 7 where one of the icons has been transformed into a menu window.
Figure 9 is a representation of a di~play of the screen of Figure 8 where one of the icons in the menu window has been transformed into an application window.
Figure 10 is the application window of Figure 9 after expansion to full screen display showing the operation of modifying a spread-sheet.
20~8~52 DETAILED DESCRIPTION OF PREFERRED EMBODIMENT:
Figure 1 shows the overall system of the preferred embodiment which includes a computer 11 connected to the mouse 17 having tactile feedback, which is shown in more detail in Figure 2. The system also has a keyboard 13 for alphanumeric input and a speaker 15 for beep, tone, and computer simulated speech output. Although not connected physically by wires to the computer, a virtual display 19 is shown in broken lines to portray the virtual display that is created in the mind of the user by means of the tactile and audio feedback that is provided by the computer.
Shown in the center of virtual display i8 a set-point 31.
The computer has programmed means for re-positioning the mouse cursor to the preset set-point position 31 within the virtual display presented to the user by the tactile feedback in the mouse and audio feedback. Dead-center is a preferable position for the set-point as then the user has a shorter distance to traverse to get to any quadrant of the display. This i~ necessary when the user has lost spatial orientation and needs to re-discover the positions of object~ on the virtual presentation. This is especially desirable when windows are re-sized or one window partially overlays another.
Also, the movement of addressing in the presentation buffer is not consistently proportional to mouse movement. This is due to changes in the coefficient of friction between the rolling mouse ball and the desktop surface, acceleration and damping effects from the hand control, and various similar factors. Though not desirable in normal mouse applications, it is tolerable since there is a constant visual feedback of the cursor location on the display screen. The blind user must have a reliable method for ad-hoc re-establishment of buffer addressing which is analogous to visual cursor location when the buffer is also connected to drive a visual display.
Figure 1 also shows the programs that implement the method of the invention. Included are the application programs 21, which are preferably written as object oriented programs using a language such as Smalltalk or C+~. The graphical interface shell 23 is a graphical user interface such as Presentation Manager u~ed with the IBM OS/2 or Windows used with Microsoft or IBM DOS. The input and output to and from the computer is effected by the I/O driver programs 27, 28, and 29. They control the speaker, the keyboard, and the mouse respectively. The mouse of course has both input and output functions implemented into its driver program.
although not shown in Figure 1, a visual display may also be connected to the computer to allow sighted persons to follow the activity of a blind user or to allow a user with sight to interact with the computer application using all the interfaces of sight, sound and feel.
When creating application programs incorporating the invention, a language within the family of Object Oriented Programming System~ (OOPS) is the preferred language. Two examples of such languages are Small-Talk and C+~. These languages utilize objects and object-to-object communication and interaction. An object is an encapsulation of related pieces of program code (referred to as method) and data.
Objects have attributes and exhibit behaviors. An object can be considered as a self-contained computer program that is operating on it s own data. Examples of objects are:
windows, icons, menus, spreadsheets, documents, keyboards, and printers.
The application programs, operating system, device drivers, objects, etc. have sound annotation. Spreadsheets talk, compositions play, documents read, and objects describe themselves. Sound annotation is added to the programs through the use of a device such as the IBM Audio Capture and Playback Adaptor. This is a plug in card for the IBM
PS/2 computer. Also needed is the IBM Audio Visual Connection software program product. These items allow the user and programmer to capture, compress, and store voice 2068~52 and audio information in the computer memory or the DASD
storage device.
Inheritance is a property that is exploited in OOPS
languages. Inheritance allows the programmer to create an object by calling a similar object and defining only the differences (additions and subtractions) needed to identify the new object. A data-cell in a spreadsheet, for example, will be given voice articulation of the contents thru the use of the inheritance attribute.
Another unique characteristic in OOPS is polymorphism whereby objects respond with different behavior dependent upon their interaction with other objects. For example an object such as a pull-down menu can respond in a different mode to a blind user s mouse than it would to a normal mouse. The click from the former could cause the menu list to be read out whereas a click from the latter would cause it to visually scroll downward.
For those application programs which do not contain voice annotations, there are many text readout auxiliary programs available. The~e programs attempt to read words or will spell out the word by character. One example of this type of program is the Screen Reader by IBM.
These auxiliary programs when executed in a computer of the invention will cause tactile or touch feedback as objects are located. An object, such as the title block on a window, will be readable through text to voice conversions by the auxiliary program, but Icons and graphical representations will not.
Referring now to Figure 2, a perspective view of a mouse 17 incorporating a tactile feedback area 33 according to the invention is shown. The preferred form of feedback to the user is a very mild AC signal. This AC signal is adjustable in both voltage and current so as to give a mild tingling sensation at the fingertip holding the mouse. The sensation is similar to the touching of an electrical appliance having 20684~2 a small leakage current that is seeking a ground return through the persons body.
The prior art includes many physical mouse designs and mouse to computer interfaces. There is also a dependency of the graphical interface layer to establish the effect upon the computer when the button(s) on the mouse is actuated. Thus, for the purpose of this invention, I have chosen to utilize the IBM mouse and the IBM Presentation Manager graphical interface residing on the OS/2 operating system.
For this configuration, the mouse has two buttons labeled 16 and 18 as indicated in figure 2. The left button 16 is the principle usage control in the prior art IBM Presentation Manager graphical interface. A single left button click on an icon presents a menu list and a double click produces the default position on that list which is, typically, to produce the window which the icon represents.
A left button click-and-hold locks the icon to the cursor and allows a user with sight to reposition the icon elsewhere on a display.
Other uses for the single and double clicking reside within the window after it has been established as a transform from the icon. For example, there are two triangles at the upper right corner of the window. Clicking on the leftmost triangle causes the window to transform back to the icon while clicking on the rightmost triangle causes the window to expand to full screen in size.
The right button 18 on the mouse has little application when used by persons with sight, except to produce a task-list upon a single click. The task list is a listing of the icons (tasks) available for the use of the user.
In the IBM Presentation Manager graphical interface residing on the OS/2 operating system, there is the ability to reconfigure the operation and sequencing of the mouse buttons by changing the mouse configuration program. This 20684~2 is accomplished by calling the mouse window by clicking on the mouse icon.
I choose to use the left mouse button 16 as indicated above.
I also choose to define and reconfigure the right mouse button 18 as follows:
Single click. Produce voice annotation.
(if available ) Second (or double) click. Stop voice annotation.
Double click and hold. Go to set-point position.
Release. Stay at set-point.
Single click and hold, Expand area surrounding curson then, Single click on left button.
Release. Return to normal screen presentation.
The task list i8 available, as are all mouse button operations, as a multi-key actuation on the keyboard and of course, the user has the ability to re-configure the operation of the buttons as he desires as indicated above.
Now with reference to Figure 3, a conductive area 33 is shown in which a sin~le finger will be in contact with the different voltage potentials of the tactile electrical output of the mouse 17. The conductive area 33 comprises a group of concentric circles separated by insulating space.
Circles 35 and 39 are electrically connected to terminal A
and circle 37 and center circle 41 are connected to terminal B. A finger placed onto area 33 will be able to sense the 2068~2 current and voltage between terminals A and B as tactile feedback from the computer.
Figure 4 shows an alternate embodiment of the tactile feedback transducer as a vibrator or tone source which will be made to vary in intensity and/or frequency as the mouse 17 is moved to present different parts of the buffer information to the user. The transducer of Figure 4 is a loudspeaker like device having a voice coil 47 and a magnet 45. The voice coil 47 is connected to a cone like member that is thereby made to vibrate creating a tactile sensation as well as producing a separate audible sensation when operated at audio frequencies. The voice coil 47 has one end connected to terminal A and the other end connected to terminal B. The mechanism of Figure 4 is similar to voice coil mechanisms which are used to drive the read/write head for movement across the rotating disks or diskettes in computer DASD storage devices.
Figure 5 shows the essential components required to furnish an ~C tactile feedback signal from a low DC voltage available from the computer to which the mouse is attached, or from a battery if the mouse has a wireless connection to the computer. The DC voltage source 51 is applied to a switching circuit 53 which changes it to a sequence of pulsations under control of the feedback signal from the computer. The ~requency of the pulsations are controlled by the feedback signal. The output of the switching circuit 53 is applied to the primary 55 of a transformer. The ratio of the turns in the primary winding 55 to the secondary winding 57 of the transformer determines the magnitude of the voltage available at the secondary. Taps 59, 60, and 61 on the secondary allow the magnitude of the voltage to be tailored to the user. ~ikewise the current limiting resistors 63 and 65 in series with the secondary voltage allow the maximum current level to be adjusted. The resultant voltage is applied across terminals A and B to drive either the electrical transducer of Flgure 3 or the vibratory transducer of Figure 4.
Referring now to Figure 6 a flow diagram of the method of operation of the invention is set out.
The method is initiated at block 71 where a set-point input is received from either the graphical interface shell program or a button click as described hereinafter with respect to block 87. The set-point input causes the program of the invention to set the presentation buffer 26 address registers to the values that define the logical address of the set-point ;31. Thereafter the information at the addressed location of the buffer 26 is read out at block 75 and used to obtain the presentation information pertinent to the presentation being presented at the corresponding x and y coordinates of the virtual display. The presentation information may be thought of as virtual picture elements that will be presented to the user by tactile feedback to the mouse.
If the program of the invention determines at block 77 that the information indicates that there is something other than background color, then a tactile feedback signal is generates. The signal defines a frequency indicative of the color of the information being presented. For example, the color red is a lower frequency and blue is a high frequency.
This signal is then sent to the mouse 17 where it is applied to the feedback input 52 of the circuits shown in Figure 5 to actuate the transducer of Figure 3 or Figure 4 at the defined fre~ency. If the program of the invention determines at block 79 that the presentation information pertinent to the x and y coordinates indicates that a window boundary exists at these coordinates, then an audio beep i8 generated in response to the programmed determination.
At block 81 the buttons of the mouse 17 are read by the pxogram of the invention. If the program finds one click of the right button at block 83 in Figure 6, then the voice annotation of an object represented by the presentation at the coordinates of the virtual display is retrieved and presented to the user through speaker 15. Of course, there may be no object associated with the coordinates or there 20684~
may be not voice annotation for an object if an object does exist.
The Voice annotation continues to be produced until the annotation either ends or until the user presses the right button twice to create a double click. The right double click also stops the voice so that the user can continue exploring the presentation as soon as the object is recognized by the user.
At block 87, the program of the invention detects a double click and hold of the right button which is interpreted by the program as a request by the user to return to the set-point. This is done when the user loses orientation of material being presented and wishes to be reoriented by starting again from a known position on the virtual display While the riqht button is being held, the user can also move the mouse to a convenient position thereby adjusting the position of the whole virtual display When the button is released, the buffer is again read at block 75 to restart the presentation to the user.
A more normal step occurs at block 89 where the position outputs from the mouse 17 are read by the program of the invention. If the mouse has been moved while in contact with a surface, the position outputs will provide values to the program which are proportional to the change in x and y coordinates.
These values are then used by the program at block 93 to increment or to decrement the logical address registers which access the buffer 26. In this way, the buffer can be read at a new logical address to obtain presentation information at a new location in the virtual display.
In addition to the method of the invention which is programmed to control the computer, the user operates the computer under the programmed control of the invention by initiating user chosen moves of the mouse and clicks of the ' 2068452 left and right buttons to interact with the objects being presented.
In order to better understand the invention, it is desirable to describe a synopsis of the blind user interacting with the virtual desktop display with reference to the Figures 7 through lO. Using an operating system such as OS/2, the user turns on the computer and, after a time, the OS/2 Presentation Manager Graphical Screen is made available upon the virtual display by loading the buffer 26 with the objects shown in Figure 7.
By moving the mouse the user encounters and feels icons in the presentation. The user determines where these icons are by moving and exploring the virtual display area using the tactile feedback from the mouse. A click of the button as described with respect to block 83 in Figure 6 articulates the name and properties of the icon. He can drag the icon and position it to preferred positions on the desktop for future use.
Having selected an icon, a double click of the left button produces a window that can be explored. Edges of the window are located by moving and feeling with the mouse. Window edges are also uniquely identified by an audio beep.
When the cursor encounters the edge of a window it changes from a single to a double ended arrow. This visually indicates to a user with sight that the window can be resized by click-holding the ]eft mouse button and dragging the window edge. As described above, this function has been implemented in the invention as an audio beep ~or the blind user. In OOPS it is done by the previously mentioned inheritance addition to the cursor object . For base level code it will re~uire addition of an additiona1 sub-routine to the cursor control code. Assume that the icon selected was for an application menu window as shown in Figure 8.
This selected window contains a listing of applications available and their respective icons. The user enters and explores this window with the mouse. The user determines CT9-91-008 15 20684S~
the window edges by feel and the audio beeps, and identifies the icons and associated text by feeling, clicking, and listening to the vocal responses.
Locating and selecting one of the icons on the menu window of Figure 8 produces another window which, perhaps, overlaps the menu window as shown in Figure 9. Assume that this latest acquired window is a spreadsheet.
The question now is whether this is the desired spreadsheet. It can be identified by clicking on its title, whereby the title is read aloud as described with respect to block 83 of Figure 6. Any cell within any row or column can be located by feeling the boundary crossings and the text within the cells. A click articulates the contents of the cell. A standard typewriter style keyboard allows entry or modification of data within the cell, which can then be clicked upon for voiced verification.
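A small sketch of this interaction is given below; the cell layout and function names are hypothetical, intended only to show how a click could be mapped to a row/column index, voiced, and later re-clicked to verify a keyboard entry.

```python
# Hypothetical sketch: locating a spreadsheet cell under the mouse, voicing
# its contents on a click, and verifying a keyboard entry by clicking again.
def cell_at(x, y, col_width=80, row_height=20):
    """Map a virtual-display position to a (row, column) index."""
    return y // row_height, x // col_width

def on_cell_click(sheet, x, y):
    row, col = cell_at(x, y)
    print(f"[speech] {sheet.get((row, col), 'empty cell')}")
    return row, col

sheet = {(0, 0): "customer-account-number", (3, 2): "154.20"}
row, col = on_cell_click(sheet, x=170, y=65)   # speaks "154.20"
sheet[(row, col)] = "200.00"                   # entry from the keyboard
on_cell_click(sheet, x=170, y=65)              # click again: speaks "200.00"
```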
The user experience will be as if the user had a large spreadsheet upon his desktop. All of the printing is raised, and when the mouse encounters this raised printing the user's fingers feel a mild electrical tingle or mechanical vibration.
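How that tactile cue might be driven can be sketched in a few lines. The colour-to-frequency table and function names below are assumptions; the description and claims only state that non-background picture elements produce tactile feedback and that the vibration frequency may be related to the colour.

```python
# Assumed sketch: a non-background picture element under the mouse triggers
# the tactile transducer, with the vibration frequency chosen by colour.
BACKGROUND = "white"
FREQ_BY_COLOR = {"black": 100, "blue": 150, "red": 200}   # Hz, illustrative

def tactile_feedback(pel_color):
    """Return a vibration frequency for non-background pels, else None."""
    if pel_color == BACKGROUND:
        return None                             # smooth background: no feedback
    return FREQ_BY_COLOR.get(pel_color, 120)    # default for other colours

for color in ("white", "black", "red"):
    freq = tactile_feedback(color)
    print(color, "->", "no vibration" if freq is None else f"{freq} Hz")
```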
When the user is at an area of perceived interest, for example a row/column intercept data point, a request by a mouse click described with respect to block 83 of Figure 6 will cause a voiced readout of what is contained at that point. For example: "one-five-four-point-two-zero", "customer-account-number", "page-one-seven".
Over time, familiarity and practice will make the user comfortable recognizing various spreadsheets by their layouts and by the voiced indication of the title or page number. The significant point is that the user obtains the voiced information only upon request by a mouse click. The user is free to rapidly scroll across the spreadsheet in search of the desired column or row.
Location of desired information is recollected by reference to adjacent column/row information. Text is articulated by character, word, line, or paragraph and initiated under control of the user.
Closing the spreadsheet window, by locating and clicking on the close icon, re-establishes the application icon which can be dragged to a printer icon, stored in a file, or sent to another computer terminal on the network.
Although the invention has been described in the form of the preferred embodiment for use by a blind user, it will be recognized that the advantages of the invention are also useful to people who can see. Accordingly, a visual display apparatus may be added to the computer for use by sighted persons when working alone or together with a blind user.
Likewise, many other changes in form and detail will suggest themselves to those skilled in the art of interactive computer terminal design without departing from the spirit and scope of the invention, which is measured by the following claims.
Claims (4)
1. A virtual graphics display system for the blind computer user comprising:
a computer having a buffer for storing information to be presented to a user;
a mouse connected to said computer for providing buffer addressing input signals to said computer, and for receiving tactile feedback signals from said computer and presenting said tactile feedback signals to said user;
an audio output device for presenting audio information to said user;
first programmed means operating in said computer, said first programmed means responsive to a setpoint input signal for addressing a set point in said buffer;
second programmed means operating in said computer, said second programmed means responsive to said buffer addressing input signals from said mouse to retrieve presentation data representing virtual picture elements on said virtual display;
third programmed means operating in said computer, said third programmed means responsive to said picture elements for generating a first tactile feedback signal to said mouse when said picture elements represent nonbackground information;
fourth programmed means operating in said computer, said fourth programmed means responsive to said picture elements for generating a first audio feedback signal to said audio output device when said picture elements represent nonbackground information that is also a window edge.
2. The method of presenting graphical information in the form of screen windows in the virtual display of a computer to a blind computer user comprising the steps of:
logically addressing presentation information in a memory by a hand held tactile feedback input device;
retrieving graphical information related to a point of said presentation information in the virtual display of the computer;
vibrating a transducer on said hand held tactile feedback input device if said graphical information indicates that a non background color is being presented at said point; and changing the logical address into said presentation information by means of said hand held tactile feedback input device, before repeating the above steps.
3. The method of Claim 2 wherein said frequency of vibration of said transducer is related to the color of said non background color.
4. The method of Claim 2 further comprising the step of changing the logical address in said presentation information to a predetermined set-point.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US07/748,996 | 1991-08-22 | ||
US07/748,996 US5186629A (en) | 1991-08-22 | 1991-08-22 | Virtual graphics display capable of presenting icons and windows to the blind computer user and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2068452A1 CA2068452A1 (en) | 1993-02-23 |
CA2068452C true CA2068452C (en) | 1998-03-31 |
Family
ID=25011788
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002068452A Expired - Fee Related CA2068452C (en) | 1991-08-22 | 1992-05-12 | Virtual graphics display capable of presenting icons and windows to the blind computer user |
Country Status (2)
Country | Link |
---|---|
US (1) | US5186629A (en) |
CA (1) | CA2068452C (en) |
Families Citing this family (265)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5631861A (en) | 1990-02-02 | 1997-05-20 | Virtual Technologies, Inc. | Force feedback and texture simulating interface device |
JP3217386B2 (en) * | 1991-04-24 | 2001-10-09 | オリンパス光学工業株式会社 | Diagnostic system |
US5889670A (en) * | 1991-10-24 | 1999-03-30 | Immersion Corporation | Method and apparatus for tactilely responsive user interface |
US6046722A (en) * | 1991-12-05 | 2000-04-04 | International Business Machines Corporation | Method and system for enabling blind or visually impaired computer users to graphically select displayed elements |
US5373309A (en) * | 1991-12-09 | 1994-12-13 | Sony/Tektronix Corporation | Method and apparatus for setting variable to desired value |
US5287102A (en) * | 1991-12-20 | 1994-02-15 | International Business Machines Corporation | Method and system for enabling a blind computer user to locate icons in a graphical user interface |
US6222525B1 (en) | 1992-03-05 | 2001-04-24 | Brad A. Armstrong | Image controllers with sheet connected sensors |
JP3086069B2 (en) * | 1992-06-16 | 2000-09-11 | キヤノン株式会社 | Information processing device for the disabled |
JP2597802B2 (en) * | 1992-08-04 | 1997-04-09 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Method for controlling an image capture device, image capture device and user interface |
US5790108A (en) | 1992-10-23 | 1998-08-04 | University Of British Columbia | Controller |
US5629594A (en) * | 1992-12-02 | 1997-05-13 | Cybernet Systems Corporation | Force feedback system |
US6433771B1 (en) | 1992-12-02 | 2002-08-13 | Cybernet Haptic Systems Corporation | Haptic device attribute control |
US6131097A (en) * | 1992-12-02 | 2000-10-10 | Immersion Corporation | Haptic authoring |
US5533182A (en) * | 1992-12-22 | 1996-07-02 | International Business Machines Corporation | Aural position indicating mechanism for viewable objects |
US5511187A (en) * | 1992-12-22 | 1996-04-23 | International Business Machines Corporation | Method and system for nonvisual groupware participant status determination in a data processing system |
US6186794B1 (en) * | 1993-04-02 | 2001-02-13 | Breakthrough To Literacy, Inc. | Apparatus for interactive adaptive learning by an individual through at least one of a stimuli presentation device and a user perceivable display |
US5682166A (en) * | 1993-06-01 | 1997-10-28 | Matsushita Electric Industrial Co., Ltd. | Multi-window apparatus with audio output function |
US5731804A (en) | 1995-01-18 | 1998-03-24 | Immersion Human Interface Corp. | Method and apparatus for providing high bandwidth, low noise mechanical I/O for computer systems |
US5734373A (en) * | 1993-07-16 | 1998-03-31 | Immersion Human Interface Corporation | Method and apparatus for controlling force feedback interface systems utilizing a host computer |
US5805140A (en) * | 1993-07-16 | 1998-09-08 | Immersion Corporation | High bandwidth force feedback interface using voice coils and flexures |
US5721566A (en) * | 1995-01-18 | 1998-02-24 | Immersion Human Interface Corp. | Method and apparatus for providing damping force feedback |
US5739811A (en) * | 1993-07-16 | 1998-04-14 | Immersion Human Interface Corporation | Method and apparatus for controlling human-computer interface systems providing force feedback |
US5471675A (en) * | 1993-07-27 | 1995-11-28 | Taligent, Inc. | Object oriented video framework system |
US5602563A (en) * | 1993-12-15 | 1997-02-11 | International Business Machines Corporation | Float to surface display |
US5473344A (en) * | 1994-01-06 | 1995-12-05 | Microsoft Corporation | 3-D cursor positioning device |
US6940488B1 (en) | 1994-01-06 | 2005-09-06 | Microsoft Corporation | System and method of adjusting display characteristics of a displayable data file using an ergonomic computer input device |
US7322011B2 (en) * | 1994-01-06 | 2008-01-22 | Microsoft Corporation | System and method of adjusting display characteristics of a displayable data file using an ergonomic computer input device |
US6097371A (en) | 1996-01-02 | 2000-08-01 | Microsoft Corporation | System and method of adjusting display characteristics of a displayable data file using an ergonomic computer input device |
US5943050A (en) * | 1994-04-07 | 1999-08-24 | International Business Machines Corporation | Digital image capture control |
CA2120879A1 (en) * | 1994-04-08 | 1995-10-09 | Gaston Ouzilleau | Computer aided tactile design |
WO1995034186A1 (en) * | 1994-06-03 | 1995-12-14 | Apple Computer, Inc. | System for producing directional sound in computer-based virtual environments |
US5734805A (en) * | 1994-06-17 | 1998-03-31 | International Business Machines Corporation | Apparatus and method for controlling navigation in 3-D space |
US5821920A (en) | 1994-07-14 | 1998-10-13 | Immersion Human Interface Corporation | Control input device for interfacing an elongated flexible object with a computer system |
US5623582A (en) * | 1994-07-14 | 1997-04-22 | Immersion Human Interface Corporation | Computer interface or control input device for laparoscopic surgical instrument and other elongated mechanical objects |
KR100368508B1 (en) * | 1994-09-07 | 2005-10-25 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Information processing system interacts with virtual workspaces with user-programmable tactile feedback |
US20030040361A1 (en) * | 1994-09-21 | 2003-02-27 | Craig Thorner | Method and apparatus for generating tactile feedback via relatively low-burden and/or zero burden telemetry |
US5666138A (en) | 1994-11-22 | 1997-09-09 | Culver; Craig F. | Interface control |
DE69622101T2 (en) * | 1995-03-13 | 2003-02-13 | Koninkl Philips Electronics Nv | 3-D INPUT BY VERTICAL SHIFTING OF MOUSE OR ROLLER BALL |
US5758122A (en) * | 1995-03-16 | 1998-05-26 | The United States Of America As Represented By The Secretary Of The Navy | Immersive visual programming system |
US5736978A (en) * | 1995-05-26 | 1998-04-07 | The United States Of America As Represented By The Secretary Of The Air Force | Tactile graphics display |
US5691898A (en) * | 1995-09-27 | 1997-11-25 | Immersion Human Interface Corp. | Safe and low cost computer peripherals with force feedback for consumer applications |
US5790820A (en) * | 1995-06-07 | 1998-08-04 | Vayda; Mark | Radial graphical menuing system |
US5798760A (en) * | 1995-06-07 | 1998-08-25 | Vayda; Mark | Radial graphical menuing system with concentric region menuing |
US5745717A (en) * | 1995-06-07 | 1998-04-28 | Vayda; Mark | Graphical menu providing simultaneous multiple command selection |
US6166723A (en) | 1995-11-17 | 2000-12-26 | Immersion Corporation | Mouse interface device providing force feedback |
US5694153A (en) * | 1995-07-31 | 1997-12-02 | Microsoft Corporation | Input device for providing multi-dimensional position coordinate signals to a computer |
US5999168A (en) * | 1995-09-27 | 1999-12-07 | Immersion Corporation | Haptic accelerator for force feedback computer peripherals |
US6384743B1 (en) * | 1999-06-14 | 2002-05-07 | Wisconsin Alumni Research Foundation | Touch screen for the vision-impaired |
NL1001493C2 (en) * | 1995-10-24 | 1997-04-25 | Alva B V | Workstation, equipped with a braille display. |
US5754023A (en) | 1995-10-26 | 1998-05-19 | Cybernet Systems Corporation | Gyro-stabilized platforms for force-feedback applications |
US6639581B1 (en) | 1995-11-17 | 2003-10-28 | Immersion Corporation | Flexure mechanism for interface device |
US6100874A (en) * | 1995-11-17 | 2000-08-08 | Immersion Corporation | Force feedback mouse interface |
US5825308A (en) | 1996-11-26 | 1998-10-20 | Immersion Human Interface Corporation | Force feedback interface having isotonic and isometric functionality |
US6704001B1 (en) | 1995-11-17 | 2004-03-09 | Immersion Corporation | Force feedback device including actuator with moving magnet |
US6061004A (en) * | 1995-11-26 | 2000-05-09 | Immersion Corporation | Providing force feedback using an interface device including an indexing function |
WO1997020305A1 (en) | 1995-11-30 | 1997-06-05 | Virtual Technologies, Inc. | Tactile feedback man-machine interface device |
US5956484A (en) * | 1995-12-13 | 1999-09-21 | Immersion Corporation | Method and apparatus for providing force feedback over a computer network |
US7027032B2 (en) * | 1995-12-01 | 2006-04-11 | Immersion Corporation | Designing force sensations for force feedback computer applications |
US6169540B1 (en) | 1995-12-01 | 2001-01-02 | Immersion Corporation | Method and apparatus for designing force sensations in force feedback applications |
US6028593A (en) | 1995-12-01 | 2000-02-22 | Immersion Corporation | Method and apparatus for providing simulated physical interactions within computer generated environments |
US8508469B1 (en) | 1995-12-01 | 2013-08-13 | Immersion Corporation | Networked applications including haptic feedback |
US6147674A (en) * | 1995-12-01 | 2000-11-14 | Immersion Corporation | Method and apparatus for designing force sensations in force feedback computer applications |
US6219032B1 (en) * | 1995-12-01 | 2001-04-17 | Immersion Corporation | Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface |
US6300936B1 (en) | 1997-11-14 | 2001-10-09 | Immersion Corporation | Force feedback system including multi-tasking graphical host environment and interface device |
US6078308A (en) * | 1995-12-13 | 2000-06-20 | Immersion Corporation | Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object |
US6161126A (en) | 1995-12-13 | 2000-12-12 | Immersion Corporation | Implementing force feedback over the World Wide Web and other computer networks |
US5914705A (en) | 1996-02-09 | 1999-06-22 | Lucent Technologies Inc. | Apparatus and method for providing detent-like tactile feedback |
US5692956A (en) * | 1996-02-09 | 1997-12-02 | Mattel, Inc. | Combination computer mouse and game play control |
US6115482A (en) * | 1996-02-13 | 2000-09-05 | Ascent Technology, Inc. | Voice-output reading system with gesture-based navigation |
SE519661C2 (en) * | 1996-02-23 | 2003-03-25 | Immersion Corp | Pointing devices and method for marking graphic details on a display with sensory feedback upon finding said detail |
US6050718A (en) * | 1996-03-28 | 2000-04-18 | Immersion Corporation | Method and apparatus for providing high bandwidth force feedback with improved actuator feel |
US6374255B1 (en) | 1996-05-21 | 2002-04-16 | Immersion Corporation | Haptic authoring |
US5881318A (en) * | 1996-07-09 | 1999-03-09 | Gateway 2000, Inc. | Keyboard audio controls for integrated CD-ROM players |
US5841425A (en) * | 1996-07-31 | 1998-11-24 | International Business Machines Corporation | Ambidextrous computer input device |
US6125385A (en) * | 1996-08-01 | 2000-09-26 | Immersion Corporation | Force feedback implementation in web pages |
US5990869A (en) * | 1996-08-20 | 1999-11-23 | Alliance Technologies Corp. | Force feedback mouse |
US6024576A (en) | 1996-09-06 | 2000-02-15 | Immersion Corporation | Hemispherical, high bandwidth mechanical interface for computer systems |
US5991781A (en) * | 1996-09-27 | 1999-11-23 | Sun Microsystems, Inc. | Method and apparatus for detecting and presenting client side image map attributes including sound attributes using page layout data strings |
GB9622556D0 (en) * | 1996-10-30 | 1997-01-08 | Philips Electronics Nv | Cursor control with user feedback mechanism |
US5990872A (en) * | 1996-10-31 | 1999-11-23 | Gateway 2000, Inc. | Keyboard control of a pointing device of a computer |
US6411276B1 (en) | 1996-11-13 | 2002-06-25 | Immersion Corporation | Hybrid control of haptic feedback for host computer and interface device |
US6128006A (en) * | 1998-03-26 | 2000-10-03 | Immersion Corporation | Force feedback mouse wheel and other control wheels |
US6636197B1 (en) | 1996-11-26 | 2003-10-21 | Immersion Corporation | Haptic feedback effects for control, knobs and other interface devices |
US6956558B1 (en) * | 1998-03-26 | 2005-10-18 | Immersion Corporation | Rotary force feedback wheels for remote control devices |
US7489309B2 (en) * | 1996-11-26 | 2009-02-10 | Immersion Corporation | Control knob with multiple degrees of freedom and force feedback |
US6686911B1 (en) * | 1996-11-26 | 2004-02-03 | Immersion Corporation | Control knob with control modes and force feedback |
US6154201A (en) * | 1996-11-26 | 2000-11-28 | Immersion Corporation | Control knob with multiple degrees of freedom and force feedback |
US5973670A (en) * | 1996-12-31 | 1999-10-26 | International Business Machines Corporation | Tactile feedback controller for computer cursor control device |
US6111562A (en) * | 1997-01-06 | 2000-08-29 | Intel Corporation | System for generating an audible cue indicating the status of a display object |
US5912660A (en) * | 1997-01-09 | 1999-06-15 | Virtouch Ltd. | Mouse-like input/output device with display screen and method for its use |
US6762749B1 (en) * | 1997-01-09 | 2004-07-13 | Virtouch Ltd. | Tactile interface system for electronic data display system |
US6278441B1 (en) * | 1997-01-09 | 2001-08-21 | Virtouch, Ltd. | Tactile interface system for electronic data display system |
CA2278726C (en) * | 1997-01-27 | 2004-08-31 | Immersion Corporation | Method and apparatus for providing high bandwidth, realistic force feedback including an improved actuator |
US6705868B1 (en) * | 1998-03-18 | 2004-03-16 | Purdue Research Foundation | Apparatus and methods for a shape memory spring actuator and display |
US6020876A (en) * | 1997-04-14 | 2000-02-01 | Immersion Corporation | Force feedback interface with selective disturbance filter |
US6050962A (en) | 1997-04-21 | 2000-04-18 | Virtual Technologies, Inc. | Goniometer-based body-tracking device and method |
US6292170B1 (en) | 1997-04-25 | 2001-09-18 | Immersion Corporation | Designing compound force sensations for computer applications |
US6285351B1 (en) | 1997-04-25 | 2001-09-04 | Immersion Corporation | Designing force sensations for computer applications including sounds |
US6078312A (en) * | 1997-07-09 | 2000-06-20 | Gateway 2000, Inc. | Pointing device with absolute and relative positioning capability |
EP0931285B1 (en) * | 1997-07-21 | 2003-12-17 | Koninklijke Philips Electronics N.V. | Information processing system |
US6061718A (en) * | 1997-07-23 | 2000-05-09 | Ericsson Inc. | Electronic mail delivery system in wired or wireless communications system |
US20010043191A1 (en) * | 1997-07-31 | 2001-11-22 | Todd D. Lindsey | Audio and video controls on a pointing device for a computer |
US6292174B1 (en) * | 1997-08-23 | 2001-09-18 | Immersion Corporation | Enhanced cursor control using limited-workspace force feedback devices |
US6252579B1 (en) | 1997-08-23 | 2001-06-26 | Immersion Corporation | Interface device and method for providing enhanced cursor control with force feedback |
GB9722766D0 (en) | 1997-10-28 | 1997-12-24 | British Telecomm | Portable computers |
US6281651B1 (en) | 1997-11-03 | 2001-08-28 | Immersion Corporation | Haptic pointing devices |
US6088019A (en) * | 1998-06-23 | 2000-07-11 | Immersion Corporation | Low cost force feedback device with actuator for non-primary axis |
US6252583B1 (en) | 1997-11-14 | 2001-06-26 | Immersion Corporation | Memory and force output management for a force feedback system |
US6243078B1 (en) | 1998-06-23 | 2001-06-05 | Immersion Corporation | Pointing device with forced feedback button |
US8020095B2 (en) * | 1997-11-14 | 2011-09-13 | Immersion Corporation | Force feedback system including multi-tasking graphical host environment |
US6448977B1 (en) | 1997-11-14 | 2002-09-10 | Immersion Corporation | Textures and other spatial sensations for a relative haptic interface device |
US6211861B1 (en) | 1998-06-23 | 2001-04-03 | Immersion Corporation | Tactile mouse device |
US6256011B1 (en) | 1997-12-03 | 2001-07-03 | Immersion Corporation | Multi-function control device with force feedback |
US6091395A (en) * | 1997-12-15 | 2000-07-18 | International Business Machines Corporation | Computer system and method of manipulating a graphical user interface component on a computer display through collision with a pointer |
US6075531A (en) * | 1997-12-15 | 2000-06-13 | International Business Machines Corporation | Computer system and method of manipulating multiple graphical user interface components on a computer display with a proximity pointer |
US6308187B1 (en) | 1998-02-09 | 2001-10-23 | International Business Machines Corporation | Computer system and method for abstracting and accessing a chronologically-arranged collection of information |
US6304259B1 (en) | 1998-02-09 | 2001-10-16 | International Business Machines Corporation | Computer system, method and user interface components for abstracting and accessing a body of knowledge |
US6874123B1 (en) | 1998-02-09 | 2005-03-29 | International Business Machines Corporation | Three-dimensional model to facilitate user comprehension and management of information |
US6275227B1 (en) | 1998-02-09 | 2001-08-14 | International Business Machines Corporation | Computer system and method for controlling the same utilizing a user interface control integrated with multiple sets of instructional material therefor |
US6219034B1 (en) | 1998-02-23 | 2001-04-17 | Kristofer E. Elbing | Tactile computer interface |
US6307552B1 (en) | 1998-03-16 | 2001-10-23 | International Business Machines Corporation | Computer system and method for controlling the same utilizing an abstraction stack with a sequence of predetermined display formats |
US6184885B1 (en) | 1998-03-16 | 2001-02-06 | International Business Machines Corporation | Computer system and method for controlling the same utilizing logically-typed concept highlighting |
US20080055241A1 (en) * | 1998-03-26 | 2008-03-06 | Immersion Corporation | Systems and Methods for Haptic Feedback Effects for Control Knobs |
US6717573B1 (en) | 1998-06-23 | 2004-04-06 | Immersion Corporation | Low-cost haptic mouse implementations |
US6184868B1 (en) * | 1998-09-17 | 2001-02-06 | Immersion Corp. | Haptic feedback control devices |
US6686901B2 (en) * | 1998-06-23 | 2004-02-03 | Immersion Corporation | Enhancing inertial tactile feedback in computer interface devices having increased mass |
US6429846B2 (en) * | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
US6697043B1 (en) | 1999-12-21 | 2004-02-24 | Immersion Corporation | Haptic interface device and actuator assembly providing linear haptic sensations |
US6707443B2 (en) | 1998-06-23 | 2004-03-16 | Immersion Corporation | Haptic trackball device |
US6727919B1 (en) | 1998-07-07 | 2004-04-27 | International Business Machines Corporation | Flexible mouse-driven method of user interface |
NL1010236C2 (en) * | 1998-10-02 | 2000-04-04 | Koninkl Kpn Nv | Automatic telephone dialing system which allows user to browse sequence of names which are spoken via loudspeaker and press button when required one is announced |
US7038667B1 (en) * | 1998-10-26 | 2006-05-02 | Immersion Corporation | Mechanisms for control knobs and other interface devices |
GB2343413A (en) * | 1998-11-07 | 2000-05-10 | Gerald William Haywood | Input device with audio feedback |
US6452586B1 (en) * | 1998-11-30 | 2002-09-17 | Microsoft Corporation | Computer input device providing tactile feedback |
JP3543695B2 (en) * | 1999-03-17 | 2004-07-14 | 富士ゼロックス株式会社 | Driving force generator |
US6728675B1 (en) * | 1999-06-03 | 2004-04-27 | International Business Machines Corporatiion | Data processor controlled display system with audio identifiers for overlapping windows in an interactive graphical user interface |
DE20022244U1 (en) * | 1999-07-01 | 2001-11-08 | Immersion Corp | Control of vibrotactile sensations for haptic feedback devices |
US7561142B2 (en) | 1999-07-01 | 2009-07-14 | Immersion Corporation | Vibrotactile haptic feedback devices |
US6693622B1 (en) | 1999-07-01 | 2004-02-17 | Immersion Corporation | Vibrotactile haptic feedback devices |
US8169402B2 (en) * | 1999-07-01 | 2012-05-01 | Immersion Corporation | Vibrotactile haptic feedback devices |
US6369799B1 (en) | 1999-07-23 | 2002-04-09 | Lucent Technologies Inc. | Computer pointer device for handicapped persons |
DE20080209U1 (en) * | 1999-09-28 | 2001-08-09 | Immersion Corp | Control of haptic sensations for interface devices with vibrotactile feedback |
US6644321B1 (en) * | 1999-10-29 | 2003-11-11 | Medtronic, Inc. | Tactile feedback for indicating validity of communication link with an implantable medical device |
US6844871B1 (en) * | 1999-11-05 | 2005-01-18 | Microsoft Corporation | Method and apparatus for computer input using six degrees of freedom |
US6693626B1 (en) | 1999-12-07 | 2004-02-17 | Immersion Corporation | Haptic feedback using a keyboard device |
US6822635B2 (en) * | 2000-01-19 | 2004-11-23 | Immersion Corporation | Haptic interface for laptop computers and other portable devices |
US8013840B1 (en) | 2000-04-06 | 2011-09-06 | Microsoft Corporation | User notification system with an illuminated computer input device |
US6445284B1 (en) | 2000-05-10 | 2002-09-03 | Juan Manuel Cruz-Hernandez | Electro-mechanical transducer suitable for tactile display and article conveyance |
US6906697B2 (en) | 2000-08-11 | 2005-06-14 | Immersion Corporation | Haptic sensations for tactile feedback interface devices |
EP1182539B1 (en) * | 2000-08-16 | 2009-03-25 | Sony Deutschland GmbH | Haptic device control |
US7084854B1 (en) * | 2000-09-28 | 2006-08-01 | Immersion Corporation | Actuator for providing tactile sensations and device for directional tactile sensations |
AU2001294852A1 (en) * | 2000-09-28 | 2002-04-08 | Immersion Corporation | Directional tactile feedback for haptic feedback interface devices |
US7182691B1 (en) | 2000-09-28 | 2007-02-27 | Immersion Corporation | Directional inertial tactile feedback using rotating masses |
US6995744B1 (en) | 2000-09-28 | 2006-02-07 | Immersion Corporation | Device and assembly for providing linear tactile sensations |
WO2002048825A2 (en) * | 2000-11-15 | 2002-06-20 | Bagley, Dallin | System and method for guiding a computer user to promotional material |
GB2374506B (en) * | 2001-01-29 | 2004-11-17 | Hewlett Packard Co | Audio user interface with cylindrical audio field organisation |
GB2372923B (en) * | 2001-01-29 | 2005-05-25 | Hewlett Packard Co | Audio user interface with selective audio field expansion |
GB2374502B (en) * | 2001-01-29 | 2004-12-29 | Hewlett Packard Co | Distinguishing real-world sounds from audio user interface sounds |
GB2374507B (en) * | 2001-01-29 | 2004-12-29 | Hewlett Packard Co | Audio user interface with audio cursor |
US20030227476A1 (en) * | 2001-01-29 | 2003-12-11 | Lawrence Wilcock | Distinguishing real-world sounds from audio user interface sounds |
GB0127778D0 (en) * | 2001-11-20 | 2002-01-09 | Hewlett Packard Co | Audio user interface with dynamic audio labels |
GB2371914B (en) * | 2001-02-03 | 2004-09-22 | Ibm | Non-visual user interface |
US20020124025A1 (en) * | 2001-03-01 | 2002-09-05 | International Business Machines Corporataion | Scanning and outputting textual information in web page images |
US20020124056A1 (en) * | 2001-03-01 | 2002-09-05 | International Business Machines Corporation | Method and apparatus for modifying a web page |
US20020124020A1 (en) * | 2001-03-01 | 2002-09-05 | International Business Machines Corporation | Extracting textual equivalents of multimedia content stored in multimedia files |
WO2002071258A2 (en) * | 2001-03-02 | 2002-09-12 | Breakthrough To Literacy, Inc. | Adaptive instructional process and system to facilitate oral and written language comprehension |
US7020840B2 (en) * | 2001-03-22 | 2006-03-28 | Pulse Data International Limited | Relating to Braille equipment |
US9625905B2 (en) * | 2001-03-30 | 2017-04-18 | Immersion Corporation | Haptic remote control for toys |
US6834373B2 (en) * | 2001-04-24 | 2004-12-21 | International Business Machines Corporation | System and method for non-visually presenting multi-part information pages using a combination of sonifications and tactile feedback |
US6636202B2 (en) * | 2001-04-27 | 2003-10-21 | International Business Machines Corporation | Interactive tactile display for computer screen |
US6941509B2 (en) | 2001-04-27 | 2005-09-06 | International Business Machines Corporation | Editing HTML DOM elements in web browsers with non-visual capabilities |
US20020161824A1 (en) * | 2001-04-27 | 2002-10-31 | International Business Machines Corporation | Method for presentation of HTML image-map elements in non visual web browsers |
US7202851B2 (en) * | 2001-05-04 | 2007-04-10 | Immersion Medical Inc. | Haptic interface for palpation simulation |
IL143255A (en) | 2001-05-20 | 2015-09-24 | Simbionix Ltd | Endoscopic ultrasonography simulation |
US6937033B2 (en) * | 2001-06-27 | 2005-08-30 | Immersion Corporation | Position sensor with resistive element |
US20030002682A1 (en) * | 2001-07-02 | 2003-01-02 | Phonex Broadband Corporation | Wireless audio/mechanical vibration transducer and audio/visual transducer |
US7056123B2 (en) * | 2001-07-16 | 2006-06-06 | Immersion Corporation | Interface apparatus with cable-driven force feedback and grounded actuators |
US7623114B2 (en) | 2001-10-09 | 2009-11-24 | Immersion Corporation | Haptic feedback sensations based on audio output from computer devices |
US6703550B2 (en) | 2001-10-10 | 2004-03-09 | Immersion Corporation | Sound data output and manipulation using haptic feedback |
SE0103531D0 (en) * | 2001-10-23 | 2001-10-23 | Abb Ab | Industrial Robot System |
SE0103532D0 (en) * | 2001-10-23 | 2001-10-23 | Abb Ab | Industrial robot system |
FR2831428B1 (en) * | 2001-10-26 | 2004-09-03 | Univ Compiegne Tech | METHOD FOR ALLOWING AT LEAST ONE USER, PARTICULARLY A BLIND USER, TO PERCEIVE A SHAPE AND DEVICE FOR CARRYING OUT THE METHOD |
US7457398B2 (en) * | 2002-01-31 | 2008-11-25 | Comverse, Inc. | Methods and systems for providing voicemail services |
US7161580B2 (en) * | 2002-04-25 | 2007-01-09 | Immersion Corporation | Haptic feedback using rotary harmonic moving mass |
US7369115B2 (en) * | 2002-04-25 | 2008-05-06 | Immersion Corporation | Haptic devices having multiple operational modes including at least one resonant mode |
US7103551B2 (en) * | 2002-05-02 | 2006-09-05 | International Business Machines Corporation | Computer network including a computer system transmitting screen image information and corresponding speech information to another computer system |
AU2003279475A1 (en) * | 2002-12-04 | 2004-06-23 | Koninklijke Philips Electronics N.V. | Graphic user interface having touch detectability |
US8059088B2 (en) * | 2002-12-08 | 2011-11-15 | Immersion Corporation | Methods and systems for providing haptic messaging to handheld communication devices |
AU2003297716A1 (en) | 2002-12-08 | 2004-06-30 | Immersion Corporation | Methods and systems for providing haptic messaging to handheld communication devices |
US8830161B2 (en) | 2002-12-08 | 2014-09-09 | Immersion Corporation | Methods and systems for providing a virtual touch haptic effect to handheld communication devices |
US7336266B2 (en) | 2003-02-20 | 2008-02-26 | Immersion Corproation | Haptic pads for use with user-interface devices |
KR200319960Y1 (en) * | 2003-04-16 | 2003-07-22 | (주)사운드스케이프 | Mouse device having means for voice output |
CA2468481A1 (en) * | 2003-05-26 | 2004-11-26 | John T. Forbis | Multi-position rail for a barrier |
WO2004111819A1 (en) * | 2003-06-09 | 2004-12-23 | Immersion Corporation | Interactive gaming systems with haptic feedback |
US7850456B2 (en) | 2003-07-15 | 2010-12-14 | Simbionix Ltd. | Surgical simulation device, system and method |
JP2005044241A (en) * | 2003-07-24 | 2005-02-17 | Nec Corp | Pointing device notification system and method |
US6992656B2 (en) * | 2003-08-13 | 2006-01-31 | Hughes Micheal L | Computer mouse with data retrieval and input functionalities |
US8826137B2 (en) * | 2003-08-14 | 2014-09-02 | Freedom Scientific, Inc. | Screen reader having concurrent communication of non-textual information |
FR2860308B1 (en) * | 2003-09-26 | 2006-01-13 | Inst Nat Rech Inf Automat | CURSOR POSITION MODULATION IN VIDEO DATA FOR COMPUTER SCREEN |
US7398089B2 (en) | 2003-11-12 | 2008-07-08 | Research In Motion Ltd | Data-capable network prioritization with reduced delays in data service |
US7742036B2 (en) * | 2003-12-22 | 2010-06-22 | Immersion Corporation | System and method for controlling haptic devices having multiple operational modes |
US7283120B2 (en) | 2004-01-16 | 2007-10-16 | Immersion Corporation | Method and apparatus for providing haptic feedback having a position-based component and a predetermined time-based component |
US7403191B2 (en) * | 2004-01-28 | 2008-07-22 | Microsoft Corporation | Tactile overlay for an imaging display |
US20050233287A1 (en) * | 2004-04-14 | 2005-10-20 | Vladimir Bulatov | Accessible computer system |
JP2007536666A (en) * | 2004-05-03 | 2007-12-13 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Graphic user interface, system, method and computer program for interacting with a user |
US7580867B2 (en) * | 2004-05-04 | 2009-08-25 | Paul Nykamp | Methods for interactively displaying product information and for collaborative product design |
US9046922B2 (en) * | 2004-09-20 | 2015-06-02 | Immersion Corporation | Products and processes for providing multimodal feedback in a user interface device |
WO2006042309A1 (en) * | 2004-10-08 | 2006-04-20 | Immersion Corporation | Haptic feedback for button and scrolling action simulation in touch input devices |
US7562117B2 (en) * | 2005-09-09 | 2009-07-14 | Outland Research, Llc | System, method and computer program product for collaborative broadcast media |
US20060229058A1 (en) * | 2005-10-29 | 2006-10-12 | Outland Research | Real-time person-to-person communication using geospatial addressing |
US20060195361A1 (en) * | 2005-10-01 | 2006-08-31 | Outland Research | Location-based demographic profiling system and method of use |
US7489979B2 (en) * | 2005-01-27 | 2009-02-10 | Outland Research, Llc | System, method and computer program product for rejecting or deferring the playing of a media file retrieved by an automated process |
US20060161621A1 (en) * | 2005-01-15 | 2006-07-20 | Outland Research, Llc | System, method and computer program product for collaboration and synchronization of media content on a plurality of media players |
US20070189544A1 (en) | 2005-01-15 | 2007-08-16 | Outland Research, Llc | Ambient sound responsive media player |
US7542816B2 (en) * | 2005-01-27 | 2009-06-02 | Outland Research, Llc | System, method and computer program product for automatically selecting, suggesting and playing music media files |
US20060173556A1 (en) * | 2005-02-01 | 2006-08-03 | Outland Research,. Llc | Methods and apparatus for using user gender and/or age group to improve the organization of documents retrieved in response to a search query |
US20070276870A1 (en) * | 2005-01-27 | 2007-11-29 | Outland Research, Llc | Method and apparatus for intelligent media selection using age and/or gender |
US20060173828A1 (en) * | 2005-02-01 | 2006-08-03 | Outland Research, Llc | Methods and apparatus for using personal background data to improve the organization of documents retrieved in response to a search query |
US20060179056A1 (en) * | 2005-10-12 | 2006-08-10 | Outland Research | Enhanced storage and retrieval of spatially associated information |
US20060179044A1 (en) * | 2005-02-04 | 2006-08-10 | Outland Research, Llc | Methods and apparatus for using life-context of a user to improve the organization of documents retrieved in response to a search query from that user |
US20060253210A1 (en) * | 2005-03-26 | 2006-11-09 | Outland Research, Llc | Intelligent Pace-Setting Portable Media Player |
US20060223637A1 (en) * | 2005-03-31 | 2006-10-05 | Outland Research, Llc | Video game system combining gaming simulation with remote robot control and remote robot feedback |
US20060223635A1 (en) * | 2005-04-04 | 2006-10-05 | Outland Research | method and apparatus for an on-screen/off-screen first person gaming experience |
US20060256008A1 (en) * | 2005-05-13 | 2006-11-16 | Outland Research, Llc | Pointing interface for person-to-person information exchange |
US20060241864A1 (en) * | 2005-04-22 | 2006-10-26 | Outland Research, Llc | Method and apparatus for point-and-send data transfer within an ubiquitous computing environment |
US20060256007A1 (en) * | 2005-05-13 | 2006-11-16 | Outland Research, Llc | Triangulation method and apparatus for targeting and accessing spatially associated information |
US20060259574A1 (en) * | 2005-05-13 | 2006-11-16 | Outland Research, Llc | Method and apparatus for accessing spatially associated information |
US20070150188A1 (en) * | 2005-05-27 | 2007-06-28 | Outland Research, Llc | First-person video-based travel planning system |
US20060271286A1 (en) * | 2005-05-27 | 2006-11-30 | Outland Research, Llc | Image-enhanced vehicle navigation systems and methods |
ITMI20051043A1 (en) * | 2005-06-06 | 2006-12-07 | Milano Politecnico | "SYSTEM AND METHOD FOR THE EXPLORATION OF GRAPHIC ITEMS FOR USERS". |
US20060186197A1 (en) * | 2005-06-16 | 2006-08-24 | Outland Research | Method and apparatus for wireless customer interaction with the attendants working in a restaurant |
US20080032719A1 (en) * | 2005-10-01 | 2008-02-07 | Outland Research, Llc | Centralized establishment-based tracking and messaging service |
US7519537B2 (en) | 2005-07-19 | 2009-04-14 | Outland Research, Llc | Method and apparatus for a verbo-manual gesture interface |
US20070028178A1 (en) * | 2005-07-26 | 2007-02-01 | Gibson Becky J | Method and system for providing a fully accessible color selection component in a graphical user interface |
US7917148B2 (en) * | 2005-09-23 | 2011-03-29 | Outland Research, Llc | Social musical media rating system and method for localized establishments |
US8176101B2 (en) | 2006-02-07 | 2012-05-08 | Google Inc. | Collaborative rejection of media for physical establishments |
US7577522B2 (en) * | 2005-12-05 | 2009-08-18 | Outland Research, Llc | Spatially associated personal reminder system and method |
US7586032B2 (en) * | 2005-10-07 | 2009-09-08 | Outland Research, Llc | Shake responsive portable media player |
US20070083323A1 (en) * | 2005-10-07 | 2007-04-12 | Outland Research | Personal cuing for spatially associated information |
US20070103437A1 (en) * | 2005-10-26 | 2007-05-10 | Outland Research, Llc | Haptic metering for minimally invasive medical procedures |
US20060227047A1 (en) * | 2005-12-13 | 2006-10-12 | Outland Research | Meeting locator system and method of using the same |
US20070145680A1 (en) * | 2005-12-15 | 2007-06-28 | Outland Research, Llc | Shake Responsive Portable Computing Device for Simulating a Randomization Object Used In a Game Of Chance |
US20070075127A1 (en) * | 2005-12-21 | 2007-04-05 | Outland Research, Llc | Orientation-based power conservation for portable media devices |
US7770118B2 (en) * | 2006-02-13 | 2010-08-03 | Research In Motion Limited | Navigation tool with audible feedback on a handheld communication device having a full alphabetic keyboard |
JP4827644B2 (en) * | 2006-07-27 | 2011-11-30 | アルパイン株式会社 | Remote input device and electronic device using the same |
US8225229B2 (en) * | 2006-11-09 | 2012-07-17 | Sony Mobile Communications Ab | Adjusting display brightness and/or refresh rates based on eye tracking |
US7844915B2 (en) | 2007-01-07 | 2010-11-30 | Apple Inc. | Application programming interfaces for scrolling operations |
US8543338B2 (en) | 2007-01-16 | 2013-09-24 | Simbionix Ltd. | System and method for performing computerized simulations for image-guided procedures using a patient specific model |
US8500451B2 (en) * | 2007-01-16 | 2013-08-06 | Simbionix Ltd. | Preoperative surgical simulation |
US7847677B2 (en) * | 2007-10-04 | 2010-12-07 | International Business Machines Corporation | Method and system for providing auditory feedback for the visually impaired when defining visual models |
US7877700B2 (en) | 2007-11-20 | 2011-01-25 | International Business Machines Corporation | Adding accessibility to drag-and-drop web content |
US20100013613A1 (en) * | 2008-07-08 | 2010-01-21 | Jonathan Samuel Weston | Haptic feedback projection system |
US8239357B1 (en) * | 2008-09-12 | 2012-08-07 | Ryan, LLC | Method and system for extracting information from electronic data sources |
US20100134261A1 (en) * | 2008-12-02 | 2010-06-03 | Microsoft Corporation | Sensory outputs for communicating data values |
US8961313B2 (en) * | 2009-05-29 | 2015-02-24 | Sony Computer Entertainment America Llc | Multi-positional three-dimensional controller |
US8810365B2 (en) * | 2011-04-08 | 2014-08-19 | Avaya Inc. | Random location authentication |
TWI431516B (en) * | 2011-06-21 | 2014-03-21 | Quanta Comp Inc | Method and electronic device for tactile feedback |
US9582178B2 (en) | 2011-11-07 | 2017-02-28 | Immersion Corporation | Systems and methods for multi-pressure interaction on touch-sensitive surfaces |
US9245428B2 (en) | 2012-08-02 | 2016-01-26 | Immersion Corporation | Systems and methods for haptic remote control gaming |
US10102310B2 (en) * | 2015-05-08 | 2018-10-16 | Siemens Product Lifecycle Management Software Inc. | Precise object manipulation system and method |
USD779506S1 (en) * | 2015-07-27 | 2017-02-21 | Microsoft Corporation | Display screen with icon |
US10244342B1 (en) * | 2017-09-03 | 2019-03-26 | Adobe Systems Incorporated | Spatially representing graphical interface elements as binaural audio content |
KR102593965B1 (en) * | 2017-09-11 | 2023-10-25 | 엘지디스플레이 주식회사 | Display apparatus |
US11510817B2 (en) * | 2017-10-10 | 2022-11-29 | Patrick Baudisch | Haptic device that allows blind users to interact in real-time in virtual worlds |
IT201800009989A1 (en) * | 2018-10-31 | 2020-05-01 | Neosperience Spa | Method for managing an information interaction with a user, software program for performing said method and electronic device equipped with said software |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS57207929A (en) * | 1981-06-17 | 1982-12-20 | Toshiba Corp | Movement detecting system |
GB8315630D0 (en) * | 1983-06-07 | 1983-07-13 | Pathway Communications Ltd | Electronic memory devices for blind |
US4687444A (en) * | 1986-03-31 | 1987-08-18 | The United States Of America As Represented By The Administrator Of The National Aeronautics & Space Administration | Braille reading system |
US4881900A (en) * | 1986-07-23 | 1989-11-21 | Canon Kabushiki Kaisha | Sensing display apparatus of image pattern |
NL8700164A (en) * | 1987-01-23 | 1988-08-16 | Alva | WORK STATION, EQUIPPED WITH A BRAIL READING RULE. |
US5068645A (en) * | 1987-10-14 | 1991-11-26 | Wang Laboratories, Inc. | Computer input device using an orientation sensor |
DE3901023A1 (en) * | 1989-01-14 | 1990-07-19 | Ulrich Dipl Ing Ritter | Reading device for blind or visually impaired persons with a scanner reading normal script |
FR2652933B1 (en) * | 1989-10-09 | 1993-10-29 | Ioan Montane | METHOD AND DEVICE FOR IMPROVING THE BLIND READING OF A COMPUTER SCREEN USING A BRAILLE DACTYL DISPLAY. |
- 1991
  - 1991-08-22: US US07/748,996 patent US5186629A (en), status: not active, Expired - Fee Related
- 1992
  - 1992-05-12: CA CA002068452A patent CA2068452C (en), status: not active, Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CA2068452A1 (en) | 1993-02-23 |
US5186629A (en) | 1993-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2068452C (en) | Virtual graphics display capable of presenting icons and windows to the blind computer user | |
Kline et al. | Improving GUI accessibility for people with low vision | |
US7253807B2 (en) | Interactive apparatuses with tactiley enhanced visual imaging capability and related methods | |
JP2826195B2 (en) | Electronic display and data processing apparatus and method | |
TWI321290B (en) | System and method for making user interface elements known to an application and user | |
Wall et al. | Feeling what you hear: tactile feedback for navigation of audio graphs | |
JP3941292B2 (en) | Page information display method and apparatus, and storage medium storing page information display program or data | |
US6046722A (en) | Method and system for enabling blind or visually impaired computer users to graphically select displayed elements | |
US7337389B1 (en) | System and method for annotating an electronic document independently of its content | |
US7114129B2 (en) | Method and system for controlling an application displayed in an inactive window | |
US20050015731A1 (en) | Handling data across different portions or regions of a desktop | |
JP2000207089A (en) | Method and device for displaying hypertext document | |
JP2003531428A (en) | User interface and method of processing and viewing digital documents | |
JP2002502999A (en) | Computer system, method and user interface components for abstraction and access of body of knowledge | |
JPH06501798A (en) | Computer with tablet input to standard programs | |
JP2001084273A (en) | System and method for presentation control and physical object generating system | |
KR20050094865A (en) | A programmable virtual book system | |
JPH0555893B2 (en) | ||
Buxton | Human skills in interface design | |
US5511187A (en) | Method and system for nonvisual groupware participant status determination in a data processing system | |
Buxton | The three mirrors of interaction: a holistic approach to user interfaces | |
JP2940846B2 (en) | Hypertext display system | |
Bornschein | BrailleIO–a tactile display abstraction framework | |
Mynatt et al. | The Mercator Environment a Nonvisual Interface to X Windows and Unix Workstations | |
US20220147693A1 (en) | Systems and Methods for Generating Documents from Video Content |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request | ||
MKLA | Lapsed |