CA2081910C - Hands-free hardware keyboard - Google Patents

Hands-free hardware keyboard

Info

Publication number
CA2081910C
CA2081910C CA002081910A CA2081910A
Authority
CA
Canada
Prior art keywords
key
input
keyboard
controller
computer system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CA002081910A
Other languages
French (fr)
Other versions
CA2081910A1 (en)
Inventor
Donald E. Drumm
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wang Laboratories Inc
Original Assignee
Wang Laboratories Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wang Laboratories Inc filed Critical Wang Laboratories Inc
Publication of CA2081910A1
Application granted
Publication of CA2081910C
Anticipated expiration
Expired - Fee Related

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks

Abstract

A computer system is provided which allows keyboard access in a hands-free environment. An orientation sensor is mounted on a headset and provides a Cartesian positional input to an interface. The device functions in either a cursor control mode or a keyboard simulation mode. In keyboard simulation mode, positional inputs are converted to keyboard input codes which are input to a standard keyboard controller. The key codes generated are displayed to a user on an LED display. A microphone on the headset connects to voice control circuitry of the interface which allows input selections to be made through voice commands. A user locates the desired keyboard input by observing the LED display and selects the input with a vocal command. RAM memory storage of keyboard inputs allows the system to appear as a manual keyboard from the perspective of the keyboard controller.

Description


HANDS-FREE HARDWARE KEYBOARD
Background of the Invention
There are various input devices (other than the cursor keys of a manual keyboard) which are known for positioning or controlling the movement of a cursor on a computer display screen. One of the more common in use is the conventional "mouse" device, in the form of a hand-sized housing which is moved over a flat desktop.
Motion over the desktop is sensed by means of a mechanically rotating ball or optically reflective sensor, and digital data are generated which translate into corresponding motion of the cursor on the display screen.
Other cursor positioning devices include the joystick and the graphics input tablet, which consists of a flat sensor pad and a hand-held pointing stylus and translates the analog motion of the pointing stylus into digitized data used to control the location of the cursor on the display screen.
Still other cursor movement devices rely on focused light sources, held by the user or fixed on the user's person, as on a pilot's helmet. As the user aims the light beam, sensors mounted around a display screen track the movement of the beam and translate this movement into a corresponding movement of the cursor on the display screen.
Devices such as those discussed above are basically effective, although they suffer from certain disadvantages. Most cursor positioning and controlling devices require a fixed, generally level surface upon which to operate, or must operate in conjunction with a stationary sensor of some type; that is, motion is sensed with respect to a fixed medium and positional data signals are generated and presented to a computer for translation into a corresponding cursor movement. The need for a fixed surface or stationary sensor constrains how the user may interact with the display device. The user must normally sit close to the display screen and be provided with ample level desk space for placement of the graphics tablet or movement of the "mouse". In the case of the stationary sensor, the user must confine movement to keep the light beam within range of the sensor array and aimed appropriately.
Still another disadvantage of many present-day cursor movement devices is that they require a level of manual dexterity for their effective use. This requirement has its greatest impact in the area of the physically disabled. To operate a computer, a disabled person may find that a mouse, a graphics tablet, or even a keyboard is useless to him.
Disadvantages of the prior art devices, such as the requirements of a flat desktop, aiming of a light source, or the manual dexterity of the user, were overcome in U.S.
Application Serial No. 07/267,413, filed November 4, 1988 and assigned to the assignee of the present invention.
The cited application discloses an input device which does not require manual control, and which can be used by a physically disabled person.
Summary of the Invention
Application Serial No. 07/267,413, cited in the background section, teaches a device which provides good hands-free control of an on-screen cursor. However, most software programs require keyboard inputs, and the device was limited to programs using on-screen cursor and select functions.
Software programs exist which provide software keyboards which "pop up" on the display screen of a host computer. However, these programs are often incompatible with other software programs, particularly if special use of the system video RAM is required. Furthermore, when the software keyboard is brought onto the display screen, it takes up much of the screen space, limiting user access to the screen.
In contrast to software keyboards, the present invention provides a hands-free keyboard input device which is transparent to accompanying software. No compatibility problems arise because it is a hardware device and appears as a manual keyboard to a host computer. In addition, since it does not interfere with the display screen, no loss of screen space is suffered.
The present invention makes use of a computer system having a processor, a display screen and a keyboard input port for receiving inputs from a mechanical keyboard. An input device of the system provides a Cartesian positional input to a controller in the system. The controller responds to the input signal by indicating individual keys on a key pad. The controller then provides an input to the keyboard input port of the processor.
In a preferred embodiment, a keyboard controller is provided which provides the input to the keyboard input port. The keyboard controller is a standard keyboard controller adapted to scan columns of mechanical key contacts and sense the state of the contacts. A key select input from the input device causes the controller to store in a random access memory a representation of a key indicated by the indicator. The keyboard controller periodically addresses the memory storage unit to read any new key representations. When a new key representation is read from the memory storage unit, the keyboard controller generates a corresponding input to the keyboard input port.
The transmission of key representations to and from the memory storage unit is controlled by buffer circuits which respond to an input from a user. A key select input causes the buffers to enable the storage of a key representation indicated by the key pad in the random access memory. The code is then read by the keyboard controller which generates a corresponding input to the keyboard input port. If the key representation designates a character key, the memory is cleared after being read by the keyboard controller. If the key representation designates a function key, additional indicators are used to indicate that a code representing a function key has been stored.
The key pad in the preferred embodiment has indicators which are a matrix of light emitting elements corresponding to a mechanical keyboard contact matrix. As the key representations change, the controller illuminates different elements to identify the particular key selection. The changes in the indicators of the array may be made to mimic the movement of the input device.
The preferred form of input device is an orientation sensor adapted to be worn as a headset, such as that presented in Application Serial No. 07/267,413. The orientation sensor modifies the positional input of the input device in response to changes in the spatial orientation of the input device. A microphone on the headset can be used to input key select functions by using voice commands which are processed in voice control circuitry of the computer system. In addition, an earphone attached to the headset may be provided to allow a user to receive audio signals generated by the computer system.
Brief Description of the Drawings
FIG. 1 is a perspective view of a user using the computer system of the present invention.
FIG. 2A is a perspective view of the orientation sensor input device of the present invention.
FIG. 2B is a diagrammatic illustration in partial cross section of the present invention, showing the location of the internal components.

FIG. 3 is a perspective view of the interface unit of the present invention.
FIG. 4 is a schematic illustration of the analog-to-digital conversion of the signal from the orientation sensor of the present invention.
FIG. 5 is a schematic illustration of the interface electronics of the present invention.
FIG. 6A is a block diagram of the voice control circuitry of the present invention.
FIG. 6B illustrates the timing diagrams associated with the voice control circuitry of the present invention.
FIG. 7 is a schematic illustration of the LED array of the present invention.
FIGS. 8A and 8B show a flow chart describing one aspect of the present invention.
FIG. 1 illustrates a computer workstation making use of the present invention. The system includes a computer input device 10, a keyboard simulation interface 12, a computer 14, and a display terminal unit 16 having a display screen 18.
The computer input device 10 is in the form of a headset to be worn by the computer user. The input device includes an orientation sensor 28 which provides an electrical analog signal that is related to the angle of tilt of the sensor 28 as determined in two axes. The user conveniently wears the device in the manner of a conventional audio headset. By controlling the orientation of the device by simple angular head movements, left-to-right, forward-to-back, or combinations of the two, the user effects a corresponding movement of a cursor displayed on the display screen 18. The computer input device 10 includes circuitry to digitize the analog signals generated by the orientation sensor for processing by the interface 12 and transmission to the computer 14.
A microphone and preamplifier circuit are also included, which, together with voice control circuitry in the interface 12, allows the user to perform standard select functions.
A cable 20 carries digitized signals from the computer input device 10 to the interface 12. The interface 12 contains circuitry for receiving digitized signals from the computer input device 10 and translating those signals into control signals which can be interpreted by the computer 14, which in turn controls the display terminal unit 16. The interface 12 includes voice control circuitry which provides a speaker-independent voice detection of two primary sound types -- fricative and voiced sounds. The interface 12 is programmed to simulate the operation of a standard cursor control device, and provides a standard RS-232C output to the computer 14.
In the preferred embodiment, the display screen 18 is a standard cathode ray tube, but any screen type may be used, including liquid crystal or projection displays.
Also in the preferred embodiment, the computer input device 10 simulates the functioning of a graphics input tablet. A user can therefore manipulate displayed data, make menu selections, or input graphic information on the display screen 18. Other devices may be easily simulated, such as a conventional "mouse." The manner in which the present invention operates will be discussed further in what follows.
Turning now to FIGS. 2A and 2B, the major components of the computer input device 10 are diagrammatically illustrated. The computer input device 10 includes a headband 24, preferably molded of durable plastic and having a shape and dimensions to fit comfortably around the head of the computer user. The headband 24 includes a soft foam insert 25 for supporting the headband 24 in a
manner which is comfortable to the user. These considerations for user comfort are especially important when the anticipated user may be physically disabled, or for any other reason where the computer input device 10 would be worn for extended periods of time.
A circuitry housing 26 is fixedly mounted on an end portion of the headband 24. A generally spherical sensor housing 27 contains an orientation sensor 28. The sensor housing 27 mounts to the circuitry housing 26 with a screw/bushing arrangement, allowing the sensor housing 27 to be pivoted by means of finger pressure. The angle of the spherical housing 27 may thus be adjusted with respect to the headband 24. This allows the user to initially center the cursor on the display screen 18.
An earphone 29 is mounted in the circuitry housing 26 and positioned for operator comfort. As shown in FIG. 2B, the earphone 29 will not press against the user's ear during normal use, due to the locations of the foam insert 25 and the circuitry housing 26 on the headband 24. The earphone 29 is used as an audio receiving device in certain computer applications including voice mail and audio messaging. In one application, the computer input device 10 provides a handicapped user with the ability to dial telephone numbers and conduct standard telephone communications through the computer 14, with the earphone 29 serving as the sound receiver.
A microphone 30 provides the user with a means for making audio commands, used to initiate conventional control functions, such as RETURN or SELECT. The microphone 30 is a conventional miniature directional microphone mounted on a microphone support member 31 and a pivot member 32. By means of the pivot member 32, the microphone 30 is adjusted upwardly or downwardly for positioning in front of the user's mouth.
A printed circuit board 34 contains the electronic circuitry for the computer input device 10. More specifically, this includes a preamplifier for the microphone 30, analog circuitry for the operation of the orientation sensor 28 and digital circuitry for communication with the interface 12. Interconnection wiring between the printed circuit board 34, earphone 29, cable 20, orientation sensor 28, and microphone 30 is not shown in FIG. 2B for simplicity. The microphone 30 is also connected through the preamplifier to the earphone 29, providing voice feedback to the user, as in a conventional telephone handset.
The cable 20 exits the lower portion of the circuitry housing 26, so that it will cause minimum interference during operation of the computer input device 10. Although direct cabling is used in the preferred embodiment to keep the computer input device 10 inexpensive, it is within the scope of the present invention to link the computer input device 10 to the interface 12 by means of conventional miniature transmitter or transceiver technology.
The orientation sensor 28 is fixedly mounted within the sensor housing 27 as shown in FIG. 2B. The headset using the orientation sensor 28 is the preferred form of input device for the present invention. The specific characteristics of the input device 10 are described in considerable detail in previously cited Application Serial No. 07/267,413.
In the preferred embodiment of the invention, the input device is allowed to control the cursor on the display screen 18 via the RS-232C output of the interface 12. Interface 12 receives the digitized signal from the input device 10 and adapts it to be received at the RS-232C input of the host computer 14. This allows selection mode software such as graphics programs to be fully controlled using the position sensor and the microphone input. However, an additional function of the interface 12 is to generate actual keyboard inputs to the host computer 14.
Using the input device to perform two-dimensional movement of the cursor on the display screen 18 is referred to as cursor control mode and is made to correspond to the head movements of a user wearing the headset 10. That is, tilting the head forward moves the cursor down and tilting it back moves the cursor up.
Similarly, tilting the head to the right moves the cursor to the right and tilting the head to the left moves the cursor to the left. To allow a user to access the keyboard input function of the interface 12, a lower or upper limit is put on the cursor movement beyond which cursor control is disengaged. In the preferred embodiment, a lower limit is used and is taken as the digitized code from the input device which corresponds to the lowest possible cursor line on the display screen 18.
This lower limit would therefore be exceeded by the user tilting the head forward beyond the angle necessary to locate the cursor at the bottom of the screen 18.
Once the lower limit of the cursor control mode is passed, the interface 12 enters a keyboard simulation mode. In this mode, the RS-232 output from the interface to the host computer is disabled. Inputs from the input device 10 are translated to digital codes recognizable to a standard keyboard controller as being key selections.
Since the number of different digitized outputs of the input device exceeds the number of possible keyboard key selections, more than one of the digitized outputs translates to the same keyboard key selection. Ranges of the x-y position coordinates represented by the digitized outputs of the input device are therefore allotted to each key representation. As the position of the orientation sensor changes enough to change the x-y coordinates from one designated range to another, the key selection code generated by the interface 12 changes to represent a different key input. Therefore, different keyboard key inputs may be designated by a user without the need for manual controls.
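The range allotment can be pictured with a small C sketch. It is illustrative only: the grid size, the coordinate scaling and the name key_from_position are assumptions, and the actual device allots wider ranges to physically larger keys, as described further below.

```c
#include <stdint.h>
#include <stdio.h>

#define KEY_COLS 16   /* columns of the simulated key matrix (assumed) */
#define KEY_ROWS 8    /* rows of the simulated key matrix (assumed)    */

/*
 * Map an 8-bit x/y positional input from the orientation sensor to a
 * key index.  Each key is allotted a rectangular range of coordinates;
 * here the ranges are simply equal-sized cells of a 16x8 grid.
 */
static int key_from_position(uint8_t x, uint8_t y)
{
    int col = (x * KEY_COLS) / 256;   /* which column range x falls in */
    int row = (y * KEY_ROWS) / 256;   /* which row range y falls in    */
    return row * KEY_COLS + col;      /* single key index 0..127       */
}

int main(void)
{
    printf("x=200, y=40 -> key index %d\n", key_from_position(200, 40));
    return 0;
}
```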
An enlarged perspective view of the interface 12 is shown in FIG. 3. The front surface of the interface is a key pad with an overlay which has the appearance of a keyboard which would ordinarily be used with the host computer 14. In the present embodiment, no manual keys are actually present on the key pad. However, all the possible keyboard selections are represented in a pattern simulating the relative key arrangement of an actual keyboard, and in an alternative embodiment these may actually be mechanical keys responsive to touch.
Furthermore, any alternative arrangement of keys may also be used with the present invention.
Underneath each key representation of the overlay of the present embodiment is an indicator which in the present embodiment is a light-emitting diode (LED) which is illuminated when selected by underlying interface circuitry. The portions of the overlay over each LED are sufficiently transparent to allow a user to visually identify when a particular LED has been illuminated. Printed on each transparent region is a symbol designating the key being represented. When a particular keyboard input has been selected with the input device, the LED
which resides underneath the corresponding key symbol of the overlay is illuminated. A user therefore knows which of the possible key input codes is being generated by the interface electronics. Other types of indicators may be used (including audio indicators) as long as the necessary user feedback is provided.
The mapping of the input device signals to the particular key selection codes generated by the interface 12 is roughly arranged to correspond to the location of the LEDs of the interface. For example, if the LED under the symbol "T" were illuminated, a key selection input to the interface 12 would result in the generation of the
keyboard input code for the symbol "T". If the orientation sensor was then tilted to the right far enough, the x-coordinate of the input device input would increase enough to shift the input into the next key selection range. Since the mapping of the ranges corresponds to the LED array, the next range to the right of the "T" range is the "Y" range. The movement would therefore cause the LED under the symbol "T" to be extinguished and the LED under the symbol "Y" to be illuminated. Correspondingly, a key selection input to the interface 12, while illuminating the "Y" diode, would result in the generation of the keyboard input code for the symbol "Y". In this manner, the successive illumination of the LEDs in the array follows the movement of the input device and the generation of key codes.
As shown in FIG. 3, some of the keys of the keyboard overlay are of different sizes and shapes. The mapping of the positional inputs to the key codes is done so as to best correspond to the arrangement of keys on the overlay. Therefore, keys which are wider in the x-direction (such as the "space" key) have wider mapping ranges in the x-direction. Similarly, keys which are larger in the y-direction (such as the "shift" keys) have bigger mapping ranges in the y-direction. These mapping techniques help the successive lighting of the LEDs in the array to best mimic the movement of the input device.
It is often difficult for a user to keep the input device 10 completely still while moving to a particular LED. To prevent fluttering between two adjacent LEDs because of inadvertent head movement, a threshold value must be exceeded by the positional input to extinguish one LED and illuminate an adjacent one. Therefore, the positional input must not only change enough to designate the next range, but must exceed the crossover point between the ranges by the threshold value before the interface 12 switches to illuminate the new LED. Using the example above, if the LED under the symbol "T" were being illuminated, tilting the input device 10 far enough to the right would eventually cause the change to the LED
under the symbol "Y". However, because of the threshold requirement, the positional input for the x-coordinate must increase to the lower x-value limit of the "Y" range and then increase further by the threshold amount before the switch to the "Y" LED is made.
The threshold value requirement is the same for each of the LED ranges for both the positive and negative x and y directions of movement. Thus, once the "Y" LED had been illuminated, a user desiring to return to the "T" LED must tilt the input device far enough left to decrease the x-coordinate until the lower limit of the "Y" range is exceeded by the threshold value. In the present embodiment, the threshold value is approximately equal to 1.5 times the length or width of one of the smaller alphanumeric keys. This threshold is the same for each key on the overlay regardless of the size of the key or its mapping range.
The following example demonstrating the threshold requirement assumes that the user starts with the LED under the character "T" being illuminated. In order to switch from the "T" LED to the "Y" LED, the user increases the x-coordinate of the positional input to go beyond the "Y" range and halfway across the "U" range (adjacent the "Y" range in the positive x-direction) before switching from "T" to "Y" occurs. Switching back to the "T" then requires that the positive x-coordinate is decreased beyond the "R" range (adjacent the "T" range in the negative x-direction) and halfway into the "E" range (adjacent the "R" range in the negative x-direction) before switching back to the "T" LED occurs. Since the threshold requirement affects movements in both the x and y directions, somewhat exaggerated head movements are required to switch between adjacent LEDs. This allows more defined user control, and eliminates the problem of LED flutter near the borders of the positional input ranges.
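This hysteresis can be modelled with a short C sketch. The range width, threshold value and starting column below are illustrative only (the patent states a threshold of roughly 1.5 key widths, applied in both axes); it is one reasonable reading of the scheme, not the interface firmware.

```c
#include <stdint.h>
#include <stdio.h>

#define RANGE_WIDTH 16          /* coordinate units per key column (assumed) */
#define THRESHOLD   24          /* ~1.5 key widths of extra travel           */

static int current_col = 5;     /* column of the LED currently lit (assumed) */

static void update_column(uint8_t x)
{
    /* Step right only once x clears the right border by THRESHOLD. */
    while (x >= (current_col + 1) * RANGE_WIDTH + THRESHOLD)
        current_col++;
    /* Step left only once x clears the left border by THRESHOLD.   */
    while ((int)x < current_col * RANGE_WIDTH - THRESHOLD)
        current_col--;
    /* Anything in between is treated as head wobble and ignored.   */
}

int main(void)
{
    const uint8_t samples[] = { 84, 103, 121, 90, 70 };
    for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++) {
        update_column(samples[i]);
        printf("x=%3u -> column %d\n", (unsigned)samples[i], current_col);
    }
    return 0;
}
```

In the sample run the input drifts well into the neighbouring range (x=103) without changing the lit column; only the larger excursion to x=121 switches it, which is the flutter-free behaviour described above.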
The key pad of interface 12 is positioned directly below the display screen 18, and the control in keyboard simulation mode is made contiguous with control in the cursor position mode. From cursor control mode, the user's head is tilted forward enough to bring the position signal below the predetermined lower limit. The cursor control mode is then disabled by the interface and the keyboard simulation mode is enabled. The lower limit of the cursor control mode must be exceeded by a threshold value before the user can enter keyboard simulation mode.
This prevents inadvertent switching between modes.
The threshold value between the two modes is made large enough so a distinctly exaggerated downward head movement must be made before the keyboard simulation mode is entered.
Although the movement of the orientation sensor in the keyboard simulation mode allows the selection of different keyboard inputs, no key signal is actually transmitted to the host computer 14 until the user verifies the selection with a voice input to the voice control circuitry via microphone 30. The voice control circuitry recognizes two different sounds when in cursor position mode, but the sounds are not distinguished in keyboard simulation mode. Since all keyboard options are available to the user in this mode, there is no need for recognizing different vocal inputs. However, the input voice signals must be of relatively short duration to be accepted by the voice control circuitry. This aspect reduces the incidence of inadvertent inputs from background noise, and is discussed in more detail with reference to FIGS. 6A and 6B. Once the LED over the desired key is illuminated, the user simply speaks an appropriate input sound into the microphone 30 and the key selection is transmitted to the host computer 14.
The electronics of the input device 10 are illustrated in part in FIG. 4. Orientation sensor 28 outputs two differential analog voltages representative of absolute x and y coordinate positions. The differential voltages are transmitted along four lines to analog-to-digital converter (ADC) 86. In the present embodiment, the ADC 86 is a Texas Instruments type ADC0834. The ADC 86 employs a data comparator structure to differentially compare the analog input signals. The digital output data has eight-bit resolution which indicates the resultant magnitude and sign of the orientation sensor signals. A clock signal (CLK) and a chip select strobe are received from interface 12 for scanning data at the ADC 86. In addition, control signals are also sent to the ADC from the interface along the "data in" line, line DI, to designate which of the differential signals should be output and at which polarity. Data is output serially along the "data out" line, line DO, in response to commands received from the interface 12.
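A firmware-level sketch of that exchange is given below. It is only an illustration under stated assumptions: the gpio_write/gpio_read/delay_us helpers and pin names are hypothetical stand-ins for whatever port access the controller actually uses, and the exact start-bit and multiplexer configuration sequence must be taken from the ADC0834 datasheet rather than from this sketch.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

enum { PIN_CS, PIN_CLK, PIN_DI, PIN_DO };   /* logical pin names (assumed) */

/* Stubs so the sketch compiles stand-alone; real firmware would drive
 * and sample the actual port pins here.                                */
static void gpio_write(int pin, bool level) { (void)pin; (void)level; }
static bool gpio_read(int pin)              { (void)pin; return false; }
static void delay_us(unsigned us)           { (void)us; }

/*
 * Read one 8-bit conversion from a serially interfaced ADC such as the
 * ADC0834: pull CS low, clock the channel/polarity configuration out
 * on DI, then clock the 8-bit result in MSB-first on DO.  This only
 * illustrates the CLK/CS/DI/DO handshake used to fetch position data.
 */
static uint8_t adc_read(uint8_t mux_cfg, int cfg_bits)
{
    uint8_t result = 0;

    gpio_write(PIN_CS, false);                  /* select the ADC        */

    for (int i = cfg_bits - 1; i >= 0; i--) {   /* shift out config bits */
        gpio_write(PIN_DI, (mux_cfg >> i) & 1);
        gpio_write(PIN_CLK, true);  delay_us(5);
        gpio_write(PIN_CLK, false); delay_us(5);
    }

    for (int i = 0; i < 8; i++) {               /* shift in 8 data bits  */
        gpio_write(PIN_CLK, true);  delay_us(5);
        result = (uint8_t)((result << 1) | gpio_read(PIN_DO));
        gpio_write(PIN_CLK, false); delay_us(5);
    }

    gpio_write(PIN_CS, true);                   /* deselect              */
    return result;
}

int main(void)
{
    /* With the stub GPIO helpers this always reads 0; on hardware it
     * would return the current eight-bit x or y sample.                */
    printf("sample = %u\n", (unsigned)adc_read(0x3 /* hypothetical mux bits */, 4));
    return 0;
}
```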
Both the earphone 29 and microphone 30 are also shown in FIG. 4 in block diagram form. Each requires two leads to communicate with the interface through the input device cable 20. The microphone 30 transmits vocal signals to voice control circuitry of the interface 12, while earphone 29 receives audio signals from the interface. A 5-volt power line and a ground (GND) line are also encased in the input device cable. In all, ten lines connect the input device 10 and the interface 12.
The internal circuitry of the interface 12 is shown schematically in FIG. 5. The inputs from the input device are shown arranged in a different order than shown in FIG. 4. It will be understood, however, that these arrangements have no physical significance, and serve illustrative purposes only. A ten-pin connector is provided at the end of cable 20 to allow the input device to be plugged into the interface 12. The two microphone connections are shown connected from the input device cable to voice control circuitry 108. The voice control circuitry receives audio signals input from the input device 10. This circuitry processes input voice signals from the user and provides a corresponding output to controller 110.
The voice control circuitry is shown in more detail in FIGS. 6A and 6B. FIG. 6A shows a block diagram of the voice control circuitry, and FIG. 6B shows the output waveforms and timing relationships associated with the outputs of the various blocks. The waveforms shown in FIG. 6B are each designated by a circled letter. These letters are also shown in FIG. 6A to identify the location of each waveform.
The voice control circuitry is basically a two-sound speech recognition device. The present invention makes use of the characteristic frequency ranges of certain phonetic sound types, rather than specific words spoken in specific languages. The frequency range of voiced sounds (caused by vibration of the vocal cords) is generally lower than the frequency range of fricative sounds (caused by air moving through narrowed air passages in the mouth). Words such as "no" end in a voiced sound;
most vowels are voiced sounds. Words such as "yes" end in a fricative sound ("sss").
Through microphone 30, the present speech recognition device detects short words that end in voiced or fricative sounds, such as "yes" and "no". To be detected, the sounds must be spoken in an isolated manner, rather than as part of continuous speech. In this way, conversation in the vicinity of the user will not cause commands to the control circuitry to be initiated.
The voice control circuit of the present invention is especially useful where the user speaks only a foreign language or possesses a speech difficulty or disability. In this case, the input device 10 may still be used effectively, as long as the user can make any voiced and fricative sounds. Furthermore, it is speaker independent and requires no training sessions with the user, as is required with some other speech recognition devices.
Microphone 30 receives the user's voice command, which is amplified by preamplifier 91, which also attenuates any frequency component below 100 hertz. The microphone 30 and accompanying preamplifier 91 are located within the input device 10. As shown in FIG. 6A, the output of preamplifier 91 is also presented to a high pass filter 92, a low pass filter 93, and a detector 94. The high pass filter 92 rejects all signal frequencies below 1500 hertz. The low pass filter 93 rejects all signal frequencies above 400 hertz. The detector 94 outputs a logic high whenever a sound signal is received by the microphone 30. The logic high signal from the detector 94 is clocked through an eight-bit shift register 95 by an oscillator 96, which operates in the 20 to 30 hertz range.
The output signals of the high pass filter 92 and the low pass filter 93 are presented to detectors 96, 97 which square up the signals. These outputs, in turn, are presented to latches 98, 99 which are clocked by the third bit of the shift register 95. By using the third bit instead of the first, the voice control circuitry disregards the first part (approximately 150 milliseconds) of the command word and determines whether the last part of the command word is a voiced or fricative sound.
Finally, the output of the shift register 95 is gated such that a logic high occurs if and only if the received sound has a duration of 250 to 400 milliseconds and is preceded and followed by a momentary pause in the audio signal. Any sound longer or shorter than the specified window size is ignored, along with those which are part of a continuous stream of sound, such as words embedded in sentences. When the command word has the correct parameters, it is clocked through a latch 100 if it ends in a fricative sound or through a second latch 101 if it ends in a voiced sound.
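The gating just described can be modelled in software. The sketch below is illustrative only: the 25 Hz sample rate, the names and the simple duration/band test are assumptions standing in for the analog filters, detectors, shift register and latches of FIG. 6A, and the requirement for a pause before the sound is not modelled.

```c
#include <stdbool.h>
#include <stdio.h>

#define SAMPLE_MS     40      /* one shift-register clock at ~25 Hz (assumed) */
#define MIN_MS        250     /* accepted command word length ...             */
#define MAX_MS        400     /* ... bounded by a pause afterwards            */

enum command { CMD_NONE, CMD_VOICED /* e.g. "no" */, CMD_FRICATIVE /* e.g. "yes" */ };

static unsigned sound_ms = 0;           /* length of the current sound        */
static bool last_was_fricative = false; /* >1500 Hz energy at the sound's end */

/* Call once per sample clock with "is sound present" and "is the energy
 * currently in the fricative band"; a command is reported only when the
 * sound ends with an acceptable duration.                                */
static enum command voice_sample(bool sound_present, bool fricative_band)
{
    if (sound_present) {                /* sound continues: just measure it  */
        sound_ms += SAMPLE_MS;
        last_was_fricative = fricative_band;
        return CMD_NONE;
    }

    unsigned len = sound_ms;            /* sound ended (or silence continues) */
    sound_ms = 0;
    if (len < MIN_MS || len > MAX_MS)   /* too short, too long, or no sound   */
        return CMD_NONE;
    return last_was_fricative ? CMD_FRICATIVE : CMD_VOICED;
}

int main(void)
{
    /* ~320 ms of sound whose tail falls in the fricative band ("yes").      */
    for (int i = 0; i < 8; i++)
        voice_sample(true, i >= 5);
    if (voice_sample(false, false) == CMD_FRICATIVE)
        puts("accepted as a fricative-ending command");
    return 0;
}
```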
The "yes" and "no" inputs are used in cursor control mode as standard SELECT and RETURN function inputs. The "yes" and "no" signals of the voice control circuitry are input to controller 110 via external switch 109, shown in FIG. 5. Switch 109 is a plug port for plugging in the inputs of an alternative input device such as a joystick.
If no plug is inserted in the switch outlet, the switch remains closed allowing the input of the signals from the voice control circuitry to the controller 110. However, when a plug from an alternative input device is instead in the outlet, the plug breaks the contact between the voice control circuitry leads and the controller leads. The plug insertion at the same time creates a connection between the plug contacts and the controller leads. In this way, the yes/no inputs may alternatively be obtained from the select buttons of a joystick or other alternative input device.
Referring to FIG. 5, the lines CLK, CS, DI and DO from the input device 10 are connected directly to controller 110 which, in the present embodiment, is a Motorola microprocessor (No. MC68HC05C8). The controller is programmed to strobe the ADC 86 of the input device 10 along line CS. In response to the strobe signal, serial position data is transmitted from the ADC 86 to the controller 110 along line DO. The digitized position data is received by the controller 110 which stores the data in a temporary storage register. The timing of the transmission and reception of the serial position data is controlled by the clock signal transmitted from the controller 110 to the ADC along line CLK. This ensures that the ADC and the controller are synchronized to achieve accurate transmission of the serial position data. An oscillator 112 is connected to the controller 110 and provides the absolute timing signal from which the clock signal is derived.
A series of dual in-line package (DIP) switches 115 are shown connected to controller 110. These switches allow a user to change various parameters of the controller operation. In the present embodiment, these parameters include the speed of the cursor movement in cursor control mode, the polarity of the cursor movement, whether the interface is positioned above or below the display screen (i.e., whether an upper or lower limit is used), the baud rate of the serial communication through the RS-232 port, and the size of the threshold between key code mapping ranges. Another of the DIP switches of the present embodiment activates an automatic selection function in keyboard simulation mode. When this switch is on, a controller key code output which is not changed for more than 4 seconds is automatically selected. Therefore, if this switch is on, no selection inputs are necessary to select keyboard inputs to the controller.
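As an illustration only, the DIP-switch options could be modelled as a small configuration structure, with the automatic selection treated as a 4-second timeout on an unchanged key code. The field names and the switch-to-field assignment below are assumptions; only the list of parameters comes from the text above.

```c
#include <stdbool.h>
#include <stdint.h>

struct dip_config {
    uint8_t cursor_speed;      /* cursor speed in cursor control mode     */
    bool    invert_polarity;   /* polarity of cursor movement             */
    bool    interface_above;   /* above the screen -> use an upper limit  */
    uint8_t baud_select;       /* RS-232 baud rate selection              */
    uint8_t key_threshold;     /* hysteresis between key code ranges      */
    bool    auto_select;       /* auto-select an unchanged key code       */
};

#define AUTO_SELECT_MS 4000    /* key code unchanged this long -> selected */

/* Returns true when the auto-selection switch is on and the currently
 * displayed key code has not changed for at least 4 seconds.            */
bool auto_select_due(const struct dip_config *cfg, uint32_t ms_since_key_change)
{
    return cfg->auto_select && ms_since_key_change >= AUTO_SELECT_MS;
}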
When the x-y position data is received by the controller 110, the y-coordinate is monitored to see if its value exceeds the lower limit of the cursor control mode by the required threshold value. If not, the position data is formatted to simulate the operation of a graphics input device and output to the host computer 14 from RS-232 output 114. However, once the lower y-coordinate threshold is exceeded, the RS-232 port is disabled.
When the y-coordinate of the position data decreases beyond the lower limit and threshold value recognized by the controller 110, the controller enters keyboard simulation mode. In this mode, the x-y coordinates of the input position data are used by the controller 110 as a look-up address to get keyboard key data. A different range of x-y coordinate inputs applies to each key code generated by the controller. Each key code is different and corresponds to a different key which would be found on a keyboard intended for use with the host computer 14.
The threshold value requirement discussed previously with regard to switching in the LED array is actually a requirement for the generation of different key codes.
The generation of a new key code requires the exaggerated movement of the input device to switch from one key code to the other. The LEDs respond directly to the key code output and therefore act as a representation of the key code which is being generated by the controller 110.
As shown in FIG. 5, the controller 110 has seven parallel data output lines (controller pin numbers 22-28) generally indicated at 116. The key code generated by the controller is a 7-bit representation output on these lines.
For convenience, the seven bits of the key code output by the controller 110 will be referred to as X0-X6, with X0 being the highest order bit and X6 being the lowest order bit. Receiving the 7-bit key code from the controller 110 are decoder/drivers 118, 120, 122. Decoder 118 is a three-to-eight line decoder which receives the three lowest order bits X4-X6 of the key code output by the controller 110. Decoders 120, 122 are also three-to-eight line decoders and work in parallel with one another, each receiving key code bits X1-X3. Bit X0 (from controller pin number 22) of the 7-bit key code is applied to the enable lines of the decoders 120, 122 to allow them to function jointly as a four-to-sixteen line decoder. Bit X0 is input directly to the enable input of decoder 120, but is inverted by inverter 126 before being input to the enable input of decoder 122. Thus when bit X0 is low, decoder 120 is enabled, and the disabled decoder 122 has an all-zero output. Similarly, when X0 is high, decoder 120 is disabled and decoder 122 decodes the bits X1-X3.
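In software terms, the decoder arrangement amounts to splitting the 7-bit key code into an 8-way row index and a 16-way column index. The sketch below only illustrates that addressing (bit positions follow the X0-X6 convention stated above; the function name is an assumption):

```c
#include <stdint.h>
#include <stdio.h>

/*
 * Bits X4-X6 select one of eight rows (decoder 118); bits X1-X3 select
 * one of eight columns (decoder 120 or 122); bit X0 decides which of
 * the two column decoders is enabled, so the pair acts as a single
 * 4-to-16 line decoder.
 */
static void decode_key_code(uint8_t key_code, int *row, int *col)
{
    int x0    = (key_code >> 6) & 0x1;   /* highest-order bit X0        */
    int x1_x3 = (key_code >> 3) & 0x7;   /* column bits X1-X3           */
    int x4_x6 =  key_code       & 0x7;   /* row bits X4-X6 (lowest)     */

    *row = x4_x6;                        /* one of 8 row lines          */
    *col = (x0 << 3) | x1_x3;            /* one of 16 column lines      */
}

int main(void)
{
    int row, col;
    decode_key_code(0x2B, &row, &col);   /* arbitrary 7-bit key code    */
    printf("LED at row %d, column %d\n", row, col);
    return 0;
}
```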
The function of the three decoders 118, 120, 122 is to drive LED display 124. As is more clearly shown in FIG. 7, the grid wiring of the decoders and the LEDs of the array allows the lighting of each LED to indicate the output of a different 7-bit key code from the controller 110. In this manner the key code being generated by the controller is identified by a system user. As shown in
the enlarged view 128 of an individual LED wiring arrangement, the presence of a sufficiently positive voltage on one of the output lines of decoders 120, 122 results in the lighting of a particular LED along that line if the crossing line for that LED from decoder 118 is at a low voltage. Thus, a decoded output code is used to set up a voltage across one of the LEDs in the array such that the LED is illuminated. The illuminated LED is that which is coupled between a row selected by decoder 118 and a column selected by decoders 120 and 122.
The LED array shown schematically in FIG. 7 corresponds to the LEDs under the keyboard overlay of the interface 12 shown in FIG. 3. The LEDs are arranged underneath the overlay which looks like a keyboard but has indicators rather than keys. When a key code generated by the controller 110 reaches the decoders, the LED positioned under the key on the overlay which corresponds to that key code is illuminated. The switching from one LED to another is directly controlled by the changing key codes output by the controller 110. Therefore, the LED
illuminated always indicates the key which is represented by the key code being generated by the controller 110.
Each key represented on the key pad overlay has an LED of the array 124 which illuminates it from underneath. In the present embodiment these LEDs are red. However, in addition to the red LEDs, some of the keys of the overlay also have a green LED which is controlled separately from the LEDs of array 124. Some of the keys having a red and a green LED are the so-called "function keys" which are used for selecting alternative inputs of the "character keys". In the present embodiment, these keys include two "shift" keys, two "control" keys, and two "alternate" keys. The green LEDs for the function keys are shown being controlled by latch 132 in FIG. 5 and FIG. 7. Although these schematic figures show the green LEDs in a different location than the red LEDs, each green LED is actually positioned adjacent a red LED under the appropriate function key shown on the overlay. The overlay of FIG. 3 shows each function key having a line separating it into two sides.
This is to indicate that two LEDs reside under these keys, one to either side of the line. These lines do not necessarily exist on the actual keyboard overlay.
To implement function keys on a manual keyboard, a user would depress and hold a function key while then striking a character key. A method of "holding" a function key is therefore provided with the present invention. Latch 132 powers a green LED to identify that the function key designated by that LED is being "held down" pending the selection of a character key. The specifics of the function key control will become clear in the description which follows.
Referring back to FIG. 5, the correct delivery of a desired key code to the keyboard input of the computer 14 is by means of a random access memory (RAM) 130 used in conjunction with keyboard controller 138 and a decoder 136. The keyboard controller 138 is the same controller that is used with a manual keyboard compatible with the computer 14. In the present embodiment, the keyboard controller is model number 70C42 made by Intel Corporation. The keyboard controller 138 scans RAM
addresses as if it were scanning a manual keyboard array.
Therefore, from the perspective of the keyboard controller, the key code generation circuitry of the interface 12 appears as a manual keyboard. However, what is actually being scanned is the RAM 130 in which the generated key codes are temporarily stored.
While in keyboard simulation mode, changes in orientation of the headset input device 10 cause the key codes generated by the controller 110 to change as well.
These changes are continuously monitored by the user through observation of the LED array 124 which identifies the key codes being generated. While the three least significant bits of the key code are input to decoder 118 of the LED array, they are simultaneously input to three-to-eight line decoder 136. The decoder 136 decodes the three bits for input to RAM 130. The four most significant bits of the key code, X0-X3, are then used to select a 4-bit address in RAM 130 at which the output of decoder 136 is stored. Once a code is stored in memory 130, the code can be scanned by keyboard controller 138 which identifies it as a key selection. Keyboard controller 138 continuously scans RAM for keyboard inputs by sequencing through all possible 4-bit addresses. Each different address is taken to be a different "column" of the simulated keyboard grid. At each address is an eight-bit data code which simulates the eight "rows" of the simulated keyboard grid. The keyboard controller 138 has eight input lines which read the 8-bit code for each address. When one of the bits is non-zero, the keyboard controller regards the non-zero element as indicating a key press. The key is identified by coordinating the "column" represented by the current address with the row designated by the particular non-zero bit. Once the key is identified, the controller generates a keyboard input which is output to the host computer 14 on keyboard input cable 139.
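A rough software analogue of this scan is shown below. It is illustrative only: the array and function names are assumptions, and the real 70C42 keyboard controller emits scan codes to the host rather than printing.

```c
#include <stdint.h>
#include <stdio.h>

static uint8_t key_ram[16];     /* the dual-ported RAM being scanned            */

/*
 * Sixteen 4-bit addresses play the part of keyboard "columns", and each
 * stored 8-bit value plays the part of the eight "rows".  A non-zero
 * bit marks a key press at that column/row intersection.
 */
static void scan_simulated_matrix(void)
{
    for (int column = 0; column < 16; column++) {
        uint8_t rows = key_ram[column];
        for (int row = 0; row < 8; row++) {
            if (rows & (1u << row)) {
                /* The real controller would emit the scan code for this */
                /* column/row pair on the host keyboard input cable.     */
                printf("key press at column %d, row %d\n", column, row);
            }
        }
    }
}

int main(void)
{
    key_ram[5] = 1u << 2;       /* one stored key representation               */
    scan_simulated_matrix();
    return 0;
}
```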
To allow the selection of particular key codes by a user, buffers 142, 144, 146, 148 are provided which work in conjunction with buffer 150. Buffers 142, 146 are normally disabled, and isolate RAM 130 from the output decoder 136 and the address bits X0-X3 of the key code.
Buffers 144, 148 are normally enabled, and control the transmission of information between RAM 130 and the keyboard controller 138. Thus, while a user is not selecting a key, but is changing the key code by moving the input device, buffers 142, 146 are disabled and prevent the input of information to RAM 130. However, during this time buffers 144, 148 are enabled, allowing addresses to be transmitted from the keyboard controller to the RAM, and RAM outputs to reach the keyboard controller.
When the user wishes to select a key code identified by the LED display, a "yes" or "no" command is given by speaking into the microphone 30. If a different input device is being used, similar inputs are provided at switch 109. The inputs are provided to the controller 110 which responds by generating several different output signals.
First, the clock pulse on line CLK of the controller is suspended, and the CLK line is temporarily held high.
Since the CLK line delivers a signal to the enable input of decoder 136, this keeps the decoder temporarily enabled (when disabled, the decoder 136 output is all zeros). In addition, an output to the write input of the RAM is provided (from controller pin number 4) to allow writing into the memory from the decoder.
Finally, the output from pin 21 of the controller to the buffer 150 is temporarily driven low. When the signal goes low, the status of buffers 142, 146 and 144, 148 is reversed, buffers 142, 146 being enabled and buffers 144, 148 being disabled. The signal from the controller 110 to the buffers is also input through an inverter to the data enable (DE) input of the RAM 130. The voltage at this input is ordinarily low to allow the output of data along the data lines in response to the sequencing of input addresses from the keyboard controller 138. However, the pulse on the buffer line from the controller disables the data out function, allowing the writing in of the decoder 136 output.

When buffers 144, 148 are disabled, the keyboard controller 138 can no longer access the RAM memory 130.
The data inputs of the keyboard controller 138 are held high during this period by resistors 152 which are fed by a 5-volt power supply. The enabling of buffers 142, 146 allows the key code currently displayed by the LED display to be input to RAM 130. The bits X0-X3 of the key code are input to the RAM address ports to designate a particular 4-bit address. Meanwhile, bits X4-X6 of the key code are decoded by decoder 136 and output to the data ports of RAM 130. The logic low signal from the controller 110 to the write enable input of the RAM 130 allows the writing of the decoder 136 output into the RAM 130. The eight-bit data is thereby stored in the RAM memory at the address designated by the 4-bit address of bits X0-X3.
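Expressed in software terms (bit numbering follows the X0-X6 convention above; the array and function names are assumptions, not the controller's firmware), the storage step looks roughly like this:

```c
#include <stdint.h>
#include <stdio.h>

static uint8_t key_ram[16];          /* the RAM the keyboard controller scans  */

/*
 * When the user selects a key, the four high-order bits X0-X3 of the
 * 7-bit key code pick a RAM address (a "column"), and the three
 * low-order bits X4-X6 are decoded into a one-hot value (a "row") that
 * is written there.
 */
static void store_key_code(uint8_t key_code)
{
    uint8_t address = (key_code >> 3) & 0x0F;   /* bits X0-X3            */
    uint8_t row     =  key_code       & 0x07;   /* bits X4-X6            */
    key_ram[address] = (uint8_t)(1u << row);    /* decoder 136 output    */
}

int main(void)
{
    store_key_code(0x2B);   /* arbitrary 7-bit key code: address 5, row 3 */
    printf("ram[5] = 0x%02X\n", (unsigned)key_ram[5]);
    return 0;
}
```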
The low pulse to the buffer 150 is made long enough to allow the correct storage of the key code information.
Once the polarity becomes high again, the buffers 142, 144, 146, 148 return to their normal status. However, now when the keyboard controller resumes its scan of the RAM memory, the data of the new key code is read from the designated RAM address. The keyboard controller identifies the appropriate key and transmits a keyboard input to the host computer 14. The keyboard controller 138 also sends an audio confirmation signal (a "beep") to the user through earphone 29. Since the output from the controller 110 to buffer 150 is also input to the enable input of decoder 118, the generated low pulse to buffer 150 results in the temporary disabling of the LED display by bringing the "enable" input of decoder 118 low. This causes any LEDs currently lit to go out, thus providing a second means of confirmation to the user (who is observing the display) that the key code was input to the keyboard controller 138.
Once the code stored in RAM 130 has been read by the keyboard controller 138, the RAM memory must be cleared to prepare for receipt of the next key code. The controller 110 delays for a time great enough to ensure that the keyboard controller has read the key code out of memory, and then initiates a clearing of the memory 130. To clear the RAM, the controller once again activates the write enable input of the RAM, and discharges a low pulse to the buffer 150. However, for the clearing cycle, the decoder 136 is not enabled by a signal on the CLK line of the controller 110. With the decoder 136 disabled, the output on the data line of the decoder is all logic zeros. With the buffers 142, 146 enabled, the controller sequences through all the possible addresses on bits X0-X3. Since the RAM is responding to these address inputs, the all-zero data output of the decoder 136 is read into each of the 16 RAM addresses. Thus the RAM memory 130 is cleared, and after all the 4-bit addresses have been output by the controller 110, the signals controlling the buffer 150 and the write enable of the RAM are returned to their normal state.
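A software analogue of the clearing cycle (again only a sketch with assumed names) simply writes the disabled decoder's all-zero output to every 4-bit address:

```c
#include <stdint.h>

static uint8_t key_ram[16];   /* the 16 "columns" scanned by the keyboard controller */

/* With decoder 136 disabled its data output is all zeros, and the
 * controller sequences through every address so that each of the 16
 * RAM locations is overwritten with zero.                              */
static void clear_simulated_keyboard_ram(void)
{
    for (uint8_t address = 0; address < 16; address++)
        key_ram[address] = 0x00;
}
```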
A particular facet of the present invention is its ability to "hold down" the function keys "shift", "alternate", and "control". Since on a manual keyboard these keys would be held down while another key was pressed, the keyboard controller must identify both key codes simultaneously to generate the proper keyboard input. When a function key is selected with the system of FIG. 5, the key code information is input to RAM 130 and stored as an ordinary key press. Ordinarily, the memory is cleared after the key code is input to the keyboard controller. However, when an input to the controller is identified as selecting the key code of a function key, the ensuing controller operation is slightly different than that for a character key.
The storing of the function key code to RAM memory 130 is performed in the same manner as for a character key.
However, following the function key code storage, the controller outputs a signal to the enable input of latch 132, allowing it to latch a bit corresponding to the chosen function key, which is output by the controller on one of bit lines X1-X6. When the function key code is latched, the output of the latch 132 powers one of the green LEDs to identify the function key that has been selected and held. The controller 110 then sets the appropriate one of six flags internal to the controller, each of which corresponds to one of the six function keys. The controller does not initiate a memory clear function after the function key has been identified by the keyboard controller. Therefore, the key code remains stored in RAM
memory 130 until the selection of another key.
When a second key is selected, the usual storage procedure is implemented with the function key representation remaining in the RAM 130. The second key code is thereby stored in the RAM 130 along with the function key code. When the keyboard controller scans the RAM 130, it identifies both sequentially selected keys as being selected simultaneously. Thus the function key is seen by the keyboard controller 138 as being "held down".
Once both keys are identified by the keyboard controller 138, the memory 130, latch 132, and internal flags of the controller are all cleared.
To assure the simultaneous presence of function key and character key codes in the RAM 130, the key codes of the function keys must contain 4-bit addresses which are not shared by any keys which are used with function keys.
Otherwise, the selected character key code would overwrite the function key code in the selected memory address.
Referring back to FIG. 7, each column of the LED grid corresponds to a different 4-bit address contained in bits X0-X3 of the key code. As shown, the two "control" keys ("CTRL" and "CTRR") and the two "shift" keys ("SFTL" and "SFTR") are alone in their own columns. Therefore, no other keys share the same 4-bit address with either the "control" or the "shift" keys. The "alternate" keys ("ALTL" and "ALTR") share a different column only with keys which are not used with the "alternate" function keys. Therefore, no simultaneous address problems occur.
It will be noted with reference to FIG. 7 that the "control", "shift", and "alternate" function key codes are at different addresses from each other to allow the use of two function keys at the same time. In such a case, a first function key representation is stored at a first address of RAM 130 while the corresponding green LED is latched and a first controller flag is set. The RAM 130 remains uncleared while a second function key is selected and its code representation is input to a different address of the RAM 130. A second green LED is now latched and a second flag is set within controller 110. Meanwhile, RAM memory 130 still remains uncleared. When a character key is finally selected, the character key is input to a third 4-bit address of the RAM and the keyboard controller reads the three key representations stored in the RAM
130. Following the identification of a character key by the keyboard controller 138, the RAM 130, latch 132 and internal flags of the controller 110 are cleared. If a user has selected a function key, and then decides not to use that key, the key may be toggled off by selecting that key a second time. The controller identifies the function key as being held by the internal flag set when the key was selected. If a function key is selected when a flag indicates that the function is already being held, the latch 132 and the internal flag are cleared, and the controller initiates a memory clear function to remove the key representation from the RAM memory 130. The controller 110 individually identifies the six different function key flags, and allows a function key to be cleared with either of the keys performing that function. Therefore, if the left "shift" key was originally selected, selection of the right "shift" key while the shift is held results in the same clearing function as if the left "shift" had been selected a second time.
In the preferred embodiment, this function key memory clear function does not include the sequencing of addresses as performed in the normal clear function.
Instead, the controller just provides the address for the undesired function key to the RAM address inputs. In this way, any other function key codes which are being held in memory are retained. Similarly, not all the outputs of latch 132 and internal flags of the controller 110 are cleared, but only the one corresponding to the undesired function key selection.
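Putting the hold, toggle-off and pair-release behaviour together, a minimal software sketch might look like the following. The enum, flag array, helper names and printed output are all assumptions; the real controller manipulates latch 132 and RAM 130 directly.

```c
#include <stdbool.h>
#include <stdio.h>

enum fn_key { FN_SHIFT_L, FN_SHIFT_R, FN_CTRL_L, FN_CTRL_R,
              FN_ALT_L, FN_ALT_R, FN_COUNT };

static bool fn_held[FN_COUNT];      /* the controller's six internal flags     */

/* Stand-ins for latch 132 and the RAM 130 store/clear operations.             */
static void green_led_set(enum fn_key k, bool on) { printf("green LED %d %s\n", (int)k, on ? "on" : "off"); }
static void ram_store_fn_key(enum fn_key k)       { printf("store fn key %d\n", (int)k); }
static void ram_clear_fn_key(enum fn_key k)       { printf("clear fn key %d (single address)\n", (int)k); }

static void select_function_key(enum fn_key k)
{
    enum fn_key pair = (enum fn_key)(k ^ 1);    /* other key of the same function */

    if (fn_held[k] || fn_held[pair]) {          /* already held: release it       */
        enum fn_key held = fn_held[k] ? k : pair;
        ram_clear_fn_key(held);                 /* clear only that key's address  */
        green_led_set(held, false);
        fn_held[held] = false;
    } else {                                    /* first selection: hold the key  */
        ram_store_fn_key(k);                    /* code left in RAM, not cleared  */
        green_led_set(k, true);
        fn_held[k] = true;
    }
}

int main(void)
{
    select_function_key(FN_SHIFT_L);    /* hold left shift                      */
    select_function_key(FN_SHIFT_R);    /* right shift releases the same hold   */
    return 0;
}
```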
Other toggle keys which normally exist on a manual keyboard are also used with the present invention.
However, the toggling of these keys is controlled by the standard keyboard controller 138. The toggle keys include the "numlock", "scrlock", and "caplock" keys. Since the toggling of these keys is handled by the keyboard controller 138, the key codes are input as if they were for standard character keys. Green LEDs are provided for these keys in addition to the red LEDs which designate them in the LED array. The control of the green LEDs is provided by outputs from the keyboard controller 138.
Similar to the function keys, the representations of the toggle keys in the overlay of FIG. 3 show them divided down the middle to designate that two LEDs are associated with that key. These lines do not necessarily exist in the actual keyboard overlay.
Making reference to the flow chart of FIGS. 8A and 8B, an overview is provided of the operation of the computer system in keyboard simulation mode. The system runs through an initialization routine when turned on.
The controller 110 then begins scanning the ADC 86 of the input device 10 at block 160. If the y-coordinate input from the ADC 86 is not detected as being less than the
lower limit of the cursor control mode plus the threshold value at block 162, the RS-232 functions of the system are performed at block 164. If the y-coordinate is low enough, however, the system is placed in keyboard simulation mode and goes to block 166.
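For illustration only, the mode decision of blocks 160 through 166 might be written as the following C sketch; the limit and threshold values are placeholders rather than values taken from the specification.

extern int read_adc_y(void);            /* y-coordinate from ADC 86 (assumed hook) */

#define CURSOR_Y_LOWER_LIMIT 100        /* illustrative value only */
#define THRESHOLD              8        /* illustrative value only */

typedef enum { MODE_CURSOR, MODE_KEYBOARD_SIM } op_mode_t;

static op_mode_t decide_mode(void)
{
    int y = read_adc_y();

    /* Block 162: keyboard simulation only when the y-coordinate drops
     * below the cursor-mode lower limit plus the threshold value. */
    if (y < CURSOR_Y_LOWER_LIMIT + THRESHOLD)
        return MODE_KEYBOARD_SIM;       /* go to block 166 */

    return MODE_CURSOR;                 /* block 164: RS-232 cursor functions */
}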
As shown in block 166, the x-y coordinates of the input device positional input are used as a look-up table address, and a key code is formed. Input changes are checked in block 168. If the input has changed sufficiently to overcome the threshold value and designate a new key range, the key code is changed in block 170. As shown at 172, the key code generated by the controller is displayed at the key pad LED display. While the display is on, the controller monitors for any "yes"/"no" (or alternative) inputs from the user, as shown in block 174.
Both "yes" and "no" perform the same selection function in keyboard simulation mode. If no selection has been made, the system continues to monitor for inputs and display any new keys.
When a user input is detected, the controller checks to see if the key code is designating a function key (block 176). If the designated key is a function key, the controller checks its internal flags to see if the function is already selected (block 178). If the function has not been selected, the appropriate flag is set, a corresponding green LED is latched and the data is input to memory 130. If the flags indicate that the function designated by the function key is already selected, the output green LED is unlatched, the flag which was set for
that function is cleared, and the address in memory 130 containing the undesired function key representation is cleared. These functions are combined in block 178 of the flow chart of FIG. 8A. Once the latching or unlatching of the function key is complete, the controller 110 waits for the next input.
If the key selected at block 176 of the flow chart is not a function key, the input procedure continues as shown in FIG. 8B. At block 180, the display is turned off while the key representation is stored in RAM 130. The display is then turned back on, and the keyboard controller is allowed to scan the memory 130 (block 182). The selection inputs using the orientation sensor 10 are timed pulses, but if a joystick or other input device with buttons or other physical switches is used, monitoring of the switch depression is required. Block 184 shows a holding loop which waits until the input switch is released before proceeding. This prevents multiple inputs from the holding of a selection input switch. Finally, once the key input is received by the keyboard controller, the memory 130 and all latches and flags are cleared (block 186).
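A minimal C sketch of this character key path follows; the helper functions are hypothetical stand-ins for operations that the specification performs in hardware through the keyboard controller 138 and the buffer circuits.

#include <stdbool.h>
#include <stdint.h>

extern void display_enable(bool on);                 /* key pad LED display      */
extern void ram_write(uint8_t column, uint8_t row_bit);
extern void allow_keyboard_controller_scan(void);    /* block 182 (assumed hook) */
extern bool select_switch_pressed(void);             /* only for switch devices  */
extern void clear_ram_latches_and_flags(void);       /* block 186                */

void input_character_key(uint8_t column, uint8_t row_bit, bool has_switch)
{
    /* Block 180: blank the display while the representation is stored. */
    display_enable(false);
    ram_write(column, row_bit);
    display_enable(true);

    /* Block 182: the keyboard controller scans the RAM as if it were a
     * mechanical key matrix and reads the stored representation. */
    allow_keyboard_controller_scan();

    /* Block 184: with a joystick or other switch-based input device,
     * hold here until the switch is released so that one selection is
     * not repeated. */
    if (has_switch) {
        while (select_switch_pressed())
            ;   /* wait for release */
    }

    /* Block 186: clear the RAM, latches and internal flags. */
    clear_ram_latches_and_flags();
}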
While the invention has been particularly shown and described with reference to a preferred embodiment thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. In one alternative embodiment, the key pad of FIG. 3 has manual keys in which the LED indicators reside. In that embodiment, either the input device 10 or the manual keys of the key pad may be used to select desired keyboard inputs.
In other embodiments, different forms of indicators are used to identify the key code to the user. These indicators might include an LCD display or a separate monitor. The display might have separate indicators, or might be part of a device which displays represented key characters. Other possibilities include the use of an audio indicator. In general, as long as the indicator presents a representation to a user of the key code generated by the controller, it can be implemented in the present invention.

Claims (37)

THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE
PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:
1. A computer system comprising a processor, a display screen and a keyboard input port adapted to receive inputs from a mechanical keyboard, the system further comprising:
an input device for providing a Cartesian positional input;
a keypad having indicators for indicating individual keys;
a controller responsive to the input device for indicating individual keys on the keypad and further responsive to a key select input to select an indicated key and to store a key representation of the selected key, the key representation including a single active data bit within a word indicating a selected key position within a key matrix;
a memory circuit responsive to the controller to store the representation of the selected key, said memory circuit having an addressable memory space which simulates a mechanical keyboard; and
a keyboard controller adapted to scan columns of mechanical key contacts and sense the state of the contacts, said keyboard controller scanning the contents of the memory circuit as it would scan columns of mechanical key contacts, sensing the representation of the selected key, and providing an input representative of the selected key to the keyboard input port.
2. A computer system according to claim 1 wherein key representations stored by the memory circuit identify the input to be provided by the keyboard controller to the keyboard input port.
3. A computer system according to claim 1 wherein the memory circuit is periodically scanned by the keyboard controller to read any new key representations stored in the memory unit.
4. A computer system according to claim 1 further comprising buffer circuits for regulating the transfer of key representations to and from the memory circuit.
5. A computer system according to claim 4 wherein the buffer circuits respond to an input from a user of the computer system to allow the memory circuit to store a particular key representation to be read by the keyboard controller.
6. A computer system according to claim 5 wherein said input from a user results in the storage of a key representation in the memory circuit which represents the key identified on the keypad indicators, the reading of the stored key representation by the keyboard controller causing the keyboard controller to input a keyboard input to the keyboard input port which designates the identified key.
7. A computer system according to claim 6 wherein said memory circuit is cleared after a key representation representing a character key is read from the memory circuit by the keyboard controller.
8. A computer system according to claim 1 wherein the indicators comprise function indicators for indicating storage by the memory circuit of key representations representing function keys.
9. A computer system according to claim 1 wherein the indicators are LEDs of an LED array.
10. A computer system according to claim 1 further comprising a keyboard representation on the keypad, the keys of which are associated with the indicators.
11. A computer system according to claim 1 wherein the indicators are a matrix of light emitting elements corresponding to a mechanical keyboard contact matrix.
12. A computer system according to claim 1 wherein the controller comprises a microprocessor.
13. A computer system according to claim 1 wherein the key select input is a voice input.
14. A computer system according to claim 1 wherein the input device comprises an orientation sensor adapted to be worn as a headset, the orientation sensor modifying the positional input in response to changes in spatial orientation of the sensor.
15. A computer system according to claim 1 further comprising a cursor control interface for providing the cartesian positional input to the processor for controlling the position of a cursor on the display screen in a cursor control mode of the system.
16. A computer system according to claim 15 wherein the cartesian positional input controls implementation of the cursor control mode.
17. A method of providing a keyboard input to a computer system comprising a processor, a display screen and a keyboard input port adapted to receive inputs from a mechanical keyboard, the method comprising:
providing a cartesian positional input to the system with an input device;
receiving the positional input from the input device with a controller which generates a key representation of an indicated keyboard key, the key representation including a single active data bit within a word indicating a selected key position within a key matrix;

providing a visual indication of the indicated keyboard key on a keypad having indicators for indicating individual keys;
in response to a key select input from a user, selecting the indicated key and storing the key representation of the selected key in a memory circuit, the memory circuit having an addressable memory space which simulates a mechanical keyboard;
scanning contents of the memory circuit with a keyboard controller to sense the stored representation of the selected key, the keyboard controller being adapted to scan columns of mechanical key contacts; and
providing an input representative of the selected key to the keyboard input port.
18. A method according to claim 17 wherein the indicators are LEDs of an LED array.
19. A method according to claim 17 wherein the indicators are associated with a keyboard representation on the keypad.
20. A method according to claim 17 wherein the indicators are a matrix of light emitting elements corresponding to a mechanical keyboard contact matrix.
21. A method according to claim 17 further comprising indicating storage by the memory circuit of key representations representing function keys.
22. A method according to claim 17 wherein the key select input is a voice input.
23. A method according to claim 17 wherein the input device comprises an orientation sensor adapted to be worn as a headset, the orientation sensor modifying the positional input in response to changes in spatial orientation of the sensor.
24. A method according to claim 17 further comprising providing the cartesian positional input to the processor to control the position of a cursor on the display screen in a cursor control mode of the system.
25. A method according to claim 24 wherein the cartesian positional input controls implementation of the cursor control mode.
26. A computer system comprising a processor, a display screen and a keyboard input port adapted to receive inputs from a mechanical keyboard, the system further comprising:
an input device for providing a cartesian positional input;
a keypad having indicators for indicating individual keys;

a controller responsive to the input device in a keyboard simulation mode of the system for indicating individual keys on the keypad and for providing an input to the keyboard input port, said controller being further responsive to a key select input to store a representation of a selected indicated key;
a cursor control interface for providing the cartesian positional input to the processor for controlling the position of a cursor on the display screen in a cursor control mode of the system; and
a mode controller for:
monitoring the cartesian positional input, implementing the keyboard simulation mode when the cartesian positional input indicates a position beyond a position limit, and implementing the cursor control mode when the cartesian positional input indicates a position within the position limit.
27. A computer system according to claim 26 further comprising:
a memory circuit responsive to the controller for storing the representation of the selected key, said memory circuit having an addressable memory space which simulates a mechanical keyboard; and
a keyboard controller adapted to scan columns of mechanical key contacts and sense the state of the contacts, said keyboard controller scanning the contents of the memory circuit as it would scan columns of mechanical key contacts, sensing the representation of the selected key, and providing an input representative of the selected key to the keyboard input port.
28. The computer system of claim 26 wherein the position limit corresponds to a value on a cartesian coordinate axis.
29. The computer system of claim 28 wherein the mode controller switches between the cursor control mode and the keyboard simulation mode as a value in the cartesian positional input crosses the value corresponding to the position limit.
30. The computer system of claim 29 wherein: the input device is adapted to be worn as a headset by the user; and the cartesian positional input is controllable by a position of the user's head.
31. The computer system of claim 26 wherein: the input device is adapted to be worn as a headset by the user; and the cartesian positional input is controllable by a position of the user's head.
32. The method of claim 28 further comprising: adapting the input device to be worn as a headset by the user; and controlling the cartesian positional input by changing a position of the user's head.
33. A method of providing input to a computer system comprising a processor, a display screen and a keyboard input port adapted to receive inputs from a mechanical keyboard, the method comprising:
providing a cartesian positional input to the system with an input device;
monitoring the cartesian positional input relative to a position limit;
when the cartesian positional input indicates a position beyond a position limit, implementing a keyboard simulation mode comprising:
receiving the cartesian positional input from the input device with a controller which generates a key representation of an indicated keyboard key;
providing a visual indication of the indicated keyboard key on a keypad having indicators for indicating individual keys; and
providing an input representative of a selected key to the keyboard input port; and
when the cartesian positional input indicates a position within the position limit, implementing a cursor control mode comprising providing the cartesian positional input to the processor to control the position of a cursor on the display screen.
34. A method according to claim 33 further comprising the steps of, in the keyboard simulation mode:
storing the key representation of the indicated key from the controller in a memory circuit in response to a key select input from a user used to select the indicated key, the memory circuit having an addressable memory space which simulates a mechanical keyboard;
scanning contents of the memory circuit with a keyboard controller to sense the stored representation of the selected key, the keyboard controller being adapted to scan columns of mechanical key contacts; and
providing an input representative of the selected key to the keyboard input port.
35. The method of claim 33 wherein the position limit corresponds to a value on a cartesian coordinate axis.
36. The method of claim 35 further comprising switching between the cursor control mode and the keyboard simulation mode as a value in the cartesian positional input crosses the value corresponding to the position limit.
37. The method of claim 36 further comprising: adapting the input device to be worn as a headset by the user; and controlling the cartesian positional input by changing a position of the user's head.
CA002081910A 1990-05-01 1990-10-24 Hands-free hardware keyboard Expired - Fee Related CA2081910C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US51734790A 1990-05-01 1990-05-01
US517,347 1990-05-01
PCT/US1990/006105 WO1991017522A1 (en) 1990-05-01 1990-10-24 Hands-free hardware keyboard

Publications (2)

Publication Number Publication Date
CA2081910A1 CA2081910A1 (en) 1991-11-02
CA2081910C true CA2081910C (en) 2000-04-25

Family

ID=24059448

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002081910A Expired - Fee Related CA2081910C (en) 1990-05-01 1990-10-24 Hands-free hardware keyboard

Country Status (7)

Country Link
US (1) US5426450A (en)
EP (1) EP0532496B1 (en)
JP (1) JPH05506112A (en)
AU (1) AU654118B2 (en)
CA (1) CA2081910C (en)
DE (1) DE69016463T2 (en)
WO (1) WO1991017522A1 (en)

Families Citing this family (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993014454A1 (en) * 1992-01-10 1993-07-22 Foster-Miller, Inc. A sensory integrated data interface
EP0676072A1 (en) * 1992-12-23 1995-10-11 WEIXLER, Bernhard System for optional control of functions of a pc
JP3530591B2 (en) * 1994-09-14 2004-05-24 キヤノン株式会社 Speech recognition apparatus, information processing apparatus using the same, and methods thereof
US5686942A (en) * 1994-12-01 1997-11-11 National Semiconductor Corporation Remote computer input system which detects point source on operator
US5787152A (en) * 1995-07-18 1998-07-28 Freadman; Tommyca Computer communications device
US5724558A (en) * 1995-07-31 1998-03-03 Microsoft Corporation System and method for dynamic data packet configuration
GB9602701D0 (en) * 1996-02-09 1996-04-10 Canon Kk Image manipulation
US5960395A (en) * 1996-02-09 1999-09-28 Canon Kabushiki Kaisha Pattern matching method, apparatus and computer readable memory medium for speech recognition using dynamic programming
US5923866A (en) * 1996-04-26 1999-07-13 Acer Incorporated Method and apparatus for realizing a keyboard key function on a remote control
US5913034A (en) * 1996-08-27 1999-06-15 Compaq Computer Corp. Administrator station for a computer system
US5874939A (en) * 1996-12-10 1999-02-23 Motorola, Inc. Keyboard apparatus and method with voice recognition
US6148100A (en) * 1996-12-20 2000-11-14 Bechtel Bwxt Idaho, Llc 3-dimensional telepresence system for a robotic environment
JP3112254B2 (en) * 1997-03-04 2000-11-27 富士ゼロックス株式会社 Voice detection device
US6996533B2 (en) * 1997-03-21 2006-02-07 Fujitsu Limited Information processing system
US5893064A (en) * 1997-05-14 1999-04-06 K2 Interactive Llc Speech recognition method and apparatus with voice commands and associated keystrokes
US5864334A (en) * 1997-06-27 1999-01-26 Compaq Computer Corporation Computer keyboard with switchable typing/cursor control modes
US6128010A (en) * 1997-08-05 2000-10-03 Assistive Technology, Inc. Action bins for computer user interface
US6384591B1 (en) 1997-09-11 2002-05-07 Comsonics, Inc. Hands-free signal level meter
US6353313B1 (en) 1997-09-11 2002-03-05 Comsonics, Inc. Remote, wireless electrical signal measurement device
US7834855B2 (en) 2004-08-25 2010-11-16 Apple Inc. Wide touchpad on a portable computer
AT414325B (en) * 1998-02-18 2007-07-15 Teamaxess Ticketing Gmbh ARRANGEMENT FOR THE SALE OF AUTHORIZATIONS
US6243076B1 (en) 1998-09-01 2001-06-05 Synthetic Environments, Inc. System and method for controlling host system interface with point-of-interest data
US6333753B1 (en) 1998-09-14 2001-12-25 Microsoft Corporation Technique for implementing an on-demand display widget through controlled fading initiated by user contact with a touch sensitive input device
US7358956B2 (en) * 1998-09-14 2008-04-15 Microsoft Corporation Method for providing feedback responsive to sensing a physical presence proximate to a control of an electronic device
US7256770B2 (en) 1998-09-14 2007-08-14 Microsoft Corporation Method for displaying information responsive to sensing a physical presence proximate to a computer input device
US6396477B1 (en) 1998-09-14 2002-05-28 Microsoft Corp. Method of interacting with a computer using a proximity sensor in a computer input device
US6456275B1 (en) 1998-09-14 2002-09-24 Microsoft Corporation Proximity sensor in a computer input device
US6330514B1 (en) * 1999-02-09 2001-12-11 Behavior Tech Computer Corp. Keyboard testing system
US6893407B1 (en) * 2000-05-05 2005-05-17 Personics A/S Communication method and apparatus
IL136206A (en) * 2000-05-17 2005-05-17 Powerloc Technologies Inc Modular device organizer
US6629077B1 (en) 2000-11-22 2003-09-30 Universal Electronics Inc. Universal remote control adapted to receive voice input
US20020085738A1 (en) * 2000-12-28 2002-07-04 Peters Geoffrey W. Controlling a processor-based system by detecting flesh colors
KR100458066B1 (en) * 2001-09-27 2004-12-03 김한성 Method of inputting letter using mouse and its system
US6770864B2 (en) * 2001-12-22 2004-08-03 Yong Yan Light beam operated personal interfaces to computers
US20040003136A1 (en) * 2002-06-27 2004-01-01 Vocollect, Inc. Terminal and method for efficient use and identification of peripherals
US7161579B2 (en) 2002-07-18 2007-01-09 Sony Computer Entertainment Inc. Hand-held computer interactive device
US8797260B2 (en) * 2002-07-27 2014-08-05 Sony Computer Entertainment Inc. Inertially trackable hand-held controller
US7623115B2 (en) * 2002-07-27 2009-11-24 Sony Computer Entertainment Inc. Method and apparatus for light input device
US7883415B2 (en) 2003-09-15 2011-02-08 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
US7646372B2 (en) * 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US8686939B2 (en) 2002-07-27 2014-04-01 Sony Computer Entertainment Inc. System, method, and apparatus for three-dimensional input control
US9474968B2 (en) 2002-07-27 2016-10-25 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US7627139B2 (en) * 2002-07-27 2009-12-01 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US8570378B2 (en) 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US7760248B2 (en) 2002-07-27 2010-07-20 Sony Computer Entertainment Inc. Selective sound source listening in conjunction with computer interactive processing
US9682319B2 (en) 2002-07-31 2017-06-20 Sony Interactive Entertainment Inc. Combiner method for altering game gearing
US9177387B2 (en) 2003-02-11 2015-11-03 Sony Computer Entertainment Inc. Method and apparatus for real time motion capture
US8072470B2 (en) 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
US20040243416A1 (en) * 2003-06-02 2004-12-02 Gardos Thomas R. Speech recognition
US10279254B2 (en) 2005-10-26 2019-05-07 Sony Interactive Entertainment Inc. Controller having visually trackable object for interfacing with a gaming system
US7874917B2 (en) 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US8287373B2 (en) 2008-12-05 2012-10-16 Sony Computer Entertainment Inc. Control device for communicating visual information
US8323106B2 (en) * 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
US9573056B2 (en) 2005-10-26 2017-02-21 Sony Interactive Entertainment Inc. Expandable control device via hardware attachment
US7663689B2 (en) * 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
US20060028433A1 (en) * 2004-08-04 2006-02-09 Myrick Wilbur L Universal serial bus keystroke generator switch
US8547401B2 (en) 2004-08-19 2013-10-01 Sony Computer Entertainment Inc. Portable augmented reality device and method
KR100631628B1 (en) * 2005-03-07 2006-10-11 엘지전자 주식회사 Power key independent backlight device and method of mobile communication terminal
US20070152983A1 (en) 2005-12-30 2007-07-05 Apple Computer, Inc. Touch pad with symbols based on mode
US8022935B2 (en) 2006-07-06 2011-09-20 Apple Inc. Capacitance sensing electrode with integrated I/O mechanism
USRE48417E1 (en) 2006-09-28 2021-02-02 Sony Interactive Entertainment Inc. Object direction using video input combined with tilt angle information
US8781151B2 (en) 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
US8542907B2 (en) 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
US20090174679A1 (en) 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
CN102016877B (en) 2008-02-27 2014-12-10 索尼计算机娱乐美国有限责任公司 Methods for capturing depth data of a scene and applying computer actions
US8368753B2 (en) 2008-03-17 2013-02-05 Sony Computer Entertainment America Llc Controller with an integrated depth camera
US8961313B2 (en) * 2009-05-29 2015-02-24 Sony Computer Entertainment America Llc Multi-positional three-dimensional controller
US8294047B2 (en) 2008-12-08 2012-10-23 Apple Inc. Selective input signal rejection and modification
DE102008055180A1 (en) * 2008-12-30 2010-07-01 Sennheiser Electronic Gmbh & Co. Kg Control system, handset and control methods
US8527657B2 (en) 2009-03-20 2013-09-03 Sony Computer Entertainment America Llc Methods and systems for dynamically adjusting update rates in multi-player network gaming
US8342963B2 (en) 2009-04-10 2013-01-01 Sony Computer Entertainment America Inc. Methods and systems for enabling control of artificial intelligence game characters
US8393964B2 (en) 2009-05-08 2013-03-12 Sony Computer Entertainment America Llc Base station for position location
US8142288B2 (en) 2009-05-08 2012-03-27 Sony Computer Entertainment America Llc Base station movement detection and compensation
US20110138207A1 (en) * 2009-12-08 2011-06-09 Su Chen-Wei Power control and operation method for notebook computer
KR101660746B1 (en) * 2010-08-24 2016-10-10 엘지전자 주식회사 Mobile terminal and Method for setting application indicator thereof
IL208796A0 (en) * 2010-10-18 2010-12-30 Univ Ben Gurion An apparatus for operating a computer using thoughts or facial impressions
TWI416135B (en) * 2010-11-26 2013-11-21 Primax Electronics Ltd Testing method and system for circuit board of keys
CN103576863B (en) * 2012-06-21 2017-02-15 深圳市金正方科技股份有限公司 Keyboard input method and device
US9305229B2 (en) 2012-07-30 2016-04-05 Bruno Delean Method and system for vision based interfacing with a computer
US9134764B2 (en) * 2013-12-20 2015-09-15 Sony Corporation Apparatus and method for controlling a display based on a manner of holding the apparatus
CN106325506A (en) * 2016-08-17 2017-01-11 捷开通讯(深圳)有限公司 Interaction method for virtual reality device, virtual reality device and virtual reality system

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4400697A (en) * 1981-06-19 1983-08-23 Chyron Corporation Method of line buffer loading for a symbol generator
US4642610A (en) * 1982-06-04 1987-02-10 Smith Iii William N Communications apparatus for handicapped individuals
US4567479A (en) * 1982-12-23 1986-01-28 Boyd Barry S Directional controller apparatus for a video or computer input
US4565999A (en) * 1983-04-01 1986-01-21 Prime Computer, Inc. Light pencil
US4524348A (en) * 1983-09-26 1985-06-18 Lefkowitz Leonard R Control interface
US4746913A (en) * 1984-04-23 1988-05-24 Volta Arthur C Data entry method and apparatus for the disabled
JPS6194134A (en) * 1984-10-13 1986-05-13 Naretsuji:Kk Radio mouse device
FR2575560B1 (en) * 1984-12-27 1987-02-20 Lafitte Rene COMMUNICATION APPARATUS FOR PEOPLE WITH DIFFERENT MOTOR AND / OR EXPRESSION POSSIBILITIES, AND MORE PARTICULARLY FOR MOTOR AND / OR BRAIN DISABLED PEOPLE
US4713535A (en) * 1985-09-04 1987-12-15 Rhoades Randy L Optical keyboard
US4988981B1 (en) * 1987-03-17 1999-05-18 Vpl Newco Inc Computer data entry and manipulation apparatus and method
US4862172A (en) * 1987-09-14 1989-08-29 Texas Scottish Rite Hospital For Crippled Children Computer control apparatus including a gravity referenced inclinometer
CA1334684C (en) * 1987-10-14 1995-03-07 Wang Laboratories, Inc. Computer input device using an orientation sensor
AU618133B2 (en) * 1988-11-04 1991-12-12 Wang Laboratories, Inc. Computer input device using an orientation sensor

Also Published As

Publication number Publication date
CA2081910A1 (en) 1991-11-02
EP0532496B1 (en) 1995-01-25
AU6602090A (en) 1991-11-27
WO1991017522A1 (en) 1991-11-14
US5426450A (en) 1995-06-20
JPH05506112A (en) 1993-09-02
AU654118B2 (en) 1994-10-27
DE69016463D1 (en) 1995-03-09
DE69016463T2 (en) 1995-09-07
EP0532496A1 (en) 1993-03-24

Similar Documents

Publication Publication Date Title
CA2081910C (en) Hands-free hardware keyboard
US5142655A (en) Computer input device using an orientation sensor
US5287119A (en) Computer input device using an orientation sensor
US5603065A (en) Hands-free input device for operating a computer having mouthpiece with plurality of cells and a transducer for converting sound into electrical control signals
US10095327B1 (en) System, method, and computer-readable medium for facilitating adaptive technologies
US6160536A (en) Dwell time indication method and apparatus
US6747632B2 (en) Wireless control device
US20170108938A1 (en) Apparatus for Selecting from a Touch Screen
US5751260A (en) Sensory integrated data interface
US7337410B2 (en) Virtual workstation
US5999895A (en) Sound operated menu method and apparatus
US6005549A (en) User interface method and apparatus
US5819225A (en) Display indications of speech processing states in speech recognition system
CA2262672A1 (en) Speech recognition manager
CA1325854C (en) Computer input device using an orientation sensor
GB2311888A (en) Tactile communication system
Yıldıran et al. AiRType: an air-tapping keyboard for augmented reality environments
JP2002244810A (en) Virtual reality space data input device
Nisbet Alternative Access Technologies
EP1483659B1 (en) Input apparatus for a computer system
GB2331170A (en) Data entry
KR20060002995A (en) Electric apparatus and method of communication between an apparatus and a user
Murphy et al. Developing the user-system interface for a communications system for ALS patients and others with severe neurological impairments
Wamboldt Computer environmental control units for the severely physically disabled: a guide for the occupational therapist
Cook Technology and Disabilities

Legal Events

Date Code Title Description
EEER Examination request
MKLA Lapsed