WO1998032112A1 - Sensory communication apparatus - Google Patents

Sensory communication apparatus

Info

Publication number
WO1998032112A1
Authority
WO
WIPO (PCT)
Prior art keywords
output
pins
processor
data
central processing
Prior art date
Application number
PCT/GB1998/000162
Other languages
French (fr)
Inventor
John Christian Doughty Nissen
Original Assignee
John Christian Doughty Nissen
Priority date
Filing date
Publication date
Application filed by John Christian Doughty Nissen filed Critical John Christian Doughty Nissen
Priority to AU56723/98A priority Critical patent/AU5672398A/en
Priority to EP98900916A priority patent/EP0917699A1/en
Publication of WO1998032112A1 publication Critical patent/WO1998032112A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • G09B21/008 Teaching or communicating with blind persons using visual presentation of the information for the partially sighted


Abstract

Tactile communication apparatus comprising a central processing unit (10) having a data store (11), an output processor (12), means for controlling the output of data from the data store (11) to the output processor (12), and output means (16) connected to receive data from the processor, the output means having tactile sensor units responsive to the output data whereby the output of the computer can be determined by touch.

Description

SENSORY COMMUNICATION APPARATUS
This invention is concerned with sensory communication apparatus, and in particular with such apparatus for the dynamic display to a blind person of textual and graphical information.
Maps are used for three main purposes, namely for education, such as the study of geography, for planning a journey, and for navigation during a journey.
The tasks of exploration are similar in the three cases and are essentially to discover the locations, names and characteristics of, and relationships between, real-world objects or features, both natural and man-made. The goal of exploration may be to establish the layout of a place, or more specifically to find a suitable route from one place to another.
There have been a number of developments in making maps accessible for blind people. Firstly there are the maps which are purely tactile and in which writing is in Braille, as raised dots, and texture is used as a substitute for colour. Such maps are "crude" in that there can be little detail, and resolution is low. Thus only a limited amount of information can be presented for a given size of map - several orders of magnitude less than a conventional map used by sighted people. Tactile maps are also difficult to produce, though there have been technical developments to improve the situation. The cost of reproduction (i.e. the per copy cost) is still high.
Similar problems arise in the presentation of other multidimensional information including graphs, charts, block diagrams, tables and matrices. Even with a simple table the single line display of a conventional dynamic Braille apparatus can lead to confusion because the line crosses the columns. There have been various attempts to get away from a "hard-copy" approach and substitute a dynamic tactile display, using a physical surface which can be altered under control of a computer. These displays are expensive because the surface is composed of a large array of movable elements, but the cost of map reproduction (i.e. producing copies of a map to distribute) is negligible since the maps are in electronic form.
It is an object of the present invention to obviate or mitigate these difficulties.
The present invention is sensory communication apparatus comprising a central processing unit having a data store, an output processor, means for controlling the output of data from the data store to the output processor, and output means connected to receive data from the processor, the output means being responsive to the output data whereby the output of the computer can be determined by one or more of the user's senses of touch, sound or sight.
The means for controlling the output of the data store may comprise a pointing device.
Preferably the output means includes tactile elements in the form of pins, each having associated with it an electromechanical transducer.
Preferably the pins are also associated with respective switches which are connected through an input processor to the central processing unit.
The pins may be arranged in pairs each for simultaneous contact by a respective finger and each pin of each pair can be vibrated at different frequencies or with different pulse lengths. The output means may be operable in either a character mode or a surface mode.
Preferably the patterns in which the pins are actuated are related to text characters.
The apparatus may include a speech synthesiser responsive to data output from the central processing unit.
Preferably a keyboard or keypad is connected through an input processor to the central processing unit.
The keyboard or keypad may be operable in a character mode or a control mode.
Graphical information may be held in the data store of the central processing unit as a virtual surface, over which moves a notional cursor. The data output reflects the graphical information under the cursor, and the relative position of graphical objects on the surface. A selected graphical object can be described to the user in text, which is output using the same tactile device, or using audio (speech) or visual display.
Embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:-
Fig.1 is a block circuit diagram of an embodiment of the present invention; Fig.2 is a more detailed circuit diagram of part of Fig.1; Fig.3 is a more detailed circuit diagram of part of Fig.2; Fig.4 is a side elevation of an input/output sensor; Fig.5 is a plan view of a tactile input/output unit; and Fig.6 is a side view of the tactile input/output unit of Fig.5.
Referring now to Fig.1, tactile communication apparatus according to an embodiment of the present invention comprises a central processing unit 10 having a data store 11 and being connected through an output processor 12 and an input processor 14 to a number of tactile sensors 16 mounted on an input/output unit 18 (Fig.5). The input and output processors need not, of course, be separate units but may be provided in a single integrated circuit.
The sensors 16 of this embodiment are shown in Fig.4 and it should be noted that each sensor 16 is an input/ output device that both generates a mechanical movement in response to output signals received from the output processor 12 and generates signals in response to a mechanical input, the signals being passed to the central processing unit 10 through the input processor 14.
Referring now to Fig.4, each sensor 16 comprises a pair of pins 20, a pair of electro-mechanical transducers, in this embodiment piezo strips 22, and a pair of membrane switches 24 mounted on a printed circuit board 26 supported from the top plate 28 of the unit 18. The strips 22 are clamped at one end and mount the pins 20 at their other end, the pins 20 projecting upwardly through the top plate 28. The switches 24 are each mounted beneath a respective pin 20 to be closed by the piezo strip when the pin is depressed.
In Fig.2 the output processor 12 is shown to comprise, connected in cascade, a level translator 34, a pulse width modulation generator 36, a high voltage digital amplifier 38 and a number of low pass filters 40 each connected to a respective piezo strip 22 in a sensor 16. In this embodiment twelve low pass filters 40 are provided. The output processor is powered from a low voltage source, preferably a battery, connected to a step-up voltage converter 42, the high voltage output of which powers the processor more efficiently than would a low voltage.
The details of the pulse width modulation generator 36 are shown in Fig.3. The generator 36 consists of four sources that may be mapped onto any combination of sixteen outputs. The sources are controlled by a control unit 50 which interprets the commands and data incoming from the computer 10 via the level translator 34 and controls the various aspects of pulse generation.
Each of the four sources generates an arbitrary wave of a number of different amplitudes and frequencies and comprises a phase accumulator 52a, 52b, 52c or 52d, which is set to step through a respective look-up table 54a, 54b, 54c or 54d, at a configurable rate, whereby the frequency can be varied by altering the step and the waveform can be altered by using a number of different look-up tables. The output from the look-up table is then scaled by a respective scaler 56a, 56b, 56c or 56d, according to the desired amplitude.
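The phase-accumulator scheme described above can be illustrated with a minimal sketch. The class and function names, the table size and the sine waveform are invented for illustration; the patent specifies only the accumulator, the look-up table and the scaler.

```python
# Hypothetical sketch of one waveform source: a phase accumulator stepping
# through a look-up table, with the step size setting the frequency and a
# scaler setting the amplitude. All names and sizes are illustrative.
import math

TABLE_SIZE = 256

def make_sine_table(size=TABLE_SIZE):
    """Build a look-up table holding one cycle of a sine wave."""
    return [math.sin(2 * math.pi * i / size) for i in range(size)]

class PhaseAccumulatorSource:
    def __init__(self, table, step):
        self.table = table        # waveform shape (swap tables to change shape)
        self.step = step          # larger step -> higher frequency
        self.phase = 0

    def next_sample(self, amplitude=1.0):
        sample = self.table[self.phase % len(self.table)]
        self.phase += self.step   # advance through the table
        return amplitude * sample # scale to the desired amplitude

# Two sources sharing one table but with different steps produce the
# same wave shape at different frequencies.
slow = PhaseAccumulatorSource(make_sine_table(), step=1)
fast = PhaseAccumulatorSource(make_sine_table(), step=4)
```

With this arrangement, changing the step or the table changes frequency or waveform without touching the rest of the output chain, which matches the configurability the generator 36 is described as providing.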
The four current samples to be output are passed to their respective pulse width generators 58a-58d which create pulses which have widths proportional to the desired value.
Any of the generators can be fed to any of the outputs. This is achieved by preloading a mask register 60a, 60b, 60c or 60d with the output pattern for the respective pulse width generator.
When the generator is generating a pulse, this output word is fed through. When the generator is not generating a pulse, the output mask is gated off and therefore does not play a part in the output.
The sixteen outputs from each of the four generators are mixed together using an exclusive- OR gate 66. This results in a true mixing of the four sources when the resultant waveform has passed through its filtering process.
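The gating and mixing described in the preceding paragraphs can be sketched as follows. The simple duty-cycle pulse model and all names are assumptions made for illustration; the patent specifies only that a preloaded mask selects the outputs, that the mask is gated off between pulses, and that the four masked words are combined with exclusive-OR.

```python
# Illustrative sketch of the output routing: each source drives a pulse
# width generator, a preloaded mask register says which of the sixteen
# outputs that generator feeds, and the four masked words are XOR-mixed.

def pulse_active(sample, tick, period=16):
    """Crude PWM model: the pulse is high for a fraction of each period
    proportional to the sample value (0.0 .. 1.0)."""
    width = int(sample * period)
    return (tick % period) < width

def mixed_output(samples, masks, tick):
    """Combine four sources onto sixteen output lines.

    samples: four values in 0.0..1.0, one per source
    masks:   four 16-bit mask registers, one per source
    Returns a 16-bit word: each source's mask is fed through only while
    its generator is producing a pulse, then all four are XOR-mixed.
    """
    word = 0
    for sample, mask in zip(samples, masks):
        if pulse_active(sample, tick):
            word ^= mask  # exclusive-OR mixing of the gated mask words
        # when no pulse is being generated, the mask is gated off
    return word
```

Note that with XOR mixing, two sources driving the same output line cancel while their pulses overlap; after low-pass filtering this still yields the "true mixing" of sources the text describes.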
In embodiments allowing access to maps or other graphical information, the array of transducers is mounted on, or is otherwise associated with, a pointing device, such as a computer mouse or a touch sensitive tablet. As the mouse moves, or a depression moves across the tablet, the window is correspondingly moved over the virtual surface representing the map or graphical image.
To allow a map to be scaleable over a wide range of scales, the information can be stored at various levels of detail, with the greater detail suppressed for smaller-scale presentation. Thus, for example, on a large scale the map may show the exact shapes of buildings, on a smaller scale the buildings may be represented as simple rectangles, and on a smaller scale still the buildings may be merged into a single object representing a built-up area. In the current invention, the user can zoom in and out at will, which compensates for the small size of the window.
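The level-of-detail idea can be sketched with a few lines; the scale thresholds and the three labels are invented for illustration and would in practice be tuned to the map data.

```python
# Minimal sketch of level-of-detail selection: buildings are stored at
# several levels and the finer levels are suppressed at smaller scales.
# The thresholds (10 and 3) are illustrative assumptions.

def building_representation(scale):
    """Pick how buildings are presented for a given map scale
    (a larger number means more zoomed in)."""
    if scale >= 10:
        return "exact outline"
    elif scale >= 3:
        return "simple rectangle"
    else:
        return "merged built-up area"
```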
In Fig.5 is shown the plan view of a tactile input/output unit 18 which is similar to a computer mouse in that it has a mouse ball 88 (Fig.6) on its underside and can therefore serve as a pointing device. In addition the unit 18 is provided with four tactile sensors (70, 72, 74, and 76) each having a pair of pins 20 projecting through its upper surface and two sensors having two pairs of pins 84, 86, as seen in Fig.6, projecting through each of its sides 78 and 80. The sixteen pins 20 have associated respective transducers in the form of piezo strips 22 each connected to a respective one of the low pass filters 40 of Fig.2.
The unit is, in use, held in the hand with the four fingers engaging the four pairs of pins at its upper surface and the thumb engaging one of the two pairs of pins at the sides depending upon which hand is holding the unit.
In embodiments of the invention there are two modes of operation, namely character and surface.
In character mode, output is by characters, each a pattern on the pins formed as follows. Each pair of pins has three states determined by the frequency of vibration of the piezo strips; in the first state one of the strips vibrates at a low frequency (20 Hz), in the second state both piezo strips vibrate at the low frequency, and in the third state both piezos vibrate at a higher frequency (200 Hz). With one of the four finger pairs on there are three times four states, i.e. twelve states. With one of the two thumb pairs on there are six states. The alphabet is coded by patterns comprising either a single finger state, or a single finger state combined, simultaneously or sequentially, with a single thumb state.
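The size of the coding space described above can be checked with a short enumeration. The state labels are invented for illustration; the counts (four finger pairs, two thumb pairs, three states per pair) come from the text.

```python
# Back-of-envelope enumeration of the character coding space:
# 12 finger states, 6 thumb states, and characters formed from a finger
# state alone or a finger state combined with a thumb state.
from itertools import product

PAIR_STATES = ["one pin at 20 Hz", "both pins at 20 Hz", "both pins at 200 Hz"]
FINGER_PAIRS = 4
THUMB_PAIRS = 2   # only one is used at a time, depending on the hand

finger_states = [(pair, state) for pair in range(FINGER_PAIRS)
                 for state in PAIR_STATES]            # 4 x 3 = 12 states
thumb_states = [(pair, state) for pair in range(THUMB_PAIRS)
                for state in PAIR_STATES]             # 2 x 3 = 6 states

# A character is either a finger state alone, or a finger state combined
# (simultaneously or sequentially) with a thumb state.
codes = [(f,) for f in finger_states] + \
        [(f, t) for f, t in product(finger_states, thumb_states)]
```

The 12 + 72 = 84 distinct patterns are comfortably more than the 26 letters, leaving room for digits and punctuation.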
In surface mode, the output corresponds to the virtual surface under the fingers. The piezo strips vibrate according to an algorithm based on frequency and distance from vectors forming virtual objects on the surface.
In a modified embodiment the states of the pins are distinguished by the length of the vibration pulses as well as or instead of by the frequency of vibration. Furthermore, embodiments of the invention have two input modes, namely character and control.
In character mode the input is via keys, i.e. the pins 20, acting on the input switches 24 mounted under the piezo strips. Patterns of input can be produced to correspond to patterns of output. Three input stimuli are possible per finger, corresponding to the three output states per finger: one pin/piezo depressed, the other depressed, and both depressed.
In control mode, input is again via the keys and switches 24. A single keystroke is used for simple commands such as Next, Previous, Up, Down, Enter and Leave. These commands are used for navigating in information space, typically for exploring a document hierarchy, and for editing. The same commands are used at all levels in the structure of information space which is basically organised as a tree with hyperlinks.
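A hypothetical sketch of how the single-keystroke commands might traverse such a tree follows; the node and class names are invented, and only four of the six commands are shown (Enter and Leave would act on the selected node rather than move the cursor).

```python
# Illustrative sketch of Next/Previous/Up/Down navigation over a document
# tree, with the same commands working at every level of the hierarchy.

class Node:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)
        self.parent = None
        for child in self.children:
            child.parent = self

class Navigator:
    """Moves a cursor around the tree with the same four commands."""
    def __init__(self, root):
        self.node = root

    def down(self):                      # into the first child
        if self.node.children:
            self.node = self.node.children[0]

    def up(self):                        # back to the parent
        if self.node.parent:
            self.node = self.node.parent

    def next(self):                      # to the next sibling
        if self.node.parent:
            siblings = self.node.parent.children
            i = siblings.index(self.node)
            if i + 1 < len(siblings):
                self.node = siblings[i + 1]

    def previous(self):                  # to the previous sibling
        if self.node.parent:
            siblings = self.node.parent.children
            i = siblings.index(self.node)
            if i > 0:
                self.node = siblings[i - 1]

doc = Node("document", [Node("chapter 1", [Node("para 1"), Node("para 2")]),
                        Node("chapter 2")])
```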
The patterns of activation of the pins can be used to give a direction, e.g. a compass bearing or the direction of an object from the cursor position. The patterns can also be used to indicate what is under the cursor, or in the immediate vicinity. By selecting objects, feeling the patterns and moving the pointer, the user is able to explore a map, or other graphical image, represented on the virtual surface.
One form of the input/output unit has six pins with the associated transducers arranged as a hexagon about a seventh central pin. The pins are used to guide the hand of the user holding the tactile input/output unit in the direction corresponding to the direction of the graphical object from the cursor on the virtual surface. This allows the user to explore the surface for objects which the user has selected. Means are provided for selecting a single object or a group of objects with shared characteristics. With this particular input/output unit, the method of finding a particular object and its shape is as follows.
Consider first a point object such as a bus stop. While the window is not over the object, the pin or pair of pins closest to the object are activated periodically, with a period proportional to distance. The user can then move the window towards the object, and the frequency of activation increases as the object is approached. When the window is directly over the point, the central pin is activated. Next consider a line object such as the centre line of a pavement along a street. While the window is not over the object, again the pin or pair of pins closest to the object are activated periodically. When the line is reached the pins over the line are activated. The user can then follow the line. While exactly over the line, the central pin is operated. Now consider objects which have an area (i.e. are not points or lines). The same procedure is followed to find the edge of the object. However, if the window is moved inside the area, the central pin is continuously activated, and the pins nearest to the nearest edge are periodically activated.
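The point-object behaviour can be sketched as a small function: select the hexagon pin nearest the bearing of the object, with a pulse period proportional to distance, and switch to the central pin when over the point. The sector geometry, the threshold, and the unit-distance-per-second period are illustrative assumptions.

```python
# Hypothetical sketch of point-object guidance on the six-pin hexagon
# unit: the pin closest to the object's bearing pulses with a period
# proportional to distance, and the central pin fires over the point.
import math

def point_feedback(window, target, on_threshold=0.5):
    """Return (pin, period) for a point object at `target` seen from the
    window position, or ("centre", 0.0) when the window is over it."""
    dx, dy = target[0] - window[0], target[1] - window[1]
    distance = math.hypot(dx, dy)
    if distance <= on_threshold:
        return ("centre", 0.0)
    # Pick the hexagon pin closest to the bearing of the object: six
    # sectors of 60 degrees, pin 0 pointing along the +x axis.
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    pin = int((bearing + 30) // 60) % 6
    period = distance  # period proportional to distance: closer -> faster
    return (pin, period)
```

Line and area objects would follow the same pattern, substituting the distance to the nearest point of the line or edge.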
A particularly useful aspect of the invention is the ability of the user to feel the input as it is being drawn. Thus a person can input a line onto a map by moving the pointing device and then immediately feel the line using the tactile means.
For examining the relationship and shapes of different areas of the map, different areas are allocated different "colours" on the virtual surface, and each pin reacts to a different colour as the cursor is moved over the surface. The user can thus scan the surface and detect the positions of the differently coloured areas. This is particularly efficient since, by the four-colour theorem, no planar map requires more than four colours to distinguish adjacent areas.
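The colour-scanning scheme amounts to a fixed binding from each area colour to one pin. A minimal sketch, with the colour names and pin numbering as illustrative assumptions:

```python
# Hypothetical sketch: each of the four area "colours" on the virtual
# surface is bound to its own pin, so scanning the cursor over the map
# raises the pin for whichever coloured area lies under the window.

COLOUR_TO_PIN = {"red": 0, "green": 1, "blue": 2, "yellow": 3}

def pins_under_cursor(colours_in_window):
    """Return the sorted list of pins to raise for the colours present."""
    return sorted(COLOUR_TO_PIN[c] for c in colours_in_window
                  if c in COLOUR_TO_PIN)

# A cursor straddling the boundary between two areas raises both pins:
print(pins_under_cursor({"red", "blue"}))   # [0, 2]
```

Because at most four pins are ever needed, a boundary between any two areas is always felt as two distinct pins rising together.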
Sounds may be produced corresponding to the pin activation described above. Embodiments of the invention may have an audio output in which the sounds corresponding to pin activation are combined with speech output of text information.
While the main outputs are tactile or audio, a visual output may also be provided as, for example, a word-by-word visual display of text with large characters, where the display or highlighting of each word is synchronised or partially synchronised with the speech output of the word, such that the word is spoken immediately before, immediately after, or simultaneously with the display of that word. The word-by-word facility is useful for partially sighted people, as the visual display can handle words in large characters, up to the width of the screen. The word display or highlighting can be centred, so that the person can retain focus on one point on the screen while the words are displayed sequentially. This permits rapid reading, since the eyes' saccade movements are eliminated, and the time spent in backtracking and re-reading is avoided. The synchronism with speech reinforces the association between the written and spoken word, helpful for language learners and dyslexics.
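The word-by-word display can be sketched as a simple loop that shows each word centred in a fixed field while speaking it. The speak() stub, the field width, and the words-per-minute timing model are assumptions for illustration; a real device would drive a speech synthesiser and a character display.

```python
import time

def speak(word):
    pass  # placeholder for the speech synthesiser output

def display_word_by_word(text, wpm=200, field_width=40):
    """Show and 'speak' each word in turn, centred in a fixed field.

    Centring every word on the same point lets the reader's gaze stay
    fixed, eliminating saccades as described in the text.
    """
    seconds_per_word = 60.0 / wpm
    frames = []
    for word in text.split():
        frames.append(word.center(field_width))  # centred display frame
        speak(word)                              # simultaneous speech
        time.sleep(seconds_per_word)
    return frames

frames = display_word_by_word("reading without saccades", wpm=6000)
print(len(frames))   # one display frame per word
```

A per-word delay derived from a words-per-minute rate keeps the display and speech loosely synchronised even without feedback from the synthesiser.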
It should be understood that the output unit of the present invention may rest on a surface, or may be clipped to a part of a wearer's apparel, such as a belt, or may be strapped to and operated by one hand.
Moreover, the central processing unit may be remote from the input/output unit and may even be accessed by telephone and modem.
In a further modification of the input/output units previously described, the units are provided with a transmitter, for example an infra-red or radio transmitter, or a transmitter and receiver, providing a remote control link with an appliance or system such as a television, a kiosk or an automatic teller machine. This enables the user to control the appliance or system using a dedicated pattern of pin or key operation, and to receive confirmation of the control command from the central processing unit as an audio or tactile output.

Claims

1. Sensory communication apparatus comprising a central processing unit having a data store, an output processor, means for controlling the output of data from the data store to the output processor, and output means connected to receive data from the processor, the output means being responsive to the output data whereby the output of the computer can be determined by one or more of the user's senses of touch, sound or sight.
2. Apparatus as claimed in claim 1, in which the means for controlling the output of the data store comprises a pointing device.
3. Apparatus as claimed in claim 1 or claim 2, in which the output means includes tactile elements in the form of pins, each having associated with it an electro-mechanical transducer.
4. Apparatus as claimed in claim 3, in which the pins are also associated with respective switches which are connected through an input processor to the central processing unit.
5. Apparatus as claimed in claim 3 or claim 4, in which the pins are arranged in pairs each for simultaneous contact by a respective finger and each pin of each pair can be vibrated at different frequencies or with different pulse lengths.
6. Apparatus as claimed in any of claims 3 to 5, in which the output means is operable in either a character mode or a surface mode.
7. Apparatus as claimed in any of claims 3 to 6, in which the patterns in which the pins are actuated are related to text characters.
8. Apparatus as claimed in any preceding claim, including a speech synthesiser responsive to data output from the central processing unit.
9. Apparatus as claimed in any of claims 1 to 3, or any of claims 5 to 8 when independent of claim 4, in which a keyboard or keypad is connected through an input processor to the central processing unit.
10. Apparatus as claimed in claim 9, in which the keyboard or keypad is operable in a character mode or a control mode.
PCT/GB1998/000162 1997-01-20 1998-01-19 Sensory communication apparatus WO1998032112A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU56723/98A AU5672398A (en) 1997-01-20 1998-01-19 Sensory communication apparatus
EP98900916A EP0917699A1 (en) 1997-01-20 1998-01-19 Sensory communication apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB9701102.7A GB9701102D0 (en) 1997-01-20 1997-01-20 Tactile system for dynamic display of textual and graphical information with audio option
GB9701102.7 1997-01-20

Publications (1)

Publication Number Publication Date
WO1998032112A1 true WO1998032112A1 (en) 1998-07-23

Family

ID=10806274

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1998/000162 WO1998032112A1 (en) 1997-01-20 1998-01-19 Sensory communication apparatus

Country Status (5)

Country Link
EP (1) EP0917699A1 (en)
AU (1) AU5672398A (en)
CA (1) CA2249415A1 (en)
GB (1) GB9701102D0 (en)
WO (1) WO1998032112A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1989006848A1 (en) * 1988-01-21 1989-07-27 British Telecommunications Public Limited Company Electronic vibrational display
EP0574887A1 (en) * 1992-06-16 1993-12-22 Canon Kabushiki Kaisha Information processing apparatus for physically-handicapped users
JPH0777944A (en) * 1993-06-14 1995-03-20 Yasushi Ikei Vibrating type tactile display
JPH087182A (en) * 1994-06-23 1996-01-12 Nippon Telegr & Teleph Corp <Ntt> Tactile sensation stimulation presentation method and device and tactile sensation stimulation display
WO1997016035A1 (en) * 1995-10-25 1997-05-01 Gilbert Rene Gonzales Tactile communication device and method
JPH09166958A (en) * 1995-12-18 1997-06-24 Japan Radio Co Ltd Navigation device
GB2311888A (en) * 1996-04-01 1997-10-08 John Christian Doughty Nissen Tactile communication system

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"FEEDBACK MOUSE", IBM TECHNICAL DISCLOSURE BULLETIN, vol. 36, no. 5, 1 May 1993 (1993-05-01), pages 27/28, XP000408898 *
MINAGAWA H ET AL: "TACTILE-AUDIO DIAGRAM FOR BLIND PERSONS", IEEE TRANSACTIONS ON REHABILITATION ENGINEERING, vol. 4, no. 4, December 1996 (1996-12-01), pages 431 - 436, XP000636716 *
PATENT ABSTRACTS OF JAPAN vol. 095, no. 006 31 July 1995 (1995-07-31) *
PATENT ABSTRACTS OF JAPAN vol. 096, no. 005 31 May 1996 (1996-05-31) *
PATENT ABSTRACTS OF JAPAN vol. 097, no. 010 31 October 1997 (1997-10-31) *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2790567A1 (en) * 1999-03-02 2000-09-08 Philippe Soulie KEYBOARD FOR TOUCH-BASED READING OF INFORMATION FROM AN ELECTRONIC COMPUTER
WO2000052665A1 (en) * 1999-03-02 2000-09-08 Philippe Soulie Tactile reading system for data coming from a computer and associated communication device
FR2790578A1 (en) * 1999-03-02 2000-09-08 Philippe Soulie Sign language for communicating with another person or electronic calculator has series of main keys on side not facing user and reading keys transmitting tactile sensations to fingers of user
US6639510B1 (en) 1999-03-02 2003-10-28 Philippe Soulie Tactile reading system for data coming from a computer and associated communication device
WO2000068917A1 (en) * 1999-05-10 2000-11-16 Vincent Hayward Electro-mechanical transducer suitable for tactile display and article conveyance
US6693516B1 (en) 1999-05-10 2004-02-17 Vincent Hayward Electro-mechanical transducer suitable for tactile display and article conveyance
US6693622B1 (en) 1999-07-01 2004-02-17 Immersion Corporation Vibrotactile haptic feedback devices
US9492847B2 (en) 1999-09-28 2016-11-15 Immersion Corporation Controlling haptic sensations for vibrotactile feedback interface devices
GB2358514A (en) * 2000-01-21 2001-07-25 Peter Nigel Bellamy Electronic braille reader
US6445284B1 (en) 2000-05-10 2002-09-03 Juan Manuel Cruz-Hernandez Electro-mechanical transducer suitable for tactile display and article conveyance
EP1179816A3 (en) * 2000-08-09 2005-12-21 Laurel Precision Machines Co. Ltd. Information input/output device for visually impaired users
US7051292B2 (en) 2000-08-09 2006-05-23 Laurel Precision Machines Co., Ltd. Information input/output device for visually impaired users
EP1179816A2 (en) * 2000-08-09 2002-02-13 Laurel Precision Machines Co. Ltd. Information input/output device for visually impaired users
US9625905B2 (en) 2001-03-30 2017-04-18 Immersion Corporation Haptic remote control for toys
US9245428B2 (en) 2012-08-02 2016-01-26 Immersion Corporation Systems and methods for haptic remote control gaming
US9753540B2 (en) 2012-08-02 2017-09-05 Immersion Corporation Systems and methods for haptic remote control gaming

Also Published As

Publication number Publication date
GB9701102D0 (en) 1997-03-12
CA2249415A1 (en) 1998-07-23
AU5672398A (en) 1998-08-07
EP0917699A1 (en) 1999-05-26

Similar Documents

Publication Publication Date Title
Ducasse et al. Accessible interactive maps for visually impaired users
US5736978A (en) Tactile graphics display
JP4567817B2 (en) Information processing apparatus and control method thereof
US4464118A (en) Didactic device to improve penmanship and drawing skills
Bolt “Put-that-there” Voice and gesture at the graphics interface
US5287102A (en) Method and system for enabling a blind computer user to locate icons in a graphical user interface
US6802717B2 (en) Teaching method and device
US8228298B2 (en) Method and devices of transmitting tactile information description
US20060024647A1 (en) Method and apparatus for communicating graphical information to a visually impaired person using haptic feedback
WO1998032112A1 (en) Sensory communication apparatus
Rigas et al. The rising pitch metaphor: an empirical study
Brock Interactive maps for visually impaired people: design, usability and spatial cognition
GB2311888A (en) Tactile communication system
JP4736605B2 (en) Display device, information processing device, and control method thereof
US4594683A (en) Apparatus for fixing a coordinate point within a flat data representation
KR100312750B1 (en) Virtual musical performance apparatus and method thereof using sensor
US3740446A (en) Perception apparatus for the blind
Semwal et al. Virtual environments for visually impaired
Golledge et al. Multimodal interfaces for representing and accessing geospatial information
Bustoni et al. Multidimensional Earcon Interaction Design for The Blind: a Proposal and Evaluation
JPH11501740A (en) Man / machine interface for computing devices
JPH05174074A (en) Page-turning device
Karshmer et al. Equal access to information for all: making the world of electronic information more accessible to the handicapped in our society
Parker Assessment of Access Methods for Mobile Maps for Individuals Who are Blind or Visually Impaired
Ávila Soto Interactive tactile representations to support document accessibility for people with visual impairments

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

ENP Entry into the national phase

Ref document number: 2249415

Country of ref document: CA

Ref country code: CA

Ref document number: 2249415

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 09142973

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 1998900916

Country of ref document: EP

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWP Wipo information: published in national office

Ref document number: 1998900916

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1998900916

Country of ref document: EP