The present specification generally relates to tactile display devices and, more particularly, to tactile display devices capable of displaying tactile topographical information to blind or visually impaired users.
Blind or visually impaired persons may find it difficult to navigate within their environment. Aid devices such as a cane may provide a visually impaired person with haptic feedback regarding objects that are within his or her vicinity. A guide dog may be used to assist in guiding a blind or visually impaired person through the environment. However, even with such aids, it may be very difficult for a blind or visually impaired person to form an understanding of the objects within the environment, such as the locations of people, obstacles, and signs.
Accordingly, a need exists for devices that provide blind or visually impaired people with environmental information in a manner that is not reliant on human vision.
In one embodiment, a tactile display device includes a housing having a first surface, a tactile display located at the first surface, a camera, a processor, and a non-transitory memory device. The tactile display is configured to produce a plurality of raised portions defining a tactile message. The camera is configured to generate image data corresponding to an environment. The processor is disposed within the housing and communicatively coupled to the tactile display and the camera. The non-transitory memory device stores machine-readable instructions that, when executed by the processor, cause the processor to generate a topographical map of objects within the environment from the image data received from the camera, generate tactile display data corresponding to the topographical map, and provide the tactile display data to the tactile display such that the tactile display produces the plurality of raised portions to form the tactile message.
In another embodiment, a tactile display device includes a housing having a first surface and a second surface that is opposite from the first surface, a tactile display located at the first surface of the housing, a touch-sensitive input region disposed on a surface of the tactile display, an input device disposed at the second surface of the housing, a camera, a processor, and a non-transitory memory device. The tactile display is configured to produce a plurality of raised portions defining a tactile message. The camera is configured to generate image data corresponding to an environment. The non-transitory memory device stores machine-readable instructions that, when executed by the processor, cause the processor to receive a user input from the input device or the touch-sensitive input region, analyze the image data to determine a class of objects within the environment, wherein the user input indicates a desired class of objects for display in the tactile message, generate a topographical map of objects having the desired class according to the user input, generate tactile display data corresponding to the topographical map, and provide the tactile display data to the tactile display such that the tactile display produces the plurality of raised portions to form the tactile message.
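While the embodiments above are described at a high level, the recited data flow can be illustrated in a non-limiting sketch. The following example, written in Python for concreteness, uses hypothetical names (DetectedObject, generate_topographical_map, generate_tactile_display_data) and hypothetical grid dimensions that do not appear in, and do not limit, the embodiments described herein:

    # Minimal sketch of the recited pipeline; all names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class DetectedObject:
        label: str   # object class, e.g. "person" or "empty seat"
        x: float     # lateral position in the environment (meters)
        z: float     # distance from the user (meters)

    def generate_topographical_map(objects, desired_class):
        # Keep only objects of the class indicated by the user input.
        return [obj for obj in objects if obj.label == desired_class]

    def generate_tactile_display_data(topo_map, rows=30, cols=40, extent=10.0):
        # Rasterize object positions into a binary frame; 1 = raised portion.
        frame = [[0] * cols for _ in range(rows)]
        for obj in topo_map:
            col = min(cols - 1, max(0, int((obj.x + extent / 2) / extent * cols)))
            row = min(rows - 1, max(0, int(obj.z / extent * rows)))
            frame[row][col] = 1
        return frame

    objects = [DetectedObject("person", -1.5, 3.0), DetectedObject("table", 2.0, 4.5)]
    frame = generate_tactile_display_data(generate_topographical_map(objects, "person"))

In this sketch, the binary frame plays the role of the tactile display data, with each nonzero entry corresponding to a raised portion of the tactile message.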
These and additional features provided by the embodiments described herein will be more fully understood in view of the following detailed description, in conjunction with the drawings.
The embodiments set forth in the drawings are illustrative and exemplary in nature and not intended to limit the subject matter defined by the claims. The following detailed description of the illustrative embodiments can be understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:
The processor 130 of the tactile display device 100 may be any device capable of executing machine-readable instructions. Accordingly, the processor 130 may be a controller, an integrated circuit, a microchip, a computer, or any other computing device. The processor 130 is communicatively coupled to the other components of the tactile display device 100 by the communication path 120. Accordingly, the communication path 120 may communicatively couple any number of processors with one another, and allow the components coupled to the communication path 120 to operate in a distributed computing environment. Specifically, each of the components may operate as a node that may send and/or receive data. While the depicted embodiment includes a single processor 130, other embodiments may include more than one processor.
The tactile display 134 is coupled to the communication path 120 and communicatively coupled to the processor 130. The tactile display 134 may be any device capable of providing tactile output in the form of refreshable tactile messages. A tactile message conveys information to a user by touch. For example, a tactile message may be in the form of a tactile writing system, such as Braille. A tactile message may also be in the form of any shape, such as the shape of an object detected in the environment. A tactile message may be a topographic map of an environment.
Any known or yet-to-be-developed tactile display may be used. In some embodiments, the tactile display 134 is a three-dimensional tactile display including a surface, portions of which may be raised to communicate information. The raised portions may be actuated mechanically in some embodiments (e.g., mechanically raised and lowered pins). The tactile display 134 may also be fluidly actuated, or it may be configured as an electrovibration tactile display. The tactile display 134 is configured to receive tactile display data, and produce a tactile message accordingly. It is noted that the tactile display 134 can include at least one processor and/or memory module.
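For the mechanically actuated variant, the relationship between the tactile display data and the raised portions may be modeled as a pin matrix that is refreshed frame by frame. The following is a minimal illustrative sketch; the PinMatrixDisplay class and its dimensions are assumptions rather than an actual device driver:

    # Illustrative model of a mechanically actuated pin-matrix tactile display.
    class PinMatrixDisplay:
        def __init__(self, rows, cols):
            self.rows, self.cols = rows, cols
            self.pins = [[0] * cols for _ in range(rows)]  # 0 = lowered, 1 = raised

        def refresh(self, tactile_display_data):
            # Raise or lower each tactile pixel according to the incoming frame.
            for r in range(self.rows):
                for c in range(self.cols):
                    self.pins[r][c] = 1 if tactile_display_data[r][c] else 0

    display = PinMatrixDisplay(rows=30, cols=40)
    display.refresh([[0] * 40 for _ in range(30)])  # blank tactile message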
The inertial measurement unit 136 is coupled to the communication path 120 and communicatively coupled to the processor 130. The inertial measurement unit 136 may include one or more accelerometers and one or more gyroscopes. The inertial measurement unit 136 transforms sensed physical movement of the tactile display device 100 into a signal indicative of an orientation, a rotation, a velocity, or an acceleration of the tactile display device 100. As an example and not a limitation, the tactile message displayed by the tactile display 134 may depend on an orientation of the tactile display device 100 (e.g., whether the tactile display device 100 is horizontal, tilted, and the like). Some embodiments of the tactile display device 100 may not include the inertial measurement unit 136, such as embodiments that include an accelerometer but not a gyroscope, embodiments that include a gyroscope but not an accelerometer, or embodiments that include neither an accelerometer nor a gyroscope.
The speaker 140 (i.e., an audio output device) is coupled to the communication path 120 and communicatively coupled to the processor 130. The speaker 140 transforms audio message data from the processor 130 of the tactile display device 100 into mechanical vibrations producing sound. For example, the speaker 140 may provide to the user navigational menu information, setting information, status information, information regarding the environment as detected by image data from the one or more cameras 144, and the like. However, it should be understood that, in other embodiments, the tactile display device 100 may not include the speaker 140.
The microphone 142 is coupled to the communication path 120 and communicatively coupled to the processor 130. The microphone 142 may be any device capable of transforming a mechanical vibration associated with sound into an electrical signal indicative of the sound. The microphone 142 may be used as an input device 138 to perform tasks, such as navigating menus and inputting settings and parameters. It should be understood that some embodiments may not include the microphone 142.
The network interface hardware 146 is coupled to the communication path 120 and communicatively coupled to the processor 130. The network interface hardware 146 may be any device capable of transmitting and/or receiving data via a network 170. Accordingly, the network interface hardware 146 can include a communication transceiver for sending and/or receiving any wired or wireless communication. For example, the network interface hardware 146 may include an antenna, a modem, a LAN port, a Wi-Fi card, a WiMAX card, mobile communications hardware, near-field communication hardware, satellite communication hardware, and/or any wired or wireless hardware for communicating with other networks and/or devices. In one embodiment, the network interface hardware 146 includes hardware configured to operate in accordance with the Bluetooth wireless communication protocol. In another embodiment, the network interface hardware 146 may include a Bluetooth send/receive module for sending and receiving Bluetooth communications to/from a portable electronic device 180. The network interface hardware 146 may also include a radio frequency identification (“RFID”) reader configured to interrogate and read RFID tags.
In some embodiments, the tactile display device 100 may be communicatively coupled to a portable electronic device 180 via the network 170. In some embodiments, the network 170 is a personal area network that utilizes Bluetooth technology to communicatively couple the tactile display device 100 and the portable electronic device 180. In other embodiments, the network 170 may include one or more computer networks (e.g., a personal area network, a local area network, or a wide area network), cellular networks, satellite networks and/or a global positioning system and combinations thereof. Accordingly, the tactile display device 100 can be communicatively coupled to the network 170 via wires, via a wide area network, via a local area network, via a personal area network, via a cellular network, via a satellite network, or the like. Suitable local area networks may include wired Ethernet and/or wireless technologies such as, for example, wireless fidelity (Wi-Fi). Suitable personal area networks may include wireless technologies such as, for example, IrDA, Bluetooth, Wireless USB, Z-Wave, ZigBee, and/or other near field communication protocols. Suitable personal area networks may similarly include wired computer buses such as, for example, USB and FireWire. Suitable cellular networks include, but are not limited to, technologies such as LTE, WiMAX, UMTS, CDMA, and GSM.
The tactile feedback device 148 is coupled to the communication path 120 and communicatively coupled to the processor 130. The tactile feedback device 148 may be any device capable of providing tactile feedback to a user. The tactile feedback device 148 may include a vibration device (such as in embodiments in which tactile feedback is delivered through vibration), an air blowing device (such as in embodiments in which tactile feedback is delivered through a puff of air), or a pressure generating device (such as in embodiments in which the tactile feedback is delivered through generated pressure). It should be understood that some embodiments may not include the tactile feedback device 148.
The location sensor 150 is coupled to the communication path 120 and communicatively coupled to the processor 130. The location sensor 150 may be any device capable of generating an output indicative of a location. In some embodiments, the location sensor 150 includes a global positioning system (GPS) sensor, though embodiments are not limited thereto. Some embodiments may not include the location sensor 150, such as embodiments in which the tactile display device 100 does not determine a location of the tactile display device 100 or embodiments in which the location is determined in other ways (e.g., based on information received from the camera 144, the microphone 142, the network interface hardware 146, the proximity sensor 154, the inertial measurement unit 136 or the like). The location sensor 150 may also be configured as a wireless signal detection device capable of triangulating a location of the tactile display device 100 and the user by way of wireless signals received from one or more wireless signal antennas.
The proximity sensor 154 is coupled to the communication path 120 and communicatively coupled to the processor 130. The proximity sensor 154 may be any device capable of outputting a proximity signal indicative of a proximity of the tactile display device 100 to another object. In some embodiments, the proximity sensor 154 may include a laser scanner, a capacitive displacement sensor, a Doppler effect sensor, an eddy-current sensor, an ultrasonic sensor, a magnetic sensor, an optical sensor, a radar sensor, a sonar sensor, or the like. Some embodiments may not include the proximity sensor 154, such as embodiments in which the proximity of the tactile display device 100 to an object is determined from inputs provided by other sensors (e.g., the camera 144, the speaker 140, etc.) or embodiments that do not determine a proximity of the tactile display device 100 to an object.
The temperature sensor 156 is coupled to the communication path 120 and communicatively coupled to the processor 130. The temperature sensor 156 may be any device capable of outputting a temperature signal indicative of a temperature sensed by the temperature sensor 156. In some embodiments, the temperature sensor 156 may include a thermocouple, a resistive temperature device, an infrared sensor, a bimetallic device, a change of state sensor, a thermometer, a silicon diode sensor, or the like. Some embodiments of the tactile display device 100 may not include the temperature sensor 156.
The housing 110 gives the example tactile display device 100 a tablet-shaped form factor. It should be understood that embodiments of the present disclosure are not limited to the configuration of the tactile display device 100, and that the depicted example tactile display device is merely illustrative.
Depending on the type of tactile display 134, the raised portions 135 may be made up of a plurality of tactile pixels (e.g., individual pins or pockets of fluid). The tactile pixels may be raised and lowered according to the tactile display data to produce the tactile message 137. As stated above, the tactile message 137 may be related to anything of interest to the user, such as a topographic map of the environment, the location of specific types of objects in the environment, a tactile representation of an object, symbols, Braille text of documents, and the like. In some embodiments, each raised portion 135 may be a representation of an object that is within the environment. As a non-limiting example, one or more of the raised portions 135 may include a Braille message that describes the particular object (e.g., the class of the object, a person's name, and the like).
The format of the tactile message 137 may be customizable depending on the preferences of the user. For example, the individual raised portions 135 may be spatially positioned within the tactile message 137 based on their location in the environment.
Several components may be provided in the bezel 117, such as the microphone 142, the speaker 140, and the input devices 138A, 138B.
In the illustrated embodiment, the input devices 138 are provided within the bezel 117 of the housing 110. The input devices 138A, 138B may be configured as one or more touch-sensitive regions in which the user may provide input to the tactile display device 100 as well as navigate menus, for example. The touch-sensitive regions may be formed by a touch-sensitive film, in some embodiments. However, as stated above, any type of input device may be provided including, but not limited to, buttons, mechanical switches, and pressure switches. It should be understood that embodiments are not limited to the number and placement of the input devices 138A, 138B.
In the illustrated embodiment, a camera assembly is defined by a first camera 144A and a second camera 144B. In other embodiments, only a single camera 144 may be provided. The first and second cameras 144A, 144B may each capture image data (i.e., digital images) of the environment. As described in more detail below, the image data is provided to the processor 130 to create a topographical map of the environment, which is then provided to the user as a tactile message 137 by the tactile display 134. The image data from each of the first camera 144A and the second camera 144B (i.e., a first image and a second image) may be combined to create a stereoscopic image from which depth information is extracted. The tactile message 137 may provide such depth information to the user.
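As a non-limiting illustration of how the depth information may be extracted from the first and second images, the following sketch applies block-matching stereo disparity using the open-source OpenCV library and recovers depth as focal length times baseline divided by disparity; the file names and calibration values shown are assumed for illustration only:

    # Sketch of stereo depth extraction; calibration values are assumed.
    import cv2
    import numpy as np

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # image from camera 144A
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # image from camera 144B

    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point scale

    FOCAL_PX = 700.0    # focal length in pixels (hypothetical calibration)
    BASELINE_M = 0.06   # spacing between the two cameras (hypothetical)

    valid = disparity > 0
    depth_m = np.zeros_like(disparity)
    depth_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]  # Z = f * B / d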
In some embodiments, a light 155 (e.g., a flash or a continuously illuminated light, such as one or more light emitting diodes) may be provided at the rear surface 113 to illuminate the environment when the first and second cameras 144A, 144B capture image data. It should be understood that in some embodiments the rear surface light 155 may not be provided.
In the illustrated embodiment, the proximity sensor 154 is provided at the rear surface 113 of the tactile display device 100. As described above, the proximity sensor may provide information as to the proximity of the tactile display device 100 to an object. Such proximity information may be used to generate the topographical map that is displayed in the tactile message 137.
The illustrated tactile display device 100 comprises a kickstand 112 at the rear surface 113. The kickstand 112 may be used to keep the tactile display device 100 in an upright position when placed on a surface, such as a table or desk.
A user of the tactile display device 100 may take a picture of his or her environment with the tactile display device 100. For example, the user may control the tactile display device 100 using one or more input devices 138 (and/or the microphone 142) to take a picture (i.e., capture image data) with the first and second cameras 144A, 144B (or single camera 144). The user may also input preferences using the one or more input devices 138 (and/or the microphone 142) regarding the class or type of objects that he or she wishes to display in the tactile display 134. For example, the user may desire to gain insight with respect to one or more particular types of objects in his or her environment. Example classes of objects include, but are not limited to, people, tables, empty seats, doorways, walls, restrooms, and water fountains. Accordingly, only those objects meeting one of the selected classes will be displayed in the tactile message 137.
The image data may be a single image from each of the first and second cameras 144A, 144B or a plurality of sequential images. The image data captured by the first and second cameras 144A, 144B may be provided to the processor 130, which then analyzes the image data. One or more object recognition algorithms may be applied to the image data to extract objects having the particular class selected by the user. Any known or yet-to-be-developed object recognition algorithms may be used to extract the objects from the image data. Example object recognition algorithms include, but are not limited to, scale-invariant feature transform (“SIFT”), speeded up robust features (“SURF”), and edge-detection algorithms. Any known or yet-to-be-developed facial recognition algorithms may also be applied to the image data to detect particular people within the environment. For example, the user may input the names of particular people he or she would like to detect. Data regarding the facial features of people may be stored in the memory module 132 and accessed by the facial recognition algorithms when analyzing the image data. The object recognition algorithms and facial recognition algorithms may be embodied as software stored in the memory module 132, for example.
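As one non-limiting possibility, the SIFT algorithm named above may be used to match a stored exemplar image of an object class against the captured scene using the open-source OpenCV library. In the following sketch, the file names, ratio-test constant, and detection threshold are illustrative assumptions:

    # Sketch of SIFT-based object recognition; threshold and files are assumed.
    import cv2

    scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)          # captured image data
    template = cv2.imread("empty_seat.png", cv2.IMREAD_GRAYSCALE)  # stored exemplar

    sift = cv2.SIFT_create()
    kp_t, des_t = sift.detectAndCompute(template, None)
    kp_s, des_s = sift.detectAndCompute(scene, None)

    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(des_t, des_s, k=2)

    # Lowe's ratio test keeps only distinctive keypoint matches.
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    if len(good) > 20:  # assumed detection threshold
        print("object of the selected class detected in the scene")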
The objects extracted from the image data may be utilized by the processor 130 to generate a topographical map of the user's environment. A topographical map is a map that provides spatial information regarding objects that are in the user's environment. For example, the topographical map may indicate the presence and position of particular objects, such as empty seats, doorways, tables, people, and the like.
In some embodiments, the tactile display device 100 is configured to extract text that is present in the image data. For example, the tactile display device 100 may detect the text of signs that are present within the user's environment. The processor 130, using a text-detection algorithm (e.g., optical character recognition), may detect and extract any text from the image data for inclusion in the tactile message 137. As an example and not a limitation, the image data may have captured an “EXIT” sign in the environment. The processor 130 may detect and extract the word and location of the “EXIT” sign in the environment and generate the topographical map accordingly. The tactile message 137 may then indicate the presence and location of the “EXIT” sign to the user.
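The sign-reading step may be sketched with the open-source Tesseract engine via the pytesseract package, which is one possible optical character recognition backend rather than a required one; the confidence cutoff below is an assumption:

    # Sketch of text detection for signs; pytesseract is one possible OCR backend.
    import cv2
    import pytesseract

    image = cv2.imread("scene.png")
    data = pytesseract.image_to_data(image, output_type=pytesseract.Output.DICT)

    for text, left, top, conf in zip(data["text"], data["left"], data["top"],
                                     map(float, data["conf"])):
        word = text.strip()
        if word and conf > 60:  # assumed confidence cutoff
            # The word and its pixel location can be placed on the topographical map.
            print(f"detected '{word}' at ({left}, {top})")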
As stated above, information extracted from image data may also be converted to auditory data that is sent to the speaker 140 for playback of an audio message. As non-limiting examples, the speaker 140 may produce an auditory message regarding the number of empty seats in the room, or the presence of a particular person. The auditory message may provide any type of information to the user.
In some embodiments, topographical map information may be stored in the memory module 132 or stored remotely and accessible via the network interface hardware 146 and network 170. For example, the topographical map information may be stored on a portable electronic device 180 or on a remote server maintained by a third party map data provider.
The topographical map information may be based on a location of a user, or based on another location inputted by the user. The location of the tactile display device 100, and therefore of the user, may be determined by any method. For example, the location sensor 150 may be used to determine the location of the user (e.g., by a GPS sensor). Wireless signals, such as cellular signals, Wi-Fi signals, and Bluetooth® signals, may also be used to determine the location of the user.
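As a non-limiting illustration of determining location from wireless signals, received signal strengths may be converted into distance estimates using a path-loss model, and the distances then trilaterated. In the following self-contained sketch, the access-point positions, transmit power, and path-loss exponent are assumed values:

    # Sketch of position estimation from wireless signals; the access-point
    # positions and path-loss parameters below are assumed for illustration.
    import math

    def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
        # Log-distance path-loss model: rssi = tx_power - 10 * n * log10(d).
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

    def trilaterate(p1, r1, p2, r2, p3, r3):
        # Solve the linearized circle-intersection equations for (x, y).
        ax, ay = p1; bx, by = p2; cx, cy = p3
        A = 2 * (bx - ax); B = 2 * (by - ay)
        C = r1**2 - r2**2 - ax**2 + bx**2 - ay**2 + by**2
        D = 2 * (cx - bx); E = 2 * (cy - by)
        F = r2**2 - r3**2 - bx**2 + cx**2 - by**2 + cy**2
        x = (C * E - F * B) / (E * A - B * D)
        y = (C * D - A * F) / (B * D - A * E)
        return x, y

    d1, d2, d3 = (rssi_to_distance(r) for r in (-52.0, -60.0, -58.0))
    print(trilaterate((0, 0), d1, (10, 0), d2, (0, 10), d3))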
The topographical map information may include data relating to external maps, such as roads, footpaths, buildings, and the like. The topographical map information may also include data relating to interior spaces of buildings (e.g., location of rooms, doorways, walls, etc.). The topographical map information may provide additional information regarding the user's environment beyond the objects extracted from the image data.
The processor 130 may access the topographical map information when generating the topographical map. The topographical map may comprise any combination of objects extracted from image data and/or the topographical map information.
In some embodiments, the tactile message 137 displayed on the tactile display 134 provides a navigational route from a first location to a second location. For example, the tactile display device 100 may be configured to generate a tactile map including obstacles and a navigational route in the form of tactile arrows or lines that indicate to the user the path to follow. The navigational route may also be provided in the tactile message as Braille text providing directions. Accordingly, the tactile display of navigation route information may take on many forms.
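As a non-limiting illustration, a navigational route of the kind described may be computed directly on the grid that backs the tactile message, for example with a breadth-first search around obstacle cells, after which the route cells are raised to form the tactile line. The grid contents below are assumed for illustration:

    # Sketch of route generation on the tactile grid; 1 = obstacle cell.
    from collections import deque

    def find_route(grid, start, goal):
        # Breadth-first search over free cells; returns the list of cells to raise.
        rows, cols = len(grid), len(grid[0])
        parent = {start: None}
        queue = deque([start])
        while queue:
            cell = queue.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = parent[cell]
                return path[::-1]
            r, c = cell
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                        and (nr, nc) not in parent:
                    parent[(nr, nc)] = cell
                    queue.append((nr, nc))
        return []  # no route found

    grid = [[0, 0, 0], [1, 1, 0], [0, 0, 0]]
    print(find_route(grid, (0, 0), (2, 0)))  # cells to raise, tracing the route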
In some embodiments, the tactile display device 100 may be configured to translate written text into Braille or another tactile writing system. In this manner, the user of the tactile display device 100 may be able to read written text. As an example and not a limitation, a user may take a picture of a page of text using the tactile display device 100. Using optical character recognition, the tactile display device 100 (e.g., using the processor 130 and/or other hardware) may extract or otherwise determine the text from the image data. A tactile representation of the extracted text may then be provided by the tactile message 137 (e.g., Braille text).
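The text-to-Braille conversion may be sketched as a lookup from each character to a six-dot cell. The following sketch uses Unicode Braille patterns for compactness and covers only a partial Grade 1 letter table; a complete implementation would also handle numbers, punctuation, and contractions:

    # Partial Grade 1 Braille table: dot numbers (1-6) raised for each letter.
    BRAILLE_DOTS = {
        "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
        "i": (2, 4), "t": (2, 3, 4, 5), "x": (1, 3, 4, 6),
    }

    def to_braille(text):
        # Unicode Braille patterns start at U+2800; bit k-1 encodes dot k.
        cells = []
        for ch in text.lower():
            dots = BRAILLE_DOTS.get(ch, ())
            cells.append(chr(0x2800 + sum(1 << (d - 1) for d in dots)))
        return "".join(cells)

    print(to_braille("exit"))  # ⠑⠭⠊⠞ -- e.g., for the detected "EXIT" sign

Each returned cell maps directly onto a two-by-three block of tactile pixels on the tactile display 134.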
In some embodiments, the inertial measurement unit 136 may be included in the tactile display device 100 for additional functionality. The auditory and/or tactile output of the tactile display device 100 may depend on an orientation of the tactile display device 100 as detected by the inertial measurement unit 136. As an example and not a limitation, when the tactile display device 100 is oriented in a horizontal orientation with respect to the ground, the tactile display device 100 may preemptively initiate the optical character recognition process without user input because of the high likelihood that the user is taking a picture of text when the tactile display device 100 is in this orientation. Similarly, when the user is holding the tactile display device 100 in a non-horizontal position (e.g., a vertical position), the tactile display device 100 may preemptively capture image data and initiate the object recognition algorithm(s) because of the high likelihood that the user is taking a picture of his or her environment.
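The orientation test described above may be implemented by comparing the accelerometer's gravity vector with the device normal. In the following sketch, the 30-degree threshold and the mode names are illustrative assumptions:

    # Sketch of orientation-based mode selection; the 30-degree threshold is assumed.
    import math

    def select_mode(accel_xyz):
        # When the device lies flat, gravity is mostly along the device z axis.
        ax, ay, az = accel_xyz
        g = math.sqrt(ax * ax + ay * ay + az * az)
        tilt_deg = math.degrees(math.acos(min(1.0, abs(az) / g)))
        return "ocr" if tilt_deg < 30.0 else "object_recognition"

    print(select_mode((0.1, 0.2, 9.7)))  # roughly horizontal -> "ocr"
    print(select_mode((0.1, 9.7, 0.4)))  # roughly vertical -> "object_recognition"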
As one illustrative example of use, a user may capture image data of a room occupied by several people. The captured image data is then analyzed to detect the presence and location of people within the room. A topographical map is generated from the image data that includes the people within the room. The topographical map is converted into tactile display data that is provided to the tactile display 134, which then displays the tactile message 137 accordingly.
It should now be understood that embodiments described herein are directed to tactile message devices capable of providing tactile information about a visually impaired user's environment. Embodiments of the present disclosure capture image data of a user's environment, detect objects from the captured image data, and display a topographical map in accordance with the detected objects in a tactile message provided on a tactile display. In this manner, a blind or visually impaired user may determine the presence and location of desired objects within his or her environment. Embodiments may also provide audio messages regarding the user's environment, as well as convert written text to Braille or another tactile writing system.
While particular embodiments have been illustrated and described herein, it should be understood that various other changes and modifications may be made without departing from the spirit and scope of the claimed subject matter. Moreover, although various aspects of the claimed subject matter have been described herein, such aspects need not be utilized in combination. It is therefore intended that the appended claims cover all such changes and modifications that are within the scope of the claimed subject matter.