Display technologies that allow people to see a three-dimensional digital world have advanced greatly in the last decade. Touchable displays with tactile feedback exist, but their features are fairly limited compared to today's visual displays. In this sense, the development of haptic technologies that simulate the sense of touching a three-dimensional digital world lags behind.
There is, therefore, a need for solutions that improve the function of haptic output devices.
Now there has been invented an improved method and technical equipment implementing the method, by which the above problems are alleviated. Various aspects of the invention include a method, an apparatus, a server, a client and a computer readable medium comprising a computer program stored therein, which are characterized by what is stated in the independent claims. Various embodiments of the invention are disclosed in the dependent claims.
The invention relates to a method, apparatus and system for producing haptic output. “Haptic” may be understood here as an interface that enables interaction with the user through the sense of touch. Data of a plurality of objects of a model are received. The data of the plurality of objects comprise information of the dimensions of the objects and the properties of the objects. Haptic instructions for a haptic output device are received for producing haptic output of the properties, and haptic output for the objects is produced in accordance with these instructions. One or more mappings between properties of virtual objects and target haptic outputs may be formed into a haptic data structure, the haptic data structure comprising a plurality of haptic instructions indicative of mappings between properties and haptic outputs, the haptic data structure being configured for use in haptic output related to objects when a user is determined to interact with (e.g. touch or point to) said objects. A data structure for controlling haptic output of a device may comprise one or more mappings between virtual reality model object properties and target haptic outputs, and one or more haptic instructions, the haptic instructions being intended for controlling a device to produce a defined haptic output for an object having a defined property.
In other words, the model and its objects (e.g. their dimensions) and their properties may be described e.g. in one data structure or data file, for example a three-dimensional city map. The desired haptic output corresponding to different properties of the model may be described in another data structure or data file, or in a plurality of data structures and data files. In this manner, the model may be displayed visually to the user by using the information on the dimensions of the objects, colour information and reflectance information. When a user interacts with one of these objects, e.g. by touching it, a haptic output may be produced by using the defined haptic output for the object or the object part that has been touched. In this manner, the model and the objects and their properties may be separated from the actual haptic output produced for the objects. For example, the haptic commands for producing haptic output need not be part of the model description or reside in the same data structure or file. Also, the haptic output may be modified separately from the model and its objects.
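As a minimal sketch of this separation (the data layout and all values here are illustrative assumptions, not defined by the specification), the two parts could be expressed as:

```python
# Hypothetical sketch: the model and the haptic mapping live in
# separate data structures, as described above.

# Model data: object dimensions and properties, but no haptic commands.
model = {
    "objects": [
        {"id": "building_1", "dimensions": (20, 30, 50),
         "properties": ["rough concrete"]},
        {"id": "street_1", "dimensions": (6, 0, 200),
         "properties": ["hot"]},
    ]
}

# Haptic data structure: mappings from properties to target haptic outputs.
haptic_data_structure = {
    "rough concrete": {"modality": "vibration", "magnitude": 3},
    "hot": {"modality": "temperature", "celsius": 45},
}
```

Because the mapping lives outside the model, the same model can be rendered haptically in different ways simply by swapping the haptic data structure.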
In the following, various embodiments of the invention will be described in more detail with reference to the appended drawings.
In the following, several examples will be described in the context of producing haptic output related to a model comprising objects, for example a virtual reality model such as a city model. It is to be noted, however, that the invention is not limited to such models or to a specific type of model. In fact, the different embodiments have applications in any environment where producing haptic output is required. For example, the described haptic data structure may be used to control haptic output in any device or system, so that a property or item is mapped to a certain haptic output with the help of the haptic data structure and haptic output is produced accordingly.
There may be a number of servers connected to the network, and in the example of
There are also a number of user devices such as mobile phones 126 and smart phones or Internet access devices (Internet tablets) 128, and personal computers 116 of various sizes and formats. These devices 116, 126 and 128 can also be made of multiple parts. The various devices may be connected to the networks 110 and 120 via communication connections such as a fixed connection to the internet, a wireless connection to the internet, a fixed connection to the mobile network 120, and a wireless connection to the mobile network 120. The connections are implemented by means of communication interfaces at the respective ends of the communication connection.
There may also be a user device 150 for producing haptic output, i.e., comprising or being functionally connected to a module for producing haptic output. In this context, a user device may be understood to comprise functionality and to be accessible to a user such that the user can control its operation directly. For example, the user may be able to power the user device on and off. In other words, the user device may be understood to be locally controllable by a user (a person other than an operator of a network), either directly by pushing buttons or otherwise physically touching the device, or by controlling the device over a local communication connection such as Ethernet, Bluetooth or WLAN.
The haptic controller may be arranged to receive haptic instructions generated by the processor, or the haptic controller may itself produce such haptic instructions for the haptic output device. Such instructions may be created from properties of objects of a model by mapping a haptic output to a property. In this manner, for example, tactile feedback may be produced to create a haptic understanding of digitized three-dimensional models of cities. A new way of remote sensing may be provided for people to detect other properties such as the texture and even the temperature of the model objects.
Technologies exist for creating digital 3D models of anything from small objects to large ones such as cities. Typically, visualization techniques are used to display 3D landscapes and city models. Other multimedia content (e.g. auditory data) can be used alongside to provide a more enhanced and holistic understanding of the real world. Together with other sensing techniques, not only can we see the digitized world, but also touch and feel it. This extends our sense of the digitized world and helps in situations where a vision-only solution is insufficient or inapplicable, e.g. for vision-impaired people.
In section 220, with the help of classification and sensor technologies, other human-sense-related data (properties of model objects), such as the material and temperature (live or statistical) of a given region, may be obtained. Such additional information may be incorporated into the description language of the 3D structure to obtain a new multi-modality language. Such a language may be rendered by a device to reproduce the world in the forms of a shape display and thermal rendering, that is, as haptic output. “Semantics” may in this context be understood to comprise the description of properties of model objects for haptic output.
A semantic-aware tactile (haptic) sensing device 200 may comprise a multimodality semantic mixer 230 and a haptic rendering engine 240. For example, according to the semantics of 3D map data (e.g. tree, glass wall, buildings), or any other data on object properties, the multimodality semantic mixer 230 converts the property data into a format that can be rendered on the haptic rendering device. In converting the property data to a multimodal data structure 250, a semantic-aware conversion table or other mapping may be used. Semantic-aware conversion lookup tables 235 define different ways of converting map data, e.g. how to map 3D depth information into haptic vibration magnitudes or, alternatively, how to map pixel colour into different haptic temperatures, and so on. The haptic rendering engine 240 may then render the multimodal data into haptic feedback such as 3D shapes, vibration, temperatures (thermal rendering), or a combination of these. To produce different temperatures, a thermal rendering component 245 may be used, where temperatures may be varied e.g. by electric heating and/or by cooling with fans or liquid cooling.
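A minimal sketch of such a conversion, assuming a simple dictionary-based lookup table (the semantics and all parameter values are illustrative assumptions):

```python
# Hypothetical semantic-aware conversion table (cf. 235): maps map-data
# semantics to parameters the haptic rendering engine (240) can render.
CONVERSION_TABLE = {
    "tree":       {"vibration": 1, "temperature_c": 20},
    "glass wall": {"vibration": 0, "temperature_c": 15},
    "building":   {"vibration": 2, "temperature_c": 22},
}

def mix_semantics(map_objects):
    """Sketch of the multimodality semantic mixer (230): converts
    property data into a multimodal data structure (cf. 250)."""
    multimodal = []
    for obj in map_objects:
        params = CONVERSION_TABLE.get(obj["semantics"])
        if params is not None:
            multimodal.append({"id": obj["id"], **params})
    return multimodal
```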
The haptic instructions may define a relation between a first property of objects and a first target haptic output for that property. When information of the dimensions and a property of a first object is received, this relation may be used to produce the first haptic output for the first object. This producing may happen when the user is e.g. touching or otherwise interacting with the object in a virtual scene, or pointing at the object. The relation between properties and haptic output may be implemented in the form of a haptic data structure. This haptic data structure may comprise a plurality of haptic instructions for producing certain haptic output, as well as a plurality of mappings between properties and target haptic outputs. Based on the plurality of mappings, a haptic output for the object may be selected from among the target haptic outputs, and the selected haptic output may then be produced, for example, when a user is determined to interact with (e.g. touch or point to) the object.
In phase 425, a second haptic data structure may be received, with the described haptic instructions and mappings. The haptic instructions of the first haptic data structure and the second haptic data structure may be combined to obtain a combined plurality of mappings between properties and target haptic outputs. The combining may happen e.g. so that the mappings in the second haptic data structure are added to the mappings of the first data structure, and where the same property is mapped to a haptic output in both the first and second data structures, the mapping of the second data structure prevails. Alternatively, if the same property is assigned in the first haptic data structure to have a first haptic output of a first haptic modality (e.g. vibration), and in a second haptic data structure to have a second haptic output of a second modality (e.g. temperature), both haptic outputs may be assigned to the same property.
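A sketch of such combining, assuming each haptic data structure is represented as a mapping from property to per-modality outputs (the representation is an assumption, not from the specification):

```python
# Hypothetical combination of two haptic data structures. For the same
# property and the same modality, the second structure prevails; for the
# same property but different modalities, both outputs are kept.
def combine(first, second):
    combined = {prop: dict(outputs) for prop, outputs in first.items()}
    for prop, outputs in second.items():
        # update() overwrites matching modality keys (second prevails)
        # and adds non-matching ones (both modalities assigned).
        combined.setdefault(prop, {}).update(outputs)
    return combined

first = {"metallic": {"temperature": 15}}
second = {"metallic": {"vibration": 2}}
print(combine(first, second))
# -> {'metallic': {'temperature': 15, 'vibration': 2}}
```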
This combined plurality of mappings may be used in phase 430 to select a second haptic output (or a plurality of haptic outputs) for the first object among the target haptic outputs and to produce the selected second haptic output (or a plurality of haptic outputs) when a user is determined to interact with this first object. That is, the haptic instructions in the haptic data structure may define that a first haptic output and a second haptic output are to be produced simultaneously for an object having a first property. Consequently, a user interaction (touch or pointing) is detected in phase 440, and a first haptic output is produced in phase 445 for an object having a first property using the haptic instructions, and, e.g. simultaneously, a second haptic output for the object having the first property may be produced using the haptic instructions. That is, mixed haptic output may be produced for a single object with a property, or mixed haptic output may be produced for different objects of a model having different properties.
In this description, the model may be a virtual reality model and the objects may be objects in the virtual reality model. The properties of the objects may comprise any of colour, surface texture, surface smoothness, temperature, surface wetness, object contents, level of fullness, weight and density, or their combination (one object may have several properties). The produced haptic output may comprise e.g. different strengths of vibration, creating a touchable surface shape, producing heat and producing cold, or any combination of these.
In this description, the model may comprise a city map and the objects may comprise building objects in the city map, environment objects and vehicle objects, or any such objects belonging to a virtual scene. For example, a property of an object may be determined to comprise demographic information or traffic information near the object in the model, and haptic output may be produced based on the determining.
Thermal rendering may be used as one modality of haptic output. A property of an object may comprise colour, height, material property, smell or taste or another physical property of the object, and a haptic output may be produced based on the determining, wherein the producing comprises production of heat or cold. Different real-world properties may be translated into different thermal values; for instance, a metallic surface may be rendered as cool while a sun-heated street is rendered as warm.
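A minimal sketch of such a translation, with hypothetical degree values chosen only for illustration:

```python
# Hypothetical mapping of real-world properties to thermal values for
# thermal rendering; every value below is an illustrative assumption.
THERMAL_MAP_CELSIUS = {
    "metallic": 15.0,  # metallic surfaces rendered as cool
    "hot": 45.0,       # e.g. a sun-heated street
    "green": 20.0,     # vegetation rendered as mild
}

def thermal_value(prop, default_c=25.0):
    """Return the rendering temperature for a property, or a default."""
    return THERMAL_MAP_CELSIUS.get(prop, default_c)
```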
The model may be a virtual reality model with properties like colour, surface texture, surface smoothness, temperature, surface wetness, object contents, level of fullness, weight and density, as described earlier. Haptic output may comprise vibration, surface shape, heat and cold. The model may comprise a city map and the objects may comprise building objects, environment objects and vehicle objects.
In the flow charts described above, the phases may be carried out in a different order than described here. Also, some of the phases may be omitted, and there may be additional phases. It needs to be understood that the phases may be combined by a skilled person in a usual manner. For example, if the phases have been implemented in computer software, software elements may be combined in a known manner to produce a software product that carries out the desired phases.
The different objects may have properties. For example, the street 520 may be determined to have a property 560 of being hot (temperature 45 degrees Centigrade). The car 522 may be detected as a car and defined to have a property 562 of a metallic surface. The tree 524 may have a property 564 of being green. The building 526 may have a property 566 of having a rough concrete surface.
These properties may be transformed by a function, e.g. a mapping, into haptic outputs. For example, a haptic data structure may define the transformation from object property space to haptic output space.
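Using the example properties 560 to 566 above, such a transformation might be sketched as follows (the haptic output values are assumptions):

```python
# Hypothetical function from object property space to haptic output space.
HAPTIC_MAP = {
    "hot":              ("temperature_c", 45),  # street 520, property 560
    "metallic surface": ("vibration", 2),       # car 522, property 562
    "green":            ("temperature_c", 20),  # tree 524, property 564
    "rough concrete":   ("vibration", 3),       # building 526, property 566
}

def to_haptic_output(prop):
    # Returns None when the property has no defined haptic output.
    return HAPTIC_MAP.get(prop)
```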
As described earlier, the properties may comprise at least one of the group of colour, surface texture, surface smoothness, temperature, surface wetness, object contents, level of fullness, weight and density and said haptic output may comprise at least one of the group of vibration, surface shape, heat and cold.
As an example in
The grouping may also reduce the number of definitions needed to set the haptic outputs corresponding to properties, thereby increasing coding efficiency.
In a sense, because a haptic data structure comprises a projection of properties of objects to haptic outputs, the haptic data structure may be understood to be a haptic theme. In a haptic theme, a number of properties are set in one go to map to certain haptic outputs. The technical benefit of this may be that several haptic data structures (themes) may be provided to the user device, and when a certain theme is to be used for a model, it suffices to refer to this haptic data structure (theme) instead of setting each one of the mappings one by one. The technical benefit from the individual mappings may be that the virtual reality model properties and the haptic output may be separated (e.g. to different files), and the same model may be output in haptic output in many ways without altering the model itself.
A number of haptic data structures (themes) may be combined. This makes it even simpler to define in which way the haptic output for a model should be produced.
The haptic data structures may be delivered to the device for producing haptic output e.g. at the time of downloading the model to be rendered. Alternatively, the haptic data structures may be pre-installed (e.g. at a factory) as preset haptic styles. There may be a default theme for the device, and there may be default themes defined for different types of content.
For example, the haptic data structure Haptic_data_structure_A may comprise the mapping “Metallic=Temperature 15 C” and the haptic data structure Haptic_data_structure_B may comprise the mapping “Dense traffic=vibration 3”. The model data may comprise objects whose properties include “Metallic”, “Green” and “Dense traffic”. It is now clear that the properties “Metallic” and “Dense traffic” have defined haptic outputs (“Temperature 15 C” and “vibration 3”) while the property “Green” does not have a defined haptic output. Consequently, when the user touches an object having the property “Metallic” or “Dense traffic”, a haptic output is produced (“Temperature 15 C”, “vibration 3”, or both), but when the user touches an object having the property “Green”, no haptic output is produced for this property.
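This example could be expressed as follows (a sketch; the data layout is assumed):

```python
# The mappings of the two haptic data structures from the example above.
haptic_A = {"Metallic": ("temperature_c", 15)}
haptic_B = {"Dense traffic": ("vibration", 3)}
combined = {**haptic_A, **haptic_B}

def on_touch(object_properties):
    """Produce a haptic output for each property that has a mapping."""
    return [combined[p] for p in object_properties if p in combined]

print(on_touch(["Metallic", "Dense traffic"]))  # both outputs produced
print(on_touch(["Green"]))                      # [] -> no haptic output
```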
It is also possible to implement the haptic output control so that the application of haptic data structure(s) to a model comprising objects and their properties is carried out on the server system. That is, the haptic instructions for controlling the haptic output are obtained from the model data by utilizing the mapping(s) in the haptic data structure(s). These haptic instructions may then be provided to the user device that produces the haptic output.
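A sketch of this server-side variant, under the same assumed data layout as in the earlier sketches:

```python
# Hypothetical server-side resolution: the server applies the haptic data
# structure to the model data and returns ready haptic instructions that
# are then delivered to the user device producing the haptic output.
def resolve_on_server(model, haptic_data_structure):
    instructions = []
    for obj in model["objects"]:
        for prop in obj["properties"]:
            output = haptic_data_structure.get(prop)
            if output is not None:
                instructions.append({"object": obj["id"], "output": output})
    return instructions
```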
The various examples described above may be implemented with the help of computer program code that resides in a memory and causes the relevant apparatuses to carry out the invention. For example, a device may comprise circuitry and electronics for handling, receiving and transmitting data, computer program code in a memory, and a processor that, when running the computer program code, causes the device to carry out the described features and/or functions. Yet further, a network device like a server may comprise circuitry and electronics for handling, receiving and transmitting data, computer program code in a memory, and a processor that, when running the computer program code, causes the network device to carry out the features of an embodiment. A computer program may be embodied on a computer readable medium, from where it may be accessed, e.g. loaded to the operating memory of a computer for execution. A data structure may be embodied on a computer readable medium, from where it may be accessed, e.g. loaded to the working memory of a computer device for controlling the computer device.
For example, there may be a computer program product embodied on a non-transitory computer readable medium, and the computer program product comprises computer executable instructions to cause an apparatus or system, when executed on a processor of the apparatus or system, to receive data of a plurality of objects of a model, the data of the plurality of objects comprising information of dimensions of the objects and properties of the objects; to receive haptic instructions for a haptic output device for producing haptic output of the properties; and to produce haptic output for the objects using the haptic instructions.
Such a computer program product may comprise a data structure for controlling haptic output of a device, the data structure comprising one or more mappings between virtual reality model object properties and target haptic outputs, and one or more haptic instructions, the haptic instructions being configured to control the apparatus or system to produce a defined haptic output for an object having a defined property. For example, a computer program product may comprise computer instructions for producing output from digital map content e.g. by executing a navigation application.
It is obvious that the present invention is not limited solely to the above-presented embodiments, but it can be modified within the scope of the appended claims.
Priority application: GB 1422896.9, filed December 2014 (national).
International application: PCT/FI2015/050836, filed Dec. 1, 2015 (WO).