Embodiments relate to data processing and, more particularly, to a system and method for sharing digital data.
It has been noted in, for example, Sacks, H. (1970) “First” and “second” stories; Topical coherence; Storing and recording experiences, in G. Jefferson (Ed.), Lectures on Conversation, Volume II (pp. 249-260), Oxford: Blackwell (incorporated by reference herein) that listeners within a conversation often offer second stories that are recognisably related to a first story being relayed by a first speaker. Typically, the second story is one in which the listener figures as a character in a position similar to that of the speaker's character in the first story. For example, if the first speaker is recounting an amusing anecdote from a skiing trip, the second speaker, having also had an amusing experience on a skiing trip, may offer that experience in reply, which has the effect of maintaining or fuelling the conversation. Such second story-telling behaviour demonstrates attention to and empathy with the first story-teller and actively engages the listener in the story-telling activity. Sacks further discloses that the inclination to respond to a story by recounting one's own experiences is so strong that people have to be trained not to do so in, for example, counselling sessions conducted by a psychotherapist.
However, it has been observed that the urge to recount second stories is inhibited in conversations involving photographs. When the first story-teller uses a set of photographs to illustrate a story, they are often able to proceed to tell further stories without interruption from their audience. The photographs appear to act as an inhibitor of reciprocal stories and lead to an asymmetrical conversation in which the audience is largely passive. Current digital photo-album technology appears to exacerbate this situation by turning the album into a slide show, increasing the speaker's power and control over the photographs and the conversation. In effect, digital technology may suppress conversation rather than stimulate the urge to engage in it.
One exemplary embodiment for sharing digital data comprises receiving, at an addressee system, data associated with digital data rendered by an addressor system; searching, via the addressee system, for related digital data using the received data; enabling user selection of at least one of the related digital data located by the searching; and outputting the selected related digital data to the addressor system.
Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings.
Various embodiments actively stimulate a listener into recounting memories related to a story they are being told, which then affords them the opportunity to engage the story-teller with a tale of their own. It will be appreciated that related media comprises media having at least something in common with the original media rendered at the addressor system, that is, the media share a common theme or context. As used herein, “rendered” refers to the process of displaying an image on a display, wherein the image corresponds to digital data.
Within a sharing or collaborative environment, one may wish to show the located media to a friend or colleague. Suitably, some embodiments provide a method wherein the data associated with the selected digital data comprises a copy of the selected digital data and the method further comprises rendering the copy of the selected digital data at the addressor or addressee computer system. Within the current specification, the term “sharing” comprises an exchange of ideas or information and includes the showing of media and the exchange of media.
Once media of potential interest has been identified, the user may select one for display to a friend or for transmission to their friend's computer or other display device. Suitably, embodiments provide a method further comprising enabling selection of at least one of the rendered digital data and transmitting data associated with the selected digital data to the addressor computer system.
Embodiments are provided in which the data associated with the selected digital data comprises at least one of a copy of the selected digital data and metadata describing the selected digital data.
Embodiments may provide a method in which the data associated with the selected digital data comprises the copy of the selected digital data and the metadata.
Some alternative embodiments provide a method in which the data associated with the selected digital data comprises only the metadata.
Also, embodiments provide a method in which the received data comprises a copy of the digital data of the addressor computer system and the method further comprises rendering the received data to produce a rendered copy of the digital data associated with the addressor computer system.
Another embodiment provides a communication system comprising first and second computers to exchange, via a communication network, data comprising at least one of media of the first computer and metadata associated with the media of the first computer; the second computer comprising a context-based search engine to search for and identify media accessible by the second computer having a context associated with or derived from said at least one of the media of the first computer or the metadata associated with the media of the first computer.
A further embodiment provides a computer program element for implementing embodiments as described in this specification. The term “computer program element” comprises at least a part or the whole of a computer program. Furthermore, embodiments provide a computer program product comprising a computer readable storage medium storing such a computer program element.
Yet another embodiment provides a method of sharing media between first and second computer systems, the method comprising: rendering media at the first computer; transmitting data associated with the rendered media to the second computer; searching, using the transmitted data, to identify related media accessible by the second computer; rendering at least one of the related media at the second computer; and transmitting data associated with selected related media to the first computer.
Other embodiments may provide a method of sharing digitally produced audio or visual data comprising outputting, at a first computer, a digital photograph for showing to a third party by a first party; transmitting, from the first computer to a second computer, the digital photograph and associated metadata; receiving the transmitted digital photograph at the second computer; searching, using the received metadata, to identify a further digital photograph, accessible by the second computer, having respective metadata associated with the received metadata; and outputting, at the second computer, the further digital photograph to stimulate a conversation between the first and third parties using their respective digital photographs.
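By way of illustration only, the photograph-and-metadata exchange described above might be carried by a simple message of the following form. This Python sketch is hypothetical: the JSON field names, the base64 encoding of the image bytes and the `find_related` and `render` helpers are assumptions rather than features required by any embodiment.

```python
import base64
import json

def build_share_message(photo_path, keywords):
    """Package a digital photograph and its metadata for transmission
    to the second computer (hypothetical wire format)."""
    with open(photo_path, "rb") as f:
        photo_bytes = f.read()
    return json.dumps({
        "type": "share",
        "photo": base64.b64encode(photo_bytes).decode("ascii"),
        "metadata": {"keywords": keywords},          # e.g. ["cows", "lake"]
    })

def handle_share_message(raw_message, find_related, render):
    """Addressee-side handling: render the received photograph and use the
    accompanying metadata to search for further, related photographs."""
    message = json.loads(raw_message)
    photo_bytes = base64.b64decode(message["photo"])
    render(photo_bytes)                              # show the sender's photograph
    keywords = message["metadata"]["keywords"]
    return find_related(keywords)                    # further photographs to offer in reply
```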
The computers 102 and 104 comprise respective controllers 108 and 108′ for controlling the operation of the computers 102 and 104 and managing the interaction of various elements of computers 102 and 104. The computers 102 and 104 communicate, under the control of respective controllers 108 and 108′, using respective communication mechanisms 110 and 110′. The communication may be wired or wireless according to the type of communication network 106 relied upon by the computers 102 and 104. For example, the computers 102 and 104 may communicate using GSM, CDMA, IEEE 802.11b, Bluetooth, TCP/IP, WAP, HTTP or some other communication protocol.
The communication mechanisms 110 and 110′ are arranged to handle all necessary signalling and data exchange to allow the computers 102 and 104 to exchange information. Each computer presents a user interface (UI) 112 and 112′, respectively, via which users (not shown) can interact with the computers 102 and 104. Typically, the user interfaces 112 and 112′ comprise a display for displaying digital data such as, for example, text and graphical information, a user input device such as, for example, a keyboard, keypad or mouse, and an audio output device such as, for example, audio speakers. The devices constituting the user interfaces 112 and 112′ may depend upon the nature of the media to be output to the users (not shown) and the capabilities of the devices. Alternative embodiments of the present invention might also include other output devices such as, for example, printers or the like for producing printed media.
Computer systems 102 and 104 may comprise at least one media rendering engine 114 and 114′, respectively. The media rendering engines 114 and 114′ are arranged to display or output media to users (not shown). Within the context of the described embodiments, the term “media” comprises digital data representing at least one of audio information and visual information, and/or digital data from which such audio or visual information can be derived. In particular, the term “media” comprises, but is not limited to, digitally produced still or video image data, with or without associated digital audio, such as, for example, digital photographs, digital video, and other types of digital data. An example of such a media rendering engine is Windows Media Player, available from Microsoft Corporation, in the event that the media to be rendered is audio-visual data or, for example, Internet Explorer in the event that the media to be rendered is an image file such as, for example, a JPEG file. Therefore, it will be appreciated that the terms “render,” “rendered” and “rendering” comprise producing a human-perceivable output from the media, that is, from the digital data. Furthermore, the media rendering engine may be a word processor such as, for example, Word, also available from Microsoft Corporation, in the event that the media is a text or written-word document. It will be appreciated that the computer systems 102 and 104 may comprise a number of media rendering engines according to the types of media the computer systems 102 and 104 may be expected to handle.
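For illustration, choosing between several media rendering engines can be reduced to a dispatch on the media type. The sketch below is a minimal, assumed mapping; the handler functions merely stand in for platform-specific engines such as a media player, browser or word processor.

```python
import mimetypes

def show_image(path):
    print(f"[image viewer] displaying {path}")      # stands in for, e.g., a browser

def play_audio_visual(path):
    print(f"[media player] playing {path}")         # stands in for, e.g., a media player

def open_document(path):
    print(f"[word processor] opening {path}")       # stands in for, e.g., a word processor

# Illustrative dispatch from broad media type to a rendering engine 114/114'.
RENDERING_ENGINES = {
    "image": show_image,
    "video": play_audio_visual,
    "audio": play_audio_visual,
    "text": open_document,
}

def render(media_path):
    """Select and invoke a media rendering engine according to the media type."""
    mime, _ = mimetypes.guess_type(media_path)
    family = (mime or "application/octet-stream").split("/", 1)[0]
    RENDERING_ENGINES.get(family, open_document)(media_path)
```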
Each computer system 102 and 104 is provided with a media search engine 116 and 116′, respectively, implemented, at least in part, using software. Realisations of embodiments of a media search engine might, for example, comprise a searchable database for storing the media and a database program for accessing the searchable database to retrieve the media. The media search engines 116 and 116′ are used to identify media such as, for example, images 118 and 118′, audio files 119 and 119′, documents 120 and 120′ and video 122 and 122′, stored using respective non-volatile media storage 124 and 124′. The non-volatile storage may take the form of any convenient non-volatile storage such as, for example, flash memory or, in the illustrated embodiments, hard disk drives (HDDs) 130 and 130′. It can be appreciated that each system 102 and 104 has access to at least some distinct, that is, separate, media. Although embodiments of the present invention have been described with reference to flash and HDD type storage, embodiments can use other forms of storage.
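One possible realisation of such a searchable database and database program, sketched here under the assumption of a local SQLite index keyed by metadata keywords, is given below; the schema, file name and function names are illustrative only.

```python
import sqlite3

def create_media_index(db_path="media_index.db"):
    """Create a searchable database of media items and their metadata keywords."""
    connection = sqlite3.connect(db_path)
    connection.execute(
        "CREATE TABLE IF NOT EXISTS media ("
        " path TEXT PRIMARY KEY,"
        " kind TEXT,"            # e.g. 'image', 'audio', 'document', 'video'
        " keywords TEXT)"        # space-separated metadata keywords
    )
    return connection

def add_media(connection, path, kind, keywords):
    """Register one media item together with its associated metadata."""
    connection.execute(
        "INSERT OR REPLACE INTO media VALUES (?, ?, ?)",
        (path, kind, " ".join(keywords)),
    )
    connection.commit()

def search_media(connection, keyword):
    """Database program: return paths of media whose metadata mentions the keyword."""
    rows = connection.execute(
        "SELECT path FROM media WHERE keywords LIKE ?", (f"%{keyword}%",)
    )
    return [path for (path,) in rows]
```

A photograph of cows by a lake might then be indexed with `add_media(connection, "cows_lake.jpg", "image", ["cow", "lake"])` and later retrieved with `search_media(connection, "cow")`.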
Media stored using the non-volatile storage 124 and 124′ has associated metadata that is related to each media item to assist the media search engines 116 and 116′ in identifying media of interest. For example, the media may be a JPEG image of a number of cows standing by a lake and the associated metadata may comprise the set of words “cow” and “lake.” It can be appreciated that media can be related or categorised using the metadata. For example, a pair of pictures comprising respective images of cows might both have the word “cow” as part of their respective metadata. Such pictures are considered to be related as they both concern or depict similar, or the same, subject-matter. That is, the pictures, or at least their associated metadata, have something in common, that is, a substantially similar context. The same also applies to other forms of media.
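Continuing the example, relatedness can be expressed as an overlap between metadata keyword sets. The following minimal sketch assumes that metadata is simply a set of words such as “cow” and “lake”.

```python
def related(metadata_a, metadata_b):
    """Two media items are considered related if their metadata share at least
    one keyword, i.e. they have a substantially similar context."""
    common = set(metadata_a) & set(metadata_b)
    return len(common) > 0, common

# Example: both photographs depict cows, so they are related via "cow".
is_related, shared = related({"cow", "lake"}, {"cow", "field"})
# is_related -> True, shared -> {"cow"}
```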
The operation of the computer systems 102 and 104 will be described with reference to
The media rendering engine 114 renders any received media 126 (
Although the above embodiment indicates that the related or matching media (118, 119, 120 and/or 122) are displayed on the user interface 112, it will be appreciated that embodiments can be realised in which a saliency measure is used to rank the media and only selected media from all matching media are displayed according to that measure. The user (not shown), using the user interface 112, can select one of the displayed related media (118, 119, 120 or 122) at step 310.
Optionally, the user of the computer system 102 may indicate that they also have media that may be of interest to the user (not shown) of the computer system 104. If the latter user expresses an interest in the recently identified media, the computer system 102 may transmit the selected related media to the first computer system 104 where it can be displayed on the user interface 112′ using the media rendering engine 114′ at step 312.
It can be appreciated that the automatic search, retrieval and display of related media provides a prompt to the user (not shown) of the second computer system 102, which may cause that user to contribute to the conversation or to engage the user of the first computer system 104, thereby overcoming the traditional urge to remain silent, as is often the case when one party is showing photographs, for example, to the second party.
The user of the first computer system 104 may transmit selected media 126 from the first computer system 104. The media 126 may be accompanied by metadata 128 describing or related to the media 126. As described above, for example, the media 126 may be a digital photograph of cows standing by the shore of a lake and the metadata may comprise the set of words “cows” and “lake.” The media 126 and metadata 128 are received by the second computer system 102. The controller 108 causes the media rendering engine 114 to display or output the media 126 via the user interface 112 and forwards the metadata 128 to the media search engine 116, where it is used to perform a search of the media 118, 119, 120 and 122 stored using the media storage 124. The search is performed to identify matching or related media that may be of interest to the first user (not shown).
Although the above embodiments have been described with reference to the use of metadata to identify potentially related media, embodiments of the present invention are not limited to such an arrangement. Embodiments can be realised in which the media to be shared is transmitted without the metadata and a sophisticated media search engine is arranged, using, for example, image processing techniques or pattern matching techniques, to identify related media.
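As one illustration of a metadata-free arrangement, related images might be identified by comparing coarse fingerprints of the image content itself. The sketch below computes a simple difference hash using the third-party Pillow library; the hash size, the distance threshold and indeed the choice of algorithm are assumptions, and a production media search engine might employ far more sophisticated image-recognition or pattern-matching techniques.

```python
from PIL import Image  # third-party Pillow library

def dhash(image_path, hash_size=8):
    """Compute a simple difference hash: a coarse fingerprint of image content."""
    image = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(image.getdata())
    bits = []
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append(1 if left > right else 0)
    return bits

def images_related(path_a, path_b, max_distance=16):
    """Treat two images as potentially related if their fingerprints are close.
    The distance threshold is illustrative only."""
    a, b = dhash(path_a), dhash(path_b)
    distance = sum(bit_a != bit_b for bit_a, bit_b in zip(a, b))
    return distance <= max_distance
```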
Referring to
The exchange of metadata 128 between the computers 102 and 104 can be realised using any convenient protocol.
In an embodiment the computers 102 and 104 store data identifying users from whose corresponding computers metadata can be accepted. In this manner, when the computer systems 102 and 104 are sufficiently close to each other, the first computer system 104 may merely transmit the metadata without it needing to be specifically addressed to the second computer system 102. The second computer system 102, under the influence of the controller 108 executing appropriate software, may receive the transmitted metadata and act upon it accordingly. However, before acting upon the metadata, the controller 108 of the second computer system 102 traverses its corresponding list of users from whose computers metadata can be accepted to identify a match. It will be appreciated that, in this embodiment, an indication of the addressor or sender of the metadata accompanies the metadata 128. This indication is used in the matching process. If it is determined that the identifier of the sender is contained within the list of users from whom the second computer system 102 is authorised to receive metadata, the controller 108 causes the media search engine 116 to instigate a search for related media. The result of the search may be the display of digital photographs such as, for example, digital photographs 212, 214, 216 and/or 218. The user 404 of the second computer system 102 may then, using the control section 220, select one of the digital photographs 212, 214, 216 and/or 218, which might then be displayed in an enlarged rather than thumbnail form to allow the user 404 to show the enlarged photograph (not shown) to the other user 402. Again, using the second embodiment, an exchange or conversation between the users 402 and 404 is facilitated using the context-sensitive metadata to retrieve context-sensitive media.
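A minimal sketch of this acceptance check, assuming the broadcast metadata message simply carries a sender identifier alongside its keywords, might look as follows; the message fields, the example identifiers and the `instigate_search` callback are hypothetical.

```python
# Illustrative list of users from whose computers metadata can be accepted.
ACCEPTED_SENDERS = {"alice@example.org", "bob@example.org"}

def on_metadata_received(message, instigate_search):
    """Act on broadcast metadata only if the sender appears in the list of
    users from whose computers metadata can be accepted."""
    sender = message.get("sender")
    if sender not in ACCEPTED_SENDERS:
        return None                      # ignore metadata from unknown senders
    keywords = message.get("metadata", {}).get("keywords", [])
    return instigate_search(keywords)    # e.g. display matching thumbnails 212-218
```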
Referring to
The process of flow chart 600 starts at block 602. At block 604, data associated with digital data rendered by an addressor system is received at an addressee system. At block 606, the addressee system searches for related digital data using the received data. At block 608, user selection of at least one of the related digital data located by the searching is enabled. At block 610, the selected related digital data is output to the addressor system. The process ends at block 612.
The process of flow chart 700 starts at block 702. At block 704, digital data is rendered at the first computer system. At block 706, data associated with the digital data rendered at the first computer system is transmitted to the second computer system. At block 708, the transmitted data is received at the second computer system. At block 710, a search is performed, using the received data, to identify related digital data accessible by the second computer system and having a context associated with the digital data rendered at the first computer system. At block 712, at least one of the related digital data is rendered at the second computer system. At block 714, user selection, at the second computer system, of at least one of the related digital data is enabled. At block 716, the selected related digital data is transmitted to the first computer system. The process ends at block 718.
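Purely as an illustration, the blocks of flow chart 700 might map onto the following sequence of calls between two systems; every object and method name here is a placeholder for behaviour described above rather than a prescribed interface.

```python
def share_session(first_system, second_system):
    """Hypothetical end-to-end flow corresponding to blocks 704-716."""
    media = first_system.render_current_media()            # block 704
    data = first_system.describe(media)                    # metadata and/or a copy
    second_system.receive(data)                            # blocks 706-708
    related = second_system.search_related(data)           # block 710
    second_system.render(related)                          # block 712
    chosen = second_system.await_user_selection(related)   # block 714
    first_system.receive(chosen)                           # block 716
    first_system.render(chosen)                            # optional display at 104
```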
Although the above embodiments have been described with reference to the second computer system 102 performing a search for related media, embodiments are not limited to such an arrangement. Embodiments can be realised in which the second computer system 102 merely instigates the search for such media, that is, the second computer system 102 may instruct a further computer system to perform the search rather than performing the search itself. It will be appreciated that such embodiments might at least reduce, and preferably remove, the need to provide a complex local search engine.
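Where the search is delegated in this way, the second computer system might simply forward the received metadata to a remote search service and display whatever that service returns. The HTTP endpoint, payload and response format in the sketch below are entirely hypothetical.

```python
import json
import urllib.request

def delegate_search(keywords, search_service_url="http://search.example.org/related"):
    """Instruct a further computer system to perform the context search and
    return the identifiers of any related media it finds (hypothetical API)."""
    payload = json.dumps({"keywords": keywords}).encode("utf-8")
    request = urllib.request.Request(
        search_service_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))
```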
Other embodiments provide a method further comprising searching, at the addressor computer system, for digital data using the copy of the selected digital data as a search key.
Other embodiments provide a method wherein the data associated with the selected digital data comprises a copy of metadata associated with the selected digital data and the method further comprises searching, using the copy of the metadata, to identify digital data having a context associated with the selected digital data.
Accordingly, some embodiments provide a data processing system comprising a digital data search engine arranged to perform a context-sensitive search of searchable digital data, stored using digital data storage, in response to data received from a first computer, to identify digital data having a substantially similar context to that of digital data associated with the first computer, the received data conveying the context of the digital data associated with the first computer; and means to output data associated with the identified digital data.
Other embodiments provide a data processing system in which the data received from the first computer comprises metadata associated with the digital data associated with the first computer. The metadata might comprise at least one keyword associated with the digital data associated with the first computer. The search engine may use the metadata to locate potentially interesting media.
Depending upon the complexity and sophistication of the media search engine, an alternative embodiment provides a data processing system in which the received data comprises a copy of the digital data associated with the first computer and the data processing system comprises a media rendering engine to render the copy of the digital data. The search engine may use the copy of the digital data itself as the key for performing the search. For example, image or pattern recognition may be employed to locate potentially related media.
Other embodiments provide a data processing system wherein the communication mechanisms 110 and/or 110′ comprise a transmitter operable to send identified digital data to the computers. Furthermore, embodiments may provide a data processing system comprising a receiver operable to receive the data or media associated with the first computer.
Alternative embodiments provide a data processing system as described in any preceding embodiment in which the related digital data have associated metadata having at least one metadata item in common.
Some embodiments provide a data processing system in which the digital data comprises at least one of audio data and visual data, or at least data from which such audio or visual data can be derived. Accordingly, the digital data may comprise digitally produced image data.
It will be appreciated that the searchable media may be stored locally or may be stored remotely, via, for example, a network drive or a server forming part of the Internet, that is, remotely stored media is stored using storage that is not directly accessible by or not integral to the data processing system. Suitably, embodiments provide a data processing system in which the media search engine comprises a means to access a remote storage device on which the searchable digital data is held.
Further embodiments provide a method of operating an addressee computer system comprising receiving data associated with digital data accessible to an addressor computer system; searching, at or via the addressee computer system, a plurality of digital data, using the received data, to identify at least one digital data of the plurality of digital data having a substantially similar or related context to the digital data of the addressor computer system.
Other embodiments provide a method further comprising the step of rendering the at least one digital data at the addressee computer system.
The reader's attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings) and/or all of the steps of any method or process so disclosed, may be combined in any combination.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose. Thus each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.
Number | Date | Country | Kind
---|---|---|---
0316028.0 | Jul 2003 | GB | national