The present invention relates generally to formatting and displaying content and, more particularly, to formatting and displaying content based on the location of the content.
There has been a proliferation of content utilized by users. This content typically includes video tracks, graphic images, and photographs. In many instances, the content utilized by a user is stored without fully capturing the relationship between individual pieces of content.
For example, images are typically captured with attention paid to the visual quality of the image. Unfortunately, additional unique information about each image that describes the environment of the image is not captured.
Managing this increasing amount of content is a challenge for many users. With the increasing amount of content, it is also more difficult to track additional unique information related to the environment of each image.
In one embodiment, the methods and apparatuses for formatting and displaying content capture an image with a device; detect image parameters related to the image; store the image parameters such that the image parameters are available for access at a later time; and display the image in a display location based on at least one of the image parameters.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate and explain one embodiment of the methods and apparatuses for formatting and displaying content. In the drawings,
The following detailed description of the methods and apparatuses for formatting and displaying content refers to the accompanying drawings. The detailed description is not intended to limit the methods and apparatuses for formatting and displaying content. Instead, the scope of the methods and apparatuses for formatting and displaying content is defined by the appended claims and equivalents. Those skilled in the art will recognize that many other implementations are possible, consistent with the present invention.
References to “content” include data, such as images, video, graphics, and the like, that are embodied in digital or analog electronic form.
References to “electronic device” include a device such as a video camera, a still picture camera, a cellular phone with an image capture module, a personal digital assistant with an image capture module, and an image capturing device.
In one embodiment, one or more user interface 115 components are made integral with the electronic device 110 (e.g., keypad and video display screen input and output interfaces in the same housing as personal digital assistant electronics, as in a Clie® manufactured by Sony Corporation). In other embodiments, one or more user interface 115 components (e.g., a keyboard, a pointing device (mouse, trackball, etc.), a microphone, a speaker, a display, a camera) are physically separate from, and are conventionally coupled to, the electronic device 110. The user utilizes the interface 115 to access and control content and applications stored in the electronic device 110, the server 130, or a remote storage device (not shown) coupled via the network 120.
In accordance with the invention, embodiments of formatting and displaying content below are executed by an electronic processor in electronic device 110, in server 130, or by processors in electronic device 110 and in server 130 acting together. Server 130 is illustrated in
The methods and apparatuses for formatting and displaying content are shown in the context of exemplary embodiments of applications in which images are displayed in a particular format and location based on parameters associated with the image. In one embodiment, the image is utilized through the electronic device 110 and the network 120. In another embodiment, the image is formatted and displayed by the application which is located within the server 130 and/or the electronic device 110.
In one embodiment, the methods and apparatuses for formatting and displaying content automatically create a record associated with an image. In one instance, information within the record is automatically completed by the methods and apparatuses for formatting and displaying content based on previously stored records associated with corresponding images.
Server device 130 includes a processor 211 coupled to a computer-readable medium 212. In one embodiment, the server device 130 is coupled to one or more additional external or internal devices, such as, without limitation, a secondary data storage element, such as database 240.
In one instance, processors 208 and 211 are manufactured by Intel Corporation, of Santa Clara, Calif. In other instances, other microprocessors are used.
The plurality of client devices 110 and the server 130 include instructions for a customized application for formatting and displaying content. In one embodiment, the plurality of computer-readable media 209 and 212 contain, in part, the customized application. Additionally, the plurality of client devices 110 and the server 130 are configured to receive and transmit electronic messages for use with the customized application. Similarly, the network 120 is configured to transmit electronic messages for use with the customized application.
One or more user applications are stored in memory 209, in memory 212, or a single user application is stored in part in memory 209 and in part in memory 212. In one instance, a stored user application, regardless of storage location, is made customizable based on formatting and displaying content as determined using embodiments described below.
In one embodiment, the system 300 includes a render module 310, a location module 320, a storage module 330, an interface module 340, a control module 350, and a capture module 360.
In one embodiment, the control module 350 communicates with the render module 310, the location module 320, the storage module 330, the interface module 340, and the capture module 360. In one embodiment, the control module 350 coordinates tasks, requests, and communications between the render module 310, the location module 320, the storage module 330, the interface module 340, and the capture module 360.
In one embodiment, the render module 310 displays an image based on image data and location data. In another embodiment, the render module 310 displays multiple images based on image data and location data of each image. In one embodiment, the image data is identified by the capture module 360. In one instance, the image data is in the form of a JPEG file. In another instance, the image data is in the form of a RAW file. In yet another instance, the image data is in the form of a TIFF file.
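For illustration only, the file formats above can be distinguished by their leading signature bytes. The following Python sketch is a hypothetical helper, not part of the capture module 360 itself:

    def detect_image_format(data: bytes) -> str:
        # Guess the container format of captured image data from its magic bytes.
        if data[:2] == b"\xff\xd8":
            return "JPEG"                        # JPEG streams begin with the SOI marker FF D8
        if data[:4] in (b"II*\x00", b"MM\x00*"):
            return "TIFF"                        # little-endian and big-endian TIFF headers
        return "RAW/unknown"                     # RAW formats vary by vendor; many are TIFF-based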
In one embodiment, the location data is identified by the location module 320. In one instance, the location data indicates a particular location of the device, such as a street address of the device. In another instance, the location data also indicates a positional orientation of the device, such as the horizon, line of sight, and the like. In yet another instance, the location data also indicates an image location as seen through the viewfinder of the device, such as the area captured by the viewfinder, the zoom of the lens, and the like.
In one embodiment, the location module 320 processes the location data. In one embodiment, the location data includes general location data that provides the broad location of the device at a street-by-street granularity. In another embodiment, the location data includes image location data that provides specific location data as seen through the viewfinder of the device.
In one embodiment, the general location data is gathered by a global positioning system (GPS). In this embodiment, the GPS senses the location of the device and is capable of locating the device. In another embodiment, the general location data is gathered by multiple cellular phone receivers that are capable of providing a location of the device.
In one embodiment, the image location data is supplied by at least one sensor within the device that provides a direction in which the viewfinder is pointed and the amount of information that is shown in the viewfinder. In one instance, the device senses the direction of the viewfinder and expresses this direction as a coordinate calibrated with respect to due North. In another instance, the device senses the current focal length of the device and determines the amount of information that is available to the viewfinder.
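The amount of information available to the viewfinder follows directly from the focal length and the sensor size. A minimal Python sketch of that standard relationship follows; the 36 mm default sensor width and the function name are illustrative assumptions:

    import math

    def angle_of_view_degrees(focal_length_mm, sensor_width_mm=36.0):
        # Horizontal angle of view from the thin-lens relation:
        #   AOV = 2 * atan(sensor_width / (2 * focal_length))
        # The 36 mm default (a full-frame sensor) is an assumption for illustration.
        return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

    # Example: a 50 mm lens on a 36 mm-wide sensor sees roughly 39.6 degrees.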
In one embodiment, the location module 320 supplies the general location data and the image location data related to a specific image to the system 300.
In one embodiment, the storage module 330 stores a record including the location data associated with specific content. In another embodiment, the storage module 330 also stores the specific content that is associated with the record.
In one embodiment, the interface module 340 receives a request for a specific function from one of the electronic devices 110. For example, in one instance, the electronic device requests content from another device through the system 300. In another embodiment, the interface module 340 receives a request from a user of a device. In yet another embodiment, the interface module 340 displays information contained within the record associated with the content.
In one embodiment, the capture module 360 identifies a specific image for use by the system 300. In another embodiment, the capture module 360 processes the specific image captured by the device.
The system 300 in
In one embodiment, the general location of device field 410 indicates a location of the device while capturing the corresponding image. In one embodiment, the location of the device is expressed in geographical coordinates, such as degrees, minutes, and seconds. In another embodiment, the location of the device is expressed as a street address or an attraction.
In one embodiment, the horizontal orientation of image field 420 indicates the horizontal direction of the corresponding image. In one embodiment, the horizontal orientation is expressed in terms of degrees from due North.
In one embodiment, the vertical orientation of image field 430 indicates the vertical direction of the corresponding image. In one embodiment, the vertical direction is expressed in terms of degrees from the horizon line.
In one embodiment, the angle of view field 440 indicates the overall image area captured within the corresponding image. For example, in one embodiment, the angle of view is expressed in terms of a zoom or magnification amount.
In one instance, the general location of device field 410, the horizontal orientation of image field 420, the vertical orientation of image field 430, and the angle of view field 440 are captured by the device while capturing the corresponding image. In combination, the parameters associated with general location of device field 410, the horizontal orientation of image field 420, the vertical orientation of image field 430, and the angle of view field 440 are capable of sufficiently describing the corresponding image in comparison with other images that have similar parameters recorded.
In one embodiment, the related image field 450 indicates at least one other image that is related to the image associated with the record 400. For example, another image having a general location of device field 410 that is similar to that of this specific image is considered a related image. In one embodiment, the related image has a different horizontal orientation or a different vertical orientation from the specific image. In another embodiment, the related image has a different angle of view from the specific image.
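For illustration only, a related-image test might compare the stored parameters of two records as follows; the field names, the distance threshold, and the helper function are hypothetical assumptions rather than values prescribed by the record 400:

    import math

    def ground_distance_m(loc_a, loc_b):
        # Approximate meters between two (latitude, longitude) pairs using an
        # equirectangular approximation, which is adequate at street scale.
        lat_a, lon_a = map(math.radians, loc_a)
        lat_b, lon_b = map(math.radians, loc_b)
        x = (lon_b - lon_a) * math.cos((lat_a + lat_b) / 2.0)
        y = lat_b - lat_a
        return 6371000.0 * math.hypot(x, y)

    def is_related(rec_a, rec_b, max_distance_m=50.0):
        # Related: captured near the same device location (field 410) but with a
        # different orientation (fields 420/430) or angle of view (field 440).
        if ground_distance_m(rec_a["device_location"], rec_b["device_location"]) > max_distance_m:
            return False
        return (rec_a["horizontal_orientation"] != rec_b["horizontal_orientation"]
                or rec_a["vertical_orientation"] != rec_b["vertical_orientation"]
                or rec_a["angle_of_view"] != rec_b["angle_of_view"])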
In one embodiment, the common reference point field 460 identifies a reference location common to multiple images. In one embodiment, the common reference location is calculated from the device. In another embodiment, the common reference location is calculated from each corresponding image.
In one embodiment, the image identification field 470 identifies the image. In one instance, the image identification field 470 includes a descriptive title for the specific image. In another instance, the image identification field 470 includes a unique identification that corresponds to the specific image.
In one embodiment, the distance of subject field 480 identifies the distance between the device capturing the image and the subject of the image. In one embodiment, this distance is calculated from the focusing mechanism within the device.
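Taken together, the fields 410 through 480 of the record 400 can be pictured as a simple data structure. The following Python sketch is illustrative only; the type name and field names are assumptions:

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class ImageRecord:                                    # record 400
        device_location: Tuple[float, float]              # field 410: e.g., latitude/longitude
        horizontal_orientation: float                     # field 420: degrees from due North
        vertical_orientation: float                       # field 430: degrees from the horizon line
        angle_of_view: float                              # field 440: zoom or magnification amount
        related_images: List[str] = field(default_factory=list)       # field 450
        common_reference_point: Optional[Tuple[float, float]] = None  # field 460
        image_id: str = ""                                # field 470: title or unique identifier
        distance_of_subject_m: Optional[float] = None     # field 480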
In one embodiment, the record table 500 includes a record 515 and a record 525 which are similar to the record 400. In one embodiment, the image table 510 includes an image 520 and an image 530. In one instance, the record 515 corresponds with the image 520; and the record 525 corresponds with the image 530. Although the images and corresponding records are stored separately in this embodiment, the images and corresponding records are configured to be logically linked together such that when one of the images is utilized, the corresponding record is capable of being identified.
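One way to realize such a logical link is to key both stores by a shared image identifier, so that retrieving either the image or the record identifies the other. The dictionary layout below is an illustrative assumption, not a prescribed storage format:

    # Hypothetical in-memory stand-ins for the record table 500 and image table 510.
    record_table = {}   # image_id -> record (e.g., records 515 and 525)
    image_table = {}    # image_id -> image data (e.g., images 520 and 530)

    def store(image_id, image_bytes, record):
        # The image and its record are stored separately but share one key,
        # so utilizing one identifies the other.
        image_table[image_id] = image_bytes
        record_table[image_id] = record

    def load(image_id):
        return image_table[image_id], record_table[image_id]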
The flow diagrams as depicted in
The flow diagram in
In Block 620, the location of the electronic device is detected. In one embodiment, the location of the device is stored within the general location of device field 410.
In Block 630, an image is captured by the electronic device.
In Block 640, image information that corresponds with the image captured within Block 630 is detected. In one embodiment, the image information includes the horizontal orientation of the image, the vertical orientation of the image, the angle of view, and/or the distance to the subject. In one embodiment, the horizontal orientation of the image, the vertical orientation of the image, and the angle of view are recorded within the horizontal orientation of the image field 420, the vertical orientation of the image field 430, and the angle of view field 440, respectively.
In Block 650, the image is stored. In one embodiment, the image is stored within the storage module 330. In one instance, the image is stored within a table as shown in
In Block 660, the device location and image information are stored. In one embodiment, the device location and image information are stored within the storage module 330. In one instance, the device location and image information are stored within a table and linked to a corresponding image as shown in
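Taken as a whole, Blocks 620 through 660 can be sketched as a single capture routine. In the following illustrative Python sketch, every 'device' and 'storage' call is a hypothetical stand-in for camera hardware or the storage module 330:

    def capture_and_store(device, storage):
        # Illustrative sketch of the capture flow (Blocks 620-660).
        device_location = device.read_gps()                      # Block 620 -> field 410
        image = device.capture_image()                           # Block 630
        image_info = {                                           # Block 640
            "horizontal_orientation": device.read_compass(),     # field 420
            "vertical_orientation": device.read_tilt(),          # field 430
            "angle_of_view": device.read_zoom(),                 # field 440
            "distance_of_subject_m": device.read_focus_distance(),  # field 480
        }
        storage.save_image(image)                                # Block 650
        storage.save_record(device_location, image_info)         # Block 660, linked to the image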
The flow diagram in
In Block 720, image information that corresponds with the particular image is detected. In one embodiment, the image information includes the horizontal orientation of the image, the vertical orientation of the image, and/or the angle of view. In one embodiment, the horizontal orientation of the image, the vertical orientation of the image, and the angle of view are recorded within the horizontal orientation of the image field 420, the vertical orientation of the image field 430, and the angle of view field 440, respectively.
In one embodiment, the image information is detected through the record 400 that corresponds with the particular image.
In Block 730, the location information of the device is detected. In one embodiment, the location of the device corresponds to the location when the particular image was captured by the device. In one embodiment, the location information is found within the record 400.
In Block 740, an available display device is detected. In one embodiment, a single display device is detected. In another embodiment, multiple display devices are detected. In one embodiment, the display device is coupled to the render module 310.
In one embodiment, the display device is a display screen configured to visually display the image. In another embodiment, the display device is a printer device configured to produce printed material on a tangible medium.
In Block 750, an area is selected to display the particular image on the display device. In one embodiment, the area is selected based on the location information of the device. In another embodiment, the area is selected based on the image information.
In Block 760, the image is displayed within the selected area on the display device. In one embodiment, in the case of a single display screen, the image is displayed within the selected area on the single display screen based on the image information and/or the device location. For example, a lower right-hand corner of the display screen is utilized to display the identified image based on its image information.
In another embodiment, in the case of multiple display screens, the image is displayed on a particular display screen based on the image information and/or the device location. For example, with two displays located next to each other, the display located on the left is utilized to display the identified image based on the image information.
In yet another embodiment, in the case of tangible media within a printer device, the image is displayed within the selected area on the tangible medium based on the image information and/or the device location. For example, a lower right-hand corner of the tangible medium is utilized to display the identified image based on its image information.
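For the single-screen case, Blocks 750 and 760 might be sketched as follows; the quadrant rule, the field names, and the 'screen.draw' call are illustrative assumptions rather than the only possible mapping:

    def select_area(horizontal_deg, vertical_deg):
        # Block 750: map the image orientation onto a quadrant of the screen.
        # The quadrant rule itself is an illustrative assumption.
        vertical = "upper" if vertical_deg >= 0.0 else "lower"
        horizontal = "right" if 0.0 <= horizontal_deg % 360.0 < 180.0 else "left"
        return vertical + " " + horizontal

    def display_image(image, record, screen):
        # Block 760: draw the image in the selected area; 'screen.draw' is a
        # hypothetical rendering call on the detected display device.
        area = select_area(record["horizontal_orientation"], record["vertical_orientation"])
        screen.draw(image, area)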
The flow diagram in
In another embodiment, a user identifies the related images.
In Block 820, image information that corresponds with each of the related images is detected. In one embodiment, the image information includes the horizontal orientation of the image, the vertical orientation of the image, and/or the angle of view. In one embodiment, the horizontal orientation of the image, the vertical orientation of the image, and the angle of view are recorded within the horizontal orientation of the image field 420, the vertical orientation of the image field 430, and the angle of view field 440, respectively.
In one embodiment, the image information is detected through the record 400 that corresponds with each related image.
In Block 830, an available display device is detected. In one embodiment, a single display device is detected. In another embodiment, multiple display devices are detected. In one embodiment, the display device is coupled to the render module 310.
In one embodiment, the display device is a display screen configured to visually display the image on the display screen. In another embodiment, the display device is a printer device configured to produce printed material on a tangible media.
In Block 840, a first related image is displayed within a first area within the display device. In one embodiment, the image information corresponding to the first related image determines the first area. In another embodiment, the image information of the first related image determines which display device is selected to display the first related image.
In Block 850, a second related image is displayed within a second area within the display device. In one embodiment, the image information corresponding to the second related image determines the second area. In another embodiment, the image information of the second related image determines which display device is selected to display the second related image.
In one embodiment, the first related image and the second related image are displayed relative to each other based on comparing the image information for both the first related image and the second related image. For example, if the first related image is captured with a horizontal orientation to the right of the second related image, then the first related image is displayed to the right of the second related image.
In another embodiment, the first related image and the second related image are displayed relative to each other based on comparing the image information for both the first related image and the second related image relative to a common reference point. For example, if the first related image is captured with a vertical orientation above the common reference point and the second related image is captured with a vertical orientation below the common reference point, then the first related image is displayed above the second related image.
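For illustration only, the relative placement described in the two preceding embodiments might be sketched as follows; the field names are assumptions, and wrap-around at 360 degrees is ignored for brevity:

    def layout_pair(rec_a, rec_b, reference_vertical_deg=0.0):
        # Illustrative relative placement of two related images.
        placement = {}
        # Horizontal: the image captured further clockwise from due North
        # is displayed to the right.
        if rec_a["horizontal_orientation"] > rec_b["horizontal_orientation"]:
            placement["left"], placement["right"] = rec_b, rec_a
        else:
            placement["left"], placement["right"] = rec_a, rec_b
        # Vertical: an image captured above the common reference point is
        # displayed above one captured below it.
        if rec_a["vertical_orientation"] > reference_vertical_deg > rec_b["vertical_orientation"]:
            placement["top"], placement["bottom"] = rec_a, rec_b
        elif rec_b["vertical_orientation"] > reference_vertical_deg > rec_a["vertical_orientation"]:
            placement["top"], placement["bottom"] = rec_b, rec_a
        return placement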
In one embodiment, the display devices include display devices 910, 920, 930, and 940 that are depicted in locations relative to a placeholder 905. In another embodiment, the display devices 910, 920, 930, and 940 represent different locations within a single display device.
In one embodiment, the placeholder 905 represents a camera device that recorded the stream of captured images 950. In another embodiment, the placeholder 905 represents a reference point utilized by the stream of captured images 950.
In one embodiment, the image 960 is displayed on the display device 940; the image 970 is displayed on the display device 930; the image 980 is displayed on the display device 910; and the image 990 is displayed on the display device 920. In this embodiment, the stream of captured images 950 could have been captured in any order. In one embodiment, the images 960, 970, 980, and 990 are arranged and displayed according to the system 300 with respect to the placeholder 905. For example, when the images 960, 970, 980, and 990 were captured, the image 960 was located above the image 990; the image 980 was located to the left of the image 990; and the image 970 was located to the right of the image 990. Even though the images 960, 970, 980, and 990 were captured in a different order within the stream of captured images 950, they are positioned on their respective displays based on their positions while being captured.
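For illustration only, the arrangement just described can be expressed as a mapping from each image's captured position to a display; the position labels and dictionary layout below are hypothetical:

    # Displays keyed by their position relative to the placeholder 905.
    displays = {"left": 910, "center": 920, "right": 930, "above": 940}

    # Stream of captured images 950, in an arbitrary capture order; each entry
    # carries the position recorded while the image was captured.
    captured = [
        {"id": 990, "position": "center"},
        {"id": 960, "position": "above"},
        {"id": 980, "position": "left"},
        {"id": 970, "position": "right"},
    ]

    # Each image lands on the display matching its captured position,
    # independent of its order within the stream.
    assignment = {img["id"]: displays[img["position"]] for img in captured}
    # assignment == {990: 920, 960: 940, 980: 910, 970: 930}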
The foregoing descriptions of specific embodiments of the invention have been presented for purposes of illustration and description. The invention may be applied to a variety of other applications.
They are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed, and naturally many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the Claims appended hereto and their equivalents.