The United States Patent Office (USPTO) has published a notice to the effect that the USPTO's computer programs require that patent applicants reference both a serial number and indicate whether an application is a continuation, continuation-in-part, or divisional of a parent application. Stephen G. Kunin, Benefit of Prior-Filed Application, USPTO Official Gazette Mar. 18, 2003. The present Applicant Entity (hereinafter “Applicant”) has provided above a specific reference to the application(s) from which priority is being claimed as recited by statute. Applicant understands that the statute is unambiguous in its specific reference language and does not require either a serial number or any characterization, such as “continuation” or “continuation-in-part,” for claiming priority to U.S. patent applications. Notwithstanding the foregoing, Applicant understands that the USPTO's computer programs have certain data entry requirements, and hence Applicant has provided designation(s) of a relationship between the present application and its parent application(s) as set forth above, but expressly points out that such designation(s) are not to be construed in any way as any type of commentary and/or admission as to whether or not the present application contains any new matter in addition to the matter of its parent application(s).
A computationally implemented method includes, but is not limited to: determining that a computing device that was in possession of a first user has been transferred from the first user to a second user, the determining including at least partially identifying the second user and the computing device being designed for presenting one or more items; and presenting, via the computing device, the one or more items in one or more particular formats, the one or more particular formats being selected based, at least in part, on said determining. In addition to the foregoing, other method aspects are described in the claims, drawings, and text forming a part of the present disclosure.
In one or more various aspects, related systems include but are not limited to circuitry and/or programming for effecting the herein-referenced method aspects; the circuitry and/or programming can be virtually any combination of hardware, software, and/or firmware in one or more machines or articles of manufacture configured to effect the herein-referenced method aspects depending upon the design choices of the system designer.
A computationally implemented system includes, but is not limited to: means for determining that a computing device that was in possession of a first user has been transferred from the first user to a second user, the determining including at least partially identifying the second user and the computing device being designed for presenting one or more items; and means for presenting, via the computing device, the one or more items in one or more particular formats, the one or more particular formats being selected based, at least in part, on said determining. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present disclosure.
A computationally implemented system includes, but is not limited to: circuitry for determining that a computing device that was in possession of a first user has been transferred from the first user to a second user, the determining including at least partially identifying the second user and the computing device being designed for presenting one or more items; and circuitry for presenting, via the computing device, the one or more items in one or more particular formats, the one or more particular formats being selected based, at least in part, on said determining. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present disclosure.
An article of manufacture including a non-transitory storage medium bearing one or more instructions for determining that a computing device that was in possession of a first user has been transferred from the first user to a second user, the determining including at least partially identifying the second user and the computing device being designed for presenting one or more items; and one or more instructions for presenting, via the computing device, the one or more items in one or more particular formats, the one or more particular formats being selected based, at least in part, on said determining. In addition to the foregoing, other computer program product aspects are described in the claims, drawings, and text forming a part of the present disclosure.
A method for determining that a computing device that was in possession of a first user has been transferred from the first user to a second user, the determining including at least partially identifying the second user and the computing device being designed for presenting one or more items; and presenting, via the computing device, the one or more items in one or more particular formats, the one or more particular formats being selected based, at least in part, on said determining, wherein said determining that a computing device that was in possession of a first user has been transferred from the first user to a second user, the determining including at least partially identifying the second user and the computing device being designed for presenting one or more items and/or said presenting, via the computing device, the one or more items in one or more particular formats, the one or more particular formats being selected based, at least in part, on said determining are performed via at least one of a machine, article of manufacture, or composition of matter.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
a shows one type of movement that may be detected/monitored by the computing device 10* of
b shows another type of movement that may be detected/monitored by the computing device 10* of
c shows another type of movement that may be detected/monitored by the computing device 10* of
d shows overall 3-dimensional movements of the computing device 10* of
a shows a particular implementation of the computing device 10* of
b shows another implementation of the computing device 10* of
c shows another perspective of the transfer determining module 102* of
d shows another perspective of the particular format presenting module 104* of
e shows various types of sensors 120 that may be included in the computing device 10* of
a is a high-level logic flowchart of a process depicting alternate implementations of the transfer determining operation 402 of
b is a high-level logic flowchart of a process depicting alternate implementations of the transfer determining operation 402 of
c is a high-level logic flowchart of a process depicting alternate implementations of the transfer determining operation 402 of
d is a high-level logic flowchart of a process depicting alternate implementations of the transfer determining operation 402 of
e is a high-level logic flowchart of a process depicting alternate implementations of the transfer determining operation 402 of
f is a high-level logic flowchart of a process depicting alternate implementations of the transfer determining operation 402 of
g is a high-level logic flowchart of a process depicting alternate implementations of the transfer determining operation 402 of
h is a high-level logic flowchart of a process depicting alternate implementations of the transfer determining operation 402 of
a is a high-level logic flowchart of a process depicting alternate implementations of the particular format presenting operation 404 of
b is a high-level logic flowchart of a process depicting alternate implementations of the particular format presenting operation 404 of
c is a high-level logic flowchart of a process depicting alternate implementations of the particular format presenting operation 404 of
d is a high-level logic flowchart of a process depicting alternate implementations of the particular format presenting operation 404 of
a illustrates an example item 702a being displayed by the computing device 10* of
b illustrates an example item 702b being displayed by the computing device 10* of
c illustrates an example item 702c being displayed by the computing device 10* of
d illustrates an example item 702d being displayed by the computing device 10* of
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here.
Advances in computing technologies and related technologies (e.g., visual display technology, battery technology, etc.) in recent years have greatly facilitated the development of computing devices having increasingly smaller form factors while still maintaining exceptional processing capabilities. Examples of such mobile computing devices include, for example, laptops, Netbooks, tablet computers (i.e., “slate” computers), e-readers, Smartphones, personal digital assistants (PDAs), and so forth. Because of their compactness, such mobile computing devices (herein “computing devices”) are becoming much easier to share among a plurality of users. That is, due to their small form factors, such devices allow users of such devices to physically share such devices with friends, family, co-workers, clients, and so forth.
These portable computing devices, similar to their larger brethren, are able to visually and/or audibly present a wide variety of applications and content (herein “items”) in a wide range of formats depending on, for example, the needs of the users and the types of items to be presented. There are a number of ways to format items (e.g., applications such as gaming, productivity, or communication applications, audio or image files, textual documents, web pages, communication messages, and so forth) that may be visually and/or audibly presented through such devices. One way to format such items is to directly format the items themselves. For example, items such as textual documents, including word processing documents or email messages, may be formatted to be presented in a wide variety of font styles and font sizes depending on, for example, the particular needs of users (e.g., elderly users who have poor vision and/or hearing, or who have unsteady fingers and have difficulty using, for example, a touchscreen). Another way to format such items is to configure a user interface (e.g., a display monitor and/or speakers) that is used to visually and/or audibly present the items (e.g., applications and content) in a particular way so that items that are presented through the user interface are presented in appropriate forms. For example, a display monitor may be configured in a particular way so that one or more items (e.g., video files) that are to be displayed through the display monitor may be displayed through a screen having a certain brightness and color background that may be, for example, desired by the end user.
In accordance with various embodiments, computationally implemented methods, systems, and articles of manufacture are provided that can automatically determine whether a computing device that is designed for presenting one or more electronic items and that is in the possession of a first user has been transferred from the first user to a second user, the determination including at least partially identifying the second user; and presenting through the computing device the one or more electronic items in one or more particular formats, the one or more particular formats being selected based, at least in part, on the determination that the computing device was transferred from the first user to the second user. In various embodiments, such computationally implemented methods, systems, and articles of manufacture may be implemented at the computing device.
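As a purely illustrative sketch of the two operations described above — the transfer determination (including at least partially identifying the second user) followed by format selection — the overall flow might be organized as follows. All function names, data layouts, and values here are hypothetical and are not drawn from the disclosure:

```python
# Hypothetical sketch of the two claimed operations: (1) determine that the
# device has been transferred and at least partially identify the new user,
# (2) select presentation formats based on that determination.

def determine_transfer(sensor_events, known_users):
    """Return the identity (or None) of the user the device was handed to.

    sensor_events: list of (cue_type, value) pairs, e.g. ("face", "user_b").
    known_users: set of registered user identifiers.
    """
    for cue_type, value in sensor_events:
        if cue_type in ("face", "voice") and value in known_users:
            return value  # second user at least partially identified
    return None

def select_formats(user, preferences, generic):
    """Pick per-user presentation preferences, else fall back to generic."""
    return preferences.get(user, generic)

events = [("movement", "tilt"), ("face", "user_b")]
prefs = {"user_b": {"font_size": 18}}
who = determine_transfer(events, {"user_a", "user_b"})
formats = select_formats(who, prefs, {"font_size": 12})
```

The sketch deliberately separates the determination from the format selection, mirroring how the disclosure treats them as distinct operations.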
Referring now to
Although the computing device 10* illustrated in
There are a number of ways to determine whether a computing device 10* is or has been transferred from one user to another. In some cases, for instance, various sensor-provided data may be collected in order to make such a determination. Such data may indicate various environmental aspects surrounding the computing device 10* and/or aspects of the computing device itself (e.g., movements). For example, when the computing device 10* of
One way to track the movements or gestures of the first user 20 is to track the movements of the computing device 10*. That is, these gestures that may be exhibited by the first user 20 during the transfer of a computing device 10* from the first user 20 to the second user 30 may cause the computing device 10* to be spatially moved in a particular way. Thus, in order to detect whether a computing device 10* is being transferred from a first user 20 to a second user 30, one may observe the spatial movements of the computing device 10* in order to detect spatial movements that when detected at least infer the transfer of the computing device 10* between the first user 20 and the second user 30. For example, the computing device 10* may maintain in its memory 114 (see
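The comparison of detected spatial movements against a stored catalogue of "transfer" movements (cf. the movement library 170 referenced above) could be sketched as follows. The matching function, the distance measure, and the tolerance are all invented for illustration:

```python
# Hypothetical sketch: compare a detected movement trace against a library of
# catalogued signature movements previously identified with device transfers.
# A simple mean-absolute-difference test stands in for whatever matching the
# endowed logic actually performs.

def matches_library(detected, library, tolerance=1.0):
    """True if the detected trace is within `tolerance` mean absolute
    difference of any catalogued signature of equal length."""
    for signature in library:
        if len(signature) != len(detected):
            continue
        diff = sum(abs(a - b) for a, b in zip(detected, signature)) / len(signature)
        if diff <= tolerance:
            return True
    return False

# Catalogued tilt traces (degrees over time) associated with hand-offs.
movement_library = [[0, 15, 40, 70, 85], [0, 10, 30, 60, 80]]
observed = [0, 14, 41, 69, 86]
```

A match at least infers that the device is being transferred; a non-match leaves the detected motion classified as a possible "noise movement."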
One way to monitor for such movements of the computing device 10* is to directly detect such movements using one or more “movement” sensors that are designed to directly detect/measure movements. Examples of such movement sensors include, for example, inertia sensors, accelerometers (e.g. three-axis or 3D accelerometers), gyroscopes, and so forth. These sensors (herein “movement” sensors 202—see
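One simple computation such direct movement sensing enables is deriving a tilt angle from the gravity vector reported by a three-axis accelerometer (one of the "movement" sensors 202 named above). The sensor interface below is invented; the trigonometry is standard:

```python
# Hypothetical sketch: derive a tilt angle from 3-axis accelerometer samples,
# the kind of raw data that direct movement detection relies on.

import math

def tilt_from_accel(ax, ay, az):
    """Tilt of the device from horizontal, in degrees, computed from the
    gravity vector reported by a three-axis accelerometer (m/s^2)."""
    g_xy = math.sqrt(ax * ax + ay * ay)
    return math.degrees(math.atan2(g_xy, az))

# Device flat on a table: gravity entirely on z -> 0 degrees of tilt.
flat = tilt_from_accel(0.0, 0.0, 9.81)
# Device held upright: gravity on y -> 90 degrees of tilt.
upright = tilt_from_accel(0.0, 9.81, 0.0)
```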
Since not all movements of the computing device 10* that may be detected will be a result of the computing device 10* being transferred between two users, in various embodiments and as will be further described herein, the computing device 10* may be endowed with particular logic for determining (e.g., identifying) which detected movements of the computing device 10* indicate or at least suggest that the computing device 10* is or has been transferred from, for example, a first user 20 to a second user 30 and which detected movements may merely be “noise movements.”
Various types of movements of the computing device 10* may be tracked in order to determine or at least infer that the computing device 10* is being transferred between, for example, a first user 20 and a second user 30. Examples of the types of movements that may be tracked include, for example, tilt type movements, spin-rotation type movements, spatial relocation type movements, vibration movements, and so forth of the computing device 10*. In order to determine or at least infer that the computing device 10* has been transferred from the first user 20 to the second user 30, one or more of these movements of the computing device 10* may be, individually or in combination, tracked using one or more sensors 120 that may be included with the computing device 10* as illustrated in
Referring now to
By detecting that the computing device 10* has been tilted in a particular manner from a first tilt orientation to a second tilt orientation, a determination or at least an inference may be made that the computing device 10* has been transferred from the first user 20 to the second user 30. In particular, when the first user 20 is handing-off or transferring the computing device 10* to the second user 30, the first user 20 may tilt the computing device 10* in a particular way that may be identifiable. Thus, when the computing device 10* is being transferred from a first user 20 to a second user 30, the computing device 10* (or rather the logic endowed with the computing device 10*) may track the movements of the computing device 10* as it moves from a first tilt orientation (e.g., the tilt of the computing device 10* at the beginning of the transfer or when the first user 20 was using the computing device 10*) to a second tilt orientation (e.g., the tilt of the computing device 10* at the end of the transfer or when the second user 30, for example, has obtained possession of the computing device 10*).
In order to make a determination or inference that a transfer was made from the first user 20 to the second user 30, the computing device 10* or at least the logic endowed in the computing device 10* may examine the particular movements of the computing device 10* (e.g., how the computing device 10* was reoriented from a first tilt orientation to a second tilt orientation, including speed and cadence of the reorientation) as the computing device 10* moves from the first tilt orientation to a second tilt orientation. The computing device 10* may additionally or alternatively analyze the second tilt orientation (e.g., the tilt of the computing device 10* after it has finished being reoriented) at least with respect to the first tilt orientation in order to determine or infer that the computing device 10* has been transferred. To further determine or at least infer that the computing device 10* has been transferred from the first user 20 to the second user 30, the examination/analysis of the detected tilt movements of the computing device 10* may involve comparing the detected tilt movements of the computing device 10* with catalogued or library tilt movements (which may be stored in the memory 114 of the computing device 10*) that are identified as being movements associated with transfer of the computing device 10* between two users.
That is, the computing device 10* may maintain in its memory 114 (see
Thus, another aspect of tilt orientation changes that may be considered in order to determine or infer that a transfer has taken place is to simply look at the end points of the tilt reorientation and their differences. In other words, the first tilt orientation (e.g., the tilt orientation of the computing device 10* before the computing device 10* was reoriented) and the second tilt orientation (e.g., the end tilt orientation of the computing device 10* after it has been reoriented) may be analyzed with respect to each other, along with the differences between the first tilt orientation and the second tilt orientation. Thus, in some embodiments, the computing device 10* may also or additionally maintain a catalogue or library of changes of tilt orientation (e.g., tilt orientation changes) that have been previously identified as tilt changes that occur when, for example, a computing device 10* is transferred between two users. Such a catalogue or library of tilt orientation changes may be stored as part of a movement library 170 stored in memory 114 (see
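The end-point comparison described above — looking only at the difference between the first and second tilt orientations and testing it against catalogued tilt-orientation changes — might be sketched as follows. The angles, tolerance, and catalogued values are illustrative assumptions:

```python
# Hypothetical sketch: instead of analyzing the whole reorientation path, look
# only at the end points of the tilt reorientation and test whether their
# difference matches a catalogued tilt-orientation change.

def tilt_change_indicates_transfer(first_tilt, second_tilt, catalogued_changes,
                                   tolerance=5.0):
    """True if the observed change in tilt (degrees) is within `tolerance` of
    any change previously catalogued as occurring during a hand-off."""
    observed_change = abs(second_tilt - first_tilt)
    return any(abs(observed_change - c) <= tolerance for c in catalogued_changes)

# Changes of tilt orientation previously identified with device transfers.
transfer_tilt_changes = [80.0, 95.0, 120.0]
```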
Referring now to
By detecting that the computing device 10* has been spin-rotated in a particular manner, a determination or at least an inference may be made that the computing device 10* has been transferred from the first user 20 to the second user 30. In particular, when the first user 20 is handing-off or transferring the computing device 10* to the second user 30, the first user 20 may spin-rotate the computing device 10* in a particular way. Thus, when the computing device 10* is being transferred from a first user 20 to a second user 30, the computing device 10* (or rather the logic endowed with the computing device 10*) may track the movements of the computing device 10* as it moves from a first spin orientation (e.g., the orientation of the computing device 10* at the beginning of the transfer or when the first user 20 was using the computing device 10*) to a second spin orientation (e.g., the orientation of the computing device 10* at the end of the transfer or when the second user 30 has obtained possession of the computing device 10*).
Similar to the tilt or tilt movement detection/analysis described earlier, in order to make a determination or inference that a transfer was made from the first user 20 to the second user 30, the computing device 10* or at least the logic endowed in the computing device 10* may scrutinize the particular movements of the computing device 10* as the computing device 10* spin rotates from a first orientation to a second orientation. The computing device 10* may additionally or alternatively analyze the second orientation (e.g., the orientation of the computing device 10* after it has finished being spin rotated) at least with respect to the first orientation (e.g., the orientation of the computing device 10* before it was spin rotated) in order to determine or at least infer that the computing device 10* has been transferred. To further determine or at least infer that the computing device 10* has been transferred from the first user 20 to the second user 30, the examination/analysis of the detected spin rotation of the computing device 10* from the first orientation to the second orientation may involve comparing the detected spin rotation movement of the computing device 10* with catalogued or library spin rotation movements that are identified as being associated with transfer of the computing device 10*. That is, the computing device 10* may maintain in its memory 114 (see
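The spin-rotation analysis described above could be sketched by integrating gyroscope yaw-rate samples into a net rotation and comparing that net rotation against catalogued spin rotations. The sampling interval, rates, and catalogued values are all invented for illustration:

```python
# Hypothetical sketch: detect a spin rotation by integrating gyroscope
# yaw-rate samples and comparing the net rotation with catalogued spin
# rotations associated with hand-offs.

def net_spin(yaw_rates, dt):
    """Integrate yaw rate samples (deg/s) taken every `dt` seconds."""
    return sum(r * dt for r in yaw_rates)

def spin_indicates_transfer(yaw_rates, dt, catalogued_spins, tolerance=10.0):
    observed = abs(net_spin(yaw_rates, dt))
    return any(abs(observed - s) <= tolerance for s in catalogued_spins)

# A 180-degree spin is a plausible catalogued hand-off: the first user rotates
# the device so its display faces the second user.
catalogued = [180.0]
samples = [90.0] * 20  # 90 deg/s for 2 s at dt = 0.1 -> 180 degrees net
```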
Turning now to
In some embodiments, in order to determine or at least infer that the computing device 10* has been transferred from the first user 20 to the second user 30, the computing device 10* may be endowed with logic that detects/monitors vibrations. That is, each user who may come in contact with the computing device 10* may pass on to the computing device 10* a unique vibration pattern or signature (e.g., as a result of the user's heartbeat). Thus, when the first user 20 is holding the computing device 10*, the computing device 10* may vibrate in a particular vibration pattern that is associated with the first user 20. In contrast, when the computing device 10* has been transferred to the second user 30 and the second user 30 is holding the computing device 10*, the computing device 10* may vibrate in a vibration pattern that is associated with the second user 30. Thus, one way to determine whether the computing device 10* has been transferred from the first user 20 to the second user 30 is to detect/monitor at least changes in vibrations of the computing device 10*. In some cases, this may involve the computing device 10* (or at least the logic endowed with the computing device 10*) initially detecting the particular vibration pattern of the computing device 10* when the computing device 10* is being held by the first user 20, and to detect when the computing device 10* no longer vibrates in such a particular vibration pattern. In order to determine whether the computing device 10* has been transferred from the first user 20 to the second user 30, the computing device 10* in some cases may be further designed to determine that the computing device 10* is vibrating in a way that matches with a vibration pattern of the second user 30. By making such a determination, an inference may be made that the computing device 10* is being held or is in contact with the second user 30.
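The vibration-signature matching described above could be sketched by reducing each user's vibration pattern to a single characteristic value (here, a period) and finding the closest stored signature. The representation and the numbers are illustrative assumptions, not the disclosure's actual feature set:

```python
# Hypothetical sketch: decide who is holding the device by comparing its
# current vibration pattern against per-user vibration signatures (e.g., the
# periodicity imparted by a user's heartbeat).

def closest_user(observed_period, signatures, tolerance=0.05):
    """signatures maps user -> characteristic vibration period in seconds.
    Returns the best-matching user, or None if nothing is within tolerance."""
    best_user, best_diff = None, tolerance
    for user, period in signatures.items():
        diff = abs(observed_period - period)
        if diff <= best_diff:
            best_user, best_diff = user, diff
    return best_user

signatures = {"first_user": 0.85, "second_user": 1.02}
```

When the match shifts from the first user's signature to the second user's (or to no known signature), an inference of transfer may follow.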
In some embodiments, the computing device 10* may include logic that is designed to determine whether the computing device 10* has moved away from the first user 20 in order to determine whether the computing device 10* has been transferred from the first user 20 to the second user 30. That is, by making such a determination, an inference may be made that the computing device 10* has been transferred from the first user 20 to the second user 30. In some embodiments, in order to make such a determination, data from a combination of sensors 120 may be processed and analyzed. That is, in order to determine whether the computing device 10* has moved away from the first user 20, a combination of one or more movement sensors 202 (see
In some embodiments, and as illustrated in
As described briefly above, in addition to directly detecting the movements of the computing device 10* using movement sensors 202 (e.g., inertia sensors, accelerometers, gyroscopes, and so forth), other types of environmental aspects may be detected/monitored in order to determine whether the computing device 10* has been transferred from a first user 20 to a second user 30. For instance, in some embodiments, the computing device 10* or the logic endowed with the computing device 10* may be designed to detect, using one or more image capturing devices 204, certain visual cues that when detected at least infer the transfer of the computing device 10* from a first user 20 to a second user 30. For example, in some embodiments, the computing device 10* may be endowed with logic that at least detects, via one or more image capturing devices 204, changes in lighting in the proximate vicinity of the computing device 10*. That is, generally when an object is moved from one spatial location to another spatial location, as in the case of a computing device 10* being transferred between two users, the object will be exposed to changes in lighting conditions. Thus, by merely detecting changes in lighting conditions of the computing device 10*, at least an inference may be made that the computing device 10* is being transferred between two users.
Alternatively or additionally, in some embodiments, the computing device 10* may be endowed with a facial recognition system (e.g., facial recognition software) that when employed with one or more image capturing devices 204 may be used in order to determine the presence or absence of a face associated with the first user 20 or the second user 30 within the proximate vicinity of the computing device 10*. If the face associated with the first user 20 is not detected in the proximate vicinity of the computing device 10* and/or if a face not associated with the first user 20 is detected in the proximate vicinity of the computing device 10*, such as the face of the second user 30, then a determination or at least an inference may be made that a transfer of the computing device 10* from the first user 20 to the second user 30 may have occurred. The phrase “proximate vicinity” as used here is in reference to the immediate area surrounding the computing device 10* or within a distance from the computing device 10* from which an object or a person is visually (or audibly) discernable or identifiable by the computing device 10* using, for example, a facial recognition system (or a voice verification system).
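The presence/absence rule described above — infer a transfer when the first user's face is no longer in the proximate vicinity and/or an unassociated face is detected — can be sketched as a simple predicate over mocked face-detection results. The face identifiers and detection interface are assumptions; a real facial recognition system would supply them:

```python
# Hypothetical sketch of the face presence/absence rule: the detection results
# are mocked as a list of face identifiers currently in the camera's view.

def infer_transfer_from_faces(detected_faces, first_user_face):
    """True when the first user's face is absent from the proximate vicinity
    and/or a face not associated with the first user is present."""
    first_user_absent = first_user_face not in detected_faces
    other_face_present = any(f != first_user_face for f in detected_faces)
    return first_user_absent or other_face_present
```

Note that, consistent with the disclosure's hedged language, a True result supports only an inference that a transfer may have occurred, not a conclusive determination.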
Another type of visual cue that the computing device 10* or at least the logic endowed with the computing device 10* may look for in order to determine whether the computing device 10* has been transferred from a first user 20 to a second user 30 is the presence or absence of one or more eyes (e.g., irises or retinas) in the proximate vicinity of the computing device 10* that are determined to be associated with the first user 20 or the second user 30. In particular, if the eyes of the first user 20 are determined not to be at least in the field of view of an image capturing device 204 of the computing device 10* and/or if one or more eyes of another person (e.g., second user 30) other than the first user 20 are determined to be in the field of view of the image capturing device 204, then at least an inference may be made that the computing device 10* has been transferred from the first user 20 to the second user 30.
Yet another type of visual cue that the computing device 10* or at least the logic endowed with the computing device 10* may look for in order to determine whether the computing device 10* has been transferred from a first user 20 to a second user 30 is whether the first user 20 or the second user 30 has visually exhibited movements or visual gestures that indicate or at least suggest that the computing device 10* has been transferred from the first user 20 to the second user 30. That is, and as described earlier, one way to track the movements or gestures of the first user 20 that indicate or at least suggest that the computing device 10* has been transferred from the first user 20 to the second user 30 is to directly detect or track the movements of the computing device 10* using, for example, one or more movement sensors 202. An alternative technique for detecting the gestures of the first user 20 (or the second user 30) that indicate or at least suggest that the computing device 10* has been transferred from the first user 20 to the second user 30 is to visually detect such gestures using, for example, one or more image capturing devices 204. For example, when the computing device 10* or at least the logic endowed with the computing device 10* using one or more image capturing devices 204 visually detects the first user 20 extending his or her arms out (such as when the first user 20 is passing the computing device 10*), then that may at least suggest that the computing device 10* is being transferred. Similarly, when the computing device 10* or at least the logic endowed with the computing device 10* detects the second user 30 withdrawing his or her arms, then that may at least suggest that the second user 30 is receiving the computing device 10*.
Note that in some cases, multiple image capturing devices 204 may be employed by the computing device 10* in order to obtain better visual data. For example, by using multiple visual sensors (i.e., image capturing devices 204), a better image of the face or eyes of the first user 20 or the second user 30 may be obtained. Further, by employing multiple visual sensors, rather than a single visual sensor, a more accurate determination regarding the location of the first user 20 or the second user 30 (e.g., the location of faces or eyes of the first user 20 and/or the second user 30) relative to the specific orientation of the computing device 10* may be obtained. As will be further described herein, in some embodiments, such information may be useful in order to properly format items that may be presented by the computing device 10* when the computing device 10* is transferred to the second user 30.
In various embodiments, the computing device 10* or at least the logic that may be endowed with the computing device 10* may be designed to look for the absence or presence of audio cues in the proximate vicinity of the computing device 10* in order to determine or at least infer whether the computing device 10* has been transferred from a first user 20 to a second user 30. For example, in some embodiments, the computing device 10* may be endowed with a voice verification system that may be designed to detect, via one or more audio capturing devices 206 (e.g., one or more microphones), a voice in the proximate vicinity of the computing device 10* having a voice pattern that may be different from the voice pattern of the first user 20. By making such a determination and/or by detecting the absence of a voice pattern associated with the first user 20 in the proximate vicinity of the computing device 10*, at least an inference may be made that the computing device 10* has been transferred from the first user 20 to the second user 30.
In some embodiments, the computing device 10* or at least the logic endowed with the computing device 10* may be designed to determine the transfer of the computing device 10* from the first user 20 to the second user 30 based on one or more detected movements of the computing device 10*, one or more detected visual cues, and/or one or more detected audio cues. That is, since in many situations, a particular type of data or measurement (e.g., detected movements of the computing device 10* or detected visual cues in the proximate vicinity of the computing device 10*) may not reliably or conclusively indicate that the transfer of the computing device 10* from the first user 20 to the second user 30 has occurred, in various embodiments, the computing device 10* may make the determination as to whether the computing device 10* has been transferred based on different types of measurements (e.g., direct movements of the computing device 10*, visual cues, and/or audio cues).
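Because no single measurement is treated as conclusive, the multi-cue determination described above resembles a simple sensor fusion, which might be sketched as a weighted combination of per-cue confidences. The weights and threshold below are invented for illustration:

```python
# Hypothetical sketch of combining cue types: movement, visual, and audio cues
# each contribute a weighted score, and the transfer determination is made
# only when the combined score clears a threshold.

def transfer_score(movement_cue, visual_cue, audio_cue,
                   weights=(0.4, 0.4, 0.2)):
    """Each cue is a confidence in [0, 1]; returns the weighted combination."""
    cues = (movement_cue, visual_cue, audio_cue)
    return sum(w * c for w, c in zip(weights, cues))

def transfer_determined(movement_cue, visual_cue, audio_cue, threshold=0.6):
    return transfer_score(movement_cue, visual_cue, audio_cue) >= threshold
```

Under this sketch, a strong movement cue alone (e.g., 0.9, 0.0, 0.0) would not suffice, capturing the point that one measurement type may not reliably indicate a transfer by itself.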
In various embodiments, in order to properly format the items (e.g., electronic items such as audio and/or image files, textual documents, applications, application interfaces, Internet web pages, textual messages, voice messages, and so forth) that may be presented through the computing device 10* after the computing device 10* has been transferred to the second user 30, the determination operation for determining whether the computing device 10* has been transferred from the first user 20 to the second user 30 may include an operation to at least partially identify the second user 30. That is, in various embodiments, the selection of the format to be applied to the one or more items that are to be presented through the computing device 10* may depend on at least the partial identification of the second user 30. For example, if the second user 30 is a primary user or owner of the computing device 10*, the second user 30 may prefer that certain formatting be applied to the one or more items to be presented through the computing device 10*.
In some embodiments, in order to at least partially identify the second user 30, the computing device 10* or at least the endowed logic may at least determine that the second user 30 is a different user from the first user 20. Alternatively or additionally, in order to at least partially identify the second user 30, the computing device 10* or at least the endowed logic may determine whether the second user 30 is registered with the computing device 10*. That is, whether the computing device 10* or at least the endowed logic recognizes the second user 30 by determining whether certain detected biometrics of the second user 30 (e.g., facial or retinal characteristics, or voice pattern) have already been inputted or stored in the computing device 10*.
If the computing device 10* (or the endowed logic) does indeed recognize the second user 30, then the computing device 10* may determine whether there are any presentation preferences 174 (see
In various embodiments, the memory 114 of the computing device 10* may store one or more presentation preferences 174 of one or more users. In some embodiments, the memory 114 may store one or more presentation preferences 174 that are specifically associated with a primary user or owner of the computing device 10* and generic one or more presentation preferences 174 for any other users who may access the computing device 10*. Thus, when the computing device 10* determines that the primary user or owner of the computing device 10* has possession of the computing device 10*, then the one or more presentation preferences 174 that are determined to be specifically associated with the primary user or owner will be invoked. On the other hand, if the computing device 10* determines that someone else other than the primary user or owner has possession of the computing device 10*, then the generic one or more presentation preferences 174 may be invoked.
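The owner-specific versus generic preference lookup described above might be sketched as follows. The dictionary-based registry and the function name are illustrative assumptions standing in for the presentation preferences 174 stored in memory 114.

```python
def select_preferences(user_id, owner_id, stored_preferences, generic_preferences):
    # stored_preferences: hypothetical mapping of user id -> that user's
    # presentation preferences. Invoke the owner-specific preferences only
    # when the identified possessor is the registered primary user/owner;
    # otherwise fall back to the generic preferences.
    if user_id == owner_id and owner_id in stored_preferences:
        return stored_preferences[owner_id]
    return generic_preferences
```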
As described earlier, in addition to being able to determine that the computing device 10* has been transferred from a first user 20 to a second user 30, the computing device 10* or at least the logic that may be endowed with the computing device 10* may also be designed to present one or more items in one or more particular formats that were selected based, at least in part, on the determination that the computing device 10* has been transferred from the first user 20 to the second user 30 and the at least partial identification of the second user 30. In various embodiments, the one or more items that may be presented in the one or more particular formats may be items that were open or running prior to the transfer of the computing device 10* and/or electronic items that were accessible through the computing device 10* (e.g., electronic documents and files that were stored in the computing device 10*) prior to the transfer of the computing device 10* to the second user 30.
The type of formatting to be selected and applied based on the determination that the computing device 10* has been transferred from the first user 20 to the second user 30 and the at least partial identification of the second user 30 will depend on a number of factors including what types of items are to be formatted and whether there are any presentation preferences 174 associated with the second user 30 that can be used in order to properly format the items to be presented through the computing device 10*. A more detailed discussion related to the presentation of the one or more items in the one or more particular formats will be provided herein.
Referring now to
In various embodiments, the transfer determining module 102′ of
Turning now to
Note that although
In various embodiments, the memory 114 of the computing device 10′ of
Turning now to
Referring now to
e illustrates the various types of sensors 120 that may be included with the computing device 10*(e.g., the computing device 10′ of
A more detailed discussion related to the computing device 10* of
Further, in
In any event, after a start operation, the operational flow 400 of
In addition to the transfer determining operation 402, operational flow 400 may also include a particular format presenting operation 404 for presenting, via the computing device, the one or more items in one or more particular formats, the one or more particular formats being selected based, at least in part, on said determining as further illustrated in
Various types of formatting may be applied in various alternative implementations. For example, in some cases, the presenting of the one or more items in one or more particular formats may involve displaying the one or more items (e.g., textual documents) in one or more particular font styles or sizes. In some cases, the presenting of the one or more items in one or more particular formats may involve audibly presenting the one or more items (e.g., audio or video files) at a particular volume level. In other cases, the presenting of the one or more items in one or more particular formats may involve displaying the one or more items through a user interface 110 (e.g., a display device 12 such as a touch screen) that has been configured to display items at a particular level or levels of brightness, tint, hue, and/or contrast. In still other cases, the presenting of the one or more items in one or more particular formats may involve displaying the one or more items in one or more particular color schemes. Other types of formatting may additionally or alternatively be applied to the one or more items to be presented in various other implementations as will be further described herein.
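The formatting alternatives just listed (font size for documents, volume for audio, display brightness) can be sketched as one preference-application step. The item/preference dictionary shapes, key names, and defaults are all illustrative assumptions.

```python
def apply_format(item, preferences):
    # item: hypothetical dict with a "kind" key ("document", "audio", ...).
    # preferences: hypothetical presentation preferences for the identified user.
    formatted = dict(item)
    if item.get("kind") == "document":
        # Display textual documents in a particular font size.
        formatted["font_size"] = preferences.get("font_size", 12)
    elif item.get("kind") == "audio":
        # Audibly present audio items at a particular volume level.
        formatted["volume"] = preferences.get("volume", 0.5)
    # Configure the display at a particular level of brightness.
    formatted["brightness"] = preferences.get("brightness", 1.0)
    return formatted
```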
As will be further described herein, the transfer determining operation 402 and the particular format presenting operation 404 of
As further illustrated in
In various implementations, data from various types of sensors 120 may be used in order to determine whether the computing device 10* has been transferred. For example, in various implementations, operation 503 may include an operation 504 for determining that the computing device has been transferred from the first user to the second user based, at least in part, on data provided by one or more movement sensors that are designed to directly sense movements of the computing device. For instance, the transfer determining module 102* of the computing device 10* of
In some implementations, operation 504 may include an operation 505 for determining that the computing device has been transferred from the first user to the second user based, at least in part, on data provided by at least one of an accelerometer, an inertia sensor, or a gyro sensor as further depicted in
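A minimal sketch of movement-sensor-based detection (operations 504/505) follows. The spike threshold and the idea that a hand-off produces an acceleration spike beyond ordinary handling are illustrative assumptions; real accelerometer/gyro processing would be considerably more involved.

```python
import math

def acceleration_magnitude(sample):
    # sample: (x, y, z) accelerometer reading in m/s^2.
    return math.sqrt(sum(axis ** 2 for axis in sample))

def detect_handoff_motion(samples, spike_threshold=15.0):
    # Flag a possible device transfer when any acceleration sample exceeds
    # a spike threshold that ordinary one-user handling rarely produces.
    # (At rest the magnitude is ~9.8 m/s^2 from gravity alone.)
    return any(acceleration_magnitude(s) > spike_threshold for s in samples)
```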
In the same or different implementations, operation 503 may include an operation 506 for determining that the computing device has been transferred from the first user to the second user based, at least in part, on data provided by one or more image capturing devices. For instance, the transfer determining module 102* of the computing device 10* of
In the same or alternative implementations, operation 503 may include an operation 507 for determining that the computing device has been transferred from the first user to the second user based, at least in part, on data provided by one or more audio capturing devices. For instance, the transfer determining module 102* of the computing device 10* of
In some cases, a more accurate determination that the computing device 10* has been transferred between two users may be obtained if data from different types of sensors are processed and analyzed. For example, in some implementations, operation 503 may include an operation 508 for determining that the computing device has been transferred from the first user to the second user based, at least in part, on data provided by one or more movement sensors and one or more image capturing devices as depicted in
In some alternative implementations, operation 503 may include an operation 509 for determining that the computing device has been transferred from the first user to the second user based, at least in part, on data provided by one or more movement sensors, one or more image capturing devices, and one or more audio capturing devices. For instance, the transfer determining module 102* of the computing device 10* of
In some alternative implementations, operation 503 may include an operation 510 for determining that the computing device has been transferred from the first user to the second user based, at least in part, on data provided by one or more movement sensors and one or more audio capturing devices. For instance, the transfer determining module 102* of the computing device 10* of
In some alternative implementations, operation 503 may include an operation 511 for determining that the computing device has been transferred from the first user to the second user based, at least in part, on data provided by one or more image capturing devices and one or more audio capturing devices. For instance, the transfer determining module 102* of the computing device 10* of
Turning now to
As further illustrated in
In some cases, operation 513 may, in turn, include an operation 514 for detecting that the computing device is no longer in a particular tilt orientation that the computing device was detected as having when the computing device was in the possession of the first user by at least detecting that the computing device has been reoriented from the particular tilt orientation to another tilt orientation that when detected as occurring at least suggests that the computing device has been transferred from the first user to the second user as further depicted in
In the same or different implementations, operation 513 may include an operation 515 for detecting that the computing device is no longer in a particular tilt orientation that the computing device was detected as having when the computing device was in the possession of the first user by at least detecting that the computing device has been reoriented from the particular tilt orientation to another tilt orientation having an angular tilt that is at least a predefined percentage different from an angular tilt associated with the particular tilt orientation that the computing device was detected as having when the computing device was in the possession of the first user as further depicted in
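The "predefined percentage" tilt comparison of operation 515 might be sketched as below. Representing tilt as a single angle and the 20 percent default are illustrative assumptions.

```python
def tilt_changed(baseline_angle, current_angle, percent_threshold=20.0):
    # True when the current tilt orientation differs from the tilt the
    # device had while in the first user's possession by at least
    # percent_threshold percent of the baseline angle.
    if baseline_angle == 0:
        return current_angle != 0
    change = abs(current_angle - baseline_angle) / abs(baseline_angle) * 100.0
    return change >= percent_threshold
```

Requiring a minimum percentage change filters out small orientation jitter that would not suggest a transfer.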
In various implementations, the operation 512 for determining that the computing device has been transferred from the first user to the second user by at least detecting that the computing device has exhibited one or more particular movements that at least suggest that the computing device has been transferred from the first user to the second user may involve detecting that the computing device 10* has at least been relocated away from a particular location. For example, in some implementations, operation 512 may include an operation 516 for detecting that the computing device has exhibited the one or more particular movements that at least suggest that the computing device has been transferred from the first user to the second user by at least detecting that the computing device is no longer at a particular spatial location that the computing device was detected as being located at when the computing device was in the possession of the first user as illustrated in
In various implementations, operation 516 may include an operation 517 for detecting that the computing device is no longer at a particular spatial location that the computing device was detected as being located at when the computing device was in the possession of the first user by at least detecting that the computing device has been relocated from the particular spatial location to another spatial location that when detected at least suggests that the computing device has been transferred from the first user to the second user. For instance, the spatial location detection module 214 of the computing device 10* detecting that the computing device 10* is no longer at a particular spatial location that the computing device 10* was detected as being located at when the computing device 10* was in the possession of the first user 20 by at least detecting that the computing device 10* has been relocated from the particular spatial location (e.g., see spatial location 46 of
In the same or different implementations, operation 516 may include an operation 518 for detecting that the computing device is no longer at a particular spatial location that the computing device was detected as being located at when the computing device was in the possession of the first user by at least detecting that the computing device has been relocated from the particular spatial location to another spatial location that is at least a predefined distance away from the particular spatial location that the computing device was detected as being located at when the computing device was in the possession of the first user. For instance, the spatial location detection module 214 of the computing device 10* detecting that the computing device 10* is no longer at a particular spatial location that the computing device 10* was detected as being located at when the computing device 10* was in the possession of the first user 20 by at least detecting that the computing device 10* has been relocated from the particular spatial location to another spatial location that is at least a predefined distance away from the particular spatial location that the computing device 10* was detected as being located at when the computing device 10* was in the possession of the first user 20.
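The "predefined distance" relocation test of operation 518 can be sketched in two dimensions. The planar coordinates and the 0.6 m default distance are illustrative assumptions.

```python
import math

def relocated(baseline_xy, current_xy, min_distance=0.6):
    # True when the device has moved at least min_distance (metres, assumed)
    # away from the spatial location it occupied while the first user held it.
    dx = current_xy[0] - baseline_xy[0]
    dy = current_xy[1] - baseline_xy[1]
    return math.hypot(dx, dy) >= min_distance
```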
Referring to
In the same or different implementations, operation 512 may include an operation 520 for detecting that the computing device has exhibited the one or more particular movements that at least suggest that the computing device has been transferred from the first user to the second user by at least detecting that the computing device has moved away from the first user. For instance, the particular movement detecting module 210 including the moving away detecting module 217 (see
In some implementations, operation 520 may further include an operation 521 for detecting that the computing device has moved away from the first user by at least detecting that the computing device has moved a predefined distance away from the first user. For instance, the moving away detecting module 217 of the computing device 10* detecting that the computing device 10* has moved away from the first user 20 by at least detecting that the computing device 10* has moved a predefined distance away from the first user 20. In doing so, movements exhibited by the computing device 10* that may be considered “noise” (e.g., random or accidental relocation movements of the computing device 10* caused by, for example, the random or accidental movements of the first user 20 holding the computing device 10*) may be filtered out and ignored.
In various embodiments, operation 512 for determining that the computing device has been transferred from the first user to the second user by at least detecting that the computing device has exhibited one or more particular movements that at least suggest that the computing device has been transferred from the first user to the second user may involve tracking or sensing vibrations to which the computing device 10* is exposed. That is, each user who may come in contact with the computing device 10* may be associated with a relatively unique signature vibration pattern (e.g., heart rate). Thus, by detecting at least a change in vibration, at least an inference may be made that a transfer of the computing device 10* may have occurred. Thus, in various implementations, operation 512 may include an operation 522 for detecting that the computing device has exhibited the one or more particular movements that at least suggest that the computing device has been transferred from the first user to the second user by at least detecting that the computing device is no longer vibrating in a manner that matches with a vibration pattern that the computing device was detected as having when the computing device was in the possession of the first user as illustrated in
As further illustrated in
In the same or different implementations, operation 512 may include an operation 524 for detecting that the computing device has exhibited the one or more particular movements that at least suggest that the computing device has been transferred from the first user to the second user by at least detecting that the computing device is not vibrating in a manner that matches with a vibration pattern that is associated with the first user. For instance, the particular movement detecting module 210 including the vibration detecting module 218 of the computing device 10* detecting that the computing device 10* has exhibited the one or more particular movements that at least suggest that the computing device 10* has been transferred from the first user 20 to the second user 30 when the vibration detecting module 218 at least detects that the computing device 10* is not vibrating in a manner that matches with a vibration pattern that is associated with the first user 20.
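The signature-vibration comparison of operations 522–524 might be sketched as follows. Representing a vibration pattern as a short list of samples and the mean-absolute-difference tolerance are illustrative assumptions; an actual match against a heart-rate-like signature would use proper signal processing.

```python
def vibration_mismatch(current_pattern, first_user_pattern, tolerance=0.1):
    # True when the observed vibration samples differ from the first
    # user's stored signature vibration pattern by more than the
    # tolerance on average, suggesting a different person is holding
    # the device.
    diffs = [abs(a - b) for a, b in zip(current_pattern, first_user_pattern)]
    return sum(diffs) / len(diffs) > tolerance
```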
In various implementations, operation 512 for determining that the computing device has been transferred from the first user to the second user by at least detecting that the computing device has exhibited one or more particular movements that at least suggest that the computing device has been transferred from the first user to the second user may involve tracking the overall movements of the computing device 10* rather than tracking a particular type of movement (e.g., tilt movements, spin rotation movements, spatial relocation movements, vibration movements, etc.) in order to determine whether the computing device 10* has been transferred from the first user 20 to the second user 30. For example, in some implementations, operation 512 may include an operation 525 for detecting that the computing device has exhibited the one or more particular movements that at least suggest that the computing device has been transferred from the first user to the second user by at least detecting that the computing device has moved in a particular three-dimensional movement that infers that the computing device has been transferred from the first user to the second user. For instance, the particular movement detecting module 210 including the three-dimensional movement detecting module 219 (see
In various implementations, the transfer determining operation 402 of
As further illustrated in
In the same or different implementations, operation 526 may include an operation 528 for detecting the presence or absence of the one or more visual cues in the proximate vicinity of the computing device by detecting at least a change in lighting in the proximate vicinity of the computing device that at least suggests that the computing device has at least been moved. For instance, the visual cue detecting module 220 including the lighting change detecting module 221 (see
In some cases, operation 528 may further include an operation 529 for detecting at least a change in lighting in the proximate vicinity of the computing device that at least suggests that the computing device has at least been moved by detecting at least a predefined amount of change in lighting in the proximate vicinity of the computing device within a predefined time period as further depicted in
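The "predefined amount of change in lighting within a predefined time period" test of operation 529 can be sketched over a series of ambient-light samples. The lux units, the sample-window stand-in for a time period, and the 50-lux default are illustrative assumptions.

```python
def lighting_changed(lux_samples, window=3, min_delta=50.0):
    # True when brightness in the device's proximate vicinity changes by
    # at least min_delta (lux, assumed) between any two samples taken no
    # more than `window` samples apart, suggesting the device was moved.
    for i in range(len(lux_samples)):
        for j in range(i + 1, min(i + window + 1, len(lux_samples))):
            if abs(lux_samples[j] - lux_samples[i]) >= min_delta:
                return True
    return False
```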
In the same or different implementations, operation 526 may include an operation 530 for detecting presence or absence of the one or more visual cues in the proximate vicinity of the computing device by at least detecting presence of at least one face in the proximate vicinity of the computing device not associated with the first user. For instance, the visual cue detecting module 220 including the face detecting module 222 (see
As further illustrated in
In some cases, operation 526 may alternatively or additionally include an operation 532 for detecting the presence or absence of the one or more visual cues in the proximate vicinity of the computing device by at least detecting presence of a first face associated with the first user and a second face associated with the second user in the proximate vicinity of the computing device, the second face being detected as being closer to the computing device than the first face. For instance, the visual cue detecting module 220 including the face detecting module 222 of the computing device 10* detecting the presence or absence of the one or more visual cues in the proximate vicinity of the computing device 10* when the face detecting module 222 at least detects presence of a first face associated with the first user 20 and a second face associated with the second user 30 in the proximate vicinity of the computing device 10*, the second face being detected as being closer to the computing device 10* than the first face of the first user 20.
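The face-based cues of operations 530–532 might be sketched as follows. Using bounding-box area as a rough proxy for which face is closer to the camera, and the (user id, area) tuple representation, are illustrative assumptions.

```python
def infer_transfer_from_faces(faces, first_user_id):
    # faces: list of (recognized_user_id_or_None, bounding_box_area)
    # detected in the frame. A face whose bounding box occupies more of
    # the frame is generally nearer the camera; infer a transfer when
    # the nearest face does not belong to the first user.
    if not faces:
        return False
    nearest_id, _ = max(faces, key=lambda f: f[1])
    return nearest_id != first_user_id
```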
In the same or different implementations, operation 526 may include an operation 533 for detecting the presence or absence of the one or more visual cues in the proximate vicinity of the computing device by detecting presence of at least one eye in the proximate vicinity of the computing device not associated with the first user as further illustrated in
In some cases operation 533 may further include an operation 534 for detecting the presence of the at least one eye in the proximate vicinity of the computing device not associated with the first user by at least detecting presence of the at least one eye in the proximate vicinity of the computing device that is recognized as being associated with the second user. For instance, the eye detecting module 223 of the computing device 10* detecting the presence of the at least one eye in the proximate vicinity of the computing device 10* not associated with the first user 20 by at least detecting presence of the at least one eye in the proximate vicinity of the computing device 10* that is recognized as being associated with the second user 30. Thus, in some cases, the computing device 10* may store in its memory 114 images of eyes (e.g., images of irises or retinas) belonging to one or more parties including, for example, the second user 30.
Turning to
In the same or different implementations, operation 526 may include an operation 536 for detecting the presence or absence of the one or more visual cues in the proximate vicinity of the computing device by at least detecting absence of a visual cue associated with the first user in the proximate vicinity of the computing device as further illustrated in
As further illustrated in
In the same or different implementations, operation 536 may include an operation 538 for detecting the absence of a visual cue associated with the first user in the proximate vicinity of the computing device by at least detecting absence of one or more eyes associated with the first user in the proximate vicinity of the computing device as further depicted in
In various implementations, operation 526 for determining that the computing device has been transferred from the first user to the second user by at least detecting presence or absence of one or more visual cues in proximate vicinity of the computing device that at least suggest that the computing device has been transferred from the first user to the second user may further include an operation 539 for detecting the presence or absence of the one or more visual cues in the proximate vicinity of the computing device by at least detecting visually that the computing device has moved away from the first user as further depicted in
In the same or alternative implementations, operation 526 may additionally or alternatively include an operation 540 for detecting the presence or absence of the one or more visual cues in the proximate vicinity of the computing device by at least detecting visually that the computing device has moved closer to the second user. For instance, the visual cue detecting module 220 including the visual moving closer detecting module 225 (see
In various implementations, the transfer determining operation 402 of
As further illustrated in 5f, operation 541 may include one or more additional operations in various alternative implementations. For example, in some implementations, operation 541 may include an operation 542 for detecting the presence or absence of the one or more audio cues in the proximate vicinity of the computing device by at least detecting absence of an audio voice pattern associated with the first user in the proximate vicinity of the computing device. For instance, the audio cue detecting module 226 including the voice pattern detecting module 227 (see
In the same or different implementations, operation 541 may include an operation 543 for detecting the presence or absence of the one or more audio cues in the proximate vicinity of the computing device by at least detecting presence of at least one audio voice pattern not associated with the first user in the proximate vicinity of the computing device. For instance, the audio cue detecting module 226 including the voice pattern detecting module 227 of the computing device 10* detecting the presence or absence of the one or more audio cues in the proximate vicinity of the computing device 10* when the voice pattern detecting module 227 at least detects presence of at least one audio voice pattern not associated with the first user 20 in the proximate vicinity (e.g., within 5 feet or within some other distance from which voice of the second user 30 is at least clearly discernable or identifiable) of the computing device 10*.
As further illustrated in
In the same or different implementations, operation 541 may include an operation 545 for detecting the presence or absence of the one or more audio cues in the proximate vicinity of the computing device by at least detecting audibly that the computing device has moved away from the first user. For instance, the audio cue detecting module 226 including the audio moving away detecting module 228 (see
In the same or different implementations, operation 541 may include an operation 546 for detecting the presence or absence of the one or more audio cues in the proximate vicinity of the computing device by at least detecting audibly that the computing device has moved closer to the second user. For instance, the audio cue detecting module 226 including the audio moving closer detecting module 229 (see
In various implementations, the transfer determining operation 402 of
In some alternative implementations, the transfer determination operation 402 may alternatively include an operation 548 for determining that the computing device has been transferred from the first user to the second user by detecting presence or absence of one or more visual cues and one or more audio cues in proximate vicinity of the computing device and by detecting that the computing device has exhibited one or more movements that at least suggest that the computing device has been transferred from the first user to the second user. For instance, the transfer determining module 102* including the visual cue detecting module 220, the audio cue detecting module 226, and the particular movement detecting module 210 of the computing device 10* determining that the computing device 10* has been transferred from the first user 20 to the second user 30 when the visual cue detecting module 220 and the audio cue detecting module 226 detect presence or absence of one or more visual cues and one or more audio cues in proximate vicinity of the computing device 10* and when the particular movement detecting module 210 detects that the computing device 10* has exhibited one or more movements that at least suggest that the computing device 10* has been transferred from the first user 20 to the second user 30.
In some alternative implementations, the transfer determining operation 402 may include an operation 549 for determining that the computing device has been transferred from the first user to the second user by detecting presence or absence of one or more visual cues and one or more audio cues in proximate vicinity of the computing device that at least suggest that the computing device has been transferred from the first user to the second user as further depicted in
In some alternative implementations, the transfer determining operation 402 may include an operation 550 for determining that the computing device has been transferred from the first user to the second user by detecting presence or absence of one or more audio cues in proximate vicinity of the computing device and by detecting that the computing device has exhibited one or more movements that at least suggest that the computing device has been transferred from the first user to the second user. For instance, the transfer determining module 102* including the audio cue detecting module 226 and the particular movement detecting module 210 of the computing device 10* determining that the computing device 10* has been transferred from the first user 20 to the second user 30 when the audio cue detecting module 226 detects presence or absence of one or more audio cues in proximate vicinity of the computing device 10* and when the particular movement detecting module 210 detects that the computing device 10* has exhibited one or more movements that at least suggest that the computing device 10* has been transferred from the first user 20 to the second user 30.
Referring now to
As further illustrated in
In the same or different implementations, the transfer determining operation 402 may include an operation 553 for identifying at least partially the second user by at least determining that the second user is a registered user who is registered with the computing device. For instance, the user identifying module 230* including the registered user determining module 232 (see
As further illustrated in
In some cases, operation 554 may include an operation 555 for acquiring the second user's one or more biometric identification credentials and determining that the second user's one or more biometric identification credentials are at least registered with the computing device. For instance, the biometric credential acquiring module 234 (see
In various implementations, operation 555 may further include an operation 556 for acquiring the second user's one or more facial and/or retinal profiles and determining that the second user's one or more facial and/or retinal profiles are at least registered with the computing device. For instance, the biometric credential acquiring module 234 of the computing device 10* acquiring the second user's one or more facial and/or retinal profiles (e.g., digital pictures of the second user's face or retinas) and the registered user determining module 232 of the computing device 10* determining that the second user's one or more facial and/or retinal profiles are at least registered with the computing device 10*.
In the same or different implementations, operation 555 may alternatively or additionally include an operation 557 for acquiring the second user's one or more signature voice patterns and determining that the second user's one or more signature voice patterns are at least registered with the computing device. For instance, the biometric credential acquiring module 234 of the computing device 10* acquiring the second user's one or more signature voice patterns (e.g., audio voice recording) and the registered user determining module 232 of the computing device 10* determining that the second user's one or more signature voice patterns are at least registered with the computing device 10*.
In the same or different implementations, operation 555 may alternatively or additionally include an operation 558 for acquiring the second user's one or more signature movement patterns and determining that the second user's one or more signature movement patterns are at least registered with the computing device. For instance, the biometric credential acquiring module 234 of the computing device 10* acquiring the second user's one or more signature movement patterns (e.g., heart/pulse rate, or a secret personal gesture movement that is only known by the second user 30) and the registered user determining module 232 of the computing device 10* determining that the second user's one or more signature movement patterns are at least registered with the computing device 10*.
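The biometric checks of operations 555 through 558 can be sketched, under hypothetical assumptions about how credentials are stored, as a simple lookup against credentials registered with the device; the registry structure and matching rule below are illustrative only:

```python
# Illustrative sketch of operations 555-558: an acquired biometric credential
# (a facial/retinal profile, a signature voice pattern, or a signature
# movement pattern) is compared with credentials registered with the device.
# The registry below stands in for whatever storage the device actually uses.

REGISTERED_CREDENTIALS = {
    "user_b": {"face": "face_profile_b", "voice": "voice_pattern_b"},
}

def is_registered_user(user_id, credential_type, credential):
    """Return True if the acquired credential is registered with the device."""
    registered = REGISTERED_CREDENTIALS.get(user_id, {})
    return registered.get(credential_type) == credential
```

A real implementation would compare biometric templates with a similarity measure rather than exact equality; equality is used here only to keep the sketch self-contained.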
In the same or alternative implementations, operation 553 for identifying at least partially the second user by at least determining that the second user is a registered user who is registered with the computing device may include an operation 559 for determining that the second user is a registered user by determining that one or more presentation preferences associated with the second user are registered with the computing device, the one or more presentation preferences being one or more preferences for how the one or more items are to be preferably presented via the computing device. For instance, the registered user determining module 232 including the registered preference determining module 235 (see
As further illustrated in
In the same or different implementations, operation 559 may alternatively or additionally include an operation 561 for determining that the one or more presentation preferences associated with the second user are registered with the computing device, the one or more presentation preferences being one or more preferences of the second user for how the one or more items are to be preferably presented via the computing device. For instance, the registered preference determining module 235 of the computing device 10* determining that the one or more presentation preferences associated with the second user 30 are registered with the computing device 10*, the one or more presentation preferences 174 being one or more preferences of the second user 30 for how the one or more items are to be preferably presented via the computing device 10*. Thus, in some cases, the second user 30 may have previously registered (e.g., previously entered or inputted) such preferences with the computing device 10*. For example, if the second user 30 was a primary user (e.g., a person having greater access rights to the computing device 10* than other users) or an owner of the computing device 10*, then the second user 30 may have previously registered his/her presentation preferences.
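The preference-registration test of operations 559 through 561 can be sketched as follows; the preference keys and storage layout are hypothetical examples, not part of the disclosure:

```python
# Sketch of operations 559-561: a user counts as "registered" when one or
# more presentation preferences associated with that user are stored on the
# computing device. The preference fields below are illustrative only.

PRESENTATION_PREFERENCES = {
    "user_b": {"font_size": 18, "color_scheme": "high_contrast"},
}

def preferences_registered(user_id):
    # Operation 559: the presence of stored preferences itself serves to
    # determine that the user is a registered user.
    return user_id in PRESENTATION_PREFERENCES

def get_preferences(user_id):
    """Return the stored preferences, or an empty mapping if none exist."""
    return PRESENTATION_PREFERENCES.get(user_id, {})
```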
Referring back to the particular format presenting operation 404 of
In some implementations, the particular format presenting operation 404 of
As further illustrated in
In various implementations, the particular format presenting operation 404 may include an operation 665 for presenting via the computing device the one or more items in the one or more particular formats by presenting one or more electronic items in the one or more particular formats. For instance, the particular format presenting module 104* of the computing device 10* presenting via the computing device 10* the one or more items in the one or more particular formats by presenting one or more electronic items (e.g., audio, video, and/or image files, word processing documents, spreadsheet documents, application interface, electronic passwords, software applications including gaming, productivity, and/or communication applications, and so forth) in the one or more particular formats.
As further illustrated in
In the same or different implementations, operation 665 may include an operation 667 for presenting the one or more electronic items in the one or more particular formats by presenting one or more image and/or audio files in the one or more particular formats. For instance, the particular format presenting module 104* of the computing device 10* presenting the one or more electronic items in the one or more particular formats by presenting one or more image and/or audio files (e.g., digital photos, audio recordings, voice messages, and so forth) in the one or more particular formats.
In the same or different implementations, operation 665 may include an operation 668 for presenting the one or more electronic items in the one or more particular formats by presenting one or more applications in the one or more particular formats. For instance, the particular format presenting module 104* of the computing device 10* presenting the one or more electronic items in the one or more particular formats by presenting one or more applications (e.g., software applications including gaming applications, communication applications, and/or productivity applications) in the one or more particular formats.
In some cases, operation 668 may further include an operation 669 for presenting the one or more electronic items in the one or more particular formats by presenting one or more application interfaces in the one or more particular formats. For instance, the particular format presenting module 104* of the computing device 10* presenting the one or more electronic items in the one or more particular formats by presenting (e.g., displaying) one or more application interfaces (e.g., modified application interfaces) in the one or more particular formats. For example, displaying an application interface that has been modified so that one or more functionalities are not available, or modifying portions of the application interface (e.g., making a menu or drop-down menu bigger) so that it is easier to use.
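The interface modifications described for operation 669 (disabling functionalities, enlarging menu elements) can be sketched as below; the interface representation, field names, and scale factor are assumptions for illustration:

```python
# Hypothetical sketch of operation 669: an application interface is presented
# in a modified form, e.g., with selected functionalities made unavailable
# and menu elements enlarged for the identified second user.

def modify_interface(interface, disabled_features=(), menu_scale=1.0):
    """Return a copy of the interface with features removed and menus scaled."""
    modified = dict(interface)  # leave the original interface untouched
    modified["features"] = [f for f in interface["features"]
                            if f not in disabled_features]
    modified["menu_size"] = interface["menu_size"] * menu_scale
    return modified
```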
In the same or different implementations, operation 665 may include an operation 670 for presenting the one or more electronic items in the one or more particular formats by presenting one or more credentials in the one or more particular formats. For instance, the particular format presenting module 104* of the computing device 10* presenting the one or more electronic items in the one or more particular formats by presenting one or more electronic credentials (e.g., electronic passwords that cannot be copied or duplicated) in the one or more particular formats.
Referring to
In some implementations, operation 671 may include an operation 672 for presenting the one or more items in the one or more particular visual and/or audio formats by presenting the one or more items to include text that is presented in one or more particular font styles and/or sizes that were selected based, at least in part, on said determining as further illustrated in
In the same or different implementations, operation 671 may include an operation 673 for presenting the one or more items in the one or more particular visual and/or audio formats by presenting the one or more items in one or more particular color schemes that were selected based, at least in part, on said determining. For instance, the particular format presenting module 104* including the format selecting module 240 of the computing device 10* presenting the one or more items in the one or more particular visual and/or audio formats by presenting the one or more items in one or more particular color schemes that were selected by the format selecting module 240 based, at least in part, on said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30.
In the same or different implementations, operation 671 may include an operation 674 for presenting the one or more items in the one or more particular visual and/or audio formats by presenting the one or more items in one or more particular audio schemes that were selected based, at least in part, on said determining. For instance, the particular format presenting module 104* including the format selecting module 240 of the computing device 10* presenting the one or more items in the one or more particular visual and/or audio formats by presenting the one or more items in one or more particular audio schemes that were selected by the format selecting module 240 based, at least in part, on said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30.
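The format selection of operations 672 through 674 (font styles and/or sizes, color schemes, audio schemes keyed to the identified user) can be sketched as a lookup with a default fallback; the user categories and format values are illustrative assumptions:

```python
# Sketch of operations 672-674: once the second user is at least partially
# identified, particular visual and/or audio formats (font size, color
# scheme, audio volume) are selected for presenting the one or more items.

FORMAT_TABLE = {
    "child":   {"font_size": 24, "color_scheme": "bright",   "volume": 0.5},
    "default": {"font_size": 12, "color_scheme": "standard", "volume": 0.8},
}

def select_format(user_category):
    # Fall back to a default format when the identified user has no entry.
    return FORMAT_TABLE.get(user_category, FORMAT_TABLE["default"])
```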
In various implementations, the particular format presenting operation 404 of
As a further illustration, referring now to
As further illustrated in
In the same or different implementations, operation 675 may additionally or alternatively include an operation 677 for presenting the one or more items with the one or more modifications by presenting the one or more items to include one or more substitutes for one or more selective portions of the one or more items that have been selectively replaced based, at least in part, on said determining as further depicted in
In the same or different implementations, operation 675 may alternatively or additionally include an operation 678 for presenting the one or more items with the one or more modifications by presenting the one or more items to include one or more additions that have been selectively added to the one or more items based, at least in part, on said determining. For instance, the modified form presenting module 242 of the computing device 10* presenting the one or more items with the one or more modifications by presenting the one or more items to include one or more additions that have been selectively added to the one or more items based, at least in part, on said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30. An example result of such an operation would be, for example, the inverse of
In the same or different implementations, operation 675 may alternatively or additionally include an operation 679 for presenting the one or more items with the one or more modifications by presenting the one or more items to include one or more portions that have been selectively altered based, at least in part, on said determining. For instance, the modified form presenting module 242 of the computing device 10* presenting the one or more items with the one or more modifications by presenting the one or more items to include one or more portions that have been selectively altered based, at least in part, on said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30.
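The modifications of operations 676 through 679 (selective deletion, substitution, addition, or alteration of portions of the one or more items) can be sketched as below; representing an item as a list of portions, and the portion values themselves, are assumptions for illustration:

```python
# Sketch of operations 676-679: the one or more items are presented in a
# modified form, with selected portions removed or substituted depending on
# who now possesses the device.

def present_modified(item_portions, remove=(), substitute=None):
    """Return the item with selected portions removed or replaced."""
    substitute = substitute or {}
    result = []
    for portion in item_portions:
        if portion in remove:
            continue  # selectively deleted portion (cf. operation 676)
        # substituted portion, if a replacement is given (cf. operation 677)
        result.append(substitute.get(portion, portion))
    return result
```

Additions (operation 678) would simply append or insert new portions into the returned list; alterations (operation 679) are the substitution case with a transformed rather than replaced portion.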
Referring now to
In some implementations, operation 680 may further include an operation 681 for presenting the one or more items through the user interface that has been particularly configured to present the one or more items in the one or more particular ways, the user interface to be visually configured in one or more particular ways based, at least in part, on said determining. For instance, the particular format presenting module 104* including the user interface configuring module 244 of the computing device 10* presenting the one or more items through the user interface 110 (e.g., a touchscreen) that has been particularly configured by the user interface configuring module 244 to present the one or more items in the one or more particular ways, the user interface 110 to be visually configured in one or more particular ways based, at least in part, on said determining.
In the same or different implementations, operation 680 may additionally or alternatively include an operation 682 for presenting the one or more items through the user interface that has been particularly configured to present the one or more items in the one or more particular ways, the user interface to be audibly configured in one or more particular ways based, at least in part, on said determining. For instance, the particular format presenting module 104* including the user interface configuring module 244 of the computing device 10* presenting the one or more items through the user interface 110 (e.g., speakers) that has been particularly configured by the user interface configuring module 244 to present the one or more items in the one or more particular ways, the user interface 110 to be audibly configured in one or more particular ways based, at least in part, on said determining.
In the same or different implementations, the particular format presenting operation 404 of
A presentation preference 174 may indicate how one or more items may be preferably presented (e.g., preferable format) via, for example, the computing device 10*. Note that the one or more presentation preferences 174 of the second user 30 may or may not be the actual preferences of the second user 30. That is, in some cases, the one or more presentation preferences 174 of the second user 30 may be the preferences of another party. For example, if the first user 20 is the primary user or owner of the computing device 10*, then the one or more presentation preferences 174 of the second user 30 may be the preferences of the first user 20 as to how the first user 20 wishes the one or more items to be presented to the second user 30 via the computing device 10*. On the other hand, if the second user 30 is the primary user or owner of the computing device 10* then the one or more presentation preferences 174 of the second user 30 may actually be the preferences of the second user 30.
Accordingly and as further illustrated in
As further illustrated in
Turning now to
In various implementations, operation 686 may include one or more additional operations in various alternative implementations. For example, in some implementations, operation 686 may include an operation 687 for presenting the one or more items in the one or more particular formats in response, at least in part, to said determining, the one or more particular formats being selected based, at least in part, on detected location of the second user relative to front-side of the computing device, the front-side of the computing device being a side of the computing device having a display device. For instance, the particular format presenting module 104* including the format selecting module 240 and the user location determining module 248 of the computing device 10* presenting the one or more items in the one or more particular formats in response, at least in part, to said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30, the one or more particular formats being selected by the format selecting module 240 based, at least in part, on detected location (e.g., as detected by the user location determining module 248) of the second user 30 relative to front-side 17a of the computing device 10*, the front-side 17a of the computing device 10* being a side of the computing device 10* having a display device 12 (e.g., a touchscreen or an LCD).
In some cases, operation 687 may further include an operation 688 for presenting the one or more items in the one or more particular formats in response, at least in part, to said determining, the one or more particular formats being selected based, at least in part, on detected location or locations of one or more features of the second user relative to the front-side of the computing device. For instance, the particular format presenting module 104* including the format selecting module 240 and the user location determining module 248 of the computing device 10* presenting the one or more items in the one or more particular formats in response, at least in part, to said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30, the one or more particular formats being selected by the format selecting module 240 based, at least in part, on detected location or locations (e.g., as detected by the user location determining module 248) of one or more features of the second user 30 relative to the front-side 17a of the computing device 10*.
As further illustrated in
In the same or different implementations, operation 688 may include an operation 690 for presenting the one or more items in the one or more particular formats in response, at least in part, to said determining, the one or more particular formats being selected based, at least in part, on detected distance between the one or more features of the second user and the front-side of the computing device. For instance, the particular format presenting module 104* including the format selecting module 240 and the user location determining module 248 of the computing device 10* presenting the one or more items in the one or more particular formats in response, at least in part, to said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30, the one or more particular formats being selected by the format selecting module 240 based, at least in part, on detected distance (e.g., as detected by the user location determining module 248) between the one or more features of the second user 30 and the front-side 17a of the computing device 10*. For example, increasing the font size of the one or more items or increasing brightness of the display device 12 through which the one or more items are to be displayed if the face of the second user 30 is determined by the user location determining module 248 as being relatively “far away” from the front-side 17a of the computing device 10*.
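The distance-based selection of operation 690 can be sketched as below; the distance unit, threshold, and output values are assumptions chosen only to make the example concrete:

```python
# Sketch of operation 690: a format is selected from the detected distance
# between a feature of the second user (e.g., the face) and the front-side
# of the device, increasing font size and display brightness when the user
# is determined to be relatively "far away". Threshold is illustrative.

def format_for_distance(distance_cm, far_threshold_cm=60):
    """Select font size and brightness based on detected face distance."""
    if distance_cm > far_threshold_cm:
        return {"font_size": 20, "brightness": 1.0}  # user is "far away"
    return {"font_size": 12, "brightness": 0.7}      # user is close
```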
In the same or different implementations, operation 688 may include an operation 691 for presenting the one or more items in the one or more particular formats in response, at least in part, to said determining, the one or more particular formats being selected based, at least in part, on detected location or locations of one or more eyes of the second user relative to the front-side of the computing device. For instance, the particular format presenting module 104* including the format selecting module 240 and the user location determining module 248 of the computing device 10* presenting the one or more items in the one or more particular formats in response, at least in part, to said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30, the one or more particular formats being selected by the format selecting module 240 based, at least in part, on detected location or locations (e.g., as detected by the user location determining module 248) of one or more eyes of the second user 30 relative to the front-side 17a of the computing device 10*.
In the same or different implementations, operation 688 may include an operation 692 for presenting the one or more items in the one or more particular formats in response, at least in part, to said determining, the one or more particular formats being selected based, at least in part, on detected location of a face of the second user relative to the front-side of the computing device. For instance, the particular format presenting module 104* including the format selecting module 240 and the user location determining module 248 of the computing device 10* presenting the one or more items in the one or more particular formats in response, at least in part, to said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30, the one or more particular formats being selected by the format selecting module 240 based, at least in part, on detected location of a face (e.g., as detected by the user location determining module 248) of the second user 30 relative to the front-side 17a of the computing device 10*.
In the same or different implementations, operation 688 may include an operation 693 for presenting the one or more items in the one or more particular formats in response, at least in part, to said determining, the one or more particular formats being selected based, at least in part, on the detected location or locations of the one or more features of the second user as sensed by one or more image capturing devices. For instance, the particular format presenting module 104* including the format selecting module 240 and the user location determining module 248 of the computing device 10* presenting the one or more items in the one or more particular formats in response, at least in part, to said determining that the computing device 10* was transferred from the first user 20 to the second user 30, the determining including at least partially identifying the second user 30, the one or more particular formats being selected by the format selecting module 240 based, at least in part, on the detected location or locations (e.g., as detected by the user location determining module 248) of the one or more features of the second user 30 as sensed (e.g., captured) by one or more image capturing devices 204.
Those having skill in the art will recognize that the state of the art has progressed to the point where there is little distinction left between hardware and software implementations of aspects of systems; the use of hardware or software is generally (but not always, in that in certain contexts the choice between hardware and software can become significant) a design choice representing cost vs. efficiency tradeoffs. Those having skill in the art will appreciate that there are various vehicles by which processes and/or systems and/or other technologies described herein can be effected (e.g., hardware, software, and/or firmware in one or more machines or articles of manufacture), and that the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a mainly software implementation that is implemented in one or more machines or articles of manufacture; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware in one or more machines or articles of manufacture. Hence, there are several possible vehicles by which the processes and/or devices and/or other technologies described herein may be effected, none of which is inherently superior to the other in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary. Those skilled in the art will recognize that optical aspects of implementations will typically employ optically-oriented hardware, software, and/or firmware in one or more machines or articles of manufacture.
The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuitry, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution.
Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
In a general sense, those skilled in the art will recognize that the various aspects described herein which can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or any combination thereof can be viewed as being composed of various types of “electrical circuitry.” Consequently, as used herein “electrical circuitry” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.
Those having skill in the art will recognize that it is common within the art to describe devices and/or processes in the fashion set forth herein, and thereafter use engineering practices to integrate such described devices and/or processes into data processing systems. That is, at least a portion of the devices and/or processes described herein can be integrated into a data processing system via a reasonable amount of experimentation. Those having skill in the art will recognize that a typical data processing system generally includes one or more of a system unit housing, a video display device, a memory such as volatile and non-volatile memory, processors such as microprocessors and digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces, and applications programs, one or more interaction devices, such as a touch pad or screen, and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity; control motors for moving and/or adjusting components and/or quantities). A typical data processing system may be implemented utilizing any suitable commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.
The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
While particular aspects of the present subject matter described herein have been shown and described, it will be apparent to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from the subject matter described herein and its broader aspects and, therefore, the appended claims are to encompass within their scope all such changes and modifications as are within the true spirit and scope of the subject matter described herein. Furthermore, it is to be understood that the invention is defined by the appended claims.
It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations.
In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”
The present application is related to and claims the benefit of the earliest available effective filing date(s) from the following listed application(s) (the “Related applications”) (e.g., claims earliest available priority dates for other than provisional patent applications or claims benefits under 35 U.S.C. § 119(e) for provisional patent applications, for any and all parent, grandparent, great-grandparent, etc. applications of the Related application(s)). All subject matter of the Related applications and of any and all parent, grandparent, great-grandparent, etc. applications of the Related applications, including any priority claims, is incorporated herein by reference to the extent such subject matter is not inconsistent herewith.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/065,885, entitled ACCESS RESTRICTION IN RESPONSE TO DETERMINING DEVICE TRANSFER, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 30 Mar. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/065,964, entitled ACCESS RESTRICTION IN RESPONSE TO DETERMINING DEVICE TRANSFER, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 31 Mar. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/066,848, entitled PROVIDING GREATER ACCESS TO ONE OR MORE ITEMS IN RESPONSE TO DETERMINING DEVICE TRANSFER, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 25 Apr. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/066,917, entitled PROVIDING GREATER ACCESS TO ONE OR MORE ITEMS IN RESPONSE TO DETERMINING DEVICE TRANSFER, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 26 Apr. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/135,314, entitled PROVIDING PARTICULAR LEVEL OF ACCESS TO ONE OR MORE ITEMS IN RESPONSE TO DETERMINING PRIMARY CONTROL OF A COMPUTING DEVICE, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 29 Jun. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/135,392, entitled PROVIDING PARTICULAR LEVEL OF ACCESS TO ONE OR MORE ITEMS IN RESPONSE TO DETERMINING PRIMARY CONTROL OF A COMPUTING DEVICE, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 30 Jun. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/199,237, entitled SELECTIVE ITEM ACCESS PROVISION IN RESPONSE TO ACTIVE ITEM ASCERTAINMENT UPON DEVICE TRANSFER, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 22 Aug. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/199,286, entitled SELECTIVE ITEM ACCESS PROVISION IN RESPONSE TO ACTIVE ITEM ASCERTAINMENT UPON DEVICE TRANSFER, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 23 Aug. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/200,743, entitled PROVIDING GREATER ACCESS TO ONE OR MORE ITEMS IN RESPONSE TO VERIFYING DEVICE TRANSFER, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 28 Sep. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/200,800, entitled PROVIDING GREATER ACCESS TO ONE OR MORE ITEMS IN RESPONSE TO VERIFYING DEVICE TRANSFER, naming Royce A. Levien; Richard T. Lord; Robert W. Lord; Mark A. Malamud; John D. Rinaldo, Jr.; Clarence T. Tegreene as inventors, filed 29 Sep. 2011, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
Relationship | Application Number | Filing Date | Country
---|---|---|---
Parent | 13/065,885 | Mar 2011 | US
Child | 13/317,827 | | US
Parent | 13/065,964 | Mar 2011 | US
Child | 13/065,885 | | US
Parent | 13/066,848 | Apr 2011 | US
Child | 13/065,964 | | US
Parent | 13/066,917 | Apr 2011 | US
Child | 13/066,848 | | US
Parent | 13/135,314 | Jun 2011 | US
Child | 13/066,917 | | US
Parent | 13/135,392 | Jun 2011 | US
Child | 13/135,314 | | US
Parent | 13/199,237 | Aug 2011 | US
Child | 13/135,392 | | US
Parent | 13/199,286 | Aug 2011 | US
Child | 13/199,237 | | US
Parent | 13/200,743 | Sep 2011 | US
Child | 13/199,286 | | US
Parent | 13/200,800 | Sep 2011 | US
Child | 13/200,743 | | US