Last screen rendering for electronic book reader

Information

  • Patent Grant
  • Patent Number
    9,564,089
  • Date Filed
    Monday, April 7, 2014
  • Date Issued
    Tuesday, February 7, 2017
Abstract
A handheld dedicated electronic book (“eBook”) reader device and last screen rendering techniques for enhancing user experience are described. The eBook reader device detects certain screen conversion events, such as a timeout period, a scheduled event, or an event derived from user behavior. Upon detection of such events, the eBook reader device renders, as the last screen image to remain visible after the user ceases using the device, an image that conveys to the user some meaningful association with a content item. In the context of eBooks, the eBook reader device renders a representation of the book cover as the last screen image. A progress indicator may further be included to represent user progress through the content item.
Description
BACKGROUND

A large and growing population of users is enjoying entertainment through the consumption of digital media items, such as music, movies, images, electronic books, and so on. The users employ various electronic devices to consume such media items. Among these electronic devices are electronic book readers, cellular telephones, personal digital assistants (PDAs), portable media players, tablet computers, netbooks, and the like.


One particular device that is gaining in popularity is the dedicated electronic book (“eBook”) reader device, which attempts to mimic the experience of reading a conventional book through display of electronic information on one or more electronic displays. As the quantity of available media content continues to grow, along with increasing proliferation of such dedicated devices to consume that media content, finding ways to enhance user experience continues to be a priority. As eBook reader devices continue to evolve, there remains a need to improve a reader's ability to relate comfortably with eBooks and to begin to feel as though he or she is picking up the book itself, rather than a generic electronic device.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical components or features.



FIG. 1 illustrates one exemplary implementation of a handheld dedicated electronic book (“eBook”) reader device that implements last screen rendering techniques to leave a last screen image on the eBook reader device that is relevant to content that a user is presently consuming or is expected to consume next.



FIG. 2 is a block diagram showing selected components of the eBook reader device.



FIG. 3 is a flow diagram illustrating an exemplary process for converting a display of an eBook reader device from its current image to a last image that allows the eBook reader device to more readily convey an identity of a content item.



FIG. 4 is a flow diagram illustrating a process for changing the display of the eBook reader device to the last image based on time lapse since the user last interacted with the device.



FIG. 5 is a flow diagram illustrating a process for changing the display of the eBook reader device to the last image in response to a scheduled screen conversion event.



FIG. 6 is a flow diagram illustrating a process for changing the display of the eBook reader device to the last image based on observed patterns in user behavior.



FIG. 7 shows a front plan view of the eBook reader device with a last image depicted on the display. The last image is modified to exhibit user progress through the content item represented by the last image.



FIG. 8 shows a front plan view of the eBook reader device with a last image depicted on the display, and accompanied by a progress indicator in the form of a bar graph.



FIG. 9 shows a front plan view of the eBook reader device with a last image depicted on the display, and accompanied by a progress indicator in the form of a pie graph.





DETAILED DESCRIPTION

This disclosure describes last screen rendering techniques to enhance user experience with a dedicated handheld electronic book (“eBook”) reader device. The eBook reader device is designed to allow users to read or otherwise consume electronic content (e.g., text, graphics, audio, multimedia, and the like), such as that found in eBooks (e.g., books, magazines, newspapers, periodicals, or other types of electronic documents), RSS feeds, audio books, and the like. The eBook reader device described herein employs electronic paper (“ePaper”) display technology. A characteristic of ePaper display technology is that the display is bi-stable, meaning that it is capable of holding text or other rendered images even when very little or no power is supplied to the display. Thus, the last screen image rendered on the display can be maintained and visible for very long periods of time, such as days or weeks.


The techniques described herein enhance user experience by enabling the eBook reader device to render, as the last screen image to remain visible after the user ceases using the device, an image that conveys to the user some meaningful association with a content item. For instance, in the case of eBooks, the eBook reader device renders a representation of the book cover as the last screen image. In other scenarios, the eBook reader device determines, based on a schedule or past user behavior, which content item the user is likely to consume next. Based on this determination, the eBook reader device renders, as the last screen image that persists for long periods, a representation of that content item that the user is likely to consume next. In this manner, the eBook reader device projects the identity of a content item rather than that of a generic electronic device.


For discussion purposes, the techniques are described in the context of an eBook reader device used to facilitate reading of electronic books. However, the features discussed below may be applied to other content items, such as audio books, and so forth.


Illustrative eBook Reader Device



FIG. 1 illustrates an exemplary eBook reader device 100 that is embodied as a handheld, dedicated eBook reader device. The eBook reader device 100 is equipped with a passive display 102 to present content in a human-readable format to a user. The content presented on the display 102 may take the form of electronic books or “eBooks”. For example, the display 102 depicts the text of the eBooks and also any illustrations, tables, or graphic elements that might be contained in the eBooks. The terms “book” and/or “eBook”, as used herein, include electronic or digital representations of printed works, as well as digital content that may include text, multimedia, hypertext, and/or hypermedia. Examples of printed and/or digital works include, but are not limited to, books, magazines, newspapers, periodicals, journals, reference materials, telephone books, textbooks, anthologies, instruction manuals, proceedings of meetings, forms, directories, maps, web pages, and so forth. Accordingly, the terms “book” and/or “eBook” may include any readable or viewable content that is in electronic or digital form.


The display 102 may further include touch screen capabilities that allow user input through contact or gesturing relative to the display. For convenience only, the display 102 is shown in a generally rectangular configuration. However, it is understood that the display 102 may be implemented in any shape, and may have any ratio of height to width. Also, for stylistic or design purposes, the touch-screen display 102 may be curved or otherwise non-linearly shaped.


The eBook reader device 100 also has a keyboard 104 beneath the display 102 and one or more actuatable controls 106 that may have dedicated or assigned operations. For instance, the actuatable controls 106 may include page turning buttons, a joystick, navigational keys, a power on/off button, selection keys, a touchpad, and so on.


In FIG. 1, the display 102 is shown depicting a current page 108 of the eBook version of the work titled, “Outliers” by Malcolm Gladwell. This represents a point in time T1 at which the user is actively reading through the eBook. The term “page” as used herein refers to a collection of content that is presented at one time on the display 102. Thus, a “page” may be understood as a virtual frame of the content, or a visual display window presenting the content to the user. Accordingly, “pages” as described herein are not fixed permanently, in contrast to the pages of published “hard” books. Instead, pages described herein may be redefined or repaginated when, for example, the user chooses a different font or font size for displaying the content on the display 102.


The eBook reader device 100 has various internal electronic components and software modules, which include a last screen rendering module 110 that is responsible for rendering the final image presented on the display 102 after the user ceases reading the eBook. The last screen rendering module 110 may include a timer to track a time period from the last user input. When a sufficient time period has lapsed, the module 110 assumes that the user is likely to have ceased reading the eBook. In response, the last screen rendering module 110 renders one more screen image before entering a dormant or sleep mode. The last screen image is chosen to convey something meaningful about the eBook. In one implementation, the last screen image is a representation of the book cover.
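
The timeout mechanism described above can be pictured as a small state machine: a timestamp is refreshed on every user input, and a periodic check compares the elapsed idle time against a threshold before swapping the current page for the cover image and entering the dormant mode. The following Python sketch illustrates that flow; the class and method names (LastScreenRenderer, render, enter_sleep) are hypothetical assumptions, not terms from the patent.

```python
import time

IDLE_THRESHOLD_SECONDS = 5 * 60  # the description suggests a threshold of at least five minutes

class LastScreenRenderer:
    """Hypothetical sketch of a timeout-driven last-screen conversion."""

    def __init__(self, display, current_book):
        self.display = display            # assumed to expose a render(image) method
        self.current_book = current_book  # assumed to expose a cover_image attribute
        self.last_input_time = time.monotonic()

    def on_user_input(self):
        # Any page turn, keypress, or touch resets the inactivity timer.
        self.last_input_time = time.monotonic()

    def check_idle(self):
        # Called periodically; converts the screen once the threshold lapses.
        idle = time.monotonic() - self.last_input_time
        if idle >= IDLE_THRESHOLD_SECONDS:
            # Render the book cover as the last image, then drop to a low-power state.
            self.display.render(self.current_book.cover_image)
            self.enter_sleep()

    def enter_sleep(self):
        # Placeholder for transitioning the device to a dormant or sleep mode.
        pass
```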


In FIG. 1, suppose that the last screen rendering module 110 determines that a sufficient time period has lapsed since the user has last interacted with the current page 108 in the “Outliers” book. The last screen rendering module 110 replaces the current page 108 with an image of the book cover 112, as represented by the temporal lapse transition 114. Thus, the book cover image 112 is depicted at a time T2 after a predefined time lapse from the last user interaction.


As an alternative to time lapse, the last screen rendering module 110 may further detect when the user proactively inputs a command to transition the eBook reader device 100 from an active state to a non-active state (e.g., sleep, rest, lower power level, etc.). The user may, for example, input a command using the touch screen capability of the display 102 (if available), the keyboard 104, or the actuatable controls 106 to “power down” or otherwise force the device 100 into a non-active state. In response, the last screen rendering module 110 may then replace the current page 108 with the book cover image 112.


In another implementation, the last screen rendering module 110 may further render a screen image of an eBook that the user is likely to consume next. The last screen rendering module 110 may allow a user to schedule when he or she is likely to be reading certain works. For instance, a student may enter her class schedule, and the last screen rendering module 110 renders images of the book covers of the class texts the student is likely to read during the course of the day. Alternatively, the last screen rendering module 110 may anticipate the next likely work based on past user behavior. For instance, the user may exhibit a preference for historical novels in the evening and work-related books during the day.


In FIG. 1, this is illustrated by another transition 116 that occurs when traversing a time or schedule boundary 118. These boundaries 118 may be explicitly entered by the user, automatically retrieved from the user's schedule (e.g., by interacting with the user's calendar), or learned from the user's behavior. Moreover, the user's schedule may be kept locally on the device 100, or be kept remotely and hence be received from a remote source (e.g., another computer, server, etc.) over a network, such as a wireless network. When a boundary is reached, the last screen rendering module 110 renders an image of the next eBook that the user is likely to read. In this illustration, the time boundary represents an overnight transition to the morning timeframe, when the user is likely to read the Bible. Thus, at time T3, a cover image 120 of the Bible is represented on the display 102, replacing the previous image of the “Outliers” book cover 112 that the user was reading the previous evening.



FIG. 2 illustrates selected functional components that might be implemented within the eBook reader device 100. In a very basic configuration, the device 100 includes a processing unit 202 composed of one or more processors, and memory 204. Depending on the configuration of a dedicated eBook reader device 100, the memory 204 is an example of computer storage media and may include volatile and nonvolatile memory. Thus, the memory 204 may include, but is not limited to, RAM, ROM, EEPROM, flash memory, or other memory technology, or any other medium which can be used to store media items or applications and data which can be accessed by the eBook reader device 100.


The memory 204 may be used to store any number of functional components that are executable on the processing unit 202, as well as data and content items that are rendered by the eBook reader device 100. Thus, the memory 204 may store an operating system 206 and an eBook storage database to store one or more content items 208, such as eBooks and audio books. The memory may further include a memory portion designated as an immediate page memory to temporarily store one or more pages of an electronic book. The pages held by the immediate page memory are placed therein a short period before a next page request is expected.


A user interface module 210 may also be provided in memory 204 and executed on the processing unit 202 to facilitate user operation of the device 100. The UI module 210 may provide menus and other navigational tools to facilitate selection and rendering of the content items 208. The UI module 210 may further include a browser or other application that facilitates access to sites over a network, such as websites or online merchants.


A content presentation application 212 renders the content items 208. The content presentation application 212 may be implemented as various applications depending upon the content items. For instance, the application 212 may be an electronic book reader application for rendering electronic books, or an audio player for playing audio books, or a video player for playing video, and so forth.


The last screen rendering module 110 may also be implemented as a software module stored in memory 204 and executable on the processing unit 202. The last screen rendering module 110 detects screen conversion events, such as time lapses, scheduled items, events set based on user behavior, and so forth. Upon detecting a screen conversion event, the last screen rendering module 110 directs the display 102 to present either (1) a visible representation associated with a content item that the user was last interacting with, or (2) a visible representation associated with a different content item. In the case of electronic books, the last screen rendering module 110 may render images of covers associated with the electronic books. In this manner, the eBook reader device is left standing with a screen depicting the cover of the last book that the reader was reading, or of the next book that the reader is likely to begin reading. Thus, the eBook reader device may be identified more by the book's cover than as an electronic device. Since the ePaper display can hold an image for days, weeks, or even months, the cover image may remain visible on the device for long periods until the user once again begins interacting with the device.


The last screen rendering module 110 may implement different mechanisms for determining when to render the last screen image. Illustrated mechanisms include a timer 214, a scheduler 216, and a behavior monitor 218.


The timer 214 is configured to detect when the user ceases interacting with the eBook reader device 100 for a threshold period of time. The threshold period may be user configurable. In one implementation, the threshold period is at least five minutes, although longer or shorter durations are possible. When the timer 214 reaches the threshold period, a screen conversion event is generated, causing the last screen rendering module 110 to render a different image on the display 102 that will visibly persist until the user once again begins interacting with the eBook reader device 100. In the context of electronic books, the last screen rendering module 110 converts the screen image to that of the book's cover, rather than a particular page.


The scheduler 216 allows the user to schedule, expressly or indirectly, one or more screen conversion events. Generally, the scheduler 216 enables the user to set a particular time of day, and/or day of week, as a screen conversion event at which to convert the display to an image that is associated with the same content item or another one. For instance, the user may schedule reading periods in advance of school classes, such that at each scheduled event, the eBook reader device 100 displays the cover image of the eBook associated with the next class text.


There are different ways to implement the scheduler 216. In one implementation, the scheduler 216 offers a user interface that allows the user to define scheduled events that cause a screen conversion. In another implementation, the scheduler 216 works in the background to coordinate with a calendaring application (not shown) that resides on the eBook reader device or on an external computing system with which the eBook reader device communicates. The scheduler 216 imports from the calendaring application one or more scheduled events pertaining to the consumption of content items on the eBook reader device. Moreover, this information may be pushed to the eBook reader device from a remote location over a network, such as a wireless network. For instance, the user may maintain a schedule or other events at an online service (e.g., a computing cloud or service) that sends such events to the eBook reader device, and these events are used by the last screen rendering module 110 to convert the display screen.
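
Conceptually, the scheduler reduces to a time-ordered list of (time, content item) pairs, whether entered directly or imported from a calendaring application; when the clock crosses the next entry, a screen conversion event fires for the associated item. A minimal sketch of that idea follows; the names ScheduledEvent, Scheduler, and due_events are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ScheduledEvent:
    when: datetime    # time at which the screen conversion should occur
    content_id: str   # identifier of the eBook whose cover should be shown

@dataclass
class Scheduler:
    """Hypothetical sketch of schedule-driven screen conversion events."""
    events: List[ScheduledEvent] = field(default_factory=list)

    def add_event(self, event: ScheduledEvent) -> None:
        # Events may be user-entered or imported from a calendaring application.
        self.events.append(event)
        self.events.sort(key=lambda e: e.when)

    def due_events(self, now: datetime) -> List[ScheduledEvent]:
        # Return (and consume) every event whose scheduled time has been reached.
        due = [e for e in self.events if e.when <= now]
        self.events = [e for e in self.events if e.when > now]
        return due

# Usage sketch: fire a conversion for each due event.
scheduler = Scheduler()
scheduler.add_event(ScheduledEvent(datetime(2009, 9, 28, 6, 45), "bible"))
for event in scheduler.due_events(datetime.now()):
    print(f"convert display to the cover of {event.content_id}")
```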


The behavior monitor 218 establishes screen conversion events based on the user's behavior. Over time, the behavior monitor 218 monitors user behavior during interaction with the eBook reader device. As part of this monitoring, the behavior monitor 218 observes which content items the user consumes at different times of day or days of the week. The behavior monitor 218 may identify, for example, one or more eBooks that the user is likely to want to read based on past behavior. For example, the behavior monitor 218 may learn from observed patterns that the user reads the Bible each morning between 7:00 am and 8:00 am, and that the user tends to read a book on architecture every Monday, Wednesday, and Friday, between 10:00 am and noon, in advance of a class at college. Further, the user may be found to listen to audio versions of science fiction novels most evenings, after 9:00 pm.


The behavior monitor 218 then establishes screen conversion events around the observed activities, and associates with each screen conversion event one or more content items that the user is likely to consume. For instance, the behavior monitor 218 may set a screen conversion event for every morning at, say, 6:45 am (in advance of the usual behavior of 7:00 am to 8:00 am), and associate the Bible with this event. When the last screen rendering module 110 detects this screen conversion event, it causes the display to render a cover image of the Bible in anticipation of the user picking up the eBook reader device to read the Bible at the regular time.
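
The timing side of a behavior-based event reduces to simple arithmetic: take the observed start of a reading habit and back the conversion event off by a fixed lead time, as in the 6:45 am example above. A small sketch of that step follows, with hypothetical names (behavior_event, lead) that are not from the patent.

```python
from datetime import date, datetime, time, timedelta

def behavior_event(content_id: str, usual_start: time, lead: timedelta = timedelta(minutes=15)):
    """Hypothetical helper: place a screen conversion event slightly before an observed reading habit."""
    # Anchor to an arbitrary date so time arithmetic is straightforward, then keep only the time of day.
    anchor = datetime.combine(date(2000, 1, 1), usual_start)
    fire_at = (anchor - lead).time()
    return fire_at, content_id

# Usage sketch: a user who reads the Bible at 7:00 am gets a 6:45 am conversion event.
print(behavior_event("bible", time(7, 0)))  # -> (datetime.time(6, 45), 'bible')
```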


As shown in both FIGS. 1 and 2, the eBook reader device 100 has a display 102. In one implementation, the display uses ePaper display technology. As noted above, the ePaper display technology is bi-stable, meaning that it is capable of holding text or other rendered images even when very little or no power is supplied to the display. Some exemplary ePaper-like displays that may be used with the implementations described herein include bi-stable LCDs, MEMS, cholesteric, pigmented electrophoretic, and others. In other implementations, or for other types of devices, the display may be embodied using other technologies, such as LCDs and OLEDs, and may further include a touch screen interface. In some implementations, a touch sensitive mechanism may be included with the display to form a touch-screen display.


In the same or different implementations, the display 102 may be a flexible display and further include a touch sensitive membrane, film, or other form of sensing material. The flexible display may be positioned, for example, above a touch sensor(s). The touch sensor(s) may be a resistive touch sensitive film. The flexible display may also include a protective layer made of a flexible material such as plastic. The flexible display may also include a flexible backplane layer. The backplane may also be made of a flexible material, such as plastic, metal, glass or a polymer based material. A flexible backplane may be bendable, rollable, light-weight, etc. In one configuration, the flexible backplane is a matrix backplane on a plastic substrate.


The eBook reader device 100 may further be equipped with various input/output (I/O) components 220. Such components may include various user interface controls (e.g., buttons, joystick, keyboard, etc.), audio speaker, microphone or audio input, connection ports, and so forth.


One or more communication interfaces 222 are provided to facilitate communication with external, remote computing sources over various networks or with other local devices. Content (e.g., eBooks, magazines, audio books, etc.), program modules, and screen conversion events may be transferred to the eBook reader device 100 via the communication interface(s) 222. The communication interface(s) 222 support both wired and wireless connections to various types of networks, including the Internet, cellular networks, radio, WiFi networks, short range networks (e.g., Bluetooth), IR, and so forth. For example, the eBook reader device 100 may be equipped with a radio frequency transceiver to facilitate wireless communication over a wireless network. The device may further include a communication connection that facilitates communication with other devices via, for example, Bluetooth, radio frequency, or infrared connection(s). The communication connection(s) 222 are one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.


The eBook reader device 100 also includes a battery and power control unit 224. The power control unit operatively controls an amount of power, or electrical energy, consumed by the eBook reader device. Actively controlling the amount of power consumed by the reader device may achieve more efficient use of electrical energy stored by the battery. The processing unit 202 may supply computing resources to the power control unit 224, which may further include a clock/timer for accurate control of power consumed by the eBook reader device 100.


The eBook reader device 100 may have additional features or functionality. For example, the eBook reader device 100 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. The additional data storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. In some implementations, the eBook reader device 100 may also include a vibrator 226 or other output device for creating a haptic output that is detectable by a user touching the eBook reader device 100. The eBook reader device 100 may further include, in some implementations, an accelerometer 228 for detecting the orientation of the device.



FIG. 3 shows a general process 300 for converting a display of an eBook reader device from its current image to a last image that allows the eBook reader device to more readily convey an identity of a content item, such as the content item's cover. For ease of understanding, the process 300 (as well as processes 400 in FIG. 4, 500 in FIG. 5, and 600 in FIG. 6) is illustrated as a collection of blocks in a logical flow graph, which represent a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer-executable instructions that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described blocks can be combined in any order and/or in parallel to implement the process. For discussion purposes, the process 300 is described with reference to the eBook reader device 100 of FIGS. 1 and 2.


The process 300 begins during normal user interaction with the eBook reader device 100. During this time, the user is consuming content items, such as eBooks, audio books, and so forth. Accordingly, at 302, content from the content item is displayed on the eBook reader device. For example, the current page from an eBook or corresponding text from an audio book may be displayed on the eBook reader device while the user is consuming the content item.


At 304, a screen conversion event is detected. The screen conversion events may be configured in various ways. In one approach, the screen conversion event may be tied to an explicit input from the user who is attempting to “power down” the eBook reader device, as represented by sub-act 304(1). The user may actuate a power control, for instance, that directs the eBook reader device to transition to a non-active state. The last screen rendering module 110 interprets this user input as a screen conversion event. In another approach, the screen conversion event is a timeout period, and hence, the detection occurs when the last screen rendering module 110 senses that the user has ceased engaging with the eBook reader device 100 for a threshold period of time, as represented by sub-act 304(2). In yet another approach, the screen conversion event may be a scheduled event, and hence, the detection occurs when the last screen rendering module 110 detects a previously scheduled event (such as a calendar event), as represented by sub-act 304(3). In still another approach, the screen conversion event may be an event stemming from past user behavior, and hence, the detection occurs when the last screen rendering module 110, having monitored past user behavior and set events based thereon, detects that such an event has occurred, as represented by sub-act 304(4).


At 306, in response to detecting the screen conversion event, a “last” image associated with the content item or another content item is displayed on the eBook reader device. This last image is an image selected to convey a content item more readily to the user in the future. For instance, for eBooks, the last image may be a book cover, or a special composite that might include, for example, a title, author, last page read, reading statistics, and so forth. For newspapers or magazines, the last image may be a cover page, a graphical title, or any other combination of design elements that convey an identity of the newspaper or magazine. For an audio book, the last image may be that of the audio book cover or a picture of the actor who is reading the book.
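
Acts 304 and 306 can be viewed as four interchangeable trigger sources feeding one response: whichever trigger is detected, the same last-image rendering step runs. A compact sketch of that structure follows; the enum and function names are illustrative assumptions, not terms from the patent.

```python
from enum import Enum, auto

class ConversionTrigger(Enum):
    USER_POWER_DOWN = auto()   # sub-act 304(1): explicit input to enter a non-active state
    TIMEOUT = auto()           # sub-act 304(2): threshold period of user inactivity
    SCHEDULED_EVENT = auto()   # sub-act 304(3): previously scheduled (e.g., calendar) event
    BEHAVIOR_EVENT = auto()    # sub-act 304(4): event derived from observed user behavior

def on_screen_conversion(trigger: ConversionTrigger, render_last_image) -> None:
    # Act 306: regardless of which trigger was detected, render the "last" image
    # associated with the current content item or the anticipated next one.
    render_last_image()

# Usage sketch:
on_screen_conversion(ConversionTrigger.TIMEOUT, lambda: print("render cover image"))
```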


While FIG. 3 shows a general process, the following discussion provides some illustrative usage scenarios. In particular, three representative usage scenarios are described: (1) screen conversion based on time lapse; (2) screen conversion based on date/time; and (3) screen conversion based on user behavior. These scenarios are merely representative and not intended to be exhaustive. Other permutations or variations are possible. Further, for ease of discussion, each scenario is presented in the context of a user reading eBooks on the eBook reader device. However, the aspects described herein may be applied to other content items, such as audio books.


Screen Conversion Based on Time Lapse


In the first usage scenario, conversion to the last image on the eBook reader device is achieved by detecting a timeout period.



FIG. 4 shows a process 400 for changing the display of the eBook reader device to the last image based on time lapse since the user last interacted with the device. To aid understanding, the display of the eBook reader device 100 is depicted at certain times during the process 400 to illustrate one example of how the screen may be changed to present the last image.


At 402, content from an eBook is displayed on the eBook reader device. For illustration, the screen of the eBook reader device 100 is shown depicting a page from the eBook “Outliers”, by Malcolm Gladwell. This page is being shown on the device 100 at time T1. Recall from above that a “page” as described herein may be understood as a virtual frame of the content, or a visual display window presenting the content to the user. The pages presented on the eBook reader device 100 may not correspond directly to the identical hard pages in the associated physical book. Depending on display capabilities, font size, and other such parameters, any given “page” displayed on the eBook reader device 100 may contain more or less text/graphics than the corresponding hard page.


At 404, a user input to power down the device, or a threshold period of inactivity following a period in which the user was regularly interacting with the eBook reader device, is detected. This inactivity may be manifested in different ways, including by a failure to receive any input from the user (e.g., a page turn request, navigation, adding an annotation, etc.) for a period of time, by a failure to detect any movement of the device by the accelerometer 228 (if present), or in other ways.
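
Because inactivity may be signaled either by the absence of user input or by the absence of device movement, one straightforward combination is to track the most recent timestamp from either source and treat the device as inactive only when both have been quiet. A sketch under those assumptions follows; the names last_input_at, last_motion_at, and is_inactive are hypothetical.

```python
import time

INACTIVITY_THRESHOLD = 5 * 60  # seconds of combined quiet before converting the screen

class InactivityDetector:
    """Hypothetical sketch combining input events and accelerometer motion."""

    def __init__(self):
        now = time.monotonic()
        self.last_input_at = now
        self.last_motion_at = now

    def on_input(self):
        # Page turn request, navigation, adding an annotation, etc.
        self.last_input_at = time.monotonic()

    def on_motion(self):
        # Movement reported by the accelerometer 228, if present.
        self.last_motion_at = time.monotonic()

    def is_inactive(self) -> bool:
        # The device counts as inactive only if BOTH signals have been quiet long enough.
        quiet_since = max(self.last_input_at, self.last_motion_at)
        return time.monotonic() - quiet_since >= INACTIVITY_THRESHOLD
```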


At 406, in response to detection of the timeout period, the content on the screen is replaced with a cover image of the eBook. Thus, at time T2, the cover image for the eBook “Outliers” is rendered as the last image on the eBook reader device 100(T2). The cover image remains visible on the screen of the eBook reader device 100(T2) until the user once again begins interacting with the device, or until another screen conversion event occurs (e.g., calendar event or behavior-based event). Thus, the eBook reader device maintains an appearance of the book that the user is presently reading, rather than a generic electronic device.


At 408, it is determined whether the user resumes interaction with the eBook reader device. If not (i.e., the “No” branch from 408), the device remains in rest mode and the cover image persists on the screen. However, once the user resumes interaction with the device (i.e., the “Yes” branch from 408), the eBook reader device returns to the content where the user left off. Thus, at time T3, the last page the user was reading is displayed once again, at 410. As illustrated, the eBook reader device 100(T3) shows the same page of the eBook “Outliers” as was being depicted at time T1 when the user stopped reading previously.


Screen Conversion Based on Scheduled Date/Time


In another usage scenario, the eBook reader device renders the last image in response to a scheduled conversion event that is based on a date, a time-of-day, or both.



FIG. 5 shows a process 500 for changing the display of the eBook reader device to the last image in response to a scheduled screen conversion event. To aid understanding, the display of the eBook reader device 100 is depicted at two different times, T1 and T2, during the process 500 to illustrate one example of how the screen may be changed to present the last image.


At 502, content from a first eBook is displayed on the eBook reader device. In this illustration, the screen of the eBook reader device 100 depicts a page from the eBook “Outliers”. This page is being shown on the device 100 at time T1.


At 504, a scheduled event is detected during a period of user inactivity. The scheduled event may be entered by the user, or retrieved from querying a calendaring application. Further, the scheduled event may be pushed or retrieved from a remote source, such as a calendaring application on a separate computing device or from an online service available over a network. In this example, suppose the user schedules periodic times to read from the Bible at specified times of the day, and days of the week.


At 506, in response to detecting a scheduled event, the cover image of the eBook associated with the scheduled event is depicted on the screen of the eBook reader device. In our continuing example, at time T2, the cover image for the Bible is shown on the eBook reader device 100(T2). In this example, the second eBook (e.g., Bible) is different from the first eBook (e.g., Outliers) that the user was last reading. However, in situations where the user was last reading the same eBook that is being triggered by the scheduled event, the cover image of the same book will appear.


Screen Conversion Based on User Behavior


In yet another usage scenario, conversion to the last image on the eBook reader device is achieved by observing user behavior and establishing screen conversion events based on the behavior.



FIG. 6 shows a process 600 for changing the display of the eBook reader device to the last image based on observed patterns in user behavior. As above, the display of the eBook reader device 100 is depicted at different times, T1 and T2, during the process 600 to illustrate one example of how the screen may be changed to present the last image.


At 602, user behavior during interaction with the eBook reader device 100 is monitored over time. In this example, at time T1, the user happens to be reading a page from the eBook “Outliers”.


At 604, one or more screen conversion events are established based on the user behavior. For instance, the user may exhibit a pattern of reading certain genres of eBooks during different times of the day or days of the week. Such patterns may be tracked over time and statistically analyzed. As one simple approach, the device may develop a histogram that tracks the number of times a user reads particular eBooks at predefined times of day (e.g., morning, mid-day, evening). In FIG. 6, a simple histogram 605 shows the highest-occurring genre of eBook in each corresponding time slot. Thus, the Bible is the most often read eBook in the morning time slot, school texts during the mid-day time slot, and novels at night. Based on this behavior, three screen conversion events—morning, mid-day, and evening—may be established to change the image displayed on the screen of the eBook reader device.
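
The histogram 605 can be approximated with a table of counts keyed by time slot and genre; the most frequent genre in each slot then determines which cover to stage. A minimal sketch under that assumption follows (the slot boundaries and function names are illustrative, not taken from the patent).

```python
from collections import Counter, defaultdict

def time_slot(hour: int) -> str:
    # Illustrative slot boundaries; the text only names morning, mid-day, and evening.
    if 5 <= hour < 11:
        return "morning"
    if 11 <= hour < 17:
        return "mid-day"
    return "evening"

reads = defaultdict(Counter)  # slot -> counts of genres read in that slot

def record_read(genre: str, hour: int) -> None:
    reads[time_slot(hour)][genre] += 1

def dominant_genre(slot: str):
    # The highest-occurring genre in the slot, as depicted by histogram 605.
    return reads[slot].most_common(1)[0][0] if reads[slot] else None

# Usage sketch matching the example in the text:
for hour, genre in [(7, "religious text"), (7, "religious text"), (11, "school text"), (21, "novel")]:
    record_read(genre, hour)
print(dominant_genre("morning"))  # -> religious text
```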


At 606, eBooks that the user is likely to read in the various time slots are identified. For instance, the eBook reader device may identify certain school texts that the user is currently consuming and certain novels that the user has not yet completed.


At 608, the identified eBooks are associated with the screen conversion events. Thus, the school texts are associated with a screen conversion event that is set for mid-day, and the novels are associated with the screen conversion event that is set for evening.


At 610, a screen conversion event is detected. Suppose, for example, that the screen conversion event is the one set to mid-day.


At 612, in response to detecting this event, the cover image of an eBook that is associated with the screen conversion event is displayed. In this example, since the screen conversion event is at mid-day, the user is likely to read a school text based on past behavior. Thus, at time T2, the cover image of the text “Macro Economics” is rendered on the eBook reader display 100(T2).


Last Image with Progress Information


In addition to rendering a last image, the eBook reader device may further provide progress information that helps the user understand how much of the content item has been consumed, and what remains. The progress information may be in the form of a textual summary, such as how many pages are in the eBook, what page the user last read, the percentage of the book that has been read and/or unread, and so forth. Alternatively, the progress information may be a graphical element that conveys progress through the content item.



FIGS. 7-9 show three different examples of graphical elements that convey progress. In FIG. 7, the progress indicator is implemented by modifying the last image based on the amount of progress made through the content item. Here, the eBook reader device 100 shows the cover of the school text “Macro Economics” as a last image 700 on the screen 102. A progress indicator 704 visually modifies the cover image by changing a part of the cover image in proportion to the amount of the eBook the user has read. In FIG. 7, the progress indicator 704 shows the cover image changing from the bottom to the top, thereby giving the appearance of the image filling as the user reads more of the eBook. Here, the user is approximately half way through the text, as graphically represented by the modified lower half of the image. The top of the filled portion further includes a non-linear delineator 706 to provide an appearance that the cover image is being filled with a liquid (e.g., water) as progress is made. This is just one possible way to modify the cover image, and many others are possible.
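
The fill-from-the-bottom effect in FIG. 7 reduces to computing the fraction of the eBook completed and converting it to a pixel height measured from the bottom edge of the cover image. A small sketch of that arithmetic follows (the function name and parameters are illustrative, not from the patent).

```python
def fill_height(pages_read: int, total_pages: int, cover_height_px: int) -> int:
    """Return how many pixels of the cover, measured from the bottom edge, to mark as filled."""
    if total_pages <= 0:
        return 0
    fraction = min(max(pages_read / total_pages, 0.0), 1.0)  # clamp progress to [0, 1]
    return round(fraction * cover_height_px)

# Usage sketch: roughly half way through a 400-page text on an 800-pixel-tall cover image.
print(fill_height(pages_read=200, total_pages=400, cover_height_px=800))  # -> 400
```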



FIG. 8 shows another implementation in which the progress indicator is a progress bar 800 that is provided somewhere on the display 102 of the eBook reader device 100. In this example, the progress bar 800 is arranged horizontally across a lower part of the last image 802. The progress bar 800 includes a slider element 804 that grows within a hollow predefined region 806 to represent a proportion of the eBook completed. As the user reads, the slider element 804 fills more and more of the predefined region 806. Upon completion, the slider element 804 entirely fills the predefined region 806.



FIG. 9 shows yet another implementation in which the progress indicator is embodied as a pie graph 900 positioned on the display 102 of the eBook reader device 100. In this example, the progress pie graph 900 is positioned in the lower right-hand corner of the last image 902, although other locations are possible. The progress pie graph 900 includes a fill element 904 that enlarges as a leading edge 906 sweeps through a predefined circular region 908, akin to a minute arm of a clock moving over a clock face. As the user reads, the fill element 904 fills more of the predefined circular region 908. Upon completion, the fill element 904 entirely fills the predefined circular region 908.


These are just three possible examples. Other representations of progress may be implemented, and the progress indicators may be integrated with the cover image, or overlaid as a separate element.


CONCLUSION

Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claims.

Claims
  • 1. A method comprising: causing, by an electronic device, display of content from an electronic book via a display that is associated with the electronic device; detecting at least one of (i) user input to transition the electronic device to a non-active state or (ii) a period of user inactivity with the electronic device; and responsive to the detecting and by the electronic device, causing the display to transition from displaying the content from the electronic book to displaying other content that is different than the content from the electronic book.
  • 2. The method of claim 1, wherein the detecting comprises detecting a period of user inactivity with the electronic device from identifying that user input has not been received for a threshold period of time.
  • 3. The method of claim 1, wherein the detecting comprises detecting a period of user inactivity with the electronic device from identifying that movement of the electronic device has not been detected for a threshold period of time.
  • 4. The method of claim 1, further comprising: detecting occurrence of a scheduled event during a period of user inactivity with the electronic device; and responsive to detecting occurrence of the scheduled event, causing display of content representing another electronic book.
  • 5. The method of claim 1, further comprising: determining that user interaction with the electronic device has resumed; and upon determining that user interaction with the electronic device has resumed, causing display of content of the electronic book that was last displayed prior to displaying the content that is different than the content from the electronic book.
  • 6. A system comprising: one or more processors; memory accessible by the one or more processors; and one or more modules stored in the memory and executable by the one or more processors to: cause content of an electronic book to be presented via a bi-stable display; detect at least one of (i) user input that requests a transition of the system to a non-active state, (ii) occurrence of a behavior-based event learned from observing user behavior when interacting with the system or (iii) occurrence of a scheduled event from a calendar application; and responsive to the detecting, cause information to be presented via the bi-stable display.
  • 7. The system of claim 6, wherein the information identifies an author of the electronic book.
  • 8. The system of claim 6, wherein the information includes a progress indicator illustrating user progress through the electronic book.
  • 9. The system of claim 6, wherein the information includes a picture.
  • 10. The system of claim 6, wherein the information comprises a cover image of the electronic book.
  • 11. The system of claim 6, wherein the information includes at least one of information that identifies a last page that was consumed of the electronic book or a reading statistic.
  • 12. The system of claim 6, further comprising the bi-stable display.
  • 13. The system of claim 12, wherein the bi-stable display is configured to maintain presentation of the information when no power is supplied to the bi-stable display.
  • 14. One or more non-transitory computer-readable storage media having stored therein instructions, which when executed by an electronic device that is associated with an ePaper-type display, cause the electronic device to perform acts comprising: causing display of content from a content item via the ePaper-type display; detecting a screen conversion event that comprises at least one of (i) user input that requests a transition of the electronic device to a non-active state, (ii) a period of user inactivity with the electronic device, (iii) a behavior-based event or (iv) a scheduled event from a calendar; and responsive to the detecting: transitioning the electronic device from an active state to the non-active state, the non-active state comprising a low-power state in which a first amount of power is supplied to the electronic device that is less than a second amount of power that is supplied to the electronic device in the active state; and causing display of an image via the ePaper-type display, the image being different than the content from the content item.
  • 15. The one or more non-transitory computer readable storage media of claim 14, wherein the acts further comprise: monitoring user behavior with the electronic device; identifying one or more content items based at least in part on the user behavior; and associating the one or more content items with the screen conversion event; wherein the screen conversion event comprises the behavior-based event.
  • 16. The one or more non-transitory computer readable storage media of claim 15, wherein the image represents the one or more content items that are associated with the screen conversion event.
  • 17. The one or more non-transitory computer readable storage media of claim 15, wherein the monitoring comprises observing which content items are consumed at different times of the day or week.
  • 18. The one or more non-transitory computer readable storage media of claim 15, wherein the monitoring comprises deriving a pattern of user consumption of the one or more content items.
  • 19. The one or more non-transitory computer readable storage media of claim 14, wherein the screen conversion event comprises the user input that requests the transition of the electronic device to the non-active state.
  • 20. The one or more non-transitory computer readable storage media of claim 14, wherein the screen conversion event comprises the period of user inactivity with the electronic device.
RELATED APPLICATIONS

This application claims priority to and is a continuation of U.S. patent application Ser. No. 12/567,984, filed on Sep. 28, 2009, the entire contents of which are incorporated herein by reference.

20070078273 Hirota Apr 2007 A1
20070079236 Schrier et al. Apr 2007 A1
20070079383 Gopalakrishnan Apr 2007 A1
20070094285 Agichtein et al. Apr 2007 A1
20070094351 Kalish et al. Apr 2007 A1
20070105536 Tingo, Jr. May 2007 A1
20070112817 Danninger May 2007 A1
20070118533 Ramer et al. May 2007 A1
20070130109 King et al. Jun 2007 A1
20070136660 Gurcan et al. Jun 2007 A1
20070136679 Yang Jun 2007 A1
20070142934 Boercsoek et al. Jun 2007 A1
20070150456 Lian et al. Jun 2007 A1
20070162961 Tarrance et al. Jul 2007 A1
20070174545 Okada et al. Jul 2007 A1
20070185865 Budzik et al. Aug 2007 A1
20070189719 Furumachi et al. Aug 2007 A1
20070219983 Fish Sep 2007 A1
20070233562 Lidwell et al. Oct 2007 A1
20070233692 Lisa et al. Oct 2007 A1
20070234209 Williams Oct 2007 A1
20070238077 Strachar Oct 2007 A1
20070240187 Beach et al. Oct 2007 A1
20070242225 Bragg et al. Oct 2007 A1
20070250573 Rothschild Oct 2007 A1
20070282809 Hoeber et al. Dec 2007 A1
20070283173 Webb et al. Dec 2007 A1
20070288853 Neil Dec 2007 A1
20080005097 Kleewein et al. Jan 2008 A1
20080005203 Bots et al. Jan 2008 A1
20080005664 Chandra Jan 2008 A1
20080016064 Sareday et al. Jan 2008 A1
20080016164 Chandra Jan 2008 A1
20080027933 Hussam Jan 2008 A1
20080031595 Cho Feb 2008 A1
20080040233 Wildman et al. Feb 2008 A1
20080059702 Lu et al. Mar 2008 A1
20080066155 Abraham Mar 2008 A1
20080082518 Loftesness Apr 2008 A1
20080082911 Sorotokin et al. Apr 2008 A1
20080089665 Thambiratnam et al. Apr 2008 A1
20080113614 Rosenblatt May 2008 A1
20080115224 Jogand-Coulomb et al. May 2008 A1
20080120101 Johnson et al. May 2008 A1
20080120280 Iijima et al. May 2008 A1
20080133479 Zelevinsky et al. Jun 2008 A1
20080154908 Datar et al. Jun 2008 A1
20080163039 Ryan et al. Jul 2008 A1
20080164304 Narasimhan et al. Jul 2008 A1
20080165141 Christie Jul 2008 A1
20080168073 Siegel et al. Jul 2008 A1
20080208833 Basmov Aug 2008 A1
20080222552 Batarseh et al. Sep 2008 A1
20080235351 Banga et al. Sep 2008 A1
20080243788 Reztlaff et al. Oct 2008 A1
20080243814 Gurcan et al. Oct 2008 A1
20080243828 Reztlaff et al. Oct 2008 A1
20080253737 Kimura Oct 2008 A1
20080259057 Brons Oct 2008 A1
20080270930 Slosar Oct 2008 A1
20080281058 Araki Nov 2008 A1
20080293450 Ryan et al. Nov 2008 A1
20080294674 Rezlaff, II et al. Nov 2008 A1
20080295039 Nguyen et al. Nov 2008 A1
20080298083 Watson et al. Dec 2008 A1
20080301820 Stevens Dec 2008 A1
20090094528 Gray et al. Apr 2009 A1
20090094540 Gray et al. Apr 2009 A1
20090181649 Bull et al. Jul 2009 A1
20090228774 Matheny et al. Sep 2009 A1
20090231233 Liberatore Sep 2009 A1
20090241054 Hendricks Sep 2009 A1
20090263777 Kohn Oct 2009 A1
20090267909 Chen et al. Oct 2009 A1
20090296331 Choy Dec 2009 A1
20090319482 Norlander et al. Dec 2009 A1
20100023259 Krumm et al. Jan 2010 A1
20100081120 Nanjiani et al. Apr 2010 A1
20100095340 Ei et al. Apr 2010 A1
20100125876 Craner et al. May 2010 A1
20100131385 Harrang et al. May 2010 A1
20100156913 Ortega et al. Jun 2010 A1
20100164888 Okumura et al. Jul 2010 A1
20100177080 Essinger Jul 2010 A1
20100188327 Frid et al. Jul 2010 A1
20100284036 Ahn et al. Nov 2010 A1
20100328223 Mockarram-Dorri et al. Dec 2010 A1
20110050591 Kim et al. Mar 2011 A1
20110050594 Kim et al. Mar 2011 A1
20110057884 Gormish et al. Mar 2011 A1
20110069073 Unger Mar 2011 A1
20110112671 Weinstein May 2011 A1
20110191710 Jang et al. Aug 2011 A1
20110267333 Sakamoto et al. Nov 2011 A1
20110295926 Battiston et al. Dec 2011 A1
20120001923 Weinzimmer et al. Jan 2012 A1
20120016774 Dicke et al. Jan 2012 A1
20120036431 Ito et al. Feb 2012 A1
20120041941 King et al. Feb 2012 A1
20120079372 Kandekar et al. Mar 2012 A1
20120197998 Kessel et al. Aug 2012 A1
20120227001 Gupta et al. Sep 2012 A1
20130014195 Amira Jan 2013 A1
20130219320 Seet et al. Aug 2013 A1
20130246157 Puppin et al. Sep 2013 A1
Foreign Referenced Citations (47)
Number Date Country
1362682 Aug 2002 CN
1841373 Oct 2006 CN
101120358 Feb 2008 CN
1197902 Apr 2002 EP
1842150 Oct 2007 EP
6274493 Sep 1994 JP
07078139 Mar 1995 JP
09179870 Jul 1997 JP
10091640 Apr 1998 JP
11074882 Mar 1999 JP
2000501214 Feb 2000 JP
2001052016 Feb 2001 JP
2001052025 Feb 2001 JP
2001100702 Apr 2001 JP
2001195412 Jul 2001 JP
2001236358 Aug 2001 JP
2002099739 Apr 2002 JP
2002197079 Jul 2002 JP
2002259718 Sep 2002 JP
2002536736 Oct 2002 JP
2003016104 Jan 2003 JP
2003122969 Apr 2003 JP
2003513384 Apr 2003 JP
2003516585 May 2003 JP
2003517158 May 2003 JP
2003186910 Jul 2003 JP
2005056041 Mar 2005 JP
2006011694 Jan 2006 JP
2006107496 Apr 2006 JP
2006129323 May 2006 JP
2006129327 May 2006 JP
2006190114 Jul 2006 JP
2008071334 Mar 2008 JP
2008516297 May 2008 JP
2008527580 Jul 2008 JP
2008197634 Aug 2008 JP
1020020020262 Mar 2002 KR
WO9720274 Jun 1997 WO
WO9720274 Jun 1997 WO
WO0045588 Aug 2000 WO
WO0045588 Aug 2000 WO
WO0056055 Sep 2000 WO
WO0075840 Dec 2000 WO
WO0142978 Jun 2001 WO
WO0239206 May 2002 WO
WO2004055647 Jul 2004 WO
WO2006078728 Jul 2006 WO
Non-Patent Literature Citations (249)
Entry
Homer, et al., “Instant HTML”, Wrox Press, 1997, pp. 76-79.
Jones, et al., “Development of a Tactile Vest”, IEEE Computer Society, In the Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Mar. 27-28, 2004, pp. 82-89.
Translated Japanese Office Action mailed Sep. 2, 2014 for Japanese patent application No. 2011-548210, a counterpart foreign application of U.S. Appl. No. 12/360,744, 4 pages.
Translated Japanese Office Action mailed Jan. 14, 2014 for Japanese patent application No. 2011-548210, a counterpart foreign application of U.S. Pat. No. 8,378,979, 4 pages.
Translated Japanese Office Action mailed Jan. 25, 2011 for Japanese Patent Application No. 2007-552235, a counterpart foreign application of U.S. Appl. No. 11/039,645, 6 pages.
Translated Japanese Office Action mailed Oct. 12, 2012 for Japanese patent application No. 2009-544304, a counterpart foreign application of U.S. Pat. No. 7,865,817, 6 pages.
Translated Japanese Office Action mailed Oct. 25, 2011 for Japanese patent application No. 2007-552235, a counterpart foreign application of U.S. Appl. No. 11/039,645, 3 pages.
Translated Japanese Office Action mailed Nov. 12, 2013 for Japanese patent application No. 2010-501125, a counterpart foreign application of U.S. Appl. No. 11/763,369, 9 pages.
Translated Japanese Office Action mailed Dec. 16, 2014 for Japanese patent application No. 2009-544304, a counterpart foreign application of U.S. Appl. No. 11/693,677, 2 pages.
Translated Japanese Office Action mailed Dec. 17, 2012 for Japanese patent application No. 2010-509529, a counterpart foreign application of U.S. Appl. No. 11/763,374, 7 pages.
Translated Japanese Office Action mailed Dec. 7, 2012 for Japanese patent application No. 2010-501124, a counterpart foreign application of U.S. Appl. No. 11/693,682, 6 pages.
Translated Japanese Office Action mailed Apr. 12, 2013 for Japanese patent application No. 2010-501125, a counterpart foreign application of U.S. Appl. No. 11/763,369, 5 pages.
Translated Japanese Office Action mailed May 24, 2013 for Japanese patent application No. 2010-501124, a counterpart foreign application of U.S. Appl. No. 11/693,682, 7 pages.
Translated Japanese Office Action mailed May 31, 2013 for Japanese patent application No. 2010-509529, a counterpart foreign application of U.S. Appl. No. 11/763,374, 5 pages.
Translated Japanese Office Action mailed Aug. 23, 2013 for Japanese patent application No. 2009-544304, a counterpart foreign application of U.S. Pat. No. 7,865,817, 4 pages.
Translated Japanese Office Action mailed Aug. 5, 2014 for Japanese patent application No. 2010-501125, a counterpart foreign application of U.S. Appl. No. 11/763,369, 7 pages.
Translated Japanese Office Action mailed Sep. 18, 2012 for Japanese patent application No. 2007-552235, a counterpart foreign application of U.S. Pat. No. 8,131,647, 4 pages.
Translated Korean Office Action mailed Dec. 19, 2014 for Korean patent application No. 10-2009-7024280, a counterpart foreign application of U.S. Appl. No. 11/763,374, 10 pages.
Leach et al, “A Universally Unique IDentifier (UUID) URN Namespace”, Jul. 2005, IETF, retrieved on Apr. 21, 2010 at http://tools.ietf.org/pdf/rfc4122.pdf, 32 pgs.
Leutwyler, “Shape-shifting Polymer Gels”, retrieved on May 7, 2009 at <<http://www.scientificamerican.com/article.cfm?id=shape-shifting-polymer-ge&print=true>>, Scientific American, Nov. 9, 2000, 1 pg.
“Mastering to Become a True Manager, Well-selected commands for an efficient event log management, Part 1,” Windows Server World, vol. 9, No. 2, pp. 86-96, IDG Japan, Japan, Feb. 1, 2004.
Means, et al., “Evaluating Compliance with FCC Guidelines for Human Exposure to Radiofrequency Electromagnetic Fields”, OET Bulletin 65 Edition 97-01, Jun. 2001, 57 pages.
Mercier et al., “Sphere of influence Model in Information retrieval”, IEEE 2005 International Conference on Fuzzy Systems, pp. 120-125.
Nakatani, et al., “3D Form Display with Shape Memory Alloy”, In Proceedings of 13th International Conference on Artificial Reality and Teleexistence (ICAT), 2003, pp. 179-184.
Navarro, et al., “Modern Information Retrieval, Chapter 8: Indexing and Searching”, Jan. 1, 1999, Modern Information Retrieval, ACM Press, New York, pp. 191-228.
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/763,375, mailed on Jan. 19, 2010, 31 pgs.
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/763,378, mailed on Oct. 15, 2009, 31 pgs.
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/763,375, mailed on Aug. 6, 2010, 17 pgs.
Office Action for U.S. Appl. No. 12/366,941, mailed on Jan. 6, 2015, Scott Dixon, “Bundled Digital Content”, 15 pages.
Final Office Action for U.S. Appl. No. 11/537,518, mailed on Jan. 9, 2015, John Lattyak, “Acquisition of an Item Based on a Catalog Presentation of Items”, 13 pages.
Office Action for U.S. Appl. No. 12/886,877, mailed on Jan. 15, 2015, Gilles Jean Roger Belin, “Cover Display”, 45 pages.
Office action for U.S. Appl. No. 12/943,211, mailed on Feb. 6, 2013, Reztlaff, II et al., “Obtaining and Verifying Search Indices”, 9 pages.
Final Office Action for U.S. Appl. No. 11/537,484, mailed on Jan. 24, 2012, Thomas A. Ryan, “Expedited Acquisition of a Digital Item Following a Sample Presentation of the Item”, 22 pages.
Final Office Action for U.S. Appl. No. 12/414,914, mailed on Jan. 4, 2012, Agarwal et al., “Questions on Highlighted Passages”, 41 pages.
Office action for U.S. Appl. No. 12/360,089, mailed on Oct. 5, 2011, Killalea et al., “Aggregation of Highlights”, 75 pages.
Office Action for U.S. Appl. No. 13/722,961, mailed on Oct. 10, 2014, John Lattyak, “Delivery of Items for Consumption by a User Device”, 8 pages.
Final Office Action for U.S. Appl. No. 11/763,392, mailed on Oct. 14, 2011, Thomas Ryan, “Administrative Tasks in a Media Consumption System”, 38 pages.
Office action for U.S. Appl. No. 11/763,374, mailed on Oct. 16, 2012, Ryan et al., “Consumption of Items via a User Device”, 13 pages.
Office action for U.S. Appl. No. 11/763,386, mailed on Oct. 16, 2013, Ryan et al., “Handling of Subscription-Related Issues in a Media Consumption System”, 18 pages.
Office action for U.S. Appl. No. 11/763,390, mailed on Oct. 24, 2011, Bajaj et al., “Providing User-Supplied Items to a User Device”, 11 pages.
Final Office Action for U.S. Appl. No. 13/284,446, mailed on Oct. 31, 2014, Linsey R. Hansen, “Indicators for Navigating Digital Works”, 17 pages.
Office action for U.S. Appl. No. 13/083,445, mailed on Oct. 5, 2012, Siegel et al., “Method and System for Providing Annotations of a Digital Work”, 29 pages.
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/763,378, mailed on Oct. 6, 2014, Lattyak et al., “Transfer of Instructions to a User Device”, 16 pgs.
Office Action for U.S. Appl. No. 11/693,682, mailed on Oct. 7, 2014, Siegel et al., “Providing Annotations of a Digital Work”, 12 pages.
Office Action for U.S. Appl. No. 12/943,211, mailed on Oct. 8, 2013, “Obtaining and Verifying Search Indices”, 9 pages.
Final Office Action for U.S. Appl. No. 12/759,828, mailed on Nov. 10, 2011, James R. Retzalff II, “Search and Indexing on a User Device”, 16 pages.
Office action for U.S. Appl. No. 12/360,089, mailed on Nov. 23, 2012, Killalea et al., “Aggregation of Highlights”, 15 pages.
Final Office Action for U.S. Appl. No. 11/537,518, mailed on Nov. 25, 2011, John Lattyak, “Acquisition of an Item Based on a Catalog Presentation of Items,” 8 pages.
Final Office Action for U.S. Appl. No. 13/959,589, mailed on Nov. 6, 2014, Thomas A. Ryan, “Administrative Tasks in a Media Consumption System”, 29 pages.
Office action for U.S. Appl. No. 12/414,914, mailed on Aug. 4, 2011, Agarwal et al., “Questions on Highlighted Passages”, 39 pages.
Office action for U.S. Appl. No. 11/693,685, mailed on Aug. 15, 2013, Lattyak et al., “Relative Progress and Event Indicators”, 24 pages.
Office action for U.S. Appl. No. 11/763,376, mailed on Aug. 19, 2013, Kiraly et al., “Notification of a User Device to Perform an Action”, 16 pages.
Non-final Office Action for U.S. Appl. No. 11/537,484, mailed on Aug. 19, 2011, Thomas A. Ryan, “Expedited Acquisition of a Digital Item Following a Sample Presentation of the Item”, 13 pages.
Non-final Office Action for U.S. Appl. No. 11/763,363, mailed on Aug. 26, 2011, James R. Rezlaff II, “Search Results Generation and Sorting”, 10 pages.
Office action for U.S. Appl. No. 11/537,484, mailed on Aug. 27, 2013, Ryan, “Expedited Acquisition of a Digital Item Following a Sample Presentation of the Item”, 13 pages.
Office action for U.S. Appl. No. 11/763,314, mailed on Aug. 28, 2014, Griffin et al., “Display Dependent Markup Language”, 52 pages.
Non-Final Office Action for U.S. Appl. No. 11/763,358, mailed on Sep. 12, 2011, James R. Retzlaff II, “Managing Status of Search Index Generation”, 11 pages.
Office action for U.S. Appl. No. 12/414,914, mailed on Sep. 13, 2013, Agarwal et al, “Questions on Highlighted Passages”, 35 pages.
Final Office Action for U.S. Appl. No. 11/763,369, mailed on Sep. 16, 2013, James R. Reztlaff II et al., “Search of Multiple Content Sources on a User Device”, 23 pages.
Office Action for U.S. Appl. No. 13/294,803, mailed on Sep. 24, 2013, John Lattyak, “Progress Indication for a Digital Work”, 27 pages.
Office action for U.S. Appl. No. 13/083,445, mailed on Sep. 24, 2014, Siegel et al., “Method and System for Providing Annotations of a Digital Work”, 31 pages.
Final Office Action for U.S. Appl. No. 11/963,618, mailed on Sep. 26, 2011, Michael Rykov, “Dissemination of Periodical Samples”, 15 pages.
Non-Final Office Action for U.S. Appl. No. 11/763,374, mailed on Sep. 27, 2011, Thomas Ryan, “Consumption of Items via a User Device”, 17 pages.
Office action for U.S. Appl. No. 11/537,518, mailed on Sep. 4, 2014, Lattyak, “Acquisition of an Item Based on a Catalog Presentation of Items”, 10 pages.
Office action for U.S. Appl. No. 13/722,961, mailed on Sep. 5, 2013, Lattyak et al., “Delivery of Items for Consumption by a User Device”, 6 pages.
Oki et al., “The Information Bus—An Architecture for Extensible Distributed Systems”, ACM, 1993, 11 pages.
OQO “A Full PC That Fits in Your Pocket” Retrieved on Sep. 22, 2008 at <<http://www.oqo.com/support/documentation.html>>, 34 pages.
Palm Reader Handbook, Palm Inc., 2000, 56 pages.
PCT International Search Report and the Written Opinion for Application No. PCT/US08/64389, mailed on Jan. 28, 2009, 7 pgs.
International Search Report mailed Aug. 15, 2008, in International Application No. PCT/US07/89105, filed Dec. 28, 2007, 2 pages.
International Search Report mailed Aug. 15, 2008, in corresponding International Application No. PCT/US08/57829, filed Mar. 21, 2008, 1 page.
International Search Report mailed Jul. 7, 2008, in International Application No. PCT/US08/57848, filed Mar. 31, 2008, 2 pages.
International Search Report mailed Sep. 9, 2008, in International Application No. PCT/US08/64387, filed May 21, 2008, 1 page.
PCT Search Report for PCT Application No. PCT/US10/22060, mailed Mar. 8, 2010 (7 pages).
PCT International Search Report and the Written Opinion for Application No. PCT/US2006/001752, mailed on Jul. 27, 2006, 8 pgs.
“Say No to Third Voice,” Worldzone.net, 1999-2004, <http://worldzone.netiinternetipixelsnttv/index.html> [retrieved Jan. 30, 2004]. 5 pages.
“Shape Memory Polymer”, retrieved on May 7, 2009 at <<http://en.wikipedia.org/wiki/Shape_Memory_Polymer>>, Wikipedia, 8 pgs.
Sohn et al. “Development of a Standard Format for eBooks”, SAC2002, Madrid, Spain, 2002 ACM 1-58113-445-2/02/0, 6 pages.
“The Berkman Center for Internet & Society at Harvard Law School: Annotation Engine,” Harvard.Edu, 1999-2004, <http://cyber.law.harvard.edu/projects/annotate.html> [Retrieved Jan. 30, 2004]. 3 pages.
Marshall, C.C., “The Future of Annotation in a Digital (Paper) World,” Proceedings of the 35th Annual GSLIS Clinic, University of Illinois at Urbana-Champaign, Urbana, IL, Mar. 22-24, 1998, pp. 1-19.
Kumar, A., “Third Voice Trails off . . . ,” Wired News, 2004, <http://www.wired.com/news/print/0,1294,42803,00.html> [retrieved Jan. 30, 2004]. 3 pages.
“Trilogy Definition”, Merriam-Webster's Collegiate Dictionary, Tenth Edition, 1999, 2 pages.
“Universal Unique Identifier”, dated Dec. 16, 2002, The Open Group, 9 pages. Retrieved on Apr. 21, 2010 via Wayback Machine at http://web.archive.org/web/20021216070918/http://www.opengroup.org/onlinepubs/9629399/apdxa.htm.
“Web Services Architecture: W3C Working Group Note Feb. 11, 2004”, Feb. 11, 2004, W3C, 100 pages. Retrieved on Apr. 21, 2010 via Wayback Machine at http://web.archive.org/web/2004040205185/http://www.w3.org/TR/ws-arch/.
Wellman, et al., “Mechanical Design and Control of a High-Bandwidth Shape Memory Alloy Tactile Display”, Springer-Verlag, In the Proceedings of the International Symposium on Experimental Robotics, Barcelona, Spain, Jun. 1997, pp. 56-66, 12 pgs.
Yoshikawa, et al., “Vertical Drive Micro Actuator for Haptic Display Using Shape Memory Alloy Thin Film”, IEE Japan, Papers of Technical Meeting on Micromachine and Sensor System, Journal Code L2898B, vol. MSS-05, No. 21-44, 2005, pp. 103-108.
Ziviani, N. (Ed.), Baeza-Yates, R., et al.: “Modern Information Retrieval, Text Operations”, Jan. 1, 1999, Modern Information Retrieval, ACM Press, NY, pp. 163-190.
Zobel, J. et al., “Inverted Files for Text Search Engines” ACM Computing Surveys, vol. 38, No. 2, Jul. 1, 2006, pp. 1-56, NY, NY.
Barnes & Noble, “Nook User Guide”, retrieved from the internet Feb. 5, 2013, 120 pgs.
Kindle Community, Discussions—Screen Saver, retrieved from the internet on Nov. 6, 2009 at <<http://www.amazon.com/tag/kindle/forum?cdForum=Fx1D7SY3BVSESG&cdThread=Tx28QGUBE29L22J>>, 4 pages.
“Kobo Wireless eReader & Desktop Application User Guide”, Feb. 2011, 170 pgs.
Office Action for U.S. Appl. No. 12/886,877, mailed on Feb. 21, 2014, Gilles Jean Roger Belin, “Cover Display”, 36 pages.
Office action for U.S. Appl. No. 12/567,984, mailed on Mar. 15, 2013, Kim, “Last Screen Rendering for Electronic Book Reader”, 10 pages.
Office action for U.S. Appl. No. 13/070,328, mailed on Jul. 25, 2013, Rachabathuni, “Last Screen Rendering for Electronic Book Readers”, 11 pages.
Office action for U.S. Appl. No. 12/886,877, mailed on Sep. 11, 2013, Belin et al., “Cover Display”, 31 pages.
Non-Final Office Action for U.S. Appl. No. 12/567,984, mailed on Sep. 27, 2012, John T. Kim, “Last Screen Rendering for Electronic Book Reader”, 9 pages.
Office Action for U.S. Appl. No. 13/070,328, mailed on Feb. 25, 2014, Sailesh Rachabathuni, “Last Screen Rendering for Electronic Book Readers”, 11 pages.
Office action for U.S. Appl. No. 13/070,328, mailed on Aug. 12, 2014, Rachabathuni, “Last Screen Rendering for Electronic Book Readers”, 6 pages.
Office action for U.S. Appl. No. 12/886,877, mailed on Aug. 13, 2014, Belin et al., “Cover Display”, 40 pages.
U.S. Appl. No. 11/277,894, filed Mar. 29, 2006, Jateen P. Parekh, Gregg E. Zehr, and Subram Narasimhan,“Reader Device Content Indexing”.
U.S. Appl. No. 11/537,484, filed Sep. 29, 2006, Thomas Ryan, “Expedited Acquisition of a Digital Item Following a Sample Presentation of the Item.”
U.S. Appl. No. 11/537,518, filed Sep. 29, 2006, John Lattyak, “Acquisition of an Item based on a Catalog Presentation of Items.”
U.S. Appl. No. 11/693,685, filed Mar. 29, 2007, John Lattyak; John Kim; Steven Moy; Laurent An Minh Nguyen, “Relative Progress and Event Indicators.”
U.S. Appl. No. 11/763,314, filed Jun. 14, 2007, John Lattyak; Craig Griffin; Steven Weiss, “Display Dependent Markup Language.”
U.S. Appl. No. 11/763,339, filed Jun. 14, 2007, David Isbister; Marshall Williams; Nicholas Vaccaro, “Power Management Techniques for a User Device.”
U.S. Appl. No. 11/763,357, filed Jun. 14, 2007, James Reztlaff II; John Lattyak, “Obtaining and Verifying Search Indices.”
U.S. Appl. No. 11/763,363, filed Jun. 14, 2007, James Reztlaff II; Thomas Ryan, “Search Results Generation and Sorting.”
U.S. Appl. No. 11/763,375, filed Jun. 14, 2007, John Lattyak, Girish Bansil Bajaj, Kevin R. Cheung, Thomas Fruchterman, Robert L. Goodwin, Kenneth P. Kiraly, Richard Moore, Subram Narasimhan, Thomas A. Ryan, Michael V. Rykov, Jon Saxton, James C. Slezak, Beryl Tomay, Aviram Zagorie, Gregg Elliott Zehr, “Delivery of Items for Consumption by a User Device.”
U.S. Appl. No. 11/763,376, filed Jun. 14, 2007, Kenneth Kiraly; Thomas Ryan; Gregg Zehr; John Lattyak; Michael Rykov; Girish Bansilal Bajaj; James Slezak; Aviram Zagorie; Richard Moore; Kevin Cheung; Thomas Fruchterman; Robert Goodwin, “Notification of a User Device to Perform an Action.”
U.S. Appl. No. 11/763,378, filed Jun. 14, 2007, John Lattyak; Thomas Ryan; Gregg Zehr; Kenneth Kiraly; Michael Rykov; Girish Bansilal Bajaj; James Slezak; Aviram Zagorie; Richard Moore; Kevin Cheung; Thomas Fruchterman; Robert Goodwin; Xiaotian Guo, “Transfer of Instructions to a User Device.”
U.S. Appl. No. 11/763,381, filed Jun. 14, 2007, Michael Rykov; Girish Bansilal Bajaj; James Slezak; Aviram Zagorie; Richard Moore; Kevin Cheung; Thomas Fruchterman; Robert Goodwin, “Selecting and Providing Items in a Media Consumption System.”
U.S. Appl. No. 11/763,386, filed Jun. 14, 2007, Thomas Ryan; Gregg Zehr; Kenneth Kiraly; John Lattyak; Michael Rykov; Girish Bansilal Bajaj; James Slezak; Aviram Zagorie; Richard Moore; Kevin Cheung; Thomas Fruchterman; Robert Goodwin, “Handling of Subscription-Related Issues in a Media Consumption System.”
U.S. Appl. No. 11/763,390, filed Jun. 14, 2007, Girish Bansilal Bajaj; Michael Rykov; James Slezak; Aviram Zagorie; Richard Moore; Kevin Cheung; Thomas Fruchterman; Robert Goodwin, “Providing User-Supplied Items to a User Device.”
U.S. Appl. No. 11/763,392, filed Jun. 14, 2007, Thomas Ryan; Gregg Zehr; Kenneth Kiraly; John Lattyak; Subram Narasimhan; Michael Rykov; Girish Bansilal Bajaj; James Slezak; Aviram Zagorie; Richard Moore; Kevin Cheung; Thomas Fruchterman; Robert Goodwin, “Administrative Tasks in a Media Consumption System.”
U.S. Appl. No. 11/763,393, filed Jun. 14, 2007, John Lattyak; Michael Rykov; Girish Bansilal Bajaj; James Slezak; Aviram Zagorie; Richard Moore; Kevin Cheung; Thomas Fruchterman; Robert Goodwin, “Incremental Updates of Items.”
U.S. Appl. No. 11/763,395, filed Jun. 14, 2007, Thomas Ryan; Gregg Zehr; Kenneth Kiraly; John Lattyak; Michael Rykov; Girish Bansilal Bajaj; James Slezak; Aviram Zagorie; Richard Moore; Kevin Cheung; Thomas Fruchterman; Robert Goodwin; James Reztlaff II, “Providing Supplemental Information Based on Hints in a Media Consumption System.”
U.S. Appl. No. 11/963,618, filed Dec. 21, 2007, Michael Rykov; Laurent An Minh Nguyen; Steven Moy, “Dissemination of Periodical Samples.”
U.S. Appl. No. 12/333,215, filed Dec. 11, 2008, Aviram Zagorie; Craig Griffin; John Lattyak; Michael Rykov, “Device-Specific Presentation Control for Electronic Book Reader Devices.”
U.S. Appl. No. 12/351,629, filed Jan. 9, 2009, John Johnston; Weiping Dou; Steven Chase, “Antenna Placement on Portable Device.”
U.S. Appl. No. 12/351,663, filed Jan. 9, 2009, Chris Li; Steven Chase, “Surface Mount Clip for Routing and Grounding Cables.”
U.S. Appl. No. 12/360,089, filed Jan. 26, 2009, Thomas Dimson, Janna Hamaker, Eugene Kalenkovich, Tom Killalea, “Aggregation of Highlights.”
U.S. Appl. No. 12/360,744, filed Jan. 27, 2009, Rajiv Kotesh Ghanta; Marcos Frid; Joseph J. Hebenstreit; John T. Kim, “Electronic Device With Haptic Feedback.”
U.S. Appl. No. 12/366,941, filed Feb. 6, 2009, Scott Dixon; Eriel Thomas, “Bundled Digital Content.”
U.S. Appl. No. 12/414,914, filed Mar. 31, 2009, Amit Agarwal; Zaur Kambarov; Tom Killalea, “Questions on Highlighted Passages.”
U.S. Appl. No. 29/331,528, filed Jan. 27, 2009, Chris Green, “User Interface Cluster.”
“A Universally Unique IDentifier (UUID) URN Namespace”, Jul. 2005, IETF, 32 pages. Retrieved on Apr. 21, 2010 at http://tools.ietf.org/pdf/rfc4122.pdf.
“Annotation Engine,” Berkman Center for Internet & Society at Harvard Law School <http://cyber.law.harvard.edu/projects/annotate.html> [Retrieved Jan. 30, 2004], 3 pages.
“Annotator Instructions,” Berkman Center for Internet & Society at Harvard Law School <<http://cyber.law.harvard.edu/annotate/instructions.html>>, also found at <<http://cyber.law.harvard.edu/cite/instructions.html>>, [Retrieved Jan. 30, 2004], 1 page.
“Annotator Wishlist,” Berkman Center for Internet & Society at Harvard Law School <http://cyber.law.harvard.edu/cite/annotate.cgi?action=print&markup;center=;view=http%3A%2F%2Fcy . . . > [Retrieved Jan. 30, 2004]. 1 page.
BarnesandNoble.com, “Barnes and Noble Homepage”, retrieved on Aug. 2, 2011 http://web.archive.org/web/19981202183957/http://www.barnesandnoble.com/, Dec. 2, 1998, 2 pages.
Beigbeder et al., “An Information Retrieval Model Using the Fuzzy Proximity Degree of Term Occurences”, 2005 ACM Symposium on Applied Computing, pp. 1018-1022.
Bellwood, et al., “UDDI Version 2.04 API Specification UDDI Committee Specification, Jul. 19, 2002”, Oasis, 95 pages. Retrieved on Apr. 21, 2010 via Wayback Machine at http://web.archive.org/web/20050314033213/www.oasis-open.org/committees/uddi-spec/doc/tcspecs.htm.
Roscheisen, M., et al., “Beyond Browsing: Shared Comments, SOAPs, Trails, and On-Line Communities,” Computer Networks and ISDN Systems 27:739-749, 1995.
Biskup, J., et al., “Towards a Credential-Based Implementation of Compound Access Control Policies,” SACMAT '04, Proceedings of the ninth ACM symposium on Access control models and technologies, Jun. 4, 2004, NY, retrieved from the internet: http://portal.acm.org/citation.cfm?id=990036.990042 (retrieved Nov. 9, 2010), 10 pages.
Bradley, “Plastic Shape Shifter”, retrieved on May 7, 2009 at <<http://www.reactivereports.com/61/61_3.html>>, Chemistry WebMagazine, Issue No. 61, Dec. 2006, 2 pgs.
Breu, M. et al., “The Medoc Distributed Electronic Library: Accounting and Security Aspects”, Electronic Publishing, New Models and Opportunities, Proceedings of an ICCC/IFIP Conference, Apr. 14, 1997, pp. 237-249.
The Canadian Office Action mailed Dec. 15, 2014 for Canadian patent application No. 2681754, a counterpart foreign application of U.S. Appl. No. 11/763,369, 5 pages.
Canadian Office Action mailed Apr. 14, 2009 for Canadian Patent Application No. 2594573, a counterpart foreign application of U.S. Appl. No. 11/039,645, 3 pages.
The Canadian Office Action mailed May 29, 2014 for Canadian patent application No. 2684580, a counterpart foreign application of U.S. Appl. No. 11/763,374, 3 pages.
Canadian Office Action mailed Jul. 6, 2012 for Canadian patent application No. 2594573, a counterpart foreign application of U.S. Pat. No. 8,131,647, 5 pages.
The Canadian Office Action mailed Aug. 14, 2014 for Canadian patent application No. 2684955, a counterpart foreign application of U.S. Appl. No. 11/693,682, 3 pages.
Cafesoft.com, “Security Glossary”, dated Oct. 13, 2003, retrieved from the Wayback Machine on Jul. 2, 2009 at <<http://web.archive.org/web/20031013022218/http://cafesoft.com/support/security-glossary.html>>, 6 pages.
Card et al., “3Book: A 3D Electronic Smart Book”, AVI '04, May 25-28, 2004, Gallipoli, Italy, ACM 2004, pp. 303-307.
Cavanaugh, “EBooks and Accommodations”, Teaching Exceptional Children, vol. 35, No. 2, pp. 56-61, Copyright 2002 CEC.
Cavanaugh, “EBooks and Accommodations”, Teaching Exceptional Children, vol. 35, No. 2, Copyright 2002 CEC, 6 pages.
Chi et al. “eBooks with Indexes that Reorganize Conceptually”, CHI2004, Apr. 24-29, 2004, Vienna, Austria ACM 1-58113-703-6/04/0004.
Cleveland, Jr. et al., “Questions and Answers about Biological Effects and Potential Hazards of Radiofrequency Electromagnetic Fields” OET Bulletin 56, Fourth Edition, Aug. 1999, 38 pages.
Cleveland, Jr., et al, “Evaluating Compliance with FCC Guidelines for Human Exposure to Radiofrequency Electromagnetic Fields” OET Bulletin 65, Edition 97-01, Aug. 1997, 84 pages.
Translated Chinese Office Action mailed May 9, 2008 for Chinese Patent Application No. 200680002606.2, a counterpart foreign application of U.S. Appl. No. 11/039,645, 22 pages.
Translated Chinese Second Office Action mailed Jun. 5, 2009 for Chinese Patent Application No. 200680002606.2, a counterpart foreign application of U.S. Appl. No. 11/039,645, 20 pages.
Translated Chinese Third Office Action mailed Nov. 27, 2009 for Chinese Patent Application No. 200680002606.2, a counterpart foreign application of U.S. Appl. No. 11/039,645, 15 pages.
Translated Chinese Office Action mailed Feb. 25, 2014 for Chinese patent application No. 200880025056.5, a counterpart foreign application of U.S. Appl. No. 11/763,374, 13 pages.
Translated Chinese Office Action mailed Jan. 6, 2014 for Chinese patent application No. 201080006308.7, a counterpart foreign application of U.S. Pat. No. 8,378,979, 12 pages.
Translated Chinese Office Action mailed Oct. 10, 2011 for Chinese patent application No. 200880017259.X, a counterpart foreign application of U.S. Appl. No. 11/693,682, 7 pages.
Translated Chinese Office Action mailed Nov. 5, 2013 for Chinese patent application No. 200880025056.5, a counterpart foreign application of U.S. Appl. No. 11/763,374, 15 pages.
Translated Chinese Office Action mailed Dec. 13, 2012 for Chinese patent application No. 20078004873.9, a counterpart foreign application of U.S. Pat. No. 7,865,817, 4 pages.
Translated Chinese Office Action mailed Dec. 14, 2012 for Chinese patent application No. 200880017589.9, a counterpart foreign application of U.S. Appl. No. 11/763,369, 8 pages.
Translated Chinese Office Action mailed Feb. 1, 2013 for Chinese patent application No. 200880025056.5, a counterpart foreign application of U.S. Appl. No. 11/763,374, 19 pages.
Translated Chinese Office Action mailed May 17, 2012 for Chinese patent application No. 20078004873.9, a counterpart foreign application of U.S. Pat. No. 7,865,817, 5 pages.
Translated Chinese Office Action mailed May 21, 2012 for Chinese patent application No. 200880017589.9, a counterpart foreign application of U.S. Appl. No. 11/763,369, 9 pages.
Translated Chinese Office Action mailed Jun. 16, 2014 for Chinese patent application No. 200880025056.5, a counterpart foreign application of U.S. Appl. No. 11/763,374, 18 pages.
The Chinese Office Action mailed Jun. 28, 2013 for Chinese patent application No. 20078004873.9, a counterpart foreign application of U.S. Pat. No. 7,865,817, 4 pages.
The Chinese Office Action mailed Jun. 5, 2014 for Chinese patent application No. 201080006308.7, a counterpart foreign application of U.S. Pat. No. 8,378,979, 9 pages.
Translated Chinese Office Action mailed Jun. 6, 2013 for Chinese patent application No. 201080006308.7, a counterpart foreign application of U.S. Pat. No. 8,378,979, 13 pages.
Translated Chinese Office Action mailed Jul. 10, 2013 for Chinese patent application No. 200880025056.5, a counterpart foreign application of U.S. Appl. No. 11/763,374, 8 pages.
Translated Chinese Office Action mailed Jul. 14, 2011 for Chinese patent application No. 20078004873.9, a counterpart foreign application of U.S. Pat. No. 7,865,817, 6 pages.
Translated Chinese Office Action mailed Aug. 25, 2011 for Chinese patent application No. 200880024964.2, a counterpart foreign application of U.S. Appl. No. 11/763,358, 6 pages.
Translated Chinese Office Action mailed Aug. 3, 2012 for Chinese patent application No. 200880025056.5, a counterpart foreign application of U.S. Appl. No. 11/763,374, 17 pages.
Translated Chinese Office Action mailed Sep. 24, 2012 for Chinese patent application No. 200880017259.X, a counterpart foreign application of U.S. Appl. No. 11/693,682, 5 pages.
Translated Chinese Office Action mailed Sep. 26, 2011 for Chinese patent application No. 200880017589.9, a counterpart foreign application of U.S. Appl. No. 11/763,369, 9 pages.
Translated Chinese Office Action mailed Sep. 30, 2011 for Chinese patent application No. 200880025056.5, a counterpart foreign application of U.S. Appl. No. 11/763,374, 9 pages.
Davison et al., “The Use of eBooks and Interactive Multimedia, as Alternative Forms of Technical Documentation”, SIGDOC'05, Sep. 21-23, 2005, Coventry, United Kingdom, Copyright 2005 ACM 1-59593-175-9/5/0009, 8 pages.
Desmoulins et al., “Pattern-Based Annotations on E-books: From Personal to Shared Didactic Content”, Proceedings of the IEEE International Workshop on Wireless and Mobile Techniques in Education, 2002, 4 pages.
Carter, S., et al., “Digital Graffiti: Public Annotation of Multimedia Content,” Proceedings of the CHI 2004, Vienna, Austria, Apr. 24-29, 2004, pp. 1207-1210.
Elspass, et al., “Portable Haptic Interface with Active Functional Design”, In Proceedings SPIE Conference on Smart Structures and Integrated Systems, Newport Beach, California, vol. 3668, Mar. 1999, 926-932.
Extended European Search Report mailed Dec. 22, 2009, issued in corresponding European Patent Application No. EP 06 71 8773.2, filed Jan. 18, 2006. 9 pages.
The Minutes of the Oral Proceedings mailed on Nov. 27, 2014 for European patent application No. 06718773.2, a counterpart foreign application of U.S. Pat. No. 8,131,647, 13 pages.
The European Office Action mailed Sep. 23, 2014 for European patent application No. 08732668.2, a counterpart foreign application of U.S. Appl. No. 11/763,369, 7 pages.
The European Office Action mailed Nov. 27, 2014 for European patent application No. 06718773.2, a counterpart foreign application of U.S. Pat. No. 8,131,647, 31 pages.
The European Office Action mailed Dec. 12, 2009 for European Patent Application No. 06718773.2, a counterpart foreign application of U.S. Appl. No. 11/039,645, 1 page.
The European Office Action mailed Mar. 26, 2010 for European Patent Application No. 06718773.2, a counterpart foreign application of U.S. Appl. No. 11/039,645.
The European Office Action mailed Apr. 7, 2014 for European patent application No. 06718773.2, a counterpart foreign application of U.S. Pat. No. 8,131,647, 7 pages.
The European Office Action mailed Jun. 10, 2013 for European patent application No. 06718773.2, a counterpart foreign application of U.S. Appl. No. 11/693,682, 6 pages.
The European Search report mailed Dec. 22, 2009 for European Patent Application No. 06718773.2, a counterpart foreign application of U.S. Appl. No. 11/039,645, 9 pages.
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/763,375, mailed Feb. 23, 2010, 15 pages.
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/763,378, mailed on Mar. 16, 2010, 16 pgs.
Final Office Action for U.S. Appl. No. 11/763,358, mailed on Apr. 5, 2011, James R. Retzlaff II, “Managing Status of Search Index Generation”, 21 pages.
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/414,914, mailed on Jan. 4, 2012, 16 pgs.
Final Office Action for U.S. Appl. No. 12/360,089, mailed on Jan. 28, 2014, Tom Killalea, “Aggregation of Highlights”, 14 pages.
Final Office Action for U.S. Appl. No. 11/763,395, mailed on Oct. 30, 2013, Thomas A. Ryan, “Providing Supplemental Information Based on Hints in a Media Consumption System”, 14 pages.
Final Office Action for U.S. Appl. No. 12/414,914, mailed on Feb. 10, 2014, Amit D. Agarwal, “Questions on Highlighted Passages”, 40 pages.
Gladney, H. M.: “Access Control for Large Collections”, NY, vol. 15, No. 2, Apr. 1, 1997, pp. 154-194.
Goodreads.com, “About goodreads”, 2006, 2 pages.
Malloy, et al., “Google Search”, retrieved on Sep. 17, 2014 at <<http://en.wikipedia.org/w/index.php?title=Google_Search&oldid=118323867>>, Wikipedia, the free encyclopedia, Mar. 27, 2007, 6 pages.
“Haptic History—Machine Haptics (Expansion)”, retrieved on May 7, 2009 at <<http://hapticshistory.chc61.uci.cu/haptic/site/pages/Machine-Haptics-Became_5.php>> from Google's cache, text-only version as webpage appeared on Apr. 16, 2009, 8 pgs.
Henke, H., “Survey on Electronic Book Features”, Open eBook Forum, online, Mar. 20, 2002, pp. 1-14, retrieved from the internet: <http://www.openebook.org/doc_library/surveys/IDPF_eBook_Features_2002.pdf> retrieved Nov. 8, 2010, 14 pages.
Final Office Action for U.S. Appl. No. 12/886,877, mailed on Jun. 16, 2015, Gilles Jean Roger Belin, “Cover Display”, 46 pages.
Office action for U.S. Appl. No. 13/959,589, mailed on Jul. 7, 2016, Ryan et al., “Administrative Tasks in a Media Consumption System”, 34 pages.
Non-Final Office Action for U.S. Appl. No. 11/763,386, mailed on Nov. 8, 2011, Thomas Ryan, “Handling of Subscription-Related Issues in a Media Consumption System”, 10 pages.
Non-Final Office Action, dated Nov. 9, 2011, for U.S. Appl. No. 11/763,395, Thomas Ryan, “Providing Supplemental Information Based on Hints in a Media Consumption System”, 10 pages.
Office Action for U.S. Appl. No. 12/759,828, mailed on Dec. 17, 2013, James R. Retzlaff II, “Search and Indexing on a User Device”, 25 pages.
Office action for U.S. Appl. No. 11/763,357, mailed on Dec. 21, 2011, Reztlaff et al., “Obtaining and Verifying Search Indices”, 14 pages.
Final Office Action for U.S. Appl. No. 11/763,363, mailed on Dec. 23, 2011, James R. Rezlaff II et al., “Search Results Generation and Sorting”, 10 pages.
Office Action for U.S. Appl. No. 11/763,374, mailed on Dec. 24, 2013, Thomas A. Ryan, “Consumption of Items via a User Device”, 16 pages.
Non-Final Office Action for U.S. Appl. No. 11/763,369, mailed on Dec. 29, 2011, James R. Reztlaff II et al., “Search of Multiple Content Sources on a User Device”, 21 pages.
Final Office Action for U.S. Appl. No. 11/693,685, dated Dec. 8, 2011, John Lattyak et al., “Relative Progress and Event Indicators”, 23 pages.
Final Office Action for U.S. Appl. No. 11/763,374, mailed on Feb. 13, 2012, Thomas Ryan et al., “Consumption of Items via a User Device”, 14 pages.
Office action for U.S. Appl. No. 11/763,392, mailed on Feb. 14, 2013, Ryan et al., “Administrative Tasks in a Media Consumption System”, 47 pages.
Office Action for U.S. Appl. No. 11/537,518, mailed on Feb. 14, 2014, John Lattyak, “Acquisition of an Item Based on a Catalog Presentation of Items”, 14 pages.
Non-Final Office Action for U.S. Appl. No. 11/763,393, mailed on Feb. 16, 2012, John Lattyak et al., “Incremental Updates of Items”, 24 pages.
Office action for U.S. Appl. No. 13/294,803, mailed on Feb. 21, 2013, Lattyak et al., “Progress Indication for a Digital Work”, 76 pages.
Office action for U.S. Appl. No. 11/763,386, mailed on Feb. 28, 2013, Ryan et al., “Handling of Subscription-Related Issues in a Media Consumption System”, 17 pages.
Office Action for U.S. Appl. No. 11/763,314, mailed on Mar. 10, 2014, Craig S. Griffin, “Display Dependent Markup Language”, 42 pages.
Office action for U.S. Appl. No. 12/366,941, mailed on Mar. 14, 2014, Dixon et al., “Bundled Digital Content”, 13 pages.
Final Office Action for U.S. Appl. No. 11/693,685, mailed on Mar. 24, 2014, John Lattyak, “Relative Progress and Event Indicators”, 26 pages.
Office Action for U.S. Appl. No. 11/763,357, mailed on Mar. 27, 2014, James R. Retzlaff II, “Obtaining and Verifying Search Indices”, 14 pages.
Final Office Action for U.S. Appl. No. 12/360,089, mailed on Mar. 28, 2012, Tom Killalea et al., “Aggregation of Highlights”, 17 pages.
Non-Final Office Action for U.S. Appl. No. 12/366,941, mailed on Mar. 30, 2012, Scott Dixon et al., “Bundled Digital Content”, 12 pages.
Office action for U.S. Appl. No. 12/360,089, mailed on Mar. 5, 2013, Killalea et al., “Aggregation of Highlights”, 17 pages.
Office action for U.S. Appl. No. 11/763,374, mailed on Apr. 22, 2013, Ryan et al., “Consumption of Items via a User Device”, 17 pages.
Office action for U.S. Appl. No. 11/693,682, mailed on Apr. 23, 2012, Siegel et al., “Providing Annotations of a Digital Work”, 12 pages.
Office Action for U.S. Appl. No. 13/722,961, mailed on Apr. 25, 2014, John Lattyak, “Delivery of Items for Consumption by a User Device”, 4 pages.
Final Office Action for U.S. Appl. No. 11/763,386, mailed on Apr. 26, 2012, Thomas Ryan et al., “Handling of Subscription-Related Issues in a Media Consumption System”, 14 pages.
Non-Final Office Action for U.S. Appl. No. 11/537,518, mailed on Apr. 28, 2011, John Lattyak, “Acquisition of an Item Based on a Catalog Presentation of Items”, 8 pages.
Office action for U.S. Appl. No. 11/763,390, mailed on Apr. 8, 2013, Bajaj et al, “Providing User-Supplied Items to a User Device”, 7 pages.
Office action for U.S. Appl. No. 11/763,369, mailed on May 14, 2013, Reztlaff, II et al., “Search of Multiple Content Sources on a User Device”, 24 pages.
Final Office Action for U.S. Appl. No. 11/763,374, mailed on May 14, 2014, Thomas A. Ryan, “Consumption of Items via a User Device”, 21 pages.
Office action for U.S. Appl. No. 11/763,395, mailed on May 2, 2013, Ryan et al., “Providing Supplemental Information Based on Hints in a Media Consumption System”, 12 pages.
Final Office Action for U.S. Appl. No. 12/759,828, mailed on May 2, 2014, James R. Retzlaff II, “Search and Indexing on a User Device”, 27 pages.
Office action for U.S. Appl. No. 11/763,357, mailed on May 26, 2011, Reztlaff, “Obtaining and Verifying Search Indices”, 13 pages.
Non-Final Office Action for U.S. Appl. No. 13/083,445, mailed on May 4, 2012, Hilliard B. Siegel et al., “Method and System for Providing Annotations of a Digital Work”, 20 pages.
Final Office Action for U.S. Appl. No. 11/763,395, mailed May 9, 2012, Thomas Ryan et al., “Providing Supplemental Information Based on Hints in a Media Consumption System”, 12 pages.
Final Office Action for U.S. Appl. No. 11/763,314, mailed on Jun. 13, 2011, Craig S. Griffin, “Display Dependent Markup Language”, 26 pages.
Office Action for U.S. Appl. No. 13/959,589, mailed on Jun. 2, 2014, Thomas A. Ryan, “Administrative Tasks in a Media Consumption System”, 24 pages.
Office action for U.S. Appl. No. 13/284,446, mailed on Jun. 24, 2014, Hansen, “Indicators for Navigating Digital Works”, 19 pages.
Office action for U.S. Appl. No. 11/763,390, mailed on Jun. 26, 2012, Bajaj et al., “Providing User-Supplied Items to a User Device”, 7 pages.
Office action for U.S. Appl. No. 11/763,392, mailed on Jun. 27, 2012, Ryan et al., “Administrative Tasks in a Media Consumption System”, 47 pages.
Office action for U.S. Appl. No. 13/294,803, mailed on Jun. 4, 2013, Lattyak et al., “Progress Indication for a Digital Work”, 26 pages.
Office Action for U.S. Appl. No. 12/949,115, mailed on Jun. 4, 2014, Thomas A. Ryan, “Invariant Referencing in Digital Works”, 11 pages.
Non-Final Office Action for U.S. Appl. No. 12/943,211, mailed on Jun. 6, 2012, James. R. Retzlaff II et al., “Obtaining and Verifying Search Indices”, 10 pages.
Office action for U.S. Appl. No. 12/759,828, mailed on Jun. 6, 2013, Reztlaff, II et al., “Search and Indexing on a User Device”, 27 pages.
Non-Final Office Action for U.S. Appl. No. 11/763,369 mailed on Jun. 7, 2012, James R. Reztlaff II et al., “Search of Multiple Content Sources on a User Device”, 20 pages.
Non-Final Office Action for U.S. Appl. No. 11/693,682, mailed on Jun. 9, 2011, Hilliard B. Siegel, “Providing Annotations of a Digital Work”, 12 pages.
Office action for U.S. Appl. No. 12/333,215, mailed on Jul. 18, 2011, Zagorie et al., “Device-Specific Presentation Control for Electronic Book Reader Devices”, 22 pages.
Office action for U.S. Appl. No. 12/943,211, mailed on Jul. 2, 2014, Retzlaff, II et al., “Obtaining and Verifying Search Indices”, 9 pages.
Office action for U.S. Appl. No. 12/360,089, mailed on Jul. 3, 2013, Killalea et al., “Aggregation of Highlights”, 14 pages.
Non-Final Office Action for U.S. Appl. No. 11/693,685, John Lattyak, “Relative Progress and Event Indicators”, 22 pages.
Office action for U.S. Appl. No. 12/886,877, mailed on Feb. 1, 2016, Belin et al., “Cover Display”, 11 pages.
Related Publications (1)
Number Date Country
20140218286 A1 Aug 2014 US
Continuations (1)
Number Date Country
Parent 12567984 Sep 2009 US
Child 14246999 US