This disclosure relates generally to electronic devices with touch-sensitive surfaces, and in particular to cursor manipulation and content selection in an electronic document.
The use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has increased significantly in recent years. Exemplary touch-sensitive surfaces include touch pads and touch screen displays. Such surfaces are widely used to review and edit electronic documents by manipulating a cursor within the electronic document presented on a display. These electronic documents are viewed or edited within applications having viewing and editing capabilities (e.g., drawing applications, presentation applications (e.g., Apple's KEYNOTE, or Microsoft's POWERPOINT), word processing applications (e.g., Apple's PAGES or Microsoft's WORD), website creation applications, spreadsheet applications (e.g., Apple's NUMBERS or Microsoft's EXCEL)).
Conventional cursor manipulation methods that require the use of a mouse, or other peripheral input mechanism, are relatively inefficient as they require the user to move one hand away from the keyboard to the mouse in order to make the selection. Similarly, conventional user interfaces provided on touch screen displays do not provide a simple and intuitive way to manipulate the cursor for content selection and editing. As such, it is desirable to provide a more efficient means for manipulating a cursor displayed in an electronic document presented on a touch screen display.
Accordingly, there is a need for electronic devices with faster, more efficient methods and interfaces for cursor manipulation and content (e.g., text) selection. Such methods and interfaces optionally complement or replace conventional methods for cursor manipulation and content selection. Such methods and interfaces reduce the number, extent, and/or nature of the inputs from a user and produce a more efficient human-machine interface. For battery-operated devices, a more efficient input mechanism also consumes fewer computing resources, thereby increasing the battery life of the device.
In order to manipulate a cursor in a document on a conventional portable multifunction device, users often need to make finger contact with a precise location on the touch-sensitive screen, namely the location of the cursor. This is often difficult because the user's finger obscures the precise location of the cursor.
The above deficiencies and other problems associated with user interfaces for electronic devices with touch-sensitive surfaces are addressed by the devices and methods described herein. The methods described herein allow a user to manipulate a cursor and perform editing functions, such as text selection and moving a selection, from any part of the touch-sensitive surface. For example, a two-finger touch input can be detected anywhere on the touch screen, including over the keyboard. Users can greatly benefit from this for at least two reasons. First, multi-finger gesture detection can be distinguished from single-finger detection (e.g., single-finger taps), so that a single-finger contact or gesture at the keyboard can be reserved for other functions, e.g., text entry. Second, it allows for more efficient user input because the user does not have to consider where the cursor gestures are being made (e.g., the user need not be concerned about inadvertent key activation) and can instead focus on the movement of the cursor.
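For illustration only, the following is a minimal Swift sketch of how a two-finger contact might be distinguished from a single-finger contact; the Touch type and the 0.1-second simultaneity window are assumptions for the sketch, not part of the disclosure:

```swift
import Foundation

// Sketch only: classify concurrent touches so that a single-finger contact
// over the keyboard can still be routed to text entry. The Touch type and
// the 0.1 s simultaneity window are assumptions, not from the disclosure.
struct Touch {
    let id: Int
    let location: CGPoint
    let timestamp: TimeInterval
}

enum TapKind { case singleFinger, twoFinger }

func classifyTap(_ touches: [Touch],
                 simultaneityWindow: TimeInterval = 0.1) -> TapKind? {
    switch touches.count {
    case 1:
        return .singleFinger
    case 2:
        // "Substantially simultaneous": both touch-downs within the window.
        let dt = abs(touches[0].timestamp - touches[1].timestamp)
        return dt <= simultaneityWindow ? .twoFinger : nil
    default:
        return nil
    }
}
```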
In some embodiments, the device is portable (e.g., a notebook computer, tablet computer, or handheld device), while in other embodiments, the device is a desktop computer. The device has a touch-sensitive display (also known as a “touch screen” or “touch screen display”), and in some embodiments it also includes a touchpad.
In some embodiments, the device has a graphical user interface (GUI), one or more processors, memory, and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through touch inputs, including finger contacts and gestures, on the touch-sensitive surface. In some embodiments, the functions optionally include image reviewing, editing, drawing, presenting, word processing, website creation, disk authoring, spreadsheet creation or editing, playing games, using a telephone on the device, video-conferencing, e-mailing, instant messaging, digital photography or videography, web browsing, playing digital music or video, note taking, or the like. Executable instructions for performing these functions are, optionally, included in a non-transitory computer readable storage medium or other computer program product configured for execution by one or more processors.
In accordance with some embodiments, a method of cursor manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying content of an electronic document on the display; displaying a cursor within the electronic document; detecting two substantially simultaneous touch inputs anywhere on the touch screen display; and in response to detecting the two substantially simultaneous touch inputs: selecting a portion of the content in the document closest to the cursor; and displaying the portion of the content as selected content.
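As a non-limiting illustration, the following Swift sketch selects the word closest to a cursor position in plain text, one plausible reading of "the portion of the content in the document closest to the cursor"; the alphanumeric word scan is an assumed simplification (a shipping implementation would use a locale-aware tokenizer):

```swift
import Foundation

// Sketch only: find the word whose range is closest to the cursor in plain
// text. The alphanumeric scan is an assumed simplification.
func wordRangeClosestToCursor(in text: String,
                              cursor: String.Index) -> Range<String.Index>? {
    var best: (range: Range<String.Index>, distance: Int)?
    var i = text.startIndex
    while i < text.endIndex {
        // Skip characters that cannot start a word.
        guard text[i].isLetter || text[i].isNumber else {
            i = text.index(after: i)
            continue
        }
        // Scan to the end of the current word.
        var j = i
        while j < text.endIndex, text[j].isLetter || text[j].isNumber {
            j = text.index(after: j)
        }
        // Distance is zero when the cursor falls inside the word.
        let distance: Int
        if cursor >= i && cursor <= j {
            distance = 0
        } else if cursor < i {
            distance = text.distance(from: cursor, to: i)
        } else {
            distance = text.distance(from: j, to: cursor)
        }
        if best == nil || distance < best!.distance {
            best = (i..<j, distance)
        }
        i = j
    }
    return best?.range
}
```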
In accordance with some embodiments, the method of cursor manipulation further includes: while the portion of the content is displayed as selected content, detecting an additional two substantially simultaneous touch inputs anywhere on the touch screen display; in response to detecting the additional two substantially simultaneous touch inputs, selecting a first expanded portion of the content that includes the portion of the content; and displaying the first expanded portion as selected content.
In accordance with some embodiments, the method of cursor manipulation further includes: while the first expanded portion of the content is displayed as selected content, detecting an additional two substantially simultaneous touch inputs on the touch screen display; in response to detecting the additional two substantially simultaneous touch inputs, selecting a second expanded portion of the content that includes the first expanded portion of the content; and displaying the second expanded portion as selected content.
In accordance with some embodiments, the method of cursor manipulation further includes: while the second expanded portion of the content is displayed as selected content, detecting an additional two substantially simultaneous touch inputs on the touch screen display; in response to detecting the additional two substantially simultaneous touch inputs, selecting a third expanded portion of the content that includes the second expanded portion of the content; and displaying the third expanded portion as selected content.
In accordance with some embodiments, the content includes text, the portion of the content is a word located closest to the cursor in the text, the first expanded portion is a sentence, the second expanded portion is a paragraph, and the third expanded portion is a page.
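A minimal sketch of the expansion ladder just described, assuming the word → sentence → paragraph → page ordering; the enum is illustrative:

```swift
// Sketch only: each additional two-finger tap widens the selection to the
// next enclosing unit of the text.
enum SelectionUnit: Int { case word, sentence, paragraph, page }

func nextSelectionUnit(after current: SelectionUnit) -> SelectionUnit {
    SelectionUnit(rawValue: current.rawValue + 1) ?? .page
}
```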
In accordance with some embodiments, the content includes text and the portion of the content is a word located closest to the cursor.
In accordance with some embodiments, the method of cursor manipulation further includes: detecting lift-off of the two substantially simultaneous touch inputs from the touch screen display, followed by an additional two substantially simultaneous touch inputs anywhere on the touch screen display; detecting a continuous touch gesture from locations of the additional two substantially simultaneous touch inputs on the touch screen display to additional locations on the touch screen display; and in response to detecting the continuous touch gesture, expanding the selected content to include additional content beyond the portion in a direction towards the additional locations.
In accordance with some embodiments, the method of cursor manipulation further includes: detecting a first touch input on the touch screen display at a first location within the expanded selected content; detecting, substantially simultaneous with the detection of the first touch input, a second touch input at a second location within the expanded selected content; and in response to detecting the first and second touch inputs, expanding the selected content based on the first location and the second location.
In accordance with some embodiments, the method further includes: upon determining that the two substantially simultaneous touch inputs remain in contact with the touch screen display, detecting a continuous touch gesture at least partially across the touch screen display from locations of the two substantially simultaneous touch inputs; in response to the gesture, expanding the selection of the content beyond the portion of the content in a direction of the gesture.
In accordance with some embodiments, the cursor is an insertion point.
In accordance with some embodiments, the method of cursor manipulation further includes: detecting an additional two substantially simultaneous touch inputs on the touch screen display at respective first and second locations within boundaries of the selected content; upon determining that the additional two substantially simultaneous touch inputs remain in contact with the touch screen display, detecting a continuous touch gesture across the touch screen display; and in response to detecting a lift-off of the continuous touch gesture, moving the selected content to a different location.
In accordance with some embodiments, the method of cursor manipulation further includes: prior to moving the selected content to the different location, displaying a ghost cursor offset from the selected content, wherein the ghost cursor moves with the continuous touch gesture, and wherein the different location is the location of the ghost cursor at the time of the lift-off.
In accordance with some embodiments, a method of cursor manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying content of an electronic document on the touch screen display; displaying a cursor at a current location within the content on the touch screen display; detecting two substantially simultaneous touch inputs at a first region on the touch screen display; upon determining that the two substantially simultaneous touch inputs remain in contact with the touch screen display, detecting a continuous touch gesture from a location of the two substantially simultaneous touch inputs across the touch screen display from the first region to a second region; and in response to detecting the continuous touch gesture: moving the cursor from the current location to a new location in a direction of the continuous touch gesture.
In accordance with some embodiments, a distance between the first region and the second region is substantially the same as a distance between the current location and the new location.
In accordance with some embodiments, the method further includes: while moving the cursor, displaying a ghost cursor offset from the cursor; and upon detecting a termination of the continuous touch gesture, placing the cursor at the location of the ghost cursor and ceasing the display of the ghost cursor.
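For illustration, a minimal Swift sketch of the ghost-cursor behavior, assuming a fixed visual offset and a gesture reported as position deltas; all names and the offset value are assumptions:

```swift
import Foundation

// Sketch only: while the two-finger drag is in flight, a ghost cursor offset
// from the real cursor tracks the gesture; on termination the real cursor is
// placed at the ghost cursor's location and the ghost is no longer displayed.
struct CursorState {
    var cursor: CGPoint
    var ghost: CGPoint?  // displayed only while a drag is in progress
}

let ghostOffset = CGPoint(x: 0, y: -24)  // assumed offset above the contact

func dragBegan(_ state: inout CursorState) {
    state.ghost = CGPoint(x: state.cursor.x + ghostOffset.x,
                          y: state.cursor.y + ghostOffset.y)
}

func dragChanged(_ state: inout CursorState, delta: CGPoint) {
    guard var g = state.ghost else { return }
    g.x += delta.x
    g.y += delta.y
    state.ghost = g
}

func dragEnded(_ state: inout CursorState) {
    guard let g = state.ghost else { return }
    state.cursor = g   // place the cursor at the ghost cursor's location
    state.ghost = nil  // cease displaying the ghost cursor
}
```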
In accordance with some embodiments, displaying the cursor at a current location includes displaying a selection of a portion of the content at the current location, moving the cursor from the current location to a new location in a direction of the continuous touch gesture includes: dismissing the selection and moving the cursor from the current location to the new location.
In accordance with some embodiments, a method of cursor manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying text on the display; displaying a cursor at a line within the text; detecting a two-finger swipe gesture on the touch screen display in a direction at least partially parallel to the line and towards an edge of the touch screen display; and in response to detecting the two-finger swipe gesture, moving the cursor to a distal point of the text (e.g., an end or beginning of a line, or a top or bottom of a page or document).
In accordance with some embodiments, the detecting requires that the two-finger swipe gesture be performed at a speed higher than a predetermined speed.
In accordance with some embodiments, the distal point of the text is at a location in the direction of the gesture (e.g., an end or beginning of a line, or a top or bottom of a page or document).
In accordance with some embodiments, moving the cursor to the distal point of the text includes moving the cursor to a beginning or an end of the line of the text or a beginning or an end of the text (e.g., the top or bottom of a document or page) in accordance with the direction of the two-finger swipe.
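A hedged Swift sketch of the two-finger swipe handling described above: the dominant axis of the swipe velocity picks the distal point, and a minimum speed gates the gesture; the 300 pt/s threshold and the Destination cases are assumptions:

```swift
import Foundation

// Sketch only: a fast two-finger swipe jumps the cursor to a distal point of
// the text in the swipe direction.
enum Destination { case lineStart, lineEnd, textStart, textEnd }

func distalDestination(velocity: CGPoint,
                       minimumSpeed: CGFloat = 300) -> Destination? {
    let speed = (velocity.x * velocity.x + velocity.y * velocity.y).squareRoot()
    guard speed >= minimumSpeed else { return nil }  // too slow: not a swipe-to-edge
    if abs(velocity.x) >= abs(velocity.y) {
        // Mostly horizontal: beginning or end of the current line.
        return velocity.x < 0 ? .lineStart : .lineEnd
    } else {
        // Mostly vertical: top or bottom of the document.
        return velocity.y < 0 ? .textStart : .textEnd
    }
}
```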
In accordance with some embodiments, a method of cursor manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying content of an electronic document on the display; displaying a cursor within the electronic document; detecting a touch input on the touch screen display, wherein the touch input is located on a word within the content; and in response to detecting the touch input: selecting the word; and displaying a command display area adjacent to the selected word, wherein the command display area includes an icon for cutting the selected word, an icon for copying the selected word, and an icon for pasting previously selected content.
In accordance with some embodiments, a method of selection manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying content of an electronic document on the display; displaying a selection of the content within the electronic document; detecting a single touch input on the touch screen display at a location over the selection; in response to detecting the single touch input at the location, displaying a set of options related to the selection; determining if the single touch input remains at the location for a predetermined amount of time followed by a continuous touch gesture away from the location on the touch screen display; and in response to detecting the single touch input remaining at the location for the predetermined amount of time followed by the continuous touch gesture away from the location, moving the selection to a different location in a direction of the continuous touch gesture.
In accordance with some embodiments, a method of selection manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying content of an electronic document on the display; displaying a selection of the content within the electronic document; detecting three substantially simultaneous touch inputs at locations anywhere on the touch screen display; determining if the three substantially simultaneous touch inputs are followed by three continuous touch gestures away from the locations on the touch screen display; and in response to detecting the three continuous touch gestures, moving the selection to a different location in a direction of the continuous touch gestures.
In accordance with some embodiments, a method of cursor manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying content of an electronic document on the touch screen display, the content includes at least one line of text comprising at least two words; detecting a touch input on the content; and in response to detecting the touch input: determining a distance of the touch input to a closest space between the two words within the electronic document; and in accordance with a determination that the distance is greater than a predetermined threshold distance, selecting a word within the electronic document closest to the touch input and displaying an indication of the selection.
In accordance with some embodiments, the method further places a cursor in the closest space after the preceding word when the distance is less than the predetermined threshold distance.
In accordance with some embodiments, the predetermined threshold distance is calculated as a percentage of the size of the word closest to the touch input.
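For illustration, a minimal Swift sketch of the tap disambiguation above, assuming the threshold is a fixed fraction of the closest word's width; the 25% figure is an assumption (the disclosure leaves the percentage unspecified):

```swift
import Foundation

// Sketch only: a tap near the space between two words places a cursor in
// that space; a tap farther from the space selects the nearest word.
enum TapResult {
    case placeCursorAfterPrecedingWord
    case selectClosestWord
}

func resolveTap(distanceToClosestSpace: CGFloat,
                closestWordWidth: CGFloat,
                thresholdFraction: CGFloat = 0.25) -> TapResult {
    let threshold = closestWordWidth * thresholdFraction
    // Per the embodiments above, a distance exactly equal to the threshold
    // may be treated either way; this sketch selects the word.
    return distanceToClosestSpace < threshold
        ? .placeCursorAfterPrecedingWord
        : .selectClosestWord
}
```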
In accordance with some embodiments, the method further includes: while displaying the selection, detecting an additional touch input on the touch screen display at a location within the selection; and in response to detecting the additional touch input, dismissing the selection and placing a cursor adjacent to a word within the electronic document based on the location of the additional touch input relative to the space closest to that word.
In accordance with some embodiments, the method further includes: while displaying the selection, detecting an additional touch input on the touch screen display at a location within the selection; detecting, without breaking contact with the touch screen display following the additional touch input, a continuous touch gesture from a first location of the additional touch input on the touch screen display to a second location on the touch screen display; and in response to detecting the continuous touch gesture, expanding the selection to include additional content beyond the word in a direction towards the second location.
In accordance with some embodiments, the method further includes: detecting a double-tap touch input; in response to detecting the double-tap touch input, selecting a word closest to the double-tap touch input; and displaying an indication of the selection.
In accordance with some embodiments, the method further includes: detecting a double-tap touch input at a space between two words within the electronic document; in response to detecting the double-tap touch input, selecting the word closest to the double-tap touch input that follows the space; and displaying an indication of the selection.
In accordance with some embodiments, the method further includes: detecting a triple-tap touch input; in response to detecting the triple-tap touch input, selecting a sentence closest to the triple-tap touch input; and displaying an indication of the selection of the sentence.
In accordance with some embodiments, the method further includes: detecting a quadruple-tap touch input; in response to detecting the quadruple-tap touch input, selecting a paragraph closest to the quadruple-tap touch input; and displaying an indication of the selection of the paragraph.
In accordance with some embodiments, the method further includes: detecting a quintuple-tap touch input; in response to detecting the quintuple-tap touch input, selecting the content; and displaying an indication of the selection of the content.
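A minimal sketch of the multi-tap ladder above (double-tap through quintuple-tap); the mapping function is illustrative:

```swift
// Sketch only: double-tap selects a word, triple-tap a sentence,
// quadruple-tap a paragraph, and quintuple-tap the entire content.
enum TapSelection { case word, sentence, paragraph, allContent }

func selection(forTapCount taps: Int) -> TapSelection? {
    switch taps {
    case 2: return .word
    case 3: return .sentence
    case 4: return .paragraph
    case 5: return .allContent
    default: return nil
    }
}
```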
In accordance with some embodiments, selecting the word within the electronic document closest to the touch input is performed when the distance is equal to the predetermined threshold distance.
In accordance with some embodiments, placing the cursor in the closest space after the preceding word is performed when the distance is equal to the predetermined threshold distance.
In accordance with some embodiments, a method of keyboard display is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying content of an electronic document on the touch screen display; displaying a soft keyboard on the touch screen display; detecting two substantially simultaneous touch inputs on the soft keyboard; and in response to detecting the two substantially simultaneous touch inputs on the soft keyboard, displaying a blurred soft keyboard.
In accordance with some embodiments, the method further includes: detecting, without breaking contact with the touch screen display following the two substantially simultaneous touch inputs, movements of the two substantially simultaneous touch inputs from the soft keyboard to the content; and in response to detecting the movements of the two substantially simultaneous touch inputs to the content, replacing display of the blurred soft keyboard with display of the soft keyboard.
In accordance with some embodiments, displaying the blurred soft keyboard includes changing one or more of: color, hue, saturation, brightness, and contrast of the soft keyboard based on the content of the electronic document.
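As a non-limiting illustration, a UIKit-based Swift sketch of one way to blur a soft keyboard view; UIVisualEffectView is standard UIKit, but its use here is an assumption rather than the disclosed implementation:

```swift
import UIKit

// Sketch only: overlay a blur on a keyboard view when the two-finger touch
// is detected, and remove it when the touch moves onto the content.
func setKeyboardBlurred(_ keyboardView: UIView, blurred: Bool) {
    let overlayTag = 0xB10B  // illustrative tag used to find our overlay
    if blurred, keyboardView.viewWithTag(overlayTag) == nil {
        let blur = UIVisualEffectView(effect: UIBlurEffect(style: .regular))
        blur.frame = keyboardView.bounds
        blur.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        blur.tag = overlayTag
        keyboardView.addSubview(blur)
    } else if !blurred {
        keyboardView.viewWithTag(overlayTag)?.removeFromSuperview()
    }
}
```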
In accordance with some embodiments, a method of content selection is performed at a portable multifunction device including one or more processors, memory, and a touch-sensitive display. The method includes: concurrently displaying an onscreen keyboard and a content presentation region on the touch-sensitive display, wherein the content presentation region displays text input received from the onscreen keyboard; detecting a touch input on the onscreen keyboard displayed on the touch-sensitive display; in response to detecting the touch input on the onscreen keyboard displayed on the touch-sensitive display, determining whether the touch input satisfies one or more criteria for entering a text selection mode; and in accordance with a determination that the touch input satisfies the one or more criteria for entering the text selection mode: concurrently displaying, in the content presentation region, a first cursor at a first location and a second cursor at a second location that is different from the first location.
In accordance with some embodiments, the one or more criteria for entering the text selection mode include the touch input including a two-finger drag gesture over the onscreen keyboard.
In accordance with some embodiments, the device has one or more sensors to detect intensity of contacts with the touch-sensitive display, the touch input on the touch-sensitive display includes an input by a contact on the onscreen keyboard, and the one or more criteria for entering the text selection mode include the contact on the onscreen keyboard having an intensity that exceeds a predetermined deep press intensity threshold.
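For illustration, a Swift sketch combining the two mode-entry criteria above (a two-finger drag over the keyboard, or a deep press on the keyboard); the normalized intensity scale and the 0.75 threshold are assumptions:

```swift
import Foundation

// Sketch only: either criterion suffices to enter the text selection mode.
// Intensity is assumed normalized to 0...1; the threshold is assumed.
func shouldEnterTextSelectionMode(touchCount: Int,
                                  isDragOverKeyboard: Bool,
                                  contactIntensity: CGFloat,
                                  deepPressThreshold: CGFloat = 0.75) -> Bool {
    let twoFingerDrag = touchCount == 2 && isDragOverKeyboard
    let deepPress = contactIntensity > deepPressThreshold
    return twoFingerDrag || deepPress
}
```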
In accordance with some embodiments, the method further includes: in accordance with the determination that the touch input satisfies the one or more criteria for entering the text selection mode: visually obscuring keys on the onscreen keyboard.
In accordance with some embodiments, visually obscuring the keys on the onscreen keyboard includes applying a blurring effect to the onscreen keyboard.
In accordance with some embodiments, visually obscuring the keys on the onscreen keyboard includes transforming the onscreen keyboard into an onscreen touchpad.
In accordance with some embodiments, visually obscuring the keys on the onscreen keyboard includes making the onscreen keyboard semitransparent to partially reveal content lying underneath the onscreen keyboard.
In accordance with some embodiments, the second location is based on a location of an initial contact in the touch input; and the first location is a permitted insertion position in the content presentation region that is based on the second location.
In accordance with some embodiments, the first location is an insertion position at which the first cursor is located when the touch input is determined to satisfy the one or more criteria for entering the text selection mode; and the second location is displaced from the first location by a predetermined offset.
In accordance with some embodiments, one of the first and second cursors is already displayed in the content presentation region before both of the first and second cursors are concurrently displayed in the content presentation region.
In accordance with some embodiments, the method further includes: detecting movement of one or more contacts of the touch input; and moving the second cursor within the content presentation region in accordance with the movement of the one or more contacts of the touch input.
In accordance with some embodiments, the method further includes: moving the first cursor based on the movement of the second cursor, wherein movement of the first cursor includes discrete movements between permitted insertion positions in the content presentation region.
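A minimal sketch of the two-cursor behavior above, reduced to one dimension for clarity: the second (floating) cursor tracks the contact continuously while the first cursor snaps to the nearest permitted insertion position; the list of permitted positions is illustrative:

```swift
import Foundation

// Sketch only: the first cursor moves in discrete steps by snapping to the
// permitted insertion position nearest the floating (second) cursor.
func snappedInsertionPosition(forFloatingCursorAt x: CGFloat,
                              permittedPositions: [CGFloat]) -> CGFloat? {
    permittedPositions.min(by: { abs($0 - x) < abs($1 - x) })
}
```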
In accordance with some embodiments, the method further includes: detecting a lift-off of the touch input after detecting the movement of the one or more contacts of the touch input; and in response to detecting the lift-off of the touch input: ceasing to display the second cursor.
In accordance with some embodiments, the method further includes: in response to detecting the lift-off of the touch input, maintaining display of the first cursor at a respective permitted insertion position reached by the first cursor after the discrete movements of the first cursor.
In accordance with some embodiments, the method further includes: in response to detecting the lift-off of the touch input, ceasing to display the first cursor.
In accordance with some embodiments, the onscreen keyboard is obscured in accordance with the determination that the touch input satisfies the one or more criteria for entering the text selection mode, and the method further includes: in response to detecting the lift-off of the touch input, restoring display of the onscreen keyboard.
In accordance with some embodiments, the device has one or more sensors to detect intensity of contacts with the touch-sensitive display, and the method further includes: in the text selection mode, detecting that an intensity of a contact in the touch input exceeds a predetermined intensity threshold; after detecting that the intensity of the contact in the touch input exceeds the predetermined intensity threshold, detecting movement of the contact in the touch input; in response to detecting the movement of the contact in the touch input, after detecting that the intensity of the contact in the touch input exceeds the predetermined intensity threshold: selecting a portion of the text input in accordance with the movement of the contact in the touch input.
In accordance with some embodiments, the selected portion of the text input begins at a position of the first cursor when the detected intensity of the contact in the touch input exceeded the predetermined intensity threshold.
In accordance with some embodiments, the method further includes: detecting lift-off of the contact in the touch input after selecting the portion of the text input in accordance with the movement of the contact in the touch input; and, in response to detecting the lift-off of the contact in the touch input, confirming selection of the portion of the text input.
In accordance with some embodiments, the method further includes: after selecting the portion of the text input, while the portion of the text input is selected, detecting an intensity of the contact in the touch input that exceeds the predetermined threshold; and, in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold while the portion of the text input is selected, clearing selection of the portion of the text input.
In accordance with some embodiments, the method further includes: after selecting the portion of the text input, while the portion of the text input is selected, detecting an intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by lift-off of the contact without further movement of the contact; and, in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by lift-off of the contact without further movement of the contact, confirming selection of the portion of the text input.
In accordance with some embodiments, the method further includes: after selecting the portion of the text input, while the portion of the text input is selected, detecting an intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by further movement of the contact; and, in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by the further movement of the contact: clearing selection of the portion of the text input.
In accordance with some embodiments, the method further includes: in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by the further movement of the contact: starting selection of a new portion of the text input in accordance with the further movement of the contact.
In accordance with some embodiments, the method further includes: in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by the further movement of the contact: further moving the second cursor and the first cursor within the content presentation region in accordance with the further movement of the contact.
In accordance with some embodiments, the device has one or more sensors to detect intensity of contacts with the touch-sensitive display, and the method further includes: in the text selection mode, detecting a first local intensity peak in the touch input followed by a second local intensity peak in the touch input that both exceed a predetermined intensity threshold; and, in response to detecting the first local intensity peak followed by the second local intensity peak that both exceed the predetermined intensity threshold, selecting a first predetermined unit of the text input according to a current location of the first cursor.
In accordance with some embodiments, the method further includes: after detecting the first local intensity peak followed by the second local intensity peak, detecting a third consecutive local intensity peak in the touch input that exceeds the predetermined intensity threshold; and in response to detecting the three consecutive local intensity peaks in the touch input that all exceed the predetermined intensity threshold, selecting a second predetermined unit of the text input that is larger than and includes the first predetermined unit of the text input.
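For illustration, a Swift sketch of counting local intensity peaks in a sampled intensity trace; neighbor-comparison peak detection is an assumed simplification:

```swift
import Foundation

// Sketch only: count local maxima above the threshold in a sampled trace.
// Two peaks select a first predetermined unit (e.g., a word) at the cursor;
// three peaks select a larger unit (e.g., the enclosing sentence).
func countIntensityPeaks(_ samples: [CGFloat], threshold: CGFloat) -> Int {
    var peaks = 0
    for i in 1..<max(samples.count - 1, 1) {
        let isLocalMax = samples[i] > samples[i - 1] && samples[i] >= samples[i + 1]
        if isLocalMax && samples[i] > threshold { peaks += 1 }
    }
    return peaks
}
```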
In accordance with some embodiments, a method of content selection is performed at a portable multifunction device including one or more processors, memory, a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The method includes: while a contact is detected on the touch-sensitive surface, concurrently displaying on the display content and a text selection indicator at a first location within the content; detecting a first press input by the contact followed by movement of the contact across the touch-sensitive surface that corresponds to movement of at least a portion of the text selection indicator from the first location to a second location on the display; in response to detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface, selecting content between the first location and the second location; while the content between the first location and the second location is selected, detecting a second press input by the contact on the touch-sensitive surface; in response to detecting the second press input by the contact on the touch-sensitive surface, performing a text selection operation, associated with the content between the first location and the second location, in accordance with the second press input, wherein the first press input, the movement across the touch-sensitive surface, and the second press input are made with a single continuous contact with the touch-sensitive surface.
In accordance with some embodiments, detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface includes: detecting an increase in intensity of the contact above a predetermined intensity threshold followed by detecting a decrease in intensity of the contact to an intensity that remains above a predetermined minimum intensity value.
In accordance with some embodiments, the method further includes: in response to detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface: displaying at least the portion of the text selection indicator at the second location within the content.
In accordance with some embodiments, the text selection operation includes stopping selection of content at the second location and maintaining selection of the content between the first location and the second location.
In accordance with some embodiments, the method further includes: after detecting the second press input and while the content between the first location and the second location remains selected, detecting lift-off of the contact; and in response to detecting the lift-off of the contact, displaying an action menu for the selected content between the first location and the second location.
In accordance with some embodiments, the method further includes: after detecting the second press input by the contact on the touch-sensitive surface and stopping the selection of the content at the second location, detecting further movement of the contact; and in response to detecting the further movement of the contact, displaying at least a portion of the text selection indicator at a third location within the content.
In accordance with some embodiments, the method further includes: in response to detecting the further movement of the contact, canceling selection of content between the first location and the second location without selecting content between the second location and the third location.
In accordance with some embodiments, the text selection operation includes cancelling selection of content between the first location and the second location.
In accordance with some embodiments, the method further includes: after detecting the second press input by the contact on the touch-sensitive surface and canceling the selection of content between the first location and the second location, detecting further movement of the contact; and, in response to detecting the further movement of the contact, selecting content between the second location and a third location.
In accordance with some embodiments, the method further includes: while the content between the second location and the third location is selected, detecting lift-off of the contact; and, in response to detecting the lift-off of the contact while the content between the second location and the third location is selected, stopping selection of the content at the third location and maintaining selection of the content between the second location and the third location.
In accordance with some embodiments, the method further includes: before displaying the text selection indicator at the first location within the content, detecting an initial press input by the contact on the touch-sensitive surface; and in response to detecting the initial press input, displaying the text selection indicator at an initial location within the content that corresponds to a location of the initial press input on the touch-sensitive surface.
In accordance with some embodiments, the display is a touch-sensitive display that includes the touch-sensitive surface, and the method further includes: concurrently displaying, on the touch-sensitive display, the content and an onscreen keyboard, wherein the initial press input is detected on the onscreen keyboard.
In accordance with some embodiments, the initial press input is detected at a location on the touch-sensitive surface that corresponds to a location of the content on the display.
In accordance with some embodiments, the display is a touch-sensitive display that includes the touch-sensitive surface, and the method further includes: concurrently displaying, on the touch-sensitive display, the content and an onscreen keyboard; before displaying the text selection indicator at the first location within the content, detecting a multi-contact drag input on the onscreen keyboard; and, in response to detecting the multi-contact drag input on the onscreen keyboard, displaying the text selection indicator at an initial location within the content based on a location of the multi-contact drag input on the onscreen keyboard.
In accordance with some embodiments, the content includes editable content and the text selection indicator includes a cursor.
In accordance with some embodiments, the method further includes: displaying a magnifying loupe that displays a magnified version of the cursor and a region surrounding the cursor.
In accordance with some embodiments, selecting the content between the first location and the second location includes: moving the cursor one character space at a time in response to detecting the movement of the contact across the touch-sensitive surface; and selecting one additional character at a time in accordance with the movement of the cursor.
In accordance with some embodiments, the content includes read-only content and the text selection indicator includes a selection area; and displaying the text selection indicator at the first location includes displaying a first word located at the first location within the selection area.
In accordance with some embodiments, the method further includes: displaying a magnifying loupe that displays a magnified version of the selection area and a region surrounding the selection area.
In accordance with some embodiments, selecting the content between the first location and the second location includes: expanding the selection area one word at a time in accordance with the movement of the contact across the touch-sensitive surface; and selecting one additional word at a time in accordance with the expansion of the selection area.
In accordance with some embodiments, the method further includes: foregoing performing the text selection operation, in response to detecting the second press input, in accordance with a determination that the second press input is accompanied by simultaneous movement of the contact across the touch-sensitive surface.
In accordance with some embodiments, when the text is editable text, the text selection indicator is a cursor and selecting content between the first location and the second location includes expanding the selection character-by-character in accordance with movement of the contact on the touch-sensitive surface; and when the text is non-editable text, the text selection indicator is a selection region that initially encompasses a single word and selecting content between the first location and the second location includes expanding the selection word-by-word in accordance with movement of the contact on the touch-sensitive surface.
In accordance with some embodiments, a method of content selection is performed at a portable multifunction device including one or more processors, memory, and a touch-sensitive display. The method includes: concurrently displaying an onscreen keyboard and a content presentation region on the touch-sensitive display, wherein the content presentation region displays text input received from the onscreen keyboard; detecting a touch input on the onscreen keyboard displayed on the touch-sensitive display, wherein detecting the touch input includes detecting movement of a contact and liftoff of the contact; in response to detecting the touch input on the onscreen keyboard displayed on the touch-sensitive display: in accordance with a determination that the touch input satisfies text-selection criteria, wherein the text-selection criteria include a criterion that is met when a characteristic intensity of the contact increases above a text-selection intensity threshold, performing a text selection operation based on the movement of the contact; and in accordance with a determination that the touch input satisfies text-entry criteria, wherein the text-entry criteria include a criterion that is met when the characteristic intensity of the contact does not increase above the text-selection intensity threshold, entering text into the content presentation region based on the touch input.
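A hedged Swift sketch of the branch just described: the characteristic intensity of the keyboard contact decides between text entry and a text selection operation; the threshold value and the types are assumptions:

```swift
import Foundation

// Sketch only: a keyboard touch either enters text (characteristic intensity
// stays below the text-selection threshold) or switches to text selection
// (intensity rises above it).
enum KeyboardTouchOutcome {
    case enterText(Character)  // ordinary typing
    case selectText            // perform a selection operation based on movement
}

func resolveKeyboardTouch(maxCharacteristicIntensity: CGFloat,
                          keyUnderContact: Character,
                          textSelectionThreshold: CGFloat = 0.6) -> KeyboardTouchOutcome {
    if maxCharacteristicIntensity > textSelectionThreshold {
        return .selectText
    } else {
        return .enterText(keyUnderContact)
    }
}
```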
In accordance with some embodiments, the text-entry criteria include a criterion that is met when the liftoff of the contact is detected while the contact is at a location of a character key of the onscreen keyboard.
In accordance with some embodiments, the text-entry criteria include a criterion that is met when the contact does not move outside of the onscreen keyboard before liftoff of the contact is detected.
In accordance with some embodiments, entering the text into the content region includes entering a character that corresponds to a character key at a location at which touchdown of the contact was detected on the onscreen keyboard.
In accordance with some embodiments, entering the text into the content region includes entering a character that corresponds to a character key at a location at which liftoff of the contact was detected on the onscreen keyboard.
In accordance with some embodiments, the text-selection criteria include a criterion that is met when the contact does not move more than a threshold distance before detecting an increase in the characteristic intensity of the contact above the text-selection intensity threshold.
In accordance with some embodiments, the text-selection operation includes one of: moving a cursor within the content region or selecting text within the content region.
In accordance with some embodiments, the method further includes, in response to detecting that the text-selection criteria have been met, generating a tactile output that is indicative of an entry into a text selection mode of operation.
In accordance with some embodiments, the method further includes, in response to detecting that the text-selection criteria have been met, changing an appearance of the onscreen keyboard to indicate that the device is operating in a text selection mode of operation, wherein changing the appearance of the onscreen keyboard includes obscuring an appearance of characters on keys of the onscreen keyboard.
In accordance with some embodiments, the method further includes, ending the text selection mode of operation and, in conjunction with the end of the text selection mode of operation, reversing the change in appearance of the onscreen keyboard to reveal the characters on the keys of the onscreen keyboard.
In accordance with some embodiments, the method further includes, when the touch input satisfies the text-selection criteria, detecting movement of the contact after the touch input has satisfied the text-selection criteria and moving a cursor in the content region in accordance with the movement of the contact detected after the touch input has satisfied the text-selection criteria.
In accordance with some embodiments, the method further includes, when the touch input satisfies the text-selection criteria, detecting a first subsequent change in the characteristic intensity of the contact followed by additional movement of the contact on the touch-sensitive display; and, in response to detecting the first subsequent change in the characteristic intensity of the contact: in accordance with a determination that the touch input satisfies selection-start criteria, wherein the selection-start criteria include a criterion that is met when the characteristic intensity of the contact increases above a selection-start intensity threshold, starting to select content in the content region at a location of a cursor in accordance with the additional movement of the contact; and in accordance with a determination that the touch input does not satisfy the selection-start criteria, moving the cursor in accordance with the additional movement of the contact without starting to select content in the content region.
In accordance with some embodiments, the method further includes, when the touch input satisfies selection-start criteria, after starting to select content in the content region, detecting liftoff of the contact from the touch-sensitive display and confirming the selection in response to detecting the liftoff of the contact.
In accordance with some embodiments, the method further includes, when the touch input satisfies selection-start criteria, after starting to select content in the content region, and while continuing to detect the contact on the touch-sensitive display, detecting a second subsequent change in intensity of the contact; in response to detecting the second subsequent change in the characteristic intensity of the contact: in accordance with a determination that the second subsequent change in the characteristic intensity of the contact satisfies selection-cancellation criteria, wherein the selection-cancellation criteria include a criterion that is met when the characteristic intensity of the contact increases above a selection-cancellation intensity threshold, cancelling the selection; and in accordance with a determination that the second subsequent change in the characteristic intensity of the contact does not satisfy the selection-cancellation criteria, maintaining the selection.
In accordance with some embodiments, the selection-cancellation criteria include a criterion that is met when the contact moves no more than a threshold distance within a threshold amount of time before the characteristic intensity of the contact increases above the selection-cancellation intensity threshold.
In accordance with some embodiments, the method further includes, in response to detecting that the selection-cancellation criteria have been met, generating a tactile output that is indicative of an exit from the text selection mode of operation.
In accordance with some embodiments, the method further includes, after canceling the selection, and while continuing to detect the contact on the touch-sensitive display, detecting a third subsequent change in the characteristic intensity of the contact; and, in response to detecting the third subsequent change in the characteristic intensity of the contact: in accordance with a determination that the touch input satisfies the selection-start criteria, starting to select content in the content region at a location of the cursor; and in accordance with a determination that the touch input does not satisfy the selection-start criteria, forgoing starting to select content in the content region.
In accordance with some embodiments, starting to select content in response to detecting the third subsequent change in the characteristic intensity of the contact includes selecting a respective word at the location of the cursor.
In accordance with some embodiments, the selected respective word is a first word, and the method further includes, while the first word is selected, detecting first subsequent movement of the contact; and in response to detecting the first subsequent movement of the contact while the first word is selected: in accordance with a determination that the touch input meets selection-movement criteria which includes a movement criterion that is met when the contact moves more than a respective threshold amount, canceling selection of the first word; and selecting a second word that is adjacent to the first word in a first direction in accordance with the first subsequent movement of the contact, such that the selected respective word is the second word.
In accordance with some embodiments, the method further includes, while the respective word is selected, detecting first subsequent movement of the contact; and in response to detecting the first subsequent movement of the contact while the respective word is selected: in accordance with a determination that the touch input meets selection-expansion criteria which includes a movement criterion that is met when the contact moves more than a respective threshold amount, expanding the selection to include a word that is adjacent to the respective word in a first direction in accordance with the first subsequent movement of the contact.
In accordance with some embodiments, the method further includes, while the respective word is selected, detecting a fourth subsequent change in the characteristic intensity of the contact above a respective intensity threshold; and, in response to detecting the fourth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets the selection-cancellation criteria, which includes a criterion that is met when the amount of time between when the third subsequent change in the characteristic intensity of the contact is detected and when the fourth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, cancelling selection of the respective word.
In accordance with some embodiments, the method further includes, while the respective word is selected, detecting a fourth subsequent change in the characteristic intensity of the contact above a respective intensity threshold; and, in response to detecting the fourth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets sentence-selection criteria which include a movement criterion that is met when the contact moves less than a threshold amount within a threshold time period before the fourth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the third subsequent change in the characteristic intensity of the contact is detected and when the fourth subsequent change in the characteristic intensity of the contact is detected is less than a delay threshold, expanding the selection to include the respective sentence that contains the respective word.
In accordance with some embodiments, the method further includes, while the respective sentence is selected, detecting second subsequent movement of the contact; and in response to detecting the second subsequent movement of the contact while the respective sentence is selected: in accordance with a determination that the touch input meets selection-expansion criteria which includes a movement criterion that is met when the contact moves more than a respective threshold amount, expanding the selection to include a sentence that is adjacent to the respective sentence in a first direction in accordance with the second subsequent movement of the contact.
In accordance with some embodiments, the respective sentence is selected in response to the fourth subsequent change in the characteristic intensity of the contact and the method further includes, while the respective sentence is selected: detecting a fifth subsequent change in the characteristic intensity of the contact above the respective intensity threshold; and, in response to detecting the fifth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets the selection-cancellation criteria, which includes a criterion that is met when the amount of time between when the fourth subsequent change in the characteristic intensity of the contact is detected and when the fifth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, cancelling selection of the respective sentence.
In accordance with some embodiments, the method further includes, in response to detecting the fifth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets paragraph-selection criteria which include a movement criterion that is met when the contact moves less than the threshold amount within the threshold time period before the fifth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the fourth subsequent change in the characteristic intensity of the contact is detected and when the fifth subsequent change in the characteristic intensity of the contact is detected is less than the delay threshold, expanding the selection to include the respective paragraph that contains the respective sentence.
In accordance with some embodiments, the method further includes, while the respective paragraph is selected, detecting third subsequent movement of the contact; and in response to detecting the third subsequent movement of the contact while the respective paragraph is selected: in accordance with a determination that the touch input meets selection-expansion criteria which includes a movement criterion that is met when the contact moves more than a respective threshold amount, expanding the selection to include a paragraph that is adjacent to the respective paragraph in a first direction in accordance with the third subsequent movement of the contact.
In accordance with some embodiments, the respective paragraph is selected in response to the fifth subsequent change in the characteristic intensity of the contact and the method further includes, while the respective paragraph is selected: detecting a sixth subsequent change in the characteristic intensity of the contact above the respective intensity threshold; and, in response to detecting the sixth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets selection-cancellation criteria, which includes a criterion that is met when the amount of time between when the fifth subsequent change in the characteristic intensity of the contact is detected and when the sixth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, cancelling selection of the respective paragraph.
In accordance with some embodiments, the method further includes, in response to detecting the sixth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets document selection criteria which include a movement criterion that is met when the contact moves less than the threshold amount within the threshold time period before the sixth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the fifth subsequent change in the characteristic intensity of the contact is detected and when the sixth subsequent change in the characteristic intensity of the contact is detected is less than the delay threshold, expanding the selection to include the respective document that contains the respective paragraph.
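For illustration, a Swift sketch of the quick-successive deep-press ladder described in the preceding embodiments: each deep press arriving within the delay threshold of the previous one (with little intervening movement) widens the selection to the next enclosing unit, and a late press cancels it; the 0.5-second delay threshold is an assumption:

```swift
import Foundation

// Sketch only: state machine for the deep-press selection ladder.
enum PressSelection { case none, word, sentence, paragraph, document }

func nextState(after current: PressSelection,
               timeSincePreviousPress: TimeInterval,
               movedLessThanThreshold: Bool,
               delayThreshold: TimeInterval = 0.5) -> PressSelection {
    guard timeSincePreviousPress < delayThreshold, movedLessThanThreshold else {
        return .none  // selection-cancellation criteria met
    }
    switch current {
    case .none:      return .word
    case .word:      return .sentence
    case .sentence:  return .paragraph
    case .paragraph: return .document
    case .document:  return .document
    }
}
```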
In accordance with some embodiments, a method of content manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying content of an electronic document and a cursor within the content on the touch screen display; displaying, on the touch screen display, a soft keyboard having multiple keys each having a respective alphanumeric character of a plurality of alphanumeric characters; detecting two substantially simultaneous touch inputs on the soft keyboard; and in response to detecting the two substantially simultaneous touch inputs on the soft keyboard, changing the appearance of the soft keyboard to a changed appearance.
In accordance with some embodiments, the method of content manipulation further includes: detecting, without breaking contact with the touch screen display following the two substantially simultaneous touch inputs, a sliding gesture from a first location of the two substantially simultaneous touch inputs on the soft keyboard to a second location on the content; and in response to detecting the sliding gesture, maintaining the changed appearance of the soft keyboard.
In accordance with some embodiments, changing the appearance of the soft keyboard includes: removing the plurality of alphanumeric characters from the multiple keys, or changing one or more of: color, hue, saturation, brightness, or contrast of the soft keyboard based on the content of the electronic document.
In accordance with some embodiments, the method of content manipulation further includes: detecting a continuous movement of the two substantially simultaneous touch inputs from a first location on the soft keyboard to a second location anywhere on the touch screen display without breaking contact with the touch screen display; and expanding the selected content to include additional content beyond the portion in a direction towards the second location in response to detecting the continuous movement.
In accordance with some embodiments, expanding the selected content to include additional content beyond the portion in a direction towards the second location includes: displaying a start-point object and an end-point object at respective ends of the selection; and moving the start-point object or the end-point object in accordance with the first location and the second location. For example, the selection is expanded by dragging the start-point object or the end-point object, much as a cursor is dragged. The initial direction of the continuous movement of the two substantially simultaneous touch inputs determines which end of the selection is expanded. For example, a left/up movement of the two substantially simultaneous touch inputs drags a lollipop-shaped start-point object and expands the selection backward, while a right/down movement drags an upside-down lollipop-shaped end-point object and expands the selection forward.
In accordance with some embodiments, expanding the selected content to include additional content beyond the portion in a direction towards the second location includes: in accordance with a determination that the speed of the continuous movement exceeds a predetermined threshold, expanding the selection one word at a time.
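A minimal sketch of this speed gate follows; the drag-speed threshold of 300 points per second is an assumed value, not one taken from the disclosure.

import CoreGraphics
import Foundation

enum SelectionGranularity { case character, word }

let fastDragSpeed: CGFloat = 300  // points per second; assumed value

// Computes the drag speed between two contact samples.
func dragSpeed(from a: (point: CGPoint, time: TimeInterval),
               to b: (point: CGPoint, time: TimeInterval)) -> CGFloat {
    let dt = CGFloat(b.time - a.time)
    guard dt > 0 else { return 0 }
    let dx = b.point.x - a.point.x
    let dy = b.point.y - a.point.y
    return (dx * dx + dy * dy).squareRoot() / dt
}

// Faster continuous movement expands the selection one word at a time;
// slower movement expands it character by character for fine control.
func granularity(forDragSpeed speed: CGFloat) -> SelectionGranularity {
    return speed > fastDragSpeed ? .word : .character
}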
In accordance with some embodiments, the method of content manipulation further includes: after detecting the two substantially simultaneous touch inputs, detecting a lift-off followed by an additional two substantially simultaneous touch inputs followed by a sliding gesture of the additional two substantially simultaneous touch inputs across the touch screen display to additional locations; dismissing the selected content; selecting, as the selected content, the word closest to the cursor; and expanding the selected content to include additional content beyond the selected content in a direction towards the additional locations.
In accordance with some embodiments, the method of content manipulation further includes: detecting an additional two substantially simultaneous touch inputs on the soft keyboard; and in response to detecting the additional two substantially simultaneous touch inputs on the soft keyboard: in accordance with a determination that the selected content is a word, expanding the selected content to include a sentence containing the word.
In accordance with some embodiments, the method of content manipulation further includes: in accordance with a determination that the selected content is more than a word, displaying the cursor at the beginning of the selected content and dismissing the selected content.
In accordance with some embodiments, the method of content manipulation further includes: detecting additional two substantially simultaneous touch inputs; in response to detecting the additional two substantially simultaneous touch inputs, expanding the selected content to a sentence containing the portion; and displaying an indication of the selected content.
In accordance with some embodiments, expanding the selected content to a sentence containing the portion includes expanding the selected content to the sentence containing the portion in accordance with a determination that the duration between the two substantially simultaneous touch inputs and the additional two substantially simultaneous touch inputs is less than a predetermined threshold (e.g., 0.66 seconds).
In accordance with some embodiments, the method of content manipulation further includes: detecting a further two substantially simultaneous touch inputs; in response to detecting the further two substantially simultaneous touch inputs, expanding the selected content to a paragraph containing the portion; and displaying an indication of the selected content.
In accordance with some embodiments, expanding the selected content to a paragraph containing the portion includes: expanding the selected content to the paragraph containing the portion in accordance with a determination that the duration between the two substantially simultaneous touch inputs and the further two substantially simultaneous touch inputs is less than a predetermined threshold (e.g., 0.66 seconds).
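For illustration, the word-to-sentence-to-paragraph escalation described in the preceding paragraphs can be tracked with a small state machine, as in the following Swift sketch; only the 0.66-second window comes from the text above, and all names are hypothetical.

import Foundation

enum SelectionUnit { case word, sentence, paragraph }

// Each additional two-finger tap arriving within the 0.66 s window promotes
// the selection to the next larger unit; a late tap starts over at the word.
struct TapEscalator {
    private var lastTapTime: TimeInterval = .zero
    private(set) var unit: SelectionUnit?
    let window: TimeInterval = 0.66

    mutating func registerTwoFingerTap(at time: TimeInterval) {
        let withinWindow = (time - lastTapTime) < window
        lastTapTime = time
        switch (unit, withinWindow) {
        case (nil, _):           unit = .word       // first tap selects a word
        case (.word?, true):     unit = .sentence   // quick second tap: sentence
        case (.sentence?, true): unit = .paragraph  // quick third tap: paragraph
        default:                 unit = .word       // too slow: back to word
        }
    }
}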
In accordance with some embodiments, there is provided an electronic device that includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs; the one or more programs are stored in the memory and configured to be executed by the one or more processors, where the one or more programs include instructions for performing any of the methods described herein.
In accordance with some embodiments, there is provided a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory. The graphical user interface includes one or more of the elements displayed in any of the methods described above, which are updated in response to inputs, as described in any of the methods described herein.
In accordance with some embodiments, there is provided a non-transitory computer readable storage medium that has stored therein instructions which, when executed by an electronic device with a display and a touch-sensitive surface, cause the device to perform any of the methods described herein.
In accordance with some embodiments, an electronic device includes a display unit configured to display content of an electronic document and a cursor within the electronic document; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect two substantially simultaneous touch inputs anywhere on the touch screen display; and in response to detecting the two substantially simultaneous touch inputs: select a portion of the content in the document closest to the cursor; and display the portion of the content as selected content.
In accordance with some embodiments, an electronic device includes a display unit configured to display content of an electronic document and a cursor at a current location within the content; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect two substantially simultaneous touch inputs at a first region on the touch screen display; upon determining that the two substantially simultaneous touch inputs remain in contact with the touch screen display, detect a continuous touch gesture from a location of the two substantially simultaneous touch inputs across the touch screen display from the first region to a second region; and in response to detecting the continuous touch gesture: move the cursor from the current location to a new location in a direction of the continuous touch gesture.
In accordance with some embodiments, an electronic device includes a display unit configured to display text and a cursor at a line within the text; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a two-finger swipe gesture on the touch screen display in a direction at least partially parallel to the line and towards an edge of the touch screen display; and in response to detecting the two-finger swipe gesture: move the cursor to a distal point of the text.
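A sketch of how the swipe direction might be reduced to a distal point follows, assuming horizontally laid-out text; the type names are hypothetical.

import CoreGraphics

enum DistalPoint { case startOfText, endOfText }

// A two-finger swipe that is at least partially parallel to the line (its
// horizontal component dominates) snaps the cursor to the distal point of
// the text in the swipe's direction; other swipes are left to other handlers.
func distalPoint(forSwipeTranslation t: CGVector) -> DistalPoint? {
    guard abs(t.dx) > abs(t.dy) else { return nil }  // not parallel enough
    return t.dx < 0 ? .startOfText : .endOfText
}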
In accordance with some embodiments, an electronic device includes a display unit configured to display content of an electronic document and a cursor within the electronic document; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a touch input on the touch screen display, wherein the touch input is located on a word within the content; and in response to detecting the touch input: select the word; and display a command display area adjacent to the selected word, wherein the command display area includes an icon for cutting the selected word, an icon for copying the selected word, and an icon for pasting previously selected content.
In accordance with some embodiments, an electronic device includes a display unit configured to display content of an electronic document and a selection of the content within the electronic document; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a single touch input on the touch screen display at a location over the selection; in response to detecting the single touch input at the location, display a set of options related to the selection; determine if the single touch input remains at the location for a predetermined amount of time followed by a continuous touch gesture away from the location on the touch screen display; and in response to detecting the single touch input remaining at the location for the predetermined amount of time followed by the continuous touch gesture away from the location, move the selection to a different location in a direction of the continuous touch gesture.
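The dwell-then-drag behavior described above might be tracked as follows; in this illustrative Swift sketch the 0.75-second dwell interval and all names are assumptions.

import Foundation

// Tracks a single touch over an existing selection: if it dwells in place
// long enough, subsequent movement drags the selection instead of moving
// the cursor.
final class SelectionMover {
    private var touchDownTime: TimeInterval?
    private(set) var isMovingSelection = false
    let dwell: TimeInterval = 0.75  // assumed dwell interval

    // Call on touch-down; `onSelection` reports whether the touch landed
    // over the selection.
    func touchBegan(onSelection: Bool, at time: TimeInterval) {
        touchDownTime = onSelection ? time : nil
        isMovingSelection = false
    }

    // Call on each movement; returns true when the caller should move the
    // selection along with the gesture.
    func touchMoved(beyondSlop: Bool, at time: TimeInterval) -> Bool {
        if isMovingSelection { return true }
        guard let down = touchDownTime else { return false }
        if time - down >= dwell {
            isMovingSelection = true   // dwell satisfied: enter move mode
        } else if beyondSlop {
            touchDownTime = nil        // moved too early: not a move gesture
        }
        return isMovingSelection
    }

    func touchEnded() {
        touchDownTime = nil
        isMovingSelection = false
    }
}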
In accordance with some embodiments, an electronic device includes a display unit configured to display content of an electronic document and a selection of the content within the electronic document; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect three substantially simultaneous touch inputs at locations anywhere on the touch screen display; determine if the three substantially simultaneous touch inputs are followed by three continuous touch gestures away from the locations on the touch screen display; and in response to detecting the three continuous touch gestures, move the selection to a different location in a direction of the continuous touch gestures.
In accordance with some embodiments, an electronic device includes a display unit configured to display content of an electronic document, the content includes at least one line of text comprising at least two words; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a touch input on the content; and in response to detecting the touch input: determine a distance of the touch input to a closest space between the two words within the electronic document; and in accordance with a determination that the distance is greater than a predetermined threshold distance, select a word within the electronic document closest to the touch input and display an indication of the selection.
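One way to express the distance test, as a sketch only; the 10-point threshold and all identifiers are assumptions.

import CoreGraphics

let spaceProximityThreshold: CGFloat = 10  // points; assumed value

enum TapResolution {
    case selectWord(Range<String.Index>)   // tap far from any space
    case placeCursor(String.Index)         // tap near a space
}

// If the touch lands farther from the closest inter-word space than the
// threshold, select the word closest to the touch; otherwise place the
// cursor at that space.
func resolveTap(at touch: CGPoint,
                nearestSpacePosition: CGPoint,
                nearestSpaceIndex: String.Index,
                closestWordRange: Range<String.Index>) -> TapResolution {
    let dx = touch.x - nearestSpacePosition.x
    let dy = touch.y - nearestSpacePosition.y
    let distance = (dx * dx + dy * dy).squareRoot()
    return distance > spaceProximityThreshold
        ? .selectWord(closestWordRange)
        : .placeCursor(nearestSpaceIndex)
}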
In accordance with some embodiments, an electronic device includes a display unit configured to display content of an electronic document and a soft keyboard; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect two substantially simultaneous touch inputs on the soft keyboard; and in response to detecting the two substantially simultaneous touch inputs on the soft keyboard, display a blurred soft keyboard.
In accordance with some embodiments, an electronic device includes a touch-sensitive surface unit configured to receive user touch inputs; a display unit configured to concurrently display an onscreen keyboard and a content presentation region, wherein the content presentation region displays text input received from the onscreen keyboard; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a touch input on the onscreen keyboard displayed on the display unit; in response to detecting the touch input on the onscreen keyboard, determine whether the touch input satisfies one or more criteria for entering a text selection mode; and in accordance with a determination that the touch input satisfies the one or more criteria for entering the text selection mode: concurrently display, in the content presentation region, a first cursor at a first location and a second cursor at a second location that is different from the first location.
In accordance with some embodiments, an electronic device includes a touch-sensitive surface unit configured to receive user touch inputs; a display unit configured to, while a contact is detected on the touch-sensitive surface unit, concurrently display on the display unit content and a text selection indicator at a first location within the content; one or more sensors to detect intensity of contacts with the touch-sensitive surface unit; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a first press input by the contact followed by movement of the contact across the touch-sensitive surface unit that corresponds to movement of at least a portion of the text selection indicator from the first location to a second location on the display unit; in response to detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface unit, select content between the first location and the second location; while the content between the first location and the second location is selected, detect a second press input by the contact on the touch-sensitive surface unit; in response to detecting the second press input by the contact on the touch-sensitive surface unit, perform a text selection operation, associated with the content between the first location and the second location, in accordance with the second press input, wherein the first press input, the movement across the touch-sensitive surface unit, and the second press input are made with a single continuous contact with the touch-sensitive surface unit.
In accordance with some embodiments, an electronic device includes a touch-sensitive surface unit configured to receive user touch inputs; a display unit configured to concurrently display an onscreen keyboard and a content presentation region on the touch-sensitive display unit, wherein the content presentation region displays text input received from the onscreen keyboard; one or more sensors to detect intensity of contacts with the touch-sensitive surface unit; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a touch input on the onscreen keyboard displayed on the touch-sensitive display unit, wherein detecting the touch input includes detecting movement of a contact and liftoff of the contact; in response to detecting the touch input on the onscreen keyboard displayed on the touch-sensitive display: in accordance with a determination that the touch input satisfies text-selection criteria, wherein the text-selection criteria include a criterion that is met when a characteristic intensity of the contact increases above a text-selection intensity threshold, perform a text selection operation based on the movement of the contact; and in accordance with a determination that the touch input satisfies text-entry criteria, wherein the text-entry criteria include a criterion that is met when the characteristic intensity of the contact does not increase above the text-selection intensity threshold, enter text into the content presentation region based on the touch input.
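The intensity gate in this embodiment reduces to a single comparison, as the following sketch shows; the normalized threshold value is assumed.

import CoreGraphics

let textSelectionIntensityThreshold: CGFloat = 0.6  // normalized; assumed

enum KeyboardTouchOutcome { case selectText, enterText }

// A press whose characteristic intensity rises above the threshold meets the
// text-selection criteria; otherwise the touch is treated as ordinary typing.
func classify(characteristicIntensity: CGFloat) -> KeyboardTouchOutcome {
    return characteristicIntensity > textSelectionIntensityThreshold
        ? .selectText
        : .enterText
}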
In accordance with some embodiments, there is provided an electronic device that includes a display unit configured to display content of an electronic document and a cursor within the electronic document; a touch-sensitive surface unit configured to receive user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: display, on the display unit, a soft keyboard having multiple keys each having a respective alphanumeric character of a plurality of alphanumeric characters; detect two substantially simultaneous touch inputs on the soft keyboard; and in response to detecting the two substantially simultaneous touch inputs on the soft keyboard, change the appearance of the soft keyboard to a changed appearance.
Thus, electronic devices with displays and touch-sensitive surfaces are provided with faster, more efficient methods and interfaces for cursor manipulation, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for cursor manipulation.
For a better understanding of the various described embodiments, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
Described below are devices and methods that allow a user to efficiently manipulate a cursor in an electronic document. The methods are particularly useful on portable devices with small displays, including handheld or pocket-sized devices (e.g., smartphones). When using conventional portable devices, a user may find it difficult to precisely place her finger (or make contact) at the cursor location on the touch screen display, as the cursor is often small, hidden under the user's finger, and/or disposed between text or graphics. As such, users often need to lift their finger and reposition it multiple times until the cursor is placed at the correct location. This conventional process is time-consuming, inefficient, and frustrating for users. The methods described herein allow a user to manipulate and place a cursor at a desired location within an electronic document, as well as perform certain editing functions, such as text selection or moving text. In some embodiments, this cursor (or selected text) manipulation can be controlled from any part of the touch-sensitive surface, not just the location of the cursor. These methods greatly reduce the number of steps that a user needs to perform to navigate and edit a document, thereby increasing efficiency and ease of use when performing these tasks.
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the present invention. The first contact and the second contact are both contacts, but they are not the same contact.
The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
Embodiments of computing devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the computing device is a portable communications device such as a mobile telephone that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the IPHONE, IPAD, and IPOD TOUCH devices from Apple Computer, Inc. of Cupertino, Calif.
In the discussion that follows, a computing device that includes a touch-sensitive display is described. It should be understood, however, that the computing device may include one or more other physical user-interface devices, such as a separate display, physical keyboard, a mouse, and/or a joystick.
The device supports a variety of applications, such as one or more of the following: a note taking application, a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video-conferencing application, an e-mail application, an instant messaging application, a fitness application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, a digital video player application, and/or a home automation application.
The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device may support the variety of applications with user interfaces that are intuitive and transparent.
The user interfaces may include one or more soft keyboard embodiments. The soft keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard, such as those described in U.S. patent application Ser. No. 11/459,606, “Keyboards For Portable Electronic Devices,” filed Jul. 24, 2006, and Ser. No. 11/459,615, “Touch Screen Keyboards For Portable Electronic Devices,” filed Jul. 24, 2006, the contents of which are hereby incorporated by reference in their entirety. The keyboard embodiments may include a reduced number of icons (or soft keys) relative to the number of keys in existing physical keyboards, such as that for a typewriter. This may make it easier for users to select one or more icons in the keyboard, and thus, one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user. For example, one or more keyboard embodiments may be tailored to a respective user based on a word usage history (lexicography, slang, individual usage) of the respective user. Some of the keyboard embodiments may be adjusted to reduce a probability of a user error when selecting one or more icons, and thus one or more symbols, when using the soft keyboard embodiments.
Attention is now directed toward embodiments of portable devices with touch-sensitive displays.
As used in the specification and claims, the term “tactile output” refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., housing) of the device, or displacement of the component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a “down click” or “up click” of a physical actuator button. In some cases, a user will feel a tactile sensation such as a “down click” or “up click” even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as “roughness” of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an “up click,” a “down click,” “roughness”), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as CPU(s) 120 and the peripherals interface 118, is, optionally, controlled by memory controller 122.
Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU(s) 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
In some embodiments, peripherals interface 118, CPU(s) 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication optionally uses any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSDPA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212,
I/O subsystem 106 couples input/output peripherals on device 100, such as touch-sensitive display system 112 and other input or control devices 116, with peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input or control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 are, optionally, coupled with any (or none) of the following: a keyboard, infrared port, USB port, stylus, and/or a pointer device such as a mouse. The one or more buttons (e.g., 208,
Touch-sensitive display system 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch-sensitive display system 112. Touch-sensitive display system 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user-interface objects. As used herein, the term “affordance” refers to a user-interactive graphical user interface object (e.g., a graphical user interface object that is configured to respond to inputs directed toward the graphical user interface object). Examples of user-interactive graphical user interface objects include, without limitation, a button, slider, icon, selectable menu item, switch, hyperlink, or other user interface control.
Touch-sensitive display system 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch-sensitive display system 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch-sensitive display system 112 and converts the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on touch-sensitive display system 112. In an exemplary embodiment, a point of contact between touch-sensitive display system 112 and the user corresponds to a finger of the user or a stylus.
Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch-sensitive display system 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display system 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® from Apple Inc. of Cupertino, Calif.
Touch-sensitive display system 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen video resolution is in excess of 400 dpi (e.g., 500 dpi, 800 dpi, or greater). The user optionally makes contact with touch-sensitive display system 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch-sensitive display system 112 or an extension of the touch-sensitive surface formed by the touch screen.
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
Device 100 optionally also includes one or more optical sensors 164.
Device 100 optionally also includes one or more contact intensity sensors 165.
Device 100 optionally also includes one or more proximity sensors 166.
Device 100 optionally also includes one or more tactile output generators 167.
Device 100 optionally also includes one or more accelerometers 168.
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, haptic feedback module (or set of instructions) 133, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157, as shown in
Operating system 126 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used in some iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif. In some embodiments, the external port is a Lightning connector that is the same as, or similar to and/or compatible with the Lightning connector used in some iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif.
Contact/motion module 130 optionally detects contact with touch-sensitive display system 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact (e.g., by a finger or by a stylus), such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one finger contacts or stylus contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event. Similarly, tap, swipe, drag, and other gestures are optionally detected for a stylus by detecting a particular contact pattern for the stylus.
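As an illustrative sketch, classifying a contact stream into a tap or a swipe by its net displacement might look like the following; the event names and slop radius are hypothetical.

import CoreGraphics

enum ContactEvent {
    case fingerDown(CGPoint)
    case fingerDrag(CGPoint)
    case fingerUp(CGPoint)
}

enum Gesture { case tap, swipe, none }

// A finger-down followed by a finger-up at substantially the same position
// is a tap; net displacement beyond a slop radius makes it a swipe.
func classify(events: [ContactEvent], slop: CGFloat = 10) -> Gesture {
    guard case .fingerDown(let start)? = events.first,
          case .fingerUp(let end)? = events.last else { return .none }
    let dx = end.x - start.x
    let dy = end.y - start.y
    return (dx * dx + dy * dy).squareRoot() > slop ? .swipe : .tap
}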
Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof: contacts module 137; telephone module 138; videoconferencing module 139; e-mail client module 140; instant messaging (IM) module 141; workout support module 142; camera module 143; image management module 144; browser module 147; calendar module 148; widget modules 149, which optionally include one or more of weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and user-created widget 149-6; widget creator module 150; search module 151; video and music player module 152; notes module 153; map module 154; and online video module 155.
Examples of other applications 136 that are, optionally, stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 includes executable instructions to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers and/or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, telephone module 138 includes executable instructions to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, Apple Push Notification Service (APNs) or IMPS for Internet-based instant messages), to receive instant messages and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files and/or other attachments as are supported in a MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module 152, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie burning goals); communicate with workout sensors (in sports devices and smart watches); receive workout sensor data; calibrate sensors used to monitor a workout; select and play music for a workout; and display, store and transmit workout data.
In conjunction with touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, and/or delete a still image or video from memory 102.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 includes executable instructions to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present or otherwise play back videos (e.g., on touch-sensitive display system 112, or on an external display connected wirelessly or via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 includes executable instructions to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes executable instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen 112, or on an external display connected wirelessly or via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.
Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a “menu button” is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state stored in the memory 102 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected may correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected may be called the hit view, and the set of events that are recognized as proper inputs may be determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (e.g., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
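By way of illustration only, the hit-view search described above can be sketched as a recursive descent through a view tree. The Swift types below (View, Rect, Point, hitView(for:)) are illustrative stand-ins, not part of this disclosure or of any particular framework; the sketch simply returns the lowest view containing the touch point.

```swift
// A minimal, self-contained sketch of the hit-view search described above.
struct Point { var x, y: Double }
struct Rect {
    var x, y, width, height: Double
    func contains(_ p: Point) -> Bool {
        p.x >= x && p.x < x + width && p.y >= y && p.y < y + height
    }
}

final class View {
    let name: String
    let frame: Rect              // in window coordinates, for simplicity
    var subviews: [View] = []
    init(name: String, frame: Rect, subviews: [View] = []) {
        self.name = name; self.frame = frame; self.subviews = subviews
    }

    // Returns the lowest (deepest) view containing `point`, mirroring the
    // rule that the hit view is the lowest-level view in which the
    // initiating sub-event occurs.
    func hitView(for point: Point) -> View? {
        guard frame.contains(point) else { return nil }
        for subview in subviews {                     // search children first
            if let hit = subview.hitView(for: point) { return hit }
        }
        return self                                   // no child claims the point
    }
}

// Usage: a window with a content view and a button; a touch at (15, 15)
// lands on the button, so the button becomes the hit view.
let button = View(name: "button", frame: Rect(x: 10, y: 10, width: 40, height: 20))
let content = View(name: "content", frame: Rect(x: 0, y: 0, width: 100, height: 100),
                   subviews: [button])
print(content.hitView(for: Point(x: 15, y: 15))?.name ?? "none")   // "button"
```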
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.
Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver module 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 may utilize or call data updater 176, object updater 177 or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 includes one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170, and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which may include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When the sub-event concerns motion of a touch, the event information may also include speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double-tap on a displayed object. The double-tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
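As a rough sketch of how event comparator 184 might match a stream of sub-events against event definitions 186, the following Swift fragment models an event definition as a predefined sub-event sequence. Timing constraints (the predetermined phases above) are omitted, and all names here are illustrative assumptions, not API from this disclosure.

```swift
// Sub-events and definitions are modeled minimally for illustration.
enum SubEvent: Equatable {
    case touchBegin, touchEnd, touchMove, touchCancel
}

struct EventDefinition {
    let name: String
    let sequence: [SubEvent]     // a predefined sequence of sub-events

    // True when the observed sub-events exactly match this definition.
    func matches(_ observed: [SubEvent]) -> Bool { observed == sequence }

    // True while the observed prefix could still grow into a match.
    func couldMatch(_ observed: [SubEvent]) -> Bool {
        observed.count <= sequence.count
            && Array(sequence.prefix(observed.count)) == observed
    }
}

// "Event 1": a double tap -- touch begin/end twice (phase timing omitted).
let doubleTap = EventDefinition(name: "double tap",
                                sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd])
// "Event 2": a drag -- touch begin, movement, then end. A fixed sequence is
// used only to keep the sketch short.
let drag = EventDefinition(name: "drag",
                           sequence: [.touchBegin, .touchMove, .touchEnd])

let observed: [SubEvent] = [.touchBegin, .touchEnd, .touchBegin, .touchEnd]
for definition in [doubleTap, drag] where definition.matches(observed) {
    print("recognized:", definition.name)             // "recognized: double tap"
}
```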
In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.
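The failure behavior above can be sketched as a small state machine: once the observed sub-events can no longer grow into the recognizer's definition, the recognizer enters a failed state and disregards the remainder of the gesture. The Swift below is illustrative only; the states and string-labeled sub-events are assumptions.

```swift
enum RecognizerState { case possible, recognized, failed }

struct Recognizer {
    let sequence: [String]             // e.g., ["begin", "end", "begin", "end"]
    private(set) var observed: [String] = []
    private(set) var state: RecognizerState = .possible

    mutating func deliver(_ subEvent: String) {
        guard state == .possible else { return }   // failed/ended: disregard
        observed.append(subEvent)
        if observed == sequence {
            state = .recognized
        } else if Array(sequence.prefix(observed.count)) != observed {
            state = .failed            // can never match: stop tracking
        }
    }
}

var doubleTap = Recognizer(sequence: ["begin", "end", "begin", "end"])
for sub in ["begin", "move"] { doubleTap.deliver(sub) }   // a drag, not a tap
print(doubleTap.state)   // failed; subsequent sub-events are ignored
```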
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers may interact with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates event handler 190 associated with an event when one or more particular sub-events of an event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video and music player module 152. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It shall be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs for operating multifunction devices 100 with input devices, not all of which are initiated on touch screens, e.g., coordinating mouse movement and mouse button presses with or without single or multiple keyboard presses or holds; user movements such as taps, drags, scrolls, etc., on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof, which may be utilized as inputs corresponding to sub-events that define an event to be recognized.
Device 100 optionally also includes one or more physical buttons, such as a “home” or menu button 204. As described previously, menu button 204 is, optionally, used to navigate to any application 136 in a set of applications that are, optionally, executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on the touch-screen display.
In some embodiments, device 100 includes the touch-screen display, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is, optionally, used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In some embodiments, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensity of contacts on touch-sensitive display system 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
Although many of the examples that follow will be given with reference to inputs on touch screen display 112 (where the touch sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures, etc.), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse based input or a stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
As used herein, the term “focus selector” refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a “focus selector,” so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad or touch-sensitive surface 251 in
As used in the specification and claims, the term “intensity” of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact or a stylus contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). Intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average or a sum) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine a pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows for user access to additional device functionality that may otherwise not be readily accessible by the user on a reduced-size device with limited real estate for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether a user has “clicked” on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse “click” threshold of a trackpad or touch-screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch-screen display hardware. Additionally, in some implementations a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click “intensity” parameter).
As used in the specification and claims, the term “characteristic intensity” of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is, optionally, based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined time period (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, or 10 seconds) relative to a predefined event (e.g., after detecting the contact, prior to detecting liftoff of the contact, before or after detecting a start of movement of the contact, prior to detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). A characteristic intensity of a contact is, optionally, based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at the 90 percent maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by a user. For example, the set of one or more intensity thresholds may include a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold and does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second intensity threshold results in a third operation. In some embodiments, a comparison between the characteristic intensity and one or more intensity thresholds is used to determine whether or not to perform one or more operations (e.g., whether to perform a respective operation or forgo performing the respective operation) rather than being used to determine whether to perform a first operation or a second operation.
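As a hedged illustration of the two-threshold example above, the Swift sketch below computes a characteristic intensity (here, the mean of the samples, one of the options listed) and maps it to one of three operations. The threshold values 0.3 and 0.7 and the function names are assumptions chosen only for the example.

```swift
// Derive a "characteristic intensity" from intensity samples; the mean is
// one of several candidate characteristics named in the text.
func characteristicIntensity(of samples: [Double]) -> Double {
    guard !samples.isEmpty else { return 0 }
    return samples.reduce(0, +) / Double(samples.count)
}

// Map the characteristic intensity to one of three operations using a
// first and a second intensity threshold, per the example above.
func operation(forSamples samples: [Double],
               firstThreshold: Double = 0.3,
               secondThreshold: Double = 0.7) -> String {
    let intensity = characteristicIntensity(of: samples)
    switch intensity {
    case ..<firstThreshold:  return "first operation"
    case ..<secondThreshold: return "second operation"
    default:                 return "third operation"
    }
}

print(operation(forSamples: [0.1, 0.2, 0.2]))    // first operation
print(operation(forSamples: [0.4, 0.5, 0.6]))    // second operation
print(operation(forSamples: [0.8, 0.9, 0.95]))   // third operation
```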
In some embodiments, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, a touch-sensitive surface may receive a continuous swipe contact transitioning from a start location and reaching an end location (e.g., a drag gesture), at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end location may be based on only a portion of the continuous swipe contact, and not the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some embodiments, a smoothing algorithm may be applied to the intensities of the swipe contact prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining a characteristic intensity.
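Two of the smoothing options named above can be sketched naively as follows; the three-sample window is an assumption. Both filters damp the narrow spike in the example input before a characteristic intensity would be computed.

```swift
// Unweighted sliding average over a trailing window.
func slidingAverage(_ samples: [Double], window: Int = 3) -> [Double] {
    samples.indices.map { i in
        let lo = max(0, i - window + 1)
        let slice = samples[lo...i]
        return slice.reduce(0, +) / Double(slice.count)
    }
}

// Median filter over the same trailing window.
func medianFilter(_ samples: [Double], window: Int = 3) -> [Double] {
    samples.indices.map { i in
        let lo = max(0, i - window + 1)
        let sorted = samples[lo...i].sorted()
        return sorted[sorted.count / 2]    // middle element of the window
    }
}

// A narrow spike at index 2 is damped by both filters.
let spiky = [0.2, 0.2, 0.9, 0.2, 0.2]
print(slidingAverage(spiky))   // spike spread out and reduced
print(medianFilter(spiky))     // spike removed entirely
```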
The user interface figures described herein optionally include various intensity diagrams that show the current intensity of the contact on the touch-sensitive surface relative to one or more intensity thresholds (e.g., a contact detection intensity threshold IT0, a light press intensity threshold ITL, a deep press intensity threshold ITD (e.g., that is at least initially higher than ITL), and/or one or more other intensity thresholds (e.g., an intensity threshold Ix that is lower than ITL)). The intensity diagram is typically not part of the displayed user interface, but is provided to aid in the interpretation of the figures. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact-detection intensity threshold IT0 below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
In some embodiments, the response of the device to inputs detected by the device depends on criteria based on the contact intensity during the input. For example, for some “light press” inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response. In some embodiments, the response of the device to inputs detected by the device depends on criteria that include both the contact intensity during the input and time-based criteria. For example, for some “deep press” inputs, the intensity of a contact exceeding a second intensity threshold during the input, greater than the first intensity threshold for a light press, triggers a second response only if a delay time has elapsed between meeting the first intensity threshold and meeting the second intensity threshold. This delay time is typically less than 200 ms in duration (e.g., 40, 100, or 120 ms, depending on the magnitude of the second intensity threshold, with the delay time increasing as the second intensity threshold increases). This delay time helps to avoid accidental deep press inputs. As another example, for some “deep press” inputs, there is a reduced-sensitivity time period that occurs after the time at which the first intensity threshold is met. During the reduced-sensitivity time period, the second intensity threshold is increased. This temporary increase in the second intensity threshold also helps to avoid accidental deep press inputs. For other deep press inputs, the response to detection of a deep press input does not depend on time-based criteria.
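The delay-based deep press criterion described above might be sketched as follows; the threshold values and the 100 ms delay are illustrative assumptions, and the reduced-sensitivity period is omitted for brevity.

```swift
import Foundation

// The second (deep press) threshold counts only if a delay has elapsed
// since the first (light press) threshold was met, filtering out
// accidental fast spikes in intensity.
struct DeepPressDetector {
    let firstThreshold = 0.3                     // light press (assumed)
    let secondThreshold = 0.7                    // deep press (assumed)
    let requiredDelay: TimeInterval = 0.100      // assumed delay time

    private(set) var firstThresholdMetAt: TimeInterval? = nil

    // Feed one (timestamp, intensity) sample; returns true on a deep press.
    mutating func process(time: TimeInterval, intensity: Double) -> Bool {
        if firstThresholdMetAt == nil, intensity >= firstThreshold {
            firstThresholdMetAt = time
        }
        guard let t0 = firstThresholdMetAt else { return false }
        return intensity >= secondThreshold && time - t0 >= requiredDelay
    }
}

var detector = DeepPressDetector()
print(detector.process(time: 0.00, intensity: 0.4))   // false: light press only
print(detector.process(time: 0.05, intensity: 0.8))   // false: too soon after t0
print(detector.process(time: 0.15, intensity: 0.8))   // true: delay elapsed
```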
In some embodiments, one or more of the input intensity thresholds and/or the corresponding outputs vary based on one or more factors, such as user settings, contact motion, input timing, application running, rate at which the intensity is applied, number of concurrent inputs, user history, environmental factors (e.g., ambient noise), focus selector position, and the like. Exemplary factors are described in U.S. patent application Ser. Nos. 14/399,606 and 14/624,296, which are incorporated by reference herein in their entireties.
An increase of characteristic intensity of the contact from an intensity below the light press intensity threshold ITL to an intensity between the light press intensity threshold ITL and the deep press intensity threshold ITD is sometimes referred to as a “light press” input. An increase of characteristic intensity of the contact from an intensity below the deep press intensity threshold ITD to an intensity above the deep press intensity threshold ITD is sometimes referred to as a “deep press” input. An increase of characteristic intensity of the contact from an intensity below the contact-detection intensity threshold IT0 to an intensity between the contact-detection intensity threshold IT0 and the light press intensity threshold ITL is sometimes referred to as detecting the contact on the touch-surface. A decrease of characteristic intensity of the contact from an intensity above the contact-detection intensity threshold IT0 to an intensity below the contact-detection intensity threshold IT0 is sometimes referred to as detecting liftoff of the contact from the touch-surface. In some embodiments, IT0 is zero. In some embodiments, IT0 is greater than zero. In some illustrations, a shaded circle or oval is used to represent intensity of a contact on the touch-sensitive surface. In some illustrations, a circle or oval without shading is used to represent a respective contact on the touch-sensitive surface without specifying the intensity of the respective contact.
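The four named transitions above can be sketched as crossings between intensity bands. The numeric values assigned to IT0, ITL, and ITD below are arbitrary assumptions; only the band-crossing logic mirrors the text.

```swift
// Assumed example values for the thresholds.
let it0 = 0.05, itl = 0.3, itd = 0.7

// Band 0: below contact detection; 1: contact detected; 2: light press
// range; 3: deep press range.
func band(_ intensity: Double) -> Int {
    if intensity < it0 { return 0 }
    if intensity < itl { return 1 }
    if intensity < itd { return 2 }
    return 3
}

func transition(from old: Double, to new: Double) -> String? {
    switch (band(old), band(new)) {
    case (0, 1):                 return "contact detected on touch-surface"
    case (1, 2):                 return "light press input"
    case (1, 3), (2, 3):         return "deep press input"
    case (1, 0), (2, 0), (3, 0): return "liftoff of the contact"
    default:                     return nil
    }
}

print(transition(from: 0.0, to: 0.1)!)   // contact detected on touch-surface
print(transition(from: 0.1, to: 0.5)!)   // light press input
print(transition(from: 0.5, to: 0.9)!)   // deep press input
print(transition(from: 0.5, to: 0.0)!)   // liftoff of the contact
```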
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (e.g., the respective operation is performed on a “down stroke” of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input threshold (e.g., the respective operation is performed on an “up stroke” of the respective press input).
In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed “jitter,” where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold, or the hysteresis intensity threshold is 75%, 90%, or some other reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., the respective operation is performed on an “up stroke” of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity threshold, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
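A minimal sketch of this hysteresis, assuming a press threshold of 0.6 and the 75% proportion named above: the press triggers at the press-input intensity threshold, but the release is not recognized until intensity falls below the lower hysteresis threshold, which suppresses jitter around the press threshold.

```swift
struct HysteresisButton {
    let pressThreshold = 0.6                               // assumed value
    var releaseThreshold: Double { pressThreshold * 0.75 } // 75% proportion
    private(set) var isPressed = false

    // Returns "press" or "release" when the state changes, else nil.
    mutating func process(intensity: Double) -> String? {
        if !isPressed && intensity >= pressThreshold {
            isPressed = true
            return "press"
        }
        if isPressed && intensity < releaseThreshold {
            isPressed = false
            return "release"
        }
        return nil
    }
}

var button = HysteresisButton()
// Intensity wobbles around the press threshold but stays above the
// hysteresis threshold, so no spurious release events are emitted.
for intensity in [0.2, 0.65, 0.58, 0.62, 0.55, 0.3] {
    if let event = button.process(intensity: intensity) {
        print(intensity, event)     // 0.65 press ... 0.3 release
    }
}
```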
For ease of explanation, operations described as performed in response to a press input associated with a press-input intensity threshold or in response to a gesture including the press input are, optionally, triggered in response to detecting: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold. As described above, in some embodiments, the triggering of these responses also depends on time-based criteria being met (e.g., a delay time has elapsed between a first intensity threshold being met and a second intensity threshold being met).
Attention is now directed towards embodiments of processes and associated user interfaces (“UI”) that may be implemented on an electronic device with a display, a touch-sensitive surface, and optionally one or more sensors to detect intensities of contacts with the touch-sensitive surface, such as the portable multifunction device 100.
In some embodiments, the device 100 displays content of an electronic document on the touch screen display 112. In some embodiments, the content comprises text (e.g., plain text, unstructured text, formatted text, or text in a web page). In other embodiments, the content comprises graphics with or without text. Moreover, the content may be editable or read-only. In addition to displaying the content, when no content is selected, the device 100 may display a cursor within the electronic document, e.g., for text entry. In some embodiments, while displaying the content of the electronic document, the device 100 detects two substantially simultaneous touch inputs at 402. These two substantially simultaneous touch inputs can occur and be detected anywhere on the screen, including over an active virtual keyboard being displayed on the screen.
In some embodiments, the device 100 continuously monitors touch inputs and continuous movements of the touch inputs on the touch screen 112. Once touch inputs are detected by the device 100, the device 100 determines whether the two substantially simultaneous touch inputs are located on a soft keyboard on the touch screen at 406. In response to detecting the two substantially simultaneous touch inputs that started on the soft keyboard (406—Yes), the device 100 displays a soft keyboard that has a changed appearance at 404. In some embodiments, the device continuously monitors the touch inputs at step 406 so that the appearance of the keyboard remains changed when the two substantially simultaneous touch inputs that started on the soft keyboard are followed by a continuous movement of the two substantially simultaneous touch inputs without breaking contact with the touch screen 112. In other words, the device at step 406 detects whether there were two substantially simultaneous inputs on the soft keyboard either with or without a subsequent continuous movement off the keyboard. In some embodiments, in response to detecting that the two substantially simultaneous touch inputs were not initially detected on the soft keyboard (406—No), the device 100 displays a soft keyboard with an unchanged appearance at 408, e.g., displays a regular unblurred soft keyboard or a keyboard with all alphanumeric characters displayed. A soft keyboard is a set of multiple virtual keys displayed on the screen, e.g., a QWERTY keyboard, as shown in
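The branch at steps 402-408 might be sketched as follows. The Touch and KeyboardRegion types, the 50 ms simultaneity window, and the keyboard geometry are all illustrative assumptions; only the branching (changed appearance only when both touches start on the soft keyboard) follows the text.

```swift
struct Touch { var x, y: Double; var time: Double }

struct KeyboardRegion {
    var minY: Double                     // keyboard occupies the screen below minY
    func contains(_ t: Touch) -> Bool { t.y >= minY }
}

func keyboardAppearance(for touches: [Touch], keyboard: KeyboardRegion) -> String {
    guard touches.count == 2,
          abs(touches[0].time - touches[1].time) < 0.05   // "substantially simultaneous"
    else { return "unchanged" }
    // 406: did both touches start on the soft keyboard?
    let onKeyboard = touches.allSatisfy { keyboard.contains($0) }
    return onKeyboard ? "changed (e.g., blurred)" : "unchanged"
}

let keyboard = KeyboardRegion(minY: 400)
let twoOnKeys = [Touch(x: 50, y: 420, time: 0.00), Touch(x: 90, y: 430, time: 0.02)]
let twoOnText = [Touch(x: 50, y: 100, time: 0.00), Touch(x: 90, y: 110, time: 0.02)]
print(keyboardAppearance(for: twoOnKeys, keyboard: keyboard))   // changed (e.g., blurred)
print(keyboardAppearance(for: twoOnText, keyboard: keyboard))   // unchanged
```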
In some embodiments, instead of displaying the blurred soft keyboard, the device changes one or more of: color, hue, saturation, brightness, and contrast of the soft keyboard 521 based on the content of the electronic document. In some embodiments, the appearance of the blurred keyboard 521 is based on the content displayed and a set of control-appearance values for blur radius, saturation adjustment, opacity of a white overlay, opacity of a black overlay, opacity of user interface elements in keyboard 521, and/or the color of text displayed in the region where the keyboard 521 is displayed.
Depending on the type of the touch input received, the device 100 performs different actions.
For example, if the type of touch input is a drag and the touch input is not located on a selection, then the device 100 moves the cursor and displays a ghost cursor at 421.
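The dispatch described above can be sketched as a table over (existing selection, on selection, gesture type). The branch labels echo the step numbers in the text; the enum, the strings, and the reduction to four branches are illustrative only.

```swift
enum Gesture { case tap, drag, flick, holdAndDrag }

func action(hasSelection: Bool, onSelection: Bool, gesture: Gesture) -> String {
    switch (hasSelection, onSelection, gesture) {
    case (false, _, .drag):
        return "move cursor and display ghost cursor (421)"
    case (true, true, .holdAndDrag):
        return "move the selection (451)"
    case (true, false, .drag):
        return "dismiss selection and move cursor"
    case (true, false, .flick):
        return "dismiss selection and move cursor to distal point (445)"
    default:
        return "other handling"
    }
}

print(action(hasSelection: false, onSelection: false, gesture: .drag))
// move cursor and display ghost cursor (421)
```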
While document 500 is displayed in a document editing mode, keyboard 521 is also displayed. A user may enter text into document 500 by typing on keyboard 521, and confirm completion of editing of document 500 by performing a touch input (e.g., a tap on “done” button 512) to exit an editing mode.
In some embodiments, the device 100 detects two substantially simultaneous touch inputs at a first region 524-1, 524-2 on the touch screen display 112. The device 100 can further determine that the two substantially simultaneous touch inputs remain in contact with the touch screen display 112, and detect a continuous touch gesture or drag gesture from a location of the two substantially simultaneous touch inputs across the touch screen display from the first region 524-1, 524-2 to a second region 524-3, 524-4. In response to detecting the continuous touch gesture, the device 100 moves the cursor or insertion point marker 522-1 across document 500 from the current location 522-1 in
In another embodiment, instead of moving the cursor while dragging two fingers (e.g., a two-finger slide gesture), the ghost cursor 522-2 moves, but not the actual cursor. In some embodiments, a distance between the first region 524-1, 524-2 and the second region 524-3, 524-4 of the touch inputs in
In some embodiments, in addition to the selection indicators 526-530, the device 100 displays a command display area adjacent to the selected content 528 as shown in
For example, prior to detecting touch inputs, the device 100 detects an existing selection of a portion of the content and displays the selection. The selection is highlighted, and the device 100 displays markers, such as a start-point object and an end-point object, at respective ends of the selection. After displaying a soft keyboard that has a changed appearance in response to detecting two substantially simultaneous touch inputs on the soft keyboard, the device 100 detects a continuous movement of the two substantially simultaneous touch inputs from a first location on the soft keyboard to a second location anywhere on the touch screen display (e.g., within the boundaries of the soft keyboard, or beyond the soft keyboard and onto the content region) without breaking contact with the touch screen display. In response to detecting the continuous movement that started on the soft keyboard outside the selected content, the device expands the selected content to include additional content beyond the portion in a direction towards the second location.
In some embodiments, when expanding the selected content, the selection can expand either backward or forward. The initial direction of the drag gesture determines the direction of the expansion. For example, a right and/or down continuous movement moves the end-point object right and/or down without moving the start-point object in order to expand the selection forward, while a left and/or up drag gesture moves the start-point object left and/or up without moving the end-point object in order to expand the selection backward. In some embodiments, the selection expands one word at a time if the continuous movement is fast, and expands one character at a time if the continuous movement is slow.
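A sketch of these expansion rules, with character offsets standing in for document positions: the sign of the initial horizontal drag picks which end of the selection moves, and drag speed picks word versus character granularity. The speed cutoff and the fixed five-character word stride are assumptions made only for the example.

```swift
enum Granularity { case word, character }

struct Selection { var start: Int; var end: Int }   // character offsets

func expand(_ selection: inout Selection,
            dragDeltaX: Double,            // initial horizontal movement
            speed: Double,                 // points per second (assumed unit)
            wordStride: Int = 5) {
    let granularity: Granularity = speed > 300 ? .word : .character
    let step = granularity == .word ? wordStride : 1
    if dragDeltaX >= 0 {
        selection.end += step              // forward: move end-point, keep start
    } else {
        selection.start = max(0, selection.start - step)  // backward: move start-point
    }
}

var selection = Selection(start: 10, end: 15)
expand(&selection, dragDeltaX: 12, speed: 500)    // fast forward drag: one word
print(selection)                                  // Selection(start: 10, end: 20)
expand(&selection, dragDeltaX: -4, speed: 100)    // slow backward drag: one character
print(selection)                                  // Selection(start: 9, end: 20)
```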
In some embodiments, in accordance with a determination that the touch inputs are not on the selection (415—No) (e.g., outside the selection) and a determination at 419 that the touch inputs are a drag gesture, the device 100 dismisses the selection and moves the cursor in a direction of the drag gesture.
Similarly, if the device 100 determines at 415 that the touch inputs are not on the selection (415—No) (e.g., outside the selection), and determines at 419 that the touch input is a flick gesture (also known as a “swipe” gesture), then the selection is again dismissed and the cursor is moved to the distal point of the text at 445.
Subsequently, while displaying the content of the document 500, and when there is no existing selection (411—No of
In some embodiments, a word is selected if the cursor 522 is within the word (e.g., the word “score” in
On the other hand, if the device 100 determines at 415 that the touch inputs are not on the selection (415—No) (e.g., outside the selection) and determines at 419 that the touch inputs are a tap, then the device 100 determines whether the selected content is one word at 466. In accordance with a determination that the selected content is one word (466—Yes), the device 100 expands the selected content to include a sentence containing the word at 467. On the other hand, in accordance with a determination that the selected content is more than one word, the device 100 displays a cursor at the beginning or end of the selection and dismisses the selection at 468. In some embodiments, in accordance with a determination that the tap is not on the selection, the device 100 dismisses the selection and selects the closest word to the contact point of the tap.
In other embodiments, after selecting the closest word to the cursor, the device 100 can further detect a two-finger double-tap (e.g., tapping twice with two fingers), a two-finger triple-tap (e.g., tapping three times with two fingers), or a two-finger quadruple-tap (e.g., tapping four times with two fingers). In response to detecting a two-finger double-tap, two-finger triple-tap, or two-finger quadruple-tap, the device 100 expands the selection to a sentence or a line at 460, to a paragraph at 462, or to the document at 464, respectively. In some embodiments, the selection expansion operation is only operable to paragraph level. For example, in response to a two-finger quadruple-tap, the device 100 keeps the paragraph selected without further expanding the selection to the entire document at 464. In some embodiments, device 100 determines if there is a multi-tap and expands the selection if the duration between each subsequent tap is less than a predetermined threshold (e.g., 0.66 seconds).
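The stepped expansion above might be sketched as follows; the 0.66-second inter-tap threshold comes from the text, while the scope enum, the paragraph-level cap option, and the reset-to-word behavior on a slow tap are assumptions.

```swift
enum SelectionScope: Int { case word = 1, sentence, paragraph, document }

struct MultiTapExpander {
    var scope: SelectionScope = .word
    var lastTapTime: Double? = nil
    let maxInterval = 0.66             // predetermined threshold from the text
    let capAtParagraph = true          // some embodiments stop at paragraph level

    mutating func registerTap(at time: Double) {
        defer { lastTapTime = time }
        guard let last = lastTapTime, time - last < maxInterval else {
            scope = .word              // too slow: treat as a fresh selection
            return
        }
        let ceiling: SelectionScope = capAtParagraph ? .paragraph : .document
        scope = SelectionScope(rawValue: min(scope.rawValue + 1, ceiling.rawValue))!
    }
}

var expander = MultiTapExpander()
expander.registerTap(at: 0.0); print(expander.scope)   // word
expander.registerTap(at: 0.4); print(expander.scope)   // sentence
expander.registerTap(at: 0.8); print(expander.scope)   // paragraph
expander.registerTap(at: 1.2); print(expander.scope)   // paragraph (capped)
```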
In some embodiments, if the device 100 detects two substantially simultaneous touch inputs that are located outside an existing selection (415—No) (e.g., on a soft keyboard) and the type of touch inputs is a tap (down), followed by a lift-off (up), followed by another tap (down) and, without lift-off, a drag (419—Tap+half+drag), the device 100 at step 465 first dismisses the selection, then performs actions similar to those performed at step 471 in
In a similar vein, in some embodiments, the device 100 determines that the location of the tap 524 is on a selection 528 of a single line of text.
If the device 100 detects an existing selection (411—Yes), and further determines that the touch inputs are located on an existing selection (415—Yes), and then determines at 417 that the inputs are a hold and drag, then the device 100 moves the selection at 451. In some embodiments, the selected text remains in place with only a cursor moving, and only once the user releases contact with the touch screen surface (e.g., a lift-off event) is the selected text moved to the position of the cursor at the time of the lift-off or release. In other embodiments, the selection is dragged around the document and placed at the position of the selection (or a cursor) at the time of contact release or lift off. In yet another embodiment, a ghost copy of the selection is dragged around the document, while the original selection remains in place, and only placed at the position of the selection (or a cursor) at the time of contact release or lift off.
If the touch inputs are not located on an existing selection (e.g., outside an existing selection) (415—No), the device 100 determines the type of touch input at 419 and, if the type is a hold and drag, expands the existing selection in the direction of the drag gesture at 453. Here, the expanded selection includes the original selection and any expanded portion of the document.
As mentioned above, in some embodiments, instead of moving the selection 528-1, a ghost selection 528 is displayed in response to the detection of the two-finger dragging. As shown in
In some embodiments, the ghost selection 528-2 appears at the location of the current position of the selection 528-1 as soon as the device 100 detects the two-finger dragging, and the start-point object 526-1 and the end-point object 530-1 can animate to attract the user's attention to the appearance of the ghost selection 528-2. While dragging, the ghost selection 528-2 moves in the direction of the two-finger movement to indicate the position to which the selection will be moved when the user lifts or releases the contact (e.g., at the lift-off event). The ghost selection 528-2 may include a ghost start-point object 526-2 and a ghost end-point object 530-2. Both the ghost start-point object 526-2 and the ghost end-point object 530-2 may have different appearances from the start-point object 526-1 and the end-point object 530-1. On release of the two-finger drag, the selection moves to the location of the ghost selection 528-2. At release of the contacts, the device 100 ceases the display of the ghost selection 528-2 along with the ghost markers 526-2 and 530-2 as shown in
In some embodiments, the device 100 displays content of an electronic document on the touch screen display 112. In some embodiments, the content comprises text (e.g., plain text, unstructured text, formatted text, or text in a web page). In other embodiments, the content comprises graphics with or without text. Moreover, the content may be editable or read-only. In addition to displaying the content, when no content is selected, the device 100 may display a cursor within the electronic document. In some embodiments, while displaying the content of the electronic document, the device 100 detects a single-finger touch input (e.g., a single-finger tap) at 472. The portable multifunction device 100 then determines whether prior to detecting the touch input, there is an existing selection of the content at 474.
If the device 100 detects an existing selection (474—Yes), then the device 100 further determines if the single-finger touch input is located on the selection at 475. If the device 100 determines that the single-finger touch input is not located on the selection (475—No), then device 100 dismisses the selection at 476 and proceeds to step 480. On the other hand, if the device 100 determines that the single-finger touch input is located on the selection (475—Yes), the device 100 determines the type of the touch input at 477.
If the device 100 determines the type of single-finger touch input is tap (down), lift (up), tap (down), and a drag (477—Tap+half+drag), the device 100 at 487 first dismisses the selection, then performs actions similar to those performed at step 471, such as selecting a word closest to the touch input and expanding the selection while dragging. If the device 100 determines the type of single-finger touch input is a tap (477—Tap), then the device 100 can perform one of three different options, depending on the embodiment.
In some embodiments, one of the options (Tap Option 3, 489) is to do nothing, such that the device 100 does not respond to the single-finger tap in accordance with a determination that the tap is located on a selection (e.g., a selected word). In some embodiments, another option includes dismissing the selection at 485 (Tap Option 1) and performing additional actions (e.g., steps 481-484 in
As used herein, a hold and drag gesture is a stationary (or substantially stationary) pressing gesture by a single finger at a location in the content displayed on the touch screen, immediately followed by a drag. For example, a single-finger contact that moves less than a predefined amount (e.g., 1-2 mm) in a predefined time (e.g., 0.2-0.5 seconds) immediately followed by a drag is a hold and drag gesture. In response to recognizing the hold and drag gesture, as shown in
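A sketch of this hold-and-drag classification, using the movement and timing bounds quoted above (about 2 mm and 0.2-0.5 seconds); the sample format and the specific constants chosen within those ranges are assumptions.

```swift
struct Sample { var x, y: Double; var time: Double }   // millimeters, seconds

// A contact that stays within `holdRadius` for at least `holdDuration`
// before moving is a "hold and drag"; one that moves early is a plain drag.
func classify(_ samples: [Sample],
              holdRadius: Double = 2.0,
              holdDuration: Double = 0.3) -> String {
    guard let first = samples.first else { return "none" }
    for s in samples {
        let dist = ((s.x - first.x) * (s.x - first.x)
                  + (s.y - first.y) * (s.y - first.y)).squareRoot()
        if dist > holdRadius {
            // Moved: was the contact stationary long enough first?
            return s.time - first.time >= holdDuration ? "hold and drag" : "drag"
        }
    }
    return "hold"                      // never moved past the radius
}

let holdThenDrag = [Sample(x: 0, y: 0, time: 0.0),
                    Sample(x: 0.5, y: 0.2, time: 0.2),
                    Sample(x: 8, y: 1, time: 0.4)]
let immediateDrag = [Sample(x: 0, y: 0, time: 0.0),
                     Sample(x: 9, y: 0, time: 0.05)]
print(classify(holdThenDrag))    // hold and drag
print(classify(immediateDrag))   // drag
```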
As mentioned above, in some embodiments, instead of moving the selection 528-1, a ghost selection 528 is displayed in response to the detection of the single-finger dragging. As shown in
In some embodiments, the ghost selection 528-2 appears at the location of the current position of the selection 528-1 as soon as the device 100 detects the single-finger dragging, and the start-point object 526-1 and the end-point object 530-1 can animate to attract the user's attention to the appearance of the ghost selection 528-2. While dragging, the ghost selection 528-2 moves in the direction of the single-finger movement to indicate the position to which the selection will be moved when the user lifts or releases the contact (e.g., at the lift-off event). The ghost selection 528-2 may include a ghost start-point object 526-2 and a ghost end-point object 530-2. Both the ghost start-point object 526-2 and the ghost end-point object 530-2 may have different appearances from the start-point object 526-1 and the end-point object 530-1. On release of the single-finger drag, the selection moves to the location of the ghost selection 528-2. At release of the contacts, the device 100 ceases the display of the ghost selection 528-2 along with the ghost markers 526-2 and 530-2 as shown in
In some embodiments, similar to scrolling in response to a drag gesture, the device 100 scrolls the content displayed in the content region in response to a swipe gesture in accordance with the direction of the swipe. For example, an upward swipe scrolls the content upward, a downward swipe scrolls the content downward, a left swipe scrolls the content to the left, and likewise, a right swipe scrolls the content to the right.
In response to the tap+half+drag gesture, the device 100 selects the word closest to the starting location 524-1 (e.g., selecting the word “nation”), then expands the selection to include additional content beyond the selected word in the direction 532 towards the second location 524-2. The end of the selected portion of the content 530 is increased as the finger contact moves forward through text on the touch screen 112, as illustrated in
In some embodiments, the start-point object 526 and the end-point object 530 are displayed moving while the finger is dragging to indicate the expanding selection 528. The selection 528 expands one word at a time if the finger drags quickly, and expands one character at a time if the finger drags slowly.
In some embodiments, the device 100 can alternatively move the selected content in response to detecting three substantially simultaneous touch inputs, e.g., a three-finger touch input.
It should be noted that the three substantially simultaneous touch inputs can be located anywhere on the touch screen display 112. For example, a user can touch the touch screen display 112 with three fingers close together or three fingers spread out. Regardless of the locations of the three touch inputs on the touch screen display 112, the device 100 can detect the three substantially simultaneous touch inputs and perform actions such as moving a selection in response to detecting a drag of the three fingers following the initial three substantially simultaneous touch inputs.
In some embodiments, the device 100 further detects that the three-finger touch input remains motionless on the touch screen display for a predetermined duration followed by a continuous drag gesture (e.g., dragging starts within 0.5 or 0.75 seconds after remaining motionless) from the location of the three-finger touch input in a direction 532 across the touch screen display from the first region 524-1 to a second region 524-2. In other embodiments, a delay (e.g., the predetermined duration) is not required for moving the selection.
In response to detecting the continuous touch movement, the device 100 moves the selection 528-1 across document 500 to a new location 528-2 as shown in
As mentioned above, in some embodiments, instead of moving the selection 528-1, a ghost selection 528 is displayed in response to the detection of the three-finger dragging. As shown in
In some embodiments, the ghost selection 528-2 appears at the location of the current position of the selection 528-1 as soon as the device 100 detects the three-finger dragging, and the start-point object 526-1 and the end-point object 530-1 can be animated to attract the user's attention to the appearance of the ghost selection 528-2. While dragging, the ghost selection 528-2 moves in the direction of the three-finger movement to indicate the position to which the selection will be moved when the user lifts or releases the contact (e.g., at the lift-off event). The ghost selection 528-2 may include a ghost start-point object 526-2 and a ghost end-point object 530-2. Both the ghost start-point object 526-2 and the ghost end-point object 530-2 may have different appearances from the start-point object 526-1 and the end-point object 530-1. On release of the three-finger drag, the selection moves to the location of the ghost selection 528-2. At release of the contacts, the device 100 ceases the display of the ghost selection 528-2 along with the ghost markers 526-2 and 530-2 as shown in
It should be understood that the particular order in which the operations described above have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein.
In accordance with some embodiments, an electronic device 800 is configured in accordance with the principles of the various described embodiments.
In some embodiments, the electronic device 800 includes a display unit 802 configured to display content of an electronic document and a cursor within the electronic document; a touch-sensitive surface unit 804 configured to receive user contacts; and a processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes a detecting unit 810 and a selecting unit 812.
The processing unit 808 is configured to: detect two substantially simultaneous touch inputs anywhere on the touch-sensitive surface unit 804 (e.g., with the detecting unit 810); and in response to detecting the two substantially simultaneous touch inputs: select a portion of the content in the document closest to the cursor (e.g., with the selecting unit 812); and display the portion of the content as selected content (e.g., with the display unit 802).
In some embodiments, the electronic device 800 includes the display unit 802 configured to display content of an electronic document and a cursor at a current location within the content; the touch-sensitive surface unit 804 configured to receive user contacts; and the processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes the detecting unit 810, a determining unit 816, and a moving unit 814.
The processing unit 808 is configured to: detect two substantially simultaneous touch inputs at a first region on the touch-sensitive surface unit 804 (e.g., with the detecting unit 810); upon determining that the two substantially simultaneous touch inputs remain in contact with the touch-sensitive surface unit 804 (e.g., with the detecting unit 810), detect a continuous touch gesture from a location of the two substantially simultaneous touch inputs across the touch-sensitive surface unit 804 (e.g., with the detecting unit 810) from the first region to a second region; and in response to detecting the continuous touch gesture: move the cursor from the current location to a new location in a direction of the continuous touch gesture (e.g., with the moving unit 814).
In some embodiments, the electronic device 800 includes the display unit 802 configured to display text and a cursor at a line within the text; the touch-sensitive surface unit 804 configured to receive user contacts; and the processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes the detecting unit 810 and the moving unit 814.
The processing unit 808 is configured to: detect a two-finger swipe gesture on the touch-sensitive surface unit 804 in a direction at least partially parallel to the line and towards an edge of the touch-sensitive surface unit 804 (e.g., with the detecting unit 810); and in response to detecting the two-finger swipe gesture: move the cursor to a distal point of the text (e.g., with the moving unit 814).
In some embodiments, the electronic device 800 includes the display unit 802 configured to display content of an electronic document and a cursor within the electronic document; the touch-sensitive surface unit 804 configured to receive user contacts; and the processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes the detecting unit 810 and the selecting unit 812.
The processing unit 808 is configured to: detect a touch input on the touch-sensitive surface unit 804 (e.g., with the detecting unit 810), wherein the touch input is located on a word within the content; and in response to detecting the touch input: select the word (e.g., with the selecting unit 812); and display a command display area adjacent to the selected word, wherein the command display area includes an icon for cutting the selected word, an icon for copying the selected word, and an icon for pasting previously selected content.
In some embodiments, the electronic device 800 includes the display unit 802 configured to display content of an electronic document and a selection of the content within the electronic document; the touch-sensitive surface unit 804 configured to receive user contacts; and the processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes the detecting unit 810, the determining unit 816, and the moving unit 814.
The processing unit 808 is configured to: detect a single touch input on the touch-sensitive surface unit 804 (e.g., with the detecting unit 810) at a location over the selection; in response to detecting the single touch input at the location, display a set of options related to the selection (e.g., with the display unit 802); determine if the single touch input remains at the location for a predetermined amount of time followed by a continuous touch gesture away from the location on the touch-sensitive surface unit 804 (e.g., with the determining unit 816); and in response to detecting the single touch input remaining at the location for the predetermined amount of time followed by the continuous touch gesture away from the location, move the selection to a different location in a direction of the continuous touch gesture (e.g., with the moving unit 814).
In some embodiments, the electronic device 800 includes the display unit 802 configured to display content of an electronic document and a selection of the content within the electronic document; the touch-sensitive surface unit 804 configured to receive user contacts; and the processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes the detecting unit 810, the determining unit 816, and the moving unit 814.
The processing unit 808 is configured to: detect three substantially simultaneous touch inputs at locations anywhere on the touch-sensitive surface unit 804 (e.g., with the detecting unit 810); determine if the three substantially simultaneous touch inputs are followed by three continuous touch gestures away from the locations on the touch-sensitive surface unit 804 (e.g., with the determining unit 816); and in response to detecting the three continuous touch gestures, move the selection to a different location in a direction of the continuous touch gestures (e.g., with the moving unit 814).
In some embodiments, the electronic device 800 includes the display unit 802 configured to display content of an electronic document, wherein the content includes at least one line of text comprising at least two words; the touch-sensitive surface unit 804 configured to receive user contacts; and the processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes the detecting unit 810, the determining unit 816, and the selecting unit 812.
The processing unit 808 is configured to: detect a touch input on the content (e.g., with the detecting unit 810); and in response to detecting the touch input: determine a distance of the touch input to the closest space between the two words within the electronic document (e.g., with the determining unit 816); and in accordance with a determination that the distance is greater than a predetermined threshold distance, select a word within the electronic document closest to the touch input and display the selection (e.g., with the display unit 802).
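As a rough illustration of this distance test, the following Swift sketch selects the closest word only when the touch lands farther than a threshold from the nearest space; otherwise it returns nil (e.g., so a cursor could be placed instead). The character-index model and the default threshold value are assumptions.

```swift
// Sketch: select the word under a touch only when the touch is far from
// the nearest space; space-delimited words are a simplifying assumption.

func wordSelection(atIndex touchIndex: Int,
                   in text: String,
                   thresholdDistance: Int = 2) -> Range<String.Index>? {
    let chars = Array(text)
    guard touchIndex >= 0, touchIndex < chars.count else { return nil }

    // Distance (in characters) from the touch to the closest space.
    let spaceIndices = chars.indices.filter { chars[$0] == " " }
    let distanceToSpace = spaceIndices.map { abs($0 - touchIndex) }.min() ?? Int.max
    guard distanceToSpace > thresholdDistance else { return nil }

    // Expand outward from the touch to the enclosing word's boundaries.
    var lo = touchIndex, hi = touchIndex
    while lo > 0, chars[lo - 1] != " " { lo -= 1 }
    while hi + 1 < chars.count, chars[hi + 1] != " " { hi += 1 }
    let start = text.index(text.startIndex, offsetBy: lo)
    let end = text.index(text.startIndex, offsetBy: hi + 1)
    return start..<end
}
```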
In some embodiments, the electronic device 800 includes the display unit 802 configured to display content of an electronic document and a soft keyboard in focus; the touch-sensitive surface unit 804 configured to receive user contacts; and the processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes the detecting unit 810 and a changing appearance unit 818.
The processing unit 808 is configured to: detect two substantially simultaneous touch inputs on the soft keyboard (e.g., with the detecting unit 810); and in response to detecting the two substantially simultaneous touch inputs on the soft keyboard, blur the soft keyboard (e.g., with the changing appearance unit 818).
In some embodiments, the electronic device 800 includes the display unit 802 configured to display content of an electronic document and a cursor within the electronic document; the touch-sensitive surface unit 804 configured to receive user contacts; and the processing unit 808 coupled to the display unit 802 and the touch-sensitive surface unit 804. In some embodiments, the processing unit 808 includes the detecting unit 810 and the changing appearance unit 818.
The processing unit 808 is configured to: display, on the display unit 802, a soft keyboard having multiple keys, each having a respective alphanumeric character of a plurality of alphanumeric characters; detect two substantially simultaneous touch inputs on the soft keyboard (e.g., with the detecting unit 810); and in response to detecting the two substantially simultaneous touch inputs on the soft keyboard, change the appearance of the soft keyboard to a changed appearance (e.g., with the changing appearance unit 818).
Turning to
In
In
When the keyboard 921 serves as an onscreen touch pad or track pad, in response to detecting finger movement 932 on the keyboard 921, in some embodiments, a floating cursor 922-1 moves across the touch screen display 112 in accordance with the movement of one or more finger contacts, and a ghost cursor 922-2 is displayed offset from the real cursor 922-1. In some embodiments, the ghost cursor 922-2 indicates where the cursor will be located after a lift-off of the finger contact 924. In some embodiments, the ghost cursor 922-2 is a modified version of the original cursor 922 displayed on the screen (e.g., the ghost cursor 922-2 is grey, while the original cursor 922 as shown in
In some embodiments, when the touch input for entering the text selection mode is detected, the original cursor 922 changes its appearance, while the floating cursor 922-1 springs into its initial location from the original text cursor 922. In some embodiments, the initial location of the floating cursor 922-1 is slightly offset from the location of the ghost cursor 922-2 (e.g., slightly higher in
In some embodiments, when the user subsequently moves the finger contact, the floating cursor 922-1 moves with the finger contact 924 in a continuous fluid motion in the content presentation region 902, while the ghost cursor 922-2 moves in discrete jumps from one permitted insertion location to another permitted insertion location to indicate the final location of the cursor if the touch input were terminated at that moment. For example, in
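One way to picture this continuous/discrete split is the hedged Swift sketch below: the floating cursor tracks the finger exactly, while the ghost cursor is always the nearest permitted insertion position, so it appears to jump between positions. The Point type and the nearest-neighbor snap are illustrative assumptions.

```swift
// Sketch: a floating cursor that follows the finger continuously, and a
// ghost cursor that snaps to the nearest permitted insertion position.

struct Point { var x: Double; var y: Double }

struct TwoCursorModel {
    var floating: Point                  // floating cursor (e.g., 922-1)
    var permittedInsertions: [Point]     // insertion points from text layout

    /// Ghost cursor (e.g., 922-2): nearest permitted insertion position.
    var ghost: Point { nearestInsertion(to: floating) }

    /// Finger moved: only the floating cursor moves smoothly.
    mutating func fingerMoved(dx: Double, dy: Double) {
        floating.x += dx
        floating.y += dy
    }

    /// Lift-off: the cursor ends up wherever the ghost is now.
    func finalCursorOnLiftOff() -> Point { ghost }

    private func nearestInsertion(to p: Point) -> Point {
        permittedInsertions.min(by: { a, b in
            let da = (a.x - p.x) * (a.x - p.x) + (a.y - p.y) * (a.y - p.y)
            let db = (b.x - p.x) * (b.x - p.x) + (b.y - p.y) * (b.y - p.y)
            return da < db
        }) ?? p
    }
}
```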
In
In
In
In
In
In
In
In
In some embodiments, as shown in
In some embodiments, the criteria for selecting the first predetermined unit of the text include a criterion that is met when the second increase in characteristic intensity of the contact above the predefined intensity threshold (e.g., ITD) occurs within a predetermined time period after the first increase in characteristic intensity of the contact above the predefined intensity threshold (e.g., ITD); that is, a quick double press triggers selection of the first predetermined unit of text input. For example, when the duration between times T4 and T5 is less than a threshold, the criteria for selecting the first predetermined unit of text are met.
In some embodiments, instead of just two increases in the characteristic intensity of the contact above the predefined intensity threshold (e.g., ITD), three such increases are detected and used to trigger selection of a second predetermined unit of text input (e.g., a quick triple press triggers selection of a sentence, whereas a quick double press triggers selection of a word).
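The following Swift sketch shows one plausible way to count quick intensity peaks and map the count to a selection unit, as described in the two paragraphs above. The 0.3-second window is an assumed value (the disclosure only requires a predetermined time period), and ITD is represented here by intensityThreshold.

```swift
import Foundation

// Sketch: count intensity peaks above the deep-press threshold that arrive
// within a short window; two quick peaks select a word, three a sentence.

enum SelectionUnit { case word, sentence }

struct DeepPressCounter {
    let intensityThreshold: Double            // stands in for ITD
    let maxInterval: TimeInterval = 0.3       // assumed "quick press" window
    private var lastPeakTime: TimeInterval?
    private var peakCount = 0

    mutating func registerPeak(intensity: Double,
                               at time: TimeInterval) -> SelectionUnit? {
        guard intensity > intensityThreshold else { return nil }
        if let last = lastPeakTime, time - last <= maxInterval {
            peakCount += 1
        } else {
            peakCount = 1                     // too slow: start a new sequence
        }
        lastPeakTime = time
        switch peakCount {
        case 2:  return .word                 // quick double press
        case 3:  return .sentence             // quick triple press
        default: return nil
        }
    }
}
```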
In
Turning to
In
At a time prior to T0, no contact is present or detected on the touch screen display 112. At T0, the device detects that the intensity of the press input increases above the contact intensity threshold IT0. In response to detecting the finger contact 925 on the editable content 900, the device moves a text selection indicator (e.g., a cursor 922) to the location that corresponds to the finger contact (e.g., between the characters “r” and “o” of the word “from”).
In
In
In
In
In the corresponding intensity diagram, the movement of the finger from the middle of the word “nation” to the middle of the word “under” corresponds to the segment between times T3 and T4. As shown in
In
In
In
In some embodiments, if a third deep press input by the contact 925 is detected after the further movement of the contact to move the cursor(s), then the selected content (e.g., “ation un” in
In some embodiments, the third deep press input does not clear the previously selected content, but the third deep press input by the contact does initiate additional text selection such that movement of the contact 925 after the third deep press input selects an additional portion of text in accordance with the movement. In these embodiments, a user is able to select multiple portions of content with a single continuous contact by repeatedly: deep pressing to initiate text selection, moving to select text, deep pressing again to complete text selection, and moving without selecting text to a next position on the touch-sensitive surface that corresponds to a desired starting location for the next text selection.
On the other hand, if lift-off of the contact 925 is detected without detecting a third deep press input by the contact 925, then the selected content “ation un” and the action menu 940 remain displayed.
In some embodiments, in
Turning to
In
In
In
In
As described below, the method 1000 provides an efficient way to manipulate a cursor. The method reduces the cognitive burden on a user when using a cursor, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to manipulate a cursor faster and more efficiently conserves power and increases the time between battery charges.
The device concurrently displays (1004) an onscreen keyboard (e.g., keyboard 921,
In some embodiments, the one or more criteria for entering the text selection mode include (1012) the touch input including a two-finger drag gesture over the onscreen keyboard (
In accordance with a determination that the touch input satisfies the one or more criteria for entering the text selection mode, the device concurrently displays (1014), in the content presentation region 902, a first cursor (e.g., ghost cursor 922-2,
In some embodiments, the electronic device has (1016) one or more sensors to detect intensity of contacts with the touch-sensitive display, the touch input on the touch-sensitive display includes an input by a contact on the onscreen keyboard, and the one or more criteria for entering the text selection mode include the contact on the onscreen keyboard having an intensity that exceeds a predetermined deep press intensity threshold (
In some embodiments, in accordance with the determination that the touch input satisfies the one or more criteria for entering the text selection mode, the device visually obscures (1018) keys on the onscreen keyboard (
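A compact Swift sketch of the mode-entry decision described in the preceding paragraphs follows; it treats the two alternative criteria (a two-finger drag over the keyboard, or a single deep press on the keyboard) as a simple predicate. The KeyboardTouch struct is an assumed simplification of real touch state, not part of the disclosure.

```swift
// Sketch: the two alternative mode-entry criteria as a single predicate.

struct KeyboardTouch {
    var fingerCount: Int
    var isDragging: Bool
    var peakIntensity: Double
    var isOverKeyboard: Bool
}

func entersTextSelectionMode(_ touch: KeyboardTouch,
                             deepPressThreshold: Double) -> Bool {
    guard touch.isOverKeyboard else { return false }
    let twoFingerDrag = touch.fingerCount == 2 && touch.isDragging
    let deepPress = touch.fingerCount == 1
        && touch.peakIntensity > deepPressThreshold
    return twoFingerDrag || deepPress
}

// Once this returns true, the keyboard is visually obscured (blurred, made
// semitransparent, or redrawn as a touchpad) per operation 1018; that
// rendering step is not modeled here.
```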
In some embodiments, the second location 922-1 (
In some embodiments, once the initial position of the second cursor (e.g., floating cursor 922-1,
In some embodiments, as shown in
In some embodiments, one of the first and second cursors is (1030) already displayed in the content presentation region 902 before both of the first and second cursors are concurrently displayed in the content presentation region 902. In some embodiments, the floating cursor 922-1 is already displayed when the touch input is determined to satisfy the one or more criteria for entering the text selection mode. In some embodiments, the insertion cursor 922-2 is already displayed when the touch input is determined to satisfy the one or more criteria for entering the text selection mode.
In some embodiments, the device detects (1032) movement of one or more contacts of the touch input, and moves (1034) the second cursor (e.g., the floating cursor 922-1) within the content presentation region in accordance with the movement of the one or more contacts of the touch input (e.g., the movement of the floating cursor smoothly follows the movement of a finger contact in terms of speed and direction). In some embodiments, the device moves (1036) the first cursor based on the movement of the second cursor, and movement of the first cursor includes discrete movements between permitted insertion positions in the content presentation region. For example, when the user moves the finger contact in the touch input (
In some embodiments, the device detects (1038) a lift-off of the touch input after detecting the movement of the one or more contacts of the touch input (
In some embodiments, in response to detecting the lift-off of the touch input, the device maintains (1042) display of the first cursor at a respective permitted insertion position reached by the first cursor after the discrete movements of the first cursor (e.g., the middle of the word “created” in
In some embodiments, in response to detecting the lift-off of the touch input, the device ceases (1044) to display the first cursor (
In some embodiments, the onscreen keyboard is obscured in accordance with a determination that the touch input satisfies the one or more criteria for entering the text selection mode, and in response to detecting the lift-off of the touch input, the device restores (1046) display of the onscreen keyboard (
In some embodiments, the device has (1048) one or more sensors to detect intensity of contacts with the touch-sensitive display, and using the one or more sensors, the device detects that an intensity of a contact in the touch input exceeds a predetermined intensity threshold (e.g., ITD). After detecting that the intensity of the contact in the touch input exceeds the predetermined intensity threshold, the device detects (1050) movement of the contact in the touch input. In response to detecting the movement of the contact in the touch input, after detecting that the intensity of the contact in the touch input exceeds the predetermined intensity threshold, the device selects (1052) a portion of the text input in accordance with the movement of the contact in the touch input (e.g., selecting “eated” in accordance with the finger movement 932,
In some embodiments, the device detects (1056) lift-off of the contact in the touch input after selecting the portion of the text input in accordance with the movement of the contact in the touch input. In response to detecting the lift-off of the contact in the touch input, the device confirms (1058) selection of the portion of the text input (
In some embodiments, after selecting the portion of the text input, while the portion of the text input is selected, the device detects (1060) an intensity of the contact in the touch input that exceeds the predetermined threshold (e.g., ITD). In response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold while the portion of the text input is selected, the device clears (1062) selection of the portion of the text input (
In some embodiments, after selecting the portion of the text input, while the portion of the text input is selected, the device detects (1064) an intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by lift-off of the contact without further movement of the contact. In response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by lift-off of the contact without further movement of the contact, the device confirms (1066) selection of the portion of the text input (
In some embodiments, after selecting the portion of the text input, while the portion of the text input is selected, the device detects (1068) an intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by further movement of the contact. In response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by the further movement of the contact, the device clears (1070) selection of the portion of the text input (
In some embodiments, the device has one or more sensors to detect intensity of contacts with the touch-sensitive display, and using the one or more sensors, in the text selection mode, the device detects (1076) a first local intensity peak in the touch input followed by a second local intensity peak in the touch input that both exceed a predetermined intensity threshold (e.g., ITD). In response to detecting the first local intensity peak followed by the second local intensity peak that both exceed the predetermined intensity threshold, the device selects (1078) a first predetermined unit (e.g., a word as shown in
In some embodiments, after detecting the first local intensity peak followed by the second local intensity peak, the device detects (1080) a third consecutive local intensity peak in the touch input that exceeds the predetermined intensity threshold (e.g., a triple deep press). In response to detecting the three consecutive local intensity peaks in the touch input that all exceed the predetermined deep press intensity threshold, the device selects (1082) a second predetermined unit (e.g., a sentence,
It should be understood that the particular order in which the operations in
As described below, the method 1100 provides an efficient way to select content with press and movement inputs by a single continuous contact on the touch-sensitive surface, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to select content faster and more efficiently conserves power and increases the time between battery charges.
While a contact (e.g., a finger contact or a stylus contact) is detected on the touch-sensitive surface, the device concurrently displays (1104) on the display content (e.g., editable content in
In some embodiments, the text selection indicator is a cursor (e.g., cursor 922) that moves within editable content (
The device detects (1106) a first press input by the contact followed by movement of the contact across the touch-sensitive surface that corresponds to movement of at least a portion of the text selection indicator from the first location to a second location on the display (e.g., moving a cursor to the second location on the display or moving an edge of a selection box to the second location on the display, where the edge of the selection box corresponds to one end of the selected content). In some embodiments, the device detects the first press input and the subsequent movement by detecting an increase in intensity of the contact above a predetermined intensity threshold (e.g., a deep press intensity threshold ITD), followed by detecting a decrease in intensity of the contact to an intensity that remains above a predetermined minimum intensity value (e.g., a light press intensity threshold or a contact detection threshold). For example, as shown in
In response to detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface, the device selects (1110) content between the first location and the second location (e.g., selecting content for an editing operation, such as copying). In some embodiments, the selected content (e.g., editable text or read-only text) is enclosed by a highlighted area (e.g., area 928 in
While the content between the first location and the second location is selected (e.g., after moving at least the portion of the text selection indicator to the second location), the device detects (1112) a second press input by the contact on the touch-sensitive surface. For example, as shown in
In response to detecting the second press input by the contact on the touch-sensitive surface, the device performs (1114) a text selection operation, associated with the content between the first location and the second location, in accordance with the second press input. In some embodiments, the first press input, the movement across the touch-sensitive surface, and the second press input are made with a single continuous contact with the touch-sensitive surface (e.g., contact 924, contact 925, or contact 927).
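Method 1100's single-continuous-contact flow can be pictured as a small state machine, sketched below in Swift under assumed types. Deep-press edge detection and mapping a contact to a text position are taken as given; only the press-move-press sequencing is modeled, and the state names are illustrative.

```swift
// Sketch: first deep press anchors a selection, movement extends it, and a
// second deep press stops selecting while keeping the selected range.

enum SelectionPhase {
    case idle
    case selecting(anchor: Int)
    case selected(Range<Int>)
}

struct PressMoveSelectMachine {
    private(set) var phase: SelectionPhase = .idle

    /// Call on each rising edge of intensity above the deep-press threshold
    /// (edge detection itself is outside this sketch).
    mutating func deepPress(atTextPosition pos: Int) {
        switch phase {
        case .idle:
            phase = .selecting(anchor: pos)    // first press: start selecting
        case .selecting(let anchor):
            // Second press: stop selection and maintain the selected range.
            phase = .selected(min(anchor, pos)..<max(anchor, pos))
        case .selected:
            phase = .selecting(anchor: pos)    // begin a new selection
        }
    }

    /// While selecting, movement of the contact extends the visible selection.
    func visibleRange(contactPosition pos: Int) -> Range<Int>? {
        switch phase {
        case .selecting(let anchor): return min(anchor, pos)..<max(anchor, pos)
        case .selected(let range):   return range
        case .idle:                  return nil
        }
    }
}
```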
In some embodiments, in response to detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface: the device displays (1116) at least the portion of the text selection indicator at the second location within the content. For example, the device displays a cursor at the second location on the display (
In some embodiments, the text selection operation includes (1118) stopping selection of content at the second location and maintaining selection of the content between the first location and the second location (e.g.,
In some embodiments, after detecting the second press input and while the content between the first location and the second location remains selected, the device detects (1120) lift-off of the contact, and in response to detecting the lift-off of the contact, the device displays an action menu. For example, after detecting the second press input at a location on the touch-sensitive surface that corresponds to the second location on the display (e.g., in the middle of the word “under”), the device detects lift-off of the contact from that location on the touch-sensitive surface without detecting further movement of the contact. In response, the device displays an action menu bar 940 or other area that shows actions that can be performed on the selected portion of the text input (e.g., copying, defining, cutting, pasting, etc.) for the selected content between the first location and the second location.
In some embodiments, after detecting the second press input by the contact on the touch-sensitive surface and stopping the selection of the content at the second location, the device detects (1124) further movement of the contact. In response to detecting the further movement of the contact, the device displays at least a portion of the text selection indicator at a third location within the content. For example, in
In some embodiments, in response to detecting the further movement of the contact, the device cancels (1126) selection of content between the first location and the second location without selecting content between the second location and the third location (
In some embodiments, the text selection operation includes (1128) cancelling selection of content between the first location and the second location (
In some embodiments, after detecting the second press input by the contact on the touch-sensitive surface and canceling the selection of content between the first location and the second location, the device detects (1130) further movement of the contact. In response to detecting the further movement of the contact, the device selects content between the second location and a third location (
In some embodiments, while the content between the second location and the third location is selected, the device detects (1132) lift-off of the contact (e.g., without detecting further movement of the contact). In response to detecting the lift-off of the contact while the content between the second location and the third location is selected, the device stops selection of the content at the third location and maintains selection of the content between the second location and the third location (as shown in
In some embodiments, before displaying the text selection indicator at the first location within the content, the device detects (1134) an initial press input by the contact on the touch-sensitive surface. In response to detecting the initial press input, the device displays the text selection indicator at an initial location within the content that corresponds to a location of the initial press input on the touch-sensitive surface. In some embodiments, the initial press input is (1136) detected at a location on the touch-sensitive surface that corresponds to a location of the content on the display. When the initial press input is detected at a location that corresponds to the first location within the content, the text selection indicator (e.g., the cursor 922) is displayed at the first location within the content. If the initial press input is detected at a location that does not correspond to the first location within the content, the text selection indicator is displayed at the first location within the content when the contact, after making the initial press input, moves to a location on the touch-sensitive surface that corresponds to the first location. For example, for a “first location” of position marker 945 in
In some embodiments, the display is a touch-sensitive display that includes the touch-sensitive surface, and the device concurrently displays (1138), on the touch-sensitive display, the content and an onscreen keyboard. In some embodiments, the initial press input is detected on the onscreen keyboard (
In some embodiments, the display is a touch-sensitive display that includes the touch-sensitive surface, and the device concurrently displays (1140), on the touch-sensitive display, the content and an onscreen keyboard. Before displaying the text selection indicator at the first location within the content, the device detects (1142) a multi-contact drag input (e.g., a two-finger drag input as shown in
In some embodiments, the content includes (1144) editable content and the text selection indicator includes a cursor (
In some embodiments, the content includes (1150) read-only content and the text selection indicator includes a selection area (e.g., a rectangular selection box 929,
In some embodiments, the device foregoes (1156) performing the text selection operation, in response to detecting the second press input, in accordance with a determination that the second press input is accompanied by simultaneous movement of the contact across the touch-sensitive surface. For example, the text selection operation is not performed when the device detects the second press input accompanied by simultaneous movement of the contact. In other words, movement cancels the operation that is typically triggered by the second press input.
In some embodiments, when the text is editable text, the text selection indicator is a cursor and selecting content between the first location and the second location includes (1158) expanding the selection character-by-character in accordance with movement of the contact on the touch-sensitive surface (
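As an illustration of this character-by-character versus word-by-word rule, the following hedged Swift sketch expands a selection range over an array of characters; the space-delimited word model is a simplifying assumption, and the forward-only word expansion is a deliberate simplification of the behavior described above.

```swift
// Sketch: selection growth differs by content type. Editable text extends one
// character at a time; read-only text extends to the next word boundary.

func expandSelection(_ selection: Range<Int>,
                     toward target: Int,
                     in text: [Character],
                     editable: Bool) -> Range<Int> {
    if editable {
        // Character-by-character: move the nearer edge one position toward target.
        if target >= selection.upperBound, selection.upperBound < text.count {
            return selection.lowerBound..<(selection.upperBound + 1)
        }
        if target < selection.lowerBound, selection.lowerBound > 0 {
            return (selection.lowerBound - 1)..<selection.upperBound
        }
        return selection
    } else {
        // Word-by-word: extend forward to the end of the next word.
        var upper = selection.upperBound
        while upper < text.count, text[upper] == " " { upper += 1 }   // skip spaces
        while upper < text.count, text[upper] != " " { upper += 1 }   // take the word
        return selection.lowerBound..<upper
    }
}
```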
It should be understood that the particular order in which the operations in
In accordance with some embodiments,
In some embodiments, the electronic device 1200 includes a display unit 1202 configured to concurrently display an onscreen keyboard and a content presentation region on the display unit 1202, wherein the content presentation region displays text input received from the onscreen keyboard; a touch-sensitive surface unit 1204 configured to receive user touch inputs; one or more sensor units 1206 to detect intensity of contacts with the touch-sensitive surface unit 1204; and a processing unit 1208 coupled to the display unit 1202, the touch-sensitive surface unit 1204, and the one or more sensor units 1206. In some embodiments, the processing unit 1208 includes a detecting unit 1210, a determining unit 1216, an obscuring unit 1218, a moving unit 1214, and a selecting unit 1212.
The processing unit 1208 is configured to: detect a touch input on the onscreen keyboard displayed on the touch-sensitive surface unit 1204 (e.g., with the detecting unit 1210); in response to detecting the touch input on the onscreen keyboard displayed on the display unit 1202, determine whether the touch input satisfies one or more criteria for entering a text selection mode (e.g., with the determining unit 1216); and in accordance with a determination that the touch input satisfies the one or more criteria for entering the text selection mode: concurrently display (e.g., with the display unit 1202), in the content presentation region, a first cursor at a first location and a second cursor at a second location that is different from the first location.
In some embodiments, the one or more criteria for entering the text selection mode include the touch input including a two-finger drag gesture over the onscreen keyboard.
In some embodiments, the device has one or more sensor units 1206 to detect intensity of contacts with the touch-sensitive display; the touch input on the touch-sensitive display includes an input by a contact on the onscreen keyboard, and the one or more criteria for entering the text selection mode include the contact on the onscreen keyboard having an intensity that exceeds a predetermined deep press intensity threshold.
In some embodiments, the processing unit 1208 is configured to: in accordance with the determination that the touch input satisfies the one or more criteria for entering the text selection mode: visually obscure keys on the onscreen keyboard (e.g., with the obscuring unit 1218).
In some embodiments, visually obscuring the keys on the onscreen keyboard includes applying a blurring effect to the onscreen keyboard (e.g., with the display unit 1202).
In some embodiments, visually obscuring the keys on the onscreen keyboard includes transforming the onscreen keyboard into an onscreen touchpad (e.g., with the display unit 1202).
In some embodiments, visually obscuring the keys on the onscreen keyboard includes making the onscreen keyboard semitransparent to partially reveal content lying underneath the onscreen keyboard (e.g., with the display unit 1202).
In some embodiments, the second location is based on a location of an initial contact in the touch input; and the first location is a permitted insertion position in the content presentation region that is based on the second location.
In some embodiments, the first location is an insertion position at which the first cursor is located when the touch input is determined to satisfy the one or more criteria for entering the text selection mode; and the second location is displaced from the first location by a predetermined offset.
In some embodiments, one of the first and second cursors is already displayed in the content presentation region before both of the first and second cursors are concurrently displayed in the content presentation region.
In some embodiments, the processing unit 1208 is further configured to: detect movement of one or more contacts of the touch input (e.g., with the detecting unit 1210); and move the second cursor within the content presentation region in accordance with the movement of the one or more contacts of the touch input (e.g., with the moving unit 1214).
In some embodiments, the processing unit 1208 is further configured to: move the first cursor based on the movement of the second cursor, wherein movement of the first cursor includes discrete movements between permitted insertion positions in the content presentation region (e.g., with the moving unit 1214).
In some embodiments, the processing unit 1208 is further configured to: detect a lift-off of the touch input after detecting the movement of the one or more contacts of the touch input (e.g., with the detecting unit 1210); and in response to detecting the lift-off of the touch input: cease to display the second cursor (e.g., with the display unit 1202).
In some embodiments, the processing unit 1208 is further configured to: in response to detecting the lift-off of the touch input, maintain display of the first cursor at a respective permitted insertion position reached by the first cursor after the discrete movements of the first cursor (e.g., with the display unit 1202).
In some embodiments, the processing unit 1208 is further configured to: in response to detecting the lift-off of the touch input, cease to display the first cursor (e.g., with the display unit 1202).
In some embodiments, the onscreen keyboard is obscured in accordance with the determination that the touch input satisfies the one or more criteria for entering the text selection mode, and the processing unit 1208 is further configured to: in response to detecting the lift-off of the touch input, restore display of the onscreen keyboard (e.g., with the display unit 1202).
In some embodiments, the device has one or more sensor units 1206 to detect intensity of contacts with the touch-sensitive surface unit 1204, and the processing unit 1208 is further configured to: in the text selection mode, detect that an intensity of a contact in the touch input exceeds a predetermined intensity threshold (e.g., with the detecting unit 1210); after detecting that the intensity of the contact in the touch input exceeds the predetermined intensity threshold, detect movement of the contact in the touch input (e.g., with the detecting unit 1210); in response to detecting the movement of the contact in the touch input, after detecting that the intensity of the contact in the touch input exceeds the predetermined intensity threshold: select a portion of the text input in accordance with the movement of the contact in the touch input (e.g., with the selecting unit 1212).
In some embodiments, the selected portion of the text input begins at a position of the first cursor when the detected intensity of the contact in the touch input exceeded the predetermined intensity threshold.
In some embodiments, the processing unit 1208 is further configured to: detect lift-off of the contact in the touch input after selecting the portion of the text input in accordance with the movement of the contact in the touch input (e.g., with the detecting unit 1210); and, in response to detecting the lift-off of the contact in the touch input, confirm selection of the portion of the text input (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is further configured to: after selecting the portion of the text input, while the portion of the text input is selected, detect an intensity of the contact in the touch input that exceeds the predetermined threshold (e.g., with the detecting unit 1210); and, in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold while the portion of the text input is selected, clear selection of the portion of the text input (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is further configured to: after selecting the portion of the text input, while the portion of the text input is selected, detect an intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by lift-off of the contact without further movement of the contact (e.g., with the detecting unit 1210); and, in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by lift-off of the contact without further movement of the contact, confirm selection of the portion of the text input (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is further configured to: after selecting the portion of the text input, while the portion of the text input is selected, detect an intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by further movement of the contact (e.g., with the detecting unit 1210); and, in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by the further movement of the contact: clear selection of the portion of the text input (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is further configured to: in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by the further movement of the contact: start selection of a new portion of the text input in accordance with the further movement of the contact (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is further configured to: in response to detecting the intensity of the contact in the touch input that exceeds the predetermined threshold and that is followed by the further movement of the contact: further move the second cursor and the first cursor within the content presentation region in accordance with the further movement of the contact (e.g., with the moving unit 1214).
In some embodiments, the device has one or more sensor units (1206) to detect intensity of contacts with the touch-sensitive display, and the processing unit 1208 is further configured to: in the text selection mode, detect a first local intensity peak in the touch input followed by a second local intensity peak in the touch input that both exceed a predetermined intensity threshold (e.g., with the detecting unit 1210); and, in response to detecting the first local intensity peak followed by the second local intensity peak that both exceed the predetermined intensity threshold, select a first predetermined unit of the text input according to a current location of the first cursor (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is further configured to: after detecting the first local intensity peak followed by the second local intensity peak, detect a third consecutive local intensity peak in the touch input that exceeds the predetermined intensity threshold (e.g., with the detecting unit 1210); and in response to detecting the three consecutive local intensity peaks in the touch input that all exceed the predetermined deep press intensity threshold, select a second predetermined unit of the text input that is larger than and includes the first predetermined unit of the text input (e.g., with the selecting unit 1212).
In some embodiments, the electronic device 1200 includes a display unit 1202 configured to, while a contact is detected on the touch-sensitive surface unit 1204, concurrently display on the display unit 1202 content and a text selection indicator at a first location within the content; a touch-sensitive surface unit 1204 configured to receive user contacts; one or more sensor units 1206 to detect intensity of contacts with the touch-sensitive surface unit 1204; and a processing unit 1208 coupled to the display unit 1202, the touch-sensitive surface unit 1204, and the one or more sensor units 1206. In some embodiments, the processing unit 1208 includes a detecting unit 1210, a determining unit 1216, an obscuring unit 1218, a moving unit 1214, and a selecting unit 1212.
The processing unit 1208 is configured to: detect a first press input by the contact followed by movement of the contact across the touch-sensitive surface unit 1204 that corresponds to movement of at least a portion of the text selection indicator from the first location to a second location on the display unit 1202 (e.g., with the detecting unit 1210); in response to detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface unit 1204, select content between the first location and the second location (e.g., with the selecting unit 1212); while the content between the first location and the second location is selected, detect a second press input by the contact on the touch-sensitive surface unit 1204 (e.g., with the detecting unit 1210); and in response to detecting the second press input by the contact on the touch-sensitive surface unit 1204, perform a text selection operation, associated with the content between the first location and the second location, in accordance with the second press input, wherein the first press input, the movement across the touch-sensitive surface unit 1204, and the second press input are made with a single continuous contact with the touch-sensitive surface unit 1204 (e.g., with the selecting unit 1212).
In some embodiments, detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface unit 1204 includes: detecting an increase in intensity of the contact above a predetermined intensity threshold followed by detecting a decrease in intensity of the contact to an intensity that remains above a predetermined minimum intensity value (e.g., with the detecting unit 1210).
In some embodiments, the processing unit 1208 is configured to: in response to detecting the first press input by the contact followed by movement of the contact across the touch-sensitive surface unit 1204: display at least the portion of the text selection indicator at the second location within the content (e.g., with the display unit 1202).
In some embodiments, the text selection operation includes stopping selection of content at the second location and maintaining selection of the content between the first location and the second location (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is configured to: after detecting the second press input and while the content between the first location and the second location remains selected, detect lift-off of the contact (e.g., with the detecting unit 1210); and in response to detecting the lift-off of the contact, display an action menu for the selected content between the first location and the second location (e.g., with the display unit 1202).
In some embodiments, the processing unit 1208 is configured to: after detecting the second press input by the contact on the touch-sensitive surface unit 1204 and stopping the selection of the content at the second location, detect further movement of the contact (e.g., with the detecting unit 1210); and in response to detecting the further movement of the contact, display at least a portion of the text selection indicator at a third location within the content (e.g., with the display unit 1202).
In some embodiments, the processing unit 1208 is configured to: in response to detecting the further movement of the contact, cancel selection of content between the first location and the second location without selecting content between the second location and the third location (e.g., with the selecting unit 1212).
In some embodiments, the text selection operation includes cancelling selection of content between the first location and the second location.
In some embodiments, the processing unit 1208 is configured to: after detecting the second press input by the contact on the touch-sensitive surface unit and canceling the selection of content between the first location and the second location, detect further movement of the contact (e.g., with the detecting unit 1210); and, in response to detecting the further movement of the contact, select content between the second location and a third location (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is configured to: while the content between the second location and the third location is selected, detect lift-off of the contact (e.g., with the detecting unit 1210); and, in response to detecting the lift-off of the contact while the content between the second location and the third location is selected, stop selection of the content at the third location and maintain selection of the content between the second location and the third location (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is configured to: before displaying the text selection indicator at the first location within the content, detect an initial press input by the contact on the touch-sensitive surface unit 1204 (e.g., with the detecting unit 1210); and in response to detecting the initial press input, enable display of the text selection indicator at an initial location within the content that corresponds to a location of the initial press input on the touch-sensitive surface unit 1204.
In some embodiments, the display unit 1202 is a touch-sensitive display that includes the touch-sensitive surface unit 1204, and the processing unit 1208 is configured to: concurrently display, on the touch-sensitive display, the content and an onscreen keyboard, wherein the initial press input is detected on the onscreen keyboard.
In some embodiments, the initial press input is detected at a location on the touch-sensitive surface unit 1204 that corresponds to a location of the content on the display unit 1202.
In some embodiments, the display unit 1202 is a touch-sensitive display that includes the touch-sensitive surface unit 1204, and the processing unit 1208 is configured to: concurrently enable display of, on the touch-sensitive display, the content and an onscreen keyboard; before displaying the text selection indicator at the first location within the content, detect a multi-contact drag input on the onscreen keyboard (e.g., with the detecting unit 1210); and, in response to detecting the multi-contact drag input on the onscreen keyboard, enable display of the text selection indicator at an initial location within the content based on a location of the multi-contact drag input on the onscreen keyboard.
In some embodiments, the content includes editable content and the text selection indicator includes a cursor.
In some embodiments, the processing unit 1208 is configured to: enable display of a magnifying loupe that displays a magnified version of the cursor and a region surrounding the cursor.
In some embodiments, selecting the content between the first location and the second location includes: moving the cursor one character space at a time in response to detecting the movement of the contact across the touch-sensitive surface unit 1204 (e.g., with the moving unit 1214); and selecting one additional character at a time in accordance with the movement of the cursor (e.g., with the selecting unit 1212).
In some embodiments, the content includes read-only content and the text selection indicator includes a selection area; and displaying the text selection indicator at the first location includes displaying a first word located at the first location within the selection area (e.g., with the display unit 1202).
In some embodiments, the processing unit 1208 is configured to: enable display of a magnifying loupe that displays a magnified version of the selection area and a region surrounding the selection area.
In some embodiments, selecting the content between the first location and the second location includes: expanding the selection area one word at a time in accordance with the movement of the contact across the touch-sensitive surface unit 1204 (e.g., with the selecting unit 1212); and selecting one additional word at a time in accordance with the expansion of the selection area (e.g., with the selecting unit 1212).
In some embodiments, the processing unit 1208 is configured to: forego performing the text selection operation, in response to detecting the second press input, in accordance with a determination that the second press input is accompanied by simultaneous movement of the contact across the touch-sensitive surface unit 1204.
In some embodiments, when the text is editable text, the text selection indicator is a cursor and selecting content between the first location and the second location includes expanding the selection character-by-character in accordance with movement of the contact on the touch-sensitive surface unit 1204 (e.g., with the selecting unit 1212); and when the text is non-editable text, the text selection indicator is a selection region that initially encompasses a single word and selecting content between the first location and the second location includes expanding the selection word-by-word in accordance with movement of the contact on the touch-sensitive surface unit 1204 (e.g., with the selecting unit 1212).
Turning to
In
When entering text into the content region 1302, a character can be entered upon detecting a touchdown or a liftoff event. In some embodiments, entering the text into the content region 1302 includes entering a character that corresponds to a character key at a location at which touchdown of the contact was detected on the onscreen keyboard. For example, touchdown of the finger contact 1324 was detected on the character key “h”. In response to detecting the touchdown, the character “h” is entered. In some embodiments, entering the text into the content region includes entering a character that corresponds to a character key at a location at which liftoff of the contact was detected on the onscreen keyboard. For example, in response to liftoff of the finger contact 1324 from the character key “h”, the character “h” is entered.
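The two entry variants above differ only in which touch event commits the character. A minimal Swift sketch follows, with an assumed key hit-test closure standing in for the keyboard's hit testing:

```swift
// Sketch: character entry committed at touchdown or at liftoff.

struct ScreenPoint { var x: Double; var y: Double }
enum EntryPolicy { case onTouchdown, onLiftoff }

struct CharacterEntry {
    let policy: EntryPolicy
    let characterAt: (ScreenPoint) -> Character?   // assumed key hit-test

    /// Returns the character to enter at touchdown, if the policy commits then.
    func touchdown(at p: ScreenPoint) -> Character? {
        policy == .onTouchdown ? characterAt(p) : nil
    }

    /// Returns the character to enter at liftoff, if the policy commits then.
    func liftoff(at p: ScreenPoint) -> Character? {
        policy == .onLiftoff ? characterAt(p) : nil
    }
}

// Usage: entering "h" on touchdown, as in the example above.
let entry = CharacterEntry(policy: .onTouchdown) { _ in "h" }
_ = entry.touchdown(at: ScreenPoint(x: 0, y: 0))   // yields "h"
```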
In
In some embodiments, the press input is a stationary press input such that there is less than a threshold amount of movement between when touchdown of the finger contact 1324 is detected on the onscreen keyboard 1321 and when the increase in the characteristic intensity of the contact over the text selection intensity threshold ITD is detected. In some embodiments, if more than the threshold amount of movement is detected, upon liftoff, a character entry operation is performed based on which character entry key the contact is over when liftoff of the contact is detected.
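A Swift sketch of this stationary-press test follows; the movement threshold value is an assumption, and the intensity threshold stands in for the text selection intensity threshold ITD.

```swift
// Sketch: a deep press only starts text selection if the contact stayed
// substantially stationary between touchdown and crossing the threshold.

struct ContactSample { var x: Double; var y: Double; var intensity: Double }

func isStationaryDeepPress(touchdown: ContactSample,
                           current: ContactSample,
                           intensityThreshold: Double,      // stands in for ITD
                           movementThreshold: Double = 4.0  // assumed, in points
) -> Bool {
    let dx = current.x - touchdown.x
    let dy = current.y - touchdown.y
    let moved = (dx * dx + dy * dy).squareRoot()
    return current.intensity > intensityThreshold && moved < movementThreshold
}
```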
As shown in
When the keyboard 1321 serves as an onscreen touch pad or track pad, in response to detecting finger movement 1332 that started on the keyboard 1321, text selection operations such as moving the cursor 1322 as shown in
Turning to
In
In response to detecting the second deep press by the same contact on the onscreen keyboard 1321 at time T2, the device begins to select a portion of the text input 1328 in accordance with the movement 1332 of the contact in the touch input. After time T2, during the movement 1332, the intensity of the contact can be above or below the selection-start intensity threshold ITD. In some embodiments, the text selection starts when an increase in intensity of the contact is detected while the contact is substantially stationary (e.g., moves not more than a threshold distance within a threshold amount of time before the characteristic intensity of the contact increases above the selection-start intensity threshold).
In
In
In some embodiments, the selection-cancellation criteria include a criterion that is met when the characteristic intensity of the contact increases above a selection-cancellation intensity threshold (e.g., a threshold that is the same as, greater than, or less than the text-selection intensity threshold and/or the selection-start intensity threshold). For illustration purposes, the selection-cancellation intensity threshold shown in
In
In some embodiments, continuing the example shown in
In
In
In some embodiments, the selection-movement criteria include a movement criterion that is met when the contact moves more than a respective threshold amount (e.g., the contact is not substantially stationary), and optionally a time criterion that is met when the amount of time between when the third subsequent change in the characteristic intensity of the contact is detected and when the first subsequent movement of the contact is detected is less than a respective delay threshold. Thus,
In
In
In
In
In
In
In
For example, in
In
Though not shown in the figures, continuing the example shown in
As described below, method 1400 provides an efficient way to manipulate a cursor. The method reduces the cognitive burden on a user when using a cursor, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to manipulate a cursor faster and more efficiently conserves power and increases the time between battery charges.
The device concurrently displays (1402) an onscreen keyboard (e.g., keyboard 1321,
In some embodiments, in response to detecting (1406) the touch input on the onscreen keyboard displayed on the touch-sensitive display, in accordance with a determination that the touch input satisfies text-selection criteria, wherein the text-selection criteria include a criterion that is met when a characteristic intensity of the contact increases above a text-selection intensity threshold (e.g., a deep press whose contact characteristic intensity increases above ITD at time T1,
In addition to text-selection operations, in response to detecting (1406) the touch input on the onscreen keyboard displayed on the touch-sensitive display and in accordance with a determination that the touch input satisfies text-entry criteria (e.g., as shown in
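This branch between text selection and text entry can be sketched as a simple classifier, shown below in Swift with assumed inputs; real gesture disambiguation would also account for movement and timing, which this sketch omits.

```swift
// Sketch: dispatch between text selection and character entry on the keyboard.
// The threshold and the simplified inputs are assumptions for illustration.

enum KeyboardAction {
    case enterTextSelectionMode       // deep press: text-selection criteria met
    case enterCharacter(Character)    // ordinary tap: text-entry criteria met
}

func classifyKeyboardTouch(peakIntensity: Double,
                           liftedOff: Bool,
                           keyCharacter: Character,
                           textSelectionThreshold: Double) -> KeyboardAction? {
    if peakIntensity > textSelectionThreshold {
        return .enterTextSelectionMode
    }
    if liftedOff {
        // Contact ended without ever crossing the threshold: enter the key.
        return .enterCharacter(keyCharacter)
    }
    return nil   // contact still down and below threshold: keep observing
}
```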
In some embodiments, in response to detecting that the text-selection criteria have been met, the device generates (1424) a tactile output (e.g., with one or more tactile output generating components of the device, 167 in
In some embodiments, in response to detecting that the text-selection criteria have been met, the device changes (1426) an appearance of the onscreen keyboard to indicate that the device is operating in a text selection mode of operation. In some embodiments, changing the appearance of the onscreen keyboard includes obscuring an appearance of characters on keys of the onscreen keyboard (e.g., blurring in
In some embodiments, when the touch input satisfies the text-selection criteria, the device detects (1430) movement of the contact after the touch input has satisfied the text-selection criteria and moves a cursor (
In some embodiments, when the touch input satisfies the text-selection criteria, the device detects (1432) a first subsequent change (e.g., a deep press at time T2,
In some embodiments, when the touch input satisfies selection-start criteria, after starting to select content in the content region, the device detects (1438) liftoff of the contact from the touch-sensitive display and confirms the selection in response to detecting the liftoff of the contact (e.g., as shown in
Alternatively, when the touch input satisfies selection-start criteria, after starting to select content in the content region, and while continuing to detect the contact on the touch-sensitive display, the device detects (1440) a second subsequent change in intensity of the contact (e.g., a deep press at time T3 in
In some embodiments, in response to detecting that the selection-cancellation criteria have been met, the device generates (1448) a tactile output (e.g., with one or more tactile output generating components of the device, 167 in
In some embodiments, after canceling the selection, and while continuing to detect the contact on the touch-sensitive display, the device detects (1450) a third subsequent change in the characteristic intensity of the contact (e.g., a deep press at time T4). In response to detecting the third subsequent change in the characteristic intensity of the contact: in accordance with a determination that the touch input satisfies the selection-start criteria, the device starts to select content in the content region at a location of the cursor; and, in accordance with a determination that the touch input does not satisfy the selection-start criteria, the device forgoes starting to select content in the content region.
In some embodiments, starting to select content in response to detecting the third subsequent change in the characteristic intensity of the contact includes (1456) selecting a respective word at the location of the cursor.
In some embodiments, the selected respective word is (1458) a first word (e.g., the word “enim”). While the first word is selected, the device detects first subsequent movement of the contact. In response to detecting the first subsequent movement of the contact while the first word is selected, in accordance with a determination that the touch input meets selection-movement criteria, which include a movement criterion that is met when the contact moves more than a respective threshold amount, the device cancels selection of the first word and selects a second word that is adjacent to the first word in a first direction in accordance with the first subsequent movement of the contact, such that the selected respective word is the second word.
In some embodiments, instead of selecting a second word that is adjacent to the first word, while the respective word is selected, the device detects first subsequent movement of the contact. In response to detecting the first subsequent movement of the contact while the respective word is selected, in accordance with a determination that the touch input meets selection-expansion criteria, which include a movement criterion that is met when the contact moves more than a respective threshold amount, the device expands the selection to include a word that is adjacent to the respective word in a first direction in accordance with the first subsequent movement of the contact.
In some embodiments, instead of selecting a second word that is adjacent to the first word or expanding the selection word by word in response to detecting a subsequent movement of the contact, while the respective word is selected, the device detects (1462) a fourth subsequent change (e.g., a deep press at time T5) in the characteristic intensity of the contact above a respective intensity threshold. In response to detecting the fourth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets the selection-cancellation criteria, which include a criterion that is met when the amount of time between when the third subsequent change in the characteristic intensity of the contact is detected and when the fourth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, the device cancels selection of the respective word.
In some embodiments, instead of canceling a selected word in response to a fourth subsequent change in the characteristic intensity of the contact above a respective intensity threshold, while the respective word is selected, the device detects (1464) a fourth subsequent change (e.g., a deep press at time T5) in the characteristic intensity of the contact above the respective intensity threshold. In response to detecting the fourth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets sentence-selection criteria, which include a movement criterion that is met when the contact moves less than a threshold amount within a threshold time period before the fourth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the third subsequent change in the characteristic intensity of the contact is detected and when the fourth subsequent change in the characteristic intensity of the contact is detected is less than a delay threshold, the device expands the selection to include the respective sentence that contains the respective word.
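The sentence-selection test combines a stationarity check with a quick-succession check between consecutive deep presses; the struct below is a hedged sketch with assumed names and values.

```swift
/// A deep press expands a word selection to the containing sentence only if
/// the contact was substantially stationary just before the press and the
/// press followed the previous one within the delay threshold.
struct RepeatPressCriteria {
    var movementThreshold: Double = 4.0  // points moved within the window
    var delayThreshold: Double = 0.5     // seconds between successive presses

    func isMet(movementBeforePress: Double, gapSincePreviousPress: Double) -> Bool {
        movementBeforePress < movementThreshold       // substantially stationary
            && gapSincePreviousPress < delayThreshold // pressed in quick succession
    }
}
```

The same shape of test, with the press index advanced by one, gates the later paragraph-selection and document-selection expansions.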
In some embodiments, while the respective sentence is selected, the device detects (1466) second subsequent movement of the contact. In response to detecting the second subsequent movement of the contact while the respective sentence is selected, in accordance with a determination that the touch input meets selection-expansion criteria, which include a movement criterion that is met when the contact moves more than a respective threshold amount, the device expands the selection to include a sentence that is adjacent to the respective sentence in a first direction in accordance with the second subsequent movement of the contact.
In some embodiments, instead of expanding the sentence selection in response to a subsequent movement of the contact, while the respective sentence is selected, the device detects (1468) a fifth subsequent change in the characteristic intensity of the contact above the respective intensity threshold (e.g., the selection-start intensity threshold). In response to detecting the fifth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets the selection-cancellation criteria, which include a criterion that is met when the amount of time between when the fourth subsequent change in the characteristic intensity of the contact is detected and when the fifth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, the device cancels selection of the respective sentence.
In some embodiments, instead of canceling the sentence selection in response to detecting a subsequent change in the characteristic intensity of the contact, in response to detecting the fifth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets paragraph-selection criteria, which include a movement criterion that is met when the contact moves less than the threshold amount within the threshold time period (e.g., is substantially stationary) before (e.g., just before) the fifth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the fourth subsequent change in the characteristic intensity of the contact is detected and when the fifth subsequent change in the characteristic intensity of the contact is detected is less than the delay threshold, the device expands (1470) the selection to include the entire respective paragraph that contains the respective sentence.
In some embodiments, while the respective paragraph is selected, the device detects (1472) third subsequent movement of the contact. In response to detecting the third subsequent movement of the contact while the respective paragraph is selected: in accordance with a determination that the touch input meets selection-expansion criteria, which include a movement criterion that is met when the contact moves more than a respective threshold amount, the device expands the selection to include a paragraph that is adjacent to the respective paragraph in a first direction in accordance with the third subsequent movement of the contact.
In some embodiments, the respective paragraph is selected in response to the fifth subsequent change in the characteristic intensity of the contact. While the respective paragraph is selected: the device detects (1474) a sixth subsequent change in the characteristic intensity of the contact above the respective intensity threshold; and, in response to detecting the sixth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets selection-cancellation criteria, which include a criterion that is met when the amount of time between when the fifth subsequent change in the characteristic intensity of the contact is detected and when the sixth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, the device cancels selection of the respective paragraph.
In some embodiments, in response to detecting the sixth subsequent change in the characteristic intensity of the contact, in accordance with a determination that the touch input meets document-selection criteria, which include a movement criterion that is met when the contact moves less than the threshold amount within the threshold time period before the sixth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the fifth subsequent change in the characteristic intensity of the contact is detected and when the sixth subsequent change in the characteristic intensity of the contact is detected is less than the delay threshold, the device expands (1476) the selection to include the respective document that contains the respective paragraph.
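Taken together, operations (1456) through (1476) describe an escalation ladder: each qualifying deep press in quick succession widens the selection by one level (word, sentence, paragraph, document), while a press that arrives after the delay threshold cancels the selection. The state machine below is an illustrative sketch of that ladder; the type names and the 0.5-second value are assumptions.

```swift
/// Escalating selection granularity driven by substantially stationary
/// deep presses.
enum SelectionGranularity: Int {
    case noSelection, word, sentence, paragraph, document
}

struct EscalatingSelection {
    var granularity: SelectionGranularity = .noSelection
    var lastPressTime: Double = .nan
    let delayThreshold: Double = 0.5  // seconds; assumed value

    /// Call for each deep press detected while the contact is
    /// substantially stationary. `time` is in seconds.
    mutating func handleStationaryDeepPress(at time: Double) {
        let quickSuccession = !lastPressTime.isNaN
            && (time - lastPressTime) < delayThreshold
        lastPressTime = time
        switch (granularity, quickSuccession) {
        case (.noSelection, _):
            granularity = .word       // (1456): select the word at the cursor
        case (.document, true):
            break                     // nothing wider than the document
        case (_, true):
            // Quick succession: widen one level, cf. (1470) and (1476).
            granularity = SelectionGranularity(rawValue: granularity.rawValue + 1)!
        case (_, false):
            granularity = .noSelection  // press after the delay threshold cancels
        }
    }
}
```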
It should be understood that the particular order in which the operations described above have been presented is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein.
In accordance with some embodiments, an electronic device 1500 is configured in accordance with the principles of the various described embodiments.
In some embodiments, the electronic device 1500 includes a display unit 1502 configured to concurrently display an onscreen keyboard and a content presentation region on the display unit 1502, wherein the content presentation region displays text input received from the onscreen keyboard; a touch-sensitive surface unit 1504 configured to receive user touch inputs; one or more sensor units 1506 to detect intensity of contacts with the touch-sensitive surface unit 1504; and a processing unit 1508 coupled to the display unit 1502, the touch-sensitive surface unit 1504, and the one or more sensor units 1506. In some embodiments, the processing unit 1508 includes a detecting unit 1510, a selecting unit 1512, a moving unit 1514, a determining unit 1516, and a changing appearance unit 1518.
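One plausible reading of this functional-block decomposition in code is a processing unit that composes one small role per unit; the protocol names and method signatures below are purely illustrative assumptions.

```swift
/// Hypothetical role-per-unit decomposition mirroring the block diagram.
protocol DetectingUnit { func didDetectTouch(intensity: Double) }  // cf. detecting unit 1510
protocol SelectingUnit { func select(_ range: Range<Int>) }        // cf. selecting unit 1512
protocol MovingUnit { func moveCursor(to offset: Int) }            // cf. moving unit 1514
protocol DeterminingUnit {                                         // cf. determining unit 1516
    func exceeds(_ threshold: Double, intensity: Double) -> Bool
}
protocol ChangingAppearanceUnit {                                  // cf. changing appearance unit 1518
    func setKeyboardObscured(_ obscured: Bool)
}

struct ProcessingUnit {
    var detecting: any DetectingUnit
    var selecting: any SelectingUnit
    var moving: any MovingUnit
    var determining: any DeterminingUnit
    var changingAppearance: any ChangingAppearanceUnit
}
```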
The processing unit 1508 is configured to detect (e.g., with detecting unit 1510) a touch input on the onscreen keyboard displayed on the touch-sensitive display. In some embodiments, detecting the touch input includes detecting movement of a contact and liftoff of the contact. The processing unit 1508 is also configured to, in response to detecting the touch input on the onscreen keyboard displayed on the touch-sensitive display, in accordance with a determination (e.g., with determining unit 1516) that the touch input satisfies text-selection criteria, where the text-selection criteria include a criterion that is met when a characteristic intensity of the contact increases above a text-selection intensity threshold, perform a text-selection operation (e.g., with selecting unit 1512) based on the movement of the contact. Conversely, in accordance with a determination (e.g., with determining unit 1516) that the touch input satisfies text-entry criteria, where the text-entry criteria include a criterion that is met when the characteristic intensity of the contact does not increase above the text-selection intensity threshold, the processing unit 1508 is configured to enter text into the content presentation region based on the touch input.
In some embodiments, the text-entry criteria include a criterion that is met when the liftoff of the contact is detected while the contact is at a location of a character key of the onscreen keyboard.
In some embodiments, the text-entry criteria include a criterion that is met when the contact does not move outside of the onscreen keyboard before liftoff of the contact is detected.
In some embodiments, entering the text into the content region includes entering a character that corresponds to a character key at a location at which touchdown of the contact was detected on the onscreen keyboard.
In some embodiments, entering the text into the content region includes entering a character that corresponds to a character key at a location at which liftoff of the contact was detected on the onscreen keyboard.
In some embodiments, the text-selection criteria include a criterion that is met when the contact does not move more than a threshold distance before detecting an increase in the characteristic intensity of the contact above the text-selection intensity threshold.
In some embodiments, the text-selection operation includes one of: moving a cursor within the content region or selecting text within the content region.
In some embodiments, the processing unit 1508 is configured to, in response to detecting that the text-selection criteria have been met, generate a tactile output (e.g., with tactile output generator(s) 167).
In some embodiments, the processing unit 1508 is configured to, in response to detecting that the text-selection criteria have been met, change an appearance (e.g., with changing appearance unit 1518) of the onscreen keyboard to indicate that the device is operating in a text selection mode of operation. In some embodiments, changing the appearance (e.g., with changing appearance unit 1518) of the onscreen keyboard includes obscuring an appearance of characters on keys of the onscreen keyboard.
In some embodiments, the processing unit 1508 is configured to end the text selection mode of operation and, in conjunction with the end of the text selection mode of operation, reverse the change in appearance (e.g., with changing appearance unit 1518) of the onscreen keyboard to reveal the characters on the keys of the onscreen keyboard.
In some embodiments, the processing unit 1508 is configured to, when the touch input satisfies the text-selection criteria, detect (e.g., with detecting unit 1510) movement of the contact after the touch input has satisfied the text-selection criteria and move (e.g., with moving unit 1514) a cursor in the content region in accordance with the movement of the contact detected after the touch input has satisfied the text-selection criteria.
In some embodiments, the processing unit 1508 is configured to, when the touch input satisfies the text-selection criteria, detect (e.g., with detecting unit 1510) a first subsequent change in the characteristic intensity of the contact followed by additional movement of the contact on the touch-sensitive display; and, in response to detecting the first subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the touch input satisfies selection-start criteria, the selection-start criteria include a criterion that is met when the characteristic intensity of the contact increases above a selection-start intensity threshold, the processing unit 1508 is configured to start to select (e.g., with selecting unit 1512) content in the content region at a location of a cursor in accordance with the additional movement of the contact. Conversely, in accordance with a determination (e.g., with determining unit 1516) that the touch input does not satisfy the selection-start criteria, the processing unit 1508 is configured to move (e.g., with moving unit 1514) the cursor in accordance with the additional movement of the contact without starting to select content in the content region.
In some embodiments, the processing unit 1508 is configured to: when the touch input satisfies selection-start criteria, after starting to select content in the content region, detect (e.g., with detecting unit 1510) liftoff of the contact from the touch-sensitive display and confirm the selection (e.g., with selecting unit 1512) in response to detecting the liftoff of the contact.
In some embodiments, the processing unit 1508 is configured to: when the touch input satisfies selection-start criteria, after starting to select content in the content region, and while continuing to detect (e.g., with detecting unit 1510) the contact on the touch-sensitive display, detect (e.g., with detecting unit 1510) a second subsequent change in intensity of the contact. In response to detecting the second subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the second subsequent change in the characteristic intensity of the contact satisfies selection-cancellation criteria, the selection-cancellation criteria include a criterion that is met when the characteristic intensity of the contact increases above a selection-cancellation intensity threshold, the processing unit 1508 is configured to cancel the selection (e.g., with selecting unit 1512). Conversely, in accordance with a determination (e.g., with determining unit 1516) that the second subsequent change in the characteristic intensity of the contact does not satisfy the selection-cancellation criteria, the processing unit 1508 is configured to maintain the selection (e.g., with selecting unit 1512).
In some embodiments, the selection-cancellation criteria include a criterion that is met when the contact moves no more than a threshold distance within a threshold amount of time before the characteristic intensity of the contact increases above the selection-cancellation intensity threshold.
In some embodiments, the processing unit 1508 is configured to, in response to detecting that the selection-cancellation criteria have been met, generate a tactile output (e.g., with tactile output generator(s) 167).
In some embodiments, the processing unit 1508 is configured to: after canceling the selection, and while continuing to detect (e.g., with detecting unit 1510) the contact on the touch-sensitive display, detect (e.g., with detecting unit 1510) a third subsequent change in the characteristic intensity of the contact. In response to detecting the third subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the touch input satisfies the selection-start criteria, the processing unit 1508 is configured to start to select (e.g., with selecting unit 1512) content in the content region at a location of the cursor. Conversely, in accordance with a determination (e.g., with determining unit 1516) that the touch input does not satisfy the selection-start criteria, the processing unit 1508 is configured to forgo starting to select content (e.g., with selecting unit 1512) in the content region.
In some embodiments, starting to select content in response to detecting the third subsequent change in the characteristic intensity of the contact includes selecting (e.g., with selecting unit 1512) a respective word at the location of the cursor.
In some embodiments, the selected respective word is a first word, and the processing unit 1508 is configured to, while the first word is selected, detect (e.g., with detecting unit 1510) first subsequent movement of the contact. In response to detecting the first subsequent movement of the contact while the first word is selected, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets selection-movement criteria, which include a movement criterion that is met when the contact moves more than a respective threshold amount, the processing unit 1508 is configured to cancel selection (e.g., with selecting unit 1512) of the first word, and select (e.g., with selecting unit 1512) a second word that is adjacent to the first word in a first direction in accordance with the first subsequent movement of the contact, such that the selected respective word is the second word.
In some embodiments, the processing unit 1508 is configured to, while the respective word is selected, detect (e.g., with detecting unit 1510) first subsequent movement of the contact. In response to detecting the first subsequent movement of the contact while the respective word is selected, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets selection-expansion criteria, which include a movement criterion that is met when the contact moves more than a respective threshold amount, the processing unit 1508 is configured to expand the selection (e.g., with selecting unit 1512) to include a word that is adjacent to the respective word in a first direction in accordance with the first subsequent movement of the contact.
In some embodiments, the processing unit 1508 is configured to, while the respective word is selected, detect (e.g., with detecting unit 1510) a fourth subsequent change in the characteristic intensity of the contact above a respective intensity threshold. In response to detecting the fourth subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets the selection-cancellation criteria, which include a criterion that is met when the amount of time between when the third subsequent change in the characteristic intensity of the contact is detected and when the fourth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, the processing unit 1508 is configured to cancel selection (e.g., with selecting unit 1512) of the respective word.
In some embodiments, the processing unit 1508 is configured to, while the respective word is selected, detect (e.g., with detecting unit 1510) a fourth subsequent change in the characteristic intensity of the contact above a respective intensity threshold. In response to detecting the fourth subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets sentence-selection criteria, which include a movement criterion that is met when the contact moves less than a threshold amount within a threshold time period before the fourth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the third subsequent change in the characteristic intensity of the contact is detected and when the fourth subsequent change in the characteristic intensity of the contact is detected is less than a delay threshold, the processing unit 1508 is configured to expand the selection (e.g., with selecting unit 1512) to include the respective sentence that contains the respective word.
In some embodiments, the processing unit 1508 is configured to, while the respective sentence is selected, detect (e.g., with detecting unit 1510) second subsequent movement of the contact. In response to detecting the second subsequent movement of the contact while the respective sentence is selected, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets selection-expansion criteria, which include a movement criterion that is met when the contact moves more than a respective threshold amount, the processing unit 1508 is configured to expand the selection (e.g., with selecting unit 1512) to include a sentence that is adjacent to the respective sentence in a first direction in accordance with the second subsequent movement of the contact.
In some embodiments, the respective sentence is selected in response to the fourth subsequent change in the characteristic intensity of the contact and the processing unit 1508 is configured to, while the respective sentence is selected: detect (e.g., with detecting unit 1510) a fifth subsequent change in the characteristic intensity of the contact above the respective intensity threshold. In response to detecting the fifth subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets the selection-cancellation criteria, which include a criterion that is met when the amount of time between when the fourth subsequent change in the characteristic intensity of the contact is detected and when the fifth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, the processing unit 1508 is configured to cancel selection (e.g., with selecting unit 1512) of the respective sentence.
In some embodiments, the processing unit 1508 is configured to, in response to detecting the fifth subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets paragraph-selection criteria, which include a movement criterion that is met when the contact moves less than the threshold amount within the threshold time period before the fifth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the fourth subsequent change in the characteristic intensity of the contact is detected and when the fifth subsequent change in the characteristic intensity of the contact is detected is less than the delay threshold, expand the selection (e.g., with selecting unit 1512) to include the respective paragraph that contains the respective sentence.
In some embodiments, the processing unit 1508 is configured to, while the respective paragraph is selected, detect (e.g., with detecting unit 1510) third subsequent movement of the contact. In response to detecting the third subsequent movement of the contact while the respective paragraph is selected, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets selection-expansion criteria, which include a movement criterion that is met when the contact moves more than a respective threshold amount, the processing unit 1508 is configured to expand the selection (e.g., with selecting unit 1512) to include a paragraph that is adjacent to the respective paragraph in a first direction in accordance with the third subsequent movement of the contact.
In some embodiments, the respective paragraph is selected in response to the fifth subsequent change in the characteristic intensity of the contact and the processing unit 1508 is configured to, while the respective paragraph is selected, detect (e.g., with detecting unit 1510) a sixth subsequent change in the characteristic intensity of the contact above the respective intensity threshold. In response to detecting the sixth subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets selection-cancellation criteria, which include a criterion that is met when the amount of time between when the fifth subsequent change in the characteristic intensity of the contact is detected and when the sixth subsequent change in the characteristic intensity of the contact is detected is greater than the delay threshold, the processing unit 1508 is configured to cancel selection (e.g., with selecting unit 1512) of the respective paragraph.
In some embodiments, the processing unit 1508 is configured to, in response to detecting the sixth subsequent change in the characteristic intensity of the contact, in accordance with a determination (e.g., with determining unit 1516) that the touch input meets document-selection criteria, which include a movement criterion that is met when the contact moves less than the threshold amount within the threshold time period before the sixth subsequent change in intensity of the contact was detected and a time criterion that is met when an amount of time between when the fifth subsequent change in the characteristic intensity of the contact is detected and when the sixth subsequent change in the characteristic intensity of the contact is detected is less than the delay threshold, expand the selection (e.g., with selecting unit 1512) to include the respective document that contains the respective paragraph.
The foregoing description has, for purposes of explanation, been provided with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best use the invention and the various described embodiments with various modifications as are suited to the particular use contemplated.
This application is a continuation of U.S. application Ser. No. 16/824,490, filed Mar. 19, 2020, which is a continuation of U.S. application Ser. No. 16/258,394, filed Jan. 25, 2019, now U.S. Pat. No. 10,599,331, which is a continuation of U.S. application Ser. No. 15/499,693, filed Apr. 27, 2017, now U.S. Pat. No. 10,222,980, which is a continuation of U.S. patent application Ser. No. 14/866,361, filed Sep. 25, 2015, now U.S. Pat. No. 9,639,184, which is a continuation of U.S. patent application Ser. No. 14/864,737, filed Sep. 24, 2015, now U.S. Pat. No. 9,785,305, which claims priority to U.S. Provisional Patent Application Ser. No. 62/215,720, filed Sep. 8, 2015, U.S. Provisional Patent Application Ser. No. 62/213,593, filed Sep. 2, 2015, U.S. Provisional Patent Application Ser. No. 62/172,162, filed Jun. 7, 2015, and U.S. Provisional Patent Application Ser. No. 62/135,619, filed Mar. 19, 2015. All of these applications are incorporated by reference herein in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
4864520 | Setoguchi et al. | Sep 1989 | A |
5184120 | Schultz | Feb 1993 | A |
5374787 | Miller et al. | Dec 1994 | A |
5428730 | Baker et al. | Jun 1995 | A |
5463722 | Venolia | Oct 1995 | A |
5510813 | Makinwa et al. | Apr 1996 | A |
5555354 | Strasnick et al. | Sep 1996 | A |
5559301 | Bryan, Jr. et al. | Sep 1996 | A |
5589855 | Blumstein et al. | Dec 1996 | A |
5664210 | Fleming et al. | Sep 1997 | A |
5710896 | Seidl | Jan 1998 | A |
5717438 | Kim et al. | Feb 1998 | A |
5793360 | Fleck et al. | Aug 1998 | A |
5793377 | Moore | Aug 1998 | A |
5801692 | Muzio et al. | Sep 1998 | A |
5805144 | Scholder et al. | Sep 1998 | A |
5805167 | Van Cruyningen | Sep 1998 | A |
5809267 | Moran et al. | Sep 1998 | A |
5819293 | Comer et al. | Oct 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5844560 | Crutcher et al. | Dec 1998 | A |
5872922 | Hogan et al. | Feb 1999 | A |
5946647 | Miller et al. | Aug 1999 | A |
5973670 | Barber et al. | Oct 1999 | A |
6002397 | Kolawa et al. | Dec 1999 | A |
6031989 | Cordell | Feb 2000 | A |
6088019 | Rosenberg | Jul 2000 | A |
6088027 | Konar et al. | Jul 2000 | A |
6111575 | Martinez et al. | Aug 2000 | A |
6121960 | Carroll et al. | Sep 2000 | A |
6208329 | Ballare | Mar 2001 | B1 |
6208340 | Amin et al. | Mar 2001 | B1 |
6219034 | Elbing et al. | Apr 2001 | B1 |
6223188 | Albers et al. | Apr 2001 | B1 |
6232891 | Rosenberg | May 2001 | B1 |
6243080 | Molne | Jun 2001 | B1 |
6252594 | Xia et al. | Jun 2001 | B1 |
6300936 | Braun et al. | Oct 2001 | B1 |
6313836 | Russell, Jr. et al. | Nov 2001 | B1 |
6396523 | Segal et al. | May 2002 | B1 |
6429846 | Rosenberg et al. | Aug 2002 | B2 |
6448977 | Braun et al. | Sep 2002 | B1 |
6459442 | Edwards et al. | Oct 2002 | B1 |
6489978 | Gong et al. | Dec 2002 | B1 |
6512530 | Rzepkowski et al. | Jan 2003 | B1 |
6563487 | Martin et al. | May 2003 | B2 |
6567102 | Kung | May 2003 | B2 |
6583798 | Hoek et al. | Jun 2003 | B1 |
6590568 | Astala et al. | Jul 2003 | B1 |
6661438 | Shiraishi et al. | Dec 2003 | B1 |
6735307 | Volckers | May 2004 | B1 |
6750890 | Sugimoto | Jun 2004 | B1 |
6806893 | Kolawa et al. | Oct 2004 | B1 |
6822635 | Shahoian et al. | Nov 2004 | B2 |
6906697 | Rosenberg | Jun 2005 | B2 |
6919927 | Hyodo | Jul 2005 | B1 |
6943778 | Astala et al. | Sep 2005 | B1 |
7036088 | Tunney | Apr 2006 | B2 |
7138983 | Wakai et al. | Nov 2006 | B2 |
7312791 | Hoshino et al. | Dec 2007 | B2 |
7411575 | Hill et al. | Aug 2008 | B2 |
7434177 | Ording et al. | Oct 2008 | B1 |
7453439 | Kushler et al. | Nov 2008 | B1 |
7471284 | Bathiche et al. | Dec 2008 | B2 |
7479949 | Jobs et al. | Jan 2009 | B2 |
7516404 | Colby et al. | Apr 2009 | B1 |
7533352 | Chew et al. | May 2009 | B2 |
7552397 | Holecek et al. | Jun 2009 | B2 |
7577530 | Vignalou-Marche | Aug 2009 | B2 |
7614008 | Ording | Nov 2009 | B2 |
7619616 | Rimas Ribikauskas et al. | Nov 2009 | B2 |
7629966 | Anson | Dec 2009 | B2 |
7656413 | Khan et al. | Feb 2010 | B2 |
7683889 | Rimas Ribikauskas et al. | Mar 2010 | B2 |
7702733 | Fleck et al. | Apr 2010 | B2 |
7743348 | Robbins et al. | Jun 2010 | B2 |
7760187 | Kennedy | Jul 2010 | B2 |
7787026 | Flory et al. | Aug 2010 | B1 |
7797642 | Karam et al. | Sep 2010 | B1 |
7801950 | Eisenstadt et al. | Sep 2010 | B2 |
7812826 | Ording et al. | Oct 2010 | B2 |
7890862 | Kompe et al. | Feb 2011 | B2 |
7903090 | Soss et al. | Mar 2011 | B2 |
7952566 | Poupyrev et al. | May 2011 | B2 |
7956847 | Christie | Jun 2011 | B2 |
7973778 | Chen | Jul 2011 | B2 |
8000694 | Labidi et al. | Aug 2011 | B2 |
8040142 | Bokma et al. | Oct 2011 | B1 |
8059104 | Shahoian et al. | Nov 2011 | B2 |
8059105 | Rosenberg et al. | Nov 2011 | B2 |
8106856 | Matas et al. | Jan 2012 | B2 |
8125440 | Guyot-Sionnest et al. | Feb 2012 | B2 |
8125492 | Wainwright et al. | Feb 2012 | B1 |
RE43448 | Kimoto et al. | Jun 2012 | E |
8209628 | Davidson | Jun 2012 | B1 |
8271900 | Walizaka et al. | Sep 2012 | B2 |
8300005 | Tateuchi et al. | Oct 2012 | B2 |
8325398 | Satomi et al. | Dec 2012 | B2 |
8363020 | Li et al. | Jan 2013 | B2 |
8390583 | Forutanpour et al. | Mar 2013 | B2 |
8423089 | Song et al. | Apr 2013 | B2 |
8446376 | Levy et al. | May 2013 | B2 |
8453057 | Stallings et al. | May 2013 | B2 |
8456431 | Victor | Jun 2013 | B2 |
8466889 | Tong et al. | Jun 2013 | B2 |
8482535 | Pryor | Jul 2013 | B2 |
8499243 | Yuki | Jul 2013 | B2 |
8504946 | Williamson et al. | Aug 2013 | B2 |
8508494 | Moore | Aug 2013 | B2 |
8542205 | Keller | Sep 2013 | B1 |
8553092 | Tezuka et al. | Oct 2013 | B2 |
8570296 | Birnbaum et al. | Oct 2013 | B2 |
8581870 | Bokma et al. | Nov 2013 | B2 |
8587542 | Moore | Nov 2013 | B2 |
8593415 | Han et al. | Nov 2013 | B2 |
8593420 | Buuck | Nov 2013 | B1 |
8625882 | Backlund et al. | Jan 2014 | B2 |
8638311 | Kang et al. | Jan 2014 | B2 |
8665227 | Gunawan | Mar 2014 | B2 |
8669945 | Coddington | Mar 2014 | B2 |
8698765 | Keller | Apr 2014 | B1 |
8706172 | Priyantha et al. | Apr 2014 | B2 |
8713471 | Rowley et al. | Apr 2014 | B1 |
8717305 | Williamson et al. | May 2014 | B2 |
8726198 | Rydenhag et al. | May 2014 | B2 |
8743069 | Morton et al. | Jun 2014 | B2 |
8760425 | Crisan | Jun 2014 | B2 |
8769431 | Prasad | Jul 2014 | B1 |
8773389 | Freed | Jul 2014 | B1 |
8788964 | Shin et al. | Jul 2014 | B2 |
8793577 | Schellingerhout et al. | Jul 2014 | B2 |
8799816 | Wells et al. | Aug 2014 | B2 |
8816989 | Nicholson et al. | Aug 2014 | B2 |
8854316 | Shenfield | Oct 2014 | B2 |
8872729 | Lyons et al. | Oct 2014 | B2 |
8872773 | Mak et al. | Oct 2014 | B2 |
8875044 | Ozawa et al. | Oct 2014 | B2 |
8881062 | Kim et al. | Nov 2014 | B2 |
8914732 | Jun et al. | Dec 2014 | B2 |
8952987 | Momeyer et al. | Feb 2015 | B2 |
8954889 | Fujibayashi | Feb 2015 | B2 |
8959430 | Spivak et al. | Feb 2015 | B1 |
8976128 | Moore | Mar 2015 | B2 |
9026932 | Dixon | May 2015 | B1 |
9030419 | Freed | May 2015 | B1 |
9030436 | Ikeda | May 2015 | B2 |
9032321 | Cohen et al. | May 2015 | B1 |
9043732 | Nurmi et al. | May 2015 | B2 |
9046999 | Teller et al. | Jun 2015 | B1 |
9052820 | Jarrett et al. | Jun 2015 | B2 |
9052925 | Chaudhri | Jun 2015 | B2 |
9063563 | Gray et al. | Jun 2015 | B1 |
9063731 | Heo et al. | Jun 2015 | B2 |
9069460 | Moore | Jun 2015 | B2 |
9086755 | Cho et al. | Jul 2015 | B2 |
9092058 | Kasahara et al. | Jul 2015 | B2 |
9098188 | Kim | Aug 2015 | B2 |
9104260 | Marsden et al. | Aug 2015 | B2 |
9111076 | Park et al. | Aug 2015 | B2 |
9116569 | Stacy et al. | Aug 2015 | B2 |
9116571 | Zeliff et al. | Aug 2015 | B2 |
9122364 | Kuwabara et al. | Sep 2015 | B2 |
9128605 | Nan et al. | Sep 2015 | B2 |
9146914 | Dhaundiyal | Sep 2015 | B1 |
9164779 | Brakensiek et al. | Oct 2015 | B2 |
9170607 | Bose et al. | Oct 2015 | B2 |
9170649 | Ronkainen | Oct 2015 | B2 |
9218105 | Mansson et al. | Dec 2015 | B2 |
9244562 | Rosenberg et al. | Jan 2016 | B1 |
9244576 | Vadagave et al. | Jan 2016 | B1 |
9244601 | Kim et al. | Jan 2016 | B2 |
9244606 | Kocienda et al. | Jan 2016 | B2 |
9246487 | Casparian et al. | Jan 2016 | B2 |
9262002 | Momeyer et al. | Feb 2016 | B2 |
9280286 | Commarford et al. | Mar 2016 | B2 |
9304668 | Rezende et al. | Apr 2016 | B2 |
9307112 | Molgaard et al. | Apr 2016 | B2 |
9349552 | Huska et al. | May 2016 | B2 |
9361018 | Defazio et al. | Jun 2016 | B2 |
9383887 | Khafizov et al. | Jul 2016 | B1 |
9389718 | Letourneur | Jul 2016 | B1 |
9389722 | Matsuki et al. | Jul 2016 | B2 |
9395800 | Liu et al. | Jul 2016 | B2 |
9400581 | Bokma et al. | Jul 2016 | B2 |
9405367 | Jung et al. | Aug 2016 | B2 |
9405428 | Roh et al. | Aug 2016 | B2 |
9417754 | Smith | Aug 2016 | B2 |
9423938 | Morris | Aug 2016 | B1 |
9436344 | Kuwabara et al. | Sep 2016 | B2 |
9448694 | Sharma et al. | Sep 2016 | B2 |
9451230 | Henderson et al. | Sep 2016 | B1 |
9471145 | Langlois et al. | Oct 2016 | B2 |
9477393 | Zambetti et al. | Oct 2016 | B2 |
9542013 | Dearman et al. | Jan 2017 | B2 |
9547436 | Ohki et al. | Jan 2017 | B2 |
9569093 | Lipman et al. | Feb 2017 | B2 |
9582178 | Grant et al. | Feb 2017 | B2 |
9600114 | Milam et al. | Mar 2017 | B2 |
9600116 | Tao et al. | Mar 2017 | B2 |
9612741 | Brown et al. | Apr 2017 | B2 |
9619076 | Bernstein et al. | Apr 2017 | B2 |
9625987 | LaPenna et al. | Apr 2017 | B1 |
9645722 | Stasior et al. | May 2017 | B1 |
9665762 | Thompson et al. | May 2017 | B2 |
9671943 | Van der Velden | Jun 2017 | B2 |
9678571 | Robert et al. | Jun 2017 | B1 |
9733716 | Shaffer | Aug 2017 | B2 |
9740381 | Chaudhri et al. | Aug 2017 | B1 |
9753527 | Connell et al. | Sep 2017 | B2 |
9760241 | Lewbel | Sep 2017 | B1 |
9785305 | Alonso Ruiz et al. | Oct 2017 | B2 |
9798443 | Gray | Oct 2017 | B1 |
9804665 | DeBates et al. | Oct 2017 | B2 |
9829980 | Lisseman et al. | Nov 2017 | B2 |
9891747 | Jang et al. | Feb 2018 | B2 |
10055066 | Lynn et al. | Aug 2018 | B2 |
10057490 | Shin et al. | Aug 2018 | B2 |
10095396 | Kudershian et al. | Oct 2018 | B2 |
10133388 | Sudou | Nov 2018 | B2 |
10133397 | Smith | Nov 2018 | B1 |
10180722 | Lu | Jan 2019 | B2 |
10222980 | Alonso Ruiz et al. | Mar 2019 | B2 |
10235023 | Gustafsson et al. | Mar 2019 | B2 |
10275087 | Smith | Apr 2019 | B1 |
10331769 | Hill et al. | Jun 2019 | B1 |
10386960 | Smith | Aug 2019 | B1 |
10469767 | Shikata | Nov 2019 | B2 |
10496151 | Kim et al. | Dec 2019 | B2 |
10547895 | Morris | Jan 2020 | B1 |
10739896 | Kim et al. | Aug 2020 | B2 |
10782871 | Bernstein et al. | Sep 2020 | B2 |
20010024195 | Hayakawa et al. | Sep 2001 | A1 |
20010045965 | Orbanes et al. | Nov 2001 | A1 |
20020006822 | Krintzman | Jan 2002 | A1 |
20020008691 | Hanajima et al. | Jan 2002 | A1 |
20020015064 | Robotham et al. | Feb 2002 | A1 |
20020042925 | Ebisu et al. | Apr 2002 | A1 |
20020054011 | Bruneau et al. | May 2002 | A1 |
20020057256 | Flack | May 2002 | A1 |
20020109668 | Rosenberg et al. | Aug 2002 | A1 |
20020109678 | Marmolin et al. | Aug 2002 | A1 |
20020140680 | Lu | Oct 2002 | A1 |
20020140740 | Chen | Oct 2002 | A1 |
20020163498 | Chang et al. | Nov 2002 | A1 |
20020180763 | Kung | Dec 2002 | A1 |
20020186257 | Cadiz et al. | Dec 2002 | A1 |
20030001869 | Nissen | Jan 2003 | A1 |
20030013492 | Bokhari et al. | Jan 2003 | A1 |
20030068053 | Chu | Apr 2003 | A1 |
20030086496 | Zhang et al. | May 2003 | A1 |
20030112269 | Lentz et al. | Jun 2003 | A1 |
20030117440 | Hellyar et al. | Jun 2003 | A1 |
20030122779 | Martin et al. | Jul 2003 | A1 |
20030128242 | Gordon | Jul 2003 | A1 |
20030151589 | Bensen et al. | Aug 2003 | A1 |
20030184574 | Phillips et al. | Oct 2003 | A1 |
20030189552 | Chuang et al. | Oct 2003 | A1 |
20030189647 | Kang | Oct 2003 | A1 |
20030206169 | Springer et al. | Nov 2003 | A1 |
20030222915 | Marion et al. | Dec 2003 | A1 |
20040015662 | Cummings | Jan 2004 | A1 |
20040021643 | Hoshino et al. | Feb 2004 | A1 |
20040056849 | Lohbihler et al. | Mar 2004 | A1 |
20040108995 | Hoshino et al. | Jun 2004 | A1 |
20040138849 | Schmidt et al. | Jul 2004 | A1 |
20040150631 | Fleck et al. | Aug 2004 | A1 |
20040150644 | Kincaid et al. | Aug 2004 | A1 |
20040155869 | Robinson et al. | Aug 2004 | A1 |
20040168131 | Blumberg | Aug 2004 | A1 |
20040174399 | Wu et al. | Sep 2004 | A1 |
20040219969 | Casey et al. | Nov 2004 | A1 |
20040267877 | Shiparo et al. | Dec 2004 | A1 |
20050012723 | Pallakoff | Jan 2005 | A1 |
20050039141 | Burke et al. | Feb 2005 | A1 |
20050064911 | Chen et al. | Mar 2005 | A1 |
20050066207 | Fleck et al. | Mar 2005 | A1 |
20050076256 | Fleck et al. | Apr 2005 | A1 |
20050078093 | Peterson, Jr. et al. | Apr 2005 | A1 |
20050091604 | Davis | Apr 2005 | A1 |
20050110769 | DaCosta et al. | May 2005 | A1 |
20050114785 | Finnigan et al. | May 2005 | A1 |
20050125742 | Grotjohn et al. | Jun 2005 | A1 |
20050134578 | Chambers et al. | Jun 2005 | A1 |
20050156892 | Grant | Jul 2005 | A1 |
20050183017 | Cain | Aug 2005 | A1 |
20050190280 | Haas et al. | Sep 2005 | A1 |
20050204295 | Voorhees et al. | Sep 2005 | A1 |
20050223338 | Partanen | Oct 2005 | A1 |
20050229112 | Clay et al. | Oct 2005 | A1 |
20050283726 | Lunati | Dec 2005 | A1 |
20050289476 | Tokkonen | Dec 2005 | A1 |
20060001650 | Robbins et al. | Jan 2006 | A1 |
20060001657 | Monney et al. | Jan 2006 | A1 |
20060012577 | Kyrola | Jan 2006 | A1 |
20060022955 | Kennedy | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060031776 | Glein et al. | Feb 2006 | A1 |
20060036945 | Radtke et al. | Feb 2006 | A1 |
20060036971 | Mendel et al. | Feb 2006 | A1 |
20060059436 | Nurmi | Mar 2006 | A1 |
20060067677 | Tokiwa et al. | Mar 2006 | A1 |
20060101347 | Runov et al. | May 2006 | A1 |
20060101581 | Blanchard et al. | May 2006 | A1 |
20060109252 | Kolmykov-Zotov et al. | May 2006 | A1 |
20060109256 | Grant et al. | May 2006 | A1 |
20060119586 | Grant et al. | Jun 2006 | A1 |
20060132455 | Rimas-Ribikauskas et al. | Jun 2006 | A1 |
20060132456 | Anson | Jun 2006 | A1 |
20060132457 | Rimas-Ribikauskas et al. | Jun 2006 | A1 |
20060136834 | Cao et al. | Jun 2006 | A1 |
20060136845 | Rimas-Ribikauskas et al. | Jun 2006 | A1 |
20060161861 | Holecek et al. | Jul 2006 | A1 |
20060161870 | Hotelling et al. | Jul 2006 | A1 |
20060190834 | Marcjan | Aug 2006 | A1 |
20060195438 | Galuten | Aug 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060210958 | Rimas-Ribikauskas et al. | Sep 2006 | A1 |
20060212812 | Simmons et al. | Sep 2006 | A1 |
20060213754 | Jarrett et al. | Sep 2006 | A1 |
20060224989 | Pettiross et al. | Oct 2006 | A1 |
20060233248 | Rynderman et al. | Oct 2006 | A1 |
20060236263 | Bathiche et al. | Oct 2006 | A1 |
20060274042 | Krah et al. | Dec 2006 | A1 |
20060274086 | Forstall et al. | Dec 2006 | A1 |
20060277469 | Chaudhri et al. | Dec 2006 | A1 |
20060282778 | Barsness et al. | Dec 2006 | A1 |
20060284858 | Rekimoto | Dec 2006 | A1 |
20060290681 | Ho et al. | Dec 2006 | A1 |
20070024595 | Baker et al. | Feb 2007 | A1 |
20070024646 | Saarinen et al. | Feb 2007 | A1 |
20070036456 | Hooper | Feb 2007 | A1 |
20070080953 | Lii | Apr 2007 | A1 |
20070113681 | Nishimura et al. | May 2007 | A1 |
20070120834 | Boillot | May 2007 | A1 |
20070120835 | Sato | May 2007 | A1 |
20070124699 | Michaels | May 2007 | A1 |
20070152959 | Peters | Jul 2007 | A1 |
20070157089 | Van Os et al. | Jul 2007 | A1 |
20070157173 | Klein et al. | Jul 2007 | A1 |
20070168369 | Bruns | Jul 2007 | A1 |
20070168890 | Zhao et al. | Jul 2007 | A1 |
20070176904 | Russo | Aug 2007 | A1 |
20070182999 | Anthony et al. | Aug 2007 | A1 |
20070186178 | Schiller | Aug 2007 | A1 |
20070200713 | Weber et al. | Aug 2007 | A1 |
20070222768 | Geurts et al. | Sep 2007 | A1 |
20070229455 | Martin et al. | Oct 2007 | A1 |
20070229464 | Hotelling et al. | Oct 2007 | A1 |
20070236450 | Colgate et al. | Oct 2007 | A1 |
20070236477 | Ryu et al. | Oct 2007 | A1 |
20070245241 | Bertram et al. | Oct 2007 | A1 |
20070257821 | Son et al. | Nov 2007 | A1 |
20070270182 | Gulliksson et al. | Nov 2007 | A1 |
20070288862 | Ording | Dec 2007 | A1 |
20070294295 | Finkelstein et al. | Dec 2007 | A1 |
20070299923 | Skelly et al. | Dec 2007 | A1 |
20080001924 | de los Reyes et al. | Jan 2008 | A1 |
20080010610 | Lim et al. | Jan 2008 | A1 |
20080024459 | Poupyrev et al. | Jan 2008 | A1 |
20080034306 | Ording | Feb 2008 | A1 |
20080034331 | Josephsoon et al. | Feb 2008 | A1 |
20080036743 | Westerman et al. | Feb 2008 | A1 |
20080051989 | Welsh | Feb 2008 | A1 |
20080052945 | Matas et al. | Mar 2008 | A1 |
20080066010 | Brodersen et al. | Mar 2008 | A1 |
20080094367 | Van De Ven et al. | Apr 2008 | A1 |
20080094368 | Ording et al. | Apr 2008 | A1 |
20080094398 | Ng et al. | Apr 2008 | A1 |
20080106523 | Conrad | May 2008 | A1 |
20080109753 | Karstens | May 2008 | A1 |
20080136790 | Hio | Jun 2008 | A1 |
20080155415 | Yoon et al. | Jun 2008 | A1 |
20080163119 | Kim et al. | Jul 2008 | A1 |
20080165141 | Christie | Jul 2008 | A1 |
20080165160 | Kocienda et al. | Jul 2008 | A1 |
20080168379 | Forstall et al. | Jul 2008 | A1 |
20080168395 | Ording et al. | Jul 2008 | A1 |
20080168403 | Westerman et al. | Jul 2008 | A1 |
20080168404 | Ording | Jul 2008 | A1 |
20080189605 | Kay et al. | Aug 2008 | A1 |
20080202824 | Philipp et al. | Aug 2008 | A1 |
20080204427 | Heesemans et al. | Aug 2008 | A1 |
20080219493 | Tadmor | Sep 2008 | A1 |
20080222569 | Champion et al. | Sep 2008 | A1 |
20080225007 | Nakadaira et al. | Sep 2008 | A1 |
20080244448 | Goering et al. | Oct 2008 | A1 |
20080259046 | Carsanaro | Oct 2008 | A1 |
20080263452 | Tomkins | Oct 2008 | A1 |
20080284866 | Mizutani | Nov 2008 | A1 |
20080294984 | Ramsay et al. | Nov 2008 | A1 |
20080297475 | Woolf et al. | Dec 2008 | A1 |
20080303795 | Lowles et al. | Dec 2008 | A1 |
20080303799 | Schwesig et al. | Dec 2008 | A1 |
20080307335 | Chaudhri et al. | Dec 2008 | A1 |
20080307359 | Louch et al. | Dec 2008 | A1 |
20080317378 | Steinberg et al. | Dec 2008 | A1 |
20080320419 | Matas et al. | Dec 2008 | A1 |
20090007017 | Anzures et al. | Jan 2009 | A1 |
20090016645 | Sako et al. | Jan 2009 | A1 |
20090028359 | Terada et al. | Jan 2009 | A1 |
20090046110 | Sadler et al. | Feb 2009 | A1 |
20090058828 | Jiang et al. | Mar 2009 | A1 |
20090061837 | Chaudhri et al. | Mar 2009 | A1 |
20090064031 | Bull et al. | Mar 2009 | A1 |
20090066668 | Kim et al. | Mar 2009 | A1 |
20090073118 | Yamaji et al. | Mar 2009 | A1 |
20090075738 | Pearce | Mar 2009 | A1 |
20090083665 | Anttila et al. | Mar 2009 | A1 |
20090085878 | Heubel et al. | Apr 2009 | A1 |
20090085881 | Keam | Apr 2009 | A1 |
20090085886 | Huang et al. | Apr 2009 | A1 |
20090089293 | Garritano et al. | Apr 2009 | A1 |
20090100343 | Lee et al. | Apr 2009 | A1 |
20090102804 | Wong et al. | Apr 2009 | A1 |
20090102805 | Meijer et al. | Apr 2009 | A1 |
20090140985 | Liu | Jun 2009 | A1 |
20090150775 | Miyazaki et al. | Jun 2009 | A1 |
20090158198 | Hayter et al. | Jun 2009 | A1 |
20090160793 | Rekimoto | Jun 2009 | A1 |
20090160814 | Li et al. | Jun 2009 | A1 |
20090164905 | Ko | Jun 2009 | A1 |
20090167507 | Maenpaa | Jul 2009 | A1 |
20090167508 | Fadell et al. | Jul 2009 | A1 |
20090167509 | Fadell et al. | Jul 2009 | A1 |
20090167704 | Terlizzi et al. | Jul 2009 | A1 |
20090169061 | Anderson et al. | Jul 2009 | A1 |
20090178008 | Herz et al. | Jul 2009 | A1 |
20090187824 | Hinckley et al. | Jul 2009 | A1 |
20090189866 | Haffenden et al. | Jul 2009 | A1 |
20090195959 | Ladouceur et al. | Aug 2009 | A1 |
20090198767 | Jakobson et al. | Aug 2009 | A1 |
20090201260 | Lee et al. | Aug 2009 | A1 |
20090219294 | Young et al. | Sep 2009 | A1 |
20090225037 | Williamson et al. | Sep 2009 | A1 |
20090228842 | Westerman et al. | Sep 2009 | A1 |
20090237374 | Li et al. | Sep 2009 | A1 |
20090244357 | Huang | Oct 2009 | A1 |
20090247112 | Lundy et al. | Oct 2009 | A1 |
20090247230 | Lundy et al. | Oct 2009 | A1 |
20090251410 | Mori et al. | Oct 2009 | A1 |
20090251421 | Bloebaum | Oct 2009 | A1 |
20090256947 | Ciurea et al. | Oct 2009 | A1 |
20090259975 | Asai et al. | Oct 2009 | A1 |
20090267906 | Schroderus | Oct 2009 | A1 |
20090276730 | Aybes et al. | Nov 2009 | A1 |
20090280860 | Dahlke | Nov 2009 | A1 |
20090282360 | Park et al. | Nov 2009 | A1 |
20090284478 | De La Torre Baltierra et al. | Nov 2009 | A1 |
20090288032 | Chang et al. | Nov 2009 | A1 |
20090289779 | Braun et al. | Nov 2009 | A1 |
20090293009 | Meserth et al. | Nov 2009 | A1 |
20090295713 | Piot et al. | Dec 2009 | A1 |
20090295739 | Nagara | Dec 2009 | A1 |
20090295943 | Kim et al. | Dec 2009 | A1 |
20090298546 | Kim et al. | Dec 2009 | A1 |
20090303187 | Pallakoff | Dec 2009 | A1 |
20090307583 | Tonisson | Dec 2009 | A1 |
20090307633 | Haughay, Jr. et al. | Dec 2009 | A1 |
20090322893 | Stallings et al. | Dec 2009 | A1 |
20090325566 | Bell et al. | Dec 2009 | A1 |
20100007926 | Imaizumi et al. | Jan 2010 | A1 |
20100011304 | Van Os | Jan 2010 | A1 |
20100013613 | Weston | Jan 2010 | A1 |
20100013777 | Baudisch et al. | Jan 2010 | A1 |
20100017710 | Kim et al. | Jan 2010 | A1 |
20100020035 | Ryu et al. | Jan 2010 | A1 |
20100020221 | Tupman et al. | Jan 2010 | A1 |
20100026640 | Kim et al. | Feb 2010 | A1 |
20100026647 | Abe et al. | Feb 2010 | A1 |
20100039446 | Hillis et al. | Feb 2010 | A1 |
20100044121 | Simon et al. | Feb 2010 | A1 |
20100045619 | Birnbaum et al. | Feb 2010 | A1 |
20100057235 | Wang et al. | Mar 2010 | A1 |
20100058231 | Duarte et al. | Mar 2010 | A1 |
20100060548 | Choi et al. | Mar 2010 | A1 |
20100060605 | Rimas-Ribikauskas et al. | Mar 2010 | A1 |
20100061637 | Mochizuki et al. | Mar 2010 | A1 |
20100062803 | Yun et al. | Mar 2010 | A1 |
20100070908 | Mori et al. | Mar 2010 | A1 |
20100073329 | Raman et al. | Mar 2010 | A1 |
20100083116 | Akifusa et al. | Apr 2010 | A1 |
20100085302 | Fairweather et al. | Apr 2010 | A1 |
20100085314 | Kwok | Apr 2010 | A1 |
20100085317 | Park et al. | Apr 2010 | A1 |
20100088596 | Griffin et al. | Apr 2010 | A1 |
20100088654 | Henhoeffer | Apr 2010 | A1 |
20100110082 | Myrick et al. | May 2010 | A1 |
20100111434 | Madden | May 2010 | A1 |
20100127983 | Irani et al. | May 2010 | A1 |
20100128002 | Stacy et al. | May 2010 | A1 |
20100138776 | Korhonen | Jun 2010 | A1 |
20100148999 | Casparian et al. | Jun 2010 | A1 |
20100149096 | Migos et al. | Jun 2010 | A1 |
20100153879 | Rimas-Ribikauskas et al. | Jun 2010 | A1 |
20100156807 | Stallings et al. | Jun 2010 | A1 |
20100156813 | Duarte et al. | Jun 2010 | A1 |
20100156818 | Burrough et al. | Jun 2010 | A1 |
20100156823 | Paleczny et al. | Jun 2010 | A1 |
20100156825 | Sohn et al. | Jun 2010 | A1 |
20100159995 | Stallings et al. | Jun 2010 | A1 |
20100171713 | Kwok et al. | Jul 2010 | A1 |
20100175023 | Gatlin et al. | Jul 2010 | A1 |
20100180225 | Chiba et al. | Jul 2010 | A1 |
20100199227 | Xiao et al. | Aug 2010 | A1 |
20100211872 | Rolston et al. | Aug 2010 | A1 |
20100214135 | Bathiche et al. | Aug 2010 | A1 |
20100214239 | Wu | Aug 2010 | A1 |
20100218663 | Choi | Sep 2010 | A1 |
20100220065 | Ma | Sep 2010 | A1 |
20100225456 | Eldering | Sep 2010 | A1 |
20100225604 | Homma et al. | Sep 2010 | A1 |
20100231533 | Chaudhri | Sep 2010 | A1 |
20100231534 | Chaudhri et al. | Sep 2010 | A1 |
20100235118 | Moore et al. | Sep 2010 | A1 |
20100235726 | Ording et al. | Sep 2010 | A1 |
20100235733 | Drislane et al. | Sep 2010 | A1 |
20100235746 | Anzures | Sep 2010 | A1 |
20100240415 | Kim et al. | Sep 2010 | A1 |
20100241955 | Price et al. | Sep 2010 | A1 |
20100248787 | Smuga et al. | Sep 2010 | A1 |
20100251168 | Fujita et al. | Sep 2010 | A1 |
20100259500 | Kennedy | Oct 2010 | A1 |
20100271312 | Alameh et al. | Oct 2010 | A1 |
20100271500 | Park et al. | Oct 2010 | A1 |
20100277419 | Ganey et al. | Nov 2010 | A1 |
20100277496 | Kawanishi et al. | Nov 2010 | A1 |
20100281379 | Meaney et al. | Nov 2010 | A1 |
20100281385 | Meaney et al. | Nov 2010 | A1 |
20100287486 | Coddington | Nov 2010 | A1 |
20100289807 | Yu et al. | Nov 2010 | A1 |
20100293460 | Budelli | Nov 2010 | A1 |
20100295789 | Shin et al. | Nov 2010 | A1 |
20100295805 | Shin et al. | Nov 2010 | A1 |
20100302177 | Kim et al. | Dec 2010 | A1 |
20100302179 | Ahn et al. | Dec 2010 | A1 |
20100306702 | Warner | Dec 2010 | A1 |
20100308983 | Conte et al. | Dec 2010 | A1 |
20100309147 | Fleizach et al. | Dec 2010 | A1 |
20100313050 | Harrat et al. | Dec 2010 | A1 |
20100313124 | Privault et al. | Dec 2010 | A1 |
20100313156 | Louch et al. | Dec 2010 | A1 |
20100313158 | Lee et al. | Dec 2010 | A1 |
20100313166 | Nakayama et al. | Dec 2010 | A1 |
20100315417 | Cho et al. | Dec 2010 | A1 |
20100315438 | Horodezky et al. | Dec 2010 | A1 |
20100317410 | Song et al. | Dec 2010 | A1 |
20100321301 | Casparian et al. | Dec 2010 | A1 |
20100321312 | Han et al. | Dec 2010 | A1 |
20100325578 | Mital et al. | Dec 2010 | A1 |
20100328229 | Weber et al. | Dec 2010 | A1 |
20110010626 | Fino et al. | Jan 2011 | A1 |
20110012851 | Ciesla et al. | Jan 2011 | A1 |
20110018695 | Bells et al. | Jan 2011 | A1 |
20110026099 | Kwon et al. | Feb 2011 | A1 |
20110035145 | Yamasaki | Feb 2011 | A1 |
20110037706 | Pasquero et al. | Feb 2011 | A1 |
20110038552 | Lam | Feb 2011 | A1 |
20110039602 | McNamara et al. | Feb 2011 | A1 |
20110047368 | Sundaramurthy et al. | Feb 2011 | A1 |
20110050576 | Forutanpour et al. | Mar 2011 | A1 |
20110050588 | Li et al. | Mar 2011 | A1 |
20110050591 | Kim et al. | Mar 2011 | A1 |
20110050594 | Kim et al. | Mar 2011 | A1 |
20110050628 | Homma et al. | Mar 2011 | A1 |
20110050629 | Homma et al. | Mar 2011 | A1 |
20110050630 | Ikeda | Mar 2011 | A1 |
20110050653 | Miyazawa et al. | Mar 2011 | A1 |
20110050687 | Alyshev et al. | Mar 2011 | A1 |
20110054837 | Ikeda | Mar 2011 | A1 |
20110055135 | Dawson et al. | Mar 2011 | A1 |
20110055741 | Jeon et al. | Mar 2011 | A1 |
20110057886 | Ng et al. | Mar 2011 | A1 |
20110057903 | Yamano et al. | Mar 2011 | A1 |
20110061021 | Kang et al. | Mar 2011 | A1 |
20110061029 | Yeh et al. | Mar 2011 | A1 |
20110063236 | Arai et al. | Mar 2011 | A1 |
20110063248 | Yoon | Mar 2011 | A1 |
20110069012 | Martensson | Mar 2011 | A1 |
20110069016 | Victor | Mar 2011 | A1 |
20110074697 | Rapp et al. | Mar 2011 | A1 |
20110080349 | Holbein et al. | Apr 2011 | A1 |
20110080350 | Almalki et al. | Apr 2011 | A1 |
20110080367 | Marchand et al. | Apr 2011 | A1 |
20110084910 | Almalki et al. | Apr 2011 | A1 |
20110087982 | McCann et al. | Apr 2011 | A1 |
20110087983 | Shim | Apr 2011 | A1 |
20110093815 | Gobeil | Apr 2011 | A1 |
20110093817 | Song et al. | Apr 2011 | A1 |
20110102829 | Jourdan | May 2011 | A1 |
20110107272 | Aguilar | May 2011 | A1 |
20110109617 | Snook et al. | May 2011 | A1 |
20110116716 | Kwon et al. | May 2011 | A1 |
20110119610 | Hackborn et al. | May 2011 | A1 |
20110126139 | Jeong et al. | May 2011 | A1 |
20110138295 | Momchilov et al. | Jun 2011 | A1 |
20110141031 | McCullough et al. | Jun 2011 | A1 |
20110141052 | Bernstein et al. | Jun 2011 | A1 |
20110144777 | Firkins et al. | Jun 2011 | A1 |
20110145752 | Fagans | Jun 2011 | A1 |
20110145753 | Prakash | Jun 2011 | A1 |
20110145759 | Leffert et al. | Jun 2011 | A1 |
20110145764 | Higuchi et al. | Jun 2011 | A1 |
20110149138 | Watkins | Jun 2011 | A1 |
20110154199 | Maffitt et al. | Jun 2011 | A1 |
20110163971 | Wagner et al. | Jul 2011 | A1 |
20110163978 | Park et al. | Jul 2011 | A1 |
20110169765 | Aono | Jul 2011 | A1 |
20110175826 | Moore et al. | Jul 2011 | A1 |
20110175832 | Miyazawa et al. | Jul 2011 | A1 |
20110181521 | Reid et al. | Jul 2011 | A1 |
20110181526 | Shaffer et al. | Jul 2011 | A1 |
20110181538 | Aono | Jul 2011 | A1 |
20110181751 | Mizumori | Jul 2011 | A1 |
20110185299 | Hinckley et al. | Jul 2011 | A1 |
20110185300 | Hinckley et al. | Jul 2011 | A1 |
20110185316 | Reid et al. | Jul 2011 | A1 |
20110191675 | Kauranen | Aug 2011 | A1 |
20110193788 | King et al. | Aug 2011 | A1 |
20110193809 | Walley et al. | Aug 2011 | A1 |
20110193881 | Rydenhag | Aug 2011 | A1 |
20110197160 | Kim et al. | Aug 2011 | A1 |
20110201387 | Paek et al. | Aug 2011 | A1 |
20110202834 | Mandryk et al. | Aug 2011 | A1 |
20110202853 | Mujkic | Aug 2011 | A1 |
20110202879 | Stovicek et al. | Aug 2011 | A1 |
20110205163 | Hinckley et al. | Aug 2011 | A1 |
20110209088 | Hinckley et al. | Aug 2011 | A1 |
20110209093 | Hinckley et al. | Aug 2011 | A1 |
20110209097 | Hinckley et al. | Aug 2011 | A1 |
20110209099 | Hinckley et al. | Aug 2011 | A1 |
20110209104 | Hinckley et al. | Aug 2011 | A1 |
20110210834 | Pasquero et al. | Sep 2011 | A1 |
20110210926 | Pasquero et al. | Sep 2011 | A1 |
20110210931 | Shai | Sep 2011 | A1 |
20110215914 | Edwards | Sep 2011 | A1 |
20110221684 | Rydenhag | Sep 2011 | A1 |
20110221776 | Shimotani et al. | Sep 2011 | A1 |
20110231789 | Bukurak et al. | Sep 2011 | A1 |
20110234639 | Shimotani et al. | Sep 2011 | A1 |
20110238690 | Arrasvouri et al. | Sep 2011 | A1 |
20110239110 | Garrett et al. | Sep 2011 | A1 |
20110242029 | Kasahara et al. | Oct 2011 | A1 |
20110246801 | Seethaler et al. | Oct 2011 | A1 |
20110246877 | Kwak et al. | Oct 2011 | A1 |
20110248916 | Griffin et al. | Oct 2011 | A1 |
20110248942 | Yana et al. | Oct 2011 | A1 |
20110248948 | Griffin et al. | Oct 2011 | A1 |
20110252346 | Chaudhri | Oct 2011 | A1 |
20110252357 | Chaudhri | Oct 2011 | A1 |
20110252362 | Cho et al. | Oct 2011 | A1 |
20110252380 | Chaudhri | Oct 2011 | A1 |
20110258537 | Rives et al. | Oct 2011 | A1 |
20110260994 | Saynac et al. | Oct 2011 | A1 |
20110263298 | Park | Oct 2011 | A1 |
20110267530 | Chun | Nov 2011 | A1 |
20110279380 | Weber et al. | Nov 2011 | A1 |
20110279381 | Tong et al. | Nov 2011 | A1 |
20110279395 | Kuwabara et al. | Nov 2011 | A1 |
20110279852 | Oda et al. | Nov 2011 | A1 |
20110285656 | Yaksick et al. | Nov 2011 | A1 |
20110285659 | Kuwabara et al. | Nov 2011 | A1 |
20110291945 | Ewing, Jr. et al. | Dec 2011 | A1 |
20110291951 | Tong | Dec 2011 | A1 |
20110296334 | Ryu et al. | Dec 2011 | A1 |
20110296351 | Ewing, Jr. et al. | Dec 2011 | A1 |
20110304559 | Pasquero | Dec 2011 | A1 |
20110304577 | Brown et al. | Dec 2011 | A1 |
20110310049 | Homma et al. | Dec 2011 | A1 |
20110319136 | Labowicz et al. | Dec 2011 | A1 |
20120001856 | Davidson | Jan 2012 | A1 |
20120005622 | Park et al. | Jan 2012 | A1 |
20120007857 | Noda et al. | Jan 2012 | A1 |
20120011437 | James et al. | Jan 2012 | A1 |
20120013541 | Boka et al. | Jan 2012 | A1 |
20120013542 | Shenfield | Jan 2012 | A1 |
20120013607 | Lee | Jan 2012 | A1 |
20120019448 | Pitkanen et al. | Jan 2012 | A1 |
20120026110 | Yamano | Feb 2012 | A1 |
20120030623 | Hoellwarth | Feb 2012 | A1 |
20120032979 | Blow et al. | Feb 2012 | A1 |
20120036441 | Basir et al. | Feb 2012 | A1 |
20120036556 | LeBeau et al. | Feb 2012 | A1 |
20120038580 | Sasaki | Feb 2012 | A1 |
20120044153 | Arrasvouri et al. | Feb 2012 | A1 |
20120047380 | Nurmi | Feb 2012 | A1 |
20120056837 | Park et al. | Mar 2012 | A1 |
20120056848 | Yamano et al. | Mar 2012 | A1 |
20120060123 | Smith | Mar 2012 | A1 |
20120062470 | Chang | Mar 2012 | A1 |
20120062564 | Miyashita et al. | Mar 2012 | A1 |
20120062604 | Lobo | Mar 2012 | A1 |
20120062732 | Marman et al. | Mar 2012 | A1 |
20120066630 | Kim et al. | Mar 2012 | A1 |
20120066648 | Rolleston et al. | Mar 2012 | A1 |
20120081326 | Heubel et al. | Apr 2012 | A1 |
20120081375 | Robert et al. | Apr 2012 | A1 |
20120084644 | Robert et al. | Apr 2012 | A1 |
20120084689 | Ledet et al. | Apr 2012 | A1 |
20120084713 | Desai et al. | Apr 2012 | A1 |
20120089932 | Kano et al. | Apr 2012 | A1 |
20120089942 | Gammon | Apr 2012 | A1 |
20120089951 | Cassidy | Apr 2012 | A1 |
20120096393 | Shim et al. | Apr 2012 | A1 |
20120096400 | Cho | Apr 2012 | A1 |
20120098780 | Fujisawa et al. | Apr 2012 | A1 |
20120102437 | Worley et al. | Apr 2012 | A1 |
20120105358 | Momeyer et al. | May 2012 | A1 |
20120105367 | Son et al. | May 2012 | A1 |
20120106852 | Khawand et al. | May 2012 | A1 |
20120113007 | Koch et al. | May 2012 | A1 |
20120113023 | Koch et al. | May 2012 | A1 |
20120126962 | Ujii et al. | May 2012 | A1 |
20120131495 | Goossens et al. | May 2012 | A1 |
20120139844 | Ramstein et al. | Jun 2012 | A1 |
20120139864 | Sleeman et al. | Jun 2012 | A1 |
20120144330 | Flint | Jun 2012 | A1 |
20120146945 | Miyazawa et al. | Jun 2012 | A1 |
20120147052 | Homma et al. | Jun 2012 | A1 |
20120154303 | Lazaridis et al. | Jun 2012 | A1 |
20120154328 | Kono | Jun 2012 | A1 |
20120158629 | Hinckley et al. | Jun 2012 | A1 |
20120159380 | Kocienda et al. | Jun 2012 | A1 |
20120169646 | Berkes et al. | Jul 2012 | A1 |
20120169716 | Mihara | Jul 2012 | A1 |
20120176403 | Cha et al. | Jul 2012 | A1 |
20120179967 | Hayes | Jul 2012 | A1 |
20120180001 | Griffen et al. | Jul 2012 | A1 |
20120182226 | Tuli | Jul 2012 | A1 |
20120183271 | Forutanpour et al. | Jul 2012 | A1 |
20120192108 | Kolb | Jul 2012 | A1 |
20120200528 | Ciesla et al. | Aug 2012 | A1 |
20120206393 | Hillis et al. | Aug 2012 | A1 |
20120216114 | Privault et al. | Aug 2012 | A1 |
20120218203 | Kanki | Aug 2012 | A1 |
20120235912 | Laubach | Sep 2012 | A1 |
20120236037 | Lessing et al. | Sep 2012 | A1 |
20120240044 | Johnson et al. | Sep 2012 | A1 |
20120242584 | Tuli | Sep 2012 | A1 |
20120245922 | Koslova et al. | Sep 2012 | A1 |
20120249575 | Krolczyk et al. | Oct 2012 | A1 |
20120249853 | Krolczyk et al. | Oct 2012 | A1 |
20120250598 | Lonnfors et al. | Oct 2012 | A1 |
20120256829 | Dodge | Oct 2012 | A1 |
20120256846 | Mak | Oct 2012 | A1 |
20120256847 | Mak et al. | Oct 2012 | A1 |
20120256857 | Mak | Oct 2012 | A1 |
20120257071 | Prentice | Oct 2012 | A1 |
20120260219 | Piccolotto | Oct 2012 | A1 |
20120260220 | Griffin | Oct 2012 | A1 |
20120274578 | Snow et al. | Nov 2012 | A1 |
20120274591 | Rimas-Ribikauskas et al. | Nov 2012 | A1 |
20120274662 | Kim et al. | Nov 2012 | A1 |
20120278744 | Kozitsyn et al. | Nov 2012 | A1 |
20120284673 | Lamb et al. | Nov 2012 | A1 |
20120293449 | Dietz | Nov 2012 | A1 |
20120293551 | Momeyer et al. | Nov 2012 | A1 |
20120297041 | Momchilov | Nov 2012 | A1 |
20120303548 | Johnson et al. | Nov 2012 | A1 |
20120304108 | Jarrett et al. | Nov 2012 | A1 |
20120304132 | Sareen et al. | Nov 2012 | A1 |
20120304133 | Nan et al. | Nov 2012 | A1 |
20120306632 | Fleizach et al. | Dec 2012 | A1 |
20120306748 | Fleizach et al. | Dec 2012 | A1 |
20120306764 | Kamibeppu | Dec 2012 | A1 |
20120306765 | Moore | Dec 2012 | A1 |
20120306766 | Moore | Dec 2012 | A1 |
20120306772 | Tan et al. | Dec 2012 | A1 |
20120306778 | Weeldreyer et al. | Dec 2012 | A1 |
20120306927 | Lee et al. | Dec 2012 | A1 |
20120311429 | Decker et al. | Dec 2012 | A1 |
20120311437 | Weeldreyer et al. | Dec 2012 | A1 |
20120311498 | Kluttz et al. | Dec 2012 | A1 |
20120311504 | van Os et al. | Dec 2012 | A1 |
20130002561 | Wakasa | Jan 2013 | A1 |
20130014057 | Reinpoldt et al. | Jan 2013 | A1 |
20130016042 | Makinen et al. | Jan 2013 | A1 |
20130016056 | Shinozaki et al. | Jan 2013 | A1 |
20130016122 | Bhatt et al. | Jan 2013 | A1 |
20130019158 | Watanabe | Jan 2013 | A1 |
20130019174 | Gil et al. | Jan 2013 | A1 |
20130031514 | Gabbert | Jan 2013 | A1 |
20130036386 | Park et al. | Feb 2013 | A1 |
20130042199 | Fong et al. | Feb 2013 | A1 |
20130044062 | Bose et al. | Feb 2013 | A1 |
20130047100 | Kroeger et al. | Feb 2013 | A1 |
20130050131 | Lee et al. | Feb 2013 | A1 |
20130050143 | Kim et al. | Feb 2013 | A1 |
20130061172 | Huang et al. | Mar 2013 | A1 |
20130063364 | Moore | Mar 2013 | A1 |
20130063389 | Moore | Mar 2013 | A1 |
20130067383 | Kataoka et al. | Mar 2013 | A1 |
20130067513 | Takami | Mar 2013 | A1 |
20130067527 | Ashbook et al. | Mar 2013 | A1 |
20130069889 | Pearce et al. | Mar 2013 | A1 |
20130069991 | Davidson | Mar 2013 | A1 |
20130074003 | Dolenc | Mar 2013 | A1 |
20130076649 | Myers et al. | Mar 2013 | A1 |
20130076676 | Gan | Mar 2013 | A1 |
20130077804 | Glebe et al. | Mar 2013 | A1 |
20130082824 | Colley | Apr 2013 | A1 |
20130082937 | Liu et al. | Apr 2013 | A1 |
20130086056 | Dyor et al. | Apr 2013 | A1 |
20130093691 | Moosavi | Apr 2013 | A1 |
20130093764 | Andersson et al. | Apr 2013 | A1 |
20130097520 | Lewin et al. | Apr 2013 | A1 |
20130097521 | Lewin et al. | Apr 2013 | A1 |
20130097534 | Lewin et al. | Apr 2013 | A1 |
20130097539 | Mansson et al. | Apr 2013 | A1 |
20130097556 | Louch | Apr 2013 | A1 |
20130097562 | Kermoian et al. | Apr 2013 | A1 |
20130102366 | Teng et al. | Apr 2013 | A1 |
20130111345 | Newman et al. | May 2013 | A1 |
20130111378 | Newman et al. | May 2013 | A1 |
20130111398 | Lu et al. | May 2013 | A1 |
20130111415 | Newman et al. | May 2013 | A1 |
20130111579 | Newman et al. | May 2013 | A1 |
20130113715 | Grant et al. | May 2013 | A1 |
20130113720 | Van Eerd et al. | May 2013 | A1 |
20130113760 | Gossweiler, III et al. | May 2013 | A1 |
20130120278 | Cantrell | May 2013 | A1 |
20130120280 | Kukulski | May 2013 | A1 |
20130120295 | Kim et al. | May 2013 | A1 |
20130120306 | Furukawa | May 2013 | A1 |
20130125039 | Murata | May 2013 | A1 |
20130127755 | Lynn et al. | May 2013 | A1 |
20130135243 | Hirsch et al. | May 2013 | A1 |
20130135288 | King et al. | May 2013 | A1 |
20130135499 | Song | May 2013 | A1 |
20130141364 | Lynn et al. | Jun 2013 | A1 |
20130141396 | Lynn et al. | Jun 2013 | A1 |
20130145313 | Roh et al. | Jun 2013 | A1 |
20130154948 | Schediwy et al. | Jun 2013 | A1 |
20130154959 | Lindsay et al. | Jun 2013 | A1 |
20130155018 | Dagdeviren | Jun 2013 | A1 |
20130159893 | Lewis et al. | Jun 2013 | A1 |
20130159930 | Paretti et al. | Jun 2013 | A1 |
20130162603 | Peng et al. | Jun 2013 | A1 |
20130162667 | Eskolin et al. | Jun 2013 | A1 |
20130169549 | Seymour et al. | Jul 2013 | A1 |
20130174049 | Townsend et al. | Jul 2013 | A1 |
20130174089 | Ki | Jul 2013 | A1 |
20130174094 | Heo et al. | Jul 2013 | A1 |
20130174179 | Park et al. | Jul 2013 | A1 |
20130179840 | Fisher et al. | Jul 2013 | A1 |
20130185642 | Gammons | Jul 2013 | A1 |
20130187869 | Rydenhag et al. | Jul 2013 | A1 |
20130191791 | Rydenhag et al. | Jul 2013 | A1 |
20130194217 | Lee et al. | Aug 2013 | A1 |
20130194480 | Fukata et al. | Aug 2013 | A1 |
20130198690 | Barsoum et al. | Aug 2013 | A1 |
20130201139 | Tanaka | Aug 2013 | A1 |
20130212515 | Eleftheriou | Aug 2013 | A1 |
20130212541 | Dolenc et al. | Aug 2013 | A1 |
20130215079 | Johnson et al. | Aug 2013 | A1 |
20130222274 | Mori et al. | Aug 2013 | A1 |
20130222323 | McKenzie | Aug 2013 | A1 |
20130222333 | Miles et al. | Aug 2013 | A1 |
20130222671 | Tseng et al. | Aug 2013 | A1 |
20130227413 | Thorsander et al. | Aug 2013 | A1 |
20130227419 | Lee et al. | Aug 2013 | A1 |
20130227450 | Na et al. | Aug 2013 | A1 |
20130228023 | Drasnin et al. | Sep 2013 | A1 |
20130232353 | Belesiu et al. | Sep 2013 | A1 |
20130232402 | Lu et al. | Sep 2013 | A1 |
20130234929 | Libin | Sep 2013 | A1 |
20130239057 | Ubillos et al. | Sep 2013 | A1 |
20130246954 | Gray et al. | Sep 2013 | A1 |
20130249814 | Zeng | Sep 2013 | A1 |
20130257793 | Zeliff et al. | Oct 2013 | A1 |
20130257817 | Yliaho | Oct 2013 | A1 |
20130265246 | Tae | Oct 2013 | A1 |
20130265452 | Shin et al. | Oct 2013 | A1 |
20130268875 | Han et al. | Oct 2013 | A1 |
20130271395 | Tsai et al. | Oct 2013 | A1 |
20130275422 | Silber et al. | Oct 2013 | A1 |
20130278520 | Weng et al. | Oct 2013 | A1 |
20130293496 | Takamoto | Nov 2013 | A1 |
20130305184 | Kim et al. | Nov 2013 | A1 |
20130307790 | Konttori et al. | Nov 2013 | A1 |
20130307792 | Andres et al. | Nov 2013 | A1 |
20130314359 | Sudou | Nov 2013 | A1 |
20130314434 | Shetterly et al. | Nov 2013 | A1 |
20130321340 | Seo et al. | Dec 2013 | A1 |
20130321457 | Bauermeister et al. | Dec 2013 | A1 |
20130325342 | Pylappan et al. | Dec 2013 | A1 |
20130326420 | Liu et al. | Dec 2013 | A1 |
20130326421 | Jo | Dec 2013 | A1 |
20130326583 | Freihold et al. | Dec 2013 | A1 |
20130328770 | Parham | Dec 2013 | A1 |
20130328793 | Chowdhury | Dec 2013 | A1 |
20130328796 | Al-Dahle et al. | Dec 2013 | A1 |
20130332836 | Cho | Dec 2013 | A1 |
20130332892 | Matsuki | Dec 2013 | A1 |
20130335373 | Tomiyasu | Dec 2013 | A1 |
20130338847 | Lisseman et al. | Dec 2013 | A1 |
20130339001 | Craswell et al. | Dec 2013 | A1 |
20130339909 | Ha | Dec 2013 | A1 |
20140002355 | Lee et al. | Jan 2014 | A1 |
20140002374 | Hunt et al. | Jan 2014 | A1 |
20140002386 | Rosenberg et al. | Jan 2014 | A1 |
20140013271 | Moore et al. | Jan 2014 | A1 |
20140024414 | Fuji | Jan 2014 | A1 |
20140026098 | Gilman | Jan 2014 | A1 |
20140026099 | Andersson Reimer et al. | Jan 2014 | A1 |
20140028554 | De Los Reyes et al. | Jan 2014 | A1 |
20140028571 | St. Clair | Jan 2014 | A1 |
20140028601 | Moore | Jan 2014 | A1 |
20140028606 | Giannetta | Jan 2014 | A1 |
20140035804 | Dearman | Feb 2014 | A1 |
20140035826 | Frazier et al. | Feb 2014 | A1 |
20140049491 | Nagar et al. | Feb 2014 | A1 |
20140053116 | Smith et al. | Feb 2014 | A1 |
20140055367 | Dearman et al. | Feb 2014 | A1 |
20140055377 | Kim | Feb 2014 | A1 |
20140059460 | Ho | Feb 2014 | A1 |
20140059485 | Lehrian et al. | Feb 2014 | A1 |
20140063316 | Lee et al. | Mar 2014 | A1 |
20140063541 | Yamazaki | Mar 2014 | A1 |
20140067293 | Parivar et al. | Mar 2014 | A1 |
20140068475 | Li et al. | Mar 2014 | A1 |
20140071060 | Santos-Gomez | Mar 2014 | A1 |
20140072281 | Cho et al. | Mar 2014 | A1 |
20140072283 | Cho et al. | Mar 2014 | A1 |
20140078318 | Alameh | Mar 2014 | A1 |
20140078343 | Dai et al. | Mar 2014 | A1 |
20140082536 | Costa et al. | Mar 2014 | A1 |
20140092025 | Pala et al. | Apr 2014 | A1 |
20140092030 | Van der Velden | Apr 2014 | A1 |
20140092031 | Schwartz et al. | Apr 2014 | A1 |
20140108936 | Khosropour et al. | Apr 2014 | A1 |
20140109016 | Ouyang et al. | Apr 2014 | A1 |
20140111456 | Kashiwa et al. | Apr 2014 | A1 |
20140111480 | Kim et al. | Apr 2014 | A1 |
20140111670 | Lord et al. | Apr 2014 | A1 |
20140118268 | Kuscher | May 2014 | A1 |
20140123080 | Gan | May 2014 | A1 |
20140139456 | Wigdor et al. | May 2014 | A1 |
20140139471 | Matsuki | May 2014 | A1 |
20140145970 | Cho | May 2014 | A1 |
20140152581 | Case et al. | Jun 2014 | A1 |
20140157203 | Jeon et al. | Jun 2014 | A1 |
20140160063 | Yairi et al. | Jun 2014 | A1 |
20140160073 | Matsuki | Jun 2014 | A1 |
20140164955 | Thiruvidam et al. | Jun 2014 | A1 |
20140164966 | Kim et al. | Jun 2014 | A1 |
20140165006 | Chaudhri et al. | Jun 2014 | A1 |
20140168093 | Lawrence | Jun 2014 | A1 |
20140168110 | Araki et al. | Jun 2014 | A1 |
20140168153 | Deichmann et al. | Jun 2014 | A1 |
20140173517 | Chaudhri | Jun 2014 | A1 |
20140179377 | Song et al. | Jun 2014 | A1 |
20140184526 | Cho | Jul 2014 | A1 |
20140201660 | Clausen et al. | Jul 2014 | A1 |
20140208271 | Bell et al. | Jul 2014 | A1 |
20140210758 | Park et al. | Jul 2014 | A1 |
20140210760 | Aberg et al. | Jul 2014 | A1 |
20140210798 | Wilson | Jul 2014 | A1 |
20140223376 | Tarvainen et al. | Aug 2014 | A1 |
20140223381 | Huang et al. | Aug 2014 | A1 |
20140237408 | Ohlsson et al. | Aug 2014 | A1 |
20140245202 | Yoon et al. | Aug 2014 | A1 |
20140245367 | Sasaki et al. | Aug 2014 | A1 |
20140267114 | Lisseman et al. | Sep 2014 | A1 |
20140267135 | Chhabra | Sep 2014 | A1 |
20140267362 | Kocienda et al. | Sep 2014 | A1 |
20140282084 | Murarka et al. | Sep 2014 | A1 |
20140282211 | Ady et al. | Sep 2014 | A1 |
20140282214 | Shirzadi et al. | Sep 2014 | A1 |
20140300569 | Matsuki et al. | Oct 2014 | A1 |
20140304599 | Alexandersson | Oct 2014 | A1 |
20140304646 | Rossman | Oct 2014 | A1 |
20140304651 | Johansson et al. | Oct 2014 | A1 |
20140306897 | Cueto | Oct 2014 | A1 |
20140306899 | Hicks | Oct 2014 | A1 |
20140310638 | Lee et al. | Oct 2014 | A1 |
20140313130 | Yamano et al. | Oct 2014 | A1 |
20140333551 | Kim et al. | Nov 2014 | A1 |
20140333561 | Bull et al. | Nov 2014 | A1 |
20140344765 | Hicks et al. | Nov 2014 | A1 |
20140351744 | Jeon et al. | Nov 2014 | A1 |
20140354845 | Molgaard et al. | Dec 2014 | A1 |
20140354850 | Kosaka et al. | Dec 2014 | A1 |
20140359438 | Matsuki | Dec 2014 | A1 |
20140359528 | Murata | Dec 2014 | A1 |
20140361982 | Shaffer | Dec 2014 | A1 |
20140365945 | Karunamuni et al. | Dec 2014 | A1 |
20140365956 | Karunamuni et al. | Dec 2014 | A1 |
20140380247 | Tecarro et al. | Dec 2014 | A1 |
20150002664 | Eppinger et al. | Jan 2015 | A1 |
20150012861 | Loginov | Jan 2015 | A1 |
20150015763 | Lee et al. | Jan 2015 | A1 |
20150019997 | Kim et al. | Jan 2015 | A1 |
20150020032 | Chen | Jan 2015 | A1 |
20150020033 | Newham et al. | Jan 2015 | A1 |
20150020036 | Kim et al. | Jan 2015 | A1 |
20150026584 | Kobayakov et al. | Jan 2015 | A1 |
20150026592 | Mohammed et al. | Jan 2015 | A1 |
20150026642 | Wilson et al. | Jan 2015 | A1 |
20150029149 | Andersson et al. | Jan 2015 | A1 |
20150033184 | Kim et al. | Jan 2015 | A1 |
20150040065 | Bianco et al. | Feb 2015 | A1 |
20150042588 | Park | Feb 2015 | A1 |
20150046876 | Goldenberg | Feb 2015 | A1 |
20150049033 | Kim et al. | Feb 2015 | A1 |
20150052464 | Chen et al. | Feb 2015 | A1 |
20150055890 | Lundin et al. | Feb 2015 | A1 |
20150058723 | Cieplinski et al. | Feb 2015 | A1 |
20150062046 | Cho et al. | Mar 2015 | A1 |
20150062052 | Bernstein et al. | Mar 2015 | A1 |
20150062068 | Shih et al. | Mar 2015 | A1 |
20150067495 | Bernstein et al. | Mar 2015 | A1 |
20150067496 | Missig et al. | Mar 2015 | A1 |
20150067497 | Cieplinski et al. | Mar 2015 | A1 |
20150067513 | Zambetti et al. | Mar 2015 | A1 |
20150067519 | Missig et al. | Mar 2015 | A1 |
20150067534 | Choi et al. | Mar 2015 | A1 |
20150067559 | Missig et al. | Mar 2015 | A1 |
20150067560 | Cieplinski et al. | Mar 2015 | A1 |
20150067563 | Bernstein et al. | Mar 2015 | A1 |
20150067596 | Brown et al. | Mar 2015 | A1 |
20150067601 | Bernstein et al. | Mar 2015 | A1 |
20150067602 | Bernstein et al. | Mar 2015 | A1 |
20150067605 | Zambetti et al. | Mar 2015 | A1 |
20150071547 | Keating et al. | Mar 2015 | A1 |
20150082162 | Cho et al. | Mar 2015 | A1 |
20150082238 | Meng | Mar 2015 | A1 |
20150116205 | Westerman et al. | Apr 2015 | A1 |
20150121218 | Kim et al. | Apr 2015 | A1 |
20150121225 | Somasundaram et al. | Apr 2015 | A1 |
20150128092 | Lee et al. | May 2015 | A1 |
20150135108 | Pope et al. | May 2015 | A1 |
20150135109 | Zambetti et al. | May 2015 | A1 |
20150138126 | Westerman | May 2015 | A1 |
20150138155 | Bernstein et al. | May 2015 | A1 |
20150139605 | Wiklof | May 2015 | A1 |
20150143273 | Bernstein et al. | May 2015 | A1 |
20150143284 | Bennett et al. | May 2015 | A1 |
20150143294 | Piccinato et al. | May 2015 | A1 |
20150143303 | Sarrazin et al. | May 2015 | A1 |
20150149899 | Bernstein et al. | May 2015 | A1 |
20150149964 | Bernstein et al. | May 2015 | A1 |
20150149967 | Bernstein et al. | May 2015 | A1 |
20150153897 | Huang et al. | Jun 2015 | A1 |
20150153929 | Bernstein et al. | Jun 2015 | A1 |
20150160729 | Nakagawa | Jun 2015 | A1 |
20150169059 | Behles et al. | Jun 2015 | A1 |
20150185840 | Golyshko et al. | Jul 2015 | A1 |
20150193099 | Murphy | Jul 2015 | A1 |
20150193951 | Lee et al. | Jul 2015 | A1 |
20150205495 | Koide et al. | Jul 2015 | A1 |
20150205775 | Berdahl et al. | Jul 2015 | A1 |
20150234446 | Nathan et al. | Aug 2015 | A1 |
20150234493 | Parivar et al. | Aug 2015 | A1 |
20150253866 | Amm et al. | Sep 2015 | A1 |
20150268786 | Kitada | Sep 2015 | A1 |
20150268813 | Bos | Sep 2015 | A1 |
20150309573 | Brombach et al. | Oct 2015 | A1 |
20150321607 | Cho et al. | Nov 2015 | A1 |
20150332107 | Paniaras | Nov 2015 | A1 |
20150332607 | Gardner, Jr. et al. | Nov 2015 | A1 |
20150378519 | Brown et al. | Dec 2015 | A1 |
20150378982 | McKenzie et al. | Dec 2015 | A1 |
20150381931 | Uhma et al. | Dec 2015 | A1 |
20160004373 | Huang | Jan 2016 | A1 |
20160004393 | Faaborg et al. | Jan 2016 | A1 |
20160004427 | Zambetti et al. | Jan 2016 | A1 |
20160004428 | Bernstein et al. | Jan 2016 | A1 |
20160004430 | Missig et al. | Jan 2016 | A1 |
20160004431 | Bernstein et al. | Jan 2016 | A1 |
20160004432 | Bernstein et al. | Jan 2016 | A1 |
20160011725 | D'Argenio et al. | Jan 2016 | A1 |
20160011771 | Cieplinski | Jan 2016 | A1 |
20160019718 | Mukkamala et al. | Jan 2016 | A1 |
20160021511 | Jin et al. | Jan 2016 | A1 |
20160041750 | Cieplinski et al. | Feb 2016 | A1 |
20160048326 | Kim et al. | Feb 2016 | A1 |
20160062466 | Moussette et al. | Mar 2016 | A1 |
20160062619 | Reeve et al. | Mar 2016 | A1 |
20160070401 | Kim et al. | Mar 2016 | A1 |
20160077721 | Laubach et al. | Mar 2016 | A1 |
20160085385 | Gao et al. | Mar 2016 | A1 |
20160092071 | Lawson et al. | Mar 2016 | A1 |
20160124924 | Greenberg et al. | May 2016 | A1 |
20160125234 | Ota et al. | May 2016 | A1 |
20160132139 | Du et al. | May 2016 | A1 |
20160188181 | Smith | Jun 2016 | A1 |
20160196028 | Kenney et al. | Jul 2016 | A1 |
20160210025 | Bernstein et al. | Jul 2016 | A1 |
20160246478 | Davis et al. | Aug 2016 | A1 |
20160259412 | Flint et al. | Sep 2016 | A1 |
20160259413 | Anzures et al. | Sep 2016 | A1 |
20160259495 | Butcher et al. | Sep 2016 | A1 |
20160259496 | Butcher et al. | Sep 2016 | A1 |
20160259498 | Foss et al. | Sep 2016 | A1 |
20160259499 | Kocienda et al. | Sep 2016 | A1 |
20160259516 | Kudurshian et al. | Sep 2016 | A1 |
20160259517 | Butcher et al. | Sep 2016 | A1 |
20160259518 | King et al. | Sep 2016 | A1 |
20160259519 | Foss et al. | Sep 2016 | A1 |
20160259527 | Kocienda et al. | Sep 2016 | A1 |
20160259528 | Foss et al. | Sep 2016 | A1 |
20160259536 | Kudurshian et al. | Sep 2016 | A1 |
20160259548 | Ma | Sep 2016 | A1 |
20160274686 | Ruiz et al. | Sep 2016 | A1 |
20160274728 | Luo et al. | Sep 2016 | A1 |
20160274761 | Ruiz et al. | Sep 2016 | A1 |
20160283054 | Suzuki | Sep 2016 | A1 |
20160306507 | Defazio et al. | Oct 2016 | A1 |
20160320906 | Bokma et al. | Nov 2016 | A1 |
20160357368 | Federighi et al. | Dec 2016 | A1 |
20160357389 | Dakin et al. | Dec 2016 | A1 |
20160357390 | Federighi et al. | Dec 2016 | A1 |
20160357404 | Alonso Ruiz et al. | Dec 2016 | A1 |
20160360116 | Penha et al. | Dec 2016 | A1 |
20170045981 | Karunamuni et al. | Feb 2017 | A1 |
20170046039 | Karunamuni et al. | Feb 2017 | A1 |
20170046058 | Karunamuni et al. | Feb 2017 | A1 |
20170046059 | Karunamuni et al. | Feb 2017 | A1 |
20170046060 | Karunamuni et al. | Feb 2017 | A1 |
20170075520 | Bauer et al. | Mar 2017 | A1 |
20170075562 | Bauer et al. | Mar 2017 | A1 |
20170075563 | Bauer et al. | Mar 2017 | A1 |
20170090617 | Jang et al. | Mar 2017 | A1 |
20170090699 | Pennington et al. | Mar 2017 | A1 |
20170091153 | Thimbleby | Mar 2017 | A1 |
20170109011 | Jiang | Apr 2017 | A1 |
20170115867 | Bargmann | Apr 2017 | A1 |
20170123497 | Yonezawa | May 2017 | A1 |
20170124699 | Lane | May 2017 | A1 |
20170139565 | Choi | May 2017 | A1 |
20170315694 | Alonso Ruiz et al. | Nov 2017 | A1 |
20170357403 | Geary et al. | Dec 2017 | A1 |
20180024681 | Bernstein et al. | Jan 2018 | A1 |
20180059866 | Drake et al. | Mar 2018 | A1 |
20180082522 | Bartosik | Mar 2018 | A1 |
20180188920 | Bernstein et al. | Jul 2018 | A1 |
20180342103 | Schwartz et al. | Nov 2018 | A1 |
20180349362 | Sharp et al. | Dec 2018 | A1 |
20180364898 | Chen | Dec 2018 | A1 |
20190012059 | Kwon et al. | Jan 2019 | A1 |
20190018562 | Bernstein et al. | Jan 2019 | A1 |
20190042075 | Bernstein et al. | Feb 2019 | A1 |
20190042078 | Bernstein et al. | Feb 2019 | A1 |
20190065043 | Zambetti et al. | Feb 2019 | A1 |
20190121493 | Bernstein et al. | Apr 2019 | A1 |
20190121520 | Cieplinski et al. | Apr 2019 | A1 |
20190138101 | Bernstein | May 2019 | A1 |
20190138102 | Missig | May 2019 | A1 |
20190138189 | Missig | May 2019 | A1 |
20190146643 | Foss et al. | May 2019 | A1 |
20190155503 | Alonso Ruiz et al. | May 2019 | A1 |
20190158727 | Penha et al. | May 2019 | A1 |
20190163358 | Dascola et al. | May 2019 | A1 |
20190171353 | Missig et al. | Jun 2019 | A1 |
20190171354 | Dascola et al. | Jun 2019 | A1 |
20190212896 | Karunamuni et al. | Jul 2019 | A1 |
20190332257 | Kudurshian et al. | Oct 2019 | A1 |
20190364194 | Penha et al. | Nov 2019 | A1 |
20190391658 | Missig et al. | Dec 2019 | A1 |
20200081614 | Zambetti | Mar 2020 | A1 |
20200142548 | Karunamuni et al. | May 2020 | A1 |
20200201472 | Bernstein et al. | Jun 2020 | A1 |
20200210059 | Hu et al. | Jul 2020 | A1 |
20200218445 | Alonso Ruiz et al. | Jul 2020 | A1 |
20200301556 | Alonso Ruiz et al. | Sep 2020 | A1 |
20200333936 | Khoe et al. | Oct 2020 | A1 |
20200371683 | Zambetti et al. | Nov 2020 | A1 |
20200396375 | Penha et al. | Dec 2020 | A1 |
20210081082 | Dascola et al. | Mar 2021 | A1 |
20210117054 | Karunamuni et al. | Apr 2021 | A1 |
20210191602 | Brown et al. | Jun 2021 | A1 |
20210382613 | Kudurshian et al. | Dec 2021 | A1 |
20220011932 | Khoe et al. | Jan 2022 | A1 |
20220070359 | Clarke et al. | Mar 2022 | A1 |
20220129076 | Bernstein et al. | Apr 2022 | A1 |
20220261131 | Bernstein et al. | Aug 2022 | A1 |
Number | Date | Country |
---|---|---|
2780765 | May 2011 | CA |
1356493 | Jul 2002 | CN |
1620327 | May 2005 | CN |
1808362 | Jul 2006 | CN |
101118469 | Feb 2008 | CN |
101192097 | Jun 2008 | CN |
101202866 | Jun 2008 | CN |
101222704 | Jul 2008 | CN |
101227764 | Jul 2008 | CN |
101241397 | Aug 2008 | CN |
101320303 | Dec 2008 | CN |
101384977 | Mar 2009 | CN |
101390039 | Mar 2009 | CN |
101421707 | Apr 2009 | CN |
101464777 | Jun 2009 | CN |
101498979 | Aug 2009 | CN |
101526876 | Sep 2009 | CN |
101527745 | Sep 2009 | CN |
101562703 | Oct 2009 | CN |
101593077 | Dec 2009 | CN |
101609380 | Dec 2009 | CN |
101620507 | Jan 2010 | CN |
101627359 | Jan 2010 | CN |
101630230 | Jan 2010 | CN |
101685370 | Mar 2010 | CN |
101692194 | Apr 2010 | CN |
101727179 | Jun 2010 | CN |
101739206 | Jun 2010 | CN |
101763193 | Jun 2010 | CN |
101784981 | Jul 2010 | CN |
101809526 | Aug 2010 | CN |
101896962 | Nov 2010 | CN |
101937304 | Jan 2011 | CN |
101971603 | Feb 2011 | CN |
101998052 | Mar 2011 | CN |
102004575 | Apr 2011 | CN |
102004576 | Apr 2011 | CN |
102004577 | Apr 2011 | CN |
102004593 | Apr 2011 | CN |
102004602 | Apr 2011 | CN |
102004604 | Apr 2011 | CN |
102016777 | Apr 2011 | CN |
102053790 | May 2011 | CN |
102067068 | May 2011 | CN |
102112946 | Jun 2011 | CN |
102150018 | Aug 2011 | CN |
102160021 | Aug 2011 | CN |
102171629 | Aug 2011 | CN |
102195514 | Sep 2011 | CN |
102203702 | Sep 2011 | CN |
102214038 | Oct 2011 | CN |
102223476 | Oct 2011 | CN |
102243662 | Nov 2011 | CN |
102257460 | Nov 2011 | CN |
102301322 | Dec 2011 | CN |
102349038 | Feb 2012 | CN |
102349040 | Feb 2012 | CN |
102354269 | Feb 2012 | CN |
102365666 | Feb 2012 | CN |
102375605 | Mar 2012 | CN |
102385478 | Mar 2012 | CN |
102388351 | Mar 2012 | CN |
102438092 | May 2012 | CN |
102483666 | May 2012 | CN |
102483677 | May 2012 | CN |
102546925 | Jul 2012 | CN |
102566908 | Jul 2012 | CN |
102576251 | Jul 2012 | CN |
102576282 | Jul 2012 | CN |
102646013 | Aug 2012 | CN |
102662571 | Sep 2012 | CN |
102662573 | Sep 2012 | CN |
102722312 | Oct 2012 | CN |
102752441 | Oct 2012 | CN |
102792255 | Nov 2012 | CN |
102819331 | Dec 2012 | CN |
102819401 | Dec 2012 | CN |
102841677 | Dec 2012 | CN |
102880417 | Jan 2013 | CN |
103019586 | Apr 2013 | CN |
103092386 | May 2013 | CN |
103092406 | May 2013 | CN |
103097992 | May 2013 | CN |
103186345 | Jul 2013 | CN |
103201714 | Jul 2013 | CN |
103268184 | Aug 2013 | CN |
103279295 | Sep 2013 | CN |
103518176 | Jan 2014 | CN |
103562841 | Feb 2014 | CN |
103620531 | Mar 2014 | CN |
103649885 | Mar 2014 | CN |
103699292 | Apr 2014 | CN |
103699295 | Apr 2014 | CN |
103777850 | May 2014 | CN |
103777886 | May 2014 | CN |
103793134 | May 2014 | CN |
103838465 | Jun 2014 | CN |
103870190 | Jun 2014 | CN |
103970474 | Aug 2014 | CN |
103984501 | Aug 2014 | CN |
104011637 | Aug 2014 | CN |
104020868 | Sep 2014 | CN |
104020955 | Sep 2014 | CN |
104021021 | Sep 2014 | CN |
104024985 | Sep 2014 | CN |
104077014 | Oct 2014 | CN |
104142798 | Nov 2014 | CN |
104160362 | Nov 2014 | CN |
104267902 | Jan 2015 | CN |
104331239 | Feb 2015 | CN |
104392292 | Mar 2015 | CN |
104412201 | Mar 2015 | CN |
104471521 | Mar 2015 | CN |
104487928 | Apr 2015 | CN |
104487929 | Apr 2015 | CN |
104487930 | Apr 2015 | CN |
105264476 | Jan 2016 | CN |
100 59 906 | Jun 2002 | DE |
0 364 178 | Apr 1990 | EP |
0 859 307 | Mar 1998 | EP |
0 880 090 | Nov 1998 | EP |
1 028 583 | Aug 2000 | EP |
1 406 150 | Apr 2004 | EP |
1 674 977 | Jun 2006 | EP |
1 882 902 | Jan 2008 | EP |
2 000 896 | Dec 2008 | EP |
2 017 701 | Jan 2009 | EP |
2 028 583 | Feb 2009 | EP |
2 112 586 | Oct 2009 | EP |
2 141 574 | Jan 2010 | EP |
2 175 357 | Apr 2010 | EP |
2 196 893 | Jun 2010 | EP |
2 214 087 | Aug 2010 | EP |
2 226 715 | Sep 2010 | EP |
2 284 675 | Feb 2011 | EP |
2 299 351 | Mar 2011 | EP |
2 302 496 | Mar 2011 | EP |
2 363 790 | Sep 2011 | EP |
2 375 309 | Oct 2011 | EP |
2 375 314 | Oct 2011 | EP |
2 386 935 | Nov 2011 | EP |
2 407 868 | Jan 2012 | EP |
2 420 924 | Feb 2012 | EP |
2 426 580 | Mar 2012 | EP |
2 445 182 | Apr 2012 | EP |
2 447 818 | May 2012 | EP |
2 527 966 | Nov 2012 | EP |
2 530 677 | Dec 2012 | EP |
2 541 376 | Jan 2013 | EP |
2 555 500 | Feb 2013 | EP |
2 615 535 | Jul 2013 | EP |
2 631 737 | Aug 2013 | EP |
2 674 834 | Dec 2013 | EP |
2 674 846 | Dec 2013 | EP |
2 708 985 | Mar 2014 | EP |
2 733 578 | May 2014 | EP |
2 808 764 | Dec 2014 | EP |
2 809 058 | Dec 2014 | EP |
2 813 938 | Dec 2014 | EP |
3 664 092 | Jun 2020 | EP |
2 402 105 | Dec 2004 | GB |
58-182746 | Oct 1983 | JP |
H06-161647 | Jun 1994 | JP |
H07-098769 | Apr 1995 | JP |
H07-104915 | Apr 1995 | JP |
H07-151512 | Jun 1995 | JP |
H08-227341 | Sep 1996 | JP |
H09-269883 | Oct 1997 | JP |
H09-330175 | Dec 1997 | JP |
H11-203044 | Jul 1999 | JP |
2001-078137 | Mar 2001 | JP |
2001-202192 | Jul 2001 | JP |
2001-222355 | Aug 2001 | JP |
2001-306207 | Nov 2001 | JP |
2002-044536 | Feb 2002 | JP |
2002-149312 | May 2002 | JP |
3085481 | May 2002 | JP |
2002-182855 | Jun 2002 | JP |
2003-157131 | May 2003 | JP |
2003-186597 | Jul 2003 | JP |
2004-054861 | Feb 2004 | JP |
2004-062648 | Feb 2004 | JP |
2004-070492 | Mar 2004 | JP |
2004-078957 | Mar 2004 | JP |
2004-086733 | Mar 2004 | JP |
2004-152217 | May 2004 | JP |
2004-288208 | Oct 2004 | JP |
2005-031786 | Feb 2005 | JP |
2005-092386 | Apr 2005 | JP |
2005-102106 | Apr 2005 | JP |
2005-135106 | May 2005 | JP |
2005-157842 | Jun 2005 | JP |
2005-196810 | Jul 2005 | JP |
2005-317041 | Nov 2005 | JP |
2005-352927 | Dec 2005 | JP |
2006-059238 | Mar 2006 | JP |
2006-185443 | Jul 2006 | JP |
2007-116384 | May 2007 | JP |
2007-148104 | Jun 2007 | JP |
2007-264808 | Oct 2007 | JP |
2008-009759 | Jan 2008 | JP |
2008-015890 | Jan 2008 | JP |
2008-033739 | Feb 2008 | JP |
2008-516348 | May 2008 | JP |
2008-146453 | Jun 2008 | JP |
2008-191086 | Aug 2008 | JP |
2008-537615 | Sep 2008 | JP |
2008-305174 | Dec 2008 | JP |
2009-500761 | Jan 2009 | JP |
2009-110243 | May 2009 | JP |
2009-129171 | Jun 2009 | JP |
2009-129443 | Jun 2009 | JP |
2009-169452 | Jul 2009 | JP |
2009-211704 | Sep 2009 | JP |
2009-217543 | Sep 2009 | JP |
2009-294688 | Dec 2009 | JP |
2009-545805 | Dec 2009 | JP |
2010-009321 | Jan 2010 | JP |
2010-503126 | Jan 2010 | JP |
2010-503130 | Jan 2010 | JP |
2010-055274 | Mar 2010 | JP |
2010-097353 | Apr 2010 | JP |
2010-146507 | Jul 2010 | JP |
2010-152716 | Jul 2010 | JP |
2010-176174 | Aug 2010 | JP |
2010-176337 | Aug 2010 | JP |
2010-181934 | Aug 2010 | JP |
2010-181940 | Aug 2010 | JP |
2010-198385 | Sep 2010 | JP |
2010-536077 | Nov 2010 | JP |
2010-541071 | Dec 2010 | JP |
2011-501307 | Jan 2011 | JP |
2011-028635 | Feb 2011 | JP |
2011-048023 | Mar 2011 | JP |
2011-048666 | Mar 2011 | JP |
2011-048686 | Mar 2011 | JP |
2011-048762 | Mar 2011 | JP |
2011-048832 | Mar 2011 | JP |
2011-053831 | Mar 2011 | JP |
2011-053972 | Mar 2011 | JP |
2011-053973 | Mar 2011 | JP |
2011-053974 | Mar 2011 | JP |
2011-054196 | Mar 2011 | JP |
2011-059821 | Mar 2011 | JP |
2011-070342 | Apr 2011 | JP |
2011-100290 | May 2011 | JP |
2011-107823 | Jun 2011 | JP |
2011-123773 | Jun 2011 | JP |
2011-141868 | Jul 2011 | JP |
2011-170538 | Sep 2011 | JP |
2011-192179 | Sep 2011 | JP |
2011-192215 | Sep 2011 | JP |
2011-197848 | Oct 2011 | JP |
2011-221640 | Nov 2011 | JP |
2011-232947 | Nov 2011 | JP |
2011-242386 | Dec 2011 | JP |
2011-250004 | Dec 2011 | JP |
2011-253556 | Dec 2011 | JP |
2011-257941 | Dec 2011 | JP |
2011-530101 | Dec 2011 | JP |
2012-027940 | Feb 2012 | JP |
2012-033061 | Feb 2012 | JP |
2012-043266 | Mar 2012 | JP |
2012-043267 | Mar 2012 | JP |
2012-053687 | Mar 2012 | JP |
2012-053754 | Mar 2012 | JP |
2012-053926 | Mar 2012 | JP |
2012-073785 | Apr 2012 | JP |
2012-073873 | Apr 2012 | JP |
2012-509605 | Apr 2012 | JP |
2012-093820 | May 2012 | JP |
2012-118825 | Jun 2012 | JP |
2012-118993 | Jun 2012 | JP |
2012-123564 | Jun 2012 | JP |
2012-128825 | Jul 2012 | JP |
2012-168620 | Sep 2012 | JP |
2012-527685 | Nov 2012 | JP |
2013-025357 | Feb 2013 | JP |
2013-030050 | Feb 2013 | JP |
2013-058149 | Mar 2013 | JP |
2013-080521 | May 2013 | JP |
2013-093020 | May 2013 | JP |
2013-101465 | May 2013 | JP |
2013-105410 | May 2013 | JP |
2013-520727 | Jun 2013 | JP |
2013-131185 | Jul 2013 | JP |
2013-529339 | Jul 2013 | JP |
2013-200879 | Oct 2013 | JP |
2013-542488 | Nov 2013 | JP |
2013-250602 | Dec 2013 | JP |
2014-504419 | Feb 2014 | JP |
2014-052852 | Mar 2014 | JP |
2014-130567 | Jul 2014 | JP |
2014-140112 | Jul 2014 | JP |
2014-149833 | Aug 2014 | JP |
2014-519109 | Aug 2014 | JP |
2014-529137 | Oct 2014 | JP |
2015-099555 | May 2015 | JP |
2015-521315 | Jul 2015 | JP |
2015-153420 | Aug 2015 | JP |
2015-185161 | Oct 2015 | JP |
20020041828 | Jun 2002 | KR |
2006-0071353 | Jun 2006 | KR |
2006-0117870 | Nov 2006 | KR |
100807738 | Feb 2008 | KR |
20080026138 | Mar 2008 | KR |
2008-0045143 | Apr 2008 | KR |
100823871 | Apr 2008 | KR |
2008-0054346 | Jun 2008 | KR |
2009-0066319 | Jun 2009 | KR |
2009-0108065 | Oct 2009 | KR |
2010-0010860 | Feb 2010 | KR |
2010-0014095 | Feb 2010 | KR |
2010-0133246 | Dec 2010 | KR |
2011-0026176 | Mar 2011 | KR |
2011-0086501 | Jul 2011 | KR |
20120130972 | Jan 2012 | KR |
2012-0103670 | Sep 2012 | KR |
20120135488 | Dec 2012 | KR |
20120135723 | Dec 2012 | KR |
20130027017 | Mar 2013 | KR |
2013-0099647 | Sep 2013 | KR |
2014-0016495 | Feb 2014 | KR |
2014-0029720 | Mar 2014 | KR |
2014-0043760 | Apr 2014 | KR |
2014-0067965 | Jun 2014 | KR |
2014-0079110 | Jun 2014 | KR |
2014-0122000 | Oct 2014 | KR |
20150013263 | Feb 2015 | KR |
20150021977 | Mar 2015 | KR |
2007145218 | Jul 2009 | RU |
2503989 | Jan 2014 | RU |
WO 2005106637 | Nov 2005 | WO |
WO 2006013485 | Feb 2006 | WO |
WO 2006042309 | Apr 2006 | WO |
WO 2006094308 | Sep 2006 | WO |
WO 2007121557 | Nov 2007 | WO |
WO 2008030976 | Mar 2008 | WO |
WO 2008064142 | May 2008 | WO |
WO 2009155981 | Dec 2009 | WO |
WO 2009158549 | Dec 2009 | WO |
WO 2010013876 | Feb 2010 | WO |
WO 2010032598 | Mar 2010 | WO |
WO 2010090010 | Aug 2010 | WO |
WO 2010122813 | Oct 2010 | WO |
WO 2010134729 | Nov 2010 | WO |
WO 2011024389 | Mar 2011 | WO |
WO 2011024465 | Mar 2011 | WO |
WO 2011024521 | Mar 2011 | WO |
WO 2011093045 | Aug 2011 | WO |
WO 2011105009 | Sep 2011 | WO |
WO 2011108190 | Sep 2011 | WO |
WO 2011115187 | Sep 2011 | WO |
WO 2011121375 | Oct 2011 | WO |
WO 2012021417 | Feb 2012 | WO |
WO 2012037664 | Mar 2012 | WO |
WO 2012096804 | Jul 2012 | WO |
WO 2012108213 | Aug 2012 | WO |
WO 2012114760 | Aug 2012 | WO |
WO 2012137946 | Oct 2012 | WO |
WO 2012150540 | Nov 2012 | WO |
WO 2012153555 | Nov 2012 | WO |
WO 2013022486 | Feb 2013 | WO |
WO 2013035725 | Mar 2013 | WO |
WO 2013112453 | Aug 2013 | WO |
WO 2013127055 | Sep 2013 | WO |
WO 2013169302 | Nov 2013 | WO |
WO 2013169845 | Nov 2013 | WO |
WO 2013169846 | Nov 2013 | WO |
WO 2013169849 | Nov 2013 | WO |
WO 2013169851 | Nov 2013 | WO |
WO 2013169853 | Nov 2013 | WO |
WO 2013169854 | Nov 2013 | WO |
WO 2013169870 | Nov 2013 | WO |
WO 2013169875 | Nov 2013 | WO |
WO 2013169877 | Nov 2013 | WO |
WO 2013169882 | Nov 2013 | WO |
WO 2013173838 | Nov 2013 | WO |
WO 2014034706 | Mar 2014 | WO |
WO 2014105275 | Jul 2014 | WO |
WO 2014105276 | Jul 2014 | WO |
WO 2014105277 | Jul 2014 | WO |
WO 2014105278 | Jul 2014 | WO |
WO 2014105279 | Jul 2014 | WO |
WO 2014129655 | Aug 2014 | WO |
WO 2014149473 | Sep 2014 | WO |
WO 2014152601 | Sep 2014 | WO |
WO 2014200733 | Dec 2014 | WO |
WO 2016200584 | Dec 2016 | WO |
Entry |
---|
Anonymous, “RX-V3800AV Receiver Owner's Manual”, Yamaha Music Manuals, www.Manualslib.com, Dec. 31, 2007, 169 pages. |
Henderson et al., “Opportunistic User Interfaces for Augmented Reality”, Department of Computer Science, New York, NY, Jan. 2010, 13 pages. |
Patent, dated Nov. 12, 2021, received in Chinese Patent Application No. 201810826224.6, which corresponds with U.S. Appl. No. 14/536,426, 7 pages. |
Office Action, dated Jan. 10, 2022, received in Chinese Patent Application No. 201810369259.1, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Patent, dated Dec. 31, 2021, received in Chinese Patent Application No. 201811142423.1, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Notice of Allowance, dated Dec. 3, 2021, received in Japanese Patent Application No. 2018-022394, which corresponds with U.S. Appl. No. 14/536,203, 2 pages. |
Patent, dated Dec. 13, 2021, received in Japanese Patent Application No. 2018-022394, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Office Action, dated Nov. 23, 2021, received in Chinese Patent Application No. 201810332044.2, which corresponds with U.S. Appl. No. 14/536,267, 2 pages. |
Office Action, dated Dec. 22, 2021, received in European Patent Application No. 17163309.2, which corresponds with U.S. Appl. No. 14/866,987, 4 pages. |
Office Action, dated Nov. 23, 2021, received in U.S. Appl. No. 16/136,163, 27 pages. |
Office Action, dated Nov. 30, 2021, received in Russian Patent Application No. 2018146112, which corresponds with U.S. Appl. No. 16/243,834, 15 pages. |
Notice of Allowance, dated Dec. 14, 2021, received in Australian Patent Application No. 2020201648, which corresponds with U.S. Appl. No. 16/262,784, 3 pages. |
Notice of Allowance, dated Jan. 24, 2022, received in U.S. Appl. No. 16/262,800, 26 pages. |
Final Office Action, dated Dec. 13, 2021, received in U.S. Appl. No. 16/896,141, 29 pages. |
Office Action, dated Oct. 5, 2021, received in U.S. Appl. No. 16/563,505, 19 pages. |
Office Action, dated Dec. 14, 2021, received in U.S. Appl. No. 16/685,773, 20 pages. |
Notice of Allowance, dated Dec. 21, 2021, received in U.S. Appl. No. 16/921,083, 25 pages. |
Office Action, dated Dec. 23, 2021, received in Korean Patent Application No. 2020-7031330, which corresponds with U.S. Appl. No. 15/272,398, 8 pages. |
Office Action, dated Nov. 11, 2021, received in Australian Patent Application No. 17/103,899, which corresponds with U.S. Appl. No. 17/103,899, 4 pages. |
International Search Report and Written Opinion, dated Jan. 11, 2022, received in International Application No. PCT/US2021/042402, which corresponds with U.S. Appl. No. 17/031,637, 50 pages. |
Bognot, “Microsoft Windows 7 Aero Shake, Snap, and Peek”, https://www.youtube.com/watch?v=vgD7wGrsQg4, Apr. 3, 2012, 4 pages. |
Intent to Grant, dated May 11, 2022, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 7 pages. |
Notice of Allowance, dated Mar. 21, 2022, received in Chinese Patent Application No. 201810332044.2, which corresponds with U.S. Appl. No. 14/536,267, 1 page. |
Intent to Grant, dated Mar. 16, 2022, received in European Patent Application No. 18183789.9, which corresponds with U.S. Appl. No. 16/262,800, 7 pages. |
Notice of Allowance, dated Feb. 4, 2022, received in Japanese Patent Application No. 2020-185336, which corresponds with U.S. Appl. No. 14/864,580, 2 pages. |
Patent, dated Mar. 3, 2022, received in Japanese Patent Application No. 2020-185336, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Notice of Allowance, dated Feb. 9, 2022, received in Chinese Patent Application No. 201610869950.7, which corresponds with U.S. Appl. No. 14/871,462, 1 page. |
Patent, dated Mar. 8, 2022, received in Chinese Patent Application No. 201610869950.7, which corresponds with U.S. Appl. No. 14/871,462, 7 pages. |
Office Action, dated Mar. 2, 2022, received in Chinese Patent Application No. 201811561188.1, which corresponds with U.S. Appl. No. 15/081,771, 1 page. |
Patent, dated Jan. 27, 2022, received in Australian Patent Application No. 2019268116, which corresponds with U.S. Appl. No. 16/240,672, 3 pages. |
Office Action, dated Apr. 11, 2022, received in Japanese Patent Application No. 2019-058800, which corresponds with U.S. Appl. No. 16/243,834, 4 pages. |
Notice of Allowance, dated Apr. 14, 2022, received in Russian Patent Application No. 2018146112, which corresponds with U.S. Appl. No. 16/243,834, 2 pages. |
Certificate of Grant, dated Apr. 21, 2022, received in Australian Patent Application No. 2020201648, which corresponds with U.S. Appl. No. 16/262,784, 3 pages. |
Notice of Allowance, dated Jan. 14, 2022, received in Australian Patent Application No. 2020267298, which corresponds with U.S. Appl. No. 16/258,394, 3 pages. |
Final Office Action, dated Mar. 4, 2022, received in Japanese Patent Application No. 2019-047319, which corresponds with U.S. Appl. No. 16/896,141, 2 pages. |
Office Action, dated May 6, 2022, received in Chinese Patent Application No. 201910610331.X, 5 pages. |
Office Action, dated Mar. 17, 2022, received in Chinese Patent Application No. 201910718931.8, 1 page. |
Notice of Allowance, dated Jan. 14, 2022, received in Australian Patent Application No. 2020244406, which corresponds with U.S. Appl. No. 17/003,869, 3 pages. |
Office Action, dated Apr. 27, 2022, received in Australian Patent Application No. 2020257134, 3 pages. |
Office Action, dated Apr. 28, 2022, received in Korean Patent Application No. 2022-7005994, 5 pages. |
Final Office Action, dated May 2, 2022, received in U.S. Appl. No. 17/103,899, 21 pages. |
Office Action, dated Mar. 16, 2022, received in U.S. Appl. No. 17/138,676, 22 pages. |
Patent, dated Jan. 27, 2022, received in Korean Patent Application No. 2021-7031223, 5 pages. |
Notice of Allowance, dated Feb. 21, 2022, received in Korean Patent Application No. 2022-7003345, 2 pages. |
Sleepfreaks, “How to Easily Play/Loop an Event Range in Cubase”, https://sleepfreaks-dtm.com/for-advance-cubase/position-3/, Apr. 4, 2011, 14 pages. |
Office Action, dated Aug. 12, 2021, received in Chinese Patent Application No. 201811142423.1, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Office Action, dated Jan. 26, 2021, received in Chinese Patent Application No. 201810632507.7, 5 pages. |
Notice of Allowance, dated Aug. 11, 2021, received in Chinese Patent Application No. 201810632507.7, which corresponds with U.S. Appl. No. 14/536,203, 1 page. |
Notice of Allowance, dated Aug. 27, 2021, received in Japanese Patent Application No. 2019-212493, which corresponds with U.S. Appl. No. 15/272,345, 2 pages. |
Notice of Allowance, dated Aug. 26, 2021, received in Korean Patent Application No. 2019-7019946, which corresponds with U.S. Appl. No. 16/154,591, 2 pages. |
Notice of Allowance, dated Sep. 2, 2021, received in U.S. Appl. No. 16/240,672, 13 pages. |
Office Action, dated Aug. 10, 2021, received in European Patent Application No. 19181042.3, which corresponds with U.S. Appl. No. 16/241,883, 7 pages. |
Office Action, dated Sep. 6, 2021, received in Chinese Patent Application No. 201910718931.8, 6 pages. |
Office Action, dated Aug. 30, 2021, received in Australian Patent Application No. 2020244406, which corresponds with U.S. Appl. No. 17/003,869, 4 pages. |
Office Action, dated Sep. 8, 2021, received in Japanese Patent Application No. 2020-106360, 2 pages. |
Final Office Action, dated Sep. 16, 2021, received in U.S. Appl. No. 16/988,509, 38 pages. |
Final Office Action, dated Aug. 27, 2021, received in Korean Patent Application No. 2020-7031330, which corresponds with U.S. Appl. No. 15/272,398, 3 pages. |
Agarwal, “How to Copy and Paste Text on Windows Phone 8,” Guiding Tech, http://web.archive.org/web/20130709204246/http://www.guidingtech.com/20280/copy-paste-text-windows-phone-8/, Jul. 9, 2013, 10 pages. |
Angelov, “Sponsor Flip Wall with Jquery & CSS”, Tutorialzine. N.p., Mar. 24, 2010. Web. http://tutorialzine.com/2010/03/sponsor-wall-slip-jquery-css/, Mar. 24, 2010, 8 pages. |
Anonymous, “1-Click Installer for Windows Media Taskbar Mini-Player for Windows 7, 8, 8.1 10”, http://metadataconsulting.blogspot.de/2014/05/installer-for-windows-media-taskbar.htm, May 5, 2014, 6 pages. |
Anonymous, “Acer Liquid Z5 Duo User's Manual”, https://global-download.acer.com, Feb. 21, 2014, 65 pages. |
Anonymous, “Android—What Should Status Bar Toggle Button Behavior Be?”, https://ux.stackexchange.com/questions/34814, Jan. 15, 2015, 2 pages. |
Anonymous, “Google Android 5.0 Release Date, Specs and Editors Hands On Review—CNET”, http://www.cnet.com/products/google-android-5-0-lollipop/, Mar. 12, 2015, 10 pages. |
Anonymous, “How Do I Add Contextual Menu to My Apple Watch App?”, http://www.tech-recipes.com/rx/52578/how-do-i-add-contextual-menu-to-my-apple-watch-app, Jan. 13, 2015, 3 pages. |
Anonymous, “[new] WMP12 with Taskbar Toolbar for Windows 7—Windows Customization—WinMatrix”, http://www.winmatrix.com/forums/index.php?/topic/25528-new-wmp12-with-taskbar-toolbar-for-windows-7, Jan. 27, 2013, 6 pages. |
Anonymous, “Nokia 808 PureView screenshots”, retrieved from Internet; no URL, Nov. 12, 2012, 8 pages. |
Anonymous, “Nokia 808 PureView User Guide,” http://download-fds.webapps.microsoft.com/supportFiles/phones/files/pdf_guides/devices/808/Nokia_808_UG_en_APAC.pdf, Jan. 1, 2012, 144 pages. |
Anonymous, “Notifications, Android 4.4 and Lower”, Android Developers, https://developer.android.com/design/patterns/notifications_k.html, May 24, 2015, 9 pages. |
Anonymous, “Taskbar Extensions”, https://web.archive.org/web/20141228124434/http://msdn.microsoft.com:80/en-us/library/windows/desktop/dd378460(v=vs.85).aspx, Dec. 28, 2014, 8 pages. |
Apple, “Apple—September Event 2014”, https://www.youtube.com/watch?v=38lqQpqwPe7s, Sep. 10, 2014, 5 pages. |
Azundris, “A Fire in the Pie,” http://web.archive.org/web/20140722062639/http://blog.azundrix.com/archives/168-A-fire-in-the-sky.html, Jul. 22, 2014, 8 pages. |
Bilibili, “Android 5.0 Lollipop”, https://www.bilibili.com/video/av1636046?from=search&seid=3128140235778895126, Oct. 19, 2014, 6 pages. |
B-log—betriebsraum weblog, “Extremely Efficient Menu Selection: Marking Menus for the Flash Platform,” http://www.betriebsraum.de/blog/2009/12/11/extremely-efficient-menu-selection-marking-menus-for-the-flash-platform, Dec. 11, 2009, 9 pages. |
Bolluyt, “5 Apple Watch Revelations from Apple's New WatchKit”, http://www.cheatsheet.com/technology/5-apple-watch-revelations-from-apples-new-watchkit.html/?a=viewall, Nov. 22, 2014, 3 pages. |
Boring, “The Fat Thumb: Using the Thumb's Contact Size for Single-Handed Mobile Interaction”, https://www.youtube.com/watch?v=E9vGU5R8nsc&feature=youtu.be, Jun. 14, 2012, 2 pages. |
Borowska, “6 Types of Digital Affordance that Impact Your UX”, https://www.webdesignerdepot.com/2015/04/6-types-of-digital-affordance-that-impact-your-ux, Apr. 7, 2015, 6 pages. |
Brewster, “The Design and Evaluation of a Vibrotactile Progress Bar”, Glasgow Interactive Systems Group, University of Glasgow, Glasgow, G12 8QQ, UK, 2005, 2 pages. |
Brownlee, “Android 5.0 Lollipop Feature Review!”, https://www.youtube.com/watch?v=pEDQ1z1-PvU, Oct. 27, 2014, 5 pages. |
Clark, “Global Moxie, Touch Means a Renaissance for Radial Menus,” http://globalmoxie.com/blog/radial-menus-for-touch-ui˜print.shtml, Jul. 17, 2012, 7 pages. |
Cohen, Cinemagraphs are Animated Gifs for Adults, http://www.tubefilter.com/2011/07/10/cinemagraph, Jul. 10, 2011, 3 pages. |
CrackBerry Forums, Windows 8 Bezel Control and Gestures, http://forums.crackberry.com/blackberry-playbook-f222/windows-8-bezel-control-gestures-705129/, Mar. 1, 2012, 8 pages. |
Crook, “Microsoft Patenting Multi-Screen, Multi-Touch Gestures,” http://techcrunch.com/2011/08/25/microsoft-awarded-patents-for-multi-screen-multi-touch-gestures/, Aug. 25, 2011, 8 pages. |
Cvil.ly—a design blog, Interesting Touch Interactions on Windows 8, http://cvil.ly/2011/06/04/interesting-touch-interactions-on-windows-8/, Jun. 4, 2011, 3 pages. |
Davidson, et al., “Extending 2D Object Arrangement with Pressure-Sensitive Layering Cues”, Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, Oct. 19, 2008, 4 pages. |
Dinwiddie, et al., “Combined-User Interface for Computers, Television, Video Recorders, and Telephone, Etc”, ip.com Journal, Aug. 1, 1990, 3 pages. |
Drinkwater, “Glossary: Pre/Post Alarm Image Buffer,” http://www.networkwebcams.com/ip-camera-learning-center/2008/07/17/glossary-prepost-alarm-image-buffer/, Jul. 17, 2008, 1 page. |
Dzyre, “10 Android Notification Features You Can Fiddle With”, http://www.hongkiat.com/blog/android-notification-features, Mar. 10, 2014, 10 pages. |
Easton-Ellett, “Three Free Cydia Utilities To Remove iOS Notification Badges”, http://www.ijailbreak.com/cydia/three-free-cydia-utilies-to-remove-ios-notification-badges, Apr. 14, 2012, 2 pages. |
Elliot, “Mac System 7”, YouTube. Web. Mar. 8, 2017, http://www.youtube.com/watch?v=XLv22hfuuik, Aug. 3, 2011, 1 page. |
Farshad, “SageThumbs—Preview and Convert Pictures From Windows Context Menu”, https://web.addictivetips.com/windows-tips/sagethumbs-preview-and-convert-photos-from-windows-context-menu, Aug. 8, 2011, 5 pages. |
Fenlon, “The Case for Bezel Touch Gestures on Apple's iPad,” http://www.tested.com/tech/tablets/3104-the-case-for-bezel-touch-gestures-on-apples-ipad/, Nov. 2, 2011, 6 pages. |
Flaherty, “Is Apple Watch's Pressure-Sensitive Screen A Bigger Deal Than The Gadget Itself?”, http://www.wired.com/2014/09/apple-watchs-pressure-sensitive-screen-bigger-deal-gadget, Sep. 15, 2014, 3 pages. |
Flixel, “Cinemagraph Pro For Mac”, https://flixel.com/products/mac/cinemagraph-pro, 2014, 7 pages. |
Flowplayer, “Slowmotion: Flowplayer,” https://web.archive.org/web/20150226191526/http://flash.flowplayer.org/plugins/streaming/slowmotion.html, Feb. 26, 2015, 4 pages. |
Garcia-Hernandez et al., “Orientation Discrimination of Patterned Surfaces through an Actuated and Non-Actuated Tactile Display”, 2011 IEEE World Haptics Conference, Istanbul, Jun. 21-24, 2011, 3 pages. |
Forlines, et al., “Glimpse: a Novel Input Model for Multi-level Devices”, Chi '05 Extended Abstracts on Human Factors in Computing Systems, Apr. 2, 2005, 4 pages. |
Gardner, “Recenz—Recent Apps In One Tap”, YouTube, https://www.youtube.com/watch?v=qailSHRgsTo, May 15, 2015, 1 page. |
Geisler, “Enriched Links: A Framework For Improving Web Navigation Using Pop-Up Views”, Journal of the American Society for Information Science, Chapel Hill, NC, Jan. 1, 2000, 13 pages. |
Gonzalo et al., “Zliding: Fluid Zooming and Sliding for High Precision Parameter Manipulation”, Department of Computer Science, University of Toronto, Seattle, Washington, Oct. 23, 2005, 10 pages. |
Google-Chrome, “Android 5.0 Lollipop”, http://androidlover.net/android-os/android-5-0-lollipop/android-5-0-lollipop-recent-apps-card-google-search.html, Oct. 19, 2014, 10 pages. |
Grant, “Android's Notification Center”, https://www.objc.io/issues/11-android/android-notifications, Apr. 30, 2014, 26 pages. |
Gurman, “Force Touch on iPhone 6S Revealed: Expect Shortcuts, Faster Actions, iOS”, 9To5Mac, Aug. 10, 2015, 31 pages. |
IBM et al., “Pressure-Sensitive Icons”, IBM Technical Disclosure Bulletin, vol. 33, No. 1B, Jun. 1, 1990, 3 pages. |
ICIMS Recruiting Software, “Blackberry Playbook Review,” http://www.tested.com/tech/tablets/5749-blackberry-playbook-review/, 2015, 11 pages. |
IPhoneHacksTV, “Confero allows you to easily manage your Badge notifications—iPhone Hacks”, YouTube, https://www.youtube.com/watch?v=JCk61pnL4SU, Dec. 26, 2014, 3 pages. |
IPhoneOperator, “Wasser Liveeffekt fur Homescreen & Lockscreen—Aquaboard (Cydia)”, http://www.youtube.com/watch?v=fG9YMF-mB0Q, Sep. 22, 2012, 3 pages. |
IPodHacks 142: “Water Ripple Effects On The Home and Lock Screen: AquaBoard Cydia Tweak Review”, YouTube, https://www.youtube.com/watch?v=Auu_uRaYHJs, Sep. 24, 2012, 3 pages. |
Jauregui, “Design and Evaluation of 3D Cursors and Motion Parallax for the Exploration of Desktop Virtual Environments”, IEEE Symposium on 3D User Interfaces 2012, Mar. 4, 2012, 8 pages. |
Jones, “Touch Screen with Feeling”, IEEE Spectrum, spectrum.ieee.org/computing/hardware/touch-screens-with-feeling, May 1, 2009, 2 pages. |
Kaaresoja, “Snap-Crackle-Pop: Tactile Feedback for Mobile Touch Screens,” Nokia Research Center, Helsinki, Finland, Proceedings of Eurohaptics vol. 2006, Jul. 3, 2006, 2 pages. |
Kiener, “Force Touch on iPhone”, https://www.youtube.com/watch?v=CEMmnsU5fC8, Aug. 4, 2015, 4 pages. |
Kleinman, “iPhone 6s Said to Sport Force Touch Display, 2GB of RAM”, https://www.technobuffalo.com/2015/01/15/iphone-6s-said-to-sport-force-touch-display-2gb-of-ram, Jan. 15, 2015, 2 pages. |
Kost, “LR3-Deselect All Images But One”, Julieanne Kost's Blog, blogs.adobe.com/jkost/2011/12/lr3-deselect-all-images-but-one.html, Dec. 22, 2011, 1 page. |
Kronfli, “HTC Zoe Comes To Google Play, Here's Everything You Need To Know,” Know Your Mobile, http://www.knowyourmobile.com/htc/htc-one/19550/what-htc-zoe, Aug. 14, 2014, 5 pages. |
Kumar, “How to Enable Ripple Effect on Lock Screen of Galaxy S2”, YouTube, http://www.youtube.com/watch?v=B9-4M5abLXA, Feb. 12, 2013, 3 pages. |
Kurdi, “XnView Shell Extension: A Powerful Image Utility Inside The Context Menu”, http://www.freewaregenius.com/xnview-shell-extension-a-powerful-image-utility-inside-the-context-menu, Jul. 30, 2008, 4 pages. |
Laurie, “The Power of the Right Click,” http://vlaurie.com/right-click/customize-context-menu.html, 2002-2016, 3 pages. |
MacKenzie et al., “The Tactile Touchpad”, Chi '97 Extended Abstracts on Human Factors in Computing Systems Looking to the Future, Chi '97, Mar. 22, 1997, 5 pages. |
Mahdi, Confero now available in Cydia, brings a new way to manage Notification badges [Jailbreak Tweak], http://www.iphonehacks.com/2015/01/confero/tweak-manage-notification-badges.html, Jan. 1, 2015, 2 pages. |
Matthew, “How to Preview Photos and Images From Right-Click Context Menu in Windows [Tip]”, http://www.dottech.org/159009/add-image-preview-in-windows-context-menu-tip, Jul. 4, 2014, 5 pages. |
McGarry, “Everything You Can Do With Force Touch on Apple Watch”, Macworld, www.macworld.com, May 6, 2015, 4 pages. |
McRitchie, “Internet Explorer Right-Click Menus,” http://web.archive.org/web-201405020/http:/dmcritchie.mvps.org/ie/rightie6.htm, May 2, 2014, 10 pages. |
Microsoft, “Lumia—How to Personalize Your Start Screen”, https://www.youtube.com/watch?v=6GI5Z3TrSEs, Nov. 11, 2014, 3 pages. |
Microsoft, “Use Radial Menus to Display Commands in OneNote for Windows 8,” https://support.office.com/en-us/article/Use-radial-menues-to-display-OneNote-Commands-Od75f03f-cde7-493a-a8a0b2ed6f99fbe2, 2016, 5 pages. |
Minsky, “Computational Haptics The Sandpaper System for Synthesizing Texture for a Force-Feedback Display,” Massachusetts Institute of Technology, Jun. 1978, 217 pages. |
Mitroff, “Google Android 5.0 Lollipop,” http://www.cnet.com/products/google-android-5-0-lollipop, Mar. 12, 2015, 5 pages. |
Mohr, “Do Not Disturb—The iPhone Feature You Should Be Using”, http://www.wonderoftech.com/do-not-disturb-iphone, Jul. 14, 2014, 30 pages. |
Nacca, “NiLS Lock Screen Notifications / Floating Panel—Review”, https://www.youtube.com/watch?v=McT4QnS9TDY, Feb. 3, 2014, 4 pages. |
Neuburg, “Detailed Explanation iOS SDK”, O'Reilly Japan, Dec. 22, 2014, vol. 4, pp. 175-186, 15 pages. |
Nickinson, How to Use Do Not Disturb on the HTC One M8, https://www.androidcentral.com/how-to-use-do-not-disturb-htc-one-m8, Apr. 7, 2014, 9 pages. |
Nickinson, “Inside Android 4.2: Notifications and Quick Settings”, https://www.androidcentral.com/inside-android-42-notifications-and-quick-settings, Nov. 3, 2012, 3 pages. |
Nikon, “Scene Recognition System and Advanced SRS,” http://www.nikonusa.com/en/Learn-And-Explore/Article/ftlzi4rr/Scene-Recognition-System.html, Jul. 22, 2015, 2 pages. |
Nishino, “A Touch Screen Interface Design with Tactile Feedback”, Computer Science, 2011 International Conference on Complex, Intelligent, and Software Intensive Systems, 2011, 4 pages. |
Ogino, “iOS 7 Design Standard”, Japan, Impress Japan Corporation, 1st edition, Nov. 21, 2013, 2 pages. |
Oh, et al., “Moving Objects with 2D Input Devices in CAD Systems and Desktop Virtual Environments”, Proceedings of Graphics Interface 2005, May 2005, 8 pages. |
O'Hara, et al., “Pressure-Sensitive Icons”, ip.com Journal, ip.com Inc., West Henrietta, NY, US, Jun. 1, 1990, 2 pages. |
Pallenberg, “Wow, the new iPad had gestures.” https://plus.google.com/+SaschaPallenberg/posts/aaJtJogu8ac, Mar. 7, 2012, 2 pages. |
Phonebuff, “How To Pair Bluetooth On The iPhone”, https://www.youtube.com/watch?v=LudNwEar9A8, Feb. 8, 2012, 3 pages. |
Plaisant et al., “Touchscreen Toggle Design”, Proceedings of CHI '92, pp. 667-668, May 3-7, 1992, 2 pages. |
PoliceOne.com, “COBAN Technologies Pre-Event Buffer & Fail Safe Feature,” http://www.policeone.com/police-products/police-technology/mobile-computers/videos/5955587-COBAN-Technologies-Pre-Event, Nov. 11, 2010, 2 pages. |
Pradeep, “Android App Development—Microsoft Awarded With Patents On Gestures Supported On Windows 8,” http://mspoweruser.com/microsoft-awarded-with-patents-on-gestures-supported-on-windows-8/, Aug. 25, 2011, 16 pages. |
“Quickly Preview Songs in Windows Media Player 12 in Windows 7,” Quickly Preview Songs in Windows Media Player 12 in Windows 7. How-to Geek, Apr. 28, 2010, Web. May 8, 2010, http://web.archive.org/web/20100502013134/http://www.howtogeek.com/howto/16157/quickly-preview-songs-in-windows-media-center-12-in-windows-7>, 6 pages. |
Quinn, et al., “Zoofing! Faster List Selections with Pressure-Zoom-Flick-Scrolling”, Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group on Design, Nov. 23, 2009, ACM Press, vol. 411, 8 pages. |
Rekimoto, et al., “PreSense: Interaction Techniques for Finger Sensing Input Devices”, Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, Nov. 30, 2003, 10 pages. |
Rekimoto, et al., “PreSense II: Bi-directional Touch and Pressure Sensing Interactions with Tactile Feedback”, Conference on Human Factors in Computing Systems Archive, ACM, Apr. 22, 2006, 6 pages. |
Rekimoto, et al., “SmartPad: A Finger-Sensing Keypad for Mobile Interaction”, CHI 2003, Ft. Lauderdale, Florida, ACM 1-58113-637-April 5-10, 2003, 2 pages. |
Ritchie, “How to see all the unread message notifications on your iPhone, all at once, all in the same place | iMore”, https://www.imore.com/how-see-all-unread-message-notifications-your-iphone-all-once-all-same-place, Feb. 22, 2014, 2 pages. |
Roth et al., “Bezel Swipe: Conflict-Free Scrolling and Multiple Selection on Mobile Touch Screen Devices,” CHI 2009, Boston, Massachusetts, USA, Apr. 4-9, 2009, 4 pages. |
Rubino et al., “How to Enable ‘Living Images’ on your Nokia Lumia with Windows Phone 8.1”, https://www.youtube.com/watch?v=RX7vpoFy1Dg, Jun. 6, 2014, 5 pages. |
Sony, “Intelligent Scene Recognition,” https://www.sony-asia.com/article/252999/section/product/product/dsc-t77, downloaded on May 20, 2016, 5 pages. |
Sood, “MultitaskingGestures”, http://cydia.saurik.com/package/org.thebigboxx.multitaskinggestures/, Mar. 3, 2014, 2 pages. |
Stewart, et al., “Characteristics of Pressure-Based Input for Mobile Devices”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Apr. 2010, 10 pages. |
Stross, “Wearing A Badge, and a Video Camera,” The New York Times, http://www.nytimes.com/2013/04/07/business/wearable-video-cameras-for-police-officers.html?_r=0, Apr. 6, 2013, 4 pages. |
Taser, “Taser Axon Body Camera User Manual,” https://www.taser.com/images/support/downloads/product-resources/axon_body_product_manual.pdf, Oct. 1, 2013, 24 pages. |
Tidwell, “Designing Interfaces,” O'Reilly Media, Inc., USA, Nov. 2005, 348 pages. |
Tweak, “QuickCenter—Add 3D-Touch Shortcuts to Control Center”, https://www.youtube.com/watch?v=8rHOFpGvZFM, Mar. 22, 2016, 2 pages. |
Tweak, “iOS 10 Tweak on iOS 9.0.2 Jailbreak & 9.2.1-9.3 Support: QuickCenter 3D Touch Cydia Tweak!”, https://www.youtube.com/watch?v=opOBr30_Fkl, Mar. 6, 2016, 3 pages. |
UpDown-G, “Using Multiple Selection Mode in Android 4.0 / Getting Started”, https://techbooster.org/android/13946, Mar. 7, 2012, 7 pages. |
VGJFeliz, “How to Master Android Lollipop Notifications in Four Minutes!”, https://www.youtube.com/watch?v=S-zBRG7GJgs, Feb. 8, 2015, 5 pages. |
VisioGuy, “Getting a Handle on Selecting and Subselecting Visio Shapes”, http://www.visguy.com/2009/10/13/getting-a-handle-on-selecting-and-subselecting-visio-shapes/, Oct. 13, 2009, 18 pages. |
Viticci, “Apple Watch: Our Complete Overview—MacStories”, https://www.macstories.net, Sep. 10, 2014, 21 pages. |
Wikipedia, “AirDrop,” Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/AirDrop, May 17, 2016, 5 pages. |
Wikipedia, “Cinemagraph,” Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Cinemagraph, Last Modified Mar. 16, 2016, 2 pages. |
Wikipedia, “Context Menu,” Wikipedia, the free encyclopedia, https://en.wikipedia.org/wiki/Context_menu, Last Modified May 15, 2016, 4 pages. |
Wikipedia, “HTC One (M7),” Wikipedia, the free encyclopedia, https://en.wikipedia.org/wiki/HTC_One_(M7), Mar. 2013, 20 pages. |
Wikipedia, “Mobile Ad Hoc Network,” Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Mobile_ad_hoc_network, May 20, 2016, 4 pages. |
Wikipedia, “Pie Menu,” Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Pie_menu, Last Modified Jun. 4, 2016, 3 pages. |
Wikipedia, “Quick Look,” Wikipedia, the free encyclopedia, https://en.wikipedia.org/wiki/Quick_Look, Last Modified Jan. 15, 2016, 3 pages. |
Wikipedia, “Sony Xperia Z1”, Wikipedia, the free encyclopedia, https://en.wikipedia.org/wiki/Sony_Xperia_Z1, Sep. 2013, 10 pages. |
Wilson et al., “Augmenting Tactile Interaction with Pressure-Based Input”, School of Computing Science, Glasgow, UK, Nov. 15-17, 2011, 2 pages. |
Yang et al., “Affordance Application on Visual Interface Design of Desk-Top Virtual Experiments”, 2014 International Conference on Information Science, Electronics and Electrical Engineering, IEEE, vol. 1, Apr. 26, 2014, 5 pages. |
Yatani et al., “SemFeel: A User Interface with Semantic Tactile Feedback for Mobile Touch-Screen Devices”, Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (UIST '09), Oct. 2009, 10 pages. |
Youtube, “Android Lollipop Lock-Screen Notification Tips”, https://www.youtube.com/watch?v=LZTxHBOwzlU, Nov. 13, 2014, 3 pages. |
Youtube, “Blackberry Playbook bezel interaction,” https://www.youtube.com/watch?v=YGkzFqnOwXI, Jan. 10, 2011, 2 pages. |
Youtube, “How to Master Android Lollipop Notifications in Four Minutes!”, Video Gadgets Journal (VGJFelix), https://www.youtube.com/watch?v=S-zBRG7GGJgs, Feb. 8, 2015, 4 pages. |
Youtube, “HTC One Favorite Camera Features”, http://www.youtube.com/watch?v=sUYHfcjl4RU, Apr. 28, 2013, 3 pages. |
Youtube, “Multitasking Gestures: Zephyr Like Gestures on iOS”, https://www.youtube.com/watch?v=Jcod-f7Lw0l, Jan. 27, 2014, 3 pages. |
Youtube, “Recentz—Recent Apps in A Tap”, https://www.youtube.com/watch?v=qailSHRgsTo, May 15, 2015, 1 page. |
Zylom, “House Secrets”, http://game.zylom.com/servlet/Entry?g=38&s=19521&nocache=1438641323066, Aug. 3, 2015, 1 page. |
Office Action, dated Mar. 15, 2017, received in U.S. Appl. No. 14/535,671, 13 pages. |
Office Action, dated Nov. 30, 2017, received in U.S. Appl. No. 14/535,671, 21 pages. |
Notice of Allowance, dated Sep. 5, 2018, received in U.S. Appl. No. 14/535,671, 5 pages. |
Office Action, dated Jun. 29, 2017, received in U.S. Appl. No. 14/608,895, 30 pages. |
Final Office Action, dated Feb. 22, 2018, received in U.S. Appl. No. 14/608,895, 20 pages. |
Notice of Allowance, dated Jun. 26, 2018, received in U.S. Appl. No. 14/608,895, 9 pages. |
Office Action, dated Dec. 18, 2015, received in Australian Patent Application No. 2013368440, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Oct. 18, 2016, received in Australian Patent Application No. 2013368440, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Notice of Allowance, dated Dec. 20, 2016, received in Australian Patent Application No. 2013368440, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Certificate of Grant, dated Apr. 29, 2017, received in Australian Patent Application No. 2013368440, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Nov. 6, 2017, received in Chinese Patent Application No. 201380068493.6, which corresponds with U.S. Appl. No. 14/608,895, 5 pages. |
Office Action, dated Oct. 9, 2018, received in Chinese Patent Application No. 201380068493.6, which corresponds with U.S. Appl. No. 14/608,895, 3 pages. |
Patent, dated Dec. 25, 2018, received in Chinese Patent Application No. 201380068493.6, which corresponds with U.S. Appl. No. 14/608,895, 4 pages. |
Office Action, dated Jul. 21, 2016, received in European Patent Application No. 13795391.5, which corresponds with U.S. Appl. No. 14/536,426, 9 pages. |
Office Action, dated Mar. 9, 2018, received in European Patent Application No. 13795391.5, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Intention to Grant, dated Jul. 6, 2018, received in European Patent Application No. 13795391.5, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Certificate of Grant, dated Dec. 26, 2018, received in European Patent Application No. 13795391.5, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Office Action, dated Sep. 13, 2016, received in Japanese Patent Application No. 2015-547948, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Patent, dated May 12, 2017, received in Japanese Patent Application No. 2015-547948, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Apr. 5, 2016, received in Korean Patent Application No. 10-2015-7018851, which corresponds with U.S. Appl. No. 14/536,426, 7 pages. |
Office Action, dated Feb. 24, 2017, received in Korean Patent Application No. 10-2015-7018851, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Patent, dated May 26, 2017, received in Korean Patent Application No. 2015-7018851, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Oct. 5, 2018, received in Korean Patent Application No. 2018-7028236, which corresponds with U.S. Appl. No. 14/608,895, 6 pages. |
Notice of Allowance, dated May 24, 2019, received in Korean Patent Application No. 2018-7028236, which corresponds with U.S. Appl. No. 14/608,895, 4 pages. |
Patent, dated Jul. 9, 2019, received in Korean Patent Application No. 2018-7028236, which corresponds with U.S. Appl. No. 14/608,895, 4 pages. |
Office Action, dated Jul. 26, 2017, received in U.S. Appl. No. 14/536,235, 14 pages. |
Final Office Action, dated Feb. 26, 2018, received in U.S. Appl. No. 14/536,235, 13 pages. |
Notice of Allowance, dated Aug. 15, 2018, received in U.S. Appl. No. 14/536,235, 5 pages. |
Office Action, dated Apr. 5, 2017, received in U.S. Appl. No. 14/536,367, 16 pages. |
Notice of Allowance, dated Nov. 30, 2017, received in U.S. Appl. No. 14/536,367, 9 pages. |
Notice of Allowance, dated May 16, 2018, received in U.S. Appl. No. 14/536,367, 5 pages. |
Office Action, dated Dec. 17, 2015, received in U.S. Appl. No. 14/536,426, 28 pages. |
Final Office Action, dated May 6, 2016, received in U.S. Appl. No. 14/536,426, 23 pages. |
Office Action, dated Aug. 3, 2017, received in U.S. Appl. No. 14/536,426, 10 pages. |
Office Action, dated Jul. 15, 2015, received in Australian Patent Application No. 2013259606, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Notice of Allowance, dated May 23, 2016, received in Australian Patent Application No. 2013259606, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Certificate of Grant, dated Sep. 15, 2016, received in Australian Patent Application No. 2013259606, which corresponds with U.S. Appl. No. 14/536,426, 1 page. |
Office Action, dated Nov. 18, 2015, received in Australian Patent Application No. 2015101231, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated May 15, 2017, received in Australian Patent Application No. 2016216580, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated May 8, 2018, received in Australian Patent Application No. 2016216580, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Notice of Allowance, dated May 17, 2018, received in Australian Patent Application No. 2016216580, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Certificate of Grant, dated Sep. 13, 2018, received in Australian Patent Application No. 2016216580, which corresponds with U.S. Appl. No. 14/536,426, 1 page. |
Office Action, dated Apr. 12, 2019, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Nov. 18, 2019, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Feb. 18, 2020, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Notice of Allowance, dated Mar. 27, 2020, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Certificate of Grant, dated Jul. 23, 2020, received in Australian Patent Application No. 2018223021, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Office Action, dated Sep. 19, 2017, received in Chinese Patent Application No. 201380035982.1, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Notice of Allowance, dated May 10, 2018, received in Chinese Patent Application No. 201380035982.1, which corresponds with U.S. Appl. No. 14/536,426, 2 pages. |
Patent, dated Aug. 17, 2018, received in Chinese Patent Application No. 201380035982.1, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Office Action, dated Sep. 20, 2017, received in Chinese Patent Application No. 201510566550.4, which corresponds with U.S. Appl. No. 14/536,426, 11 pages. |
Notice of Allowance, dated Aug. 8, 2018, received in Chinese Patent Application No. 201510566550.4, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Patent, dated Oct. 23, 2018, received in Chinese Patent Application No. 201510566550.4, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Office Action, dated Jan. 4, 2021, received in Chinese Patent Application No. 201810826224.6, which corresponds with U.S. Appl. No. 14/536,426, 6 pages. |
Decision to Grant, dated Jul. 14, 2016, received in European Patent Application No. 13724100.6, which corresponds with U.S. Appl. No. 14/536,426, 1 page. |
Letters Patent, dated Aug. 10, 2016, received in European Patent Application No. 13724100.6, which corresponds with U.S. Appl. No. 14/536,426, 1 page. |
Office Action, dated Jan. 20, 2017, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Office Action, dated Aug. 21, 2017, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Intention to Grant, dated Mar. 9, 2018, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Intention to Grant, dated Aug. 14, 2018, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Decision to Grant, dated Jan. 10, 2019, received in European Patent Application No. 15183980.0 which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Patent, dated Feb. 6, 2019, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 4 pages. |
Office Action, dated Sep. 6, 2019, received in European Patent Application No. 18180503.7, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Certificate of Grant, dated Nov. 10, 2017, received in Hong Kong Patent Application No. 15107535.0, which corresponds with U.S. Appl. No. 14/536,426, 2 pages. |
Certificate of Grant, dated Jul. 5, 2019, received in Hong Kong Patent Application No. 15108892.5, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Patent, dated Nov. 22, 2019, received in Hong Kong Patent Application No. 16107033.6, which corresponds with U.S. Appl. No. 14/536,426, 6 pages. |
Office Action, dated Mar. 4, 2016, received in Japanese Patent Application No. 2015-511644, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Feb. 6, 2017, received in Japanese Patent Application No. 2015-511644, which corresponds with U.S. Appl. No. 14/536,426, 6 pages. |
Notice of Allowance, dated Dec. 8, 2017, received in Japanese Patent Application No. 2015-511644, which corresponds with U.S. Appl. No. 14/536,426, 6 pages. |
Patent, dated Jan. 12, 2018, received in Japanese Patent Application No. 2015-511644, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Nov. 6, 2018, received in Japanese Patent Application No. 2018-000753, which corresponds with U.S. Appl. No. 14/536,426, 8 pages. |
Office Action, dated Oct. 7, 2019, received in Japanese Patent Application No. 2018-000753, which corresponds with U.S. Appl. No. 14/536,426, 5 pages. |
Office Action, dated Feb. 8, 2021, received in Japanese Patent Application No. 2018-000753, which corresponds with U.S. Appl. No. 14/536,426, 2 pages. |
Office Action, dated Mar. 9, 2017, received in U.S. Appl. No. 14/536,464, 21 pages. |
Final Office Action, dated Aug. 25, 2017, received in U.S. Appl. No. 14/536,464, 30 pages. |
Office Action, dated Feb. 12, 2018, received in U.S. Appl. No. 14/536,464, 33 pages. |
Final Office Action, dated Jun. 22, 2018, received in U.S. Appl. No. 14/536,464, 32 pages. |
Notice of Allowance, dated Jan. 25, 2021, received in U.S. Appl. No. 14/536,464, 5 pages. |
Notice of Allowance, dated Feb. 23, 2021, received in U.S. Appl. No. 14/536,464, 5 pages. |
Office Action, dated Sep. 25, 2017, received in U.S. Appl. No. 14/536,644, 29 pages. |
Final Office Action, dated May 3, 2018, received in U.S. Appl. No. 14/536,644, 28 pages. |
Office Action, dated Nov. 2, 2018, received in U.S. Appl. No. 14/536,644, 24 pages. |
Notice of Allowance, dated Jul. 2, 2019, received in U.S. Appl. No. 14/536,644, 5 pages. |
Office Action, dated Oct. 19, 2017, received in U.S. Appl. No. 14/608,926, 14 pages. |
Final Office Action, dated Jun. 6, 2018, received in U.S. Appl. No. 14/608,926, 19 pages. |
Notice of Allowance, dated Apr. 10, 2019, received in U.S. Appl. No. 14/608,926, 16 pages. |
Notice of Allowance, dated May 21, 2019, received in U.S. Appl. No. 14/608,926, 5 pages. |
Office Action, dated Feb. 1, 2016, received in Australian Patent Application No. 2013368441, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Notice of Allowance, dated Mar. 30, 2016, received in Australian Patent Application No. 2013368441, which corresponds with U.S. Appl. No. 14/608,926, 1 page. |
Certificate of Grant, dated Jul. 29, 2016, received in Australian Patent Application No. 2013368441, which corresponds with U.S. Appl. No. 14/608,926, 1 page. |
Office Action, dated Jan. 3, 2017, received in Australian Patent Application No. 2016201451, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Notice of Acceptance, dated Dec. 20, 2017, received in Australian Patent Application No. 2016201451, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Certificate of Grant, dated May 3, 2018, received in Australian Patent Application No. 2016201451, which corresponds with U.S. Appl. No. 14/608,926, 1 page. |
Office Action, dated May 4, 2017, received in Chinese Patent Application No. 201380068414.1, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Notice of Allowance, dated Feb. 8, 2018, received in Chinese Patent Application No. 201380068414.1, which corresponds with U.S. Appl. No. 14/608,926, 2 pages. |
Patent, dated May 4, 2018, received in Chinese Patent Application No. 201380068414.1, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Office Action, dated Dec. 1, 2020, received in Chinese Patent Application No. 201810369259.1, which corresponds with U.S. Appl. No. 14/608,926, 14 pages. |
Office Action, dated Apr. 21, 2016, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 6 pages. |
Office Action, dated May 6, 2016, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 6 pages. |
Office Action, dated Nov. 11, 2016, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 6 pages. |
Office Action, dated Jul. 4, 2017, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Oral Summons, dated Feb. 13, 2017, received in European Patent Application No. 13795392.3, which corresponds with U.S. Appl. No. 14/608,926, 11 pages. |
Office Action, dated Mar. 14, 2016, received in Japanese Patent Application No. 2015-549392, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Notice of Allowance, dated Jan. 17, 2017, received in Japanese Patent Application No. 2015-549392, which corresponds with U.S. Appl. No. 14/608,926, 2 pages. |
Patent, dated Feb. 17, 2017, received in Japanese Patent Application No. 2015-549392, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Patent, dated Apr. 27, 2018, received in Japanese Patent Application No. 2017-024234, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Office Action, dated Feb. 22, 2019, received in Japanese Patent Application No. 2018-079290, which corresponds with U.S. Appl. No. 14/608,926, 7 pages. |
Office Action, dated Sep. 30, 2019, received in Japanese Patent Application No. 2018-079290, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Notice of Allowance, dated Apr. 3, 2020, received in Japanese Patent Application No. 2018-079290, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Patent, dated Apr. 14, 2020, received in Japanese Patent Application No. 2018-079290, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Office Action, dated May 12, 2016, received in Korean Patent Application No. 10-2015-7018853, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Notice of Allowance, dated Mar. 31, 2017, received in Korean Patent Application No. 2015-7018853, which corresponds with U.S. Appl. No. 14/608,926, 4 pages. |
Patent, dated Jun. 30, 2017, received in Korean Patent Application No. 2015-7018853, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Office Action, dated Aug. 22, 2017, received in Korean Patent Application No. 2017-7018250, which corresponds with U.S. Appl. No. 14/608,926, 2 pages. |
Notice of Allowance, dated Dec. 29, 2017, received in Korean Patent Application No. 2017-7018250, which corresponds with U.S. Appl. No. 14/608,926, 3 pages. |
Office Action, dated Oct. 19, 2017, received in U.S. Appl. No. 14/536,646, 21 pages. |
Notice of Allowance, dated Aug. 9, 2018, received in U.S. Appl. No. 14/536,646, 5 pages. |
Office Action, dated Jul. 17, 2015, received in Australian Patent Application No. 2013259613, which corresponds with U.S. Appl. No. 14/536,646, 5 pages. |
Office Action, dated May 31, 2016, received in Australian Patent Application No. 2013259613, which corresponds with U.S. Appl. No. 14/536,646, 4 pages. |
Notice of Allowance, dated Jul. 5, 2016, received in Australian Patent Application No. 2013259613, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Office Action, dated Jun. 6, 2019, received in Australian Patent Application No. 2018256626, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Notice of Acceptance, dated Aug. 1, 2019, received in Australian Patent Application No. 2018256626, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Certificate of Grant, dated Dec. 5, 2019, received in Australian Patent Application No. 2018256626, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Office Action, dated Dec. 1, 2016, received in Chinese Patent Application No. 201380036205.9, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Notice of Allowance, dated Oct. 9, 2017, received in Chinese Patent Application No. 201380036205.9, which corresponds with U.S. Appl. No. 14/536,646, 3 pages. |
Office Action, dated Jul. 3, 2020, received in Chinese Patent Application No. 14/536,646.X, which corresponds with U.S. Appl. No. 14/536,646, 13 pages. |
Office Action, dated Oct. 26, 2020, received in Chinese Patent Application No. 201711422092.2, which corresponds with U.S. Appl. No. 14/536,646, 20 pages. |
Notice of Allowance, dated Mar. 22, 2021, received in Chinese Patent Application No. 201711422092.2, which corresponds with U.S. Appl. No. 14/536,646, 2 pages. |
Office Action, dated Nov. 12, 2015, received in European Patent Application No. 13724102.2, which corresponds with U.S. Appl. No. 14/536,646, 6 pages. |
Office Action, dated May 31, 2016, received in European Patent Application No. 13724102.2, which corresponds with U.S. Appl. No. 14/536,646, 5 pages. |
Notice of Allowance, dated Jan. 4, 2017, received in European Patent Application No. 13724102.2, which corresponds with U.S. Appl. No. 14/536,646, 5 pages. |
Patent, dated May 26, 2017, received in European Patent Application No. 13724102.2, which corresponds with U.S. Appl. No. 14/536,646, 1 page. |
Office Action, dated Feb. 29, 2016, received in Japanese Patent Application No. 2015-511645, which corresponds with U.S. Appl. No. 14/536,646, 5 pages. |
Notice of Allowance, dated Dec. 22, 2016, received in Japanese Patent Application No. 2015-511645, which corresponds with U.S. Appl. No. 14/536,646, 2 pages. |
Certificate of Grant, dated Jan. 25, 2019, received in Hong Kong Patent Application No. 2015-511645, which corresponds with U.S. Appl. No. 14/536,646, 4 pages. |
Office Action, dated Apr. 3, 2017, received in U.S. Appl. No. 14/536,141, 11 pages. |
Notice of Allowance, dated Sep. 20, 2017, received in U.S. Appl. No. 14/536,141, 10 pages. |
Office Action, dated Aug. 27, 2015, received in Australian Patent Application No. 2013259614, which corresponds with U.S. Appl. No. 14/536,141, 4 pages. |
Notice of Allowance, dated Aug. 15, 2016, received in Australian Patent Application No. 2013259614, which corresponds with U.S. Appl. No. 14/536,141, 1 page. |
Office Action, dated Jul. 21, 2017, received in Australian Patent Application No. 2016262773, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Notice of Acceptance, dated Jul. 19, 2018, received in Australian Patent Application No. 2016262773, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Office Action, dated Jun. 5, 2019, received in Australian Patent Application No. 2018256616, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Notice of Acceptance, dated Jan. 22, 2020, received in Australian Patent Application No. 2018256616, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Certificate of Grant, dated May 21, 2020, received in Australian Patent Application No. 2018256616, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Office Action, dated Mar. 3, 2017, received in Chinese Patent Application No. 201380035893.7, which corresponds with U.S. Appl. No. 14/536,141, 8 pages. |
Office Action, dated Feb. 2, 2018, received in Chinese Patent Application No. 201380035893.7, which corresponds with U.S. Appl. No. 14/536,141, 5 pages. |
Notice of Allowance, dated Aug. 31, 2018, received in Chinese Patent Application No. 201380035893.7, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Office Action, dated Mar. 10, 2021, received in Chinese Patent Application No. 201811142423.1, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Patent, dated Oct. 23, 2018, received in Chinese Patent Application No. 201380035893.7, which corresponds with U.S. Appl. No. 14/536,141, 4 pages. |
Office Action, dated Jan. 7, 2016, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 10 pages. |
Office Action, dated Aug. 31, 2016, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 10 pages. |
Office Action, dated Apr. 9, 2018, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 9 pages. |
Office Action, dated Mar. 7, 2019, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 5 pages. |
Intention to Grant, dated Sep. 6, 2019, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 7 pages. |
Decision to Grant, dated Jan. 23, 2020, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 1 page. |
Patent, dated Feb. 19, 2020, received in European Patent Application No. 13726053.5, which corresponds with U.S. Appl. No. 14/536,141, 4 pages. |
Office Action, dated Feb. 29, 2016, received in Japanese Patent Application No. 2015-511646, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Office Action, dated Oct. 25, 2016, received in Japanese Patent Application No. 2015-511646, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Notice of Allowance, dated Jun. 30, 2017, received in Japanese Patent Application No. 2015-511646, which corresponds with U.S. Appl. No. 14/536,141, 5 pages. |
Patent, dated Jul. 28, 2017, received in Japanese Patent Application No. 2015-511646, which corresponds with U.S. Appl. No. 14/536,141, 3 pages. |
Office Action, dated Aug. 10, 2018, received in Japanese Patent Application No. 2017-141953, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Office Action, dated Jul. 5, 2019, received in Japanese Patent Application No. 2017-141953, which corresponds with U.S. Appl. No. 14/536,141, 6 pages. |
Office Action, dated Dec. 8, 2016, received in U.S. Appl. No. 14/608,942, 9 pages. |
Notice of Allowance, dated May 12, 2017, received in U.S. Appl. No. 14/608,942, 10 pages. |
Office Action, dated Jan. 29, 2016, received in Australian Patent Application No. 2013368443, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Notice of Allowance, dated Mar. 11, 2016, received in Australian Patent Application No. 2013368443, which corresponds with U.S. Appl. No. 14/608,942, 2 pages. |
Certificate of Grant, dated Jul. 7, 2016, received in Australian Patent Application No. 2013368443, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Office Action, dated Mar. 29, 2017, received in Australian Patent Application No. 2016201303, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Notice of Acceptance, dated Mar. 7, 2018, received in Australian Patent Application No. 2016201303, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Certificate of Grant, dated Jul. 5, 2018, received in Australian Patent Application No. 2016201303, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Office Action, dated Jun. 16, 2017, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 6 pages. |
Office Action, dated Mar. 28, 2018, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 5 pages. |
Office Action, dated Oct. 8, 2018, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Notice of Allowance, dated May 7, 2019, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Patent, dated Jul. 5, 2019, received in Chinese Patent Application No. 201380068295.X, which corresponds with U.S. Appl. No. 14/608,942, 8 pages. |
Office Action, dated Oct. 7, 2016, received in European Patent Application No. 13798464.7, which corresponds with U.S. Appl. No. 14/608,942, 7 pages. |
Decision to Grant, dated Sep. 13, 2018, received in European Patent Application No. 13798464.7, which corresponds with U.S. Appl. No. 14/608,942, 2 pages. |
Intention to Grant, dated Nov. 8, 2019, received in European Patent Application No. 18194127.9, which corresponds with U.S. Appl. No. 14/608,942, 7 pages. |
Decision to Grant, dated Aug. 20, 2020, received in European Patent Application No. 18194127.9, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Patent, dated Sep. 16, 2020, received in European Patent Application No. 18194127.9, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Certificate of Grant, dated Jul. 26, 2019, received in Hong Kong Patent Application, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Office Action, dated Jul. 4, 2016, received in Japanese Patent Application No. 2015-549393, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Notice of Allowance, dated May 12, 2017, received in Japanese Patent Application No. 2015-549393, which corresponds with U.S. Appl. No. 14/608,942, 5 pages. |
Patent, dated Jun. 16, 2017, received in Japanese Patent Application No. 2015-549393, which corresponds with U.S. Appl. No. 14/608,942, 3 pages. |
Office Action, dated Apr. 5, 2016, received in Korean Patent Application No. 2015-7018448, which corresponds with U.S. Appl. No. 14/608,942, 6 pages. |
Office Action, dated Feb. 24, 2017, received in Korean Patent Application No. 2015-7018448, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Notice of Allowance, dated Jan. 15, 2019, received in Korean Patent Application No. 2015-7018448, which corresponds with U.S. Appl. No. 14/608,942, 5 pages. |
Patent, dated Mar. 8, 2019, received in Korean Patent Application No. 2015-7018448, which corresponds with U.S. Appl. No. 14/608,942, 4 pages. |
Office Action, dated Jul. 17, 2017, received in U.S. Appl. No. 14/536,166, 19 pages. |
Notice of Allowance, dated Feb. 28, 2018, received in U.S. Appl. No. 14/536,166, 5 pages. |
Office Action, dated Aug. 1, 2016, received in U.S. Appl. No. 14/536,203, 14 pages. |
Notice of Allowance, dated Feb. 1, 2017, received in U.S. Appl. No. 14/536,203, 9 pages. |
Office Action, dated Jul. 9, 2015, received in Australian Patent Application No. 2013259630, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Notice of Allowance, dated Jun. 15, 2016, received in Australian Patent Application No. 2013259630, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Certificate of Grant, dated Oct. 21, 2016, received in Australian Patent Application No. 2013259630, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Office Action, dated Jul. 4, 2017, received in Australian Patent Application No. 2016238917, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Notice of Acceptance, dated Jul. 19, 2018, received in Australian Patent Application No. 2016238917, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Certificate of Grant, dated Nov. 1, 2018, received in Australian Patent Application No. 2016238917, which corresponds with U.S. Appl. No. 14/536,203, 1 page. |
Office Action, dated Aug. 20, 2018, received in Australian Patent Application No. 2018250481, which corresponds with U.S. Appl. No. 14/536,203, 2 pages. |
Notice of Allowance, dated Apr. 29, 2020, received in Australian Patent Application No. 2018250481, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Certificate of Grant, dated Sep. 3, 2020, received in Australian Patent Application No. 2018250481, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Office Action, dated Oct. 25, 2017, received in Chinese Patent Application No. 201380035977.0, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Notice of Allowance, dated Apr. 4, 2018, received in Chinese Patent Application No. 201380035977.0, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Patent, dated Jul. 6, 2018, received in Chinese Patent Application No. 201380035977.0, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Office Action, dated Nov. 11, 2015, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Office Action, dated May 31, 2016, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Office Action, dated Dec. 6, 2017, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 9 pages. |
Decision to Grant, dated Oct. 24, 2018, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Intention to Grant, dated Mar. 18, 2019, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 9 pages. |
Decision to Grant, dated Aug. 8, 2019, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 1 page. |
Certificate of Grant, dated Sep. 4, 2019, received in European Patent Application No. 13724104.8, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Patent, dated Sep. 27, 2019, received in Hong Kong Patent Application No. 15108904.1, which corresponds with U.S. Appl. No. 14/536,203, 6 pages. |
Office Action, dated Feb. 15, 2016, received in Japanese Patent Application No. 2015-511650, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Notice of Allowance, dated Aug. 5, 2016, received in Japanese Patent Application No. 2015-511650, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Certificate of Patent, dated Sep. 9, 2016, received in Japanese Patent Application No. 2015-511650, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Office Action, dated Jun. 23, 2017, received in Japanese Patent Application No. 2016-173113, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Notice of Allowance, dated Jan. 12, 2018, received in Japanese Patent Application No. 2016-173113, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Patent, dated Feb. 16, 2018, received in Japanese Patent Application No. 2016-173113, which corresponds with U.S. Appl. No. 14/536,203, 3 pages. |
Office Action, dated Oct. 19, 2018, received in Japanese Patent Application No. 2018-022394, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Office Action, dated Sep. 30, 2019, received in Japanese Patent Application No. 2018-022394, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Office Action, dated Jan. 22, 2021, received in Japanese Patent Application No. 2018-022394, which corresponds with U.S. Appl. No. 14/536,203, 2 pages. |
Office Action, dated Dec. 4, 2015, received in Korean Patent Application No. 2014-7034520, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Notice of Allowance, dated Sep. 1, 2016, received in Korean Patent Application No. 2014-7034520, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Office Action, dated Feb. 6, 2017, received in Korean Patent Application No. 2016-7033834, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Notice of Allowance, dated Oct. 30, 2017, received in Korean Patent Application No. 2016-7033834, which corresponds with U.S. Appl. No. 14/536,203, 5 pages. |
Patent, dated Jan. 23, 2018, received in Korean Patent Application No. 2016-7033834, which corresponds with U.S. Appl. No. 14/536,203, 4 pages. |
Office Action, dated Oct. 20, 2017, received in U.S. Appl. No. 14/608,965, 14 pages. |
Office Action, dated Jul. 2, 2018, received in U.S. Appl. No. 14/608,965, 16 pages. |
Final Office Action, dated Jan. 10, 2019, received in U.S. Appl. No. 14/608,965, 17 pages. |
Notice of Allowance, dated Nov. 7, 2019, received in U.S. Appl. No. 14/608,965, 17 pages. |
Notice of Allowance, dated Jan. 2, 2020, received in U.S. Appl. No. 14/608,965, 5 pages. |
Office Action, dated Oct. 11, 2017, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 5 pages. |
Office Action, dated Aug. 1, 2018, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 5 pages. |
Office Action, dated Nov. 1, 2018, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 3 pages. |
Office Action, dated Apr. 3, 2019, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 3 pages. |
Patent, dated May 17, 2019, received in Chinese Patent Application No. 201380074060.1, which corresponds with U.S. Appl. No. 14/608,965, 6 pages. |
Office Action, dated Jul. 22, 2016, received in European Patent Application No. 13798465.4, which corresponds with U.S. Appl. No. 14/608,965, 3 pages. |
Oral Proceedings, dated Mar. 7, 2018, received in European Patent Application No. 13798465.4, which corresponds with U.S. Appl. No. 14/608,965, 5 pages. |
Decision to Grant, dated Sep. 6, 2018, received in European Patent Application No. 13798465.4, which corresponds with U.S. Appl. No. 14/608,965, 2 pages. |
Office Action, dated Oct. 20, 2016, received in U.S. Appl. No. 14/536,247, 10 pages. |
Final Office Action, dated Mar. 24, 2017, received in U.S. Appl. No. 14/536,247, 14 pages. |
Notice of Allowance, dated Nov. 22, 2017, received in U.S. Appl. No. 14/536,247, 6 pages. |
Office Action, dated Mar. 24, 2017, received in U.S. Appl. No. 14/536,267, 12 pages. |
Notice of Allowance, dated Nov. 9, 2017, received in U.S. Appl. No. 14/536,267, 8 pages. |
Notice of Allowance, dated Jun. 1, 2018, received in U.S. Appl. No. 14/536,267, 5 pages. |
Office Action, dated Aug. 10, 2015, received in Australian Patent Application No. 2013259637, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Allowance, dated Jun. 28, 2016, received in Australian Patent Application No. 2013259637, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Certificate of Grant, dated Oct. 21, 2016, received in Australian Patent Application No. 2013259637, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Mar. 24, 2017, received in Australian Patent Application No. 2016204411, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Acceptance, dated Feb. 27, 2018, received in Australian Patent Application No. 2016204411, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Certificate of Grant, dated Jun. 28, 2018, received in Australian Patent Application No. 2016204411, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Mar. 15, 2019, received in Australian Patent Application No. 2018204236, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Notice of Acceptance, dated Apr. 29, 2019, received in Australian Patent Application No. 2018204236, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Certificate of Grant, dated Aug. 28, 2019, received in Australian Patent Application No. 2018204236, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Dec. 9, 2016, received in Chinese Patent Application No. 2016120601564130, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Notice of Allowance, dated Jan. 29, 2018, received in Chinese Patent Application No. 201380035968.1, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Patent, dated Apr. 20, 2018, received in Chinese Patent Application No. 201380035968.1, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Nov. 28, 2018, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Office Action, dated Jul. 11, 2019, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Sep. 30, 2019, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Dec. 20, 2019, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Apr. 20, 2020, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Patent, dated Sep. 29, 2020, received in Chinese Patent Application No. 201610537334.1, which corresponds with U.S. Appl. No. 14/536,267, 7 pages. |
Office Action, dated Jun. 13, 2018, received in Chinese Patent Application No. 201810332044.2, which corresponds with U.S. Appl. No. 14/536,267, 2 pages. |
Office Action, dated Jan. 20, 2021, received in Chinese Patent Application No. 201810332044.2, which corresponds with U.S. Appl. No. 14/536,267, 15 pages. |
Office Action, dated Jan. 25, 2018, received in European Patent Application No. 13724106.3, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Intention to Grant, dated Jun. 27, 2018, received in European Patent Application No. 13724106.3, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Decision to Grant, dated Oct. 18, 2018, received in European Patent Application No. 13724106.3, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Grant Certificate, dated Nov. 14, 2018, received in European Patent Application No. 13724106.3, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Sep. 13, 2017, received in European Patent Application No. 16177863.4, which corresponds with U.S. Appl. No. 14/536,267, 6 pages. |
Decision to Grant, dated Nov. 29, 2018, received in European Patent Application No. 16177863.4, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Patent, dated Dec. 26, 2018, received in European Patent Application No. 16177863.4, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Aug. 29, 2019, received in European Patent Application No. 18183789.9, which corresponds with U.S. Appl. No. 16/262,800, 9 pages. |
Office Action, dated Aug. 21, 2020, received in European Patent Application No. 18183789.9, which corresponds with U.S. Appl. No. 16/262,800, 9 pages. |
Patent, dated Aug. 30, 2019, received in Hong Kong Patent Application No. 15107537.8, which corresponds with U.S. Appl. No. 14/536,267, 9 pages. |
Patent, dated Nov. 8, 2019, received in Hong Kong Patent Application No. 15108890.7, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Jan. 29, 2016, received in Japanese Patent Application No. 2015-511652, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Allowance, dated Sep. 26, 2016, received in Japanese Patent Application No. 2015-511652, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Office Action, dated Mar. 3, 2017, received in Japanese Patent Application No. 2016-125839, which corresponds with U.S. Appl. No. 14/536,267, 6 pages. |
Notice of Allowance, dated Nov. 17, 2017, received in Japanese Patent Application No. 2016-125839, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Office Action, dated Feb. 4, 2019, received in Japanese Patent Application No. 2017-237035, which corresponds with U.S. Appl. No. 14/536,267, 7 pages. |
Notice of Allowance, dated Sep. 9, 2019, received in Japanese Patent Application No. 2017-237035, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Patent, dated Sep. 27, 2019, received in Japanese Patent Application No. 2017-237035, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Dec. 4, 2015, received in Korean Patent Application No. 2014-7034530, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Notice of Allowance, dated Sep. 1, 2016, received in Korean Patent Application No. 2014-7034530, which corresponds with U.S. Appl. No. 14/536,267, 3 pages. |
Office Action, dated Jan. 5, 2017, received in Korean Patent Application No. 2016-7029533, which corresponds with U.S. Appl. No. 14/536,267, 2 pages. |
Notice of Allowance, dated Sep. 1, 2017, received in Korean Patent Application No. 2016-7029533, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Patent, dated Dec. 1, 2017, received in Korean Patent Application No. 2016-7029533, which corresponds with U.S. Appl. No. 14/536,267, 2 pages. |
Office Action, dated Jan. 29, 2018, received in Korean Patent Application No. 2017-7034838, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Notice of Allowance, dated Dec. 3, 2018, received in Korean Patent Application No. 2017-7034838, which corresponds with U.S. Appl. No. 14/536,267, 5 pages. |
Patent, dated Mar. 4, 2019, received in Korean Patent Application No. 2017-7034838, which corresponds with U.S. Appl. No. 14/536,267, 4 pages. |
Office Action, dated Apr. 7, 2017, received in U.S. Appl. No. 14/536,291, 11 pages. |
Notice of Allowance, dated Dec. 1, 2017, received in U.S. Appl. No. 14/536,291, 19 pages. |
Notice of Allowance, dated Mar. 20, 2018, received in U.S. Appl. No. 14/536,291, 5 pages. |
Office Action, dated Aug. 18, 2015, received in Australian Patent Application No. 2013259642, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Office Action, dated Jul. 25, 2016, received in Australian Patent Application No. 2013259642, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Office Action, dated Aug. 10, 2016, received in Australian Patent Application No. 2013259642, which corresponds with U.S. Appl. No. 14/536,291, 4 pages. |
Office Action, dated Jul. 21, 2017, received in Australian Patent Application No. 2016216658, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Notice of Acceptance, dated Jul. 19, 2018, received in Australian Patent Application No. 2016216658, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Patent, dated Nov. 30, 2018, received in Australian Patent Application No. 2016216658, which corresponds with U.S. Appl. No. 14/536,291, 4 pages. |
Innovation Patent, dated Sep. 1, 2016, received in Australian Patent Application No. 2016101481, which corresponds with U.S. Appl. No. 14/536,291, 1 page. |
Office Action, dated Sep. 29, 2016, received in Australian Patent Application No. 2016101481, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Office Action, dated Oct. 23, 2017, received in Chinese Patent Application No. 201380035986.X, which corresponds with U.S. Appl. No. 14/536,291, 9 pages. |
Notice of Allowance, dated Jun. 24, 2020, received in Chinese Patent Application No. 201710781246.0, which corresponds with U.S. Appl. No. 14/536,291, 5 pages. |
Patent, dated Jul. 31, 2020, received in Chinese Patent Application No. 201710781246.0, which corresponds with U.S. Appl. No. 14/536,291, 6 pages. |
Office Action, dated Jul. 17, 2020, received in Chinese Patent Application No. 2018100116175.X, which corresponds with U.S. Appl. No. 14/536,291, 15 pages. |
Office Action, dated Nov. 17, 2020, received in Chinese Patent Application No. 2018100116175.X, which corresponds with U.S. Appl. No. 14/536,291, 16 pages. |
Notice of Allowance, dated Mar. 29, 2021, received in Chinese Patent Application No. 2018100116175.X, which corresponds with U.S. Appl. No. 14/536,291, 1 page. |
Office Action, dated Jan. 7, 2016, received in European Patent Application No. 13724107.1, which corresponds with U.S. Appl. No. 14/536,291, 11 pages. |
Office Action, dated Aug. 22, 2016, received in European Patent Application No. 13724107.1, which corresponds with U.S. Appl. No. 14/536,291, 7 pages. |
Office Action, dated Mar. 23, 2017, received in European Patent Application No. 13724107.1, which corresponds with U.S. Appl. No. 14/536,291, 8 pages. |
Intention to Grant, dated Jan. 8, 2019, received in European Patent Application No. 17186744.3, which corresponds with U.S. Appl. No. 14/536,291, 7 pages. |
Decision to Grant, dated Oct. 31, 2019, received in European Patent Application No. 17186744.3, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Patent, dated Nov. 27, 2019, received in European Patent Application No. 17186744.3, which corresponds with U.S. Appl. No. 14/536,291, 4 pages. |
Office Action, dated Mar. 8, 2016, received in Japanese Patent Application No. 2015-511655, which corresponds with U.S. Appl. No. 14/536,291, 4 pages. |
Final Office Action, dated Dec. 22, 2016, received in Japanese Patent Application No. 2015-511655, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Office Action, dated Jun. 29, 2018, received in Japanese Patent Application No. 2017-083027, which corresponds with U.S. Appl. No. 14/536,291, 5 pages. |
Patent, dated Feb. 22, 2019, received in Japanese Patent Application No. 2017-083027, which corresponds with U.S. Appl. No. 14/536,291, 3 pages. |
Notice of Allowance, dated Jan. 15, 2019, received in Japanese Patent Application No. 2017-083027, which corresponds with U.S. Appl. No. 14/536,291, 5 pages. |
Office Action, dated Oct. 19, 2017, received in U.S. Appl. No. 14/608,985, 13 pages. |
Notice of Allowance, dated Apr. 20, 2018, received in U.S. Appl. No. 14/608,985, 5 pages. |
Office Action, dated Jan. 15, 2016, received in Australian Patent Application No. 2013368445, which corresponds with U.S. Appl. No. 14/608,985, 3 pages. |
Notice of Allowance, dated Jan. 18, 2017, received in Australian Patent Application No. 2013368445, which corresponds with U.S. Appl. No. 14/608,985, 3 pages. |
Patent, dated May 18, 2017, received in Australian Patent Application No. 2013368445, which corresponds with U.S. Appl. No. 14/608,985, 1 page. |
Office Action, dated May 19, 2017, received in Chinese Patent Application No. 201380068399.0, which corresponds with U.S. Appl. No. 14/608,985, 5 pages. |
Notice of Allowance, dated Sep. 19, 2017, received in Chinese Patent Application No. 201380068399.0, which corresponds with U.S. Appl. No. 14/608,985, 3 pages. |
Patent, dated Dec. 8, 2017, received in Chinese Patent Application No. 201380068399.0, which corresponds with U.S. Appl. No. 14/608,985, 4 pages. |
Office Action, dated Jul. 25, 2016, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 8 pages. |
Office Action, dated Feb. 27, 2017, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 6 pages. |
Summons, dated Oct. 6, 2017, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 6 pages. |
Intention to Grant, dated Jan. 16, 2019, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 9 pages. |
Decision to Grant, dated Aug. 1, 2019, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 2 pages. |
Certificate of Grant, dated Aug. 28, 2019, received in European Patent Application No. 13811032.5, which corresponds with U.S. Appl. No. 14/608,985, 4 pages. |
Certificate of Grant, dated Jun. 29, 2018, received in Hong Kong Patent Application No. 15112851.6, which corresponds with U.S. Appl. No. 14/608,985, 2 pages. |
Office Action, dated Apr. 25, 2016, received in Japanese Patent Application No. 2015-550384, which corresponds with U.S. Appl. No. 14/608,985, 4 pages. |
Notice of Allowance, dated Jan. 24, 2017, received in Japanese Patent Application No. 2015-550384, which corresponds with U.S. Appl. No. 14/608,985, 5 pages. |
Patent, dated Feb. 24, 2017, received in Japanese Patent Application No. 2015-550384, which corresponds with U.S. Appl. No. 14/608,985, 2 pages. |
Office Action, dated Nov. 4, 2016, received in Korean Patent Application No. 2015-7019984, which corresponds with U.S. Appl. No. 14/608,985, 8 pages. |
Notice of Allowance, dated Sep. 19, 2017, received in Korean Patent Application No. 2015-7019984, which corresponds with U.S. Appl. No. 14/608,985, 4 pages. |
Patent, dated Dec. 19, 2017, received in Korean Patent Application No. 2015-7019984, which corresponds with U.S. Appl. No. 14/608,985, 3 pages. |
Office Action, dated Mar. 24, 2017, received in U.S. Appl. No. 14/609,006, 13 pages. |
Final Office Action, dated Sep. 21, 2017, received in U.S. Appl. No. 14/609,006, 17 pages. |
Office Action, dated Mar. 20, 2018, received in U.S. Appl. No. 14/609,006, 13 pages. |
Office Action, dated Oct. 11, 2018, received in U.S. Appl. No. 14/609,006, 12 pages. |
Final Office Action, dated May 23, 2019, received in U.S. Appl. No. 14/609,006, 14 pages. |
Office Action, dated Jan. 7, 2020, received in U.S. Appl. No. 14/609,006, 17 pages. |
Final Office Action, dated Jun. 15, 2020, received in U.S. Appl. No. 14/609,006, 19 pages. |
Office Action, dated Apr. 19, 2017, received in U.S. Appl. No. 14/536,296, 12 pages. |
Final Office Action, dated Nov. 2, 2017, received in U.S. Appl. No. 14/536,296, 13 pages. |
Notice of Allowance, dated Mar. 14, 2018, received in U.S. Appl. No. 14/536,296, 8 pages. |
Office Action, dated Nov. 1, 2017, received in U.S. Appl. No. 14/536,648, 22 pages. |
Final Office Action, dated Aug. 7, 2018, received in U.S. Appl. No. 14/536,648, 14 pages. |
Office Action, dated Jan. 2, 2019, received in U.S. Appl. No. 14/536,648 12 pages. |
Notice of Allowance, dated Jul. 2, 2019, received in U.S. Appl. No. 14/536,648, 5 pages. |
Office Action, dated Jul. 21, 2017, received in Australian Patent Application No. 2016247194, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Notice of Acceptance, dated Jul. 19, 2018, received in Australian Patent Application No. 2016247194, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Office Action, dated Jul. 24, 2020, received in Chinese Patent Application No. 201711422121.5, which corresponds with U.S. Appl. No. 14/536,648, 10 pages. |
Notice of Allowance, dated Feb. 2, 2021, received in Chinese Patent Application No. 201711422121.5, which corresponds with U.S. Appl. No. 14/536,648, 1 page. |
Patent, dated Mar. 9, 2021, received in Chinese Patent Application No. 201711422121.5, which corresponds with U.S. Appl. No. 14/536,648, 7 pages. |
Intention to Grant, dated Apr. 1, 2019, received in European Patent Application No. 17153418.3, which corresponds with U.S. Appl. No. 14/536,648, 7 pages. |
Decision to Grant, dated Aug. 16, 2019, received in European Patent Application No. 17153418.3, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Grant Certificate, dated Sep. 11, 2019, received in European Patent Application No. 17153418.3, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Office Action, dated Apr. 27, 2018, received in Japanese Patent Application No. 2017-008764, which corresponds with U.S. Appl. No. 14/536,648, 5 pages. |
Notice of Allowance, dated Feb. 4, 2019, received in Japanese Patent Application No. 2017-008764, which corresponds with U.S. Appl. No. 14/536,648, 5 pages. |
Patent, dated Mar. 1, 2019, received in Japanese Patent Application No. 2017-008764, which corresponds with U.S. Appl. No. 14/536,648, 3 pages. |
Office Action, dated Jan. 19, 2017, received in U.S. Appl. No. 14/609,042, 12 pages. |
Notice of Allowance, dated Jul. 10, 2017, received in U.S. Appl. No. 14/609,042, 8 pages. |
Office Action, dated Aug. 24, 2018, received in Japanese Patent Application No. 2017-113598, which corresponds with U.S. Appl. No. 14/609,042, 6 pages. |
Notice of Allowance, dated Apr. 9, 2019, received in Japanese Patent Application No. 2017-113598, which corresponds with U.S. Appl. No. 14/609,042, 5 pages. |
Patent, dated Apr. 19, 2019, received in Japanese Patent Application No. 2017-113598, which corresponds with U.S. Appl. No. 14/609,042, 2 pages. |
Notice of Allowance, dated Dec. 17, 2018, received in Korean Patent Application No. 2017-7008614, which corresponds with U.S. Appl. No. 14/609,042, 5 pages. |
Patent, dated Mar. 8, 2019, received in Korean Patent Application No. 2017-7008614, which corresponds with U.S. Appl. No. 14/609,042, 4 pages. |
Office Action, dated Mar. 31, 2016, received in U.S. Appl. No. 14/864,737, 17 pages. |
Notice of Allowance, dated Feb. 27, 2017, received in U.S. Appl. No. 14/864,737, 9 pages. |
Notice of Allowance, dated Jun. 19, 2017, received in U.S. Appl. No. 14/864,737, 8 pages. |
Office Action, dated Apr. 16, 2018, received in Australian Patent Application No. 2016233792, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Notice of Acceptance, dated Mar. 12, 2019, received in Australian Patent Application No. 2016233792, which corresponds with U.S. Appl. No. 14/864,737, 5 pages. |
Certificate of Grant, dated Jul. 4, 2019, received in Australian Patent Application No. 2016233792, which corresponds with U.S. Appl. No. 14/864,737, 1 page. |
Office Action, dated Sep. 11, 2018, received in Chinese Patent Application No. 201610159295.6, which corresponds with U.S. Appl. No. 14/864,737, 6 pages. |
Notice of Allowance, dated Apr. 17, 2019, received in Chinese Patent Application No. 201610159295.6, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Patent, dated May 31, 2019, received in Chinese Patent Application No. 201610159295.6, which corresponds with U.S. Appl. No. 14/864,737, 7 pages. |
Notice of Allowance, dated Jul. 1, 2016, received in Chinese Patent Application No. 201620214376.7, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Patent, dated Aug. 3, 2016, received in Chinese Patent Application No. 201620214376.7, which corresponds with U.S. Appl. No. 14/864,737, 5 pages. |
Certificate of Registration, dated Jun. 20, 2016, received in German Patent Application No. 202016001845.1, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Office Action, dated Apr. 5, 2016, received in Danish Patent Application No. 201500577, which corresponds with U.S. Appl. No. 14/864,737, 7 pages. |
Intention to Grant, dated Aug. 2, 2016, received in Danish Patent Application No. 201500577, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Decision to Grant, dated Mar. 29, 2018, received in European Patent Application No. 16710871.1, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Grant Certificate, dated Apr. 25, 2018, received in European Patent Application No. 16710871.1, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Office Action, dated May 15, 2017, received in Japanese Patent Application No. 2016-558331, which corresponds with U.S. Appl. No. 14/864,737, 5 pages. |
Notice of Allowance, dated Jun. 23, 2017, received in Japanese Patent Application No. 2016-558331, which corresponds with U.S. Appl. No. 14/864,737, 5 pages. |
Patent, dated Jul. 28, 2017, received in Japanese Patent Application No. 2016-558331, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Office Action, dated Feb. 14, 2018, received in Korean Patent Application No. 2017-7030129, which corresponds with U.S. Appl. No. 14/864,737, 17 pages. |
Patent, dated Dec. 26, 2018, received in Korean Patent Application No. 2017-7030129, which corresponds with U.S. Appl. No. 14/864,737, 4 pages. |
Patent, dated Jul. 12, 2017, received in Dutch Patent Application No. 2016452, which corresponds with U.S. Appl. No. 14/864,737, 2 pages. |
Office Action, dated Jun. 27, 2016, received in U.S. Appl. No. 14/866,981, 22 pages. |
Notice of Allowance, dated Oct. 24, 2016, received in U.S. Appl. No. 14/866,981, 7 pages. |
Notice of Allowance, dated Feb. 10, 2017, received in U.S. Appl. No. 14/866,981, 5 pages. |
Office Action, dated May 10, 2016, received in Australian Patent Application No. 2016100254, which corresponds with U.S. Appl. No. 14/866,981, 6 pages. |
Patent, dated Nov. 2, 2016, received in Australian Patent Application No. 2016100254, which corresponds with U.S. Appl. No. 14/866,981, 1 page. |
Office Action, dated Nov. 5, 2018, received in Chinese Patent Application No. 201610131415.1, which corresponds with U.S. Appl. No. 14/866,981, 6 pages. |
Office Action, dated Jul. 16, 2019, received in Chinese Patent Application No. 201610131415.1, which corresponds with U.S. Appl. No. 14/866,981, 4 pages. |
Office Action, dated Mar. 16, 2020, received in Chinese Patent Application No. 201610131415.1, which corresponds with U.S. Appl. No. 14/866,981, 3 pages. |
Notice of Allowance, dated Dec. 4, 2020, received in Chinese Patent Application No. 201610131415.1, which corresponds with U.S. Appl. No. 14/866,981, 3 pages. |
Patent, dated Jan. 22, 2021, received in Chinese Patent Application No. 201610131415.1, which corresponds with U.S. Appl. No. 14/866,981, 6 pages. |
Notice of Allowance, dated Jul. 27, 2016, received in Chinese Patent Application No. 201620176169.7, which corresponds with U.S. Appl. No. 14/866,981, 3 pages. |
Patent, dated Sep. 28, 2016, received in Chinese Patent Application No. 201620176169.7, which corresponds with U.S. Appl. No. 14/866,981, 4 pages. |
Certificate of Registration, dated Jun. 20, 2016, received in German Patent Application No. 202016001514.2, which corresponds with U.S. Appl. No. 14/864,737, 3 pages. |
Office Action, dated Mar. 18, 2016, received in Danish Patent Application No. 201500575, which corresponds with U.S. Appl. No. 14/866,981, 9 pages. |
Office Action, dated Dec. 5, 2016, received in Danish Patent Application No. 201500575, which corresponds with U.S. Appl. No. 14/866,981, 3 pages. |
Office Action, dated Jul. 7, 2017, received in Danish Patent Application No. 201500575, which corresponds with U.S. Appl. No. 14/866,981, 4 pages. |
Patent, dated Nov. 16, 2017, received in Dutch Patent Application No. 2016375, which corresponds with U.S. Appl. No. 14/866,981, 2 pages. |
Office Action, dated Dec. 15, 2017, received in U.S. Appl. No. 14/866,159, 35 pages. |
Notice of Allowance, dated May 18, 2018, received in U.S. Appl. No. 14/866,159, 8 pages. |
Office Action, dated May 19, 2016, received in Australian Patent Application No. 2016100251, which corresponds with U.S. Appl. No. 14/866,159, 5 pages. |
Office Action, dated Jun. 5, 2018, received in Chinese Patent Application No. 201610137839.9, which corresponds with U.S. Appl. No. 14/866,159, 11 pages. |
Notice of Allowance, dated Dec. 6, 2018, received in Chinese Patent Application No. 201610137839.9, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Patent, dated Feb. 19, 2019, received in Chinese Patent Application No. 201610137839.9, which corresponds with U.S. Appl. No. 14/866,159, 6 pages. |
Office Action, dated Jul. 5, 2016, received in Chinese Patent Application No. 201620186008.6, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Certificate of Registration, dated Jun. 16, 2016, received in German Patent Application No. 202016001483.9, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Office Action, dated Mar. 9, 2016, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 11 pages. |
Office Action, dated Sep. 27, 2016, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 4 pages. |
Office Action, dated Mar. 14, 2017, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 5 pages. |
Office Action, dated Jul. 6, 2017, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Office Action, dated Jan. 10, 2018, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 2 pages. |
Notice of Allowance, dated Mar. 21, 2018, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 2 pages. |
Patent, dated May 22, 2018, received in Danish Patent Application No. 201500574, which corresponds with U.S. Appl. No. 14/866,159, 2 pages. |
Intention to Grant, dated Oct. 28, 2019, received in European Patent Application No. 16707356.8, which corresponds with U.S. Appl. No. 14/866,159, 7 pages. |
Decision to Grant, dated Mar. 5, 2020, received in European Patent Application No. 16707356.8, which corresponds with U.S. Appl. No. 14/866,159, 2 pages. |
Patent, dated Apr. 1, 2020, received in European Patent Application No. 16707356.8, which corresponds with U.S. Appl. No. 14/866,159, 3 pages. |
Patent, dated Sep. 7, 2017, received in Dutch Patent Application No. 2016377, which corresponds with U.S. Appl. No. 14/866,159, 4 pages. |
Office Action, dated Oct. 6, 2017, received in U.S. Appl. No. 14/868,078, 40 pages. |
Notice of Allowance, dated May 24, 2018, received in U.S. Appl. No. 14/868,078, 6 pages. |
Innovation Patent, dated Aug. 4, 2016, received in Australian Patent Application No. 2016101201, which corresponds with U.S. Appl. No. 14/868,078, 1 page. |
Office Action, dated Oct. 12, 2016, received in Australian Patent Application No. 2016101201, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Notice of Allowance, dated Sep. 1, 2017, received in Australian Patent Application No. 2016229421, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Certificate of Grant, dated Jan. 3, 2018, received in Australian Patent Application No. 2016229421, which corresponds with U.S. Appl. No. 14/868,078, 1 page. |
Office Action, dated Feb. 7, 2019, received in Australian Patent Application No. 2017258967, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Notice of Acceptance, dated Jun. 21, 2019, received in Australian Patent Application No. 2017258967, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Certificate of Grant, dated Oct. 17, 2019, received in Australian Patent Application No. 2017258967, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Office Action, dated Aug. 20, 2018, received in Chinese Patent Application No. 201610130348.1, which corresponds with U.S. Appl. No. 14/868,078, 6 pages. |
Office Action, dated Feb. 26, 2019, received in Chinese Patent Application No. 201610130348.1, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Notice of Allowance, dated May 6, 2019, received in Chinese Patent Application No. 201610130348.1, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Patent, dated Jul. 5, 2019, received in Chinese Patent Application No. 201610130348.1, which corresponds with U.S. Appl. No. 14/868,078, 6 pages. |
Notice of Allowance, dated Oct. 1, 2016, received in Chinese Patent Application No. 201620175847.8, which corresponds with U.S. Appl. No. 14/868,078, 1 page. |
Office Action, dated Nov. 21, 2019, received in Chinese Patent Application No. 201680011338.4, which corresponds with U.S. Appl. No. 14/868,078, 8 pages. |
Office Action, dated May 19, 2020, received in Chinese Patent Application No. 201680011338.4, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Office Action, dated Jun. 30, 2020, received in Chinese Patent Application No. 201680011338.4, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Patent, dated Dec. 11, 2020, received in Chinese Patent Application No. 201680011338.4, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Certificate of Registration, dated Jun. 30, 2016, received in German Patent Application No. 20201600156.9, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Office Action, dated Mar. 30, 2016, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 9 pages. |
Office Action, dated Sep. 2, 2016, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 4 pages. |
Notice of Allowance, dated Jan. 30, 2017, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Notice of Allowance, dated May 2, 2017, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Patent, dated Sep. 11, 2017, received in Danish Patent Application No. 201500588, which corresponds with U.S. Appl. No. 14/868,078, 5 pages. |
Office Action, dated Apr. 25, 2018, received in European Patent Application No. 16708916.8, which corresponds with U.S. Appl. No. 14/868,078, 6 pages. |
Intention to Grant, dated May 10, 2019, received in European Patent Application No. 16708916.8, which corresponds with U.S. Appl. No. 14/868,078, 5 pages. |
Decision to Grant, dated Sep. 12, 2019, received in European Patent Application No. 16708916.8, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Patent, dated Oct. 9, 2019, received in European Patent Application No. 16708916.8, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Office Action, dated Oct. 25, 2018, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 6 pages. |
Intention to Grant, dated May 22, 2019, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 7 pages. |
Decision to Grant, dated Sep. 19, 2019, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Patent, dated Oct. 16, 2019, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 3 pages. |
Patent, dated Jul. 12, 2017, received in Dutch Patent Application No. 2016376, which corresponds with U.S. Appl. No. 14/868,078, 2 pages. |
Office Action, dated May 9, 2016, received in U.S. Appl. No. 14/863,432, 26 pages. |
Notice of Allowance, dated Nov. 14, 2016, received in U.S. Appl. No. 14/863,432, 7 pages. |
Notice of Allowance, dated Apr. 27, 2017, received in U.S. Appl. No. 14/863,432, 7 pages. |
Notice of Allowance, dated Sep. 18, 2017, received in U.S. Appl. No. 14/863,432, 8 pages. |
Office Action, dated Aug. 19, 2016, received in Australian Patent Application No. 2016100647, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Office Action, dated Dec. 4, 2018, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Office Action, dated Jun. 17, 2019, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 4 pages. |
Office Action, dated Nov. 5, 2019, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 4 pages. |
Notice of Allowance, dated Mar. 20, 2020, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 6 pages. |
Patent, dated May 12, 2020, received in Chinese Patent Application No. 201610342313.4, which corresponds with U.S. Appl. No. 14/863,432, 7 pages. |
Notice of Allowance, dated Jan. 12, 2017, received in Chinese Patent Application No. 201620470063.8, which corresponds with U.S. Appl. No. 14/863,432, 1 page. |
Patent, dated Feb. 8, 2017, received in Chinese Patent Application No. 201620470063.8, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Office Action, dated Apr. 4, 2016, received in Danish Patent Application No. 201500582, which corresponds with U.S. Appl. No. 14/863,432, 10 pages. |
Office Action, dated Oct. 7, 2016, received in Danish Patent Application No. 201500582, which corresponds with U.S. Appl. No. 14/863,432, 6 pages. |
Office Action, dated Jun. 12, 2017, received in Danish Patent Application No. 201500582, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Office Action, dated Jan. 10, 2020, received in Japanese Patent Application No. 2018-243773, which corresponds with U.S. Appl. No. 14/863,432, 6 pages. |
Office Action, dated Jul. 17, 2020, received in Japanese Patent Application No. 2018-243773, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Notice of Allowance, dated Dec. 4, 2020, received in Japanese Patent Application No. 2018-243773, which corresponds with U.S. Appl. No. 14/863,432, 5 pages. |
Patent, dated Jan. 5, 2021, received in Japanese Patent Application No. 2018-243773, which corresponds with U.S. Appl. No. 14/863,432, 4 pages. |
Notice of Allowance, dated Jul. 13, 2020, received in Korean Patent Application No. 2020-7015964, which corresponds with U.S. Appl. No. 14/863,432, 6 pages. |
Patent, dated Oct. 12, 2020, received in Korean Patent Application No. 2020-7015964, which corresponds with U.S. Appl. No. 14/863,432, 8 pages. |
Grant, dated Jul. 21, 2017, received in Dutch Patent Application No. 2016801, which corresponds with U.S. Appl. No. 14/871,227, 8 pages. |
Office Action, dated Oct. 13, 2016, received in U.S. Appl. No. 14/866,511, 27 pages. |
Final Office Action, dated Jan. 27, 2017, received in U.S. Appl. No. 14/866,511, 26 pages. |
Notice of Allowance, dated Oct. 4, 2017, received in U.S. Appl. No. 14/866,511, 37 pages. |
Office Action, dated Aug. 19, 2016, received in U.S. Appl. No. 14/291,880, 19 pages. |
Notice of Allowance, dated Jan. 10, 2017, received in U.S. Appl. No. 14/291,880, 8 pages. |
Patent, dated Aug. 8, 2016, received in Australian Patent Application No. 2016100653, which corresponds with U.S. Appl. No. 14/866,511, 1 page. |
Office Action, dated Dec. 5, 2018, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 4 pages. |
Office Action, dated Jul. 11, 2019, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 4 pages. |
Office Action, dated Sep. 17, 2019, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Notice of Allowance, dated Nov. 28, 2019, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Patent, dated Feb. 7, 2020, received in Chinese Patent Application No. 201610342264.4, which corresponds with U.S. Appl. No. 14/866,511, 7 pages. |
Notice of Allowance, dated Jan. 12, 2017, received in Chinese Patent Application No. 201620470281.1, which corresponds with U.S. Appl. No. 14/866,511, 1 page. |
Office Action, dated Mar. 22, 2016, received in Danish Patent Application No. 201500576, which corresponds with U.S. Appl. No. 14/866,511, 10 pages. |
Intention to Grant, dated Jun. 8, 2016, received in Danish Patent Application No. 201500576, which corresponds with U.S. Appl. No. 14/866,511, 2 pages. |
Grant, dated Aug. 26, 2016, received in Danish Patent Application No. 201500576, which corresponds with U.S. Appl. No. 14/866,511, 2 pages. |
Patent, dated Jan. 23, 2017, received in Danish Patent Application No. 201500576, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Office Action, dated Nov. 24, 2017, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Office Action, dated May 24, 2018, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 7 pages. |
Office Action, dated Jan. 2, 2019, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Intention to Grant, dated Jul. 5, 2019, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Decision to Grant, dated Dec. 5, 2019, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 2 pages. |
Patent, dated Jan. 1, 2020, received in European Patent Application No. 16727900.9, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Office Action, dated Jun. 9, 2017, received in Japanese Patent Application No. 2016-558214, which corresponds with U.S. Appl. No. 14/866,511, 6 pages. |
Notice of Allowance, dated Jul. 14, 2017, received in Japanese Patent Application No. 2016-558214, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Patent, dated Aug. 18, 2017, received in Japanese Patent Application No. 2016-558214, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Office Action, dated Apr. 24, 2020, received in Korean Patent Application No. 2020-7003065, which corresponds with U.S. Appl. No. 14/866,511, 3 pages. |
Notice of Allowance, dated Jul. 29, 2020, received in Korean Patent Application No. 2020-7003065, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Patent, dated Oct. 29, 2020, received in Korean Patent Application No. 2020-7003065, which corresponds with U.S. Appl. No. 14/866,511, 5 pages. |
Office Action, dated May 10, 2016, received in U.S. Appl. No. 14/866,489, 15 pages. |
Final Office Action, dated Sep. 16, 2016, received in U.S. Appl. No. 14/866,489, 24 pages. |
Notice of Allowance, dated Apr. 27, 2017, received in U.S. Appl. No. 14/866,489, 27 pages. |
Notice of Allowance, dated Jul. 6, 2017, received in U.S. Appl. No. 14/866,489, 12 pages. |
Office Action, dated Mar. 28, 2016, received in U.S. Appl. No. 14/869,899, 17 pages. |
Office Action, dated Jun. 28, 2016, received in U.S. Appl. No. 14/869,899, 5 pages. |
Final Office Action, dated Sep. 2, 2016, received in U.S. Appl. No. 14/869,899, 22 pages. |
Notice of Allowance, dated Feb. 28, 2017, received in U.S. Appl. No. 14/869,899, 9 pages. |
Innovation Patent, dated Aug. 25, 2016, received in Australian Patent Application No. 2016101438, which corresponds with U.S. Appl. No. 14/869,899, 1 page. |
Certificate of Examination, dated Oct. 11, 2016, received in Australian Patent Application No. 2016101438, which corresponds with U.S. Appl. No. 14/869,899, 1 page. |
Notice of Acceptance, dated Aug. 23, 2018, received in Australian Patent Application No. 2018204611, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Office Action, dated Nov. 6, 2020, received in Chinese Patent Application No. 201610871595.7, which corresponds with U.S. Appl. No. 14/869,899, 15 pages. |
Office Action, dated Feb. 3, 2016, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 9 pages. |
Office Action, dated Oct. 7, 2016, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 6 pages. |
Office Action, dated Jul. 3, 2017, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 5 pages. |
Office Action, dated Jan. 29, 2018, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Notice of Allowance, dated Apr. 24, 2018, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Patent, dated May 28, 2018, received in Danish Patent Application No. 201500592, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Office Action, dated Nov. 22, 2016, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 9 pages. |
Office Action, dated Dec. 14, 2017, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Office Action, dated May 1, 2018, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Office Action, dated Oct. 9, 2018, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 2 pages. |
Patent, dated Feb. 26, 2019, received in Danish Patent Application No. 201670594, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Office Action, dated May 8, 2019, received in European Patent Application No. 18168939.9, which corresponds with U.S. Appl. No. 14/869,899, 10 pages. |
Intention to Grant, dated Oct. 25, 2019, received in European Patent Application No. 18168939.9, which corresponds with U.S. Appl. No. 14/869,899, 8 pages. |
Decision to Grant, dated Mar. 26, 2020, received in European Patent Application No. 18168939.9, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Patent, dated Apr. 22, 2020, received in European Patent Application No. 18168939.9, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Office Action, dated May 23, 2019, received in European Patent Application No. 18175195.9, which corresponds with U.S. Appl. No. 14/869,899, 10 pages. |
Oral Summons, dated Dec. 6, 2019, received in European Patent Application No. 18175195.9, which corresponds with U.S. Appl. No. 14/869,899, 9 pages. |
Office Action, dated Sep. 21, 2018, received in Japanese Patent Application No. 2018-100827, which corresponds with U.S. Appl. No. 14/869,899, 4 pages. |
Notice of Allowance, dated Mar. 1, 2019, received in Japanese Patent Application No. 2018-100827, which corresponds with U.S. Appl. No. 14/869,899, 5 pages. |
Patent, dated Apr. 5, 2019, received in Japanese Patent Application No. 2018-100827, which corresponds with U.S. Appl. No. 14/869,899, 5 pages. |
Office Action, dated Oct. 5, 2018, received in Korean Patent Application No. 2018-7017213, which corresponds with U.S. Appl. No. 14/869,899, 3 pages. |
Office Action, dated Mar. 22, 2019, received in Korean Patent Application No. 2018-7017213, which corresponds with U.S. Appl. No. 14/869,899, 6 pages. |
Patent, dated May 10, 2019, received in Korean Patent Application No. 2018-7017213, which corresponds with U.S. Appl. No. 14/869,899, 8 pages. |
Office Action, dated Mar. 4, 2016, received in U.S. Appl. No. 14/866,992, 30 pages. |
Final Office Action, dated Jul. 29, 2016, received in U.S. Appl. No. 14/866,992, 35 pages. |
Office Action, dated Apr. 13, 2017, received in U.S. Appl. No. 14/866,992, 34 pages. |
Final Office Action, dated Oct. 3, 2017, received in U.S. Appl. No. 14/866,992, 37 pages. |
Office Action, dated Jan. 29, 2018, received in U.S. Appl. No. 14/866,992, 44 pages. |
Final Office Action, dated Aug. 28, 2018, received in U.S. Appl. No. 14/866,992, 52 pages. |
Examiner's Answer, dated May 9, 2019, received in U.S. Appl. No. 14/866,992, 26 pages. |
Innovation Patent, dated Sep. 22, 2016, received in Australian Patent Application No. 2016101418, which corresponds with U.S. Appl. No. 14/866,992, 1 page. |
Office Action, dated Nov. 22, 2016, received in Australian Patent Application No. 2016101418, which corresponds with U.S. Appl. No. 14/866,992, 7 pages. |
Office Action, dated Feb. 7, 2017, received in Australian Patent Application No. 2016101418, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Office Action, dated Mar. 26, 2018, received in Australian Patent Application No. 2016304890, which corresponds with U.S. Appl. No. 14/866,992, 3 pages. |
Notice of Acceptance, dated Mar. 12, 2019, received in Australian Patent Application No. 2016304890, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Certificate of Grant, dated Jul. 4, 2019, received in Australian Patent Application No. 2016304890, which corresponds with U.S. Appl. No. 14/866,992, 1 page. |
Office Action, dated Jan. 19, 2018, received in Australian Patent Application No. 201761478, which corresponds with U.S. Appl. No. 14/866,992, 6 pages. |
Certificate of Grant, dated May 9, 2019, received in Australian Patent Application No. 201761478, which corresponds with U.S. Appl. No. 14/866,992, 3 pages. |
Office Action, dated Sep. 12, 2019, received in Chinese Patent Application No. 201610658351.8, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Office Action, dated Jan. 13, 2020, received in Chinese Patent Application No. 201610658351.8, which corresponds with U.S. Appl. No. 14/866,992, 3 pages. |
Office Action, dated Jun. 30, 2020, received in Chinese Patent Application No. 201610658351.8, which corresponds with U.S. Appl. No. 14/866,992, 11 pages. |
Office Action, dated Nov. 25, 2020, received in Chinese Patent Application No. 201610658351.8, which corresponds with U.S. Appl. No. 14/866,992, 9 pages. |
Office Action, dated Jul. 24, 2020, received in Chinese Patent Application No. 201680041559.6, which corresponds with U.S. Appl. No. 14/866,992, 13 pages. |
Office Action, dated Mar. 18, 2016, received in Danish Patent Application No. 201500593, which corresponds with U.S. Appl. No. 14/866,992, 10 pages. |
Office Action, dated Jun. 27, 2016, received in Danish Patent Application No. 201500593, which corresponds with U.S. Appl. No. 14/866,992, 7 pages. |
Office Action, dated Feb. 6, 2017, received in Danish Patent Application No. 201500593, which corresponds with U.S. Appl. No. 14/866,992, 4 pages. |
Office Action, dated Sep. 5, 2017, received in Danish Patent Application No. 201500593, which corresponds with U.S. Appl. No. 14/866,992, 6 pages. |
Office Action, dated Oct. 12, 2018, received in European Patent Application No. 16758008.3, which corresponds with U.S. Appl. No. 14/866,992, 11 pages. |
Summons, dated May 8, 2019, received in European Patent Application No. 16758008.3, which corresponds with U.S. Appl. No. 14/866,992, 14 pages. |
Office Action, dated Jan. 11, 2019, received in Japanese Patent Application No. 2018-506425, which corresponds with U.S. Appl. No. 14/866,992, 6 pages. |
Notice of Allowance, dated Jun. 18, 2019, received in Japanese Patent Application No. 2018-506425, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Patent, dated Jul. 26, 2019, received in Japanese Patent Application No. 2018-506425, which corresponds with U.S. Appl. No. 14/866,992, 3 pages. |
Notice of Allowance, dated Sep. 10, 2019, received in Korean Patent Application No. 2018-7003890, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Patent, dated Oct. 11, 2019, received in Korean Patent Application No. 2018-7003890, which corresponds with U.S. Appl. No. 14/866,992, 5 pages. |
Office Action, dated Feb. 12, 2018, received in U.S. Appl. No. 15/009,661, 36 pages. |
Final Office Action, dated Sep. 19, 2018, received in U.S. Appl. No. 15/009,661, 28 pages. |
Office Action, dated Jun. 28, 2019, received in U.S. Appl. No. 15/009,661, 33 pages. |
Final Office Action, dated Dec. 30, 2019, received in U.S. Appl. No. 15/009,661, 33 pages. |
Office Action, dated Sep. 16, 2020, received in U.S. Appl. No. 15/009,661, 37 pages. |
Final Office Action, dated Feb. 26, 2021, received in U.S. Appl. No. 15/009,661, 46 pages. |
Office Action, dated Jan. 18, 2018, received in U.S. Appl. No. 15/009,676, 21 Pages. |
Notice of Allowance, dated Aug. 3, 2018, received in U.S. Appl. No. 15/009,676, 6 pages. |
Notice of Allowance, dated Nov. 15, 2018, received in U.S. Appl. No. 15/009,676, 6 pages. |
Office Action, dated Jul. 15, 2020, received in Chinese Patent Application No. 201680047125.7, which corresponds with U.S. Appl. No. 15/009,676, 11 pages. |
Office Action, dated Nov. 30, 2020, received in Chinese Patent Application No. 201680047125.7, which corresponds with U.S. Appl. No. 15/009,676, 11 pages. |
Notice of Allowance, dated Feb. 24, 2021, received in Chinese Patent Application No. 201680047125.7, which corresponds with U.S. Appl. No. 15/009,676, 1 page. |
Intention to Grant, dated Apr. 7, 2020, received in European Patent Application No. 16756866.6, which corresponds with U.S. Appl. No. 15/009,676, 8 pages. |
Decision to Grant, dated Aug. 27, 2020, received in European Patent Application No. 16756866.6, which corresponds with U.S. Appl. No. 15/009,676, 4 pages. |
Patent, dated Sep. 23, 2020, received in European Patent Application No. 16756866.6, which corresponds with U.S. Appl. No. 15/009,676, 4 pages. |
Office Action, dated Mar. 13, 2018, received in U.S. Appl. No. 15/009,688, 10 pages. |
Notice of Allowance, dated Nov. 6, 2018, received in U.S. Appl. No. 15/009,688, 10 pages. |
Office Action, dated Jun. 29, 2020, received in Chinese Patent Application No. 201680047164.7, which corresponds with U.S. Appl. No. 15/009,688, 7 pages. |
Notice of Allowance, dated Oct. 9, 2020, received in Chinese Patent Application No. 201680047164.7, which corresponds with U.S. Appl. No. 15/009,688, 5 pages. |
Patent, dated Nov. 10, 2020, received in Chinese Patent Application No. 201680047164.7, which corresponds with U.S. Appl. No. 15/009,688, 6 pages. |
Intention to Grant, dated Mar. 16, 2020, received in European Patent Application No. 16753796.8, which corresponds with U.S. Appl. No. 15/009,688, 6 pages. |
Decision to Grant, dated Sep. 24, 2020, received in European Patent Application No. 16753796.8, which corresponds with U.S. Appl. No. 15/009,688, 4 pages. |
Certificate of Grant, dated Oct. 21, 2020, received in European Patent Application No. 16753796.8, which corresponds with U.S. Appl. No. 15/009,688, 4 pages. |
Office Action, dated Nov. 30, 2015, received in U.S. Appl. No. 14/845,217, 24 pages. |
Final Office Action, dated Apr. 22, 2016, received in U.S. Appl. No. 14/845,217, 36 pages. |
Notice of Allowance, dated Aug. 26, 2016, received in U.S. Appl. No. 14/845,217, 5 pages. |
Notice of Allowance, dated Jan. 4, 2017, received in U.S. Appl. No. 14/845,217, 5 pages. |
Office Action, dated Feb. 3, 2016, received in U.S. Appl. No. 14/856,517, 36 pages. |
Final Office Action, dated Jul. 13, 2016, received in U.S. Appl. No. 14/856,517, 30 pages. |
Office Action, dated May 2, 2017, received in U.S. Appl. No. 14/856,517, 34 pages. |
Final Office Action, dated Oct. 4, 2017, received in U.S. Appl. No. 14/856,517, 33 pages. |
Notice of Allowance, dated Jun. 29, 2018, received in U.S. Appl. No. 14/856,517, 11 pages. |
Office Action, dated Feb. 11, 2016, received in U.S. Appl. No. 14/856,519, 34 pages. |
Final Office Action, dated Jul. 15, 2016, received in U.S. Appl. No. 14/856,519, 31 pages. |
Office Action, dated May 18, 2017, received in U.S. Appl. No. 14/856,519, 35 pages. |
Final Office Action, dated Nov. 15, 2017, received in U.S. Appl. No. 14/856,519, 31 pages. |
Notice of Allowance, dated Jan. 31, 2018, received in U.S. Appl. No. 14/856,519, 9 pages. |
Notice of Allowance, dated May 2, 2018, received in U.S. Appl. No. 14/856,519, 10 pages. |
Office Action, dated Jun. 9, 2017, received in U.S. Appl. No. 14/856,520, 36 pages. |
Final Office Action, dated Nov. 16, 2017, received in U.S. Appl. No. 14/856,520, 41 pages. |
Office Action, dated Nov. 20, 2018, received in U.S. Appl. No. 14/856,520, 36 pages. |
Final Office Action, dated Apr. 17, 2019, received in U.S. Appl. No. 14/856,520, 38 pages. |
Notice of Allowance, dated Jan. 6, 2020, received in U.S. Appl. No. 14/856,520, 5 pages. |
Notice of Allowance, dated Mar. 4, 2020, received in U.S. Appl. No. 14/856,520, 6 pages. |
Notice of Allowance, dated Oct. 1, 2020, received in U.S. Appl. No. 14/856,520, 5 pages. |
Office Action, dated Jun. 30, 2017, received in U.S. Appl. No. 14/856,522, 22 pages. |
Notice of Allowance, dated Feb. 9, 2018, received in U.S. Appl. No. 14/856,522, 9 pages. |
Office Action, dated Feb. 1, 2016, received in U.S. Appl. No. 14/857,645, 15 pages. |
Final Office Action, dated Jun. 16, 2016, received in U.S. Appl. No. 14/857,645, 12 pages. |
Notice of Allowance, dated Oct. 24, 2016, received in U.S. Appl. No. 14/857,645, 6 pages. |
Notice of Allowance, dated Jun. 16, 2017, received in U.S. Appl. No. 14/857,645, 5 pages. |
Office Action, dated Nov. 30, 2017, received in U.S. Appl. No. 14/857,636, 19 pages. |
Notice of Allowance, dated Aug. 16, 2018, received in U.S. Appl. No. 14/857,636, 5 pages. |
Office Action, dated Jan. 17, 2018, received in Australian Patent Application No. 2017202816, which corresponds with U.S. Appl. No. 14/857,636, 3 pages. |
Notice of Allowance, dated Jan. 15, 2019, received in Australian Patent Application No. 2017202816, which corresponds with U.S. Appl. No. 14/857,636, 3 pages. |
Certificate of Grant, dated May 16, 2019, received in Australian Patent Application No. 2017202816, which corresponds with U.S. Appl. No. 14/857,636, 4 pages. |
Office Action, dated Jul. 1, 2020, received in Chinese Patent Application No. 201711262953.5, which corresponds with U.S. Appl. No. 14/857,636, 13 pages. |
Patent, dated Nov. 27, 2020, received in Chinese Patent Application No. 201711262953.5, which corresponds with U.S. Appl. No. 14/857,636, 6 pages. |
Office Action, dated Sep. 22, 2017, received in Japanese Patent Application No. 2017-029201, which corresponds with U.S. Appl. No. 14/857,636, 8 pages. |
Office Action, dated Jun. 25, 2018, received in Japanese Patent Application No. 2017-029201, which corresponds with U.S. Appl. No. 14/857,636, 4 pages. |
Office Action, dated Jan. 20, 2020, received in Japanese Patent Application No. 2017-029201, which corresponds with U.S. Appl. No. 14/857,636, 21 pages. |
Notice of Allowance, dated Oct. 16, 2020, received in Japanese Patent Application No. 2017-029201, which corresponds with U.S. Appl. No. 14/857,636, 4 pages. |
Patent, dated Nov. 12, 2020, received in Japanese Patent Application No. 2017-029201, which corresponds with U.S. Appl. No. 14/857,636, 3 pages. |
Office Action, dated Nov. 28, 2018, received in Korean Patent Application No. 2017-7036645, which corresponds with U.S. Appl. No. 14/857,636, 6 pages. |
Notice of Allowance, dated May 10, 2019, received in Korean Patent Application No. 2017-7036645, which corresponds with U.S. Appl. No. 14/857,636, 4 pages. |
Patent, dated Jul. 11, 2019, received in Korean Patent Application No. 2017-7036645, which corresponds with U.S. Appl. No. 14/857,636, 8 pages. |
Office Action, dated Dec. 1, 2017, received in U.S. Appl. No. 14/857,663, 15 pages. |
Notice of Allowance, dated Aug. 16, 2018, received in U.S. Appl. No. 14/857,663, 5 pages. |
Office Action, dated Jul. 14, 2020, received in Chinese Patent Application No. 201711261143.8, which corresponds with U.S. Appl. No. 14/857,663, 12 pages. |
Notice of Allowance, dated Dec. 2, 2020, received in Chinese Patent Application No. 201711261143.8, which corresponds with U.S. Appl. No. 14/857,663, 3 pages. |
Patent, dated Jan. 22, 2021, received in Chinese Patent Application No. 201711261143.8, which corresponds with U.S. Appl. No. 14/857,663, 6 pages. |
Office Action, dated Nov. 11, 2019, received in Japanese Patent Application No. 2018-201076, which corresponds with U.S. Appl. No. 14/857,663, 7 pages. |
Notice of Allowance, dated Sep. 18, 2020, received in Japanese Patent Application No. 2018-201076, which corresponds with U.S. Appl. No. 14/857,663, 5 pages. |
Patent, dated Oct. 19, 2020, received in Japanese Patent Application No. 2018-201076, which corresponds with U.S. Appl. No. 14/857,663, 4 pages. |
Office Action, dated Mar. 31, 2017, received in U.S. Appl. No. 14/857,700, 14 pages. |
Final Office Action, dated Oct. 11, 2017, received in U.S. Appl. No. 14/857,700, 13 pages. |
Notice of Allowance, dated Feb. 12, 2018, received in U.S. Appl. No. 14/857,700, 13 pages. |
Notice of Allowance, dated Apr. 9, 2018, received in U.S. Appl. No. 14/857,700, 7 pages. |
Notice of Allowance, dated Apr. 19, 2018, received in U.S. Appl. No. 14/864,529, 11 pages. |
Notice of Allowance, dated Oct. 9, 2018, received in U.S. Appl. No. 14/864,529, 11 pages. |
Office Action, dated Dec. 21, 2020, received in Korean Patent Application No. 2020-7029178, which corresponds with U.S. Appl. No. 14/870,882, 2 pages. |
Grant of Patent, dated Apr. 16, 2018, received in Dutch Patent Application No. 2019215, 2 pages. |
Office Action, dated Jan. 25, 2016, received in U.S. Appl. No. 14/864,580, 29 pages. |
Notice of Allowance, dated May 23, 2016, received in U.S. Appl. No. 14/864,580, 9 pages. |
Notice of Allowance, dated Aug. 4, 2016, received in U.S. Appl. No. 14/864,580, 9 pages. |
Notice of Allowance, dated Dec. 28, 2016, received in U.S. Appl. No. 14/864,580, 8 pages. |
Office Action, dated Aug. 19, 2016, received in Australian Patent Application No. 2016100648, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Office Action, dated Jul. 1, 2019, received in Australian Patent Application No. 2019200872, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Notice of Acceptance, dated Sep. 19, 2019, received in Australian Patent Application No. 2019200872, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Certificate of Grant, dated Jan. 23, 2020, received in Australian Patent Application No. 2019200872, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Office Action, dated Nov. 7, 2018, received in Chinese Patent Application No. 201610342151.4, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Notice of Allowance, dated Jun. 14, 2019, received in Chinese Patent Application No. 201610342151.4, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Patent, dated Jul. 30, 2019, received in Chinese Patent Application No. 201610342151.4, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Notice of Allowance, dated Nov. 8, 2016, received in Chinese Patent Application No. 201620470247.4, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Certificate of Registration, dated Oct. 14, 2016, received in German Patent Application No. 20201600003234.9, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Office Action, dated Apr. 8, 2016, received in Danish Patent Application No. 201500584, which corresponds with U.S. Appl. No. 14/864,580, 9 pages. |
Office Action, dated Oct. 7, 2016, received in Danish Patent Application No. 201500584, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Office Action, dated May 5, 2017, received in Danish Patent Application No. 201500584, which corresponds with U.S. Appl. No. 14/864,580, 3 pages. |
Office Action, dated Dec. 15, 2017, received in Danish Patent Application No. 201500584, which corresponds with U.S. Appl. No. 14/864,580, 4 pages. |
Notice of Allowance, dated Aug. 14, 2019, received in Korean Patent Application No. 2019-7018317, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Patent, dated Nov. 12, 2019, received in Korean Patent Application No. 2019-7018317, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Notice of Allowance, dated Nov. 23, 2016, received in U.S. Appl. No. 14/864,601, 12 pages. |
Notice of Allowance, dated Apr. 20, 2017, received in U.S. Appl. No. 14/864,601, 13 pages. |
Office Action, dated Aug. 31, 2018, received in Australian Patent Application No. 2016276030, which corresponds with U.S. Appl. No. 14/864,601, 3 pages. |
Certificate of Grant, dated Feb. 21, 2019, received in Australian Patent Application No. 2016276030, which corresponds with U.S. Appl. No. 14/864,601, 4 pages. |
Office Action, dated Feb. 4, 2019, received in European Patent Application No. 16730554.9, which corresponds with U.S. Appl. No. 14/864,601, 10 pages. |
Intention to Grant, dated Jul. 18, 2019, received in European Patent Application No. 16730554.9, which corresponds with U.S. Appl. No. 14/864,601, 5 pages. |
Decision to Grant, dated Sep. 12, 2019, received in European Patent Application No. 16730554.9, which corresponds with U.S. Appl. No. 14/864,601, 2 pages. |
Patent, dated Oct. 9, 2019, received in European Patent Application No. 16730554.9, which corresponds with U.S. Appl. No. 14/864,601, 3 pages. |
Notice of Allowance, dated Dec. 10, 2018, received in Japanese Patent Application No. 2017-561375, which corresponds with U.S. Appl. No. 14/864,601, 5 pages. |
Patent, dated Jan. 11, 2019, received in Japanese Patent Application No. 2017-561375, which corresponds with U.S. Appl. No. 14/864,601, 3 pages. |
Office Action, dated Jan. 25, 2019, received in Korean Patent Application No. 2017-7033756, which corresponds with U.S. Appl. No. 14/864,601, 8 pages. |
Notice of Allowance, dated May 29, 2019, received in Korean Patent Application No. 2017-7033756, which corresponds with U.S. Appl. No. 14/864,601, 6 pages. |
Patent, dated Jun. 25, 2019, received in Korean Patent Application No. 2017-7033756, which corresponds with U.S. Appl. No. 14/864,601, 6 pages. |
Office Action, dated Apr. 19, 2016, received in U.S. Appl. No. 14/864,627, 9 pages. |
Notice of Allowance, dated Jan. 31, 2017, received in U.S. Appl. No. 14/864,627, 7 pages. |
Office Action, dated Apr. 8, 2016, received in Danish Patent Application No. 201500585, which corresponds with U.S. Appl. No. 14/864,627, 9 pages. |
Office Action, dated Oct. 7, 2016, received in Danish Patent Application No. 201500585, which corresponds with U.S. Appl. No. 14/864,627, 3 pages. |
Office Action, dated May 5, 2017, received in Danish Patent Application No. 201500585, which corresponds with U.S. Appl. No. 14/864,627, 4 pages. |
Office Action, dated Dec. 15, 2017, received in Danish Patent Application No. 201500585, which corresponds with U.S. Appl. No. 14/864,627, 5 pages. |
Office Action, dated Mar. 29, 2016, received in U.S. Appl. No. 14/866,361, 22 pages. |
Notice of Allowance, dated Jul. 19, 2016, received in U.S. Appl. No. 14/866,361, 8 pages. |
Office Action, dated Jun. 10, 2016, received in Australian Patent Application No. 2016100292, which corresponds with U.S. Appl. No. 14/866,361, 4 pages. |
Certificate of Examination, dated Dec. 8, 2016, received in Australian Patent Application No. 2016100292, which corresponds with U.S. Appl. No. 14/866,361, 1 page. |
Office Action, dated Oct. 19, 2018, received in Chinese Patent Application No. 201610189298.4, which corresponds with U.S. Appl. No. 14/866,361, 6 pages. |
Notice of Allowance, dated May 23, 2019, received in Chinese Patent Application No. 201610189298.4, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Patent, dated Jul. 23, 2019, received in Chinese Patent Application No. 201610189298.4, which corresponds with U.S. Appl. No. 14/866,361, 7 pages. |
Notice of Allowance/Grant, dated Jul. 1, 2016, received in Chinese Patent Application No. 201620251706.X, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Letters Patent, dated Aug. 3, 2016, received in Chinese Patent Application No. 201620251706.X, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Certificate of Registration, dated Jun. 24, 2016, received in German Patent Application No. 202016001819.2, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Office Action, dated Apr. 7, 2016, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 10 pages. |
Office Action, dated Oct. 28, 2016, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Office Action, dated Jun. 15, 2017, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Office Action, dated Jan. 4, 2018, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Notice of Allowance, dated Mar. 16, 2018, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Patent, dated May 22, 2018, received in Danish Patent Application No. 201500579, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Office Action, dated Jun. 11, 2018, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 10 pages. |
Office Action, dated Jan. 30, 2019, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 13 pages. |
Office Action, dated Oct. 8, 2019, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 6 pages. |
Intention to Grant, dated Apr. 14, 2020, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 7 pages. |
Intention to Grant, dated Feb. 3, 2021, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 7 pages. |
Office Action, dated Oct. 12, 2018, received in Japanese Patent Application No. 2017-141962, which corresponds with U.S. Appl. No. 14/866,361, 6 pages. |
Office Action, dated Jun. 10, 2019, received in Japanese Patent Application No. 2017-141962, which corresponds with U.S. Appl. No. 14/866,361, 6 pages. |
Notice of Allowance, dated Oct. 7, 2019, received in Japanese Patent Application No. 2017-141962, which corresponds with U.S. Appl. No. 14/866,361, 5 pages. |
Patent, dated Nov. 8, 2019, received in Japanese Patent Application No. 2017-141962, which corresponds with U.S. Appl. No. 14/866,361, 4 pages. |
Office Action, dated Sep. 14, 2018, received in Korean Patent Application No. 2018-7013039, which corresponds with U.S. Appl. No. 14/866,361, 2 pages. |
Notice of Allowance, dated Jan. 30, 2019, received in Korean Patent Application No. 2018-7013039, which corresponds with U.S. Appl. No. 14/866,361, 5 pages. |
Patent, dated Apr. 3, 2019, received in Korean Patent Application No. 2018-7013039, which corresponds with U.S. Appl. No. 14/866,361, 4 pages. |
Office Action, dated Jan. 22, 2018, received in U.S. Appl. No. 14/866,987, 22 pages. |
Final Office Action, dated Oct. 11, 2018, received in U.S. Appl. No. 14/866,987, 20 pages. |
Notice of Allowance, dated Apr. 4, 2019, received in U.S. Appl. No. 14/866,987, 5 pages. |
Patent, dated Aug. 8, 2016, received in Australian Patent Application No. 2016100649, which corresponds with U.S. Appl. No. 14/866,987, 1 page. |
Office Action, dated Dec. 4, 2018, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 5 pages. |
Rejection Decision, dated Apr. 28, 2019, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 4 pages. |
Office Action, dated Aug. 15, 2019, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Notice of Allowance, dated Dec. 3, 2019, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Patent, dated Jan. 31, 2020, received in Chinese Patent Application No. 201610342336.5, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Office Action, dated Oct. 19, 2016, received in Chinese Patent Application No. 2016201470246.X, which corresponds with U.S. Appl. No. 14/866,987, 4 pages. |
Patent, dated May 3, 2017, received in Chinese Patent Application No. 2016201470246.X, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Patent, dated Sep. 19, 2016, received in German Patent Application No. 202016002908.9, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Mar. 22, 2016, received in Danish Patent Application No. 201500587, which corresponds with U.S. Appl. No. 14/866,987, 8 pages. |
Intention to Grant, dated Jun. 10, 2016, received in Danish Patent Application No. 201500587, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Notice of Allowance, dated Nov. 1, 2016, received in Danish Patent Application No. 201500587, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Office Action, dated Sep. 9, 2016, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Notice of Allowance, dated Jan. 31, 2017, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Apr. 19, 2017, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Notice of Allowance, dated Sep. 29, 2017, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Patent, dated Nov. 6, 2017, received in Danish Patent Application No. 201670463, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Office Action, dated May 7, 2018, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 5 pages. |
Office Action, dated Dec. 11, 2018, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Intention to Grant, dated Jun. 14, 2019, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Intention to Grant, dated Oct. 25, 2019, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Decision to Grant, dated Nov. 14, 2019, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 2 pages. |
Patent, dated Dec. 11, 2019, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Feb. 3, 2020, received in European Patent Application No. 17163309.2, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Patent, dated Jan. 8, 2021, received in Hong Kong Patent Application No. 18100151.5, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Office Action, dated Aug. 26, 2020, received in Indian Patent Application No. 201617032291, which corresponds with U.S. Appl. No. 14/866,987, 9 pages. |
Notice of Allowance, dated Sep. 22, 2017, received in Japanese Patent Application No. 2016-233449, which corresponds with U.S. Appl. No. 14/866,987, 5 pages. |
Patent, dated Oct. 27, 2017, received in Japanese Patent Application No. 2016-233449, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Jul. 31, 2017, received in Japanese Patent Application No. 2017-126445, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Notice of Allowance, dated Mar. 6, 2018, received in Japanese Patent Application No. 2017-126445, which corresponds with U.S. Appl. No. 14/866,987, 5 pages. |
Patent, dated Apr. 6, 2018, received in Japanese Patent Application No. 2017-126445, which corresponds with U.S. Appl. No. 14/866,987, 3 pages. |
Office Action, dated Nov. 29, 2017, received in U.S. Appl. No. 14/866,989, 31 pages. |
Final Office Action, dated Jul. 3, 2018, received in U.S. Appl. No. 14/866,989, 17 pages. |
Notice of Allowance, dated Jan. 17, 2019, received in U.S. Appl. No. 14/866,989, 8 pages. |
Certificate of Examination, dated Jul. 21, 2016, received in Australian Patent Application No. 2016100652, which corresponds with U.S. Appl. No. 14/866,989, 1 page. |
Office Action, dated Feb. 26, 2018, received in Australian Patent Application No. 2017201079, which corresponds with U.S. Appl. No. 14/866,989, 6 pages. |
Notice of Acceptance, dated Feb. 14, 2019, received in Australian Patent Application No. 2017201079, which corresponds with U.S. Appl. No. 14/866,989, 3 pages. |
Certificate of Grant, dated Jun. 13, 2019, received in Australian Patent Application No. 2017201079, which corresponds with U.S. Appl. No. 14/866,989, 1 page. |
Office Action, dated Sep. 19, 2018, received in Chinese Patent Application No. 201610342314.9, which corresponds with U.S. Appl. No. 14/866,989, 6 pages. |
Office Action, dated Feb. 25, 2019, received in Chinese Patent Application No. 201610342314.9, which corresponds with U.S. Appl. No. 14/866,989, 3 pages. |
Rejection Decision, dated Apr. 24, 2019, received in Chinese Patent Application No. 201610342314.9, which corresponds with U.S. Appl. No. 14/866,989, 3 pages. |
Office Action, dated Jun. 16, 2017, received in Japanese Patent Application No. 2016-233450, which corresponds with U.S. Appl. No. 14/866,989, 6 pages. |
Patent, dated Mar. 9, 2018, received in Japanese Patent Application No. 2016-233450, which corresponds with U.S. Appl. No. 14/866,989, 4 pages. |
Office Action, dated Apr. 1, 2016, received in Danish Patent Application No. 201500589, which corresponds with U.S. Appl. No. 14/866,989, 8 pages. |
Intention to Grant, dated Jun. 10, 2016, received in Danish Patent Application No. 201500589, which corresponds with U.S. Appl. No. 14/866,989, 2 pages. |
Notice of Allowance, dated Nov. 1, 2016, received in Danish Patent Application No. 201500589, which corresponds with U.S. Appl. No. 14/866,989, 2 pages. |
Office Action, dated Feb. 3, 2020, received in European Patent Application No. 16189425.8, which corresponds with U.S. Appl. No. 14/866,989, 6 pages. |
Intention to Grant, dated Dec. 3, 2020, received in European Patent Application No. 16189425.8, which corresponds with U.S. Appl. No. 14/866,989, 7 pages. |
Decision to Grant, dated Feb. 25, 2021, received in European Patent Application No. 16189425.8, which corresponds with U.S. Appl. No. 14/866,989, 1 page. |
Notice of Allowance, dated Feb. 5, 2018, received in Japanese Patent Application No. 2016-233450, which corresponds with U.S. Appl. No. 14/866,989, 5 pages. |
Office Action, dated Apr. 11, 2016, received in U.S. Appl. No. 14/871,236, 23 pages. |
Office Action, dated Jun. 28, 2016, received in U.S. Appl. No. 14/871,236, 21 pages. |
Final Office Action, dated Nov. 4, 2016, received in U.S. Appl. No. 14/871,236, 24 pages. |
Notice of Allowance, dated Feb. 28, 2017, received in U.S. Appl. No. 14/871,236, 9 pages. |
Innovation Patent, dated Aug. 25, 2016, received in Australian Patent Application No. 2016101433, which corresponds with U.S. Appl. No. 14/871,236, 1 page. |
Office Action, dated Oct. 14, 2016, received in Australian Patent Application No. 2016101433, which corresponds with U.S. Appl. No. 14/871,236, 3 pages. |
Office Action, dated Jun. 23, 2020, received in Brazilian Patent Application No. 11201701119-9, which corresponds with U.S. Appl. No. 14/871,236, 9 pages. |
Office Action, dated Sep. 30, 2019, received in Chinese Patent Application No. 201610871466.8, which corresponds with U.S. Appl. No. 14/871,236, 4 pages. |
Notice of Allowance, dated Mar. 24, 2020, received in Chinese Patent Application No. 201610871466.8, which corresponds with U.S. Appl. No. 14/871,236, 3 pages. |
Patent, dated May 19, 2020, received in Chinese Patent Application No. 201610871466.8, which corresponds with U.S. Appl. No. 14/871,236, 8 pages. |
Office Action, dated Apr. 8, 2016, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 12 pages. |
Office Action, dated May 26, 2016, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 14 pages. |
Office Action, dated Sep. 30, 2016, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 10 pages. |
Office Action, dated Jun. 15, 2017, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 4 pages. |
Office Action, dated Jan. 29, 2018, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 2 pages. |
Notice of Allowance, dated Apr. 26, 2018, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 2 pages. |
Patent, dated Jun. 18, 2018, received in Danish Patent Application No. 201500595, which corresponds with U.S. Appl. No. 14/871,236, 3 pages. |
Intention to Grant, dated Dec. 4, 2019, received in European Patent Application No. 18168941.5, which corresponds with U.S. Appl. No. 14/871,236, 8 pages. |
Intention to Grant, dated Oct. 5, 2020, received in European Patent Application No. 18168941.5, which corresponds with U.S. Appl. No. 14/871,236, 8 pages. |
Decision to Grant, dated Mar. 25, 2021, received in European Patent Application No. 18168941.5, which corresponds with U.S. Appl. No. 14/871,236, 2 pages. |
Office Action, dated Mar. 17, 2020, received in Mexican Patent Application No. MX/a/2017/011610, which corresponds with U.S. Appl. No. 14/871,236, 4 pages. |
Notice of Allowance, dated Sep. 7, 2020, received in Mexican Patent Application No. MX/a/2017/011610, which corresponds with U.S. Appl. No. 14/871,236, 12 pages. |
Patent, dated Dec. 2, 2020, received in Mexican Patent Application No. MX/a/2017/011610, which corresponds with U.S. Appl. No. 14/871,236, 4 pages. |
Office Action, dated Jul. 19, 2018, received in Russian Patent Application No. 2017131408, which corresponds with U.S. Appl. No. 14/871,236, 8 pages. |
Patent, dated Feb. 15, 2019, received in Russian Patent Application No. 2017131408, which corresponds with U.S. Appl. No. 14/871,236, 2 pages. |
Office Action, dated Sep. 1, 2017, received in U.S. Appl. No. 14/870,754, 22 pages. |
Final Office Action, dated Mar. 9, 2018, received in U.S. Appl. No. 14/870,754, 19 pages. |
Notice of Allowance, dated Jul. 2, 2018, received in U.S. Appl. No. 14/870,754, 9 pages. |
Notice of Allowance, dated Dec. 3, 2018, received in U.S. Appl. No. 14/870,754, 8 pages. |
Office Action, dated Nov. 14, 2017, received in U.S. Appl. No. 14/870,882, 25 pages. |
Final Office Action, dated Apr. 20, 2018, received in U.S. Appl. No. 14/870,882, 7 pages. |
Notice of Allowance, dated Jul. 12, 2018, received in U.S. Appl. No. 14/870,882, 5 pages. |
Notice of Allowance, dated Dec. 5, 2018, received in U.S. Appl. No. 14/870,882, 8 pages. |
Innovation Patent, dated Aug. 25, 2016, received in Australian Patent Application No. 2016101436, which corresponds with U.S. Appl. No. 14/871,236, 1 page. |
Office Action, dated Oct. 31, 2016, received in Australian Patent Application No. 2016101438, which corresponds with U.S. Appl. No. 14/871,236, 6 pages. |
Office Action, dated Nov. 28, 2019, received in Chinese Patent Application No. 201610870912.3, which corresponds with U.S. Appl. No. 14/870,882, 10 pages. |
Office Action, dated Aug. 3, 2020, received in Chinese Patent Application No. 201610870912.3, which corresponds with U.S. Appl. No. 14/870,882, 4 pages. |
Office Action, dated Dec. 21, 2020, received in Chinese Patent Application No. 201610870912.3, which corresponds with U.S. Appl. No. 14/870,882, 5 pages. |
Notice of Allowance, dated Mar. 22, 2021, received in Chinese Patent Application No. 201610870912.3, which corresponds with U.S. Appl. No. 14/870,882, 1 page. |
Office Action, dated Apr. 6, 2016, received in Danish Patent Application No. 201500596, which corresponds with U.S. Appl. No. 14/870,882, 7 pages. |
Office Action, dated Jun. 9, 2016, received in Danish Patent Application No. 201500596, which corresponds with U.S. Appl. No. 14/870,882, 9 pages. |
Notice of Allowance, dated Oct. 31, 2017, received in Danish Patent Application No. 201500596, which corresponds with U.S. Appl. No. 14/870,882, 2 pages. |
Patent, dated Jan. 29, 2018, received in Danish Patent Application No. 201500596, which corresponds with U.S. Appl. No. 14/870,882, 4 pages. |
Office Action, dated Feb. 11, 2019, received in European Patent Application No. 17171972.7, which corresponds with U.S. Appl. No. 14/870,882, 7 pages. |
Office Action, dated Sep. 1, 2017, received in U.S. Appl. No. 14/870,988, 14 pages. |
Final Office Action, dated Feb. 16, 2018, received in U.S. Appl. No. 14/870,988, 18 pages. |
Notice of Allowance, dated Aug. 27, 2018, received in U.S. Appl. No. 14/870,988, 11 pages. |
Office Action, dated Nov. 22, 2017, received in U.S. Appl. No. 14/871,227, 24 pages. |
Notice of Allowance, dated Jun. 11, 2018, received in U.S. Appl. No. 14/871,227, 11 pages. |
Office Action, dated Oct. 17, 2016, received in Australian Patent Application No. 2016203040, which corresponds with U.S. Appl. No. 14/871,227, 7 pages. |
Office Action, dated Oct. 16, 2017, received in Australian Patent Application No. 2016203040, which corresponds with U.S. Appl. No. 14/871,227, 5 pages. |
Notice of Acceptance, dated Oct. 30, 2018, received in Australian Patent Application No. 2016203040, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Certificate of Grant, dated Feb. 28, 2019, received in Australian Patent Application No. 2016203040, which corresponds with U.S. Appl. No. 14/871,227, 1 page. |
Office Action, dated Oct. 18, 2016, received in Australian Patent Application No. 2016101431, which corresponds with U.S. Appl. No. 14/871,227, 3 pages. |
Office Action, dated Apr. 13, 2017, received in Australian Patent Application No. 2016101431, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Office Action, dated Oct. 11, 2018, received in Australian Patent Application No. 2017245442, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Office Action, dated Nov. 16, 2018, received in Chinese Patent Application No. 201680000466.9, which corresponds with U.S. Appl. No. 14/871,227, 5 pages. |
Notice of Allowance, dated Jun. 5, 2019, received in Chinese Patent Application No. 201680000466.9, which corresponds with U.S. Appl. No. 14/871,227, 5 pages. |
Patent, dated Aug. 9, 2019, received in Chinese Patent Application No. 201680000466.9, which corresponds with U.S. Appl. No. 14/871,227, 8 pages. |
Intention to Grant, dated Apr. 7, 2016, received in Danish Patent Application No. 201500597, which corresponds with U.S. Appl. No. 14/871,227, 7 pages. |
Grant, dated Jun. 21, 2016, received in Danish Patent Application No. 201500597, which corresponds with U.S. Appl. No. 14/871,227, 2 pages. |
Patent, dated Sep. 26, 2016, received in Danish Patent Application No. 201500597, which corresponds with U.S. Appl. No. 14/871,227, 7 pages. |
Intention to Grant, dated Sep. 17, 2018, received in European Patent Application No. 16711743.1, which corresponds with U.S. Appl. No. 14/871,227, 5 pages. |
Patent, dated Nov. 28, 2018, received in European Patent Application No. 16711743.1, which corresponds with U.S. Appl. No. 14/871,227, 1 page. |
Office Action, dated Jul. 20, 2020, received in Indian Patent Application No. 201617032293, which corresponds with U.S. Appl. No. 14/871,227, 9 pages. |
Office Action, dated Mar. 24, 2017, received in Japanese Patent Application No. 2016-533201, which corresponds with U.S. Appl. No. 14/871,227, 6 pages. |
Office Action, dated Aug. 4, 2017, received in Japanese Patent Application No. 2016-533201, which corresponds with U.S. Appl. No. 14/871,227, 6 pages. |
Notice of Allowance, dated Jan. 4, 2018, received in Japanese Patent Application No. 2016-533201, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Patent, dated Feb. 9, 2018, received in Japanese Patent Application No. 2016-533201, which corresponds with U.S. Appl. No. 14/871,227, 4 pages. |
Office Action, dated Feb. 20, 2018, received in Korean Patent Application No. 2016-7019816, which corresponds with U.S. Appl. No. 14/871,227, 8 pages. |
Notice of Allowance, dated Oct. 1, 2018, received in Korean Patent Application No. 2016-7019816, which corresponds with U.S. Appl. No. 14/871,227, 6 pages. |
Patent, dated Dec. 28, 2018, received in Korean Patent Application No. 2016-7019816, which corresponds with U.S. Appl. No. 14/871,227, 8 pages. |
Office Action, dated Oct. 26, 2017, received in U.S. Appl. No. 14/871,336, 22 pages. |
Final Office Action, dated Mar. 15, 2018, received in U.S. Appl. No. 14/871,336, 23 pages. |
Office Action, dated Nov. 5, 2018, received in U.S. Appl. No. 14/871,336, 24 pages. |
Notice of Allowance, dated Feb. 5, 2019, received in U.S. Appl. No. 14/871,336, 10 pages. |
Office Action, dated Oct. 14, 2016, received in Australian Patent Application No. 2016101437, which corresponds with U.S. Appl. No. 14/871,336, 2 pages. |
Office Action, dated Apr. 11, 2017, received in Australian Patent Application No. 2016101437, which corresponds with U.S. Appl. No. 14/871,336, 4 pages. |
Office Action, dated Nov. 4, 2019, received in Chinese Patent Application No. 201610871323.7, which corresponds with U.S. Appl. No. 14/871,336, 12 pages. |
Office Action, dated Aug. 4, 2020, received in Chinese Patent Application No. 201610871323.7, which corresponds with U.S. Appl. No. 14/871,336, 18 pages. |
Office Action, dated Feb. 9, 2021, received in Chinese Patent Application No. 201610871323.7, which corresponds with U.S. Appl. No. 14/871,336, 1 page. |
Office Action, dated Apr. 18, 2016, received in Danish Patent Application No. 201500601, which corresponds with U.S. Appl. No. 14/871,336, 8 pages. |
Office Action, dated Oct. 18, 2016, received in Danish Patent Application No. 201500601, which corresponds with U.S. Appl. No. 14/871,336, 3 pages. |
Notice of Allowance, dated Mar. 23, 2017, received in Danish Patent Application No. 201500601, which corresponds with U.S. Appl. No. 14/871,336, 2 pages. |
Patent, dated Oct. 30, 2017, received in Danish Patent Application No. 201500601, which corresponds with U.S. Appl. No. 14/871,336, 5 pages. |
Office Action, dated Feb. 12, 2019, received in European Patent Application No. 17172266.3, which corresponds with U.S. Appl. No. 14/871,336, 6 pages. |
Office Action, dated Apr. 2, 2018, received in Japanese Patent Application No. 2018-020324, which corresponds with U.S. Appl. No. 14/871,336, 4 pages. |
Office Action, dated Mar. 21, 2016, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 9 pages. |
Office Action, dated Sep. 14, 2016, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 4 pages. |
Office Action, dated May 4, 2017, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 4 pages. |
Office Action, dated Oct. 31, 2017, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 2 pages. |
Notice of Allowance, dated Jan. 26, 2018, received in Danish Patent Application No. 201500598, which corresponds with U.S. Appl. No. 14/867,892, 2 pages. |
Office Action, dated Feb. 28, 2018, received in U.S. Appl. No. 14/869,361, 26 pages. |
Final Office Action, dated Oct. 4, 2018, received in U.S. Appl. No. 14/869,361, 28 pages. |
Office Action, dated Feb. 27, 2019, received in U.S. Appl. No. 14/869,361, 28 pages. |
Office Action, dated Mar. 1, 2017, received in U.S. Appl. No. 14/869,855, 14 pages. |
Final Office Action, dated Oct. 10, 2017, received in U.S. Appl. No. 14/869,855, 16 pages. |
Office Action, dated Jan. 23, 2018, received in U.S. Appl. No. 14/869,855, 24 pages. |
Notice of Allowance, dated May 31, 2018, received in U.S. Appl. No. 14/869,855, 10 pages. |
Office Action, dated Feb. 9, 2017, received in U.S. Appl. No. 14/869,873, 17 pages. |
Final Office Action, dated Aug. 18, 2017, received in U.S. Appl. No. 14/869,873, 20 pages. |
Office Action, dated Jan. 18, 2018, received in U.S. Appl. No. 14/869,873, 25 pages. |
Final Office Action, dated May 23, 2018, received in U.S. Appl. No. 14/869,873, 18 pages. |
Notice of Allowance, dated Jul. 30, 2018, received in U.S. Appl. No. 14/869,873, 8 pages. |
Office Action, dated Jan. 11, 2018, received in U.S. Appl. No. 14/869,997, 17 pages. |
Office Action, dated Sep. 7, 2018, received in U.S. Appl. No. 14/869,997, 23 pages. |
Notice of Allowance, dated Apr. 4, 2019, received in U.S. Appl. No. 14/869,997, 9 pages. |
Notice of Allowance, dated Jan. 17, 2018, received in U.S. Appl. No. 14/867,990, 12 pages. |
Notice of Allowance, dated Mar. 30, 2018, received in U.S. Appl. No. 14/867,990, 5 pages. |
Office Action, dated May 23, 2016, received in Australian Patent Application No. 2016100253, which corresponds with U.S. Appl. No. 14/867,990, 5 pages. |
Notice of Allowance, dated May 21, 2019, received in Chinese Patent Application No. 201610131507.X, which corresponds with U.S. Appl. No. 14/867,990, 3 pages. |
Patent, dated Jul. 19, 2019, received in Chinese Patent Application No. 201610131507.X, which corresponds with U.S. Appl. No. 14/867,990, 6 pages. |
Office Action, dated Jul. 5, 2016, received in Chinese Patent Application No. 201620176221.9, which corresponds with U.S. Appl. No. 14/867,990, 4 pages. |
Office Action, dated Oct. 25, 2016, received in Chinese Patent Application No. 201620176221.9, which corresponds with U.S. Appl. No. 14/867,990, 7 pages. |
Certificate of Registration, dated Jun. 16, 2016, received in German Patent No. 202016001489.8, which corresponds with U.S. Appl. No. 14/867,990, 3 pages. |
Office Action, dated Mar. 18, 2016, received in Danish Patent Application No. 201500581, which corresponds with U.S. Appl. No. 14/867,990, 9 pages. |
Office Action, dated Sep. 26, 2016, received in Danish Patent Application No. 201500581, which corresponds with U.S. Appl. No. 14/867,990, 5 pages. |
Office Action, dated May 3, 2017, received in Danish Patent Application No. 201500581, which corresponds with U.S. Appl. No. 14/867,990, 5 pages. |
Office Action, dated Feb. 19, 2018, received in Danish Patent Application No. 201500581, which corresponds with U.S. Appl. No. 14/867,990, 4 pages. |
Office Action, dated Feb. 21, 2020, received in European Patent Application No. 16711725.8, which corresponds with U.S. Appl. No. 14/867,990, 13 pages. |
Office Action, dated Apr. 19, 2018, received in U.S. Appl. No. 14/869,703, 19 pages. |
Final Office Action, dated Oct. 26, 2018, received in U.S. Appl. No. 14/869,703, 19 pages. |
Notice of Allowance, dated Mar. 12, 2019, received in U.S. Appl. No. 14/869,703, 6 pages. |
Office Action, dated Dec. 12, 2017, received in U.S. Appl. No. 15/009,668, 32 pages. |
Final Office Action, dated Jul. 3, 2018, received in U.S. Appl. No. 15/009,668, 19 pages. |
Office Action, dated Jan. 10, 2019, received in U.S. Appl. No. 15/009,668, 17 pages. |
Notice of Allowance, dated May 1, 2019, received in U.S. Appl. No. 15/009,668, 12 pages. |
Office Action, dated Aug. 20, 2020, received in Chinese Patent Application No. 201680046985.9, which corresponds with U.S. Appl. No. 15/009,668, 15 pages. |
Office Action, dated Jan. 31, 2020, received in European Patent Application No. 16753795.0, which corresponds with U.S. Appl. No. 15/009,668, 9 pages. |
Office Action, dated Mar. 19, 2021, received in European Patent Application No. 16753795.0, which corresponds with U.S. Appl. No. 15/009,668, 5 pages. |
Office Action, dated Nov. 25, 2016, received in U.S. Appl. No. 15/081,771, 17 pages. |
Final Office Action, dated Jun. 2, 2017, received in U.S. Appl. No. 15/081,771, 17 pages. |
Notice of Allowance, dated Dec. 4, 2017, received in U.S. Appl. No. 15/081,771, 10 pages. |
Office Action, dated Feb. 1, 2018, received in Australian Patent Application No. 2017202058, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Notice of Acceptance, dated Jan. 24, 2019, received in Australian Patent Application No. 2017202058, which corresponds with U.S. Appl. No. 15/081,771, 3 pages. |
Certificate of Grant, dated May 23, 2019, received in Australian Patent Application No. 2017202058, which corresponds with U.S. Appl. No. 15/081,771, 1 page. |
Office Action, dated Jan. 24, 2020, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Intention to Grant, dated Apr. 30, 2020, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 7 pages. |
Decision to Grant, dated Aug. 27, 2020, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Patent, dated Sep. 23, 2020, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Office Action, dated Jan. 26, 2018, received in Japanese Patent Application No. 2017-086460, which corresponds with U.S. Appl. No. 15/081,771, 6 pages. |
Notice of Allowance, dated Oct. 12, 2018, received in Japanese Patent Application No. 2017-086460, which corresponds with U.S. Appl. No. 15/081,771, 5 pages. |
Office Action, dated Aug. 29, 2017, received in Korean Patent Application No. 2017-7014536, which corresponds with U.S. Appl. No. 15/081,771, 5 pages. |
Notice of Allowance, dated Jun. 28, 2018, received in Korean Patent Application No. 2017-7014536, which corresponds with U.S. Appl. No. 15/081,771, 4 pages. |
Patent, dated Sep. 28, 2018, received in Korean Patent Application No. 2017-7014536, which corresponds with U.S. Appl. No. 15/081,771, 3 pages. |
Final Office Action, dated May 1, 2017, received in U.S. Appl. No. 15/136,782, 18 pages. |
Notice of Allowance, dated Oct. 20, 2017, received in U.S. Appl. No. 15/136,782, 9 pages. |
Office Action, dated May 4, 2018, received in Australian Patent Application No. 2018202855, which corresponds with U.S. Appl. No. 15/136,782, 3 pages. |
Notice of Acceptance, dated Sep. 10, 2018, received in Australian Patent Application No. 2018202855, which corresponds with U.S. Appl. No. 15/136,782, 3 pages. |
Certificate of Grant, dated Jan. 17, 2019, received in Australian Patent Application No. 2018202855, which corresponds with U.S. Appl. No. 15/136,782, 4 pages. |
Office Action, dated Sep. 27, 2019, received in Chinese Patent Application No. 201810119007.3, which corresponds with U.S. Appl. No. 15/136,782, 6 pages. |
Notice of Allowance, dated Feb. 26, 2020, received in Chinese Patent Application No. 201810119007.3, which corresponds with U.S. Appl. No. 15/136,782, 3 pages. |
Patent, dated Apr. 7, 2020, received in Chinese Patent Application No. 201810119007.3, which corresponds with U.S. Appl. No. 15/136,782, 7 pages. |
Office Action, dated May 23, 2017, received in Danish Patent Application No. 201770190, which corresponds with U.S. Appl. No. 15/136,782, 7 pages. |
Office Action, dated Jan. 8, 2018, received in Danish Patent Application No. 201770190, which corresponds with U.S. Appl. No. 15/136,782, 2 pages. |
Notice of Allowance, dated Mar. 19, 2018, received in Danish Patent Application No. 201770190, which corresponds with U.S. Appl. No. 15/136,782, 2 pages. |
Patent, dated May 22, 2018, received in Danish Patent Application No. 201770190, which corresponds with U.S. Appl. No. 15/136,782, 2 pages. |
Office Action, dated Apr. 17, 2019, received in European Patent Application No. 18171453.6, which corresponds with U.S. Appl. No. 15/136,782, 4 pages. |
Office Action, dated Oct. 2, 2019, received in European Patent Application No. 18171453.6, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Office Action, dated May 12, 2020, received in European Patent Application No. 18171453.6, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Patent, dated Feb. 5, 2021, received in Hong Kong Patent Application No. 1257553, which corresponds with U.S. Appl. No. 15/136,782, 14 pages. |
Office Action, dated Jun. 1, 2018, received in Japanese Patent Application No. 2018-062161, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Office Action, dated Nov. 12, 2018, received in Japanese Patent Application No. 2018-062161, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Notice of Allowance, dated Feb. 18, 2019, received in Japanese Patent Application No. 2018-062161, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Patent, dated Mar. 22, 2019, received in Japanese Patent Application No. 2018-062161, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Office Action, dated Oct. 31, 2018, received in Korean Patent Application No. 2018-7020659, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Notice of Allowance, dated Feb. 25, 2019, received in Korean Patent Application No. 2018-7020659, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Patent, dated Apr. 3, 2019, received in Korean Patent Application No. 2018-7020659, which corresponds with U.S. Appl. No. 15/136,782, 5 pages. |
Office Action, dated Jan. 20, 2017, received in U.S. Appl. No. 15/231,745, 21 pages. |
Notice of Allowance, dated Jul. 6, 2017, received in U.S. Appl. No. 15/231,745, 18 pages. |
Office Action, dated Oct. 17, 2016, received in Danish Patent Application No. 201670587, which corresponds with U.S. Appl. No. 15/231,745, 9 pages. |
Office Action, dated Jun. 29, 2017, received in Danish Patent Application No. 201670587, which corresponds with U.S. Appl. No. 15/231,745, 4 pages. |
Office Action, dated Feb. 22, 2018, received in Danish Patent Application No. 201670587, which corresponds with U.S. Appl. No. 15/231,745, 4 pages. |
Office Action, dated Dec. 18, 2018, received in Danish Patent Application No. 201670587, which corresponds with U.S. Appl. No. 15/231,745, 4 pages. |
Office Action, dated Dec. 14, 2016, received in Danish Patent Application No. 201670590, which corresponds with U.S. Appl. No. 15/231,745, 9 pages. |
Office Action, dated Jul. 6, 2017, received in Danish Patent Application No. 201670590, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Office Action, dated Jan. 10, 2018, received in Danish Patent Application No. 201670590, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Patent, dated May 28, 2018, received in Danish Patent Application No. 201670590, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Office Action, dated Nov. 10, 2016, received in Danish Patent Application No. 201670591, which corresponds with U.S. Appl. No. 15/231,745, 12 pages. |
Office Action, dated Apr. 11, 2018, received in Danish Patent Application No. 201670591, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Office Action, dated Nov. 23, 2018, received in Danish Patent Application No. 201670591, which corresponds with U.S. Appl. No. 15/231,745, 7 pages. |
Office Action, dated Oct. 26, 2016, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 8 pages. |
Office Action, dated Jan. 5, 2017, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Office Action, dated Jan. 30, 2018, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Notice of Allowance, dated Mar. 27, 2018, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Patent, dated May 28, 2018, received in Danish Patent Application No. 201670592, which corresponds with U.S. Appl. No. 15/231,745, 2 pages. |
Office Action, dated Oct. 12, 2016, received in Danish Patent Application No. 201670593, which corresponds with U.S. Appl. No. 15/231,745, 7 pages. |
Patent, dated Oct. 30, 2017, received in Danish Patent Application No. 201670593, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Notice of Allowance, dated Nov. 1, 2019, received in Japanese Patent Application No. 2018-158502, which corresponds with U.S. Appl. No. 15/231,745, 5 pages. |
Patent, dated Nov. 29, 2019, received in Japanese Patent Application No. 2018-158502, which corresponds with U.S. Appl. No. 15/231,745, 3 pages. |
Notice of Allowance, dated Oct. 4, 2018, received in U.S. Appl. No. 15/272,327, 46 pages. |
Notice of Acceptance, dated Mar. 2, 2018, received in Australian Patent Application No. 2018200705, which corresponds with U.S. Appl. No. 15/272,327, 3 pages. |
Certificate of Grant, dated Jun. 28, 2018, received in Australian Patent Application No. 2018200705, which corresponds with U.S. Appl. No. 15/272,327, 4 pages. |
Office Action, dated Mar. 22, 2019, received in Australian Patent Application No. 2018204234, which corresponds with U.S. Appl. No. 15/272,327, 7 pages. |
Notice of Acceptance, dated Dec. 10, 2019, received in Australian Patent Application No. 2018204234, which corresponds with U.S. Appl. No. 15/272,327, 3 pages. |
Certificate of Grant, dated Apr. 2, 2020, received in Australian Patent Application No. 2018204234, which corresponds with U.S. Appl. No. 15/272,327, 1 page. |
Office Action, dated Aug. 31, 2020, received in Chinese Patent Application No. 201810151593.X, which corresponds with U.S. Appl. No. 15/272,327, 10 pages. |
Notice of Allowance, dated Jan. 27, 2021, received in Chinese Patent Application No. 201810151593.X, which corresponds with U.S. Appl. No. 15/272,327, 3 pages. |
Office Action, dated Sep. 14, 2018, received in European Patent Application No. 15155939.4, which corresponds with U.S. Appl. No. 15/272,327, 5 pages. |
Intention to Grant, dated Mar. 19, 2019, received in European Patent Application No. 15155939.4, which corresponds with U.S. Appl. No. 15/272,327, 6 pages. |
Decision to Grant, dated Apr. 26, 2019, received in European Patent Application No. 15155939.4, which corresponds with U.S. Appl. No. 15/272,327, 2 pages. |
Patent, dated May 22, 2019, received in European Patent Application No. 15155939.4, which corresponds with U.S. Appl. No. 15/272,327, 1 page. |
Notice of Allowance, dated Jul. 30, 2018, received in Japanese Patent Application No. 2018-506989, which corresponds with U.S. Appl. No. 15/272,327, 4 pages. |
Patent, dated Aug. 31, 2018, received in Japanese Patent Application No. 2018-506989, which corresponds with U.S. Appl. No. 15/272,327, 3 pages. |
Office Action, dated Oct. 26, 2018, received in U.S. Appl. No. 15/272,341, 22 pages. |
Final Office Action, dated Mar. 25, 2019, received in U.S. Appl. No. 15/272,341, 25 pages. |
Notice of Allowance, dated Feb. 20, 2020, received in U.S. Appl. No. 15/272,341, 12 pages. |
Office Action, dated Jul. 27, 2017, received in Australian Patent Application No. 2017100535, which corresponds with U.S. Appl. No. 15/272,341, 4 pages. |
Notice of Allowance, dated Sep. 20, 2018, received in U.S. Appl. No. 15/272,343, 44 pages. |
Office Action, dated Jun. 5, 2019, received in Chinese Patent Application No. 201810071627.4, which corresponds with U.S. Appl. No. 15/272,343, 6 pages. |
Notice of Allowance, dated Dec. 11, 2019, received in Chinese Patent Application No. 201810071627.4, which corresponds with U.S. Appl. No. 15/272,343, 4 pages. |
Patent, dated Mar. 3, 2020, received in Chinese Patent Application No. 201810071627.4, which corresponds with U.S. Appl. No. 15/272,343, 7 pages. |
Office Action, dated Jan. 8, 2019, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 5 pages. |
Intention to Grant, dated May 13, 2019, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 7 pages. |
Decision to Grant, dated Sep. 12, 2019, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 3 pages. |
Patent, dated Oct. 9, 2019, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 3 pages. |
Office Action, dated Oct. 15, 2018, received in U.S. Appl. No. 15/272,345, 31 pages. |
Final Office Action, dated Apr. 2, 2019, received in U.S. Appl. No. 15/272,345, 28 pages. |
Notice of Allowance, dated Apr. 22, 2020, received in U.S. Appl. No. 15/272,345, 12 pages. |
Notice of Acceptance, dated Mar. 2, 2018, received in Australian Patent Application No. 2016304832, which corresponds with U.S. Appl. No. 15/272,345, 3 pages. |
Certificate of Grant, dated Jun. 28, 2018, received in Australian Patent Application No. 2016304832, which corresponds with U.S. Appl. No. 15/272,345, 4 pages. |
Office Action, dated Oct. 22, 2019, received in Chinese Patent Application No. 201680022696.5, which corresponds with U.S. Appl. No. 15/272,345, 7 pages. |
Notice of Allowance, dated Jul. 6, 2020, received in Chinese Patent Application No. 201680022696.5, which corresponds with U.S. Appl. No. 15/272,345, 5 pages. |
Patent, dated Sep. 18, 2020, received in Chinese Patent Application No. 201680022696.5, which corresponds with U.S. Appl. No. 15/272,345, 6 pages. |
Office Action, dated Apr. 20, 2018, received in European Patent Application No. 16756862.5, which corresponds with U.S. Appl. No. 15/272,345, 15 pages. |
Office Action, dated Nov. 13, 2018, received in European Patent Application No. 16756862.5, which corresponds with U.S. Appl. No. 15/272,345, 5 pages. |
Decision to Grant, dated Jan. 31, 2019, received in European Patent Application No. 16756862.5, which corresponds with U.S. Appl. No. 15/272,345, 5 pages. |
Patent, dated Feb. 27, 2019, received in European Patent Application No. 16756862.5, which corresponds with U.S. Appl. No. 15/272,345, 3 pages. |
Patent, dated Feb. 7, 2020, received in Hong Kong Patent Application No. 18101477.0, which corresponds with U.S. Appl. No. 15/272,345, 6 pages. |
Office Action, dated Dec. 4, 2020, received in Japanese Patent Application No. 2019-212493, which corresponds with U.S. Appl. No. 15/272,345, 5 pages. |
Office Action, dated Mar. 7, 2018, received in U.S. Appl. No. 15/482,618, 7 pages. |
Notice of Allowance, dated Aug. 15, 2018, received in U.S. Appl. No. 15/482,618, 7 pages. |
Office Action, dated Apr. 23, 2018, received in U.S. Appl. No. 15/499,691, 29 pages. |
Notice of Allowance, dated Oct. 12, 2018, received in U.S. Appl. No. 15/499,693, 8 pages. |
Office Action, dated May 11, 2020, received in Australian Patent Application No. 2019203776, which corresponds with U.S. Appl. No. 15/499,693, 4 pages. |
Notice of Acceptance, dated Jul. 22, 2020, received in Australian Patent Application No. 2019203776, which corresponds with U.S. Appl. No. 15/499,693, 3 pages. |
Certificate of Grant, dated Nov. 26, 2020, received in Australian Patent Application No. 2019203776, which corresponds with U.S. Appl. No. 15/499,693, 3 pages. |
Office Action, dated Nov. 20, 2020, received in Japanese Patent Application No. 2019-200174, which corresponds with U.S. Appl. No. 15/499,693, 6 pages. |
Office Action, dated Aug. 2, 2019, received in Korean Patent Application No. 2019-7009439, which corresponds with U.S. Appl. No. 15/499,693, 3 pages. |
Notice of Allowance, dated Dec. 27, 2019, received in Korean Patent Application No. 2019-7009439, which corresponds with U.S. Appl. No. 15/499,693, 5 pages. |
Patent, dated Mar. 27, 2020, received in Korean Patent Application No. 2019-7009439, which corresponds with U.S. Appl. No. 15/499,693, 4 pages. |
Office Action, dated Aug. 30, 2017, received in U.S. Appl. No. 15/655,749, 22 pages. |
Final Office Action, dated May 10, 2018, received in U.S. Appl. No. 15/655,749, 19 pages. |
Office Action, dated Jan. 24, 2019, received in U.S. Appl. No. 15/655,749, 25 pages. |
Final Office Action, dated Jul. 1, 2019, received in U.S. Appl. No. 15/655,749, 24 pages. |
Notice of Allowance, dated Feb. 20, 2020, received in U.S. Appl. No. 15/655,749, 10 pages. |
Office Action, dated Feb. 3, 2020, received in Chinese Patent Application No. 201710331254.5, which corresponds with U.S. Appl. No. 15/655,749, 8 pages. |
Office Action, dated Mar. 22, 2021, received in Chinese Patent Application No. 201710331254.5, which corresponds with U.S. Appl. No. 15/655,749, 4 pages. |
Notice of Allowance, dated Apr. 18, 2019, received in Korean Patent Application No. 2017-7034248, which corresponds with U.S. Appl. No. 15/655,749, 5 pages. |
Patent, dated Jul. 3, 2019, received in Korean Patent Application No. 2017-7034248, which corresponds with U.S. Appl. No. 15/655,749, 5 pages. |
Office Action, dated Aug. 1, 2019, received in U.S. Appl. No. 15/785,372, 22 pages. |
Final Office Action, dated Feb. 5, 2020, received in U.S. Appl. No. 15/785,372, 26 pages. |
Office Action, dated Jul. 23, 2020, received in U.S. Appl. No. 15/785,372, 23 pages. |
Final Office Action, dated Nov. 18, 2020, received in U.S. Appl. No. 15/785,372, 27 pages. |
Office Action, dated Oct. 31, 2017, received in U.S. Appl. No. 15/723,069, 7 pages. |
Notice of Allowance, dated Dec. 21, 2017, received in U.S. Appl. No. 15/723,069, 7 pages. |
Office Action, dated Apr. 11, 2019, received in U.S. Appl. No. 15/889,115, 9 pages. |
Final Office Action, dated Oct. 28, 2019, received in U.S. Appl. No. 15/889,115, 12 pages. |
Notice of Allowance, dated May 19, 2020, received in U.S. Appl. No. 15/889,115, 9 pages. |
Office Action, dated Jul. 25, 2019, received in U.S. Appl. No. 15/979,347, 14 pages. |
Final Office Action, dated Feb. 27, 2020, received in U.S. Appl. No. 15/979,347, 19 pages. |
Office Action, dated Jul. 14, 2020, received in U.S. Appl. No. 15/979,347, 10 pages. |
Final Office Action, dated Jan. 25, 2021, received in U.S. Appl. No. 15/979,347, 12 pages. |
Office Action, dated Sep. 25, 2020, received in U.S. Appl. No. 15/994,843, 5 pages. |
Notice of Allowance, dated Jan. 22, 2021, received in U.S. Appl. No. 15/994,843, 8 pages. |
Office Action, dated Nov. 25, 2019, received in U.S. Appl. No. 16/049,725, 9 pages. |
Notice of Allowance, dated May 14, 2020, received in U.S. Appl. No. 16/049,725, 9 pages. |
Office Action, dated May 31, 2019, received in Australian Patent Application No. 2018253539, which corresponds with U.S. Appl. No. 16/049,725, 3 pages. |
Notice of Acceptance, dated Apr. 2, 2020, received in Australian Patent Application No. 2018253539, which corresponds with U.S. Appl. No. 16/049,725, 3 pages. |
Certificate of Grant, dated Aug. 13, 2020, received in Australian Patent Application No. 2018253539, which corresponds with U.S. Appl. No. 16/049,725, 3 pages. |
Notice of Allowance, dated Oct. 10, 2019, received in U.S. Appl. No. 16/102,409, 9 pages. |
Office Action, dated Nov. 29, 2019, received in U.S. Appl. No. 16/136,163, 9 pages. |
Final Office Action, dated Jun. 9, 2020, received in U.S. Appl. No. 16/136,163, 10 pages. |
Office Action, dated Sep. 17, 2020, received in U.S. Appl. No. 16/136,163, 13 pages. |
Office Action, dated Mar. 9, 2020, received in U.S. Appl. No. 16/145,954, 15 pages. |
Office Action, dated Dec. 10, 2020, received in U.S. Appl. No. 16/145,954, 5 pages. |
Office Action, dated Mar. 6, 2020, received in U.S. Appl. No. 16/154,591, 16 pages. |
Final Office Action, dated Oct. 1, 2020, received in U.S. Appl. No. 16/154,591, 19 pages. |
Office Action, dated Mar. 4, 2021, received in U.S. Appl. No. 16/154,591, 20 pages. |
Office Action, dated May 4, 2020, received in Australian Patent Application No. 2019203175, which corresponds with U.S. Appl. No. 16/154,591, 4 pages. |
Office Action, dated Oct. 13, 2020, received in Australian Patent Application No. 2019203175, which corresponds with U.S. Appl. No. 16/154,591, 5 pages. |
Office Action, dated Dec. 2, 2019, received in Japanese Patent Application No. 2018-202048, which corresponds with U.S. Appl. No. 16/154,591, 6 pages. |
Notice of Allowance, dated Jun. 1, 2020, received in Japanese Patent Application No. 2018-202048, which corresponds with U.S. Appl. No. 16/154,591, 3 pages. |
Patent, dated Jun. 25, 2020, received in Japanese Patent Application No. 2018-202048, which corresponds with U.S. Appl. No. 16/154,591, 4 pages. |
Office Action, dated Aug. 20, 2019, received in Korean Patent Application No. 2019-7019946, which corresponds with U.S. Appl. No. 16/154,591, 6 pages. |
Office Action, dated Feb. 27, 2020, received in Korean Patent Application No. 2019-7019946, which corresponds with U.S. Appl. No. 16/154,591, 5 pages. |
Office Action, dated Nov. 25, 2019, received in U.S. Appl. No. 16/174,170, 31 pages. |
Final Office Action, dated Mar. 19, 2020, received in U.S. Appl. No. 16/174,170, 25 pages. |
Notice of Allowance, dated Jun. 18, 2020, received in U.S. Appl. No. 16/174,170, 19 pages. |
Notice of Allowance, dated Aug. 26, 2020, received in U.S. Appl. No. 16/240,669, 18 pages. |
Office Action, dated Oct. 30, 2020, received in U.S. Appl. No. 16/230,707, 20 pages. |
Notice of Allowance, dated Feb. 18, 2021, received in U.S. Appl. No. 16/230,707, 9 pages. |
Office Action, dated Aug. 10, 2020, received in U.S. Appl. No. 16/240,672, 13 pages. |
Final Office Action, dated Nov. 27, 2020, received in U.S. Appl. No. 16/240,672, 12 pages. |
Office Action, dated Sep. 24, 2020, received in Australian Patent Application No. 2019268116, which corresponds with U.S. Appl. No. 16/240,672, 4 pages. |
Office Action, dated Jan. 28, 2021, received in Australian Patent Application No. 2019268116, which corresponds with U.S. Appl. No. 16/240,672, 4 pages. |
Notice of Allowance, dated May 22, 2020, received in Japanese Patent Application No. 2019-027634, which corresponds with U.S. Appl. No. 16/240,672, 5 pages. |
Patent, dated Jun. 23, 2020, received in Japanese Patent Application No. 2019-027634, which corresponds with U.S. Appl. No. 16/240,672, 4 pages. |
Office Action, dated May 22, 2019, received in U.S. Appl. No. 16/230,743, 7 pages. |
Notice of Allowance, dated Sep. 11, 2019, received in U.S. Appl. No. 16/230,743, 5 pages. |
Office Action, dated Mar. 6, 2020, received in U.S. Appl. No. 16/243,834, 19 pages. |
Notice of Allowance, dated Sep. 24, 2020, received in U.S. Appl. No. 16/243,834, 10 pages. |
Office Action, dated Dec. 18, 2019, received in Australian Patent Application No. 2018282409, which corresponds with U.S. Appl. No. 16/243,834, 3 pages. |
Office Action, dated Sep. 18, 2020, received in Australian Patent Application No. 2018282409, which corresponds with U.S. Appl. No. 16/243,834, 3 pages. |
Notice of Acceptance, dated Oct. 21, 2020, received in Australian Patent Application No. 2018282409, which corresponds with U.S. Appl. No. 16/243,834, 3 pages. |
Certificate of Grant, dated Feb. 18, 2021, received in Australian Patent Application No. 2018282409, which corresponds with U.S. Appl. No. 16/243,834, 3 pages. |
Office Action, dated Aug. 7, 2020, received in Japanese Patent Application No. 2019-058800, which corresponds with U.S. Appl. No. 16/243,834, 8 pages. |
Office Action, dated Feb. 12, 2021, received in Japanese Patent Application No. 2019-058800, which corresponds with U.S. Appl. No. 16/243,834, 2 pages. |
Office Action, dated Jul. 5, 2019, received in Korean Patent Application No. 2018-7037896, which corresponds with U.S. Appl. No. 16/243,834, 2 pages. |
Notice of Allowance, dated Dec. 23, 2019, received in Korean Patent Application No. 2018-7037896, which corresponds with U.S. Appl. No. 16/243,834, 6 pages. |
Patent, dated Mar. 13, 2020, received in Korean Patent Application No. 2018-7037896, which corresponds with U.S. Appl. No. 16/243,834, 7 pages. |
Notice of Allowance, dated Nov. 20, 2020, received in U.S. Appl. No. 16/262,784, 8 pages. |
Office Action, dated Feb. 25, 2021, received in Australian Patent Application No. 2020201648, which corresponds with U.S. Appl. No. 16/262,784, 3 pages. |
Office Action, dated Feb. 5, 2021, received in U.S. Appl. No. 16/262,800, 53 pages. |
Office Action, dated Sep. 15, 2020, received in European Patent Application No. 19194439.6, which corresponds with U.S. Appl. No. 16/262,800, 6 pages. |
Office Action, dated Mar. 25, 2021, received in European Patent Application No. 19194439.6, which corresponds with U.S. Appl. No. 16/262,800, 5 pages. |
Notice of Allowance, dated Apr. 19, 2019, received in U.S. Appl. No. 16/252,478, 11 pages. |
Office Action, dated Jun. 11, 2020, received in Australian Patent Application No. 2019257437, which corresponds with U.S. Appl. No. 16/252,478, 3 pages. |
Notice of Allowance, dated Sep. 15, 2020, received in Australian Patent Application No. 2019257437, which corresponds with U.S. Appl. No. 16/252,478, 3 pages. |
Notice of Allowance, dated Dec. 13, 2019, received in Korean Patent Application No. 2019-7033444, which corresponds with U.S. Appl. No. 16/252,478, 6 pages. |
Patent, dated Mar. 12, 2020, received in Korean Patent Application No. 2019-7033444, which corresponds with U.S. Appl. No. 16/252,478, 6 pages. |
Office Action, dated Aug. 27, 2020, received in U.S. Appl. No. 16/241,883, 11 pages. |
Notice of Allowance, dated Sep. 28, 2020, received in U.S. Appl. No. 16/241,883, 10 pages. |
Office Action, dated Jul. 15, 2019, received in U.S. Appl. No. 16/258,394, 8 pages. |
Notice of Allowance, dated Nov. 6, 2019, received in U.S. Appl. No. 16/258,394, 8 pages. |
Office Action, dated May 14, 2020, received in U.S. Appl. No. 16/354,035, 16 pages. |
Notice of Allowance, dated Aug. 25, 2020, received in U.S. Appl. No. 16/354,035, 14 pages. |
Office Action, dated Oct. 11, 2019, received in Australian Patent Application No. 2019202417, which corresponds with U.S. Appl. No. 16/896,141, 4 pages. |
Notice of Allowance, dated Jul. 6, 2020, received in Australian Patent Application No. 2019202417, which corresponds with U.S. Appl. No. 16/896,141, 3 pages. |
Certificate of Grant, dated Nov. 5, 2020, received in Australian Patent Application No. 2019202417, which corresponds with U.S. Appl. No. 16/896,141, 4 pages. |
Office Action, dated Aug. 21, 2020, received in Japanese Patent Application No. 2019-047319, which corresponds with U.S. Appl. No. 16/896,141, 6 pages. |
Office Action, dated Aug. 30, 2019, received in Korean Patent Application No. 2019-7019100, 2 pages. |
Notice of Allowance, dated Nov. 1, 2019, received in Korean Patent Application No. 2019-7019100, 5 pages. |
Patent, dated Jan. 31, 2020, received in Korean Patent Application No. 2019-7019100, 5 pages. |
Office Action, dated May 14, 2020, received in U.S. Appl. No. 16/509,438, 16 pages. |
Notice of Allowance, dated Jan. 6, 2021, received in U.S. Appl. No. 16/509,438, 5 pages. |
Notice of Allowance, dated May 20, 2020, received in U.S. Appl. No. 16/534,214, 16 pages. |
Office Action, dated Oct. 7, 2020, received in U.S. Appl. No. 16/563,505, 20 pages. |
Office Action, dated Oct. 19, 2020, received in U.S. Appl. No. 16/685,773, 15 pages. |
Final Office Action, dated Feb. 2, 2021, received in U.S. Appl. No. 16/685,773, 20 pages. |
Office Action, dated Oct. 30, 2020, received in U.S. Appl. No. 16/824,490, 15 pages. |
Notice of Allowance, dated Feb. 24, 2021, received in U.S. Appl. No. 16/824,490, 8 pages. |
Office Action, dated Sep. 21, 2020, received in U.S. Appl. No. 16/803,904, 5 pages. |
Notice of Allowance, dated Jan. 6, 2021, received in U.S. Appl. No. 16/803,904, 9 pages. |
Notice of Allowance, dated May 4, 2020, received in Korean Patent Application No. 2019-7033444, 5 pages. |
Patent, dated Jun. 3, 2020, received in Korean Patent Application No. 2019-7033444, 7 pages. |
Office Action, dated Feb. 23, 2021, received in Korean Patent Application No. 2020-7031330, which corresponds with U.S. Appl. No. 15/272,398, 6 pages. |
International Search Report and Written Opinion, dated May 26, 2014, received in International Application No. PCT/US2013/040053, which corresponds with U.S. Appl. No. 14/535,671, 32 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040053, which corresponds with U.S. Appl. No. 14/535,671, 26 pages. |
International Search Report and Written Opinion, dated Apr. 7, 2014, received in International Application No. PCT/US2013/069472, which corresponds with U.S. Appl. No. 14/608,895, 24 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069472, which corresponds with U.S. Appl. No. 14/608,895, 18 pages. |
International Search Report and Written Opinion, dated Aug. 7, 2013, received in International Application No. PCT/US2013/040054, which corresponds with U.S. Appl. No. 14/536,235, 12 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040054, which corresponds with U.S. Appl. No. 14/536,235, 11 pages. |
International Search Report and Written Opinion, dated Aug. 7, 2013, received in International Application No. PCT/US2013/040056, which corresponds with U.S. Appl. No. 14/536,367, 12 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040056, which corresponds with U.S. Appl. No. 14/536,367, 11 pages. |
Extended European Search Report, dated Nov. 6, 2015, received in European Patent Application No. 15183980.0, which corresponds with U.S. Appl. No. 14/536,426, 7 pages. |
Extended European Search Report, dated Jul. 30, 2018, received in European Patent Application No. 18180503.7, which corresponds with U.S. Appl. No. 14/536,426, 7 pages. |
International Search Report and Written Opinion, dated Aug. 6, 2013, received in International Application No. PCT/US2013/040058, which corresponds with U.S. Appl. No. 14/536,426, 12 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040058, which corresponds with U.S. Appl. No. 14/536,426, 11 pages. |
International Search Report and Written Opinion, dated Feb. 5, 2014, received in International Application No. PCT/US2013/040061, which corresponds with U.S. Appl. No. 14/536,464, 30 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040061, which corresponds with U.S. Appl. No. 14/536,464, 26 pages. |
International Search Report and Written Opinion, dated May 8, 2014, received in International Application No. PCT/US2013/040067, which corresponds with U.S. Appl. No. 14/536,644, 45 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040067, which corresponds with U.S. Appl. No. 14/536,644, 36 pages. |
International Search Report and Written Opinion, dated Mar. 12, 2014, received in International Application No. PCT/US2013/069479, which corresponds with U.S. Appl. No. 14/608,926, 14 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069479, which corresponds with U.S. Appl. No. 14/608,926, 11 pages. |
International Search Report and Written Opinion, dated Aug. 7, 2013, received in International Application No. PCT/US2013/040070, which corresponds with U.S. Appl. No. 14/535,646, 12 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040070, which corresponds with U.S. Appl. No. 14/535,646, 10 pages. |
International Search Report and Written Opinion, dated Apr. 7, 2014, received in International Application No. PCT/US2013/040072, which corresponds with U.S. Appl. No. 14/536,141, 38 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040072, which corresponds with U.S. Appl. No. 14/536,141, 32 pages. |
Extended European Search Report, dated Dec. 5, 2018, received in European Patent Application No. 18194127.9, which corresponds with U.S. Appl. No. 14/608,942, 8 pages. |
International Search Report and Written Opinion, dated Apr. 7, 2014, received in International Application No. PCT/US2013/069483, which corresponds with U.S. Appl. No. 14/608,942, 18 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Application No. PCT/US2013/069483, which corresponds with U.S. Appl. No. 14/608,942, 13 pages. |
International Search Report and Written Opinion, dated Mar. 3, 2014, received in International Application No. PCT/US2013/040087, which corresponds with U.S. Appl. No. 14/536,166, 35 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040087, which corresponds with U.S. Appl. No. 14/536,166, 29 pages. |
International Search Report and Written Opinion, dated Aug. 7, 2013, received in International Application No. PCT/US2013/040093, which corresponds with U.S. Appl. No. 14/536,203, 11 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040093, which corresponds with U.S. Appl. No. 14/536,203, 9 pages. |
International Search Report and Written Opinion, dated Jul. 9, 2014, received in International Application No. PCT/US2013/069484, which corresponds with U.S. Appl. No. 14/608,965, 17 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069484, which corresponds with U.S. Appl. No. 14/608,965, 12 pages. |
International Search Report and Written Opinion, dated Feb. 5, 2014, received in International Application No. PCT/US2013/040098, which corresponds with U.S. Appl. No. 14/536,247, 35 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040098, which corresponds with U.S. Appl. No. 14/536,247, 27 pages. |
Extended European Search Report, dated Oct. 7, 2016, received in European Patent Application No. 16177863.4, which corresponds with U.S. Appl. No. 14/536,267, 12 pages. |
Extended European Search Report, dated Oct. 30, 2018, received in European Patent Application No. 18183789.9, which corresponds with U.S. Appl. No. 14/536,267, 11 pages. |
International Search Report and Written Opinion, dated Jan. 27, 2014, received in International Application No. PCT/US2013/040101, which corresponds with U.S. Appl. No. 14/536,267, 30 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040101, which corresponds with U.S. Appl. No. 14/536,267, 24 pages. |
Extended European Search Report, dated Nov. 24, 2017, received in European Patent Application No. 17186744.3, which corresponds with U.S. Appl. No. 14/536,291, 10 pages. |
International Search Report and Written Opinion, dated Jan. 8, 2014, received in International Application No. PCT/US2013/040108, which corresponds with U.S. Appl. No. 14/536,291, 30 pages. |
International Preliminary Report on Patentability, dated Nov. 20, 2014, received in International Application No. PCT/US2013/040108, which corresponds with U.S. Appl. No. 14/536,291, 25 pages. |
International Search Report and Written Opinion, dated Jun. 2, 2014, received in International Application No. PCT/US2013/069486, which corresponds with U.S. Appl. No. 14/608,985, 7 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069486, which corresponds with U.S. Appl. No. 14/608,985, 19 pages. |
International Search Report and Written Opinion, dated Mar. 6, 2014, received in International Application No. PCT/US2013/069489, which corresponds with U.S. Appl. No. 14/609,006, 12 pages. |
International Preliminary Report on Patentability, dated Jun. 30, 2015, received in International Patent Application No. PCT/US2013/069489, which corresponds with U.S. Appl. No. 14/609,006, 10 pages. |
Extended European Search Report, dated Mar. 15, 2017, received in European Patent Application No. 17153418.3, which corresponds with U.S. Appl. No. 14/536,648, 7 pages. |
Search Report, dated Apr. 13, 2017, received in Dutch Patent Application No. 2016452, which corresponds with U.S. Appl. No. 14/864,737, 22 pages. |
Search Report, dated Jun. 22, 2017, received in Dutch Patent Application No. 2016375, which corresponds with U.S. Appl. No. 14/866,981, 17 pages. |
International Search Report and Written Opinion, dated Oct. 14, 2016, received in International Patent Application No. PCT/US2016/020697, which corresponds with U.S. Appl. No. 14/866,981, 21 pages. |
Search Report, dated Jun. 19, 2017, received in Dutch Patent Application No. 2016377, which corresponds with U.S. Appl. No. 14/866,159, 13 pages. |
International Search Report and Written Opinion, dated Apr. 25, 2016, received in International Patent Application No. PCT/US2016/018758, which corresponds with U.S. Appl. No. 14/866,159, 15 pages. |
Extended European Search Report, dated Oct. 17, 2017, received in European Patent Application No. 17184437.6, which corresponds with U.S. Appl. No. 14/868,078, 8 pages. |
Search Report, dated Apr. 13, 2017, received in Dutch Patent Application No. 2016376, which corresponds with U.S. Appl. No. 14/868,078, 15 pages. |
International Search Report and Written Opinion, dated Jul. 21, 2016, received in International Patent Application No. PCT/US2016/019913, which corresponds with U.S. Appl. No. 14/868,078, 16 pages. |
Search Report, dated Apr. 18, 2017, received in Dutch Patent Application No. 2016801, which corresponds with U.S. Appl. No. 14/863,432, 34 pages. |
International Search Report and Written Opinion, dated Oct. 31, 2016, received in International Patent Application No. PCT/US2016/033578, which corresponds with U.S. Appl. No. 14/863,432, 36 pages. |
International Search Report and Written Opinion, dated Nov. 14, 2016, received in International Patent Application No. PCT/US2016/033541, which corresponds with U.S. Appl. No. 14/866,511, 29 pages. |
Extended European Search Report, dated Aug. 17, 2018, received in European Patent Application No. 18175195.9, which corresponds with U.S. Appl. No. 14/869,899, 13 pages. |
International Search Report and Written Opinion, dated Aug. 29, 2016, received in International Patent Application No. PCT/US2016/021400, which corresponds with U.S. Appl. No. 14/869,899, 48 pages. |
International Preliminary Report on Patentability, dated Sep. 12, 2017, received in International Patent Application No. PCT/US2016/021400, which corresponds with U.S. Appl. No. 14/869,899, 39 pages. |
International Search Report and Written Opinion, dated Jan. 12, 2017, received in International Patent Application No. PCT/US2016/046419, which corresponds with U.S. Appl. No. 14/866,992, 23 pages. |
International Search Report and Written Opinion, dated Dec. 15, 2016, received in International Patent Application No. PCT/US2016/046403, which corresponds with U.S. Appl. No. 15/009,661, 17 pages. |
International Search Report and Written Opinion, dated Feb. 27, 2017, received in International Patent Application No. PCT/US2016/046407, which corresponds with U.S. Appl. No. 15/009,688, 30 pages. |
International Preliminary Report on Patentability, dated Feb. 13, 2018, received in International Patent Application No. PCT/US2016/046407, which corresponds with U.S. Appl. No. 15/009,688, 20 pages. |
Search Report, dated Feb. 15, 2018, received in Dutch Patent Application No. 2019215, which corresponds with U.S. Appl. No. 14/864,529, 13 pages. |
Extended European Search Report, dated Nov. 14, 2019, received in European Patent Application No. 19194418.0, which corresponds with U.S. Appl. No. 14/864,580, 8 pages. |
Search Report, dated Feb. 15, 2018, received in Dutch Patent Application No. 2019214, which corresponds with U.S. Appl. No. 14/864,601, 12 pages. |
Extended European Search Report, dated Oct. 10, 2017, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 9 pages. |
Extended European Search Report, dated Jun. 22, 2017, received in European Patent Application No. 16189421.7, which corresponds with U.S. Appl. No. 14/866,987, 7 pages. |
Extended European Search Report, dated Sep. 11, 2017, received in European Patent Application No. 17163309.2, which corresponds with U.S. Appl. No. 14/866,987, 8 pages. |
Extended European Search Report, dated Jun. 8, 2017, received in European Patent Application No. 16189425.8, which corresponds with U.S. Appl. No. 14/866,989, 8 pages. |
Extended European Search Report, dated Aug. 2, 2018, received in European Patent Application No. 18168941.5, which corresponds with U.S. Appl. No. 14/871,236, 11 pages. |
Extended European Search Report, dated Jul. 25, 2017, received in European Patent Application No. 17171972.7, which corresponds with U.S. Appl. No. 14/870,882, 12 pages. |
Extended European Search Report, dated Jul. 25, 2017, received in European Patent Application No. 17172266.3, which corresponds with U.S. Appl. No. 14/871,336, 9 pages. |
Extended European Search Report, dated Dec. 21, 2016, received in European Patent Application No. 16189790.5, which corresponds with U.S. Appl. No. 14/871,462, 8 pages. |
Extended European Search Report, dated Mar. 8, 2019, received in European Patent Application No. 18205283.7, which corresponds with U.S. Appl. No. 15/081,771, 15 pages. |
Extended European Search Report, dated Aug. 24, 2018, received in European Patent Application No. 18171453.6, which corresponds with U.S. Appl. No. 15/136,782, 9 pages. |
International Search Report and Written Opinion, dated Jan. 3, 2017, received in International Patent Application No. PCT/US2016/046214, which corresponds with U.S. Appl. No. 15/231,745, 25 pages. |
Extended European Search Report, dated May 30, 2018, received in European Patent Application No. 18155939.4, which corresponds with U.S. Appl. No. 15/272,327, 8 pages. |
Extended European Search Report, dated Mar. 2, 2018, received in European Patent Application No. 17206374.5, which corresponds with U.S. Appl. No. 15/272,343, 11 pages. |
Extended European Search Report, dated Oct. 6, 2020, received in European Patent Application No. 20188553.0, which corresponds with U.S. Appl. No. 15/499,693, 11 pages. |
Extended European Search Report, dated Oct. 28, 2019, received in European Patent Application No. 19195414.8, which corresponds with U.S. Appl. No. 16/240,672, 6 pages. |
Extended European Search Report, dated Nov. 13, 2019, received in European Patent Application No. 19194439.6, which corresponds with U.S. Appl. No. 16/262,800, 12 pages. |
Extended European Search Report, dated Oct. 9, 2019, received in European Patent Application No. 19181042.3, which corresponds with U.S. Appl. No. 15/272,343, 10 pages. |
Apple, “Final Cut Express 4 User Manual”, https://wsi.li.dl/mBGZWEQ8fh556f/, Jan. 1, 2007, 1, 152 pages. |
Office Action, dated Jun. 24, 2021, received in Chinese Patent Application No. 201810826224.6, which corresponds with U.S. Appl. No. 14/536,426, 3 pages. |
Office Action, dated Jul. 14, 2021, received in Chinese Patent Application No. 201810369259.1, which corresponds with U.S. Appl. No. 14/608,926, 5 pages. |
Office Action, dated Jun. 10, 2021, received in Chinese Patent Application No. 201711425148.X, which corresponds with U.S. Appl. No. 14/536,646, 2 pages. |
Certificate of Grant, dated Apr. 13, 2021, received in Chinese Patent Application No. 201711422092.2, which corresponds with U.S. Appl. No. 14/536,646, 8 pages. |
Office Action, dated Jul. 19, 2021, received in Chinese Patent Application No. 201810332044.2, which corresponds with U.S. Appl. No. 14/536,267, 1 page. |
Patent, dated Apr. 27, 2021, received in Chinese Patent Application No. 2018100116175.X, which corresponds with U.S. Appl. No. 14/536,291, 6 pages. |
Notice of Allowance, dated Mar. 30, 2021, received in Chinese Patent Application No. 201610871595.7, which corresponds with U.S. Appl. No. 14/869,899, 1 page. |
Patent, dated Jun. 4, 2021, received in Chinese Patent Application No. 201610871595.7, which corresponds with U.S. Appl. No. 14/869,899, 7 pages. |
Notice of Allowance, dated Apr. 26, 2021, received in Chinese Patent Application No. 201680041559.6, which corresponds with U.S. Appl. No. 14/866,992, 1 page. |
Patent, dated May 28, 2021, received in Chinese Patent Application No. 201680041559.6, which corresponds with U.S. Appl. No. 14/866,992, 7 pages. |
Office Action, dated Jul. 1, 2021 received in U.S. Appl. No. 15/009,661, 52 pages. |
Patent, dated Apr. 27, 2021, received in Chinese Patent Application No. 201680047125.7, which corresponds with U.S. Appl. No. 15/009,676, 8 pages. |
Notice of allowance, dated Jun. 28, 2021, received in Korean Patent Application No. 2020-7029178, which corresponds with U.S. Appl. No. 14/870,882, 2 pages. |
Office Action, dated Jun. 17, 2021, received in European Patent Application No. 19194418.2, which corresponds with U.S. Appl. No. 14/864,580, 7 pages. |
Patent, dated May 26, 2021, received in European Patent Application No. 17188507.2, which corresponds with U.S. Appl. No. 14/866,361, 3 pages. |
Patent, dated Feb. 5, 2021, received in Hong Kong Patent Application No. 1235878, which corresponds with U.S. Appl. No. 14/866,987, 6 pages. |
Patent, dated Apr. 21, 2021, received in European Patent Application No. 18168941.5, which corresponds with U.S. Appl. No. 14/871,236, 3 pages. |
Patent, dated May 25, 2021, received in Chinese Patent Application No. 201610870912.3, which corresponds with U.S. Appl. No. 14/870,882, 8 pages. |
Office Action, dated Jun. 1, 2021, received in Chinese Patent Application No. 201610871323.7, which corresponds with U.S. Appl. No. 14/871,336, 1 page. |
Notice of Allowance, dated May 26, 2021, received in U.S. Appl. No. 14/867,892, 7 pages. |
Notice of Allowance, dated Jul. 13, 2021, received in U.S. Appl. No. 14/867,892, 8 pages. |
Office Action, dated May 14, 2021, received in European Patent Application No. 16711725.8, which corresponds with U.S. Appl. No. 14/867,990, 7 pages. |
Notice of Allowance, dated Apr. 20, 2021, received in Chinese Patent Application No. 201680046985.9, which corresponds with U.S. Appl. No. 15/009,668, 1 page. |
Patent, dated Mar. 19, 2021, received in Chinese Patent Application No. 201810151593.X, which corresponds with U.S. Appl. No. 15/272,327, 6 pages. |
Notice of Allowance, dated Jul. 16, 2021, received in Japanese Patent Application No. 2019-200174, which corresponds with U.S. Appl. No. 15/499,693, 2 pages. |
Notice of Allowance, dated May 27, 2021, received in Chinese Patent Application No. 201710331254.5, which corresponds with U.S. Appl. No. 15/655,749, 1 page. |
Patent, dated Jun. 25, 2021, received in Chinese Patent Application No. 201710331254.5, which corresponds with U.S. Appl. No. 15/655,749, 7 pages. |
Notice of Allowance, dated Jul. 14, 2021, received in U.S. Appl. No. 15/785,372, 11 pages. |
Final Office Action, dated May 20, 2021, received in U.S. Appl. No. 16/136,163, 13 pages. |
Office Action, dated Mar. 29, 2021, received in Korean Patent Application No. 2019-7019946, which corresponds with U.S. Appl. No. 16/154,591, 6 pages. |
Office Action, dated May 17, 2021, received in U.S. Appl. No. 16/240,672, 14 pages. |
Office Action, dated Apr. 21, 2021, received in European Patent Application No. 19195414.8, which corresponds with U.S. Appl. No. 16/240,672, 7 pages. |
Final Office Action, dated Jun. 4, 2021, received in U.S. Appl. No. 16/262,800, 65 pages. |
Office Action, dated Jun. 9, 2021, received in U.S. Appl. No. 16/896,141, 21 pages. |
Office Action, dated Apr. 9, 2021, received in Japanese Patent Application No. 2019-047319, which corresponds with U.S. Appl. No. 16/896,141, 2 pages. |
Notice of Allowance, dated Apr. 29, 2021, received in U.S. U.S. Appl. No. 16/509,438, 9 pages. |
Final Office Action, dated May 12, 2021, received in U.S. Appl. No. 16/563,505, 19 pages. |
Office Action, dated May 26, 2021, received in U.S. Appl. No. 16/988,509, 25 pages. |
Notice of Allowance, dated Feb. 7, 2022, received in U.S. Appl. No. 16/988,509, 16 pages. |
Patent, dated May 27, 2022, received in Chinese Patent Application No. 201810332044.2, which corresponds with U.S. Appl. No. 14/536,267, 6 pages. |
Patent, dated May 19, 2022, received in Australian Patent Application. No. 2020267298, which corresponds with U.S. Appl. No. 16/258,394, 4pages. |
Office Action, dated May 17, 2022, received in Korean Patent Application No. 2020-7008888, 2 pages. |
Patent, dated May 19, 2022, received in Australian Patent Application No. 2020244406, which corresponds with U.S. Appl. No. 17/003,869, 3 pages. |
Office Action, dated May 23, 2022, received in Korean Patent Application No. 2022-7015718, 2 pages. |
Office Action, dated Aug. 23, 2022, received in European Patent Application No. 19194418.0, which corresponds with U.S. Appl. No. 14/864,580, 6 pages. |
Notice of Allowance, dated Aug. 23, 2022, received in Australian Patent Application No. 2020257134, 2 pages. |
Patent, dated Aug. 10, 2022, received in Korean Patent Application No. 2022-7015718, 6 pages. |
Office Action, dated Aug. 19, 2022, received in U.S. Appl. No. 17/103,899 24 pages. |
Prior Publication Data

Number | Date | Country
---|---|---
20210326039 A1 | Oct 2021 | US
Provisional Applications

Number | Date | Country
---|---|---
62215720 | Sep 2015 | US
62213593 | Sep 2015 | US
62172162 | Jun 2015 | US
62135619 | Mar 2015 | US
Related Parent/Child Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | 16824490 | Mar 2020 | US
Child | 17362852 | | US
Parent | 16258394 | Jan 2019 | US
Child | 16824490 | | US
Parent | 15499693 | Apr 2017 | US
Child | 16258394 | | US
Parent | 14866361 | Sep 2015 | US
Child | 15499693 | | US
Parent | 14864737 | Sep 2015 | US
Child | 14866361 | | US