The present disclosure relates to portable electronic devices, including but not limited to portable electronic devices having touch-sensitive displays, and to the control of such devices.
Electronic devices, including portable electronic devices, have gained widespread use and may provide a variety of functions including, for example, telephonic, electronic messaging and other personal information manager (PIM) application functions. Portable electronic devices include, for example, several types of mobile stations such as simple cellular telephones, smart telephones, wireless personal digital assistants (PDAs), and laptop computers with wireless 802.11 or Bluetooth capabilities.
Portable electronic devices such as PDAs or smart telephones are generally intended for handheld use and ease of portability. Smaller devices are generally desirable for portability. A touch-sensitive display, also known as a touchscreen display, is particularly useful on handheld devices, which are small and have limited space for user input and output. The information displayed on the touch-sensitive displays may be modified depending on the functions and operations being performed. With continued demand for decreased size of portable electronic devices, touch-sensitive displays continue to decrease in size.
Improvements in electronic devices with touch-sensitive displays are desirable.
The following describes an apparatus for and method of scrolling through information displayed on a touch-sensitive display of a portable electronic device. A gesture on the touch-sensitive display is detected, and an origin and a direction of the gesture are determined. The information is scrolled in a mode that depends on the origin of the gesture.
For simplicity and clarity of illustration, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. Numerous details are set forth to provide an understanding of the embodiments described herein. The embodiments may be practiced without these details. In other instances, well-known methods, procedures, and components have not been described in detail to avoid obscuring the embodiments described. The description is not to be considered as limited to the scope of the embodiments described herein.
The disclosure generally relates to an electronic device, which is a portable electronic device in the embodiments described herein. Examples of portable electronic devices include mobile, or handheld, wireless communication devices such as pagers, cellular phones, cellular smart-phones, wireless organizers, personal digital assistants, wirelessly enabled notebook computers, and so forth. The portable electronic device may also be a portable electronic device without wireless communication capabilities, such as a handheld electronic game device, digital photograph album, digital camera, or other device.
A block diagram of an example of a portable electronic device 100 is shown in
The processor 102 interacts with other components, such as Random Access Memory (RAM) 108, memory 110, a display 112 with a touch-sensitive overlay 114 operably connected to an electronic controller 116 that together comprise a touch-sensitive display 118, one or more actuators 120, one or more force sensors 122, an auxiliary input/output (I/O) subsystem 124, a data port 126, a speaker 128, a microphone 130, short-range communications 132, and other device subsystems 134. Interaction with a graphical user interface is performed through the touch-sensitive overlay 114. The processor 102 interacts with the touch-sensitive overlay 114 via the electronic controller 116. Information, such as text, characters, symbols, images, icons, and other items that may be displayed or rendered on a portable electronic device, is displayed on the touch-sensitive display 118 via the processor 102. The processor 102 may interact with an accelerometer 136 that may be utilized to detect direction of gravitational forces or gravity-induced reaction forces.
To identify a subscriber for network access, the portable electronic device 100 uses a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 for communication with a network, such as the wireless network 150. Alternatively, user identification information may be programmed into memory 110.
The portable electronic device 100 includes an operating system 146 and software programs or components 148 that are executed by the processor 102 and are typically stored in a persistent, updatable store such as the memory 110. Additional applications or programs may be loaded onto the portable electronic device 100 through the wireless network 150, the auxiliary I/O subsystem 124, the data port 126, the short-range communications subsystem 132, or any other suitable subsystem 134.
A received signal such as a text message, an e-mail message, or web page download is processed by the communication subsystem 104 and input to the processor 102. The processor 102 processes the received signal for output to the display 112 and/or to the auxiliary I/O subsystem 124. A subscriber may generate data items, for example e-mail messages, which may be transmitted over the wireless network 150 through the communication subsystem 104. For voice communications, the overall operation of the portable electronic device 100 is similar. The speaker 128 outputs audible information converted from electrical signals, and the microphone 130 converts audible information into electrical signals for processing.
The touch-sensitive display 118 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW), strain gauge, optical imaging, dispersive signal technology, or acoustic pulse recognition touch-sensitive display, as known in the art. A capacitive touch-sensitive display includes a capacitive touch-sensitive overlay 114. The overlay 114 may be an assembly of multiple layers in a stack including, for example, a substrate, a ground shield layer, a barrier layer, one or more capacitive touch sensor layers separated by a substrate or other barrier, and a cover. The capacitive touch sensor layers may be any suitable material, such as patterned indium tin oxide (ITO).
One or more touches, also known as touch contacts or touch events, may be detected by the touch-sensitive display 118. The processor 102 may determine attributes of the touch, including a location of a touch. Touch location data may include an area of contact or a single point of contact, such as a point at or near a center of the area of contact. The location of a detected touch may include x and y components, e.g., horizontal and vertical components, respectively, with respect to one's view of the touch-sensitive display 118. For example, the x location component may be determined by a signal generated from one touch sensor, and the y location component may be determined by a signal generated from another touch sensor. A signal is provided to the controller 116 in response to detection of a touch. A touch may be detected from any suitable object, such as a finger, thumb, appendage, or other items, for example, a stylus, pen, or other pointer, depending on the nature of the touch-sensitive display 118. Multiple simultaneous touches may be detected.
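For illustration only, the following is a minimal sketch of reducing a detected area of contact to a single point of contact at or near its center, with separate x and y components as described above. The `TouchSample` structure and the function name are assumptions introduced for this sketch and do not appear in the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TouchSample:
    x: float  # horizontal component, e.g., derived from one touch sensor
    y: float  # vertical component, e.g., derived from another touch sensor

def touch_location(contact_area: List[TouchSample]) -> Tuple[float, float]:
    """Reduce a non-empty area of contact to a single point at or near
    the center of the area, as one possible touch location."""
    n = len(contact_area)
    return (sum(s.x for s in contact_area) / n,
            sum(s.y for s in contact_area) / n)
```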
The actuator(s) 120 may be depressed by applying sufficient force to the touch-sensitive display 118 to overcome the actuation force of the actuator 120. The actuator 120 may be actuated by pressing anywhere on the touch-sensitive display 118. The actuator 120 may provide input to the processor 102 when actuated. Actuation of the actuator 120 may result in provision of tactile feedback.
A mechanical dome switch actuator may be utilized. In this example, tactile feedback is provided when the dome collapses due to imparted force and when the dome returns to the rest position after release of the switch.
Alternatively, the actuator 120 may comprise one or more piezoelectric (piezo) actuators that provide tactile feedback for the touch-sensitive display 118. Contraction of the piezo actuator(s) applies a spring-like force, for example, opposing a force externally applied to the touch-sensitive display 118. Each piezo actuator includes a piezoelectric device, such as a piezoelectric disk, adhered to a substrate such as a metal substrate. The substrate bends when the piezoelectric device contracts due to a buildup of charge/voltage at the piezoelectric device or in response to a force, such as an external force applied to the touch-sensitive display 118. The charge/voltage may be adjusted by varying the applied voltage or current, thereby controlling the force applied by the piezo actuators. The charge/voltage at the piezo actuator may be removed by a controlled discharge current that causes the piezoelectric device to expand, thereby releasing the force applied by the piezo actuators. The charge/voltage may advantageously be removed over a relatively short period of time to provide tactile feedback to the user. Absent an external force and absent a charge/voltage at the piezo actuator, the piezo actuator may be slightly bent due to a mechanical preload.
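As an illustrative sketch only, the charge-and-discharge sequence described above might be driven as follows. The `set_drive_voltage` and `set_discharge_current` hardware interfaces and all numeric values are hypothetical assumptions, since the disclosure does not specify a programming interface.

```python
import time

def click_feedback(set_drive_voltage, set_discharge_current,
                   drive_volts=10.0, discharge_ma=5.0, hold_s=0.01):
    """Charge the piezo so it contracts and applies force, then remove the
    charge over a short period so the device expands and the applied force
    decreases, producing tactile feedback."""
    set_drive_voltage(drive_volts)       # build up charge/voltage -> contraction
    time.sleep(hold_s)                   # hold the force briefly
    set_drive_voltage(0.0)               # stop driving
    set_discharge_current(discharge_ma)  # controlled discharge -> expansion
```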
The touch-sensitive display 118 is configured to display information from an application, such as a web browser, contacts, email, calendar, music player, spreadsheet, word processing, operating system interface, and so forth, in a display area. A virtual keyboard may be displayed in an input area, for example, below the display area in the orientation of the portable electronic device 100, and includes keys for entry of alphanumeric characters, punctuation, or symbols.
The touch-sensitive display 118 is also configured to detect a gesture. A gesture, such as a swipe (also known as a flick), is a type of touch that begins at an origin point and continues to a finish point while touch contact is maintained. A swipe may be long or short in distance and/or duration. Two points of the swipe are utilized to determine a vector that describes a direction of the swipe. The direction may be referenced with respect to the touch-sensitive display 118, the orientation of the information displayed on the touch-sensitive display 118, or another reference. For the purposes of providing a reference, “horizontal” as utilized herein is substantially left-to-right or right-to-left relative to the orientation of the displayed information, and “vertical” as utilized herein is substantially upward or downward relative to the orientation of the displayed information. The origin and the finish point of the swipe may optionally be utilized to determine the magnitude or distance of the swipe. The duration of the swipe may be determined from the times at which the origin and the finish point of the swipe are detected. The controller 116 and/or the processor 102 determine the direction, magnitude, and/or duration of the swipe.
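A minimal sketch of deriving a swipe's direction, magnitude, and duration from its origin and finish points is shown below; the `TouchPoint` fields and the tie-breaking rule between horizontal and vertical are assumptions made for this sketch.

```python
import math
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float  # horizontal component
    y: float  # vertical component
    t: float  # time of detection, in seconds

def swipe_attributes(origin: TouchPoint, finish: TouchPoint):
    """Derive the vector between two points of the swipe."""
    dx, dy = finish.x - origin.x, finish.y - origin.y
    magnitude = math.hypot(dx, dy)   # optional distance of the swipe
    duration = finish.t - origin.t   # time between origin and finish
    # A swipe that is substantially left-to-right or right-to-left is
    # horizontal; one that is substantially upward or downward is vertical.
    direction = "horizontal" if abs(dx) >= abs(dy) else "vertical"
    return direction, magnitude, duration
```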
When a gesture such as a swipe is detected and associated with the display area of the touch-sensitive display 118, page scrolling within the information occurs. Page scrolling is a mode of scrolling in which the displayed information may be advanced or reversed. The direction of page scrolling may be based on the direction of the swipe. When a swipe is associated with the input area of the touch-sensitive display 118, cursor scrolling of the information occurs. Cursor scrolling is a mode of scrolling in which a cursor is rendered on the touch-sensitive display 118 and may be advanced or reversed through the displayed information, depending on the direction of the swipe, while the cursor remains displayed. Alternatively, cursor scrolling may be utilized when a gesture is associated with the display area, and page scrolling may be utilized when a gesture is associated with the input area.
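A sketch of the mode selection described above follows, assuming the input area lies below the display area in the current orientation; the boundary representation is an assumption made for illustration.

```python
def select_scroll_mode(origin_y: float, display_area_bottom: float) -> str:
    """Return the scrolling mode based on the area associated with the
    origin of the gesture: page scrolling for the display area, cursor
    scrolling for the input area. The alternative described above would
    simply swap the two modes."""
    return "page" if origin_y < display_area_bottom else "cursor"
```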
Because a touch-sensitive display 118 on a portable electronic device 100 is typically relatively small, the amount of information displayed at one time is typically much less than the amount that may be displayed, for example, on a computer monitor or other larger device. Depending on the screen size and the memory capability of the device controlling the display, the information from an application is often more than fits on a screen or window at one time, and scrolling techniques make the remainder of the information available for display.
The information may comprise, for example, a webpage, electronic messaging or mail text, contact details, calendar event details, spreadsheet data, or text for word processing, to name a few. For example, when entering calendar event details for scheduling a calendar event, the calendar application may display 10 lines of information at a time while displaying a virtual keyboard for entry of data in fields of the calendar event. The calendar event, however, may include 50 lines of information. Typically, a user may advance or reverse through the information by scrolling using a control such as a button or menu option. The use of a scrolling mode that is determined based on an association of the gesture, such as the gesture's origin, finish point, or other attribute, rather than by selection of a button or use of a menu, facilitates quicker, more seamless navigation and interaction. Page scrolling or cursor scrolling may be engaged or utilized immediately at any time. Without the need to find and press a button or to enter a menu, the process of navigating to view, add, delete, and edit data is faster.
An example of a touch-sensitive display 118 before and after scrolling in a first mode is shown in
The information includes, for example, email header fields such as a “To” field, a “Cc” field, a “Subject” field and a body field. Each of the fields of the email may be edited during composition of the email and a cursor 206 is shown rendered in the information. The cursor 206 indicates, for example, the position, within the information, at which additions, deletions, or edits may be made. The user may scroll through the email in the page scrolling mode to view or edit any of the fields of the email. The origin 208 is shown in
Optionally, the length of the gesture, either by distance or time duration, may be utilized to determine what part of the information to display or to determine how far to advance or reverse the information. Two, three, or more levels of distinction may be utilized. For example, a two-level system divides gestures into short gestures and long gestures, wherein one or more thresholds are utilized to determine whether a gesture is considered long or short. A short gesture advances or reverses the information by an amount, such as shown in
A long gesture may be utilized to jump to the end or the beginning of the information or may be interpreted as advancing or reversing the information by a large amount. The amount of scrolling may vary depending on the amount of information.
A three-level system divides gestures into short gestures, medium gestures, and long gestures, wherein two or more thresholds are utilized to determine whether a gesture is considered short, medium, or long. For example, a short gesture may be interpreted as advancing or reversing by an amount, a medium gesture may be interpreted as advancing or reversing by a greater amount, and a long gesture may be interpreted as advancing or reversing to the end or start of the information. The thresholds for the length may be based on dimensions of the touch-sensitive display 118. Alternatively, a long gesture may be interpreted as one that begins on the touch-sensitive display 118 and continues off its edge, whereas short and medium gestures are distinguished by a threshold and both originate and end on the touch-sensitive display 118.
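By way of a sketch, a three-level classification might use two thresholds derived from the display dimensions, with a gesture that continues off the edge treated as long under the alternative described above. The specific fractions are assumptions, not values given in the disclosure.

```python
def classify_gesture(length: float, display_height: float,
                     ends_off_display: bool = False) -> str:
    """Classify a gesture as short, medium, or long."""
    if ends_off_display:
        # Alternative: a gesture that begins on the display and continues
        # off its edge is interpreted as a long gesture.
        return "long"
    short_max = 0.25 * display_height   # assumed first threshold
    medium_max = 0.60 * display_height  # assumed second threshold
    if length <= short_max:
        return "short"   # advance or reverse by an amount
    if length <= medium_max:
        return "medium"  # advance or reverse by a greater amount
    return "long"        # advance or reverse to the end or start
```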
The user may scroll through the email in the cursor scrolling mode to move the cursor 206 within the information. The origin 302 is shown in
Optionally, the length of the gesture, either by distance or time duration, may be utilized to determine how far to advance or reverse the cursor 206 within the information. Two, three, or more levels of distinction may be utilized. A short gesture may advance or reverse the cursor by one line of information or one character, and a long gesture may advance or reverse the cursor by more than one line of information or more than one character.
A long gesture may be utilized to move the cursor to the end or the beginning of the information for a generally vertical gesture or may be used to move the cursor to the end or beginning of a line for a generally horizontal gesture. Alternatively, a long gesture may be interpreted as advancing or reversing the cursor within the information by multiple lines or characters. The number of multiples may vary.
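The cursor-scrolling behavior described in the preceding two paragraphs might be sketched as follows; the line/character data model, the clamping at boundaries, and the function signature are assumptions made for illustration.

```python
def move_cursor(row: int, col: int, direction: str, sign: int,
                is_long: bool, last_row: int, line_len: int):
    """Move the cursor within the information. sign is +1 to advance and
    -1 to reverse. A short gesture moves one line (vertical) or one
    character (horizontal); a long gesture jumps to the end or beginning
    of the information (vertical) or of the line (horizontal)."""
    if direction == "vertical":
        if is_long:
            return (last_row if sign > 0 else 0, col)
        return (max(0, min(last_row, row + sign)), col)
    if is_long:
        return (row, line_len if sign > 0 else 0)
    return (row, max(0, min(line_len, col + sign)))
```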
A method of controlling a portable electronic device that has a touch-sensitive display includes displaying information on the touch-sensitive display, detecting a gesture on the touch-sensitive display, scrolling through the information in a first scrolling mode when the gesture is associated with a first area of the touch-sensitive display, and scrolling through the information in a second scrolling mode when the gesture is associated with a second area of the touch-sensitive display.
A computer-readable medium has computer-readable code embodied therein that is executable by at least one processor of a portable electronic device to perform the above method.
A portable electronic device includes a touch-sensitive display configured to display information. A processor is configured to detect a gesture on the touch-sensitive display, scroll through the information in a first scrolling mode when the gesture is associated with a first area of the touch-sensitive display, and scroll through the information in a second scrolling mode when the gesture is associated with a second area of the touch-sensitive display.
The method of scrolling described herein facilitates interaction and selection, for example, of a cursor position within information displayed for editing. A detected swipe on a touch-sensitive display may be utilized to scroll through the information in either of two modes, in any direction. The mode for scrolling is determined based on the association of the gesture with an area of the display, which enables scrolling in either of the two modes without requiring a button, a menu, or another more time-consuming process. Thus, different parts of the information may be displayed and/or edited more quickly, decreasing power requirements, increasing battery life, and providing an improved user experience.
The present disclosure may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the present disclosure is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.