Input method, input apparatus, and terminal

Information

  • Patent Grant
  • Patent Number
    11,256,877
  • Date Filed
    Thursday, February 27, 2020
  • Date Issued
    Tuesday, February 22, 2022
Abstract
An input method includes receiving input end indication information sent by an input module, where the input end indication information indicates that input of a character or a word ends, obtaining a location of a cursor, identifying the input character or word forward from the location of the cursor until a first punctuation input before the character or the word is identified, using the identified character or word as a previous text, and querying a word library for a next text associated with the previous text, and outputting the associated next text to a display module for displaying.
Description
FIELD OF THE INVENTION

The present invention relates to the field of communication technologies, and in particular, to an input method, an input apparatus, and a terminal.


BACKGROUND OF THE INVENTION

At present, all Chinese input methods are capable of making associations only according to frequently used words while a user is inputting characters. For example, when the characters “lianx” are input by using the Sogou pinyin input method, the input method may automatically display such words as “custom character”, “custom character”, “custom character”, “custom character”, and “custom character”, thereby facilitating the input. When no characters are being input, the cursor is simply located somewhere in a file.


The Chinese/English input method may also associate words according to trending social expressions, sports terms, and frequently used information words while the user inputs characters, which can speed up typing and increase the input efficiency. However, this association method is used only while Pinyin characters are being input.


During the research and practice of the prior art, the inventor of the present invention discovers that the input method in the prior art is capable of associating related expressions only when characters are input; once the user presses an input end key, for example, the space key, to end the input of a character or a word, no association is made.


SUMMARY OF THE INVENTION

Embodiments of the present invention provide an input method, which is capable of associating a next text according to an input previous text for a user to select after the user presses an input end key to end the input of a character or a word, so that the input efficiency is increased. The embodiments of the present invention also provide an input apparatus and a terminal.


An input method includes: receiving input end indication information sent by an input module, where the input end indication information indicates that input of a character or a word ends; obtaining a location of a cursor; identifying the input character or word forward from the location of the cursor until a first punctuation input before the character or the word is identified; using the identified character or word as a previous text, and querying a word library for a next text associated with the previous text; and outputting the associated next text to a display module for displaying.


An input apparatus includes: a receiving unit, configured to receive input end indication information, where the input end indication information indicates that input of a character or a word ends; an obtaining unit, configured to obtain a location of a cursor after the receiving unit receives the input end indication information; an identifying unit, configured to identify the input character or word forward from the location of the cursor obtained by the obtaining unit until a first punctuation input before the character or the word is identified; a querying unit, configured to use the character or the word identified by the identifying unit as a previous text, and query a word library for a next text associated with the previous text; and an outputting unit, configured to output the next text, which is found by the querying unit and associated with the previous text, to a display module for displaying.


A terminal includes an input apparatus, an input module, and a display module, where: the input module inputs, to the input apparatus, input end indication information of a character or a word, where the input end indication information indicates that input of the character or the word ends; the input apparatus obtains a location of a cursor after receiving the input end indication information of the character or the word; identifies the input character or word forward from the location of the cursor until a first punctuation input before the character or the word is identified; uses the identified character or word as a previous text, and queries a word library for a next text associated with the previous text; and outputs the associated next text; and the display module displays the next text input by the input apparatus for a user to select to input.


In the embodiments of the present invention, input end indication information sent by an input module is received, where the input end indication information indicates that input of a character or a word ends; a location of a cursor is obtained; the input character or word is identified forward from the location of the cursor until a first punctuation is identified; the identified character or word is used as a previous text, and a word library is queried for a next text associated with the previous text; and the associated next text is output to a display module for displaying. Compared with the prior art, the input method provided in the embodiments of the present invention is capable of associating a next text according to an input previous text for the user to select after the user presses an input end key, for example, the space key, to end the input of a character or a word, so that the input efficiency is increased.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic diagram of an embodiment of an input method according to an embodiment of the present invention;



FIG. 2 is a schematic diagram of an embodiment of an input apparatus according to an embodiment of the present invention;



FIG. 3 is a schematic diagram of another embodiment of an input apparatus according to an embodiment of the present invention; and



FIG. 4 is a schematic diagram of an embodiment of a terminal according to an embodiment of the present invention.





DETAILED DESCRIPTION OF THE EMBODIMENTS

An embodiment of the present invention provides an input method, which is capable of associating a next text according to an input previous text for a user to select after the user presses an input end key, for example, the space key, to end the input of a character or a word, so that the input efficiency is increased. An embodiment of the present invention also provides an input apparatus and a terminal. The embodiments are respectively described in detail below.


Referring to FIG. 1, an embodiment of an input method provided in an embodiment of the present invention includes the following:



101. An input apparatus receives input end indication information sent by an input module, where the input end indication information indicates that input of a character or a word ends.


The input module may be a module having an input function, for example, a computer keyboard, handset keys, and the like.


The input end indication information is triggered when the user presses an input end key, for example, the space key. When the input apparatus receives the input end indication information, it means that input of the preceding character or word is completed. The input end indication information is the same for the Chinese input method and the English input method, and is sent when the user presses the space key or a punctuation key.



102. Obtain a location of a cursor.


The input method in the prior art is capable of making association according to frequently used words only when characters are input. For example, when we input characters “lianx” by using the Sogou pinyin input method, the input method may automatically display such words as “custom character”, “custom character”, “custom character”, “custom character”, and “custom character”. When the input of characters is completed, for example, when the words “custom character” are input, the cursor is located after the word “custom character” and the input method displays nothing until the user inputs a character again.


By using the input method provided in the embodiment of the present invention, when a user presses an input end key, for example, the space key, to end the input of a character or a word, a next text may be associated according to the input previous text. That is, when the cursor is located after a character or a word, the input method may associate an expression or a sentence related to the input character or word. For example, when the expression “custom character” is input and the cursor is located after the word “custom character”, the input method provided in the embodiment of the present invention may associate the expression “custom charactercustom character”. In this way, the user can directly select the displayed “custom character” without the need of inputting any character.


For the input apparatus, after the user inputs the input end indication information of a character or a word by using the input module, the input apparatus begins to obtain the location of the cursor.



103. Identify the input character or word forward from the location of the cursor until a first punctuation input before the character or the word is identified.


A next text needs to be associated according to the previous text, so the range of the previous text contents needs to be determined first. The previous text contents provided in the embodiment of the present invention refer to a segment of contents starting from the location of the cursor back to the identified first punctuation. For example, when the input characters include “custom charactercustom charactercustom character: custom character,” and the cursor is located after the word “custom character” or after the comma “,”, according to the method in step 101, it is first determined that the cursor is located after the word “custom character” or after the comma “,”; then the input characters are identified forward from the location of the cursor until the colon “:” input before the characters or the words is identified; and the identified segment of contents “custom character” is used as the previous text contents.


Only a Chinese example is provided herein. Actually, English may also be input, for example, if “teacher said: good good study,” is input and the cursor is located at an adjacent location after the comma “,”, “good good study” may be used as a previous text, so that a corresponding next text is associated.
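The backward scan described in step 103 can be sketched as follows. This is a minimal Python illustration: the function name, the punctuation set, and the plain string buffer are assumptions made for the sketch, not details prescribed by the patent.

```python
# Illustrative sketch of step 103: scan backward from the cursor to the
# most recent punctuation mark and take the span in between as the
# "previous text". The punctuation set covers a few ASCII and
# full-width Chinese marks; a real input method would use a fuller set.
PUNCTUATION = set(",.:;!?\u3002\uff0c\uff1a\uff1b\uff01\uff1f")

def identify_previous_text(buffer: str, cursor: int) -> str:
    i = cursor
    # Skip punctuation directly to the left of the cursor (e.g. a
    # trailing comma), matching the example in the description.
    while i > 0 and buffer[i - 1] in PUNCTUATION:
        i -= 1
    end = i
    # Scan backward until the first punctuation before the text.
    while i > 0 and buffer[i - 1] not in PUNCTUATION:
        i -= 1
    return buffer[i:end].strip()

# With the English example from the description, a cursor just after
# the trailing comma of "teacher said: good good study," recovers
# "good good study" as the previous text.
```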



104. Use the identified character or word as a previous text, and query a word library for a next text associated with the previous text.


In step 103, a segment of a character or a word starting from the location of the cursor back to the identified first punctuation input before the character or the word is identified, and the identified character or word is used as the previous text; then the word library is queried for a next text associated with the previous text. The specific querying solution includes: querying the word library for all expressions and sentences including the previous text, and using an expression or a sentence that exists in the found expressions and sentences but does not exist in the previous text as a next text. For example, if the previous text is “custom character”, it may be found that a sentence including “custom character” is a line in the poem “custom character”, that is, “custom charactercustom character, custom character”. In this way, when the cursor is located after the word “custom character”, it is determined that the next text associated with the previous text is “, custom character”; when the cursor is located after the comma “,”, it is determined that the next text is “custom charactercustom character”.


When the previous text is “good good study”, it may be found that a sentence including “good good study” is “good good study, day day up”; when the cursor is located after the comma “,”, a next text associated with the previous text is “day day up”.
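Using the same English example, the whole-previous-text query of step 104 can be sketched as follows. The in-memory list standing in for the word library and the helper name are illustrative assumptions; the patent does not prescribe a data structure.

```python
# Illustrative sketch of step 104: find library entries that contain
# the previous text, and return the part of each entry that follows it
# (with any leading punctuation stripped) as the associated next text.
WORD_LIBRARY = [
    "good good study, day day up",
    "practice makes perfect",
]

def query_next_texts(previous_text: str, library=WORD_LIBRARY):
    results = []
    for entry in library:
        pos = entry.find(previous_text)
        if pos != -1:
            remainder = entry[pos + len(previous_text):].lstrip(", ")
            if remainder:
                results.append(remainder)
    return results
```

For the previous text "good good study", the sketch finds the library entry "good good study, day day up" and proposes "day day up" as the next text, matching the example in the description.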


The foregoing example is a special case, a line of poetry, for which a whole sentence can be associated. In practice, it is sometimes not possible to query and associate based on the entire previous text. For example, if the input previous text is “custom character” and the cursor is located after the word “custom character”, an expression or a sentence including the previous text “custom character” may not exist, and according to the foregoing solution no next text can be associated. For this case, the embodiment of the present invention provides another association solution. Specific details are as follows:


The previous text is grouped according to the input sequence and frequently used words, and the last input group of characters or words is used as an index. The word library is queried for expressions and sentences including the index, and an expression or a sentence that exists in the found expressions and sentences but does not exist in the index is used as the next text. The group last input according to the input time sequence is the group closest to the location of the cursor.


For example, according to this solution, the previous text “custom character” can be divided into two groups, namely, “custom character” and “custom character”. According to the input time sequence, “custom character” is the last input group, that is, the group closest to the location of the cursor. “custom character” is then used as an index to query for expressions and sentences including “custom character”, and an expression like “custom character” can be found easily. In this way, “custom character” can be used as the next text of the index “custom character”.
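The grouping fallback can be sketched as follows. Splitting on whitespace is a naive stand-in for real word segmentation (a Chinese implementation would need a proper segmenter), and the function name and example library are illustrative assumptions.

```python
# Illustrative sketch of the fallback solution: when the whole previous
# text matches nothing in the library, fall back to the last input
# group, the one closest to the cursor, and query with it alone.
def query_with_last_group(previous_text: str, library):
    groups = previous_text.split()  # naive stand-in for word segmentation
    if not groups:
        return []
    index = groups[-1]              # last input group, closest to the cursor
    results = []
    for entry in library:
        pos = entry.find(index)
        if pos != -1:
            remainder = entry[pos + len(index):].lstrip(", ")
            if remainder:
                results.append(remainder)
    return results

# With a library that contains no entry for the full previous text,
# querying with the last group alone can still associate a next text.
```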



105. Output the associated next text to a display module for displaying.


In step 104, multiple next texts may be associated. For example, when the previous text is “custom character”, the associated next texts may include a series of expressions, such as “custom character”, “custom character”, and “custom character”. The input apparatus outputs these expressions to a display module, for example, a display screen, for displaying. At this time, the user may directly select an associated next text according to actual needs, without the need of inputting pinyin or words, thereby increasing the input efficiency of the input method.
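Putting steps 101 through 105 together, a minimal end-to-end sketch might look like the following. The buffer, cursor index, punctuation set, and one-entry library are assumptions made for illustration, and returning the candidate list stands in for outputting to the display module.

```python
# End-to-end sketch: on an input-end event, identify the previous text
# (step 103), query the word library (step 104), and hand the candidate
# next texts to the display module (step 105, modeled as the return value).
PUNCT = set(",.:;!?")
LIBRARY = ["good good study, day day up"]

def on_input_end(buffer: str, cursor: int):
    # Step 103: scan backward from the cursor to the first punctuation.
    i = cursor
    while i > 0 and buffer[i - 1] in PUNCT:   # skip punctuation at the cursor
        i -= 1
    end = i
    while i > 0 and buffer[i - 1] not in PUNCT:
        i -= 1
    previous = buffer[i:end].strip()
    # Step 104: collect the tail of every library entry containing it.
    candidates = []
    if previous:
        for entry in LIBRARY:
            pos = entry.find(previous)
            if pos != -1:
                tail = entry[pos + len(previous):].lstrip(", ")
                if tail:
                    candidates.append(tail)
    # Step 105: output the candidates for the user to select.
    return candidates
```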


In the embodiment of the present invention, input end indication information sent by an input module is received, where the input end indication information indicates that input of a character or a word ends; a location of a cursor is obtained; the input character or word is identified forward from the location of the cursor until a first punctuation is identified; the identified character or word is used as a previous text, and a word library is queried for a next text associated with the previous text; and the associated next text is output to a display module for displaying. Compared with the prior art, the input method provided in the embodiment of the present invention is capable of associating a next text according to an input previous text for the user to select after the user presses an input end key, for example, the space key, to end the input of a character or a word, so that the input efficiency is increased.


Referring to FIG. 2, an embodiment of an input apparatus provided in an embodiment of the present invention includes: a receiving unit 201, configured to receive input end indication information sent by an input module, where the input end indication information indicates that input of a character or a word ends; an obtaining unit 202, configured to obtain a location of a cursor after the receiving unit 201 receives the input end indication information; an identifying unit 203, configured to identify the input character or word forward from the location of the cursor obtained by the obtaining unit 202 until a first punctuation input before the character or the word is identified; a querying unit 204, configured to use the character or the word identified by the identifying unit 203 as a previous text, and query a word library for a next text associated with the previous text; and an outputting unit 205, configured to output the next text, which is found by the querying unit 204 and associated with the previous text, to a display module for displaying.


In the embodiment of the present invention, the receiving unit 201 receives input end indication information sent by an input module, where the input end indication information indicates that input of a character or a word ends; after the receiving unit 201 receives the input end indication information, the obtaining unit 202 obtains a location of a cursor; the identifying unit 203 identifies the input character or word forward from the location of the cursor obtained by the obtaining unit 202 until a first punctuation input before the character or the word is identified; the querying unit 204 uses the character or the word identified by the identifying unit 203 as a previous text, and queries a word library for a next text associated with the previous text; and the outputting unit 205 outputs the next text, which is found by the querying unit 204 and associated with the previous text, to a display module for displaying. The input apparatus provided in the embodiment of the present invention is capable of associating a next text according to an input previous text for the user to select after the user presses an input end key, for example, the space key, to end the input of a character or a word, so that the input efficiency is increased.


On the basis of the embodiment illustrated in FIG. 2, in another embodiment of an input apparatus provided in an embodiment of the present invention, the querying unit 204 is specifically configured to query the word library for all expressions and sentences including the previous text, and use an expression or a sentence that exists in the found expressions and sentences but does not exist in the previous text as a next text.


Referring to FIG. 3, on the basis of the embodiment illustrated in FIG. 2, another embodiment of an input apparatus provided in an embodiment of the present invention includes: a grouping unit 206, configured to group the previous text according to an input sequence and a frequently used word, and use a last input group of a character or a word as an index; where the querying unit 204 is specifically configured to query the word library for expressions and sentences including the index determined by the grouping unit 206, and use an expression or a sentence that exists in the found expressions and sentences but does not exist in the index as a next text.


The input apparatus provided in the embodiment of the present invention is capable of associating a next text according to an input previous text for the user to select after the user presses an input end key to end the input of a character or a word, so that the input efficiency is increased.


In the input apparatus provided in the embodiment of the present invention, the receiving unit, the obtaining unit, the identifying unit, the querying unit, the outputting unit, and the grouping unit may be parts of a processor. The functions of these units may be executed by the processor.


The input apparatus provided in the embodiment of the present invention may be a processor.


Referring to FIG. 4, an embodiment of the present invention also provides a terminal. The terminal provided in the embodiment of the present invention includes an input apparatus 20, an input module 10, and a display module 30.


The input module 10 is configured to input, to the input apparatus, input end indication information of a character or a word, where the input end indication information indicates that input of the character or the word ends.


The input apparatus 20 is configured to: obtain a location of a cursor after receiving the input end indication information of the character or the word; identify the input character or word forward from the location of the cursor until a first punctuation input before the character or the word is identified; use the identified character or word as a previous text, and query a word library for a next text associated with the previous text; and output the associated next text.


The display module 30 is configured to display the next text input by the input apparatus for a user to select to input.


The terminal provided in the embodiment of the present invention is capable of associating a next text according to an input previous text for the user to select after the user presses an input end key to end the input of a character or a word, so that the input efficiency is increased.


The terminal provided in the embodiment of the present invention is not limited to such devices as a handset, a portable computer, and a desktop computer. The input method provided in the embodiments of the present invention is not limited to the Chinese input method or the English input method.


Persons of ordinary skill in the art may understand that all or part of the steps of the methods in the foregoing embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer readable storage medium. The storage medium may include a ROM, a RAM, a magnetic disk, or an optical disk, and so on.


An input method, an input apparatus, and a terminal that are provided in the embodiments of the present invention are introduced in detail in the foregoing. Specific examples are used for illustrating principles and implementation manners of the present invention. The foregoing descriptions of the embodiments are merely used to help understand the methods and core ideas of the present invention. Meanwhile, persons of ordinary skill in the art may make modifications to the specific implementation manners and application scopes according to the idea of the present invention. In conclusion, the content of the specification shall not be construed as a limitation to the present invention.

Claims
  • 1. An electronic device, comprising: a processor;a display coupled to the processor; anda non-transitory computer readable storage medium coupled to the processor and configured to store computer-executable codes, that when executed by the processor, cause the electronic device to: display a keyboard;display first text inputted by the keyboard;display, after the first text, punctuation inputted by the keyboard;display, after the punctuation, second text inputted by the keyboard;display, after the second text, third text inputted by the keyboard;display, after the third text, a cursor at a first location;move the cursor from the first location to a second location based on an input of a user, wherein the second location is to a left of the third text and to a right of the second text;query for a first prediction from a word library based on the second text being between the punctuation and the second location of the cursor as a context for prediction; andoutput the first prediction on the display, wherein the first prediction comprises a first prediction text and a second prediction text, wherein the first prediction text is queried from the word library using all of the second text, and wherein the second prediction text is queried from the word library using a part of the second text.
  • 2. The electronic device of claim 1, wherein the second text comprises a plurality of groups of characters inputted sequentially, and wherein the part of the second text used for querying is one of the groups of characters closest to the second location of the cursor.
  • 3. The electronic device of claim 1, wherein the keyboard is a Chinese Pinyin keyboard or an English keyboard.
  • 4. The electronic device of claim 1, wherein the electronic device is further caused to: move the cursor to a third location based on another input of the user, wherein the third location is between the punctuation and the second location;query for a second prediction from the word library based on text between the punctuation and the third location of the cursor as a context for prediction; anddisplay the second prediction.
  • 5. The electronic device of claim 1, wherein the second text is directly to the right of the punctuation, and wherein the third text is directly to the right of the second text.
  • 6. The electronic device of claim 1, wherein the second location is directly to the right of the second text and is directly to the left of the third text.
  • 7. The electronic device of claim 1, wherein each of the first text, the second text, and the third text comprises a phrase or a sentence.
  • 8. The electronic device of claim 1, wherein before querying for the first prediction from the word library, the electronic device is further caused to identify the second text between the punctuation and the second location of the cursor.
  • 9. An electronic device, comprising: a processor;a display coupled to the processor; anda non-transitory computer readable storage medium coupled to the processor and configured to store computer-executable codes, that when executed by the processor, cause the electronic device to: display first text inputted by a user;display, after the first text, punctuation inputted by the user;display, after the punctuation, second text inputted by the user;display, after the second text, third text inputted by the user;display, after the third text, a cursor at a first location;move the cursor from the first location to a second location based on an input of the user, wherein the second location is to the left of the third text and to the right of the second text;query for a first prediction from a word library based on the second text being between the punctuation and the second location of the cursor as a context for prediction; andoutput the first prediction on the display, wherein the first prediction comprises a first prediction text and a second prediction text, wherein the first prediction text is queried from the word library using all of the second text, and wherein the second prediction text is queried from the word library using a part of the second text.
  • 10. The electronic device of claim 9, wherein the second text comprises a plurality of groups of characters inputted sequentially, and wherein the part of the second text used for querying is one of the groups of characters closest to the second location of the cursor.
  • 11. The electronic device of claim 9, wherein the electronic device is further caused to: move the cursor to a third location based on another input of the user, wherein the third location is between the punctuation and the second location;query for a second prediction from the word library based on text between the punctuation and the third location of the cursor as a context for prediction; anddisplay the second prediction.
  • 12. The electronic device of claim 9, wherein the second text is directly to the right of the punctuation, and wherein the third text is directly to the right of the second text.
  • 13. The electronic device of claim 9, wherein the second location is directly to the right of the second text and is directly to the left of the third text.
  • 14. The electronic device of claim 9, wherein each of the first text, the second text, and the third text comprises a phrase or a sentence.
  • 15. The electronic device of claim 9, wherein before querying for the first prediction from the word library, the electronic device is further caused to identify the second text between the punctuation and the second location of the cursor.
  • 16. An electronic device, comprising: a processor;a display coupled to the processor; anda non-transitory computer readable storage medium coupled to the processor and configured to store computer-executable codes, that when executed by the processor, cause the electronic device to: display first text inputted by a user;display, after the first text, punctuation inputted by the user;display, after the punctuation, second text inputted by the user;display, after the second text, third text inputted by the user;display, after the third text, a cursor at a first location;query for a first prediction from a word library based on the second text and the third text that are between the punctuation and the first location of the cursor as a context for prediction; andoutput the first prediction on the display, wherein the first prediction comprises a first prediction text and a second prediction text, wherein the first prediction text is queried from the word library using all of the second text and all of the third text, and wherein the second prediction text is queried from the word library using a part of the second text and the third text.
  • 17. The electronic device of claim 16, wherein the second text and the third text comprise a plurality of groups of characters inputted sequentially, and wherein the part of the second text and the third text used for query is one of the groups of characters closest to the first location of the cursor.
  • 18. The electronic device of claim 16, wherein the electronic device is further caused to: move the cursor to a second location based on an input of the user, wherein the second location is to a left of the third text and to a right of the second text;query for a second prediction from the word library based on the second text between the punctuation and the second location of the cursor as a context for prediction; anddisplay the second prediction.
  • 19. The electronic device of claim 16, wherein the second text is directly to a right of the punctuation, and wherein the third text is directly to the right of the second text.
  • 20. The electronic device of claim 16, wherein each of the first text, the second text, and the third text comprises a phrase or a sentence.
Priority Claims (1)
Number Date Country Kind
201210070159.1 Mar 2012 CN national
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/961,516 filed on Apr. 24, 2018, which is a continuation of U.S. patent application Ser. No. 13/826,487 filed on Mar. 14, 2013, now U.S. Pat. No. 9,984,069, which claims priority to Chinese Patent Application No. 201210070159.1 filed on Mar. 16, 2012. All of the aforementioned patent applications are hereby incorporated by reference in their entireties.

US Referenced Citations (32)
Number Name Date Kind
5963671 Comerford et al. Oct 1999 A
6223059 Haestrup Apr 2001 B1
6573844 Venolia et al. Jun 2003 B1
6822585 Ni Nov 2004 B1
7809719 Furuuchi et al. Oct 2010 B2
8918736 Jobs Dec 2014 B2
9167025 Twitchell Oct 2015 B2
9659002 Medlock et al. May 2017 B2
20020019731 Masui et al. Feb 2002 A1
20020038207 Mori et al. Mar 2002 A1
20050017954 Kay et al. Jan 2005 A1
20050281259 Mitchell Dec 2005 A1
20060247915 Bradford Nov 2006 A1
20080072143 Assadollahi Mar 2008 A1
20080195571 Furuuchi et al. Aug 2008 A1
20080204421 Hsu Aug 2008 A1
20090234632 Hasegawa et al. Sep 2009 A1
20100302163 Ghassabian Dec 2010 A1
20110093497 Poon Apr 2011 A1
20120117101 Unruh May 2012 A1
20130222249 Pasquero et al. Aug 2013 A1
20130227460 Jawerth et al. Aug 2013 A1
20130253908 Zhai Sep 2013 A1
20130275119 Wei et al. Oct 2013 A1
20140050223 Foo et al. Feb 2014 A1
20140226662 Frost et al. Aug 2014 A1
20150121285 Eleftheriou et al. Apr 2015 A1
20150135123 Carr May 2015 A1
20150319089 Liu et al. Nov 2015 A1
20160164776 Biancaniello Jun 2016 A1
20160261505 Saniee et al. Sep 2016 A1
20170085462 Zhou et al. Mar 2017 A1
Foreign Referenced Citations (28)
Number Date Country
1389778 Jan 2003 CN
1710888 Dec 2005 CN
1908864 Feb 2007 CN
1908866 Feb 2007 CN
101196792 Jun 2008 CN
101246396 Aug 2008 CN
101308515 Nov 2008 CN
101369216 Feb 2009 CN
101634905 Jan 2010 CN
101727271 Jun 2010 CN
101957721 Jan 2011 CN
101957724 Jan 2011 CN
102236423 Nov 2011 CN
102629160 Aug 2012 CN
104283785 Jan 2015 CN
105099960 Nov 2015 CN
105306368 Feb 2016 CN
105591925 May 2016 CN
2765751 Aug 2014 EP
09069086 Mar 1997 JP
11212967 Aug 1999 JP
2007506184 Mar 2007 JP
2011138252 Jul 2011 JP
2005036413 Apr 2005 WO
2015090240 Jun 2015 WO
2015149831 Oct 2015 WO
2015180154 Dec 2015 WO
2016119600 Aug 2016 WO
Non-Patent Literature Citations (33)
Entry
Machine Translation and Abstract of Chinese Publication No. CN104283785, Jan. 14, 2015, 12 pages.
Machine Translation and Abstract of Chinese Publication No. CN105306368, Feb. 3, 2016, 29 pages.
Machine Translation and Abstract of Chinese Publication No. CN105591925, May 18, 2016, 18 pages.
Zhang, H., et al., “Service Chain Header,” draft-zhang-sfc-sch-00, Mar. 24, 2014, 15 pages.
Ding, W., et al., “OpenSCaaS: An Open Service Chain as a Service Platform Toward the Integration of SDN and NFV,” IEEE Network, vol. 29, Issue 3, May-Jun. 2015, pp. 30-35.
“Service Function Chaining Extension for OpenStack Networking,” retrieved from https://docs.openstack.org/networking-sfc/latest/index.html, Jul. 23, 2017, 1 page.
Masui, “An Efficient Text Input Method for Pen-based Computers,” 1998, In Proceedings of CHI'98, pp. 328-335.
Foreign Communication From a Counterpart Application, PCT Application No. PCT/CN2016/101381, English Translation of International Search Report dated Mar. 21, 2017, 3 pages.
Foreign Communication From a Counterpart Application, European Application No. 16917427.3, Extended European Search Report dated Mar. 27, 2019, 8 pages.
Foreign Communication From a Counterpart Application, Chinese Application No. 201680003591.5, Chinese Notice of Allowance dated Jan. 10, 2020, 4 pages.
Machine Translation and Abstract of Chinese Publication No. CN101196792, Jun. 11, 2008, 9 pages.
Machine Translation and Abstract of Chinese Publication No. CN101246396, Aug. 20, 2008, 8 pages.
Machine Translation and Abstract of Chinese Publication No. CN101369216, Feb. 18, 2009, 32 pages.
Machine Translation and Abstract of Chinese Publication No. CN101727271, Jun. 9, 2010, 27 pages.
Machine Translation and Abstract of Chinese Publication No. CN102236423, Nov. 9, 2011, 29 pages.
Machine Translation and Abstract of Chinese Publication No. CN1389778, Jan. 8, 2003, 7 pages.
Machine Translation and Abstract of Chinese Publication No. CN1908864, Feb. 7, 2007, 4 pages.
Machine Translation and Abstract of Chinese Publication No. CN1908866, Feb. 7, 2007, 4 pages.
Machine Translation and Abstract of Chinese Publication No. CN101308515, Nov. 19, 2008, 5 pages.
Machine Translation and Abstract of Chinese Publication No. CN101634905, Jan. 27, 2010, 15 pages.
Machine Translation and Abstract of Chinese Publication No. CN101957724, Jan. 26, 2011, 6 pages.
Machine Translation and Abstract of Japanese Publication No. JP2011138252, Jul. 14, 2011, 15 pages.
Machine Translation and Abstract of Japanese Publication No. JPH0969086, Mar. 11, 1997, 22 pages.
Machine Translation and Abstract of Japanese Publication No. JPH11212967, Aug. 6, 1999, 14 pages.
Foreign Communication From a Counterpart Application, Japanese Application No. 2013-052968, Japanese Notice of Rejection dated Jan. 20, 2015, 2 pages.
Foreign Communication From a Counterpart Application, Japanese Application No. 2013-052968, English Translation of Japanese Notice of Rejection dated Jan. 20, 2015, 2 pages.
Foreign Communication From a Counterpart Application, Japanese Application No. 2013-052968, Japanese Office Action dated Feb. 18, 2014, 3 pages.
Foreign Communication From a Counterpart Application, Japanese Application No. 2013-052968, English Translation of Japanese Office Action dated Feb. 18, 2014, 3 pages.
Foreign Communication From a Counterpart Application, Japanese Application No. 2013-052968, Japanese Office Action dated Sep. 16, 2014, 3 pages.
Foreign Communication From a Counterpart Application, Japanese Application No. 2013-052968, English Translation of Japanese Office Action dated Sep. 16, 2014, 4 pages.
Foreign Communication From a Counterpart Application, PCT Application No. PCT/CN2013/072158, English Translation of International Search Report dated Jun. 6, 2013, 2 pages.
Foreign Communication From a Counterpart Application, PCT Application No. PCT/CN2013/072158, English Translation of Written Opinion dated Jun. 6, 2013, 7 pages.
Foreign Communication From a Counterpart Application, European Application No. 13159450.9, Extended European Search Report dated Dec. 3, 2013, 7 pages.
Related Publications (1)
Number Date Country
20200202077 A1 Jun 2020 US
Continuations (2)
Number Date Country
Parent 15961516 Apr 2018 US
Child 16803383 US
Parent 13826487 Mar 2013 US
Child 15961516 US