Image processing system, image processing apparatus, and method for image processing

Information

  • Patent Grant
  • Patent Number
    9,779,317
  • Date Filed
    Friday, May 8, 2015
  • Date Issued
    Tuesday, October 3, 2017
Abstract
An image processing system includes a reading unit, a checking unit, a display control unit, and an image correcting unit. The reading unit reads image information from an original manuscript to generate image data. The checking unit checks whether or not a plurality of items listed in the original manuscript include an unentered item based on the result of a character recognition process for recognizing a character contained in the image data. When the unentered item is present, the display control unit performs control to display information for informing a user of the unentered item. The image correcting unit adds text with respect to the unentered item according to an input made by the user to correct the image data.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority to and incorporates by reference the entire contents of Japanese Patent Application No. 2014-099797 filed in Japan on May 13, 2014.


BACKGROUND OF THE INVENTION

1. Field of the Invention


The present invention relates to an image processing system, an image processing apparatus, and a method for image processing.


2. Description of the Related Art


Techniques for facilitating the editing of document data obtained by computerizing paper documents have been conventionally known. For example, Japanese Patent No. 4980691 discloses a configuration in which information about setting items unique to a document database is linked to a document by a groupware server to simplify the editing of document data.


Such conventional techniques, however, provide no mechanism for detecting whether or not a manuscript includes an omission and, when an omission is present, allowing a user to easily perform an adding operation.


In view of the above problem, there is a need to provide an image processing system, an image processing apparatus, and a method for image processing that can detect whether or not a manuscript includes an omission and, when an omission is present, allow a user to easily perform an adding operation.


SUMMARY OF THE INVENTION

It is an object of the present invention to at least partially solve the problems in the conventional technology.


According to the present invention, there is provided an image processing system comprising: a reading unit that reads image information from an original manuscript to generate image data; a checking unit that checks whether or not a plurality of items listed in the original manuscript include an unentered item on the basis of a result of a character recognition process for recognizing a character contained in the image data; a display control unit that performs, when the unentered item is present, control to display information for informing a user of the unentered item; and an image correcting unit that adds text with respect to the unentered item according to an input made by the user to correct the image data.


The present invention also provides an image processing apparatus comprising: a reading unit that reads image information from an original manuscript to generate image data; a checking unit that checks whether or not a plurality of items listed in the original manuscript include an unentered item on the basis of a result of a character recognition process for recognizing a character contained in the image data; a display control unit that performs, when the unentered item is present, control to display the existence of the unentered item; and an image correcting unit that adds text with respect to the unentered item according to an input made by a user to correct the image data.


The present invention also provides an image processing method comprising: a reading step of reading image information from an original manuscript to generate image data; a checking step of checking whether or not a plurality of items listed in the original manuscript include an unentered item on the basis of a result of a character recognition process for recognizing a character contained in the image data; a display control step of performing, when the unentered item is present, control to display the existence of the unentered item; and an image correcting step of adding text with respect to the unentered item according to an input made by a user to correct the image data.


The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram illustrating a configuration example of an image processing system according to an embodiment of the present invention;



FIG. 2 is a diagram illustrating an example of functional configurations of an MFP and a questionnaire server according to a first embodiment of the present invention;



FIG. 3 is a flowchart illustrating a procedure from the preparation of a questionnaire form to the printing-out of the questionnaire form;



FIG. 4 is a flowchart illustrating an operation example of a questionnaire application in the first embodiment;



FIG. 5 is a flowchart illustrating an operation example of a questionnaire application in a second embodiment of the present invention;



FIGS. 6A and 6B each show an example of an answered questionnaire sheet;



FIG. 7 is a diagram illustrating an example of information for informing a user that no answer has been entered to a required item;



FIG. 8 is a diagram illustrating an example of functional configurations of an MFP and a questionnaire server according to a third embodiment of the present invention;



FIGS. 9A and 9B are diagrams each illustrating an example of a questionnaire form;



FIGS. 10A and 10B each show an example of a questionnaire recognition form; and



FIG. 11 is a diagram showing an example of a CSV file.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Embodiments of an image processing system, an image processing apparatus, and a method for image processing according to the present invention will now be described below in detail with reference to the accompanying drawings. Although the following description will be made with respect to a case where the image processing apparatus according to the present invention is used in a multifunction peripheral (MFP) having at least a print function and a scanner function, the invention is not limited thereto.


First Embodiment

As shown in FIG. 1, an image processing system 1 according to the present embodiment includes an MFP 10, a questionnaire server 20, and a client terminal 30. Note that the number of MFPs 10 included in the image processing system 1 is not limited to one. For example, a configuration in which two or more MFPs 10 are included in a single image processing system 1 is possible. Moreover, in the example of FIG. 1, the MFP 10, the questionnaire server 20, and the client terminal 30 are connected to one another via a network 40 such as a LAN. The client terminal 30 is a terminal used by a user and may be a personal computer (PC), for example.



FIG. 2 is a diagram showing an example of functional configurations of the MFP 10 and the questionnaire server 20. As shown in FIG. 2, the MFP 10 includes: a copy unit 11; a scan unit 12; a printer unit 13; a fax unit 14; a panel operation unit 15; and a communication unit 16. The communication unit 16 is an interface for connecting the MFP 10 to the network 40.


The copy unit 11 provides a copy function of the MFP 10. The scan unit 12 provides a scan function of the MFP 10. More specifically, the scan unit 12 has the function of reading image information from an original manuscript and generating image data. The scan unit 12 can be considered as corresponding to a “reading unit” in the claims. The printer unit 13 provides a printer function of the MFP 10. The fax unit 14 provides a facsimile transmission function of the MFP 10.


The panel operation unit 15 receives a user's operation and displays various types of information. In this example, the main unit of the MFP 10, which includes the copy unit 11, the scan unit 12, the printer unit 13, and the fax unit 14, and the panel operation unit 15 operate independently of each other on different operating systems. However, the present invention is not limited thereto. A configuration in which the main unit and the panel operation unit 15 operate on the same operating system may be employed.


An application set in the panel operation unit 15 is software (a program) that provides a user interface (UI) function used for operations and display relating mainly to the functions provided by the copy unit 11, the scan unit 12, the printer unit 13, and the fax unit 14. In the present embodiment, the panel operation unit 15 has a hardware configuration that utilizes a computer device including: a CPU; a storage such as a ROM or a RAM; a communication interface for connecting to the network 40; and a display unit for displaying various images. The CPU mounted in the panel operation unit 15 provides various functions by executing one or more programs. In the present embodiment, software that provides a questionnaire service (hereinafter sometimes referred to as a “questionnaire application”) is set in the panel operation unit 15. The functions of the questionnaire application will mainly be described below.


Note that the “questionnaire” means a questionnaire survey. In this example, a format (form) in which an answer column used for writing an answer is provided for each of one or more items is employed. The form of a questionnaire is sometimes referred to as a “questionnaire form” in the following description. In this example, a user can prepare a questionnaire form on the client terminal 30 and register the prepared questionnaire form in the questionnaire server 20.
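
The form described here can be thought of as a list of items, each paired with an answer column. The following is a minimal sketch in Python of such a structure; the class and field names are illustrative assumptions and do not appear in the patent.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class QuestionnaireItem:
        """One item of a questionnaire form together with its answer column."""
        item_id: str           # e.g. "Q1"
        question: str          # question text printed on the sheet
        required: bool = True  # required item or optional item (see the second embodiment)
        answer: str = ""       # an empty string means no answer has been entered

    @dataclass
    class QuestionnaireForm:
        """A questionnaire form that can be registered in the questionnaire server."""
        form_id: str
        title: str
        items: List[QuestionnaireItem] = field(default_factory=list)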


As shown in FIG. 2, the questionnaire application includes: a questionnaire form requesting unit 101; an MFP function control unit 102; a mode setting unit 103; a checking unit 104; a display control unit 105; an image correcting unit 106; and a questionnaire result data transmission control unit 107.


The questionnaire form requesting unit 101 performs control to display, on the panel operation unit 15, a selection screen from which a questionnaire form registered in the questionnaire server 20 to be described later is selected according to a user's operation. The questionnaire form requesting unit 101 then selects one of the questionnaire forms according to a user's operation on the selection screen and requests the selected questionnaire form from the questionnaire server 20. Once the questionnaire form is obtained from the questionnaire server 20, the questionnaire form requesting unit 101 requests the MFP function control unit 102 to print out the obtained questionnaire form.
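
As a rough illustration of the sequence just described (display the selection screen, fetch the selected form, request printing), the sketch below assumes hypothetical server, panel, and MFP-control objects; their method names are not taken from the patent.

    class QuestionnaireFormRequestingUnit:
        """Sketch of unit 101: select a registered form, fetch it, and request print-out."""

        def __init__(self, server, mfp_function_control_unit, panel):
            self.server = server
            self.mfp = mfp_function_control_unit
            self.panel = panel

        def run(self):
            forms = self.server.list_registered_forms()            # content of the selection screen
            selected_id = self.panel.show_selection_screen(forms)  # user's operation on the panel
            form = self.server.fetch_form(selected_id)              # request the selected form
            self.mfp.print_form(form)                                # request the print-out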


The MFP function control unit 102 provides the copy function, the scan function, the printer function, and the fax function in cooperation with the copy unit 11, the scan unit 12, the printer unit 13, and the fax unit 14, respectively. In this example, a combination of the MFP function control unit 102 and the scan unit 12 can be considered as corresponding to the “reading unit” in the claims. Once the MFP function control unit 102 receives a request for printing out the questionnaire form from the questionnaire form requesting unit 101, the MFP function control unit 102 performs control to cause the printer unit 13 to print out the questionnaire form. Consequently, a questionnaire sheet (a questionnaire sheet which has not yet been answered) corresponding to the questionnaire form is output. The user writes an answer by hand for each of the one or more items listed in the questionnaire sheet.


When the mode setting unit 103 receives an operation of selecting a mode for scanning the answered questionnaire sheet (hereinafter sometimes referred to as an “entry value checking mode”), the mode setting unit 103 sets the operation mode of the scan function to the entry value checking mode. The image data generated by the scan unit 12 in the entry value checking mode is regarded as image data generated by reading image information from the answered questionnaire sheet. For example, the panel operation unit 15 displays a button used for selecting the entry value checking mode. The user can select the entry value checking mode by pressing the button. The user then sets the answered questionnaire sheet and performs an operation of giving an instruction to scan the sheet. The MFP function control unit 102, which has received this operation, performs control to cause the scan unit 12 to carry out a scanning process (a process of reading image information from a set original manuscript and generating its image data).
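
The cooperation between the mode setting unit 103, the MFP function control unit 102, and the scan unit 12 might look roughly like the following sketch; the enum and method names are assumptions made for illustration only.

    from enum import Enum, auto

    class ScanMode(Enum):
        NORMAL = auto()
        ENTRY_VALUE_CHECKING = auto()   # mode for scanning an answered questionnaire sheet

    class ModeSettingUnit:
        def __init__(self):
            self.mode = ScanMode.NORMAL

        def on_entry_value_checking_button_pressed(self):
            # The user pressed the button displayed on the panel operation unit 15.
            self.mode = ScanMode.ENTRY_VALUE_CHECKING

    class MFPFunctionControlUnit:
        def __init__(self, scan_unit, mode_setting_unit):
            self.scan_unit = scan_unit
            self.mode_setting_unit = mode_setting_unit

        def on_scan_instructed(self):
            # Read image information from the set original manuscript and generate image data.
            image_data = self.scan_unit.scan()
            return image_data, self.mode_setting_unit.mode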


Based on the result of a character recognition process for recognizing characters contained in the image data generated by the scan unit 12, the checking unit 104 checks whether or not the plurality of items listed in the manuscript include an unentered item. In this example, the checking unit 104 performs a known OCR process on the image data generated by the scan unit 12 in the entry value checking mode. Based on the result of the OCR process, the checking unit 104 checks whether the plurality of items listed in the answered questionnaire sheet include an item for which no answer has been entered.
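
A minimal sketch of the check performed by the checking unit 104 follows. It assumes the OCR result has already been reduced to a mapping from item identifiers to the text recognized in each answer column; the patent does not specify a particular OCR engine or output format, so the function name and data shape are illustrative.

    from typing import Dict, List

    def find_unentered_items(ocr_result: Dict[str, str]) -> List[str]:
        """Return the identifiers of items whose answer column contains no recognized text.

        ocr_result maps an item identifier (e.g. "Q3") to the text recognized in that
        item's answer column; an empty or whitespace-only string is treated as unentered.
        """
        return [item_id for item_id, text in ocr_result.items() if not text.strip()]

    # Example: Q3 has been left blank on the answered questionnaire sheet.
    print(find_unentered_items({"Q1": "Yes", "Q2": "Twice a week", "Q3": "  "}))   # -> ['Q3']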


In the presence of an unentered item, the display control unit 105 performs control to display information for informing the user of the unentered item. In the present embodiment, when the plurality of items listed in the answered questionnaire sheet include an item for which no answer has been entered, the display control unit 105 performs control to display, on the panel operation unit 15, information for informing the user of that item. The user, having checked this information, can complete the answer to the unentered item either in handwriting or with a function of the questionnaire application. For example, the panel operation unit 15 displays a button used for selecting that the answer to the unentered item is to be completed with the function of the questionnaire application, and the user can make that selection by pressing the button.


The image correcting unit 106 adds text with respect to the unentered item according to a user's input to correct the image data (the image data generated by the scan unit 12). In the present embodiment, when the input of selecting that the answer to the unentered item is to be completed with the function of the questionnaire application is received, the image correcting unit 106 performs control to display, on the panel operation unit 15, the image data generated by the scan unit 12 in the entry value checking mode (the image data of the answered questionnaire sheet including the item for which no answer has been entered). The user can input an answer to that item by operating a soft keyboard or the like while checking this image data. The image correcting unit 106 adds the answer to the item for which no answer has been entered according to the user's input to correct (edit) the image data.
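
The correction applied by the image correcting unit 106 amounts to drawing the typed answer into the answer column of the scanned image. The sketch below uses Pillow, which is an assumption (the patent names no imaging library), and takes the answer-column coordinates from the caller.

    from PIL import Image, ImageDraw   # Pillow; assumed here, not specified in the patent

    def add_answer_to_image(image, answer_text, answer_box_xy):
        """Draw the text entered via the soft keyboard into the unentered answer column."""
        corrected = image.copy()                 # keep the original scanned image data intact
        draw = ImageDraw.Draw(corrected)
        draw.text(answer_box_xy, answer_text, fill="black")
        return corrected

    # Usage sketch: a blank white page stands in for the scanned questionnaire sheet.
    scanned = Image.new("RGB", (1240, 1754), "white")
    corrected = add_answer_to_image(scanned, "Once a month", (200, 900))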


The questionnaire result data transmission control unit 107 performs control to transmit the image data generated by the scan unit 12 (image data on which no correction has been made by the image correcting unit 106) or the image data on which a correction has been made by the image correcting unit 106 to the questionnaire server 20 to be described later as questionnaire result data.
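
The transmission itself could be a simple upload over the network 40. The sketch below uses Python's standard urllib module; the endpoint URL and the content type are purely illustrative assumptions, since the patent does not define a transfer protocol.

    import urllib.request

    def send_questionnaire_result(image_bytes,
                                  url="http://questionnaire-server.example/results"):
        """Upload the (possibly corrected) image data as questionnaire result data."""
        request = urllib.request.Request(
            url,
            data=image_bytes,
            headers={"Content-Type": "application/octet-stream"},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:   # network call; illustrative only
            return response.status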


Although the above scan unit 12, checking unit 104, display control unit 105, and image correcting unit 106 are mounted in a single MFP 10 in the present embodiment, the present invention is not limited thereto. For example, these units may be mounted separately into a plurality of devices. In short, it is only required that an image processing system to which the present invention is applied has a configuration including a function corresponding to the above scan unit 12, a function corresponding to the checking unit 104, a function corresponding to the display control unit 105, and a function corresponding to the image correcting unit 106.


The function of the questionnaire server 20 will be described next. The questionnaire server 20 has a function of accumulating information about questionnaire forms, questionnaire result data, and the like.


As shown in FIG. 2, the questionnaire server 20 includes a registering unit 201, a questionnaire form storage unit 202, a questionnaire form transmission control unit 203, a questionnaire result data acquisition unit 204, a questionnaire result data storage unit 205, and a communication unit 210. The communication unit 210 is an interface for connecting to the network 40 (in other words, an interface for connecting to the MFP 10 or the client terminal 30).


The registering unit 201 performs control to register the questionnaire form produced by the client terminal 30 in the questionnaire form storage unit 202. The questionnaire form transmission control unit 203 performs control to transmit the questionnaire form registered in the questionnaire form storage unit 202 to the MFP 10 in response to a request from the MFP 10 (the questionnaire form requesting unit 101). The questionnaire result data acquisition unit 204 acquires questionnaire result data transmitted from the MFP 10 (the questionnaire result data transmission control unit 107). The questionnaire result data storage unit 205 stores the questionnaire result data acquired by the questionnaire result data acquisition unit 204.
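
The server-side units can be pictured as a small in-memory service, as in the sketch below; the class and method names are assumptions chosen to mirror the units described above.

    class QuestionnaireServerSketch:
        """In-memory stand-in for the questionnaire server 20 (storage units 202 and 205)."""

        def __init__(self):
            self._forms = {}      # questionnaire form storage unit 202
            self._results = []    # questionnaire result data storage unit 205

        def register_form(self, form_id, form):    # registering unit 201
            self._forms[form_id] = form

        def get_form(self, form_id):                # questionnaire form transmission control unit 203
            return self._forms[form_id]

        def receive_result(self, result_data):      # questionnaire result data acquisition unit 204
            self._results.append(result_data)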


In the present embodiment, the questionnaire server 20 has a hardware configuration that utilizes a computer device including a CPU, a storage such as a ROM or a RAM, a communication interface for connecting to the network 40, and a display unit for displaying various images. The CPU mounted in the questionnaire server 20 provides the function of each unit (such as the registering unit 201) of the questionnaire server 20 described above by executing one or more programs.



FIG. 3 is a flowchart illustrating a procedure from the preparation of a questionnaire form to the printing-out of the questionnaire form. As shown in FIG. 3, the client terminal 30 first prepares a questionnaire form according to a user's input (step S1), and the questionnaire server 20 registers the questionnaire form produced by the client terminal 30 (step S2). Next, the MFP 10 requests one of the registered questionnaire forms from the questionnaire server 20 according to a user's input and acquires the requested questionnaire form (step S3). Next, the MFP 10 prints out the acquired questionnaire form (step S4). This yields a questionnaire sheet which has not yet been answered. The user writes an answer by hand for each of the one or more items listed in the questionnaire sheet.



FIG. 4 is a flowchart illustrating an operation example of the questionnaire application. In this example, the user, who has finished writing answers on the questionnaire sheet, presses the button displayed on the panel operation unit 15 for selecting the entry value checking mode. The mode setting unit 103, which has received this operation, sets the operation mode of the scan function to the entry value checking mode (step S11). The user then sets the answered questionnaire sheet and performs an operation of giving an instruction to scan the sheet. The MFP function control unit 102, which has received this operation, performs control to cause the scan unit 12 to carry out the scanning process (step S12).


Next, the checking unit 104 performs the OCR process on the image data generated by the scanning process in step S12. Based on the result of the OCR process, the checking unit 104 checks whether or not the plurality of items listed in the answered questionnaire sheet include an item for which no answer has been entered (step S13). In the absence of an item for which no answer has been entered (step S13: No), the questionnaire result data transmission control unit 107 performs control to transmit the image data generated by the scanning process in step S12 to the questionnaire server 20 as questionnaire result data (step S14).


In the presence of an item for which no answer has been entered (step S13: Yes), on the other hand, the display control unit 105 performs control to display information for informing the user of the existence of an item for which no answer has been entered (step S15). When the input of selecting that the answer to the unentered item is to be completed with the function of the questionnaire application is received (step S16: Yes), the image correcting unit 106 performs control to display the image data generated by the scanning process in step S12. The image correcting unit 106 then adds the answer to the item for which no answer has been entered in response to a user's input to correct the image data (step S17). The questionnaire result data transmission control unit 107 then performs control to transmit the image data corrected by the image correcting unit 106 in step S17 to the questionnaire server 20 as questionnaire result data (step S14).


On the other hand, when no input of selecting that the answer to the unentered item is to be completed with the function of the questionnaire application is received in step S16 (step S16: No), the user completes the answer in handwriting, then sets the completed questionnaire sheet and performs an operation of giving an instruction to scan the sheet. The MFP function control unit 102, which has received this operation, performs control to cause the scan unit 12 to carry out the scanning process (step S12). Thereafter, step S13 and the subsequent processes are repeated.
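
The flow of FIG. 4 (steps S11 to S17) can be summarized as the loop sketched below; the helper callables are assumptions standing in for the units described above.

    def entry_value_checking_flow(scan, run_ocr, find_unentered_items,
                                  show_unentered_info, user_chooses_app_correction,
                                  correct_image, send_result):
        """Sketch of steps S11-S17: scan, check, optionally correct, then transmit."""
        while True:
            image_data = scan()                                    # S12
            unentered = find_unentered_items(run_ocr(image_data))  # S13
            if not unentered:
                return send_result(image_data)                     # S14
            show_unentered_info(unentered)                         # S15
            if user_chooses_app_correction():                      # S16: Yes
                corrected = correct_image(image_data, unentered)   # S17
                return send_result(corrected)                      # S14
            # S16: No -- the user completes the answer in handwriting and rescans (back to S12).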


As described above, in the present embodiment, the result of the OCR process on the image data generated by reading image information from the answered questionnaire sheet is used to check whether or not the plurality of items listed in the questionnaire sheet include an item for which no answer has been entered. In the presence of such an item, control is performed to display information for informing the user of that item, and text with respect to that item is added according to the user's input to correct the image data. A mechanism capable of detecting whether or not an answered questionnaire sheet includes an omission and, if there is an omission, allowing the user to easily perform an adding operation can thus be achieved.


Second Embodiment

The second embodiment will be described next. A description of elements common to the above first embodiment will be omitted as appropriate. In the second embodiment, when no answer has been entered to a required item, which is an item that must be answered, the display control unit 105 performs control to display information for informing that no answer has been entered to the required item. When no answer has been entered to an optional item, for which answering is optional, the display control unit 105 does not perform control to display information for informing that no answer has been entered to the optional item.



FIG. 5 is a flowchart for explaining a process in the second embodiment corresponding to steps S13 and S15 in FIG. 4. Since the contents of the other processes are the same as those in the first embodiment, their detailed description will be omitted. As shown in FIG. 5, the checking unit 104 performs, in step S21, the OCR process on the image data generated by the scanning process in step S12. Based on the result of the OCR process, the checking unit 104 checks whether or not the plurality of items listed in the answered questionnaire sheet include an item for which no answer has been entered. In the presence of an item for which no answer has been entered (step S21: Yes), the checking unit 104 checks whether or not the item is a required item (step S22). In this example, the checking unit 104 determines whether the item for which no answer has been entered is a required item or an optional item by performing the OCR process. However, the present invention is not limited thereto. Any method of determining whether the item for which no answer has been entered is a required item or an optional item can be used.


When the item for which no answer has been entered is a required item (step S22: Yes), the display control unit 105 performs control to display information for informing that no answer has been entered to the required item (step S23). When the item for which no answer has been entered is an optional item (step S22: No), the process ends without the display control unit 105 performing control to display information for informing that no answer has been entered to the optional item.


In the present embodiment, when no answer has been entered to “Q3”, which is a required item, as shown in FIG. 6A, for example, the display control unit 105 performs control to display, on the panel operation unit 15, information for informing the user that no answer has been entered to the required item “Q3”, as shown in FIG. 7. When no answers have been entered to the required item “Q3” and the optional item “Q2”, as shown in FIG. 6B, for example, the display control unit 105 performs control to display, on the panel operation unit 15, the information of FIG. 7 for informing the user that no answer has been entered to the required item “Q3”, but does not perform control to display information for informing the user that no answer has been entered to the optional item “Q2”.
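
The second-embodiment check (steps S21 to S23) therefore reports only unentered required items. A minimal sketch follows, with a hypothetical item structure (the attribute names are assumptions) and the FIG. 6B situation as an example.

    from collections import namedtuple

    Item = namedtuple("Item", ["item_id", "required", "answer"])

    def unentered_required_items(items):
        """Return identifiers of required items with no answer; optional items are ignored."""
        return [item.item_id for item in items
                if item.required and not item.answer.strip()]

    # FIG. 6B: "Q2" (optional) and "Q3" (required) are both blank; only "Q3" is reported (FIG. 7).
    sheet = [Item("Q1", True, "Yes"), Item("Q2", False, ""), Item("Q3", True, "")]
    print(unentered_required_items(sheet))   # -> ['Q3']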


Third Embodiment

The third embodiment will be described next. A description of elements common to the above embodiments will be omitted as appropriate. The third embodiment further includes a function of compiling entries for each item across pieces of questionnaire result data. The display control unit 105 performs control to display the compiled result.



FIG. 8 is a diagram showing an example of functional configurations of an MFP 1000 and a questionnaire server 2000 according to the third embodiment. As shown in FIG. 8, the questionnaire server 2000 further includes a compiling unit 206 and a compiled result transmission control unit 207. A questionnaire application set in the MFP 1000 further includes a compiled result acquisition unit 108.


The compiling unit 206 compiles entries for each of the items across the pieces of questionnaire result data stored in the questionnaire result data storage unit 205. In this example, each piece of questionnaire result data is identified by a unique ID. For example, an identifier (ID) such as a number, a bar code, or a two-dimensional bar code may be given to a questionnaire sheet. Each of FIGS. 9A and 9B is a diagram showing an example of a questionnaire form. As shown in FIGS. 9A and 9B, the questionnaire form includes a plurality of questions and a name item. With regard to the questions, a “questionnaire recognition form (question)” as shown in FIG. 10A, for example, may be registered in advance, and the corresponding question item in the questionnaire result data can then be identified by image recognition. With regard to names, a “questionnaire recognition form (name)” as shown in FIG. 10B, for example, may similarly be registered in advance, and the name item in the questionnaire result data can be identified by image recognition. Also, in this example, whether an item is a required item or an optional item is determined by character recognition.


The compiling unit 206 loads the pieces of questionnaire result data stored in the questionnaire result data storage unit 205 and compiles the entries for each of the items. The compiling unit 206 generates a CSV file as shown in FIG. 11 as the compiled result.
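
As a rough illustration of the compilation, the sketch below writes one row per piece of questionnaire result data and one column per item using Python's standard csv module; the exact column layout of FIG. 11 is not reproduced here, so the dictionary keys are assumptions.

    import csv

    def compile_results_to_csv(results, path="compiled_results.csv"):
        """Write one row per questionnaire result and one column per item.

        ``results`` is assumed to be a list of dicts such as
        {"id": "0001", "name": "Suzuki", "Q1": "Yes", "Q2": "", "Q3": "Twice"}.
        """
        if not results:
            return
        fieldnames = sorted({key for result in results for key in result})
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(results)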


The compiled result transmission control unit 207 performs control to transmit the compiled result produced by the compiling unit 206 to the MFP 1000 (panel operation unit 15). The compiled result acquisition unit 108 of the questionnaire application acquires the compiled result and the display control unit 105 can perform control to display the compiled result.


Although the embodiments according to the present invention have been described above, the present invention is not limited to the embodiments exactly as described. The present invention can be embodied by modifying the components upon implementation without departing from the scope of the invention. Moreover, various inventions can be made by appropriately combining the plurality of components disclosed in the above embodiments.


For example, a part (for example, the checking unit 104) of the plurality of functions processed by the above questionnaire application may be provided on the side of the questionnaire server 20. In short, it is only required that an image processing system to which the present invention is applied has a configuration including a function corresponding to the above scan unit 12 (or a combination of the scan unit 12 and the MFP function control unit 102), a function corresponding to the checking unit 104, a function corresponding to the display control unit 105, and a function corresponding to the image correcting unit 106.


The programs to be executed by the above MFP 10 and questionnaire server 20 each may be recorded in a computer readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, a digital versatile disk (DVD), or a universal serial bus (USB) memory as an installable or executable file and provided. Alternatively, such programs may be provided or distributed via a network such as the Internet. Alternatively, various programs may be preloaded into a non-volatile recording medium such as a ROM and provided.


The present invention can provide a mechanism capable of detecting whether a manuscript includes an omission and, if there is an omission, allowing a user to easily perform an adding operation.


Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.

Claims
  • 1. An image processing apparatus, comprising:
    image processing circuitry that includes at least a scanner and a printer, and
    operation interface circuitry that provides an interface to control the scanner and the printer, wherein
    the image processing circuitry includes a first processor executing a first operating system, and a communication interface,
    the operation interface circuitry includes a second processor executing a second operating system, and an application that is executed under the second operating system, wherein the operation interface circuitry operates independently of the image processing circuitry, and
    the second processor is configured to
    display, by executing the application, a selection screen from which a questionnaire form registered in a server is selected,
    receive the questionnaire form that is selected using the selection screen from the server via the communication interface of the image processing circuitry,
    cause the printer to print out a questionnaire sheet of the questionnaire form,
    cause the scanner to read the questionnaire sheet in which a user has written an answer by hand,
    check if the read answered questionnaire sheet includes a non-answered item based on a result of an optical character recognition process performed on the read answered questionnaire sheet, and
    control to display, when the read answered questionnaire sheet includes the non-answered item, information for informing the user of the non-answered item.
  • 2. The image processing apparatus of claim 1, wherein the scanner is configured to read image information from the answered questionnaire sheet.
  • 3. The image processing apparatus of claim 2, wherein the operation interface circuitry further includes a questionnaire result data memory that stores the image information as questionnaire result data, and the second processor is further configured to compile entries for each of the items across pieces of the questionnaire result data.
  • 4. The image processing apparatus of claim 3, wherein the second processor performs control to display a compiled result produced by the second processor.
  • 5. An image processing method executed by a second processor included in operation interface circuitry of an image processing apparatus, the second processor executing a second operating system and an application, the image processing apparatus including image processing circuitry that includes at least a scanner and a printer, and the operation interface circuitry, which provides an interface to control the scanner and the printer, wherein the image processing circuitry includes a first processor executing a first operating system, and a communication interface, wherein the operation interface circuitry operates independently of the image processing circuitry, the method comprising:
    displaying, by executing the application, a selection screen from which a questionnaire form registered in a server is selected,
    receiving the questionnaire form that is selected using the selection screen from the server via the communication interface of the image processing circuitry,
    causing the printer to print out a questionnaire sheet of the questionnaire form,
    causing the scanner to read the questionnaire sheet in which a user has written an answer by hand,
    checking if the read answered questionnaire sheet includes a non-answered item based on a result of an OCR process performed on the read answered questionnaire sheet, and
    controlling to display, when the read answered questionnaire sheet includes the non-answered item, information for informing the user of the non-answered item.
Priority Claims (1)
Number Date Country Kind
2014-099797 May 2014 JP national
Foreign Referenced Citations (1)
Number Date Country
4980691 Apr 2012 JP
Related Publications (1)
Number Date Country
20150332492 A1 Nov 2015 US