Computer systems are currently in wide use. Many such systems allow users to perform data entry, and generate displays with data entry portions so that data can be entered into the computer system for later use.
For instance, some such computer systems include business systems. Business systems can include, for example, enterprise resource planning (ERP) systems, customer relationship management (CRM) systems, and line-of-business (LOB) systems, among others. These types of systems can have many different kinds of documents and thousands of different forms, each of which has fields, line items, or other items for data entry. As one example, a business system may provide functionality for scanning a vendor invoice. When the vendor invoice is scanned, optical character recognition (OCR) can be performed on information in the scanned invoice, and a corresponding business record (such as an invoice document in an ERP system) can be generated. A user then pulls up the captured image of the invoice, along with the document generated in the business system, and performs data entry on the document while viewing the image of the invoice.
As another example, a user may simply pull up a business document, such as an invoice document in an accounts payable system or an accounts receivable system, in order to perform data entry on it. The document may have various portions, such as header and footer portions, and data entry portions (such as a line item portion that contains line items) into which the user enters data.
Some current computer systems also have dual monitor functionality. That is, a user of a business system can have different documents displayed on two separate monitors.
The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
A data entry document is obtained. Information related to the data entry document is displayed on a first display device, and a data entry portion of the data entry document is displayed on a second display device. The related information can be a separate item (such as a scanned image or other item), or it can be a portion of the data entry document, itself.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.
Business system 100 illustratively includes processor 118, data store 120, application component 122, image scanning component 124, optical character recognition (OCR) component 126, user interface component 128, document management system 130 and it can include other items 132 as well. Data store 120, itself, illustratively includes entities 134, applications 136, workflows 138, processes 140, images 142 (that can be scanned into business system 100 using image scanning component 124), forms 144, documents 145 and it can include other business data records or other items 146 as well.
Entities 134 illustratively describe and define entities within business system 100. For instance, a customer entity describes and defines a customer. A vendor entity describes and defines a vendor. An invoice entity describes and defines an invoice. A receipt entity describes and defines a receipt. This list is but a small example of the various different types of entities that can be defined within business system 100.
Applications 136 are illustratively business applications, such as general ledger applications, accounts receivable and accounts payable applications, other accounting applications, inventory tracking applications, as well as a host of other business applications. Application component 122 illustratively runs applications 136, which can include workflows 138, business processes 140, and other items. Workflows 138 and processes 140 illustratively operate on business entities 134, images 142, forms 144, documents 145, and other business records 146. Workflows 138 and processes 140 illustratively enable user 114 to perform his or her tasks within business system 100. The processes and workflows can be automated, semi-automated, or manual.
Document management system 130 allows user 114 to perform document management operations on the documents 145 within business system 100. User interface component 128, either by itself or under the control of other items in business system 100, illustratively generates user interface displays 103, 104 and 105. In one embodiment, document management system 130 allows user 114 to display documents in a multi-screen manner, in which some items corresponding to a document are displayed on screen 106, while other portions of the document are displayed on display device 108 and still other portions can be displayed on device 109. This is described in greater detail below.
When an image 142 of a document or other item is scanned into system 100 using image scanning component 124, OCR component 126 can perform optical character recognition operations on the content of the scanned image, and open a corresponding document 145, or other record within business data store 120. For instance, when a user scans a vendor invoice using image scanning component 124, OCR component 126 can perform optical character recognition on the image 142 of the scanned invoice and generate a corresponding entity 134, document 145 or other record 146 within business system 100. This is only one exemplary way of generating documents 145 within business system 100 and they can be generated in other ways as well. For instance, they can be generated automatically, or by a user, without a corresponding image 142.
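By way of illustration only, the following Python sketch shows one way a scanned image could be run through OCR and mapped to a corresponding record. The ocr_extract() stub, the InvoiceDocument structure, and the field names are assumptions made for this example; they are not part of business system 100 as described above.

from dataclasses import dataclass, field

@dataclass
class InvoiceDocument:
    # Corresponding business record generated from the scanned image.
    vendor: str
    invoice_number: str
    line_items: list = field(default_factory=list)

def ocr_extract(image_bytes: bytes) -> dict:
    # Stand-in for an OCR engine; a real component would recognize text
    # in the scanned image and return it keyed by field.
    return {"vendor": "Example Vendor", "invoice_number": "INV-001", "line_items": []}

def generate_document_from_image(image_bytes: bytes) -> InvoiceDocument:
    # Run OCR on the image and open a corresponding invoice record.
    recognized = ocr_extract(image_bytes)
    return InvoiceDocument(
        vendor=recognized.get("vendor", ""),
        invoice_number=recognized.get("invoice_number", ""),
        line_items=recognized.get("line_items", []),
    )

print(generate_document_from_image(b"scanned image bytes"))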
Image/data entry display component 152 illustratively allows user 114 to display an image of a document on one display device, and display the corresponding business document that is opened within business system 100, on one or more other display devices. Document division display component 154 illustratively allows user 114 to pull up a business document and display some portions of the document on one display device, while displaying the other portions on one or more other display devices.
Both of these display operations increase the display real estate upon which the data entry user input mechanisms (e.g., fields, line items, etc.) can be displayed. This reduces the amount of scrolling and other types of user inputs that need to be provided by user 114 in order to perform data entry and other operations.
Synchronization component 156 illustratively synchronizes the displayed items on display devices 106, 108 and 109. By way of example, assume that business system 100 includes an image of a scanned vendor invoice with a set of line items, and also includes a corresponding business document that corresponds to that invoice and, itself, has a set of line items corresponding to the line items on the scanned vendor invoice. It may also be that user 114 is entering information in line items on the corresponding business document on display device 106, while viewing the line items on the scanned image on display device 108. If the user highlights a portion of the scanned image on display device 108, synchronization component 156 illustratively highlights the same portion of the corresponding business document on device 106. Also, if the user scrolls downwardly on the scanned image on device 108 in order to view additional line items, synchronization component 156 illustratively modifies the display of the displayed document on device 106, as needed, so that it is displaying line items that correspond to the line items being viewed on the scanned image.
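By way of illustration only, the following Python sketch shows one way such highlight and scroll synchronization between two display devices could be structured. The View class and its callbacks are assumptions made for this example and do not represent the actual implementation of synchronization component 156.

class View:
    # A simple stand-in for the content shown on one display device.
    def __init__(self, name):
        self.name = name
        self.scroll_line = 0
        self.highlighted = None

    def scroll_to(self, line):
        self.scroll_line = line
        print(f"{self.name}: scrolled to line {line}")

    def highlight(self, item_id):
        self.highlighted = item_id
        print(f"{self.name}: highlighted {item_id}")

class SynchronizationComponent:
    # Mirrors scrolling and highlighting between the image view and the
    # corresponding data entry document view.
    def __init__(self, image_view, document_view):
        self.image_view = image_view
        self.document_view = document_view

    def _other(self, source):
        return self.document_view if source is self.image_view else self.image_view

    def on_scroll(self, source, line):
        self._other(source).scroll_to(line)

    def on_highlight(self, source, item_id):
        self._other(source).highlight(item_id)

# Example: highlighting or scrolling the image view is mirrored on the document view.
sync = SynchronizationComponent(View("image display"), View("document display"))
sync.on_highlight(sync.image_view, "line item 7")
sync.on_scroll(sync.image_view, 51)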
The operation of document management system 130 will now be described in greater detail.
Business system 100 first receives an image of a document. This is indicated by block 160.
It will also be noted that the document can be a wide variety of different kinds of documents. It can be an invoice, it can be a receipt, or it can be a wide variety of other business documents for which data entry is to be performed. The system then obtains information from the image so that the corresponding business document can be opened in the business system 100. This is indicated by block 167. This can be done, for instance, by having OCR component 126 perform OCR on the document as indicated by block 168. The information can be obtained in other ways 169 as well.
Document management system 130 then illustratively generates a corresponding business document 145 in business system 100. For instance, if the scanned document is an invoice from a vendor, document management system 130 illustratively generates a corresponding invoice document within business system 100. Generating the corresponding business document in business system 100 is indicated by block 170.
Once the image and the document are generated in business system 100, user 114 illustratively provides user inputs in order to access those items in business system 100. Receiving user inputs to access the document is indicated by block 172 in the flow diagram.
Document management system 130 then accesses the image and the corresponding document. Multiple monitor display component 150 determines whether the system is in multi-screen display mode. This is indicated by block 174. If not, the image and the document can be displayed in the normal, single-screen mode.
However, if, at block 174, it is determined that the system is in the multi-screen display mode, then multiple monitor display component 150 displays the captured image on one screen (such as on the display screen of display device 106), and it opens and displays the corresponding document that was generated within business system 100 on one or more other screens (such as on the display screen of display device 108 or device 109, or both). This is indicated by block 176.
It can be seen that by displaying the captured image 180 of the scanned document on one display device, and the data entry document on the other display device, a large number of data entry fields 184 or line items 186 or other data entry input mechanisms can be displayed. The scanned image of the document is not taking up any display real estate on display device 108, so that real estate can be devoted to data entry.
Once multiple monitor display component 150 has generated the displays (such as those described above), user 114 illustratively interacts with the displayed items, for instance in order to perform data entry.
In response, document management system 130 illustratively performs operations based on the user interactions. This is indicated by block 196. For instance, if the user performs data entry by typing in textual data, document management system 130 receives that data on the displayed document or form. This is indicated by block 198. If the user interacts with content on one of the displays (such as by highlighting it), synchronization component 156 highlights the corresponding portion of the other display. This is indicated by block 199. If the user provides a scroll input to scroll either the captured image of the document, or the corresponding data entry document that was opened in business system 100, then synchronization component 156 illustratively scrolls the display on that device, as indicated by the user, and also synchronizes the display of the other device based upon the scroll input. This is indicated by block 200.
By way of example, assume that the user is viewing line items 1-50 of document image 180 on device 106, while performing data entry on the corresponding data entry document 182 on device 108. If the user scrolls image 180 on device 106 in order to view additional line items, synchronization component 156 illustratively scrolls document 182 on device 108 as well, so that the two displays continue to show corresponding line items.
In another embodiment, scroll synchronization is performed automatically, without any user input. Assume again that the user is viewing line items 1-50 of document image 180 on device 106. Assume also that the user has just completed data entry on line item 50 in the corresponding data entry document 182 on device 108. In one embodiment, document management system 130 automatically scrolls the document 182 to expose data entry field 151. Synchronization component 156 can then automatically scroll the image 180 on display 106 so that it shows the corresponding line item of the captured image 180.
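By way of illustration only, the following Python sketch shows one way this automatic scroll synchronization could be expressed: when data entry is completed on the last visible line item, the data entry document is advanced to expose the next field and the image display is scrolled to match. The callback names and the page size are assumptions made for this example.

def on_line_item_completed(scroll_document, scroll_image, completed_line,
                           lines_per_page=50):
    # When the last visible line item has been completed, advance the data
    # entry document to expose the next field and scroll the image to match.
    if completed_line % lines_per_page == 0:
        next_line = completed_line + 1
        scroll_document(next_line)
        scroll_image(next_line)

# Example: finishing line item 50 advances both displays to line item 51.
on_line_item_completed(lambda n: print("document scrolled to line", n),
                       lambda n: print("image scrolled to line", n),
                       completed_line=50)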
The user may also interact with the displayed information by providing inputs to move some of the information onto another display device. As an example, assume that the user is performing data entry on line items on the data entry document, but the user comes to a line where the user has questions and needs to call the customer before entering the information for that line item. In one embodiment, the user can move that line item onto a separate display device that displays lines that the user is to return to for further action (such as calling the customer). The user can do this, for instance, by selecting the line item and performing a drag and drop operation or another operation. In response, document division display component 154 removes that line item from the original display and places it on the display of items awaiting further action. Moving the information to a different display device is indicated by block 201.
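By way of illustration only, the following Python sketch shows one way a line item could be split off from the main data entry display and placed on a display of items awaiting further action. The list-based data structures are assumptions made for this example.

def move_to_follow_up(main_lines, follow_up_lines, index):
    # Remove a line item from the main data entry display and place it on the
    # display of items awaiting further action (for example, a customer call).
    line = main_lines.pop(index)
    follow_up_lines.append(line)

main_display = [{"line": 1, "item": "Widget"}, {"line": 2, "item": "Gadget"}]
follow_up_display = []
move_to_follow_up(main_display, follow_up_display, 1)   # user drags line 2 away
print(follow_up_display)   # [{'line': 2, 'item': 'Gadget'}]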
It will be noted that document management system 130 can perform other operations based on other user interactions with the displayed items. This is indicated by block 202 in the flow diagram.
Document management system 130 first receives user inputs indicating that the user wishes to access a data entry document in business system 100. This is indicated by block 210.
Again, document management system 130 determines whether the system is in the multi-screen display mode. This is indicated by block 212. In one embodiment, the system automatically enters the multi-screen display mode upon detecting that two display devices are connected. In another embodiment, component 154 automatically detects that the header/footer portions of the document and the line item portions of the document cannot all be displayed on a single display screen, and the system enters the multi-screen display mode in response. In another embodiment, the system allows the user to provide an input to indicate that the user wishes the system to be in the multi-screen display mode. In yet another embodiment, the system can automatically enter that mode and allow the user to override it. All of these and other configurations are contemplated herein.
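By way of illustration only, the following Python sketch shows one way the alternatives just described could be combined into a single mode decision. It is not the claimed behavior; the parameter names and the ordering of the checks are assumptions made for this example.

def use_multi_screen_mode(num_displays, document_fits_on_one_screen,
                          user_requested_multi, user_override_single=False):
    # The user's explicit override to single-screen mode wins.
    if user_override_single:
        return False
    # Multi-screen mode requires at least a second display device.
    if num_displays < 2:
        return False
    # Otherwise enter the mode when the user asks for it, or automatically
    # when the document cannot fit on a single display screen.
    return user_requested_multi or not document_fits_on_one_screen

# Example: two displays and a document too large for one screen.
print(use_multi_screen_mode(2, document_fits_on_one_screen=False,
                            user_requested_multi=False))   # True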
If the system is not in multi-screen display mode, then processing continues in the normal, single screen mode. This is indicated by block 214.
Assuming that the system is in the multi-screen display mode, component 154 divides the document into different portions that are to be displayed on the different display devices. This is indicated by block 216. Again, this can be done automatically as indicated by block 218, based on user inputs as indicated by block 220, or in other ways as indicated by block 222. Multiple monitor display component 150 then displays a first set of information from the data entry document on one screen and a second set of information from the same data entry document on the second screen, and it can further divide the display onto other screens as well. This is indicated by blocks 224, 226 and 227 in the flow diagram.
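By way of illustration only, the following Python sketch shows one way a data entry document could be divided into a first portion (header and totals) for one display device and a second portion (line items) for another. The dictionary-based document layout is an assumption made for this example.

def divide_document(document, num_screens=2):
    # Return one bundle of document sections per display screen.
    if num_screens < 2:
        return [document]                 # single-screen mode: whole document
    first_portion = {"header": document["header"], "totals": document["totals"]}
    second_portion = {"line_items": document["line_items"]}
    return [first_portion, second_portion]

doc = {"header": {"vendor": "Acme"}, "totals": {"amount": 0.0},
       "line_items": [{"line": 1, "qty": 2, "unit_price": 5.0}]}
portions = divide_document(doc)
# portions[0] -> header and totals for the first display device
# portions[1] -> line items for the second display device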
It can be seen that the user interface display shows a data entry document 230, along with a multi-screen user input mechanism 234 that can be actuated by the user to enter the multi-screen display mode.
It can also be seen that document 230 illustratively includes first portion 236 and a second portion 238. First portion 236 illustratively includes header portion 237 and totals portion 239. Header portion 237 displays a variety of different types of header information. That information can include vendor identification information, a related document section, various detail information and date sections, etc. Totals portion 239 illustratively includes information that is updated based on data entered in the second portion 238. For example, totals portion 239 includes totals that are aggregated from line items in the second portion 238. Thus, in one embodiment, as the user enters numerical information in the line items of portion 238, the totals in totals section 239 are automatically updated.
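By way of illustration only, the following Python sketch shows one way totals could be recomputed from the line items as data is entered in the second portion. The field names are assumptions made for this example.

def recompute_totals(line_items):
    # Aggregate the line items into the totals shown in the first portion.
    subtotal = sum(item.get("qty", 0) * item.get("unit_price", 0.0)
                   for item in line_items)
    return {"subtotal": subtotal, "line_count": len(line_items)}

line_items = [{"qty": 2, "unit_price": 5.0}, {"qty": 1, "unit_price": 3.5}]
print(recompute_totals(line_items))   # {'subtotal': 13.5, 'line_count': 2}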
When the user actuates the multi-screen user input mechanism 234, multiple monitor display component 150 of document management system 130 illustratively divides the display so that first portion 236 of document 230 is displayed on one display device (such as device 106) and second portion 238 is displayed on another display device (such as device 108).
The user then illustratively interacts with the displays on devices 106 and 108. Interacting with the displays is indicated by block 250 in the flow diagram.
In response, document management system 130 performs operations on the document, based upon the user interactions. This is indicated by block 256. For instance, where the user is entering information in the line items in second portion 238 on display device 108, document management system 130 receives that data on the document.
In one embodiment, as the user is entering numerical information on the line items of display device 108, document management system 130 illustratively updates the totals section 239 in the display on device 106. This is indicated by block 260.
Where the user provides inputs to move information from one display to another or to otherwise divide the information into more displays, system 130 performs the desired operation. This is indicated by block 261.
Of course, these are only some examples of user interactions and corresponding operations, and others can be performed as well. This is indicated by block 262.
At some point, the user will finish with the document. Until that point, processing simply reverts to blocks 250 and 256. This is indicated by block 264.
The present discussion has mentioned processors and servers. In one embodiment, the processors and servers include computer processors with associated memory and timing circuitry, not separately shown. They are functional parts of the systems or devices to which they belong, and are activated by, and facilitate the functionality of, the other components or items in those systems.
Also, a number of user interface displays have been discussed. They can take a wide variety of different forms and can have a wide variety of different user actuatable input mechanisms disposed thereon. For instance, the user actuatable input mechanisms can be text boxes, check boxes, icons, links, drop-down menus, search boxes, etc. They can also be actuated in a wide variety of different ways. For instance, they can be actuated using a point and click device (such as a track ball or mouse). They can be actuated using hardware buttons, switches, a joystick or keyboard, thumb switches or thumb pads, etc. They can also be actuated using a virtual keyboard or other virtual actuators. In addition, where the screen on which they are displayed is a touch sensitive screen, they can be actuated using touch gestures. Also, where the device that displays them has speech recognition components, they can be actuated using speech commands.
A number of data stores have also been discussed. It will be noted they can each be broken into multiple data stores. All can be local to the systems accessing them, all can be remote, or some can be local while others are remote. All of these configurations are contemplated herein.
Also, the figures show a number of blocks with functionality ascribed to each block. It will be noted that fewer blocks can be used so the functionality is performed by fewer components. Also, more blocks can be used with the functionality distributed among more components.
The description is intended to include both public cloud computing and private cloud computing. Cloud computing (both public and private) provides substantially seamless pooling of resources, as well as a reduced need to manage and configure underlying hardware infrastructure.
A public cloud is managed by a vendor and typically supports multiple consumers using the same infrastructure. Also, a public cloud, as opposed to a private cloud, can free the end users from managing the hardware. A private cloud may be managed by the organization itself, and the infrastructure is typically not shared with other organizations. The organization still maintains the hardware to some extent, such as performing installations and repairs.
In one embodiment, business system 100 is disposed in a cloud, and user 114 accesses it through a user device.
It will also be noted that system 100, or portions of it, can be disposed on a wide variety of different devices. Some of those devices include servers, desktop computers, laptop computers, tablet computers, or other mobile devices, such as palm top computers, cell phones, smart phones, multimedia players, personal digital assistants, etc.
Under other embodiments, applications or systems are received on a removable Secure Digital (SD) card that is connected to an SD card interface 15. SD card interface 15 and communication links 13 communicate with a processor 17 (which can also embody processors 116 or 118 from previous figures) that is also connected to memory 21 and input/output (I/O) components 23, as well as clock 25 and location system 27.
I/O components 23, in one embodiment, are provided to facilitate input and output operations. I/O components 23 for various embodiments of the device 16 can include input components such as buttons, touch sensors, multi-touch sensors, optical or video sensors, voice sensors, touch screens, proximity sensors, microphones, tilt sensors, and gravity switches, as well as output components such as a display device, a speaker, and/or a printer port. Other I/O components 23 can be used as well.
Clock 25 illustratively comprises a real time clock component that outputs a time and date. It can also, illustratively, provide timing functions for processor 17.
Location system 27 illustratively includes a component that outputs a current geographical location of device 16. This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.
Memory 21 stores operating system 29, network settings 31, applications 33, application configuration settings 35, data store 37, communication drivers 39, and communication configuration settings 41. Memory 21 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below). Memory 21 stores computer readable instructions that, when executed by processor 17, cause the processor to perform computer-implemented steps or functions according to the instructions. Processor 17 can be activated by other components to facilitate their functionality as well.
Examples of the network settings 31 include things such as proxy information, Internet connection information, and mappings. Application configuration settings 35 include settings that tailor the application for a specific enterprise or user. Communication configuration settings 41 provide parameters for communicating with other computers and include items such as GPRS parameters, SMS parameters, connection user names and passwords.
Applications 33 can be applications that have previously been stored on the device 16 or applications that are installed during use, although these can be part of operating system 29, or hosted external to device 16, as well.
The mobile device 16 can be, for example, a tablet computer, a smart phone, or another handheld or mobile device.
Note that other forms of the devices 16 are possible.
Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation, the data and program modules in RAM 832 can include an operating system, application programs, other program modules, and program data.
The computer 810 may also include other removable/non-removable volatile/nonvolatile computer storage media. By way of example only, these can include a hard disk drive that reads from or writes to non-removable, nonvolatile magnetic media, and an optical disk drive that reads from or writes to a removable, nonvolatile optical disk.
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), System-on-a-Chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
The drives and their associated computer storage media discussed above provide storage of computer readable instructions, data structures, program modules and other data for the computer 810.
A user may enter commands and information into the computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the monitor, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895.
The computer 810 is operated in a networked environment using logical connections to one or more remote computers, such as a remote computer 880. The remote computer 880 may be a personal computer, a hand-held device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 810. The logical connections can include a local area network (LAN) 871 and a wide area network (WAN) 873, but may also include other networks.
When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. The modem 872, which may be internal or external, may be connected to the system bus 821 via the user input interface 860, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 810, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, remote application programs may reside on a memory device of remote computer 880.
It should also be noted that the different embodiments described herein can be combined in different ways. That is, parts of one or more embodiments can be combined with parts of one or more other embodiments. All of this is contemplated herein.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.