TOUCH GESTURES RELATED TO INTERACTION WITH CONTACTS IN A BUSINESS DATA SYSTEM

Abstract
A business data system generates a user interface display showing a business data record. The business data system receives a touch gesture user input manipulating a contact within the business data system. The business data system manipulates the contact based on the touch gesture user input.
Description
BACKGROUND

There are a wide variety of different types of business data systems. Some such systems include customer relations management (CRM) systems, enterprise resource planning (ERP) systems, line-of-business (LOB) applications, and other business systems. These types of systems often enable users to create and maintain business data records. Some of these records include customer records that have details about customers, vendor records that include details of vendors, sales records, sales proposals, quotes, order records, records that contain product or service information, and records related to business contacts, among many others. The system can also include workflows that enable users to perform various tasks using the business data system.


An example of a workflow provided in some business data systems is one that allows users or organizations to track various business opportunities. For instance, if there is an opportunity to make a sale of products or services to another organization, the business data system allows users to enter information that may be helpful in converting that opportunity into an actual sale. Similarly, some such systems allow many other types of tasks or workflows to be performed as well. For instance, some systems allow users to prepare a quote for a potential customer. Then, when the customer accepts the terms of the quote, the user can convert the quote into an actual order. These are merely two examples of a wide variety of different types of tasks and workflows that can be performed within a business data system.


In performing these types of tasks and workflows, some users may wish to contact other people associated with the business data records being operated on. For instance, where a customer has a primary contact, it may be that the user wishes to call or otherwise communicate with that person in order to discuss the terms of a proposal or order. Therefore, some business data systems allow a user to search for contacts, and communicate with a given contact.


The use of mobile devices is also increasing rapidly. For instance, some mobile devices include smart phones, cellular telephones, and tablet computers, to name a few. These types of devices often have different types of user input mechanisms than desktop computers. For example, a desktop computer may have user interface displays with user input mechanisms that can be actuated by a point and click device (such as a mouse or trackball) or a hardware keyboard. However, mobile devices often have touch sensitive screens. This enables a user to actuate user input mechanisms using touch gestures, such as by using a finger, a stylus, or other device.


The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.


SUMMARY

A business data system generates a user interface display showing a business data record. The business data system receives a touch gesture user input to manipulate a contact within the business data system. The business data system manipulates the contact based on the touch gesture user input.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of one illustrative business data environment.



FIG. 2A is a flow diagram of one embodiment of the operation of the system shown in FIG. 1 in manipulating contact information based on a touch gesture.



FIG. 2B is a flow diagram illustrating one embodiment of the operation of the system shown in FIG. 1 in manipulating a contact, within a business record, using touch gestures.



FIGS. 3A-3F show exemplary user interface displays.



FIG. 4 shows one embodiment of the system shown in FIG. 1 in different architectures.



FIGS. 5-9 illustrate various mobile devices.



FIG. 10 is a block diagram of one illustrative computing environment.





DETAILED DESCRIPTION


FIG. 1 shows one illustrative embodiment of a business data architecture 90. Business data architecture 90 includes CRM system 100, CRM data store 102 and user device 104. User device 104 is shown generating user interface displays 106 for interaction by user 108. While CRM system 100 can be any business data system (such as a CRM system, an ERP system, an LOB system, or another business data application or business data system) it is described herein as a CRM system, for the sake of example only. CRM system 100 illustratively includes processor 110, user interface component 112, communication component 114, workflow/task component 118 and other CRM components 120.


Processor 110 is illustratively a computer processor with associated memory and timing circuitry (not separately shown). It is a functional part of CRM system 100 and is activated by, and facilitates the functionality of, the other components and items in CRM system 100. It will also be noted that while only a single processor 110 is shown, processor 110 can actually be multiple different computer processors as well. In addition, the multiple different computer processors used by system 100 can be local to system 100, or remote from system 100, but accessible by system 100.


User interface component 112 illustratively generates user interface displays with user input mechanisms that can be actuated by user 108. The user interface displays 106 (that user 108 interacts with) can be generated by user interface component 112 in CRM system 100 and passed to device 104 where they can be displayed (by device 104, as user interface displays 106) for interaction by user 108.


Communication component 114 illustratively facilitates communication among various users of CRM system 100, or between users of CRM system 100 and other individuals who may not necessarily be users of system 100. For instance, if user 108 wishes to communicate with a contact who may not necessarily have access to CRM system 100 (such as by initiating a phone call, an instant message, etc.), communication component 114 illustratively facilitates this type of communication. Therefore, communication component 114 can illustratively facilitate email communication, telephone or cellular telephone communication, instant message communication, chat room communication, or other types of communication.


Workflow/task component 118 illustratively uses user interface component 112 to generate user interface displays 106 so that user 108 can perform tasks and carry out workflows within CRM system 100. For instance, workflow/task component 118 illustratively allows user 108 to add contact information to CRM system 100, to track opportunities within system 100, to convert quotes to orders, or to input various other types of information or perform other tasks or workflows.


Other CRM components 120 illustratively provide the functionality for other things that can be done in CRM system 100. There are a wide variety of other things that users can do within CRM system 100, and these various functions are provided by other components 120.


CRM system 100 has access to CRM data store 102. CRM data store 102 illustratively stores a variety of different business data records. While data store 102 is shown as a single data store, it can be multiple different data stores. It can be local to system 100 or remote therefrom. Where it includes multiple different data stores, they can all be local to or remote from system 100, or some can be local while others are remote.


The data records can include, by way of example only, proposals 124, opportunities 126, quotes 128, customer data records 130, orders 132, product/service information 134, vendor records 136, contacts 138, workflows 140, and other business data records 142. Each of the business data records may be an object or entity, or another type of record. The records can include links to other records, or stand by themselves. All of these types of structures, and others, are contemplated herein.


Proposals 124 illustratively include business information for a proposal that can be made to a customer. Opportunities 126 illustratively include a wide variety of different types of information (some of which is described below with respect to FIGS. 3A-3F) that enable user 108 to track a sales opportunity within CRM system 100. Quotes 128 illustratively include information defining quotes that can be provided to customers. Customers 130 include customer information, such as contact information, address, billing information, etc. for different customers. Orders 132 illustratively include order information that reflects orders that have actually been made by various customers. Product/service information 134 illustratively includes information that describes products or services in CRM system 100. Vendors 136 illustratively include information describing vendors that are used by the organization in which CRM system 100 is deployed. Contacts 138 illustratively include contact information for various people that are either users of CRM system 100, or that are related to any of the other business data records in CRM data store 102 (for instance, they can be contacts at vendors, customers, other users, etc.). Workflows 140 illustratively define the various workflows that user 108 can perform within CRM system 100.
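By way of illustration only, the record types described above can be modeled as simple structures in which one record links to contact records or stands alone. The sketch below is a minimal, hypothetical rendering; the class names, fields, and sample values are assumptions for illustration and are not part of the described system.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Contact:
    """A contact record, e.g., a person at a vendor or customer."""
    name: str
    phone: str = ""
    email: str = ""

@dataclass
class Opportunity:
    """A sales opportunity record that links to related contact records."""
    title: str
    primary_contact: Optional[Contact] = None
    contacts: List[Contact] = field(default_factory=list)

# A record can link to another record (here, an opportunity linking to a contact).
phil = Contact(name="Phil B.", phone="555-0100", email="phil@example.com")
opp = Opportunity(title="ACME expansion", primary_contact=phil, contacts=[phil])
```

A record with no links (for instance, a standalone contact) is simply an instance with its optional link fields left empty.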


The workflows can take a wide variety of different forms. For instance, they may simply be data entry workflows, workflows posting information to a ledger, workflows fleshing out proposals or quotes, or a wide variety of other things. In any case, CRM system 100 accesses workflows 140 in order to generate the user interface displays 106 that can be manipulated by user 108, in order to perform the different workflows.


User device 104 illustratively includes user interface component 122, client CRM system 144, and processor 146. Client CRM system 144 is illustratively used by user device 104 in order to access CRM system 100. Of course, client CRM system 144 can be a stand-alone system as well, in which case it has access to CRM data store 102, or a different CRM data store. As described herein, however, it is simply used in order to access CRM system 100. This is but one option.


User interface component 122 illustratively generates the user interface displays 106 on user device 104. In the embodiment described herein, device 104 has a touch sensitive user interface display screen. Therefore, user interface component 122 illustratively generates the displays for display on the user interface display screen. The displays 106 have user input mechanisms 107 that can be actuated using touch gestures by user 108.


Processor 146 is illustratively a computer processor with associated memory and timing circuitry (not separately shown). Processor 146 is illustratively a functional part of device 104 and is activated by, and facilitates the functionality of, the other systems, components and items in device 104. While processor 146 is shown as a single processor, it could be multiple processors as well.


As briefly discussed above, user interface displays 106 are illustratively user interface displays that are provided for interaction by user 108. User input mechanisms 107 can be a wide variety of different types of user input mechanisms. For instance, they can be buttons, icons, text boxes, dropdown menus, soft keyboards or virtual keyboards or keypads, links, check boxes, active tiles that function as a link to underlying information and that actively or dynamically show information about the underlying information or a wide variety of other user input mechanisms that can be actuated using touch gestures.
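One user input mechanism mentioned above is the active tile: a link to underlying information that also actively or dynamically shows a summary of that information. The following is a minimal sketch of that idea; the `ActiveTile` class, its fields, and the sample tile are hypothetical names introduced only for illustration.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ActiveTile:
    """A tile that links to underlying information and shows a live summary."""
    label: str
    navigate: Callable[[], str]   # invoked when the tile is actuated by touch
    summary: Callable[[], str]    # dynamically renders a preview of the data

def render(tile: ActiveTile) -> str:
    # The display layer combines the static label with the live summary,
    # so the tile's face reflects the current state of the underlying data.
    return f"{tile.label}: {tile.summary()}"

news = ActiveTile(
    label="News",
    navigate=lambda: "opportunity/acme",       # link target when touched
    summary=lambda: "ACME opportunity closed",  # would be computed live
)
```

Actuating the tile would call `navigate()`, while the display periodically re-renders `summary()` to keep the tile's face current.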



FIG. 2A is a flow diagram illustrating one embodiment of the operation of the architecture shown in FIG. 1 in manipulating contacts using touch gestures within CRM system 100. User 108 first illustratively provides an input indicating that he or she wishes to access CRM system 100. This can launch client CRM system 144 which provides access to CRM system 100, or it can launch CRM system 100 and provide direct or indirect access. In response, CRM system 100 uses user interface component 112 to generate a user interface display 106 that displays a wall or other CRM display. The CRM display illustratively includes user input mechanisms 107 that allow user 108 to manipulate them and thus control and manipulate CRM system 100.



FIG. 3A is one illustrative example of a user interface display 200 that shows a wall or a CRM start screen. Display 200 is shown on user device 202 which is illustratively a tablet computer. Tablet computer 202 illustratively includes touch sensitive display screen 204. Of course, it will be noted that device 202 could be any other type of device that has a touch sensitive display screen. Start screen (or wall) 200 is shown with a plurality of tiles, or icons 206.


In the embodiment shown in FIG. 3A, the icons (or tiles) are divided generally into two different sections. The first section is a personal section 208, and the second section is a business section 210. These sections are exemplary only and may or may not be used. The tiles in section 208 are illustratively user-actuatable links which, when actuated by a user, cause a corresponding function to happen. For example, when either one of a pair of browser tiles 210 or 212 is actuated by the user, it launches a browser. When store tile 214 is actuated by the user, it launches an on-line store application or portal. Other tiles are shown for navigating to the control panel, for viewing weather, for viewing stock information of identified companies, or that indicate popular browsing sessions. Of course, the tiles shown in the personal section 208 are exemplary only and a wide variety of other tiles could be shown as well.


The business section 210 of start display 200 also includes a plurality of tiles which, when actuated by the user, cause the CRM system to take action. For instance, contact tile 216, when actuated by the user, opens a contact menu for the user. Opportunities tile 218, when actuated by the user, opens opportunity records or an opportunities menu that allows the user to navigate to individual opportunity records. The “my leads” tile 220, when actuated by the user, causes the CRM system 100 to open a menu or records corresponding to leads for the given user. A news tile 222 provides news about one or more items that have taken place in CRM system 100, and that are of interest to the user. In the example shown in FIG. 3A, tile 222 shows that an opportunity for the ACME Company has been closed by another sales person. When the user actuates tile 222, the CRM system 100 navigates the user to additional information about that closed opportunity. For instance, it may navigate the user to the opportunity record or to the sales record, or simply to the ACME Company general record. The other tiles, when actuated by the user, cause the CRM system to navigate the user to other places of interest or to launch other components of the CRM system. Those displayed are shown for the sake of example only.


Once the CRM system is launched and the start screen is displayed, CRM system 100 then receives a user touch gesture to manipulate a contact in CRM system 100. This is indicated by block 153 in FIG. 2A. By way of example, the user can simply touch contacts tile 216. This causes CRM system 100 to display a contact menu that allows the user to take a variety of other actions, such as to open a contact 155, edit a contact 157, add or delete contacts 159, initiate communication with one or more contacts 161, schedule a meeting with a contact 163, touch a search button to begin a search 191 for a contact, or perform other contact manipulation steps 165. In response, CRM system 100 manipulates the contact based on the touch gestures. This is indicated by block 167 in FIG. 2A.
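The contact-menu actions described above (open, add, edit, delete, etc.) amount to a dispatch from the touch gesture to the corresponding contact manipulation. The sketch below shows one way such a dispatch could look; the function name, the dictionary-backed contact store, and the action strings are assumptions for illustration, not part of the described system.

```python
def manipulate_contact(contacts: dict, action: str, name: str, details: dict = None):
    """Apply a contact-manipulation action selected via a touch gesture."""
    if action == "open":
        return contacts[name]                 # look up and return the contact
    if action == "add":
        contacts[name] = dict(details or {})  # create a new contact record
        return contacts[name]
    if action == "edit":
        contacts[name].update(details or {})  # merge in edited fields
        return contacts[name]
    if action == "delete":
        return contacts.pop(name)             # remove and return the contact
    raise ValueError(f"unsupported contact action: {action}")

# A user adds a contact, then edits it, via successive touch gestures.
book = {}
manipulate_contact(book, "add", "Phil B.", {"phone": "555-0100"})
manipulate_contact(book, "edit", "Phil B.", {"email": "phil@example.com"})
```

A real system would route each gesture on the contact menu to one of these actions, with the contact store backed by CRM data store 102 rather than an in-memory dictionary.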


It should also be noted that the user can manipulate contacts in other ways as well. For instance, instead of actuating contact tile 216, or one of the specific contacts represented by the photos or images on tile 216, the user may open up other business data records in CRM system 100. Many of those business data records may have individual people, or contacts, associated with them. Therefore, user 108 can manipulate contacts from within those business data records as well.



FIG. 2B is a flow diagram illustrating one embodiment of this type of contact manipulation. The first two blocks in FIG. 2B are similar to the first two blocks shown in FIG. 2A, and they are similarly numbered. Therefore, at block 150, the user launches the CRM system and at block 152 the CRM system displays a start display or wall or other CRM user interface display.


In the embodiment shown in FIG. 2B, the user then provides a touch gesture to open a CRM record. FIG. 3B shows one example of this. In the embodiment shown in FIG. 3B, the user has illustratively actuated tile 218, such as by touching it. In response, CRM system 100 displays an opportunities tile 224. Opportunities tile 224 is illustratively indicative of a new opportunity that has been created. The user then actuates tile 224, using a touch gesture (e.g., by touching it) with his or her finger 226. This causes CRM system 100 to open another user interface display, such as user interface display 228 shown in FIG. 3C, corresponding to the newly created opportunity. Receiving the user input to open the CRM record is indicated by block 154 in FIG. 2B, and having the CRM system 100 display the record is indicated by block 156.



FIG. 3C shows that the business record display 228 displays tiles or links (or other icons or user-actuatable items) that allow the user to view a variety of different kinds of information. For instance, display 228 includes a “people” or “contact” tile 230. Tile 230 identifies people either at the organization for which the opportunity has been generated, or at the organization that employs the CRM system, that are somehow related to the opportunity. By way of example, tile 230 may link user 108 to other people in the company that employs the CRM system, who are working on converting the opportunity into an actual sale. In addition, tile 230, when actuated by the user, may navigate the user to contact information for individuals at the company for which the opportunity was developed. In any case, if the user actuates tile 230, the CRM system 100 illustratively navigates user 108 to either a contact menu or a specific contact and allows the user to manipulate the contact in a similar way as described above with respect to FIG. 2A. For instance, the user can open a contact, delete or edit it, initiate communication, etc.



FIG. 3C also shows examples of other information that can be shown in a business data record. For instance, user interface display 228 includes a wide variety of actuatable items that take the user to other information corresponding to the opportunity. Invoices tile 232, when actuated by the user, navigates the user to another display where the user can view information related to invoices that correspond to this opportunity. Quotes tile 234, when actuated by the user, navigates the user to additional information about quotes generated for this company or somehow related to this opportunity. Document tile 236 illustratively navigates the user to other related documents corresponding to this opportunity, and activity tile 238 shows, in general, the amount of activity related to this opportunity. When the user actuates tile 238, CRM system 100 can navigate the user to additional displays showing the specific activity represented by the tile 238.


User interface display 228 also illustratively includes a “What's new” section 240. What's new section 240 can display posts by user 108, or other users of the CRM system, that are related to the opportunity being displayed.


In addition, as shown in FIG. 3C, display 228 is illustratively pannable in the directions indicated by arrow 242. By way of example, if the user uses his or her finger 226 and makes a swiping motion to the left or to the right, display 228 illustratively pans to the left or to the right based on the touch gesture.
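The swipe-to-pan behavior just described can be sketched as a small function that turns a swipe's displacement and duration into a new pan offset. This is a hypothetical sketch only: the momentum factor and clamping behavior are assumed values, not parameters taken from the described system.

```python
def pan(offset: float, swipe_dx: float, duration_s: float, max_offset: float) -> float:
    """Return a new horizontal pan offset for a swipe gesture.

    A quick swipe adds extra momentum, so it pans further than a slow
    swipe covering the same on-screen distance; the result is clamped
    to the pannable range [0, max_offset].
    """
    velocity = swipe_dx / max(duration_s, 1e-6)  # pixels per second
    momentum = 0.1 * velocity                    # assumed momentum factor
    new_offset = offset + swipe_dx + momentum
    return max(0.0, min(new_offset, max_offset))
```

A swipe to the left or right would supply a negative or positive `swipe_dx`, panning display 228 in the corresponding direction.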


User interface display 228 also illustratively includes an information section 244 that displays a primary contact tile 246 corresponding to a primary contact for this opportunity. A plurality of additional tiles 248 are displayed below the primary contact tile 246, and provide information corresponding to the individual represented by primary contact tile 246. The tiles 248, for instance, provide a preferred contact method for the primary contact, an amount of revenue generated by the primary contact, an indicator of the availability of the primary contact, a reputation or rating for the primary contact, a date when the opportunity corresponding to the primary contact closes, and a credit limit for the primary contact. Of course, all of the tiles 248 are exemplary only, and additional or different information corresponding to the primary contact, or other information, can be displayed as well.


Since the opportunity record represented by user interface 228 has a primary contact (or tile) 246 that represents the primary contact for the displayed opportunity, the user can manipulate that contact information from within the opportunity business record displayed in user interface display 228. If there were no contact information corresponding to the business opportunity displayed on display 228, CRM system 100 would illustratively provide a user input mechanism that allows user 108 to navigate to contact information corresponding to the displayed business data record. Determining whether contact information is displayed on the business data record represented by user interface display 228 is indicated by block 158 in FIG. 2B. If not, receiving the user touch gesture to show contact information is indicated by block 160.


As described above, in the embodiment shown in FIG. 3C, both the contact tile 230 and the primary contact tile 246 are shown in user interface display 228. Therefore, the user need not provide an additional touch gesture to see contact information.



FIG. 3C also shows that the user is using his or her finger 226 to actuate tile 246. Thus, user 108 is selecting primary contact 246, by actuating the corresponding tile. Receiving a touch gesture selecting a contact is indicated by block 162 in FIG. 2B.


Actuation of tile 246 causes CRM system 100 to generate another user interface display that allows the user to manipulate the contact information. As described above with respect to FIG. 2A, this can take a wide variety of different forms. However, in the embodiment discussed with respect to FIG. 2B, actuating primary contact tile 246 causes CRM system 100 to generate a display, such as user interface display 250, shown in FIG. 3D. It can be seen that a number of the items in user interface display 250 are the same as those shown in user interface display 228 in FIG. 3C, and they are similarly numbered. However, FIG. 3D also shows that, since the user actuated tile 246, this causes CRM system 100 to display communication bar 252. Communication bar 252 displays the specific contact options for the contact selected when the user actuated tile 246. Contact bar 252, itself, illustratively includes a plurality of user actuatable items each of which represents a method for contacting the primary contact represented by tile 246. For instance, contact bar 252 includes phone button 166, email button 168, instant messenger button 170 and other button 172. Displaying the specific contact options for the selected contact is indicated by block 164 in FIG. 2B.


When the user actuates any of the buttons in contact bar 252, this causes CRM system 100 to illustratively initiate communication with the primary contact using the selected method of communication. FIG. 3D shows that the user 108 has used his or her finger 226 to actuate the phone button 166. In the embodiment shown, the user simply touches button 166 to actuate it. Receiving the user touch gesture selecting a contact option is indicated by block 174 in FIG. 2B.
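The communication bar's behavior, selecting a communication method for a given contact and initiating it, can be sketched as a simple method-to-handler mapping. The function name, handler set, and sample contact below are hypothetical and shown only to illustrate the dispatch; a real communication component 114 would hand off to telephony, email, or messaging services.

```python
def initiate_communication(contact: dict, method: str) -> str:
    """Start a communication with the selected contact via the chosen method."""
    handlers = {
        "phone": lambda c: "dialing " + c["phone"],
        "email": lambda c: "composing email to " + c["email"],
        "im":    lambda c: "opening instant message with " + c["name"],
    }
    if method not in handlers:
        raise ValueError("no handler registered for method: " + method)
    return handlers[method](contact)

# The contact represented by the actuated primary contact tile.
phil = {"name": "Phil B.", "phone": "555-0100", "email": "phil@example.com"}
```

Each button on the bar corresponds to one key of `handlers`; actuating the phone button, for instance, dispatches to the `"phone"` handler for the selected contact.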


In response to the user actuating the phone button 166, communication component 114 in CRM system 100 illustratively initiates a phone call to the primary contact “Phil B.” represented by tile 246 and generates a suitable user interface display indicating that the call has been initiated.



FIG. 3E shows one exemplary user interface display 300 that illustrates this. It can be seen in display 300 that a phone call is underway to Phil B. This is indicated generally at 302. Display 300 shows the identity of the person being called, an indication that it is a phone call, and the elapsed time of the call. Of course, this information is exemplary only and a wide variety of additional or different information could be shown as well. In any case, user interface display 300 illustrates that a call has been placed.


A number of other exemplary things are shown in display 300. A list of objectives to be accomplished is shown generally at 306. A status bar 304 shows how many of the objectives for the phone call have been completed. The objectives listed are “product requirements”, “key decision makers”, “budget”, and “notes”. In one embodiment, these are the agenda items for the phone call. Of course, they may be simply “to do” items or a variety of other listed items as well.



FIG. 3E also shows that a soft keyboard is displayed generally at 308. This allows user 108 to type information into the text boxes at 306, or to otherwise enter alphanumeric information, using touch.


The communication (e.g., the telephone call) can proceed until one of the parties stops the communication. This can be done, in one embodiment, by user 108 simply touching an appropriate button on the user interface display. FIG. 3F shows one illustrative way of doing this. FIG. 3F shows user interface display 310, which is similar to user interface display 300 shown in FIG. 3E, and similar items are similarly numbered. However, it can be seen in FIG. 3F that the parties to the call have accomplished two of the agenda items, and therefore status bar 304 shows that two out of four items have been completed. Display 310 also shows that the user has touched a “hang up” button 312. Hang up button 312 allows user 108 to terminate the call, simply by actuating button 312. Receiving a user touch gesture to end the communication is indicated by block 178 in FIG. 2B. In response, communication component 114 of CRM system 100 hangs up the call, or disconnects the call, or otherwise discontinues the telephone communication. This is indicated by block 180 in FIG. 2B.
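The agenda tracking shown in status bar 304 (e.g., two of four items completed) can be sketched as a small session object. The class below is a hypothetical illustration; its name, fields, and agenda items mirror the example in FIGS. 3E-3F but are not an implementation of the described system.

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class CallSession:
    """Tracks a phone call's agenda items and their completion status."""
    contact: str
    objectives: List[str]
    completed: Set[str] = field(default_factory=set)

    def complete(self, item: str) -> None:
        # Only listed agenda items count toward the status bar.
        if item in self.objectives:
            self.completed.add(item)

    def status(self) -> str:
        return f"{len(self.completed)} of {len(self.objectives)} completed"

# The call to Phil B. with the four agenda items shown at 306.
call = CallSession("Phil B.", ["product requirements", "key decision makers",
                               "budget", "notes"])
call.complete("product requirements")
call.complete("budget")
```

After two items are checked off, `call.status()` reports two of four completed, which is what status bar 304 would render.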


It can thus be seen that a user can quickly and easily manipulate contact information within a CRM system, or other business data system. When contact information is displayed, the user can use a touch gesture to manipulate it. This can make manipulation of contact information much easier and less cumbersome.


It will be noted that the touch gestures mentioned herein can take a wide variety of different forms. They can be simple touches or taps, swipes, slides, multi-touch inputs, positional gestures (gestures at a specific position or location on the screen), brushing, multi-finger gestures, touch and hold gestures, etc. The speed of the gestures can be used for control as well (e.g., a quick swipe can pan quickly while a slow swipe pans slowly, etc.). These and other gestures are all contemplated herein.
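Distinguishing among the gesture forms listed above typically comes down to the number of touch points, how far the finger moved, and how long the touch lasted. The sketch below classifies a raw touch event along those lines; the thresholds are illustrative assumptions only, not values from the described system.

```python
def classify_gesture(duration_s: float, distance_px: float,
                     touch_points: int = 1) -> str:
    """Classify a raw touch event into a coarse gesture type.

    Thresholds (10 px, 0.5 s, 0.3 s) are assumed values for illustration.
    """
    if touch_points > 1:
        return "multi-touch"
    if distance_px < 10:  # finger stayed (nearly) in place
        return "touch-and-hold" if duration_s >= 0.5 else "tap"
    # Moving gestures: speed distinguishes a quick swipe from a slow slide,
    # which can in turn control behavior such as panning speed.
    return "swipe" if duration_s < 0.3 else "slide"
```

A gesture recognizer along these lines would feed the classified gesture (and its speed) to the display logic, e.g., panning quickly for a swipe and slowly for a slide.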



FIG. 4 is a block diagram of system 100, shown in FIG. 1, except that it is disposed in a cloud computing architecture 500. Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers the services. In various embodiments, cloud computing delivers the services over a wide area network, such as the internet, using appropriate protocols. For instance, cloud computing providers deliver applications over a wide area network and they can be accessed through a web browser or any other computing component. Software or components of system 100 as well as the corresponding data, can be stored on servers at a remote location. The computing resources in a cloud computing environment can be consolidated at a remote data center location or they can be dispersed. Cloud computing infrastructures can deliver services through shared data centers, even though they appear as a single point of access for the user. Thus, the components and functions described herein can be provided from a service provider at a remote location using a cloud computing architecture. Alternatively, they can be provided from a conventional server, or they can be installed on client devices directly, or in other ways.


The description is intended to include both public cloud computing and private cloud computing. Cloud computing (both public and private) provides substantially seamless pooling of resources, as well as a reduced need to manage and configure underlying hardware infrastructure.


A public cloud is managed by a vendor and typically supports multiple consumers using the same infrastructure. Also, a public cloud, as opposed to a private cloud, can free up the end users from managing the hardware. A private cloud may be managed by the organization itself and the infrastructure is typically not shared with other organizations. The organization still maintains the hardware to some extent, such as installations and repairs, etc.


In the embodiment shown in FIG. 4, some items are similar to those shown in FIG. 1 and they are similarly numbered. FIG. 4 specifically shows that CRM system 100 (or, of course, another business data system such as an ERP system, LOB application, etc.) is located in cloud 502 (which can be public, private, or a combination where portions are public while others are private). Therefore, user 108 uses a user device 104 to access those systems through cloud 502.



FIG. 4 also depicts another embodiment of a cloud architecture. FIG. 4 shows that it is also contemplated that some elements of business system 100 (or architecture 90) are disposed in cloud 502 while others are not. By way of example, data store 102 can be disposed inside of cloud 502 (with CRM system 100) or outside of cloud 502, and accessed through cloud 502. In another embodiment, communication component 114 is also outside of cloud 502. Regardless of where they are located, they can be accessed directly by device 104, through a network (either a wide area network or a local area network), they can be hosted at a remote site by a service, or they can be provided as a service through a cloud or accessed by a connection service that resides in the cloud. All of these architectures are contemplated herein.


It will also be noted that system 100, or portions of it, can be disposed on a wide variety of different devices. Some of those devices include servers, desktop computers, laptop computers, tablet computers, or other mobile devices, such as palmtop computers, cell phones, smart phones, multimedia players, personal digital assistants, etc.



FIG. 5 is a simplified block diagram of one illustrative embodiment of a handheld or mobile computing device that can be used as a user's or client's handheld device 16 (e.g., device 104), in which the present system (or parts of it) can be deployed. FIGS. 6-9 are examples of handheld or mobile devices.



FIG. 5 provides a general block diagram of the components of a client device 16 that can run components of system 100, that interacts with system 100, or both. In the device 16, a communications link 13 is provided that allows the handheld device to communicate with other computing devices and, under some embodiments, provides a channel for receiving information automatically, such as by scanning. Examples of communications link 13 include an infrared port, a serial/USB port, a cable network port such as an Ethernet port, and a wireless network port allowing communication through one or more communication protocols including General Packet Radio Service (GPRS), LTE, HSPA, HSPA+ and other 3G and 4G radio protocols, 1Xrtt, and Short Message Service, which are wireless services used to provide cellular access to a network, as well as 802.11 and 802.11b (Wi-Fi) protocols, and the Bluetooth protocol, which provide local wireless connections to networks.


Under other embodiments, applications or systems (like system 100) are received on a removable Secure Digital (SD) card that is connected to an SD card interface 15. SD card interface 15 and communication links 13 communicate with a processor 17 (which can also embody processor 146 from FIG. 1) along a bus 19 that is also connected to memory 21 and input/output (I/O) components 23, as well as clock 25 and location system 27. I/O components 23, in one embodiment, are provided to facilitate input and output operations. I/O components 23 for various embodiments of the device 16 can include input components such as buttons, touch sensors, multi-touch sensors, optical or video sensors, voice sensors, touch screens, proximity sensors, microphones, tilt sensors, and gravity switches, and output components such as a display device, a speaker, and/or a printer port. Other I/O components 23 can be used as well.


Clock 25 illustratively comprises a real time clock component that outputs a time and date. It can also, illustratively, provide timing functions for processor 17.


Location system 27 illustratively includes a component that outputs a current geographical location of device 16. This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.


Memory 21 stores operating system 29, network settings 31, applications 33, application configuration settings 35, data store 37, communication drivers 39, and communication configuration settings 41. Memory 21 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below). Memory 21 stores computer readable instructions that, when executed by processor 17, cause the processor to perform computer-implemented steps or functions according to the instructions. System 100 or the items in data store 102, for example, can reside in memory 21. Similarly, device 16 can have a client business system 24 (e.g., client CRM system 144) which can run various business applications or embody parts or all of business system 100. Processor 17 can be activated by other components to facilitate their functionality as well.


Examples of the network settings 31 include things such as proxy information, Internet connection information, and mappings. Application configuration settings 35 include settings that tailor the application for a specific enterprise or user. Communication configuration settings 41 provide parameters for communicating with other computers and include items such as GPRS parameters, SMS parameters, connection user names and passwords.
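As an illustrative sketch only, configuration settings such as those described above might be represented as a simple mapping with a lookup helper; every key and value below is invented for illustration and is not taken from the disclosure.

```python
# Illustrative sketch only: a hypothetical representation of the
# communication configuration settings 41 described above. All keys
# and values are invented for illustration.
communication_config = {
    "gprs": {"apn": "example.apn", "auth": "PAP"},       # hypothetical GPRS parameters
    "sms": {"service_center": "+10000000000"},           # hypothetical SMS parameters
    "connection": {"username": "user", "password": "secret"},
}

def get_setting(config, section, key, default=None):
    """Look up one setting, falling back to a default if absent."""
    return config.get(section, {}).get(key, default)

print(get_setting(communication_config, "gprs", "apn"))  # example.apn
```

A lookup with a default, as in `get_setting`, is one common way a device tolerates missing parameters while still tailoring communication behavior per enterprise or user.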


Applications 33 can be applications that have previously been stored on the device 16 or applications that are installed during use, although these can be part of operating system 29, or hosted external to device 16, as well.



FIG. 6 shows one embodiment in which device 16 is a tablet computer 600 (also shown in FIGS. 3A-3F). Screen 602 can be a touch screen (so touch gestures from a user's finger can be used to interact with the application) or a pen-enabled interface that receives inputs from a pen or stylus. It can also use an on-screen virtual keyboard. Of course, it might also be attached to a keyboard or other user input device through a suitable attachment mechanism, such as a wireless link or USB port, for instance. Computer 600 can also illustratively receive voice inputs as well.



FIGS. 7, 8 and 9 provide additional examples of devices 16 that can be used, although others can be used as well. In FIG. 7, a mobile phone 45 (or feature phone) is provided as the device 16. Phone 45 includes a set of keypads 47 for dialing phone numbers, a display 49 capable of displaying images including application images, icons, web pages, photographs, and video, and control buttons 51 for selecting items shown on the display. The phone includes an antenna 53 for receiving cellular phone signals such as General Packet Radio Service (GPRS), 1Xrtt, and Short Message Service (SMS) signals. In some embodiments, phone 45 also includes a Secure Digital (SD) card slot 55 that accepts an SD card 57.


The mobile device of FIG. 8 is a personal digital assistant (PDA) 59, or a multimedia player or a tablet computing device, etc. (hereinafter referred to as PDA 59). PDA 59 includes an inductive screen 61 that senses the position of a stylus 63 (or other pointer, such as a user's finger) when the stylus is positioned over the screen. This allows the user to select, highlight, and move items on the screen as well as draw and write. PDA 59 also includes a number of user input keys or buttons (such as button 65) which allow the user to scroll through menu options or other display options which are displayed on display 61, and allow the user to change applications or select user input functions, without contacting display 61. Although not shown, PDA 59 can include an internal antenna and an infrared transmitter/receiver that allow for wireless communication with other computers, as well as connection ports that allow for hardware connections to other computing devices. Such hardware connections are typically made through a cradle that connects to the other computer through a serial or USB port. As such, these connections are non-network connections. In one embodiment, mobile device 59 also includes an SD card slot 67 that accepts an SD card 69.



FIG. 9 is similar to FIG. 8 except that the phone is a smart phone 71. Smart phone 71 has a touch sensitive display 73 that displays icons or tiles or other user input mechanisms 75. Mechanisms 75 can be used by a user to access a business data system (like CRM system 100), run applications, make calls, perform data transfer operations, etc. In general, smart phone 71 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone.
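The general pattern described throughout, receiving a touch gesture on a contact user input mechanism and manipulating the corresponding contact information, can be sketched as a simple dispatcher. This is a minimal illustration under assumed gesture-to-action mappings; all class names, gesture names, and data are hypothetical and not taken from the disclosure.

```python
# Minimal sketch of dispatching a touch gesture on a contact user input
# mechanism to a contact-manipulation action. All names and the
# gesture-to-action mapping are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Contact:
    name: str
    email: str
    phone: str

@dataclass
class BusinessDataRecord:
    title: str
    contacts: list = field(default_factory=list)

def handle_gesture(record, gesture, contact_index=0):
    """Map a touch gesture on a contact mechanism to an action."""
    contact = record.contacts[contact_index]
    if gesture == "tap":             # tap: surface additional contact options
        return ["email:" + contact.email, "call:" + contact.phone]
    if gesture == "press_and_hold":  # press and hold: open an edit surface
        return "edit:" + contact.name
    if gesture == "swipe":           # swipe: delete the contact from the record
        record.contacts.pop(contact_index)
        return "deleted"
    raise ValueError("unrecognized gesture: " + gesture)

record = BusinessDataRecord("Sales proposal",
                            [Contact("Ann", "ann@example.com", "555-0100")])
print(handle_gesture(record, "tap"))  # ['email:ann@example.com', 'call:555-0100']
```

The dispatcher shape, one branch per gesture, each returning or performing a contact manipulation, is one straightforward way a touch-enabled client could route gestures to the add, edit, delete, and communicate operations the description contemplates.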


Note that other forms of the devices 16 are possible.



FIG. 10 is one embodiment of a computing environment in which system 100 (for example) can be deployed. With reference to FIG. 10, an exemplary system for implementing some embodiments includes a general-purpose computing device in the form of a computer 810. Components of computer 810 may include, but are not limited to, a processing unit 820 (which can comprise processor 110), a system memory 830, and a system bus 821 that couples various system components including the system memory to the processing unit 820. The system bus 821 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus. Memory and programs described with respect to FIG. 1 can be deployed in corresponding portions of FIG. 10.


Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.


The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation, FIG. 10 illustrates operating system 834, application programs 835, other program modules 836, and program data 837.


The computer 810 may also include other removable/non-removable volatile/nonvolatile computer storage media. By way of example only, FIG. 10 illustrates a hard disk drive 841 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 851 that reads from or writes to a removable, nonvolatile magnetic disk 852, and an optical disk drive 855 that reads from or writes to a removable, nonvolatile optical disk 856 such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 841 is typically connected to the system bus 821 through a non-removable memory interface such as interface 840, and magnetic disk drive 851 and optical disk drive 855 are typically connected to the system bus 821 by a removable memory interface, such as interface 850.


The drives and their associated computer storage media discussed above and illustrated in FIG. 10, provide storage of computer readable instructions, data structures, program modules and other data for the computer 810. In FIG. 10, for example, hard disk drive 841 is illustrated as storing operating system 844, application programs 845, other program modules 846, and program data 847. Note that these components can either be the same as or different from operating system 834, application programs 835, other program modules 836, and program data 837. Operating system 844, application programs 845, other program modules 846, and program data 847 are given different numbers here to illustrate that, at a minimum, they are different copies.


A user may enter commands and information into the computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the monitor, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895.


The computer 810 is operated in a networked environment using logical connections to one or more remote computers, such as a remote computer 880. The remote computer 880 may be a personal computer, a hand-held device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 810. The logical connections depicted in FIG. 10 include a local area network (LAN) 871 and a wide area network (WAN) 873, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.


When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. The modem 872, which may be internal or external, may be connected to the system bus 821 via the user input interface 860, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 810, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 10 illustrates remote application programs 885 as residing on remote computer 880. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.


It should also be noted that features from different embodiments can be combined. That is, one or more features from one embodiment can be combined with one or more features of other embodiments. This is contemplated herein.


Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims
  • 1. A computer-implemented method of manipulating contact information in a business data system, comprising: displaying a user interface display showing business information from the business data system; receiving a user touch gesture on the user interface display, to manipulate contact information in the business data system; and manipulating the contact information in the business data system based on the user touch gesture.
  • 2. The computer-implemented method of claim 1 wherein displaying a user interface display comprises: displaying a user actuatable contact user input mechanism showing a contact in the business data system.
  • 3. The computer-implemented method of claim 2 wherein receiving the user touch gesture comprises: receiving a user touch on the contact user input mechanism.
  • 4. The computer-implemented method of claim 3 wherein manipulating the contact information comprises: accessing additional contact information corresponding to the contact; and displaying the additional contact information.
  • 5. The computer-implemented method of claim 4 wherein the additional contact information comprises a plurality of contact options for the contact and wherein displaying the additional contact information comprises: displaying a user actuatable input mechanism corresponding to each of the contact options.
  • 6. The computer-implemented method of claim 5 and further comprising: receiving a user touch gesture on a given one of the input mechanisms corresponding to a given contact option; and initiating communication with the contact using the given contact option.
  • 7. The computer-implemented method of claim 1 wherein receiving a user touch gesture to manipulate contact information comprises: receiving a user touch gesture requesting a business data record from the business data system; and displaying the business data record with a contact user input mechanism for a contact corresponding to the business data record.
  • 8. The computer-implemented method of claim 7 wherein receiving a user touch gesture to manipulate contact information further comprises: receiving a user touch gesture actuating the contact user input mechanism on the business data record.
  • 9. The computer-implemented method of claim 8 wherein displaying the business data record with the contact user input mechanism for the contact, comprises: displaying a contact menu with an input mechanism corresponding to a contact search mechanism, and wherein receiving the user touch gesture actuating the contact user input mechanism comprises receiving a touch gesture initiating a search for a specific contact.
  • 10. The computer-implemented method of claim 8 wherein receiving a user touch gesture to manipulate contact information further comprises: receiving the user touch gesture to perform one of adding, deleting or editing the contact information.
  • 11. The computer-implemented method of claim 8 wherein receiving a user touch gesture to manipulate contact information further comprises: receiving the user touch gesture to initiate communication with an entity represented by the contact information.
  • 12. The computer-implemented method of claim 8 wherein receiving a user touch gesture to manipulate contact information further comprises: receiving the user touch gesture to schedule a meeting with an entity represented by the contact information.
  • 13. A business data system, comprising: a user interface component; a business data component generating a user interface display, using the user interface component, showing a business data record with a user input mechanism showing a contact corresponding to the business data record, receiving a user touch gesture through the user input mechanism to manipulate contact information for the contact and manipulating the contact information in the business data system based on the touch gesture; and a computer processor being a functional part of the business data system and activated by the user interface component and the business data component to facilitate generating the user interface display, receiving the touch gesture and manipulating the contact information.
  • 14. The business data system of claim 13 and further comprising: a communication component, the touch gesture initiating communication with the contact, using the communication component.
  • 15. The business data system of claim 14 wherein the business data component displays a communication input mechanism representing a communication option for communicating with the contact, and wherein the touch gesture actuates the communication input mechanism, the communication component initiating communication with the contact using the communication option represented by the communication input mechanism.
  • 16. The business data system of claim 15 wherein the communication option comprises a selected one of electronic mail, telephone, cellular telephone, and instant messaging.
  • 17. The business data system of claim 13 wherein the business data component manipulates the contact information, based on the user touch gesture by performing one of adding contact information, deleting contact information, editing contact information and displaying additional contact information.
  • 18. A mobile device, comprising: a touch sensitive display screen; a user interface component displaying a user interface display from a business data system on the touch sensitive display screen and receiving a user touch gesture through the user interface display; a business data component manipulating contact information in the business data system based on the touch gesture; and a computer processor being a functional component of the mobile device and activated by the user interface component and the business data component to facilitate displaying, receiving the user touch gesture, and manipulating the contact information.
  • 19. The mobile device of claim 18 wherein the user interface component receives a touch gesture requesting display of a business data record from the business data system, the business data record including contact information for a contact corresponding to the business data record.
  • 20. The mobile device of claim 19 wherein the business data component manipulates the contact information by at least one of initiating communication with the contact or scheduling a meeting with the contact.
Provisional Applications (1)
Number Date Country
61612148 Mar 2012 US