Users of applications such as audiovisual games are presented with a number of challenges designed to enhance their enjoyment of the game. Players often reach points in a game where they have difficulty passing a particular challenge. Games present users with various levels of difficulty, which makes passing a stage at a novice level different than passing the same challenge at a more difficult level. Some users resort to searching for help via the Internet. Often, third parties post written descriptions and "walk-through" gameplay videos depicting how to pass a particularly difficult stage in a game. Normally, this means the user must stop game play, construct a search, and view the help before returning to the game. The user usually has limited information about the context of where they are in the game (for example, the name of the area, or which boss they are encountering), which makes it difficult for the user to build a search query themselves.
Technology is presented which provides a user of an application with assistance by accessing help information available from third-party sources or a dedicated help database when the user encounters a problem in the application. The technology has particular applicability to gaming applications, where a user may find themselves troubled by a particular scenario or task in the game that they cannot overcome without help. A system and method of providing help to an application user generates a context-based help search for publicly available help information made available by third parties on public networks. The system and method determine whether a user would benefit from assistance in using a primary computing application and a context of use of the primary computing application. A context-based search query is executed to retrieve publicly available, network-resident help information relating to the needed help, and the results are output to a display.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Technology is presented which provides the user of an application with context-relevant help. The context-relevant help is provided based on information made available by third parties, which is accessible by searching publicly available sources or a collection of the publicly available sources stored in a database provided by a multiuser application service. A determination is made that a user needs help in an application, and the context of the user's progress or work in the application is determined. A context-based search query is executed against publicly available help information, and the results are returned to the application user on the same processing device or a companion processing device. The technology is particularly advantageous to game players having difficulty passing achievements where numerous third parties have provided instructions on how to complete troublesome tasks. While the technology is advantageously used in games, it may be used with any of a number of types of applications.
At 10, a determination is made as to whether or not a user has reached a point in the application where the user needs assistance. Methods for determining whether or not a user has reached such a point are described herein. If the determination at 10 is that the user needs assistance, then the context of the user's status within the application is determined at 15. The user's context in an application comprises information surrounding the nature of the issue the user is having in the application. Where the application is a game, the context may include a point in the game where the user has a problem completing a particular task. In story-based games, checkpoints are provided which mark a user's progress through the particular game story. Generally, there are tasks in the story which must be completed in order to reach a next checkpoint or achievement. In addition, a user's status within the game may be reflected by user skill level, in-game inventory, play history, and record in completing previous tasks. All such information comprises the context of the game. When a user reaches a particular point in the application where the user repeats the same in-application tasks without success, a determination can be made as to the game context, and a search can be developed around terms related to the particular application and task. Hence, in a game application, the user skill level, user inventory, game level, and other aspects of the context are determined at 15.
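The context determination at 15 might be sketched as follows. This is a minimal illustrative sketch, not the described implementation; the field names and the shape of the game-state dictionary are assumptions introduced for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class GameContext:
    """Snapshot of a player's status used later to build a help query."""
    game_title: str
    level_name: str
    current_task: str
    skill_level: str = "novice"
    inventory: list = field(default_factory=list)
    failed_attempts: int = 0

def capture_context(game_state: dict) -> GameContext:
    """Derive a context record from state the primary application exposes
    (here assumed to be a simple dictionary of reported values)."""
    return GameContext(
        game_title=game_state.get("title", "unknown"),
        level_name=game_state.get("level", "unknown"),
        current_task=game_state.get("task", "unknown"),
        skill_level=game_state.get("difficulty", "novice"),
        inventory=list(game_state.get("inventory", [])),
        failed_attempts=game_state.get("attempts", 0),
    )
```

The same record could equally be populated from events distributed by a multiuser event service rather than read directly from the application.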
Once the user context within the game is determined at 15, a search query for help within the determined context is formulated at 20. The search query, as discussed below, can be run by any one of a number of standard commercial search engines which access publicly available, network-based data sources to seek help information. In another embodiment, a search query is formulated to run against a database which collects help information from the publicly available data sources and categorizes the data in one or more ways, including, for example, organizing the data by application and application context. Examples of publicly available, network-based data sources include websites, web videos, blogs, and other published information where other users have provided descriptions and/or demonstrations of how to achieve a particular task in the context of the application.
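The query formulation at 20, including the choice between a public search engine and the categorized help database, can be sketched as follows. The dictionary keys and the source labels are illustrative assumptions:

```python
def formulate_search(context: dict, use_help_db: bool = False) -> dict:
    """Build a keyword query from the determined context and choose the
    data source to run it against."""
    terms = [context["game_title"], context["level_name"], context["current_task"]]
    query = " ".join(t for t in terms if t)  # drop any empty context fields
    if use_help_db:
        # Run against a database that categorizes collected help
        # information by application and application context.
        return {"source": "help_db", "application": context["game_title"], "query": query}
    # Otherwise fall back to a standard commercial web search engine.
    return {"source": "web_search", "query": query}
```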
At 30, the query formulated at 20 is run to retrieve a listing of potential help results. In one embodiment, a listing of results may be provided to the user as a result of the search at 40. When a user selects one of the results for presentation, the result is rendered in an interface for the user.
Computing environment 300 is utilized to execute a primary application 320 by an application user. (The application user is not illustrated.) Computing environment 300 generally includes a network interface 305, a processor 306, and a memory 307. Memory may include, for example, a primary application 320, user context information 330, and a context-based search application 335. A display 301 may be coupled to the computing environment 300. The primary application 320, when executed, will provide a context of the user's performance within the application. This context information 330 may be maintained by the application or may be derived by accessing information provided by the application. In another embodiment, the context information is derived from events distributed by the application to a multi-user event service. A user context-based search application 335 can communicate via network 80 with network resident third-party data 350. Network resident third-party data 350 may be provided on one or more servers or websites which provide access to third-party generated information on the use of the primary application, including descriptions, presentations, illustrations, and tutorials on how to complete tasks in the primary application, all of which are accessible using standard information protocols. Context-based search application 335 can utilize a standard web search engine, such as Microsoft's BING® search engine or the Google® search engine, to access network resident third-party data 350. Alternatively or in addition, context-based search application 335 may incorporate its own search technology. A context-based search result can provide an output familiar to average users comprising a listing of the results retrieved, with hyperlinks in the list which retrieve the content and display the content in known rendering media. Such rendering media may include a web browser with known plug-ins for rendering graphics, audio, and video information.
When a determination is made that help for the primary application is to be obtained, context-based search application 335 will generate a search, based on the context information 330, via network 80 against the network resident third-party data 350. Potential help results are returned to a user interface on display 301 provided by the computing environment.
In the embodiment of
In the embodiment shown in
In yet another embodiment, the same configuration illustrated in
In the embodiment shown in
As illustrated in
Given the various embodiments illustrated in
The technology may be used with an event service 102 provided by a multiuser application service 370. Any type of application which can be developed by a primary application developer, and for which supplemental application developers would desire to develop secondary applications, can benefit from the technology described herein.
Real time event service 102 includes a real time data system 110, a repository data system 140, a game management service 126, a user authentication service 124, an API 138, and user account records 130. Applications are generally executed on processing device 100, and the primary applications (such as games) generate and output application events. In accordance with the technology, discrete or aggregated events are transmitted to the real time event service 102 and to secondary applications such as search application 335 executing on other processing devices, such as computing environment 312. Examples of events are those which may occur in the context of a game. For example, in a racing game, top speed, average speed, wins, losses, placement, and the like are all events which may occur. In an action game, shots fired, scores, kills, weapons used, levels achieved, and the like, as well as other types of achievements, are all events that may occur. In one embodiment, statistics are generated for events by the multiuser gaming service.
Components of the multiuser event service 102, including a repository data system 140 and real time data system 110 as well as API 138, are illustrated along with event flow and dataflow between the systems. As event data is generated by primary application processing device 100, the events are collected by service 102 and transmitted through the API to both the repository data system 140 and the real time data system 110. This event data is transformed and maintained by the real time data system 110 and the repository data system 140. Through get/subscribe APIs 302, 304, information is returned to the processing devices 100. Real time data system 110 feeds repository data system 140 with event and statistic information created by the real time data system 110 for use by the repository data system in tracking events and verifying the accuracy of information provided by the real time data system 110. The repository data system, in turn, updates the real time data system 110 with any information which it deems to have been lost or to need correcting.
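The event fan-out described above, in which ingested events are delivered both to a historical repository and to real-time subscribers, might be sketched as follows. The class and method names are illustrative assumptions and greatly simplify the described systems:

```python
from collections import defaultdict

class EventService:
    """Minimal sketch of the event service fan-out: events arrive via an
    ingest API and are delivered both to a repository store (historical
    archive) and to any real-time subscribers, such as a companion
    context-based search application."""

    def __init__(self):
        self.repository = []                  # stands in for the repository data system
        self.subscribers = defaultdict(list)  # event name -> callbacks (real time data system)

    def subscribe(self, event_name, callback):
        """Register a callback for change notifications on one event name."""
        self.subscribers[event_name].append(callback)

    def ingest(self, event):
        """Accept an event, archive it, and push it to subscribers."""
        self.repository.append(event)
        for callback in self.subscribers[event["name"]]:
            callback(event)
```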
Real time data system 110 provides real time game information in the form of events and statistics to the secondary application developers who may build applications to use the service, as well as to the repository data system 140. Applications such as context-based search application 335 are secondary in that they support the functions of the primary applications 320. Real time data system 110 receives event data from a plurality of running primary applications on any of a number of processing devices and transforms these events into data which is useful to secondary application developers. Statistics services may be provided by the application service and provide different types of statistics and information to third-party application developers. Another example is a leaderboard service for achievements within the event service 102 or individual games. Additional details of the multiuser application service may be found in U.S. application Ser. No. 14/167,769, entitled APPLICATION EVENT DISTRIBUTION SYSTEM (commonly owned by the assignee of the present application).
At 402, data concerning the user's performance of the application is received. As noted above, the information may be received by a search application such as application 335 running on the same processing device as the application, or on a companion computing environment 312 accessible to the user of the application. The data may be received from the primary application directly, via an API provided by the primary application, or from the event service 102 as described herein. At 404, a user's position within the application is detected. This may comprise detecting a user's position within a game and determining the context of the user's position. The user's context in an application comprises information surrounding the nature of the issue the user is having in the application. At 406, a determination is made as to whether or not a user has repeated a particular task or challenge in the application over a threshold number of times. For example, if the user has attempted to pass a particular achievement but has not been successful, the determination at step 406 will register affirmative. It should be recognized that a number of alternatives exist for the threshold and the task which may be determinative of whether help is needed. In some cases, a single failure or incomplete task may be sufficient to initiate a search. In other cases, a higher number of failures is used. If the user has not repeated a task or challenge a threshold number of times, a determination may be made at 408 as to whether or not some other indicator of a need for user help exists. For example, in the user interface of the application, a selector allowing the user to request help from the context-sensitive search application 335 may be provided. If either 406 or 408 is affirmative, then a help search is initiated at 410.
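The threshold test at step 406 can be sketched as follows. The threshold value, the attempt-log format, and the reset-on-success behavior are illustrative assumptions; as the text notes, the threshold may be as low as a single failure:

```python
FAILURE_THRESHOLD = 3  # assumed value; the text notes this is tunable per task

def needs_help(attempt_log, task_id, threshold=FAILURE_THRESHOLD):
    """Return True when the given task has been failed at least `threshold`
    consecutive times without an intervening success (step 406)."""
    failures = 0
    for entry in attempt_log:
        if entry["task"] != task_id:
            continue  # ignore attempts at other tasks
        if entry["success"]:
            failures = 0  # a success resets the consecutive-failure count
        else:
            failures += 1
    return failures >= threshold
```

A user-initiated request (step 408) would simply bypass this check and trigger the search directly.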
In some embodiments, steps 722-726 are performed by an application. In other embodiments, searches can be culled for a particular primary application by creating a reference library of searches authored by human search specialists. In such an embodiment, searches may be tested and revised before provision in a search reference database accompanying the application 335, and context-based searching is performed by reference to the human-constructed searches or search strings, which may later be combined for specific application contexts. For example, a search for data on the game Halo may begin with a limited string of "Halo for PC" to which is added a specific context such as "achievement one." Thus, a resulting keyword query for the game may be, for example, "Halo for PC defeat achievement one."
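Combining a vetted base string with a specific context phrase, as in the Halo example above, might be sketched as follows. The library contents and function names are hypothetical:

```python
# Hypothetical reference library of human-authored base search strings,
# keyed by application title (in practice, shipped with application 335).
SEARCH_LIBRARY = {
    "Halo": "Halo for PC",
}

def build_curated_query(title: str, context_phrase: str) -> str:
    """Combine a specialist-authored base string with a context phrase;
    fall back to the bare title when no curated string exists."""
    base = SEARCH_LIBRARY.get(title, title)
    return f"{base} {context_phrase}"
```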
At step 1002, the need for help, or a user request for help, is detected in accordance with
CPU 275, memory controller 202, and various memory devices are interconnected via one or more buses (not shown). The details of the bus that is used in this implementation are not particularly relevant to understanding the subject matter of interest being discussed herein. However, it will be understood that such a bus might include one or more of serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus, using any of a variety of bus architectures. By way of example, such architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnects (PCI) bus also known as a Mezzanine bus.
In one implementation, CPU 275, memory controller 202, ROM 204, and RAM 206 are integrated onto a common module 214. In this implementation, ROM 204 is configured as a flash ROM that is connected to memory controller 202 via a PCI bus and a ROM bus (neither of which are shown). RAM 206 is configured as multiple Double Data Rate Synchronous Dynamic RAM (DDR SDRAM) modules that are independently controlled by memory controller 202 via separate buses (not shown). Hard disk drive 208 and portable media drive 106 are shown connected to the memory controller 202 via the PCI bus and an AT Attachment (ATA) bus 216. However, in other implementations, dedicated data bus structures of different types can also be applied in the alternative.
A graphics processing unit 220 and a video encoder 222 form a video processing pipeline for high speed and high resolution (e.g., High Definition) graphics processing. Data are carried from graphics processing unit 220 to video encoder 222 via a digital video bus (not shown). An audio processing unit 224 and an audio codec (coder/decoder) 226 form a corresponding audio processing pipeline for multi-channel audio processing of various digital audio formats. Audio data are carried between audio processing unit 224 and audio codec 226 via a communication link (not shown). The video and audio processing pipelines output data to an A/V (audio/video) port 228 for transmission to a television or other display. In the illustrated implementation, video and audio processing components 220-228 are mounted on module 214.
In the implementation depicted in
MUs 270(1) and 270(2) are illustrated as being connectable to MU ports “A” 280(1) and “B” 280(2) respectively. Additional MUs (e.g., MUs 270(3)-270(6)) are illustrated as being connectable to controllers 294(1) and 294(3), i.e., two MUs for each controller. Controllers 294(2) and 294(4) can also be configured to receive MUs (not shown). Each MU 270 offers additional storage on which games, game parameters, and other data may be stored. In some implementations, the other data can include any of a digital game component, an executable gaming application, an instruction set for expanding a gaming application, and a media file. When inserted into console 292 or a controller, MU 270 can be accessed by memory controller 202. A system power supply module 250 provides power to the components of media system 200. A fan 252 cools the circuitry within console 292.
An application 260 comprising machine instructions is stored on hard disk drive 208. When console 292 is powered on, various portions of application 260 are loaded into RAM 206 and/or caches 210 and 212 for execution on CPU 275. Various applications can be stored on hard disk drive 208 for execution on CPU 275, of which application 260 is one example.
Gaming and media system 200 may be operated as a standalone system by simply connecting the system to a monitor, a television, a video projector, or other display device. In this standalone mode, gaming and media system 200 enables one or more users to play games, or enjoy digital media, e.g., by watching movies, or listening to music. However, with the integration of broadband connectivity made available through network interface 232, gaming and media system 200 may further be operated as a participant in a larger network gaming community, as discussed in connection with
Computer 510 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 510 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can accessed by computer 510.
The system memory 530 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 531 and random access memory (RAM) 532. A basic input/output system 533 (BIOS), containing the basic routines that help to transfer information between elements within computer 510, such as during start-up, is typically stored in ROM 531. RAM 532 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 520. By way of example, and not limitation,
The computer 510 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example,
The drives and their associated computer storage media (or computer storage medium) discussed herein and illustrated in
The computer 510 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 580. The remote computer 580 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 510, although only a memory storage device 581 has been illustrated in
When used in a LAN networking environment, the computer 510 is connected to the LAN 571 through a network interface or adapter 570. When used in a WAN networking environment, the computer 510 typically includes a modem 572 or other means for establishing communications over the WAN 573, such as the Internet. The modem 572, which may be internal or external, may be connected to the system bus 521 via the user input interface 560, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 510, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
Mobile device 1400 may include, for example, processors 1412 and memory 1410 including applications and non-volatile storage. The processor 1412 can implement communications, as well as any number of applications, including the applications discussed herein. Memory 1410 can be any variety of memory storage media types, including non-volatile and volatile memory. A device operating system handles the different operations of the mobile device 1400 and may contain user interfaces for operations, such as placing and receiving phone calls, text messaging, checking voicemail, and the like. The applications 1430 can be any assortment of programs, such as a camera application for photos and/or videos, an address book, a calendar application, a media player, an internet browser, games, an alarm application, or other third-party applications. The non-volatile storage component 1440 in memory 1410 contains data such as web caches, music, photos, contact data, scheduling data, and other files.
The processor 1412 also communicates with RF transmit/receive circuitry 1406 which in turn is coupled to an antenna 1402, with an infrared transmitter/receiver 1408, and with a movement/orientation sensor 1414 such as an accelerometer and a magnetometer 1415. Accelerometers have been incorporated into mobile devices to enable such applications as intelligent user interfaces that let users input commands through gestures, indoor GPS functionality which calculates the movement and direction of the device after contact is broken with a GPS satellite, and detection of the orientation of the device so as to automatically change the display from portrait to landscape when the phone is rotated. An accelerometer can be provided, e.g., by a micro-electromechanical system (MEMS), which is a tiny mechanical device (of micrometer dimensions) built onto a semiconductor chip. Acceleration direction, as well as orientation, vibration, and shock can be sensed. The processor 1412 further communicates with a ringer/vibrator 1416, a user interface keypad/screen 1418, a speaker 1420, a microphone 1422, a camera 1424, a light sensor 1426, and a temperature sensor 1428. Magnetometers have been incorporated into mobile devices to enable such applications as a digital compass that measures the direction and magnitude of a magnetic field in the vicinity of the mobile device, tracks changes to the magnetic field, and displays the direction of the magnetic field to users.
The processor 1412 controls transmission and reception of wireless signals. During a transmission mode, the processor 1412 provides a voice signal from microphone 1422, or other data signal, to the transmit/receive circuitry 1406. The transmit/receive circuitry 1406 transmits the signal to a remote station (e.g., a fixed station, operator, other cellular phones, etc.) for communication through the antenna 1402. The ringer/vibrator 1416 is used to signal an incoming call, text message, calendar reminder, alarm clock reminder, or other notification to the user. During a receiving mode, the transmit/receive circuitry 1406 receives a voice or other data signal from a remote station through the antenna 1402. A received voice signal is provided to the speaker 1420 while other received data signals are also processed appropriately.
Additionally, a physical connector 1488 can be used to connect the mobile device 1400 to an external power source, such as an AC adapter or powered docking station. The physical connector 1488 can also be used as a data connection to a computing device. The data connection allows for operations such as synchronizing mobile device data with the computing data on another device. A global positioning service (GPS) receiver 1465 utilizes satellite-based radio navigation to relay the position of the user to applications enabled for such service.
As noted above, one implementation of this technology includes a library used by applications in order to trigger events and push them into the transformation flow. Service 102 includes a client-server API to accept streams of events from applications and ingest them into a cloud-based transformation pipeline managed by service 102. The event service 102 accepts incoming events and applies transformations and aggregations to provide statistics. The statistics are then stored in a datastore, and values are also forwarded to other services.
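The transformation-and-aggregation step might be sketched as follows. The specific aggregations (a count and a running maximum) and the forwarding hook are illustrative assumptions standing in for the pipeline's configurable rules:

```python
class StatisticsPipeline:
    """Sketch of the transformation pipeline: incoming events are
    aggregated into named statistics, stored in a datastore, and
    forwarded to other services on change."""

    def __init__(self, forward=None):
        self.stats = {}         # stands in for the statistics datastore
        self.forward = forward  # optional callback standing in for other services

    def ingest(self, event):
        """Apply simple aggregations (count and running maximum) to one event."""
        name = event["name"]
        entry = self.stats.setdefault(name, {"count": 0, "max": None})
        entry["count"] += 1
        value = event.get("value")
        if value is not None and (entry["max"] is None or value > entry["max"]):
            entry["max"] = value
        if self.forward:
            self.forward(name, entry)  # notify downstream services of the change
```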
The repository data system 140 creates a historical archive that can be queried and used to generate reports showing events/values over time. The real time event service 102 exposes APIs that allow calculated values to be retrieved by other internal and external clients and services. The real time data system 110 takes a calculated value feed and allows clients and services to subscribe to change notifications of those values.
In an alternative implementation, rather than using a hosted event service, local event transformation may be utilized. Local transformation may be full or partial. Rather than all events generated on processing devices being pushed to event service 102, one or more transformation components may run on a processing device 100 and distribute events and statistics to other processing devices 100. Events flow directly between clients, and no communication need take place with a hosted event service 102. This embodiment significantly decreases the latency of event and statistic distribution since distribution may take place over a local network or wireless connection.
The two implementations above could even be present at the same time, serving different companion applications (some of which are connected to the host application, others which are analyzing the historical store, and another group which are subscribed to real-time changes to the calculated values).
In yet another embodiment, event definitions need not be provided by application developers or the service 102. In such case, each event may be self-describing. Transformation rules would look at the structure of each event and apply their rules using a pattern-based approach.
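A pattern-based rule match against a self-describing event's structure might be sketched as follows. The rule representation (a required-field set paired with a transform) is an illustrative assumption:

```python
def apply_pattern_rules(event: dict, rules: list) -> list:
    """Apply every transformation rule whose required-field pattern matches
    the structure of the event itself; no pre-registered event definition
    is consulted."""
    outputs = []
    for required_fields, transform in rules:
        if required_fields.issubset(event.keys()):
            outputs.append(transform(event))
    return outputs
```

For example, a rule requiring a `speed` field would fire on any event carrying that field, whichever application produced it.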
The technology allows firing a high-level set of events with minimal effort on the part of the primary application developer and shifts the burden of extensibility onto the transformation system described by this technology. This decoupling also provides an integration point for third parties: the output of the transformation system could be made available to other developers, and those developers could build experiences on top of the host application without the involvement of the developers of the host application.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.