Graphical user interface for a gaming system

Information

  • Patent Grant
  • 10152844
  • Patent Number
    10,152,844
  • Date Filed
    Monday, November 27, 2017
  • Date Issued
    Tuesday, December 11, 2018
Abstract
A graphical user interface for a gaming console is configured to render a first graphical element in a first region that includes multiple user selectable resource objects, detect a first touching operation at a first location in the first region to select and highlight a resource, detect a first touching operation and a second touching operation in a second region, render an instance of the resource at a first and a second location in the second region, determine whether a time period of the first and the second touching operations exceeds a predetermined time period, and, if so, render multiple instances of the resource at the first and second locations in the second region, wherein a number of the rendered multiple instances of the resource is determined by a duration for which the time period of the second touching exceeds the predetermined time period.
Description
BACKGROUND

The present invention generally relates to user interfaces, and, more specifically, to graphical user interfaces for gaming systems.


Computing devices include tablet computers such as iPads, and smart phones such as Apple's iPhone®, Google's Android® phones, and Symbian® phones. These computing devices have extremely user-friendly interfaces that enable easy and quick interaction for their users. Most of these devices incorporate touch-sensitive screens that obtain users' inputs and facilitate smooth user interaction. Gaming software is employed in many of these devices for leisure purposes. An important aspect of a gaming system is the ease with which a user can enter desired inputs and interact with the user interface of the device on which he/she plays a game. For devices that lack a touch-screen facility, the only possible ways for a user to interact while playing a game are clicking an associated mouse, using associated keyboard functions/keys, or using an associated joystick. The ‘point and click’ and ‘joystick’ experiences provided by many lower-grade electronic devices are unwieldy and often time-consuming while playing a game. In particular, there are games in which a user/player needs to perform clicking, pointing, tapping and dragging operations many times, often at different device display locations, which is hard to achieve through a mouse or a joystick. In a typical gaming environment, where a user needs to perform similar operations by clicking or touching multiple points on the interface, this becomes cumbersome. Even the touch-sensitive screens provided in many conventional electronic devices are capable of sensing a touching operation at only one point at a time. Multi-touch screens are still not widespread, although they can be of great benefit in gaming environments. Some conventional gaming console applications can be controlled through multi-touch sensitive operations; however, in strategic gaming environments they still have drawbacks when performing certain desired operations.


Therefore, considering the aforementioned problems, there exists a need for an improved and more congenial graphical user interface for a gaming system, for use while playing a game on a computing device.


SUMMARY

The present disclosure provides a user-friendly graphical user interface that facilitates easy user interaction while the user plays a game on a computing device. Specifically, the disclosure provides a system and a method that facilitate an improved user experience by sensing and obtaining user inputs through touching or swiping operations performed at multiple points on the graphical user interface corresponding to a gaming console.


In an aspect, the present disclosure provides an electronic device that includes a touch sensitive display screen and computing hardware that executes a software product corresponding to a gaming system. The display screen simultaneously senses touching operations performed at multiple locations on the screen. When the software product is executed on the computing hardware, it generates and renders a graphical user interface on the display screen of the electronic device. The graphical user interface facilitates easy user interaction and, when rendered on the display screen, presents multiple graphical objects and a number of user selectable options corresponding to the graphical objects. Each user selectable option represents multiple resources for performing an operation on one or more of the graphical objects. A user selects one or more of these options, and the software product renders the resources corresponding to that option at different locations on the interface. The resources corresponding to the selected option are rendered when the user touches or swipes through multiple points of the interface. Further, the nature of rendering and deploying the different resources on the interface depends on parameters such as the speed with which the user performs the touching or swiping operation, or the pressure applied by the user on the interface while performing either operation.
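
By way of a non-limiting illustration only, the following minimal TypeScript sketch models this aspect. The names (ResourceOption, GameInterface, deployAt) and the linear pressure-to-count mapping are hypothetical assumptions, not part of the disclosure:

```typescript
// Hypothetical model of the interface described above: each user
// selectable option maps to a resource type, and deployment is
// modulated by touch parameters (pressure, swipe speed).
interface ResourceOption {
  id: string;       // e.g. "A", "B" or "C" as in FIG. 2
  unitName: string; // e.g. "troops" or "cannons"
}

interface DeploymentParams {
  pressure: number;   // normalized 0..1, as reported by the touch sensor
  swipeSpeed: number; // pixels per second along a swipe path
}

class GameInterface {
  private selected: ResourceOption | null = null;

  select(option: ResourceOption): void {
    this.selected = option; // the selected option would also be highlighted
  }

  // Called once for each touched or swiped point on the interface.
  deployAt(x: number, y: number, params: DeploymentParams): void {
    const sel = this.selected;
    if (sel === null) return;
    // Higher pressure deploys more instances at this point (an assumed
    // linear mapping; the disclosure leaves the exact function open).
    const count = Math.max(1, Math.round(params.pressure * 5));
    for (let i = 0; i < count; i++) {
      console.log(`render one ${sel.unitName} near (${x}, ${y})`);
    }
  }
}
```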


In another aspect, the present disclosure provides a method of facilitating easy user interactions with a graphical user interface. A software product is executed on the computing hardware of the electronic device, which results in the generation and rendering of the interface on the display screen of the device. One or more graphical objects and a set of user selectable options corresponding to the graphical objects are rendered on the interface. Each user selectable option corresponds to one or more resources to be deployed on the interface. The method includes selecting one or more of these selectable options and performing a touching operation or a swiping operation over multiple points on the display screen of the device. Eventually, the resources corresponding to the selected option are deployed at multiple locations on the interface simultaneously. These locations correspond to the different points at which the touching or the swiping operation is performed.


The system and method of the present disclosure facilitate performing similar operations on a gaming console through multiple regions of the console at the same time, and avoid the cumbersome operation of touching or swiping through different points one at a time.


Additional aspects, advantages, features and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative embodiments construed in conjunction with the appended claims that follow.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an illustration of a manner of operating over a graphical user interface of an electronic device, in accordance with the present disclosure;



FIG. 2 to FIG. 4 are illustrations of a graphical user interface corresponding to a gaming system, rendered on a display screen of an electronic device, as the method and system of the present disclosure are used to control the gaming environment;



FIG. 5 is an illustration of an exemplary environment for implementing the method and system in accordance with the present disclosure; and



FIG. 6 is an illustration of an exemplary method of facilitating improved interaction of a user with a graphical user interface, in accordance with the present disclosure.





DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

The following detailed description discloses aspects of the claimed invention and ways in which it can be implemented. However, the description is not intended to define or limit the invention, such definition or limitation being solely contained in the claims appended thereto. Although the best mode of carrying out the invention has been disclosed comprehensively, those skilled in the art would recognize that other embodiments for carrying out or practicing the invention are also possible.


The present disclosure pertains to a graphical user interface for a gaming system, for facilitating easy and quick interaction of a user while playing a game, and for avoiding contemporary problems experienced while performing touching or swiping operations on the touch sensitive screens of electronic devices on which the games are being played.


Gaming systems are incorporated for leisure in many electronic computing devices, including computers, iPads, mobile phones, tablet computers and smart phones. Many such conventional electronic devices incorporate touch-sensitive screens for obtaining user inputs and for providing a congenial user experience with the interface. For playing games on electronic devices without a touch-sensitive screen, including many desktop and laptop computers, the user generally interacts with and provides inputs to the gaming system's interface through coupled input devices, such as mice, certain keys on the keypads, and joysticks. Using multiple clicking operations through a mouse is time consuming and unfavorable, for example, in cases where the same operation needs to be performed at multiple points on the gaming interface. Even with devices that have touch-sensitive displays, when similar operations corresponding to the game being played need to be performed simultaneously through multiple regions of the interface, this becomes difficult to achieve, as conventional touch-sensitive screens are capable of sensing touching operations only one at a time, at a specific point. Moreover, even though multi-touch sensitive screens are currently available and are incorporated in electronic devices, certain games, when played, require simultaneous sensing and detecting of touching or swiping operations performed through multiple regions of the screen, which such screens may not adequately support.


The present disclosure provides an enhanced graphical user interface for a gaming system, which improves a user's experience while playing a game on an electronic device. The system and method facilitate the performing of touching and swiping operations through a multi-touch sensitive screen of the electronic device, and allow the user to perform similar operations pertaining to the game simultaneously, through different regions of the interface.


In FIG. 1, there is shown a graphical user interface corresponding to a game being played on an electronic device, illustrating how a user playing a strategic game performs touching or swiping operations at multiple points of the interface simultaneously, to execute similar operations at multiple locations on the interface. As shown, a graphical user interface 100, corresponding to the game being played, is rendered on a display screen of the electronic device. Specifically, the interface 100 is rendered and presented on the display screen when a software product corresponding to the game is executed on computing hardware of the electronic device. The display screen is a multi-touch sensitive screen, capable of sensing touching or swiping operations performed at multiple points on the screen simultaneously. A user 108 uses two of his/her fingers to perform touching operations at two different locations 102 and 104 on the interface 100. The interface 100 senses this operation, and the software product corresponding to the game executes actions pertaining to the performed touching operation on different graphical objects of the interface 100. This is explained in more detail hereinafter, with respect to an example of a specific gaming environment, in conjunction with the drawings that follow.
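
As a hedged illustration only, the simultaneous sensing described above could be handled as in the following TypeScript sketch, assuming a browser-style TouchEvent API; ui and its deployAt method refer back to the hypothetical GameInterface sketched earlier:

```typescript
// Sketch of simultaneous two-point sensing as in FIG. 1 (hypothetical
// names; browser-style touch API assumed).
declare const ui: {
  deployAt(x: number, y: number, p: { pressure: number; swipeSpeed: number }): void;
};

const gameScreen = document.getElementById("game-screen")!;

gameScreen.addEventListener("touchstart", (event: TouchEvent) => {
  event.preventDefault();
  // event.touches lists every finger currently on the screen, so touches
  // at two locations (such as 102 and 104) arrive together and can be
  // acted upon in the same event cycle.
  for (const touch of Array.from(event.touches)) {
    ui.deployAt(touch.clientX, touch.clientY, {
      pressure: touch.force || 0.5, // force reads 0 on hardware without pressure sensing
      swipeSpeed: 0,
    });
  }
});
```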


In FIG. 2, there is shown a snapshot of the display screen of an electronic device, when a user plays a game on the device and uses the method of the present disclosure for controlling the gaming interface. As shown, an electronic device 200 has a display screen 202, where different resources for performing actions corresponding to the game are rendered on a graphical element 206 of the display screen 202. For the purpose of explaining the disclosure, the depicted gaming environment corresponds to a war-based game, and the gaming environment rendered on the display screen 202 corresponds to a battlefield 204. The device 200 can be any suitable electronic device that incorporates a multi-touch sensitive screen, including an iPad, a smartphone, for example, Apple's iPhone®, an Android® phone, or a Symbian® phone, a tablet computer, a desktop computer or a laptop computer, and so forth. The battlefield 204 has different graphical objects, for example, a target 208, which can represent a castle or a camp. An objective of the game may be to win the castle by attacking it through different resources A, B and C, and so forth, shown within the graphical element 206. The resources A, B and C within the element 206 can represent weapons, including guns, cannons, arrows, bows, and so forth, or different troops, such as armed soldiers, walking soldiers or horse riding soldiers, and so forth. Though only three such resources have been shown, there can be multiple other resources for playing the game. In the strategic game, the user selects one or more of these resources and deploys the selected resources at multiple locations within the battlefield 204. The selected resources are then used to perform operations for conquering the target 208. For example, the deployed resources can be operated to attack the target 208 through the different weapons they possess. The user can use multiple touching operations simultaneously, at different points on the display 202, to deploy the resources A, B, C, and so forth at multiple locations within the battlefield 204. Moreover, the user can also perform a swiping operation, to deploy a specific resource along a set of points constituting a specific path, by swiping a finger across that path. The movement of the different deployed resources, either away from or towards the target 208, can be controlled by pointing towards a specific deployed resource and swiping the finger in the desired direction. When the user touches the display screen 202 to deploy a selected resource, the screen 202 detects the pressure applied by the user at different points. The number of resources deployed at different locations optionally depends on the amount of pressure applied. Specifically, a higher pressure applied at a specific point results in deploying an increased number of resources at that point, and vice versa. Additionally, when playing, resources can be released at a constant rate over time, or at an accelerated/decelerated rate, depending on game settings. Moreover, the rapidity of deploying the resources at different locations on the battlefield 204 depends upon the speed with which the user performs the touching or swiping operation through the different points. For example, if the user wishes to deploy a selected resource along different points in a specific path, and performs a swiping operation through the path, the resources are deployed as quickly as the swiping operation through the path is performed. A rapid swiping operation results in a quicker deployment of resources, compared to a slow swiping operation.
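
A minimal sketch of this speed-dependent behavior, under the assumption (hypothetical, since the disclosure leaves the mapping open) that deployments are spaced by distance traveled along the path, so a rapid swipe covers the path sooner and the instances appear sooner:

```typescript
// Hypothetical swipe-to-deployment mapping: one instance per fixed
// stretch of path, so the deployment rate tracks the swipe speed.
interface SwipeSample { x: number; y: number; t: number } // t: timestamp, ms

function deploymentsForSwipe(
  samples: SwipeSample[],
  unitsPerPixel = 0.05 // assumed density: one instance per 20 px of path
): SwipeSample[] {
  const out: SwipeSample[] = [];
  for (let i = 1; i < samples.length; i++) {
    const dx = samples[i].x - samples[i - 1].x;
    const dy = samples[i].y - samples[i - 1].y;
    const n = Math.floor(Math.hypot(dx, dy) * unitsPerPixel);
    for (let k = 1; k <= n; k++) {
      // Interpolate the deployment points along this segment; a faster
      // swipe produces the same points in less elapsed time.
      out.push({
        x: samples[i - 1].x + (dx * k) / n,
        y: samples[i - 1].y + (dy * k) / n,
        t: samples[i].t,
      });
    }
  }
  return out;
}
```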


Continuing further, in FIG. 3, there is shown the display screen of the device when the user has selected one of the selectable options A, B and C, for deploying resources within the battlefield of the war-based game. As shown by way of example, the user has selected the option B, corresponding to a specific category or type of resources to be deployed in the battlefield, to operate on the target 208 thereafter. As aforementioned, the selected resources may be troops, armed soldiers possessing specific kinds of weapons, horse riding soldiers, and so forth. Further, though only one option is shown being selected, the user can also select multiple options to deploy different kinds of resources in the battlefield. After selecting the option B, the user uses two of his/her fingers to control the interface and deploy the troops at two desired points 302 and 304, as shown. Specifically, the user performs a touching operation at the points 302 and 304, preferably simultaneously, to enable deployment of the troops at the same time. Alternatively, the touching operations can be performed in temporal sequence, namely one-by-one. Alternatively, a swiping operation may also be performed, initiating from either of the selected points 302 and 304, through a specific desired path, to deploy the resources all through the desired path. In an embodiment, the resources are deployed at the selected points a specific pre-determined time after the touching operation is performed. For example, in one embodiment, the resources may be deployed at a specific point only if the user keeps his finger in touch with the point for a pre-determined time, which may be about 0.5 to 1 seconds. This feature is adjustable, and the minimum time for which the user needs to keep his fingers in contact with the screen for deploying the resources can be customized based on the user's preference, before playing the game. Further, this avoids cases where the resources may be deployed unintentionally or undesirably.
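
A minimal sketch of that adjustable hold-to-deploy threshold, with hypothetical helper names; the 700 ms default is merely an assumed value inside the roughly 0.5 to 1 second range mentioned above:

```typescript
// Deploy only if the finger stays down for the configured time, which
// avoids unintentional deployments from stray taps.
const HOLD_TO_DEPLOY_MS = 700; // user-adjustable before the game starts

const pendingHolds = new Map<number, ReturnType<typeof setTimeout>>();

declare function deployAt(x: number, y: number): void; // assumed elsewhere

function onTouchStart(touchId: number, x: number, y: number): void {
  const timer = setTimeout(() => {
    deployAt(x, y);
    pendingHolds.delete(touchId);
  }, HOLD_TO_DEPLOY_MS);
  pendingHolds.set(touchId, timer);
}

function onTouchEnd(touchId: number): void {
  const timer = pendingHolds.get(touchId);
  if (timer !== undefined) {
    clearTimeout(timer); // finger lifted too early: no deployment
    pendingHolds.delete(touchId);
  }
}
```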


A specific deployed resource is released for action, for example, to attack the target 208, based on the detection of certain conditions. This may include, for example, the user still keeping his/her finger at a desired point for about 1 to 2 seconds after the resource has already been deployed at that point. In another case, an execution option may be separately rendered on the display screen, and the user needs to provide an execution command through that option after the resources are deployed. Further, the multi-touch operations performed through the different fingers act independently, and the display screen is configured to sense and interpret the swiping or touching operations performed through these fingers independently. Specifically, as an example, when one finger is touched or swiped through specific points on the screen, one set of resources may be deployed over one set of locations corresponding to those points, and subsequently, when another finger is touched or swiped through a different set of points, a second set of resources may be deployed over those points too. The two sets of resources may be the same or different, depending on the game settings, which are user adjustable and can be customized before playing the game. Further, as aforementioned, the display screen is also capable of sensing touching or swiping operations performed at different points simultaneously, and of deploying the resources at the different points together. In an embodiment, the number of resources deployed at different points may be one per detected touching operation at each point. Alternatively, a constant number of resources per unit time may be deployed at a specific point, or over a set of points, as long as a touching or a swiping operation is performed over those points. In another embodiment, as aforementioned, the number of resources deployed is a function of the pressure applied by the user while performing the touching or swiping operation. Specifically, a higher pressure applied at a specific point optionally results in deploying a greater number of resources at that point, and vice versa.
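
The constant-rate option and the hold-to-release condition could be combined as in the sketch below; the interval and the release delay are hypothetical values chosen to sit inside the roughly 1 to 2 second hold described above:

```typescript
// While a touch is held: deploy at a fixed rate, and release the
// deployed group for action once the hold continues past the delay.
const DEPLOY_INTERVAL_MS = 250; // one instance every 250 ms while held
const RELEASE_AFTER_MS = 1500;  // continued hold triggers release

declare function deployOne(x: number, y: number): void;
declare function releaseForAction(x: number, y: number): void; // e.g. attack the target

function holdAt(x: number, y: number, isHeld: () => boolean): void {
  const start = Date.now();
  const ticker = setInterval(() => {
    if (!isHeld()) {
      clearInterval(ticker); // finger lifted: stop deploying
      return;
    }
    deployOne(x, y);
    if (Date.now() - start >= RELEASE_AFTER_MS) {
      releaseForAction(x, y);
      clearInterval(ticker);
    }
  }, DEPLOY_INTERVAL_MS);
}
```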


Continuing further, in FIG. 4, there is illustrated the display screen of the electronic device, where the resources corresponding to the selected option B are shown deployed at multiple locations on the display screen. As shown, a set of resources 410 are deployed at one set of locations on the screen 202, and these correspond to multiple touching operations performed earlier around the point 302 (shown in FIG. 3). To deploy the resources 410, the user optionally performs a swiping operation through a path covering these points. Further, another set of resources 420 is shown deployed on the other side of the target 208. These resources are rendered when touching operations initiating at the point 304 (see FIG. 3) are performed by the user through another finger. Similarly, a touching or swiping operation is optionally performed at many other points on the display screen 202, to deploy the resources at other desirable points.


In FIG. 5, there is shown an illustration of an exemplary environment for implementing the method and system in accordance with the present disclosure. A plurality of electronic devices 502, 504, 506 and 508 are shown, through which a user can connect to one of different gaming servers 510 and 540, through one of multiple networks represented by 550, 560 and 570. The electronic devices 502, 504, 506 or 508 can be any suitable electronic devices having computing hardware capable of supporting and executing a software product corresponding to a gaming system. Typical examples of the illustrated electronic devices include a desktop computer, a laptop computer, a tablet computer, a smart phone such as the popularly known iPhone® and Android® phones, an iPad, and so forth. Furthermore, all these electronic devices have one or more multi-touch sensitive screens for sensing and obtaining a user's input through touching or swiping operations performed at multiple points of the one or more display screens. Moreover, the different electronic devices 502, 504, 506 and 508 are commonly connected to each other through either of the servers 510 and 540, through suitable communication networks. The networks 550, 560, 570, and so forth may be wireless networks, such as a wireless local area network (WLAN), local area networks (LANs), or cellular networks, for example, a 2G network or a 3G network. Further, any of the electronic devices 502, 504, 506 and 508 may also use its own Bluetooth network, and may be capable of connecting to a Bluetooth server, to synchronize with the other electronic devices. The shown exemplary environment supports multiplayer gaming too, by facilitating multiple users to be online through different devices, connecting through a suitable network, and synchronizing with each other. Further, multiple databases, as shown by modules 520, 530, and so forth, are coupled to the different servers, and information related to the gaming environment is continuously stored in these databases when the different users are online for multiplayer gaming.
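
Purely as an illustrative assumption (the disclosure does not specify a transport or message format), a client device in this environment might connect to one of the gaming servers as follows; the endpoint URL and message shape are invented for the sketch:

```typescript
// Hypothetical client-to-server connection for the FIG. 5 environment.
const socket = new WebSocket("wss://gaming-server.example/play");

socket.addEventListener("open", () => {
  // Identify the device/user so the server can route game state.
  socket.send(JSON.stringify({ type: "login", userId: "player-1" }));
});

socket.addEventListener("message", (event: MessageEvent) => {
  const update = JSON.parse(event.data as string);
  // Game-state updates backed by the server databases (520, 530) would
  // be applied to the local interface here.
  console.log("state update:", update);
});
```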


For facilitating single player gaming, a user logs on through any of the electronic devices 502, 504, 506 or 508, and connects to one of the gaming servers 510 or 540, through a suitable network, for example via the Internet and/or a wireless communication network. As the user logs on and executes the gaming software on the computing hardware of the specific device that he/she utilizes, for example, the device 502, a graphical user interface corresponding to the game is generated and rendered on the display screen of the device 502. The graphical user interface presents different graphical objects pertaining to the game on the display screen of the device 502. The graphical objects may be represented by different blocks/segments of the graphical user interface, on which different operations corresponding to the game being played can be performed. For example, in a case where the game is a war-based game, such blocks/segments may represent one or more targets that need to be conquered, such as the target 208 shown earlier in FIG. 2. Further, one or more graphical elements, representing a set of user selectable options for performing actions on the graphical objects, are also rendered on the interface of the device 502. Such elements have been explained in detail earlier, in conjunction with the previous drawings of the disclosure, which pertain to a war-based game. Moreover, a pointer object (cursor), movable over the different graphical objects, appears on the graphical user interface, for controlling the gaming operations. The pointer object is controllable by performing touching, swiping or tapping operations on the display screen of the device 502. Further, other input devices, including a mouse, a joystick or a set of keyboard buttons, may be coupled to the device 502 (though not shown), for facilitating provision of user inputs. The touching operation on the display screen can be performed through use of a suitable touch-sensitive object, including a finger, a pen, a pencil, a pointing organ, and so forth.


Another database 580, coupled to the gaming server 510, serves as a back end database for the gaming server 510. As the user of the device 502 starts playing the game, typical actions and gestures performed by the user are recorded in the back end database 580. Specifically, such actions are interpreted through the gaming server 510 and are sent as messages to the back end database 580, which eventually maintains a log of, and a backup for, the played game. Such messages can be in the form of data packages sent over an Internet connection through which the device 502 is connected to the server 510, or sent over any other wireless or wired network connecting the device 502 to the server 510, as aforementioned. Typical elements of such messages for maintaining a backup for the game include a header, a payload and a checksum. The checksum can be a function of the payload, or it may be a unique user identifier, such as a username or similar. An advantage arising from including the checksum in the back end maintaining messages is the possibility of avoiding potential fraud while playing the game. Those skilled in the art will understand that an appropriate checksum function or checksum algorithm may be applied to the collected digital data, while the game is being played, to obtain the checksum. Further, the checksum corresponding to specific data can be recomputed at any point of time and compared to the stored checksum, to detect possible fraud. The back end messages received by the server 510 are also sent to the other databases 520 and 530 of the server 510. In these databases 520, 530, the back end messages are used to maintain a continuous logic that represents the status of the game, for example, the exact score of the player updated over time, and the stage of the game that the player has already reached. With a continuous receipt of the back end messages by the databases 520 and 530, a regular updating of the game status is undertaken within these server databases 520 and 530 over time. This facilitates the resumption of the game at its last status in cases where the device 502 unexpectedly shuts down, the device 502 is unexpectedly hindered in its communication, the user changes the gaming terminal, or he/she intentionally quits playing for a certain period and logs in at some other time. Such a possibility of resumption helps to enhance user satisfaction with the graphical user interface. Release/use of resources (such as troops) typically reduces game credits, i.e., the available funds for playing the game. Game credits can be earned during the course of the game, or can be purchased with a credit card or other payment methods. Each player can have their game credit stored in, for example, the back end database 580. The back end database 580 can have a billing interface to a credit card company, a bank, or other payment/credit methods and systems, such as PayPal®, or to mobile payments made with premium rate messages (short message service).
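
A minimal sketch of such a back end message, with the checksum recomputed on receipt; the field names are hypothetical, and a SHA-256 digest merely stands in for the unspecified checksum function:

```typescript
import { createHash } from "node:crypto";

// Hypothetical back end message: header, payload and checksum, as the
// text describes.
interface BackendMessage {
  header: { userId: string; sentAt: number };
  payload: string; // serialized game actions and gestures
  checksum: string;
}

function checksumOf(payload: string): string {
  // Stand-in checksum function; the disclosure leaves the algorithm open.
  return createHash("sha256").update(payload).digest("hex");
}

function buildMessage(userId: string, payload: string): BackendMessage {
  return {
    header: { userId, sentAt: Date.now() },
    payload,
    checksum: checksumOf(payload),
  };
}

// Recompute and compare at any later time to detect a payload that was
// altered after logging, as the fraud-avoidance passage describes.
function verify(message: BackendMessage): boolean {
  return checksumOf(message.payload) === message.checksum;
}
```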


Though only two servers 510 and 540 have been shown, there can be multiple gaming servers coordinating with, and connected to, each other for implementing the gaming environment in accordance with the present disclosure. Moreover, the environment as shown in FIG. 5 is capable of implementing a thin client game, namely a game written as a computer program that is only partially independent in its computational roles, wherein a part of the gaming logic may be stored in any of the servers 510 and 540, and a part of it may be stored in the gaming terminal. The depicted environment also supports a thick client game, namely a game written as a wholly independent computer program, wherein the entire gaming logic may be stored in the gaming terminal. Furthermore, the game may optionally be completely web-based, wherein most of the gaming logic is stored in any of the servers 510 or 540. The gaming software corresponding to the game being played can optionally be written in any suitable programming language.


Although the gaming system implementable through the illustrated gaming environment has been described for the case where a single user logs on to any of the electronic devices 502, 504, 506 or 508, the same gaming environment is capable of supporting multiplayer gaming, wherein different users may log on through different electronic devices and synchronize with each other by connecting concurrently through any of the common gaming servers 510 and 540, through suitable networks as aforementioned, thereby sharing a common graphical user interface representing the ongoing game. In such embodiments, the graphical user interfaces rendered on the display screens of the different electronic devices are regularly and concurrently updated through the logic data stored in the databases 520 and 530 of the gaming servers, at the back end.


In FIG. 6, there is shown a method of facilitating user interactions with a graphical user interface while playing a game. The method is explained in conjunction with the typical example of a war-based game, described earlier through the previous figures of the disclosure. However, the method can be generalized and implemented in other gaming environments also, and is not intended to limit the scope of the present disclosure. At a step 604, the method includes executing a software product on computing hardware of an electronic device. The electronic device can be any appropriate device incorporating a multi-touch sensitive screen, examples of which have been set forth earlier. The software product corresponds to a gaming system, for facilitating the playing of a game on the electronic device. At a step 608, as the software product is executed, the method includes generating and rendering a graphical user interface representing the gaming environment on the display screen of the electronic device. At a step 612, the method includes presenting, via the graphical user interface, different graphical objects, a set of user selectable options for controlling the gaming environment, and a pointer for performing touching or swiping operations at different points on the interface. For example, as aforementioned, in a war-based game, the graphical objects may correspond to a castle to be conquered, a camp to be destroyed, and so forth, and the gaming environment may represent a battlefield. The user selectable options may correspond to different resources that can be deployed over different portions of the interface, to perform operations on the graphical objects, for scoring points. Specifically, the resources may be different kinds of troops, such as horse riding soldiers or armed soldiers possessing a variety of weapons, including guns, bombs, cannons, bows, arrows, and so forth. At a step 616, the method includes the user selecting one or more selectable options corresponding to the different kinds of resources that he/she wants to deploy within the gaming environment. Proceeding further, after selecting and enabling one of the selectable options, at a step 620, to deploy the corresponding resources, the user performs touching or swiping operations at multiple points of the interface, depending on the locations where he/she wishes to deploy them. At a step 624, the resources are deployed and appear on the gaming interface. In an embodiment, the nature of deployment of the different resources may depend on different parameters. For example, the number of resources deployed at a specific point depends on the pressure applied by the user on the display screen while performing the touching operation at that point. Moreover, if the user wishes to deploy resources along multiple points constituting a specific path, and performs a swiping operation along that path, the rapidity with which the resources are deployed depends on the speed with which the user performs the swiping operation along the path. In another embodiment, a constant number of resources per unit time can be deployed at each point where a touching operation is being performed. The nature of deployment of resources is user adjustable, and can be customized based on the user's priority before playing the game.


At a step 628, the method includes checking whether or not other resources are desired to be deployed before executing actions through the resources. If yes, the method includes returning to the step 616, selecting the selectable options corresponding to those resources, and performing the touching or swiping operations through the desired points again. Otherwise, at a step 632, the method includes releasing the deployed resources for action within the gaming environment. For example, in a war-based game, the deployed troops/armed soldiers are released for operating on a specific target, to attack it from the different points where they are deployed. In an embodiment, the releasing of the deployed resources is automated, and occurs when the user keeps his/her fingers on a specific resource for a pre-determined time after deploying it. For example, this time may be about 1 to 2 seconds of touching operation after the resource is already deployed. The display screen is configured to sense this pre-determined time, and the software product executes the action pertaining to the deployed resource when this occurs. In another embodiment, releasing the different resources may require a manual user input. Specifically, for example, a triggering option (such as a “go” or “fire” option) may be rendered after deploying the resources, and the resources may not be released until the user manually initiates that option. At a step 636, after the actions have been performed by the deployed resources, the graphical user interface is updated, and a refreshed interface representing the latest status of the gaming environment is rendered on the display screen. A condensed sketch of this flow appears below.
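
A condensed TypeScript sketch of the FIG. 6 flow (steps 604 to 636), with every helper declared as a hypothetical placeholder; the loop mirrors the check at step 628:

```typescript
declare function renderInterface(): void;                          // steps 604-612
declare function awaitResourceSelection(): Promise<string>;        // step 616
declare function awaitDeployment(resource: string): Promise<void>; // steps 620-624
declare function moreResourcesDesired(): boolean;                  // step 628
declare function releaseDeployedResources(): Promise<void>;        // step 632
declare function updateInterface(): void;                          // step 636

async function playRound(): Promise<void> {
  renderInterface();
  do {
    const resource = await awaitResourceSelection();
    await awaitDeployment(resource);
  } while (moreResourcesDesired()); // step 628: deploy more before release?
  await releaseDeployedResources();
  updateInterface();
}
```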


The method and system of the present disclosure, for improving the interaction of a user with a graphical user interface corresponding to a game, provide substantial benefits as the user performs different operations in a gaming environment. Similar operations, when desired to be performed by a user through different locations on the gaming interface, can be easily executed by touching or swiping through multiple points of the display screen simultaneously. Hence, the user's experience with the gaming interface is made much more comfortable.


Though the present disclosure has been described comprehensively through an exemplary embodiment where it is applicable in a gaming environment, and specifically through the example of a war-based game, the disclosure also finds its application in other gaming environments and, more generally, may be applicable to other graphical user interfaces not pertaining to a gaming system. In certain applications, the user interface of the disclosed embodiments can be used for virtual control of any type of game. Certain aspects of the disclosed embodiments are also applicable to other operations, including building arcades and solving puzzle games. Further, the congenial user interface may also be implemented within other types of games, for example, adventure, role playing and shooting games, construction and management simulation games, and so forth. For example, the congenial user interface can be used in computer terminals employed at financial exchanges, for example on Wall Street in New York and at the Stock Exchange in London, where traders need to control multiple transactions simultaneously when executing a financial transaction, for example a synthetic credit default swap or trading in derivative financial products.


Although the current invention has been described comprehensively, in considerable detail, to cover the possible aspects and embodiments, those skilled in the art would recognize that other versions of the invention may also be possible.

Claims
  • 1. A graphical user interface for a gaming console, the gaming console comprising: a processor; and a computer program product including machine readable instructions that are configured to be executed by the processor; the graphical user interface comprising at least one touch sensitive portion, the machine readable instructions when executed by the processor being configured to cause the processor to: render a first graphical element in a first region of the at least one touch sensitive portion, the first graphical element comprising a resource area that includes multiple user selectable resource objects; detect a first touching operation at a first location in the first region, the first location corresponding to a resource of multiple user selectable resource objects, the first touching operation causing selection and highlighting of the resource; detect at least a first touching operation at a first location in the second region and a second touching operation at a second location in the second region; render at least one instance of the resource corresponding to the first touching operation in the first region at the first location in the second region and the second location in the second region; determine if a time period of the first touching operation at the first location in the second region and a time period of the second touching operation at the second location in the second region exceeds a predetermined time period; and if the time period of the first touching operation in the second region exceeds the predetermined time period, render multiple instances of the resource at the first location in the second region; if the time period of the second touching operation in the second region exceeds the predetermined time period, render multiple instances of the resource at the second location in the second region; wherein a number of the multiple instances of the resource rendered at the first location in the second region is determined by a duration that the time period of the first touching exceeds the predetermined time period, and a number of the multiple instances of the resource rendered at the second location in the second region is determined by a duration that the time period of the second touching exceeds the predetermined time period; and detect at least one swipe across the graphical user interface beginning at one or more of the first location or the second location in the second region, the at least one swipe being a movement towards a target location in the second region; and move the multiple instances of the resource rendered at the one or more of the first location or the second location away from the one or more of the first location or the second location and towards the target location.
  • 2. The graphical user interface according to claim 1, wherein the processor is further configured to: determine that the first touching operation in the second region and the second touching operation in the second region are detected at substantially a same time; and render the at least one instance of the resource at the first location in the second region and at the second location in the second region at substantially a same time.
  • 3. The graphical user interface according to claim 1, wherein the processor is further configured to: detect a plurality of substantially sequential touching operations between the detection of the first touching operation at the first location in the first region until the detection of the second touching operation at the second location in the second region, the detected plurality of touching operations being detected along a path between the first touching operation and the second touching operation and comprising the at least one swipe along the path; and render instances of the resource along the path, wherein a number of instances of the resource rendered along the path is based on a speed of the at least one swipe between the first location and the second location.
  • 4. The graphical user interface according to claim 1, wherein the processor is further configured to: cause the at least one instance of the resource at the first location in the second region to perform an action associated with the resource on a second graphical element rendered in the second region; cause the at least one instance of the resource at the second location in the second region to perform the action associated with the resource on the second graphical element rendered in the second region; and update the rendering of the second graphical element to reflect the performance of the action associated with the resource, wherein the updated rendering of the second graphical element is different than an initial rendering of the second graphical element.
  • 5. The graphical user interface according to claim 1, wherein the processor is configured to render the at least one instance of the resource at multiple locations along the path, based on a detected touching at multiple points along the path.
  • 6. The graphical user interface according to claim 1, wherein the graphical user interface comprises one or more of a desktop computer, a laptop computer, an iPad, or a smart phone, including an iPhone®, an Android® phone or a Symbian® phone.
  • 7. A method of facilitating user interactions with a graphical user interface of a gaming console, the graphical interface being generated and rendered on the display of the gaming console by executing a software product on a computing hardware of the gaming console, wherein execution of the software product on the computing hardware causes the gaming console to: render a first graphical element in a first region of at least one touch sensitive portion of the graphical user interface, the first graphical element comprising a resource area that includes multiple user selectable resource objects; detect a first touching operation at a first location in the first region, the first location corresponding to a resource of multiple user selectable resource objects, the first touching operation causing selection and highlighting of the resource; detect at least a first touching operation at a first location in the second region and a second touching operation at a second location in the second region; render at least one instance of the resource corresponding to the first touching operation in the first region at the first location in the second region and the second location in the second region; determine if a time period of the first touching operation at the first location in the second region and a time period of the second touching operation at the second location in the second region exceeds a predetermined time period; and if the time period of the first touching operation in the second region exceeds the predetermined time period, render multiple instances of the resource at the first location in the second region; if the time period of the second touching operation in the second region exceeds the predetermined time period, render multiple instances of the resource at the second location in the second region; wherein a number of the multiple instances of the resource rendered at the first location in the second region is determined by a duration that the time period of the first touching exceeds the predetermined time period, and a number of the multiple instances of the resource rendered at the second location in the second region is determined by a duration that the time period of the second touching exceeds the predetermined time period; and detect at least one swipe across the graphical user interface beginning at one or more of the first location or the second location in the second region, the at least one swipe being a movement towards a target location in the second region; and move the multiple instances of the resource rendered at the one or more of the first location or the second location away from the one or more of the first location or the second location and towards the target location.
  • 8. The method according to claim 7, wherein execution of the software product on the computing hardware further causes the gaming console to determine that the first touching operation in the second region and the second touching operation in the second region are detected at substantially a same time; and render the at least one instance of the resource at the first location in the second region and at the second location in the second region at substantially a same time.
  • 9. The method according to claim 7, wherein execution of the software product on the computing hardware further causes the gaming console to detect a plurality of substantially sequential touching operations between the detection of the first touching operation at the first location in the first region until the detection of the second touching operation at the second location in the second region, the detected plurality of touching operations being detected along a path between the first touching operation and the second touching operation and comprising the at least one swipe along the path; and render instances of the resource along the path, wherein a number of instances of the resource rendered along the path is based on a speed of the at least one swipe between the first location and the second location.
  • 10. The method according to claim 7, wherein execution of the software product on the computing hardware further causes the gaming console to cause the at least one instance of the resource at the first location in the second region to perform an action associated with the resource on a second graphical element rendered in the second region; cause the at least one instance of the resource at the second location in the second region to perform the action associated with the resource on the second graphical element rendered in the second region; and update the rendering of the second graphical element to reflect the performance of the action associated with the resource, wherein the updated rendering of the second graphical element is different than an initial rendering of the second graphical element.
  • 11. A software product recorded on a machine readable data storage medium, the software product being executable on the computing hardware of a computing device, for implementing the method of claim 7.
US Referenced Citations (190)
Number Name Date Kind
4698625 Caskill et al. Oct 1987 A
5404442 Foster et al. Apr 1995 A
5471578 Moran et al. Nov 1995 A
5592608 Weber et al. Jan 1997 A
5596699 Driskell Jan 1997 A
5598524 Johnston, Jr. et al. Jan 1997 A
5608850 Robertson Mar 1997 A
5689667 Kurtenbach Nov 1997 A
5701424 Atkinson Dec 1997 A
5745717 Vayda et al. Apr 1998 A
5757383 Lipton May 1998 A
5760773 Berman et al. Jun 1998 A
5798760 Vayda et al. Aug 1998 A
5828360 Anderson et al. Oct 1998 A
5835094 Ermel et al. Nov 1998 A
5861886 Moran et al. Jan 1999 A
5880733 Horvitz et al. Mar 1999 A
5926178 Kurtenbach Jul 1999 A
5943039 Anderson et al. Aug 1999 A
6037937 Beaton et al. Mar 2000 A
6094197 Buxton et al. Jul 2000 A
6144378 Lee Nov 2000 A
6249740 Ito et al. Jun 2001 B1
6263278 Nikiel et al. Jul 2001 B1
6337698 Keely, Jr. et al. Jan 2002 B1
6456307 Bates et al. Sep 2002 B1
6753888 Kamiwada et al. Jun 2004 B2
6906643 Samadani et al. Jun 2005 B2
6920619 Milekic Jul 2005 B1
7088365 Hashizume Aug 2006 B2
7093202 Saund et al. Aug 2006 B2
7158878 Rasmussen et al. Jan 2007 B2
7210107 Wecker et al. Apr 2007 B2
7310619 Baar et al. Dec 2007 B2
7366995 Montague Apr 2008 B2
7373244 Kreft May 2008 B2
7441202 Shen et al. Oct 2008 B2
7546545 Grabow et al. Jun 2009 B2
7676376 Colman Mar 2010 B2
7770135 Fitzmaurice Aug 2010 B2
7818089 Hanna et al. Oct 2010 B2
7870496 Sherwani Jan 2011 B1
7890257 Fyke Feb 2011 B2
7920963 Jouline et al. Apr 2011 B2
8059101 Westerman et al. Nov 2011 B2
8065156 Gazdzinski Nov 2011 B2
8132125 Iwema et al. Mar 2012 B2
8133116 Uberoi et al. Mar 2012 B1
8138408 Jung et al. Mar 2012 B2
RE43318 Milekic Apr 2012 E
8194043 Cheon et al. Jun 2012 B2
8217787 Miller, IV Jul 2012 B2
8219309 Nirhamo Jul 2012 B2
8234059 Sugiyama et al. Jul 2012 B2
8245156 Mouilleseaux et al. Aug 2012 B2
8253707 Kaneko et al. Aug 2012 B2
8261212 Wigdor et al. Sep 2012 B2
8292743 Etter Oct 2012 B1
8346405 Johnson et al. Jan 2013 B1
8368723 Gossweiler, III et al. Feb 2013 B1
8448095 Haussila et al. May 2013 B1
8578295 Chimelewski et al. Nov 2013 B2
8614665 Li Dec 2013 B2
8627233 Cragun et al. Jan 2014 B2
8636594 Dermo Jan 2014 B2
8782546 Haussila et al. Jul 2014 B2
8795080 Omi et al. Aug 2014 B1
20020175955 Gourdol et al. Nov 2002 A1
20030085881 Bosma et al. May 2003 A1
20030184525 Tsai Oct 2003 A1
20040002634 Nihita Jan 2004 A1
20040015309 Swisher et al. Jan 2004 A1
20040054428 Shen et al. Mar 2004 A1
20040150671 Kamiwada et al. Aug 2004 A1
20040263475 Wecker et al. Dec 2004 A1
20050002811 Froeslev et al. Jan 2005 A1
20050028110 Vienneau et al. Feb 2005 A1
20050111621 Riker et al. May 2005 A1
20050134578 Chambers et al. Jun 2005 A1
20050164794 Tahara Jul 2005 A1
20050270311 Rasmussen et al. Dec 2005 A1
20060022955 Kennedy Feb 2006 A1
20060025218 Hotta Feb 2006 A1
20060026535 Hotelling et al. Feb 2006 A1
20060055670 Castrucci Mar 2006 A1
20060085767 Hinckley et al. Apr 2006 A1
20060097991 Hotelling et al. May 2006 A1
20070004081 Hsiao Jan 2007 A1
20070040810 Dowe et al. Feb 2007 A1
20070057930 Iwema et al. Mar 2007 A1
20070070050 Westerman et al. Mar 2007 A1
20070096945 Rasmussen et al. May 2007 A1
20070110886 Hanna et al. May 2007 A1
20070118520 Bliss et al. May 2007 A1
20070180392 Russo Aug 2007 A1
20070234223 Bliss et al. Oct 2007 A1
20070252821 Hollemans et al. Nov 2007 A1
20080023161 Gather et al. Jan 2008 A1
20080023561 Durbin Jan 2008 A1
20080122796 Jobs May 2008 A1
20080208456 Jouline et al. Aug 2008 A1
20080222569 Champion et al. Sep 2008 A1
20080229245 Ulerich et al. Sep 2008 A1
20080231610 Hotelling et al. Sep 2008 A1
20080235610 Dettinger et al. Sep 2008 A1
20080309632 Westerman et al. Dec 2008 A1
20090037813 Newman et al. Feb 2009 A1
20090118001 Kelly et al. May 2009 A1
20090122018 Vymenets May 2009 A1
20090146968 Nartia et al. Jun 2009 A1
20090172593 Geurts et al. Jul 2009 A1
20090187842 Collins et al. Jul 2009 A1
20090313567 Soon-Young et al. Dec 2009 A1
20090325691 Loose Dec 2009 A1
20090327955 Mouilleseaux et al. Dec 2009 A1
20090327963 Mouilleseaux et al. Dec 2009 A1
20090327964 Mouilleseaux et al. Dec 2009 A1
20100093399 Kim et al. Apr 2010 A1
20100100849 Fram Apr 2010 A1
20100110032 Kim May 2010 A1
20100114471 Shinji et al. May 2010 A1
20100130213 Vendrow et al. May 2010 A1
20100185985 Chmielewski et al. Jul 2010 A1
20100192101 Chmielewski et al. Jul 2010 A1
20100192102 Chmielewski et al. Jul 2010 A1
20100192103 Cragun et al. Jul 2010 A1
20100217514 Nesbitt Aug 2010 A1
20100235778 Kocienda et al. Sep 2010 A1
20100251179 Cragun et al. Sep 2010 A1
20100251180 Cragun et al. Sep 2010 A1
20100283750 Kang et al. Nov 2010 A1
20100285881 Bilow Nov 2010 A1
20100287486 Coddington Nov 2010 A1
20100299637 Chmielewski et al. Nov 2010 A1
20100306702 Warner Dec 2010 A1
20100313126 Jung et al. Dec 2010 A1
20110014983 Miller, IV et al. Jan 2011 A1
20110066980 Chmielewski et al. Mar 2011 A1
20110066981 Chmielewski et al. Mar 2011 A1
20110081973 Hall Apr 2011 A1
20110093821 Wigdor et al. Apr 2011 A1
20110099180 Arrasvuori Apr 2011 A1
20110102336 Seok et al. May 2011 A1
20110111840 Gagner et al. May 2011 A1
20110163986 Lee et al. Jul 2011 A1
20110165913 Lee et al. Jul 2011 A1
20110184637 Jouline et al. Jul 2011 A1
20110184638 Jouline et al. Jul 2011 A1
20110209058 Hinckley et al. Aug 2011 A1
20110300934 Toy et al. Aug 2011 A1
20110210931 Shai Sep 2011 A1
20110225524 Cifra Sep 2011 A1
20110239110 Garrett et al. Sep 2011 A1
20110244937 Yamashita et al. Oct 2011 A1
20110248939 Woo et al. Oct 2011 A1
20110254806 Jung et al. Oct 2011 A1
20110270922 Jones et al. Nov 2011 A1
20110271182 Tsai et al. Nov 2011 A1
20110283188 Farrenkopf Nov 2011 A1
20110283231 Richstein et al. Nov 2011 A1
20110307843 Miyazaki et al. Dec 2011 A1
20110319169 Lam et al. Dec 2011 A1
20110320068 Lee et al. Dec 2011 A1
20120005577 Chakra et al. Jan 2012 A1
20120030566 Victor Feb 2012 A1
20120030567 Victor Feb 2012 A1
20120056836 Cha et al. Mar 2012 A1
20120094766 Reynolds et al. Apr 2012 A1
20120094770 Hall Apr 2012 A1
20120122561 Hedrick May 2012 A1
20120122586 Kelly et al. May 2012 A1
20120122587 Kelly et al. May 2012 A1
20120157210 Hall Jun 2012 A1
20120162265 Heinrich et al. Jun 2012 A1
20120185789 Louch Jul 2012 A1
20120190388 Catleman et al. Jul 2012 A1
20120264520 Marshland et al. Oct 2012 A1
20120266092 Zhu et al. Oct 2012 A1
20120306772 Tan et al. Dec 2012 A1
20120326993 Weisman Dec 2012 A1
20130016126 Wang Jan 2013 A1
20130027412 Roddy Jan 2013 A1
20130067332 Greenwood et al. Mar 2013 A1
20130120274 Ha et al. May 2013 A1
20130176298 Lee Jul 2013 A1
20130178281 Kartik et al. Jul 2013 A1
20130181986 Fowler Jul 2013 A1
20130207920 McCann Aug 2013 A1
20140066017 Cho Mar 2014 A1
20160184699 Rageh et al. Jun 2016 A1
Foreign Referenced Citations (29)
Number Date Country
1867886 Nov 2006 CN
102245272 Nov 2011 CN
102279697 Dec 2011 CN
102316945 Jan 2012 CN
102455851 May 2012 CN
2341420 Jul 2011 EP
2395419 Dec 2011 EP
2530569 Dec 2012 EP
2004525675 Aug 2004 JP
2005152509 Jun 2005 JP
2005211242 Aug 2005 JP
2006034754 Feb 2006 JP
2006185443 Jul 2006 JP
2008501490 Jan 2008 JP
2009125266 Jun 2009 JP
2009279050 Dec 2009 JP
2010012050 Jan 2010 JP
2010079590 Apr 2010 JP
2010187911 Sep 2010 JP
2010233957 Oct 2010 JP
2011036346 Feb 2011 JP
2012034970 Feb 2012 JP
2012081163 Apr 2012 JP
1020100014941 Feb 2010 KR
1020110069824 Jun 2011 KR
1020110080129 Jul 2011 KR
1020147019044 Aug 2014 KR
1020140123693 Oct 2014 KR
2012001637 Jan 2012 WO
Non-Patent Literature Citations (75)
Entry
“Metroid Prime: Hunters”, Dengeki Game Cube, vol. 4, No. 8, Media Works, Jul. 1, 2004, p. 10.
A farm simulation game software for iPhone “Eco faamu 2”, updated Jan. 25, 2012, 4 pages.
Australian Patent Examination Report No. 2 for Application No. 2013263809, dated Jan. 28, 2016, 5 pages.
Australian Patent Examination report received for Application No. 2013246615 dated Oct. 21, 2015, 3 pages.
Australian Patent Examination report received for Application No. 2016202994 dated Oct. 10, 2017, 6 pages.
Australian Patent Examination report received for Application No. 2016202994 dated Jun. 14, 2017, 8 pages.
Boulos, Maged N. Kamel, et al., “Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation”, International Journal of Health Geographics, Jul. 26, 2011, pp. 1-14.
Canadian Office Action received for Application No. CA2869766, dated Feb. 7, 2017, 3 pages.
Canadian Office Action received for Application No. CA2869766, dated Feb. 19, 2016, 4 pages.
Chinese Office Action dated Dec. 3, 2015 for Application No. 201380006199.2, 19 pages including 10 pages of English translation.
Chinese Second Office Action dated Jul. 8, 2016 for Application No. 201380006199.2, 20 pages including 12 pages of English translation.
Chinese Third Office Action dated Nov. 17, 2017 for Application No. 201380006199.2, 23 pages including 12 pages of English translation.
Combined Search and Examination Report received for United Kingdom Patent Application No. GB1222096.8, dated Jan. 29, 2013, 12 pages.
Combined Search and Examination Report received for United Kingdom Patent Application No. GB1409299.3, dated Jul. 8, 2014, 11 pages.
Welcome to Nintendo DS compatible ranch story! The ultimate guide to the wind bazaar, KOEI Co., Ltd., Feb. 19, 2010, 2nd edition, p. 8.
Communication Pursuant to Rule 161 (1) and 162 EPC received dated Feb. 27, 2015 for Application No. EP13737848.5, 2 pages.
Communication Pursuant to Rule 161 (1) and 162 received in EP Application No. EP13736623.3, dated Mar. 18, 2015, 2 pages.
Examination Report received for United Kingdom Patent Application No. GB1409299.3, dated May 12, 2015, 9 pages.
Examination Report received for United Kingdom Patent Application No. GB1222096.8, dated Jul. 8, 2014, 09 pages.
Examination Report received for Canadian Patent Application No. 2861101, dated Feb. 21, 2017, 4 pages.
Extended European Search Report received for Application No. EP16020420.2 dated Mar. 22, 2017, 10 pages.
Final Office Action dated Apr. 10, 2017 for U.S. Appl. No. 15/093,829, 7 Pages.
Final Office Action dated Nov. 15, 2013 for U.S. Appl. No. 13/479,637, 7 Pages.
Final Office Action dated Mar. 19, 2015 for U.S. Appl. No. 14/330,197, 8 Pages.
First Office Action received for the Chinese Patent Application No. 201380014183.6, dated Jun. 3, 2015, 8 pages including 3 pages of English Translation.
“How to Play Farmville on the iPad”, Viewed on internet on Jun. 7, 2017, published on Mar. 26, 2011. Available at: https://www.youtube.com/watch?v=LLOfWYUBPu4>.
How to play foreign games on iPhone and iPad, for Zombie Farm, Publisher Cosmic Publishing Co., Ltd, Mar. 19, 2011, pp. 052-055.
“Infinity Blade Cross”, 70-71, Weekly Famitsu, Japan, Enterbrain, Inc., Mar. 15, 2012, vol. 27, No. 13, No. 68 pages.
International Search Report received for International Patent Application No. PCT/IB2013/001063, dated Feb. 6, 2014, 5 pages.
Written Opinion of Searching Authority; PCT/IB2013/001211, dated Jan. 8, 2014.
International Preliminary Report on Patentability and Written Opinion received for International Application No. PCT/IB2013/001126, dated Oct. 14, 2014, 10 pages.
International Preliminary Report on Patentability and Written Opinion received for International Patent Application No. PCT/IB2013/001063, dated Oct. 23, 2014, 12 pages.
International Preliminary Report on Patentability and Written Opinion, dated Nov. 25, 2014 for Application No. PCT/IB2013/001211, 11 pages.
International Search Report received for International Application No. PCT/IB2013/001126, dated Jan. 8, 2014, 6 pages.
International Search Report received for International Application No. PCT/IB2013/001211, dated Jan. 8, 2014, 6 pages.
Japan Notice of Allowance, Application No. 2016-210300, dated Jul. 27, 2017, 3 pages.
Japanese Intellectual Property Office, Notification of Ground of Rejection, dated Oct. 2, 2015, 3 pages.
Korean Non-Final Rejection received for Application No. 1020147019044 dated Jun. 29, 2015, 7 pages including English translation.
Korean Non-Final Rejection received for Application No. 1020157022747 dated Nov. 17, 2015, 7 pages including English Translation.
Korean Non-Final Rejection received for Application No. 1020157022747 dated May 30, 2016, 5 pages including 2 pages of English translation.
Nakajima, Kengo, “Technology That Supports Online Gaming: Backstage Behind Expansive Playing Space: How to Achieve Both Unlimited Simultaneous Connections and Millisecond Latency”, Gijyutsuhyouronsha Corporation, Apr. 1, 2011, 4 pages.
“Naver Blog Review on Every Farm”, posted on Nov. 3, 2011. Available at: http://blog.naver.com/yspray4u/10123134648.
Nintendo DS compatible river's fishing Komei Valley Seseragi's poetry complete guide, KOEI Co., Ltd., Aug. 15, 2007, p. 62.
Non Final Office Action dated Mar. 26, 2013 for U.S. Appl. No. 13/714,858, 5 Pages.
Non Final Office Action dated Sep. 27, 2016 for U.S. Appl. No. 15/093,829 13 Pages.
Non Final Office Action dated Aug. 28, 2015 for U.S. Appl. No. 14/330,197, 4 Pages.
Non Final Office Action dated Mar. 6, 2013 for U.S. Appl. No. 13/479,637, 5 Pages.
Non-Final Office Action received in U.S. Appl. No. 14/391,229 dated Nov. 17, 2016, 14 Pages.
Non-Final Office Action received in U.S. Appl. No. 14/391,229 dated Apr. 7, 2017, 16 Pages.
Non Final Rejection received in Korean Application No. 10-2014-7021063, dated Aug. 11, 2015, 08 Pages.
Zombie Farm, iPhone, iPad Overseas Game App Strategy Law, Cosmic Publishing Co., Ltd. , Mar. 19, 2011, 1 page.
Patent Examination Report No. 1 received for AU Application No. 2016203195, dated Oct. 28, 2016, 3 Pages.
Notification of the Third Office Action received in Chinese Application No. 201380007040.2, dated Jan. 22, 2017, 3 pages.
Korean Intellectual Property Office, Notice of Non-Final Rejection, Application No. 10-2016-7028220, dated Apr. 27, 2018, 2 pages.
Non-Final Rejection received for Korean Patent Application No. 1020137020715, dated Jun. 29, 2015.
Notice of Allowance, Japan Patent Office, Application No. 2014-552719, dated Oct. 25, 2016, 3 pages.
Notice of Ground of Rejection received for Japanese Patent Application No. 2014-552719, dated Oct. 2, 2015, 5 pages including 3 pages of English Translation.
Notice of Non-Final Rejection Received for Korean Patent Application No. 10-2014-7019044, dated Jun. 29, 2015, 14 pages including 7 pages of English Translation.
Notification of ground of rejection received for Application No. JP2014-552719 dated May 12, 2016.
Notification of ground of rejection received for Application No. JP2014-552719 dated Oct. 21, 2016.
Notification of ground of rejection received for Application No. JP2014-552719 dated Sep. 28, 2015.
Notification of Ground of Rejection, JP Application No. 2014-148443, dated Dec. 16, 2016, 08 pages including 4 pages of English translation.
Notification of ground of rejection, Japan Patent Office, Application No. 2014-552719, dated May 18, 2016, 4 pages.
Notification of the First Office Action received in Chinese Application No. 201380007040.2, dated Feb. 2, 2016, 3 pages.
Notification of the Second Office Action received in Chinese Application No. 201380007040.2, dated Aug. 4, 2016, 05 pages including 02 pages of English translation.
Oitaeri detective Namiso cultivation kit, Famitsu App iPhone & Android No. 001, Japan, Enterbrain Co., Ltd., Dec. 22, 2011.
Refusal Notice received for Japan Patent Application No. 2014-552718 dated Sep. 28, 2016.
Scott Rogers, Swipe This!: The Guide to Great Touchscreen Game Design, 2012, Wiley, p. 112.
SimCity DS2 Perfect Support, Publisher: Enter Brain Co., Ltd, Aug. 31, 2008, p. 008.
“Youtube video” uploaded on Jul. 27, 2009. Available at: https://www.youtube.com/watch?v=sJEOvyuvePE.
Zombie Farm, iPhone, iPad Overseas Game App Strategy Law, Cosmic Publishing Co., Ltd. , Mar. 19, 2011, pp. 052-055.
Gamer editorial department, article on the distribution start of the formation arrangement game “Wind and Cloud for Android: Large Siege”, Gamer, Yksel, Inc., Feb. 2, 2012, [online], retrieved Jul. 18, 2018, URL: http://www.gamer.ne.jp/news/201202020035/, English translation retrieved Oct. 23, 2018, 12 pages.
Japan Office Action, Application No. 2017-152183, dated Jul. 25, 2018, 3 pages.
Australia Office Action, Application No. 2017201869, dated Sep. 25, 2018, 5 pages.
Etheridge D., “Java: Graphical User Interfaces”, David Etheridge & Ventus Publishing ApS (2009), ISBN 978-87-7681-496-0. 101 pages.
Related Publications (1)
Number Date Country
20180082517 A1 Mar 2018 US
Continuations (3)
Number Date Country
Parent 15093829 Apr 2016 US
Child 15822709 US
Parent 14330197 Jul 2014 US
Child 15093829 US
Parent 13479637 May 2012 US
Child 14330197 US