The present invention relates to systems for controlling technical processes, for example to systems for controlling technical processes relating to at least one of technical process simulation and technical process control. Moreover, the present invention concerns methods of controlling technical processes, for example to methods of controlling technical processes relating to at least one of technical process simulation and technical process control. Furthermore, the present invention relates to software products recorded on machine-readable data storage media, wherein the software products are executable upon computing hardware for implementing aforesaid methods.
Graphical user interfaces (GUI) for controlling complex processes are known, for example in control rooms associated with nuclear power plants, in military defence systems and in aviation management. Such graphical user interfaces can be employed both to control real technical facilities and to control simulations of such facilities. The simulations provide an opportunity to investigate a potential behaviour of the facilities, prior to applying actual control signals and commands to the facilities, thereby providing better control of the facilities and anticipation of their behaviour.
Computing devices include tablet computers such as iPads, and smart phones, including Apple's IPHONE®, Google's ANDROID® phone, and SYMBIAN® phones. These computing devices have highly user-friendly graphical interfaces that enable easy and quick interaction for their users. Most of these devices incorporate touch-sensitive screens that obtain users' inputs and facilitate smooth user interaction. Simulation software, for example conveniently implemented in a form of gaming software, is employed in many of these devices for leisure purposes as well as for technical control purposes as aforementioned. An important aspect of a gaming or simulation system is the ease with which a given user can enter desired inputs and interact with the user interface of the device on which he/she plays a game and/or executes a simulation. For devices that lack a touch-screen facility, the only possible ways for the given user to interact while playing a game or executing a simulation are clicking an associated mouse, using associated keyboard functions/keys, or using associated joysticks. The 'point and click' and 'joystick' experiences provided by many lower-grade electronic devices are cumbersome and often time-consuming while playing a game or executing a technical simulation. Specifically, there are certain games or technical simulations in which a given user/player needs to use clicking, pointing, tapping and dragging operations many times, and often at different device display locations, which is hard to achieve through a contemporary mouse or a contemporary joystick. In a typical gaming environment, where a given user needs to perform similar operations by clicking or touching multiple points on the interface, this becomes cumbersome. Even the touch-sensitive screens provided in many conventional electronic devices are capable of sensing a touching operation at only one point at a time. Multi-touch screens are still not widespread, yet they can be of great benefit in a gaming or simulation environment. Some conventional gaming console applications can be controlled through multi-touch-sensitive operations; however, in strategic gaming environments, they still have drawbacks when certain desired operations are to be performed.
Therefore, considering the aforementioned problems, there exists a need for a better, more congenial graphical user interface (GUI) for a gaming or technical simulation-and-control system, for use when playing a game or executing a simulation on a computing device.
The present invention seeks to provide an improved graphical user interface which is more convenient to employ when undertaking gaming activities and/or executing simulations of real technical systems before proceeding to control operation of such technical systems.
The present invention also seeks to provide an improved method of employing a graphical user interface which is more convenient when undertaking gaming activities and/or executing simulations of real technical systems before proceeding to control operation of such systems.
According to a first aspect of the present invention, there is provided an electronic device, as claimed in claim 1, comprising:
a touch-sensitive display screen, configured to simultaneously sense touching operations performed at multiple points of the screen;
a computing hardware operable to execute a software product, wherein executing the software product results in generating and rendering a graphical user interface on the display screen, the graphical user interface being configured to facilitate user interaction; the graphical user interface, when rendered, presenting: one or more graphical objects; and
one or more user selectable options, each option representing one or more resources for performing an operation on one or more of the one or more graphical objects; wherein
based on a user's selection of one or more of the user selectable options, the software product is configured to render the one or more resources corresponding to the selected user selectable option, at multiple locations of the interface.
The invention is of advantage in that the graphical user interface is more convenient to employ, for example when submitting complex instructions requiring concurrent deployment activities.
Optionally, in the electronic device, the software product is configured to execute actions through the one or more resources, on one or more graphical objects, based on receiving a user's execution input, after rendering the one or more resources at multiple locations.
Optionally, in the electronic device, the software product is configured to render the one or more resources at multiple locations, based on a user's touching operation at multiple points of the display screen, or a user's swiping operation through the multiple points, the multiple points corresponding to the multiple locations on the display screen. More optionally, in the electronic device, the software product is configured to render the one or more resources at multiple locations when the user's touching operation at the multiple points is performed for a pre-determined time. Yet more optionally, in the electronic device, the pre-determined time is user-adjustable. More optionally, in the electronic device, the rapidity of rendering the one or more resources at multiple locations is based at least on the speed of the user's touching operation or the swiping operation on the display screen.
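By way of a non-limiting illustration, the multi-point rendering behaviour described above may be sketched in software as follows; the touch-event structure, the 0.5-second default hold time and the linear speed-to-rapidity mapping are assumptions made purely for this sketch, not details prescribed by the present disclosure.

```python
import time
from dataclasses import dataclass

HOLD_TIME_S = 0.5  # pre-determined, user-adjustable hold time (assumed default)

@dataclass
class TouchPoint:
    x: float
    y: float
    t_start: float  # moment at which contact with this point began

def locations_to_render(active_touches, now=None):
    """Return the screen locations at which the selected resource should be
    rendered: every touch point held for at least the pre-determined time."""
    now = time.monotonic() if now is None else now
    return [(tp.x, tp.y) for tp in active_touches
            if now - tp.t_start >= HOLD_TIME_S]

def rendering_rate(swipe_speed_px_s, base_rate=1.0, gain=0.01):
    """Rapidity of rendering (resources per second) as a function of the
    speed of the touching or swiping operation; a linear model is assumed."""
    return base_rate + gain * swipe_speed_px_s

# Example: two of three simultaneous touches have been held long enough.
touches = [TouchPoint(100, 220, 0.0), TouchPoint(340, 180, 0.1), TouchPoint(500, 400, 0.9)]
print(locations_to_render(touches, now=1.0))  # [(100, 220), (340, 180)]
```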
More optionally, in the electronic device, the number of resources rendered at multiple locations depends at least on the pressure applied by the user on the display screen while performing the touching or swiping operation. Pressure applied by a given user to a touch-screen is beneficially determined by one or more pressure-sensitive transducers integrated into the touch-screen. However, certain contemporary touch-screens operate only on a binary basis, namely there either is or is not contact with a given area of the touch-screen. In such touch-screens, pressure applied by the given user can be determined from the area of neighbouring, spatially contiguous sensing points on the screen which substantially simultaneously experience a contact being made by the given user: progressively more spatially contiguous sensing points are substantially simultaneously in a contact state as progressively more pressure is applied, owing to elastic deformation of the biological tissue of the given user's finger tips. A similar pressure-sensing functionality can be achieved when the given user employs a pointing device having an elastically deformable end to contact the touch-screen to control a game or a simulation, or to enter commands for controlling a real technical facility. Optionally, the game or simulation includes a calibration routine for a given user to perform to calibrate the pressure sensitivity of the touch-screen.
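A minimal sketch of this contact-area approach to pressure estimation follows; the binary sensing grid, the flood-fill search and the area-to-resource mapping are illustrative assumptions only.

```python
def contact_area(grid, start):
    """Size of the spatially contiguous region of sensing points in contact
    (value 1) around `start`; on a binary touch grid this area serves as a
    proxy for the pressure applied."""
    rows, cols = len(grid), len(grid[0])
    stack, seen = [start], set()
    while stack:
        r, c = stack.pop()
        if (r, c) in seen or not (0 <= r < rows and 0 <= c < cols) or grid[r][c] == 0:
            continue
        seen.add((r, c))
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return len(seen)

def resources_for_pressure(area, area_per_resource=4, max_resources=10):
    """Assumed mapping: more contiguous contact points imply more pressure,
    and hence more resources are deployed at that location."""
    return min(max(1, area // area_per_resource), max_resources)

# Example: a light touch (4 contiguous points) versus a firm press (9 points).
light = [[0, 1, 0], [1, 1, 0], [0, 1, 0]]
firm = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
print(resources_for_pressure(contact_area(light, (1, 1))))  # 1
print(resources_for_pressure(contact_area(firm, (1, 1))))   # 2
```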
Optionally, in the electronic device, the software product and the graphical user interface correspond to a gaming environment. More optionally, in the electronic device, the gaming environment corresponds to a war-based game, the graphical user interface corresponding to a battlefield, and the one or more resources corresponding to weapons of use within the battlefield. For example, the gaming environment can be a simulation, prior to implementing a military operation in practice using real technical hardware.
Optionally, the electronic device includes a database for continuously recording and updating the change in status of the one or more graphical objects, the software product being coupled to the database and being configured to restore the status of the one or more graphical objects to their last updated status in the database, in case of interruptions in the operable state of the graphical user interface.
Optionally, a plurality of electronic devices are connected to a server through a network, the graphical user interface being generated temporally concurrently on each of the electronic devices, to facilitate a plurality of users' interaction with the graphical user interface, wherein the graphical user interfaces generated on the plurality of electronic devices are coordinated and synchronized through the server, and updated concurrently on the plurality of electronic devices with time.
Optionally, the electronic device is implemented as a desktop computer, a laptop computer, an IPAD®, or a smart phone, including an IPHONE®, an ANDROID® phone or a SYMBIAN® phone; “®” denotes registered trademarks.
According to a second aspect of the invention, there is provided a method of facilitating user interactions with a graphical user interface, the graphical interface being generated and rendered on the display of an electronic device, by executing a software product on a computing hardware of the electronic device, the method comprising:
(a) rendering one or more graphical objects, and one or more user-selectable options corresponding to the one or more graphical objects, on the graphical user interface, each user-selectable option corresponding to one or more resources to be deployed on the interface;
(b) selecting one or more of the user-selectable options, and performing one of a touching operation at different points on the display, and a swiping operation through the different points of the display; and
(c) deploying the one or more resources corresponding to the selected user-selectable option at multiple locations on the interface simultaneously, the multiple locations corresponding to the different points at which the touching operation, or through which the swiping operation, is performed.
Optionally, the method further comprises:
(d) deploying the one or more resources at multiple locations based at least on detecting that the touching operation at the multiple points on the display screen is performed for a pre-determined time.
Optionally, in the method, the rapidity of deployment of the one or more resources at multiple locations depends on the speed of the swiping operation or the touching operation.
Optionally, in the method, the number of resources deployed at the different locations on the interface depends on the pressure applied on the display screen during performing the touching operation or the swiping operation.
Optionally, in the method, the software product and the graphical user interface correspond to a gaming system. More optionally, in the method, the gaming system corresponds to a war-based game, the graphical user interface corresponds to a battlefield, and the one or more resources correspond to weapons of use within the battlefield.
Optionally, the method further comprises continuously recording and updating the change in status of the one or more graphical objects in a database, coupling the software product to the database, and restoring the status of the one or more graphical objects to their last updated status in the database, in case of interruptions in the operations of the graphical user interface.
Optionally, the method further comprises:
(e) connecting a plurality of the electronic devices to a server through a network;
(f) generating the graphical user interface temporally concurrently on the displays of the different electronic devices; and
(g) coordinating the plurality of graphical user interfaces through the server, and updating them concurrently with time, to facilitate multiple users' interaction and coordination with the graphical user interfaces.
More optionally, in the method, the graphical user interfaces correspond to a gaming system, and the method is configured to facilitate an online multiplayer gaming system.
According to a third aspect of the present invention, there is provided a software product recorded on a machine readable data storage medium, wherein the software product is executable on the computing hardware of a computing device for implementing a method according to the second aspect of the invention.
According to a fourth aspect of the present invention, there is provided an electronic device comprising:
a display screen;
a computing hardware capable of executing a software product, wherein executing the software product leads to generating and rendering a graphical user interface on the display screen, the graphical user interface being configured to facilitate easy user interaction; the graphical user interface, when rendered, presenting:
one or more graphical objects;
a pointer object configured to be movable over one or more graphical objects, and configured to invoke a menu list containing one or more user selectable options as the pointer object is clicked or tapped over one or more of the graphical objects, wherein:
on selecting a user selectable option from the menu list, and performing one of a dragging of the pointer object and a swiping of a touch sensitive object over the graphical user interface, through one or more graphical objects or through a portion of a specific graphical object, the software product is configured to keep an effect corresponding to the selected option continuously applied to the one or more graphical objects, or to the portion of the specific graphical object, respectively, thereby enabling a change in status of the one or more graphical objects, or of the portion of the specific graphical object, respectively.
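Purely as a non-limiting sketch of this aspect, the following shows a selected tool remaining continuously applied to each graphical object crossed by a drag or swipe, and ceasing to apply when the drag terminates; the tile-based farming field (anticipating the farming example below) and all function names here are hypothetical.

```python
# Hypothetical tile-based field: (column, row) -> current status of that region.
field_state = {}
active_tool = None  # set while a user selectable option from the menu list is active

def select_tool(tool):
    """Select an option (e.g. a farming tool) from the invoked menu list."""
    global active_tool
    active_tool = tool

def on_drag(points, tile_size=32):
    """Apply the active tool continuously to every tile crossed by the
    dragging or swiping path, changing the status of those objects."""
    if active_tool is None:
        return
    for x, y in points:
        tile = (int(x // tile_size), int(y // tile_size))
        field_state[tile] = active_tool

def on_drag_end():
    """Terminate the effect and let the selected option disappear."""
    global active_tool
    active_tool = None

select_tool("plough")
on_drag([(10, 10), (40, 12), (75, 15)])  # swipe crossing three tiles
on_drag_end()
print(field_state)  # {(0, 0): 'plough', (1, 0): 'plough', (2, 0): 'plough'}
```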
Optionally, the electronic device further comprises a user input interface rendered over the graphical user interface, the user input interface being configured to obtain and interpret a user's input corresponding to moving the pointer object, clicking or tapping the pointer object, or swiping the touch sensitive object over the graphical user interface.
Optionally, in the electronic device, the graphical user interface is configured to apply continuously the user selectable option to the portion of the specific graphical object, or to the one or more graphical objects, in response to the user swiping the touch sensitive object over the portion of the specific graphical object, or over the one or more graphical objects, respectively.
Optionally, in the electronic device, the graphical user interface is configured to facilitate termination of the effect corresponding to the selected option, and to facilitate disappearance of the selected option, on termination of the dragging of the pointer object or the swiping of the touch sensitive object.
Optionally, in the electronic device, the rapidity of application of the effect corresponding to the selected option, over the portion of the graphical object or over the one or more graphical objects, is dependent at least partially on the speed of performing the dragging operation of the pointer object, or the swiping operation of the touch sensitive object. More optionally, in the electronic device, the display screen is a touch-sensitive screen, and the user input interface is implemented as a tactile surface of the touch-sensitive screen.
Optionally, in the electronic device, the graphical user interface is configured to disable the application of the effect corresponding to the selected option, to the portion of the specific graphical object or to the one or more graphical objects, in response to termination of the dragging of the pointer object, or of the swiping of the touch sensitive object.
Optionally, in the electronic device, the graphical user interface and the software product correspond to a gaming system or a simulation system. More optionally, in the electronic device, the gaming system or simulation system corresponds to a farming game, the graphical objects of the graphical user interface correspond to different spatial regions of a farming environment in which one or more crops are desired to be grown, and wherein the user selectable options in the menu list correspond to different farming tools.
Optionally, the electronic device includes a database for continuously recording and updating the change in status of the one or more graphical objects, the software product being coupled to the database and being configured to restore the status of the one or more graphical objects to their last updated status in the database, in case of interruptions in the operable state of the graphical user interface.
Optionally, a plurality of electronic devices are connected to a server through a network, the graphical user interface being generated temporally concurrently on each of the electronic devices, to facilitate a plurality of users' interaction with the graphical user interface, wherein the graphical user interfaces generated on the plurality of electronic devices are coordinated through the server and updated concurrently on the plurality of electronic devices with time.
Optionally, the electronic device is implemented as a desktop computer, a laptop computer, an IPAD®, or a smart phone, including an IPHONE®, an ANDROID® phone or a SYMBIAN® phone; “®” denotes registered trademarks.
According to a fifth aspect of the present invention, there is provided a method of facilitating easy user interactions with a graphical user interface, the graphical interface being generated and rendered on the display of an electronic device, by executing a software product on a computing hardware of the electronic device, the method comprising:
(a) rendering one or more graphical objects within the graphical user interface;
(b) clicking or tapping one or more graphical objects through a pointer object, to invoke a menu list containing a set of user-selectable options, the user-selectable options corresponding to an effect to be applied to a portion of a specific graphical object, or to one or more graphical objects; and
(c) selecting a specific user-selectable option, and applying the effect corresponding to the selected option to a portion of a specific graphical object, or to one or more graphical objects, by performing one of a dragging operation of the pointer object and a swiping operation of a touch sensitive object, over the specific portion of the graphical object or over the one or more graphical objects, respectively.
Optionally, in the method, the graphical user interface is configured to keep the effect corresponding to the selected user-selectable option active for as long as the dragging operation or the swiping operation is being performed, and is configured to cause the selected option to disappear when the dragging or the swiping operation is terminated.
Optionally, in the method, the software product corresponds to a gaming system or a simulation system. More optionally, in the method, the gaming or simulation system corresponds to a farming game or a farming simulation, the graphical objects correspond to the spatial regions of a farming environment, and the user selectable options correspond to different farming tools.
Optionally, the method further comprises continuously recording and updating the change in status of the one or more graphical objects in a database, coupling the software product to the database, and restoring the status of the one or more graphical objects to their last updated status in the database, in case of interruptions in the operations of the graphical user interface.
Optionally, the method further comprises:
(d) connecting a plurality of the electronic devices to a server through a network;
(e) generating the graphical user interface temporally concurrently on the displays of the different electronic devices; and
(f) coordinating the plurality of graphical user interfaces through the server, and updating them concurrently with time, to facilitate multiple users' interaction and coordination with the graphical user interfaces.
Optionally, in the method, the graphical user interfaces correspond to a gaming system, and the method is configured to facilitate an online multiplayer gaming system.
According to a sixth aspect of the present invention, there is provided a software product recorded on a machine readable data storage medium, the software product being executable on the computing hardware of a computing device, for implementing a method pursuant to the fifth aspect of the invention.
It will be appreciated that features of the invention are susceptible to being combined in various combinations without departing from the scope of the invention as defined by the appended claims.
Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying diagrams.
In the accompanying diagrams, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
In overview, the present invention is concerned with an apparatus for controlling technical processes, wherein the technical processes include elements of simulation and control of facilities.
An embodiment of the present invention pertains to a graphical user interface for a gaming and/or simulation system, for facilitating easy and quick interaction of a given user while playing a game or controlling a simulation, and for avoiding contemporary problems experienced while performing touching or swiping operations on the touch-sensitive screens of electronic devices on which the games are being played and/or simulations are being executed.
Gaming systems are incorporated for leisure in many electronic computing devices, including computers, iPads, mobile phones, tablet computers and smart phones. Many such conventional electronic devices incorporate touch-sensitive screens for obtaining user inputs and for providing a congenial user experience with the interface. For playing games, or controlling technical simulations, on electronic devices without a touch-sensitive screen, including many desktop and laptop computers, the user generally interacts with and provides inputs to a gaming or simulation system's interface through coupled input devices, such as mice, certain keys on the keypads, and joysticks. Using multiple clicking operations through a mouse is time-consuming and unfavourable, for example, in cases where a same operation needs to be performed at multiple points on the gaming or simulation interface. Even with devices having touch-sensitive displays, when similar operations corresponding to the game being played, or the simulation being executed, need to be performed simultaneously through multiple regions of the interface, this becomes difficult to achieve, as conventional touch-sensitive screens are capable of sensing touching operations only one at a time, at a specific point. Even though multi-touch sensitive screens are currently available and are incorporated in electronic devices, operations corresponding to certain games, when played, and similarly to certain technical simulations, require simultaneous sensing and detecting of touching or swiping operations performed through multiple regions of the screen.
Thus, the present disclosure provides an enhanced graphical user interface for a gaming and/or simulation system, which improves a given user's experience while playing a game, or executing a technical simulation, on an electronic device. The system and method facilitate the performing of touching and swiping operations through a multi-touch-sensitive screen of the electronic device, and allow the given user to perform similar operations pertaining to the game or simulation simultaneously, through different regions of the interface.
A specific deployed resource is released for action, for example to attack the target 208, based on detection of certain conditions. This may include, for example, the user still keeping his/her finger at a desired point for about 1 to 2 seconds after the resource has already been deployed at that point. In another case, an execution option may be separately rendered on the display screen, and the user needs to provide an execution command through that option after the resources are deployed. Moreover, the multi-touch operations performed through the different fingers act independently, and the display screen is configured to sense and interpret the swiping or touching operations performed through these fingers independently. Specifically, as an example, when one finger is touched or swiped through specific points on the screen, one set of resources may be deployed over one set of locations corresponding to those points, and subsequently, when another finger is touched or swiped through a different set of points, a second set of resources may be deployed over those points too. The two sets of resources may be the same or different, depending on the game settings, which are user-adjustable and can be customized before playing the game or executing the simulation. Furthermore, as aforementioned, the display screen is also capable of sensing touching or swiping operations performed at different points simultaneously, and of deploying the resources at those different points together. In an embodiment, the number of resources deployed at different points may be one for each detected touching operation performed at a point. Alternatively, a constant number of resources per unit time may be deployed at a specific point, or over a set of points, for as long as a touching or a swiping operation is performed over those points. In another embodiment, as aforementioned, the number of resources deployed is a function of the pressure applied by the user while performing the touching or swiping operation. Specifically, a higher pressure applied at a specific point optionally results in deploying a greater number of resources at that point, and vice versa.
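Purely by way of illustration, the two release conditions just described may be implemented along the following lines; the 1.5-second hold threshold and the class structure are assumptions of this sketch.

```python
RELEASE_HOLD_S = 1.5  # assumed value within the ~1-2 second window described above

class DeployedResource:
    """A resource already deployed at a location, awaiting release for action."""

    def __init__(self, location):
        self.location = location
        self.hold_started = None
        self.released = False

    def on_touch(self, now):
        """Automatic release: the finger rests on the deployed resource."""
        if self.hold_started is None:
            self.hold_started = now
        elif now - self.hold_started >= RELEASE_HOLD_S:
            self.release()

    def on_touch_end(self):
        self.hold_started = None  # the hold was abandoned before release

    def on_fire_option(self):
        """Manual release: the separately rendered execution option is used."""
        self.release()

    def release(self):
        self.released = True  # the resource now acts, e.g. attacks its target

resource = DeployedResource(location=(120, 260))
resource.on_touch(now=0.0)
resource.on_touch(now=1.6)  # held past the threshold -> released automatically
assert resource.released
```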
For facilitating single-player gaming or single-user simulation, a user logs on through any of the electronic devices 502, 504, 506 or 508, and connects to one of the gaming or simulation servers 510 or 540, through a suitable network, for example via the Internet and/or a wireless communication network. As the user logs on and executes the gaming or simulation software on the computing hardware of the specific device that he/she utilizes, for example the device 502, a graphical user interface corresponding to the game is generated and is rendered on the display screen of the device 502. The graphical user interface presents different graphical objects pertaining to the game or simulation on the display screen of the device 502. The graphical objects may be represented by different blocks/segments of the graphical user interface, on which different operations corresponding to the game being played, or the simulation being executed, can be performed. For example, in a case where the game is a war-based game, or the simulation relates to technical military hardware such as guns, bombs and such like, such blocks/segments may represent one or more targets that need to be conquered, such as the target 208 shown earlier.
Another database 580, coupled to the gaming or simulation server 510, serves as a back-end database for the gaming or simulation server 510. As the user of the device 502 starts playing the game, or executing a simulation, typical actions and gestures performed by the user are recorded in the back-end database 580. Specifically, such actions are interpreted through the gaming or simulation server 510, and are sent as messages to the back-end database 580, which eventually maintains a log of, and a backup for, the played game or executed simulation. Such messages can be in the form of data packages sent over an Internet connection through which the device 502 is connected to the server 510, or sent over any other wireless or wired network connecting the device 502 to the server 510, as aforementioned. Typical elements of such messages for maintaining a backup for the game or simulation include a header, a payload and a checksum. The checksum can be a function of the payload, or it may be a unique user identifier, such as a username or similar. An advantage arising from including the checksum in the back-end maintaining messages is a possibility of avoiding potential frauds while playing the game, or avoiding third-party corruption of a simulation which could adversely influence results generated by the simulation. Those skilled in the art will understand that an appropriate checksum function or checksum algorithm may be applied to the collected digital data, while the game is being played or the simulation is being executed, to obtain the checksum. Further, the checksum corresponding to specific data can be recomputed at any point of time, and compared to the stored checksum, to detect possible frauds. The back-end messages received by the server 510 are also sent to the other databases 520 and 530 of the server 510. In these databases 520, 530, the back-end messages are used to maintain a continuous logic that represents the status of the game or simulation, for example the exact score of the player updated with time and the stage of the game that the player has already reached, or results of the simulation such as yield, integrity of a structure and similar. With a continuous receipt of the back-end messages by the databases 520 and 530, a regular updating of the game status is undertaken within these server databases 520 and 530 with time. This facilitates resumption of the game or simulation to its last status in cases where the device 502 unexpectedly shuts down, the device 502 is unexpectedly hindered in its communication, the user changes the gaming or simulation terminal, or he/she intentionally quits playing or executing the simulation for a certain period and logs in at some other time; such a possibility of resumption assists in enhancing user satisfaction with the graphical user interface.
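A minimal sketch of such a back-end message, with a header, a payload and a payload-derived checksum, is given below. The present disclosure does not fix a particular checksum algorithm, so the use of SHA-256 and the JSON encoding here are assumptions made for illustration.

```python
import hashlib
import json

def make_message(user_id, payload):
    """Build a back-end message whose checksum is a function of the payload."""
    body = json.dumps(payload, sort_keys=True).encode()
    return {
        "header": {"user": user_id, "version": 1},
        "payload": payload,
        "checksum": hashlib.sha256(body).hexdigest(),
    }

def verify_message(message):
    """Recompute the checksum at any later point of time and compare it with
    the stored checksum, as the anti-fraud check above describes."""
    body = json.dumps(message["payload"], sort_keys=True).encode()
    return hashlib.sha256(body).hexdigest() == message["checksum"]

msg = make_message("player42", {"action": "deploy", "score": 1200})
assert verify_message(msg)           # unmodified message passes
msg["payload"]["score"] = 999999     # tampering with the payload...
assert not verify_message(msg)       # ...is detected by the recomputed checksum
```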
Though only two servers 510 and 540 have been shown, there can be multiple gaming or simulation servers coordinating with, and connected to, each other for implementing the gaming and/or simulation environment in accordance with the present disclosure.
Although the gaming or simulation system implementable through the illustrated gaming or simulation environment has been described for the case when a single user logs on to any of the electronic devices 502, 504, 506 or 508, the same gaming or simulation environment is capable of supporting multi-participant gaming or simulation, wherein different users may log on through different electronic devices, synchronize with each other by connecting concurrently through any of the common gaming or simulation servers 510 and 540, through suitable networks as aforementioned, and share a common graphical user interface representing the ongoing game or simulation. In such embodiments, the graphical user interface rendered on the display screens of the different electronic devices is regularly updated, concurrently, through the logic data stored in the databases 520 and 530 of the gaming or simulation servers, at the back-end.
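One possible, much-simplified shape of this server-side coordination is sketched below; network transport is omitted, per-device queues stand in for connections, and the state fields are illustrative assumptions.

```python
from queue import Queue

class GameServer:
    """Holds the shared game/simulation state and pushes every update to all
    connected devices so their interfaces remain synchronized."""

    def __init__(self):
        self.state = {"score": 0, "stage": 1}
        self.devices = []  # one outbound queue per connected electronic device

    def connect(self):
        q = Queue()
        self.devices.append(q)
        q.put(dict(self.state))  # a newly connected device gets the current state
        return q

    def apply_action(self, update):
        """Interpret a user action and update every rendered interface."""
        self.state.update(update)
        for q in self.devices:
            q.put(dict(self.state))

server = GameServer()
device_a = server.connect()
device_b = server.connect()
server.apply_action({"score": 1250})
assert device_a.get() == {"score": 0, "stage": 1}      # initial snapshot
assert device_b.get() == {"score": 0, "stage": 1}      # initial snapshot
assert device_b.get() == {"score": 1250, "stage": 1}   # concurrent update
```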
At a step 628, the method includes checking whether or not other resources are desired to be deployed, before executing actions through the resources. If “yes”, the method includes returning to the step 616, selecting the selectable options corresponding to the resource, and performing the touching or swiping operations through the desired points again. Otherwise, at a step 632, the method includes releasing the deployed resources for action within the gaming or simulation environment. For example, in a war-based game or simulation, the deployed troops/armed soldiers are released for operating on a specific target, to attack it from the different points where they are deployed. In an embodiment, the releasing of the deployed resources is automated, and occurs when the user keeps his/her fingers on a specific resource for a pre-determined time after deploying it. For example, this time may be about 1 to 2 seconds of touching operation after the resource is already deployed. The display screen is configured to sense this pre-determined time, and the software product executes the action pertaining to the deployed resource when this occurs. In another embodiment, releasing the different resources may require a manual user input. Specifically, for example, a triggering option (such as a “go” or “fire” option) may be rendered after deploying the resources, and the resources may not be released until the user manually initiates that option. At a step 636, after the actions have been performed by the deployed resources, the graphical user interface is updated, and a reformed interface representing the latest status of the gaming or simulation environment is rendered on the display screen.
The method and system of the present disclosure, for improving interaction of a user with a graphical user interface corresponding to a game and/or simulation, provide substantial benefits as the user performs different operations in a gaming or simulation environment. Similar operations, when desired to be performed at different locations on the gaming or simulation interface, can be easily executed by touching or swiping through multiple points of the display screen simultaneously. Hence, the user's experience with the gaming or simulation interface is much more comfortable.
Although the present disclosure has been described comprehensively through an exemplary embodiment where it is applicable in a gaming and/or simulation environment, and specifically through the example of a war-based game or simulation, the disclosure also finds its applications in other gaming, control and simulation environments, and, generally, may be applicable to other graphical user interfaces not pertaining to a gaming or simulation system. In certain applications, the user interface of the disclosed embodiment can be used for a virtual control of any type of game, technical system or simulation. Certain aspects of the disclosed embodiments are also applicable to other operations, including building arcades and solving puzzle games. Further, the congenial user interface may also be implemented within other types of games, for example adventure, role-playing and shooting games, construction and management simulation games, and so forth. For example, the congenial user interface can be used in computer terminals employed at financial exchanges, for example on Wall Street in New York and at the Stock Exchange in London, where traders need to control multiple transactions simultaneously when executing a financial transaction, for example a synthetic credit default swap or trading in derivative financial products.
Further embodiments of the present invention will now be described below. The present disclosure pertains to a graphical user interface (GUI) for a gaming or simulation system, as aforementioned, for facilitating easy and quick interaction of a user while playing a game or executing a simulation, and for avoiding the cumbersome operations normally experienced while using a mouse or a joystick when a game or simulation is played or executed on an electronic device.
Gaming and simulation systems are incorporated for leisure in many electronic devices, including computers, iPads, mobile phones, tablet computers and smart phones. While playing a game, or executing a simulation, on computing devices without a touch-screen facility, including many conventionally available desktop and laptop computers, the major mode of interaction of a user with the gaming or simulation system interface is through devices such as a mouse, certain keys on the keypad, and joysticks coupled to the computing device. In many games or technical simulations, the user often desires quick application of certain operations, for which he/she needs to use the clicking or tapping operation multiple times, and at different spots of the interface, which takes time. Most smart phones and tablet computers now incorporate touch-screen displays, and playing games on these devices is comparatively easier. However, while interacting with the touch-sensitive screen acting as a tactile surface for the graphical user interface corresponding to a game, multiple clicking or tapping operations at the same or different places may deteriorate the screen. Moreover, the screen often degenerates in the long run, developing scratches and dirt spots, when a device is used very often for playing games, which is undesirable. Furthermore, certain operations require consistent clicking and tapping at different locations on the graphical user interface, which takes time.
The present disclosure provides an efficient and user-friendly graphical user interface (GUI) for a gaming or simulation system.
Specifically, the present disclosure relates to performing a swiping operation on the graphical user interface of a gaming and/or simulation system, while controlling or facilitating operations on the interface.
Moreover, instead of using fingers, the swiping operation can also be performed through a mouse, by pointing and tapping the mouse initially at the point 1402, dragging the pointer on the screen along the desired path 1406 through the mouse, and finally releasing the mouse at the final position 1404. Further, any other display-sensitive device or a body part, for example a pen or a pointed device, can be used on the screen for performing the swiping operation.
For a user playing a game or executing a simulation, the user logs on through any of the electronic devices 1902, 1904, 1906 or 1908, and connects to one of the gaming or simulation servers 1910 or 1940, through a suitable network. As the user logs on and executes the gaming or simulation software on the computing hardware of a specific device, for example the device 1902, a graphical user interface corresponding to the game or simulation is generated and rendered on the display screen of the device 1902. The graphical user interface presents different graphical objects on the display screen of the device 1902. The graphical objects may be the different blocks/segments of the graphical user interface, on which different operations corresponding to the game being played, or the simulation being executed, can be performed. Moreover, a pointer object (cursor) movable over the different graphical objects appears on the graphical user interface, for controlling the gaming or simulation operations. If the device 1902 does not have a touch-sensitive screen, the pointer object may be controllable through a mouse, a joystick or a set of keyboard buttons coupled to the device 1902. Furthermore, if the device 1902 has a touch-screen functionality incorporated therein, the same controlling operations can also be performed by swiping or tapping/clicking with fingers or with any display-sensitive item, such as a pen or a pencil.
Another database 1980 serves as a back-end database for the gaming or simulation server 1910. As the user of the device 1902 starts playing the game, or executing the simulation, typical actions and gestures performed by the user are recorded in the back-end database 1980. Specifically, such actions are interpreted through the gaming or simulation server 1910, and are sent as messages to the back-end database 1980, which eventually maintains a backup for the played game or executed simulation. Such messages can be in the form of data packages sent over an Internet connection through which the device 1902 is connected to the server 1910, or over any other wireless or wired connection connecting the device 1902 to the server 1910. Typical elements of such messages for maintaining a backup for the game or simulation include a header, a payload and a checksum. The checksum can be a function of the payload, or it may be a unique user identifier, such as the username, and so forth. The advantage of including the checksum in the back-end maintaining messages is the possibility of avoiding prospective frauds while playing the game, or corruption in a simulation which could adversely influence results generated by the simulation. The back-end messages received by the server 1910 are also sent to the other databases 1920 and 1930 of the server 1910. In these databases, the back-end messages are used to maintain a continuous logic representing the status of the game or simulation, for example the exact score of the player with time, and the stage of the game that the player has already reached. With a continuous receipt of the back-end messages by the databases 1920 and 1930, a regular updating of the game or simulation status is implemented within these server databases with time. This facilitates the resumption of the last status of the game or simulation in a case where the device 1902 unexpectedly shuts down, the user changes the gaming or simulation terminal, or the user intentionally quits playing or executing the simulation for a certain period and logs in at some other time.
Although only two servers 1910 and 1940 have been shown, there can be multiple gaming or simulation servers coordinating with, and connected to, each other for implementing the gaming or simulation environment in accordance with the present disclosure.
Although the gaming or simulation system implementable through the illustrated gaming or simulation environment has been explained for the case when a single user logs on to any of the electronic devices 1902, 1904, 1906 or 1908, the same environment is capable of supporting multi-user gaming or simulation, wherein different users may log on through different electronic devices, synchronize with each other by connecting to the common servers 1910 and 1940 through suitable networks, and share a common graphical user interface representing the ongoing game or simulation, for example a United Nations famine relief programme. In such embodiments, the graphical user interface rendered on the display screens of the different electronic devices is regularly updated, concurrently, through the logic data stored in the databases 1920 and 1930 of the gaming or simulation servers, at the back-end.
In the foregoing, pressure applied by a given user to a touch-screen is beneficially determined by one or more pressure-sensitive transducers integrated into the touch-screen. However, certain contemporary touch-screens operate only on a binary basis, namely there either is or is not contact with a given area of the touch-screen. In such touch-screens, pressure applied by the given user can be determined from the area of neighbouring, spatially contiguous sensing points on the screen which substantially simultaneously experience a contact being made by the given user: progressively more spatially contiguous sensing points are substantially simultaneously in a contact state as progressively more pressure is applied, owing to elastic deformation of the biological tissue of the given user's finger tips. A similar pressure-sensing functionality can be achieved when the given user employs a pointing device having an elastically deformable end to contact the touch-screen to control a game or a simulation, or to enter commands for controlling a real technical facility.
Beneficially, when a game or simulation is implemented as described in the foregoing and the given user exits from the game or simulation, for example for resumption at a later time, parameters describing a state of the game or simulation at an instant of exiting from the game are beneficially stored in data memory, so that the state of the game or simulation can be restored again at resumption of the game or simulation.
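A minimal sketch of such storage and restoration of state, assuming a JSON file as the data memory and purely illustrative state fields, is as follows.

```python
import json

def save_state(path, state):
    """Store parameters describing the state of the game or simulation at the
    instant of exiting, so that the same state can be restored later."""
    with open(path, "w") as f:
        json.dump(state, f)

def load_state(path, default):
    """Restore the stored state at resumption, or start from a default."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return default

state = {"stage": 3, "score": 4200, "deployed": [[120, 80], [200, 64]]}
save_state("session.json", state)
resumed = load_state("session.json", default={"stage": 1, "score": 0, "deployed": []})
assert resumed == state
```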
Modifications to embodiments of the invention described in the foregoing are possible without departing from the scope of the invention as defined by the accompanying claims. Expressions such as “including”, “comprising”, “incorporating”, “consisting of”, “have”, “is” used to describe and claim the present invention are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural. Numerals included within parentheses in the accompanying claims are intended to assist understanding of the claims and should not be construed in any way to limit subject matter claimed by these claims.
Number | Date | Country | Kind |
---|---|---|---|
1222096.8 | Dec 2012 | GB | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/IB2013/001126 | 4/9/2013 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2013/153455 | 10/17/2013 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4698625 | McCaskill et al. | Oct 1987 | A |
5404442 | Foster et al. | Apr 1995 | A |
5471578 | Moran et al. | Nov 1995 | A |
5500935 | Moran et al. | Mar 1996 | A |
5523775 | Capps | Jun 1996 | A |
5592608 | Weber et al. | Jan 1997 | A |
5596699 | Driskell | Jan 1997 | A |
5598524 | Johnston, Jr. et al. | Jan 1997 | A |
5608850 | Robertson | Mar 1997 | A |
5689667 | Kurtenbach | Nov 1997 | A |
5701424 | Atkinson | Dec 1997 | A |
5745717 | Vayda et al. | Apr 1998 | A |
5757383 | Lipton | May 1998 | A |
5760773 | Berman et al. | Jun 1998 | A |
5798760 | Vayda et al. | Aug 1998 | A |
5828360 | Anderson et al. | Oct 1998 | A |
5835094 | Ermel et al. | Nov 1998 | A |
5861886 | Moran et al. | Jan 1999 | A |
5880733 | Horvitz et al. | Mar 1999 | A |
5926178 | Kurtenbach | Jul 1999 | A |
5943039 | Anderson et al. | Aug 1999 | A |
6037937 | Beaton et al. | Mar 2000 | A |
6094197 | Buxton et al. | Jul 2000 | A |
6144378 | Lee | Nov 2000 | A |
6249740 | Ito et al. | Jun 2001 | B1 |
6263278 | Nikiel et al. | Jul 2001 | B1 |
6337698 | Keely, Jr. et al. | Jan 2002 | B1 |
6456307 | Bates et al. | Sep 2002 | B1 |
6753888 | Kamiwada et al. | Jun 2004 | B2 |
6906643 | Samadani et al. | Jun 2005 | B2 |
6920619 | Milekic | Jul 2005 | B1 |
7088365 | Hashizume | Aug 2006 | B2 |
7093202 | Saund et al. | Aug 2006 | B2 |
7158878 | Rasmussen et al. | Jan 2007 | B2 |
7210107 | Wecker et al. | Apr 2007 | B2 |
7310619 | Baar et al. | Dec 2007 | B2 |
7366995 | Montague | Apr 2008 | B2 |
7373244 | Kreft | May 2008 | B2 |
7441202 | Shen et al. | Oct 2008 | B2 |
7546545 | Garbow et al. | Jun 2009 | B2 |
7676376 | Colman | Mar 2010 | B2 |
7770135 | Fitzmaurice | Aug 2010 | B2 |
7818089 | Hanna et al. | Oct 2010 | B2 |
7870496 | Sherwani | Jan 2011 | B1 |
7890257 | Fyke et al. | Feb 2011 | B2 |
7920963 | Jouline et al. | Apr 2011 | B2 |
8059101 | Westerman et al. | Nov 2011 | B2 |
8065156 | Gazdzinski | Nov 2011 | B2 |
8132125 | Iwema et al. | Mar 2012 | B2 |
8133116 | Kelly et al. | Mar 2012 | B1 |
8138408 | Jung et al. | Mar 2012 | B2 |
RE43318 | Milekic | Apr 2012 | E |
8194043 | Cheon et al. | Jun 2012 | B2 |
8217787 | Miller, IV | Jul 2012 | B2 |
8219309 | Nirhamo | Jul 2012 | B2 |
8234059 | Sugiyama et al. | Jul 2012 | B2 |
8245156 | Mouilleseaux et al. | Aug 2012 | B2 |
8253707 | Kaneko et al. | Aug 2012 | B2 |
8261212 | Wigdor et al. | Sep 2012 | B2 |
8292743 | Etter et al. | Oct 2012 | B1 |
8346405 | Johnson et al. | Jan 2013 | B1 |
8368723 | Gossweiler, III et al. | Feb 2013 | B1 |
8448095 | Haussila et al. | May 2013 | B1 |
8578295 | Chmielewski et al. | Nov 2013 | B2 |
8614665 | Li | Dec 2013 | B2 |
8627233 | Cragun et al. | Jan 2014 | B2 |
8636594 | Derome et al. | Jan 2014 | B2 |
8782546 | Haussila | Jul 2014 | B2 |
8795080 | Omi | Aug 2014 | B1 |
20020175955 | Guordol et al. | Nov 2002 | A1 |
20030085881 | Bosma et al. | May 2003 | A1 |
20030184525 | Tsai | Oct 2003 | A1 |
20040002634 | Nihtila | Jan 2004 | A1 |
20040015309 | Swisher et al. | Jan 2004 | A1 |
20040054428 | Sheha et al. | Mar 2004 | A1 |
20040150671 | Kamiwada et al. | Aug 2004 | A1 |
20040263475 | Wecker et al. | Dec 2004 | A1 |
20050002811 | Froeslev et al. | Jan 2005 | A1 |
20050028110 | Vienneau et al. | Feb 2005 | A1 |
20050111621 | Riker et al. | May 2005 | A1 |
20050134578 | Chambers et al. | Jun 2005 | A1 |
20050164794 | Tahara | Jul 2005 | A1 |
20050270311 | Rasmussen et al. | Dec 2005 | A1 |
20060022955 | Kennedy | Feb 2006 | A1 |
20060026535 | Hotelling et al. | Feb 2006 | A1 |
20060055670 | Castrucci | Mar 2006 | A1 |
20060085767 | Hinckley et al. | Apr 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20070004081 | Hsiao | Jan 2007 | A1 |
20070040810 | Dowe et al. | Feb 2007 | A1 |
20070057930 | Iwema et al. | Mar 2007 | A1 |
20070070050 | Westerman et al. | Mar 2007 | A1 |
20070096945 | Rasmussen et al. | May 2007 | A1 |
20070110886 | Hanna et al. | May 2007 | A1 |
20070118520 | Bliss et al. | May 2007 | A1 |
20070180392 | Russo | Aug 2007 | A1 |
20070234223 | Leavitt et al. | Oct 2007 | A1 |
20070252821 | Hollemans et al. | Nov 2007 | A1 |
20080023161 | Gather | Jan 2008 | A1 |
20080023561 | Durbin | Jan 2008 | A1 |
20080122796 | Jobs et al. | May 2008 | A1 |
20080208456 | Jouline et al. | Aug 2008 | A1 |
20080222569 | Champion et al. | Sep 2008 | A1 |
20080229245 | Ulerich et al. | Sep 2008 | A1 |
20080231610 | Hotelling et al. | Sep 2008 | A1 |
20080235610 | Dettinger | Sep 2008 | A1 |
20080309632 | Westerman et al. | Dec 2008 | A1 |
20090037813 | Newman et al. | Feb 2009 | A1 |
20090118001 | Kelly et al. | May 2009 | A1 |
20090122018 | Vymenets et al. | May 2009 | A1 |
20090146968 | Narita et al. | Jun 2009 | A1 |
20090172593 | Geurts et al. | Jul 2009 | A1 |
20090187842 | Collins et al. | Jul 2009 | A1 |
20090313567 | Kwon et al. | Dec 2009 | A1 |
20090325691 | Loose | Dec 2009 | A1 |
20090327955 | Mouilleseaux et al. | Dec 2009 | A1 |
20090327963 | Mouilleseaux et al. | Dec 2009 | A1 |
20090327964 | Mouilleseaux et al. | Dec 2009 | A1 |
20100093399 | Kim et al. | Apr 2010 | A1 |
20100100849 | Fram | Apr 2010 | A1 |
20100110032 | Kim et al. | May 2010 | A1 |
20100114471 | Sugiyama et al. | May 2010 | A1 |
20100130213 | Vendrow et al. | May 2010 | A1 |
20100185985 | Chmielewski et al. | Jul 2010 | A1 |
20100192101 | Chmielewski et al. | Jul 2010 | A1 |
20100192102 | Chmielewski et al. | Jul 2010 | A1 |
20100192103 | Cragun et al. | Jul 2010 | A1 |
20100217514 | Nesbitt | Aug 2010 | A1 |
20100235778 | Kocienda et al. | Sep 2010 | A1 |
20100251179 | Cragun et al. | Sep 2010 | A1 |
20100251180 | Cragun et al. | Sep 2010 | A1 |
20100283750 | Kang et al. | Nov 2010 | A1 |
20100285881 | Bilow | Nov 2010 | A1 |
20100287486 | Coddington | Nov 2010 | A1 |
20100299637 | Chmielewski et al. | Nov 2010 | A1 |
20100306702 | Warner | Dec 2010 | A1 |
20100313126 | Jung et al. | Dec 2010 | A1 |
20110014983 | Miller, IV | Jan 2011 | A1 |
20110066980 | Chmielewski et al. | Mar 2011 | A1 |
20110066981 | Chmielewski et al. | Mar 2011 | A1 |
20110081973 | Hall | Apr 2011 | A1 |
20110093821 | Wigdor et al. | Apr 2011 | A1 |
20110099180 | Arrasvuori | Apr 2011 | A1 |
20110102336 | Seok et al. | May 2011 | A1 |
20110111840 | Gagner et al. | May 2011 | A1 |
20110163986 | Lee et al. | Jul 2011 | A1 |
20110165913 | Lee | Jul 2011 | A1 |
20110184637 | Jouline et al. | Jul 2011 | A1 |
20110184638 | Jouline et al. | Jul 2011 | A1 |
20110209058 | Hinckley et al. | Aug 2011 | A1 |
20110210931 | Shai | Sep 2011 | A1 |
20110225524 | Cifra | Sep 2011 | A1 |
20110239110 | Garrett et al. | Sep 2011 | A1 |
20110244937 | Yamashita et al. | Oct 2011 | A1 |
20110248939 | Woo et al. | Oct 2011 | A1 |
20110254806 | Jung et al. | Oct 2011 | A1 |
20110270922 | Jones et al. | Nov 2011 | A1 |
20110271182 | Tsai et al. | Nov 2011 | A1 |
20110283188 | Farrenkopf et al. | Nov 2011 | A1 |
20110283231 | Richstein et al. | Nov 2011 | A1 |
20110300934 | Michael et al. | Dec 2011 | A1 |
20110307843 | Miyazaki et al. | Dec 2011 | A1 |
20110319169 | Lam et al. | Dec 2011 | A1 |
20110320068 | Lee et al. | Dec 2011 | A1 |
20120005577 | Gerken et al. | Jan 2012 | A1 |
20120030566 | Victor | Feb 2012 | A1 |
20120030567 | Victor | Feb 2012 | A1 |
20120056836 | Cha et al. | Mar 2012 | A1 |
20120094766 | Reynolds | Apr 2012 | A1 |
20120094770 | Hall | Apr 2012 | A1 |
20120115599 | Conway et al. | May 2012 | A1 |
20120122561 | Hedrick et al. | May 2012 | A1 |
20120122586 | Kelly | May 2012 | A1 |
20120122587 | Kelly | May 2012 | A1 |
20120157210 | Hall | Jun 2012 | A1 |
20120162265 | Heinrich et al. | Jun 2012 | A1 |
20120185789 | Louch | Jul 2012 | A1 |
20120190388 | Castleman et al. | Jul 2012 | A1 |
20120264520 | Marsland et al. | Oct 2012 | A1 |
20120266092 | Zhu et al. | Oct 2012 | A1 |
20120306772 | Tan et al. | Dec 2012 | A1 |
20120326993 | Weisman | Dec 2012 | A1 |
20130016126 | Wang et al. | Jan 2013 | A1 |
20130027412 | Roddy | Jan 2013 | A1 |
20130067332 | Greenwood et al. | Mar 2013 | A1 |
20130120274 | Ha et al. | May 2013 | A1 |
20130176298 | Lee et al. | Jul 2013 | A1 |
20130178281 | Ayyar | Jul 2013 | A1 |
20130181986 | Fowler et al. | Jul 2013 | A1 |
20130207920 | McCann et al. | Aug 2013 | A1 |
20140066017 | Cho | Mar 2014 | A1 |
20160184699 | Rageh | Jun 2016 | A1 |
Number | Date | Country |
---|---|---|
1867886 | Nov 2006 | CN |
102245272 | Nov 2011 | CN |
102279697 | Dec 2011 | CN |
102316945 | Jan 2012 | CN |
102455851 | May 2012 | CN |
2341420 | Jul 2011 | EP |
2395419 | Dec 2011 | EP |
2530569 | Dec 2012 | EP |
2530569 | Dec 2012 | EP |
2004-525675 | Aug 2004 | JP |
2005152509 | Jun 2005 | JP |
2005-211242 | Aug 2005 | JP |
2006185443 | Jul 2006 | JP |
2008501490 | Jan 2008 | JP |
2009-125266 | Jun 2009 | JP |
2009279050 | Dec 2009 | JP |
2010012050 | Jan 2010 | JP |
2010-187911 | Sep 2010 | JP |
2011036346 | Feb 2011 | JP |
2011206444 | Oct 2011 | JP |
2012034970 | Feb 2012 | JP |
2012-081163 | Apr 2012 | JP |
10-2010-0014941 | Feb 2010 | KR |
20100020846 | Feb 2010 | KR |
10-2011-0069824 | Jun 2011 | KR |
10-2011-0080129 | Jul 2011 | KR |
1020140123693 | Oct 2014 | KR |
2012001637 | Jan 2012 | WO |
Entry |
---|
Youtube video Farmville: https://www.youtube.com/watch?v=sJEOvyuvePE, uploaded on Jul. 27, 2009; screen capture attached as youtube_farmville_20090727.pdf. |
International Preliminary Report on Patentability and Written Opinion of the International Searching Authority, re PCT/IB2013/001126, dated Oct. 14, 2014, 10 pages. |
Naver Blog Review on Every Farm, posted Nov. 3, 2011, 3 pages. http://blog.naver.com/yspray4u/10123134648. |
SimCity DS2 Perfect Support, Publisher: Enter Brain Co., Ltd, Aug. 31, 2008, p. 008. |
“How to play foreign games of IPhone and IPad, for Zombie Farm”, Publisher Cosmic publishing Co., Ltd, Mar. 19, 2011, pp. 052-055. |
Combined Search and Examination Report received for United Kingdom Application No. GB1222096.8, dated Jan. 29, 2013, 12 pages. |
Combined Search and Examination Report received for United Kingdom Patent Application No. GB1409299.3, dated Jul. 8, 2014, 10 pages. |
Examination Report received for United Kingdom Patent Application No. GB1222096.8, dated Jul. 8, 2014, 09 pages. |
Examination Report received for Canadian Patent Application No. 2861101, dated Feb. 21, 2017, 4 pages. |
International Preliminary Report on Patentability and Written Opinion received for International Application No. PCT/IB2013/001126, dated Oct. 23, 2014, 11 pages. |
International Search Report received for International Application No. PCT/IB2013/001126, dated Jan. 8, 2014, 6 pages. |
Boulos, Maged N. Kamel, et al., “Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation”, International Journal of Health Geographics, Jul. 26, 2011, pp. 1-14. |
International Search Report received for International Application No. PCT/IB2013/001211, dated Jan. 8, 2014, 6 pages. |
Notice of Ground of Rejection received for Japanese Patent Application No. 2014-552719, dated Oct. 2, 2015, 5 pages including 3 pages of English Translation. |
Notice of Non-Final Rejection Received for Korean Patent Application No. 10-2014-7019044, dated Jun. 29, 2015, 14 pages including 7 pages of English Translation. |
Nakajima, Kengo, “Technology That Supports Online Gaming: Backstage Behind Expansive Playing Space”, “How to Achieve Both Unlimited Simultaneous Connections and Millisecond Latency”, Gijyutsuhyouronsha Corporation, 25, Apr. 1, 2011, 4 pages. |
Notification of Ground of Rejection, Application No. 2014-148443, dated Dec. 16, 2016, 4 pages. |
Australian Patent Examination Report No. 2 for Application No. 2013263809, dated Jan. 28, 2016, 5 pages. |
First Office Action received for Chinese Patent Application No. 201380014183.6, dated Jun. 3, 2015, 8 pages including 3 pages of English Translation. |
Rogers, Scott, Swipe This!: The Guide to Great Touchscreen Game Design, 2012, Wiley, p. 112. |
Notice of Allowance, Japan Patent Office, Application No. 2014-552719, dated Oct. 25, 2016, 3 pages. |
“Metroid Prime: Hunters”, Dengeki Game Cube, vol. 4, No. 8, Media Works, Jul. 1, 2004, p. 10, 1 page. |
Notification of ground of rejection, Japan Patent Office, Application No. 2014-552719, dated May 18, 2016, 4 pages. |
International Search Report received for International Patent Application No. PCT/IB2013/001063, dated Feb. 6, 2014, 5 pages. |
International Preliminary Report on Patentability and Written Opinion received for International Patent Application No. PCT/IB2013/001063, dated Oct. 23, 2014, 12 pages. |
“Nintendo DS compatible river's fishing Komei Valley Seseragi's poetry complete guide”, KOEI Co., Ltd., Aug. 15, 2007, p. 62. |
“Oitaeri detective Namiso cultivation kit”, Famitsu App iPhone & Android No. 001, Japan, Enterbrain Co., Ltd., Dec. 22, 2011. |
“Welcome to Nintendo DS compatible ranch story! The ultimate guide to the wind bazaar”, 2nd edition, KOEI Co., Ltd., Feb. 19, 2010, p. 8. Available at: http://www.ranchstory.co.uk/?games/Harvest_Moon_Welcome_to_the_Wind_Bazaar. |
“Infinity Blade Cross”, Weekly Famitsu, vol. 27, No. 13, Japan, Enterbrain, Inc., Mar. 15, 2012, pp. 70-71. |
Non-Final Rejection received for Korean Patent Application No. 10-2013-7020715, dated Jun. 29, 2015. |
A farm simulation game software for iPhone, “Eco faamu 2”, updated Jan. 25, 2012, 4 pages. |
Japan Office Action, Application No. 2017-097789, dated May 14, 2018, 4 pages. |
Written Opinion of the International Searching Authority, PCT/IB2013/001211, dated Jan. 8, 2014. |
Third Examination Report received for Application No. AU2013263809, dated Jun. 10, 2016. |
Second Examination Report received for Application No. AU2016225861, dated Oct. 31, 2017. |
Australian Patent Examination report received for Application No. 2013246615 dated Oct. 21, 2015, 3 pages. |
Australian Patent Examination report received for Application No. 2016202994 dated Oct. 10, 2017, 6 pages. |
Patent Examination Report No. 1 received for Application No. 2016203195, dated Oct. 28, 2016, 3 Pages. |
Australian Patent Examination report received for Application No. 2016202994 dated Jun. 14, 2017, 8 pages. |
Notification of the Second Office Action received in Chinese Application No. 201380007040.2, dated Aug. 4, 2016, 5 pages including 2 pages of English Translation. |
Canadian Office Action received for Application No. CA2869766, dated Feb. 7, 2017, 3 pages. |
Canadian Office Action received for Application No. CA2869766, dated Feb. 19, 2016, 4 pages. |
Notice of Allowance and Fees Due dated Feb. 28, 2013 for U.S. Appl. No. 13/714,825, 10 Pages. |
Notice of Allowance and Fees Due dated Sep. 24, 2014 for U.S. Appl. No. 13/445,783, 10 Pages. |
Non Final Rejection received in Korean Application No. 10-2014-7021063, dated Aug. 11, 2015, 29 Pages, including 7 pages of English Translation. |
Non Final Office Action dated Mar. 6, 2013 for U.S. Appl. No. 13/479,637, 5 Pages. |
Non Final Office Action dated Aug. 28, 2015 for U.S. Appl. No. 14/330,197, 4 Pages. |
Non Final Office Action dated Sep. 27, 2016 for U.S. Appl. No. 15/093,829, 13 Pages. |
Non Final Office Action dated Mar. 26, 2013 for U.S. Appl. No. 13/714,858, 5 Pages. |
Korean Non-Final Rejection received for Application No. 10-2015-7022747, dated May 30, 2016, 5 pages including 2 pages of English Translation. |
Korean Non-Final Rejection received for Application No. 10-2015-7022747, dated Nov. 17, 2015, 14 pages including 7 pages of English Translation. |
Japan Notice of Allowance, Application No. 2016-210300, dated Jul. 27, 2017, 3 pages. |
International Search Report, dated Jan. 8, 2014 for Application No. PCT/IB2013/001211, 6 pages. |
International Preliminary Report on Patentability and Written Opinion, dated Nov. 25, 2014 for Application No. PCT/IB2013/001211, 11 pages. |
How to Play Farmville on the iPad, viewed on the Internet on Jun. 7, 2017, published on Mar. 26, 2011. Available at: https://www.youtube.com/watch?v=LLOfWYUBPu4. |
Final Office Action dated Mar. 19, 2015 for U.S. Appl. No. 14/330,197, 8 Pages. |
Final Office Action dated Nov. 15, 2013 for U.S. Appl. No. 13/479,637, 7 Pages. |
Final Office Action dated Apr. 10, 2017 for U.S. Appl. No. 15/093,829, 7 Pages. |
Extended European Search Report received for Application No. EP16020420.2, dated Mar. 22, 2017, 10 pages. |
Examination Report received for United Kingdom Patent Application No. GB1409299.3, dated Mar. 12, 2015, 9 pages. |
Communication Pursuant to Rule 161(1) and 162 EPC received in EP Application No. EP13736623.3, dated Mar. 18, 2015, 2 pages. |
Communication Pursuant to Rule 161(1) and 162 EPC received for Application No. EP13737848.5, dated Feb. 27, 2015, 2 pages. |
Chinese Third Office Action dated Nov. 17, 2017 for Application No. 201380006199.2, 23 pages, including 12 pages of English Translation. |
Chinese Second Office Action dated Jul. 8, 2016 for Application No. 201380006199.2, 20 pages, including 12 pages of English Translation. |
Chinese Office Action dated Dec. 3, 2015 for Application No. 201380006199.2, 19 pages, including 10 pages of English Translation. |
Notice of Allowance issued in Korean Intellectual Property Office for Application No. 10-2016-7018845, dated Oct. 29, 2018, 4 pages including 2 pages of English translation. |
Communication Pursuant to Rule 164(2)(b) and Article 94(3) EPC for European Patent Application No. EP13736623.3, issued on Oct. 5, 2018, 9 pages. |
Weverka, Peter, “Microsoft PowerPoint 2007 all-in-one desk reference for dummies Part 1”, Jan. 10, 2007, 672 pages. |
Number | Date | Country |
---|---|---|
20150113477 A1 | Apr 2015 | US
Relation | Number | Date | Country |
---|---|---|---|
Parent | 13479637 | May 2012 | US
Child | 14391229 | | US
Parent | 13445783 | Apr 2012 | US
Child | 13479637 | | US