This Application claims priority of Taiwan Patent Application No. 101122937, filed on Jun. 27, 2012, the entirety of which is incorporated by reference herein.
1. Field of the Invention
The present invention relates to an interaction system, and in particular, relates to a location-based interaction system and method.
2. Description of the Related Art
With advances in technology, mobile devices have become more and more popular. A mobile device on the market, such as a smart phone or a tablet PC, may have a global positioning system (GPS) or an assisted global positioning system (A-GPS) to retrieve the geographical location of the mobile device. However, when a user interacts with other users by using the mobile device through a network or a social network, the geographical information of the mobile device is not sufficiently utilized. In addition, the geographical information of the mobile device cannot be applied to interaction objects (e.g. audio files, video files, or text messages) transmitted between different users. Thus, the user's interaction experience is hindered.
A detailed description is given in the following embodiments with reference to the accompanying drawings.
In an exemplary embodiment, an interaction system is provided. A user may have a better interaction experience when interacting with an interaction object by using the geographical location of a mobile device. The interaction system comprises: a mobile device comprising a location detection unit configured to retrieve a geographical location of the mobile device; and a server configured to retrieve the geographical location of the mobile device, wherein the server comprises a database configured to store at least one interaction object and location information associated with the interaction object, and the server further determines whether the location information of the interaction object corresponds to the geographical location of the mobile device, wherein when the location information of the interaction object corresponds to the geographical location of the mobile device, the server further transmits the interaction object to the mobile device, so that the mobile device executes the at least one interaction object.
The present invention can be more fully understood by reading the subsequent detailed description and examples with references made to the accompanying drawings, wherein:
The following description is of the best-contemplated mode of carrying out the invention. This description is made for the purpose of illustrating the general principles of the invention and should not be taken in a limiting sense. The scope of the invention is best determined by reference to the appended claims.
The location detection unit 170 is configured to detect a geographical location of the mobile device 100. For example, the location detection unit 170 may be a global positioning system (GPS), an assisted global positioning system (A-GPS), a radio frequency (RF) triangulation device, an electronic compass, or an inertial measurement unit (IMU), but the invention is not limited thereto. In an embodiment, when the location detection unit 170 cannot obtain an exact geographical location of the mobile device 100 (e.g. a GPS detector cannot receive positioning signals indoors), the mobile device 100 may capture an image by using the image capturing unit 140, and transmit the captured image to the server 200. Then, the server 200 may compare the captured image from the mobile device 100 with images having geographical information stored in a database 210, thereby determining the geographical location of the mobile device 100.
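The server-side fallback described above can be sketched as a nearest-neighbor search over stored images that carry geographical information. This is only an illustrative sketch: the embodiment does not specify the image-matching technique, so precomputed feature vectors compared by plain Euclidean distance stand in for whatever descriptor the server would actually use.

```python
import math

def best_geotag(query_features, geotagged_db):
    """Return the stored location whose image features best match the
    captured image's features. `geotagged_db` is a list of
    (feature_vector, location) pairs, a hypothetical stand-in for the
    images with geographical information stored in the database 210."""
    best_loc, best_dist = None, float("inf")
    for features, location in geotagged_db:
        d = math.dist(query_features, features)  # Euclidean distance
        if d < best_dist:
            best_loc, best_dist = location, d
    return best_loc
```

The server would then treat the returned location as the geographical location of the mobile device 100.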
In an embodiment, the server 200 may comprise a database 210 configured to store at least one interaction object and corresponding geographical information. The server 200 may be a computer system. For a person having ordinary skill in the art, it is appreciated that a computer system is capable of performing operations such as analyzing and storing data. Details of components in the computer system will not be described here. The interaction objects to be analyzed in the server 200 may be obtained from the mobile device 100 by using the image capturing unit 140 or the audio capturing unit 150. Alternatively, the interaction objects may be preset in the server 200 (e.g. by the manufacturer or agent).
In an embodiment, the database 210 may further store corresponding information of the interaction objects, such as classification information, action information, or feature information. The action information may indicate interaction functions to be used by a user, and the action information corresponds to the classification information of the interaction objects. Specifically, the interaction objects may be classified into text messages or multimedia messages (e.g. an audio file or a video file). For example, an interaction object may be a text message sent to other users or friends, an image stored in the database 210, a photo or video file stored in the mobile device 100, a mini game, a hand-drawn pattern, a music file, or a combination thereof.
In an embodiment, the action information corresponding to a text message may be interaction functions such as sharing, forwarding, or storing the text message. In an embodiment, the action information corresponding to an audio file (a multimedia message) may be fast-forward playing, reverse playing, stopping, playing, storing, or sharing the audio file. The action information corresponding to a video file (a multimedia message) may be sharing, forwarding, storing, or editing the video file. In addition, if the multimedia message is an image with a growing mode, the corresponding action information may be feeding, or taking a walk with, the interaction object (details will be described later).
In an embodiment, the classification information of an interaction object can be determined by the server 200 automatically. For example, the server 200 may determine the classification information by the filename extension of the interaction object. In addition, the interaction object may have different action information in accordance with different classification information, and thus the user may interact with the interaction object by using the functions related to the action information. The feature information of the interaction object may be information such as a trigger condition, a living time, a privacy level, a corresponding user, and a lasting time of the interaction object. Details will be described later. If the interaction object is built by the user, the feature information is preferably set by the user while uploading the interaction object to the server 200. If the user does not set the feature information of the interaction object, the server 200 may apply default properties to the feature information of the interaction object. For example, the default living time may be 1 month and the default lasting time may be 1 minute. If the interaction object is a built-in object (e.g. a mini game with a growing mode) in the server 200, the feature information of the interaction object is preset and thus the user cannot set or alter the feature information of the interaction object.
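The extension-based classification and the default feature information above can be sketched as follows. The extension table and the exact default values are illustrative assumptions; the embodiment names only the defaults of a 1-month living time and a 1-minute lasting time.

```python
import os

# Hypothetical mapping from filename extension to classification
# information, standing in for whatever table the server 200 uses.
EXTENSION_CLASSES = {
    ".txt": "text_message",
    ".mp3": "audio",
    ".wav": "audio",
    ".mp4": "video",
    ".avi": "video",
    ".jpg": "image",
    ".png": "image",
}

# Default feature information applied when the user sets none:
# living time of 1 month, lasting time of 1 minute.
DEFAULT_FEATURES = {"living_time_days": 30, "lasting_time_seconds": 60}

def classify(filename):
    """Determine classification information from the filename extension."""
    _, ext = os.path.splitext(filename.lower())
    return EXTENSION_CLASSES.get(ext, "unknown")

def feature_info(user_settings=None):
    """Merge user-supplied feature information with server defaults."""
    merged = dict(DEFAULT_FEATURES)
    if user_settings:
        merged.update(user_settings)
    return merged
```

A built-in object would simply bypass `feature_info` and keep its preset values, since the user cannot alter them.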
In an embodiment, for an interaction object such as a photo or a video file, the user may set a living time and a privacy level as the feature information of the interaction object. Specifically, the user may set the living time of the interaction object as 3 months, and the server 200 may delete the interaction object after 3 months. Similarly, when registering with the server 200, the user may not only provide a friend list (e.g. registration IDs of friends), but also set the privacy level of each friend. Accordingly, only when the privacy level of a friend corresponds to the privacy level of the interaction object will the server 200 transmit the interaction object to the mobile device of the friend.
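The living-time expiry and the privacy-level check can be sketched as below. The comparison direction in `may_receive` is an assumption: the embodiment states only that the two levels must "correspond", so a simple threshold comparison stands in for the actual rule.

```python
import datetime

def is_expired(uploaded_at, living_time_days, now):
    """True when the interaction object's living time has elapsed,
    i.e. the server 200 may delete the object."""
    return now >= uploaded_at + datetime.timedelta(days=living_time_days)

def may_receive(friend_privacy_level, object_privacy_level):
    """True when a friend's privacy level corresponds to (here:
    meets or exceeds) the interaction object's privacy level, so the
    server 200 may transmit the object to that friend's mobile device."""
    return friend_privacy_level >= object_privacy_level
```

For the 3-month example in the text, an object uploaded on June 1 would be deleted once roughly 90 days have passed.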
In an embodiment, for an interaction object being a text message, the user may specify the lasting time of the text message and the users who are permitted to view the text message. Specifically, the user may freely designate the friends who are permitted to view the text message, and the lasting time (e.g. 10 seconds) of the text message.
In an embodiment, for an interaction object such as images stored in the database 210, the user may set the growing mode of the interaction object, such as a mini game, and set ways to interact with the interaction object. Specifically, the images stored in the server 200 may comprise an egg in the beginning, a chicken in a broken egg shell after a few days, and images of the chicken growing up gradually. Whenever the user passes by the preset geographical location, the server 200 may transmit different images according to the elapsed time. In addition, every time the user receives the images from the server 200, the user may interact with the chicken in the image via the mobile device 100. For example, the display unit 130 may display the action information (i.e. interaction functions) of the interaction object (e.g. the chicken), such as feeding the chicken, and taking a walk with the chicken. Thus, the user may select an interaction function by pressing the button corresponding to the interaction function on the display unit 130, so that the mobile device 100 transmits the selected interaction function back to the server 200. The server 200 may integrate the interaction function and calculate the corresponding image of the interaction object (e.g. the chicken). For example, the chicken may be fatter if it is fed more often, and the chicken may have a better look (e.g. healthier) if it takes a walk more often. Specifically, the user or the server 200 may set a bonus for completing the mini game, such as a coupon of a specific store when the chicken grows up successfully, so that it may be more fun to play the game and interact with the interaction object.
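The growing mode can be sketched as a function of elapsed time and accumulated interaction functions. The stage thresholds and the scoring formulas are purely illustrative assumptions; the embodiment states only that the image changes over days and that feeding and walking affect the chicken's appearance.

```python
def growth_stage(days_elapsed, feed_count, walk_count):
    """Pick the image stage for the growing-mode object from the elapsed
    time, and derive simple appearance scores from how often the user
    has fed the chicken or taken a walk with it. All thresholds and
    coefficients are hypothetical."""
    if days_elapsed < 3:
        stage = "egg"
    elif days_elapsed < 7:
        stage = "hatching"        # chicken in a broken egg shell
    else:
        stage = "chicken"
    weight = 1.0 + 0.1 * feed_count        # fed more often -> fatter
    health = min(100, 50 + 5 * walk_count)  # walks more often -> better look
    return stage, weight, health
```

On each visit to the preset location, the server would evaluate this function and transmit the corresponding image; a completed game (a fully grown, healthy chicken) could then trigger the coupon bonus.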
In addition, the trigger condition of the feature information may indicate whether the geographical location of the mobile device 100 or of another user is located within a range of the location information of the interaction object, or whether a user has arrived at a location corresponding to the location information of the interaction object at a specific time. Specifically, if the server 200 determines that one of the aforementioned trigger conditions is satisfied, the server 200 may start to transmit the interaction object to the mobile device 100 or the other user, so that the mobile device 100 may perform a corresponding interaction action according to at least one feature of the interaction object, such as displaying a text message, playing videos or music, or displaying the interaction object, but the invention is not limited thereto.
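The two trigger conditions above amount to a geofence check, optionally combined with a time window. A minimal sketch, assuming WGS-84 coordinates and a great-circle distance; the object record's field names (`lat`, `lon`, `radius_m`, `window`) are hypothetical.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def trigger_fires(device, obj, now=None):
    """True when the device's (lat, lon) lies within the interaction
    object's range, and, if the object carries a time window, the
    current time falls inside that window."""
    within = haversine_m(device[0], device[1], obj["lat"], obj["lon"]) <= obj["radius_m"]
    if not within:
        return False
    window = obj.get("window")
    if window is None:
        return True                      # range-only trigger condition
    return window[0] <= now <= window[1]  # arrived at the location at a specific time
```

When `trigger_fires` returns true, the server would begin transmitting the interaction object to the mobile device.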
When an interaction object associated with the location information is stored in the database 210 or the mobile device 100 has triggered a trigger condition of a certain interaction object (e.g. within a specific range or at a specific time/location), the user may confirm whether to view or execute the interaction object on the mobile device 100 (step S350). If so, the mobile device 100 may display the interaction object (step S360). If not, the mobile device 100 does not display the interaction object (step S355), and step S320 is performed. When the mobile device 100 is displaying the interaction object, the user may further determine whether to interact with the interaction object (e.g. according to the action information of the interaction object) (step S370). If so, the user may interact with the interaction object according to the action information and feature information of the interaction object (step S380). If not, it may indicate that the user does not want to interact with the interaction object, and the mobile device 100 may exit the page comprising the interaction object (step S375), and step S320 is performed. After step S380, the mobile device 100 may store the action information responded to by the user, and transmit the action information to the server 200, thereby updating the status of the interaction object in the database 210 (step S390), and step S320 is performed.
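The client-side flow of steps S350 through S390 can be sketched as a single handler. The `ui` and `server_session` callback interfaces are hypothetical stand-ins for the display unit 130 and the connection to the server 200.

```python
def handle_incoming(obj, ui, server_session):
    """Sketch of steps S350-S390: ask the user whether to view the
    object, optionally let them interact with it, then report the
    chosen action so the server can update the object's status."""
    if not ui.confirm_view(obj):              # step S350 -> S355
        return "rejected"                     # back to step S320
    ui.display(obj)                           # step S360
    if not ui.confirm_interact(obj):          # step S370 -> S375
        return "viewed"                       # exit the page, back to S320
    action = ui.choose_action(obj["actions"])                  # step S380
    server_session.report_action(obj["id"], action)            # step S390
    return "interacted"                       # back to step S320
```

Each return value corresponds to one of the three exits of the flow, after which the device resumes monitoring (step S320).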
As illustrated in
For those skilled in the art, it should be appreciated that the interaction system 10 in the invention can be applied to interactions in many aspects, so that the location information of interaction objects can be sufficiently used for interaction, such as a role-playing game or a feeding game, leaving a message or instructions on a map, a location-based notification (e.g. an alarm clock), an educational application (e.g. marking tags on plants), interacting with videos, interactive advertisements (e.g. obtaining coupons by playing an interactive game), a theme park (e.g. mixing images of the real world and a virtual world), or guidance in a museum, but the invention is not limited thereto.
While the invention has been described by way of example and in terms of the preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. To the contrary, it is intended to cover various modifications and similar arrangements (as would be apparent to those skilled in the art). Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.
Number | Date | Country | Kind
---|---|---|---
101122937 | Jun. 2012 | TW | national