This application claims priority from and the benefit under 35 U.S.C. §119(a) of Korean Patent Application No. 10-2010-0077897, filed on Aug. 12, 2010, which is hereby incorporated by reference for all purposes as if fully set forth herein.
1. Field
The disclosure relates to a user equipment, a server, and a method for augmented reality (AR), and more particularly, to a user equipment, a server, and a method for selecting a filter so that selective filtering may be applied to an object selected by a user.
2. Discussion of the Background
Augmented reality (AR) technology refers to a computer graphics technology for combining a real environment with an artificial object or information. Unlike conventional virtual reality technologies, which are based only on a virtual space and virtual objects, AR technology combines a real environment with an artificial object or information, thereby adding supplementary information that may be difficult to obtain in the real world. AR technology may apply a filter to an object identified in the real environment to separate artificial objects or information associated with the identified object.
However, conventionally, a user may be unable to apply a specific filter to a specific object, and instead may apply the same filter to the whole image, so that the user may be unable to obtain specific information. Also, the user may use a filter provided by an augmented reality service provider, but may be unable to designate and use a specific filter. As a result, if the user tries to obtain specific filtered information, the user may suffer from the inconvenience of applying a plurality of filters to objects one by one.
Exemplary embodiments of the present invention provide an AR filter selecting user equipment, a server, and a method for selecting an augmented reality filter.
Additional features of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention.
Exemplary embodiments of the present invention provide an AR filter selecting user equipment, including a display unit to display a real image including a target object and a plurality of filter icons; a user input unit to receive a user command, the user command including a selection of a target filter icon and a movement of the target filter icon; and a control unit to control the display unit to display the target object and filtered AR information corresponding to the target object.
Exemplary embodiments of the present invention provide an AR user equipment to select a filter, including a display unit to display a real image including a target object and a plurality of filter icons; and a control unit to select and apply a target filter icon onto the target object and to control the display unit to prominently display the target object related to the target filter icon.
Exemplary embodiments of the present invention provide an AR service method for selecting a filter, including displaying a real image including a target object and a plurality of filter icons; receiving a command to select a target filter icon; applying the target filter icon by moving the selected target filter icon onto a target object on the displayed image; and displaying the target object and information corresponding to the target object to which the target filter icon is applied.
Exemplary embodiments of the present invention provide a method for selecting a filter in an AR service, including displaying a real image including a target object and a plurality of filter icons; and prominently displaying a target object related to a selected target filter icon, wherein the target filter icon is selected according to an input by a user.
Exemplary embodiments of the present invention provide an AR server for providing a user equipment with an AR service, including a communication unit to receive information of a target filter icon selected and moved onto a target object by the user equipment and information of the target object; a database to store information related to the target object; an information extracting unit to extract information corresponding to the target filter icon from the database; and a control unit to control the communication unit to provide the user equipment with the extracted information.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed. Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention, and together with the description serve to explain the principles of the invention.
The invention is described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure is thorough, and will fully convey the scope of the invention to those skilled in the art. It will be understood that for the purposes of this disclosure, “at least one of each” will be interpreted to mean any combination of the enumerated elements following the respective language, including combinations of multiples of the enumerated elements. For example, “at least one of X, Y, and Z” will be construed to mean X only, Y only, Z only, or any combination of two or more items X, Y, and Z (e.g., XYZ, XZ, YZ, X). Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals are understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
An AR system according to an exemplary embodiment is described below. As shown in the drawings, the AR system may include an AR user equipment 100, a transmission network 10, and an AR server 200.
The transmission network 10 is a data transmission network and may support communication between the AR user equipment 100 and the AR server 200.
In an example, the AR user equipment 100 providing an AR service may include mobile electronic appliances capable of wired or wireless communication, for example, smart phones, laptop computers, and the like.
In an example, the AR server 200 may provide the AR user equipment 100 with AR information of a selected target object in response to a request from the AR user equipment 100. Also, the AR server 200 may extract, from the stored AR information of the target object, the AR information corresponding to the target filter icon applied to the target object by the AR user equipment 100, and may provide the AR user equipment 100 with the extracted information.
Hereinafter, the AR user equipment 100 is described with reference to the accompanying drawings.
As shown in the drawings, the AR user equipment 100 may include a photographing unit 110, a display unit 120, a user input unit 130, a first communication unit 140, and a first control unit 150.
The photographing unit 110 photographs an object and obtains an image of the object. In an example, the photographing unit 110 may be an embedded camera or an external camera. In addition, the photographing unit 110 may obtain a still image or a moving image. Further, the obtained image may be processed into a displayable signal by the photographing unit 110 or by a separate image processor.
The display unit 120 displays an image obtained by the photographing unit 110. In an example, the display unit 120 may include a display device, such as a liquid crystal display (LCD) or similar display devices. If the AR user equipment 100 provides a touch type user input unit 130, the display unit 120 may display a user interface (UI) for a touch panel.
The user input unit 130 is an operation panel for receiving an input of a user command. In an example, the user input unit 130 may include various kinds of interfaces for inputting commands, such as a button, a directional key, a touch panel and the like. A signal of the user command inputted through the user input unit 130 may be transmitted to the first control unit 150.
More particularly, the user input unit 130 may receive, from a user, a first command for selecting a target filter icon among a plurality of icons displayed on the display unit 120, and a second command for moving the selected target filter icon onto a target object on a displayed image. If the first and second commands are inputted, the target filter icon may be displayed together with the target object.
In an example, a user may select one target filter icon by touch using a touch panel and move the selected target filter icon onto a target object by a drag-and-drop scheme. That is, the user may touch a selected target filter icon, drag the icon onto a target object, and drop the icon when the icon reaches the target object.
Alternatively, the user may select a target filter icon by manipulation of a directional key or a tab key of an operation panel and move the icon to a target object.
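As an illustrative sketch only (not part of the claimed subject matter), the drop interaction described above can be modeled as a hit test of the drop coordinates against the bounding boxes of displayed objects; the names `Box`, `FilterIcon`, and `on_drop` are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Box:
    """Hypothetical screen-space bounding box."""
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

@dataclass
class FilterIcon:
    category: str  # e.g., "shopping center", "car", "theater"
    box: Box

@dataclass
class DisplayedObject:
    object_id: str
    box: Box

def on_drop(icon: FilterIcon, drop_x: int, drop_y: int,
            objects: list[DisplayedObject]) -> Optional[DisplayedObject]:
    """Return the target object onto which the icon was dropped, if any."""
    for obj in objects:
        if obj.box.contains(drop_x, drop_y):
            return obj
    return None
```

Under this sketch, the same `on_drop` test would serve for both the touch-panel drag-and-drop and the directional-key movement, since only the final placement coordinates matter.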
The first communication unit 140 communicates with the AR server 200 via the transmission network 10. In an example, the first communication unit 140 may include a physical module and software for communication. Further, information on the placement of the target filter icon on a target object selected on the displayed image (placement information) may be provided to the AR server 200. In addition, the category information of the target filter icon may be provided to the AR server 200.
The first control unit 150 controls the entire operation of the AR user equipment 100. In an example, the first control unit 150 may control the display unit 120 to display an image of the real world (real image) and a plurality of icons if the AR user equipment 100 operates in an AR service mode to provide an AR service. The real image may be an image of an object existing in a real environment, and may be obtained by the photographing unit 110. Also, the first control unit 150 may control the display unit 120 to display information corresponding to a selected target filter icon, together with the selected target object, if the first command and the second command are inputted from the user input unit 130.
As shown in the drawings, the AR server 200 may include a second communication unit 210, an object recognizing unit 220, an information extracting unit 230, a tag information generating unit 240, a database (DB) 250, and a second control unit 260.
The second communication unit 210 communicates with the AR user equipment 100. In an example, the second communication unit 210 may receive information on a target filter icon selected by the AR user equipment 100 and information of a target object, on which the target filter icon was applied. The target filter icon may be applied if the filter icon is moved to overlap the target object in the displayed image where the target object is identified. In addition, the category information of the applied target filter icon may be provided.
The object recognizing unit 220 recognizes objects included in a real image provided by the AR user equipment 100. In an example, the object recognizing unit 220 may recognize the objects using an object recognition algorithm. Alternatively, the object recognizing unit 220 may recognize the objects using the received GPS information.
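The disclosure does not fix a particular recognition method; as one illustration of GPS-assisted recognition, candidate objects could be matched against stored points of interest by distance. The sketch below assumes a hypothetical `pois` list and a 100 m radius:

```python
import math

def nearby_objects(lat: float, lon: float, pois: list[dict],
                   radius_m: float = 100.0) -> list[dict]:
    """Return stored points of interest within radius_m of the GPS fix."""
    results = []
    for poi in pois:
        # Haversine distance between the GPS fix and the stored POI.
        dlat = math.radians(poi["lat"] - lat)
        dlon = math.radians(poi["lon"] - lon)
        a = (math.sin(dlat / 2) ** 2
             + math.cos(math.radians(lat)) * math.cos(math.radians(poi["lat"]))
             * math.sin(dlon / 2) ** 2)
        d = 2 * 6371000.0 * math.asin(math.sqrt(a))  # Earth radius in meters
        if d <= radius_m:
            results.append(poi)
    return results
```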
Also, the object recognizing unit 220 may recognize a target object by using placement information of the target filter icon provided by the AR user equipment 100. Further, if the object on which the target filter icon is applied is recognized by the object recognizing unit 220, that object may be identified as the target object. Once the target object is recognized, the object recognizing unit 220 may generate detection information of the recognized target object. In addition, category information of the target filter icon may be provided by the AR user equipment 100. In an example, category information may include a shopping center, a car, a theater, and the like.
In an example, the detection information may be outline coordinate information of the target object or image data with a highlighted outline of the target object. If the detection information is outline coordinate information of the target object, the first control unit 150 may prominently display the area of the corresponding coordinates on the real image displayed on the display unit 120. In an example, a target object may be prominently displayed with a highlighted outline of the target object, highlighting of the entire target object, or other conventional schemes to distinguish the target object from the other objects in the displayed image. Alternatively, if the detection information is image data with a highlighted outline of the target object, the first control unit 150 may display the image data on the display unit 120 as is.
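As an illustrative sketch of how the AR user equipment 100 might branch on the two forms of detection information (the message fields and the `display` interface are assumptions, not part of the disclosure):

```python
def show_detection(detection: dict, display) -> None:
    """Prominently display a target object from server-provided detection info.

    `detection` is a hypothetical message carrying either outline coordinates
    or ready-made highlighted image data; `display` is an assumed renderer.
    """
    if detection["type"] == "outline_coordinates":
        # Overlay a bold outline on the live real image.
        display.draw_outline(detection["points"], bold=True)
    elif detection["type"] == "highlighted_image":
        # The server already rendered the highlight; show the frame as is.
        display.show_frame(detection["image_data"])
```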
For the purposes of this application, an object that is “prominently” displayed means that the object is displayed in a more prominent manner than other objects. For example, an object that is “prominently” displayed may have a bold outline, or may be displayed more brightly compared with other objects in the display.
If category information of the target filter icon is received from the AR user equipment 100, the information extracting unit 230 may apply the target filter icon to the target object recognized by the object recognizing unit 220 and extract, from the DB 250, the AR information corresponding to the target object to which the target filter icon is applied.
The tag information generating unit 240 may generate an AR tag window using the extracted AR information.
The DB 250 may store AR information related to each target object. In an example, the AR information may include various attributes related to the target object. If the target object is a clothing store, the AR information may include a name, an address, hours of operation, and the like. If the target object is a car, the AR information may include a make, a model, a year, and the like.
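One way to picture the DB 250 is as a mapping from object identity to category-keyed attribute sets. The schema and values below are invented for illustration only:

```python
from typing import Optional

# Illustrative contents of the DB 250: AR information keyed by object,
# with attributes grouped by filter category.
AR_DB: dict[str, dict[str, dict[str, str]]] = {
    "store_42": {
        "shopping center": {
            "name": "Example Clothing Store",
            "address": "123 Example St.",
            "hours": "10:00-21:00",
        },
    },
    "car_7": {
        "car": {"make": "ExampleMake", "model": "ExampleModel", "year": "2010"},
    },
}

def extract_ar_info(object_id: str, category: str) -> Optional[dict[str, str]]:
    """Extract the AR information of the target object for one filter
    category, as the information extracting unit 230 is described as doing."""
    return AR_DB.get(object_id, {}).get(category)
```

The tag information generating unit 240 could then format the returned attribute set as the text of an AR tag window.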
The second control unit 260 may control the second communication unit 210 to provide the generated outline information of the target object to the AR user equipment 100.
Also, the second control unit 260 may control the second communication unit 210 to provide the generated AR tag window to the AR user equipment 100.
Hereinafter, embodiments for displaying information corresponding to a target filter icon together with a target object on the display unit 120 are described with reference to the accompanying drawings.
If a real image is displayed and an AR service mode is selected, the first control unit 150 may control the display unit 120 to display a plurality of icons together with the real image.
Before the target object is selected, that is, if the target filter icon is moved onto the target object but has not yet been dropped, the first control unit 150 may control the display unit 120 to prominently display the target object.
In an example, the first control unit 150 may enable a target object to be prominently displayed by a first method using the AR server 200 or alternatively, by a second method by itself.
According to the first method, if the target filter icon is moved onto the target object by a drag-and-drop scheme, the first control unit 150 may control the first communication unit 140 to transmit placement information of the target filter icon on the target object to the AR server 200, together with data of the displayed real image and current global positioning system (GPS) information. In response, the first communication unit 140 may receive, from the AR server 200, detection information of the target object corresponding to the placement information of the target filter icon. The first control unit 150 may control the display unit 120 to prominently display the target object using the received detection information.
According to the second method, the first control unit 150 may detect a selected target object with reference to the placement information of the target filter icon and prominently display the detected target object. For example, the first control unit 150 may detect the outline of the target object using an outline detection algorithm, and may accordingly prominently display the selected target object.
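The disclosure does not name a specific outline detection algorithm; as one possible illustration, a contour-based approach could be used, assuming OpenCV and NumPy are available (the OpenCV 4 return signature of `findContours` is assumed):

```python
import cv2
import numpy as np
from typing import Optional

def detect_outline(image: np.ndarray, drop_x: int,
                   drop_y: int) -> Optional[np.ndarray]:
    """Find the contour of the object under the icon's placement point, if any."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        # A result >= 0 means the placement point lies inside or on the contour.
        if cv2.pointPolygonTest(contour, (float(drop_x), float(drop_y)), False) >= 0:
            return contour
    return None
```

A returned contour could then be drawn over the real image, for example with `cv2.drawContours`, to prominently display the detected target object.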
After the selected target object is prominently displayed, or after the target filter icon is dragged to the target object, the user may drop the target filter icon onto the target object. If the target filter icon is dropped onto the target object, the first communication unit 140 may transmit information of the target object and information of the target filter icon to the AR server 200 under the control of the first control unit 150. The information of the target object may be coordinate information of the target object measured on the displayed real image. The information of the target filter icon may be category information.
Also, the first communication unit 140 may receive, from the AR server 200, AR information corresponding to the target object to which the target filter is applied. That is, the AR server 200 may extract AR information corresponding to the target object, and provide the extracted information to the first communication unit 140. In an example, the AR server 200 may provide the AR information for display on the display unit 120 in the form of a tag window.
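In message form, the exchange triggered by dropping the icon might look like the following sketch; the JSON field names are assumptions chosen to mirror the description above:

```python
import json

def build_drop_request(object_x: int, object_y: int, category: str,
                       lat: float, lon: float) -> str:
    """Assemble the information sent to the AR server on drop: the target
    object's coordinates on the displayed real image, the icon's category
    information, and the current GPS fix."""
    return json.dumps({
        "target_object": {"x": object_x, "y": object_y},
        "filter_category": category,
        "gps": {"lat": lat, "lon": lon},
    })

def parse_ar_response(payload: str) -> dict:
    """Parse the AR information (or tag-window data) returned by the server."""
    return json.loads(payload)
```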
Although the current embodiment shows a user selecting and moving a target filter icon to a target object using a drag-and-drop scheme, the user may select and move a target filter icon to a target object using an operation panel in the same way. That is, if the user selects a target filter icon using an operation panel and moves the selected target filter icon to a target object, the first communication unit 140 may receive, from the AR server 200, AR information corresponding to the target filter icon.
The user may also select a plurality of target filter icons and apply each selected target filter icon to a corresponding target object on the displayed real image. If the user selects all of the desired target filter icons to apply to the target objects, the first control unit 150 may display the received AR information together with the target objects on the display unit 120.
Referring to the accompanying drawings, the user may also group a plurality of target filter icons into a group 70 and apply the group 70 to a target object. If the group 70 is applied, the first control unit 150 may display AR information of the target object corresponding to each of the target filter icons included in the group 70.
In addition, although not illustrated, the first control unit 150 may automatically apply the multiple target filters included in the group 70 to another target object having an attribute similar to that of the target object. Accordingly, the first control unit 150 may automatically apply the group 70 to a similar second target object and display AR information of the second target object corresponding to the target filter icons in the group 70.
In addition, the first control unit 150 may store a target object together with the target filter icon applied to the target object.
In an example, if a previously stored target object is selected on a real image displayed on the display unit 120, the first control unit 150 may automatically display the filter icon previously applied to the target object. For this purpose, the first control unit 150 may separately store, in a memory unit (not shown), identity information of the filter icon applied to each target object. The method for storing a target object and the applied target filter may include, for example, touching the target object for at least a predetermined time, rapidly clicking the target object at least twice, and the like.
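A minimal sketch of that memory-unit bookkeeping, assuming a simple dictionary keyed by object identity (the names are illustrative):

```python
from typing import Optional

# Hypothetical per-object record of previously applied filter icons.
applied_filters: dict[str, str] = {}

def store_applied_filter(object_id: str, filter_icon_id: str) -> None:
    """Record which filter icon was applied to a target object, e.g., after
    a long touch or a double click on the object."""
    applied_filters[object_id] = filter_icon_id

def recall_applied_filter(object_id: str) -> Optional[str]:
    """Return the previously applied filter icon so it can be redisplayed
    automatically when the stored target object is selected again."""
    return applied_filters.get(object_id)
```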
Alternatively, if the user selects a target object on a real image displayed on the display unit 120 without applying a target filter icon, the first control unit 150 or the AR server 200 may analyze AR information of the selected target object and obtain target filter information related to that object. Accordingly, the first control unit 150 may generate a target filter icon with reference to the related target filter information and display the target filter icon on the display unit 120 in the form of a context menu.
Alternatively, the first control unit 150 may automatically generate a filter icon list without a selection by the user, or may generate the filter icon list as a result of a user's command or confirmation. For example, if sequential filtering is set, the first control unit 150 may automatically list the filters related to the information of the target object, in the order of the filters most frequently selected by the user, to generate a filter icon list, and may display the filter icon list.
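Ordering the related filters by how often the user has selected them can be sketched with a frequency counter; `selection_history` below is an assumed log of past selections:

```python
from collections import Counter

def build_filter_icon_list(related_filters: list[str],
                           selection_history: list[str]) -> list[str]:
    """List the filters related to the target object, most frequently
    selected first, as in the sequential-filtering example above."""
    counts = Counter(selection_history)
    return sorted(related_filters, key=lambda f: counts[f], reverse=True)

# Example: "restaurant" was selected most often, so it leads the list.
icons = build_filter_icon_list(
    ["theater", "restaurant", "shopping center"],
    ["restaurant", "restaurant", "theater"],
)
# icons == ["restaurant", "theater", "shopping center"]
```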
Alternatively, the first control unit 150 may generate a bundle of usable filter icons in the AR user equipment 100. In an example, the generated bundle of usable filter icons, with an attribute added by the user, may be transmitted to another user. For example, if an identifier (ID) of a receiver is displayed on the display unit 120, the first control unit 150 may control the first communication unit 140 to transmit the bundle to the receiver in response to a request of the user.
Hereinafter, an AR user equipment 900 according to another exemplary embodiment of the present invention is described.
While the AR user equipment 100 may have a main function of displaying AR information corresponding to a target filter icon applied to an object, the AR user equipment 900 may have a main function of filtering an object corresponding to a target filter icon selected by a user and displaying the filtered object.
To filter an object corresponding to a target filter icon selected by a user and to display the filtered object, the AR user equipment 900 may include a photographing unit 910, a display unit 920, a user input unit 930, a memory unit 940, a third communication unit 950, and a third control unit 960. The photographing unit 910, the display unit 920, the user input unit 930, and the third communication unit 950 may have basic functions equal to those of the photographing unit 110, the display unit 120, the user input unit 130, and the first communication unit 140 of the AR user equipment 100 described above.
The display unit 920 may display a real image with at least one object and a plurality of filter icons.
The memory unit 940 may store algorithms and software used to detect a target object corresponding to a target filter icon.
If a target filter icon is selected among the plurality of displayed filter icons according to an input by the user, the third control unit 960 may directly or indirectly filter a target object related to the target filter icon on the displayed real image and prominently display the filtered target object.
Alternatively, the third control unit 960 may control the third communication unit 950 to transmit category information of the selected target filter icon and image data of the target object to the AR server 200. The third control unit 960 may then receive, from the AR server 200 via the third communication unit 950, outline image information of the target object related to the target filter icon. The third control unit 960 may control the display unit 920 to prominently display the target object using the received outline image information.
The AR server 200 may analyze the category information of the target filter icon and the real image data received from the AR user equipment 900, and extract outline information of the target object from the real image data.
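The server-assisted path of the AR user equipment 900 reduces to a single round trip, sketched below with assumed `ue` and `server` interfaces:

```python
def highlight_via_server(ue, server, icon_category: str) -> None:
    """Send the selected icon's category information and the current real
    image to the AR server, then prominently display the returned outline."""
    frame = ue.current_image()
    outline = server.request_outline(icon_category, frame)
    if outline is not None:
        ue.prominently_display(outline)
```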
Hereinafter, an AR service method for selecting a filter is described. Each operation of the process described below may be performed by the AR user equipment 100 in cooperation with the AR server 200.
The AR user equipment may display a real image and a plurality of filter icons, in operation 1300. The AR user equipment may select a target filter icon among the plurality of filter icons according to a command by a user, in operation 1310.
The AR user equipment may move the selected target filter icon to a target object on the real image according to the command by the user, in operation 1320. Accordingly, the AR user equipment may display the target object from the real image with the applied target filter icon.
If the target filter icon is applied to the target object by dragging and dropping the target filter icon onto the target object, the AR user equipment may transmit placement information of the target filter icon applied to the target object to the AR server in operation 1330. Further, the AR user equipment 100 may transmit the placement information together with image data of the target object in operation 1330.
In response to the information transmitted in operation 1330, the AR user equipment 100 may receive detection information of the target object from the AR server 200 in operation 1340. The detection information may be used to prominently display the target object.
The AR user equipment 100 may prominently display the selected target object using the received detection information in operation 1350.
If the target object is selected in operation 1360, that is, if the target filter icon is applied onto the target object, the AR user equipment 100 may transmit, to the AR server in operation 1370, placement information indicating where the target filter icon was applied on the target object, together with the category information of the target filter icon.
The AR user equipment 100 may also receive, from the AR server 200, AR information related to the target object corresponding to the applied target filter in operation 1380. Alternatively, tag information using the AR information may be provided to the AR user equipment 100. The tag information using the AR information may be data for providing the AR information as a tag window.
The AR user equipment may display the received AR information or tag information together with the target object, in operation 1390.
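Collecting operations 1300 through 1390 into one sequence, a client-side sketch might read as follows; `ue` and `server` stand for assumed interfaces to the AR user equipment and the AR server:

```python
def ar_filter_flow(ue, server) -> None:
    """Illustrative walk through operations 1300-1390."""
    ue.display_real_image_and_filter_icons()          # 1300
    icon = ue.wait_for_icon_selection()               # 1310
    drop_point = ue.track_icon_drag(icon)             # 1320
    detection = server.request_detection(             # 1330: placement info
        ue.placement_info(icon, drop_point),          #       and image data
        ue.current_image(),
    )                                                 # 1340: detection info
    ue.prominently_display(detection)                 # 1350
    if ue.confirm_target_selected():                  # 1360
        ar_info = server.request_ar_info(             # 1370: placement and
            ue.placement_info(icon, drop_point),      #       category info
            icon.category,
        )                                             # 1380: AR/tag info
        ue.display_with_target(ar_info)               # 1390
```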
According to the embodiments of the present invention, a target filter may be virtualized as an icon and moved using a drag-and-drop scheme, thereby improving ease of use. Additional functions may be provided, for example, applying a specific target filter to a specific target object, applying a specific target filter to a section of a screen, applying different target filters to duplicated images, and setting an attribute of a filter by a user, so that the user may more easily and quickly find information about objects.
The disclosure can also be embodied as computer readable codes on a computer readable recording medium. The computer readable recording medium may be any data storage device that can store data which can be thereafter read by a computer system.
Examples of the computer readable recording medium may include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and carrier waves such as data transmission through the Internet. The computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.