This application claims priority from and the benefit under 35 U.S.C. §119(a) of Korean Patent Application No. 10-2010-0077896, filed on Aug. 12, 2010, which is incorporated by reference for all purposes as if fully set forth herein.
1. Field
Exemplary embodiments of the present invention relate to a user equipment and a method for displaying an augmented reality (AR) window, and more particularly, to a user equipment and a method for displaying an AR window in various arrangement patterns.
2. Discussion of the Background
Augmented reality (AR) technology is related to virtual reality technology and refers to a computer graphics technique for combining an object of a real environment with an artificial element or information. Unlike general virtual reality technologies, which are based only on a virtual space and virtual elements, AR technology combines an object of a real environment with an artificial element or information, thereby adding supplemental information that is difficult to obtain in the real environment alone. However, as the number of objects provided in an AR environment and the number of windows for providing information about those objects increase, an AR service may not be able to effectively display the objects and the windows on a limited screen.
To address this problem, an object may be selected based on the distance between the object and the equipment, with only the filtered selection being displayed in a window. However, the filtering operation may fail to capture the user's intent.
Exemplary embodiments of the present invention provide a user equipment and a method for displaying an augmented reality (AR) window, which may display a plurality of AR windows of objects to a user.
Additional features of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention.
An exemplary embodiment of the present invention discloses a user equipment to display an augmented reality (AR) window, the user equipment including a display unit to display an image and an AR window corresponding to an object included in the image; and a control unit to determine a display arrangement of the AR window by adjusting an attribute of the AR window.
An exemplary embodiment of the present invention discloses a user equipment to display an AR window, including a display unit to display an image and a first AR window and a second AR window respectively corresponding to a first object and a second object included in the image; and a control unit to group the first AR window and the second AR window into a group and to display the group together with the first object and the second object, if the first AR window and the second AR window partially overlap each other.
An exemplary embodiment of the present invention discloses a method for displaying an AR window of a user equipment to provide an AR service, including detecting an object included in an image; generating a first AR window corresponding to the object; determining an arrangement pattern of the first AR window based on an adjustment of an attribute; and displaying the first AR window in the determined arrangement pattern along with the object.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention, and together with the description serve to explain the principles of the invention.
The invention is described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure is thorough, and will fully convey the scope of the invention to those skilled in the art. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity. Like reference numerals in the drawings denote like elements.
Referring to the accompanying drawings, an AR system for providing the AR service may include a communication network 10, user equipment 100, and an AR server 200.
The communication network 10 may be a data transmission network that supports communication between the user equipment 100 and the AR server 200.
The user equipment 100 may provide an AR service, and may be a mobile electronic device capable of wired/wireless communication, including, but not limited to, a smart phone, a laptop computer, and the like. The user equipment 100 may store AR information of an object sourced from within the user equipment 100 or receive AR information of an object from the AR server 200.
The AR server 200 may provide the user equipment 100 with AR information of an object displayed on the user equipment 100 in response to a request from the user equipment 100.
Referring to the accompanying drawings, the user equipment 100 may include a sensing unit 110, a photographing unit 120, a user input unit 130, a display unit 140, a memory unit 150, a communication unit 160, an AR window generating unit 170, a combining unit 180, and a control unit 190.
The sensing unit 110 may sense information on a current location of the user equipment 100 and a current view direction of the photographing unit 120. The sensing unit 110 may sense the current location of the user equipment 100 using a global positioning system (GPS), a location-based service (LBS), and the like, and sense the current view direction of the photographing unit 120 using a digital compass. The sensing result of the sensing unit 110 may be used in making a request for detailed information of an object to the AR server 200. The sensing unit 110 may sense a distance between the user equipment 100 and an object using a technique such as a time of flight (TOF) scheme.
The photographing unit 120 may photograph a subject and capture and store an image of the subject. The photographing unit 120 may include at least one of an embedded camera and an external camera. The photographing unit 120 may obtain a still image and a moving image. The obtained image may be displayed on the display unit 140.
The user input unit 130 may be an operation panel for receiving a user command, and may include a button for causing the photographing unit 120 to photograph the subject, a directional key, a touch panel, and the like. A signal corresponding to a user command inputted through the user input unit 130 may be transmitted to the control unit 190. A user may manipulate the user input unit 130 to select one or more of the various methods described in this disclosure.
The display unit 140 may display an image obtained by the photographing unit 120 or an image obtained by another photographing device. If the user equipment 100 supports touch input, the display unit 140 may display a touch panel user interface (UI).
The memory unit 150 may be a non-volatile memory, and may store various programs or software used in the operation of the user equipment 100, data generated during the operation of the user equipment 100, and AR information of an object.
The communication unit 160 may communicate with the AR server 200 via the communication network 10 and may be embodied as a physical module and software for communication. For example, the communication unit 160 may request, from the AR server 200, AR information of an object in a real image displayed on the display unit 140, and receive the AR information from the AR server 200. In response to the request from the communication unit 160, the AR server 200 may search a database (not shown) and transmit the AR information of the object to the communication unit 160.
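As an illustration only, the exchange between the communication unit 160 and the AR server 200 might resemble the following sketch. The endpoint URL, payload fields, and helper name are hypothetical assumptions; the disclosure does not specify a wire format.

```python
import json
import urllib.request

# Hypothetical endpoint; the disclosure does not specify a wire format.
AR_SERVER_URL = "http://ar-server.example.com/ar-info"

def request_ar_info(latitude, longitude, heading_deg):
    """Ask the AR server for AR information of objects near the sensed
    location, in the sensed view direction (illustrative only)."""
    payload = json.dumps({
        "lat": latitude,          # from GPS / LBS (sensing unit 110)
        "lon": longitude,
        "heading": heading_deg,   # from digital compass (sensing unit 110)
    }).encode("utf-8")
    req = urllib.request.Request(
        AR_SERVER_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)    # AR information for the detected objects
```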
The AR window generating unit 170 may generate an AR window by processing the received AR information and location information indicating where an object is located on an image.
The combining unit 180 may combine an image obtained by the photographing unit 120 with an AR window generated by the AR window generating unit 170. The combined result may be displayed on the display unit 140.
The control unit 190 may include a processor or perform a function of a processor, and may control the operation of the user equipment 100.
The control unit 190 may identify an object included in a real image without using sensing data of the sensing unit 110. For example, the control unit 190 may compare the real image with images stored in the memory unit 150 and detect a stored image matching the real image. The control unit 190 may then identify an object included in the real image using object data stored in association with the detected image.
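A minimal sketch of this matching step is shown below, assuming images are available as arrays. The mean-pixel-difference measure and threshold are illustrative stand-ins; a production system would use more robust feature-based matching.

```python
import numpy as np

def find_matching_image(real_image, stored_images, threshold=10.0):
    """Return the key of the stored image closest to the real image,
    using mean absolute pixel difference as a crude similarity measure
    (illustrative only)."""
    best_key, best_score = None, float("inf")
    for key, candidate in stored_images.items():
        if candidate.shape != real_image.shape:
            continue  # this sketch only compares like-sized images
        score = np.mean(np.abs(candidate.astype(float) -
                               real_image.astype(float)))
        if score < best_score:
            best_key, best_score = key, score
    return best_key if best_score < threshold else None

# Object data stored in association with each image (memory unit 150)
# could then be looked up, e.g. object_db[find_matching_image(...)].
```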
The control unit 190 may determine whether to display an AR window of a meaningful object among the objects included in an image, a meaningful object being an object that has an AR window associated with it.
The control unit 190 may adjust at least one of a size, a display location, a display pattern, and a color of the AR windows based on various attributes (such as the size, the extent of overlap, and the access frequency of the AR windows), and determine an arrangement pattern of the AR windows using the adjusted result. The control unit 190 may control the combining unit 180 and the display unit 140 to combine the AR windows with corresponding objects and display the combined AR windows and objects in the determined arrangement pattern.
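As an illustration only, this attribute-driven arrangement might be modeled as follows. The data structure, the access-count threshold, and the scaling factor are assumptions, not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class ARWindow:
    x: int; y: int           # display location (pixels)
    width: int; height: int  # size
    color: str = "white"     # display color
    access_count: int = 0    # how often the underlying object is selected

def determine_arrangement(windows, screen_w, screen_h):
    """Adjust size and location per window, then return the windows as
    the arrangement to be combined with the image (illustrative)."""
    for w in windows:
        if w.access_count > 10:  # assumed threshold
            w.width = int(w.width * 1.2)
            w.height = int(w.height * 1.2)
        # Clamp each window onto the screen after any resizing.
        w.x = max(0, min(w.x, screen_w - w.width))
        w.y = max(0, min(w.y, screen_h - w.height))
    return windows
```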
The control unit 190 may compare the size of a target object, among objects displayed on the display unit 140, with the size of a target AR window corresponding to the target object, and based on the comparison, adjust a display pattern of the target AR window. For example, the control unit 190 may detect an edge of the target object and determine the size of the target object. Further, because the target AR window is generated by a predetermined method, the control unit 190 may recognize the size of the target AR window.
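One way to realize this comparison is sketched below: the object's displayed extent is estimated from an edge map and compared against the known window size. The gradient-based edge detector and its threshold are stand-in assumptions.

```python
import numpy as np

def object_bounding_size(gray_image, edge_threshold=30):
    """Estimate the displayed object's width and height from strong
    intensity gradients (a stand-in for a real edge detector)."""
    gy, gx = np.gradient(gray_image.astype(float))
    edges = np.hypot(gx, gy) > edge_threshold
    ys, xs = np.nonzero(edges)
    if len(xs) == 0:
        return 0, 0
    return xs.max() - xs.min() + 1, ys.max() - ys.min() + 1

def window_fits_object(obj_w, obj_h, win_w, win_h):
    """True if the AR window fits inside the object's displayed area."""
    return win_w <= obj_w and win_h <= obj_h
```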
The control unit 190 may display AR windows with an enlarged size if the distance between the actual location of an object (or objects) and the user is small, and display AR windows with a reduced size if the distance is large. The actual location of an object may correspond to the location of the real object photographed by the photographing unit 120.
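A distance-to-size rule of this kind could look like the sketch below; the reference distance and the clamping bounds are assumed values, and the distance could come from the TOF measurement of the sensing unit 110.

```python
def size_for_distance(base_size, distance_m,
                      reference_m=50.0, min_scale=0.5, max_scale=2.0):
    """Scale an AR window inversely with the sensed distance: near
    objects get enlarged windows, far objects reduced ones. Reference
    distance and clamps are illustrative assumptions."""
    scale = reference_m / max(distance_m, 1.0)
    scale = max(min_scale, min(scale, max_scale))
    return int(base_size * scale)
```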
The control unit 190 may process AR windows based on the concentration of objects, such that the control unit 190 displays the AR windows with an enlarged size if the concentration of the AR windows is low (i.e., a low number of AR windows on a single display unit).
For example, concentrations of AR windows may be mapped to display sizes such that a lower concentration yields a larger display size, as illustrated below.
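The original mapping table is not reproduced here, so the bands and sizes in this sketch are hypothetical, chosen only to illustrate the rule that lower concentration yields larger windows.

```python
def size_for_concentration(n_windows):
    """Map the number of AR windows on the screen to a display size.
    The bands below are hypothetical examples."""
    if n_windows <= 3:
        return "large"    # few windows: enlarge for readability
    elif n_windows <= 8:
        return "medium"
    else:
        return "small"    # crowded screen: shrink to fit
```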
Alternatively, the control unit 190 may determine an AR window display size based on the frequency with which a user accesses the object corresponding to the AR window, such that the control unit 190 displays an AR window of a frequently accessed object with an enlarged size relative to that of an infrequently accessed object. The access frequency of an object may be the number of times at least one user selects the object within a reference time. The access frequency may be counted by the control unit 190 or the AR server 200, with the count being stored by the control unit 190 or the AR server 200.
Alternatively, the control unit 190 may determine an AR window display size based on a priority or importance set by a user, such that the control unit 190 displays an AR window of higher priority or importance with an enlarged size relative to an AR window of lower priority or importance. The user may set the priority or importance for each object by manipulating the user input unit 130 or by accessing a web page associated with the AR server 200.
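Combining the two rules above, access frequency and user-set priority could be blended into a single size score, as in this sketch; the weights, the reference count, and the priority scale are assumptions.

```python
def display_scale(access_count, priority, reference_count=20,
                  freq_weight=0.5, priority_weight=0.5):
    """Blend access frequency (counted within a reference time) and
    user-set priority (assumed 0..1) into a scale factor. Weights and
    the reference count are illustrative assumptions."""
    freq_term = min(access_count / reference_count, 1.0)
    score = freq_weight * freq_term + priority_weight * priority
    return 1.0 + score  # 1.0 = default size, up to 2.0 = doubled
```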
If an object included in a real image has a plurality of sub-objects, the control unit 190 may set one of the plurality of sub-objects as a parent object and the other sub-objects as child objects. For example, if one building houses a bank, a restaurant, and a convenience store, the bank may be set as the parent object and the restaurant and the convenience store may be set as child objects. The AR windows of the child objects, with their locations and sizes unchanged, may be made dependent on the AR window of the parent object. Based on this feature, the control unit 190 may group the parent object and the child objects to display the AR windows in a simpler manner.
For example, the control unit 190 may group AR windows 710a, 710b, and 710c existing in the same object 710 into one group.
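This parent/child grouping could be represented with a simple tree, as sketched below; the node type and helper names are hypothetical. Child windows keep their own location and size but are presented through the parent's group window.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectNode:
    name: str
    window: str                       # label of this object's AR window
    children: list = field(default_factory=list)

def group_sub_objects(parent, *children):
    """Attach child objects (e.g. restaurant, convenience store) to a
    parent object (e.g. bank) in the same building, so one group AR
    window can stand in for all of them."""
    parent.children.extend(children)
    return parent

building = group_sub_objects(
    ObjectNode("bank", "AR window 710a"),
    ObjectNode("restaurant", "AR window 710b"),
    ObjectNode("convenience store", "AR window 710c"))
# Displaying `building` as one group window replaces three windows.
```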
Also, the control unit 190 may group AR windows 810 to 860, which overlap each other when displayed, into at least one group.
Also, the control unit 190 may group AR windows 910 to 960 according to predetermined categories.
Alternatively, the control unit 190 may adjust a display pattern of AR windows based on the amount by which the AR windows overlap each other. In other words, the control unit 190 may leave AR windows generated by the AR window generating unit 170 unprocessed and display them overlapping each other if the amount of overlap falls within a threshold.
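The overlap test could be implemented as an intersection ratio between window rectangles, as in this sketch; the 20% threshold is an assumed value.

```python
def overlap_ratio(a, b):
    """Fraction of the smaller window covered by the other.
    Each window is (x, y, width, height) in screen pixels."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    return inter / min(aw * ah, bw * bh) if inter else 0.0

def may_display_overlapped(a, b, threshold=0.2):
    """Leave both windows as generated if the overlap is within the
    threshold; otherwise a rearrangement is needed."""
    return overlap_ratio(a, b) <= threshold
```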
If the AR windows partially overlap each other, the control unit 190 may display an AR window located behind another AR window in a visible portion of the object corresponding to the rear AR window. For example, if a first AR window partially overlaps a second AR window located behind the first AR window, the control unit 190 may display the second AR window in a visible portion of the object corresponding to the second AR window.
If AR windows partially overlap each other, the control unit 190 may group the AR windows into n groups and display the n groups on the display unit 140, because overlapping may make it difficult to display the AR windows clearly. The grouped AR windows may be displayed in the form of a bubble at the top or the center of the corresponding objects.
If the number of AR windows to be displayed on a real image exceeds a reference number, the control unit 190 may group the AR windows using a tree structure.
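Such grouping might proceed as below; the grouping key (category) and the reference number are assumptions. Selecting a displayed group could then replace the group bubble with its member windows, consistent with the release of grouping described next.

```python
from collections import defaultdict

REFERENCE_NUMBER = 10  # assumed limit of windows shown at once

def group_windows(windows):
    """If too many AR windows would be drawn, bucket them by category
    into n groups; each group is shown as one bubble until selected."""
    if len(windows) <= REFERENCE_NUMBER:
        return windows                   # display individually
    groups = defaultdict(list)
    for w in windows:
        groups[w["category"]].append(w)  # one subtree per category
    return [{"group": cat, "members": members}
            for cat, members in groups.items()]
```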
If a user selects a target group among the n groups displayed on the display unit 140, the control unit 190 may release the grouping of the target group and dispose the AR windows in the target group on their corresponding objects.
The control unit 190 may compare the sizes of an object and an AR window, and dispose the AR window either in an area where the object is displayed or in an area where the object is not displayed, depending on the comparison result. On the screen of the display unit 140, the area where an object is displayed may be referred to as a recognition area, and the area where an object is not displayed may be referred to as a non-recognition area.
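This placement rule might be sketched as follows; the rectangle representation and the fallback placement beside the object are assumptions.

```python
def place_window(obj_rect, win_w, win_h, screen_w, screen_h):
    """Put the AR window inside the object's displayed area (recognition
    area) if it fits, otherwise just outside it (non-recognition area)."""
    ox, oy, ow, oh = obj_rect
    if win_w <= ow and win_h <= oh:
        # Window fits: center it on the object.
        return ox + (ow - win_w) // 2, oy + (oh - win_h) // 2
    # Window too big: place it beside the object, clamped to the screen.
    x = min(ox + ow, screen_w - win_w)
    y = min(oy, screen_h - win_h)
    return max(0, x), max(0, y)
```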
The following method for displaying an AR window may be performed by the user equipment 100 described above.
The user equipment may detect objects included in a real image, in operation 2000. The user equipment may detect objects using location information of the objects or detect the objects by comparing the image with a stored image.
The user equipment may generate AR windows using object data of the displayed objects and determine the size of the AR windows, in operation 2010. The user equipment may compare the size of each AR window with the size of the corresponding object to determine the size of the AR window, and may also take the location of the corresponding object into account when determining the size of the AR window.
The user equipment may determine a display pattern of the AR windows with the determined size, in operation 2020. For example, if the area available for displaying an AR window as determined in operation 2010 is insufficient, the user equipment may mark the corresponding object with an identity mark indicating the presence of the AR window. Also, the user equipment may group the AR windows and display the group.
The user equipment may adjust an amount of overlap of the AR windows, in operation 2030. For example, the user equipment may determine an arrangement pattern of the AR windows based on whole or partial overlapping of the AR windows.
The user equipment may improve the readability of the AR windows, in operation 2040. For example, the user equipment may display AR windows located in a focus area of the display unit and hide AR windows located outside the focus area.
The user equipment may combine the image with the AR windows and display the composite image, in operation 2050.
Each of operations 2010 to 2040 may be optionally performed. Also, the various embodiments described in this disclosure may be performed in conjunction with, or as a replacement for, one or more of operations 2010 to 2040.
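Taken together, operations 2000 to 2050 form a pipeline along the lines of this sketch. Every function here is a hypothetical stand-in for the operation of the same number; the stubs return their input unchanged so the skeleton runs end to end, and the optional steps could be skipped or replaced as noted above.

```python
# Hypothetical stand-ins for operations 2000-2050 (illustrative only).
def detect_objects(image):               return []              # 2000
def generate_windows(objects):           return list(objects)
def determine_sizes(windows):            return windows         # 2010
def determine_display_pattern(windows):  return windows         # 2020
def adjust_overlap(windows):             return windows         # 2030
def improve_readability(windows):        return windows         # 2040
def combine_and_display(image, windows): return (image, windows)  # 2050

def display_ar_windows(image):
    """Run the operations in order; 2010-2040 are optional steps."""
    windows = generate_windows(detect_objects(image))
    for step in (determine_sizes, determine_display_pattern,
                 adjust_overlap, improve_readability):
        windows = step(windows)
    return combine_and_display(image, windows)
```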
According to the disclosed exemplary embodiments, in the method for displaying an AR window by a user equipment, each AR window may be displayed at the center of its corresponding object, and all of the AR windows may be displayed on one real image.
The exemplary embodiments according to the present invention may be recorded in non-transitory computer-readable media including program instructions to implement various operations embodied by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. The media and program instructions may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
It will be apparent to those skilled in the art that various modifications and variation can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.