This application claims the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Feb. 24, 2012, in the Korean Intellectual Property Office and assigned Serial No. 10-2012-0019310, the entire disclosure of which is hereby incorporated by reference.
1. Field of the Invention
The present invention relates to a method and apparatus for adjusting the sizes of objects displayed on a screen.
2. Description of the Related Art
With advances in display technology, various types of devices are equipped with display screens. In many cases, display screens are used together with traditional non-display functions. For example, in the case of a digital camera, a display screen is used together with a traditional photographing function to enable a user to preview an image before photographing. Most newly developed digital devices, such as Portable Multimedia Players (PMPs), are equipped with display screens.
If such a display screen is installed in a device, the user may use it to visually check states of the device. That is, the display screen may significantly enhance the convenience and usefulness of a device. Hence, it is expected that display screens will be more frequently used in digital devices.
Digital devices have different screen sizes. Given screen size limitations, the user of a device may have to adjust sizes of objects on the screen in accordance with a currently used function. However, in a regular device, in order to enlarge or reduce the size of a selected object or region, the user may be required to perform a multi-stage or inconvenient input procedure.
Aspects of the present invention are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide a method and apparatus wherein, for easy object size adjustment on a screen, a guide indicating an object region is displayed and the user may adjust the size of a selected object using the guide.
In accordance with an aspect of the present invention, a method for object size adjustment on a screen is provided. The method includes recognizing one or more objects appearing on the screen, displaying guides indicating regions of the recognized objects on the screen, receiving a selection command for at least one of the recognized objects, adjusting, upon reception of a size adjustment command, the size of the region of the selected object with respect to a first axis of the guide associated with the selected object or a second axis thereof perpendicular to the first axis, and displaying the size-adjusted region.
In accordance with another aspect of the present invention, an apparatus for object size adjustment on a screen is provided. The apparatus includes a display unit for displaying original images and size-adjusted images, and a control unit for controlling a process of recognizing one or more objects appearing on the display unit, displaying guides indicating regions of the recognized objects, receiving a selection command for at least one of the recognized objects, adjusting, upon reception of a size adjustment command, the size of the region of the selected object with respect to a first axis of the guide associated with the selected object or a second axis thereof perpendicular to the first axis, and displaying the size-adjusted region.
Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
In the following description, an “object” refers to a distinguishable thing or entity on the screen. An object may be independent of another object. For example, a face and a vehicle may be objects. A distinguishable region or area on the screen may also be an object. For example, an icon or a frame on a browser screen may be an object.
A “guide” includes one or more marks indicating an object region. A guide may be displayed, for example, in the form of a corner bracket, a solid line, a dotted line, a rectangle, a square, or a circle, so as to demarcate a region or area.
“Size adjustment” refers to enlargement or reduction of the size of an object.
Referring to
The key input unit 110 generates an input signal for controlling the apparatus 100 in response to key manipulation of a user, and sends the input signal to the control unit 130. The key input unit 110 may include a keypad including hard or soft numeric and direction keys, and function keys attached to the apparatus 100. In an exemplary embodiment, the key input unit 110 may receive user input to select a particular object or region. If the apparatus 100 can be operated by using only the touchscreen 120, then the key input unit 110 may be excluded.
The touchscreen 120 includes a touch sensor 121 and a display unit 122. The touch sensor 121 detects the user's touch input and its location. The touch sensor 121 may be realized, for example, by using a capacitive, resistive, infrared, or pressure sensor. Any sensor capable of detecting contact or pressure may be utilized as the touch sensor 121. The touch sensor 121 generates a touch signal corresponding to the user's touch and sends the touch signal to the control unit 130. The touch signal includes coordinate data of the touch point. If the user makes a touch-point move gesture, the touch sensor 121 generates a touch signal including coordinate data describing the path of the touch-point move, and forwards the generated touch signal to the control unit 130.
In particular, the touch sensor 121 may detect user input (for example, touch, multi-touch, or drag) for selecting an object or region. This is described in more detail later.
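By way of illustration only, the sketch below hit-tests a touch coordinate against the rectangular regions of recognized objects to resolve which object a tap selects; the Rect type, its field names, and the coordinate convention are assumptions introduced for this sketch, not elements of the embodiment.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rect:
    """Axis-aligned object region: an origin plus lengths along the two axes."""
    x: float
    y: float
    width: float   # length along the first (e.g., horizontal) axis
    height: float  # length along the second (e.g., vertical) axis

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def hit_test(regions: list[Rect], px: float, py: float) -> Optional[Rect]:
    """Return the first recognized object region containing the touch point."""
    for region in regions:
        if region.contains(px, py):
            return region
    return None

# A tap at (120, 80) falls inside the second object's guide region.
objects = [Rect(10, 10, 50, 50), Rect(100, 60, 80, 40)]
print(hit_test(objects, 120, 80))
```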
The display unit 122 may be realized using Liquid Crystal Display (LCD) devices, Organic Light Emitting Diodes (OLED), or Active Matrix OLED (AMOLED). The display unit 122 provides various information such as menus, input data, and function-setting data to the user in visual form. In particular, the display unit 122 may display original images and size-adjusted images.
Although the apparatus 100 for adjusting object sizes is depicted as having a touchscreen capability, the present exemplary embodiment may be applied to an apparatus for adjusting object sizes with or without a touchscreen capability. If the present exemplary embodiment is applied to an apparatus for adjusting object sizes without a touchscreen capability, the function of the touchscreen 120 may be limited to that of the display unit 122.
The control unit 130 controls overall operations of individual components of the apparatus 100. In particular, the control unit 130 controls a process of recognizing objects on the screen, displaying guides indicating object regions, receiving a command for object selection, receiving a command for size adjustment, adjusting the size of a selected object region with respect to at least a first axis or second axis of the corresponding guide (where the first axis is perpendicular to the second axis), and displaying the size-adjusted object region. To achieve this, the control unit 130 includes an object recognizer 131 and an object size adjuster 132.
The object recognizer 131 may recognize one or more objects appearing on the display unit 122. In the present exemplary embodiment, an object may be any distinguishable thing or entity on the display unit 122, and an object may be independent of another object. For example, a face and a vehicle may be objects. A distinguishable region or area on the screen may also be an object. For example, an icon or a frame on a browser screen may be an object.
In an exemplary embodiment, the object recognizer 131 may recognize an object in various ways. The object recognizer 131 may recognize things or faces appearing in images or moving images being displayed on the screen as objects. If the apparatus 100 is attached to or includes a digital camera, the object recognizer 131 may recognize a subject on which the digital camera is focused as an object. If the apparatus 100 is used for a webpage browser, the object recognizer 131 may recognize frames in the webpage as objects. That is, the object recognizer 131 may recognize a distinguishable region or area on the screen as an object. In addition, if a region on the screen is selected by a command from the key input unit 110 or touch sensor 121, the object recognizer 131 may recognize the selected region as an object.
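The recognition sources listed above (faces or things in an image, the camera's focus subject, webpage frames, a user-selected region) could be modeled as interchangeable strategies that each yield object regions. The sketch below is a hypothetical illustration of that dispatch; the strategy functions are placeholders, since actual recognition (for example, face detection) is outside its scope.

```python
from typing import Callable

# An object region as (x, y, width, height); each recognition source maps the
# current screen content to zero or more such regions.
Region = tuple[float, float, float, float]
Strategy = Callable[[], list[Region]]

def detect_faces() -> list[Region]:
    return [(40, 30, 60, 60)]       # placeholder for real face detection

def camera_focus_subject() -> list[Region]:
    return [(120, 90, 100, 100)]    # placeholder for the focused subject

def webpage_frames() -> list[Region]:
    return [(0, 0, 320, 100), (0, 100, 320, 380)]  # placeholder frame bounds

def recognize_objects(strategies: list[Strategy]) -> list[Region]:
    """Collect object regions from every active recognition source."""
    regions: list[Region] = []
    for strategy in strategies:
        regions.extend(strategy())
    return regions

print(recognize_objects([detect_faces, webpage_frames]))
```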
If an object is recognized by the object recognizer 131, the object size adjuster 132 may control an operation to display a guide indicating a region of the recognized object. In the present exemplary embodiment, a guide includes one or more marks indicating an object region. A guide may be displayed, for example, in the form of a corner bracket, a solid line, a dotted line, a rectangle, a square, or a circle, so as to demarcate a region or area. Here, a region or area may have a first length in a first axis direction and a second length in a second axis direction, where the first axis is perpendicular to the second axis.
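To make the guide geometry concrete, the following sketch derives four corner-bracket polylines from a region with a first-axis length (width) and a second-axis length (height); the bracket arm length and point ordering are illustrative assumptions.

```python
def corner_brackets(x, y, w, h, arm=10.0):
    """Return four corner-bracket polylines (lists of points) demarcating the
    region with origin (x, y), width w (first axis), and height h (second axis)."""
    x2, y2 = x + w, y + h
    return [
        [(x, y + arm), (x, y), (x + arm, y)],        # top-left bracket
        [(x2 - arm, y), (x2, y), (x2, y + arm)],     # top-right bracket
        [(x2, y2 - arm), (x2, y2), (x2 - arm, y2)],  # bottom-right bracket
        [(x + arm, y2), (x, y2), (x, y2 - arm)],     # bottom-left bracket
    ]

for bracket in corner_brackets(100, 60, 80, 40):
    print(bracket)
```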
The object size adjuster 132 may receive an object selection command from the key input unit 110 or the touch sensor 121. If the touch sensor 121 is used, the object size adjuster 132 may receive various touch events such as touch, multi-touch, and drag, as an object selection command.
If a size adjustment command is received, the object size adjuster 132 may control an operation to adjust the size of a region of the selected object with respect to the first axis or the second axis of the guide, and display the adjusted object region. In the present exemplary embodiment, size adjustment refers to an enlargement or a reduction of the size of an object. In most cases, the first axis and second axis of a guide are parallel with boundary lines of the display unit 122. That is, as the screen of the display unit 122 is typically rectangular, the first axis corresponds to one of the horizontal axis and the vertical axis, and the second axis corresponds to the other axis.
Taking the longer of the two axes as a reference axis, the object size adjuster 132 may enlarge a region until its length along the reference axis matches the length of the display unit 122 in the direction of that axis. The object size adjuster 132 may also enlarge the size of a region while maintaining the width-to-height ratio of the corresponding guide.
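A minimal sketch of this enlargement rule, assuming an axis-aligned region and screen: the longer side of the region serves as the reference axis, and one scale factor, which also preserves the guide's width-to-height ratio, stretches the region until its reference side matches the screen length in that direction.

```python
def enlarge_to_screen(region_w, region_h, screen_w, screen_h):
    """Scale a region so its longer (reference) axis fills the screen along
    that axis, while keeping the region's width-to-height ratio."""
    if region_w >= region_h:
        scale = screen_w / region_w   # reference axis: first (horizontal) axis
    else:
        scale = screen_h / region_h   # reference axis: second (vertical) axis
    return region_w * scale, region_h * scale

# An 80x40 region on a 320x480 screen is enlarged to the full screen width.
print(enlarge_to_screen(80, 40, 320, 480))  # -> (320.0, 160.0)
```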
If a selection command selecting two or more objects is received, the object size adjuster 132 may treat the two or more selected objects as a single combined object and display a guide indicating the region of the combined object.
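One plausible reading of the combined region is the smallest rectangle enclosing all selected object regions; the sketch below computes that bounding union using the same illustrative (x, y, width, height) convention.

```python
def combine_regions(regions):
    """Treat several selected (x, y, w, h) regions as one combined object by
    taking the smallest rectangle that encloses them all."""
    left   = min(x for x, y, w, h in regions)
    top    = min(y for x, y, w, h in regions)
    right  = max(x + w for x, y, w, h in regions)
    bottom = max(y + h for x, y, w, h in regions)
    return left, top, right - left, bottom - top

# Two selected regions become a single combined region with one guide.
print(combine_regions([(10, 10, 50, 50), (100, 60, 80, 40)]))  # -> (10, 10, 170, 90)
```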
If an adjustment cancel command is received from the key input unit 110 or touch sensor 121, the object size adjuster 132 may restore an original size of a size-adjusted region and display the region at the original size.
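Restoring the original size only requires remembering the region as it was before adjustment. A minimal sketch, assuming a simple holder object:

```python
class AdjustableRegion:
    """Keeps the pre-adjustment region so a cancel command can restore it."""
    def __init__(self, region):
        self._original = region
        self.current = region

    def apply(self, adjusted):
        self.current = adjusted        # size adjustment command

    def cancel(self):
        self.current = self._original  # adjustment cancel command

r = AdjustableRegion((10, 10, 80, 40))
r.apply((0, 0, 320, 160))
r.cancel()
print(r.current)  # -> (10, 10, 80, 40)
```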
In the above description, although the control unit 130, the object recognizer 131, and the object size adjuster 132 are treated as separate entities having different functions, they need not necessarily be separate entities. For example, the control unit 130 may directly perform the functions of the object recognizer 131 and the object size adjuster 132.
In the following description, for ease of description, it is assumed that functions of the object recognizer 131 and object size adjuster 132 are directly performed by the control unit 130.
Referring to
The control unit 130 may recognize an object in various ways. For example, the control unit 130 may recognize things or faces appearing in still images or in moving images displayed on the screen as objects. If the apparatus 100 is attached to or includes a digital camera, the control unit 130 may recognize a subject on which the digital camera is focused as an object. If the apparatus 100 is used for a webpage browser, the control unit 130 may recognize frames in the displayed webpage as objects. That is, the control unit 130 may recognize a distinguishable region or area on the screen as an object.
The control unit 130 displays guides for the recognized objects in step 220. Here, a guide is one or more marks indicating an object region. A guide may be displayed, for example, in the form of a corner bracket, a solid line, a dotted line, a rectangle, a square, or a circle so as to demarcate a specific region or area.
The control unit 130 receives an object selection command in step 230. The control unit 130 determines whether two or more objects are selected in step 240. If only a single object is selected, the control unit 130 proceeds to step 270, and if two or more objects are selected, the control unit 130 proceeds to step 250.
The control unit 130 may receive an object selection command from the key input unit 110 or the touch sensor 121. If the touch sensor 121 is used, the control unit 130 may receive various touch events such as touch, multi-touch, and drag, as an object selection command. This is described in more detail later.
If two or more objects are selected, the control unit 130 treats the selected objects as one combined object in step 250, and displays a guide indicating the region of the combined object in step 260.
The control unit 130 receives a size adjustment command from the key input unit 110 or the touch sensor 121 in step 270, and adjusts the size of the region of the selected object with respect to the first axis or second axis in step 280. In most cases, the first axis and second axis are parallel with boundary lines of the display unit 122. That is, as the screen of the display unit 122 is typically rectangular, the first axis corresponds to one of the horizontal axis and the vertical axis, and the second axis corresponds to the other axis.
Assuming that the longer axis is the reference axis, the control unit 130 may enlarge the object region in accordance with the length of the display unit 122 in the direction of the reference axis. The control unit 130 may also enlarge the object region while maintaining the width-to-height ratio of the corresponding guide.
Thereafter, if an adjustment cancel command is received from the key input unit 110 or the touch sensor 121, the control unit 130 may restore the original size of the adjusted object region and display the region at the original size (not shown).
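Tying steps 230 through 280 together, an end-to-end flow might look like the sketch below. The function name and region representation are assumptions; only the control flow (combine on multi-select, then enlarge along the longer reference axis) follows the description above, and guide display (step 220) is assumed to happen elsewhere.

```python
def adjust_flow(recognized, selected_indices, screen_w, screen_h):
    """Steps 230-280 in miniature: select, optionally combine, then enlarge.

    `recognized` holds (x, y, w, h) regions from step 210; drawing their
    guides (step 220) is assumed to happen elsewhere.
    """
    selected = [recognized[i] for i in selected_indices]       # step 230
    if len(selected) >= 2:                                     # steps 240-260
        left = min(x for x, y, w, h in selected)
        top = min(y for x, y, w, h in selected)
        right = max(x + w for x, y, w, h in selected)
        bottom = max(y + h for x, y, w, h in selected)
        x, y, w, h = left, top, right - left, bottom - top
    else:                                                      # single object
        x, y, w, h = selected[0]
    scale = (screen_w / w) if w >= h else (screen_h / h)       # steps 270-280
    return w * scale, h * scale                                # adjusted size

# Two selected regions are combined, then enlarged to the full screen width.
print(adjust_flow([(10, 10, 50, 50), (100, 60, 80, 40)], [0, 1], 320, 480))
```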
Referring to screen representation (a) of
Referring to screen representation (b) of
Referring to screen representations (c), (d), and (e) of
Referring to screen representation (e) of
Referring to screen representation (a) of
Referring to screen representations (c) and (d) of
Referring to screen representation (a) of
Referring to screen representation (b) of
In response to selection of multiple objects in screen representation (b) of
Referring to screen representation (d) of
Referring to
Referring to screen representation (b) of
In response to the selection of multiple objects, in the same manner as in
Unlike the case of
Referring to
The control unit 130 may recognize an object in various ways. For example, the control unit 130 may recognize things or faces appearing in still images or in moving images displayed on the screen as objects. If the apparatus 100 is attached to or includes a digital camera, the control unit 130 may recognize a subject on which the digital camera is focused as an object. If the apparatus 100 is used for a webpage browser, the control unit 130 may recognize frames in the displayed webpage as objects. That is, the control unit 130 may recognize a distinguishable region or area on the screen as an object.
Thereafter, the control unit 130 displays a guide for the recognized object in step 730. Here, a guide is one or more marks indicating an object region. A guide may be displayed, for example, in the form of a corner bracket, a solid line, a dotted line, a rectangle, a square, or a circle, so as to demarcate a specific region or area.
The control unit 130 receives an object selection command in step 740. The control unit 130 determines whether two or more objects are selected in step 750. If only a single object is selected, the control unit 130 proceeds to step 780, and if two or more objects are selected, the control unit 130 proceeds to step 760.
The control unit 130 may receive an object selection command through the key input unit 110 or touch sensor 121. If the touch sensor 121 is used, the control unit 130 may receive various touch events, such as touch, multi-touch, and drag, as an object selection command.
If two or more objects are selected, the control unit 130 treats the selected objects as one combined object in step 760, and displays a guide indicating the region of the combined object in step 770.
The control unit 130 receives a size adjustment command from the key input unit 110 or touch sensor 121 in step 780, and adjusts the size of the region of the selected object with respect to the first axis or second axis in step 790.
Thereafter, if an adjustment cancel command is received from the key input unit 110 or the touch sensor 121, the control unit 130 may restore the original size of the adjusted object region and display the region at the original size (not shown).
An exemplary embodiment is described with reference to
Referring to screen representation (a) of
Referring to screen representations (b) and (c) of
Referring to screen representation (a) of
Referring to screen representation (b) of
As described above, a specific region may be recognized as an object without an explicit selection command, and the size of the recognized object may be directly adjusted in response to a size adjustment command.
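As a hedged sketch of this variant, a drag gesture's start and end points can define the region, which is then recognized as the object and enlarged immediately, with no separate selection command; the gesture representation below is an assumption.

```python
def drag_to_region(start, end):
    """Turn a drag gesture (two corner points) into an object region."""
    (x1, y1), (x2, y2) = start, end
    return min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1)

def adjust_dragged_region(start, end, screen_w, screen_h):
    """Recognize the dragged region as an object and enlarge it immediately,
    using the longer side as the reference axis (no selection step needed)."""
    x, y, w, h = drag_to_region(start, end)
    scale = (screen_w / w) if w >= h else (screen_h / h)
    return w * scale, h * scale

print(adjust_dragged_region((50, 40), (170, 100), 320, 480))  # -> (320.0, 160.0)
```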
In exemplary embodiments of the present invention, the user may adjust the size of a selected object in a more convenient manner without performing a multi-stage or cumbersome input procedure.
While the invention has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.