The method and system encompassed herein relate generally to the interactive display of images on a device display and, more particularly, to the interactive display of object images in a stereoscopic manner.
As technology has progressed, various devices have been configured to display images, and particularly objects in those images, in a manner by which users perceive the displayed object images to be three-dimensional (3D), even though the images are displayed on two-dimensional (2D) display screens. Such a manner of display is often referred to as stereoscopic or three-dimensional imaging. Stereoscopic imaging is a depth illusion created by displaying a pair of offset images separately to the right and left eyes of a viewer, wherein the brain combines the images to provide the illusion of depth. Although the use of stereoscopic imaging has enhanced the ability of engineers, artists, designers, and draftspersons to prepare perceived-3D models, improved methods of manipulating the objects shown in a perceived 3D environment are needed.
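To make the depth illusion described above concrete, the following sketch computes the per-eye screen positions of a point at a given virtual depth relative to the zero-plane. It is offered only as an illustration: the function name, the sign convention (positive depth toward the viewer), and the interpupillary and viewing distances are assumptions, not values prescribed by this disclosure.

```kotlin
// Per-eye screen X positions for a point at signed depth z relative to
// the zero-plane (z > 0 toward the viewer, z < 0 behind the screen).
fun stereoProject(
    x: Double,                      // X of the point in screen-plane units (e.g., mm)
    z: Double,                      // signed depth relative to the screen, same units
    eyeSeparation: Double = 63.0,   // assumed interpupillary distance, mm
    viewerDistance: Double = 350.0  // assumed eye-to-screen distance, mm
): Pair<Double, Double> {
    val k = viewerDistance / (viewerDistance - z)            // perspective magnification
    val parallax = -eyeSeparation * z / (viewerDistance - z) // on-screen L/R separation
    val xc = x * k                                           // projected centre on the screen
    return Pair(xc - parallax / 2.0, xc + parallax / 2.0)    // (left-eye X, right-eye X)
}

fun main() {
    println(stereoProject(x = 10.0, z = 0.0))    // on the zero-plane: both eyes coincide
    println(stereoProject(x = 10.0, z = 40.0))   // in front of the screen: crossed parallax
    println(stereoProject(x = 10.0, z = -40.0))  // behind the screen: uncrossed parallax
}
```

At zero depth the two eye positions coincide, which is what makes an object image at the zero-plane appear to lie exactly on the screen surface.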
The above considerations, and others, are addressed by the method and system encompassed herein, which can be understood by referring to the specification, drawings, and claims. According to aspects of the method and system encompassed herein, a method of manipulating viewable objects is provided that includes providing a touch screen display capable of stereoscopically displaying object images, wherein a zero-plane reference is positioned substantially coincident with the physical surface of the display, and displaying on the touch screen a first object image and one or more second object images, wherein the object images are displayed to appear at least one of in front of, at, or behind the zero-plane. The method further includes receiving a first input at the touch screen at a location substantially corresponding to an apparent position of the first object image, and modifying the displaying on the touch screen so that at least one of the first object image and the one or more second object images appears to move one of outward in front of the touch screen or inward behind the touch screen in a stereoscopic manner.
According to further aspects, a method of manipulating viewable objects displayed on a touch screen is provided that includes displaying a first object image and one or more second object images in a perceived virtual space provided by a touch screen display configured to provide a stereoscopic display of the first object image and the one or more second object images. The method further includes positioning the first object image at or adjacent to a zero-plane that intersects the virtual space and is substantially coincident with the surface of the touch screen display, sensing a selection of the first object image, and modifying the perceived position of at least one of the first object image and the one or more second object images, such that at least one of the first object image and the one or more second object images is relocated to appear at a distance from its original displayed location.
According to still further aspects, a mobile device is provided that includes a touch display screen capable of providing a stereoscopic view of a plurality of object images, wherein the object images are configured to appear, to a user viewing the display, to be situated in a three-dimensional virtual space that includes a world coordinate system and a camera coordinate system, wherein the camera coordinate system includes an X axis, a Y axis, and a Z axis with a zero-plane coincident with an X-Y plane formed by the X axis and Y axis, and the zero-plane is substantially coincident with the surface of the display screen. The mobile device further includes a processor that is programmed to control the display of the plurality of object images on the display screen, wherein at least one of the object images is displayed so as to appear at least partly coincident with the zero-plane, such that it can be selected by a user for performing a function, and at least one of the other object images appears positioned at least one of inward and outward of the zero-plane and is not selected to perform a function.
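As a minimal sketch of the coordinate arrangement recited above, the following code models the camera coordinate system as a translation of the world coordinate system, with the zero-plane being the camera-space X-Y plane at z equal to zero. The types, the translation-only relationship, and the selection tolerance are illustrative assumptions; actual embodiments could use full rotation-and-translation transforms.

```kotlin
data class Vec3(val x: Double, val y: Double, val z: Double)

// Camera coordinate system modelled as a translation of the world
// system; the zero-plane is the camera-space X-Y plane at z == 0,
// substantially coincident with the screen surface.
class CameraSystem(var originInWorld: Vec3) {
    fun worldToCamera(p: Vec3) =
        Vec3(p.x - originInWorld.x, p.y - originInWorld.y, p.z - originInWorld.z)

    fun cameraToWorld(p: Vec3) =
        Vec3(p.x + originInWorld.x, p.y + originInWorld.y, p.z + originInWorld.z)

    // An object image is treated as selectable when it lies at (or
    // very near) the zero-plane, i.e. camera-space z is about 0.
    fun isSelectable(p: Vec3, tolerance: Double = 0.5) =
        kotlin.math.abs(worldToCamera(p).z) <= tolerance
}
```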
While the appended claims set forth the features of the method and system encompassed herein with particularity, the method and system encompassed herein, together with its objects and advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings, of which:
Turning to the drawings, wherein like reference numerals refer to like elements, the method and system encompassed herein are illustrated as being implemented in a suitable environment. The following description is based on embodiments of the method and system encompassed herein and should not be taken as limiting with regard to alternative embodiments that are not explicitly described herein.
As will be described in greater detail below, it would be desirable if an arrangement of multiple object images with respect to which user interaction is desired could be displayed on a mobile device in a stereoscopic manner. Further, it would be desirable to display objects in a stereoscopic manner such that their manipulation is intuitive to a user and they provide a realistic stereoscopic appearance before, during, and after manipulation. The display and manipulation of such object images in a stereoscopic environment can be presented in numerous forms. In at least some embodiments, the object images are displayed and manipulated on a mobile device, such as a smart phone, a tablet, or a laptop computer. In other embodiments, they can be displayed and manipulated on other devices, such as a desktop computer. The manipulation is, in at least some embodiments, accomplished using a touch sensitive display, such that a user can manipulate the object images with a simple touch, although other types of pointing and selecting devices, such as a mouse, trackball, stylus, pen, etc., can be utilized in addition to or in place of user-based touching.
The mobile device 100 in the present embodiment includes a touch-sensitive display screen 102 having a touch-based input surface 104 (e.g., a touch sensitive surface or touch panel) situated on the exposed side of the display screen 102, which is accessible to a user. For convenience, references herein to selecting an object at the display screen 102 should be understood to include selection at the touch-based input surface 104. The display screen 102 is in at least some embodiments planar, and establishes a physical plane 105 situated between the exterior and interior of the mobile device 100. In other embodiments, the display screen 102 can include curved portions, and therefore the physical plane 105 can be non-planar. The display screen 102 can utilize any of a variety of technologies, such as, for example, specific touch sensitive elements. In the present embodiment, the display screen 102 is particularly configured for the stereoscopic presentation of object images (as discussed below). More particularly, the display screen 102 can include an LCD that uses a parallax barrier system to display 3D images, such as those manufactured by Sharp Electronics Corp. of New Jersey, USA. The parallax barrier has a series of vertical slits that control the path of light reaching the right and left eyes, thus creating a sense of depth. The display is constructed as a single full-screen assembly, with a barrier layer sandwiched between the touch glass and the LCD glass of an otherwise regular LCD. The display screen 102 displays information output by the mobile device 100, while the input surface 104 allows a user of the mobile device 100, among other things, to select various displayed object images and to manipulate them. The mobile device 100, depending upon the embodiment, can include any of a variety of software configurations, such as an interface application that is configured to allow a user to manipulate the display of media stored on or otherwise accessible by the mobile device 100.
By contrast, the Wi-Fi transceiver 205 is a wireless local area network (WLAN) transceiver 205 configured to conduct Wi-Fi communications in accordance with the IEEE 802.11(a, b, g, or n) standard with access points. In other embodiments, the Wi-Fi transceiver 205 can instead (or in addition) conduct other types of communications commonly understood as being encompassed within Wi-Fi communications such as some types of peer-to-peer (e.g., Wi-Fi Peer-to-Peer) communications. Further, in other embodiments, the Wi-Fi transceiver 205 can be replaced or supplemented with one or more other wireless transceivers configured for non-cellular wireless communications including, for example, wireless transceivers employing ad hoc communication technologies such as HomeRF (radio frequency), Home Node B (3G femtocell), Bluetooth and/or other wireless communication technologies such as infrared technology. Thus, although in the present embodiment the mobile device 100 has two of the wireless transceivers 203 and 205, the present disclosure is intended to encompass numerous embodiments in which any arbitrary number of wireless transceivers employing any arbitrary number of communication technologies are present.
Example operation of the wireless transceivers 202 in conjunction with others of the internal components 200 of the mobile device 100 can take a variety of forms and can include, for example, operation in which, upon reception of wireless signals, the internal components detect communication signals and the transceivers 202 demodulate the communication signals to recover incoming information, such as voice and/or data, transmitted by the wireless signals. After receiving the incoming information from the transceivers 202, the processor portion 204 formats the incoming information for the one or more output devices 208. Likewise, for transmission of wireless signals, the processor portion 204 formats outgoing information, which can, but need not, be activated by the input devices 210, and conveys the outgoing information to one or more of the wireless transceivers 202 for modulation so as to provide modulated communication signals to be transmitted. The wireless transceivers 202 convey the modulated communication signals by way of wireless (as well as possibly wired) communication links to other devices.
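The receive and transmit flow just described can be pictured with the following skeleton, in which the radio details are abstracted behind an interface. All type and method names here are hypothetical placeholders rather than an actual device API.

```kotlin
// Hedged sketch of the receive/transmit flow; radio details abstracted.
interface Transceiver {
    fun demodulate(signal: ByteArray): ByteArray   // recover incoming information
    fun modulate(data: ByteArray): ByteArray       // prepare outgoing information
}

class ProcessorPortion(private val transceiver: Transceiver) {
    // Incoming: the transceiver demodulates, then the processor formats
    // the recovered information for the output devices.
    fun onWirelessSignal(signal: ByteArray): String =
        transceiver.demodulate(signal).decodeToString()

    // Outgoing: the processor formats the information, then the
    // transceiver modulates it for transmission.
    fun send(text: String): ByteArray =
        transceiver.modulate(text.encodeToByteArray())
}
```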
Depending upon the embodiment, the output devices 208 of the internal components 200 can include a variety of visual, audio, and/or mechanical outputs. For example, the output device(s) 208 can include one or more visual output devices 216, such as the display screen 102 (e.g., a liquid crystal display and/or light emitting diode indicator(s)), one or more audio output devices 218, such as a speaker, alarm, and/or buzzer, and/or one or more mechanical output devices 220, such as a vibrating mechanism. Likewise, the input devices 210 of the internal components 200 can include a variety of visual, audio, and/or mechanical inputs. By way of example, the input device(s) 210 can include one or more visual input devices 222 such as an optical sensor (for example, a camera lens and photosensor), one or more audio input devices 224 such as a microphone, and one or more mechanical input devices 226 such as a flip sensor, keyboard, keypad, selection button, navigation cluster, input surface (e.g., a touch sensitive surface associated with one or more capacitive sensors), motion sensor, and switch. Operations that can actuate one or more of the input devices 210 can include not only the physical pressing/actuation of buttons or other actuators, and physically touching or gesturing along touch sensitive surfaces, but can also include, for example, opening the mobile device 100 (if it can take on open or closed positions), unlocking the mobile device 100, moving the mobile device 100 to actuate a motion sensor, moving the mobile device 100 to actuate a location positioning system, and otherwise operating the mobile device 100.
As mentioned above, the internal components 200 also can include one or more of various types of sensors 228. The sensors 228 can include, for example, proximity sensors (e.g., a light detecting sensor, an ultrasound transceiver, or an infrared transceiver), touch sensors (e.g., capacitive sensors associated with the input surface 104 that overlays the display screen 102 of the mobile device 100), altitude sensors, and one or more location circuits/components that can include, for example, a Global Positioning System (GPS) receiver, a triangulation receiver, an accelerometer, a tilt sensor, a gyroscope, or any other information collecting device that can identify a current location or user-device interface (carry mode) of the mobile device 100.
The memory portion 206 of the internal components 200 can encompass one or more memory devices of any of a variety of forms (e.g., read-only memory, random access memory, static random access memory, dynamic random access memory, etc.), and can be used by the processor 204 to store and retrieve data. In some embodiments, the memory portion 206 can be integrated with the processor portion 204 in a single device (e.g., a processing device including memory or processor-in-memory (PIM)), although such a single device will still typically have distinct portions/sections that perform the different processing and memory functions and that can be considered separate devices. The data that is stored by the memory portion 206 can include, but need not be limited to, operating systems, applications, and informational data.
Each operating system includes executable code that controls basic functions of the mobile device 100, such as interaction among the various components included among the internal components 200, communication with external devices via the wireless transceivers 202 and/or the component interface 212, and storage and retrieval of applications and data to and from the memory portion 206. Each application includes executable code that utilizes an operating system to provide more specific functionality, such as file system service and handling of protected and unprotected data stored in the memory portion 206. Such operating system and/or application information can include software update information (which can be understood to potentially encompass update(s) to either application(s) or operating system(s) or both). As for informational data, this is non-executable code or information that can be referenced and/or manipulated by an operating system or application for performing functions of the mobile device 100.
As will be discussed with reference to additional Figures, the object images 303 can be manipulated in various manners to reorient the object images 303 relative to each other and the display screen 102. The manipulations are generally initiated by a user performing a gesture on the display screen 102, such as touching the display screen 102 with one or more fingers at the point on the display screen where the object image 303 appears. However, in at least some embodiments, the manipulations can be performed through other input methods, such as through the use of a mechanical pointing device, voice commands, etc. Through the manipulation of the object images 303 at the display screen 102, a user can reorient the object images 303 in a stereoscopic view to zoom in or zoom out on particular object images 303. In addition, a grid 517, discussed further below, can be displayed to enhance the stereoscopic appearance of the object images 303.
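Because gestures are sensed at the display screen 102 while the object images 303 merely appear at various depths, a natural implementation hit-tests the touch location against each object image's apparent X-Y footprint. The sketch below illustrates one such approach; the circular footprint and the rule of preferring the image appearing nearest the viewer are assumptions for illustration.

```kotlin
data class ObjectImage(val id: Int, val x: Double, val y: Double,
                       val z: Double, val radius: Double)

// The touch itself always lands on the screen surface (the zero-plane),
// so the test uses each object image's apparent X-Y footprint, here
// assumed circular for simplicity.
fun hitTest(touchX: Double, touchY: Double,
            objects: List<ObjectImage>): ObjectImage? =
    objects
        .filter {
            val dx = it.x - touchX
            val dy = it.y - touchY
            dx * dx + dy * dy <= it.radius * it.radius
        }
        .maxByOrNull { it.z }  // prefer the image appearing nearest the viewer
```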
To enhance the user experience during a manipulation, the primary object image 323 is displayed fixed in the camera coordinate system 302, while the secondary object images 325 move relative to the primary object image 323. In this manner, the primary object image 323 appears to stay situated close to the point on the display screen 102 where the user is selecting it, while the secondary object images 325 appear to move away from their original positions. Once the primary object image 323 is manipulated to a desired location relative to the secondary object images 325, the view as seen by the user can be revised to show that the secondary object images 325 remain in their original world coordinate system positions, while the primary object image 323 has been moved to a new world coordinate system position. The world coordinate system 301 and camera coordinate system 302 can be aligned or misaligned with each other at different times.
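One way to realize the behavior described above is to translate the camera by the primary object image's own world-space displacement, so that the primary stays fixed in camera (screen) coordinates while the secondary images appear to move by the opposite amount. The following sketch illustrates this under assumed names; it is not the only way an embodiment could implement the effect.

```kotlin
data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
}

class Scene(
    var cameraOrigin: Vec3,            // camera system origin, in world coordinates
    var primary: Vec3,                 // primary object image, world position
    val secondaries: MutableList<Vec3> // secondary object images, world positions
) {
    fun toCamera(p: Vec3) = p - cameraOrigin

    // Moving the primary in world space and shifting the camera by the
    // same delta leaves the primary's camera-space (on-screen) position
    // unchanged, so only the secondaries appear to move.
    fun movePrimary(delta: Vec3) {
        primary = primary + delta
        cameraOrigin = cameraOrigin + delta
    }
}
```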
It should be noted that, as the zero-plane 310 is positioned at the display screen 102, all physical touching (selection) occurs at the zero-plane 310, regardless of the appearance of the object images 303 to the user viewing the display screen 102. As such, in some instances, it will appear, at least in the Figures, that the user is not touching a portion of the primary object image 323, as the primary object image 323 will be shown along the Z axis 311 at a point away from the touching point on the display at the zero-plane 310. Further, in at least some embodiments, it is the intent that the positioning of the primary object image 323 is maintained at least partially at or about the zero-plane 310 so as to provide an intuitive touch point for the user. This can be particularly useful when a stereoscopic view is present, as one or more of the object images 325 can appear to be situated where a user cannot touch, such as behind or in front of the display screen 102. In at least some embodiments discussed herein, the object images 325 do not remain tethered to the zero-plane 310 during the touch action by the user, but the object images 325 can be moved as a group into a position that maintains their spatial relationship while placing the primary object image 323 at or near the zero-plane 310. Further, in at least some embodiments, the object images 325 are not tethered to the zero-plane 310 although they do return to a position about the zero-plane after a user has ceased to touch the display screen 102, without additional action taken by the user.
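The group repositioning described above can be sketched as a single Z translation applied to every object image, so that the primary lands on the zero-plane while inter-object spacing is preserved, followed by an eased return after release. The per-step easing rate and names below are assumptions.

```kotlin
class ObjImage(var z: Double)  // signed depth: 0.0 is the zero-plane

// Translate the whole group along Z so the primary lands on the
// zero-plane while every inter-object distance is preserved.
fun seatPrimaryAtZeroPlane(primary: ObjImage, others: List<ObjImage>) {
    val shift = -primary.z
    primary.z += shift
    others.forEach { it.z += shift }
}

// One animation step of the post-release return: the group as a whole
// eases back until the primary again sits about the zero-plane.
fun relaxStep(primary: ObjImage, others: List<ObjImage>, rate: Double = 0.15) {
    val shift = -primary.z * rate
    primary.z += shift
    others.forEach { it.z += shift }
}
```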
When a PUSH action command is received by the processor 204 of the mobile device 100, the primary object image 323 can be repositioned in the world coordinate system 301. In addition, to provide the appearance that the primary object image 323 is moving inward of the display screen 102 under the pressure of the touch, the camera coordinate system 302 shifts to display the secondary object images 325 moving away from the primary object image 323. Further, as the secondary object images 325 are moved down the +Z axis portion 413, away from the primary object image 323, they can, in at least some embodiments, be enlarged so that they appear further out of the display screen 102, towards the user. Meanwhile, the primary object image 323 remains pinned to the zero-plane 310 and, in at least some embodiments, is reduced in size, while in other embodiments it can remain consistent in size. Further, the grid 517 shifts along with the secondary object images 325 to remain at a consistent distance D therefrom. Although the majority of the grid 517 remains in a planar shape and follows the secondary object images 325, the portion of the grid 517 that is adjacent to the primary object image 323 can deform around the primary object image 323 to further enhance the stereoscopic appearance.
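A single update step of the PUSH behavior described above might look like the following sketch, in which the primary stays pinned at the zero-plane (optionally shrinking) while the secondaries move out along +Z and are enlarged. The perspective-style scaling formula and the numeric constants are illustrative assumptions only.

```kotlin
class Img(var z: Double, var scale: Double)  // z > 0 appears in front of the screen

fun pushStep(
    primary: Img,
    secondaries: List<Img>,
    dz: Double,                     // per-step displacement along +Z
    viewerDistance: Double = 350.0  // assumed eye-to-screen distance; z must stay below this
) {
    primary.scale *= 0.98           // optional shrink; other embodiments keep the size constant
    for (s in secondaries) {
        s.z += dz                   // secondaries move outward, toward the viewer
        // Simple perspective-style scaling: the nearer an image appears
        // (larger z), the larger it is drawn.
        s.scale = viewerDistance / (viewerDistance - s.z)
    }
}
```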
For additional consideration with regard to the method and system encompassed herein, in at least some embodiments, the object images 303 can be selected and moved around relative to the grid 517, whether in a PULL position, a PUSH position, or neither. Such movement of an object image 303 relative to the grid 517 can include deforming portions of the grid as they are contacted by the object image 303, and undeforming portions of the grid when they no longer contact the object image 303.
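The deform/undeform behavior of the grid 517 can be approximated by displacing each grid vertex as a smooth function of its distance from the contacting object image, so that vertices relax back to the planar rest shape once contact ends. The Gaussian falloff below is one assumed choice among many.

```kotlin
import kotlin.math.exp
import kotlin.math.sqrt

class GridVertex(val restX: Double, val restY: Double, var offset: Double = 0.0)

// Displace each vertex by a smooth function of its distance from the
// object image; far vertices get ~0 offset, i.e. they "undeform" back
// to the planar rest shape as the object moves away.
fun deformGrid(vertices: List<GridVertex>, objX: Double, objY: Double,
               objRadius: Double, depth: Double) {
    for (v in vertices) {
        val dx = v.restX - objX
        val dy = v.restY - objY
        val dist = sqrt(dx * dx + dy * dy)
        v.offset = depth * exp(-(dist * dist) / (2.0 * objRadius * objRadius))
    }
}
```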
In various embodiments, the PULL and PUSH actions can be accompanied by audio effects produced by the mobile device 100. In addition, various methods of highlighting the object images 303 can be provided, such as varied/varying colors and opacity. For example, the primary object image 323 can be highlighted to differentiate it from the secondary object images 325, and/or the highlighting can vary depending on the position of the object images 303 relative to the zero-plane 310 or another point.
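A depth-dependent highlight such as the one described can be as simple as mapping an object image's distance from the zero-plane 310 to an opacity value, as in this assumed sketch:

```kotlin
import kotlin.math.abs

// Map distance from the zero-plane to a highlight opacity: images at
// the zero-plane render fully solid, distant ones fade (a floor of
// 0.25 keeps every image visible). maxDepth is an assumed tuning constant.
fun highlightOpacity(z: Double, maxDepth: Double = 100.0): Double =
    (1.0 - abs(z) / maxDepth).coerceIn(0.25, 1.0)
```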
Further, the user's view of the object images 303 can be manipulated by changing the camera view (e.g., viewing angle) provided at the display screen 102. For example, a double-tap on the display screen 102 can unlock the current camera view of the object images 303. Once unlocked, the camera view of the object images 303 can be changed by a movement of a user's touch across the display screen 102. In addition, the camera view can be modified by using a pinch-in user gesture to zoom in and a pinch-out user gesture to zoom out. In this manner, the user can rotate the object images 303 in the virtual space 300 to provide an improved view of object images 303 that, for example, appear at an extended distance from the zero-plane 310, or are shown underneath the grid 517 and would otherwise be difficult to see without interference from other object images 303 or portions of object images 303.
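The camera-view gestures described above might be wired up along the following lines; the lock toggle, the sensitivity constants, and the zoom bounds are illustrative assumptions, and the actual gesture-recognition plumbing is omitted.

```kotlin
class CameraView {
    var locked = true
    var yawDeg = 0.0
    var pitchDeg = 0.0
    var zoom = 1.0

    // Double-tap toggles the view lock described above.
    fun onDoubleTap() { locked = !locked }

    // While unlocked, a touch moved across the screen rotates the view.
    fun onDrag(dxPx: Double, dyPx: Double) {
        if (locked) return
        yawDeg += dxPx * 0.25    // assumed degrees-per-pixel sensitivity
        pitchDeg += dyPx * 0.25
    }

    // Pinch gestures adjust the zoom within assumed bounds.
    fun onPinch(scaleFactor: Double) {
        zoom = (zoom * scaleFactor).coerceIn(0.5, 4.0)
    }
}
```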
It should be noted that prior to, during, or after a view is presented, interaction hints (e.g., text) can be displayed to assist the user by providing specific options and/or instructions for their implementation. In addition, the views provided in the Figures are examples and can vary to accommodate various types of object images as well as various types of mobile devices. Many of the selections described herein can be user-initiated only and/or time-based for automated actuation.
In view of the many possible embodiments to which the principles of the method and system encompassed herein may be applied, it should be recognized that the embodiments described herein with respect to the drawing Figures are meant to be illustrative only and should not be taken as limiting the scope of the method and system encompassed herein. Therefore, the method and system as described herein contemplates all such embodiments as may come within the scope of the following claims and equivalents thereof.