The development of applications that present graphical images to a user on a computing device must take into account the form factor of the user interface available on the intended computing device. For example, an application developed for a personal computer equipped with a keyboard, a display monitor, and a pointing device may successfully use graphical elements to present a user interface on the monitor that would be difficult, if not impossible, to navigate if the same graphical elements were presented on a smartphone or tablet computing device with a different display size. Similarly, a laptop or other mobile device equipped with a touch screen may present different human factors that an application developer may consider when developing software applications for one or more devices.
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
The following description and the drawings sufficiently illustrate specific embodiments to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. Portions and features of some embodiments may be included in, or substituted for, those of other embodiments. Embodiments set forth in the claims encompass all available equivalents of those claims.
User interface development tools may provide an application developer with the ability to assemble a variety of different views or screens of information for presentation to a user. An individual view may include a variety of different objects or elements, some of which may be layered or stacked on top of (or in front of) each other in completely or partially overlapping configurations. The management, organization, and manipulation of the different views and their constituent objects or elements may become cumbersome if the design of a view includes more than a small number of elements. Additionally, interacting with multiple objects that are distributed among overlapping layers may not be intuitive when presented in a two-dimensional interface.
An application's user interface may be built from a hierarchy of view objects. For example, a root object for a view can have members indicating the position and dimensions of the view. The root object can also have a list of child objects appearing in the view. The child objects can also have positions, dimensions, and further child objects.
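By way of example, and not by way of limitation, such a hierarchy might be sketched as follows; every identifier below (Rect, ViewObject, frame, children) is a hypothetical illustration rather than a name drawn from the embodiments:

```typescript
// A minimal sketch of a hierarchical view object. All identifiers
// here (Rect, ViewObject, frame, children) are assumptions made for
// illustration, not names from the embodiments described above.
interface Rect {
  x: number;      // position of the view's origin
  y: number;
  width: number;  // dimensions of the view
  height: number;
}

interface ViewObject {
  frame: Rect;            // position and dimensions of the view
  children: ViewObject[]; // child objects appearing in the view
}

// Example: a root view with one directly referenced child object,
// which in turn has a child object of its own.
const root: ViewObject = {
  frame: { x: 0, y: 0, width: 320, height: 480 },
  children: [
    {
      frame: { x: 10, y: 10, width: 300, height: 100 },
      children: [
        { frame: { x: 20, y: 20, width: 80, height: 30 }, children: [] },
      ],
    },
  ],
};
```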
The hierarchy of view objects can be displayed as a three-dimensional representation of the view. For example, a set of layers can be created, with each layer corresponding to a level in the hierarchy of view objects. The rear-most layer can represent the view object, the next layer forward can represent objects that are directly referenced by the view object, the layer in front of that can represent the child objects of the directly-referenced objects, and so on.
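Continuing the hypothetical sketch above, the mapping from hierarchy levels to layers may be illustrated as a breadth-first grouping of view objects by depth:

```typescript
// Sketch: group view objects by their depth in the hierarchy. Layer 0
// is the rear-most layer (the root view object); each deeper level of
// the hierarchy becomes the next layer forward.
function buildLayers(root: ViewObject): ViewObject[][] {
  const layers: ViewObject[][] = [];
  let current: ViewObject[] = [root];
  while (current.length > 0) {
    layers.push(current);
    current = current.flatMap((obj) => obj.children);
  }
  return layers;
}

// layers[0] holds the view object, layers[1] the directly referenced
// objects, layers[2] their child objects, and so on.
const layers = buildLayers(root);
```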
A second object 104 may be placed in front of, or on top of, the background object 102. The second object 104 may be visually distinct from the background object 102 such that a border appears to surround the second object 104. The second object 104 may include, or be disposed behind, any of a variety of other objects such as buttons 106. Buttons 106 may include icons 108 that indicate a function or action that may be taken when one of the buttons 106 is selected. Navigation selectors such as a favorites menu 110 or a home menu 112 may be included in a layer parallel to the buttons 106. The background object 102, second object 104, buttons 106, and icons 108 may all be disposed in separate layers. In this manner, each layer, and the objects disposed in an individual layer, may be manipulated independently of the other layers.
The graphical user interface 100 includes a slider 114 that may control the visibility of objects in each layer of the canvas. The slider 114 includes a slider axis 116 and a plurality of divisions 118. The divisions 118 may correspond to individual layers on the canvas, or may be proportionally distributed along the slider axis 116 independent of the number of individual layers. In an example, the number of divisions 118 along the slider axis 116 may change, e.g., increase or decrease, as layers are added to or removed from the canvas.
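As a non-limiting illustration, the recomputation of divisions as layers are added or removed might be sketched as follows, assuming a normalized slider axis running from 0 to 1:

```typescript
// Sketch: divisions recomputed whenever a layer is added or removed,
// with one division at each layer boundary along a normalized slider
// axis running from 0 to 1. Names are illustrative assumptions.
function divisionPositions(layerCount: number): number[] {
  if (layerCount === 0) return [0];
  // n layers have n + 1 boundaries, including both ends of the axis.
  return Array.from({ length: layerCount + 1 }, (_, i) => i / layerCount);
}
```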
The slider 114 includes a first control 120 and a second control 122 that may be manipulated independently of each other. The first control 120 and the second control 122 may include an indicator, such as an icon or colored bulb, which may change appearance when one or both of the controls (120, 122) is selected. The first control 120 may initially be disposed at one end of the slider axis 116, and the second control 122 may be disposed at the end of the slider opposite the first control 120. The first control 120 or the second control 122 may be manipulated, e.g., moved along the slider axis 116, to display, hide, or highlight one or more of the layers on the canvas. For example, as the first control 120 or the second control 122 is moved along the slider axis 116, an individual layer corresponding to the position of the first control 120 or the second control 122 may be highlighted. Individual objects in the highlighted layer may be selected by a user selection input.
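A non-limiting sketch of how the two controls might determine which layers are displayed and which layer is highlighted follows; the interval model (layer i occupying the span between divisions i and i + 1) and all identifiers are assumptions made for illustration:

```typescript
// Sketch of two-control layer filtering. Control positions are in
// division units, with layer i occupying the interval [i, i + 1]
// along the slider axis. All identifiers are assumptions made for
// illustration.
function visibleLayerIndices(
  layerCount: number,
  firstControl: number,
  secondControl: number,
): number[] {
  const lo = Math.min(firstControl, secondControl);
  const hi = Math.max(firstControl, secondControl);
  const visible: number[] = [];
  for (let i = 0; i < layerCount; i++) {
    // A layer is displayed only if it lies entirely between the controls.
    if (i >= lo && i + 1 <= hi) visible.push(i);
  }
  return visible;
}

// The layer under a moving control may be highlighted.
function highlightedLayer(layerCount: number, control: number): number {
  return Math.max(0, Math.min(Math.floor(control), layerCount - 1));
}
```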
The graphical user interface 100 may include additional tools to manipulate the layers and the graphical user interface 100. For example, a filter tool 124 may be presented which, when selected or tapped, causes a dialog to be presented on or near the graphical user interface 100 to provide a mechanism to limit or select the layers to be displayed or hidden from view on the canvas. A zoom-out tool 126, a normal view tool 128, and a zoom-in tool 130 may be presented by the graphical user interface 100. When selected, the zoom-out tool 126 and the zoom-in tool 130 may change the size of, e.g., shrink or enlarge, respectively, the objects on the canvas. The normal view tool 128 may be selected to return the objects on the canvas to a preset view, such as actual size or 100% zoom. In an example, the normal view tool 128 can function as a scale-to-fit tool, fitting the view to the available display size. A view change tool 132 may be selected to transition the objects displayed on the canvas from a two-dimensional front view to a three-dimensional perspective view. Additional tools may be included to add, remove, import, edit, or otherwise modify the content or display of any object in any layer on the canvas within the graphical user interface 100.
In a three-dimensional mode, a 3D graphical user interface 200 may present the layers of the canvas in a perspective view. The 3D graphical user interface 200 may include a 3D slider 202 having a 3D slider axis 204 and a plurality of divisions 206, similar to the slider 114, slider axis 116, and divisions 118 described above.
The 3D slider 202 includes a first 3D control 208 and a second 3D control 210 that may be manipulated independently of each other. The first 3D control 208 and the second 3D control 210 may include an indicator, such as an icon or a colored circle, sphere, or other shape, which may change in appearance when either of the controls (208, 210) is selected. The selection of the first 3D control 208 and the second 3D control 210 can be achieved through the manipulation of a pointer on a display that is presenting the 3D graphical user interface, or in another example, by receiving a touch input on a touch screen display. The first 3D control 208 may initially be disposed at one end of the 3D slider axis 204, and the second 3D control 210 may initially be disposed at the end of the slider opposite the first 3D control 208.
The first 3D control 208 or the second 3D control 210 may each be independently manipulated, e.g., moved along the 3D slider axis 204, to display, hide, or select one or more of the layers on the canvas. Additionally, the first 3D control 208 or the second 3D control 210 may each be independently positioned on the canvas. When the position of one or both of the first 3D control 208 or the second 3D control 210 is changed, each layer on the canvas may be repositioned or reoriented in response. In an example, the 3D slider 202 may stay in alignment with respect to the layers in the canvas. In this manner, the 3D slider 202 remaining aligned with the layers provides for direct manipulation of the objects in the layers, thereby improving a user's ability to manipulate the objects depicted in each layer on the canvas and providing a seamless user experience.
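One non-limiting way to illustrate this repositioning is sketched below; the LayerPose type and the uniform spacing parameter are assumptions, and the sketch reuses the hypothetical visibleLayerIndices function from above:

```typescript
// Sketch: when a 3D control moves, each layer's visibility and depth
// are recomputed so the canvas stays consistent with the slider.
// LayerPose and its fields are assumptions for illustration; the
// visibleLayerIndices function is the sketch shown earlier.
interface LayerPose {
  visible: boolean;
  depth: number; // distance of the layer along the canvas depth axis
}

function poseLayers(
  layerCount: number,
  firstControl: number,
  secondControl: number,
  spacing: number, // distance between adjacent layers
): LayerPose[] {
  const shown = new Set(
    visibleLayerIndices(layerCount, firstControl, secondControl),
  );
  return Array.from({ length: layerCount }, (_, i) => ({
    visible: shown.has(i),
    depth: i * spacing,
  }));
}
```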
The 3D graphical user interface 200 may include a plurality of layers, such as first through fourth layers, arranged from back to front on the canvas.
A fifth layer 222 includes a background plane for a home menu. The fifth layer 222 may be disposed in front of the first through fourth layers, and optionally, separated by space where additional layers could be inserted. The home menu may include a foreground color in a sixth layer 224. On top of the sixth layer 224 is a seventh layer that includes a menu 226 and a plurality of selection buttons 228. An eighth layer includes a plurality of icons 230 that correspond with each one of the selection buttons 228 and are disposed above, or in front of, the respective selection buttons 228.
The 3D graphical user interface 200 may also include menu items such as a layer spacing tool 232 or a 2D mode selection icon 234. In an example, the selection of the layer spacing tool 232 may arrange the layers in the canvas at an equidistant spacing or a default spacing. In an example, the 2D mode selection icon 234 may transition the 3D graphical user interface 200 to a two-dimensional control mode, such as the graphical user interface 100 described above.
The 3D slider 202 may be continuously oriented in 3D space as the viewing angle of the canvas is changed, such that the 3D slider 202 stays closely associated with the objects and layers on the canvas. In this manner, convenient user interaction with the layers, the objects, and the slider may be maintained. For example, the 3D slider may be automatically positioned at the base of the canvas.
In an example, the first 3D control 208 and the second 3D control 210 are constrained such that their order is maintained, e.g., the first 3D control 208 may not be positioned to the right of, or on top of, the second 3D control 210. The first 3D control and the second 3D control may be positioned at immediately adjacent positions such that no objects or layers are displayed.
The first 3D control 208 and the second 3D control 210 may both be selected and then positioned simultaneously, e.g., with a single input. For example, the first 3D control 208 and the second 3D control 210 may be positioned at or between adjacent divisions 206 such that only a single layer of the canvas is displayed. Both the first 3D control 208 and the second 3D control 210 may be moved along the 3D slider axis 204 of the 3D slider 202 such that each layer is displayed individually in sequence. In this manner, each individual layer, and any objects in the layer, may be displayed on the 3D graphical user interface 200.
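A non-limiting sketch of the ordering constraint and of stepping through the layers one at a time follows, again reusing the hypothetical visibleLayerIndices function:

```typescript
// Sketch of the ordering constraint and sequential display. The
// moving control is clamped so the first control may not pass the
// second; stepping both controls together, one division apart,
// displays each layer individually in sequence. Identifiers remain
// illustrative assumptions.
function clampFirstControl(first: number, second: number): number {
  return Math.min(first, second); // first may not pass the second
}

function* stepThroughLayers(layerCount: number): Generator<number[]> {
  for (let i = 0; i < layerCount; i++) {
    yield visibleLayerIndices(layerCount, i, i + 1); // yields [i]
  }
}
```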
In another example, a 3D graphical user interface 400 may present the layers of the canvas with a slider 402, similar to the 3D slider 202 described above.
The 3D graphical user interface 400 may include a camera angle selector 426 which, when selected, may allow a user to change the viewing angle, e.g., camera angle, depicted by the 3D graphical user interface 400. The 3D graphical user interface 400 may include view selection tools. For example, a zoom-out tool 428, a full-size view tool 430, and a zoom-in tool 432 may change the number of visible objects, or manipulate the size of the objects displayed in the 3D graphical user interface 400. In an example, when a user has zoomed in on the canvas, only the objects in visible layers of the graphical user interface 400 will be controlled by the slider 402.
The development machine 510 runs a development application 530, and the target application 540 may run on the host device 520. The development application 530 accesses the target application 540 to provide development information to an application developer. For example, the development application 530 may have access to the source code for the target application 540 and to the memory of the host device 520. Based on the source code for the target application 540 and the memory of the host device 520, current values for variables, structures, and classes described in the source code of the target application 540 can be determined. The development application 530 can present these values to the application developer. Among the values that may be accessed by the development application are values corresponding to the user interface of the target application 540.
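By way of example only, the exchange between the development application 530 and the target application 540 might be sketched as follows; every type and method shown (DebugConnection, read, readViewHierarchy, the "rootView" symbol) is a hypothetical assumption rather than the interface of any particular development tool:

```typescript
// A highly simplified sketch of a development application requesting
// user-interface values from a target application over a debug
// connection. Every type and method here is a hypothetical
// assumption, not the API of any particular tool.
interface RemoteViewObject {
  frame: { x: number; y: number; width: number; height: number };
  children: RemoteViewObject[];
}

interface DebugConnection {
  // Resolve a named structure in the target's memory to its current value.
  read(symbol: string): Promise<unknown>;
}

async function readViewHierarchy(
  conn: DebugConnection,
): Promise<RemoteViewObject> {
  // Assume the target exposes its root view object under a known symbol.
  return (await conn.read("rootView")) as RemoteViewObject;
}
```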
Though arranged serially in the examples described above, other examples may reorder the activities, omit one or more activities, or execute two or more activities in parallel.
Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or mechanisms. Modules or components are tangible entities capable of performing specified operations and may be configured or arranged in a certain manner. In an example, circuits may be arranged (e.g., internally or with respect to external entities such as other circuits) in a specified manner as a module or component. In an example, the whole or part of one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware processors may be configured by firmware or software (e.g., instructions, an application portion, or an application) as a module/component that operates to perform specified operations. In an example, the software may reside (1) on a non-transitory machine-readable medium or (2) in a transmission signal. In an example, the software, when executed by the underlying hardware of the module/component, causes the hardware to perform the specified operations.
Accordingly, the terms “module” and “component” are understood to encompass a tangible entity, be that an entity that is physically constructed, specifically configured (e.g., hardwired), or temporarily (e.g., transitorily) configured (e.g., programmed) to operate in a specified manner or to perform part or all of any operation described herein. Considering examples in which modules/components are temporarily configured, each of the modules/components need not be instantiated at any one moment in time. For example, where the modules/components comprise a general-purpose hardware processor configured using software, the general-purpose hardware processor may be configured as respective different modules/components at different times. Software may accordingly configure a hardware processor, for example, to constitute a particular module/component at one instance of time and to constitute a different module at a different instance of time.
Machine (e.g., computer system) 900 may include a hardware processor 902 (e.g., a processing unit, a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 904, and a static memory 906, some or all of which may communicate with each other via a link 908 (e.g., a bus, link, interconnect, or the like). The machine 900 may further include a display device 910, an input device 912 (e.g., a keyboard), and a user interface (UI) navigation device 914 (e.g., a mouse). In an example, the display device 910, input device 912, and UI navigation device 914 may be a touch screen display. The machine 900 may additionally include a mass storage (e.g., drive unit) 916, a signal generation device 918 (e.g., a speaker), a network interface device 920, and one or more sensors 921, such as a global positioning system (GPS) sensor, camera, video recorder, compass, accelerometer, or other sensor. The machine 900 may include an output controller 928, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR)) connection to communicate with or control one or more peripheral devices (e.g., a printer, card reader, etc.).
The mass storage 916 may include a machine-readable medium 922 on which is stored one or more sets of data structures or instructions 924 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 924 may also reside, completely or at least partially, within the main memory 904, within static memory 906, or within the hardware processor 902 during execution thereof by the machine 900. In an example, one or any combination of the hardware processor 902, the main memory 904, the static memory 906, or the mass storage 916 may constitute machine readable media.
While the machine-readable medium 922 is illustrated as a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that are configured to store the one or more instructions 924. The term “machine-readable medium” may include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 900 and that cause the machine 900 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions. Non-limiting machine-readable medium examples may include solid-state memories, and optical and magnetic media. Specific examples of machine-readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
The instructions 924 may further be transmitted or received over a communications network 926 using a transmission medium via the network interface device 920 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone Service (POTS) networks, wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, the IEEE 802.16 family of standards known as WiMax®), and peer-to-peer (P2P) networks, among others. In an example, the network interface device 920 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 926. In an example, the network interface device 920 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 900, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
Publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated reference(s) is supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.
In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first,” “second,” “third,” etc. are used merely as labels, and are not intended to suggest a numerical order for their objects.
The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with others. Other embodiments may be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is to allow the reader to quickly ascertain the nature of the technical disclosure, for example, to comply with 37 C.F.R. §1.72(b) in the United States of America. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Also, in the above Detailed Description, various features may be grouped together to streamline the disclosure. However, the claims may not set forth every feature disclosed herein, as embodiments may feature a subset of said features. Further, embodiments may include fewer features than those disclosed in a particular example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. The scope of the embodiments disclosed herein is to be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.