Devices such as personal computers (PCs), laptops, slates, and phones offer a wide range of screen sizes. However, there is no established method for scaling a user interface (UI) across a large range of screen sizes, from very large displays down to smaller displays. It is with respect to this general technical area that the present application is directed.
Non-limiting examples of the present disclosure describe user interface scaling based on a detected display size associated with a connected processing device. A display size associated with a connected processing device is detected. A display class is determined based on the detected display size. A user interface for an application is launched on the connected processing device based on the determined display class.
In other non-limiting examples, a user interface is scaled based on connection of a processing device having a different display size from a first processing device. A user interface for an application is launched at a first scaled model based on determining a display class associated with a first processing device. The display class associated with the first processing device is determined based on a detected display size of the first processing device. Connection of a second processing device is detected. A display class associated with the second processing device is determined upon connection of the second processing device. The display class associated with the second processing device is determined based on a detected display size of the second processing device. The user interface is adapted to display, on the second processing device, at a second scaled model designed for the second processing device upon determining that the display class of the second processing device is different from the display class of the first processing device.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Additional aspects, features, and/or advantages of examples will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.
Non-limiting and non-exhaustive examples are described with reference to the following figures.
Users of processing devices desire applications to be optimized for the form factor of the device being used. However, there is no established method of scaling a user interface (UI) across a large range of screen sizes in such a manner. Simply attempting to merge a large screen version of an application with a small screen version of an application creates complications. As an example, when large screen versions of applications are executed on devices having smaller display sizes, the UI becomes too crowded and touch targets become too small. Another complication is that UIs are not traditionally scalable across devices having different display sizes. For instance, a user of a processing device may be viewing an application on a device having a smaller display size (e.g., a mobile phone) and proceed to connect the device having the smaller display to a device having a larger display size (e.g., a PC). Attempted resizing of an application across differing display sizes may drastically affect the display and operation of the UI for an application and/or application command control. In other cases where different versions of an application are developed (e.g., a mobile version and a desktop version), systems are typically unable to recognize that a UI is to be scaled to a different programmed version to account for display size changes. Other instances of building UI packages may incorporate a scaling model for large and small screen devices but are only able to show a single type of UI (e.g., a phone version or a slate version) once an application is installed. This may limit a user's ability to connect to large display screens and enjoy a UI that takes advantage of available display space.
Examples of the present disclosure describe a hybrid approach for scaling a UI that accommodates changes in display size resulting from resizing of a display and/or connection of devices having different screen sizes. In examples, applications are developed that can execute/run on a plurality of devices having different display sizes. Examples of a scalable UI of the present disclosure combine multiple UI scaling models that take into account the physical screen size of a processing device, enabling the UI to adjust to available display space and accommodate changes in display sizes. For instance, a created application may run on a smart phone, and upon detection of a device having a larger screen, the UI can adapt its display for operation on the larger screen device. Examples of the present disclosure comprise evaluation of display class information associated with an application UI at runtime of the application to identify a class of display (e.g., large screen/tablet/slate/phablet/phone, etc.). Display class information may be used to determine whether to display a UI optimized for larger screen devices, smaller screen devices, or something in-between.
A number of technical advantages are achieved based on the present disclosure including but not limited to: improved scalability of UI for applications, consistent UI displayed across varying display sizes, visually appealing presentation of application command control, enhanced processing capability across devices of varying display sizes including improved efficiency and usability for application command control, improved efficiency in navigation and access to control content, and improved user interaction with applications/application command controls, among other examples.
As stated above, a number of program modules and data files may be stored in the system memory 106. While executing on the processing unit 104, program modules 108 (e.g., Input/Output (I/O) manager 124, other utility 126 and application 128) may perform processes including, but not limited to, one or more of the stages of the operations described throughout this disclosure. Other program modules that may be used in accordance with examples of the present invention may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, photo editing applications, authoring applications, etc.
Furthermore, examples of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, examples of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in
The computing device 102 may also have one or more input device(s) 112 such as a keyboard, a mouse, a pen, a sound input device, a device for voice input/recognition, a touch input device, etc. The output device(s) 114 such as a display, speakers, a printer, etc. may also be included. The aforementioned devices are examples and others may be used. The computing device 102 may include one or more communication connections 116 allowing communications with other computing devices 118. Examples of suitable communication connections 116 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The system memory 106, the removable storage device 109, and the non-removable storage device 110 are all computer storage media examples (i.e., memory storage). Computer storage media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 102. Any such computer storage media may be part of the computing device 102. Computer storage media does not include a carrier wave or other propagated or modulated data signal.
Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
One or more application programs 266 may be loaded into the memory 262 and run on or in association with the operating system 264. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. The system 202 also includes a non-volatile storage area 268 within the memory 262. The non-volatile storage area 268 may be used to store persistent information that should not be lost if the system 202 is powered down. The application programs 266 may use and store information in the non-volatile storage area 268, such as e-mail or other messages used by an e-mail application, and the like. A synchronization application (not shown) also resides on the system 202 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 268 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into the memory 262 and run on the mobile computing device 200 described herein.
The system 202 has a power supply 270, which may be implemented as one or more batteries. The power supply 270 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
The system 202 may include a peripheral device port 230 that performs the function of facilitating connectivity between the system 202 and one or more peripheral devices. Transmissions to and from the peripheral device port 230 are conducted under control of the operating system (OS) 264. In other words, communications received by the peripheral device port 230 may be disseminated to the application programs 266 via the operating system 264, and vice versa.
The system 202 may also include a radio interface layer 272 that performs the function of transmitting and receiving radio frequency communications. The radio interface layer 272 facilitates wireless connectivity between the system 202 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio interface layer 272 are conducted under control of the operating system 264. In other words, communications received by the radio interface layer 272 may be disseminated to the application programs 266 via the operating system 264, and vice versa.
The visual indicator 220 may be used to provide visual notifications, and/or an audio interface 274 may be used for producing audible notifications via the audio transducer 225. In the illustrated example, the visual indicator 220 is a light emitting diode (LED) and the audio transducer 225 is a speaker. These devices may be directly coupled to the power supply 270 so that when activated, they remain on for a duration dictated by the notification mechanism even though the processor 260 and other components might shut down for conserving battery power. The LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 274 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to the audio transducer 225, the audio interface 274 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with examples of the present invention, the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below. The system 202 may further include a video interface 276 that enables an operation of an on-board camera 230 to record still images, video stream, and the like.
A mobile computing device 200 implementing the system 202 may have additional features or functionality. For example, the mobile computing device 200 may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape. Such additional storage is illustrated in
Data/information generated or captured by the mobile computing device 200 and stored via the system 202 may be stored locally on the mobile computing device 200, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 272 or via a wired connection between the mobile computing device 200 and a separate computing device associated with the mobile computing device 200, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated, such data/information may be accessed via the mobile computing device 200 via the radio 272 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
In examples, method 400 may be performed in association with an application. An application is a software component that executes on the processing device, interfacing with hardware and software components of the device. An application comprises one or more programs designed to carry out operations and is associated with a UI. In examples, an application may comprise a UI that is usable to control an application. In examples, a UI may comprise an application command control. An application command control is a graphical control element that interfaces with an application that executes on the processing device (e.g., memory, processor and functions of mobile device) and software components such as an operating system (OS), applications executing on a mobile device, programming modules, input methods (e.g., soft input panel (SIP)) and command containers such as a pane or contextual menu, among other examples. As an example, an application command control is used to control execution of actions/commands for the application. An SIP is an on-screen input method for devices (e.g., text input or voice input), and a pane is a software component that assists function of other software running on the device such as the OS and other software applications, among other examples. In some examples, an application command control may be integrated within an application. For instance, an application command control may be able to be launched, closed, expanded or minimized when an application is launched, closed, expanded or minimized. In other examples, an application command control is executable as its own application that interfaces with another application. For instance, an application command control may be able to be launched, closed or minimized separately from the launching of an application that is controlled by the application command control.
Method 400 begins at operation 402 where a display size associated with a processing device is detected. A processing device may be any device comprising a display screen, at least one memory that is configured to store operations, programs, instructions, and at least one processor that is configured to execute the operations, programs or instructions such as an application/application command control. Display size is a measurement of viewable area for display on a processing device. As an example, display size is a measurement associated with active viewable image size of a processing device. In other examples, display size may be associated with a nominal size value. In one example, detecting of the display size comprises detecting a measurement value for the screen diagonal of a display of a processing device. In another example, detecting of the display size comprises detecting a display width (e.g., the width of the display for the processing device or the operating size of a display window for an application executing on the processing device). Examples of a display size may comprise physical image size or logical image size, among other examples. Operation 402 may comprise a program instruction or module that can identify and evaluate system specifications for a processing device such as a mobile device. In one example, the programming instruction implemented in operation 402 identifies a type or version of the processing device and executes a fetch of data to identify system information of the processing device. In another example, a programming instruction or module may reference manufacturer specification information to determine a value associated with display size of a processing device.
Factors that may be evaluated to determine a display size include but are not limited to: dot density (e.g., dots per inch (DPI)), pixel density (e.g., pixels per inch (PPI)), physical size of a screen/display, screen diagonal of a display of a processing device, use case distance of a display from a user, display length, and display width, among other examples. As an example, display size may be a measurement value associated with effective resolution of a display for a processing device. Measurement of effective resolution is an example of a value used to evaluate display form factors with a common metric, and enables UI scaling to be classified into different display classes. However, one skilled in the art will recognize that any common metric relative to display size can be applied in exemplary method 400. In alternative examples, factors other than display size may impact UI adaptation. Examples include but are not limited to: processing device orientation, processing device operational mode (e.g., keyboard mode, touch mode, handwriting/ink mode, etc.), window size, screen aspect ratio, and screen effective resolution, among other examples.
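By way of illustration only and not limitation, the following TypeScript sketch shows one way such display size metrics (e.g., screen diagonal and effective resolution) might be derived from reported pixel dimensions and pixel density; the identifiers, the scale factor, and the formulas are assumptions made for this example and do not reflect any particular platform API.

    // Illustrative sketch: deriving display-size metrics from reported pixel
    // dimensions and pixel density. Names and formulas are assumptions for
    // this example, not a specific platform API.

    interface DisplayInfo {
      widthPx: number;     // physical horizontal resolution in pixels
      heightPx: number;    // physical vertical resolution in pixels
      ppi: number;         // reported pixels per inch
      scaleFactor: number; // UI scaling factor applied by the platform (e.g., 3)
    }

    // Physical screen diagonal in inches, from pixel dimensions and density.
    function diagonalInches(d: DisplayInfo): number {
      return Math.hypot(d.widthPx, d.heightPx) / d.ppi;
    }

    // Effective resolution: the logical pixels available to the UI after
    // platform scaling, one common metric for comparing display form factors.
    function effectiveResolution(d: DisplayInfo): { width: number; height: number } {
      return {
        width: Math.round(d.widthPx / d.scaleFactor),
        height: Math.round(d.heightPx / d.scaleFactor),
      };
    }

    // Example: a 1920x1080 panel at 441 PPI (roughly a 5-inch phone display).
    const phone: DisplayInfo = { widthPx: 1920, heightPx: 1080, ppi: 441, scaleFactor: 3 };
    console.log(diagonalInches(phone).toFixed(1)); // "5.0" (inches)
    console.log(effectiveResolution(phone));       // { width: 640, height: 360 }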
Flow proceeds to operation 404 where a display class is determined based on the detected display size of a processing device. Display class determination provides an abstraction for determining the size of a display. A display class can be defined for processing devices having display sizes that fall within the range associated with the display class. Code can query display class information to determine a UI instance to instantiate depending on the display size of the processing device that an application is running on. That is, display classes act as transition points for UI experiences. Display class is a value that is determined based on a maximum display size. The value for a display class may be in any form, including numeric values and elements of speech, as examples. For instance, display classes may be set to correspond with different types of processing devices (e.g., laptops, PCs, tablets, phones, etc.), where an exemplary display class may be “<=Phone” or “<=Tablet”. In another example, display classes may be set based on numeric values. For example, a display class may be identified using numeric values (e.g., 0 to 3 inches). In any of these examples, display classes are used to classify processing devices in accordance with display size. For example, a display class may be set for processing devices having a display size falling in a range from 0 to 3 inches, where another display class may be set for processing devices having a display size in a range from 3.1 to 5 inches, and so on. A range for values of display classes may fall between 0 and infinity. In one example, operations for display class determination are written in the style of successive less than or equal to (<=) checks, with an else for everything greater than a defined display class. In this example, additional display class designations may be easily added without having to change operational code behavior. However, one skilled in the art will recognize that display class designations, including minimum and/or maximum values for ranges of display classes, can be defined in any possible way that can be useful in defining user interface interaction. In examples, a minimum value of a display class may be a value that is equal to or greater than a maximum value of the display class which is directly smaller than the display class being defined. For instance, as in an example above, a first display class may correspond to a range for devices having displays between 0 and 3 inches, and a minimum value of a second display class may take into account a maximum value of the first display class (e.g., 3 inches) and set the minimum value of the second display class at 3.1 inches, for instance. Display classes may be changed over time based on programmer prerogative, analysis/testing/use cases, etc.
Operation 404 may comprise one or more programming operations for determining an active display class and reacting to any changes in display class, such as when a processing device of a different display size is connected, an application window changes to a different display, or an effective resolution is changed on a processing device, among other examples. In one example, an application programming interface (API) utilizing a shared library of data (e.g., a dynamic link library (DLL)) is used to determine a display class. As one example, exemplary operational code associated with a display class determination (e.g., a display class event) is not limited to but may be similar to the following:
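The TypeScript sketch below is purely illustrative; the DisplayClass names, the inch thresholds, and the function names are hypothetical assumptions and are not drawn from any particular library or shipped implementation.

    // Illustrative sketch of display class determination via successive
    // "<=" checks with an "else" for everything larger. The DisplayClass
    // names and inch thresholds are hypothetical, not normative.

    enum DisplayClass {
      Phone = "Phone",              // <= 5 inches
      Phablet = "Phablet",          // <= 7 inches
      Slate = "Slate",              // <= 10 inches
      LargeScreen = "LargeScreen",  // everything greater
    }

    function determineDisplayClass(diagonalInches: number): DisplayClass {
      // Successive <= checks; adding a new class only requires a new branch.
      if (diagonalInches <= 5) return DisplayClass.Phone;
      if (diagonalInches <= 7) return DisplayClass.Phablet;
      if (diagonalInches <= 10) return DisplayClass.Slate;
      return DisplayClass.LargeScreen;
    }

    // Example: query the class at runtime to choose a UI instance to instantiate.
    const displayClass = determineDisplayClass(5.0);
    console.log(displayClass); // "Phone"

In this style, an additional display class designation only requires adding another branch, without changing the behavior of existing operational code.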
Once a display class is determined (operation 404), flow proceeds to operation 406 where a UI is launched based on the determined display class. A scaled model of a UI may be associated with a display class, and operation 406 launches the UI scaled model that is associated with the determined display class. For example, if a determined display class is a class associated with small screen devices (e.g., display sizes less than 4 inches), then an application and application command control are launched that are adapted for small screen devices.
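Purely as an illustrative sketch, and using the 4-inch small-screen example above, the launch-time selection of a UI scaled model might resemble the following TypeScript; the model names and the launch function are assumptions and are not part of an actual implementation.

    // Illustrative sketch: launching the UI scaled model associated with the
    // determined display class. Names and the 4-inch threshold are hypothetical.

    type UiScaledModel = "SmallScreenUI" | "LargeScreenUI";

    function launchUiForDisplaySize(diagonalInches: number): UiScaledModel {
      // A display class covering devices with displays under 4 inches maps to
      // the UI scaled model adapted for small screen devices.
      const model: UiScaledModel =
        diagonalInches < 4 ? "SmallScreenUI" : "LargeScreenUI";
      console.log(`Launching application and application command control with ${model}`);
      return model;
    }

    launchUiForDisplaySize(3.5); // Launching ... with SmallScreenUI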
Flow proceeds to operation 408 where a UI scaled model that is launched (operation 406) is displayed on the processing device. In some examples, multiple processing devices may be connected, for example, where a mobile phone is connected to a personal computer or a laptop is connected to a docking station with large screen display(s), among other examples. In some examples, connecting of multiple devices may result in displaying of a scaled UI model on one processing device (e.g., where a laptop connected to a docking station with a larger screen displays a scaled UI adapted for the larger screen). In alternative examples, a user interface may be displayed on a first processing device (e.g., mobile phone) at a first scaled model designed for the determined display class of the first processing device and the user interface may be displayed on a second processing device (e.g., personal computer) at a second scaled model designed for the determined display class of the second processing device.
In examples described herein, UI can be adapted to accommodate both large screen devices and small screen devices. Method 500 begins at decision operation 502 where it is determined whether a display size change is detected or connection of another processing device is detected. Decision operation 502 may comprise one or more programming operations for determining an active display class and reacting to any changes in display class. As an example, determination of a display size change may be detection of a changed display width of the first processing device, such as when a change in resolution occurs. As an example, connection of another processing device may be connecting a mobile phone to a large screen processing device. However, one skilled in the art will recognize that the present disclosure is not limited to such examples. In examples, operations (e.g., an API) may be executed on a processing device (e.g., in the background or while an application or user interface component is running) to detect potential changes in display class. If no display size change or connection of another device is detected, flow branches NO and processing of method 500 ends.
If a display size change or connection of another device is detected, flow branches YES and proceeds to operation 504 where a display class change event is initiated. Display class change events may be associated with exemplary operational code described above with respect to method 400. In additional examples, exemplary operational code used to evaluate display class changes for display class change events is not limited to but may be similar to the following:
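The TypeScript sketch below is offered for illustration only; the watcher class, event handler shape, and inch thresholds are hypothetical assumptions rather than code from an actual library.

    // Illustrative sketch: raising a display class change event when a display
    // size change or connection of another device is detected. All identifiers
    // are hypothetical.

    type DisplayClassName = "Phone" | "Phablet" | "Slate" | "LargeScreen";

    function classify(diagonalInches: number): DisplayClassName {
      if (diagonalInches <= 5) return "Phone";
      if (diagonalInches <= 7) return "Phablet";
      if (diagonalInches <= 10) return "Slate";
      return "LargeScreen";
    }

    type DisplayClassChangedHandler =
      (oldClass: DisplayClassName, newClass: DisplayClassName) => void;

    class DisplayClassWatcher {
      private current: DisplayClassName;
      private handlers: DisplayClassChangedHandler[] = [];

      constructor(initialDiagonalInches: number) {
        this.current = classify(initialDiagonalInches);
      }

      onDisplayClassChanged(handler: DisplayClassChangedHandler): void {
        this.handlers.push(handler);
      }

      // Called when a display size change or a newly connected device is detected.
      notifyDisplaySize(diagonalInches: number): void {
        const next = classify(diagonalInches);
        if (next !== this.current) {
          const previous = this.current;
          this.current = next;
          this.handlers.forEach(h => h(previous, next));
        }
      }
    }

    // Example: a phone (5 inches) is connected to a large external display (24 inches).
    const watcher = new DisplayClassWatcher(5.0);
    watcher.onDisplayClassChanged((oldC, newC) =>
      console.log(`Display class changed: ${oldC} -> ${newC}`));
    watcher.notifyDisplaySize(24); // Display class changed: Phone -> LargeScreen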
At operation 506, a display class is determined based on a changed display size or a detected display size of another connected processing device. Detection of a display size and determination of a display class are described in detail in the description of
If the display class is to be changed, flow branches YES and proceeds to operation 510 where the UI is adapted in accordance with the determined display class. For example, a user interface is adapted to display a scaled model associated with a changed display class. A UI scaled model (e.g., scaled model of a user interface) may be associated with one or more display classes. For instance, a UI scaled model may be associated with a UI model for small screen devices, where the UI model is applicable to multiple display classes (e.g., processing devices having a display size of less than 4 inches may include more than one display class). Another UI scaled model may be associated with a UI model for large screen devices.
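The following TypeScript sketch, provided for illustration only, shows this adaptation step under the assumption that several display classes may share a single scaled model; the class names, model names, and mapping are hypothetical.

    // Illustrative sketch: a UI scaled model may be associated with more than
    // one display class, so the UI is only re-rendered when the mapped model
    // actually changes. Identifiers are hypothetical.

    type DisplayClassName = "Phone" | "Phablet" | "Slate" | "LargeScreen";
    type UiScaledModel = "SmallScreenModel" | "LargeScreenModel";

    const modelByClass: Record<DisplayClassName, UiScaledModel> = {
      Phone: "SmallScreenModel",
      Phablet: "SmallScreenModel",  // shares the small-screen model with Phone
      Slate: "LargeScreenModel",
      LargeScreen: "LargeScreenModel",
    };

    function adaptUi(oldClass: DisplayClassName, newClass: DisplayClassName): void {
      const oldModel = modelByClass[oldClass];
      const newModel = modelByClass[newClass];
      if (oldModel !== newModel) {
        console.log(`Re-rendering UI with ${newModel}`);
      } else {
        console.log("Display class changed, but the scaled model is unchanged");
      }
    }

    adaptUi("Phone", "Phablet");     // model unchanged
    adaptUi("Phone", "LargeScreen"); // Re-rendering UI with LargeScreenModel

Under this assumption, a change in display class triggers a re-render only when the associated scaled model actually differs.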
Flow proceeds back to operation 502 where method 500 may start again upon determining that a display size has changed or a new processing device is connected.
Scaling model 602 is a consistent UI scaling model where one or more components of a UI work the same across all display sizes (e.g., screen sizes). For instance, certain display aspects of a UI may visually appear the same to a user no matter the display class. Some examples of components of a UI that may utilize a consistent UI (602) scaling model comprise but are not limited to, opening screens, loading screens, actions, etc.
Scaling model 604 is a continuous scaling model where one or more components of a UI adapt to available display size. For instance, components of a UI may appear differently depending on whether the display is a small screen display or a larger screen display. Some examples of components of a UI that may utilize a continuous scaling (604) model comprise but are not limited to, context menus, message bars, notification surfaces, etc.
Scaling model 606 is a pivoting UI scaling model where one or more components of a UI radically change once a display size threshold is reached. For instance, components of a UI may be programmed differently depending on whether the display is a small screen display or a larger screen display. Some examples of components of a UI that may utilize a pivoting UI (606) scaling model comprise but are not limited to, file menus, history, application command control, etc. In an example, scaling models can be combined, for instance where a UI scaling model displayed may combine multiple scaling models such as scaling model 604 and scaling model 606. As an example, a transition (e.g., a detected change in display size/display class) between a UI instance of a processing device having a large display and a UI instance of a processing device having a smaller display may be handled in accordance with scaling model 606 (pivoting UI scaling model). While such a change may trigger scaling model 606 to be implemented, at the same time UI instances may also utilize a continuous scaling model (scaling model 604) to display UI elements appropriately to fit a display size of a processing device.
In examples, two or more UI scaling models 602-606 are applied to a display class to enable programmers to adaptively develop a scaled UI model that exhibits a range of behaviors for each display class. In this way, UI scaling models can be intelligently adapted to function best based on constraints (e.g., display size limitations) presented by some display classes.
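For illustration only, a sketch combining a pivoting model with a continuous model might resemble the following TypeScript; the component names, the pivot threshold, and the layout calculation are assumptions for this example.

    // Illustrative sketch: combining a pivoting scaling model (a hard switch at
    // a display size threshold) with a continuous scaling model (layout adapts
    // smoothly to available width). All names and thresholds are hypothetical.

    interface UiLayout {
      commandSurface: "CompactCommandControl" | "FullCommandControl"; // pivoting
      contextMenuWidthPx: number;                                     // continuous
    }

    const PIVOT_THRESHOLD_INCHES = 7; // hypothetical pivot point

    function layoutFor(diagonalInches: number, availableWidthPx: number): UiLayout {
      return {
        // Pivoting: components such as the application command control switch
        // representations once the display size threshold is crossed.
        commandSurface:
          diagonalInches <= PIVOT_THRESHOLD_INCHES
            ? "CompactCommandControl"
            : "FullCommandControl",
        // Continuous: components such as context menus scale with available
        // space, here capped at 480 logical pixels.
        contextMenuWidthPx: Math.min(availableWidthPx * 0.4, 480),
      };
    }

    console.log(layoutFor(5, 360));   // compact command control, 144 px menus
    console.log(layoutFor(24, 1920)); // full command control, 480 px menus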
As shown in
An application/canvas is a portion of a display of a processing device that is designated for display of an application executing on the device. The application/canvas region is the application UI that shows effects implemented by actions executed via an application command control. That is, the application/canvas is the content consisting of, but not limited to, the pages in a workspace or editable portions of an application.
An application command control hosts a majority of an application's command set, organized in a hierarchical structure of individual palettes, chunks, and commands. Further, an application command control may be programmed to dynamically interact with an application and display simultaneously with applications and/or user interface components such as a soft input panel (SIP) or on-screen keyboard. In one example, an application command control may intelligently adapt based on content of an application (e.g., displayed or selected on an application canvas). An application command control comprises a plurality of palettes (command palettes) programmed for application control. A palette is a collection or associated grouping of actions or commands or chunks of commands that can be implemented by an application command control. In one example, palettes of an application command control comprise top-level palettes and drill-in palettes. Each of the top-level palettes and the drill-in palettes is a collection or grouping of rows comprising one or more selectable commands or command elements. As an example, a top-level palette may comprise a highest-level grouping of commands or functionalities, including commands that are more frequently used/more likely to be used by users. A top-level palette may display command listings that can be drilled into and displayed in drill-in palettes.
Organization or grouping of commands in palettes may also be based on command grouping data available to programmers of an application command control. Command grouping data is information relating to the grouping of commands, including associations between commands. For example, text editing features such as bolding, underlining, italicization, superscript and subscript may be associated and commonly used. Ideally, the application command control would include all of these commonly used functions on the same palette. However, due to limitations on the screen size, certain commands may need to be separated. Command grouping data is information that identifies associations and what commands should or should not be separated from each other. For example, an application command control may determine that the maximum number of rows and commands allows displaying of text formatting commands, including a superscript editing command, in a top-level palette but would not also allow displaying of a subscript command. Using the command grouping data, it may be identified that from a functionality and/or usability standpoint, it is best not to separate the superscript and subscript editing commands. For instance, a user who makes a subscript text edit may later look to make a superscript edit or vice versa. Thus, in setting the layout of commands for palettes, programmers of the application command control may display a higher-level command for text editing in a top-level palette, and the superscript and subscript editing commands may be included in a drill-in palette (child palette) of that top-level palette (parent palette) so they are not separated from each other.
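Purely as an illustrative sketch, command grouping data of the kind described above might be represented as follows; the command names, the keep-together constraint, and the palette capacity used here are hypothetical.

    // Illustrative sketch: using command grouping data to avoid separating
    // associated commands (e.g., superscript and subscript) across palettes.
    // Identifiers and the capacity value are hypothetical.

    interface CommandGroup {
      name: string;
      commands: string[];
      keepTogether: boolean; // grouping data: these commands should not be split
    }

    const textEditing: CommandGroup = {
      name: "Text formatting",
      commands: ["Bold", "Underline", "Italic", "Superscript", "Subscript"],
      keepTogether: true,
    };

    // Decide whether a group fits in the remaining rows of a top-level palette;
    // if a keep-together group does not fit, defer the whole group to a drill-in
    // (child) palette rather than splitting it.
    function placeGroup(group: CommandGroup, rowsLeftInTopLevel: number): string {
      if (group.commands.length <= rowsLeftInTopLevel) {
        return `Place "${group.name}" in the top-level palette`;
      }
      if (group.keepTogether) {
        return `Place "${group.name}" behind a drill-in palette (not split)`;
      }
      return `Split "${group.name}" across top-level and drill-in palettes`;
    }

    console.log(placeGroup(textEditing, 3)); // deferred to a drill-in palette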
Examples of common components that make up a top-level palette include but are not limited to: a palette bar and palette title, palette switching feature (including one touch target that launches palette switcher from title of palette bar), command to dismiss palette (e.g., visual representation of ellipses), quick commands (e.g., undo or redo), palette canvas comprising a plurality of commands, chunk commands (e.g., groupings of commands) and chunk dividers (e.g., dividing different groupings of commands), drill-in features to access drill-in palettes (when applicable).
Examples of common components that make up a drill-in palette can include but are not limited to: a palette bar and palette title, command to navigate back to the parent palette, command to dismiss palette (e.g., visual representation of ellipses), quick commands (e.g., undo or redo), palette canvas comprising a plurality of commands, chunk commands (e.g., groupings of commands) and chunk dividers (e.g., dividing different groupings of commands).
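For illustration only, the common palette components listed above could be modeled with TypeScript structures similar to the following; the field names are assumptions and do not correspond to an actual implementation.

    // Illustrative sketch: modeling top-level and drill-in palettes from the
    // common components listed above. Field names are hypothetical.

    interface Command {
      label?: string;      // a label may be omitted for well-known commands
      icon: string;
    }

    interface Chunk {
      commands: Command[]; // a grouping of commands, set off by chunk dividers
    }

    interface PaletteBase {
      title: string;             // palette bar and palette title
      quickCommands: Command[];  // e.g., undo or redo
      chunks: Chunk[];           // palette canvas: rows of commands and chunk dividers
      dismiss(): void;           // command to dismiss the palette
    }

    interface TopLevelPalette extends PaletteBase {
      switchPalette(target: string): void; // palette switching feature
      drillInPalettes: DrillInPalette[];   // drill-in features, when applicable
    }

    interface DrillInPalette extends PaletteBase {
      navigateBack(): void;      // command to navigate back to the parent palette
      parent: TopLevelPalette;
    }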
In one example, palettes of an application command control are presented in a vertical layout. For example, a top-level palette and a drill-in palette are vertically scrollable and comprise a collection of rows comprising one or more selectable command elements. However, in other examples, setting of the layout of a palette may also comprise presenting commands in a horizontal layout where commands are horizontally scrollable. In some examples, no limit is set on the scrollable height of a palette. Scrolling position may be kept on top-level palettes when switching between top-level palettes; however, scrolling position may or may not be kept for drill-in palettes. Commands set and displayed may include labels identifying a command and may be configured to take up an entire row of a palette. In other examples, multiple commands may be displayed in one row of a palette. Scaling is applied to setting and displaying commands in palette rows. In some other examples, commands may not have labels, for example, commands that are well known or have images displayed that are well known to users. Separators or spacers (either horizontal or vertical depending on the layout of a palette) may be displayed to break up different commands or chunks of commands.
In
Reference has been made throughout this specification to “one example” or “an example,” meaning that a particular described feature, structure, or characteristic is included in at least one example. Thus, usage of such phrases may refer to more than just one example. Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more examples.
One skilled in the relevant art may recognize, however, that the examples may be practiced without one or more of the specific details, or with other methods, resources, materials, etc. In other instances, well known structures, resources, or operations have not been shown or described in detail merely to avoid obscuring aspects of the examples.
While sample examples and applications have been illustrated and described, it is to be understood that the examples are not limited to the precise configuration and resources described above. Various modifications, changes, and variations apparent to those skilled in the art may be made in the arrangement, operation, and details of the methods and systems disclosed herein without departing from the scope of the claimed examples.
This application claims the benefit of U.S. Provisional Application No. 62/076,368, filed on Nov. 6, 2014, which is hereby incorporated by reference in its entirety.