A “GUI application” is herein defined as a software application that allows a user to interact with it through various graphical icons and visual indicators.
An “existing GUI application” is herein defined as an application that performs its computation logic using a CPU of the device providing its user interface.
A “local computing device” is herein defined as a computing device, local to the user, that is capable of executing a GUI application.
A “remote device” is herein defined as a computing device that renders a GUI application and processes user interactions from one or more input devices.
The term “visual editing” is herein defined as providing a What You See is What You Get (WYSIWYG) interface that allows the user to customize the existing GUI application.
The term “application flow” is herein defined as the systematic organization of GUI components, screens, data, actions, events and mappings to business logic.
The term “constraints” is herein defined as characteristics of the remote device that restrict the existing GUI application from being rendered in its original form on the remote device. Examples of some common constraints are screen size, screen resolution, single document interface, shorter battery life, slower processor speeds, less available memory, slower network connectivity, data entry capabilities, and limited input options, such as no keyboard or mouse, etc.
An “agent” is herein defined as a software application that intercepts the existing GUI application and manages graphical changes, user interaction, such as mouse clicks, and user inputs, such as keyboard events.
The term “look and feel” is herein defined as the design elements of a GUI, such as colors, shapes, sizes, fonts and other observable characteristics, together with the behavior of dynamic GUI components, such as buttons, text fields, and menus.
The term “layout” is herein defined as the visual structure for a user interface. For example, in a window, the visual structure describes how the child objects are organized with respect to the parent window and to each other. The visual structure can be defined by fixed values of x,y offsets from a certain position, from positions relative to each other, or from positions based on the available space. The visual structure can be generally arranged in a vertical or horizontal list, or arranged in a grid of rows and columns.
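By way of a non-limiting illustration only, the visual structure described above can be sketched as a simple data structure. All class and field names below are hypothetical and are not part of the present teaching; a real layout model would carry many more properties.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FixedPosition:
    """Child placed at fixed x,y offsets from a reference position."""
    x: int
    y: int

@dataclass
class GridPosition:
    """Child placed in a grid of rows and columns."""
    row: int
    column: int

@dataclass
class LayoutNode:
    """One element of a visual structure; children are organized with
    respect to this parent (e.g., a window) and to each other."""
    name: str
    position: Optional[object] = None   # FixedPosition, GridPosition, or None
    children: List["LayoutNode"] = field(default_factory=list)

# A window whose children are arranged in a grid of rows and columns.
window = LayoutNode("MainWindow", children=[
    LayoutNode("OkButton", GridPosition(row=0, column=0)),
    LayoutNode("CancelButton", GridPosition(row=0, column=1)),
])
```

A vertical or horizontal list is simply the degenerate case of a grid with a single column or a single row.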
The term “application definition” is herein defined as a template containing all the metadata necessary to transform the original GUI application, where the transformations are defined by the user in the visual editor and the transformed application is rendered to a remote device.
The term “application representation model” is herein defined as a set of data that describes the entire graphical user interface of the existing GUI application where each data element can have properties, such as text, size, color, location, actions and events.
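By way of a non-limiting illustration only, one data element of an application representation model can be sketched as follows; the field names are hypothetical and chosen solely to mirror the properties listed above.

```python
# One data element of a hypothetical application representation model:
# a single GUI element described by its properties, including the
# events it can generate and any child elements it contains.
node = {
    "type": "Button",
    "text": "Save",
    "size": {"width": 80, "height": 24},
    "color": "#e0e0e0",
    "location": {"x": 10, "y": 200},
    "events": ["click"],
    "children": [],
}
```

The full model is then simply the set of all such data elements for every graphical element of the existing GUI application.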
The term “widget” is defined herein as a graphical control element in a graphical user interface (GUI).
The term “widget toolkit” is defined herein as a set of libraries containing graphical control elements in a GUI.
The term “real time interception” is defined herein as decomposing the original GUI application through the use of operating system application program interfaces (APIs), and then displaying the GUI application in real time for editing by the user.
The present teaching, in accordance with preferred and exemplary embodiments, together with further advantages thereof, is more particularly described in the following detailed description, taken in conjunction with the accompanying drawings. The skilled person in the art will understand that the drawings, described below, are for illustration purposes only. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating principles of the teaching. The drawings are not intended to limit the scope of the Applicant's teaching in any way.
Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the teaching. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
It should be understood that the individual steps of the methods of the present teachings may be performed in any order and/or simultaneously as long as the teaching remains operable. Furthermore, it should be understood that the apparatus and methods of the present teachings can include any number or all of the described embodiments as long as the teaching remains operable.
The present teaching will now be described in more detail with reference to exemplary embodiments thereof as shown in the accompanying drawings. While the present teachings are described in conjunction with various embodiments and examples, it is not intended that the present teachings be limited to such embodiments. On the contrary, the present teachings encompass various alternatives, modifications and equivalents, as will be appreciated by those of skill in the art. Those of ordinary skill in the art having access to the teaching herein will recognize additional implementations, modifications, and embodiments, as well as other fields of use, which are within the scope of the present disclosure as described herein.
Mobile and other portable devices are widely used by most individuals and businesses in the modern world. Modern business organizations are experiencing an increased demand from employees, and other end users, for “bring your own device” (BYOD) support. Today, employees are likely to have at least one mobile device, such as a smartphone or tablet, on their persons at all times. It is highly desirable to provide these employees with access to business enterprise software applications directly from their mobile devices. Such access will greatly improve business productivity.
However, most existing desktop personal computer based software applications are not easily accessed or manipulated from mobile devices. In fact, many of the desktop software applications that are most critical to business operations are difficult or impossible to access with mobile devices. There is a very significant need in the industry to provide organizations with the ability to deliver their existing business desktop applications to a multitude of different mobile and portable devices. Such delivery must be implemented in a manner where the user can interact with the original application in an intuitive and native manner.
Some enterprise software applications have been designed with features that are easily ported to mobile and portable devices. Many of these software applications provide access through native mobile applications for particular platforms, such as iOS and Android, or through mobile browser (HTML5) based portals. However, these mobile versions require constant maintenance and synchronization to guarantee that mobile users have the same feature set as the desktop users of the application. Such maintenance significantly adds to the cost of the software and requires companies to expend more IT resources.
Furthermore, many business organizations have numerous legacy applications that are critical to day-to-day operations. Typically, these software applications have limited or no mobile interface support and require a significant software development effort to add support for a multitude of mobile and portable devices.
One known approach to providing support for mobile and portable devices is to simply stream the desktop application to the mobile device with no changes to the user interface or user experience. This approach is generally unworkable for most businesses because desktop applications are designed for keyboard and mouse interfaces, while mobile devices are designed for touch input on a much smaller screen.
Another known approach to providing support for mobile and portable devices is to internally develop software for a new mobile application that runs natively on the target devices. This approach is undesirable because the source code is typically required and it almost always requires significant software architecture and development resources, as well as a new support system to maintain the same feature set as the original application. Thus, this approach is usually cost prohibitive.
Current solutions to the problem of providing users of mobile and portable devices with full feature access to many desktop software applications do not create an application representation model that can easily be visually edited by providing a real time view of the existing application. In addition, these current solutions do not allow the user to easily manipulate the GUI components or existing GUI application so that they are presented in a desirable or optimized format for the remote device.
Some known methods use transformation services designed to optimize certain graphical components, such as aggregation, reduction, rearrangement, overflow, mashup, or zoom service, for remote devices. However, these services do not allow the user to visually transform the existing GUI application by directly editing the graphical components in a What You See Is What You Get environment via a drag and drop, or other user-friendly methodology.
Today, business organizations are looking to easily mobilize their existing desktop software applications through systems that provide mobile experience virtualization (MXV). MXV optimizes the existing desktop applications' interfaces for end user devices, taking into account physical screen size, resolution, touch interface and native look and feel. These systems have simple interfaces and give the user the ability to customize and optimize the screen without requiring any software development skills. Furthermore, these systems can deliver existing desktop applications to mobile clients in a timely and cost effective manner.
One feature of the methods and apparatus of the present teaching is the existing GUI application can be rendered on the remote device using a native web browser. This feature is advantageous for some applications because custom applications are not needed. Furthermore, through the use of HTML5 elements, the transformed application achieves the look and feel of a native application executing on the remote device.
Another feature of the methods and apparatus of the present teaching is that the methods and apparatus perform real time interception of an existing GUI application by decomposing the original GUI application through the use of operating system APIs, and then displaying the GUI application in real time for editing by the user. As new windows are opened in the existing GUI application, they will be immediately available for customizing their graphical components for a remote device. Such a feature is efficient and easy to use, and does not require software development.
The Reddo UX Connector 102 intercepts the Existing GUI Application 104 and provides bi-directional communication between the Remote Devices 114, 116, and 118 and the Existing GUI Application 104, thereby enabling the delivery of the Existing GUI Application 104. The term “delivering the existing GUI application” is herein defined as performing the following steps: (1) rendering the GUI, in whole or in part, based on the application definition; and (2) capturing user interactions/actions from the remote device, such as clicks, keyboard entry and mouse movement, and executing the interaction/action in the existing application. A simple example of delivering the existing GUI application is that if a button is pressed on the remote device, that action will be executed in the existing application.
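By way of a non-limiting illustration only, the two steps of delivering the existing GUI application can be sketched as follows. The classes are hypothetical stand-ins, not the Reddo UX Connector implementation.

```python
class RemoteDevice:
    """Hypothetical stand-in for a phone or tablet client."""
    def __init__(self, events):
        self._events = events
        self.rendered = None
    def render(self, definition):
        self.rendered = definition
    def captured_events(self):
        return list(self._events)

class ExistingApp:
    """Hypothetical stand-in for the existing desktop GUI application."""
    def __init__(self):
        self.executed = []
    def execute(self, event):
        self.executed.append(event)

def deliver_application(app_definition, remote_device, existing_app):
    # Step (1): render the GUI, in whole or in part, from the definition.
    remote_device.render(app_definition)
    # Step (2): capture user interactions/actions from the remote device
    # and execute each one in the existing application.
    for event in remote_device.captured_events():
        existing_app.execute(event)

device = RemoteDevice(events=[("click", "SubmitButton")])
app = ExistingApp()
deliver_application({"screens": ["Main"]}, device, app)
# The button press on the remote device is executed in the existing app.
```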
In order to provide a highly productive user experience when delivering the Existing GUI Application 104 to a remote device, the interception and communication must include data describing both the graphical elements and events they generate. One feature of the present teaching is that data describing both the graphical elements and events can be provided with the Reddo UX Connector 102 that uses low-level operating system hooks to retrieve necessary information from a widget toolkit.
In many modern operating systems, GUI applications are created using one or more frameworks referred to as widget toolkits. A widget toolkit is a set of libraries containing graphical control elements, which are referred to in the art as widgets. A software developer utilizes appropriate widgets and arranges them in windows or screens while building the GUI application. The individual widgets can vary in shapes, sizes and complexity, but the toolkit's objective is to facilitate a Graphical User Interface, instead of a command line interface where commands are fed to the system one at a time. Modern widget toolkits contain components, such as forms for entering text data, buttons for initiating actions and grids for displaying tabular data. Many widget toolkits allow a user to interact with the graphical system by using event-driven programming. When a user initiates an event, such as a button click, it is detected and passed to the underlying application. The application performs one or more actions based on the event, and can update the screen to allow more user interaction.
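By way of a non-limiting illustration only, the event-driven programming model described above can be sketched with a minimal widget. The names are hypothetical and do not correspond to any particular widget toolkit.

```python
class Button:
    """Minimal widget sketch: detects a user event and passes it to the
    underlying application via registered handlers."""
    def __init__(self, label):
        self.label = label
        self._handlers = []
    def on_click(self, handler):
        # The application registers an action to perform on the event.
        self._handlers.append(handler)
    def click(self):
        # The toolkit detects the user event and passes it on.
        for handler in self._handlers:
            handler(self)

screen = []                               # stand-in for the display
save = Button("Save")
save.on_click(lambda b: screen.append(f"{b.label} pressed"))
save.click()                              # user initiates the event
```

The application can then update the screen in the handler to allow further user interaction, exactly as described above.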
The two types of widget toolkits generally found in modern software systems are called low-level and high-level widget toolkits. Low-level widget toolkits are either integrated directly into the operating system or run as a separate layer on top of the operating system. Because they are tightly coupled with the host operating system, low-level widget toolkits are closely tied to a particular operating system and traditionally have a speed advantage because of a lack of abstraction when being executed. Examples of widely used low-level widget toolkits include the Windows API used in Microsoft Windows, the Cocoa toolkit used in Apple Mac OS X, and XLib (also known as libX11) used in the Linux operating system.
High-level widget toolkits abstract each individual component and allow a GUI to be executed on multiple platforms. Each low-level toolkit contains slightly different widgets, but a high-level toolkit must contain a superset of available widgets and map them to each corresponding widget within an operating system. For example, the Swing toolkit found within Java can execute the same GUI application on Windows, Mac OS X, and Unix/Linux operating systems.
The Reddo UX Connector 102 includes several sub components, including the Process Controller 106 and Broker 106′, the State Manager 108 and the Reddo Web App 110. The Process Controller 106 is a software program executed on the desktop computing device and is connected to the Existing GUI Application 104 through inter-process communication. In this embodiment, the Process Controller 106 receives a command from the Broker 106′ through a socket connection and executes that command on the Existing GUI Application 104. For example, the Process Controller 106 can receive commands to invoke the Existing GUI Application 104 or to execute a mouse click on a particular GUI element. The Process Controller 106 also monitors the program state, GUI interface and events within the Existing GUI Application 104, and reports the program state, GUI interface and events to the Broker 106′. For example, if a GUI element within the Existing GUI Application 104 has changed state and an event is fired, the Process Controller 106 can relay both the GUI element status and the event information to the Broker 106′.
The Broker 106′ is a software program executed on the desktop computing device that provides bi-directional communication of messages between the State Manager 108 and one or more Process Controllers 106. For example, the Broker 106′ can receive a message from Process Controller 106 that a GUI element within the Existing GUI Application 104 has changed state. The Broker 106′ relays the message to the State Manager 108. Also, the Broker 106′ can receive messages from the State Manager 108, process the message, and then send it to the corresponding Process Controller 106. For example, the Broker 106′ can receive a message that a user has made a selection within a grid GUI element displayed in Existing GUI Application 104. The Broker 106′ can send the message to the corresponding Process Controller 106 for further processing.
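By way of a non-limiting illustration only, the Broker's routing role can be sketched as follows. All classes are hypothetical stand-ins and do not represent the actual Broker 106′ implementation.

```python
class StateManager:
    """Stand-in: records messages relayed from process controllers."""
    def __init__(self):
        self.messages = []
    def receive(self, app_name, message):
        self.messages.append((app_name, message))

class ProcessController:
    """Stand-in: executes commands on one existing GUI application."""
    def __init__(self):
        self.commands = []
    def execute(self, message):
        self.commands.append(message)

class Broker:
    """Relays messages bi-directionally between one state manager and
    one or more process controllers."""
    def __init__(self, state_manager):
        self.state_manager = state_manager
        self.controllers = {}           # application name -> controller
    def register(self, app_name, controller):
        self.controllers[app_name] = controller
    def from_controller(self, app_name, message):
        # e.g., a GUI element changed state in the existing application
        self.state_manager.receive(app_name, message)
    def from_state_manager(self, app_name, message):
        # e.g., a user selection made within a grid GUI element
        self.controllers[app_name].execute(message)

sm = StateManager()
broker = Broker(sm)
pc = ProcessController()
broker.register("FileZilla", pc)
broker.from_controller("FileZilla", "grid element changed state")
broker.from_state_manager("FileZilla", "user selected grid row 3")
```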
The State Manager 108 is a software program that stores and synchronizes the application representation model for a plurality of Existing GUI Applications 104, and provides for bi-directional communication between the Reddo Web App 110 and one or more Brokers 106′. The State Manager 108 is described in further detail in connection with
The Reddo Web App 110 is a software application that executes within the Reddo App Server 112 and renders one or more Existing GUI Applications 104 to one or more remote devices. The Reddo Web App 110 communicates user interactions initiated within the delivered GUI application to the Reddo State Manager 108. Furthermore, the Reddo Web App 110 receives messages from the Reddo State Manager 108 and updates the rendered application on the remote device.
The Reddo App Server 112 is a software application server that executes the Reddo Web App 110 and executes the Reddo App Server Store 120. The Reddo App Server Store 120 is a software database that contains one or more application definitions. The application definitions are created, edited, saved and removed by the Reddo Adaptive UX Planner 122, and are retrieved for use by the Reddo Application Server 112. For example, Reddo App Server Store 120 can contain the application definition for an existing application targeted for the remote mobile device. If a connection from the remote mobile device is established to the Reddo Web App 110, the remote mobile device is identified and the corresponding application definition is retrieved from the Reddo App Server Store 120. The Reddo App Server 112 can then deliver the Existing GUI Application 104 to the remote mobile device 116, 118.
The Reddo Adaptive UX Planner 122 allows a user to select the target device from a list, and the user of the Planner 122 can then drag and drop components from the Existing GUI Application 104 into a graphical representation of the remote device. This allows the user to add one or more screens to optimize application flow, to change the look and feel of certain components to make them more accessible on the remote device, and to add new components that did not exist in the Existing GUI Application 104. Once the user of the Planner 122 has achieved the desired functionality, the application definition is saved to the Reddo App Server Store 120. When a remote device, such as the iPhone 124, shown as an image of an iPhone in the Reddo Adaptive UX Planner 122, connects to the Reddo App Server 112 and requests an existing application, it is identified as an Apple iPhone device and the application definition is then retrieved. The user is thus able to interact with the existing GUI application through a standard web browser running on a personal computer, using an optimized interface targeted for the iPhone mobile device.
In the specific embodiment illustrated in
The Menu Bar Widget 202 (
The Panel Node 236 shown in
The Panel Node 244 corresponds to the Panel Widget 206 illustrated in
Panel Node 254 contains one child node that represents Grid Node Tasks 256. Grid Node Tasks 256 correspond to the Grid Widget 210 that is illustrated in
The Reddo Adaptive UX Planner 122 (
The State Manager 302 facilitates bidirectional communication with the clients via Application Context 314 and receives user interactions from the Clients 304, 304′ via the Input Invoker 316. The Application Context 314 stores the particular application representation model for the existing GUI application. In addition, the Application Context 314 sends changes within the existing GUI application to the appropriate Clients 304, 304′ and also sends user input from the Input Invoker 316 to the appropriate Broker 308, 308′.
Client 304 executes within an ASP.NET Container 324. For example, Client 304 can deliver an application definition of the Existing GUI Application FileZilla 318 through the Application Context 314 and the Input Invoker 316. An embodiment of a method of delivering an Existing GUI Application Calculator 322 according to the present teaching is as follows: Client 304 first establishes a connection to the ASP.NET web server, where Client 304 is a remote computing device for which an application definition residing in the Application Context 314 was targeted. The current state of the executing GUI Application Calculator 322 is then rendered on Client 304.
Client 304 then begins to interact with the Existing GUI Application Calculator 322. The user initiates the interaction by sending an input event, such as a tap, click or keystroke. The resulting Client Input is then sent to the Input Invoker 316 executing within the State Manager 302. The Client Input invocation includes the event metadata as well as the associated identifying information of the Application Context 314.
The Input Invoker 316 then passes the input event to Application Context 314, which then processes the event, updates the application representational model and sends the event to Broker 308. Broker 308 sends the event to the Process Controller 310 that originally executed the Existing GUI Application Calculator 322. The Process Controller 310 simulates user input on the Existing GUI Application Calculator 322. The Existing GUI Application Calculator 322 then responds to the user input in the same manner as a direct user interaction.
Process Controller 310 then sends all changes to the Application Context 314 after the existing GUI application has responded to the user input. The Application Context 314 updates the application representational model, and the current state of the existing GUI application is rendered on Client 304 via the ASP.NET web server 324.
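By way of a non-limiting illustration only, the round trip described above can be sketched as follows. Every class is a hypothetical stand-in, and the Broker/Process Controller hop is collapsed for brevity; this is not the actual Reddo implementation.

```python
class ExistingApp:
    """Stand-in for the Existing GUI Application Calculator."""
    def __init__(self):
        self.display = "0"
    def simulate_input(self, key):
        # Responds to simulated input in the same manner as a direct
        # user interaction.
        self.display = key

class ApplicationContext:
    """Stand-in: processes client events and keeps the application
    representation model synchronized with the existing application."""
    def __init__(self, app):
        self.app = app
        self.model = {}
    def process_event(self, event):
        # In the full system the event travels via the Broker to the
        # Process Controller, which simulates the user input.
        self.app.simulate_input(event)
        # All resulting changes flow back, and the representation model
        # is updated before being rendered on the client.
        self.model["display"] = self.app.display
        return self.model

context = ApplicationContext(ExistingApp())
rendered = context.process_event("7")   # a tap on the remote calculator
```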
Referring to both
More specifically, referring to the block diagram 400, the method of an embodiment of the present teaching where existing GUI applications are executed within a Citrix virtualization environment and delivered to the remote device 412 is as follows. The remote device 412 indicates to Citrix WorxWeb 410 the application that it desires to be delivered. Citrix WorxWeb 410 then communicates the desired application to the Reddo Application Server 406. The Reddo Application Server 406 generates a Uniform Resource Locator (URL) address and transmits the URL address back to the Citrix WorxWeb 410. The Citrix WorxWeb 410 then transmits the URL address to the Citrix Receiver 408.
The Citrix WorxWeb 410 then publishes the desired application to the Reddo UX Connector 402, which includes the Process Controller 106 and the Broker 106′ that were described in connection with
The existing GUI applications are shown as virtual desktops 506 operating within the VMware Horizon virtualization environment. Various virtual desktops 506 are shown running different Microsoft Windows operating systems, such as Windows XP, Windows Vista, Windows 7 and Windows 8. The various virtual desktops 506 communicate with infrastructure hardware 508 running the VMware Horizon virtualization environment. Referring to both
The Reddo Web Server 512 communicates with the internal network 502 directly and with the external network 504 via a Demilitarized Zone (DMZ) 526. The DMZ 526 can include software components, such as the Load Balancer 528 and View Security Servers 530, 530′ that manage access to the virtual desktops 506.
Various Remote Devices 532, such as iPads, iPhones, Android Tablets, and Android Phones, are supported by the VMware Horizon virtualization environment. These devices execute a VMware Horizon native application to gain access to the virtualization environment. These devices can communicate directly with the internal network 502 and can communicate with the external network 504.
Referring to the left side of the Visual Editor 600 screen in
Referring to the right side of the Visual Editor 600 in
The process of visually editing the application flow of the existing GUI application to generate an application definition targeted to a particular remote device of an embodiment of the present teaching begins by selecting one or more components from the existing GUI application on the left side of the Visual Editor 600 screen. Selecting the one or more components then reveals a set of tools available for the end user that assists with transferring the component to the target device. If the system can determine that the component is of a known type, for example, a text box or a button, it will suggest to the user a particular user interface template for that component to properly fit the remote device.
The user of the Visual Editor 600 then drags and drops the component into the target device on the right side of the screen. The target device simulates what that component will look like on the actual device. When the user clicks an element in the mobile screen, a context menu for that element is presented to the user. From this menu, the user of the Visual Editor 600 can perform various tasks, such as setting trigger/actions, specifying the component template, changing the style and changing the component settings, such as column order, text etc.
In some embodiments of the present teaching, the target device, such as a mobile device, has limited screen real estate. In these embodiments, it is useful for the user of the Visual Editor 600 to have the ability to edit the user flow by creating multiple screens out of one existing GUI window. If the original application has numerous form fields, lists and other components laid out in a vertical flow, the original window can be broken up into a plurality of logical screens, where each logical screen contains a part of the original window from the existing GUI application.
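By way of a non-limiting illustration only, breaking one window with a vertical flow into a plurality of logical screens can be sketched as follows; the function and component names are hypothetical.

```python
def split_into_screens(components, per_screen):
    """Break one existing GUI window, laid out as a vertical flow of
    components, into multiple logical screens sized for a small
    display. Each logical screen contains a part of the original
    window from the existing GUI application."""
    return [components[i:i + per_screen]
            for i in range(0, len(components), per_screen)]

# Form fields, lists and other components in the original vertical flow.
window = ["NameField", "AddressField", "TaskList", "NotesField", "SaveButton"]
screens = split_into_screens(window, per_screen=2)
# -> [["NameField", "AddressField"], ["TaskList", "NotesField"], ["SaveButton"]]
```

In practice, the user of the Visual Editor 600 draws these boundaries manually, grouping logically related components rather than splitting at a fixed count.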
In some embodiments, the user desires to deliver a Task List Window 608 to a mobile computing device, such as an iPhone. The Task List Window 608 will be displayed as a data grid on the Target Device 606. Through the use of real-time interception, the existing GUI application is then loaded into the Reddo Adaptive UX Planner 122 that was described in connection with
In some embodiments, the user can add a component to the Remote View 604 not present within the Existing GUI Application 602. Possible components may include buttons, labels or navigation aids. In these embodiments, the user selects a new widget from an available list and drags and drops the components to the Remote View 604. The user can also add a new event to an existing or new component, where the new event is not present within the Existing GUI Application 602.
Once the user has completed the visual editing, the application definition is stored in a database within the Reddo Adaptive UX Planner 122 (
In another embodiment of the present teaching, two or more existing GUI applications share the same application definition. For example, Application Definition 2 706 is defined by Existing GUI Application 1 702, Existing GUI Application 2 714, and Existing GUI Application k 716, where each of Existing GUI Application 1 702, Existing GUI Application 2 714, and Existing GUI Application k 716 is targeted to Remote Device 2 710. Existing GUI Application k 716 is also associated with Application Definition k 718, which is targeted to Remote Device k 720. In some embodiments of the present teaching, pieces from a plurality of existing GUI applications are combined or “mashed-up” into one application definition.
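By way of a non-limiting illustration only, the sharing relationship described above can be sketched as a simple mapping; the names below are hypothetical placeholders, not the reference numerals of the figures.

```python
# Several existing GUI applications can contribute to one application
# definition ("mash-up"), and one application can contribute to several
# definitions, each targeted at a different remote device.
application_definitions = {
    "Definition-2": {
        "sources": ["App-1", "App-2", "App-k"],   # shared by three apps
        "target": "RemoteDevice-2",
    },
    "Definition-k": {
        "sources": ["App-k"],
        "target": "RemoteDevice-k",
    },
}

def definitions_for(app_name):
    """All application definitions that draw components from a given
    existing GUI application."""
    return [name for name, d in application_definitions.items()
            if app_name in d["sources"]]
```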
The application definition that describes Remote Screen 840 and Remote Screen 842 can be created using the Reddo Adaptive UX Planner, which was described in connection with
Windows Application 802 contains four GUI widgets that are targeted for Remote Screen 840 and Remote Screen 842. The GUI widgets are GUI Panel 808, GUI Panel 810, GUI Panel 812, and GUI Panel 814. Windows Application 804 contains one GUI widget that is targeted for Remote Screen 840. The GUI Widget is GUI Panel 816. Windows Application 806 contains three GUI widgets that are targeted for Remote Screen 840 and Remote Screen 842. The GUI widgets are GUI Chart 818, GUI Panel 820 and GUI Grid 822.
Remote Screen 840 is a rendering including one component from each of three existing GUI applications. The GUI Panel 808 is positioned on the top left side of Remote Screen 840. GUI Panel 816 is positioned on the bottom left side of Remote Screen 840, and takes up the available height within the viewport. Finally, GUI Grid 822 is positioned on the right side of Remote Screen 840, taking up the full height within the viewport.
Remote Screen 842 is a rendering consisting of multiple components from two existing GUI applications, Windows Application 802 and Windows Application 806. GUI Chart 818 is positioned on the left side of Remote Screen 842, centered vertically within the viewport. GUI Panel 820 is positioned on the bottom right of Remote Screen 842. GUI Panel 810 is positioned on the top right side of Remote Screen 842. GUI Panel 812 is positioned in the center of Remote Screen 842. Finally, GUI Panel 814 is positioned to the center right of Remote Screen 842.
While the Applicants' teaching is described in conjunction with various embodiments, it is not intended that the Applicants' teaching be limited to such embodiments. On the contrary, the Applicants' teaching encompasses various alternatives, modifications, and equivalents, as will be appreciated by those of skill in the art, which may be made therein without departing from the spirit and scope of the teaching.
The present application is a non-provisional application of U.S. Provisional Application Ser. No. 61/970,438, filed on Mar. 26, 2014, entitled Project Igloo and EMS Breakdown. The entire contents of U.S. Provisional Patent Application No. 61/970,438 are herein incorporated by reference.
Number | Date | Country
--- | --- | ---
61970438 | Mar 2014 | US