This invention pertains in general to editing web content, and in particular to a user interface that allows a user to make edits to a web page on a smartphone, tablet, or other mobile client device.
In many parts of the world, people have come to expect every organization, event, and public figure to have a website. As a result, many web editing applications have been developed to allow users who are unfamiliar with HTML, XML, JavaScript, CSS, or other web design tools to create and maintain professional and aesthetically pleasing websites. Some of these web editing applications allow users to change or add content to a web page.
Conventional web editing applications may not be optimized for editing or viewing web pages on mobile client devices such as smartphones and tablets. However, many users prefer to use tablet computers, smartphones, and other mobile devices to perform their computing tasks. For example, users may favor the flexibility of working in different locations or the intuitive touchscreen interface that is present on most modern-day mobile devices.
One critical drawback of mobile devices is that they generally have smaller screens than other client devices. As a result, mobile devices are poorly suited to viewing and editing web pages.
Embodiments of the invention include a method, a non-transitory computer readable storage medium, and a system for providing a user interface for editing a web page. A web editing application provides user interfaces for creating and editing web pages on a mobile device. The web editing application may provide users with previews of web page templates for use in creating and/or editing web pages. Upon user selection of a web page template, the web editing application provides a user interface for editing the web page. Edited web pages, or changes to web pages, are sent to a web editing server after editing occurs. In one embodiment, users can edit web pages while a client device is offline, and changes are queued for sending to the web editing server once the client device is online.
A web editing application executing on a client device interacts with a web rendering module in the operating system to render and display the web page. The web rendering module also generates layout data that describes the position and size of each visual element on the rendered web page, and the web editing application uses the layout data to generate a native overlay. The native overlay is an arrangement of one or more cells, and each cell has the same position and size as a respective visual element or group of visual elements. The web editing application displays the native overlay on top of the rendered web page so that each visual element or group of visual elements on the rendered web page is aligned with a cell in the native overlay.
In one embodiment, the web editing application determines whether a display of the client device is in landscape mode or portrait mode, and arranges certain cells differently based on this determination. The web editing application may further provide cell configuration indicators that indicate cells that are arranged differently in different display modes. Providing this functionality allows for more efficient use of screen space of a display. This may be particularly advantageous for mobile client devices, which may have smaller displays, because it allows users to see more content on the screen without having to scroll or zoom.
Embodiments of the computer-readable storage medium store computer-executable instructions for performing the steps described above. Embodiments of the system further comprise a processor for executing the computer-executable instructions.
The features and advantages described in the specification are not all inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.
The figures depict embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
System Overview
The web editing server 110 stores web pages created by users and provides templates for new web pages. As used herein, a web page is a data item that can be rendered to generate a page of content with one or more visual elements. Examples of visual elements include images, videos, headings, and body text. In some embodiments, some elements of a web page may contain other elements. For example, a column element may contain body text or image elements. Web pages can also include interactive visual elements that change appearance automatically or in response to a user interaction with the visual element. For example, a web page may include a slideshow element that displays a series of predetermined images. The slideshow may switch to a different image after a certain time interval has elapsed or in response to a user input. As another example, a web page may include a map element that presents a map with interactive zooming and panning functions.
In some embodiments, the web editing server 110 is implemented as a single server, while in other embodiments it is implemented as a distributed system of multiple servers. The web editing server 110 includes a web page store 112, web page templates 114, and a request fulfillment module 116.
The web page store 112 includes a plurality of web pages created by users of the web editing server 110. Each web page in the web page store 112 includes instructions that define the size, position, and content of visual elements on the page. In one embodiment, the instructions are stored as structured data (e.g., JSON data) that can be used to assemble markup language (e.g., HTML) describing the page. In this embodiment, the structured data may include portions of markup language. In another embodiment, the instructions are stored in a markup language and not as structured data. The content of a visual element can either be included as part of the instructions for the corresponding web page or stored as separate data items that are referenced in the instructions for the web page. For example, body text and headings on the web page are included as part of the instructions, but images on the web page are stored as separate files and instructions include references to those files. Separate data items can be stored alongside the web pages in the web page store 112 or on a separate server that is accessible via the network 101.
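For illustration only, the following sketch shows one way such structured data might be organized. The field names and types are hypothetical and are not prescribed by this description; note how body text is stored inline while the image is stored separately and referenced.

```typescript
// Hypothetical structured-data representation of a web page (illustrative
// only; the actual format used by the web page store 112 is not specified).
interface VisualElement {
  id: string;                  // unique identifier for the visual element
  type: "heading" | "text" | "image" | "column";
  content?: string;            // inline content (e.g., heading or body text)
  src?: string;                // reference to a separately stored data item
  children?: VisualElement[];  // nested elements (e.g., inside a column)
}

const page: VisualElement[] = [
  { id: "el-1", type: "heading", content: "Welcome" },
  {
    id: "el-2",
    type: "column",
    children: [
      { id: "el-3", type: "text", content: "Body text stored inline." },
      { id: "el-4", type: "image", src: "https://example.com/photo.jpg" },
    ],
  },
];
```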
The web pages in the web page store 112 can be organized into web sites. A website includes one or more individual web pages that are connected to each other (e.g., with hyperlinks). In addition to the individual web pages, a website can also include a page hierarchy and theme content.
A page hierarchy describes relationships between different pages of the website. For example, the page hierarchy may organize the website into a tree structure with the home page at the root, pages for the main sections of the website as branches connected to the root, and pages representing sub-sections as leaves connected to each branch.
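For illustration, a page hierarchy of this kind might be represented as a simple tree; the node shape below is hypothetical.

```typescript
// Hypothetical page-hierarchy node: the home page is the root, main
// sections are its children, and sub-section pages are leaves.
interface PageNode {
  pageId: string;
  children: PageNode[];
}

const hierarchy: PageNode = {
  pageId: "home",
  children: [
    { pageId: "about", children: [{ pageId: "team", children: [] }] },
    { pageId: "products", children: [] },
  ],
};
```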
Theme content is data that defines visual characteristics of one or more web pages. In one embodiment, theme content is written in Cascading Style Sheets (CSS). For example, the theme content defines formatting settings such as the size, font, and color of text, hyperlinks, and headings, the color and size of borders, and the sizes of margins. The theme content may also define layout settings such as the position of a main heading on each page and the position of hyperlinks for navigating to other sections of the website. It is advantageous to store a single item of theme content for a website rather than a separate item of theme content for each web page because web pages in a website typically have the same layout, formatting, and other visual characteristics.
The web page templates 114 are data items that can be used to create new web pages and websites. Each template 114 includes theme content, as described above. In addition to theme content, a template 114 may also include other content for defining the appearance of the web page, such as custom typefaces or graphics. A template 114 may additionally include sample images and text (e.g., lorem ipsum text) to provide a user with a representation of how the web page will appear after the user's content is added.
The request fulfillment module 116 receives and processes requests from client devices 120 to retrieve templates 114 and web pages. For example, the module 116 receives requests from client devices 120 to load a web page for a user of the client device 120 to edit and provides the requested web page to the client device 120. After users make edits to the web page, the module 116 also receives and fulfills requests to update that web page in the web page store 112 to incorporate the user's edits.
The client device 120 is a computing device that allows a user to interact with the web editing server 110 to create and edit web pages. For example, the client device 120 may be a mobile device such as a tablet computer or a smartphone. The client device 120 may alternatively be a laptop or desktop computer. As shown in
The operating system 122 provides modules that allow applications on the client device 120 (e.g., the web editing application 126) to interact with hardware components of the client device 120, such as the hardware components described in
The web editing application 126 retrieves web pages and templates 114 from the web editing server 110 (e.g., via the request fulfillment module 116) and operates in conjunction with the web rendering module 124 to provide an interface that allows a user to edit the web pages. The interface includes a native overlay that is generated based on the layout data provided by the web rendering module 124. The process of generating the native overlay and making edits to web pages via the native overlay is described in detail with reference to
The network 101 provides a communication infrastructure between the web editing server 110 and the client device 120. The network 101 is typically the Internet, but may be any network, including but not limited to a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile wired or wireless network, a private network, or a virtual private network.
The web page 202 is a local copy of a web page in the web page store 112. As described above with reference to the web editing server 110, the web page 202 includes instructions (e.g., stored as structured data, such as JSON, that may include portions of markup language, such as HTML) that can be executed to render the web page, and the instructions define the size, position, and content of the visual elements on the rendered web page 208. Although only one web page 202 is shown in
The toolbar generator 204 provides one or more visual toolbars containing options that allow the user to edit the web page. Some of the toolbars include options to add visual elements to the page. For example, a toolbar may include options to add a paragraph of text, a map, an image, or a slideshow. The toolbars may also provide options to change visual characteristics of the pages, such as the background color of the page or the size, color, and font of body text and headings on the page. These changes can be stored as part of the web page (e.g., if they are specific to the page) or as part of the theme content (e.g., if they affect multiple web pages that are associated with the same item of theme content). In addition to toolbars, the toolbar generator 204 may also provide other interface elements, such as menus and status bars.
The native overlay generator 206 sends the web page 202 to the web rendering module 124, which causes the web rendering module 124 to render and display a visual representation of the web page 202. The visual representation of the web page 202 is referred to herein as the rendered web page 208. The process of rendering and displaying a rendered web page 208 is described in further detail below with reference to the structured data interpreter 252 and the web rendering engine 254 in
As described above with reference to the web rendering module 124, the layout data 210 describes the position and size of visual elements on the rendered web page 208. For example, the layout data 210 includes a set of x and y coordinates describing the position of the upper-left corner of each visual element and a set of x and y dimensions describing the size of each visual element. In some embodiments, the layout data 210 stores the position and size of each visual element in association with an identifier that uniquely identifies the visual element.
In one embodiment, the layout data 210 specifies more than one different position or size for each visual element. The web rendering module 124 selects the appropriate position or size data based on characteristics of a display on which the web page 202 is displayed. For example, the layout data 210 may have a first set of coordinates and dimensions for each visual element for display on a landscape orientation display, and a second set of coordinates and dimensions for each visual element for display on a portrait orientation display. In one embodiment, a display is capable of being rotated from a portrait orientation to a landscape orientation. The web rendering module 124 is notified of a rotation and may re-render the web page 202 using the appropriate layout data 210. Having separate positions and sizes for visual elements depending on display characteristics allows for more efficient use of screen space, especially in the case of devices having a limited display area such as smartphones.
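For illustration, layout data with per-orientation geometry might be structured as follows; the shapes and field names are hypothetical.

```typescript
// Hypothetical layout data 210 with separate geometry for each display
// orientation, keyed by element identifier. All values are in pixels.
interface CellGeometry {
  x: number;       // x coordinate of the upper-left corner
  y: number;       // y coordinate of the upper-left corner
  width: number;
  height: number;
}

type LayoutData = Record<
  string,
  { landscape: CellGeometry; portrait: CellGeometry }
>;

const layoutData: LayoutData = {
  "el-1": {
    landscape: { x: 0, y: 0, width: 1024, height: 80 },
    portrait: { x: 0, y: 0, width: 768, height: 96 },
  },
};

// Select the geometry appropriate for the display's current orientation.
function selectGeometry(
  data: LayoutData,
  id: string,
  isLandscape: boolean
): CellGeometry {
  const entry = data[id];
  return isLandscape ? entry.landscape : entry.portrait;
}
```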
After receiving the layout data 210, the native overlay generator 206 causes the client device 120 to generate a native overlay 212. The native overlay 212 includes an arrangement of one or more cells, and each cell has a position and size that matches a respective visual element or group of visual elements of the rendered web page 208. In one embodiment, the native overlay generator 206 interacts with a layout generation module in the operating system 122 to generate the native overlay 212. For example, if the operating system 122 is APPLE IOS, the native overlay generator 206 creates an instance of the UICollectionView class to generate the native overlay 212. As another example, if the operating system 122 is GOOGLE ANDROID, the native overlay generator 206 creates an instance of the ViewGroup class. In this embodiment, the layout generation module is configured to receive instructions that define the position and size of each cell. In one embodiment, the native overlay 212 comprises one cell, and the layout data 210 defines positions of one or more visual elements within the cell. To prepare these instructions, the native overlay generator 206 may perform a transformation on the layout data 210 received from the web rendering module 124. For example, if the layout data 210 defines the upper left corner of each cell but the layout generation module is configured to receive instructions that define the center of each cell, then the native overlay generator 206 performs a transformation to convert the coordinates for each cell's upper left corner to coordinates for each cell's center. Alternatively, the native overlay generator 206 may use the unmodified layout data 210 as the instructions.
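As a non-limiting sketch of the corner-to-center transformation described above (the cell shapes are hypothetical):

```typescript
// Convert a cell described by its upper-left corner (as in the layout
// data 210) into the center-based form expected by a layout generation
// module that positions cells by their midpoints.
interface CornerCell { x: number; y: number; width: number; height: number; }
interface CenteredCell { centerX: number; centerY: number; width: number; height: number; }

function toCenteredCell(cell: CornerCell): CenteredCell {
  return {
    centerX: cell.x + cell.width / 2,   // midpoint along x
    centerY: cell.y + cell.height / 2,  // midpoint along y
    width: cell.width,
    height: cell.height,
  };
}
```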
In one embodiment, each cell in the native overlay 212 is transparent, and the native overlay generator 206 causes the native overlay 212 to be displayed on top of the rendered web page 208. As a result, the rendered web page 208 remains visible under the native overlay 212, but the native overlay 212 captures any gestures or other interactions that the user performs on the screen.
The interaction processing module 214 receives user interactions with the toolbars and the native overlay 212 and makes corresponding changes to the web page 202. After receiving a user interaction, the interaction processing module 214 updates the web page 202 and passes the updated web page 202 to the web rendering module 124 to be rendered. In addition to rendering and displaying the updated web page 202 to the user, the web rendering module 124 also generates and returns updated layout data 210 to the native overlay generator 206, and the native overlay generator 206 regenerates the native overlay 212. The interaction processing module 214 can also send the updated web page 202 to the server 110 to be saved in the web page store 112. In one embodiment, only the changes to a web page 202 are sent to the server 110. In this embodiment, the server 110 updates the version of the web page 202 stored in the web page store 112 by applying the received changes.
User interactions that change the layout of the web page 202 can include, for example, moving a visual element, adding a new visual element from a toolbar, editing an existing visual element (e.g., by resizing the visual element), or deleting an existing visual element. These example interactions are described in further detail with reference to
In embodiments where the web page 202 is stored in the web editing application 126 as structured data (e.g., JSON), the structured data interpreter 252 interprets the structured data to assemble markup language describing the web page (e.g., HTML). In one embodiment, the structured data interpreter 252 is provided by the same developer as the web editing application 126. For example, the developer that provides the web editing application 126 also provides JavaScript code (which can be executed by the web rendering module 124) that implements the structured data interpreter 252.
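For illustration, a minimal sketch of such an interpreter, assuming the hypothetical element-tree form sketched earlier; the actual JavaScript code provided by the developer is not specified by this description.

```typescript
// Minimal structured data interpreter sketch: walk a (hypothetical) JSON
// element tree and assemble HTML markup for the web rendering engine.
// The data-id attribute carries each element's identifier so that the
// layout data generator can later associate geometry with elements.
interface Element {
  id: string;
  type: "heading" | "text" | "image" | "column";
  content?: string;
  src?: string;
  children?: Element[];
}

function toHtml(el: Element): string {
  switch (el.type) {
    case "heading":
      return `<h1 data-id="${el.id}">${el.content ?? ""}</h1>`;
    case "text":
      return `<p data-id="${el.id}">${el.content ?? ""}</p>`;
    case "image":
      return `<img data-id="${el.id}" src="${el.src ?? ""}">`;
    case "column":
      return `<div data-id="${el.id}">${(el.children ?? []).map(toHtml).join("")}</div>`;
  }
}
```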
The web rendering engine 254 is a layout engine that receives markup language (e.g., HTML) describing a web page and generates a rendered web page 208. In embodiments where the web page 202 is stored in the web editing application 126 as structured data, the web rendering engine 254 receives markup language from the structured data interpreter 252. The web rendering engine 254 may alternatively receive markup language directly from the web editing application (e.g., if the web page 202 is stored as markup language). Unlike the structured data interpreter 252, which may be provided by the same developer as the web editing application 126, the web rendering engine 254 is typically provided as part of the operating system 122. For example, the web rendering engine 254 is the WebKit layout engine that is included as part of APPLE IOS.
The layout data generator 256 generates the layout data 210 by querying the web rendering engine 254 for information describing the position and size of each visual element on the rendered web page 208. In one embodiment, the layout data generator 256 arranges the position and size information into a structured format (e.g., JSON) and provides the layout data 210 in this structured format. When generating the layout data 210, the layout data generator 256 may also associate each item of position and size information with an identifier for the corresponding visual element. Similar to the structured data interpreter, the layout data generator 256 may also be provided by the same developer as the web editing application 126. For example, the developer provides JavaScript code that can be executed by the web rendering module 124 to perform the functions of the layout data generator 256. In one embodiment, a single module of JavaScript code implements both the structured data interpreter 252 and the layout data generator 256.
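The description does not specify how the layout data generator 256 is implemented, but a minimal in-page sketch using standard DOM measurement APIs might look like the following (the data-id convention is an assumption carried over from the interpreter sketch above):

```typescript
// Layout data generator sketch: measure every rendered element that
// carries an identifier and emit position/size information as JSON
// keyed by that identifier.
function generateLayoutData(): string {
  const layout: Record<
    string,
    { x: number; y: number; width: number; height: number }
  > = {};
  document.querySelectorAll<HTMLElement>("[data-id]").forEach((el) => {
    const rect = el.getBoundingClientRect(); // geometry of the rendered element
    layout[el.dataset.id!] = {
      x: rect.left,
      y: rect.top,
      width: rect.width,
      height: rect.height,
    };
  });
  return JSON.stringify(layout);
}
```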
Example Method
When the user begins editing web content using the web editing application 126, the web editing application 126 sends a request for the web content to the web editing server 110. The requested web content may include a template 114 (if the user wishes to create a new web page) or a web page from the web page store 112 (if the user wishes to edit an existing web page). As described above with reference to
In response to the request, web editing server 110 sends 302 initialization data to the web editing application 126. The initialization data includes the requested web page(s) or template(s) 114, which is stored as web page(s) 202 in the web editing application 126. The initialization data also includes the theme content and a page hierarchy associated with the requested web page(s) or template(s) 114. The theme content and page hierarchy may be stored in association with the web page(s) 202 in the web editing application 126.
In addition to requesting the web content, the web editing application 126 also detects 303 display information of the client device 120. Display information identifies characteristics of a display of the client device 120, such as the orientation of the display (e.g., landscape or portrait), the dimensions of the display, or the resolution of the display. Display information is used to render web pages and native overlays that are appropriate for the client device 120. Display information may also be detected 303 upon a change to the display, such as a rotation of the display from landscape to portrait orientation.
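For illustration, display information of this kind can be detected with standard web APIs as sketched below; a native web editing application 126 would use the platform's own display APIs instead.

```typescript
// Sketch of display-information detection (browser-side, illustrative).
interface DisplayInfo {
  orientation: "landscape" | "portrait";
  width: number;       // display width in pixels
  height: number;      // display height in pixels
  pixelRatio: number;  // physical pixels per CSS pixel
}

function detectDisplayInfo(): DisplayInfo {
  return {
    orientation: window.matchMedia("(orientation: landscape)").matches
      ? "landscape"
      : "portrait",
    width: window.screen.width,
    height: window.screen.height,
    pixelRatio: window.devicePixelRatio,
  };
}

// Re-detect display information whenever the device is rotated.
window.addEventListener("orientationchange", () => {
  const info = detectDisplayInfo();
  console.log(`display is now ${info.orientation}`);
});
```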
The web editing application 126 generates 304 toolbars containing options that allow the user to edit the web page 202. Although the steps of detecting 303 display characteristics and generating 304 the toolbars are illustrated in
After receiving the initialization data, the web editing application 126 sends 306 an instruction to the web rendering module 124 to render the web page 202. In response, the web rendering module 124 renders 308 the web page 202 and displays the rendered web page 208 on the client device 120. In an embodiment where the web page 202 is stored as structured data (e.g., JSON), the structured data interpreter 252 transforms the structured data into markup language (e.g., HTML) describing the rendered web page 208, and the web rendering engine 254 uses the markup language to generate and display the rendered web page 208. An example rendered web page 208 is shown in
Referring back to
The web editing application 126 uses the layout data 210 to generate 312 a native overlay 212.
In one embodiment, the web editing application 126 detects a change to the display information of the client device 120 (e.g., rotation of the display from portrait orientation to landscape orientation). In this embodiment, the method 300 may return to step 303 and proceed to render the web page and generate the native overlay with dimensions and cell positions appropriate for the detected display characteristics. This process may be repeated several times as the web editing application 126 detects multiple changes to the display information.
Referring back to
Because the native overlay 212 is displayed on top of the rendered web page 208, the native overlay 212 captures any gestures or other interactions that the user performs. Thus, when the user attempts to interact with a visual element on the rendered web page 208, the user actually performs an interaction with a cell in the native overlay 212 that has the same position and size as the visual element.
Referring back to
The web editing application 126 receives 314 a user interaction with the native overlay 212 or one of the toolbars and makes a corresponding change to the web page 202. Examples of user interactions and the corresponding changes are described below with reference to
After receiving the updated layout data 210, the web editing application 126 regenerates 322 the native overlay 212 using the updated layout data 210 so that the cells of the native overlay 212 match the visual elements of the updated rendered web page 208. In one embodiment, the web editing application 126 first compares the updated layout data 210 to the previous layout data and only regenerates 322 the native overlay 212 if the updated layout data 210 is different (i.e., if the updated layout data indicates that the user interaction changed the layout of the rendered web page 208). In another embodiment, the web editing application 126 regenerates 322 the native overlay 212 using the updated layout data 210 without first comparing the updated layout data 210 to the previous layout data.
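A minimal sketch of this comparison, assuming the layout data 210 is serialized as JSON with a stable key order so that string equality is a reliable test:

```typescript
// Regenerate the native overlay only when the serialized layout data
// actually changed; identical strings mean the layout is unchanged.
let currentLayout = "{}"; // previously received layout data 210 (JSON)

function onUpdatedLayoutData(
  updated: string,
  regenerateOverlay: () => void
): void {
  if (updated !== currentLayout) {
    currentLayout = updated;
    regenerateOverlay(); // cells no longer match the rendered page
  }
}
```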
At any point after the web editing application 126 updates the web page 202 based on the user's interaction, the web editing application 126 may send 324 the updated web page 202 to the request fulfillment module 116 of the web editing server 110 to be saved in the web page store 112. In one embodiment, only changes to the web page 202 are sent to the web editing server 110. Step 324 may be performed automatically without receiving a separate user input to save the web page 202 to the web page store 112. For example, if only changes are sent to the web editing server 110, step 324 may be performed each time a change is made to the web page. In one embodiment, the web editing application 126 implements a queuing system to send multiple updates to the request fulfillment module 116 in series after the updates are made on the client device 120. A queuing system can be advantageous in situations where the client device 120 does not have continuous access to the web editing server 110 over the network 101 and is unable to save each edit immediately after the edit is made.
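For illustration, such a queuing system might be sketched as follows; the endpoint path and change shape are hypothetical.

```typescript
// Queue edits while offline and flush them to the web editing server
// in order once connectivity returns.
type Change = { pageId: string; patch: unknown };

const pending: Change[] = [];

async function saveChange(change: Change): Promise<void> {
  pending.push(change);
  await flushQueue();
}

async function flushQueue(): Promise<void> {
  while (pending.length > 0) {
    try {
      // Hypothetical endpoint on the web editing server 110.
      await fetch(`/api/pages/${pending[0].pageId}`, {
        method: "PATCH",
        body: JSON.stringify(pending[0].patch),
      });
      pending.shift(); // sent successfully; remove from the queue
    } catch {
      break; // still offline; retry on the next flush
    }
  }
}

// Retry whenever the client device comes back online.
window.addEventListener("online", () => void flushQueue());
```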
Method for Moving a Visual Element
The method 314A begins when the user performs a long pressing gesture on a visual element within the rendered web page 208. Because the native overlay 212 is displayed on top of the rendered web page 208, the interaction processing module 214 receives 402 the long pressing gesture on the corresponding cell in the native overlay 212. For example, if the user performs a long pressing gesture on the image element 370A shown in
The interaction processing module 214 uses the layout data 210 to generate 404 an image of the selected visual element (the visual element image). In one embodiment, the interaction processing module 214 accesses the layout data 210 to obtain the position and dimensions of the cell corresponding to the selected visual element and captures a screenshot of the region of the rendered web page 208 defined by the position and dimensions. In another embodiment, the interaction processing module 214 captures a screenshot image of the entire rendered web page 208 and uses the layout data 210 to crop the screenshot image to remove everything other than the visual element. For example, if the layout data 210 defines the top-left corner and the dimensions of the visual element, the interaction processing module 214 uses these two data items to determine the four corners of the visual element and crops the screenshot image to remove portions of the screenshot image that do not fall within those four corners.
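A minimal sketch of the cropping approach described in the second embodiment, assuming the full-page screenshot is available as an image and the cell geometry comes from the layout data 210:

```typescript
// Crop a full-page screenshot down to one visual element by copying
// only the region defined by the element's cell geometry.
function cropToElement(
  screenshot: HTMLImageElement,
  cell: { x: number; y: number; width: number; height: number }
): HTMLCanvasElement {
  const canvas = document.createElement("canvas");
  canvas.width = cell.width;
  canvas.height = cell.height;
  const ctx = canvas.getContext("2d")!;
  ctx.drawImage(
    screenshot,
    cell.x, cell.y, cell.width, cell.height, // source rectangle (the element)
    0, 0, cell.width, cell.height            // destination (entire canvas)
  );
  return canvas;
}
```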
The interaction processing module 214 also hides 406 the visual element so that the visual element is not displayed twice on the screen when the user performs a dragging gesture. In one embodiment, the interaction processing module 214 modifies the selected cell (which was previously transparent) so that the cell has an opaque color that matches the background color of the rendered web page 208. In the example shown in
To move the visual element, the user performs a dragging gesture to drag the visual element image from its initial position (i.e., a position within the selected cell) to a new position on the rendered web page 208. When the interaction processing module 214 receives 408 the dragging gesture, the module 214 displays 410 the visual element image at the new position. In one embodiment, the module 214 implements the display of the visual element image as a draggable image so that the visual element image appears to move with the user's finger as the user performs the dragging gesture. In the example shown in
The interaction processing module 214 also displays 410 an insertion point (e.g., the insertion point 411 in
In one embodiment, the interaction processing module 214 also displays a cell configuration indicator. The cell configuration indicator indicates whether the cells adjacent to the insertion point have a special arrangement that changes based on the display characteristics of the client device 120. For example, cells may constitute a column in portrait orientation but a row in landscape orientation.
When the user terminates the dragging gesture, the interaction processing module 214 moves the visual element to a location on the web page 202 corresponding to the insertion point. For example, the interaction processing module 214 changes the web page 202 so that the visual element is displayed in the new location when the web page 202 is rendered. The interaction processing module 214 also sends 316 the change in the web page 202 to the web rendering module 124 in accordance with the method 300 described in
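For illustration, the insertion point nearest to the drag position might be chosen with a simple heuristic such as the following; the single-column assumption and the heuristic itself are hypothetical.

```typescript
// Choose the insertion index nearest to the current drag position by
// comparing the drag's y coordinate with each cell's vertical midpoint
// (assumes cells are stacked in a single column).
function nearestInsertionIndex(
  cells: { y: number; height: number }[],
  dragY: number
): number {
  let index = 0;
  for (const cell of cells) {
    if (dragY > cell.y + cell.height / 2) index++;
  }
  return index;
}
```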
Method for Editing a Visual Element
The method 314B begins when the user performs a tapping gesture on a visual element on the rendered web page 208. After the interaction processing module 214 receives 422 the tapping gesture on the corresponding cell in the native overlay 212, the module identifies the visual element by accessing the layout data 210. In the example shown in
The interaction processing module 214 displays 424 a primary tooltip proximate to the cell. In one embodiment, the primary tooltip includes options to edit, move, and delete the visual element. An example of a primary tooltip 505 with these options 505A, 505B, 505C is illustrated in
The user selects one of the options on the primary tooltip and the interaction processing module 214 receives 426 the user selection. If the user selects the delete option 505C, the module 214 changes the web page 202 to delete the visual element and the method 314B terminates. If the user selects the move option 505B, the module 214 performs steps 404 through 412 of the method 314A described with reference to
If the user selects the edit option 505A, the interaction processing module 214 displays 428 an editing tooltip that includes one or more options for making edits to the visual element. An editing tooltip 509 for the image element 368A is illustrated in
Method for Adding a Visual Element
The method 314C begins when the user performs a long pressing gesture on an option in one of the toolbars to add a particular type of visual element to the web page 202. The interaction processing module 214 receives 602 the long pressing gesture and selects the element type corresponding to the option that was pressed. In the example shown in
To indicate that the element type has been selected, the interaction processing module 214 displays 604 a representation of the selected element type. In one embodiment, the representation is a graphic of the selected element type. In the example shown in
After the representation is displayed 604, the user performs a dragging gesture to drag the representation from the toolbar to a position on the rendered web page 208. The interaction processing module 214 receives 606 the dragging gesture and displays 608 an insertion point in the same manner as described above with reference to
The user interactions described in the methods 314A, 314B, 314C make reference to long pressing gestures and tapping gestures. The distinction between long pressing gestures and tapping gestures is merely meant to differentiate two types of input performed at the same position. In other embodiments, other types of input (e.g., using a pointing device to perform a left click or right click at the same position) may be used in place of the gestures referenced herein. In still other embodiments, the functions triggered by long pressing gestures and tapping gestures may be reversed.
The methods 300, 314A, 314B, 314C described with reference to
Once the user has edited the web page and is ready to publish the page, the user may send a request to publish the web page by selecting a publish element such as the publish button of
In one embodiment, various steps of the process 700 may be performed while the client device is not connected to the network 101. For example, steps 701-707 may be performed while the client device is not connected to the web editing server 110 via the network 101, which allows a user of the device to edit the web page offline for later upload. In one embodiment, if the user requests to publish the web page while offline, the web editing application 126 will queue the publish request to send to the web editing server 110 once a connection is established. In another embodiment, if the user requests to publish the web page while offline, the web editing application 126 does not queue the publish request. In this embodiment, the web editing application 126 may notify the user that it cannot publish while the client device 120 is offline and may instruct the user to attempt to publish when the client device is online.
Physical Components of a Computer
The processor 802 is an electronic device capable of executing computer-readable instructions held in the memory 806. In addition to holding computer-readable instructions, the memory 806 also holds data accessed by the processor 802. The storage device 808 is a non-transitory computer-readable storage medium that also holds computer-readable instructions and data. For example, the storage device 808 may be embodied as a solid-state memory device, a hard drive, compact disk read-only memory (CD-ROM), a digital versatile disc (DVD), or a BLU-RAY disc (BD). The input device(s) 814 may include a pointing device (e.g., a mouse or track ball), a keyboard, a touch-sensitive surface, a camera, a microphone, sensors (e.g., accelerometers), or any other devices typically used to input data into the computer 800. The graphics adapter 812 displays images and other information on the display 818. In some embodiments, the display 818 and an input device 814 are integrated into a single component (e.g., a touchscreen that includes a display and a touch-sensitive surface). The network adapter 816 couples the computer 800 to a network, such as the network 101.
As is known in the art, a computer 800 can have additional, different, and/or other components than those shown in
As is known in the art, the computer 800 is adapted to execute computer program modules for providing functionality described herein. As used herein, the term “module” refers to computer program logic utilized to provide the specified functionality. Thus, a module can be implemented in hardware, firmware, and/or software. In one embodiment, computer program modules are stored on the storage device 808, loaded into the memory 806, and executed by the processor 802.
As used herein, a computer program product comprises one or more computer program modules that operate in conjunction to provide the functionality described herein. Thus, a computer program product may be stored on the storage device 808, loaded into the memory 806, and executed by the processor 802 to provide the functionality described herein.
Embodiments of the physical components described herein can include other and/or different modules than the ones described here. In addition, the functionality attributed to the modules can be performed by other or different modules in other embodiments. Moreover, this description occasionally omits the term “module” for purposes of clarity and convenience.
Additional Configuration Considerations
Some portions of the above description describe the embodiments in terms of algorithmic processes or operations. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs comprising instructions for execution by a processor or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of functional operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
The described embodiments also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored on a computer readable medium that can be accessed by the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of computer-readable storage medium suitable for storing electronic instructions, and each coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
In addition, the articles “a” and “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the disclosure. This description should be read to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the invention is not limited to the precise construction and components disclosed herein and that various modifications, changes and variations which will be apparent to those skilled in the art may be made in the arrangement, operation and details of the embodiments disclosed herein without departing from the spirit and scope as defined in the claims.
This application is a continuation of and claims priority to U.S. patent application Ser. No. 16/200,354, filed on Nov. 26, 2018, which is a continuation of and claims priority to U.S. patent application Ser. No. 14/878,878, entitled “USER INTERFACE FOR EDITING WEB CONTENT”, filed on Oct. 8, 2015, now U.S. Pat. No. 10,139,998, issued on Nov. 27, 2018, which claims the benefit of U.S. Provisional Application No. 62/061,691, filed Oct. 8, 2014, which is incorporated by reference herein.