Widgets are known in the art and may be found, for example, in Dashboard as part of Mac OS X. An application such as Apple Inc.'s Dashcode 1.0 (described in an Appendix) may be used to assist in the creation of widgets. Web applications are known in the art and may run on mobile devices such as the Apple iPhone. It is often desirable to create content that can run alternatively as a widget or as a web application, and there are often common aspects to widget creation and web application creation.
The invention relates to content design.
In a first aspect, a method includes providing a user interface allowing the insertion of elements into a document flow comprising static and dynamic elements, the user interface presenting a graphical depiction of the document that is dynamically altered by the insertion of the element, wherein the dynamically altered appearance of the document correctly reflects the position and type of the inserted element and rearranges all existing static and dynamic elements of the document around the inserted element.
Implementations can include any, all or none of the following features. The user interface can be configured to insert into the document flow any element from a predefined library containing at least a box element, a browser element, a button element, an input control element, a gauge element, and combinations thereof. The user interface can be configured to insert both flow elements and non-flow elements into the document. The user interface can provide that an underlying code of the document flow is modified based on the inserted element and the rearranged static and dynamic elements.
In a second aspect, a method includes detecting the movement of an element of a layout of a document outside a boundary of a first level of hierarchy; and visually and in the underlying code, placing that element in a level of hierarchy that is a parent of the first level of hierarchy.
Implementations can include any, all or none of the following features. The movement can begin in another element at the first level that is configured to contain the element at the first level, and the movement outside the boundary can include that the element is placed at the parent of the first level instead of being placed at the first level. A user can graphically move the element by dragging the element using a pointing device.
In a third aspect, a method includes refactoring an interface for content editing based on a project type.
Implementations can include any, all or none of the following features. The project type can be defined using a property list for the project type. The method can further include providing at least one of templates, attributes, library parts, views, icons, or other variable elements based on the project type. The method can further include choosing a deployment behavior based on the project type. The refactoring can be performed in a tool that includes templates for web applications and for web widgets, and a user can select one of the templates in the tool and the property list is provided from the selected template. The tool can provide a list view and at least one template row, and the method can further include receiving a modification of the template row and applying the modification to any other row that relates to the template row. The method can further include adding an element to the list view, adding the element to the template row, and updating all rows of the list relating to the template row.
In a fourth aspect, a method includes generating a replicated element based on the editing of a canonical element and mapping subcomponents of the canonical element to corresponding subcomponents of the replicated element. The method can further include maintaining a dictionary to map identifiers of a subcomponent of a replicated element to an identifier of a subcomponent of the canonical element. The method can further include invoking a function that receives a cloned row relating to a template row, the cloned row having an attribute that contains a reference to each element in the cloned row, wherein the function changes an aspect of the cloned row based on the attribute.
In a fifth aspect, a method includes: running a debug plug-in in a mobile device to monitor a web application and reporting data to a web application development tool.
Implementations can include any, all or none of the following features. The method can further include presenting a resource logging interface for the web application, the resource logging interface configured to be filtered. The method can further include providing graphical representations of memory, CPU and network use, for example to indicate CPU usage or used versus available memory.
In a sixth aspect, a method includes: running a debug plug-in in a simulator of a mobile device to monitor a web application and reporting data to a web application development tool.
Implementations can include any, all or none of the following features. The method can further include monitoring a web application on both a mobile device and on a simulation of a mobile device connected to a debugging interface using plug-ins in a browser of the mobile device and a browser of the simulation of the mobile device. The method can further include presenting a resource logging interface for the web application, the resource logging interface configured to be filtered. The method can further include providing graphical representations of memory, CPU and network use, for example to indicate CPU usage or used versus available memory.
In a seventh aspect, a method includes: depicting content intended for a mobile device at a scale related to the pixel resolution of the device in a simulation of the device.
Implementations can include any, all or none of the following features. The depicted content can be a pixel-to-pixel analog of the mobile device. The method can further include rescaling to a 1:1 dimensionally accurate analog of a view on the mobile device.
In an eighth aspect, a method includes: depicting content intended for a mobile device at a scale related to the physical dimensions of the device in a simulation of the device.
Implementations can include any, all or none of the following features. The method can further include simulating a rotation to be performed on the mobile device. Simulating the rotation can include determining, before simulating the rotation, each aspect of the depicted content; determining how each aspect should appear after rotation on the mobile device; and performing the rotation based at least on the two determinations.
In a ninth aspect, a method allows a visualization of content intended for a mobile device, in a simulation of the device, at a first scale related to the pixel resolution of the device or at a second scale related to the physical dimensions of the device, in response to a user input.
Implementations can include any, all or none of the following features. The method can further include selectively performing the visualization at at least one of the first scale or the second scale. The visualization can be performed at one of the first and second scales based on a user input.
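As a hedged illustration of how the two scales might be derived (the function and its parameters are hypothetical and not part of any described implementation), a tool could compute a pixel-accurate scale and a dimensionally accurate scale as follows:

// Hypothetical sketch: compute two display scales for a simulated mobile device.
// pixelScale draws one simulated device pixel per host pixel; physicalScale sizes
// the simulation to match the device's real-world dimensions on the host display.
function simulationScales(devicePixelWidth, deviceScreenWidthInches, hostPixelsPerInch) {
  return {
    pixelScale: 1.0,
    physicalScale: (deviceScreenWidthInches * hostPixelsPerInch) / devicePixelWidth
  };
}

A user input could then select which of the two scales is applied to the simulated view.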
In a tenth aspect, a method includes: listing all resources accessed by a web application or a web widget and filtering them based on one or more of network location and resource type.
Implementations can include any, all or none of the following features. A user can select one of the resources, and the method can further include displaying information regarding the selected resource. The method can further include toggling to display the resource instead of the displayed information.
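As a hedged sketch of such filtering (the record fields and function name are hypothetical), the listed resources might be narrowed by network location and resource type as follows:

// Illustrative only: filter a list of accessed resources by host and by type.
function filterResources(resources, hostFilter, typeFilter) {
  return resources.filter(function (resource) {
    var hostMatches = !hostFilter || resource.url.indexOf(hostFilter) !== -1;
    var typeMatches = !typeFilter || resource.type === typeFilter;
    return hostMatches && typeMatches;
  });
}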
In an eleventh aspect, a method includes displaying information comprising CPU, memory and network bandwidth usage of only those processes required to display or run a particular web application or widget.
Implementations can include any, all or none of the following features. The information can be displayed in a debug plug-in in a mobile device that monitors a web application and reports data to a web application development tool. The information can be displayed in a debug plug-in in a simulator of a mobile device that monitors a web application and reports data to a web application development tool.
In this application, the drawings are listed in the order in which the described figures appear in the description below.
Generally, in
Specifically, the figures depict at a high level the features found in a tool such as Dashcode 2.0, available from Apple Inc. and operable on computers that run the Mac OS X operating system.
In one implementation,
In
A similar view for the other project type in this implementation, the “widget” type, is depicted in
Once a template has been chosen, attributes for the project type can then be selected. In
In
A similar pair of figures is provided for widget design. In
For each specific project type, templates, attributes, library parts, views, icons and other variable elements of the interface can be provided. For example, scaling and rotation are relevant for web applications for a mobile device, whereas they are not for widgets. On the other hand, the concept of a front and back side of a widget is not relevant to a web application.
Deployment behavior may also differ between the two project types. For example, widgets can deploy directly to a Mac OS Dashboard or generate a widget in a directory. For web applications, the tool may upload the application directly to a web server or generate a folder with a web page and resources.
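As a purely illustrative sketch (the keys, values and format below are hypothetical and not the actual property list used by the tool), a project-type description driving such interface refactoring and deployment choices might resemble:

// Hypothetical project-type descriptors; not the actual Dashcode property list format.
var projectTypes = {
  webApplication: {
    libraryParts: ["Box", "Browser", "Button", "Input", "Gauge"],
    supportsScalingAndRotation: true,   // relevant for mobile web applications
    supportsFrontAndBack: false,        // no widget-style front and back sides
    deployment: ["uploadToWebServer", "generateFolderWithPageAndResources"]
  },
  widget: {
    libraryParts: ["Box", "Button", "Gauge"],
    supportsScalingAndRotation: false,
    supportsFrontAndBack: true,         // widgets have a front side and a back side
    deployment: ["deployToDashboard", "generateWidgetInDirectory"]
  }
};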
As may be appreciated, there may be other document types for which other interface versions and attributes may be presented. For example, an extension of this interface to generalized web pages or other types of content may be provided with modifications to the attributes and user interface as needed.
Dynamic WYSIWYG UI for Editing of a Document with Flow and Static Elements
In an implementation of a content creation interface as depicted in
In
In
Thus this implementation allows a content creator to have a live, dynamic view of a document, implemented in this example by CSS and HTML elements, and to move the visual versions of those elements directly using a user interface without having to actually rewrite the underlying code. Furthermore, the implementation allows the mixture of flow and non-flow elements such as the Home button and the Back button in the same content and may appropriately move flow elements “around” the non-flow elements by altering the CSS appropriately as the user moves them on the interface.
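As a hedged sketch of the kind of CSS manipulation involved (the element handling and function name are hypothetical), the distinction between flow and non-flow elements might be expressed as follows:

// Illustrative only: position an element in the live view and update its style.
// A flow element stays in the normal document flow, so surrounding flow content
// reflows around it; a non-flow element (such as a pinned Back button) is taken
// out of the flow and positioned absolutely.
function placeElement(element, x, y, isFlowElement) {
  if (isFlowElement) {
    element.style.position = "static";
  } else {
    element.style.position = "absolute";
    element.style.left = x + "px";
    element.style.top = y + "px";
  }
}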
As may be appreciated by one in the art, this technique may in general be applied to any document or content having flowing and non-flowing elements and is not limited merely to web-based content. Thus, for example, a sheet music composition system, a layout editor for CAD, or any other application where a user interacts with a visual version of an underlying coded representation are all candidates for this technique of making a dynamic live version of the underlying representation available for manipulation by the user.
FIGS. 2.1-2.7 UI for Insertion of Elements into a Parent Container in a Hierarchical Document
FIGS. 2.1-2.7 depict one specific aspect of an implementation. In web-based content, there may be a Document Object Model (DOM) that may describe a hierarchy of containers made up of elements such as, for example, DIV elements. When a new object is introduced onto a visual dynamic representation of the web content, in a web content creation tool, the tool may locate it at one level of the hierarchy. Thus for example, in
Thus in general the implementation allows movement in a hierarchical document from one level to a parent level by a user movement of a representation over a boundary of the lower level.
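A hedged sketch of this behavior (the function and its details are hypothetical) might detect a drop outside the current container and move the element to the parent container both visually and in the DOM:

// Illustrative only: if an element is dropped outside its container's bounds,
// re-insert it one level up in the hierarchy so the visual change is mirrored
// in the underlying code.
function dropElement(element, dropX, dropY) {
  var container = element.parentNode;
  var bounds = container.getBoundingClientRect();
  var outside = dropX < bounds.left || dropX > bounds.right ||
                dropY < bounds.top || dropY > bounds.bottom;
  if (outside && container.parentNode) {
    container.parentNode.appendChild(element);
  }
}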
In
In general, implementations such as the content creation tool may automatically replicate, in an intelligent manner, all the components and subcomponents of a repeated element. One important aspect of this replication is that while the elements of each replicated piece may be similar, e.g. each list element may have components such as an icon, a text field, a button, or an arrow, among myriad other possibilities, their actual identifiers in the underlying document structure, e.g. in a DOM, will be different. That is, elements in the template row have an identifier that must be unique in the scope of the document. Because of this, when creating a cloned or similar element based on the template row or element, these identifiers are stripped out, but the cloned row may need to keep a reference to them in a dictionary. This way, the developer's code may then customize each row individually, such as by adding specific text or values to a text field or button in this example, by accessing the relevant internal elements through this dictionary and inserting data into them. A dictionary callback may be used in each duplicated element to construct the new copy based on the template. At runtime, the template element may be used to perform error checking.
As an example of how the list row template may be used, note that in
clone.templateElements.label.innerText = "text of the cloned row"
to change the label of the row being processed.
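A hedged sketch of how such a cloned row might be constructed (the helper below is hypothetical; only the templateElements usage above is taken from the example) could strip identifiers from the copy and record them in the dictionary:

// Illustrative only: deep-copy the canonical (template) row, strip identifiers from
// the copy so identifiers stay unique in the document, and keep references to the
// stripped elements in a templateElements dictionary keyed by the original ids.
function cloneTemplateRow(templateRow) {
  var clone = templateRow.cloneNode(true);
  clone.templateElements = {};
  var descendants = clone.getElementsByTagName("*");
  for (var i = 0; i < descendants.length; i++) {
    var child = descendants[i];
    if (child.id) {
      clone.templateElements[child.id] = child;
      child.removeAttribute("id");
    }
  }
  clone.removeAttribute("id");
  return clone;
}

A developer's callback could then customize each clone individually, for example by setting clone.templateElements.label.innerText as shown above.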
A list is only one example of this type of templating and replication of sub elements. In other examples, cells in a grid or even pages in a stacked view may be replicated using this technique of editing a canonical representative and modifying duplicative replicated versions with unique identifiers but a common dictionary of elements. Of course, this is not an exhaustive list of the types of replicated structures for which this technique may be employed. Furthermore, the sub elements of the canonical element, e.g. a button or text or an arrow, may also be various and different and include a myriad of sub elements such as geometrical shapes, text fields, active text, and many others as is known. Furthermore, such templating may be recursively employed in some implementations.
To switch between the two views, which in the underlying code are sections of a single document, it is only necessary to select the appropriate icon in the navigator. In existing art, it may be necessary to manually edit the document code to make only one of the levels visible while hiding the other.
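As a minimal, hypothetical sketch of such switching (the identifiers are illustrative), the tool might simply toggle the visibility of the two sections of the document:

// Illustrative only: show one section of the document and hide the other.
function showView(idToShow, idToHide) {
  document.getElementById(idToShow).style.display = "block";
  document.getElementById(idToHide).style.display = "none";
}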
When the content produced in
In other implementations, other types of hierarchical views may be presented in a similar or analogous manner using a representation of a tree. Clicking or selecting a single or a set of nodes in the navigation tree could then produce on a viewing panel a view including only those elements of the hierarchical structure that are selected.
In addition to the resource logger interface, the content creation and debug tool may also provide graphical representations of memory, CPU and network use via representations such as a needle-and-dial or pie-chart with different colors to indicate CPU usage or used versus available memory, respectively.
It is to be noted that the resource log and performance parameters are specific to the particular web application or widget. Thus a user of the content creation application and debug system may see exactly what level and type of resource use is being required for a specific application or widget.
It should be noted that the web apps running in simulator 1005 and on the phone or device 1013 may have debugging plugins to allow Dashcode developers to debug, instrument and monitor such applications, using technologies such as gdb, inspector, instruments and others.
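As a hedged illustration only (the endpoint, payload fields and mechanism below are hypothetical and not an actual Dashcode or WebKit interface), a debug plug-in might report a resource sample to the development tool along these lines:

// Hypothetical sketch: post one resource/performance sample to the development tool.
function reportResourceSample(toolUrl, sample) {
  var request = new XMLHttpRequest();
  request.open("POST", toolUrl, true);
  request.setRequestHeader("Content-Type", "application/json");
  request.send(JSON.stringify({
    url: sample.url,               // resource that was loaded
    type: sample.type,             // e.g. "image", "stylesheet", "script"
    bytes: sample.bytes,           // size transferred over the network
    cpuPercent: sample.cpuPercent, // CPU attributable to the web application
    memoryBytes: sample.memoryBytes
  }));
}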
The software stack 700 can include an application layer and operating system layers. In this Mac OS® example, the application layer can include Dashcode 2.0 710, Widgets 705, or Web Applications 715. In some embodiments Widgets may live in a separate Dashboard layer. The Dashcode 2.0 application may include code to facilitate functionality such as widget creation, web application creation, WYSIWYG editing of web content, debug and test of web content, among others.
Web application or widget code can include HTML 720, CSS 725, JavaScript® 730 and other resources 735. CSS is a stylesheet language used to describe the presentation of a document written in a markup language (e.g., to style web pages written in HTML or XHTML). CSS may be used by authors and readers of web content to define colors, fonts, layout, and other aspects of document presentation. JavaScript® is a scripting language which may be used to write functions that are embedded in or included from HTML pages and interact with a Document Object Model (DOM) of the page.
In some implementations, a web application 715, a widget 705 or web content creation and debug tool such as Dashcode 2.0, 710, in the application layer uses WebKit® services 740. WebKit® 740 is an application framework included, in one implementation, with Mac OS X. The framework allows third party developers to easily include web functionality in custom applications. WebKit® includes an Objective-C Application Programming Interface (API) that provides the capability to interact with a web server, retrieve and render web pages, download files, and manage plug-ins. WebKit® also includes content parsers (e.g., HTML, CSS parser 765), renderer 770, a JavaScript® bridge 775 (e.g., for synchronizing between a web browser and Java applets), a JavaScript® engine (interpreter) 780 and a DOM 760. The WebKit® can use services provided by Core Services 750, which provide basic low level services. The Core Services can request services directly from the Core OS 755 (e.g., Darwin/Unix).
The software stack 700 provides the software components to create widgets and web applications, to debug and test them, and to support the various features and processes described above. Other software stacks and architectures are possible, including architectures having more or fewer layers, different layers or no layers. Specifically, for one example, the services provided by WebKit® may be provided directly by the operating system, incorporated into the content creation, debug and test application in other embodiments, or otherwise provided by a disparate set of libraries. Many other variations of the depicted architecture are possible.
The memory 820 stores information within the system 800. In some implementations, the memory 820 is a computer-readable medium. In other implementations, the memory 820 is a volatile memory unit. In yet other implementations, the memory 820 is a non-volatile memory unit.
The storage device 830 is capable of providing mass storage for the system 800. In some implementations, the storage device 830 is a computer-readable medium. In various different implementations, the storage device 830 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device.
The input/output device 840 provides input/output operations for the system 800. In some implementations, the input/output device 840 includes a keyboard and/or pointing device. In other implementations, the input/output device 840 includes a display unit for displaying graphical user interfaces.
In some embodiments the system 800 may be an Apple computer, such as a Mac Pro, MacBook Pro, or other Apple computer running Mac OS. In other embodiments the system may be a Unix system, a Windows system, or other system as is known.
In some implementations, the mobile device 2500 can implement multiple device functionalities, such as a telephony device, as indicated by a Phone object 2510; an e-mail device, as indicated by the Mail object 2512; a map device, as indicated by the Maps object 2514; a Wi-Fi base station device (not shown); and a network video transmission and display device, as indicated by the Web Video object 2516. In some implementations, particular display objects 2504, e.g., the Phone object 2510, the Mail object 2512, the Maps object 2514, and the Web Video object 2516, can be displayed in a menu bar 2518. In some implementations, device functionalities can be accessed from a top-level graphical user interface, such as the graphical user interface illustrated in
In some implementations, the mobile device 2500 can implement a network distribution functionality. For example, the functionality can enable the user to take the mobile device 2500 and provide access to its associated network while traveling. In particular, the mobile device 2500 can extend Internet access (e.g., Wi-Fi) to other wireless devices in the vicinity. For example, mobile device 2500 can be configured as a base station for one or more devices. As such, mobile device 2500 can grant or deny network access to other wireless devices.
In some implementations, upon invocation of a device functionality, the graphical user interface of the mobile device 2500 changes, or is augmented or replaced with another user interface or user interface elements, to facilitate user access to particular functions associated with the corresponding device functionality. For example, in response to a user touching the Phone object 2510, the graphical user interface of the touch-sensitive display 2502 may present display objects related to various phone functions; likewise, touching of the Mail object 2512 may cause the graphical user interface to present display objects related to various e-mail functions; touching the Maps object 2514 may cause the graphical user interface to present display objects related to various maps functions; and touching the Web Video object 2516 may cause the graphical user interface to present display objects related to various web video functions.
In some implementations, the top-level graphical user interface environment or state of
In some implementations, the top-level graphical user interface can include additional display objects 2506, such as a short messaging service (SMS) object 2530, a Calendar object 2532, a Photos object 2534, a Camera object 2536, a Calculator object 2538, a Stocks object 2540, an Address Book object 2542, a Media object 2544, a Web object 2546, a Video object 2548, a Settings object 2550, and a Notes object (not shown). Touching the SMS display object 2530 can, for example, invoke an SMS messaging environment and supporting functionality; likewise, each selection of a display object 2532, 2534, 2536, 2538, 2540, 2542, 2544, 2546, 2548, and 2550 can invoke a corresponding object environment and functionality.
Additional and/or different display objects can also be displayed in the graphical user interface of
In some implementations, the mobile device 2500 can include one or more input/output (I/O) devices and/or sensor devices. For example, a speaker 2560 and a microphone 2562 can be included to facilitate voice-enabled functionalities, such as phone and voice mail functions. In some implementations, an up/down button 2584 for volume control of the speaker 2560 and the microphone 2562 can be included. The mobile device 2500 can also include an on/off button 2582 for a ring indicator of incoming phone calls. In some implementations, a loud speaker 2564 can be included to facilitate hands-free voice functionalities, such as speaker phone functions. An audio jack 2566 can also be included for use of headphones and/or a microphone.
In some implementations, a proximity sensor 2568 can be included to facilitate the detection of the user positioning the mobile device 2500 proximate to the user's ear and, in response, to disengage the touch-sensitive display 2502 to prevent accidental function invocations. In some implementations, the touch-sensitive display 2502 can be turned off to conserve additional power when the mobile device 2500 is proximate to the user's ear.
Other sensors can also be used. For example, in some implementations, an ambient light sensor 2570 can be utilized to facilitate adjusting the brightness of the touch-sensitive display 2502. In some implementations, an accelerometer 2572 can be utilized to detect movement of the mobile device 2500, as indicated by the directional arrow 2574. Accordingly, display objects and/or media can be presented according to a detected orientation, e.g., portrait or landscape. In some implementations, the mobile device 2500 may include circuitry and sensors for supporting a location determining capability, such as that provided by the global positioning system (GPS) or other positioning systems (e.g., systems using Wi-Fi access points, television signals, cellular grids, Uniform Resource Locators (URLs)). In some implementations, a positioning system (e.g., a GPS receiver) can be integrated into the mobile device 2500 or provided as a separate device that can be coupled to the mobile device 2500 through an interface (e.g., port device 2590) to provide access to location-based services.
In some implementations, a port device 2590, e.g., a Universal Serial Bus (USB) port, or a docking port, or some other wired port connection, can be included. The port device 2590 can, for example, be utilized to establish a wired connection to other computing devices, such as other communication devices 2500, network access devices, a personal computer, a printer, a display screen, or other processing devices capable of receiving and/or transmitting data. In some implementations, the port device 2590 allows the mobile device 2500 to synchronize with a host device using one or more protocols, such as, for example, TCP/IP, HTTP, UDP or any other known protocol.
The mobile device 2500 can also include a camera lens and sensor 2580. In some implementations, the camera lens and sensor 2580 can be located on the back surface of the mobile device 2500. The camera can capture still images and/or video.
The mobile device 2500 can also include one or more wireless communication subsystems, such as an 802.11b/g communication device 2586, and/or a Bluetooth™ communication device 2588. Other communication protocols can also be supported, including other 802.x communication protocols (e.g., WiMax, Wi-Fi, 3G), code division multiple access (CDMA), global system for mobile communications (GSM), Enhanced Data GSM Environment (EDGE), etc.
In some implementations, each of one or more system objects of device 2500 has a set of system object attributes associated with it; and one of the attributes determines whether a display object for the system object will be rendered in the top-level graphical user interface. This attribute can be set by the system automatically, or by a user through certain programs or system functionalities as described below.
Sensors, devices, and subsystems can be coupled to the peripherals interface 3006 to facilitate multiple functionalities. For example, a motion sensor 3010, a light sensor 3012, and a proximity sensor 3014 can be coupled to the peripherals interface 3006 to facilitate the orientation, lighting, and proximity functions described with respect to
A camera subsystem 3020 and an optical sensor 3022, e.g., a charged coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.
Communication functions can be facilitated through one or more wireless communication subsystems 3024, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 3024 can depend on the communication network(s) over which the mobile device is intended to operate. For example, a mobile device can include communication subsystems 3024 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth™ network. In particular, the wireless communication subsystems 3024 may include hosting protocols such that the mobile device may be configured as a base station for other wireless devices.
An audio subsystem 3026 can be coupled to a speaker 3028 and a microphone 3030 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
The I/O subsystem 3040 can include a touch screen controller 3042 and/or other input controller(s) 3044. The touch-screen controller 3042 can be coupled to a touch screen 3046. The touch screen 3046 and touch screen controller 3042 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 3046.
The other input controller(s) 3044 can be coupled to other input/control devices 3048, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 3028 and/or the microphone 3030.
In one implementation, a pressing of the button for a first duration may disengage a lock of the touch screen 3046; and a pressing of the button for a second duration that is longer than the first duration may turn power to the mobile device on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 3046 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.
In some implementations, the mobile device can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the mobile device can include the functionality of an MP3 player, such as an iPod™. The mobile device may, therefore, include a 32-pin connector that is compatible with the iPod™. Other input/output and control devices can also be used.
The memory interface 3002 can be coupled to memory 3050. The memory 3050 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 3050 can store an operating system 3052, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system 3052 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 3052 can be a kernel (e.g., UNIX kernel).
The memory 3050 may also store communication instructions 3054 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 3050 may include graphical user interface instructions 3056 to facilitate graphic user interface processing; sensor processing instructions 3058 to facilitate sensor-related processing and functions; phone instructions 3060 to facilitate phone-related processes and functions; electronic messaging instructions 3062 to facilitate electronic-messaging related processes and functions; web browsing instructions 3064 to facilitate web browsing-related processes and functions; media processing instructions 3066 to facilitate media processing-related processes and functions; GPS/Navigation instructions 3068 to facilitate GPS and navigation-related processes and instructions; camera instructions 3070 to facilitate camera-related processes and functions; and/or other software instructions 3072 to facilitate other processes and functions, e.g., access control management functions as described in reference to
Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. The memory 3050 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
The disclosed and other embodiments and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. The disclosed and other embodiments can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus. The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal (e.g., a machine-generated electrical, optical, or electromagnetic signal), that is generated to encode information for transmission to suitable receiver apparatus.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, the disclosed embodiments can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), LCD (liquid crystal display) monitor, touch sensitive device or display, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
While this specification contains many specifics, these should not be construed as limitations on the scope of what is being claimed or of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
Although a few implementations have been described in detail above, other modifications are possible. For example, the flow diagrams depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flow diagrams, and other components may be added to, or removed from, the described systems. Accordingly, various modifications may be made to the disclosed implementations and still be within the scope of the following claims.
This application is a continuation of co-pending U.S. application Ser. No. 12/165,525 filed on Jun. 30, 2008 which claims priority to U.S. Provisional Application No. 61/033,775 filed on Mar. 4, 2008 and U.S. Provisional Application No. 61/034,129 filed on Mar. 5, 2008, the contents of both of which are incorporated by reference. This patent application is related to John Louch et al., U.S. patent application Ser. No. 11/145,577, Widget Authoring and Editing Environment, which is incorporated herein by reference. This patent application is related to Chris Rudolph et al., U.S. patent application Ser. No. 11/834,578, Web Widgets, which is incorporated herein by reference.
Provisional Applications:
Number | Date | Country
61/033,775 | Mar. 4, 2008 | US
61/034,129 | Mar. 5, 2008 | US

Parent Case Information:
Relation | Number | Date | Country
Parent | 12/165,525 | Jun. 30, 2008 | US
Child | 13/595,586 | | US