The present application is related to user interfaces, and more specifically to methods and systems that enable efficient user-software interaction.
Today's user interfaces for software applications have predominantly become graphical user interfaces. The graphical user interface (GUI) allows users to interact with electronic devices through graphical icons and visual indicators, instead of text-based user interfaces, typed command labels, or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces, which require commands to be typed on a computer keyboard. Actions in a GUI are usually performed through direct manipulation of the user interface elements. The proliferation of GUI elements such as menus, tabs, buttons, etc. within a single GUI has created user interfaces with elaborate sets of nested elements which expose the complexity of the underlying software, intimidate the user, and hinder the user's ability to discover even the simple functions needed to perform a task.
Disclosed here are systems and methods for enabling efficient user-software interaction and easy discoverability of the functionality of the software. In some embodiments, the depth of the nested user interface elements is limited to one, thus preventing the user from descending into nested menus having multiple sub-nested menus to find a single function. In other embodiments, there are no nested menus, and the majority of user interaction is performed through an action bar or a set of easily accessible user interface elements, such as buttons representing the most common commands available in the given state of the software application. The action bar can receive commands through typing or voice, and can also present the set of most common commands currently available in the software application.
These and other objects, features and characteristics of the present embodiments will become more apparent to those skilled in the art from a study of the following detailed description in conjunction with the appended claims and drawings, all of which form a part of this specification. While the accompanying drawings include illustrations of various embodiments, the drawings are not intended to limit the claimed subject matter.
Brief definitions of terms, abbreviations, and phrases used throughout this application are given below.
Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described that may be exhibited by some embodiments and not by others. Similarly, various requirements are described that may be requirements for some embodiments but not others.
Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof, means any connection or coupling, either direct or indirect, between two or more elements. The coupling or connection between the elements can be physical, logical, or a combination thereof. For example, two devices may be coupled directly, or via one or more intermediary channels or devices. As another example, devices may be coupled in such a way that information can be passed there between, while not sharing any physical connection with one another. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
If the specification states a component or feature “may,” “can,” “could,” or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
The term “module” refers broadly to software, hardware, or firmware components (or any combination thereof). Modules are typically functional components that can generate useful data or another output using specified input(s). A module may or may not be self-contained. An application program (also called an “application”) may include one or more modules, or a module may include one or more application programs.
The terminology used in the Detailed Description is intended to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with certain examples. The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. For convenience, certain terms may be highlighted, for example using capitalization, italics, and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that the same element can be described in more than one way.
Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein, but special significance is not to be placed upon whether or not a term is elaborated or discussed herein. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification, including examples of any terms discussed herein, is illustrative only and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.
Disclosed here are systems and methods for enabling efficient user-software interaction and easy discoverability of the functionality of the software. In some embodiments, the depth of the nested user interface elements is limited to one, thus preventing the user from descending into nested menus having multiple sub-nested menus to find a single function. In other embodiments, there are no nested menus, and the majority of user interaction is performed through an action bar or a set of easily accessible user interface elements, such as buttons representing the most common commands available in the given state of the software application. The action bar can receive commands through typing or voice, and can also present the set of most common commands currently available in the software application.
In every embodiment, the user interface 100 is persistently configured into at least three sections. In a first set of embodiments of the user interface 100, the three sections are always in the same place. In a second set of embodiments of the user interface 100, the three sections are substantially in the same place. In a third set of embodiments of the user interface 100, the three sections are substantially replaced by other user interface elements, while at least an indication of the three sections is always visible.
In some embodiments of the user interface 100, sections 110, 120, 130 contain no user interface elements, such as buttons, menus, tabs, etc., aside from a command line entry in section 120. In some other embodiments, sections 110, 120, 130 contain no user interface elements which are nested. In other words, there are no user interface elements which when activated present another user interface element to be activated. For example, there are no nested menus. In a third set of embodiments, an element 140 of the user interface 100 independent of the sections 110, 120, 130 can be nested to one level. In other words, the element 140, when activated, can present a set of user interface elements, which in turn, when activated do not provide a second level of user interface elements, but instead perform a command associated with the user interface element.
Section 110 is the informational section displaying information regarding a state of a computer such as advertisements, computational resource consumption by all the applications currently running on the computer, etc. Section 110 can also display information regarding the state of the software such as computational resource consumption by the software displaying the user interface 100, or by the user interface 100. Additionally, section 110 can display information regarding the state of a project viewed by the user, such as staffing of the project, how complete the project is, geographical area of the project, access permissions to the project, etc.
Section 110 can also display history of commands 112, 114, 116 entered by the user into the software, arranged in an order in which the history of commands 112, 114, 116 was entered by the user. Section 110 can include a user interface element 115 which allows the user to scroll through the history of commands. For example, the user interface element 115 can be an arrow or a scroll wheel, or any other kind of icon, allowing the user to go backward and forward in the history of commands.
Section 110 can optionally be capable of receiving a user input through the user interface element 115. In other words, in some embodiments of the user interface 100, section 110 can receive user inputs, while in other embodiments of the user interface 100, section 110 cannot receive user inputs. By contrast, section 120 is passive, i.e., not configured to receive any user inputs, in every embodiment of the user interface 100.
Section 130 is the only one of the three sections that can receive multiple inputs from the user in every embodiment of the user interface 100. Section 130 includes a user interface element 135, such as an action bar, to receive a typed or a spoken command from the user. In addition, section 130 can contain elements 132, 134 (only two labeled for brevity), such as buttons, which correspond to the most common commands. The most common commands can be the most common commands entered by the user, by a group of users similar to the user, or by all the users, within the given state of the software. The most common commands can be “new program”, “change program type”, “create a new project”, etc.
User interface element 150 when activated, by clicking or a voice command, can also display the most common commands associated with the given state of the software. User interface element 160, when activated, by clicking or a voice command, can import a file outside of the software application into the software application. The file can contain data that can later be analyzed by the software application.
In the build state 220, the user can create content within the software 200. For example, in the build state 220 the user can design a questionnaire to gather data regarding, for example, frequency of a particular disease in a particular area. In the capture state 230 the same user, or another user, can gather information to input into the software 200. For example, the user can collect answers to the questionnaire created in the build state 220.
In the analyze state 240, the user can examine the data associated with the user within the software 200. For example, the user can analyze the frequency of disease by season, by region, by socio-economic status, etc. In the share state 250, the user can share with others various aspects of the data associated with the user within the software 200. For example, the user can share the results of his analysis on Twitter, Facebook, Google docs, email, etc.
In the seek and receive assistance state 260, the user can submit requests for help to technical support associated with the software 200, or to a group of users associated with the software 200. In the manage settings state 270, the user can: define one or more projects associated with the user within the software 200; specify additional users associated with the project, and their roles; set the duration of the project, etc.
Each state 210, 220, 230, 240, 250, 260, 270 has a corresponding set of actions 215, 225, 235, 245, 255, 265, 275 that can be performed when the software 200 is in the corresponding state 210, 220, 230, 240, 250, 260, 270, respectively. Each action includes a command to be executed by the software 200, and an optional one or more parameters associated with the command.
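The correspondence between states and their available actions, each action pairing a command with optional parameters, can be illustrated with a short sketch. The state labels and command names below are hypothetical stand-ins, not part of the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """An action pairs a command with optional parameters."""
    command: str
    params: dict = field(default_factory=dict)

# Hypothetical mapping of software states to the actions available in each.
STATE_ACTIONS = {
    "navigate": [Action("open_project")],
    "build": [Action("new_questionnaire"), Action("add_question")],
    "capture": [Action("record_response")],
    "analyze": [Action("plot_frequency", {"group_by": "region"})],
    "share": [Action("export_results", {"target": "email"})],
    "assistance": [Action("request_help")],
    "settings": [Action("define_project")],
}

def available_actions(state):
    """Return the actions that can be performed in the given state."""
    return STATE_ACTIONS.get(state, [])
```

A state outside the mapping yields an empty action set, reflecting that no commands are offered outside a recognized state.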
User interface elements 310, 320 can be buttons as shown in
The user interface element 330 can be an action bar as shown in
The user interface element 350 can be an icon such as shown in
The software can determine the state of the software and multiple actions available in the given state of the software, as described in this application. Based on the multiple actions available, the processor determines the most common commands entered by the user or by multiple users. Based on the state of the software, the processor modifies the most common commands associated with the user interface elements 310, 320.
For example, commands associated with the buttons 310, 320 change depending on whether the software is in the navigate state 210, the build state 220, the capture state 230, the analyze state 240, the share state 250, the seek and receive assistance state 260, or the manage settings state 270, in
For example, if the user is an expert user of the software 200 in
In another example, if the user has not entered the threshold number of commands, the most common commands can be determined by aggregating the commands entered by the user along with the commands entered by other users similar to the user, or by all the other users.
The user interface 300 can receive an activation event at the user interface element 330. The activation event can be a beginning of an entry of an input whether by voice or typing, or can be a hover and/or a click of a cursor. In response to the activation event, the user interface 300 can enlarge the user interface element 330 to obtain the user interface element 340 displaying the most common commands among the multiple commands available in the state of the software. For example, the user typing “/” serves as an indication that the following text is a command to the software 200 in
The position of the three sections 110, 120, 130 in user interface 400 is substantially similar to the position of the three sections 110, 120, 130 in user interface 100 in
The user interface element 410 can contain multiple additional user interface elements, such as buttons, 420, 430 (only two labeled for brevity), which when activated, by for example a mouse click or a voice activation, perform a command within the software application. The buttons 420, 430 can correspond to the states of the software as explained in
The buttons 420, 430 are not nested, meaning that when they are activated, the buttons 420, 430 do not produce additional menus or buttons for the user to activate. In other words, the depth of the nested user interface element is limited to at most one nested user interface element. For example, if by clicking on user interface element 140 in
When the user activates the action bar 440, the extended action bar 450 can display the most common commands in the given state of the software as seen in
The user interface 500 contains three sections 510, 520, 530, which correspond to sections 110, 120, 130 in
Section 530 contains the action bar 540 which enables efficient interaction between the software and the user by allowing the user to enter typed commands or voice commands. For example, when the user provides a command to the action bar “/list my projects”, section 510 displays the entered command “my projects”, while section 520 displays the list of projects associated with the user.
Section 530 also enables easy discoverability of the user interface 500, by listing the most common commands associated with the given state of the software. For example, section 530 can provide a user interface element 550, which when activated, for example by clicking, provides the list of the most common commands to the user, by for example displaying the most common commands, or by providing the most common commands through audio.
When the user interface element 560 in section 510 is activated, the user interface element 570 in
The server 600 can include a database 630 storing various software available for download. Upon receiving a request to download a software, the server 600 can provide the software to the requesting device, such as device 610. The provided software can be software 200 in
The server 600 can run the portion 200B of the software 200. The portion 200B of the software 200 can receive a request from the device 610 to retrieve the data from the database 640, analyze the retrieved data, and provide a result of the analysis to the device 610. The device 610 can run a portion 200A of the software 200. Upon receiving the results of the analysis, the device 610 can display the results in section 110 in
For example, software 200A can create the user interface, respond to user interface events, such as displaying a nested menu, receive inputs from the server, and/or perform computationally inexpensive tasks such as sending a help request, sharing data with other users, etc. Software 200B can perform the computationally expensive tasks such as analyzing the received data, storing large amounts of data, performing natural language processing, determining the most common commands to display in the user interface, etc.
The software 200B running on the server 600 can receive from the device 610 a state of the software 200A, and multiple inputs from multiple users. For each state of the software 200, the software 200B can determine the most common commands, such as the top five most common commands, entered into the software 200A, based on all the commands entered by all the users of the software. The software 200B can also determine the most common commands based on the commands entered by the user, or by a group of users similar to the user. For example, when the user has interacted with the software 200A sufficiently to provide commands above a certain threshold, as described in this application, the software 200B can determine the most common commands based solely on the input provided by the user. Once the most common commands have been determined, the software 200B can provide the most common commands to the software 200A to present to the user.
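One possible sketch of this threshold logic, assuming a threshold value and a (state, command) history format that are purely illustrative:

```python
from collections import Counter

THRESHOLD = 50  # assumed minimum personal history before personalizing

def most_common_commands(state, user_history, all_history, top_n=5):
    """Pick the top commands for a state: use the user's own history once
    it crosses the threshold, otherwise aggregate it with all users'
    history. Histories are sequences of (state, command) pairs."""
    own = [cmd for s, cmd in user_history if s == state]
    if len(own) >= THRESHOLD:
        source = own
    else:
        source = own + [cmd for s, cmd in all_history if s == state]
    return [cmd for cmd, _ in Counter(source).most_common(top_n)]
```

A group-of-similar-users variant would simply substitute that group's history for `all_history` in the fallback branch.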
The server 650 performs almost all the computation associated with software 200. The device 660 does not need to download the software application 200A and instead can access the server 650 using a web browser. The device 660 receives multiple user inputs, and sends them to the server 650, which then processes the user inputs, and sends the responses back to the device 660.
In step 710, the processor eliminates a nested menu from the user interface. The nested menu includes a first element of the user interface configured to be selected and, upon being selected, to display a second element of the user interface configured to be selected. The second element and the first element are substantially similar, and can both be menu entries, tabs, buttons, cards, etc.
In step 720, the processor enables easy discovery of the user interface by informing the user of most common commands within the only one of the three sections that can receive multiple inputs from the user.
The processor can configure a first section of the three sections to display multiple commands entered by the user arranged in an order in which the multiple commands were entered by the user. Further, the processor can configure a second section of the three sections to display an output associated with a command entered into the only one of the three sections that can receive multiple inputs from the user. Finally, the processor can configure the only one of the three sections that can receive multiple inputs from the user to include a user interface element able to receive a typed or a spoken command.
The processor can provide, within the only one of the three sections that can receive multiple inputs from the user, multiple user interface elements corresponding to the most common commands. Based on a state of a software receiving multiple inputs from the user, the processor can modify the most common commands associated with multiple user interface elements. The most common commands can be entered by the user, by a group of users similar to the user, or by all the users.
In step 810, the processor limits a depth of a nested user interface element to at most one nested user interface element using a data structure and/or a function tracking a number of nested elements. The processor configures a first element of the user interface to display a second element of the user interface upon activation. The second element is substantially similar to the first element of the user interface. Upon activating the second element, no further user interface elements are displayed; instead, the software 200 performs the action specified by the second element. The activation of user interface elements can be a voice selection, a press, such as a mouse click or a finger touch, etc.
In step 820, the processor enables efficient use of the user interface and easy discovery of the user interface functionality by informing the user of most common commands within the section of the user interface that can consistently receive multiple inputs from the user. The most common commands can be displayed to the user or spoken to the user. When displayed, the most common commands can be buttons within the user interface, a list within the user interface, a drop-down menu, etc.
The processor can configure a first section of the three sections to display information regarding a state of a computer, a state of the software 200, a state of the project associated with the user, etc. The state of the computer can include computational resource consumption, while the state of the software 200 can include computational resource consumption by the software 200. The state of the project can include project staffing, how complete the project is, geographical area of the project, list of users who have a permission to view the project, etc. The first section can also display advertisements.
The processor can configure the first section of the three sections to display a history of commands entered by the user arranged in an order in which the commands were entered by the user. Further, the processor can enable browsing of the history of commands by displaying, within the first section, a user interface element configured to scroll through the history of commands when selected by the user. The user interface element can be an arrow or a wheel, or another kind of icon indicating browsing.
The processor can configure a second section of the three sections to display an output of a command entered into the only one of the three sections that can receive multiple inputs from the user. The output can be a comparison of two data sets, a list of received responses, an analysis of a data set, a graph of a data set over time, by response, by respondent, etc.
The processor can configure a third section of the three sections to be the only one of the three sections that can consistently receive multiple inputs from the user. The third section can include a user interface element to receive a typed or a spoken command, such as an action bar 135 in
The processor can provide within the third section multiple user interface elements corresponding to the most common commands. For example, the processor can determine a state of a software and the commands available in the state of the software. Based on the commands available in the state of the software, the processor can determine the most common commands entered by the user or by multiple users. Finally, based on the state of the software, the processor can modify the most common commands associated with the multiple user interface elements. The multiple user interface elements can be buttons, a list, a menu, etc.
In another example to determine the most common commands, the processor can determine a state of the software and the commands available in the state of the software. The processor receives an activation event at the user interface element, such as a beginning of an entry of an input among multiple inputs, or a hover of a cursor. The processor enlarges the user interface element to display the most common commands among multiple commands available in the state of the software.
The processor can display the second element 410 in
The processor can display the second element of the user interface 570 in
The variable 910 can be an integer variable counting the number of ancestors that the data structure 900 has. When the value of the variable exceeds 1, the software 200 in
To determine the number of ancestors that the data structures 930, 950 have, the data structure 930 can include a function 940 to determine a number of valid values contained in the variable 920. For example, the variable 920 associated with the data structure 930 contains a memory location of the data structure 950. The variable 920 associated with the data structure 950, representing user interface element 140, 560, contains an invalid memory location, such as a NULL memory location, indicating that the user interface element 140, 560 is not nested, and therefore has no parent. The function 940 can examine the ancestors of the data structure 930 by following the memory location 920, finding the data structure 950, increasing the ancestor counter by one, determining that the data structure 950 has no ancestors, and returning the value of the ancestor counter, in this case 1. When the number of ancestors exceeds 1, the software 200 in
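The parent-reference traversal attributed to function 940 can be sketched as follows, with Python's None standing in for the NULL memory location; the class and function names are illustrative assumptions:

```python
class UIElement:
    """Each element keeps a reference to its parent element; None stands
    in for the NULL memory location of an element that is not nested."""
    def __init__(self, parent=None):
        self.parent = parent

def count_ancestors(element):
    """Follow parent references until None, counting each hop
    (the role attributed to function 940)."""
    count, node = 0, element.parent
    while node is not None:
        count += 1
        node = node.parent
    return count

def enforce_depth_limit(element, limit=1):
    """Signal an error when the nesting depth exceeds the one-level limit."""
    if count_ancestors(element) > limit:
        raise ValueError("nested user interface depth exceeds limit")
```

An element with one ancestor passes the check, while an element two levels deep triggers the error, mirroring the case where the ancestor count exceeds 1.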
In the example of
The computer system 1000 can be the computer system of device 610, in
This disclosure contemplates the computer system 1000 taking any suitable physical form. As example and not by way of limitation, computer system 1000 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, or a combination of two or more of these. Where appropriate, computer system 1000 may include one or more computer systems 1000; be unitary or distributed; span multiple locations; span multiple machines; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 1000 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example and not by way of limitation, one or more computer systems 1000 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 1000 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
The processor may be, for example, a conventional microprocessor such as an Intel Pentium microprocessor or Motorola PowerPC microprocessor. One of skill in the relevant art will recognize that the terms “machine-readable (storage) medium” or “computer-readable (storage) medium” include any type of device that is accessible by the processor.
The memory is coupled to the processor by, for example, a bus. The memory can include, by way of example but not limitation, random access memory (RAM), such as dynamic RAM (DRAM) and static RAM (SRAM). The memory can be local, remote, or distributed.
The bus also couples the processor to the non-volatile memory and drive unit. The non-volatile memory is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a read-only memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software in the computer 1000. The non-volatile storage can be local, remote, or distributed. The non-volatile memory is optional because systems can be created with all applicable data available in memory. A typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor.
Software is typically stored in the non-volatile memory and/or the drive unit. Indeed, storing an entire large program in memory may not even be possible. Nevertheless, it should be understood that for software to run, if necessary, it is moved to a computer readable location appropriate for processing, and for illustrative purposes, that location is referred to as the memory in this paper. Even when software is moved to the memory for execution, the processor will typically make use of hardware registers to store values associated with the software, and local cache that, ideally, serves to speed up execution. As used herein, a software program is assumed to be stored at any known or convenient location (from non-volatile storage to hardware registers) when the software program is referred to as “implemented in a computer-readable medium.” A processor is considered to be “configured to execute a program” when at least one value associated with the program is stored in a register readable by the processor.
The bus also couples the processor to the network interface device. The interface can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of the computer system 1000. The interface can include an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g., “direct PC”), or other interfaces for coupling a computer system to other computer systems. The interface can include one or more input and/or output devices. The I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other input and/or output devices, including a display device. The display device can include, by way of example but not limitation, a cathode ray tube (CRT), liquid crystal display (LCD), or some other applicable known or convenient display device. For simplicity, it is assumed that controllers of any devices not depicted in the example of
In operation, the computer system 1000 can be controlled by operating system software that includes a file management system, such as a disk operating system. One example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Washington, and their associated file management systems. Another example of operating system software with its associated file management system software is the Linux™ operating system and its associated file management system. The file management system is typically stored in the non-volatile memory and/or drive unit and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the non-volatile memory and/or drive unit.
Some portions of the detailed description may be presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
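A small, hedged illustration (not drawn from the specification) of an “algorithm” in the sense used above: a self-consistent sequence of operations on data bits, in which quantities are stored, combined, compared, and otherwise manipulated. Here two numbers are added using only bitwise combination of their bits.

```python
def add_bitwise(a: int, b: int) -> int:
    """Add two non-negative integers using only bit operations."""
    while b:
        carry = a & b   # bit positions that carry into the next position
        a = a ^ b       # per-position sum of the bits, ignoring carries
        b = carry << 1  # shift the carries left and repeat until none remain
    return a

print(add_bitwise(19, 23))  # 42
```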
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or “generating” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the methods of some embodiments. The required structure for a variety of these systems will appear from the description below. In addition, the techniques are not described with reference to any particular programming language, and various embodiments may thus be implemented using a variety of programming languages.
In alternative embodiments, the machine may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a laptop computer, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, an iPhone, a Blackberry, a processor, a telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
While the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the terms “machine-readable medium” and “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The terms “machine-readable medium” and “machine-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies or modules of the presently disclosed technique and innovation.
In general, the routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or as a specific application, component, program, object, module or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more sets of instructions stored at various times in various memory and storage devices in a computer that, when read and executed by one or more processing units or processors in the computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.
Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include but are not limited to recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, and optical disks (e.g., Compact Disk Read-Only Memory (CD-ROMs), Digital Versatile Disks (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.
In some circumstances, operation of a memory device, such as a change in state from a binary one to a binary zero or vice-versa, for example, may comprise a transformation, such as a physical transformation. With particular types of memory devices, such a physical transformation may comprise a physical transformation of an article to a different state or thing. For example, but without limitation, for some types of memory devices, a change in state may involve an accumulation and storage of charge or a release of stored charge. Likewise, in other memory devices, a change of state may comprise a physical change or transformation in magnetic orientation or a physical change or transformation in molecular structure, such as from crystalline to amorphous or vice versa. The foregoing is not intended to be an exhaustive list of the ways in which a change in state from a binary one to a binary zero or vice-versa in a memory device may comprise a transformation, such as a physical transformation; rather, the foregoing is intended as illustrative examples.
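At the software level, the state change described above can be sketched as toggling a single stored bit. This is a hedged illustration only; the `bytearray` below is a hypothetical stand-in for a memory cell, and the underlying physical transformation (charge, magnetic orientation, molecular structure) is of course not modeled.

```python
# One "cell" holding a binary one in bit position 0.
memory = bytearray([0b00000001])

def toggle_bit(cells: bytearray, index: int, bit: int) -> None:
    """Flip the given bit of the given cell, changing its stored state."""
    cells[index] ^= (1 << bit)

toggle_bit(memory, 0, 0)
print(memory[0])  # 0  (the binary one became a binary zero)
toggle_bit(memory, 0, 0)
print(memory[0])  # 1  (and back again)
```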
A storage medium typically may be non-transitory or comprise a non-transitory device. In this context, a non-transitory storage medium may include a device that is tangible, meaning that the device has a concrete physical form, although the device may change its physical state. Thus, for example, non-transitory refers to a device remaining tangible despite this change in state.
The foregoing description of various embodiments of the claimed subject matter has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed. Many modifications and variations will be apparent to one skilled in the art. Embodiments were chosen and described in order to best describe the principles of the invention and its practical applications, thereby enabling others skilled in the relevant art to understand the claimed subject matter, the various embodiments, and the various modifications that are suited to the particular uses contemplated.
Although the above Detailed Description describes certain embodiments and the best mode contemplated, no matter how detailed the above appears in text, the embodiments can be practiced in many ways. Details of the systems and methods may vary considerably in their implementation details, while still being encompassed by the specification. As noted above, particular terminology used when describing certain features or aspects of various embodiments should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the invention with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification, unless those terms are explicitly defined herein. Accordingly, the actual scope of the invention encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the embodiments under the claims.
The language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this Detailed Description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of various embodiments is intended to be illustrative, but not limiting, of the scope of the embodiments, which is set forth in the following claims.
This application claims priority to the U.S. provisional patent application Ser. No. 62/599,446 filed Dec. 15, 2017, and the U.S. provisional patent application Ser. No. 62/582,403 filed Nov. 7, 2017, both of which are incorporated herein by this reference in their entirety.
Number | Date | Country
---|---|---
62599446 | Dec 2017 | US
62582403 | Nov 2017 | US