A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the software engine and its modules, as it appears in the Patent and Trademark Office Patent file or records, but otherwise reserves all copyright rights whatsoever.
The design generally relates to systems and methods for capturing accurate emotional responses from users of products and services.
Usability testing can be used to evaluate an existing or early-stage product or service by testing usability of the product or service on users for direct user input. Examples of products and services that can benefit from usability testing include local software applications, web applications, web sites and services provided therethrough, computer interfaces, consumer products, documents such as instructions for use of the consumer products, and the like. Identifying and correcting usability problems up front with usability testing can reduce costs of modifying software, modifying web sites and services provided therethrough, remanufacturing consumer products, updating documents, and the like. To date, there is a lack of a means for capturing accurate emotional responses from users in, for example, usability testing for identifying and correcting usability problems. Provided herein are systems and methods that address the foregoing.
Provided herein are systems and methods for a perception and feeling tool having multiple modules configured to allow participants to express perceptions and feelings at one or more moments during a study, for example, a usability-testing study. The perception and feeling tool can be further configured to record the participants' perceptions and feelings at the one or more moments during the study as collected data. A researcher can also interact with the perception and feeling tool in various ways. For example, the perception and feeling tool can be further configured to display results for the researcher for any one or more of the participants or an aggregate of all the participants for the perceptions and the feelings at the one or more moments during the study. A collection module can be configured to collect raw data including the participants' perceptions and feelings. One or more analytical modules can be configured to apply analytics to the raw data collected by the collection module. One or more servers or other computing devices can be configured to deliver user interfaces to the participants in the study and allow the participants to express the perceptions and the feelings at the one or more moments during the study.
In an embodiment, the perception and feeling reporting tool can be built and configured to use a combination of modules to allow participants to better express how they were feeling at a given moment during a study (e.g., usability-testing study) and apply analytics to the collected data. The perception and feeling reporting tool can also be constructed with a number of software modules cooperating with each other. The perception and feeling reporting tool can be built and configured to provide numerous features discussed herein.
The multiple drawings refer to example embodiments of the design, in which:
While the design is subject to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will herein be described in detail. The design should be understood to not be limited to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the design.
In the following description, numerous specific details are set forth, such as examples of specific perception and feeling reporting services, named components, connections, number of databases, etc., in order to provide a thorough understanding of the present design. It will be apparent, however, to one skilled in the art that the present design can be practiced without these specific details. In other instances, well-known components or methods have not been described in detail but rather in a block diagram in order to avoid unnecessarily obscuring the present design. Thus, the specific details set forth are merely exemplary. The specific details discussed in one embodiment can be reasonably implemented in another embodiment. The specific details can be varied from and still be contemplated to be within the spirit and scope of the present design.
Usability testing can be used to evaluate an existing or early-stage product or service by testing usability of the product or service on users for direct user input. Examples of products and services that can benefit from usability testing include local software applications, web applications, web sites and services provided therethrough, computer interfaces, consumer products, documents such as instructions for use of the consumer products, and the like. Identifying and correcting usability problems up front with usability testing can reduce costs of modifying software, modifying web sites and services provided therethrough, remanufacturing consumer products, updating documents, and the like. To date, there is a lack of a means for capturing accurate emotional responses from users in, for example, usability testing for identifying and correcting usability problems. Provided herein are systems and methods that address the foregoing.
For example, provided herein in some embodiments are systems and methods for a perception and feeling tool having multiple modules configured to allow participants to express perceptions and feelings at one or more moments during a study, for example, but not limited to, a usability-testing study. The perception and feeling tool can be further configured to record the participants' perceptions and feelings at the one or more moments during the study as collected data. A researcher can also interact with the perception and feeling tool in various ways. For example, the perception and feeling tool can be further configured to display results for the researcher for any one or more of the participants or an aggregate of all the participants for the perceptions and the feelings at the one or more moments during the study. A collection module can be configured to collect raw data including the participants' perceptions and feelings. One or more analytical modules can be configured to apply analytics to the raw data collected by the collection module. One or more servers or other computing devices can be configured to deliver user interfaces to the participants in the study and allow the participants to express the perceptions and the feelings at the one or more moments during the study.
Examples of products and services that can benefit from usability testing include, but are not limited to, local software applications, web applications, web sites and services provided therethrough, computer interfaces, consumer products, documents such as instructions for use of the consumer products, and the like. Usability-testing studies for any of the foregoing can include usability testing at one or more so-called moments. For example, a usability-testing study for a ticket-vending web site and services provided therethrough can, as shown in
Again, systems and methods provided herein are not limited to such usability testing. For example, systems and methods provided herein can simply prompt a participant a number of times per day in everyday life to track and quantify emotion for a variety of other purposes than usability testing. For example, instead of the moments being selected and provided by a researcher, the moments could be intervallic such as once every day or once every 3 hours (e.g., 9:00 AM, noon, 3 PM, etc.), and the participant could be prompted to describe the perceptions and feelings he or she is experiencing at those moments. Medicine including mental health services and pharmaceutical testing/clinical trials can benefit from such tracking and quantification of emotions. Such tracking and quantification might be considered a diary study in some embodiments.
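As a minimal sketch of such intervallic prompting (the function name and the three-hour spacing shown are illustrative assumptions, not part of any defined interface), the prompt moments could be generated as follows:

```python
from datetime import datetime, timedelta

def prompt_moments(start: datetime, interval_hours: int, count: int) -> list:
    """Generate evenly spaced moments at which a participant is prompted to
    describe the perceptions and feelings he or she is experiencing."""
    return [start + timedelta(hours=interval_hours * i) for i in range(count)]

# Example: prompts at 9:00 AM, noon, and 3:00 PM on a given day.
for moment in prompt_moments(datetime(2016, 2, 9, 9, 0), interval_hours=3, count=3):
    print(moment.strftime("%I:%M %p"))
```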
In general, the perception and feeling reporting tool can be configured to digitize a chart such as digitized chart 300A in
Example processes for and apparatuses to provide the perception and feeling reporting tool are described. The following drawings and text describe various example implementations of the design.
The perception and feeling reporting tool can use one or more servers or other computing devices to deliver user interfaces to participants in a study on a client device. The perception and feeling reporting tool can be implemented on a tablet computing device. A server can be backed by a database that can cooperate with one or more client computing devices, such as the tablet, to store questions and participant responses during the study.
The perception and feeling reporting tool can also be built and configured to use a combination of modules to allow participants to better express how they were feeling at a given moment during a study and apply analytics to the collected data. The perception and feeling reporting tool can be constructed with a number of software modules cooperating with each other.
In an embodiment, the perception and feeling reporting tool can be customized to measure and record the emotional feelings of one or more participants in a study. A first module can be configured to solicit and obtain each participant for the study by presenting a series of user interfaces. A second module of the perception and feeling reporting tool can be configured to digitize a chart or similar diagram to allow participants to better express feelings at a given moment during a study and record the participants' feelings at multiple discrete points during the study. The perception and feeling reporting tool can be built and configured as a Software as a Service tool that allows researchers to track the self-reported emotional states of a person through an experience. Some of the user interfaces can be configured to present questions, topics, images, or videos at a point in a study, and then a second user interface can be configured to present a numeric value-based digital plot of possible emotions for the participant to express and record their feelings/emotions at that point in the study. A third software module can be configured to collect the feelings/emotions expressed and recorded by the participants, as well as apply analytical algorithms to perform various functions, such as aggregation, on those recorded responses for all of the participants. The third software module can be further configured to apply linear or weighted algorithms to determine how a group of participants were feeling at a given moment during a study.
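For illustration only, the cooperation between the collection and analytical modules described above might be sketched as follows; the class and method names are hypothetical, and only a simple linear aggregation is shown:

```python
from collections import defaultdict
from statistics import mean

class CollectionModule:
    """Collects raw (participant, moment, emotion value) records during a study."""
    def __init__(self):
        self.raw = []

    def record(self, participant: str, moment: str, value: float) -> None:
        self.raw.append((participant, moment, value))

class AnalyticsModule:
    """Applies a simple linear aggregation to the raw data collected above."""
    def aggregate_by_moment(self, raw) -> dict:
        by_moment = defaultdict(list)
        for _participant, moment, value in raw:
            by_moment[moment].append(value)
        return {moment: mean(values) for moment, values in by_moment.items()}

# Example: two participants report emotion values at the same moment in the study.
collector = CollectionModule()
collector.record("P1", "checkout", 3.2)
collector.record("P2", "checkout", 2.2)
print(AnalyticsModule().aggregate_by_moment(collector.raw))  # {'checkout': 2.7}
```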
A general flow 200A through the perception and feeling reporting tool can be as follows:
As shown in the general flow 200A of
A participant can complete answering the questions on the digitized emotional chart. The perception and feeling reporting tool can be configured to prompt to check whether there are additional participants, for example, by asking for an affirmative “Yes” or “No.” The perception and feeling reporting tool can be configured to record the responses from the participants. The perception and feeling reporting tool can be configured to quantify the emotions in the responses. The perception and feeling reporting tool can be configured to organize and display the quantified emotions, so that the researcher can review the results from the participants' mood answers. (See
A possible flow 200B for the participant to self-report their mood based on questions in the perception and feeling reporting tool, with the digitized chart or diagram to capture a participant's emotion at multiple times during the study, can be as follows:
As shown in the flow 200B of
A digitized chart or Moodboard as shown in
One can treat this table as a spreadsheet with table entries at specific coordinates. For example, the top-left entry, 3.2, is at row 0, column 0, with coordinates (0, 0). The entry 2.2 has coordinates (0, 1).
The following is some pseudocode to calculate the emotion results:
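As a minimal sketch of such a calculation (the matrix excerpt is hypothetical apart from the 3.2 and 2.2 entries noted above, and the function name is illustrative), assuming each participant's selection has already been mapped to (row, column) coordinates:

```python
from statistics import mean
from typing import Dict, List, Tuple

# Hypothetical excerpt of the digitized emotion matrix; only the (0, 0) = 3.2 and
# (0, 1) = 2.2 entries come from the description above, the others are placeholders.
EMOTION_MATRIX = [
    [3.2, 2.2],
    [1.4, 4.1],
]

def emotion_results(selections: Dict[str, List[Tuple[int, int]]]) -> Dict[str, float]:
    """For each question, average the matrix values at the (row, column)
    coordinates selected by the participants."""
    return {
        question: mean(EMOTION_MATRIX[row][col] for row, col in coords)
        for question, coords in selections.items()
    }

# Example: two participants answer question "Q1" by selecting (0, 0) and (0, 1).
print(emotion_results({"Q1": [(0, 0), (0, 1)]}))  # {'Q1': 2.7}
```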
The result can be a list of questions with corresponding average emotional values for each question with corresponding color coding, as shown in the following examples in Table 1:
The analytics module can have a first algorithm that does simple linear averaging for matrix values and uses an algorithm to map a participant's selection to the matrix value. In addition, a second algorithm can do a weighted averaging for the matrix value. The second algorithm can weight what the participants said and then provide an overall “mood.”
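As a hedged sketch of the weighted approach (treating each participant's reported intensity as the weight is an illustrative assumption; the tool's actual weighting may differ):

```python
def weighted_mood(values, weights):
    """Weighted average of matrix values, e.g., weighting each participant's
    selection by the intensity he or she reported."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Example: the second participant reported a stronger intensity, so the
# overall "mood" is pulled toward that participant's matrix value.
print(weighted_mood([3.2, 2.2], [1.0, 3.0]))  # 2.45
```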
The tool can allow for multiple sessions with participants and aggregate or compare the multiple sessions. For example, a participant can answer a same set of questions during two different time periods. The answers from the two time periods can be visualized on the same chart. The same can be done for comparing responses from different participants.
As shown in
Not only do
To configure a study to utilize primary and secondary emotions, a researcher can select moments as described herein at which to obtain participants' reactions, for example, during use of a product or service. At each of these moments, a participant is prompted to provide—through interfaces such as those provided in
The list of emotions can include any of a number of possible emotion words. For example, the textual list of emotion words provided in
The weight of primary and secondary emotions can be determined using one or more analytical modules having algorithms provided herein that include the intensity of the primary and secondary emotions such as that provided by participants using the emotional intensity slider of
The emotional data points that can be collected as raw data by the collection module for each participant for each moment in a study can include the following:
From the foregoing emotional raw data, the following outcomes can be derived by the one or more analytical modules: Primary Emotion Score (“PES”); Secondary Emotion Score (“SES”); and Holistic Emotions Quotient (“HEQ”). In addition, an Emotion Salience Ranking (“ESR”) can be derived in some embodiments.
In some embodiments, the algorithms for the PES can include the following:
With respect to the foregoing embodiment of the PES, if a participant in a study picks a primary emotion from a textual or graphical list of emotions and the defined emotional valence p of the primary emotion is between 4.0 and 5.0 (not inclusive), then the PES of the participant for the corresponding moment in the study is the defined emotional valence p without any added or subtracted weight from the emotional intensity q (e.g., from the emotional intensity slider). If the participant in the study picks a primary emotion from the textual or graphical list of emotions and the defined emotional valence p of the primary emotion is less than or equal to 4.0, then the PES of the participant for the corresponding moment in the study is downwardly weighted by subtraction of the product of the emotional intensity q and a coefficient (i.e., the product of q×0.1) from the defined emotional valence p. If the participant in the study picks a primary emotion from the textual or graphical list of emotions and the defined emotional valence p of the primary emotion is greater than or equal to 5.0, then the PES of the participant for the corresponding moment in the study is upwardly weighted by addition of the product of the emotional intensity q and a coefficient (i.e., the product of q×0.1) to the defined emotional valence p. The foregoing algorithms for the PES spread out PES values to facilitate better visualization in a plot of the PES values.
In some embodiments, the algorithms for the SES can include the following:
With respect to the foregoing embodiment of the SES, if a participant in a study picks an optional secondary emotion from a textual or graphical list of emotions and the defined emotional valence s of the secondary emotion is between 4.0 and 5.0 (not inclusive), then the SES of the participant for the corresponding moment in the study is the defined emotional valence s without any added or subtracted weight from the emotional intensity t (e.g., from the emotional intensity slider). If the participant in the study picks a secondary emotion from the textual or graphical list of emotions and the defined emotional valence s of the secondary emotion is less than or equal to 4.0, then the SES of the participant for the corresponding moment in the study is downwardly weighted by subtraction of the product of the emotional intensity t and a coefficient (i.e., the product of t×0.1) from the defined emotional valence s. If the participant in the study picks a secondary emotion from the textual or graphical list of emotions and the defined emotional valence s of the secondary emotion is greater than or equal to 5.0, then the SES of the participant for the corresponding moment in the study is upwardly weighted by addition of the product of the emotional intensity t and a coefficient (i.e., the product of t×0.1) to the defined emotional valence s. The foregoing algorithms for the SES spread out SES values to facilitate better visualization in a plot of the SES values.
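Because the SES follows the same piecewise rule as the PES (with s and t in place of p and q), a single helper can compute either score. The following is a sketch only, with a hypothetical function name:

```python
def emotion_score(valence: float, intensity: float) -> float:
    """Piecewise score used for both the PES and the SES: the defined emotional
    valence is weighted down or up by 0.1 x intensity at the low and high ends,
    and left unchanged in the middle band between 4.0 and 5.0 (not inclusive)."""
    if valence <= 4.0:
        return valence - 0.1 * intensity
    if valence >= 5.0:
        return valence + 0.1 * intensity
    return valence

# Example: a primary emotion with valence 6.0 and intensity 3 scores 6.3;
# a secondary emotion with valence 3.0 and intensity 2 scores 2.8.
print(emotion_score(6.0, 3))  # 6.3
print(emotion_score(3.0, 2))  # 2.8
```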
The interface 600A and the chart thereof (as well as other such charts provided herein) can display an “emotional journey” of one or more participants using an aggregate of the emotion selections for each question. As shown in
In some embodiments, the algorithms for the Holistic Emotions Quotient (HEQ) can include the following:
With respect to the foregoing embodiment of the HEQ, if a participant in a study picks a primary emotion from a textual or graphical list of emotions without picking a secondary emotion, then the HEQ of the participant for the corresponding moment in the study is the PES. If the participant in the study picks a primary emotion and a secondary emotion from the textual or graphical list of emotions, and if the participant provides equal emotional intensities q and t for the primary emotion and the secondary emotion, respectively, then the HEQ is weighted more toward the PES (70%) than the SES (30%). This takes into account the fact that the participant identified the primary emotion first as the dominant emotion. If the participant in the study picks a primary emotion and a secondary emotion from the textual or graphical list of emotions, and if the participant provides a greater emotional intensity q for the primary emotion and a lesser emotional intensity t for the secondary emotion, then the HEQ is weighted more toward the PES (80%) than the SES (20%). The HEQ weighted in this way takes into account the fact that the participant strongly identified with the primary emotion for the corresponding moment in the study. If the participant in the study picks a primary emotion and a secondary emotion from the textual or graphical list of emotions, and if the participant provides a greater emotional intensity t for the secondary emotion and a lesser emotional intensity q for the primary emotion, then the HEQ is weighted more toward the SES (40%) than in other scenarios. The HEQ weighted in this way takes into account the fact that the participant strongly identified with the secondary emotion for the corresponding moment in the study.
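A sketch of the HEQ weighting described above follows; the 60/40 split in the last case is inferred from the stated 40% SES weight, and the function and parameter names are illustrative:

```python
from typing import Optional

def holistic_emotions_quotient(pes: float, q: float,
                               ses: Optional[float] = None,
                               t: Optional[float] = None) -> float:
    """Combine the PES and the optional SES using the intensity-dependent
    weights described above (q and t are the primary and secondary intensities)."""
    if ses is None:                   # no secondary emotion was picked
        return pes
    if q == t:                        # equal intensities: 70% PES, 30% SES
        return 0.7 * pes + 0.3 * ses
    if q > t:                         # primary felt more strongly: 80% / 20%
        return 0.8 * pes + 0.2 * ses
    return 0.6 * pes + 0.4 * ses      # secondary felt more strongly: 60% / 40%

# Example: equal intensities weight the HEQ 70/30 toward the primary emotion,
# i.e., 0.7 * 6.3 + 0.3 * 2.8 = 5.25 (subject to floating-point rounding).
print(holistic_emotions_quotient(pes=6.3, q=3, ses=2.8, t=3))
```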
In some embodiments, the algorithms for the ESR can include the following:
The analytics module can have a first algorithm that does simple averaging for the matrix value and uses an algorithm to map the slider to the matrix value. The perception and feeling reporting tool can be configured to quantify the emotion responses from the one or more participants. The perception and feeling reporting tool can be configured to save the results to a database. The perception and feeling reporting tool can be configured to convey questions, videos, images, or other information to the participant until all questions are answered. If additional questions remain, then the perception and feeling reporting tool can be configured to repeat the process. The perception and feeling reporting tool can be configured to display to the participant the additional questions, videos, images, or other information and to clear the digitized interface about to be presented of any previous responses. When finished with all questions, the perception and feeling reporting tool can be configured to then display “Thank you for your participation.” The perception and feeling reporting tool can be configured to allow a plurality of participants in a study to utilize this tool to do self-reporting as a viable method for emotion capturing. Participants can conduct the study in a group or as individuals in separate locations.
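As a minimal sketch of that per-question loop (the function and parameter names are hypothetical, and the user interface and database layers are abstracted into callables):

```python
def run_session(questions, present, capture, save):
    """Convey each question or stimulus, capture the participant's emotion
    response on the digitized chart, quantify and save it, and close with a
    thank-you message once all questions are answered."""
    for question in questions:
        present(question)         # show the question, video, image, or other info
        response = capture()      # participant marks the cleared digitized chart
        save(question, response)  # quantify and store the response in the database
    print("Thank you for your participation.")

# Example with trivial stand-ins for the user interface and database layers.
run_session(
    ["How did the checkout step feel?"],
    present=lambda q: print(q),
    capture=lambda: 2.7,
    save=lambda q, r: None,
)
```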
An overall emotional ranking, or value, can be determined through an algorithm that analyzes i) a subset or ii) the entirety of the inputted values from the participants. The tool can display subsets of the data as well in the form of categories, for example, where several values are aggregated under one description and presented as a single value. Also, multiple languages can be supported for the same input. Users can express themselves using different languages, but all of the values will be associated with the same calculation. The tool can display its output on the device it is running on, such as a tablet, desktop, or smart phone, as well as display its output on another one of these types of devices.
The module can be programmed to capture analytics such as the following: 1) How do these feelings change over the duration of the study? 2) What factors seem to statistically influence the group of participants? 3) How do demographic characteristics influence a particular moment or an emotional journey? The perception and feeling tool can be configured to collect demographic data in a demographics database to enable a researcher to determine such influencing factors. The module can also be configured to conform with different cultural interpretations and associations, optionally in accordance with the demographic data in the demographics database or from researcher-determined influencing factors. The perception and feeling reporting tool can also be programmed to give the ability to create and save multiple projects, label each question/step for perception and feeling reporting, define how many times each participant would be asked to chart, and stop participants from being able to see their previous answers. The emotion tool can also be programmed to progress through a study non-linearly, allowing participants to skip an entire step and/or individual questions in the process. If the study is running out of time to do a task or has to skip a task, the perception and feeling reporting tool allows participants to skip a step and/or questions. This prevents the reporting of a neutral emotion or non-emotion and spares the researcher from manually removing that data during analysis.
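For example, a minimal sketch of grouping responses by a demographic characteristic to look for influencing factors (the record field names are hypothetical):

```python
from collections import defaultdict
from statistics import mean

def mean_emotion_by_group(responses, group_key):
    """Average emotion value per demographic group, e.g., to compare how an
    emotional journey differs across age ranges stored in a demographics database."""
    groups = defaultdict(list)
    for response in responses:
        groups[response[group_key]].append(response["value"])
    return {group: mean(values) for group, values in groups.items()}

# Example records joining a response with its participant's demographic data.
responses = [
    {"participant": "P1", "age_range": "18-34", "value": 6.3},
    {"participant": "P2", "age_range": "35-54", "value": 2.8},
]
print(mean_emotion_by_group(responses, "age_range"))
```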
Moodboard questions need not be phrased as questions. For example, a researcher can query participants by entering “interaction moments” spanning an experience with a product, such as a video game. In such a case, the researcher can ask participants to select their mood during one or more of the following moments: 1) looking at the game console package; 2) opening the package; 3) setting up the console; or 4) playing the first game.
Subsequent to a researcher configuring a study with moments and/or questions and participants, the researcher can conduct the study on one or more of the participants. The researcher can be a moderator with a digitized chart such as that shown in
The perception and feeling reporting tool can be implemented on a tablet, smart phone, or other mobile device with a mobile version of the emotional chart. The perception and feeling reporting tool has features including the ability to create and save multiple projects, label each question/step for perception and feeling reporting, define how many times each participant would be asked to chart, and stop participants from being able to see their previous answers. The perception and feeling reporting tool has been configured to capture participant ratings and feedback at multiple times during the study to capture nuances that occurred during the study, such as when the emotion data shows which steps of the study process or parts of the product elicited frustration or delight.
The perception and feeling reporting tool can be a SaaS tool that allows researchers to track the self-reported emotional states of a person through an experience. The perception and feeling reporting tool gathers data about the emotional journey of a user experience or customer experience with a product, service, or other thing. The perception and feeling reporting tool captures the emotional experience of a product or other focus of the study, with an emphasis on an agile self-reporting method. The perception and feeling reporting tool can be programmed for product and market research, but it could also be applied to many other situations, such as pharmaceuticals, where a patient is considering taking a new drug as part of a clinical study, etc. The labels on the digitized matrix would be tailored to that other situation, such as an amount of comfort/discomfort or pleasure/pain felt versus the level of emotions as currently indicated in an embodiment of the digitized matrix. The labels used in the illustration, such as angry, frustrated, etc., can be substituted with either alternative terms or graphical presentations of ideas. The tool can incorporate or combine with biometric methods, such as facial recognition, to provide greater resolution or greater accuracy for those tools. The digitized chart or diagram acts as a digital input device to capture the perceptions and feelings of the participant. However, other forms of the digital input device are possible.
The perception and feeling reporting tool can have other documents, or media, attached or associated with the input. For example, a video could be automatically or manually attached to the self-reporting of an emotion. The input can have a temporal component. The input can be assigned a time stamp, and the algorithm may take into account the time the input was made in its calculation. The time can also be factored into the reporting, including the possibility of creating animated presentations based on the input and associated media.
In view of the foregoing, provided herein in some embodiments are methods that include providing a perception and feeling tool having multiple modules configured to present a series of user interfaces. The series of user interfaces can be configured for allowing participants to express perceptions and feelings at multiple discrete points during a study in order to track the perceptions and feelings of each participant throughout the study. The series of user interfaces can be configured for recording the participants' perceptions and feelings at the multiple discrete points during the study as collected data. The series of user interfaces can be configured for displaying results for i) any one or more of the participants as well as ii) an aggregate of all the participants for the perceptions and the feelings at the multiple discrete points during the study to visually see the participants' perceptions and feelings at the multiple discrete points during the study. The methods can further include collecting raw data including the participants' perceptions and feelings from the user interfaces with a collection module configured for the collecting. In addition, the methods can further include applying analytics by one or more analytical modules to the raw data collected by the collection module in order to apply any of a linear algorithm, a weighted algorithm, or a combination of the linear algorithm and the weighted algorithm. Applying the analytics can be for determining any of 1) how the participants were feeling at the multiple discrete points during the study; 2) what the participants were perceiving at the multiple discrete points during the study; or 3) any combination thereof. Further in addition, the methods can include soliciting each participant of the study using a solicitation module configured for the soliciting, presenting the series of the user interfaces to the participants in the study, and allowing the participants to express the perceptions and feelings at the multiple discrete points during the study. One or more modules of the perception and feeling tool can be configured to cooperate with one or more processors on a computing device to execute instructions for any software portions of the one or more modules.
In such embodiments, the methods can further comprise displaying the results in a digitized chart or a similar diagram and reporting the results for i) the any one or more of the participants or ii) the aggregate of all the participants. The perception and feeling tool can be further configured as a reporting tool configured for the displaying and reporting.
In such embodiments, the methods can further comprise presenting with the solicitation module a first user interface including a digitized chart that has a set of at least 5 words conveying the perceptions and feelings along with any of a numeric indicator, a sliding scale, or a color to indicate an intensity for a selectable perception or feeling that a first participant is experiencing at a first discrete point during the study.
In such embodiments, the methods can further comprise receiving by the solicitation module one or more of the user interfaces of the series of user interfaces. The solicitation module can be resident in a client device and can be configured to cooperate with one or more servers or other computing devices for the receiving.
In such embodiments, the methods can further comprise aggregating participants' perceptions and feelings for all of the participants in the study, wherein the collection module is configured for the aggregating.
In such embodiments, the methods can further comprise presenting the user interfaces and collecting the raw data with the collection module. The perception and feeling tool can include the collection module configured for the collecting, and the raw data can be from participants selecting colors in matrices presented in the user interfaces, numeric values in the matrices, on-screen buttons corresponding to certain perceptions and feelings in the series of user interfaces, intensities for the certain perceptions and feelings, pictures, photos, videos, sounds, music, or a combination thereof. In such embodiments, the colors and the numeric values correspond to a visual representation, a quantifiable representation, or a visual and quantifiable representation of the participants' perceptions and feelings.
In such embodiments, the methods can further comprise formatting and storing 1) the raw data and 2) results from application of the one or more algorithms to quantify the perceptions and feelings from the participants in an exportable format, wherein the collection module is configured for the formatting and storing.
In such embodiments, the methods can further comprise allowing participants through the perception and feeling tool to express at least two perceptions and feelings at each of the multiple discrete points during the study in order to track primary and secondary perceptions and feelings of each participant throughout the study. In addition, the methods can further comprise allowing participants through the perception and feeling tool to express emotional intensities for the at least two perceptions and feelings at each of the multiple discrete points during the study.
In such embodiments, the methods can further comprise determining with the one or more analytical modules a primary emotion score (“PES”) for each primary perception and feeling and its emotional intensity, a secondary emotion score (“SES”) for each secondary perception and feeling and its emotional intensity, and a holistic emotions quotient for the PES and SES weighted in accordance with the emotional intensity of the PES and the emotional intensity of the SES.
With reference to
Computing system 810 typically includes a variety of computing machine-readable media. Computing machine-readable media can be any available media that can be accessed by computing system 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, uses of computing machine-readable media include storage of information, such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other tangible medium which can be used to store the desired information and which can be accessed by computing device 800. However, carrier waves would not fall into a computer-readable medium. Communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computing system 810, such as during start-up, can typically be stored in ROM 831. RAM 832 typically contains data and/or program modules that can be immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation,
The computing system 810 can also include other removable/non-removable volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media discussed above and illustrated in
A user can enter commands and information into the computing system 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball, or touch pad. The microphone 863 can cooperate with speech recognition software. These and other input devices can often be connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but can be connected by other interface and bus structures, such as a parallel port, game port, or a universal serial bus (USB). A display monitor 891 or other type of display screen device can also be connected to the system bus 821 via an interface, such as a video interface 890. In addition to the monitor, computing devices can also include other peripheral output devices such as speakers 897 and other output device 896, which can be connected through an output peripheral interface 890.
The computing system 810 can operate in a networked environment using logical connections to one or more remote computers/client devices, such as a remote computing device 880. The remote computing device 880 can be a personal computer, a hand-held device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computing system 810. The logical connections depicted in
When used in a LAN networking environment, the computing system 810 can be connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computing system 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. The modem 872, which can be internal or external, can be connected to the system bus 821 via the user-input interface 860, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computing system 810, or portions thereof, can be stored in the remote memory storage device. By way of example, and not limitation,
As discussed, the computing system can include a processor, a memory, a built-in battery to power the computing device, an AC power input, potentially a built-in video camera, a display screen, and built-in Wi-Fi circuitry to wirelessly communicate with a remote computing device connected to a network.
It should be noted that the present design can be carried out on a computing system such as that described with respect to
Another device that can be coupled to bus 811 is a power supply such as a battery and Alternating Current adapter circuit. As discussed above, the DC power supply can be a battery, a fuel cell, or similar DC power source that needs to be recharged on a periodic basis. The wireless communication module 872 can employ a Wireless Application Protocol to establish a wireless communication channel. The wireless communication module 872 can implement a wireless networking standard such as Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard, IEEE std. 802.11-1999, published by IEEE in 1999.
Examples of mobile computing devices can be a laptop computer, a cell phone, a personal digital assistant, or other similar device with on-board processing power and wireless communications ability powered by a Direct Current (DC) power source that supplies DC voltage to the mobile device and that is solely within the mobile computing device and needs to be recharged on a periodic basis, such as a fuel cell or a battery.
As discussed,
In an embodiment, the perception and feeling reporting tool can be hosted on a cloud-based provider site that contains one or more servers 204A and one or more databases 206A.
A cloud provider service can install and operate application software in the cloud, and users can access the software service from the client devices. Cloud users who have a site in the cloud cannot solely manage the cloud infrastructure and platform where the application runs. Thus, the servers and databases can be shared hardware where the user can be given a certain amount of dedicated use of these resources. The user can be given a virtual amount of dedicated space and bandwidth in a so-called cloud. Cloud applications can be different from other applications in their scalability, which can be achieved by cloning tasks onto multiple virtual machines at run-time to meet changing work demand. Load balancers distribute the work over the set of virtual machines. This process can be transparent to the cloud user, who sees only a single access point.
The cloud-based perception and feeling reporting tool can be coded to utilize a protocol, such as Hypertext Transfer Protocol (HTTP), to engage in a request and response cycle with both a mobile device application resident on a client device as well as a web-browser application resident on the client device. The cloud-based perception and feeling reporting tool can be accessed by a mobile device, a desktop, a tablet device, and other similar devices, anytime, anywhere. Thus, the cloud-based perception and feeling reporting tool hosted on a cloud-based provider site can be coded to engage in 1) the request and response cycle from all web browser based applications, 2) SMS/twitter based request and response message exchanges, 3) the request and response cycle from a dedicated on-line server, 4) the request and response cycle directly between a native mobile application resident on a client device and the cloud-based perception and feeling reporting tool, and 5) combinations of these.
The cloud-based perception and feeling reporting tool has one or more application programming interfaces (APIs) with two or more of the customer sites as well as application programming interfaces with search engines and social on-line sites, etc. The APIs can be a published standard for the connection to each site for the access/connectivity system. The APIs can also be an open source API. One or more of the APIs can be customized to closed/non-published APIs of a remote access/connectivity site. The cloud-based perception and feeling reporting tool can be coded to establish a secure communication link between each customer entity site and the cloud provider site. The software service can be coded to establish the secure communication link by creating a tunnel at the socket layer and encrypting any data while in transit between each customer entity site and the provider site, as well as to satisfy any additional authentication mechanisms required by that site, including but not limited to IP address white listing and token-based authentication.
In an embodiment, the server computing system 204 can include a server engine, a web page management component, a content management component and a database management component. The server engine performs basic processing and operating system level tasks. The web page management component handles creation and display or routing of web pages or screens associated with receiving and providing digital content and digital advertisements. Users can access the server-computing device by means of a URL associated therewith. The content management component handles most of the functions in the embodiments described herein. The database management component includes storage and retrieval tasks with respect to the database, queries to the database, and storage of data.
An embodiment of a server computing system can be configured to display information, such as a web page, etc. An application including any program modules, when executed on the server computing system 204A, causes the server computing system 204A to display windows and user interface screens on a portion of a media space, such as a web page. A user via a browser from the client computing system 202A can interact with the web page, and then supply input to the query/fields and/or service presented by a user interface of the application. The web page can be served by a web server computing system 204A on any Hypertext Markup Language (HTML) or Wireless Access Protocol (WAP) enabled client computing system 202A or any equivalent thereof. For example, the client mobile computing system 202A can be a smart phone, a touch pad, a laptop, a netbook, etc. The client computing system 202A can host a browser to interact with the server computing system 204A. Each application has code scripted to perform the functions that the software component is coded to carry out, such as presenting fields and icons to take details of desired information. Algorithms, routines, and engines within the server computing system 204A take the information from the presenting fields and icons and put that information into an appropriate storage medium, such as a database. A comparison wizard can be scripted to refer to a database and make use of such data. The applications can be hosted on the server computing system 204A and served to the browser of the client computing system 202A. The applications then serve pages that allow entry of details and further pages that allow entry of more details.
In regard to the viewing ability of an on-line site: the scripted code for the on-line site, such as a website, social media site, etc., can be configured or otherwise adapted to be i) viewed on tablets and mobile phones, such as via individual downloadable applications in data stores designed to interface with the on-line site, ii) viewable on a screen in a vehicle or other mobile device, as well as iii) viewable on a screen of a desktop computer via a browser. Those skilled in the relevant art will appreciate that the invention can be practiced with other computer system configurations, including Internet appliances, hand-held devices, wearable computers, cellular or mobile phones, multi-processor systems, microprocessor-based or programmable consumer electronics, set-top boxes, network PCs, mini-computers, mainframe computers, and the like.
Mobile web applications and native applications can be downloaded from a cloud-based site. The mobile web applications and native applications have direct access to the hardware of mobile devices (including accelerometers and GPS chips), and the speed and abilities of browser-based applications. Information about the mobile phone and the vehicle's location can be gathered by software housed on the phone.
One or more scripted routines for the cloud-based perception and feeling reporting tool can be configured to collect and provide features such as those described herein.
Any application and other scripted code components can be stored on a non-transitory computing machine-readable medium which, when executed on the server, causes the server to perform those functions. The applications, including program modules, can be implemented as logical sequences of software code, hardware logic circuits, or any combination of the two, and portions of the application scripted in software code can be stored in a non-transitory computing device readable medium in an executable format. In an embodiment, the hardware logic consists of electronic circuits that follow the rules of Boolean Logic, software that contains patterns of instructions, or any combination of both.
The design can also be described in the general context of computing device executable instructions, such as program modules, being executed by a computing device. Generally, program modules include routines, programs, objects, applications, widgets, plug-ins, and other similar structures that perform particular tasks or implement particular abstract data types. Those skilled in the art can implement the description and/or figures herein as computer-executable instructions, which can be embodied on any form of computing machine-readable media discussed herein.
Some portions of the detailed descriptions herein are presented in terms of algorithms/routines and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm/routine is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps can be those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These algorithms/routines of the application, including the program modules, can be written in a number of different software programming languages, such as C, C++, Java, XML, HTML, or other similar languages.
Many online pages on a server, such as web pages, can be written using the same language, Hypertext Markup Language (HTML), which can be passed around using a common protocol—HTTP. HTTP is the common Internet language (dialect, or specification). Through the use of a web browser, a special piece of software that interprets HTTP and renders HTML into a human-readable form, web pages authored in HTML on any type of computer can be read anywhere, including telephones, PDAs and even popular games consoles. Because of HTTP, a client machine (like your computer) knows that it has to be the one to initiate a request for a web page; it sends this request to a server. A server can be a computing device where web sites reside—when you type a web address into your browser, a server receives your request, finds the web page you want, and sends it back to your desktop or mobile computing device to be displayed in your web browser. The client device and server can bilaterally communicate via a HTTP request & response cycle between the two.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the above discussions, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computing system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computing system's registers and memories into other data similarly represented as physical quantities within the computing system memories or registers, or other such information storage, transmission or display devices.
Although embodiments of this design have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of embodiments of this design as defined by the appended claims. The invention is to be understood as not limited by the specific embodiments described herein, but only by the scope of the appended claims.
This application claims priority to U.S. Provisional Patent Application No. 62/293,264, filed Feb. 9, 2016, titled “TOOLS AND METHODS FOR CAPTURING AND MEASURING HUMAN PERCEPTION AND FEELINGS,” the disclosure of which is hereby incorporated herein by reference in its entirety.