Embodiments of the present disclosure relate generally to the mobile application industry and, in particular, to systems, devices, and methods for developing human characteristics utilizing specialized computer-based applications.
The human brain exhibits outstanding richness and complexity. The human brain includes an estimated 86 billion neurons that, if stretched out end to end, would extend over one million miles. Each neuron forms roughly 10,000 synapses of varying forms and sizes, which connect neurons to one another to form the individual's connectome. The human brain exhibits neuroplasticity, in that this connectome changes over time. For example, neurons can develop new branches, as well as lose old branches. Synapses can be created, eliminated, and grow larger or smaller. This neuroplasticity persists throughout life, and brain activity can shape the brain even at advanced ages.
Nobel Prize-winning psychologist Daniel Kahneman and his collaborator Amos Tversky described the mind as operating through two systems, which they call simply "System 1" and "System 2." System 1 is described as the "fast" brain: it works quickly, is more emotional and more automatic, desires instant gratification, and does not need much energy to operate. System 2 is described as the "slow" brain: it is logical and rational, works more slowly, makes plans for the future, and uses a great deal of energy for self-regulation. Therefore, most of our fast, low-energy responses come from System 1, whereas the slow, high-energy responses come from System 2. The inventor has appreciated a need for a system and method for sculpting and training the brain and developing human characteristics according to embodiments of the disclosure herein.
A human characteristic development and brain training system comprises a processor operably coupled with an electronic display and a memory. The processor is configured to operate an application configured to display a user interface on the electronic display for receiving inputs from a user, generate an entry that includes information about a life event experienced by the user, store the entry from the user in a database stored in the memory, present a list of strengths to the user through the user interface for the user to select from, and assign at least one selected strength to the entry as being associated with the life event.
A method of operating a human character training system is disclosed. The method comprises managing a database of entries from at least one user, the entries including information about at least one life event of the at least one user, associating the at least one life event with at least one strength of the user, and linking a plurality of entries together in a group for the user to view micro-progresses in behavior for similar life events.
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof and in which are shown, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those of ordinary skill in the art to practice the invention, and it is to be understood that other embodiments may be utilized, and that structural, logical, and electrical changes may be made within the scope of the disclosure.
In this description, specific implementations are shown and described only as examples and should not be construed as the only way to implement the present invention unless specified otherwise herein. It will be readily apparent to one of ordinary skill in the art that the various embodiments of the present disclosure may be practiced by numerous other partitioning solutions. For the most part, details concerning timing considerations and the like have been omitted where such details are not necessary to obtain a complete understanding of the present disclosure and are within the abilities of persons of ordinary skill in the relevant art.
Referring in general to the following description and accompanying drawings, various embodiments of the present disclosure are illustrated to show its structure and method of operation. Common elements of the illustrated embodiments may be designated with similar reference numerals. It should be understood that the figures presented are not meant to be illustrative of actual views of any particular portion of the actual structure or method, but are merely idealized representations employed to more clearly and fully depict the present invention defined by the claims below.
It should be further appreciated and understood that the various illustrative logical blocks, modules, circuits, and algorithm acts described in connection with embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps are described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the disclosure described herein.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a special purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
Embodiments of the present disclosure include a human character training system and related method that enable a user to record, monitor, and analyze their own behavior, as well as the behavior of others, to help the user track improvements (i.e., small wins) over time. The improvements may be celebrated by the user. As a result of engaging in the method, the user may be transformed into an improved individual. In particular, the user's brain may be trained and sculpted by tracking small wins (i.e., small steps of improvement in behavior) anywhere, anytime, and during the user's normal daily life and activities. In particular, the system may draw the user's attention to the user's small wins that are recorded by the system, which can then be reviewed and celebrated at a later time as improvements are made. The user may be enabled to include a particular character strength that was used in accomplishing the small win. The user may also be enabled to categorize each entry as part of a longer-range focus and/or as a particular adversity or challenge the user faces. Such a process of sculpting and behavior training may be adaptive and continuous throughout the life of the user, which may enable the user to better adapt, learn, and improve behavior and character for mental, emotional, physical, and spiritual endeavors.
Such embodiments may find support in modern neuroscience research indicating a lifelong neuroplasticity of the brain. As a result, the brain can be shaped with practice and repetition. Modern research also indicates that neurons that fire together wire together—meaning that neurons establish more synapses and larger synapses where there is greater activity and greater repetition. In the concept of the two-system brain, it is understood that System 2 of the brain trains System 1 of the brain. These two systems are also very different in how they learn. For example, while System 1 understands and responds to language, it learns primarily through repetitive experience, shock, or trauma; System 2, on the other hand, does learn from language. Embodiments of the present disclosure assist in this training process by providing a system and method for the user to practice repetition in improving their actions in their daily lives, while also recognizing and celebrating small improvements over time. Thus, it is believed that the synaptic connections involved in performing such a task may be enhanced. Additionally, it is believed that further enhancement may occur as the user becomes aware of the strengths and other motivations that are involved in their daily activities.
As a result, users may record progresses they have experienced through their senses, name a strength used, and celebrate the progress, which promotes increased use of their strengths. As with physical exercise, such repetition may help users of the system to develop their human characteristics and train their brains through mental exercises.
The control circuitry 112 may include a memory device and a processor. The control circuitry 112 may be configured to execute an operating system. By way of non-limiting example, the operating system may include Android, iOS, Windows Phone, Microsoft Windows, Apple OS X, Unix, Linux, and other operating systems. The control circuitry 112 may include various application programs (hereinafter “apps”) configured to function in an environment provided by the operating system. For example, the control circuitry 112 may include a human characteristic development and brain training application 130 (hereinafter “training application” 130), which may be executed by the processor according to computer-readable instructions stored in memory of the storage device 114. In other words, the computer-readable instructions may be configured to instruct the control circuitry 112 to perform the functions discussed in more detail below. The computer-readable instructions may be provided to the user device 110 via a software distribution server having the computer-readable instructions stored thereon.
The input devices 116 may be configured to enable a user to interact with an interface of the training application 130, such as to provide inputs (e.g., text inputs, video inputs, audio inputs, etc.) into the system. For example, the input devices 116 may include a keyboard, microphone, camera, etc. Output devices 118 may be configured to convey information to the user from the training application 130, such as to provide outputs (e.g., text outputs, video outputs, audio outputs, etc.) from the system. For example, the output devices 118 may include electronic displays, speakers, etc. In some embodiments, some aspects of input devices 116 and output devices 118 may be integrally formed (e.g., touch screen display).
The storage device 114 may include a GEN database 132 stored in memory thereof. The GEN database 132 may include the recorded GEN entries and related data that are generated by the user during use of the training application 130. GENs will be discussed in more detail below.
The user device 110 may include smart phones, tablet computers, handheld computers, laptop computers, desktop computers, smart televisions, and other similar devices configured to deliver content to a user. While discussion herein is primarily focused on embodiments that include an "app," it is contemplated that web-based embodiments that are accessible by web browsers or other similar user interfaces are also within the scope of the present disclosure. Thus, rather than having the training application 130 stored locally on the user device 110, the training application 130 may be stored on a remote server that is accessed by the user device 110. The GEN database 132 may be maintained by the remote server in such an embodiment, which may also maintain the GEN databases for a plurality of different user devices. In some embodiments, at least a portion of the GEN database 132 may be stored locally on the user device 110, with some data also being stored remotely.
At operation 202, the user may record an entry (referred to herein as a “GEN” which is an abbreviation for Geniantis). As an example, the user may interact with an interface to input the information used to create the GEN. For example, each GEN may include information, such as “what,” “where,” and “when.” For example, the user may input text into the input field 302 to input information regarding the content (i.e., “what”) of the GEN, the user may input text into the input field 304 to input information regarding the location (i.e., “where”) of the GEN, and the user may input text into the input field 306 to input information regarding the time (i.e., “when”) that the GEN was created.
The content information of a GEN may include a description of something that the user has observed, a description of a situation (e.g., observation, interaction with another person, statement of fact, etc.) encountered by the user, a description of a problem (e.g., adversity, challenge, etc.) encountered by the user, a specific progress (e.g., “win”) that the user experienced, a specific progress that the user saw a third party doing that the user wants to compliment (e.g., “admire”), or a description of some other event that occurred in the user's life that the user would like to improve upon and/or learn from. The location information may include information about where the event occurred. The time information may include information about when the event occurred. At least some of this information may be automatically generated by the human characteristic development and brain training system. For example, the location information may be automatically retrieved according to the geolocation (e.g., using GPS data) of the user device. In addition, the time information may be automatically retrieved according to the internal time that is kept by the user device. Of course, the user device may enable the user to manually override the location and/or time information that is retrieved, which may be useful in the event that the user is creating a GEN for an event that occurred previously, for which a GEN had not been created at that time.
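By way of non-limiting example, the following sketch illustrates one possible representation of a GEN entry and of the auto-population of the location and time information with manual override. The type and member names (e.g., GenEntry, DeviceContext, createGen) are illustrative assumptions and do not correspond to reference numerals in the figures.

```kotlin
import java.time.Instant

// Hypothetical GEN entry model; field names are illustrative, not taken from the figures.
data class GeoPoint(val latitude: Double, val longitude: Double)

data class GenEntry(
    val id: Long,
    val what: String,                 // content information ("what")
    val where: String?,               // location information ("where")
    val whenOccurred: Instant,        // time information ("when")
    val coordinates: GeoPoint? = null // optional raw geolocation, if available
)

// Simple abstraction over the device's location and clock services.
interface DeviceContext {
    fun currentLocationName(): String?   // e.g., reverse-geocoded place name
    fun currentCoordinates(): GeoPoint?
    fun now(): Instant
}

// Builds a GEN, auto-filling "where" and "when" from the device while allowing manual
// overrides, which supports recording a GEN for an event that occurred previously.
fun createGen(
    id: Long,
    what: String,
    device: DeviceContext,
    whereOverride: String? = null,
    whenOverride: Instant? = null
): GenEntry = GenEntry(
    id = id,
    what = what,
    where = whereOverride ?: device.currentLocationName(),
    whenOccurred = whenOverride ?: device.now(),
    coordinates = device.currentCoordinates()
)
```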
The content information of the GEN may be stored in the form of text, audio, an image, a video, or combinations thereof. For example, the user may input text into the input field 302 being displayed by the interface, which may provide an area for the user to type the content information therein. An image icon 308 may be selected if the user desires to attach a digital image file to the GEN being created. An audio icon 310 may be selected if the user desires to attach an audio file (e.g., voice memo) to the GEN being created. A video icon 312 may be selected if the user desires to attach a video file to the GEN being created. Selecting one of the icons 308, 310, 312 may enable the user to generate the respective file while also creating the GEN. In some embodiments, the respective file may have been generated previously and the user may retrieve the respective file from the storage of the user device. Thus, a GEN may include a variety of different types of data in various forms.
Once the user has input the information, the user may select the save icon 314 to save the GEN, which is stored in the GEN database for further review and evaluation by the user. For example, the user may have input a particular adversity or challenge the user faced, which may be reviewed at a later time to develop the “small win” that the user can execute in order to start solving the adversity/challenge. Once the GEN is recorded, additional operations may be available to the user with regard to the recorded GEN. Throughout the day, the user may generate additional GENs that are added to the GEN database.
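The following minimal sketch, reusing the GenEntry type assumed above, shows one way the save operation and subsequent review of recorded GENs could be organized in memory; a deployed embodiment would typically persist the entries in the storage device 114 and/or on a remote server.

```kotlin
// Minimal in-memory sketch of the GEN database; persistence details are omitted.
class GenDatabase {
    private val entries = mutableListOf<GenEntry>()

    // Invoked when the user selects the save icon to record a GEN.
    fun save(entry: GenEntry) {
        entries.add(entry)
    }

    // Returns all recorded GENs for later review and evaluation by the user.
    fun all(): List<GenEntry> = entries.toList()
}
```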
In addition to manual entry of GENs into the GEN database, GENs may be generated automatically by the system without user involvement. For example, the processor may operate according to a set of rules to generate a GEN responsive to a triggering event. For example, the processor may be configured to communicate with external devices to receive information that may be converted into a GEN. In some embodiments, the rules may be configured to recognize progress such that the GEN may also be automatically categorized as a GEN progress. In some embodiments, the processor may require user authorization before adding to a GEN list, Progress list, or other list maintained by the system. In such an embodiment, a queue of GENs awaiting approval may be generated, and each queued GEN must be approved before being added to a particular list.
As one example, the processor of the system may communicate with a scale such that, when the user weighs himself, the processor automatically generates a GEN and stores it in the GEN database. Additional rules may include generating a GEN only responsive to certain weights being measured, progress made, or goals achieved. Additional health-related GENs that may be automatically generated may include measurements of body mass index (BMI), blood pressure, cholesterol exams and other medical tests, or other situations in which a medical device or computer has health-related information for which the user may desire to monitor progress and celebrate. In another example, the processor of the system may communicate with exercise equipment (e.g., treadmills, steppers, stationary bikes, etc.) at a gym facility to obtain information that may be converted into a GEN.
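The rule-driven automatic generation described above might be organized as in the following sketch, in which a hypothetical GenRule interface evaluates incoming measurements and queues any resulting GENs for user approval. The weight-goal threshold, source names, and interface names are illustrative assumptions, not limiting features.

```kotlin
// Hypothetical rule engine for automatic GEN generation.
data class Measurement(val source: String, val value: Double, val unit: String)

fun interface GenRule {
    // Returns the text of a GEN to create for this measurement, or null if no GEN applies.
    fun evaluate(measurement: Measurement): String?
}

class AutoGenGenerator(private val rules: List<GenRule>) {
    // Automatically generated GENs are queued for user approval before entering any list.
    val pendingApproval = mutableListOf<String>()

    fun onMeasurement(measurement: Measurement) {
        rules.mapNotNull { it.evaluate(measurement) }
            .forEach { pendingApproval.add(it) }
    }
}

fun main() {
    // Example rule: create a GEN only when a connected scale reports a weight at or below a goal.
    val weightGoalRule = GenRule { m ->
        if (m.source == "scale" && m.unit == "kg" && m.value <= 80.0)
            "Reached weight goal: ${m.value} kg" else null
    }
    val generator = AutoGenGenerator(listOf(weightGoalRule))
    generator.onMeasurement(Measurement("scale", 79.5, "kg"))
    println(generator.pendingApproval) // [Reached weight goal: 79.5 kg]
}
```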
In some embodiments, the system may communicate with other applications within the user's device to obtain information for automatically generating a GEN entry. For example, some applications may track a user's health data, such as monitoring the number of steps walked in a day, calories burned, calorie intake, heart rate, running/biking distance, hours slept, etc. Other types of applications may also have useful information for the GEN process, such as money management applications that track money spent and/or saved. The system may receive such data from other applications to automatically generate a GEN. Of course, the system's own application may be configured to monitor such activities on its own without needing to retrieve the data from other applications.
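A minimal sketch of pulling activity data from other applications is shown below. The ActivityDataProvider interface is an assumption for illustration; an actual embodiment would rely on whatever data-sharing interfaces the other applications or the operating system expose, subject to user authorization.

```kotlin
// Illustrative abstraction over health data supplied by other applications on the device.
interface ActivityDataProvider {
    fun stepsToday(): Int
    fun hoursSleptLastNight(): Double
}

// Summarizes the retrieved data as content text for an automatically generated GEN.
fun summarizeForGen(provider: ActivityDataProvider): String =
    "Today: ${provider.stepsToday()} steps walked, ${provider.hoursSleptLastNight()} hours slept"
```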
Additional features may include employing locational features (e.g., GPS) of the user's device to automatically generate GENs. For example, certain locations may be stored into the rules such that detecting the user's presence at such a location may automatically generate a GEN. For example, the GPS location of the user's gym may be stored such that a GEN is automatically generated whenever the user attends the gym (or after a certain number of visits to the gym).
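One possible geolocation trigger is sketched below, in which stored locations with a detection radius are compared against the device's current coordinates; the haversine distance calculation and the radius value are illustrative assumptions.

```kotlin
import kotlin.math.asin
import kotlin.math.cos
import kotlin.math.pow
import kotlin.math.sin
import kotlin.math.sqrt

// A location stored in the rules, e.g., the user's gym, with a detection radius.
data class StoredLocation(val name: String, val latitude: Double, val longitude: Double, val radiusMeters: Double)

// Haversine great-circle distance between two coordinates, in meters.
fun distanceMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val r = 6_371_000.0
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * r * asin(sqrt(a))
}

// Returns GEN content text when the current position falls within a stored location's radius.
fun genTextForVisit(stored: List<StoredLocation>, lat: Double, lon: Double): String? =
    stored.firstOrNull { distanceMeters(it.latitude, it.longitude, lat, lon) <= it.radiusMeters }
        ?.let { "Visited ${it.name}" }
```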
In addition, the system's application may also integrate with other applications on the user's device in order to receive status updates that may be converted into a GEN. For example, a user may post updates to applications such as Facebook, Twitter, Instagram, etc. Such updates may occasionally contain information that is suitable for a GEN. Thus, the user may also authorize certain posts to be stored as a GEN in addition to being posted to that application so that the user does not have to make separate entries.
At operation 204, a GEN list 402 (
The icon 406 may also be used to provide information regarding the characterization and/or actions that have been taken regarding that particular GEN, as will be discussed further below. As a non-limiting example, a red icon may indicate that the particular GEN is a friendly adversity, a blue icon may indicate that the GEN is an interesting observation that should be kept in the GEN list, and a yellow icon may indicate that the GEN is in the user's GEN progress list. An icon with a human head inside may indicate that the GEN is an admiration that was sent to a third party. Of course, the particular method of indication is not limited to the color or icon scheme described herein, and other methods are also contemplated.
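By way of illustration only, the mapping between a GEN's characterization and its icon treatment could be expressed as in the following sketch; the status names and color strings are assumptions and are not limiting.

```kotlin
// Illustrative mapping from a GEN's characterization to the icon shown in the GEN list.
enum class GenStatus { FRIENDLY_ADVERSITY, OBSERVATION, PROGRESS, ADMIRATION_SENT }

fun iconFor(status: GenStatus): String = when (status) {
    GenStatus.FRIENDLY_ADVERSITY -> "red"
    GenStatus.OBSERVATION -> "blue"
    GenStatus.PROGRESS -> "yellow"
    GenStatus.ADMIRATION_SENT -> "human-head"
}
```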
In some embodiments, the location information may be shown for the GENs in the GEN list 402. As more and more GENs are recorded, the GEN list 402 may grow, and the user may be able to scroll through the GENs to find a particular GEN. The user interface may further include a search field 404 that may enable the user to type in keywords or use other contextual searching methods corresponding to the information stored in the GENs to help the user find GENs.
In some embodiments, other features may be available for the user to sort or otherwise manipulate the GENs in the GEN list 402. For example, the interface may further include a filter field (not shown) that the user can use to filter GENs according to common attributes. For example, GENs may be filtered according to how a GEN is categorized (e.g., friendly adversity, progress, focus, etc.) or as containing particular characteristics (e.g., hungers, strengths, etc.). GENs may also be filtered according to a particular date or date range or by other common attributes of the GENs.
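The keyword search and attribute filtering described above might be implemented along the lines of the following sketch; the GenSummary projection, its field names, and the filter parameters are illustrative assumptions.

```kotlin
import java.time.LocalDate

// Illustrative projection of a stored GEN for searching and filtering in the GEN list.
data class GenSummary(val text: String, val category: String, val strengths: List<String>, val date: LocalDate)

// Keyword search over the content information of each GEN.
fun search(gens: List<GenSummary>, keyword: String): List<GenSummary> =
    gens.filter { it.text.contains(keyword, ignoreCase = true) }

// Filters GENs by categorization, contained strength, and/or a date range; null means "any".
fun filter(
    gens: List<GenSummary>,
    category: String? = null,
    strength: String? = null,
    from: LocalDate? = null,
    to: LocalDate? = null
): List<GenSummary> = gens.filter { gen ->
    (category == null || gen.category == category) &&
    (strength == null || strength in gen.strengths) &&
    (from == null || !gen.date.isBefore(from)) &&
    (to == null || !gen.date.isAfter(to))
}
```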
By selecting a GEN from the GEN list 402, the user may view a list of options for actions that the user may take with regard to the selected GEN. Selectable actions may include providing the GEN to a progress list, providing the GEN to a friendly adversity list, sending the GEN to a third party, sharing the GEN with a group, and deleting a GEN. Each of these selectable actions will be discussed in more detail below.
At operation 206, the user may interact with the interface to select the action (sending a GEN admiration to a third party) as outlined by box 502 (
For example, the user may send the following message to a third party: "Congratulations on your use of the perspective strength in focusing on the transition of students between the 5th and 6th grades during the meeting held September 18 at the school district. This will help the whole school." The third party may be entered into the messaging interface and/or retrieved from stored information (e.g., a contact list) of the human characteristic development and brain training system.
In some embodiments, the information from the GEN may be automatically input into the body of the message to be sent to the third party individual. In some embodiments, the GEN itself may be attached to the message, and the user may be able to enter their own message to accompany the attached GEN. Of course, it is contemplated that the information of the GEN may be automatically input into the body of the message to be sent, as well as the messaging interface being configured to permit the user to enter additional information into the message. It is also contemplated that at least some of the information automatically inserted into the messaging interface may come from a template message that is selected by the user.
In some embodiments, the GEN may be sent to the third party individual via text message (e.g., SMS), email, or other forms of messaging. In some embodiments, if the third party has the human characteristic development and brain training system application stored on their user device, the GEN may be sent directly between the two applications. In such an embodiment, the GEN may be stored in the GEN list of the third party's application when it is received by the third party.
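The composition and dispatch of a GEN admiration could follow a sketch such as the one below, in which the delivery channel and the template wording are illustrative assumptions rather than required features.

```kotlin
// Illustrative delivery channels for a GEN admiration.
enum class Channel { SMS, EMAIL, IN_APP }

data class Admiration(val recipient: String, val body: String, val channel: Channel)

// Auto-fills the message body from the GEN and the named strength; the user may append a note.
fun composeAdmiration(
    recipient: String,
    genText: String,
    strength: String,
    personalNote: String = "",
    channel: Channel = Channel.IN_APP
): Admiration {
    val body = "Congratulations on your use of the $strength strength: $genText. $personalNote".trim()
    return Admiration(recipient, body, channel)
}
```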
Upon completion of sending the GEN admiration to the third party, the icon 506 associated with the corresponding GEN entry 504 may be transformed to indicate that a GEN admiration was sent for that particular GEN entry 504.
At operation 208, the user may interact with the interface to select the action (i.e., recording the GEN to be a friendly adversity) as outlined by box 602 (
Responsive to the user selecting the GEN to be a friendly adversity, the interface may provide additional options to help the user self-analyze (e.g., decompose) the underlying situation recorded by the GEN 604. The interface may be configured to present the user with a list of hungers 606 and a list of strengths 608 that may have been involved in the GEN from the perspective of the user. Hungers may be viewed as motivations of the user, whereas strengths may be viewed as tools that the user draws from. Hungers may be grouped according to different intrinsic motivations that the user may have had in the way that they acted in the given situation. Three basic hungers are autonomy, competence, and relatedness, which are recognized as intrinsic motivations for behavior. Thus, the system causes the user to assess how the user acted in the situation and select which hungers were involved in influencing their behavior. Strengths refer to the attributes of the user that the user used in dealing with the situation. Strengths may include cognitive strengths (e.g., creativity, curiosity, judgment, love of learning, perspective), emotional strengths (e.g., bravery, perseverance, honesty, zest), humanity strengths (e.g., love, kindness, social intelligence), justice strengths (e.g., teamwork, fairness, leadership), temperance strengths (e.g., forgiveness, humility, prudence, self-control), and transcendence strengths (e.g., appreciation of beauty and excellence, gratitude, hope, humor, spirituality). Studies indicate that individuals who make highly efficient use of these strengths are much better prepared physically, materially, mentally, and spiritually for the geographical (e.g., local and remote) as well as temporal threats and opportunities of life. Other strengths are also contemplated, including those identified by the Gallup organization. While the list of strengths 608 shows only four strengths (e.g., appreciation of beauty and excellence, bravery, creativity, curiosity), this is because the full list of strengths may be long and may exceed the space shown in the interface. It should be appreciated that the user may scroll down to view other strengths to select from.
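The hunger and strength vocabularies, and the tagging of a friendly adversity from the user's perspective and optionally from another person's perspective, might be modeled as in the following abbreviated sketch; the enumerations shown are intentionally incomplete and merely illustrative.

```kotlin
// The three basic hungers (intrinsic motivations).
enum class Hunger { AUTONOMY, COMPETENCE, RELATEDNESS }

// Abbreviated strength catalog grouped by virtue; the full list presented in the interface is longer.
enum class Strength(val group: String) {
    CREATIVITY("cognitive"), CURIOSITY("cognitive"), PERSPECTIVE("cognitive"),
    BRAVERY("emotional"), PERSEVERANCE("emotional"), HONESTY("emotional"),
    KINDNESS("humanity"), TEAMWORK("justice"), SELF_CONTROL("temperance"), GRATITUDE("transcendence")
}

// Hungers and strengths selected from one person's perspective.
data class PerspectiveTags(val hungers: Set<Hunger>, val strengths: Set<Strength>)

// A friendly adversity links a GEN to the user's own tags and, optionally, another person's tags.
data class FriendlyAdversity(
    val genId: Long,
    val self: PerspectiveTags,
    val otherPerson: PerspectiveTags? = null // present only if another person was involved
)
```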
As shown in
The interface may also ask the user if there is another person involved in this friendly adversity (option 610). If the user selects “yes,” the interface may display a similar list of hungers 612 and list of strengths 614 for the user to select from the perspective of the other individual. In other words, the user may assess the motivations (hungers) and attributes (strengths) of the other person involved in the situation of the underlying GEN. As shown in
After completing selection of hungers and strengths for the user as well as others involved in the underlying situation of the GEN, the GENs that have been designated as “friendly adversities” may be populated in a list 702 (
At operation 214, the user may select a friendly adversity entry 904 (
At operation 216, the user may select a GEN 1004 shown by box 1002 (
At operation 218, the user may generate a GEN progress entry for a selected GEN. From the GEN list, the user may interact with the interface to select the action (send to GEN progress) as outlined by box 1102 (
At operation 220, the user may wish to populate a GEN progress list (
At operation 222, the user may send the GEN progress entry to a friendly adversity list to link the GEN progress entry to a particular friendly adversity entry. For example, the user may select the action (send progress to friendly adversity) as outlined by box 1302 (
At operation 224, the user may desire to share the GEN progress entry with a group. For example, the user may select the action (share with group) as outlined by box 1402 (
At operation 226, the user may send the GEN progress entry to a focus list. For example, the user may select the action (send to focus) as outlined by box 1502 (
In some embodiments, the system may also include a visual display that shows how much the user uses their character strengths in the aggregate over a period of time. Thus, the processor may analyze the strengths identified in each of the GENs stored in the database to populate the visual display. The visual display may indicate the aggregate amount of use of each strength relative to each other in a single location.
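The aggregation underlying such a visual display might be computed as in the following sketch, which counts strength uses across stored GENs within a selected period; the GenRecord projection is an illustrative assumption.

```kotlin
import java.time.LocalDate

// Illustrative projection of a stored GEN: its date and the strengths assigned to it.
data class GenRecord(val date: LocalDate, val strengths: List<String>)

// Counts how many times each strength was used across all GENs recorded within the period,
// providing the relative aggregate amounts shown in the visual display.
fun strengthCounts(gens: List<GenRecord>, from: LocalDate, to: LocalDate): Map<String, Int> =
    gens.asSequence()
        .filter { !it.date.isBefore(from) && !it.date.isAfter(to) }
        .flatMap { it.strengths.asSequence() }
        .groupingBy { it }
        .eachCount()
```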
Although the visual displays shown in
While the disclosure is susceptible to various modifications and implementation in alternative forms, specific embodiments have been shown by way of non-limiting example in the drawings and have been described in detail herein. It should be understood that the disclosure is not intended to be limited to the particular forms disclosed. Rather, the disclosure includes all modifications, equivalents, and alternatives falling within the scope of the following appended claims and their legal equivalents.