The subject matter described herein relates in general to determining and/or measuring a reaction of a user to an advertisement.
Identifying a reaction of a user to an advertisement is beneficial for several reasons. As an example, based on identified reactions to an advertisement, an advertising company can determine what types of advertisements garner a positive response from the user and can curate the types of advertisements that are presented to the user. As another example, the advertising company can create targeted and/or tailored advertisements for the user.
This section generally summarizes the disclosure and is not a comprehensive explanation of its full scope or all its features.
In one embodiment, a system for determining and/or measuring a reaction of a user to an advertisement is disclosed. The system includes a dual-sided transparent display and an electrodermal activity (EDA) sensor fixed to the dual-sided transparent display. The system includes a processor and a memory in communication with the processor. The memory stores machine-readable instructions that, when executed by the processor, cause the processor to, responsive to determining commencement of an advertisement, request a user to place a hand of the user on the EDA sensor. The memory stores machine-readable instructions that, when executed by the processor, cause the processor to acquire EDA data relating to the user via the EDA sensor, determine a reaction of the user to the advertisement based on the EDA data, and transmit the reaction of the user and at least one characteristic of the advertisement to a third party.
In another embodiment, a method for determining and/or measuring a reaction of a user to an advertisement is disclosed. The method includes, responsive to determining commencement of an advertisement, requesting a user to place a hand of the user on an EDA sensor. The EDA sensor is fixed to a dual-sided transparent display. The method includes acquiring EDA data relating to the user via the EDA sensor, determining a reaction of the user to the advertisement based on the EDA data, and transmitting the reaction of the user and at least one characteristic of the advertisement to a third party.
In another embodiment, a non-transitory computer-readable medium for determining and/or measuring a user's reaction to an advertisement and including instructions that, when executed by a processor, cause the processor to perform one or more functions, is disclosed. The instructions include instructions to, responsive to determining commencement of an advertisement, request a user to place a hand of the user on an EDA sensor. The EDA sensor is fixed to a dual-sided transparent display. The instructions include instructions to acquire EDA data relating to the user via the EDA sensor, determine a reaction of the user to the advertisement based on the EDA data, and transmit the reaction of the user and at least one characteristic of the advertisement to a third party.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate various systems, methods, and other embodiments of the disclosure. It will be appreciated that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one embodiment of the boundaries. In some embodiments, one element may be designed as multiple elements or multiple elements may be designed as one element. In some embodiments, an element shown as an internal component of another element may be implemented as an external component and vice versa. Furthermore, elements may not be drawn to scale.
Systems, methods, and other embodiments associated with measuring a reaction of a user to an advertisement are disclosed. Knowledge of a user's reaction to an advertisement can be beneficial to the creator or source of the advertisement. This knowledge can also be beneficial for determining what types of advertisements to provide to the user. As an example, the source of the advertisement may use this knowledge to determine and/or predict the user's reaction to other advertisements using machine learning processes. As another example, the source of the advertisement may determine what type of advertisement to output and at what time to output or present the advertisement so as not to invoke a negative reaction from the user.
Accordingly, in one embodiment, the disclosed approach is an electrodermal activity (EDA)-based user reaction measurement system that determines the reaction of the user to an advertisement and further transmits the reaction of the user and the characteristic(s) of the advertisement to a third party such as a source of the advertisement. Additionally and/or alternatively, the EDA-based user reaction measurement system may store the reaction of the user and the characteristic(s) of the advertisement in a local database or an external database. A vehicle that includes the EDA-based user reaction measurement system may further include a filtering device for determining what types of advertisements to present to the user based on the time of day, the location of the user and the vehicle, the speed of travel of the vehicle, and so on.
Electrodermal activity (EDA) is a biosensing technique used in psychology and medicine to detect emotional arousal, measure distress levels, measure attention levels, and/or predict seizures, among other things. EDA is the measurement of perspiration-driven changes in the electrical properties of the skin, typically at the palm and/or fingers of a user. An emotional state of the user can be identified based on the determined EDA.
A vehicle that includes the EDA-based user reaction measurement system may further include one or more dual-sided transparent displays. The dual-sided transparent display has two sides and can display visual content such as images and/or videos on the two sides. The content can be the same on the two sides or can be different. The dual-sided transparent display can display visual content on one side and not on the other side. Alternatively, the dual-sided transparent display can be transparent. The dual-sided transparent display can be located in at least one of a vehicle window or a windshield. As such, the dual-sided transparent display may be a portion of the vehicle window and/or the windshield.
Based on the determined reaction of the user and/or the EDA data, the EDA-based user reaction measurement system may also cause a vehicle action. The vehicle action may be one of adjusting the dual-sided transparent display, adjusting a visual device, adjusting an audio device, assuming control of the vehicle, contacting an emergency service, or performing a medical intervention.
The vehicle can include one or more sensors. The sensors can be located inside the vehicle, such as in the vehicle cabin, and/or outside the vehicle. The sensors can include internal camera(s) that can monitor the user, the actions of the user, and the facial expressions of the user. The sensors can include external camera(s) that can monitor the environment surrounding the vehicle. The sensors can include a microphone that can detect sounds inside the vehicle, such as sounds made by the user. The sensors can include biometric sensors for detecting and recording biological characteristics (e.g., heart rate, temperature, oxygen levels, blood sugar levels, blood pressure levels) from the user. The sensors can include EDA sensor(s) for determining whether the user is distracted, attentive, fatigued, sleepy, happy, and/or sad. The EDA sensor(s) are fixed to one or more dual-sided transparent displays.
The EDA-based user reaction measurement system can monitor a user's emotional reactions to advertisements by using EDA sensor(s) on the dual-sided transparent display. The EDA sensors may be implemented with transparent electrodes to detect changes in skin conductance and moisture levels of the hand(s) of the user. As an example, the EDA-based user reaction measurement system can receive sensor data from the sensor(s) such as the camera(s), the microphone(s), and the biometric sensor(s). Based on the sensor data, the EDA-based user reaction measurement system can determine the reaction of the user(s) such as fatigued, sad, happy, nervous, sleepy, distracted, bored, attentive, and so on.
The EDA-based user reaction measurement system can determine the intensity of a user's response to content, such as an advertisement, whether positive or negative, by the amount or rate of change in the moisture level on the user's skin. As an example, the EDA-based user reaction measurement system may perform a prerequisite step by establishing a baseline for the user. In such an example, the EDA-based user reaction measurement system may obtain EDA data from the user when there is no advertisement being outputted or presented.
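By way of a non-limiting illustration, the sketch below shows one possible way to establish such a baseline and to estimate response intensity from the rate of change of skin-conductance samples. The function names, the sampling rate, the microsiemens units, and the sample values are assumptions made for illustration only, not a required implementation.

    # Illustrative sketch only: baseline establishment and response-intensity
    # estimation; units, sampling rate, and sample values are assumptions.
    from statistics import mean

    def establish_baseline(samples_us):
        """Average skin-conductance samples taken while no advertisement plays."""
        return mean(samples_us)

    def response_intensity(samples_us, baseline_us, sample_rate_hz):
        """Estimate intensity from the peak rate of change and the level shift
        relative to the baseline."""
        rates = [abs(b - a) * sample_rate_hz
                 for a, b in zip(samples_us, samples_us[1:])]
        peak_rate_us_per_s = max(rates, default=0.0)
        level_shift_us = mean(samples_us) - baseline_us
        return peak_rate_us_per_s, level_shift_us

    baseline = establish_baseline([2.1, 2.0, 2.2, 2.1])   # no advertisement output
    rate, shift = response_intensity([2.2, 2.9, 3.4, 3.3], baseline, sample_rate_hz=4)

In such a sketch, a larger peak rate and level shift would indicate a more intense response, whether positive or negative.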
Upon determining the commencement of the advertisement, the EDA-based user reaction measurement system may activate the dual-sided transparent display to display an image. As an example, the dual-sided transparent display may display an image such as a silhouette of a hand to indicate the location of the EDA sensors on the dual-sided transparent display, and the EDA-based user reaction measurement system may prompt the user to place their hand within the displayed silhouette. While the EDA-based user reaction measurement system may prompt the user to place their hand on a displayed silhouette before the advertisement commences, the EDA-based user reaction measurement system may prompt the user to place or remove their hand from the EDA sensor at any suitable time before, during, or after the advertisement is presented.
The advertisement can be presented on a portion of the dual-sided transparent display. As such, the EDA-based user reaction measurement system may prompt the user to place their hand on the EDA sensor before activating the dual-sided transparent display to present the advertisement.
The EDA-based user reaction measurement system may use electrodes such as transparent electrodes to measure a change in the conductance of the user's skin to determine a level of emotional arousal in the user. The EDA-based user reaction measurement system may determine the reaction of the user, and the reaction may include one or more of attention, attention level, arousal, arousal level, stress, stress level, fatigue, fatigue level, happiness, sadness, boredom, or fear.
The EDA-based user reaction measurement system may transmit the reaction (i.e., the level of emotional arousal) of the user and the characteristic(s) of the advertisement to a third party such as the advertising company or media company. The advertisement may be interactive and request feedback from the user. As an example, the advertisement may present a survey, inquiring whether the user is interested in the item being advertised, whether the user is interested in purchasing the item being advertised, and/or whether the user is interested in directions to a location selling the item being advertised. As previously mentioned and as an example, the advertising company may improve the advertisements for the user by presenting more targeted and/or tailored advertisements to the user based on the reaction(s) of the user and/or the feedback from the user.
It will be appreciated that arrangements described herein can provide numerous benefits, including one or more of the benefits mentioned herein. For example, arrangements described herein enhance the accuracy of determining the reaction of the user to an advertisement by combining sensor data from multiple sensor sources, including camera(s), microphone(s), biometric sensor(s), and/or EDA sensor(s). Arrangements described herein can acquire the electrodermal activity of the user in a non-invasive manner. Arrangements described herein can acquire EDA measurements without a continuous connection to the user's skin. Arrangements described herein can acquire EDA measurements without the use of glued electrodes or electrodes pressed against the skin. Arrangements described herein can provide accurate electrodermal activity measurements. Arrangements described herein can result in reduced computing and processing power requirements. Arrangements described herein can result in identifying the emotional state of a user.
Detailed embodiments are disclosed herein; however, it is to be understood that the disclosed embodiments are intended only as examples. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the aspects herein in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description of possible implementations. Various embodiments are shown in the figures, but the embodiments are not limited to the illustrated structure or application.
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details.
Referring to
The EDA-based user reaction measurement system 100 includes a dual-sided transparent display 104, 106, 108 and EDA sensor(s) 110A, 110B, 110C, 110D (collectively known as 110) fixed to the dual-sided transparent display(s) 104, 106, 108. One example of a dual-sided transparent display that can be utilized as the dual-sided transparent display 104, 106, 108 is shown in U.S. Pat. App. Pub. No. 2021/0389615A1 to Rodrigues, which is hereby incorporated by reference in its entirety.
The dual-sided transparent display 104, 106, 108 includes a transparent display which can be configured to display content, such as text, images, and/or video. The dual-sided transparent display 104, 106, 108 includes an inner side facing user(s) inside the vehicle 102 and an outer side facing observer(s) outside the vehicle 102. The dual-sided transparent display 104, 106, 108 can be configured to display content on one of or both the inner and outer sides of the dual-sided transparent display 104, 106, 108. As an example, the dual-sided transparent display 104, 106, 108 can display content on the inner side of the dual-sided transparent display 104, 106, 108 such that the content is visible to the user(s) inside the vehicle 102 and not visible to the observer(s) outside the vehicle 102. As an example and as shown, the dual-sided transparent display 104 may display content such as an advertisement 112.
As another example, the dual-sided transparent display 104, 106, 108 can display content on the outer side of the dual-sided transparent display such that the content is visible to the observer(s) outside the vehicle 102 and not visible to user(s) inside the vehicle 102. As another example, the dual-sided transparent display 104, 106, 108 can display content that is visible to both the user(s) in the vehicle 102 and the observer(s) outside the vehicle 102. In another example, the dual-sided transparent display 104, 106, 108 can display content on the inner side that differs from the content on the outer side. In yet another example, the dual-sided transparent display 104, 106, 108 can be transparent such that the user(s) in the vehicle 102 can see outside the vehicle 102 and the observer(s) outside the vehicle 102 can see into the vehicle 102.
The dual-sided transparent display 104, 106, 108 can be formed using materials that are substantially transparent or clear. As an example, the dual-sided transparent display 104, 106, 108 may be formed using glass or plastic. The dual-sided transparent display 104, 106, 108 may have an active element or source and/or a switching element. The dual-sided transparent display 104, 106, 108 can be used in connection with a screen, such as a laptop screen, a mobile device screen, etc., or a window, such as a building window, a vehicle window, etc. In such a case, the dual-sided transparent display 104, 106, 108 can form at least a portion of a building window, a vehicle window, or a windshield.
The EDA sensor(s) 110 can include one or more sensing surfaces. As an example, the sensing surface can include a plurality of electrode pairs and one or more skin conductance sensors. The sensing surface can include an electrically insulating material. The sensing surface can include a rigid surface, which is a surface that can maintain its shape when a pressure is exerted on it (e.g., polymer). Alternatively, the sensing surface can be a compliant surface, which is a surface that deviates from its original shape in response to a pressure being exerted on it (e.g., polydimethylsiloxane (PDMS) or rubber). The sensing surface can be of any material that does not conduct electricity and can be suitable for at least partially embedding or fixing the electrode pairs. The one or more sensing surfaces can be integrated into any suitable vehicle component, particularly the dual-sided transparent display 104, 106, 108.
The sensing surface(s) can be formed using any suitable method, e.g., conventional printed circuit board (PCB) manufacturing technology, flex circuit manufacturing technology where thin electrodes are embedded in a flexible Kapton substrate, screen printing, or multi-material additive manufacturing. The electrodes can be of any material suitable for permitting skin conductance and acquiring electrodermal activity. As an example, the electrodes can be standard silver-silver chloride (Ag/AgCl) electrodes. As another example, the electrodes can be stainless steel electrodes. As another example, the electrodes can be transparent electrodes. In such an example, the transparent electrodes will not block portions of the dual-sided transparent display from view. Transparent electrodes may be formed using, as an example, indium tin oxide (ITO).
In response to making contact with a user's hand, the EDA sensor(s) 110 can transmit an electric signal from one electrode of an electrode pair to the other electrode of the electrode pair via the user's skin. The EDA sensor(s) 110 can use any suitable calculations and/or algorithms to evaluate and determine accurate EDA data based on measurements of the electric signal. The EDA sensor(s) 110 can identify and reduce noise in EDA data measurements. The EDA sensor(s) 110 can evaluate the EDA measurements to determine the emotional state of the user. The EDA sensor(s) 110 may be used to determine whether the user is fatigued, distracted, and/or experiencing an emotion such as being happy or sad.
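By way of a non-limiting illustration, one conventional noise-reduction technique that could serve this purpose is a moving-average filter, sketched below; the window size is an assumption, and any suitable filtering technique may be substituted.

    # Illustrative sketch only: a moving-average filter as one example of
    # noise reduction for raw EDA measurements; the window size is an assumption.
    def smooth(samples, window=5):
        """Return a moving average of raw EDA samples to suppress sensor noise."""
        half = window // 2
        smoothed = []
        for i in range(len(samples)):
            segment = samples[max(0, i - half):i + half + 1]
            smoothed.append(sum(segment) / len(segment))
        return smoothed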
The EDA sensors 110 can be fixed on the surface of the dual-sided transparent display 104, 106, 108. More specifically, the electrodes of the EDA sensors 110 can be fixed on the surface of the dual-sided transparent display 104, 106, 108. As an example, the electrodes may be fixed relatively evenly across the surface of the dual-sided transparent display 104, 106, 108. As another example, the electrodes may be concentrated in a portion of the dual-sided transparent display 104, 106, 108. The dual-sided transparent display 104, 106, 108 may be embedded with a visible material outlining the location of electrodes in the dual-sided transparent display 104, 106, 108. As shown in
Referring to
Some of the possible elements of the vehicle 102 are shown in
With reference to
The EDA-based user reaction measurement system 100 may include a memory 320 that stores the control module 330. The memory 320 may be a random-access memory (RAM), read-only memory (ROM), a hard disk drive, a flash memory, or other suitable memory for storing the control module 330. The control module 330 is, for example, a set of computer-readable instructions that, when executed by the processor(s) 210, cause the processor(s) 210 to perform the various functions disclosed herein. While, in one or more embodiments, the control module 330 is a set of instructions embodied in the memory 320, in further aspects, the control module 330 includes hardware, such as processing components (e.g., controllers), circuits, etc. for independently performing one or more of the noted functions.
The EDA-based user reaction measurement system 100 may include a data store(s) 215 for storing one or more types of data. Accordingly, the data store(s) 215 may be a part of the EDA-based user reaction measurement system 100, or the EDA-based user reaction measurement system 100 may access the data store(s) 215 through a data bus or another communication pathway. The data store 215 is, in one embodiment, an electronically based data structure for storing information. In at least one approach, the data store 215 is a database that is stored in the memory 320 or another suitable medium, and that is configured with routines that can be executed by the processor(s) 210 for analyzing stored data, providing stored data, organizing stored data, and so on. In either case, in one embodiment, the data store 215 stores data used by the control module 330 in executing various functions. In one embodiment, the data store 215 may store sensor data 216, electrodermal activity (EDA) data 219, and/or other information that is used by the control module 330.
The data store(s) 215 may include volatile and/or non-volatile memory. Examples of suitable data stores 215 include RAM (Random Access Memory), flash memory, ROM (Read Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), registers, magnetic disks, optical disks, hard drives, or any other suitable storage medium, or any combination thereof. The data store(s) 215 may be a component of the processor(s) 210, or the data store(s) 215 may be operatively connected to the processor(s) 210 for use thereby. The term “operatively connected” or “in communication with” as used throughout this description, can include direct or indirect connections, including connections without direct physical contact.
In one or more arrangements, the data store(s) 215 can include sensor data 216. The sensor data 216 can originate from the sensor system 220 of the vehicle 102. The sensor data 216 can include data from visual sensors, audio sensors, biometric sensors and/or any other suitable sensors in the vehicle 102.
In one or more arrangements, the data store(s) 215 can include EDA data 219. The EDA data 219 can include EDA data measurements and other types of data, such as user identification (e.g., a fingerprint and/or a handprint of a user) and biometric user information. In some instances, the user identification can include information about the size and/or shape of the hand of the user. Such user data can be based on average human data, user-specific data, learned user data, and/or any combination thereof. The sensor data 216 and the EDA data 219 may be digital data that describe information used by the EDA-based user reaction measurement system 100 to control a vehicle system 240.
In one embodiment, the control module 330 may include instructions that, when executed by the processor(s) 210, cause the processor(s) 210 to, responsive to determining commencement of an advertisement, request a user to place a hand of the user on the EDA sensor 228.
The advertisement may be an image, a video, audio, and/or text. The advertisement may be output from at least one of a visual source or an audio source. As an example, the advertisement may be displayed on a dual-sided transparent display and/or a display screen in the vehicle with an audio component. As another example, the advertisement may be displayed on the dual-sided transparent display and/or the display screen in the vehicle with no audio. As another example, the advertisement may be output using speaker(s) in the vehicle without a visual component. As an example, the advertisement may be output from within the vehicle using at least a display screen and/or a speaker located inside the vehicle. As another example, the advertisement may be output from outside the vehicle. In such an example, the advertisement may be displayed as a still image or a video on a billboard with or without an audio component. As another example, the advertisement may be output using speaker(s) outside the vehicle. In such an example, the advertisement may be output by roadside speakers.
At least one characteristic of the advertisement may be the subject of the advertisement, which may include the product and/or service being advertised; a survey inquiring about heightened interest in the item being advertised, a desire to test the item being advertised, or interest in navigation directions to the nearest facility selling the item being advertised; the actor(s) and/or object(s) seen or heard in the advertisement; the length of the advertisement; the time at which the advertisement is output or played; and/or the format of the advertisement (e.g., image, video, audio, text, or a combination thereof).
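By way of a non-limiting illustration, these characteristics could be grouped into a simple record such as the following sketch; the field names and types are hypothetical and chosen for illustration only.

    # Illustrative sketch only: a hypothetical record of advertisement
    # characteristics; field names and types are assumptions.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class AdCharacteristics:
        subject: str                      # product and/or service advertised
        actors_or_objects: list = field(default_factory=list)  # seen or heard
        length_s: float = 0.0             # length of the advertisement
        start_time: str = ""              # time the advertisement is output
        media_format: str = "video"       # image, video, audio, text, or a mix
        survey: Optional[dict] = None     # optional interactive survey content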
The control module may determine the commencement of the advertisement using any suitable means. As an example, the control module may communicate with the source of the advertisement to determine that the advertisement is about to start. In an example where the advertisement is being output from inside the vehicle, the control module may communicate with the source(s) of the advertisement such as a streaming box, the display screen and/or the vehicle speakers. In an example where the advertisement is being output from outside the vehicle, the control module may communicate with the source(s) of the advertisement such as a streaming box, a billboard, and/or roadside speakers. The control module may communicate with the source(s) using, as an example, vehicle-to-infrastructure (V2I) communication. As an example, the control module may request a time when the advertisement will commence from the source(s) of the advertisement. In response, the control module may receive the time when the advertisement will commence from the source(s) of the advertisement. As another example, the control module may control when an advertisement is played and the types of advertisements that are played.
As another example, the control module may determine the commencement of the advertisement using sensor data from sensors such as cameras and microphones. The sensors may monitor the display screens and/or speakers within the vehicle. Additionally and/or alternatively, the sensors may monitor the billboards and/or roadside speakers outside the vehicle. As another example, the control module may access a database that contains an output schedule for the advertisement(s). As another example, the control module may use any suitable machine learning algorithm such as pattern learning to determine and/or predict the commencement of the advertisement(s).
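By way of a non-limiting illustration, the sketch below combines two of the approaches described above, querying the advertisement source first and falling back to a stored output schedule; the v2i_client and schedule_db objects and their methods are hypothetical placeholders, not an existing interface.

    # Illustrative sketch only: commencement determination; the v2i_client
    # and schedule_db objects and their methods are hypothetical.
    import time

    def advertisement_commencing(v2i_client, schedule_db, ad_id, tolerance_s=1.0):
        """Return True when the advertisement is determined to be starting."""
        start = v2i_client.request_start_time(ad_id)      # ask the source (V2I)
        if start is None:
            start = schedule_db.lookup_start_time(ad_id)  # fall back to schedule
        return start is not None and abs(start - time.time()) <= tolerance_s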
The control module may also determine the characteristics of the advertisement using any suitable means. As an example, the control module may determine the characteristics of the advertisement based on sensor data. As another example, the control module may determine the characteristics of the advertisement by requesting and receiving the characteristics from the source of the advertisement such as an advertising company or media company database.
Upon determining commencement of an advertisement, the control module 330 may display an image on the dual-sided transparent display. As an example, the image may be a handprint indicating the location of the EDA sensors on the dual-sided transparent display. Also, upon determining the commencement of the advertisement(s), the control module 330 may request the user to place the user's hand on the EDA sensor 228. As previously mentioned, the location of the EDA sensors on the dual-sided transparent display may be identified by the image on the dual-sided transparent display. The control module 330 may communicate with the user in any suitable manner to make the request. As an example, the control module 330 may output an audio request and/or a visual request. In such an example, the control module 330 may output the audio request using speakers in the vehicle 102 and/or speakers electronically connected to, as an example, a mobile device. The control module 330 may output a visual request on a vehicle display unit, such as a Heads-Up Display (HUD) or instrument panel, and/or a mobile device display unit.
The control module 330 may be configured to determine when the user's finger(s) and/or palm is in contact with the EDA sensor 228. The EDA sensor 228 may determine the area of contact based on the perimeter of the area in contact with the user's finger(s) and/or palm. The EDA sensor 228 can determine the size and/or the shape of the contact area based on, as an example, the x-, y-coordinates of the contact area. The control module 330 may receive information indicating that the user's finger and/or palm is in contact with the EDA sensor 228 from the EDA sensor 228. The control module 330 may then determine and/or distinguish between a finger and a palm based on size and shape, as fingers tend to be narrower and longer than palms, which tend to be wider and shorter. The control module 330 can include any suitable object recognition software to detect whether contact is being made by a user's finger, palm, both, or neither. The control module 330 can use any suitable technique, including, for example, template matching and other kinds of computer vision and/or image processing techniques and/or other artificial or computational intelligence algorithms or machine learning methods.
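By way of a non-limiting illustration, a coarse size-and-shape heuristic of the kind described above might be sketched as follows; the bounding-box thresholds are assumptions, and in practice any suitable recognition technique may be used instead.

    # Illustrative sketch only: classify a contact area as finger or palm
    # from its bounding box; the thresholds (millimeters) are assumptions.
    def classify_contact(points_mm):
        """Classify (x, y) contact coordinates as 'finger', 'palm', or None."""
        if not points_mm:
            return None
        xs = [x for x, _ in points_mm]
        ys = [y for _, y in points_mm]
        width, height = max(xs) - min(xs), max(ys) - min(ys)
        long_side = max(width, height)
        short_side = max(min(width, height), 1e-6)
        # Fingers tend to be narrower and longer; palms wider and shorter.
        if long_side / short_side > 2.0 and short_side < 25.0:
            return "finger"
        return "palm"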
In one embodiment, the control module 330 may include instructions that, when executed by the processor(s) 210, cause the processor(s) 210 to acquire EDA data 219 relating to the user via the EDA sensor 228. As an example, in response to determining that the user's finger(s) and/or palm is in contact with the EDA sensor 228, the control module 330 can activate the EDA sensor 228 to acquire EDA data 219 from the user. In such an example, the EDA sensor 228 can acquire EDA data 219 from the user by measuring EDA using at least one of skin potential, resistance, conductance, admittance, and impedance. Skin potential can be the voltage measured between two points of contact between the user and the EDA sensor 228. Skin resistance can be the resistance measured between the two points of contact. Skin conductance can be the measurement of the electrical conductivity of the skin between the two points of contact. Skin admittance can be determined by measuring the relative permittivity and resistivity of the skin and the contact ratio between the dry electrodes in the EDA sensor 228 and the skin. Skin impedance can be the measurement of the impedance of the skin to an alternating current of low frequency.
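By way of a non-limiting illustration, the basic relations between a measured electric signal and two of the quantities named above follow Ohm's law, as sketched below; the example voltage and current values are assumptions.

    # Illustrative sketch only: resistance and conductance between two points
    # of contact, derived from the measured electric signal (Ohm's law).
    def skin_resistance_ohm(voltage_v, current_a):
        """Resistance between the two points of contact, in ohms."""
        return voltage_v / current_a

    def skin_conductance_us(voltage_v, current_a):
        """Conductance (the reciprocal of resistance), in microsiemens."""
        return (current_a / voltage_v) * 1e6

    g = skin_conductance_us(0.5, 1.5e-6)   # e.g., 3.0 microsiemens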
The control module 330 may also include instructions to acquire baseline EDA data. As an example, the control module 330 may request the user to place the user's hand on the EDA sensor 228 at an instance when no advertisement is being output. In such an example, the EDA sensor 228 may acquire EDA data 219 from the user while the user is not reacting to an advertisement and use that data as the baseline EDA data. As another example, the control module 330 may acquire baseline EDA data based on previous and historical EDA data acquisitions by the EDA sensor 228. As another example, the control module 330 may acquire baseline EDA data from other sources, such as an external database storing EDA data.
In one embodiment, the control module 330 may include instructions that, when executed by the processor(s) 210, cause the processor(s) 210 to determine a reaction of the user to the advertisement based on the EDA data 219. The control module 330 may determine the reaction of the user to the advertisement based on the EDA data 219 and/or the baseline EDA data. As an example, the reaction of the user to the advertisement may be a positive reaction such as being happy and/or excited, a negative reaction such as being sad, anxious, angry, and/or afraid, or a neutral reaction such as being nonchalant. Additionally, characteristics of the reaction of the user may include the attention and attention level of the user, the arousal and arousal level of the user, the stress and stress level of the user, the fatigue and fatigue level of the user, and/or an emotional state of the user (e.g., happiness, sadness, anger, or fear) and the level of the emotional state. The control module 330 may determine the reaction of the user to the advertisement based on the EDA data 219, the baseline EDA data, and/or additional sensor data from, as an example, the cameras 226 and/or the biometric sensors 229.
The control module 330 may compare the EDA data 219 to the baseline EDA data to determine the reaction of the user. The control module 330 may utilize any suitable algorithm and/or machine learning process to determine the reaction of the user. The control module 330 may determine the reaction of the user using the sensor data 216 in addition to the EDA data 219 and/or the baseline EDA data. As an example, the control module 330 may determine that the user is excited based on a combination of the sensor data 216 from the biometric sensor 229 such as a heartrate or heartbeat sensor, the EDA data 219 and the baseline EDA data.
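By way of a non-limiting illustration, one hypothetical rule-based fusion of the EDA data, the baseline EDA data, and heart-rate sensor data might look like the sketch below; the thresholds and labels are assumptions, and any suitable algorithm and/or machine learning process may be used instead.

    # Illustrative sketch only: rule-based fusion of EDA and heart-rate data;
    # the thresholds and labels are assumptions, not validated values.
    def determine_reaction(eda_us, baseline_us, heart_rate_bpm, resting_bpm):
        """Label the user's reaction from the EDA rise over baseline,
        corroborated by heart-rate change; EDA alone indicates intensity
        rather than positive/negative valence."""
        arousal_us = eda_us - baseline_us          # phasic rise over baseline
        hr_elevated = (heart_rate_bpm - resting_bpm) > 10
        if arousal_us < 0.3:
            return "neutral"
        if arousal_us > 1.0 and hr_elevated:
            return "excited"
        return "aroused"

    reaction = determine_reaction(eda_us=3.4, baseline_us=2.1,
                                  heart_rate_bpm=92, resting_bpm=70)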
In one embodiment, the control module 330 may include instructions that, when executed by the processor(s) 210, cause the processor(s) 210 to transmit the reaction of the user and at least one characteristic of the advertisement to a third party. The third party may be a database. As an example, the database may be internal and within the vehicle. As another example, the database may be external and located outside the vehicle. As another example, the third party may be an advertising company or a media company. In such an example, the control module may transmit the reaction of the user and one or more characteristics of the advertisement to the advertising company or the media company.
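By way of a non-limiting illustration, the transmitted report could be serialized as in the sketch below; the payload fields and the transport are assumptions, and any suitable format or communication pathway may be used.

    # Illustrative sketch only: bundling the reaction with advertisement
    # characteristic(s) for transmission; the field names are assumptions.
    import json

    def build_report(reaction, characteristics):
        """Bundle the determined reaction with the advertisement characteristic(s)."""
        return json.dumps({"reaction": reaction, "advertisement": characteristics})

    payload = build_report(
        "excited",
        {"subject": "coffee", "media_format": "video", "length_s": 30.0},
    )
    # The payload could then be transmitted to the third party (e.g., a
    # database, an advertising company, or a media company).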
At step 410, the control module 330 may cause the processor(s) 210 to, responsive to determining commencement of an advertisement, request a user to place the user's hand on the EDA sensor 228 that is fixed to a dual-sided transparent display 104, 106, 108. The control module 330 may determine the commencement of an advertisement as described above and may then request that the user place the user's hand on the EDA sensor 228. As previously disclosed, the control module may determine characteristics of the advertisement based on sensor data and/or by requesting and receiving characteristics from an advertisement source such as the streaming device, the advertising company, or the media company.
At step 420, the control module 330 may cause the processor(s) 210 to acquire EDA data 219 relating to the user via the EDA sensor 228. As an example, the control module 330 activates the EDA sensor 228 and may receive EDA data 219 from the EDA sensor 228 upon activation.
At step 430, the control module 330 may cause the processor(s) 210 to determine a reaction of the user to the advertisement based on the EDA data 219. As previously mentioned, the control module 330 may utilize any suitable algorithm to determine the reaction of the user based on a combination of sensor data 216, EDA data 219, and/or baseline EDA data.
At step 440, the control module 330 may cause the processor(s) 210 to transmit the reaction of the user and at least one characteristic of the advertisement to a third party. As an example, the control module 330 may transmit the reaction of the user in any suitable format and a characteristic of the advertisement to the advertising company.
A non-limiting example of the operation of the EDA-based user reaction measurement system 100 and/or one or more of the methods will now be described in relation to
As shown in
In
In one or more embodiments, the vehicle 102 is an autonomous vehicle. As used herein, “autonomous vehicle” refers to a vehicle that operates in an autonomous mode. “Autonomous mode” refers to navigating and/or maneuvering the vehicle 102 along a travel route using one or more computing systems to control the vehicle 102 with minimal or no input from a human driver. In one or more embodiments, the vehicle 102 is highly automated or completely automated. In one embodiment, the vehicle 102 is configured with one or more semi-autonomous operational modes in which one or more computing systems perform a portion of the navigation and/or maneuvering of the vehicle along a travel route, and a vehicle operator (i.e., driver) provides inputs to the vehicle to perform a portion of the navigation and/or maneuvering of the vehicle 102 along a travel route. In one embodiment, the vehicle may be a single-family vehicle or personal vehicle such as a sedan, a truck, or a minivan. In another embodiment, the vehicle may be a mass transportation vehicle such as a bus or a van. As previously mentioned, the vehicle may be fully autonomous, partially autonomous, or manual.
The vehicle 102 can include one or more processors 210. In one or more arrangements, the processor(s) 210 can be a main processor of the vehicle 102. For instance, the processor(s) 210 can be an electronic control unit (ECU). As previously mentioned, the processor(s) 210 may be a part of the EDA-based user reaction measurement system 100, or the EDA-based user reaction measurement system 100 may access the processor(s) 210 through a data bus or another communication pathway.
The vehicle 102 can include one or more data stores 215 for storing one or more types of data. The data store 215 can include volatile and/or non-volatile memory. Examples of suitable data stores 215 include RAM (Random Access Memory), flash memory, ROM (Read Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), registers, magnetic disks, optical disks, hard drives, or any other suitable storage medium, or any combination thereof. The data store 215 can be a component of the processor(s) 210, or the data store 215 can be operatively connected to the processor(s) 210 for use thereby. The term “operatively connected,” as used throughout this description, can include direct or indirect connections, including connections without direct physical contact.
The one or more data stores 215 can include sensor data 216. In this context, “sensor data” means any information about the sensors that the vehicle 102 is equipped with, including the capabilities and other information about such sensors. As will be explained below, the vehicle 102 can include the sensor system 220. The sensor data 216 can relate to one or more sensors of the sensor system 220. As an example, in one or more arrangements, the sensor data 216 can include information on one or more vehicle sensors 221 and/or environment sensors 222 of the sensor system 220.
The data store(s) 215 can include electrodermal activity (EDA) data 219. The EDA data 219 includes data from the EDA sensor(s) 228. The EDA data 219 may include historical EDA data based on past readings and/or external sources such as databases. The EDA sensors 228 may be a part of the sensor system 220 as shown. Alternatively, the EDA sensors 228 may be separate from the sensor system 220.
In some instances, at least a portion of the sensor data 216 and/or the EDA data 219 can be located in one or more data stores 215 located onboard the vehicle 102. Alternatively, or in addition, at least a portion of the sensor data 216 and/or the EDA data 219 can be located in one or more data stores 215 that are located remotely from the vehicle 102.
As noted above, the vehicle 102 can include the sensor system 220. The sensor system 220 can include one or more sensors. “Sensor” means any device, component, and/or system that can detect and/or sense something. The one or more sensors can be configured to detect and/or sense in real-time. As used herein, the term “real-time” means a level of processing responsiveness that a user or system senses as sufficiently immediate for a particular process or determination to be made, or that enables the processor 210 to keep up with some external process.
In arrangements in which the sensor system 220 includes a plurality of sensors, the sensors can work independently from each other. Alternatively, two or more of the sensors can work in combination with each other. In such a case, the two or more sensors can form a sensor network. The sensor system 220 and/or the one or more sensors can be operatively connected to the processor(s) 210, the data store(s) 215, and/or another element of the vehicle 102 (including any of the elements shown in
The sensor system 220 can include any suitable type of sensor. Various examples of different types of sensors will be described herein. However, it will be understood that the embodiments are not limited to the particular sensors described. The sensor system 220 can include one or more vehicle sensors 221. The vehicle sensor(s) 221 can detect, determine, and/or sense information about the vehicle 102 itself. In one or more arrangements, the vehicle sensor(s) 221 can be configured to detect and/or sense position and orientation changes of the vehicle 102, such as, for example, based on inertial acceleration. In one or more arrangements, the vehicle sensor(s) 221 can include one or more accelerometers, one or more gyroscopes, an inertial measurement unit (IMU), a dead-reckoning system, a global navigation satellite system (GNSS), a global positioning system (GPS), a navigation system 247, and/or other suitable sensors. The vehicle sensor(s) 221 can be configured to detect and/or sense one or more characteristics of the vehicle 102. In one or more arrangements, the vehicle sensor(s) 221 can include a speedometer to determine a current speed of the vehicle 102.
Alternatively, or in addition, the sensor system 220 can include one or more environment sensors 222 configured to acquire and/or sense data inside the vehicle 102 as well as around the vehicle 102. Sensor data 216 inside the vehicle 102 can include information about one or more users in the vehicle cabin and any other objects of interest. Sensor data 216 around the vehicle 102 can include information about the external environment in which the vehicle 102 is located or one or more portions thereof.
As an example, the one or more environment sensors 222 can be configured to detect, quantify and/or sense objects in at least a portion of the internal and/or the external environment of the vehicle 102 and/or information/data about such objects.
In the internal environment of the vehicle 102, the one or more environment sensors 222 can be configured to detect, measure, quantify, and/or sense human users inside the vehicle 102 and the facial expressions of the user(s). In the external environment, the one or more environment sensors 222 can be configured to detect, measure, quantify, and/or sense objects in the external environment of the vehicle 102, such as, for example, lane markers, signs, traffic lights, traffic signs, lane lines, crosswalks, curbs proximate the vehicle 102, off-road objects, electronic roadside devices, etc.
Various examples of sensors of the sensor system 220 will be described herein. The example sensors may be part of the one or more environment sensors 222 and/or the one or more vehicle sensors 221. However, it will be understood that the embodiments are not limited to the particular sensors described.
As an example, in one or more arrangements, the sensor system 220 or, more specifically, the environment sensors 222 can include one or more radar sensors 223, one or more LIDAR sensors 224, one or more sonar sensors 225, one or more cameras 226, and/or one or more audio sensors 227. In one or more arrangements, the one or more cameras 226 can be high dynamic range (HDR) cameras or infrared (IR) cameras. The audio sensor(s) 227 can be microphones and/or any suitable audio recording devices. Any sensor in the sensor system 220 that is suitable for detecting and observing humans and/or human facial expressions can be used inside the vehicle 102 to observe the users. Additionally, the sensor system 220 or, more specifically, the environment sensors 222 can include one or more EDA sensors 228 for detecting and/or recording electrodermal activity of the user(s), and one or more biometric sensors 229 such as a heartrate or heartbeat sensor, a body temperature sensor, a blood pressure sensor, an oxygen level sensor, an alcohol sensor, and/or a blood sugar sensor.
The vehicle 102 can include an input system 230. An “input system” includes any device, component, system, element or arrangement or groups thereof that enable information/data to be entered into a machine. The input system 230 can receive an input from a user (e.g., a driver or a passenger). The vehicle 102 can include an output system 235. An “output system” includes any device, component, or arrangement or groups thereof that enable information/data to be presented to a user (e.g., a person, a vehicle passenger, etc.) such as a display interface or a speaker.
The vehicle 102 can include one or more vehicle systems 240. Various examples of the one or more vehicle systems 240 are shown in
The navigation system 247 can include one or more devices, applications, and/or combinations thereof, now known or later developed, configured to determine the geographic location of the vehicle 102 and/or to determine a travel route for the vehicle 102. The navigation system 247 can include one or more mapping applications to determine a travel route for the vehicle 102. The navigation system 247 can include a global positioning system, a local positioning system or a geolocation system.
The vehicle 102 can include one or more autonomous driving systems 260. The autonomous driving system 260 can include one or more devices, applications, and/or combinations thereof, now known or later developed, configured to control the movement, speed, maneuvering, heading, direction, etc. of the vehicle 102. The autonomous driving system 260 can include one or more driver assistance systems such as a lane keeping system, a lane centering system, a collision avoidance system, and/or a driver monitoring system.
The autonomous driving system(s) 260 can be configured to receive data from the sensor system 220 and/or any other type of system capable of capturing information relating to the vehicle 102 and/or the external environment of the vehicle 102. In one or more arrangements, the autonomous driving system(s) 260 can use such data to generate one or more driving scene models. The autonomous driving system(s) 260 can determine position and velocity of the vehicle 102. The autonomous driving system(s) 260 can determine the location of obstacles or other environmental features including traffic signs, trees, shrubs, neighboring vehicles, pedestrians, etc.
The autonomous driving system(s) 260 can be configured to receive and/or determine location information for obstacles within the external environment of the vehicle 102 for use by the processor(s) 210, and/or one or more of the modules described herein to estimate position and orientation of the vehicle 102, vehicle position in global coordinates based on signals from a plurality of satellites, or any other data and/or signals that could be used to determine the current state of the vehicle 102 or determine the position of the vehicle 102 with respect to its environment for use in either creating a map or determining the position of the vehicle 102 with respect to map data.
The autonomous driving system(s) 260 either independently or in combination with the EDA-based user reaction measurement system 100 can be configured to determine travel path(s), current autonomous driving maneuvers for the vehicle 102, future autonomous driving maneuvers and/or modifications to current autonomous driving maneuvers based on data acquired by the sensor system 220, driving scene models, and/or data from any other suitable source such as determinations from the sensor data 216 and the EDA data 219. “Driving maneuver” means one or more actions that affect the movement of a vehicle. Examples of driving maneuvers include: accelerating, decelerating, braking, turning, moving in a lateral direction of the vehicle 102, changing travel lanes, merging into a travel lane, and/or reversing, just to name a few possibilities. The autonomous driving system(s) 260 can be configured to implement determined driving maneuvers. The autonomous driving system(s) 260 can cause, directly or indirectly, such autonomous driving maneuvers to be implemented. As used herein, “cause” or “causing” means to make, command, instruct, and/or enable an event or action to occur or at least be in a state where such event or action may occur, either in a direct or indirect manner. The autonomous driving system(s) 260 can be configured to execute various vehicle functions and/or to transmit data to, receive data from, interact with, and/or control the vehicle 102 or one or more systems thereof (e.g., one or more of vehicle systems 240).
The processor(s) 210, the EDA-based user reaction measurement system 100, and/or the autonomous driving system(s) 260 can be operatively connected to communicate with the various vehicle systems 240 and/or individual components thereof. For example, the processor(s) 210, the EDA-based user reaction measurement system 100, and/or the autonomous driving system(s) 260 can be in communication to send and/or receive information from the various vehicle systems 240 to control the movement, speed, maneuvering, heading, direction, etc. of the vehicle 102. The processor(s) 210, the EDA-based user reaction measurement system 100, and/or the autonomous driving system(s) 260 may control some or all of these vehicle systems 240 and, thus, may be partially or fully autonomous.
The processor(s) 210, the EDA-based user reaction measurement system 100, and/or the autonomous driving system(s) 260 may be operable to control the navigation and/or maneuvering of the vehicle 102 by controlling one or more of the vehicle systems 240 and/or components thereof. As an example, when operating in an autonomous mode, the processor(s) 210, the EDA-based user reaction measurement system 100, and/or the autonomous driving system(s) 260 can control the direction and/or speed of the vehicle 102. As another example, the processor(s) 210, the EDA-based user reaction measurement system 100, and/or the autonomous driving system(s) 260 can activate, deactivate, and/or adjust the parameters (or settings) of the one or more driver assistance systems. The processor(s) 210, the EDA-based user reaction measurement system 100, and/or the autonomous driving system(s) 260 can cause the vehicle 102 to accelerate (e.g., by increasing the supply of fuel provided to the engine), decelerate (e.g., by decreasing the supply of fuel to the engine and/or by applying brakes) and/or change direction (e.g., by turning the front two wheels). As used herein, “cause” or “causing” means to make, force, compel, direct, command, instruct, and/or enable an event or action to occur or at least be in a state where such event or action may occur, either in a direct or indirect manner.
The vehicle 102 can include one or more actuators 250. The actuators 250 can be any element or combination of elements operable to modify, adjust, and/or alter one or more of the vehicle systems 240 or components thereof responsive to receiving signals or other inputs from the processor(s) 210 and/or the autonomous driving system(s) 260. Any suitable actuator can be used. For instance, the one or more actuators 250 can include motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and/or piezoelectric actuators, just to name a few possibilities.
The vehicle 102 can include one or more modules, at least some of which are described herein. The modules can be implemented as computer-readable program code that, when executed by a processor 210, implement one or more of the various processes described herein. One or more of the modules can be a component of the processor(s) 210, or one or more of the modules can be executed on and/or distributed among other processing systems to which the processor(s) 210 is operatively connected. The modules can include instructions (e.g., program logic) executable by one or more processor(s) 210. Alternatively, or in addition, one or more data stores 215 may contain such instructions.
In one or more arrangements, one or more of the modules described herein can include artificial or computational intelligence elements, e.g., neural network, fuzzy logic or other machine learning algorithms. Further, in one or more arrangements, one or more of the modules can be distributed among a plurality of the modules described herein. In one or more arrangements, two or more of the modules described herein can be combined into a single module.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments. In this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
The systems, components and/or processes described above can be realized in hardware or a combination of hardware and software and can be realized in a centralized fashion in one processing system or in a distributed fashion where different elements are spread across several interconnected processing systems. Any kind of processing system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software can be a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein. The systems, components and/or processes also can be embedded in a computer-readable storage, such as a computer program product or other data program storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform the methods and processes described herein. These elements also can be embedded in an application product which comprises all the features enabling the implementation of the methods described herein and, which when loaded in a processing system, is able to carry out these methods.
Furthermore, arrangements described herein may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied or embedded, e.g., stored, thereon. Any combination of one or more computer-readable media may be utilized. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. The phrase “computer-readable storage medium” means a non-transitory storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk drive (HDD), a solid state drive (SSD), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present arrangements may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java™, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
As used herein, the term “substantially” or “about” includes exactly the term it modifies and slight variations therefrom. Thus, the term “substantially equal” means exactly equal and slight variations therefrom. “Slight variations therefrom” can include within 15 percent/units or less, within 14 percent/units or less, within 13 percent/units or less, within 12 percent/units or less, within 11 percent/units or less, within 10 percent/units or less, within 9 percent/units or less, within 8 percent/units or less, within 7 percent/units or less, within 6 percent/units or less, within 5 percent/units or less, within 4 percent/units or less, within 3 percent/units or less, within 2 percent/units or less, or within 1 percent/unit or less. In some instances, “substantially” can include being within normal manufacturing tolerances.
The terms “a” and “an,” as used herein, are defined as one or more than one. The term “plurality,” as used herein, is defined as two or more than two. The term “another,” as used herein, is defined as at least a second or more. The terms “including” and/or “having,” as used herein, are defined as comprising (i.e., open language). The phrase “at least one of . . . and . . . ” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. As an example, the phrase “at least one of A, B and C” includes A only, B only, C only, or any combination thereof (e.g., AB, AC, BC or ABC).
Aspects herein can be embodied in other forms without departing from the spirit or essential attributes thereof. Accordingly, reference should be made to the following claims, rather than to the foregoing specification, as indicating the scope of the invention.