SMART DEVICE INCLUDING OLFACTORY SENSING

Information

  • Patent Application
  • Publication Number
    20240027394
  • Date Filed
    July 19, 2022
  • Date Published
    January 25, 2024
Abstract
An electronic device including an olfactory detector including an array of olfactory sensors for determining scents proximate the electronic device, such as a smartphone, smart eyewear, or a smart watch. Each sensor in the array of olfactory sensors is tuned to detect the presence and concentration of one or more specific chemical compounds or molecules. A fan creates airflow of ambient air across the olfactory sensors. An analog to digital (A/D) converter receives and processes the sensor outputs of the olfactory sensors and provides the processed sensor output to a processor for further processing. Scent type and intensity can be classified by using the information from the olfactory sensors as input to a machine learning model generated through supervised training on labeled example measurements from the sensor array. The processor may display information of the determined scents on a display of the smart device, and the processor can also send information indicative of the determined scents to another device.
Description
TECHNICAL FIELD

The present subject matter relates to electronic devices, including smart devices and communications between the smart devices.


BACKGROUND

Olfactory families allow individual perfumes to be classified according to their key olfactory characteristics. The olfactory families are created either by grouping together raw materials (like flowers, woods, aromatics or citrus fruits) or by taking inspiration from traditional accords (oriental or chypre, for example).





BRIEF DESCRIPTION OF THE DRAWINGS

The drawing figures depict one or more implementations, by way of example only, not by way of limitation. In the figures, like reference numerals refer to the same or similar elements.


Features of the various implementations disclosed will be readily understood from the following detailed description, in which reference is made to the appended drawing figures. A reference numeral is used with each element in the description and throughout the several views of the drawing. When a plurality of similar elements is present, a single reference numeral may be assigned to like elements, with an added letter referring to a specific element.


The various elements shown in the figures are not drawn to scale unless otherwise indicated. The dimensions of the various elements may be enlarged or reduced in the interest of clarity. The several figures depict one or more implementations and are presented by way of example only and should not be construed as limiting. Included in the drawing are the following figures:



FIG. 1 is a high-level functional block diagram of an example smartphone configured to receive and transmit olfactory information;



FIG. 2 is a user interface on a smartphone displaying an olfactory sticker;



FIG. 3 is an augmented reality (AR) user interface displaying an olfactory sticker;



FIG. 4 is a flowchart of a method of sending and receiving an olfactory sticker;



FIG. 5 is an olfactory detector sensing scents proximate the smartphone;



FIG. 6 is a response curve of an array of olfactory sensors;



FIG. 7 is a flowchart of a method of sensing and determining scents with the olfactory detector; and



FIG. 8 shows the detected scents displayed on the display of the smartphone.





DETAILED DESCRIPTION

Disclosed is an electronic device having an olfactory detector including an array of olfactory sensors for determining scents proximate the electronic device, such as a smartphone, smart watch, or smart eyewear. Each sensor in the array of olfactory sensors is tuned to detect and measure the concentration of one or more specific chemical compounds or molecules. Specific combinations and concentrations of chemical compounds or molecules, as measured by the olfactory detector, represent distinct scents as perceived by the human olfactory system. A fan creates airflow of ambient air across the olfactory sensors. An analog to digital (A/D) converter receives and processes the sensor outputs of the olfactory sensors and provides the processed sensor output to a processor for further processing. The processor may display information of the determined scents on a display of the smart device, and the processor can also send information indicative of the determined scents to another device.


The terms “connect,” “connected,” “couple,” and “coupled” as used herein refer to any logical, optical, physical, or electrical connection, including a link or the like by which the electrical or magnetic signals produced or supplied by one system element are imparted to another coupled or connected system element. Unless described otherwise, coupled or connected elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements, or communication media, one or more of which may modify, manipulate, or carry the electrical signals. The term “on” means directly supported by an element or indirectly supported by the element through another element integrated into or supported by the element.


Additional objects, advantages and novel features of the examples will be set forth in part in the following description, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the present subject matter may be realized and attained by means of the methodologies, instrumentalities and combinations particularly pointed out herein.


Reference now is made in detail to the examples illustrated in the accompanying drawings and discussed below.



FIG. 1 illustrates a high-level functional block diagram of an example mobile device in a sample configuration. As illustrated, smartphone 100 includes a flash memory 110 that stores programming to be executed by a CPU 120 to perform all or a subset of the functions described herein. As shown in FIG. 1, the smartphone 100 further includes a mobile display driver 130 coupled to the CPU 120, a user input layer 140 (e.g., a touchscreen) of a front facing image display 145, a display controller 150, a front facing visible light camera 155, and one or more rear facing visible light cameras 160 with substantially overlapping fields of view. In such a configuration, the flash memory 110 may further include multiple images or videos, which are generated via the cameras.


As shown in FIG. 1, the smartphone 100 may further include at least one digital transceiver (XCVR) 165, shown as WWAN XCVRs, for digital wireless communications via a wide-area wireless mobile communication network. The smartphone 100 also may include additional digital or analog transceivers, such as short-range transceivers (XCVRs) 170 for short-range network communication, such as via NFC, VLC, DECT, ZigBee, BLUETOOTH®, or WI-FI®. For example, short range XCVRs 170 may take the form of any available two-way wireless local area network (WLAN) transceiver of a type that is compatible with one or more standard protocols of communication implemented in wireless local area networks, such as one of the WI-FI® standards under IEEE 802.11. In certain configurations, the XCVRs 170 also may be configured to communicate with a global event database.


To generate location coordinates for positioning of the smartphone 100, smartphone 100 also may include a global positioning system (GPS) receiver. Alternatively, or additionally, the smartphone 100 can utilize either or both the short range XCVRs 170 and WWAN XCVRs 165 for generating location coordinates for positioning. For example, cellular network, WI-FI®, or BLUETOOTH® based positioning systems can generate very accurate location coordinates, particularly when used in combination.


The transceivers 165, 170 (i.e., the network communication interface) may conform to one or more of the various digital wireless communication standards utilized by modern mobile networks. Examples of WWAN transceivers 165 include (but are not limited to) transceivers configured to operate in accordance with Code Division Multiple Access (CDMA) and 3rd Generation Partnership Project (3GPP) network technologies including, for example and without limitation, 3GPP type 2 (or 3GPP2) and LTE, at times referred to as “4G,” or 5G New Radio, referred to as “5G.” For example, the transceivers 165, 170 provide two-way wireless communication of information including digitized audio signals, still image and video signals, web page information for display as well as web-related inputs, and various types of mobile message communications to/from the smartphone 100.


The smartphone 100 further includes a microprocessor that functions as a central processing unit (CPU), shown as CPU 120 in FIG. 1. A microprocessor is a circuit having elements structured and arranged to perform one or more processing functions, typically various data processing functions. Although discrete logic components could be used, the examples utilize components forming a programmable CPU. A microprocessor, for example, includes one or more integrated circuit (IC) chips incorporating the electronic elements to perform the functions of the CPU 120. The CPU 120, for example, may be based on any known or available microprocessor architecture, such as a Reduced Instruction Set Computing (RISC) architecture using ARM, as commonly used today in mobile devices and other portable electronic devices. Of course, other arrangements of microprocessor circuitry may be used to form the CPU 120 or microprocessor hardware in a smartwatch, smartphone, laptop computer, or tablet.


The CPU 120 serves as a programmable host controller for the smartphone 100 by configuring the smartphone 100 to perform various operations in accordance with instructions or programming executable by CPU 120. For example, such operations may include various general operations of the smartphone 100, as well as operations related to the programming for applications on the smartphone 100. Although a microprocessor may be configured by use of hardwired logic, typical microprocessors in mobile devices are general processing circuits configured by execution of programming.


The smartphone 100 further includes a memory or storage system for storing programming and data. In the example illustrated in FIG. 1, the memory system may include the flash memory 110, a random-access memory (RAM) 180, a local event database, and other memory components as needed. The RAM 180 may serve as short-term storage for instructions and data being handled by the CPU 120, e.g., as a working data processing memory, while the flash memory 110 typically provides longer-term storage.


Hence, in the example of smartphone 100, the flash memory 110 is used to store programming or instructions for execution by the CPU 120. Depending on the type of device, the smartphone 100 stores and runs a mobile operating system through which specific applications are executed. Examples of mobile operating systems include Google Android, Apple iOS (for iPhone or iPad devices), Windows Mobile, Amazon Fire OS, RIM BlackBerry OS, or the like.


In sample configurations, the CPU 120 may construct a map of the environment surrounding the smartphone 100, determine a location of the smartphone 100 within the mapped environment, and determine a relative position of the smartphone 100 to one or more objects in the mapped environment. The CPU 120 may construct the map and determine location and position information using a simultaneous localization and mapping (SLAM) algorithm applied to data received from one or more sensors.


Sensor data may include images received from cameras 155 and 160, distance(s) received from a laser range finder, position information received from a GPS unit, motion and acceleration data received from an inertial measurement unit (IMU) 190, or a combination of data from such sensors, or from other sensors that provide data useful in determining positional information. In the context of augmented reality, a SLAM algorithm is used to construct and update a map of an environment, while simultaneously tracking and updating the location of a device (or a user) within the mapped environment. The mathematical solution can be approximated using various statistical methods, such as particle filters, Kalman filters, extended Kalman filters, and covariance intersection. In a system that includes a high-definition (HD) video camera that captures video at a high frame rate (e.g., thirty frames per second), the SLAM algorithm updates the map and the location of objects at least as frequently as the frame rate, in other words, calculating and updating the mapping and localization thirty times per second.
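

By way of illustration only, the predict/update cycle at the core of such Kalman-filter approximations can be sketched as follows. This is a generic one-dimensional, constant-velocity Kalman filter stepped once per frame of 30 fps video; the noise covariances are assumed values, and a full SLAM implementation would additionally maintain the map and perform data association.

    import numpy as np

    # Generic 1-D constant-velocity Kalman filter, stepped once per video
    # frame (30 fps). Illustrative only: noise covariances are assumed.
    dt = 1.0 / 30.0
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition: [position, velocity]
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = np.eye(2) * 1e-4                    # process noise (assumed)
    R = np.array([[1e-2]])                  # measurement noise (assumed)

    def kalman_step(x, P, z):
        # Predict: propagate state estimate x and covariance P one frame ahead.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update: fold in the new position measurement z (e.g., camera-derived).
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        return x, P

    # Thirty measurements -> thirty predict/update cycles, matching the frame rate.
    x, P = np.zeros((2, 1)), np.eye(2)
    for z in np.random.normal(1.0, 0.1, size=30).reshape(30, 1, 1):
        x, P = kalman_step(x, P, z)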


The smartphone 100 is configured to send messages to, and receive messages from, a remote smartphone device via the short range XCVRs 170 and the WWAN XCVRs 165. The messaging supports traditional chat-based messages, augmented reality (AR) messages, and hybrid chat-AR based messages. The smartphone 100 further includes an olfactory transducer 200 configured to store and disperse a small amount of one or more stored scents to be perceived by a user in proximity to the olfactory transducer 200. The CPU 120 is configured to send control signals to the olfactory transducer 200 to control the dispersion of the scents. In one example, the olfactory transducer 200 is built into the smartphone 100. In other examples, a separate device containing the olfactory transducer 200 is coupled to the smartphone 100, such as a cartridge coupled to the smartphone 100.


The smartphone 100 also includes an olfactory detector 220 that is configured to both sense and determine a scent proximate the smartphone 100. The olfactory detector 220 includes a plurality of olfactory sensors 230 for sensing and determining a plurality of different scents and their intensity, as will be described further with respect to FIG. 5, FIG. 6, FIG. 7 and FIG. 8.



FIG. 2 illustrates a user interface 202 forming the user input layer 140 on the display 145 of the smartphone 100 and displaying an olfactory sticker 204. The user interface 202 displays messages 210, such as chats, sent by a user of the smartphone 100 via the short range XCVRs 170 and the WWAN XCVRs 165 to a remote smartphone, and messages 212 received by the smartphone 100 and viewable by the user. The user interface 202 is configured to send and receive the olfactory sticker 204 encoded with olfactory information corresponding to a particular scent. The encoded olfactory information is configured to activate the olfactory transducer 200 to release the corresponding scent when the user engages the olfactory sticker 204. The olfactory sticker 204 graphically represents the encoded olfactory information to the user of the smartphone 100. The olfactory sticker 204 comprises a visual preview 206 of the scent conveyed. In an example, the visual preview 206 can be an icon, symbol, cartoon, picture, emoji, Bitmoji®, text, or similar visual information. In the example shown in FIG. 2, the visual preview 206 is a cartoon pineapple used to represent a scent of a pineapple that has been sent to the smartphone 100. A variety of images may be used as a visual preview 206 for the olfactory sticker 204. For example, a vanilla flower may be used to represent the scent of vanilla, a chocolate cake may be used to represent a scent of chocolate, and a strawberry may be used to represent a scent of a strawberry.


The olfactory sticker 204 may also include text instructions 208 indicating how to engage and activate the olfactory sticker 204 to release the scent via the olfactory transducer 200. In the example shown in FIG. 2, the text instruction 208 is “Rub Me!” to indicate to the user that the olfactory sticker 204 displayed on user interface 202 must be rubbed to release the scent represented by the olfactory sticker 204. Other text, such as “Tap Me!”, may be used to indicate to the user that the olfactory sticker 204 must be tapped to release the scent represented by the olfactory sticker 204. The user interface 202 is configured to detect the interaction between the user and the olfactory sticker 204. The intensity of tapping or rubbing the displayed olfactory sticker 204 regulates the amount of scent released by the olfactory transducer 200 so that the user is not overwhelmed with a received scent. For example, a single tap of the displayed olfactory sticker 204 translates to a small, predetermined amount of scent being released from the olfactory transducer 200, whereas two taps translate to twice the amount of scent being released as compared to the single tap. In the example of rubbing the olfactory sticker 204, rubbing for 2 seconds translates to a small, predetermined amount of scent being released, whereas 5 seconds of rubbing translates to twice the amount of scent being released as compared to the 2 seconds of rubbing. The configuration of rubbing the olfactory sticker 204 simulates similar, physical scent-based stickers that are activated by rubbing or scratching, providing an intuitive user interface.
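

By way of illustration, the dosing rule described above can be expressed in a few lines. The base amount and the linear interpolation between the two-second and five-second rub durations are assumptions; the disclosure specifies only that two taps (or five seconds of rubbing) release twice the amount of one tap (or two seconds of rubbing).

    # Hypothetical dosing rule for an engaged olfactory sticker. BASE_AMOUNT
    # and the interpolation between the stated endpoints are assumed values.
    BASE_AMOUNT = 1.0  # one "unit" of scent released by the transducer

    def scent_amount(interaction: str, taps: int = 0, rub_seconds: float = 0.0) -> float:
        if interaction == "tap":
            # One tap -> 1x the base amount, two taps -> 2x, and so on.
            return BASE_AMOUNT * max(taps, 0)
        if interaction == "rub":
            # 2 s of rubbing -> 1x, 5 s -> 2x; interpolate linearly in between.
            if rub_seconds < 2.0:
                return 0.0
            return BASE_AMOUNT * (1.0 + (min(rub_seconds, 5.0) - 2.0) / 3.0)
        return 0.0

    assert scent_amount("tap", taps=2) == 2 * scent_amount("tap", taps=1)
    assert scent_amount("rub", rub_seconds=5.0) == 2 * scent_amount("rub", rub_seconds=2.0)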



FIG. 3 illustrates a sender smartphone 100A having a display 145A including user interface 202A and a receiver smartphone 100B having a display 145B including user interface 202B. The sender user interface 202A and the receiver user interface 202B are configured to display messaging, such as an AR chat, between the smartphones. The sender smartphone 100A is configured to display the AR chat on the user interface 202A with an image generated by the rear camera 160 of the sender smartphone 100A. As illustrated, the sender has selected an object 310, a pineapple, within the view of the rear camera 160 to create a cloned image 312 (i.e., “segmented image”) of the object 310. The sender smartphone 100A includes a scent sensor or an image classification function to determine if information indicative of a scent can also be sent with the message. In the example of the pineapple image 312 being sent, information indicative of a scent of a pineapple is sent with the message in the form of the olfactory sticker 204. The sender smartphone 100A sends the cloned image 312 and the olfactory sticker 204 to the receiver smartphone 100B. The receiver smartphone 100B displays the cloned image 312 with the olfactory sticker 204 on the user interface 202B of the receiver smartphone 100B. In this example, the displayed olfactory sticker 204 partially overlaps the displayed cloned image 312. In other examples, the olfactory sticker 204 may be displayed to completely overlap the cloned image 312 or be displayed separately from the cloned image 312. In another example, the sender may choose to send only an olfactory sticker 204 without an accompanying cloned image. In an example where two users are in an AR chat proximate to each other, a shared, physically separate olfactory transducer 200 may be used by both users to disperse the received scents.


The olfactory sticker 204 instructs the olfactory transducer 200 to emit the corresponding scent when the olfactory sticker 204 is engaged by a user of the receiver smartphone 100B, such as by tapping or rubbing the displayed olfactory sticker 204 as previously described in reference to FIG. 2.


In one example, the user of the sender smartphone 100A may select which olfactory sticker 204 to send from a displayed set of predefined olfactory stickers 204 that correlate to predefined scents stored in the memory 110 of the sender smartphone 100A. The receiver smartphone 100B has access to an identical, or a different, set of predefined olfactory stickers 204 to easily preview what particular scent has been received without activating the olfactory sticker 204. For example, an image of a pine tree corresponds to a fresh pine scent for both the sender and the receiver. The sender smartphone 100A sends an olfactory sticker 204 representing a pine tree to the receiver smartphone 100B, and the receiver smartphone 100B knows from the predefined olfactory stickers 204 that a fresh pine scent olfactory sticker 204 was sent, without activating the olfactory sticker 204. In another example, the user of the sender smartphone 100A may select a predetermined scent to accompany a custom olfactory sticker. For example, the user selects an image of a candle to be the visual preview 206 of the olfactory sticker 204 and selects the scent of the olfactory sticker 204 to be a citrus scent. The receiver smartphone 100B receives this custom olfactory sticker 204 and does not know the scent of the olfactory sticker 204 until the olfactory sticker 204 is activated and the scent is released by the olfactory transducer 200.
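

One plausible encoding of an olfactory sticker as it might travel inside a message is sketched below, by way of example only; the field names are hypothetical, as the disclosure does not specify a data format.

    from dataclasses import dataclass

    # Hypothetical sticker payload; field names are illustrative only.
    @dataclass
    class OlfactorySticker:
        scent_id: str         # key into a set of scents, e.g. "pine" or "citrus"
        preview_image: str    # visual preview 206 shown on the display
        instruction: str      # e.g. "Rub Me!" or "Tap Me!"
        custom: bool = False  # True when the sender paired a custom preview with a scent

    # Predefined sticker: both devices share the scent_id mapping, so the
    # receiver can preview the scent without activating the sticker.
    pine = OlfactorySticker("pine", "pine_tree.png", "Tap Me!")

    # Custom sticker: the candle preview does not reveal the chosen citrus
    # scent until the sticker is activated.
    surprise = OlfactorySticker("citrus", "candle.png", "Rub Me!", custom=True)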



FIG. 4 is a flowchart 400 illustrating a method of sending and receiving a message including an olfactory sticker 204 between two smartphones 100.


At block 402, the user of the sender smartphone 100A creates and sends a chat message with an olfactory sticker 204 to the receiver smartphone 100B. The processor 120 of the sender smartphone 100A sends the message via the short range XCVRs 170 or WWAN XCVRs 165. The message sent may be a traditional chat-based message or an AR message. In one example, the sender smartphone 100A sends an olfactory sticker 204 selected from a predetermined set of olfactory stickers 204 that have a predetermined scent. In another example, the sender selects a scent to accompany the olfactory sticker 204. In another example, the sender sends an identified scent determined by the olfactory detector 220 proximate the smartphone 100A, as detailed with respect to FIGS. 5-8.


At block 404, the receiver smartphone 100B receives the message forwarded from the sender smartphone 100A via respective short range XCVRs 170 or WWAN XCVRs 165. The message contains the olfactory sticker 204 with the encoded olfactory information. The receiver smartphone 100B displays the olfactory sticker 204 on the display 145B via the receiver smartphone 100B user interface 202B.


At block 406, the user of the receiver smartphone 100B activates the olfactory sticker 204 by interacting with the displayed olfactory sticker 204. For example, the user taps or rubs the displayed olfactory sticker 204. The user can determine the intensity of the released scent by controlling the intensity of the interaction with the displayed olfactory sticker 204.


At block 408, the processor 120 of the receiver smartphone 100B sends a signal to the olfactory transducer 200 to release a scent corresponding to the encoded olfactory information in the olfactory sticker 204 and the intensity of the user interaction with the olfactory sticker 204. The user of the receiver smartphone 100B is then able to perceive the released scent.
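

The receiver-side portion of this method (blocks 404 through 408) can be sketched as follows, by way of illustration; the message, display, and transducer objects are hypothetical stand-ins for the device's actual interfaces.

    # Hypothetical receiver-side handler for blocks 404-408; the display and
    # transducer interfaces are illustrative stand-ins.
    def on_sticker_received(message: dict, display, transducer) -> None:
        # Block 404: the received message carries the encoded olfactory
        # sticker, which is displayed on the user interface.
        sticker = message["olfactory_sticker"]
        display.show(sticker["preview_image"])

        def on_user_interaction(taps: int) -> None:
            # Block 406: the user engages the displayed sticker; the
            # interaction intensity (here, tap count) scales the dose.
            amount = 1.0 * taps
            # Block 408: the processor signals the olfactory transducer to
            # release the scent encoded in the sticker.
            transducer.release(sticker["scent_id"], amount)

        display.on_touch(on_user_interaction)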


The human olfactory system uses about 300 olfactory receptors to sample scents inhaled through the nose. Individual human olfactory receptors are sensitive to multiple chemical compounds present in the inhaled scent. These receptors' responses are transmitted to the brain, which combines and processes the “signals” of the olfactory receptors to determine the scent.



FIG. 5 illustrates the olfactory detector 220 of FIG. 1 that mimics the biological design and is configured to detect and determine different scents proximate the smartphone 100, and the intensity of the scents. In other examples, the olfactory detector 220 can be included in other electronic devices, such as smart eyewear devices and smart watches.


In the example shown, the olfactory detector 220 includes a manifold 222 having an airflow input 220, an airflow output 226, and a fan 228 that draws ambient air proximate the manifold 222 across an array of olfactory sensors 230. Each olfactory sensor 230 generates an analog signal indicative of a chemical compound or molecule and its concentration in the ambient air, where each olfactory sensor 230 has a different response curve as illustrated in FIG. 6. An analog to digital (A/D) converter 232 converts the analog signals to digital signals, and the digital signals are communicated to CPU 120 for determining the scents and their intensity. The CPU 120 displays the determined scents and their intensity on display 145, as shown in FIG. 8.


In an example, the olfactory sensors 230 are metal oxide semiconductor (MOS) sensors, as they are well-characterized, inexpensive, widely available and, in contrast to quartz crystal microbalance (QCM) sensors, less sensitive to interference from environmental conditions such as airflow effects and electromagnetic effects from other QCM sensors operating in the vicinity. Table 1 shows an example list of sensors and the specific compounds each is configured to respond to. Examples of the MOS sensors are MQ Series sensors available from Figaro USA Inc. and Hanwei Electronics Co., Ltd. An example A/D converter 232 is an Arduino Mega 2560 that converts the analog outputs of the olfactory sensors 230 to digital signals. The digital signals are transmitted via a serial connection to the CPU 120 for further processing.
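

By way of example, host-side code for reading the digitized samples over the serial connection might look as follows. The pyserial calls are standard, but the device path and the comma-separated, one-line-per-sample output format are assumptions about the converter firmware.

    import serial  # pyserial

    PORT = "/dev/ttyACM0"  # assumed device path for the Arduino-based converter
    SENSOR_NAMES = ["MQ-2", "MQ-3", "MQ-4", "MQ-5", "MQ-6",
                    "MQ-9", "MQ-135", "MQ-136", "MQ-137", "MQ-138"]

    def read_sample(conn: serial.Serial) -> dict:
        # One line per sample, e.g. "0.41,0.12,..." (assumed firmware format).
        fields = conn.readline().decode("ascii").strip().split(",")
        return dict(zip(SENSOR_NAMES, (float(f) for f in fields)))

    with serial.Serial(PORT, baudrate=9600, timeout=2.0) as conn:
        print(read_sample(conn))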


TABLE 1

Sensor Type    Primary Compounds with Sensor Response
MQ-2           methane, butane, LPG, smoke
MQ-3           alcohol, ethanol, smoke
MQ-4           methane gas, CNG
MQ-5           natural gas, LPG
MQ-6           LPG, iso-butane, propane
MQ-9           carbon monoxide, combustible gases
MQ-135         benzene, alcohol, smoke
MQ-136         hydrogen sulfide gas
MQ-137         ammonia
MQ-138         benzene, toluene, alcohol, acetone, propane, formaldehyde gas, hydrogen

In an example of a machine learning model, as shown in FIG. 6, the following scents commonly found in a household are used as classes for classification: (1) clean air (no scent), (2) food (food and cooking smells), (3) coffee (smell emitted while brewing coffee), and (4) bread baking. A logging application executed by the CPU 120 samples the sensor voltages at one-second intervals and writes the sensor voltages to a log file together with a label. In an example, over 600 seconds of scent samples for each class can be logged. To verify that the various scents are distinguishable, the labeled sample points can be projected into two dimensions via Principal Component Analysis (PCA). As can be seen in FIG. 6, the scent classes have relatively little overlap. This indicates that the scent classes are visually distinguishable and that supervised learning techniques are applicable to the data set.
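

A minimal sketch of this workflow is shown below, assuming a log file with one voltage column per sensor and a trailing label column; scikit-learn's PCA is used for the projection, and a random-forest classifier stands in for the otherwise unspecified supervised model.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Assumed log layout: one voltage column per sensor, then a text label
    # ("clean air", "food", "coffee", "bread baking"), one row per second.
    data = np.genfromtxt("scent_log.csv", delimiter=",", dtype=str, skip_header=1)
    X = data[:, :-1].astype(float)   # sensor voltages
    y = data[:, -1]                  # scent class labels

    # Project to two dimensions with PCA to check that classes overlap little.
    projection = PCA(n_components=2).fit_transform(X)

    # Supervised training on the labeled measurements; random forest is an
    # illustrative choice, as the disclosure does not name a model.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))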


The olfactory detector 220 can be used as a scent-sensing IoT appliance that is always on and classifies the current scent in a space. Using the scent classification, a server can provide a web endpoint that displays the currently detected scent and a history of recently detected scents. Furthermore, a simple Slack Bot (Sniff Bot) can be implemented that posts the currently detected scent to a Slack channel every hour and on every change of detected scent. Such updates can augment awareness of what is happening in the environment, such as that of a remote co-worker, without being as intrusive as, e.g., a camera feed.
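

By way of illustration, the Sniff Bot behavior might be implemented as below; the webhook URL is a placeholder, the classifier call is a stub for the model described above, and Slack's incoming-webhook JSON payload ("text") is a documented format.

    import time
    import requests

    WEBHOOK_URL = "https://hooks.slack.com/services/..."  # placeholder URL

    def classify_current_scent() -> str:
        # Stub: in the appliance this would run the trained classifier on
        # the most recent sensor sample (see the sketch above).
        return "coffee"

    def post_scent(scent: str) -> None:
        # Slack incoming webhooks accept a JSON body with a "text" field.
        requests.post(WEBHOOK_URL, json={"text": f"Current scent: {scent}"})

    last_scent, last_post = None, 0.0
    while True:
        scent = classify_current_scent()
        now = time.time()
        # Post on every change of detected scent, and at least hourly.
        if scent != last_scent or now - last_post >= 3600:
            post_scent(scent)
            last_scent, last_post = scent, now
        time.sleep(1.0)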



FIG. 7 is a flow diagram showing a method 700 of determining scents proximate a smart device, such as smartphone 100, using the olfactory detector 220. This method 700 can be performed on other electronic devices such as smart eyewear devices and smart watches.


At block 702, the CPU 120 of the smartphone 100 enables the olfactory sensors 230 to each detect a scent that the respective olfactory sensor 230 is tuned to sense. This may be done by the CPU 120 powering the olfactory sensors 230.


At block 704, the CPU 120 enables the fan 228 to draw ambient air in through the airflow input 220, across the olfactory sensors 230, and out through the airflow output 226. The speed of the fan, and thus the airflow, is controlled by the CPU 120.


At block 706, the olfactory sensors 230 each detect the scent in the airflow that the respective olfactory sensor 230 is tuned to sense. Each olfactory sensor 230 generates an electrical analog signal that is indicative of the detected scent and the intensity of the detected scent. Each olfactory sensor 230 can detect a single scent or a combination of scents, depending on the selected olfactory sensor 230.


At block 708, the A/D converter 232 processes the received analog signals from each of the olfactory sensors 230. The A/D converter 232 generates one or more digital signals indicative of the detected scents and communicates the digital signals to the CPU 120 for further processing.


At block 710, the CPU 120 processes the digital signals from the A/D converter 232, including identifying the detected scents and their intensity. The CPU 120 displays the identified scents on the display 145 of the smartphone 100 as shown in FIG. 8. The CPU 120 also communicates the detected scents to a remote device, such as another smartphone 100, a smart eyewear device, a smart watch, or a server, via the short range XCVRs 170 and the WWAN XCVRs 165.
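

A compact sketch tying blocks 702 through 710 together follows, by way of example only; the hardware interfaces are hypothetical stand-ins, and the classifier is assumed to be the supervised model described with reference to FIG. 6.

    # Hypothetical one-cycle pipeline for method 700; hw, display, and xcvr
    # are illustrative stand-ins for the device's actual interfaces.
    def run_detection_cycle(hw, clf, display, xcvr) -> None:
        hw.power_sensors(True)              # block 702: enable the olfactory sensors
        hw.set_fan_speed(0.5)               # block 704: draw ambient air across them
        analog = hw.read_sensor_array()     # block 706: per-sensor analog responses
        digital = hw.adc_convert(analog)    # block 708: A/D conversion for the CPU
        scent = clf.predict([digital])[0]   # block 710: identify the scent,
        display.show(f"Detected: {scent}")  #   show it on the display,
        xcvr.send({"scent": scent})         #   and send it to a remote device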


It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises or includes a list of elements or steps does not include only those elements or steps but may include other elements or steps not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.


Unless otherwise stated, all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. Such amounts are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain. For example, unless expressly stated otherwise, a parameter value or the like may vary by as much as ±10% from the stated amount.


In addition, in the foregoing Detailed Description, various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed examples require more features than are expressly recited in each claim. Rather, as the following claims reflect, the subject matter to be protected lies in less than all features of any single disclosed example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.


While the foregoing has described what are considered to be the best mode and other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that they may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim any and all modifications and variations that fall within the true scope of the present concepts.

Claims
  • 1. An electronic device, comprising: a display comprising a user interface; an olfactory detector configured to detect a scent proximate the electronic device and generate a signal indicative of the detected scent; and a processor configured to process the signal and determine the scent.
  • 2. The electronic device of claim 1, wherein the processor is configured to display the determined scent on the display.
  • 3. The electronic device of claim 1, further comprising an analog to digital (A/D) converter configured to convert the signal from the olfactory detector to a digital signal, the A/D converter configured to communicate the digital signal to the processor.
  • 4. The electronic device of claim 1, wherein the olfactory detector is configured to determine an intensity of the detected scent.
  • 5. The electronic device of claim 1, wherein the olfactory detector comprises a metal oxide semiconductor (MOS) detector.
  • 6. The electronic device of claim 1, further comprising a transceiver coupled to the processor and configured to communicate information of the detected scent to a remote device.
  • 7. The electronic device of claim 1, wherein the display comprises a touchscreen that is operable as the user interface.
  • 8. The electronic device of claim 7, wherein the touchscreen is configured to display messaging between the electronic device and a remote device.
  • 9. The electronic device of claim 1, comprising a plurality of the olfactory detectors each configured to detect a different scent.
  • 10. The electronic device of claim 7, wherein the touchscreen is configured to display an olfactory sticker on the touchscreen.
  • 11. A method of using an electronic device comprising a display including a user interface, an olfactory detector configured to detect a scent proximate the electronic device and generate a signal indicative of the detected scent, and a processor configured to process the signal and determine the scent, the method comprising: detecting a scent proximate the electronic device and generating a signal indicative of the detected scent using the olfactory detector; determining the scent by processing the signal using the processor; and displaying information indicative of the determined scent on the display.
  • 12. The method of claim 11, wherein the processor displays the determined scent on the display.
  • 13. The method of claim 11, wherein the electronic device further comprises an analog to digital (A/D) converter converting the signal from the olfactory detector to a digital signal, the A/D converter communicating the digital signal to the processor.
  • 14. The method of claim 11, wherein the olfactory detector determines an intensity of the detected scent.
  • 15. The method of claim 11, wherein the olfactory detector comprises a metal oxide semiconductor (MOS) detector.
  • 16. The method of claim 11, further comprising a transceiver coupled to the processor and communicating information of the detected scent to a remote device.
  • 17. The method of claim 11, wherein the display comprises a touchscreen that is operable as the user interface.
  • 18. The method of claim 17, wherein the touchscreen displays messaging between the electronic device and a remote device.
  • 19. The method of claim 11, wherein the detecting comprises detecting the scent from a plurality of different scents using a plurality of the olfactory detectors, each olfactory detector detecting a different scent.
  • 20. A non-transitory computer-readable medium storing program code which, when executed, is operative to cause an electronic processor of an electronic device comprising a display including a user interface, an olfactory detector configured to detect a scent proximate the electronic device and generate a signal indicative of the detected scent, and a processor configured to process the signal and determine the scent, to: detect a scent proximate the electronic device and generate a signal indicative of the detected scent using the olfactory detector; and determine the scent by processing the signal using the processor.