Recently, it has become more common for users to utilize electronic devices in moving vehicles, such as, for example, automobiles. The user interface may be displayed on an in-dash computer screen or may be located on a smartphone, which may be carried or may be physically mounted on a dashboard of the vehicle, for example. Performing searches, such as, for example, Internet or database searches, may be dangerous for a user while driving. Such searches often require data entry, and having a driver type in characters while driving may not be safe. While recent implementations utilize voice recognition, these implementations still require that the user's attention be diverted while speaking search commands. Some studies have found that voice-based commands are no safer to use while driving than physically typing.
The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art, that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the embodiments. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
In an example embodiment, a search is automated in a vehicle. This automated search is performed based on one or more detected user interests. In an example embodiment, the detected user interests are based on information detected about the environment in the vehicle. In one specific example embodiment, user actions within the vehicle, such as tapping a finger to a song or singing along with a song, may be used to determine that the user has a specific interest in the song or a topic related to the song (e.g., artist, album, etc.). Once this user interest has been determined, an automated search may be performed based on the interest, and the results returned to the user without requiring express interaction on the part of the user. For example, a user tapping his or her fingers to a song by a particular band may receive search results including item listings of concert tickets for the band that performs the song.
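By way of illustration only, the following minimal sketch shows one way the behavior-driven flow described above, observing an in-vehicle behavior, inferring an interest, and running an automated search, could be expressed. The Signal, infer_interest, and auto_search names, and the mock listings, are hypothetical assumptions and not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Signal:
    kind: str         # e.g., "finger_tapping" or "singing_along" (hypothetical labels)
    song_artist: str  # artist of the song playing when the behavior was observed

def infer_interest(signal: Signal) -> Optional[str]:
    """Map an observed in-vehicle behavior to a topic of interest, if any."""
    if signal.kind in ("finger_tapping", "singing_along"):
        return signal.song_artist           # interest in the performing artist
    return None

def auto_search(topic: str) -> List[str]:
    """Stand-in for an Internet or marketplace query; returns mock listings."""
    return [f"Concert tickets: {topic}", f"{topic} album"]

if __name__ == "__main__":
    observed = Signal(kind="finger_tapping", song_artist="Example Band")
    topic = infer_interest(observed)
    if topic is not None:
        for listing in auto_search(topic):
            print(listing)  # results returned without an express search by the user
```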
An Application Program Interface (API) server 114 and a web server 116 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 118. The application servers 118 host one or more marketplace applications 120 and payment applications 122. The application servers 118 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more databases 126.
The marketplace applications 120 may provide a number of marketplace functions and services to users that access the networked system 102. The payment applications 122 may likewise provide a number of payment services and functions to users. The payment applications 122 may allow users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via the marketplace applications 120. While the marketplace and payment applications 120 and 122 are both shown as forming part of the networked system 102, it will be appreciated that, in alternative embodiments, the payment applications 122 may form part of a payment service that is separate and distinct from the networked system 102.
Further, while the system 100 employs a client-server architecture, the embodiments are, of course, not limited to such an architecture, and could equally well find application in a distributed, or peer-to-peer, architecture system.
The dashboard client 106 accesses the various marketplace and payment applications 120 and 122 via a web interface supported by the web server 116. Similarly, the programmatic client 108 accesses the various services and functions provided by the marketplace and payment applications 120 and 122 via the programmatic interface provided by the API server 114. The programmatic client 108 may, for example, be a seller application (e.g., the TurboLister application developed by eBay Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 108 and the networked system 102.
User interests may be detected through a number of different environmental sensors or information sources.
An accelerometer 224, such as those commonly found in smartphones, may also be accessed as one such environmental sensor.
Also presented are information sources 226-230 that may commonly be located outside of the vehicle 110, such as a mapping server 226, which may be used to determine points of interest around the current location of the vehicle 110; a weather server 228, which may be used to determine local weather conditions (e.g., whether it is a sunny day); and a user profile database 230, which may store demographic information about the user (e.g., a driver), such as age, which could be useful in determining the user's interests. The user profile may also contain previously established user interests, obtained either from other sources or from previous executions of the user interest detection module 202. These previously established user interests can be used to help infer future user interests. For example, a user who is a fan of one heavy metal band may be more likely to be a fan of another heavy metal band.
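As a purely illustrative sketch, the snippet below shows how readings from the in-vehicle sensors and the external sources described above (mapping server 226, weather server 228, user profile database 230) might be gathered into a single context record before interest detection. The VehicleContext fields and the mocked values are assumptions, not the disclosed data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VehicleContext:
    accelerometer_rhythm_hz: float                 # e.g., periodic tapping frequency
    current_song_artist: str                       # from the vehicle audio system
    nearby_points_of_interest: List[str] = field(default_factory=list)  # mapping server 226
    weather: str = "unknown"                       # weather server 228
    known_interests: List[str] = field(default_factory=list)            # profile database 230

def gather_context() -> VehicleContext:
    """Combine mocked readings from the sensors and servers into one record."""
    return VehicleContext(
        accelerometer_rhythm_hz=2.1,                     # from accelerometer 224
        current_song_artist="Example Band",
        nearby_points_of_interest=["Arena", "Stadium"],
        weather="sunny",
        known_interests=["heavy metal"],
    )

if __name__ == "__main__":
    print(gather_context())
```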
The user interest detection module 202 may be located in the vehicle 110, on a mobile device, or even on a separate server, such as a web server 116. The user interest detection module 202 may act to calculate a score for possible user interests, based on one or more of the factors described above. This score may be compared with a series of thresholds to determine which of a number of different user interests should be searched. The thresholds may be stored in a table.
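A minimal sketch of this scoring step, assuming illustrative weights and threshold values (neither of which come from the disclosure), might look as follows: each candidate interest accumulates a score from the observed factors, and only candidates whose score clears the corresponding entry in the threshold table are searched.

```python
from typing import Dict, List

THRESHOLDS: Dict[str, float] = {"artist": 0.6, "genre": 0.8}   # the stored threshold table

def score_interests(tapping_detected: bool, singing_detected: bool,
                    prior_interests: List[str]) -> Dict[str, float]:
    """Accumulate a score per candidate interest from the observed factors."""
    scores = {"artist": 0.0, "genre": 0.0}
    if tapping_detected:
        scores["artist"] += 0.4
    if singing_detected:
        scores["artist"] += 0.4
    if "heavy metal" in prior_interests:   # previously established interests reinforce related ones
        scores["genre"] += 0.5
        scores["artist"] += 0.2
    return scores

def detect_interests(tapping_detected: bool, singing_detected: bool,
                     prior_interests: List[str]) -> List[str]:
    """Return only the candidate interests whose score meets its threshold."""
    scores = score_interests(tapping_detected, singing_detected, prior_interests)
    return [name for name, s in scores.items() if s >= THRESHOLDS[name]]

# Example: tapping plus a known heavy-metal preference clears the "artist" threshold.
print(detect_interests(True, False, ["heavy metal"]))   # -> ['artist']
```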
The user interest detection module 202 may then perform a search based on the detected user interests, using the Internet or another database. The results from this search may be presented to the user by a user interface presentation module 232.
The user may then indicate that he or she has interest in the concert by, for example, pressing on a touchscreen over the pop-up window 302 to select the concert. In some embodiments, however, the user interest may simply be inferred so that active participation by the user is not required. For example, the system may simply assume that the concert closest to the location of the vehicle 110 is a concert the user has interest in. Regardless of how the concert is selected, the selection may trigger a second Internet search that returns merchandise items related to the concert. While the merchandise can be any type of merchandise, in this example the merchandise items are concert tickets. User interface screen capture 300B displays the results of this second Internet search. The user is then able to select from among available tickets 304A, 304B for the concert. Once tickets are selected, user interface screen capture 300C may display a payment window where the user can pay for the concert tickets through the vehicle interface. Thus, the user has been able to search for and purchase concert tickets through the user interface of the vehicle 110 without having to affirmatively search.
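The two-stage flow just described, a first search for concerts matching the detected interest, automatic selection of the concert nearest the vehicle, and a second search for ticket listings, could be sketched as follows. The search functions are mocks, and no real ticketing or payment API is assumed.

```python
from math import hypot
from typing import List, Tuple

Concert = Tuple[str, float, float]   # (venue name, x coordinate, y coordinate)

def search_concerts(artist: str) -> List[Concert]:
    """First search: mock list of upcoming concerts for the detected artist."""
    return [("Downtown Arena", 1.0, 2.0), ("Fairgrounds", 10.0, 12.0)]

def nearest_concert(concerts: List[Concert], vehicle_x: float, vehicle_y: float) -> Concert:
    """Infer the user's choice as the concert closest to the vehicle's location."""
    return min(concerts, key=lambda c: hypot(c[1] - vehicle_x, c[2] - vehicle_y))

def search_tickets(venue: str) -> List[str]:
    """Second search: mock ticket listings for the chosen concert."""
    return [f"{venue} - Floor, $120", f"{venue} - Balcony, $60"]

if __name__ == "__main__":
    concerts = search_concerts("Example Band")
    venue, _, _ = nearest_concert(concerts, vehicle_x=0.0, vehicle_y=0.0)
    for ticket in search_tickets(venue):
        print(ticket)   # listings presented to the user; payment would follow
```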
The example computer system 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 904 and a static memory 906, which communicate with each other via a bus 908. The computer system 900 may further include a video display unit 910 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 900 also includes an alphanumeric input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse), a disk drive unit 916, a signal generation device 918 (e.g., a speaker), and a network interface device 920.
The disk drive unit 916 includes a computer-readable medium 922 on which is stored one or more sets of instructions 924 (e.g., software) embodying any one or more of the methodologies or functions described herein. The instructions 924 may also reside, completely or at least partially, within the main memory 904 and/or within the processor 902 during execution thereof by the computer system 900, with the main memory 904 and the processor 902 also constituting machine-readable media. The instructions 924 may further be transmitted or received over a network 926 via the network interface device 920.
While the machine-readable medium 922 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 924. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies described herein. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
Although the inventive concepts have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the inventive concepts. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.
This application is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 13/965,407, filed on Aug. 13, 2013, which is hereby incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
6925489 | Curtin | Aug 2005 | B1 |
7095748 | Vij et al. | Aug 2006 | B2 |
7783249 | Robinson | Aug 2010 | B2 |
9330505 | Hosein et al. | May 2016 | B2 |
9330647 | Bay | May 2016 | B1 |
20020022984 | Daniel et al. | Feb 2002 | A1 |
20020023039 | Fritsch et al. | Feb 2002 | A1 |
20030114206 | Timothy et al. | Jun 2003 | A1 |
20040039646 | Hacker | Feb 2004 | A1 |
20040039677 | Mura et al. | Feb 2004 | A1 |
20050187838 | Squeglia et al. | Aug 2005 | A1 |
20060111145 | Kelly | May 2006 | A1 |
20060129290 | Zimmerman et al. | Jun 2006 | A1 |
20060271248 | Cosgrove, Jr. et al. | Nov 2006 | A1 |
20070014536 | Hellman | Jan 2007 | A1 |
20070143816 | Gupta | Jun 2007 | A1 |
20080042875 | Harrington et al. | Feb 2008 | A1 |
20080082221 | Nagy | Apr 2008 | A1 |
20080088597 | Prest et al. | Apr 2008 | A1 |
20080129684 | Adams et al. | Jun 2008 | A1 |
20080189110 | Freeman | Aug 2008 | A1 |
20080228689 | Tewary | Sep 2008 | A1 |
20080242280 | Shapiro | Oct 2008 | A1 |
20080261516 | Robinson | Oct 2008 | A1 |
20090030619 | Kameyama | Jan 2009 | A1 |
20090064169 | Nguyen et al. | Mar 2009 | A1 |
20090083141 | Craine | Mar 2009 | A1 |
20090150167 | Patenaude | Jun 2009 | A1 |
20090265212 | Hyman | Oct 2009 | A1 |
20090318777 | Kameyama et al. | Dec 2009 | A1 |
20100044121 | Simon et al. | Feb 2010 | A1 |
20100057341 | Bradburn | Mar 2010 | A1 |
20100211259 | Mcclellan | Aug 2010 | A1 |
20100235891 | Oglesbee et al. | Sep 2010 | A1 |
20100274480 | Mccall et al. | Oct 2010 | A1 |
20100280956 | Chutorash et al. | Nov 2010 | A1 |
20100311254 | Huang | Dec 2010 | A1 |
20110004523 | Giuli et al. | Jan 2011 | A1 |
20110035031 | Faenger | Feb 2011 | A1 |
20110093160 | Ramseyer | Apr 2011 | A1 |
20110096036 | Mcintosh et al. | Apr 2011 | A1 |
20110173539 | Rottler | Jul 2011 | A1 |
20110213628 | Peak et al. | Sep 2011 | A1 |
20110288724 | Falk | Nov 2011 | A1 |
20110288954 | Bertosa et al. | Nov 2011 | A1 |
20110294520 | Zhou et al. | Dec 2011 | A1 |
20110298702 | Sakata et al. | Dec 2011 | A1 |
20120054028 | Tengler | Mar 2012 | A1 |
20120072109 | Waite et al. | Mar 2012 | A1 |
20120089474 | Xiao et al. | Apr 2012 | A1 |
20120116550 | Hoffman et al. | May 2012 | A1 |
20120187916 | Duer | Jul 2012 | A1 |
20120271713 | Nussel et al. | Oct 2012 | A1 |
20120296513 | Ramseyer | Nov 2012 | A1 |
20130024113 | Weng et al. | Jan 2013 | A1 |
20130031088 | Srikrishna | Jan 2013 | A1 |
20130080371 | Harber | Mar 2013 | A1 |
20130117739 | Mueller et al. | May 2013 | A1 |
20130120449 | Ihara et al. | May 2013 | A1 |
20130303192 | Louboutin et al. | Nov 2013 | A1 |
20140025660 | Mohammed et al. | Jan 2014 | A1 |
20140026156 | Deephanphongs | Jan 2014 | A1 |
20140052745 | Hosein et al. | Feb 2014 | A1 |
Number | Date | Country |
---|---|---|
2014028464 | Feb 2014 | WO |
Entry |
---|
“U.S. Appl. No. 13/965,407, Final Office Action dated Oct. 29, 2015”, 8 pgs. |
“U.S. Appl. No. 13/965,407, Non Final Office Action dated May 29, 2015”, 8 pgs. |
“U.S. Appl. No. 13/965,407, Notice of Allowance dated Jan. 7, 2016”, 9 pgs. |
“U.S. Appl. No. 13/965,407, Response filed Aug. 31, 2015 to Non Final office Action dated May 29, 2015”, 6 pgs. |
“U.S. Appl. No. 13/965,407, Response filed Dec. 18, 2015 to Final office Action dated Oct. 29, 2015”, 6 pgs. |
Sean Lyden, “6 Mobile Applications to Prevent Distracted Driving Accidents”, Retrieved from the Internet URL: <http://www.automotive-fleet.com/channel/safety-accident-management/article/story/2011/08/6-mobile-applications-to-prevent-distracted-driving-accidents.aspx>, Aug. 2011, 10 pages. |
Number | Date | Country |
---|---|---|
20160239554 A1 | Aug 2016 | US |
Number | Date | Country |
---|---|---|
61682973 | Aug 2012 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 13965407 | Aug 2013 | US |
Child | 15138877 | | US |