This invention relates to radio systems and methods. In particular, this invention relates to controlling a multi-tuner radio.
The present invention is directed towards multi-tuner radio products and features, such as those illustratively disclosed in U.S. Pat. No. 7,171,174, issued Jan. 30, 2007, and U.S. Pat. No. 7,343,141, issued Mar. 11, 2008, each of which is hereby expressly incorporated by reference herein in its entirety.
The nature and various advantages of the present invention will become more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
With further reference to
The radio tuners can receive radio signals using a plurality of supported technologies. These may include AM, FM, HD, satellite, Internet (e.g., using Wi-Fi), or other technologies.
The device 100 can include adequate memory to buffer a predetermined amount of digitized and compressed audio for the received stations. For example, to be able to buffer one hour of audio from each of the eight stations, the device may include one-half gigabyte of internal memory. The device may include additional memory. For example, the device may include a total of 1 gigabyte of internal memory to allow one hour of storage per station plus an additional eight hours of storage for saved songs and other audio content. Memory may include memory chips or cartridges (e.g., RAM, dynamic RAM, static RAM, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), flash memory, and other memory chips or cartridges), and any other medium to which a computer can write and from which a computer can read (such as a disk drive).
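As a rough check of this sizing (assuming a nominal 128 kbps MP3 stream, a bit rate the text does not specify), eight one-hour buffers come to roughly half a gigabyte:

```c
#include <stdio.h>

int main(void) {
    /* Back-of-the-envelope check of the buffer sizing described above;
     * the 128 kbps MP3 bit rate is an assumption, not stated in the text. */
    const int stations = 8;               /* six presets + current + previous */
    const double bitrate_bps = 128000.0;  /* assumed MP3 bit rate */
    const double seconds = 3600.0;        /* one hour per station */

    double bytes_per_station = bitrate_bps / 8.0 * seconds;  /* ~57.6 MB */
    double total_gb = stations * bytes_per_station / 1e9;

    printf("Per station: %.1f MB/hour\n", bytes_per_station / 1e6);
    printf("All %d stations: %.2f GB (consistent with ~0.5 GB)\n",
           stations, total_gb);
    return 0;
}
```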
The device can include one or more processors, such as digital signal processors, and other circuits to digitally compress the received audio information and store it in memory. The audio data may be stored, for example, in MP3 format. The processor can perform other tasks, such as controlling the audio output of the device and managing the user inputs and outputs.
In the present embodiment, the 3D Radio device 100 can continuously receive, compress, and store audio data from the eight tuners (e.g., each of the six presets, the current station, and the previous station) into the memory device. The processor can continue to monitor the status of the memory device. Once an hour of audio programming has been stored for a station, the oldest audio data for that station may be overwritten, so that the most recent hour of radio programming from the station remains available for listening.
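A minimal sketch of this overwrite behavior, using one circular buffer per station (the buffer size and layout are illustrative assumptions, not the device's actual implementation):

```c
#include <stddef.h>

enum { BUFFER_BYTES = 54 * 1024 * 1024 };  /* ~1 hour at the assumed bit rate */

/* Per-station circular buffer: once full, new audio overwrites the
 * oldest, so the most recent hour is always retained. */
typedef struct {
    unsigned char *data;   /* BUFFER_BYTES of storage, allocated elsewhere */
    size_t head;           /* next write position */
    size_t filled;         /* bytes stored, capped at BUFFER_BYTES */
} StationBuffer;

void buffer_write(StationBuffer *b, const unsigned char *audio, size_t len) {
    for (size_t i = 0; i < len; i++) {
        b->data[b->head] = audio[i];
        b->head = (b->head + 1) % BUFFER_BYTES;  /* wrap: oldest byte is overwritten */
    }
    if (b->filled + len > BUFFER_BYTES) b->filled = BUFFER_BYTES;
    else b->filled += len;
}
```

With this wraparound, playback positioned anywhere behind `head` always reaches the most recent hour of programming from that station.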
The 3D Radio may support connections to several external devices, for example:
In accordance with one or more embodiments, a radio device able to receive multiple radio stations simultaneously (sometimes referred to herein as a 3D Radio device), and having different operational modes, can be provided as illustrated by the examples below. The 3D Radio device may be in any of several operational modes. For example:
Each of the unselected radio station presets is in one of three modes:
In accordance with one or more embodiments, a display can be provided as illustrated by the examples below. For example, a three-line LCD display, as shown, allows a variety of information to be presented to the user, depending on the mode. In general, the first line of the display contains information about the mode and source, while the second and third lines contain more specific information about an item of audio content being played. For example:
In accordance with one or more embodiments, mode and display changes can be provided as illustrated by the examples below. The following events can cause mode and display changes:
In accordance with one or more embodiments, certain interactivity can be provided as illustrated by the examples below. The 3D Radio can respond to button presses, for example, as follows:
In accordance with one or more embodiments, certain external memory device related features can be provided as illustrated by the following examples. When an external memory device is connected, the 3D Radio may take any suitable action:
Note that in the above, when mention is made of ignoring, it will be understood that the respective feature can be implemented without such ignoring, if desired, depending on the context in which the term is used.
In one such embodiment, the radio device may support a voice control component 650. This embodiment may include a display 670 similar to the display 130 that is shown in FIG. A2. It may also include at least one button 680 of
To support voice control, a microphone in communication with microphone interface 660 of
At step 510, a voice command may be received. In one embodiment, the voice command may be received through microphone interface 660 and further processed by voice control component 650 and/or processor 630. In one embodiment, such as in a car or home radio, a microphone may be built into a remote control device. Voice commands may be given by speaking into the remote control. In some embodiments, such as with portable radios, a microphone may be built into the radio device itself.
At step 520, to provide a cleaner voice signal, the voice command may be filtered (e.g., by voice control component 650). In one embodiment, the radio device may use the raw audio output signal and the environmental microphone signal to determine a transformation function, and then use the inverse of that function to remove the radio's output from the command microphone signal.
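One common way to realize such filtering is an adaptive echo canceller. The sketch below uses a basic least-mean-squares (LMS) filter; the choice of LMS, the tap count, and the step size are all assumptions, since the text only describes estimating a transformation and applying its inverse:

```c
#define TAPS 64  /* filter length (assumed) */

/* Basic LMS adaptive filter: estimates the path from the radio's raw
 * output to the command microphone, then subtracts the estimate so
 * that mostly the spoken command remains. State starts zeroed. */
typedef struct {
    float w[TAPS];  /* adaptive weights (the estimated "transformation") */
    float x[TAPS];  /* recent radio output samples */
} EchoCanceller;

float cancel_sample(EchoCanceller *ec, float radio_out, float mic_in) {
    /* shift in the newest radio output sample */
    for (int i = TAPS - 1; i > 0; i--) ec->x[i] = ec->x[i - 1];
    ec->x[0] = radio_out;

    /* predicted echo of the radio in the microphone signal */
    float echo = 0.0f;
    for (int i = 0; i < TAPS; i++) echo += ec->w[i] * ec->x[i];

    float err = mic_in - echo;  /* cleaned voice-command sample */

    const float mu = 0.01f;     /* adaptation step size (assumed) */
    for (int i = 0; i < TAPS; i++) ec->w[i] += mu * err * ec->x[i];
    return err;
}
```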
At step 530, an operation of the radio may be modified based on the received voice command (e.g., using voice control component 650 and/or processor 630). In one embodiment, the radio may support a vocabulary of voice commands to navigate the radio's functions. The following table defines the set of voice commands, and the associated functions, for one embodiment:
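The command table itself is not reproduced here. Purely as an illustration of how a recognized phrase might be mapped to a radio function, a sketch follows; the phrases and actions are hypothetical, not the vocabulary the table defines:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical dispatch of recognized voice commands to radio
 * functions; the actual vocabulary is defined by the table above. */
typedef void (*RadioAction)(void);

static void next_preset(void) { puts("tuning to next preset"); }
static void volume_up(void)   { puts("raising volume"); }

static const struct { const char *phrase; RadioAction act; } commands[] = {
    { "next preset", next_preset },  /* hypothetical phrase */
    { "volume up",   volume_up },    /* hypothetical phrase */
};

void dispatch(const char *recognized) {
    for (size_t i = 0; i < sizeof commands / sizeof commands[0]; i++)
        if (strcmp(recognized, commands[i].phrase) == 0) {
            commands[i].act();
            return;
        }
}
```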
At step 540, a fingerprint may optionally be recognized. In one embodiment, the fingerprint may be recognized using gesture pad 640. Some embodiments may include fingerprint identification and/or authorization, using technology similar to, for example, the Microsoft Fingerprint Reader. This may be used for authentication or other security purposes, to prevent an unauthorized person from using the radio. It may also be used to identify which finger is being used. In some embodiments, the fingerprint identification may be used to determine the orientation of the finger when it is pressed to the pad. At step 550, in some embodiments, the fingerprint identification may be used to determine which user is providing a command, and to provide a customized experience based on that identification (for example, by providing individual user preferences), as illustrated by the sketch below.
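A minimal sketch of the customization at step 550, assuming the fingerprint reader's own matching stage has already produced a user name (the profile fields and users are hypothetical):

```c
#include <string.h>

/* Hypothetical per-user preferences selected by fingerprint match;
 * the matching itself would be done by the reader hardware/library. */
typedef struct {
    const char *user;
    int volume;       /* preferred volume level */
    int preset_bank;  /* which set of station presets to load */
} UserProfile;

static const UserProfile profiles[] = {
    { "alice", 12, 0 },  /* hypothetical users */
    { "bob",    8, 1 },
};

const UserProfile *lookup_profile(const char *matched_user) {
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (strcmp(profiles[i].user, matched_user) == 0)
            return &profiles[i];
    return NULL;  /* unknown finger: could deny access or apply defaults */
}
```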
At step 560, a gesture input may be received. In one embodiment, the gesture input may be received through gesture pad 640, which may be used to control the multi-tuner radio. One example of a gesture pad that may be incorporated into the radio device is the Fingerworks iGesture Pad, which allows mouse-like input using multi-finger gestures. Another example is the touch screen used in Apple Computer's iPod devices, which incorporates a display screen into the input device.
The gesture input pad may be incorporated into the radio device or it may be located remotely. This input may replace the standard input keys, or it may supplement them. In some embodiments, common commands may be available using the gesture input pad, while infrequent commands, such as device setup commands, may require the use of physical keys. In some embodiments that use touch-screen technology, the need for physical keys may be eliminated by using soft keys—content-specific buttons drawn on the screen.
The gesture pad may be located at any of a plurality of locations. For example, in a car the gesture pad may be located on the steering column to provide easy access by the driver to common radio control commands without removing the hands from the steering wheel. In some embodiments, there may be two gesture pads on the steering wheel, one on each side. These two gesture pads may be used in some embodiments for different commands, while in other embodiments they may provide duplicate functionality. In some embodiments, multiple gesture pads may be provided to allow easy command input from different locations. For example, in a car one gesture pad may be located on the steering column for the driver, and a second gesture pad may be located on the dash for easy access by a passenger. An additional gesture pad may also be provided on a remote control device.
In some embodiments, for example in a home or automobile environment, the gesture pad may be incorporated into a remote control device, which may use IR, UHF, Bluetooth, or other wireless or wired technology to communicate commands to the radio device. In some embodiments, the gesture pad may be incorporated directly into the radio device, as for example with a portable radio device. In some embodiments with multiple gesture pads, the system may determine which user is commanding the device based on which gesture pad is used for input. This may be used, for example, to provide separate preferences for different users.
Input commands on the gesture pad may be indicated by touching the pad and making a suitable gesture. In some cases, this may involve touching the pad with one finger. In some cases, two or more fingers may be used to create distinct commands.
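As an illustration, distinct commands could be derived from the finger count and stroke direction; the gestures and thresholds below are hypothetical, since the text does not prescribe a recognition algorithm:

```c
/* Hypothetical gesture classification by finger count and stroke
 * direction; real recognizers are considerably more involved.
 * dx/dy are the normalized stroke displacements in [-1, 1]. */
typedef enum { GESTURE_UNKNOWN, NEXT_STATION, PREV_STATION, MUTE } Gesture;

Gesture classify(int fingers, float dx, float dy) {
    if (fingers == 1) {
        if (dx > 0.5f)  return NEXT_STATION;  /* one-finger swipe right */
        if (dx < -0.5f) return PREV_STATION;  /* one-finger swipe left */
    } else if (fingers == 2 && dy < -0.5f) {
        return MUTE;                          /* two-finger swipe down */
    }
    return GESTURE_UNKNOWN;
}
```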
At step 570, an operation of the radio may be modified based on the received gesture input (e.g., using processor 630).
At step 580, user training may optionally be provided. In one embodiment, the user training may be provided using processor 630, gesture pad 640, and/or display 670. Some embodiments may use training to improve gesture recognition accuracy. This may be used to teach the device how hard the user presses down, how long the gesture strokes are, etc. In some embodiments, training may not be necessary. In some embodiments, the device may include a user training mode, in which the user is walked through various commands. This user training may include, for example, displaying the name of the command in text on the gesture pad, displaying the gesture on the pad graphically using dots, lines, and arcs, prompting the user to make the gesture himself or herself, and providing the user with feedback as to how well the gesture was performed (one possible scoring rule is sketched below). In some embodiments, user identification may be performed based on characteristics of gesture input, which may be determined in a training session or may be determined or refined over normal use.
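One simple way such feedback could be computed is by comparing the user's stroke against a stored template, point by point; this scoring rule is an assumption for illustration only:

```c
#include <math.h>

/* Illustrative training feedback: score how closely a user's stroke
 * matches a stored template of n (x, y) points, as the mean
 * point-to-point distance. Lower scores mean a closer match. */
float gesture_score(const float (*user)[2], const float (*tmpl)[2], int n) {
    float total = 0.0f;
    for (int i = 0; i < n; i++) {
        float dx = user[i][0] - tmpl[i][0];
        float dy = user[i][1] - tmpl[i][1];
        total += sqrtf(dx * dx + dy * dy);
    }
    return total / n;
}
```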
It is to be understood that other embodiments can be utilized and structural changes can be made without departing from the scope of the present invention.
This application is a continuation of U.S. application Ser. No. 12/420,650, filed on Apr. 8, 2009, now U.S. Pat. No. 8,699,995, which claims the benefit of U.S. Provisional Application No. 61/043,604, filed on Apr. 9, 2008, each of which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
2097901 | Thomas | Nov 1937 | A |
4031334 | Kimura et al. | Jun 1977 | A |
4109115 | Yamamoto | Aug 1978 | A |
4268724 | Hubbard | May 1981 | A |
4591661 | Benedetto et al. | May 1986 | A |
4677466 | Lert et al. | Jun 1987 | A |
4682370 | Matthews | Jul 1987 | A |
4787063 | Muguet | Nov 1988 | A |
4953212 | Otsubo | Aug 1990 | A |
5119507 | Mankovitz | Jun 1992 | A |
5187589 | Kono et al. | Feb 1993 | A |
5214792 | Alwadish | May 1993 | A |
5239540 | Rovira et al. | Aug 1993 | A |
5243640 | Hadley et al. | Sep 1993 | A |
5253066 | Vogel | Oct 1993 | A |
5345430 | Moe | Sep 1994 | A |
5371551 | Logan et al. | Dec 1994 | A |
5404588 | Henze | Apr 1995 | A |
5406558 | Rovira et al. | Apr 1995 | A |
5448534 | Okada | Sep 1995 | A |
5457815 | Morewitz, II | Oct 1995 | A |
5463599 | Yifrach et al. | Oct 1995 | A |
5513385 | Tanaka | Apr 1996 | A |
5612729 | Ellis et al. | Mar 1997 | A |
5671195 | Lee | Sep 1997 | A |
5742893 | Frank | Apr 1998 | A |
5774798 | Gaskill | Jun 1998 | A |
5778137 | Nielsen et al. | Jul 1998 | A |
5818441 | Throckmorton et al. | Oct 1998 | A |
5867794 | Hayes et al. | Feb 1999 | A |
5914941 | Janky | Jun 1999 | A |
5978689 | Tuoriniemi et al. | Nov 1999 | A |
5986650 | Ellis et al. | Nov 1999 | A |
6074553 | Haski | Jun 2000 | A |
6088455 | Logan et al. | Jul 2000 | A |
6134426 | Volkel | Oct 2000 | A |
6169843 | Lenihan et al. | Jan 2001 | B1 |
6209787 | Iida | Apr 2001 | B1 |
6212359 | Knox | Apr 2001 | B1 |
6233389 | Barton et al. | May 2001 | B1 |
6236674 | Morelli et al. | May 2001 | B1 |
6259441 | Ahmad et al. | Jul 2001 | B1 |
6275268 | Ellis et al. | Aug 2001 | B1 |
6282464 | Obradovich | Aug 2001 | B1 |
6327418 | Barton | Dec 2001 | B1 |
6337719 | Cuccia | Jan 2002 | B1 |
6356704 | Callway et al. | Mar 2002 | B1 |
6400996 | Hoffberg et al. | Jun 2002 | B1 |
6407750 | Gioscia et al. | Jun 2002 | B1 |
6421453 | Kanevsky et al. | Jul 2002 | B1 |
6438523 | Oberteuffer et al. | Aug 2002 | B1 |
6452960 | Sato | Sep 2002 | B1 |
6507727 | Henrick | Jan 2003 | B1 |
6564003 | Marko et al. | May 2003 | B2 |
6588015 | Eyer et al. | Jul 2003 | B1 |
6607136 | Atsmon et al. | Aug 2003 | B1 |
6630963 | Billmaier | Oct 2003 | B1 |
6658247 | Saito | Dec 2003 | B1 |
6710815 | Billmaier et al. | Mar 2004 | B1 |
6721236 | Eschke et al. | Apr 2004 | B1 |
6725002 | Sakurai et al. | Apr 2004 | B2 |
6725022 | Clayton et al. | Apr 2004 | B1 |
6769028 | Sass et al. | Jul 2004 | B1 |
6785656 | Patsiokas et al. | Aug 2004 | B2 |
6792296 | Van Bosch | Sep 2004 | B1 |
6829475 | Lee et al. | Dec 2004 | B1 |
6850252 | Hoffberg | Feb 2005 | B1 |
6895165 | Boys | May 2005 | B2 |
6931451 | Logan et al. | Aug 2005 | B1 |
6944430 | Berstis | Sep 2005 | B2 |
6952576 | Fish et al. | Oct 2005 | B2 |
6961585 | Minematsu | Nov 2005 | B2 |
6990312 | Gioscia et al. | Jan 2006 | B1 |
7028323 | Franken et al. | Apr 2006 | B2 |
7058376 | Logan et al. | Jun 2006 | B2 |
7065342 | Rolf | Jun 2006 | B1 |
7095688 | Kondo et al. | Aug 2006 | B2 |
7107063 | Bates et al. | Sep 2006 | B1 |
7158871 | Ilan et al. | Jan 2007 | B1 |
7171174 | Ellis et al. | Jan 2007 | B2 |
7177608 | Herz et al. | Feb 2007 | B2 |
7213075 | Feig | May 2007 | B2 |
7231198 | Loughran | Jun 2007 | B2 |
7277562 | Zyzdryn | Oct 2007 | B2 |
7295904 | Kanevsky et al. | Nov 2007 | B2 |
7313375 | Lee et al. | Dec 2007 | B2 |
7327859 | Chau | Feb 2008 | B1 |
7343141 | Ellis et al. | Mar 2008 | B2 |
7418277 | Tsai | Aug 2008 | B2 |
7474773 | Chau | Jan 2009 | B2 |
7627560 | Watanabe et al. | Dec 2009 | B2 |
7668576 | Ellenbogen et al. | Feb 2010 | B2 |
7742458 | Sharma et al. | Jun 2010 | B2 |
7937119 | Arai | May 2011 | B2 |
8165644 | Syed | Apr 2012 | B2 |
8411606 | Chen et al. | Apr 2013 | B2 |
8700262 | Boissonnier et al. | Apr 2014 | B2 |
8706023 | Ellis | Apr 2014 | B2 |
8706169 | Cortright | Apr 2014 | B2 |
20010047379 | Jun et al. | Nov 2001 | A1 |
20020045438 | Tagawa et al. | Apr 2002 | A1 |
20020049037 | Christensen et al. | Apr 2002 | A1 |
20020057380 | Matey | May 2002 | A1 |
20020111703 | Cole | Aug 2002 | A1 |
20020174430 | Ellis et al. | Nov 2002 | A1 |
20020186957 | Yuen | Dec 2002 | A1 |
20030013425 | Nee | Jan 2003 | A1 |
20030095791 | Barton et al. | May 2003 | A1 |
20030163823 | Logan et al. | Aug 2003 | A1 |
20030208771 | Hensgen et al. | Nov 2003 | A1 |
20040029541 | Baranowski et al. | Feb 2004 | A1 |
20040128692 | Wolfe et al. | Jul 2004 | A1 |
20040158748 | Ishibashi et al. | Aug 2004 | A1 |
20040198282 | Heiderscheit et al. | Oct 2004 | A1 |
20050005298 | Tranchina | Jan 2005 | A1 |
20050014495 | Shanahan | Jan 2005 | A1 |
20050020223 | Ellis et al. | Jan 2005 | A1 |
20050049750 | Parker et al. | Mar 2005 | A1 |
20050064835 | Gusler et al. | Mar 2005 | A1 |
20050085217 | Lim | Apr 2005 | A1 |
20050229213 | Ellis et al. | Oct 2005 | A1 |
20060008243 | Przybylek | Jan 2006 | A1 |
20060026637 | Gatto et al. | Feb 2006 | A1 |
20060047386 | Kanevsky et al. | Mar 2006 | A1 |
20060082690 | Englert | Apr 2006 | A1 |
20060083253 | Park et al. | Apr 2006 | A1 |
20060085115 | Ilan et al. | Apr 2006 | A1 |
20060132382 | Jannard | Jun 2006 | A1 |
20060149971 | Kozlay | Jul 2006 | A1 |
20070052686 | Nomura | Mar 2007 | A1 |
20070064157 | Kasamatsu | Mar 2007 | A1 |
20070130280 | Park et al. | Jun 2007 | A1 |
20070273658 | Yli-Nokari et al. | Nov 2007 | A1 |
20080027586 | Hern et al. | Jan 2008 | A1 |
20080045170 | Howley et al. | Feb 2008 | A1 |
20080111710 | Boillot | May 2008 | A1 |
20080165758 | Kato et al. | Jul 2008 | A1 |
20080192994 | Chau | Aug 2008 | A1 |
20080204604 | Campbell | Aug 2008 | A1 |
20080320523 | Morris et al. | Dec 2008 | A1 |
20090174822 | Pugel | Jul 2009 | A1 |
20090313660 | Ni et al. | Dec 2009 | A1 |
20100120366 | DeBiasio et al. | May 2010 | A1 |
20120237092 | Bechtel | Sep 2012 | A1 |
20130053007 | Cosman et al. | Feb 2013 | A1 |
20130242706 | Newsome, Jr. | Sep 2013 | A1 |
Number | Date | Country |
---|---|---|
2 313 216 | Nov 1997 | GB |
WO 9945700 | Sep 1999 | WO |
WO 9945701 | Sep 1999 | WO |
WO 9966725 | Dec 1999 | WO |
WO 0013415 | Mar 2000 | WO |
WO 0013416 | Mar 2000 | WO |
WO 0016548 | Mar 2000 | WO |
WO 0045511 | Aug 2000 | WO |
WO 0176248 | Oct 2001 | WO |
Entry |
---|
Louderback, "Improve Your Commute with Audio on Demand," ZDTV, Online! (Nov. 10, 1999) (available at: http://www.zdnet.com/anchordesk/story/story_4066.html). |
International Search Report, Appl. No. PCT/US02/05039, Feb. 4, 2003. |
“Federal Standard 1037C Telecommunications: Glossary of Telecommunication Terms,” http://www.its.bldrdoc.gov/fs-1037, pp. 1-8, Aug. 7, 1996. |
Number | Date | Country | |
---|---|---|---|
20140210588 A1 | Jul 2014 | US |
Number | Date | Country | |
---|---|---|---|
61043604 | Apr 2008 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12420650 | Apr 2009 | US |
Child | 14242771 | US |