Communication device

Information

  • Patent Grant
  • Patent Number
    10,805,451
  • Date Filed
    Thursday, September 19, 2019
  • Date Issued
    Tuesday, October 13, 2020
Abstract
A communication device comprising a 1st device remotely controlling implementer, a 2nd device remotely controlling implementer, and an email data transfer implementer.
Description
FIELD OF THE INVENTION

The invention relates to a communication device and, more particularly, to a communication device which has a capability to communicate with another communication device in a wireless fashion.


BACKGROUND OF THE INVENTION

U.S. Pat. No. 6,363,320 introduces a system for tracking objects which includes a database for storing reference data as line segments corresponding to coordinate locations along environmental reference features; mobile units for connection to the objects for receiving coordinate object target point locations, and having means for receiving signals from an external location system and for generating the object data, and a wireless object data transmitter; and a computer having access to the database and to the object data, and generating an interpreted location of each of the objects in terms relative to automatically selected ones of the reference features. Also disclosed is a method for tracking the objects. Further disclosed is a computer program embodied on a computer-readable medium and having code segments for tracking objects according to the method. In this prior art, FIG. 2 illustrates the theory and/or the concept of producing and displaying a plurality of two-dimensional images on a display of a wireless communication device; it does not, however, disclose the communication device comprising a 1st device remotely controlling implementer, a 2nd device remotely controlling implementer, and an email data transfer implementer.


SUMMARY OF THE INVENTION

It is an object of the present invention to provide a system and method that enable the user of the communication device to enjoy both two-dimensional images and three-dimensional images displayed thereon.


Still another object is to overcome the aforementioned shortcomings associated with the prior art.


Further objects, features, and advantages of the present invention over the prior art will become apparent from the detailed description which follows, when considered with the attached figures.


The present invention introduces a communication device comprising a 1st device remotely controlling implementer, a 2nd device remotely controlling implementer, and an email data transfer implementer.





BRIEF DESCRIPTION OF THE DRAWINGS

The above and other aspects, features, and advantages of the invention will be better understood by reading the following more particular description of the invention, presented in conjunction with the following drawings, wherein:



FIG. 1 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 2a is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 2b is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 2c is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 3 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 4 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 5 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 6a is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 6b is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 7 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 8 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 9 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 10 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 11 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 12 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 13 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 14 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 14a is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 15 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 16 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 17a is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 17b is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 18 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 19 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 20a is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 20b is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 21 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 22 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 23 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 24 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 25 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 26 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 27a is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 27b is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 28 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 29 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 30 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 31 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 32 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 32a is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 32b is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 32c is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 32d is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 32e is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 32f is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 32g is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 33 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 34 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 35a is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 35b is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 36 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 37 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 38 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 39 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 40 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 41 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 42 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 43 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 44a is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 44b is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 44c is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 44d is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 44e is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 45 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 46 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 47 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 48 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 49 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 50 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 51 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 52 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 53a is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 53b is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 54 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 55 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 56 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 57 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 58 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 59 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 60 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 61a is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 61b is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 62 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 63 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 64 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 65 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 66 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 67 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 68 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 69 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 70 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 71 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 72 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 73 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 74 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 74a is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 75 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 76 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 77 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 78 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 79 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 80 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 81 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 82 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 83 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 84 is a flowchart illustrating an exemplary embodiment of the present invention.



FIG. 85 is a block diagram illustrating an exemplary embodiment of the present invention.



FIG. 86 is a simplified illustration illustrating an exemplary embodiment of the present invention.



FIG. 87 is a flowchart illustrating an exemplary embodiment of the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENT

The following description is of the best presently contemplated mode of carrying out the present invention. This description is not to be taken in a limiting sense but is made merely for the purpose of describing the general principles of the invention. The scope of the invention should be determined by referencing the appended claims.



FIG. 1 is a simplified block diagram of the communication device 200 utilized in the present invention. In FIG. 1, communication device 200 includes CPU 211, which controls and administers the overall function and operation of communication device 200. CPU 211 uses RAM 206 to temporarily store data and/or to perform calculations. Video processor 202 generates analog and/or digital video signals, which are displayed on LCD 201. ROM 207 stores data and programs which are essential to operate communication device 200. Wireless signals are received by antenna 218 and processed by signal processor 208. Input signals are input by input device 210, such as a dial pad, and the signal is transferred via input interface 209 and data bus 203 to CPU 211. Indicator 212 is an LED lamp which is designed to output different colors (e.g., red, blue, green, etc.). Analog audio data is input to microphone 215. A/D 213 converts the analog audio data into a digital format. Speaker 216 outputs analog audio data which is converted from a digital format by D/A 204. Sound processor 205 produces digital audio signals that are transferred to D/A 204 and also processes the digital audio signals transferred from A/D 213. CCD unit 214 captures video images, which are stored in RAM 206 in a digital format. Vibrator 217 vibrates the entire device upon command from CPU 211.
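The FIG. 1 architecture can be summarized as a table of reference numerals. The following sketch is purely illustrative; the dictionary layout and the function name are editorial conventions, not part of the patent disclosure:

```python
# Reference numerals of FIG. 1 mapped to the component names used
# in the description. The mapping itself is taken from the text;
# the data structure is only an illustrative convention.
COMPONENTS = {
    201: "LCD",
    202: "Video processor",
    203: "Data bus",
    204: "D/A",
    205: "Sound processor",
    206: "RAM",
    207: "ROM",
    208: "Signal processor",
    209: "Input interface",
    210: "Input device",
    211: "CPU",
    212: "Indicator (LED)",
    213: "A/D",
    214: "CCD unit",
    215: "Microphone",
    216: "Speaker",
    217: "Vibrator",
    218: "Antenna",
}

def component_name(numeral: int) -> str:
    """Look up a FIG. 1 reference numeral."""
    return COMPONENTS.get(numeral, "unknown")
```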



FIG. 2a illustrates one of the preferred methods of communication between two communication devices. In FIG. 2a, both device A and device B represent communication device 200 in FIG. 1. Device A transfers wireless data to transmitter 301, which relays the data to host 303 via cable 302. The data is transferred to transmitter 308 (e.g., a satellite dish) via cable 320 and then to artificial satellite 304. Artificial satellite 304 transfers the data to transmitter 309, which transfers the data to host 305 via cable 321. The data is then transferred to transmitter 307 via cable 306 and to device B in a wireless format.



FIG. 2b illustrates another preferred method of communication between two communication devices. In this example, device A directly transfers the wireless data to host 310, an artificial satellite, which transfers the data directly to device B.



FIG. 2c illustrates another preferred method of communication between two communication devices. In this example, device A transfers wireless data to transmitter 312, an artificial satellite, which relays the data to host 313, which is also an artificial satellite, in a wireless format. The data is transferred to transmitter 314, an artificial satellite, which relays the data to device B in a wireless format.
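The three routing paths of FIGS. 2a through 2c can be compared as hop sequences. The list representation below is an editorial assumption for illustration only; the hop names are the reference labels from the text:

```python
# Hop sequences for the three communication paths described in
# FIGS. 2a-2c. Names follow the patent's reference labels.
PATH_FIG_2A = ["device A", "transmitter 301", "host 303", "transmitter 308",
               "satellite 304", "transmitter 309", "host 305",
               "transmitter 307", "device B"]
PATH_FIG_2B = ["device A", "host 310 (satellite)", "device B"]
PATH_FIG_2C = ["device A", "transmitter 312 (satellite)",
               "host 313 (satellite)", "transmitter 314 (satellite)",
               "device B"]

def hop_count(path):
    """Number of intermediate relays between the two end devices."""
    return len(path) - 2
```

As the counts show, FIG. 2b is the shortest path (a single satellite host), while FIG. 2a traverses the most ground and space segments.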


Voice Recognition


Communication device 200 has a function to operate the device by the user's voice or convert the user's voice into a text format (i.e., voice recognition). Such function can be enabled by the technologies primarily introduced in the following inventions: U.S. Pat. Nos. 6,282,268; 6,278,772; 6,269,335; 6,269,334; 6,260,015; 6,260,014; 6,253,177; 6,253,175; 6,249,763; 6,246,990; 6,233,560; 6,219,640; 6,219,407; 6,199,043; 6,199,041; 6,195,641; 6,192,343; 6,192,337; 6,188,976; 6,185,530; 6,185,529; 6,185,527; 6,182,037; 6,178,401; 6,175,820; 6,163,767; 6,157,910; 6,119,086; 6,119,085; 6,101,472; 6,100,882; 6,092,039; 6,088,669; 6,078,807; 6,075,534; 6,073,101; 6,073,096; 6,073,091; 6,067,517; 6,067,514; 6,061,646; 6,044,344; 6,041,300; 6,035,271; 6,006,183; 5,995,934; 5,974,383; 5,970,239; 5,963,905; 5,956,671; 5,953,701; 5,953,700; 5,937,385; 5,937,383; 5,933,475; 5,930,749; 5,909,667; 5,899,973; 5,895,447; 5,884,263; 5,878,117; 5,864,819; 5,848,163; 5,819,225; 5,805,832; 5,802,251; 5,799,278; 5,797,122; 5,787,394; 5,768,603; 5,751,905; 5,729,656; 5,704,009; 5,671,328; 5,649,060; 5,615,299; 5,615,296; 5,544,277; 5,524,169; 5,522,011; 5,513,298; 5,502,791; 5,497,447; 5,477,451; 5,475,792; 5,465,317; 5,455,889; 5,440,663; 5,425,129; 5,353,377; 5,333,236; 5,313,531; 5,293,584; 5,293,451; 5,280,562; 5,278,942; 5,276,766; 5,267,345; 5,233,681; 5,222,146; 5,195,167; 5,182,773; 5,165,007; 5,129,001; 5,072,452; 5,067,166; 5,054,074; 5,050,215; 5,046,099; 5,033,087; 5,031,217; 5,018,201; 4,980,918; 4,977,599; 4,926,488; 4,914,704; 4,882,759; 4,876,720; 4,852,173; 4,833,712; 4,829,577; 4,827,521; 4,759,068; 4,748,670; 4,741,036; 4,718,094; 4,618,984; 4,348,553; 6,289,140; 6,275,803; 6,275,801; 6,272,146; 6,266,637; 6,266,571; 6,223,153; 6,219,638; 6,163,535; 6,115,820; 6,107,935; 6,092,034; 6,088,361; 6,073,103; 6,073,095; 6,067,084; 6,064,961; 6,055,306; 6,047,301; 6,023,678; 6,023,673; 6,009,392; 5,995,933; 5,995,931; 5,995,590; 5,991,723; 5,987,405; 5,974,382; 5,943,649; 
5,916,302; 5,897,616; 5,897,614; 5,893,133; 5,873,064; 5,870,616; 5,864,805; 5,857,099; 5,809,471; 5,805,907; 5,799,273; 5,764,852; 5,715,469; 5,682,501; 5,680,509; 5,668,854; 5,664,097; 5,649,070; 5,640,487; 5,621,809; 5,577,249; 5,502,774; 5,471,521; 5,467,425; 5,444,617; 4,991,217; 4,817,158; 4,725,885; 4,528,659; 3,995,254; 3,969,700; 3,925,761; 3,770,892. The voice recognition function can be performed in terms of software by using area 261, the voice recognition working area, of RAM 206 (FIG. 1) which is specifically allocated to perform such function as described in FIG. 3, or can also be performed in terms of hardware circuit where such space is specifically allocated in area 282 of sound processor 205 (FIG. 1) for the voice recognition system as described in FIG. 4.



FIG. 5 illustrates how the voice recognition function is activated. CPU 211 (FIG. 1) periodically checks the input status of input device 210 (FIG. 1) (S1). If CPU 211 detects a specific signal input from input device 210 (S2), the voice recognition system, which is described in FIG. 3 and/or FIG. 4, is activated.
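The FIG. 5 activation check amounts to a simple polling step. The sketch below is illustrative only; the callback names and the `"VR_ACTIVATE"` signal value are hypothetical stand-ins for the input-device signal the patent leaves unspecified:

```python
def poll_activation(read_input, activate):
    """FIG. 5 sketch: periodically sample the input device (S1) and
    activate voice recognition when a specific signal appears (S2).
    `read_input` and `activate` are hypothetical callbacks."""
    signal = read_input()          # S1: check input status
    if signal == "VR_ACTIVATE":    # S2: specific signal detected
        activate()
        return True
    return False
```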


Voice Recognition—Dialing/Auto-Off During Call



FIG. 6a and FIG. 6b illustrate the operation of the voice recognition in the present invention. Once the voice recognition system is activated (S1), the analog audio data is input from microphone 215 (FIG. 1) (S2). The analog audio data is converted into digital data by A/D 213 (FIG. 1) (S3). The digital audio data is processed by sound processor 205 (FIG. 1) to retrieve the text and numeric information therefrom (S4). Then the numeric information is retrieved (S5) and displayed on LCD 201 (FIG. 1) (S6). If the retrieved numeric information is not correct (S7), the user can input the correct numeric information manually by using input device 210 (FIG. 1) (S8). Once the sequence of inputting the numeric information is completed (S9), the entire numeric information is displayed on LCD 201 and the sound is output from speaker 216 under control of CPU 211 (S10). If the numeric information is correct (S11), communication device 200 (FIG. 1) initiates the dialing process by using the numeric information (S12). The dialing process continues until communication device 200 is connected to another device (S13). Once CPU 211 detects that the line is connected, it automatically deactivates the voice recognition system (S14). As described in FIG. 7, CPU 211 checks the status of communication device 200 periodically (S1) and keeps the voice recognition system offline during the call (S2). If the connection is severed, i.e., the user hangs up, CPU 211 reactivates the voice recognition system (S3).
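The dialing sequence of FIGS. 6a/6b can be condensed into four stages. The sketch below is a minimal illustration under the assumption that each stage is a callback; all function names are hypothetical:

```python
def voice_dial(recognize, confirm, dial, deactivate_vr):
    """Sketch of FIGS. 6a/6b: recognize a number from speech, let the
    user confirm or correct it, dial, and switch voice recognition
    off once the call connects. All callbacks are hypothetical
    stand-ins for the hardware steps described in the text."""
    number = recognize()        # S2-S5: capture audio, retrieve digits
    number = confirm(number)    # S6-S11: display, correct, confirm
    connected = dial(number)    # S12-S13: dial until connected
    if connected:
        deactivate_vr()         # S14: VR stays off during the call
    return connected
```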


Voice Recognition—Tag



FIG. 8 through FIG. 12 describe the method of inputting the numeric information in a convenient manner. RAM 206 includes Table #1 (FIG. 8) and Table #2 (FIG. 9). In FIG. 8, audio information #1 corresponds to tag "Scott." Namely, audio information, such as wave data, which represents the sound of "Scott" (sounds like "S-ko-t") is registered in Table #1 and corresponds to tag "Scott." In the same manner, audio information #2 corresponds to tag "Carol"; audio information #3 corresponds to tag "Peter"; audio information #4 corresponds to tag "Amy"; and audio information #5 corresponds to tag "Brian." In FIG. 9, tag "Scott" corresponds to numeric information "(916) 411-2526"; tag "Carol" corresponds to numeric information "(418) 675-6566"; tag "Peter" corresponds to numeric information "(220) 890-1527"; tag "Amy" corresponds to numeric information "(615) 125-3411"; and tag "Brian" corresponds to numeric information "(042) 643-2097." FIG. 11 illustrates how CPU 211 (FIG. 1) operates by utilizing both Table #1 and Table #2. Once the audio data is processed as described in S4 of FIG. 6, CPU 211 scans Table #1 (S1). If the retrieved audio data matches one of the audio information registered in Table #1 (S2), it scans Table #2 (S3) and retrieves the corresponding numeric information from Table #2 (S4). FIG. 10 illustrates another embodiment of the present invention. Here, RAM 206 includes Table #A instead of Table #1 and Table #2 described above. In this embodiment, audio info #1 (i.e., wave data which represents the sound of "Scott") directly corresponds to numeric information "(916) 411-2526." In the same manner, audio info #2 corresponds to numeric information "(410) 675-6566"; audio info #3 corresponds to numeric information "(220) 890-1567"; audio info #4 corresponds to numeric information "(615) 125-3411"; and audio info #5 corresponds to numeric information "(042) 645-2097." FIG. 12 illustrates how CPU 211 (FIG. 1) operates by utilizing Table #A. Once the audio data is processed as described in S4 of FIG. 6, CPU 211 scans Table #A (S1). If the retrieved audio data matches one of the audio information registered in Table #A (S2), it retrieves the corresponding numeric information therefrom (S3). As another embodiment, RAM 206 may contain only Table #2, and the tag can be retrieved from the voice recognition system explained in FIG. 3 through FIG. 7. Namely, once CPU 211 processes the audio data as described in S4 of FIG. 6, retrieves the text data therefrom, and detects one of the tags registered in Table #2 (e.g., "Scott"), it retrieves the corresponding numeric information (e.g., "(916) 411-2526") from the same table.
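The two lookup schemes (Table #1/Table #2 versus Table #A) can be sketched with dictionaries. Plain strings stand in for the registered wave data, and for simplicity Table #A is derived here from Table #2, ignoring the minor digit differences between the two embodiments in the text:

```python
# Sketch of the FIG. 8-12 lookups. Strings such as "audio#1" are
# stand-ins for the registered audio information (wave data).
TABLE_1 = {"audio#1": "Scott", "audio#2": "Carol", "audio#3": "Peter",
           "audio#4": "Amy", "audio#5": "Brian"}            # audio -> tag
TABLE_2 = {"Scott": "(916) 411-2526", "Carol": "(418) 675-6566",
           "Peter": "(220) 890-1527", "Amy": "(615) 125-3411",
           "Brian": "(042) 643-2097"}                       # tag -> number

def lookup_two_step(audio):
    """FIG. 11: scan Table #1 for the tag (S1-S2), then Table #2
    for the numeric information (S3-S4)."""
    tag = TABLE_1.get(audio)
    return TABLE_2.get(tag) if tag else None

# FIG. 10/12: Table #A maps audio information directly to a number.
# Derived from Table #2 here purely for illustration.
TABLE_A = {a: TABLE_2[t] for a, t in TABLE_1.items()}

def lookup_direct(audio):
    """FIG. 12: single scan of Table #A (S1-S3)."""
    return TABLE_A.get(audio)
```

The direct table trades the flexibility of separately editable tags for one fewer lookup step.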


Voice Recognition—Background Noise Filter



FIG. 13 through FIG. 15 describe the method of minimizing the undesired effect of the background noise. ROM 207 includes area 255 and area 256. Sound audio data which represents background noise is stored in area 255, and sound audio data which represents the beep, ringing sound and other sounds which are emitted from communication device 200 are stored in area 256. FIG. 14 describes how these data are utilized. When the voice recognition system is activated as described in FIG. 5, the analog audio data is input from microphone 215 (FIG. 1) (S1). The analog audio data is converted into digital data by A/D 213 (FIG. 1) (S2). The digital audio data is processed by sound processor 205 (FIG. 1) (S3) and compared to the data stored in area 255 and area 256 (S4). Such comparison can be done by either sound processor 205 or CPU 211. If the digital audio data matches the data stored in area 255 and/or area 256, it is regarded as background noise and deleted by the filtering process. This filtering is performed before text and numeric information are retrieved from the digital audio data. FIG. 14a describes the method of updating area 255. When the voice recognition system is activated as described in FIG. 5, the analog audio data is input from microphone 215 (FIG. 1) (S1). The analog audio data is converted into digital data by A/D 213 (FIG. 1) (S2). The digital audio data is processed by sound processor 205 (FIG. 1) (S3) and the background noise is captured (S4). CPU 211 (FIG. 1) scans area 255, and if the captured background noise is not registered in area 255, it updates the sound audio data stored therein. FIG. 15 describes another embodiment of the present invention. CPU 211 (FIG. 1) routinely checks whether the voice recognition system is activated (S1). If the system is activated (S2), the beep, ringing sound and other sounds which are emitted from communication device 200 are automatically turned off (S3).
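The filtering and the area-255 update of FIGS. 14 and 14a can be sketched as set operations. Hashable values stand in for audio signatures; how real wave data would be compared is left open by the text, so exact-match comparison here is an assumption:

```python
def filter_background(chunks, noise_signatures):
    """FIG. 14 sketch: drop any audio chunk that matches a stored
    background-noise or device-sound signature (areas 255/256),
    before text/numeric retrieval. Exact matching is an assumption."""
    return [c for c in chunks if c not in noise_signatures]

def update_noise_area(captured, noise_area):
    """FIG. 14a sketch: register newly captured background noise in
    area 255 if it is not already registered there."""
    if captured not in noise_area:
        noise_area.add(captured)
    return noise_area
```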


Voice Recognition—Automatic Turn-Off


The voice recognition system can automatically be turned off to avoid glitches, as described in FIG. 16. When the voice recognition system is activated (S1), CPU 211 (FIG. 1) automatically sets a timer (S2). The value of the timer (i.e., the length of time until the system is deactivated) can be set manually by the user. The timer is incremented periodically (S3), and if the incremented time equals the predetermined value of time as set in S2 (S4), the voice recognition system is automatically deactivated (S5).
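The FIG. 16 timeout logic is a counter compared against a user-set limit. A minimal sketch, with abstract "ticks" standing in for whatever time unit the timer increments by:

```python
def vr_timeout(elapsed_ticks, limit):
    """FIG. 16 sketch: the timer set on activation (S2) is incremented
    each tick (S3); when it reaches the user-set limit (S4), voice
    recognition is deactivated (S5). Returns True when it is time
    to deactivate."""
    return elapsed_ticks >= limit

def ticks_until_off(limit):
    """Minimal driver: count the ticks until deactivation."""
    t = 0
    while not vr_timeout(t, limit):
        t += 1                     # S3: periodic increment
    return t
```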


Voice Recognition—E-Mail



FIG. 17a and FIG. 17b illustrate the method of typing and sending e-mails by utilizing the voice recognition system. Once the voice recognition system is activated (S1), the analog audio data is input from microphone 215 (FIG. 1) (S2). The analog audio data is converted into digital data by A/D 213 (FIG. 1) (S3). The digital audio data is processed by sound processor 205 (FIG. 1) to retrieve the text and numeric information therefrom (S4). Then the text and numeric information are retrieved (S5) and displayed on LCD 201 (FIG. 1) (S6). If the retrieved information is not correct (S7), the user can input the correct text and/or numeric information manually by using input device 210 (FIG. 1) (S8). If inputting the text and numeric information is completed (S9) and CPU 211 detects an input signal from input device 210 to send the e-mail (S10), the dialing process is initiated (S11). The dialing process is repeated until communication device 200 is connected to its host (S12), and the e-mail is sent to the designated address (S13).


Voice Recognition—Speech-to-Text



FIG. 18 illustrates the speech-to-text function of communication device 200. Once communication device 200 receives transmitted data from another device via antenna 218 (FIG. 1) (S1), signal processor 208 (FIG. 1) processes the data (e.g., decompression) (S2) and the transmitted data is converted into audio data (S3). Such conversion can be done by either CPU 211 (FIG. 1) or signal processor 208. The audio data is transferred to sound processor 205 (FIG. 1) via data bus 203, and text and numeric information are retrieved therefrom (S4). CPU 211 designates the predetermined font and color to the text and numeric information (S5) and also designates a tag to such information (S6). After these tasks are completed, the tag and the text and numeric information are stored in RAM 206 and displayed on LCD 201 (S7). FIG. 19 illustrates how the text and numeric information as well as the tag are displayed. On LCD 201, the text and numeric information 702 ("XXXXXXXXX") are displayed with the predetermined font and color as well as with the tag 701 ("John").
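Steps S5 through S7 of FIG. 18 attach display attributes and a tag to the retrieved text. The record layout and the rendered display line below are editorial assumptions, sketched after the "John: XXXXXXXXX"-style display of FIG. 19:

```python
def format_incoming(text, tag, font="default", color="black"):
    """FIG. 18 sketch: attach the designated font/color (S5) and
    tag (S6) to retrieved text before storage and display (S7).
    The record layout is an assumption."""
    return {"tag": tag, "text": text, "font": font, "color": color}

def render(record):
    """FIG. 19-style display line: the tag followed by the text."""
    return f'{record["tag"]}: {record["text"]}'
```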


Positioning System



FIG. 20a illustrates the simplified block diagram to detect the position of communication device 200. In FIG. 20a, relay R1 is connected to cable C1, relay R2 is connected to cable C2, relay R3 is connected to cable C3, and relay R4 is connected to cable C4. Cables C1, C2, C3, and C4 are connected to transmitter T, which is connected to host H by cable C5. The relays (R1 through R20) are located throughout the predetermined area in the pattern illustrated in FIG. 20b. The system illustrated in FIG. 20a and FIG. 20b is designed to pinpoint the position of communication device 200 by using the method so-called "global positioning system" or "GPS."



FIG. 21 through FIG. 26 illustrate how the positioning is performed. Assume that device A, a communication device 200, seeks to detect the position of device B, another communication device 200, which is located somewhere in the matrix of relays illustrated in FIG. 20b. First of all, the device ID of device B is entered by using input device 210 (FIG. 1) of device A (S1). The device ID may be its corresponding phone number. A request data including the device ID is sent to host H from device A (S2).


As illustrated in FIG. 22, host H periodically receives data from device A (S1). If the received data is the request data (S2), host H first of all searches its communication log, which records the location of device B as of its last communication with host H (S3). Then host H sends a search signal from the relays described in FIG. 20b which are located within a 100-meter radius of the location registered in the communication log (S4). If there is no response from device B (S5), host H sends a search signal from all relays (from R1 to R20 in FIG. 20b) (S6).


As illustrated in FIG. 23, device B periodically receives data from host H (S1). If the data received is the search signal (S2), device B sends a response signal to host H (S3).


As illustrated in FIG. 24, host H periodically receives data from device B (S1). If the data received is the response signal (S2), host H locates the position of device B by using the method described in FIG. 20a and FIG. 20b (S3), and sends the location data and the relevant map data of the area where device B is located to device A (S4).


As illustrated in FIG. 25, device A periodically receives data from host H (S1). If the data received is the location data and the relevant map data mentioned above (S2), device A displays the map based on the relevant map data and indicates the location thereon based on the location data (S3).


Device A can continuously track the location of device B as illustrated in FIG. 26. First, device A sends a request data to host H (S1). As soon as host H receives the request data (S2), it sends a search signal in the manner illustrated in FIG. 22 (S3). As soon as device B receives the search signal (S4), it sends a response signal to host H (S5). Based on the response signal, host H locates device B with the method described in FIG. 20a and FIG. 20b (S6). Then host H sends to device A a renewed location data and a relevant map data of the area where device B is located (S7). As soon as these data are received (S8), device A displays the map based on the relevant map data and indicates the updated location based on the renewed location data (S9). If device B is still within the specified area, device A may use the original relevant map data. As another embodiment of the present invention, S1 through S4 may be omitted, and device B may instead send a response signal continuously to host H until host H sends a command signal to device B to cease sending the response signal.
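The two-phase search of FIG. 22 (first the relays within a 100-meter radius of the last logged location, then all relays) can be sketched as follows. Coordinates, the `respond` callback, and the data shapes are editorial assumptions; the text does not specify how a relay reports a response:

```python
def locate(device_id, comm_log, relays, respond, radius=100.0):
    """FIG. 22 sketch: host H first pages only the relays within a
    100 m radius of the device's last known location (S3-S4), then
    falls back to all relays (S5-S6). `comm_log` maps device IDs to
    (x, y) positions; `respond(relay)` is a hypothetical callback
    returning True if the sought device answers via that relay."""
    last = comm_log.get(device_id)
    if last is not None:
        near = [r for r in relays
                if ((r[0] - last[0]) ** 2 + (r[1] - last[1]) ** 2) ** 0.5 <= radius]
        for r in near:                 # S4: search near the logged location
            if respond(r):
                return r
    for r in relays:                   # S6: fall back to all relays
        if respond(r):
            return r
    return None
```

The communication log turns most searches into a small local page instead of a sweep of the full relay matrix.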


Positioning System—Automatic Silent Mode



FIG. 27a through FIG. 32g illustrate the automatic silent mode of communication device 200.


In FIG. 27a, relay R1 is connected to cable C1, relay R2 is connected to cable C2, relay R3 is connected to cable C3, and relay R4 is connected to cable C4. Cables C1, C2, C3, and C4 are connected to transmitter T, which is connected to host H by cable C5. The relays (R1 through R20) are located throughout the predetermined area in the pattern illustrated in FIG. 27b. The system illustrated in FIG. 27a and FIG. 27b is designed to pinpoint the position of communication device 200 by using the method so-called "global positioning system" or "GPS."


As illustrated in FIG. 28, the user of communication device 200 may set the silent mode by input device 210 (FIG. 1). When communication device 200 is in the silent mode, (a) the ringing sound is turned off, (b) vibrator 217 (FIG. 1) activates when communication device 200 receives a call, and/or (c) communication device 200 sends an automatic response to the caller device when a call is received. The user may, at his or her discretion, select any of these predetermined functions of the automatic silent mode.



FIG. 29 illustrates how the automatic silent mode is activated. Communication device 200 checks its present location by the method so-called "global positioning system" or "GPS" by using the system illustrated in FIG. 27a and FIG. 27b (S1). Communication device 200 then compares the present location and the previous location (S2). If the difference between the two values is more than the specified amount X, i.e., when the moving velocity of communication device 200 exceeds the predetermined value (S3), the silent mode is activated and (a) the ringing sound is automatically turned off, (b) vibrator 217 (FIG. 1) activates, and/or (c) communication device 200 sends an automatic response to the caller device according to the user's setting. Here, the silent mode is automatically activated because the user of communication device 200 is presumed to be in an automobile and not in a situation to freely answer the phone, or the user is presumed to be riding a train and does not want to disturb other passengers.
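The velocity test of S2-S3 can be sketched as a simple displacement comparison between two successive position fixes; the planar coordinates and the threshold value here are assumptions for illustration:

```python
# Illustrative sketch of FIG. 29: the silent mode is triggered when the
# displacement between the previous and present GPS fixes exceeds the
# specified amount X, i.e., the moving velocity exceeds the predetermined
# value (positions are hypothetical planar coordinates).
import math

def should_activate_silent_mode(previous, present, threshold_x):
    """Compare present and previous locations (S2); return True when the
    difference exceeds X (S3), which activates the silent mode."""
    dx = present[0] - previous[0]
    dy = present[1] - previous[1]
    return math.hypot(dx, dy) > threshold_x
```

Because the fixes are taken at a fixed polling interval, comparing the raw displacement against X is equivalent to comparing velocity against a predetermined speed.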


As another embodiment of the present invention, the automatic silent mode may be administered by host H (FIG. 27a). As illustrated in FIG. 30, the silent mode is set in the manner described in FIG. 28 (S1), and communication device 200 sends a request signal to host H. When host H detects a call to communication device 200 after receiving the request signal, it checks the current location of communication device 200 (S1) and compares it with the previous location (S2). If the difference between the two values is more than the specified amount X, i.e., when the moving velocity of communication device 200 exceeds the predetermined value (S3), host H sends a notice signal to communication device 200 (S4). As illustrated in FIG. 32, communication device 200 receives data periodically from host H (S1). If the received data is a notice signal (S2), communication device 200 activates the silent mode (S3) and (a) the ringing sound is automatically turned off, (b) vibrator 217 (FIG. 1) activates, and/or (c) communication device 200 sends an automatic response to the caller device according to the user's setting. The automatic response may instead be sent from host H.


As another embodiment of the present invention, train route data may be used. As illustrated in FIG. 32a, the train route data is stored in area 263 of RAM 206. The train route data contains a three-dimensional train route map including the location data of the route. FIG. 32b illustrates how the train route data is utilized. CPU 211 (FIG. 1) checks the present location of communication device 200 by the method described in FIG. 27a and FIG. 27b (S1). Then CPU 211 compares the present location with the train route data stored in area 263 of RAM 206 (S2). If the present location of communication device 200 matches the train route data (i.e., if communication device 200 is located on the train route) (S3), the silent mode is activated in the manner described above. The silent mode is activated because the user of communication device 200 is presumed to be currently on the train and may not want to disturb the other passengers on the same train. As another embodiment of the present invention, such a function can be delegated to host H (FIG. 27a) as described in FIG. 32c. Namely, host H checks the present location of communication device 200 by the method described in FIG. 27a and FIG. 27b (S1). Then host H compares the present location with the train route data stored in its own storage (not shown) (S2). If the present location of communication device 200 matches the train route data (i.e., if communication device 200 is located on the train route) (S3), host H sends a notice signal to communication device 200, thereby activating the silent mode in the manner described above.
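The route-matching step of S2-S3 may be sketched as a proximity test against the stored route points; the route coordinates and the tolerance value are illustrative assumptions:

```python
# Illustrative sketch of FIG. 32b: the present location is compared with
# the three-dimensional train route data of area 263; a match (within a
# hypothetical tolerance) activates the silent mode.

def on_train_route(present_location, train_route, tolerance=1.0):
    """Return True when the device lies within `tolerance` of any point
    of the stored train route (S2-S3)."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    return any(dist(present_location, point) <= tolerance
               for point in train_route)

route = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
```

The same comparison can run on host H against its own copy of the route data, as in the delegated embodiment of FIG. 32c.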


Another embodiment is illustrated in FIG. 32f and FIG. 32g. As illustrated in FIG. 32f, relays R 101, R 102, R 103, R 104, R 105, and R 106, which perform the same function as the relays described in FIG. 27a and FIG. 27b, are installed in train Tr. The signals from these relays are sent to host H illustrated in FIG. 27a. Relays R 101 through R 106 emit inside-the-train signals, which are emitted only inside train Tr. FIG. 32g illustrates how communication device 200 operates inside train Tr. Communication device 200 checks the signal received in train Tr (S1). If communication device 200 determines that the received signal is an inside-the-train signal (S2), it activates the silent mode in the manner described above.


Positioning System—Auto Response



FIG. 32d and FIG. 32e illustrate the method of sending an automatic response to a caller device when the silent mode is activated. Assume that the caller device, a communication device 200, intends to call a callee device, another communication device 200, via host H. As illustrated in FIG. 32d, the caller device dials the callee device, and the dialing signal is sent to host H (S1). Host H checks whether the callee device is in the silent mode (S2). If host H detects that the callee device is in the silent mode, it sends a predetermined auto response, which indicates that the callee is probably on a train and may currently not be available, and the auto response is received by the caller device (S3). If the user of the caller device still desires to request a connection and a certain code is input from input device 210 (FIG. 1) (S4), a request signal for connection is sent and received by host H (S5), and the line is connected between the caller device and the callee device via host H (S6). As another embodiment of the present invention, the task of host H described in FIG. 32d may be delegated to the callee device as illustrated in FIG. 32e. The caller device dials the callee device, and the dialing signal is sent to the callee device via host H (S1). The callee device checks whether it is in the silent mode (S2). If the callee device detects that it is in the silent mode, it sends a predetermined auto response, which indicates that the callee is probably on a train and may currently not be available, and the auto response is sent to the caller device via host H (S3). If the user of the caller device still desires to request a connection and a certain code is input from input device 210 (FIG. 1) (S4), a request signal for connection is sent to the callee device via host H (S5), and the line is connected between the caller device and the callee device via host H (S6).
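The host-administered branch of FIG. 32d may be sketched as follows; the response text and the override code are hypothetical stand-ins for the "predetermined auto response" and the "certain code" of S4:

```python
# Illustrative sketch of FIG. 32d: host H intercepts the dialing signal,
# returns the auto response when the callee is in the silent mode, and
# connects only if the caller then inputs the override code.

AUTO_RESPONSE = ("The callee is probably on a train and may currently "
                 "not be available.")

def host_route_call(callee_silent, override_code=None):
    """Return (auto_response, connected) for one call attempt (S1-S6)."""
    if not callee_silent:
        return None, True           # S2: not silent, connect normally
    if override_code == "connect":  # S4-S5: the "certain code" was input
        return AUTO_RESPONSE, True  # S6: line connected via host H
    return AUTO_RESPONSE, False     # S3: auto response only
```

The delegated embodiment of FIG. 32e runs the same decision on the callee device, with host H merely relaying the signals.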


Auto Backup



FIG. 32 through FIG. 37 illustrate the automatic backup system of communication device 200. As illustrated in FIG. 32, RAM 206 (FIG. 1) includes areas to store the data essential to the user of communication device 200, such as area 278 for a phone list, area 279 for an address book, area 280 for email data, area 281 for software A, area 282 for software B, area 283 for software C, area 284 for data D, and area 285 for data E. RAM 206 also includes area 264, i.e., the selected data info storage area, which will be explained in detail hereinafter.


As described in FIG. 34, the user selects, by using input device 210 (FIG. 1), the data which he/she intends to have automatically backed up (S1). The selected data are written in area 264, the selected data info storage area (S2).


The overall operation of this function is illustrated in FIG. 35a and FIG. 35b. First of all, a timer (not shown) is set by a specific input signal produced by input device 210 (FIG. 1) (S1). The timer is incremented periodically (S2), and when the incremented value equals the predetermined value (S3), communication device 200 initiates the dialing process (S4). The dialing process continues until communication device 200 is connected to host 400, explained in FIG. 37 (S5). Once the line is connected, CPU 211 reads the information stored in area 264 (S6) and, based on such information, initiates the transfer of the selected data from RAM 206 to host 400 (S7). The transfer continues until all of the selected data are transferred to host 400 (S8), and the line is disconnected thereafter (S9). This backup sequence can be initiated automatically and periodically by using a timer, or manually. As another embodiment of the present invention, instead of selecting the data that are to be backed up, all data in RAM 206 (FIG. 1) can be transferred to host 400.
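The selection-and-transfer steps (S6-S8) may be sketched with an in-memory stand-in for RAM 206 and host 400; every name below (the dictionaries, keys, and device ID) is an assumption for illustration:

```python
# Minimal sketch of the backup transfer of FIG. 35a/35b: the selection
# info of area 264 drives which data move from RAM 206 to host 400,
# filed at the host under the device's ID (see FIG. 36 and FIG. 37).

def run_backup(ram, selected_keys, host_storage, device_id):
    """Read the selection info (S6) and transfer the selected data until
    all of it reaches the host (S7-S8)."""
    payload = {key: ram[key] for key in selected_keys if key in ram}
    host_storage[device_id] = payload
    return payload

ram_206 = {"phone_list": ["555-0100"], "address_book": ["Alice"],
           "email": []}
host_400 = {}
sent = run_backup(ram_206, ["phone_list", "email"], host_400, "device-A")
```

The alternative embodiment (backing up all data) corresponds to passing every key of `ram_206` as the selection.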



FIG. 36 illustrates the basic structure of the data transferred to host 400. Transferred data 601 includes header 602, device ID 603, selected data 604, and footer 605. Device ID 603 is the identification number of communication device 200, preferably its phone number, and selected data 604 is the pack of data which are transferred from RAM 206 to host 400 based on the information stored in area 264.



FIG. 37 illustrates the basic structure of host 400. Host 400 includes backup data storage area 401, which is used to back up all of the backup data transferred from all communication devices. Host 400 stores the transferred data 601 to the designated area based on the device ID included in transferred data 601. For example, transferred data 601 transferred from device A is stored in area 412 as backup data A. In the same manner, transferred data 601 transferred from device B is stored in area 413 as backup data B; transferred data 601 transferred from device C is stored in area 414 as backup data C; transferred data 601 transferred from device D is stored in area 415 as backup data D; transferred data 601 transferred from device E is stored in area 416 as backup data E; and transferred data 601 transferred from device F is stored in area 417 as backup data F.


Signal Amplifier



FIG. 38 illustrates a signal amplifier utilized for automobiles and other transportation carriers, such as trains, airplanes, space shuttles, and motorcycles. As described in FIG. 38, automobile 500 includes interface 503, an interface detachably connectable to communication device 200, which is connected to amplifier 502 via cable 505. Amplifier 502 is connected to antenna 501 via cable 504 and connector 507 as described in this drawing. The signal produced by communication device 200 is transferred to interface 503. Then the signal is transferred to amplifier 502 via cable 505, where the signal is amplified. The amplified signal is transferred to antenna 501 via cable 504 and connector 507, which transmits the amplified signal to host H (not shown). The received signal is received by antenna 501 and transferred to amplifier 502 via connector 507 and cable 504, and then is transferred to interface 503 via cable 505, which transfers the amplified signal to communication device 200.


Audio/Video Data Capturing System



FIG. 39 through FIG. 44 illustrate the audio/video capturing system of communication device 200. Assume that device A, a communication device 200, captures audio/video data and transfers such data to device B, another communication device 200, via a host (not shown). Primarily, video data is input from CCD unit 214 (FIG. 1) and audio data is input from microphone 215 (FIG. 1) of device A. As illustrated in FIG. 39, RAM 206 includes area 267 which stores video data, area 268 which stores audio data, and area 265 which is a work area utilized for the process explained hereinafter.


As described in FIG. 40, the video data input from CCD unit 214 (S1a) is converted from analog data to digital data (S2a) and is processed by video processor 202 (FIG. 1) (S3a). Area 265 is used as a work area for such process. The processed video data is stored in area 267 of RAM 206 (S4a) and displayed on LCD 201 (FIG. 1) (S5a). As described in the same drawing, the audio data input from microphone 215 (S1b) is converted from analog data to digital data by A/D 213 (FIG. 1) (S2b) and is processed by sound processor 205 (FIG. 1) (S3b). Area 265 is used as a work area for such process. The processed audio data is stored in area 268 of RAM 206 (S4b), is transferred to sound processor 205, and is output from speaker 216 (FIG. 1) via D/A 204 (FIG. 1) (S5b). The sequences of S1a through S5a and S1b through S5b are continued until a specific signal indicating to stop such sequence is input from input device 210 (FIG. 1) (S6).


As described in FIG. 41, CPU 211 (FIG. 1) of device A initiates a dialing process (S1) until the line is connected to a host (not shown) (S2). As soon as the line is connected, CPU 211 reads the audio/video data stored in area 267 and area 268 (S3) and transfers them to signal processor 208, where the data are converted into transferring data (S4). The transferring data is transferred from antenna 218 in a wireless fashion (S5). The sequence of S1 through S5 is continued until a specific signal indicating to stop such sequence is input from input device 210 (FIG. 1) (S6). The line is disconnected thereafter (S7).



FIG. 42 illustrates the basic structure of the transferred data which is transferred from device A as described in S4 and S5 of FIG. 41. Transferred data 610 is primarily composed of header 611, video data 612, audio data 613, relevant data 614, and footer 615. Video data 612 corresponds to the video data stored in area 267 of RAM 206, and audio data 613 corresponds to the audio data stored in area 268 of RAM 206. Relevant data 614 includes various types of data, such as the identification numbers of device A (i.e., the transferor device) and device B (the transferee device), location data which represents the location of device A, etc.
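The framing of transferred data 610 may be sketched as a length-prefixed record; the ASCII header/footer markers and the length-prefix scheme are assumptions for illustration, not the patent's wire format:

```python
# Sketch of transferred data 610 (FIG. 42): video data 612, audio data
# 613, and relevant data 614 framed between header 611 and footer 615,
# using hypothetical 4-byte big-endian length prefixes.
import struct

def pack_av(video, audio, relevant):
    """Frame the three byte-string fields between a header and footer."""
    body = b""
    for field in (video, audio, relevant):
        body += struct.pack(">I", len(field)) + field
    return b"HDR" + body + b"FTR"

def unpack_av(packet):
    """Recover the three fields, as device B does in S4-S5 of FIG. 44a."""
    body, fields = packet[3:-3], []
    while body:
        (n,) = struct.unpack(">I", body[:4])
        fields.append(body[4:4 + n])
        body = body[4 + n:]
    return tuple(fields)
```

On the receiving side, the unpacked video and audio fields correspond to the data stored into the respective RAM areas of device B.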



FIG. 43 illustrates the data contained in RAM 206 (FIG. 1) of device B. As illustrated in FIG. 43, RAM 206 includes area 269 which stores video data, area 270 which stores audio data, and area 266 which is a work area utilized for the process explained hereinafter.


As described in FIG. 44a and FIG. 44b, CPU 211 (FIG. 1) of device B initiates a dialing process (S1) until device B is connected to a host (not shown) (S2). Transferred data 610 is received by antenna 218 (FIG. 1) of device B (S3) and is converted by signal processor 208 into data readable by CPU 211 (S4). Video data and audio data are retrieved from transferred data 610 and stored into area 269 and area 270 of RAM 206, respectively (S5). The video data stored in area 269 is processed by video processor 202 (FIG. 1) (S6a). The processed video data is converted into analog data (S7a) and displayed on LCD 201 (FIG. 1) (S8a). S7a may not be necessary depending on the type of LCD 201 used. The audio data stored in area 270 is processed by sound processor 205 (FIG. 1) (S6b). The processed audio data is converted into analog data by D/A 204 (FIG. 1) (S7b) and output from speaker 216 (FIG. 1) (S8b). The sequences of S6a through S8a and S6b through S8b are continued until a specific signal indicating to stop such sequence is input from input device 210 (FIG. 1) (S9).


Digital Mirror



FIG. 44c through FIG. 44e illustrate the method of using communication device 200 as a mirror. In this embodiment, communication device 200 includes rotator 291 as described in FIG. 44c. Rotator 291 is fixed to the side of communication device 200 and rotates CCD unit 214 (FIG. 1), thereby enabling CCD unit 214 to face multiple directions. CPU 211 (FIG. 1) reads the video data stored in area 267 (FIG. 39) from left to right, as described in FIG. 44d, when CCD unit 214 is facing the opposite direction from LCD 201. However, when CCD unit 214 is facing the same direction as LCD 201, CPU 211 reads the video data stored in area 267 from right to left, as described in FIG. 44e, thereby producing a "mirror image" on LCD 201.
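Reading each row of the stored video data from right to left amounts to reversing every pixel row of the frame, which may be sketched as follows (the frame representation as a list of rows is an assumption):

```python
# Illustrative sketch of FIG. 44e: reversing every row of the video data
# of area 267 produces the "mirror image" shown on LCD 201 when CCD
# unit 214 faces the same direction as LCD 201.

def mirror_frame(frame):
    """Reverse each row of a frame given as a list of pixel rows."""
    return [row[::-1] for row in frame]

frame = [[1, 2, 3],
         [4, 5, 6]]
```

Applying the reversal twice restores the original left-to-right reading of FIG. 44d, so the same buffer serves both orientations.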


As another embodiment of the present invention, more than one CCD unit, each facing a different direction, may be utilized instead of enabling one CCD unit to rotate in the manner described above.


Caller ID



FIG. 45 through FIG. 47 illustrate the caller ID system of communication device 200.


As illustrated in FIG. 45, RAM 206 includes Table C. As shown in the drawing, each phone number corresponds to a specific color and sound. For example, phone #1 corresponds to color A and sound E; phone #2 corresponds to color B and sound F; phone #3 corresponds to color C and sound G; and phone #4 corresponds to color D and sound H.


As illustrated in FIG. 46, the user of communication device 200 selects or inputs a phone number (S1) and selects a specific color (S2) and a specific sound (S3) designated for that phone number. Such sequence can be repeated until there is a specific input from input device 210 ordering otherwise (S4).


As illustrated in FIG. 47, CPU 211 (FIG. 1) periodically checks whether it has received a call from another communication device (S1). If it receives a call (S2), CPU 211 scans Table C (FIG. 45) to see whether the phone number of the caller device is registered in the table (S3). If there is a match (S4), the designated color is output from indicator 212 (FIG. 1) and the designated sound is output from speaker 216 (FIG. 1) (S5). For example, if the incoming call is from phone #1, color A is output from indicator 212 and sound E is output from speaker 216.
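The lookup of FIG. 47 may be sketched as a table scan; the dictionary contents mirror the example pairings given for Table C, while the key strings themselves are illustrative:

```python
# Sketch of Table C (FIG. 45) and the lookup of FIG. 47.

TABLE_C = {
    "phone#1": ("color A", "sound E"),
    "phone#2": ("color B", "sound F"),
    "phone#3": ("color C", "sound G"),
    "phone#4": ("color D", "sound H"),
}

def on_incoming_call(caller_number):
    """Scan Table C (S3); on a match (S4) return the designated color
    for indicator 212 and sound for speaker 216 (S5)."""
    return TABLE_C.get(caller_number)  # None when unregistered
```

An unregistered caller returns no designation, in which case the device falls back to its ordinary ringing behavior.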


Stock Purchase



FIG. 48 through FIG. 52 illustrate the method of purchasing stocks by utilizing communication device 200.



FIG. 48 illustrates the data stored in ROM 207 (FIG. 1) necessary to set the notice mode. Area 251 stores the program regarding the vibration mode; area 252 stores the program regarding the sound which is emitted from speaker 216 (FIG. 1) and several types of sound data, such as sound data I, sound data J, and sound data K; area 253 stores the program regarding the color emitted from indicator 212 (FIG. 1) and several types of color data, such as color data L, color data M, and color data N.


As illustrated in FIG. 49, the notice mode is activated in compliance with the settings stored in setting data area 271 of RAM 206. In the example illustrated in FIG. 49, when the notice mode is activated, vibrator 217 (FIG. 1) is turned on in compliance with the data stored in area 251a, speaker 216 (FIG. 1) is turned on and sound data J is emitted therefrom in compliance with the data stored in area 252a, and indicator 212 (FIG. 1) is turned on and color M is emitted therefrom in compliance with the data stored in area 253a. Area 292 stores the stock purchase data, i.e., the name of the brand, the amount of the limit price, the name of the stock market (such as NASDAQ and/or NYSE), and other relevant information regarding the stock purchase.


As illustrated in FIG. 50, the user of communication device 200 inputs the stock purchase data from input device 210 (FIG. 1), which is stored in area 292 of RAM 206 (S1). By way of inputting specific data from input device 210, the properties of the notice mode (i.e., vibration ON/OFF, sound ON/OFF and the type of sound, indicator ON/OFF and the type of color) are set, and the relevant data are stored in area 271 (i.e., areas 251a, 252a, 253a) of RAM 206 by the programs stored in areas 251, 252, 253 of ROM 207 (S2). Communication device 200 initiates a dialing process (S3) until it is connected to host H (described hereafter) (S4) and sends the stock purchase data thereto.



FIG. 51 illustrates the operation of host H. As soon as host H receives the stock purchase data from communication device 200 (S1), it initiates monitoring the stock markets specified in the stock purchase data (S2). If host H detects that the price of the certain brand specified in the stock purchase data meets the limit price specified therein (S3), it initiates a dialing process (S4) until it is connected to communication device 200 (S5) and sends notice data thereto (S6). As illustrated in FIG. 52, communication device 200 periodically monitors the data received from host H (S1). If the received data is notice data (S2), the notice mode is activated in compliance with the settings stored in setting data area 271 of RAM 206 (S3). In the example illustrated in FIG. 49, vibrator 217 (FIG. 1) is turned on, sound data J is emitted from speaker 216 (FIG. 1), and indicator 212 (FIG. 1) emits color M.
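Host H's monitoring condition (S2-S3) may be sketched against a simple quote feed; the record layout, the brand name, and treating the limit as a buy-limit price are all assumptions for illustration:

```python
# Hypothetical sketch of the monitoring loop of FIG. 51: a notice would
# be sent once a quote for the specified brand meets the limit price in
# the stock purchase data (the feed here is a plain list of quotes).

def monitor(purchase, quotes):
    """Return True at the first quote meeting the limit price (S3),
    i.e., when host H would dial and send notice data (S4-S6)."""
    brand, limit = purchase["brand"], purchase["limit_price"]
    return any(q["brand"] == brand and q["price"] <= limit
               for q in quotes)

order = {"brand": "XYZ", "limit_price": 10.0, "market": "NASDAQ"}
feed = [{"brand": "XYZ", "price": 12.5}, {"brand": "XYZ", "price": 9.8}]
```

On the device side, receiving the notice data simply triggers the notice-mode settings of area 271, as described for FIG. 52.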


Timer Email



FIG. 53a and FIG. 53b illustrate the method of sending emails from communication device 200 by utilizing a timer. The address data, i.e., the email address, is input by input device 210 or by the voice recognition system explained in FIG. 3, FIG. 4, FIG. 5, FIG. 13, FIG. 14, FIG. 14a, FIG. 15, FIG. 16 and/or FIG. 17 (S1), and the text data, i.e., the text of the email message, is input in the same manner (S2). The address data and the text data are automatically saved in RAM 206 (FIG. 1) (S3). The sequence of S1 through S3 is repeated (i.e., writing more than one email) until a specified input signal is input from input device 210 (FIG. 1) or by utilizing the voice recognition system explained above (S4). Once inputting both the address data and the text data (which also include numeric data, images, and programs) is completed, a timer (not shown) is set by input device 210 or by utilizing the voice recognition system (S5), and the timer is incremented periodically (S6) until the timer value equals the predetermined value specified in S5 (S7). A dialing process is continued (S8) until the line is connected (S9), and the text data are sent thereafter to the email addresses specified in S1 (S10). All of the emails are sent (S11), and the line is disconnected thereafter (S12).
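The queue-then-fire behavior above may be sketched with an in-memory outbox in place of RAM 206; the function names and record fields are illustrative assumptions:

```python
# Minimal sketch of the timer email of FIG. 53a/53b.

def queue_email(outbox, address, text):
    """S1-S3: save the address data and text data for later delivery."""
    outbox.append({"address": address, "text": text})

def fire_timer(outbox, timer, predetermined):
    """S6-S12: once the timer reaches the set value, deliver every
    queued email and empty the outbox; otherwise deliver nothing."""
    if timer < predetermined:
        return []
    sent, outbox[:] = list(outbox), []
    return sent

box = []
queue_email(box, "a@example.com", "hello")
```

Setting a specific clock time instead of a countdown, as in the alternative embodiment, changes only the comparison in `fire_timer`.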


As another embodiment of the present invention, a specific time may be input by input device 210 and the text data may be sent at that specific time (i.e., a broad meaning of "timer").


Call Blocking



FIG. 54 through FIG. 56 illustrate the method of so-called "call blocking."


As illustrated in FIG. 54, RAM 206 (FIG. 1) includes area 273 and area 274. Area 273 stores the phone numbers that should be blocked. In the example illustrated in FIG. 54, phone #1, phone #2, and phone #3 are blocked. Area 274 stores message data stating that the phone cannot be connected.



FIG. 55 illustrates the operation of communication device 200. When communication device 200 receives a call (S1), CPU 211 (FIG. 1) scans area 273 of RAM 206 (S2). If the phone number of the incoming call matches one of the phone numbers stored in area 273 of RAM 206 (S3), CPU 211 sends the message data stored in area 274 of RAM 206 to the caller device (S4) and disconnects the line (S5).



FIG. 56 illustrates the method of updating area 273 of RAM 206. Assume that the phone number of the incoming call does not match any of the phone numbers stored in area 273 of RAM 206 (see S3 of FIG. 55). In that case, communication device 200 is connected to the caller device. However, the user of communication device 200 may decide to have such number "blocked" after all. In that case, the user dials "999" while the line is connected. Technically, CPU 211 (FIG. 1) periodically checks the signals input from input device 210 (FIG. 1) (S1). If the input signal represents "999" from input device 210 (S2), CPU 211 adds the phone number of the pending call to area 273 (S3) and sends the message data stored in area 274 of RAM 206 to the caller device (S4). The line is disconnected thereafter (S5).
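The screening of FIG. 55 and the "999" update of FIG. 56 may be sketched together; the message text and the set-based block list are illustrative assumptions:

```python
# Illustrative sketch of FIG. 55 and FIG. 56: blocked numbers stand in
# for area 273, the refusal message for area 274, and dialing "999"
# during a connected call adds the pending number to the block list.

AREA_274 = "The phone cannot be connected."

def screen_call(area_273, caller):
    """FIG. 55: return (message, connected); a match sends the message
    data and drops the line (S3-S5)."""
    return (AREA_274, False) if caller in area_273 else (None, True)

def handle_input(area_273, pending_caller, keyed):
    """FIG. 56: on "999" (S2), add the pending call's number (S3) and
    return the message data to send (S4)."""
    if keyed == "999":
        area_273.add(pending_caller)
        return AREA_274
    return None

blocked = {"phone#1", "phone#2"}
```

The host-administered variant of FIG. 57 through FIG. 59 runs the same logic at host 400, keyed per callee device.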



FIG. 57 through FIG. 59 illustrate another embodiment of the present invention.


As illustrated in FIG. 57, host 400 includes area 403 and area 404. Area 403 stores the phone numbers of communication device 200 that should be blocked. In the example illustrated in FIG. 57, phone #1, phone #2, and phone #3 are blocked for device A; phone #4, phone #5, and phone #6 are blocked for device B; and phone #7, phone #8, and phone #9 are blocked for device C. Area 404 stores message data stating that the phone cannot be connected.



FIG. 58 illustrates the operation of host 400. Assume that the caller device is attempting to connect to device B illustrated in FIG. 57. Host 400 periodically checks the signals from all communication devices 200 (S1). If host 400 detects a call for device B (S2), it scans area 403 and checks whether the phone number of the incoming call matches one of the phone numbers stored therein (S4). If the phone number of the incoming call does not match any of the phone numbers stored in area 403, the line is connected to device B (S5b). On the other hand, if the phone number of the incoming call matches one of the phone numbers stored in area 403, the line is "blocked," i.e., not connected to device B (S5a), and host 400 sends the message data stored in area 404 to the caller device (S6).



FIG. 59 illustrates the method of updating area 403 of host 400. Assume that the phone number of the incoming call does not match any of the phone numbers stored in area 403 (see S4 of FIG. 58). In that case, host 400 allows the connection between the caller device and communication device 200. However, the user of communication device 200 may decide to have such number "blocked" after all. In that case, the user simply dials "999" while the line is connected. Technically, host 400 (FIG. 57) periodically checks the signals input from input device 210 (FIG. 1) (S1). If the input signal represents "999" from input device 210 (FIG. 1) (S2), host 400 adds the phone number of the pending call to area 403 (S3) and sends the message data stored in area 404 to the caller device (S4). The line is disconnected thereafter (S5). As another embodiment of the method illustrated in FIG. 59, host 400 may delegate some of its tasks to communication device 200 (this embodiment is not shown in the drawings). Namely, communication device 200 periodically checks the signals input from input device 210 (FIG. 1). If the input signal represents "999" from input device 210, communication device 200 sends to host 400 a block request signal together with the phone number of the pending call. Host 400, upon receiving the block request signal from communication device 200, adds the phone number of the pending call to area 403 and sends the message data stored in area 404 to the caller device. The line is disconnected thereafter.


Online Payment



FIG. 60 through FIG. 64 illustrate the method of online payment by utilizing communication device 200.


As illustrated in FIG. 60, host 400 includes account data storage area 405. All of the account data of the users of communication device 200 who have signed up for the online payment service are stored in area 405. In the example described in FIG. 60, account A stores the relevant account data of the user using device A; account B stores the relevant account data of the user using device B; account C stores the relevant account data of the user using device C; and account D stores the relevant account data of the user using device D. Here, devices A, B, C, and D are communication devices 200.



FIG. 61a and FIG. 61b illustrate the operation of the payer device. Assume that device A is the payer device and device B is the payee device. Account A, explained in FIG. 60, stores the account data of the user of device A, and account B, explained in the same drawing, stores the account data of the user of device B. As illustrated in FIG. 61a, LCD 201 of device A displays the balance of account A by receiving the relevant data from host 400 (FIG. 60) (S1). By the signal input from input device 210 (FIG. 1), the payer's account and the payee's account (in the present example, account A as the payer's account and account B as the payee's account), the amount of payment, and the device IDs (in the present example, device A as the payer's device and device B as the payee's device) are selected (S2). If the data input from input device 210 is correct (S3), CPU 211 (FIG. 1) of device A prompts for other payments. If there are other payments to make, the sequence of S1 through S3 is repeated until all of the payments are made (S4). The dialing process is initiated and repeated thereafter (S5) until the line is connected to host 400 (S6). Once the line is connected, device A sends the payment data to host 400 (FIG. 60) (S7). The line is disconnected when all of the payment data are sent to host 400 (S8 and S9).



FIG. 62 illustrates the payment data described in S7 of FIG. 61b. Payment data 620 consists of header 621, payer's account information 622, payee's account information 623, amount data 624, device ID data 625, and footer 615. Payer's account information 622 represents the information regarding the payer's account data stored in host 400, which is, in the present example, account A. Payee's account information 623 represents the information regarding the payee's account data stored in host 400, which is, in the present example, account B. Amount data 624 represents the amount of monetary value, either in U.S. dollars or in other currencies, which is to be transferred from the payer's account to the payee's account. The device ID data represents the data of the payer's device and the payee's device, i.e., in the present example, device A and device B.



FIG. 63 illustrates the basic structure of the payment data described in S7 of FIG. 61b when multiple payments are made, i.e., when more than one payment is made in S4 of FIG. 61a. Assume that three payments are made in S4 of FIG. 61a. In that case, payment data 630 consists of header 631, footer 635, and three data sets, i.e., data set 632, data set 633, and data set 634. Each data set represents the data components described in FIG. 62 excluding header 621 and footer 615.



FIG. 64 illustrates the operation of host 400 (FIG. 60). After receiving the payment data from device A described in FIG. 62 and FIG. 63, host 400 retrieves therefrom the payer's account information (in the present example, account A), the payee's account information (in the present example, account B), the amount data which represents the monetary value, and the device IDs of both the payer's device and the payee's device (in the present example, device A and device B) (S1). Based on such data, host 400 subtracts the monetary value represented by the amount data from the payer's account (in the present example, account A) (S2) and adds the same amount to the payee's account (in the present example, account B) (S3). If there are other payments to make, i.e., if host 400 received payment data which has the structure of the one described in FIG. 63, the sequence of S2 and S3 is repeated as many times as the number of data sets included in such payment data.
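The transfer of S2-S3 may be sketched with area 405 as a simple mapping from account name to balance; the balances shown are illustrative, not from the figures:

```python
# Sketch of host 400's transfer step in FIG. 64: subtract the amount
# data from the payer's account (S2) and add the same amount to the
# payee's account (S3), once per data set of the payment data.

def apply_payment(accounts, payer, payee, amount):
    """Move `amount` from the payer's account to the payee's account."""
    accounts[payer] -= amount
    accounts[payee] += amount

area_405 = {"account A": 100.0, "account B": 40.0}
apply_payment(area_405, "account A", "account B", 25.0)
```

For the multi-payment structure of FIG. 63, `apply_payment` would simply be called once per data set contained in payment data 630.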


Navigation System



FIG. 65 through FIG. 74 illustrate the navigation system of communication device 200.


As illustrated in FIG. 65, RAM 206 (FIG. 1) includes area 275, area 276, area 277, and area 295. Area 275 stores a plurality of map data, two-dimensional (2D) image data, which are designed to be displayed on LCD 201 (FIG. 1). Area 276 stores a plurality of object data, three-dimensional (3D) image data, which are also designed to be displayed on LCD 201. The object data are primarily displayed by a method so-called "texture mapping," which is explained in detail hereinafter. Here, the object data include the three-dimensional data of various types of objects that are displayed on LCD 201, such as bridges, houses, hotels, motels, inns, gas stations, restaurants, streets, traffic lights, street signs, trees, etc. Area 277 stores a plurality of location data, i.e., data representing the locations of the objects stored in area 276. Area 277 also stores a plurality of data representing the street address of each object stored in area 276. In addition, area 277 stores the current position data of communication device 200 and the destination data, which are explained in detail hereafter. The map data stored in area 275 and the location data stored in area 277 are linked to each other. Area 295 stores a plurality of attribution data pertaining to the map data stored in area 275 and the location data stored in area 277, such as road blocks, traffic accidents, road constructions, and traffic jams. The attribution data stored in area 295 are updated periodically by receiving updated data from a host (not shown).


As illustrated in FIG. 66 video processor 202 (FIG. 1) includes texture mapping processor 290. Texture mapping processor 290 produces polygons in a three-dimensional space and “pastes” textures to each polygon. The concept of such method is described in the following patents: U.S. Pat. Nos. 5,870,101, 6,157,384, 5,774,125, 5,375,206, and/or 5,925,127.


As illustrated in FIG. 67 the voice recognition system is activated when CPU 211 (FIG. 1) detects a specific signal input from input device 210 (FIG. 1) (S1). After the voice recognition system is activated the input current position mode starts and the current position of communication device 200 is input by the voice recognition system explained in FIG. 3, FIG. 4, FIG. 5, FIG. 13, FIG. 14, FIG. 14a, FIG. 15, FIG. 16 and/or FIG. 17 (S2). The current position can also be input from input device 210. As another embodiment of the present invention the current position can automatically be detected by the method so-called “global positioning system” or “GPS” as illustrated in FIG. 20a through FIG. 26 and the current data input therefrom. After the process of inputting the current data is completed the input destination mode starts and the destination is input by the voice recognition system explained above or by input device 210 (S3), and the voice recognition system is deactivated after the process of inputting the destination data is completed by using such system (S4).



FIG. 68 illustrates the sequence of the input current position mode described in S2 of FIG. 67. When analog audio data is input from microphone 215 (FIG. 1) (S1) such data is converted into digital audio data by A/D 213 (FIG. 1) (S2). The digital audio data is processed by sound processor 205 (FIG. 1) to retrieve text and numeric data therefrom (S3). The retrieved data is displayed on LCD 201 (FIG. 1) (S4). The data can be corrected by repeating the sequence of S1 through S4 until the correct data is displayed (S5). If the correct data is displayed such data is registered as current position data (S6). As stated above the current position data can be input manually by input device 210 (FIG. 1) and/or automatically by the method so-called “global positioning system” or “GPS” as described above.



FIG. 69 illustrates the sequence of the input destination mode described in S3 of FIG. 67. When analog audio data is input from microphone 215 (FIG. 1) (S1) such data is converted into digital audio data by A/D 213 (FIG. 1) (S2). The digital audio data is processed by sound processor 205 (FIG. 1) to retrieve text and numeric data therefrom (S3). The retrieved data is displayed on LCD 201 (FIG. 1) (S4). The data can be corrected by repeating the sequence of S1 through S4 until the correct data is displayed (S5). If the correct data is displayed such data is registered as destination data (S6).
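Both the input current position mode (FIG. 68) and the input destination mode (FIG. 69) follow the same capture, convert, confirm, and register loop. A minimal sketch of that loop, with the A/D conversion, the sound processor, and the user confirmation stubbed out as hypothetical callables:

```python
def input_by_voice(capture, to_text, confirm):
    """S1-S6: capture audio and repeat until the correct data is confirmed."""
    while True:
        analog = capture()        # S1: analog audio from microphone 215
        digital = analog          # S2: A/D 213 conversion (stubbed here)
        text = to_text(digital)   # S3: sound processor 205 retrieves text
        # S4: retrieved data displayed on LCD 201; S5: repeat if incorrect
        if confirm(text):
            return text           # S6: registered as position/destination data

# A misrecognized first attempt is corrected by repeating S1 through S4.
attempts = iter(["San Fransisco", "San Francisco"])
result = input_by_voice(lambda: next(attempts),
                        lambda digital: digital,
                        lambda text: text == "San Francisco")
```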



FIG. 70 illustrates the sequence of displaying the shortest route from the current position to the destination. CPU 211 (FIG. 1) retrieves both the current position data and the destination data which are input by the method described in FIG. 67 through FIG. 69 from area 277 of RAM 206 (FIG. 1). By utilizing the location data of streets, bridges, traffic lights and other relevant data CPU 211 calculates the shortest route to the destination (S1). CPU 211 then retrieves the relevant two-dimensional map data which should be displayed on LCD 201 from area 275 of RAM 206 (S2). As another embodiment of the present invention, by way of utilizing the location data stored in area 277, CPU 211 may produce a three-dimensional map by composing the three-dimensional objects (by the method so-called “texture mapping” as described above) which are stored in area 276 of RAM 206. The two-dimensional map and/or the three-dimensional map is displayed on LCD 201 (FIG. 1) (S3). As another embodiment of the present invention the attribution data stored in area 295 of RAM 206 may be utilized. Namely if any road block, traffic accident, road construction, and/or traffic jam is included in the shortest route calculated by the method mentioned above, CPU 211 calculates the second shortest route to the destination. If the second shortest route still includes a road block, traffic accident, road construction, and/or traffic jam CPU 211 calculates the third shortest route to the destination. CPU 211 repeats the calculation until the calculated route does not include any road block, traffic accident, road construction, and/or traffic jam. The shortest route to the destination is highlighted by a significant color (such as red) to enable the user of communication device 200 to easily recognize such route on LCD 201.
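The fallback logic using the attribution data (take the next-shortest route until one is free of obstructions) can be sketched as follows; the candidate routes are assumed pre-ranked by length, and the route and segment names are hypothetical.

```python
def select_route(routes_by_length, blocked_segments):
    """Return the shortest candidate route containing no blocked segment.

    routes_by_length: the shortest, second shortest, third shortest, ...
    routes to the destination, in that order.
    blocked_segments: segments carrying attribution data from area 295
    (road blocks, traffic accidents, road constructions, traffic jams).
    """
    for route in routes_by_length:
        if not any(segment in blocked_segments for segment in route):
            return route  # highlighted in a significant color on LCD 201
    return None

routes = [["A", "B", "D"], ["A", "C", "D"], ["A", "E", "D"]]
chosen = select_route(routes, blocked_segments={"B"})
```

Here the shortest route passes through blocked segment "B", so the second shortest route is selected, mirroring the repeated calculation described above.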


As another embodiment of the present invention an image which is similar to the one which is observed by the user in the real world may be displayed on LCD 201 (FIG. 1) by using the three-dimensional object data. In order to produce such image CPU 211 (FIG. 1) identifies the present location and retrieves the corresponding location data from area 277 of RAM 206 (FIG. 65). Then CPU 211 retrieves a plurality of object data which correspond to such location data from area 276 of RAM 206 (FIG. 65) and displays a plurality of objects on LCD 201 based on such object data in the manner the user of communication device 200 would observe them from the current location.



FIG. 71 illustrates the sequence of updating the shortest route to the destination while communication device 200 is moving. By way of periodically and automatically inputting the current position by the method so-called “global positioning system” or “GPS” as described above the current position is continuously updated (S1). By utilizing the location data of streets and traffic lights and other relevant data CPU 211 recalculates the shortest route to the destination (S2). CPU 211 then retrieves the relevant two-dimensional map data which should be displayed on LCD 201 from area 275 of RAM 206 (FIG. 65) (S3). As another embodiment of the present invention, by way of utilizing the location data stored in area 277, CPU 211 may produce a three-dimensional map by composing the three-dimensional objects, by the method so-called “texture mapping,” which are stored in area 276 of RAM 206 (FIG. 65). The two-dimensional map and/or the three-dimensional map is displayed on LCD 201 (FIG. 1) (S4). The shortest route to the destination is re-highlighted by a significant color (such as red) to enable the user of communication device 200 to easily recognize the updated route on LCD 201.



FIG. 72 illustrates the method of finding the location of the nearest desired facility, such as a restaurant, hotel, gas station, etc. The voice recognition system is activated in the manner described in FIG. 67 (S1). By way of utilizing the system a certain type of facility is selected from the options displayed on LCD 201 (FIG. 1). The prepared options can be a) restaurant, b) lodge, and c) gas station (S2). Once one of the options is selected CPU 211 (FIG. 1) calculates and inputs the current position by the method described in FIG. 68 and/or FIG. 71 (S3). From the data selected in S2 CPU 211 scans area 277 of RAM 206 (FIG. 65) and searches for the location of the facility of the selected category (such as restaurant) which is the closest to the current position (S4). CPU 211 then retrieves the relevant two-dimensional map data which should be displayed on LCD 201 from area 275 of RAM 206 (FIG. 65) (S5). As another embodiment of the present invention, by way of utilizing the location data stored in area 277, CPU 211 may produce a three-dimensional map by composing the three-dimensional objects, by the method so-called “texture mapping,” which are stored in area 276 of RAM 206 (FIG. 65). The two-dimensional map and/or the three-dimensional map is displayed on LCD 201 (FIG. 1) (S6). The shortest route to the destination is highlighted by a significant color (such as red) to enable the user of communication device 200 to easily recognize such route on LCD 201. The voice recognition system is deactivated thereafter (S7).
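The scan in S4 amounts to a nearest-neighbor search over the location data of area 277, restricted to the selected category. A hypothetical sketch using straight-line distance (the actual embodiment could equally rank by route length):

```python
import math

def nearest_facility(current_position, facilities, category):
    """S4: find the facility of the selected category closest to the
    current position; `facilities` mimics area 277 location data."""
    candidates = [f for f in facilities if f["category"] == category]
    return min(candidates,
               key=lambda f: math.dist(current_position, f["position"]),
               default=None)

facilities = [
    {"name": "Gas 1", "category": "gas station", "position": (5.0, 5.0)},
    {"name": "Diner", "category": "restaurant", "position": (1.0, 1.0)},
    {"name": "Cafe",  "category": "restaurant", "position": (4.0, 0.0)},
]
closest = nearest_facility((0.0, 0.0), facilities, "restaurant")
```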



FIG. 73 illustrates the method of displaying the time and distance to the destination. As illustrated in FIG. 73 CPU 211 (FIG. 1) calculates the current position, the source data of which can be input by the method described in FIG. 68 and/or FIG. 71 (S1). The distance is calculated by the method described in FIG. 70 (S2). The speed is calculated from the distance which communication device 200 has proceeded within a specific duration of time (S3). The distance to the destination and the time left are displayed on LCD 201 (FIG. 1) (S4 and S5).
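The computations in S3 through S5 reduce to simple arithmetic. A sketch under the assumption of constant speed; the units and function name are illustrative only.

```python
def speed_and_time_left(distance_proceeded, elapsed_s, distance_to_destination):
    """S3: speed = distance proceeded within a specific duration of time;
    S5: time left = remaining distance / speed (constant-speed assumption).
    Distances in meters, time in seconds."""
    speed = distance_proceeded / elapsed_s
    time_left = distance_to_destination / speed
    return speed, time_left

# Device proceeded 300 m in 30 s (10 m/s); 5000 m remain -> 500 s left.
speed, time_left = speed_and_time_left(300, 30, 5000)
```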



FIG. 74 illustrates the method of warning and giving instructions when the user of communication device 200 deviates from the correct route. By way of periodically and automatically inputting the current position by the method so-called “global positioning system” or “GPS” as described above the current position is continuously updated (S1). If the current position deviates from the correct route (S2) warnings are given from speaker 216 (FIG. 1) and/or LCD 201 (FIG. 1) (S3). The method described in FIG. 74 is repeated for a certain period of time. If the deviation still exists after such period of time has passed CPU 211 (FIG. 1) initiates the sequence described in FIG. 70, calculates the shortest route to the destination, and displays it on LCD 201. The details of such sequence are the same as the one explained in FIG. 70.
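The warn-then-reroute behavior can be sketched as a monitoring loop; the deviation period, the on-route test, and the warning/reroute hooks below are all hypothetical stand-ins.

```python
def monitor_route(positions, on_route, warn, recalculate, patience=3):
    """S1-S3: warn while the current position deviates from the correct
    route; after `patience` consecutive deviations, recalculate the
    shortest route (the sequence of FIG. 70)."""
    deviations = 0
    for position in positions:      # S1: periodically updated GPS position
        if on_route(position):      # S2: check for deviation
            deviations = 0
            continue
        warn(position)              # S3: warning via speaker 216 / LCD 201
        deviations += 1
        if deviations >= patience:
            recalculate(position)   # shortest route recalculated, displayed
            deviations = 0

events = []
monitor_route(
    positions=["on", "off", "off", "off", "on"],
    on_route=lambda p: p == "on",
    warn=lambda p: events.append("warn"),
    recalculate=lambda p: events.append("reroute"),
)
```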



FIG. 74a illustrates the overall operation of communication device 200 regarding the navigation system and the communication system. When communication device 200 receives data from antenna 218 (S1) CPU 211 (FIG. 1) determines whether the data is navigation data, i.e., data necessary to operate the navigation system (S2). If the data received is navigation data the navigation system described in FIG. 67 through FIG. 74 is performed (S3). On the other hand, if the data received is communication data (S4) the communication system, i.e., the system necessary for wireless communication which is mainly described in FIG. 1, is performed (S5).


Remote Controlling System



FIG. 75 through FIG. 83 illustrate the remote controlling system of communication device 200.


As illustrated in FIG. 75 communication device 200 is connected to network NT. Network NT may be the internet or have the same or similar structure described in FIG. 2a, FIG. 2b and/or FIG. 2c except that “device B” in these drawings is substituted by “sub-host SH”. Network NT is connected to sub-host SH in a wireless fashion. Sub-host SH administers various kinds of equipment installed in building 801, such as TV 802, microwave oven 803, VCR 804, bathroom 805, room light 806, AC 807, heater 808, door 809, and CCD camera 810. Communication device 200 transfers a control signal to sub-host SH via network NT, and sub-host SH controls the selected equipment based on the control signal.


As illustrated in FIG. 76 communication device 200 is enabled to perform the remote controlling system when the device is set to the home equipment controlling mode. Once communication device 200 is set to the home equipment controlling mode, LCD 201 (FIG. 1) displays all pieces of equipment which are remotely controllable by communication device 200. Each piece of equipment can be controlled by the following methods.



FIG. 77 illustrates the method of remotely controlling TV 802. In order to check the status of TV 802 a specific signal is input from input device 210 (FIG. 1) and communication device 200 thereby sends a check request signal to sub-host SH via network NT. Sub-host SH, upon receiving the check request signal, checks the status of TV 802, i.e., the status of the power (ON/OFF), the channel, and the timer of TV 802 (S1), and returns the results to communication device 200 via network NT, which are displayed on LCD 201 (S2). Based on the control signal produced by communication device 200, which is transferred via network NT, sub-host SH turns the power on (or off) (S3a), selects the channel (S3b), and/or sets the timer of TV 802 (S3c). The sequence of S2 and S3 can be repeated (S4).
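FIG. 77 through FIG. 83 all follow the same check-then-control exchange between communication device 200 and sub-host SH. A hypothetical sketch of the sub-host side, using TV 802 and its checkable status fields from S1; the message format and field names are illustrative only.

```python
# Hypothetical sub-host SH handler for the check/control exchange (S1-S3);
# the message dictionaries stand in for the signals sent over network NT.

tv_802 = {"power": "OFF", "channel": 1, "timer": None}

def handle(equipment, message):
    if message["type"] == "check_request":   # S1: check the current status
        return dict(equipment)               # S2: results back to device 200
    if message["type"] == "control":         # S3a-S3c: apply the settings
        equipment.update(message["settings"])
        return dict(equipment)

status = handle(tv_802, {"type": "check_request"})
handle(tv_802, {"type": "control", "settings": {"power": "ON", "channel": 5}})
```

The sequence of S2 and S3 can be repeated (S4) simply by sending further check and control messages; the handlers for microwave oven 803, VCR 804, bathroom 805, AC 807, heater 808, door 809, and CCD camera 810 would differ only in their status fields.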



FIG. 78 illustrates the method of remotely controlling microwave oven 803. In order to check the status of microwave oven 803 a specific signal is input from input device 210 (FIG. 1) and communication device 200 thereby sends a check request signal to sub-host SH via network NT. Sub-host SH, upon receiving the check request signal, checks the status of microwave oven 803, i.e., the status of the power (ON/OFF), the status of temperature, and the timer of microwave oven 803 (S1), and returns the results to communication device 200 via network NT, which are displayed on LCD 201 (S2). Based on the control signal produced by communication device 200, which is transferred via network NT, sub-host SH turns the power on (or off) (S3a), selects the temperature (S3b), and/or sets the timer of microwave oven 803 (S3c). The sequence of S2 and S3 can be repeated (S4).



FIG. 79 illustrates the method of remotely controlling VCR 804. In order to check the status of VCR 804 a specific signal is input from input device 210 (FIG. 1) and communication device 200 thereby sends a check request signal to sub-host SH via network NT. Sub-host SH, upon receiving the check request signal, checks the status of VCR 804, i.e., the status of the power (ON/OFF), the channel, the timer, and the status of the recording mode (e.g., one day, weekdays, or weekly) of VCR 804 (S1), and returns the results to communication device 200 via network NT, which are displayed on LCD 201 (S2). Based on the control signal produced by communication device 200, which is transferred via network NT, sub-host SH turns the power on (or off) (S3a), selects the channel (S3b), sets the timer (S3c), and/or selects the recording mode of VCR 804 (S3d). The sequence of S2 and S3 can be repeated (S4).



FIG. 80 illustrates the method of remotely controlling bathroom 805. In order to check the status of bathroom 805 a specific signal is input from input device 210 (FIG. 1) and communication device 200 thereby sends a check request signal to sub-host SH via network NT. Sub-host SH, upon receiving the check request signal, checks the status of bathroom 805, i.e., the status of bath plug (or stopper for bathtub) (OPEN/CLOSE), the temperature, the amount of hot water, and the timer of bathroom 805 (S1), and returns the results to communication device 200 via network NT, which are displayed on LCD 201 (S2). Based on the control signal produced by communication device 200, which is transferred via network NT, sub-host SH opens (or closes) the bath plug (S3a), selects the temperature (S3b), selects the amount of hot water (S3c), and/or sets the timer of bathroom 805 (S3d). The sequence of S2 and S3 can be repeated (S4).



FIG. 81 illustrates the method of remotely controlling AC 807 and heater 808. In order to check the status of AC 807 and/or heater 808 a specific signal is input from input device 210 (FIG. 1) and communication device 200 thereby sends a check request signal to sub-host SH via network NT. Sub-host SH, upon receiving the check request signal, checks the status of AC 807 and/or heater 808, i.e., the status of the power (ON/OFF), the status of temperature, and the timer of AC 807 and/or heater 808 (S1), and returns the results to communication device 200 via network NT, which are displayed on LCD 201 (S2). Based on the control signal produced by communication device 200, which is transferred via network NT, sub-host SH turns the power on (or off) (S3a), selects the temperature (S3b), and/or sets the timer of AC 807 and/or heater 808 (S3c). The sequence of S2 and S3 can be repeated (S4).



FIG. 82 illustrates the method of remotely controlling door 809. In order to check the status of door 809 a specific signal is input from input device 210 (FIG. 1) and communication device 200 thereby sends a check request signal to sub-host SH via network NT. Sub-host SH, upon receiving the check request signal, checks the status of door 809, i.e., the status of the door lock (LOCKED/UNLOCKED), and the timer of door lock (S1), and returns the results to communication device 200 via network NT, which are displayed on LCD 201 (S2). Based on the control signal produced by communication device 200, which is transferred via network NT, sub-host SH locks (or unlocks) the door (S3a), and/or sets the timer of the door lock (S3b). The sequence of S2 and S3 can be repeated (S4).



FIG. 83 illustrates the method of remotely controlling CCD camera 810. In order to check the status of CCD camera 810 a specific signal is input from input device 210 (FIG. 1) and communication device 200 thereby sends a check request signal to sub-host SH via network NT. Sub-host SH, upon receiving the check request signal, checks the status of CCD camera 810, i.e., the status of the camera angle, zoom and pan, and the timer of CCD camera 810 (S1), and returns the results to communication device 200 via network NT, which are displayed on LCD 201 (S2). Based on the control signal produced by communication device 200, which is transferred via network NT, sub-host SH selects the camera angle (S3a), selects zoom or pan (S3b), and/or sets the timer of CCD camera 810 (S3c). The sequence of S2 and S3 can be repeated (S4).



FIG. 84 illustrates the overall operation of communication device 200 regarding the remote controlling system and the communication system. CPU 211 (FIG. 1) periodically checks the input signal from input device 210 (FIG. 1) (S1). If the input signal indicates that the remote controlling system is selected (S2) CPU 211 initiates the process for the remote controlling system (S3). On the other hand, if the input signal indicates that the communication system is selected (S4) CPU 211 initiates the process for the communication system (S5).



FIG. 85 is a further description of the communication performed between sub-host SH and door 809 which is described in FIG. 82. When sub-host SH receives a check request signal as described in FIG. 82 sub-host SH sends a check status signal which is received by controller 831 via transmitter 830. Controller 831 checks the status of door lock 832 and sends back a response signal to sub-host SH via transmitter 830 indicating that door lock 832 is locked or unlocked. Upon receiving the response signal from controller 831 sub-host SH sends a result signal to communication device 200 as described in FIG. 82. When sub-host SH receives a control signal from communication device 200 as described in FIG. 82 it sends a door control signal which is received by controller 831 via transmitter 830. Controller 831 locks or unlocks door lock 832 in conformity with the door control signal. As another embodiment of the present invention controller 831 may perform the tasks of both sub-host SH and itself and communicate directly with communication device 200 via network NT.
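The relay between sub-host SH and controller 831 can be sketched as two message hops; the message names below are hypothetical, illustrating only the signal flow of FIG. 85.

```python
# Illustrative relay: sub-host SH <-> controller 831 <-> door lock 832.

door_lock_832 = {"state": "LOCKED"}

def controller_831(signal):
    """Receives signals via transmitter 830 and operates door lock 832."""
    if signal == "check_status":                 # check status signal
        return door_lock_832["state"]            # response signal
    if signal in ("lock", "unlock"):             # door control signal
        door_lock_832["state"] = "LOCKED" if signal == "lock" else "UNLOCKED"
        return door_lock_832["state"]

def sub_host_sh(request):
    """Forward a check or control request from communication device 200."""
    if request == "check":
        return controller_831("check_status")    # result signal to device 200
    return controller_831(request)

result = sub_host_sh("check")
sub_host_sh("unlock")
```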


As another embodiment of the present invention each piece of equipment, i.e., TV 802, microwave oven 803, VCR 804, bathroom 805, room light 806, AC 807, heater 808, door 809, and CCD camera 810, may carry a computer which directly administers its own equipment and directly communicates with communication device 200 via network NT, instead of sub-host SH administering all pieces of equipment and communicating with communication device 200.


The above-mentioned invention is also applicable to carriers in general, such as automobiles, airplanes, space shuttles, ships, motorcycles and trains.


Auto Emergency Calling System



FIG. 86 and FIG. 87 illustrate the automatic emergency calling system.



FIG. 86 illustrates the overall structure of the automatic emergency calling system. Communication device 200 is connected to network NT. Network NT may be the internet or have the same or similar structure described in FIG. 2a, FIG. 2b and/or FIG. 2c. Network NT is connected to automobile 835 thereby enabling automobile 835 to communicate with communication device 200 in a wireless fashion. Emergency center EC, a host computer, is also connected to automobile 835 in a wireless fashion via network NT. Airbag 838, which prevents the persons in automobile 835 from being physically injured or minimizes such injury in case a traffic accident occurs, is connected to activator 840, which activates airbag 838 when it detects an impact of more than a certain level. Detector 837 sends an emergency signal via transmitter 836 when activator 840 is activated. The emergency signal is sent to both emergency center EC and communication device 200. In lieu of airbag 838 any equipment may be used so long as such equipment prevents or minimizes physical injury to the persons in automobile 835.



FIG. 87 illustrates the overall process of the automatic emergency calling system. Detector 837 periodically checks activator 840 (S1). If the activator 840 is activated (S2) detector 837 transmits an emergency signal via transmitter 836 (S3a). The emergency signal is transferred via network NT and received by emergency center EC and by communication device 200 (S3b).
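The polling and broadcast of S1 through S3b may be sketched as follows; the recipient list and signal payload are illustrative stand-ins for the transmission via transmitter 836 and network NT.

```python
def detector_837(activator_activated, send):
    """S1/S2: check activator 840; S3a/S3b: on activation, transmit the
    emergency signal to both emergency center EC and device 200."""
    if not activator_activated:
        return []
    recipients = ["emergency center EC", "communication device 200"]
    for recipient in recipients:
        send(recipient, "emergency signal")  # via transmitter 836, network NT
    return recipients

sent = []
notified = detector_837(True, lambda recipient, signal: sent.append(recipient))
```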


As another embodiment of the present invention the power of detector 837 may usually be turned off, and activator 840 may turn on the power of detector 837 by the activation of activator 840, thereby enabling detector 837 to send the emergency signal to both emergency center EC and communication device 200 as described above.


This invention is also applicable to any carriers including airplanes, space shuttles, ships, motorcycles and trains.

Claims
  • 1. A communication device which is a handheld mobile device comprising: an input device; an antenna; an email data transfer implementer, wherein email data is retrieved from said communication device and transferred via said antenna; a 1st device remotely controlling implementer, wherein a 1st remote control signal is transferred via said antenna by which a 1st device is controlled via a network from the location external to an artificial structure; and a 2nd device remotely controlling implementer, wherein a 2nd remote control signal is transferred via said antenna by which a 2nd device is controlled via said network from the location external to said artificial structure; wherein said 1st device and said 2nd device are the devices located in said artificial structure.
  • 2. The communication device of claim 1, wherein said 1st device is a TV tuner which is directly or indirectly connected to said network.
  • 3. The communication device of claim 1, wherein said 1st device is a microwave oven which is directly or indirectly connected to said network.
  • 4. The communication device of claim 1, wherein said 1st device is a video recording device which is directly or indirectly connected to said network.
  • 5. The communication device of claim 1, wherein said 1st device is an air conditioner which is directly or indirectly connected to said network.
  • 6. The communication device of claim 1, wherein said 1st device is a door lock which is directly or indirectly connected to said network.
  • 7. A system comprising: a communication device which is a handheld mobile device comprising an input device and an antenna; an email data transfer implementer, wherein email data is retrieved from said communication device and transferred via said antenna; a 1st device remotely controlling implementer, wherein a 1st remote control signal is transferred via said antenna by which a 1st device is controlled via a network from the location external to an artificial structure; and a 2nd device remotely controlling implementer, wherein a 2nd remote control signal is transferred via said antenna by which a 2nd device is controlled via said network from the location external to said artificial structure; wherein said 1st device and said 2nd device are the devices located in said artificial structure.
  • 8. The system of claim 7, wherein said 1st device is a TV tuner which is directly or indirectly connected to said network.
  • 9. The system of claim 7, wherein said 1st device is a microwave oven which is directly or indirectly connected to said network.
  • 10. The system of claim 7, wherein said 1st device is a video recording device which is directly or indirectly connected to said network.
  • 11. The system of claim 7, wherein said 1st device is an air conditioner which is directly or indirectly connected to said network.
  • 12. The system of claim 7, wherein said 1st device is a door lock which is directly or indirectly connected to said network.
  • 13. A method for a communication device which is a handheld mobile device comprising an input device and an antenna, said method comprises: an email data transfer implementing step, wherein email data is retrieved from said communication device and transferred via said antenna; a 1st device remotely controlling implementing step, wherein a 1st remote control signal is transferred via said antenna by which a 1st device is controlled via a network from the location external to an artificial structure; and a 2nd device remotely controlling implementing step, wherein a 2nd remote control signal is transferred via said antenna by which a 2nd device is controlled via said network from the location external to said artificial structure; wherein said 1st device and said 2nd device are the devices located in said artificial structure.
  • 14. The method of claim 13, wherein said 1st device is a TV tuner which is directly or indirectly connected to said network.
  • 15. The method of claim 13, wherein said 1st device is a microwave oven which is directly or indirectly connected to said network.
  • 16. The method of claim 13, wherein said 1st device is a video recording device which is directly or indirectly connected to said network.
  • 17. The method of claim 13, wherein said 1st device is an air conditioner which is directly or indirectly connected to said network.
  • 18. The method of claim 13, wherein said 1st device is a door lock which is directly or indirectly connected to said network.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. Ser. No. 15/878,436 filed 2018 Jan. 24, which is a continuation of U.S. Ser. No. 15/422,458 filed 2017 Feb. 1, which is a continuation of U.S. Ser. No. 15/089,465 filed 2016 Apr. 1, which is a continuation of U.S. Ser. No. 14/318,684 filed 2014 Jun. 29, which is a continuation of U.S. Ser. No. 13/943,792 filed 2013 Jul. 17, which is a continuation of U.S. Ser. No. 13/016,974 filed 2011 Jan. 29, which is a continuation of U.S. Ser. No. 12/177,927 filed 2008 Jul. 23, which is a continuation of U.S. Ser. No. 11/464,835 filed 2006 Aug. 16, which is a continuation of U.S. Ser. No. 10/209,399 filed 2002 Jul. 29, which claims the benefit of U.S. Provisional Application No. 60/329,997 filed 2001 Oct. 18, all of which are hereby incorporated herein by reference in their entirety.

US Referenced Citations (641)
Number Name Date Kind
4934773 Becker Jun 1990 A
5173881 Sindle Dec 1992 A
5257313 Fujishita et al. Oct 1993 A
5272638 Martin et al. Dec 1993 A
5345272 Ersoz et al. Sep 1994 A
5353376 Oh et al. Oct 1994 A
5388147 Grimes Feb 1995 A
5405152 Katanics et al. Apr 1995 A
5414461 Kishi et al. May 1995 A
5418837 Johansson et al. May 1995 A
5438357 McNelley Aug 1995 A
5442453 Takagi et al. Aug 1995 A
5446904 Belt et al. Aug 1995 A
5479476 Finke-Anlauff Dec 1995 A
5530472 Bregman et al. Jun 1996 A
5539810 Kennedy et al. Jul 1996 A
5550754 McNelley et al. Aug 1996 A
5559554 Uekane et al. Sep 1996 A
5566073 Margolin Oct 1996 A
5588009 Will Dec 1996 A
5625675 Katsumaru et al. Apr 1997 A
5629741 Hopper May 1997 A
5687331 Volk et al. Nov 1997 A
5758280 Kimura May 1998 A
5772586 Heinonen et al. Jun 1998 A
5778304 Grube et al. Jul 1998 A
5786846 Hiroaki Jul 1998 A
5793364 Bolanos et al. Aug 1998 A
5796338 Mardirossian Aug 1998 A
5825408 Yuyama et al. Oct 1998 A
5844824 Newman et al. Dec 1998 A
5902349 Endo et al. May 1999 A
5903706 Wakabayashi et al. May 1999 A
5918180 Dimino Jun 1999 A
5924040 Trompower Jul 1999 A
5936610 Endo Aug 1999 A
5940139 Smoot Aug 1999 A
5959661 Isono Sep 1999 A
6009336 Harris et al. Dec 1999 A
6034715 Ishida et al. Mar 2000 A
6069648 Suso et al. May 2000 A
6073034 Jacobsen et al. Jun 2000 A
6081265 Nakayama et al. Jun 2000 A
6085112 Kleinschmidt et al. Jul 2000 A
6094237 Hashimoto Jul 2000 A
6111863 Rostoker et al. Aug 2000 A
6115597 Kroll et al. Sep 2000 A
6128594 Gulli et al. Oct 2000 A
6144848 Walsh et al. Nov 2000 A
6148212 Park et al. Nov 2000 A
6161134 Wang et al. Dec 2000 A
6167283 Korpela et al. Dec 2000 A
6192343 Morgan et al. Feb 2001 B1
6195089 Chaney et al. Feb 2001 B1
6198942 Hayashi et al. Mar 2001 B1
6202060 Tran Mar 2001 B1
6202212 Sturgeon et al. Mar 2001 B1
6216013 Moore et al. Apr 2001 B1
6216158 Luo et al. Apr 2001 B1
6222482 Gueziec Apr 2001 B1
6225944 Hayes May 2001 B1
6226500 Nonami May 2001 B1
6241612 Heredia Jun 2001 B1
6243039 Elliot Jun 2001 B1
6253075 Beghtol et al. Jun 2001 B1
6265988 LeMense et al. Jul 2001 B1
6282435 Wagner et al. Aug 2001 B1
6285317 Ong Sep 2001 B1
6285757 Carroll et al. Sep 2001 B1
6292666 Siddiqui et al. Sep 2001 B1
6311077 Bien Oct 2001 B1
6366651 Griffith et al. Apr 2002 B1
6385461 Raith May 2002 B1
6385465 Yoshioka May 2002 B1
6385654 Tanaka May 2002 B1
6405033 Kennedy, III et al. Jun 2002 B1
6411198 Hirai et al. Jun 2002 B1
6412112 Barrett et al. Jun 2002 B1
6421470 Nozaki et al. Jul 2002 B1
6421602 Bullock et al. Jul 2002 B1
6438380 Bi et al. Aug 2002 B1
6442404 Sakajiri Aug 2002 B1
6445802 Dan Sep 2002 B1
6487422 Lee Nov 2002 B1
6507643 Groner Jan 2003 B1
6510325 Mack, II et al. Jan 2003 B1
6512919 Ogasawara Jan 2003 B2
6518956 Sato Feb 2003 B1
6519566 Boyer et al. Feb 2003 B1
6526293 Matsuo Feb 2003 B1
6528533 Lauffer Mar 2003 B2
6529742 Yang Mar 2003 B1
6542750 Hendrey et al. Apr 2003 B2
6549215 Jouppi Apr 2003 B2
6549756 Engstrom Apr 2003 B1
6553309 Uchida et al. Apr 2003 B2
6569011 Lynch et al. May 2003 B1
6587547 Zirngibl et al. Jul 2003 B1
6615186 Kolls Sep 2003 B1
6618704 Kanevsky et al. Sep 2003 B2
6630958 Tanaka et al. Oct 2003 B2
6647251 Siegle et al. Nov 2003 B1
6650877 Tarbouriech et al. Nov 2003 B1
6650894 Berstis et al. Nov 2003 B1
6658272 Lenchik et al. Dec 2003 B1
6658461 Mazo Dec 2003 B1
6662023 Helle Dec 2003 B1
6665711 Boyle et al. Dec 2003 B1
6668177 Salmimaa et al. Dec 2003 B2
6678366 Burger et al. Jan 2004 B1
6681120 Kim Jan 2004 B1
6687515 Kosaka Feb 2004 B1
6690932 Barnier et al. Feb 2004 B1
6694143 Beamish et al. Feb 2004 B1
6701148 Wilson et al. Mar 2004 B1
6701162 Everett Mar 2004 B1
6707942 Cortopassi et al. Mar 2004 B1
6711399 Granier Mar 2004 B1
6725022 Clayton et al. Apr 2004 B1
6728533 Ishii Apr 2004 B2
6763226 McZeal, Jr. Jul 2004 B1
6771990 Nilsson Aug 2004 B1
6772174 Pettersson Aug 2004 B1
6773344 Gabai et al. Aug 2004 B1
6775361 Arai et al. Aug 2004 B1
6779030 Dugan et al. Aug 2004 B1
6782412 Brophy et al. Aug 2004 B2
6788332 Cook Sep 2004 B1
6788928 Kohinata et al. Sep 2004 B2
6795715 Kubo et al. Sep 2004 B1
6812954 Priestman et al. Nov 2004 B1
6813501 Kinnunen et al. Nov 2004 B2
6819939 Masamura Nov 2004 B2
6820055 Saindon et al. Nov 2004 B2
6850209 Mankins et al. Feb 2005 B2
6865372 Mauney et al. Mar 2005 B2
6870828 Giordano, III Mar 2005 B1
6876379 Fisher Apr 2005 B1
6883000 Gropper Apr 2005 B1
6888927 Cruickshank et al. May 2005 B1
6891525 Ogoro May 2005 B2
6895084 Saylor et al. May 2005 B1
6895259 Blank nee Keller et al. May 2005 B1
6898321 Knee et al. May 2005 B1
6901383 Ricketts et al. May 2005 B1
6905414 Danieli et al. Jun 2005 B2
6912544 Weiner Jun 2005 B1
6917817 Farrow et al. Jul 2005 B1
6922212 Nakakubo et al. Jul 2005 B2
6937868 Himmel et al. Aug 2005 B2
6947527 Clark et al. Sep 2005 B2
6947728 Tagawa et al. Sep 2005 B2
6954645 Tsai et al. Oct 2005 B2
6958675 Maeda et al. Oct 2005 B2
6961559 Chow et al. Nov 2005 B1
6970178 Tanioka et al. Nov 2005 B2
6970703 Fuchs et al. Nov 2005 B2
6973628 Asami Dec 2005 B2
6992699 Vance et al. Jan 2006 B1
6993362 Aberg Jan 2006 B1
6993474 Curry et al. Jan 2006 B2
6999757 Bates et al. Feb 2006 B2
7003598 Kavanagh Feb 2006 B2
7007239 Hawkins et al. Feb 2006 B1
7012999 Ruckart et al. Mar 2006 B2
7019770 Katz Mar 2006 B1
7020136 Nibbeling Mar 2006 B1
7028077 Toshimitsu et al. Apr 2006 B2
7030880 Tanioka et al. Apr 2006 B2
7035666 Silberfenig et al. Apr 2006 B2
7058356 Slotznick Jun 2006 B2
7065525 Sasaki et al. Jun 2006 B1
7076052 Yoshimura Jul 2006 B2
7081832 Nelson et al. Jul 2006 B2
7085578 Barclay et al. Aug 2006 B2
7085739 Winter et al. Aug 2006 B1
7089298 Nyman et al. Aug 2006 B2
7106846 Nguyen et al. Sep 2006 B2
7107081 Fujisaki Sep 2006 B1
7113981 Slate Sep 2006 B2
7117152 Mukherji et al. Oct 2006 B1
7126951 Belcea et al. Oct 2006 B2
7127238 Vandermeijden et al. Oct 2006 B2
7127271 Fujisaki Oct 2006 B1
7130630 Moton, Jr. et al. Oct 2006 B1
7130791 Ko Oct 2006 B2
7139555 Apfel Nov 2006 B2
7142810 Oesterling Nov 2006 B2
7142890 Irimajiri et al. Nov 2006 B2
7146179 Parulski et al. Dec 2006 B2
7148911 Mitsui et al. Dec 2006 B1
7174171 Jones Feb 2007 B2
7224792 Fusco May 2007 B2
7224851 Kinjo May 2007 B2
7224987 Bhela et al. May 2007 B1
7231231 Kokko et al. Jun 2007 B2
7233781 Hunter et al. Jun 2007 B2
7233795 Ryden Jun 2007 B1
7240093 Danieli et al. Jul 2007 B1
7245293 Hoshino et al. Jul 2007 B2
7251255 Young Jul 2007 B1
7254408 Kim Aug 2007 B2
7260416 Shippee Aug 2007 B2
7266186 Henderson Sep 2007 B1
7269413 Kraft Sep 2007 B2
7277711 Nyu Oct 2007 B2
7283845 De Bast Oct 2007 B2
7319958 Melnar et al. Jan 2008 B2
7321783 Kim, II Jan 2008 B2
7324823 Rosen et al. Jan 2008 B1
7346373 Kim Mar 2008 B2
7346506 Lueck et al. Mar 2008 B2
7372447 Jacobsen et al. May 2008 B1
7383067 Phillips et al. Jun 2008 B2
7392469 Bailin Jun 2008 B1
7394969 Sun et al. Jul 2008 B2
7418346 Breed et al. Aug 2008 B2
7433845 Flitcroft et al. Oct 2008 B1
7444168 Nakagawa et al. Oct 2008 B2
7450709 Gonzalez et al. Nov 2008 B2
7451084 Funakura Nov 2008 B2
7532879 Fujisaki May 2009 B1
7536707 Matsumoto et al. May 2009 B2
7551899 Nicolas et al. Jun 2009 B1
7642929 Pinkus et al. Jan 2010 B1
7643037 Langmacher et al. Jan 2010 B1
7657252 Futami Feb 2010 B2
7686693 Danieli et al. Mar 2010 B2
7707592 Wesslen et al. Apr 2010 B2
7707602 Cragun et al. Apr 2010 B2
7725077 Jung et al. May 2010 B2
7752188 Lagerstedt et al. Jul 2010 B2
7769364 Logan et al. Aug 2010 B2
7787857 Peterman Aug 2010 B2
7787887 Gupta et al. Aug 2010 B2
7853295 Fujisaki Dec 2010 B1
7853297 Fujisaki Dec 2010 B1
7865567 Hendricks et al. Jan 2011 B1
7873349 Smith et al. Jan 2011 B1
7890089 Fujisaki Feb 2011 B1
7890136 Fujisaki Feb 2011 B1
7899410 Rakshani et al. Mar 2011 B2
7922086 Jung et al. Apr 2011 B2
7941141 Shoykhet et al. May 2011 B2
7953439 Rofougaran May 2011 B2
7970414 Werden et al. Jun 2011 B1
8042110 Kawahara et al. Oct 2011 B1
8090402 Fujisaki Jan 2012 B1
8099108 Camp et al. Jan 2012 B2
8117266 Moore Feb 2012 B2
8126400 Jung et al. Feb 2012 B2
8145040 Toyoshima Mar 2012 B2
8175655 Fujisaki May 2012 B1
8208954 Fujisaki Jun 2012 B1
8229504 Fujisaki Jul 2012 B1
8260313 Wick et al. Sep 2012 B1
8311578 Fujisaki Nov 2012 B1
8312660 Fujisaki Nov 2012 B1
8351915 Park et al. Jan 2013 B2
8364201 Fujisaki Jan 2013 B1
8433300 Fujisaki Apr 2013 B1
8433364 Fujisaki Apr 2013 B1
8452307 Fujisaki May 2013 B1
8472935 Fujisaki Jun 2013 B1
8559983 Fujisaki Oct 2013 B1
8620384 Fujisaki Dec 2013 B1
8744515 Fujisaki Jun 2014 B1
8747222 Yamashita Jun 2014 B2
8750921 Fujisaki Jun 2014 B1
8755838 Fujisaki Jun 2014 B1
8774862 Fujisaki Jul 2014 B1
8781526 Fujisaki Jul 2014 B1
8781527 Fujisaki Jul 2014 B1
8805442 Fujisaki Aug 2014 B1
8825026 Fujisaki Sep 2014 B1
8825090 Fujisaki Sep 2014 B1
9026182 Fujisaki May 2015 B1
9049556 Fujisaki Jun 2015 B1
9060246 Fujisaki Jun 2015 B1
9143723 Fujisaki Sep 2015 B1
9241060 Fujisaki Jan 2016 B1
9247383 Fujisaki Jan 2016 B1
9549150 Fujisaki Jan 2017 B1
9955006 Fujisaki Apr 2018 B1
10175846 Fujisaki Jan 2019 B1
20010005826 Shibuya Jun 2001 A1
20010011293 Murakami et al. Aug 2001 A1
20010028350 Matsuoka et al. Oct 2001 A1
20010029425 Myr Oct 2001 A1
20010035829 Yu et al. Nov 2001 A1
20010048364 Kalthoff et al. Dec 2001 A1
20010049470 Mault et al. Dec 2001 A1
20020002044 Naruse et al. Jan 2002 A1
20020002705 Byrnes et al. Jan 2002 A1
20020006804 Mukai et al. Jan 2002 A1
20020009978 Dukach et al. Jan 2002 A1
20020016724 Yang et al. Feb 2002 A1
20020019225 Miyashita Feb 2002 A1
20020022503 Lee Feb 2002 A1
20020026348 Fowler et al. Feb 2002 A1
20020028690 McKenna et al. Mar 2002 A1
20020031120 Rakib Mar 2002 A1
20020034292 Tuoriniemi et al. Mar 2002 A1
20020036231 Monaghan et al. Mar 2002 A1
20020037738 Wycherley et al. Mar 2002 A1
20020038219 Yanay Buchshrieber et al. Mar 2002 A1
20020039914 Hama et al. Apr 2002 A1
20020041262 Mukai et al. Apr 2002 A1
20020045463 Chen et al. Apr 2002 A1
20020047787 Mikkola et al. Apr 2002 A1
20020049630 Furuta et al. Apr 2002 A1
20020052754 Joyce et al. May 2002 A1
20020054068 Ellis et al. May 2002 A1
20020055872 LaBrie et al. May 2002 A1
20020057765 Hyziak et al. May 2002 A1
20020059156 Hwang et al. May 2002 A1
20020061767 Sladen et al. May 2002 A1
20020065037 Messina et al. May 2002 A1
20020065087 Ishikawa et al. May 2002 A1
20020066115 Wendelrup May 2002 A1
20020068558 Janik Jun 2002 A1
20020068585 Chan et al. Jun 2002 A1
20020068599 Rodriguez et al. Jun 2002 A1
20020072395 Miramontes Jun 2002 A1
20020077808 Liu et al. Jun 2002 A1
20020080163 Morey Jun 2002 A1
20020080942 Clapper Jun 2002 A1
20020085700 Metcalf Jul 2002 A1
20020087628 Rouse et al. Jul 2002 A1
20020094806 Kamimura Jul 2002 A1
20020097984 Abecassis Jul 2002 A1
20020098857 Ishii Jul 2002 A1
20020099456 McLean Jul 2002 A1
20020102960 Lechner Aug 2002 A1
20020103872 Watanabe Aug 2002 A1
20020103908 Rouse et al. Aug 2002 A1
20020104095 Nguyen et al. Aug 2002 A1
20020110246 Gosior et al. Aug 2002 A1
20020115469 Rekimoto et al. Aug 2002 A1
20020120718 Lee Aug 2002 A1
20020123336 Kamada Sep 2002 A1
20020123965 Phillips Sep 2002 A1
20020127997 Karlstedt et al. Sep 2002 A1
20020128000 do Nascimento Sep 2002 A1
20020130175 Nakajima Sep 2002 A1
20020133342 McKenna Sep 2002 A1
20020137470 Baron et al. Sep 2002 A1
20020137503 Roderique Sep 2002 A1
20020137526 Shinohara Sep 2002 A1
20020141086 Lang et al. Oct 2002 A1
20020142763 Kolsky Oct 2002 A1
20020147645 Alao et al. Oct 2002 A1
20020151326 Awada et al. Oct 2002 A1
20020151327 Levitt Oct 2002 A1
20020160724 Arai et al. Oct 2002 A1
20020160836 Watanabe et al. Oct 2002 A1
20020164975 Lu Nov 2002 A1
20020164996 Dorenbosch Nov 2002 A1
20020165850 Roberts et al. Nov 2002 A1
20020173344 Cupps et al. Nov 2002 A1
20020173965 Curry et al. Nov 2002 A1
20020177407 Mitsumoto Nov 2002 A1
20020178225 Madenberg et al. Nov 2002 A1
20020183045 Emmerson et al. Dec 2002 A1
20020183098 Lee et al. Dec 2002 A1
20020191951 Sodeyama et al. Dec 2002 A1
20020193997 Fitzpatrick et al. Dec 2002 A1
20020198017 Babasaki et al. Dec 2002 A1
20020198813 Patterson, Jr. et al. Dec 2002 A1
20020198936 McIntyre et al. Dec 2002 A1
20030003967 Ito Jan 2003 A1
20030005056 Yamamoto et al. Jan 2003 A1
20030006879 Kang et al. Jan 2003 A1
20030007556 Oura et al. Jan 2003 A1
20030013483 Ausems et al. Jan 2003 A1
20030014286 Cappellini Jan 2003 A1
20030016189 Abe et al. Jan 2003 A1
20030017857 Kitson et al. Jan 2003 A1
20030018744 Johanson et al. Jan 2003 A1
20030022715 Okubo Jan 2003 A1
20030025788 Beardsley Feb 2003 A1
20030032406 Minear et al. Feb 2003 A1
20030037265 Sameshima et al. Feb 2003 A1
20030038800 Kawahara Feb 2003 A1
20030038893 Rajamaki et al. Feb 2003 A1
20030045311 Larikka et al. Mar 2003 A1
20030045329 Kinoshita Mar 2003 A1
20030052964 Priestman et al. Mar 2003 A1
20030055994 Herrmann et al. Mar 2003 A1
20030061606 Hartwig et al. Mar 2003 A1
20030063113 Andrae Apr 2003 A1
20030063580 Pond Apr 2003 A1
20030063732 Mcknight Apr 2003 A1
20030065784 Herrod Apr 2003 A1
20030065805 Barnes, Jr. Apr 2003 A1
20030069693 Snapp et al. Apr 2003 A1
20030070162 Oshima et al. Apr 2003 A1
20030073432 Meade, II Apr 2003 A1
20030074398 Matsuo Apr 2003 A1
20030083055 Riordan et al. May 2003 A1
20030084104 Salem et al. May 2003 A1
20030084121 De Boor et al. May 2003 A1
20030093503 Yamaki et al. May 2003 A1
20030093790 Logan et al. May 2003 A1
20030099367 Okamura May 2003 A1
20030100347 Okada et al. May 2003 A1
20030107580 Egawa et al. Jun 2003 A1
20030110450 Sakai Jun 2003 A1
20030117376 Ghulam Jun 2003 A1
20030119479 Arima et al. Jun 2003 A1
20030119485 Ogasawara Jun 2003 A1
20030119562 Kokubo Jun 2003 A1
20030120784 Johnson et al. Jun 2003 A1
20030125008 Shimamura Jul 2003 A1
20030132928 Kori Jul 2003 A1
20030135563 Bodin et al. Jul 2003 A1
20030137970 Odman Jul 2003 A1
20030144024 Luo Jul 2003 A1
20030144830 Williams Jul 2003 A1
20030148772 Ben-Ari Aug 2003 A1
20030149662 Shore Aug 2003 A1
20030153355 Warren Aug 2003 A1
20030156208 Obradovich Aug 2003 A1
20030166399 Tokkonen et al. Sep 2003 A1
20030169329 Parker et al. Sep 2003 A1
20030201982 Iesaka Oct 2003 A1
20030202504 Dhara et al. Oct 2003 A1
20030204562 Hwang Oct 2003 A1
20030208541 Musa Nov 2003 A1
20030220835 Barnes, Jr. Nov 2003 A1
20030222762 Beigl et al. Dec 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20030223554 Zhang Dec 2003 A1
20030224760 Day Dec 2003 A1
20030227570 Kim et al. Dec 2003 A1
20030229900 Reisman Dec 2003 A1
20030236709 Hendra et al. Dec 2003 A1
20030236866 Light Dec 2003 A1
20040003307 Tsuji Jan 2004 A1
20040004616 Konya et al. Jan 2004 A1
20040015610 Treadwell Jan 2004 A1
20040027369 Kellock et al. Feb 2004 A1
20040029640 Masuyama et al. Feb 2004 A1
20040033795 Walsh et al. Feb 2004 A1
20040034692 Eguchi et al. Feb 2004 A1
20040052504 Yamada et al. Mar 2004 A1
20040060061 Parker Mar 2004 A1
20040067751 Vandermeijden et al. Apr 2004 A1
20040068399 Ding Apr 2004 A1
20040072595 Anson et al. Apr 2004 A1
20040082321 Kontianinen Apr 2004 A1
20040087326 Dunko et al. May 2004 A1
20040092255 Ji et al. May 2004 A1
20040100419 Kato et al. May 2004 A1
20040103303 Yamauchi et al. May 2004 A1
20040107072 Dietrich et al. Jun 2004 A1
20040114732 Choe et al. Jun 2004 A1
20040117108 Nemeth Jun 2004 A1
20040128359 Horvitz et al. Jul 2004 A1
20040135189 Kiyotoshi Jul 2004 A1
20040137893 Muthuswamy et al. Jul 2004 A1
20040139208 Tuli Jul 2004 A1
20040142678 Krasner Jul 2004 A1
20040150725 Taguchi Aug 2004 A1
20040157664 Link Aug 2004 A1
20040166832 Portman et al. Aug 2004 A1
20040166879 Meadows et al. Aug 2004 A1
20040174863 Caspi et al. Sep 2004 A1
20040177036 Nutahara et al. Sep 2004 A1
20040183937 Viinikanoja et al. Sep 2004 A1
20040185865 Maanoja Sep 2004 A1
20040189827 Kim et al. Sep 2004 A1
20040196265 Nohr Oct 2004 A1
20040198374 Bajikar Oct 2004 A1
20040203520 Schirtzinger et al. Oct 2004 A1
20040203904 Gwon et al. Oct 2004 A1
20040203909 Koster Oct 2004 A1
20040204018 Kuo Oct 2004 A1
20040204035 Raghuram et al. Oct 2004 A1
20040204126 Reyes et al. Oct 2004 A1
20040208299 Katz Oct 2004 A1
20040214596 Lee Oct 2004 A1
20040216037 Hishida et al. Oct 2004 A1
20040218738 Arai et al. Nov 2004 A1
20040219951 Holder Nov 2004 A1
20040223049 Taniguchi et al. Nov 2004 A1
20040235520 Cadiz et al. Nov 2004 A1
20040242240 Lin Dec 2004 A1
20040248586 Patel et al. Dec 2004 A1
20040252197 Fraley et al. Dec 2004 A1
20040259537 Ackley Dec 2004 A1
20040264662 Silver Dec 2004 A1
20040266418 Kotzin Dec 2004 A1
20040267628 Stillman Dec 2004 A1
20050004749 Park Jan 2005 A1
20050019017 Green Jan 2005 A1
20050032527 Sheha et al. Feb 2005 A1
20050036509 Acharya et al. Feb 2005 A1
20050043097 March et al. Feb 2005 A1
20050046584 Breed Mar 2005 A1
20050048987 Glass Mar 2005 A1
20050070257 Saarinen et al. Mar 2005 A1
20050070336 Tamura Mar 2005 A1
20050075097 Lehikoinen et al. Apr 2005 A1
20050090768 Brattesani et al. Apr 2005 A1
20050113080 Nishimura May 2005 A1
20050113113 Reed May 2005 A1
20050120225 Kirsch et al. Jun 2005 A1
20050130614 Suzuki Jun 2005 A1
20050136949 Barnes, Jr. Jun 2005 A1
20050144560 Gruen et al. Jun 2005 A1
20050151877 Fisher Jul 2005 A1
20050159189 Iyer Jul 2005 A1
20050163289 Caspi et al. Jul 2005 A1
20050164684 Chen et al. Jul 2005 A1
20050165871 Barrs, II et al. Jul 2005 A1
20050166242 Matsumoto et al. Jul 2005 A1
20050186954 Kenney Aug 2005 A1
20050192030 Asthana et al. Sep 2005 A1
20050201534 Ignatin Sep 2005 A1
20050207555 Lee et al. Sep 2005 A1
20050227731 Kall Oct 2005 A1
20050235312 Karaoguz et al. Oct 2005 A1
20050258958 Lai Nov 2005 A1
20050261945 Mougin et al. Nov 2005 A1
20050272448 Tran et al. Dec 2005 A1
20050272504 Eguchi et al. Dec 2005 A1
20050282582 Slotnick et al. Dec 2005 A1
20050289589 Vermola Dec 2005 A1
20060003813 Seligmann et al. Jan 2006 A1
20060031407 Dispensa et al. Feb 2006 A1
20060033809 Farley Feb 2006 A1
20060035628 Miller et al. Feb 2006 A1
20060041923 McQuaide, Jr. Feb 2006 A1
20060044460 Lee Mar 2006 A1
20060046714 Kalavade Mar 2006 A1
20060052100 Almgren Mar 2006 A1
20060059038 Iuchi et al. Mar 2006 A1
20060074639 Goudar et al. Apr 2006 A1
20060084413 Myoung Apr 2006 A1
20060090164 Garden et al. Apr 2006 A1
20060114100 Ghabra et al. Jun 2006 A1
20060121986 Pelkey et al. Jun 2006 A1
20060126284 Moscovitch Jun 2006 A1
20060133590 Jiang Jun 2006 A1
20060136773 Kespohl et al. Jun 2006 A1
20060140173 Hoover Jun 2006 A1
20060140353 Jung Jun 2006 A1
20060140387 Boldt Jun 2006 A1
20060143655 Ellis et al. Jun 2006 A1
20060166650 Berger et al. Jul 2006 A1
20060167677 Bitzer Jul 2006 A1
20060199612 Beyer et al. Sep 2006 A1
20060206913 Jerding et al. Sep 2006 A1
20060229114 Kim Oct 2006 A2
20060234693 Isidore et al. Oct 2006 A1
20060234758 Parupudi et al. Oct 2006 A1
20060242248 Kokkinen Oct 2006 A1
20060258378 Kaikuranata Nov 2006 A1
20060258396 Matsuoka Nov 2006 A1
20060262911 Chin et al. Nov 2006 A1
20060264245 Luo Nov 2006 A1
20060276172 Rydgren et al. Dec 2006 A1
20060284732 Brock-Fisher Dec 2006 A1
20070005809 Kobayashi et al. Jan 2007 A1
20070015503 Choi Jan 2007 A1
20070015550 Kayanuma Jan 2007 A1
20070030888 Turetzky et al. Feb 2007 A1
20070032255 Koo et al. Feb 2007 A1
20070037605 Logan Feb 2007 A1
20070050832 Wright et al. Mar 2007 A1
20070061845 Barnes Mar 2007 A1
20070070178 Maghera Mar 2007 A1
20070097879 Bleckert et al. May 2007 A1
20070099703 Terebilo May 2007 A1
20070109262 Oshima et al. May 2007 A1
20070135145 Lee et al. Jun 2007 A1
20070135150 Ushiki et al. Jun 2007 A1
20070142047 Heeschen et al. Jun 2007 A1
20070162346 Son-Bell et al. Jul 2007 A1
20070184878 Lee Aug 2007 A1
20070190944 Doan et al. Aug 2007 A1
20070191029 Zarem et al. Aug 2007 A1
20070204014 Greer et al. Aug 2007 A1
20070216760 Kondo et al. Sep 2007 A1
20070218891 Cox Sep 2007 A1
20070262848 Berstis et al. Nov 2007 A1
20070293240 Drennan et al. Dec 2007 A1
20070296739 Lonn Dec 2007 A1
20080006762 Fadell et al. Jan 2008 A1
20080014917 Rhoads et al. Jan 2008 A1
20080016534 Ortiz et al. Jan 2008 A1
20080021697 Cox et al. Jan 2008 A1
20080039125 Fan et al. Feb 2008 A1
20080055254 Willey Mar 2008 A1
20080058005 Zicker et al. Mar 2008 A1
20080070561 Keum et al. Mar 2008 A1
20080070588 Morin Mar 2008 A1
20080071745 Clarke Mar 2008 A1
20080076410 Beyer Mar 2008 A1
20080082930 Omernick et al. Apr 2008 A1
20080089587 Kim et al. Apr 2008 A1
20080104544 Collins et al. May 2008 A1
20080109840 Walter et al. May 2008 A1
20080139222 Falvo et al. Jun 2008 A1
20080140686 Hong et al. Jun 2008 A1
20080146272 Rao et al. Jun 2008 A1
20080151696 Giroud et al. Jun 2008 A1
20080167078 Eibye Jul 2008 A1
20080172173 Chang et al. Jul 2008 A1
20080176545 Dicke et al. Jul 2008 A1
20080242271 Schmidt et al. Oct 2008 A1
20080242283 Ruckart Oct 2008 A1
20080254811 Stewart Oct 2008 A1
20080299989 King et al. Dec 2008 A1
20090002342 Terada et al. Jan 2009 A1
20090017812 Chan et al. Jan 2009 A1
20090047972 Neeraj Feb 2009 A1
20090111486 Burstrom Apr 2009 A1
20090119269 Im May 2009 A1
20090124243 Routley et al. May 2009 A1
20090150807 George et al. Jun 2009 A1
20090153490 Nymark et al. Jun 2009 A1
20090186628 Yonker et al. Jul 2009 A1
20090221330 Tomimori Sep 2009 A1
20090265022 Kirovski et al. Oct 2009 A1
20090290369 Schofield et al. Nov 2009 A1
20090319947 Wang et al. Dec 2009 A1
20100030557 Molloy et al. Feb 2010 A1
20100062740 Ellis et al. Mar 2010 A1
20100079267 Lin Apr 2010 A1
20100145700 Kennewick et al. Jun 2010 A1
20110212714 Lobzakov et al. Sep 2011 A1
20120059545 Furuno et al. Mar 2012 A1
20120064874 Pierce et al. Mar 2012 A1
20130090097 Klassen Apr 2013 A1
20130298059 Raskin Nov 2013 A1
20140067974 Lewinson Mar 2014 A1
20140071951 Liu et al. Mar 2014 A1
20140323166 Zhang et al. Oct 2014 A1
20150018091 Suzuki et al. Jan 2015 A1
Foreign Referenced Citations (13)
Number Date Country
2386027 Sep 2003 GB
2196373 Aug 1990 JP
H10155141 Jun 1998 JP
H11-195137 Jul 1999 JP
2001086558 Mar 2001 JP
2002252691 Sep 2002 JP
2003078977 Mar 2003 JP
2003228726 Aug 2003 JP
2003263656 Sep 2003 JP
2005216149 Aug 2005 JP
0131893 May 2001 WO
2003001457 Jan 2003 WO
2003096660 Nov 2003 WO
Non-Patent Literature Citations (12)
Fehily “Windows XP: Visual QuickStart Guide” published by Peachpit Press in 2003.
Casio, “Pocket PC User's Guide” published on Feb. 3, 2000.
Audiovox, “Pocket PC Phone User Manual” published on Mar. 19, 2004.
Palm, “Using your Treo” published in Feb. 2004.
Palm, “Palm Treo 600 Support Knowledge Library, Solution ID 29492” published in Jan. 2004.
Dataviz, “Documents to Go included on Treo 600 Smartphone from palmOne” published in Nov. 2003.
Palm, “Treo 600 smartphone” published in 2003.
FCC's wireless Enhanced 911 (E911) rules, Phase I and Phase II.
HI Corporation's company history (http://www.hicorp.co.jp/english/corporate/history.html) Copyright notice on the web: (c) 2007-2011 HI Corporation. All Rights Reserved.
HI Corporation to Offer 3D Graphics to Motorola Mobile Phone Platform Customers (http://www.wirelessdevnet.com/news/2003/203/news7.html) Published on the web on: Jul. 21, 2003.
Development of NTT docomo Mova N504i—NEC Gi-Ho (Technology Magazine) vol. 56 No. May 2003, p. 144 Published in: May 2003.
Winners of Tokyo Venture Technology Grand Prize in 2000-2009 (http://www.sangyo-rodo.metro.tokyo.jp/shoko/sogyo/venture/2000-2009winners.pdf) Published in: 2000-2009.
Provisional Applications (1)
Number Date Country
60329997 Oct 2001 US
Continuations (9)
Number Date Country
Parent 15878436 Jan 2018 US
Child 16575431 US
Parent 15422458 Feb 2017 US
Child 15878436 US
Parent 15089465 Apr 2016 US
Child 15422458 US
Parent 14318684 Jun 2014 US
Child 15089465 US
Parent 13943792 Jul 2013 US
Child 14318684 US
Parent 13016974 Jan 2011 US
Child 13943792 US
Parent 12177927 Jul 2008 US
Child 13016974 US
Parent 11464835 Aug 2006 US
Child 12177927 US
Parent 10209399 Jul 2002 US
Child 11464835 US