The invention disclosed herein relates to biometric enablement of computer terminals for participation in one or more network-related functions and in voice communication over a network. More particularly, the biometric enablement involves finger-image sensing and authentication. The invention also relates to a telephone handset that includes a finger-image sensor, and the use of the handset for user identification and voice communication over the network.
The following patent documents disclose wireless telephones having a biometric identification device, and security systems for wireless communications: U.S. Pat. Nos. 6,141,436; 6,088,585; 5,796,858; 5,872,834; 6,219,793; 6,330,457; 6,249,672; 6,177,950; 6,175,922; 6,111,977; 6,061,790; 6,064,737; 6,058,304; 6,078,908; 5,715,518; and 6,035,188; published U.S. Patent Applications 2002/0003892; 2001/0012201; 2001/0016819; and 2001/0017584; and international and foreign patent documents WO 98/11750; WO 01/45283; EP 0969644; EP 0817515; and DE 19832638. The following patent documents disclose the use of biometrics to control access to computers or computer functions, or to control a transaction conducted using a computer: U.S. Pat. Nos. 6,337,919; 6,337,918; 6,282,304; 6,256,737; 5,420,936; 5,337,043; 5,838,306; 5,881,226; 5,991,408; 6,016,476; 6,154,727; 6,078,848; 6,160,903; published U.S. Patent Applications 2002/0010864; 2001/0051924; 2002/0007459; 2002/0010857; 2001/0049785; 2001/0048025; 2001/0048359; and 2001/0034717; and international and foreign patent documents WO 01/29731; WO 00/72508; WO 01/92994; GB 2312040; DE 19541672; DE 19920933; and FR 2792438.
The invention provides a system for enabling a computer terminal to access or otherwise participate in at least one network-related function and voice communication over a network. The invention also provides a handset that includes a finger-image sensor, for use with a computer terminal, that provides finger-image-related signals or data for authentication purposes and voice-related signals for voice communications. The handset provides finger-image data that is used to biometrically identify a person who is seeking to access or otherwise participate in network functions, including voice communication and at least one other network-related function, using the terminal and/or the handset. Such access or participation is enabled after satisfactory identification of a prospective user. The handset may be coupled to a computer terminal either wirelessly or by a tethered (wired) connection.
The terms “computer terminal,” “terminal” and “terminal device” are meant in a broad sense, and encompass PCs or other desktop computers or workstations, portable computers, client computers, thin clients, PDAs, etc., unless the context in which these terms are used indicates otherwise. Similarly, the terms “network,” “computer system,” “host computer” and “server” are meant in a broad sense.
In a preferred embodiment, the system comprises a telephone handset including a microphone and a speaker, and also the finger-image sensor, coupled to provide signals to and receive signals from the computer terminal for voice communication, and at least to provide finger-image-related signals to the computer terminal. Means are provided for electronically authenticating a finger-image sensed by the finger-image sensor of a handset based on the finger-image-related signals provided by that handset. Means are also provided that are responsive to the authenticating means for enabling the computer terminal in the network to access or otherwise participate in the performance of at least one network-related function and voice communication over the network between handsets, at least one of which provided finger-image-related signals based upon which a sensed finger-image was authenticated. In a preferred embodiment, communication is enabled between handsets which have each provided finger-image-related signals based upon which a sensed finger-image was authenticated, i.e., between authenticated handsets. In some embodiments, communication can occur between an authenticated handset and an unauthenticated handset. An unauthenticated handset may both receive voice-related data from and send voice-related data to an authenticated handset, or may only receive such data from, or only send such data to, an authenticated handset.
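By way of illustration only, the following sketch (in Python, with hypothetical names not drawn from the embodiments above) models the enabling decision just described: a voice session is permitted when both handsets have been authenticated in the preferred embodiment, or when at least one has been authenticated in embodiments permitting communication with an unauthenticated handset.

```python
# Hypothetical sketch of the enabling logic described above; names and
# policy flags are illustrative, not taken from the disclosed embodiments.
from dataclasses import dataclass

@dataclass
class Handset:
    handset_id: str
    authenticated: bool  # True once a sensed finger-image has been matched

def voice_session_permitted(caller: Handset, callee: Handset,
                            require_both: bool = True) -> bool:
    """Return True if a voice session between the two handsets may be enabled.

    require_both=True models the preferred embodiment (both handsets
    authenticated); False models embodiments in which an authenticated
    handset may communicate with an unauthenticated one.
    """
    if require_both:
        return caller.authenticated and callee.authenticated
    return caller.authenticated or callee.authenticated

# Example: only the caller's handset has been authenticated.
print(voice_session_permitted(Handset("A", True), Handset("B", False),
                              require_both=False))  # True
```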
The authentication means and the enabling means comprise programming that may be resident on or provided to the handset and/or the computer terminal, or may be resident on a host computer or server and operate entirely on the host computer or server, or operate on a distributed processing basis on the host computer or server, the computer terminal and the handset, or subcombinations thereof.
In a preferred embodiment, the telephone handset includes circuitry coupled to the microphone and speaker referred to above that at least converts between analog and digital signals, and an interface coupling the finger-image sensor and the circuitry with the computer terminal. In a preferred embodiment, circuitry is provided for voice functions and the finger-image sensor includes other circuitry for finger-image functions. Individual USB ports, coupled to a USB hub, are associated with the voice circuitry and with the finger-image circuitry. The handset preferably is keypadless, and each computer terminal includes a computer input device and is programmed to initiate a voice communication session in response to information entered using the input device.
In a preferred embodiment, the telephone handset comprises an elongated housing having opposed major sides and opposed ends. The speaker is positioned in the vicinity of a first end of the handset to transmit sound from a first major side of the handset, and the microphone is positioned in the vicinity of a second end of the handset to receive sound from a first major side of the handset. The finger-image sensor is positioned in the vicinity of and spaced from the second end of the handset to sense a finger-image from a second major side of the handset. Preferably, the handset has a contoured surface leading to the finger-image sensor to receive part of a human finger therein.
In one embodiment, the elongated housing has a larger first portion and a smaller second portion projecting at an angle from the first portion. Both portions have opposed major sides and opposed ends. The first end of the handset is an end of the first portion, in the vicinity of which is positioned the speaker. The first major side discussed above comprises the first major side of the first and second portions, and the second major side comprises the second major side of the first and second portions. The finger-image sensor is positioned in the vicinity of and spaced from an opposite end of the first portion of the handset to sense a finger-image from a second major side of the first portion of the handset. The second end of the handset is an end of the second portion in the vicinity of which is positioned the microphone to receive sound from the first major side of the second portion. The first and second portions are connected in the vicinity of the second end of the first portion and a first end of the second portion, and the first major side of the first portion and a first major side of the second portion form an internal obtuse angle. The contoured surface referred to above extends from the opposite end of the second portion to the finger-image sensor.
In another embodiment, the handset is generally straight, as opposed to having one portion angled with respect to the other portion, and includes only the larger upper portion, which has opposed major sides. The speaker and finger-image sensor are positioned as described above, and the microphone is positioned in the vicinity of the second end of the handset to receive sound from the first major side thereof. The second major side of the handset is contoured as described above, but from the second end of the handset to the finger-image sensor to receive part of a human finger therein.
The telephone handset preferably includes configuration or other means on the side thereof opposite the finger-image sensor, i.e., on the first major side of the handset or of the first and second portions thereof, for stably supporting the handset on or against a flat (or generally flat) surface during sensing of a finger-image. For example, flat or straight surfaces or projections may be provided, positioned on the handset to support the handset on or against a flat surface. As mentioned, the handset preferably does not include a keypad. Hookswitch functions are preferably activated by a button projecting from the top of the handset.
In a preferred embodiment, the handset includes a bracket positioned in the upper part of the handset to suspend the handset from a projection engaging the bracket. When suspended from the bracket adjacent a flat (or generally flat) surface of a monitor or fixture, the handset can be pressed against the flat surface and be stably supported during sensing of a fingertip image.
The invention is illustrated in the accompanying drawings which are meant to be exemplary and not limiting, in which like elements in the different figures relate to like or corresponding elements, and in which:
Referring to
In the embodiment depicted in
The housing 12 in the embodiment depicted in
The band 36 may be secured to the housing 12, and the front and rear parts 34 and 35 may be secured together in any suitable manner. For example, the band 36 may be secured to the front and/or rear parts 34 and 35 by interlocking parts such as grooves, ridges, projections, receptacles, etc., or by an adhesive, or bonded by any suitable process, and the front and rear parts may be joined by interlocking and/or snap-fitting parts, or by fasteners, or ultrasonic bonding or heat bonding, etc. Interlocking arrangements for securing the front and rear parts 34 and 35 together and for securing the band 36 to at least one of the front and rear parts are preferred to facilitate assembly and disassembly, e.g., for repair or servicing.
The front part 34 and the rear part 35 each include an upper portion 34a, 35a and a lower portion 34b, 35b, respectively, which are unitary, e.g., formed as a single piece, or from separately formed unitary pieces that are joined by any suitable process, e.g., an adhesive or other bonding process. Each unitary piece may be made, for example, of a suitable plastic, such as ABS in a molding process, or of any suitable material made by any suitable process.
Rear side 14b (
The upper front portion 35a (
The channel 66 and contoured surface 68 facilitate placement and location of a fingertip on the active portion 64 of the sensor 24. In the embodiment depicted in
A loop or bracket 70 is attached to the band 36, but may alternatively be attached to the upper front or rear portion 34a, 35a. The bracket 70 is used to suspend the handset from a hook or projection (not shown) secured to a conventional computer monitor (not shown), e.g., a CRT device or a flat panel device such as an LCD, LED, active matrix or plasma device, or secured to another type of device or fixture. Such devices and fixtures typically include a flat (or generally flat) surface. Preferably, the hook suspending the handset 10 is positioned above such a flat surface so that the handset is suspended against or adjacent a flat surface.
The rear part 14a, b of the handset 10, at the top 17 and bottom 18 of the handset, is configured to stably support the handset 10 on a flat surface. Thus, with the handset 10 supported on a flat surface with the front part 13a, b exposed (or facing upward), a finger-image can be sensed while the handset is stably supported. Such structure may extend transversely across the rear piece 14a, b at the top and bottom, respectively, of the handset. In the embodiment depicted in
A hookswitch button 72 protrudes through the top 17 of the handset 10. The hookswitch button 72 forms part of a microswitch 74 (
In another embodiment of handset 10, not fully shown, the second portion 28 is not provided. In that embodiment, the handset is straight, and the microphone is positioned in the vicinity of the end 62 of the handset, as represented by the broken line circle referenced by 60a, to receive sound through major side 14a.
A cable 79, connectable to a computer or computer terminal such as a PC, extends from the lower front portion 34b of the handset and is restrained by a grommet (not shown) inside the handset 10.
Much of the surface configuration and shape of the handset is solely ornamental in nature, while some has both ornamental and functional aspects.
The handsets described herein allow users to communicate telephonically over a network such as the Internet, an Intranet or a proprietary network, as illustrated, for example, in
The codec 82 is coupled to the microcontroller 83, to which the ringer 73 and the hookswitch 74 are also coupled. The microcontroller 83 controls handset-related telephony functions, including ringer functions, responsive to the hookswitch 74 and signals from a computer terminal 100 to which the handset 10 is coupled. Any suitable controller may be used as the microcontroller 83. The microcontroller 83 is coupled to the USB hub controller 84, e.g., an Alcor AU9254 available from Alcor Micro Corp. of Taipei, Taiwan. Any suitable ringer, e.g., piezoelectric, may be used as the ringer 73. The finger-image sensor 24 either includes a USB port, or USB interface circuitry 80 is provided to interface the finger-image sensor 24 with the USB hub controller 84. The USB hub controller 84 is configured for connection to a USB port on a computer terminal to transmit signals to and receive signals from the computer terminal, including audio-related signals or data and finger-image-related signals or data. The finger-image sensor 24 may be implemented by any suitable technology. Suitable sensors are available from Authentec, Inc. of Melbourne, Fla., for example, the FingerLoc™ or EntrePad™ families of finger-image sensors.
The USB hub controller 84 allows the finger-image sensor 24 and the codec 82 to communicate with a single USB port of the computer terminal via the cable 79. The power required to drive the components of the handset 10 is preferably provided by the computer terminal to the handset over the cable 79, e.g., from a USB port located on the computer or terminal device.
Additional circuitry, known to those of skill in the relevant arts, may be provided, for example, in blocks 20, 22, 74, 72, 73, 24, 80, 82, 83 and 84 in
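As an illustration of the composite arrangement described above, the following host-side sketch uses the pyusb library to check that a connected handset exposes both an audio function (for the codec) and a separate sensor function behind its internal hub. The vendor and product identifiers, and the assumption that the sensor enumerates as a vendor-specific interface, are hypothetical.

```python
# Illustrative sketch (not from the patent) of how a host-side driver might
# confirm that a composite USB handset exposes both an audio function (codec)
# and a separate finger-image sensor function behind its internal hub.
# Requires pyusb; the vendor/product IDs below are placeholders.
import usb.core

HANDSET_VID, HANDSET_PID = 0x1234, 0x5678   # hypothetical identifiers
USB_CLASS_AUDIO = 0x01
USB_CLASS_VENDOR_SPECIFIC = 0xFF            # a sensor might enumerate this way

dev = usb.core.find(idVendor=HANDSET_VID, idProduct=HANDSET_PID)
if dev is None:
    print("handset not connected")
else:
    # Collect the interface classes across all configurations.
    classes = {intf.bInterfaceClass for cfg in dev for intf in cfg}
    has_audio = USB_CLASS_AUDIO in classes
    has_sensor = USB_CLASS_VENDOR_SPECIFIC in classes
    print(f"audio function present: {has_audio}, sensor function present: {has_sensor}")
```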
Referring to
Networks including computer terminals that perform network functions such as information delivery, trading of financial interests and Internet access, and that provide for voice communication over the network by means of packet protocols, e.g., TCP for data and UDP for voice, are known. The invention is not concerned with the specific network or communications system which supports the network functions and voice communications associated with the handset 10 and/or a computer terminal to which a handset 10 is coupled. Therefore,
The computer system 110 (
Authorization is required for handsets 10 and terminal devices 100 to access or otherwise participate in network functions and to participate in, or at least initiate, voice communications over the network, i.e., initiating, transmitting, receiving or all of these functions. In the preferred embodiment, such authorization includes finger-image authentication. Means for authenticating finger-images of authorized users, including authentication software, are available, for example, from Authentec, Inc. of Melbourne, Fla. Such software may be stored locally in terminal devices 100, or in the host computer system 112, or distributed among them or any subcombination of them. Authentication, which may involve comparison of finger-image data provided by finger-image sensors 24 in handsets 10 with stored finger-image data for matches, is required before a terminal device 100 is permitted to access or otherwise participate in any or all selected network functions, such as participation in trading of financial interests, and to participate in voice communication. Access to terminal devices 100 and network functions may also be password protected. Access control using finger-image and password authentication, and means for performing access control using finger-image and password authentication, are generally known to those of skill in the relevant art(s). Therefore, only a high-level, general description thereof is included herein.
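The following sketch illustrates, in hedged form, an access-control gate of the kind described above, combining a password check with a finger-image comparison. The equality-based matching routine is only a placeholder for vendor-supplied comparison software, and all names are illustrative.

```python
# Hypothetical access-control gate combining password and finger-image
# authentication. The matching routine is a stand-in for vendor-supplied
# comparison software; enrolled templates would normally be retrieved from
# the host computer system.
import hmac, hashlib

def password_ok(supplied: str, salt: bytes, stored_hash: bytes) -> bool:
    digest = hashlib.pbkdf2_hmac("sha256", supplied.encode(), salt, 100_000)
    return hmac.compare_digest(digest, stored_hash)

def finger_image_matches(sensed_template: bytes,
                         enrolled_templates: list[bytes]) -> bool:
    # Placeholder: real systems use a vendor matching algorithm, not equality.
    return any(hmac.compare_digest(sensed_template, t) for t in enrolled_templates)

def access_granted(password: str, salt: bytes, stored_hash: bytes,
                   sensed_template: bytes, enrolled_templates: list[bytes]) -> bool:
    return (password_ok(password, salt, stored_hash)
            and finger_image_matches(sensed_template, enrolled_templates))

# Example with illustrative stored values.
salt = b"salt"
stored = hashlib.pbkdf2_hmac("sha256", b"secret", salt, 100_000)
print(access_granted("secret", salt, stored, b"template-1", [b"template-1"]))  # True
```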
In the preferred embodiment, a user seeking access to the host computer system 112 or any selected network-related function enters a user name and password in response to a suitable log-on screen or set of screens. Once the user name and password have been authorized, the user is then prompted by another log-on screen or set of log-on screens to provide his or her finger-image for reading by the finger-image sensor 24. The finger-image data obtained by the finger-image sensor 24 is communicated to the computer terminal 100, which includes all required drivers to obtain and receive the finger-image data.
In one embodiment, authentication proceeds in computer terminal 100 with software and stored finger-image data provided by the host computer system 112. In this embodiment, the finger-image comparison software is downloaded into the computer terminal 100 from the host computer system 112, and previously acquired finger-image templates stored on the host computer system 112 are retrieved for comparison against the finger-image data obtained by the finger-image sensor 24. If a match is found by authentication means associated with the computer terminal 100, that information is communicated to the host computer system 112, and means in the computer system responds to enable a computer terminal to access or otherwise participate in any selected function in the computer system 110. Alternatively, the finger-image data obtained by a finger-image sensor 24 is uploaded to the host computer system 112, where authentication means determines whether there is a match and provides authentication. It is also possible for the handset to include software to compare finger-image data provided by the finger-image sensor 24 with finger-image data stored in the handset, or provided to the handset.
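A minimal sketch of the two placements described above follows, contrasting matching at the terminal against templates retrieved from the host with uploading the sensed data so that the host performs the match. The names are hypothetical and simple equality stands in for a real matching algorithm.

```python
# Illustrative sketch (hypothetical names) of the two placements described
# above: terminal-side matching against templates downloaded from the host,
# versus uploading the sensed finger-image data for matching at the host.
class HostSystem:
    def __init__(self, enrolled):
        self.enrolled = enrolled                  # user -> list of templates

    def download_templates(self, user):
        return self.enrolled.get(user, [])

    def upload_and_match(self, user, sensed):
        return sensed in self.enrolled.get(user, [])

def authenticate_at_terminal(host, user, sensed):
    # Terminal compares locally, then would report the result to the host.
    return sensed in host.download_templates(user)

def authenticate_at_host(host, user, sensed):
    # Terminal uploads the sensed data; the host compares and replies.
    return host.upload_and_match(user, sensed)

host = HostSystem({"alice": [b"template-1"]})
print(authenticate_at_terminal(host, "alice", b"template-1"))  # True
print(authenticate_at_host(host, "alice", b"template-2"))      # False
```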
For example, after selected or all functions of a computer terminal 100 have been enabled in response to authentication, a user may access such functions and initiate and receive voice communications without further authorization. Alternatively, a terminal device 100 may be authorized to receive voice communications without finger-image and/or password authentication, but not to initiate or transmit voice communications without such authorization, etc. A separate authorization procedure may be required each time a user seeks to execute a sensitive or other desired function.
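The variations just described can be thought of as a per-function policy. The following hypothetical table and check illustrate one way such a policy might be expressed; the function names and flags are chosen purely for illustration.

```python
# Hypothetical policy table modelling the enablement variations described
# above: receiving calls may be allowed without authentication, initiating
# calls may require it, and sensitive functions may require a fresh
# authorization each time.
POLICY = {
    "receive_call":  {"requires_auth": False, "reauth_each_time": False},
    "initiate_call": {"requires_auth": True,  "reauth_each_time": False},
    "execute_trade": {"requires_auth": True,  "reauth_each_time": True},
}

def may_perform(function: str, session_authenticated: bool,
                just_reauthenticated: bool = False) -> bool:
    rule = POLICY[function]
    if rule["reauth_each_time"]:
        return just_reauthenticated
    return session_authenticated or not rule["requires_auth"]

print(may_perform("receive_call", session_authenticated=False))   # True
print(may_perform("initiate_call", session_authenticated=False))  # False
```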
In one embodiment, a user may access the computer system 110 from any computer terminal 100 upon authentication of his or her finger-image. If desired, such authentication at various terminal devices may also be used to track the presence of the user. More specifically, the host computer system 112 can track the presence of the user at any given terminal device. Tracking can operate on the basis that a user is present at the last terminal device, still logged on, at which the user's finger-image was authenticated. Alternatively, users can be required to provide a finger-image for sensing when leaving a terminal device that remains logged on, or to re-log on with finger-image authentication at given times or intervals.
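A sketch of presence tracking on the basis described above follows; the data model is hypothetical and simply records the most recent logged-on terminal at which a user's finger-image was authenticated.

```python
# Hypothetical presence-tracking model: a user is considered present at the
# most recent logged-on terminal at which his or her finger-image was
# authenticated.
import time

class PresenceTracker:
    def __init__(self):
        self._last_auth = {}     # user -> (terminal_id, timestamp)
        self._logged_on = set()  # terminal_ids currently logged on

    def record_authentication(self, user, terminal_id):
        self._last_auth[user] = (terminal_id, time.time())
        self._logged_on.add(terminal_id)

    def record_logoff(self, terminal_id):
        self._logged_on.discard(terminal_id)

    def location_of(self, user):
        terminal_id, _ = self._last_auth.get(user, (None, None))
        return terminal_id if terminal_id in self._logged_on else None

tracker = PresenceTracker()
tracker.record_authentication("bob", "terminal-7")
print(tracker.location_of("bob"))   # terminal-7
tracker.record_logoff("terminal-7")
print(tracker.location_of("bob"))   # None
```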
The presence status of the intended recipient at his or her usual or “home” terminal, or at another terminal, may be displayed on the computer monitor of a user initiating a call or email with, for example, a suitable icon or message. The call or email initiator can then decide whether to complete the call or send the email to the home terminal of the intended recipient or to such other terminal at which the intended recipient is present, as determined by finger-image detection.
Using the handset 10 and suitable telephony software, a voice call can be set up over a network, such as the one represented in
Requests from computer terminals 100 to establish a voice call are forwarded to the host computer system 112 via a gateway 116. A host computer in the host computer system 112 checks for user authorization and determines whether the called user exists, is valid or is otherwise recognized by the system, such as being a member of a group or alias defined by the calling party. The host computer also determines whether the called party has a handset 10 connected to that party's computer terminal 100. For example, USB device drivers included in the computer terminals 100 can automatically detect the presence of a handset 10, and this information may be conveyed to a host computer.
If the called party has a connected handset, the host computer can determine whether the called party is logged on. In the event that the called party is not logged on, the host computer may signal or send an appropriate informational message to the calling party. The host computer may also interface with or include a voice mail subsystem so as to enable the calling party to leave a voice mail.
The host computer signals the computer terminal of the called party that an incoming voice call has arrived. The computer terminal, in turn, using a suitable device driver, causes the handset speaker to sound a ring tone and/or the computer terminal visually indicates an incoming call on the display monitor of the called computer terminal. The called party may accept the call by pressing the handset hookswitch button 72, or by entering a keyboard or mouse command. In either case, the called computer terminal notifies the host computer that the call has been accepted, and the host computer, in turn, notifies the calling computer terminal that the call has been accepted.
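The call-setup sequence of the preceding paragraphs is summarized in the following hypothetical host-side sketch; the in-memory directory stands in for the host computer system's user, handset and presence records, and all identifiers are illustrative.

```python
# Hypothetical host-side call-setup sketch following the sequence described
# above; the dict-based "directory" is a stand-in for the host's real records.
class CallSetupError(Exception):
    pass

def set_up_call(directory, caller, callee):
    """directory maps user -> {'authorized', 'handset', 'logged_on'}."""
    if not directory.get(caller, {}).get("authorized"):
        raise CallSetupError("caller not authorized")
    rec = directory.get(callee)
    if rec is None:
        raise CallSetupError("called user unknown")
    if not rec["handset"]:
        raise CallSetupError("called party has no handset connected")
    if not rec["logged_on"]:
        raise CallSetupError("called party not logged on; offer voice mail")
    # The host signals the called terminal, which rings the handset and/or
    # shows an on-screen indication; acceptance (hookswitch press or keyboard
    # command) is then relayed back to the calling terminal.
    return f"ringing {callee}; will notify {caller} on acceptance"

directory = {
    "caller@example.com": {"authorized": True, "handset": True, "logged_on": True},
    "callee@example.com": {"authorized": True, "handset": True, "logged_on": True},
}
print(set_up_call(directory, "caller@example.com", "callee@example.com"))
```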
In the illustrated embodiment, the gateways 116 establish a communication path over intra- or inter-gateway links. The computer terminals preferably segment digital voice data received from the handset into packets, e.g., according to a protocol such as UDP, addressed to the other party or the gateway thereof. The terminal devices are preferably configured to use a predetermined UDP port for voice traffic, but in alternative embodiments, the UDP port may be established dynamically by the host computer and communicated to the terminal devices. In a further alternative, the UDP port may be determined through negotiation between the calling and called terminal devices.
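A minimal sketch of the voice transport described above follows, segmenting digitized voice into UDP datagrams addressed to the other party; the port number and frame size are illustrative, and in the described alternatives the port could instead be assigned by the host or negotiated between the terminals.

```python
# Minimal sketch of the voice transport described above: digitized voice from
# the handset is segmented into datagrams and sent over UDP to the other
# party (or its gateway). Port and frame size are illustrative only.
import socket

VOICE_PORT = 5004          # placeholder "predetermined" UDP port
FRAME_BYTES = 160          # e.g., 20 ms of 8 kHz, 8-bit audio

def send_voice(pcm_stream: bytes, dest_ip: str, dest_port: int = VOICE_PORT):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for offset in range(0, len(pcm_stream), FRAME_BYTES):
            sock.sendto(pcm_stream[offset:offset + FRAME_BYTES],
                        (dest_ip, dest_port))
    finally:
        sock.close()

# Example: send one second of silence to a (hypothetical) peer terminal.
send_voice(bytes(8000), "192.0.2.10")
```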
Voice calls are terminated when a user either toggles the hookswitch 74 of the handset or enters a pre-determined user-based command with the computer terminal's input device. Either event is communicated to the host computer and interpreted as a request to terminate the call. In this case, the host computer instructs the calling and called terminal devices to terminate the session. Conference calls can be established in a manner similar to the establishment of a bi-directional call. The host computer associated with the calling party can go through the same steps described above for each called party. A conference call can be set up to connect multiple parties substantially simultaneously, if desired. To support this function, voice group data is stored in a manner similar to storing e-mail group data.
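The termination and conference behavior described above might be modeled as follows; the session and voice-group structures are hypothetical and merely illustrate that a conference is a session with more than two members built from stored group data.

```python
# Hypothetical sketch of call termination and conference setup: a hookswitch
# toggle or terminal command reported to the host ends the session for every
# member; a conference is built from a stored voice group, much like an
# e-mail group.
class Session:
    def __init__(self, members):
        self.members = set(members)
        self.active = True

    def terminate(self, requested_by):
        # Either party's hookswitch toggle or input-device command ends the call.
        if requested_by in self.members and self.active:
            self.active = False
            return [f"terminate sent to {m}" for m in sorted(self.members)]
        return []

VOICE_GROUPS = {"trading-desk": ["alice", "bob", "carol"]}  # stored like e-mail groups

def start_conference(initiator, group_name):
    return Session([initiator] + VOICE_GROUPS[group_name])

conf = start_conference("dave", "trading-desk")
print(sorted(conf.members))
print(conf.terminate("dave"))
```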
Hardware and software are known to those of skill in the art, or can be constructed from the disclosure herein, for performing the telephony functions described herein, including the use of VoIP in networks such as the Internet and intranets and proprietary networks.
Although embodiments of a handset with a finger-image sensor and a network in which it is used have been disclosed, other embodiments of handsets may be used in the disclosed networks as well as in other networks. Those having ordinary skill in the relevant art(s) will understand that a variety of programming methodologies can be used to implement the network and telephony functions discussed above. Similarly, numerous modifications and variations may be made to the embodiments described herein without departing from the spirit and scope of the invention, and the claims are intended to cover all such modifications and variations to the extent permitted by the prior art.
This application is a continuation of U.S. patent application Ser. No. 15/164,322, filed May 25, 2016, which is a continuation of U.S. patent application Ser. No. 12/044,616, filed Mar. 7, 2008, now U.S. Pat. No. 9,378,347, which is a continuation of U.S. patent application Ser. No. 10/081,132, filed Feb. 21, 2002, now U.S. Pat. No. 7,418,255, the disclosures of which are herein incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
3804524 | Jocoy et al. | Apr 1974 | A |
4394773 | Ruell | Jul 1983 | A |
4408323 | Montgomery | Oct 1983 | A |
4453247 | Suzuki et al. | Jun 1984 | A |
4581735 | Flamm et al. | Apr 1986 | A |
4782485 | Gollub | Nov 1988 | A |
4857916 | Bellin | Aug 1989 | A |
4905293 | Asai et al. | Feb 1990 | A |
4914650 | Sriram | Apr 1990 | A |
5144680 | Kobayashi et al. | Sep 1992 | A |
5337043 | Gokcebay | Aug 1994 | A |
5412463 | Sibbald et al. | May 1995 | A |
5420936 | Fitzpatrick et al. | May 1995 | A |
5467403 | Fishbine et al. | Nov 1995 | A |
5546471 | Merjanian | Aug 1996 | A |
5603179 | Adams | Feb 1997 | A |
5617423 | Li et al. | Apr 1997 | A |
5715518 | Barrere et al. | Feb 1998 | A |
5732133 | Mark | Mar 1998 | A |
5796858 | Zhou et al. | Aug 1998 | A |
5828773 | Setlak et al. | Oct 1998 | A |
5838306 | O'Connor et al. | Nov 1998 | A |
5852670 | Setlak et al. | Dec 1998 | A |
5872834 | Teitelbaum | Feb 1999 | A |
5881226 | Veneklase | Mar 1999 | A |
5910946 | Csapo | Jun 1999 | A |
5920642 | Merjanian | Jul 1999 | A |
5926261 | Hoshino | Jul 1999 | A |
5953322 | Kimball | Sep 1999 | A |
5970458 | Petkovsek | Oct 1999 | A |
5991408 | Pearson et al. | Nov 1999 | A |
6011806 | Fujieda et al. | Jan 2000 | A |
6014687 | Watanabe et al. | Jan 2000 | A |
6016476 | Maes et al. | Jan 2000 | A |
6028950 | Merjanian | Feb 2000 | A |
6035188 | Hoogerwerf et al. | Mar 2000 | A |
6058304 | Callaghan et al. | May 2000 | A |
6061790 | Bodnar | May 2000 | A |
6064737 | Rhoads | May 2000 | A |
6078848 | Bernstein et al. | Jun 2000 | A |
6078908 | Schmitz | Jun 2000 | A |
6088585 | Schmitt et al. | Jul 2000 | A |
6111977 | Scott et al. | Aug 2000 | A |
6141436 | Srey et al. | Oct 2000 | A |
6154727 | Karp et al. | Nov 2000 | A |
6160903 | Hamid et al. | Dec 2000 | A |
6175922 | Wang | Jan 2001 | B1 |
6177950 | Robb | Jan 2001 | B1 |
6191410 | Johnson | Feb 2001 | B1 |
6219793 | Li et al. | Apr 2001 | B1 |
6222859 | Yoshikawa | Apr 2001 | B1 |
6249672 | Castiel | Jun 2001 | B1 |
6256737 | Bianco et al. | Jul 2001 | B1 |
6282304 | Novikov et al. | Aug 2001 | B1 |
6330457 | Yoon | Dec 2001 | B1 |
6337918 | Holehan | Jan 2002 | B1 |
6337919 | Dunton | Jan 2002 | B1 |
6404862 | Holt | Jun 2002 | B1 |
6493437 | Olshansky | Dec 2002 | B1 |
6636620 | Hoshino | Oct 2003 | B1 |
7418255 | Bloomberg et al. | Aug 2008 | B2 |
20010012201 | Fries et al. | Aug 2001 | A1 |
20010016819 | Kolls | Aug 2001 | A1 |
20010017584 | Shinzaki | Aug 2001 | A1 |
20010034717 | Whitworth | Oct 2001 | A1 |
20010048025 | Shinn | Dec 2001 | A1 |
20010048359 | Yamane et al. | Dec 2001 | A1 |
20010049785 | Kawan et al. | Dec 2001 | A1 |
20010051924 | Uberti | Dec 2001 | A1 |
20020003892 | Iwanaga | Jan 2002 | A1 |
20020007459 | Cassista et al. | Jan 2002 | A1 |
20020010857 | Karthik | Jan 2002 | A1 |
20020010864 | Safa | Jan 2002 | A1 |
20020034939 | Wenzel | Mar 2002 | A1 |
20020095516 | Nada | Jul 2002 | A1 |
20020106077 | Moquin et al. | Aug 2002 | A1 |
20020122415 | Chang et al. | Sep 2002 | A1 |
20020152391 | Willins et al. | Oct 2002 | A1 |
20020174345 | Patel | Nov 2002 | A1 |
20030046557 | Miller et al. | Mar 2003 | A1 |
20030081752 | Trandal et al. | May 2003 | A1 |
20050050090 | Kawahata et al. | Mar 2005 | A1 |
20070155366 | Manohar | Jul 2007 | A1 |
20100075631 | Black | Mar 2010 | A1 |
Number | Date | Country |
---|---|---|
19541672 | May 1997 | DE |
29722222 | Jun 1998 | DE |
19832638 | Jan 2000 | DE |
20008345 | Aug 2000 | DE |
19920933 | Nov 2000 | DE |
0593386 | Apr 1994 | EP |
0817515 | Jan 1998 | EP |
0969644 | Jan 2000 | EP |
1011285 | Jun 2000 | EP |
1154383 | Nov 2001 | EP |
2792438 | Oct 2000 | FR |
2312040 | Oct 1997 | GB |
61175866 | Jul 1986 | JP |
03092983 | Apr 1991 | JP |
03092984 | Apr 1991 | JP |
H09-168033 | Jun 1997 | JP |
10-210080 | Aug 1998 | JP |
10-327211 | Dec 1998 | JP |
3070110 | Apr 2000 | JP |
2000-298529 | Oct 2000 | JP |
2000-341387 | Dec 2000 | JP |
2001-257778 | Sep 2001 | JP |
2001-273135 | Oct 2001 | JP |
2001-306523 | Nov 2001 | JP |
2001-339503 | Dec 2001 | JP |
2001-358829 | Dec 2001 | JP |
2002-044727 | Feb 2002 | JP |
2004-268937 | Sep 2004 | JP |
2008-028109 | Feb 2008 | JP |
9719519 | May 1997 | WO |
9811750 | Mar 1998 | WO |
9852371 | Nov 1998 | WO |
0004476 | Jan 2000 | WO |
0039743 | Jul 2000 | WO |
0072508 | Nov 2000 | WO |
0129731 | Apr 2001 | WO |
0145283 | Jun 2001 | WO |
0192994 | Dec 2001 | WO |
2009064874 | May 2009 | WO |
Entry |
---|
Dr. Manfred Bomba, “Fingerprint-Handy,” Innovations: Fingerprint, CeBIT 99 Flyer; retrieved online from http://w4seimens.de/newsline.d/pressfor/end99101.htm; (visited Jan. 24, 2002), 5 pp. |
w4.seimens.de—Newsdesk; retrieved online from http://w4.siemens.de/newsline.d/pressfor/e_9910_d.ht; (Visited Jan. 25, 2002), 2 pp. |
Number | Date | Country | |
---|---|---|---|
20180146079 A1 | May 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15164322 | May 2016 | US |
Child | 15876735 | US | |
Parent | 12044616 | Mar 2008 | US |
Child | 15164322 | US | |
Parent | 10081132 | Feb 2002 | US |
Child | 12044616 | US |