This application claims priority to European Patent Application No. 19214192.7 filed on Dec. 6, 2019, the entire disclosure of which is hereby incorporated herein by reference.
The present invention relates to a method for securely connecting a watch to a remote server and a system implementing such a method.
The invention also relates to a computer program.
A watch comprises a set of functions that can be used by the wearer. Such functions can allow access to remote servers implementing service provisions such as banking provisions, commercial provisions (online shops, e-commerce companies), electronic messaging or instant messaging provisions. In such a context, the wearer of the watch must manage and store an increasing number of identifiers, passwords and access codes, which are authentication elements. The wearer often needs to have these authentication elements at hand when initiating a connection to a remote server in order to benefit from a service provision. Consequently, it is common for the wearer, unable to memorise all these confidential data, to group them together on paper or in a standard computer file of the spreadsheet type, archived on media allowing the storage of digital data, whether hard disks, flash memory, a USB key, etc. This situation has the disadvantage that the documents/files containing these authentication elements can be stored in an environment with little or no protection, which introduces a significant security flaw in the management of authentication elements.
Under these conditions, it is understood that there is a need to find an alternative solution, in particular which does not have the disadvantages of the prior art.
A purpose of the present invention is therefore to provide a method for securely connecting a watch to a remote server which is reliable and robust.
To this end, the invention relates to a method for securely connecting a watch to a remote server of a service provider including the following steps:
In other embodiments:
The invention also relates to a system for securely connecting a watch to a remote server implementing such a method, the watch comprising the following elements connected together: a processing unit, a multispectral biometric skin sensor, an input interface, an interface for broadcasting a visual piece of information and a wireless communication interface for data exchanges with said remote server.
Other features and advantages will emerge clearly from the description which is given below, in an indicative and non-limiting manner, with reference to the appended figures, wherein:
In this watch 100, the processing unit 2 is connected, among others, to the interfaces for broadcasting a visual and sound piece of information 3, 4, to the input interface 34 as well as to the wireless communication interface 5 and to the multispectral biometric sensor 33. It will also be noted that the multispectral biometric sensor 33 is arranged in the body of the electronic device 100 and/or in the attachment element.
In this system 1, the server 200 comprises a processing unit 210 and a communication interface 220. This server 200 is a remote server of a service provider, for example a server of a provider of banking or commercial services (online shops, e-commerce companies), or of electronic messaging or instant messaging provisions. In this context, the processing unit 210 of this server 200 comprises memory elements including a reference authentication element 32. This reference authentication element 32 is capable of participating in the creation of a secure connection between the remote server 200 and said watch 100 and can comprise keys, certificates, authentication codes, passwords, personal codes, etc.
This watch 100 is capable of ensuring the identity control of the authenticated wearer discreetly, that is to say without direct intervention/interaction of the wearer with this watch 100, so that they can make a connection to a remote server 200 for as long as they wear it. The identification of the wearer is then carried out in a transparent and discreet manner, based on at least one biometric information element comprised in the skin of this wearer, such as the vascular network of the skin or the texture of this skin. The skin covering the wearer's body has a particularity, not usually taken into account by the person skilled in the art because it is not naturally visible to the human eye, related to the features of absorption and reflection, at different wavelengths (spectrum), of the components of the skin located at different depths. In a simplified model, the skin consists of a surface layer called the "epidermis", which is semi-transparent, then, under the epidermis, of a layer called the "dermis" comprising, among others, the blood vessels (or vascular network). The haemoglobin in these vessels is highly reflective at long wavelengths close to red, for example comprised between 760 and 930 nm, which serves here to reveal or show the vascular network of the wearer's skin. In other words, since the light absorption spectrum of the components of the epidermis and the dermis constituting the skin is not uniform across electromagnetic wavelengths, the appearance and the colour of the skin result from a complex combination of these phenomena. Thus, when it comes to showing or revealing a biometric information element such as the texture of the skin of this wearer, a texture essentially formed of cracks or cavities, the illumination of the skin can be ensured by an illumination source restricted to wavelengths around red, which however tends to make the shadow phenomenon disappear from the bottom of the cracks.
Indeed, there is a retro-projection effect, by reflection on the dermis and through the epidermis, of these wavelengths close to red, while illumination of the skin by a source of colour spectrum far from red, typically the wavelength band located between violet (400 nm) and yellow-orange (600 nm), on the contrary strongly contrasts these cracks in the skin through the appearance of shadows at the bottom of these cracks. It should be noted that the identification of a biometric information element comprised in the skin can be improved by the use of the thermal image sensor 28, preferably without illumination. By way of example, for showing the texture of the skin, in particular when the concerned portion of the skin of this wearer is provided with hair, the use of the thermal image sensor 28 allows the cracks of this texture to be revealed, these cracks generally being warmer than the surrounding skin and the hair colder than the surrounding skin. Thus, in this configuration, the hair can be thermally distinguished from cracks in the texture of the skin owing to this difference between their respective temperatures.
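The wavelength choices described above can be summarised in a small selection routine. This is an illustrative sketch only: the band limits (760 to 930 nm for the vascular network, 400 to 600 nm for contrasting the texture of the skin) are taken from the description, while the function name, the feature labels and the table itself are hypothetical.

```python
# Hypothetical helper: choose an illumination band (in nanometres) for a
# target biometric information element, using the wavelength ranges given
# in the description above.
ILLUMINATION_BANDS_NM = {
    # Haemoglobin is highly reflective close to red: reveals the vascular network.
    "vascular_network": (760, 930),
    # A spectrum far from red casts shadows in the cracks: reveals the skin texture.
    "skin_texture": (400, 600),
    # Thermal image capture is preferably carried out without illumination.
    "thermal": None,
}

def select_illumination(feature: str):
    """Return the (min_nm, max_nm) band for a feature, or None for no illumination."""
    if feature not in ILLUMINATION_BANDS_NM:
        raise ValueError(f"unknown biometric feature: {feature}")
    return ILLUMINATION_BANDS_NM[feature]
```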
It should be noted that the capture of thermal images can be carried out under illumination in a given wavelength depending on the biometric information element that should be shown or revealed.
It is therefore understood, according to the principle of the invention, that the identification of the wearer is carried out on the basis of at least one biometric information element comprised in images of a portion of the skin of this wearer which can be illuminated, where appropriate, according to different wavelengths in order to capture images comprising the desired biometric information element. Thus, this biometric information element, comprised in these images, can be shown by the illumination performed in different wavelengths or without illumination, for example when it comes to capturing thermal images.
In this watch, the memory elements 6 of the processing unit 2 of the watch 100 comprise data relating to authentication elements 31 specific to each remote server 200 to which the watch 100 is connected. In other words, these authentication elements are specific to the wearer and/or to the watch 100, and thus allow the wearer to connect to the server 200 they wish by means of a selection of a function of the watch 100. These memory elements 6 also include digital image processing algorithms 29 allowing at least one biometric information element relating to the wearer's skin, comprised in the images relating to the portion of the wearer's skin, to be characterised. These memory elements 6 also include algorithms 30 for generating the reference digital identification element as well as a digital identification element.
The system 1 is capable of implementing a method for secure connection to the remote server 200 of a service provider, shown in
This method comprises a step 9 of authenticating the wearer of the watch 100, authorising access to use the functions of this watch 100. This authentication step 9 therefore allows the wearer of the watch to be identified with certainty so that they can have access to all the functions of this watch 100. In other words, it allows the wearer to provide proof of their identity through the input of an authentication code or a secret code by means of an interaction between the wearer and the input interface 34.
In addition, it is understood that the functions can be implemented by computer programs executed by the processing unit 2 of the watch 100 as soon as these programs are activated/selected following an interaction between the wearer and the input interface 34 of this watch 100. These computer programs thus executed allow the wearer to benefit from service provisions, for example banking or commercial provisions, or else instant or electronic messaging provisions.
Following this authentication step 9, the method comprises a step 11 of selecting one of said functions from the input interface 34 of said watch 100, aiming at establishing a connection between said watch 100 and the remote server 200. It will be understood that the functions can be implemented by computer programs executed by the processing unit 2 of the watch 100 as soon as these functions, which are displayed on/in the interface for broadcasting a visual piece of information 3, are activated/selected after an interaction between the wearer and the input interface 34 of this watch 100. These computer programs thus executed allow the wearer to benefit from service provisions, for example banking or commercial provisions, or else instant or electronic messaging provisions.
The method then comprises a step 12 of identifying the wearer of the watch 100 from at least one biometric information element comprised in a portion of the wearer's skin. Such a step 12 is carried out systematically following the selection of a function in order, in particular, to allow the processing unit 2 to verify that the wearer of the watch 100 is still in possession of the latter and that they are indeed at the origin of the selection of the function. This step 12 comprises a sub-step 13 of acquiring, by the sensor 33, a plurality of images of a portion of the wearer's skin, said skin portion being arranged adjacent to said sensor 33, said images comprising said at least one biometric information element comprised in this skin portion. This sub-step 13 comprises a phase 14 of illuminating the skin portion according to different wavelengths. More specifically, during this phase 14, the processing unit 2 drives the multispectral biometric sensor 33, and in particular the illumination source 27, so that the latter emits light radiation in the direction of the skin portion at a precise wavelength adapted for showing or revealing said at least one biometric information element specific to the skin which is sought here. Once the illumination has been configured, the acquisition sub-step 13 comprises a phase 15 of capturing images of this skin portion illuminated at least at one wavelength capable of showing or revealing said at least one biometric information element. During this phase 15, the processing unit 2 drives the multispectral biometric skin sensor 33, and in particular the photographic sensor 26, synchronously with the activation/deactivation of the illumination source 27 at a given wavelength, in order to capture at least one image relating to the skin portion illuminated for at least one wavelength.
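The acquisition sub-step 13 described above, in which the illumination source 27 is activated and deactivated synchronously with each image capture, can be sketched as follows. All driver names (`set_illumination`, `capture_frame`) are hypothetical placeholders for the hardware interfaces of the sensor 33; only the synchronisation pattern reflects the description.

```python
# Sketch of acquisition sub-step 13: for each wavelength, the processing
# unit switches the illumination source on, captures one image in sync,
# then switches the source off again. A wavelength of None stands for a
# capture without illumination (e.g. a thermal image).
from typing import Callable, Dict, List, Optional

def acquire_images(
    wavelengths_nm: List[Optional[int]],
    set_illumination: Callable[[Optional[int]], None],
    capture_frame: Callable[[], bytes],
) -> Dict[Optional[int], bytes]:
    """Capture one frame per requested wavelength and return them keyed by wavelength."""
    images: Dict[Optional[int], bytes] = {}
    for wl in wavelengths_nm:
        set_illumination(wl)          # activate the source at wavelength wl (or off)
        images[wl] = capture_frame()  # photographic sensor fires synchronously
        set_illumination(None)        # deactivate the source between captures
    return images
```

In practice the two callables would wrap the drivers of the illumination source 27 and of the photographic or thermal sensor; here they are left abstract so the synchronisation logic stands alone.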
This acquisition sub-step 13 can also comprise a phase 16 of capturing at least one thermal image of the skin portion. Such a phase 16 is preferably carried out without illumination, but in other alternatives the portion can be illuminated at least at one given wavelength, obviously depending on the biometric information element that should be shown or revealed. This phase 16 can be carried out before or after the illumination 14 and image capture 15 phases.
The identification step 12 then comprises a sub-step 17 of generating the digital identification element from said at least one biometric information element comprised in the acquired images of the skin portion. Such a sub-step 17 comprises a phase 18 of characterising said biometric information element comprised in the images relating to said skin portion. During this phase 18, the processing unit 2 implements algorithms 29 for processing the acquired images, aiming at identifying/detecting in each of them said at least one biometric information element that they comprise. As already mentioned previously, these may be information elements relating, for example, to the texture of the skin or to the vascular network comprised in this portion of the wearer's skin. The implementation of these algorithms 29, 30 by the processing unit 2 can, by way of example, provide for a process of cutting these images into segments. It is understood here that each acquired image gives an overall view of the portion of the wearer's skin, and thus includes areas of varying relevance for the identification of said at least one biometric information element. Such a cutting process participates in extracting the segments to be processed and in eliminating the parts not to be processed in these images. These algorithms 29 can then provide an indexing of these image segments comprising features relating to said at least one particular biometric information element to be identified, by localisation areas in the skin portion, in order to be able to assign to each area the processing adequate to the morphological typology of the feature of this geographical area of the portion. In this context, these algorithms 29 process each segment of these images by showing the pieces of information carried by the pixels of each of these images, by performing image analysis operations of the processing, transformation and detection type.
Subsequently, these algorithms 29 perform feature filtering and extraction or vectorisation operations, in order to convert the image data relating to said at least one identified and extracted biometric information element into parametric data, typically relative numerical values expressed for example as an index or as a percentage.
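The cutting and vectorisation operations just described can be illustrated with a minimal sketch. This is not the patented algorithm 29: the segment size, the per-segment statistic (a mean intensity expressed as a percentage of full scale) and the plain-list image representation are all assumptions made for illustration.

```python
# Sketch of characterisation phase 18: cut an image (a 2D list of pixel
# intensities in 0..255) into fixed-size segments, then convert each
# segment into one parametric value expressed as a percentage.
from statistics import mean

def cut_into_segments(image, seg_rows, seg_cols):
    """Yield rectangular segments (lists of rows) covering the image."""
    for r in range(0, len(image), seg_rows):
        for c in range(0, len(image[0]), seg_cols):
            yield [row[c:c + seg_cols] for row in image[r:r + seg_rows]]

def vectorise(image, seg_rows=2, seg_cols=2):
    """Return one relative value (percentage of full scale) per segment."""
    return [
        round(mean(p for row in seg for p in row) / 255 * 100, 1)
        for seg in cut_into_segments(image, seg_rows, seg_cols)
    ]
```

A real implementation would first discard the segments "not to be processed" and apply per-area processing as described above; the sketch keeps only the segment-to-parametric-value conversion.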
It is understood here that the acquisition of several images representing the same skin portion under different illuminations or without illumination, helps to improve the precision and efficiency of this characterisation phase 18.
Subsequently, the generation sub-step 17 comprises a phase 19 of designing the digital identification element from the characterisation of said at least one biometric information element. During this phase 19, the processing unit 2 implements algorithms 30 for generating such a digital identification element, specifically provided for the processing of the parametric data obtained during the characterisation phase 18, these parametric data relating to said at least one biometric information element.
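One possible way to turn the parametric data of the characterisation phase into a stable digital identification element is to quantise the values and hash the result, so that small acquisition-to-acquisition variations map to the same element. This is an assumed construction for illustration only; the description does not specify how the generation algorithms 30 operate, and the quantisation step is arbitrary.

```python
# Sketch of design phase 19: quantise the parametric data (percentages
# from the characterisation phase) to a coarse step, then derive a
# digital identification element by hashing the quantised vector.
import hashlib

def generate_identification_element(parametric_data, step=5.0):
    """Quantise each value to `step`, then hash the tuple into a hex digest."""
    quantised = tuple(round(v / step) for v in parametric_data)
    return hashlib.sha256(repr(quantised).encode("utf-8")).hexdigest()

# Variations that stay within one quantisation step yield the same element:
# generate_identification_element([50.1, 72.4]) equals
# generate_identification_element([51.0, 72.0]).
```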
Then, the identification step 12 comprises a sub-step 20 of validating the digital identification element generated in anticipation of a control of the identity of the wearer. This validation sub-step 20 comprises a comparison phase 21, implemented by the processing unit 2, between the generated digital identification element and the reference digital identification element. In this method, the reference digital identification element can be created, once the wearer has been duly authenticated and their identity is certain, during a step 11 of defining this reference digital identification element providing sub-steps similar to the acquisition 13 and generation 17 sub-steps implemented during the identification step 12. In this method, once the wearer of the watch 100 is authenticated, the processing unit 2 implements this definition step 11 and then performs an archiving of the reference digital identification element obtained in the memory elements 6 of the processing unit 2. This reference digital identification element can therefore be determined automatically by the processing unit 2 or configured by the wearer during an adjustment process aiming at guiding the wearer in defining this reference digital identification element.
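The comparison phase 21 accepts the generated element when it is "substantially similar" to the reference. A minimal sketch of such a tolerant comparison is given below, under the assumption (not stated in the description) that both elements are vectors of parametric values and that similarity is measured as a mean absolute difference under a tolerance.

```python
# Sketch of comparison phase 21: accept the generated digital
# identification element when it is substantially similar to the
# reference element. The metric (mean absolute difference) and the
# tolerance value are illustrative assumptions.
def identification_valid(generated, reference, tolerance=3.0):
    """Return True when the two vectors differ by at most `tolerance` on average."""
    if len(generated) != len(reference):
        return False
    mad = sum(abs(g - r) for g, r in zip(generated, reference)) / len(generated)
    return mad <= tolerance
```

On rejection, the connection to the remote server would be suspended and the wearer prompted to authenticate again, as described above.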
This comparison phase 21 comprises a sub-phase of rejecting the identification of the wearer 22 if the generated digital identification element is substantially different or different from the reference digital identification element. In this case, the establishment of the connection to the remote server is suspended or even removed. In addition, access to the watch 100 is also removed and in particular the access to the functions of this watch. In this context, the wearer of the watch is invited to authenticate themself again in order to provide proof of their identity by inputting an authentication code or a secret code, by means of an interaction between the wearer and the input interface 34. Indeed, the wearer and owner of the watch 100 may no longer be in possession thereof.
The comparison phase 21 also comprises a sub-phase of successfully identifying the wearer if the generated digital identification element is substantially similar or similar to the reference digital identification element. In this case, the method then provides for the implementation of a step 22 of transmitting to said remote server 200 the authentication element relating to the selected function once the wearer is identified. This step 22 comprises a sub-step 23 of selecting the authentication element relating to said selected function in anticipation of its sending to the remote server 200. During this sub-step 23, the selected function is identified and, on the basis of this identification, a selection of the authentication element is carried out from the authentication elements archived in the memory elements 6 of the processing unit 2 of the watch 100. As already mentioned previously, the authentication elements 31 may be keys, certificates, authentication codes, passwords and personal codes which are each dedicated to the authentication of the wearer of the watch 100 to the corresponding service provider and therefore to the remote server comprised in a technical platform of this provider. It is understood here that the authentication element is dedicated to authenticating the wearer to a remote server of a given service provider. In addition, the authentication elements are archived in the memory elements 6 of the processing unit 2 of the watch 100, each being associated with a digital identification element of a corresponding function.
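Sub-step 23, in which the archived authentication element matching the selected function is retrieved, amounts to a keyed lookup. The sketch below uses an in-memory mapping with hypothetical function names and element values; in the watch, the elements 31 would reside in the memory elements 6 of the processing unit 2.

```python
# Sketch of sub-step 23: the authentication elements 31 are archived per
# function, and the element matching the selected function is retrieved
# before being sent to the remote server. Keys and values are hypothetical.
ARCHIVED_AUTHENTICATION_ELEMENTS = {
    "banking": {"kind": "certificate", "value": "cert-bank-001"},
    "e_commerce": {"kind": "password", "value": "pw-shop-001"},
    "messaging": {"kind": "key", "value": "key-msg-001"},
}

def select_authentication_element(selected_function: str):
    """Return the archived authentication element for the selected function."""
    try:
        return ARCHIVED_AUTHENTICATION_ELEMENTS[selected_function]
    except KeyError:
        raise KeyError(f"no authentication element archived for {selected_function!r}")
```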
The method then comprises a step 24 of performing an authentication of the wearer by the remote server 200 from said authentication element, in order to authorise an exchange of data between the watch 100 and this remote server 200. Such a step 24 comprises a comparison sub-step 25, carried out by the processing unit 210 of the server 200, between the authentication element received from the watch and a reference authentication element 32 archived in the server 200. This comparison sub-step 25 comprises a phase of rejecting the identification of the wearer 22 if the authentication element is substantially different or different from the reference authentication element 32. In this case, the establishment of the connection to the remote server 200 is suspended or even removed.
The comparison sub-step 25 also comprises a phase of successfully identifying the wearer if the authentication element is substantially similar or similar to the reference authentication element 32. In this context, an exchange of data between the watch 100 and this remote server 200 in connection with the service provision is then authorised.
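The server-side comparison of sub-step 25 can be sketched as a direct check of the received authentication element against the reference element 32. The use of a timing-safe comparison (`hmac.compare_digest`) is an assumed implementation detail, chosen because string authentication elements such as passwords or codes should not be compared with ordinary equality on a server.

```python
# Sketch of comparison sub-step 25, on the server 200: the received
# authentication element is checked against the reference element 32.
# hmac.compare_digest provides a timing-safe equality check.
import hmac

def server_authenticate(received: str, reference: str) -> bool:
    """Authorise the data exchange only when the elements match."""
    return hmac.compare_digest(received.encode(), reference.encode())
```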
Thus, the invention allows the wearer and owner of the watch 100 to be authenticated with all the remote servers of the service providers based only on their identification from at least one biometric information element comprised in a portion of their skin, without having to directly enter the authentication element specific to each of these servers 200 in order to be able to authenticate themself to the corresponding service provider. It is therefore understood that this automatic and non-intrusive identification allows the wearer to connect their watch to all the remote servers, each in connection with a function of the watch relating to a service provision.
The invention also relates to a computer program comprising program code instructions for executing steps 10 to 25 of this method when said program is executed by the processing unit 2 of the watch 100.
Number | Date | Country | Kind |
---|---|---|---|
19214192 | Dec 2019 | EP | regional |
Number | Name | Date | Kind |
---|---|---|---|
4224948 | Cramer | Sep 1980 | A |
7519198 | Endoh | Apr 2009 | B2 |
10075846 | Acar et al. | Sep 2018 | B1 |
10460187 | Oda | Oct 2019 | B2 |
10627783 | Rothkopf | Apr 2020 | B2 |
11120112 | Chen | Sep 2021 | B1 |
11263301 | Yang | Mar 2022 | B2 |
11284252 | Kim | Mar 2022 | B2 |
20010056243 | Ohsaki | Dec 2001 | A1 |
20030037264 | Ezaki et al. | Feb 2003 | A1 |
20030163710 | Ortiz | Aug 2003 | A1 |
20060034492 | Siegel | Feb 2006 | A1 |
20060291701 | Tanaka | Dec 2006 | A1 |
20110102137 | Schroter | May 2011 | A1 |
20120138680 | Litz | Jun 2012 | A1 |
20120144204 | Litz | Jun 2012 | A1 |
20120159599 | Szoke | Jun 2012 | A1 |
20130219480 | Bud | Aug 2013 | A1 |
20130269013 | Parry | Oct 2013 | A1 |
20140121539 | Chatterjee | May 2014 | A1 |
20140162598 | Villa-Real | Jun 2014 | A1 |
20140189854 | Younkin | Jul 2014 | A1 |
20140196131 | Lee | Jul 2014 | A1 |
20140283013 | Marco | Sep 2014 | A1 |
20150015365 | Ortiz | Jan 2015 | A1 |
20150046990 | Oberheide | Feb 2015 | A1 |
20150063661 | Lee | Mar 2015 | A1 |
20150112260 | David | Apr 2015 | A1 |
20150146944 | Pi | May 2015 | A1 |
20150172287 | Ortiz | Jun 2015 | A1 |
20150186720 | Tsou | Jul 2015 | A1 |
20150220109 | von Badinski | Aug 2015 | A1 |
20150242605 | Du | Aug 2015 | A1 |
20150254471 | You | Sep 2015 | A1 |
20150304322 | Zaidi | Oct 2015 | A1 |
20150355604 | Fraser | Dec 2015 | A1 |
20150371028 | Patel | Dec 2015 | A1 |
20160048672 | Lux | Feb 2016 | A1 |
20160117563 | Shin | Apr 2016 | A1 |
20160142402 | Kim | May 2016 | A1 |
20160154952 | Venkatraman | Jun 2016 | A1 |
20160166936 | Millegan | Jun 2016 | A1 |
20160308859 | Barry | Oct 2016 | A1 |
20170017785 | Rice | Jan 2017 | A1 |
20170032168 | Kim | Feb 2017 | A1 |
20170048652 | Del Rio | Feb 2017 | A1 |
20170048707 | Ortiz | Feb 2017 | A1 |
20170309162 | Oberholzer | Oct 2017 | A1 |
20170316419 | Laporta | Nov 2017 | A1 |
20180005005 | He | Jan 2018 | A1 |
20180060683 | Kontsevich | Mar 2018 | A1 |
20180082474 | Vaughn | Mar 2018 | A1 |
20180192946 | Adachi | Jul 2018 | A1 |
20180239976 | Cornelius | Aug 2018 | A1 |
20180260602 | He | Sep 2018 | A1 |
20180268233 | Langley | Sep 2018 | A1 |
20190080153 | Kalscheur | Mar 2019 | A1 |
20190095602 | Setlak | Mar 2019 | A1 |
20190207932 | Bud | Jul 2019 | A1 |
20190236330 | Miyoshino et al. | Aug 2019 | A1 |
20190295543 | Wu et al. | Sep 2019 | A1 |
20190303551 | Tussy | Oct 2019 | A1 |
20190342756 | Ortiz | Nov 2019 | A1 |
20190349367 | Chang | Nov 2019 | A1 |
20190386988 | Segura Perales | Dec 2019 | A1 |
20200004943 | Gu | Jan 2020 | A1 |
20200069200 | Wang | Mar 2020 | A1 |
20200153624 | Wentz | May 2020 | A1 |
20200229761 | Pandya | Jul 2020 | A1 |
20200272717 | Figueredo de Santana | Aug 2020 | A1 |
20200272721 | Sato | Aug 2020 | A1 |
20200280852 | Ortiz | Sep 2020 | A1 |
20200327302 | He | Oct 2020 | A1 |
20200374283 | Rakshit | Nov 2020 | A1 |
20200395421 | He | Dec 2020 | A1 |
20200405233 | Sakkalis | Dec 2020 | A1 |
20210049252 | Ando | Feb 2021 | A1 |
20210173352 | Franzi | Jun 2021 | A1 |
20210173912 | Franzi | Jun 2021 | A1 |
20210334567 | Tasar | Oct 2021 | A1 |
20220004617 | Irwin, III | Jan 2022 | A1 |
20220012318 | Battle | Jan 2022 | A1 |
20220079519 | Jirik | Mar 2022 | A1 |
20220172392 | Michalsky | Jun 2022 | A1 |
Number | Date | Country |
---|---|---|
2010-522379 | Jul 2010 | JP |
2016-110368 | Jun 2016 | JP |
2017-27594 | Feb 2017 | JP |
WO2018079852 | Oct 2017 | JP |
2019-91334 | Jun 2019 | JP |
2019-165422 | Sep 2019 | JP |
10-2015-0106229 | Sep 2015 | KR |
10-2018-0009275 | Jan 2018 | KR |
WO 2008134135 | Nov 2008 | WO |
WO 2016076641 | May 2016 | WO |
WO 2018079852 | May 2018 | WO |
Entry |
---|
Lee “Implicit Sensor-based Authentication of Smartphone Users with Smartwatch,” 2016, ACM, pp. 1-8 (Year: 2016). |
Varshney “Identifying Smartphone Users Based on Smartwatch Data,” Thesis, Jan. 2017, pp. 1-73 (Year: 2017). |
JP2016110368 Machine Translation (Year: 2016). |
JP2017027594 Machine Translation (Year: 2017). |
JP2019091334 Machine Translation (Year: 2019). |
JP2019165422 Machine Translation (Year: 2019). |
JPWO2018079852 Machine Translation (Year: 2018). |
Enamamu et al “Smartwatch based Body-Temperature Authentication,” ICCNI 2017: International Conference on Computing, Networking and Informatics (IEEE), pp. 1-7 (Year: 2017). |
Vhaduri et al “Multi-Modal Biometric-Based Implicit Authentication of Wearable Device Users,” IEEE Transactions on Information Forensics and Security, vol. 14, No. 12, Dec. 2019, pp. 3116-3125 (Year: 2019). |
Japanese Office Action dated Nov. 2, 2021 in Japanese Patent Application No. 2020-193064, 3 pages. |
European Search Report dated Jun. 15, 2020 in European Application 19214192.7 filed Dec. 6, 2019 (with English Translation of Categories of Cited Documents), 4 pages. |
Number | Date | Country | |
---|---|---|---|
20210176241 A1 | Jun 2021 | US |