This application is a National Stage of International Application No. PCT/JP2016/088186 filed Dec. 21, 2016, claiming priority based on Japanese Patent Application No. 2016-058025 filed Mar. 23, 2016.
The present invention relates to an eyeglasses-type wearable terminal, a control method thereof, and a control program.
In the above technical field, patent literature 1 discloses a technique of specifying a suspicious person in an image captured by a camera provided on an eyeglasses-type wearable terminal and outputting a warning message. Patent literature 2 discloses a technique of causing an eyeglasses-type terminal with a peripheral camera to notify the wearer of the terminal that an object is approaching.
Patent literature 1: Japanese Patent Laid-Open No. 2010-081480
Patent literature 2: Japanese Patent Laid-Open No. 2013-008307
However, with the techniques described in the above literatures, it is impossible to know where the suspicious person or the object is located.
The present invention provides a technique for solving the above-described problem.
One example aspect of the present invention provides an eyeglasses-type wearable terminal comprising:
an image capturing unit that captures a periphery;
a determiner that determines whether a predetermined target object is included in a video captured by the image capturing unit; and
a display unit that displays a position of the predetermined target object in a case in which the determiner determines that the predetermined target object is included.
Another example aspect of the present invention provides a control method of an eyeglasses-type wearable terminal including an image capturing unit that captures a periphery, comprising:
determining whether a predetermined target object is included in a video captured by the image capturing unit; and
displaying a position of the predetermined target object in a case in which it is determined in the determining step that the predetermined target object is included.
Still another example aspect of the present invention provides a control program of an eyeglasses-type wearable terminal including an image capturing unit that captures a periphery, for causing a computer to execute a method, comprising:
determining whether a predetermined target object is included in a video captured by the image capturing unit; and
displaying a position of the predetermined target object in a case in which it is determined in the determining step that the predetermined target object is included.
According to the present invention, a user can recognize the position of a predetermined object using an eyeglasses-type wearable terminal.
Example embodiments of the present invention will now be described in detail with reference to the drawings. It should be noted that the relative arrangement of the components, the numerical expressions and numerical values set forth in these example embodiments do not limit the scope of the present invention unless it is specifically stated otherwise.
[First Example Embodiment]
An eyeglasses-type wearable terminal 100 according to the first example embodiment of the present invention will be described with reference to
The image capturing unit 101 captures at least the front of the eyeglasses-type wearable terminal 100. The determiner 102 determines whether a predetermined target object is included in a video captured by the image capturing unit 101.
If the determiner 102 determines that the predetermined target object is included, the display unit 103 displays the position of the predetermined target object.
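As a minimal sketch only, the interaction between the determiner and the display unit could look like the following; the class and method names below are illustrative assumptions and are not taken from the disclosure. A frame from the image capturing unit is checked for the predetermined target object, and its position is displayed only when it is found.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Detection:
    label: str
    bbox: Tuple[int, int, int, int]  # (x, y, width, height) in frame pixels

class Determiner:
    """Checks whether the predetermined target object appears in a captured frame."""

    def __init__(self, target_label: str, detector):
        self.target_label = target_label
        self.detector = detector  # any object detector returning a list of Detection

    def find_target(self, frame) -> Optional[Detection]:
        """Return the first detection matching the predetermined target, if any."""
        for det in self.detector.detect(frame):
            if det.label == self.target_label:
                return det
        return None

class DisplayUnit:
    def show_position(self, detection: Detection) -> None:
        # In the terminal this would render a marker on the see-through display;
        # here we only report where the target was found.
        x, y, w, h = detection.bbox
        print(f"{detection.label} at center ({x + w // 2}, {y + h // 2})")

def process_frame(frame, determiner: Determiner, display: DisplayUnit) -> None:
    detection = determiner.find_target(frame)
    if detection is not None:  # the predetermined target object is included
        display.show_position(detection)
```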
According to the above-described arrangement, a user can more reliably recognize the position of a predetermined object using the eyeglasses-type wearable terminal.
[Second Example Embodiment]
An eyeglasses-type wearable terminal 200 according to the second example embodiment of the present invention will be described with reference to
As shown in
The front camera 201 and the rear camera 202 capture the periphery of the eyeglasses-type wearable terminal 200. The communication unit 206 can communicate with an external radio tag 220 and acquires the position information of the radio tag 220 from a position detector 221. The position detector 221 may include a GPS (Global Positioning System) receiver and receive the absolute position of the radio tag 220 on the earth.
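A hypothetical sketch of this communication flow is given below; the message fields and method names are assumptions chosen for illustration, not values from the disclosure. The terminal simply stores the latest report received from each radio tag so that other units can query the tag's reported position.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class TagReport:
    tag_id: str
    kind: str           # e.g. "clerk" or "station worker"
    latitude: float     # absolute position reported by the tag's position detector
    longitude: float
    rssi_dbm: float     # received signal strength, usable for a distance estimate

class CommunicationUnit:
    """Keeps the most recent report per radio tag (illustrative only)."""

    def __init__(self) -> None:
        self._latest: Dict[str, TagReport] = {}

    def on_message(self, report: TagReport) -> None:
        self._latest[report.tag_id] = report

    def position_of(self, tag_id: str) -> Optional[Tuple[float, float]]:
        report = self._latest.get(tag_id)
        return None if report is None else (report.latitude, report.longitude)
```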
The operation unit 207 accepts an operation from a user who wears the eyeglasses-type wearable terminal 200. The memory 208 stores various kinds of data in addition to various kinds of programs.
The image generator 210 generates an image to be displayed on the display 203. More specifically, the image generator 210 generates an image that displays the position of a predetermined target object. The target detector 211 detects a predetermined target based on a signal received from the radio tag 220, or determines whether a predetermined target object is included in a captured image. The distance determiner 212 determines the distance to a predetermined target based on the strength of a signal received from the radio tag 220. The approach determiner 213 analyzes an image captured by the front camera 201 or the rear camera 202 and determines whether an abruptly approaching object exists.
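The text only states that the distance is determined from the strength of the received signal; the log-distance path-loss model below is one common way of doing so and is offered purely as an assumption, with illustrative calibration constants.

```python
def distance_from_rssi(rssi_dbm: float,
                       tx_power_dbm: float = -59.0,      # assumed RSSI at 1 m
                       path_loss_exponent: float = 2.0   # assumed environment factor
                       ) -> float:
    """Estimate the distance in metres to a radio tag from its received signal strength."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# Example: a reading of -75 dBm with the assumed parameters gives roughly 6.3 m.
```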
The face identifier 214 analyzes images captured by the cameras 201 and 202, identifies faces in the images, and determines whether a person registered in advance exists on the periphery. The congestion degree determiner 215 analyzes images captured by the cameras 201 and 202 and determines the congestion degree on the periphery. The position/azimuth detector 216 detects the position and direction (azimuth) of the eyeglasses-type wearable terminal 200.
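The face identifier and the congestion degree determiner are described only functionally. A simple, hypothetical realisation is sketched below: faces are matched against a registered set by embedding distance, and the congestion degree is derived from the number of people visible in a frame. Every threshold shown is an illustrative assumption.

```python
import numpy as np
from typing import Dict, Optional

def find_registered_person(face_embedding: np.ndarray,
                           registered: Dict[str, np.ndarray],
                           threshold: float = 0.6) -> Optional[str]:
    """Return the name of the closest registered face, or None if no match is close enough."""
    best_name, best_dist = None, float("inf")
    for name, reference in registered.items():
        dist = float(np.linalg.norm(face_embedding - reference))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

def congestion_degree(num_people_in_frame: int) -> str:
    """Bucket the number of detected people into a coarse congestion level."""
    if num_people_in_frame >= 10:
        return "high"
    if num_people_in_frame >= 4:
        return "medium"
    return "low"
```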
If the radio tag 220 exists within the display range of the display 203, the process advances to step S307, and the image generator 210 displays a target name represented by the radio tag 220 and a downward arrow on the upper side of the position of the radio tag 220. At this time, the size of the arrow or the number of arrows is set to a size or number according to the distance between the radio tag 220 and the eyeglasses-type wearable terminal 200. If the radio tag 220 transmits information representing a clerk or a station worker, display as indicated by a screen 401 or a screen 402 shown in
If the radio tag 220 exists outside the display range of the display 203, the process advances to step S309, and a target name represented by the radio tag 220 and an arrow representing the direction (the right side or the left side of the display range) are displayed. At this time, the size of the arrow or the number of arrows is set to a size or a number according to the distance between the radio tag 220 and the eyeglasses-type wearable terminal 200. For example, in a screen 500 shown in
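One hypothetical way to implement the arrow selection of steps S307 and S309 is sketched below: the display range is modelled as a horizontal field of view centred on the terminal's azimuth, and the number of arrows grows as the tag gets closer. The angle handling and the distance thresholds are assumptions for illustration only.

```python
from typing import Tuple

def arrow_for_target(bearing_deg: float,   # direction from the terminal to the tag
                     azimuth_deg: float,   # direction the terminal is facing
                     fov_deg: float,       # horizontal extent of the display range
                     distance_m: float) -> Tuple[str, int]:
    """Return (arrow direction, number of arrows) for a target reported by a radio tag."""
    # Signed angular offset of the tag relative to the viewing direction, in -180..180.
    offset = (bearing_deg - azimuth_deg + 180) % 360 - 180
    # Closer targets get more arrows (their size could be scaled the same way).
    count = 3 if distance_m < 5 else 2 if distance_m < 15 else 1
    if abs(offset) <= fov_deg / 2:
        return "down", count                      # inside the display range (step S307)
    return ("right", count) if offset > 0 else ("left", count)  # outside (step S309)
```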
The above-described processing shown in
Referring to
If the approaching speed of the object is equal to or higher than a predetermined speed, the process advances to step S319, and the image generator 210 displays the position (the direction and the distance) of the approaching object.
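A sketch of this check, together with the congestion-dependent threshold of supplementary note 8, is shown below. The way the speed is obtained (from successive distance estimates) and all numeric values are assumptions; the disclosure only requires that a position be displayed when the approach speed is at or above a predetermined speed.

```python
def approach_speed(prev_distance_m: float, curr_distance_m: float, dt_s: float) -> float:
    """Approach speed from two successive distance estimates; positive means approaching."""
    return (prev_distance_m - curr_distance_m) / dt_s

def should_warn(speed_mps: float, congestion: str, base_threshold_mps: float = 1.5) -> bool:
    """Warn when the approach speed reaches the threshold; the threshold is raised
    when the periphery is congested (cf. supplementary note 8)."""
    threshold = base_threshold_mps * (2.0 if congestion == "high" else 1.0)
    return speed_mps >= threshold
```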
[Other Example Embodiments]
While the invention has been particularly shown and described with reference to example embodiments thereof, the invention is not limited to these example embodiments. It will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the claims.
The present invention is applicable to a system including a plurality of devices or a single apparatus. The present invention is also applicable even when an information processing program for implementing the functions of the example embodiments is supplied to the system or apparatus directly or from a remote site. Hence, the present invention also incorporates the program installed in a computer to implement the functions of the present invention by the processor of the computer, a medium storing the program, and a WWW (World Wide Web) server that allows a user to download the program. In particular, the present invention incorporates at least a non-transitory computer readable medium storing a program that causes a computer to execute the processing steps included in the above-described example embodiments.
[Other Expressions of Embodiments]
Some or all of the above-described embodiments can also be described as in the following supplementary notes but are not limited to the following.
(Supplementary Note 1)
There is provided an eyeglasses-type wearable terminal comprising:
an image capturing unit that captures a periphery;
a determiner that determines whether a predetermined target object is included in a video captured by the image capturing unit; and
a display unit that displays a position of the predetermined target object in a case in which the determiner determines that the predetermined target object is included.
(Supplementary Note 2)
There is provided the eyeglasses-type wearable terminal according to supplementary note 1, wherein in a case in which the predetermined target object exists outside a display range of the eyeglasses-type wearable terminal, the display unit displays an arrow representing a direction in which the predetermined target object exists.
(Supplementary Note 3)
There is provided the eyeglasses-type wearable terminal according to supplementary note 2, wherein the display unit changes one of a size and a color of the arrow based on a distance to the predetermined target object.
(Supplementary Note 4)
There is provided the eyeglasses-type wearable terminal according to any one of supplementary notes 1 to 3, wherein the determiner receives a signal from a radio tag provided on the predetermined target object and performs the determination based on the signal.
(Supplementary Note 5)
There is provided the eyeglasses-type wearable terminal according to supplementary note 4, wherein the signal includes position information of the predetermined target object.
(Supplementary Note 6)
There is provided the eyeglasses-type wearable terminal according to any one of supplementary notes 1 to 5, wherein the image capturing unit captures a front and a rear of the eyeglasses-type wearable terminal,
the determiner determines whether an object approaching the eyeglasses-type wearable terminal exists in the video, and
the display unit displays a position of the object.
(Supplementary Note 7)
There is provided the eyeglasses-type wearable terminal according to any one of supplementary notes 1 to 6, wherein the determiner detects an object included in the video that approaches the eyeglasses-type wearable terminal at a speed not less than a predetermined threshold.
(Supplementary Note 8)
There is provided the eyeglasses-type wearable terminal according to supplementary note 7, wherein the determiner determines a congestion degree on the periphery, and if the congestion degree is high, raises the predetermined threshold.
(Supplementary Note 9)
There is provided the eyeglasses-type wearable terminal according to any one of supplementary notes 1 to 5, wherein the image capturing unit captures a front and a rear of the eyeglasses-type wearable terminal,
the determiner determines whether a person registered in advance is included in the video, and
the display unit displays a position of the person registered in advance.
(Supplementary Note 10)
There is provided a control method of an eyeglasses-type wearable terminal including an image capturing unit that captures a periphery, comprising:
determining whether a predetermined target object is included in a video captured by the image capturing unit; and
displaying a position of the predetermined target object in a case in which it is determined in the determining that the predetermined target object is included.
(Supplementary Note 11)
There is provided a control program of an eyeglasses-type wearable terminal including an image capturing unit that captures a periphery, for causing a computer to execute a method, comprising:
determining whether a predetermined target object is included in a video captured by the image capturing unit; and
displaying a position of the predetermined target object in a case in which it is determined in the determining that the predetermined target object is included.
Number | Date | Country | Kind |
---|---|---|---
2016-058025 | Mar 2016 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---
PCT/JP2016/088186 | 12/21/2016 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---
WO2017/163514 | 9/28/2017 | WO | A |
Number | Name | Date | Kind |
---|---|---|---
20040182925 | Anderson | Sep 2004 | A1 |
20100080418 | Ito | Apr 2010 | A1 |
20130241805 | Gomez | Sep 2013 | A1 |
20140044305 | Scavezze | Feb 2014 | A1 |
20160041613 | Klanner et al. | Feb 2016 | A1 |
20160203663 | Proctor | Jul 2016 | A1 |
20170195665 | Karkkainen | Jul 2017 | A1 |
20170249745 | Fiala | Aug 2017 | A1 |
20180053413 | Patil | Feb 2018 | A1 |
Number | Date | Country |
---|---|---
2010-081480 | Apr 2010 | JP |
2010-171673 | Aug 2010 | JP |
2013-008307 | Jan 2013 | JP |
2014-142722 | Aug 2014 | JP |
2014-149576 | Aug 2014 | JP |
2015-115696 | Jun 2015 | JP |
1020110136018 | Dec 2011 | KR |
2015176163 | Nov 2015 | WO |
Entry |
---|
International Search Report of PCT/JP2016/088186, filed Jan. 31, 2017. |
Extended European Search Report dated Mar. 28, 2019, from the European Patent Office in counterpart Application No. 16895537.5. |
Office Action for corresponding U.S. Appl. No. 16/414,121 dated Jul. 25, 2019. |
Number | Date | Country
---|---|---
20190114899 A1 | Apr 2019 | US